I’ve been on the fence for a while about whether the Singularity Institute is worth donating to. I alluded to my uncertainty in my post on Kony 2012. I finally made my first donation just now. Two things pushed me over the edge.
First is Luke Muehlhauser’s reply to Holden Karnofsky of GiveWell regarding donating to the Singularity Institute. A couple points that stood out were the argument that the case for worrying about AI risk does not depend on specific views of Singularity Institute folks, as well as Luke’s citation of an endorsement of the Singularity Institute by Nick Bostrom (who, in my opinion, does good work and probably knows what he’s talking about in these matters).
Second is the fact that some Singularity Institute donors have pledged to match donations to the Singularity Institute up to $150,000 for the month of July, which doubles the effectiveness of donations made between now and either when that goal is reached or the end of July, whichever comes first.
But I guess the real issue is not what put me over the edge, but what had me seriously considering donating to the Singularity Institute in the first place. The main reason is that I think big changes are coming soon enough that they’re worth worrying about now.
Furthermore, I think even if the Singularity Institute doesn’t directly accomplish everything Eliezer Yudkowsky hopes it will, its work is (right now) having good effects by getting important ideas out there through things like the Singularity Summit and the academic papers Luke has been working on. I think it’s valuable to increase the number of smart people working on these issues.
This is not to say you definitely should donate to the Singularity Institute; I’m actually somewhat uncertain as to whether it’s a better place for your charitable donations than the charities recommended by GiveWell. But I’d advise considering it.