AnonymousAccount

MacAskill was definitely a longtermist in 2012. But I don't think he mentioned it in Doing Good Better, or in any of the more public/introductory narratives around EA.

I think the "pivot to longermism" narrative is a reaction to a change in communication strategy (80000 hours becoming explicitly longtermist, EA intro materials becoming mostly longtermist). I think critics see it as a "sharp left turn" in the AI Alignment sense, where the longtermist values were there all along but were much more dormant while EA was less powerful.

There's a previous discussion here.

Not exactly an answer, but as an anecdote: I know an EA employee who was asked not to publish something 100% supportive of their employer in response to some criticism. We both found it a bit weird, but I assume that's how all organisations work.

Thanks so much for sharing your perspective, as the main party involved.[1] A minor nitpick:

As an example, the top-rated fund at GWWC is the one for Climate Change: https://www.givingwhatwecan.org/charities/founders-pledge-climate-change-fund

I think this might be misleading: there are 12 top-rated funds. The Climate Change one is one of the three "Top-rated funds working across multiple cause areas".

  1. ^

     (and thank you for all the good you're doing)

As other comments have noted, a lot of the proposals seem to be bottlenecked by funding and/or people leading them.

I would recommend that people interested in these things strongly consider Earning to Give or fundraising, or even just actually donating much more of their existing income or wealth.
If the 10 authors of this post can find 10 other people sympathetic to their causes, and each donates or fundraises $50k/year on average, they would have $1M/year of funding for causes that they think are even better than the ones currently funded by EA! If they get better results than existing EA funds, people and resources would flock to them!

If you think the current funding allocation is bad, the value of extra funding that you would be able to allocate better becomes much higher.

Especially if you want to gather funds for work on climate change, I suspect fundraising would be easier than for any other global cause area. Instead of asking Moskovitz/Open Phil for funding, it might be even higher EV to ask Gates, Bezos, or other billionaires. Anecdotally, when I talk to high-net-worth people (non-billionaires) about EA, the first comment is almost always "but what about climate change, which clearly is the most important thing to fund?"

I disagree with a lot of the content and the general vagueness/"EA should" framing of this post, but I appreciate the huge effort that went into it, and here I am trying to be constructive. I might write a less constructive comment with all my disagreements at some point if I can think of a way that could inform more consequentially useful criticism.

After reading most of the post, as a person who is in EA to give money/support, and not to get money/support, it's not clear to me how I can help you.

Also, I would add at the very least Gwern (who might be relevant to note regarding the current topic) and Scott Alexander as two other clear cases of "personalities" on LW.

I'm pretty sure EVF will send all that it's legally required to send to FTX depositors.

After that, I really hope that the argument "that money could instead go to FTX depositors" will not replace "that money could literally save several lives" when talking about buying food/drinks vs donating money.

Thank you for the pushback on the title!

I wonder what your thoughts are on delaying timelines instead of working on tooling, but I guess that might hinge on being more longtermist and on personal fit.

Edited the title; do you think this is good enough?

Could you please share your estimate? At the end of the day, we do need to decide what to work on.

Do you think it would be possible to edit this post to make it less harmful/bad/wrong, and still allow me to get feedback on what's wrong with my thinking? (I think something's wrong, and posted asking for feedback/thoughts).

E.g. keeping feedback like this

It's easy to talk about the importance of x-risks without making poverty and health charities the direct comparison. 

For me it is the direct comparison that matters though; I need to choose between those two.

I believe so.

I don't understand; you believe which one?

I still presume you care about people who suffer from systemic issues in the world. This kind of post would not be the kind of thing that would make anyone like this feel respected. 

Does that also apply to any post about e.g. animal welfare and climate change?

As for damage: maybe I can write more clearly that I'm probably wrong and that I'm a random anonymous account? Would be happy to edit this post!

This assumes that there is a 10% chance of extinction via AGI per day. I don't think anyone believes it to be that high; frankly if it ever gets that high we've already lost.


I don't think so; I think it assumes a 10% chance of extinction, one time, after we get AGI.

E.g. "we get AGI in 2050 -> 10B people die (on average) in 2060 with a 10% chance" vs "we get AGI in 2050 + 1 day -> 10B people die (on average) in 2060 + 1 day with a 10% chance"
