GiveWell previously recommended MSF as a good disaster relief org, so that would be my best guess. I'd love to know more, though.
“is there no EA press or comms unit that journalists contact before publishing such articles” — sometimes CEA or Forethought get asked for comment on pieces, but the vast majority of the time no one contacts us. It’s quite frustrating.
I don’t have an answer, but would suggest you talk to the folks at the Good Food Institute if you haven’t already — they might have advice, or at the very least be able to point you towards other people you could ask about this.
This is great, thanks for highlighting. Evidence Action is another excellent charity that’s nominated; here’s the link to vote for them: https://charitynavigator.typeform.com/to/PsmPZTwp#organization=Evidence%20Action%20Inc.
Thanks for this. I agree that we’ve been neglecting social media; the main reason for this, as far as I can tell, is that no one at CEA was primarily focused on comms/marketing until I was hired in September — and then other events proved to be attention-stealing.
Social media is going to be a major part of the communications strategy I outlined here; I expect you'll see us being more active in the coming months. https://forum.effectivealtruism.org/posts/mFGZtPKTjqrfeHHsH/how-cea-s-communications-team-is-thinking-about-ea
This is interesting and I broadly agree with you (though I think Habryka’s comment is important and right). On point 2, I’d want us to think very hard before adopting these as principles. It’s not obvious to me that non-violence is always the correct option — e.g. in World War 2 I think violence against the Nazis was a moral course of action.
As EA becomes increasingly involved in campaigning for states to act one way or another, a blanket non-violence policy could meaningfully and harmfully constrain us. (You could amend the clause to be “no non-state-sanc...
Thanks for this post, it's a really important issue. On tractability, do you think we'll be best off with technical fixes (e.g. maybe we should just try not to make sentient AIs?), or will it have to be policy? (Maybe it's way too early to even begin to guess).
This is really exciting, nice work on putting it together. Do you have any plans to put the teaching materials (even if that’s just a reading list) online at any point? I think I’m not the right sort of person to do the course but I’d love to slowly work my way through a reading list in my own time.
I think this is interesting, but I don't think it's as clear-cut as you're making out. There seem to me to be some instances where making the "first strike" is good — e.g. I think it'd be reasonable (though maybe not advisable) to criticise a billionaire for not donating any of their wealth; to criticise an AI company that's recklessly advancing capabilities; to criticise a virology lab that has unacceptably lax safety standards; or to criticise a Western government that is spending no money on foreign aid. Maybe your "personal attack" clause means this kind of stuff wouldn't get covered, though?
Great question, to which I don't have a simple answer. I think I agree with a lot of what Sjir said here. I think claims 2 and 4 are particularly important — I'd like the effective giving community to grow as its own thing, without all the baggage of EA, and I'm excited to see GWWC working to make that happen. That doesn't mean that in our promotion of EA we won't discuss giving at all, though, because giving is definitely a part of EA. I'm not entirely sure yet how we'll talk about it, but one thing I imagine is that giving will be included as a call-to-action in much of our content.
Really great post, thanks for writing this! EA's animal successes are indeed really impressive. I want to push back a bit on "no one cares about" this though. The "good things" forum post and Twitter thread I did back in December both did well; much of EAG programming is about wins; Animal Liberation Now, which has got a ton of attention, contains a whole chapter on progress in animal welfare; and indeed your own post got a ton of upvotes.
I do agree that we could always do more to celebrate and reflect on wins like this — I'm just pushing back becaus...
Thank you Shakeel, very good criticism.
My title was a bit gimmicky — I flagged it as such, but it was accepted (I think by Ben West, to incriminate him). I like such titles, so that may be my bias. Nevertheless, it was built from a very real disappointment (or something along those lines) at people not mentioning/celebrating the wins of EA when discussing and criticising it — but I think this "frustration" was built more on reading the external takes rather than the internal ones, which was hard for me to disentangle. I think I failed to make this clear in...
Definitely agreed that we need to showcase the action — hence my mention of "real-world impact and innovation" (and my examples of LEEP and far-UVC work as the kinds of things we're very excited to promote).
Sorry that you're struggling to find something here! I don't have any great ideas, but some stuff that might be promising avenues to explore:
I wonder how much this is a US/UK thing because of the types of flights people are taking. My assumption is that in the US the vast majority of flights are domestic, and I’d agree that business class just isn’t worth it on those planes (aside from the length of the flight, the planes are also not that nice!). The equivalent would be UK-Europe flights, for which business class definitely doesn’t seem worth it. But most UK travel in my experience ends up being very long haul, normally transatlantic — and on those, business class is clearly much, much better ...
I think this is a really great report — by far the most comprehensive set of policy proposals I’ve seen. I don’t agree with everything in it, but I do hope the UK government takes it seriously. I particularly like the Sentinel idea, and the proposals for tiered governance of AI systems.
OpenPhil likely has some research on this — they fund work in this area: https://www.openphilanthropy.org/grants/university-of-california-berkeley-aging-research-irina-conboy-2023/
I agree with everything you said here, and would also add an analogy: in the for-profit world it is very common, and actively encouraged, for major investors to have board seats, because it ensures the investor has some level of control and visibility over how their money is used — which seems very reasonable to me.
I'm really glad you wrote this; I've been worried about the same thing. I'm particularly worried at how few people are working on it given the potential scale and urgency of the problem. It also seems like an area where the EA ecosystem has a strong comparative advantage — it deals with issues many in this field are familiar with, requires a blend of technical and philosophical skills, and is still too weird and nascent for the wider world to touch (for now). I'd be very excited to see more research and work done here, ideally quite soon.
This is a good idea! I think Longview and Effective Giving are already doing this to some extent, so it could be worth reaching out to them.
I’m really sorry you’re feeling this way!
I wanted to add my personal perspective. I joined CEA in September, after a career in journalism. One of the things I was most delighted by when I joined was just how good the work-life balance was — so, so much better than in any other job I’ve had. I didn’t feel any obligation to work evenings or weekends, and indeed was actively encouraged not to (my boss, Max, didn’t have Slack on his phone and left his work laptop at the office when he went home — which set a really good example for the rest of us). I also real...
FWIW I actively like that the new comments are less prominent now — the previous design made me anxious; it felt like there was pressure to click everything to make the loud blue stuff go away. The new design feels much less attention-stealing.
I'm really excited that this is happening. As far as I can tell, there's a dearth of effective, Zakat-compliant giving options; this is a huge step towards remedying that.
Hi Jeremiah. I was the hiring manager here, and I think there's been something of a misunderstanding: I don't think this is an accurate summary of why we made the decision we did. It feels weird to discuss this in public, but I consent to you publishing the full rejection email we sent, if you would like.
Max is a phenomenal leader, and I’m very sad to see him go. He’s one of the most caring and humble people I’ve ever worked with, and his management and support during a very difficult few months has been invaluable. He’s also just a genuine delight to be around.
It’s deeply unfair that this job has taken a toll on him, and I’m very glad that he’s chosen what’s right for him.
Max has taught me so much, and I’ll be forever grateful for that. And I’m looking forward to continuing to work with him as an advisor — I know he’ll continue to be a huge help.
Hi! Thanks for this post. I do want to highlight that there is EA-linked work on nuclear security — most notably Carl Robichaud’s program at Longview Philanthropy. From conversations I’ve had with Carl, there are some really interesting and potentially very cost-effective interventions in this space. It certainly sounds like there’s room for collaboration here!
The most relevant bit of the Page Six article:
...A press release heralding the launch of Henry Elkus’ startup Helena — which said it “brings together global influencers to create positive world change,” but remained, at best, hazy on exactly how — also claimed that luminaries from pop star Selena Gomez to Gen. Stanley McChrystal were on board in various roles.
But when we asked Gomez’s publicist about her role, we got a curt, “She’s not involved.”
Henry explained the discrepancy to Page Six: “Selena was asked directly to be part of the group. We haven’t been de
Sorry for the slow response.
I wanted to clarify and apologise for some things here (not all of these are criticisms you’ve specifically made, but this is the best place I can think of to respond to various criticisms that have been made):
I agree with various concerns that have been raised about CEA and others in the community caring too much about PR concerns; I think truthfully saying what you believe — carefully and with compassion — is almost always more important than anything else
CEA's current media policy forbids employees from commenting on controversial issues without permission from leaders (including you). Does the view you express here mean you disagree with this policy? At present it seems that you have had the right to shoot from the hip with your personal opinions but ordinary CEA employees do not.
Thanks for writing this up Amber — this is the sense that we intended in our statement and in the intro essay that it refers to (though I didn’t write the intro essay). We have edited the intro essay to make clearer that this is what we mean, and also to make clear that these principles are more like “core hypotheses, but subject to revision” than “set in stone”.
Thanks for calling me out on this — I agree that I was too hasty to call for a response.
I’m glad that FLI has shared more information, and that they are rethinking their procedures as a result of this. This FAQ hasn’t completely alleviated my concerns about what happened here — I think it’s worrying that something like this can get to the stage it did without it being flagged (though again, I'm glad FLI seems to agree with this). And I also think that it would have been better if FLI had shared some more of the FAQ info with Expo too.
I do regret calling fo...
Thanks for this, Shakeel. This seems like a particularly rough time to be running comms for CEA. I’m grateful that in addition to having that on your plate, in your personal capacity you’re helping to make the community feel more supportive for non-white EAs feeling the alienation you point to — and for doing that despite the emotional labour involved, which typically makes me shy away from internet discussions.
Responding swiftly to things seems helpful in service of that support. One of the risks from that is that you can end up taking a particular...
Hey Shakeel,
Thank you for making the apology, you have my approval for that! I also like your apology on the other thread – your words are hopeful for CEA going in a good direction.
Some feedback/reaction from me that I hope is helpful. In describing your motivation for the FLI comment, you say that it was not to throw FLI under the bus, but because of your fear that some people would think EA is racist, and you wanted to correct that. To me, that is a political motivation, not much different from a PR motivation.
To gesture at the difference (in my ontology...
Hey Shakeel, thanks for your apology and update (and I hope you've apologized to FLI). Even though call-out culture may be popular or expected in other contexts, it is not professional or appropriate for the Comms Head of CEA to initiate an interaction with an EA org by publicly putting them on blast and seemingly seconding what could be very damaging accusations (as well as inventing others by speculating about financial misconduct). Did you try to contact FLI before publicly commenting to get an idea of what happened (perhaps before they could prepare th...
Thanks for sharing this. However it doesn't really answer the core question of why FLI ever thought this was okay. "We ultimately decided to reject it because of what our subsequent due diligence uncovered" — given that your brother is a writer there, did you not know beforehand that Nya Dagbladet publishes horrific, racist content? I find it hard to believe this was not known until the due diligence stage.
My brother never worked there. He published some articles there, but they've never paid him anything.
I'm very sorry for your loss and apologise for jumping to conclusions about why there wasn't an immediate statement.
Hi Jack — reasonable question! When I wrote this post I just didn't see what the legal problems might be for FLI. With FTX, there are a ton of complications, most notably with regards to bankruptcy/clawbacks, and the fact that actual crimes were (seemingly) committed. This FLI situation, on face value, didn't seem to have any similar complications — it seemed that something deeply immoral was done, but nothing more than that. Jason's comment has made me realise there might be something else going on here, though; if that is the case then that would m...
The following is my personal opinion, not CEA's.
If this is true it's absolutely horrifying. FLI needs to give a full explanation of what exactly happened here and I don't understand why they haven't. If FLI did knowingly agree to give money to a neo-Nazi group, that’s despicable. I don't think people who would do something like that ought to have any place in this community.
Hi Shakeel,
Thanks for this. I agree with your post and upvoted it.
However, I do also wonder if they are following what seems to be a common theme in EA crisis comms recently, which is to say as little as possible (presumably on the basis of legal advice). You wrote about this here: https://forum.effectivealtruism.org/posts/Et7oPMu6czhEd8ExW/why-you-re-not-hearing-as-much-from-ea-orgs-as-you-d-like
I agree with you that just about any comment or explanation from FLI would seem to help, and that passing the email exchange with Max over to Denton's seems to ma...
Want to note on this thread that CEA has published a statement on this: "Effective altruism is based on the core belief that all people count equally. We unequivocally condemn Nick Bostrom’s recklessly flawed and reprehensible words. We reject this unacceptable racist language, and the callous discussion of ideas that can and have harmed Black people. It is fundamentally inconsistent with our mission of building an inclusive and welcoming community."
Hi — this was removed accidentally while updating other text on the page. We'll put it back ASAP (might take a few days though because of timezones/people being on holiday, and I don't have access to edit that page).
Thanks for drawing our attention to this and calling us out for it — we definitely appreciate it (and, at the meta-level, I'm very glad we have a community that pushes us on things like this).
Hi — I think this post overstates the level of program-level centralisation here.
The EVF and CEA US boards provide overall oversight and governance of the projects and their executive directors, and will occasionally step in to change something important. But they have largely delegated program-level responsibility to each project’s executive director, who each set their own strategy for how to best have a positive impact on the world.
In practice, those strategies do differ: to give a couple of examples, CEA and 80,000 Hours have pretty different app...
Yeah that was in process but I'm not sure what the timeline on it is. We generally use "EV" to refer to the thing your diagrams are pointing to, though EV is not an organisation in itself — just an umbrella term for the two distinct entities. Thanks!
One clarification on this: CEA, 80,000 Hours etc. are all projects of the Effective Ventures group — which is the umbrella term for EVF (a UK registered charity) and CEA USA Inc. (a US registered charity), two separate legal entities which work together.
Hi! Thanks for asking. (And sorry for the slow response — trying not to work too much over the weekend...)
As per the Commission’s guidance, we’re supposed to file such a report in cases where we expect a reputational or financial impact from events. This is one of those cases, and we expect there might be an ongoing conversation with the Charity Commission as a result.
I've downvoted this, since it's basically a non-answer that doesn't address the OP's concerns / requests for information, and we haven't had any further info or reassurance from CEA since the question was asked.
Thanks for replying. Can you confirm for people that money donated to or through CEA is safe as far as CEA officials know (at least, as safe as it seemed before there was widespread knowledge of problems at FTX)?
I recently took the Giving What We Can Pledge, and I'm planning to donate 10% of my annual income to GiveWell this year. It looks like this year is a particularly good time to donate to GiveWell, as they are both more funding constrained than expected and have found more cost-effective opportunities that need funding. I think I'm going to donate to the All Grants Fund, because I'm excited about the prospect of evaluating and scaling new charities.
Everything that's posted on the EA Forum is public, and so journalists can (and often will) quote it. (Though obviously a lot of stuff is posted on the Forum, and most of it won't get attention from journalists!)
Hi — thanks for the question.
In April, Effective Ventures purchased Wytham Abbey and some land around it (but <1% of the 2,500-acre estate you're suggesting). Wytham is in the process of being established as a convening centre to run workshops and meetings that bring together people to think seriously about how to address important problems in the world. The vision is modelled on traditional specialist conference centres, e.g. Oberwolfach, The Rockefeller Foundation Bellagio Center or the Brocher Foundation.
The purchase was made from a large grant made ...
As someone who works on comms stuff, I struggle with this a lot too! One thing I've found helpful is just asking decision makers, or people close to decision makers, why they did something. It's imperfect, but often helpful — e.g. when I've asked DC people what catalysed the increased political interest in AI safety, they overwhelmingly cited the CAIS letter, which seems like a fairly good sign that it worked. (Similarly, I've heard from people that Ian Hogarth's FT article may have achieved a similar effect in the UK.)
There are also proxies that can be ki...