All of Shakeel Hashim's Comments + Replies

As someone who works on comms stuff, I struggle with this a lot too! One thing I've found helpful is just asking decision makers, or people close to decision makers, why they did something. It's imperfect, but often helpful — e.g. when I've asked DC people what catalysed the increased political interest in AI safety, they overwhelmingly cited the CAIS letter, which seems like a fairly good sign that it worked. (Similarly, I've heard from people that Ian Hogarth's FT article may have achieved a similar effect in the UK.)

There are also proxies that can be ki... (read more)

This is really cool, thanks for organising it!

GiveWell has previously recommended MSF as a good disaster relief org, so that would be my best guess. I'd love to know more, though.

“is there no EA press or comms unit that journalists contact before publishing such articles” — sometimes CEA or Forethought get asked for comment on pieces, but the vast majority of the time no one contacts us. It’s quite frustrating.

Yeah, the phrase "woke mob" (and similar) is extremely common in conservative media!

I don’t have an answer, but would suggest you talk to the folks at the Good Food Institute if you haven’t already — they might have advice, or at the very least be able to point you towards other people you could ask about this.

8
Michelle Hauser
7mo
Hey Shakeel, I actually work part-time for GFI. I have spoken to several of the Scitech folks, and surprisingly there is no set strategy for these cases. There seems to be a gradual change over time from strongly supporting open-source science to a more nuanced 'maybe patent, but in a non-limiting way'.

This is great, thanks for highlighting. Evidence Action is another excellent charity that’s nominated, here’s the link to vote for them: https://charitynavigator.typeform.com/to/PsmPZTwp#organization=Evidence%20Action%20Inc.

1
Ann Garth
7mo
Thanks for catching this! I missed them when doing my initial scan of the nominees list.

Thanks for this. I agree that we’ve been neglecting social media; the main reason for this as far as I can tell is that no one at CEA was primarily focused on comms/marketing until I was hired in September; then other events proved to be attention-stealing.

Social media is going to be a major part of the communications strategy I outlined here; I expect you'll see us being more active in the coming months. https://forum.effectivealtruism.org/posts/mFGZtPKTjqrfeHHsH/how-cea-s-communications-team-is-thinking-about-ea

1
Andie Rodriguez
8mo
Thank you so much for the response! I look forward to seeing the upcoming changes. There are so many potential opportunities and avenues of growth; it's exciting to think about!

This is interesting and I broadly agree with you (though I think Habryka’s comment is important and right). On point 2, I’d want us to think very hard before adopting these as principles. It’s not obvious to me that non-violence is always the correct option — e.g. in World War 2 I think violence against the Nazis was a moral course of action.

As EA becomes increasingly involved in campaigning for states to act one way or another, a blanket non-violence policy could meaningfully and harmfully constrain us. (You could amend the clause to be “no non-state-sanc... (read more)

7
Aaron Gertler
9mo
Regarding point 2, I'd argue that both "honesty" and "non-violence" are implied by the actual text of the fourth principle on the page: I think this text, or something very similar, has been a part of this list since at least 2018. It directly calls out honesty as important, and I think the use of "compassion" and the discouragement of "ends justify the means" reasoning both point clearly towards "don't do bad things to other people", where "bad things" include (but are not limited to) violence.
4
freedomandutility
9mo
Agree that non-violence and honesty aren’t always the best option, but neither is collaboration, and collaborative spirit is listed as a core value. I think “true in 99% of cases” is fine for something to be considered a core EA value. I’d also add that in practice we already abide by honesty and non-violence to a similar degree to which we abide by the collaborative spirit principle. I do think honesty and non-violence should be added to the list of core principles to further promote these values within EA, but I think the case for adding these values is stronger from a “protection against negative PR if someone violates these principles” perspective.

Thanks for this post, it's a really important issue. On tractability, do you think we'll be best off with technical fixes (e.g. maybe we should just try not to make sentient AIs?), or will it have to be policy? (Maybe it's way too early to even begin to guess).

2
jeffsebo
8mo
Good question! I think that the best path forward requires taking a "both-and" approach. Ideally we can (a) slow down AI development to buy AI ethics, safety, and sentience researchers time and (b) speed up these forms of research (focusing on moral, political, and technical issues) to make good use of this time. So, yes, I do think that we should avoid creating potentially sentient AI systems in the short term, though as my paper with Rob Long discusses, that might be easier said than done. As for whether we should create potentially sentient AI systems in the long run (and how individuals, companies, and governments should treat them to the extent that we do), that seems like a much harder question, and it will take serious research to address it. I hope that we can do some of that research in the coming years!

Makes total sense — thank you, and looking forward to the handbook!

This is really exciting, nice work on putting it together. Do you have any plans to put the teaching materials (even if that’s just a reading list) online at any point? I think I’m not the right sort of person to do the course but I’d love to slowly work my way through a reading list in my own time.

5
Weaver
9mo
This is exactly my thought right here. I would like to go through the materials, but full-time is too rapid for me currently.

I think this is interesting but don't think this is as clear cut as you're making out. There seem to me to be some instances where making the "first strike" is good — e.g. I think it'd be reasonable (though maybe not advisable) to criticise a billionaire for not donating any of their wealth; to criticise an AI company that's recklessly advancing capabilities; to criticise a virology lab that has unacceptably lax safety standards; or to criticise a Western government that is spending no money on foreign aid. Maybe your "personal attack" clause means this kind of stuff wouldn't get covered, though?

Great question, to which I don't have a simple answer. I think I agree with a lot of what Sjir said here. I think claims 2 and 4 are particularly important — I'd like the effective giving community to grow as its own thing, without all the baggage of EA, and I'm excited to see GWWC working to make that happen. That doesn't mean that in our promotion of EA we won't discuss giving at all, though, because giving is definitely a part of EA. I'm not entirely sure yet how we'll talk about it, but one thing I imagine is that giving will be included as a call-to-action in much of our content.

6
Gemma Paterson
9mo
That seems reasonable - I think the target audience for effective giving is much bigger. The call-to-action is really what I'm getting at, so I'm pleased to see that ☺️

Really great post, thanks for writing this! EA's animal successes are indeed really impressive. I want to push back a bit on "no one cares about" this though. The "good things" forum post and Twitter thread I did back in December both did well; much of EAG programming is about wins; Animal Liberation Now, which has got a ton of attention, contains a whole chapter on progress in animal welfare; and indeed your own post got a ton of upvotes.

 I do agree that we could always do more to celebrate and reflect on wins like this — I'm just pushing back becaus... (read more)

Thank you Shakeel, very good criticism.

My title was a bit gimmicky, I flagged it as such, but it was accepted (I think by Ben West, to incriminate him). I like such titles, so it may be my bias. Nevertheless, it was built from a very real disappointment (or something along those lines) at people not mentioning/celebrating the wins of EA when discussing and criticizing it, but I think this "frustration" was built more on reading the external takes rather than the internal ones - which was hard for me to disentangle. I think I failed to make this clear in... (read more)

Definitely agreed that we need to showcase the action — hence my mention of "real-world impact and innovation" (and my examples of LEEP and far-UVC work as the kinds of things we're very excited to promote).

Sorry that you're struggling to find something here! I don't have any great ideas, but some stuff that might be promising avenues to explore:

... (read more)
7
will112
9mo
Hey! Thank you SO much for all these resources! They were very helpful, and I'm glad there are a lot more opportunities to write about EA than I thought.

I wonder how much this is a US/UK thing because of the types of flights people are taking. My assumption is that in the US the vast majority of flights are domestic, and I’d agree that business class just isn’t worth it on those planes (aside from the length of the flight, the planes are also not that nice!). The equivalent would be UK-Europe flights, for which business class definitely doesn’t seem worth it. But most UK travel in my experience ends up being very long haul, normally transatlantic — and on those, business class is clearly much, much better ... (read more)

I think this is a really great report — by far the most comprehensive set of policy proposals I’ve seen. I don’t agree with everything in it, but I do hope the UK government takes it seriously. I particularly like the Sentinel idea, and the proposals for tiered governance of AI systems.

1
indrekk
10mo
Wow, nice! Thanks for sharing! That's great news!

I agree with everything you said here, and would also add an analogy: in the for-profit world it is very common, and actively encouraged, for major investors to have board seats, because it ensures the investor has some level of control and visibility over how their money is used — which seems very reasonable to me.

I'm really glad you wrote this; I've been worried about the same thing. I'm particularly worried at how few people are working on it given the potential scale and urgency of the problem. It also seems like an area where the EA ecosystem has a strong comparative advantage — it deals with issues many in this field are familiar with, requires a blend of technical and philosophical skills, and is still too weird and nascent for the wider world to touch (for now). I'd be very excited to see more research and work done here, ideally quite soon.

This is a good idea! I think Longview and Effective Giving are already doing this to some extent, so it could be worth reaching out to them.

1
Kyle Smith
1y
I am definitely working on that. Early returns suggest that this type of approach is outside their strategic plan. I think option (2) is ideal, but I'm not sure they will be up for that.

I’m really sorry you’re feeling this way!

I wanted to add my personal perspective. I joined CEA in September, after a career in journalism. One of the things I was most delighted by when I joined was just how good the work-life balance was — so, so much better than in any other job I’ve had. I didn’t feel any obligation to work evenings or weekends, and indeed was actively encouraged not to (my boss, Max, didn’t have Slack on his phone and left his work laptop at the office when he went home — which set a really good example for the rest of us). I also real... (read more)

FWIW I actively like that the new comments are less prominent now — the previous design made me anxious; it felt like there was pressure to click everything to make the loud blue stuff go away. The new design feels much less attention-stealing.

I'm really excited that this is happening. As far as I can tell, there's a dearth of effective, Zakat-compliant giving options; this is a huge step towards remedying that.

Hi Jeremiah. I was the hiring manager, and I think there's been something of a misunderstanding here: I don't think this is an accurate summary of why we made the decision we did. It feels weird to discuss this in public, but I consent to you publishing the full rejection email we sent, if you would like.

-3
JeremiahJohnson
1y
I don't particularly feel it would be a valuable use of anyone's time to get into a drawn out public back-and-forth debate where we both nitpick the implications of various word choices. I'll just say that if your intention was to communicate something other than "We prefer a candidate who is tightly value aligned" then there was a significant failure of communication and you shouldn't have specifically used the phrase "tightly aligned" in the same sentence as the rejection.

Max is a phenomenal leader, and I’m very sad to see him go. He’s one of the most caring and humble people I’ve ever worked with, and his management and support during a very difficult few months has been invaluable. He’s also just a genuine delight to be around.

It’s deeply unfair that this job has taken a toll on him, and I’m very glad that he’s chosen to do what’s right for him.

Max has taught me so much, and I’ll be forever grateful for that. And I’m looking forward to continuing to work with him as an advisor — I know he’ll continue to be a huge help.

Hi! Thanks for this post. I do want to highlight that there is EA-linked work on nuclear security — most notably Carl Robichaud’s program at Longview Philanthropy. From conversations I’ve had with Carl, there are some really interesting and potentially very cost-effective interventions in this space. It certainly sounds like there’s room for collaboration here!

https://forum.effectivealtruism.org/posts/M7wNHbpqnLfDzmDK9/new-nuclear-security-grantmaking-programme-at-longview

The most relevant bit of the Page Six article:

A press release heralding the launch of Henry Elkus’ startup Helena — which said it “brings together global influencers to create positive world change,” but remained, at best, hazy on exactly how — also claimed that luminaries from pop star Selena Gomez to Gen. Stanley McChrystal were on board in various roles.

But when we asked Gomez’s publicist about her role, we got a curt, “She’s not involved.”

Henry explained the discrepancy to Page Six: “Selena was asked directly to be part of the group. We haven’t been de

... (read more)

Sorry for the slow response.

I wanted to clarify and apologise for some things here (not all of these are criticisms you’ve specifically made, but this is the best place I can think of to respond to various criticisms that have been made):

  1. This statement was drafted and originally intended to be a short quote that we could send to journalists if asked for comment. On reflection, I think that posting something written for that purpose on the Forum was the wrong way to communicate with the community and a mistake. I am glad that we posted something,
... (read more)

I agree with various concerns that have been raised about CEA and others in the community caring too much about PR; I think truthfully saying what you believe — carefully and with compassion — is almost always more important than anything else.

CEA's current media policy forbids employees from commenting on controversial issues without permission from leaders (including you). Does the view you express here mean you disagree with this policy? At present it seems that you have had the right to shoot from the hip with your personal opinions but ordinary CEA employees do not.

I appreciate this

Thanks for writing this up Amber — this is the sense that we intended in our statement and in the intro essay that it refers to (though I didn’t write the intro essay). We have edited the intro essay to make clearer that this is what we mean, and also to make clear that these principles are more like “core hypotheses, but subject to revision” than “set in stone”.

Thanks for calling me out on this — I agree that I was too hasty to call for a response.

I’m glad that FLI has shared more information, and that they are rethinking their procedures as a result of this. This FAQ hasn’t completely alleviated my concerns about what happened here — I think it’s worrying that something like this can get to the stage it did without it being flagged (though again, I'm glad FLI seems to agree with this). And I also think that it would have been better if FLI had shared some more of the FAQ info with Expo too.

I do regret calling fo... (read more)

Thanks for this Shakeel. This seems like a particularly rough time to be running comms for CEA. I’m grateful that in addition to having that on your plate, in your personal capacity you’re helping to make the community feel more supportive for non-white EAs feeling the alienation you point to. Also for doing that despite the emotional labour involved in that, which typically makes me shy away from internet discussions.

Responding swiftly to things seems helpful in service of that support. One of the risks from that is that you can end up taking a particular... (read more)

-3
D0TheMath
1y
I find myself disliking this comment, and I think it's mostly because it sounds like you 1) agree with many of the blunders Rob points out, yet 2) don’t seem to have learned anything from your mistake here? I don’t think many do or should blame you, and I’m personally concerned about repeated similar blunders on your part costing EA much loss of outside reputation and internal trust. Like, do you think that the issue was that you were responding in heat, and if so, will you make a future policy of not responding in heat in similar situations? I feel like there are deeper problems here that won’t be corrected by such a policy, and your lack of concreteness is an impediment to communicating such concerns about your approach to CEA comms (and is itself a repeated issue that won’t be corrected by such a policy).
4
RobBensinger
1y
Makes sense to me! I appreciate knowing your perspective better, Shakeel. :) On reflection, I think the thing I care about in situations like this is much more "mutual understanding of where people were coming from and where they're at now", whether or not anyone technically "apologizes". Apologizing is one way of communicating information about that (because it suggests we're on the same page that there was a nontrivial foreseeable-in-advance fuck-up), but IMO a comment along those lines could be awesome without ever saying the words "I'm sorry". One of my concerns about "I'm sorry" is that I think some people think you can only owe apologies to Good Guys, not to Bad Guys. So if there's a disagreement about who the Good Guys are, communities can get stuck arguing about whether X should apologize for Y, when it would be more productive to discuss upstream disagreements about facts and values. I think some people are still uncertain about exactly how OK or bad FLI's actions here were, but whether or not FLI fucked up badly here and whether or not FLI is bad as an org, I think the EA Forum's response was bad given the evidence we had at the time. I want our culture to be such that it's maximally easy for us to acknowledge that sort of thing and course-correct so we do better next time. And my intuition is that a sufficiently honest explanation of where you were coming from, that's sufficiently curious about and open to understanding others' perspectives, and sufficiently lacking in soldier-mindset-style defensiveness, can do even more than an apology to contribute to a healthy culture. (In this case the apology is to FLI/Max, not to me, so it's mostly none of my business. 😛 But since I called for "apologies" earlier, I wanted to consider the general question of whether that's the thing that matters most.)
46
Ruby
1y

Hey Shakeel,

Thank you for making the apology, you have my approval for that! I also like your apology on the other thread – your words are hopeful for CEA going in a good direction.

Some feedback/reaction from me that I hope is helpful. In describing your motivation for the FLI comment, you say that it was not to throw FLI under the bus, but because of your fear that some people would think EA is racist, and you wanted to correct that. To me, that is a political motivation, not much different from a PR motivation.

To gesture at the difference (in my ontology... (read more)

I liked this apology.

Hey Shakeel, thanks for your apology and update (and I hope you've apologized to FLI). Even though call-out culture may be popular or expected in other contexts, it is not professional or appropriate for the Comms Head of CEA to initiate an interaction with an EA org by publicly putting them on blast and seemingly seconding what could be very damaging accusations (as well as inventing others by speculating about financial misconduct). Did you try to contact FLI before publicly commenting to get an idea of what happened (perhaps before they could prepare th... (read more)

Thanks for sharing this. However it doesn't really answer the core question of why FLI ever thought this was okay. "We ultimately decided to reject it because of what our subsequent due diligence uncovered" — given that your brother is a writer there, did you not know beforehand that Nya Dagbladet publishes horrific, racist content? I find it hard to believe this was not known until the due diligence stage.

26
Tegmark
1y

My brother never worked there.  He published some articles there, but they've never paid him anything. 

I'm very sorry for your loss and apologise for jumping to conclusions about why there wasn't an immediate statement.

Hi Jack — reasonable question! When I wrote this post I just didn't see what the legal problems might be for FLI. With FTX, there are a ton of complications, most notably with regards to bankruptcy/clawbacks, and the fact that actual crimes were (seemingly) committed. This FLI situation, on face value, didn't seem to have any similar complications — it seemed that something deeply immoral was done, but nothing more than that. Jason's comment has made me realise there might be something else going on here, though;  if that is the case then that would m... (read more)

6
[anonymous]
9mo
Please don't do this. I think it's very unfair to act like there are clear standards in a gray area where those standards are just enough to cover your own ass. The example I often think of is an adult brother and sister keeping in touch with their parents, where the brother decides the standard is to call them daily and does so, the sister decides the standard is to call them weekly and does so, and then the brother chastises the sister for not doing enough. (And perhaps it's easier for the brother to call daily, and perhaps the sister knows their parents find it annoying how often the brother calls, and perhaps the sister is doing other nice things for their parents etc.) I feel a bit bad about singling you out because I think other orgs do this too (CE being a recent example), but I guess I'd come to expect more from CEA as I generally consider them to be one of the more professional EA orgs. I'm thankful that I haven't seen any more comments from you like this one or the one above in recent months, but I thought someone should say something about this comment too.
5
Jack Lewars
1y
Thanks for responding Shakeel.

The following is my personal opinion, not CEA's.

If this is true it's absolutely horrifying.  FLI needs to give a full explanation of what exactly happened here and I don't understand why they haven't. If FLI did knowingly agree to give money to a neo-Nazi group, that’s despicable.  I don't think people who would do something like that ought to have any place in this community.

Hi Shakeel,

Thanks for this. I agree with your post and upvoted it.

However, I do also wonder if they are following what seems to be a common theme in EA crisis comms recently, which is to say as little as possible (presumably on the basis of legal advice). You wrote about this here: https://forum.effectivealtruism.org/posts/Et7oPMu6czhEd8ExW/why-you-re-not-hearing-as-much-from-ea-orgs-as-you-d-like

I agree with you that just about any comment or explanation from FLI would seem to help, and that passing the email exchange with Max over to Denton's seems to ma... (read more)

0
Daniel_Eth
1y
Agree.

I resonated with this post a lot. Thank you for writing it.

Want to note on this thread that CEA has published a statement on this: "Effective altruism is based on the core belief that all people count equally. We unequivocally condemn Nick Bostrom’s recklessly flawed and reprehensible words. We reject this unacceptable racist language, and the callous discussion of ideas that can and have harmed Black people. It is fundamentally inconsistent with our mission of building an inclusive and welcoming community."

Hi — this was removed accidentally while updating other text on the page. We'll put it back ASAP (might take a few days though because of timezones/people being on holiday, and I don't have access to edit that page).

Thanks for drawing our attention to this and calling us out for it — we definitely appreciate it (and, at the meta-level, I'm very glad we have a community that pushes us on things like this).

1
pseudonym
1y
Removed accidentally? I don't know much about web dev, but how do you accidentally edit a website and accidentally delete a specific paragraph?

Hi — I think this post overstates the level of program-level centralisation here. 

The EVF and CEA US boards provide overall oversight and governance of the projects and their executive directors, and will occasionally step in to change something important. But they have largely delegated program-level responsibility to each project’s executive director, who each set their own strategy for how to best have a positive impact on the world.

In practice, those strategies do differ: to give a couple of examples, CEA and 80,000 Hours have pretty different app... (read more)

Yeah that’d be great. Thanks!

3
Jeff Kaufman
1y
I'm confused: Announcing Interim CEOs of EVF consistently uses "EVF", including in phrases like "80k, one of the largest EVF projects". Should I switch these diagrams back to say "EVF"? (I raised this when the other post came out, but got no response.)
3
Jeff Kaufman
1y
edited!

Yeah that was in process but I'm not sure what the timeline on it is. We generally use "EV" to refer to the thing your diagrams are pointing to, though EV is not an organisation in itself — just an umbrella term for the two distinct entities. Thanks!

2
Jeff Kaufman
1y
Would redoing the diagrams and text with "EV" instead of "EVF" be better? (Personally I like "EVF" better, mostly because three-letter acronyms have fewer conflicts, but if the umbrella org wants to go by "EV" I'm happy to help)

One clarification on this: CEA, 80,000 Hours etc. are all projects of the Effective Ventures group — which is the umbrella term for EVF (a UK registered charity) and CEA USA Inc. (a US registered charity), two separate legal entities which work together.

4
Guy Raveh
1y
Who formally employs the staff of these projects, though?
7
Jeff Kaufman
1y
Thanks! I had thought CEA USA was in the process of being renamed to EVF USA? What is the acronym you would like to see for the umbrella organization? "EVF", "EVG", "EV"? I'm happy to edit my diagrams if EVF is not the preferred acronym.

Hi! Thanks for asking. (And sorry for the slow response — trying not to work too much over the weekend...)

As per the Commission’s guidance, we’re supposed to file such a report in cases where we expect a reputational or financial impact from events. This is one of those cases, and we expect there might be an ongoing conversation with the Charity Commission as a result.

I've downvoted this, since it's basically a non-answer that doesn't address the OP's concerns / requests for information, and we haven't had any further info or reassurance from CEA since the question was asked.

Thanks for replying. Can you confirm for people that money donated to or through CEA is safe as far as CEA officials know (at least, as safe as it seemed before there was widespread knowledge of problems at FTX)?

Answer by Shakeel Hashim, Nov 23, 2022

I recently took the Giving What We Can pledge, and I'm planning to donate 10% of my annual income to GiveWell this year. It looks like this year is a particularly good time to donate to GiveWell, as they are both more funding constrained than expected and have found more cost-effective opportunities that need funding. I think I'm going to donate to the All Grants Fund, because I'm excited about the prospect of evaluating and scaling new charities.

Everything that's posted on the EA Forum is public, and so journalists can (and often will) quote it. (Though obviously a lot of stuff is posted on the Forum, and most of it won't get attention from journalists!)

Hi — thanks for the question.

In April, Effective Ventures purchased Wytham Abbey and some land around it (but <1% of the 2,500-acre estate you're suggesting). Wytham is in the process of being established as a convening centre to run workshops and meetings that bring together people to think seriously about how to address important problems in the world. The vision is modelled on traditional specialist conference centres, e.g. Oberwolfach, The Rockefeller Foundation Bellagio Center or the Brocher Foundation.

The purchase was made from a large grant made ... (read more)
