Community
Posts about the EA community and projects that focus on the EA community

Quick takes

10 · 4d · 1
Have Will MacAskill, Nick Beckstead, or Holden Karnofsky responded to the reporting by Time that they were warned about Sam Bankman-Fried's behaviour years before the FTX collapse?
5 · 6d · 14
Would love for orgs running large-scale hiring rounds (say 100+ applicants) to provide more feedback to their (rejected) applicants. Given that in most cases applicants are already being scored and ranked on their responses, maybe just tell them their scores, their overall ranking, and what the next-round cutoff would have been, e.g. prompt 1 = 15/20, prompt 2 = 17.5/20, rank = 156/900, cutoff for the work test at 100. Since this scoring is already happening in the background (if my impression here is wrong, please let me know), why not make the process more transparent and release the scores? It seems to require very little extra work (beyond some initial automation).
1 · 13d
Stay safe, and stay alert (especially on the roads/while driving)
61 · 13d · 47
Please, people, do not treat Richard Hanania as some sort of worthy figure who is a friend of EA. He was a Nazi, and while he claims he has moderated his views, he is still very racist as far as I can tell. Hanania called for getting rid of all non-white immigrants in the US and the sterilization of everyone with an IQ under 90, indulged in antisemitic attacks on the allegedly Jewish elite, and even after his supposed reform was writing about the need for the state to harass and imprison Black people specifically ('a revolution in our culture or form of government. We need more policing, incarceration, and surveillance of black people', https://en.wikipedia.org/wiki/Richard_Hanania).

Yet in the face of this, and after he made an incredibly grudging apology about his most extreme material (after journalists dug it up), he has been invited to Manifold's events and put on Richard Yetter Chappell's blogroll.

DO NOT DO THIS. If you want people to distinguish benign transhumanism (which I agree is a real thing*) from the racist history of eugenics, do not fail to shun actual racists and Nazis. Likewise, if you want to promote "decoupling" factual beliefs from policy recommendations, which can be useful, do not duck and dive around the fact that virtually every major promoter of scientific racism ever, including allegedly mainstream figures like Jensen, worked with or published with actual literal Nazis (https://www.splcenter.org/fighting-hate/extremist-files/individual/arthur-jensen).

I love most of the people I have met through EA, and I know that, despite what some people say on Twitter, we are not actually a secret crypto-fascist movement (nor is longtermism specifically, which, whether you like it or not, is mostly about what its EA proponents say it is about). But there is, in my view, a disturbing degree of tolerance for this stuff in the community, mostly centered around the Bay specifically. And to be clear, I am complaining about tolerance for people with far-right and fascist views.
21 · 14d
Applications are still open for upcoming EA Global conferences in 2024!
• EA Global: London (31 May–2 June) | Application deadline is in ~6 weeks
• EA Global: Boston (1–3 November)
Apply here and find more details on our website. You can also email the team at hello@eaglobal.org if you have any questions.
26 · 15d · 18
EA (via discussion of SBF and FTX) was briefly discussed on The Rest is Politics podcast today (the 3rd of April), and ... I'm really irritated by what was said. This is one of the largest politics podcasts in the world at the moment and has a seriously influential listener base. Rory Stewart said that someone at FTXFF cut a call with him short after 15 minutes because that person wanted to go have lunch. The person reportedly also said "I don't care about poverty". Rory Stewart (ex-president of GiveDirectly and ex-MP) now seems to think that we are weird futurists who care more about "asteroids and killer robots" than about the 700M people currently in poverty. Great work, whoever that FTX person was...
17 · 16d · 2
Scott Alexander's response to the Leif Wenar article: https://old.reddit.com/r/slatestarcodex/comments/1brg5t3/the_deaths_of_effective_altruism/kx91f5k/
19 · 1mo
Some related thoughts and questions: NunoSempere points out that EA could have been structured in a radically different way if the "specific cultural milieu" had been different. But I think this can be taken even further. It's plausible that if a few moments in the history of effective altruism had gone differently, the social makeup (the sorts of people who make up the movement) and their axiological worldviews (the sorts of things they value) might have been radically different too.

As someone interested in the history of ideas, I'm fascinated by what our movement has that made it significantly different from the most likely counterfactual movements. Why is effective altruism the way it is? A number of interesting brief histories of EA have been written (along with longer pieces about more specific things, like Moynihan's excellent X-Risk), but I often feel that there are a lot of open questions about the movement's history, especially regarding tensions that seem to present themselves between the different worldviews that make up EA. For example:

1. How much was it the individual "leaders" of EA who brought together different groups of people to create a big-tent EA, as opposed to the communities themselves already being connected? (Toby Ord says that he connected the Oxford GWWC/EA community to the rationality community, but people from both of these "camps" seem to have been at Felicifia together in the late 2000s.)

2. When writing the history of thought, there's a tendency to place thinkers in lineages, as if each read and responded to those who came before. Parfit lays the ground for longtermism in Reasons and Persons in the late 20th century, and Bostrom continues the work when presenting the idea of x-risk in 2001. Did Bostrom know of and expand upon Parfit's work, or was Bostrom's framing independent of it, based on risks discussed by the Extropians, Yudkowsky, SL4, etc.? There (maybe) seems to be multiple
