Someone needs to be doing mass outreach about AI Safety to techies in the Bay Area.
I'm generally more of a fan of niche outreach over mass outreach, but Bay Area tech culture influences how AI is developed. If SB 1047 is defeated, I wouldn't be surprised if the lack of such outreach ended up being a decisive factor.
There are now enough prominent supporters of AI Safety, and AI is hot enough, that public lectures or debates could draw a big crowd. Even though a lot of people have been exposed to these ideas before, there's something about in-person events that makes ideas seem real.
My point is not that the current EA forum would censor topics that were actually important in early EA conversations, because EAs have now been selected for being willing to discuss those topics. My point is that the current forum might censor topics that would be important course-corrections, just as, if the rest of society had been moderating early EA conversations, those conversations might have lost important contributions like impartiality between species (controversial: you're saying human lives don't matter very much!), the ineffectiveness of developmen...
Reflections on a decade of trying to have an impact
Next month (September 2024) is my 10th anniversary of formally engaging with EA. This date marks 10 years since I first reached out to the Foundational Research Institute about volunteering, at least as far as I can tell from my emails.
Prior to that, I had probably read a fair amount of Peter Singer, Brian Tomasik, and David Pearce, all of whom might be considered connected to EA, but I hadn't actively tried to engage with the community. I'd been engaged with the effective animal advocacy commun...
After some clarifying offline discussions with someone, I want to explain my decreased confidence in the statement, "Farmed vertebrate welfare should be an EA focus".
I think my view is slightly more complicated than this implies. Given that Open Phil and non-EA donors seem able to fund basically the entirety of the good opportunities in this space, I don't think these groups are that talent-constrained, and it seems like the best bets (e.g. corporate campaigns) will continue to have decreasing cost-effectiveness, new a...
Brazil has been dealing with massive criminal wildfires for the last few weeks, and the air quality is record-breakingly bad. Besides other obvious issues (ineffective government response in going after the criminals setting fires, climate change making everything worse), hardly anyone is talking about how to deal with the immediate air quality problem. It's a bit bizarre.
People aren't widely adopting PFF2 masks and air purifiers. These remain somewhat niche topics even though pretty much everyone is suffering. To be fair, there are occasional media report...
I think more EAs should consider operations/management/doer careers over research careers, and that operations/management/doer careers should be higher status within the community.
I get a general vibe in EA (and probably the world at large) that being a "deep thinking researcher"-type is way higher status than being an "operations/management/doer"-type. Yet the latter is also very high-impact work, often higher impact than research (especially on the margin).
I see many EAs erroneously try to go into research and stick to research despite having very ...
The original website for Students for High Impact Charities (SHIC) at https://shicschools.org is down (you can find it in the Wayback Machine), but the program scripts and slides they used in high schools are still available at their Google Drive link at https://drive.google.com/drive/folders/0B_2KLuBlcCg4QWtrYW43UGcwajQ
It could potentially be a valuable EA community-building resource.
Nonprofit organizations should make their sources of funding really obvious and clear: how much money came from which grantmakers, and approximately when. Any time I go on some org's website and can't find information about their major funders, it's a big red flag. At a bare minimum, orgs should have a list of funders, and I'm confused why more of them don't do this.
Has anyone talked with/lobbied the Gates Foundation on factory farming? I was concerned to read this in Gates Notes.
"On the way back to Addis, we stopped at a poultry farm established by the Oromia government to help young people enter the poultry industry. They work there for two or three years, earn a salary and some start-up money, and then go off to start their own agriculture businesses. It was a noisy place—the farm has 20,000 chickens! But it was exciting to meet some aspiring farmers and businesspeople with big dreams."
It seems a disaster that the ...
I used to frequently come across a certain acronym in EA, used in a context like "I'm working on ___" or "looking for other people who also use ___". I flagged it mentally as a curiosity to explore later, but ended up forgetting what the acronym was. I'm thinking it might be CFAR, which seems to have meant CFAR workshops? If so, 1) what happened to them, and 2) was it common for people to work through the material themselves, self-paced?
The copyright banner at the bottom of their site extends to 2024 and the Google form for workshop applications hasn't been deactivated.
I got a copy of the CFAR handbook in late 2022, and the intro had an explicit reference to self-study, along the lines of: 'we have only used this in workshops; we don't know what the results of self-study of this material are, and it wasn't written for self-study'.
So I assume self-study wasn't common, but I may be wrong.
Please, people, do not treat Richard Hanania as some sort of worthy figure who is a friend of EA. He was a Nazi, and whilst he claims he has moderated his views, he is still very racist as far as I can tell.
Hanania called for trying to get rid of all non-white immigrants in the US and the sterilization of everyone with an IQ under 90, indulged in antisemitic attacks on the allegedly Jewish elite, and even after his reform was writing about the need for the state to harass and imprison Black people specifically ('a revolution in our culture or form of governmen...
Just to expand on the above, I've written a new blog post - It's OK to Read Anyone - that explains (i) why I won't personally engage in intellectual boycotts [obviously the situation is different for organizations, and I'm happy for them to make their own decisions!], and (ii) what it is in Hanania's substack writing that I personally find valuable and worth recommending to other intellectuals.
London folks - I'm going to be running the EA Taskmaster game again at the AIM office on the afternoon of Sunday 8th September.
It's a fun, slightly geeky, way to spend a Sunday afternoon. Check out last year's list of tasks for a flavour of what's in store 👀
Sign up here
(Wee bit late in properly advertising so please do spread the word!)
I’m looking for podcasts, papers, or reviews on fish sentience.
Specifically:
I would also like to know if there are practical methods to reduce the amount of harm done if you are fishing.
Rethink Priorities had their moral weights report, which placed salmon at 0.056, but I'm not sure I completely understood what that figure meant. I think this means they have 5% of the...
The Economist has an article about Chinese top politicians' views on catastrophic risks from AI, titled "Is Xi Jinping an AI Doomer?"
...Western accelerationists often argue that competition with Chinese developers, who are uninhibited by strong safeguards, is so fierce that the West cannot afford to slow down. The implication is that the debate in China is one-sided, with accelerationists having the most say over the regulatory environment. In fact, China has its own AI doomers—and they are increasingly influential.
[...]
China’s accelerationists want to keep th
I'm proud to announce the 5-minute animated short on mental health I wrote back in 2020 is finally finished! I'd love you to watch it and let me know what you think (like, share…). It's currently "unlisted" as I wait to see how the production studio wants to release it publicly. But in the meantime I'm sharing it with my extended network.
https://projects.propublica.org/nonprofits/ is a great resource on American nonprofits:
New Incentives in particular seems poised to spend much more after large ~GiveWell cash grants.
I like the idea of carbon offsets for flights etc., but I think most carbon offset schemes are probably garbage. A year ago I made a personal pledge that whenever I was prompted to pay extra to carbon offset something, I would decline, but then immediately donate the same amount or more to effective environmental funds (in my case, Effective Altruism Australia Environment). It's easy to remember and easy to do. Perhaps this simple pledge will be similarly sticky for other people :)
My personal (skeptical) benchmark for price per unit of non-garbage carbon offsets comes from Scott Alexander's mention of Climeworks:
“Pessimistic” comes from Climeworks, a company that builds giant reverse-factories which take carbon out of the air. If you’re maximally skeptical about any charity's ability to offset CO2, these are the people for you - they can literally hand you a bottle full of the carbon they removed, so you don't need to take anything on faith. But they charge as much as $1000/ton.
Climeworks actually charges more now, at least fo...
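For anyone who wants to apply this benchmark to their own flights, here's a minimal sketch of the arithmetic. The $1000/tonne figure is the quoted Climeworks price above; the per-flight emissions number in the example is an illustrative assumption I've made, not something from the original posts.

```python
# Rough sketch: what a "maximally skeptical" offset would cost at the
# Climeworks-style direct-air-capture benchmark quoted above.

BENCHMARK_USD_PER_TONNE = 1000  # the quoted Climeworks price

def skeptical_offset_cost(tonnes_co2: float,
                          usd_per_tonne: float = BENCHMARK_USD_PER_TONNE) -> float:
    """Cost in USD to offset `tonnes_co2` at a direct-air-capture price."""
    return tonnes_co2 * usd_per_tonne

# E.g. a long-haul round trip at an assumed ~2 tonnes CO2 per passenger:
print(skeptical_offset_cost(2.0))  # 2000.0
```

The point of the pessimistic price is that it bounds the pledge from above: if you'd happily donate even this much, any cheaper-but-real offset is a bargain.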
[crossposted from my blog; some reflections on developing different problem-solving tools]
When all you have is a hammer, everything sure does start to look like a nail. This is not a good thing.
I've spent a lot of my life variously
1) Falling in love with physics and physics fundamentalism (the idea that physics is the "building block" of our reality)
2) Training to "think like a physicist"
3) Getting sidetracked by how "thinking like a physicist" interacts with how real people actually do physics in practice
4) Learning a bunch of different skills to tackle i...
Oh, if you read some of Plato's dialogues it seems very untrue... Plato was really into strawmanning his opponents' arguments, unfortunately :)
Anyway. To try and answer your (very thoughtful) question:
- Get people from different disciplines together in the same physical space on a regular basis. Maybe you put the software engineers next to the literary critics and get them to have lunch together regularly, or something. People are easier to relate to up close.
- Get people to work together on big interdisciplinary problems such as satellite imagery for conservati
...