The fact that everyone in EA finds the work we do interesting and/or fun should be treated with more suspicion.
I know that "everyone" was an intentional exaggeration, but I'd be interested to see the actual baseline statistics on a question like "do you find EA content interesting, independent of its importance?"
Personally, I find "the work EA does" to be, on average... mildly interesting?
In college, even after I found EA, I was much more intellectually drawn to random topics in psychology and philosophy, as well as startup culture. When I read nonfiction ... (read more)
the culture of people who spend lots of time on EA Twitter or the Forum
there's an EA Twitter?
I think I agree with everything here, though I don't think the line is exactly people who spend lots of time on EA Twitter (I can think of several people who are pretty deep into EA research and don't use Twitter/aren't avid readers of the Forum). Maybe something like, people whose primary interest is research into EA topics? But it definitely isn't everyone, or the majority of people into EA.
Despite the real visual + other issues, I still think the website is very reasonable!
The changes to make, including some to the grant page, are tiny relative to the overall size of the project. It seems very easy to find our grants and other content, and overall reception from key stakeholders has been highly positive. OP staff seem to like the changes, too (and we had tons of staff feedback at all points of the process).
If you have other specific feedback, I'm happy to hear it, but I don't know what e.g. "a little more focus and polish" means.
The 2019 'spike' you highlight doesn't represent higher overall spending — it's a quirk of how we record grants on the website.
Each program officer has an annual grantmaking "budget", which rolls over into the next year if it goes unspent. The CJR budget was a consistent ~$25 million/year from 2017 through 2021. If you subtract the Just Impact spin-out at the end of 2021, you'll see that the total grantmaking over that period matches the total budget.
So why does published grantmaking look higher in 2019?
The reason is that our published grants generally "fr... (read more)
So this doesn't really dissolve my curiosity.
In dialog form, because otherwise this would have been a really long paragraph:
NS: I think that the spike in funding in 2019, right after the "GiveWell's Top Charities Are (Increasingly) Hard to Beat" blog post, is suspicious.
AG: Ah, but it's not higher spending. Because of our accounting practices, it's rather an increase in future funding commitments. So your chart isn't about "spending" it's about "locked-in spending commitments". And in fact, in the next few years, spending-as-recorded goes down because the lock... (read more)
(Writing from OP’s point of view here.)
We appreciate that Nuño reached out about an earlier draft of this piece and incorporated some of our feedback. Though we disagree with a number of his points, we welcome constructive criticism of our work and hope to see more of it.
We’ve left a few comments below.
The importance of managed exits
We deliberately chose to spin off our CJR grantmaking in a careful, managed way. As a funder, we want to commit to the areas we enter and avoid sudden exits. This approach:
If this dynamic leads you to put less “trust” in our decisions, I think that’s a good thing!
I will push back a bit on this as well. I think it's very healthy for the community to be skeptical of Open Philanthropy's reasoning ability, and to be vigilant about trying to point out errors. On the other hand, I don't think it's great if we have a dynamic where the community is skeptical of Open Philanthropy's intentions. Basically, there's a big difference between "OP made a mistake because they over/underrated X" and "OP made a mistake because they were politically or PR motivated and intentionally made sub-optimal grants."
So one of the things I'm still confused about is the two spikes in funding, one in 2019 and the other in 2021, both of which can be interpreted as parting grants:
So OP gave half of the funding to criminal justice reform ($100M out of $200M) after writing GiveWell’s Top Charities Are (Increasingly) Hard to Beat, and this makes me less likely to think about this in terms of exit grant and more in terms of, idk, some sort of nefariousness/shenanigans.
This is excellent, thanks!
These two papers, in particular, were what I was looking for. The corresponding information on QALYs was also great.
(For future readers of my post, the relevant info is under the "descriptive system" and "valuation methods" subheadings in Derek's post.)
Thanks! The correlation graphs were helpful to see, though I'm sad about the muddled results from the graph in the updated section.
This is very good feedback — I'll look into making that change.
Thanks for all of this feedback! Lots of good points to consider moving forward, and exactly the kind of thing I hoped to get from this post.
This website was a weird project — passed around between owners and developers over a period of ~2 years. I think there was a good amount of usability testing before my time, but I'm not sure how much of that was holistic and focused on the final design (vs. focused on specific elements). I agree with most of your points myself and also trust your experience in this area.
A couple of reports had their footnotes get jumbled — a fix is in progress. Thanks for the note!
Thanks for this feedback. The horizontal scroll is a matter of having long email addresses on those pages, and I'll clean that up after checking with page owners.
Agree with info density dropping on the grants page — I think there's an easy improvement or two to be made here (e.g. removing the "Learn More" arrow), which I'll be aiming to make as the new site owner (with input from others at OP).
Thanks for the link! I was aware of the most recent study, but you prompted me to dig deep and see what they said about their survey methodology.
The most relevant bits I found were sections 4.8 and 4.8.1 in this PDF, which describe multiple surveys done across a bunch of countries.
I'm still not sure where to find actual response counts by country or demographic data on respondents — it's easy to find tons of data on how different health issues are ranked and how common they are, but not to find a full "factory tour" of how the estimates were pu... (read more)
Thanks, all resolved!
The license still applies! We'll have it back up on the footer soon.
This was a nice little post!
One of the biggest draws to the EA community for me — and something that's kept me involved — is how much small-scale altruism goes on here. Unsurprisingly, a movement founded on practical altruism draws a lot of people who enjoy helping and actually care about providing good help.
This manifests in a bunch of ways. Two that come to mind: EA Global participants swarming me to help carry heavy conference items through a shopping mall when I was at CEA, and a bunch of cases where someone in the community encountered a persona... (read more)
I don't share your view about what a downvote means.
What does a downvote mean to you? If it means "you shouldn't have written this", what does a strong downvote mean to you? The same thing, but with more emphasis?
It'd be interesting to have some stats on how people on the forum interpret it.
Why not create a poll? I would, but I'm not sure exactly which question you'd want asked.
Most(?) readers won't know who either of them is, not to mention their relationship.
Which brings up another question — to what extent should a comment be written for an author vs. t... (read more)
Personally, I primarily downvote posts/comments where I generally think "reading this post/comment will on average make forum readers be worse at thinking about this problem than if they didn't read this post/comment, assuming that the time spent reading this post/comment is free."
I basically never strong downvote posts unless it's obvious spam or otherwise an extremely bad offender in the "worsens thinking" direction.
The flower was licensed from this site.
The designer saw and appreciated this comment, but asked not to be named on the Forum.
I didn't get that message at all. If someone tells me they downvoted something I wrote, my default takeaway is "oh, I could have been more clear" or "huh, maybe I need to add something that was missing" — not "yikes, I shouldn't have written this".
I read Max's comment as "I thought this wasn't written very clearly/got some things wrong", not "I think you shouldn't have written this at all". The latter is, to me, almost the definition of a strong downvote.
If someone sees a post they think (a) points to important issues, and (b) gets important things wrong... (read more)
I'll read any reply to this and make sure CEA sees it, but I don't plan to respond further myself, as I'm no longer working on this project.
Thanks for the response. I agree with some of your points and disagree with others.
To preface this, I wouldn't make a claim like "the 3rd edition was representative for X definition of the word" or "I was satisfied with the Handbook when we published it" (I left CEA with 19 pages of notes on changes I was considering). There's plenty of good criticism that one could make of it, from almost any perspec... (read more)
This is a minor point in some ways but I think explicitly stating "I downvoted this post" can say quite a lot (especially when coming from someone with a senior position in the community).
I ran the Forum for 3+ years (and, caveat, worked with Max). This is a complicated question.
Something I've seen many times: A post or comment is downvoted, and the author writes a comment asking why people downvoted (often seeming pretty confused/dispirited).
Some people really hate anonymous downvotes. I've heard multiple suggestions that we remove anonymity from vo... (read more)
I think the problem isn't with saying you downvoted a post and why (I personally share the view that people should aim to explain their downvotes).
The problem is the actual reason:
I think you're pointing to some important issues... However, I worry that you're conflating a few pretty different dimensions, so I downvoted this post.
The message that, for me, stands out from this is "If you have an important idea but can't present it perfectly, it's better not to write at all." Which I think most of us would not endorse.
While at CEA, I was asked to take the curriculum for the Intro Fellowship and turn it into the Handbook, and I made a variety of changes (though there have been other changes to the Fellowship and the Handbook since then, making it hard to track exactly what I changed). The Intro Fellowship curriculum and the Handbook were never identical.
I exchanged emails with Michael Plant and Sella Nevo, and reached out to several other people in the global development/animal welfare communities who didn't reply. I also had my version reviewed by a dozen test readers (... (read more)
This is a tricky question to answer, and there's some validity to your perspective here.
I was speaking too broadly when I said there were "rare exceptions" when epistemics weren't the top consideration.
Imagine three people applying to jobs:
I could imagine Bob beating Alice for a "build a new group" role (though I think many CB people would prefer Alice), because friendliness is so cru... (read more)
In August 2014, I co-founded Yale EA (alongside Tammy Pham). Things have changed a lot in community-building since then, and I figured it would be good to record my memories of that time before they drift away completely.
If you read this and have questions, please ask!
I was a senior in 2014, and I'd been talking to friends about EA for years by then. Enough of them were interested (or just nice) that I got a good group together for an initial meeting, and a few agreed to stick around and help me r... (read more)
I'd recommend cross-posting your critiques of the "especially useful" post onto that post — will make it easier for anyone who studies this campaign later (I expect many people will) to learn from you.
Thanks for sharing all of this!
I'm curious about your fear that these comments would negatively affect Carrick's chances. What was the mechanism you expected? The possibility of reduced donations/volunteering from people on the Forum? The media picking up on critical comments?
If "reduced donations" were a factor, would you also be concerned about posting criticism of other causes you thought were important for the same reason? I'm still working out what makes this campaign different from other causes (or maybe there really are similar issues across a... (read more)
I think I was primarily concerned that negative information about the campaign could get picked up by the media. Thinking it over now though, that motivation doesn't make sense for not posting about highly visible negative news coverage (which the media would have already been aware of) or not posting concerns on a less publicly visible EA platform, such as Slack. Other factors for why I didn't write up my concerns about Carrick's chances of being elected might have been that:
I think that the principal problem pointed out by the recent "Bad Omens" post was peer pressure towards conformity in ways that lead to people acting like jerks, and I think that we're seeing that play out here as well, but involving central people in EA orgs pushing the dynamics, rather than local EA groups. And that seems far more worrying.
What are examples of "pressure toward conformity" or "acting like jerks" that you saw among "central people in EA orgs"? Are you counting the people running the campaign as “central”? (I do agree with some of Matthew’s... (read more)
Overall, I agree with Habryka's comment that "negative evidence on the campaign would be 'systematically filtered out'". Although I maxed out donations to the primary campaign and phone banked a bit for the campaign, I had a number of concerns about the campaign that I never saw mentioned in EA spaces. However, I didn't want to raise these concerns for fear that this would negatively affect Carrick's chances of winning the election.
Now that Carrick's campaign is over, I feel more free to write my concerns. These included:
Here are some impressions of him from various influential Oregonians. No idea how these six were chosen from the "more than a dozen" originally interviewed.
Thanks for writing this. While I don’t personally enjoy being featured, I appreciate the post as a Forum reader and former mod.
A few notes on my approach to donating, since I was quoted:
I found this post harder to understand than the rest of the series. The thing you're describing makes sense in theory, but I haven't seen it in practice and I'm not sure what it would look like.
What EA-related lifestyle changes would other people find alienating? Veganism? Not participating in especially expensive activities? Talking about EA?
I haven't found "talking about EA" to be a problem, as long as I'm not trying to sell my friends on it without their asking first. I don't think EA is unique in this way — I'd be annoyed if my relig... (read more)
This is a great post! Upvoted. I appreciate the exceptionally clear writing and the wealth of examples, even if I'm about 50/50 on agreeing with your specific points.
I haven't been involved in university community building for a long time, and don't have enough data on current strategies to respond comprehensively. Instead, a few scattered thoughts:
I was talking to a friend a little while ago who went to an EA intro talk and is now doing one of 80,000 Hours' recommended career paths, with a top score for direct impact. She’s also one of the most charismati
Minor elaboration on your last point: a piece of advice I got from someone who did psychological research on how to solicit criticism was to try to brainstorm what someone's most likely criticism of you would be, and then offer that up when requesting criticism, as this is a credible indication that you're open to it. Examples:
Privately discussed info in a CRM seems like an invasion of privacy.
I've seen non-EA college groups do this kind of thing and it seems quite normal. Greek organizations track which people come to which pledge events, publications track whether students have hit their article quota to join staff, and so on.
Doesn't seem like an invasion of privacy for an org's leaders to have conversations like "this person needs to write one more article to join staff" or "this person was hanging out alone for most of the last event, we should try and help them feel more comfortable next time".
I keep going back and forth on this.
My first reaction was "this is just basic best practice for any people-/relationship-focused role, obviously community builders should have CRMs".
Then I realised none of the leaders of the student group I was most active in had CRMs (to my knowledge) and I would have been maybe a bit creeped out if they had, which updated me in the other direction.
Then I thought about it more and realised that group was very far in the direction of "friends with a common interest hang out", and that for student groups that were less like... (read more)
I've seen people make these complaints about EA since it first came to exist.
As EA becomes bigger and better-known, I expect to see a higher volume of complaints even if the average person's impression remains the same/gets a bit better (though I'm not confident that's the case either).
This includes groups with no prior EA contact learning about it and deciding they don't like it — but I think they'd have had the same reaction at any point in EA's history.
Are there notable people or groups whose liking/trust of EA has, in your view, gone down over time?
The 80K board is an understandable proxy for "jobs in EA". But that description can be limiting.
Many non-student EA Global attendees had jobs at organizations that most wouldn't label "EA orgs", and that nevertheless fit the topics of the conference.
Some of these might have some of their jobs advertised by 80K, but there are also ton... (read more)
A count of topics at EAG and EAGx's from this year shows a roughly 3:1 AI/longtermist to anything else ratio.
I'm not sure where to find agendas for past EAGx events I didn't attend. But looking at EAG London, I get a 4:3 ratio for LT/non-LT (not counting topics that fit neither "category", like founding startups):
If you want recommendations, just take the first couple of items in each category. They are rated in order of how good I think they are. (That's if you trust my taste — I think most people are better off just skimming the story summaries and picking up whatever sounds interesting to them.)
Huzzah! Hope you enjoy your time here.
We publish our giving to political causes just as we publish our other giving (e.g. this ballot initiative).
As with contractor agreements, we publish investments and include them in our total giving if they are conceptually similar to grants (meaning that investments aren't part of the gap James noted). You can see a list of published investments by searching "investment" in our grants database.
We're still in the process of publishing our 2021 grants, so many of those aren't on the website yet. Most of the yet-to-be-published grants are from the tail end of the year — you may have noticed a lot more published grants from January than December, for example.
That accounts for most of the gap. The gap also includes a few grants that are unusual for various reasons (e.g. a grant for which we've made the first of two payments already but will only publish once we've made the second payment a year from now).
We only include contractor agreeme... (read more)
A belated thanks for this reply! I've reached the end of my knowledge/spare time for research at this point, but I'll keep an eye out for any future posts of yours on these topics.
The group was small and didn't accomplish much, and this was a long time ago. I don't think the post would be interesting to many people, but I'm glad you enjoyed reading it!
From August 2015 to October 2016, I ran an effective altruism group at Epic, a large medical software corporation in Wisconsin. Things have changed a lot in community-building since then, but I figured it would be good to record my memories of that time, and what I learned.
Launching the group
The book poses an interesting and difficult problem that characters try to solve in a variety of ways. The solution that actually works involves a bunch of plausible game theory and feels like it establishes a realistic theory of how a populous universe might work. The solutions that don't work are clever, but fail for realistic reasons.
Aside from the puzzle element of the book, it's not all that close to ratfic, but the puzzle is what compelled me. Certainly arguable whether it belongs in this category.
I think that "intense, fanatical dedication to worldbuilding" + "tons of good problem-solving from our characters, which we can see from the inside" adds up to ratfic for me, or at least "close to ratfic". Worm delivers both.
I've dedicated far too much time to reading rationalist fiction. This is a list of stories I think are good enough to recommend.
Here's my entire rationalist fiction bookshelf — a mix of works written explicitly within the genre and other works that still seem to belong. (I've written reviews for some, but not all.)
Here are subcategories, with stories ranked in rough order from "incredible" to "good". The stories vary widely in scale, tone, etc., and you should probably just read whatever seems most interesting to... (read more)
The Forum doesn't have built-in support for internal links, in either editor.
Internal links take the form of PostURL#subheading_title_with_spaces_represented_by_underscores, with punctuation and extra spaces taking the form of additional underscores.
You can also right-click on a subheading and select "copy link address" to get the URL on your clipboard. Or just click the subheading and see what URL shows up in your address bar.
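If it helps, the rule above can be sketched as a small function. This is just a sketch of my understanding of the pattern (spaces and punctuation each become an underscore), and `forum_anchor` is a name I made up; verify the result against what your address bar shows, since the Forum's exact slug rules may differ.

```python
import re

def forum_anchor(post_url: str, heading: str) -> str:
    """Build an internal link of the form PostURL#slug, where every
    non-alphanumeric character in the heading becomes an underscore."""
    slug = re.sub(r"[^A-Za-z0-9]", "_", heading)
    return f"{post_url}#{slug}"

# e.g. forum_anchor("https://forum.effectivealtruism.org/posts/abc/my-post",
#                   "Launching the group")
# -> "https://forum.effectivealtruism.org/posts/abc/my-post#Launching_the_group"
```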
If you want to push for an internal links feature, use this thread (I'm not sure if someone else has suggested it yet, so you may want to look around a bit).
If your intention was to elicit stories rather than to get a sense for how common dishonesty was, your wording makes sense.
I had assumed you were trying to do the second thing, and my comment was honest. I try to be straightforward with all of my Forum comments.
I shared this with someone who asked for my advice on finding someone to help them improve their writing. It's brief, but I may add to it later.
I think you'll want someone who:
* Writes in a way that you want to imitate. Some very skilled editors/teachers can work in a variety of styles, but I expect that most will make your writing sound somewhat more like theirs when they provide feedback.
* Catches the kinds of things you wish you could catch in your writing. For example, if you want to try someone out, you could ask t
Depends on your AI timelines.
There's no limit to when you can edit a post.
Ahead of the full post, I'd like to know what you think the most compelling evidence is for non-invasive brain stimulation actually working. This could be a paper, a blog post from some self-experimenter, or something else — whatever made you think this was important to study further.
(I know nothing about this topic at all, and don't even have a mental picture of what NIBS would physically look like.)
Thanks! This is exactly the kind of thing I was looking for.
If you share info about this project elsewhere, I'd recommend mentioning your background with TED-Ed! I thought the videos linked from the SWS website were great, and people might be more interested in checking out the open roles if they know more about who they'll work with before they click through to the site.