

Thanks! I think what you said is correct from the viewpoint of individual donees -- an individual donee isn't guaranteed to receive all or most of the donation of a donor at a high level. Though especially in the EA community, it's probably true that donors donating large sums of money in total will usually donate nontrivial sums per donee if they donate at all (for instance, a level 5 donor is unlikely to make donations of just $100 to things they deem the most effective uses of money, because marginal value functions rarely cross that quickly). I don't have rigorous data to back this up, and maybe I'm wrong about it.

Title should say CCP not CPP?

The footer on your site says to post any questions as public comments on this post, so here goes (this is not directly related to the content of this post):

I noticed that a $470,000 grant to Charity Entrepreneurship, which was visible on your site as recently as August 31, is now no longer visible on your grants page. What happened to the grant?

I also noticed a listed grant to Michael Robkin that does not appear among either the open call grants or the staff-led grants. My understanding was that every grant should be listed either as an open call grant or a staff-led grant, but this does not seem to be the case. What might be going on?

Thank you!

Good point. My understanding is that Open Phil made a general decision to focus only on US policy for most of their policy areas, because there are high fixed costs to getting familiar with a policy space. In some areas like animal welfare they've gone beyond US policy, but those are areas where they are spending way more money.

Their grants to Labor Mobility Partnerships stand out as not being US-specific, though LaMP is still currently more focused on the US.

I do expect that if there are shovel-ready, easy-to-justify opportunities outside the US, Open Phil would take them.

Hello! I'm wondering what implications the switch to rolling applications has for how payout reports are published. The payout reports page doesn't include anything beyond April 1, 2021. Previously, there were three reports per year tied to the (discrete) grant rounds.

Hi Darius!

I appreciate that you've raised this issue and provided a reasonably thorough discussion of it. I would like to highlight a bunch of aspects based on my experience editing Wikipedia as well as studying its culture in some depth. While the paid editing episode and the subsequent fallout partly inform my views, those views are actually based on several years of experience before (and some after) that incident.

While none of what I say falsifies what you wrote, it is in tension with some of your tone and emphasis. So in some ways these observations are critical of your post.

How much reverence does Wikipedia's process deserve?

I think that, if your goal is to spend a lot of time editing Wikipedia, it's really important to study Wikipedia's policies -- both the de jure ones and the de facto ones. That's because the policies are not completely intuitive, and the enforcement can often be swift and unfriendly -- giving you little chance to recover once you get on the bad side of it.

So in that sense, it is important to respect and understand Wikipedia's policies.

But that is not the same as being reverent toward the policies and enforcement mechanisms. I think your post has some of that reverence, as well as a "just world" belief that the policies and their enforcement are sensible and just, and align with effective altruist ideals. For instance, you write:

Therefore, anyone considering making contributions to Wikipedia should become familiar with its rules, and in particular adhere to the requirement not to approach editing as an advocacy tool. This is important both because trying to paint an overly favourable picture of EA-related topics will, as Brian notes above, likely backfire, and because observing such a requirement is in line with EA's commitment to intellectual honesty and moral cooperation. Wikipedia is one of the world’s greatest altruistic projects—their contributors share many of our core values, and we should respect their norms and efforts to maintain Wikipedia’s high quality.


Don’t feel like you need to have read all articles about Wikipedia rules and norms before you can start to edit. While reading them upfront may help you avoid some frustrating experiences later, the biggest failure mode is getting overwhelmed and being discouraged from ever taking the first step on your editing journey. Most of Wikipedia’s rules and norms are commonsensical, and you are bound to become familiar with them as you gather editing experience.

In contrast, my take on the Wikipedia system is that it bears many resemblances to other legal and bureaucratic systems -- many of the rules make sense in theory and have good rationales, but their application is often quite bad. Going in with a positive "just world" belief in Wikipedia seems like a recipe for falling rather hard at the first incident. I think the best approach is to be well-prepared in terms of understanding the dynamics and the kinds of attacks you may endure, so that once you do get in there you have no false expectations, and if you do get into a fight you can bow out and stay cool without feeling rattled.

You've linked to Gwern's inclusionism article already; a few other links I recommend: Wikipedia Bureaucracy (continued), Robert Walker's answer on frustrating aspects of being a Wikipedia editor, and Gwern's piece on dark side editing.

On that note, what kind of preparation is necessary?

Based on my experience editing Wikipedia, and seeing articles I edited spend several years surviving, growing, getting deleted, or shrinking -- all of which have happened to me -- I can say it's important to be prepared in a few ways when editing Wikipedia:

  • Prepare for your work getting deleted or maimed: On a process level, this means keeping off-Wikipedia backups (Issa and I implemented technical solutions to automatically back up the content of articles we were editing, in addition to manual syncing at every edit). During a mass deletion episode following the paid editing incident, we almost lost the content of several articles, but were fortunately able to retrieve it. At an emotional level, it means accepting the possibility that stuff you spent a lot of time writing can, sometimes immediately and sometimes after years, randomly get deleted or maimed beyond recognition. And even if reasons are proffered for the maiming or deletion, you are unlikely to consider them good reasons.

  • Prepare to be attacked or questioned in ways you might find ridiculous: This may not happen to you for years, and then may suddenly happen even if you are on your best behavior -- because somebody somewhere notices something. While there are a number of strategies to reduce the probability of this happening (don't get into fights, avoid editing controversial stuff, avoid overtly promotional or low-quality edits) they are no guarantee. And if you have a large corpus of edits, once somebody is suspicious of you, they can go after your whole body of work. The emotional and psychological preparation for that -- and the background knowledge of it so that you can make an informed decision to edit Wikipedia -- is important.
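The off-Wikipedia backup idea above can be sketched with the MediaWiki API. This is a minimal, hypothetical example -- the article title and output filename are placeholders, not the actual setup we used:

```python
# Hypothetical sketch: fetch the current wikitext of a Wikipedia article
# via the MediaWiki API and save it to a local file.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def build_export_url(title: str) -> str:
    """Build a MediaWiki API URL that returns the latest wikitext of a page."""
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "rvslots": "main",
        "titles": title,
        "format": "json",
        "formatversion": "2",
    }
    return API + "?" + urllib.parse.urlencode(params)

def backup_article(title: str, path: str) -> None:
    """Fetch the current wikitext of `title` and write it to `path`."""
    with urllib.request.urlopen(build_export_url(title)) as resp:
        data = json.load(resp)
    text = data["query"]["pages"][0]["revisions"][0]["slots"]["main"]["content"]
    with open(path, "w", encoding="utf-8") as f:
        f.write(text)

if __name__ == "__main__":
    # Placeholder article and filename for illustration only.
    backup_article("Effective altruism", "effective_altruism.wiki")
```

Running something like this on a schedule (plus a manual save after each edit session) gives you a local history that survives on-wiki deletion.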

A few specific tripping points for effective altruist projects to edit Wikipedia

When you do get into trouble on Wikipedia, keep in mind these likely truths about the other side (though this could vary a lot from situation to situation, and you could well get lucky enough for these not to apply to you):

  • The bulk of the people will be highly suspicious of you.
  • Those opposing you probably have a lot more time than you do and a better ability to navigate Wikipedia's channels.
  • They will not be impressed by your efforts to defend yourself, even against points you consider clearly illogical.
  • Efforts to point to noble goals (e.g. effective altruism) or measurement tools (e.g. pageviews) will make them more suspicious of you, as these will be taken as evidence of a conflict of interest.
  • Your efforts to recruit people through off-Wikipedia channels (e.g., this EA Forum post) may make matters worse, as it might lead to accusations of canvassing.
  • Being mindful of your feelings will not be a priority for them.

What kind of Wikipedia editing might still be safe and okay to do?

This will vary from person to person. I think the following are likely to be okay for anybody altruistically inclined but moderately risk-averse:

  • Drive-by fixes to spelling, grammar, punctuation, formatting, broken links, etc.: Once you have acquired basic familiarity with Wikipedia editing, making these fixes when you notice issues is quick and easy.
  • Substantive edits or even new page creations where you have fairly high confidence that your edits will pass under the radar of zealous attackers (this tends to work well for obscure but protected topics; some academic areas such as in higher mathematics could be like this).
  • Substantive edits or even new page creations where, even if the edit gets reverted or the page deleted, the output you create (in terms of update to your state of mind, or the off-Wikipedia copy of the updated content) makes it worthwhile.

A positive note to end on

I will end with a wholehearted commendation of the spirit of your post; as I see it, this is about being prosocial in a broad sense, "giving back" to a great resource, and finding opportunities to benefit multiple communities and work in a collaborative fashion with different groups to create more for the world. I generally favor producing public output while learning new topics; where the format and goals allow it, this could be Wikipedia pages! Issa Rice has even documented this "paper trail" approach I follow.

PS: I thank Issa Rice for some of the links and thoughts that I've included in this comment as well as for reviewing my draft of the comment. Responsibility for errors and omissions is fully mine; I did not incorporate all of Issa's feedback.

Hi Linch! I have a loose summary of my sponsored Wikipedia editing efforts, which I have just updated to include more information and links.

For third-party coverage of the incident, see the article's Wayback Machine copy -- I'm linking to the Wayback Machine since that wiki seems to no longer exist. A warning that the site's general viewpoints are redpill, which might be a dealbreaker for some readers, but this particular article seems reasonably well done in terms of its reporting/coverage and isn't too redpill.

It looks like the NIST randomness beacon will be back in time for the draw date of the lottery. The NIST website says "NIST will reopen at 6:00 AM on Monday, January 28, 2019."

Might it make sense to return to the NIST randomness beacon for the drawing?
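For concreteness, here's a minimal hypothetical sketch of how a drawing could use a beacon pulse. The endpoint and field names follow the public NIST Beacon 2.0 REST API; the timestamp handling and the mapping from pulse to winning ticket are my assumptions, not the lottery's actual procedure:

```python
# Hypothetical sketch: derive a lottery draw from a NIST Randomness Beacon
# pulse. The draw procedure (modular reduction over the ticket count) is an
# assumption for illustration.
import json
import urllib.request

PULSE_URL = "https://beacon.nist.gov/beacon/2.0/pulse/time/{ms}"

def fetch_output_value(timestamp_ms: int) -> str:
    """Fetch the beacon pulse at/near the given time (ms since epoch) and
    return its 512-bit outputValue as a hex string."""
    with urllib.request.urlopen(PULSE_URL.format(ms=timestamp_ms)) as resp:
        return json.load(resp)["pulse"]["outputValue"]

def winner_index(output_value_hex: str, num_tickets: int) -> int:
    """Map the pulse's output value to a ticket index in [0, num_tickets).
    Plain modular reduction has negligible bias here because the output
    value is 512 bits, far larger than any realistic ticket count."""
    return int(output_value_hex, 16) % num_tickets

if __name__ == "__main__":
    # Placeholder draw time: 2019-01-30 00:00:00 UTC, in milliseconds.
    value = fetch_output_value(1548806400000)
    print("Winning ticket:", winner_index(value, 100))
```

Because the pulse is published and signed by NIST, anyone can re-run the same computation afterward to verify the draw.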

The comments on naming beliefs by Robin Hanson (2008) appear to be where the consensus around the impressions/beliefs distinction began to form (the commenters include such movers and shakers as Eliezer and Anna Salamon).

Also, impression track records by Katja (September 2017) is a recent blog post/article, circulated in the rationalist community, that revived the terminology.
