EA Forum Prize: Winners for February 2019

by Aaron Gertler · 3 min read · 29th Mar 2019 · 20 comments


Forum Prize · Community · Frontpage

CEA is pleased to announce the winners of the February 2019 EA Forum Prize!

In first place (for a prize of $999): "Evidence on good forecasting practices from the Good Judgment Project", by kokotajlod.

In second place (for a prize of $500): "Small animals have enormous brains for their size", by eukaryote.

In third place (for a prize of $250): "Will companies meet their animal welfare commitments?", by saulius.

We also awarded prizes in November, December, and January.

What is the EA Forum Prize?

Certain posts exemplify the kind of content we most want to see on the EA Forum. They are well-researched and well-organized; they care about informing readers, not just persuading them.

The Prize is an incentive to create posts like this. But more importantly, we see it as an opportunity to showcase excellent content as an example and inspiration to the Forum's users.

About the winning posts

"Evidence on good forecasting practices from the Good Judgment Project" is a thorough, well-organized summary of forecasting — a topic often discussed on the Forum, but rarely with this amount of data.

We may know that prediction markets are “useful”, but the author goes far beyond that, explaining how well different types of markets (and non-market mechanisms) have performed in prediction tournaments, and which characteristics the best forecasters tend to have. This research could be useful to any number of future forecasting projects in the community.

Additionally, the author:

  • Uses numbered headers to separate sections.
  • Includes hyperlinked footnotes for all citations.
  • Notes cases where information from original sources is missing or uncertain, giving readers ideas for ways to contribute to his research. (For example, I’d love to learn more about Tetlock’s “perpetual beta” concept, if anyone cares to go and find it.)

Overall, this is a remarkable post, and I hope that other Forum users create similarly excellent summaries of important concepts.

"Small animals have enormous brains for their size" makes a single, simple point (you can see it in the title), but does so with unusual elegance.

I still remember the core simile — "you have as many neurons as a half-full bucket of ants" — many weeks after I first read the article, and expect to remember it for years to come, thanks to the original art which enlivens the piece. Illustrations aren’t essential to Forum posts, but making good ideas memorable, however you choose to do it, amplifies their impact.

Additionally, the author:

  • Recommends further reading for anyone who found the article interesting (this is surprisingly rare for EA Forum posts, despite the vast literature that informs many of our ideas).
  • Doesn’t overstate her point; instead, we get facts about neurons, plus a list of ways in which these facts could interact with certain beliefs to produce other beliefs, without advocacy for any of those beliefs.
    • There’s nothing wrong with advocating beliefs, of course, but there can be major benefits to separating "fact posts" from "belief posts". For example, a fact post is more likely to be cited by authors with a range of beliefs, making everyone’s belief posts more evidence-based in the process.

"Will companies meet their animal welfare commitments?" offers crucial context on one of the most popular causes in EA: animal-advocacy campaigns targeting corporations.

If companies don’t actually live up to their promises, we haven’t made an impact. The author pulls together dozens of different sources from inside and outside of the EA community to show that… well, these promises may not be as impactful as they first seemed. But he doesn’t just explain the issue; he also notes the high level of uncertainty around particular facts and figures (providing better information even at the risk of undercutting his “point”) and suggests ways to improve the situation.

Additionally, the author:

  • Uses our built-in header system to separate sections (I'm repeating myself here, because this is a really useful feature and I strongly encourage authors to use it for anything longer than a page or so).
  • Proposes improvements that animal charities could make without harshly criticizing those charities (distinguishing between “things could be better” and “things are actively bad” is a good habit).
  • Points out the ways in which his findings might affect our cost-effectiveness estimates around animal advocacy. Explaining a crucial consideration is good; estimating its impact makes the explanation even better.

The voting process

All posts published in the month of February qualified for voting, save for those written by CEA staff and Prize judges.

Prizes were chosen by six people:

Voters recused themselves from voting on posts written by their colleagues. Otherwise, they used their own individual criteria for choosing posts, though these broadly aligned with the goals outlined above.

Winners were chosen by an initial round of approval voting, followed by a runoff vote to resolve ties.

The future of the Prize

After reviewing feedback we’ve received about the Prize, we’ve decided to continue giving it out for another six months (February through July) before running a second round of review. We don’t have any current plans to change the format, but we won’t rule out potential changes in future months.

If you have thoughts on how the Prize has changed the way you read or write on the Forum, or about ways we should consider changing the current format, please let us know in the comments or contact Aaron Gertler.


21 comments

I'm surprised that "After one year of applying for EA jobs: It is really, really hard to get hired by an EA organisation" did not win, given that:

  • It started an important conversation, likely valuable for people seeking EA jobs, people providing EA career advice, and people hiring for EA jobs.
  • It generated 259 upvotes and 177 comments, which is more than I remember ever seeing.
  • It must have been unusually difficult for the author to write.

I'll disclose that as one of the voters, I found this post very interesting and helpful, but I didn't value it as much as the specific research content that won the top three prizes. (Though note that I recuse myself from voting on content from Rethink Priorities.)

+1, very surprised by this. I believe it's the most-upvoted post of all time, plus probably quite helpful to many stakeholders in the community.

As one of the people who voted, I was also surprised and disappointed by this. But different voters applied different standards on what kind of content they wish to support.

Edit: The original text of this comment below remains unedited, but I made the mistake of stating that CEA sets the conditions of the EA Forum Prizes, when it only provides the funding for them.

Summary: It makes sense that the EA Forum is currently set up to promote and incentivize content that clearly advances one or more of EA's current objectives, framed so it's generally accessible. That content is prioritized based on the view that this is the most important function the EA Forum serves as a platform. This is different from prioritizing the promotion and incentivization of popular content, which raises awareness of, and starts conversations about, whatever the greatest number of (active) community members consider a top priority. This post advances the latter goal rather than the former, which is probably why it wouldn't receive an EA Forum Prize. The best way to broach this subject would seem to be starting a conversation about what the priorities for promotion and incentives on the EA Forum should be, and what the criteria for selecting those priorities should be.

Why different posts receive the reward, and why this post didn't, is a matter of what kind of posts people want to reward and incentivize, and why. It also makes sense to keep in mind that the rewards are given, and the EA Forum maintained, by the Centre for Effective Altruism (CEA) as an institution. I'm aware that under the current strategy for the EA Forum, the goal is to promote content that:

  • is generally accessible;
  • is more basic, and doesn't assume advanced background knowledge of one or more particular cause areas; and
  • makes intellectual and/or material progress on the general goals of effective altruism, or successfully appeals to a wide audience about why and how a particular means can be applied to achieve those goals.

This is based on the ultimate goal of having the EA Forum be a platform primarily focused on community-building, both in terms of growing the effective altruism movement and in terms of enhancing the level of involvement from people who relate to EA in a more casual way (e.g., inducing those who merely 'subscribe' to EA as a philosophy to personally 'identify' with it, and to change what they personally do to align with EA values).

This contrasts with how the current EA community tends to use Facebook groups, which host conversations that tend to be either more specialized and technical (e.g., about a specific cause area or career) or social and informal. For the bulk of the currently active EA community, their use of the EA Forum is based on prioritizing conversation about affairs in EA that are both official and general, in that the conversation is, at least in theory, relevant to everyone in EA. It makes sense to a lot of the EA community that this should be a primary purpose of the EA Forum, and they've grown accustomed to using it that way.

The problem is that what much of the EA community sees as a primary priority for the EA Forum's role or function is not the top priority of the EA Forum's moderation strategy. The EA Forum serves as a public square for whatever topics and subjects are a priority for the EA community at large. The content being incentivized through rewards, or promoted to the frontpage, is content that advances EA's objectives, as opposed to discussions themed on grievances with the EA community's social dynamics — what a lot of people in EA call a more 'meta-level' discussion or issue. The dedicated space for this on the EA Forum 2.0 so far has been the 'Community' section.

One obvious factor here is that promoting or incentivizing content that raises awareness of disagreements and controversies within EA could be off-putting to a general readership, or could draw readers in ways that distract from rather than advance progress on EA's objectives. For what it's worth, I think this was an unusually fruitful public hashing-out of a common grievance in EA. I also don't believe CEA is declining to reward posts critical of community dynamics out of a desire to starve these discussions of awareness and attention. They consider these conversations important; they merely consider posts that directly advance the objectives of EA as a movement more valuable.

So, based on the EA Forum's moderation strategy, there are criteria for awarding EA Forum Prizes that are not aligned with the content that tends to be most popular, for whatever reasons. It's similar to how the Academy Awards don't usually go to the films that earn the most money at the box office. The next step seems to be a conversation aiming to reconcile what the EA Forum's moderation strategy prioritizes with why the community at large thinks the most upvoted EA Forum posts are the most important and should be incentivized.

Do note that while CEA distributes the prize, CEA employees are only a minority of the overall judges that cast votes for the winner of these prizes.

Does sharing personal experiences that contribute to better guidelines about whether to pursue direct work, or that counterbalance an excessive emphasis on work at an EA organization, not further the objectives of EA? It's certainly at a more meta level, but hey, meta EA is still one of the "four cause areas" and one of the EA Funds. I'm not saying it necessarily is more valuable than the winners of the prize, but I don't think it should be disqualified on that basis.

I also don’t think we should shy away from incentivizing posts that reflect disagreements within EA or are critical of EA as it is. That’s not too far off from disincentivizing disagreement (something like if you write about that topic you have zero chance of winning the prize), and that feels wrong on an open forum.

The EA Forum content generally considered most valuable tends to be the kind that advances the objectives of one or more of EA's cause areas, or the philosophy of the movement in general. Content focused on EA itself as a social community is a different kind of content that is typically regarded as less valuable. I think this judgement can be inferred from what articles tend to win the EA Forum Prizes. The sticking point is that this post is perceived as a particularly valuable example (perhaps the most valuable example) of a kind of post that is generally regarded as less valuable.

Of course the post in question advances the objectives of EA. At least in the evaluation of the judges, a handful of other posts this month were more valuable still. It wasn't disqualified.

Whether by coincidence of such posts typically being on the topic of 'community,' or for another reason, I agree we should neither shy away from incentivizing posts that reflect disagreements in EA or are critical of EA as it is, nor directly disincentivize disagreement. I do believe there is a tendency toward that. While I am wary of incentivizing discussion of disagreement for its own sake, since that could introduce the perverse incentive of people posting articles that don't do the disagreement justice, overall I believe avoiding this is fairly achievable.

I've got a lot on my plate, and this is not as much of a personal priority for me in EA, so I won't do it myself, but I would recommend that you (or someone else concerned) write an EA Forum article discussing what you think the criteria or priorities for the EA Forum Prizes should be, relative to the kinds of articles that win the prize now, and in particular why it's important that they include incentivizing high-quality treatments of critical disagreements in EA. I would be willing to proofread or otherwise help with the article.

Makes sense. FYI, I'm not currently interested in writing such a post, so if anyone else wants to, please do!

All of the topics you discussed are indeed useful, and posts about them are eligible for the Prize. The only non-eligible posts are those written by voters or those that come from CEA.

I hope that "being in the 97% of posts that don't win a prize this time around" isn't a major disincentive, and I think it would be disingenuous to specifically favor critical pieces in the voting process. For now, we're running a really hands-off process with no formal guidelines for voters, which has led to a mix of research and meta posts winning the prize, some of which contained direct criticism of EA organizations and charity recommendations.

...there are criteria for awarding EA Forum Prizes that are not aligned with the content that tends to be most popular, for whatever reasons.

Makes sense to me!

In part I'm surprised because many prominent EA stakeholders (including Denise_Melchin, Larks, Max_Daniel, agdfoster, Jonas Vollmer, Tee) noted that they found the post helpful, which seems different from raw popularity.

As Peter noted, while CEA provides funding for the prizes, only two of the six voters work for CEA. I'm one of those two, and I vote according to a personal standard that doesn't have anything to do with "what CEA wants", and is more related to some combination of "average utility per reader" + "sets a good example for how to write good Forum posts" + "other minor factors too numerous to list".

One note on upvotes: They correlate heavily with "number of people who read something". If posts A and B are equally high-quality, and post B is shared in a bunch of large Facebook groups, B will almost certainly get more upvotes, but that doesn't mean it was more useful to the average reader. (I don't think any kind of voting metric should be the sole standard for the Prize, but if we were thinking about such metrics, we could look for something like "among posts with 100+ unique visitors, which had the highest karma-to-visitor ratio?")

Thanks for your response. I was under a false impression. My apologies for the mistake.

Dovetailing Milan: I remember from a discussion in the comments of that post itself that, even taking into account changes to the karma system in the EA Forum 2.0, it was reckoned to have received the highest absolute number of upvotes of any post in the history of the EA Forum.

Voters are important people whose time is valuable, and I'm a bit concerned about the time they spend deciding whom to vote for. For example, I don't want them to read the very long post I've written just to decide whether to vote for it (provided it's not relevant or interesting to them otherwise). I expect them to have more important things to do with their time. I understand that they are not obliged to read it. But being a voter probably puts some pressure on them to read the Forum more than they otherwise would, and that might come at the expense of other work. Also, making voting decisions of this kind can be mentally tiring. And if voters don't put much energy into it because they are busy with more important things, then the wrong posts get selected.

Speaking as one of the judges, I read a lot of the forum anyway because I find a broad selection of content to be relevant/interesting, and I find judging to be a trivial additional time burden (maybe ~10min a month).

Do you think of this as an argument against the existence of the Prize? Do you like the Prize, but think we should have a different voting system?

I guess another thing to watch out for is whether the prize consistently creates controversies like the one in the thread above. If it does, then maybe the prize is more distracting than useful.

I didn't think that far; I just expressed a concern. But no one said it requires a significant time investment, and Peter said the opposite, so maybe there is no problem :)

[comment deleted]