In round 3 of the impact purchase, $2600 of certificates were purchased from sellers (and $700 of certificates were repurchased from the impact purchase organizers). Thanks to Larks and Owen Cotton-Barratt, who also purchased certificates this round.
The deadline for round 4 is June 25. If you are interested in selling, apply here (any kind of submission is welcome, and you are free to opt out of public scrutiny). If you have questions, feel free to get in touch or leave a comment here.
- We purchased another 1/70th of Ryan Carey and Brayden McLean's organization of EA Melbourne for $1700 (a price of $119k for the whole thing, significantly higher than in the last round). This money was our $1000 budget plus the $700 we received by reselling old certificates.
- Larks purchased 9.9% of Oliver Habryka's organization of wrap parties, paying us $300 and Oliver $700 (a price of around $10k, somewhat less than we paid).
- Owen purchased 1/3 of Ben Kuhn's donation matching blog post from us for $400 (a price of $1.2k, exactly what we originally paid).
- Owen purchased 0.4% of EA Melbourne for $200 (a price of $50k, much less than what we are paying).
We're going to experiment with starting a comment thread here for each project that was submitted to the impact purchase (where we had permission to start a thread). We'll use these threads to keep track of transactions, and to discuss our evaluations. We invite discussion of the projects, criticism of the evaluations and our decisions, offers to purchase certificates or sell similar certificates, questions, etc.
If you might be interested in purchasing certificates, please send us an email or leave a comment. We can't really make money (our counterparties always receive all of the gains from trade), but we'd love to see a more liquid market for impact in general.
Submission: Gina Stuessy's organization of EA Madison.
EA Madison has had a handful of meetups with an average of 6-7 attendees. Gina has been responsible for most of the organization. The submission covers the organizational work through the end of April 2015.
Our very crude evaluation:
We expect that EA Madison has reached something like 50 person-hours of attention (excluding Gina and Ben), though that could easily be off by a factor of 2 in either direction and we have talked to no one else involved with it.
Our very rough guess for the stimulated donations per hour was around $20. A lot of this comes in the form of general engagement rather than directly in the form of donations. This gives a value of about $1000 of donations stimulated.
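The arithmetic behind this estimate is simple enough to lay out explicitly (the inputs are just the guesses stated above):

```python
# Back-of-envelope sketch of the EA Madison estimate above.
# Both inputs are the post's stated guesses, not measured data.

person_hours = 50         # attention reached, excluding organizers (could be off 2x)
donations_per_hour = 20   # stimulated EA donations per participant-hour ($)

central = person_hours * donations_per_hour
low, high = central // 2, central * 2   # the stated factor-of-2 uncertainty on hours

print(central)     # 1000
print(low, high)   # 500 2000
```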
We expect it to be possible to make a much less noisy evaluation in the future once we can evaluate some of the impact of the group in retrospect. We think our evaluation will predictably rise in expectation over time, reflecting our general preference for purchasing things after their impacts are easier to assess. Our actual expectation for the project's impact obviously can't predictably rise.
What do you think the average (mean) expected value of an EA group is?
The average is very sensitive to how many small groups there are and how many of them we choose to count as an EA group. I don't know the answer, and it doesn't seem very informative about the value of any particular group. Perhaps it would be easier to talk about the average EV per participant-hour, and $20 of EA donations was our guess of that (though it will depend on the kind of participant; there are lots of ways of recruiting participants that I would evaluate lower, and some I would evaluate higher).
Nice update! :) Just one small suggestion - It might be nice to have either an explanation or a link somewhere to the post introducing the concept of impact purchasing for anyone who is new to the idea.
Good idea, done.
Submission: Alexei Andreev's blog post Maximizing donations via a job
Our very rough evaluation:
We looked at the number of upvotes on LessWrong and the list of people who Alexei thought might have benefited during their own job searches. We contacted a handful of these people and asked about how much the post helped them. The sample was not random, and Alexei's list was clearly not exhaustive.
Our impression is that the largest benefits were from:
Most of these effects have somewhat ambiguous signs, i.e. people made tradeoffs differently and it's not obvious that they made improvements. Although we have some reservations with some of the advice in this post, the biggest effects seemed to be improving people's understanding of the recruitment process, and we expect that effect to be positive.
Our extremely rough estimate was that this effect amounted to $200k in total increased (pre-tax) earnings. We guess that on the order of 5-15 people benefited significantly from the post, that the per individual benefits were on the order of $5-10k / year, and that these benefits probably persist anywhere from one to a few years (in many cases the benefit was finding a similarly good job faster, which seems to have a comparably large impact concentrated in one year). This is the crux of our estimate, but it isn't very well informed.
Based on the individuals who benefited we'd guess that about 10% of this money is used to cost-effectively make the future better (and probably somewhat more), and we are valuing the intervention at about $20k EA dollars moved.
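As a sanity check, the pieces of this estimate multiply out roughly as stated. The midpoints below are my reading of the ranges in the post, not figures the evaluation itself commits to:

```python
# Rough reconstruction of the earnings estimate above, using midpoints
# of the post's stated ranges (my choices, not the evaluators').

beneficiaries = 10       # midpoint of "5-15 people"
benefit_per_year = 7500  # midpoint of "$5-10k / year" ($)
years = 2.5              # "one to a few years", taken as ~2.5

earnings = beneficiaries * benefit_per_year * years
print(round(earnings))   # 187500, close to the $200k headline figure

ea_fraction = 0.10       # share counted as cost-effectively improving the future
print(round(200_000 * ea_fraction))  # 20000, i.e. ~$20k EA dollars moved
```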
We are generally inclined to count financial benefits beyond the fraction that is donated, and our 10% reduction is not motivated entirely by this estimate for donations (which we think is a bit low, and which we haven't adjusted to account for the fact that beneficiaries were disproportionately likely to support causes that we like rather than other EA causes). In this case there are a number of considerations that justify the discount to 10%:
Please post this analysis as a comment on the original article.
Boo-yah! The only thing better than accruing moral value is accruing moral value cost-effectively.
Submission: a chapter of Joao Fabiano's doctoral thesis discussing some consequences of moral enhancement and the complexity of value. Old versions of some of these arguments appear in this blog post, but the thesis itself is not yet public.
I think this case was probably my biggest disagreement with Paul; I thought this project had quite high value. As it happened I didn't end up purchasing any, presumably because the gap between
was larger than the gap between
but I would like to signal that I am willing in general to bid non-trivial amounts for academic work on X-risk and value drift.
The technique I used to value it was to estimate how long it would take MIRI to produce such a piece, how valuable it was compared to MIRI's typical output, and how much it costs MIRI to hire researchers.
The fact it was a doctoral thesis did cause me to assign a significant discount to my valuation. I'm not really sure how to think about this.
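The valuation technique described above amounts to a product of a few factors. Every number below is a placeholder I've invented purely for illustration; none are Larks's actual inputs:

```python
# Illustration of the MIRI-replacement-cost valuation described above.
# All figures are made-up placeholders, not actual estimates.

researcher_months = 3      # time for MIRI to produce a comparable piece (assumed)
cost_per_month = 10_000    # cost of a MIRI researcher-month in $ (assumed)
relative_value = 0.5       # value relative to MIRI's typical output (assumed)
thesis_discount = 0.5      # discount for being part of a doctoral thesis (assumed)

valuation = researcher_months * cost_per_month * relative_value * thesis_discount
print(valuation)  # 7500.0
```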
This was another project that was very hard to evaluate. Overall, we were sufficiently uncertain and skeptical about the content of the chapter that we thought it was very unlikely that we would have a higher (honest) valuation than the author, and we didn't make a more detailed evaluation.
Thanks for running this Paul! It's a really interesting idea. Thanks for your hard work. I enjoyed the valuation process.
Submission: "I facilitated and led Computer Science Education Week activities for 25 5-7 year olds at an elementary school in Santa Monica, CA. The event lasted approximately 90 minutes and included roughly an hour of preparation time in advance."
Our very crude evaluation:
This was pretty tough to compare to the direct EA work, and it was a small project. We did a ridiculous evaluation just to see how it turned out.
We tried to make the comparison by thinking about how many dollars of EA donations would be required to achieve a comparably good outcome (according to our values). To do this we considered the "scaled up" version of the activity: one that reached essentially all US youth from a single year and was repeated enough to have a substantial effect on their education and view of computer science, which we imagined as roughly 2 million copies of the session (about 10 sessions per student).

We then compared that impact to the effect of targeted funding in the areas of science with the most leverage and altruistic impact. We guesstimated that the impact would be about 1/100k the impact of an EA grant the size of the annual US budget for R&D and science, which is something like $1T. (The 100k came from multiplying estimates for the impact of marginal STEM education on research quality, enthusiasm, etc., the relative importance of research quality vs. funding, and the extra bang-for-your-buck from targeting the best areas and spending money effectively. We chose the annual US budget because the scaled-up intervention reached one year of students in the US. The estimate for the impact of marginal STEM education takes into account the fact that the intervention is just 10 sessions.)
This all suggests a value of something like $5 of stimulated EA donations, which we wouldn't take too seriously :)
(In case it's not clear, we don't endorse this procedure for prioritizing very different causes.)
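For what it's worth, the chain of numbers above does reproduce the headline figure:

```python
# Reproducing the (self-described ridiculous) arithmetic above.

us_science_budget = 1e12   # annual US R&D and science spending ($), per the post
leverage_discount = 1e5    # the 1/100k factor for targeting and marginal impact
scale_factor = 2e6         # one session is ~1/2,000,000 of the scaled-up program

scaled_up_value = us_science_budget / leverage_discount  # $10M for the full program
per_session = scaled_up_value / scale_factor
print(per_session)  # 5.0, i.e. ~$5 of stimulated EA donations
```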
Submission: Ryan Carey and Brayden McLean's organization of EA Melbourne during 2013, summarized here.
Our crude evaluation:
We think the EA community in Melbourne has had a significant impact, with many people either making large donations or working on EA projects full time who would not have done so otherwise. We're evaluating this impact at around $100k / year, even considering only donations and replacement costs at EA organizations, which we expect to significantly understate the real impact.
Many of these effects seem likely to be long-lasting, and we feel comfortable extending the benefits over at least 5 years.
It is much harder to attribute these impacts to the formal EA Melbourne group, as compared to the informal community, the LW group, TLYCS, the actions of individual EAs in Melbourne, and so on. We had a few conversations to try to get a better sense of this allocation of responsibility. In the end we certainly didn't get a confident answer, but we got a vague intuitive feeling for the situation.
Based on this impression, we allocated 10% of the responsibility to the formal organization of EA Melbourne. Interpreted narrowly we think this is more likely to be an overestimate than an underestimate.
But we think that there are other benefits from EA Melbourne that can justify this estimate, and which will tend to be on the same scale as the direct effect. For example, the broader EA community in Melbourne clearly had a positive effect; but EA Melbourne looks like it will have a big impact building and growing a similar community in the future. And to the extent that other organized communities and the online presence of EA played a big role, it seems that EA Melbourne has made similar contributions back to the broader EA community.
Our estimates concern the impact of EA Melbourne during its first 6 months. We assumed that Brayden and Ryan were almost entirely responsible for the founding of EA Melbourne. People other than Brayden and Ryan were clearly involved with the operation of EA Melbourne over this period. Conversely, EA Melbourne continued to exist after this initial period and it seems likely that its founding will continue to have positive impacts going forward beyond those already mentioned. In the end we called this a wash.
Overall, we evaluated EA Melbourne at $70k in stimulated EA donations.
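Putting the pieces of this evaluation together (the final step from $50k to $70k is a judgment call in the post, which I've noted in a comment rather than modeled):

```python
# Chain of estimates behind the EA Melbourne valuation above.

impact_per_year = 100_000   # donations and replacement costs ($/year)
years = 5                   # horizon over which benefits are extended
attribution_pct = 10        # share credited to the formal group (percent)

direct = impact_per_year * years * attribution_pct // 100
print(direct)  # 50000

# The final $70k figure additionally credits the community-building
# spillovers discussed above; that step is a judgment call, not arithmetic.
```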
We purchased 1/70th of this in each of rounds 2 and 3 for a total of $2700. Owen Cotton-Barratt purchased 0.4% in round 3 for $200.
The original certificate was divided evenly between Brayden and Ryan.
Submission: Oliver Habryka's organization of wrap parties for the conclusion of Harry Potter and the Methods of Rationality, summarized here.
We purchased 3% in round 1 for $400, then resold it to Larks for $300 in round 3. Larks also purchased 6.9% for $700 in round 3.
Our very rough evaluation:
Submission: Joao Fabiano's research comparing caffeine and modafinil. His description:
"I did a literature review, cost-benefit analysis, ethical assessment and produced a dissertation, blog posts and gave several talks about it. Most of the research was done at the beginning of 2014 as part of my MPhil dissertation (available here in Portuguese, there's an English abstract on page 5). That research spawned many blog posts and presentations in and outside academia, in Brazil and in the UK. Some of these blog posts are available in English here: one, two, three"
Our very crude evaluation:
We had a hard time estimating the total impact of this research. It laid out an interesting case that modafinil is a reasonable alternative to caffeine, at least setting aside social factors. It did not seem to credibly address the main empirical questions that would motivate us to adopt either modafinil or caffeine, and we expect that most readers would be similarly skeptical (if they were sufficiently open-minded to plausibly take modafinil on the basis of analysis). We thought about evaluation by considering how hard it would be to produce a similar amount of value by paying for empirical research or critical review that would help clarify the benefits of modafinil and potentially push adoption.
Submission: a $100 donation by Telofy to GiveDirectly.
It was most convenient to measure other submissions in terms of a metric like "dollars of EA donations stimulated," so to make our decision we tried to convert "dollars moved to GiveDirectly" to this metric.
Even though many EA donations will go to GiveDirectly or other GiveWell charities, there were a few reasons we used a discount:
We're not sure exactly what the multiplier should be, especially in light of positive institutional and cultural effects from funding effective work in development. But we felt comfortable using at least a factor of 5 discount, which we think is large enough that none of the development interventions are likely to be competitive unless they are extremely leveraged.
We plan to think about the multiplier more if we receive submissions that seem like they might be competitive anyway.
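Applying the factor-of-5 floor to this submission gives an upper bound on its value in the chosen metric (my arithmetic; the post doesn't state this number):

```python
# Converting the $100 GiveDirectly donation using the stated minimum discount.

donation = 100     # dollars moved to GiveDirectly
min_discount = 5   # "at least a factor of 5"

ea_dollars = donation // min_discount
print(ea_dollars)  # 20, an upper bound; the true multiplier may well be larger
```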
Submission: Ben Kuhn's blog post I just sold half of a blog post, explaining and advertising the impact purchase.
We didn't really want to evaluate or purchase this item ourselves, because it seems too self-serving / meta.
Submission: Ben Kuhn's blog post Does Donation Matching Work?
At the time of submission the post had been read by about 300 people for a total of 20 hours. It has since been posted to hacker news and read by more like 1500 people for a total of 60 hours.
Is the number of reads really relevant? How come? I figure people who read content generally don't act on it, and certainly not in high impact ways.
I don't think it's that important, but I certainly think it's relevant. I expect better and more impactful pieces to be read more often.
It's cited mostly because it's a relevant fact that you couldn't infer from public info.
We purchased 1/2 of this post in round 1 for $600. We have since sold 1/3 of the post to Owen Cotton-Barratt for $400. Current holdings:
Who is "Impact Purchase"? Is that you?
Katja and I jointly.
Our very crude evaluation:
In addition to this analysis we did some sanity-checking based on readership and the plausible direct impact of the post on smaller matching drives. Overall this seemed like a plausible estimate to us.