Augur is a decentralized, blockchain-based protocol that lets anyone set up a prediction market about anything. Although I’m not sure about the legality, the fact that no single individual or institution owns or runs Augur suggests to me it might be easier to build niche/specific prediction markets on top of it.
Can you clarify why you think your three criteria are enough to ascribe benign intentions the majority of the time? The point I was trying to get at was that there’s no necessary connection between thinking a lot about how to make the world a better place and making sacrifices to achieve that, on the one hand, and having benign intentions towards other groups, on the other. People can simply define the “world” they are serving more narrowly.
A concrete example of how believing women have less worth than men could be harmful in evaluating charities: one charity helps women by X utils, one charity hel
You identify your number-one issue with activists from underrepresented demographic groups as being that they are suspicious of EA motivations.
One of the major problems driving social justice fear and offense in the US right now is the failure of right-wing and centrist actors to credibly demonstrate that they're not secretly harboring bias and hate. If I were going to pick something that activists for underrepresented demographics need to revise when they look at EA, it's that they should stop applying their default suspicions to the people in EA.
+1 for pointing out the hazard of having funding concentrated in the hands of a very few decision makers
Got it. I would recommend cutting this post down roughly in half -- you take a while to get to the point (stating your thesis in roughly the 14th paragraph). I understand the desire to prepare the audience for what is coming, but the first section, up until the thesis, just seems overwrought to me. I know cutting is hard, but I'm confident the rewards from increased clarity will be worth it.
Hi, I hope this doesn’t offend, but is this meant to be satire? I’m unclear if that’s the case (and I don’t think this post is well structured whether it’s meant to be satire or serious). If it’s not satire, I’ll engage more.
That makes sense — on a second look, I misread your first comment. I absolutely agree that the community shouldn’t have a go-big-or-go-home mentality, i.e., it shouldn’t be seen as impossible to do good if you can’t get an ultra-selective job at one of these organizations.
I would disagree with that line of reasoning -- as donors, we should be seeking to channel money into the places where it can do the most good, not trying to spread out the opportunity to do good across different individuals within the EA movement.
So if donor A can create 10 utils by donating $1 to Org Z, or create 5 utils plus one new EA job by donating $1 to Org Y, the choice seems clear -- unless you value that new job at more than 5 utils. My understanding is that our current research suggests this is the case. (I also agree with Arepo, however, about donors potentially being irrational.)
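To make the trade-off explicit, here's a minimal sketch of that comparison. The org names, util figures, and especially the utils we assign to a new EA job are illustrative assumptions, not research findings:

```python
# Hypothetical donor comparison; all figures are illustrative assumptions.

def best_org(options, job_util_value=0.0):
    """Pick the org maximizing total utils per $1 donated.

    options: dict of org name -> (direct_utils, jobs_created) per $1.
    job_util_value: assumed utils that one new EA job is worth.
    """
    def total(item):
        utils, jobs = item[1]
        return utils + jobs * job_util_value
    return max(options.items(), key=total)[0]

orgs = {"Org Z": (10, 0), "Org Y": (5, 1)}

# With a job valued at 0 utils, Org Z's 10 direct utils win.
print(best_org(orgs))                    # Org Z

# Only if a new job is worth more than 5 utils does Org Y come out ahead.
print(best_org(orgs, job_util_value=6))  # Org Y
```

The point of the sketch is just that the "create jobs" argument only flips the decision if the job itself is assigned a large util value.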