All of Sarthak's Comments + Replies

How can prediction markets become more trendy, legal, and accessible?

Augur is a decentralized, blockchain-based protocol that allows anyone to set up a prediction market about anything. Although I'm not sure about the legality, the fact that no single individual or institution owns or runs Augur suggests to me it might be easier to build niche/specific prediction markets on top of it.

Making discussions in EA groups inclusive

Can you clarify why you think your three criteria are enough to ascribe benign intentions the majority of the time? The point I was trying to get at was that there's no necessary relation between thinking a lot about how to make the world a better place and making sacrifices to achieve that, and also having benign intentions towards other groups. People can just more narrowly define the world that they are serving.

A concrete example of how believing women have less worth than men could be harmful in evaluating charities: one charity helps women by X utils, one charity hel…
zekesherman (3y, 0 karma):

It seems right based on all my experience talking to people, seeing what they say, considering their beliefs, and observing their behavior.

Well, in EA we don't "just more narrowly define the world that we are serving". We have philosophical rigor. There are people in EA who believe that animals have a lot of value, so they don't give money to charities that help women (or men). Are they harming women? What should we do about them?

What do you mean by equal? It's a core EA belief that the interests of women are equally valuable to the interests of men.

Also, your claim is that we should hide facts from people in order to prevent them from achieving their goals. This is only truly justified if people are actively trying to make life worse for women, which is obviously antithetical to EA. The mere fact that someone thinks women should be treated or prioritized a little differently doesn't necessarily mean that giving them facts will make their behavior worse under your view. EA is not for malevolent people; EA is for people who are trying to make the world better. If you are worried about people lying to infiltrate EA, that's not going to change no matter what we do - people could lie to infiltrate any group with any rules.

The EA cause is not a privilege. It's a duty.

In my original comment, I explicitly said that it's a two-way street. The reason that it's a two-way street is that when these kinds of issues are only resolved by making demands on the offending party, the conflict never ends - there is a continued spiral of new issues and infighting.
Making discussions in EA groups inclusive

You identify the number one issue you have with activists from demographic groups being that they are suspicious of EA motivations.

One of the major problems driving social justice fear and offense in the US right now is the failure of right-wing and centrist actors to credibly demonstrate that they're not secretly harboring bias and hate. If I were going to pick something that activists for underrepresented demographics need to revise when they look at EA, it's that they should stop applying their default suspicions to the people in EA.

And yo…

zekesherman (3y, 0 karma):

The criteria by themselves are sufficient to indicate that benign intentions are 90% likely. The remaining 10% chance is covered by the fact that we are Effective Altruists, so we extend the benefit of the doubt for a greater purpose.

If we were Buddhists, then yes, except for the fact that I am mainly talking about the offensive things that people really say, like "the variability hypothesis explains the gender disparity in academia" or "women tend to have worse policy preferences than men" and so on, which are not simple cases of rights and values.

For the most incendiary issues, there is a point where you would expect any EA to know that the PR and community costs exceed the benefits, and therefore you should no longer give them the benefit of the doubt. And I would expect such a person, if they support EA, to say "ah, I understand why I was censored about that, it is too much of a hot potato, a very understandable thing for a moderator to do." Again, just the most incendiary things, the ones that people across the political spectrum view as offensive.

Some kinds of unequal rights would be like that. But there are some issues, like maternity leave or child custody or access to combat roles in the military, where people commonly support unequal rights.

Even if you held such a belief, it does not follow that you would disregard the rights and well-being of women. You might give them less weight, but it would not matter for most purposes; the same charities and jobs would generally still be effective.

Science has improved; we know way more about people than we used to. I presume you would agree that the best science doesn't give people reasons to give unequal rights to people.

Every wrong view in the history of science has been justified with science. So what do we do about that? Well, we have to do science as well as we can. There are no shortcuts to wisdom. In hindsight, it's easy to point at ways that science went wrong in the past, but that's no good for telling us…
After one year of applying for EA jobs: It is really, really hard to get hired by an EA organisation

+1 for pointing out the hazard of having funding concentrated in the hands of a very few decision makers

Suffering of the Nonexistent

Got it. I would recommend cutting this post down roughly in half -- you take a while to get to the point (stating your thesis in roughly the 14th paragraph). I understand the desire to try to warn the audience about what is coming, but the first section before you get to the thesis just seems overwrought to me. I know cutting is hard, but I'm confident the rewards from increased clarity will be worth it.

DarwinsRottweiler (3y, 2 karma): I've added a summary. Thanks, this was the first time I wrote a post on this forum.
Suffering of the Nonexistent

Hi, I hope this doesn’t offend, but is this meant to be satire? I’m unclear if that’s the case (and I don’t think this post is well structured whether it’s meant to be satire or serious). If it’s not satire, I’ll engage more.

DarwinsRottweiler (3y, 3 karma): No offense taken. It's a serious post, but I completely understand why people would assume otherwise. I can have a bit of an eccentric take on certain topics and I'm probably not the best at explaining my own views :). If you have a recommendation on how to change the structure to make it look more serious, please tell me.
After one year of applying for EA jobs: It is really, really hard to get hired by an EA organisation

That makes sense — on a second look, I misread your first comment. Absolutely agree that the community shouldn't have a go-big-or-go-home mentality, i.e. it shouldn't be seen as impossible to do good if you can't get an ultra-selective job at one of these organizations.

After one year of applying for EA jobs: It is really, really hard to get hired by an EA organisation

I would disagree with that line of reasoning -- as donors, we should be seeking to channel money into the places where it can do the most good, not trying to spread the opportunity to do good across different individuals within the EA movement.

So if donor A can create 10 utils by donating $1 to Org Z, or create 5 utils and one new EA job by donating $1 to Org Y, the choice seems clear. My understanding is that current research suggests this is the case. (I also agree with Arepo, however, about donors potentially being irrational.)

Agrippa (3y, 7 karma): When people say all of the top orgs have enough money, my interpretation is that I can't really create any value at all by donating to them. That is, donor A can create 0 utils by donating $1 to Org Z, because doing so doesn't actually allow Org Z to scale in a meaningful way. If I also can't work at Org Z, then donating to Org Y looks like my next best option.
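The arithmetic in this exchange can be made explicit with a small sketch. All numbers here are the hypothetical figures from the comments above (utils per dollar, jobs created), plus an assumed `utils_per_job` conversion term that the comments leave implicit; none of this reflects actual research.

```python
# Hypothetical sketch of the donation comparison discussed above.
# All figures are illustrative assumptions, not research findings.

def donation_value(direct_utils_per_dollar, jobs_created_per_dollar=0.0,
                   utils_per_job=0.0):
    """Total expected utils from donating one dollar: direct impact
    plus any value assigned to new EA jobs the donation creates."""
    return direct_utils_per_dollar + jobs_created_per_dollar * utils_per_job

# Donor A's two options, using the numbers from the comment.
# If a new EA job carries no independent value, Org Z wins:
org_z = donation_value(direct_utils_per_dollar=10)
org_y = donation_value(direct_utils_per_dollar=5,
                       jobs_created_per_dollar=1, utils_per_job=0)
assert org_z > org_y

# Agrippa's point: if Org Z is already fully funded, the *marginal*
# value of an extra dollar there may be roughly 0, which flips the
# comparison even without assigning any value to job creation:
org_z_saturated = donation_value(direct_utils_per_dollar=0)
assert org_y > org_z_saturated
```

The sketch just makes the disagreement precise: the comparison hinges entirely on what marginal (not average) utils-per-dollar figure you believe applies to a well-funded org, and on whether a new EA job has value in itself.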