D0TheMath

An undergrad at the University of Maryland, College Park, majoring in math.

After finishing The Sequences at the end of 9th grade, I started following the EA community and changed my career plans toward fields that seem better aligned with preventing existential threats. I have no precise plans beyond 'keep opportunities open until you figure out where your comparative advantage lies'. Advice on this front is welcome! Just pm me!

I have since helped make the EA Jobs Twitter bot (@effective_jobs), and currently make the Effective Altruism Forum Podcast (https://anchor.fm/ea-forum-podcast).

Comments

Is EA over-invested in Crypto?

I think you should have made this post a question. Because it was a post, I assumed you had an answer, so I read it and was disappointed that you didn't conclude anything.

Forecast procedure competitions

This sounds interesting. Alternatively, you could have the procedure-makers not know in advance which questions will be forecasted, and give their procedures to people or teams with some stake in getting the forecast right (perhaps paying them in proportion to their log-odds calibration score).

After doing enough trials, we should get some idea about what kinds of advice result in better forecasts.
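For concreteness, here is a minimal sketch of one way such a payout rule could work. The log score is the standard calibration measure here; the payout scaling, function names, and example numbers are my own assumptions, not part of the original proposal:

```python
import math

def log_score(prob: float, outcome: bool) -> float:
    """Log score for a binary forecast: ln(p) if the event happened,
    ln(1 - p) otherwise. Always <= 0; closer to 0 is better."""
    p = prob if outcome else 1.0 - prob
    return math.log(p)

def payout(forecasts, base: float = 100.0) -> float:
    """Hypothetical payment rule: scale a base amount by exp of the
    average log score across all resolved questions."""
    avg = sum(log_score(p, o) for p, o in forecasts) / len(forecasts)
    # exp(avg) is the geometric mean of the probabilities assigned to
    # the realized outcomes, so it lies in (0, 1]: better calibration pays more.
    return base * math.exp(avg)

# Example: one procedure's forecasts on three resolved questions.
forecasts = [(0.8, True), (0.3, False), (0.9, True)]
print(round(payout(forecasts), 2))  # 79.58
```

Since exp of the average log score is just the geometric mean of the probabilities assigned to the realized outcomes, a forecaster who was always certain and always right would earn the full base amount, and overconfident misses are punished heavily.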

How big are the intra-household spillovers for cash transfers and psychotherapy? Contribute your prediction for our analysis.

Question 7 is a bit confusing. The answer format implies cash transfers have both a 10% and a 40% impact, and makes it impossible for (say) cash & psychotherapy to both have a 10% impact.

How To Raise Others’ Aspirations in 17 Easy Steps

With a few modifications, all of these are great questions to ask yourself as well.

Sasha Chapin on bad social norms in EA

What makes you think it isn't? To me it seems like a reasonable interpretation of the quote (private guts are precisely the kinds of positions you can't necessarily justify, and the quote is about having beliefs you can't justify), as well as a dynamic I recognize as occasionally present in the community.

Because it also mentions woo; I think it's talking about a broader class of unjustified beliefs than you think.

Even if this interpretation wasn't actually the author's intent, choosing to steelman the claim in that way turns the essay into a pretty solid one, so we might as well engage with the strongest interpretation of it.

I agree, but in that case you should make it clear how your interpretation differs from the author's. If you don't, then it looks like a motte-and-bailey is happening (where the bailey is "rationalists should be more accepting of woo & other unjustified beliefs", and the motte is "oh no! I/they really just mean you shouldn't completely ignore gut judgements, and occasionally models can be wrong in known ways but still useful"), or you may miss out on reasons the post as-is doesn't require your reformulation to be correct.

Sasha Chapin on bad social norms in EA

If this is what the line was saying, I agree. But it's not. Having intuitions & a track record of (or some reason to believe in) those intuitions correlating with reality, and having useful but known-to-be-false models of the world, is a far cry from holding unjustified beliefs & believing in woo, and the lack of the latter is what the post actually claims is the toxic social norm in rationality.

Sasha Chapin on bad social norms in EA

Sure, but that isn’t what the quoted text is saying. Trusting your gut or following social norms are not even on the same level as woo, or adopting beliefs with no justification.

If the harmful social norms Sasha actually had in mind were not trusting your gut & violating social norms with no gain, then I'd agree these actions are bad, and possibly a result of social norms in the rationality community. An alternative explanation is that the community is made up of a bunch of socially awkward nerds, who are known for social ineptness and an inability to trust their gut.

But as it stands, this doesn’t seem to be what’s being argued, as the quoted text is tangential to what you said at best.

Sasha Chapin on bad social norms in EA

"you must reject beliefs that you can't justify, sentiments that don't seem rational, and woo things."

This isn’t a toxic social norm. This is the point of rationality, is it not?

What stops you doing more forecasting?

Overthinking forecasts, which makes writing them down & tracking them diligently too much mental overhead for me to bother with.
