Arepo

Comments

Big List of Cause Candidates

Can you spell both of these points out for me? Maybe I'm looking in the wrong place, but I don't see anything in that tag description that recommends criteria for cause candidates.

As for Scott's post, I don't see anything more than a superficial analogy. His argument is something like 'the weight by which we raise our estimation of someone for having a great idea should be much greater than the weight by which we lower it for their having a stupid idea'. Whether or not one agrees with this, what does it have to do with including on this list an expensive luxury that seemingly no-one has argued for on (effective) altruistic grounds?

Big List of Cause Candidates

Write a post on which aspect? You mean basically fleshing out the whole comment?

Big List of Cause Candidates

One other cause-enabler I'd love to see more research on is donating to (presumably early-stage) for-profits. For all that they have better incentives, it's still a very noisy space with plenty of remaining perverse incentives, so supporting those doing worse than they merit seems like it could be high value.

It might be possible to team up with some VCs on this, to see if any of them have a category of companies they like but won't invest in: perhaps because of a surprising lack of traction, perhaps because of predatory pricing by companies with worse products/ethics, or perhaps because of some other unmerited headwind.

Big List of Cause Candidates

Then I would suggest being clearer about what it's comprehensive of, i.e. by having clear criteria for inclusion.

Big List of Cause Candidates

I would like to see more about 'minor' GCRs and our chance of actually becoming an interstellar civilisation given various forms of backslide. In practice, the EA movement seems to treat the probability of recovery as 1, and we can see this attitude in this very post.

I don't think this is remotely justified. The arguments I've seen are generally of the form 'we'll still be able to salvage enough resources to theoretically recreate any given technology', which doesn't mean we can get anywhere near the economies of scale needed to create global industry on today's scale, let alone that we actually will given realistic political development. And that industry would need to reach the point where we're a reliably spacefaring civilisation, well beyond today's technology, in order to avoid meeting the usual definition of an existential catastrophe (a drastic curtailment of life's potential).

If the chance of recovery from any given backslide is 99%, then that's only two orders of magnitude between its expected badness and the badness of outright extinction, even ignoring other negative effects. And given the uncertainty around various GCRs, a couple of orders of magnitude isn't that big a deal (Toby Ord's The Precipice puts an order of magnitude or two between the probabilities of many of the existential risks we're typically concerned with).
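
To spell out the arithmetic behind that claim (a rough sketch, treating the only long-run cost of a backslide as the chance of never recovering from it, as the comment does when it ignores other negative effects):

$$\frac{\mathbb{E}[\text{long-run harm of backslide}]}{\text{harm of extinction}} \approx P(\text{no recovery} \mid \text{backslide}) = 1 - 0.99 = 10^{-2}$$

i.e. a factor of one hundred, or two orders of magnitude.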

Things I would like to see more discussion of in this area:

  • General principles for assessing the probability of reaching interstellar travel given specific backslide parameters and then, with reference to this:
  • Kessler syndrome
  • Solar storm disruption
  • CO2 emissions from fossil fuels and other climate change rendering the atmosphere unbreathable (this would be a good old-fashioned X-risk, but seems like one that no-one has discussed; in Toby's book he details some extreme scenarios where a lot of CO2 could be released that wouldn't necessarily cause human extinction via global warming, but some of my back-of-the-envelope maths based on his figures seemed consistent with this scenario)
  • CO2 emissions from fossil fuels and other climate change substantially reducing IQs
  • Various 'normal' concerns: antibiotic-resistant bacteria; peak oil; peak phosphorus; substantial agricultural collapse; moderate climate change; major wars; a reverse Flynn effect; supporting interplanetary colonisation; zombie apocalypse
  • Other concerns that I don't know of, or that no-one has yet thought of, that might otherwise be dismissed by zealous X-riskers as 'not a big deal'
Big List of Cause Candidates

I wish we could finally strike cryonics off the list. The most popular answers in the linked 'Is there a hedonistic utilitarian case for Cryonics? (Discuss)' essay seem to be essentially 'no'.

The claim that 'it might also divert money from wealthy people who would otherwise spend it on more selfish things' gives no reason to suppose that spending money on yourself in this context is somehow unselfish. 

As for 'Further, cryonics might help people take long-term risks more seriously': sure. So might giving people better health, or, say, funding long-term risk outreach. At least as plausibly, to me, constantly telling people that they don't fear death enough and should sign up for cryonics seems likely to make them fear death more, which seems like a pretty miserable thing to inflict on them.

I just don't see any positive case for this to be on the list. It seems to be a vestige of a cultural habit among Less Wrongers that has no place in the EA world.

Is it possible to change user name?

Is that an intentional policy, or just a feature that hasn't been implemented yet?

If intentional, could you say why? Obviously it could be confusing, but there are some substantial downsides to preventing it.

80,000 Hours: Anonymous contributors on flaws of the EA community

I'm not sure how public the hiring methodology is, but if it's fully public then I'd expect the candidates to be 'lost' before the point of sending in a CV.

If it's less public, that would be less likely, though perhaps the best candidates (assuming they consider applying for jobs at all, and aren't always just headhunted) would only apply to jobs with a transparent methodology that revealed a short hiring process.

Forum update: Tags are live! Go use them!

I think this will make the forum far more useful. Could you add some kind of tag list (or a prominent link to one) to the home page?

Tips for overcoming low back pain

I wonder if there's a case for carrying heavier loads on your front when you can't easily use your hands alone. It seems counterintuitive, since that would pull you forward into a hunch, but maybe what matters is working your posterior chain rather than the actual posture it temporarily puts you in.
