Wiki Contributions


If You're So Smart, Why Aren't You Governor Of California? (Scott Alexander: Astral Codex Ten)

Yes, I did kind of see this coming (although not in the US). I've been working on a forum post about it for about a year, and now I will finish it.

A formalization of neglectedness

Yeah, I wrote it in Google Docs and then couldn't figure out how to transfer the del and suffixes to the forum.

More EAs should consider “non-EA” jobs

I think this is correct, and that EA thinks about neglectedness wrong. I've been meaning to formalise this for a while and will do that now.
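For context on what a formalisation would be departing from, the decomposition the EA community usually works with is the 80,000 Hours importance–tractability–neglectedness factoring of the marginal value of resources (my rendering of the standard framework, not the commenter's own formalisation):

```latex
\frac{\text{good done}}{\text{extra resources}}
= \underbrace{\frac{\text{good done}}{\text{\% of problem solved}}}_{\text{importance}}
\times \underbrace{\frac{\text{\% of problem solved}}{\text{\% increase in resources}}}_{\text{tractability}}
\times \underbrace{\frac{\text{\% increase in resources}}{\text{extra resources}}}_{\text{neglectedness}}
```

Under this factoring, the neglectedness term is roughly $1/R$, where $R$ is the resources already going to the problem, which is why crowded causes score poorly on it.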

Nathan_Barnard's Shortform

If preference utilitarianism is correct, there may be no utility function that accurately describes the true value of things. This will be the case if people's preferences aren't continuous or aren't complete, for instance if they're expressed as a vector rather than a single number. This generalises to other forms of consequentialism that don't have a utility function baked in.
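A concrete instance of the vector-valued case (my illustration, not the commenter's): lexicographic preferences over two-dimensional outcomes are complete and transitive, yet by a classic result of Debreu no real-valued utility function represents them, because any gain on the first coordinate outweighs any gain on the second.

```python
def lex_prefers(a, b):
    """Return True if outcome a is strictly preferred to b under
    lexicographic preferences: compare the first coordinate, and use
    the second coordinate only to break ties."""
    if a[0] != b[0]:
        return a[0] > b[0]
    return a[1] > b[1]

# Any gain on the first coordinate dominates any gain on the second:
print(lex_prefers((1.0, 0.0), (0.999, 1000.0)))  # True
# Ties on the first coordinate fall through to the second:
print(lex_prefers((1.0, 2.0), (1.0, 1.0)))       # True
```

The relation is a total order, so preferences are well defined; what fails is the existence of a single continuous number summarising them, which is exactly the gap between "has preferences" and "has a utility function".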

Nathan_Barnard's Shortform

A 6-line argument for AGI risk

(1) Sufficiently intelligent systems have capabilities that are ultimately limited only by physics and computability.

(2) An AGI could be sufficiently intelligent that it's limited only by physics and computability, but humans can't be.

(3) An AGI will come into existence.

(4) If the AGI's goals aren't the same as humans', human goals will only be met for instrumental reasons, and the AGI's goals will be met.

(5) Meeting human goals won't be instrumentally useful in the long run for an unaligned AGI.

(6) It is more morally valuable for human goals to be met than an AGI's goals.

Non-consequentialist longtermism

Thank you, those both look like exactly what I'm looking for.

Non-consequentialist longtermism

But thank you for replying; in hindsight my reply seems a bit dismissive :)

Non-consequentialist longtermism

Not really, because that paper is essentially just making the consequentialist claim that axiological longtermism implies that the actions we should take are those which help the long-run future the most. The Good is still prior to the Right.
