sapphire

Imo full enlightenment really means, or should mean, no suffering. There is no necessary suffering anyway. The Buddha, or the classic teaching, is pretty clear if you ask me. One can debate how to translate the noble truths, but it's pretty clear to me the third one says suffering can be completely overcome.

FWIW you can make much faster progress combining meditation with psychedelics. Though as the Buddha said, you must investigate for yourself; don't take anyone's word for spiritual truth. Also, enlightenment absolutely does make you better at most stuff, including partial enlightenment. People just say 'you can suffer and be enlightened' and 'enlightenment doesn't make you better at things' because they either want to feel accomplished or be accomplished. The Buddha sought the highest star; he was never satisfied by the teachers of his time. Let us emulate him by seeking only the highest star. In fact, let's not settle for merely copying his methods. The original Sangha didn't even have LSD; we can do one better.

There are a lot of possible answers to where thoughts come from and which thoughts are useful. One charitable thought is that some elite EAs tried to do things which were all of: hard, extremely costly if you fuck them up, and beyond what they were able to achieve given the difficulty. I have definitely updated a lot toward trying things that are very crazy but at least obviously only hurt me (or people who follow my example, but those people made their own choice). Fail gracefully. If you don't know how competent you are, make sure not to mess things up for other people. There is a lot of 'theater' around this, but most people don't internalize what it really means.

Answer by sapphire

The people who initially set up GiveWell, did the research, and convinced Dustin to donate his money did a truly amazing job. AFAICT the people who currently run GiveWell are doing a good job. A large fraction of the good EA has done, in total, is due to their work.

But I don't think it's a good idea to frame things as though there is a bunch of elite EAs whose work is of superb quality. The EA leadership has fucked up a bunch of stuff. Many 'elite EAs' were not part of the parts of EA that went well. Many were involved in the parts of EA that went quite poorly.

If you are a true altruist, you should really reconsider whether you even want to trust the leadership and work under their direction. Maybe you should work at a different sort of charity, or get funding from someone who doesn't ultimately get their money from GiveWell. Unless you really fit in well with the 'elite EAs', doing that is likely to be more fun.

'Think for yourself about how to make the world better, and then do it (assuming it's not insane)' is probably going to be both better for you and better for the world.

I don't think it makes any sense to punish people for past political or moral views they have sincerely recanted. There is some sense in which it shows bad judgement, but ideology is a different domain from most. I am honestly quite invested in something like 'moral progress'. It's a bit of a naive position to have to defend philosophically, but I think most altruists are too, at least if they are being honest with themselves. Lots of people are empirically quite racist. Very few people grew up with what I would consider to be great values. If someone sincerely changes their ways, I'm happy to call them brother or sister. Have a party. Slaughter the uhhhhh fattest pumpkin and make vegan pumpkin pie.

However, Mr. Hanania is still quite racist. He may or may not still be more of a Nazi than he lets on, but even his professed views are quite bad. I'm not sure what the policy should be on cooperating with people with opposing value sets, or on Hanania himself. I just wanted to say something in support of being truly welcoming to anyone who genuinely rejects their past harmful ideology.

Not to state the obvious, but the 'criticism of EA' posts didn't pose a real risk to the power structure. It is uhhhhh quite common for 'criticism' to be a lot more encouraged/tolerated when it isn't threatening.

I'm not trying to get dignity points. I'm just trying to have a positive impact. At this point, if AI is hard to align, we all die (or worse!). I spent years trying to avoid contributing to the problem and helping when I could. But at this point it's better to just hope alignment isn't that hard (lost-cause timelines) and try to steer the trajectory positively.

IME you can induce much more torture than a tattoo relatively safely. Though all the best 'safe' forms of torture do cause short-term damage to the skin.

I mean 'at what income do GWWC pledgers actually start donating 10%+?'. Or more precisely: consider the set of GWWC pledge takers who make at least X per year; for what value of X is the mean donation at least X/10? The value of X you get is around one million per year. Donations are of course even lower for people who didn't take the pledge! Giving 10% when you make one million PER YEAR is not a very big ask. You will notice EAs making large but not absurd salaries, like 100-200K, give around 5%. Some EAs are extremely altruistic, but the average EA isn't that altruistic imo.
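The threshold definition above can be sketched in a few lines of code. This is a minimal illustration with entirely made-up numbers, not real GWWC data; the point is just to make the definition precise: scan candidate incomes X and return the first one where pledgers earning at least X donate, on average, at least X/10.

```python
# Illustrative sketch of the threshold definition: among pledgers
# earning at least X, find the smallest X at which the mean donation
# reaches X/10. All figures below are invented for the example.

# (income, donation) pairs in dollars per year -- hypothetical data
pledgers = (
    [(100_000, 5_000)] * 100   # many mid-income pledgers giving 5%
    + [(200_000, 6_000)] * 10  # higher earners giving 3%
    + [(1_000_000, 150_000)]   # one very high earner giving 15%
)

def donation_threshold(pledgers):
    """Smallest observed income X such that pledgers earning >= X
    donate at least X/10 on average; None if no X qualifies."""
    for x in sorted({income for income, _ in pledgers}):
        donations = [d for inc, d in pledgers if inc >= x]
        if sum(donations) / len(donations) >= x / 10:
            return x
    return None

print(donation_threshold(pledgers))  # -> 1000000 with this made-up data
```

With this invented distribution, the mean donation among everyone earning 100K+ is well under 10K (the many 5% givers drag it down), so the threshold only clears at the one million income level, mirroring the claim in the comment.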
