Most EAs want to be rich and close to power. Or at least they are way more into the "effective" optimization part than the altruism. They talk a big game but getting in early on a rising power (AI companies) is not altruistic. Especially not when you end up getting millions in compensation due to very rapid valuation increases.
I made a large amount of money in the 2021 crypto boom. I made a much smaller, though large for me, amount in the 2017 crash. I have never had a high-paying job. Often I have had no job at all. My long-term partner has really bad...
It's not as much a pivot as a codification of what has been long true.
"EA is (experientially) about AI" has been sorta true for a long time. Money and resources do go to other causes. But the most influential and engaged people have always been focused on AI. EA institutions have long systematically emphasized AI. For example many editions of the EA handbook spend a huge fraction of their introductions to other cause areas effectively arguing why you should work on AI instead. CEA staffers very heavily favor AI. This all pushes things very hard in on...
I think it's better to start something new. Reform is hard, but no one is going to stop you from making a new charity. The EA brand isn't in the best shape. Imo the "new thing" can take money from individual EAs but shouldn't accept anything connected to OpenPhil/CEA/Dustin/etc.
If you start new you can start with a better culture.
I spent all day crying about this. An arms race is about the least safe way to approach this. And we contributed to it. Many important people read Leopold's report. He promoted it quite hard. But the background work predates Leopold's involvement.
We were totally careless and self aggrandizing. I hope other people don't pay for our sins.
This sounds very much like the missile gap/bomber gap narrative, and yeah this is quite bad news if they actually adopt the commitments pushed here.
The evidence that China is racing to AGI is, quite frankly, very thin, and I see a very dangerous arms race that could come of this:
Criticism of who? If anything EAs have been far too trusting of their actual leaders. Conversely, they have been far too critical of people like Holly. It's not a simple matter of some parameter being too high.
Holden is married to Dario Amodei's sister. Dario is a founder of Anthropic. Holden was a major driver of EA AI policy.
Dustin is a literal billionaire who, along with his wife, has control over almost all EA institutions. Being critical of Dustin, while at all relying on EA funding or support, is certainly brave. Open Phil is known to be quite capricio...
I'm quite leftwing by manifest standards. I'm probably extremely pro-woke even by EA standards. I had a great time at less-online/summer-camp/manifest. I honestly tried to avoid politics. Unlike many people I don't actually like arguing. I'd prefer to collaborate and learn from other people. (Though I feel somewhat 'responsible for' and 'invested in' EA and so I find it hard not to argue about that particular topic). I mostly tried to talk to people about finance, health and prediction markets. Was honestly super fun and easy. People didn't force me to dis...
Emile seems to donate quite a bit:
"I’m passionate about alleviating global poverty, and have pledged to give away everything I earn over $40,000 a year. In December 2022, I started a fundraiser with Nathan Young, an Effective Altruist, that raised more than $321,000 for the charity Give Directly." -- https://www.xriskology.com/
I'm also quite critical of EA and have donated more than most EAs (both in absolute and percentage terms).
Even annoying critics may be quite sincere.
I donated a lot. Both in absolute and percentage terms. I gave a percentage many times higher than even most well off EAs. I think it would have been selfish to just keep the money. But I don't have any particularly great feelings about how I donated. 'Things are complicated' can be an applause light. Sometimes things aren't all that complicated. But this topic sure is. Saying 'those who criticize the movement as a whole are deeply intellectually unserious' just seems unserious to me. The movement has a lot of structural problems. Both 'extremely positive'...
Imo full enlightenment really means, or should mean, no suffering. There is no necessary suffering anyway. The Buddha, or at least the classic teaching, is pretty clear if you ask me. One can debate how to translate the noble truths, but it's pretty clear to me the fourth one says suffering can be completely overcome.
FWIW you can get much faster progress combining meditation with psychedelics. Though as the Buddha said, you must investigate for yourself; don't take anyone's word for spiritual truth. Also enlightenment absolutely does make you better at most stuf...
There are a lot of possible answers to where thoughts come from and which thoughts are useful. One charitable thought is that some elite EAs tried to do things which were all of: hard, extremely costly if you fuck them up, and beyond what they were able to achieve given the difficulty. I have definitely updated a lot toward trying things that are very crazy but at least obviously only hurt me (or people who follow my example, but those people made their own choice). Fail gracefully. If you don't know how competent you are, make sure not to mess things up for other people. There is a lot of 'theater' around this, but most people don't internalize what it really means.
The people who initially set up GiveWell, did the research, and convinced Dustin to donate his money did a truly amazing job. AFAICT the people who currently run GiveWell are doing a good job. A large fraction of the good EA has done, in total, is largely due to their work.
But I don't think it's a good idea to frame things as though there's a bunch of elite EAs and the quality of their work is superb. The EA leadership has fucked up a bunch of stuff. Many 'elite EAs' were not part of the parts of EA that went well. Many were involved in the parts of EA that went...
I don't think it makes any sense to punish people for past political or moral views they have sincerely recanted. There is some sense in which it shows bad judgement, but ideology is a different domain from most. I am honestly quite invested in something like 'moral progress'. It's a bit of a naive position to have to defend philosophically, but I think most altruists hold it too. At least if they are being honest with themselves. Lots of people are empirically quite racist. Very few people grew up with what I would consider to be great values. If someone sincere...
I mostly agree with this, and upvoted strongly, but I don't think the scare quotes around "criticism" are warranted. Improving ideas and projects through constructive criticism is not the same thing as speaking truth to power, but it is still good and useful; it's just a different good and useful thing.
I'm not trying to get dignity points. I'm just trying to have a positive impact. At this point, if AI is hard to align, we all die (or worse!). I spent years trying to avoid contributing to the problem and helping when I could. But at this point it's better to just hope alignment isn't that hard (lost-cause timelines) and try to steer the trajectory positively.
I mean 'at what income do GWWC pledgers actually start donating 10%+'. Or more precisely: 'consider the set of GWWC pledge takers who make at least X per year; for what value of X is the mean donation at least X/10?'. The value of X you get is around one million per year. Donations are of course even lower for people who didn't take the pledge! Giving 10% when you make one million PER YEAR is not a very big ask. You will notice EAs making large, but not absurd, salaries, like 100-200K, give around 5%. Some EAs are extremely altruistic, but the average EA isn't that altruistic imo.
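To be concrete about the computation I mean, here is a minimal sketch. The function name and the numbers are invented for illustration; this is not real GWWC or EA survey data, just the shape of the threshold calculation:

```python
# Sketch of the threshold computation described above. The data below is
# made up for illustration -- it is NOT the real GWWC survey data.
def pledge_threshold(records, candidates):
    """Smallest candidate X such that pledgers earning at least X
    donate, on average, at least X/10. Returns None if no X qualifies."""
    for x in sorted(candidates):
        cohort = [donated for income, donated in records if income >= x]
        if cohort and sum(cohort) / len(cohort) >= x / 10:
            return x
    return None

# Illustrative: thirty pledgers at $100k giving 3% each,
# one pledger at $1M giving 11%.
data = [(100_000, 3_000)] * 30 + [(1_000_000, 110_000)]
print(pledge_threshold(data, [100_000, 1_000_000]))  # -> 1000000
```

Note the mean at the $100k cutoff gets dragged up by the millionaire, yet still falls short of $10k; only at the $1M cutoff does the cohort clear its own 10% bar, which is the pattern the survey data shows.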
I agree with the thrust of the argument, but I think it's a little too pessimistic. A lot of EAs aren't especially altruistic people. Tons of EAs got involved because of x-risk. And it requires very little altruism to care about whether you and everyone you know will die. You can look at the data on EA donations and notice they aren't that high. EAs don't donate 10% until they have a pre-tax income of around one million dollars per year!
Hi folks, I’m coming in as a mod. We're doing three things with this thread: we're issuing two warnings and encrypting one person's name in rot13.
Discussions of abuse and sexual misconduct tend to be difficult and emotionally intense, and can easily create more confusion and hurt than clarity and improvement. They are also vitally important for communities — we really need clarity and improvement!
So we really want to keep these conversations productive and will be trying our best.
1.
We’re issuing a warning to @sapphire for this...
sapphire leaves out that the bits they quote in their document look like this now, and have since just a few days after posting:
...[Edit]
What used to stand in this place was an imagined apology, generated by [my model of Brent] plus [my sense of what could be the *least* bad state of affairs that's consistent with reality].
I took that least-bad-of-all-possible-explanations, and wrote a statement out of it, specifically so that the discussion would not anchor on the most-bad-of-all-possible-explanations, the way it sometimes does, to the detriment of our moral
Okay I expect that is the default consensus, and is my default general desire too from a point of ignorance about any given case. I was just surprised that actors such as that weren't listed in this writeup.
I would also like to say, though, that if you handle enough cases, eventually one will be handled in a way that you could call mismanagement. Extreme mismanagement, or generally having poor policies, is one thing; but slight mismanagement now and again is a bug of the world. I don't expect 1000/1000 cases to be handled perfectly. Handling sexual...
[Edit: I apologize for the rude and aggressive tone of some of this comment. In case it is of use to others, I have written more here on what I am doing to make sure I don't cause a disruption or potentially hurt someone's feelings again. Contributing to a healthy forum environment is important to me: https://bit.ly/40dfT90 ]
Responding in case journalists stop by. I do not think [Qhapna] is abusive, and I don't think those claims would bear out if you investigated his treatment of people today. I can easily state that and verify that as someone who follows him...
In my experience anonymous accounts work fine? What's important is having the information in public. Whether the account is anonymous or not isn't very predictive of whether effective change occurs. For example, Brent was defended by CFAR but got kicked out once anonymous accounts were posted publicly.
If you cannot tell Duncan Sabien is an abusive person from reading his facebook posts you should probably avoid weighing in on community safety. He makes his toxicity and aggression extremely obvious. Lots of people have gotten hurt.
(Of course there is other evidence, like the fact that he constantly defends bad behavior by others. He was basically the last person publicly defending Brent. But he continues to be considered a community leader with good judgment.)
I think it's a negative update, since lots of the people with bad judgment remained in positions of power. This remains true even if some people were forced out. AFAIK Mike Valentine was forced out of CFAR for his connections to Brent, in particular greenlighting Brent meeting with a very young person alone. Though I don't have proof of this specific incident. Unsurprisingly, post-Brent, Anna Salomon's defenses included Mike Vassar.
With the exception of Brent, who is fully ostracized afaik, I think you seriously understate how much support these abusers still have. My model is, sadly, that a decent number of important rationalists and EAs just don't care that much about the sort of behavior in the article. CFAR investigated Brent and stood by him until there was public outcry! I will repost what Anna Salomon wrote a year ago, long after his misdeeds were well known. Lots of people have been updating TOWARD Vassar:
...I hereby apologize for the role I played in X's ostracism from the communi
CFAR investigated Brent and stood by him until there was public outcry!
This says very bad things about the leadership of CFAR, and probably other CFAR staff (to the extent that they either agreed with leadership or failed to push back hard enough, though the latter can be hard to do).
It seems to say good things about the public that did the outcry, which at the time felt to me like "almost everyone outside of CFAR". Everyone* yelled at a venerable and respected org until they stopped doing bad stuff. Is this a negative update against EA/rationality, ...
I think beating the uhhh 'market' is a lot easier than the EMH friends think. But it's not exactly easy being a +EV 'gambler'/speculative-investor. Your counterparties usually aren't total idiots*. You are better off passing unless you think a bet is both really good and you can get in at least decent money. It's good policy to restrict your attention to only cases which plausibly fulfill both conditions**.
Ad hoc bets also have a very serious adverse selection problem. And in some cases betting people in private when they are being morons makes me feel preda...
Effective altruism's meta-strategy is about friendliness to (tech) power. All our funding comes from tech billionaires. We recruit at elite colleges. We strongly prioritize good relations with AI labs and the associated big tech companies. EA just isn't going to be genuinely critical or antagonistic toward the powerful groups we depend on for support and status. Not how EA works.
My honest reaction was: This is finally being taken sort of seriously. If an EVF board member acted badly then the community can't just pretend the Time article is about people totally peripheral to the community. At least we got some kind of accountability beyond "the same team that has failed to take sufficient action in the past is looking into things."
It honestly does feel like the dialogue is finally moving in a good direction. I already knew powerful people in EA acted very badly. So it's honestly a relief it seems like we might get real change.
A comment I made a few days ago said "But usually very little changes until someone goes public (at least anonymously). Nothing else remotely reliably creates the momentum to get bad actors out of power." Really aged quite well.
As always I would advise survivors who want change to be as public as possible. Anonymous public statements work fine. Of course prioritize your own safety. But private internal processes are not a vehicle for change. Owen would, as predicted, still be on the board if not for the Time article.
I think that's the public image, but it isn't how things actually work internally. I'd really recommend reading this comment by Buck: "You've also made the (IMO broadly correct) point that a lot of EA organizations are led and influenced by a pretty tightly knit group of people who consider themselves allies". Notably the post is pretty explicit that any proposed changes should be geared toward getting this small group onboard.
It is less public (at this point) but some of the core EAs have definitely been capricious in terms of who they want to re...
Okay so, if you'll bear with me a moment, your comment has actually convinced me that EA is in fact not hierarchical, but I do agree with your intended point.
Buck's comment, and the parent post by ConcernedEAs, point out that there's a small, tightly-knit group that's involved in many of the core EA organizations, who all know each other and collectively influence a lot of funding outcomes.
This is not the same thing as a hierarchy. There's no middle management, no corporate ladder you have to climb, and (as far as I've seen) no office politics you have to ...
I agree that private processes are often better for survivors (though they can be worse). But usually very little changes until someone goes public (at least anonymously). Nothing else remotely reliably creates the momentum to get bad actors out of power. If the people in power weren't at least complicit, we wouldn't have these endemic problems. Notably this has already played out multiple times in rationalist and EA spaces. Brent was extremely egregious, but until public callouts nothing was seriously done about him. In fact community leaders like Eliezer...
Yeah, they can be. I went through a brutal "restorative justice" process myself (I'm trained in traditional law, and at the time, was personally insulted that a bunch of hacks thought they could replace centuries of legal work/thought), with someone EA-adjacent (though I just confirmed that my rapist has some ties to EA via Google; he's one of the 14 and not 30) - I said no for weeks, had multiple people push into a process, went along because I wanted to tell my side, was silenced, and the "mediator" texted me to encourage me to kill myself before I left the country. Obviously, I'm not advocating for that.
And also, I had no idea I could report this to CH. Nor, given how CH is handling this, would I report it today.
Working with official orgs to handle sexual abuse cases almost never goes well. For obvious reasons victims want to avoid backlash. And many victims understandably don't want to ruin the lives of people they still care about. I truly wish private processes and call-ins worked better. But the only thing that creates change is public pressure. I would always endorse being as public as you can without compromising victim privacy or pressuring victims to be more open about what happened. It is just a very unfortunate situation.
I agree with your sentiment (and upvoted so your comment doesn't get hidden), but (1) victims don't always have a strong connection to their attacker and may not care strongly, and (2) in my six years of doing this, sometimes (not always) private processes work. Most importantly, private processes are easier on the survivors, who should take precedence in any process.
Under my old screen name, I had 3 commenters say they changed their minds about rape, for example. I know my work certainly has changed people's opinions on rape, both at large a...
I will just push back on the idea, in a top-level post, that EAG admissions are not a judgment on people as EAs. CEA is very concerned about the most promising/influential EAs having useful conversations. If you are one of the people they consider especially promising or influential you will get invited. Otherwise, they might let you in if EAG seems especially useful for shaping your career. But they will also be worried that you are lowering the quality of the conversations. Here are some quotes from Eli, the lead on EA global at CEA.
...
EAG is primarily a ne
Beware Trivial Inconveniences.