

Researching Causality and Safe AI at Oxford

Previously, founder (with help from Trike Apps) of the EA Forum.

Discussing research etc at https://twitter.com/ryancareyai.


Topic contributions

I think the core issue is that the lottery wins you government dollars, which you can't actually spend freely. Government dollars are simply worth less, to Pablo, than Pablo's personal dollars. One way to see this is that if Pablo could spend the government dollars on the other moonshot opportunities, then it would be fine that he's losing his own money.

So we should stipulate that after calculating abstract dollar values, you have to convert them, by some exchange rate, to personal dollars. The exchange rate simply depends on how much better the opportunities are for personal spending, versus spending government money.
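As a toy illustration of the exchange-rate point (the numbers here are hypothetical, not from the comment): an opportunity can have positive expected value in abstract dollars yet negative value once the payout is converted into personal dollars.

```python
# Toy illustration (hypothetical numbers): a lottery ticket is bought with
# personal dollars, but the prize arrives as government-restricted dollars.
TICKET_COST = 1.0    # cost in personal dollars
PRIZE_EV = 3.0       # expected prize in abstract (government) dollars
EXCHANGE_RATE = 0.2  # personal-dollar value of one government dollar

# Convert the prize to personal dollars before comparing with the cost.
expected_personal_value = PRIZE_EV * EXCHANGE_RATE - TICKET_COST
print(expected_personal_value)  # -0.4: a "win" in abstract dollars, a loss in personal dollars
```

The exchange rate here stands in for how much better the decision-maker's opportunities are for personal spending than for spending government money.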

The fact that opportunities can get larger than your budget seems not to be the core issue, for the reason you mention: at realistic sizes of opportunity, it is possible to buy a lottery ticket for a chance at the opportunity instead.

Also Nick Bostrom, Nick Beckstead, Will MacAskill, Ben Todd, some of whom have been lifelong academics.

Probably different factors in different cases.

It sounds like you would prefer the rationalist community prevent its members from taking taboo views on social issues? But in my view, an important characteristic of the rationalist community, perhaps its most fundamental, is that it's a place where people can re-evaluate the common wisdom, with a measure of independence from societal pressure. If you want the rationalist community (or any community) to maintain that character, you need to support the right of people to express views that you regard as repulsive, not just the views that you like. This could be different if the views were an incitement to violence, but proposing a hypothesis for socio-economic differences isn't that.

In my view, what's going on is largely these two things:

[rationalists etc] are well to the left of the median citizens, but they are to the right of [typical journalists and academics]

Of course. And:

biodeterminism... these groups are very, very right-wing on... eugenics, biological race and gender differences etc.-but on everything else they are centre-left. 

Yes, ACX readers do believe that genes influence a lot of life outcomes, and favour reproductive technologies like embryo selection, which are right-coded views. These views are actually not restricted to the far-right, however. Most people will choose to have an abortion when they know their child will have a disability, for example.

Various of your other hypotheses don't ring true to me. I think:

  • People aren't self-deceiving about their own politics very much. They know which politicians and intellectuals they support, and who they vote for. 
  • Rationalist leadership is not very politically different from the rationalist membership. 
  • Sexual misbehaviour doesn't change perceived political alignment very much.
  • The high % of male rationalists is at most a minor factor in the difference between perceived and actual politics.

This was just a "where do you rate yourself from 1-10" type question, but you can see more of the questions and data here.

I think the trend you describe is mostly an issue with "progressives", i.e. "leftists", rather than an issue for everyone left of centre. And the rationalists don't actually lean right in my experience; they average more like anti-woke and centrist. The distribution in the 2024 ACX survey below has perhaps a bit more centre-left, and a bit less centre and centre-right, than the rationalists at large, but not by much, in my estimation.

There is one caveat: if someone acting on behalf of an EA organisation truly did something wrong which contributed to this fraud, then obviously we need to investigate that. But I am not aware of any evidence to suggest that happened.

I tend to think EA did. Back in September 2023, I argued:

EA contributed to a vast financial fraud, through its:

  • People. SBF was the best-known EA, and one of the earliest 1%. FTX’s leadership was mostly EAs. FTXFF was overwhelmingly run by EAs, including EA’s main leader, and another intellectual leader of EA. 
  • Resources. FTX had some EA staff and was funded by EA investors.
  • PR. SBF’s EA-oriented philosophy on giving, and purported frugality served as cover for his unethical nature.
  • Ideology. SBF apparently had an RB ideology, as a risk-neutral act-utilitarian, who argued a decade ago why stealing was not in-principle wrong, on Felicifia. In my view, his ideology, at least as he professed it, could best be understood as an extremist variant of EA.

Of course, you can argue that contributing (point 1) people-time and (2) resources is consistent with us having just been victims, although I think that glosses over the extent to which EA folks at FTX had bought into Sam's vision, and folks at FTXFF might have more mildly lapsed in judgment. And we could regard (3) the PR issue as minor. But even so, (4) the ideology is important. FTX wasn't just any scam. It was one that a mostly-EA group was motivated to commit, to some degree or other, based on EA-style/consequentialist reasoning. There were several other instances of crypto-related crimes in and around the EA community. And the FTX implosion shared some characteristics with those events, and with other EA scandals. As I argued:

Other EA scandals, similarly, often involve multiple of these elements:

[Person #1]: past sexual harassment issues, later reputation management including Wiki warring and misleading histories. (norm-violation, naive conseq.)
[Person #2]: sexual harassment (norm-violation? naive conseq?)
[Person #3] [Person #4] [Person #5]: three more instances of crypto crimes (scope sensitivity? Norm-violation, naive conseq.? naivete?)
Intentional Insights: aggressive PR campaigns (norm-violation, naive conseq., naivete?)
Leverage Research, including partial takeover of CEA (risk appetite, norm-violation, naive conseq, unilateralism, naivete)
(We’ve seen major examples of sexual misbehaviour and crypto crimes in the rationalist community too.)

You could still argue that some of these elements are shared with all financial crime. But then why have EAs committed >10% of the largest financial frauds of all time, while making up about one millionth of the world's population, and less than 0.1% (perhaps 0.01%) of its startups? You can suppose that we were just unlucky, but I don't find that particularly convincing.
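The over-representation claim can be made concrete with a back-of-the-envelope ratio. The shares below just restate the figures in the comment; everything else is arithmetic.

```python
# Back-of-the-envelope: how over-represented are EAs among the largest frauds?
fraud_share = 0.10        # ">10% of the largest financial frauds" (from the comment)
population_share = 1e-6   # "about one millionth of the world's population"
startup_share = 0.001     # "less than 0.1% ... of its startups"

print(fraud_share / population_share)  # 100000.0: over-representation vs. world population
print(fraud_share / startup_share)     # 100.0: over-representation vs. share of startups
```

Even on the most charitable baseline (share of startups), the over-representation is around two orders of magnitude, which is the gap the "just unlucky" hypothesis has to explain.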

I think that at this point, you should want to concede that EA appears to have contributed to FTX in quite a number of ways, and not all of them can be dismissed easily. That's why I think a more thorough investigation is needed.

As for PR, I simply think that shouldn't be the primary focus, and that it is far from the most important consideration on the current margin. First, we need to get the facts in order. Then we need to set out the strategy. And then, based on what kind of future EA deserves to have, we could decide how and whether to try to defend its image.

Their suggestions are relatively abstract, but you might consider reading Katja and Robin on the general topic of whether to focus on contributing money vs other things when you're young.

Yes, that's who I meant when I said "those working for the FTX Future Fund".

This is who I thought would be responsible too, along with the CEO of CEA, whom they report to (and those working for the FTX Future Fund, although their conflictedness means they can't give an unbiased evaluation). But since the FTX catastrophe, the community health team has apparently broadened its mandate to include "epistemic health" and "Special Projects", rather than narrowing it to focus just on catastrophic risks to the community, which would seem to make EA less resilient in one regard than it was before.

Of course I'm not necessarily saying that it was possible to put the pieces together ahead of time, just that if there was one group responsible for trying, they were it.
