
Around four months ago, I suggested that we would have trouble with FTX and funding.

SBF has been giving lots of money to EA. He admits crypto is a massively speculative bubble. A crypto crash hurts the most vulnerable, because poor, uneducated people put lots of money into it (Krugman). Crypto is currently small, but it should be regulated and has potential contagion effects (BIS). EA as a whole is getting loose with its money due to large crypto inflows (MacAskill). An inevitable crypto crash leads to either a) bad optics, leading to less interest in EA, or b) lots of dead projects.

It was quite obvious that this would happen--although the specific details with Alameda were not obvious. Stuart Buck and Blonergan (and a few others) were the only ones who took me seriously at the time.

Below are some suggestions for change.
 

1. The new "support" button is great, but I think the EA Forum should have a way to *sort* by controversiality, and the forum algorithm should occasionally (some % of the time) punt controversial posts back upwards to the front page. If you're like me, you read the forum sorted by Magic (New and Upvoted). But this promotes herd mentality. The red-teaming and self-criticism are excellent, but if the only way we aggregate how "good" red-teaming is is by up-votes, that is flawed. Perhaps the best way to know that criticism has touched a nerve is to compute a fraction: how many members of the community disagree vs how many agree. (Or, even better, if you are in an organization, use a weighted fraction, where you put lower weight on the people in the organization who are in positions of power (obviously difficult to implement in practice).) A rough sketch of such a controversy score appears after this list.
 

2. More of you should consider anonymous posts. This is the EA Forum. I cannot believe that some of you delete your posts simply because they end up being downvoted. Especially if you're working higher up in an EA org, you ought to be actively voicing your dissent and helping to monitor EA.
 

For example, this is not good: 
 

"Members of the mutinous cohort told me that the movement’s leaders were not to be taken at their word—that they would say anything in public to maximize impact. Some of the paranoia—rumor-mill references to secret Google docs and ruthless clandestine councils—seemed overstated, but there was a core cadre that exercised control over public messaging; its members debated, for example, how to formulate their position that climate change was probably not as important as runaway A.I. without sounding like denialists or jerks." (New Yorker)

What makes EA, EA, what makes EA antifragile, is its ruthless transparency. If we are self-censoring because we have already concluded something is super effective, then there is no point in EA. Go do your own thing with your own money. Become Bill Gates. But don't associate with EA.
 

3. Finances should be partially anonymized. If an EA org receives some money above a certain threshold from an individual contribution, we should be transparent in saying that we will reject said money if it is not donated anonymously. You may protest that this would decrease the number of donations by rich billionaires. But take it this way: if they donate to EA, it's because they believe that EA can spend it better. Thus, they should be willing to donate anonymously, to not affect how EA spends money. If they don't donate to EA, then they can establish a different philanthropic organization and hire EA-adjacent staff, making for more competition. [Edit--see comments/revision]
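To make point 1 concrete, here is a minimal sketch of what a controversy score and an occasional front-page boost could look like. Everything below (the vote structure, the insider down-weighting, the boost probability) is a hypothetical illustration of the idea, not the forum's actual ranking code.

```python
# Minimal sketch of the controversy score from point 1. All names and
# parameters here are hypothetical illustrations, not the forum's real code.
import random
from dataclasses import dataclass

@dataclass
class Vote:
    agrees: bool                        # True = agree/upvote, False = disagree/downvote
    voter_is_org_insider: bool = False  # holds a position of power in the org under discussion

def controversy_score(votes: list[Vote], insider_weight: float = 0.5) -> float:
    """Return a score in [0, 1]: 0 = unanimous, 1 = perfectly split.

    Insider votes are down-weighted, per the weighted-fraction suggestion.
    """
    up = sum(insider_weight if v.voter_is_org_insider else 1.0
             for v in votes if v.agrees)
    down = sum(insider_weight if v.voter_is_org_insider else 1.0
               for v in votes if not v.agrees)
    total = up + down
    if total == 0:
        return 0.0
    return 2 * min(up, down) / total  # even split -> 1.0, unanimity -> 0.0

def should_boost_to_front_page(score: float, epsilon: float = 0.05) -> bool:
    """Some small percent of the time, punt a controversial post back upward."""
    return random.random() < epsilon * score
```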


Revision:

Blonergan took my previous post very seriously--apologies.

Anonymizing finances may not be the best option. I am clearly naive about the legal implications. Perhaps other members of the community have suggestions about how to better mitigate conflicts of interest.

Although I have not looked into the data myself, it appears red-teaming contest wins were mostly uncorrelated with upvotes. I still stand by having some way of pushing up controversial posts, as posts criticising FTX were heavily downvoted at the time.

Comments

Downvoted because I think this is too harsh and accusatory:

I cannot believe that some of you delete your posts simply because they end up being downvoted.

Also because I disagree in the following ways:

  • Donating anonymously seems precisely opposed to transparency. At the very least, I don't think it's obvious that donor anonymity works towards the values you're expressing in your post. Personally I think being transparent about who is donating to what organizations is pretty important for transparency, and I think this is a common view.
  • I don't think FTX's mistakes are particularly unique to crypto, but rather just normal financial chicanery.
  • "if the only way we aggregate how "good" red-teaming is is by up-votes, that is flawed"
    • IIRC the red-teaming contest did not explicitly consider up-votes in their process for granting awards, and the correlation between upvotes and prize-winners was weak.
  • "What makes EA, EA, what makes EA antifragile, is its ruthless transparency."
    • For better or for worse, I don't think ruthless transparency is a focus or a strength of EA. I agree with your sentence right after that, but I don't think that's much related to transparency.

Sorry that the post came off as having a very harsh and accusatory tone. I mainly meant to express my exasperation with how quickly the situation unfolded. I'm worried about the coming months and how this will affect the community in the long term.
Clearly, revealing who is donating is good for transparency. However, if donations were anonymized from the perspective of the recipients, I think that would help mitigate conflicts of interest. I think there needs to be more dialogue about how we can mitigate conflicts of interest, regardless of whether we anonymize. (In fact, perhaps anonymizing is not the most feasible option.)
Regarding whether the crash is just normal financial chicanery: it's kind of like saying the housing bubble wasn't due to mortgage-backed securities per se, but just to financial engineering. Clearly there is much at play here, and some attributes are unique to crypto being such a new, unregulated area.
You're right about the red-teaming contest; I meant general posts critiquing EA. Thanks for correcting.

Thanks for the clarification. I agree that the FTX problems are clearly related to crypto being such a new unregulated area, and I was wrong to try to downplay that causal link.

I don't think anonymized donations would help mitigate conflicts of interest. In fact I think it would encourage COIs, since donors could directly buy influence without anyone knowing they were doing so. Currently one of our only tools for identifying otherwise-undisclosed COIs is looking at flows of money. If billionaire A donates to org B, we have a norm that org B shouldn't do stuff that directly helps billionaire A. If that donation was anonymous, we wouldn't know that that was a situation in which the norm applied.

There are some benefits to some level of anonymity in donations. For example, I dislike the practice of universities putting a donor's name on a building in exchange for a large donation. Seems like an impressive level of hubris. I have more respect for donors who don't aggressively publicize their name in this way. However, I do think that these donations should still be available in public records. Donation anonymity ranges from "put my name on the building" at one extreme to "actively obscure the source of the donation" at the other.

I have more thoughts on donor transparency but I'll leave it there for now.

I strongly agree with the spirit of the reforms being suggested here (although I might have some different opinions on how to implement them). We need large-scale reforms of the EA community's social norms to prevent future risks to movement-wide credibility.

  1. Strongly agree. The fact that net upvotes are the only concrete metric by which EA forum posts and LessWrong forum posts are judged has indeed been suboptimal for one of EA's main goals: to reflect on and adapt our previous beliefs based on new evidence. Reforms designed to increase the engagement of controversial posts would be very helpful for our pursuit of this goal. (Disclaimer: Most of my EA forum posts would rank highly on the "controversial" scale, in that many people upvote and many people downvote them, and the top comment is usually critical and has a lot of net upvotes. I think that we EAs need to increasingly prioritize both posting and engaging with controversial arguments that run contrary to status-quo beliefs, even if it's hard! This is especially true for LessWrong, which arguably doubles as a scientific venue for AI safety research in addition to an EA-adjacent discussion forum.)
  2.  Agree, although I think EAs should be more willing to write and engage with controversial arguments non-anonymously as well.
  3. Strongly agree in spirit. While a norm of unconditionally refusing non-anonymous donations above a certain threshold might be too blunt, I do think we need to have better risk-management about tying our EA movement's credibility to a single charismatic billionaire, or a single charismatic individual in general. Given how important our work is, we probably need better risk-management practices in general. (And we EAs already care earnestly about this! I do think this is a question not of earnest desire but of optimal implementation.) I also think that many billionaires would actually prefer to donate anonymously or less publicly, because they agree with the bulk of but not all of EA's principles. Realistically, leaving room for case-by-case decision-making seems helpful.

I strongly agree with the spirit of the reforms being suggested here (although I might have some different opinions on how to implement them)

How would you do things differently?

Mostly it was about Point 3. I think an unconditional norm of only accepting anonymous donations above a certain threshold would be too blunt.

I think a version of Point 3 I would agree with is to have high-contributing donor names not be publicized as a norm (with some possible exceptions). I think this captures most of the benefits of an anonymous donation, and most potential donors who might not be willing to make an anonymous donation would be willing to make a discreet, non-publicized donation.

Upvoted because it's worth pointing out if people were hostile to suggestions that the FTX wealth was unstable--that should be openly considered and discussed. However, crypto being a speculative bubble isn't that related to SBF committing fraud, so I don't give you much credit for predicting this.

Also isn't anonymous donation kind of inimical to transparency? Seems like you're openly suggesting the kind of reputation-crafting that you condemned in the New Yorker quote.

Yes, I don’t really care about getting credit for predicting this; I pointed out my previous post mainly to give credence to my suggestions. And based on the comments of other people, maybe anonymous donations are not the best, most feasible, or most practical way to do things. But given that EAs focus very much on catastrophic tail risks, we should not become overly reliant on single donations, or on donations which generate such large conflicts of interest. I don't know what system would be best.

I don't understand #3. At the megabucks level, wouldn't anonymity deprive the community of information needed to make plans? Here, we knew the approximate net worth of the donor and that the wealth was in a high-volatility domain (although admittedly this particular risk wasn't known...), and people made various decisions with that knowledge in hand. 

Although we would have probably been better off without that knowledge this time, in general trying to hide it would make long-term planning a lot more difficult. If you give enough information to allow for effective planning, you've probably given enough information to identify the megadonor.

Also, didn't FTX establish its own organization (i.e., the FTX Foundation) to grant its money? I guess one could argue that EA-aligned organizations shouldn't accept money from such an entity, although such a stance would mean that the money goes to less effective charities instead.

Finally, as a practical matter, I think some people have to know donor identity from a legal/compliance perspective. I suspect that "my large non-profit organization's major funder is an untraceable Bitcoin address" would cause some significant practical problems. So you've created a bifurcation where some people have the knowledge ( = a potentially significant source of power), while others do not benefit from that transparency.

Edit: punctuation

Ok, I’m not too clear about the legal perspective. I guess my main purpose in this post was to start a dialogue, with some preliminary suggestions, about how we could have avoided such a situation.

I took your post seriously and had an extended exchange with you in the comments section. I indicated that I shared some of your concerns. I also expressed that I thought you had mischaracterized some of SBF's views about bitcoin and other cryptocurrencies. It appears that you have since edited the post to correct some of those mischaracterizations, but you did not acknowledge having done so, best I can tell. 

I also disagreed with your view that many good projects would lose funding if there were a crypto downturn. Unfortunately, with FTX collapsing so abruptly, there is a risk of that happening. I am hopeful that other donors will step up to fund the highest value projects funded by FTX, but this is a real challenge we face as a community. 

I'm puzzled by your statement in this new post that "It was quite obvious that this would happen..." There was certainly a risk things could go badly, and I think I personally underestimated the risk, but I don't think it is credible to say that it was obvious.

 

Apologies. Yes, thanks for reading and responding to my prior post. I believe I haven’t edited it since we last spoke in the comments section, but I did edit it when you pointed out the mischaracterizations.

Thank you for your response. And I apologize for being defensive in my comment. And for not noticing your edits when they happened.

Response lifted from a different post:

"Members of the mutinous cohort told me that the movement’s leaders were not to be taken at their word—that they would say anything in public to maximize impact. Some of the paranoia—rumor-mill references to secret Google docs and ruthless clandestine councils—seemed overstated, but there was a core cadre that exercised control over public messaging; its members debated, for example, how to formulate their position that climate change was probably not as important as runaway A.I. without sounding like denialists or jerks." (New Yorker)

What makes EA, EA, what makes EA antifragile, is its ruthless transparency. If we are self-censoring because we have already concluded something is super effective, then there is no point in EA. Go do your own thing with your own money. Become Bill Gates. But don't associate with EA.

Being honest, I do genuinely think that climate change is less important than runaway AI, primarily because of both option value issues and the stakes of the problem. One is a big problem that could hurt or kill millions, while AI could kill billions.

But I'm concerned that they couldn't simply state why they believe AI is more important than climate change rather than do this over-complicated scheme.

  1. Finances should be partially anonymized. If an EA org receives some money above a certain threshold from an individual contribution, we should be transparent in saying that we will reject said money if it is not donated anonymously. You may protest that this would decrease the number of donations by rich billionaires. But take it this way: if they donate to EA, it's because they believe that EA can spend it better. Thus, they should be willing to donate anonymously, to not affect how EA spends money. If they don't donate to EA, then they can establish a different philanthropic organization and hire EA-adjacent staff, making for more competition.

Disagree, this would make transparency worse without providing much benefit.

  1. The new button of "support" is great, but I think EA forum should have a way to sort by controversiality. And, have the EA forum algorithm occasionally (some % of the time), punt controversial posts back upwards to the front page. If you're like me, you read the forum sorted by Magic (New and Upvoted). But this promotes herd mentality. The red-teaming and self-criticism are excellent, but if the only way we aggregate how "good" red-teaming is is by up-votes, that is flawed. Perhaps the best way to know that criticism has touched a nerve is to compute a fraction: how many members of the community disagree vs how many agree. (or, even better, if you are in an organization, use a weighted fraction, where you put lower weight on the people in the organization that are in positions of power (obviously difficult to implement in practice))

Disagree here because I don't want to see an EA forum that values controversial posts.

But I'm concerned that they couldn't simply state why they believe AI is more important than climate change rather than do this over-complicated scheme.

Agree

Disagree here because I don't want to see an EA forum that values controversial posts.

Disagree. This is like saying, "Amazon shouldn't sort by 1 star, because otherwise it will get a bad reputation for selling bad products."

That's wrong. People still have the option of sorting by whatever they choose. But the forum should give more visibility to posts that break people out of their comfort zone, should they desire. 

Disagree. This is like saying, "Amazon shouldn't sort by 1 star, because otherwise it will get a bad reputation for selling bad products."

That's wrong. People still have the option of sorting by whatever they choose. But the forum should give more visibility to posts that break people out of their comfort zone, should they desire.

The reason I disagree is that, in my view, the internet already rewards controversiality and outrage way too much; the EA Forum is much better precisely because it avoids letting outrage and controversiality drive the process.

controversiality need not be extremely correlated with outrage. in fact, outrage can be very uncontroversial (school shooting). and controversiality is often productive (debate about X). my inclination is to trust the readership of this forum. promoting the visibility of controversial posts will help people discuss ideas they've neglected.

Thanks for this post. Not sure what it would take to get it onto the front page? The algorithm seems very opaque to me. 

Really like idea 1. Is there currently a way to view / find posts with overall negative votes?

None that I know of. 

Although I cheer for this,

What makes EA, EA, what makes EA antifragile, is its ruthless transparency

I really want to move to a world where radical transparency wins, but I currently don't believe that we're in such a world (I wish I could explain why I think that without immediately being punished for excess transparency, but for obvious reasons that seems impossible).

How do we get to that world? Or if you see this world in better light than I do, if you believe that the world is already mostly managing to avoid punishing important true ideas, what're the dynamics that preserve and promote that?

I like to think that an open exchange of ideas, if conducted properly, converges on the correct answer. Of course, the forum in which this exchange occurs is crucial, especially its systems and software. Compare the amount of truth that you obtain from the BBC, Wikipedia, Stack Overflow, Kialo, Facebook, Twitter, Reddit, and the EA Forum. All of these have different methods of verifying truth. The beauty of each of them is that, with the exception of the BBC, you can post whatever you want.

But the inconvenient truth will be penalized in different ways. On Wikipedia, it might get edited out for something more tame, though often not. On Stack Overflow, it will be downvoted but still available, and likely read. On Kialo, it will get refuted, although if it is the truth, it will be promoted. On Facebook and Twitter, many might even reshare it, though into their own echo chambers. On Reddit, it'll get downvoted and then posted to r/unpopularopinion.

The important thing is to design a system where it takes more work to a) post a lie, and b) refute the truth. And also, somehow design said system such that there is an incentive to a) post the truth, b) refute a lie, and, importantly, c) read and spread the truth. Whether this is by citations or a reputation-based voting system is beyond me, but it is something I've been mulling over for quite some time.
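As one illustration of the reputation-based direction, here is a minimal sketch. The update rule, step size, and function names are all assumptions for the sake of the example, not a worked-out design:

```python
# Hypothetical sketch of a reputation-weighted voting system.
# The update rule and step size are illustrative assumptions only.

def weighted_score(votes: dict[str, int], reputation: dict[str, float]) -> float:
    """votes: voter id -> +1 or -1; reputation: voter id -> weight >= 0."""
    return sum(direction * reputation.get(voter, 1.0)
               for voter, direction in votes.items())

def update_reputation(reputation: dict[str, float], voter: str,
                      vindicated: bool, step: float = 0.1) -> None:
    """After an outside check of a claim a voter backed, nudge their weight."""
    factor = (1 + step) if vindicated else (1 - step)
    reputation[voter] = max(0.0, reputation.get(voter, 1.0) * factor)
```

The intended asymmetry: backing claims that survive scrutiny raises your future influence, while backing falsehoods lowers it, so posting a lie or refuting the truth becomes costlier over time.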

I guess prediction markets will help.

Prediction markets about the judgements of readers are another thing I keep thinking about. Systems where people can make themselves accountable to Courts of Opinion by betting on their prospective judgements. Courts occasionally grab a comment, investigate it more deeply than usual, and enact punishment or reward depending on their findings.

I've raised these sorts of concepts with lightcone as a way of improving the vote sorting (where we'd sort according to a prediction market's expectation of the eventual ratio between positive and negative reports from readers). They say they've thought about it.
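To illustrate that sorting rule: if a market gives, for each comment, the probability that a random reader report will be positive, comments can be ranked by the implied positive-to-negative ratio. The market interface here is a stand-in assumption; nothing below reflects an actual Lightcone implementation:

```python
# Toy sketch: sort comments by a prediction market's expected ratio of
# positive to negative reader reports. `market_probability` is a stand-in
# for a real market quote (hypothetical, not an actual forum API).

def expected_report_ratio(p_positive: float) -> float:
    """Convert P(report is positive) into an expected positive:negative ratio."""
    p_positive = min(max(p_positive, 1e-6), 1 - 1e-6)  # guard against division by zero
    return p_positive / (1 - p_positive)

def sort_comments(comment_ids: list[str],
                  market_probability: dict[str, float]) -> list[str]:
    """Rank comments by the market's expected report ratio, best first."""
    return sorted(comment_ids,
                  key=lambda c: expected_report_ratio(market_probability.get(c, 0.5)),
                  reverse=True)
```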

Hi there, 

 

How will anonymity of donations improve the governance of funding? Also, this move will affect conflicts of interest: competitions (or similar projects), for example, may depend on the public being aware of which organizations and individuals cannot join or participate. How can this issue be avoided with anonymous funding?

 

Best regards, 

Miguel

yes, I now think anonymity of the sort that I proposed is the wrong way of going about this. can you think of a better solution?

Hi Sara,

Well, the traditional approach is governance boards installed in publicly held corporations. Others utilize internal or operational audits so that there is constant review of processes and controls.

Large-scale fraud is built to be deceptive; it takes precaution and skill to avert. To be honest, what happened with SBF and FTX is not the first fraud of this scale. I'm thinking of writing a post on past frauds of the same magnitude so that the forum has a profile it can sift through, especially since I suspect EA organizations will err on the side of caution moving forward... You think that will be helpful?

All the best,

Miguel

I think a post on past frauds would be very welcome, although a list of reading recommendations would be equally helpful and would require less work for you. EA has a lot to learn from more diverse voices that are more experienced in management within large organizations.

Hi Sara,

Thank you. I have created this post as a basic introduction to fraud and how it occurs. Tomorrow I will add a post, as you suggested, outlining the best reading materials on fraud, internal audit, and governance, to improve EA's knowledge base in these areas.

All the best,

Miguel

I doubt EA can change, because it prides itself on being an ideological bubble and takes pride in having views that are heterodox. One of the first criticisms of EA that I was exposed to when I took the EA intro course did talk about the problem of diverting people into morally ambiguous work in the finance industry (I just looked at the curriculum of the latest intro course and didn't see these criticisms linked anymore). I'm sure there are lots of criticisms that have been produced over the years about the kind of extreme utilitarianism that EA ostensibly promotes, so I'm a little surprised that people are shocked that a concern highlighted in some of these criticisms has materialized. I guess that maybe proves that EA criticism is more of an intellectual exercise than something that is truly digested and taken into account.

The only thing that can burst an ideological bubble is a catastrophe that makes it clear the ideology in question does not match reality. If EA survives the ensuing media firestorm (and probably a Netflix show produced as a result of this conflagration), then I expect it will go back into an ideological bubble until the next time it pops. My bet on the next pop is when it becomes painfully clear that the ideological thinking behind imminent AI catastrophe doesn't match the facts on the ground, but hopefully that will have less collateral damage on people outside of EA circles.

If this forum too survives long-term, Sara's idea to sort commentary by controversiality, or a sort of red-teaming search engine, sounds like something I would love to test out, even beyond the confines here, and a very necessary innovation for a heterodox community. Especially one that would want to avoid self-censoring at certain moments, even if not all the time, and where interesting voices like SaraAzubuike, or even anonymous ones, can be heard, at least for now.

 

  1. Anonymity and pseudonymity may be more common around here than one would initially suppose. Even that Voldemort fellow or the other billionaire founders may occasionally express their views around here, as too may a mosquito net recipient such as myself.
  2. It is unclear to me that, when controversy hits an effective entity, it should be perturbed as much as I observe it to be. Perhaps we mosquito net recipients should make our voices heard louder, as money seems to have been flowing mostly endogamously, very effectively.