I think there is a bit of a tendency to assume that it is appropriate to ask for arbitrary amounts of transparency from EA orgs. I don't think this is a good norm: transparency has costs, often significant ones, and constantly asking for all kinds of information (often with a tone that suggests it ought to be presented) is, I think, often harmful.
Transparency has costs, but potentially so does opacity (in terms of both loss of trust and less careful consideration of decisions that don't need to be justified externally). Arguably both apply here: the decision obviously wasn't uncontroversial, and the decision-making process sounds quite limited for a significant commitment in a novel area. It's also possible that some of the information that wasn't shared, or subsequent metrics (I don't think anyone is asking for original research here), would actually cast the original decision in a more favourable light.
I al...
It's on a matter of significant community concern, as well as documented concern/interest among the general public. On average, there are a couple of items at this level each year? I don't think the downsides of "constantly asking for all kinds of information" factor into this particular case.
I wonder if we would benefit from something like a system that hides the karma (and treats it as zero for visibility purposes) of posts that have less than <some quantity> of engagement. That way posts would get a "grace period" before getting hidden.
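The suggested mechanism could be sketched roughly as follows (a minimal illustration, assuming a hypothetical `ENGAGEMENT_THRESHOLD` parameter and vote count as the engagement measure, neither of which is specified in the comment):

```python
# Sketch of the suggested "grace period" rule: treat karma as zero for
# visibility/sorting purposes until a post has enough engagement.
# ENGAGEMENT_THRESHOLD is a hypothetical parameter, not from the original.
ENGAGEMENT_THRESHOLD = 10

def visible_karma(karma: int, num_votes: int) -> int:
    """Return the karma value used for display and sorting."""
    if num_votes < ENGAGEMENT_THRESHOLD:
        # Grace period: low-engagement posts are neither hidden nor boosted yet.
        return 0
    return karma

print(visible_karma(-5, 3))   # 0: still in the grace period
print(visible_karma(-5, 25))  # -5: enough engagement, real karma shown
```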
Again, I don’t think my picture here is a stretch from the normal English sense of the word “wholesomely”.
The more I read of these essays the less I agree with this. On my subjective authority as a native English speaker, your usage seems pretty far from the normal sense to me. I think what you're gesturing at is a reasonable concept but I think it's quite confusing to call it "wholesome".
As some evidence, I kept finding myself having to reinterpret sentences to use your meaning rather than what I would consider the more normal meaning. For example, "What is wholesome depends on the whole system." This is IMO kind of nonsensical in normal English.
I don't think we would have been able to use the additional information we would have gained from delaying the industrial revolution, but if we could have, I think the answer might be "yes". It's easy to see in hindsight that it went well overall, but that doesn't mean the correct ex ante attitude shouldn't have been caution!
100% agree. I think it is almost always better to be honest, even if that makes you look weird. If you are worried about optics, "oh yeah, we say this to get people in but we don't really believe it" looks pretty bad.
I would qualify this statement by saying that it would be nice for OP to have more reasoning transparency, but it is not the most important thing and can be expensive to produce. So it would be quite reasonable for additional marginal transparency to not be the most valuable use of their staff time.
I think if there's anything they should bother to be publicly transparent about, in order to subject it to further scrutiny, it's their biggest cruxes for resource allocation between causes. Moral weights, theory of welfare, and the marginal cost-effectiveness of animal welfare seem pretty decisive for GHD vs animal welfare.
- Some EAs knew about his relationship with Caroline, which would undermine the public story about FTX<->Alameda relations, but didn't disclose this.
- Some EAs knew that Sam and FTX weren't behaving frugally, which would undermine his public image, but also didn't disclose.
FWIW, these examples feel hindsight-bias-y to me. They have the flavour of "we now know this information was significant, so of course at the time people should have known this and done something about it". If I put myself in the shoes of the "some EAs" in these examples, it's not clea...
I think the best thing here would have been for much of the information to be shared casually, without needing to justify itself as important. People gossip about relationships and their terrible bosses all the time. I suspect that if that had happened, people would have gathered more clues earlier, enough to make a difference on the margin.
To reuse your example, if you were the only person the perpetrator of the heist could con into lending them your car as a getaway vehicle, then that would make P(Heist happens | Your actions) quite a bit higher than P(Heist happens | You acting differently), but you would still be primarily a mark or (minor) victim of the crime.
Yes, this is a good point. I notice that I don't in fact feel very moved by arguments that P(FTX exists | EA exists) is higher, I think for this reason. So perhaps I shouldn't have brought that argument up, since I don't think it's the crux (although I do think it's true, it's just over-determining the conclusion).
Only ~10k/10B people are in EA, while they represent ~1/10 of history's worst frauds, giving a risk ratio of about 10^5:1, or 10^7:1 if you focus on an early cohort of EAs.
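For concreteness, the arithmetic behind the quoted ratio can be checked directly (a sketch using the comment's own rough figures, which are themselves contested below):

```python
# Sketch of the quoted base-rate arithmetic (illustrative numbers from the
# comment, not a claim about the true figures).
ea_population = 10_000                 # ~10k people in EA (assumed)
world_population = 10_000_000_000      # ~10B people (assumed)
ea_share_of_frauds = 1 / 10            # EA's assumed share of history's worst frauds

baseline_share = ea_population / world_population  # ~1e-6 of the world is in EA
risk_ratio = ea_share_of_frauds / baseline_share

print(round(risk_ratio))  # 100000, i.e. the quoted ~10^5:1
```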
This seems wildly off to me - I think the strength of the conclusion here should make you doubt the reasoning!
I think that the scale of the fraud seems like a random variable uncorrelated with our behaviour as a community. It seems to me like the relevant outcome is "producing someone able and willing to run a company-level fraud"; given that, whether or not it's a big one or a small on...
I do think it's an interesting question whether EA is prone to generate Sams at higher than the base rate. I think it's pretty hard to tell from a single case, though.
I am one of the people who thinks that we have reacted too much to the FTX situation. I think as a community we sometimes suffer from a surfeit of agency and we should consider the degree to which we are also victims of SBF's fraud. We got used. It's like someone asking to borrow your car and then using it as the getaway car for a heist. Sure, you causally contributed, but your main sin was poor character judgement. And many, many people, even very sophisticated people, get taken in by charismatic financial con men.
I also think there's too much uncritical ...
There are various reasons to believe that SBF's presence in EA increased the chance that FTX would happen and thrive:
This is a very interesting take, and very well expressed. You could well be right that the narrative that 'we got used' is the most correct simple summary for EAs/EA. And I definitely agree that it is an under-rated narrative. There could even be psychological reasons for that (EAs being more prone to guilt than to embarrassment?).
I note that even if P(FTX exists | EA exists) were quite a bit higher than P(FTX exists | ~EA exists), that could be compatible with your suggested narrative of EAs being primarily marks/victims. To reuse your example, if you were...
It seems like we could use the new reactions for some of this. At the moment they're all positive but there could be some negative ones. And we'd want to be able to put the reactions on top level posts (which seems good anyway).
I wonder if the focus on "narrow EA" is a reflection of short AI timelines and/or a belief that we need to make changes sooner rather than later.
It seems to me that "global EA" looks better the longer the future we have. Gains in people compound, and other countries may be much more influential in the future. If Nigeria is a global power in 50 years, then growing a community there now might be a good investment.
I don't think this has a clear answer though, since benefits from actually solving problems sooner can compound too.
I wholeheartedly agree with this post.
I think there has been a bit of over-reacting to recent events. I don't think the damage is that bad, and to some degree I think we've just been unlucky. Maybe we need to do some things differently (e.g. try to project less of an air of certainty, which many critics seem to perceive) but we should also beware the illusion of control.
You say, in effect, "not that centralised", but from your description EA seems highly centralised.
Your argument that it's not centralised seems to be that EA is not a single legal entity.
These are two examples, but I generally didn't feel like your reply really engaged with Will's description of the ways in which EA is decentralized, nor his attempt to look for finer distinctions in decentralization. It felt a bit like you just said "no, it is centralised!".
democracy has the effect of decentralising power.
I don't agree with this at all. IMO democracy often...
I think the average EA might underestimate the extent to which being visible in EA (e.g. speaking at EAG) is seen as a burden rather than an opportunity.
Related: https://www.lesswrong.com/posts/pDzdb4smpzT3Lwbym/my-model-of-ea-burnout?commentId=Xz2xzWEuLAiHsFWzf
Having read this I'm still unclear what the benefit of your restructuring of CEA is. It's not a decentralising move (if anything it seems like the opposite to me); it might be a legitimising move, but is lack of legitimacy an actual problem that we have?
The main other difference I can see is that it might make CEA more populist in the sense of following the will of the members of the movement more. Maybe I'm as much of an instinctive technocrat as you are a democrat, but it seems far from clear to me that that would be good. Nor that it solves a problem we actually have.
Yes, I think there's a lot of sliding between "decentralised" and "democratic" even though these have pretty much nothing to do with each other.
As a pretty clear example, the open source software community is extremely decentralised but has essentially zero democracy anywhere.
I think this is a place where the centralisation vs decentralisation axis is not the right thing to talk about. It sounds like you want more transparency and participation, which you might get by having more centrally controlled communication systems.
IME decentralised groups are not usually more transparent, if anything the opposite as they often have fragmented communication, lots of which is person-to-person.
I would love the community to be more supportive in ways that would help with that. Things I would like:
A few thoughts:
I'm not super convinced that the fundraising situation is tougher? It seems much easier to me than it was. Especially for small things we have a decent range of funders.
I also had this reaction, and I think it was mostly just the phrasing. "Break up OP" suggests that we have the power or right to do that, which we definitely don't. I think if the post said "OP could consider breaking itself up" it wouldn't sound like that.
I think the proposal to have significant re-grantors is a more approachable way of achieving something similar, in that it delegates control of some funds.
I think the intention wasn't "have lots of forums where EA topics are discussed", so much as "don't make it sound like the (in practice, one) forum is the only one that can be".
Thank you! This post says very well a lot of things I had been thinking and feeling in the last year but not able to articulate properly.
I think it's very right to say that EA is a "do-ocracy", and I want to focus in on that a bit. You talked about whether EA should become more or less centralized, but I think it's also interesting to ask "Should EA be a do-ocracy?"
My response is a resounding yes: this aspect of EA feels (to me) deeply linked to an underrated part of the EA spirit. Namely, that the EA community is a community of people who not only i...
Thanks for this comment, it’s very inspiring!
One thought I had is that do-ocracy (as opposed to “someone will have got this covered, right?”) describes other areas, as well as EA. On the recent 80k podcast, Lennart Heim describes a similar dynamic within AI governance:
“at some point, I would discover that compute seems really important as an input to these AI systems — so maybe just understanding this seems useful for understanding the development of AI. And I really saw nobody working on this. So I was like, “I guess I must be wrong if nobody’s worki...
What cultural and structural features do you think might contribute to the perceived decline in a just-do-it attitude?
While I think there is considerable merit to what you're saying, I think it's also important to acknowledge the existence of challenges for would-be doers in 2023 that weren't necessarily (as) present in 2008 or 2013. Some of these challenges are related to the presence and/or actions of big organizations and funders (e.g., the de-emphasis on earning to give affecting the universe of potential viable funders for upstarts). Others are relate...
I think the upside is that if it is "generational" people grow up and become more agentic as long as we foster the culture. I was remarking to a friend that it's interesting how people don't want to get up and learn to code to help with AI Safety (given the rates of AI doomerism) but people were willing to go into quant trading at seemingly higher rates to earn to give in early EA.
I definitely think there's a "generational" thing here. For those of us who've been around long enough to see how everything came from nothing but people doing things they thought needed to be done, it's perfectly obvious. But I can very much see how if you join the community today it looks like there are these serious, important organizations who are In Charge. But I do think it's still not really true.
+1.
I was slow to realise that, over the period of just a few years of growth, this bunch of uncertain, scrappy, loosely coordinated students had come to be...
That's much stronger than what I read it as. I think Sjir was saying something more like "if you turn up to a local EA event you should feel welcomed and like you are 'one of the gang' even if you only donate".
The purpose of EAG these days seems a bit murky to me, but it seems to me to be mostly for people who are highly engaged, and I think it's fair to say that if you just donate you are probably not highly engaged (although you might be).
Great post, I agree with a lot of it. There is definitely a kind of large org sclerosis that can develop, but IME it's generally associated with much larger orgs unless you have very dysfunctional management (which is a risk!).
One missing factor, I think, is fungibility. It's hard for donations to be funged between organisations, but it's very easy for them to be funged between projects within an organisation. So we might expect donors to have some preference for separate orgs for separate projects.
I think few people disagree with these directly, but I think many people believe or act in ways that are in tension with them. Going through the claims:
I think another contributing factor is that as a community we a) prize (and enjoy!) unconventional thinking, and b) have norms that well-presented ideas should be discussed on their merits, even if they seem strange. That means that even the non-insane members of the community often say things that sound insane from a common-sense perspective, and that many insane ideas are presented at length and discussed seriously by the community.
I think experiencing enough of this can disable your "insane idea alarm". Normally, if someone says something that sou...
A "together but divided" future is also possible. There are many divisions in e.g. feminism, but everyone still wants to lay claim to the big banner, and will generally regard the others as allies to some degree (even if they write scathing critiques of each other).
I think this is a useful exercise for a few reasons.
I would be tempted to divide the second wave in two. I think there was a distinct p...
If we interpret an up-vote as "I want to see more of this kind of thing", is it so surprising that people want to see more such supportive statements from high-status people?
I would feel more worried if we had examples of e.g. the same argument being made by different people and the higher-status person getting rewarded more. Even then - perhaps we do really want to see more of high-status people reasoning well in public.
Generally, insofar as karma is a lever for rewarding behaviour, we probably care more about the behaviour of high-status people and so we...
I think a good reading item on the empathy front is this article from a disability rights lawyer about her encounter with Singer. It is a very clear and honest piece, I think about it often.
https://www.nytimes.com/2003/02/16/magazine/unspeakable-conversations.html
Comic sans is a great font that is very readable even at small font sizes. Surely a community which cares about effectiveness over image will embrace this change.
I have a (somewhat unfair) vibe of this as: the famous people deposit their work into the forum and leave for higher pursuits.
I do think there's a big difference in how much various high-status people engage on the forum. And I think that the people who do engage feel like they're more "part of" the community... or at least that small part of it that actually uses the forum! It also gives them more opportunity to say stupid things and get downvoted, very humanising!
(An example that comes to mind is Oliver Habryka, who comments a lot on posts, not just his...
This is a great point. I also think there's a further effect, which is that older EAs were around when the current "heroes" were much less impressive university students or similar. Which I think leads to a much less idealising frame towards them.
But I can definitely see that if you yourself are young and you enter a movement with all these older, established, impressive people... hero-worshipping is much more tempting.
I think that's a good example of "why would people do this given what they knew?", but I'm not sure it's an example of pedestalising etc. I'm being a bit fussy here because I do think I've seen the specific claim that there was lots of public promotion of Sam, and I'm just not sure if it's true.
to put this guy on a pedestal; to elevate him as a moral paragon and someone to emulate; to tie EA's reputation so closely to his
I always find this claim a bit confusing: did we actually do those things? Are there some specific examples of doing this?
I can think of... the 80k interview and that's about it? I guess engaging with the FTX Foundation was somewhat positive but I don't think it was putting him on a pedestal. In fact when I look back I feel like a lot of the content linking Sam to EA came from people talking to Sam. I may well just not be reme...
I agree with what others have said re: pedestal, so am not going to produce more quotes or anecdotes. I stand by the claim, though.
I think people may have been inclined to put SBF on a pedestal because earning to give was the main thing people criticized about early EA. People were otherwise pretty supportive of early EA ideas; I mean, it's hard not to support finding more cost-effective global health charities. When SBF emerged, I think this was a bit of a "see, we told you so" moment for EAs who had been around for a long time, especially because S...
Some specific examples of EA leaders putting SBF on a pedestal that I found with a bit of brief digging:
ETA July: I regret posting the following comment for several reasons, partly because I got crucial information wrong and failed to put things into context and prevent misunderstandings. Please consider reading my longer explanation at the top of my follow-up comment here. I'm sorry to anyone I upset.
------------------------------------------------------------------
At EAG London 2022, they [ETA: this was an individual without consent of the organizers] distributed hundreds of stickers depicting Sam on a bean bag with the text "what would SBF do?". To my kno...
Yes, I think that him being interviewed by 80K, for example, didn't make much of a difference. I think that EA's reputation would inevitably be tied to his to an extent, given how much money they donated and the context in which that occurred. People often overrate how much you can influence perceptions by framing things differently.
There is the whole vouching for SBF as prospective purchaser of Twitter:
> You vouch for him?

> Very much so! Very dedicated to making the long-term future of humanity go well.
A different route to the same idea: men's groups. Part of the problem with masculinity is that men don't actually talk about it. And it's often easiest to learn from people you feel similar to and respect.
EDIT: I see you suggested the same thing further down so I just agree with you :)
I think this is potentially very good and helpful advice, if quite controversial. Alcohol is endemic in our society, but it seriously compromises your judgement! I say this as someone who has historically benefited a lot from alcohol as an aid to getting over social anxiety - it can be useful, but it's a very double-edged sword.
That suggests another idea for community leaders: organise more dry events. Most meetups would probably be totally fine without alcohol, even social mixers (controversial!).
Yeah, I think this would be much better as a poll on a specific course of action. "Something needs to change... nothing in particular, but something" is an easy feeling to fall into.
I agree that this is super confusing. However, I do think that some claims about EA being too centralised have been about there being too few big orgs. I think that's just not true, and I think the confusion between these two points has probably made it difficult to discuss in the past.
All of which is to say: we should probably prefer to use more specific words than just "centralisation" unqualified.
Okay, but different groups and orgs can already have different norms today, right? Nobody is enforcing conformity. The worst that can happen is that CEA can ban you from EAG, so I guess yes it would be nice to have someone else running conferences so you could go to those?
I'm not playing dumb here, I genuinely find it confusing in what ways people feel they are being coerced by a central power in EA.
I don't agree with any of your criteria for "death". All of those sound totally survivable. "EA exits its recent high-growth phase" is very different from dying.
I would modify them to:
i.e. we transition to a negative growth regime and stay there.
And I think we could survive a lot of organizational collapse so I wouldn't even include that.
Just to add a note of optimism: a) people always take recent news too seriously; and b) many people don't read the forum. It's easy to think that everything is gloom if you spend too much time reading the drama on the forum, but most of reality hasn't changed. We still have thousands of people deeply engaged in doing good and their projects are still going as well as they were before. There are problems, sure, but announcing death is extremely premature IMO.
What is making things non-federal today? There already are, e.g. groups for Christians in EA which have some quite different ideas to the rest of the movement but coexist pretty peacefully. Is there something more that you would want there?
In (US-style) federalism, the subunits (US states) have quite a bit of power and autonomy. I don't have to worry myself about what Alabama decides about abortion or education. Being a South Carolinian or an Oregonian is a significant part of one's political identity, in a sense.
So if, for instance, the "edgy" / "normie" divide became a critical fault line, under federalism you might see substantial meta groups focusing on one side of the line or the other, with, for example, their own high-end conferences and internal networking. It's not a rupture because ...
(I work at Open Phil, speaking for myself)
FWIW, I think this could also make a lot of sense. I don't think Holden would be an individual contributor writing code forever, but skilling up in ML and completing concrete research projects seems like a good foundation for ultimately building a team doing something in AI safety.
On the object level, the original question was:
> Are financial statements going to be released? In particular, how much was spent on estate agent fees, maintenance and bills? And the value of the events hosted. Is the reason for the change that EA has less money or that there was an error in the initial reasoning for buying it?
Even given the context I think this is asking too much. I would support a question like "I would love to know what the reasoning was: in particular, was the project financially unsustainable or were there other reasons?".
Aski...
I read Dean's ask a bit more narrowly or at least ambiguously -- although there is a reference to financial statements, the more specific asks are to how much was spent, how much value was achieved, and whether the project was not desirable in retrospect vs. merely the victim of changed circumstances. I don't read him as proposing an audit, though I could be wrong.
The last one may be valuable for others who can be considering future capital expense vs. rental options. The other two are valid things for donors to consider when deciding whether to give to EV...