How do you define "wokeness"? The term is often used very broadly as a placeholder for vaguely culturally left things the writer dislikes, broad enough that anyone in the audience can feel like it's referring to specifically the things they dislike. And there's often a degree of strategic ambiguity/motte and bailey in how it's used.
I also didn’t seek adequate backup given that I was friends with Owen. (Owen and I live in different countries and were not close friends, but we and our families have spent social time together.) When the woman in the TIME piece told me that her concern was about Owen, I flagged to her that I was friends with him. She and I decided to proceed anyway because we couldn’t think of a better option, although she felt it was unhealthy for EA that people who had power were entwined in these ways.
This seems to be a recurring issue in a lot of the rece...
I think EA has a particular problem where the emphasis on getting people who are "value aligned" means they don't bring in experienced people from outside. Software startups have at least learned to bring in an experienced COO to help run day-to-day things.
In the early days, it was hard to get people with experience who were sufficiently "value aligned". I don't think that's the case anymore. For example, there are many value-aligned, highly experienced people on the EA Good Governance Project's Trustee Directory. The issue is that many people don't want to give up partial control of something to somebody outside their clique, and/or prior to October didn't have an easy way to find them.
I feel like there's also an ambiguity in the term "community" being used to both mean:
A lot of the posts about EA community issues seem to be implicitly about the stereotypical "people who go to Bay Area house parties" community. Which is not representative of the wider community of people who might attend EA conferences, work or volunteer in EA orgs, or donate.
I have a job outside EA where reputation is a concern, so as is normal for people in such industries I post mostly anonymously online, and start new accounts periodically to prevent potential information leakage. If the only way to engage with EA discussion online was under my real name I wouldn't do so.
That's probably on the extreme end, but I think lots of people exist somewhere on this spectrum, and it would probably be bad for the movement if discussions were limited to only people willing to post under their real names, or persistent identities, as that would exacerbate problems of insularity and groupthink.
This is not just a question of the attitude of EA employers but of wider society. I have been involved in EA for a long time but now work in a professional role where reputation is a concern, so do all my online activity pseudonymously.
I would dislike it if it became the norm that people could only be taken seriously if they posted under their real names, and discussion was reserved for "professional EAs". And that would probably be bad for the variety of perspectives and expertise in EA discussions.
I'm not particularly young anymore, and work in a non-EA field where reputation is a concern, which is a large part of why I post pseudonymously. I think it would be bad if it became the norm that people could only be taken seriously if they posted under their real names, and discussion was reserved for "professional EAs".
Very interesting seeing your process for all this, thanks for laying it out.
On the tractability and neglectedness questions, how do you account for other interventions that impact child marriage rates? I'd assume that programs aimed at general economic growth and education also raise average marriage ages.
Thanks for writing this. Do you have any advice on getting a financial advisor? I've been wanting to hire one as a one-off to check I'm doing everything right, but I'm not sure how to find a good person.
Why is big tent EA an end in itself? The EA movement exists for the purpose of doing good, not for having a movement. If multiple smaller movements are more effective at doing good then we should do that.
Multiple groups make it easier to specialise and avoid having single points of failure. Though you lose some economies of scale and coordination benefits.
I don't want to throw cold water on your enthusiasm. But I think you are underestimating the difficulty of getting anything potentially politically controversial published in China in the current climate and the potential downside risks of coming to the attention of the Chinese government in such an area.
Given the recent crackdowns on NGOs and civil society in China, this would entail a very genuine risk of the related organisations being banned from operating in China, and would make the government more likely to suppress EA ideas in general. That's a very high risk to take for the low odds of a single book, which is very unlikely to be published in the first place, meaningfully changing public opinion.
I've had similar feelings to Alice. Part of it is that group membership serves a role of signalling information about yourself to others. It's very different to describe yourself to others as an EA when the primary association with it is "slightly weird but well meaning group of charitable people" vs when it's "those weird crypto/eugenics people". And in the latter case you are better off moving to labelling yourself as something else.
EA seems to have a bit of a "not invented here" problem, of not taking on board tried and tested mechanisms from other areas. E.g. with the boring standard conflict of interest and transparency mechanisms that are used by charitable organisations in developed countries.
Part of this seems to come from only accepting ideas framed in certain ways, and fitting cultural norms of existing members. (To frame it flippantly, if you proposed a decentralised blockchain based system for judging the competence of EA leaders you'd get lots of interest, but not if y...
My own observation has been that people are open to intellectual discussion (your discounting formula is off for x reasons) but not to more concrete practical criticism, or criticism that talks about specific individuals.
I share this feeling. I feel like EA has trended in the direction of some other groups I've dealt with where the personalities and interpersonal issues of a small number of people at the top come to be overly dominant.
I've also had my faith in the movement fractured a bit by seeing how much of how things were run seems to be based on friends of friends networks. I had naively assumed they were doing the kind of due diligence and institutional division of power that other charitable organisations do.
A lot of this isn't a particular specific set of issues; it's more a general sense of one's estimates of people being shifted downward.
There's a general lack of competence in (and at times active disdain for) skills in PR and communications in EA. Which, for a movement that wants to convince people of things and attract membership, seems problematic.
Yeah, a lot of the issues in EA are things I recognise from other fields that disproportionately hire academic high achievers straight out of college, who don't have much real-world experience and who overestimate the value of native intelligence over experience. But conveying the importance of that difference is difficult as, ironically, it's something you mostly learn from experience.
I agree that people shouldn't think that way, but observably they do. And acknowledging human irrationality and working around it was the founding insight of rationalism and EA. I honestly can't really respond to most of your first two paragraphs, since it seems to be based on the idea that we shouldn't even be considering the question.
I'm not saying truth doesn't matter (if it came across that way I apologise) but that reputational effects are real and also matter. Which is very different from the strawman position of "we shouldn't do anything at all odd or unp...
Cool, thanks for clarifying your view! To clarify, here's a version of your comment that I wouldn't have objected to at all (filling in details where I left things in square brackets):
'You're right that the building is a manor house rather than a castle, and I'm generally in favor of EAs being accurate and precise in our language. That said, I think [demographic D] will mostly not be convinced of EA's virtue by this argument, because people still think of manor houses as quite fancy. And I think it's important for EA to convince [demographic D] of our virt...
It's entirely reasonable to say, as a normative claim, that people should be accurate in reporting.
But when you are thinking about reputational impact of a choice you should be examining not just what the reaction would be to strictly accurate reporting, but how people operating in bad faith could easily represent it, or how good faith people could misinterpret it. Whether they should or not is irrelevant to the predictable consequences.
If you're an organisation that solicits donations, part of your basic obligations in your relationship with your donors is to be clear about what you have spent money on in the past and intend to spend it on in the future, so that people can look at that and make a reasonable judgement about what their donation is likely to be used for.
Yeah this is my biggest concern. The whole value proposition of EA was to get away from the normal failure modes of charities. If they are falling into the same traps of using shoddy reasoning to justify self serving behaviour that's a major structural problem, not just a matter of a single decision.
You can get to Luton, Milton Keynes, Stevenage or a number of other small London satellite towns in less than 2 hours from Oxford, and less than 1 from central London. These are all pretty banal collections of concrete buildings, but would allow you to buy a venue for a fraction of the cost. It seems hard to escape the conclusion that this decision was mainly made based on a manor house in Oxford being more aesthetically appealing than a concrete office building on an industrial estate or small town centre.
The lack of a 2 hour commute is nothing to sneeze at though. CFAR has (had? I haven't checked in on it lately) a venue a couple of hours away from Berkeley that they've used for organizing workshops and events, and the tribulations of getting everyone to and from the venue pretty much ensured it was only used for running 4-5 day events. It made it significantly more difficult for folks at CFAR or MIRI to pop up to make "guest appearances" at workshops and the like, significantly reducing the value to participants.
Speaking from personal experien...
At the point you are having to debate the definition of a castle you've lost the optics argument even if you're technically correct.
I want to downvote this comment more strongly than any other comment I've downvoted on the EA Forum.
On the EA Forum, we should care about what's actually true. "Haha, you lose for having to clarify your point!" may be the rule of the game in soundbite politics, but it can't become the rule of the game in EA's internal decision-making or in conversations on the EA Forum, or we have little hope of actually doing the most good (as opposed to "doing the thing that's safest and plays the best, in adversarial soundbite form, to a random consumer of mass media wi...
There's no debate over the definition of a castle: Wytham Abbey is not a castle (it is not a form of military fortification). Roughly, Wytham Abbey is a castle in the way that an underground eco-house is a nuclear bunker. Which is to say: not at all (it's not some mere technicality that makes it not a castle; it is radically not a castle). There is no debate about definition to be had here.
So I'm not having a debate about definition; I'm noting a misrepresentation. I agree that the optics issue is already lost. I also think that we should not be misr...
While those are reasonable comparisons it rather raises the question of why you are buying a venue in one of the most expensive areas of the UK to begin with.
Were there any business cases made comparing this to other cheaper venues in different locations? That's the kind of basic due diligence I'd expect at most organizations I work with in my professional life, and I'm concerned if it's not the case here.
Well Wytham Abbey is only a 10 minute Uber ride from Oxford rail station, which is only 1 hour from London Paddington Station; and it's reasonably convenient by rail from Heathrow and Gatwick airports.
The key features of a good conference venue are (1) easy access to international flights and domestic rail connections, and (2) an enticing location that can attract busy, choosy intellectuals who have many other options of places to go and people to see.
Those features generally involve being located in places that many other people value -- i.e. that, given supply & demand, tend to involve high real estate prices.
Frankly I would think that there was finally someone with a modicum of sense and understanding of basic PR working in the area. And upgrade my views of the competency of the organisation accordingly.
Also I'd note that "this will save money in the long run" is a fairly big claim that has not been justified. There are literally hundreds of conference venues within a reasonable distance of Oxford, all of which are run by professional event managers who are able to take advantage of specialisation and economies of scale. That makes the claim difficult to believe.
Agreed. The whole founding insight of the EA movement was the importance of rigorously measuring value for money. The same logic is used to justify every warm and fuzzy but low value charity. And it's entirely reasonable to be very worried when major figures in the EA movement revert to that kind of reasoning when it's in their self interest.
I feel like a lot of this is downstream from people being reluctant to hire experienced people who aren't already associated with EA. Particularly for things like operations roles experience doing similar roles is going to make far more of a difference to effectiveness than deep belief in EA values.
When Coke need to hire new people they don't look for people who have a deep love of sugary drinks brands, they find people in similar roles for other things and offer them money. I feel like the reason EA orgs are reluctant to do this is that there's a degree of exceptionalism in EA.
I agree that it's downstream of this, but strongly agree with ideopunk that mission alignment is a reasonable requirement to have.* A (perhaps the) major cause of organizations becoming dysfunctional as they grow is that people within the organization act in ways that are good for them, but bad for the organization overall—for example, fudging numbers to make themselves look more successful, asking for more headcount when they don't really need it, doing things that are short-term good but long-term bad (with the assumption that they'll have moved on before t...
Agreed. It's entirely possible to take someone's money and spend it on good causes without promoting them and associating one's reputation with them.
Forgive me if there's a structural reason why this wouldn't work, but why weren't you saving a larger share of the money coming in, to provide a buffer in case funding dropped off for whatever reason? Seems like part of the underlying issue here was assuming that funding levels would remain constant in the future.
A meta-level structural problem may be that so much decision making seems to be concentrated in a relatively small group of people without much oversight. Even with the best people in the world, that's going to lead to groupthink and blind spots. Other charities and non-profits have extensive oversight systems that may be worth imitating.
How concerned were you about crypto generally being unethical?
Also, to add to that: whether or not crypto is in itself ethical, it's known to be a very unstable sector and one with a particularly negative reputation. Was there any discussion of how to compensate for that potential volatility, and of the potential reputational risks of being associated with it?
Please feel free to "be that guy" as hard as possible when we are talking about massive financial fraud.
Feels like it was a mistake to tell people to change their strategy if it can be reversed by a single donor having issues. All the emphasis on "we're not funding constrained" may have done long term harm by reducing future donations from a wider pool of people.
Sorry I should have been clearer, I was meaning more in psychological terms than economic ones. An extra dollar might still do the same amount of good, but the way people intuitively assess impact it will feel very different depending on the funding context people feel it is in.
So what specific kinds of talented people does EA need more of? Well, the most obvious place to look is the most recent Leader Forum, which gives the following talent gaps (in order):
Is there a place you should go if you meet one of those particular talent gaps?
The focus on student groups is also inherently redflaggy for some people, as it can be viewed as looking for people who have less scepticism and experience.
There being a lot of funding available in EA also changes the calculus for people deciding if they want to donate their own money. If there are super rich people donating to EA, to the extent that finding ways to spend money is a problem, then the motivation for normal individuals to donate is lower.
I think it changes it some, but not hugely? Even if the best remaining option for making the world better was direct cash transfers I think donations would still make a lot of sense; There's Lots More To Do.
I also don't think we stay in the current dynamic. Part of why it is important for a lot of people to go into directly doing useful things now is to identify and scale up opportunities for directing a lot of money toward valuable things. It's become harder to identify extremely cost-effective ways of spending your money to speed that process up, but money will still be very important.
It's not specific communications so much as the level of activity around specific causes. How many posts and how much discussion time is spent on AI and other cool intellectual things, vs. more mundane but important things like malaria? There's a danger of being seen as just a way for people to morally justify doing the kinds of things they already want to do.
The claim that EA is talent constrained, not money constrained, always confuses me, since there is no shortage of young talented people interested in EA, but there seem to be very few jobs for them to do in EA. If there is this huge pile of money sitting around, why not give it to people in exchange for their labor? As every other industry or cause does.
Value drift seems like a risk. You might start off with a set of altruistic beliefs, but if you spend all your professional and social time around people who don't share those beliefs, then you are likely to adopt theirs, due to the various well-studied psychological effects of conformity and of the information you will be exposed to.
Same. I'm fairly confident in my writing skills but lack any talent in the other areas, and would find doing so embarrassing.