All of projectionconfusion's Comments + Replies

I have a job outside EA where reputation is a concern, so, as is normal for people in such industries, I post mostly anonymously online and start new accounts periodically to prevent potential information leakage. If the only way to engage with EA discussion online was under my real name, I wouldn't do so.

That's probably on the extreme end, but I think lots of people exist somewhere on this spectrum. If that is disallowed or discouraged, you get a situation where only "professional EAs" who have merged their real-life reputation and their EA reputation would comment in discussions or be listened to. That seems like a recipe for increased groupthink and insularity.

We all agree that expanding the moral circle is an end in itself, so this seems obviously correct.

How do you define "wokeness"? The term is often used very broadly as a placeholder for vaguely culturally left things the writer dislikes, broad enough that anyone in the audience can feel like it's referring specifically to the things they dislike. And there's often a degree of strategic ambiguity/motte-and-bailey in how it's used.

 I also didn’t seek adequate backup given that I was friends with Owen. (Owen and I live in different countries and were not close friends, but we and our families have spent social time together.) When the woman in the TIME piece told me that her concern was about Owen, I flagged to her that I was friends with him. She and I decided to proceed anyway because we couldn’t think of a better option, although she felt it was unhealthy for EA that people who had power were entwined in these ways. 

This seems to be a recurring issue in a lot of the rece...

I think EA has a particular problem where the emphasis on getting people who are "value aligned" means they don't bring in experienced people from outside. Software startups have at least learned to bring in an experienced COO to help run day-to-day things.

Ozzie Gooen
1y
I think good startups often do this, but lots of startups have trouble at this stage. Many do have their own cultures that are difficult to retain as they grow. I think EA is more intense as there's more required material to understand, but it's a similar idea.

In the early days, it was hard to get people who were sufficiently "value aligned" with experience. I don't think that's the case anymore. For example, there are many value-aligned, highly experienced people on the EA Good Governance Project's Trustee Directory. The issue is that many people don't want to give up partial control of something to somebody outside their clique, and/or prior to October didn't have an easy way to find them.

I feel like there's also an ambiguity in the term "community", which is used to mean both:

  • A relatively small and tightly knit social group of people in specific areas who know each other in real life; 
  • And a larger global community of people who are involved in EA to varying levels, but for whom it doesn't make up the majority of their social life. 

A lot of the posts about EA community issues seem to be implicitly about the stereotypical "people who go to Bay Area house parties" community, which is not representative of the wider community of people who might attend EA conferences, work/volunteer in EA orgs, or donate.

Jeff Kaufman
1y
Good point! I'm trying to talk about the second category here, though that does include the first to some extent.

I have a job outside EA where reputation is a concern, so, as is normal for people in such industries, I post mostly anonymously online and start new accounts periodically to prevent potential information leakage. If the only way to engage with EA discussion online was under my real name, I wouldn't do so.

That's probably on the extreme end, but I think lots of people exist somewhere on this spectrum, and it would probably be bad for the movement if discussions were limited to only people willing to post under their real names or persistent identities, as that would exacerbate problems of insularity and groupthink.

This is not just a question of the attitude of EA employers but of wider society. I have been involved in EA for a long time but now work in a professional role where reputation is a concern, so I do all my online activity pseudonymously.

I would dislike it if it became the norm that people could only be taken seriously if they posted under their real names, and discussion was reserved for "professional EAs". And that would probably be bad for the variety of perspectives and expertise in EA discussions. 

I'm not particularly young anymore, and work in a non-EA field where reputation is a concern, which is a large part of why I post pseudonymously. I think it would be bad if it became the norm that people could only be taken seriously if they posted under their real names, and discussion was reserved for "professional EAs". 

Very interesting seeing your process for all this, thanks for laying it out.

On the tractability and neglectedness questions, how do you account for other interventions that impact child marriage rates? I'd assume that programs aimed at general economic growth and education also raise average marriage ages.

Catherine F
1y
Thanks for engaging with the post :) I think Lizka's comment visualises this phenomenon well. I think the answer would probably be to target the negative outcomes themselves rather than child marriage.

Thanks for writing this. Do you have any advice on getting a financial advisor? I've been wanting to hire one as a one-off to check I'm doing everything right, but I'm not sure how to find a good person.

Rasool
1y
No first-hand experience I'm afraid, but the /r/UKPersonalFinance wiki has a good breakdown.

Why is big tent EA an end in itself? The EA movement exists for the purpose of doing good, not for having a movement. If multiple smaller movements are more effective at doing good then we should do that. 

 

Multiple groups make it easier to specialise and avoid having single points of failure, though you lose some economies of scale and coordination benefits.

Stefan De Young
1y
IMO big tent is valuable due to gains from trade. I work on cause area X, but Charlie is better suited to work on X than I am, and I am better suited to work on their cause area Y. Our labour swap is more effective for our cause areas than each of us focusing on our own area.
Karthik Tadepalli
1y
It's not necessarily an end in and of itself, but a scenario like this can lead to fairly arbitrary factors deciding what EA "is" in a self-reinforcing cycle. Let's say a popular news outlet wrote a critical article about EA, and then many EAs decided to stop identifying with it because of the now-negative connotations. It seems wrong to let external forces dictate what EA is in that way.

I don't want to throw cold water on your enthusiasm, but I think you are underestimating the difficulty of getting anything potentially politically controversial published in China in the current climate, and the potential downside risks of coming to the attention of the Chinese government in such an area.

Given the recent crackdowns on NGOs and civil society in China, this would entail a very genuine risk of the related organisations being banned from operating in China, and would make the government more likely to suppress EA ideas in general. That is a very high risk to take for the low odds that a single book, which is very unlikely to be published, would meaningfully change public opinion.

Matt Brooks
1y
From the way Tyler was talking about the book and its topics, it did not seem to me like a politically controversial book: "it was a book designed to explain America to the Chinese, and make it more explicable, more understandable". Or at least the controversial parts could be taken out if required and a lot of the value could remain.

I've had similar feelings to Alice. Part of it is that group membership serves a role of signalling information about yourself to others. It's very different to describe yourself to others as an EA when the primary association with it is "slightly weird but well-meaning group of charitable people" vs. when it's "those weird crypto/eugenics people". And in the latter case you are better off moving to labelling yourself as something else.

HemanthB
1y
I think a meta-question that emerges here is: do I have Alice-like feelings because I feel that EA will weaken (or "detract from my issues") as a rallying call, or because I'm concerned with how I'm perceived (both by others and myself)? That's why I really appreciated how you framed Alice's POV as personal and emotional, KT!
Karthik Tadepalli
1y
That seems bad in equilibrium. For example, if the public view of EA after the WWOTF publicity tour is "those people who think about the long term future", then a global health/animal welfare person in EA would be "better off" by not calling themselves EA and labelling themselves as something else. But that would make it much harder to have a big tent within EA.
lc
1y
If you stop calling yourself an EA in public because you think doing so will give people the wrong impression, that's one thing, I guess.

EA seems to have a bit of a "not invented here" problem of not taking on board tried-and-tested mechanisms from other areas, e.g. the boring, standard conflict-of-interest and transparency mechanisms that are used by charitable organisations in developed countries.

Part of this seems to come from only accepting ideas framed in certain ways, and fitting cultural norms of existing members. (To frame it flippantly, if you proposed a decentralised blockchain-based system for judging the competence of EA leaders you'd get lots of interest, but not if y...

Nathan Young
1y
I agree, but I think this is a hard problem for everyone, right? I don't know that any community can just fix it.

My own observation has been that people are open to intellectual criticism ("your discounting formula is off for reasons X") but not to more concrete practical criticism, or criticism that talks about specific individuals.

Linch
1y
That was also Scott Alexander's point if I understood it correctly.

I share this feeling. I feel like EA has trended in the direction of some other groups I've dealt with where the personalities and interpersonal issues of a small number of people at the top come to be overly dominant. 

I've also had my faith in the movement fractured a bit by seeing how much of how things were run seems to be based on friends of friends networks. I had naively assumed they were doing the kind of due diligence and institutional division of power that other charitable organisations do.

A lot of this isn't a particular specific set of issues; it's more a general sense of one's estimates of people being shifted downward.

Brian Ayofemi
1y
Thanks for your thoughts here.

There's a general lack of competence in (and at times active disdain for) PR and communications skills in EA, which seems problematic for a movement that wants to convince people of things and attract members.

Yeah, a lot of the issues in EA are things I recognise from other fields that disproportionately hire academic high achievers straight out of college, who don't have much real-world experience and who overestimate the value of native intelligence over experience. But conveying the importance of that difference is difficult as, ironically, it's something you mostly learn from experience.

I agree that people shouldn't think that way, but observably they do. And acknowledging human irrationality and working around it was the founding insight of rationalism and EA. I honestly can't really respond to most of your first two paragraphs, since it seems to be based on the idea that we shouldn't even be considering the question.

I'm not saying truth doesn't matter (if it came across that way I apologise) but that reputational effects are real and also matter. Which is very different from the strawman position of "we shouldn't do anything at all odd or unp...

Cool, thanks for clarifying your view! To clarify, here's a version of your comment that I wouldn't have objected to at all (filling in details where I left things in square brackets):

'You're right that the building is a manor house rather than a castle, and I'm generally in favor of EAs being accurate and precise in our language. That said, I think [demographic D] will mostly not be convinced of EA's virtue by this argument, because people still think of manor houses as quite fancy. And I think it's important for EA to convince [demographic D] of our virt...

It's entirely reasonable to say, as a normative claim, that people should be accurate in reporting.

But when you are thinking about the reputational impact of a choice, you should be examining not just what the reaction would be to strictly accurate reporting, but how people operating in bad faith could easily misrepresent it, or how people operating in good faith could misinterpret it. Whether they should or not is irrelevant to the predictable consequences.

[anonymous]
1y

If you're an organisation that solicits donations, part of your basic obligation in your relationship with your donors is to be clear about what you have spent money on in the past and intend to spend it on in the future, so that people can look at that and make a reasonable judgement about what their donation is likely to be used for.

Yeah, this is my biggest concern. The whole value proposition of EA was to get away from the normal failure modes of charities. If they are falling into the same traps of using shoddy reasoning to justify self-serving behaviour, that's a major structural problem, not just a matter of a single decision.

You can get to Luton, Milton Keynes, Stevenage or a number of other small London satellite towns in less than 2 hours from Oxford, and less than 1 hour from central London. These are all pretty banal collections of concrete buildings, but would allow you to buy a venue for a fraction of the cost. It seems hard to escape the conclusion that this decision was mainly made based on a manor house in Oxford being more aesthetically appealing than a concrete office building on an industrial estate or small town centre.

The lack of a 2-hour commute is nothing to sneeze at, though. CFAR has (had? I haven't checked in on it lately) a venue a couple of hours away from Berkeley that they've used for organizing workshops and events, and the tribulations of getting everyone to and from the venue pretty much ensured it was only used for running 4-5 day events. It made it significantly more difficult for folks at CFAR or MIRI to pop up to make "guest appearances" at workshops and the like, significantly reducing value to participants.

Speaking from personal experien...

At the point where you are having to debate the definition of a castle, you've lost the optics argument even if you're technically correct.

I want to downvote this comment more strongly than any other comment I've downvoted on the EA Forum.

On the EA Forum, we should care about what's actually true. "Haha, you lose for having to clarify your point!" may be the rule of the game in soundbite politics, but it can't become the rule of the game in EA's internal decision-making or in conversations on the EA Forum, or we have little hope of actually doing the most good (as opposed to "doing the thing that's safest and plays the best, in adversarial soundbite form, to a random consumer of mass media wi...

[anonymous]
1y

There's no debate over the definition of a castle: Wytham Abbey is not a castle (it is not a form of military fortification). Roughly, Wytham Abbey is a castle in the way that an underground eco-house is a nuclear bunker. Which is to say: not at all (it's not some mere technicality that makes it not a castle; it is radically not a castle). There is no debate about definition to be had here.

So I'm not having a debate about definition; I'm noting a misrepresentation. I agree that the optics issue is already lost. I also think that we should not be misr...

While those are reasonable comparisons, it rather raises the question of why you are buying a venue in one of the most expensive areas of the UK to begin with.

Were there any business cases made comparing this to other cheaper venues in different locations? That's the kind of basic due diligence I'd expect at most organizations I work with in my professional life, and I'm concerned if it's not the case here.

Well, Wytham Abbey is only a 10-minute Uber ride from Oxford rail station, which is only 1 hour from London Paddington Station; and it's reasonably convenient by rail from Heathrow and Gatwick airports.

The key features of a good conference venue are (1) easy access to international flights and domestic rail connections, and (2) an enticing location that can attract busy, choosy intellectuals who have many other options of places to go and people to see. 

Those features generally involve being located in places that many other people value -- i.e. that, given supply & demand, tend to involve high real estate prices.

Frankly, I would think that there was finally someone with a modicum of sense and understanding of basic PR working in the area, and I would upgrade my views of the competence of the organisation accordingly.

Also, I'd note that "this will save money in the long run" is a fairly big claim that has not been justified. There are literally hundreds of conference venues within a reasonable distance of Oxford, all of which are run by professional event managers who are able to take advantage of specialisation and economies of scale, which makes the claim difficult to believe.

Agreed. The whole founding insight of the EA movement was the importance of rigorously measuring value for money. The same kind of reasoning is used to justify every warm-and-fuzzy but low-value charity, and it's entirely reasonable to be very worried when major figures in the EA movement revert to that kind of reasoning when it's in their self-interest.

RobertJones
1y
Yes.  It seems very plausible that conferences are good and also that conferences in attractive venues are better, but it seems surprising that this would be the most effective use of the money.

I feel like a lot of this is downstream from people being reluctant to hire experienced people who aren't already associated with EA. Particularly for things like operations roles, experience doing similar work is going to make far more of a difference to effectiveness than deep belief in EA values.

 

When Coke needs to hire new people, they don't look for people who have a deep love of sugary drink brands; they find people in similar roles elsewhere and offer them money. I feel like the reason EA orgs are reluctant to do this is that there's a degree of exceptionalism in EA.

I agree that it's downstream of this, but strongly agree with ideopunk that mission alignment is a reasonable requirement to have.* A (perhaps the) major cause of organizations becoming dysfunctional as they grow is that people within the organization act in ways that are good for them, but bad for the organization overall—for example, fudging numbers to make themselves look more successful, asking for more headcount when they don't really need it, or doing things that are short-term good but long-term bad (with the assumption that they'll have moved on before t...

Conor Barnes
1y
It's pretty common in values-driven organisations to ask for some amount of value alignment. The other day I helped a friend with a resume for an organisation which asked applicants to care about its feminist mission. In my opinion this is a reasonable thing to ask for and expect. Sharing (overarching) values improves decision-making, and requiring it can help prevent value drift in an org.

Agreed. It's entirely possible to take someone's money and spend it on good causes without promoting them and associating one's reputation with them.

Forgive me if there's a structural reason why this wouldn't work, but why weren't you saving a larger share of the money coming in, to provide a buffer in case funding dropped off for whatever reason? It seems like part of the underlying issue here was assuming that funding levels would remain constant in the future.

A meta-level structural problem may be that so much decision-making seems to be concentrated in a relatively small group of people without much oversight. Even with the best people in the world, that's going to lead to groupthink and blind spots. Other charities and non-profits have extensive oversight systems that may be worth imitating.

How concerned were you about crypto generally being unethical?

Also, to add to that: whether or not crypto is in itself ethical, it's known to be a very unstable sector and one with a particularly negative reputation. Was there any discussion of how to compensate for that potential volatility, and of the potential reputational risks of being associated with it?

Please feel free to "be that guy" as hard as possible when we are talking about massive financial fraud. 

It feels like it was a mistake to tell people to change their strategy if that advice can be reversed by a single donor having issues. All the emphasis on "we're not funding constrained" may have done long-term harm by reducing future donations from a wider pool of people.

Linch
1y

It's not just a single donor, tech stocks have been down across the board in 2022.

Sorry, I should have been clearer: I was meaning more in psychological terms than economic ones. An extra dollar might still do the same amount of good, but the way people intuitively assess impact will feel very different depending on the funding context they perceive.

So what specific kinds of talented people does EA need more of? Well, the most obvious place to look is the most recent Leader Forum, which gives the following talent gaps (in order):

Is there a place you should go if you meet one of those particular talent gaps? 

Ben Millwood
2y
I don't think we have a single "landing page" for all the needs of the community, but I'd recommend applying for relevant jobs or getting career advice or going to an EA Global conference, or figuring out what local community groups are nearby you and asking them for advice.

The focus on student groups is also an inherent red flag for some people, as it can be viewed as looking for people who have less scepticism and experience.

There being a lot of funding available in EA also changes the calculus for people deciding if they want to donate their own money. If there are super rich people donating to EA, to the extent that finding ways to spend money is a problem, then the motivation for normal individuals to donate is lower. 

I think it changes it some, but not hugely? Even if the best remaining option for making the world better was direct cash transfers I think donations would still make a lot of sense; There's Lots More To Do.

I also don't think we stay in the current dynamic. Part of why it is important for a lot of people to go into directly doing useful things now is to identify and scale up opportunities for directing a lot of money toward valuable things. It's become harder to identify extremely cost-effective ways of spending your money to speed that process up, but money will still be very important.

It's not specific communications so much as the level of activity around specific causes: how many posts and how much discussion time are spent on AI and other cool intellectual things vs. more mundane but important things like malaria. There's a danger of being seen as just a way for people to morally justify doing the kinds of things they already want to do.

The claim that EA is talent constrained, not money constrained, always confuses me, since there is no shortage of young talented people interested in EA, but there seem to be very few jobs for them to do in EA. If there is this huge pile of money sitting around, why not give it to people in exchange for their labour, as every other industry or cause does?

Value drift seems like a risk. You might start off with a set of altruistic beliefs, but if you spend all your professional and social time around a set of people who don't share those beliefs, then you are likely to adopt their beliefs, due to the various well-studied psychological effects of conformity and the information you will be exposed to.

Excellent story. 

Do you have any writing elsewhere?

atb
2y
Thanks, glad you enjoyed it. No other writing yet, but a few things are under consideration, so if they get accepted I'll try to remember to reply to this comment with a link.

Same. I'm fairly confident in my writing skills but lack any talent in the other areas and would find doing so embarrassing.