Michael Townsend

Researcher at Giving What We Can.

Topic Contributions


Sort forum posts by: Occlumency (Old & Upvoted)

I'd use this feature if added!

I wonder whether the algorithm (if it is done algorithmically?) that selects the posts to put in "Recommendations"/"Forum Favourites" should also be weighted for occlumency. The reasons outlined in this post seem to push in favour of this, though I have some concern that old posts that are now outdated, rather than foundational, could get undue attention.
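To illustrate what I mean by "weighted for occlumency": here's a toy scoring function where older, highly-upvoted posts rank higher. This is just a sketch of the general idea; the function name, the half-life parameter, and the exact formula are all my own invention, not the Forum's actual algorithm.

```python
from datetime import datetime, timezone
from typing import Optional

def occlumency_score(karma: int, posted_at: datetime,
                     now: Optional[datetime] = None,
                     half_life_days: float = 365.0) -> float:
    """Score a post so that older, highly-upvoted posts rank higher.

    The age weight rises from 0 toward 1 as the post gets older,
    saturating so that very old posts don't dominate without limit.
    Hypothetical formula -- not the Forum's real recommendation logic.
    """
    now = now or datetime.now(timezone.utc)
    age_days = max((now - posted_at).total_seconds() / 86400, 0.0)
    age_weight = age_days / (age_days + half_life_days)
    return karma * age_weight
```

Under this scoring, a seven-year-old post with the same karma as a one-month-old post scores much higher, which is roughly the "Old & Upvoted" ordering the original post proposes.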

My Job: EA Office Manager

As another anecdote of the value of a well-run office: after working at Trajan for a few weeks during a visit to the UK and seeing how incredible the office was (thanks to Jonathan), I honestly considered moving countries. I'm now inspired to look into whether it'd one day be possible to set up an office space in Australia (please message me if interested!).

Increasing Demandingness in EA

What Bec Hawk said is right: my claim is that the number of people effective giving causes to go into direct work will be greater than the number of people it causes not to go into direct work (who otherwise would).

For what it's worth, I don't think >50% of people who take the GWWC pledge will go on to do direct work.

Increasing Demandingness in EA

I think this post does a great job of capturing something I've heard from quite a few people recently.

Especially for longtermist EAs, it seems direct work is substantially more valuable relative to donations than it was in the past, and I think your thought experiment about the number of GWWC pledges it'd make sense to trade for one person working on an 80k priority pathway is a reasonably clear way of illustrating that point. 

But I think this is a false dilemma (as you suggest it might be). That isn't just because I doubt that the pledge (or effective giving generally) is in tension with direct work, but because I think the two are mutually supportive. Effective giving is a reasonably common way to enter the effective altruism community. Noticing that you can have an extraordinary impact with donations (which, even from a longtermist perspective, I still think you can) can inspire people to begin taking action to improve the world, and potentially to continue on to direct work. Historically it's been a pretty common first step, and though I anticipate direct efforts to recruit highly engaged EAs becoming relatively more prominent in future, I still expect the path from effective giving to a priority-path career to occur much more often than the path from effective giving to someone not taking a priority path.

I've heard a lot of conflicting views on whether the above is right; quite a few people seem to disagree with me and think there's much more of a tension here than I do, and I'd be interested to hear why. (For disclosure, I work at GWWC and personally see getting more people into EA as one of the main ways GWWC can be impactful.)

If I'm right, the upshot is that the norm that "10% and you're doing your part" can continue, and it's not in obvious tension with the fact that direct work may be many times more impactful. While it may be uncomfortable that there are significant differences in how impactful different members of the community are, I think that is, was, and always will be the case.

Another thing worth adding: I think there's also room for multiple norms on what counts as "doing your part". For example, I think you should also be commended, and feel like you've done your part, if you apply to several priority paths, even if you don't get one or it doesn't work out for whatever reason. Holden's suggestion of trying to get kick-ass at something, while being on standby to use your skill for good, could be another.

By way of conclusion, I realise what I've written above might seem dismissive of the general issue: that, given the new landscape, EA has yet to figure out how to think about demandingness. But I really do think there is something to work out here, and I appreciate this post for raising it quite explicitly as an issue.

EA needs money more than ever

I like this framing! 

In general, I think the fact that funding is often not a bottleneck for the most impactful longtermist projects gets conflated with the idea that marginal donations aren't valuable. But they are! Many, if not most, of the opportunities in non-longtermist causes that got many of us excited to be part of effective altruism still exist.


My GWWC donations: Switching from long- to near-termist opportunities?

Thanks for posting this -- as the other comments suggest, I don't think you're alone in feeling a tension between your conviction in longtermism and your lack of enthusiasm for marginal longtermist donation opportunities.

I want to distinguish between two different ways of approaching this. The first is simply maximising expected value; the second is trying to act as if you're representing some kind of parliament of different moral theories/worldviews. I think these are pretty different. [1]

For example, suppose you were 80% sure of longtermism but had a 20% credence in animal welfare being the most important issue of our time, and you were deciding whether to donate to the LTFF or the Animal Welfare Fund. The expected value maximiser would likely conclude one had the higher expected value and donate all their funds to it. The moral parliamentarian, however, might compromise by donating 80% of their funds to the LTFF and 20% to the Animal Welfare Fund.
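The contrast can be made concrete with a toy model. All the numbers, fund names, and function names below are illustrative assumptions of mine (the "value" figures in particular are made up), not a claim about how either framework is actually formalised:

```python
def ev_maximiser_split(credences, value_by_view):
    """Expected-value maximiser: compute each fund's credence-weighted
    expected value and put 100% of the budget into the single winner."""
    ev = {}
    for view, credence in credences.items():
        for fund, value in value_by_view[view].items():
            ev[fund] = ev.get(fund, 0.0) + credence * value
    best = max(ev, key=ev.get)
    return {fund: (1.0 if fund == best else 0.0) for fund in ev}

def parliamentarian_split(credences, preferred_fund):
    """Simplified moral parliament: each worldview controls a share of
    the budget equal to your credence in it, and spends that share on
    its own preferred fund."""
    split = {}
    for view, credence in credences.items():
        fund = preferred_fund[view]
        split[fund] = split.get(fund, 0.0) + credence
    return split
```

With 80%/20% credences, the EV maximiser ends up all-in on whichever fund wins the weighted comparison, while the parliamentarian's split simply mirrors the credences -- 80% to the LTFF, 20% to the Animal Welfare Fund.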

From this comment you left:

I'm not convinced small scale longtermist donations are presently more impactful than neartermist ones, nor am I convinced of the reverse. Given this uncertainty, I am tempted to opt for neartermist donations to achieve better optics.

I take it that you're in the game of maximising expected value, but you're just not sure that the longtermist charities are actually higher impact than the best available neartermist ones (even if they're being judged by you, someone with a high credence in longtermism). That makes sense to me! 

But I'm not sure I agree. I think there'd be something suspicious about the idea that neartermism and longtermism align on which charities are best: given they're optimising for very different things, it'd be surprising if they arrived at the same recommendation. More importantly, I think I might just be relatively more excited than you are about the kinds of grants the LTFF is making, and about the idea that my donations could essentially 'funge' Open Philanthropy (meaning I get the same impact as their last dollar).

I also think that if you place significant value on the optics of your donations, you can always just donate to multiple different causes, allowing you to honestly say something like "I donate to X, Y and Z -- all charities that I really care about and think are doing tremendous work" which, at least in my best guess, gets you most of the signalling value.

Time to wrap up this lengthy comment! I'd suggest reading Ben Todd's post on this topic, and potentially also the Red-Team against it. I also wrote "The value of small donations from a longtermist perspective", which you may find interesting.

Thanks again for the post, I appreciate the discussion it's generating. You've put your finger on something important.

  1. ^

    At least, I think the high-level intuitions behind each of these mental models are different. But my understanding from a podcast with Hilary Greaves is that when you get down to trying to formalise the ideas, it gets much murkier. I found these slides from her talk on this subject, in case you're interested!

Can we agree on a better name than 'near-termist'? "Not-longtermist"? "Not-full-longtermist"?

I agree that the term, whether neartermist or not-longtermist, doesn't describe a natural category, but I think the latter does a better job of communicating that. The way I hear it, "not-longtermist" sounds like "not that part of idea-space", whereas "neartermist" sounds like an actual view people may hold about how we should prioritise the near term versus the long term. So I think your point actually supports one of David's alternative suggested terms.

And though you say you don't think we need a term for it at all, the fact that "neartermist" has caught on suggests otherwise: if it weren't helpful, people wouldn't use it. Perhaps, though, you didn't mean just that we don't need one, but that we shouldn't use one at all. I'd disagree with that too, because it seems reasonable in many cases to want to distinguish longtermism from other worldviews EAs often hold (e.g., it seems fair to say that Open Philanthropy's internal structure is divided along longtermist/not-longtermist lines).

Also, cool image!

The Vultures Are Circling

Like other commenters, I'd want to see further evidence of these kinds of conversations to back up the tone of this piece (e.g., which online circles are you hearing this in?).

That said, it's pretty clear that the funding available is very large, and it'd be surprising if that news didn't get out. Even in wealthy countries, becoming a community builder in effective altruism might just be one of the most profitable jobs for students or early-career professionals. I'm not saying it shouldn't be, but I'd be surprised if there weren't (eventually) conversations like the ones you described. And even if I think "the vultures are circling" is a little alarmist right now, I appreciate the post pointing to this issue.

On that issue: I agree with your suggestions of "what not to do" -- I think these knee-jerk reactions could easily cause bigger problems than they solve. But what are we to do? What potential damage could there be if the kind of behaviour you described did become substantially more prevalent?

Here's one of my concerns: we might lose something that makes EA pretty special right now. I'm an early-career employee who just started working at an EA org. And something that's struck me is just how much I can trust (and feel trusted by) people working on completely different things in other organisations.

I'm constantly describing parts of my work environment to friends and family outside of EA, and something I often have to repeat is that "Oh no, I don't work with them -- they're a totally different legal entity -- it's just that we really want to cooperate with each other because we share (or respect the differences in) each other's values". If I had to start second-guessing what people's motives were, I'm pretty sure I wouldn't feel able to trust so easily. And that'd be pretty sad.

Save the Date: EAGxMars

I'd just like to give a shout-out to the organisers for their great work!

I don't think anyone appreciates how hard running a conference can be at the best of times. But on Mars, the logistical difficulties are on another planet: the organisers have had astronomical health and safety challenges, and don't get them started on the availability of vegan catering...

Announcing What We Owe The Future

Very excited to see this! Hoping to pre-order enough books to have Christmas and birthday presents for years to come...
