Charlie Dougherty

Hi! My name is Charlie, and I am the new executive director of Effective Altruism Norway. I was born and raised in the USA, but I have lived in Norway now for the last 10 years and have dual citizenship.

My interests include animal welfare, future history, politics, and meta-EA questions.



Wiki Contributions


Mapping of EA

Hi Jordan, 

Thanks for the interest! I am not sure what form this would take, or if I am the right person to be doing it, but if something happens to come up I will keep you in the loop.

Mapping of EA

Thanks Gidon! Would you think this is a useful exercise to try?

Mapping of EA

Oh wow, that's a really fun idea. Thanks for sharing! It's like someone was writing an EA fantasy series.

The Explanatory Obstacle of EA

Hi Gidon! Thanks for the thoughtful reply.

Sorry if I got lost in the difference between a pitch and an explanation in your post. When we talk about one-minute or similarly short explanations of EA, I tend to think of them as pitches. In the EA world, I tend to think of long-form education and discussion, such as a fellowship program, as an explanation. I like the distinction, but I would also suggest the line between the two isn't clear cut. I think this is also indicated by the fact that your suggested guidelines are directed at both pitches and explanations.

My interpretation of what you wrote was that you felt EA pitches were neither very good at attracting people nor at explaining EA to them, so it's interesting to hear you think the pitches are good.

I like your suggestions, and I love the example of buying a car in your one-minute pitch. It's a wonderful illustration of the idea that it's "the thought that counts" in being kind, but in little else.

If I were to take a step back, though, I would also argue that knowing your audience is very important even when explaining EA, as not every person looking to learn about EA is interested in all aspects of it. Lots of people want to do more with their donations, but don't care about epistemics or consequentialism.

Lots of students want to figure out how to use their time and energy best, but don't worry about earning to give just yet.

Others are completely preoccupied by the philosophy of the far future, and couldn't care less about Giving What We Can.

Some people only care about the fact that EA is so strong on factory farming, and think AI risk is a fantasy.

There are not that many people who are concerned with knowing the whole of EA and being able to chart it. Most of those people participate in this forum. Knowing the true state of EA is a meta question more than anything to me, and not always useful to the average supporter. (I can talk about this more, but it would take some space.)

What the people who need an explanation of EA probably need most is an explanation of how EA is relevant to what they care about. We need to frame EA for the audience we are addressing, and until they become fully engaged in EA, a complete charting of EA is probably unnecessary for them, and for many I suspect overwhelming.

So for me, it goes back to knowing your audience. How can EA help them be better at what they want to do? How can they help us be a better movement? That is a key to building greater engagement, in my opinion.


Also, does anyone have an up-to-date mapping of EA right now?

The Explanatory Obstacle of EA

Hi Gidon,

Thanks for this, a really interesting way to think about the problem!

I think one rule of thumb that can help people simplify the framing problem is to know who your audience is. I am not sure there is a universal framing that can be applied to all situations, and trying to abstract explanations to the point of having a framework of explanations might lead to some over-optimized explanations.

I think your criticism of the website is fair, but I believe it has more to do with writing to the wrong audience than with giving a poor explanation. You mention this when you say that the wording might not appeal to someone who does not tend to think very analytically in daily life, but I do not think the problem is that it is not clear enough. The problem is that the text does not capture the reader.

I do not think that the point of most of our introductory pitches should be to transfer the most bits of information, but rather to get people on the right track: interested in and attracted to the idea.

I might argue this is more of a copywriting issue than a clarity issue. 

I don't think there is a first-degree understanding of EA, followed by further degrees of complexity that you come to understand as you go along. To be able to parse your explanations in this forum post to the degree that you do already requires a high degree of EA expertise. If someone understands all of the information you are trying to transmit in your explanation here, then they are already long past the point of requiring an introduction to EA.