All of Luke Freeman's Comments + Replies

GWWC has ambitious plans from 2022 onwards (we're hiring!)

Great question! We're working on finalising this post-merger and having a page on the website again. But in the meantime you can use this: 

(Note: there is a bug on the recurrence reporting right now which marks many recurring donations as one-off)

Leaning into EA Disillusionment

Thanks for writing this post. I think this would be very good to have as a required reading for fellowship programs.

What are some measurable proxies for EA community health?

Nice! Similarly you could look at comments and posts on the EA forum.

What are some measurable proxies for EA community health?

You can measure various churn rates wherever they make sense and track the average increase/decrease over time (is churn rising, falling, or staying the same?).
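A minimal sketch of this idea in Python — the monthly figures and the simple "average change" trend check are purely hypothetical, just to illustrate the rising/falling comparison:

```python
# Toy sketch with made-up numbers: compute monthly churn rates and
# check whether churn is rising, falling, or staying the same.

def churn_rate(members_at_start, members_lost):
    """Fraction of members at the start of a period who left during it."""
    return members_lost / members_at_start

# Hypothetical monthly figures: (members at start of month, members lost)
monthly = [(1000, 50), (1020, 46), (1050, 42), (1080, 38)]

rates = [churn_rate(start, lost) for start, lost in monthly]

# Average month-over-month change in churn: negative means churn is falling.
deltas = [b - a for a, b in zip(rates, rates[1:])]
avg_change = sum(deltas) / len(deltas)

trend = "falling" if avg_change < 0 else "rising" if avg_change > 0 else "flat"
print(f"rates: {[round(r, 3) for r in rates]}, trend: {trend}")
# → rates: [0.05, 0.045, 0.04, 0.035], trend: falling
```

A real version would pull the start/lost counts from membership or mailing-list data per group, but the trend comparison would look the same.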

What are some measurable proxies for EA community health?

I'd add sentiment analysis on public social media (Twitter is pretty easy for this) for a few key terms, accounts, and hashtags.
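As a toy illustration of the idea, here's a tiny lexicon-based scorer — the word lists and example posts are made up, and a real analysis would use an established tool (e.g. VADER) on actual posts pulled for the chosen terms and hashtags:

```python
# Toy lexicon-based sentiment scorer (illustrative only; lexicon is made up).
POSITIVE = {"great", "love", "effective", "inspiring", "good"}
NEGATIVE = {"scam", "cult", "waste", "bad", "elitist"}

def sentiment(text: str) -> int:
    """+1 per positive word, -1 per negative word; sign gives overall tone."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "love how effective this charity is",
    "feels like an elitist cult to me",
]
print([sentiment(p) for p in posts])  # → [2, -2]
```

Tracking the average score per term or hashtag over time would give a rough proxy for how public sentiment is shifting.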

Ooo I hadn’t thought of this. Good idea!
EA Infrastructure Fund: September–December 2021 grant recommendations

Thanks for publishing the acceptance rate! I think that's useful information to share.

Announcing Non-trivial, an EA learning platform for teenagers

I just want to give a huge round of applause to the Non-trivial team! So great to see this resource! I'm very keen to see this reach the next generation of critical thinking do-gooders!

DALL-E 2: Group of internet users standing up and applauding the launch of a website, in the style of The Last Supper

Much excite.

EA for dumb people?

I also want to chime in here and say that it was a bit of a shock for me coming into the EA community also: I was one of the more analytical people in most of my friendship groups, yet it was pretty quickly clear to me that my comparative advantage in this community was actually EQ, communications, and management. I'm glad to work with some incredibly smart analytical people who are kind enough to (a) help me understand things that confuse me when I'm frank about what I don't understand; and (b) remind me what else I bring to the table.

I think Luke needing to be reminded what he brings to the table is evidence that we're missing out on many extremely talented people who aren't at the 99.9th percentile on one particular skillset that we overselect for.

As a counter-example, I am below average in many skills that people in my wider peer group have that, I believe, would be incredibly helpful to the effective altruism movement. However, I am good at a very narrow type of thing that is easy to signal in conversation, which makes people in the EA community often think way more highly of me than... (read more)

EA for dumb people?

Let me know if you decide to go ahead with the idea and I'll see how I can help 😀 

Marriage, the Giving What We Can Pledge, and the damage caused by vague public commitments

Thanks for sharing your perspective here Jeffrey!

[Note: I wasn’t involved in the decisions around the wording of The Pledge so am speaking from my personal perspective as a member of The Pledge, and as a staff member at GWWC who has spoken with many members and prospective members.]

I agree that there is a downside to having ambiguity around the technicality of GWWC pledges. There are members who, from my perspective, take it too loosely and others who take it too seriously.

However, the level of specificity is a very difficult tradeoff to make ... (read more)

How I Recommend University Groups Approach the Funding Situation

+1 to the comment here about humour. I'm someone who loves a good laugh and has a pretty dry sense of humour but am particularly wary about it when talking about money and suffering (I've seen it go pretty badly in several EA or EA-adjacent contexts).

It's also very important to think about humour in non-EA social contexts where there are a lot of people from the EA community alongside those who aren't. Someone's first exposure to the community might be somewhere like an informal party, and first impressions really count.

How I Recommend University Groups Approach the Funding Situation

Thank you so much for writing this up! I particularly like the tangible examples and even exact wordings that make it easier for other organisers, e.g.:

  1. Your suggested language for conference funding (“we don’t want financial barriers to prohibit...attend the conference only if you want...”).
  2. The heuristics you use to determine how you spend money for Brown EA.
  3. Your answers to FAQs about the funding situation.

In terms of promoting effective giving, I recommend university group leaders look at the guide for promoting effective giving, effective givi... (read more)

Future Fund June 2022 Update

Thank you for such a detailed and transparent post! It's really exciting to see experimentation in funding models as Future Fund enters the ecosystem. (It's also great to see a bunch of promising things getting the resources they need!)

I've found that the project ideas, areas of interest and grants/regrants databases are also especially useful resources in helping people to think about how they might best contribute! I've shared these multiple times when speaking with very promising people who are relatively cause neutral and just want to do as much good a... (read more)

Thanks for sharing the database links Luke! I wasn't aware FTX had that, but it definitely makes sense that they do.
More funding is really good

FWIW: GiveWell actually already had some opportunities in the pipeline that they were still working on (e.g. Dispensers for Safe Water). Given the funding needs of their top charities right now it's looking very likely they'll have more room for funding than they can fill this year (unless there's unprecedented growth which seems unlikely given current projected economic conditions). At the GiveDirectly bar of funding (10%-30% as cost effective) there's nowhere near enough funding for the foreseeable future.

Zach Stein-Perlman (1mo):
(Update: yup [])
80k would be happy to see more projects in the careers space

Thanks Michelle for posting this! I think it really helps to clarify what 80K is doing and what it's not and to encourage people to fill the gaps (and even compete, well!).

I love the collaborative nature of the community and this is a great example.

Critiques of EA that I want to read

Thanks for sharing! I'd also love to read some of these critiques more fleshed out! Really appreciate that you posted bullet point summaries instead of either holding off for a more developed critique or just posting a vague list without summaries 😀 

Community builders should focus more on supporting friendships within their group

Thanks for this post! If it weren't for making friends with people in the EA community it's very unlikely I'd have made a career change to the direct work I'm doing and less likely I'd still be donating (certainly not at the level I have).

I like that you have specific suggestions for how community builders can act on this too 😀 

Michael Nielsen's "Notes on effective altruism"

Thanks Erin! I wouldn't say that EA is only about the key question, I just disagree that utilitarianism and an obligation to maximise are required or 'what EA is about'. I do agree that they are prevalent (and often good to have some upwards pressure on the amount we devote to doing good) 😀 

Michael Nielsen's "Notes on effective altruism"

EA is about two things.

[1] belief in utilitarianism or a utilitarian-esque moral system, that there exists an optimal world we should aspire to.

[2] belief that we should try to do as much good as possible

I would say that is a reasonable descriptive claim about the core beliefs of many in the community (especially more of the hardcore members), but IMHO neither are what "EA is about".

I don't see EA as making claims about what we should do or believe.

I see it as a key question of "how can we do the most good with any given unit of resource we devote to doing... (read more)

Erin Braid (2mo):
I also consider this question to be the core of EA, and I have said things like the above to defend EA against the criticism that it's too demanding. However, I have since come to think that this characterization is importantly incomplete, for at least two reasons:

  1. It's probably inevitable, and certainly seems to be the case in practice, that people who are serious about answering this question overlap a lot with people who are serious about devoting maximal resources to doing good. Both in the sense that they're often the same people, and in the sense that even when they're different people, they'll share a lot of interests and it might make sense to share a movement.
  2. Finding serious answers to this question can cause you to devote more resources to doing good. I feel very confident that this happened to me, for one! I don't just donate to more effective charities than the version of me in a world with no EA analysis, I also donate a lot more money than that version does. I feel great about this, and I would usually frame it positively - I feel more confident and enthusiastic about the good my donations can do, which inspires me to donate more - but negative framings are available too.

So I think it can be a bit misleading to imply that EA is only about this key question of per-unit maximization, and contains no upwards pressures on the amount of resources we devote to doing good. But I do agree that this question is a great organizing principle.
Announcing a contest: EA Criticism and Red Teaming

I also suspect that making a big deal about the winners would be a good thing. For example, if the winner of the prize was awarded on the main stage at an EA Global and given a fireside chat that'd further encourage good faith criticism and demonstrate that we really care about it.

High Impact Medicine, 6 months later - Update & Key Lessons

Thanks so much for writing this up! I really appreciate hearing more about what's going on inside different groups. I'm especially excited to see the work being done in professional groups.

Hearing ideas without them being exclusively attached to a wider philosophy and in a more neutral (rather than persuasive) fashion may help avoid putting people off and allows individuals the freedom to pick aspects they found more meaningful to explore. This helps encourage a ‘truth seeking’ approach that focuses on ‘learning useful tools and asking important questions’.

... (read more)

Thanks Luke, I definitely think that autonomy and agency, particularly for professionals who are already established in a career, is a good approach to take, and might be a slight difference between community building in university/for professional groups (at least anecdotally, this is our experience). 

And on footnote (2), I think this is actually something reasonably important I want to write more about-for instance, in our fellowship, we noticed that people responded well to information that was from well-known sources like high impact journals or n... (read more)

On being ambitious: failing successfully & less unnecessarily

Thanks Punty!

I take your concern as being, probably rightfully so, these groups working on things without knowing of similar ongoing/completed attempts.

Yep! I agree that competition is sometimes great, but it's the lack of awareness/learning/collaboration that can be a problem.

Would you propose collaborated record keeping between funders and entrepreneurs of ongoing and completed projects, with their potential points of failure? Including, why funding was not approved.

Yep, something like this. For example, I've heard/read many times both these things:

  • The i
... (read more)
What YouTube channels do you watch?

Here's the link:

Thank you so much, I hadn't seen this!
Monthly Overload of EA - June 2022

Thanks David! I always appreciate your curation 💕

On being ambitious: failing successfully & less unnecessarily

Thanks for writing this! I strongly upvoted this comment because I think it contributes a lot to and extends the OP on many different points.

On being ambitious: failing successfully & less unnecessarily

Definitely. Thanks for sharing that argument and example!

Sophia's Shortform

Yeah, but what people experience when they hear about EA via someone like Matt will determine their further actions/beliefs about EA. If they show up and unnecessarily feel unwelcome or misunderstand EA then we’ve not just missed an opportunity then and there but potentially soured them for the long term (and what they say to others will spur others before we get a chance to reach them).

Sophia's Shortform

Thanks Sophia. I think that you’ve quite articulately identified and laid out the difference between these two schools of thought/intuitions about community.

I’d like to see this developed further into a general forum post as I think it contributes to the conversation. FWIW my current take is that we’re more in a multiplicative world (for both the reasons you lay out) and that the lower cost solutions (like the ones you laid out) seem to be table stakes (and I’d even go further and say that if push came to shove I’d actively trade off towards focusing more on the median for these reasons).

Thanks Luke 🌞 Yeah, I definitely think there are some multiplicative effects. Now I'm teasing out what I think in more detail, I'm starting to find the "median" and "tails" distinction, while useful, still maybe a bit too rough for me to decide whether we should do more or less of any particular strategy that is targeted at either group (which makes me hesitant to immediately put these thoughts as a top-level forum post until I've teased out what my best guesses are on how we should maybe change our behaviour if we think we live in a "multiplicative" world).[1]

Here are some more of the considerations/claims (that I'm not all that confident in) that are swirling around in my head at the moment 😊. tl;dr:

  • High fidelity communication is really challenging (and doubly so in broad outreach efforts).
  • However, broad outreach might thicken the positive tail of the effective altruism movement's impact distribution and thin the negative one even if the median outcome might result in a "diluted" effective altruism community.
  • Since we are trying to maximize the effective altruism community's expected impact, and all the impact is likely to be at the tails, we actually probably shouldn't care all that much about the median outcome anyway.

High fidelity communication about effective altruism is challenging (and even more difficult when we do broader outreach/try to be welcoming to a wider range of people)

I do think it is a huge challenge to preserve the effective altruism community's dedication to:

  1. caring about, at least, everyone alive today; and
  2. transparent reasoning, a scout mindset and more generally putting a tonne of effort into finding out what is true even if it is really inconvenient.

I do think really narrow targeting might be one of the best tools we have to maintain those things.

Some reasons why we might want to de-emphasize filtering in existing local groups:

First reason

One reason why focusing on this l
Quantifying Uncertainty in GiveWell's GiveDirectly Cost-Effectiveness Analysis

Just wanted to say: nice work – this is super interesting! Are there plans to do more of this type of analysis on other charities and causes?

Would love to! I'm in communication about setting up an EA Funds grant to continue building these for other GiveWell charities. I'd also like to do this with ACE, but I'll need to communicate with them about it.
Is there a "Humans of EA" project? Meaning a project portraying different members within EA.

We generally release at least 1 per month on our blog and social media. We have a (needs to be updated) collection of some here: 

Aha thank you so much! I'll have a look!
On being ambitious: failing successfully & less unnecessarily

Yeah, that's a fantastic example. I really think that CE are a standout organisation on a lot of fronts.

EA Funds donation platform is moving to Giving What We Can

Thanks for your feedback Michael! 

Now the website for the funds doesn't let me donate? Where do I donate? At this other website attached to this other organization? That's odd. Why not just have a simple donation page on your website (thinks my naive brain). Hopefully there will at least be a link from the EA Funds website to the donation page! 

You will still be able to donate to the four EA Funds (either directly or with a link to GWWC) via the EA Funds website. This would be similar to the Founders Pledge Climate Fund or the Patient Philanthrop... (read more)

Thanks, this was very helpful!
[Linkpost] Towards Ineffective Altruism

A lot of the recent criticisms of EA I've seen target longtermism in its most "extreme" form, and drag all of effective altruism with it.

Although I am philosophically very persuaded by longtermism (I think it is an especially important contribution from the effective altruism community and I'm actively working on longtermist causes alongside other ones) I think that it's not the only game in town and we should be careful about times when we might be accidentally representing EA in that way. I think that if we're not especially careful to represent the dive... (read more)

Markus Amalthea Magnuson (3mo):
Exactly! Somewhat of a sidenote but I find it relevant: I've seen this thing with many political parties in Sweden that usually have a youth organisation that for various reasons often represents a more radical version of the so-called party line on various issues. Political opponents will try to hold the party responsible for what the youth branch says and does, but it's generally understood by most (I think) that the latter is the avant-garde and should not be conflated with the general views of e.g. those voting for the party in elections. Denying there are important connections between the two would be dishonest, but so would saying they are the same.
"Big tent" effective altruism is very important (particularly right now)

Had a bit of time to digest overnight and wanted to clarify this a bit further.

I'm very supportive of #3 including "epistemics of core members to be world class". But I fear that trying to achieve #3 too narrowly (demographics, worldviews, engagement levels, etc.) might ultimately undermine our goals (putting more people off, leaving the core group without as much support, worldviews becoming too narrow in ways that hurt our epistemics, and not creating enough allies to get things we want to do done).

I think that nurturing the experience through each level o... (read more)

"Big tent" effective altruism is very important (particularly right now)

Thanks for clarifying! Not much to add now right this moment other than to say that I appreciate you going into detail about this.

EA culture is special; we should proceed with intentionality

Thanks for writing this post! I like a lot of the recommendations you made as well as the specific examples you point to when talking about concerns. I’m really glad we’re having these conversations and I think this post contributes to it.

"Big tent" effective altruism is very important (particularly right now)

Thanks Sophia! That example is very much the kind of thing I’m talking about. IMHO it’s pretty low cost and high value for us to try and communicate in this way (and would attract more people with a scout mindset which I think would be very good).

"Big tent" effective altruism is very important (particularly right now)

Thanks Thomas! I definitely agree that when you get into the details of some of these they’re certainly not easy and that the framing of some of them could be seen as applause lights.

"Big tent" effective altruism is very important (particularly right now)

Also, almost everything anyone does is sub-maximally effective. We simply do not know what maximally effective is. We do think it’s worth trying to figure out our best guesses using the best tools available but we can never know with 100% certainty.

"Big tent" effective altruism is very important (particularly right now)

Yeah, I actually called this point out in general in my #8 footnote (“Plus some of these things could (low confidence) make a decent case for considering how low cost they might be.”). I’ve been at EA events or in social contexts with EAs when someone has asserted with great confidence that things like voting and giving blood are pointless. This hasn’t been well received by onlookers (for good reason IMHO) and I think it does more harm than good.

"Big tent" effective altruism is very important (particularly right now)

(I also felt that the applause lights argument largely didn’t hold up and came across as unnecessarily dismissive, I think the comment would have held up better without it)

Thomas Kwa (3mo):
Thanks, I made an edit to weaken the wording. I mostly wanted to point out a few characteristics of applause lights that I thought matched:

  • the proposed actions are easier to cheer for on a superficial level
  • arguing for the opposite is difficult, even if it might be correct: "Avoid coming across as dogmatic, elitist, or out-of-touch." inverts to "be okay with coming across as dogmatic, elitist, or out-of-touch"
  • when you try to put them into practice, the easy changes you can make don't address fundamental difficulties, and making sweeping changes has high cost

Looking over it again, saying they are applause lights is saying that the recommendations are entirely vacuous, which is a pretty serious claim I didn't mean to make.
"Big tent" effective altruism is very important (particularly right now)

Thanks Rob. I think you just made my point better than me! 😀

"Big tent" effective altruism is very important (particularly right now)

Agree on both points. I think the concentric circles model still holds well. "Big tent" still applies at each level of engagement though. The best critics in the core will be those who still feel comfortable in the core while disagreeing with lots of people. I highly value people who are at a similar level of engagement but hold very different views to me as they make the best critics.

"Big tent" effective altruism is very important (particularly right now)

Sorry if the remainder of the comment didn't communicate this clearly enough:

I think the "bait and switch" of EA  (sell the "EA is a question" but seem to deliver "EA is these specific conclusions") is self-limiting for our total impact. This is self-limiting because:

  • It limits the size of our community (put off people who see it as a bait and switch)
  • It limits the quality of the community (groupthink, echo chambers, overfishing small ponds, etc.)
  • We lose allies
  • We create enemies
  • Impact is a product of: size (community + allies) * quality (community +
... (read more)
"Big tent" effective altruism is very important (particularly right now)

Thanks Nathan. I definitely see the tensions here. Hopefully these clarifications will help :)

I reckon it's better if we focus on being a smaller highly engaged community rather than a really big one.

My central claim isn't about the size of the community, it's about the diversity of EA that we present to the world (and represent within EA) and staying true to the core question not a particular set of conclusions. 

It depends on what you mean by "focus" too. The community will always be some degree of concentric circles of engagement. The total size and... (read more)
