All of Tom Gardiner's Comments + Replies

Further to this, if the primary goal is to learn about how the general public thinks about charitable giving, you could probably achieve the same result for far less than 100k. The remainder could be held in reserve and given to that cause if you really do think it's the best use of the money, or to your current best guess if you do not. It seems like there's an insight you wish to have and you've set a needlessly big price tag on obtaining it. 

2
David Clarke
1y
I think it's unlikely that people will choose to make donations that I regard as totally useless. If they do, then that'll be learning in itself. In general, I think rich people hoarding wealth is a bigger problem for society than people making slightly suboptimal decisions about their giving, so I'm comfortable with the idea of putting a big sum towards this project.

I must offer my strongest possible recommendation for Speedy BOSH! - it has genuinely changed my relationship with food. None of the recipes I have tried are bad, some are fairly average but many are truly glorious. Obviously, as an EA I have been keeping notes on each dish I try from it in a Google Doc and I'd be happy to suggest my favourites to anyone who buys/has the book. 

2
Tiresias
1y
Agree! The recipes I've made from them have been consistently very good.
2
Lucy Hampton
1y
+1 to this! I also recommend the plain old BOSH! book - in my experience, these aren't much slower than the Speedy BOSH! recipes, either. 

Lots of good points here. One slight critique and one suggestion to build on the above. If I seem at all confrontational in tone, please note that this is not my aim - I think you made a solid comment. 

Critique: I'm wary of the belief that "smart, young EAs", and giving them grants to think about stuff, are the best solution to a problem, no matter how well they understand the community. In my mind, one of the most powerful messages of the OP is the one regarding a preference for orthodox yet inexperienced people over those w... (read more)

1
Chris Leong
1y
I don’t see my suggestion of getting a few groups of smart, young EAs as exclusive of engaging with experts. Obviously they trade off in terms of funds and organiser effort, but it wouldn’t actually be that expensive to pay the basic living expenses of a few young people.

The first point here seems very likely true. As for the second, I suspect you're mostly right but there's a little more to it. The first of the people I quote in my comment was eventually persuaded to respect my views on altruism, after discussing the philosophy surrounding it almost every night for about three months. I don't think shorter timespans could have been successful in this regard. He has not joined the EA community in any way, but kind of gets what it's about and thinks it's basically a good thing. If his first contact with the community he had... (read more)

Thank you for writing this. I'm not sure whether I agree or disagree, but it seems like a case well made. 

While I do not mean to patronise, as many others will have found this, the one contribution I feel I have to make is an emphasis on how very differently people in the wider public may react to ideas/arguments that seem entirely reasonable to the typical EA. Close friends of mine, bright and educated people, have passionately defended the following positions to me in the past:
-They would rather millions die from preventable diseases than Jeff Bezos... (read more)

4
Arepo
1y
I'm not surprised people with those sorts of views exist, but to some degree I'd expect them to diminish with familiarity. It's easier to sneer at things when you don't have multiple friends openly doing or supporting them. There's also a question of how much harm comes from such people hearing more about EA, even assuming they don't change or just reinforce their views. It seems unlikely they'd have been won over by a slower, more guarded approach, so the question would probably be something like 'are those people likely to become more proactively anti-EA such that they turn people away from making effective donations on net, despite the greater discussion around the idea of doing so?' That certainly seems possible, but nonetheless I would bet at pretty good odds against it. Writing this, it occurs to me that one effect of a more open media policy might be to blur the lines further between EA as 'a social movement' and EA as 'a certain way of donating money, that loads of people do without getting engaged'. That again is plausibly bad, but I would bet on it being net good.

Agreed - Scott Alexander does this very well, as does Yudkowsky in Rationality: A-Z. Both also benefit from being blogs of their own creation, where they can dictate a lot of the norms, so I expect they have a fair bit more slack in how high the ceiling is. 

As a teenager, I came up with a set of four rules that I resolved ought to be guiding and unbreakable in going through life. They were, somewhat dizzyingly in hindsight, the product of a deeply sad personal event, an interest in Norse mythology and Captain America: Civil War. Many years later, I can't remember what Rules 3 and 4 were; the Rules were officially removed from my ethical code at age 21, and by that point I'd stopped being so ragingly deontological anyway. I recall clearly the first two.

Rule 1 - Do not give in to suffering. Rule 2 - Ease the suffe... (read more)

As both a member of the EA community and a retired mediocre stand-up act, I appreciate that you took the time to write this. You rightly highlight that some light-heartedness has benefited some writers within the EA community, and outside of it. My intuition is that the level of humour we can see being used is, give or take, the right level given the goals the community has. A lot of effort and money has been spent on making the community, along with many job opportunities within it, seem professional in the hope that capable individuals will infer that we... (read more)

2
SWK
1y
Thanks so much for your insights. Can't really argue with what you say here: I think you articulated the idea of subtlety and the importance of correct application with humor far better than I did. Admittedly, John Oliver is an extreme example of humor, perhaps so extreme as to be unhelpful as a model for EA. Overall, maybe my use of the word "humor" in this post was too strong. I really liked Tiger Lava Lamp's comment below on "microhumor," which Scott Alexander describes as "things that aren’t a joke in the laugh-out-loud told-by-a-comedian sense, but still put the tiniest ghost of a smile on your reader’s face while they’re skimming through them." This seems to be a more accurate description of the MacAskill and Karnofsky examples I gave.  It seems like we both have a sense that something like Alexander's microhumor can fall within EA's humor threshold and be an effective tool for EA to an extent.

I'd be interested to know, if any of the powers that be are reading, to what extent the Long Term Future Fund could step in to take up the slack left by FTX in regard to the most promising projects now lacking funding. This would seem a centralised way for smaller donors to play their part, without being blighted by ignorance as to what all the other small donors are funding. 

Thanks! I'll definitely check those out.

Were they wearing an Emporio Armani t-shirt, by any chance?

That was really interesting to read. Let me know if you intend to continue down this line of research!

1
turchin
2y
I also have an article about survival on islands and have been thinking about surviving in caves. The topic of survival on ships is a really interesting one and I hope to turn to it some day, but for now I am working on other problems. 

It's certainly a much easier way to transport things in bulk!

Really interesting that you say that; astro nav is a big part of the Officer of the Watch qualification in the UK, which is roughly in line with its civilian equivalent. That being said, if the laptop with the relevant software went down it would be a noticeable setback. I'd like to believe we'd pull through, though.

I'm definitely going to change my attitude to community building, to the extent I am involved with it, as a result of reading this. Making sure that criticisms are addressed to the satisfaction of the critic seems hugely important and I don't think I had grasped that before.

My first thought on reading this suggestion for working groups was "That's a great idea, I'd really support someone trying to set that up!"

My second thought was "I would absolutely not have wanted to do that as a student. Where would I even begin?"

My third thought was that even if you did organise a group of people to try implementing the frameworks of EA to build some recommendations from scratch, this will never compare to the research done by long-standing organisations that dedicate many experienced people's working lives to finding the answers. The co... (read more)

Agreed, hence "I don't even think the main aim should be to produce novel work". Imagine something between a Giving Game and producing GiveWell-standard work (much closer to the Giving Game end). Like the Model United Nations idea - it's just practice.

Really glad that you brought up this topic. Dedicating one's career (or an appreciable fraction of time or happiness) to a project that will likely fail is a huge deal for someone's personal narrative, and we're hoping that swathes of people will be committed enough to do this. I don't have any answers that aren't mere applause lights, but I hope this remains a prevalent discussion.

To clarify, my position could be condensed to "I'm not convinced small scale longtermist donations are presently more impactful than neartermist ones, nor am I convinced of the reverse. Given this uncertainty, I am tempted to opt for neartermist donations to achieve better optics."

The point you make seems very sensible. If I update strongly back towards longtermist giving I will likely do as you suggest.

That seems like a very robust approach if one had a clear threshold in mind for how many qualified AI alignment researchers would be enough. Sadly, I have no intuition or information for this, nor a finger on the pulse of that research community.

Hi Olivia, really good of you to share these experiences. A few points I think might be helpful for your next conference:

-The social norms in EA are probably the most open and accepting of any group I've seen in my life. Provided two people aren't engaged in a focused one-on-one, walking up to a group and saying "Hi, can I join this conversation?" seems universally allowed with no sense of alienation or awkwardness at all. People would always catch me up on the conversation topic and include me fully. 
- It was my first conference too and I also hadn't... (read more)

1
Olivia Addy
2y
Hi Tom, thanks for this comment, really useful suggestions! I hadn't considered volunteering but that does sound like it would be a great way to make some quick friends.

Will we need to email Clare whenever some new oxygen needs producing?

4
OllieBase
2y
Yes, thanks for the heads up, I'll email nearer the time (2025 maybe)

I was going to suggest the last point, but you're way ahead of me! In the next couple of years, the first batch of St Andrews EAs will have fully entered the world of work/advanced study, and keeping some record of what the alumni are doing would be meaningful. 
[As highlighted in the thread post, we are two EAs who know each other outside the forum.]

I think some form of this could be valuable, noting Sebastian's point that decreasing risk should be the main priority. It struck me reading the main article that the tendency for EAs to congregate to some extent geographically poses a challenge from a long-term perspective. Oxford, the community's beating heart, is uncomfortably close to London (the obvious civilian target) and Portsmouth (home of the Royal Navy, probably second-top priority military target), meaning a large fraction of the community would be wiped out in a nuclear war. It might be prudent for EAs who can work remotely to set up 'colonies' in places unlikely to be devastated by a nuclear exchange, to provide resilience. 

3
Greg_Colbourn
2y
I'm guessing that with the number of nukes Russia has, most cities in NATO above ~100k people will be targeted, including Oxford. In contrast, basically everywhere in the Southern Hemisphere is likely safe (New Zealand is a favourite hideout for elites, I hear).