Having been involved in everything from climate justice to AI safety, I've had dozens and dozens of people in my social networks, not just in those movements but among family and friends, ask me if there is "hope" of the most idealistic and maximal kind. That is, hope to the degree that there is no serious need to worry about any risk to global civilization whatsoever, because there is no significant risk and it's all manufactured hysteria.

Everyone who asks me knows I'm an advocate rather than a scientist or expert. They're often aware I've been in contact with people who are relevant experts (e.g., environmental scientists, political scientists, full-time academic researchers). Even when I reply with an ambiguous answer, most people nervously repeat the question in a way that tells me they're fishing for emotional validation.

A real answer to the question would be my impression of the spread of opinion among the most well-informed people I know. I tell them that there is technically some hope, but that the risk of civilizational collapse from this or that threat is significant. Again, I'm not as pessimistic as others, so I haven't told them there's no hope, or that we're rapidly running out of it.

People start getting upset when I can't honestly tell them that the rosiest, most wilfully and irrationally optimistic outlook is the truth. They're not even asking about ostensibly more speculative stuff like AI. Most people seem to emotionally warp themselves into a paranoid hunch that even the IPCC and the Bulletin of the Atomic Scientists are doomsday cults.

They're aware there were movements that facilitated the de-escalation and end of the Cold War, or that successfully rallied the world to stop the ozone crisis before it was too late. They're begging me to tell them that the movements like that today, the ones I've been connected to, can eliminate existential risks like a slam dunk. They're trying to put me in a position to put any worries about existential risk out of sight and out of mind, so that they don't have to worry about it. That's what I mean by most people just seeking emotional validation.

I can't even say "no" directly, because it will trigger them. I can see they're on the brink of tears or a meltdown. I have to imply it.

Even then, they immediately start gaslighting me, or even become hostile, as if I'm the kind of doomer who's really making things worse by causing average people just living their lives to become super anxious over utterly fabricated apocalypse scenarios.

Again, this is most often not even about AGI risk being some science-fiction nonsense, but about runaway climate change, or the risk of nuclear war over the war in Ukraine, or geopolitical tensions in Asia related to Taiwan or North Korea. They look at me with fear, or even resentment, when I can't tell them the lies they want to hear. They know I'm more of an ordinary person who can't verify anything. I'm just the closest person they know who they want to reassure them, based on my relatively deeper familiarity with the subject matter of existential risk.

This has been my co-workers. It has been my relatives. It has been my childhood friends. It has been political activists, environmentalists, or effective altruists who dropped out of whatever movement because they could no longer bear the increasing emotional and mental burden that has come with the mounting sense of impending doom. It has been strangers who bring this up as small talk about what they've read in the news, and who start probing when I mention that I've participated in some of the movements aiming to address these monumental problems. They're people I meet in queues and on the bus, and people on the street asking what the buttons or shirts I wear about saving the world are about.

About a year or so ago, I started trying to avoid these kinds of conversations entirely because they felt so fruitless. They still didn't stop. People I know occasionally ask me about this out of the blue. I'm facing peer pressure to give people false hope, just so they can hear someone else affirm out loud their rationalizations and subconscious denial, as their anxiety about the potentially pending end of the world keeps increasing. I just try to change the subject as quickly as I can.

These are my personal experiences, not research data. Yet this has been a worsening trend over the last five years among the vast majority of the people I've talked to, over a hundred people, maybe a few hundred. I stopped trying to make precise estimates of how many it has been a while ago.

This feels to me like it might reflect a general trend in society rather than a series of uninformative anecdotes from just my life. The empirical data from public opinion polls about how people think about individual propositions, whether runaway climate change, out-of-control artificial intelligence, or the risk of World War 3, are just numbers anyway. They tend not to capture the social psychology of how all kinds of people among the general public feel about existential risk. Hopefully this is valuable information for other effective altruists.

That's not my main motivation for writing this, though. Personally, I try to change the subject because I'm tired of being other people's existential therapist, the one they look to for hope. I don't know who I'm supposed to look to for hope. I can't just look to all of you for some more magical kind of hope. That's not how this works.

I just need to get this off my chest among a bunch of other people who will understand how I feel. As to how I feel, personally, as someone who has made a mission of endlessly thinking about, and trying to do something about, these existential risks: it feels horrible. This fucking sucks.

I've been doing this for half my life. I started getting involved in movements trying to do something about all of this when I was in Grade 10, years before I had heard of effective altruism, or before this movement even existed. Now, I'm 31 years old.

I'm not giving up on anything. I'll keep going. It just takes a severe emotional toll to keep going sometimes.






My model of this is that the human brain evolved, over the last ~3,000 years (possibly 10,000, or 1,000), to avoid being "hacked" by the words of other humans; specifically, hacked into sacrificing physical resources or social status to someone who optimized for finding a combination of words or ideas that could do that. For example, the people who were persuaded by ancient priests to give an unusually large tithe to the sun god's temple (e.g., via proto-Pascal's-wager arguments) had fewer viable offspring, and people resistant to persuasion about anything dominated the gene pool by the time of the industrial revolution. The real dynamic was probably like this, but much more nuanced (e.g., maybe conformity-obsession was the mechanism itself, rather than resistance to new ideas).

Unfortunately, the current state of our culture (in the EU and particularly NA) is that because nobody has bled to death or been burned to a crisp by the million for almost 80 years, we end up in the pitiful situation where most people extrapolate the last 80 years of peace enjoyed by our parents and grandparents onto the next 80 years for ourselves, our children, and our grandchildren. Given that the world was alright for 80 years, the predictions of many "doomsayers" necessarily must have been wrong for 80 years. Ancient Rome had a similar situation before it collapsed.

There's also the problem of EA being "punchable", in the words of Holden; EA is in fact an attack on the legitimacy of the Left and the Right by default, merely by making an effort to be altruistic:

Having seen the EA brand under the spotlight, I now think it isn’t a great brand for wide public outreach. It throws together a lot of very different things (global health giving, global catastrophic risk reduction, longtermism) in a way that makes sense to me but seems highly confusing to many, and puts them all under a wrapper that seems self-righteous and, for lack of a better term, punchable?

The issue is that when you arrive and make your pitch, the average person is oriented towards their reality, in which they are in the right for expecting a peaceful existence, rather than the real reality, in which you are in the right for making your pitch. It may be helpful to think of it like a startup founder pitching their idea to an investor, except that rather than a transaction of cash, which the investor is professional and experienced about, it's an exchange of a way of life. You are arguing that their life plan was pretty poorly thought out, and they aren't professional or experienced about hearing this the way investors are about hearing elevator pitches.

In the 2020s, you also have to tack on the rise of social media misinformation and competing messages. People's friend groups are now filled with hot takes optimized for memetic spread, and people see fear-inducing ideas as something their friends have fallen victim to, specifically on social media. Competing intense messages make people withdraw into their shells as a pretty basic adjustment/immune response, regardless of what reality looks like.

What helped me was reading about dath ilan, specifically the basement. Seeing it from the perspective of a civilization where people give the problem a proportionate amount of weight gave me a point of reference that helped me have more precise thoughts about the current situation.

TL;DR: The root cause of this phenomenon is well described by Kevin Simler's Down the Rabbit Hole, and the image from that post.

This is just how people are. It's the full context of the situation. The enemy's gate is down.

That sounds rough. Sorry to hear about that experience.