
Note: I'm not claiming that I know of a new x-risk, I just want to know about the right policy in this situation

If someone identifies a new existential or catastrophic risk, it seems prudent to avoid publishing it widely as this may constitute an infohazard.

However, it probably doesn't make sense to keep this information to oneself since other people can begin to work on research and mitigation if they are aware of the risk.

Is there a group of people to disclose new x-risks to that can make relevant experts aware of the risk? In general, how and where should someone disclose a new x-risk?



4 Answers

Not a comprehensive answer, but a few ideas. I don't know of any existing documentation or organisation dedicated to this.

  1. I think talking to people currently heavily involved in funding x-risk mitigation efforts is a good start. People with a proven track record of taking x-risks seriously are more likely to adequately consider the relevant concerns and assist by progressing the discussion and coming up with meaningful mitigation strategies. For example, you could email Nick Bostrom or someone at Open Philanthropy. I've heard Kevin Esvelt is someone with a track record of taking info-hazards seriously too.
  2. Maybe don't go directly to super critical people in existing efforts. It may be better to qualify your ideas first by talking to other experts whom you trust in whichever domain is likely to know about those risks (although of course you'd want to avoid losing control of the narrative, such as by someone you tell overzealously raising the alarm and damaging your credibility).

There's probably lots of specific reasoning that might be necessary based on the relevant risk (for example if it's tied up with specific economic activity the way AI capabilities development is). 

I endorse the suggestion to talk to someone senior at Open Phil. EA doesn't have a centralized decision-maker, but Open Phil might be the closest thing: a generally trusted group that is used to handling these issues.

Ok, and any advice for reaching out to trusted-but-less-prestigious experts? It seems unlikely that reaching out to e.g. Kevin Esvelt will generate a response!

Linch
I think someone like Esvelt (and also Greg, who personally answered in the affirmative) will probably respond. Even if they are too busy to do a call, they'll know the appropriate junior-level people to triage things to. 

To build on Linch's response here:
I work on the biosecurity & pandemic preparedness team at Open Philanthropy. Info hazard disclosure questions are often gnarly. I'm very happy to help troubleshoot these sorts of issues, including both general questions and more specific concerns. The best way to contact me, anonymously or non-anonymously, is through this short form. (Alternatively, you could reach my colleague Andrew Snyder-Beattie here.) Importantly, if you're reaching out, please do not include potentially sensitive details of info hazards in form submissions – if necessary, we can arrange more secure means of follow-up communication, anonymous or otherwise (e.g., a phone call). 

The guiding principle I recommend is 'disclose in the manner which maximally advantages good actors over bad actors'. As you note, this usually will mean something between 'public broadcast' and 'keep it to yourself', and perhaps something in and around responsible disclosure in software engineering: try to get the message to those who can help mitigate the vulnerability without it leaking to those who might exploit it.

On how to actually do it, I mostly agree with Bloom's answer. One thing to add: although I can't speak for OP staff, Esvelt, etc., I'd expect that, like me, they would far rather have someone 'pester' them with a mistaken worry than see a significant concern get widely disseminated because someone was too nervous to reach out to them directly.

Speaking for myself: If something comes up where you think I would be worth talking to, please do get in touch so we can arrange a further conversation. I don't need to know (and I would recommend against including) particular details in the first instance.

(As perhaps goes without saying, at least for bio - and perhaps elsewhere - I strongly recommend against people trying to generate hazards, 'red teaming', etc.)

Generally speaking, I would suggest a shift of focus away from particular risks arising from emerging technologies, and towards the machinery generating all such risks: an ever-accelerating knowledge explosion.

It's natural to see a particular risk and wish to do something about it. But such a limited focus is not fully rational once we realize that it doesn't really matter if we remove one particular existential risk unless we can remove them all. For example, if I knew how to make genetic engineering fully safe, why would that matter if we then go on to have a nuclear war?

It's a logic failure to assume, as seemingly almost all "experts" do, that we can continue to enthusiastically fuel an ever-accelerating knowledge explosion and then somehow successfully manage every existential risk which emerges from that process, every day, forever.

We're failing to grasp what the concept of acceleration actually means. It means that if the knowledge explosion is going at, say, 50mph today, tomorrow it will be going 75mph, then 150mph, then 300mph, and so on. Sooner or later this accelerating process of power accumulation will exceed the human ability to manage. No one can predict exactly when or how we'll crash the system, but simple common-sense logic demonstrates it will happen eventually on our current course.

The "experts" would have us focus on the details of particular emerging technological threats. The experts are wrong. What we need to focus on instead is the knowledge-explosion assembly line that is generating all the threats.

The way I deal with info-hazards in general is that I balance the risks and gains of talking about it with specific people. I haven't wanted to talk to "EA seniors" unless I know them well enough to trust them. But I do talk to people, because it helps me grow my own understanding, and that might help me or them do something about it.

I don't think you know me well enough to trust me, but I'd be happy to hear about it and give feedback on the reasoning.

1 Comment

It's a very important question.

However, it probably doesn't make sense to keep this information to oneself since other people can begin to work on research and mitigation if they are aware of the risk.

I don't think this is always the case. In anthropogenic x-risk domains, it can be very hard to decrease the chance of an existential catastrophe from a certain technology, and very easy to inadvertently increase it (by drawing attention to an info hazard). Even if the researchers (within EA) are very successful, their work can easily be ignored by the relevant actors in the name of competitiveness ("our for-profit public-benefit company takes the risk much more seriously than the competitors, so it's better if we race full speed ahead", "regulating companies in this field would make China get that technology first", etc.).

(See also: The Vulnerable World Hypothesis.)
