Centre for Long-Term Resilience

The Centre for Long-Term Resilience (CLTR) is an independent think tank with a mission to transform global resilience to extreme risks. It works with governments and other institutions to improve relevant governance, processes, and decision-making.

CLTR focuses on two areas of risk where effective governance today could substantially mitigate both current and future threats:

  • Artificial intelligence (AI), including risks arising from unethical uses of AI, from AI systems behaving in unintended ways in high-stakes domains, and from the broader impacts of AI on the economy and society.
  • Biosecurity, including risks arising from naturally occurring pandemics, laboratory leaks, bioweapons and ‘dual-use’ research (advances that can be used for harm as well as good).

CLTR also works on risk management more broadly: both transforming risk governance and identifying, assessing and mitigating extreme risks of all kinds.

It helps governments and other institutions transform resilience to extreme risks by:

  • Helping decision-makers and the wider public to understand extreme risks and what can be done about them.
  • Providing expert advice on, and red-teaming of, policy decisions.
  • Convening cross-sector conversations and workshops related to extreme risks.
  • Developing and advocating for policy recommendations and effective risk management frameworks and systems.
  • Providing an exchange for specialist knowledge, including by facilitating expert placements into government.

Funding

In August 2023, Founders Pledge published a profile on the Centre for Long-Term Resilience, recommending it as a funding option.

As of June 2022, the Centre for Long-Term Resilience had received over $2.8 million in funding from the Survival and Flourishing Fund[1][2][3] and $100,000 from the EA Infrastructure Fund.[4]

CLTR’s 2022 Annual Report[5] also refers to over £1 million from a private foundation (focused primarily on impact investing, the promotion of social responsibility, and grantmaking that benefits low- and middle-income countries), as well as $100,000 from The Powoki Foundation (which focuses on safeguarding humanity from global challenges, such as safely navigating synthetic biology and advanced artificial intelligence).

...