Value erosion refers to a process by which competitive dynamics could eventually lead to "the proliferation of forms of life (countries, companies, autonomous AIs) which lock-in bad values", if those forms of life outcompete others (Dafoe, 2020).
Dafoe, Allan (2019) Value erosion for FHI July 2019, July 25.
Bostrom, Nick (2004) The...
When human-level AI will emerge (AI timelines) is a consideration for both AI risk interventions and other interventions. For example, one's AI timelines affect how much to discount interventions that have a delayed effect.
This topic is for posts discussing Large Language Models (LLMs) -- for example, the GPT models produced by OpenAI.
AI safety | Artificial intelligence | AI governance | AI forecasting
Benchmarks are tests which enable us to measure the progress of AI capabilities, and test for characteristics which might pose safety risks.
BASALT: A Benchmark for Learning from Human Feedback - AI Alignment Forum
Misaligned Powerseeking — SERI ML Alignment Theory Scholars Program | Summer 2022
Proliferation is the spread of weapons, technologies, or things useful in producing weapons or technologies to actors who did not previously have those things. The term "proliferation" is especially associated with the spread of weapons of mass destruction - such as nuclear weapons - or of technologies or materials useful in producing them. The term diffusion...
Metagenomics is the analysis of genetic material obtained from an environmental sample.
Moorhouse, Fin & Luca Righetti (2022) Ajay Karpur on metagenomic sequencing, Hear This Idea, June 13.
Reasoning transparency is a form of transparency that prioritizes the sharing of information about underlying general thinking processes and philosophy and the communication of this information in ways that make it easier for the recipient to determine what updates to make in response to it.
Muehlhauser, Luke (2017) Reasoning transparency, Open...
A technology race is a competitive dynamic between rival teams to first develop a powerful technology, such as weapons of mass destruction, advanced artificial intelligence, or spaceflight capability.
The expression arms race is often used to describe technology races, though the expression is also used to describe a specific type of technology race...
The total view is a view in population ethics that regards one outcome as better than another if and only if it contains greater total wellbeing, regardless of whether it does so because people are better off or because there are more well-off people. It may be contrasted with person-affecting views.
MacAskill, William, Richard Yetter Chappell ...
A donation pledge is a commitment to give at least some specified amount of money over a specified period to charities or other donation opportunities. Some pledges target specific types of people. For example, the Giving What We Can 10% Pledge is a commitment people make to donate at least 10% of their income per year over their lifetimes, the Founders Pledge is a commitment entrepreneurs make to donate a percentage of their profits, and the Giving Pledge is a commitment very wealthy individuals make to donate the majority of their wealth.
There are various reasons for or against taking, or promoting, a donation pledge. For example, donation pledges might help people follow through on their altruistic intentions, rather than forgetting to do so or facing value drift. But it is also possible for a pledge to be too constraining, for example, if a person's financial situation changes, if they now want to take risks with their career and thus need more financial runway, or if the donation opportunities they want to give to are not within the scope of their pledge (e.g. if they are not registered charities). (Note, though, that it is often possible to resign from a pledge if your situation changes significantly.)
The Institute for Law & AI (LawAI) is an independent think tank that researches and advises on the legal challenges posed by artificial intelligence. LawAI conducts foundational research into issues at the intersection of artificial intelligence and law, with the goal of providing governments, industry actors, academics, other nonprofit organizations, and the public with a better understanding of some of the most important legal and policy questions in the field of AI governance.
The organisation was founded as the Legal Priorities Project in 2020 and later renamed the Institute for Law & AI. The idea for the project originated in an effective altruism group at Harvard Law School two years earlier.
O'Keefe, Cullen. "Chips for Peace: How the U.S. and Its Allies Can Lead on Safe and Beneficial AI." Institute for Law & AI, July 2024. https://law-ai.org/chips-for-peace/.
Bullock, Charlie, Suzanne Van Arsdale, Mackenzie Arnold, Cullen O'Keefe, and Christoph Winter. "Legal Considerations for Defining 'Frontier Model.'" Institute for Law & AI, September 2024. https://law-ai.org/frontier-model-definitions/.
Bullock, Charlie, Suzanne Van Arsdale, Mackenzie Arnold, Matthijs Maas, and Christoph Winter. "Existing Authorities for Oversight of Frontier AI Models." Institute for Law & AI, July 2024. https://law-ai.org/existing-authorities-for-oversight/.
Winter, Christoph, et al. "Legal Priorities Research: A Research Agenda." Legal Priorities Project, January 2021.
global priorities research | law
Survival and Flourishing Fund (2020) SFF-2021-H1 S-process recommendations announcement, Survival and Flourishing Fund.
EA Opportunities is a website that aims to help people find things they can do within the EA community.
Their main product is the "Opportunity Board", which features everything short of full-time positions. They also host lists of relevant resources and organizations.
EA Opportunities. Official website.
Hedonia. Invincible Wellbeing's podcast.
If you or someone you know is interested in making a major donation, please contact Longview's CEO, simran@longview.org, to explore whether Longview can help you.
Longview Philanthropy’s core priority is safely navigating emerging technology, with most of their grantmaking in the following focus areas:
In addition to these core focus areas, Longview has made substantial grants and recommendations in global priorities research, media, global health & development, and animal welfare.
Longview Philanthropy is led by Simran Dhaliwal (CEO) and Natalie Cargill (President & Founder). The team is based in London and across the US, and Longview has legal entities in both the UK & the US. Their operations are funded directly by donors who believe in their mission, including Open Philanthropy, Martin Crowley, Tom Crowley, Likith Govindaiah, Justin Rockefeller, Rafael Albert and several private philanthropists and foundations.
They manage specialised funds with distinct focus areas. Their public funds—the Emerging Challenges Fund and the Nuclear Weapons Policy Fund—are open to all donors. For major donors, they offer private funds featuring enhanced reporting and confidential insights.
Longview Philanthropy. Official website.
Public Funds: Emerging Challenges Fund & Nuclear Weapons Policy Fund.
Longview Philanthropy is an independent and expert-led philanthropic advisory for major donors who want to do the most good possible with their giving.
What does Longview Philanthropy do?
Longview's grantmaking and advisory teams deliver world-class philanthropic services—all at no cost. Whether you're beginning your giving journey or managing an established portfolio, Longview can help you create lasting impact.
Excellent grantmaking is the foundation of Longview's work. They identify highly impactful projects with crucial funding gaps overlooked by other funders or requiring broader support. They look for ways to multiply their impact, whether by attracting matching donations or providing seed funding and support to catalyse new initiatives.
For donors wishing to make large gifts, they offer access to grant recommendations drawn from their top opportunities. These concise analyses help donors find and fill the most critical funding gaps.
For major donors seeking to develop significant philanthropic portfolios, they provide a bespoke end-to-end service at no cost. This includes detailed analysis, expert-led learning series, residential summits, tailored strategic planning, grant recommendations, due diligence, and impact assessment.
effective altruism funding | existential risk | global priorities research | longtermism | longtermist institutional reform
Rethink Wellbeing was founded in early 2023 by Dr Inga Grossman[1] with the support of a team of four. It outperformed two other mental health-related pilot projects that the team tested the year before, one of which also received seed funding from the EAIF.[2] In its first year, Rethink Wellbeing successfully piloted its first flagship program, the CBT Lab, indicating effectiveness and demand within the target group. They are now focused on scaling their services to reach more ambitious altruists and further increase their cost-effectiveness.
PauseAI is an organization that advocates for a global moratorium on AI development.
Rethink Wellbeing (RW) offers proven, engaging, and affordable programs to nurture mental wellbeing and resilience at scale. Rethink Wellbeing acts as an impact multiplier: their programs are specially designed to help ambitiously altruistic individuals who are dedicated to improving the world feel and perform better, supercharging their ability to do good work.
As of October 2024, Rethink Wellbeing has supported over 200 ambitious altruists. They have four employees in the core team and a network of 50+ volunteers and advisors.[3]
Rethink Wellbeing facilitates 8-16-week guided online mental health programs for ambitious altruists, including The CBT Lab and The IFS Lab – both group-based peer support formats guided by trained laypeople. In each program, participants learn to apply evidence-based psychotherapeutic and behaviour change methods to improve their mental wellbeing and thereby their productivity.
~10% of money and time invested in EA is lost to poor mental health:
Key outcomes of the 2024 CBT Lab Peer Support program included productivity gains equivalent to 8 additional weekly working hours and a greater wellbeing increase than from becoming partnered or finding employment. If you would like to use Rethink Wellbeing’s data to independently verify the results, or for your own research, please reach out.
The research is longitudinal and employs gold-standard methods used in clinical...
Similar to last year (2023), the Donation Election is an event where Forum users can add to a pot of money (the "Donation Election Fund"), and then vote on how we allocate it. CEA will match the first $5000 donated.
Giving What We Can (GWWC) is an organisation dedicated to inspiring and supporting people to give more, and give more effectively.
Giving What We Can promotes three donation pledges, the most prominent of which is the 10% Pledge, a commitment to donate at least 10% of one's income each year in the way one thinks will achieve the most good. As of December 2024, this pledge has been signed by over 9,000 people.
In November 2009, Toby Ord and Will MacAskill launched Giving What We Can as an international community of people who were committed to giving more, and giving more effectively. The Centre for Effective Altruism was incorporated in 2011 as a registered charity and an umbrella organisation for Giving What We Can and the then newly founded 80,000 Hours. Giving What We Can has since spun out and, as of late August 2024, is its own independent legal entity.
Giving What We Can is funded by a combination of direct donations from members and individuals, as well as grants. As of 2024, Giving What We Can's largest funder is Open Philanthropy. Read more on their transparency page.