Introduction
The Peace and Conflict Science Institute (PACS Institute) is an independent, non-profit research organization dedicated to deepening our understanding of peace and its potential to create a better world. The Institute conducts interdisciplinary research at the intersection of philosophy, behavioral science, cognitive neuroscience, and political science. Because the Effective Altruism (EA) community shares the goal of reducing suffering and promoting well-being on a global scale, the PACS Institute is looking to integrate its work under the EA umbrella and to attract funding from EA funding bodies. This blog post seeks reflections, insights, and advice from the EA community on how best to achieve this integration.
Shared Goals and Interconnectedness
As Emil Wasteson, Executive Director of EA Sweden, has noted, peace studies and effective altruism share a common goal of reducing suffering and promoting well-being on a global scale. Both fields recognize the interconnectedness of global issues and the importance of considering the long-term consequences of our actions. Integrating the PACS Institute into the EA movement can help address problems such as reducing the risk of great power war, increasing collaboration on the safer development of AI and other emerging technologies, and helping more people out of poverty.
Peace Engineering and Longtermism
Anders Sandberg, Senior Research Fellow at the Future of Humanity Institute, has emphasized the potential of "peace engineering" to reduce the risk of conflict. Given the scale of human-caused disasters such as war and democide, interventions that reduce the harm caused by armed conflict can be high-impact. From a longtermist perspective, reducing the risk of great power war is also a high priority for the EA movement, since such a war could contribute to extinction or astronomical suffering.
Seeking Reflections and Advice
As the PACS Institute looks to integrate its work into the EA movement, we seek reflections and advice from the community on the following questions:
- How can the PACS Institute effectively communicate the alignment of its mission with the goals of the EA movement?
- What strategies can the PACS Institute adopt to attract funding from EA funding bodies?
- Are there any specific projects or research areas that the PACS Institute should prioritize to align with EA principles and focus areas?
- How can the PACS Institute collaborate with existing EA organizations and initiatives to maximize its impact?
- Are there any challenges or concerns that the PACS Institute should be aware of when integrating into the EA movement?
Conclusion
The PACS Institute is excited about the prospect of joining forces with the Effective Altruism community, as both share a common vision of creating a better, more peaceful world. By seeking reflections and advice from the EA community, the PACS Institute aims to strengthen its integration into the movement and work more effectively towards the shared goals of reducing suffering and promoting well-being on a global scale. Your insights and suggestions are invaluable in this endeavor, and we look forward to a fruitful collaboration.
Hi, welcome to the EA Forum.
I see that your post has received a mixed reception so far, with some people downvoting it. I don't know why they did this, but I'd guess it's because they perceived your post as motivated by money.
I suspect that integrating into the EA community will happen most effectively if you engage with EA thinking at an intellectual level. EAs notoriously love having their cherished beliefs challenged, and if you wrote some content which educated the community in the right ways, it would probably be very much appreciated. When I say "the right ways", I'm thinking of content which (a) rests on enough EA common knowledge to be meaningful to people in EA and (b) leverages things you know which the community might not know.
Examples which I suspect would go down well:
If you could make claims like these, and back them with careful argument, in a way which demonstrates understanding of EA thinking on cause prioritisation, then I would certainly consider you very much integrated into the EA movement, and I suspect others would too.
Yeah, I strong-downvoted because the post doesn't explain what the institute actually does; instead it pulls public quotes from EAs and tries to shoehorn those quotes onto the PACS Institute. The website is also very opaque. I feel like I got further away from understanding what peace engineering is while reading this post. The money thing I presumed was more an inelegant turn of phrase on the part of the author, but it did play a part in my downvoting.
I think some clear descriptions of what the poster's organization has accomplished, or at least of who it is affiliated with and some concrete near-term plans, would significantly help this post.
Thoughts:
a) Depending on your level of knowledge, you and/or your team may want to consider participating in EA Virtual Programs. Don't just accept the standard EA line on things; try to figure out what your actual beliefs are. I also recommend seeking out criticisms of EA when deciding whether EA is for you.
b) If you are interested in working on AI governance, I'd strongly recommend working through the AI Governance course.
c) Apply for the next EA conference, as this is one of the best ways to network with people you might want to collaborate with. Don't be disappointed if you aren't accepted to EA Global; unfortunately, lots of people who are doing great work aren't, as it tends to accept only people who are already high-context on EA. EAGx conferences tend to be more accessible, but definitely apply to both.
d) Have a look at this list of organisations and projects within Effective Altruism. Reach out to organisations with whom you might be interested in collaborating.
e) Post reflections on your work to the Forum. It's quite common for organizations seeking funding to post reflections on their work and their plans going forward.
“What strategies can the PACS Institute adopt to attract funding from EA funding bodies?” and “Are there any specific projects or research areas that the PACS Institute should prioritize to align with EA principles and focus areas?”
I think the big one would be prioritisation, either along the lines of EA’s “Importance, Tractability and Neglectedness” framework or by prioritising conflicts that pose an existential risk to humanity.
For example, you might focus your institution's work on the most unstable countries, where war or civil war is most likely to break out, or on countries like India and Pakistan, where conflict could quickly escalate to nuclear war.
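For readers less familiar with the framework, one common way of writing the ITN decomposition (roughly following 80,000 Hours' formulation) is:
$$\frac{\text{good done}}{\text{extra resources}} \;\approx\; \underbrace{\frac{\text{good done}}{\text{\% of problem solved}}}_{\text{importance}} \times \underbrace{\frac{\text{\% of problem solved}}{\text{\% increase in resources}}}_{\text{tractability}} \times \underbrace{\frac{\text{\% increase in resources}}{\text{extra resources}}}_{\text{neglectedness}}$$
Even rough scores on each factor, multiplied together, give a first-pass ordering of which conflicts or regions the institute's marginal effort is likely to go furthest on.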
“How can the PACS Institute collaborate with existing EA organizations and initiatives to maximize its impact?”
Perhaps getting in touch with the Nuclear Threat Initiative would work well - they focus more on WMDs than on conflict generally.
Thank you all for your insightful comments and feedback. They are invaluable as we navigate our integration into the Effective Altruism (EA) community.
We understand and appreciate the constructive criticism regarding our communication about the PACS Institute's work and mission. We recognize the importance of providing a clear and comprehensive description of our work, projects, and accomplishments to effectively situate ourselves within the EA community. We will be refining our messaging and providing more transparency about our work and its alignment with the principles of Effective Altruism.
Sanjay and zchuang, we acknowledge your concerns about the need for intellectual engagement and for contributing knowledge to the community. Our aim in integrating into the EA community is not solely to secure funding but to contribute to the shared goal of reducing global suffering and promoting well-being. We are interested in producing content that challenges and expands upon existing EA thinking, particularly in areas related to peace and security.
One of our most recent accomplishments is the publication of a groundbreaking research paper titled "Reframing the Ontology of Peace Studies" by Anders Reagan. This paper addresses a lack of ontological clarity in peace studies and proposes a radical reframing of the field as the multidisciplinary scientific study of the optimal social conditions for the continued evolution of the trait of sentience. These findings will form the basis for our advocacy work at the upcoming Human Rights Council meeting in Geneva this September.
Jason, we acknowledge your suggestion to provide clear descriptions of our accomplishments and affiliations. We understand that this information is essential for the EA community to assess our effectiveness and potential for impact. We will work on making this information more readily available.
Chris Leong, your advice on networking within the EA community and familiarizing ourselves with EA thinking and criticisms is well taken. We will certainly explore the EA Virtual Programs and the AI Governance course, and we look forward to participating in the next EA Conference.
Freedomandutility, your insights on prioritization and collaboration are valuable. We agree that focusing on conflicts with the potential for existential risk aligns well with EA principles. We will explore potential collaborations with organizations such as the Nuclear Threat Initiative.
We are excited about the prospect of contributing our expertise in peace and conflict studies to the EA community. We appreciate your patience and guidance in this process and look forward to continued engagement with all of you.