
Applications will be evaluated on a rolling basis. All applications must be submitted by Friday, March 22nd, 2024, 11:59 pm GMT.

CEA is hiring a head of communications. While a successful candidate would ideally have a strong communications background, we’re open to applications from generalists with strong foundational skills who can build a team with additional expertise. This is a senior leadership position reporting to the CEO. The remit of the role is broad, including developing and executing communications strategies for both CEA and effective altruism more broadly. We anticipate that this individual will become the foremost leader for strategic communications related to EA and will have a significant impact in shaping the field's strategy. This will include collaborating with senior leaders at other organizations doing EA-related work.

Both EA and CEA are at important inflection points. Public awareness of EA has grown significantly over the past two years, during which time EA has had both major successes and significant controversies. To match this growth in awareness, we're looking to increase our capacity to inform public narratives about EA and contribute to a more accurate understanding of its ideas and impact. The stakes are high: Success could result in significantly higher engagement with EA ideas, leading to career changes, donations, new projects, and increased traction in a range of fields. Failure could result in long-lasting damage to the brand, the ideas, and the people who have historically associated with them.

We’re looking for a leader who can design and execute a communications strategy for EA. This person will be a member of and strategic partner to CEA’s leadership team, helping us shape both the substance and messaging of EA. You’ll be able to build on the foundation set by our existing team and the work of our outgoing head of communications, growing the team, which currently includes one full-time staff member and support from an external agency. CEA has a new CEO, who is in the process of developing a new organizational strategy and views strengthening our communications function as a key priority. You should expect significant organizational support—e.g. attention from senior leadership and the allocation of necessary financial resources.

Comments

Given my recent experience with nitpicking about linkposts, the new head of communications should devote some time to improving debate on the EA Forum, to make it a more positive and encouraging place. People are just going to leave otherwise. There needs to be a more positive communication culture. And more moderators, perhaps, with a modicum of emotional intelligence. I find the EA Forum lacks that.
