Here's our update for September 2016. It was a great month for growth at CEA. In September, we got 106 new Giving What We Can members and 1,748 new EA Newsletter signups; in both cases that's nearly double the previous three-month average.

If anyone has questions, comment below and I'll check in in a couple of days to respond.

 

Metrics

 

New members per month over the last year.

The final spike on the right is the 106 new members in September. This is the highest number of pledges in any month outside of giving season (the highest ever is 129 in December 2014).

 

 

(The dip after September 2016 is because October is not yet complete.)




New newsletter subscribers by month

Growth in the number of newsletter subscribers has also been strong; September was our best month ever for signups.





Community and Outreach Division



Online Marketing

This month we’ve been experimenting with multiple strategies designed to get Giving What We Can Pledges and signups for the EA Newsletter. This has been very successful and as a result we’re considering focusing more on Facebook promotion, email follow up to people considering the pledge, and Twitter promotion. The top three last click conversion sources for pledges were 80,000 Hours, Facebook and effectivealtruism.org.

 

Community, Chapters and Events

We’ve begun planning EA Global 2017 and we’ll be opening applications for EAGx 2017 shortly. We’ve also launched a new high-touch chapters program, providing more in-depth support to a pilot group of student chapters. Chapter leaders interested in knowing more should contact chapters@centreforeffectivealtruism.org. Julia has been working on developing strategies designed to foster cooperative and friendly behaviour in the EA community.

 

Networking and Media

We met with several major philanthropists to advise them on their giving, including Sir Tom Hunter, Scotland’s first billionaire, who is donating almost all his wealth to projects in Rwanda and Scotland.

 

Will discussed effective altruism on Sam Harris’s podcast, which has so far been listened to over 350,000 times. He also appeared in Swedish national media.

 

Y Combinator

We applied to Y Combinator, the leading Silicon Valley startup incubator, which has produced the likes of Airbnb, Dropbox and Reddit. 80,000 Hours went through the program in the summer of 2015 and found it exceptionally useful for sharpening the organisation's focus and helping it grow, and we're hoping it will help us in the same way. On 28 October we'll hear whether we've got an interview, which would take place in early November. We think that our chances are about 50/50.



Special Projects Division

 

Philanthropic Advising

The philanthropic advising team has focused on exploring new funding opportunities, including in mental health, tobacco regulation, and research funding.

 

Policy

Sebastian Farquhar has put the finishing touches on a forthcoming report for the Finnish Government in Helsinki on the international community's possible responses to existential risk, and, with Toby Ord, has been advising UK government officials on a variety of areas.

 

Research

The research team have begun working through the list of high-priority projects they compiled and are building a new research webpage, which we hope to unveil by the end of the year.

 

EA Institute

We submitted our first grant proposal, to the John Templeton Foundation, in collaboration with the University of York and the Centre for Ethics, Philosophy and Public Affairs at St Andrews.

 

Comments (14)



What does Giving What We Can spend its time on? I realized that I am unable to answer this question even loosely, which is unusual. The Our Team page (https://www.givingwhatwecan.org/about-us/team/) helps, but parts are still unclear. More generally, I'm not sure what activities GWWC does that cause more GWWC pledges, which is what it bases its impact on.

What percentage of GWWC pledges do you think were caused by GWWC (or CEA as a whole)? I'm particularly interested in measurements that don't rely on self-reports. So far, I've only seen statistics about self-reports by people who took the pledge, which were used to get a 60:1 multiplier.

Hi Rohin, great question! Since Giving What We Can outreach is now managed by the wider Community and Outreach team at CEA, it might make sense to speak in terms of our team as a whole.

Based on the figures we have, it seems reasonable that CEA activities were at least partially responsible for something like 70% of new pledges in September. We experimented with a number of new strategies to get additional pledges in September, including optimising our email and social media campaigns and running a Facebook retargeting campaign. We also tried to reduce the time before following up with people who had expressed some interest in the pledge, whether at 80,000 Hours workshops, at EA Global, or through engagement with websites we run (effectivealtruism.org etc.).

More specifically, we looked at our Google Analytics to see how members are reaching the join page on the GWWC site. Out of the 106 who joined in September, 15 came from Facebook, both through our posts and through ads we ran. 12 came from the new effectivealtruism.org site, 3 of whom arrived after viewing the cause prioritisation flowchart (originally created by GPP). 11 were directed from emails; these were mostly people we followed up with who had previously started filling out signup forms but hadn’t completed them. 6 were directed from an article on giving and happiness which was written several years ago by a former staff member, Andreas Mogensen, for the GWWC blog. In terms of the wider CEA, 17 came from 80000hours.org, 9 from www.effectivealtruism.com (which is currently the Doing Good Better book website), and 4 from Sam Harris’s podcast where he interviewed Will MacAskill.
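For readers who want to check the arithmetic, here is a minimal Python sketch tallying the last-click figures quoted above; the grouping of channels is an illustration rather than an official attribution model, and the remaining pledges came from sources not broken out in this comment.

```python
# Tally of the last-click pledge sources quoted in the comment above
# (all figures come from that comment; nothing here is new data).
sources = {
    "Facebook (posts and ads)": 15,
    "effectivealtruism.org": 12,
    "Email follow-ups": 11,
    "GWWC blog article on giving and happiness": 6,
    "80000hours.org": 17,
    "effectivealtruism.com (Doing Good Better site)": 9,
    "Sam Harris podcast": 4,
}

total_pledges = 106  # new members in September
attributed = sum(sources.values())  # 74

print(f"Attributed pledges: {attributed} of {total_pledges}")
print(f"Attributed share: {attributed / total_pledges:.0%}")  # roughly 70%, matching the figure above
```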

Thanks!

Side note: is there an easy way to get push notifications, perhaps through email, when there's a reply to a comment or post you wrote?

We have added this as a potential future project for EA Forum development. See https://github.com/tog22/eaforum/issues/65

is there an easy way to get push notifications, perhaps through email, when there's a reply to a comment or post you wrote?

There's an RSS feed [1] and there are RSS-to-email services [2].

[1] http://effective-altruism.com/message/inbox.rss

[2] a quick search turns up https://blogtrottr.com/

I just realized: there's no way that rss feed can work, because it needs to be authenticated with your cookies. Sorry!

Okay, that makes sense. I ran into that issue fairly quickly and thought there might be a workaround but tabled that to look at later.
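For anyone who still wants to hack together email notifications despite the cookie issue, here's a minimal sketch of what an authenticated RSS-to-email poller could look like. The cookie name, SMTP host and email addresses below are placeholders made up for illustration; you'd need to copy your real session cookie from a logged-in browser, run the script on a schedule (e.g. via cron), and whether the feed actually accepts a cookie passed this way is untested.

```python
# Hypothetical sketch: poll the (cookie-authenticated) inbox RSS feed and
# email yourself any new entries. Cookie name, SMTP host and addresses are
# placeholders, not details from the original thread.
import smtplib
from email.message import EmailMessage

import feedparser  # pip install feedparser
import requests    # pip install requests

FEED_URL = "http://effective-altruism.com/message/inbox.rss"
# Placeholder: copy the real cookie name/value from your logged-in browser.
SESSION_COOKIE = {"session": "<session cookie value>"}


def fetch_new_entries(seen_ids):
    """Fetch the inbox feed with the session cookie; return unseen entries."""
    response = requests.get(FEED_URL, cookies=SESSION_COOKIE, timeout=30)
    response.raise_for_status()
    feed = feedparser.parse(response.text)
    return [e for e in feed.entries if e.get("id", e.get("link")) not in seen_ids]


def email_entry(entry, smtp_host, sender, recipient):
    """Send a plain-text notification email for one feed entry."""
    msg = EmailMessage()
    msg["Subject"] = f"EA Forum reply: {entry.get('title', '(no title)')}"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(entry.get("link", ""))
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)


if __name__ == "__main__":
    seen = set()  # in practice, persist this between runs
    for entry in fetch_new_entries(seen):
        email_entry(entry, "localhost", "me@example.com", "me@example.com")
        seen.add(entry.get("id", entry.get("link")))
```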

[anonymous]:

Congratulations to all concerned. As a member of the EA community in West Yorkshire I am really pleased to see the collaboration with the University of York. I hope the bid to the John Templeton Foundation is successful.

If GWWC continues as an internal brand within the Community & Outreach Division, how open is CEA to taking donations earmarked for GWWC rather than for general operations funding? I ask because some donors may evaluate GWWC as substantially more impactful than CEA's other activities.

My current plan is that, to a first approximation, we won't accept restricted donations, including to GWWC. (It's a fiction that truly restricted donations are possible, anyway.) But we will give donors the chance to express their preferences about how the money is to be used, which we'll consider in the aggregate when making strategic decisions. If donors think we're making major mistakes in the allocation of resources between different activities, I'd love to see that written up; it would be very helpful to us.

I'm confused about the ongoing status of GPP. Has GPP been fully folded under the "Special Projects Division", or is that happening in the near future? Also, my impression was that Special Projects would fill the role of GPP and GWWC's original research. Is that the case? Also, will Owen and Seb be continuing on with Special Projects and its work, or focusing on FHI's research priorities?

GPP has been fully folded under Special Projects. GPP had two tracks: policy research and outreach, and fundamental EA theory. These now have their own distinct teams under the Special Projects division. The third team is philanthropic advising, which was previously under GWWC. Owen and Seb are continuing with Special Projects.

[anonymous]:

I think this is wise given the complexity of GPP's core research agenda, but I really like the branding and identity of the project and the prominence it gives to research effectiveness as a critically important idea. I see it as being potentially analogous to what the Fraunhofer-Gesellschaft does for innovation in Germany in that it could turn a vague concept into a strategic process.
