Leverage Research: reviewing the basic facts

by throwaway · 3rd Aug 2018 · 97 comments

Criticism (EA Orgs)

Resources spent

  • Leverage Research has now existed for over 7.5 years [1]
  • Since 2011, it has consumed over 100 person-years of human capital.
  • From 2012-16, Leverage Research spent $2.02 million, and the associated Institute for Philosophical Research spent $310k. [2][3]

Outputs

Some of the larger outputs of Leverage Research include:

  • Work on Connection Theory: although this does not include the initial creation of the theory itself, which was done by Geoff Anders prior to founding Leverage Research
  • Contributions to productivity of altruists via the application of psychological theories including Connection Theory
  • Intellectual contributions to the effective altruism community: including early work on cause prioritisation and risks to the movement.
  • Intellectual contributions to the rationality community: including CFAR’s class on goal factoring
  • The EA Summits in 2013-14: The EA summit is a precursor to EA Global, which is being revived in 2018

Its website also has seven blog posts. [4]

Recruitment Transparency

  • Leverage Research previously organized the Pareto Fellowship in collaboration with another effective altruism organization. According to one attendee, Leverage staff secretly discussed attendees using an individual Slack channel for each.
  • Leverage Research has provided psychology consulting services using Connection Theory, leading it to obtain mind-maps of a substantial fraction of its prospective staff and donors, based on reports from prospective staff and donors.
  • The leadership of Leverage Research have on multiple occasions overstated their rate of staff growth by more than double, in personal conversation.
  • Leverage Research sends staff to effective altruism organizations to recruit specific lists of people from the effective altruism community, as is apparent from discussions with and observation of Leverage Research staff at these events.
  • Leverage Research has spread negative information about organisations and leaders that would compete for EA talent.

General Transparency

  • The website of Leverage Research has been excluded from the Wayback Machine [5]
  • Leverage Research has had a strategy of using multiple organizations to tailor conversations to the topics of interest to different donors.
  • Leverage Research had longstanding plans to replace itself with one or more new organizations if the reputational costs of the name "Leverage Research" ever became too severe. A substantial number of staff of Paradigm Academy were previously staff of Leverage Research.

General Remarks

Readers are encouraged to add additional facts known about Leverage Research in the comments section, especially where these can be supported by citation or direct conversational evidence.

Citations

1. https://www.lesswrong.com/posts/969wcdD3weuCscvoJ/introducing-leverage-research

2. https://projects.propublica.org/nonprofits/organizations/453989386

3. https://projects.propublica.org/nonprofits/organizations/452740006

4. http://leverageresearch.org/blog

5. https://web.archive.org/web/*/http://leverageresearch.org/


[My views only]

Although few materials remain from the early days of Leverage (I am confident they acted to remove themselves from wayback, as other sites link to wayback versions of their old documents which now 404), there are some interesting remnants:

  • A (non-wayback) website snapshot from 2013
  • A version of Leverage's plan
  • An early Connection Theory paper

I think this material (and the surprising absence of material since) speaks for itself - although I might write more later anyway.

Per other comments, I'm also excited by the plan of greater transparency from Leverage. I'm particularly eager to find out whether they still work on Connection Theory (and what the current theory is), whether they addressed any of the criticism (e.g. 1, 2) levelled at CT years ago, whether the further evidence and argument mentioned as forthcoming in early documents and comment threads will materialise, and generally what research (on CT or anything else) they have done in the last several years, and when this will be made public.

An article in Splinter News was released a few days ago, showing leaked emails in which Jonah Bennett, a former Leverage employee who is now editor-in-chief of Palladium Magazine (LinkedIn), was involved with a white nationalist email list, where among other things he made anti-Semitic jokes about a Holocaust survivor, said he "always has illuminating conversations with Richard Spencer", and complained about someone being "pro-West before being pro-white/super far-right".

I have 35 mutual friends with this guy on Facebook, mostly EAs. This makes me think that while at Leverage he interacted a reasonable amount with the EA community. (Obviously, I expect my EA mutual friends to react with revulsion to this stuff.)

Bennett denies this connection; he says he was trying to make friends with these white nationalists in order to get information on them and white nationalism. I think it's plausible that this is somewhat true. In particular, I'd not be that surprised if Bennett is not a fan of Hitler, and if he made racist jokes more to fit in. But I'd be pretty surprised if it turned out that he didn't have endorsed explicitly racist views--this seems ... (read more)

"which makes me think that it's likely that Leverage at least for a while had a whole lot of really racist employees."

"Leverage" seems to have employed at least 60 people at some time or another in different capacities. I've known several (maybe met around 15 or so), and the ones I've interacted with often seemed like pretty typical EAs/rationalists. I got the sense that there may have been a few people there interested in the neoreactionary movement, but also got the impression the majority really weren't.

I just want to flag that I really wouldn't want EAs generally to think that "people who worked at Leverage are pretty likely to be racist," because this seems quite untrue and quite damaging. I don't have much information about the complex situation that is Leverage, but I do think that the sum of the people ever employed by them still holds a lot of potential. I'd really not want them to get or feel isolated from the rest of the community.

Ok actually I just reread this comment and now I think that the thing you quoted me as saying is way too strong. I am confused by why I wrote that.

Yep, understood, and thanks for clarifying in the above comment. I wasn't thinking you thought many of them were racist, but did think that at least a few readers may have gotten that impression from the piece.

There isn't too much public discussion on this topic and some people have pretty strong feelings on Leverage, so sadly sometimes the wording and details matter more than they probably should.

Yeah, I don't think that people who worked at Leverage are pretty likely to be racist.

Hi Buck, Ozzie and Greg,

I thought I’d just add some data from my own experience.

For context, I’ve been heavily involved in the EA community, most recently running CEA. After I left CEA, I spent the summer researching what to do next and recently decided to join the Leverage Research team. I’m speaking personally here, not on behalf of Leverage.

I wanted to second Ozzie’s comment. My personal experience at least is that I’ve found the Leverage and Paradigm teams really welcoming.

They do employ people with a wide range of political views with the idea that it helps research progress to have a diversity of viewpoints. Sometimes this means looking at difficult topics and I’ve sometimes found it uncomfortable to try and challenge why I hold different viewpoints but I’ve always found that the focus is on understanding ideas and the attitude to the individual people one of deep respect. I’ve found this refreshing.

I wanted to thank Ozzie for posting this in part because I noticed reticence in myself to saying anything because my experience with conversations about Leverage is that they can get weird and personal quite fast. I know people who’ve posted positive things about Leverage on the E

... (read more)

Hello Larissa,

I'd be eager to see anything that speaks to Leverage's past or present research activity: what have they been trying to find out, what have they achieved, and what are they aiming for at the moment (cf.).

As you know from our previous conversations re. Leverage, I'm fairly indifferent to 'they're shady!' complaints (I think if people have evidence of significant wrongdoing, they should come forward rather than briefing adversely off the record), but much less so to the concern that Leverage has achieved extraordinarily little for an organisation with multiple full-time staff working for the better part of a decade. Showing something like, "Ah, but see! We've done all these things," or, "Yeah, 2012-16 was a bit of a write-off, but here's the progress we've made since", would hopefully reassure, but in any case be informative for people who would like to have a view on Leverage independent of which rumour mill they happen to end up near.

Other things I'd be interested to hear about are what you are planning to work on at Leverage, and what information you investigated which - I assume - leads you to a much more positive impression of Leverage than I take the public evidence to suggest.

Hi Greg,

Thanks for the message and for engaging at the level of what has Leverage achieved and what is it doing. The tone of your reply made me more comfortable in replying and more interested in sharing things about their work so thank you!

Leverage are currently working on a series of posts that are aimed at covering what has been happening at Leverage from its inception in 2011 up until a recent restructure this year. I expect this series to cover what Leverage and associated organisations were working on and what they achieved. This means that I expect Leverage to answer all of your questions in a lot more depth in the future. However, I understand that people have been waiting a long time for us to be more transparent so below I have written out some more informal answers to your questions from my understanding of Leverage to help in the meantime.

Another good way to get a quick overview of the kinds of things Leverage has been working on beyond my notes below is by checking out this survey that we recently sent to workshop participants. It’s designed for people who’ve engaged directly with our content so it won’t be that relevant for people to fill in necessarily but it gives ... (read more)

Just wanted to say I super appreciated this writeup.

Thanks Raemon :-) I'm glad it was helpful.

anonymoose (1y): Thanks Larissa - the offer to write up posts from Leverage Research is generous. Might it not be a more efficient use of your time, though, to instead answer questions about Leverage already in the public domain, many of which are fairly straightforward?

For example, you mention that Leverage is welcoming to new staff. This sounds positive - at the same time, the way Leverage treated incoming staff is one of the main kinds of fact discussed in the top-level post. Is it still true that: (i) staff discuss recruitees on individual Slack channels, (ii) mind-mapping is used during recruitment of staff, (iii) growth rates are overestimated, (iv) specific lists of attendees are recruited from EA events, and (v) negative rumours are spread about other organizations that might compete for similar talent? To the extent that you are not sure about (i-v), it would be interesting to know whether you raised those concerns with Geoff in the hiring process, before joining the organization.

For other questions raised by the top-level post: (a) are Leverage's outputs truly as they appear? (b) Is its consumption of financial resources and talent as it appears? (c) Has it truly gone to such efforts to conceal its activities as described under the general transparency section? (d) How will Leverage measure any impact from its ninth year of operation?

From another post [https://forum.effectivealtruism.org/posts/kAgeY6nmpC6pzMbpM/to-what-extent-is-paradigm-academy-a-front-organization-for]: (e) How many of the staff at Leverage Research are also affiliated with Paradigm Academy? (f) How much of the leadership of Leverage Research is also playing a leading role at Paradigm Academy?

Since your investigations give you much more information on this topic than is available to an interested outsider, it should be very easy for you to help us out with these questions.

Hi Anonymoose,

I’d like to do two things with my reply here.

  1. First, to try and answer your questions as best I can.
  2. Second, to start to work out how to make future conversations with you about Leverage more productive.

1. ANSWERING YOUR QUESTIONS

I’d recommend first reading my recent reply to Greg because this will give you a lot of relevant context and answers some of your questions.

Questions a, b and d: outputs, resources and future impact

Your Questions:

“(a) are Leverage's outputs truly as they appear?”
“(b) Is its consumption of financial resources and talent, as it appears?”
“(d) How will Leverage measure any impact from its ninth year of operation?”

In terms of questions a, b and d, I will note the same thing as I said in my reply to Greg which is that we’re currently working both on a retrospective of the last eight and a half years of Leverage and on updating Leverage’s existing website. I think these posts and updates will then allow individuals to assess for themselves

  1. our past work and outputs
  2. whether it was worth the resources invested
  3. our plans for the future

For now, though, the sections "What did Leverage 1.0 work on?" and "What is Leverage doing now?" in my reply to Greg... (read more)

Hi Buck,

For anyone that isn’t aware, I’m the founder and Executive Director of Leverage Research and as such wanted to reply to your comment.

First, I want to challenge the frame of this message. Messages like these, however moderately they are intended or executed, pull up a larger political context of attacks. People start asking the question “who should we burn” and then everyone scrambles to disavow everyone else so that they themselves don’t get burned.

I’m against disavowal and burning, at least in this case. My reaction if I found out that Jonah was “officially racist” by whatever measures would be to try to talk to him personally and convince him that the ideas were wrong. If I thought he was going to do something horrible, I’d oppose him and try to stop him. I think that disavowal and burning is a really bad way to fight racism because it pushes it underground without addressing it, and I’m not interested in getting public applause or doing short-sighted PR mitigation by doing something that is superficially good and actually harmful.

In terms of Jonah’s views, Jonah is a public figure and as such should speak for himself. He wrote a reply to the Splinter piece here: https://... (read more)

As to other questions relating to Leverage, EA, funding- and attention-worthiness, etc., I’ve addressed some concerns in previous comments and I intend to address a broader range of questions later. I don’t however endorse attack posts as a discussion format, and so intend to keep my responses here brief. The issues you raise are important to a lot of people and should be addressed, so please feel free to contact me or my staff via email if it would be helpful to discuss more.

[Own views]

If an issue is important to a lot of people, private follow-ups seem a poor solution. Even if you wholly satisfy Buck, he may not be able to relay what reassured him to all concerned parties, and there will thus likely be duplication of effort on your part as each reaches out individually.

Of course, this makes more sense as an ill-advised attempt to dodge public scrutiny - better for PR if damning criticism remains in your inbox rather than on the internet-at-large. In this, alas, Leverage has a regrettable track record: you promised 13 months ago to write something within a month to better explain Leverage, only to make a much more recent edit (cf.) that you've "changed your plans" and enco... (read more)

I also hope your faith in Bennett is well-placed, that whatever mix of vices led him to write vile antisemitic ridicule on an email list called 'morning hate' in 2016 bears little relevance to the man he was when with Leverage in ~2018, or the man he is now.

Perhaps it'd be helpful for Bennett to publish a critique of alt-right ideas in Palladium Magazine?

  • In Bennett's statement on Medium, he says now that he's Catholic, he condemns the views he espoused. If that's true, he should be glad to publish a piece which reduces their level of support.
  • Since he used to espouse those views, he has intimate understanding of the psychology of those who hold them. So a piece he edits could help deconvert/deradicalize people more effectively than a piece edited by an outsider. And whatever persuaded him to abandon those views might also work on others.

Bennett might complain that publishing such a piece would put him in an impossible bind, because any attempt to find common ground with alt-righters, and explain what originally drew him to the movement to do effective deconversion, could be spun as "Jonah Bennett doubles down on alt-right ideology" for click... (read more)

Milan_Griffes (1y): Wow, I didn't know about this. Thank you for drawing attention to it.

Buck (1y): [I wrote an addendum to this comment, but then someone pointed out that it was unclear, so I deleted it]

Note: I was previously CEO of CEA, but stepped down from that role about 9 months ago.

I've long been confused about the reputation Leverage has in the EA community. After hearing lots of conflicting reports, both extremely positive and negative, I decided to investigate a little myself. As a result, I've had multiple conversations with Geoff, and attended a training weekend run by Paradigm. I can understand why many people get a poor impression, and question the validity of their early stage research. I think that in the past, Leverage has done a poor job communicating their mission, and relationship to the EA movement. I'd like to see Leverage continue to improve transparency, and am pleased with Geoff's comments below.

Despite some initial hesitation, I found the Paradigm training I attended surprisingly useful, perhaps even more so than the CFAR workshop I attended. The workshop was competently run, and content was delivered in a polished fashion. I didn't go in expecting the content to be scientifically rigorous, most self improvement content isn't. It was fun, engaging, and useful enough to justify the time spent.

Paradigm is now running the EA summit. I know Mindy and Peter,... (read more)

I don't think that Leverage, Paradigm or related projects are good use of EA time or money

Found this surprising given the positive valence of the rest of the comment. Could you expand a little on why you don't think Leverage et al. are a good use of time/money?

I think their approach is highly speculative, even if you were to agree with their overall plan. I think Leverage has contributed to EA in the past, and I expect them to continue doing so, but this alone isn't enough to make them a better donation target than orgs like CEA or 80K.

I'm glad they exist, and hope they continue to exist, I just don't think Leverage or Paradigm are the most effective things I could be doing with my money or time. I feel similarly about CFAR. Supporting movement building and long-termism is already meta enough for me.

Interesting. I don't usually conflate "good use" with "most effective use."

Seems like "not a good use" means something like "this project shouldn't be associated with EA."

Whereas "not the most effective use" means something like "this project isn't my best-guess about how to do good, but it's okay to be associated with EA."

Perhaps this is just semantics, but I'm genuinely not sure which sense you intend.

Evan_Gaensbauer (2y): As someone whose experience, as an outsider to Leverage who has not done paid work for any EA organizations in the past, is similar to Tara's, I can corroborate her impression. I've not been in the Bay Area or had a volunteer or personal association with any EA organizations located there since 2014. Thus, my own investigation was from afar, following the spread-out info on Leverage available online, including past posts regarding Leverage on LW and the EA Forum, and online conversations with former staff, interns and visitors to Leverage Research. The impression I got from what is probably a very different data-set than Tara's is virtually identical. Thus, I endorse hers as a robust yet fair characterization of Leverage Research.

I've also heard from several CFAR workshop alumni myself that they found the Paradigm training they received more useful than the CFAR workshop they attended. A couple of them also noted their surprise at this impression, given their trepidation knowing Paradigm sprouted from Leverage, what with their past reputation. A confounding factor in these anecdotes is that the CFAR workshops my friends and acquaintances had attended were from a few years ago, and in that time those same people revisiting CFAR, and more recent CFAR workshop alumni, remark how different and superior CFAR's more recent workshops have been to the earlier ones. Nonetheless, the impression I've received is of nearly unanimous positive experiences at Paradigm workshops from attendees in the EA movement, competitive in quality with CFAR workshops, which have years of troubleshooting and experience on Paradigm.

I want to clarify that the CEA has not been alone in movement-building activities; the CEA has ongoing associations with the Local Effective Altruism Network (LEAN) and the Effective Altruism Foundation, out of the German-speaking EA world, on movement-building activities.
Paradigm Academy's staff, in seeking to kickstart grassroots movement-building efforts in

I was interviewed by Peter Buckley and Tyler Alterman when I applied for the Pareto fellowship. It was one of the strangest, most uncomfortable experiences I've had over several years of being involved in EA. I'm posting this from notes I took right after the call, so I am confident that I remember this accurately.

The first question asked about what I would do if Peter Singer presented me with a great argument for doing an effective thing that's socially unacceptable. The argument was left as an unspecified black box.

Next, for about 25 minutes, they taught me the technique of "belief reporting". (See some information here and here). They made me try it out live on the call, for example by making me do "sentence completion". This made me feel extremely uncomfortable. It seemed like unscientific, crackpot psychology. It was the sort of thing you'd expect from a New Age group or Scientology.

In the second part of the interview (30 minutes?), I was asked to verbalise what my system one believes will happen in the future of humanity. They asked me to just speak freely without thinking, even if it sounds incoherent. Again it felt extremely cultish. I expected this to l... (read more)

I had an interview with them under the same circumstances and also had the belief reporting trial. (I forget if I had the Peter Singer question.) I can confirm that it was supremely disconcerting.

At the very least, it's insensitive - they were asking for a huge amount of vulnerability and trust in a situation where we both knew I was trying to impress them in a professional context. I sort of understand why that exercise might have seemed like a good idea, but I really hope nobody does this in interviews anymore.

  • Leverage Research spent a further $388k in 2017.
  • At least 11 of 13 Paradigm Academy staff listed on Linkedin are known to have worked for Leverage Research or allied organizations.
  • The coin made by Reserve (one of the successor companies to Leverage Research) has returned -32.7% since its float at the time of writing. In the same time period, bitcoin returned 24%.
anonymoose (1y): Reserve has now lost 50.6% of its value [https://coinmarketcap.com/currencies/reserve-rights/] since its float, while Bitcoin has returned ~1% over the same time period.

So, after I read this comment I left thinking that Reserve performed exceptionally poorly, but it seems that almost all cryptocurrencies have gone down about the same amount since June 19th (the time of Reserve's launch, from what I can tell). Here are some random currencies that I clicked on at the coinmarketcap website you linked. This is a comprehensive list, so I report the price change since June 19th for every currency that I looked at:

  • Bitcoin Cash:
    • June 19th price: $416
    • Price now: $244
    • Change: -41.3%
  • XRP:
    • June 19th price: $0.448
    • Price now: $0.25
    • Change: -44.1%
  • Litecoin:
    • June 19th price: $135
    • Price now: $55
    • Change: -59.2%
  • Monero
    • June 19th price: $100
    • Price now: $58
    • Change: -42%

You are also incorrect that Bitcoin has returned 1% over the same time period. On June 19th, the price of Bitcoin was $9273, and it now is $8027. So while you are correct that Bitcoin went down significantly less than Reserve, it performed drastically better than almost all other cryptocurrencies, and still went down by about 13%.

I don't think Reserve is overall a super great idea, but I think the statistics you cited seem misleading to me, and it seems that Reserve overall is performing similarly to the rest of the non-Bitcoin crypto-market.
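The percentage changes in this thread are easy to check. A quick sketch, using the rounded prices quoted in the comment above (so the last decimal can differ slightly from figures computed on more precise exchange data):

```python
# Percentage change between the two price snapshots quoted above
# (June 19th vs. "now"). Inputs are the rounded prices from the
# comment, so results may differ slightly from the quoted figures.
def pct_change(start: float, end: float) -> float:
    return (end - start) / start * 100

prices = {
    "Bitcoin Cash": (416, 244),
    "XRP": (0.448, 0.25),
    "Litecoin": (135, 55),
    "Monero": (100, 58),
    "Bitcoin": (9273, 8027),
}

for name, (start, end) in prices.items():
    print(f"{name}: {pct_change(start, end):+.1f}%")
```

Run on these inputs, Bitcoin comes out at about -13.4% while the altcoins cluster between roughly -40% and -60%, consistent with the claim that Bitcoin fell far less than the rest of the market.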

after I read this comment I left thinking that Reserve performed exceptionally poorly but...

Your initial impression was correct. Reserve has entered a terrible market and managed to perform substantially worse than its terrible competitors. Since May 24, when Reserve Rights was priced:

  • the S&P gained 14%,
  • cryptocurrency at large lost 17%,
  • cryptocurrencies excluding Bitcoin lost 33%,
  • while Reserve Rights managed to lose 52.3%.
You are also incorrect that Bitcoin has returned 1% over the same time period.

Reserve Rights was floated on May 24 according to CoinMarketCap, at which time Bitcoin was worth $7800-$7950, and it is now worth the same amount, so the error must be either with you, or with CoinMarketCap.

Habryka (1y): I used Jun 19th, because that was the first date with a market cap available, which seemed like the most reasonable date to start. So that likely explains the discrepancy.

Larks (1y): I don't know much about it, but isn't Reserve meant to be a stablecoin? If so, any change in value seems significantly worse than for other coins.

I also don't know much about it, but I think Reserve includes a couple of coins. 'Reserve Rights' is not intended to be a stablecoin (I think it is meant to perform some function for the stablecoin system, but I'm ignorant of what it is), whilst 'Reserve', yet to be released, is meant to be stable.

Milan_Griffes (1y): Huh, do you know what 'Reserve Rights' does / why it exists? Is there a short explainer of it somewhere?

The reason for posting these facts now is that, as of the time of writing, Leverage's successor, Paradigm Academy, is seeking to host the EA Summit in one week. The hope is that these facts would help to inform effective altruists, firstly, on whether they would be well-advised to attend, and secondly, on what approach they may want to take if they do attend.

Leverage Research has recruited from the EA community using mind-maps and other psychological techniques, obtaining dozens of years of work, but doing little apparent good. As a result, the author views it as inadvisable for EAs to engage with Leverage Research and its successor, Paradigm Academy. Rather, they should seek the advice of mentors outside of the Leverage orbit before deciding to attend such an event. Based on past events such as the Pareto Fellowship, invitees who ultimately decide to attend would be well-advised to be cautious about recruitment, by keeping in touch with friends and mentors throughout.

I think this would be more useful as part of the main post than as a comment.

Evan_Gaensbauer (2y): I've provided my explanations for the following in this comment [http://effective-altruism.com/ea/1r2/leverage_research_reviewing_the_basic_facts/f49]:

  • No evidence has been provided that Paradigm Academy is Leverage's successor. While the OP stated facts about Leverage, the comments declaring more facts about Leverage Research merely cast spurious associations between Leverage Research and the EA Summit. Along with the facts, you've smuggled in an assumption amounting to nothing more than a conspiracy theory that Leverage has rebranded itself as Paradigm Academy and is organizing the 2018 EA Summit for some unclear and ominous reason. In addition to no logical reason or sound evidence being provided for how Leverage's negative reputation in EA should transfer to the upcoming Summit, my interlocutors have admitted themselves, or revealed, their evidence from personal experience to be weak. I've provided my direct personal experience knowing the parties involved in organizing the EA Summit, and also having paid close attention from afar to Leverage's trajectory in and around EA, contrary to the unsubstantiated thesis that the 2018 EA Summit is some opaque machination by Leverage Research.
  • There is no logical connection between the facts about Leverage Research and the purpose of the upcoming EA Summit. Further, the claims presented as facts about the upcoming Summit aren't actually facts. At this point, I'll just point out that the idea that Paradigm is somehow, in any sense, necessarily Leverage's successor is based on no apparent evidence. So the author's advice doesn't logically follow from the claims made about Leverage Research. What's more, as I demonstrated in my other comments, this event isn't some unilateral attempt by Paradigm Academy to steer EA in some unknown direction. As one of the primary organizers for the EA community in Vancouver, Canada; the primary organizer for the rationalit

See Geoff's reply to me below: Paradigm and Leverage will at some point be separate, but right now they're closely related (both under Geoff etc). I think it's reasonable for people to use Leverage's history and track record in evaluating Paradigm.

Thanks for making this post, it was long overdue.

Further facts

  • Connection Theory has been criticized as follows: "It is incomplete and inadequate, has flawed methodology, and conflicts well established science." The key paper has been removed from their websites and the web archive but is still available at the bottom of this post.
  • More of Geoff Anders's early work can be seen at https://systematicphilosophy.com/ and https://philosophicalresearch.wordpress.com/. (I hope they don't take down these websites as well.)
  • Former Leverage staff have launched a stablecoin cryptocurrency called Reserve (formerly "Flamingo"), which was backed by Peter Thiel and Coinbase.
  • In 2012-2014, they ran THINK.
  • The main person at LEAN is closely involved with Paradigm Academy and helps them recruit people.

Recruitment transparency

  • I have spoken with four former interns/staff who pointed out that Leverage Research (and its affiliated organizations) resembles a cult according to the criteria listed here.
  • The EA Summit 2018 website lists LEAN, Charity Science, and Paradigm Academy as "participating organizations," implying they're equally involved. However, Charity Science…

Just to add a bit of info: I helped with THINK when I was a college student. It wasn't the most effective strategy (largely, it was founded before we knew people would coalesce so strongly into the EA identity, and we didn't predict that), but Leverage's involvement with it was professional and thoughtful. I didn't get any vibes of cultishness from my time with THINK, though I did find Connection Theory a bit weird and not very useful when I learned about it.

Evan_Gaensbauer · 2y · 6

Do you mind clarifying what you mean by "recruits people"? I.e., do you mean they recruit people to attend the workshops, or to join the organizational staff? In this comment [http://effective-altruism.com/ea/1r2/leverage_research_reviewing_the_basic_facts/f48] I laid out the threat to EA as a cohesive community when those within it, like the worst detractors of EA and adjacent communities, level blanket accusations at an organization of being a cult. Also, that comment could only mention a handful of people describing Leverage like a cult, admitting they could not recall any specific details. I already explained that that report doesn't qualify as a fact, nor even an anecdote, but hearsay, especially since further details aren't being provided. I'm disinclined to take seriously more hearsay about a vague impression of Leverage as cultish, given the poor faith in which my other interlocutor was acting. Since none of the former interns or staff behind this hearsay are coming forward to corroborate which features of a cult from the linked Lifehacker article Leverage shares, I'm unconvinced that your reports, or the others', of Leverage being like a cult aren't being taken out of context from the individuals you originally heard them from, nor that this post and the comments aren't a deliberate attempt to do nothing but tarnish Leverage. Paradigm Academy was incubated by Leverage Research, as many organizations in and around EA are by others (e.g., MIRI incubated CFAR; CEA incubated ACE, etc.). As far as I can tell now, like with those other organizations, Paradigm and Leverage should be viewed as two distinct organizations. So that itself is not a fact about Leverage, which I also went over in this comment [http://effective-altruism.com/ea/1r2/leverage_research_reviewing_the_basic_facts/f42]. As I stated in that comment as well, there is a double standard at play here.
EA Global each year is organized by the CEA.

Paradigm Academy was incubated by Leverage Research, as many organizations in and around EA are by others (e.g., MIRI incubated CFAR; CEA incubated ACE, etc.). As far as I can tell now, like with those other organizations, Paradigm and Leverage should be viewed as two distinct organizations.

See Geoff's reply to me above: Paradigm and Leverage will at some point be separate, but right now they're closely related (both under Geoff etc). I don't think viewing them as separate organizations, where learning something about Leverage should not much affect your view of Paradigm, makes sense, at least not yet.

anonymous · 2y · 13

CEA incubated EAF

I don't think this is accurate. (Please excuse the lack of engagement with anything else here; I'm just skimming some of it for now but I did notice this.)

[Edit: Unless you meant EA Funds (rather than Effective Altruism Foundation, as I read it)?]

Evan_Gaensbauer · 2y · 1

I meant the EA Foundation, which I was under the impression received incubation from CEA. Since apparently my ambiguous perception of those events might be wrong, I've switched the example of one of CEA's incubees to ACE.
anonymous · 2y · 12

That one is accurate.

Also "incubees" is my new favourite word.

throwaway2 · 2y · 9

I could list a number of specific details, but not without violating the preferences of the people who shared their experiences with me, and not without causing even more unnecessary drama. These details wouldn't make for a watertight case that they're a "cult". I deliberately didn't claim that Leverage is a cult. (See also this [https://www.lesswrong.com/posts/gBma88LH3CLQsqyfS/cultish-countercultishness].) But the details are quite alarming for anyone who strives to have well-calibrated beliefs and an open-minded and welcoming EA community. I do think their cultishness led to unnecessary harm to well-meaning young people who wanted to do good in the world.

There's a big difference between feeling cultlike, as in "weird", "disorienting", "bizarre" etc, and exhibiting the epistemic flaws of a cult, as in having people be afraid to disagree with the thought leader, a disproportionate reverence for a single idea or corpus, the excommunication of dissenters, the application of one idea or corpus to explain everything in the world, instinctively explaining away all possible counterarguments, refusal to look seriously at outside ideas, and so on.

If you could provide any sanitized, abstracted details to indicate that the latter is going on rather than merely the former, then it would go a long way towards indicating that LR is contrary to the goal of well-calibrated beliefs and open-mindedness.

Habryka · 2y · 8

(While LessWrong.com was historically run by MIRI, the new LessWrong is indeed, for most intents and purposes, an independent organization, though legally under the umbrella of CFAR. We are currently filing documents to get our own 501(c)(3) registered, and are planning to stick around as an organization for at least another 5 years or so. Since we don't yet have a name that is different from "LessWrong", it's easy to get confused about whether we are an actual independent organization, and I figured I would comment to clarify that.)

Hi everyone,

I'd like to start off by apologizing. I realize that it has been hard to understand what Leverage has been doing, and I think that that's my fault. Last month Kerry Vaughan convinced me that I needed a new approach to PR and public engagement, and so I've been thinking about what to write and say about this. My plan, apart from the post here, was to post something over the next month. So I'll give a brief response to the points here and then make a longer main post early next week [UPDATE: see 2nd edit below].

(1) I'm sorry for the lack of transparency and public engagement. We did a lot more of this in 2011-2012, but did not really succeed in getting people to understand us. After that, I decided we should just focus on our research. I think people expect more public engagement, even very early in the research process, and that I did not understand this.

(2) We do not consider ourselves an EA organization. We do not solicit funds from individual EAs. Instead, we are EA-friendly, in that (a) we employ many EAs, (b) we let people run EA projects, and (c) we contribute to EA causes, especially EA movement building. As noted in the post, we ran the E…

counting our research as 0 value, and using the movement building impact estimates from LEAN, we come out well on EV compared to an average charity ... I will let readers make their own calculations

Hi Geoff. I gave this a little thought and I am not sure it works. In fact it looks quite plausible that someone's EV (expected value) calculation on Leverage might actually come out as negative (ie. Leverage would be causing harm to the world).

This is because:

  • Most EA orgs calculate their counterfactual expected value by taking into account what the people in that organisation would otherwise be doing if they were not in that organisation, and then deducting this from their impact. (I believe at least 80K, Charity Science, and EA London do this.)

  • Given Leverage's tendency to hire ambitious altruistic people and to look for people at EA events it is plausible that a significant proportion of Leverage staff might well have ended up at other EA organisations.

  • There is a talent gap at other EA organisations (see 80K on this)

  • Leverage does spend some time on movement building, but I estimate that this is a tiny proportion of its time, <5%, best guess 3% (based on having talked to people…
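The counterfactual adjustment described in the bullets above can be sketched with a toy calculation. The numbers below are purely hypothetical placeholders, not estimates of any real organisation's impact; the point is only that once you subtract what staff would plausibly have produced at other organisations, the net figure can come out negative.

```python
def counterfactual_ev(direct_impact, alternative_impact):
    """Net counterfactual expected value, in arbitrary impact units:
    what the org produced, minus what its staff would likely have
    produced had they worked at other organisations instead."""
    return direct_impact - alternative_impact

# Hypothetical figures: an org producing 100 units whose staff would
# have produced 120 units elsewhere has negative net value...
print(counterfactual_ev(100, 120))  # -20
# ...while the same output against a weaker counterfactual is positive.
print(counterfactual_ev(100, 40))   # 60
```

This is why the comment argues the sign of the result depends heavily on how replaceable the staff would have been at other EA organisations.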

Hi Geoff,

In reading this I'm confused about the relationship between Paradigm and Leverage. People in this thread (well, mostly Evan) seem to be talking about them as if Leverage incubated Paradigm but the two are now fully separate. My understanding, however, was that the two organizations function more like two branches of a single entity? I don't have a full picture or anything, but I thought you ran both organizations, staff of both mostly live at Leverage, people move freely between the two as needed by projects, and what happens under each organization is more a matter of strategy than separate direction?

By analogy, I had thought the relationship of Leverage to Paradigm was much more like CEA vs GWWC (two brands of the same organization), or even CEA UK vs CEA USA (two organizations acting together as one brand), than CEA vs ACE (one organization that spun off another one, which now operates entirely independently with no overlap of staff, etc.).

Jeff

Hi Jeff,

Sure, happy to try to clarify. I run both Leverage and Paradigm. Leverage is a non-profit and focuses on research. Paradigm is a for-profit and focuses on training and project incubation. The people in both organizations closely coordinate. My current expectation is that I will eventually hand Leverage off while working to keep the people on both projects working together.

I think this means we’re similar to MIRI/CFAR. They started with a single organization which led to the creation of a new organization. Over time, their organizations came to be under distinct leadership, while still closely coordinating.

To understand Leverage and Paradigm, it’s also important to note that we are much more decentralized than most organizations. We grant members of our teams substantial autonomy in both determining their day-to-day work and with regard to starting new projects.

On residence, new hires typically live at our main building for a few months to give them a place to land and then move out. Currently less than 1/3 of the total staff live on-site.

Thanks for clarifying!

Two takeaways for me:

  • Use of both the "Paradigm" and "Leverage" names isn't a reputational dodge, contra throwaway in the original post. The two groups focus on different work and are in the process of fully dividing.

  • People using what they know about Leverage to inform their views of Paradigm is reasonable given their level of overlap in staff and culture, contra Evan here and here.

Could you comment specifically on the Wayback Machine exclusion? Thanks!

My plan, apart from the post here, was to post something over the next month.

Did you end up posting anything on this subject?

Khorton · 2y · 5

What have you done to promote movement building? I didn't see anything in the post or on your website, other than the summit next week.

Leverage:

(1) founded THINK, the first EA student group network

(2) ran the EA Summit 2013, the first large EA conference (video)

(3) ran the EA Summit 2014

(4) ran the EA Retreat 2014, the first weeklong retreat for EA leaders

(5) handed off the EA Summit series to CEA; CEA renamed it EA Global

(6) helped out operationally with EA Global 2015.

Dunja · 2y · 3

Could you please specify which methods of introspection and psychological frameworks you employ to this end, and what evidence you use to ensure these frameworks are based on adequate scientific evidence, obtained by reliable methods?

CEA appears as a "participating organisation" of the EA Summit. What does this mean? Does CEA endorse Paradigm Academy?

CEA is not involved in the organizing of the conference, but we support efforts to build the EA community. One of our staff will be speaking at the event.

Evan_Gaensbauer · 2y · 4

As an attendee of the 2018 EA Summit, I've been informed by the staff of Paradigm Academy that the idea did not originate with the whole organization, nor with Leverage Research. Neither Geoff Anders nor the executive leadership of Leverage Research are the authors of this Summit. I don't know the hierarchy of Paradigm Academy or where Mindy McTeigue or Peter Buckley, the primary organizers of the Summit, fall in it. As far as I can tell, the EA Summit was independently initiated by these staff at Paradigm and other individual effective altruists they connected with. In the run-up to organizing this Summit, the organizations these individual community members work for became sponsors of the EA Summit. Thus the Local Effective Altruism Network, Charity Science, Paradigm Academy, and the CEA are all participants at this event, endorsing the goal of the Summit within EA, without those organizations needing to endorse each other.

That's an odd question to ask. Must each EA organization endorse every other organization involved at EA Global, or any other EA event, before it begins, for the community to regard it as "genuinely EA"? As far as I can tell, while Paradigm is obviously physically hosting the event, what it means for the CEA and the other organizations to be participating organizations is just that: officially supporting these efforts at the EA Summit itself. It means no more and no less for any organization than what Julia stated in her comment. Also, I oppose using or pressuring the CEA as a form of triangulation, casting it by default as the most legitimate representation of the whole EA movement. Nothing I know about the CEA would lead me to believe they condone the type of treatment where someone tries speaking on their behalf, in any sense, without prior consent.

Also, past my own expectations, the EA community recently made clear [http://effective-altruism.com/ea/1qx/the_ea_community_and_far_future_ea_funds_are_not/] they don't as a whole give license to…

Evan, thank you for these comments here. I just wanted to register, in case it's at all useful, that I find it a bit difficult to understand your posts sometimes. It struck me that shorter and simpler sentences would probably make this easier for me. But I may be totally idiosyncratic here (English isn't my first language), so do ignore this if it doesn't strike you as useful.


Good day all,

Can anyone please provide an example of a tangible output from this 'research organization' of the sort EAs generally recognize and encourage?

Any rationale or consideration as to how association with such opaque groups does anything other than seriously undermine EA's mission statement would also be appreciated.

Kind Regards

Alistair Simmonds

About two years have now passed since the post. Main updates:

  • Leverage Research appears to be just four people. They have announced new plans, and released a short introduction to their interests in early stage science, but not any other work. Their history of Leverage Research appears to have stalled at the fourth chapter.
  • Reserve seems to be ten people, about seven of whom were involved with Leverage Research. Reserve Rights is up by about 160% since being floated two years ago.
  • Paradigm is now branding itself as a self-help organisation.

I honestly don't get all this stuff about not publishing your work. Time to brag (boy, will I get shit on for this comment), but it's really relevant to the issue here: I never even had a minor in the subject, but when I had a good philosophical argument I got it published in a journal, and it wasn't that hard. Peer-reviewed, not predatory, and it went through three rounds of revisions. Not a prestigious journal by any stretch of the imagination, but it proves that I knew what I was doing, which is good enough. You think that peer review is bullshit, fine: that mea…

Some participants of the Pareto fellowship have told me that Leverage resembles a cult. I can't remember many specifics. One thing is that the main guy (Geoff Anders?) thinks, 100% in earnest, that he's the greatest philosopher who's ever lived.

Evan_Gaensbauer · 2y · 3

1. The CEA, the very organization you juxtaposed with Leverage and Paradigm in this comment [http://effective-altruism.com/ea/1r2/leverage_research_reviewing_the_basic_facts/f3w], has in the past been compared to a Ponzi scheme [http://benjaminrosshoffman.com/effective-altruism-is-self-recommending/]. Effective altruists who otherwise appreciated that criticism thought much of its value was lost in the comparison to a Ponzi scheme, and that without it, the criticism might have been better received. Additionally, LessWrong and the rationality community, CFAR and MIRI, and all of AI safety have for years been smeared as a cult by their detractors. The rationality community isn't perfect. There is no guarantee interactions with a self-identified (aspiring) rationality community will go as "rationally" as an individual or small group interacting with the community, online or in person, hopes or expects. But the vast majority of effective altruists, even those who are cynical about these organizations or sub-communities within EA, disagree with how these organizations have been treated, for it poisons the well of goodwill in EA for everyone. In this comment [http://effective-altruism.com/ea/1r2/leverage_research_reviewing_the_basic_facts/f3x], you stated your past experience with the Pareto Fellowship and Leverage left you feeling humiliated and manipulated. I've also been a vocal critic in person throughout the EA community of both Leverage Research and how Geoff Anders has led the organization. But to elevate personal opposition into a public exposure of opposition research, in an attempt to tarnish an event they're supporting alongside many other parties in EA, is not something I ever did, or will do. My contacts in EA and I have followed Leverage. I've desisted from making posts like this myself, because digging for context I found Leverage has cha…

Given there are usernames like "throwaway" and "throwaway2," and knowing the EA Forum and its precursor, LessWrong, I'm confident there is only one account under the username "anonymous," and that all the comments on this post using that account are coming from the same individual.

I'm confused: the comments on Less Wrong you'd see by "person" and "personN" that were the same person happened when importing from Overcoming Bias. That wouldn't be happening here.

They might still be the same person, but I don't think this forum being descended from LessWrong's code tells us things one way or the other.

Evan_Gaensbauer · 2y · 1

Thanks. I wasn't aware of that. I'll redact that part of my comment.
throwaway2 · 2y · 9

I don't feel comfortable sharing the reasons for remaining anonymous in public, but I would be happy to disclose my identity to a trustworthy person to prove that this is my only fake account.
Evan_Gaensbauer · 2y · 1

Upvoted. I'm sorry for the ambiguity of my comment. I meant that the posts here under the usernames "throwaway," "throwaway2," and "anonymous" are each consistently being made by the same three people, respectively. I was just clarifying up front, as I was addressing you, that for others reading it's almost certainly the same anonymous individual making the comments under the same account. I wouldn't expect you to forgo your anonymity.
kbog · 2y · 5

Your comments seem to be way longer than they need to be because you don't trust other users here. Like, if someone comes and says they felt like it was a cult, I'm just going to think "OK, someone felt like it was a cult." I'm not going to assume that they are doing secret blood rituals, and I'm not going to assume that it's a proven fact. I don't need all these qualifications about the difference between cultishness and a stereotypical cult, or about the inherent uncertainty of the issue; that stuff is old hat. This is the EA Forum, an internal space where issues are supposed to be worked out calmly; surely here, if anywhere, is a place where frank criticism is okay, and where we can extend the benefit of the doubt. I think you're wasting a lot of time, and implicitly signaling that the issue is more of a drama mine than it should be.
Evan_Gaensbauer · 2y · 1

I admit I'm coming from a place of not entirely trusting all other users here. That may be a factor in why my comments are longer in this thread than they need to be. I tend to write more than is necessary in general. For what it's worth, I treat the EA Forum not as an internal space but how I'd ideally like to see it be used: as a primary platform for EA discourse, with a level of activity more akin to the 'Effective Altruism' Facebook group or LessWrong. I admit I've been wasting time. I've stopped responding directly to the OP, because if I'm coming across as implicitly signaling this issue is a drama mine, I should come out and say what I actually believe. I may make a top-level post about it. I haven't decided yet.
BenHoffman · 2y · 2

"Compared to a Ponzi scheme" seems like a pretty unfortunate compression of what I actually wrote. Better would be to say that I claimed that a large share of ventures, including a large subset of EA and the US government, have substantial structural similarities to Ponzi schemes. Maybe my criticism would have been better received if I'd left out the part that seems to be hard for people to understand; but then it would have been different and less important criticism.
Evan_Gaensbauer · 2y · 0

[epistemic status: meta] Summary: Reading comments in this thread, which are similar to reactions I've seen you or other rationality bloggers receive from effective altruists on critical posts regarding EA, I think there is a pattern to how rationalists tend to write on important topics that doesn't gel with the typical EA mindset. Consequently, the pragmatic thing for us to do would be to figure out how to alter how we write so we get our message across to a broader audience.

Upvoted. I don't know if you've read some of the other comments in this thread, but some of the most upvoted ones are about how I need to change up my writing style. So unfortunate compressions of what I actually write aren't new to me, either. I'm sorry I compressed what you actually wrote. But even an accurate compression of what you actually wrote might make my comments too long for what most users prefer on the EA Forum, and if I just linked to your original post, it would be too long for us to read. I spend more of my time on EA projects. If there were more promising projects coming out of the rationality community, I'd spend more time on them relative to how much time I dedicate to EA projects. But I go where the action is. Socially, I'm as involved, if not more so, with the rationality community as I am with EA. From my inside view, here is how I'd describe the common problem with my writing on the EA Forum: I came here from LessWrong. Relative to LW, I haven't found what I write on the EA Forum to be too long. But that's because I'm anchoring off EA discourse looking like SSC 100% of the time. Since the majority of EAs don't self-identify as rationalists, and the movement is so intellectually diverse, the expectation is that the EA Forum won't be formatted in a discourse style common to the rationalist diaspora. I've touched upon this issue with Ray Arnold before, and Zvi has touched on it too in some of his blog posts about EA.

A crude rationalist impression might be…

Intellectual contributions to the rationality community: including CFAR’s class on goal factoring

Just a note: I think this might be a bit misleading. Geoff and other members of Leverage Research taught a version of goal factoring at some early CFAR workshops, and Leverage did develop a version of goal factoring inspired by CT. But my understanding is that CFAR staff independently developed goal factoring (starting from an attempt to teach applied consequentialism), and this is an instance of parallel development.

[I work for CFAR, though I had not yet joined the EA or rationality community in those early days. I am reporting what other longstanding CFAR staff told me.]

Leverage Research has now existed for over 7.5 years. Since 2011, it has consumed over 100 person-years of human capital.

Given that, by their own admission in a comment on the original post, the author is providing these facts so effective altruists can make an informed decision regarding potentially attending the 2018 EA Summit, with the expectation that these facts can or will discourage EAs from attending, it's unclear how these facts are relevant information.

  • In particular, no calculation or citation is provided for the…

Meta:

It might be worthwhile to have some sort of flag or content warning for potentially controversial posts like this.

On the other hand, this could be misused by people who dislike the EA movement, who could use it as a search parameter to find and "signal-boost" content that looks bad when taken out of context.

What are the benefits of this suggestion?

This is a romp through meadows of daisies and sunflowers compared to what real Internet drama looks like. It's perfectly healthy for a bunch of people to report on their negative experiences and debate the effectiveness of an organization. It will only look controversial if you frame it as controversial; people will only think it is a big deal if you act like it is a big deal.

Evan_Gaensbauer · 2y · 1

I agree with kbog: while this is unusual discourse for the EA Forum, it is still far above the bar where I think it's practical to be worried about controversy. If someone thinks the content of a post on the EA Forum might trigger some reader(s), I don't see anything wrong with including content warnings on posts. I'm unsure what you mean by "flagging" potentially controversial content.