Shortform Content [Beta]

Cullen_OKeefe's Shortform

The venerable Judge Easterbrook appears to understand harms from unaligned AI. In this excerpt, he invokes a favorite fictional example among the AI risk community:

The situation is this: Customer incurs a debt and does not pay. Creditor hires Bill Collector to dun Customer for the money. Bill Collector puts a machine on the job and repeatedly calls Cell Number, at which Customer had agreed to receive phone calls by giving his number to Creditor. The machine, called a predictive dialer, works autonomously until a human voice comes on the line. If that ha

... (read more)
andrewleeke's Shortform

How would a therapist diagnose an EA mindset?

I wonder why I’m drawn to EA — particularly in terms of my personality. For example, I have a tendency to justify things (sometimes myself) and endorsing EA principles might simply be an extension of this. EA principles allow me to justify my choice of career, lifestyle, values, diet, etc. 

Does anyone else see EA as an extension of their deep desire to justify things? Since justification is a defensive process, I wonder whether this has any relation to self-esteem or self-worth.

And, if EA doesn't relate to ... (read more)

You might find it helpful to look at this ethnography of an EA group. Also relevant is this analysis of the Big Five personality traits of respondents to the Rethink Charity community survey. It has statistical flaws, but one takeaway is that most EAs are high in openness.

Justification and signalling explanations don't seem especially compelling to me because in some sense, everything is justification and signaling. Also, I'm not sure if you're hinting at this, but it's unlikely that you'll be diagnosed with a mental illness just for being drawn to / belie... (read more)

DonyChristie's Shortform

Would you be interested in a Cause Prioritization Newsletter? What would you want to read on it?

4 · EdoArad · 5d: I'll sign up and read if it'd be good 😊 What I'd be most interested in is the curation of:
  1. New suggestions for possible top cause areas
  2. New (or less known) organizations or experts in the field
  3. Examples of new methodologies
  4. And generally, interesting new research on prioritization between and within practically any EA-relevant causes.
3 · Ramiro · 4d: Add to (3) new explanations of or additions to methodologies - e.g., I still haven't found anything substantial about the idea of adding something like 'urgency' to the ITN framework.

Definitely! And I'll add my general interest in thoughtful analyses of existing frameworks.

nickmatt's Shortform

Reading science fiction to build intuition about longtermism:

Throughout my involvement over the past 2 years leading and participating in EA fellowships, I've heard numerous fellows say something like "longtermism seems interesting and somewhat convincing, but I find it hard to think about and at times intractable". This difficulty in thinking about how our current actions can have massive long-term impacts on timescales longer than, say, a century isn't something most people coming into an intro EA fellowship have thought about.

From my experience ... (read more)

Showing 3 of 4 replies
2 · Linch · 4d: Semi-aside: Have you read A Canticle for Leibowitz? I barely remember exact details from the book, but I read it when I was very young, and plausibly it affected my priors on nuclear war somewhat, in a subtle/sneaky way.
1 · Ramiro · 4d: You nailed it - Asimov's and Cixin Liu's classics should be almost compulsory reading. However, it caught my eye that you call Cixin Liu's trilogy the Dark Forest Trilogy, instead of referring to it as something like The Three-Body Problem books, the Trisolaran Trilogy, or Remembrance of Earth's Past. What I enjoy most in these books is the challenge of maintaining something like long-term cooperation. To such a list I'd add something like The Ministry for the Future (someone should add a good review to this forum); but though it has wonderful passages, it's sometimes unrealistically optimistic (or even simplistic, along the lines of "capitalism is evil") and takes a lot for granted.

I just misremembered the official name of the trilogy -- Remembrance of Earth's Past is correct.

Ramiro's Shortform

'Good' news: as expected, as real interest rates fall, so does the social discount rate (SDR), increasing the social cost of carbon. (Not news, granted, but monetary policy-makers explicitly acknowledging it seems like a good sign.)
Bad news: of course, it still seems to be higher than a normative SDR based on time-neutrality.
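The mechanics here can be shown with a toy discounting calculation (all numbers invented for illustration, not taken from any actual SDR estimate): a lower discount rate makes future climate damages loom larger in present-value terms, which is what pushes up the social cost of carbon.

```python
# Illustrative only: present value of a fixed future climate damage
# under two hypothetical social discount rates.
def present_value(damage: float, rate: float, years: int) -> float:
    """Discount a future damage back to today at a constant annual rate."""
    return damage / (1 + rate) ** years

# $1,000 of damage occurring 50 years from now:
pv_high = present_value(1000, 0.03, 50)  # ~$228 at a 3% SDR
pv_low = present_value(1000, 0.01, 50)   # ~$608 at a 1% SDR
```

So halving-and-then-some the discount rate in this toy example nearly triples the present value of the same future damage, which is the direction of the effect described above.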

Archer's Shortform

[HALF-BAKED IDEA - Fundraising through paid newsletters / text message courses.]

THE IDEA: Paid newsletters and paid text message courses should be considered as a potential earn-to-give entrepreneurship strategy.

More specifically, newsletters / text messages that are focused on sourcing items of interest for niche groups e.g. jobs, events, petitions, ideas, tools, content etc.

Some quick thoughts on why:

(WARNING: I am by no stretch of the imagination a fundraising expert, also this might be the longest short form ever)

Generally speaking, I don’t believe c... (read more)

3 · Aaron Gertler · 7d: Three cheers for long Shortform posts! Totally fine to spell out a half-baked idea here, at whatever length.

Anyway, one of the first questions I always want to ask when I hear a business idea: What's an example of this type of business succeeding? Clearly, there are successful people writing newsletters on Substack. But did any of them:

a) Start with a fairly small audience, many/most of whom were already giving them money without expecting something in return?
b) Try to crowdsource content from many sources, instead of having a single author be the driving force/personality behind the newsletter?

There may be newsletters in this category, but I expect that they are quite rare.

Additionally, most EA orgs actively want their ideas to be free, because they want as many people as possible to hear them, and this is more valuable to them than whatever money they would get from a much smaller paid audience. For example, even if 80K could convert their 100,000+ newsletter subscribers into an audience of 10,000 people paying $5/month (no small task), I don't know if they would want to, given how many fewer people would get job leads from them under that scenario.

As far as newsletters created by individual entrepreneurs, this is a reasonable business idea like many others. You can find lots of online guides to building an audience for your copywriting, coaches to help you get started, clubs where people share feedback on each other's writing, and so on. But like most reasonable businesses, this one is fairly competitive and tough to succeed in (no such thing as a free lunch!). It will be a reasonable thing to do for a few EAs, maybe, but doesn't stand out to me as more promising than other types of startups. This doesn't make it a bad idea -- just one of many, many things that people should consider if they want to build a business.

Thanks for the feedback Aaron!

With regards to EA orgs e.g. 80,000 hours. I wasn’t trying to suggest that EA orgs have their own paid newsletters. Rather, I was suggesting that a separate not-for-profit organisation could be set up specifically for creating paid newsletters (on any topic) while stating that said organisation is trying to raise money for effective charities. The organisation would be made of individuals and teams who each run different paid newsletters under one umbrella. (These potentially could be subsidiary companies; I haven’t really tho... (read more)

Dan Hageman's Shortform

Donor Advised Funds with no minimum contribution requirements


Fidelity Charitable and Schwab Charitable recently eliminated the minimum initial contribution requirements for their donor-advised funds. 

Fidelity and Schwab are major financial services companies. Their nonprofit arms operate donor-advised funds (DAFs), which are tools to make donating more convenient and tax-efficient (see Aaron Hamlin’s succinct summary for more info). Like many financial services companies, Fidelity and Schwab previously required customers to contribute ~$5,000 in... (read more)

NunoSempere's Shortform

CoronaVirus and Famine

The Good Judgement Open forecasting tournament gives a 66% chance for the answer to "Will the UN declare that a famine exists in any part of Ethiopia, Kenya, Somalia, Tanzania, or Uganda in 2020?"

I think that the 66% is a slight overestimate. But nonetheless, if a famine does hit, it would be terrible, as other countries might not be able to spare enough attention due to the current pandem

... (read more)

Ethiopia's Tigray region has seen famine before: why it could happen again - The Conversation Africa

Tue, 17 Nov 2020 13:38:00 GMT


The Tigray region is now seeing armed conflict. I'm at 5-10%+ that it develops into famine (regardless of whether it ends up meeting the rather stringent UN conditions for the term to be used), though I have yet to actually look into the base rate. I've sent an email to see if they update their forecasts.

3 · Aaron Gertler · 8mo: Did you mean to post this using the Markdown editor? Currently, the formatting looks a bit odd from a reader's perspective.

MichaelStJules's Shortform

I feel increasingly unsympathetic to hedonism (and maybe experientialism generally?). Yes, emotions matter, and the strength of emotions could be taken to mean how much something matters, but if you separate a cow and her calf and they're distressed by this, the appropriate response for their sake is not to drug or fool them until they feel better, it's to reunite them. What they want is each other, not to feel better. Sometimes I think about something bad in the world that makes me sad; I don't think you do me any favour by just taking away my sadness; I d... (read more)

NunoSempere's Shortform

Reasons why upvotes on the EA Forum and LW don't correlate that well with impact.

  1. More easily accessible content, or more introductory material gets upvoted more.
  2. Material which gets shared more widely gets upvoted more.
  3. Content which is more prone to bikeshedding gets upvoted more.
  4. Posts which are beautifully written are more upvoted.
  5. Posts written by better-known authors are more upvoted (once you've seen this, you can't unsee it).
  6. The time at which a post is published affects how many upvotes it gets.
  7. Other random factors, such as whether other strong po
... (read more)
Showing 3 of 4 replies
4 · NunoSempere · 9d: I'd say it also doesn't correlate that well with its total (direct + indirect) impact either, but yes. And I was thinking more in contrast to the karma score being an ideal measure of total impact; I don't have thoughts to share here on the impact of the post itself on the community.

Thanks, that makes sense. 

I think that for me, I upvote according to how much I think a post itself is valuable for me or for the community as a whole. At least, that's what I'm trying to do when I'm thinking about it logically.

7 · MichaelA · 9d: I agree that the correlation between number of upvotes on EA Forum and LW posts/comments and impact isn't very strong. (My sense is that it's somewhere between weak and strong, but not very weak or very strong.) I also agree that most of the reasons you list are relevant.

But how I'd frame this is that - for example - a post being more accessible increases the post's expected upvotes even more than it increases its expected impact. I wouldn't say "Posts that are more accessible get more upvotes, therefore the correlation is weak", because I think increased accessibility will indeed increase a post's impact (holding other factors constant).

Same goes for many of the other factors you list. E.g., more sharing tends to both increase a post's impact (more readers means more opportunity to positively influence people) and signal that the post would have a positive impact on each reader (as that is one factor - among many - in whether people share things). So the mere fact that sharing probably tends to increase upvotes to some extent doesn't necessarily weaken the correlation between upvotes and impact. (Though I'd guess that sharing does increase upvotes more than it increases/signals impact, so this comment is more like a nitpick than a very substantive disagreement.)
MichaelA's Shortform

Collection of sources relevant to impact certificates/impact purchases/similar

Certificates of impact - Paul Christiano, 2014

The impact purchase - Paul Christiano and Katja Grace, ~2015 (the whole site is relevant, not just the home page)

The Case for Impact Purchase  | Part 1 - Linda Linsefors, 2020

Making Impact Purchases Viable - casebash, 2020

Plan for Impact Certificate MVP - lifelonglearner, 2020

Impact Prizes as an alternative to Certificates of Impact - Ozzie Gooen, 2019

Altruistic equity allocation - Paul Christiano, 2019

Social impact bond - Wikipe... (read more)

Mati_Roy's Shortform

oh, of course, for-profit charities are a thing! that makes sense

I learned about it in "Economics without Illusion", chapter 8.

just because your organization's product/service/goal is to help other people, and your customers are philanthropists, doesn't mean you can't make a profit.

profitable charities might increase competition to provide more effective altruism, and so still provide more value even though they make a profit (maybe)


davidwalker's Shortform

Could EA benefit from having a "bulldog"? 

That is, a pugnacious (but scrupulous) public advocate of EA and EA-adjacent ideas. In the EA community currently, who might come closest to being something like EA's bulldog? 

More precisely, I'm thinking of a hybrid between, say, Christopher Hitchens and Peter Singer (or perhaps Derek Parfit, for added dryness). A fiery, polemical wit married to a calm, analytical rigor. 

A good, non-EA-affiliated example of this style is Alex J. O'Connor, better known as Cosmic Skeptic on YouTube, a student o... (read more)

10 · Aaron Gertler · 18d: I feel like the main role of a bulldog is to fend off the fiery, polemical enemies of a movement. Atheism and veganism (and even AI safety, kind of) have clear opponents; I don't think the same is especially true of EA (as a collection of causes). There are people who argue for localism, or the impracticality of measuring impact, but I can't think of the last time I've seen one of those people have a bad influence on EA.

The meat industry wants to kill animals; theists want to promote religion; ineffective charities want to... raise funds? Not as directly opposed to what we're doing.

I suppose we did have the Will MacAskill/Giles Fraser debate at one point, though. MacAskill also took on Peter Buffett in an op-ed column. I don't know how he feels about those efforts in retrospect.

We could certainly use more eloquent/impassioned public speakers on EA topics (assuming they are scrupulous, as you say), but I wouldn't think of them as "bulldogs" -- just regular advocates.

This letter made me feel like there can be organized opposition from ineffective charities.

1 · davidwalker · 17d: Thank you Aaron, these are great points!

alexrjl's Shortform

I'm considering taking the very +EV betting opportunities available with the US election with the money I plan to donate over the next 6 months, then donating the winnings (or not donating if I lose).

Some more discussion on my twitter here but I'm interested in thoughts from EAF members too. It's not a huge amount of money either way.
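As a rough illustration of what "+EV" means in this context (using made-up numbers, not the actual market odds from the thread): a bet is positive expected value whenever your credence in the outcome exceeds the probability implied by the market's odds.

```python
# Expected value of a simple binary bet (all numbers illustrative).
def bet_ev(p_win: float, stake: float, decimal_odds: float) -> float:
    """EV = p * profit_if_win - (1 - p) * stake."""
    profit_if_win = stake * (decimal_odds - 1)
    return p_win * profit_if_win - (1 - p_win) * stake

# Suppose you believe a candidate wins with p = 0.85 while the market
# prices them at decimal odds of 1.5 (implied probability ~0.67):
ev = bet_ev(p_win=0.85, stake=100, decimal_odds=1.5)
# 0.85 * 50 - 0.15 * 100 = 42.5 - 15 = 27.5, i.e. +$27.50 per $100 staked
```

If your credence exactly matched the implied probability (p = 1/1.5), the EV would be zero; the edge comes entirely from the gap between your probability and the market's.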

3 · alexrjl · 17d: This went well :) Congrats EAF meta, Rethink, and GFI on your winnings.

Together with a few EA friends, I ended up betting a substantial amount of money on Biden. It went well for me, too, as well as for some of my friends. I think presidential elections present unusually good opportunities for both betting and arbitrage, so it may be worth coordinating some joint effort next time.

(As a note of historical interest, during the 2012 US election a small group of early EAs made some money arbitraging Intrade.)

7 · alexrjl · 22d: I ended up doing this.

Linch's Shortform

crossposted from LessWrong

There should maybe be an introductory guide for new LessWrong users coming in from the EA Forum, and vice versa.

I feel like my writing style (designed for the EA Forum) is almost the same as that of LW-style rationalists, but not quite identical, and this is enough to make it substantially less useful for the average audience member there.

For example, this identical question is a lot less popular on LessWrong than on the EA Forum, despite naively appearing to appeal to both audiences (and indeed if I were to guess at the purview of LW, to be cl... (read more)

RyanCarey's Shortform

Possible EA intervention: just like the EA Forum Prizes, but for the best Tweets (from an EA point-of-view) in a given time window.

Reasons this might be better than the EA Forum Prize:

1) Popular tweets have greater reach than popular forum posts, so this could promote EA more effectively

2) The prizes could go to EAs who are not regular forum users, which could also help to promote EA more effectively.

One would have to check the rules and regulations.

Showing 3 of 4 replies

The Emergent Ventures Prize is an example of a prize scheme that seems good to me: giving $100k prizes to great blogs, wherever on the internet they're located.

3 · RyanCarey · 10mo: Tom Inglesby on nCoV response is one recent example from just the last few days. I've generally known Stefan Schubert, Eliezer Yudkowsky, Julia Galef, and others to make very insightful comments there. I'm sure there are very many other examples. Generally speaking, though, the philosophy would be to go to the platforms that top contributors are actually using, and offer our services there, rather than trying to push them onto ours, or at least to complement the latter with the former.
3 · Aaron Gertler · 10mo: I agree with this philosophy, but remain unsure about the extent to which strong material appears on various platforms. (I sometimes do reach out to people who have written good blog posts or Facebook posts to send my regards and invite them to cross-post; this is a big part of why Ben Kuhn's recent posts have appeared on the Forum, and one of those did win a prize.)

Aside from 1000-person-plus groups like "Effective Altruism" and "EA Hangout", are there any Facebook groups that you think regularly feature strong contributions? (I've seen plenty of good posts come out of smaller groups, but given the sheer number of groups, I doubt that the list of those I check includes everything it should.)

I follow all the Twitter accounts you mentioned. While I can't think of recent top-level Tweets from those accounts that feel like good Prize candidates, I think the Tom Inglesby thread is great!

One benefit of the Forum Prize is that it (ideally) incentivizes people to come and post things on the Forum, and to put more effort into producing really strong posts. It also reaches people who deliberately worked to contribute to the community. If someone like Tom Inglesby was suddenly offered, say, $200 for writing a great Twitter thread, it's very unclear to me whether this would lead to any change in his behavior (and it might come across as very odd). Maybe not including any money, but simply cross-posting the thread and granting some kind of honorary award, could be better.

Another benefit: the Forum is centralized, and it's easy for judges to see every post. If someone wants to Tweet about EA and they aren't already a central figure, we might have a hard time finding their material (and we're much more likely to spot, by happenstance, posts made by people who have lots of followers).

That said, there's merit to thinking about ways we can reach out to send strong complimentary signals to people who produce EA-relevant things even if they're unaware of the movement's exi
johncrox's Shortform

For those interested in US election betting strategies, I'm hosting a Discord here:

NunoSempere's Shortform

The Stanford Social Innovation Review makes the case (archive link) that new, promising interventions are almost never scaled up by already established, big NGOs.

I suppose I just assumed that scale-ups happened regularly at big NGOs, and I never bothered to look closely enough to notice that they didn't. I find this very surprising.

Nathan Young's Shortform

Rather than using Facebook as a way to collect EA jobs, we should use an Airtable form.

1) Individuals finding jobs could put all the details in, saving time for whoever would have to do this process at 80k.

2) Airtable can post directly to Facebook, so everyone would still see it.

3) Some people would find it quicker. Personally, I'd prefer an Airtable form to inputting it to Facebook manually every time.

Ideally we should find websites which often publish useful jobs and then scrape them regularly. 
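The scraping idea above could be sketched roughly as follows, using only the standard library. The HTML structure and the `class="job"` selector are invented for illustration; a real scraper would need per-site selectors, a schedule, and polite rate limiting.

```python
from html.parser import HTMLParser

class JobListingParser(HTMLParser):
    """Collects (title, url) pairs from <a class="job"> links on a page."""
    def __init__(self):
        super().__init__()
        self.jobs = []
        self._in_job_link = False
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("class") == "job":
            self._in_job_link = True
            self._href = attrs.get("href")
            self._text = []

    def handle_data(self, data):
        if self._in_job_link:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_job_link:
            self.jobs.append(("".join(self._text).strip(), self._href))
            self._in_job_link = False

# A made-up page standing in for a fetched job board:
sample_page = """
<ul>
  <li><a class="job" href="/jobs/1">Research Analyst</a></li>
  <li><a class="job" href="/jobs/2">Operations Associate</a></li>
</ul>
"""

parser = JobListingParser()
parser.feed(sample_page)
# parser.jobs -> [('Research Analyst', '/jobs/1'), ('Operations Associate', '/jobs/2')]
```

The parsed listings could then be pushed into an Airtable base via its API, which is the piece that would tie this back to the form idea above.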

It would be good to easily be able to export jobs from the EA job board.

1 · Nathan Young · 25d: I suggest that at some stage, having up- and downvoting of jobs would be useful.