
I know of at least one NDA from an EA org silencing someone for discussing bad behaviour that happened at that org. Should EA orgs be in the practice of making people sign such NDAs?

I suggest no.

4ChanaMessinger14d
I think I want a Chesterton's TAP for all questions like this that says "how normal are these and why" whenever we think about a governance plan.
2Nathan Young14d
I am unsure what you mean? As in, because other orgs do this it's probably normal? 
4ChanaMessinger14d
I have no idea, but would like to! With things like "organizational structure" and "nonprofit governance", I really want to understand the reference class (even if everyone in the reference class does stupid bad things and we want to do something different).
0Yitz3mo
Strongly agree that moving forward we should steer away from such organizational structures; much better that something bad is aired publicly before it has a chance to become malignant

Feels like we've had about 3 months since the FTX collapse with no kind of leadership comment. Uh, that feels bad. I mean, I'm all for "give cold takes", but how long are we talking?

Sam Harris takes Giving What We Can pledge for himself and for his meditation company "Waking Up"

Harris references MacAskill and Ord as having been central to his thinking and talks about Effective Altruism and existential risk. He publicly pledges 10% of his own income and 10% of the profit from Waking Up. He also will create a series of lessons on his meditation and education app around altruism and effectiveness.

Harris has 1.4M Twitter followers and is a famed humanist and New Atheist. The Waking Up app has over 500k downloads on Android, so I guess over 1 million overall.

https://dynamic.wakingup.com/course/D8D148

I like letting personal thoughts be up or downvoted, so I've put them in the comments.

6Nathan Young2y
Harris is a marmite figure - in my experience people love him or hate him.  It is good that he has done this.  Newswise, it seems to me it is more likely to impact the behavior of his listeners, who are likely to be well-disposed to him. This is a significant but currently low-profile announcement, as will be the courses on his app.   I don't think I'd go spreading this around more generally; many don't like Harris, and for those who don't like him, it could be easy to see EA as more of the same (callous, superior progressivism). In the low probability (5%?) event that EA gains traction in that space of the web (generally called the Intellectual Dark Web - don't blame me, I don't make the rules) I would urge caution for EA speakers who might be pulled into polarising discussion which would leave some groups feeling EA ideas are "not for them".
6MichaelDickens2y
My guess is people who like Sam Harris are disproportionately likely to be potentially interested in EA.

This seems quite likely given EA Survey data: amongst people who indicated they first heard of EA from a Podcast and indicated which podcast, Sam Harris's podcast strongly dominated all others.

More speculatively, we might try to compare these numbers to people hearing about EA from other categories. For example, by any measure, the number of people in the EA Survey who first heard about EA from Sam Harris' podcast specifically is several times the number who heard about EA from Vox's Future Perfect. As a lower bound, 4x more people specifically mentioned Sam Harris in their comment than selected Future Perfect, but this is probably dramatically undercounting Harris, since not everyone who selected Podcast wrote a comment that could be identified with a specific podcast. Unfortunately, I don't know the relative audience size of Future Perfect posts vs Sam Harris' EA podcasts specifically, but that could be used to give a rough sense of how well the different audiences respond.

2Aaron Gertler2y
Notably, Harris has interviewed several figures associated with EA; Ferriss only did MacAskill, while Harris has had MacAskill, Ord, Yudkowsky, and perhaps others.
3David_Moss2y
This is true, although for whatever reason the responses to the podcast question seemed very heavily dominated by references to MacAskill.  This is the graph from our original post [https://forum.effectivealtruism.org/posts/ZuGTc3awtG6nrziiq/ea-survey-2019-series-how-eas-get-involved-in-ea#Podcasts], showing every commonly mentioned category, not just the host (categories are not mutually exclusive). I'm not sure what explains why MacAskill really heavily dominated the Podcast category, while Singer heavily dominated the TED Talk [https://forum.effectivealtruism.org/posts/ZuGTc3awtG6nrziiq/ea-survey-2019-series-how-eas-get-involved-in-ea#TED_Talk] category.
4Nathan Young2y
The address (in the link) is humbling and shows someone making a positive change for good reasons. He is clear and coherent. Good on him.

How are we going to deal emotionally with the first big newspaper attack against EA?

EA is pretty powerful in terms of impact and funding.

It seems only a matter of time before there is a really nasty article written about the community or a key figure.

Last year the NYT wrote a hit piece on Scott Alexander and while it was cool that he defended himself, I think he and the rationalist community overreacted and looked bad.

I would like us to avoid this.

If someone writes a hit piece about the community, Givewell, Will MacAskill etc, how are we going to avoid a kneejerk reaction that makes everything worse?

I suggest if and when this happens:

  1. individuals largely don't respond publicly unless they are very confident they can do so in a way that leads to deescalation.

  2. articles exist to get clicks. It's worth someone (not necessarily me or you) responding to an article in the NYT, but if, say, a niche commentator goes after someone, fewer people will hear it if we let it go.

  3. let the comms professionals deal with it. All EA orgs and big players have comms professionals. They can defend themselves.

  4. if we must respond (we often needn't) we should adopt a stance of grace, curiosity and hu

... (read more)

Yeah, I think the community response to the NYT piece was counterproductive, and I've also been dismayed at how much people in the community feel the need to respond to smaller hit pieces, effectively signal boosting them, instead of just ignoring them. I generally think people shouldn't engage with public attacks unless they have training in comms (and even then, sometimes the best response is just ignoring).

Dear reader,

You are an EA, if you want to be. Reading this forum is enough. Giving a little of your salary effectively is enough. Trying to get an impactful job is enough. If you are trying even with a fraction of your resources to make the world better and chatting with other EAs about it, you are one too.

The Scout Mindset deserved 1/10th of the marketing campaign of WWOTF. Galef is a great figurehead for rational thinking and it would have been worth it to try and make her a public figure.

4Ozzie Gooen5mo
I think much of the issue is that:
1. It took a while to ramp up to being able to do things such as the marketing campaign for WWOTF. It's not trivial to find the people and buy-in necessary. Previous EA books haven't had similar campaigns.
2. Even when you have that capacity, it's typically much more limited than we'd want.
I imagine EAs will get better at this over time.

Post I spent 4 hours writing on a topic I care deeply about: 30 karma

Post I spent 40 minutes writing on a topic that the community vibes with: 120 karma

I guess this is fine - it's just people being interested - but it can feel weird at times.

5NunoSempere9mo
This is not fine
2Nathan Young9mo
I dunno. I thought I'd surface it.
1niplav10mo
Yeah, this is an unfortunate gradient, you have to decide not to follow it :-/ But there is more long-term glory in it.

I strongly dislike the following sentence on effectivealtruism.org:

"Rather than just doing what feels right, we use evidence and careful analysis to find the very best causes to work on."

It reads to me as arrogant, and epitomises the worst caricatures my friends make of EAs. Read it in a snarky voice (such as one might if they struggled with the movement and were looking to do research): "Rather than just doing what feels right..."

I suggest it gets changed to one of the following:

  • "We use evidence and careful analysis to find the very best causes to work on."
  • "It's great when anyone does a kind action no matter how small or effective. We have found value in using evidence and careful analysis to find the very best causes to work on."

I am genuinely sure whoever wrote it meant well, so thank you for your hard work.

Are the two bullet points two alternative suggestions? If so, I prefer the first one.

8Matt_Lerner3y
I also thought this when I first read that sentence on the site, but I find it difficult (as I'm sure its original author does) to communicate its meaning in a subtler way. I like your proposed changes, but to me the contrast presented in that sentence is the most salient part of EA. To me, the thought is something like this: "Doing good feels good, and for that reason, when we think about doing charity, we tend to use good feeling as a guide for judging how good our act is. That's pretty normal, but have you considered that we can use evidence and analysis to make judgments about charity?" The problem IMHO is that without the contrast, the sentiment doesn't land. No one, in general, disagrees in principle with the use of evidence and careful analysis: it's only in contrast with the way things are typically done that the EA argument is convincing.
3Nathan Young3y
I would choose your statement over the current one. I think the sentiment lands pretty well even with a very toned down statement. The movement is called "effective altruism". I think in-groups are often worried that outgroups will not get their core differences, when generally that's all outgroups know about them. I don't think anyone who visits that website will miss that effectiveness is a core feature. And I don't think we need to be patronising (as EAs are caricatured as being in conversations I have) in order to make known something that everyone already knows.

EAs, please post your job postings to Twitter

Please post your jobs to Twitter and reply with @effective_jobs. Takes 5 minutes, and the jobs I've posted and then tweeted have got 1000s of impressions. 

Or just DM me on twitter (@nathanpmyoung) and I'll do it. I think it's a really cheap way of getting EAs to look at your jobs. This applies to impactful roles in and outside EA.

Here is an example of some text:

-tweet 1

Founder's Pledge Growth Director

@FoundersPledge are looking for someone to lead their efforts in growing the amount that tech entrepreneurs give to effective charities when they IPO. 

Salary: $135k - $150k 
Location: San Francisco

https://founders-pledge.jobs.personio.de/job/378212

-tweet 2, in reply

@effective_jobs

-end

I suggest it should be automated but that's for a different post.

If you type "#" followed by the title of a post and press enter, it will link that post.

Example:
Examples of Successful Selective Disclosure in the Life Sciences 

This is wild

2EdoArad4mo
OMG

I notice I am pretty skeptical of much longtermist work and the idea that we can make progress on this stuff just by thinking about it.

I think future people matter, but I will be surprised if, after x-risk reduction work, we can find tens of billions of dollars of work that isn't busywork and that wouldn't be better spent learning how to get, e.g., nations out of poverty.

Several journalists (including those we were happy to have write pieces about WWOTF) have contacted me but I think if I talk to them, even carefully, my EA friends will be upset with me. And to be honest that upsets me.

We are in the middle of a mess of our own making. We deserve scrutiny. Ugh, I feel dirty and ashamed and frustrated.

To be clear, I think it should be your own decision to talk to journalists, but I do also think that it's just better for us to tell our own story on the EA Forum and write comments, and not give a bunch of journalists the ability to greatly distort the things we tell them in a call, with a platform and microphone that gives us no opportunity to object or correct things. 

I have been almost universally appalled at the degree to which journalists straightforwardly lie in interviews, take quotes massively out of context, or make up random stuff related to what you said, and I do think it's better that if you want to help the world understand what is going on, that you write up your own thoughts in your own context, instead of giving that job to someone else.

2ChanaMessinger3mo
<3

Unbalanced karma is good actually. It means that the moderators have to do less. I like the takes of the top users more than the median user's, and I want them to have more, but not total, influence. 

Appeals to fairness don't interest me - why should voting be fair?

I have more time for transparency.

A friend asked about effective places to give. He wanted to donate through his payroll in the UK. He was enthusiastic about it, but that process was not easy.

  1.  It wasn't particularly clear whether GiveWell or EA Development Fund was better and each seemed to direct to the other in a way that felt at times sketchy.
  2. It wasn't clear if payroll giving was an option
  3. He found it hard to find GiveWell's spreadsheet of effectiveness
     

Feels like making donations easy should be a core concern of both GiveWell and EA Funds and my experience made me a little embarrassed to be honest.

EA short story competition?

Has anyone ever run a competition for EA related short stories?

Why would this be a good idea?
* Narratives resonate with people and have been used to convey ideas for 1000s of years
* It would be low cost and fun
* Using voting on this forum there is the same risk of "bad posts" as for any other post

How could it work?
* Stories submitted under a tag on the EA forum.
* Rated by upvotes
* Max 5000 words (I made this up, dispute it in the comments)
* If someone wants to give a reward, then there could be a prize for the highest rated
* If there is a lot of interest/quality they could be collated and even published
* Since it would be measured by upvotes it seems unlikely a destructive story would be highly rated (or as likely as any other destructive post on the forum)

Upvote if you think it's a good idea. If it gets more than 40 karma I'll write one. 
 


Give Directly has a President (Rory Stewart) paid $600k, and is hiring a Managing Director. I originally thought they had several other similar roles (because I looked on the website) but I talked to them and seemingly that is not the case. Below is the tweet that tipped me off, but I think it is just mistaken.

One could still take issue with the $600k (though I don't really).

https://twitter.com/carolinefiennes/status/1600067781226950656?s=20&t=wlF4gg_MsdIKX59Qqdvm1w 

Seems in line with CEO pay for US nonprofits with >$100M in budget, at least when I spot check random charities near the end of this list.

I feel confused about the president/CEO distinction however.

-10NickLaing2mo

I dislike the framing of "considerable" and "high engagement" on the EA survey.

This is copied from the survey:

  • No engagement: I’ve heard of effective altruism, but do not engage with effective altruism content or ideas at all
  • Mild engagement: I’ve engaged with a few articles, videos, podcasts, discussions, events on effective altruism (e.g. reading Doing Good Better or spending ~5 hours on the website of 80,000 Hours)
  • Moderate engagement: I’ve engaged with multiple articles, videos, podcasts, discussions, or events on effective altruism (e.g. subscribing to the
... (read more)
5Nathan Young1y
I think this is part of a more general problem that people say things like "I'm not totally EA" when they donate 1%+ of their income and are trying hard. Why create a club where so many are insecure about their membership? I can't speak for everyone, but if you donate even 1% of your income to charities which you think are effective, you're an EA in my book. 
5Aaron Gertler1y
It is one of my deepest hopes, and one of my goals for my own work at CEA, that people who try hard and donate feel like they are certainly, absolutely a part of the movement. I think this is determined by lots of things, including:
1. The existence of good public conversations about donations, cause prioritization, etc., where anyone can contribute
2. The frequency of interesting news and stories about EA-related initiatives that make people feel happy about the progress their "team" is making
I hope that the EA Survey's categories are a tiny speck compared to these.
3Aaron Gertler1y
Thanks for providing a detailed suggestion to go with this critique! While I'm part of the team that puts together the EA Survey, I'm only answering for myself here.

1. People can consider themselves anything they want! It's okay! You're allowed! I hope that a single question on the survey isn't causing major changes to how people self-identify. If this is happening, it implies a side-effect the Survey wasn't meant to have.
2. Have you met people who specifically cited the survey (or some other place the question has shown up — I think CEA might have used it before?) as a source of disillusionment? I'm not sure I understand why people would so strongly prefer being in a "highly engaged" category vs. a "considerably engaged" category if those categories occupy the same relative position on a list. Especially since people don't use that language to describe themselves, in my experience. But I could easily be missing something.

I want someone who earns-to-give (at any salary) to feel comfortable saying "EA is a big part of my life, and I'm closely involved in the community". But I don't think this should determine how the EA Survey splits up its categories on this question, and vice-versa.

*****

One change I'd happily make would be changing "EA-aligned organization" to "impact-focused career" or something like that. But I do think it's reasonable for the survey to be able to analyze the small group of people whose professional lives are closely tied to the movement, and who spend thousands of hours per year on EA-related work rather than hundreds. (Similarly, in a survey about the climate movement, it would seem reasonable to have one answer aimed at full-time paid employees and one answer aimed at extremely active volunteers/donors. Both of those groups are obviously critical to the movement, but their answers have different implications.)

Earning-to-give is a tricky category. I think it's a matter of degree, like the difference betwee
2Nathan Young1y
It's possible that this question is meant to measure something about non-monetary contribution size, not engagement. In which case, say that. Call it "non-financial contribution" and put 4 as "I volunteer more than X hours" and 5 as "I work on a cause area directly or have taken a below-market-salary job".

Nuclear risk is in the news. I hope:
-  if you are an expert on nuclear risk, you are shopping around for interviews and comment
- if you are an EA org that talks about nuclear risk, you are going to publish at least one article on how the current crisis relates to nuclear risk or find an article that you like and share it
- if you are an EA aligned journalist, you are looking to write an article on nuclear risk and concrete actions we can take to reduce it

Factional infighting

[epistemic status - low, probably some elements are wrong]

tl;dr
- communities have a range of dispute resolution mechanisms, from voting to public conflict to some kind of civil war
- some of these are much better than others
- EA has disputes and resources and it seems likely that there will be a high profile conflict at some point
- What mechanisms could we put in place to handle that conflict constructively and in a positive sum way?

When a community grows as powerful as EA is, there can be disagreements about resource allocation.  ... (read more)

9Stefan_Schubert1y
By and large I think this aspect is going surprisingly well, largely because people have adopted a "disagree but respect" ethos. I'm a bit unsure of such a fund - I guess that would pit different cause areas against each other more directly, which could be a conflict framing.  Regarding the mechanism of bargains, it's a bit unclear to me what problem that solves.

EA infrastructure idea: Best Public Forecaster Award

  1. Gather all public forecasting track records
  2.  Present them in an easily navigable form
  3.  Award a prize for the best Brier score across forecasts resolving in the last year (a standard definition is sketched below)
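
For reference, a minimal sketch of the scoring rule this would rely on, assuming binary yes/no forecasts scored individually: the Brier score over $N$ resolved questions is the mean squared error between stated probabilities and outcomes, with lower being better.

$$
\mathrm{BS} = \frac{1}{N}\sum_{t=1}^{N}\left(f_t - o_t\right)^2, \qquad f_t \in [0,1],\; o_t \in \{0,1\}
$$

where $f_t$ is the probability given for question $t$ and $o_t$ is 1 if it resolved yes, 0 otherwise. A perfect forecaster scores 0; always answering 0.5 scores 0.25.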

If this gets more than 20 karma, I'll write a full post on it. This is rough.

Questions that come to mind

Where would we find these forecasts?

To begin with I would look at those with public records:

Beyond these, one could build a community around finding forecasts of public fi... (read more)

Is there a way to sort shortform posts?

I would like to see posts give you more karma than comments (which would hit me hard). Seems like a highly upvoted post is waaaaay more valuable than 3 upvoted comments on that post, but it's pretty often the latter gives more karma than the former.

6ChanaMessinger14d
Sometimes comments are better, but I think I agree they shouldn't be worth exactly the same.
6ChanaMessinger14d
People might also have a lower bar for upvoting comments.
2Nathan Young14d
There you go, 3 mana. Easy peasy.
2Pat Myron15d
simplest first step would be just showing both separately like Reddit
2Nathan Young15d
You can see them separately, but it's how they combine that matters. 
3Pat Myron15d
I know you can figure them out, but I don't see them presented separately on users pages. Am I missing something? Is it shown on the website somewhere?
1jimrandomh15d
They aren't currently shown separately anywhere. I added it to the ForumMagnum feature-ideas repo but not sure whether we'll wind up doing it.
3Nathan Young14d
They are shown separately here: https://eaforum.issarice.com/userlist?sort=karma [https://eaforum.issarice.com/userlist?sort=karma] 
1Pat Myron15d
Is there a link to vote to show interest?

The amount of content on the forum is pretty overwhelming at the moment and I wonder if there is a better way to sort it. 

There is no EA "scene" on twitter.

For good or ill, while there are posters on twitter who talk about EA, there isn't a "scene" (a space where people use loads of EA jargon and assume everyone is EA) or at least not that I've seen.

This surprised me.

EA Book discount codes.

tl;dr EA  books have a positive externality. The response should be to subsidise them

If EA thinks that certain books (Doing Good Better, The Precipice) have greater benefits than they seem, they could subsidise them.

There could be an EA website which has amazon coupons for EA books so that you can get them more cheaply if buying for a friend, or advertise said coupon to your friends to encourage them to buy the book.

From 5 mins of research the current best way would be for a group to buy EA books and sell them at the list price... (read more)

2Ozzie Gooen2y
I think people have been taking up the model of open sourcing books (well, making them free). This has been done for [The Life You can Save](https://en.wikipedia.org/wiki/The_Life_You_Can_Save) and [Moral Uncertainty](https://www.williammacaskill.com/info-moral-uncertainty).  I think this could cost $50,000 to $300,000 or so depending on when this is done and how popular it is expected to be, but I expect it to be often worth it.
1Nathan Young2y
Seems that the ebook/audiobook is free. Is that correct? I imagine being able to give a free physical copy would have more impact.
1SamiM2y
Yes, it's free [https://www.thelifeyoucansave.org/the-book/].
2alexrjl2y
I like this idea and think it's worth you taking further. My initial reactions are:
* Getting more EA books into people's hands seems great and worth much more per book than the cost of a book.
* I don't know how much of a bottleneck the price of a book is to buying them for friends/club members. I know EA Oxford has given away many books; I've also bought several for friends (and one famous person I contacted on Instagram as a long shot who actually replied).
* I'd therefore be interested in something which aimed to establish whether making books cheaper was a better or worse idea than just encouraging people to gift them.
* John Behar/TLYCS probably have good thoughts on this.
1Nathan Young2y
Do you have any thoughts as to what the next step would be? It's not obvious to me what you'd do to research the impact of this. Perhaps have a questionnaire asking people how many people they'd give books to at different prices. Do we know the likelihood of people reading a book they are given?

I'll sort of publicly flag that I sort of break the karma system. Like the way I like to post comments is little and often and this is just overpowered in getting karma.

eg I recently overtook Julia Wise and I've been on the forum for years less than anyone else.

I don't really know how to solve this - maybe someone should just 1 time nuke my karma? But yeah it's true.

Note that I don't do this deliberately - it's just how I like to post and I think it's honestly better to split up ideas into separate comments. But boy is it good at getting karma. And soooo m... (read more)

To modify a joke I quite liked:

Having EA Forum karma tells you two things about a person:

  1. They had the potential to have had a high impact in EA-relevant ways
  2. They chose not to.

I wouldn't worry too much about the karma system. If you're worried about having undue power in the discourse, one thing I've internalized is to use the strong upvote/downvote buttons very sparingly (e.g. I only strong-upvoted one post in 2022 and I think I never strong-downvoted any post, other than obvious spam).

3Felix Wolf21d
Hey Nathan, thank you for the ranking list. :) I don't think you need to start with zero karma again. The karma system is not supposed to mean very much. It favours certain kinds of activity rather than being a true representation of your skill or trustworthiness as a user on this forum. It is more or less an XP bar for social situations and an indicator that someone posts good content here.

Let's look at an example: [https://forum.effectivealtruism.org/posts/aNKuzuQkpfyBLwqqv/announcing-my-retirement ] Aaron Gertler, someone who is held in high regard, retired from the forum, which got a lot of attention and sympathy. Many people were interested in the post, and it's an easy topic to participate in. So many were scrolling down to the comments to write something nice and thank him for his work. JP Addison did so too. He works for CEA and as a developer for the forum. His comment [https://forum.effectivealtruism.org/posts/aNKuzuQkpfyBLwqqv/announcing-my-retirement?commentId=6brKjuA2jPmGDczAq] got more karma than any post he has made so far.

Karma is used in many places with different concepts behind it. The sum of it gives you no clear information. What I would think in your case: you are an active member of the forum, participating positively, with only one post with negative karma. You participated in the FTX crisis discussion, which was an opportunity to gain or lose significant amounts of karma, but you survived it, probably with a good score.

Internet points can make you feel fantastic; they are a system to motivate social interaction and adherence to community norms (in positive and negative ways). Your modesty suits you well, but there is no need for it. Stand tall. There will always be those with few points but really good content, and those who far overshoot the gem-writers through sheer activity.

Question answers

When answering questions, I recommend people put each separate point as a separate answer. The karma ranking system is useful to see what people like/don't like and having a whole load of answers together muddies the water. 

EA global

1) Why is EA Global space constrained? Why not just have a larger venue?

I assume there is a good reason for this which I don't know.

2) It's hard to invite friends to EA global. Is this deliberate?

I have a close friend who finds EA quite compelling. I figured I'd invite them to EA global. They were dissuaded by the fact they had to apply and that it would cost $400.

I know that's not the actual price, but they didn't know that. I reckon they might have turned up for a couple of talks. Now they probably won't apply. 

Is there no way that this event could be more welcoming or is that not the point?

Re 1) Is there a strong reason to believe that EA Global is constrained by physical space? My impression is that they try to optimize pretty hard to have a good crowd and for there to be a high density of high-quality connections to be formed there.

Re 2) I don't think EA Global is the best way for newcomers to EA to learn about EA. 

EDIT: To be clear, neither 1) nor 2) are necessarily endorsements of the choice to structure EA Global in this way, just an explanation of what I think CEA is optimizing for.

EDIT 2 2021/10/11: This explanation may be wrong, see Amy Labenz's comment here.

Personal anecdote possibly relevant for 2): EA Global 2016 was my first EA event. Before going, I had lukewarm-ish feelings towards EA, due mostly to a combination of negative misconceptions and positive true-conceptions; I decided to go anyway somewhat on a whim, since it was right next to my hometown, and I noticed that Robin Hanson and Ed Boyden were speaking there (and I liked their academic work). The event was a huge positive update for me towards the movement, and I quickly became involved – and now I do direct EA work.

I'm not sure that a different introduction would have led to a similar outcome. The conversations and talks at EAG are just (as a general rule) much better than at local events, and reading books or online material also doesn't strike me as naturally leading to being part of a community in the same way.

It's possible my situation doesn't generalize to others (perhaps I'm unusual in some way, or perhaps 2021 is different from 2016 in a crucial way such that the "EAG-first" strategy used to make sense but doesn't anymore), and there may be other costs of having more newcomers at EAG (eg diluting the population of people more familiar with EA concepts), but I also think it's possible my situation does generalize and that we'd be better off nudging more newcomers to come to EAG.

Hi Nathan, 
 

Thank you for bringing this up! 

1) We’d like to have a larger capacity at EA Global, and we’ve been trying to increase the number of people who can attend. Unfortunately, this year it’s been particularly difficult; we had to roll over our contract with the venue from 2020 and we are unable to use the full capacity of the venue to reduce the risk from COVID. We’re really excited that we just managed to add 300 spots (increasing capacity to 800 people), and we’re hoping to have more capacity in 2022. 

There will also be an opportunity for people around the world to participate in the event online. Virtual attendees will be able to enjoy live streamed content as well as networking opportunities with other virtual attendees. More details will be published on the EA Global website the week of October 11.
 

2) We try to have different events that are welcoming to people who are at different points in their EA engagement. For someone earlier in their exploration of EA, the EAGx conferences are going to be a better fit. From the EA Global website:

Effective altruism conferences are a good fit for anyone who is putting EA principles into action through their... (read more)

2Nathan Young1y
Thanks for taking the time to answer. That all makes sense.

UK government will pay for organisations to hire 18-24 year olds who are currently unemployed, for 6 months. This includes minimum wage and national insurance.

I imagine many EA orgs are people constrained rather than funding constrained but it might be worth it. 

And here is a data science org which will train them as well https://twitter.com/John_Sandall/status/1315702046440534017

Note: applications have to be for 30 jobs, but you can apply over a number of organisations or alongside a local authority etc. 

https://www.gov.uk/government/collections/kickstart-scheme

This perception gap site would be a good format for learning and could be used for altruism. It reframes correcting biases as a fun prediction game.

https://perceptiongap.us/

It's a site which gets you to guess what other political groups (republicans and democrats) think about issues.

Why is it good:

1) It gets  people thinking and predicting. They are asked a clear question about other groups and have to answer it.
2) It updates views in a non-patronising way - it turns out dems and repubs are much less polarised than most people think (the stat they give i... (read more)

It is frustrating that I cannot reply to comments from the notification menu. Seems like a natural thing to be able to do.

I wish the forum had a better setting for "I wrote this post and maybe people will find it interesting, but I don't want it on the front page unless they do, because that feels pretentious".

I think the EA forum wiki should allow longer and more informative articles. I think that it would get 5x traffic. So I've created a market to bet. 

Does EA have a clearly denoted place for exit interviews? Like if someone who was previously very involved was leaving, is there a place they could say why?

I think the wiki should be about summarising and synthesising articles on this forum. 

- There are lots of great articles which will be rarely reread
- Many could do with more links to each other and to other key pieces
- Many could be better edited, combined etc
- The wiki could take all content and aim to turn it into a minimal viable form of itself

2Stefan_Schubert1y
Sounds interesting. Can you flesh out a bit more what this should look like, in your view?
4Nathan Young1y
I think that the forum wiki should focus on taking chunks of article text and editing it, rather than pointing people to articles. So take all of the articles on global dev, squish them together or shorten them.  So there would be a page on "research debt" which would contain this article [https://forum.effectivealtruism.org/posts/EbvJRAvwtKAMBn2td/link-distillation-and-research-debt] and also any more text that seemed relevant, but maybe without the introduction. Then a preface on how it links to other EA topics, a link to the original article and links to ways it interacts with other EA topics. It might turn out that that page had 3 or 4 articles squished into one or was broken into 3 or 4 pages. But like Wikipedia you could then link to "research debt" and someone could easily read it.
2Stefan_Schubert1y
Thanks, makes sense. I'd be interested in, e.g. Pablo's view.
4Nathan Young1y
If only we had tagging.

EA criticism

[Epistemic Status: low, I think this is probably wrong, but I would like to debug it publicly]

If I have a criticism of EA along Institutional Decision Making lines, it is this:

For a movement that wants to change how decisions get made, we should make those changes in our own organisations first.

Examples of good progress:
-  prizes - EA orgs have offered prizes for innovation
- voting systems - it's good that the forum is run on upvotes and that often I think EA uses the right tool for the job in terms of voting

Things I would like to see more... (read more)

EA twitter bots

 A set of EA jobs Twitter bots which each retweet a specific set of hashtags, eg #AISafety #EAJob, #AnimalSuffering #EAJob, etc. Please don't get hung up on these; we'd actually need to brainstorm the right hashtags.

You follow the bots and hear about the jobs.

Rather than using Facebook as a way to collect EA jobs we should use an airtable form

1) Individuals finding jobs could put all the details in, saving time for whoever at 80k would otherwise have to do this.

2) Airtable can post directly to facebook, so everyone would still see it https://community.airtable.com/t/posting-to-social-media-automatically/20987

3) Some people would find it quicker. Personally, I'd prefer an airtable form to inputting it to facebook manually every time. 

Ideally we should find websites which often publish useful jobs and then scrape them regularly. 

1Nathan Young2y
It would be good to easily be able to export jobs from the EA job board.
1Nathan Young2y
I suggest at some stage having up and downvoting of jobs would be useful.

Does anyone know people working on reforming the academic publishing process?

Coronavirus has caused journalists to look for scientific sources. There are no journal articles because of the lag time. So they have gone to preprint servers like bioRxiv (pronounced bio-archive). These servers are not peer reviewed so some articles are of low quality. So people have gone to twitter asking for experts to review the papers.

https://twitter.com/ryneches/status/1223439143503482880?s=19

This is effectively a new academic publishing paradigm. If there were support fo... (read more)

2Sanjay3y
HaukeHillebrandt has recommended supporting Prof Chris Chambers to do this: https://lets-fund.org/better-science/ [https://lets-fund.org/better-science/]
  • It is unclear to me that if we chose cause areas again, we would choose global development
  • The lack of a focus on global development would make me sad
  • This issue should probably be investigated and mediated to avoid a huge community breakdown - it is naïve to think that we can just swan through this without careful and kind discussion

If I find this forum exhausting to post on sometimes, I can only imagine how many people bounce off entirely.

The forum has a wiki (like wikipedia)

 The "Criticism of EA Community" wiki post is here.

I think it would be better as a summary of criticisms rather than links to documents containing criticisms.

This is a departure from the current wiki style, so after talking to moderators we agreed to draft externally.

Collaborative Draft:

https://docs.google.com/document/d/1RetcAA7D94y6v3qxoKi_Ven-xF98FjirokvI-g8cKI4/edit# 

Upvote this post if you think the "Criticism of EA Community" post will be better as a collaboratively-written summary. 

Downvote if you ... (read more)

With better wiki features and a way to come to consensus on numbers I reckon this forum can write a career guide good enough to challenge 80k. They do great work, but we are many.

There were too few parties on the last night of EA global in london which led to overcrowding, stressed party hosts and wasting a load of people's time.

I suggest in future that there should be at least n/200 parties where n is the number of people attending the conference. 

I don't think CEA should legislate parties, but I would like to surface in people's minds that if there are fewer than n/200 parties, then you should call up your friend with the most amenable housemates and tell them to organise!

Has Rethink Priorities ever thought of doing a survey of non-EAs? Perhaps paying for a poll? I'd be interested in questions like "What do you think of Effective Altruism? What do you think of Effective Altruists?"

Only asking questions of those who are currently here is survivorship bias. Likewise we could try and find people who left and ask why.

We are definitely planning on doing this kind of research, likely sometime in 2021.

I did a podcast where we talked about EA, would be great to hear your criticisms of it. https://pca.st/i0rovrat

Should I do more podcasts?

3JWS11d
I listened to this episode today Nathan, I thought it was really good, and you came across well. I think EAs should consider doing more podcasts, including those not created/hosted by EA people or groups. They're an accessible medium with the potential for a lot of outreach (the 80k podcast is a big reason why I got directly involved with the community). I know you didn't want to speak for EA as a whole, but I think it was a good example of EA talking to the leftist community in good faith,[1] which is (imo) one of our biggest sources of criticism at the moment. I'd recommend others check out the rest of Rabbithole's series [https://rabbithole.simplecast.com/episodes] on EA - it's a good piece of data on what the American Left thinks of EA at the moment.

Summary:
+1 to Nathan for going on this podcast
+1 for people to check out the other EA-related Rabbithole episodes

1. ^ A similar podcast for those interested would be Habiba's appearance [https://mostinterestingpeople.podbean.com/e/33-habiba-islam-on-the-left-and-effective-altruism/] on Garrison's podcast The Most Interesting People I Know

Any time that you read a wiki page that is sparse or has mistakes, consider adding what you were trying to find. I reckon in a few months we could make the wiki really good to use. 

I sense that Conquest's law is true -> that organisations that are not specifically right wing move to the left.

I'm not concerned about moving to the left tbh but I am concerned with moving away from truth, so it feels like it would be good to constantly pull back towards saying true things. 

I think the forum should have a retweet function but for the equivalent of GitHub forks. So you can make changes to someone's post and offer them the ability to incorporate them. If they don't, you can just remake the article with the changes and an acknowledgement that you did.

I don't think people would actually do that that often, because they'd get no karma most of the time, but it would give karma, attribution trail for:
- summaries
- significant corrections/reframings
- and the author could still accept the edits later

My very quick improving institutional decision-making (IIDM) thoughts

Epistemic status: Weak 55% confidence. I may delete. Feel free to call me out or DM me etc etc. 

I am saying these so that someone has said them. I would like them to be better phrased but then I'd probably never share them. Please feel free to criticise them though I might modify them a lot and I'm sorry if they are blunt:

  • I don't understand what concrete learnings there are from IIDM, except forecasting (which I am biased on). The EIP produced a report which said that in the institut
... (read more)
2Benjamin Start6mo
Haha I don't know what IIDM is but I do know what forecasting is. If I had lots of money one of the things I'd do is create a forecasting news organization. They don't talk about what happened, they talk about what's going to happen. The knowledge transfer is important. People are too spread apart to use one platform, but if there was a list of people who were readily available to share information on certain topics, and their contact info, that would be valuable. 
2Nathan Young6mo
Benjamin, I think you and I are gonna be friends. You at EAG SF? 
3Benjamin Start6mo
This forum is not user-friendly. Took a bit to arrive.  I am not! I applied and didn't get it, I think the movement is bigger than available tickets in a convention. I'm on a few EA discords if you'd like to chat. 

Do we prefer

  • impact tractability neglectedness 
  • scale solvability neglectedness
3Nathan Young9mo
SSN
5Tessa9mo
I have strong "social security number" associations with the acronym SSN. Setting those aside, I feel "scale" and "solvability" are simpler and perhaps less jargon-y words than "impact" and "tractability" (which is probably good), but I hear people use "impact" much more frequently than "scale" in conversation, and it feels broader in definition, so I lean towards "ITN" over "SSN".
3Linch9mo
In my head, "impact" seems to mix together scale + neglectedness + tractability, unless I'm missing something.
2Miranda_Zhang9mo
I actually prefer "scale, tractability, neglectedness" but nobody uses that lol
1Joseph Lemien9mo
ITN.

I am gonna do a set of polls and get a load of karma for it (70% >750).  I'm currently ~20th overall on the forum despite writing few posts of note. I think polls I write create a lot of value and I like the way it incentivises me to think about questions the community wants to answer.

I am pretty happy with the current karma payment, but I'm not sure everyone will be, so I thought I'd surface it. I've considered saying that polls deliver half the karma, but that feels kind of messy and I do think polls are currently underrated on the forum.

Any ideas... (read more)

1Yonatan Cale1y
What? Polls? Do you mean "Questions"?

EA podcasts and videos

Each EA org should pay a $10 bounty for the best Twitter thread talking about each episode. If you could generate 100 quality Twitter threads on 80,000 Hours episodes for $1,000, that would be really cheap. People would quote tweet and discuss and it would make the whole set of knowledge much more legible.

3finm1y
Cool idea, I'll have a think about doing this for Hear This Idea. I expect writing the threads ourselves could take less time than setting up a bounty, finding the threads, paying out etc. But a norm of trying to summarise (e.g. 80K) episodes in 10 or so tweets sounds hugely valuable. Maybe they could all use a similar hashtag to find them — something like #EAPodcastRecap or #EAPodcastSummary
2Nathan Young1y
I recommend a thread of them. I rarely see people using hashtags currently. And I probably agree you could/should write them yourselves, but: - other people might think different things are interesting than you do
1finm1y
Thanks! Sounds right on both fronts.

I edited the Wikipedia section on Doing Good Better to try and make it more reflective of the book and Will's current views. Let me know how you think I did.

https://en.wikipedia.org/w/index.php?title=William_MacAskill&editintro=Template%3ABLP_editintro#Doing_Good_Better

Plant-based meat. Fun video from a youtuber which makes a strong case. Very sharable. https://youtu.be/-k-V3ESHcfA

Why do some shortforms have agree voting and others don't?

7Larks12d
Depends on when the shortform was created.
0Nathan Young12d
As in they've recently removed it? If not, that doesn't seem true.

I notice that sometimes I want to post on something that's on both the EA Forum and LessWrong. And ideally, clicking "see LessWrong comments" would just show them on the current forum page, and if I responded, it would calculate EA Forum karma for the forum and LessWrong karma for LessWrong.

Probably not worth building, but still.

Someone being recommended to learn about EA by listening to 10 hours of podcasts in the wild

Maximise useful feedback, minimise rudeness

When someone says of your organisation "I want you to do X" do not say "You are wrong to want X"

This rudely discourages them from giving you feedback in future. Instead, there are a number of options:

  • If you want their feedback "Why do you want X?" "How does a lack of X affect you?"
  • If you don't want their feedback "Sorry, we're not taking feedback on that right now" or "Doing X isn't a priority for us"
  • If you think they fundamentally misunderstand something "Can I ask you a question relating to X?"

None of these opti... (read more)

Other than my karma, this post got negative karma. Why?

I understand that sometimes I post controversial stuff, but this one is just straightforwardly valuable. 

https://forum.effectivealtruism.org/posts/GshpbrBaCQjxmAKJG/cause-prioritsation-contest-who-bettors-think-will-win

I sense new stuff on the forum is probably overrated. Surely we should assume that most of the most valuable things for most people to read have already been written? 

Have you seen the new features google docs has added recently?

  • Tick boxes
  • Project trackers
  • New types of tables

Feels like they are gunning for Notion. 

The difference between the criticism contest and OpenPhil's cause prioritisation contest is pretty interesting. 60% I'm gonna think OpenPhil's created more value in terms of changes in 10 years' time.

1 minute video summaries of  my EA Criticism contest articles:

Summaries are underrated - https://www.loom.com/share/4781668372694c83a4e9feffe249469b - full text

Improving Karma - https://www.loom.com/share/6d0decef2bd14efc9b22e14d43693002 - full text

Common misconception I see:

Longtermist causes are not:

  • Causes which are much more pressing under longtermism than under other belief systems

Longtermist causes are:

  • Those which are a high priority for marginal resources, whether they are under other belief systems or not.
  • The fact that biorisk and AI risk are high priority without longtermism doesn't make them not "longtermist causes", just as it doesn't make them not "causes that affect people alive today"

How much value is there in combining two EA slacks which discuss the same topic?

Probably $1,000s right? 

Or maybe we should assume it will be a natural process that one will subsume the other?

Effective altruism and politics

Here is an app that lets you vote on other people's comments (I'd like to see it installed in the forum so there is a lower barrier to entry) 

You can add thoughts and try and make arguments that get broad agreement. 

What are the different parties of opinion on EA and politics?

https://pol.is/283be3mcmj
 

1Yonatan Cale1y
An open question for me (for EA Israel? For EA?) is whether we can talk about economic politics publicly in our group. For example, can we discuss openly that "regulating prices is bad"? This is considered an open political debate in Israel; politicians keep wanting to regulate prices (and sometimes they do, and then all the obvious things happen).
2Nathan Young1y
I mean I'd like to chat about that, and maybe happy to on this shortform? But I wouldn't write a post on it. I guess it doesn't seem that neglected to me.
1Yonatan Cale1y
In Israel, it is controversial to suggest not regulating prices, or to suggest lowering import taxes, or similar things. I could say a lot about this, but my points are:  In Israel: 1. It is neglected 2. It means EA would be involved in local politics
1Yonatan Cale1y
I remember I was really jealous of the U.S when Biden suggested some very expensive program (UBI? Some free-medical-care reform?), but he SHOWED where the money is supposed to come from, there was a chart!

EA Wiki

I've decided I'm going to just edit the wiki to be like the wiki I want.

Currently the wiki feels meticulously referenced but lacking in detail. I'd much prefer it to have more synthesised content which is occasionally just someone's opinion. If you dislike this approach, let me know.

8Pablo1y
I do think that many of the entries are rather superficial, because so far we've been prioritizing breadth over depth. You are welcome to try to make some of these entries more substantive. I can't tell, in the abstract, if I agree with your approach to resolving the tradeoff between having more content and having a greater fraction of content reflect just someone's opinion. Maybe you can try editing a few articles and see if it attracts any feedback, via comments or karma?

Why do posts get more upvotes than questions with the same info?

I wrote this question: https://forum.effectivealtruism.org/posts/ckcoSe3CS2n3BW3aT/what-ea-projects-could-grow-to-become-megaprojects

Some others wrote this post summarising it:
https://forum.effectivealtruism.org/posts/faezoENQwSTyw9iop/ea-megaprojects-continued

Why do you think the summary got more upvotes? I'm not upset, I like a summary too, but in my mind, a question that anyone can submit answers to or upvote current answers on is much more useful. So I am confused. Can anyone suggest why?

 

2Jack Malde1y
Anyone can comment on a post and upvote comments so I don't see why a question would be better in that regard. Also the post contained a lot of information on potential megaprojects which is not only quite interesting and educational but also prompts discussion.

At what size of the EA movement should there be an independent EA whistleblowing organisation, which investigates allegations of corruption?

8Larks2y
Can you think of any examples of other movements which have this? I have not heard of such for e.g. the environmentalist or libertarian movements. Large companies might have whistleblowing policies, but I've not heard of any which make use of an independent organization for complaint processing.
1Nathan Young2y
The UK police does. It seems to me if you wanted to avoid a huge scandal you'd want to empower and incentivise an organisation to find small ones.

Some thoughts
- Utilitarianism but being cautious around the weird/unilateral stuff is still good
- We shouldn't be surprised that we didn't figure out SBF was fraudulent quicker than billions of dollars of crypto money... and Michael Lewis
- Scandal prediction markets are the solution here and one day they will be normal. But not today. Don't boo me, I'm right
- Everyone wants whistleblowing, no one wants the correctly incentivised decentralised form of whistleblowing.
- Gotta say, I feel for many random individual people who knew or interacted closely with SB... (read more)

2hawkebia3mo
I remain confused about "utilitarianism, but use good judgement". IMO, it's amongst the more transparent motte-and-baileys I've seen. Here are two tweets from Eliezer that I see are regularly re-shared: This describes Aristotelian virtue ethics - finding the golden mean between excess and deficiency. So are people here actually virtue ethicists who sometimes use math as a means of justification and explanation? Or do they continue to take utilitarianism to some of its weirder places, privately and publicly, but strategically seek shelter under other moral frameworks when criticized?

I'm finding it harder to take people who put "consequentialist" and "utilitarian" in their profiles and about-mes seriously. If people abandon their stated moral framework on big important and consequential questions, then either they're deluding themselves on what their moral framework actually is, or they really will act out the weird conclusions - but are being manipulative and strategic by saying "trust us, we have checks and balances".
2Nathan Young3mo
I don't think you have to abandon it, but you can look twice or ask trusted friends etc etc.  That doesn't mean you can't do the thing you intended to do.
1hawkebia3mo
And what happens when that double-checking comes back negative? And how much weight do you choose to give it? The answer seems to be rooted in matters of judgement and subjectivity. And if you're doing it often enough, especially on questions of consequence, then that moral framework is better described as virtue ethics. Out of curiosity, how would you say your process differs from a virtue ethicist trying to find the golden mean between excess and deficiency?

I strongly dislike the "further reading" sections of the forum wiki/forum tags. 

They imply that the right way to know more about things is to read a load of articles. It seems clear to me that instead we should synthesise these points and then link them where relevant. Then if you wanted more context you could read the links. 

The 'Further reading' sections are a time-cheap way of helping readers learn more about a topic, given our limited capacity to write extended entries on those topics.

Clubhouse Invite Thread

1) Clubhouse is a new social media platform, but you need an invite to join
2) It allows chat in rooms, and networking
3) Seems some people could deliver sooner value by having a clubhouse invite
4) People who are on clubhouse have invites to give
5) If you think an invite would be valuable or heck you'd just like one, comment below and then if anyone has invites to give they can see EAs who want them.
6) I have some invites to give away.
 

Mailing list for the new UK Conservative Party group on China.

Will probably be worth signing up to if that's your area of interest.

https://chinaresearchgroup.substack.com/p/coming-soon

Please comment any other places people could find mailing lists or good content for EA related areas.

I think in SBF we farmed out our consciences. Like people who say "there need to be atrocities in war so that people can live in peace", we thought "SBF can do dodgy coin-trading stuff so that we can help, but let's not think about it". I don't think we could have known about the fraud, but I do think there were plenty of warning signs we ignored because "SBF is the man in the arena". No, either we should have been cogent and open about what he was doing, or we should have said we didn't like it and begun pulling away reputationally.