All of slg's Comments + Replies

Apply for Red Team Challenge [May 7 - June 4]

Noting my excitement that you picked up on the idea and will actually make this happen!

The structure you lay out sounds good.

Regarding the winning team, will there be financial rewards? I’d give it >70% that someone would fund at least a ~$1000 award for the best team.

3Cillian Crosson2mo
Thanks Simon! Currently, we don't plan to provide a financial reward to the winning team (though I must admit, we haven't given this much thought). It's a good point though & we'll consider it further in the coming weeks. If anyone reading this is interested in funding an award for the winning team, please do get in touch.
Where would we set up the next EA hubs?

Do you know which funder is supporting the EA Hotel type thing?

3Chris Leong2mo
Apparently they have support from a private donor.
Where would we set up the next EA hubs?

Maybe you’re already considering this, but here goes anyway:

I'd advise against the name 'longtermist hub'. I wouldn't want longtermism to also become an identity, just as EA is one.

It also has reputational risks—which is why new EA-oriented orgs do not have EA in their name.

3Jonathan_Michel2mo
Very strong upvote. Thanks for commenting this, Simon.
4Severin T. Seehrich2mo
Yes, we are currently working on a better name. Thanks for the input, and feel free to send me a message if you have a great idea.
Apply for Professional Coaching

As far as I understand, sessions will be fully subsidised by TfG. If you can’t afford them, you can choose to pay $0; I'm unsure whether this is standard among EA coaches.

I also think centralisation of psychological services might be valuable, as it makes it easier to match coaches and coachees well and to assess coaching performance.

Managing 'Imposters'

Practical advice for how to run EA organisations is really valuable, thanks for writing this up.

Retrospective on Catalyst, a 100-person biosecurity summit

Hey, I just wanted to leave a note of thanks for this excellent write-up!

Some other EAs and I are planning an event with a similar format; your advice is super helpful for structuring our planning and avoiding obvious mistakes.

In general, these kinds of project management retrospectives provide a lot of value (e.g., EAF's hiring retrospective).

What are some artworks relevant to EA?

This is cool, I had no idea you were also working on this.

Concrete Biosecurity Projects (some of which could be big)

This could be easier, yes. I know of one person who models the defensive potential of different metagenomic sequencing approaches, but I think there is space for at least 3-5 additional people doing this. 

Concrete Biosecurity Projects (some of which could be big)

I think he was explicitly addressing your question of whether sexually transmitted diseases are capable of triggering pandemics, not whether they can end civilization.

Discussing the latter in detail would quickly get into infohazards, but I think we should spend some of our effort (10%) on defending against non-respiratory viruses. I haven't thought about this in depth, though.

Concrete Biosecurity Projects (some of which could be big)

I do mean EAs with a longtermist focus. While writing about highly-engaged EAs, I had Benjamin Todd's EAG talk in mind, in which he pointed out that only around 4% of highly-engaged EAs are working in bio.

And thanks for pointing out I should be more precise. To qualify my statement, I'm 75% confident that this should happen.

Concrete Biosecurity Projects (some of which could be big)

Despite how promising and scalable we think some biosecurity interventions are, we don’t necessarily think that biosecurity should grow to be a substantially larger fraction of longtermist effort than it is currently.

 

Agreed that it shouldn't grow substantially, but ~doubling the share of highly-engaged EAs working on biosecurity feels reasonable to me. 

6MichaelA4mo
FWIW, I don't actually know what you mean/believe here and whether it's different to what the post already said, because:

  • The post said "fraction of longtermist effort" but you're saying "share of highly-engaged EAs". Maybe you're thinking the increased share should mostly come from highly engaged EAs who aren't currently focused on longtermist efforts? That could then be consistent with the post.
  • You said "feels reasonable", which doesn't make it clear whether you think this actually should happen, it probably should happen, it's 10% likely it should happen, it shouldn't happen but it wouldn't be unreasonable for it to happen, etc.
Concrete Biosecurity Projects (some of which could be big)

I have only been involved in biosecurity for 1.5 years, but the focus on purely defensive projects (sterilization, refuges, some sequencing tech) feels relatively recent. It's a lot less risky to openly talk about those than about technologies like antivirals or vaccines.

I'm happy to see this shift, as concrete lists like this will likely motivate more people to enter the space. 

Democratising Risk - or how EA deals with critics

@CarlaZoeC or Luke Kemp, could you create another forum post solely focused on your article? This might lead to more focused discussions, separating the debate on community norms from discussion of the arguments within your piece.

I also wanted to express that I'm sorry this experience has been so stressful. It's crucial to facilitate internal critique of EA, especially as the movement is becoming more powerful, and I feel pieces like yours are very useful for launching constructive discussions.

Countermeasures & substitution effects in biosecurity

I particularly agree with the last point on focussing on purely defensive (not net-defensive) pathogen-agnostic technologies, such as metagenomic sequencing and resilience measures like PPE, air filters and shelters. 

If others in the longtermist biosecurity community share this model of biodefense, I think it'd be important to point towards these countermeasures in introductory materials (the 80k website, reading lists, future podcast episodes).

Exposure to 3m Pointless viewers- what to promote?

I do wonder what the downside is here. It's a fleeting, low-fidelity impression of EA that will probably not stick in most minds. However, if 10-20 people donate money after hearing about it through Patrick, it might already be positive in sum.

2DavidNash5mo
I'd be slightly surprised if it led to a single donation; I'm not even sure how many searches it would lead to.
EA megaprojects continued

Do you specifically object to the term megaproject, or rather to the idea of launching larger organizations and projects that could potentially absorb a lot of money?

If it's the latter, the case for megaprojects is that they are bigger bets, with which funders could have an impact using larger sums of money, i.e., ~1-2 orders of magnitude bigger than current large longtermist grants. It is generally understood that EA has a funding overhang, which is even more true if you buy into longtermism, given that there are few obvious investment opportunities... (read more)

ludwigbald's Shortform

Hey Ludwig, happy to collaborate on this. A bunch of other EAs and I analyzed the initial party programs from an EA perspective; this could easily be adapted to the final agreement and turned into a forum post.

What high-level change would you make to EA strategy?

Caveat: I work in Biosecurity.

I agree with the last point. Based on Ben Todd's presentation at EAG,

  • 18% of engaged EAs work on AI alignment, while
  • 4% work on Biosecurity.

Based on Toby Ord's estimates in The Precipice, the risk of extinction in the next 100 years from

  • Unaligned artificial intelligence is ∼ 1 in 10, while
  • the risk from engineered pandemics is ∼ 1 in 30.

So the stock of people working on AI is 4.5x that in biosecurity, while AI is only 3x as important.
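For clarity, those two ratios follow directly from the figures above:

\[
\frac{18\%}{4\%} = 4.5, \qquad \frac{1/10}{1/30} = 3.
\]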

There is a lot of nuance missing here, but I'm moderately confident that this imbalance... (read more)

What high-level change would you make to EA strategy?

Is there a historical precedent for social movements buying media? If so, it'd be interesting to know how that influenced the outlet's public perception/readership.

As of now, it seems like movements "merely" influence media, such as the NYTimes turning more leftward in the last few years or Vox employing more EA-oriented journalists.

Disagreeables and Assessors: Two Intellectual Archetypes

Spencer Greenberg also comes to mind; he once noted that his agreeableness is in the 77th percentile. I'd consider him a generator.

What EA projects could grow to become megaprojects, eventually spending $100m per year?
Answer by slg, Aug 07, 2021

Launching a Nucleic Acid Observatory, as outlined recently by Kevin Esvelt and others here (link to paper). With $100m one could launch a pilot version covering 5 to 10 states in the US.

Project Ideas in Biosecurity for EAs

Thanks for this write-up. Concerning this point:

Quantitative investigation of tech capabilities required for broad environmental nucleic acid surveillance to be useful

This article provides a good introduction to current challenges within genomic pathogen surveillance: Ten recommendations for supporting open pathogen genomic analysis in public health

The German Effective Altruism Network - recap 2020

Hi, happy to read about where you stand and where you want to go with NEAD. 

FYI, the link in this sentence seems broken: "currently offering self-hosted alternatives to Slack, Google, and Zoom, one reason for this being our concern with risks from data privacy neglect. " 

1Ekaterina_Ilin1y
Oops, thanks for pointing that out - should be fixed now!
If you value future people, why do you consider near term effects?

Hi!

I think you mean to say: "every way a higher growth rate would be good is also an equally plausibly reason it would be bad"

Instead you wrote:

"Evidential symmetry here would be something like: every way a higher growth rate would be good is also an equally plausibly reason it would be good eg. increased emissions are equally likely to be good as they are to be bad.)"

1Alex HT2y
Thank you :) I've corrected it