sawyer

Sawyer Bernath is the Executive Director of the Berkeley Existential Risk Initiative (BERI). Prior to joining BERI in July 2019, he was a Production Manager at Research Electro Optics (now Excelitas Boulder) and Head of Production at Modular Robotics. He has a B.S. in physics from Tufts University.

Comments

St. Petersburg Demon – a thought experiment that makes me doubt Longtermism

This might be a disagreement about whether or not it's appropriate to use "infinity" as a number (i.e. a value). Mathematically, if a function approaches infinity as the input approaches infinity, I think you're typically supposed to say the limit is "undefined," as opposed to saying the limit is "infinity." So whether this is (a) underselling it or (b) just writing accurately depends on the audience.
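The two conventions the comment contrasts can be written out explicitly; a minimal illustration using f(x) = x (my example, not from the original discussion):

```latex
% Convention (a): extended notation -- the limit "is infinity",
% read as shorthand for "f(x) increases without bound":
\lim_{x \to \infty} x = \infty

% Convention (b): strictly, no real number L satisfies the
% definition of the limit, so the limit does not exist:
\nexists\, L \in \mathbb{R} \;:\; \lim_{x \to \infty} x = L
```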

Why Helping the Flynn Campaign is especially useful right now

SBF's Protect Our Future PAC has put more than $7M towards Flynn's campaign. I think this is what _pk and others are concerned about, not direct donations. And this is what most people concerned with "buying elections" are concerned about. (This is what the Citizens United controversy is about.)

Future Matters: March 2022

That seems like a positive adjustment from my perspective! I think the interviews are valuable content, so I'd still encourage you to add a link to the interview, with the name and topic of the person you interviewed. That way interested Substack readers will still see it.

"Long-Termism" vs. "Existential Risk"

As Nathan Young mentioned in his comment, this argument is also similar to Carl Shulman's view expressed in this podcast: https://80000hours.org/podcast/episodes/carl-shulman-common-sense-case-existential-risks/

Is there an EA grants database?

(Assuming you mean "rot")

As for specific needs: nothing very specific. Sometimes I wonder how much overlap there is in grantees between the different grantmakers, and having them all in one table where they can be collectively sorted and filtered would make that easier to see. I just generally think it's good to have transparency in grantmaking, and a single source that covers >90% of what people might consider "EA grantmaking" is more transparent than asking people to look at several different HTML tables or non-tabular lists.
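The single-table idea described above could be sketched roughly as follows. This is a hypothetical illustration only: the funder names, grantee names, and column names are all invented, not drawn from any real grants data.

```python
# Hypothetical sketch: combine grant lists from several grantmakers
# into one table, then check which grantees appear under more than
# one funder. All data below is made up for illustration.
import pandas as pd

open_phil = pd.DataFrame({
    "grantee": ["Org A", "Org B"],
    "funder": ["Open Philanthropy"] * 2,
    "amount": [100_000, 250_000],
})
ea_funds = pd.DataFrame({
    "grantee": ["Org B", "Org C"],
    "funder": ["EA Funds"] * 2,
    "amount": [50_000, 75_000],
})

# One combined table that can be collectively sorted and filtered
all_grants = pd.concat([open_phil, ea_funds], ignore_index=True)

# Grantees funded by more than one grantmaker
overlap = (
    all_grants.groupby("grantee")["funder"]
    .nunique()
    .loc[lambda s: s > 1]
)
print(sorted(overlap.index))  # → ['Org B']
```

With real data, the main work would be normalizing grantee names across sources before grouping; the concat-and-group step itself stays this simple.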

$100 bounty for the best ideas to red team

Why might one believe that MacAskill and Ord's idea of The Long Reflection is actually a bad idea, or impossible, or that it should be dropped from longtermist discourse for some other reason?

Robin Hanson's argument here: https://www.overcomingbias.com/2021/10/long-reflection-is-crazy-bad-idea.html

Future Matters: March 2022

Very cool! I'm super happy that this exists, and I'm excited by this first issue. On the constructive criticism side, I think this is too long for a newsletter. It's unlikely that I'll fully read future editions, and if they're all this long, I might unsubscribe at some point. So consider this one vote for trying to make the newsletter shorter :)

Is there an EA grants database?

Thanks Yonatan, this is great! Glad to see this was so straightforward; I appreciate you putting it together. Misha seems to have taken care of the EA Funds part, at least up to mid-2021, so we're getting close. I'm planning to merge them in one direction or another.

Is there an EA grants database?

Very cool Misha, thanks! Do you plan to keep this updated over time? If so, I think integrating Yonatan's sheet into this Airtable (or vice versa) would already accomplish most of what I was looking for.
