
Thanks, great to hear you found it useful!

As you mention, the export controls are aimed at, and have the primary effect of, differentially slowing down a specific country's AI development, rather than AGI development overall.

This has a few relevant side effects, such as reduced proliferation and competition, but doesn't slow down the frontier of overall AGI development (nor does it aim to do so).


Thanks a lot and good point, edited to include full names and links!

For more information on the current funding situation, here is OpenPhil's latest update indicating that their assets are down ~40%, limiting growth in their commitment compared to previous projections, and GiveWell's latest funding projection update indicating that they "don’t expect to have enough funding to support all the cost-effective opportunities [they] find".

Thank you very much for sharing this honest account of your experience!

Failure is an inherent byproduct of taking more risks, and it's really hard to write openly about career near-misses like this one, as failing (really) hurts.

However, accounts of failures, near-misses, and missed opportunities are extremely valuable for all of us to learn from and move forward, especially as we embark on more ambitious high-risk, high-reward projects.

This post is especially valuable as it also sheds more light on paths towards high-impact political roles, so thanks again!

Completely agree! Although I imagine that the situation will change soon due to 1) final funding decisions being finalized, 2) funded projects coming out of stealth mode, 3) more rejected applicants posting their applications publicly (when there are few downsides to doing so), and 4) the Future Fund publishing a progress report in the coming months.

So I expect the non-disclosure issue to be significantly reduced in the next few months.

Thanks for sharing the presentation, great work!

Regarding the third question from the audience, "What kind of resource could we share with a random person on the street if we want to introduce them to AI x-risk?", in addition to the resources you mention I think Stuart Russell's 2021 BBC Reith Lectures series, "Living with Artificial Intelligence", is an excellent introduction for a generalist audience.

In addition to being accessible, the talks carry the institutional gravitas of a prestigious BBC lecture series delivered by an established academic, which makes them more likely to persuade a generalist audience.

I think the marginal value of a pre-order of What We Owe the Future is much higher than a pre-order of Gates's book, as Gates's book has a much higher baseline probability of ending up as a bestseller and receiving significant press coverage thanks to Gates's fame.

As usual, it would be great to see downvotes accompanied by reasons for downvoting, especially in the case of NegativeNuno's comments, since it's an account literally created to provide frank criticism with a clear disclaimer in its bio.

Thanks for this writeup! I had never thought about snakebites as a major issue before, despite their similarity to "obvious" global health issues like malaria.

This issue is gaining the attention of EU policymakers, including MEPs.

On April 20, an MEP from the Greens/EFA political group tabled a parliamentary question on the issue, citing recent research reviews to note that high-welfare octopus farming is impossible.

He asks whether the European Commission can "confirm the incompatibility of commercial octopus farming investments with the ‘do no significant harm’ principle, which underpins the EU’s sustainable finance policies and is the basis for EU taxonomy".
