David Thorstad

Assistant Professor of Philosophy @ Vanderbilt University
1289 karma · Joined Apr 2021 · www.dthorstad.com

Comments (102)

I’d like to hope that academics are aiming for a level of understanding above that of a typical user on an Internet forum.

All academic works have a right to reply. Many journals print response papers and it is a live option to submit responses to critical papers, including mine. It is also common to respond to others in the context of a larger paper. The only limit to the right of academic reply is that the response must be of suitable quality and interest to satisfy expert reviewers.

That's fair! But I also think most op-eds on any topic are pretty bad. As for academic papers, I have to say it took me at least a year to write anything good about EA, and that was on a research-only postdoc with 50% of my research time devoted to longtermism. 

There's an awful lot that has been written on these topics, and catching up on the state of the art can't be rushed without bad results. 

Strongly agree. I think there's also a motivation gap in knowledge acquisition. If you don't think there's much promise in an idea or a movement, it usually doesn't make sense to spend years learning about it. This leads to large numbers of very good academics writing poorly-informed criticisms. But this shouldn't be taken to indicate that there's nothing behind the criticisms. It's just that it doesn't pay off career-wise for these people to spend years learning enough to press the criticisms better.

Glad it helped! All credit to Nicholas who wrote 99% of it. If you have a minute, I uploaded a talk version of the paper last week. Would love to hear what you think, especially re accessibility: 

You folks impress me! But seriously, that's a big ask.

To be fair, this could trigger lawsuits. I hope someone is reflecting on FTX, but I wouldn't expect anyone to be keen on discussing their own involvement with FTX publicly and in great detail.

Here's a gentle introduction to the kinds of worries people have (https://spectrum.ieee.org/power-problems-might-drive-chip-specialization). Of the cited references, "The Chips Are Down for Moore's Law" is probably the best on this issue, but it's a little longer and harder. There's plenty of literature on problems with heat dissipation if you search the academic literature. I can dig up references on energy if you want, but with Sam Altman saying we need a fundamental energy revolution even to get to AGI, is there really much controversy over the idea that we'll need a lot of energy to get to superintelligence?

Ah - that comes from the discontinuity claim. If you have accelerating growth that isn't sustained for very long, you get something like population growth from 1800-2000, where the end result is impressive but hardly a discontinuity comparable to crossing the event horizon of a black hole. 

(The only way around the assumption of sustained growth would be to posit one or a few discontinuous leaps toward superintelligence. But that's harder to defend, and it abandons what was classically taken to ground the singularity hypothesis, namely the appeal to recursive self-improvement.)

Here's the talk version for anyone who finds it easier to listen to videos: 
