I agree with this.
"Number of publications" and "Impact per publication" are separate axes, and leaving the latter out produces a poorer landscape of X-risk research.
Glad to hear that the links were useful!
Keeping to Holden's timeline sounds good, and I agree that AGI > HLMI in terms of recognizability. I hope the quiz goes well once it is officially released!
I am not the best person to ask this question (@so8res, @katja_grace, @holdenkarnofsky) but I will try to offer some points.
I completed the three quizzes and enjoyed them thoroughly.
Without any further improvements, I think these quizzes would still be quite effective. It would be nice to have a completion counter (e.g., an X/Total questions complete) at the bottom of the quizzes, but I don't know if this is possible on quizmanity.
Got through about 25% of the essay and I can confirm it's pretty good so far.
Strong upvote for introducing me to the authors and the site. Thank you for posting.
Every time I think about how I can do the most good, I am burdened by questions roughly like
I do not have good answers to these questions, but I would bet on some actions being positively impactful on net.
For example
W.r.t. the action that is most positively impactful, my intuition is that it would take the form of safeguarding humanity's future or protecting life on Earth.
Some possible actions that might fit this bill:
The problem here is that some of these actions might spawn harm, particularly (2) and (3).
As per my last shortform, over the next couple of weeks I will be moving my brief profiles of different catastrophes from my draft existential risk frameworks post into shortform posts, to make the frameworks post lighter and simpler.
In my last shortform, I included the profile for the use of nuclear weapons; today I will include the profile for climate change.
Does anyone have a good list of books related to existential and global catastrophic risk? This doesn't have to just include books on X-risk / GCRs in general, but can also include books on individual catastrophic events, such as nuclear war.
Here is my current resource landscape (these are books I have personally looked at and can vouch for; the entries came to mind as I wrote them, since I do not keep a list of GCR / X-risk books at the moment, and I have not read some of them in full):
General:
AI Safety
Nuclear risk
General / space
Biosecurity
I have been working on a post that introduces a framework for existential risks that I have not seen covered on either LW or EAF, but I think I've impeded my progress by setting out to do more than I originally intended.
Rather than simply introduce the framework and compare it to Bostrom's 2013 framework and the Wikipedia page on GCRs, I've tried to aggregate all global and existential catastrophes I could find under the "new" framework.
Creating an anthology of global and existential catastrophes is something I would like to complete at some point, but doing so in the post I've written would be overkill and would not be in line with the goal of "making the introduction of this little-known framework brief and simple".
To make my life easier, I am going to remove the aggregated catastrophes section of my post. Instead, I will work incrementally (and somewhat informally) through shortform posts, accumulating links and notes on, and thinking about, each global and/or existential catastrophe.
Each shortform post in this vein will pertain to a single type of catastrophe. Of course, I may post other shortforms in between, but my goal generally is to cover the different global and existential risks one by one via shortform.
As was the case in my original post, I include DALLE-2 art with each catastrophe, and the loose structure for each entry is Risk, Links, Forecasts.
Here is the first catastrophe in the list. Again note that I am not aiming for comprehensiveness here, but rather am trying to get the ball rolling for a more extensive review of the catastrophic or existential risks that I plan to complete at a later date. The forecasts were observed on October 3, 2022 and represent the community's uniform median forecast.
I should have chosen a clearer phrase than "not through formal channels". What I meant was that much of my forecasting work and experience came about through my participation on Metaculus, which is "outside" of academia; this participation did not manifest as forecasting publications or assistantships (as it would through a Master's or PhD program), but rather as my track record (linked in the CV to my Metaculus profile) and my GitHub repositories. There was also a forecasting tournament I won, which I also linked in the CV.