All of philosophytorres's Comments + Replies

I didn't downvote this comment, but:

a) This may not have been your intention, but even in context, the "white supremacy" claim in the e-book does read as your claim.

b) I don't think "poorer countries should transfer their wealth to richer countries" supports "a political, economic and cultural system in which whites overwhelmingly control power and material resources". The richest countries include many that aren't majority white, such as Singapore, Qatar, the UAE, and Taiwan, so I don't think the 'overwhelmingly' criterion is met here.

c) I'm of the opinion that people should refrain from ever using terms "in a legal scholarly sense"; instead, they should either use a term in its usual sense or coin a new term with a more specific definition. That said, I think a charitable reading of your e-book makes it seem like you are describing certain conclusions of longtermism as supporting 'white supremacy', and that you are using the term in a 'legal scholarly sense', defining it as "a political, economic and cultural system in which whites overwhelmingly control power and material resources". I don't know whether you have made this claim elsewhere, but it did not seem like your e-book claims that "longtermists are white supremacists".
The link above has an additional "." at the end that prevents it from properly working.

I don't know how to embed snapshots, but anyone who wishes is welcome to type "phil torres" into linkedin or email me for the snapshots I've just taken right now - it brings up "Researcher at Centre for the Study of Existential Risk, University of Cambridge". As I say, it's unclear if this is deliberate - it may well be an oversight, but it has contributed to the mistaken external impression that Phil Torres is or was research staff at CSER.

Were the Great Tragedies of History “Mere Ripples”?

[Responding to Alex HT above:]

I'll try to find the time to respond to some of these comments. I would strongly disagree with most of them. For example, one that just happened to catch my eye was: "Longtermism does not say our current world is replete with suffering and death."

So, the target of the critique is Bostromism, i.e., the systematic web of normative claims found in Bostrom's work. (Just to clear one thing up, "longtermism" as espoused by "leading" longtermists today has been hugely influenced by Bostromism -- this is a fact, I believe, about intel... (read more)

Clarifying existential risks and existential catastrophes

Have you seen my papers on the topic, by chance? One is published in Inquiry, the other is forthcoming. Send me an email if you'd like!

John G. Halstead (2y):
please do
Response to recent criticisms of EA "longtermist" thinking

You don't even have the common courtesy of citing the original post so that people can decide for themselves whether you've accurately represented my arguments (you haven't). This is very typical "authoritarian" (or controlling) EA behavior in my experience: rather than giving critics an actual fair hearing, which would be the intellectually honest thing, you try to monopolize and control the narrative by not citing the original source, and then reformulating all the arguments while at the same time describing these reformulations a... (read more)

Book Review: Enlightenment Now, by Steven Pinker

Sloppy scholarship. Please do take a look, if you have a moment:

Aaron Gertler (3y):
I read your full critique of the Existential Risk chapter, and agreed with nearly all of your points, as I think I mentioned when you posted it on the Forum. (I also linked to you in this post, in case you didn't see that on your first reading!) Did you have other criticism of the book beyond that chapter that you felt I should have pointed out?
Would this be a good top-level post for the forum? I imagine lots of EAs have read Enlightenment Now or are planning to read it. It seems relevant to highlight the flaws that an influential book might have relating to its treatment of existential risks.
EA Hotel with free accommodation and board for two years

Wow, this is absolutely stunning. I can't myself participate, but I genuinely hope this project takes off. I'm sure you're familiar with the famous (but now demolished) Building 20 at MIT: it provided a space for interdisciplinary work, and the results were truly amazing.

Thanks! Yes, Building 20 sounded great. It's mentioned in Deep Work, from which I reference the "hub-and-spoke" model in the OP.
What does Trump mean for EA?

Friends: I recently wrote a few thousand words on the implications that a Trump presidency will have for global risk. I'm fairly new to this discussion group, so I hope posting the link doesn't contravene any community norms. Really, I would eagerly welcome feedback on this. My prognosis is not good.

Some considerations for different ways to reduce x-risk

A fantastically interesting article. I wish I'd seen it earlier -- around the time this was published (last February) I was completing an article on "agential risks" that ended up in the Journal of Evolution and Technology. In it, I distinguish between "existential risks" and "stagnation risks", each of which corresponds to one of the disjuncts in Bostrom's original definition. Since these have different implications -- I argue -- for understanding different kinds of agential risks, I think it would be good to standardize the n... (read more)

Two Strange Things About AI Safety Policy

Oh, I see. Did they not ask for his approval? I'm familiar with websites devising their own outrageously hyperbolic headlines for articles authored by others, but I genuinely assumed that a website as reputable as Slate would have asked a figure as prominent as Bostrom for approval. My apologies!

The Map of Impact Risks and Asteroid Defense

Very interesting map. Lots of good information.

Note: news publications impose titles on authors without consulting them. Obviously he would never write that title.