richard_ngo

AI safety research engineer at DeepMind (all opinions my own, not theirs). I'm from New Zealand and now based in London; I also did my undergrad and master's degrees in the UK (in Computer Science, Philosophy, and Machine Learning). Blog: thinkingcomplete.blogspot.com

richard_ngo's Comments

Systemic change, global poverty eradication, and a career plan rethink: am I right?

A while back I read this article by Hickel, which was based on the book. See this EA forum post, which I made after reading it, along with the comments I wrote down at the time:

  • The article was much better than I thought it would be based on the first few paragraphs.
  • Still a few dodgy bits though - e.g. it quotes FAO numbers on how many people are hungry, but neglects to mention that this is good progress (quote from FAO): "The proportion of undernourished people in the developing regions has fallen by almost half. One in seven children worldwide are underweight, down from one in four in 1990."
  • I also tried to factcheck the claim that the specific number $1.90 is a bad metric to use, by reading the report he said was a "trenchant critique" of it. There was a lot of stuff about how a single summary statistic can be highly uncertain, but not much about why any other poverty line would be better.
  • Overall I do think it's fairly valuable to point out that there's been almost no movement of people above $7.40 if you exclude China. But the achievement of moving a lot of people above the $1.90 number shouldn't be understated.
  • This graph seems like the most important summary of the claims in the article. So basically, the history of poverty reduction has been almost entirely about Asia, and its future (or lack thereof) will be almost entirely about Africa.
  • Note that the graph displays absolute numbers of people. The population of Africa went from 600M in 1990 to 1.2B today, and is apparently projected to reach 1.6B by 2030. So according to this graph there has been a reduction in the percentage of Africans living in extreme poverty, and that's projected to continue, but at a far slower rate than in Asia (see the rough illustration below).
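A minimal sketch of that last point, using the population figures quoted above and a purely hypothetical flat headcount of 400M people in extreme poverty (the real headcounts are in the graph, which I don't have to hand): a constant absolute number of poor people over a growing population still means a falling poverty rate.

```python
# If the absolute number of people in extreme poverty stays flat while the
# population grows, the poverty *rate* still falls.
# Population figures are the ones quoted above; the 400M headcount is
# purely hypothetical, for illustration only.

population = {"1990": 600e6, "today": 1.2e9, "2030 (projected)": 1.6e9}
poor_headcount = 400e6  # hypothetical, held constant across all three dates

for label, pop in population.items():
    rate = poor_headcount / pop
    print(f"{label}: {rate:.0%} of the population in extreme poverty")

# Prints roughly 67%, 33%, and 25% - the rate falls even though the
# absolute headcount doesn't.
```
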
richard_ngo's Shortform

One use case of the EA forum which we may not be focusing on enough:

There are some very influential people who are aware of and somewhat interested in EA. Suppose one of those people checks in on the EA forum every couple of months. Would they be able to find content which is interesting, relevant, and causes them to have a higher opinion of EA? Or if not, what other mechanisms might promote the best EA content to their attention?

The "Forum Favourites" partly plays this role, I guess. Although because it's forum regulars who are most likely to highly upvote posts, I wonder whether there's some divergence between what's most valuable for them and what's most valuable for infrequent browsers.

How should we run the EA Forum Prize?

Personally I find the prize disproportionately motivating, in that it increases my desire to write EA forum content to a level beyond what I think I'd endorse if I reflected for longer.

Sorry if this is not very helpful; I imagine it's also not very representative.

Should EA Buy Distribution Rights for Foundational Books?

As one data point, the Institute of Economic Affairs (which has had pretty major success in spreading its views) prints out many short books advocating its viewpoints and hands them out at student events. That certainly made me engage with their ideas significantly more, then give the books to my friends, etc. I think they may get economies of scale from having their own printing press, but it might be worth looking into how cheaply you can print out 80-page EA primers for widespread distribution.

Max_Daniel's Shortform

People tend to underestimate the importance of ideas, because it's hard to imagine what impact they will have without doing the work of coming up with them.

I'm also uncertain how impactful it is to find people who're good at generating ideas, since the best of them will probably become prominent regardless. But setting that aside, it seems to me that you've now agreed with the three points the influential EA made. Those weren't comparative claims about where to invest marginal resources, but rather the absolute claim that it'd be very beneficial to have more talented people.

Then the additional claim I'd make is: some types of influence are very valuable and can only be gained by people who are sufficiently good at generating ideas. It'd be amazing to have another Stuart Russell, or someone in Steven Pinker's position but more on board with EA. But they both got there by making pioneering contributions in their respective fields. So when you talk about "accumulating AI-weighted influence", e.g. by persuading leading AI researchers to be EAs, that too involves gaining more talented members of EA.

[Link] "Will He Go?" book review (Scott Aaronson)
Thanks for sharing the last link, which I think provides useful context (that Open Philanthropy's funder has a history of donating to partisan political campaigns).

Why is this context useful? It feels like the relevance of this post should not be particularly tied to Dustin and Cari's donation choices.

the upshot of this post is effectively an argument that supporting Biden's campaign should be thought of as an EA cause area

Is "X should be thought of as an EA cause area" distinct from "X would be good"? More generally, I'd like the forum to be a place where we can share important ideas without needing to include calls to action.

On the other hand, I also endorse holding political posts to a more stringent standard, so that we don't all get sucked in.

Max_Daniel's Shortform

The task X for which the claim seems most true to me is "coming up with novel and important ideas". This seems to be very heavy-tailed, and not very teachable.

I would also expect that, if I poked a bit at these claims, it would usually turn out that X is something like "contribute to this software project at the pace and quality level of our best engineers, w/o requiring any management time" or "convince some investors to give us much more money, but w/o anyone spending any time transferring relevant knowledge".

Neither of these feels like a central example of the type of thing EA needs most. Most of the variance in the software project's impact will come from how good the underlying idea is; the same goes for most of the variance in the impact of getting funding.

Robin Hanson is someone who's good at generating novel and important ideas. Idk how he got that way, but I suspect it'd be very hard to design a curriculum to recreate that. Do you disagree?

richard_ngo's Shortform

Then there's the question of how many fields it's actually important to have good research in. Broadly speaking, my perspective is: we care about the future; the future is going to be influenced by a lot of components; and so it's important to understand as many of those components as we can. Do we need longtermist sociologists? Hell yes! Then we can better understand how value drift might happen, and what to do about it. Longtermist historians to figure out how power structures will work, longtermist artists to inspire people - as many as we can get. Longtermist physicists - Anders can't figure out how to colonise the galaxy by himself.

If you're excited about something that poses a more concrete existential risk, then I'd still advise that as a priority. But my guess is that there's also a lot of low-hanging fruit for would-be futurists in other disciplines.

richard_ngo's Shortform

Another related thing that isn't discussed enough is the immense difficulty of actually doing good research, especially in a pre-paradigmatic field. I've personally struggled to transition from engineer mindset, where you're just trying to build a thing that works (and you'll know when it does), to scientist mindset, where you need to understand the complex ways in which many different variables affect your results.

This isn't to say that only geniuses make important advances, though - hard work and persistence go a long way. As a corollary, if you're in a field where hard work doesn't feel like work, then you have a huge advantage. And it's also good for building a healthy EA community if even people who don't manage to have a big impact are still excited about their careers. So that's why I personally place a fairly high emphasis on passion when giving career advice (unless I'm talking to someone with exceptional focus and determination).
