Working to reduce extreme suffering for all sentient beings.
Author of Suffering-Focused Ethics: Defense and Implications & Reasoned Politics.
Co-founder (with Tobias Baumann) of the Center for Reducing Suffering (CRS).
ebooks available for free here and here.
It's unfortunate that the quote I selected implies "all minimalist axiologies", when I was really only trying to talk about this post.
Perhaps it would be good to add an edit on that as well? E.g. "The author agrees that the answers to these questions are 'yes' (for the restricted class of minimalist axiologies he explores here)." :)
(The restriction is relevant, not least since a number of EAs do seem to hold non-experientialist minimalist views.)
The author agrees that the answers to these questions are "yes".
Not quite. The author assumes a certain class of minimalist axiologies (experientialist ones), according to which the answers to those questions are "yes".
The author does not agree that the answer to question 2 is "yes" for minimalist views in general, since other minimalist views may hold that the answer is "no" (as he clarifies in the first footnote).
The author's main point is that perhaps you shouldn't be worried about that.
I don't think that's a fair summary. The main points of the piece would be better summarized as follows:
This analysis seems to neglect all "net negative outcomes", including scenarios in which s-risks are realized (as Mjeard noted), the badness of which can go all the way to the opposite extreme (see e.g. "Astronomical suffering from slightly misaligned artificial intelligence").
Including that consideration may support a more general focus on ensuring a better quality of the future, which may also be supported by considerations related to grabby aliens.
I think it's important to stress that it's not just that some people with an extremely high IQ fail to change their minds on certain issues, and more generally fail to overcome confirmation bias (which I think is fairly unsurprising). A key point is that there actually doesn't appear to be much of a correlation at all between IQ and resistance to confirmation bias.
So to slightly paraphrase what you wrote above, I didn't just write the post because a correlation across a population is of limited relevance when you're dealing with a smart individual who lacks one of these traits, but also because for a number of these traits (e.g. interpersonal kindness, being driven, and limiting confirmation bias), there seems to be virtually no correlation in the first place. And also because these other skills are likely easier to improve than IQ is, implying that there is a tractability case for focusing more on developing and incentivizing these other traits.
You argue that EA overrates IQ
As noted above, my main claim is not that "EA overrates IQ" at a purely descriptive level, but rather that other important traits deserve more focus in practice (because those other important traits seem neglected relative to smarts, and also because — at the level of what we seek to develop and incentivize — those other traits seem more elastic and improvable).
I noted in the comment above that:
one line of evidence I have for this is how often I see references to smarts, including in internal discussions related to career and hiring decisions, compared to other important traits.
Without directly quoting anyone, I can, to be more specific, say that I've seen relatively senior people in EA imply that certain EA organizations (including CRS, where I work) will be eager to hire applicants if they are extremely smart. That's the kind of sentiment I feel I've seen quite often, and with which I strongly disagree, because being "extremely smart" is far from being sufficient, even if the person in question has altruistic values.
“Science advances one funeral at a time.” If that’s true,
If that were literally true, then science wouldn't ever advance much. :)
It seems that most scientists are in fact willing to change their minds when strong evidence has been provided for a hypothesis that goes against the previously accepted view. The "Planck principle" seems more applicable to scientists who are strongly invested in a given hypothesis, but even in that reference class, I suspect that most scientists do actually change their minds during their lifetime when the evidence is strong. And even if that were not the case, I don't think it would count as compelling evidence that IQ is not strongly correlated with reduced confirmation bias. (E.g. non-scientists might still do far worse.)
I think stronger evidence for a weak or non-existent correlation between IQ and resistance to confirmation bias is found in the psychological studies on the matter. :)
Thanks for your comment and for listing those traits and skills; I strongly agree that those are all useful qualities. :)
One might argue that willingness to do grunt work, taking initiative, and mental stamina all belong in a broader "drive/conscientiousness" category, but I think they are in any case important and meaningfully distinct traits worth highlighting in their own right.
Likewise, one could perhaps argue that "ability to network well" falls under a broader category of "social skills", in which interpersonal kindness and respect might also be said to fall (as a somewhat distinct trait or ability, cf. the cognitive vs. affective empathy distinction; networking ability probably draws more strongly on cognitive empathy while [genuine] interpersonal kindness probably relies more on affective empathy). A related trait one could list in that category is skill in perspective-taking.
Regarding the correlation point, I agree that IQ is likely correlated with many of the traits I listed, but I don't believe that this is a strong reason to think that we are not overemphasizing IQ relative to these other traits. Moreover, as noted in another comment, a reason to focus more on these other traits relative to IQ at the level of what we seek to develop individually and incentivize collectively is that many of these other traits and skills probably are more elastic and improvable than is IQ.
As for how many of these traits are correlated significantly with IQ, it's worth noting that — beyond "being driven" and "interpersonal kindness" — myside bias (also) appears to show “very little relation to intelligence”. And I likewise doubt that IQ has much of a correlation with a willingness to face unpleasant and inconvenient conclusions, or resistance to signaling-related distortions. (Some relevant albeit weak and indirect evidence regarding IQ and signaling-related distortions — specifically when it comes to distortions due to partisan/tribal loyalties — is that greater knowledge of political matters, which is presumably a decent proxy of IQ, does not seem to improve people's ability to provide an accurate representation of the opposite side's views, even when subjects are given a financial incentive.)
Thanks, it looks interesting. :)
Thanks for your comment, Linch. :)
It's a fair point that my post was quite vague on some key points, and your comment provides a great invitation for me to try to clarify my claims and views a bit.
The article claims that an important trait (smarts) is overrated as a precondition to impact
I actually wouldn't say that that's my core claim, although I do agree with it.
My claim about overemphasis relates more to the level of actions, norms, and practical focus than to predictions about how much variance in impact IQ accounts for. (This is somewhat analogous to the distinction between procedural vs. declarative knowledge, as well as to the intention-behavior gap.)
That is, it's possible that we're mostly right about how much variance different factors predict (or at least that we would be right on reflection, cf. your note in the other comment about how our immediate intuitions might be wrong), yet that we're nonetheless off in terms of how much we focus on developing and selecting for those respective factors in practice (including, and perhaps especially, when it comes to less tangible "focus promoters" such as norms, informal prestige conferral, and daydreams).
So I think IQ is probably somewhat descriptively overrated (more on this below), but I think the degree to which it is overemphasized at the level of norms, actions, and salient decision criteria is considerably stronger. One line of evidence I have for this is how often I see references to smarts, including in internal discussions related to career and hiring decisions, compared to other important traits.
How much do I think these other things are underemphasized, in quantitative terms? It is difficult for me to put a precise number on it, but my sense is that it would be good if most of the other traits and virtues I listed were to receive at least twice as much attention as they currently do, both in terms of how much time people devote to cultivating them in personal development efforts and in terms of how often these virtues are emphasized in the broader discourse among aspiring effective altruists. And beyond neglectedness, a reason to focus more on these other traits relative to smarts, at the level of what we seek to develop individually and incentivize collectively, is that those other traits and virtues are likely more elastic and improvable than IQ is (which isn't to say that IQ cannot also be improved).
How well does IQ predict "impact"?
Next, regarding the question of how well IQ predicts impact, I think this depends critically on how we define "impact". This may feel like a trivial point, but please bear with me as I try to explain where I'm coming from. :)
I like that you specified the following in your other comment, namely that you estimated impact roughly in terms of "what prediction-evaluation setups would say about someone's past impact five years from now". That's a clearly specified point in time.
However, I think it's likely that impact assessments will diverge substantially depending on the timeframe (cf. our vast uncertainty over time and the "Three Mile Island effect"). This also relates to the virtues I listed in the post.
For example, I think it's possible (perhaps ~10 percent likely) that the community ends up going in a highly suboptimal direction due to focusing too exclusively on metrics such as "number of publications" or "useful theoretical insights provided" over, say, a five-year period, while neglecting less tangible factors such as interpersonal kindness and social health, which may gradually — in less noticeable ways that might only become apparent over longer timespans — lead to corrosion, burnout, or conflicts. (And the lack of emphasis on such less tangible factors might also be driving people away in the short term, in ways that are probably easy to miss by potential evaluators of impact.)
Likewise, it could be that factors such as "attention to social aspects" explain relatively little individual variation in impact, yet that they are nonetheless critical in terms of the community's success or failure. (Similar to how individual variation in some traits is less predictive of certain outcomes than is country-level variation. Indeed, individual-level success is not always conducive to collective success — sometimes it's even detrimental to it; altruistic behaviors that are too babbler bird-esque might be a concrete example of that.)
Finally, I think the point about clarifying fundamental issues, specifically fundamental values, is critical. After all, an impact evaluation that is made relative to some pre-specified set of values (that is held constant) may diverge greatly from an evaluation — even a five-year evaluation — that also factors in moral reflection, and which evaluates impact based on the updated values endorsed on reflection. Such reflection and consequently updated evaluative criteria may even flip the sign of one's impact.
I'd expect IQ to be significantly better correlated with impact based on the former kind of evaluation (where I might roughly agree with your estimates in the case of a five-year assessment*) vs. the latter evaluation (which in idealized terms one could think of as "an impact evaluation made relative to the values that the person would endorse if they had focused chiefly on value exploration their entire life" — something that more limited value reflection efforts could presumably approximate).
In the latter case, IQ might still come close to being the main predictor, but I suspect that a construct tracking "focus on fundamental values" might do even better among aspiring EAs (not least because changes in fundamental values can change the consequent evaluations a lot). That's one of the reasons I think it's worth focusing much more on fundamental values. :)
Added; thanks for the suggestion! :)