Jacob Eliosoff

Comments

EA needs consultancies

Well, my point wasn't to prove you wrong.  It was to see what people thought about a strong version of what you wrote: I couldn't tell if that version was what you meant, which is why I asked for clarification.  Larks seemed to think that version was plausible anyway.

EA needs consultancies

All right.  Well, I know you're a good guy, just keep this stuff in mind.

Out of curiosity I ran the following question by our local EA NYC group's Slack channel and got the following six responses.  In hindsight I wish I'd given your wording, not mine, but oh well, maybe it's better that way.  Even if we just reasonably disagree at the object level, these responses are worth considering in terms of optics.  And this was an EA crowd; we can only guess how the public would react.

Jacob: what do y'all think about the following claim: "before EA the intersection of people who were very concerned about what was true, and people who were trying hard to make the world a better place, was negligible"

Jacob: all takes welcome!

A: I think it's false 😛 as a lot of people are interested in the truth and trying hard to make the world a better place

B: also think it's false; wasn't this basically the premise of the enlightenment?

B: Thinking e.g. legal reforms esp. french revolution and prussian state, mexican cientificos, who were comteans

B: might steelman this by specifying the entire world i.e. a globalist outlook

B: even then, modernist projects c. 1920 onwards seemed to have a pretty strong alliance between proper reasoning on best evidence and genuine charitable impulses, even where ineffective or counterproductive

B: and, of course, before all the shit and social dynamics e.g. lysenkoism, marxism had a reasonably good claim at being scientific and materialist in its revolutionary aims

C: I find it plausible that one can be very concerned about what is true without being very good at finding out the truth according to rationalists' standards. Science and philosophy are hard! (And, in some cases, rationalists probably just have weird standards.)

D: Disagree. Analogy: before evidence-based medicine, physicians were still concerned with what was true and trying to make the world a better place (through medical practice). They just had terrible methodology (e.g., theorizing that led to humors and leeches).

D: Likewise, I think EA is a step up in methodology, but it's not original in its simultaneous concern for welfare and truth.

E: Sounds crazy hubristic...

F: I think this isn’t right, but not necessarily because I think the intersection is all that common, it might be, I don’t know, but more because EA is small enough that its existence doesn’t provide much evidence of a large change in the number of people in this intersection. It could be that a bunch of them just talk to each other more now.

EA needs consultancies

I did read it, and I agree it improves the tone of your post (helpfully reduces the strength of its claim).  My criticism is partly optical, but I do think you should write what you sincerely think: perhaps not every single thing you think (that's a tall order, alas, in our society: "I say 80% of what I think, a hell of a lot more than any politician I know" - Gore Vidal), but sincerely on topics you do choose to opine on.

The main thrusts of my criticism are:

  1. Because of the optical risk, and also just generally because criticizing others merits care, you should have clarified (and still can) which of the significantly different meanings I listed (or others) of "they are not seeking truth" you intended.
  2. If you believe one of the stronger forms, eg "before EA the intersection of people who were very concerned about what was true, and people who were trying hard to make the world a better place, was negligible," then I strongly disagree, and I think this is worth discussing further for both optical and substantive reasons. We would probably get lost in definitional hairsplitting at some point, but I believe many, many people (activists, volunteers, missionaries, scientists, philanthropists, community leaders, ...) for at least hundreds of years have both been trying hard to make the world a better place and trying hard to be guided by an accurate understanding of reality while doing so. We can certainly argue that any one of them got a lot wrong, but that's about execution, not intent.

    This is, again, partly optical and partly substantive, but it's worth realizing that to many people who predate EA, or who have read a lot about the world pre-EA, the quoted claim above is just laughable. I care about EA but I see it as a refinement, a sort of technical advance, not an amazing invention.

EA needs consultancies

I come in peace, but I want to flag that this claim will sound breathtakingly arrogant to many people not fully immersed in the EA bubble, and to me:

I'm probably not phrasing this well, but to give a sense of my priors: I guess my impression is that my interactions with approximately every entity that perceives themself as directly doing good outside of EA* is that they are not seeking truth, and this systematically corrupts them in important ways.

Do you mean:
a) They don't make truth-seeking as high a priority as they should (relative to, say, hands-on work for change)?
b) They try to understand what's true, but their feeble non-EA efforts go nowhere?
c) They make zero effort to seek the truth? ("Not seeking truth")
d) They don't care in the slightest what the truth is?

These are worth distinguishing, at least in communications that might plausibly be read by non-EAs. Someone could read what you wrote and conclude, or at least conclude you believe, that before EA the intersection of people who were very concerned about what was true, and people who were trying hard to make the world a better place, was negligible. That would be unfortunate.

Shouldn't 'Effective Altruism' be capitalized?

I pretty much echo everything Aaron G said, but in short it comes down to the impression left on the reader. "Effective Altruism" looks like a group one could try to join; "effective altruism" looks like a field of study or a topic of discussion. I think the latter is more the impression we want to cultivate. Remember the first rule of EA: WE ARE NOT A CULT!

AI boxing

Ah, I thought maybe this was like chess boxing.

How bad is coronavirus really?

Just a quick comment: I'd be wary of any answers to this that focus narrowly on the health impact (eg expected death toll) without trying to factor in other major impacts on well-being: economic (increased poverty and especially unemployment, reduced GDP, lost savings due to market drop), geopolitical (eg increased nationalism/protectionism, and even increased potential for war), and maybe more - even basic things like global anxiety! (Also some benefits, eg reduced carbon emissions, though I'd argue these are overrated.) These aren't easy to assess but I'd be very surprised if they didn't add up to more net impact than the deaths/illnesses themselves.