Ben Pace

Comments

EA Debate Championship & Lecture Series

I was surprised: this video was much less goodharted than I expected (after having been primed with the super-fast-talking example). I was expecting more insane things.

Though overall it was at about the level of most broad public debate/discourse I’ve seen. I watched the first three speakers and didn’t learn anything. In good debates I’ve seen, I’ve felt that I learned something from the debaters about their fields and their unique worldviews; these felt like two opposing sides in a broader political debate with basically no grounding in reality. They were optimized for short-scale (e.g. <30 seconds) applause lights for the audience; when challenged they’d make it a fight, saying things like “Don‘t even try to win that example”; and their examples seemed false yet rewarded (primarily attributing China’s rise out of poverty in the last 50 years to ‘redistribution’ and getting applause for it, which, correct me if I’m wrong, is not at all the primary reason; they had massive growth in industry, in part by copying a lot of the West). I wouldn’t expect to learn anything; it just seemed like nobody understood economics, and they were indexed off what was roughly 0–1 inferential steps from what the audience as a whole understood. I guess that was the worst part: how can you discuss interesting ideas if they have to be obvious to an audience that big and generic within 10–20 seconds?

EA Debate Championship & Lecture Series

I just want to say I, Ben Pace, feel attacked every time someone criticizes “BP” in this comment thread.

Announcing "Naming What We Can"!

I'm open to a legal arrangement of shared nationalities, bank accounts, and professional roles.

Some quick notes on "effective altruism"

“Hello, I’m an Effective Altruist.”

“Hello, I’m a world-unfucker.”

Honestly, I think the second one might be more action-oriented. And less likely to attract status-seekers. Alright, I’m convinced, let’s do it :)

Some quick notes on "effective altruism"

I kinda think that "I'm an EA/he's an EA/etc" is mega-cringey (a bad combo of arrogant + opaque acronym + tribal)

It sounds like you think it’s bad that people have identified their lives with trying to help people as much as they can? Like, people like Julia Wise and Toby Ord shouldn’t have made it part of their life identity to do the most good they can do. They shouldn’t have said “I’m that sort of person” but they should have said “This is one of my interests”.

Some quick notes on "effective altruism"

I do not know. Let me try generating names for a minute. Sorry. These will be bad.

“Marginal World Improvers”

“Civilizational Engineers”

“Black Swan Farmers”

“Ethical Optimizers”

“Heavy-Tail People”

Okay I will stop now.

Proposed Longtermist Flag

Appreciate you drawing this, I like the idea.

Some quick notes on "effective altruism"

we would be heading toward a more action-oriented and less communal group, which could reduce the attraction to manipulative people

I don't expect a brand change to "Global Priorities" to bring in more action-oriented people. I expect fewer people would donate money themselves; they would, for instance, see it as cute but obviously not having any "global" impact, and therefore below them.

(I think it was my inner Quirrell / inner cynic that wrote some of this comment, but I stand by it as honestly describing a real effect that I anticipate.)

Some quick notes on "effective altruism"

The Defense Professor’s fingers idly spun the button, turning it over and over. “Then again, only a very few folk ever do anything interesting with their lives. What does it matter to you if they are mostly witches or mostly wizards, so long as you are not among them? And I suspect you will not be among them, Miss Davis; for although you are ambitious, you have no ambition.”

“That’s not true!” said Tracey indignantly. “And what’s it mean?”

Professor Quirrell straightened from where he had been leaning against the wall. “You were Sorted into Slytherin, Miss Davis, and I expect that you will grasp at any opportunity for advancement which falls into your hands. But there is no great ambition that you are driven to accomplish, and you will not make your opportunities. At best you will grasp your way upward into Minister of Magic, or some other high position of unimportance, never breaking the bounds of your existence.”

—HPMOR, Chapter 70, Self-Actualization (part 5)

Added: The following is DEFINITELY NOT a strong argument, but just kind of an associative point. I think that Voldemort (both the real one from JK Rowling and also the one in HPMOR) would be much more likely to decide that he and his Death Eaters should have “Global Priorities” meetings than “Effective Altruist” meetings. (“We’re too focused on taking over the British Ministry for Magic; we need to also focus on our Global Priorities.”) In that way I think the former phrase has a more general connotation of “taking power and changing the world” in a way the latter does not.

Some quick notes on "effective altruism"

I was just reflecting on the term 'global priorities'. I think to me it sounds like it's asking "what should the world do", in contrast to "what should I do". The former is far mode, the latter is near. I think that staying in near mode while thinking about improving the world is pretty tough. I think when people fail, they end up making recommendations that could only work in principle if everyone coordinated at the same time, and as a result they shape their speech to focus on signaling to achieve those ends, and often walk off a cliff of abstraction. I think when people stay in near mode, they focus on opportunities that do not require coordination but that they can personally achieve. I think that EAs caring very much about whether they actually helped someone with their donation has been one of the healthier epistemic things for the community. Though I do not mean to argue it should be held as a sacred value.

For example, I think the question "what should the global priority be on helping developing countries" is naturally answered by talking broadly about the West helping Africa build a thriving economy, about political revolution to remove corruption in governments, and about what sorts of multi-billion-dollar efforts could take place, like what the Gates Foundation should do. This is a valuable conversation that has been going on for decades/centuries.

I think the question "what can I personally do to help people in Africa" is more naturally answered by providing cost-effectiveness estimates for marginal thousands of dollars to charities like AMF. This is a valuable conversation that I think has had orders of magnitude less effort put into it outside the EA community. It's a standard idea in economics that you can reliably get incredibly high returns on small marginal investments, and I think it is these kinds of investments that the EA community has been much more successful at finding and has managed to exploit to great effect.

"global priorities (GP)"  community is... more appropriate  than "effective altruism (EA)" community... More appropriate (or descriptive) because it better focuses on large-scale change, rather than individual action

Anyway, I was surprised to read you say that, since it's in direct contrast to what I was thinking, and I think to how I have often thought of Effective Altruism.
