
Summary: Emrik and I discussed ways to improve EA. We mainly discussed my proposal for more debating, plus organizing debate better. Debate is an asymmetric weapon, unlike stuff related to social status, popularity, and influence. We also went through the debate methodology flowchart I developed. The call had a friendly vibe and was mostly about explaining and sharing ideas, with only a little debating.

Watch the discussion on YouTube

Outline of the tree discussed in the call

Truth Seeking vs. Status/Influence

(Everything said about Effective Altruism (EA) applies to many other groups. These are widespread issues.)

by Elliot Temple, Nov 2022, https://criticalfallibilism.com

  • Improving EA
    • I’d like to improve EA a large amount. I have ideas for how to do that.
      • But if you just go to EA with an idea like that, one of the main things that happens is … nothing.
        • How do you get attention for a great idea?
          • Truth Seeking Processes
            • What makes a process truth-seeking?
              • True ideas have a large advantage
                • Example: rational, productive debate
                  • You need a process that does critical analysis. And realistically it needs to involve people talking with each other some.
                    • That basically means critical discussion or debate. People need to think critically about ideas and talk about their criticisms. Then true ideas have a large advantage because they will do better in critical analysis. (That’s not a guarantee, but it sure helps compared to other approaches that don’t significantly advantage true ideas.)
              • Truth-seeking processes are symmetrical
                • That means they work well regardless of which idea or side is correct
                  • A truth seeking process should get a good outcome, for everyone, if I’m right or if EA is right
                    • If whoever turns out to be wrong is disliked, mocked, viewed as low status, punished, etc., that’s not truth-seeking
                      • Being wrong shouldn’t be shameful
                        • (Bad faith, bad intentions, dishonesty, falsifying data, etc., are different from merely being wrong.)
            • What if my suggestions for EA are incorrect?
              • A truth-seeking process should work about equally well whether I’m right or wrong. It should get a good outcome either way. If I’m right, it should have a good chance to figure that out. If instead EA is right, it should have a good chance to figure that out.
                • If I’m right, a truth-seeking process should (probably) lead to EA changing, learning, reforming.
                  • If I’m wrong, a truth-seeking process should (probably) lead to me changing, learning, reforming.
                    • Whoever is wrong gets the better deal: they get to learn something. (If their error is known. This is all based on the best knowledge anyone has, not omniscience.)
                      • Ignoring wrong-appearing ideas has two main problems.
                        • First, you’re fallible. The ideas you ignore might actually be true.
                          • Second, you’re not giving the mistaken critic any way to change his mind and learn better. An opportunity for progress is lost, and he won’t become your supporter – he’ll instead go around telling people that he criticized your group, and you wouldn’t give any counter-arguments, so you’re clearly irrational and probably wrong. Reasonable people will be alienated from your group.
          • Social/Influence Processes
            • Being influential is something that (people with) false ideas can do well at
              • Getting high social status is something that (people with) false ideas can do well at
                • Marketing is something that (people with) false ideas can do well at
                  • Making friends is something that (people with) false ideas can do well at
                    • Establishing rapport is something that (people with) false ideas can do well at
                      • Creating a social network is something that (people with) false ideas can do well at
                        • Gaining karma/upvotes is something that (people with) false ideas can do well at
                          • Becoming popular is something that (people with) false ideas can do well at
                            • Making a positive first impression is something that (people with) false ideas can do well at
                            • Fitting in well with other EAs is something that (people with) false ideas can do well at
                            • Impressing people is something that (people with) false ideas can do well at
                            • Appearing smart is something that (people with) false ideas can do well at
                            • Writing post titles that people will click on (and won’t dislike as “clickbait”) is something that (people with) false ideas can do well at
                            • Posting frequently and repetitively, on a forum where old posts get little attention, is something that (people with) false ideas can do well at
                            • Appearing plausible to people is a social skill
                            • Emoting in ways people like is a social skill. E.g., how do you come off as ambitious instead of arrogant? Getting that right is different from truth-seeking.

Important point I forgot to say on the call

During the call, I got distracted and never said a key point. Debate is for 1) reforming EA itself (I said that) and 2) setting a good example so that other groups will listen to reason (we didn't talk about this).

Spreading good debate norms to other groups would let any good arguments, including EA's, have much more impact. Imagine if 10% of all charities were open to debate and would change to more cost-effective approaches due to rational arguments. Imagine if companies and politicians were open to debate.

EA currently has HUGE problems with most of its best ideas being ignored – without counter-arguments – by almost everybody. This is so normalized that I think EAs don't notice and just take it for granted as how the world is.

I think this problem is fixable. If one decently sized group like EA was willing to become open to debate, I think that could show people the way and spread to other groups.

Put another way, I think getting EA to do rational debate is a harder problem than getting other groups to start doing it after EA. People shouldn't be put off because it seems hard to get other people and groups to be rational; actually going first, and getting it right themselves, is the key issue. In other words, scaling rational debate from 1 person to 10,000 is hard; scaling from 10,000 to millions is easier. You don't need to worry so much about the mass adoption problem; the early adoption problem is more important. Getting to 10,000 might not even be hard once it got to 100, because there would be many positive examples (productive debates) that would be hard to just ignore without engaging.

BTW, does anyone want to debate with me?

Comments

Really intrigued by the idea of debates! I was briefly reluctant about the concept at first, because what I associate with "debates" is usually from politics, religious disputes, debating contests, etc., where the debaters are usually lacking so much of the essential internal epistemic infrastructure that the debating format often just makes it worse. Rambly, before I head off to bed:

  • Conditional on it being good for EA to have more of a culture for debating, how would we go about practically bringing that about?
    • I wonder if EA Global features debates. I haven't seen any. It's mostly just people agreeing with each other and perhaps adding some nuance.
    • You don't need to have people hostile towards each other in order for it to qualify as "debate", but I do think one of the key benefits of debates is that the disagreement is visible.
    • For one, it primes the debaters to home in on disagreements, whereas perhaps the EA in-group is overly primed to find agreements with each other in order to be nice.
    • Making disagreements more visible will hopefully dispel the illusion that EA as a paradigm is "mostly settled", and get people to question assumptions. This isn't always the best course of action, but I think it's still very needed on the margin, and I could get into why if asked.
    • If the debate (and the mutually-agreed-upon mindset of trying to find each other's weakest points) is handled well, it can make onlookers feel like head-on disagreeing is more OK. I think we're mostly a nice community, reluctant to step on toes, so if we don't see any real disagreements, we might start to feel like the absence of disagreement is the polite thing to do.
  • A downside risk is that debating culture is often steeped in the "world of arguments", or as Nate Soares put it: "The world is not made of arguments. Think not 'which of these arguments, for these two opposing sides, is more compelling? And how reliable is compellingness?' Think instead of the objects the arguments discuss, and let the arguments guide your thoughts about them."
  • We shouldn't be adopting mainstream debating norms; that won't do anything for us. What I'm excited about is the idea of making spaces for good-natured visible disagreements where people are encouraged to attack each other's weakest points. I don't think that mindset comes about naturally, so it could make sense to deliberately make room for it.
  • Also, if you want people to debate you, maybe you should make a shortlist of the top things you feel would be productive to debate you on. : )

Part of what's going on here is that Popperian epistemology says, in brief summary, that we learn by critical thinking and debate (both within our mind and with others). Bayesian epistemology does not say that. It (comparatively) downplays the roles of debate and criticism.

In the Popperian view, a rational debate is basically the same process as rational thinking but externalized to involve other people. Or put the other way around, trying to make critical arguments about ideas in your head that you're considering is one of the main aspects of thinking.

I'm unaware of any Bayesian who claims to have adequate knowledge of Popper who has written some kind of refutation of Popperian epistemology, or who endorses and takes responsibility for a particular refutation written by someone familiar with Popper's views. This is asymmetric. Popper wrote refutations of Bayesian ideas and generally made a significant effort to critically analyze other schools of thought besides his own and to engage with critics.

Also, if you want people to debate you, maybe you should make a shortlist of the top things you feel would be productive to debate you on. : )

The things I'm most interested in debating are broad, big-picture issues, like debate methodology or the current state of the Popper/Bayes debate (e.g., what literature exists, what is answered by what, what is unanswered). Attempts to debate other topics will turn into big-picture discussions anyway because I will challenge premises, foundations, or methodology.

The debate topic doesn't really matter to me because, if it isn't one of these big-picture issues, I'll just change the topic. The bigger-picture issues have logical priority. Reaching a conclusion related to, e.g., poverty depends on debate and thinking methodology, what epistemology is correct, what knowledge is, what a good argument is, what it takes to reach a conclusion about an issue, how people should behave during debates, when people should use literature references versus write fresh arguments, etc. I don't want to name some attention-getting issues as potential debate topics and then effectively bait-and-switch people by only talking philosophy. I'll either talk about the issues I think have logical priority or else, if we disagree about that, then about which issues have logical priority and why. Either way it'll be fairly far removed from any EA causes, though it'll have implications for EA causes.
