All of Davis_Kingsley's Comments + Replies

Very cool! I wasn't aware of this and am also interested to see how it goes.

Thanks for the post! I think it provides an interesting (and data-driven!) counterexample to the narratives of neuroticism/scrupulosity around the EA community.

One thing that I would really be interested in is how this issue would look in a survey of people who once identified as EA but no longer do so, since I agree that those who leave EA due to bad experiences are probably underrepresented in this data. As you say, EA isn't one-size-fits-all, so insofar as people are selecting out that doesn't necessarily mean there's a huge problem -- but I'm still curious to see what those numbers would look like!

Catherine Low🔸
Turns out Rethink Priorities have been funded by EA Funds to work on something that could shed some light on this question. I'm looking forward to hearing how they get on at gathering survey respondents and seeing the results they get.

It's not precisely OpenPhil, but GoodVentures' recent surprise withdrawal from several cause areas, and its refusal even to say publicly which areas it was withdrawing from, comes to mind...

A while ago -- 2017, maybe? -- I remember attending EA Global in San Francisco, where Will MacAskill gave as either the keynote or the closing talk an address on the theme "Keep EA Weird". Do people still support this ideal? I notice that GoodVentures recently stopped funding some "weirder" cause areas, for instance.

It's at least possible to me that as EA has gotten bigger, some of the "weird" stuff has been pushed to the margins more and that's correct, but I'm not sure I've seen a detailed discussion of this, at least when it comes to cause areas rather than debates about polyamory and the like.

I really don't think -- at all -- that one's ability to give talks at EAG is centrally based on whether Emile Torres has denounced you on Twitter or whatever. As I understand it Torres has gone after a long list of prominent EA figures for various reasons (including Nick Bostrom, Will, Toby, etc.) who continue to be quite involved.

(Disclaimer: I worked in events for CEA some years ago but was not involved with managing the admissions process for EAG, selecting EAG keynote speakers, etc. -- indeed I am not even sure who all is on that team at present.)

Thanks for the edits!

I indeed attended LessOnline for a day, but not Summer Camp or Manifest; while there I didn't notice the "race science" angle you mention but I was only there for a day and spent a bunch of that time presenting classes/sessions on rationality stuff and then talking to people afterwards, so you probably have a broader sense of what was present during "the events as a whole" than I do.

This is pretty concerning to me (as someone who didn't attend Manifest but very well might have under other circumstances). I knew Hanania had been at Manifest before and would perhaps be there again, but didn't realize the event overall had this level of "race science" presence? I hope the Manifest organizers take action to change that for future events, and in general have thought some of Manifold's recent decisions seemed rather too "edgy"/sensationalist/attention-seeking (not sure of the right word here...) for my taste.

However, this post also rubs me ... (read more)

Thanks Davis Kingsley! I edited my post to include a mention that The Guardian article is flawed, and that Vassar has been more or less excommunicated (I had already replaced the Vassar mention with a link to Saul expanding on Vassar's attendance).

I guess I am happy to hear that my vibes on rationalists vs. EAs don't ring true to you—I hope you are right in this regard.

I changed the Republicans part into strikethrough, since multiple people have objected to it now, but left the Thielosphere mention as Thiel is tied to Yarvin, who is tied to race stuff. T... (read more)

David Mathers🔸
Depends how you define "strong" for ACX. I think the median was 4.something on a 1 (most left) to 10 (most right) scale: https://docs.google.com/forms/d/e/1FAIpQLScHznuYU9nWqDyNvZ8fQySdWHk5rrj2IdEDMgarf3s34bSPrA/viewanalytics

But yes, I'd say ACX has a long history of too much tolerance of the far right, but most readers are not far-right themselves. (The comments section is generally more right-wing than the lurkers, I think.)

I think the current political situation in the US is somewhat problematic in the context of inclusion/exclusion, because on the one hand, nearly half of Americans with a party affiliation are Republicans, and that MUST include many decent people who would bring good things to the movement; but on the other hand I also do think that the mainstream Republican party, so long as its leading figure is Trump, will remain an anti-democratic menace, as demonstrated by Trump's behavior around the last election. (Something I think Scott Alexander himself actually agrees with as far as I can tell, ironically.)

For Thiel specifically, he is fairly strongly associated with Yarvin as far as I remember, who is clearly a fascist. I am therefore generally against attracting Thiel fans. There are probably some exceptions though: libertarians who admire Thiel for other reasons and are just in denial about how fash-y his views are. Tyler Cowen, who seems OK to me, is probably in that category.

Yes, to be clear I'm not criticizing the initial decision to run but rather the dubious impact estimates and calls to action towards the end of that campaign.

I find myself quite skeptical of this analysis following the dramatically failed predictions (and more direct calls to action) regarding the tractability of the Carrick Flynn campaign in 2022, which now seems like a major blunder. If anything I think there's a stronger case for that sort of thing than there is for national presidential elections...

I think the 2nd place result for Carrick is quite good for a 1st-time candidate with a 1st-time political action team behind him. There were many mistakes obviously, but deciding to run was not one of them IMO. No political action will result in certainty; the goal is ~always to move the needle or take a bunch of swings.

Peter

I think it's good to critically interrogate this kind of analysis. I don't want to discourage that. But as someone who publicly expressed skepticism about Flynn's chances, I think there are several differences that mean it warrants closer consideration. The polls are much closer for this race, Biden is well known and experienced at winning campaigns, and the differences between the candidates in this race seem much larger. Based on that it at least seems a lot more reasonable to think Biden could win and that it will be a close race worth spending some effort on. 

Thanks for the tip! Just donated my mana to GiveDirectly.

Congratulations Niel! Best of luck with the future of 80k!

Whatever happened to AppliedDivinityStudies, anyway? Seemed to be a promising blog adjacent to the community but I just checked back to see what the more recent posts were and it looks to have stopped posting about a year ago?

I am around!
https://twitter.com/alexeyguzey/status/1668834171945635840

In general I think "TESCREAL" is a bad term that conflates a bunch of different things in order to attack them all as a group and I'd prefer not to see it used.

I consider this sort of "oh, I have a take but you guys aren't good enough for it" type perspective deeply inappropriate for the Forum -- and I say that as someone who is considerably less "anti-Anthropic" than some of the comments here.

I interpret it as broadly the latter based on the further statements in the Twitter thread, though I could well be wrong.

Congrats Ben, and count me in as another voice in favor of this type of humor on the Forum!

Yes, to be clear I don't think Oli was necessarily claiming that -- I was replying to Jonas here, who listed Tara as one of "the Leverage people" in his own comment.

Wait, was Tara a Leverage person? Kerry and Larissa work for Leverage now and Tyler was affiliated in the past, but I wasn't under the impression Tara was particularly involved with Leverage -- though I could of course be wrong!

DanielFilan
I do not read Oli as saying that Tara was at Leverage, and I've never heard that she was.

A while ago I remember seeing some  discussion of EA analysis of Ukraine relief following the Russian invasion -- perhaps some EAs from Poland were involved? Did this ever get comprehensively written up anywhere?

Andy_Schultz
Here was one project: https://efektywnyaltruizm.org/blog/help-for-ukrainians/. I found that link on https://forum.effectivealtruism.org/posts/gacpE79RKke2foG9K/rough-attempt-to-profile-charities-which-support-ukrainian.

I quite suspect people at Anthropic are already thinking of considerations like this when deciding what to do and am not sure that an anonymous post is needed here.

While I don't like this post, I think someone should be writing a more detailed post along these lines to provide more context for people outside of Anthropic. It feels like many newer people in AI safety have positive feelings about Anthropic by default because of its association with EA and a post that causes people to think some more about it could be good. 

Thanks for posting this! I appreciate the legibility and insight into the process here, especially during a stressful time in EA/on the Forum. 

Thanks for posting this. I think giving detailed reflections and "lessons learned" like this can be really helpful in these sorts of situations, but I also recognize it can be tough to do in public. Positive reinforcement for this openness and frank discussion!

Historical note: If EA had emerged in the 1970s era of the gay rights movement rather than the 2010s, I can imagine an alternative history in which some EAs were utterly outraged and offended that gay or lesbian EAs had dared to invite them to a gay or lesbian event. The EA community could have leveraged the latent homophobia of the time to portray such an invitation as bizarrely unprofessional, and a big problem that needs addressing. Why are we treating polyamory and kink in 2023 with the same reactive outrage that people would have treated gay/lesbian s

... (read more)

Side-note: the OP says "Wildly unusual social practices like polyamory", but I think poly is fairly common in the Bay Area outside of EA/rat circles.

I suspect it's fairly common in other young, blue-tribe, urban contexts in the US too? (Especially if we treat "polyamorous", "non-monogamous", and many "monogamish" relationship styles as more-or-less the same phenomenon.)


I've heard this argument before but I think it's quite overstated. I grew up in the SF Bay Area and still am in touch with many friends from childhood. They are generally young, blue-tribe, ... (read more)

Anthony Repetto
Third-generation Bay Area, here -- and, if you aren't going to college at Berkeley or swirling in the small cliques of SF among the 800,000 people living there, yeah, not a lot of polycules. I remember when Occupy oozed its way through here, it left a residue of 'say-anything-polyamorists' who were excited to share their 'pick-up artist' techniques when only other men were present. "Gurus abuse naïve hopefuls for sex" has been a recurring theme of the Bay, every few decades, but the locals don't buy it.

It's also worth noting that I am an adult convert to Catholicism and was involved with the Bay Area rationalist and EA community (and uncomfortable with the "polyamory pressure" in that community) for years before joining the Church, including some time when I didn't take religion seriously much at all. Claiming or implying that I hold my views (or faced backlash against them) just because I'm Catholic does me a disservice.

I note also that others in the community who are not (as far as I know) Catholic have faced backlash for their views against polyamory ... (read more)

No, but if you say "polyamory has been a problem in the EA (and rationalist) communities for a long time" and people know that you do in fact believe polyamory to be immoral, it's completely reasonable for them to respond as Kelsey did?
 


Most people don't know that and I wasn't asserting it here -- that would be much more controversial and much more of a debate than I wanted to have, and further one that I don't think is very appropriate for the EA Forum! My hope is (was?) that even people who quite disagree with me -- including many polyamorous people -- would have common cause in opposing the pressure to be polyamorous that has been prevalent.

Imagine I wrote:

I think veganism has been a problem in the EA community for a long time and has led to some bad dynamics where people have been pressured to go without food that meets their nutritional needs, including residential multi-day events where only vegan food was served.

If someone, knowing my views on animals that are probably about as well known as your views on sexual morality, responded as if I was saying animal welfare doesn't matter, I think that would be pretty reasonable. And if I didn't want that interpretation I'd need to drop the "veganism has been a problem" bit and just talk about the particular bad dynamics I was opposed to.

I am a Catholic -- though I would not call myself a traditionalist -- and I believe what the Church teaches, including on matters of sexuality. Bringing my religion up in this way feels like a character attack that ought to be below the standards of the EA Forum though, and I'm grieved to see it.

My posts here are not saying "Polyamory is a sin, convert to Catholicism." They are not saying "you should be pressured into monogamy." Those things seem much more contentious than what I'm going for here. Instead, I am saying that there has long been in fact the e... (read more)

ZachWeems
Clarifying for forum archeologists: "traditionalist" in Catholicism refers to people who consider the theological claims and organizational changes in Vatican II to be illegitimate, or at minimum taken too far. Catholics who consider the Church to have divinely guided authority over religious and moral truths will sometimes call themselves "orthodox" (lowercase) Catholics, to distinguish themselves from those who don't accept this & from traditionalists who accept everything up to Vatican II. So, ozymandias intended to indicate "Davis accepts the Vatican's teaching on sin, hell, sexual mores, etc". Davis objected to an adjective that implied he rejects Vatican II.

I also think it’s quite reasonable for a religious person to give secular arguments for worldviews which also happen to be held in their religion.

For example, if Davis was making a humanistic argument for why people should take Giving What We Can’s 10% pledge, then accusing him of disingenuously trying to sneak in the “Catholic agenda” of giving a tithe to the poor doesn’t seem fair.

Or imagine if a Jain was giving a humanistic argument for why people should be vegetarian, and they were accused of disingenuously trying to sneak in the “Jain agenda” of animal welfare.

My posts here are not saying "Polyamory is a sin, convert to Catholicism."

No, but if you say "polyamory has been a problem in the EA (and rationalist) communities for a long time" and people know that you do in fact believe polyamory to be immoral, it's completely reasonable for them to respond as Kelsey did?

If you want people only to respond to the more limited "people should not be pressured into polyamory" perhaps you should say that explicitly?

Yes, I'm not sure this needs to be said but just to be clear -- I also don't think CEA or whatever should have a "talking people out of polyamorous relationships" department, and this would seem like a bizarre overreach to me.

I'm thinking of things much more along the lines of "discourage the idea of polyamory as 'more rational' and especially polyamory pressure in particular", not "make EA institutions formally try to deconvert people from polyamory" or whatever.

To be clear, the thing I was wishing we had resolved internally was much more the widespread pressure to be polyamorous in (at least some parts of?) EA rather than individual people's relationships; as you say, it would not be appropriate for the EA community to have a discussion about how to "resolve" your personal relationships.  What would that even mean?

However, I think that this is far from the first time that major cultural issues with polyamory and unwelcome pressure to be polyamorous have been brought up, and it does seem to me that that's the... (read more)

In the article, Gopalakrishnan mentions having raised her concerns earlier only to be dismissed and attacked, told that she was "bigoted" against polyamorous people

The article has "One commenter wrote that her post was 'bigoted' against polyamorous people."

While Gopalakrishnan has deleted the post and the comments are no longer visible, my memory is that the comment describing her as saying something bigoted was reasonable?

I think polyamory has been a problem in the EA (and rationalist) communities for a long time and led to both some really uncomfortable and concerning community dynamics and also just a lot of drama and problems. Multiple high-profile women have told me that they felt pressured to be polyamorous by men in the community and/or felt that polyamory was bad but they didn't feel comfortable speaking up against it, and I've faced some degree of community social backlash myself for speaking out (even informally!) against polyamory. 

In general I think this has been kind of an ongoing issue for quite some time, and I wish we had resolved it "internally" rather than it being something exposed by outside investigators.

I think that relevant context for backlash against Davis Kingsley's anti-polyamory views is that he is an orthodox Catholic. His anti-polyamory views are part of a set of fairly extreme views about sexuality, including being opposed to homosexuality, masturbation, contraception, premarital sex, and any sexual intercourse other than PIV. He has also expressed the viewpoint that polyamory should be socially stigmatized and people should be pressured into monogamy. I believe that much, perhaps most, of the backlash he has faced is due to the overall set of hi... (read more)

Multiple high-profile women have told me that they felt pressured to be polyamorous by men in the community

I too have (consistently) seen this, so I am grateful to hear it being brought up publicly

I am very bothered specifically by the frame "I wish we had resolved [polyamory] "internally" rather than it being something exposed by outside investigators."

I am polyamorous; I am in committed long-term relationships (6 years and 9 years) with two women, and occasionally date other people. I do not think there is anything in my relationships for "the community" to "resolve internally". It would not be appropriate for anyone to tell me to break up with one of my partners. It would not be appropriate for anyone to hold a community discussion about how to '... (read more)

You say :

Whenever someone in your life half-jokingly asks "how can I become smart like you?", you no longer need to answer "Have you ever read Harry Potter?" because Projectlawful.com does not have Harry Potter in it.

On the contrary, this is a work I strongly wouldn't recommend, and especially not to newcomers. It's highly sexualized, contains descriptions of awful torture and various other forms of extreme misconduct, has a bunch of weird fetish material that more or less immediately disqualifies it as an intro rec in my opinion (far more so than... (read more)

I recognize this comment may not be received well here, but I think things like this are quite bad for EA to support -- there are very substantial political skew issues in the movement already, and running political candidates as an EA intervention seems like another step down a road that I think the movement needs to quickly turn away from.

The "Organizations vs. Getting Stuff Done" post is about anarchist political activism. This is a rather unusual area -- under normal circumstances organizations are a relevant tool to aid in getting things done, not an obstacle to it.

quinn
To partially rehash what was on Discord and partially add more:
* I don't think saying that institutions have benefits and are effective is at all an argument against specific drawbacks and failure modes. Things that have pros can also have cons; pros and cons can coexist, etc.
* I agree that a portion of the criticism is moot if you don't on priors think hierarchy and power are intrinsically risky or disvaluable, but I think having those priors directs one's attention to problems or failure modes that people without those priors would be wise to learn from. Moreover, if you look at the four points in the article, I don't think those priors are critical for any of them.
* Specifically, I think a variety of organizations are interested in trading off the inefficiency problems of bottom-up against the information-bottleneck problems of top-down. People who are motivated by values to reject the top-down side would intuitively have learned lessons about how to make the bottom-up side function.
* If I find the name of the individual, I'll return to this thread to make my point about the German scientist who may have prevented the Nazis from getting nukes by going around and talking to people (not by going through institutional channels).

To me this seems like essentially a "cheap shot" -- you could write basically this story in support of very many positions. Imagine a story that's like "wow, this guy was a utilitarian, even back then people knew utilitarianism could lead to unacceptable conclusions, we're getting rid of his statue" or whatever. In fact, you could probably write a story like this against certain ideas in EA animal thought.

Yeah, IIRC both G.K. Chesterton and C.S. Lewis wrote about how anyone can just say "the future will agree with me" as a way of getting support for your ideas, but nobody really knows about the future and probably everyone is wrong because the future will be more complicated than anyone thinks, and so arguments from the future are bad logic and invalid. (I think that Lewis's is a bit of The Screwtape Letters and that Chesterton's essay is in "What's Wrong With The World.") So I endorse this complaint.

But I didn't include that in my description because I do in fact think veganism will take over the world once the technology gets far enough, so that wasn't my true objection to the story.

One relevant concept might be that of the feedback loop, where the output of a process affects the input. For instance, if you survey only people who are already attending your events as to how to improve them, you might wind up missing ways to improve it for those who didn't attend. After several cycles of this you might wind up with an event that is very appealing for the "in crowd" but which doesn't much appeal to newcomers.

Note that Torres was banned from the forum for a year following a previous discussion here where he repeatedly called another member a liar and implied that member should be fired from his job.

Good point re: Charity Entrepreneurship.

I'm somewhat more skeptical of the grantmaking thing though because there are few enough positions that it is not very legible who is good at it, whether others currently outside the field could do better, etc.

I could be wrong -- I can point to specific things from some grantmakers that I thought were particularly good, for instance -- but it doesn't feel to me that it's the most amenable field for such a program. 

(Note that this is low-confidence and I could be wrong -- if there are more objective grantmaking skill metrics somewhere I'd be very interested to see more!)

Kirsten
Some trainable things I think would help with grantmaking:
- knowledge of the field you're making grants in
- making a simple model to predict the expected value of a grant (looking for a theory of change, forecasting the probability of different steps, identifying the range of possible outcomes)
- best practices for identifying early signs a grant won't be worth funding, to save time, without being super biased against people you don't know or from a different background to you who eventually could do good work
- giving quality feedback to successful and unsuccessful applicants
- engaging with donors (writing up summaries of why you gave different grants, talking to people who are considering donating through your fund)
- evaluating your grants to learn how closely what really happened matched your model

It doesn't seem to me obviously less trainable than being a Navy SEAL.

My impression is that the people who end up working in EA organizations are not on the same tier of discipline, work ethic, commitment, etc. as elite military forces and are not really even very close?

I don't say that to disparage EA direct workers, I'm involved in direct work myself  -- but my sense is that much more is possible. That said, as you mention the amount of discipline needed may simply not be as high.

AppliedDivinityStudies
Yeah again, for highly creative intellectual labor on multi-decade timescale, I'm not really convinced that working super hard or having no personal life or whatever is actually helpful. But I might be fooling myself since this view is very self-serving.

For some reason I can't see the draft, when I click on the notification I received for it it says "Error: app.operation_not_allowed" and kind of glitches out the interface until I refresh. Apologies!
 

(edit: fixed now, thanks!)

Thanks, I'm impressed by this reply and your willingness to go out there and do a survey. I will have more substantive feedback later as I want to consult with someone else before making a further statement -- ping me if I haven't replied by Friday.

QubitSwarm99
Thank you for your kind words. I will ping you midday-evening Eastern time on Friday if I see no reply. I am going to make a full post (probably by this evening), so please reply to that instead of in this comment thread, if possible. Hope you have a nice day.

I (very anecdotally) think there are lots of people who are interested in donating to quite specific cause areas, e.g. "my father died of cancer so I donate to cancer charities" or "I want to donate to help homelessness in my area" -- haven't studied that in depth though.

Hmm, I remember seeing a criticism somewhere in the EA-sphere that went something like:

"The term "longtermism" is misleading because in practice "longtermism" means "concern over short AI timelines", and in fact many "longtermists" are concerned with events on a much shorter time scale than the rest of EA."

I thought that was a surprising and interesting argument, though I don't recall who initially made it. Does anyone remember?

This sounds like a misunderstanding to me. Longtermists concerned with short AI timelines are concerned with them because of AI's long lasting influence into the far future.

The most important thing in life is to be free to do things. There are only two ways to insure that freedom — you can be rich or you can reduce your needs to zero. I will never be rich, so I have chosen to crank down my desires. The bureaucracy cannot take anything from me, because there is nothing to take.

Colonel John Boyd

Ben_West🔸
This is great. Much more eloquent than my post.

I think this comment, while quite rude, does get at something valuable. There's an argument that goes "hmm, the outside view says this is absurd, we should be really sure of our inside view before proceeding" and I think that's sometimes a bit of a neglected perspective in rationalist/EA spaces.

I happen to know that the inside view on HPMoR bringing people into the community is very strong, and that the inside view on Eli Tyre doing good and important work is also very strong. I'm less familiar with the details behind the other gra... (read more)

I think there is something going on in this comment that I wouldn't put in the category of "outside view". Instead I would put it in the category of "perceiving something as intuitively weird, and reacting to it".

I think weirdness is overall a pretty bad predictor of impact, both in the positive and negative direction. I think it's a good emotion to pay attention to, because often you can learn valuable things from it, but I think it only sometimes tends to give rise to real arguments in favor or against an idea.

It is also v... (read more)

I don't agree with all of the decisions being made here, but I really admire the level of detail and transparency going into these descriptions, especially those written by Oliver Habryka. Seeing this type of documentation has caused me to think significantly more favorably of the fund as a whole.

Will there be an update to this post with respect to which projects actually get funded following these recommendations? One aspect that I'm not clear on is to what extent CEA will "automatically" follow these recommendations and to what extent there will be significant further review.

I really admire the level of detail and transparency going into these descriptions, especially those written by Oliver Habryka

Hear, hear.

I feel proud of the commitment to epistemic integrity that I see here.

I will make sure to update this post with any new information about whether CEA can actually make these grants. My current guess is that maybe 1-2 grants will not be logistically feasible, but the vast majority should have no problem.

Just posting to acknowledge that I've seen this - my full reply will be long enough that I'm probably going to make it a separate post.

Neither is poverty alleviation or veganism or anything else in practice.

Again, strong disagree - many things are not politicized and can be answered more directly. One of the main strengths of EA, in my view, is that it isn't just another culture war position (yet?) - consider Robin Hanson's points on "pulling the rope sideways".

kbog
I think I'm losing track of the point. What does it mean to answer something "more directly"? I'm not sure how that's relevant here since I'm clearly saying that we're not taking a position on abortion.
You said the problem was stating it authoritatively rather than the actual conclusions, I made it sound less authoritative but now you're saying that the actual conclusions matter.

Sorry, I perhaps wasn't specific enough in my original reply. The "less authoritative" thing was meant to apply to the entire document, not just this one section - that's why I also said I wasn't sure documents like this are good for EA as a movement.

I think there's something unhealthy and self-reinforcing about tiptoeing around like that. The
... (read more)
kbog
In the preface I state that hedging language is minimized for the sake of readability. Neither is poverty alleviation or veganism or anything else in practice.

Like I said, that's not really the point - it also doesn't meaningfully resolve that particular issue, because of course the whole dispute is whose well-being counts, with anti-abortion advocates claiming that human fetuses count and pro-abortion people claiming that human fetuses don't.

I dunno, maybe I'm overly cautious, but I'm not fond of someone publishing a well-made and official-looking "based on EA principles, here's who to vote for" document, since "EA principles" quite vary - I think if EA becomes seen as politically aligned (with either major US party) that constitutes a huge constraint on our movement's potential.

kbog
You said the problem was stating it authoritatively rather than the actual conclusions; I made it sound less authoritative, but now you're saying that the actual conclusions matter. The document has sufficient disclaimers as it is; the preface clearly says EAs could disagree. You don't see GiveWell writing "assuming that poverty is the #1 cause area, which EAs may disagree on" multiple times, and I don't treat politics with special reverence as if different rules should apply.

I think there's something unhealthy and self-reinforcing about tiptoeing around like that. The point here is to advertise a better set of implicit norms, so that maybe people (inside and outside EA) can finally treat political policy as just another question to answer rather than playing meta-games.

If I care about total well-being, then of course people who say that some people's well-being doesn't count are going to be wrong. This includes the pro-lifers, who care about the future well-being of a particular fetus but not the future well-being of any potential child (or not as much, at least).