All of Gordon Seidoh Worley's Comments + Replies

Still short on details. Note that this says "in principle" and "initial board". Finalizing this deal could still require finding three more board members to replace Helen, Tasha, and Ilya, ones they'd still be happy to have as EA/safety voices. We'll have to wait to see what shakes out.

Yes. It's hard to find people who are poorer because of automation once we smooth over short-term losses.

What's easier to find is people who felt poorer because they lost status, even though they actually had more purchasing power and could afford more and better goods. They weren't actually economically poorer; they just felt poorer because other people got richer faster than they did.

I guess it depends on what kind of regulation you're thinking of.

While it's true that the US and EU value individual liberty highly, these countries are also quite motivated to regulate arms to maintain their technological lead over other countries, for example by regulating the export of cyber, nuclear, and conventional weapons and putting restrictions on who can be part of their supply chain. Smaller countries have been more willing to treat other countries as equals when it comes to arms and not worry about the possibility of attack since they feel litt... (read more)

Mostly seems like a good thing to me. The more the chips needed to build AI depend on supply chains that run through countries amenable to regulating AI, the safer we are. To that end, routing as much of the needed chip supply chain as possible through the US and EU seems most likely to create conditions where, if we impose regulations on AI, it will take years to build up supply chains that could circumvent those regulations.

1
CC
1y
If I understand correctly, it sounds like you think the US and EU are more amenable to regulating AI than mainland China, Taiwan, or South Korea. Is that true? The US and some European countries are famous for valuing individual liberty, so on that basis I might guess they would be less amenable to regulation than East Asian countries. And this article seems to indicate that the Chinese government is already regulating AI more strictly than the US government is: https://www.theverge.com/2023/2/22/23609945/china-ai-chatbots-chatgpt-regulators-censorship

Personally I downvoted this post for a few reasons:

  • insufficient details to evaluate the claims
  • claims are not stated clearly
  • hard to follow due to various unexplained abbreviations
  • assumes reader has context that is not presented in the post

To me this reads more like publicly posting content that was written only with an audience of folks working at CEA or similar orgs in mind. So I downvoted because it doesn't seem worth a lot of people reading it, since it's unclear what value is in it for them. This isn't to sa... (read more)

-1
There_is_a_problem
1y
I wouldn't put myself at risk of defamation by writing a post detailing SA in EA, nor do I want to speak for CH's take on this, which is why the evidence feels so scant. With SA, it'll always be hard to prove it beyond a reasonable doubt, so even if I gave you some evidence, you could refute it. Also, this post probably speaks more strongly to people who were following the posts on the Time article (esp. my posts), but doesn't make a lot of sense if you just dropped in now.

This is a short note to advise you against something called CouponBirds.

I don't know much about them other than they're creating lots of spam in a bid to soak up folks who are bummed about the loss of Amazon Smile. They've sent me emails and posted spammy comments on posts both here and on Less Wrong (I report them each time; they keep creating new accounts to post).

If you were thinking of using them, I encourage you not to, because we should not support those who spam if we want to live in a world with less spam.

And EA is the social life, not professional for a lot of us.

I want to say something specifically in response to this.

It's great that EAs can be friends with each other. But EA has a mission. It's not a social club. We're here to do good better. Things that get in the way of doing good better should be dropped if we actually care about the mission.

The trouble is I know lots of people have impoverished social lives. They have maybe one or two communities to build social bonds within. So when you're in that stance the natural thing is to try to extract ... (read more)

"The better strategy is to get some more communities!"

Does this really work for most people?

I think my life over the past year or so has been substantially enriched as I've gone from seeing my rationalist group friends in my city once a month or so to 1-2 times a week. But at the same time, as a reasonably introverted and Aspie person, who also has a three month old who always wants to be carried, this has close to maxed out my social meter. I don't think normal people can maintain having more than one or two real space communities that they are reall... (read more)

8
RobBensinger
1y
As a poly person whose main relationship over the last seven years has been with a non-rationalist non-EA, I want to say: Ingroup and outgroup people are both great. I think it would impoverish the community, and be a tragic loss of a lot of beautiful friendships and relationships, if people tried hard to avoid dating anyone from one group or the other.

I've not tried to quantify this, but I've lived in a bunch of rationalist/EA houses. I've seen the dynamics up close. The downsides are very large and massively outweigh the upsides based on what I've seen. The only thing that makes people think the upsides outweigh the downsides is, I suspect, that they are desperate.

This isn't really weird, though; it's just what seems to happen in lots of communities. Dating within a community is usually dangerous to that community, and this holds across many kinds of communities. This is a fully general phenomenon among human... (read more)

2
timunderwood
1y
So why do you think dating within a church, or a university community, or maybe a high school, for example, works fine? Or is the argument that my impression that these things are fine is incorrect?

More to the point: it seems like the minimalist approach to dealing with issues of dating other people in a group house is to encourage group houses to form specific rules around that, and not to change the general culture of the community.

This also still isn't a clear attempt to look at the downsides of making it very clear to everyone that you should not ask anyone out at an EAGx event, and that it is very bad juju if you even think that sleeping with the professional partner you just met at one of them would be a nice thing to do. Phrased in better corporate speak, of course. Also telling local community organizers to make sure they regularly announce at groups that we don't want anyone who meets someone here to date someone else here. That would be bad, and Time Magazine might someday find the worst thing that ever happened in such a situation and write about it, and we are now optimizing to avoid that.

I mean, I organize a LW/ACX meetup, and if I was told that, I'd ignore it, possibly rename my group to add 'unofficial' to the title, and be seriously annoyed with the meetup meta organizer person. And this is despite the point that there is only one woman who regularly shows up at the events.

And if that is not what you think should be done, then what is the specific set of policy changes we are proposing? Or is it just giving people a cultural vibe that dating people who you might interact with professionally can have serious downsides? I mean sure, and we've made a general cultural attempt to make a big chunk of the individual upsides illegal at the same time because they are seen as being bad systemically. But this is completely irrelevant to me, since I have no expectations of being professionally involved with people who I might date in the community. I

Conflicts of interest is one way to put it, but I think it massively undersells it.

To me the point of not dating in EA (or at your company) is that it upholds an extremely valuable professional norm of preventing your personal and work life from colliding in messy ways. The trouble is that breakups happen, they are messy, people do and say things they will regret, and you want as much separation between the personal and professional parts of your life in such a situation. That way only a part of your life is on fire. If you're like many people and only really have ... (read more)

The upside is huge for a lot of people. And EA is the social life, not professional for a lot of us.

Have you actually tried to quantify either the downside or the upside here, or are you saying that one is small and the other is big based on a general intuition?

As someone who thinks banning dating in EA would be bad, I really, really want to see people on the other side of this issue try harder to make their argument as rigorous as possible.

To be clear, I am mostly saying don't date other EAs most of the time, especially if you are doing more than small scale earning to give. If you plan to work in EA, then EA is your office. EA is too small to think of it as an ecosystem where people can find other opportunities. There's one EA game in town. That's the place I think it's fraught to date.

What problem are you trying to solve by recommending not dating within EA?

If it's conflicts of interest, it seems like you'll get more mileage directly promoting norms of avoiding conflict of interest by disclosing what would bias judgement and avoiding being a decision maker in that situation.
As one anecdote, I worked in a typical enough non-EA startup in which multiple coworkers had romantic relationships with each other, and multiple coworkers had strong friendships with each other. In my experience management decisions were more highly ... (read more)

There's a lot I disagree with in this post, but there's one part I super agree with:

Be much, much less accepting of any intersection between romance and office/network

Traditionally dating happens with your 2nd and 3rd order connections, not your first. Also, dating in a professional (or related) setting is very likely to lead to bad outcomes. We know this.

I realize people want to date like-minded people. There are lots of them out there who aren't in EA! You just have to look for them.

I realize people want to date like-minded people. There are lots of them out there who aren't in EA! You just have to look for them.

source?

This comment and the OP are blurring the line between "office" and "network". I think some people want a strong taboo within EA against dating co-workers, and other people want a strong taboo within EA against dating other EAs. "Network" makes it sound like folks are proposing the latter, but most of the specifics so far are about workplace relationships. Regardless of the merits of the different views, it seems helpful to clearly distinguish those proposals and argue for them separately.

I'm curious why this has gotten so many downvotes but no comments indicating disagreement.

It has net positive karma, but the vote count makes it clear there are downvotes.

Crossposting has been a huge win in my opinion.

It used to be that you'd post on one site and then manually crosspost to the other. This was annoying and somewhat error-prone, and you had to set up the links yourself by editing at least one post after it was published.

Automatic crossposting eliminates that mess, and it adds the nice feature of letting you know if there are comments on the other site (since you don't want to read the same thing twice, but you might want to know about the conversation over there).

The feature is also fairly easy to ign... (read more)

I think Scott's argument for openness to eccentrics, on the grounds that a couple of great ideas have far more positive value than a whole bunch of bad ideas have negative value, generalises to an argument for being open to 'eccentrics' who comprise large numbers of new or intermittent posters.

You've got to consider the base rates. Most eccentrics are actually just people with ungrounded, wrong ideas: it's easy to have wild ideas and hard to have correct ideas, and thus even harder to have ideas that are both wild and correct.

In the old days of Less ... (read more)

7
Arepo
1y
It is tiresome to have conversations in which you assume I only started thinking about this yesterday and haven't considered basic epistemic concepts.

a) I am not talking about actual eccentrics; I'm drawing the analogy of a gestalt entity mimicking (an intelligent) eccentric. You don't have to agree that the tradeoff is worthwhile, but please claim that about the tradeoff I'm proposing, not some bizarre one where we go recruiting anyone who has sufficiently heterodox ideas.

b) I am not necessarily suggesting removing the karma system. I'm suggesting toning it down, which could easily be accompanied by other measures to help users find the content they'd most like to see. There's plenty of room for experimentation - the forum seems to have been stuck in a local maximum (at best - perhaps not a maximum) for the last few years, and CEA should have the resources for some A/B testing of new ideas.

c) Plenty of pre-Reddit internet forums have been successful in pursuing their goal with no karma system at all, let alone a weighted one. Looking at the current posts on the front page of the EA Reddit, only one is critical of EA, and that's the same Bostrom discussion that's been going on here. So I don't see good empirical evidence that toning down the karma system would create the kind of wild west you fear.

Sure, not everyone likes curated gardens. If that's not the kind of site you want, there are other places. Reddit, for example, has active communities that operate under different norms.

The folks who started the Forum prefer the sort of structure it has. If you want something else and don't have an argument that convinces us, you're free to participate in discussions elsewhere.

As to deeper reasons why the Forum is the way it is, see, for example, https://www.lesswrong.com/posts/tscc3e5eujrsEeFN4/well-kept-gardens-die-by-pacifism

7
Arepo
1y
'There are other places' seems like a terrible benchmark to judge by. Reddit is basically the only other active forum on the internet for EA discussion, and nowhere else has any chance of materially affecting EA culture. The existence of this place suppresses alternatives - I used to run a utilitarianism forum that basically folded into this one because it didn't seem sensible at the time to compete with people we almost totally agreed with.

Posting a single unevidenced LW argument as though it were scripture, as an argument against being exposed to a wider range of opinions, seems like a poor epistemic practice. In any case, that thread is about banning, which I've become more sympathetic to, and which is totally unrelated to the karma system.

My personal commentary: probably not much of a loss, but directionally it sucks. No charity was receiving much money this way as far as I know, but it was a nice way to feel a bit better about shopping on Amazon since you got to pick where their charitable giving went, and I'm sure it had some marginal impact for some charities. I'm also sure there's a backstory, like not wanting to be neutral on where the funds go, since they left it up to customers to choose from any registered charity, but that's not present in the announcement.

I voted against this one because it's not specific to EA. This is a general phenomenon of people who have a "dysfunction" of not having 99th-percentile executive function seeking ADHD diagnoses to get access to amphetamines. It might be happening in EA, but it's not clear there is an EA problem rather than a society-wide problem.

Addressing it as a general problem might be worthwhile, but we'd need to analyze it (maybe someone already has!).

Almost no organization in the world that gets stuff done on reasonable timelines operates this way. I think there's a very high prior against this.

Democracy makes sense for things you are forced into, like the government you're born under and forced to be ruled by. EA organizations are voluntary orgs that are already democratic in that funders can take their money and go elsewhere if they don't like how they are run. This would add a level of complication to decision making that would basically guarantee that large EA orgs would fail to achieve their missions.

This is a mechanism for maintaining cultural continuity.

Karma represents how much the community trusts you; in return, you're granted greater ability to influence what others see, because your judgement has been vetted over a long series of posts. The increase in voting power is roughly logarithmic with karma, so the increased influence hits diminishing returns pretty quickly in practice.

If we take this away it allows the culture of the site to drift more quickly, say because there's a large influx of new folks. Right now existi... (read more)
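
A minimal sketch of what such a weighting curve could look like, assuming a simple log rule (illustrative only; the formula and numbers here are my assumptions, not the Forum's actual implementation):

```python
import math

def vote_weight(karma: int) -> int:
    """Hypothetical vote-weight curve: weight grows with the logarithm
    of karma, so extra influence hits diminishing returns quickly."""
    return 1 + int(math.log10(max(karma, 1)))

# Each 10x increase in karma adds only one point of vote weight:
for karma in (1, 10, 100, 1_000, 10_000):
    print(karma, vote_weight(karma))  # -> 1, 2, 3, 4, 5
```

Under a rule like this, even a very-high-karma user's vote stays within a small constant factor of a new user's, which is the "diminishing returns" property described above.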

7
Jeroen Willems
1y
I really don't think the libertarian "if you don't like it, go somewhere else" works here, as the EA forum is pretty much the place where EA discussions are held. Sure, they happen on twitter and reddit too, but you have to admit it's not the same. Most discussions start here and are then picked up there.

I agree with your other arguments; I don't want the culture of the site to drift too quickly because of a large influx of new folks. But why wouldn't a cutoff be sufficient for that? I don't see why the power has to keep on increasing after, say, 200 karma, because at that point value lock-in might become an issue. Reminds me a bit of the average age of US senators being 64 years old. Not to dismiss the wisdom of experienced people, but insight from new folks is important too.
4
Arepo
1y
This doesn't seem self-evidently bad or obviously likely.

The statement is almost certainly intentionally ambiguous. That's kind of how a lot of PR works: say things directionally and let people read in their preferred details.

There's a somewhat standard argument I've heard before that we shouldn't separate these out because they're already covered by existential risks. For example, this complaint is already leveled against s-risks since they're a subtype of x-risk. Trying to focus on a subtype seems like mostly a bid to allow something to rise to higher importance than it would otherwise have under standard x-risk framings.

For example, many people try to claim climate change is an extinction risk because fast, significant climate change could cause an extinction event, but that... (read more)

1
Charlie_Guthmann
2y
I don't necessarily think s-risks and extinction risks are strictly subtypes of x-risks (if by subtype you mean subset), although it seems like the community may have a few definitions swirling around for each term.

Does giving something an acronym = trying to focus on it? It could explicitly help you focus less on it by making it easier to communicate, for example. Even if it adds focus, if it also adds clarity (which it totally may not), there is at least the notion of some tradeoff. This still seems more important to me than almost anything else that isn't an x-risk.

So is the implication here that existential risk is the sole term that gets an acronym? I feel ok about letting the global dev and animal welfare communities have acronyms (conditioning on acronyms being useful) even though one might say they are orders of magnitude less important than x-risk reduction.

I don't know if we really disagree, but I'm not interested in talking about it. Seems extremely unlikely to be a discussion worth the effort to have since I don't think either of us thinks making up deceptive quotes is okay. I think I'm just sloppier than you and that's not interesting.

I was just paraphrasing. You literally wrote "rationally resolve disagreements" which feels like the same thing to me as "rational dispute resolution".

I edited my comment to quote you more literally since I think it maintains exactly the same semantic content.

1
Elliot Temple
2y
We disagree about quotation norms. I believe this is important and I would be interested in discussing it. Would you be? We could both explain our norms (including beliefs about their importance or lack thereof) and try to understand the other person’s perspective.

No offense, but I'm surprised, because your phrasing doesn't parse for me. It's not clear to me what it would mean for EA as a movement to be "rational", and most use of "rational" in the way you're using it here reflects a pattern shared among folks with only passing familiarity with Less Wrong.

For example, you ask about "rational debate" and "rationally resolv[ing] disagreements", but the point of the post I linked is sort of that this doesn't make sense to ask for. People might debate using rational arguments, but it would be weird to call that ra... (read more)

1
Elliot Temple
2y
You raise multiple issues. Let's go one at a time. I didn't write the words "rational dispute resolution". I consider inaccurate quotes an important issue. This isn't the first one I've seen, so I'm wondering if there's a disagreement about norms.

I think, based on the way you're phrasing your question, you're perhaps not fully grasping the key ideas of Less Wrong style rationality, which is what EA rationality is mostly about. It might help to read something like this post about what rationality is and isn't as a starting point, and from there explore the Less Wrong sequences.

1
Elliot Temple
2y
I've read the sequences in full.

I fail to be convinced. Many of your arguments seem like fully general arguments about why to worry about anything as a longtermist, and thus wash out. For example, you argue

Climate change would interfere with our ability to address extinction risks. States burdened by climate change could not expend efforts on preventing other catastrophic risks

But there's a great many things that fall under the category of impacting the ability of modern civilization to address arbitrary risks. For example, someone could just as easily argue that our failure to prod... (read more)

1
Nir Eyal
2y
Thanks! Sure, various issues might in theory interfere with international cooperation efforts, but as regards climate change, we see tensions over very large economic stakes on compensation, reparation, etc. unfolding before our eyes. Just this past month, there were two big illustrations of this. In addition to the UN discussions of a global tax to pay for climate-related loss and damage, a link on which I included, there was this.

As expensive disasters and flooding abound, new tensions are likely to arise and interfere with the ability to work together on addressing existential risks, through two mechanisms:

1. Angry parties don't work well together (a recent illustration is how the Ukraine war foiled progress on climate).
2. Parties with acute, time-sensitive needs will sometimes try to force stronger parties who are relatively indifferent to those needs to address them, by making addressing them a condition for cooperation on shared needs. That can infuriate the stronger parties, who feel blackmailed. Cooperation on the shared needs then collapses (that seems to be precisely how acute, time-sensitive needs in development assistance brought down recent discussions of the biological weapons convention - see my link to the 80,000 hours podcast with Jaime Yassif).

Climate change will often create acute, time-sensitive needs (e.g. in flooding relief, in license to immigrate).

This is a problem where I think we can get some returns with minimal investment. For example, maybe just ask volunteers to bring in their air purifiers from home and hook them up during the day. That won't be a perfect solution, but it will provide cleaner air than having no purifiers. This seems like a space where we can get something like logarithmic returns on effort, which means even doing a little can be quite impactful.

Minor point, but I'll note that most volunteers come from out of town, and I expect only a minority of them own air purifiers anyway.

I think any issues with abuse of strong upvotes are tempered by the fact that someone has to spend a lot of time writing posts and getting upvotes from other forum members before they can have much influence with their votes, strong or otherwise. So in practice this is probably not a problem: the trust is earned through months and years of writing the posts and comments, and getting the votes, that earn one a lot of karma.

EAF uses the same software as Less Wrong. Less Wrong is intentionally something like a walled garden, but the walls are low enough that people can climb over if they want. The purpose is to maintain high quality discussions. I'm actually kind of avoiding EAF at the moment because there are so many new people it's negatively affecting the culture that was here before, which was always different from but somewhat similar to Less Wrong's. The weighted voting is a mechanism to help maintain the desired culture and prevent it from regressing to the internet mean. ... (read more)

I didn't find this compelling. As best I can tell your criticism grounds out in "EA disagrees with my and lots of people's moral intuitions, so it's probably wrong".

To pick on just one quote to explain my point:

The EA position that one’s duty is to the entire global (or future) population tends to be very upsetting to most other people, because they have other ideas about your duty! EA is a movement of some of the most powerful people in our society, and a certain “natural’ reaction occurs when those with power seem to not take up a responsibility that oug

... (read more)

In an important sense, yes!

To take an example of opposing armies, consider the European powers between, say, 1000 CE and 1950 CE. They were often at war with each other. Yet they were clearly allies in the sense that they agreed the European way was best and that some European power should clearly win various conflicts and not others. This was clear during, for example, various wars between powers to preserve monarchy and Catholic rule. If I'm Austria I still want to fight the neighboring Catholic powers ruled by a king to gain land, but I'd rathe... (read more)

1
Ember
2y
Can you define methodology? If you are defining the term so broadly that monarchy, Catholic rule, and republics are methodologies, then you don't have to bite the bullet on the "effective Nazi" objection. You can simply say "fascism is a methodology I oppose"; however, at this point it seems like the term is so broad that your objection to EA fails to have meaning.

I don't think this example holds up to historical scrutiny, but it's so broad Idk how to argue on that front, so I'm simply going to agree to disagree. You can work to understand other people's philosophical assumptions and work within those parameters.

In isolation I agree. But I found nothing new or interesting in this post. Since votes control how visible a post is, I view votes purely as a signal about how much I want to see, and want others to see, content like this. Since I didn't find it new or interesting, it was a poor use of my time to read it, hence the downvote.

When I downvote I like to tell people why, so they have useful feedback on what makes people downvote.

I know many people vote to say "yay" or "boo". I disagree with this voting style, and my votes generally should not be interpreted that way. I downvote to say "I don't think you should bother reading this" and I upvote to say "I think you should read this".

I agree, but is this a post that will make that change? I don't see any really compelling arguments or stories here that are likely to change minds.

6
DavidNash
2y
One person has already commented saying they will change how they act.

There's no reason a person can't be earnest and still be hitting the applause light button. Intent matters, but so do outcomes.

I don't recall recent EA discussion of this topic, but it's extremely well-worn in general. This is sort of professionalism 101, something most people debate in high school as a toy topic because the arguments are already well explored.

Almost everything that gets posted on the Forum has already been explored somewhere else. That doesn't make it worth downvoting.

Downvoted because I don't feel like there's any substance here and it's not worth spending the time to read. I think most people already agree with this sentiment and know the arguments presented in one way or another, so it feels like this post is just flashing the applause lights.

I'd probably have at least not downvoted and maybe would have upvoted this post if it contained some new content, like a proposal for how to get people not to glorify looks.

I disagree - I haven't seen any discussion of this, and the arguments come off as earnest and not as applause lights.

7
tlevin
2y
I disagree that this is an applause light, since an opposing claim like "communities are better when they have more jokes as long as these don't make people feel unwelcome" is pretty reasonable too. And including a proposal for how to change the norm seems like a bad bar, since posting on the EA Forum is probably the first plank that would occur to me in a plan to change the norms.

I think just writing a post can lead to some of the changes you want to see.

One example is "It's really really hard to get hired by an EA organisation". Having that post exist and be spread amongst people was able to start changing the expectations people had.

I also think that even if most people already agree, there are some people who haven't thought about the subject of this post and may change their behaviour after having read it. I have seen a few examples of this on Twitter and in person before.

6
Florence
2y
Good point. I agree that it's important to have steps to mitigate this! Happy to discuss in the comments ways in which we could try to reduce this. Here is my current best guess:

1. Not making jokes about finding someone attractive, and if you do this, trying to recognize it and prevent yourself doing it next time!
2. Noticing if other people do it and calling them out! They might not realise they are doing this (feel free to reference this article).
3. Holding all people to high levels of epistemic rigor.

Hmm, I think these arguments comparing to other causes are missing two key things:

  • they aren't sensitive to scope
  • they aren't considering opportunity cost

Here's an example of how that plays out. From my perspective, the value of the very large number of potential future lives dwarfs basically everything else. Like the value of worrying about most other things is close to 0 when I run the numbers. So in the face of those numbers, working on anything other than mitigating x-risk is basically equally bad from my perspective because that's all missed oppor... (read more)
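
A toy version of the arithmetic being gestured at here (all inputs are made-up placeholders for illustration, not figures from the comment):

```python
# Scope-sensitivity sketch with assumed, contestable inputs.
future_lives = 1e16          # assumed number of potential future lives
xrisk_reduction = 1e-4       # assumed 0.01% absolute cut in extinction risk
neartermist_lives = 1e6      # assumed lives saved by a large near-term program

expected_lives = future_lives * xrisk_reduction  # 1e12 lives in expectation
print(expected_lives / neartermist_lives)        # ~1e6x larger
```

With inputs anywhere in this ballpark, the x-risk term dominates by orders of magnitude, which is why everything else reads as missed opportunity cost under this view.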

To your footnote: I'm not sure how many people are directly uncomfortable, but I do find arguments that roughly boil down to "but what about Nazis?" lazy, as they try to do an end run around the discussion by pointing to a thing that will make most readers go "Nazis bad, I agree with whatever says 'Nazis bad' most strongly!". This doesn't mean thinking Nazis are bad is an unreasonable position or something, only that the topic looms so large it swamps many people's ability to think clearly.

Rationalists tend to taboo comparing things to Nazis or using Nazis as an example for ... (read more)

7
Saul Munn
2y
? Maybe the formatting of your comment cut off the later portions. It seems like your response to my comment only included a discussion of my end note. To be clear, my end note was meant as merely a side-conversation, only tangentially related to the main body of the comment. I’ll be generous in assuming that it was merely a formatting error — I wouldn’t hope to assume that you ignored the main points of my comment in favor of writing only about my relatively unimportant end note. I await your response to the content of my comment! :)

I disagree with Nazism, but to be intellectually consistent I have to accept that even beliefs about what is good that I find personally unpalatable deserve consideration.

A) No — to be intellectually consistent, you wouldn't merely have to claim that Nazism deserves consideration. You'd have to actively support an anti-Semitic person donating to the Nazi Party and ensuring that it functions as efficiently as possible to eradicate Jewish people.[1] Correct me if I'm wrong, but your post didn't seem to stop at wanting just a discussion of values — it pu... (read more)

There is (or, at least, ought to be) a big gap between "considering" a view and "allying" with it.  If you're going to ally with any view no matter its content, there's no point in going to the trouble of actually thinking about it.  Thinking is only worthwhile if it's possible to reach conclusions that differ depending on the details of what's considered.

Of course we're fallible, but that doesn't entail radical skepticism (see: any decent intro philosophy text).  Whatever premises you think lead to the conclusion "maybe Nazism is okay after... (read more)

I've edited my post to make it clear I think this is an off topic discussion within the context of this question. I think it's fine for this comment to stay because it was there before I made this clarification, but I have asked the moderators to convert this from an answer to a proper comment.

I don't think it actually has (1).

Engaged Buddhism is, as I see it, best understood as a movement among Western liberals who are also Buddhists, and as such is primarily infused with Western liberal values. These are sometimes incidentally the best way to do good, but unlike EA they don't explicitly target doing the most good; they instead uphold an ideology that values things like racial equality, human dignity, and freedom of religion (including freedom to reject religion).

As for (2), I'm not sure how much there is to learn. There are likely some things, b... (read more)

I think there's some case for specialization. That is, some people should dedicate their lives to meditation because it is necessary to carry forward the dharma. Most people probably have other comparative advantages. This is not a typical way of thinking about practice, but I think there's a case to be made that we could look at becoming a monk, for example, as exercising comparative advantage as part of an ecosystem of practitioners who engage in various ways based on their comparative abilities (mostly focused on what they could be doing in the... (read more)

4
Michael B.
2y
A few years ago I asked a zen nun what exactly is the use of being a nun, living quite secluded and without much impact on the world. Her response was (roughly speaking) that it is good if some people practice and study intensely because that keeps the quality and depth of the tradition alive and develops it. But not everyone should take that path. It seems like she was expressing the same idea as you are! I think she now leads one of the monastic centers in Germany.
3
Noah Starbuck
2y
Really appreciate that notion.  It is something I've thought a lot about myself.  I also tend to find that my personal spiritual practice benefits from a mix of many short meditation retreats, daily formal meditation sessions & ongoing altruistic efforts in daily life.  I don't feel that I would make a good teacher of meditation if I did that full time or that my practice would reach greater depth faster if I quit my job & practiced full time.  

A couple comments.

First, I think there's something akin to creating a pyramid scheme for EA by leaning too heavy on this idea, e.g. "earn to give, or better yet get 3 friends to earn to give and you don't need to donate yourself because you had so much indirect impact!". I think david_reinstein's comment is in the same vein and good.

Second, this is a general complaint about the active/passive distinction that is not specific to your proposal but since your proposal relies on it I have to complain about it. :-)

I don't think the active/passive distinction is... (read more)

Maybe I can help Chris explain his point here, because I came to the comments to say something similar.

The way I see it, neartermists and longtermists are doing different calculations and so value money and optics differently.

Neartermists are right to be worried about spending money on things that aren't clearly impacting measures of global health, animal welfare, etc. because they could in theory take that money and funnel it directly into work on that stuff, even if it had low marginal returns. They should probably feel bad if they wasted money on a big ... (read more)

Two thoughts:

  1. We should be careful about claiming the GOP is the "worse party". Worse for whom? Maybe they are doing things you don't like, but half the country thinks the Democrats are the worse party. We should be wise to the state of normative uncertainty we are in. Neither party is really worse except by some measure, and because of how they are structured against each other one party being worse means the other is better by that measure. If you wanted to make a case that one party or the other is better for EA and then frame the claim that way I think
... (read more)

to the fall of US democracy and a party that has much worse views on almost every subject under most moral frameworks.

This seems like a pretty partisan take and fails to adequately consider metaethical uncertainty. There's nothing about this statement that I couldn't imagine a sincere Republican with good intentions saying about Democrats and being basically right (and wrong!) for the same reasons (right assuming their normative framework, wrong when we suppose normative uncertainty).

While I don't want to suggest that you, or anyone else who feels this way about the GOP, has an obligation to work for them, part of the reason they are able to be hostile to various groups is that those groups are not part of how they get elected. If tomorrow the GOP was dependent on LGBTQ votes to win elections, they'd transform into a different party.

So while I'm not expert enough here to see how to change the current situation, I think there is something interesting about changing the incentive gradients for both parties to make them both more inclusive (both construct an outgroup—GOP: minorities and foreigners; Democrats: rural and working-class white people), and I expect that to have positive outcomes.

5
Peter
2y
Isn't saying to support a worse party in hopes that it becomes better like saying you should support a worse business in hopes that it becomes better? If they already have your vote/money/support why would they change?  Repeatedly losing elections seems like it would be more likely to cause the Republican party to change. 

The more I practice, the more I've come to believe that the only thing that really matters is that you do it. Not that you do it well by whatever standard one might judge, but just that you do it. 30 minutes of quiet time is a foundation on which more can be explored and discovered. You don't have to sit a special way, do a special thing with your mind, or do anything else in particular for it to be worth the effort, although all those things can help and are worth doing if you're called to them!

You should totally learn a bunch of techniques or practice a... (read more)

What does this funding source do that existing LT sources don’t?

Natural followup: why a new fund rather than convincing an existing fund to use and emphasize the >0.01% x-risk reduction criterion?

I think there's a pretty smooth continuum between an entirely new fund and an RFP within an existing fund, particularly if you plan to borrow funders and operational support. 

I think I a) want the branding of an apparent "new fund" to help make more of a splash and to motivate people to try really hard to come up with ambitious longtermist projects, and b) to help skill up people within an org to do something pretty specific.

You also shave off downside risks a little if you aren't institutionally affiliated with existing orgs (but get advice in a way that decreases unilateralist-y bad stuff).

Even if he wants to do that, his power is not absolute. I'd expect/hope for his generals to step in if he tries something like that, perhaps using it as a reason for a coup.
