All of S.E. Montgomery's Comments + Replies

I notice that the places that provide the most job security are also the least productive per person (think government jobs, tenured professors, big tech companies). The typical explanation goes like: "a competitive ecosystem, including the ability for upstarts to come in and senior folks to get fired, leads to better services provided by the competitors."

Do you have evidence for this? Because there is lots of evidence to the contrary, suggesting that job insecurity negatively impacts people's productivity as well as their physical and mental health.[1][2][3]... (read more)

I have a couple of thoughts here, as a community builder and as someone who has had similar thoughts to those you've outlined.

I don't like the idea of bringing people into EA based on false premises. It feels weird to me to 'hide' parts of EA from newcomers. However, I think the considerations involved are more nuanced than this. When I have an initial conversation with someone about what EA is, I find it difficult to capture everything in a way that comes across as sensible. If I say, "EA is a movement concerned with finding the most impactful careers ... (read more)

Quadratic Reciprocity
1y
Thank you for sharing your thoughts here. I found it really difficult to reply to this comment, partly because it is difficult for me to inhabit the mindset of trying to be a representative for EA. When I talk to people about EA, including when I was talking to students who might be interested in joining an EA student group, it is more like "I like EA because X, the coolest thing about EA for me is Y, I think Z, though other people in EA disagree a bunch with my views on Z for W reason and are more into V instead" rather than trying to give an objective perspective on EA.

I'm just really wary of changing the things I say until they get people to do the thing I want (sign up for my student group, care about AI safety, etc.). There are some situations when that might be warranted, like if you're doing some policy-related thing. However, when running a student group and trying to attract people who are really smart and good at thinking, the thing I'd want to do is just state what I believe and why I believe it (even and especially if my reasons sound dumb) and then hear where the other person agrees or disagrees with me. I don't want to state arguments for EA or AI safety to new members again and again in different ways until they get on board with all of it; I want us to collaboratively figure things out.

Thanks for posting this! I agree, and one thing I've noticed while community building is that it's very easy to give career direction to students and very early-career professionals, but much harder to do so for mid/late-career professionals. Early-career people seem more willing to experiment or try out a project that doesn't have great support systems, whereas mid/late-career people have much more specific ideas about what they want out of a job.

Entrepreneurship is not for everyone, and being advised to start your own project with unclear parameters a... (read more)

We already have tons of implicit norms that ask different behaviours of men and women, and these norms are the reason why it's women coming forward to say they feel uncomfortable rather than men. There are significant differences in how men and women approach dating in professional contexts and how they see power dynamics, as well as in the ratio of men to women in powerful positions (and the gender ratio in EA generally). Drawing attention to these differences and discussing new norms that ask for different behaviours from men in these contexts (and different behaviours from the institutions/systems that these men interact with) is necessary to prevent these situations from happening in the future.

Something about this comment rubbed me the wrong way. EA is not meant to be a dating service, and while there are many people in the community who are open to the idea of dating someone within EA or actively searching for this, there are also many people who joined for entirely different reasons and don't consider this a priority/don't want this.  

I think that viewing the relationship between men and women in EA this way - eg. men competing for attention, where lonely and desperate men will do what it takes to get with women - does a disservice to ... (read more)

I disagree-voted on this because I think it is overly accusatory and paints things in a black-and-white way.

There were versions of the above proposal which were not contentless and empty, which stake out clear and specific positions, which I would've been glad to see and enthusiastically supported and considered concrete progress for the community.

Who says we can't have both? I don't get the impression that EA NYC wants this to be the only action taken on anti-racism and anti-sexism, nor did I get the impression that this is the last action EA NYC will tak... (read more)

People choose whom they date and befriend - no-one is forcing EAs to date each other, live together, or be friends. EAs associate socially because they share values and character traits.

To an extent, but this doesn't engage with the second counterpoint you mentioned: 

2. The work/social overlap means that people who are engaged with EA professionally, but not part of the social community, may miss out on opportunities.

I think it would be more accurate to say that, there are subtle pressures that do heavily encourage EAs to date each other, live togethe... (read more)

Amber Dawn
1y
I think this is a reasonable concern (as someone who would avoid moving into a big group house like the plague :p). I'd be in favour of more blinding when people make hiring decisions. Hiring agencies, as well as saving people time here, might also make the process fairer, since they can be more objective and will be less tempted to hire friends. As an empirical matter, do you think people in EA do disproportionately hire friends, or does the causation go the other way? (e.g., people move into group houses with friendly colleagues). 
Chris Leong
1y
I don't know if the framing of it "creating barriers" completely captures the dynamic. I would suggest that there is already a barrier (opportunities to exchange ideas/network with like-minded people), and the main effect of starting a group house is to lower that barrier for the people who end up joining. There may then be a secondary effect where some of these people become less accessible than they would be otherwise, since they have less need to connect with outside people, but this seems like a secondary effect. And I guess I see conflating the two as increasing the chance that people talk past each other.

I think the usefulness of deferring also depends on how established a given field is, how many people are experts in that field, and how certain they are of their beliefs. 

If a field has 10,000+ experts who are 95%+ certain of their claims on average, then it probably makes sense to defer as a default. (This would be the case for many medical claims, such as wearing masks, vaccinations, etc.) If a field has 100 experts and they are more like 60% certain of their claims on average, then it makes sense to explore the available evidence your... (read more)

Venkatesh
1y
Ah! That makes sense. I agree that the EA thing to do would be to work on and explore cause areas by oneself instead of just blindly relying on 80,000 Hours' cause areas or something like that.

As with any social movement, people disagree about the best ways to take action. There are many critiques of EA which you should read to get a better idea of where others are coming from, for example, this post about effective altruism being an ideology, this post about someone leaving EA, this post about EA being inaccessible, or this post about blindspots in EA/rationalism communities. 

Even before SBF, many people had legitimate issues with EA from a variety of standpoints. Some people find the culture unwelcoming (eg. too elitist/not enough diversi... (read more)

Sabs
1y

well the elitism charge is just true and it should be true! Of course EA is an elitist movement, the whole point is trying to get elites to spend their wealth better, via complicated moral reasoning that you have to be smart to understand (this is IMO a good thing, not a criticism!). 

I actually think it would be a disaster if EA became anti-elitist, not just for EA but for the world. The civic foundation of the West is made up of Susan from South Nottingham who volunteers to run the local mother & baby group: if she stops doing that to ETG or what... (read more)

Manuel Del Río Rodríguez
1y
I rather liked this comment, and think it really hits the nail on the head. As someone who has only recently come into contact with EA and developed an interest, and who therefore has a mostly 'outsider' perspective, I would add that there's a big difference between the perception of 'effective altruism', which almost anybody would find reasonable and morally unobjectionable, and 'Effective Altruism'/Rationalism as a movement with some beliefs and practices that will be felt as weird and rejectable by many people (basically, all those mentioned by S.E. Montgomery, like elitism, long-termism, utilitarianism, a generally hubristic and nerdy belief that complex issues and affairs are reducible to numbers and optimization models, etc.).

Thinking that 'the ends justify the means' (in this case, that making more donations justifies tax evasion) is likely to lead to incorrect calculations about the trade-offs involved. It's very easy to justify almost anything with this type of logic, which means we should be very hesitant.

As another commenter pointed out, tax money isn't 'your' money. Tax evasion (as opposed to 'tax avoidance' - which is legal) is stealing from the government. It would not be ethical to steal from your neighbour in order to donate the money, and likewise it is not ethical to steal from the government to donate money. 

I mostly agree with your post from a purely financial perspective, I was just giving some examples where people might think that the potential financial benefits of buying a house are worth the potential risks you mentioned. I've got a friend who falls into the example you gave (doesn't have/plan to have children, will leave his house to charity in his will), and this doesn't seem like that terrible of a decision for him.

EAs who will/may have children however perhaps shouldn't buy a home as, if they do, the pressure to leave the home to their children will

... (read more)

Not saying these situations apply to the person you were replying to, but I can think of a few instances where this would be the case. 

  1. You buy a house that you think will substantially increase in value - This is never guaranteed, but there could be good reasons to think that the value of a house will increase over time, eg. the sale price seems low for what it is, you are planning to do extensive renovations, it's in an up-and-coming area, etc. 
  2. You are bad at saving money - Buying a house forces you to put money towards an asset, whereas renting
... (read more)
JackM
1y
Yeah, these reasons make some sense, but they don't seem to acknowledge the point I made in my original post that there's always a very real risk of not selling the home and therefore not donating the proceeds to charity. It just seems safer to me, from a donor's perspective, to keep your money as money (that is invested).

My reading (and of course I could be completely wrong) is that SBF wanted to invest in Twitter (he seems to have subsequently pitched the same deal through Michael Grimes), and Will was helping him out. I don't imagine Will felt it any of his business to advise SBF as to whether or not this was a good move. And I imagine SBF expected the deal to make money, and therefore not to have any cost for his intended giving.

I agree that it's possible SBF just wanted to invest in Twitter in a non-EA capacity. My comment was a response to Habryka's comme... (read more)

I can see where you're coming from with this, and I think purely financially you're right, it doesn't make sense to think of it as billions of dollars 'down the drain.' 

However, if I were to do a full analysis of this (in the framing of this being a decision based on an EA perspective), I would want to ask some non-financial questions too, such as:

  • Does the EA movement want to be further associated with Elon Musk than we already are, including any changes he might want to make with Twitter? What are the risks involved? (based on what we knew before the
... (read more)
Marcus Rademacher
1y
This is the bit I think was missed further up the thread. Regardless of whether buying a social media company could reasonably be considered EA, it's fairly clear that Elon Musk's goals both generally and with Twitter are not aligned with EA. MacAskill is allowed to do things that aren't EA-aligned, but it seems to me to be another case of poor judgement by him (in addition to his association with SBF).

My guess is there must be some public stuff about this, though it wouldn't surprise me if no one had made a coherent writeup of it on the internet (I also strongly reject the frame that people are only allowed to say that something 'makes sense' after having discussed the merits of it publicly. I have all kinds of crazy schemes for stuff that I think in-expectation beats GiveWell's last dollar, and I haven't written up anything close to a quarter of them, and likely never will).

Yeah, there could be some public stuff about this and I'm just not aware of it.... (read more)

Aleksi Maunu
1y
For what it's worth, connecting SBF and Musk might've been a time-sensitive situation for one reason or another. There would've also still been time to debate the investment in the larger community before the deal actually went through.

If SBF wanted to buy Twitter for non-EA reasons, that's one thing, but if the idea here is that purchasing Twitter alongside Elon Musk is actually worth billions of dollars from an EA perspective, I would need to see way more analysis, much like the significant analysis that has been done for AI safety, biorisk, animal welfare, and global health and poverty.

If you think investing in Twitter is close to neutral from an investment perspective (maybe reasonable at the time, definitely not by the time Musk was forced to close) then the opportunity cost isn't really b... (read more)

I think it could be a cost-effective use of $3-10 billion (I don't know where you got the $8-15 billion from; it looks like the realistic amounts were closer to $3 billion). My guess is it's not, but Twitter does sure seem like it has a large effect on the world, both in terms of geopolitics and in terms of things like norms for the safe development of technologies, and so if you had taken Sam's net worth at face value at the time, this didn't seem like a crazy idea to me.

The 15 billion figure comes from Will's text messages ... (read more)

My reading (and of course I could be completely wrong) is that SBF wanted to invest in Twitter (he seems to have subsequently pitched the same deal through Michael Grimes), and Will was helping him out. I don't imagine Will felt it any of his business to advise SBF as to whether or not this was a good move. And I imagine SBF expected the deal to make money, and therefore not to have any cost for his intended giving.

Part of the issue here is that people have been accounting the bulk of SBF's net worth as "EA money".  If you phrase the ques... (read more)

The $15 billion figure comes from Will's text messages themselves (pages 6-7). Will sends Elon a text about how SBF could be interested in going in on Twitter, then Elon Musk asks, "Does he have huge amounts of money?" and Will replies, "Depends on how you define "huge." He's worth $24B, and his early employees (with shared values) bump that up to $30B. I asked how much he could in principle contribute and he said: "~1-3 billion would be easy, 3-8 billion I could do, ~8-15b is maybe possible but would require financing"

Makes sense, I think I briefly sa... (read more)

Investing in assets expected to appreciate can be a form of earning to give (not that Twitter would be a good investment, IMO) - that's how Warren Buffett makes money, and probably nobody in EA has criticized him for it. Investing in a for-profit is very different from donating to something and is guided by different principles, because you expect to (at least) get your money back and can invest it again or donate it later (this difference is one of the reasons microloans became so hugely popular for a while).

On the downside, concentr... (read more)

In terms of people coming away from the post thinking that polyamory = bad, I guess I have faith in people's ability on this forum to separate a bad experience with a community from an entire community as a whole. (Maybe not everyone holds this same faith.)

The post was written by one person, and it was their experience, but I expect by now most EAs have run into polyamorous people in their lives (especially considering that EAs on average tend to be young, male, non-religious, privileged, and more likely to attend elite universities where polyamory/discuss... (read more)

Amber Dawn
1y
Yeah, that's fair. I definitely don't want people to have to watch their wording too closely when sharing their experiences, and I had complicated feelings about that post and my own replies/reactions to it.

I'm conflicted here. I completely agree with you that shitting on others' morally-neutral choices is not ideal, but I don't think anyone was coming away from reading that post thinking that polyamory = bad. I would hope that the people on this forum can engage thoughtfully with the post and decide for themselves what they agree/disagree with. 

If someone had a bad experience with a man, and in the process of talking about it said something like, "all men suck and are immoral," I just don't think that is the right time or place to get into an argument w... (read more)

I guess I don't see why someone wouldn't come away from the post thinking that polyamory = bad.

I think the analogy here is not "all men suck and are immoral" (though I'm not even sure how much I endorse that), but rather: if someone had had bad experiences with men of a certain race, and in talking about it continually mentioned their race. I think people would rightly call that out as racist and not OK - we want to be sympathetic to victims, but if they are saying things that are harmful to others in the course of telling their experience, it'... (read more)

Keerthana Gopalakrishnan
1y
I wrote that post. I just want to clarify that I did not say "all poly men," but "many poly men." The difference is important. As someone who has no theoretical issue with poly practiced consensually, I don't get why Amber Dawn and others feel attacked. Me: "I was harassed by many poly men." Amber: "Stop attacking poly men. Not all poly men." Read this: https://en.wikipedia.org/wiki/NotAllMen

Great post! I agree with a commenter above who says that "The problem is not a lack of ideas that needs to be rectified by brainstorming - we have the information already. The problem seems to be that no one wants to act on this information." That being said, I have a few thoughts: 

Regarding code of conduct at events, I'm hesitant to make hard and fast rules here. I think the reality around situations such as asking people out/hitting on people, etc, is that some people are better at reading situations than others. For example, I know couples who have... (read more)

So I was one of the top comments disagreeing with that post, and I'm a poly woman, and my interest wasn't to defend predatory poly men but to argue against the idea that my relationship structure, which is consensually, positively practiced by many people the world over, is inherently toxic or embedded in predatoriness. Trauma and upset should be met with sympathy, but they don't justify shitting on others' morally-neutral choices, and a community that's hostile to polyamory is hostile to many women and NBs, not just men.

Strong upvote. It's definitely more than "just putting two people in touch." Will and SBF have known each other for 9 years, and Will has been quite instrumental in SBF's career trajectory - first introducing him to the principles of effective altruism, then motivating SBF to 'earn to give.' I imagine many of their conversations have centred around making effective career/donation/spending decisions. 

It seems likely that SBF talked to Will about his intention to buy Twitter/get involved in the Twitter deal, at the very least asking Will to make the in... (read more)

I'm not sure I agree with this. I agree that compassion is a good default, but I think that compassion needs to be extended to all the people who have been impacted by the FTX crisis, which will include many people in the 'Dank EA Memes' Facebook group. Humour can be a coping mechanism which will make some people feel better about bad situations:

"As predicted, individuals with a high sense of humor cognitively appraised less stress in the previous month than individuals with a low sense of humor and reported less current anxiety despite experiencing a simi

... (read more)
Fai
1y

but I think that compassion needs to be extended to all the people who have been impacted by the FTX crisis

I agree with this. But I think compassion needs to be extended even further, to every sentient being who might be worse off because of this event (though this assumes that EA suffering from this event means less good done in the world, and I recognize that there are people who seem to genuinely think the world will be better if EA disappears). And the implication of this is that we need to think about whether these mockeries an... (read more)

While I agree that humour is a great de-stressor, I have faith in our ability to find alternative ways to entertain ourselves that don't involve kicking someone while they're down.

Thanks for your response. On reflection, I don't think I said what I was trying to say very well in the paragraph you quoted, and I agree with what you've said.

 My intent was not to suggest that Will or other FTX future fund advisors were directly involved (or that it's reasonable to think so), but rather that there may have been things the advisors chose to ignore, such as Kerry's mention of Sam's unethical behaviour in the past. Thus, we might think that either Sam was incredibly charismatic and good at hiding things, or we might think there actually were some warning signs and those involved with him showed poor judgement of his character (or maybe some mix of both). 

I am glad you felt okay to post this - being able to criticise leadership and think critically about the actions of the people we look up to is extremely important.

I personally would give Will the benefit of the doubt of his involvement in/knowledge about the specific details of the FTX scandal, but as you pointed out the fact remains that he and SBF were friends going back nearly a decade.

I also have questions about Will MacAskill's ties with Elon Musk, his introduction of SBF to Elon Musk, his willingness to help SBF put up to 5 billion dollars towards t... (read more)

I personally would give Will the benefit of the doubt of his involvement in/knowledge about the specific details of the FTX scandal

Of course. This reads as almost bizarre: it would be a baby-eater-type conspiracy theory to think that Will (or anyone else in EA leadership) knew about this.  That's just not how things work in the world. The vast majority of people at Alameda/FTX didn't know (inner circle may have been as small as four). I mean, maybe there's a tiny chance that Sam phoned up someone a week ago and wanted a billion in secret, but you can ... (read more)

I love this! Thanks for sharing :) 

Anthony Fleming
2y
Thank you! Hope you enjoy.

Thanks Julia; this is a really insightful post. I will make sure to use it if anyone in the EA community asks me questions related to community health/the process for complaints in the future. 

One of the things I'm curious about is how you see the balance of these trade-offs: 

  • Encourage the sharing of research and other work, even if the people producing it have done bad stuff personally, vs. don't let people use EA to gain social status that they'll use to do more bad stuff
  • Take the talent bottleneck seriously; don't hamper hiring / projects too much, vs. Tak
... (read more)

due to CEA's response leaning towards the side of caution, the accuser walks away feeling like their complaint hasn't been taken seriously enough/that CEA should have been quicker to act

I'm sure this has happened, and I'm sad about that.

I also know different people who would say that CEA has been too aggressive in kicking people out, too willing to take action based on limited evidence.

I want to weigh the fact that people will feel alienated by both of these perceptions/experiences. But ultimately we can't make decisions based only on whether someone wi... (read more)

Good point - an aspect of this that I didn't expand on much is that it's really important for organisers to do things they enjoy doing, which helps it not feel forced.

On the other hand, I have had conversations with our group about maximising time spent together as a way to build better friendships, and people generally reacted to this idea better than I imagined! I think sharing your intention to maximise friendship-building activities will feel robotic to some people, but others may appreciate the thought and effort behind it.

Thanks for posting this - it was an interesting and thoughtful read for me as a community builder. 

This summarised some thoughts I've had on this topic previously, and the implications on a large scale are concerning, at the very least. In my experience, EA's growth over the past couple of years has meant bringing on a lot of people with specific technical expertise (or people seeking to gain this expertise), such as those working on AI safety/biorisk/etc., with a skillset that would broadly include mathematics, statistics, logical reasoning, and ... (read more)