I have a couple thoughts here, as a community builder, and as someone who has thought similar things to what you've outlined.
I don't like the idea of bringing people into EA based on false premises. It feels weird to me to 'hide' parts of EA to newcomers. However, I think the considerations involved are more nuanced than this. When I have an initial conversation with someone about what EA is, I find it difficult to capture everything in a way that comes across as sensible. If I say, "EA is a movement concerned with finding the most impactful careers ...
Thanks for posting this! I agree, and one thing I've noticed while community building is that it's very easy to give career direction to students and very early-career professionals, but much harder to do so for mid/late-career professionals. Early-career people seem more willing to experiment/try out a project that doesn't have great support systems, whereas mid/late-career people have much more specific ideas about what they want out of a job.
Entrepreneurship is not for everyone, and being advised to start your own project with unclear parameters a...
We already have tons of implicit norms that ask different behaviours of men and women, and these norms are the reason why it's women coming forward to say they feel uncomfortable rather than men. There are significant differences in how men and women approach dating in professional contexts, see power dynamics, and in the ratio of men in powerful positions versus women (as well as the gender ratio in EA generally). Drawing attention to these differences and discussing new norms that ask for different behaviours of men in these contexts (and different behaviours from the institutions/systems that these men interact with) is necessary to prevent these situations from happening in the future.
Something about this comment rubbed me the wrong way. EA is not meant to be a dating service, and while there are many people in the community who are open to the idea of dating someone within EA or actively searching for this, there are also many people who joined for entirely different reasons and don't consider this a priority/don't want this.
I think that viewing the relationship between men and women in EA this way - eg. men competing for attention, where lonely and desperate men will do what it takes to get with women - does a disservice to ...
I disagree-voted on this because I think it is overly accusatory and paints things in a black-and-white way.
There were versions of the above proposal which were not contentless and empty, which stake out clear and specific positions, which I would've been glad to see and enthusiastically supported and considered concrete progress for the community.
Who says we can't have both? I don't get the impression that EA NYC wants this to be the only action taken on anti-racism and anti-sexism, nor did I get the impression that this is the last action EA NYC will tak...
People choose whom they date and befriend - no-one is forcing EAs to date each other, live together, or be friends. EAs associate socially because they share values and character traits.
To an extent, but this doesn't engage with the second counterpoint you mentioned:
2. The work/social overlap means that people who are engaged with EA professionally, but not part of the social community, may miss out on opportunities.
I think it would be more accurate to say that there are subtle pressures that do heavily encourage EAs to date each other, live togethe...
I think the usefulness of deferring also depends on how established a given field is, how many people are experts in that field, and how certain they are of their beliefs.
If a field has 10,000+ experts that are 95%+ certain of their claims on average, then it probably makes sense to defer as a default. (This would be the case for many medical claims, such as wearing masks, vaccinations, etc.) If a field has 100 experts and they are more like 60% certain of their claims on average, then it makes sense to explore the available evidence your...
As with any social movement, people disagree about the best ways to take action. There are many critiques of EA which you should read to get a better idea of where others are coming from, for example, this post about effective altruism being an ideology, this post about someone leaving EA, this post about EA being inaccessible, or this post about blindspots in EA/rationalism communities.
Even before SBF, many people had legitimate issues with EA from a variety of standpoints. Some people find the culture unwelcoming (eg. too elitist/not enough diversi...
Well, the elitism charge is just true, and it should be true! Of course EA is an elitist movement; the whole point is trying to get elites to spend their wealth better, via complicated moral reasoning that you have to be smart to understand (this is IMO a good thing, not a criticism!).
I actually think it would be a disaster if EA became anti-elitist, not just for EA but for the world. The civic foundation of the West is made up of Susan from South Nottingham who volunteers to run the local mother & baby group: if she stops doing that to ETG or what...
Thinking that 'the ends justify the means' (in this case, that making more donations justifies tax evasion) is likely to lead to incorrect calculations about the trade-offs involved. It's very easy to justify almost anything with this type of logic, which means we should be very hesitant.
As another commenter pointed out, tax money isn't 'your' money. Tax evasion (as opposed to 'tax avoidance' - which is legal) is stealing from the government. It would not be ethical to steal from your neighbour in order to donate the money, and likewise it is not ethical to steal from the government to donate money.
I mostly agree with your post from a purely financial perspective, I was just giving some examples where people might think that the potential financial benefits of buying a house are worth the potential risks you mentioned. I've got a friend who falls into the example you gave (doesn't have/plan to have children, will leave his house to charity in his will), and this doesn't seem like that terrible of a decision for him.
...EAs who will/may have children however perhaps shouldn't buy a home as, if they do, the pressure to leave the home to their children will
Not saying these situations apply to the person you were replying to, but I can think of a few instances where this would be the case.
My reading (and of course I could be completely wrong) is that SBF wanted to invest in Twitter (he seems to have subsequently pitched the same deal through Michael Grimes), and Will was helping him out. I don't imagine Will felt it any of his business to advise SBF as to whether or not this was a good move. And I imagine SBF expected the deal to make money, and therefore not to have any cost for his intended giving.
I agree that it's possible SBF just wanted to invest in Twitter in a non-EA capacity. My comment was a response to Habryka's comme...
I can see where you're coming from with this, and I think purely financially you're right, it doesn't make sense to think of it as billions of dollars 'down the drain.'
However, if I were to do a full analysis of this (in the framing of this being a decision based on an EA perspective), I would want to ask some non-financial questions too, such as:
My guess is there must be some public stuff about this, though it wouldn't surprise me if no one had made a coherent writeup of it on the internet (I also strongly reject the frame that people are only allowed to say that something 'makes sense' after having discussed the merits of it publicly. I have all kinds of crazy schemes for stuff that I think in-expectation beats GiveWell's last dollar, and I haven't written up anything close to a quarter of them, and likely never will).
Yeah, there could be some public stuff about this and I'm just not aware of it....
If SBF wanted to buy Twitter for non-EA reasons, that's one thing, but if the idea here is that purchasing Twitter alongside Elon Musk is actually worth billions of dollars from an EA perspective, I would need to see way more analysis, much like significant analysis has been done for AI safety, biorisk, animal welfare, and global health and poverty.
If you think investing in Twitter is close to neutral from an investment perspective (maybe reasonable at the time, definitely not by the time Musk was forced to close) then the opportunity cost isn't really b...
I think it could be a cost-effective use of $3-10 billion (I don't know where you got the $8-15 billion from; it looks like the realistic amounts were closer to 3 billion). My guess is it's not, but like, Twitter does sure seem like it has a large effect on the world, both in terms of geopolitics and in terms of things like norms for the safe development of technologies, and so, if you had taken Sam's net worth at face value at the time, this didn't seem like a crazy idea to me.
The 15 billion figure comes from Will's text messages ...
Part of the issue here is that people have been accounting the bulk of SBF's net worth as "EA money". If you phrase the ques...
The 15 billion figure comes from Will's text messages themselves (page 6-7). Will sends Elon a text about how SBF could be interested in going in on Twitter, then Elon Musk asks, "Does he have huge amounts of money?" and Will replies, "Depends on how you define "huge." He's worth $24B, and his early employees (with shared values) bump that up to $30B. I asked how much he could in principle contribute and he said: "~1-3 billion would be easy, 3-8 billion I could do, ~8-15b is maybe possible but would require financing"
Makes sense, I think I briefly sa...
Investing in assets expected to appreciate can be a form of earning to give (not that Twitter would be a good investment IMO). That's how Warren Buffett makes money, and probably nobody in EA has criticized him for doing that. Investing in something for-profit is very different from donating to something, and is guided by different principles, because you are expecting to (at least) get your money back and can invest it again or donate it later (this difference is one of the reasons microloans became so hugely popular for a while).
On the downside, concentr...
In terms of people coming away from the post thinking that polyamory = bad, I guess I have faith in people's ability on this forum to separate a bad experience within a community from the community as a whole. (Maybe not everyone holds this same faith.)
The post was written by one person, and it was their experience, but I expect by now most EAs have run into polyamorous people in their lives (especially considering that EAs on average tend to be young, male, non-religious, privileged, and more likely to attend elite universities where polyamory/discuss...
I'm conflicted here. I completely agree with you that shitting on others' morally-neutral choices is not ideal, but I don't think anyone was coming away from reading that post thinking that polyamory = bad. I would hope that the people on this forum can engage thoughtfully with the post and decide for themselves what they agree/disagree with.
If someone had a bad experience with a man, and in the process of talking about it said something like, "all men suck and are immoral," I just don't think that is the right time or place to get into an argument w...
I guess I don't see why someone wouldn't come away from the post thinking that polyamory = bad.
I think the analogy here is not "all men suck and are immoral" (though I'm not even sure how much I endorse that), but like, if someone had had bad experiences with men of a certain race, and in talking about it continually mentioned their race. I think people would rightly call that out as racist and not ok - we want to be sympathetic to victims, but if they are saying things that are harmful to others in the course of telling their experience, it'...
Great post! I agree with a commenter above who says that "The problem is not a lack of ideas that needs to be rectified by brainstorming - we have the information already. The problem seems to be that no one wants to act on this information." That being said, I have a few thoughts:
Regarding code of conduct at events, I'm hesitant to make hard and fast rules here. I think the reality around situations such as asking people out/hitting on people, etc, is that some people are better at reading situations than others. For example, I know couples who have...
So I was one of the top comments disagreeing with that post, and I'm a poly woman, and my interest wasn't to defend predatory poly men but to argue against the idea that my relationship structure, which is consensually, positively practiced by many people the world over, is inherently toxic or embedded in predatoriness. Trauma and upset should be met with sympathy, but it doesn't justify shitting on others' morally-neutral choices, and a community that's hostile to polyamory is hostile to many women and NBs, not just men.
Strong upvote. It's definitely more than "just putting two people in touch." Will and SBF have known each other for 9 years, and Will has been quite instrumental in SBF's career trajectory - first introducing him to the principles of effective altruism, then motivating SBF to 'earn to give.' I imagine many of their conversations have centred around making effective career/donation/spending decisions.
It seems likely that SBF talked to Will about his intention to buy Twitter/get involved in the Twitter deal, at the very least asking Will to make the in...
I'm not sure I agree with this. I agree that compassion is a good default, but I think that compassion needs to be extended to all the people who have been impacted by the FTX crisis, which will include many people in the 'Dank EA Memes' Facebook group. Humour can be a coping mechanism which will make some people feel better about bad situations:
..."As predicted, individuals with a high sense of humor cognitively appraised less stress in the previous month than individuals with a low sense of humor and reported less current anxiety despite experiencing a simi
but I think that compassion needs to be extended to all the people who have been impacted by the FTX crisis
I agree with this. But I think compassion needs to be extended even further, to every sentient being who might be worse off because of this event (though this assumes that EA suffering from this event will mean less good done in the world, and I recognize that there are people who seem to genuinely think the world will be better if EA disappears). And the implication of this is that we need to think about whether these mockeries an...
While I agree that humour is a great de-stressor, I have faith in our ability to find alternative ways to entertain ourselves that don't involve kicking someone while they're down.
Thanks for your response. On reflection, I don't think I said what I was trying to say very well in the paragraph you quoted, and I agree with what you've said.
My intent was not to suggest that Will or other FTX future fund advisors were directly involved (or that it's reasonable to think so), but rather that there may have been things the advisors chose to ignore, such as Kerry's mention of Sam's unethical behaviour in the past. Thus, we might think that either Sam was incredibly charismatic and good at hiding things, or we might think there actually were some warning signs and those involved with him showed poor judgement of his character (or maybe some mix of both).
I am glad you felt okay to post this - being able to criticise leadership and think critically about the actions of the people we look up to is extremely important.
I personally would give Will the benefit of the doubt of his involvement in/knowledge about the specific details of the FTX scandal, but as you pointed out the fact remains that he and SBF were friends going back nearly a decade.
I also have questions about Will MacAskill's ties with Elon Musk, his introduction of SBF to Elon Musk, his willingness to help SBF put up to 5 billion dollars towards t...
I personally would give Will the benefit of the doubt of his involvement in/knowledge about the specific details of the FTX scandal
Of course. This reads as almost bizarre: it would be a baby-eater-type conspiracy theory to think that Will (or anyone else in EA leadership) knew about this. That's just not how things work in the world. The vast majority of people at Alameda/FTX didn't know (inner circle may have been as small as four). I mean, maybe there's a tiny chance that Sam phoned up someone a week ago and wanted a billion in secret, but you can ...
Thanks Julia; this is a really insightful post. I will make sure to use it if anyone in the EA community asks me questions related to community health/the process for complaints in the future.
One of the things I'm curious about is how you see the balance of these trade-offs:
Encourage the sharing of research and other work, even if the people producing it have done bad stuff personally | Don’t let people use EA to gain social status that they’ll use to do more bad stuff |
Take the talent bottleneck seriously; don’t hamper hiring / projects too much | Tak |
due to CEA's response leaning towards the side of caution, the accuser walks away feeling like their complaint hasn't been taken seriously enough/that CEA should have been quicker to act
I'm sure this has happened, and I'm sad about that.
I also know different people who would say that CEA has been too aggressive in kicking people out, too willing to take action based on limited evidence.
I want to weigh the fact that people will feel alienated by both of these perceptions/experiences. But ultimately we can't make decisions based only on whether someone wi...
Good point - an aspect of this that I didn't expand on a lot is that it's really important for organisers to do things that they enjoy doing and this helps it to not feel forced.
On the other hand, I have had conversations with our group about maximising time spent together as a way to build better friendships and people generally reacted to this idea better than I imagined! I think sharing your intentions to maximise friendship-building activities will feel robotic to some people but others may appreciate the thought and effort behind it.
Thanks for posting this - it was an interesting and thoughtful read for me as a community builder.
This summarised some thoughts I've had on this topic previously, and the implications on a large scale are concerning at the very least. In my experience, EA's growth over the past couple of years has meant bringing on a lot of people with specific technical expertise (or people who are seeking to gain this expertise), such as those working on AI safety/biorisk/etc, with a skillset that would broadly include mathematics, statistics, logical reasoning, and ...
Do you have evidence for this? Because there is lots of evidence to the contrary - suggesting that job insecurity negatively impacts people's productivity as well as their physical and mental health.[1][2][3]