Denise_Melchin

Comments

EA is vetting-constrained

Yes, everything I said above is sadly still true. We still do not receive many applications per distribution cycle (~12).

Max_Daniel's Shortform

(I have not read through Max's link dump yet, though it seems very interesting; I also feel some skepticism of the 'new optimism' worldview.)

One major disappointment for me in Pinker's book, as well as in related writings, has been that they do little to acknowledge that how much progress you think the world has seen depends a lot on your values. To name some examples, not everyone views the legalization of gay marriage and easier access to abortion as progress, and not everyone thinks that having plentiful access to consumer goods is a good thing.

I would be very interested in an analysis of 'progress' in light of the different moral foundations discussed by Haidt. I have the impression that Pinker focuses exclusively on the 'care/harm' foundation while completely ignoring others like 'sanctity/purity' or 'authority/respect', and this might be where some of the disconnect between the 'new optimists' and their opponents comes from.

What are the leading critiques of "longtermism" and related concepts

That's very fair; I should have been a lot more specific in my original comment. I have been a bit disappointed that within EA, longtermism is so often framed in utilitarian terms - I have found the collection of moral arguments in favour of protecting the long-term future brought forth in The Precipice a lot more compelling and wish they would come up more frequently.

Finding an egg cell donor in the EA community

You would need to check the legality of this, however - it is illegal in at least a few European countries, including the UK and Germany.

What are the leading critiques of "longtermism" and related concepts

Most people don't value not-yet-existing people as much as people who are already alive. I think it is the EA community holding the fringe position here, not the other way around. Nor is total utilitarianism a majority view among philosophers. (You might want to look into critiques of utilitarianism.)

If you pair this value judgement with the belief that working on existential risk is less valuable than working on other issues for helping people this century, you will probably want to work on "non-longtermist" problems.

Finding an egg cell donor in the EA community

Hi linn!

Which country are you in? I have been putting a lot of thought into becoming an egg donor in the UK over the past few months and am currently in the evaluation process for one egg bank and one matching service.

First, I would like to note that while most matching services primarily match on phenotype, there certainly are some where you get a detailed profile of the potential donors. I would be happy to tell you the name of the matching agency in the UK that I have been working with, which strongly encourages getting a good personality match.

I would expect finding a donor directly from the EA community to be much harder, but maybe someone will respond to your request (it would also be good to know where you live!). Feel free to PM me to chat more.

Long-Term Future Fund and EA Meta Fund applications open until June 12th

We have a limited pot of money available, so our decisions are primarily bottlenecked by its size. We have occasionally (once?) decided not to spend the complete available amount in order to have more money available for the next distribution cycle, when we had reason to assume we would be able to make stronger grants then.

I am not sure whether that answered your question?

New data suggests the ‘leaders’’ priorities represent the core of the community

This is very much an aside, but I would be really curious how many of the people you perceive as having changed their views to longtermism would actually agree with this. (According to David's analysis, it is probably a decent number.)

For example, I'm wondering whether I would count in this category. From the outside I might have looked like I changed my views towards longtermism, while from the inside I would describe my views as pretty agnostic - I just prioritised community preferences over my own. There might also be some people who felt like they had to appear to hold or act on longtermist views in order not to lose access to the community.

New data suggests the ‘leaders’’ priorities represent the core of the community

Yes, that is what I meant. Thank you so much for providing additional analysis!

New data suggests the ‘leaders’’ priorities represent the core of the community

Thank you for looking into the numbers! While I don't have a strong view on how representative the EA Leaders forum is, taking the survey results about engagement at face value doesn't seem right to me.

On the issue of longtermism, I would expect people who don't identify as longtermists to now report being less engaged with the EA Community (especially with the 'core') and to identify as EA less. Longtermism has become a dominant orientation in the EA Community, which might put people off, even if their personal views and actions related to doing good haven't changed, e.g. their donation amounts and career plans. The same goes for looking at how long people have been involved with EA - people who aren't compelled by longtermism might have dropped out of identifying as EA without actually changing their actions.
