Of course you can argue that maintaining some form of social life, among other things, is necessary to sustain productivity over the long haul, but I think if you claim that this leads to anything close to a normal life you are being disingenuous.
I likely disagree, but it depends on definitions. Could you describe what a life fully committed to helping others looks like for you?
But it does not seem to me that most people here are actually completely committing their lives to helping others. I'd love to hear your reasoning for that.
No solid reasoning, but some reasons since you asked:
- I am not trying to completely commit my life to helping others. Furthermore, I personally could easily do more, e.g. donating 20% instead of 10%, while still having a comfortable, normal life.
- I have accepted that this may not be morally acceptable in some objective sense.
- I am happy with what I do, given the baseline in society. Sure, I could do more, but why should it be me? Fundamentally, it boils down to valuing other things beyond helping others or 'total utility', including my own enjoyment, having a comfortable life, etc.
- The community will likely achieve more the bigger it is, so having attainable standards is important. I think the 10% Pledge is a great benchmark.
- Peter Singer, a key figurehead of the community, is not fully vegan despite thinking veganism is morally correct. (He eats vegan whenever he cooks for himself, but will eat vegetarian if others are cooking for him.)
In the end, people are messy and weird and generally doing their best. But at least EAs are doing and achieving more than most, and are significantly moving the needle in a positive direction.
If there are willing volunteers, I would like to see an adversarial collaboration. Reading through the comments, it is tricky to disentangle what people mean, what the fundamental disagreements are, what the facts of the matter are, whether somebody (accidentally) misrepresented somebody else or even themselves, etc.
Some disagreements I see are:
- To what extent the particular individuals are 'bad people / racists / eugenicists'
- How much should the EA community influence the norms of and/or associate with the rationality/forecasting community
- Where the line lies between absolute free speech and moderation for different events (e.g. Manifest, EAGs, ...)
Additionally, I would be interested in some kind of prediction market to resolve some of the empirical questions, e.g. "What percentage of EAs would say 'agree' or 'strongly agree' to the statement 'person X should not have been an invited speaker at Manifest'?" Note I am new to prediction markets, so I am not comfortable setting up a question like this, which does not have an objective resolution.
Separately from this post, I emailed the EAG team and they replied with this option: "You could choose a free ticket and donate here through Giving What We Can, which is eligible for gift aid."
Unfortunately I have already registered, but I will do this in future.
For those interested, the question of whether EAG can qualify for 'Gift Aid' is answered on the UK Government website:
https://www.gov.uk/guidance/gift-aid-what-donations-charities-and-cascs-can-claim-on
"If any donor or person connected to the donor benefits significantly from their donation, it does not qualify for Gift Aid."
EDIT: I am (likely) wrong. EAG can qualify for 'Gift Aid'. See domdomegg's reply for why my comment is incorrect.
What is the current funding status of AISC?
Which funding bodies have you asked for funding, and do you know why they are not funding this (assuming they chose not to)? The funding options I know about are OpenPhil, EA Funds and Nonlinear.
My understanding is that you only just managed to get enough funding to run a budget version of AISC 10, so I presume you'll be looking for funding for AISC 11.
I was going to post this too! It is good for the community to know about these critiques and alternatives to EA. However, as JWS has already pointed out, the critiques are weak or based on a strawman version of EA.
But overall, I like the sound of the 'Moral Ambition' project, given that its principles align so well with EA. Though there is a risk of confusing outsiders given how similar the goals are, and also a risk of people being unfairly put off EA if they get such a biased perspective.