I mostly haven't been thinking about what the ideal effective altruism community
would look like, because most of the value of effective altruism may come down
to its impact on steering the world towards better AGI futures. But even in
worlds where AI risk weren't a problem, the effective altruism movement seems
lackluster in some ways.
I am thinking especially of the effect it often has on university students and
younger people. My sense is that EA sometimes influences these people to be
closed-minded, or at least doesn't make them as ambitious or as interested in
exploring things outside "conventional EA" as would be ideal. Students who come
across EA often become too attached to specific EA organisations or to the paths
to impact suggested by existing EA institutions.
In an EA community that was more ambitiously impactful, a higher proportion of
folks would be at least strongly considering things like:
 * starting startups that could be really big;
 * traveling to various parts of the world to form a view about how poverty
   affects welfare;
 * keeping long Google Docs with their current best guesses for how to get rid
   of factory farming;
 * looking at non-"EA" sources to figure out what more effective interventions
   GiveWell might be missing, perhaps because they're somewhat controversial;
 * doing more effective science/medical research;
 * writing something on better thinking and decision-making that could be as
   influential as Eliezer's sequences;
 * expressing curiosity about whether charity is even the best way to improve
   human welfare;
 * trying to fix science.
And a lower proportion of these folks would be applying to jobs on the 80,000
Hours job board or choosing to spend more time within the EA community rather
than interacting with the most ambitious, intelligent, and interesting people
amongst their general peers.
Reddit user blueshoesrcool [https://old.reddit.com/user/blueshoesrcool]
discovered
[https://old.reddit.com/r/SneerClub/comments/13t23ti/effective_ventures_misses_reporting_deadline/]
that Effective Ventures [https://ev.org/] (the umbrella organization for the
Centre for Effective Altruism, 80,000 Hours, GWWC, etc.) has missed its charity
reporting deadline by 27 days
[https://register-of-charities.charitycommission.gov.uk/charity-search/-/charity-details/5026843/accounts-and-annual-returns].
Given that there's already a regulatory inquiry into Effective Ventures
Foundation
[https://forum.effectivealtruism.org/posts/C89mZ5T5MTYBu8ZFR/regulatory-inquiry-into-effective-ventures-foundation-uk],
maybe someone should look into this.
Rational Animations has a subreddit:
https://www.reddit.com/r/RationalAnimations/
[https://www.reddit.com/r/RationalAnimations/]
I hadn't advertised it until now because I had to find someone to help moderate
it.
I want people here to be among the first to join since I expect having EA Forum
users early on would help foster a good epistemic culture.
SOME POST-EAG THOUGHTS ON JOURNALISTS
For context: CEA accepted to EAG Bay Area 2023 a journalist who has at times
written critically of EA and of individual EAs, and who is very much not a
community member. I am deliberately not naming the journalist, because they
haven't done anything wrong and I'm still trying to work out my own thoughts.
On one hand, "journalists who write nice things get to go to the events,
journalists who write mean things get excluded" is at best ethically
problematic. It's also very, very normal: political campaigns do it, industry
events do it, individuals do it. "Access journalism" is the norm more than it is
the exception. But that doesn't mean we should do it. One safeguard is to be
very careful to keep "community member or not" separate from "critical or not".
Dylan Matthews is straightforwardly an EA and has reported critically on a past
EAG [https://www.vox.com/2015/8/10/9124145/effective-altruism-global-ai]: if he
were excluded for this I would be deeply concerned.
On the other hand, I think that, when hosting an EA event, an EA organization
has certain obligations to the people at that event. One of them is protecting
their safety and privacy. EAs who are journalists can, I think, generally be
relied upon to be fair and to respect the privacy of individuals. That is not a
trust I extend to journalists who are not community members
[https://observer.com/2012/07/faith-hope-and-singularity-entering-the-matrix-with-new-yorks-futurist-set/]:
the linked example is particularly egregious, but tabloid reporting happens.
EAG is a gathering of community members. People go to advance their goals: see
friends, network, be networked at, give advice, get advice, learn interesting
things, and more. In a healthy movement, I think attending an EAG should be a
professional obligation, good for the attendee, or fun for the attendee. It
doesn't have to be all three, but it shouldn't harm them on any axis.
Someone might be out ab
On Socioeconomic Diversity:
I want to describe how the discourse on sexual misconduct may be reducing the
specific type of socioeconomic diversity I am personally familiar with.
I’m a white female American who worked as an HVAC technician with co-workers
mostly from racial minorities before going to college. Most of the sexual
misconduct incidents discussed in the Time article
[https://time.com/6252617/effective-altruism-sexual-harassment/] have likely
differed from standard workplace discussions in my former career only in that
the higher status person expressed romantic/sexual attraction, making their
statement much more vulnerable than the trash-talk I’m familiar with. In the
places most of my workplace experience comes from, people of all genders and
statuses make sexual jokes about coworkers of all genders and statuses not only
in their field, but while on the clock. I had tremendous fun participating in
these conversations. It didn’t feel sexist to me because I gave as good as I
got. My experience generalizes well: even after Donald Trump made a joke about
sexual assault that many upper-class Americans believed disqualified him,
immediately before the election he won, Republican women
[https://www.vox.com/2016/10/9/13217158/polls-donald-trump-assault-tape] were no
more likely than Republican voters in general to think he should drop out of the
race. Donald Trump has been able to maintain much of his popularity despite
denying the legitimacy of a legitimate election in part because he identified
the gatekeeping elements of upper-class American norms as classist
[https://astralcodexten.substack.com/p/a-modest-proposal-for-republicans]. I am
strongly against Trump, but believe we should note that many female Americans
from poorer backgrounds enjoy these conversations, and many more oppose the kind
of punishments popular in upper-class American communities. This means strongly
disliking these conversations is not an intrinsic virtue, but a decision EA
culture has made.
Proposing a change to how Karma is accrued:
I recently reached over 1,000 Karma, meaning my upvotes now give 2 Karma and my
strong upvotes give 6 Karma. I'm most proud of my contributions to the forum
about economics, but almost all of my increased ability to influence discourse
now is from participating in the discussions on sexual misconduct. An upvote
from me on Global Health & Development (my primary cause area) now counts twice
as much as an upvote from 12 of the 19 authors of 200-300 Karma posts with the
Global Health & Development tag. They are generally experts in
their field working at major EA organizations, whereas I am an electrical
engineering undergraduate.
I think these kinds of people should have far more ability to influence the
discussion via the power of their upvotes than me. They will notice things about
the merits of the cases people are making that I won't until I'm a lot smarter
and wiser and farther along in my career. I don't think the ability to say
something popular about culture wars translates well into having insights about
the object level content. It is very easy to get Karma by participating in
community discussions, so a lot of people are now probably in my position after
the increased activity in that area around the scandals. I really want the
people with more expertise in their field to be the ones influencing how visible
posts and comments about their field are.
I propose that Karma earned from comments on posts with the community tag
accrues at a slower rate.
Edit: I just noticed a post by moderators that does a better job of explaining
why karma is so easy to accumulate in community posts:
https://forum.effectivealtruism.org/posts/dDudLPHv7AgPLrzef/karma-overrates-some-topics-resulting-issues-and-potential
[https://forum.effectivealtruism.org/posts/dDudLPHv7AgPLrzef/karma-overrates-some-topics-resulting-issues-and-potential]
LEARNING FROM AMNESTY INTERNATIONAL'S MANAGEMENT MALPRACTICE CRISIS
The recent discussions of harms caused by EAs vaguely reminded me of
controversies around misbehaviour committed by leaders of Amnesty International.
Very horribly, these apparently only came to light after two suicides that were,
as I understand it, partially caused by workplace bullying at Amnesty
International offices.
From Wikipedia
[https://en.wikipedia.org/wiki/Amnesty_International#2019_report_on_workplace_bullying]:
POTENTIAL NEXT STEPS
(I likely won't find time to do more here. :/ )
Amnesty hired the Konterra Group, which subsequently wrote the "AMNESTY
INTERNATIONAL Staff Wellbeing Review"
[https://www.amnesty.org/en/wp-content/uploads/2021/05/ORG6097632019ENGLISH.pdf];
on a quick skim, it seems generally insightful and potentially applicable to EA.
* Skim the report and extract useful lessons for EA.
 * Quickly evaluate whether the report's quality and value suggest that EAs
   might want to work with the Konterra Group
   [https://konterragroup.net/evaluation-organizational-learning/what-we-do/] to
   review the EA community.
I'm hiring for a new Director
[https://docs.google.com/document/d/1GbdjO-H3LjLKMKa42KGWnjXOAGGjBGoZ/edit?usp=sharing&ouid=115335349683161452379&rtpof=true&sd=true]
at Social Change Lab to lead our team! This is a hugely important role, so if
anyone is at all interested, I encourage you to apply. If you have any
questions, please feel free to reach out.
-
Social Change Lab [https://www.socialchangelab.org/] is a nonprofit conducting
and disseminating social movement research to help solve the world’s most
pressing problems. We’re looking for a Director to lead our small team in
delivering cutting-edge research on the outcomes and strategies of social
movements, and ensuring widespread communication of this work to key
stakeholders. You would play a significant role in shaping our long-term
strategy and the programs we want to deliver. See more information below, the
full job description here
[https://docs.google.com/document/d/1GbdjO-H3LjLKMKa42KGWnjXOAGGjBGoZ/edit?usp=sharing&ouid=115335349683161452379&rtpof=true&sd=true]
and apply here [https://forms.gle/3WzrC3FiYc3FdJLW8].
 * Application deadline: 2nd of June, 23:59 BST. Candidates will be considered
   on a rolling basis, so early applications are encouraged. Apply here.
[https://forms.gle/3WzrC3FiYc3FdJLW8]
* Contract: Permanent, working 37.5 hours/week.
* Location: London or UK preferred, although fully remote or overseas
applications will also be considered.
* Salary: £48,000-£55,000/year dependent on experience.
If anyone is interested or knows someone who might be a good fit, please share
the job advert with them or let me know. You can also see some more context on
the leadership change here
[https://www.socialchangelab.org/post/a-leadership-change-at-social-change-lab].