
Aïda Lahlou

Concert Pianist and Founder @ Oxford Concert Circle
19 karma · Joined · Working (0-5 years)

Comments (4)

Ha! Thanks for the great (and very funny!!) comment on a great post. It really resonates with me re:

  • the warmth of a community being a prime motivator for me to get involved in, or stick with, something
  • Re the LessWrong community: your point about them teaching themselves rationality and then deciding, at some point, that they know enough to be more rational than the average person. Yes!! It is sooo counterproductive and often results in overconfidence and, frankly, misplaced arrogance and contrarian attitudes. It's a shame, but I think that with a name like that, LessWrong was inevitably going to attract that kind of mindset.
  • Also, LOL about 'chunibyo' - will be using that word!

I also agree with OP that EA and its connected circles often seem happy to treat their members like disembodied thinking machines, completely erasing their wider personality from the picture. This is unlike other spaces, such as the church OP describes, where embodied IRL experiences are observed and valued, and real human needs, like community, are met. In my case it wasn't church, but I have really enjoyed being part of direct action environmental groups BECAUSE of the human element and the thrill of doing things IRL with nice people.

It's so draining not to be seen!... And no one lives their lives constantly thinking about EA topics.

Also, from the perspective of improving the quality of arguments, treating forum members as 'disembodied thinking beings' rather than very real human beings with different personalities, histories, genders, bodies, nationalities, languages, etc. hurts the discourse in two ways:

  1. It prevents us from monitoring the demographics of people taking part in debates to make sure we have a broad enough range of perspectives, because, yes, new perspectives are often the product of unique LIVED experiences (e.g. feminist economics).
  2. Not paying attention to the person behind the arguments means missing an opportunity to spot potential biases in those arguments. Encouraging people to bring 'their whole selves to the discourse' would also nudge them towards more intellectual rigour, since they would be scrutinised more closely.

I think it's a shame that OP feels he cannot contribute to the EA movement, and surely this speaks to a failure to show people how they can contribute. But I think the work that goes into making a community happy and engaged is just as important as whatever work that community does. A parallel I can think of is reproductive labour. For so long, domestic work and the work of bearing and raising children (traditionally done by women) was considered economically useless, or even harmful to the broader economy, by mainstream (and often male) economists. But when feminist economists came along, they explained that all that housework was a precondition for any economic work being done at all: you cannot go to work without food in your tummy, or if there's no one to look after the children (or to make and raise the children who will become workers in turn!). So whatever your estimation method, it's immensely valuable (a quick Google puts its value in the UK at £1.24 trillion annually, roughly 46% of an overall economy of about £2.7 trillion). This part of the economy, reproductive labour, doesn't appear in GDP measures, but it is absolutely essential to everything else happening.

There's a parallel here, because I think that in order to support the amazing work that goes on in the more concrete parts of EA (e.g. research, fantastic blog posts, charity entrepreneurship, fundraising), we need the human fuel to sustain it. That can mean memes, or, frankly, fun meetups and deep friendships that go beyond a purely corporate feel.

In the environmental groups I was a part of, people would meet for random things like knitting or banner-making, and although these weren't 'effective' things to do, they supported the rest of the work. It's very hard to stay involved with a movement that sees everything through the lens of extreme productivity and effectiveness.

I think everyone on this forum agrees that taking time to do things just for fun is important for well-being and long-term productivity.

What if being REALLY serious about fun and play, in the way Google and other Silicon Valley companies used to be known for (toboggans in the offices, playrooms, etc.), were a way to make the EA community both more inclusive and more effective? I really believe there's a good case to be made for this, and it would solve both of OP's problems: not feeling like his needs for community and fun are being met, AND the uncertainty about what he can contribute.

I think the question is not 'whether there should be EA efforts and attention devoted to AGI' but whether the scale of these efforts is justified, and, more importantly, how we reach that conclusion.

You say no one else is doing effective work on AI and AGI preparedness, but what I see in the world suggests the opposite. Given the amount of attention and investment in AI from individuals, states, universities, etc., statistically there are bound to be many excellent people working on it, and that number is set to increase as AI becomes even more mainstream.

Teenagers nowadays have bought into the idea that 'AI is the future' and go to study it at university, because working in AI is fashionable and well paid, regardless of whether they've had any deep exposure to Effective Altruism. If EA wanted this issue to become mainstream, well... it has undoubtedly succeeded. I would tend to agree that, at this moment in time, a redistribution of resources and 'hype' towards other issues would be justified and welcome (but again, crucially, we have to agree on a transparent method for deciding whether this is the case, as OP, I think, importantly calls for).

 

1) Regardless of who is right about when AGI might arrive (and bear in mind that we still have no proper definition of it), OP is right to call for more peer-reviewed scrutiny from people who are outsiders to both EA and AI.

This is just healthy, and regardless of whether this peer review reaches the same or different conclusions, NOT doing it automatically provokes legitimate fears that the EA movement is biased, because so many of its members have personal (and financial) stakes in AI.

See this point of view from Shazeda Ahmed (https://overthinkpodcast.com/episode-101-transcript). She's an information scholar who has looked at AI and its links with EA, and one of the critics of the lack of a counter-narrative.

I, for one, will tend to be skeptical of conclusions reached by a small pool of people who are similar (demographically, economically, but also in the way they approach an issue), as it feels like a missed opportunity for true debate and different perspectives.

I take the point that these are technical discussions, which makes it difficult to involve the general public in the debate, but not doing so creates the appearance (and often, more worryingly, the reality) of bias.

This can harm the EA movement as a whole (from my perspective, it already does).
I'd love to see a more vocal and organised opposition that is empowered, respected, and funded to genuinely test assumptions.

2) Studying and devoting resources to preparing the world for AI technology doesn't seem like a bad idea, given that:

Low probability × massive stakes still justifies large resource allocation. 
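A minimal sketch of that expected-value logic, with purely illustrative numbers (these are not anyone's actual estimates): even a 0.1% chance of a catastrophe affecting a billion lives gives

\[ \underbrace{0.001}_{\text{low probability}} \times \underbrace{10^9 \text{ lives}}_{\text{massive stakes}} = 10^6 \text{ expected lives} \]

which, on paper, competes with interventions that help far fewer people with near-certainty. Of course, the whole debate is about whether that probability term can be estimated with any rigour at all, which is exactly why the transparent method OP calls for matters.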

But, as OP seems to suggest, it becomes an issue when that focus is so prevalent that other, just as important or just as likely, issues are neglected because of it. It seems that EA's focus on rationality and 'counterfactuality' means it should encourage people to work in fields that are truly neglected.

But can we really say that AI is still neglected, given the massive outpouring of both private and public money into the sector? It is now very fashionable to work in AI, and a widespread belief is that doing so warrants a comfortable salary. Can we say the same thing about, say, the threat of nuclear annihilation, or biosecurity risk, or climate adaptation?


3) In response to the argument that 'even a false alarm would still produce valuable governance infrastructure': yes, but at what cost? I don't see much discussion of whether all those resources would be better spent elsewhere.

Interesting post.

To me, the most interesting bit of your article is the section on WHY we are drawn to short-term pragmatism.

A few more to add:
(emojis are for readability, this is not AI!)

🌟 Ego, or just a very understandable need for motivation, often leads us to short-term pragmatism. It's much nicer to say at a dinner party, e.g., 'Our organisation has saved 2m chickens from factory farms' than 'we have increased the connectedness of the pro-animal network by 50% in our area'. We'd rather think of ourselves as heroes than as bureaucrats.

There is always going to be friction between the desire to achieve 'flashy' but ultimately meretricious milestones and the desire to hit less sexy but more strategically important milestones that bring about a more radical end result.

In many people, the desire to bring about the most radical reality coexists with the desire to see results that warm their hearts and keep them motivated... and the latter can often take precedence, due to the very practical need to be motivated enough to get out of bed in the morning.


🔥 Crisis / emergency thinking: our empathy, I feel, generally pushes us towards solving the immediate problems we are witnessing rather than taking a step back and adopting a high-level strategy.
--> If your house is on fire, you're going to try to put out the fire rather than campaign for more effective fire-prevention regulations in buildings.


Every day, hundreds of millions of animals go through factory farming, and many of us would find more comfort in stopping their suffering NOW rather than working longer-term towards a complete ban on factory farming. And many people would think that's not the worst thing in the world, because:

🥲 "This is better than nothing" (or "opportunity cost" type thinking)
If players feel like there is not enough capacity in a work area, and feel disempowered to build it, I can see why they would turn towards the low-hanging fruit of achieving good but strategically less significant results. I would informally call that 'this is better than nothing', or 'opportunity cost', type thinking.

______

I personally prefer to see deeper, more substantial (visionary pragmatic) approaches to solving a problem (it seems to me that this is definitely more EA), but I have sympathy for people who'd rather see results right here, right now.

The way to bring more people to visionary pragmatism is probably just very gentle nudging and challenging. If people are not visionary pragmatists for ego reasons, then they probably cannot be easily turned. But people needing motivation, crisis-mode thinkers, and, to some extent, disempowered 'better than nothing' people can probably switch to visionary pragmatism with the right support. I'm grateful to books like Moral Ambition for glamorising the coordination work of activists in the public imagination.

On another note, I see a lot of parallels in the Global Health space.
Single-issue Global Health initiatives can often be in direct conflict with the implementation of more sustainable, effective, nationally led health systems, which is the ultimate goal.

(Psst: now, with new initiatives like the Lusaka Agenda, this is finally starting to change, as people begin to think of a new Global Health architecture that prioritises transitioning towards support for national health systems. It took some HARD work, with many individual GHIs pushing back, but it looks like this is now the consensus direction of travel.)