Luke Chambers

BSc (Hons) Computer Science with Artificial Intelligence

Master of Laws (LLM) Space Law

PhD Candidate (Law) - Title: "AI and Machine Learning Nascent Visual Biometrics in Police Intelligence and Criminal Evidence – Impacts on Reliability and Fairness"

I currently work in the criminal justice system of England & Wales alongside researching my PhD. My academic history in AI and in law has resulted in an avid interest in all things AI law (especially criminal law and human rights law) and its value to Longtermist principles. If you ever want to chat about the topic, please feel free to pop me an email at lukecarlchambers@protonmail.com :)

Comments

How I Recommend University Groups Approach the Funding Situation

That's a good point about community organisers acting as a kind of filter. I like to think I'd know if someone was looking to extract profit. To be honest, we usually have the opposite problem: I've heard a few times from people that they 'don't want to take the p*ss', and I have to convince them it's alright to stay at a two-star hotel instead of a one-star! I think the groups function well because it's possible (in theory; it hasn't happened to me yet) to tell when someone's being shifty. So I agree with that point.

I do still think, though, that too much focus on this discourse risks socioeconomic exclusion. I know people don't intend it this way, but the discourse can come off as quite elitist when worded carelessly. It's a risk. At the same time, I would hate to chill anyone's free speech or valid concerns. Communities are always a delicate balancing act! Difficult to get right.

 

How I Recommend University Groups Approach the Funding Situation

That's an interesting tie-in to the 'burnout' discourse we've been seeing lately, and one I had not even considered.

How I Recommend University Groups Approach the Funding Situation

It's something I would be willing to write if others wanted to read it, unless the original poster would rather do it.

How I Recommend University Groups Approach the Funding Situation

I'm going to commit the cardinal sin of replying to my own comment: I thought of another point after hitting 'post', but I didn't want to disrupt the flow of the original comment!

I believe there IS a case to be made for teaching organisers how to spend funds more smartly. I have been to larger EA events where I've thought to myself, 'this could have been done at half the price'. Maybe it's because I grew up in an environment where you had to make every penny stretch as far as possible, but it blew me away when another group leader mentioned to me that they don't negotiate costs with vendors, i.e. haggle on price for room fees, food, etc. Some find it distasteful, and I get that, but a lot could be saved.

Also, some events can be unnecessarily ostentatious. Do we really need a room with this much gold and this many antique clocks? You could have rented a Soviet-style office room at half the price two miles away.

Then again, it's very easy for me to criticise others given my near-zero large-scale event planning experience; maybe there are other factors I'm not considering. That said, giving group leaders some books on negotiation or frugality might help with a range of the issues highlighted in this post.

How I Recommend University Groups Approach the Funding Situation

I think this is a good guide, and thank you for writing it. I found the bit on how to phrase event advertising particularly helpful.

One thing I would like to elaborate on is the 'rent-seekers' bit, and here I'm going to disagree with a lot of the other comments: I think we need to be careful about how we approach such 'rent-seeking' conversations. This isn't a criticism of what you wrote, as you explained it really well, but more a comment on a trend I've noticed recently in EA discourse; this is a good opportunity to mention it.

It's important to highlight that not all groups are equal, demographically. I co-lead a group in a city where the child poverty rate has risen from 24% to a whopping 42% in five years, and which remains one of the poorest cities in the UK. I volunteer my time at a food bank and can tell you that demand has never been stronger. Simply put, things are tough here. One of the things I am proudest of in our EA group is that we've done a load of outreach to people who face extra barriers to participating in academia and research, and as a result we have a group with a great range of life backgrounds. I'm sure ours isn't the only EA group to achieve this; I've spoken to other group leads who have made an effort to achieve the same effect.

We've adapted our strategies and events a bit to enable this, e.g. pre-buying public transport tickets so people can attend our events, or offering wage replacement: if someone attends a day-long event, we'll pay a micro-stipend equivalent to a 10- or 12-hour shift at minimum wage (though this is rare, as we're careful about when we arrange things). We did this because some people literally couldn't afford a day off to attend conferences or present their research; that lost day had significant consequences for them. As a result, we've had some fantastic things come from people who otherwise would not have had the opportunity to contribute their valuable ideas and work.

My point is that a lack of funding is an extremely real barrier to many people's participation, not just in EA but in academia and research in general. I understand that there is a very real risk of people using EA events as a 'free holiday', and it's something that bears mitigating, but we also have to be really careful about unfairly tarring people who rely on full funding to attend events. I fully intend to encourage as many members of my group as possible to attend the conferences, because they have lots to gain and lots to contribute. I understand the 'rent-seeking' fear is that people will use EA conferences to pursue jobs or grant funding for projects, but I don't think this is as high a risk as people say: those are EA-aligned jobs and grants, and those organisations have their own safeguards. They can see through false interest fairly easily. As for reducing the quality of conferences, I'm not sure how you could reliably tell the difference between a 'rent-seeker' and someone who just doesn't know EA in depth yet, or who is nervous.

Essentially, it boils down to this: in some groups, as in your example, only a few people may be suitable to attend the conferences. However, there are contextual and geographical factors at play which mean that some groups may make more applications than others, and that isn't necessarily a 'rent-seeking' issue; some groups may just need more help for more people. As a result, a higher number of applications from group X than group Y isn't necessarily an indication of 'rent-seeking'.

I'm always extremely apprehensive about any 'rent-seekers' discourse because it seems to follow a similar pattern to class warfare in the mainstream media: for example, the demonisation of people on benefits despite the fact that benefits fraud makes up a microscopic share of overall fraud. The perceived risk of someone taking advantage of the group (whether that's society, an organisation, etc.) is often inflated compared to the actual risk. I would be very interested to see any confirmed examples of rent-seeking, to try and gauge how big the current threat is. I assume the grant-makers check the hotels people are claiming for (not 5-star, etc.) and the length of stay (not booking 8 days for a 2-day event). You also sign in to events via a QR code, so checking that people actually attended is fairly easy. I assume EA organisers can also access people's agendas, to a degree, and see whether people are actively engaging with others. These various safeguards should make this issue quite trackable. If it's a matter of engaging in good faith, that's so hard to measure that I'm not even sure it's possible.

A final bit I would like to expand on is this:

"I’ve seen cases where people seem more motivated by the free flight than the conference itself"

There is also a risk of mistaking excitement for motivation. For many people, an EA conference will be the first time they've travelled to another country (or even another city), so lots of excitement surrounding the trip itself is normal. My first ever EAG London was my first time travelling to my own nation's capital. You can bet I had a walk around the tourist sites after my agenda for the day was finished, and that's okay. I understand the concern about people doing it just for the flight, travel, or hotel, but the safeguards in place would (I assume) prevent this from being the case.

You make really good points, and I think the 'rent-seekers' risk bears watching to see if it becomes a genuine threat, but I am concerned about it becoming an increasing part of EA discourse. If we're not careful, it could drive away otherwise great contributors because of entrenched social and class issues. EA already has intellectual diversity issues, and we need to be careful about exacerbating rather than fixing them. I also understand that 'rent-seeker' is in no way intended to mean 'low economic background'; however, many of the 'rent-seeker' red flags listed here and elsewhere could also be signs of someone overcoming class and social barriers, so there's a risk of mistakenly alienating people from certain backgrounds over others.

Again, I 100% know this isn't what you meant, and this was a really helpful guide; I'm commenting more on the general discourse trend I'm noticing on the forum, in the Twitter group, and in some blogs. I am concerned that fears of rent-seekers could be overblown relative to the actual risk, and I would be interested to see some evidence-based research in this area.

 

My Most Likely Reason to Die Young is AI X-Risk

I face enormous challenges convincing people of this. Many people don't see, for example, widespread AI-empowered human rights infringements as an 'existential catastrophe' because they don't directly kill people, so they fall through the cracks of AI safety definitions, despite being a far more plausible threat than AGI given that they're already happening. Severe curtailment of humanity's potential still firmly counts as an existential risk in my opinion.

Half-baked ideas thread (EA / AI Safety)

I've often thought that paying automated, narrow-AI systems such as warehouse bots or factory robots a 'wage' (even though they're not sentient or anything) would help with many of the issues ahead of us as general automation increases. As employment goes down (less tax revenue) and unemployment (voluntary or otherwise), and therefore social welfare spending, goes up, it creates considerable strain. Paying automated systems a 'wage' which can then be taxed might help alleviate that. It wouldn't really be a wage, obviously; more like an ongoing fee for using such systems, paid towards the cost of caring for humans. Bonus points if that money actually goes into a big pot which helps reimburse people who suffer harm from automated systems. It might be a good stop-gap until our economy adjusts, as tax revenue wouldn't dip as far.

Obviously this is MASSIVE spitball territory, not an idea I've thought about seriously because I literally don't have the time, but it could be interesting. The first step would be to check whether automation is actually resulting in employment going down, because I'm not sure there's evidence of that yet.
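To make the mechanism concrete, here's a purely illustrative back-of-the-envelope sketch in Python; every number, rate, and name in it is hypothetical, made up for illustration rather than drawn from any real proposal or data:

```python
# Toy model of an 'automation levy': automated systems are assigned a
# notional minimum-wage 'wage', which is then taxed like a human wage.
# Every number here is made up purely for illustration.

MINIMUM_WAGE = 10.50   # GBP per hour (hypothetical)
TAX_RATE = 0.20        # flat tax rate on wages and pseudo-wages (hypothetical)

def lost_tax(displaced_workers: int, hours_per_year: int) -> float:
    """Income tax no longer collected from workers displaced by automation."""
    return displaced_workers * hours_per_year * MINIMUM_WAGE * TAX_RATE

def levy_raised(robots: int, hours_per_year: int) -> float:
    """Levy collected on each robot's notional minimum-wage earnings."""
    return robots * hours_per_year * MINIMUM_WAGE * TAX_RATE

# e.g. 100 warehouse workers displaced by 40 robots running long shifts
lost = lost_tax(displaced_workers=100, hours_per_year=1800)
raised = levy_raised(robots=40, hours_per_year=6000)

print(f"Income tax lost: £{lost:,.0f}/year")
print(f"Levy raised:     £{raised:,.0f}/year")
print(f"Net change:      £{raised - lost:+,.0f}/year")
```

On these made-up numbers the levy more than covers the lost tax, simply because a robot can run far more hours per year than a human shift. Whether that holds in reality depends entirely on the empirical displacement figures, which is exactly the evidence gap mentioned above.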

EA is more than longtermism

"There's some vague sense that current-day concerns (like algorithmic bias) are not really AI Safety research. Although I've talked to some who think addressing these issues first is key in building towards alignment. 

 

Now don't go setting me off about this topic! You know what I'm like. Suffice to say, I think combatting social issues like algorithmic bias is potentially the only way to realistically begin the alignment process: build transparency, etc. But that's a conversation for another post :D

EA is more than longtermism

Frances, your posts are always so well laid out with just the right amount of ease-of-reading colloquialism and depth of detail. You must teach me this dark art at some point!

As for the content of the post itself, it's funny that the two big recent criticisms of longtermism in EA are that EA is too longtermist and that EA isn't longtermist enough! I've always thought that means it's about right, haha. You can't keep everyone happy all of the time.

I'm one of those people you mention who only really interacts with the longtermist side of EA, because it meshes well with my area of expertise, but to be honest I feel the balance is about right at present. If EA were to become all-longtermist, I think it would be a bit of an over-correction; we need other philosophies too in order to keep a broad palette. But if it did happen, and there was good reason for it, I'd get over it.
 

In regards to:

"I personally disagree with this. As a counter-argument: Longtermism, as a worldview, does not want present-day people to suffer; instead, it wants to work towards a future with as little suffering as possible, for everyone."


My one criticism of current longtermist thought is that sometimes people exclude short-term actions for not being 'longtermist enough', e.g. creating fertile ground for positive future actions. But this is potentially just my sphere and isn't representative of the greater philosophy. I just know from conversations I've had that some people are of the opinion that it's the 'big win in 50 years' time' or nothing at all, and they don't like the idea of positive short-term baby steps towards aligning the long-term future. However, the literature is fine with this and it seems to be just some people who aren't, so perhaps it's the company I keep :) You're right, though, that the worldview itself addresses this.

Great post!

 
