
Kestrel🔸

952 karma · Joined · Working (0-5 years) · Lancaster, UK

Bio

I work as a researcher in statistical anomaly detection in live data streams. I work at Lancaster University and my research is funded by the Detection of Anomalous Structure in Streaming Settings group, which is funded by a combination of industrial funding and the Engineering and Physical Sciences Research Council (ultimately the UK Government).

There's a critical research problem that's surprisingly open: if you are monitoring a noisy system for a change of state, how do you ensure that you detect any change as soon as possible, while keeping your monitoring costs as low as possible?

By "low", I really do mean low - I am interested in methods that take far less power than (for example) modern AI tools. If the computational cost of monitoring is high, the monitoring just won't get done, and then something will go wrong and cause a lot of problems before we realise and try to fix things.
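To make "low-cost monitoring" concrete, here is a minimal sketch of CUSUM, a classic sequential changepoint detector (this is a textbook illustration, not my group's actual method; the `drift` and `threshold` values are arbitrary assumptions for the example). It keeps constant memory and does constant work per observation, which is the kind of cost profile I mean:

```python
def cusum_detect(stream, mean0=0.0, drift=0.5, threshold=5.0):
    """Return the index at which an upward mean shift is first
    detected, or None if no shift is flagged.

    O(1) state and O(1) work per observation: no history is stored
    and nothing is reprocessed.
    """
    s = 0.0
    for i, x in enumerate(stream):
        # Accumulate evidence of an upward shift, clipped at zero so
        # pre-change noise doesn't build up.
        s = max(0.0, s + (x - mean0) - drift)
        if s > threshold:
            return i
    return None

# Toy stream: flat at 0 for 50 samples, then a jump to 2.
data = [0.0] * 50 + [2.0] * 50
print(cusum_detect(data))  # → 53: flagged three samples after the change
```

The trade-off the bio describes lives in `threshold`: raise it and you get fewer false alarms but a longer detection delay; lower it and you detect faster but alarm more often on noise.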

This has applications in a lot of areas and is valued by a lot of people. I work with a large number of industrial, scientific and government partners.

Improving the underlying mathematical tooling behind figuring out when complex systems start to show problems reduces existential risk. If for some reason we all die, it'll be because something somewhere started going very wrong and we didn't do anything about it in time. If my research has anything to say about it, "the monitoring system cost us too much power so we turned it off" won't be on the list of reasons why that happened.

I also donate to effective global health and development interventions and support growth of the effective giving movement. I believe that a better world is eminently possible, free from things like lead pollution and neglected tropical diseases, and that everyone should be doing at least something to try to genuinely build a better world.

Comments (128)

Amazing work! I really appreciate everything you're doing to get more people into jobs that meaningfully improve the world.

I agree. EA has a cost-effectiveness problem that conflicts with its truth-seeking attempts. EA's main driving force is cost-effectiveness, above all else - even above truth itself.

  • EA is highly incentivised to create and spread apocalyptic doom narratives. This is because apocalyptic doom narratives are good at recruiting people to EA's "let's work to decrease the probability of apocalyptic doom (because that has lots of expected value given future population projections)" cause area. And funding-wise, EA community funding (at least in the UK) is pretty much entirely about trying to make more people work in these areas.
  • EA is also populated by the kinds of people who respond to apocalyptic doom narratives, for the basic reason that if they didn't they wouldn't have ended up in EA. So stuff that promotes these narratives does well in EA's attention economy.
  • EA just doesn't have anywhere near as much £$€ to spend as academia does. It's also very interested in doing stuff and willing to tolerate errors as long as the stuff gets done.  Therefore, its academic standards are far lower.

I really don't know how you'd fix this. I don't think research into catastrophic risks should be conducted on a shoestring budget and by a pseudoreligion/citizen science community. I think it should be government funded and probably sit within the wider defense and security portfolio.

However I'll give EA some grace for essentially being a citizen science community, for the same reason I don't waste effort grumping about the statistical errors made by participants in the Big Garden Birdwatch.

Due to popular demand, we're giving people the option to attend the full gathering in person, attend the gathering's organised sessions online (suitable for attendees from other countries for whom travel is burdensome), or attend day sessions only (no accommodation is booked for you; suitable for locals or EA Hotel grantees). There's now a tick box on the form.

If you signed up before this was available, you're assumed to be attending the full gathering unless you've told me otherwise.

Looking forward to the event, it's going to be great! There are still a few full-gathering spots left, so feel free to share this with anyone you think would benefit from some support for their effective giving projects.

I'm a trustee of a (much smaller) charity.

I would say: go for it! Try to learn a lot from your experience. It's a huge development opportunity for you.

Hello! Great questions. My answers:

  • SMI is an illness that is present in high-income countries. There is therefore comparatively a lot of funding available to research it, and the space is not so neglected as to be an EA funding possibility.
  • I strongly believe that EA worker time should be spent on improving the treatment of high-income country illnesses like SMI (i.e. it is a high-impact career path that EA should encourage people who are altruistically ambitious to go into, although their salaries should be paid - one way or another - by their country's government and not by EA funds)
  • In my opinion SMI research has, on the whole, suffered horribly over the past few decades by an overpropensity towards neurobiological approaches. Quote by NIMH head Thomas Insel: “I spent 13 years at NIMH really pushing on the neuroscience and genetics of mental disorders, and when I look back on that I realize that while I think I succeeded at getting lots of really cool papers published by cool scientists at fairly large costs—I think $20 billion—I don’t think we moved the needle in reducing suicide, reducing hospitalizations, improving recovery for the tens of millions of people who have mental illness.”
  • If you're interested in cool recent SMI research, check out AVATAR therapy: https://wellcome.org/insights/articles/avatar-digital-therapy-could-help-people-who-hear-voices 

I will add onto this: EA has (and individual EAs have) a vast difference in threshold between "approve of" and "support". EAs tend to not support a lot of things they approve of, because the resource pool for support is extremely limited.

This can sometimes be difficult to navigate relationally, if you are used to equating the concepts. Do not take a lack of support as indicative of a lack of approval. EAs (probably) approve of you doing good things for the world.

If it helps at all, people definitely read your updates, and it would be a shame if you stopped posting them here. I've recommended to students trying to "do EA things" that they should start a local PauseAI chapter. Partially that's because people from PauseAI post on here.

So I think the problem (?) is that nobody donates to EA infrastructure for the purpose of cultivating a nice community. They donate to EA infrastructure almost exclusively for the purpose of cultivating impactful actions (i.e. the actions they want to see).

I mean, I sure would like it if people donated to cultivate a nice community. However, I don't think I'm owed that from an explicitly EA funding pot. Why should EA-aligned donors spend cash on me and not on e.g. malaria prevention? Heck, I'm an EA-aligned donor, and I spend cash on malaria prevention that could have been spent on me.

I think it's also worth saying: one-day conferences usually require two nights in a hotel, which the attendee pays for, unless they're within day-travel range. You can therefore quite reasonably ask for a higher entry fee for a retreat, since it replaces what would otherwise be spent on a hotel.
