I'm a researcher in statistical anomaly detection for live data streams at Lancaster University. My research is funded by the Detection of Anomalous Structure in Streaming Settings group, which is supported by a combination of industrial partners and the Engineering and Physical Sciences Research Council (ultimately the UK Government).
There's a critical research problem that is surprisingly open - if you are monitoring a noisy system for a change of state, how do you ensure that you detect any change as soon as possible, while keeping your monitoring costs as low as possible?
By "low", I really do mean low - I am interested in methods that take far less power than (for example) modern AI tools. If the computational cost of monitoring is high, the monitoring just won't get done, and then something will go wrong and cause a lot of problems before we realise and try to fix things.
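To make the "low cost" point concrete, here's a minimal sketch of a classical one-sided CUSUM change detector - one of the simplest methods in this space, not my actual research code. It uses O(1) memory and O(1) arithmetic per observation, which is the kind of footprint that makes always-on monitoring viable. The parameter values (`mu0`, `drift`, `threshold`) are purely illustrative.

```python
def make_cusum(mu0: float, drift: float, threshold: float):
    """Return an update function to call once per new observation.

    mu0: pre-change mean of the stream.
    drift: slack term (often written k) that absorbs normal noise.
    threshold: alarm level (often written h).
    """
    state = {"s": 0.0}  # cumulative evidence of an upward mean shift

    def update(x: float) -> bool:
        # Accumulate deviation above the expected level, floored at zero
        # so old "no change" evidence can't mask a new shift.
        state["s"] = max(0.0, state["s"] + (x - mu0 - drift))
        return state["s"] > threshold  # True => raise an alarm

    return update

# Toy stream: mean 0 for three observations, then shifts to 2.
detect = make_cusum(mu0=0.0, drift=0.5, threshold=5.0)
stream = [0.1, -0.2, 0.3] + [2.0] * 6
alarms = [detect(x) for x in stream]  # first alarm a few steps after the shift
```

The appeal is exactly the cost profile: each observation triggers one subtraction, one addition, one `max`, and one comparison, so the detector can run continuously on very modest hardware.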
This has applications in a lot of areas and is valued by a lot of people. I work with a large number of industrial, scientific and government partners.
Improving the underlying mathematical tooling behind figuring out when complex systems start to show problems reduces existential risk. If for some reason we all die, it'll be because something somewhere started going very wrong and we didn't do anything about it in time. If my research has anything to say about it, "the monitoring system cost us too much power so we turned it off" won't be on the list of reasons why that happened.
I also donate to effective global health and development interventions and support growth of the effective giving movement. I believe that a better world is eminently possible, free from things like lead pollution and neglected tropical diseases, and that everyone should be doing at least something to try to genuinely build a better world.
I agree. EA has a cost-effectiveness problem that conflicts with its attempts at truth-seeking. EA's main driving force is cost-effectiveness above all else - even above truth itself.
I really don't know how you'd fix this. I don't think research into catastrophic risks should be conducted on a shoestring budget by a pseudoreligion/citizen-science community. I think it should be government funded and should probably sit within the wider defence and security portfolio.
However I'll give EA some grace for essentially being a citizen science community, for the same reason I don't waste effort grumping about the statistical errors made by participants in the Big Garden Birdwatch.
Due to popular demand, we're giving people the option to attend the full gathering in person, to attend the gathering's organised sessions online (suitable for attendees from other countries for whom travel is burdensome), or to attend day sessions only (where no accommodation is booked for you - suitable for locals or EA Hotel grantees). There's now a tick box on the form.
If you signed up before this was available, you're assumed to be attending the full gathering unless you've told me otherwise.
Looking forward to the event - it's going to be great! There are still a few full-gathering spots left, so feel free to pass this on to anyone you think would benefit from some support for their effective giving projects.
Hello! Great questions. My answers:
I will add onto this: EA has (and individual EAs have) a vast difference in threshold between "approve of" and "support". EAs tend to not support a lot of things they approve of, because the resource pool for support is extremely limited.
This can sometimes be difficult to navigate relationally, if you are used to equating the concepts. Do not take a lack of support as indicative of a lack of approval. EAs (probably) approve of you doing good things for the world.
So I think the problem (?) is that nobody donates to EA infrastructure for the purpose of cultivating a nice community. They donate to EA infrastructure almost exclusively for the purpose of cultivating impactful actions (specifically, the actions they want to see).
I mean, I sure would like it if people donated to cultivate a nice community. However, I don't think I'm owed that from an explicitly EA funding pot. Why should EA-aligned donors spend cash on me and not on e.g. malaria prevention? Heck, I'm an EA-aligned donor, and I spend cash on malaria prevention that could have been spent on me.
Amazing work! I really appreciate everything you're doing to get more people into jobs that meaningfully improve the world.