Kestrel🔸

1012 karma · Joined · Working (0-5 years) · Lancaster, UK

Bio

I work as a researcher in statistical anomaly detection in live data streams at Lancaster University. My research is funded by the Detection of Anomalous Structure in Streaming Settings group, which is supported by a combination of industrial funding and the Engineering and Physical Sciences Research Council (ultimately the UK Government).

There's a critical research problem here that's surprisingly open: if you are monitoring a noisy system for a change of state, how do you make sure you find any change as soon as possible, while keeping your monitoring costs as low as possible?

By "low", I really do mean low - I am interested in methods that take far less power than (for example) modern AI tools. If the computational cost of monitoring is high, the monitoring just won't get done, and then something will go wrong and cause a lot of problems before we realise and try to fix things.

This has applications in a lot of areas and is valued by a lot of people: I work with a large number of industrial, scientific and government partners.

Improving the underlying mathematical tooling behind figuring out when complex systems start to show problems reduces existential risk. If for some reason we all die, it'll be because something somewhere started going very wrong and we didn't do anything about it in time. If my research has anything to say about it, "the monitoring system cost us too much power so we turned it off" won't be on the list of reasons why that happened.

I also donate to effective global health and development interventions and support growth of the effective giving movement. I believe that a better world is eminently possible, free from things like lead pollution and neglected tropical diseases, and that everyone should be doing at least something to try to genuinely build a better world.

Comments (133)

There will definitely be things! I will keep you updated :)

Oh, and while I'm here: amazing work getting your EAG "cost per attendee" numbers down without sacrificing quality. It's great work, and will really help EA scale!

I'd just like to say, as a (volunteer) community-builder, thanks for having a "fundraising at the ecosystem level" strategy that involves making effective giving a more visible and accepted part of the movement. There were years when it was considered a totally second-rate, outdated thing to do, and merely mentioning that you did it at an EA coffee invited accusations that you weren't maximising enough. I'm hoping that we're over that now.

It's not just "overdependence on a single funder" that's the issue, it's the fact that people who give effectively bring huge social and professional networking benefits to EA group meetups that aren't immediately obvious.

For example, the underlying mental focus on evaluation, impartiality and cost-effectiveness brought by an effective giver (as opposed to the "here's my project proposal" pitch framing often brought by EA workers or jobseekers) is an important part of the social ecosystem of conversations that happen within a group meetup. Without it, you end up with a bunch of people who want to come up with project proposals and nobody who wants to sit down and evaluate them!

Also, effective givers who are established in a career are really valuable sources of non-transactional mentorship and networking-type conversations for EA's younger attendees. They're people who give good advice and access to networks while genuinely not wanting anything personally in return (other than for the mentee to do well at improving the world). That's really valuable to have around.

As this person seems very worried about counterfactuals, I should probably point out that the All Grants Fund does still make substantial grants to the Top Charities, because GiveWell doesn't find enough granting opportunities that are reliably estimated as more effective than a top charity. So, on the margin, your donations are equivalent.

This may change in future - GiveWell are investigating lots more scalable grants in things like water treatment and humanitarian contexts.

There's a lot of different stuff here:

  • Learned helplessness with pledge donations is real. Pledge burnout is real. Your feelings are legit and please do talk to people about them. This is a common feeling amongst EA people who focus on global health and development.
  • Do just fund your own weekly meetups using a pledge waiver if you want to. That's ok. Supporting your costs as a volunteer is a fine use of effective money (especially if it helps prevent you from burning out and thereby makes your contribution as a volunteer and even as a donor more sustainable).
  • There is absolutely nothing in any way morally wrong with giving your pledge money to the Against Malaria Foundation (or another GiveWell top charity, or anything GiveWell-y). Malaria is neglected. AMF has about a £300 million funding gap. Easily preventable childhood death as a whole has about a £20 billion a year funding gap. That gap is real. You giving cash there doesn't put it in Dustin's pocket. Coefficient Giving's (CG's) donation amounts are basically unaffected by how much other money is in the space. We've seen this with all the stuff about USAID (billions of $) causing basically zero strategic shift in CG's granting (though a lot of strategic shifts in GiveWell's granting as the All Grants Fund tries to cover opened-up areas).
  • If you really want to save lives meaningfully more effectively than a GiveWell donation can, and don't want to substitute for CG money in any way, consider donating to the operating costs of global health and development effective fundraising organisations that Coefficient Giving doesn't fund at the moment. An example might be One for the World https://1fortheworld.org/our-team - there are others. There is a legitimate gap in the effective giving portfolio between the donation multiplier an organisation has to achieve before CG will fund it (which I think is currently about 6x? I believe it's going up as they double down on only funding the ones with the highest multipliers) and the multiplier at which paying staff to fundraise becomes a more effective way of saving lives than donating to GiveWell directly (which is about 1.1x) - see the worked example just after this list. If these kinds of organisations have more money, they legitimately use it to employ more people to run more stuff to raise even more money for GiveWell charities.
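To make the arithmetic concrete (made-up illustrative numbers, not the figures of any particular organisation): suppose a fundraising organisation spends £100,000 a year on staff and operations, and that work moves £300,000 of new money to GiveWell top charities that wouldn't otherwise have been given. Its multiplier is 3x. That is well above the roughly 1.1x point at which paying the fundraiser saves more lives per pound than donating the same pound directly, but well below the roughly 6x bar I think CG currently requires - so it sits squarely in the gap I'm describing.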

I find that many EAs don't know about that last point. This gap exists for somewhat reputational reasons: it's seen as a little bit gauche for a large philanthropist to donate to a fundraising organisation (rather than to the thing the fundraising organisation is fundraising for), and more so for things with lower multipliers. CG will only take that mild reputational hit for things with a high enough multiplier, because CG is watching its reputation carefully. If you personally do not care one bit about the possible reputational consequences of being seen to be paying a fundraiser, just about how many lives you can save with your cash, then it's a great donation area to choose.

It's a bit like paying your own direct work costs, but divorced from yourself as the person running the things.

Some more details here: https://coefficientgiving.org/research/reflecting-on-our-recent-effective-giving-rfp/ 

Amazing work! I really appreciate everything you're doing to get more people into jobs that meaningfully improve the world.

I agree. EA has a cost-effectiveness problem that conflicts with its truth-seeking attempts. EA's main driving force is cost-effectiveness, above all else - even above truth itself.

  • EA is highly incentivised to create and spread apocalyptic doom narratives. This is because apocalyptic doom narratives are good at recruiting people to EA's "let's work to decrease the probability of apocalyptic doom (because that has lots of expected value given future population projections)" cause area. And funding-wise, EA community funding (at least in the UK) is pretty much entirely about trying to make more people work in these areas.
  • EA is also populated by the kinds of people who respond to apocalyptic doom narratives, for the basic reason that if they didn't they wouldn't have ended up in EA. So stuff that promotes these narratives does well in EA's attention economy.
  • EA just doesn't have anywhere near as much £$€ to spend as academia does. It's also very interested in doing stuff, and willing to tolerate errors as long as the stuff gets done. Therefore, its academic standards are far lower.

I really don't know how you'd fix this. I don't think research into catastrophic risks should be conducted on a shoestring budget by a pseudoreligion/citizen-science community. I think it should be government funded, and should probably sit within the wider defence and security portfolio.

However I'll give EA some grace for essentially being a citizen science community, for the same reason I don't waste effort grumping about the statistical errors made by participants in the Big Garden Birdwatch.

Due to popular demand, we're giving people the option to attend the full gathering in person, to attend the gathering's organised sessions online (suitable for attendees from other countries for whom travel is burdensome), or to attend day sessions only (where no accommodation is booked for you; suitable for locals or EA Hotel grantees). There's now a tick box on the form.

If you signed up before this was available, you're assumed to be attending the full gathering unless you've told me otherwise.

Looking forward to the event - it's going to be great! There are still a few full-gathering spots left, so feel free to send this to anyone who you think would benefit from some support for their effective giving projects.

I'm a charity trustee (of a much smaller charity).

I would say: go for it! Try to learn a lot from your experience. It's a huge development opportunity for you.
