Hi, EAs! I'm Ed Mathieu, manager of a team of data scientists and researchers at Our World in Data (OWID), an online publication founded by Max Roser and based out of the University of Oxford.
We aim to make the data and research on the world's largest problems accessible and understandable. You can learn more about our mission on our site.
You’re welcome to ask me anything! I’ll start answering questions on Friday, 23 June.
- Feel free to ask anything you may want to know about our mission, work, articles, charts, or more meta-aspects like our team structure, the history of OWID, etc.
- Please post your questions as comments on this post. The earlier you share your questions, the higher the chances they'll reach the top!
- Please upvote questions you'd most like answered.
- I'll answer questions on Friday, 23 June. Questions posted after that are less likely to get answers.
- (This is an “AMA” — you can explore others here.)
I joined OWID in 2020 and spent the first couple of years leading our work on the COVID-19 pandemic. Since then, my role has expanded to coordinating all the research & data work on our site.
I previously worked as a data scientist at the University of Oxford in the departments of Population Health and Primary Care Health Sciences; and as a data science consultant in the private sector.
For a (3.5-hour!) overview of my background, and the work of our team at OWID, you can listen to my interview with Fin Moorhouse and Luca Righetti on Hear This Idea. I also gave a talk at EA Global: London 22.
Hey Ollie – thanks for the question!
I've engaged with a few activist and political communities in the past, primarily around environmental issues and Green politics. My overall take is that I would find it hard to be part of those communities today, compared to the ones that interest me now. From what I remember, their epistemic practices tended to be very poor, with lots of motivated reasoning, cherry-picking, various biases, etc. That doesn't necessarily mean the people I met were wrong, but in retrospect, how they made up their minds about issues seems very flawed. Against that backdrop, epistemic quality appears to be Effective Altruism's main competitive advantage over the other communities I've encountered. Many people in the community are genuinely cause-neutral and truly adopt (or at least try to adopt) a scout mindset.
If anything seems better about these communities, it's the fact that their direct engagement with politics, the media, etc., makes them much more aware of the importance of public relations and not being perceived as bad actors. My perception – reinforced by everything that happened in EA in late 2022 – is that many EAs see public relations as unnecessary (sometimes even bad, when "PR" is used as a derogatory term). I've met quite a few people who seem to think that the way non-EA people perceive EA doesn't matter at all, as long as EA people are saying things that are evidence-based and smart. I believe this is deeply wrong; a community of smart and "very-right" people won't have much impact if it has such a bad image that no one dares involve it in public discussions.
Interestingly, in the case of EA, this dismissive attitude toward image sometimes applies to individuals as well. Both online and at EAG, I've met more people than I expected who seemed to disregard the benefits of social norms, politeness, kindness, etc., and who behaved in a way that seemed to say "I'm too smart to be slowed down by these stupid things". (To be clear, I don't think the majority of EAs are like this at all; but the prevalence of this behavior seems much higher than in the general population.)
Another thing that comes to mind, valued by people outside EA but shrugged off by people inside EA, is institutional stability. Having worked with or collaborated with quite a few different companies, political parties, research organizations, NGOs, etc., I think there is genuine value in building institutions on solid foundations. For EA organizations, this relates to many questions people have raised since the FTX debacle: who should run EA organizations? What should their boards look like? What share of board members should be EAs? What share of board members can overlap between closely connected EA organizations? Many EAs have shrugged off these questions as boring, but the long-term stability of the overall EA community depends on them.
Funding runway also falls under that category: many EAs reason about funding stability as if every skilled person were happy to work at an organization that could run out of money in less than a year. Again, I don't think this is a good way of planning for the long term. A recent post describing NTI as "too rich" for holding more than 1.5 years of expenditure is one example of this bad habit.