I am an award-winning public health consultant and researcher who is passionate about using my skillset to address modern health challenges and to improve public services and patient outcomes in the UK and abroad. I specialise in the design, implementation and scaling of public health and social impact interventions.
Early in my working life, I have established a strong track record of bringing scientific and technical insight to real-world health and social care problems at national and international levels. With experience pursuing positive outcomes across commercial, clinical, academic and philanthropic settings, I am a passionate believer in the value of cross-sector collaboration and interdisciplinary approaches when tackling intractable social and medical issues.
I hold a Master's degree in Public Health from Imperial College London (Merit) and a BA in Experimental Psychology from the University of Oxford (First Class). I am a TEDxOxford alum and have had my writing featured in Cosmopolitan, The Observer, About Time Magazine, Phoenix Magazine and Tortoise Media. I have also acted as a contributor on health and wellbeing for the BBC (Woman's Hour, BBC Five Live, BBC News) and hold the Prime Ministerial 'Point of Light' and 'South West Surrey Hero' awards for excellence in advocacy and public service. My work in social change and inclusive technological innovation has been recognised with the Alan Turing Institute's Community Award, a shortlisting for The Future Stars of Tech's Diversity Advocate Award, and nominations to join both the 50:50 'Ask Her To Stand' political development scheme and McKinsey & Company's 'Next Generation Women Leaders' group.
I am currently pursuing my DPhil in Primary Health Care at the University of Oxford as an Oxford-MRC Enterprise iCASE Award winner. My research focuses on developing a real-time vaccine benefit-risk platform capable of monitoring and differentiating seasonal vaccine effectiveness, uptake and adverse effects amongst immunocompromised patient groups - the first of its kind. My work combines routinely collected primary care medical record data (courtesy of the Oxford-Royal College of General Practitioners' Research and Surveillance Centre) with biological specimen data (virology and serology), and aims to inform vaccine scheduling and dosing for this vulnerable group.
I am looking for contacts to help support my work as a Progress Fellow at the Tony Blair Institute for Global Change. Specifically, I am looking to convene a broad tent of actors to discuss how we might change the health data landscape in the UK to be more congenial to big data linkages, advanced clinical informatics and, by extension, precision or personalised medicine. I am also trying to map out my next steps following the completion of my doctorate. I find myself at a crossroads: I have the opportunity to pursue high-impact post-doctoral research into applying best practice from nascent healthcare systems for the benefit of the NHS; however, after time spent recovering from Long COVID, I am keen to get my boots back on the ground in programmatic work. The EA community and its wraparound events offer me opportunities to connect with figures who have navigated this same crossroads successfully and to find mentors who will help me introspect and identify my most meaningful next step rather than the safest one. I am keen to learn from such individuals how to marry desk- and field-research-based approaches, and to build out my network in health-adjacent corporate philanthropy (Gates Foundation, Clinton Health Access Initiative, Rockefeller, Ford Foundation, etc.) and Big Tech to understand overlaps between their interests and my own work in precision/personalised medicine, disease surveillance and pandemic preparedness. I am also eager to learn where Big Tech aims to make its footprint in health after an arguably tentative start.
I'd be delighted to offer mentorship on how to get the best out of your early career, how to navigate chronic illness alongside working life, and strategic support on health tech solutions or programmes.
I just wanted to share this as best practice in terms of calling in naturally averse audiences to longtermist goals: https://republicen.org
I should also note that no theory of change is perfect - your point on the importance of responsiveness is well made - but it's through collaboration that we maximise the chance of controlling for individual blind spots and limitations. I'm not advocating for either/or here, just a move away from 'more of the same'.
As an aside, I certainly do not think the work and messaging of social justice is vapid, and I am unsure whether that is your own belief or something you have taken from my writing?
Really interesting criticism, Richard - and one I appreciate. I'm fresh out of the EA gate, so I am keen to be redirected when I've missed something important. My starting point here was a more general description of my initial reluctance to engage with EA and how my worries were assuaged by actually getting stuck in and seeing that what I first felt might be empty language was backed up by real sincerity. So, to be clear: I am not accusing EA of false rhetoric. I was pleasantly surprised that the key terms and principles shared weren't parroted back, as I've experienced in ESG, but were deeply resonant to those I spoke to. It was incredibly refreshing.
Instead, this post is simply my reflection on what I felt was a surprising absence of social justice advocates in this space, as I had thought it would be their natural habitat. I think that the tension between long-term thinking and immediate catastrophe is best bridged by those working at the coalface of how existential risk manifests today, so I wanted to write about the value of appealing to the SJW spirit rather than dismissing it. However, at present, many in that world view EA as disconnected from the very real suffering and emergency of the here and now, and lodge 'little less conversation, little more action' criticisms, which aren't justified either. Overall, we need to do more to explain EA principles in a way that appeals to this important pressure group; it is, in my mind at least, to our detriment that these different factions of the altruistically minded cannot seem to find common ground to work from.
There is some overlap in the New Yorker coverage: https://www.newyorker.com/magazine/2022/08/15/the-reluctant-prophet-of-effective-altruism
But thank you for your feedback. It's really valuable to me as I try to accelerate my learning.
I can't agree with this post more. I raised concerns about the number of times I heard those placed into leadership positions of new EA spin-offs/projects relaying that they felt they didn't have sufficient oversight and reporting mechanisms in place to get the best out of these programmes, or even out of their own talents. Idealistic as we all are, big bucks need big governance - it's as simple as that.
So let's get scientific about all this. It is now absolutely essential that EA makes the a priori publication of impact metrics, programmatic design and oversight mechanisms standard practice across all affiliated activities. It is too easy to get creative with this accounting if it is done retrospectively: reach gets inflated and declared financial waste gets minimised or swept under rugs. If we want the public to ever trust us with their funds again, we have to show all our workings and be open about our failings. The only way to clean up the reputational damage of our FTX affiliation is a concerted effort around external and objective oversight/audit (annual charity commission reporting is not sufficient) and layered internal governance via boards and committees and, yes, I'm afraid, a metric tonne of bureaucratic paperwork.
EA now has brand value and must act accordingly: it cannot shirk responsibility in the way that 'an ideology', or even an accelerator, perhaps can. Any negative association is now our liability and our responsibility. A single bad apple ruins the brand-equity barrel - not least when the public bloodlust for all things EA is considered. Do we really want to be the makings of just another bad Netflix documentary?
I have worked in and alongside global programmes via corporate philanthropy, international relations and political parties. I have helped to manage the oversight of programmes ranging from £50k to £50m in value, and have seen how paperwork and wraparound scrutiny mechanisms are supposed to increase exponentially with each added zero. This is the only way of safeguarding beneficiaries and behaving ethically: failing to step up to this bureaucratic burden risks the poor programmatic design or fiscal mismanagement that does unthinkable harm to those who come to depend upon the hope, and the means to social mobility, that these interventions purport to provide.
I would be delighted to leverage this experience and help EA in its current crisis. It must be stated that I was glad to see that when I did raise these initial concerns with EA leadership, they were responded to positively and I was immediately invited to meet with the governance team. That said, my concerns still stand, so the offer does too :)
Let's get our house in order folks.