While I'm currently working as a Tea Slinger, I would generally describe myself as a modern-day renaissance man: I've slaughtered pigs on my family farm and later become a vegan, done HVAC work and academic research, and been a member of both the Republican and Democratic clubs at my university. I've sought to experience much in life, and on the horizon I wish to deepen my involvement in EA: I've gone from fellow to facilitator to prospective employee, looking for work in the space while doing personal cause prioritization work on the side. I'll include below a list of some of my deep interests, and would be happy to connect over any of these areas, especially where they intersect with EA.
Philosophy, Psychology, Music (have deep interests in all genres but especially electronic and indie), Politics (especially American), Drugs (mostly psychs), Gaming (mostly League these days), Cooking (have been a head chef), Photography, Meditation (specifically mindfulness).
I'm in the process right now of deciding which cause area to focus my future work on (general longtermism research, EA community building, nuclear, AI governance, and mental health), so any compelling reasons to go (or not to go) into any of these would be really helpful at this point.
While I can't really offer any expertise in EA-related things, I have deep knowledge of Philosophy, Psychology, and Meditation, and can potentially help with questions related to those disciplines. The best thing I can offer is a strong desire to dive deeper into EA, preferably with others who are also interested.
You could update the EA CoLabs link (under "Adding to and/or improving options...") to point to their website (Impact Colabs), which I think is a more functional replacement.
I swear I'm not usually the one to call for numbers, but I'm compelled to in this case. A common feeling I have when reading this claim and claims like it is wanting to know: quantitatively, what sort of difference did you make? I don't expect anything beyond really rough numbers, but let me give you an example of something I'd love to read here:
"The 21-22 California budget is 262.5 billion dollars. The organization I work in takes precedence over X billion dollars. My role covers Y% of that budget, and I expect that had another non-EA aligned person been in my role the allocation would have been Z% less effective."
If you had to guess, what do you think your X, Y, and Z are? I know Z is tricky, but if you explain your rough reasoning, I think that'd help. I also know you mention many things that would be hard to quantify directly (e.g., changing minds toward being more data-oriented in a meeting), but I think you can roughly quantify those by estimating what percentage of X became more effective over the course of your influence, and what share (call it W%) of that change you were responsible for.
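For concreteness, here's a minimal sketch of the arithmetic I have in mind, with purely hypothetical placeholder values for X, Y, and Z (they're made up just to illustrate the multiplication, not estimates of your actual impact):

```python
# Back-of-envelope counterfactual impact estimate.
# All values below are hypothetical placeholders, not real estimates.

X = 10.0   # billions of dollars the organization oversees
Y = 0.05   # fraction of X that the role covers (Y%)
Z = 0.10   # fraction by which the allocation would've been less
           # effective under a non-EA-aligned hire (Z%)

impact_billions = X * Y * Z
print(f"Rough counterfactual impact: ${impact_billions:.3f}B "
      f"(~${impact_billions * 1000:.0f}M)")
```

With these placeholder numbers, that comes out to roughly $50M of counterfactual value, which is exactly the kind of rough figure I'd love to see.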
Ah cool, thanks. I'd suggest including that at the top of the post for others who may be interested.