I currently believe the most impactful marginal funding opportunities are those focused on improving the welfare of highly numerous but neglected classes of animals (especially wild animals, shrimp, and other invertebrates). As a longtermist, my own work focuses on existential risk reduction, but my sense is that the key existential risk-related causes (e.g. AI safety, biosecurity) are relatively well-funded compared to the highest-priority animal welfare causes, so I choose to donate to the latter.
As a caveat, Wikipedia editing has some nuances around following community standards, which I've tried to lay out in my post. In particular, before investing a lot of time writing a new article, you should check whether someone else has already attempted it and/or whether the same content is already covered elsewhere. For example, there have been previous unsuccessful efforts to create an 'Existential risk' Wikipedia article; those attempts failed in part because the relevant content is already covered in the 'Global catastrophic risks' article.
I broadly agree with this and have also previously made a case for Wikipedia editing on the Forum: https://forum.effectivealtruism.org/posts/FebKgHaAymjiETvXd/wikipedia-editing-is-important-tractable-and-neglected
One other relevant resource I'd recommend is Will and Toby's joint keynote speech at the 2016 EA Global conference in San Francisco. It discusses some of the history of EA (focusing on the Oxford community in particular) and some historical precursors: https://youtu.be/VH2LhSod1M4
Going very broad, I'd recommend going through the EA Forum Topics Wiki and considering the concepts included there. Similarly, you could go through the posts that make up the EA Handbook and search for suitable concepts there.
I joined Coefficient Giving a few weeks ago as Chief of Staff on the GCR executive team, and I've updated strongly upward on how much counterfactual impact each new hire here can have. The GCR team is stretched really thin relative to the problems we're aiming to solve, and some of the opportunities we can't currently pursue feel genuinely painful to leave on the table. I'd strongly encourage people to apply: in my view, these are some of the most exciting roles in the ecosystem right now, and CG is also just a very fun place to work, with deeply caring, smart colleagues and a great internal culture.
One signal-boost: my own team is hiring a senior generalist as part of this round. Most of the round consists of grantmaker roles, but if you have a strong generalist skill set, please apply to our team!