Note: I have been involved with EA for only 2-3 months, so my ideas may not be accurate.
One approach is to target people involved in social issues who already believe in some of the more popular EA concepts.
Climate Change
Out of all the EA priority areas, climate change is arguably the most popular one (among non-EAs/general population).
Quite a few people (among non-EAs) work on climate change because they think it is the most pressing problem. They already believe in some of the more well-known EA concepts, such as:
1. Using a supply/demand framing to choose a social issue (i.e. neglectedness).
2. The idea that climate change needs to be fixed urgently, while other social issues can also be solved later (similar to the idea of existential risk).
On the whole, more involved groups appear to prioritise Global Poverty and Climate Change less and longtermist causes more. ~ EA 2019 Survey
However, I must point out that Global Poverty is ranked as the most popular EA cause area, followed by Climate Change. I suspect this is because many people in the EA movement joined recently, and it takes some time to absorb EA's ideas on cause prioritization.
Similarly, it may be efficient to target people who work on nuclear security: they share EA's ideas about existential risk, and the view that sudden catastrophes matter more than catastrophes that build up over time.
Essentially, we are looking for people who are working on a particular social cause for well-reasoned reasons. This greatly increases their chance of being a good fit with EA, since this approach captures both the "effective" and the "altruism" aspects of the movement.
Hi Prabhat!
First things first, I'm also relatively new to EA (approximately 8 months) and I think that it's of great value to take into consideration the ideas of new community members who still have a kind of 'outsider view' on things.
By and large, I agree, and I have actually started working on strategies to target people who are involved in relevant cause areas or might be more open to EA's concept of expanding the moral circle.
There are a few assumptions that can serve as the basis for building this strategy:
Having said that, I think we should be careful with popular causes like climate change and animal welfare. The reason is that a considerable number of the people who support these causes do so for reasons that are not compatible with EA, have no real reasoning behind their views, or are even aggressive towards people who think differently.
This is completely anecdotal, but yesterday, when I mapped relevant Facebook communities, I noticed that some groups explicitly state that they shame meat-eaters, or are conspiracy-based.