[ Question ]

Personal Data for analysing people's opinions on EA issues

by aberdeen · 1 min read · 11th Jan 2020 · 2 comments


Large datasets containing personal information (e.g. interests, online behaviour, political tendencies) are highly valuable for informing strategy. In particular, once such datasets are large enough they become very general: knowing whether someone likes football will, in combination with a lot of other data, shift our credence about whether they might support an animal welfare plan. This is valuable information about how easy or hard certain problems are to solve.
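As a toy illustration of how a single feature shifts credence, here is a one-step Bayesian update (all probabilities are hypothetical placeholders, not real data):

```python
# Toy Bayesian update: how one binary feature ("likes football")
# shifts credence that a person supports an animal welfare plan.
# All numbers are hypothetical illustrations, not real data.

prior = 0.30                 # P(supports plan)
p_feat_given_support = 0.40  # P(likes football | supports plan)
p_feat_given_not = 0.60      # P(likes football | does not support)

# Bayes' rule: P(supports | likes football)
evidence = (p_feat_given_support * prior
            + p_feat_given_not * (1 - prior))
posterior = p_feat_given_support * prior / evidence

print(round(posterior, 3))  # credence drops from 0.30 to about 0.222
```

With many such features combined, each individually weak signal can move the estimate substantially, which is the core of the profiling approach described here.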

The most publicised examples of this being put into practice were the political campaigns for Trump and for Brexit. The Vote Leave campaign is probably the one that talks most openly about it, e.g. https://youtu.be/CDbRxH9Kiy4

But in effect, the most insightful social science on the planet is done internally at Facebook, and of course they use it to inform their own agendas and strategies (no source; call it high-credence speculation if you will).

The problem, of course, is that such data cannot be shared, for legitimate privacy reasons. Hence there is no public data source detailed enough to really know, say, the UK's stance on current EA issues and how those issues interact (data may exist for individual issues, but it is very difficult to compare across them). So while EA could massively benefit from access to such data (given the right analytics tools) to inform its topics, goals and strategies, we currently have no way of getting it.

EA could gather and maintain such a dataset and develop tools for analysing it, ensuring the data itself is not publicly accessible but can still be used (perhaps even by anyone) to analyse EA-related issues. We could use the technology so far used to elect Trump et al. to better understand our own policies and issues and their impact. The data could come from various sources, and the (legal parts of the) Trump and Vote Leave campaigns could serve as inspiration.

Should we?



2 Answers

I think that it’s unnecessary to go to such great (and risky) lengths to find out what the public believes with respect to issues relevant to EAs. A well-constructed survey conducted via Mechanical Turk, for example, would (in conjunction with a technique like multilevel regression and poststratification) yield very accurate estimates of public opinion at various arbitrary levels of geographic aggregation. I’d be supportive of this and would be interested in helping to design and/or fund such a survey.
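The poststratification step of the technique mentioned above can be sketched in toy form. The demographic cells, support rates, and census shares below are hypothetical placeholders; a real MRP analysis would estimate the per-cell support rates with a multilevel regression model rather than taking them as given:

```python
# Toy poststratification: reweight per-cell survey estimates by
# census population shares to get a population-level estimate.
# All cells and numbers are hypothetical; in real MRP the per-cell
# support rates come from a fitted multilevel regression.

# Estimated support for some EA-relevant policy within each cell
cell_support = {
    ("18-34", "urban"): 0.55,
    ("18-34", "rural"): 0.40,
    ("35+",   "urban"): 0.45,
    ("35+",   "rural"): 0.30,
}

# Population share of each cell from census data (shares sum to 1)
cell_share = {
    ("18-34", "urban"): 0.25,
    ("18-34", "rural"): 0.10,
    ("35+",   "urban"): 0.35,
    ("35+",   "rural"): 0.30,
}

# Population estimate = share-weighted average of cell estimates
estimate = sum(cell_support[c] * cell_share[c] for c in cell_support)
print(round(estimate, 3))  # prints 0.425
```

The same reweighting can be restricted to the cells of any region, which is how MRP produces estimates at arbitrary levels of geographic aggregation from a single national survey.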

The negative repercussions for how EA is perceived seem absolutely enormous. Cambridge Analytica has got to be one of the most despised companies in the Western world.