Our post yesterday provided an overview of the recent Facebook study on emotional contagion and summarized the various responses to it. The wider issues are important to address, and I recommend reading yesterday’s post to see some different, and conflicting (but thoughtful), perspectives on them. In this post, I talk about some of the conversations we’ve had at People Pattern and give our take on the study. At People Pattern, we place a high value on experimentation and on accurate, data-driven human insights. I focus on our perspective of what the study actually shows and how its findings have been presented to the public.
The academic value of the Facebook study, the authors say, is that the experiment shows us something new about emotions, and it’s the fear that they manipulated individuals’ emotions to show this that makes it so outrageous. But really this study isn’t so much about emotions as it is about counting things — in this case words.
The ethical gray zone: the Facebook study is just A/B testing and in fact just counts words.
If you use the internet like I do — online shopping, searching, reading ad-sponsored news or using ad-sponsored apps, etc. — you’re sometimes, if not frequently, the subject of tiny experiments called A/B tests that are functionally identical to the Facebook experiment. For example, in an effort to make advertising more effective, ads on websites are frequently placed in different locations or shown at different sizes to see which get the best click-through rates. Similar things happen on the results pages of search engines and in the product suggestions of online retailers.
A ton of the web content you see is being manipulated in completely innocuous ways so that content providers can count things: how often you click on certain links, how many products you buy in certain categories, which arrangements of search results lead you to the pages you’re searching for. This is a good thing, by the way; it makes it possible to connect us to the things we actually want in better and faster ways.
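The A/B tests described above boil down to random assignment plus counting. Here’s a minimal sketch; the function names, the 50/50 split, and the deterministic hashing scheme are illustrative assumptions, not any particular provider’s implementation.

```python
import random

def assign_variant(user_id, seed=42):
    """Deterministically assign a user to variant 'A' or 'B'.

    Seeding a private Random with a string derived from the user id
    means the same user always sees the same variant (a hypothetical
    but common approach)."""
    rng = random.Random(f"{user_id}-{seed}")
    return "A" if rng.random() < 0.5 else "B"

def click_through_rates(events):
    """events: iterable of (user_id, clicked) pairs.

    Returns the click-through rate per variant -- the experiment is
    literally just counting clicks and impressions in each bucket."""
    counts = {"A": [0, 0], "B": [0, 0]}  # variant -> [clicks, impressions]
    for user_id, clicked in events:
        variant = assign_variant(user_id)
        counts[variant][1] += 1
        counts[variant][0] += int(clicked)
    return {v: clicks / imps for v, (clicks, imps) in counts.items() if imps}
```

A provider would then ship whichever variant wins; the point is that nothing more exotic than counting is happening.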
The Facebook study is just like this. Timelines were temporarily modified to change the overall count of positive and negative words visible in them, and Kramer, Guillory and Hancock then counted the positive and negative words in subsequent posts to understand whether there was any relationship. This certainly has something to do with counting words, but we don’t yet know for sure whether these counts actually have anything to do with emotions. This entire area of research is relatively new, and we’re still figuring out what the connection between word usage and psychological states really is.

*Full disclosure: as part of a collaboration between the University of Texas and Cornell, I have worked with Jeff Hancock on research trying to figure out questions like these.
So we can really only say that the Facebook study manipulated emotions to the extent that it changed up the number of positive and negative words people saw, and it’d be just as fair to call what happened word count contagion as it is to call it emotional contagion. And this is no less interesting. From this perspective, there’s arguably not a whole lot of difference between what happened in this study and what happens to us every time we use a search engine or buy something online — some stuff got counted and relationships were established.
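The counting at the heart of the study can be sketched in a few lines. The word lists below are tiny, made-up stand-ins for the much larger LIWC dictionaries the authors actually used; the function name and tokenization are mine.

```python
# Toy stand-ins for the positive/negative emotion lexicons (the real
# study used LIWC dictionaries with thousands of entries).
POSITIVE = {"happy", "great", "love", "wonderful", "glad"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "awful"}

def emotion_word_rates(post):
    """Return the fraction of positive and negative words in a post.

    This is the whole measurement: tokenize, look words up in a
    lexicon, and count. Whether the resulting rates reflect actual
    emotional states is a separate empirical question."""
    words = [w.strip(".,!?") for w in post.lower().split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos / len(words), neg / len(words)
```

For example, `emotion_word_rates("I love this wonderful day")` counts two positive words out of five. Everything downstream — the "contagion" finding included — is statistics over counts like these.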
To have any idea whether this study crossed any lines, we have to look at it alongside the ubiquity of data-driven, A/B-tested decision making, which hasn’t drawn anything near the level of ire that has been thrown at this study.
Linguistic framing: the wording of the paper itself made it a lightning rod for deeper worries in the public.
We suspect that the emotional response/outcry from the general public comes from the linguistic framing of the paper. The paper presents itself as investigating ‘emotional contagion’. As already mentioned, it’s about emotions only to the extent that it’s about positive and negative words. An equivalent study could have looked at pronouns and function words (this would be a natural follow-up, but I somehow suspect Facebook isn’t ever going to let that happen). The design would be the same (except for the type of words counted), and instead of being about emotional contagion, it’d be about style matching. Style matching sounds a lot less scary. Then there’s the use of the word ‘contagion’. The word contagion, here, is a term of art referring to the spread of some variable through a network represented by a graph. Though it sounds highly negative, it’s just an analogy to the way diseases spread.
If the paper had been described with a less sexy but more literal title, e.g., “Correlations in lexical frequencies by networked agents in response to modified serially ordered text streams”, I’m certain we wouldn’t be having this discussion today in this way. This would have mitigated the knee-jerk or opportunistic criticisms, while still providing an interesting jumping off point for thoughtful responses about the influence of an individual’s social network, the use of data in computational social science and the need for frameworks for the use of online data. But then again, without being noticed, it would have had a much smaller impact on academics and society. I’m not sure if that would have been better or not.
What next?
The response to the study was disproportionate to what the study actually did. Nonetheless, I still believe Facebook should use an opt-in model for performing such experiments going forward. There are various ways to do this, such as Tal Yarkoni’s suggestion that users set a preference in their profile indicating whether they are willing to be part of such studies. The advantage of this setup is that users give informed consent, but they wouldn’t know when they are participating. This helps with experimental control of the subject population, as participants won’t consciously or unconsciously adapt their behavior while the study is ongoing.
Only Facebook can do this sort of opt-in; nonetheless, others can still do an opt-in version of the study via means external to Facebook. In our third post in this series, to be posted on Tuesday, we’ll describe our Chrome extension to allow Facebook users to participate in a similar study, with their consent.