Is Facebook using all your 'feels' to sell you stuff?
Facebook denies it, but report raises alarm about potential impact on youth
Online advertisers can already target you based on your age, gender, interests and location. Now, leaked documents suggest Facebook is working on ways to show advertisements based on your emotions and feelings.
What is Facebook doing?
According to a report from The Australian, Facebook has the ability to target its users based on how they're feeling. In particular, the report focuses on how this technology could be used to target young people — teenagers as young as 14 — categorizing them as feeling stressed, defeated, overwhelmed, anxious, nervous, stupid, silly, useless and a failure. One headline characterized Facebook as trying to "target emotionally vulnerable and insecure youth."
For its part, Facebook has called The Australian's story misleading, saying "Facebook does not offer tools to target people based on their emotional state." The company went on to say that the documents obtained by The Australian were research, "intended to help marketers understand how people express themselves on Facebook."
Regardless, this story points to a much broader trend: so-called "emotion detection" technology. A huge amount of time and money is being spent trying to teach computers to track and understand human emotions, and that has consequences for all of us.
How can Facebook tell how I'm feeling?
We don't know precisely how Facebook's systems do this, but to get a better sense of how they're likely doing it, I called up Luke Stark. He's a Canadian researcher at Dartmouth College who studies social media and emotion. He told me Facebook uses "sentiment analysis" techniques to analyze the words people use in their posts and assigns those words an emotional score.
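We don't know Facebook's internals, but the general technique Stark describes can be sketched in a few lines. The word scores below are invented for illustration; real systems rely on large, empirically derived lexicons and far more sophisticated models.

```python
# A minimal sketch of lexicon-based sentiment analysis: each known word
# carries an emotional score, and a post's score is the average of the
# scores of the lexicon words it contains. The lexicon here is a tiny,
# made-up example, not Facebook's actual word list.
EMOTION_LEXICON = {
    "stressed": -0.8, "overwhelmed": -0.9, "anxious": -0.7,
    "useless": -0.9, "happy": 0.8, "excited": 0.7, "great": 0.6,
}

def sentiment_score(post: str) -> float:
    """Average the scores of any lexicon words found in the post."""
    words = post.lower().split()
    scores = [EMOTION_LEXICON[w] for w in words if w in EMOTION_LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(sentiment_score("feeling stressed and overwhelmed today"))  # negative
print(sentiment_score("so happy and excited"))                    # positive
```

Production systems go well beyond word counting, handling negation, context and emoji, but the basic idea is the same: map language to an emotional score.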
But Stark stresses the social network also has other means of tracking us: "The other big way that Facebook is doing this is through the recent reaction buttons. So on a Facebook post there is a love button, a sad button, an angry button that lets users tag those posts with basic emotional states. So I think that's also an important element of how Facebook is actually getting users to track their own emotional states."
It's not just Facebook. Many other social media sites perform the same kinds of sentiment analysis and emotion detection, to varying degrees.
Facebook has allegedly been tracking the emotions of teenagers as young as 14. How does age play into this?
There's no reason to believe this type of emotion detection technology is only being used on teenagers. It's all of us. But Stark told me a lot of the focus on this story has been on the potential dangers to youth — the worry that vulnerable children might be manipulated by an algorithm. Which, he adds, is understandable.
"Some of that anxiety is appropriate and concerning, but in a way might obscure the bigger issue which is that this technology is being used against all of us," he says. "And as much as we'd like to believe we are more savvy than our 14-year-olds, I think in the case of social media, we often aren't."
Beyond advertising, how else can emotion detection technology be used?
It's easy to focus on the darker, more nefarious uses for this technology: how it might be used to influence or manipulate people in their most vulnerable moments. But over the years, I've interviewed a number of people who work on emotion-detecting technology, and there are lots of positive uses for this category of technology as well.
For instance, there have been a number of projects that analyze social media for signs of depression or other mental health issues. In that sense, emotion-detection and sentiment analysis can be used as a screening tool, or an early warning system.
We've also seen quite a bit of research into using physiological signals to identify emotions, such as facial recognition that looks for micro-expressions or audio analysis software that listens for subtle changes in the human voice that may indicate the emotional cadence of an interaction. These systems are often positioned as a kind of social prosthetic for people on the autism spectrum — or others who may have difficulty reading emotions.
What strikes me is that our response to these technologies depends a lot on how they're used. I might be fine with having my social media posts scanned for signs of depression if the goal is to flag potential health issues. But the exact same technology might creep me out if I know it's being used to show me advertisements.
What can people do to hide their true feelings from social media companies?
To be fair, you may not want to hide your emotions from social networks. Your emotional data can be used to show you more relevant posts and more useful advertisements, and that can be a benefit.
If you do want to hide your emotions, there are tools out there that can help. For instance, there's a plugin called "Go Rando" that you can install in your web browser. As the name suggests, it randomizes your Facebook reactions. The idea is if you can't stop a social network from trying to detect your emotions, you can at least try to confuse it.
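The idea behind a tool like Go Rando can be sketched simply. This is not Go Rando's actual code, just an illustration of the concept: instead of sending your genuine reaction, send a uniformly random one, so the aggregate signal carries no real emotional information.

```python
import random

# The six basic Facebook reactions a tool like Go Rando shuffles between.
REACTIONS = ["like", "love", "haha", "wow", "sad", "angry"]

def randomized_reaction(intended: str) -> str:
    """Ignore the reaction the user intended and return a random one."""
    return random.choice(REACTIONS)

# Whatever you meant to send, the network sees noise.
print(randomized_reaction("love"))
```

Over many posts, each reaction shows up about equally often, so no meaningful emotional profile emerges from your reaction history.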
But for most people, I think the important piece here is simply knowing this is happening. And, according to all the experts I've talked to, these technologies aren't going away any time soon.