The latest Facebook scandal is no surprise
This past week, the big tech news was all about Facebook, and the user data accessed by Cambridge Analytica. To sum it up, about 270,000 FB users took a personality test via an app.
This was back in 2014. At the time (but no longer) Facebook's policies allowed app developers to access not only app user data, but the data of their FB friends, resulting in about 50 million profiles accessed.
This data was then acquired -- against Facebook's policy -- by Cambridge Analytica, a company that provided services to the Trump presidential campaign. The company has denied, however, that it used the data in its work for the campaign.
You can follow the ins and outs of the evolving story pretty much anywhere, because there's been wall-to-wall media coverage.
Here at Spark, though, we were a little surprised — by the surprise.
But maybe that's because we talk a lot about the use and abuse of data on Spark. We cover it partly because anyone who is online contributes lots more data about what we like, where we go, how we shop, than they used to. But also because how that data can be used has changed, thanks to Big Data analytics and artificial intelligence algorithms.
There's a reason they call data "the new oil" after all.
Related links:
- You give away private data just by opening your email
- Exercise app shows why anonymous data can still be dangerous
- Getting emotional about online privacy
So, in the wake of everything that's been going on, we thought we'd check in with Zeynep Tufekci. She's an associate professor at the School of Information and Library Science at the University of North Carolina. She's been a guest on the show a number of times, and she's been talking for years about the consequences of a digital economy based on data harvesting. She wrote about it for the NY Times this past week, and Nora spoke with her again.
Here's part of their conversation.
NY: You argue that the events that happened are, quote, "a consequence of Facebook's business model." What do you mean by that?
ZT: What I mean is as long as we are collecting this much data, and as long as the business model is accumulation of data on people in order to profile them so that their attention can be sold to advertisers, these things are bound to happen.
The way the platforms operate also means that they have an incentive to keep us on the site as long as possible, so that they can show their advertisements to us. And that creates perverse incentives.
You don't see the posts from your friends and pages you follow in chronological order -- there is an algorithm that sorts that out. They show us things that keep us on the site. And in my experience that leans towards either the cute baby pictures -- things that make us go "aww" -- or things that anger us.
NY: What about the argument that you're essentially exchanging your data as a way of paying for a service that is otherwise free?
ZT: This kind of consent model assumes there is informed consent -- but it's not like people are being informed on the full implications of what this data can do and what it can do in the future.
Once you turn this data over, the decision is practically irrevocable given the amount of uncertainty. I think it's just not consensual in any meaningful sense, as someone who has been pointing out these issues literally for years.
More related links: