Q & A | Ron Deibert revisits his 2020 Massey Lectures with tech experts
*Originally published on May 28, 2021.
In its brief history, the internet has spiraled out of our control, been taken over by malign forces and ultimately been turned against us, the users, according to technology and security expert Ron Deibert.
"Information and communications technologies are, in theory, supposed to help us reason more effectively, facilitate productive dialogue and share ideas for a better future. They're not supposed to contribute to our collective demise," Deibert said in his CBC 2020 Massey Lectures, Reset: Reclaiming the Internet for Civil Society.
In his lectures, which are also available in book form from House of Anansi Press, Deibert exposes the disturbing influence and impact of the internet and social media on politics, the economy, the environment and humanity.
He calls for a reset, a chance to slow down and imagine an alternative, "and then begin the process of actually bringing that alternative about."
The original broadcasts of Deibert's 2020 Massey Lectures included colleagues who also study and write about the ways in which the internet and social media are being used.
Meredith Whittaker, co-founder of the AI Now Institute at New York University; John Naughton, senior research fellow in the Center for Research in the Arts, Social Sciences and the Humanities at Cambridge University; and Tamsin Shaw, professor of philosophy at New York University, joined Ron Deibert and Nahlah Ayed to revisit the 2020 CBC Massey Lectures and answer some questions sent in by our listeners.
Here are excerpts from their conversation.
Nahlah Ayed: Ron, you start the Massey Lectures by quoting Martin Luther King Jr.: "The arc of the moral universe is long, but it bends towards justice." But then you go on to say: "Looking around at the climate crisis, deadly diseases, species extinction, virulent nationalism, systemic racism, audacious kleptocracy and extreme inequality, it's really hard to share his optimism these days. It feels more like everything's all imploding instead." So you're not just referring to issues around social media and the internet, are you?
Ron Deibert: No, I'm most definitely not. And I think it's hard not to share that perspective these days, especially given our situation with the pandemic and prior to that, some of the issues around police brutality and racism.
Part of it also reflects my own profession and my day-to-day job as director of the Citizen Lab, where our mission is to uncover abuses of power. I'm kind of immersed in it all the time and maybe that biases the way I look at things. But putting aside the pessimism for a second, I was recently at a workshop and I was with a group of people who are mostly looking at issues around biases and discrimination with respect to artificial intelligence. And it was really the first time I felt like there's a movement happening.
Nahlah Ayed: Here's a listener question that we received stemming from the lectures. It's a very short question, but gets to the point: "This has happened so quickly. My parents' generation were fierce about protecting their privacy. Why are we so different now?"
Tamsin Shaw: There is an awful lot of convenience that can be derived from predictive algorithms. And if you are an executive at Facebook or Google, you'd probably say people like to have targeted ads. They don't want to be shown a bunch of stuff that they have no interest in. They want to be targeted with products that they genuinely might want to buy.
And a few years ago, you would have people say, "I don't mind my information being collected because I have nothing to hide," because it would seem as though you would just be giving away data about your Facebook friends or about what you buy at the supermarket. And people weren't really thinking about the way that data could be aggregated and the harms that might come from that ... I think it's become a much, much more serious issue for people.
Meredith Whittaker: We were sold a bunch of technology, a bunch of devices that were meant to connect us. Privacy is relational. Private from whom? At what time? It is contextual. So I want to be really careful about individualizing a problem that is clearly structural and it clearly has more to do with the way in which these large corporations are able to effectively make up data about us that may or may not reveal our private interiority, and then make assumptions based on that data that are shaping our lives in ways that have real material implications — and replicate structural inequality, racial inequity.
John Naughton: I also think that there's a problem with the narrative. I mean, in the sense that most of the public and policy narratives about the problems we now face are entirely deterministic. They basically say technology drives history and the role of society is the same role as the guy who used to walk behind the elephant in the British Raj. In other words, the role of society is to pick up the pieces from this rampage of creative destruction. And what astonishes me time and time again is why apparently normal people believe this. Why do people accept it?
Democracies appear to have lost the capacity to say, actually, you can't do that. We don't allow people to have free trade in plutonium. Well, we shouldn't allow people to, for example, develop and deploy live facial recognition systems. It's simply incompatible with a free liberal society that you do things like that. ... You can't just have a discovery in a lab that looks as if it's very promising in dealing with Alzheimer's, and four days later AstraZeneca releases the product. That doesn't happen. And yet we allow it to happen in relation to this [tech] industry, and so I think that one of the tasks that those of us who are really concerned about this have is somehow challenging that narrative.
Nahlah Ayed: So, Tamsin, many of us, if not most of us, are kind of careless with social media because we don't think that our personal information is really worth that much. Is it worth that much?
Tamsin Shaw: Our personal information is worth a great deal to a lot of people. And of course, it depends on what kind of information it is. I think the gold standard of predictive data is probably DNA, and people have been increasingly giving away their DNA information to companies like 23andMe without knowing anything about the ownership of those companies. And a lot of that information is going to end up in the hands of governments.
As Ron says, there is this dark side to what is being done with it. And I don't think that dark side is a side effect. These businesses don't really fit the model of neoliberalism in the sense of free market economics. They were all established and still protected and supported with government funds because you can't have this kind of tech innovation without massive government support.
Nahlah Ayed: So given all of that, John, if the government has more to gain by kind of mirroring what the corporate sector is doing and less by defending the average citizen's interests or rights, can we trust governments to have our best interests at heart?
John Naughton: No, in short. For example, I gather that many aspects of the federal government's computing operations, including some of the security agencies, are actually run on Amazon Cloud. There's this famous quote from somebody who said, "If Marxist revolutionaries took over the United States and arrived at the White House, what they'd have to do is simply nationalize Amazon and then say 'job done.'"
I think one of the big questions actually is: can democracies now actually control this? My answer is: I'm not sure that they could... just at this minute, it's very hard to see how we crack this particular problem. In the end, it comes down to political will. It comes down to whether or not we have democracies with administrations which realize that this is an existential threat to the democracy we cherish.
Ron Deibert: We live now more than ever in a world saturated with digital technologies. And yet, perversely, we also live in a time when we're actively discouraged by laws and other means from opening up that technology, from understanding what's going on beneath the surface — and that prohibition, if you will, is symptomatic of a whole bunch of other problems. If we corrected that, beginning with educating young people not in media literacy but in actually taking things apart and questioning authority, we might be in a better place 10 or 15 years from now.
Guests in this episode:
Ron Deibert is the founder and director of Citizen Lab, a research centre based at the University of Toronto, which studies technology, surveillance and censorship.
Meredith Whittaker is the co-founder of the AI Now Institute at New York University.
John Naughton is a senior research fellow in the Center for Research in the Arts, Social Sciences and the Humanities at Cambridge University.
Tamsin Shaw is professor of philosophy at New York University.
* This Q&A was edited for clarity and length. The episode Reset Revisited was produced by Philip Coulter.