Jeanette Winterson envisions a post-material future
'It's not the fault of the machine, it's us,' says Winterson.
Originally published on October 8, 2021
Our concerns over AI are not really about the technology itself, writer Jeanette Winterson says, but rather the societal problems that creep into the algorithms that power it.
Winterson is a prize-winning author of 11 novels, a memoir, children's books, short-story collections and non-fiction works. Her writing depicts possible worlds, explores our relationship to technology and makes sense of the world beyond the binary.
She has also spent more than a decade thinking about AI, humans and the point where the two meet. Her latest book, 12 Bytes: How We Got Here; Where We Might Go Next, is a collection of 12 essays on the past and future of AI.
She spoke to Spark host Nora Young about how this technology may affect how we live, love and interact with the human and non-human beings around us.
Here is part of their conversation.
The book considers the far future of possible artificial general intelligence, the sort of Commander Data idea of AI, but also the near future of technologies we see emerging today: personal assistants, the kind of Siri version 3.0, the Internet of Things, smart homes, all of which promise the world of hyper-personalization you describe in the book. What do you think the consequences of that data-rich personalization will be?
It's a very seductive concept. Young people I talk to — people around their 20s — say that there is no privacy anyway, and that my generation and people who believe in privacy are just in cloud cuckoo land. Everybody already knows everything, so we might as well trade that privacy for greater services like having your own personal assistant, who would know everything about you, but that includes where the money is and where the bodies are.
It's not just remembering your kids' birthdays and booking your favourite restaurant; it would mean that you become transparent, overfished in every sense. So you are a data set, which is then scrutinized and monetized 24/7. The idea of the Internet of Things is that everything will link up. So your fridge will order the food, your toaster will say, 'It's a no-carbs day, so no, you can't have any toast.' Your bed will monitor your sleep and maybe automatically ring your doctor and say, 'This guy shouldn't be using machinery. This lady is not fit to look after her kids.'
The point of a self-driving car is to drive you anywhere. It can self-drive you to the police station, if that suddenly seems to be the right thing. Those cars can also be disabled immediately at source, so if you haven't kept up your payments, your car won't start.
And there's a whole science of telematics coming through saying, 'Well, if we've got everybody's data, and they need to be connected all of the time, then we can control their lives.' In China, they're already experimenting with what they call social passports, which means that if you've got a healthy credit rating, you can buy train and plane tickets. If you've done something the Chinese state doesn't like, you'll be refused those basic services.
This is quite scary, because it does become an Orwellian future, where our behaviour will determine what services we can or can't access. And it means that good or bad behaviour is determined, not by the state in this case, but by the companies who will be providing all of these services.
Given that all of that data is part of what allows these machine learning applications to function, is that a deal with the devil that we just have to make if we want to have artificial intelligence in our lives?
That in itself is an interesting ethical question. It may well be that privacy is over because we're moving into a different kind of social contract, which isn't about the privacy of the individual.
Elon Musk, with his Neuralink company, is working on brain implants to help paralyzed people connect to their devices using their thoughts. It's a wonderful idea. But this is leading towards the sense that we could all have neural implants. So you won't need a device, you will be your own device.
But that is a two-way door. Even that moment where you think "OK, there's CCTV everywhere, all my data is disclosed, but I have my private thoughts," well, not if you have a neural implant and not if we have nano proteins in the body, which will monitor our blood pressure, blood sugar, whether we're going to have a heart attack.
All of that sounds really good, but it means we're absolutely known. You can't hide away, you can't go off grid, you will belong to whatever version of the state is the next iteration, which I think will be a hybrid between tech companies and what we think of as democratic government.
There's a fascinating reflection in the book on the relationship between our post-material, virtual future and religious perspectives on impermanence. Can you explain that a little bit more?
It has been of great interest to me, as someone who was brought up in a very religious Pentecostal household, that science and religion, which were always so far apart, are now lining up and talking the same language, saying the same things. The idea that no, you're not condemned to live in this body made of meat, you will be able to transcend your physical limits, that you will be able to have something close to immortality.
That's really the same as saying, when you die, your spirit goes somewhere else. What we're saying is what religious people have always said, this is not the end, this is part of the journey and what's to come will not be dependent on materiality, it will be dependent on consciousness, which is a very exciting idea and also quite a scary idea.
The push at the minute isn't just for longevity, the idea that we can live longer, healthier lives, though there's a great deal of Silicon Valley money going into all that; Ray Kurzweil, for one, is hoping, if not to live forever, then at least to get to 200 years old.
The idea that we could upload our consciousness would mean that this made-of-meat version that lasts 80 or 100 years is about to be superseded by whatever comes next: the next version of Homo sapiens, or a blended reality, a new mixed race where we become transhuman on the way to becoming post-human.
What surprised me in reading the book is that you are quite optimistic about transhumanism, about all these possible ways that things might turn out. And yet, you're very clear on how messed up we are as human beings and how we are constantly messing these things up.
How do you square that optimism for the future with the realism about who we are as human beings?
I somehow need to believe that this is all going to work out, because it's my temperament. But I also think that if we are too doom-laden, if we think there's nothing to be done and we're heading towards a dystopia, that's where we're heading.
When you learn to ride a motorbike, they always say that where your head goes, the body follows. And it's absolutely true. If we're balanced about the risks and the difficulties, but we believe in ourselves and have confidence in ourselves, then I think we can pull off the most amazing future. But of course, more women need to be involved. More people from the humanities need to be involved.
We can't just leave it to a load of guys working in Silicon Valley to decide the future of the planet, especially not rich ones with rockets.
This Q&A has been condensed for length and clarity.
Written by Samraweet Yohannes. Produced by Nora Young and Michelle Parise.