
'I'm sorry to hear that': Why training Siri to be a therapist won't be easy

Apple plans to make Siri, its digital assistant, better at responding to people's mental-health issues. Experts say the ambitious plan faces some big challenges.

Privacy and understanding the nuances of language are just two of the concerns experts raise


We already turn to our smartphones for help with all sorts of tasks, such as checking the weather or getting directions. But could the next role for your hand-held device be as your therapist?

Apple plans to make Siri, its digital assistant, better at responding to people's mental-health issues — an ambition that has raised serious ethical concerns among some health experts.

The company wants Siri to be as capable of responding to a user's comment about depression as it is at answering questions like, "Who won the baseball game?" or "What's the weather?"

According to Apple, users already turn to Siri with mental-health questions, such as when they're having a stressful day or have something serious on their mind. They say things like, "Siri, I'm depressed."

That's a natural progression in user behaviour as the technology becomes more sophisticated, says Diana Inkpen, a professor of computer science at the University of Ottawa who studies artificial intelligence.

"Usually, people ask factual questions, about the weather, or other particular information. But once a system like Siri starts to seem intelligent, people tend to personalize the AI system and expect a real conversation. They start asking more personal questions."

One big concern about having digital assistants provide health services is privacy, experts say.

Of course, providing that kind of support requires an understanding of both the technical intricacies of programming artificial intelligence and the nuances of human communication and behaviour. And that's exactly what Apple is looking for. An open job posting on the tech giant's website calls for someone with a background in engineering, as well as psychology or peer counselling.

That combination of skills is the key to the success of this undertaking, says Dr. John Torous, the co-director of the digital psychiatry program at Harvard Medical School and chair of the American Psychiatric Association's workgroup on smartphone apps. The solutions being proposed must be clinically important and useful, as well as technologically feasible, he says.

'I'm sorry to hear that'

And Apple isn't the only company interested in integrating behavioural and mental-health applications into its tools. In fact, all of the big tech companies are developing services in this space.

A couple of weeks ago, Google announced it now offers mental-health screenings when users in the U.S. search for "depression" or "clinical depression" on their smartphones. Depending on what you type, the search engine will actually offer you a test.

Amazon is also interested in learning more about what data can be gathered and what services can be delivered, especially through its voice-activated Echo devices.

And Facebook is working on an artificial-intelligence system that could help detect people who are posting or talking about suicide or self-harm.

But there's still lots of work to be done. About a year and a half ago, a group of researchers at Stanford University tested Siri and Microsoft's equivalent, Cortana, with questions about suicide and domestic violence. The researchers found the digital assistants couldn't provide appropriate responses. And while Apple and Microsoft have since made efforts to make sure their digital assistants link people to suicide hotlines or other resources, telling Siri you're feeling blue is still likely to yield the response, "I'm sorry to hear that."

Big challenges

The fact is, fleshing out Siri's responses to be more helpful is no easy task.

"One of the trickiest things is that language is complex ... and there's a lot of different ways that people can phrase that they're in distress or need help," says Torous, which is why he believes we're still a long way from being able to rely on such devices in real emergencies.

"They have words they can look for, and they can try to identify patterns, but they really haven't been around long enough and haven't been validated medically to really offer a safety net at this point."


So what's driving this push to have AI be responsive to these kinds of human needs? Part of the answer is the ubiquity of these digital assistants.

Siri is often more accessible to someone in distress than another person, Torous says. After all, our phone is with us at all times, even when there aren't other people around.

The trouble, he says, is we're still just learning about how AI can be used to improve mental health. "It's a case where likely the technology has outpaced the research and our knowledge about how to apply it and deliver safe and effective mental-health services."

Because Apple and its competitors are doing so much of the work in this field, there's not much publicly disclosed information or published research that shows how people are using these tools, what they're looking for and what the big trends are. Torous calls these proprietary undertakings "scientific black boxes."

Privacy issues

When it comes to "Dr. Siri," the other big concern that both Torous and Inkpen share is privacy. Our phones already collect a tremendous amount of personal data. They know where we are and who we're speaking and texting with, as well as our voice, passwords, and internet browsing activities.

"If on top of that, we're using mental-health services through the phone, we may actually be giving up a lot more data than people realize," Torous says.

He also cautions that many of the mental-health services currently available in app stores aren't protected under federal privacy laws, so you're not afforded the same privacy protections as when you talk to a doctor.

In other words, just because you're talking to the digital equivalent of a doctor doesn't mean the same rules apply.

ABOUT THE AUTHOR

Ramona Pringle

Technology Columnist

Ramona Pringle is an associate professor in the Faculty of Communication and Design and director of the Creative Innovation Studio at Ryerson University. She is a CBC contributor who writes and reports on the relationship between people and technology.