Instagram changes rules on self-harm postings

Instagram has agreed to ban graphic images of self-harm after objections were raised in Britain following the suicide of a teen whose father said the photo-sharing platform had contributed to her decision to take her own life.

Changes announced after Instagram and other social media firms met with Britain's health secretary

Instagram's chief said they're working with experts and the wider industry to find ways to support people when they're most in need. (Damian Dovarganes/Associated Press)

Instagram chief Adam Mosseri said Thursday evening the platform is making a series of changes to its content rules.
 
He said: "We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable in our community."
 
Mosseri said further changes will be made.
 
"I have a responsibility to get this right," he said. "We will get better and we are committed to finding and removing this content at scale, and working with experts and the wider industry to find ways to support people when they're most in need."

Video: The U.K. could ban social media sites in the wake of a teen's death (duration 2:12). The suicide of a young girl in the U.K. is prompting a heated debate about the responsibility of social media sites to remove harmful content. Her family says she had been viewing disturbing content about self-harm on Instagram and Pinterest. Now the British government is considering banning certain platforms if companies don't comply.

 
The call for changes was backed by the British government after the family of 14-year-old Molly Russell found material related to depression and suicide on her Instagram account after her death in 2017.
 
Her father, Ian Russell, said he believes the content Molly viewed on Instagram played a contributing role in her death, a charge that received wide attention in the British press.
 
The changes were announced after Instagram and other tech firms, including Facebook, Snapchat and Twitter, met with British Health Secretary Matt Hancock and representatives from the Samaritans, a mental health charity that works to prevent suicide.

Instagram is also removing non-graphic images of self-harm from search results.
 
Facebook, which owns Instagram, said in a statement that independent experts advise that Facebook should "allow people to share admissions of self-harm and suicidal thoughts but should not allow people to share content promoting it."

Where to get help:

Canada Suicide Prevention Service: 1-833-456-4566 (Phone) | 45645 (Text) | crisisservicescanada.ca (Chat)

In Quebec (French): Association québécoise de prévention du suicide: 1-866-APPELLE (1-866-277-3553)

Kids Help Phone: 1-800-668-6868 (Phone), Live Chat counselling at www.kidshelpphone.ca

Canadian Association for Suicide Prevention: Find a 24-hour crisis centre

If you're worried that someone you know may be at risk of suicide, you should talk to them about it, says the Canadian Association for Suicide Prevention. Warning signs include:

Suicidal thoughts.
Substance abuse.
Purposelessness.
Anxiety.
Feeling trapped.
Hopelessness and helplessness.
Withdrawal.
Anger.
Recklessness.
Mood changes.