Facebook 'tightening' policy on self-harm and suicide in an effort to protect ...

Facebook says it is 'tightening' its policy on content relating to self-harm and suicide in an effort to protect users' mental health
Facebook says it will start removing content that depicts self-harm
That will include 'graphic cutting images', which it says could trigger users
Instagram will also remove such content from its Explore tab
The company is seeking a mental health expert to join its security team

By James Pero For Dailymail.com

Published: 17:38 BST, 10 September 2019 | Updated: 17:42 BST, 10 September 2019


Facebook said it will tighten its grip on content relating to suicide and self-harm in an effort to make the platform and its sister site, Instagram, safer.

In a blog post, Facebook announced several policy changes that will affect how content relating to self-harm and suicide is treated once posted to its platform.

The company says it will 'no longer allow graphic cutting images to avoid unintentionally promoting or triggering self-harm.' 

That policy will apply 'even when someone is seeking support or expressing themselves to aid their recovery,' Facebook said.

Facebook said it will start to remove content depicting self-harm in an effort to avoid triggering users who may be dealing with similar issues

The new policy will also encompass images of healed self-inflicted cuts, which the company says it will place behind a 'sensitivity screen' that users must click through to access the underlying content.

Likewise, Instagram will start to deprioritize content that depicts self-harm, removing it from the Explore tab and sequestering it from the company's suggestion algorithm.  

To help promote healthy dialogue on suicide and self-harm, Facebook says it will also direct users to guidelines developed by the National Centre of Excellence in Youth Mental Health, ORYGEN, when they search for content relating to those topics. 

The guidelines are meant to 'provide support to those who might be responding to suicide-related content posted by others or for those who might want to

