By James Pero For Dailymail.com
Published: 17:38 BST, 10 September 2019 | Updated: 17:42 BST, 10 September 2019
Facebook said it will tighten its grip on content relating to suicide and self-harm in an effort to make the platform and its sister site, Instagram, safer.
In a blog post, Facebook announced several policy changes that will affect how content relating to self-harm and suicide is treated once posted to its platform.
The company says it will 'no longer allow graphic cutting images to avoid unintentionally promoting or triggering self-harm.'
That policy will apply 'even when someone is seeking support or expressing themselves to aid their recovery,' the company said.
The new policy will also encompass images of healed self-inflicted cuts, which the company says it will temper with a 'sensitivity screen' that users must click through to access the underlying content.
Likewise, Instagram will start to deprioritize content that depicts self-harm, removing it from the Explore tab and excluding it from the company's recommendation algorithm.
To help promote healthy dialogue about suicide and self-harm, Facebook says it will also direct users who search for content relating to those topics to guidelines developed by Orygen, the National Centre of Excellence in Youth Mental Health.
The guidelines are meant to 'provide support to those who might be responding to suicide-related content posted by others or for those who might want to