Facebook says it is ‘tightening’ its policy on content relating to self-harm and suicide in an effort to protect users’ mental health

Daily Mail

Facebook said it will tighten its grip on content relating to suicide and self-harm in an effort to make the platform and its sister-site, Instagram, safer.

In a blog post, Facebook announced several policy changes that will affect how content relating to self-harm and suicide is treated once posted to its platform.

The company says it will 'no longer allow graphic cutting images to avoid unintentionally promoting or triggering self-harm.' 

That policy will apply 'even when someone is seeking support or expressing themselves to aid their recovery,' the company said.

The new policy will also encompass images of healed self-inflicted cuts, which the company says it will place behind a 'sensitivity screen' that users must click through to access the underlying content.

Likewise, Instagram will start to deprioritize content that depicts self-harm, removing it from the Explore tab and sequestering it from the company's suggestion algorithm.  

To help promote healthy dialogue on suicide and self-harm, Facebook says it will also direct users to guidelines developed by the National Centre of Excellence in Youth Mental Health, ORYGEN, when they search for content relating to those topics. 

The guidelines are meant to 'provide support to those who might be responding to suicide-related content posted by others or for those who might want to share their own feelings and experiences with suicidal thoughts, feelings or behaviors,' said Facebook.

According to Facebook, the changes come as the result of input from mental health professionals and experts in the field of suicide prevention. 

In February, Facebook vowed to help weed out graphic content depicting self-harm, citing experts from ten countries who had advised that Facebook should 'allow people to share admissions of self-harm and suicidal thoughts, but should not allow people to share content promoting it'.

To oversee its effort, Facebook said it will also hire a health and well-being expert as a part of its safety team.

That person will be responsible for coordinating with external experts and organizations to address issues relating to mental health, including 'suicide, self-harm, eating disorders, depression, anxiety, addiction, nutrition, healthy habits, vaccinations' and more.

That coordination will apparently go both ways, said Facebook. For the first time, the platform said it will begin sharing data with academics on how Facebook's users talk about suicide. 

Researchers will have access to a tool called CrowdTangle, which allows its users to search for and track specific content on the platform.

While the tool has been used primarily by publishers and media companies to identify trends on Facebook as they gather steam, Facebook says it will now grant access to two unnamed researchers for their efforts in the field of suicide prevention.

Facebook and other large companies like YouTube and Twitter have all been under an unprecedented amount of pressure from lawmakers and concerned users to crack down on toxic content emanating from their platforms.

This month, YouTube said it had banned more than 17,000 accounts for spreading 'hateful content' while Twitter has rolled out a number of new policy changes surrounding what it considers a violation of its user agreement.  
