
Suicide and self-harm content is scarily easy to find on social media


Content about suicide and self-harm keeps appearing on social media. 

Getty Images

If you're struggling with negative thoughts or suicidal feelings, resources are available to help. In the US, call the National Suicide Prevention Lifeline at 1-800-273-8255; in the UK, call the Samaritans at 116 123; and in Australia, call Lifeline at 13 11 14. Additionally, you can find help at these 13 suicide and crisis intervention hotlines.


When Ian Russell's daughter took her own life in 2017, the grieving father turned to the teenager's social media accounts to search for answers.

Russell discovered that 14-year-old Molly had viewed graphic content about self-harm, including memes encouraging suicide. The images appeared on popular social and image-sharing sites such as Instagram, which is owned by Facebook, and Pinterest.

Since Molly's death, Russell has strongly urged social media companies to do more to tackle content encouraging self-harm and suicide, which he says contributed to his daughter's death. The UK resident has called on social media companies to provide researchers with data so they can study the potential dangers of this type of content and has advocated for government regulation.

"As time passes, it's increasingly my belief that platforms don't really know what to do, so they move when pushed to do something," Russell said. "It's often a PR exercise." 

Pressure like Russell's is mounting in the wake of a series of recent stories in The Wall Street Journal reporting that Facebook, the world's biggest social network, is aware of the harm its services can inflict on users' mental health but publicly downplays the risks.

Antigone Davis, Facebook's head of safety, testified Thursday in a US Senate hearing on Facebook's and Instagram's impact on children's mental health.

"We have a set of suicide prevention experts that we work with on a regular basis and we are constantly updating our policies," Davis told lawmakers. 

One internal Instagram study cited by the Journal found that 13% of British users and 6% of American users traced the desire to kill themselves to Instagram. Facebook has disputed the Journal stories, saying that the paper mischaracterized the company's research and that teens also have positive experiences on social media. The company shared some of its research ahead of Thursday's hearing. On Monday, the company said it would pause development of a children's version of Instagram.

Facebook, Instagram, Twitter, TikTok and Pinterest all have rules banning the promotion or encouragement of suicide and self-harm, and all direct users to suicide prevention resources. At the same time, these sites say they don't want to prevent people from sharing their mental health struggles or getting support.

"While we do not allow people to intentionally or unintentionally celebrate or promote suicide or self-injury, we do allow people to discuss these topics because we want Facebook to be a space where people can share their experiences, raise awareness about these issues, and seek support from one another," Facebook's community standards states. 

Some of these sites block or restrict search results for self-harm or suicide, while others warn users about looking at suicide-related content. Certain platforms are more aggressive than others about moderating the harmful content. 

Still, the types of images that Russell and other parents worry about can easily be found online. 

Researchers say there's no easy fix, such as restricting some keywords. 

"It's not a matter of just blocking some searches," said Jeanine Guidry, an assistant professor at Virginia Commonwealth University, who has researched suicide content on Instagram and Pinterest. Though the sites have made progress in recent years, Guidry says, "if you dig a little bit further you see these messages."

Searching for suicide and self-harm content

More than 700,000 people worldwide die by suicide every year. Globally, suicide was the fourth leading cause of death among 15- to 29-year-olds in 2019, according to the World Health Organization.

Exposure to suicide and self-harm content on social media has been linked to harmful mental health effects. A study published in the journal New Media & Society in 2019 found that people who saw self-harm content on Instagram showed "more self-harm and suicidality-related outcomes." The study, which surveyed US adults between the ages of 18 and 29, noted that this may be because of the exposure itself or because people who see such content are already at higher risk and more likely to encounter it.


 An Instagram search for "#selfharm" surfaces this message.

Screenshot by CNET

"Either way, findings suggest that the self-harm content on Instagram should be a source of concern," the study stated.

Instagram hides posts for #selfharm and directs users to support resources. But self-harm images still surfaced in search results for #suicide, which users can view if they click through a warning. "Posts with the words you're searching for often encourage behavior that can cause harm and even lead to death," the prompt cautions. 

Some of the results for suicide content include mental health support resources or discourage people from taking their lives. Other posts appeared to violate Instagram's rules against promoting and encouraging suicide. In 2019, Instagram said it would ban fictional self-harm or suicide content, including memes and illustrations.

This type of content, however, popped up in search results for suicide content. One account, suicidal_obsessions, posted more than 170 illustrations showing self-harm and depicting suicide methods, and remained active for at least two weeks before Instagram banned it. Instagram said it restricted the account from streaming live video but later removed it entirely for continued violations.

The images CNET found represent a sliver of the suicide and self-harm content online. From April to June, Facebook said it took action against 16.8 million pieces of suicide and self-harm content, while Instagram took action against 3 million pieces. The majority of the content was flagged by the platforms, and views of violating content were "infrequent," according to a report from Facebook.

Samaritans, which has worked with social media companies and provides suicide prevention resources, has tips about how to safely post about suicide online. Some of that guidance includes providing a trigger warning, avoiding language such as "committed suicide" because it can make the act sound like a crime, steering clear of posts about suicide methods and thinking about how regularly you post about the topic.

Policing a variety of harmful content

Tech companies also use artificial intelligence to help identify and hide self-harm content before a user reports it. Not all platforms, though, block self-harm and suicide content from search results.

Twitter bans the promotion of self-harm or suicide content but still allows images of self-harm as long as they're marked as sensitive media, which users have to confirm to see. (The user is expected to mark sensitive content, but Twitter will also do so if the tweet gets reported.) Unlike on other social networks, results for self-harm content aren't blocked; the number for a crisis hotline appears at the top of the results.

The social network appears to enforce its policies inconsistently. Graphic photos of cuts show up in the results for self-harm, some of which aren't marked as sensitive. Some posts received more than 200 "likes" but don't appear to meet Twitter's definition for encouraging self-harm. Some tweets prompted users to ask questions about what instrument was used to cause the harm or reply with comments such as "so hot, keep it up."


TikTok sends users searching for "#selfharm" to resources where they can get help.

Screenshot by CNET

Twitter marked self-harm images as sensitive after CNET brought them to the company's attention, but more popped up on the site. A spokesperson said the company will delete tweets "encouraging self harm," in keeping with its rules.

TikTok, the short-form video app, has also struggled to handle suicide content. In 2020, a video of a man killing himself that originally came from Facebook went viral on the platform. TikTok attributed the spread of the video to a "coordinated raid from the dark web" in which users edited the video in different ways so it would escape the platform's automated detection. The incident was a striking example of suicide material surfacing on social media even if users aren't searching for it. 

TikTok displays support resources when a user searches for self-harm content. Searches for suicide also surface resources but include an option to view results. Users are warned that "this content may not be appropriate for some viewers." Results included videos about suicide prevention or awareness.

Still, TikTokers have used other hashtags or language to talk about suicide, making it tougher to moderate this content. One anonymous account, @justasadgirl_, posted several videos about depression, self-harm and the urge to kill herself. The profile described the user as a 16-year-old girl. TikTok removed the account for violating its rules against "depicting, promoting, normalizing, or glorifying activities that could lead to suicide, self-harm, or eating disorders." The company also removed a sound that was being used by TikTok users to indicate where on their body they had harmed themselves.

For searches related to suicide or self-harm, Pinterest shows exercises to cope with feelings of sadness, as well as the number for the National Suicide Prevention Lifeline; results for those searches aren't shown. But a search for depression led to recommendations for more sad images, such as people crying, and included illustrations depicting suicide. Pinterest said users can turn off recommendations for certain pins, which are bookmarks used to save images, videos or other content.

With the help of AI, Pinterest said it's seen an almost 80% drop in reports for self-harm content since April 2019.

"Our work here is never finished, because we don't want any harmful content on Pinterest," a company spokesperson said.  

For Ian Russell, the work comes far too late. The depressing content that Pinterest and other social media companies showed Molly was too intense for a teenager, he said.

"What she saw online educated her about subjects that no 14-year-old should need to be exposed to," he said. "It encouraged, in her, hopelessness."

The information contained in this article is for educational and informational purposes only and is not intended as health or medical advice. Always consult a physician or other qualified health provider regarding any questions you may have about a medical condition or health objectives.
