TikTok shared some new features on Tuesday supporting user mental health, including a guide on how to interact with people who may be struggling and updates to warning labels for sensitive content. The change reportedly follows a report about internal research at Instagram, the photo-sharing app owned by Facebook.
In a blog post, TikTok wrote, "We do not allow content that promotes, glorifies, or normalizes suicide, self-harm, or eating disorders, but we do support those who choose to share their experiences to raise awareness, help others who may be struggling, and find support among our community."
To help make these conversations and connections safer, TikTok is rolling out a new well-being guide to support anyone who shares their personal experiences on the video app. The guide was developed together with the International Association for Suicide Prevention, Crisis Text Line, Live For Tomorrow, Samaritans of Singapore, and Samaritans (UK), and is available in TikTok's Safety Center. It also provides tips on how to engage with people who may be struggling.
The social video app is also sharing a new Safety Center guide about eating disorders for teens, caregivers, and educators. The guide was developed with the National Eating Disorders Association, the National Eating Disorder Information Centre, the Butterfly Foundation, and Bodywhys, and provides information, support, and advice on eating disorders. Earlier this year, TikTok added the ability to direct users searching for terms related to eating disorders to relevant resources.
In addition, when someone searches for a word or phrase like #suicide, they will be directed to the Crisis Text Line helpline and resources for finding information on treatment options and support.
TikTok has also updated the warning labels for sensitive content, so when a user searches for a term that may surface distressing content, such as "scary makeup," an opt-in viewing screen now appears on the search results page. Users can view the content by tapping "View results."
The search page also features content from creators who share their personal experiences with mental health, information on where to get help, and advice on how to talk to loved ones.
“These videos appear in search results for specific terms related to suicide and self-harm, and the community can choose to watch them if they wish,” said TikTok.
On Tuesday, The Wall Street Journal reported that in studies conducted over the past three years, Facebook researchers found Instagram was "harmful for a sizable percentage" of young users, especially teenage girls. For years, child advocates have expressed concern about the mental health implications of sites like Instagram, where it can be difficult to distinguish reality from altered images. Advocates and lawmakers have long criticized Instagram and parent company Facebook for hosting harmful content that promotes anxiety and depression, especially among young audiences.
According to a 2017 report by the Royal Society for Public Health, Instagram is the worst social media platform for young people's mental health. A report earlier this year revealed Instagram's plans to launch a version of the platform for children under the age of 13, stirring more criticism from child health advocates worried about threats to children's online privacy and mental health.
In response to criticism, both Facebook and Instagram gave all users the ability in May to hide the number of likes their posts receive, and to choose whether like counts are visible on the posts in their feed.
Of course, concerns about the impact of technology on young people's minds extend to TikTok as well. The company was sued in April for allegedly collecting and using children's data illegally; TikTok said those claims had "no merit."
If you are struggling with negative thoughts or suicidal ideation, here are 13 suicide and crisis intervention hotlines you can use to get help.
TikTok launches mental health guide after research shows Instagram is harmful to teens