Study Reveals the Impact of TikTok Content on Teens' Mental Health
TikTok, the popular social media platform, has been under the spotlight lately due to concerns about its impact on the mental health of young users. After intense public hearings and alarming revelations from Facebook whistleblower Frances Haugen concerning Instagram's detrimental effects on teens, TikTok finds itself in a similar position.
The Center for Countering Digital Hate (CCDH) believes further legislation is needed to safeguard children from inappropriate material on TikTok. The CCDH recently reported that new users can encounter content related to suicide and eating disorders less than eight minutes after creating an account.
In response, TikTok maintains that it regularly consults with health experts, removes content that violates its policies, and provides access to supportive resources for anyone in need. The platform also offers filters that screen out potentially inappropriate content for a more age-appropriate viewing experience.
One of TikTok's recent initiatives is a "maturity score" that helps users decide whether they are comfortable viewing content with complex or mature themes. Searches for harmful words or phrases, including #selfharm, return no results on TikTok; instead, viewers are directed to local support resources.
However, critics argue that TikTok’s content recommendation algorithm can create “filter bubbles” where vulnerable users are repeatedly exposed to negative content related to self-harm, eating disorders, and suicide, worsening symptoms like depression and anxiety among youth. Internal research leaked during litigation disclosed that TikTok’s algorithm can place users in harmful content loops in as little as 30 minutes, contributing to serious mental health consequences requiring medical intervention.
To address these concerns, TikTok offers tools that allow users to set time limits on videos and take regular screen breaks. For users under 18, TikTok automatically applies stricter settings such as a 60-minute daily screen time limit, Restricted Mode to limit inappropriate content, and Family Pairing, which allows parents to control content and privacy settings directly.
Despite these measures, TikTok faces criticism and legal challenges for allegedly exacerbating mental health issues in teens. Ongoing efforts include enforcing Community Guidelines and collaborating with experts, yet challenges persist due to the complex effects of social media on youth mental health.
Recent findings by the CCDH suggest there is still more work to be done to make all digital platforms safer for young users. For its part, TikTok reports that between April and June of this year it removed 93.4% of videos violating its policies on suicide and self-harm within 24 hours and before they received any views, and removed 97.1% before any reports were filed.
Executives of social media platforms, including TikTok, have faced questions from Congress about the potential impact of their sites on young users, particularly adolescent girls, and whether they might be contributing to negative mental health outcomes. TikTok has committed to banning content that could potentially lead to self-harm or suicide.
The CCDH's chief executive, Imran Ahmed, said the report underscores the urgent need for reform of online spaces, including TikTok. As the debate continues, it is clear that the mental health of young users remains a pressing concern for lawmakers, parents, and the tech industry alike.
- The Center for Countering Digital Hate (CCDH) has been pushing for legislation to safeguard children from inappropriate content on TikTok, reporting that content related to suicide and eating disorders can surface less than eight minutes after account creation.
- TikTok has pledged to combat these issues with mental health resources such as a "maturity score," removal of content that violates its policies on self-harm and suicide, and tools that let users set time limits on videos and take regular screen breaks.
- Despite these efforts, TikTok continues to face criticism and legal challenges over allegations that its recommendation algorithm can expose vulnerable users to harmful content, potentially exacerbating symptoms such as depression and anxiety.