It is one of young people’s favorite social networks… and sometimes one of their worst enemies. TikTok “exposes children and young adults who already have mental health problems to serious risks,” warns Lisa Dittmer, a researcher at Amnesty International. In a new study, the organization shows that the video platform steers its most vulnerable users toward content that “romanticizes” depression and suicide. To reach this conclusion, the organization created fictitious accounts of 13-year-old children on the platform: some driven by artificial intelligence in Kenya and the United States, others managed manually in the same countries and in the Philippines. The videos the platform recommended were then analyzed, and the results are worrying.
After the AI-driven accounts had spent five to six hours on TikTok (spread over several days), nearly one in two recommended videos was potentially harmful to the user’s mental health. Such videos were served roughly ten times more often to accounts whose owners had shown an interest in mental health. For the accounts operated manually by researchers, the effect was even stronger: three to twenty minutes spent on mental health content was enough for half of the videos TikTok recommended to relate to that theme. “When I ‘like’ a sad video that speaks to me, suddenly my whole page is sad. I find myself on ‘sad TikTok’. It affects my mood,” says Francis, an 18-year-old student in the Philippines.
» READ ALSO – Better understanding suicide to better prevent it
Worse still, after only an hour of use, the accounts were shown videos that idealized, trivialized or even encouraged suicide. “Social networks are vectors of vulnerability for certain young people who are already vulnerable,” worries Professor Charles-Édouard Notredame, a psychiatrist at Lille University Hospital who works in particular on the Elios research program, which studies interactions between suicidal young people and social networks.
“Our study shows that TikTok may be exposing children and youth to serious health risks by persisting with its current business model, which focuses more on keeping users’ eyes glued to the platform than on respecting the right to health of children and young people,” says Lisa Dittmer. According to Amnesty, TikTok’s very model encourages harmful use of the application by cultivating obsessive user behavior. Professor Charles-Édouard Notredame describes “a phenomenon of compulsive checking, built on short videos, whose risk must be recognized.”
» READ ALSO – Depression, anxiety, attention disorders… The worrying rise of self-diagnosis on TikTok among young people
Although a video can run up to 10 minutes, most last no more than a few dozen seconds, the better to “hook” the user quickly. Two feeds are offered: the first shows videos published by accounts the user follows, while the second, the “For You” feed, is driven by an algorithm that serves a personalized, endlessly scrolling selection. When a user watches a video to the end, the algorithm infers that the subject appeals to them and recommends more videos on the same theme to keep them on the app. The data collected is also made available to advertisers, who can use it to target their campaigns.
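To make this feedback loop concrete, here is a minimal, purely illustrative sketch of an engagement-driven feed. It is not TikTok’s actual system; the topic labels, the scoring rule and all names are assumptions, chosen only to show how watch-through signals can narrow a feed around a single theme.

```python
# Illustrative sketch of an engagement-driven feed, NOT TikTok's real code.
# Assumption: each video carries a "topic" label, and the feed boosts topics
# the user tends to watch to completion.
import random
from collections import defaultdict

class Feed:
    def __init__(self, catalog):
        self.catalog = catalog                    # list of (video_id, topic)
        self.affinity = defaultdict(lambda: 1.0)  # per-topic weight

    def record_view(self, topic, watched_fraction):
        # A fully watched video raises the topic's weight; a quick skip lowers it.
        self.affinity[topic] *= 1.0 + (watched_fraction - 0.5)

    def next_video(self):
        # Sample a video with probability proportional to its topic's weight.
        weights = [self.affinity[topic] for _, topic in self.catalog]
        return random.choices(self.catalog, weights=weights, k=1)[0]

feed = Feed([("v1", "comedy"), ("v2", "sadness"), ("v3", "sports")])
feed.record_view("sadness", 1.0)  # watched to the end -> weight grows
feed.record_view("sports", 0.1)   # skipped early -> weight shrinks
print(feed.next_video())          # "sadness" videos now surface more often
```

Run repeatedly, a loop like this converges on whatever the user lingers over, which is one mechanical reading of the “sad TikTok” spiral described in the testimonies above.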
“What is at stake is the redundancy of risky content,” which can be reported to the platform, underlines Professor Charles-Édouard Notredame. Officially, TikTok prohibits content “showing, promoting or providing instructions on suicide and self-harm, and related challenges, games and pacts,” as well as content “showing or promoting false information relating to suicide and self-harm” and “sharing plans for suicide and self-harm.” But “the content is not necessarily explicitly dangerous,” the doctor points out. One can instead find, for example, numerous “testimonies of despair. People in distress enter a spiral, a loop; it is co-rumination, reinforced by algorithmic bubbles.”
» READ ALSO – Hateful messages, calls for riots: what will European law change on social networks?
The psychiatrist advises young people and those around them to “contact a trusted adult, a teacher or a professional, if suicidal thoughts appear.” Beyond direct support for a young person in difficulty, several resources exist: the 3114 number is a hotline dedicated to people experiencing suicidal thoughts, teen centers offer discussion spaces and sometimes digital support sessions, and the website filsantejeunes.com provides advice and information.
“We must remember that a social network is a social space; it is neither good nor bad, and we must think about how we use it,” the doctor stresses. He calls for “better collaboration between clinical and mental health research and large social media companies. We won’t be able to do without these behemoths.” Amnesty International urges states to legislate a ban on advertising based on data collected from minors, and suggests that TikTok inform users about how its algorithm works and give them the option of deciding whether or not it is applied to them. The organization also points to the absence of studies on the platform’s impact on users, which “clearly goes against TikTok’s obligation to respect their rights.” The psychiatrist notes that “censorship is difficult to apply; however, we know what works, such as regulation through messages of help and support. We must consider the social network as a means of prevention.” Amnesty is calling for corporate accountability and reparations from Meta, Google, TikTok and other “tech giants.”