
Children's privacy: TikTok did not enforce its own age restrictions

According to TikTok's own rules, users must be at least 13 years old to create an account on the video-sharing app. But many children and young people simply ignore this. The app only asks for a date of birth during setup, and TikTok does not check whether the information is correct.

TikTok has long been aware of the problem internally. Not long ago, TikTok and its predecessor app musical.ly were considered dedicated platforms for children, and hardly anyone over 14 used the app.

To counter the problem, TikTok developed an internal system, the so-called "user rate". Moderators were supposed to classify users into different age groups based on their appearance. This was reported to netzpolitik.org by two independent sources with insight into TikTok's internal moderation practices in Europe. The practice was in use at least until the end of 2019. The categories were:

  • Whitelist + for users who look 15 or older.
  • Whitelist for users who look 13 or older.
  • Blacklist for users who look younger than 13.
  • No rate

Children were not banned

Notably, users who were presumably younger than 13 and therefore ended up in the "Blacklist" group were not automatically blocked. Instead, the reach of their videos was restricted.

This not only contradicts TikTok's official terms of use but also the company's message to parents, which reads: "If your child is under 13 and has signed up for a TikTok account intended for ages 13 and over, you can notify us at [email protected]. We will then initiate appropriate measures."

It also violates the European Union's data protection law. Under the General Data Protection Regulation, children and young people under 16 need their parents' consent to use apps such as TikTok, Facebook or Instagram that process their data. This consent must be given in writing beforehand.

Suspicious videos escalated to Beijing

In some countries such as Austria and Denmark, the age of consent has been lowered to 13, but even there TikTok would at least have knowingly risked breaking the law with this practice: even for users who were internally assumed to be under the age limit, no parental consent was obtained.

The so-called "user rating" was often carried out within the same team as other moderation work, by specially trained employees. Moderators also had the option of escalating suspicious videos to another team in Beijing, which could decide to block accounts.

When asked by netzpolitik.org, TikTok did not say how many users ended up on the internal blacklist or why their accounts were neither blocked nor reviewed. TikTok also did not answer how many users were subsequently banned for being under the age limit.

The company responded to our detailed questions with the following written statement: "As our Terms of Use make clear, TikTok is intended for users aged 13 and over. We have set up an age rating system that asks people to provide their actual age to determine whether they are eligible to use our platform. If we discover that a person under the age of 13 may be using TikTok, we will investigate this immediately and take appropriate action."

Categories also visible in the source code

Until March of this year, the "user_rate" category was also visible in the app's source code, in the place where users leave comments under other people's videos. This is what security researchers at the Australian Strategic Policy Institute (ASPI) discovered when they examined the app. TikTok used a numerical system; for most users, the "user rate" was set to 1. Some users, however, had different numerical values. Neither ASPI nor netzpolitik.org could determine what the values stand for, and TikTok did not provide any information on request.

With an update in March, the "user rate" disappeared from the source code of the comments. Instead, there is now a new field: "user_buried". The value can be set to "true" or "false". TikTok did not provide any information about what this value stands for.
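For illustration, the change the researchers observed can be sketched as a difference in the comment payload the app receives. Only the field names "user_rate" and "user_buried" come from the ASPI findings described above; the surrounding payload structure and all values are assumptions made up for this sketch:

```python
import json

# Hypothetical excerpt of a comment payload BEFORE the March update.
# Only the field name "user_rate" is documented; everything else is invented.
old_comment = json.loads("""
{
    "cid": "1234567890",
    "text": "nice video!",
    "user": {"uid": "42", "nickname": "example_user"},
    "user_rate": 1
}
""")

# Hypothetical excerpt AFTER the March update: the numeric "user_rate"
# is gone, replaced by the boolean "user_buried" flag.
new_comment = json.loads("""
{
    "cid": "1234567890",
    "text": "nice video!",
    "user": {"uid": "42", "nickname": "example_user"},
    "user_buried": false
}
""")

print(old_comment["user_rate"])    # 1 for most users, per ASPI
print(new_comment["user_buried"])  # False
```

What the numeric values or the boolean flag actually control is exactly what TikTok declined to explain; the sketch only shows where the fields sat in the data.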

TikTok has repeatedly been criticized for restricting the reach of certain users or videos, especially through so-called shadowbanning. In these cases, videos or comments are not deleted, but they become invisible to everyone except the banned users themselves. The corresponding category in TikTok's internal moderation guidelines was "visible to self".

Shadowbanning can be a way to keep bullies and trolls at bay. But the intervention is hard to detect, both for those affected and for outsiders. Banned users only notice that their videos are rarely viewed or that no one replies to their comments.

A quarter of all 10-year-olds use the app

Recent surveys show how many children are actually on TikTok. In Denmark, 4 out of 10 children between 9 and 14 years of age use the app every day, according to a recent survey by Danish broadcaster DR. In Germany, according to a Bitkom study, a quarter of all 10- to 11-year-olds used the app; among 12- to 13-year-olds, it was almost a third. Those figures are from 2019 and are likely to be even higher by now.

Now the Danish data protection authority has intervened. It is investigating whether TikTok may be violating data protection laws there. In Denmark, users under the age of 13 may only sign up for an app such as TikTok with their parents' consent. TikTok is not doing enough to ensure this, a lawyer for the authority told DR.

Data protection authorities in the Netherlands and France have also initiated proceedings, likewise triggered by the way TikTok handles children's data.

In Denmark, TikTok this week blocked the account of 12-year-old user Lianna Riedel Frank, who had more than 10,000 followers, but only after DR reported on her. She had been using TikTok with her mother's consent.

In the US, TikTok has already had to pay a fine

In the US, TikTok has already paid millions for its careless handling of children's data. In 2019, US authorities fined TikTok a record $5.7 million for collecting data from children under the age of 13 without parental consent.

As part of the settlement, TikTok agreed to set up a special mode for children under 13, available only in the US, and promised to remove all videos uploaded by these users.

However, TikTok has failed to meet these commitments, according to a new complaint filed with the US Federal Trade Commission (FTC).

TikTok is currently on the verge of being banned in the US after Donald Trump issued two executive orders against the company.

More control for guardians

In April, TikTok introduced new controls for parents. In the so-called "accompanied mode", parents can limit their children's screen time, filter videos from the feed, or specify who may send them messages. Since the end of April, only users aged 16 and over have been allowed to send and receive direct messages in the app.

The data protection authorities do see the dilemma facing young people and their parents. "For many users, this is an important way to stay in contact with friends and spend time together during the Corona crisis," the Dutch data protection authority wrote in a statement. At the same time, it notes that children and young people are particularly vulnerable and therefore need particularly strong protection. Preliminary results of the Dutch review are expected in the course of the year.

Would you like more critical reporting?

Our work at netzpolitik.org is financed almost exclusively by voluntary donations from our readers. With an editorial staff of currently 15 people, this enables us to cover many important topics and debates in digital society. With your support, we can explain even more, conduct investigative research much more often, provide more background information, and defend even more fundamental digital rights!

You too can support our work now with your donation.

About the author

Chris Köver

Chris Köver is a journalist. In her work, she researches the connections between digital technologies and social justice, machine learning and discrimination, surveillance and gender, from an intersectional feminist perspective. Chris reports on all these topics for netzpolitik.org and also moderates the netzpolitik.org podcast from time to time. She gives lectures, moderates panels, runs workshops and is happy to share tips on where to see, read and hear other experts in these fields. Contact: email, OpenPGP, Twitter.
Published 08/18/2020 at 3:18 PM