In this article we look at the mistakes made by TikTok’s operator over the past couple of years, mistakes that have led to fines across the globe for unlawfully collecting children’s data. In the UK, privacy protections for children will be strengthened in September, so we have examined TikTok’s processing in the light of the new code.

The Age Appropriate Design Code, also known as the Children’s Code, is a set of standards for the provision of online services likely to be accessed by children, including apps, online games, and web and social media sites. It came into force in September last year, with companies given a 12-month transition period. From September 2021, all companies to whom it applies will be expected to conform to its standards. You can read more about the standards in our earlier blog here.

In 2019, ByteDance were fined $5.7 million by the US Federal Trade Commission (FTC) for collecting personal information from children under the age of 13 via the lip-syncing app Musical.ly. They were also ordered to take down videos uploaded by under-13s and to comply with US children’s privacy legislation going forward. Musical.ly was bought in November 2017 by the Chinese company ByteDance and later rebranded as TikTok. The app lets users share short video clips set to music; it has been installed more than a billion times and in the UK is a firm favourite of ‘Generation Z’ consumers. According to Ofcom, 44% of eight to 12-year-olds in the UK use TikTok, despite its policies forbidding under-13s on the platform.

Failure to obtain Parental Consent

The FTC found that Musical.ly failed to require parental consent for users under 13, neglected to notify parents about how the app collected personal information on underage users, and did not permit parents to request that their children’s data be deleted. Under the new Children’s Code this would breach the standard relating to Parental Controls, which requires companies to provide parents with information about the child’s right to privacy (under the United Nations Convention on the Rights of the Child), as well as resources to support age-appropriate discussion between parent and child.

Public, not Private, by default

TikTok accounts were public by default, so other people could see the content users posted unless they adjusted their privacy settings. The FTC also claimed that even users who set their accounts to private could still be messaged by others. For years it has been widely reported in the media that underage users were being solicited to send nude images on Musical.ly, and later on TikTok as well.

Public default settings for children’s data breach the Default Settings standard in the Children’s Code, which states that settings for children must be ‘high privacy’ by default (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child). The standards also include a commitment to act in the best interests of the child, as defined by the United Nations Convention on the Rights of the Child (UNCRC). By failing to protect children from being messaged by strangers, and by allowing access to children’s contact details in this way, TikTok failed to protect and support their health and wellbeing.

No checks on age of users

TikTok’s operators were aware that a ‘significant percentage’ of the app’s users were younger than 13, and received thousands of complaints from parents whose underage children had created accounts. The app’s website even warned parents to monitor their children’s activity on the app, yet still did not request proof of age. Although the app was modified later in 2017 to prevent people creating an account if they indicated they were under 13, no action was taken to verify the ages of existing users.

The Children’s Code has an Age Appropriate Application standard, which states that companies must apply the code so as to give all children an appropriate level of protection in how their personal data is used. This means establishing what age range individual users fall into, so that the protections and safeguards applied to their personal data can be tailored accordingly using the standards in the Code. The ICO advises that if companies are unsure of the age range of their individual users, the standards should be applied to all users instead, so that all children are protected even when a company cannot be certain whether or not a particular user is a child.

Failure to provide a comprehensible privacy notice

TikTok’s operators were also fined €750,000 by the Dutch Data Protection Authority, this time for failing to provide an understandable privacy statement to Dutch users. The app and its privacy notice were in English and therefore not readily understandable, particularly to children. This meant that TikTok had failed to provide an adequate explanation of how the app collects, processes and uses personal data, infringing the GDPR’s transparency principle, which requires that people be given a clear idea of what is being done with their personal data.

The principle is strengthened by the Transparency standard in the Children’s Code, which sets out further requirements for the privacy notice: it should be specifically designed for children, easy to understand, broken down into bite-sized chunks if necessary, and delivered at the point that personal data is collected. The standard also encourages prompting the child to speak to a parent, depending on the age of the child or the risk posed to the child’s rights and freedoms by the processing.

The true cost to TikTok is likely to be far higher: the company is also facing a €1.5 billion claim by consumer groups demanding that it pay damages to Dutch children. Signatures are being collected in support of the claim from parents of Dutch children who have downloaded the app since May 2018. In India the app has already been banned, and the US indicated just last month that it is considering banning all Chinese apps.

The Children’s Code is a statutory code of practice, and companies who fall within its remit are obliged to comply with its standards. While it applies to ‘information society services likely to be accessed by children’, companies should take note that the definition of such services is broad, with the ICO advising that it includes ‘many apps, programs, connected toys and devices, search engines, social media platforms, streaming services, online games, news or educational websites and websites offering other goods or services to users over the internet. It is not restricted to services specifically directed at children.’

The standards in the Children’s Code are high, but as long as companies like ByteDance are operating, there is a clear need for precise and robust laws to protect the young and vulnerable.

If you have a website and haven’t already checked whether the new Code applies to your organisation, you need to take action now. The Information Commissioner’s Office provides extensive information to help companies determine whether or not it applies to them and what to do if so. At Data Protection Consulting we offer ongoing Data Protection as a Service and have carried out applicability assessments this year for all our clients. For those to whom the Code applies, we have provided detailed guidance on the steps they need to take to comply. If you are interested in receiving this level of support, you can find out more about our DPaaS service here.