TikTok faces landmark legal claim following the illegal collection of millions of children’s personal data
Scott+Scott Partner Tom Southwell and Counsel James Hain-Cole explain the landmark High Court claim against TikTok, and parent firm ByteDance, over the illegal collection of, and profiteering from, children’s personal data.
TikTok has risen to become one of the world’s most popular social media platforms with over 800 million active monthly users across the globe. The platform is especially popular with young people.
During the Covid-19 pandemic, the app helped children keep in touch with their friends during an incredibly difficult year. But behind the entertainment value of the app lies something far more concerning.
As it has grown in popularity, TikTok has come under scrutiny for a number of reasons, including for its excessive collection of children’s personal data. We understand that this includes, but is not limited to, private information such as children’s telephone numbers, bios, videos, pictures, and their exact location, along with biometric (or facial recognition) data.
In December 2020, Scott+Scott filed an opt-out representative action in the High Court of England & Wales against TikTok, and its Cayman Islands-based parent firm ByteDance. The case is being brought by a twelve-year-old girl, who will remain anonymous, on behalf of millions of affected children in the UK and Europe. The former Children’s Commissioner for England, Anne Longfield CBE, is representing her and all of the other children who were impacted, as the appointed litigation friend.
The claim is based on clear evidence that the platform illegally harvests the personal information of these children in direct violation of UK and EU data protection law, including the EU General Data Protection Regulation (GDPR).
We encourage all parents of children that use TikTok to visit the website for more information on how their child may be affected.
Millions of children’s data harvested, while parents are left in the dark
The claim alleges that every child who has used TikTok since 25 May 2018, regardless of whether they have their own TikTok account or what their privacy settings are, may have had their private personal information illegally collected for the benefit of ByteDance, TikTok and unknown third parties, including advertisers.
Put into perspective, TikTok has allegedly collected the personal information of over 3.5 million children illegally in the UK alone, with millions more impacted across the European Economic Area.
TikTok’s business model is predicated on the collection of user data, which is subsequently shared with advertisers who pay TikTok for the right to place adverts (including targeted adverts tailored to the user). Public information suggests that two-thirds of the company’s revenue involves the sale of personal information to advertisers.
TikTok is opaque about who has access to children’s private data, information which is increasingly valuable to the company. ByteDance’s annual gross profit increased by 93% during the pandemic to $19 billion in 2020.
This model is not unique to TikTok, nor is it necessarily wrong for tech companies to sell data to advertisers to enhance user experiences. However, that personal data must be processed in a manner that is compliant with the relevant data privacy laws.
Unfortunately, despite their best efforts, and due to TikTok’s lack of transparency, children and their parents and guardians remain unaware of the extent to which their personal data is being processed when they use the app. TikTok – an app that has a massive appeal to children due to its features and design – continues to profit from their data without fulfilling its legal obligations, and its moral duty to protect children online.
Severe breach of UK and EU data protection law
TikTok, the app and its parent company collect children’s personal data without the sufficient warning, transparency or necessary consent required by UK and EU data protection law.
Evidence indicates that TikTok and ByteDance continue to violate the UK Data Protection Act and the GDPR.
The claim led by Anne Longfield CBE alleges that TikTok has been violating data protection law since at least 25 May 2018 by processing excessive amounts of users’ data without:
(a) having adequate measures in place to prevent children from downloading and/or using the app;
(b) having in place adequate messaging to explain which data was collected and how this was being further processed to facilitate informed decision-making by users;
(c) providing the user with adequate transparency about the nature and extent of the processing of their data;
(d) acquiring the relevant and necessary consent of the children’s parents or guardians, or any effective consent; and further or alternatively;
(e) any effective contractual basis or legitimate interest.
Anne Longfield CBE – supported by Scott+Scott – is fighting on behalf of children and parents to stop TikTok illegally processing millions of children’s information and demanding that the company deletes children’s personal information.
If successful, the claim also aims to win compensation for the millions of affected children, which could be as much as thousands of pounds per child – equating to billions of pounds in damages owed by TikTok.
TikTok’s track record
TikTok has been the subject of intense regulatory scrutiny, revealing a troubling pattern of disregard for child data protection law by the company.
In 2019, TikTok was issued a record $5.7 million fine by the US Federal Trade Commission (FTC) following allegations that the platform violated the parental consent terms of the Children’s Online Privacy Protection Act (COPPA).
The FTC maintains that TikTok breached COPPA rules by failing to notify parents about the app’s collection and use of personal information from users aged under 13, and by failing to delete personal information following requests from parents.
The fine remains the largest civil penalty ever obtained by the FTC in a children’s privacy case and was quickly followed by an additional fine by the Korea Communications Commission (KCC) following 6,000 breaches of local data privacy laws.
In the UK, members of Parliament’s Business, Energy and Industrial Strategy Committee have expressed concern surrounding information sharing between TikTok users in the UK and ByteDance, which could be subject to China’s National Intelligence Law.
The substantial international fines and broader concerns around the ultimate beneficiaries of user data align with our claim that TikTok continues to display a blatant disregard for child data protection policies.
Supreme Court: Lloyd v Google
Although it was filed in the High Court of England & Wales in December 2020, the claim is currently stayed pending judgment by the Supreme Court in the Lloyd v Google appeal. We expect the judgment to be published later this year.
The Supreme Court’s decision is likely to set an important precedent for similar data-related representative actions, including the claim against TikTok.
Regardless of the outcome of the Lloyd v Google case, however, we remain committed to holding TikTok to account for its illicit conduct.
As the case continues, we hope that TikTok gives serious consideration to the gravity of the concerns of millions of parents and takes considerable steps to improve its practices in light of the issues raised by the action.
Learn more about our fight for justice for the 3.5 million children impacted by one of the world’s most troubling data policy violations: https://tiktokdataclaim.uk/