A UK High Court judge has granted permission for a class-action style privacy lawsuit to proceed against TikTok over its handling of children’s data.
The lawsuit was filed back in December 2020 by a then-12-year-old girl who has been granted anonymity by the court to bring the claim that the social networking site is processing children’s data unlawfully.
The suit is seeking damages on behalf of millions of children for alleged abuse of their information — and if the legal action succeeds, TikTok could be on the hook to pay billions of dollars in compensation.
TikTok was contacted for a response to the U.K. litigation proceeding.
The claimant is being supported in the litigation by the (now) former Children’s Commissioner for England, Anne Longfield, who argues TikTok has broken U.K. and EU data protection law. She has said she wants the suit to ensure greater protection for under-16s who are using the service in England. The suit is being brought under the General Data Protection Regulation (GDPR) and the U.K.’s Data Protection Act 2018.
In a statement, Longfield said: “We are pleased the Judge agrees that TikTok’s troubling and, we believe, illegal collection of children’s private information is a serious issue that should be tried in the High Court. TikTok’s data collection practices and structures are deliberately opaque and the final destination of the information gathered from children in the UK and Europe is not clear.
“Today’s decision gives us permission to take the necessary steps to serve TikTok’s overseas entities, including in China, the Cayman Islands and the US, which we hope will help shed light on their actions and give clarity to parents and children around the world.”
“We look forward to helping millions of children stand up to this social media giant’s shadowy practices in court,” she added.
In another supporting statement, the claimants’ counsel at Scott+Scott, James Hain-Cole, said: “Complex cases like this often face delays, particularly with overseas defendants. While we note some of the comments from the Judge, we are pleased that we can move forward with this important case and represent the millions of affected children across the UK and Europe.”
The decision to allow the suit against TikTok to proceed was by no means a given. The action had been stayed pending the outcome of another class-action style privacy litigation — brought against Google, in relation to a Safari settings workaround (aka Lloyd v Google), which was also seeking representative damages for privacy harms.
However, last November, Google won its appeal when the Supreme Court declined to allow privacy damages to be claimed collectively — killing off that claim and damaging the prospects of similar representative privacy litigation. So there were doubts over whether the TikTok privacy class action would be allowed to proceed. (Although the Lloyd v Google case predated GDPR coming into force.)
In the event, the judge has allowed the TikTok suit to be served against a number of linked corporate entities (most of which are based outside the U.K.) that the claimants argue are implicated in TikTok’s processing of children’s data.
However, TikTok’s U.K.-based entity previously filed an application for summary judgment on the class-action style claim — so another ruling is expected in the coming months, which could lead to the case being tossed if the court agrees with its counterarguments.
“The issue is primarily (if not exclusively) an issue of law and the proper interpretation of the GDPR and whether the Supreme Court’s conclusion in Lloyd -v- Google can properly be distinguished,” wrote Mr Justice Nicklin of the Queen’s Bench Division of the High Court in a ruling issued today. “I can readily see the arguments that could be advanced by the Defendants, but they have not yet been developed.”
“The test is whether the argument advanced by the Claimant in relation to Lloyd -v- Google has a real prospect of success. Having heard only the Claimant’s argument properly argued, I consider that she has satisfied me at this stage, and expressly on that basis, that there is a serious issue to be tried on this point,” he added.
There is a further complication with the suit: The claimants had sought more time to serve all the defendants, but the judge has not allowed this. As a result, they are unlikely to be able to serve the Beijing-based entity, which the claimants allege is behind the TikTok algorithm, within the allotted time remaining, potentially limiting the scope of the litigation.
Since the TikTok suit was filed, a Children’s Code has come into force in the U.K., requiring digital services that are likely to be accessed by children to comply with a set of design standards intended to prioritize privacy and safeguard minors from being tracked and profiled. Platforms that fail to respect the code risk drawing wider GDPR scrutiny (and fines) from the U.K.’s Information Commissioner’s Office.
Tech platforms operating in the U.K. are also facing a major new content oversight regime with a strong child protection component fast coming down the pipe: The government is in the process of pushing the Online Safety Bill through parliament and that legislation is also linked to meaty fines for violations.
Elsewhere in the region, TikTok remains under scrutiny by the European Commission following a series of consumer protection and child safety complaints last year, and in the wake of an emergency data protection procedure instigated in Italy following reports of the death of a child.
An EU data protection investigation of TikTok’s handling of children’s information, originally highlighted by Italy’s DPA, remains ongoing, with Ireland’s Data Protection Commission now the lead authority for that procedure.
A similar compensation-seeking children’s data suit has also been filed against TikTok in the Netherlands.