EU begins formal investigation of TikTok over potential violations of Digital Services Act
The European Commission has opened formal proceedings to assess whether TikTok may have breached the European Union's Digital Services Act (DSA) in areas linked to the protection of minors, advertising transparency, data access for researchers, and the risk management of addictive design and harmful content.
The formal investigation adds to the privacy and safety concerns that have long plagued the video-sharing platform, giving enterprises yet another reason to consider banning its use by employees on devices that access corporate networks. The Commission had previously conducted a preliminary investigation and risk assessment that found further oversight to be necessary.
The DSA is a somewhat controversial content-policing law that sets out rules for how internet companies must keep European users safe from online disinformation and illegal content, goods, and services. The DSA bans the practice of targeting users online based on their religion, gender, or sexual orientation, as well as dark patterns, the deceptive web design practices aimed at nudging people into unwittingly clicking on online content.
TikTok and its China-based parent company ByteDance are no strangers to controversy over the potential dangers of the platform, which remains enormously popular even though it’s been deemed a threat to national security by Brendan Carr, a commissioner on the US Federal Communications Commission (FCC). In June 2022 he wrote to Apple and Google to request that both companies remove the app from their stores over its collection of sensitive data from people in the United States. However, it remains available on both stores.
The European Commission itself even banned its staff from using TikTok, requiring them to remove it from their phones and devices in the name of protecting data and increasing cybersecurity. The French government imposed a similar ban in March 2023.
Depending on the outcome of the Commission's formal investigation, such bans may be worth considering for other enterprises and government entities worried about data privacy and security, even as some government officials, including US President Joe Biden, have embraced TikTok to promote themselves or their agendas.
The Commission will investigate whether TikTok has fulfilled its obligations under the DSA.
These obligations include the assessment and mitigation of "systemic risks," such as negative effects arising from algorithmic systems "that may stimulate behavioural addictions and/or create so-called 'rabbit hole effects'" by recommending content that draws users down a path of compulsive TikTok use or disinformation. Such effects can harm a person's physical and mental health, especially in the case of children, according to the Commission.
“Such assessment is required to counter potential risks for the exercise of the fundamental right to the person’s physical and mental well-being,” the Commission said.
It is also concerned that the measures TikTok has put in place to protect minors from inappropriate content, such as age verification tools, may be ineffective, and that the app's default settings may not be sufficient to ensure a high level of privacy, safety, and security for minors.
The Commission will also investigate TikTok's compliance with its obligation to provide a searchable and reliable repository of ads presented on the platform, as well as the measures TikTok has taken to increase transparency, citing "suspected shortcomings in giving researchers access to TikTok's publicly accessible data," according to a Commission statement.
TikTok and ByteDance have long faced regulatory scrutiny over allegedly shady data-collection and other practices, drawing widespread criticism, industry- and company-specific bans, and regulatory attempts to prevent potential misuse.
ByteDance, alongside Apple, Microsoft, Google, and Meta, was even designated a "gatekeeper" in September 2023 under the EU's Digital Markets Act (DMA), which is aimed at limiting the power of large tech corporations.
As a gatekeeper, a designation ByteDance has challenged so far without success, the company has until March 7 to comply with the full set of DMA obligations, which cover access controls for personal data, advertising transparency, and an end to both self-preferencing of services and certain restrictive app-store requirements for developers.
The Commission plans to continue gathering evidence and will consider further enforcement steps, such as interim measures, as the investigation proceeds. There is no legal deadline for the investigation to end; its duration depends on the complexity of the case and the extent of ByteDance's cooperation, according to the Commission.