European Commission announces proceedings against TikTok under the Digital Services Act

The action follows a preliminary investigation, including a review of TikTok's risk assessment report

The European Commission has opened formal proceedings to assess whether TikTok has breached its obligations under the Digital Services Act (DSA).

The Commission's regulatory action targets specific concerns related to the protection of minors, advertising transparency, data access for researchers, and the management of potentially addictive design and harmful content. The proceedings follow a preliminary investigation that included an analysis of the risk assessment report TikTok submitted in September 2023 and the company's replies to the Commission's formal requests for information on several critical issues.

The investigation will focus on multiple areas of concern under the DSA, including systemic risks, user protection, minors’ safety, advertising transparency, and research data access.

Systemic Risks and User Protection

The Commission will evaluate TikTok's compliance with its obligations to assess and mitigate systemic risks that could harm users' physical and mental well-being. This includes examining whether TikTok's design, including its algorithmic systems, fosters behavioural addictions or steers users toward harmful content through so-called "rabbit hole effects."

Minors' Safety

The proceedings will review TikTok's measures for protecting minors, focusing on the effectiveness of default privacy settings, age verification tools, and safety measures designed to shield young users from inappropriate content.

Advertising Transparency

The Commission is also scrutinizing TikTok's compliance with the DSA obligation to maintain a searchable and reliable advertisement repository. The aim is to ensure that advertising content is transparent and clearly distinguishable to users.

Research Data Access

The Commission will also investigate TikTok's transparency measures, specifically its compliance with the requirement to provide researchers with access to publicly available data, as mandated by the DSA.

“The safety and well-being of online users in Europe is crucial. TikTok needs to take a close look at the services they offer and carefully consider the risks that they pose to their users - young as well as old,” said Margrethe Vestager, Executive Vice-President for a Europe Fit for the Digital Age. “The Commission will now carry out an in-depth investigation without prejudice to the outcome.”

The Commission will continue to gather evidence, which may include additional requests for information, interviews, and inspections. The DSA does not set a fixed legal deadline for concluding formal proceedings; the duration of the investigation will depend on factors such as the complexity of the case and the extent of TikTok's cooperation. The opening of proceedings also empowers the Commission to take further enforcement steps if necessary, including interim measures and non-compliance decisions.
