The European Commission has officially launched an investigation into TikTok for suspected violations of the Digital Services Act (DSA), focusing on concerns about its addictive design, inadequate age verification, privacy lapses, and potential breaches related to harmful content. The commission is particularly interested in TikTok's handling of minors, transparency in advertising policies, and its approach to mitigating risks associated with product design and content. The DSA, whose obligations for the largest platforms have applied since August 2023, holds major tech platforms accountable for the content they host, with a specific emphasis on child safety, privacy, and combating disinformation. TikTok, designated a Very Large Online Platform (VLOP) because of its extensive user base in Europe, could face fines of up to 6% of its global annual revenue if found non-compliant.
TikTok responded by emphasizing its efforts to protect teens and restrict access for children under 13, and expressed a commitment to collaborating with experts and the industry to enhance safety measures for young users. The investigation follows earlier actions by the European Commission, which sought details from TikTok and Meta about their handling of disinformation, particularly content related to the Israel-Hamas conflict. Thierry Breton, the European Commissioner for the Internal Market, said the probe addresses a "suspected breach of transparency and obligations to protect minors." The commission aims to gather evidence through requests for information, interviews, and potential inspections, and has the power to penalize TikTok or accept commitments that remedy the issues identified.