The EU has asked TikTok and YouTube for more information on how they protect children

The European Commission has sent another pair of formal requests for information to major platforms subject to the bloc’s rebooted online content governance and moderation rulebook, the Digital Services Act (DSA).

The latest requests, focused on child safety, were sent to TikTok and YouTube.

“The Commission asks companies to provide more information on the measures they take to fulfill their obligations related to the protection of minors under the DSA, including obligations related to risk assessments and mitigation measures to protect minors online, especially regarding the risks to mental and physical health, and the use of their services by minors,” the Commission wrote in a news release.

The EU has given the companies until November 30 to provide the requested data. Regulators will then assess next steps – which could include opening a formal investigation.

The DSA establishes a governance framework that requires platforms to respond to reports of illegal content or products. Larger platforms face additional responsibilities related to algorithm-driven features such as recommendation engines, including carrying out risk assessments and applying mitigation measures in relation to the safety and well-being of children.

The regulation also expressly prohibits targeted advertising to minors.

Confirmed breaches of the DSA can attract fines of up to 6% of global annual turnover. Penalties may also be issued for failure to provide data upon request.

This is the second such information request the EU has sent to TikTok since the regime began to apply to the company. Last month the EU asked the video-sharing platform about “general aspects” of the protection of minors, as well as requesting information on its response to the Israel-Hamas war.

The Commission’s follow-up request to TikTok on child protection suggests it is seeking more detail on how the platform fulfills its obligations to protect minors.

Last month the Commission sent information requests to Meta and X (formerly Twitter), following reports about the spread of terrorist content, hate speech and disinformation referring to the Israel-Hamas war. In the case of Meta the EU also requested information on its approach to election security.

The DSA is already being enforced on so-called very large online platforms (VLOPs), a designation that applies to all four of the aforementioned tech giants.

The full list of 19 VLOPs and VLOSEs (aka very large online search engines) was announced by the Commission in April.

So far, the EU has not opened any formal investigation under the DSA. But the blitz of information requests since the regulation became enforceable on VLOPs at the end of August suggests it may be preparing to take that next step.

The Commission has also recently called on Member States to designate the national bodies which, from early next year, will be responsible for overseeing DSA compliance by smaller digital services based in their countries, as well as helping to support the Commission’s enforcement of the regime on tech giants.

TikTok and YouTube have been contacted for comment on the Commission’s latest DSA information request.
