In a significant move reflecting the European Union’s (EU) assertive stance on digital regulation, the European Commission has formally launched investigations into two tech giants, YouTube and TikTok. This development aligns with the EU’s ongoing efforts to impose stricter controls on the technology sector through the Digital Services Act (DSA), whose obligations for the largest online platforms took effect in August.
The EU’s executive branch confirmed that it has issued formal requests for information from YouTube and TikTok, setting a deadline of November 30 for their responses. The primary focus of these investigations is to understand the measures implemented by these platforms to comply with the DSA, with a particular emphasis on the protection of children.
How The Rules Apply
The commission has explicitly sought details on the companies’ “obligations related to risk assessments and mitigation measures to protect minors online, especially concerning the risks to mental and physical health, and on the use of their services by minors.” This underscores the EU’s commitment to ensuring that digital platforms prioritize the well-being and safety of younger users.
The DSA, a cornerstone of the EU’s regulatory framework, encompasses a comprehensive set of measures designed to govern the tech industry. One of its pivotal mandates is to compel tech firms to take proactive measures against disinformation, illegal content, and harmful materials. Additionally, the legislation prohibits advertising targeted at minors, meaning users under the age of 18.
Under the provisions of the DSA, platforms found in violation could face substantial fines, reaching up to 6% of their global turnover. This stern approach highlights the EU’s determination to hold tech companies accountable for any lapses in compliance with the newly enacted regulations.
TikTok, a social media platform popular among younger users and owned by the Chinese company ByteDance, and YouTube, owned by Google’s parent company Alphabet, now find themselves under regulatory scrutiny. Both companies have a limited timeframe to provide detailed information regarding their adherence to the DSA.
These investigations mark a proactive step by the European Commission to ensure that major digital platforms align with the regulatory framework set out by the DSA. The subsequent actions will be determined by the assessment of the responses received from YouTube and TikTok, potentially including the formal initiation of proceedings if their responses are deemed inadequate or non-compliant.
Child Protection As The Priority
The significance of child protection within the DSA framework was emphasized by Thierry Breton, the EU commissioner in charge of the internal market, who declared in August that “child protection will be an enforcement priority” under the new legislation.
This move follows a previous action where the commission sought information from TikTok on October 19. The inquiry at that time focused on concerns related to the spread of terrorist and violent content, hate speech, and the alleged dissemination of disinformation, particularly in the context of the conflict between Israel and Hamas.
As the EU continues its pursuit of stringent regulatory measures for the tech industry, the investigations into YouTube and TikTok underscore a commitment to ensuring that digital platforms prioritize user safety, especially for younger users. The evolving landscape of digital regulation highlights the challenges tech companies face as they navigate an increasingly complex compliance environment, with potential global implications for the industry.