Snapchat Faces EU Child Safety Probe Under Digital Services Act
2026-3-27 06:18:59 | Author: thecyberexpress.com

The European Commission has launched a formal DSA child protection investigation into Snapchat, examining whether the platform is meeting its obligations to ensure a high level of safety, privacy, and security for minors.

The move comes under the framework of the Digital Services Act (DSA), which sets strict standards for online platforms operating in the European Union and can impose fines of up to 6% of global annual turnover for non-compliance.

Age Assurance Under Digital Services Act Scrutiny

At the center of the DSA child protection investigation is Snapchat’s approach to age assurance. According to its terms, users must be at least 13 years old to access the platform. However, the Commission suspects that Snapchat’s reliance on self-declaration is insufficient.

The Commission raises concerns that this method neither prevents children under 13 from accessing the service nor adequately verifies whether users are under 17, which is necessary to ensure age-appropriate experiences.

There are also concerns that tools to report underage users may not be easily accessible within the app.

The investigation also focuses on the risk of minors being exposed to grooming attempts and recruitment for criminal purposes. The Commission suspects that Snapchat may not be doing enough to prevent users with harmful intent from contacting children, particularly in cases where individuals misrepresent their age or manipulate their profiles.


This includes concerns around exposure to harmful content, conduct, and contact that could place minors at risk.

Default Settings And Privacy Concerns 

Another key area under the DSA child protection investigation is Snapchat’s default account settings. The Commission believes that the platform may not provide sufficient privacy, safety, and security protections for minors by default.

Features under scrutiny include the “Find Friends” system, which recommends other users to connect with, and push notifications, both of which remain enabled by default.

The Commission also notes that users may not receive adequate guidance during account creation on how to manage privacy and safety settings, or how to adjust them effectively.

Illegal Content And Reporting Mechanisms Under Review

The investigation further examines whether Snapchat is effectively preventing the dissemination of illegal content, including information related to the sale of drugs and age-restricted products such as alcohol and vapes.

Under the DSA, platforms are required to mitigate systemic risks arising from their services. The Commission suspects that current content moderation measures may not be sufficient to block or limit access to such content, especially for younger users.

Reporting mechanisms for illegal content are also part of the Digital Services Act child protection investigation. The Commission raises concerns that these systems may not be easy to access or user-friendly and could involve design practices that make reporting less straightforward.

There are also concerns that users may not be properly informed about complaint procedures or available redress options within the platform.

Next Steps in DSA Child Protection Investigation

The European Commission will now conduct an in-depth investigation by gathering further evidence, including requesting information from Snapchat and conducting interviews or inspections.

The opening of formal proceedings allows the Commission to take further enforcement actions, including adopting interim measures or issuing a non-compliance decision. It can also accept commitments from Snapchat to address the issues identified during the investigation.

The action against Snapchat builds on broader regulatory efforts under the Digital Services Act to strengthen online child protection across platforms.

The Commission has used its 2025 DSA Guidelines on the protection of minors as a benchmark for evaluating compliance, emphasizing that self-declaration alone should not be considered a reliable age assurance method and that default settings should offer the highest level of protection for minors.

“From grooming and exposure to illegal products to account settings that undermine minors’ safety, Snapchat appears to have overlooked that the Digital Services Act demands high safety standards for all users. With this investigation, we will closely look into their compliance with our legislation,” said Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy.

Age Verification Under Question

In a related development, the European Commission has also taken preliminary action against adult content platforms including Pornhub, Stripchat, XNXX, and XVideos under the Digital Services Act. The Commission found that these platforms may have failed to adequately protect minors from accessing pornographic content.

It noted that their risk assessments did not sufficiently identify or evaluate risks to children, and in some cases, placed more emphasis on business considerations than on child safety.

“In the EU, online platforms have a responsibility. Children are accessing adult content at increasingly younger ages and these platforms must put in place robust, privacy-preserving and effective measures to keep minors off their services. Today, we are taking another action to enforce the DSA – ensuring that children are properly protected online, as they have the right to be,” said Virkkunen.

The findings also indicate that these platforms rely heavily on self-declaration for age verification, which the Commission considers ineffective. Additional measures such as content warnings, page blurring, or “restricted to adults” labels were also deemed insufficient to prevent minors from accessing harmful material. The Commission has suggested that more robust, privacy-preserving age verification methods are required to address these risks.

As part of ongoing proceedings, these platforms will have the opportunity to respond to the Commission’s findings and take corrective measures.

If the breaches are confirmed, the Commission may issue a non-compliance decision, which could result in significant financial penalties or enforcement actions to ensure compliance.

The broader enforcement push reflects a clear regulatory direction under the Digital Services Act, with authorities focusing on ensuring that platforms, regardless of size, take stronger responsibility for protecting minors online.


Article source: https://thecyberexpress.com/dsa-child-protection-investigation/