EU launches probe into Meta's handling of child safety on Facebook and Instagram
Meta is once again facing scrutiny in the European Union, this time over its approach to safeguarding children. The European Commission (EC) has opened formal proceedings to determine whether the parent company of Facebook and Instagram violated the Digital Services Act (DSA). The concern is that Meta may have fueled social media addiction among children and failed to ensure robust safety and privacy measures.
The Commission's official press release reads:
The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioral addictions in children, as well as create so-called 'rabbit-hole effects'. In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta.
The EC's investigation will focus on whether Meta effectively evaluates and mitigates risks stemming from its platforms' interfaces. The Commission is concerned that Meta's designs may exploit the weaknesses and inexperience of minors, fostering addictive behavior and reinforcing the so-called "rabbit-hole" effect.
Such an assessment is crucial to mitigate potential risks to children's physical and mental well-being and to ensure their rights are respected.
The investigation will also examine whether Meta takes the necessary measures to block minors from accessing inappropriate content, provides effective age-verification tools, and equips minors with simple yet robust privacy tools, such as default privacy settings.
The DSA, which came into effect for all online platforms on February 17, 2024, requires very large online platforms and search engines to implement additional measures to combat illegal online content and safeguard public safety.
Meta responded to the formal proceedings by highlighting features like parental supervision settings, Quiet Mode, and automatic content restrictions for teens. In a statement to the tech outlet Engadget, a Meta spokesperson said:
We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.
Meta isn't the sole tech company facing scrutiny in the EU. Earlier this year, the EC initiated a similar probe into whether TikTok breached online content regulations aimed at protecting children and ensuring transparent advertising.