EU launches probe into Meta’s handling of child safety on Facebook and Instagram

Meta is once again facing scrutiny in the European Union, this time over its approach to safeguarding children. The European Commission (EC) has launched formal proceedings to determine whether the parent company of Facebook and Instagram violated the Digital Services Act (DSA). The concern is that Meta may have fueled social media addiction among children and neglected to ensure robust safety and privacy measures.

The EC’s investigation will focus on whether Meta effectively evaluates and addresses risks from its platforms’ interfaces. The EC is troubled by how Meta’s designs might exploit the vulnerabilities and lack of experience among minors, leading to addictive behaviors and reinforcing the so-called “rabbit hole” effect.

Such an assessment is crucial to mitigate potential risks to children’s physical and mental well-being and to ensure their rights are respected.

The investigation will also delve into whether Meta implements necessary measures to block minors from accessing inappropriate content, provides effective age verification tools, and equips minors with simple yet robust privacy tools, like default settings.

The DSA, which came into effect for all online platforms on February 17 this year, requires very large online platforms and search engines to implement additional measures to combat illegal online content and safeguard public safety.
