The European Union (EU) has opened a formal investigation into Meta over concerns that its platforms, Instagram and Facebook, aren’t doing enough to protect the mental and physical health of children.
The EU says Meta may have breached the Digital Services Act (DSA) — an EU law that began applying to the largest online platforms in 2023 and makes digital companies liable for disinformation, shopping scams, child abuse, and other forms of harm online.
“Today we open formal proceedings against Meta,” the EU commissioner for the internal market, Thierry Breton, said in a statement on the EU’s official website. “We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram.”
The investigation will explore the “rabbit hole” effect of the platforms, where the algorithm feeds children negative content. The EU will also look at the effectiveness of Meta’s age verification tools and privacy for minors. “We are sparing no effort to protect our children,” Breton said.
The probe will also look into whether Meta’s content suggestions and default privacy settings adequately safeguard the privacy, safety, and security of minors.
“We have concerns that Facebook and Instagram may stimulate behavioural addiction and that the methods of age verification that Meta has put in place on their services are not adequate, and [we] will now carry out an in-depth investigation,” said Margrethe Vestager, executive vice-president for a Europe fit for the digital age.
If the breaches are proven, Meta would be in violation of the DSA and could face a fine of up to 6% of its global annual revenue. There is no set deadline for the proceedings, and the EU retains the authority to impose interim enforcement measures against Meta while the investigation is ongoing.
Meta has been stepping up its efforts to boost child safety on Facebook and Instagram. It has rolled out new features that restrict kids’ access to harmful content and limit their interaction with “suspicious” adult accounts.