EU Commission Opens Probe into Facebook’s Safety Measures for Kids

The European Union (EU) has initiated an investigation into the practices of tech giant Meta concerning its social media platforms Facebook and Instagram. The investigation follows concerns that Meta is not adequately protecting children online and may be in breach of EU law.

The probe is being led by the European Commission, the executive branch of the EU, which said in a press release that it had opened formal proceedings against Meta to assess whether the company founded by Mark Zuckerberg may have violated the EU’s Digital Services Act (DSA) in the area of safeguarding minors.

This act requires very large online platforms and search engines to implement various measures to protect children online, including preventing them from accessing inappropriate content and ensuring a high level of privacy and safety.

Any failure to adhere to these rules could result in companies being fined up to 6% of their global revenue.

The commission expressed concerns that the systems behind both Facebook and Instagram, including the algorithms that recommend videos and posts, may lead to addiction among children and create so-called “rabbit-hole effects.”

EU officials are also worried about the age verification methods put in place by Meta, which include requiring users to upload an image of their government-issued ID, record a video selfie, or ask mutual friends to verify their age.

The investigation will focus on multiple areas, including Meta’s compliance with DSA obligations regarding the assessment and mitigation of risks caused by the “design of Facebook’s and Instagram’s online interfaces,” which the commission said “may exploit the weaknesses and inexperience of minors and cause addictive behavior.”

In addition, the probe will examine whether Meta has complied with DSA obligations to ensure a high level of privacy, safety, and security for minors, particularly with regard to the default privacy settings on minors’ accounts.

The investigation is based on a preliminary analysis of a risk assessment report submitted by Meta in September 2023, according to the commission.

In a statement on Thursday, Margrethe Vestager, executive vice president for a Europe Fit for the Digital Age, said that the EU Commission had concerns that Facebook and Instagram “may stimulate behavioral addiction” and that the age verification methods implemented by Meta were inadequate.

“Today we are taking another step to ensure safety for young online users,” Ms. Vestager said. “We want to protect young people’s mental and physical health.”

Elsewhere, Thierry Breton, European commissioner for the Internal Market, said that officials were “not convinced” that Meta had done enough to meet its obligations under the DSA to mitigate the risks its platforms pose to the physical and mental health of young Europeans.

Mr. Breton promised an “in-depth” investigation into the potentially addictive nature of Meta’s platforms, as well as its age verification tools and the level of privacy afforded to minors.

“We are sparing no effort to protect our children,” he said.

The latest probe comes shortly after the European Union launched an investigation into whether Meta was failing to counter disinformation ahead of EU elections in June.

In February, the commission also began probing video-sharing platform TikTok over potential non-compliance with the Digital Services Act and possible failures to address the negative effects of its app on young people.

As with the Meta probe, the TikTok investigation will focus on whether the platform, which is owned by Chinese parent company ByteDance, ensures safety and privacy for minors; whether its algorithms may stimulate addiction; and whether TikTok complies with various DSA requirements, including an obligation to conduct and submit a risk assessment report before deploying certain functionalities.

Meta also faces lawsuits arguing that the company’s apps promote body dysmorphia and expose underage users to potentially harmful content. The company has said such complaints misrepresent its work over the past decade to bolster the safety of teenagers online, pointing to the various support tools it has in place for parents and young users.
