Meta Platforms, the parent company of Facebook and Instagram, is under a major investigation by the European Union for alleged breaches of the bloc's stringent online content law concerning child safety risks. The European Commission, the EU's executive body, announced the probe on Thursday, focusing on whether Meta's platforms foster behavioral addictions in children, along with privacy concerns linked to the company's recommendation algorithms.

The investigation stems from the EU's Digital Services Act (DSA), a groundbreaking law aimed at curbing harmful online content. The Commission expressed concerns over Meta's age verification methods and the potential "rabbit-hole effects" of its platforms. "We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them," a Meta spokesperson told CNBC via email. "This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission."

The Commission's decision to initiate an in-depth investigation follows a preliminary analysis of a risk assessment report submitted by Meta in September 2023. Thierry Breton, the EU's commissioner for the internal market, stated that the regulator is "not convinced [that Meta] has done enough to comply with the DSA obligations to mitigate the risks of negative effects on the physical and mental health of young Europeans on its platforms."

The DSA probe allows the EU to take further enforcement steps, including interim measures and non-compliance decisions. The Commission can also gather evidence through information requests, interviews, or inspections. Companies found in violation of the DSA can be fined up to 6% of their global annual revenues. The EU has yet to issue fines to any tech giants under the new law, but the ongoing investigations signal a rigorous enforcement approach.

Meta, alongside other U.S. tech giants, has increasingly found itself in the EU's regulatory crosshairs. In December 2023, the EU opened infringement proceedings against X (formerly Twitter) for alleged failures to combat disinformation. Similarly, Meta is being investigated for its handling of election disinformation and its overall compliance with the DSA.

In addition to the EU's actions, Meta faces scrutiny in the U.S. The attorney general of New Mexico has sued the company over allegations that Facebook and Instagram facilitated child sexual abuse, solicitation, and trafficking. Meta has stated it employs sophisticated technology and preventive measures to combat such issues.

The EU's current focus includes examining whether Facebook's and Instagram's interfaces exploit the vulnerabilities and inexperience of minors, potentially leading to addictive behavior. The Commission is particularly concerned about the effectiveness of Meta's age assurance and verification methods.

Despite Meta's efforts, the EU remains unsatisfied with the measures taken so far. "We are sparing no effort to protect our children," Commissioner Breton added.