Meta Faces $375 Million Penalty for Misleading Child Safety Practices

A court in New Mexico has ordered Meta to pay $375 million (£279 million) for misleading users about the safety of its platforms for children.

The jury found that Meta, which owns Facebook, Instagram, and WhatsApp, was liable for how its platforms endangered children and exposed them to sexually explicit material and contact with sexual predators.

New Mexico Attorney General Raul Torrez described the verdict as 'historic,' stating it is the first time a state has successfully sued Meta over child safety issues.

In response, a spokesperson for Meta, the company led by CEO Mark Zuckerberg, said it disagrees with the verdict and intends to appeal. The spokesperson emphasized that Meta works hard to keep users safe and is transparent about the challenges of removing harmful content.

The jury ruled that Meta violated New Mexico's Unfair Practices Act, misleading the public about its platforms' safety for young users. The seven-week trial exposed internal Meta documents and testimonies revealing the company was aware of child predators using its platforms.

Former Meta engineer Arturo Béjar testified that his own daughter was propositioned for sex on Instagram and shared internal research indicating that approximately 16% of Instagram users reported being shown unwanted nudity or sexual content in just one week.

The $375 million total stemmed from numerous individual violations of the act, each carrying a maximum penalty of $5,000. The ruling comes amid Meta's ongoing efforts to improve child safety, including the introduction of Teen Accounts on Instagram and measures to alert parents when teens search for self-harm content.

Meta also faces a separate 2024 trial in Los Angeles, in which a young woman alleges that she became addicted to platforms including Instagram and YouTube during her youth, challenging their design choices. Various lawsuits raising similar issues are proceeding through US courts.

New Mexico's lawsuit focused on how Meta's recommendation algorithms allegedly directed young users towards sexually explicit content. Attorney General Torrez said Meta executives knowingly allowed their products to harm children and ignored warnings from employees, asserting, 'Today the jury joined families, educators, and child safety experts in saying enough is enough.'