Meta, the parent company of Facebook, Instagram, WhatsApp, and Threads, has been ordered to pay $375 million after a jury found it violated consumer protection laws by misleading users about the safety of its platforms and enabling child sexual exploitation.
The verdict was delivered on Tuesday in New Mexico, marking the first time a jury has ruled against the company on such claims.
The case, brought by the state’s attorney general, accused Meta of falsely presenting Facebook, Instagram and WhatsApp as safe for children while failing to adequately address harmful content and exploitation risks.
The jury found the company liable for engaging in unfair and deceptive trade practices under New Mexico's consumer protection law.
What they are saying
The ruling underscores growing scrutiny of Big Tech over user safety, particularly the protection of children on social media platforms.
Raúl Torrez, New Mexico Attorney General, described the verdict as a landmark decision against corporate negligence.
- “This is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety.”
He added that the scale of the penalty should serve as a warning to technology firms.
- “The substantial damages the jury ordered Meta to pay should send a clear message to big tech executives that no company is beyond the reach of the law.”
In response, Meta said it disagreed with the outcome and plans to challenge the decision.
- “We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content,” the company said.
More insights
The case stems from a 2023 undercover investigation conducted by the New Mexico Attorney General’s office, which exposed significant gaps in Meta’s content moderation systems.
Investigators created fake accounts posing as users under the age of 14 on Facebook and Instagram. These accounts were quickly exposed to sexually explicit material and contacted by adults seeking similar content, an operation that ultimately led to criminal charges against several individuals.
According to the state, Meta continued to assure the public that its platforms were safe for children despite internal evidence acknowledging risks tied to sexual exploitation and mental health harm.
The lawsuit also accused the company of deliberately designing features such as infinite scroll and auto-play videos to maximise user engagement, even when such features contributed to addictive behaviour among young users.
- The jury found that Meta committed 75,000 violations of state law and awarded $5,000 per violation, producing the $375 million penalty.
What you should know
The ruling comes as governments across the world intensify efforts to regulate social media platforms and protect younger users from online harm.
In Nigeria, the Federal Government has begun consultations on introducing age restrictions for social media use, aimed at strengthening online safety for children.
If implemented, Nigeria would join a growing list of countries tightening access to digital platforms for minors.
- Australia has already introduced a ban on social media use for individuals under 16, requiring platforms like Instagram, TikTok and YouTube to enforce stricter access controls.
- Similarly, Indonesia has announced plans to restrict social media access for children under 16, while Denmark is moving to ban social media use for children under 15, with broad political backing.
These measures reflect a global shift toward tighter regulation of social media, as policymakers respond to rising concerns about child safety, mental health, and online exploitation.