
New Mexico Jury Delivers ‘Historic’ Verdict Against Meta Over Platform Safety Failures


A jury in New Mexico has ordered Meta to pay $375 million in civil penalties after concluding that the company misled users about the safety of its platforms and facilitated harm, including incidents of child sexual exploitation. The ruling, delivered on Tuesday, marks the first time a jury has held Meta legally responsible for actions carried out via its platforms.

The case was initiated by the state attorney general’s office in December 2023, following a two-year investigation published by The Guardian earlier that year, which exposed how Facebook and Instagram had been used as hubs for child sex trafficking. That investigation was cited repeatedly in the legal complaint.


New Mexico attorney general Raúl Torrez described the outcome as a landmark moment. “The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety,” he said. He added: “Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.”


Jurors imposed the maximum allowable fine under state law—$5,000 per violation—bringing the total to $375 million for breaches of New Mexico’s consumer protection statutes. The company was found liable on both counts brought under the Unfair Practices Act.


Meta has announced plans to challenge the verdict, criticizing Torrez’s arguments as exaggerated and selective. A spokesperson for the company stated: “We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content,” adding, “We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”


During the trial, internal documents and testimony revealed that both Meta staff and outside child safety specialists had repeatedly raised concerns about dangers on the platforms. Evidence also included details from a 2024 sting operation, known as “Operation MetaPhile,” in which three men were arrested and charged with attempting to exploit minors through Meta’s services and arrange in-person meetings.


The court also heard that Meta’s 2023 move to encrypt Facebook Messenger hindered investigators by limiting access to key evidence. Law enforcement officials and representatives from the National Center for Missing and Exploited Children testified that Meta’s reporting systems were flawed, particularly in cases involving child sexual abuse material. They argued that the company’s reliance on artificial intelligence generated large volumes of low-quality reports, which proved ineffective for investigations.


A second phase of proceedings is scheduled to begin on 4 May, during which the attorney general’s office will pursue additional penalties and seek court-ordered reforms aimed at strengthening protections for minors. Proposed changes include implementing reliable age verification systems, removing predatory users, and limiting encrypted communications that could conceal harmful activity.


Depositions from Meta leadership, including CEO Mark Zuckerberg and Instagram head Adam Mosseri, were presented during the trial. Both acknowledged that harms such as sexual exploitation and mental health impacts are unavoidable at the scale of their platforms, while emphasizing that the company has invested billions of dollars in safety improvements, including the 2024 introduction of Instagram Teen Accounts, which automatically apply protections for users aged 13 to 17.


Meta has long argued that it cannot be held accountable for user-generated content due to protections under Section 230 of the Communications Decency Act. However, a judge rejected the company’s attempt to dismiss the case on those grounds in June 2024, ruling that the lawsuit focused on product design and operational decisions rather than speech.

The trial spanned nearly seven weeks and featured testimony from a wide range of witnesses, including child safety experts and current and former Meta employees. The jury reached its decision after roughly one day of deliberation.


Legal analyst and former New Mexico deputy district attorney John W Day commented on the outcome, telling the Guardian: “It’s a huge win for the New Mexico attorney general. His jury didn’t even deliberate very long.” He added, “This wasn’t surprising, as there’s an undercurrent of resentment and fear and concern among not just families but the community in general, about the invasiveness of social media, and this one certainly opens the floodgates to lots of other litigation and reforms and regulation.”


Meta is also facing a separate lawsuit in Los Angeles, where hundreds of families and school districts accuse major tech companies—including Snap, TikTok, and YouTube—of designing addictive platforms that harm young users. Allegations include links to depression, eating disorders, self-harm, and other mental health issues. While Snap and TikTok have reached settlements, Meta and YouTube continue to contest the claims. The jury in that case is currently deliberating.


An amendment to the report on 25 March 2026 clarified that the New Mexico case represents the first jury trial—rather than a bench trial—to find Meta liable for actions carried out on its platform.

By Flexi Team
