New Mexico Seeks $3.7B Plan to Overhaul Facebook and Instagram

New Mexico attorney general pursues landmark $3.7 billion abatement plan against Meta, demanding major changes to Facebook and Instagram.
In a significant escalation of its legal battle against Meta, New Mexico's attorney general's office is pushing for sweeping reforms to Facebook and Instagram following a substantial jury verdict. Building on its recent $375 million victory against the social media giant, the state is now pursuing an ambitious $3.7 billion abatement plan that would fundamentally reshape how the platforms operate within the state and potentially set precedents for other jurisdictions nationwide.
Attorney David Ackerman, representing New Mexico, presented the case during the second phase of this landmark trial, laying out an extensive framework for how Meta should compensate the state and its residents for alleged harms. The proposed plan represents one of the most comprehensive attempts to regulate a major social media platform through litigation, combining financial penalties with operational mandates that would affect how millions of users experience Facebook and Instagram.
The abatement plan targets multiple areas of concern, with a primary focus on protecting vulnerable populations, particularly minors. The proposal mandates that Meta establish robust funding for mental health providers across New Mexico, recognizing the documented connection between social media use and psychological issues among teenagers. Additionally, the plan calls for substantial investment in law enforcement initiatives and educational programs designed to address the digital safety concerns that have become increasingly prominent in public discourse.
One of the most stringent requirements in the proposed abatement plan involves content moderation and child protection. The plan demands that Meta implement age verification mechanisms across its platforms, a technical challenge that has long been debated in the tech industry. Furthermore, Ackerman argued for an unprecedented 99 percent detection rate for new child sexual abuse material (CSAM), setting an extremely high bar for the company's content moderation capabilities and signaling how seriously the state views the protection of minors online.
The notification restrictions represent another significant intervention into Meta's business practices. Under the proposed framework, Meta would be prohibited from sending notifications to teenage users during school hours and late-night periods—times when such alerts might interfere with education or healthy sleep patterns. This requirement acknowledges the addictive design patterns that have been criticized by researchers, parents, and policymakers alike, demonstrating New Mexico's commitment to addressing the psychological manipulation built into social media algorithms.
This legal action reflects broader national concerns about social media regulation and the role of technology companies in society. New Mexico's aggressive stance contrasts with the more measured approach taken in other states and at the federal level, where legislative efforts to regulate social media have moved slowly. By pursuing these demands through litigation rather than legislation, New Mexico is attempting to achieve through the courts what may be difficult to accomplish through traditional regulatory channels.
The financial component of the proposal is substantial, with the $3.7 billion figure dwarfing the initial jury verdict and representing a comprehensive response to the alleged harms caused by Meta's platforms. The funding breakdown would support infrastructure for mental health services, law enforcement training and resources, and educational initiatives, all areas that state officials argue have been affected by the platforms' practices. This approach moves beyond simple monetary penalties to require the company to actively invest in remediation and prevention.
Mental health providers in New Mexico would receive dedicated funding to address the documented surge in anxiety, depression, and other psychological issues among adolescent users. Law enforcement agencies would gain resources to investigate and combat crimes facilitated through social media, including exploitation and trafficking. Educational institutions and programs would be funded to teach digital literacy and help young people navigate online spaces more safely and responsibly.
The push for age verification represents a particularly contentious demand, as it raises questions about privacy and feasibility. Tech companies have long resisted implementing robust age verification, citing concerns about data collection and regulatory burden. However, New Mexico's attorneys argue that protecting children justifies these requirements, and that Meta's substantial resources make such implementation entirely feasible if the company prioritizes user safety over convenience.
The case highlights fundamental tensions between free expression, corporate autonomy, and public health protection in the digital age. Supporters of New Mexico's approach contend that social media platforms have become too powerful and too influential in shaping young people's behaviors and mental health without adequate oversight or accountability. Critics argue that such demands may set precedents that burden companies with impossible compliance requirements or that the regulatory approach should come through legislation rather than litigation.
This second phase of the trial represents a critical moment in determining how states can use their legal authority to protect residents from corporate practices they deem harmful. The outcome could influence how other states approach similar challenges and whether litigation becomes the primary mechanism for regulating social media platforms in the absence of comprehensive federal legislation. Other state attorneys general are watching this case closely, understanding that a favorable verdict could embolden similar lawsuits nationwide.
The discovery phase of litigation has already revealed internal Meta communications and decision-making processes regarding mental health impacts and youth engagement. Evidence presented during the first phase apparently convinced jurors that Meta's platforms posed genuine public health threats, leading to the substantial $375 million verdict. The second phase builds on this foundation, presenting the court with detailed evidence about potential remedies and the specific harms that funding should address.
The notification restrictions proposed for teenage users represent practical interventions into Meta's algorithmic design. By preventing notifications during school hours and nighttime, the state seeks to reduce the compulsive checking behaviors that research suggests contribute to academic difficulties and sleep disruption. These measures acknowledge the sophisticated psychology underlying social media design, where notifications are carefully timed to maximize engagement and habit formation.
Looking forward, this case will likely influence national conversations about technology regulation and corporate responsibility. Whether the court accepts New Mexico's ambitious demands, modifies them, or rejects them entirely, the case has already demonstrated that social media platforms can face serious legal liability for alleged harms to public health. The detailed nature of the proposed remedies suggests a model for how litigation might be used to achieve specific behavioral and financial outcomes when traditional regulation has stalled or proven insufficient.
Source: The Verge

