Verdicts Against Meta, YouTube Validate Child Safety Concerns
Newslooks / WASHINGTON / J. Mansour / Morning Edition
Juries in two states found Meta and YouTube liable for harm to children. The decisions mark a turning point in holding social media companies accountable. Advocates say the verdicts could spark new lawsuits and regulation.

Meta, YouTube Child Safety Verdict Quick Looks
- Juries find Meta and YouTube liable
- Cases decided in Los Angeles and New Mexico
- Plaintiffs cited addiction and mental health harm
- Advocates call verdicts historic shift
- Companies weighing appeals of the rulings
- Lawsuits focus on platform design
- Section 230 protections sidestepped
- More lawsuits expected nationwide
- Public concern about teen mental health rising
- AI chatbots emerging as next safety concern
Deep Look: Verdicts Against Meta, YouTube Validate Child Safety Concerns
For years, parents, educators, pediatricians and child safety advocates have warned that social media platforms contribute to mental health struggles among young users. This week, juries in two states sided with those concerns, marking a major shift in how courts view the responsibility of technology companies.
In Los Angeles, a jury found Meta and YouTube liable for harm caused to children using their platforms. Meanwhile, a separate jury in New Mexico concluded that Meta knowingly damaged children’s mental health and concealed information about sexual exploitation risks on its platforms.
The twin verdicts represent one of the most significant legal challenges yet to major social media companies and could reshape the technology industry’s relationship with young users.
Advocates Say “Big Tech Invincibility” Is Over
Child safety advocates and watchdog groups praised the decisions, calling them a long-overdue acknowledgment of the dangers posed by social media.
Sacha Haworth, executive director of the Tech Oversight Project, said the rulings confirm what families and experts have warned about for years.
“The era of Big Tech invincibility is over,” Haworth said, adding that testimony and evidence presented during the trials validated long-standing concerns about addiction, mental health issues, and harmful content exposure.
The verdicts signal growing public skepticism toward tech companies, which have historically argued that harms on their platforms were unintended consequences rather than the result of product design.
Companies Push Back and Consider Appeals
Both Meta and Google, which owns YouTube, said they disagree with the verdicts and are considering appeals. The companies have consistently argued that broader societal issues, rather than platform design, contribute to mental health challenges among young users.
During testimony in the Los Angeles trial, Meta CEO Mark Zuckerberg was asked whether addictive features encourage greater usage. Zuckerberg responded that he did not believe addiction applied to Meta’s platforms.
Despite company resistance, experts say the verdicts may increase legal pressure and encourage lawmakers to pursue stronger regulation.
Whistleblower Says Regulation Needed
Arturo Béjar, a former Meta engineering director who previously raised internal concerns about Instagram’s impact on young users, said legal rulings alone may not be enough to force meaningful change.
He noted that regulatory actions by attorneys general or federal agencies have historically driven stronger compliance from tech companies.
Béjar said state attorneys general involved in the cases now have a unique opportunity to push for significant reforms to improve child safety online.
Key Differences Between The Two Cases
The New Mexico lawsuit was filed by Attorney General Raúl Torrez in 2023. Investigators posed as children online and documented sexual solicitations received through Meta platforms. The case centered on whether Meta violated state consumer protection laws.
The Los Angeles case involved a single plaintiff identified as KGM, who sued Meta, YouTube, TikTok and Snap. TikTok and Snap settled before trial, leaving Meta and YouTube as the remaining defendants.
The lawsuit argued that platform features were deliberately designed to encourage addictive behavior, particularly among young users. The case is considered a bellwether trial, meaning it could influence thousands of similar lawsuits filed nationwide.
Legal experts say the strategy of focusing on product design allowed plaintiffs to bypass Section 230 protections, which typically shield platforms from liability for user-generated content.
Legal Shift Could Reshape Industry
Nikolas Guggenberger, a law professor at the University of Houston, said the verdicts represent new legal territory.
“For the first time, courts have held social media platforms accountable for how their product design can harm users,” he said.
The rulings could force companies to reconsider engagement-driven business models that prioritize user attention.
Although appeals could take years, experts say the shift in public opinion is already underway.
A 2025 Pew Research Center survey found that 48% of teens believe social media harms people their age — up from 32% in 2022.
AI Emerging as Next Safety Frontier
Even as social media faces increased scrutiny, experts warn that new technologies could create additional challenges.
Sarah Kreps, director of Cornell University’s Tech Policy Institute, said artificial intelligence chatbots represent the next frontier in protecting young users.
“You can ban today’s harm, but how do you know what tomorrow is going to bring?” Kreps said.
As technology continues evolving, she warned, new platforms will emerge and attract young users, creating ongoing safety challenges.
A Turning Point for the Tech Industry
The verdicts against Meta and YouTube may take years to fully resolve, but they already mark a turning point in how courts, policymakers, and the public view social media companies.
With more lawsuits expected and lawmakers increasingly focused on child safety, the legal and regulatory landscape for Big Tech appears poised for significant change.







