February 20, 2026

Zuckerberg Testifies as Trial Examines Meta’s Impact on Children and Teens
In a landmark Los Angeles trial examining whether Meta and Google platforms harmed children, Meta CEO Mark Zuckerberg testified Wednesday, acknowledging the company’s awareness of how young users engage with its platforms while defending Meta’s intentions. Nearly a dozen parents who say their children were harmed or died because of social media gathered and joined hands outside the courthouse as they waited for Zuckerberg to arrive.

Under oath, Zuckerberg admitted that Meta previously tracked time spent on its platforms as a goal, while emphasizing his belief that “if something is valuable, people will use it more because it’s useful to them.” He also conceded that age restrictions are difficult to enforce, noting there are “a meaningful number of people who lie about their age to use our services.”

Instagram maintains that users must be at least 13 years old to create an account, a requirement Zuckerberg reaffirmed during his testimony.

However, a 2015 internal company document estimated that more than 4 million Instagram users were under age 13, representing approximately 30% of all 10- to 12-year-olds in the United States.

Zuckerberg rejected claims that Meta intentionally designed addictive products, stating, “I don’t think that applies here,” when asked whether addictive features drive usage. Yet internal documents and testimony presented in court suggest Meta was aware of how design features influenced engagement among young users, even as Zuckerberg maintained that “a reasonable company should try to help the people that use its services.”

Big Tech Is the New Tobacco: Addiction by Design and Profit from Harm

The case, one of thousands filed against Meta and other tech companies alleging harm to children, centers on claims that design and engagement-driven features contributed to depression and compulsive use among youth, raising the urgent question of whether child safety can ever be allowed to take a back seat to growth and engagement. At Enough Is Enough®, we believe the answer is a resounding NO!

This trial underscores the urgent need for continued advocacy to hold technology companies accountable. Advocates and policymakers, including Enough Is Enough®, are pushing for stronger safeguards, transparency, and enforceable protections to ensure platforms prioritize the safety and well-being of children and teens over engagement and growth.
