Newly unsealed court documents from an ongoing social media addiction lawsuit reveal internal chat logs in which YouTube employees discussed ‘viewer addiction’ as a goal, and show that safety tools for younger users were shelved because they did not generate enough revenue. The disclosures came during the same week a Los Angeles jury found Meta and YouTube liable for harming a young user through addictive platform design.
What the Internal Documents Show
The unsealed records, filed as part of ongoing litigation in the Northern District of California, include internal chat logs in which YouTube staff explicitly referenced ‘viewer addiction’ in the context of product decisions. Confronted with the logs during legal proceedings, a YouTube executive confirmed their authenticity but argued that the discussion referred to a separate video creation tool, not the main platform. The subsequent portion of the exchange was redacted in the court record.
A separate August 2024 internal presentation titled ‘Teen (Unsupervised) Viewer Wellbeing and Safety’ was also entered into the record. In it, YouTube staff acknowledged that the platform’s ‘infinite feed’ feature was a key driver of safety concerns for young users. Court filings also allege that proposed protective tools for teen viewers were ultimately abandoned after internal analysis showed they would not deliver a sufficient return on investment.
Researchers cited in the documents reportedly concluded that YouTube was built with addictive intent, pointing to features like autoplay and algorithmic recommendations as tools engineered to encourage extended viewing sessions. YouTube has not publicly commented on the specific documents.

The Landmark Trial
The disclosures came in the same week as a landmark verdict in Los Angeles. On March 25, 2026, a California jury found Meta and YouTube negligent and determined their platforms had caused harm to a 20-year-old plaintiff, identified in court documents by her initials KGM. The plaintiff testified that she became addicted to social media as a child and that the addiction worsened her mental health over time.
The jury awarded $6 million in damages, assigning 70 percent of responsibility to Meta and 30 percent to YouTube. The verdict followed weeks of testimony that included internal documents and executive appearances — including Meta CEO Mark Zuckerberg, who testified in February. Snapchat and TikTok, which had also been named as defendants, reached settlements before the trial began.
The case is widely seen as a potential turning point. Thousands of similar lawsuits have been filed across the country, and the outcome of this trial is expected to influence how courts handle the broader wave of claims against social media companies for harm to minors.
How the Platforms Are Designed
The core argument in these cases centers on specific design features: autoplay, which automatically begins a new video the moment one ends; infinite scroll feeds, which remove natural stopping points; and recommendation algorithms that surface progressively more engaging content to keep users watching. Plaintiffs argue these features are not incidental but were deliberately engineered to maximize time on platform — and that this approach caused particular harm to children and teenagers whose brains are still developing.
In the Los Angeles trial, autoplay and endless scrolling were cited as contributing factors in the jury’s finding that the platforms had harmed the plaintiff’s mental health.
YouTube and Meta have argued that their platforms offer genuine value to users and that they provide tools for parents to manage and limit usage. Both companies are expected to appeal the recent verdict.
What Comes Next
The unsealed documents are likely to be cited in future proceedings as similar lawsuits move toward trial. Legal observers say the internal communications — particularly the framing of ‘viewer addiction’ as a goal and the shelving of safety tools on financial grounds — could be significant evidence as courts evaluate whether the companies acted knowingly. More than a dozen states are also pursuing legislation that would impose stricter requirements on how platforms design products used by minors.