A Historic Victory for Digital Wellbeing
On March 25, 2026, a Los Angeles jury delivered a verdict that sent shockwaves through Silicon Valley. After more than 40 hours of deliberation over nine days, twelve jurors found Meta (Instagram’s parent company) and Google (YouTube’s owner) liable for causing serious mental health harm to a young woman through their deliberately addictive platform designs. The total damages: $6 million.
This wasn’t just another lawsuit. This was the first time in history that social media companies were held legally responsible for the harm their platforms cause to young users—not because of what content appeared on their sites, but because of how they designed those sites to be addictive in the first place.
Who Is Kaley G.M.?
Known in court documents as “K.G.M.” or simply “Kaley,” the plaintiff is now a 20-year-old woman from Chico, California. Her story is heartbreakingly familiar to millions of parents and teens across the world.
Kaley downloaded YouTube on her iPod Touch at just 6 years old to watch videos about lip gloss and online games. By age 9, she had circumvented her mother’s safety controls to download Instagram. What started as innocent childhood curiosity quickly spiraled into something far more serious.
“Every single day I was on it, all day long,” Kaley testified during the trial. “I just can’t be without it.”
By her teenage years, Kaley was spending up to 16 hours a day on Instagram. She would sneak out of class to check her phone, stay up late scrolling, and buy fake “likes” to appear popular. She created multiple accounts just to like and comment on her own posts. The platforms had become her entire world—and that world was slowly destroying her mental health.
The Mental Health Toll
Kaley’s addiction to social media didn’t just steal her time—it fundamentally changed how she saw herself and the world. She developed severe depression, anxiety, and body dysmorphia (a mental health condition where someone becomes obsessed with perceived flaws in their appearance).
Impact of Beauty Filters
Instagram’s beauty filters played a particularly damaging role. These filters—which can make users appear thinner, change their facial features, or add virtual makeup—became an obsession for Kaley. She testified that she did not experience negative feelings about her body before she started using social media and filters.
“At one point, almost all my photos had a filter on,” she said.
Even today, years later, she spends 3 to 4 hours each morning on her appearance—a lasting impact of the distorted self-image created by constant exposure to filtered, curated content.
Emotional and Psychological Impact
The emotional damage went even deeper. Kaley began cutting herself to cope with depression. She contemplated suicide. She withdrew from her family and friends.
| Impact Area | Details |
|---|---|
| Mental Health | Depression, anxiety, body dysmorphia |
| Behavioral Changes | Self-harm, social withdrawal |
| Daily Life | Excessive screen time, disrupted routine |
Victoria Burke, a therapist who worked with Kaley in 2019, testified that social media and Kaley’s sense of self “were closely related,” and that what happened on the platforms could “make or break her mood.”
What Made This Case Different?
For years, social media companies have been nearly untouchable in court, protected by a 1996 law called Section 230 of the Communications Decency Act. This law says that tech companies aren’t legally responsible for content that users post on their platforms.
But Kaley’s legal team, led by attorney Mark Lanier, took a different approach. Instead of suing over harmful content, they sued over harmful design. They argued that specific features built into Instagram and YouTube—like infinite scrolling, autoplay videos, constant notifications, and recommendation algorithms—were deliberately engineered to be addictive, especially to young users.
This was a game-changer. The lawsuit wasn’t about what was on social media—it was about how social media itself was built.
The Smoking Gun: Internal Documents Revealed
During the six-week trial, Kaley’s lawyers presented damning internal documents from Meta and Google that showed company executives knew exactly what they were doing.
One YouTube strategy memo stated: “If we want to win big with teens, we must bring them in as tweens.” In other words, YouTube deliberately targeted children as young as 10 or 11 to build lifelong platform addiction.
An Instagram employee’s internal message was even more revealing: “We’re basically pushers… We’re causing reward deficit disorder, because people are binging on Instagram so much they can’t feel the reward.”
These weren’t rogue employees—these were internal strategy documents that showed Meta and Google understood their platforms were addictive and were actively working to increase engagement among children and teenagers.
The jury also saw evidence that Meta CEO Mark Zuckerberg and other executives discussed efforts to attract and keep kids and teens on their platforms.
The Addictive Design Features
Kaley’s lawyers argued that several specific features were designed to keep users hooked:
- Infinite Scroll: Unlike traditional media with natural stopping points, social media feeds never end. You can scroll forever, and there’s always something new.
- Autoplay: The next video starts playing automatically, with no action required from the viewer. Kaley testified: “I would say okay I’m going to get off after that, but then it would autoplay and I would be on for hours.”
- Push Notifications: Constant alerts about likes, comments, and activity create a fear of missing out and trigger dopamine releases that make it hard to stay off the app.
- Recommendation Algorithms: These systems learn what keeps you engaged and feed you more of it, creating echo chambers and pushing increasingly extreme content to keep you scrolling.
- Beauty Filters: Instagram’s filters that alter appearance were central to Kaley’s case. These features distorted her self-image and contributed directly to her body dysmorphia.
Kaley described notifications as giving her a “rush”—the same kind of dopamine hit that makes gambling and drug use addictive. Even when she experienced bullying on the platforms, she couldn’t stop using them. The fear of missing out was too strong.
How Meta And YouTube Defended Themselves
Both Meta and YouTube fought back hard, arguing that Kaley’s mental health problems weren’t their fault.
They pointed to her difficult home life: her parents divorced when she was 3, her father was largely absent, and there was evidence of emotional and physical abuse from her mother. The defense showed video evidence of Kaley’s mother yelling at her, arguing that family trauma—not social media—was the root cause of her depression and anxiety.
YouTube took a different tack, arguing that it isn’t a social media platform at all but a “streaming platform, not a social media site,” more like television. Its lawyers noted that Kaley spent very little time on YouTube Shorts (its infinite-scroll feature), averaging only about one minute per day.
Both companies insisted there’s no scientific proof that social media causes mental health issues, suggesting they were being used as scapegoats for complex problems that have many causes.
Meta’s lawyer, Paul Schmidt, asked the jury: “If you took Instagram away, would anything be different? That’s the core question in this case.”
The Verdict That Shook Silicon Valley
After hearing all the evidence—including testimony from Kaley, Mark Zuckerberg, Instagram head Adam Mosseri, therapists, and tech experts—the jury reached a clear conclusion: Meta and YouTube were negligent.
Jury Findings
- The companies were negligent in designing and operating their platforms
- Their negligence was a “substantial factor” in causing Kaley’s mental health harm
- The companies knew their platforms could have adverse effects on minors but failed to warn users
- Most damning of all: the companies acted with “malice, oppression or fraud”
Damages Awarded
| Type of Damages | Amount | Details |
|---|---|---|
| Compensatory Damages | $3 million | For Kaley’s suffering |
| Punitive Damages | $3 million | To punish and deter misconduct |
| Total | $6 million | Meta: $4.2M; YouTube: $1.8M |
Why This Case Matters
The $6 million verdict may seem small to companies as massive as Meta and Google. But the money isn’t the point—the legal precedent is everything.
This was a “bellwether” trial, meaning it was specifically chosen to test how similar cases might go. There are now approximately 2,500 other pending lawsuits from families, school districts, and state attorneys general making similar claims against social media companies.
Key Legal Implications
- Platform Design Can Be Sued: Section 230 doesn’t protect companies from lawsuits about how they design their platforms—only from lawsuits about user-generated content.
- Social Media Addiction Is Real: A jury of ordinary citizens heard the evidence and concluded that social media platforms can be addictive and that this addiction causes measurable harm.
- Tech Companies Can Be Held Accountable: For the first time, major social media platforms were found legally liable for the mental health harm they cause to young users.
- They Knew What They Were Doing: The jury didn’t just find the companies negligent—they found they acted with malice, meaning Meta and Google knew they were causing harm and did it anyway.
Legal experts have compared this moment to the 1990s tobacco lawsuits, when internal documents revealed that cigarette companies knew nicotine was addictive and cigarettes caused cancer, but lied to the public for decades. Those lawsuits resulted in billions of dollars in damages and fundamentally changed how tobacco products are marketed and sold.
This could be social media’s “Big Tobacco moment.”
TikTok And Snapchat: The Companies That Settled
It’s worth noting that Kaley originally sued four companies: Meta, YouTube, TikTok, and Snapchat. But TikTok and Snapchat both chose to settle with Kaley before the trial began, avoiding a public courtroom battle and the risk of an even worse verdict.
The settlement amounts were not disclosed, but the fact that these companies settled rather than fight suggests they saw the writing on the wall. Meta and YouTube chose to take their chances in court—and lost.
How the Companies Responded
Both Meta and Google immediately announced they would appeal the verdict.
A Meta spokesperson said: “We respectfully disagree with the verdict and are evaluating our legal options. Teen mental health is profoundly complex and cannot be linked to a single app.”
Google’s response was similar: “We disagree with the verdict and plan to appeal. This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.”
The appeals could take years, and the companies will almost certainly continue fighting. But the damage to their public image and legal standing has already been done.
A Double Blow: The New Mexico Verdict
Just one day before the Kaley G.M. verdict, Meta suffered another major legal defeat. On March 24, 2026, a New Mexico jury ordered Meta to pay $375 million for violating state consumer protection laws by failing to protect children from sexual predators on its platforms.
That case originated from a 2023 undercover investigation by New Mexico Attorney General Raúl Torrez, who created a fake profile of a 13-year-old girl. The profile was quickly flooded with sexually explicit material and contact from predators, showing that Meta knew about widespread child safety problems but failed to address them.
The back-to-back verdicts in New Mexico and California are the first jury verdicts to hold Meta financially liable for harms to young users. The timing couldn’t have been worse for Meta’s legal defense.
What Comes Next?
Eight more bellwether trials are scheduled, with the next one starting this summer. Each trial will test different aspects of the social media harm claims and could result in additional multimillion-dollar verdicts.
Legal experts predict that after a few more plaintiff victories, social media companies may seek a global settlement—similar to what happened with tobacco companies—rather than fight thousands of individual lawsuits.
But the real question is whether this will force meaningful change in how social media platforms are designed. The addictive features that Kaley’s lawyers identified—infinite scroll, autoplay, algorithmic recommendations—aren’t minor details. They’re core to how these platforms make money.
Removing or limiting these features could significantly reduce user engagement and, consequently, advertising revenue. Whether social media companies will voluntarily make these changes, or whether courts and regulators will force them to, remains to be seen.
What Parents and Users Can Do Now
While legal battles continue, there are steps parents and users can take to protect themselves and their children:
- Use Built-In Tools: Both iOS (Screen Time) and Android (Digital Wellbeing) have features that let you set hard limits on app usage.
- Turn Off Notifications: Disable push notifications from social media apps to reduce the constant pull back to the platforms.
- Set Boundaries: Establish phone-free times and zones in your home, especially during meals and before bedtime.
- Have Open Conversations: Talk to your children about how these platforms are designed to be addictive and how filtered, curated content doesn’t represent real life.
- Monitor Usage Patterns: Pay attention to changes in mood, sleep patterns, or self-esteem that correlate with social media use.
Remember: these platforms employ some of the world’s best engineers and psychologists to make their products as engaging as possible. It’s not a fair fight, especially for developing brains.
Kaley’s Story Today
Today, Kaley works as a personal shopper at Walmart and still lives with her mother in the home she grew up in. She testified that her relationship with her mother has improved and that she now understands her mom was “doing her best” during difficult times.
But her relationship with social media remains complicated. She still catches herself sneaking to the bathroom at work to scroll through the apps. She’s even considering a career in social media marketing. The platforms that harmed her still have a powerful pull.
Kaley told the court that her life would be better without social media—but she can’t quite bring herself to delete the apps. That’s the nature of addiction.
After the verdict, Kaley’s attorney, Mark Lanier, said she felt “grateful” and “vindicated.” In a statement, Kaley’s legal team said:
“Today’s verdict is a historic moment—for Kaley and for the thousands of children and families who have been waiting for this day. She showed extraordinary courage bringing this case and telling her story in open court. For years, social media companies have profited from targeting children while concealing the addictive and dangerous design features built into their platforms. Today, we finally have accountability.”
The Beginning, Not the End
The Kaley G.M. case marks a turning point in how we think about social media and corporate responsibility. For the first time, a jury has looked at the evidence and concluded that these platforms aren’t just neutral tools—they’re products deliberately designed to be addictive, and that design causes real harm to real people, especially children.
This verdict won’t solve the social media mental health crisis overnight. Meta and YouTube will appeal. The legal battles will continue for years. And thousands of families are still waiting for their day in court.
But something fundamental has shifted. The narrative that social media companies have promoted for years—that they’re just platforms, that they can’t control how people use their products, that parents and users are solely responsible for managing screen time—has been rejected by a jury of ordinary people who heard all the evidence.
The internal documents don’t lie. The testimony from children like Kaley doesn’t lie. The mental health statistics don’t lie.
Whether this verdict ultimately forces Big Tech to redesign their platforms, face billions in damages, or simply continue fighting in court remains to be seen. But one thing is clear: the era of unaccountable social media is over.
Kaley showed extraordinary courage in sharing her story. Now it’s up to courts, regulators, parents, and users to make sure that courage wasn’t in vain—and that future generations don’t have to suffer the same harm.

