The night of 23 June 2020 passed by like any other for 16-year-old Carson Bride. The teen had just gotten a new job at a pizza restaurant, his mother, Kristin Bride, said, and the family had been celebrating at home in Lake Oswego, Oregon. He wrote his future work schedule on the kitchen calendar after dinner, said goodnight, and went to his room for bed. But the next morning, Kristin says, the family woke to “complete shock and horror”: Carson had died by suicide.
Kristin soon discovered that in the days leading up to his death, her son had received hundreds of harassing messages on Yolo – a third-party app that at the time was integrated into Snapchat and allowed users to communicate anonymously. The search history on Carson’s phone revealed that he spent some of his final hours online desperately researching who was behind the harassment and how to put an end to it.
After Carson’s death and the harassment Kristin says led to it, she tried to take action to prevent such a tragedy from striking again – but found herself running into walls. She says she contacted Yolo four times and was ignored, receiving only a single automated email in response. In May 2021 she filed a lawsuit against Snapchat and the two anonymous messaging apps it hosted, a suit that is partially ongoing. Days after the suit was filed, Snapchat removed Yolo and LMK, the other app, from the platform, and a year later the company banned all apps with anonymous messaging features.
The company declined to comment on the suit specifically, but a spokesperson noted that in 2020 it added Here for You, an in-app support tool for users who may be experiencing mental health crises.
“Snapchat was intentionally designed to be different from traditional social media, with a focus on helping Snapchatters communicate with their close friends in an environment that prioritizes their safety and privacy,” said Ashley Adams, a spokesperson for Snap, the parent company of Snapchat.
“While we will always have more work to do, we feel good about the role Snapchat plays in helping close friends feel connected, happy and prepared as they face the many challenges of adolescence.”
Kristin said the changes made by Snap and other apps felt like a victory – but one that was “bittersweet”.
“For those of us who have lost our children to online harms, it feels like Whac-A-Mole,” she said. “You want to honor your child by putting an end to it, but that is not happening because these companies won’t self-regulate when all they care about is profit. When I look at the whole scope of this mess, it feels like I just made a tiny impact.”
‘Defectively designed’
Kristin Bride’s lawsuit is one of hundreds filed in the US against social media firms in the past two years by family members of children who have been affected by online harms. Lawyers and experts expect that number to increase in the coming year as legal strategies to fight the companies evolve and cases gain momentum.
In February this year, a master complaint was filed combining lawsuits from more than 400 plaintiffs across the US against social media firms for their “role in creating a youth mental health crisis through their addictive services”. The suit distills the claims into 18 counts, alleging that the social media firms created defective and addictive products that they knew would harm young users, and that they failed to warn parents and children of those effects. Other counts include violations of consumer protection laws and enabling the distribution of sexually explicit content of minors.
The suit targets TikTok owner ByteDance, YouTube owner Alphabet, Facebook owner Meta and Snapchat owner Snap Inc. It includes not only cases of suicidal behavior following harassment, as in Bride’s case, but also broader harms alleged to be caused by social media.
The master complaint alleges that the social media products were “defectively designed” in that they create “an inherent risk of danger”. Dangers cited include risk of “abuse, addiction and compulsive use by youth which can lead to a cascade of harms”. Harms listed include dissociative behavior, social isolation, damage to body image and self-worth, suicidal ideation and self-harm.
Social media firms have spoken out against the suit, with YouTube stating that the allegations are “simply not true”. A spokesperson from Meta said that the company has developed more than 30 tools to support teens and their family members on company platforms and invested in technology that “finds and removes content related to suicide, self-injury or eating disorders before anyone reports it to us”.
“These are complex issues but we will continue working with experts and listening to parents to develop new tools, features and policies that are effective and meet the needs of teens and their families,” the spokesperson said.
While social media firms have long faced scrutiny from Congress and civil rights organizations over their impact on young users, the new wave of lawsuits underscores how parents are increasingly leading the charge, said Jim Steyer, an attorney and founder of Common Sense Media, a non-profit that advocates for children’s online safety.
“This is a major shift from where we were a decade ago,” he said. “People see the impact on their own children, they know the platforms are intentionally designed to addict their kids into harmful stuff, and they are fed up. The tide has turned.”
New strategies to take down big tech
Calls to address the harms caused by social media have been intensifying for years, drawing bipartisan support in and outside Congress. Even Meta’s CEO, Mark Zuckerberg, told Congress in 2021 that it “may make sense for there to be liability for some of the content”.
But concrete legislative action has yet to materialize. Parents, legal teams and non-profits battling social media firms typically have one answer as to why: Section 230 – a portion of a 1996 US law, the Communications Decency Act, that they say effectively shields online platforms from responsibility for illegal actions of their users.
“Section 230 has been liberally construed far beyond its language or intent, and until recently had been seen as a veritable carte blanche for social media companies to operate with complete immunity,” said Matthew P Bergman, founding attorney of the Social Media Victims Law Center, a law firm dedicated exclusively to representing the families of children who have been harmed by social media.
After repeatedly facing companies that blocked lawsuits by invoking Section 230 protections, Bergman and his firm employed a new strategy: product liability law. In other words, rather than targeting problematic content hosted on platforms, the legal teams allege the product itself is defective. They argue the product is addictive and harmful by design – and that the companies were obligated to do more to warn and protect their users. It’s a method that has been used for decades in major legal movements, including lawsuits targeting corporations over opioids, cigarettes and asbestos.
The Social Media Victims Law Center has had varying levels of success with this new argument, but Bergman – who has a background in asbestos litigation – said the most important result is that the cases can be heard at all.
“This is not a question of whether we win or lose – this is a question of whether we get to play the game,” he said.
Part of that game is forcing companies to go through the litigation process, including discovery, during which attorneys can request evidence and information from social media firms to help make their cases. Regardless of outcomes, this process will bring more information to light, Bergman said.
“Because these companies had been exonerated before, they’ve never had to subject themselves to depositions, produce documents, or answer questions under oath,” he said. “They have been allowed to operate in the dark.”
The effects of these strategies are already being seen: in November, a judge denied social media firms’ attempt to throw out a slew of legal cases around child safety, stating that Section 230 does not shield them from all legal liability. That decision came after 42 states and the District of Columbia sued Meta last month in a landmark case over youth addiction to the company’s social media platforms.
A separate case filed by the state of New Mexico in December 2023 alleged that Meta allowed its social media platforms, Facebook and Instagram, to become marketplaces for child predators.
Multiple education boards and school districts also filed class-action suits in 2023 against social media firms over alleged harm the platforms caused to their student bodies. Additionally, the CEOs of the biggest social media firms – including Meta, TikTok and Snap – have been subpoenaed to testify at a US Senate judiciary hearing on 31 January about their “failure to protect children online”.
“We anticipate that 2024 will be a year of robust litigation, in which more families come forward to hold companies accountable for the carnage that social media platforms have inflicted on our young people,” Bergman said. “We have no doubt that this is going to be a long, hard slog but we know that now we at least have a path forward.”
‘I did everything that a mom could do’
As lawsuits make their way slowly through the court systems, social media companies claim they have put in a growing number of guardrails to protect children on their platforms – but grieving parents say the measures are insufficient and poorly enforced.
When Joann Bogard lost her 15-year-old son Mason to a “choking challenge” – an online trend in which participants temporarily asphyxiate themselves – she had already implemented all of the safety measures available on YouTube. Still, dangerous videos reached her son on the platform, despite violating its policies, which explicitly ban content relating to “harmful or dangerous acts, challenges and pranks”.
“I did everything that a mom could do, and it still wasn’t enough,” she said. “It’s not a fair fight for parents to go up against these billion-dollar companies who pay employees a lot of money to come up with ways to keep our kids engaged.”
Bogard is currently involved in a class-action lawsuit against YouTube and TikTok that alleges the platforms failed to stop the spread of choking challenge videos. Other plaintiffs in the case claim the platforms also enable or promote cyberbullying and the use of unlawful drugs. TikTok did not immediately respond to a request for comment.
Ivy Choi, a spokesperson for YouTube, said regarding Bogard’s suit that YouTube has “invested heavily in the policies, resources and products needed to protect the YouTube community” over the years. “In accordance with our policies prohibiting extremely dangerous challenges, we remove choking challenge videos on YouTube,” she said.
Parents say they are still facing an uphill battle against powerful companies to stem the deluge of content similar to that which led to their children’s deaths. Bogard said that once a week she searches online platforms for “choking challenge” videos and reports any she finds. Though such content violates most platforms’ policies, the videos often remain online.
“As traumatizing as it is for me to go online every week and do the job that the social network should be doing, I do it, because I don’t ever want to see another family go through what we have gone through,” she said. “I want other families to be protected, and the only way to do that is to speak out and spread awareness.”
As parents continue their fight to get meaningful legislation passed to stem the harms they say are caused by social media, many are working to address the issue through other means. Family members of children lost to online harms have launched advocacy campaigns, support groups, and educational resources.
Bogard started a Facebook page in honor of her son called “Mason’s Message”, where she posts tips for parents on keeping their children safe on social media. She and Bride helped found the Online Harms Prevention Work Group – a group within the online safety non-profit Fairplay that draws on knowledge from experts and parents to promote best practices and advocate for legislation.
Most recently the group has fiercely advocated for the Kids Online Safety Act – a bill that would increase parental control over children’s online experiences and force companies to collect less data on children and implement more safeguards. KOSA now has 49 co-sponsors and is scheduled for a Senate vote in January.
Bogard said she and the parents she works with recognize they are going up against a “huge entity” in their fight with social media firms, and must take a varied approach.
“It’s going to take more than just a federal bill, more than just a lawsuit, and more than just education – all of those things together are key to making a difference,” she said. “I want to make sure no parent ever has to bury their child for something as senseless and preventable as this. This is how I can honor my son.”