Social Media Platforms Facing Lawsuits for Suicides, Mental Health Issues Among Young Users

The parents of a 17-year-old boy who killed himself in 2015 have filed a wrongful death lawsuit against the parent companies of Facebook, Instagram, and Snapchat. The suit claims that social media companies knowingly get kids addicted to their platforms — even though the companies know it will lead some to take their own lives. Sadly, that young man’s is not the only such case, and other parents are also suing social media companies.

Chris and Donna Dawley, of Salem, Wisconsin, filed the suit earlier this month along with the Social Media Victims Law Center (SMVLC). SMVLC describes itself as working “to hold social media companies legally accountable for the harm they inflict on vulnerable users.” The Dawleys turned to SMVLC after seven years of “trying to figure out what happened” to their son, Christopher James “CJ” Dawley.

In 2012, Barack Obama was president, Marvel’s The Avengers was released in theaters, Hurricane Sandy ravaged the Caribbean and the East Coast of the United States, the Mayan calendar ran out, and CJ was 14 years old. He did what many kids his age were doing: He signed up for Facebook, Instagram, and Snapchat. And — as did many of his peers — he used those social media platforms as a sort of “log” to document his day-to-day life for the perusal of friends and strangers.

CJ’s parents described him in an interview with CNN Business:

CJ worked as a busboy at Texas Roadhouse in Kenosha, Wisconsin. He loved playing golf, watching “Doctor Who” and was highly sought after by top-tier colleges. “His counselor said he could get a free ride anywhere he wanted to go,” his mother Donna Dawley told CNN Business during a recent interview at the family’s home.

One would expect a young man from a loving family who had hopes of a promising college career to go on to do well and live a fulfilling life with children of his own. But CJ’s story does not have a happy ending. His parents say that throughout high school he developed a growing addiction to social media, spending hours every day posting and reading others’ posts. His mother told CNN that by his senior year “he couldn’t stop looking at his phone” and would often stay up until 3 a.m. getting his social media fix. CNN also reported that on Instagram and other platforms, he began swapping nude photos. He became sleep deprived and obsessed with his body image.

His sleep deprivation and body image obsession led to darker problems. Just after Christmas, as his family took down their tree, 17-year-old CJ went to his bedroom, texted his best friend “God’s speed,” and posted “Who turned out the light?” to his Facebook profile. Then, still holding the phone he could never seem to put down, he fatally shot himself with a .22 rifle. As CNN reported:

Police found a suicide note written on the envelope of a college acceptance letter. His parents said he never showed outward signs of depression or suicidal ideation.

“When we found him, his phone was still on, still in his hand, with blood on it,” Donna Dawley said. “He was so addicted to it that even his last moments of his life were about posting on social media.”

Seven years later, CJ should have been a 24-year-old college grad with a promising and fulfilling career, and maybe a wife and a child or two. Instead, his parents have spent those seven years “trying to figure out what happened” that led a beloved son, brother, and friend who was considered the comedian in his circle of friends to take his own life and post about it to social media.

What they have learned has led them to realize that social media platforms are well aware of the danger they pose to young people. The algorithms are designed to keep kids clicking, “to maximize time spent on the platform for advertising purposes and profit,” according to CNN’s report on the lawsuit.

After Facebook whistleblower Frances Haugen leaked hundreds of internal documents — including some showing that the company and its parent, Meta, knew that its products, when used as intended, harm teens’ mental health — the Dawleys began to understand what happened to their son.

As Daily Mail reported:

The files, which were published by the Wall Street Journal, revealed that the company had been aware of the problem since 2019, with Facebook’s own research showing that young users were experiencing mental health declines while using Instagram.

One message posted on an internal message board in March 2020 said the research revealed that 32 percent of girls said Instagram made them feel worse about their bodies when they were already having insecurities.

Further, a slide from an internal presentation plainly stated the problem, saying:

We make body image issues worse for one in three teen girls. Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups.

So, Meta — the parent company of Facebook and Instagram — was fully aware that teens, who make up a large share of Instagram’s users, suffer mental health issues as a direct result of its product. And — as if that weren’t bad enough — the company also knew that those issues lead to suicidal tendencies: Another presentation showed that the company’s research into the dangers of its product found that 13 percent of British users and six percent of American users who felt suicidal traced those feelings to their use of Instagram.

One such victim was 11-year-old Selena Rodriguez. Though both Instagram and Snapchat ostensibly require users to be 13 to sign up for their platforms, it is well-known that younger children often have accounts. On January 21, 2022, Selena’s mother — along with SMVLC — filed a wrongful death suit against Meta and Snap. According to documents SMVLC provided to The New American:

Selena Rodriguez struggled for more than two years with an extreme addiction to Instagram and Snapchat before taking her own life at 11 years old. Her mother, Tammy, confiscated all electronic devices from her possession, which led to Selena running away to use social media. On multiple occasions, Selena received mental health treatment for her addiction. One outpatient therapist who evaluated Selena remarked that she had never seen a patient as addicted to social media as Selena.

And:

During the COVID-19 pandemic, Selena spent even more time on Instagram and Snapchat, which only worsened her depression and level of sleep deprivation. This addiction resulted in multiple absences from school and a subsequent investigation by the Connecticut Department of Children and Families.

It is well documented that already-present underlying mental health issues — particularly in children and teens — were exacerbated by the lockdowns and isolation imposed as a heavy-handed government “solution” to Covid. But when those issues include addiction to a technology product that is designed to be addictive, the result is a recipe for disaster.

Selena — already suffering from a worst-case-scenario social media addiction — was driven deeper into the abyss by government policies that caused her only connection to the outside world to be the thing that — according to the lawsuit — led to her death.

And like CJ, Selena found herself sharing intimate photos using that technology.

The lawsuit states:

While on Instagram and Snapchat, Selena was constantly solicited for sexually exploitive content. She succumbed to the pressure and sent sexually explicit images using Snapchat, which were leaked and shared with her classmates, increasing the ridicule and embarrassment she experienced at school.

As a result, Selena was hospitalized for emergency psychiatric care and experienced worsening depression, poor self-esteem, eating disorders, self-harm, and ultimately, suicide.

On July 21, 2021, Selena — two years into her downward slide of social media addiction and related mental health issues — took her own life. She was only 11 years old.

The list goes on. Jennifer Mitchell lost her 16-year-old son, Ian, to a self-inflicted gunshot wound. He was creating a Snapchat video of himself playing Russian roulette when the gun fired, killing him. His mother received news of his death while she was away on a business trip. She filed a lawsuit against Snap, saying that she hopes it will make more parents aware of the dangers of social media. She would also like to see some common-sense regulation of social media, saying, “If we can put age restrictions on alcohol, cigarettes and to purchase a gun, something needs to be done when it comes to social media.”

Not everyone who loses their child because of social media loses them to suicide — or even to death.

Brittney Doffing’s daughter turned 14 just days before Oregon implemented the first Covid lockdown. Doffing told CBS affiliate KOIN she “broke down” and got her daughter a mobile phone. As KOIN reported:

“She was in volleyball. She was in track. She was in basketball. She did drama. I mean, she was very outgoing,” Doffing said. “Her birthday came around and she was really wanting a cell phone and I was really hesitant about it, but then I kind of broke down because she lost all connections with everybody through school and that’s how a lot of her friends interacted, through cell phones.”

And that is when everything started. Her daughter — referred to in the lawsuit only as M.D. — “developed an addiction to social media, specifically Instagram and Snapchat,” as KOIN reported. According to documents provided to The New American by SMVLC, Instagram’s and Snapchat’s algorithms led “M.D.” down a rabbit hole of eating disorders, leading to behavior that was destructive to both her and her family.

Her mother told KOIN, “It happened very, very fast,” adding, “Anytime I try to take the phone, she would get very physical, violent, verbal with me, with her sisters. She would smash the phones so that I couldn’t review the content.”

The 31-page complaint says Doffing’s daughter has been hospitalized twice for psychiatric episodes triggered by Doffing’s attempts to take away or restrict her use of Instagram and Snapchat.

In an exclusive interview with The New American, Matthew P. Bergman — founding attorney of SMVLC — said that his hope is that such litigation will bring pressure to bear on social media companies, since currently such companies are able to use what he describes as the “externality” of risk to their advantage. “Essentially, in the nature of product liability, law-and-economics theorists — mostly on the conservative side of the spectrum — view this as a case of an externality,” he told The New American, adding, “The costs of these dangerous social media products are not being borne by the manufacturers.” If — as a result of paying out huge claims or settlements because of litigation — Meta, Snap, and others were forced to “internalize” those costs, Bergman explained, they would have to build the cost of safety into the product.

He recommends five simple steps that he believes would make social media products safer. First is age and ID verification — a simple matter for social media companies to implement, since other types of sites already do so. Second, social media sites should “turn off the algorithms” that lead users down dark paths. As an example, Bergman says, “a young girl who expresses an interest in healthy foods” should not be led along a path of seeing emaciated girls presented as the standard for body image. Third, there should be warnings — similar to those on other potentially harmful products — that make users aware of the possible dangers of social media use. Fourth, companies should implement age restrictions on sharing or posting intimate photos; since “they have facial-recognition software,” he said, they can easily recognize nudity in pictures as well. Finally, it would be a simple matter to limit screen time by blocking underage users during the wee hours of the morning.

When asked, he said he would favor a parental “administrator account” for minors: Parents would sign up for an account under which their children could register as “users.” Such a “user” account would appear the same as any other, except that parents could set certain restrictions, such as allowed times and content. Then, when Billy tries to browse his feed at 3 a.m., he would see a page telling him that his feed is blocked by the administrator — and Mom or Dad would get a notification that he had attempted access during blocked hours.

Bergman says he is “all for parental responsibility and control,” but “social media is designed” to circumvent parental authority, since teens can sign up without parents’ knowledge and can access material on the sites that parents would not allow. This often leads to dangerous behaviors, he said.

Bergman has the data on his side. A ten-year study published last year shows an “elevated suicide risk from excess social media time for teen girls.” As usage goes up, so does suicide risk. According to the study, “Researchers tracked the media use patterns and mental health of 500 teens” and found that “Girls who used social media for at least two to three hours per day at the beginning of the study — when they were about 13 years old — and then greatly increased their use over time were at a higher clinical risk for suicide as emerging adults.”