Dozens of bizarre and often dangerous challenges are posted on social media every year. If one of those challenges hurts a loved one, can you bring a wrongful death suit?
In most cases, such a suit would be unsuccessful if you go after the social media platform. A 1996 law known as the Communications Decency Act usually shields these platforms from liability.
It is also difficult to go after the initial creator of the challenge, as tracking an internet phenomenon to its origin point is often impossible.
There are also cases where the issue is not any specific piece of content but the nature of the platform itself and its impact on mental health, particularly the mental health of children and teenagers.
Still, as with most legal questions, the ultimate answer is: it depends.
TikTok and the Blackout Challenge
In 2022, TikTok began facing multiple wrongful death lawsuits over the "blackout challenge," a trend that encouraged users to choke themselves until they passed out.
One suit claimed TikTok knew or should have known that its content was addictive, that it was directing children to harmful content, and that it failed to take significant action either to stop those videos from reaching children or to warn parents about the presence of those videos on its platform. The suit drew particular attention to the algorithm that serves content to users.
TikTok responded, citing a 1995 federal report about a choking game predating social media.
A few months later, a federal judge dismissed a suit involving a 10-year-old girl. The judge ruled that the app is protected by Section 230 of the Communications Decency Act, which shields internet platforms from liability for content posted by third-party users.
In response to the suits, TikTok did block searches for the blackout challenge; users who search for it are shown a warning screen instead.
The screen tells users that "some online challenges can be dangerous, disturbing, or even fabricated."
Whether TikTok changed its algorithm to make it harder for such challenges to show up on screen without a search is difficult to say. Algorithms are proprietary information.
It might be hard for TikTok to prevent every dangerous video from reaching the eyes of any given user.
Most challenges are harmless: performing a dance move or making your bed for 30 days. It might be difficult or impossible for the platform to screen every challenge video to determine whether it suggests a dangerous course of action.
TikTok could filter for specific tags, but screening every video's content before it reaches users might take more resources than the company could reasonably be expected to bring to bear.
More than a billion videos are uploaded every single day. Any given video could flow to millions of users within five minutes of upload.
Thus, even if the Communications Decency Act didn't protect TikTok, it might be challenging to draw a direct causal line between the tragic deaths of these children and the business practices of the platform.
Meta and Suicide
Another 2022 case involved a mother suing Meta over her son’s suicide. She claimed that his time on Facebook, Instagram, and Snapchat left him obsessed with body image. According to the lawsuit, this obsession led directly to suicide.
The suit cited research from a United States Surgeon General Advisory stating teen suicides went up 146% between 2007 and 2018.
The Social Media Victims Law Center filed a similar suit over the wrongful deaths of three children in Louisiana. Similar lawsuits soon followed.
These suits are slightly different. They claim Meta knew its apps harmed teen mental health and that it misrepresented how addictive its platforms might be. The Communications Decency Act cannot be used to defend against these claims.
These lawsuits have yet to be decided, but they may prove more successful, as Meta's own internal research supports some of the conclusions they draw.
What about your case?
The outcome of a case depends solely on the facts of that case. Minor differences between cases can significantly impact whether they are viable.
Much will depend on whether it is possible to draw a direct line between a platform’s actions or design and the harm done to your loved one. Sometimes this will be possible, and sometimes it won’t. It will also depend on whether the injury came from the platform or another source.
Personal injury and wrongful death claims against social media platforms are an emerging area of the law, and one we'll continue to watch.
In short: we can't tell you in a blog post whether you should pursue a lawsuit given the specific facts of your case. We'd have to examine your claim and decide whether it's worth pursuing.
If you think you have a lawsuit, don’t do guesswork. Contact us to schedule a free case review today.