The mother of a 10-year-old girl who died last year is suing TikTok and its parent company ByteDance, alleging the company’s algorithm promoted a so-called “blackout challenge” video in the child’s feed.
In a complaint filed Thursday, Tawainna Anderson of Pennsylvania said her daughter Nylah died last year after choking while attempting what is known as the “Blackout Challenge,” which encourages people to record themselves holding their breath or strangling themselves until they pass out, according to the complaint. Anderson said she took Nylah to a local hospital on December 7, and that she died of her injuries on December 12.
Court documents claim that the challenge was recommended to Nylah by TikTok’s algorithm, which “determined that the deadly blackout challenge was well tailored and likely to be of interest to 10-year-old Nylah Anderson.”
The Blackout Challenge has reportedly circulated on the platform for years, though similar dares have been part of schoolyard culture for decades. Still, Nylah’s death follows a string of similar, highly publicized choking incidents linked to TikTok in recent years. Another 10-year-old girl, in Italy, died after attempting the challenge last January, and a 12-year-old boy from Colorado died in April 2021 after attempting it.
In an emailed statement, a TikTok spokesperson said: “This disturbing ‘challenge,’ which people seem to learn about from sources other than TikTok, long predates our platform and has never been a TikTok trend. We remain vigilant in our commitment to user safety and would immediately remove related content if found. Our deepest sympathies go out to the family for their tragic loss.”
The platform has explicit rules against content that promotes self-harm. The app offers a curated version for users under 13, which restricts the personal information they can share and limits their ability to comment or post content, but it is unclear how its automated systems keep such content out of users’ feeds.
The platform is rated 12+ in both the Apple and Google app stores, but, as with most apps, users need only claim to be over the age limit to create an account. The company says it removed more than 15 million suspected underage accounts in the past year.
During a press conference on Thursday, Bob Mangeluzzi, one of Anderson’s attorneys, said: “TikTok is one of the most powerful and technologically advanced companies in the world. So what did TikTok do when it found out about this? … [They] used their app and algorithm to push a blackout challenge video to a 10-year-old.”
The complaint alleges that the app’s algorithm is intentionally designed to “maximize user engagement and dependency,” encouraging children to return to the app repeatedly. The lawsuit targets TikTok both as the developer of the algorithm and as the distributor that promoted the content to Nylah.
“It’s about time these dangerous challenges came to an end,” Anderson said during the press conference. “Something has to change, something has to stop because I don’t want other parents to go through what I’m going through.”
The suit isn’t the only legal action against TikTok over allegations that it promotes dangerous content to children. In March, it emerged that several state attorneys general are investigating whether TikTok harms young users and whether the company is aware of the content its younger users are watching.
TikTok has quickly become one of the most popular social media platforms available and is expected to bring in more advertising revenue than Twitter and Snap combined.