TikTok sued in the US over the deaths of two girls in the “Blackout Challenge”.
TikTok is being sued in California after youngsters lost their lives while participating in the “Blackout Challenge,” a viral video craze that encourages people to choke themselves until they pass out.
According to the lawsuit, filed last week in state court in Los Angeles, TikTok’s software “intentionally and repeatedly” promoted the Blackout Challenge, resulting in the deaths of two young girls last year: an eight-year-old in Texas and a nine-year-old in Wisconsin.
“TikTok needs to be held accountable for pushing deadly content to these two young girls,” said Matthew Bergman, an attorney at the Social Media Victims Law Center, which filed the suit.
“TikTok has invested billions of dollars to intentionally design products that push dangerous content that it knows are dangerous and can result in the deaths of its users.”
ByteDance, TikTok’s China-based parent company, did not immediately respond to a request for comment.
According to the lawsuit, TikTok’s algorithm promoted the Blackout Challenge to each of the girls, who died from self-strangulation, one using a rope and the other a dog leash. The suit also lists children in Australia, Italy, and other countries who have died as a result of the Blackout Challenge.
TikTok has highlighted and encouraged a variety of challenges in which users record themselves performing themed activities, some of which can be risky. Court records mention several other TikTok challenges, including the “Skull Breaker Challenge,” in which a participant jumps while others kick their legs out from beneath them, causing them to flip over and hit their head.
According to court filings, the “Fire Challenge” involves dousing objects with a flammable liquid and setting them alight, while the “Coronavirus Challenge” involves licking random objects and surfaces in public during the pandemic. The lawsuit asks a judge to order TikTok to stop using its algorithm to draw children in and promote dangerous challenges, and to pay financial damages.