Last July, 8-year-old Lalani Walton had just returned home to Temple, Texas, after a road trip with her family. They unpacked their things, and her stepmother promised to take Lalani out for a swim once she cleaned her room. Tired from the trip, her stepmother lay down for a quick nap. When she awoke and went to check on Lalani, she found the girl's limp body, a rope around her neck.
Lalani was not a victim of suicide. She was, however, a victim of a TikTok challenge that encouraged social media users to choke themselves until they passed out—the Blackout Challenge.
Unfortunately, this is not the only instance in which an algorithm like TikTok's has boosted content that is harmful, and in Lalani's case fatal, to children.
Lalani’s story begins like many others.
Described by her parents as an “extremely sweet and outgoing” girl who “loved dressing up as a princess and playing with makeup,” Lalani received her first smartphone when she turned 8 and almost immediately became addicted to TikTok. By design, TikTok’s algorithm promotes whatever it predicts will generate the most engagement, serving up content that entices users and may prompt them to act on it. TikTok claims that the challenge first went viral on other sites, but even if that is true, the challenge still appeared in the feeds of highly impressionable children on TikTok’s own app, promoted by its algorithm.
This story is only one of many in which children have died as a direct result of this challenge. Nylah Anderson, 10, also died from the Blackout Challenge last December, and her mother sued TikTok in May, alleging that the app repeatedly pushed dangerous content to her daughter. Two weeks ago, a federal judge dismissed the lawsuit, ruling that Section 230 shielded TikTok from any liability in Nylah’s death. The judge added that it did not matter whether TikTok’s algorithm elevated the challenge in Nylah’s feed, because the algorithm enjoys the same protections.
Parents like Nylah’s can’t help but wonder, “Where are our protections?” It’s David and Goliath, only David has no stones for his sling.
A quick dive into the history of Section 230 makes that question even more harrowing. Section 230 was part of a 1996 law, the Communications Decency Act. The Act contained a number of provisions, but its overarching goal was to prevent minors from seeing or gaining access to sexually explicit material on the internet. Plainly and intentionally, the law was designed to protect impressionable kids from indecent content they would be better off not seeing. One year later, the U.S. Supreme Court ruled that two provisions of the Communications Decency Act were unconstitutionally overbroad, because in pursuit of protecting minors from harmful speech, they suppressed a significant amount of protected adult speech. But while those provisions were struck down, Section 230 remains in effect.
It is grimly ironic that a law originally designed to protect children was used to shield TikTok from liability in a lawsuit alleging the app boosted harmful content to children.
As Milton Friedman said: “Judge public policies by their results, not their intentions.” There is no denying that, in the cases of Lalani and Nylah, the results are tragic. When a company like TikTok can evade all responsibility while children die from content its app promotes, the status quo is not protecting children the way it should.