I’m old enough to have heard the term “latchkey kid” as a child. I grew up walking home from school to an empty house, finishing my homework, shooting hoops with the neighbor kids, and then rejoining my family for dinner.

This is something of a foreign concept for parents today, when kids are far more inclined to come home and binge-watch TikTok and scroll through Instagram than to socialize or fend for themselves until mom and dad return from work after 5 p.m. While social forces have certainly driven this new norm, Facebook has made “big bets” on designing products that reach tweens ages 10 to 12, children ages 5 to 9 and, yes, even kids ages 0 to 4.

While the addictive nature of social media is one factor parents must weigh when determining screen time, a new Internet Watch Foundation report illuminates another concern: the United States now accounts for 30% of the global volume of child sexual abuse material (CSAM), up from 21% at the end of 2021. As defined by federal law, CSAM is any visual depiction of sexually explicit conduct involving a minor.

To understand the scale of this problem: in the first three quarters of 2022, Facebook flagged more than 55.5 million instances of CSAM, a meteoric rise from 35.6 million in 2021, and that figure does not even include the incidents Facebook fails to identify at all. TikTok is no better, amassing more than 56 million documented cases of CSAM over the same period.

When this content is found, it is incumbent upon each social media platform to report it to the National Center for Missing and Exploited Children (NCMEC). In 2019 alone, 521,658 cases were reported to NCMEC in the United States, and only 112,202 of those reports were investigated by law enforcement, an investigation rate of just over 20%. As the quantity of CSAM increases, investigations cannot keep up. Under the current structure, the federal government is unable to sufficiently address this crisis, and social media companies claim they lack the bandwidth even to reactively clean up CSAM on their platforms.

Texas’s 88th Legislature has an opportunity to lead on this issue by considering enhanced consumer protection and parental empowerment measures to better protect minors online from CSAM, trafficking, and obscene and harmful content. State legislatures and the U.S. Congress have elevated this as an issue in need of public policy, and Texas can set the standard by passing gold-standard legislation this upcoming session.

As concerning as this data is, parents can use it to shape their plan for their child’s online presence and use of social media. Even though there is trepidation nowadays about raising a latchkey kid, maybe that skinned knee isn’t as perilous as the alternative.