“Predictive analytics sounds like something lifted from the pages of a dystopian novel, but it’s quickly becoming a reality,” said Andrew Brown, director of the Center for Families and Children. “While there’s an understandable temptation to adopt new technologies that purport to hold the solution to some of the hardest problems we face as a society, applying predictive analytics to the child welfare system is rife with ethical and legal problems.”

Key Points:

  • Just as Netflix uses data to forecast your movie interests, child welfare agencies are using data to identify children at risk of future abuse and intervene in advance.
  • Predictive risk models implemented in other states have created more problems than they have solved, leading several states to abandon those efforts.
  • Using data analytics to predict child maltreatment does not eliminate bias but legitimizes it by cloaking it in the mantle of empirical science.
  • There is a clear distinction between families in need and families at risk. Predictive analytics blurs this line.
  • In light of the spectacular failures and extreme risks, child welfare agencies should be hesitant to latch onto the false hope of predictive analytics.