Op-Ed: Could big data have saved Anthony Avalos’ life?
Even Anthony Avalos' relatives can't agree on whether the 10-year-old Lancaster boy who died last week was being regularly abused by his mother and her boyfriend. Family members were heard arguing at his memorial service, with one shouting, "Stop blaming people; let's wait for the report," while another said that in the past the boy had had "bruises all over his body."
If a child's extended family can't be certain he is being abused, how can we expect government agencies to figure it out? State and local child protective services have tried to answer this question for years, without a lot of success.
In New York earlier this month, a 5-month-old died at the hands of his mother, according to police. City authorities, who had removed three older children from her care in 2010, checked out a complaint in May but decided the baby wasn't in danger — the mother had no history of outright abuse, just neglect. Two weeks later, Raymond Porfil was dead.
In Michigan, too, child protective services saw neglect but not abuse in a family a few months ago. The mother, an addict, left her child alone for long periods while she was high. The problem was deemed too low-level to merit further surveillance. The child died.
The Avalos death is still being investigated. His mother had previous interactions with the county Department of Children and Family Services — according to one source, there were as many as 16 complaints since 2013, and Anthony was even briefly placed with an aunt. After Anthony's death, seven other children were removed from the home.
A report released earlier this year by the federal Department of Health and Human Services found that nationally there were 1,700 child fatalities resulting from maltreatment reported in fiscal year 2016, compared to 1,589 the previous year — a 7% increase.
The opioid epidemic is partly to blame. As one Indiana judge explained to the Associated Press when the report was released: "Traditional systems of early warning are overwhelmed. And parents, because of addiction, aren't seeking intervention because their kids are going to be removed. It allows kids to die. It's a fact."
Local officials blame budgetary issues and the resulting high caseloads among social workers. Unfortunately, neither adding more caseworkers nor increasing their training has historically resulted in significantly improved outcomes for child welfare systems. Bobby Cagle, director of the L.A. County Department of Children and Family Services, said of such deaths: "We work hard to figure out how they might have been prevented in the first place. But, unfortunately, we are reminded at times that people are capable of the unspeakable."
This is no doubt true, but there are approaches that are more effective at preventing child abuse than caseworker judgment calls.
The most promising is predictive analytics, the same sort of "big data" modeling that has revolutionized baseball and investing. We can't know everything about what goes on inside of families, but what we do know, crunched via algorithms, has the potential to tell us the likelihood that a child will be subject to neglect, abuse or worse.
Civil libertarians raise concerns about such projects. Studies have suggested that models used to forecast criminal behavior over-predict crime by blacks and Latinos. There are also questions about what kinds of data are collected and how. But in pilot programs — including one in Los Angeles in 2014 — there is no new intrusion into people's lives. The goal is to do a better job of analyzing what's already known to determine how urgently problems should be investigated and how resources such as preventive services are allocated.
The Los Angeles program was discontinued because researchers made what they now say was a mistake: using a private firm, rather than a public university or some other open-source entity, to create the algorithm. The result was that the project was not transparent enough; the county couldn't replicate the results. Perhaps more important, the data also could not be updated in real time. And with predictive analytics, more inputs, and more current ones, make for greater accuracy.
The Allegheny Family Screening Tool was rolled out in Pittsburgh in August 2016, and it didn't have these problems. More than 100 factors go into the system's algorithm (race is not one of them). When calls alleging mistreatment are received on a hotline, screeners enter a name or address and the tool calculates a score between 1 and 20. The higher the score, the greater the likelihood that the child in question will be re-referred or removed from the home within two years, based on what is known from historical data. The model predicts with 76% accuracy whether a child will be placed in foster care within two years, and with 73% accuracy whether child services will be alerted about that child again. Relying on caseworkers yields results closer to simply flipping a coin.
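To make the mechanics concrete, here is a minimal sketch of how a screening score of this kind can be produced: a model is trained on historical case records, and its predicted probability is mapped onto a 1-to-20 scale. The factors, data and model below are invented for illustration; this is not the actual Allegheny Family Screening Tool.

```python
# Toy screening-score sketch (hypothetical factors and data, not the Allegheny tool).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented historical records: [prior neglect referrals, prior placements],
# labeled 1 if the child was re-referred or removed within two years.
X_train = np.array([[0, 0], [1, 0], [2, 0], [3, 1], [5, 1], [6, 2]])
y_train = np.array([0, 0, 0, 1, 1, 1])

model = LogisticRegression().fit(X_train, y_train)

def screening_score(case_factors):
    """Map the model's predicted probability of re-referral onto a 1-20 score."""
    prob = model.predict_proba([case_factors])[0, 1]
    return min(20, int(prob * 20) + 1)

# A hotline call about a family with four prior neglect referrals and one prior placement.
print(screening_score([4, 1]))
```

A real tool would, of course, draw on far more factors and far larger administrative datasets, and would be validated against outcomes like the 76% and 73% figures cited above.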
And important patterns are emerging. "What you find," explains Richard Gelles, former dean of social work at the University of Pennsylvania, "is there are a series of neglect reports — four, five, six neglect reports — that predate a fatality." Until models revealed this, most child welfare workers assumed there would be "a progression of physical violence up to a fatal incident." In other words, repeated problems that stop short of abuse require strong action.
There are other ways big data could improve child protection too. Many states have no way of knowing whether a parent has been charged with abuse in another state because there is no federal database that keeps track of this information. And yet one researcher found that infants born to parents with a prior termination of parental rights made up more than 10% of entries into foster care in Maryland in 2013.
Crunching data is not going to solve all of our child welfare problems, but it is clear that the individual judgment of workers in our child protective services is often not sufficient. In Anthony Avalos' case, L.A.'s Children and Family Services must determine why so many complaints didn't merit more action. It should also investigate whether predictive analytics would have done a better job.
Naomi Schaefer Riley is a visiting fellow at the American Enterprise Institute studying child welfare issues. She reported on predictive analytics in Reason magazine in February.