Understanding the mechanism of racial bias in predictive risk models of child welfare : [a dissertation submitted to Auckland University of Technology in partial fulfilment of the requirements for the degree of Master of Business (MBus), 2021] / Huyen Dinh ; supervisors: Matthew Ryan, Rhema Vaithianathan.

Bibliographic Details
Main Author: Dinh, Huyen (Author)
Corporate Author: Auckland University of Technology. Faculty of Business, Economics and Law
Format: Ethesis
Language: English
Description
Summary: Each year, approximately 3.6 million children in the US are referred to Child Protective Services (CPS). Despite this high level of surveillance, child maltreatment deaths have not fallen, and many children who are victims of abuse and neglect come to the attention of CPS only when it is too late, after the point at which early intervention might have helped them. This is where Predictive Risk Modelling (PRM) comes into play: a type of statistical algorithm that uses linked administrative data to predict the likelihood of adverse events occurring in the future. A PRM tool typically estimates a child's risk of abuse and neglect at the time of birth; its predictions are then used to support decision-making about connecting families to prevention services before incidents of abuse and neglect occur. However, there are growing concerns about racial disparity in the use of PRM in the child maltreatment context: whether it will reproduce, or even exacerbate, human bias. This study focuses on understanding one cause of machine bias, namely measurement error, or target variable bias. In particular, the research investigates whether the use of a proxy variable — foster care placement, in this context — can lead to racial disparity in child maltreatment predictions.
Author-supplied keywords: Machine bias; Racial bias; Machine learning; Predictive risk modelling; Child welfare; Proxy variable bias; Measurement error in proxy variable.
Physical Description:1 online resource
Bibliography:Includes bibliographical references.