"Training algorithms on crime reports from victims rather than arrest data was supposed to make the tools less biased. But it doesn’t look like it does." It’s no secret that predictive policing tools are racially biased. A number of studies have shown that racist feedback loops can arise if algorithms are trained on police data, such as arrests. But new research shows that training predictive tools in a way meant to lessen bias has little effect. Arrest data biases predictive tools because police are known to arrest more people in Black and other minority neighborhoods, which leads algorithms to direct more policing to those areas, which leads to more arrests. The result is that predictive tools misallocate police patrols: some neighborhoods are unfairly designated crime hot spots while others are underpoliced.
"Training algorithms on crime reports from victims rather than arrest data was supposed to make the tools less biased. But it doesn’t look like it does." It’s no secret that predictive policing tools are racially biased. A number of studies have shown that racist feedback loops can arise if algorithms are trained on police data, such as arrests. But new research shows that training predictive tools in a way meant to lessen bias has little effect. Arrest data biases predictive tools because police are known to arrest more people in Black and other minority neighborhoods, which leads algorithms to direct more policing to those areas, which leads to more arrests. The result is that predictive tools misallocate police patrols: some neighborhoods are unfairly designated crime hot spots while others are underpoliced.