A space for sharing and discussing news related to global current events, technology, and society.
69470 Members
We'll be adding more communities soon!
© 2020 Relevant Protocols Inc.
It’s no secret that predictive policing tools are racially biased. A number of studies have shown that racist feedback loops can arise when algorithms are trained on police data, such as arrest records. But new research shows that training predictive tools in ways meant to lessen bias has little effect. Arrest data biases predictive tools because police are known to arrest more people in Black and other minority neighborhoods, which leads algorithms to direct more policing to those areas, which in turn leads to more arrests. The result is that predictive tools misallocate police patrols: some neighborhoods are unfairly designated crime hot spots while others are underpoliced.
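The feedback loop described above can be illustrated with a toy simulation (a hypothetical sketch, not the model from the research): two neighborhoods have the same underlying crime rate, but one starts with more recorded arrests. A "predictive" allocator that assigns patrols in proportion to past arrests never corrects the initial bias, because the extra patrols generate the extra arrests that justify them.

```python
import random

random.seed(0)

# Two neighborhoods, A and B, with the SAME true crime rate.
# A starts with more recorded arrests (over-policing bias).
true_crime_rate = {"A": 0.10, "B": 0.10}
arrests = {"A": 60, "B": 40}  # biased historical arrest counts
patrols_total = 100

for step in range(10):
    # "Predictive" model: allocate patrols proportional to past arrests.
    total = arrests["A"] + arrests["B"]
    patrols = {n: patrols_total * arrests[n] / total for n in arrests}
    # More patrols in a neighborhood -> more arrests recorded there,
    # even though the true crime rate is identical in both.
    for n in arrests:
        arrests[n] += sum(random.random() < true_crime_rate[n]
                          for _ in range(round(patrols[n])))

share_A = arrests["A"] / (arrests["A"] + arrests["B"])
print(f"Share of arrests in A after 10 rounds: {share_A:.2f}")
```

Despite identical true crime rates, neighborhood A keeps roughly its inflated share of arrests: the loop preserves and entrenches the initial bias rather than washing it out.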