“Yet as awareness of algorithmic bias has grown, a rift is emerging around the question of what to do about it. On one side are advocates of what might be called the “inclusion” approach. These are people who believe that criminal justice technologies can be made more benevolent by changing how they are built. Training facial recognition machine learning models on more diverse data sets, such as the one provided by IBM, can help software more accurately identify black faces. Ensuring that more people of color are involved in the design and development of these technologies may also mitigate bias.

If one camp sees inclusion as the path forward, the other camp prefers abolition. This involves dismantling and outlawing technologies like facial recognition rather than trying to make them “fairer.” The activists who promote this approach see technology as inextricable from who is using it. So long as communities of color face an oppressive system of policing, technology — no matter how inclusively it is built — will be put to oppressive purposes.”
Via E-Flux conversations.