New plans to use personal data to predict crimes raise serious questions about fairness and potential discrimination.
Evidence from places where such systems are already deployed shows they haven't reduced crime rates and have often unfairly targeted specific communities.
The algorithms are trained on past policing data, which can reflect historical biases such as the over-policing of certain neighbourhoods; a model trained on those records learns and reproduces that bias, turning skewed data into skewed predictions and renewed unfair targeting.
These systems aren't so much predicting crime as predicting where police have looked for it before; by steering attention back toward already over-policed areas, they reinforce existing inequalities, making them unreliable and potentially harmful.
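To make that feedback loop concrete, here is a minimal, hypothetical simulation; the districts, starting counts, and rates are invented purely for illustration and do not model any real deployment:

```python
import random

random.seed(0)

# Hypothetical numbers for illustration only.
TRUE_RATE = 0.3                # identical underlying incident rate in BOTH districts
records = {"A": 20, "B": 10}   # district A starts with more records only because
                               # it was historically patrolled more heavily

for _ in range(1000):          # e.g. 100 days of 10 patrols each
    # "Predictive" allocation: send each patrol where past records are
    # densest -- i.e. let the biased historical data drive the decision.
    district = random.choices(list(records),
                              weights=list(records.values()))[0]
    # Incidents are only recorded where patrols actually go, so the data
    # measures police attention, not underlying crime.
    if random.random() < TRUE_RATE:
        records[district] += 1

print(records)  # district A keeps accumulating more records than B,
                # even though their true rates are identical
```

Because records only accumulate where patrols are sent, the system's own output shapes its future training data: district A keeps "confirming" the prediction and the gap between the districts widens, despite both having the same underlying rate.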