16 May 2018

Human rights groups call for protections against discriminatory and biased artificial intelligence

The company behind COMPAS disputed the report, claiming that the system bases its risk assessments on a variety of questions, none of which asks about race. The report, by contrast, examined the data directly: it took the risk scores assigned to some 7,000 people and followed up over several years to compare those scores against whether the defendants actually reoffended. Even though the algorithm does not ask about race explicitly, the bias still appeared: the system wrongly labeled black defendants as likely reoffenders at roughly twice the rate of white defendants.
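To make the kind of check the report describes concrete, here is a minimal sketch of how one might measure that disparity: compare false positive rates (defendants labeled high risk who did not reoffend) across groups. The column names and toy data are assumptions for illustration, not the report's actual dataset or code.

```python
# Illustrative sketch only, not the report's actual analysis.
# Assumed columns: "race", "high_risk" (score labeled the defendant
# a likely reoffender), and "reoffended" (whether they actually did
# within the follow-up period).
import pandas as pd

def false_positive_rate(df: pd.DataFrame, group: str) -> float:
    """Share of defendants in `group` who did NOT reoffend
    but were nevertheless labeled high risk."""
    subset = df[(df["race"] == group) & (df["reoffended"] == False)]
    if len(subset) == 0:
        return float("nan")
    return (subset["high_risk"] == True).mean()

# Toy data standing in for the ~7,000 real records the report examined.
scores = pd.DataFrame({
    "race":       ["black", "black", "black", "white", "white", "white"],
    "high_risk":  [True,    True,    False,   True,    False,   False],
    "reoffended": [True,    False,   False,   False,   False,   True],
})

for g in ["black", "white"]:
    print(g, false_positive_rate(scores, g))
```

A gap between the two printed rates is the disparity at issue: a system can avoid asking about race and still produce racially skewed error rates.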
