Posts Under: data-driven innovation

SIIA Releases Brief on Algorithmic Fairness

Recent technological developments have led to the rise of “big data” analytics, including machine learning and artificial intelligence. These technologies will without question provide ample opportunities for growth for consumers, businesses, and the global economy as a whole. But as this technological evolution continues, it does not come without risk. Over the last few years, algorithmic fairness has become an issue of serious debate. Most recently, Cathy O’Neil released a book titled “Weapons of Math Destruction,” and Frank Pasquale published “The Black Box Society,” in which they examine discrimination and the role that algorithms play in exacerbating it. SIIA responded to these works in a blog post arguing that tech leaders must act quickly to ensure algorithmic fairness. Going further, on Friday, November 4, 2016, SIIA released an issue brief on the topic ...


Assessing the Value of Data Analytics in Criminal Justice and Beyond

This week, SIIA published an issue brief assessing the use of data analytics in the criminal justice system. Not surprisingly, data analytics has helped to reduce crime and improve the criminal justice system, particularly through its application in predictive policing and criminal risk assessment. The brief also explores critical questions and concerns raised about the effectiveness and unintended outcomes of the various tools in use today. The report is timely, as it coincides with a critical decision handed down by the Wisconsin Supreme Court on the use of evidence-based risk assessment tools at sentencing. The Court upheld the use of predictive tools, such as the COMPAS tool at the center of the case, but it ruled that “risk scores may not be considered as the determinative factor in deciding whether the offender can be supervised safely and effectively in the community.” [emphasis added] Essentially, the court’s decision confirme ...
