Institutions involved in predictive modeling are using ever more advanced techniques to predict outcomes of interest, from credit scoring to facial recognition to spam detection. Institutions assess the performance of these models through standard measures such as accuracy (the number of correct predictions divided by the total number of predictions) or error rate (the number of incorrect predictions divided by the total number of predictions). In addition, they can assess the fairness of their predictions with respect to vulnerable groups using measures such as predictive parity across groups, statistical parity, or equal error rates.
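As a minimal sketch of how these measures are computed, the following compares accuracy, error rate, and the group-level quantities behind statistical parity, predictive parity, and equal error rates. The function names and toy data are illustrative assumptions, not drawn from any particular system.

```python
def accuracy(y_true, y_pred):
    """Correct predictions divided by total predictions."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def error_rate(y_true, y_pred):
    """Incorrect predictions divided by total predictions."""
    return 1 - accuracy(y_true, y_pred)

def positive_rate(y_true, y_pred):
    """Share of positive predictions; statistical parity asks that
    this rate be (roughly) equal across groups."""
    return sum(y_pred) / len(y_pred)

def precision(y_true, y_pred):
    """Fraction of positive predictions that are correct; predictive
    parity asks that this be (roughly) equal across groups."""
    predicted_pos = [t for t, p in zip(y_true, y_pred) if p == 1]
    return sum(predicted_pos) / len(predicted_pos) if predicted_pos else 0.0

def by_group(metric, y_true, y_pred, group):
    """Evaluate a metric separately for each group label."""
    out = {}
    for g in set(group):
        yt = [t for t, gg in zip(y_true, group) if gg == g]
        yp = [p for p, gg in zip(y_pred, group) if gg == g]
        out[g] = metric(yt, yp)
    return out

# Hypothetical toy data: 0/1 labels and predictions for two groups.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]
group  = ["A", "A", "A", "A", "B", "B", "B", "B"]

print(accuracy(y_true, y_pred))                        # overall accuracy
print(by_group(error_rate, y_true, y_pred, group))     # equal error rates?
print(by_group(positive_rate, y_true, y_pred, group))  # statistical parity?
```

A model can score well on overall accuracy while the per-group breakdown reveals disparities, which is why the group-level view matters.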
Institutions also face legal and ethical obligations to explain the basis of their consequential decisions to those who are affected, to regulators, and to the general public. The idea is that people have rights, grounded in autonomy and dignity, to understand why institutions make the decisions they do. When predictive models are ...
The public must be confident in the fairness of algorithms, or a backlash will threaten their very real and substantial benefits.