10 March, 2017
Are Artificial Intelligence Systems Intrinsically Racist?
At the heart of AI systems are statistical models that have no concept of social inequality, fairness, or hardship. In her book Weapons of Math Destruction (WMD), Cathy O’Neil argues that big-data models discriminate at nearly every juncture of our society, pummeling the poor at every opportunity.
How is this happening? Her book points to many avenues of data misuse, but the most offensive is the use of proxies. Proxies are statistical correlations: data collected for one purpose that is repurposed for another out of economic expedience or simple convenience. There are a number of examples of this; the most profound is the use of your FICO score as a proxy.
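To make the proxy mechanism concrete, here is a minimal sketch in Python using entirely invented numbers. It assumes a hypothetical lending model that never sees an applicant's group membership, only their zip code. Because zip code correlates with group (a legacy of segregated housing) and the model consumes historical default rates by zip, the "blind" model still produces sharply different approval rates by group:

```python
# A minimal, hypothetical sketch of proxy discrimination.
# All names, zip codes, and rates below are invented for illustration.
import random

random.seed(0)

# Historical default rates by zip code — shaped by past disinvestment,
# not by any individual's ability to repay. This is the proxy.
historical_default = {"11111": 0.05, "22222": 0.30}

# Synthetic applicants: group membership correlates strongly with zip
# code (90% of each group lives in its "home" zip).
applicants = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    home = "11111" if group == "A" else "22222"
    other = "22222" if group == "A" else "11111"
    zip_code = home if random.random() < 0.9 else other
    applicants.append({"group": group, "zip": zip_code})

# "Group-blind" model: approve whenever the applicant's zip has a
# historical default rate under the threshold. Group is never an input.
def approve(applicant, threshold=0.15):
    return historical_default[applicant["zip"]] < threshold

for g in ("A", "B"):
    members = [a for a in applicants if a["group"] == g]
    rate = sum(approve(a) for a in members) / len(members)
    print(f"group {g}: approval rate {rate:.0%}")
```

Running this, group A is approved roughly nine times as often as group B, even though the model never looked at group at all. That is the essence of a proxy: the model launders a forbidden variable through a permitted one.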