BI Insights

Are Artificial Intelligence Systems Intrinsically Racist?

10 March, 2017

At the heart of AI systems are statistical models that have no concept of social inequality, fairness, or hardship. In her book Weapons of Math Destruction, Cathy O'Neil argues that big-data models discriminate at nearly every juncture of our society, and that they hit the poor hardest at each opportunity.

How is this happening? Her book points to many ways data is misused, but the most troubling is the use of proxies. Proxies are statistical stand-ins: data correlated with an attribute and collected for one purpose, then repurposed for another out of economic interest or convenience. There are a number of examples of this; the most telling is the use of your FICO score as a proxy.
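
To make the proxy problem concrete, here is a minimal sketch (not taken from the book; the variable names and numbers are illustrative assumptions). It simulates a score that leans on a proxy feature correlated with group membership, then shows that approval rates diverge sharply between groups, even though the protected attribute is never fed to the score and the underlying "worthiness" is identical across groups by construction.

```python
# A minimal sketch of proxy discrimination. All names and
# numbers here are illustrative assumptions, not from the book.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Protected attribute: group 0 or 1 (e.g., a demographic category).
group = rng.integers(0, 2, size=n)

# Proxy feature: strongly correlated with group membership
# (think of a ZIP code or shopping pattern standing in for race or income).
proxy = rng.normal(loc=np.where(group == 1, -0.8, 0.8), scale=1.0)

# True creditworthiness is identical across both groups by construction.
worthiness = rng.normal(size=n)

# A naive score that leans on the proxy because it is cheap to obtain.
score = 0.5 * worthiness + 0.5 * proxy

# Approve the top half by score, without ever looking at `group`.
approved = score > np.median(score)

for g in (0, 1):
    rate = approved[group == g].mean()
    print(f"group {g}: approval rate = {rate:.1%}")
```

On this synthetic data the two approval rates land on the order of 70% versus 30%: the proxy smuggles group membership into the score, so excluding the protected attribute does nothing to prevent the disparate outcome.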

Related Articles

30 March, 2017

Artificial Intelligence Will Make Its Mark Within Next 3 Years

Publication: Forbes

30 March, 2017

How Companies Will Use Artificial Intelligence To Sell To You

Publication: Fortune

5 April, 2017

How Artificial Intelligence Will Change Everything

Publication: Huffington Post

5 April, 2017

Why Artificial Intelligence Still Needs A Human Touch

Publication: Information Age
