BI Insights

Are Artificial Intelligence Systems Intrinsically Racist?

10 March, 2017

At the heart of AI systems are statistical models that have no concept of social inequality, fairness, or hardship. In her book Weapons of Math Destruction, Cathy O'Neil points out that big data discriminates at nearly every juncture of our society, pummeling the poor at every opportunity.

How is this happening? Her book points to many ways data is misused, but the most offensive is the use of proxies. A proxy is a statistical correlation: data gathered for one purpose and repurposed for another for the sake of economy or convenience. There are many examples of this; the most profound is the use of your FICO credit score as a proxy for qualities it was never designed to measure, such as reliability in hiring or risk in insurance pricing.
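To see why proxies are so corrosive, consider a minimal sketch (hypothetical variable names and numbers, written in Python with NumPy): a scoring model that is never shown a protected attribute can still produce starkly different outcomes across groups when one of its inputs happens to correlate with that attribute.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute (e.g., group membership), never fed to the model.
protected = rng.integers(0, 2, size=n)

# A "neutral" proxy (imagine a ZIP-code-derived score) that correlates
# with the protected attribute for historical reasons.
proxy = protected + rng.normal(0.0, 0.3, size=n)

# The model scores applicants using only the proxy, plus some noise.
score = -proxy + rng.normal(0.0, 0.1, size=n)
approved = score > np.median(score)

# Approval rates, split by the attribute the model never saw:
for g in (0, 1):
    print(f"group {g}: approval rate = {approved[protected == g].mean():.1%}")
```

Running this, the two approval rates differ by tens of percentage points even though the protected attribute never entered the model; the proxy carried it in.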

Related Articles

21 April, 2016

SAS Revamps Its Analytics Lineup For The Machine Learning Era

Publication: SiliconANGLE

2 March, 2017

Can Artificial Intelligence Solve Today's Big Data Dilemma?

Publication: Forbes

6 March, 2017

Why You Should Let Artificial Intelligence Creep Into Your Business

Publication: Inc.

14 March, 2017

AI Expansion Into Analytics, Intelligence Gathering And Visualization

Publication: ITProPortal
