GeoPython2019

Detect and Remediate Bias in Machine Learning Datasets and Models
2019-06-25, 11:30–12:00, Room 1

We will share lessons learnt while using AI Fairness 360 and show how to leverage it to detect bias and debias models during pre-processing, in-processing, and post-processing.


One of the most critical and controversial topics in artificial intelligence is bias. As more apps that rely on artificial intelligence come to market, software developers and data scientists can unwittingly inject their personal biases into these solutions.

Because flaws and biases may not be easy to detect without the right tool, we have launched AI Fairness 360, an open-source library to detect and remove bias in models and datasets.
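One of the simplest ways to quantify such bias is statistical parity difference: the gap between the favorable-outcome rates of the unprivileged and privileged groups. AIF360 exposes this metric (among many others); the standalone sketch below computes it directly on toy data for illustration, without depending on the library.

```python
def statistical_parity_difference(predictions, protected):
    """Statistical parity difference:
    P(outcome=1 | unprivileged) - P(outcome=1 | privileged).

    predictions: list of 0/1 outcomes
    protected:   list of 0/1 group flags (1 = privileged, 0 = unprivileged)
    A value near 0 indicates parity; a negative value means the
    unprivileged group receives favorable outcomes less often.
    """
    priv = [p for p, g in zip(predictions, protected) if g == 1]
    unpriv = [p for p, g in zip(predictions, protected) if g == 0]
    rate = lambda xs: sum(xs) / len(xs)
    return rate(unpriv) - rate(priv)


# Toy data: the privileged group gets the favorable outcome 75% of the
# time, the unprivileged group only 25% of the time.
preds = [1, 1, 1, 0, 1, 0, 0, 0]
groups = [1, 1, 1, 1, 0, 0, 0, 0]
print(statistical_parity_difference(preds, groups))  # -0.5
```

The same quantity is available in AIF360 as a method on its dataset-metric classes; computing it by hand like this is mainly useful for understanding what the library reports.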

The AIF360 Python package includes a comprehensive set of metrics for datasets and models to test for biases, explanations for these metrics, and algorithms to mitigate bias. In total, AIF360 has 30 fairness metrics and 10 bias mitigation algorithms.
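One of the pre-processing mitigation algorithms AIF360 ships is reweighing (Kamiran & Calders), which assigns each (group, label) combination the weight P(group) x P(label) / P(group, label) so that group membership and label become statistically independent in the weighted data. This is a from-scratch sketch of that idea, not the AIF360 API itself:

```python
from collections import Counter


def reweighing_weights(groups, labels):
    """Return a dict mapping (group, label) -> instance weight.

    Weight = P(group) * P(label) / P(group, label).
    Under-represented combinations (e.g. unprivileged group with a
    favorable label) receive weights > 1; over-represented ones < 1.
    """
    n = len(labels)
    p_group = Counter(groups)
    p_label = Counter(labels)
    p_joint = Counter(zip(groups, labels))
    return {
        (g, y): (p_group[g] / n) * (p_label[y] / n) / (p_joint[(g, y)] / n)
        for (g, y) in p_joint
    }


# Toy data: group 1 (privileged) gets the favorable label more often.
groups = [1, 1, 1, 1, 0, 0, 0, 0]
labels = [1, 1, 1, 0, 1, 0, 0, 0]
weights = reweighing_weights(groups, labels)
# Unprivileged-positive instances get weight 2.0, privileged-positive
# instances get 2/3, nudging a weight-aware learner toward parity.
print(weights)
```

A downstream classifier then consumes these as sample weights (most scikit-learn estimators accept a `sample_weight` argument in `fit`), which is exactly how a reweighed dataset is used in practice.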
