In an effort to avoid 'Nazi-robot syndrome', IBM has launched a new tool that the company says can detect bias in artificial intelligence (AI).
The Fairness 360 Kit will analyse how machine learning algorithms make decisions in real-time and figure out if they are accidentally being biased; for example, by failing to correctly identify non-white people in photos.
Big Blue's software boffins have made the Fairness 360 Kit available on the cloud and as open source, so it should be fairly easy for smart systems and software builders to put the tool to good use.
The way AI code changes and mutates as systems and software learn more things can make it difficult for developers to see where bias has been created and adopted.
Some bias can be traced back to the unbalanced datasets used to train machine learning algorithms; other bias stems from the developers themselves, who may have unconsciously programmed an AI's initial instructions without accounting for attributes of a race or gender other than their own.
IBM reckons its tool will open the pseudo black box in which AI learns and develops, and give developers more transparency in the judgements that their smart systems are coming up with.
"Machine learning models are increasingly used to inform high-stakes decisions about people. Although machine learning, by its very nature, is always a form of statistical discrimination, the discrimination becomes objectionable when it places certain privileged groups at systematic advantage and certain unprivileged groups at systematic disadvantage. Bias in training data, due to either prejudice in labels or under-/over-sampling, yields models with unwanted bias," said Kush Varshney, principal research staff member and manager at IBM Research.
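The "systematic advantage" Varshney describes is typically quantified with group-fairness metrics such as statistical parity difference and disparate impact. As a rough, hand-rolled sketch of the arithmetic behind such metrics (illustrative only, and not the toolkit's actual API), using hypothetical loan-approval data:

```python
# Hand-rolled sketch of two common group-fairness metrics.
# Illustrative only -- IBM's toolkit wraps metrics like these
# in a richer dataset/metric API.

def favorable_rate(outcomes):
    """Fraction of a group that received the favourable outcome (1)."""
    return sum(outcomes) / len(outcomes)

def statistical_parity_difference(unprivileged, privileged):
    """Difference in favourable-outcome rates; 0 means parity."""
    return favorable_rate(unprivileged) - favorable_rate(privileged)

def disparate_impact(unprivileged, privileged):
    """Ratio of favourable-outcome rates; 1 means parity, and a value
    below 0.8 is a common red flag (the 'four-fifths rule')."""
    return favorable_rate(unprivileged) / favorable_rate(privileged)

# Toy loan-approval outcomes (1 = approved) -- hypothetical data
privileged_group = [1, 1, 1, 0, 1, 1, 0, 1]    # 75% approved
unprivileged_group = [1, 0, 0, 1, 0, 0, 1, 0]  # 37.5% approved

spd = statistical_parity_difference(unprivileged_group, privileged_group)
di = disparate_impact(unprivileged_group, privileged_group)
print(f"statistical parity difference: {spd:.3f}")  # -0.375
print(f"disparate impact: {di:.3f}")                # 0.500
```

Here the unprivileged group's approval rate is half the privileged group's, well under the 0.8 threshold, which is exactly the kind of disparity a bias-checking tool is meant to surface.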
The AI Fairness 360 Kit will check for bias during the initial training phase of AI development, again during testing and deployment, and once more at the final stage of the AI's lifecycle.
Given AIs have already shown a propensity to get pretty racist after exposure to public data, Big Blue's tool could just be the means to stop the rise of xenophobic machines.