Novel algorithm can identify and minimise hidden bias in AI systems

Scientists at MIT have developed an algorithm that can automatically identify and minimise any hidden biases in the data used by artificial intelligence (AI) systems.

In recent years, AI systems have sometimes been shown to be unfair, said researchers at the Massachusetts Institute of Technology (MIT) in the US.

This is dangerous as such systems are increasingly being used to do everything from predicting crime to determining what we consume, they said.

“Last year’s study showing the racism of facial-recognition technology demonstrated a fundamental truth about AI: if you train with biased data, you will get biased results,” said Alexander Amini, a PhD student at MIT.

The new algorithm can learn both a specific task, such as face detection, and the underlying structure of the training data, which allows it to identify and minimise any hidden biases.
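
The article does not give implementation details, but the description suggests a model trained with two objectives at once: a task head for face detection and a latent summary of the images themselves. The sketch below is only an illustration of that idea, assuming a small autoencoder-style network written in PyTorch; the class name, layer sizes and loss weighting are invented here and are not taken from the researchers’ code.

```python
# Hypothetical sketch (not the authors' released code): a network that learns a
# face-detection label and a latent description of each image at the same time.
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointTaskLatentModel(nn.Module):
    def __init__(self, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        feat = 32 * 16 * 16                           # assumes 64x64 RGB inputs
        self.classifier = nn.Linear(feat, 1)          # face / no-face logit
        self.to_mu = nn.Linear(feat, latent_dim)      # latent mean
        self.to_logvar = nn.Linear(feat, latent_dim)  # latent log-variance
        self.decoder = nn.Sequential(                 # reconstructs the image
            nn.Linear(latent_dim, feat), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterise
        return self.classifier(h), self.decoder(z), mu, logvar

def loss_fn(logit, recon, mu, logvar, x, y):
    # task loss + reconstruction loss + KL term that regularises the latent space
    task = F.binary_cross_entropy_with_logits(logit.squeeze(1), y)
    rec = F.mse_loss(recon, x)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return task + rec + kl
```

Once such a model is trained, the latent codes give a compact description of every training image, which is what the automatic resampling described further down can operate on.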

In tests, the algorithm decreased “categorical bias” by over 60 per cent compared to state-of-the-art facial detection models — while simultaneously maintaining the overall precision of these systems.
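
The article does not define “categorical bias”. One common way to quantify this kind of disparity is the spread of detection accuracy across demographic subgroups, which the illustrative snippet below computes; the group names and accuracy numbers are made up, chosen only so the printout lands in the same ballpark as the reported figure.

```python
# Illustration only: variance of per-group accuracy as a disparity measure.
# The group labels and accuracies are invented, not taken from the study.
import numpy as np

def categorical_bias(accuracy_per_group):
    """Variance of per-group accuracy: 0 means every group is treated equally well."""
    return float(np.var(list(accuracy_per_group.values())))

baseline = {"group_A": 0.97, "group_B": 0.84, "group_C": 0.93, "group_D": 0.88}
debiased = {"group_A": 0.95, "group_B": 0.88, "group_C": 0.94, "group_D": 0.89}

reduction = 1 - categorical_bias(debiased) / categorical_bias(baseline)
print(f"bias reduced by {reduction:.0%}")   # the kind of comparison reported
```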

A lot of existing approaches in this field require at least some level of human input into the system to define specific biases that researchers want it to learn.

In contrast, the team’s algorithm can look at a dataset, learn what is intrinsically hidden inside it, and automatically resample it to be more fair without needing a programmer in the loop.
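
Again purely as an illustration: “resampling to be more fair” can be read as giving under-represented examples a higher chance of being drawn into each training batch, based on how rare their learned latent codes are. The histogram-based density estimate below is an assumption made for brevity, not a description of the researchers’ exact procedure.

```python
# Hypothetical sketch of automatic resampling: examples whose learned latent
# codes sit in densely populated regions of the dataset are drawn less often,
# rare ones more often.
import numpy as np

def resampling_probabilities(latents, bins=10, alpha=0.01):
    """latents: (N, D) array of per-example latent codes from a trained model."""
    n, d = latents.shape
    log_density = np.zeros(n)
    for j in range(d):  # approximate density one latent dimension at a time
        hist, edges = np.histogram(latents[:, j], bins=bins, density=True)
        idx = np.digitize(latents[:, j], edges[1:-1])  # bin index of every example
        log_density += np.log(hist[idx] + alpha)       # alpha keeps the log finite
    neg = -log_density                                 # rare regions -> larger value
    weights = np.exp(neg - neg.max())                  # stabilised before normalising
    return weights / weights.sum()

# Usage: sample training batches with these probabilities instead of uniformly, e.g.
# probs = resampling_probabilities(latent_codes)
# rng = np.random.default_rng(0)
# batch_idx = rng.choice(len(probs), size=64, p=probs)
```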

“Facial classification in particular is a technology that’s often seen as ‘solved,’ even as it’s become clear that the datasets being used often aren’t properly vetted,” said Amini.

“Rectifying these issues is especially important as we start to see these kinds of algorithms being used in security and other domains,” he said.

The system would be particularly relevant for larger datasets that are too big to vet manually, and it also extends to other applications beyond facial detection, the researchers said.
