Fighting crime with AI needs more transparency, former MI5 chief says


‘You have got to be confident that you have not inadvertently built prejudice in there,’ he says

Sunday, 29th December 2019, 12:48 pm

Lord Evans of Weardale is a former head of MI5 (Photo: PA)

Lord Evans of Weardale, who is leading a review into AI use in the public sector, said he found it “troubling” how difficult it is to find out how authorities employ this technology.

“At the very minimum, [AI use] should be visible, and declared, where it has the potential for impacting on civil liberties and human rights and freedoms,” he said.


Lord Evans, who was director general of the UK’s Security Service between 2007 and 2013, said there were particular concerns over its use in the police service.

“If you’re using it in the criminal justice system, to assist in decisions on granting bail and all that sort of stuff,” he said in an interview with The Sunday Telegraph, “you have got to be confident that you have not inadvertently built prejudice in there.”

AI use is ‘difficult to find out’

Civil rights groups in the UK have opposed police use of facial recognition technology (Photo: Robyn Beck/AFP/Getty Images)

He said: “That’s where we would say the most important area is.”

Authorities currently use facial recognition software to help determine whether suspects should receive bail.


Lord Evans, whose review is due to be submitted in February, said it is still “very difficult to find out where AI is being used in the public sector”.

“It shouldn’t, in our view, be as difficult as that,” he said.

“We haven’t got a very good view as to where this is being used and how. I think that’s a little bit troubling.”

‘Proper scrutiny and accountability’

Lord Evans said he believed public authorities should be “proactively open” about using artificial intelligence.

This is “not because there’s a problem” but because it is essential for “proper scrutiny and accountability”.

An independent review in 2019 found that the Met police’s facial recognition technology failed to correctly identify suspects in 81 per cent of cases.

Duncan Ball, the Met’s deputy assistant commissioner, called the report’s tone “negative and unbalanced”, adding: “We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer.”

Additional reporting by Press Association.
