Weaponized drones. Machines that attack on their own. ‘That day is going to come’


Technologists and researchers are warning about the threat such technology poses to cybersecurity, the practice that keeps computers and data, including those of governments and corporations, safe from hackers.

In February, a study from teams at the University of Oxford and the University of Cambridge warned that AI could be used as a tool to hack into drones and autonomous vehicles and turn them into potential weapons.

“Autonomous cars like Google’s (Waymo) are already using deep learning and can already evade obstacles in the real world,” Caspi said, “so evading traditional anti-malware systems in the cyber domain is possible.”

Another study, by U.S. cybersecurity software giant Symantec, said that 978 million people across 20 countries were affected by cybercrime last year. Victims of cybercrime lost a total of $172 billion — an average of $142 per person — as a result, researchers said.

The fear for many is that AI will usher in new forms of cyber breaches that bypass traditional defenses against attacks.

“We’re still in the early days of the attackers using artificial intelligence themselves, but that day is going to come,” warns Nicole Eagan, CEO of cybersecurity firm Darktrace. “And I think once that switch is flipped on, there’s going to be no turning back, so we are very concerned about the use of AI by the attackers in many ways because they could try to use AI to blend into the background of these networks.”

CNBC’s “Beyond the Valley” brings listeners the brightest minds in technology discussing all the trends shaping the tech industry and your world. Listen to the podcast and sign up for the “Beyond the Valley” newsletter here.

