Weaponized drones. Machines that attack on their own. ‘That day is going to come’

CNBC | Ryan Browne | 07/24/2018

  • Artificial intelligence has clear positive uses, but it could be used to teach machines to attack people and their computer networks on their own.
  • Drones and autonomous vehicles could be hacked using AI and turned into weapons.
  • Traditional cybersecurity methods won’t know how to cope with new attacks carried out by smart machines.

The idea of a computer program learning by itself, growing in knowledge and becoming increasingly sophisticated may be a scary one. It’s even scarier when it’s learning to attack things.

It’s easy to dismiss artificial intelligence as yet another tech buzzword, but it’s already being used in everyday applications via algorithmic processes known as machine learning.

Far from the killer robots of “Blade Runner,” machine learning applications are designed to train a computer to fulfill a certain task on its own. Machines are essentially “taught” to complete that task by doing it over and over, learning the many obstacles that could inhibit them.

“Such attacks, which seem like science fiction today, might become reality in the next few years,” Guy Caspi, CEO of cybersecurity start-up Deep Instinct, told CNBC’s new podcast “Beyond the Valley.”