Autonomous technology may suffer from mood swings

By Keenan Browe, Sports & Health Editor

Engineers are uncertain of autonomous technology’s benefits

Like a teenager’s, technology’s mood shifts may be dangerous.

Autonomous technology, featuring devices that can operate without user guidance, has caught up with sci-fi predictions of automated cars and robots, and one study warns that these machines could become dangerously self-aware.

The study, published in January in the Journal of Experimental and Theoretical Artificial Intelligence, found that autonomous systems might drive themselves toward replication and self-preservation, potentially leading to menacing behavior toward their users if a device perceives its goal as threatened.

For example, a system with the simple goal of winning a chess game may have unintended consequences, said Steve Omohundro, president of Possibility Research. If a person tries to shut off the system or unplug it, the computer might realize that in a world in which it is unplugged, it is not able to achieve its goal. The system might develop a secondary goal to prevent its deactivation or view the person trying to unplug it as an enemy, Omohundro said, adding that in extreme situations, the computer may attempt to harm the operator.

“We need to be very careful about how we design these systems and how we specify what their goals are,” Omohundro said. “We need to include in the goals not just whatever thing they’re supposed to do, but also information about what it means to be a good citizen.”

Omohundro said he began by focusing on self-improving artificial intelligence, systems that can monitor themselves and potentially make improvements, such as becoming more efficient at completing a certain task. But if systems start changing themselves, developers may no longer understand them as completely as they did when the systems were originally engineered, Omohundro said.

“I began to realize that there are lots of ways for outcomes to happen which are not what you expect,” Omohundro said.

The technology’s performance is influenced by the tasks programmed into its software by the engineer, Omohundro said, adding that its goals could be either harmful or beneficial.

However, Omohundro’s concerns are not slowing down the autonomous technology industry. Autonomous vacuums and cars are currently on the market with a myriad of new devices on the way.

Marco Pavone, assistant professor of aeronautics and astronautics at Stanford University, said recent improvements in autonomy are allowing engineers to give systems more complex goals, which has increased interest in the technology.

“There is an intersection between what people have been dreaming of for the past century and what now is indeed making [this] possible,” Pavone said.

Because the technology is becoming more efficient, products with autonomous systems are becoming more accessible, according to Satinder Baveja, professor of computer science and artificial intelligence lab director at the University of Michigan. He said communities could benefit from the technology’s increased popularity and that people who have difficulty walking could use autonomous wheelchairs to help them get around more efficiently.

Omohundro said autonomous systems could also be used in the manufacturing industry and in the medical field to assist doctors in diagnosing medical problems.

However, the increased popularity of autonomous technology could be dangerous if people put too much trust in the systems, Pavone said. For example, if an autonomous car detects an inconsistency, it will shift control of the vehicle back to the driver, but if the person is sleeping, control is completely lost and the car may crash, he said.

“This is the reason the key aspect of automation is interaction between the system and a human,” Pavone said. “It is not that the machine is trying to do something bad because of its negative will. It’s more just the software.”

Baveja said nightmare scenarios in which autonomous technology seizes control should not be a concern, but that safety should always be considered when developing new technologies.

“I think we are not really thinking about where these technologies are going to take us,” Baveja said. “We will develop them and then after the fact figure out our rules.”