In “Scientists Worry Machines May Outsmart Man,” John Markoff reports on concerns that machines might one day outsmart their human creators. It’s the stuff of science fiction, no? It reminds me of the endgame in SimEarth.
A robot that can open doors and find electrical outlets to recharge itself. Computer viruses that no one can stop. Predator drones that, though still remotely controlled by humans, come close to machines that can kill autonomously.
> Impressed and alarmed by advances in artificial intelligence, a group of computer scientists is debating whether there should be limits on research that might lead to loss of human control over computer-based systems that carry a growing share of society’s workload, from waging war to chatting with customers on the phone.
Although I pretty much dismiss this concern out of hand (who would build a machine that’s out of control?), I did have a what-if moment.
- If machines ran the world, would they wage wars?
- If machines ran the world, would they immediately take steps to resolve global heating?
- If machines ran the world, would there be capital punishment?
- If machines ran the world, would they behave differently toward each other based on the color of their paint?
- If machines ran the world, would they prevent each other from saying or writing things?
- If machines ran the world, would they worship humans?
Link to Mr. Markoff’s article in the New York Times.