Musk said that if an AI had a specific goal and humans happened to be in its way, it would destroy humanity en route without a second thought. "No hard feelings," he said in the documentary 'Do You Trust This Computer?'
"If one company or small group of people manage to develop god-like super intelligence, they could take over the world," Musk said in the documentary.
The documentary takes a sobering look at the dangers associated with AI, including what could happen if AI evolves to be smarter than humans and becomes its own master.
It could mean that organizations fashion a dangerous AI that outlives its human leaders and can never be destroyed. According to Musk, one way to avoid this outcome is to democratize AI.
"We are rapidly heading towards digital superintelligence that far exceeds any human. I think it's very obvious. We have five years. I think digital superintelligence will happen in our lifetime, 100%," Musk said.
The problem with digital superintelligence is that once it becomes smarter than humans, it will be very difficult to control.
In an early September post on Twitter, Musk said that "competition for AI superiority at a national level will most likely be the cause of World War III." He has described AI as the greatest risk facing civilization.