
These were Stephen Hawking's last bold predictions on AI, superhumans, and aliens

Image: Physicist Stephen Hawking on stage with investor Yuri Milner at the announcement of the Breakthrough Starshot initiative in New York, April 12, 2016. REUTERS/Lucas Jackson

Max de Haldevang

The late physicist Stephen Hawking’s last writings predict that a breed of superhumans will take over, having used genetic engineering to surpass their fellow beings.

In Brief Answers to the Big Questions, to be published on Oct. 16 and excerpted today in the UK’s Sunday Times (paywall), Hawking pulls no punches on subjects like machines taking over, the biggest threat to Earth, and the possibilities of intelligent life in space.

Artificial Intelligence

Hawking delivers a grave warning on the importance of regulating AI, noting that “in the future AI could develop a will of its own, a will that is in conflict with ours.” A possible arms race over autonomous weapons should be stopped before it can start, he writes, asking what would happen if a crash similar to the 2010 stock market Flash Crash occurred with weapons instead. He continues:

In short, the advent of super-intelligent AI would be either the best or the worst thing ever to happen to humanity. The real risk with AI isn’t malice, but competence. A super-intelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours we’re in trouble. You’re probably not an evil ant-hater who steps on ants out of malice, but if you’re in charge of a hydroelectric green-energy project and there’s an anthill in the region to be flooded, too bad for the ants. Let’s not place humanity in the position of those ants.

Earth’s bleak future, gene editing, and superhumans

The bad news: At some point in the next 1,000 years, nuclear war or environmental calamity will “cripple Earth.” However, by then, “our ingenious race will have found a way to slip the surly bonds of Earth and will therefore survive the disaster.” The Earth’s other species probably won’t make it, though.

The humans who do escape Earth will probably be new “superhumans” who have used gene-editing technology like CRISPR to outpace others. They’ll do so by defying laws against genetic engineering to improve their memories, disease resistance, and life expectancy, he says.

Hawking seems curiously enthusiastic about this final point, writing, “There is no time to wait for Darwinian evolution to make us more intelligent and better natured.”

Once such superhumans appear, there are going to be significant political problems with the unimproved humans, who won’t be able to compete. Presumably, they will die out, or become unimportant. Instead, there will be a race of self-designing beings who are improving themselves at an ever-increasing rate. If the human race manages to redesign itself, it will probably spread out and colonise other planets and stars.

Intelligent life in space

Hawking acknowledges there are various explanations for why intelligent life hasn’t been found or hasn’t visited Earth. His predictions here aren’t so bold, but his preferred explanation is that humans have “overlooked” forms of intelligent life that are out there.

Does God exist?

No, Hawking says.

The question is, is the way the universe began chosen by God for reasons we can’t understand, or was it determined by a law of science? I believe the second. If you like, you can call the laws of science “God”, but it wouldn’t be a personal God that you would meet and put questions to.

The biggest threats to Earth

Threat number one is an asteroid collision, like the one that killed the dinosaurs. However, “we have no defense” against that, Hawking writes. More immediately: climate change. “A rise in ocean temperature would melt the ice caps and cause the release of large amounts of carbon dioxide,” Hawking writes. “Both effects could make our climate like that of Venus with a temperature of 250C.”

The best idea humanity could implement

Nuclear fusion power. That would give us clean energy with no pollution or global warming.
