London, November 19 (Science Alert): Physicist Stephen Hawking has warned that humanity probably has only about 1,000 years left on Earth, and that the only thing that could save it from certain extinction is setting up colonies elsewhere in the solar system.
“[W]e must … continue to go into space for the future of humanity,” Hawking said in a lecture this week.
The celebrated theoretical physicist and cosmologist painted a grave picture of the future while delivering a lecture on the universe and the origins of human beings at the Oxford Union debating society on Monday.
Professor Hawking, 74, reflected on the understanding of the universe garnered from breakthroughs over the past five decades, describing 2016 as a “glorious time to be alive and doing research into theoretical physics”.
“Our picture of the universe has changed a great deal in the last 50 years and I am happy if I have made a small contribution,” he went on.
“The fact that we humans, who are ourselves mere collections of fundamental particles of nature, have been able to come this close to understanding the laws that govern us and the universe is certainly a triumph.”
Highlighting “ambitious” experiments that will give an even more precise picture of the universe, he continued: “We will map the position of millions of galaxies with the help of [super] computers like Cosmos. We will better understand our place in the universe.
“Perhaps one day we will be able to use gravitational waves to look right back into the heart of the Big Bang.
“But we must also continue to go into space for the future of humanity,” he stressed.
“I don’t think we will survive another 1,000 years without escaping beyond our fragile planet.”
Prof Hawking’s predictions for humanity have been bleak in recent months. In January, he cautioned developments in science and technology are producing “new ways things can go wrong”.
He also estimated self-sustaining human colonies on Mars would not be constructed for another 100 years, meaning the human race must be “very careful” in the time before then.
The fate of humanity appears to have been weighing heavily on Hawking of late. He has also recently cautioned that artificial intelligence (AI) will be “either the best, or the worst, thing ever to happen to humanity”.
Given that humans are prone to making the same mistakes over and over again – even though we’re obsessed with our own history and should know better – Hawking suspects that “powerful autonomous weapons” could have serious consequences for humanity.
Without even taking into account the potentially devastating effects of climate change, global pandemics brought on by antibiotic resistance, and the nuclear capabilities of warring nations, we could soon be sparring with the kinds of enemies we are not even close to knowing how to deal with.
Last year, Hawking added his name to a coalition of more than 20,000 researchers and experts, including Elon Musk, Steve Wozniak, and Noam Chomsky, calling for a ban on the development of autonomous weapons that can fire on targets without human intervention.
As the founders of OpenAI, Musk’s new research initiative dedicated to the ethics of artificial intelligence, said last year, our robots are perfectly submissive now, but what happens when we remove one too many restrictions?