New Rising Media


Cambridge University Centre To Study "Extinction-Level" Threat To Humans From Technology

Researchers at the University of Cambridge have formed a new centre to explore the threat that advances in technology may pose to the human species.  At the Centre for the Study of Existential Risk (CSER), they will investigate developments in biotechnology, nanotechnology and artificial intelligence to determine how these fields could become a threat to humanity.

The first line of defence against Skynet is now in place.

Few can deny the benefits we have received in this new age, but this centre will be dedicated to answering the question posed by many a science fiction story: what will humankind do when under threat from ultra-intelligent machines?  Co-founded by philosophy professor Huw Price, astrophysics professor Lord Martin Rees and Skype co-founder Jaan Tallinn, this 'Terminator study' (as the most crass of news outlets have referred to it) will analyse whether further innovations in technology will help humans survive or lead to their extinction.

“At some point, this century or next, we may well be facing one of the major shifts in human history – perhaps even cosmic history – when intelligence escapes the constraints of biology,” says Huw Price, the Bertrand Russell Professor of Philosophy and one of CSER’s three founders.  

“Nature didn’t anticipate us, and we in our turn shouldn’t take Artificial General Intelligence (AGI) for granted. We need to take seriously the possibility that there might be a ‘Pandora’s box’ moment with AGI that, if missed, could be disastrous. I don’t mean that we can predict this with certainty, no one is presently in a position to do that, but that’s the point! With so much at stake, we need to do a better job of understanding the risks of potentially catastrophic technologies.”

The launch of the centre is planned for next year, but with the likes of a Cheetah Robot that runs at 30mph, an intelligent robot kingdom at Foxconn and DARPA's humanoid robot, we may already be too late! (Sarcasm)

Source: CSER.org

Jason England