
UK University To Study Technology’s Risk To Humanity

Monday, November 26, 2012 6:50


redOrbit Staff & Wire Reports – Your Universe Online

The idea of machines rising up to enslave humanity and take over the world has long been a popular theme in science fiction, but a new academic centre in the works at Cambridge University will be dedicated to studying whether such technology-related threats could actually materialize.

According to Huffington Post reporter Sylvia Hui, the project, dubbed the Centre for the Study of Existential Risk (CSER), has been co-founded by Cambridge philosophy professor Huw Price, Cambridge professor of cosmology and astrophysics Martin Rees, and Skype co-founder Jaan Tallinn.

The Centre, which is scheduled to open sometime next year, will examine the “unchecked and unabated” advance of technology in recent decades, as computers and machines have spread globally and become essential to a vast array of facets of life, including economics, healthcare, and communication, the university said in a statement on Sunday.

“While few would deny the benefits humanity has received as a result of its engineering genius — from longer life to global networks — some are starting to question whether the acceleration of human technologies will result in the survival of man… or if in fact this is the very thing that will end us,” the statement continued, adding that CSER would be established to “address these cases — from developments in bio and nanotechnology to extreme climate change and even artificial intelligence — in which technology might pose ‘extinction-level’ risks to our species.”

“At some point, this century or next, we may well be facing one of the major shifts in human history — perhaps even cosmic history — when intelligence escapes the constraints of biology,” Price explained. “Nature didn’t anticipate us, and we in our turn shouldn’t take AGI [artificial general intelligence] for granted. We need to take seriously the possibility that there might be a ‘Pandora’s box’ moment with AGI that, if missed, could be disastrous.”

The philosophy professor said his interest in AGI began after an encounter with Tallinn, who in recent years has become an advocate for education about the potential ethical and safety implications of technology. Price said Tallinn had become convinced that he was more likely to die as a result of an artificial intelligence accident than from a disease such as cancer or heart disease, and that his arguments both intrigued and impressed Price.

“In the case of artificial intelligence, it seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology,” he told Hui, adding that we would no longer be “the smartest things around” and could possibly find ourselves at the mercy of “machines that are not malicious, but machines whose interests don’t include us.”

“It tends to be regarded as a flakey concern, but given that we don’t know how serious the risks are, that we don’t know the time scale, dismissing the concerns is dangerous. What we’re trying to do is to push it forward in the respectable scientific community,” Price added.



