
Cambridge boffins fear ‘Pandora’s Unboxing’ and RISE of the MACHINES

Monday, November 26, 2012 5:40

(Before It's News)

 

Privacy Matters…
This may be old news, but I am, pardon the phrase, sick and tired of the expression “if you have nothing to hide, you have nothing to worry about…” and all its variants.
 
To which I emphatically declare: Hell Yes You Do!
----------------------------------------

 

Cambridge boffins fear ‘Pandora’s Unboxing’ and RISE of the MACHINES

 

‘You’re more likely to die from robots than cancer’

By Brid-Aine Parnell

Posted in Science, 26th November 2012 11:44 GMT

Boffins at Cambridge University want to set up a new centre to determine what humankind will do when ultra-intelligent machines like the Terminator or HAL pose “extinction-level” risks to our species.

A philosopher, a scientist and a software engineer are proposing the creation of a Centre for the Study of Existential Risk (CSER) to analyse the ultimate risks to the future of mankind – including bio- and nanotech, extreme climate change, nuclear war and artificial intelligence.

Apart from the frequent portrayal of evil – or just misguidedly deadly – AI in science fiction, actual real scientists have also theorised that super-intelligent machines could be a danger to the human race.

Jaan Tallinn, the former software engineer who was one of the founders of Skype, has campaigned for serious discussion of the ethical and safety aspects of artificial general intelligence (AGI).

Tallinn has said that he sometimes feels he is more likely to die from an AI accident than from cancer or heart disease, CSER co-founder and philosopher Huw Price said.

Humankind’s development is now marked less by evolutionary processes and more by technological progress, which allows people to live longer, accomplish tasks more quickly and destroy more or less at will.

Both Price and Tallinn said they believe the rising curve of computing complexity will eventually lead to AGI, and that the critical turning point after that will come when the AGI is able to write the computer programs and create the tech to develop its own offspring.

[Image: 2001 HAL poster]

“Think how it might be to compete for resources with the dominant species,” says Price. “Take gorillas for example – the reason they are going extinct is not because humans are actively hostile towards them, but because we control the environments in ways that suit us, but are detrimental to their survival.”

CSER hopes to gather experts from policy, law, risk, computing and science to advise the centre and help with investigating the risks.

“At some point, this century or next, we may well be facing one of the major shifts in human history – perhaps even cosmic history – when intelligence escapes the constraints of biology,” Price said.

“Nature didn’t anticipate us, and we in our turn shouldn’t take artificial general intelligence (AGI) for granted.”

 

More at http://www.theregister.co.uk/2012/11/26/new_centre_human_extinction_risks/

If you liked this story, don’t forget to hit the RECOMMEND CONTRIBUTOR button at the top of the page so that I can bring you more stories like this. I don’t get paid, and it’s a way to say thanks. Thanks so much!!!

 
