Technology Stocks : Artificial Intelligence, Robotics, Chat bots - ChatGPT


To: Triffin who wrote (3436), 4/18/2023 12:01:13 PM
From: Savant

T, I see the most danger as coming FROM humans, then from natural disasters and weather patterns changing over time.

AI w/o humans misusing it is unlikely to cause our extinction...however, it is the normal course of action for humans...to misuse any and all things...for many reasons.

The author is calling out 'the sky is falling, the sky is falling'...

Just my two bits.
S.



To: Triffin who wrote (3436), 4/18/2023 12:10:46 PM
From: Ron

Some interesting comments in that article:

Absent that caring, we get “the AI does not love you, nor does it hate you, and you are made of atoms it can use for something else.”


The likely result of humanity facing down an opposed superhuman intelligence is a total loss. Valid metaphors include "a 10-year-old trying to play chess against Stockfish 15", "the 11th century trying to fight the 21st century," and "Australopithecus trying to fight Homo sapiens".



---------
Perhaps we could program AI to take on a core mission of preventing the extinction of the human race.
The alternative could well be like chimps figuring out how to trigger an atomic bomb without knowing what it might do....