Existential risk from artificial general intelligence  

Existential risk from artificial general intelligence is the hypothesis that substantial progress in artificial general intelligence (AGI) could someday result in human extinction or some other unrecoverable global catastrophe.

Unless indicated otherwise, the text in this article is either based on Wikipedia article "Existential risk from artificial general intelligence" or another language Wikipedia page thereof used under the terms of the GNU Free Documentation License; or on research by Jahsonic and friends. See Art and Popular Culture's copyright notice.
