Quote of the Day | 0427

Intelligence has to be defined relative to goals and the knowledge needed to attain them. In any case the argument against the doomsday fear-mongering of existing AI extends to more powerful systems: any system that monomaniacally pursued one goal (such as making paperclips) while being oblivious to every other goal (such as not turning human beings into paperclips) is not artificially intelligent: it’s artificially stupid, and unlike anything a technologically sophisticated society would ever invent and empower. And scenarios in which the systems take over themselves commit the fallacy that intelligence implies a will to power, which comes from confusing two traits that just happened to come bundled in Homo sapiens because we are products of Darwinian natural selection.

Steven Pinker, ‘Counter-Enlightenment Convictions are “Surprisingly Resilient”’


#14036 · Friday 27 April 2018 @ 23:58:50


© eamelje.net 2001-2018. All rights reserved