Source: Business Insider
AI researcher Eliezer Yudkowsky warns that superintelligent systems could one day pursue their own goals at humanity's expense. Yuichiro Chino/Getty Images

Eliezer Yudkowsky says superintelligent AI could wipe out humanity by design or by accident. The researcher dismissed Geoffrey Hinton's "AI as mom" idea: "We don't have the technology."