Citing work done by the Future of Humanity Institute, Stephen Hawking warned that dismissing the dangers of advanced artificial intelligence could be the “worst mistake in history.” An intelligence explosion, he cautioned, could produce machines that “outsmart financial markets, out-manipulate human leaders, and develop weapons we cannot even understand.” Hawking goes on to say, “Although we are facing potentially the best or worst thing to happen to humanity in history, little serious research is devoted to these issues outside non-profit institutes such as the Cambridge Centre for the Study of Existential Risk, the Future of Humanity Institute, the Machine Intelligence Research Institute, and the Future of Life Institute.” For the full article, see here.
