The Future of Humanity Institute is a multidisciplinary research institute at the University of Oxford. It enables a select set of leading intellects to bring the tools of mathematics, philosophy, and science to bear on big-picture questions about humanity and its prospects. The Institute belongs to the Faculty of Philosophy and is affiliated with the Oxford Martin School.
Research programmes: the Oxford Martin Programme on the Impacts of Future Technology, the Amlin Research Collaboration on Systemic Risk of Modelling, and the Global Priorities Project.
At a lecture at the Cambridge Centre for the Study of Existential Risk, Dr. Toby Ord discussed the relative likelihood of natural versus anthropogenic existential risks. His analysis indicates a much higher probability of anthropogenic existential risk.
On June 2nd, Professor Marc Lipsitch will give a public lecture at FHI on the ethics of creating potential pandemic pathogens. Professor Lipsitch is director of the Center for Communicable Disease Dynamics and Professor of Epidemiology at Harvard.
In a recent open letter, Toby Ord describes FHI’s position on experiments that create potential pandemic pathogens, noting that “the experiments involve risks of killing hundreds of thousands (or even millions) of individuals in the process.”
At the latest TED conference in Vancouver, Professor Nick Bostrom discussed concerns about machine superintelligence and FHI’s research on AI safety.