Governments are failing to understand the human-driven catastrophic risks that threaten global security, prosperity and potential, and could in the worst case lead to mass harm and societal collapse, say researchers at the University of Cambridge.
The plausible global catastrophic risks include: tipping points in environmental systems due to climate change or mass biodiversity loss; malicious or accidentally harmful use of artificial intelligence; malicious use of, or unintended consequences from, advanced biotechnologies; a natural or engineered global pandemic; and intentional, miscalculated, accidental, or terrorist-related use of nuclear weapons.
Researchers from Cambridge’s Centre for the Study of Existential Risk (CSER) today release a new report on what governments can do to understand and inform policy around these risks, which could threaten the global population.
The likelihood that a global catastrophe will occur in the next 20 years is uncertain, say the researchers, but the potential severity means that national governments have a responsibility to their citizens to manage these types of risks.
Des Browne, former UK Secretary of State for Defence, said: “National governments struggle with understanding and developing policy for the elimination or mitigation of extreme risks, including global catastrophic risks. Effective policies may compel fundamental structural reform of political systems, but we do not need, nor do we have the time, to wait for such change.
“Our leaders can, and must, act now to better understand the global catastrophic risks that are present and developing. This report offers a practical framework for the necessary action.”
Governments must sufficiently understand the risks to design mitigation, preparation and response measures. But political systems often do not provide sufficient incentives for policy-makers to think about emerging or long-term issues, especially where vested interests and tough trade-offs are at play.
Additionally, the bureaucracies that support government can be ill-equipped to understand these risks. Depending on the issue or the country, public administrations tend to suffer from one or more of the following problems: poor agility in responding to new or emerging issues, poor risk management culture and practice, lack of technical expertise, and failure of imagination.
The report provides 59 practical options for how governments can better understand the risks, ranging from improving risk management practices to developing better futures analysis and increasing science and research capability. Most national governments will need to make major policy efforts to match the scale and complexity of the problem, say the researchers.
Catherine Rhodes, CSER’s Executive Director, said: “This report gives policy-makers a set of clear, achievable and effective options. Few countries are making efforts to understand these risks, so most governments will be able to draw policy ideas from the report.
“In the UK, the government is ahead of its peers when it comes to conducting national risk assessments, delivering foresight and horizon-scanning, and engaging with the academic community. But even it needs new approaches to understand and deal with global catastrophic risks.”
Professor Lord Martin Rees, Astronomer Royal and co-founder of CSER, said: “Global problems require global solutions. But countries must also act individually. Without action, these catastrophic risks will only grow over time, whether it be on climate change, ecothreats, synthetic biology or cyber.
“Governments have a responsibility to act, both to minimise the risk of such events and to make plans to cope with a catastrophe if one occurs. And those that take the initiative will set a positive example for the rest of the world. Protect your citizens and be a world leader – that decision is available to every country.”