Technology may provide the solutions to some of the world’s most pressing concerns – but if we’re not careful, it could also bring about the downfall of humanity, a new paper warns.
The paper titled The Vulnerable World Hypothesis investigates the possibility of the emergence of technology that is both destructive on a large scale, and easy to access.
While humankind has already created technology that has the potential for mass devastation, such as nuclear weaponry, such developments have been limited by a number of factors, including cost and rarity of the materials required to make them.
But, according to Oxford philosopher Nick Bostrom, that’s only because we’ve been ‘lucky’ thus far.
In the paper, Bostrom explains that we have yet to invent a technology so dangerous that ‘civilization almost certainly gets devastated by default’ – in other words, we have yet to enter what Bostrom refers to as a ‘vulnerable world.’
So far, the Future of Humanity Institute director says we’ve managed to achieve technology that has been, for the most part, beneficial to society.
These he refers to as ‘white’ balls pulled from a hypothetical urn. Technologies such as nuclear weaponry could be considered ‘grey.’
But at this stage, that’s as far as we’ve gotten.
‘What we haven’t extracted, so far, is a black ball – a technology that invariably or by default destroys the civilization that invents it,’ Bostrom argues.
‘The reason is not that we have been particularly careful or wise in our technology policy. We have just been lucky.’
According to Bostrom, ‘The most obvious kind [of black ball] is a technology that would make it very easy to unleash an enormously powerful destructive force.’
This could, for example, include advances in biohacking that would allow a person with basic training to cause mass illness.
THE TOP THREATS TO HUMANKIND
A survey conducted by Times Higher Education and the Lindau Nobel Laureate Meetings last year posed the question to 50 Nobel Laureates: ‘What is the biggest threat to humankind, in your view? And is there anything science can do to mitigate it?’
Some respondents listed more than one threat, and responses were recorded, with the threats ranked as follows:
- Population rise/environmental degradation (34%, 18 respondents)
- Nuclear war (23%, 12 respondents)
- Infectious disease/drug resistance (8%, 4 respondents)
- Selfishness/dishonesty/loss of humanity (8%, 4 respondents)
- Donald Trump/ignorant leaders (6%, 3 respondents)
- Artificial Intelligence (4%, 2 respondents)
- Inequality (4%, 2 respondents)
- Drugs (2%, 1 respondent)
- Facebook (2%, 1 respondent)
- Fundamentalism/terrorism (6%, 3 respondents)
- Ignorance/distortion of truth (6%, 3 respondents)
Bostrom revisits historical examples such as the creation of the atomic bomb to investigate how things could go awry.
Had the ingredients for the atomic bomb relied on widely available materials such as ‘a piece of glass, a metal object, and a battery arranged in a particular configuration’ instead of plutonium or uranium, Bostrom notes, preventing a global disaster would have been a near-impossible task.
‘In the present age, when one can publish instantaneously and anonymously on the Internet, it would be even more difficult to limit the spread of scientific secrets,’ the researcher notes.
To prevent the destabilization of civilization, or to re-stabilize it once destabilized, the world would have to take drastic steps.
It would be impossible to ban access to ubiquitous materials such as glass or metal, and even attempting to do so could spark violent backlash.
This type of technology could also push the world into a mass surveillance state, or spark global arms races.
Bostrom also considers the world may establish ‘extremely effective preventive policing,’ or attempt to ‘establish effective global governance.’
Each scenario presents its own set of challenges and potential consequences.
While we may not be in a ‘vulnerable’ state quite yet, the research highlights just how difficult the situation could be to manage if we do stumble upon a ‘black ball’ technology.
And it raises questions about what should be done to prevent us from getting there in the first place.
‘Even if we became seriously concerned that the urn of invention may contain a black ball, this need not move us to favor establishing stronger surveillance or global governance now, if we thought that it would be possible to take those steps later, if and when the hypothesized vulnerability came clearly into view,’ Bostrom says.
‘We could then let the world continue its sweet slumber, in the confident expectation that as soon as the alarm goes off it will leap out of bed and undertake the required actions.
‘But we should question how realistic this plan is.’