- Artificial intelligence can reduce human bias, but human supervision over autonomous systems remains essential to safety and accountability
- The cyber domain favors asymmetric warfare, giving nontraditional actors a means of wielding significant power
- Policymakers need to be aware of the limits of new technologies, particularly exaggerated claims about the precision of munitions
The United States is anticipating major changes in the future of warfare. In its unclassified 2017 budget, the Pentagon spent about $7.4 billion on artificial intelligence and supporting fields, and the Trump administration requested $15 billion in cybersecurity spending for 2019. Yet from a humanitarian perspective, it is still low-tech, age-old tactics—bombing hospitals in Syria, scorched-earth campaigns in South Sudan—that cause the most harm to civilian populations.
How will emerging technologies shape the conduct and consequences of war? And how will they impact civilian security? Today on Displaced, Loren DeJonge Schulman and Erin Simpson—both hosts of Bombshell, a podcast covering national security and defense issues—explore how technological innovation is transforming armed conflict. Delving into autonomous weapons, AI, and cyber, Schulman and Simpson draw out their implications for the proliferation of violence and humanitarian response.
The public debate surrounding autonomous weapons often evokes fears of Terminator-style “killer robots.” Schulman and Simpson temper these concerns: militaries tend to be “control freaks,” Schulman says, and humans retain authority over weapons release. It is possible that the AI and machine-learning systems guiding targeting processes could ultimately protect civilian lives by reducing human bias and identifying non-combatants more accurately.
Still, both guests agree that human decision-making remains essential, both to ensure accountability for collateral damage and to guard against countermeasures designed to fool machines. The challenge, Simpson suggests, lies in determining “what are those applications that machines are particularly good at, where they can reduce human bias, and then what are those sorts of decisions...where human judgment actually is really deeply necessary.”
Cyber warfare isn’t often discussed in the context of humanitarian action, but cyber attacks on critical civilian infrastructure—such as the 2017 “WannaCry” ransomware attack that crippled parts of Britain’s National Health Service—could have devastating impacts on local populations. Simpson points out that most cyber attacks aim to “stay below the threshold of large scale military response” by targeting private companies and social media rather than government systems. Still, it is not yet clear whether cyber war will remain confined to cyberspace, or whether cyber aggression will trigger conventional military responses that could threaten civilian lives. “We are still figuring out as a national security system how is it that cyber fits into our conception of what goes next on the ladder” of military and diplomatic escalation, Schulman observes.
“Policymakers who are not well versed on [new technology] may over-assume how good and how impactful it is and perhaps possibly how many lives it could possibly save … skepticism of how good they actually are needs to be consistent and brought up frequently with human level decision makers as much as we can.”
How might the technologies underlying these new warfighting tools be used to reduce the likelihood or severity of violent conflict and humanitarian crises? Schulman and Simpson explore positive applications of new technologies, including sensor systems for disaster warning and the ability of ordinary people to broadcast the impacts of war on their own lives.
If you've been enjoying the Displaced Podcast, please consider rating and reviewing it on iTunes, and of course, share it with your friends and colleagues.
Related Resources:
Are Killer Robots the Future of War? Parsing the Facts on Autonomous Weapons - Kelsey Atherton, New York Times
Autonomous Weapons: An Open Letter from AI & Robotics Researchers - Future of Life Institute
The Algorithms of August - Michael C. Horowitz, Foreign Policy
Weaponized AI is Coming. Are Algorithmic Forever Wars Our Future? - Ben Tarnoff, The Guardian
The Potential Human Cost of Cyber Operations: Starting the Conversation - Laurent Gisel and Lukasz Olejnik, ICRC
Behind the Magical Thinking: Lessons from Policymaker Relationships with Drones - Loren DeJonge Schulman, Center for a New American Security
To Stop Endless War, Raise Taxes - Sarah Kreps, Vox
Opinions and views expressed by guests are their own and do not reflect those of the International Rescue Committee.