Neuroethics and the Third Offset Strategy
Image: The Pentagon, courtesy of Wikimedia.
The arguments for intensifying a focus on what might be called neurosecurity are many, including a steady pattern of substantial funding, in the tens of millions of dollars, for neuroscience projects by various national security agencies. Though DARPA has received the most attention among those who have followed these developments, both the Intelligence Advanced Research Projects Activity (IARPA) and the Office of Naval Research (ONR) have substantial programs in fields like electromagnetic neurostimulation and computational neuroscience. The dual-use argument that reverberated in the exchange about the Nature editorial also points up the momentum behind federal funding for neuroscience. Even those who worry about “militarized” science are put in the awkward position of threading a moral needle when, for example, new prosthetics for severely incapacitated persons are in the offing and new therapies for dementia and trauma are so desperately needed. Such is the case with the U.S. BRAIN Initiative, in which DARPA plays a key role.
Image: U.S. Army “CyberCenter of Excellence,” courtesy of Wikimedia.
The appreciation of the importance of a technological edge has been characterized among U.S. defense planners as an “offset strategy.” For that community, nuclear weapons comprised the first offset in the face of a Soviet enemy with significant numerical advantages in conventional weapons. For all its MAD quality (the doctrine was called “Mutual Assured Destruction”), the strategy enabled a balance of power during the Cold War and was subject to a more or less successful nonproliferation regime. The second offset included precision-guided munitions like laser-guided “smart bombs” and computerized command-and-control systems, which proved themselves in the Gulf War of 1990-91. These technologies were clearly cutting-edge in their day, but new possibilities have emerged that require new ways of thinking about defense research and development. As well, national security strategists face a multipolar world that also includes non-state actors capable of terror attacks that pose mainly a psychological rather than an existential threat. The emerging third offset strategy is built around several such possibilities:
Autonomous “deep learning” machines and systems for early warning based on crunching big data;
Human-machine collaboration to help human operators make decisions;
Assisted-human operations so that humans can operate more efficiently with the help of machines like exoskeletons;
Advanced human-machine teaming in which a human works with an unmanned system;
Semi-autonomous weapons “hardened” for cyber warfare (4).
Moreno, J.D. (2017). Neuroethics and the Third Offset Strategy. The Neuroethics Blog. Retrieved from http://www.theneuroethicsblog.com/2017/01/neuroethics-and-third-offset-strategy.html