Defense Technology Monitor No. 61

Related Categories: Military Innovation; Science and Technology; Space; China; Russia

Social media and mass advertising have become proven methods by which companies can alter an individual's preferences, and do so without his or her consent. Now, advances in neurotechnology are trending in this direction as well. While deep brain stimulation and brain-computer interfaces are providing medical breakthroughs, scientists worry that similar products with more commercial applications may open the door to manipulating people's minds and invading their privacy. This has led prominent neuroscientists such as Rafael Yuste of Columbia University to push for the United Nations to add "neuro-rights" to the Universal Declaration of Human Rights. Yuste's team of scientists and ethicists, known as the Morningside Group, is proposing a set of neuro-rights that would protect people's rights to mental privacy, identity, and free will, as well as protection from algorithmic bias and equal access to advancements in brain augmentation. According to Yuste, "If you can record and change neurons, you can in principle read and write the minds of people... This is not science fiction. We are doing this in lab animals successfully." The Morningside Group's efforts build on numerous similar initiatives around the world, including neuro-rights legislation in Chile and Spain. (Reuters, December 3, 2020)

In early December, then-Director of National Intelligence John Ratcliffe penned an op-ed in the Wall Street Journal in which he claimed that China has conducted tests on People's Liberation Army soldiers with the goal of developing biologically advanced warfighters. A similar claim was made by the Jamestown Foundation think tank in a 2019 report entitled "China's Military Biotech Frontier," which stated that the Chinese military is researching potential military applications for CRISPR gene-editing technology. CRISPR technology was notably used in 2018 by Chinese scientist He Jiankui to create gene-edited twins (see Defense Technology Monitor No. 39).

The advancement poses hard questions for Western nations. Gene editing and the creation of "super soldiers" raise considerable ethical concerns, but could give nations willing to stretch those boundaries (like China) significant military benefits. In response, Western nations are increasingly being forced to consider other human enhancement technologies in order to avoid falling behind. For instance, France is now contemplating the use of exoskeletons, performance-enhancing drugs, and implants as alternatives to gene editing in order to enhance its troops. (Wall Street Journal, December 3, 2020; International Business Times, December 9, 2020; Taipei Times, December 11, 2020)

The trend toward artificial intelligence and autonomous systems in military applications is moving forward unabated. However, military officials and policymakers continue to insist on keeping a human "in the loop" in lethal robotic systems because doing so provides for clearer assessments and increases accountability. Unfortunately, the Defense Advanced Research Projects Agency's (DARPA) recent work on AI-drone research for the System-of-Systems Enhanced Small Unit (SESU) program has demonstrated that keeping humans in the kill chain carries serious drawbacks. According to DARPA simulation results, AI-driven systems without human decisionmakers would win if pitted against semi-autonomous variants.

The finding puts the U.S. military on the horns of a serious dilemma. Removing humans from the loop has several advantages, improving speed of decision-making among other potential benefits. However, critics warn that the findings may lead the military to develop increasingly autonomous AI systems, raising ethical, legal, and safety concerns in the process. In order to address those ethical challenges, Alka Patel, the AI ethics policy lead at the Department of Defense's Joint Artificial Intelligence Center, told National Defense magazine that "We need to make sure that... we're creating or establishing a responsible A.I. culture." Furthermore, DARPA is focused on developing "explainable AI," which provides more insight into how AI systems reach conclusions and can instill greater confidence in decisionmakers. (Smithsonian Magazine, December 7, 2020)

In an apparent first, the Pentagon has conducted a test flight with a U-2 Dragon Lady that permitted an AI system to direct and actively manipulate its sensor and navigation systems — without any backup system allowing for human intervention. Dr. Will Roper, the Assistant Secretary of the Air Force for Acquisition, Technology and Logistics, has stated that the AI's ability to detect incoming missiles was tested as part of the flight. The test demonstrated the concept of "man and machine teaming," in which humans oversee mission-critical tasks while robots or AI handle certain technical assignments. According to former Google chief executive Eric Schmidt, who previously headed the Pentagon's Defense Innovation Board, the flight test was "the first time, to my knowledge, that you have a military system integrating AI, probably in any military." The AI used is known as ARTUµ, and is modeled after an open-source algorithm known as µZero. µZero was created by the AI company DeepMind, which famously developed algorithms that devised innovative strategies to defeat human players at Go and chess. (Washington Post, December 16, 2020)

The recent clash between Armenia and Azerbaijan in the South Caucasus highlighted the extensive and decisive role that drones can play in contemporary conflict. Russia, for its part, has taken notice and identified inadequacies in its existing defenses. In response, Moscow has hastened its development of a helicopter drone that will be tasked with destroying enemy UAVs. A Russian government source told the RIA Novosti news agency that the new drone will "track down small and low-speed enemy drones at low and extremely low altitudes." According to the Russian Air Force's Central Scientific Institute research center, there is also a project underway for an offensive helicopter drone with a range of close to 20 miles. (Defense News, December 17, 2020)