This study identifies the dimensions of autonomous decision-making (DADs): the categories of potential risk that one should consider before transferring decision-making capability to an intelligent autonomous system (IAS). Its objective is to provide some of the tools needed to implement existing policies governing the legal, ethical, and militarily effective use of IAS. These tools help to identify, and then either mitigate or accept, the risks of IAS use that could lead to a negative outcome.
The 13 DADs identified by this study were developed from a comprehensive list of 565 “risk elements” drawn from hundreds of documents authored by a global cadre of individuals both in favor of and opposed to the use of autonomy technology in weapons systems. These risk elements deliberately go beyond current Department of Defense (DOD) policies and procedures, because we expect those policies to change and IAS technologies to evolve. We captured each risk element in the form of a question; each can then be easily recast as a “shall statement” for use by the acquisition community in developing functional requirements that ensure the legal and ethical use of autonomous systems. This approach can elevate artificial intelligence (AI) ethics from a set of subjectively defined, and thus unactionable, policies and principles to a set of measurable and testable contractual obligations.
The risk elements can also serve military commanders as a measurable and testable pre-operational risk assessment “checklist” to ensure that autonomous systems are not used in an unethical manner. In this way, DOD can make fully informed risk assessment decisions before developing or deploying autonomous systems. Because our study results were designed specifically for use within the defense acquisition system and the military planning process, they provide a first step toward transforming the policies and ethics principles regarding autonomous systems into practical systems engineering requirements.
DISTRIBUTION STATEMENT A. Approved for public release: distribution unlimited.
Details
- Pages: 114
- Document Number: DRM-2021-U-030642-1Rev
- Publication Date: 12/30/2021