A senior defense official clarified that the Defense Advanced Research Projects Agency is prioritizing ethics and human guidance in a program that seeks artificial intelligence-based drones designed to distinguish enemies from civilians and allied troops in urban battles, Defense One reported Friday.
"We try to use the autonomy where appropriate, where suspicion is low and when suspicion increases, revert to a more human-in-the-loop mode," said Lt. Col. Philip Root, program manager for DARPA's Urban Reconnaissance through Supervised Autonomy program.
The reconnaissance program aims to build unmanned aerial systems that collect information about people in complex warfighting environments and help troops identify who is a threat. Root noted that the drones will only provide information, and the judgment on a person's risk will still be handled by a human operator. He added that the program will have legal, moral and ethical implications.
"We really want to try to ensure we allow non-hostiles, non-combatants, to move out of the way. Future urban conflict is going to take place in large cities where the population can't just go to the mountains," Root said.
Drones will assess unidentified individuals in the field by delivering a warning message and observing how each person responds. The system will then submit that information, along with video and location data, to an official who will help decide what to do about the situation. DARPA aims to begin testing the drones in 2021.
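The workflow described above can be pictured as a simple escalation rule: remain autonomous while suspicion is low, and hand the decision to a human operator as suspicion rises. The sketch below is a minimal illustration under that reading; the class names, threshold value, and data fields are hypothetical and are not drawn from any published DARPA specification.

```python
from dataclasses import dataclass

# Hypothetical cutoff; the program has not published an actual threshold.
SUSPICION_THRESHOLD = 0.5


@dataclass
class Observation:
    """Hypothetical record a drone might forward to a human operator."""
    subject_id: str
    suspicion: float            # assumed scale: 0.0 (benign) to 1.0 (hostile)
    responded_to_warning: bool  # did the person comply after the warning message?
    video_clip: str             # reference to recorded footage
    location: tuple             # (latitude, longitude)


def assess(obs: Observation) -> str:
    """Stay autonomous at low suspicion; escalate to a human as suspicion rises.

    Mirrors the reported design intent: the drone only gathers and forwards
    information, and a human makes the final judgment on a person's risk.
    """
    if obs.suspicion < SUSPICION_THRESHOLD and obs.responded_to_warning:
        return "continue_autonomous_monitoring"
    # Higher suspicion: forward evidence for a human decision, never act on it.
    return f"forward_to_operator(video={obs.video_clip}, location={obs.location})"


if __name__ == "__main__":
    obs = Observation(
        subject_id="subject-01",
        suspicion=0.7,
        responded_to_warning=False,
        video_clip="clip_0042.mp4",
        location=(33.64, 44.38),
    )
    print(assess(obs))
```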