At the recent Federal Aviation Administration (FAA) Drone Symposium (co-hosted by AUVSI), FAA Deputy Regional Administrator Deb Sanning discussed the impact of autonomy and AI, human/machine integration, and strategies for gaining public trust in autonomous systems like drones. Sanning was joined on the panel by Brendan Groves of Skydio; Taylor Lochrane, Deputy Director for Science and Technology at DOT; Lauren Haertlein of Zipline; and Margaret Nagle of Wing. What did the panel have to say about this issue? In the aviation sector, "[a]utomation is making a meaningful impact in worker safety." For example, over 30 state DOTs use drones for bridge inspections, which cuts time and costs while reducing the likelihood of dangerous (and even deadly) outcomes. While most would agree that using an autonomous drone for these inspections makes sense, the question of safe and responsible use of AI and robotics still lingers. The panel suggested that responsible autonomous drone use rests on 1) the obligation of industry to mitigate potential misuse of the technology; and 2) the recognition that governments should be the final arbiter of appropriate conduct.

For drone manufacturers, drone operators, and drone software developers using AI and machine learning, the core concepts behind these points are to educate, listen, and respond. When a drone company communicates with the people of the cities and towns in which it operates, it can cultivate acceptance, build connections, and alleviate potential privacy concerns.

To promote widespread use of autonomous drones and vehicles, drone companies must engage stakeholders at all levels: not only the FAA and civil aviation authorities, but also mayors and community boards. Automation and societal acceptance of drones are connected: automation allows for scale, and scale delivers widespread value to a community.