Anirudh More

Interface for Human-Autonomous Agent Interaction


Researcher

Designer

Developer


The interface was developed using Python (Django) for web development and data management. JavaScript, HTML, and CSS were used to customize the graphical user interface (GUI).


The interface connects to ArcGIS to render high-fidelity maps.


Unreal Engine was used to generate images of objects.

Purpose of the Interface


This application was developed to investigate how leadership is assigned from the human operator to autonomous agents in dynamic environments.

Contextual inquiry

A systematic literature review indicated that current Human-Autonomy Teaming (HAT) concepts rely on rigid allocation strategies to maintain collaboration between agents.

Leadership dynamics, in which leadership can be flexibly assigned to any agent, human or autonomous, remain underexplored.

Research concepts employed


The platform evaluates leadership assignment within Human-Autonomy Teaming (HAT), which aims to improve collaboration between humans and autonomous agents.


The platform simulates four autonomous UAVs in search-and-detect tasks, employing two imaging types: thermal and near/shortwave infrared.

It also supports re-assignment of detections to other UAVs.
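The re-assignment step can be illustrated with a minimal sketch. The function and data names below (`reassign_detection`, `tasks`, `fleet`) are hypothetical, not taken from the platform's codebase:

```python
def reassign_detection(assignments, detection_id, new_uav, available_uavs):
    """Re-assign a detection task to another UAV, if that UAV exists.
    `assignments` maps detection_id -> uav_id; names are illustrative."""
    if new_uav not in available_uavs:
        raise ValueError(f"unknown UAV: {new_uav}")
    assignments[detection_id] = new_uav
    return assignments

# Example: hand detection "det-7" from UAV-1 over to UAV-4.
fleet = {"UAV-1", "UAV-2", "UAV-3", "UAV-4"}
tasks = {"det-7": "UAV-1"}
reassign_detection(tasks, "det-7", "UAV-4", fleet)
```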


For each detection task, the platform collects the user's ratings of trust in, and perceived competence of, the UAVs.
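One way such a per-detection response could be recorded is sketched below. The field names and the 1-7 Likert scale are assumptions for illustration, not the platform's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DetectionRating:
    """One operator response to a detection event.
    Field names are illustrative; ratings assumed on a 1-7 Likert scale."""
    uav_id: str
    object_type: str      # e.g. "human", "wildlife", "building"
    trust: int            # operator's trust in the UAV (1-7)
    competence: int       # perceived technological competence (1-7)
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

    def __post_init__(self):
        # Reject out-of-range ratings at construction time.
        if not (1 <= self.trust <= 7 and 1 <= self.competence <= 7):
            raise ValueError("ratings must be in 1..7")

rating = DetectionRating("UAV-2", "human", trust=6, competence=5)
```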



Together, these data support the generation of graphs (networks) for investigating the evolution of human-autonomous agent interaction.
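As a minimal sketch of this aggregation step, the collected ratings can be folded into weighted operator-to-UAV edges, where the weight is the mean trust across a UAV's detections. The record shape and function name are assumptions for illustration:

```python
from collections import defaultdict

def build_interaction_graph(ratings):
    """Aggregate (uav_id, trust) records into a weighted edge list:
    each ("operator", uav_id) edge carries the mean trust rating.
    Purely illustrative of the graph-generation idea."""
    totals = defaultdict(lambda: [0, 0])   # uav_id -> [sum, count]
    for uav_id, trust in ratings:
        totals[uav_id][0] += trust
        totals[uav_id][1] += 1
    return {("operator", uav): s / n for uav, (s, n) in totals.items()}

edges = build_interaction_graph([("UAV-1", 6), ("UAV-1", 4), ("UAV-3", 7)])
# edges: {("operator", "UAV-1"): 5.0, ("operator", "UAV-3"): 7.0}
```

In a full analysis these weighted edges could be loaded into a graph library to track how trust toward each agent evolves over successive detection tasks.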

HAT


HAT seeks to enable seamless interaction between the human operator and the autonomous agent by treating the autonomous agent as an equal teammate.

UAV Imaging types


The platform utilizes UAVs with two imaging capabilities: thermal and near/shortwave infrared.

Platform capabilities


The UAVs can be assigned to any of the quadrants of the 2D view of the map.


These UAVs traverse the map in an autonomous lawn-mower pattern, searching for and detecting various types of objects, including human beings, wildlife, and buildings.
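The lawn-mower (boustrophedon) traversal can be sketched as a simple waypoint generator. The grid dimensions and step size below are hypothetical, chosen only to illustrate the alternating sweep:

```python
def lawnmower_waypoints(x0, y0, width, height, step):
    """Generate lawn-mower waypoints covering a rectangular quadrant,
    alternating the sweep direction on each successive row."""
    waypoints = []
    y = y0
    left_to_right = True
    while y <= y0 + height:
        row = list(range(x0, x0 + width + 1, step))
        if not left_to_right:
            row.reverse()
        waypoints.extend((x, y) for x in row)
        left_to_right = not left_to_right
        y += step
    return waypoints

# Example: cover a 100 x 100 unit quadrant with 50-unit row spacing.
path = lawnmower_waypoints(0, 0, 100, 100, 50)
# path starts at (0, 0), sweeps right, then returns left on the next row.
```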



The custom-built UI allows assessment of the human operator's perception of the UAVs' technological competence, and of their trust in the UAVs, under various dynamic conditions: the UAV's imaging type, the type of object captured, and environmental conditions such as illumination or terrain type.


Platform in action