Virginia Tech engineers hope to redefine search and rescue protocols by teaming up human searchers with UAS


With the help of a $1.5 million grant from the National Science Foundation, a group of Virginia Tech engineers will pair human searchers with UAS, in hopes of redefining search and rescue protocols.

Utilizing autonomous algorithms and machine learning, the UAS will complement search and rescue efforts from the air. Additionally, they will suggest tasks and send updated information to human searchers on the ground. 

The researchers hope to make searches more effective by using mathematical models that combine historical data on what lost people actually do with typical searcher behavior, balancing autonomy with human collaboration.

Having received support from the Virginia Department of Emergency Management, the researchers will also work closely with the local Black Diamond Search and Rescue Council throughout the project.

“Human searchers are excellent at what they do. Drones are unlikely to replace people on the ground because searchers are too good at their jobs,” explains the leader of these efforts, Ryan Williams, an assistant professor in the Bradley Department of Electrical and Computer Engineering within the College of Engineering. “However, what drones can do is address these niche problems of the search process by providing large-scale data that can help a search team make better decisions.”

Williams’ family has also been involved with the Black Diamond Search and Rescue Council since its founding more than three decades ago.

One scenario where UAS could prove beneficial is exploring treacherous terrain that’s difficult for human searchers to reach. The aircraft could also collect information about relatively unknown areas of the search environment, saving time.

To build the mathematical model that will help the UAS decide where to go and how to search, Nicole Abaid, an assistant professor in the Department of Biomedical Engineering and Mechanics, is using historical data gathered from more than 50,000 documented lost person scenarios. The data comes from the work of Robert Koester, a search and rescue expert who will consult on the project.

“From the historical search data, we know that certain types of people tend to do certain things when they’re lost,” Abaid says. “For example, people with cell phones tend to move up in elevation as they try to get service, while an elderly person might not travel very far.”

Abaid plans to build more than 30 lost person profiles into the model that incorporate various pieces of information, including age, mental status, and activity (e.g., hiking, horseback riding, or hunting). Abaid will also use topographical data from ArcGIS maps that can provide insight into how people usually move through terrain.
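To make the idea concrete, here is a minimal sketch of how lost-person profiles and terrain data might combine into a probability map for a drone to search. The profile names, parameters, and scoring formula are illustrative assumptions, not the team’s actual model:

```python
import numpy as np

# Hypothetical profiles; the categories and tendencies stand in for the
# historical statistics described above (phone users trend uphill, elderly
# subjects tend to stay close to the last known position).
PROFILES = {
    "hiker_with_phone": {"max_range_km": 5.0, "uphill_bias": 0.8},
    "elderly":          {"max_range_km": 1.0, "uphill_bias": 0.0},
}

def probability_grid(elevation, profile, cell_km=0.1):
    """Score each terrain cell by how likely the lost person is to be there.

    elevation: 2D array of terrain heights (e.g. sampled from GIS data).
    The last known position is assumed to be the grid center.
    """
    rows, cols = elevation.shape
    cy, cx = rows // 2, cols // 2
    yy, xx = np.mgrid[0:rows, 0:cols]
    dist_km = np.hypot(yy - cy, xx - cx) * cell_km

    # Distance term: likelihood falls off beyond the profile's typical range.
    p = np.exp(-dist_km / profile["max_range_km"])

    # Elevation term: some profiles (e.g. phone users) favor higher ground.
    elev_norm = (elevation - elevation.min()) / (np.ptp(elevation) + 1e-9)
    p *= 1.0 + profile["uphill_bias"] * elev_norm

    return p / p.sum()  # normalize into a probability distribution

terrain = np.random.default_rng(0).random((50, 50)) * 300  # synthetic hills, m
grid = probability_grid(terrain, PROFILES["hiker_with_phone"])
```

A UAS planner could then steer toward the highest-probability cells first and redistribute the remaining probability mass as areas are cleared.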

“The overall goal is to create an autonomous, scalable system that can make search and rescue processes here and elsewhere more effective simply by intelligently incorporating existing technology,” Abaid says.

For Williams, making the UAS as unobtrusive as possible to searchers on the ground is of great importance.

“Most of the data a drone will pick up from sensors, whether it be from cameras, thermal imaging, or lidar surveys, will be really uninteresting,” Williams says. “We don’t want to interrupt human searchers to show them a video feed of a piece of trash on the ground, for instance. That wastes precious time.”
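The filtering Williams describes could be sketched as a simple triage step between the drone’s perception pipeline and the ground team. The detection classes, confidence scores, and thresholds below are illustrative assumptions, not the project’s actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. output of an onboard image classifier
    confidence: float  # classifier confidence, 0..1

# Only classes plausibly related to a lost person should ever page a
# searcher; each gets its own minimum confidence before an alert fires.
RELEVANT = {"person": 0.5, "clothing": 0.7, "backpack": 0.7}

def should_notify(det: Detection) -> bool:
    """Return True only if the detection is worth a human interruption."""
    threshold = RELEVANT.get(det.label)
    return threshold is not None and det.confidence >= threshold

detections = [
    Detection("trash", 0.95),     # confidently uninteresting: stay silent
    Detection("person", 0.62),    # plausible find: alert the ground team
    Detection("clothing", 0.40),  # relevant class, but too uncertain
]
alerts = [d for d in detections if should_notify(d)]  # only the "person" hit
```

The design goal is the one Williams states: the default is silence, and a detection must earn the interruption.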

The team also includes Nathan Lau, an assistant professor in the Grado Department of Industrial and Systems Engineering, and James McClure, a computational scientist in Advanced Research Computing.

Specializing in human-computer interaction, Lau will seek to address the issue of when and how the UAS will interrupt searchers to provide new information, while McClure will design a wearable, backpack-based server to provide on-the-ground processing and communications support.