
Research Spending & Results

Award Detail

Awardee: GEORGE MASON UNIVERSITY
Doing Business As Name: George Mason University
PD/PI:
  • Zhisheng Yan
  • (716) 867-4737
  • zyan4@gmu.edu
Award Date: 07/28/2021
Estimated Total Award Amount: $150,000
Funds Obligated to Date: $150,000
  • FY 2021 = $150,000
Start Date: 10/01/2021
End Date: 09/30/2023
Transaction Type: Grant
Agency: NSF
Awarding Agency Code: 4900
Funding Agency Code: 4900
CFDA Number: 47.070
Primary Program Source: 040100 NSF RESEARCH & RELATED ACTIVITIES
Award Title or Description: EAGER: Collaborative Research: Augmented 360 Video for Situation Awareness in Firefighting
Federal Award ID Number: 2140620
DUNS ID: 077817450
Parent DUNS ID: 077817450
Program: HCC-Human-Centered Computing
Program Officer:
  • Andruid Kerne
  • (703) 292-8574
  • akerne@nsf.gov

Awardee Location

Street: 4400 UNIVERSITY DR
City: FAIRFAX
State: VA
ZIP: 22030-4422
County: Fairfax
Country: US
Awardee Cong. District: 11

Primary Place of Performance

Organization Name: George Mason University
Street:
City:
State: VA
ZIP: 22030-4422
County: Fairfax
Country: US
Cong. District: 11

Abstract at Time of Award

Fire incidents have caused substantial injuries, illness, and death among both firefighters and civilians. Poor communication between commanders in the control center and firefighters working at emergency sites is frequently cited as the determining factor in fatality and loss reports of firefighter operations. Traditionally, remote commanders visualize emergency sites and lead the operation using videos captured by firefighters' helmet cameras. Unfortunately, these video systems suffer from a fundamental problem: commanders are only able to see a single view of the emergency site at a time. This limitation restricts commanders' situation awareness and leads to productivity and safety issues such as miscommunication of locations and failure to identify dangerous events. This project combines 360-degree videos and augmented reality to enable remote commanders to achieve 360-degree situation awareness of the entire emergency site in all viewing directions and to enhance firefighting productivity and safety. The 360-degree video viewing benefits various emergency response communities in planning and training. Other research outcomes, including open datasets and software, will be widely disseminated through publications, presentations, and websites to contribute to the computer science and engineering communities. Educational activities, including undergraduate research as well as fire safety training for K-12 students, are also planned to enhance the impacts of this project.

This project will design and develop augmented 360 video technology to enable remote commanders to switch viewports within a 360-degree scene and visualize machine-detected events of interest. It will investigate an augmented 360 video viewing system that enables panoramic situation awareness for incident command in firefighter operations through these steps: (1) to address commanders' needs and requirements for situation awareness in firefighter videos, interviews will be performed to identify and categorize important objects and events in firefighter response as well as commanders' technology preferences; (2) to fill the knowledge gap of how to enable automatic machine detection of important events in firefighter videos, a view-of-interest detection model will be designed to detect target objects and events; and (3) to provide panoramic situation awareness to commanders, an augmented 360 video viewing system for remote incident command will be developed and evaluated through a field study. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
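To make the viewport-switching idea above concrete, the following is a minimal sketch, not the project's own implementation, of the basic operation behind letting a commander look in any direction: rendering an ordinary perspective view from a single equirectangular 360-degree frame for a chosen yaw and pitch. The function name, field of view, output resolution, and the use of NumPy and OpenCV are illustrative assumptions.

import numpy as np
import cv2  # OpenCV, used here only for the per-pixel remapping


def extract_viewport(equirect, yaw_deg, pitch_deg, fov_deg=90.0, out_w=960, out_h=540):
    """Render a perspective viewport from an equirectangular 360-degree frame.

    equirect  : H x W x 3 image in equirectangular projection
    yaw_deg   : look direction left/right, in degrees
    pitch_deg : look direction up/down, in degrees
    """
    eq_h, eq_w = equirect.shape[:2]
    fov = np.radians(fov_deg)
    f = (out_w / 2.0) / np.tan(fov / 2.0)  # pinhole focal length for the requested FOV

    # Ray direction for every output pixel in the virtual camera's frame.
    u, v = np.meshgrid(np.arange(out_w), np.arange(out_h))
    x = u - out_w / 2.0
    y = v - out_h / 2.0
    z = np.full_like(x, f)
    dirs = np.stack([x, y, z], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

    # Rotate the rays by pitch (about the x-axis), then yaw (about the y-axis).
    pitch, yaw = np.radians(pitch_deg), np.radians(yaw_deg)
    rx = np.array([[1, 0, 0],
                   [0, np.cos(pitch), -np.sin(pitch)],
                   [0, np.sin(pitch), np.cos(pitch)]])
    ry = np.array([[np.cos(yaw), 0, np.sin(yaw)],
                   [0, 1, 0],
                   [-np.sin(yaw), 0, np.cos(yaw)]])
    dirs = dirs @ (ry @ rx).T

    # Convert rotated rays to longitude/latitude, then to equirectangular pixel coordinates.
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])          # [-pi, pi]
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))     # [-pi/2, pi/2]
    map_x = ((lon / (2 * np.pi)) + 0.5) * (eq_w - 1)
    map_y = ((lat / np.pi) + 0.5) * (eq_h - 1)

    return cv2.remap(equirect, map_x.astype(np.float32), map_y.astype(np.float32),
                     interpolation=cv2.INTER_LINEAR)


# Hypothetical usage: render the view 120 degrees to the right, tilted 10 degrees up.
# frame = cv2.imread("frame_equirect.jpg")   # placeholder file name
# view = extract_viewport(frame, yaw_deg=120, pitch_deg=-10)

In a complete system along the lines described in the abstract, the yaw and pitch would come from the commander's viewport selection, and machine-detected events of interest could be overlaid on the extracted view before display.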
