
Research Spending & Results

Award Detail

Awardee: OKLAHOMA STATE UNIVERSITY
Doing Business As Name: Oklahoma State University
PD/PI:
  • Weihua Sheng
  • (405) 744-7590
  • weihua.sheng@okstate.edu
Award Date: 09/06/2019
Estimated Total Award Amount: $488,692
Funds Obligated to Date: $488,692
  • FY 2019 = $488,692
Start Date: 10/01/2019
End Date: 09/30/2022
Transaction Type: Grant
Agency: NSF
Awarding Agency Code: 4900
Funding Agency Code: 4900
CFDA Number: 47.070
Primary Program Source: 040100 NSF RESEARCH & RELATED ACTIVIT
Award Title or Description: RI: Small: Enabling Sound-based Human Activity Monitoring for Home Service Robots
Federal Award ID Number: 1910993
DUNS ID: 049987720
Parent DUNS ID: 049987720
Program: Robust Intelligence
Program Officer:
  • James Donlon
  • (703) 292-8074
  • jdonlon@nsf.gov

Awardee Location

Street: 101 WHITEHURST HALL
City: Stillwater
State: OK
ZIP: 74078-1011
County: Stillwater
Country: US
Awardee Cong. District: 03

Primary Place of Performance

Organization Name: Oklahoma State University
Street: 101 WHITEHURST HALL
City: Stillwater
State: OK
ZIP: 74078-1011
County: Stillwater
Country: US
Cong. District: 03

Abstract at Time of Award

The increasing demand for in-home elderly care poses challenges that call for innovative solutions. As more older adults choose to remain in their own homes as they age, living alone can pose serious risks to those with age-related conditions such as reduced mobility, dementia, or other chronic diseases. This at-risk population needs regular visits from in-home healthcare services, which in turn strains the geriatric home healthcare industry. Home service robots offer a solution to this societal problem by facilitating smart aging-in-place. This project aims to solve a fundamental research problem critical to deploying service robots in complex home environments: human activity monitoring. By bridging environmental understanding and human behavior understanding, the project offers a new theory for sound-based monitoring of resident behaviors in realistic home environments. Such a human-aware capability frees home service robots to carry out their daily routine work while caring for the resident more proactively and effectively. Sound-based human behavior understanding will greatly improve the capability and usability of home service robots, accelerating their adoption in daily life. The project also incorporates education and outreach activities to encourage prospective and current college students to pursue degrees and careers in science and engineering, to attract underrepresented minority students to these research activities, and to disseminate new, useful datasets to the research community and the home healthcare industry to promote continued advances in this area.

This project investigates a new theoretical framework for human activity monitoring in home environments that takes advantage of deep learning while accounting for locational context, thereby greatly improving the accuracy of human behavior understanding.
The target framework is intended for broader application to similar deep learning-based machine perception problems. The project aims to establish a novel visual-acoustic semantic map (VASM) that connects environmental understanding with behavior understanding. Constructed through robotic semantic mapping and voice-based human-robot interaction, the VASM extends traditional visual semantic maps by incorporating rich acoustic information about the environment. When cloud-connected and scaled up to a large number of robots, this approach is expected to provide an effective, distributed way to construct a large dataset of annotated home event sounds, which will then be used to train deep neural networks for sound event recognition. The project also develops a multi-sensor fusion approach that combines sound data with distributed motion-sensor data to recognize human activities without visual sensors, overcoming the shortcomings of vision sensors and offering a fundamentally different solution to human activity monitoring. Finally, the planned theoretical framework will be verified and evaluated through experiments in a robot-integrated smart home. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
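To make the multi-sensor fusion idea in the abstract concrete, here is a minimal late-fusion sketch: per-activity probabilities from a hypothetical sound-event recognizer are combined with probabilities derived from motion sensors via a weighted average. The activity labels, weights, and probability values below are invented for illustration only; the actual models and fusion method are part of the project's research and are not specified in the abstract.

```python
# Hypothetical late-fusion sketch (illustrative only): combine a
# sound-based and a motion-based probability distribution over a
# fixed set of home activities. Labels and weights are assumptions.

ACTIVITIES = ["cooking", "washing_dishes", "watching_tv", "sleeping"]

def fuse(sound_probs, motion_probs, w_sound=0.6, w_motion=0.4):
    """Weighted late fusion of two probability distributions over activities."""
    fused = {a: w_sound * sound_probs[a] + w_motion * motion_probs[a]
             for a in ACTIVITIES}
    total = sum(fused.values())          # renormalize to sum to 1
    return {a: p / total for a, p in fused.items()}

def most_likely(fused_probs):
    """Return the activity with the highest fused probability."""
    return max(fused_probs, key=fused_probs.get)

if __name__ == "__main__":
    # Invented example outputs from the two (hypothetical) recognizers.
    sound = {"cooking": 0.5, "washing_dishes": 0.3,
             "watching_tv": 0.1, "sleeping": 0.1}
    motion = {"cooking": 0.6, "washing_dishes": 0.2,
              "watching_tv": 0.1, "sleeping": 0.1}
    print(most_likely(fuse(sound, motion)))  # prints "cooking"
```

In practice a learned fusion model (e.g., a neural network over both feature streams) would likely replace the fixed weights; the fixed-weight average is only the simplest instance of combining the two modalities.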
