
Research Spending & Results

Award Detail

Awardee: TRUSTEES OF MOUNT HOLYOKE COLLEGE, THE
Doing Business As Name: Mount Holyoke College
PD/PI:
  • Yun-Hsuan Su
  • (206) 484-2063
  • msu@mtholyoke.edu
Award Date: 05/13/2021
Estimated Total Award Amount: $174,386
Funds Obligated to Date: $174,386
  • FY 2021 = $174,386
Start Date: 06/01/2021
End Date: 05/31/2023
Transaction Type: Grant
Agency: NSF
Awarding Agency Code: 4900
Funding Agency Code: 4900
CFDA Number: 47.070
Primary Program Source: 040100 NSF RESEARCH & RELATED ACTIVITIES
Award Title or Description: CRII: RI: RUI: Generating Haptics in Telerobotics through Perception Complementarities during Physical Distancing
Federal Award ID Number: 2101107
DUNS ID: 066985714
Parent DUNS ID: 066985714
Program: Robust Intelligence
Program Officer:
  • Erion Plaku
  • (703) 292-8695
  • eplaku@nsf.gov

Awardee Location

Street: 50 College Street
City: South Hadley
State: MA
ZIP: 01075-6456
County: Hampshire
Country: US
Awardee Cong. District: 01

Primary Place of Performance

Organization Name: Mount Holyoke College
Street: 50 College Street
City: South Hadley
State: MA
ZIP: 01075-6456
County: Hampshire
Country: US
Cong. District: 01

Abstract at Time of Award

Haptic feedback, the sense of touch and awareness of movement, is a fundamental sensory pathway in everyday human life and plays a particularly essential role in object manipulation. In a physically distanced or miniaturized world, telepresence using robot proxies can assist humans with remote tasks, yet haptic capability remains limited by the lack of high-fidelity, low-cost haptic interfaces and sensors. Because much of the information in the digital age exists as images or video streams, the ability to interpret the sense of touch from vision opens new opportunities. This project explores software solutions that estimate contact force/torque from visual data and train robots to replicate force-sensitive soft-body manipulation tasks. The software agent will be trained to transmit real-time force/torque estimates without the need for haptic sensors. Promising application domains include providing surgeons with vision-based haptic feedback during robot-assisted minimally invasive surgery (RMIS), meeting short-term telepresence demands during emergencies or disaster response, and teleoperating a companion robot to perform a haptically enabled virtual hug or a remote handshake. The goal is to create a novel software solution that helps humans stay connected and complete remote tasks through intelligent touch estimation, and that lowers the barrier to entry for haptic teleoperation by reducing hardware requirements.

This project leverages preliminary work on vision-based force estimation in RMIS. The project will evaluate the accuracy of the estimated forces, analyze the added benefits and limitations of artificial haptic information, explore the sim-to-real transfer learning capabilities of the proposed framework through Variational Autoencoder-Generative Adversarial Networks (VAE-GANs), and prioritize cross-robot support by ensuring software compatibility with the Robot Operating System (ROS), the Collaborative Robotics Toolkit (CRTK), and the Asynchronous Multi-Body Framework (AMBF) for dynamic simulation and visualization. This wide spectrum of applications gives this forward-looking research significant potential to impact telerobotics for soft-object manipulation. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
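
To make the approach above concrete, the sketch below (Python/PyTorch) shows a minimal vision-to-force model in the spirit of the abstract's VAE-GAN framework: a convolutional VAE encoder compresses a camera frame into a latent code, a small head regresses a 6-DoF force/torque vector from that code, and an adversarial critic on the latent codes encourages simulated and real frames to share a representation for sim-to-real transfer. All names (Encoder, ForceHead, Critic), layer sizes, and loss weights are illustrative assumptions, not the project's actual implementation.

# Minimal hypothetical sketch of a VAE-GAN-style vision-to-force estimator.
import torch
import torch.nn as nn
import torch.nn.functional as F

LATENT_DIM = 32  # size of the shared latent representation (assumed)

class Encoder(nn.Module):
    """VAE half: CNN that maps a 64x64 RGB frame to a latent Gaussian."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
        )
        self.mu = nn.Linear(64 * 8 * 8, LATENT_DIM)
        self.logvar = nn.Linear(64 * 8 * 8, LATENT_DIM)

    def forward(self, x):
        h = self.conv(x)
        return self.mu(h), self.logvar(h)

class ForceHead(nn.Module):
    """Regressor: latent code -> 6-DoF force/torque (wrench) estimate."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LATENT_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, 6))

    def forward(self, z):
        return self.net(z)

class Critic(nn.Module):
    """GAN half: scores latent codes to align sim and real domains."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LATENT_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, 1))

    def forward(self, z):
        return self.net(z)

def reparameterize(mu, logvar):
    """Sample z ~ N(mu, sigma^2) via the reparameterization trick."""
    return mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)

enc, head, critic = Encoder(), ForceHead(), Critic()
opt_g = torch.optim.Adam(list(enc.parameters()) + list(head.parameters()), lr=1e-4)
opt_d = torch.optim.Adam(critic.parameters(), lr=1e-4)

# Stand-in data: labeled simulated frames and unlabeled real frames.
sim_frames = torch.randn(8, 3, 64, 64)
sim_forces = torch.randn(8, 6)  # ground-truth wrenches from the simulator
real_frames = torch.randn(8, 3, 64, 64)

# Critic step: distinguish real-domain latents from simulated ones.
with torch.no_grad():
    z_sim = reparameterize(*enc(sim_frames))
    z_real = reparameterize(*enc(real_frames))
d_loss = (F.binary_cross_entropy_with_logits(critic(z_real), torch.ones(8, 1)) +
          F.binary_cross_entropy_with_logits(critic(z_sim), torch.zeros(8, 1)))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: regress forces, regularize the latent (KL), fool the critic.
mu, logvar = enc(sim_frames)
z = reparameterize(mu, logvar)
kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
force_loss = F.mse_loss(head(z), sim_forces)
adv_loss = F.binary_cross_entropy_with_logits(critic(z), torch.ones(8, 1))
g_loss = force_loss + 0.01 * kl + 0.1 * adv_loss
opt_g.zero_grad(); g_loss.backward(); opt_g.step()

A full VAE-GAN, as named in the abstract, would also include an image decoder with the discriminator judging reconstructed frames; the latent-space critic above is a lighter stand-in that captures the same sim-to-real alignment idea.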
