
Research Spending & Results

Award Detail

Awardee: UNIVERSITY OF UTAH, THE
Doing Business As Name: University of Utah
PD/PI:
  • Neda Nategh
  • (801) 213-3675
  • neda.nategh@utah.edu
Award Date: 11/09/2017
Estimated Total Award Amount: $124,588
Funds Obligated to Date: $124,588
  • FY 2016 = $124,588
Start Date: 07/31/2017
End Date: 04/30/2018
Transaction Type: Grant
Agency: NSF
Awarding Agency Code: 4900
Funding Agency Code: 4900
CFDA Number: 47.070
Primary Program Source: 040100 NSF RESEARCH & RELATED ACTIVIT
Award Title or Description: CRII: RI: A Biologically-Inspired Algorithm to Detect, Segment, and Track Moving Objects with Observer Motion
Federal Award ID Number: 1811543
DUNS ID: 009095365
Parent DUNS ID: 009095365
Program: ROBUST INTELLIGENCE
Program Officer:
  • Kenneth C. Whang
  • (703) 292-8930
  • kwhang@nsf.gov

Awardee Location

Street: 75 S 2000 E
City: SALT LAKE CITY
State: UT
ZIP: 84112-8930
County: Salt Lake City
Country: US
Awardee Cong. District: 02

Primary Place of Performance

Organization Name: University of Utah
Street:
City:
State: UT
ZIP: 84112-8930
County: Salt Lake City
Country: US
Cong. District: 02

Abstract at Time of Award

This project aims to develop a real-time computational algorithm to detect, segment, and track moving objects in the presence of observer motion. The algorithm will be based on a computational model of experimentally measured properties of Object Motion Sensitive (OMS) cells in the vertebrate retina, which solve a similar problem. Given the known computational properties of the retina, the algorithm is expected to be robust under difficult motion-tracking conditions, including object occlusion, multiple moving objects, varying scene statistics, substantial background motion, and optic flow. From a biological perspective, the computational model will give insight into how the retina encodes moving objects. From an engineering perspective, the resulting algorithm will be applicable to machine vision from a moving platform, including autonomous vehicles, surveillance and reconnaissance applications, as well as smart sensor design and neuromorphic systems. Despite many advances in motion analysis, techniques for moving observers still lag behind those for static observers in reliability, efficiency, robustness, and runtime in real-world scenarios. On the other hand, the biological visual system performs similar motion computations reliably in the presence of constant eye movements. Recently, it was discovered that segmentation of moving objects, and rejection of background motion, begins in the retina. A subset of retinal ganglion cells responds to differential motion between the receptive field center and surround, as produced by an object moving over the background, but is strongly suppressed by global image motion, as produced by the observer's head or eye movements.
This selectivity for differential motion is independent of the direction and spatial pattern of the object, enabling the visual system to find the boundaries of moving objects, segregate multiple moving objects, and anticipate the direction of motion. The retina performs these tasks simultaneously, in real time, and with high accuracy using a network of only five basic cell types. These properties, along with the experimental accessibility of the retina, make this neural circuit an ideal working biological system on which to base an object-tracking algorithm. This project will develop a unique motion analysis algorithm for mobile observers based on recent findings on the retina's motion computations and circuitry, and will demonstrate its performance in a variety of realistic scenarios, including lateral observer motion, optic flow, dynamic scenes, and objects that occlude the moving object.
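The OMS computation described in the abstract — a response driven by the mismatch between motion in the receptive-field center and motion in the surround, suppressed when the two agree (as under global self-motion) — can be sketched in a few lines. This is a minimal illustrative toy, not the project's actual model; the function name, the divisive form of the suppression, and the constant `k` are all assumptions introduced here.

```python
import math

def oms_response(center_motion, surround_motion, k=2.0):
    """Toy object-motion-sensitive (OMS) response for 2-D motion vectors.

    Drive comes from differential motion between center and surround
    (an object moving over the background); coherent surround motion
    (observer head/eye movement) divisively suppresses the response.
    The divisive form and the gain k are illustrative assumptions.
    """
    dx = center_motion[0] - surround_motion[0]
    dy = center_motion[1] - surround_motion[1]
    differential = math.hypot(dx, dy)          # center/surround mismatch
    background = math.hypot(*surround_motion)  # global (self-motion) component
    return differential / (1.0 + k * background)

# Pure global motion (center == surround) gives zero response,
# while an object moving over a static background drives the cell.
```

Note that under pure global motion the differential term already vanishes; the divisive term additionally quenches residual responses when the background itself moves strongly, mirroring the suppression described above.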
