Research Spending & Results

Award Detail

Awardee: UNIVERSITY OF NOTRE DAME DU LAC
Doing Business As Name: University of Notre Dame
PD/PI:
  • Chaoli Wang
  • (574) 631-9212
  • chaoli.wang@nd.edu
Award Date: 07/23/2021
Estimated Total Award Amount: $499,944
Funds Obligated to Date: $499,944
  • FY 2021 = $499,944
Start Date: 10/01/2021
End Date: 09/30/2024
Transaction Type: Grant
Agency: NSF
Awarding Agency Code: 4900
Funding Agency Code: 4900
CFDA Number: 47.070
Primary Program Source: 040100 NSF RESEARCH & RELATED ACTIVIT
Award Title or Description: III: Small: DeepRep: Unsupervised Deep Representation Learning for Scientific Data Analysis and Visualization
Federal Award ID Number: 2101696
DUNS ID: 824910376
Parent DUNS ID: 048994727
Program: Info Integration & Informatics
Program Officer:
  • Hector Munoz-Avila
  • (703) 292-4481
  • hmunoz@nsf.gov

Awardee Location

Street: 940 Grace Hall
City: NOTRE DAME
State: IN
ZIP: 46556-5708
County: Notre Dame
Country: US
Awardee Cong. District: 02

Primary Place of Performance

Organization Name: University of Notre Dame
Street:
City: Notre Dame
State: IN
ZIP: 46556-5708
County: Notre Dame
Country: US
Cong. District: 02

Abstract at Time of Award

Learning features or representations from data is a longstanding goal of data mining and machine learning. In scientific visualization, feature definitions are usually application-specific, and in many cases they are vague or even unknown. Representation learning is often the first, crucial step toward effective scientific data analysis and visualization (SDAV), and it has become increasingly important as the size and complexity of scientific simulation data continue to grow. For more than three decades, manual feature engineering has been the standard practice in scientific visualization. With the rise of AI and machine learning, leveraging deep neural networks for automatic feature discovery has emerged as a promising and reliable alternative. The overarching goal of this project is to develop DeepRep, a systematic deep representation learning framework for SDAV. The outcomes will provide a paradigm shift toward representing scientific data in an abstract feature space, helping scientists better understand physical, chemical, and medical phenomena such as those arising in climate, combustion, and cardiovascular applications. This project thus serves the national interest, as stated in NSF's mission: to promote the progress of science and to advance the national health, prosperity, and welfare.

SDAV deals mainly with unlabeled data. The project team will therefore investigate unsupervised learning techniques and explore their use in learning abstract, deep, and expressive features. The proposed framework considers a broad range of inputs, including three-dimensional scalar and vector data and their visual representations (i.e., lines, surfaces, and subvolumes). Specifically, the team will study different unsupervised deep representation learning techniques, including distributed learning, disentangled learning, and self-supervised learning, and the DeepRep project aims to demonstrate their utility in subsequent SDAV tasks such as dimensionality reduction, data clustering, representative selection, anomaly detection, data classification, and data generation. The proposed research includes four primary tasks: (1) autoencoders for distributed learning of volumetric data and their visual representations, (2) graph convolutional networks for representation learning of surface data to support node-level and graph-level operations, (3) ensemble data generation from independent features via disentangled learning, and (4) self-supervised solutions for robust data representation via contrastive learning. Furthermore, the team will perform comprehensive objective and subjective evaluations using multilevel metrics to assess the framework's effectiveness.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
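To make the planned techniques concrete, here is a minimal sketch of task (1): a 3D convolutional autoencoder that learns a distributed representation of a volumetric scalar field. It assumes PyTorch, and every architectural choice (the 64^3 input size, the channel widths, the 128-dimensional latent code, and the name VolumeAutoencoder) is an illustrative assumption, not a detail taken from the award.

import torch
import torch.nn as nn

class VolumeAutoencoder(nn.Module):
    """Hypothetical 3D conv autoencoder for a 1-channel 64^3 scalar field."""
    def __init__(self, latent_dim=128):
        super().__init__()
        # Encoder: three strided 3D convolutions halve each spatial
        # dimension (64 -> 32 -> 16 -> 8), then a linear layer maps the
        # flattened feature grid to a compact latent vector.
        self.encoder = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv3d(32, 64, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 8 * 8 * 8, latent_dim),
        )
        # Decoder: mirror of the encoder, upsampling back to 64^3.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64 * 8 * 8 * 8),
            nn.Unflatten(1, (64, 8, 8, 8)),
            nn.ConvTranspose3d(64, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose3d(32, 16, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose3d(16, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        z = self.encoder(x)              # learned representation
        return self.decoder(z), z

# Unsupervised training signal: reconstruct the input volume.
model = VolumeAutoencoder()
volume = torch.randn(2, 1, 64, 64, 64)   # a batch of two synthetic volumes
recon, z = model(volume)
loss = nn.functional.mse_loss(recon, volume)

The latent vector z produced by the encoder is the learned representation; downstream SDAV tasks such as clustering, representative selection, or anomaly detection would operate on z rather than on the raw volume.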
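Task (4) relies on contrastive learning. The sketch below implements the widely used NT-Xent (SimCLR-style) loss over two augmented views of the same samples, again assuming PyTorch; the abstract does not specify which contrastive objective, encoder, or augmentations DeepRep will adopt, so this is one standard instantiation of the idea, with nt_xent as a hypothetical helper name.

import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent loss; z1, z2 are (N, D) embeddings of two views of N samples."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, D), unit length
    sim = z @ z.t() / temperature                       # pairwise cosine similarities
    sim.fill_diagonal_(float('-inf'))                   # exclude self-similarity
    n = z1.size(0)
    # The positive for row i is the other view of the same sample:
    # rows 0..n-1 pair with rows n..2n-1 and vice versa.
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)

# Toy usage with random "embeddings" standing in for encoder outputs.
z1 = torch.randn(8, 128)
z2 = torch.randn(8, 128)
print(nt_xent(z1, z2))

Minimizing this loss pulls the two views of each sample together in embedding space while pushing apart views of different samples, yielding representations that are robust to the chosen augmentations.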
