
Research Spending & Results

Award Detail

Awardee: GRAMMATECH, INC.
Doing Business As Name: GRAMMATECH, INC.
PD/PI:
  • David Cok
  • (607) 273-7340
  • davidcok24@gmail.com
Award Date: 09/17/2013
Estimated Total Award Amount: $962,190
Funds Obligated to Date: $978,190
  • FY 2015 = $16,000
  • FY 2013 = $962,190
Start Date: 10/01/2013
End Date: 09/30/2017
Transaction Type: Grant
Agency: NSF
Awarding Agency Code: 4900
Funding Agency Code: 4900
CFDA Number: 47.070
Primary Program Source: 040100 NSF RESEARCH & RELATED ACTIVIT
Award Title or Description: TTP: Medium: Crowd Sourcing Annotations
Federal Award ID Number: 1314674
DUNS ID: 603978321
Program: Secure & Trustworthy Cyberspace
Program Officer:
  • Kevin Thompson
  • (703) 292-4220
  • kthompso@nsf.gov

Awardee Location

Street: 531 Esty Street
City: ITHACA
State: NY
ZIP: 14850-3250
County: Ithaca
Country: US
Awardee Cong. District: 23

Primary Place of Performance

Organization Name: GRAMMATECH, INC.
Street: 531 Esty Street
City: Ithaca
State: NY
ZIP: 14850-3250
County: Ithaca
Country: US
Cong. District: 23

Abstract at Time of Award

Both sound software verification techniques and heuristic software flaw-finding tools benefit from the presence of software annotations that describe the behavior of software components. Function summaries (in the form of logical annotations) allow modular checking of software and more precise reasoning. However, such annotations are difficult to write and not commonly produced by software developers, despite their benefits to static analysis. The Crowdsourcing Annotations project will address this deficiency by encouraging software-community-based crowd-sourced generation of annotations. This effort will be supported by tools that generate, use, and translate the annotations; the results of annotation efforts will be shared through openly available repositories. We will also use pilot projects to demonstrate and encourage the use of annotations and static analysis. The project will leverage and interact with the Software Assurance Marketplace (SWAMP) project's collection of static analysis tools and example software. Some of the technical challenges are developing uniform styles and languages for annotations, reliably validating crowd-sourced submissions, merging annotations and the corresponding source code, version control, and integration with typical software development environments. The social challenges are also important: designing and implementing a crowd-sourcing infrastructure in a way that enhances and motivates community and individual technical and social benefits.

Publications Produced as a Result of this Research

Note: Clicking on a Digital Object Identifier (DOI) number will take you to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the publisher's embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

M. Sighireanu and D. R. Cok, "Report on SL-COMP 2014," Journal on Satisfiability, Boolean Modeling and Computation, v.1, 2016. http://www.liafa.univ-paris-diderot.fr/

David R. Cok, "OpenJML: Software verification for Java 7 using JML, OpenJDK, and Eclipse," Proceedings of the 1st Workshop on Formal Integrated Development Environments (F-IDE 2014), 2014. http://arxiv.org/pdf/1404.6608v1.pdf

David R. Cok, Aaron Stump, and Tjark Weber, "The 2013 Evaluation of SMT-COMP and SMT-LIB," Journal of Automated Reasoning, v.55, 2015. doi:10.1007/s10817-015-9328-2

D. R. Cok and S. Johnson, "SPEEDY: An Eclipse-based IDE for invariant inference," Proceedings of the 1st Workshop on Formal Integrated Development Environments (F-IDE 2014), 2014. http://arxiv.org/pdf/1404.6605v1.pdf

D. R. Cok, D. Deharbe, and T. Weber, "The 2014 SMT Competition," Journal on Satisfiability, Boolean Modeling and Computation, v.1, 2016.


Project Outcomes Report

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Safe software does what it is supposed to; secure software does only what it is supposed to, with no extra (malicious) functionality. Modern software is complex, comprising many components written by the software's authors together with many libraries written by others. Automatically checking that software is safe and secure requires expressing the software's intent in a machine-readable form, known as annotations or specifications. Even modern commercial tools that do not require the user to write such specifications rely internally on proprietary models of all relevant library routines.

This project improved the state of models (specifications) for commonly used libraries. We assessed and documented the current state of practice on a public website, annotationsforall.org. We found that the current tools were not mature enough for industrial use, and that this immaturity was an impediment to widespread interest in writing library specifications. Consequently, in addition to writing and testing library specifications, we also improved the underlying tools and produced tutorial material on using automated tools to check the safety and security of software.

Our work focused on programs in C, with specifications in the ANSI-C Specification Language (ACSL) and the Frama-C checking tool, and on programs in Java, with specifications in the Java Modeling Language (JML) and the OpenJML checking tool.
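To illustrate the kind of annotation involved (this is an illustrative sketch, not taken from the project's own specifications): a JML "function summary" attaches a precondition and postcondition to a Java method inside special `/*@ ... @*/` comments. A checker such as OpenJML can verify the method against them statically, while an ordinary Java compiler ignores them entirely.

```java
// Hypothetical example of a JML function summary. The /*@ ... @*/
// comments are JML annotations; to the Java compiler they are ordinary
// comments, so this class compiles and runs unchanged.
public class AbsExample {
    /*@ requires x != Integer.MIN_VALUE;            // precondition: negation must not overflow
      @ ensures \result >= 0;                        // postcondition: result is non-negative
      @ ensures \result == x || \result == -x;      // postcondition: result is x or -x
      @*/
    public static int abs(int x) {
        return x < 0 ? -x : x;
    }
}
```

The precondition excludes `Integer.MIN_VALUE` because `-Integer.MIN_VALUE` overflows in two's-complement arithmetic; a verifier using this summary can then reason about callers of `abs` modularly, without re-examining its body. ACSL plays the analogous role for C programs checked with Frama-C.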

The outcomes of the project are these:

(a) assessment of the use, appropriate workflows, and impediments to use of formal specifications in support of software verification, including connections to proprietary specifications and the usefulness of annotations for identifying historical bugs in public software;
(b) documenting on a public web-site (annotationsforall.org) the languages and tools that support semi-automated verification;
(c) improved tools, in particular the Frama-C tool suite and OpenJML, in the form of improved functionality, additional tests, bug reports, and improved documentation;
(d) improved library specifications in ACSL and JML usable by those tools, including test suites that exercise those specifications;
(e) improvements and corrections to the logical underpinnings and semantics of the specification languages through organized discussions at conferences, at workshops, and in online discussion groups;
(f) publicly available publications, technical reports, public web-sites, tutorials and book material documenting the results of the project;
(g) advice, review and support to undergraduate and graduate projects at various universities that build on these tools and specification languages;
and (h) transition to industrial use through separately contracted research to apply the improved tools to checking the security and safety of in-use industrial software.

 


Last Modified: 10/02/2017
Modified by: David Cok

For specific questions or comments about this information including the NSF Project Outcomes Report, contact us.