research-article

Minimal Specifications for Detecting Security Vulnerabilities

Published: 6 December 2019

Abstract

Computers are nearly ubiquitous in modern society, with uses ranging from maintaining friendships and monitoring homes to managing money and coordinating health care. As the roles of the computer continue to expand, so too does the threat posed by cyberattacks. An important challenge for today's software engineers is to build secure software and help neutralize these threats. Formal methods have long been suggested as an excellent way to build secure software but have not been widely adopted for this purpose. The "conventional wisdom" suggests several reasons for this slow adoption, including a steep learning curve, difficulty in augmenting existing systems, and a lack of tools with security-specific abstractions. Our hypothesis, however, is that applying a small, easy-to-learn subset of the techniques available today could significantly decrease software vulnerabilities and reduce the risk of cyberattacks. In this paper, we discuss the motivation for our hypothesis and describe our ongoing experiment to test it.
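To make the hypothesis concrete, a "minimal specification" in this spirit might be a single lightweight precondition on a risky operation. The sketch below is hypothetical (the class `SafeCopy` and method `copy` are illustrative names, not from the paper): it annotates an array-copy routine with JML-style `//@ requires` clauses of the kind a checker such as OpenJML can verify, ruling out the out-of-bounds writes behind many buffer-overflow vulnerabilities.

```java
// Hypothetical sketch: a minimal JML-style specification guarding an
// array copy. The //@ requires clauses state the bounds conditions a
// static checker would verify at every call site; the loop itself
// never writes outside dst when the precondition holds.
public class SafeCopy {
    //@ requires src != null && dst != null;
    //@ requires len >= 0 && len <= src.length && len <= dst.length;
    public static void copy(int[] src, int[] dst, int len) {
        for (int i = 0; i < len; i++) {
            dst[i] = src[i];
        }
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3};
        int[] b = new int[3];
        copy(a, b, 3);
        // Copies the first three elements of a into b.
        System.out.println(java.util.Arrays.toString(b)); // prints "[1, 2, 3]"
    }
}
```

The point of the example is its size: two annotation lines, using only a subset of JML that a developer can learn quickly, yet sufficient for a tool to flag any caller that could trigger an out-of-bounds write.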



  • Published in

    ACM SIGAda Ada Letters, Volume 38, Issue 2 (December 2018), 106 pages
    ISSN: 1094-3641
    DOI: 10.1145/3375408
    Editor: Alok Srivastava

    Copyright © 2019 is held by the owner/author(s).

    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery, New York, NY, United States

