DOI: 10.1145/2442576.2442589
research-article

A reflection on seven years of the VAST challenge

Published: 14 October 2012

ABSTRACT

We describe the evolution of the IEEE Visual Analytics Science and Technology (VAST) Challenge from its origin in 2006 to the present (2012). The VAST Challenge has given visual analytics researchers an opportunity to test innovative approaches to problems in a wide range of subject domains against realistic datasets and problem scenarios. Over time, the Challenge has changed to meet the needs of researchers and users. We describe those changes and the impact they have had on the topics selected, the data and questions offered, the submissions received, and the Challenge format.


Published in

  BELIV '12: Proceedings of the 2012 BELIV Workshop: Beyond Time and Errors - Novel Evaluation Methods for Visualization
  October 2012, 94 pages
  ISBN: 9781450317917
  DOI: 10.1145/2442576
  Copyright © 2012 ACM


Publisher

  Association for Computing Machinery, New York, NY, United States


Acceptance Rates

  Overall acceptance rate: 45 of 64 submissions, 70%
