Abstract

A course at the University of Nevada, Reno (UNR) teaches science students an interdisciplinary approach for designing data visualizations, in combination with big data computation and statistical analysis. This 15-week, undergraduate-level course, titled “Computational Skills for Big Data: Analysis, Statistics, and Visualization,” is co-taught by visual communication design, math, and physics faculty, and is an example of a pedagogy rooted in a STEAM[1]-based approach that combines instruction in art or design with instruction in science, technology, engineering, or mathematics. This paper introduces the interdisciplinary data visualization typology that formed the basis of this course’s visualization component: data visualization for facilitating analysis and data visualization for sharing knowledge. Data visualization for facilitating analysis is the use of visualization among an expert audience who are intimately familiar with the represented data, and data visualization for sharing knowledge is the use of visualization to represent data in a way that is intelligible to a non-expert audience. The authors argue that most of the extant data visualization education and science education literature can be classified as referring to one or the other of these types, and that this classification can aid planning and facilitating instruction in data visualization (the design and use of graphical representations of data). This paper outlines the general structure of the course and its data visualization components in detail, provides an overview of how and why the course was assessed, and reflects on its potential reproducibility. An analysis of work produced by students as the course progressed suggests that challenging STEM[2] undergraduate students to distinguish between data visualization for facilitating analysis and data visualization for sharing knowledge is a helpful way to guide the teaching of data visualization. The paper also argues for greater collaboration between visual communication design faculty and STEM faculty in undergraduate courses, and posits that collaborative efforts to teach data visualization are an effective way to do this.

Introduction

Data visualization skills are crucial for science undergraduate students to effectively learn conceptual principles, analyze experimental data, and communicate experimental findings.[3][4][5][6] Although the importance of effectively acquiring data visualization skills is widely acknowledged in the science education literature, few STEM education scholars advocate for their students to receive explicit instruction on designing data visualizations using principles that are commonly recommended in the visual communication design and data visualization literature.[7][8] The science education literature provides science educators with some awareness of why it is important for their students to acquire data visualization skills, but fails to provide them with a practical roadmap for how to teach these skills effectively, much less the bases of knowledge that inform them. At the University of Nevada, Reno (UNR), the expectation that science students will be able to produce data visualizations without any training in the relevant design principles has frequently resulted in poor communication of scientific information in undergraduate student work, disadvantaging both students’ comprehension of important scientific knowledge and their ability to develop the skills necessary to communicate this knowledge effectively to those inside and outside their particular discipline.

In 2016-2017, an interdisciplinary course titled “Computational Skills for Big Data: Analysis, Statistics, and Visualization” was conceived by three faculty members: one with a background in visual communication design, another in mathematical statistics, and the third in atmospheric physics. Their primary goal was to address the need to improve students’ capacities to design effectively communicative data visualizations of complex information. Secondarily, they sought to improve students’ understanding of how to work with open-source big data sets, as well as students’ abilities to perform basic statistical analysis of the data that comprise these sets. The visual communication design faculty member assumed responsibility for developing the details of the course pertinent to data visualization, the mathematics faculty member for the details pertinent to statistics, and the physics faculty member for the details pertinent to numerical weather prediction and big data computation.

Big data computation is a methodological approach to engaging in analysis that relies on the automated cross-referencing of very large data sets. It is currently having a revolutionary effect on government, industry, and research in the humanities and sciences, and is consequently a methodological skill that is in very high demand. The qualities that lead data to be considered ‘big’ include the speed of its collection, the availability of real-time computational processing, low data accuracy, and its sheer volume compared with the data that was previously available in any given domain.[9][10]

This interdisciplinary teaching project intentionally introduced students to a “perceptual cognitive-based” approach to data visualization instruction, whereby principles of design are introduced for the purpose of improving clarity of communication within designed data visualizations.[11] This approach was chosen because the authors felt that it fit well pedagogically and conceptually with the statistical and computational instruction students received in the course. The authors note that many critical cultural, ethical, and rhetorical perspectives on data visualization instruction exist, and that these may be more appropriate when operationalized on behalf of a different cohort of students, or within the structure of a different course. Communication design researcher Meredith Davis observes that emphasis on utility in design does not negate acknowledgement or awareness of broader cultural contexts and meanings.[12]

Although the course was planned for all undergraduate first-year STEM pre-major students and will be offered to these students in the future, a small-scale pilot course was offered to physics seniors in the spring semester of 2017 to test the effectiveness of the course planning. The pilot course used numerical weather prediction as the source of big data due to great industry demand for this skill; however, as this first-year course is taught in the future, it will make use of a wider range of big data sources. In order to highlight design educators’ capacity to enhance science education through sharing a real-world example of effective interdisciplinary collaboration, this paper focuses on the visual communication design faculty member’s work on developing and implementing the data visualization component of this pilot course. This focus demonstrates the possibilities for, and benefits of, visual communication design faculty collaborating with faculty members and other university personnel from disparate disciplines to improve pedagogical outcomes and cross-disciplinary practices, and it is hoped that this example will provide a useful roadmap for others.

This interdisciplinary course is an example of STEAM pedagogy, which integrates science, technology, engineering, and math (abbreviated to STEM) education with art or design (represented by the ‘A’ in STEAM) education. Conceived by educator Georgette Yakman in 2006, STEAM pedagogy generally emphasizes thematic (rather than discipline-based) learning, so that students gain cross-disciplinary literacy in skills they are likely to apply in real-world scenarios.[13] In 2010, the Rhode Island School of Design advocated incorporating design into the STEAM education model, and widely promoted STEAM as a means for integrating design into many educational fields.[14][15] Other organizations and institutions have since also advocated for widespread adoption of STEAM-based education. These include the United Kingdom-based Design Council, the United States-based National Art Education Association, the STEAM Program at North Carolina State University, and the d.school at Stanford University. The STEAM model has rapidly been incorporated into education, industry, research, and policy in many countries worldwide. However, visual communication design could be incorporated into STEAM-based pedagogy far more often than it currently is.

Design for Analysis and Design for Sharing Knowledge

In physics education, as in broader science education and research, there are two main uses of data visualization: data visualization for facilitating analysis and data visualization for sharing knowledge.[16][17] Data visualization for facilitating analysis encompasses the production of visualization artifacts — charts, diagrams, maps, plots, and sketches — that provide overviews of complex data that can be interpreted qualitatively through visual perception and pattern recognition skills for the purposes of exploring data, developing and testing hypotheses, and forming conclusions (see Figure 1a).[18][19][20] In terms of meeting the needs of a particular intended audience, data visualization for facilitating analysis is produced and used either individually or in groups that are limited to the people involved in any given experiment, research activity, or data collection event. Data visualization for facilitating analysis is of widespread interest in the science education literature, which focuses on how creating and learning to interpret data visualizations can enhance students’ understanding of basic scientific concepts.[21][22][23][24] For example, biologists Jennifer Klug, Cayelan Carey, and their colleagues emphasized data visualization for facilitating analysis in an ecology course module on lake ice. Students explored and graphed real-world, long-term monitoring data in spreadsheet software, visually assessed the results, and then applied linear regression to understand change over time.[25] Chemist Lisa Gentile and her colleagues took a similar approach in designing a unit on diffusion, asking students to compare data visualizations representing transmission via different mechanisms and to qualitatively examine those visualizations for evidence of how the mechanism affects transmission.[26] Learning scientists Douglas Gordin and Roy Pea describe another instance of data visualization for facilitating analysis using spatial data, in which students in an undergraduate class on atmosphere and climate performed arithmetic operations on data layers to produce a visual representation of different rates of change.[27]

The other main use of data visualization — data visualization for sharing knowledge — encompasses those visualizations that are designed and distributed for the purpose of communicating research findings beyond the individual or group directly involved in a research activity. Although this can make use of the same forms (charts, maps etc.), it is distinguished from data visualization for facilitating analysis by purpose, intended audience, and intentional persuasiveness. There are several sub-groups within data visualization for sharing knowledge: visualization for sharing detailed knowledge, visualization for sharing summary knowledge, and visualization for intentionally evoking emotive responses (persuasion and/or storytelling). Visualization for sharing detailed knowledge is the production of visualizations with the intention to communicate detailed scientific findings to any audience that has not participated in the particular experiment or event (see Figure 1b). This includes detailed diagrams, charts, and maps such as those used as figures in academic journals, as well as physical and virtual three-dimensional renderings of scientific phenomena, as used for teaching purposes. Visualization for sharing summary knowledge is the production of highly simplified diagrams, charts, and maps for the purpose of communicating general conceptual awareness of scientific findings.[28] Summary knowledge visualization formats include overview figures, visual abstracts, and scientific posters produced for educational purposes. Visualization for intentionally evoking emotive responses is the production of visualizations with the intention of altering the emotional state of their audiences, often in order to alter their beliefs or behaviors (see Figure 1d).[29] This is a far larger category than the previous two, and includes infographics, graphic re-enactments and reconstructions, and visualizations used in advertising, entertainment, journalism, and propaganda activities.[30][31] Figure 2 shows the proximity of various visualization purposes to the original experiment or data collection event. This figure also indicates an increase in the intentional persuasiveness of each of the represented levels of data visualization. With that stated, it should be noted that various scholars acknowledge that all data visualizations are inherently persuasive.[32][33][34]

Figure 1: Examples of two types of data visualization identified in the literature, designed using principles taught in the visualization module: color use, legibility, and visual hierarchy. Figure 1a (left) “a visualization for facilitating analysis” — depicts a scatter plot as typically designed by scientists for data analysis. Figure 1b (right) “a visualization for sharing knowledge” — depicts an annotated scatter plot as it would be depicted on a scientific poster, representing the same data. Data source: Nevada State Climate Office, 2017.


These categorizations in the data visualization literature, data visualization for facilitating analysis and data visualization for sharing knowledge, form the basis of the visualization component of the first-year course described in this discourse. STEM students are required to create both types of data visualizations in their undergraduate work, and, as such, need training on how to differentiate between and apply the two types. For example, if a student is trying to discover a pattern in data they have collected, they rely on data visualization to guide their analysis. On the other hand, if they have already identified an important pattern or structure, and want to communicate their findings, they turn to techniques appropriate to data visualization for sharing knowledge. Understanding the difference between the two types of data visualizations is treated as the first step to developing effective toolsets for creating them. The required fields of knowledge for teaching students to design effective data visualizations are outlined below.

Figure 2: Proximity of the various types of visualization identified in the data visualization literature to the original experiment or data collection event. The further a visualization is located from the original experiment or data collection event, the more intentionally persuasive it tends to be.

Teaching Students to Design Data Visualizations for Analysis and Data Visualizations for Sharing Knowledge

Designing these two modalities of data visualization requires some overlapping skills. Data visualization for both analysis and for sharing knowledge must be designed in a way that renders it understandable within the limits of human experience. In turn, this requires the accommodation of human sensing and behavior, information processing, and visual conventions. Human sensing considerations for data visualization include cultivating understanding about how people receive and process input through their bodily sensations and movement. There is an increasing body of work on the important roles that sensory experience plays in the facilitation of learning.[35][36][37] These studies suggest that while input from all senses can theoretically influence how effectively a given data visualization can be interpreted, the predominant sensory factors that affect interpretation are aesthetic (relating to visualizations’ visual nature) and, increasingly, haptic considerations (in the case of interactive visualizations). For example, if a particular data visualization uses multiple colors to communicate some of its essential meaning, and those colors are difficult to distinguish from each other visually, the visualization will be difficult to interpret.[38]

Information processing considerations that affect visualization include the cultivation of understandings regarding how people synthesize sensory experience into conceptual knowledge. These factors relate to cognitive information processing and storage through processes such as chunking, dual coding, and serial position effects.[39][40] For example, it is important to present only five or fewer groups of information (as delineated by color, shape, shading or saturation, or position) in any one visualization, because this is the maximum number of ‘chunks’ of information that humans can comfortably process at one time.[41] Visual convention considerations for visualization include the repeated use of visual elements — composition, color, line, shape, and type — in a way that establishes a consistent, readily understood visual hierarchy and allows visualizations to be easily recognized as specific types. For example, a student’s drawing or plot of a bar chart can only be easily understood as a bar chart if it contains the visual conventions that are commonly associated with a bar chart. These include a series of solid vertical or horizontal color bars aligned along a solid black axis line, a title or labels indicating what the bars represent, and numerical indicators of what quantities the bars represent.[a] Visual conventions are effective due to their adherence to expected forms: if a viewer is able to identify the elements of a data visualization and their purpose based on his or her familiarity with other similar images, the data visualization is more likely to communicate its message successfully.[42][43]
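As an illustration only, and not drawn from the course materials, the following minimal R sketch produces a bar chart that follows the conventions just described: solid bars along a common axis line, a title and labels indicating what the bars represent, and numerical indicators of the quantities. The data and labels are hypothetical.

```r
# Minimal sketch (hypothetical data, not course material): a bar chart built
# with the conventions described above.
days_by_condition <- c(Clear = 14, Cloudy = 9, Rain = 5, Snow = 2)

bar_midpoints <- barplot(days_by_condition,
                         col  = "steelblue",
                         ylim = c(0, max(days_by_condition) * 1.2),
                         main = "Observed days by sky condition (hypothetical data)",
                         ylab = "Number of days")

# Numerical indicators above each bar, and a solid axis line along the base
text(bar_midpoints, days_by_condition, labels = days_by_condition, pos = 3)
abline(h = 0)
```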

Data visualization for sharing knowledge requires greater skill and the ability to draw from a wider base of knowledge, because it must account for and address additional contextual considerations. To be effective, data visualization for sharing knowledge requires consideration of the factors outlined above for data visualization for facilitating analysis, as well as awareness of, and design that accommodates, a specific audience’s biases and belief systems. Additionally, the socio-cultural context within which the visualization will be interpreted must be considered, as well as the intended purpose the visualization has been designed to fulfill, whether a call to action, an evocation of emotion, or both. To be effective, almost any given visualization requires the consideration of a broad array of factors that affect audience perception, logical interpretation, and emotional response. These include, but are not limited to, analysis and assessment of demographics, psychographics, literacy, culturally rooted mindsets, socioeconomic class, and visual trends. While it would be ideal to consider all of these areas when designing almost every data visualization for sharing knowledge, it is important for practical reasons — mostly related to being able to create these visualizations in limited amounts of time — to focus on only a few of them.

The key considerations regarding audiences when designing data visualization for sharing knowledge are interest, expertise, and cognitive level. When considered together, these factors provide a clear indication of the level of attention and cognitive effort audience members are likely, or able, to expend to effectively interpret the information they have been presented with. Designing to account for the relative level of interest, or attention span, a specific audience might have in a given subject or presentation, and accounting for the relative amount of cognitive effort they will have to expend to glean meaning from it, is a key tenet of user experience design. This discipline is rooted in attempting to ensure that the tasks required of a given individual in a particular context of use are not more than he or she can reasonably achieve given his or her attention span, level of education, cognitive abilities, socio-cultural beliefs, and sociopolitical and socioeconomic factors.[44][45] The qualities that make a particular line chart (see Figure 3) effective for interested adults with average cognitive function for their age and expert knowledge of the subject matter on offer are quite different from the qualities that would make a line chart effective for middle school-aged children who have no prior interest in or exposure to that subject matter. Charts that endeavor to present essentially the same information to both of these audiences may depict some of the same data, but essential meaning can be visually communicated to the former group using much more complex means than to the latter, and still remain intelligible to its audience. If a line chart designed for experts in a given type of scientific inquiry were presented unaltered in a scientific textbook aimed at first-year university students, it would likely be ineffective. To increase this line chart’s intelligibility to these students, it would ideally be simplified visually, and perhaps include a greater amount of visual or textual annotation to help them connect the represented data with concepts that are better explained or further emphasized in the additional annotation.

When designing data visualizations for sharing knowledge, it is also critically important to consider and account for the effects that context will have on a given audience’s ability to effectively interpret (and act on) meaning. The audience considerations discussed above focus on analyzing and assessing the individual characteristics of specific audiences or audience members. In this discourse, context refers to the way that location, situation, medium, and sociocultural, sociopolitical, and socioeconomic considerations affect presentation and interpretation of a data visualization, and it determines the likelihood that an audience can interpret a data visualization. Legibility, a term that refers to how clearly a given data visualization can be read by someone who is not visually impaired, is context dependent. For example, research posters are typically read from a distance of 4 feet away or greater, so charts that appear on posters need to be designed so that all of the type they make use of is clearly visible at this distance. In contrast, academic papers are typically read at a distance of 2 feet away or closer, on a printed page, or often on the screen of a digital device that affords the audience the possibility of zooming in to the figure, so that it may be viewed in closer detail. Therefore, in order to be legible in context, a chart for an academic paper can contain type at a much smaller type size than that which appears on a poster, without any loss of contextual legibility.[46]

Figure 3: A line chart, also sometimes referred to as a “fever chart.”

Science classes do not allow time to study each of the knowledge areas that contribute to effective data visualization in detail, if at all. The most practicable way to instill these considerations is to adopt a two-part approach. The first part involves providing students with a wide variety of examples of well-designed data visualizations (in the specific area they are learning about, whether that is data visualization for facilitating analysis or one of the kinds of data visualization for sharing knowledge). The second involves presenting these examples in tandem with hands-on data visualization activities guided by an instructor who possesses at least basic knowledge of the formal principles of effective layout design. He or she can then point out the various formal factors (i.e., effective color use, the importance of legibility, and clear visual hierarchy) that make each data visualization effective as teachable moments arise.[47] Unfortunately, very few science educators are trained in visualization, and there is little awareness among science educators or in the science education scholarship that such specialist expertise exists in the field of visual communication design.[48] Visual communication design educators’ expertise in data visualization holds promise for enhancing science instruction and helping students to produce more effective data visualizations. However, this potential can only be realized if more visual communication design educators participate fully in interdisciplinary teaching practices, especially those that reach across disciplinary boundaries into the sciences.

Computational Skills for Big Data: Analysis, Statistics, and Visualization

The educators who developed and implemented this course recognized that visualization instruction within STEM education at UNR could be improved by introducing some basic, formal principles of layout and typography, and that students’ big data and statistical analysis skills also needed strengthening. The visual communication design faculty member and the physics faculty member informally discussed ways to remedy these pedagogical shortcomings, and settled on a grant-funded, interdisciplinary, co-taught course as the most viable solution. This approach was decided upon after careful analysis of the multiple knowledge disciplines such a course would need to draw from, and recognition that significantly more resources were necessary for such a course than are usually available for developing and teaching courses at UNR. An interdisciplinary course addressing the three pedagogical needs described in the previous sections of this article was planned, and a curriculum development grant was awarded by the NASA Nevada Space Grant Consortium[b] to fund the pilot course. Typically, co-teaching is discouraged at UNR due to the pressure to have faculty teach as many students as possible, but the teaching team felt that co-teaching was required to address this complex mix of pedagogical needs. The scholarly literature also indicated that this type of interdisciplinary approach is usually necessary to effectively address the needs of students enrolled in a STEAM course.[49][50]

Conceiving, planning, and teaching the pilot course involved more complex teaching arrangements than are typical of a course at UNR. The learning goals for this course, its funding arrangements, and its co-teaching logistics are outlined below. Additionally, this section provides a description of the course materials used, with heavy emphasis on the manner in which visual communication design instruction was facilitated; a detailed articulation of these appears in the visualization modules that occur on pages 56-58. While this course was intended for first-year STEM students and will be offered to them in the future, the pilot course described here was offered only to fourth-year students majoring in physics (due to logistical constraints). First-year STEM students were targeted in the course design because similar statistical and data visualization skills are required in each of the individual disciplines that constitute STEM at UNR, and teaching them to first-year students in an interdisciplinary course reduces the perception that certain techniques pertinent to data visualization might only be applicable within the confines of certain disciplines.[51]

Pedagogical Goals

As described in the previous section, this course was created and implemented to address several gaps in undergraduate students’ knowledge and skills, particularly those involving big data computation, statistical analysis, and data visualization skills. These gaps in student knowledge have appeared over roughly the past decade as university coursework has not kept pace with the increasingly heavy reliance on big data analysis in the arenas of government, industry, and research.[52][53] As such, the overarching aim of this course was to improve students’ preparedness for upper-level (i.e., third- and fourth-year undergraduate) big data courses and, ultimately, readiness for careers in these three arenas that now require big data processing and analysis. This course was planned to address this overarching aim by articulating and achieving the pedagogical goal of strengthening students’ big data computation, statistical analysis, and data visualization skills. As well as helping students improve their skills in each of the three targeted areas, this course was designed to help students attain an additional pedagogical goal related specifically to addressing the data visualization skills gap: that students would learn the difference between data visualization for facilitating analysis and data visualization for sharing knowledge.

Pedagogical context and demographics of the course

The total enrollment for the pilot course — a fourth-year undergraduate STEM course in physics taught in the spring of 2017, during which the data visualization content was introduced — was four. Each student brought a different set of skills to the course based on his or her previous academic preparation, but none of them had prior exposure to visual communication design principles. All of these students were fourth-year undergraduates majoring in atmospheric sciences, but minoring in different areas, including mathematics and creative writing. One student was a double major in atmospheric sciences and secondary education. The course met once per week for 2.5 hours, and the students were expected to complete the majority of their assignments during this span of time; 40% of their workload, including the data visualization component, was assigned to be completed outside of class time. The big data component of the course was completed in the first eight weeks of the semester, and included parameterizing and running a numerical weather prediction model, navigating directories and files in a Linux computing environment, and executing tasks in a high-performance computing environment. The statistical analysis component of the course was taught in the remainder of the classes, during which students learned to apply hypothesis testing, distribution fitting, and goodness-of-fit techniques to the data generated in the first half of the course.[c] The data visualization course materials were delivered online in the last half (seven weeks) of the semester. This provided the students with timely support for the chart creation that was a required aspect of their statistical analysis.

An overview of data visualization instruction

The online data visualization materials that were taught during the final portion of the semester were delivered in three modules of one to four weeks’ duration, consisting of readings, examples of visualizations, weekly activities, and a discussion board on which students were expected to submit completed activities each week. These online materials were supported by an in-class review of activities, frequent email communication between students and the instructors, and one-on-one meetings between students and the visual communication design faculty member outside of class time. In the first module, Introduction to Design for Visualization, students spent one week learning about the course’s interdisciplinary approach to data visualization, including being introduced to the discipline of visual communication design. They were also given an overview of the requirements, structure, and activities of the visualization segment of the course. In the four-week-long second module, Visualization for Analysis, the students were introduced to three visual communication design principles that are essential for effective data visualizations — effective color usage, legibility, and visual hierarchy — and to the standard visual conventions of three fundamental chart types: bar charts, line charts, and scatter plots. In each of the first three weeks of this module, students were introduced to one visual communication design principle, and given a data visualization activity in which they had to demonstrate effective use of that particular principle. In the final week of this module, students applied the principles and visual conventions they had learned during the previous three weeks to a data visualization for facilitating analysis activity using data they collected themselves[d], choosing from one of the three chart types covered in the data visualization component of the course. Bar charts, line charts, and scatter plots were chosen for the visualization for facilitating analysis activities because they are so commonly used in basic statistical analysis.

In the third and final module, Visualization for Sharing Knowledge, students learned about the intended audience, academic context, and purpose of visual abstracts, after which they completed an activity that involved working with a visual abstract. Visual abstracts are a relatively new kind of data visualization for sharing knowledge that presents an extremely simplified, visual overview of a research paper’s main argument, typically without headings, and with as few labels as possible. When they are used to support scholarly pieces published in academic journals, visual abstracts appear between the paper’s title and its written abstract. In the last few years, many scientific publishers have started requesting visual abstracts — also referred to as graphical abstracts or overview figures — in their submission guidelines for academic papers; in some instances they are listed as preferred, but their inclusion is increasingly becoming a compulsory requirement. Visual abstracts were chosen as the specific type of data visualization for sharing knowledge activity due to their widespread usage across a broad spectrum of disciplines, combined with a general lack of easily accessible, clear instruction regarding how best to design them.

All of the data visualization instruction that occurred during the final portion of the semester was scaffolded, so that each week’s coursework progressively built on and incorporated what the students had learned the previous week. Scaffolding is a term frequently used in educational literature to refer to the practice of providing educational resources in a step-by-step manner in which later materials and activities build on materials and activities presented previously. This technique is crucial to the process of teaching subject matter related to data visualization because it arranges learning materials and activities so that learned concepts build progressively upon one another, which affords opportunities for students to gain integrated understandings of given subject materials by the end of a course.[54][55] All of the materials used to help facilitate these instructional processes were also designed to provide maximum support for students as they engaged in work on their final project, a 3,000 word academic paper reporting the findings of their big data computational modeling and statistical analysis. As such, the scaffolded approach extended beyond the data visualization component of the course, so that students continued to implement newly acquired statistical and big data skills as they learned about data visualization techniques. This scaffolding technique has been identified by other authors teaching similar courses as a key way to balance and synthesize interdisciplinary content while encouraging active learning for the students.[56][57]

Each week as the course progressed, students watched a short video made by the visual communication design faculty member that contained an overview of the essential concepts and activities that would be emphasized that week. Additionally, they read one relatively brief reading on a specific visual communication design principle, such as legibility or color use, viewed examples of data visualizations, and completed one data visualization activity. The data visualization activities taught skills cumulatively: for example, the first data visualization activity required the students to design a line chart, bar chart, or scatter plot that had a high level of legibility, while the second week’s activity required them to produce a chart with a high level of legibility and an appropriate use of color. Students produced their data visualizations for analysis (as part of the second module) in R, a widely used open-source statistical analysis tool, thereby incorporating visual communication design practices into a common scientific workflow. Integrating the design of data visualizations into familiar processes in this way is recommended in much of the scholarly literature (which is why programs more commonly used in visual communication design schools, such as the Adobe suite, were avoided).[58] In the data visualization activities, students were provided with R code that generated a particular chart, with certain variables identified that they needed to change to practice that week’s design activity. This approach was deemed more appropriate than having students write all the R code themselves, due to the complexity of writing it, and the relatively short time students had to complete their visualization modules. Figure 4 shows a typical data visualization activity provided to students on the left (Figure 4a) with the resultant visualization output on the right (Figure 4b).
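The activity code given to students appears in Figure 4. Purely as an illustration of the format described above, a minimal sketch of such an exercise (with hypothetical data and placeholder variable names, not the actual course code) might look like the following, in which students run the script as provided and then alter only the marked variables to practice a given design principle:

```r
# Hypothetical sketch of a scaffolded visualization activity (not the actual
# course code). Run the script as-is, then change only the variables marked
# "CHANGE ME" to practice legibility and color use.

point_color <- "grey40"   # CHANGE ME: choose a more distinguishable color
label_size  <- 0.8        # CHANGE ME: increase for greater legibility
chart_title <- "Relative Humidity vs. Temperature"  # CHANGE ME: descriptive title

# Example data standing in for the students' model output
temperature <- c(12, 15, 18, 21, 24, 27, 30)
humidity    <- c(82, 75, 70, 64, 58, 51, 45)

plot(temperature, humidity,
     pch = 19, col = point_color,
     cex.lab = label_size, cex.axis = label_size, cex.main = label_size + 0.2,
     xlab = "Temperature (°C)", ylab = "Relative humidity (%)",
     main = chart_title)
```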

To produce their visual abstracts (data visualizations for sharing knowledge, designed in the third module), students used online visualization software called Plotly (https://plot.ly/). This software provides its users with a simple WYSIWYG user interface for adjusting data visualizations made with R code, and was deemed more appropriate than R for creating the simplified design work typically depicted in visual abstracts. The software had an additional benefit of enabling the production of both static and interactive versions of charts. Interactive features of Plotly charts include hover effects that display numerical values at any data point in the chart and responsive chart re-sizing. Students used images created from Plotly in their major assignment, which consisted of a final paper and final presentation. The final paper was formatted and structured in a style appropriate for submission to a peer-reviewed physics journal, and the presentation was prepared to a standard appropriate for a physics conference. In the final paper, static images from Plotly were used as visual abstracts, while the final presentations demonstrated the charts’ interactive elements.
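Although the students adjusted their charts through Plotly’s web interface rather than by writing additional code, the kind of interactive output described above can also be sketched directly from R with the plotly package. The following illustration, using hypothetical data, is offered under that assumption and does not reproduce the course workflow:

```r
# Illustrative sketch only (hypothetical data): an interactive chart with hover
# values and responsive re-sizing, produced with the plotly R package.
library(plotly)

temperature <- c(12, 15, 18, 21, 24, 27, 30)
humidity    <- c(82, 75, 70, 64, 58, 51, 45)

fig <- plot_ly(x = temperature, y = humidity,
               type = "scatter", mode = "lines+markers") %>%
  layout(title = "Relative humidity vs. temperature (hypothetical data)",
         xaxis = list(title = "Temperature (°C)"),
         yaxis = list(title = "Relative humidity (%)"))

fig  # printing the object renders the interactive chart with hover tooltips
```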

Figure 4: A visualization activity provided to students. Figure 4a (left) shows the code students were provided with, containing instructions and identifying variables to change. Figure 4b (right) shows the chart that the code produces when run in R.


Measurement of Course Effectiveness

Effectiveness of the pilot course was measured in two ways: assessment of students’ data visualization work as it was implemented in their final projects, and pre- and post-knowledge surveys administered to evaluate the effectiveness of the course’s three subject areas: working with big data sets, statistical analysis, and data visualization. Visual assessment of students’ data visualizations was chosen as a measure of effectiveness because skills acquisition (or lack thereof) is clearly demonstrated in students’ abilities to realize the effective visualization of complex data in the work they produced in response to their assigned coursework.[59] Pre- and post-knowledge surveys based on course content were chosen because they are a common measure of skills acquisition in science education research.[60][61][62] Students’ data visualization work throughout the various learning modules that constituted the bulk of the coursework, and that was submitted as part of their final projects, was assessed for evidence of high levels of legibility and a clearly discernible visual hierarchy. Legibility was determined to be of a high level if paper figures could be comfortably read by an expert audience with a normal range of vision at the extreme end of the contexts in which they were presented. For the final paper, this was determined by on-screen viewing at 100% magnification from a distance of three feet; for the final presentation, it was determined from an audience member’s view of a presentation screen 30 feet away. Students’ accomplishment in understanding and executing data visualization for facilitating analysis was assessed through their module 2 visualization activity submissions. The visual abstracts produced in module 3 were assessed for evidence of improved skills and increased understanding of data visualization for sharing knowledge. During these assessments, two questions were asked in relation to all of the students’ data visualizations. First: is all text in the data visualizations clearly legible? Second: do the data visualizations embody a clear usage of the visual principles that yield an effective visual hierarchy? An additional question was asked of the visual abstracts: are the visualizations for sharing knowledge easy for their particular audiences to understand? The pre- and post-knowledge surveys were based on the evidence-based, customizable UC Berkeley Course Evaluation Questions (http://teaching.berkeley.edu/course-evaluations-question-bank). The pre- and post-knowledge surveys instructed students to rank their own experience with big data skills, statistical methods, and visualization skills on the first day and the last day of classes, respectively. The post-course knowledge survey also assessed students’ views of their newly acquired skills for creating effective visualizations. Answers to the questions were presented in qualitative form on a Likert scale, ranging from “not well at all” to “extremely well.”

Overall, student work demonstrated progressive improvements in the application of color, configuration of typography, and establishment of visual hierarchy, resulting in the majority of the figures and visual abstracts that appeared in their final papers being significantly clearer in their communication of research findings than the students’ early visualization exercises had been. Figure 5 shows a typical example of the progression of visualization skills in terms of the implementation of visual communication design principles. In the first chart (Figure 5a), the student practiced implementing legibility, while in the second chart (Figure 5b), the student practiced implementing legibility in combination with effective color use and visual hierarchy. Three of the four students demonstrated the ability to effectively make use of all of the visual communication design principles they learned throughout the data visualization module in the visualizations that appeared in their final papers, while one student did not. This student used legibility effectively throughout his or her final paper, but demonstrated inconsistent use of appropriate and effective color and establishment of visual hierarchy.

Figure 5: Student 1’s visualization sample work. Figure 5a (left) is the first visualization exercise submission, where focus on legibility was required. Figure 5b (right) is one of the last visualization exercise submissions, where legibility, appropriate color use, and visual hierarchy were required.


Two of the four students demonstrated a clear understanding of the difference between data visualization for sharing knowledge and data visualization for facilitating analysis, evident in the differing design decisions in the visual abstracts and figures of their final papers. Figure 6 demonstrates effective student understanding of the different design decisions required of a visual abstract (Figure 6a) and a figure in an academic paper (Figure 6b). The other two students in the class demonstrated partial understanding of the distinction between data visualization for sharing knowledge and data visualization for facilitating analysis.

In the pre- and post-course evaluation questions relating to data visualization skills, students reflected that their skills had improved on several measures (see Table 1). These reflections were collected as self-assessed measures: students were given five reply options, which were converted to numerical values of 1 to 5 on a Likert scale for Table 1. Collectively, their scores demonstrated an increase in their confidence with creating data visualizations for analysis and for sharing knowledge. The questions about data visualization for facilitating analysis and for sharing knowledge listed in Table 1 are phrased slightly differently from the terms used in this paper in order to communicate with students in clear, simple language. Students also felt greater confidence after the course than before it in their understanding of legibility and visual hierarchy.

Figure 6: Student 2’s data visualization for sharing knowledge and data visualization for facilitating analysis work. Figure 6a (top) shows the visual abstract ‘in situ’ as it appears in the student’s paper. Figure 6b (bottom) shows a figure from the main text of the student’s paper. Note that Figure 6a contains less numerical and textual details and thicker line weights in the lines representing data, resulting in an overall simplified appearance preferable for a visualization for sharing knowledge.

Pedagogical Reflections

According to student self-assessment that was later shared with the instructors, the pilot course was successful in its aim of improving the industry and research relevance of students’ skill sets related to the visualization of big data. Upon completion of the course, students identified that the material covered was relevant to their intended future careers. The success of the pilot course depended upon securing external funding, which provided resources that made the course effective for students and viable for all participating educators. These resources included font licenses, a postgraduate research assistant’s salary for teaching purposes, a graduate student assistant’s salary to assist in conducting pedagogical research about the course, and supercomputer processing time. The grant also allowed some instructors to receive teaching stipends that made teaching this course as an overload viable, and it brought managerial support for the course, since externally funded projects are welcomed at the university.

On reflection, future iterations of this course could be strengthened by incorporating reflective discussion of data visualizations by instructors and students after each assignment, to encourage deeper understanding of the design principles learned. Future iterations could also be strengthened by allowing for reflective time among instructors, in which new negotiated understandings of the benefits of cross-disciplinary collaboration could emerge. Before embarking on this project, the teaching team did not fully appreciate that co-teaching is a specific teaching skill, let alone that each instructor had yet to master it. Co-teaching — the practice of multiple teachers providing instruction in the same class — can enrich a learning environment by imbuing it with multiple teacher perspectives, but it requires prior preparation specific to co-teaching on the part of instructors. Several miscommunications between instructors arose throughout the semester, sometimes resulting in contradictory information being shared with students, and other times making re-direction of student assignments necessary. The experience of the course could have been made richer for students if all the instructors who worked together to facilitate it had familiarized themselves with the foundational skills of co-teaching and committed to closer collaboration throughout the semester.[63][64] This interdisciplinary STEAM collaboration is a replicable model that has great promise for providing a much-needed base of knowledge and skill set to students at UNR and in other university and academy settings. If this model is used more broadly, it will be important to test its effectiveness in different university-level contexts and adjust the model according to the needs and learning styles of local students, and the curricula within which they are taught.

Table 1: Class collective pre- and post-test results for visualization component of the pilot course, demonstrating improvements in students’ confidence on all measures.

Conclusion

In the pilot course “Computational Skills for Big Data: Analysis, Statistics, and Visualization,” the distinction between data visualization for facilitating analysis and data visualization for sharing knowledge proved to be effective for teaching non-design students foundational principles of data visualization from a visual communication design perspective. Over the course of a 15-week semester, students’ data visualization skills improved, as demonstrated by improvement in their design of charts and a marked increase in their levels of confidence regarding their visualization skills. The authors are aware that the data visualization activities may also have strengthened students’ data literacy and statistical analysis skills; one limitation of the course evaluation is that it did not test for this outcome, which is an opportunity for further evaluation in a future study. The broader benefits of introducing undergraduates to big data skills in combination with visual communication design principles for visualizations include clearer communication of scientific findings across disciplines and improved career prospects for students. Due to the success of this pilot course, the ongoing course of the same name has been submitted for inclusion in the UNR curriculum and is currently under review. This fruitful course was made possible by the combination of external funding and collaboration between visual communication design and STEM faculty. Visual communication design faculty have the capacity to effect transformative pedagogical change to the degree that they are willing to engage in external grant seeking and interdisciplinary pedagogical collaboration.

Funding Statement

This project was funded in part by a Higher Education Curriculum Development sub-award from the Nevada NASA Space Grant Consortium: Award #NNX15AI02H.

References

  • Ali, S.M., Gupta, N., Nayak, G.K., Lenka, R.K. “Big data visualization: Tools and challenges.” In: 2016 2nd International Conference on Contemporary Computing and Informatics. pgs. 656–660, 2016. Online. Available at: https://doi.org/10.1109/IC3I.2016.7918044 (Accessed July 1, 2017).
  • Benson, N., Collin, C., Grand, V., Lazyan, M., Ginsburg, J., and Weeks, M. “George Armitage Miller: Cognitive Psychology.” In The Psychology Book: Big Ideas Simply Explained, pgs. 170–73. New York, NY, USA: DK Publishing, 2012.
  • Blair, J.A. “The Rhetoric of Visual Arguments.” In: Defining Visual Rhetorics, edited by C.A. Hill & M.H. Helmers, pgs. 41–62. London, UK: Routledge, 2004.
  • Kinross, R. “The Rhetoric of Neutrality.” In: Design Discourse: History, Theory, Criticism, edited by V. Margolin, pgs. 197–218. Chicago, US: University of Chicago Press, 1989.
  • Crider, A. “Teaching Visual Literacy in the Astronomy Classroom.” New Directions for Teaching and Learning, 141 (2015): pgs. 7–18. Online. Available at: http://onlinelibrary.wiley.com/doi/10.1002/tl.20118/abstract (Accessed June 2, 2017).
  • Davis, M. Graphic Design Theory. London, UK: Thames & Hudson, 2012.
  • Dur, B.I.U. “Data Visualization and Infographics In Visual Communication Design Education at The Age of Information.” Journal of Arts and Humanities, 3.5 (2014): pgs. 39–50. Online. Available at: http://theartsjournal.org/index.php/site/article/view/460 (Accessed June 2, 2017).
  • Ekbia, H., Mattioli, M., Kouper, I., Arave, G., Ghazinejad, A., Bowman, T., Suri, V.R., Tsou, A., Weingart, S., and Sugimoto, C.R. “Big data, bigger dilemmas: A critical review.” Journal of the Association for Information Science and Technology, 66.8 (2015): pgs. 1523–1545. Online. Available at: https://doi.org/10.1002/asi.23294 (Accessed July 2, 2017).
  • Ellwein, A.L., Hartley, L.M., Donovan, S., and Billick, I. “Using Rich Context and Data Exploration to Improve Engagement with Climate Data and Data Literacy: Bringing a Field Station into the College Classroom.” Journal of Geoscience Education, 62.4 (2014): pgs. 578–86. Online. Available at: http://nagt-jge.org/doi/abs/10.5408/13-034 (Accessed July 17, 2017).
  • Gentile, L., Caudill, L., Fetea, M., Hill, A., Hoke, K., Lawson, B., Lipan, O., et al. “Challenging Disciplinary Boundaries in the First Year: A New Introductory Integrated Science Course for STEM Majors.” Journal of College Science Teaching, 41.5 (2012): pgs. 44–50. Online. Available at: https://facultystaff.richmond.edu/~dszajda/research/papers/challenging_disciplinary_boundaries.pdf (Accessed June 2, 2017).
  • Gilbert, J.K. “Visualization: An Emergent Field of Practice and Enquiry in Science Education.” In Visualization: Theory and Practice in Science Education, edited by J.K. Gilbert, M. Reiner, and M. Nakhleh, pgs. 3–24. Dordrecht, Netherlands: Springer Netherlands, 2008.
  • Gordin, D.N. and Pea, R.D. “Prospects for Scientific Visualization as an Educational Technology.” The Journal of the Learning Sciences, 4.3 (1995): pgs. 249–79. Online. Available at: https://telearn.archives-ouvertes.fr/hal-00190593/document (Accessed June 1, 2017).
  • Hill, M., Sharma, M.D., Johnston, H. “How online learning modules can improve the representational fluency and conceptual understanding of university physics students.” European Journal of Physics, 36.4 (2015): pgs. 1–20. Online. Available at: https://doi.org/10.1088/0143-0807/36/4/045019 (Accessed July 1, 2017).
  • Johnson, C., Moorhead, M., Munzner, T., Pfister, H., Rheingans, P., and Yoo, T.S. NIH-NSF Visualization Research Challenges Report. IEEE Computer Society, 2006. Available at: http://nrs.harvard.edu/urn-3:HUL.InstRepos:4138744 (Accessed July 17, 2017).
  • Klug, J.L., Carey, C.C., Richardson, D.C., and Gougis, R.D. “Analysis of High-Frequency and Long-Term Data in Undergraduate Ecology Classes Improves Quantitative Literacy.” Ecosphere, 8.3 (2017): pgs. 1–13. Online. Available at: http://onlinelibrary.wiley.com/doi/10.1002/ecs2.1733/full (Accessed July 17, 2017).
  • Kohnle, A., Douglass, M., Edwards, T.J., Gillies, A.D., Hooley, C.A., Sinclair, B.D. “Developing and evaluating animations for teaching quantum mechanics concepts.” European Journal of Physics, 31.6 (2010): pgs. 1441–1455. Online. Available at: https://doi.org/10.1088/0143-0807/31/6/010 (Accessed July 17, 2017).
  • Kostelnick, C. “The Visual Rhetoric of Data Displays: The Conundrum of Clarity.” IEEE Transactions on Professional Communication, 50.4 (2007): pgs. 280–294. Online. Available at: https://doi.org/10.1109/TPC.2007.908725 (Accessed July 10, 2017).
  • Krause, K. “A Framework for Visual Communication at Nature.” Public Understanding of Science, 26.1 (2017): pgs. 15–24. Online. Available at: http://journals.sagepub.com/doi/abs/10.1177/0963662516640966 (Accessed June 2, 2017).
  • Krug, S. Don’t Make Me Think: A Common Sense Approach to Web Usability. Berkeley, CA, USA: New Riders, 2006.
  • Land, M.H. “Full STEAM Ahead: The Benefits of Integrating the Arts Into STEM.” Procedia Computer Science, 20 (2013): pgs. 547–52. Online. Available at: http://www.sciencedirect.com/science/article/pii/S1877050913011174 (Accessed July 12, 2017).
  • Langen, T.A., Mourad, T., Grant, B.W., Gram, W.K., Abraham, B.J., Fernandez, D.S., Carroll, M., Nuding, A., Balch, J.K., Rodriguez, J., and Hampton, S.E. “Using Large Public Datasets in the Undergraduate Ecology Classroom.” Frontiers in Ecology and the Environment, 12.6 (2014): pgs. 362–63. Online. Available at: http://onlinelibrary.wiley.com/doi/10.1890/1540-9295-12.6.362/abstract (Accessed July 17, 2017).
  • Lester, J.N. and Evans, K.R. “Instructors’ Experiences of Collaboratively Teaching: Building Something Bigger.” International Journal of Teaching and Learning in Higher Education, 20.3 (2009): pgs. 373–82. Online. Available at: http://files.eric.ed.gov/fulltext/EJ869322.pdf (Accessed July 12, 2017).
  • Lidwell, W., Holden, K., and Butler, J. Universal Principles of Design. Beverly, MA, USA: Rockport Publishers, Inc., 2003.
  • Maeda, J. “STEM + Art = STEAM.” The STEAM Journal, 1.1 (2013): Article 34. Online. Available at: http://scholarship.claremont.edu/steam/vol1/iss1/34/ (Accessed July 12, 2017).
  • Marshall, J.A., Castillo, A.J., Cardenas, M.B. “The Effect of Modeling and Visualization Resources on Student Understanding of Physical Hydrology.” Journal of Geoscience Education, 63.2 (2015): pgs. 127–139. Online. Available at: https://doi.org/10.5408/14-057.1 (Accessed June 2, 2017).
  • Mathewson, J.H. “Visual-Spatial Thinking: An Aspect of Science Overlooked by Educators.” Science Education, 83.1 (1999): pgs. 33–54. Online. Available at: http://onlinelibrary.wiley.com/doi/10.1002/(SICI)1098-237X(199901)83:1%3C33::AID-SCE2%3E3.0.CO;2-Z/abstract (Accessed June 1, 2017).
  • McDonald, A. “In Between: Challenging the Role of Graphic Design by Situating It in a Collaborative, Interdisciplinary Class.” In Design Studies: Theory and Research in Graphic Design, edited by A. Bennett, pgs. 354–69. New York, NY, USA: Princeton Architectural Press, 2006.
  • Meirelles, I. Design for Information. Beverly, MA, USA: Rockport, 2013.
  • Mercer-Mapstone, L.D., Kuchel, L.J. “Integrating Communication Skills into Undergraduate Science Degrees: A Practical and Evidence-Based Approach.” Teaching & Learning Inquiry, 4.2 (2016): pgs. 1–14. Online. Available at: https://doi.org/10.20343/teachlearninqu.4.2.11 (Accessed June 3, 2017).
  • Milner-Bolotin, M., and Nashon, S.M. “The Essence of Student Visual–spatial Literacy and Higher Order Thinking Skills in Undergraduate Biology.” Protoplasma, 249.S1 (2012): pgs. 25–30. Online. Available at: https://link.springer.com/article/10.1007%2Fs00709-011-0346-6 (Accessed June 2, 2017).
  • Peterlin, P. “Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite.” European Journal of Physics, 31.4 (2010): pgs. 919–931. Online. Available at: http://iopscience.iop.org/article/10.1088/0143-0807/31/4/021/meta (Accessed June 1, 2017).
  • Psotka, J. “Educational Games and Virtual Reality as Disruptive Technologies.” In Educational Technology & Society, 16.2 (2013): pgs. 69–80. Online. Available at: http://www.ifets.info/journals/16_2/7.pdf (Accessed May 18, 2017).
  • Reiner, M. “Seeing Through Touch: The Role of Haptic Information in Visualization.” In Visualization: Theory and Practice in Science Education, edited by J.K. Gilbert, M. Reiner, and M. Nakhleh, pgs. 73–84. Dordrecht, Netherlands: Springer Netherlands, 2008.
  • Reiner, M. “Sensory Cues, Visualization and Physics Learning.” International Journal of Science Education, 31.3 (2009): pgs. 343–64. Online. Available at: http://www.tandfonline.com/doi/abs/10.1080/09500690802595789 (Accessed June 1, 2017).
  • Rodríguez Estrada, F.C. and Davis, L.S. “Improving Visual Communication of Science Through the Incorporation of Graphic Design Theories and Practices Into Science Communication.” Science Communication, 37.1 (2015): pgs. 140–48. Online. Available at: http://journals.sagepub.com/doi/abs/10.1177/1075547014562914 (Accessed June 2, 2017).
  • Segel, E., and Heer, J. “Narrative Visualization: Telling Stories with Data.” IEEE Transactions on Visualization and Computer Graphics, 16.6 (2010): pgs. 1139–48. Online. Available at: http://vis.stanford.edu/papers/narrative (Accessed June 16, 2017).
  • Sheppard, S.R.J. “Landscape Visualisation and Climate Change: The Potential for Influencing Perceptions and Behaviour.” Environmental Science & Policy, 8.6 (2005): pgs. 637–54. Online. Available at: http://www.sciencedirect.com/science/article/pii/S1462901105001188 (Accessed June 1, 2017).
  • Shoresh, N. and Wong, B. “Points of View: Data Exploration.” Nature Methods, 9.1 (2012): pg. 5. Online. Available at: http://www.nature.com/nmeth/journal/v9/n1/full/nmeth.1829.html (Accessed June 17, 2017).
  • Trumbo, J. “Visual Literacy and Science Communication.” Science Communication, 20.4 (1999): pgs. 409–25. Online. Available at: http://journals.sagepub.com/doi/abs/10.1177/1075547099020004004 (Accessed June 2, 2017).
  • Trumbo, J. “Essay: Seeing Science.” Science Communication, 21.4 (2000): pgs. 379–91. Online. Available at: http://journals.sagepub.com/doi/abs/10.1177/1075547000021004004 (Accessed June 1, 2017).
  • Valle, M. “Visualization: A Cognition Amplifier.” International Journal of Quantum Chemistry, 113.17 (2013): pgs. 2040–52. Online. Available at: http://onlinelibrary.wiley.com/doi/10.1002/qua.24480/abstract (Accessed June 2, 2017).
  • Vande Moere, A. and Purchase, H. “On the Role of Design in Information Visualization.” Information Visualization, 10.4 (2011): pgs. 356–71. Online. Available at: http://journals.sagepub.com/doi/abs/10.1177/1473871611415996 (Accessed June 2, 2017).
  • Wong, B. “Points of View: Color Coding.” Nature Methods, 7.8 (2010): pg. 573. Online. Available at: http://www.nature.com/nmeth/journal/v7/n8/full/nmeth0810-573.html (Accessed July 16, 2017).
  • Wong, B. “Points of View: The Design Process.” Nature Methods, 8.12 (2011): pg. 987. Online. Available at: https://www.nature.com/nmeth/journal/v8/n12/full/nmeth.1783.html (Accessed June 17, 2017).
  • Yakman, G. “Recognizing the A in STEM Education.” Middle Ground, August 2012. Online. Available at: https://steamedu.com/recognizing-the-a-in-stem-education/ (Accessed July 13, 2017).
  • Zhang, Z.H. and Linn, M.C. “Can Generating Representations Enhance Learning with Dynamic Visualizations?” Journal of Research in Science Teaching, 48.10 (2011): pgs. 1177–98. Online. Available at: http://onlinelibrary.wiley.com/doi/10.1002/tea.20443/abstract (Accessed June 2, 2017).

Biographies

Katherine Hepworth is a communication design practitioner-researcher currently working as an Assistant Professor of Visual Journalism at the University of Nevada, Reno. Her research interests center on how communication design artifacts mediate power relationships, with a particular focus on the power implications of data visualization across disciplines. This focus has led her to develop and lead current projects on ethical data visualization in the digital humanities, efficacy and ethics in big data visualization, and pedagogical research on improving educational strategies related to visualization in the sciences. Dr. Hepworth also has a broad research interest in improving communication effectiveness in higher education. In her research, teaching, and professional practice, Dr. Hepworth takes a human-centered approach to communication design, prioritizing the cultivation of broader understandings of people’s lived experience of communication design artifacts.

Chelsea Canon is a Ph.D. student in Geography at the University of Nevada, Reno. Her research identifies key factors influencing the formation of collaboration and communication networks linking climate scientists to other researchers in physical science, social science, and the humanities, as well as to resource managers and stakeholders. Her goal is to provide actionable information about how to initiate, grow, and sustain these networks. Prior to studying science communication, Ms. Canon received a B.A. in Spanish and Spanish American Studies from Mills College in Oakland, California. Her M.S. in Geography, also from the University of Nevada, Reno, used material culture artifacts to consider and contest a commonly accepted narrative about Nevada’s mining history. This interdisciplinary background informs her current research interests.

Notes

  • a

    These visual conventions, while widely accepted, have sometimes been challenged by prominent data visualization scholars. Notably, Edward Tufte suggests axis lines are not always necessary. However, we consider such removal of key design elements “advanced moves” that are best utilized by expert practitioners only, and do not recommend them for the majority of practitioners or data visualization users.

  • b

    The NASA Nevada Space Grant Consortium has a mission to develop Nevada's college and pre-college curricular and informal STEM education projects. This proposal was funded because the proposed course clearly fulfilled this mission. NASA Space Grant Consortia based in most states offer similar programs.

  • c

    For readers unfamiliar with these terms, brief definitions follow. Hypothesis testing: a method of statistical inference. Distribution fitting: fitting a probability distribution to data containing repeated measurements of a variable phenomenon. Goodness-of-fit techniques: determining whether sample data can be considered to follow a given specified distribution.
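    As a concrete illustration of how these three techniques relate, the following short sketch applies them to a single set of simulated measurements. It assumes a Python environment with NumPy and SciPy (the note above does not prescribe any particular software) and is offered only as an illustrative example, not as the course’s implementation.

    import numpy as np
    from scipy import stats

    # Simulated repeated measurements of a variable phenomenon (assumed data).
    rng = np.random.default_rng(seed=0)
    measurements = rng.normal(loc=10.0, scale=2.0, size=500)

    # Hypothesis testing: a one-sample t-test of whether the true mean is 10.
    t_result = stats.ttest_1samp(measurements, popmean=10.0)

    # Distribution fitting: estimate the parameters of a normal distribution
    # from the sample.
    mu, sigma = stats.norm.fit(measurements)

    # Goodness of fit: a Kolmogorov-Smirnov test of the sample against the
    # fitted normal distribution.
    ks_result = stats.kstest(measurements, "norm", args=(mu, sigma))

    print(f"t-test p-value: {t_result.pvalue:.3f}")
    print(f"fitted normal: mu = {mu:.2f}, sigma = {sigma:.2f}")
    print(f"KS goodness-of-fit p-value: {ks_result.pvalue:.3f}")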

  1. Science, Technology, Engineering, Art (and design), and Math.

  2. Science, Technology, Engineering, and Math. The origin of the acronym “STEM” has been attributed to Dr. Judith Ramaley, the former director of the education and human resources division of the U.S. National Science Foundation. She had originally coined the acronym “SMET” as she was working to help develop curricula that had the potential to enhance learning among science, mathematics, engineering and technology students around the country. She altered “SMET” to “STEM” to allow science and math to “serve as bookends for technology and engineering,” and because she reportedly “didn’t like the sound of [the] word [SMET].” Excerpted from Christenson, J. “Used Nationwide,” Winona Daily News, 13 November 2011. Online. Available at: http://www.winonadailynews.com/news/local/ramaley-coined-stem-term-now-used-nationwide/article_457afe3e-0db3-11e1-abe0-001cc4c03286.html (Accessed January 30, 2018).

  3. Crider, A. “Teaching Visual Literacy in the Astronomy Classroom.” New Directions for Teaching and Learning, 141 (2015): pgs. 7–18.

  4. Mathewson, J.H. “Visual-Spatial Thinking: An Aspect of Science Overlooked by Educators.” Science Education, 83.1 (1999): pgs. 33–54.

  5. Milner-Bolotin, M., and Nashon, S.M. “The Essence of Student Visual–spatial Literacy and Higher Order Thinking Skills in Undergraduate Biology.” Protoplasma, 249.S1 (2012): pgs. 25–30.

  6. Valle, M. “Visualization: A Cognition Amplifier.” International Journal of Quantum Chemistry, 113.17 (2013): pgs. 2040–2052.

  7. Trumbo, J. “Visual Literacy and Science Communication.” Science Communication, 20.4 (1999): pgs. 409–425.

  8. Vande Moere, A. and Purchase, H. “On the Role of Design in Information Visualization.” Information Visualization, 10.4 (2011): pgs. 356–371.

  9. Ali, S.M., Gupta, N., Nayak, G.K., Lenka, R.K. “Big data visualization: Tools and challenges.” In 2016 2nd International Conference on Contemporary Computing and Informatics, pgs. 656–660, 2016.

  10. Mathewson, J.H. “Visual-Spatial Thinking: An Aspect of Science Overlooked by Educators.” Science Education, 83.1 (1999): pgs. 33–54.

  11. Milner-Bolotin, M., and Nashon, S.M. “The Essence of Student Visual–spatial Literacy and Higher Order Thinking Skills in Undergraduate Biology.” Protoplasma, 249.S1 (2012): pgs. 25–30.

  12. Valle, M. “Visualization: A Cognition Amplifier.” International Journal of Quantum Chemistry, 113.17 (2013): pgs. 2040–2052.

  13. Yakman, G. “Recognizing the A in STEM Education.” Middle Ground, August 2012.

  14. Land, M.H. “Full STEAM Ahead: The Benefits of Integrating the Arts Into STEM.” Procedia Computer Science, 20 (2013): pgs. 547–552.

  15. Maeda, J. “STEM + Art = STEAM.” The STEAM Journal, 1.1 (2013): pgs. 1–3.

  16. Shoresh, N. and Wong, B. “Points of View: Data Exploration.” Nature Methods, 9.1 (2012): pg. 5.

  17. Trumbo, J. “Essay: Seeing Science.” Science Communication, 21.4 (2000): pgs. 379–391.

  18. Gordin, D.N. and Pea, R.D. “Prospects for Scientific Visualization as an Educational Technology.” The Journal of the Learning Sciences, 4.3 (1995): pgs. 249–279.

  19. Johnson, C., Moorhead, M., Munzner, T., Pfister, H., Rheingans, P., and Yoo, T.S. NIH-NSF Visualization Research Challenges Report. IEEE Computer Society, 2006.

  20. Valle, M. “Visualization: A Cognition Amplifier.” International Journal of Quantum Chemistry, 113.17 (2013): pgs. 2040–2052.

  21. Ellwein, A.L., Hartley, L.M., Donovan, S., and Billick, I. “Using Rich Context and Data Exploration to Improve Engagement with Climate Data and Data Literacy: Bringing a Field Station into the College Classroom.” Journal of Geoscience Education, 62.4 (2014): pgs. 578–586.

  22. Gilbert, J.K. “Visualization: An Emergent Field of Practice and Enquiry in Science Education.” In Visualization: Theory and Practice in Science Education, edited by J.K. Gilbert, M. Reiner, and M. Nakhleh, pgs. 3–24.

  23. Reiner, M. “Sensory Cues, Visualization and Physics Learning.” International Journal of Science Education, 31.3 (2009): pgs. 343–364.

  24. Zhang, Z.H. and Linn, M.C. “Can Generating Representations Enhance Learning with Dynamic Visualizations?” Journal of Research in Science Teaching, 48.10 (2011): pgs. 1177–1198.

  25. Klug, J.L., Carey, C.C., Richardson, D.C., and Gougis, R.D. “Analysis of High-Frequency and Long-Term Data in Undergraduate Ecology Classes Improves Quantitative Literacy.” Ecosphere, 8.3 (2017): pgs. 1–13.

  26. Gentile, L., Caudill, L., Fetea, M., Hill, A., Hoke, K., Lawson, B., Ovidiu Lipan, et al. “Challenging Disciplinary Boundaries in the First Year: A New Introductory Integrated Science Course for STEM Majors.” Journal of College Science Teaching, 41.5 (2012): pgs. 44–50.

  27. Gordin, D.N. and Pea, R.D. “Prospects for Scientific Visualization as an Educational Technology.” The Journal of the Learning Sciences, 4.3 (1995): pgs. 249–279.

  28. Krause, K. “A Framework for Visual Communication at Nature.” Public Understanding of Science, 26.1 (2017): pgs. 15–24.

  29. Sheppard, S.R.J. “Landscape Visualisation and Climate Change: The Potential for Influencing Perceptions and Behaviour.” Environmental Science & Policy, 8.6 (2005): pgs. 637–654.

  30. Dur, B.I.U. “Data Visualization and Infographics In Visual Communication Design Education at The Age of Information.” Journal of Arts and Humanities, 3.5 (2014): pgs. 39–50.

  31. Segel, E., and Heer, J. “Narrative Visualization: Telling Stories with Data.” IEEE Transactions on Visualization and Computer Graphics, 16.6 (2010): pgs. 1139–1148.

  32. Blair, J.A. “The Rhetoric of Visual Arguments.” In Defining Visual Rhetorics, edited by C.A. Hill & M.H. Helmers, pgs. 41–62.

  33. Kinross, R. “The Rhetoric of Neutrality.” In Design Discourse: History, Theory, Criticism, edited by V. Margolin, pgs. 197–218.

  34. Kostelnick, C. “The Visual Rhetoric of Data Displays: The Conundrum of Clarity.” IEEE Transactions on Professional Communication, 50.4 (2007): pgs. 280–294.

  35. Psotka, J. “Educational Games and Virtual Reality as Disruptive Technologies.” Educational Technology & Society, 16.2 (2013): pgs. 69–80.

  36. Reiner, M. “Seeing Through Touch: The Role of Haptic Information in Visualization.” In Visualization: Theory and Practice in Science Education, edited by J.K. Gilbert, M. Reiner, and M. Nakhleh, pgs. 73–84.

  37. Reiner, M. “Sensory Cues, Visualization and Physics Learning.” International Journal of Science Education, 31.3 (2009): pgs. 343–364.

  38. Wong, B. “Points of View: Color Coding.” Nature Methods, 7.8 (2010): pg. 573.

  39. Lidwell, W., Holden, K., and Butler, J. Universal Principles of Design. Beverly, MA, USA: Rockport Publishers, Inc., 2003.

  40. Meirelles, I. Design for Information. Beverly, MA, USA: Rockport, 2013.

  41. Benson, N., Collin, C., Grand, V., Lazyan, M., Ginsburg, J., and Weeks, M. “George Armitage Miller: Cognitive Psychology.” In The Psychology Book: Big Ideas Simply Explained, pgs. 170–173.

  42. Krause, K. “A Framework for Visual Communication at Nature.” Public Understanding of Science, 26.1 (2017): pgs. 15–24.

  43. Wong, B. “Points of View: The Design Process.” Nature Methods, 8.12 (2011): pg. 987.

  44. Krug, S. Don’t Make Me Think: A Common Sense Approach to Web Usability. Berkeley, CA, USA: New Riders, 2006.

  45. Rodríguez Estrada, F.C. and Davis, L.S. “Improving Visual Communication of Science Through the Incorporation of Graphic Design Theories and Practices Into Science Communication.” Science Communication, 37.1 (2015): pgs. 140–148.

  46. Davis, M. Graphic Design Theory. London, UK: Thames & Hudson, 2012.

  47. Mathewson, J.H. “Visual-Spatial Thinking: An Aspect of Science Overlooked by Educators.” Science Education, 83.1 (1999): pgs. 33–54.

  48. Gilbert, J.K. “Visualization: An Emergent Field of Practice and Enquiry in Science Education.” In Visualization: Theory and Practice in Science Education, edited by J.K. Gilbert, M. Reiner, and M. Nakhleh, pgs. 3–24.

  49. Land, M.H. “Full STEAM Ahead: The Benefits of Integrating the Arts Into STEM.” Procedia Computer Science, 20 (2013): pgs. 547–552.

  50. Yakman, G. “Recognizing the A in STEM Education.” Middle Ground, August 2012.

  51. Gentile, L., Caudill, L., Fetea, M., Hill, A., Hoke, K., Lawson, B., Ovidiu Lipan, et al. “Challenging Disciplinary Boundaries in the First Year: A New Introductory Integrated Science Course for STEM Majors.” Journal of College Science Teaching, 41.5 (2012): pgs. 44–50.

  52. Crider, A. “Teaching Visual Literacy in the Astronomy Classroom.” New Directions for Teaching and Learning, 141 (2015): pgs. 7–18.

  53. Vande Moere, A. and Purchase, H. “On the Role of Design in Information Visualization.” Information Visualization, 10.4 (2011): pgs. 356–371.

  54. Marshall, J.A., Castillo, A.J., Cardenas, M.B. “The Effect of Modeling and Visualization Resources on Student Understanding of Physical Hydrology.” Journal of Geoscience Education, 63.2 (2015): pgs. 127–139.

  55. Mercer-Mapstone, L.D., Kuchel, L.J. “Integrating Communication Skills into Undergraduate Science Degrees: A Practical and Evidence-Based Approach.” Teaching & Learning Inquiry, 4.2 (2016): pgs. 1–14.

  56. Ellwein, A.L., Hartley, L.M., Donovan, S., and Billick, I. “Using Rich Context and Data Exploration to Improve Engagement with Climate Data and Data Literacy: Bringing a Field Station into the College Classroom.” Journal of Geoscience Education, 62.4 (2014): pgs. 578–586.

  57. Langen, T.A., Mourad, T., Grant, B.W., Gram, W.K., Abraham, B.J., Fernandez, D.S., Carroll, M., Nuding, A., Balch, J.K., Rodriguez, J., and Hampton, S.E. “Using Large Public Datasets in the Undergraduate Ecology Classroom.” Frontiers in Ecology and the Environment, 12.6 (2014): pgs. 362–363.

  58. Peterlin, P. “Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite.” European Journal of Physics, 31.4 (2010): pgs. 919–931.

  59. Reiner, M. “Sensory Cues, Visualization and Physics Learning.” International Journal of Science Education, 31.3 (2009): pgs. 343–364.

  60. Kohnle, A., Douglass, M., Edwards, T.J., Gillies, A.D., Hooley, C.A., Sinclair, B.D. “Developing and evaluating animations for teaching quantum mechanics concepts.” European Journal of Physics, 31.6 (2010): pgs. 1441–1455.

  61. Hill, M., Sharma, M.D., Johnston, H. “How online learning modules can improve the representational fluency and conceptual understanding of university physics students.” European Journal of Physics, 36.4 (2015): pgs. 1–20.

  62. Marshall, J.A., Castillo, A.J., Cardenas, M.B. “The Effect of Modeling and Visualization Resources on Student Understanding of Physical Hydrology.” Journal of Geoscience Education, 63.2 (2015): pgs. 127–139.

  63. Lester, J.N. and Evans, K.R. “Instructors’ Experiences of Collaboratively Teaching: Building Something Bigger.” International Journal of Teaching and Learning in Higher Education, 20.3 (2009): pgs. 373–382.

  64. McDonald, A. “In Between: Challenging the Role of Graphic Design by Situating It in a Collaborative, Interdisciplinary Class.” In Design Studies: Theory and Research in Graphic Design, edited by A. Bennett, pgs. 354–369. New York, NY, USA: Princeton Architectural Press, 2006.