International Journal of Human-Computer Studies



Introduction
City administrations have long generated and analyzed a plethora of data about their jurisdictions to understand patterns and trends and to plan accordingly. Much of these data have, however, been relatively dispersed and closed in nature, held within the organization that generated them. The move to open data as part of a transition towards open government has led to urban data being corralled into open data repositories and becoming accessible to all (Kitchin, 2014). While urban data are now increasingly available, the skills and literacy to handle, process, analyze and visualize such data are lacking. One solution to these issues has been to create city dashboards that translate these data into visualizations to aid understanding. City dashboards are, therefore, created to instill a sense of accountability for public institutions to the larger civilian population (Lněnička and Máchová, 2015). Indeed, city dashboards have become a popular means for organizing and visualizing urban data for a broad constituency of users: analysts, policymakers, politicians, and the public alike.
In this context, a vehicular dashboard is often used as a metaphor to describe what city dashboards are, how they are operated by citizens, and how they are used as data processing tools by different types of professional agencies (Batty, 2015; Few, 2006). Multiple user-types make use of these data in different ways; for example, the driver or mechanic can use this information to make informed decisions about driving or servicing the vehicle. This includes historical data (service mileage), current data (vehicle speed), and information pertaining to the vehicle's potential future (fuel levels). This information helps the owner, driver, or mechanic to determine whether they should continue to drive the vehicle or act otherwise accordingly. Notably, a vehicle dashboard does not tell these stakeholders how to solve any of the various technical issues that may arise from traveling in the vehicle. The same is true for city dashboards: they display quantifiable data about a city's status in space and time, but they do not principally state how citizens, city management, or private enterprises should act; they display only the information needed to react to potentially influential issues highlighted in the data. City dashboards are gaining in popularity and are currently constructed to provide citizens and city management with the information required to build knowledge, but not necessarily to provide them with any direct services.
City dashboards use a suite of visual analytics - dynamic and/or interactive graphics (e.g. gauges, traffic lights, meters, arrows, bar charts, graphs, maps) - to display and communicate information about the performance, structure, and patterns and trends of cities. Often these visual displays are interactive, with users able to select, filter and query data, zoom in/out and pan, and overlay data. Because the data used are recurrent, quantitative measures, many of the visualizations show change over time and are updated as new data are released. In some cases, dashboards display real-time data that update every few seconds or minutes. By utilizing the power of the visual to summarize and convey a large amount of information, city dashboards enable a user to quickly and effectively explore the characteristics and structure of datasets to identify patterns and interpret trends. As such, they act as cognitive tools that improve a user's span of control over voluminous, varied, and quickly transitioning data (Brath and Peters, 2004).
In practice, city dashboards act as a middleware for data collection and sharing, as well as providing location-based services and mobile, environment-focused information, and can be considered a form of urban informatics (Foth, 2009). In this context, Foth has identified urban informatics as a combination of research from a varied assortment of academic studies, ranging from the urban (urban studies, urban planning, etc.), to the social (media studies, communication studies, cultural studies, etc.), and the technical (computer science, software design, human-computer interaction, etc.) (Foth, 2009). Fundamentally, this requires the adaptation, development, and piloting of innovative information communication technology (ICT) and information visualization projects for application in real-world settings (Bilandzic and Venable, 2011). The success of these applications in an urban informatics setting depends on the extent to which they are accepted and adopted by citizens and effectively used in community or policy processes. It is therefore essential that new platforms within this domain are thoroughly explored from a user's perspective. Cities are an important area of application for both ubiquitous computing (ubicomp) and ICTs. However, urban visualizations presented on city dashboards, their appropriate diffusion into urban routine, and the provision and management of services remain problematic. To design and develop new technologies that engage citizens in cities, new forms of online participation are required to make the best use of the latest ICT (Batty et al., 2012).
As city dashboards can potentially engage with areas of social, cultural, and urban studies to bring further understanding to the complexities of modern city landscapes, the success of such endeavors requires a close open-data partnership with city councils, local communities, and organizations, as well as public state and government institutions. The communication and dissemination of open data via city dashboards, new sources of urban data (such as city-specific issues, plans, and policies), and the creation of new platforms all require the use of new smart city technologies. In most cases, ICTs and ubicomp are applied: ICT is an extensional term for certain types of information technology (IT) that work towards the unification of communications technology and computers (Christensson, 2010), while ubicomp describes computing that is created to appear anytime and everywhere (Weiser, 1991). ICT and ubicomp, therefore, include systems that enable access, storage, transmission, and manipulation of digital information in a smart city or modern urban context.

Problem space, related work, and positioning in contemporary HCI research
In general, research concerning city dashboards focuses on open data policy guidelines from the perspective of the data publisher (Open Data Barometer, 2017). One critique of current dashboard systems is that they are not created with effectiveness, efficiency, or user satisfaction principles concerning usability in mind (Kitchin and McArdle, 2017). From observing city dashboards in practice, it seems that the creators of city dashboards are accustomed to conceptualizing the people who use the systems they develop (De Cindio et al., 2007). Unfortunately, this often means that a passive role is assigned to users, and user-focused design protocol is often secondary or neglected altogether. This has led to the observation that city dashboards are not always intuitive to use and at times they leave the user frustrated and unable to complete simple tasks (Kitchin and McArdle, 2017).
Additionally, Kitchin & McArdle reported that city dashboards are engineered as data portals that perform specific, pre-set functions, with seemingly little thought given to the holistic effects of functionality, usability, or user experience. It is also apparent that many dashboards do not place much value on visual aesthetics or interface design paradigms (Kitchin and McArdle, 2017). In a broader set of papers based on their experience of researching city dashboards and building the original Dublin Dashboard, they provided an extensive range of critiques concerning the production and use of city dashboards (Kitchin et al., 2016; Kitchin and McArdle, 2016; McArdle and Kitchin, 2016b). They summarize their concerns into six main critiques, which they frame in relation to a set of questions:
1. Epistemology: how are insight and value derived from city dashboards?
2. Scope and access: how comprehensive and open are city dashboards?
3. Veracity and validity: to what extent can we trust city dashboards?
4. Usability and literacy: how comprehensible and usable are city dashboards?
5. Use and utility: what are the applications and value of city dashboards?
6. Ethics: how can we ensure that dashboards do not cause harm and are used in a socially responsible manner?
This analysis raises several fundamental and instrumental issues about how city dashboards work in producing knowledge about cities and how they are used in urban planning and management. Rather than reject the use of city dashboards, Kitchin and McArdle instead recognize their utility and value as a mode of communication and means of making sense of the city, but suggest that for dashboards to work, the questions above need adequate redress. In this paper, we are concerned with questions of epistemology, usability and literacy, and the extent to which city dashboards are currently designed to facilitate effective use by their users.
In response to these concerns, we propose to include both a usability-centric review of relevant human-computer interaction (HCI) work and contemporary digital civic-oriented research with additional social computing perspectives. Specifically, this comprises user-centered design (UCD) principles applied to website design and the evaluation of data visualization techniques, as it has been suggested that the aesthetic dimensions of visual design should also be applied to graphical, multimodal, and virtual interfaces in the digital domain to increase the impact of user experience (Bollini, 2017). HCI research has validated multiple evaluation techniques from a user's perspective that place much more relevance on the users of a system in the design process (Abras et al., 2004). While HCI evaluation focuses on the design of ICT-based products and services, we further suggest that urban informatics also enriches our research with examples of other types of human-computer interface artifacts that can be used within smart cities.
Research from HCI provides evidence that the acceptance of new technology has two primary determinants: perceived usefulness and perceived ease-of-use (Davis, 1989). Furthermore, to extend our user-focused research, community informatics applications, which are at the forefront of emergent theoretical framings for public-focused technologies, are also required (Erete, 2013). In addition, while advancements in digital civics enable governments and policymakers to engage with and gather input from a broader spectrum of the public, it is necessary to understand how communities interact with emergent smart-city technologies and how to make sense of the community-produced data (Mahyar et al., 2019). Targeted user-centered research holds the prospect of providing insight into how publics engage with technologies to participate in local democratic processes and predicting the potential impact that new technologies can have on communities in the future (Gurstein, 2000). Community informatics, therefore, draws our attention to the importance of the opinions of the various stakeholders in these communities, particularly their interests and the roles they can play, as emphasized through the concept of participation in the design, development, and research of community-focused technologies (Halabi et al., 2015).

City dashboard evaluations from a user's perspective
The nature of community informatics and city dashboards in an urban informatics context should, therefore, focus on the evaluation of perceived usefulness and ease-of-use of new technology from multiple stakeholder viewpoints; however, this approach alone can potentially lack rigor from an HCI perspective (de Moor, 2007). By exploring new applications of ICT in an urban informatics context, we can continue to study and learn more about how people and technology form relationships in everyday life (Gordon and Mihailidis, 2016; McCarthy and Wright, 2004). HCI and its focus on interaction design and usability studies, combined with more contemporary, civic-oriented research, provides us with an inclusive and cross-disciplinary approach for the innovation of technologies that can add value to citizen engagements with open data. Equally, urban informatics studies create real-world evaluation contexts that can inform HCI research into user requirements for future city dashboard developments.
While digital civics has been used as a starting point for including the perspectives and experiences of the public more broadly, further balance can be found by including end-user perspectives in system design specifications. This will help support the creation of meaningful digital interventions that facilitate civic engagement as performed by both communities and public officials (Corbett and Le Dantec, 2018). While dashboard developers can aid the communication and interpretation of data through open data and visual analytics, and support collaborative or individual approaches to understanding how a city is performing, data literacy and making sense of urban data remain a challenge (Mahyar et al., 2019).
Our research positioning was therefore focused on evaluating the quality of design effectiveness and usability from a city dashboard user's perspective. The aim was to examine city dashboard users by surveying city dashboard interface practices, gathering insight into the creation of effective website designs and data visualization techniques, and identifying the specific data content users choose to engage with. Within this analysis, the accepted ISO definition of usability (ISO, 2018) was adopted as a core element to inform the research practices implemented, as multiple HCI methodologies exist for the evaluation of such topics. Our study thus applied a protocol analysis from HCI to explore four existing city dashboard systems. By applying a structured model of analysis, it was possible to highlight specific areas of concern that could then be translated into guidelines and recommendations that inform future city dashboard system design and support city dashboard users in performing a diverse set of tasks.

Analysis, guidelines, and recommendations for future city dashboard systems
It should be the aim of any public-facing city dashboard project to construct a proficient system for presenting many different users with temporal and spatial data that are seamlessly informative and meaningful. For this to be effective, a dashboard needs to be, on the one hand, designed using established design principles, and on the other, designed around the specific needs of its prospective community of users. There has been much research aimed at formulating general principles of usability for human-computer interaction (Shneiderman et al., 2016). Usability can be generally regarded as ensuring that website interaction is easy to learn, effective, and enjoyable from the perspective of the user (Nielsen, 1994). Therefore, to incorporate usability into the creation of a city dashboard, it is important to have purposefully constructed, well-designed, and robustly validated interface guidelines. Furthermore, with respect to data visualization, a fundamental aspect of city dashboard design, graphics need to present complex ideas with clarity, precision, and efficiency (Tufte, 2001). Maps, likewise, need to adopt established map design principles (Robinson, 1958; Tyner, 2014). These guidelines are intended to address the common pitfalls in the presentation of scientific data to the public and provide a means to guide and assess the design of quality city dashboards. Such guidelines appear to have seen little implementation in many city dashboards, which suffer from several website design, data visualization, and fundamental map design pitfalls that limit effective communication of the status of a city. Moreover, no guidelines that are specifically tailored to city dashboard design exist.
By discovering and understanding the fundamental elements of ICT that users engage with when interacting with quality city dashboards, the application of a more focused design framework and evaluation practice can be explored. For example, the creation of a new city dashboard would be greatly facilitated by targeting design system elements and user requirements that are of quantifiable concern, as informed through user-interaction observations. This is particularly useful given the lack of specific guidelines for quality city dashboards. Our approach to considering dashboard design strategies has therefore been to consult with users about their knowledge and experiences of city dashboards, with our questions informed by existing design guidelines found in similar HCI literature. To do this, we have applied a qualitative methodology of data generation and explored a structured model of user-data analysis.
The strength of open-ended user-focused examination in this context is the ability to provide complex descriptions of how the user cohort interacted with and understood the city dashboards they engaged with. This methodology provided us with in-depth information about the human element of dashboard usability issues; that is, the often-contradictory behaviors, beliefs, opinions, emotions, and relationships that develop between people and the technology they use (Mack et al., 2005). Moreover, qualitative methods were effective at identifying the less-tangible factors of human-computer communications whose role in city dashboard evaluation may not currently be apparent, such as social norms, socioeconomic status, gender, ethnicity, and religion. In our case, we used a combination of interviews, protocol analysis that elicited verbal reports through concurrent think-aloud sessions, and critical incident technique (CIT) procedures to collect interaction data of significance to the participants, exploring user experiences on four specific city dashboard systems: Dublin, London, Hawaii, and New York.

The four case study city dashboards
Many cities now possess a city dashboard, though many of them take similar forms, especially if they are produced using commercial software such as Socrata or Tableau. The four dashboards chosen for the study were selected based upon several high-level criteria for the comparison of open-data platforms. We sought four dashboards that had taken different approaches to dashboard design and had varying look, feel, scope, and tools. Specific considerations were data sources and veracity; variation in the visualization techniques applied; the dashboard creators' motivations; funding sources; and the self-classification of the data presented. Consideration was also given to the intended target audience, the use of software licenses, interface features, data transformations, data aggregation, and the use of application programming interfaces (APIs). The four dashboards were selected by the full research team with the aim of getting user feedback on the varying approaches and tools to guide the re-designing of the project's city dashboard. As far as we are aware, none of the dashboards involved user feedback in their planning and design beyond user requirements from the city office commissioning the dashboard (and even this did not happen in the case of London).

Dublin dashboard (dublindashboard.ie)
The Dublin Dashboard (NIRSA, 2014) was produced by the Programmable City project and the All-Island Research Observatory (AIRO) at Maynooth University, in collaboration with Dublin City Council. The project was created to provide Irish citizens, public service employees, and private businesses with access to thematically grouped, real-time, and time-series indicator data, as well as interactive maps. The dashboard was funded through the European Research Council (ERC) and Science Foundation Ireland (SFI).
The Dublin Dashboard is optimized to run on a web browser and consists of 11 top-level modules and numerous sub-modules, many of which are hosted by other websites; see Fig. 1 for examples. The landing page presents the user with a mix of bespoke applications developed specifically for the project and curated collections of tools and applications that were developed by other ventures. The design of the website is based on classic information seeking and browsing, where overview data is first presented, followed by further details on demand (Shneiderman, 1996). There are eight main points of interest for the user to explore on the dashboard. Data visualizations on the Dublin Dashboard were created using Highcharts (an SVG-based, multi-platform charting library), Leaflet (an open-source mapping JavaScript library), and proprietary software such as ArcGIS, InstantAtlas and Tableau. For a more in-depth account of the Dublin Dashboard design and functionality, see McArdle and Kitchin (2016a).

Hawaii dashboard (dashboard.hawaii.gov)
The state of Hawaii launched its Open Performance Hawaii (State of Hawaii, 2014) website as part of the state's IT/IRM Transformation Strategic Plan, 2014. In the pursuit of contemporary open-government philosophies, the Hawaii Dashboard was created to be accessible by different types of users for viewing recipient-specific government spending through hypertextual representations of data arranged in a catalog format; see Fig. 2 for examples. The site is operated by Socrata, a government service provider that consults with governing bodies on how to build, manage, and develop digital initiatives and programs. The site allows the user to search the website, access the data catalog directly, take a tutorial on how to use the data, and provides a link to a developer website to facilitate API access for new projects. There are seven main navigation points of the site for users to engage with. As it is built upon the Socrata system, the Hawaii Dashboard is an archetypical example of a commercial online city dashboard app hosted in a web browser. The state-run website presents the public with a broad set of information via data visualizations of, for example, budget and economy, education, healthcare and seniors, energy, agriculture and environment, public safety, and open government. Users can monitor the state's performance through the comparison of historic and more current data as key performance indicators (KPIs). The performance with respect to targets is visualized with a green tick or red cross. Linked beneath these indicators are more in-depth data, presenting a graph of annual trends and a link to data sources. There is little detail about how the data are derived or how the public is supposed to use this information.
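The tick/cross indicator logic described above can be illustrated with a minimal sketch. The function name, signature, threshold rule, and sample values are assumptions for illustration only, not the actual Socrata implementation:

```python
def kpi_status(current: float, target: float, higher_is_better: bool = True) -> str:
    """Return a tick if the current value meets the target, else a cross.

    A simplified stand-in for the green-tick / red-cross indicator
    shown on the Hawaii Dashboard; the comparison rule is assumed.
    """
    met = current >= target if higher_is_better else current <= target
    return "tick" if met else "cross"

# Hypothetical KPI values for illustration only.
print(kpi_status(current=82.0, target=80.0))                        # target met
print(kpi_status(current=5.1, target=4.0, higher_is_better=False))  # target missed
```

A real implementation would also handle missing data and tolerance bands around the target, which the dashboard itself does not document.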

London dashboard (citydashboard.org/london)
The London dashboard (CASA Research Lab, 2018) is an alpha prototype city dashboard that was created to link London data to an iPad data wall in City Hall (Smart London Board, 2013). It is an example of an at-a-glance dashboard that summarizes and aggregates quantitative real-time data for the city of London and displays this information using a modularized interface and interactive map; see Fig. 3 for examples of these data modules. The project was created in 2012 by members of the Centre for Advanced Spatial Analysis (CASA) at University College London, as part of the National e-Infrastructure for Social Simulation (NeISS) project, funded by Jisc. The data provided in the display are sourced from a diverse set of data suppliers using APIs from jQuery, OpenLayers, and Google. Citizens can view real-time information about the weather, air pollution, public transport, public bike availability, river levels, electricity demand, the stock market, and Twitter trends relating to London, as well as view live traffic camera feeds and the happiness level of the city. These data are also geospatially mapped using OpenStreetMap.

New York dashboard (datausa.io/profile/geo/new-york-ny)
The New York dashboard (Data USA, 2014) is part of the larger Data USA project that was developed by the MIT Media Lab. The project aims to make all open-government data available and accessible to citizens across the United States. The project was started in 2014 and is directed by Deloitte, Datawheel, and Professor Cesar Hidalgo of the MIT Media Lab. The Data USA project has a large, multidisciplinary team comprising economists, data scientists, designers, researchers, and business executives who have spent many years working with policymakers, government officials, and citizens.
The New York section of the Data USA website presents users with data on the state, the metropolitan area, the city, and other smaller local areas within the city. For the study presented, city-level data was chosen. The landing page displays an aerial shot of Manhattan with six static statistics: population, median age, median household income, poverty rate, number of employees, and median property values. Below this are six sections, each representing more specific thematic data categories. Each thematic subcategory has a short descriptive sentence supported with a data visualization, see Fig. 4. The city data are presented on a single-page application that is divided into the following six themes:
1. About New York - a high-level breakdown, including population, median age, household income, number of universities, etc.
2. Economy - data visualizations of economy-related data, including wages, occupations, and industries.
3. Health & safety - health and crime-related data, including healthcare cover, hospital care for medical patients, and health risks.
4. Diversity - demographic data, including age, heritage, and military service.
5. Education - higher education data relating to the student population, the area of specialty, and university costs.
6. Housing & living - property-related data, such as household income, housing, and transportation.
The individual data sources are accessible by the user and are from multiple sources; for example, the American Community Survey, Bureau of Economic Analysis, Bureau of Labor Statistics, and others. The data on the site can also be accessed via the Data USA API, and each visualization can be saved, shared, or compared to other locations in the USA.

Analysis of city dashboards
For the analysis of the four city dashboards, a concurrent think-aloud (CTA) protocol was implemented (Lewis, 1982). This process sought to facilitate insight into the participants' cognitive processes during their interactions with each of the dashboards. CTA is commonly used in usability studies to understand participants' thoughts as they interact with a system by having them think aloud while they work. Empirical evidence suggests that when following CTA protocols, more problems can be detected by means of observation (Van Den Haak et al., 2003). By applying this technique, we gained insights into the participants' thoughts as and when they occurred and as they attempted to work through any issues they encountered. Furthermore, CTA allowed us to elicit real-time feedback and emotional responses for each of the individual dashboards.

Recruitment
Recruitment took place in the Republic of Ireland over a period of nine months from June 2017 to March 2018. Members of the public were sought through social media using the Twitter accounts of both the project and the dashboard (over 1000 followers). The recruitment strategy sought to target members of the four local authorities responsible for managing the city for which we are re-building a dashboard, along with other stakeholders outside of this region and members of the public across Ireland via an email invitation to participate. Within these stakeholder groups, participants were sought that had some familiarity with data handling and visualization, including those that might be considered expert users. All interview sessions were conducted with counterbalanced measures to decrease the chances that the order in which the four dashboards were presented might adversely influence the results. In the case of the experiment presented, the four city dashboard conditions required 24 orders of treatment (4 × 3 × 2 × 1), and the number of required participants was therefore calculated as a multiple of 24. We therefore targeted a sample of 24 users given that: (a) it would be difficult to recruit double this number, both within the small group of officials available to the study through the stakeholders and given the difficulties we encountered in recruiting members of the public interested in city dashboards; and (b) given the in-depth nature of the study, involving one-hour CTA sessions, we felt that sufficient data and depth of knowledge would be produced to quickly reach saturation, wherein few additional insights would be apparent in the data (Fusch and Ness, 2015; Glaser and Strauss, 2017). If the latter proved not to be the case in practice, then we would have sought to extend the sample through intensive new rounds of recruitment, but this did not arise (as was evident in our analysis). Twenty-four participants were, therefore, initially recruited for the study; however, three participants later withdrew from the experiment due to scheduling conflicts and a second date could not be arranged. The final participant group consisted of 11 males and 10 females (n = 21). The median age band for the group was 35 to 44. The education level (NFQ scale) of the participant group was: Advanced Certificate (level 6), n = 2; Honors Bachelor's Degree (level 8), n = 7; Master's Degree (level 9), n = 11; and Doctoral Degree (level 10), n = 1. All participants were working within the ISCO-08 employment categories of: Technical/Engineer, n = 9; Management/Executive, n = 6; Science/Medicine, n = 4; and Clerical/Office, n = 2.
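The full counterbalancing arithmetic above (4 × 3 × 2 × 1 = 24 orders of treatment, one per targeted participant) can be sketched as follows. This is an illustrative reconstruction of the ordering scheme, not the actual assignment procedure used in the study; the participant labels are hypothetical:

```python
from itertools import permutations

# Every possible presentation order of the four dashboards: 4! = 24.
dashboards = ["Dublin", "London", "Hawaii", "New York"]
orders = list(permutations(dashboards))
assert len(orders) == 24  # one order per targeted participant

# With 24 targeted participants, each can be assigned a unique order,
# so every possible sequence is presented exactly once.
assignments = {f"P{i + 1:02d}": order for i, order in enumerate(orders)}
print(len(assignments))  # 24
```

This is why the sample size was planned as a multiple of 24: with fewer participants, some presentation orders would go unused and order effects could not be fully balanced.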

Interview methodology
The dashboard counterbalancing measures were randomly assigned to each participant in advance of their scheduled meeting. Participants, therefore, explored all four city dashboards in a randomized order. All sessions were conducted face-to-face, at locations and times around Ireland that suited the individual's requirements; this included both workplace visits at local authority offices and home visits. All sessions were recorded, and each session generally lasted about an hour. Each user interaction session began by explaining the research project and the interview session format that was to follow. Each participant was asked at this stage to quantify on a continuous scale of 0 to 100, and verbally explain, their current knowledge and understanding of the city dashboard domain, and to identify their previous experiences and motivations to use such systems. The participants, therefore, self-identified as technically competent users who belonged to the dashboard user-types of advanced users, end-users, and novice users; see Fig. 5.
Next, participants were asked to explore the four city dashboards using the CTA protocol, in which they were encouraged to verbalize their thoughts and actions (Lewis, 1982). Participants were asked to say whatever came to mind as they explored different areas of the dashboards; this included what they were looking at, thinking, doing, and feeling at that time. Where participants naturally finished talking, their statements were probed further via interview laddering to reveal subconscious motives (Hawley, 2009). During this time, and to further facilitate the analysis of the collected interview data, critical incident technique (CIT) procedures were followed to collect contextual information relating to critically significant exchanges and observed behaviors that occurred during the session (Flanagan, 1954). For each dashboard element that gained attention from the participant, the interviewer made note of, and elicited where appropriate, further details:
1. The cause of any critical incidents.
2. The participant's feelings towards the incident.
3. The actions that were taken because of the incident.
4. Changes that could be made to repeat/rectify the situation.
Observational notes were also used to highlight specific instances in the interview where what the participants said contrasted with what they did, specifically noting areas of the dashboard interaction where participants encountered some difficulty.

Results
Prior to analysis, all data were transcribed, and user codes were assigned for anonymity. The data were then examined using a content analysis (CA) over a period of three months. A CA is a research method for studying communication artifacts and making replicable and valid inferences through the interpretation and coding of transcripts (Denzin and Lincoln, 2008). The CA explored the communication of city dashboard quality artifacts and examined patterns in user communications in a systematic manner. This involved the methodical reading of transcripts and the creation and assignment of codes that indicated the presence of interesting or meaningful content that could be used to describe or make inferences about the characteristics of a quality city dashboard. This systematic approach made it possible to quantitatively analyze each individual city dashboard and gain insight into the users' understanding of the discipline. Coherent thought-units were extracted from the transcripts, where a single thought-unit represented a contiguous or holistic statement (Hatfield and Weider-Hatfield, 1978). Each thought-unit was then reviewed for further division into coherent single statements (or thought-subunits), as the participant pool all exhibited different experiential quality criteria within individual thought-units. The collated single statements were then systematically categorized and subjected to analysis to develop a categorical system of related statements and to highlight interesting or meaningful criteria for city dashboards. These individual single statements were then matched for semantic similarities, removing redundancies.
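The quantitative side of this coding step reduces to tallying single statements by dashboard and category. A minimal sketch is given below; the coded items are hypothetical examples, not the study's data.

```python
from collections import Counter

# Hypothetical coded single statements: (dashboard, tier-2 category) pairs.
coded_statements = [
    ("Dublin", "navigation"), ("New York", "navigation"),
    ("Dublin", "navigation"), ("London", "style"),
]

# Count statements per category, and per (dashboard, category) pair.
per_category = Counter(category for _, category in coded_statements)
per_dashboard = Counter(coded_statements)
```

Tallies of this kind underlie the per-dashboard statement counts reported in the Results sections below.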

To further reduce the pool of statements and to add supplementary validation to the content analysis process, an affinity diagramming workshop was conducted by three project researchers to group semantically similar words or phrases under a collective category, or to split categories into different elements, using human and subject matter knowledge (Rosenfeld and Morville, 2002). This process generated hierarchical content categories in a bottom-up procedure (Beyer and Holtzblatt, 1999). The participants of this workshop were fully aware of how the content data were generated, were familiar with the city dashboard quality criteria, and were able to identify the specific dimensions of subjective quality in city dashboard design that were expressed in the data. In total, 164 unique content categories were identified in this analysis. In two four-hour workshops, this process iteratively characterized these into a three-tier content category hierarchy, resulting in a hierarchical representation of the criterion dimensions of experiential quality expressed across all sessions. Specifically, three-level categorizations were developed for effective web design, effective data visualization, and dashboard data types. See Table 1 for a data inventory of participant responses and Tables 2-4 for a 3-tier hierarchical representation of all CA categories, single statement counts, and a brief synopsis of each category.
The CIT analysis focused upon the intentionality and implication of dashboard design strategies, identifying possible complications associated with major user-system interactions and providing a qualitative breakdown of user sentiments towards each of the four systems. The CIT was carried out by three project researchers, compensating for any potential biasing, where majority consensus was required for positive, neutral, and negative sentiment identification. These CIT methods generated a list of positive and negative behaviors that were used for individual dashboard performance appraisal. From the combined analysis of researcher notes and the collected transcripts, CIT data were analyzed and organized within the same 3-tier hierarchy to represent the participants' thoughts and attitudes towards incidents for each of the dashboard systems, see Fig. 6. This breakdown highlighted how the individual website design, visualization, and content of all four systems were discretely influential to the overall user evaluations.
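The majority-consensus rule for sentiment identification can be sketched as follows; this is a simplified illustration of the decision rule, not the authors' actual tooling.

```python
from collections import Counter

def consensus(labels):
    """Return the majority sentiment among three coders' labels,
    or None when all three disagree (no majority)."""
    label, count = Counter(labels).most_common(1)[0]
    return label if count >= 2 else None
```

For example, `consensus(["positive", "positive", "neutral"])` yields `"positive"`, while a three-way split yields `None` and would be flagged for discussion among the coders.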

Discussion of results
From the analysis of CA data, specific areas of interest were identified, contributing to our knowledge of existing dashboard design interaction. The CIT data were used to expand CA areas and identify the unique elements of the four dashboards viewed that were more successful or unsuccessful regarding incident outcomes. These data are further supported with examples taken from the verbal data.

Navigation
During the interviews, 1130 single statements were recorded under the tier 2 CA category relating to dashboard navigation. Participant interest in this area related specifically to the website navigation methods implemented on each of the four city dashboards analyzed: Dublin = 399; Hawaii = 234; New York = 276; London = 221. The CIT revealed that navigational incidents experienced on the New York dashboard were resolved with the most positive outcomes and that the London dashboard measured the least favorably, see Fig. 7. The main criticisms participants expressed across the four dashboards were that the pages they were viewing were not laid out logically and that the data modules being displayed appeared unstructured and, therefore, inconsistent and irregular from an information architecture perspective. The structural design of the information environment for the London dashboard was deemed particularly problematic. For Dublin and Hawaii, the overall navigation of the website was excessively complex, with a disproportionate amount of clicking required for exploring or seeking out data. The single-page application methodology and data module structures executed on the New York dashboard were met with overwhelmingly favorable responses, as they maintained consistency and clearly divided data categories thematically.
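The diverging stacked bar charts used to report CIT outcomes (e.g. Fig. 7) centre each bar's neutral segment on zero, so negative incident counts extend left and positive counts extend right. The layout arithmetic can be sketched as below; the incident counts are illustrative, not the study's data.

```python
def diverging_offsets(neg, neu, pos):
    """Left edges of the negative, neutral, and positive segments
    of one bar, centring the neutral segment on zero."""
    left_neg = -(neg + neu / 2.0)
    left_neu = -neu / 2.0
    left_pos = neu / 2.0
    return left_neg, left_neu, left_pos

# Illustrative incident counts per dashboard: (negative, neutral, positive).
counts = {"New York": (15, 8, 60), "London": (40, 10, 12)}
offsets = {city: diverging_offsets(*c) for city, c in counts.items()}
```

These left edges can then be passed to any horizontal bar-plotting routine to reproduce the diverging layout, with bars for more positively resolved dashboards extending further right of zero.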

Style
Each of the four city dashboards presented a unique website design style. In total, this CA area received 619 single statements, with many incidents relating to the look and feel of the different systems: Dublin = 221; Hawaii = 111; New York = 157; London = 130; see Fig. 9. Overall, the participants expressed partiality for the systematic use of colors, typeface, and the overall style of the New York website; a dashboard project that boasts a large multidisciplinary team that includes professional designers. In contrast, the styles of the Dublin, Hawaii, and London dashboards were criticized for their lack of overall coherence of design, corporate 'cookie-cutter' stylization, and a general lack of basic or modern design values, respectively. Both the Dublin and London dashboards displayed little attention to the application of a coherent style guide: a collection of pre-designed elements, graphics, and rules that ensure that separate website pieces are consistent and create a cohesive experience.
"[The New York dashboard] it's beautifully presented. It's a work of art... my favorite look and feel." -PLN_499 (novice user) "The London one may be awesome in terms of the information... but it's not awesome to look at." -VMF_529 (novice user) "[The Hawaii dashboard]... it's just so textual and so boring." -GOT_164 (end-user) "Stylistically, on the Dublin dashboard... information is not easy to read or easy to understand." -APU_881 (end-user)

Visualizations
One of the fundamental elements of a city dashboard is the data visualizations it displays. In our study, the visual elements of the four city dashboards received a total of 577 single statements: Dublin = 221; Hawaii = 94; New York = 179; London = 83. These statements covered many issues relating to the types of visualizations used and the use of maps and images in support of the more traditional visual communication techniques applied. From the analysis of CIT data, the visualization methodologies presented on the New York dashboard were the most well-received, communicating information both clearly and efficiently, see Fig. 10. This dashboard was highlighted as being proficient in communicating information clearly, efficiently, and correctly using maps for displaying data; making complex data more accessible, understandable, and usable.

Veracity
The veracity of the data presented on city dashboards received a total of 424 single statements in the CA category of effective data visualization: Dublin = 109; Hawaii = 88; New York = 133; London = 94. This category included issues of displaying where and when the data were collected, when they were last updated, the clarity with which the data sources were presented, and the degree to which the data were perceived to be accurate, precise, or to be trusted. In this case, it was critical incidents on the London dashboard that were considered to have the most positive outcomes, see Fig. 11. On the London dashboard, the source of the data and the update timeframe were dynamically displayed, communicating to the user the real-time nature of the data and the API source. Although this information was also presented on the other dashboards, it was not always clear or prominent. Therefore, the other three dashboards received mixed responses. Many of the data displayed on these dashboards were out of date, or their sources were perceived as being untrustworthy. In the context presented on each of these city dashboards, the impact and meaning of data veracity were received quite differently. The participants were aware of bias, abnormalities or inconsistencies, and duplication potentially impacting upon the accuracy of the data.
"[On the Dublin dashboard] I'm confident in knowing that a person is responsible for making it and if I want to talk to them, I know I can do that." -AOR_375 (end-user) "[On the Dublin dashboard] it makes it more trustworthy in a sense because it's not just ad hoc, you put up there then, you know? It seems to be verified; I think anyway." -KHI_515 (end-user) "[For New York] There's like a disclaimer at the start... Which is good. It lets people know that, while it might work on a big scale, it might not always be as accurate on a smaller scale." -IDO_272 (advanced user)

Users
References to different types of dashboard user received a total of 365 single statements: Dublin = 98; Hawaii = 70; New York = 96; London = 101. The critical incidents around this issue pointed to an awareness of the different types of users and their requirements when interacting with city dashboards. In particular, the user cohort was aware that certain areas of the dashboards viewed were not appropriate for all types of users, such as novice users, end-users, or advanced users, for several different reasons. Firstly, some of the complex data analysis and visualizations used were not deemed effective for the delivery of a coherent message for novices. Secondly, the variations in data veracity and flexibility of the visualizations used were often not stringent enough for official use by end-users. Finally, access to raw data sources and data manipulation was frequently constrained for use by advanced users. Although there were issues for all dashboards in serving all users effectively, the Hawaii website and its use of KPIs and its catalog system were praised for potentially serving many data user types, and the New York site for its ability to access APIs and share data visualizations. By dividing the potentially broad target market for city dashboards into subsets of consumers with common needs, wants, demands, or characteristics, the dashboards equipped themselves with the appropriate tools to handle specific queries, see Fig. 12. While there are efforts being made to make these four dashboards more user-friendly, they should also serve to enable users to form a better data-driven understanding of a city.
"It's not for... like if my mother looked at this [London dashboard], she wouldn't know what she's looking at." -SOT_205 (end-user) "[For the Dublin dashboard] if there was an American company that was expanding and was looking for different cities to move to, then having this information easily available is the kind of thing that they'd be interested in." -RAJ_136 (advanced user) "I would never in a million years want to use someone else's graph or bar chart, like ever. But as you know, there's other users that were like, 'Yes, that's all I have to do, let's just take this instead'." -AOR_375 (end-user)

Data types
Of the effective data CA category, data types were the most commented upon area of interest (SS = 293), serving to highlight the differences between the types of temporal data that the dashboards chose to display: Dublin = 82; Hawaii = 56; New York = 96; London = 59; see Fig. 13. Fundamentally, by providing both real-time and historic data content, participant interest was focused on the use of open, free, and reusable content, and on information that was delivered and displayed immediately, such as real-time sensor data and alerts. These types of data reassured the users that all information was up to date and was being monitored in real-time. The use of historic data supplied the users with an explorable catalog of data for a contextual understanding of the often-interrelated data themes. Each of the city dashboard websites was motivated to make information easy to access and use. Interestingly, the New York dashboard also included 'Stories' on the site's main menu, which feature highlights and interesting outputs from the different data sets. Cesar Hidalgo, one of the site's creators and director of the MIT Media Lab's Macro Connections group, told the website CityLab: 'People do not understand the world by looking at numbers, they understand it by looking at stories' (Misra, 2016). This approach to city data made the Data USA system stand out from the other city dashboards viewed.
"I can see that it is current because the graphics keep changing." -AOR_375 (end-user) "I can see the camera, which at least makes you think it's live." -CPK_931 (end-user) "I think people are interested in the story behind the actual graphics. And it's always good to have a story to tell, so that people can show their friends and say, 'Listen, isn't this interesting here what has been happening'." -GOT_164 (end-user)

Usability
The CA category for usability received a total of 205 single statements: Dublin = 27; Hawaii = 30; New York = 52; London = 19. This subcategory of effective web design focused upon interactions with data that encouraged further action and the degree to which the dashboards fulfilled issues relating to effectiveness, efficiency, and satisfaction in use. Critical incidents relating to the usability of the New York dashboard were resolved with the highest positivity, see Fig. 14. In combination with the other positively reviewed attributes of this site, the relationship between user perceptions of a system's look and feel and its usability was apparent (Tractinsky et al., 2000). Furthermore, the lack of coherent style applied on the London dashboard caused several interaction issues, and the lack of consistency in the information presented on the website meant that the system was not as explorable as the other systems. The four dashboards each implemented different effective website designs and strategies for communicating the relevant information quickly to the users; the differences in CIT outcomes can be seen in Fig. 15. This analysis included the provision of at-a-glance data modules, the use of social media for communicating new data, and giving the users a brief summary of the information being viewed. In this category, the London dashboard provided our users with multiple at-a-glance modules that displayed real-time data on one page; the New York dashboard also displayed six atoms of historic data on its landing page. However, Dublin and Hawaii did not display any data on their landing pages and, therefore, users had to be enticed to dig deeper to find data.
"…out of all of these, I would only consider one to be an actual dashboard and that is the London one because it's the only one that actually presents information that I don't need to click."-PNL_499 (novice user) "I think the New York one with the stats at a glance was easier to read and get information out of." -KHI_515 (end-user)

Toward design guidelines for building city dashboards
By embedding design- and usability-focused evaluation methods from HCI early in the creation of a user-centered city dashboard, it is possible to gain valuable insight into the community informatics issues of user interactions with city dashboards and provide some forethought into their potential use by community users (Corbett and Le Dantec, 2018; Mahyar et al., 2019). In our study, we were able to observe first-hand how the quality of existing systems is perceived, which tools are potentially useful in different activities, and how the different approaches to city dashboard design can be implemented and changed across different systems. This is particularly advantageous when applying new and emergent ubicomp, ICT, and other new mixed reality interaction methods that can potentially be used for urban data visualization projects (Young et al., 2017). If the impact of a city dashboard is found to be wanting in an early evaluation, where the original design goals are not met or new problems arise from the introduction of new and unfamiliar technology, a decision can be made to reevaluate the research direction and to then redesign and re-implement the system. In this way, a new city dashboard can better meet the needs of its stakeholders, founded on user-identified shortcomings.
Although, in this case, think-aloud methodologies proved effective in dashboard evaluation and comparison, they still do not closely evaluate the effects and impact of specific elements of dashboard design on users. However, the application of quality design guidelines and other documentation throughout the developmental stages of a project, with both expert and novice users alike, would move towards addressing some of these shortfalls (Nielsen, 2005). The first stage of producing city dashboard design guidelines was to develop an empathic understanding of the problems faced by users when using existing dashboard systems. By observing, engaging, and empathizing with users directly, we generated data that built up our own knowledge and understanding of their experiences and motivations to visit or use city dashboards. Furthermore, to expand this study, we sought to immerse ourselves in the larger city dashboard domain from an urban informatics perspective (Kitchin and McArdle, 2016; Lněnička and Máchová, 2015), to gain a deeper understanding of the issues users face by exploring diverse city dashboard projects and then comparing these results across the four chosen systems. Empathy building was a crucial element in creating the design guidelines, as the practice of understanding the community and user cohort's thoughts and feelings allowed the project to put to one side existing assumptions about the domain and to gain insight into city dashboard users and their fundamental needs and requirements. By forming a better understanding of the users of city dashboards, it was possible to then explore the reported perceptual quality and user requirements that were conveyed as being fundamental for a quality dashboard system and to create a list of requirements in the form of system design guidelines and suggestions, see Table 5.
System design guidelines are intended to shape how a system is conceived, planned, and implemented. Nielsen has stated that 'It may be one of the defining characteristics of next-generation user interfaces that they abandon the principle of conforming to a canonical interface style and instead become more radically tailored to the requirements of individual tasks' (Nielsen, 1992). To achieve this level of specialization, the creation of organization-specific guidelines that reflect the needs and tasks of the users, and not those of the developer or managers, is required (Henninger et al., 1995). Usability guidelines have proven highly durable and have been shown to hold true over time (Nielsen, 2005). Current guideline trends have moved towards brevity over detail, and most contemporary guideline documents are concise, provide a basic overview, and are designed as reminders, not rules (Nielsen and Molich, 1990). To help represent these concepts, there are several existing website design and data visualization principles that can be explored to move dashboard designs beyond the familiar, more functional operations that can be seen today (Kelleher and Wagener, 2011; Shneiderman et al., 2016; Tufte, 2001).
It must also be noted that no single philosophy is a perfect fit for all scenarios. Therefore, the concepts presented here may be followed verbatim or be applied in varying degrees depending on the city dashboard design brief and the overall project objectives. By employing primary concepts of web-design aesthetics, usability, and functionality, any dashboard venture should have a clear understanding of which of these values can address the unique requirements of any given project. As there are countless factors that affect each of these elements, it is ultimately the final user who decides if a dashboard is visually appealing and easy to use. From conception to completion, if a city dashboard is not created effectively, it will be evaluated poorly by users and display only modest website analytical scores.

Conclusion
The purpose of this study was to create specific user requirements and design guidelines for new city dashboards based on user experiences in a community informatics context. A CTA protocol, along with CIT, was followed to collect interaction data of significance to the user. Three main categories of interest were identified from previous research: effective website design (Shneiderman et al., 2016); effective data visualization (Kelleher and Wagener, 2011); and the specific data categories that are available to the user (Kitchin et al., 2015). The participants verbally expressed what they were looking at, thinking about, the tasks they were undertaking, and how they were feeling throughout their session. It was, therefore, possible to objectively observe and comprehend the cognitive processes associated with dashboard system interaction and quality evaluation from a user's perspective. By using a mix of different user types as participants in this procedure, the research provided insight into how dashboard systems can be applied in practice and revealed real-world usability and user experience issues.
The creation of a usable city dashboard interface that is instructional and helpful, while also delivering data visualizations that are usable and meaningful to multiple user-types, is critical. In pursuit of this goal, several shortfalls were observed between the city dashboard systems that were evaluated. Fundamentally, the vocabulary used to describe city data was too variable in terms of technical accuracy, and simple explanations of meaning could have profitably been used to ensure that the user felt confident in drawing meaning from the data and could, therefore, act upon it accordingly. Contemporary research on data analysis and visualization in digital civics has also highlighted this 'digital divide' between user-types, which has become a common problem worldwide (Zhu et al., 2015). By improving upon this, the user will be more likely to feel confident progressing towards exploring other information, gaining experience, and transitioning towards improving their domain knowledge and experience. Another commonality between the observed dashboards was the lack of carefully designed supporting materials, such as help pages and tutorials, to facilitate this advancement of knowledge. Moreover, the study highlighted how the number of steps for accessing data or data sources should be limited to just a few clicks and how the relationships between data sets should be logical and innately explorable. The user should be able to quickly navigate back-and-forth through familiar territory without becoming lost, overwhelmed, or overloaded with external links. Reducing these types of action will serve to reduce the anxiety felt by users and build confidence through the positive reinforcement of their actions.
The findings from the empirical study were used to create general dashboard system guidelines and recommendations for creating and assessing city dashboards. These guidelines focused on nine general principles (relating to navigation, data utility, style, visualizations, veracity, users, data types, usability, and communication), rather than rules,

Table 5
Guidelines Design Focus Suggestions

Navigation
Implement logical navigation patterns and menus so users can explore data with confidence and quickly trace their progress throughout the dashboard hierarchy.
If the user takes a wrong turn, facilitate menu functions that help correct unintended actions. Also, provide users with 'accelerators' to speed up navigation and facilitate frequent actions.

Data Utility
The intended meaning of the data being presented must be explicit and have actionable applications for diverse user types with different data literacies.
The utility of data depends upon the anticipated usage of the dashboard. To communicate this, the dashboard should use clear, consistent terminology and familiar words, phrases, and concepts.

Style
The overall look and feel should be representative of the city and should be applied consistently to help build familiarity and confidence as well as improving the overall user experience.
There should be no ambiguity in the look and feel of the user interface; all pages and themes should remain the same throughout the different areas of the dashboard.

Visualizations
Data visualizations must be of a suitable type and have further contextual information or metadata attached for clarity of meaning.
Think about consistency and relevance in the use of all visualizations and dialogues, and actively support users in building knowledge.

Veracity
The accuracy, precision, lineage, source, and age of data must accompany all data.
Ambiguous or untrustworthy data should not be used. Provide links to data sources so that users can also access and assess the veracity of data.

Users
Potential user-types for city dashboards are broad; therefore, implement user-centered design methodologies for all system development workflows to build empathy with the different user types of dashboard systems.
Engage with users and build empathy with them via workshops and questionnaires. If repeated user testing is not feasible, consider applying targeted-scope user personas in support of less frequent user testing and for informing minor dashboard design choices.

Data Types
Use both real-time and historic data; arrange them logically and group them thematically.
Include real-time data to assure users that data are current, as well as displaying time-series data to provide further context and encourage data exploration.

Usability
Usability heuristics should be applied at all stages by all project team members.
Use heuristics to provide users with explorable information, usable interfaces, and learnable interaction methodologies that are informed via validated HCI research.

Communication
Use effective language and appropriate visualizations to communicate meaning across multiple platforms, media, and via multiple modalities if possible.
Different dashboard pages can serve different users, therefore, understand your audience and focus on communicating data across multiple pages, platforms, and modalities accordingly.
that are informed by the wider HCI literature (Henninger et al., 1995; Nielsen, 2005; Nielsen and Molich, 1990) but are tailored for city dashboards. These guidelines will be deployed and evaluated in future dashboard task-based performance evaluations, validating their use for creating effective city dashboards.

Declaration of Competing Interest
None.
1. Dublin Overview - an at-a-glance dashboard page that presents the user with current values of key indicators in Dublin.
2. How's Dublin Doing? - a set of time-series indicators related to different themes: transport, housing, economy, etc.
3. Dublin Real-Time - real-time environment and travel data presented via interactive maps.
4. Dublin Mapped - a set of mapping modules that presents a variety of data, such as census variables, crime, social welfare, and historic environmental and archaeological data.
5. Dublin Planning and Dublin Housing - a set of mapping modules presenting housing, planning, and land-use data.
6. Dublin Near to Me and Dublin Reporting - modules that provide information on the location of key services and allow citizens to report issues via a mapping interface.
7. Dublin Data Stores and Dublin Apps - a module that links the user to other websites and portals, providing access to data that is specific to Dublin.
8. Dublin Bay Dashboard - a separate dashboard that provides data tools and visualizations about the coastline and sea around Dublin.

Fig. 1 .
Fig. 1. Data and interaction elements of the Dublin dashboard.

Fig. 3 .
Fig. 3. Real-time data modules on the London dashboard.

Fig. 4 .
Fig. 4. Real-time data modules on the London dashboard.

Fig. 5 .
Fig. 5. Representative visualization of participants' experience and domain knowledge; dotted line representing the linear average.

Fig. 6 .
Fig. 6. Diverging stacked bar chart showing Tier 1 CA categories for all dashboard CIT inspections.
Fig. 7. Diverging stacked bar chart showing CIT results for website navigation.

Fig. 9 .
Fig. 9. Diverging stacked bar chart showing CIT results for website style.

Fig. 12 .
Fig. 12. Diverging stacked bar chart showing CIT results for types of users.

"Yes, [the New York dashboard] gives the city level, but it went further down into the different boroughs. And then you come down to the end and then you can explore different parts of New York." -NIL_855 (advanced user) "[London dashboard] it's such a mess, I don't know where to start." -PML_401 (novice user) "Now, [on the London dashboard] you don't even know where you can click, honest to God. What?" -SOT_205 (end-user)

Communication
The CA category for communication received a total of 175 single statements: Dublin = 73; Hawaii = 23; New York = 36; London = 43.

Fig. 13 .
Fig. 13. Diverging stacked bar chart showing CIT results for types of data.

Table 1
Data inventory for protocol analysis (n = 21) for all city dashboards.
⁎ Dashboard website failed to load.

Table 2
3-tier hierarchical representation of the dimensions of experiential quality criterion for effective web design expressed as single statements (SS) for all systems.

Table 3
3-tier hierarchical representation of the dimensions of experiential quality criterion for effective data visualizations expressed as single statements (SS) for all systems.

Table 4
3-tier hierarchical representation of the dimensions of experiential quality criterion for effective dashboard content expressed as single statements (SS) for all systems.