Open Badges for Promoting Open Practices in the Institutional Repository: A Pilot Project

INTRODUCTION This paper describes a pilot project conducted at a mid-sized research university to integrate an Open Badge into the institutional repository (IR) alongside research articles. The Open Badge was intended to indicate that the research article in question complies with a national funders' open access (OA) policy.

METHODS This study employed a two-step process to investigate the value of badges: first, researchers were surveyed to ask their opinions about using badges in the IR; second, user testing was done with a small group of researchers to assess whether badges are easy to apply during the process of depositing an article to the IR.

RESULTS A minority of respondents to the survey indicated that they saw value in an open badge. Participants in the testing component revealed several areas where the overall interface to the IR submission process could be improved.

DISCUSSION It was clear that there are opportunities to promote open practices relating to national funders' open access policy in our sample. However, any incentive represented by an open badge may be overshadowed if the infrastructure in which it is presented is not sufficiently streamlined.

CONCLUSION Scholars are not willing to spend much, if any, additional time to indicate compliance with an open access policy. Adding an open badge was neither an incentive nor a disincentive for promoting open practices.


INTRODUCTION
Open badges are a form of micro-credential increasingly used within the academic sector to recognize skills and achievements. Some research has suggested that badges can be effective tools for promoting open practices, while other studies find that using badges with faculty may not be meaningful. However, because low faculty participation in institutional repositories (IRs) is a well-documented issue, and because faculty cite many burdens with respect to administrative duties, it is worthwhile to explore the potential value of badges to demonstrate compliance with a major research policy and incentivize participation. This paper explores the value of an open badge to recognize researchers' compliance with the open access (OA) policy of three major Canadian funding agencies. We surveyed funded researchers at one Canadian university to learn whether they see value in such an open badge, and whether it would promote the practice of depositing materials in the IR. We then conducted user testing to learn whether applying an open badge during the deposit process was feasible and time-effective for researchers. This paper argues that most researchers do not see micro-credentials as an incentive, and that the time commitment involved in depositing to the IR is a barrier to participation. This paper contributes to the literature on open badges for recognizing open practices in scholarly communication, showing that the value of micro-credentials depends on where they appear.

LITERATURE REVIEW
Although IRs have been in wide use in Canadian universities since the early 2000s (Bailey et al., 2006), there has been renewed interest in their use due to the recent Tri-Agency Open Access Policy on Publications (Government of Canada, 2015). The Policy, which was introduced in 2015, provides a harmonized approach to OA requirements across Canada's three main funding agencies (the Canadian Institutes of Health Research (CIHR), the Natural Sciences and Engineering Research Council of Canada (NSERC), and the Social Sciences and Humanities Research Council of Canada (SSHRC)). The crux of the Policy, which builds upon the policy in place at CIHR since 2008, is that Tri-Agency-funded researchers must ensure that any peer-reviewed journal publications are freely accessible, either in an online repository or an open access journal, within 12 months of publication. In this way, the Policy allows authors to choose either the gold (publish in an OA journal) or the green (self-archive a version of a paper in an open repository) route to OA, although the two routes are not mutually exclusive because gold OA articles can still be archived in a repository as well. Our study was interested in investigating the green route to complying with the Policy because although fees associated with OA publishing (e.g. article processing charges or APCs) are eligible costs under Tri-Agency guidelines, grant budgets are often tight and self-archiving is often a more practical method of complying with the Policy, if publishers permit it.
Recent research suggests that "badges are simple, effective signals to promote open practices and improve preservation of data and materials by using independent repositories" (Kidwell et al., 2016). This study found badges to be a very effective incentive among authors submitting open data to a peer-reviewed journal: during the study period, the percentage of authors reporting open data increased from 3% to 23%, with no corresponding increases seen in comparison journals. In this case, badges were issued by the journal. Kidwell et al.'s study has been included in a "manifesto for reproducible science" that includes badges as one tool in a suite of "effective interventions for nudging incentives" (Munafò et al., 2017).
A systematic review of the literature in health and medical research (587 articles) conducted by Rowhani-Farid et al. in 2016 found only the Kidwell article noted above discussing how badges can incentivize researchers to comply with open data principles. The authors of this review conclude that, with regard to medical researchers making their data available, "what is lacking, it appears, are rewards that incentivize researchers to share their data." While we have found plenty of evidence of other institutions experimenting with badges for students, and even our own institution issuing badges for faculty completing certain teaching tasks, we have not found any evidence of other institutions issuing badges to faculty or researchers in order to promote the principles of OA or grant compliance (Association of College & Research Libraries, 2015; Pittinsky, 2015; University of Calgary, 2015).
Many academic journals have begun issuing digital badges next to papers that comply with OA principles, or that include open datasets (Grahe, 2014; Johnson, 2014; Setchell, 2016; Pexman, 2017), though usually, as is the case with our implementation, "papers will not be required to pursue or earn these badges to be published" (Johnson, 2014). Organizations, too, now regularly issue badges to show compliance with the principles of OA (Open Science Framework, 2017). Allen et al. (2014) describe 14 badge types that can be assigned to various authorship roles in an academic paper, yet none of these proposed badges indicate compliance with a funder's requirement that data or papers be made available as OA.

METHODS
This study employed a two-step process to investigate the value of badges: first, researchers were surveyed to ask their opinions about using badges in the IR; second, user testing was done with a small group of researchers to assess whether badges are easy to apply during the process of depositing an article to the IR.

Survey
The survey data were collected at the University of Calgary, a mid-sized research university with almost 1,900 full-time academic staff (University of Calgary, 2018). This study received ethics approval from the University of Calgary's Conjoint Faculties Research Ethics Board (ID REB170110), and all participants gave informed consent before taking part. Because we were specifically targeting faculty who had received funding that is subject to the Tri-Agency Open Access Policy on Publications, the university's Research Services Office provided a list of principal investigators who had received an operating grant from one of the Tri-Agencies since 2015, when the Policy went into effect.
A short questionnaire was developed based on the research questions. The questionnaire provided brief explanations on the format and purpose of open badges, and how they could be deployed to indicate compliance with the Policy in the IR, along with a visual example of what this would look like. This introduction was followed by six questions asking respondents to rate their level of knowledge of the Policy (on a four-point scale from "none" to "high"), to indicate whether they had ever used the IR, to rate their perception of the value of an open badge, and to indicate how much time they would be willing to spend in applying a digital badge to a research article. A copy of the questionnaire is available in Appendix A. The questionnaire was created in LibWizard, an online survey tool that is part of the SpringShare platform.
An email invitation to participate was distributed in May 2017 to a total of 220 faculty members who fit the inclusion criteria mentioned above. The survey was made available for one month, with a reminder sent at the three-week mark.

User Testing
For the user testing, a mockup of the IR was created to mimic the DSpace 6.2 XMLUI upon which the repository runs. The mockup closely replicated the existing repository interface, with one additional dropdown field where participants could choose an open badge to indicate compliance with the Tri-Agency Open Access Policy on Publications.
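Although the badge field existed only in our mockup, a dropdown of this kind could be declared in a DSpace 6 submission form (input-forms.xml) roughly as follows. This is a sketch only: the metadata qualifier, value-pairs name, and badge value are illustrative assumptions, not fields from our production repository.

```xml
<!-- Hypothetical sketch of the extra badge field for input-forms.xml.
     The qualifier "openbadge" and the "open_badges" value-pairs list
     are assumptions for illustration, not standard DSpace metadata. -->
<field>
  <dc-schema>dc</dc-schema>
  <dc-element>description</dc-element>
  <dc-qualifier>openbadge</dc-qualifier>
  <repeatable>false</repeatable>
  <label>Open Access Badge</label>
  <input-type value-pairs-name="open_badges">dropdown</input-type>
  <hint>Select a badge to indicate compliance with the Tri-Agency Open Access Policy on Publications.</hint>
  <required></required>
</field>

<value-pairs value-pairs-name="open_badges" dc-term="description.openbadge">
  <pair>
    <displayed-value>Tri-Agency OA Policy compliant</displayed-value>
    <stored-value>tri-agency-oa</stored-value>
  </pair>
</value-pairs>
```

Because `<required>` is left empty, the field remains optional, matching our design decision that badges would not be mandatory for deposit.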
The user testing employed a script that placed testers in a consistent scenario: they took the role of a faculty member with Tri-Agency funding who was fulfilling the Policy by depositing a post-print version of a journal article into the repository. To remove the burden of checking publisher policies for the purposes of this user testing, the script informed participants that the PDF they were directed to upload met the publisher's policies; however, no other information about the publisher's policies was provided. Both the scenario and the article used in the user tests were mockups; as such, no bibliographic data for the post-print article could be found through scholarly databases.
An email invitation was distributed in October 2018 to the respondents who had provided contact information to participate in the user testing phase. The user testing was arranged in 30-minute blocks. All three members of the study team participated: one to facilitate the user testing process with participants, one to keep time during each stage of the deposit process, and one to take notes on issues that participants experienced during the deposit process. Participants used the Skype for Business application to broadcast their screen, face, and voice over Wi-Fi to the study team in another room. The user experience tests were not recorded, and the facilitator left the room during the user testing itself.

RESULTS

Survey
A total of 48 responses to the survey were received, yielding a 22% response rate. Respondents could complete the questionnaire anonymously, or they could provide an email address if they wanted to participate in step two of the research. Just over half of respondents (n=26, 54%) reported a "moderate" level of knowledge of the Tri-Agency Policy on Open Access (see Fig. 1). Additionally, most respondents reported that they had never used the IR in the past (69%, n=33). Those respondents who reported moderate to high knowledge of the Policy were more likely to have used the IR than those reporting lower levels of knowledge (see Fig. 2). In response to the question about the perceived value of a digital badge to indicate compliance with the Policy, responses were relatively evenly split: 37% of respondents indicated value, 27% indicated no value, and 35% were not sure. Additionally, 40% of respondents said that the presence of a digital badge would not increase their likelihood of using the IR to deposit Tri-Agency-funded research.
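As a quick arithmetic check, the reported percentages follow directly from the counts above (an illustrative sketch; the counts are taken from the survey results, with percentages rounded half up to whole numbers):

```python
# Illustrative check of the reported survey figures.
# Counts come from the survey results reported above; percentages
# are rounded half up to whole numbers to match the text.

INVITED = 220      # faculty who received the email invitation
RESPONSES = 48     # completed questionnaires

def pct(part: int, whole: int) -> int:
    """Percentage of part in whole, rounded half up to an integer."""
    return int(part / whole * 100 + 0.5)

print(pct(RESPONSES, INVITED))  # response rate: 22
print(pct(26, RESPONSES))       # "moderate" knowledge of the Policy: 54
print(pct(33, RESPONSES))       # had never used the IR: 69
```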
The largest group of respondents (31%, n=15) indicated they would be willing to spend 1-5 minutes applying a digital badge; the next most common response was "no time" (27%, n=13). Respondents who saw value in badges were willing to spend more time (e.g. 10-15 minutes) applying a badge (see Fig. 3).

User Testing
Of the 18 participants who had provided their email address when responding to the initial survey, six faculty members from a range of disciplines participated in the user experience testing, which took place in the library as well as in faculty offices.
The user-experience tests required users to deposit a post-print into the IR, a process that includes six steps: logging in, selecting the appropriate collection, describing the submission (adding metadata), uploading the item, selecting a distribution license, and completing the submission. The time required for this process ranged from 4:54 to 17:00 minutes, with an average of 11:08 minutes. As might be expected, the step where users described the submission was by far the most time-consuming, with the average user requiring more than 8 minutes to complete this section (see Fig. 3). The large variation in time taken by participants stemmed primarily from the level of effort participants put into completing the tasks: while some simply left unclear fields blank, others searched the web or scrutinized the instructions carefully to try to complete all fields. The distribution-license step asked participants to agree to the Library's non-exclusive distribution license; two of six participants read the license, while the rest simply clicked through the screen.
The description task was also where participants experienced the most problems or issues (see Fig. 4). A wide variety of issues with the form were either verbalized by participants or observed by the study team, including a lack of clarity among labels, descriptions, and information requests; unclear formatting of field descriptions; long and unwieldy drop-down lists; and buttons or descriptions without a clear call to action. Although the field to select an open badge was part of this description task, none of the participants remarked on it, and when specifically probed after the user testing was complete, not a single participant recalled whether or not they had selected this field. In the item-submission task, two participants were observed to be unsure whether the post-print PDF had actually been uploaded, due to a lack of confirmation from the interface. One user was unsure about whether to apply an embargo to the submission.

DISCUSSION
Despite widespread adoption of open access mandates by major funding agencies all over the world, there are still very few incentives in the current research culture to share work openly, and dedicated infrastructure and policies to facilitate compliance are also lacking. The first large-scale analysis of OA mandate compliance shows that research funded by agencies with stronger environments of enforcement and convenient infrastructures for depositing articles in a timely fashion is much more likely to be made openly available (Larivière and Sugimoto, 2018).
Of all the funding agencies studied by Larivière and Sugimoto, the Tri-Agencies had among the lowest rates of compliance. The authors suggest that this is because there is little to no enforcement of the policy and because of the Tri-Agencies' flexibility on when articles must be deposited: since authors are permitted up to a year after publication to make their articles available, it is suggested that many lose track of this obligation. The Tri-Agencies, unlike funders with higher compliance rates like the U.S. National Institutes of Health, also do not provide designated repositories for deposit; instead, authors are directed to deposit their work in an institutional or disciplinary repository of their choice. There is also no suggestion that grant payments have been withheld due to a lack of compliance, unlike funding agencies with higher rates of compliance (Van Noorden, 2014).
Larivière and Sugimoto's research shows that this flexible and voluntary mandate has resulted in low rates of compliance, and our own survey suggests that, three years on, the mandate is still not well understood by funded researchers, at least at our institution. In this context, it is perhaps not surprising that the "nudge towards openness" (Maynard and Munafò, 2018) represented by an open badge was not an effective incentive for authors, most of whom reported never having used the IR prior to this study, and who reported not being willing to spend much time on activities related to Policy compliance.
It may be this differing context that accounts for the difference between our findings and those of Kidwell et al. (2016). That study examined the effectiveness of a small incentive in the context of a process that academic researchers are already highly incentivized to pursue: having papers accepted to academic journals. Within this very motivating context, the small nudge of a badge was effective. In contrast, our study attempted to incentivize researchers to make their research openly accessible in a context where they were not already motivated to comply. Additionally, the audiences, purposes, and outcomes of sharing data openly may be very different from those involved in sharing journal articles, so it is possible that the type of academic output to be shared also influences the perceived value and subsequent use of badges.
Another factor in our results was the inherent complexity of the IR interface, which has been documented in previous research (Hee Kim & Ho Kim, 2008; Betz & Hall, 2015). The deposit process presents users with over 20 metadata fields to increase the discoverability of the item and to assist with ensuring copyright compliance. Although most of these fields are not mandatory, it is not surprising that the deposit process took time and caused researchers difficulty. Many of the fields were not easily understood by researchers in our test group: for example, users are asked to include a link to the publisher's copyright policy that allows self-archiving in a repository, and many of them had difficulty finding this information.
We used the results of the user experience testing portion of this study to make tangible changes to the deposit description page of the IR that we believe will address the issues most users encountered. Metadata fields were reordered so that higher-priority and commonly used research-output description fields were collocated. Lower-priority metadata fields were moved to the bottom of the description page. Field description text was changed to more clearly indicate optional fields. Fields that caused consistent confusion, or that were consistently skipped, were changed to workflow-only visibility, a setting that displays a field only during a workflow step so that repository staff are responsible for entering that data. We will continue to monitor data entry to ensure that these changes improve the user experience of the repository deposit process. Our user experience testing showed that adding an open badge was neither an incentive nor a disincentive for researchers depositing papers in an IR. As such, we decided not to deploy the badge beyond a testing environment.
Badges have been effective in other contexts. The negative results found in our research may be a result of the repository environment, a lack of awareness, a lack of incentive, or a combination of all of the above. The user experience is already complex, and a deposit interface is not an optimal location to educate researchers on OA policy. Based on the results of both our survey and the user experience testing, we would not be confident that deploying a badge in the repository would result in increased understanding of the Policy or even accurate applications of the badge itself.
This study took place at one institution, with researchers subject to the same open access policy, and these contextual factors may affect our results. It would be interesting to replicate this study in a context where researchers are more highly incentivized to adopt open practices, for example those funded by agencies with stricter enforcement of open access requirements.
We would consider revisiting the utility of badges to signal compliance with OA policies if awareness of the Policy increased among funded researchers, if compliance measures around the policy got stricter, or if the badge was used in another system more closely tied to research compliance and reporting, for example a current research information system.

CONCLUSION
Despite some research suggesting that open badges can serve as an incentive or reward to indicate compliance with open science principles, we were unable to replicate this finding with our small group of researchers. At least in our case, the process for depositing articles in the IR is already perceived as too time-consuming and/or difficult, and even though the badge indicating compliance with the Tri-Agency Open Access Policy on Publications added only a single step to the process, our users appeared too preoccupied with the overall deposit process to notice it. We conclude that badges may seem useful in theory, but that the larger context of these small incentives is likely decisive: badges may work only if they take little to no time for researchers to apply when submitting documentation to the IR, or if they appear within a process that researchers are already incentivized to complete, such as having an article accepted to a peer-reviewed journal.