Publications and Evaluations: Conducting a Baseline Assessment of Open Access Adoption and Support at an R2 University

INTRODUCTION: This study reflects a mid-size university library’s first attempt to assess faculty research output to shape future scholarly communications efforts.

METHODOLOGY: The assessment combined a qualitative analysis of the university’s reappointment, promotion, and tenure (RPT) documents with a quantitative analysis of faculty publications recorded in Digital Measures from 2015-2019. The RPT documents were coded to determine which indicators of scholarly value were emphasized, then compared with data on where and how faculty were publishing.

RESULTS: Within RPT documents, peer review was frequently emphasized, but open access and predatory publishing were not mentioned. The majority of publications occurred in hybrid journals, and publishing was concentrated among only a handful of publishers, with 11 publishers responsible for 62% of faculty’s research output. OA journal publications have risen slightly in recent years but still accounted for only 20.7% of UCCS publications. However, predatory publishing was very low, accounting for less than 5% of UCCS publications.

DISCUSSION: More education is needed on the importance of open access and how to assess the quality of a journal. RPT criteria consistently mentioned certain indicators of scholarly quality, but these indicators were often vague and preferential to traditional publishing models. Both open access and predatory publishing remain low, and additional education may help faculty feel more confident in exploring alternative publishing models.

CONCLUSION: Assessing the research output of faculty and how scholarship is being evaluated within each college can help libraries to tailor their efforts to promote open access publishing. However, the lack of OA support in the RPT criteria suggests a larger cultural shift is needed to make faculty not only aware of OA, but also encouraged and supported in publishing OA.


INTRODUCTION
Since the term "open access" was coined in 2001 by the Open Society Institute in Budapest (Solomon, 2013), open access publishing has grown and evolved and has been predicted to surpass and replace traditional publishing as the primary means of scholarly communication (Lewis, 2012). The open access (OA) publishing model is increasingly recognized as an important means of improving equity in access to research, facilitating a faster spread of information and innovation, and returning publicly funded research to the public. The National Institutes of Health became the first public funding agency to have a nationally legislated OA mandate in 2008 (Suber, 2008a), and in the 12 years since, Green OA or Gold OA mandates have become more common among public funders, private funders, and universities (ROARMAP, 2020). As a result, ensuring funding compliance and tracking and promoting OA publishing has become a major focus of academic libraries' scholarly communications work.
Despite the growing push for broader research dissemination from funders and institutions, faculty and university administrators continue to have reservations about open access journals' peer review and publication quality (Warlick & Vaughan, 2007; Laughtin-Dunker, 2014; Swoger, Brainard, & Hofmann, 2015; Zhu, 2017). Assessing quality is difficult, and often popularity or prestige is treated as a substitute measurement. Peter Suber observed in 2008, "If you've ever had to consider a candidate for hiring, promotion, or tenure, you know that it's much easier to tell whether she has published in high-impact or high-prestige journals than to tell whether her articles are actually good" (Suber, 2008b, n. p.). However, these value swaps often ignore how metrics give an unfair advantage to traditional publishing and older journals, as "[m]ost OA journals are new and it takes time for any journal, even one born excellent, to earn prestige in proportion to its quality" (Suber, 2008b, n. p.). Additionally, reliance on metrics that evaluate prestige and popularity ignores the biases that underlie the search algorithms that determine which research is found and seen (Noble, 2018).
Many academics are aware of and concerned by the association between OA publishing and predatory publishing. Predatory publishing, or as Anderson calls it "deceptive publishing," is "a practice whereby a company creates a journal on false pretenses for the purposes of defrauding authors, helping authors deceive their colleagues, or both" (Anderson, 2019, n. p.). Although its relation to the open access movement has been hotly debated, predatory publishing continues to taint perceptions of online and open access publishing. Suspicions about predatory journals can discourage faculty from publishing in OA journals and lead administrators and tenure review committees to dismiss or undervalue OA publications.
Previously the University of Colorado Colorado Springs (UCCS) had been classified as an M1 school (defined as "Master's Colleges and Universities-Larger programs"), but in 2018, UCCS became an R2 (defined as "Doctoral Universities-High research activity") (The Carnegie Classification of Institutions of Higher Education, n.d.). With this new classification comes growing attention to research outputs at UCCS. This change presented an opportunity for the Kraemer Family Library (KFL) to assess UCCS's research outputs and to strategically expand its existing support of scholarly communications. KFL had offered limited scholarly communications education, with a once-yearly workshop that introduced the concept of Open Access along with other publishing topics. Although some departments have established research funds for covering Article Processing Charges (APCs), neither UCCS nor the KFL offer faculty any APC funding. Additionally, UCCS is a member of Mountain Scholar (the collective digital repository for several universities in Wyoming and Colorado), but the repository is not well known to UCCS faculty. Only four faculty members have contributed to the collection, and none of their contributions are articles.
As UCCS is both a public institution and a frequent recipient of public funding, it seemed especially important to assess UCCS's existing adoption and support of OA publishing and to see if concerns about predatory publishing were warranted. To understand UCCS's scholarly communications needs, it was important to not only assess current research outputs, but also the university's expectations for faculty research. The researcher wanted to know how research was evaluated by tenure and promotion committees in each college, and whether OA or predatory publishing was explicitly discussed in reappointment, promotion, and tenure (RPT) criteria. The researcher also wanted to see if data already being collected in faculty activity reports could be used for a baseline assessment of UCCS's publishing habits, particularly with regards to Gold OA and predatory publishing.

Assessments of OA Publishing
OA publishing and article availability have been measured using a variety of bibliometric data sources, such as Google Scholar (Martín-Martín, Costas, van Leeuwen, & Delgado López-Cózar, 2018), proprietary datasets such as Scopus and Web of Science (van Leeuwen, Tatum, & Wouters, 2018; Archambault et al., 2013; Archambault et al., 2014; Piwowar et al., 2018), and, more recently, Unpaywall (Piwowar et al., 2018; Robinson-Garcia, Costas, & van Leeuwen, 2020). As van Leeuwen et al. (2018) write, the process of quantifying OA is complicated by several factors, including "the lack of clear and consistent identification of open access publications in bibliographic data" (p. 1161). Archambault et al. (2013) performed a large-scale assessment of OA availability, using a sample of 320,000 papers randomly selected from Scopus. Papers in Gold OA journals were identified by comparing ISSN and title information from Scopus with OA journal lists from the DOAJ and PubMed Central. Other OA availability was determined by searching in Google and Google Scholar; if a free version was found, the article was considered OA. However, they observed that "some of the documents freely available at that time ceased to be free later" (Archambault et al., 2013, p. 4). Piwowar et al. (2018) compared OA across three different samples (Crossref DOIs, Web of Science, and articles accessed through Unpaywall), using oaDOI to identify legal OA copies of the sample articles. However, Piwowar et al. (2018) also performed an accuracy check suggesting that oaDOI "are probably overlooking around 23% of OA otherwise discoverable" (p. 18/23). The findings of these studies generally indicate that OA, or at least the availability of free copies, is rising. Archambault et al. (2014) updated the results of their 2013 study and found that "more than 50% of the scientific papers published" between 2007 and 2012 could be freely downloaded (p. ii). Piwowar et al. (2018) found that 27.9% of all DOI-bearing articles were OA and that the likelihood of OA rose with the article's recency.
Another frequent form of OA publishing assessment is a survey or interview of faculty attitudes and understanding surrounding OA (some examples include: Warlick & Vaughan, 2007; Hurrell & Meijer-Kline, 2011; Dawson, 2014; Laughtin-Dunker, 2014; Rodriguez, 2014; Gaines, 2015; Swoger et al., 2015; Yang & Li, 2015; and Zhu, 2017). In 2010, Jingfeng Xia took the results of 26 such surveys and mapped the attitudes captured over time, finding that, from 1991-2008, awareness of OA publishing and reports of publishing in OA journals had gone up, but that willingness to publish OA continued to vary. More recent studies continue to show high faculty awareness of OA. Dawson (2014) found that 91% of faculty surveyed claimed to either understand or have some knowledge of OA, Yang and Li (2015) found that 88% of their faculty respondents were aware of OA journals in their fields, and 93% of faculty surveyed in Zhu (2017) recognized OA as important.
However, there is a discrepancy between awareness or a positive perception of OA and actual open publications. Morris and Thorn (2009) found that relying on faculty recall of OA activity does not provide accurate results. In their study, they asked questions regarding faculty attitudes toward OA and whether faculty had ever published OA, but followed this question by asking faculty to name the journals. Despite faculty supporting OA in principle and believing that they had published in an open journal, 31.04% of the journals faculty referred to were not OA journals (Morris & Thorn, 2009, p. 225), leading them to conclude, "[t]here is much more support for OA publication in theory…than in practice" (p. 236). This disparity between attitude and action is echoed in other studies. Laughtin-Dunker (2014) and Dawson (2014) both found that while faculty expressed support for OA, they seemed confused about what OA is (Laughtin-Dunker, 2014) or lacked practical knowledge of related concepts like author's rights (Dawson, 2014). In Zhu (2017), 93% of faculty responded that OA was important, but only 41% of them claimed to have published in an OA journal.

Tenure and Promotion and OA Publishing
Research on the attitudes and perceptions surrounding OA and other scholarly communications issues provides some insight into the reasoning behind faculty's publication habits. Although some faculty have appeared pleased with the comparative speed of OA publishing (Nariani & Fernandez, 2012), others have expressed concerns about "author pays" models and are reluctant or unable to pay APCs (Dawson, 2014; Zhu, 2017). A frequently repeated concern is that OA journals are not high quality or lack peer review (Laughtin-Dunker, 2014; Gaines, 2015; Swoger et al., 2015; Yang & Li, 2015). These concerns about OA journals' peer review or research quality are tied to concerns about predatory publishing, but also to RPT criteria. Gaines (2015) writes, "Another common concern with respondents was the perceived lack of support from their department, or confusion over whether an open access article would 'count' for promotion and tenure" (p. 17). Swoger et al. (2015) found the same, writing, "Indeed, the primacy of tenure concerns and the need to publish in peer-reviewed outlets influences nearly every aspect of faculty's research and publishing behavior and decisions" (p. 11). Yang and Li (2015) conclude their study of Texas A&M faculty by observing, "Whether OA journal publications are acceptable for consideration of tenure and promotion in their department varies across TAMU schools and colleges. Further research on comparing tenure and promotion policies among schools and colleges, or even departments and disciplines might be needed" (p. 18).
Some studies analyzed RPT criteria surrounding research, either through review of the criteria themselves or through surveys of administrators. These studies seem to support faculty concerns, finding that traditional measures of scholarly quality are most frequently emphasized (Alperin et al., 2019) and that these quality measures are often unclear, unreliable, or based solely on venue of publication or claims of peer review (Schimanski & Alperin, 2018). Hurrell and Meijer-Kline's 2011 review of the literature on OA and tenure and promotion finds that "OA publications are seen to have a slightly negative or neutral effect on one's academic career" (p. 14), and they argue, "If tenure and promotion committees do not recognize newer forms of scholarly outputs, including OA materials as legitimate, then authors may be reluctant to explore these new options" (p. 1). Del Toro et al. (2011) argue that the need to adhere to RPT criteria is especially strong at small and midsize institutions like UCCS, with limited funds, limited staff, and a greater emphasis on teaching. They write that "[r]esearchers at such institutions, therefore, may feel they are in a more tenuous position and may be less willing to take steps they view as risky" (p. 153).
Although faculty perceptions and understanding are helpful in focusing scholarly communication education efforts, the literature repeatedly shows that positive perceptions of OA did not determine where faculty published. However, given the influence of RPT criteria on publishing decisions, another relevant consideration for determining UCCS's OA understanding and acceptance is the RPT criteria adopted by each college and department. Determining what language is used to describe and evaluate scholarship within those criteria could provide insight into campus and administrative perceptions of open access, as well as the motivations of faculty trying to adhere to them. Analysis of the RPT criteria, coupled with data on faculty's reported publications, could show where UCCS stands currently in its adoption and promotion of open scholarship.

METHODOLOGY
To examine the current RPT criteria for each college, deductive coding was used to identify criteria language surrounding seven predetermined evaluators of scholarly contribution: Reputation (any references to journal quality or a journal's being well-known or esteemed within a field); Discoverability (indexing); Impact (article or journal metrics); Open Access (open access, institutional repository deposits, or making work publicly available); Predatory (predatory journals, misrepresentation of journal quality); Tiered Lists (scale of value for different types of scholarly contribution); Journal Lists (explicit lists of journals to/not to publish in). After codes had been applied, they were used to visualize which evaluators were being emphasized in tenure and promotion processes.
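Once codes have been applied, the per-document tallies that feed the visualization can be computed mechanically. A minimal sketch in Python, with an invented data structure (one set of applied codes per document) standing in for the actual coding software's export:

```python
from collections import Counter

# The seven predetermined evaluators of scholarly contribution.
CODES = ["Reputation", "Discoverability", "Impact", "Open Access",
         "Predatory", "Tiered Lists", "Journal Lists"]

def tally_codes(coded_documents):
    """Count how many RPT documents received each code at least once.

    `coded_documents` is a list of sets, one per document, each holding
    the codes applied anywhere in that document.
    """
    counts = Counter()
    for doc_codes in coded_documents:
        for code in doc_codes & set(CODES):
            counts[code] += 1
    return counts

# Illustrative (invented) data for three documents:
docs = [{"Reputation", "Impact"}, {"Reputation"}, {"Tiered Lists"}]
tallies = tally_codes(docs)
```

Counting presence per document, rather than total mentions, matches the "mentioned in N of 22 documents" framing used in the results.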
Faculty publication data was already being collected by UCCS through Digital Measures by Watermark. Digital Measures is an online faculty activity reporting system that allows faculty to record instruction, research, and service activities for promotion and tenure evaluations. The data is self-recorded, with faculty entering their own publications and activities manually or using the system's RIS, Scopus, CrossRef, or PubMed import options. UCCS began using the system in 2015. Curious about where and how faculty were publishing and what unique insights this dataset might provide, the author requested the existing faculty publication data in Digital Measures from the UCCS Office of Institutional Research in April of 2019 to assess current and past faculty research habits.
Digital Measures was chosen over other sources of bibliometric data because it is not limited by subject area, as Scopus, Web of Science, or PubMed are (Martín-Martín, Orduna-Malea, Thelwall, & Delgado López-Cózar, 2018), and because it captures poorly indexed research. More specific to UCCS, Digital Measures data was also chosen because of how it is already being used on campus. Digital Measures is the primary source of faculty activity reports issued by the Office of Institutional Research, and recently the university began using this data to automatically generate public-facing faculty profiles, rather than relying on faculty to upload CVs or update personal webpages. Digital Measures is also used in the tenure and promotion process, with activity reports required as part of a candidate's dossier in several colleges.
This data was first cleaned in OpenRefine to eliminate duplicates and to begin the process of identifying and standardizing journal name variants. Then the researcher created a list of the unique journal names from across all records and investigated each journal manually, locating a current journal or publisher website for each journal and collecting the ISSN, publisher, open access status, and peer review status. Other types of publications were labeled "Not a Journal Article" and removed from the results. For records where an incorrect journal title or a publisher name was provided, the article title and author name were used to identify the correct name of the journal where the work appeared. When there was insufficient information available to identify or distinguish the journal, no journal information was provided, or the researcher felt unable to make further determinations about the journal from the information available (e.g., the journal's website was in another language, or the journal had no website and was available only in print in other countries), the record was marked as unidentifiable and removed from the results.
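The variant-standardization step can be sketched as simple string normalization. This is a hypothetical illustration, not the OpenRefine clustering actually used; note that abbreviated titles (e.g., "J. of Ecology") survive normalization as distinct keys and would still require manual review:

```python
import re

def normalize_title(title):
    """Collapse common journal-name variants: lowercase, strip
    punctuation, drop a leading article, squeeze whitespace."""
    t = title.lower().strip()
    t = re.sub(r"[^\w\s&]", " ", t)        # drop punctuation
    t = re.sub(r"^(the|a|an)\s+", "", t)   # drop leading article
    t = re.sub(r"\s+", " ", t).strip()
    return t

def unique_journals(records):
    """Group raw journal strings by normalized form; returns a dict
    mapping each normalized title to the set of raw variants seen."""
    variants = {}
    for raw in records:
        variants.setdefault(normalize_title(raw), set()).add(raw)
    return variants

raw_titles = ["The Journal of Ecology", "Journal of Ecology",
              "J. of Ecology"]  # abbreviation stays a separate group
groups = unique_journals(raw_titles)
```

Grouping by normalized form shrinks the list a reviewer must work through without silently merging titles that only a human can confirm are the same journal.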
Journals were listed as OA if they provided immediate open access on publication (Gold OA) to all articles. Hybrid journals were those that "give authors the choice of making their individual articles OA in journals that are otherwise subscription" (Solomon, 2013, p. 26). Closed journals were those that offered delayed or embargoed preprint repository deposits (Green OA) or subscription-only access. The Digital Measures data does not currently show how many faculty have chosen to publish OA when given the choice in a hybrid journal, nor does it capture repository activity; further investigations into our research outputs will need to establish reliable methods for capturing this information. Because Green OA refers to articles that may or may not be open access currently, it did not make sense to include them in a count of what UCCS research was openly available. Additionally, Green OA's open component depends on authors depositing their research in a repository and would require the researcher to confirm each article's presence in some repository. As will be discussed later, this strategy may underestimate how much of UCCS research is openly available by some means.
The researcher also made a determination about whether or not each journal was predatory. For the purposes of this project, a predatory journal was defined as one that charged an article processing charge (APC) and made false or misleading claims regarding its indexing, editorial board, or publishing address. Journals were automatically deemed legitimate if they were indexed in the DOAJ or if the publisher or journal participated in regulatory bodies or trade groups such as OASPA, COPE, INASP's Journals Online, or African Journals Online. This definition was based on the checklist at Think. Check. Submit. (n.d.).
This definition is very narrow and by no means encompasses the full range of predatory, misleading, or illegitimate publishing practices. This limited definition was favored for two reasons. First, the researcher sought to preempt excuses such as editorial incompetence. Although journals may innocently fail to update their websites regularly, leading to inaccuracies in their presentation of their address, editors, and indexing, this carelessness becomes deceptive when these elements represent a service being paid for by researchers and funding institutions in the form of APCs. Second, although other attempts at identifying predatory journals have often focused on testing peer review, these processes are lengthy, intensive, and require the tester to wait for acceptances to make the determination. Instead, this definition assumes that a journal that misrepresents its editorial board, location, or indexing is unlikely to have been truthful in its claims of peer review. However, the author acknowledges that this definition does not capture the full scope of illegitimate publishing.
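The decision rule described above can be expressed as a short function. This is a sketch only; the field names (`memberships`, `charges_apc`, `misleading_claims`) are invented stand-ins for the researcher's manual determinations:

```python
def classify_predatory(journal):
    """Apply the study's narrow rule: a journal is automatically
    legitimate if DOAJ-indexed or a member of a recognized trade or
    regulatory group; it is flagged predatory only if it both charges
    an APC and makes false or misleading claims about its indexing,
    editorial board, or publishing address."""
    trusted = {"DOAJ", "OASPA", "COPE", "INASP JOL", "AJOL"}
    if journal.get("memberships", set()) & trusted:
        return "legitimate"
    if journal.get("charges_apc") and journal.get("misleading_claims"):
        return "predatory"
    return "legitimate"
```

Encoding the rule this way makes its narrowness explicit: a journal with misleading claims but no APC, or an APC but honest claims, is never flagged.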
Names and ID numbers were used to match the records to one of the university's colleges (College of Business and Administration; the Helen and Arthur E. Johnson Beth-El College of Nursing and Health Sciences (NURS/HSCI); College of Education; College of Engineering and Applied Science; Kraemer Family Library; College of Letters, Arts, and Sciences (LAS); School of Public Affairs). The researcher chose not to match the records to individual departments within the colleges because some departments consist of only one or two faculty members. As the INORMS Research Evaluation Working Group has noted, bibliometric assessment at the individual level leads to a higher risk of misuse of metrics (INORMS, 2020).

RESULTS

Tenure and Promotion Criteria
The criteria for tenure and promotion for UCCS colleges and departments are publicly available online, and they reflect the most recent criteria adopted by the faculty. Because UCCS faculty have the choice to either adopt new criteria or continue to use the criteria they were hired under, this analysis does not capture all criteria in use at UCCS. However, it does provide a snapshot of what the different departments and colleges value and incentivize.
Most common was peer review, which was mentioned at least once in all but one of the documents (the exception being the Visual Arts department, where the focus was primarily on creative contributions). Other common traits were Reputation, which appeared in 10 of 22 documents, and Metrics, which was mentioned in 9 of 22 documents. However, when Reputation or Metrics were mentioned, the criteria lacked specificity. For example, the College of Business and Administration writes that "some journals are regarded as more prestigious than others by the College" (2017, p. 2), but the document provides no guidance about which journals these are or what makes a journal prestigious, other than that they are "typically more challenging to the authors" (p. 2). Few documents were highly prescriptive about where and what faculty could publish, with only one document including a recommended journals list and only four documents containing tiered lists that indicated a scale of value assigned to different types of scholarly contributions.
No documents explicitly mentioned OA publishing. A few documents mentioned public facing work in their research criteria: the Women's and Ethnic Studies department acknowledges, "Public scholarship such as editorials, magazine articles, and blogs" (2009, p. 9); the Anthropology department recognizes "major publications that serve as resources for affected communities" (2018, p. 5); and the History department mentions that "scholarship can take many forms….for example, through scholarly work that supports a public exhibit on an historical topic" (2009, p. 4). All three statements could be interpreted as supporting scholarship that is publicly available, but the phrase "open access" is never used, and there is an underlying assumption that public work will not include journal publications. Even the library, which professes a commitment to supporting OA, failed to include any language regarding OA in its own tenure and promotion criteria. Similarly, there was no mention of predatory publishing in any of the documents.

Faculty Publication Habits
After cleaning and the elimination of duplicates, unidentifiable titles, and non-journal publications, the final list included 1,333 records of unique publications or publication attempts in 887 different journals. When compared to other sources of bibliometric data, Digital Measures appears to capture more. Restricting Digital Measures data to articles published in 2015-2018, it appears that UCCS published 972 articles. By comparison, an affiliation search in Scopus discovers only 764 journal articles for the same time period. This discrepancy could be due to a number of factors (Scopus's indexing, articles missing affiliation information, articles published without a UCCS affiliation included in Digital Measures), but the discrepancy is large enough that it raises questions about how the contents in Digital Measures compare to the research outputs reported by other sources.
Although UCCS faculty had published in or submitted to 887 different journals, those journals were produced by 262 publishers. Eleven publishers, 4.9% of the total number reported, were responsible for publishing more than 60% of UCCS's reported research output (see Figure 2). Taylor & Francis alone published 12.4% of the 1,333 reported publications.

The majority of the journals chosen by UCCS faculty were hybrid journals. The amount of hybrid publishing varied by college, with Education the only college where closed publications outnumbered hybrid (see Figure 3). However, the data from Digital Measures does not indicate whether an author chose to publish OA when offered that choice by a hybrid journal. Publications characterized as OA in this paper refer to those published in purely OA journals. Overall, 20.7% of UCCS faculty publications were in OA journals. Again, this number may not truly reflect how much faculty research is openly available, as it does not account for hybrid publications or for publications deposited in institutional or other repositories. The amount of OA publishing varies by college (Figure 3), with the least OA publishing occurring in Business. Springer published 9% of UCCS's total OA publishing and was the most common OA publisher. However, OA publishing showed greater diversity of publishers: the top five publishers were responsible for only 28% of OA publishing, compared with hybrid and closed publishing, where the top five publishers accounted for 56%. OA publishing also appears to be rising slowly at UCCS (Figure 4); however, with only four complete years of data, it is difficult to call this a trend.

One of the oft-repeated concerns surrounding OA publication is predatory journals. However, in examining UCCS faculty's reported publications, only 35 of the 1,333 records (around 3%) were publications in predatory journals. The number of predatory publications each year does appear to be rising (Figure 5), with 2018 showing the highest count yet. Faculty have published in 26 different predatory journals from 22 different publishers. Seven publishers and six journals appeared more than once in the data, with one publisher responsible for 20% of UCCS's predatory publications.
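Publisher-concentration figures like those reported above (a handful of publishers covering most of the output) can be derived from the cleaned records with a cumulative tally. A sketch, under the assumption that each record carries a publisher name:

```python
from collections import Counter

def top_publisher_share(publishers, threshold=0.60):
    """How few publishers account for at least `threshold` of all
    records? `publishers` lists one publisher name per publication
    record. Returns (number_of_publishers, cumulative_share)."""
    counts = Counter(publishers)
    total = sum(counts.values())
    running, n = 0, 0
    for _, c in counts.most_common():   # largest publishers first
        running += c
        n += 1
        if running / total >= threshold:
            break
    return n, running / total

# Invented example: publisher A covers 6 of 10 records on its own.
pubs = ["A"] * 6 + ["B"] * 3 + ["C"]
n, share = top_publisher_share(pubs)
```

Ranking by `most_common` before accumulating guarantees the answer is the smallest set of publishers that clears the threshold.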

DISCUSSION

Tenure and Promotion Criteria
The emphasis on peer review throughout the RPT documents was consistent with other studies. Its near omnipresence across departments demonstrates what Harley et al. (2007) observed: "Conventional peer review is so central to scholars' perception of quality" (p. 3). However, they also noted this centrality becomes problematic for open access as "[t]here is a large tendency for many members of the research community to equate electronic-only publication with the lack of peer review" (Harley, Earl-Novell, Arter, Lawrence, & King, 2006, p. 3). Although it seems old-fashioned in 2020 to be suspicious of online journals, four RPT documents still discussed online journals as a separate publication type from print journals. Not only does this standard fail to reflect the modern scholarly publishing landscape, it can also be detrimental to OA publishing if faculty believe these publications will not count or will count for less than those published in traditional journals.
No documents contained explicit support for OA, which may affect both where faculty choose to publish and how tenure and review committees judge the publications of a candidate. The few documents that did acknowledge publicly available works or alternative publishing methods never mentioned OA journals or open repositories, creating a false dichotomy in which work could be publicly accessible or work could be scholarly, but never both. As a public institution and a frequent recipient of public funding, UCCS could do more to support faculty in returning their research to the public by acknowledging and encouraging OA journals and repository use as legitimate means of distributing research and meeting funding mandates. By contrast, peer review was frequently and explicitly noted as an important quality of scholarly contributions in the RPT criteria for every college, and unsurprisingly peer-reviewed publications made up 92% of UCCS's overall publications.
Of the other scholarly quality indicators that were mentioned in the tenure documents, few provided clear guidance. It is understandable and even desirable that RPT criteria avoid being overly prescriptive; the use of highly prescriptive criteria, such as journal lists or journal blacklists, can impinge on academic freedom, limit OA publishing, and impede interdisciplinary work (Bales, Hubbard, vanDuinkerken, Sare, & Olivarez, 2019). However, the vagueness of some quality indicators could actually lead to an increase in predatory publishing. Most RPT documents at UCCS emphasized peer review and several emphasized metrics, but none provided faculty with guidance about how to assess the quality of a journal's peer review or how to understand and validate the metrics a journal might claim. Setting aside concerns about the validity of metrics like journal impact factor or h-index for determining quality, many predatory journals tout inflated and impossible metrics on their homepages, and most claim to have a thorough peer review process. These metrics presumably represent the desire for visibility and impact, but very few documents mentioned Discoverability (referring to indexing) or public accessibility as a consideration in choosing a journal. Vague language used to describe the reputation of journals may have been intended as a way to empower faculty to publish more freely, but wording that refers to "top journals," "high quality journals," or "well-known journals" leaves the value of publications to the discretion of the tenure review committees and funnels research into a limited number of journals and publishers.
However, the inclusion of journal lists in RPT criteria is unwise. Not only can lists be seen as an impingement on academic freedom (Bales et al., 2019), they do not remain useful after their creation. The predatory journals identified in this assessment may change their names, new predatory journals will be created, faculty could discover different predatory journals, well-run journals could become illegitimate, and, on a positive note, some of the predatory journals could improve their practices and become legitimate. It is not enough to identify a predatory publisher or a predatory journal once. Choosing a publication outlet requires faculty to identify and verify the claims each publisher makes and to critically evaluate how different indicators of scholarly quality align with their personal, departmental, and institutional research goals. The necessity of these evaluation skills points again to the need for greater scholarly communications education and outreach at UCCS to ensure administrators, faculty, and students are empowered to critically evaluate journals and make the best choices for their research.

Faculty Publishing Habits
The data provided by Digital Measures showed which journals and publishers faculty favored, and it differed from the data provided by Scopus. It is unclear what causes the discrepancy between our research outputs as reported in Digital Measures and those reported by other bibliometric data sources; future research should determine specifically how Digital Measures data differs and the relative accuracy of each tool for measuring UCCS research outputs.
Even though faculty published in 887 different journals, the majority of UCCS research is produced by a handful of publishers. Additionally, the majority of publications occurred in hybrid journals. The most likely explanation is the number of publishers offering hybrid journals. With the ability to charge APCs as well as subscription prices, publishers have an incentive to make their journals hybrid. Hybrid publishing may also appeal to researchers as a chance to appease OA funding mandates while continuing to publish in traditional journals from known publishers that tenure and promotion committees recognize and reward. Name recognition could also account for the popularity of Springer (including the Nature titles) among UCCS's open publications.
The amount of hybrid publishing occurring at UCCS also indicates that there is an opportunity and need for education on the importance and benefits of OA, as faculty frequently have a choice between closed and open publishing. Because this study did not investigate how many articles in hybrid journals were OA, this research may underestimate how often UCCS faculty choose to make their research OA. However, according to the Hybrid OA Dashboard, between 2013 and 2019 only 3.5% of the articles published in hybrid journals were made OA (Jahn, 2020), suggesting that, even with OA as an option, the vast majority of authors still choose to publish traditionally.
Although OA publishing is still not very common at UCCS, it does appear to be rising. Yang and Li (2015) point to OA mandates from funding agencies like the NIH as the "main possible reason for the increase in awareness for OA publishing" at Texas A&M University (p. 12). Of UCCS's 2015-2018 journal articles, 17% received funding that mandated either open archiving or OA publishing. However, a longer study is needed to determine whether other factors, like hiring waves, influence the number of publications (both OA and not) in certain years. As already discussed, there is no explicit support for OA in the RPT criteria, and in many cases the criteria favor traditional publishing outlets, which could make faculty reluctant to publish in OA journals they perceive as risky. Faculty also might be unwilling or unable to pay journal APCs themselves. As noted before, some individual departments and colleges have funds for faculty APCs, but there are no university or library funds designated to support UCCS authors publishing OA.
Another reason for the lack of OA publications could be concerns surrounding predatory journals, but this assessment showed that predatory publishing is very infrequent at UCCS, accounting for less than 5% of faculty publications. The repeated use of certain predatory publishers or journals suggests that education on how to recognize a predatory journal and how to evaluate publication outlets may be helpful for faculty and administrators. Further education would not only allow faculty and administrators to recognize predatory journals but could also further OA publishing by increasing their comfort with assessing journals without relying on lists or name recognition.

CONCLUSIONS AND LIMITATIONS
This method of assessment was labor-intensive and may not be achievable for universities with large research outputs. The findings are dependent on a data source that, as mentioned before, is incomplete and assumes the accuracy of what faculty have entered themselves. It is also important to remember that while correlations may exist between the RPT criteria and faculty publications, this method of study does not capture faculty's perceptions or intentions in their publication choices. This first assessment of UCCS's RPT criteria and research outputs revealed an emphasis on peer review but a lack of guidance regarding OA, journal quality, and predatory publishing. The pervasiveness of peer review in the RPT criteria was reflected in the high percentage of peer-reviewed publications, and this correlation suggests that RPT criteria could be used to influence other practices as well, such as funding compliance. The publication activity reports from Digital Measures answered some of the research questions posed at the start of this assessment. The data provided a better sense of where and how UCCS faculty are publishing and established a baseline of OA and predatory activity so the KFL can track trends and the impact of future scholarly communications initiatives. Additionally, sharing this data with administrators and faculty has allowed the KFL to start conversations about scholarly communications issues.
The literature and this study's findings suggest that library advocacy alone will not sway faculty toward OA. Instead, our efforts will need to focus on a larger change in the institutional culture. A possible model is described in Odell, Coates, and Palmer (2016), where IUPUI adopted language favoring OA in its tenure and promotion criteria because the library worked to create "an explicit link between IUPUI's institutional values and OA" (p. 322). Odell et al. (2016) argue that a combination of advocating for the incorporation of OA in faculty governance, campus strategic planning, and tenure and promotion criteria, along with educational workshops, OA monitoring, and increased support for item-level evaluation and altmetric use by tenure committees, has created an OA-valuing culture at IUPUI. Describing OA advocacy efforts at the University of North Texas, Rodriguez (2017) echoes many of the same calls for greater faculty education and OA monitoring, but also suggests that libraries go further and provide financial support for OA.
The study data have also raised new questions to investigate. The disparity between UCCS's research output in Scopus and in Digital Measures raises questions about how Digital Measures compares in scope to traditional bibliometric data sources, as well as about the accuracy of the data faculty are self-entering about their research activities. Additionally, the messy and incomplete state of the data in Digital Measures suggests that its usefulness as a tool to capture and express faculty activities should be further tested. Finally, although it was possible to identify OA, closed, and hybrid journals from the data gathered, it is still unknown how many UCCS faculty members chose to publish OA within hybrid journals and how many have made their research publicly available through repositories.
Understanding where and how much UCCS faculty are publishing helps to establish a baseline for future assessments and may also prove helpful to the library in collection development and deal negotiations. With a greater focus on creating and sharing research at UCCS in the future, examining the patterns and trends that already exist will allow the KFL to better align its scholarly communications efforts with the needs and goals of the faculty it serves. To support OA at UCCS, it is clear the KFL needs to expand its outreach and engage with students, faculty, and administrators to create not only awareness but also policy and lasting support for OA.