Using Law School Faculty Author Profiles to Promote Impact: The U.S. News & World Report Saga Continues

to ensure that their faculty's publications were included in HeinOnline's database and that HeinOnline Author Profiles were accurate. This case study can help librarians tackling similar projects, whether in law libraries or elsewhere in academia, as metrics gathering for scholarly impact, including for educational rankings, becomes more prevalent.

NEXT STEPS

Following this project, Hein contacted UNC to inform the law library that it had identified an additional 119 articles not previously attached to UNC faculty Hein Author Profiles. Implications from this project include unclear and changing methodology, scalability issues, authority control concerns, failure to capture interdisciplinary works, and an increased workload for librarians. Librarians, especially Scholarly Communications Librarians, are well equipped to promote our faculty's scholarship by understanding the methodology of these educational ranking systems and by connecting our faculty and their research to the database tools of our field. This article represents just one field, but the implications apply more broadly.


INTRODUCTION
As the prominence of academic ranking systems has grown, devising ways to boost faculty scholarly impact has become increasingly important for most, if not all, disciplines in academia. Technology improvements that quantify academic metrics are now routinely used for tenure and promotion decisions, while news organizations such as U.S. News & World Report (U.S. News), with its ranking of colleges and universities in the United States, and Times Higher Education, with its World University Rankings, annually rank academic institutions for their news-reading public. These educational ranking systems can define an institution's selectivity, academic prowess, and prestige to the public, and are promoted as a tool for prospective students deciding where to apply and attend. Thus, how a program or institution ranks within these systems can have a serious effect on student enrollment, operational budget, and institutional prominence.
For law schools in the U.S., U.S. News' Educational Rankings are currently considered the most important ranking system in terms of measuring program quality and prestige. The methodology for the rankings has changed over the years, but in 2019, U.S. News announced a substantial change: it would now experiment with law faculty scholarship as an important new factor, considered either separately or as part of the overall law school ranking. Specifically, U.S. News now collaborates with William S. Hein & Co. Inc., a legal publisher and online research platform, to gather and measure faculty publications and scholarly impact.
Immediately, law schools across North America were forced to respond to this change to make sure that their faculty's publications were counted. As a law librarian at the University of North Carolina at Chapel Hill School of Law (UNC Law), I was charged with the task of ensuring that all the UNC Law faculty articles were accurately included in Hein's database and, correspondingly, in U.S. News' law school ranking system.
In this article I will discuss how I managed the author profiles and citation analytics of the UNC Law faculty in order to ensure that their scholarship would be included and tracked by Hein and subsequently U.S. News. Additionally, I will discuss some implications this may have on librarians, faculty members, and law schools. We have entered the era where every citation counts, beyond decisions for promotion and tenure, and librarians have a key role to play in ensuring that our faculty's scholarship is included and counted within these ranking systems.

PART I. UNIVERSITY RANKING SYSTEMS: AN OVERVIEW
Law is not the only field that cares about institutional rankings. In their article, Michael Sauder and Ryon Lancaster (2006) note, "[O]ver the last 15 years, there has been a great increase in the number of rankings of educational institutions published by widely circulating magazines and newspapers both in the United States and internationally" (p. 106). Governments and other funding bodies are concerned about research accountability when they provide grants to universities for research (Pagell, 2009). These government agencies and funding bodies want to know how universities and academic programs compare to others on both a national and international level (Pagell, 2009).
Prior to U.S. News rankings, which are consumer centric, the focus of institutional rankings and prestige was on faculty research output and peer review, not input factors and student and faculty characteristics (Pagell, 2009). U.S. News ranks a variety of academic disciplines, as well as overall undergraduate institutions. Each ranking has its own methodology (Morse, Hines, & Martin, n.d.). 1 Many of them contain a peer survey to determine prestige of an institution, but none of them have a specific scholarly impact factor (Morse, Hines, & Martin, n.d.).
However, U.S. News rankings are not the only rankings in the world. The Times Higher Education (THE) ranks universities globally with their Times Higher Education World University Rankings. According to THE's methodology (2019), they assess almost 1,400 universities across the globe. THE (2019) looks at teaching (the learning environment), research (volume, income, and reputation), citations (research influence), international outlook (staff, students, research), and industry income. There are 13 indicators used among these five areas. Research makes up 30% of an institution's ranking. Research looks at responses to THE's Academic Reputation Survey (18%), research income that is scaled for purchasing power parity (6%), and research productivity (6%). According to the methodology, for research productivity, THE looks at Elsevier's Scopus database: "To measure productivity we count the number of publications published in the academic journals indexed by Elsevier's Scopus database per scholar, scaled for institutional size and normalized for subject" (THE World University Rankings 2020: Methodology, 2019).
THE also uses citations as a way to measure research influence. This citation factor comprises 30% of an institution's ranking. Again, THE works with Elsevier to determine citation counts.
We examine research influence by capturing the average number of times a university's published work is cited by scholars globally. This year, our bibliometric data supplier Elsevier examined 77.4 million citations to 12.8 million journal articles, article reviews, conference proceedings, books and book chapters published over five years. The data include more than 23,400 academic journals indexed by Elsevier's Scopus database and all indexed publications between 2014 and 2018 (THE World University Rankings 2020: Methodology, 2019).
THE explains that they want to see how a university is contributing to the global scholarly community and that citations are a way to determine this. The citations are "normalized" as a way to counter citation discrepancies between subject areas with high citation counts that could benefit certain universities and hurt others (THE World University Rankings 2020: Methodology, 2019). Other ranking systems also use citations as a factor. For example, Baltagi's Worldwide Econometric Rankings ranks institutions based on their contributions to journals regarding econometrics (Baltagi, 2007; Pagell, 2009). The University of Texas at Dallas (UTD) Top 100 Business School Research Rankings looks at publications in the top 24 business journals to rank business schools in North America and worldwide (Naveen Jindal School of Management-The University of Texas at Dallas, n.d.). Academic Ranking of World Universities (ARWU) ranks the top 1000 universities in the world (About Academic Ranking of World Universities | About ARWU, n.d.). ARWU looks at how many highly cited researchers a university has using Clarivate Analytics, how many articles faculty published in particular journals, and how many articles are indexed in Science Citation Index Expanded and Social Sciences Citation Index (About Academic Ranking of World Universities | About ARWU, n.d.; Pagell, 2009; Pietrucha, 2018).
QS World Rankings also considers scholarly impact, looking at citations per faculty member (QS World University Rankings-Methodology, 2016). Citations per faculty member account for 20% of an institution's ranking (QS World University Rankings-Methodology, 2016). QS explains in their methodology that they use Elsevier's Scopus database (QS World University Rankings-Methodology, 2016). The CWTS Leiden Ranking uses Clarivate Analytics' Web of Science to account for faculty publications and scholarly impact when ranking institutions worldwide (Pagell, 2009; CWTS Leiden Ranking).
However, university ranking systems are not free of criticism. William C. McGaghie (2019) notes that U.S. News medical school rankings do not accurately assess medical school quality. Some critics point out that university ranking methodology is often flawed, using variables that can be easily manipulated to achieve higher status and perpetuating elitism of already highly ranked institutions (Marginson, 2014). Jung Cheol Shin and Robert K. Toutkoushian (2011) explain, "other possible negative side effects of rankings might be characterized as institutional homogenization, distorting disciplinary balance, and leading institutions to change their focus and mission in response to rankings" (p. 12).

PART II. LAW SCHOOL RANKING SYSTEMS
The impact that the U.S. News Educational Rankings has had on law schools is hard to overstate. Law school rankings are useful because it can be difficult to determine quality; thus, the comparative ranking system employed by U.S. News became a quick way for prospective students to judge the quality of an institution (Sauder & Lancaster, 2006). Sauder and Lancaster (2006) argue that U.S. News exerts such a tremendous impact on the field of law schools because the magazine is inexpensive and its format enables easy comparisons, so potential students and other outside audiences use it as their go-to source for determining the quality of a law school program.
In A Value-Added Ranking of Law Schools, Christopher Ryan, Jr. (2019) explained, "The U.S. News rankings are integral to understanding the current environment for legal education, because for better or for worse, the U.S. News rankings have become the 'gold standard of the ranking business,' as well as a proxy for determining a law school's quality and value" (p. 287). The U.S. News ranking methodology for law schools has long incorporated four major components:
1. Quality assessment, including a peer assessment score and an assessment score by lawyers and judges.
2. Selectivity, examining median LSAT and GRE scores, median undergraduate GPA, and acceptance rate.
3. Placement success, including bar passage rate and future employment of recent graduates.
4. Faculty resources, looking at expenditures per student, student-to-faculty ratio, and library resources (Morse, Hines, & Martin, n.d.; Ryan, Jr., 2019).
In 2000, Brian Leiter, law professor and legal scholar, developed the Leiter Ranking system, utilizing law school faculty citation counts as a means to comparatively rank U.S. law schools (Leiter, 2000). After the initial law school ranking in 2000 and a number of updates through 2010, a group from the University of St. Thomas in Minnesota continued the Leiter Rankings as an assessment starting in 2012 (Sisk, Aggerbeck, Hackerson, & Wells, 2011; Sisk, Aggerbeck, Farris, McNevin, & Pitner, 2015; Sisk, Catlin, Veenis, & Zeman, 2018). Tenured law faculty are included in the scholarly impact score for each law school, "because the scholarly impact score is derived from citations in legal journals, the proper subject of study is the tenured law school faculty member who is expected to contribute to that genre of legal literature" (Sisk et al., 2015, p. 117). They define scholarly impact as citations in law review articles over the past five years (Sisk et al., 2015). The University of St. Thomas group relied on Westlaw's Law Reviews and Journals database for their Leiter ranking study (Sisk et al., 2011, 2015, 2018). 2 "Schools are rank-ordered by their weighted score, which is the mean × 2 plus the median (since mean is more probative of overall impact than median, it gets more weight in the final score)" (Sisk et al., 2015, p. 118). Given the Leiter Ranking's emphasis on transparent, quantitative measurement of faculty impact, Above The Law, a popular law blog, speculated that U.S. News wanted to compete with it (Zaretsky, n.d.).
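The weighted score Sisk et al. describe can be sketched as a short computation. The citation counts below are hypothetical, invented purely for illustration:

```python
from statistics import mean, median

def sisk_weighted_score(citation_counts):
    """Weighted scholarly impact score per Sisk et al. (2015):
    twice the mean citation count plus the median, computed over
    a faculty's per-member five-year citation totals."""
    return 2 * mean(citation_counts) + median(citation_counts)

# Hypothetical five-year citation totals for one school's tenured faculty
counts = [120, 45, 80, 15, 60]
print(sisk_weighted_score(counts))  # mean 64, median 60 -> 2*64 + 60 = 188
```

Because the mean is doubled, a single highly cited scholar pulls a school's score up sharply, which is one reason the choice of weighting attracts scrutiny.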
With some similarities to Sisk et al.'s scholarly impact scores, in Fall 2019, Paul Heald and Ted Sichelman (2019) published "Ranking the Academic Impact of 100 American Law Schools," detailing their methodology for using raw data from HeinOnline (citation counts) and SSRN (download counts) to rank law schools by the scholarly impact of tenure and tenure-track faculty. Heald and Sichelman (2019) called on U.S. News to use a quantitative-based scholarly impact ranking similar to the one they created rather than the peer assessment that is currently taken into account.
Like other university ranking systems, U.S. News receives its share of criticism. In their article, "Jukin' the Stats: The Gaming of Law School Rankings and How to Stop It," Darren Bush and Jessica Peterson (2012) explain that students, scholars, faculty and deans criticize U.S. News rankings. The criticism revolves around three main topics: "flaws in the methodology employed by USNWR; undue weight given to the rankings by law students and faculty; and, the diversion of resources to inefficient uses" (p. 1241).

PART III. U.S. NEWS DECIDES TO PARTNER WITH HEINONLINE
In February 2019, U.S. News announced that it would begin to track faculty scholarship as a component of its methodology used to rank law schools (Morse, 2019a). "The intent is to analyze each law school's scholarly impact based on a number of accepted indicators that measure its faculty's productivity and impact using citations, publications and other bibliometric measures" (Morse, 2019a). To do this, U.S. News announced it would work with William S. Hein & Co. Inc. (Hein) as their sole source of faculty publications and citation information (Morse, 2019a).
Hein is a legal publisher whose online platform, HeinOnline, is available for a subscription fee. It has a large collection of legal periodicals and government documents, both current and historical. According to its website, HeinOnline contains more than 178 million pages and 270,000 titles of historical and government documents (HeinOnline, n.d.). HeinOnline includes the Law Journal Library, a database of over 2,700 law-related periodicals, each covered from its first published issue. Some journals have an embargo period for current issues and volumes, but Hein claims that more than 90% of the journals are available through the current issue or volume (HeinOnline, n.d.). Hein also has cases, constitutions, and other primary legal documents in its online catalog. The documents available on HeinOnline are PDFs of original print materials, and the publisher claims these image-based PDFs are fully searchable (HeinOnline, n.d.). With regard to citation tracking, Hein has added a tool called ScholarCheck and has created Author Profile Pages.
In a blog post, Hein explained that they are not receiving compensation for their involvement with U.S. News' scholarly impact ranking project (Mattiuzzo, 2019). Rather, U.S. News will create the methodology, while Hein will provide the citation metrics utilized within the scholarly impact indicator (Mattiuzzo, 2019). Using Hein's data, U.S. News will collect and calculate the mean citations of law school faculty members, the median citations per faculty member, and the law school's total number of publications from the last five years as the formulaic basis of the new metric (Morse, 2019a).
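Under stated assumptions (each faculty member is represented by five-year citation and publication totals; the field names and numbers are invented for illustration, not U.S. News' actual data model), the three announced inputs could be computed roughly as:

```python
from statistics import mean, median

def school_impact_metrics(faculty):
    """Compute the three inputs U.S. News announced: mean citations,
    median citations per faculty member, and total publications,
    each over a five-year window."""
    cites = [f["citations"] for f in faculty]
    return {
        "mean_citations": mean(cites),
        "median_citations": median(cites),
        "total_publications": sum(f["publications"] for f in faculty),
    }

# Hypothetical five-year totals for a three-member faculty
faculty = [
    {"citations": 200, "publications": 8},
    {"citations": 50, "publications": 3},
    {"citations": 95, "publications": 5},
]
print(school_impact_metrics(faculty))
```

Note how much each input depends on which citations the underlying database actually captures; the same faculty measured against a broader catalog would produce different numbers.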
Knowing that the current database was problematic in terms of authority control, article records, and comprehensiveness, U.S. News asked law schools to work with Hein to update their faculty lists (Morse, 2019a). Law schools were asked to send Hein a full list of their tenure-track faculty members to ensure that faculty members' Author Profiles were attributed to the correct law school (Caron, 2019). This also allowed users (including U.S. News) to view all of a law school's faculty members on a single institutional page within HeinOnline. Law schools were also asked to check faculty members' names, as authority control is essential for accurate citation counts.
Many law schools were not thrilled with this change in ranking methodology. Robert Morse, the Chief Data Strategist at U.S. News, wrote an "Open Letter from U.S. News & World Report to the Law School Dean Community" to address concerns that the publication was receiving from law school deans, professors, and other members of the legal community (Morse, 2019b). This new component of U.S. News' law school ranking methodology is scheduled to be implemented in the 2020 edition of the rankings. How it will be calculated and used is still unclear and evolving (Morse, 2019b). 3

PART IV. THE UNC LAW SCHOOL/HEINONLINE PROJECT
In May 2019, I was tasked with determining whether all of UNC Law faculty members' articles were properly identified in HeinOnline's catalog. This project fell into several stages; the two main components were: 1) establishing faculty profiles and authority control, and 2) ensuring accurate and comprehensive inclusion of faculty scholarship. The first part of this project was determining whether the 42 UNC tenure-track law faculty all had an Author Profile in HeinOnline. HeinOnline created Author Profiles in their database to track author publications. Authority control quickly emerged as a problematic issue (Decker, 2019; Heald & Sichelman, 2019). 4 We needed to confirm that there was only one Author Profile for each faculty member, but Hein's system sometimes created multiple Author Profiles for a single faculty member. It turns out that comma placement, middle initials, and misspellings caused multiple profiles to be created for the same individual, thus splitting the number of citations across multiple Author Profiles for one person. After identifying the name variations, including those caused by comma placement, we at the UNC Law Library sent the updated file to Hein, who supposedly combined the different Author Profiles under one name for each faculty member.
Hein added a tool called ScholarCheck that integrates with the Author Profiles to track citation metrics for authors (Furtak, 2019). Thus, for the second stage of this project, I needed to assess whether the Author Profiles on HeinOnline incorporated ALL of our faculty members' publications. This too was problematic. HeinOnline's database focuses only on legal publications: law review articles, cases, newsletters, and other legal documents. It does not include interdisciplinary work by faculty members. It also does not include many current legal treatises or monographs that are useful for legal practitioners. For example, a professor may be cited in Nimmer on Copyright, but Hein does not capture this citation because it is a LexisNexis publication that is not available on HeinOnline. Issues regarding Hein's limited catalog will be discussed further in Part V of this article, which highlights some of the implications of this project.
In order to identify whether Hein's Author Profiles contained all publications that should be on Hein, I downloaded each tenure-track faculty member's CV from the UNC Law Faculty Directory website. Next, I went to each Author Profile to confirm whether all of the articles on the faculty member's CV were included on the Author Profile. I compiled the information into an Excel spreadsheet, tracking the various names used by faculty members, the number of articles included on the Author Profile, and the citation counts for articles and cases noted in the Author Profiles.
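The CV-to-profile cross-check described above was done manually, but its core logic can be loosely sketched as a set comparison. The titles are invented, and a real comparison would need fuzzier matching than the exact (case-insensitive) match used here:

```python
def missing_from_profile(cv_titles, profile_titles):
    """Return titles on a faculty member's CV that do not appear
    in their HeinOnline Author Profile. Matching is exact and
    case-insensitive; real-world titles often need fuzzy matching."""
    profile = {t.lower().strip() for t in profile_titles}
    return [t for t in cv_titles if t.lower().strip() not in profile]

# Hypothetical article titles for one faculty member
cv = ["Water Law in the South", "Teaching Legal Research", "Coastal Policy"]
profile = ["Water Law in the South", "Coastal Policy"]
print(missing_from_profile(cv, profile))  # ['Teaching Legal Research']
```

Each title the sketch flags would then need a manual check: is the article genuinely absent from Hein's catalog, or present but detached from the profile by a metadata error?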
While checking Author Profiles and articles, I found more misspellings of faculty members' names in the article records. Thus, HeinOnline might have an article in its database, but due to erroneous metadata it sometimes was not connected to the correct Author Profile. I also found that some journals did not individually identify certain types of publications. For example, some journals lumped together all book reviews and all notes, not separating them out by author. Because it is unclear what types of publications, such as book reviews and notes/commentaries, will be counted for U.S. News' scholarly impact ranking, it may be worth correcting the metadata for these lumped publications. Many notes and commentaries are downloaded often, and if a journal does not split these up as separate items in Hein's database, then an author could be shortchanged on download counts (Kirschenfeld, 2019).

PART V. PROJECT RESULTS AND IMPACT ON SCHOLARLY IMPACT
Hein contacted UNC later in 2019 to inform the law library that it had identified an additional 119 articles by UNC Law faculty members that were not previously attached to their Hein Author Profiles. Hein used the information that we provided to find these additional articles. It was a long process to identify all of the different name variations, sort through CVs, and run numerous searches in the database, but 119 additional works seems like a large number. I hope other schools will share information about their own projects.
For librarians considering a similar project, I have the following practical recommendations. For obtaining CVs, see if you can do an institutional download of CVs or publication lists. Many faculty members include their publications in their annual reviews, so check whether a review committee already has a publication list or whether a department on campus has an up-to-date list. Some schools have faculty activity reporting systems like VIVO and Digital Measures Activity Insight; institutions have used these systems to create faculty profiles and to populate institutional repositories with publications gathered by faculty reporting (Givens, Macklin, & Mangiafico, 2017; Novak & Day, 2018). For this project, I downloaded CVs from our faculty members' directory pages; however, these turned out to be incomplete. The CVs were current only as of the last time the faculty members updated them, and some faculty members did not list all of their publications. For example, one faculty member who has been at UNC Law for many years chose to include only publications from 2000 to the present, omitting more than 20 years of earlier publication information. Even though U.S. News claims to be looking only at the past five years of publication data, checking all of a faculty member's publications would have been more thorough.
Keep track of the amount of time you spend. This is an incredibly time-consuming project. I had to track how much time I spent for multiple reasons, including demonstrating the value law libraries add to law schools. In a similar vein, be careful and detail oriented. This project will be tedious, especially for schools with large faculties.
Check multiple names, be flexible, and get creative. Authority control was a major issue during this project. Letters were transposed or added to names. Middle names and middle initials could sometimes result in multiple profiles. Common last names can be an issue. The placement of a comma between three different parts of a name resulted in multiple profiles as well. When an article on a CV was not on a Hein Author Profile, I would search by article citation. Often there would be an issue with the author's name, thus resulting in the article's exclusion from the appropriate Author Profile.
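As a rough illustration of why these name variants fracture citation counts, here is a sketch of a name-normalization heuristic. This is not Hein's actual matching logic, and the variants shown are invented:

```python
import re

def normalize_author_name(raw):
    """Collapse common variants (comma placement, middle initials,
    stray punctuation) into one comparison key. A crude heuristic
    sketch only; real authority control needs human review."""
    name = raw.strip().rstrip(".")
    if "," in name:                              # "Last, First M." -> "First M. Last"
        last, _, rest = name.partition(",")
        name = f"{rest.strip()} {last.strip()}"
    name = re.sub(r"\b[A-Z]\.?\b", "", name)     # drop standalone middle initials
    name = re.sub(r"[.,]", " ", name)            # strip leftover punctuation
    return " ".join(name.lower().split())

# Three hypothetical spellings that split one author across profiles
variants = ["Smith, Jane A.", "Jane A. Smith", "Jane Smith"]
print({normalize_author_name(v) for v in variants})  # {'jane smith'}
```

A heuristic like this can cluster candidate duplicates for review, but it cannot distinguish two genuinely different scholars who share a name, which is why persistent identifiers such as ORCID matter.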
Keep track of articles not associated with Author Profiles. It was interesting to see which articles were not on profiles. Often, they were interdisciplinary works that were not published in law journals.
While not directly related to this project's retrospective collection analysis, consider signing up faculty members for author profiles on other sites, such as Google Scholar, SSRN, and ORCID. This will help with their prospective authority control and calculation of impact. Hein's Author Profiles now include ORCID iDs, so we might as well take the time to register faculty members for persistent identifiers, if they do not already have them, to help with authority control.
While looking at Hein's profiles and citation counts, I also looked at faculty members' Social Science Research Network (SSRN) pages, Google Scholar, and Westlaw to see what citation counts looked like on other platforms. It was time consuming, but because we were already examining each author's scholarly metrics on Hein, it seemed like a good time to explore the metrics elsewhere. Doing so highlighted some of the implications of U.S. News' decision to use HeinOnline's metrics. Metrics on other platforms varied with the size of their catalogs and the types of publications they included, such as law journals, books, interdisciplinary journals, and journals from other disciplines.

PART VI. IMPLICATIONS
There are a number of implications that arise from U.S. News including scholarly impact as a factor for law school rankings. Below is a non-exhaustive list of a few that stood out to me while I was doing my research. This list may change as we have more information from U.S. News about the methodology they will use for the ranking factor. Clarification will be needed, especially with regard to what sources count for faculty members, what sources count for citation counts, and which faculty members count towards an institution's scholarly impact.

Unclear Methodology
U.S. News has been providing updates through blog posts, conference presentations, and letters to deans. One notable update is that Hein will be working with each individual school, which is going to be extremely time consuming. U.S. News has stated that the scholarly impact factor will not be taken into account in the overall law school rankings for the 2021 edition, though it may release a separate scholarly impact ranking for law schools. U.S. News has not yet determined the methodology for the scholarly impact factor, but it will not assess individual faculty members, instead looking at a law school's faculty as a whole.

Changing Methodology
How much effort should law schools and librarians put into these projects? I, like many of my law librarian colleagues, was in effect cleaning up HeinOnline's database. Additionally, U.S. News could very easily decide to change the platform it uses to capture metrics and have all of us scrambling again. Many other database companies track faculty impact in a more comprehensive fashion, for example the author profiles provided by Google Scholar or Elsevier (which owns the Social Science Research Network, Scopus, Bepress, and many other databases and platforms across disciplines, and whose parent company, RELX, also owns LexisNexis). If U.S. News were to use either of these products, metrics for law faculty who publish in interdisciplinary fields could be better represented than with the strictly legal-focused HeinOnline database.

Scalability
For my project, I was working with a relatively small number of faculty members. It would take much more time to do this for a large faculty, such as Georgetown's or Harvard's.

Authority control
Authority control has been a major issue and will continue to be in any database. Database creators need to ensure that documents are properly scanned and that metadata is accurate. This applies across any platform or ranking. Older systems cannot keep up with new demands: previously uploaded PDFs of journal articles may not have the necessary metadata or may not be properly OCRed to meet our citation analytics needs. U.S. News claims to be looking only at the most recent five years of scholarship, which theoretically should not be affected by outdated technology issues. However, if it decides to look at the full publication lists of some faculty members, it will run into this issue for scholars who have been publishing since the 1970s and 1980s. Hein has acknowledged issues with citation metrics: "The traditional HeinOnline citation counts are derived by looking for official citation patterns (i.e. Bluebook 5 ). We are aware that, due to OCR errors, improper use of citations, or lack of an official citation, the current citation metrics are not perfect. Our development team has begun to work on alternate methods to locate citations within the text of the database, which we are confident will improve the metrics' overall accuracy" (Mattiuzzo, 2019). Hopefully, with this focus on improving citation capture, authority control issues will improve. Hein has also started to include ORCID iDs on Author Profiles, which may help with future authority control issues. Perhaps the ORCID iD can be included as a metadata field as an additional way to tag the correct author of a work. 6

Interdisciplinary works
Interdisciplinary researchers and institutions that emphasize interdisciplinary research and publishing will be negatively impacted if Hein is the only database used for U.S. News rankings of scholarly impact. As Hein is limited to legal publications, it does not include publications from other fields. Many faculty members write about the impacts of law in other academic fields. For example, a faculty member that researches education law may publish in an education journal rather than a legal journal. These articles are not captured by metrics using only Hein's catalog.
In addition, Hein does not capture citations to law journals from scholarly publications in other fields. That professor writing about education law would not be recognized for citations from articles in education journals. Interdisciplinary research was all the rage in academia at one point. While we are all doing this work, it may be worth exploring other platforms: creating SSRN author profiles and Google Scholar author profiles, registering for ORCID iDs, and following other best practices for promoting scholarship and discoverability in disciplines outside of law when faculty members do interdisciplinary research.
After this project was completed, Hein created a way for authors to link their SSRN and Google Scholar profiles to their HeinOnline Author Profiles. Also, with the new ORCID phase II implementation, all works on ORCID connected to a specific author will now show on HeinOnline Author Profiles, even if the full text of an article is not available in Hein's database (Kibler, 2020b). Both of these are steps toward better reflecting interdisciplinary work in the Author Profiles. However, it remains unclear whether citations to these articles will be included.

Citation counts
Databases count the number of citations to an article from other items in their catalogs. So, the larger the number of relevant items in a catalog, the larger, theoretically, the number of potential citations. Total number of citations, h-indexes, and number of citations in the past five years may all be affected. This is something that others have discussed with regard to bibliometrics, so I will not explore it in this article, but it is interesting to think about. Using Hein could limit the number of citations because it has a smaller universe than a platform like Google Scholar, or because there is an embargo on recent articles for some journals. Also, we are unsure what sources count for citation counts. We do not know if they will include law review articles, book chapters, treatises, encyclopedia entries, newsletters, commentaries, notes, and book reviews. Similarly, we do not know what sources the citations can come from. Courts may cite law review articles or treatises in opinions, but we do not know if these count towards the citation counts. Nor do we know whether self-citations will be excluded from the citation metrics used to determine scholarly impact; some platforms allow metrics to be limited to non-self-citations. We also do not know whether altmetrics will be considered, such as Twitter mentions, saves within citation management platforms like Mendeley, or Wikipedia mentions.
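The h-index mentioned above illustrates how directly catalog scope shapes a derived metric. A minimal sketch, with hypothetical per-article citation counts for the same author as measured by a broad catalog versus a narrow, legal-only one:

```python
def h_index(citation_counts):
    """h-index: the largest h such that the author has h papers
    with at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical counts for one author's six articles as seen by
# a broad catalog vs. a narrow catalog that indexes fewer sources
broad = [25, 18, 10, 7, 4, 2]
narrow = [12, 9, 3, 2, 1, 0]
print(h_index(broad), h_index(narrow))  # 4 3
```

The publications are identical in both cases; only the citing universe differs, yet the summary metric changes. The same sensitivity applies to mean and median citation counts.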
Other databases have their own issues regarding citation counts. As I mentioned above, I also looked at author profiles and author search results on Google Scholar, Westlaw, and the Social Science Research Network (SSRN). Google Scholar metrics are usually higher, but that is because its algorithm pulls in more results than HeinOnline's. Google Scholar is an aggregator that captures much more than academic journal articles. Also, some results may be duplicates if they are captured from multiple sources, such as the journal website, institutional repositories, various databases, and online archives, both general and subject specific (Delgado López-Cózar, Orduña-Malea, & Martín-Martín, 2019; Jacsó, 2012; Martín-Martín, Orduña-Malea, Thelwall, & Delgado López-Cózar, 2018). Google Scholar may solve some issues related to interdisciplinary works, but it may also pull in too many miscellaneous results.

Five Years
Five years is a standard timeframe for citation metrics in many other ranking systems that take scholarly impact into account (Leiter, 2000; THE World University Rankings 2020: Methodology, 2019). This promotes current scholarly productivity, but it could have a negative impact on very new professors and on some senior tenured professors who were incredibly prolific in the years before the most recent five. Five could be seen as an arbitrary number, but it may be the industry standard for such metrics. Heald and Sichelman argue that all-time citation counts and 12-month citation counts should be used for measuring faculty impact (Heald & Sichelman, 2019).

Non-Tenure Track Faculty
Law schools have faculty members who are not on the tenure track. It was initially unclear whether non-tenure track faculty members would be counted in U.S. News's tracking of scholarly impact. Clinical professors may not have a writing requirement for tenure, but that does not mean they do not publish; UNC has a number of non-tenure track faculty members who publish articles. Law librarians publish as well. At some law schools, librarians are tenure-track, but at other institutions they are considered staff or hold a secondary appointment to the faculty. If every citation counts, should non-tenure track faculty who publish be factored into a law school's scholarly impact score?

Gaming the rankings system
Self-citations may increase unless they are excluded from citation counts. By using only Hein's catalog, citations are limited to law publications. This could push authors to cite specific journals, specific law school faculty members, or only recent scholarship (that is, from the past five years) in an attempt to boost a school's scholarly impact ranking.
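Platforms that exclude self-citations typically filter out any citing work co-authored by the cited author before tallying. A minimal sketch of that filtering, using hypothetical citing records (the author names and data structure are illustrative, not any platform's actual schema):

```python
# Hypothetical records: (set of citing authors, cited author).
citations = [
    ({"Smith", "Jones"}, "Smith"),   # co-authored self-citation
    ({"Lee"}, "Smith"),
    ({"Smith"}, "Smith"),            # pure self-citation
    ({"Patel", "Kim"}, "Smith"),
]

def count_citations(records, author, exclude_self=False):
    """Count citations to `author`, optionally excluding any
    citing work that the author co-wrote."""
    return sum(
        1 for citing_authors, cited in records
        if cited == author
        and not (exclude_self and author in citing_authors)
    )

print(count_citations(citations, "Smith"))                     # 4
print(count_citations(citations, "Smith", exclude_self=True))  # 2
```

Whether U.S. News applies such a filter would materially change how much an author can influence their own count.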

Subject Area
Certain subjects in law are cited more often than others. According to Shapiro and Pearse's article on the most-cited law review articles of all time, constitutional law, civil procedure, contracts, property, torts, and criminal law are written about, and cited, more often than smaller areas (Shapiro & Pearse, 2011). Could scholarly impact rankings dissuade authors from writing in "less cited" areas of law? Perhaps U.S. News should "normalize" citation counts between subject areas, similar to Times Higher Education's methodology, to balance law schools' areas of specialization (THE World University Rankings 2020: Methodology, 2019).
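One common way to normalize across fields is to divide each article's citation count by the average for its subject area, so that a well-cited article in a small field is not swamped by a high-citation field's baseline. A minimal sketch with hypothetical subject areas and counts (this is an illustration of the general technique, not THE's exact formula):

```python
# Hypothetical per-article citation counts grouped by subject area.
articles = [
    ("constitutional law", 40),
    ("constitutional law", 25),
    ("tax law", 6),
    ("tax law", 4),
]

# Average citations per article within each field.
field_counts = {}
for field, cites in articles:
    field_counts.setdefault(field, []).append(cites)
field_means = {f: sum(v) / len(v) for f, v in field_counts.items()}

# Field-normalized score: citations relative to the field mean.
# A score above 1.0 means "above average for its own field."
normalized = [(f, c / field_means[f]) for f, c in articles]
```

Here the 6-citation tax article scores 1.2 (above its field's mean of 5), while the 25-citation constitutional law article scores about 0.77 (below its field's mean of 32.5), despite having four times as many raw citations.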

Increased workload on librarians
Perhaps, though, administrators will see this as a value-added situation. "For example, the rankings, according to many law school administrators, change how both resources are distributed and work is done within law schools because administrators feel pressure to make decisions based on what is best for the school's rank rather than what is best for the quality of the education provided by the school" (Sauder & Lancaster, 2006, p. 1488). Because scholarly impact is now included in the U.S. News rankings, the library may be seen as adding value to the law school: librarians will be doing much of this work, and it will be a factor in rankings, so hopefully administrators will see the benefit of law libraries. However, would librarians need to check author profiles and publications every year for each faculty member? This could easily get out of hand and take over our workloads, especially if it grows in importance in the overall U.S. News rankings or spreads to disciplines outside of law. Luckily, after this project was completed, Hein enabled faculty members and other authors to make changes to their own profiles, similar to Google Scholar author profiles (Kibler, 2020a). Librarians are still able to make changes to the profiles, and with our specialized knowledge of these databases and of bibliometrics, we are probably better situated to work on these projects.
How fine-grained will this metric chasing go?
When every publication and citation counts, will this kind of micro-analysis and quality control by librarians become more and more prevalent? Here, HeinOnline effectively had law librarians do its quality control and improve its database because it was in the law schools' best interest to do so. How prevalent will this intertwined relationship become? Will librarians do the work of for-profit database companies? It would be worth investigating how Clarivate Analytics, Scopus, and other citation analytics platforms update their institution lists and handle authority control issues. These metrics have a real impact on academic institutions and on faculty members' promotion and tenure, so it is in schools' best interest to ensure they are accurate.
With the impact U.S. News rankings have on law schools' prestige and budgets, schools could choose to hire more men onto their faculties to boost citation counts. I am certainly not advocating that law schools do this. However, there is bias built into citation analytics, and a ranking system built on those analytics could perpetuate that bias.

CONCLUSION
U.S. News rankings will continue to be an important ranking system for law schools, and adding a scholarly impact factor for faculty members is not unheard of for academic ranking systems. No matter what, this announced change has already resulted in numerous hours of librarian time spent checking faculty Author Profiles on HeinOnline to ensure that all of their faculty's publications are listed on these profiles (Decker, 2019). As a practical case study, correcting author profiles on Hein is a time-consuming project, but hopefully law school administrations will see the value added by the law library in working to capture every citation. Although we do not know the exact methodology U.S. News will use for the scholarly impact factor, it is best to be prepared by ensuring that each article faculty members have published is properly attributed to them in Hein.
This article describes a case study of how one change in the U.S. News ranking methodology will potentially impact law schools, but has already impacted the work of law librarians around the country. But the issues facing law faculty and law librarians are not unique to that discipline; faculty in all fields, particularly in R1 universities, face mounting pressure to increase the quality and quantity of their scholarly publications. Likewise, scholarly communications librarians across the country are trying to increase the findability, accessibility, and impact of their faculty's publications.
Librarians are well equipped to promote our faculty's scholarship by understanding the methodology of these educational ranking systems and by connecting our faculty and their research to the database tools of our field. This article represents just one field, but the implications apply to all.