A rant about Elsevier Pure

I have other things to do but one day I’ll enlarge on the insidious effects of elevating this cursed little histogram of “Research output per year” as the single most important bit of information about academics at thousands of universities that use Elsevier Pure. Consider this mini-rant my notes for that occasion.

Most importantly, we DO NOT write per year. Our careers are too diverse and precarious to measure output that way. What we write is important, where to find it is key, but how much per calendar year is at best irrelevant.

You may object that it is informative. Ah yes, informative. For whom? Primarily for bean counters who care about ‘deliverables’, ‘outputs’ and other countable things. And of course for managers who care about ‘productivity’ and ‘volume’.

These cursed little histograms invite inferences about productivity, gaps, and publication volume that are guaranteed to be reductive and bias-ridden. One could make the case they are actively harmful, feeding into exactly the wrong kind of feedback loops. So why do unis do it?

Marketed to the managerial class

Systems like Elsevier Pure are marketed to Research Managers, and every bit of their design shows it. Only a managerial class bent on counting output and measuring productivity is receptive to marketing speak about “fact-based decision making” and “unlocking your full research potential”. Universities and institutions that use Pure’s “industry-proven data model” to create automated profiles for their researchers are making a big mistake.

Above I wrote how these public-facing histograms invite inferences that may be harmful. Of course that’s pretty much what Pure has been designed to do. Behind the scenes, there’s a plethora of ways to track metrics, targets, and progress right down to individual researchers. Just take the below screenshot from Elsevier’s marketing materials: Pure will help you “achieve performance goals by defining targets and tracking research progress over time”. This is the definition of corporate surveillance, and Elsevier is of course happy to bring it to capitalist neoliberal universities — and in the process insinuate itself into the veins of the system.

Incidentally, one has to admire the efficiency of this screenshot, showing that three researchers whose output was “insufficient” are “former staff”, i.e. have been let go. Once again, this is straight from the MARKETING MATERIALS of Pure, in case you were wondering. They really leave no doubt about their goals of limitless corporate surveillance and productivity policing.

Cursed little histograms

But I digress; my main beef is with the public profiles, which thoughtlessly include these little plots wherever possible — even in search results! Fortunately, my own institution, Radboud University, doesn’t use Pure (unless the recent VSNU deal with Elsevier forces it down our throats).

Elevating this useless histogram to such a prominent place on every researcher’s profile is the web design equivalent of “nerdview”: an ill-thought-out choice that makes very little sense to end users and is telling of your own biases. Fellow academics are probably the key audience for institutional homepages. When we look up someone’s page we do it to find a specific paper, read what they’re working on, perhaps check out recent work. We don’t want to see this cursed little histogram.

Updates

This post originated as an Oct 2020 thread on Twitter and drew quite some attention. Some research data managers woke up from their Twitter slumber to reluctantly agree that the prominent display of this visual on every single researcher’s profile may not send the right message in a time of Recognition and Rewards.

Indeed, two universities (VU Amsterdam and TU Delft) admitted outright that the output graphs violate the principles of SF-DORA and the Recognition and Rewards position paper, with VU Amsterdam taking the lead in removing the output-per-year graphs from their Research Portal. (Not a coincidence that the change was spearheaded by Sander Bosch and Lena Karvovskaya, who care about open scholarship rather than senseless stats.) Switching off this view is incredibly simple, by the way: it is literally a matter of flipping a single switch in the Pure settings for personal profiles.

2021 screenshot of Pure settings showing the way to remove the cursed little research output graph

Quite a few universities didn’t bother. Wageningen University signed SF-DORA and was, last I checked, still “looking for a better assessment system”, but until then chooses to reduce its researchers to little output graphs. Rector Frank Baaijens of TU Eindhoven co-authored the Recognition and Rewards position paper, but his own university still clings to the numbers game. TU Delft claims they want to recognise “a wide spectrum of products”, but their public research profiles invite only mindless bean counting. For shame, especially since all of them have signed SF-DORA. Actions speak louder than words.
