
Tracking research into practice: Are nurses on twitter a good case study?

11 June 2012

The holy grail of research assessment is a means of automatically tracking how research changes the way practitioners act in the real world. How does new research influence policy? Where has research been applied by start-ups? Have new findings changed the way medical practitioners treat patients? Tracking this kind of research impact is hard for a variety of reasons: practitioners don’t (generally) write new research papers citing the work they’ve used; even if they did, their work is often several steps removed from the original research, making the links harder to identify; and researchers themselves are often too far removed from the application of their work to be aware of it. Where studies of downstream impact have been done, they are generally carefully selected case studies that generate a narrative description. These case studies can be incredibly expensive and, by their nature, are unlikely to uncover unexpected applications of research.

In recent talks I have used a specific example of a research article reaching a practitioner community. This is a paper that I discovered while searching through the output of the University of Cape Town on Euan Adie’s Altmetric.com service. The paper deals with domestic violence, HIV status and rape. These are critical social issues, and new insights have a real potential to improve people’s lives, particularly in the region where the study was carried out. The paper was tweeted by a number of accounts, but in particular by @Shukumisa and @SonkeTogether, two support and advocacy organisations in South Africa. Shukumisa in particular tweeted in response to another account: “@lizieloots a really important study, we have linked to it on our site”. This is a single example, but it illustrates how it is possible to at least identify where research is being discussed within practitioner and community spaces.

But can we go further? More recently I’ve shown some other examples of heavily tweeted papers that relate to work funded by cancer charities. In one of those talks I made the throwaway comment, “You’ve always struggled to see whether practitioners actually use your research…and there are a lot of nurses on Twitter”. I hadn’t really followed that up until yesterday, when I asked on Twitter about research into the use of social media by nurses and was rapidly put in touch with a range of experts on the subject (remind me, how did we ask speculative research questions before Twitter?). So the question I’m interested in probing is this: can the application of research by nurses be tracked using links shared on Twitter as a proxy?

This is interesting from a range of perspectives. To what extent do practicing nurses who use social media share links to web content that informs their professional practice? How does this mirror the parallel link-sharing activity of academic researchers? Are nurses referring to primary research content, or is this information mediated through other sources? Do those other sources link back to the primary research? Can those links be traced automatically? And there is a host of other questions around how professional practice is changing with the greater availability of these primary and secondary resources.

My hypothesis is as follows: links shared by nurse practitioners and their online community are a viable proxy for (some portion of) the impact that research has on clinical practice. The extent to which links are shared by nurses on Twitter, perhaps combined with sentiment analysis, could therefore serve as a measure of the impact of research that targets professional nursing practice.
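As a very rough illustration of what such a proxy might look like, here is a minimal sketch. It assumes tweets from a curated list of nurse accounts have already been collected (for example via the Twitter API) into a file with one JSON object per line; the file name, field names and DOI-matching pattern are all illustrative, and a real analysis would also need shortened links (t.co, bit.ly) resolved and publisher landing pages mapped back to papers.

```python
import json
import re
from collections import Counter

# Illustrative input: tweets already harvested from a curated list of nurse
# accounts, stored one JSON object per line with at least a "text" field.
TWEETS_FILE = "nurse_tweets.jsonl"

# Crude pattern for links that point straight at a DOI resolver.
DOI_LINK = re.compile(r"https?://(?:dx\.)?doi\.org/(10\.\d{4,9}/\S+)", re.IGNORECASE)

def count_paper_mentions(path):
    """Count how often each DOI is linked across the collected tweets."""
    counts = Counter()
    with open(path) as f:
        for line in f:
            tweet = json.loads(line)
            for doi in DOI_LINK.findall(tweet.get("text", "")):
                counts[doi.rstrip(".,)")] += 1
    return counts

if __name__ == "__main__":
    for doi, n in count_paper_mentions(TWEETS_FILE).most_common(20):
        print(f"{n:5d}  {doi}")
```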

Thoughts? Criticisms?


6 Comments »

  • Dario Taraborelli said:

    I totally agree that we need to move from the abstract mission of understanding “broader impact” to concrete case studies of how practitioners access, share and use research (and measure this usage). I am not sure links shared via Twitter per se will provide enough (and sufficiently representative) data to understand how the nurse community functions (how many nurses is “a lot”? Why not start from Q&A platforms like StackExchange or Quora instead?), but it’s definitely worth trying. Another field that I expect will (or should) attract a lot of attention in the altmetrics community is law. I was excited to see studies such as this one [1] (looking at citations to Wikipedia articles in law reviews); the same should be possible for citations of research in non-scholarly legal media.

    [1] http://meta.wikimedia.org/wiki/Research:Newsletter/2012-04-30#Wikipedia_citations_in_American_law_reviews

  • Laura said:

    Another example to examine would be primary research on canine hip dysplasia. I think researchers would be absolutely stunned at how much their articles are anticipated, discussed, and debated in the working dog communities (by folks who have read every single study available on HD plus have experience working and breeding dogs). A good recent example would be Dr. Krontveit’s work on the impacts of housing and exercise on HD.

  • Cameron Neylon said:

    That’s an interesting idea. So I’ve done a quick dig and I don’t see terribly much recent activity on Twitter/Facebook that contains links to primary literature (searching for “canine hip dysplasia”) but might there be links via secondary reports?
    I’ve run two reports: one on all the papers returned by a PubMed search for “canine hip dysplasia” (http://total-impact.org/collection/DtELeA) and one on the five papers returned for a search on “Krontveit” (http://total-impact.org/collection/kJrCpZ). Neither turns up a whole lot, but that isn’t necessarily conclusive. The question is whether people would link to those original works or to some other source for the information. I’m also not sure how accessible the papers themselves would be.
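    For anyone who wants to repeat this kind of quick dig, a minimal sketch of the first step (pulling the PubMed IDs for a search term so they can be fed into an altmetrics tool such as a total-impact collection) might look like the following. It uses Biopython, and the contact email, search term and result limit are all placeholder values.

    ```python
    # Sketch: fetch PubMed IDs for a search so they can be fed into an
    # altmetrics tool (e.g. a total-impact collection).
    # Requires Biopython; email, term and retmax are placeholder values.
    from Bio import Entrez

    Entrez.email = "you@example.org"  # NCBI asks for a contact address

    handle = Entrez.esearch(db="pubmed", term="canine hip dysplasia", retmax=200)
    record = Entrez.read(handle)
    handle.close()

    pmids = record["IdList"]
    print(f"{record['Count']} records found; first {len(pmids)} PMIDs:")
    print(", ".join(pmids))
    ```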

  • Laura said:

    The canine HD example is just one that I’ve always wondered about. Do HD researchers know how much their articles get read and discussed by “regular folks”? Is that impact measured in any way, and do the researchers know that they are making a difference to daily practice? It’s likely not an obvious example unless you are part of the working dog community.

    Krontveit doesn’t have a lengthy body of work; I used her as an example because she has two recent articles out on the impact of exercise on HD that have been discussed quite a bit by the working dog community, since they contribute to the ongoing debate over whether HD is environmental or genetic.

    Here is where things get a bit pedantic:

    I would be surprised if you found anything using “canine hip dysplasia” as a search strategy. People would more likely post under things like “HD and New Article”, “HD and Exercise”, “HD and Ester-C”, etc., so HD plus some factor. They likely wouldn’t include “canine” or “dog”, or even spell out “hip dysplasia”, since their community already understands what they are discussing (though a specific dog breed might be mentioned). It’s analogous to you spelling out OA: your readers know what you mean.

    Thinking a bit more about this now, one issue you might find is that you won’t be able to search Facebook for evidence of discussions, since working dog groups can be restricted to members only, especially in the bite sports (e.g. French Ring, Mondio, IPO). You may find this issue in any practitioner community. Beyond Facebook and Twitter, you’ll find that old-school discussion forums (web-based and Yahoo email-based – really!) are still used, and often restricted to members.

    Your question about what people link to: it really varies. I’ve seen links to PubMed abstracts, to the journal’s abstract, write-ups in breed magazines, discussion forum threads, quotes from Facebook posts, etc. You’ll also see varied approaches to obtaining the actual article: some people will buy it from the publisher or access it at their local university library, but more likely there is a substantial amount of illicit sharing (e-piracy) going on. Here’s where you can make a case for OA (or even availability through DeepDyve) for HD research. Locking HD research behind a subscription wall makes it difficult for breeders and the public to (legally) access the work and, ultimately, to apply the research in breeding and puppy-raising practices. I respect the University of Guelph in Ontario (Canada) for making their longitudinal studies on canine heartworm freely available to the general public (http://www.ovc.uoguelph.ca/heartworm/2010/).

    OK- I’ve droned on long enough on this topic. :)

  • Cameron Neylon said:

    Hi Dario

    Yes, this would work as well and in some ways is more directed. AnneMarie Cunningham also pointed me at TILT (http://tilt.tripdatabase.com/), which is an interesting Q&A/knowledge base in the health practitioner space. The reason for going with Twitter was fundamentally that there is some existing infrastructure for digging into streams to find references to the formal scholarly literature. That doesn’t exist for Quora/StackExchange, although there is no particular reason why it shouldn’t. So I was looking for low-hanging fruit and this was what I came up with; I’m sure there are lots of other smart things to pursue.

  • Around the Web: Debating the NYPL renovation, Journal editor ethics and more – Confessions of a Science Librarian said:

    […] Tracking research into practice: Are nurses on twitter a good case study? […]