Accepted for/Published in: Journal of Medical Internet Research

Date Submitted: Sep 28, 2021
Open Peer Review Period: Sep 28, 2021 - Nov 23, 2021
Date Accepted: Jul 18, 2022

The final, peer-reviewed published version of this preprint can be found here:

Sparks JB, Klamerus ML, Caverly TJ, Skurla SE, Hofer TP, Kerr EA, Bernstein SJ, Damschroder LJ

Planning and Reporting Effective Web-Based RAND/UCLA Appropriateness Method Panels: Literature Review and Preliminary Recommendations

J Med Internet Res 2022;24(8):e33898

DOI: 10.2196/33898

PMID: 36018626

PMCID: PMC9463617

Planning and reporting effective virtual RAND/UCLA Appropriateness Method panels: literature review and preliminary recommendations

  • Jordan B Sparks; 
  • Mandi L Klamerus; 
  • Tanner J Caverly; 
  • Sarah E Skurla; 
  • Timothy P Hofer; 
  • Eve A Kerr; 
  • Steven J Bernstein; 
  • Laura J Damschroder

ABSTRACT

Background:

The RAND/UCLA Appropriateness Method (RAM), a variant of the Delphi Method, was developed to synthesize existing evidence and elicit the clinical judgment of medical experts on the appropriate treatment of specific clinical presentations. Technological advances now allow researchers to conduct expert panels online, offering a cost-effective and convenient alternative to the traditional RAM. For example, the Department of Veterans Affairs recently used a virtual RAM to validate clinical recommendations for de-intensifying routine primary care services. A substantial literature describes and tests various aspects of the traditional RAM in health research, yet we know comparatively less about how researchers implement online expert panels.
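For readers unfamiliar with RAM scoring mechanics, the sketch below illustrates the classic classification rules described in the RAND/UCLA manual (Fitch et al, 2001) for a nine-member panel, using a common formulation of the disagreement rule; the function and example ratings are illustrative only and are not drawn from this study.

```python
from statistics import median

def classify_ram(ratings):
    # Classic RAND/UCLA rules for a 9-member panel (Fitch et al, 2001):
    # median 1-3 = inappropriate, 4-6 = uncertain, 7-9 = appropriate;
    # "disagreement" (>=3 ratings in 1-3 AND >=3 in 7-9) forces uncertain.
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    if low >= 3 and high >= 3:
        return "uncertain (disagreement)"
    med = median(ratings)
    if med >= 7:
        return "appropriate"
    return "uncertain" if med >= 4 else "inappropriate"

# Illustrative second-round ratings for one clinical indication.
print(classify_ram([8, 7, 9, 8, 6, 7, 8, 9, 7]))  # -> appropriate
```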

Objective:

The objectives of this study were twofold: 1) to understand how the virtual RAM process is currently used and reported in health research, and 2) to draw on our experience with the Assessing when to Stop or Scale back Unnecessary RoutinE Services (ASSURES) Study to provide preliminary reporting guidance for researchers to improve the transparency and reproducibility of reporting practices.

Methods:

The PubMed database was searched to identify studies published between 2009 and 2019 that used a virtual RAM to measure the appropriateness of medical care. Methodological data were abstracted from each article across the following categories: composition and characteristics of the online expert panels, characteristics of panel procedures, results, and panel satisfaction and engagement.
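For readers who want to run a similarly date-bounded PubMed query, a minimal sketch using Biopython's Entrez E-utilities wrapper is shown below; the search string and email address are placeholders, not the review's actual search strategy.

```python
from Bio import Entrez  # pip install biopython

Entrez.email = "you@example.org"  # NCBI requires a contact address (placeholder)

# Placeholder query; this is not the review's actual search strategy.
query = '"RAND/UCLA" AND (online[tiab] OR virtual[tiab] OR "web-based"[tiab])'

# datetype="pdat" bounds results by publication date, mirroring 2009-2019.
handle = Entrez.esearch(db="pubmed", term=query, datetype="pdat",
                        mindate="2009", maxdate="2019", retmax=200)
record = Entrez.read(handle)
handle.close()

print(record["Count"], "records; first PMIDs:", record["IdList"][:5])
```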

Results:

Of the 12 studies meeting the eligibility criteria and reviewed, only 42% (5/12) implemented the full RAM process, with the remaining studies opting for a partial approach. Among the studies that reported this, the median number of participants at first rating was 42. While 92% (11/12) of studies involved clinicians, 50% (6/12) involved multiple stakeholder types. Our review revealed that studies failed to report on critical aspects of the RAM process. For example, no studies reported response rates using the number of participants in previous rounds as the denominator, 42% (5/12) did not provide panelists with feedback between rating periods, 50% (6/12) either did not have or did not report a panel discussion period, and 25% (3/12) did not report quality measures for assessing aspects of the panel process (e.g., satisfaction with the process). A self-assessment of the ASSURES study revealed implementation features, including panelist engagement and satisfaction, that were not recommended in earlier reviews of the RAM method but nonetheless proved valuable for making mid-process adjustments.
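As an illustration of the response-rate recommendation (all numbers hypothetical), the sketch below computes each round's response rate with the previous round's respondent count as the denominator rather than the original invitation list:

```python
def round_response_rates(counts):
    # counts[0] = panelists invited; counts[i] = respondents in round i.
    # Each rate uses the *previous* round's count as the denominator.
    return [round(counts[i] / counts[i - 1], 3) for i in range(1, len(counts))]

# Hypothetical panel: 15 invited, 12 completed round 1, 11 completed round 2.
print(round_response_rates([15, 12, 11]))  # -> [0.8, 0.917]
```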

Conclusions:

In a post-COVID world, we anticipate that the virtual RAM will be an appealing option for researchers seeking a safe, efficient, and democratic process for reaching expert agreement. Our literature review uncovered inconsistent reporting frameworks and insufficient detail to evaluate study outcomes. We provide preliminary recommendations for reporting that are both timely and important for producing replicable, high-quality findings.


Citation

Please cite as:

Sparks JB, Klamerus ML, Caverly TJ, Skurla SE, Hofer TP, Kerr EA, Bernstein SJ, Damschroder LJ

Planning and Reporting Effective Web-Based RAND/UCLA Appropriateness Method Panels: Literature Review and Preliminary Recommendations

J Med Internet Res 2022;24(8):e33898

DOI: 10.2196/33898

PMID: 36018626

PMCID: PMC9463617


© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.
