Key Points for Decision Makers

Patients, caregivers, and other relevant stakeholders are increasingly engaged in the process of developing clinical practice guidelines.

This paper provides practical guidance on using online modified-Delphi approaches to facilitate engagement of patients, caregivers, and other stakeholders in the guideline development process.

Based on a recent study about engaging individuals with Duchenne muscular dystrophy (DMD) and their caregivers in determining patient-centeredness of DMD care guidelines, we provide 11 practical considerations for using online modified-Delphi approaches for large-scale engagement.

1 Introduction

Patients and caregivers are increasingly recognized as key stakeholders in developing clinical practice guidelines (CPGs) [1,2,3]. Their involvement could potentially make CPGs more trustworthy, ensure their relevance to patient needs and preferences, facilitate the implementation of guidelines, increase compliance with CPG recommendations, and ultimately improve care quality [4,5,6]. The Institute of Medicine [7], Guidelines International Network [8], National Institute for Health and Care Excellence [9], and other organizations encourage patient and stakeholder involvement in CPG development. Nonetheless, there is little guidance on how best to incorporate patient and caregiver input in CPG development [10]. Commonly used methods include adding patients and their representatives to guideline working groups, conducting focus groups or individual interviews, and convening workshops, meetings, or seminars [5]. However, these methods typically require face-to-face interaction and do not allow for large-scale engagement. Guideline groups tend to include one or two patient representatives, and focus groups rarely have more than 11 participants. When patients do participate, they may feel intimidated by clinicians and researchers, especially if they have not been trained [11].

Online engagement approaches address many of these issues. They are scalable and do not require travel to a central location, and the anonymity of online participation is often associated with greater openness [12] among diverse groups of patients and their representatives. Participating from home or another location of the patient's choosing makes panels more accessible, particularly for patients who have challenges with travel or public speaking [13]. Online engagement in CPG development may include commenting on draft guidelines, participating in a Delphi process, and using voting tools, wikis, and discussion forums [1, 3, 14,15,16,17]. An online modified-Delphi approach that combines rounds of rating, anonymous feedback on group results, and a moderated online discussion forum is a promising way to engage large and diverse groups of patients and their representatives [11, 18, 19].

Although this guidance is written primarily for guideline groups and aims to illustrate how to use an online modified-Delphi approach to engage patients and their representatives during different stages of CPG development, we believe it will also be useful for large-scale stakeholder engagement in other areas, including prioritizing tasks, creating research standards, and developing healthcare quality indicators [20, 21]. We highlight 11 practical considerations based on our experience in a recent study about engaging individuals with Duchenne muscular dystrophy (DMD) and their caregivers in determining the patient-centeredness of the 2018 DMD care considerations, which were published a month before we conducted the online panels [18]. This guidance is also informed by the literature on patient engagement in CPG development [5, 22], our experience conducting more than 25 similarly structured online expert and stakeholder engagement panels, and best practices for conducting Delphi panels [23, 24]. Although we used ExpertLens (an online modified-Delphi system) to collect data, other online platforms with rating, feedback, and discussion functionalities could also be used. Finally, while participants in our study provided input on the patient-centeredness of CPG recommendations, the engagement method could also help prioritize guideline topics and intervention outcomes or help determine the extent to which recommendations help ensure health equity.

2 A Brief Description of an Online Modified-Delphi Approach to Engagement

Direct interaction among participants distinguishes modified-Delphi methods from traditional Delphi panels. Expert panels conducted using the RAND/UCLA Appropriateness Method (RAM) consist of two rating rounds and a face-to-face or phone discussion conducted between the rating rounds [25]. In health services research, RAM is often referred to as a modified-Delphi method because it adds the discussion round. Clinical experts used RAM to develop the 2018 DMD care considerations [26].

Although there are different ways of using the online modified-Delphi method for engaging patients and their representatives in the CPG development process, one way of doing so is to conduct a three- or four-round engagement process to determine the patient-centeredness of draft guideline recommendations using the RAND/PPMD Patient-Centeredness Method (RPM) (Fig. 1) [19].

Fig. 1 The RAND/PPMD Patient-Centeredness Method (RPM). Source: Khodyakov et al. [19]

In an optional Round 0 of the RPM, participants are asked about their care preferences, needs, and interests, and about the barriers to and facilitators of seeking care. Round 0 is indicated if this information is not available from prior research, and its results can encourage participants in subsequent rounds to think beyond their personal experiences. If needed, participants are also asked to prioritize care outcomes, barriers, and facilitators for a given aspect of care. In Round 1, participants review draft care recommendations, rate them on a predefined set of criteria, such as importance and acceptability (see Box in section 3.1.1), and explain their ratings in open-text boxes. In Round 2, they see how their own Round 1 answers compare with those of the group and whether consensus has been reached, and they contribute to a moderated, asynchronous, and (partially) anonymous discussion board. Finally, in Round 3, participants can revise their original ratings. The RAM approach to determining group consensus is then applied to Round 3 ratings to arrive at the final group decisions [25].
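To illustrate the mechanics of the feedback step, here is a minimal sketch of how a participant's Round 2 feedback could be computed, assuming nine-point ratings; the participant IDs, data, and function are hypothetical illustrations rather than part of the RPM or ExpertLens:

```python
from collections import Counter
from statistics import median

# Hypothetical Round 1 ratings of one recommendation on the nine-point
# importance scale (1 = not at all important, 9 = extremely important).
ratings = {"P01": 8, "P02": 9, "P03": 7, "P04": 4, "P05": 8, "P06": 9}

def round2_feedback(ratings, participant_id):
    """Summarize how one participant's Round 1 rating compares with the group."""
    group_median = median(ratings.values())
    counts = Counter(ratings.values())
    distribution = ", ".join(f"{point}: {n}" for point, n in sorted(counts.items()))
    return (f"Your rating: {ratings[participant_id]} | "
            f"Group median: {group_median} | Distribution: {distribution}")

print(round2_feedback(ratings, "P03"))
# Your rating: 7 | Group median: 8.0 | Distribution: 4: 1, 7: 1, 8: 2, 9: 2
```

Showing each panelist their own rating next to the group distribution, rather than only a group average, is what lets Round 2 prompt reflection and discussion without revealing who gave which rating.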

3 Practical Considerations for Conducting Online Modified-Delphi Panels

The following practical guidance for using the online modified-Delphi approach covers three stages of stakeholder engagement—preparation, implementation, and evaluation and dissemination—and includes examples from our recent study (Fig. 2).

Fig. 2 Eleven practical considerations for online modified-Delphi panels

3.1 Preparing for Research

3.1.1 Co-Develop an Engagement Approach with Relevant Patient Representatives

Guideline developers should determine who should be engaged in the CPG process and work with patients, caregivers, and their representatives to design all engagement activities and data collection protocols. At this stage, developers should also consider whether patients may have substantively different perspectives than caregivers and, therefore, whether patients should be engaged independently from, or together with, caregivers in the CPG development process. Forming an advisory board (AB) could also be useful. Research suggests it is important to engage relevant stakeholders early on and ask for their input often [27]. Working with a patient advocacy organization can help locate patients, caregivers, and others with relevant perspectives who can provide input on patient needs, the feasibility of proposed engagement activities, appropriate participation burden, and acceptable remuneration for participation. Patient representatives can be instrumental in helping operationalize the engagement tasks, define key concepts, translate scientific information, and finalize research protocols [28, 29]. All research-related activities should be reviewed and approved by the institutional review board.

Examples We worked with the Duchenne Registry to identify key patient and caregiver partners and assembled a multi-stakeholder AB that included one adult with DMD, two caregivers, two clinicians, two genetic counselors, three researchers, and two guideline developers. The AB was co-led by a caregiver and a Delphi expert, who made sure that all decisions were made jointly and that patient and caregiver voices were heard, valued, and given more weight than those of the other AB members in discussions of decisions that could affect what the panelists were asked to do and how the panel results were interpreted. We found patient and caregiver input particularly useful for defining, measuring, and operationalizing patient-centeredness in the guideline context (see Box). Caregivers and patients on the AB also helped us identify the recommendations most likely to be of interest to patients and caregivers. To ensure participants understood the complex medical information, we developed plain-language explanations of each recommendation, which patients and caregivers finalized together with clinicians. Using AB input, we also included the clinical rationale for each care consideration, a description of the process for following the guidance, and other relevant information, such as treatment burden.

Box Definition of patient-centeredness and rating criteria (importance and acceptability)

3.1.2 Mirror Methods Used for Expert and Stakeholder Engagement

One way to increase the scientific rigor and legitimacy of patient engagement in CPG development is to adapt the methods that clinical experts use to develop guidelines. Because CPG development is labor intensive and time consuming, it is crucial to ensure that participants do not feel overburdened [30]. Finding a balance between rigor and ease of participation is key.

Examples To mirror the methods clinicians used for the 2018 DMD care considerations [26], we began Round 1 by providing study participants with data we collected in Round 0 on the reasons for, and the barriers and facilitators associated with, seeking care. We then asked participants to rate the patient-centeredness of guideline recommendations (Fig. 3). This corresponded to the step of providing clinical experts with a literature review before asking them to rate the appropriateness of different treatments. We also adopted a three-round modified-Delphi format and used a nine-point rating scale, which mirrored the appropriateness and necessity scales that clinicians used to develop the 2018 DMD care considerations. Finally, we adopted the RAM approach to determine consensus [25].

Fig. 3 Round 1

3.1.3 Pilot Test the Engagement Approach

It is best practice to pilot test any data collection instrument with a small sample of qualified participants [31]. A pilot is particularly important for online modified-Delphi approaches [32] because the task is novel for a typical patient and there are nuances to using online platforms. It is also important for ensuring participants can actually use the online tool, especially if they have disabilities: because guideline developers and panel participants are not in the same room, developers cannot provide assistance in real time. Pilot testers should not be counted as study participants.

Examples Based on our experiences [33], we recommend testing the clarity of participation instructions, recommendation wording, and rating criteria. A pilot allowed us to estimate the time that participation in each round was likely to take, which helps determine the amount of remuneration, if any. Asking testers for feedback at the end of the pilot, via a survey or brief telephone interview, can help identify how the wording of recommendations should be changed, what information to add or delete, and how to improve the engagement process. Based on feedback we received during the pilot, we reduced the number of recommendations that participants had to rate.

3.1.4 Recruit Participants with Diverse Perspectives

Expert panels are often criticized for not including diverse perspectives. A panel on the clinical appropriateness of carotid endarterectomy that includes only surgeons will arrive at different recommendations than a panel of surgeons, neurologists, primary care physicians, and radiologists [34]. The same can be true of patient panels. It is important to ensure that patient representatives have relevant experiences and to help them think about the experiences of a typical patient, especially if patient-only panels follow a methodology designed for clinical panels.

Examples We found that using an established and curated patient registry was helpful for recruiting a panel with diverse views. While it may be difficult to know in advance which types of patients will have different views on a given issue, we were able to meet our diversity goal by drawing on previous research on patient preferences, recruiting demographically and geographically diverse panelists, and including individuals at different stages of disease progression. If recruitment via registries is not possible, screening should be used to confirm a participant's experience with the condition.

3.1.5 Assemble a Panel of Adequate Size and Composition

Assembling panels of adequate size and composition helps ensure an effective and productive online discussion and accounts for attrition in online modified-Delphi panels. Research suggests empaneling approximately 40 participants; larger panels may increase participation burden during the discussion round, and smaller panels may become too small due to attrition [35]. Attrition is typical of all Delphi panels because they rely on iterative data collection [36]. It is not uncommon for online Delphi panels with only two rating rounds to have 50% participation rates, calculated by dividing the number of participants who complete all rounds by the number invited to participate [37].
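To make this sizing guidance concrete, here is a back-of-the-envelope sketch; the target of roughly 40 completers reflects the research cited above [35], while the completion rate is an illustrative assumption, not a figure from our study:

```python
import math

def invitees_needed(target_completers: int, completion_rate: float) -> int:
    """Number of panelists to empanel so that, after expected attrition,
    roughly `target_completers` finish all rounds."""
    return math.ceil(target_completers / completion_rate)

# With a target of ~40 completers and an assumed 65% all-round completion
# rate (both illustrative), empanel about 62 participants.
print(invitees_needed(40, 0.65))  # -> 62
```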

Examples To account for attrition, we included 61 participants in each panel. To reduce attrition, we asked participants during recruitment to confirm their interest and intention to participate. We made sure both panels consisted of patients and caregivers to ensure diversity of perspectives. Because DMD is a rare pediatric disorder, most participants were parents of, or caregivers to, individuals with DMD, but we also included adults with DMD.

3.2 Implementation and Continuous Participant Engagement

3.2.1 Build Participant Research and Engagement Capacity

CPG groups typically require patients and their representatives to undergo extensive training on the CPG development process, which can make patients unwilling to engage [22]. Although an online platform can help reduce perceived participation burden, it is important to ensure that participation instructions and task descriptions are self-explanatory. Because some participants are more comfortable than others with online technologies and with sharing their disease experiences, CPG developers should try to put all participants on a level playing field.

Examples To build their capacity, we provided participants with instructions on how to participate in the online process and use the online platform. The instructions were modified based on the pilot results. We included instructional videos on how to log into ExpertLens and participate in each round. Because Round 2 used charts showing the distribution of participants’ responses, we provided explanations of what each chart showed, included tooltips that explained statistical terms, and color-coded group responses/decisions (i.e., green text identified recommendations that participants agreed were important or acceptable) (Fig. 4). In case participants had questions or technical issues, they received contact information for study staff, including the principal investigator, caregiver representative, clinician, and technical support personnel.

Fig. 4 Round 2
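To make the Round 2 feedback described above concrete, here is a minimal sketch of a ratings-distribution chart using matplotlib; the data, colors, and layout are illustrative assumptions rather than the ExpertLens implementation:

```python
import matplotlib.pyplot as plt
from statistics import median

# Hypothetical Round 1 importance ratings for one recommendation.
ratings = [8, 9, 7, 4, 8, 9, 6, 8, 7, 9]

counts = [ratings.count(point) for point in range(1, 10)]
group_median = median(ratings)

fig, ax = plt.subplots()
ax.bar(range(1, 10), counts, color="steelblue")
ax.axvline(group_median, color="green", linestyle="--",
           label=f"Group median = {group_median}")
ax.set_xticks(range(1, 10))
ax.set_xlabel("Importance rating (1 = low, 9 = high)")
ax.set_ylabel("Number of participants")
ax.set_title("Round 2 feedback: distribution of Round 1 ratings")
ax.legend()
plt.show()
```

Pairing such a chart with plain-language explanations, tooltips for statistical terms, and color-coded group decisions, as we did, helps participants without statistical training interpret the group's results.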

3.2.2 Build Two-Way Interaction

Although face-to-face interaction may be more engaging than online discussion boards, threaded discussion boards allow participants to engage in more thoughtful conversations and explore other participants’ ideas [38]. That is why encouraging two-way information exchange and lively discussions is particularly important for online modified-Delphi panels. Make sure discussion boards have a clear structure and allow participants to keep track of comments made by other participants. As with in-person expert panels, an experienced discussion facilitator is crucial. The facilitator’s role is to encourage discussion, solicit comments from all participants, and ensure that no single participant dominates the conversation [25, 39].

Examples In our experiences, providing the distribution of Round 1 responses and a summary of participants’ rationales in Round 2 helps promote discussion because participants see how their responses compare with those of other participants. A threaded discussion board structure makes it easier for participants to find the right place to share their opinions (Fig. 4). Using participant IDs helps ensure that all comments made by a given participant can be attributed to him or her, and the anonymity facilitates an open exchange of information. We found it useful for the user ID to show whether a participant was a caregiver or a patient to help participants contextualize their comments [49].

To ensure active discussion engagement, three trained discussion moderators (a caregiver, a genetic counselor, and a modified-Delphi expert) facilitated the discussions by reviewing and posting comments at least once a day. Moderators followed a guide (see Appendix A) and were instructed to focus on group dynamics, ask non-leading clarifying questions, promote direct engagement among participants, and answer factual questions about the study. They also provided access to additional informational resources as needed.
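One possible data model for such a threaded, moderated, role-tagged discussion board is sketched below; the class, fields, and ID scheme are illustrative assumptions, not the ExpertLens design:

```python
from dataclasses import dataclass, field

@dataclass
class Comment:
    """One post on a threaded Round 2 discussion board.

    Participant IDs are pseudonymous but stable across rounds and carry a
    role prefix (e.g., 'CG' for caregiver, 'PT' for patient) so readers can
    contextualize comments without knowing identities.
    """
    author_id: str          # e.g., "CG-07" or "PT-03"
    text: str
    approved: bool = False  # moderators review posts before release
    replies: list["Comment"] = field(default_factory=list)

thread = Comment("CG-07", "This recommendation is hard to follow for families far from clinics.")
thread.replies.append(
    Comment("PT-03", "Agreed; travel burden matters as much as the therapy itself.")
)
```

Keeping replies nested under the comment they respond to is what lets participants track conversations, while the stable pseudonymous IDs preserve both attribution and anonymity.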

3.2.3 Ensure Continuous Engagement and Retention of Participants

Because participant attrition is common in Delphi panels [32, 36], it is important to keep panelists engaged throughout all study rounds. Unlike one-off surveys, the Delphi method relies on iterative data collection, and many panelists will be unfamiliar with it. Panelists can participate at any time while each round is open but are expected to contribute to every round; because of the time gap between rounds, reminding them to participate is critical.

Examples To encourage continuous engagement, we informed participants about expected time commitments and paid them US$50 for completing each round. We sent personalized email invitations when each round opened and emailed up to three reminders to lagging participants during each round. We extended round deadlines as needed. If requested, we allowed participants to complete Round 1 after Round 2 had opened, but before they saw other participants' responses and comments. Such flexibility may be required when the condition of interest causes significant impairment or treatment burden. During Round 2, participants also received daily discussion digests informing them when others had posted new comments or responded to their own comments.

3.2.4 Conduct Scientifically Rigorous Data Analysis

Research shows that the methods used to measure consensus can have a significant impact on study findings [40] and calls for specifying how Delphi data will be analyzed before they are collected [41]. The RAM manual offers a validated and frequently used measure of consensus for nine-point Likert scales [25]. Moreover, Delphi panels have been criticized for low replicability of their findings [42]. It is therefore prudent to conduct more than one panel using the same protocol, balance panel composition on key variables that might affect outcomes, and include data from all panels in the a priori determination of group consensus [43]. Because the Delphi technique is based on a mixed-methods approach to data collection, thematic analysis of qualitative comments can help explain why consensus was or was not reached [44].
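For readers unfamiliar with the RAM, here is a minimal sketch of one common RAM-style consensus classification for nine-point scales; the "thirds" disagreement test shown is only one of the definitions discussed in the RAM manual [25], and the function, labels, and data are our own illustration:

```python
from statistics import median

def ram_classification(ratings: list[int]) -> str:
    """Classify nine-point ratings using a RAM-style rule.

    Disagreement here uses the simple 'thirds' test: at least one-third of
    ratings fall in 1-3 AND at least one-third fall in 7-9. The RAM manual
    describes this and other disagreement definitions (e.g., the IPRAS).
    """
    n = len(ratings)
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    if low >= n / 3 and high >= n / 3:
        return "disagreement"     # polarized panel -> no consensus
    m = median(ratings)
    if m >= 7:
        return "agreement: important/acceptable"
    if m <= 3:
        return "agreement: not important/acceptable"
    return "uncertain"            # median in the 4-6 range

print(ram_classification([8, 9, 7, 8, 7, 9, 6, 8]))
# -> agreement: important/acceptable
```

Specifying such a rule before data collection, as recommended above, prevents the consensus definition from being chosen after the results are known.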

Examples To ensure the rigor of our panel findings, we published our research protocol at the beginning of the project [18] and used the RAM to measure consensus [25]. We also ran two concurrent panels using the same protocol to test the replicability of panel findings. We randomly assigned selected participants to one of two panels and balanced the panels in terms of caregiver educational attainment, ambulatory status of the individual with DMD, and distance to the closest PPMD Certified Duchenne Care Center [45], which we considered key variables that might affect determinations of patient-centeredness [46]. Our a priori criterion for patient-centeredness was that both panels had to agree that a recommendation was important and acceptable. Finally, we qualitatively analyzed all comments made by participants throughout the panel to determine points of agreement and disagreement and any differences in perspective between patients and caregivers.
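A minimal sketch of stratified random assignment to two balanced panels follows; the participant records and strata are hypothetical stand-ins for balancing variables like those we used:

```python
import random
from collections import defaultdict

# Hypothetical records: (id, stratum), where the stratum combines balancing
# variables such as education, ambulatory status, and distance to care.
participants = [
    ("P01", "college/ambulatory/near"), ("P02", "college/ambulatory/near"),
    ("P03", "college/non-ambulatory/far"), ("P04", "college/non-ambulatory/far"),
    ("P05", "no-college/ambulatory/far"), ("P06", "no-college/ambulatory/far"),
]

def assign_two_panels(participants, seed=42):
    """Randomly split each stratum in half so both panels stay balanced."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for pid, stratum in participants:
        strata[stratum].append(pid)
    panel_a, panel_b = [], []
    for members in strata.values():
        rng.shuffle(members)
        half = len(members) // 2
        panel_a.extend(members[:half])
        panel_b.extend(members[half:])
    return panel_a, panel_b

panel_a, panel_b = assign_two_panels(participants)
print(panel_a, panel_b)
```

Splitting within strata, rather than randomizing the whole pool at once, keeps the two panels comparable on the variables most likely to affect their judgments.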

3.3 Evaluation and Dissemination

3.3.1 Evaluate Engagement Activities

Participant experiences with the Delphi process are not typically evaluated as part of every panel. Yet understanding what works and what does not is important for assessing the quality of panel findings and the engagement process, as well as for retaining participants during iterative data collection [47].

Examples All panels conducted using the ExpertLens system include questions that measure participant experiences and satisfaction with the platform [48]. For our study, we slightly modified these questions and asked them after Rounds 1 and 3. We also interviewed a diverse sample of individuals with DMD and their caregivers after the modified-Delphi process was completed [49].

3.3.2 Disseminate Results

Sharing results with participants [50] is a key principle of participant-centered research [51], and sharing individual results and overall study findings can help enroll and retain participants in longitudinal projects [52, 53]. Disseminating study findings to wider audiences, including patients, caregivers, clinicians, and guideline developers, is important not only for the conduct of rigorous and transparent research but also for improving care quality and helping develop future guidelines [2, 54].

Examples Feedback on Round 1 results can serve as an important incentive for participating and engaging in Delphi panels. In Round 2 of our study, we not only provided statistical summaries of Round 1 ratings but also thematically analyzed the reasons behind participants' ratings. We also emailed copies of Round 2 discussion comments to participants who requested them after the panels were completed. We presented preliminary study findings to our panelists in a webinar that was posted on PPMD's YouTube channel (https://www.youtube.com/watch?v=aps_E08C4fg). To reach a wider audience, we presented our results at the annual PPMD and G-I-N conferences, as well as at the Centers for Disease Control and Prevention, which was responsible for developing the 2018 DMD care considerations. In addition, we gave a G-I-N webinar, which was recorded and posted on the G-I-N North America website (https://g-i-n.net/library/webinars/g-i-n-n-a-webinars/a-new-online-approach-to-engaging-patients-andcaregivers-in-guideline-development/?searchterm=khodyakov). Finally, we published the results in peer-reviewed journals [19, 46, 49].

4 Conclusions

The importance of involving patients, caregivers, and/or their representatives in the process of developing CPGs has been recognized by guideline developers. We offer 11 practical considerations for using online modified-Delphi approaches to facilitate large-scale engagement. While our examples come from a recent study that engaged individuals with DMD and their caregivers in rating the patient-centeredness of already finalized DMD care considerations, online modified-Delphi approaches could be used to engage relevant stakeholders throughout all stages of guideline development and beyond.

However, online engagement requires specialized resources and has limitations. First, guideline developers need access to an online platform with survey, discussion, and analytic capabilities, and patients need access to an internet-connected device. Second, some patients may not find the online experience as engaging as in-person meetings; nonetheless, people are quickly becoming accustomed to using technology in all aspects of their lives, which is likely to increase their comfort with online engagement moving forward. Third, discussion moderators need skills to facilitate asynchronous discussion among panelists. Finally, although online engagement is intended to be intuitive, training on how to participate in data collection activities should be provided.

Online modified-Delphi approaches may not be appropriate for every engagement activity. In selecting the type of engagement, guideline developers should consider its purpose, engagement tasks, and participants. Given the relative novelty of this online method and the fact that we engaged patients and caregivers after the DMD care considerations were finalized, future research should focus on evaluating the impact of online engagement of patients and caregivers on the quality of and adherence to guideline recommendations. Nonetheless, we believe that online engagement is a promising approach for guideline developers to consider and should be added to the G-I-N PUBLIC Toolkit.