Keywords
critical thinking, critical health literacy, informed decision making, infodemic, health education, secondary school, educational design, human centred design
Claims about how to care for our health are everywhere, spread by friends, family, news media, industry, healthcare professionals, policymakers, researchers, and others. Many of these claims are unreliable,1 but people often lack the skills needed to assess them.2,3 When we believe unreliable claims, we might take ineffective or harmful actions, or fail to take helpful actions. The Covid-19 pandemic showed how easily unreliable claims4–6 and research7–9 spread, impacting public trust and protective behaviours.10,11
Publicly debunking untrustworthy information has value, but the effectiveness of this retroactive strategy might be limited. Misinformation, once spread, can be resistant to correction; an alternative narrative may not exist (for instance, there may not be a safe or proven alternative treatment); and individuals may reject the scientific process or the source altogether rather than change their established views about some topics.12 Pre-emptively ‘inoculating’ people against misinformation13,14 by teaching them to think critically about claims of what works, and how to make informed decisions, has the potential to provide broader, more long-lasting protection.
In earlier work developing and evaluating Informed Health Choices (IHC) resources for primary school students, we have shown that it is feasible to teach critical thinking skills about health actions to children as young as 10 years old.15–20 In that project, we created a framework of principles that are important for people to understand when assessing the reliability of healthcare claims and making informed choices: the IHC Key Concepts.21,22 This framework is a starting point for designing curricula, learning resources, and evaluation tools. Focusing on a selection of the IHC Key Concepts, we then developed printed learning resources for primary school children (age 10-12 years) and their teachers15 and a podcast for parents.16
To evaluate the effect of these resources, we conducted two randomised trials in Uganda, one including 120 schools and more than 10,000 children, and another that included 675 parents and guardians of primary school children.23 The primary outcome measure was responses to multiple-choice questions that measure respondents’ ability to apply IHC Key Concepts: the Claim Evaluation Tools.17,18 The primary school trial demonstrated that use of the resources led to a large improvement in the ability of children and teachers to apply IHC Key Concepts to hypothetical scenarios.17 The podcast trial showed a similar but smaller effect among parents.18 Follow-up data showed that children retained their learning for at least one year,19 while the performance of the parents declined.20 The primary school resources have been translated into 12 languages and adapted for use in other countries.24
Alongside the trials, we undertook process evaluations to explore barriers and facilitators for scaling up use of the learning resources, potential adverse effects, and potential additional benefits.25,26 We found that children, teachers, parents, and district education officers valued the IHC primary school resources. The human-centred design approach we employed - iteratively addressing user and stakeholder concerns prior to and during development - was likely an important contributing factor for this positive reception. However, the study pointed to two important implementation barriers in Uganda: printing costs and lack of time in the school schedule.
Informed by these findings, we began the current five-year project in 2019 to develop and evaluate resources for lower secondary schools (age 13-16) in Kenya, Rwanda, and Uganda. Since drafting this article, we have evaluated the resources in three parallel randomised trials,27–29 are carrying out process evaluations,30–32 and will conduct one-year follow-up trials in each of the three countries.
This article describes the development of these resources, which took place in two phases during the first 2.5 years of the project, prior to evaluation. We began development with this set of objectives (see protocol33):
To explore how we might develop resources that are
• Digital (avoiding printing costs)
• Suitable for use with schools’ available digital technology and infrastructure
• Compatible with national curricula
• Based on evidence about effective strategies for teaching critical thinking
• Experienced as accessible, useful, usable, understandable, credible, and desirable by students and teachers, and well-suited to use in their schools
• Easily translatable and adaptable to other contexts
• Sustainable (i.e., not dependent on our team for rolling out at scale)
We organised the work in two phases: 1) a set of preliminary studies (involving data documented and discussed elsewhere) and 2) an iterative development phase with data collection and analysis reported in this article. Although the preliminary studies have already been published elsewhere, findings from those studies resulted in reframing of some of our objectives and provided the basis for content and design decisions made in the second phase. Therefore, we describe methods and results from both phases (with references for more detail about phase one). In the discussion, we describe how this study may inform development of similar educational resources for teaching critical thinking.
We employed perspectives and methods from human-centred design. This is an iterative approach to creating products, systems and services that places users’ and other stakeholders’ needs and experiences at the centre of the design process.34,35 Early and continued engagement with stakeholder groups and multidisciplinary collaboration are central components of a human-centred design approach.36,37 We employed qualitative data collection and analysis methods to explore the user experiences of multiple types of stakeholders for the purpose of informing the development process.
We established a core team with backgrounds in health systems and public health research, design, journalism, education, social science, statistics, and information and communication technology (ICT). Research leadership was shared among senior team members in East Africa and Norway. Three PhD fellows (one female and two male) based in Kisumu (Kenya), Kigali (Rwanda) and Kampala (Uganda), engaged with stakeholders and collected and analysed data, supported by their local teams including researchers with experience developing and evaluating the IHC primary school resources. All teams contributed to design and content development, led by the team based in Norway who also analysed data. The ICT team, based in Chile, developed the technical solutions.
We conducted this work in two phases: 1) Groundwork and 2) Development cycles (Figure 1). The first phase began in August 2019; the second began in January 2020 and ended in April 2022.
We defined stakeholders as people or organisations that have a vested interest in the process or results of the study. Intended users – secondary school teachers and students – were the key stakeholders. Additional stakeholders included curriculum developers and other education policymakers, school administrators, parents, educational researchers, researchers in public or clinical health, and health professionals.
For the purpose of providing continuous input, we established groups of stakeholders in each country early in the project. We formed teacher networks, recruiting via formal invitation letters to teachers who worked in a mix of government-funded and private schools, with varied ICT resources (Uganda), and in rural, semi-rural, and urban environments (Kenya and Rwanda). We formed student networks by asking members of the teacher networks to suggest lower secondary school students who were interested, likely to contribute, and able to participate without adversely affecting their schoolwork, then sending formal invitations and assent forms to the students and consent forms to their parents. Due to Covid-related school closures during parts of Development Cycle 1, we also established “home networks” of students whom we could reach while schools were closed. Home networks included both students already recruited to the student networks and additional students whom team members could easily reach in nearby communities. We employed convenience sampling and used the same assent and parental consent forms as for the student networks.
We engaged with the national curriculum development offices to establish channels of contact and collaboration. We formed national advisory panels of policy makers at the ministerial and regional and district levels, school directors, head teachers, leaders of teacher unions, and representatives of parents’ groups and civil society. We created an international advisory panel of people from 18 countries who had expertise in education, education policy, and relevant areas of research, such as health literacy, evidence-informed decision making, science communication and ICT. Additionally, during the final cycle of development, we recruited seven schools across the three countries to pilot use of the resources in one or more classes over a school semester.
Apart from people in the international advisory panel, many of whom were a part of our existing professional networks, and some of the students in home networks, we did not have established relationships with the stakeholders prior to the onset of this project.
Details of how we selected and recruited each stakeholder group and characteristics of participants and schools are described in Tables 1-3. More information about the methods, degrees, and nature of engagement with different stakeholders, can be found in a protocol for evaluating stakeholder engagement.38
The groundwork phase consisted of a series of preliminary studies. More detail about the methods and results can be found in separate publications (see overview in Table 4).
Study aim | Methods/Study design | Data sources | More detail |
---|---|---|---|
Analysing contexts in Kenya, Rwanda, and Uganda | Individual and group interviews, document analysis, non-participatory observation | Interviews: teachers, students, curriculum developers, ICT support staff; Document analysis: learning materials for health and science subjects, curriculum documents, ICT policy documents; Observation: school classes | 39–41 |
Prioritising IHC Key Concepts (learning goals) | Iterative, structured consensus process | Curriculum specialists, teachers, and researchers | 54 |
Identifying teaching strategies for critical thinking | Overview of systematic reviews | Systematic reviews of the effects of teaching strategies for critical thinking | 44 |
Collecting examples | Individual and group interviews | Secondary school students in Kenya, Rwanda, and Uganda | 45 |
Analysing contexts
To inform the development of the resources, as well as their potential implementation in lower secondary schools in Kenya, Rwanda, and Uganda, we explored whether there was demand for learning resources to teach critical thinking; whether such resources were already in use; how the content mapped onto existing curricula; what administrative approval was necessary; and what ICT infrastructure was available in schools and how it was used. We analysed the three country contexts. Detailed methods are reported in separate publications.39–41
To synthesize findings from these studies in a way that was useful for developing educational resources, we drew on the behaviour change wheel.42 This is a framework for characterising and designing interventions for behaviour change, built around three essential conditions for change: Capability, Opportunity, and Motivation (the COM-B system). We organised findings from the three context analyses according to these three themes. The synthesized findings from these three context analyses supplemented our original set of development aims and provided a much more detailed understanding of contextual challenges before we started the development phase.
Prioritising IHC Key Concepts
Due to the scope of our research funding, as well as the pandemic and school closures, we could not pilot and evaluate resources that took longer to use than one school semester. To prioritise which of the 49 IHC Key Concepts to include in the learning goals for the resources, we organised a structured, iterative consensus process with curriculum specialists, teachers, and researchers. Detailed methods are reported in a separate publication.43
Identifying teaching strategies
To identify effective, relevant teaching strategies to inform the resources, we conducted an overview of systematic reviews of strategies for teaching critical thinking. Detailed methods are reported in a separate protocol.44
Collecting examples
To identify relevant and engaging examples for the resources, we interviewed Kenyan, Rwandan, and Ugandan secondary school students (aged 14 to 16) from the student networks, via WhatsApp. We collected examples of health actions, their claimed and actual effects, and information sources. The students rated their interest in a broad range of health actions and conditions, and we used those with higher scores as examples in the resources. Detailed methods are reported in an online report.45
Starting with what we learned and developed in the preliminary studies, we developed resources in three iterative cycles of content creation and feedback. Each development cycle (Figure 2) included:
- Generating ideas
- Prototyping
- Pilot testing prototypes and collecting data
- Analysing problems
- Checking our analysis of the most important problems and solution ideas with key stakeholders, drawing on the concept of “member checking”46
We also sought ad hoc feedback from the student and teacher networks as needed.
Generating ideas
We generated content and design ideas in two ways: brainstorming33 internally, within the team; and identifying ideas suggested in external feedback, from participating stakeholders.47 We selected ideas based on considerations such as consistency with what teachers and students said they valued, and ease of implementation.
Prototyping
A prototype is a sketch or model. We developed increasingly refined prototypes in each cycle. We used the platforms Google Drive (including Google Docs and Google Slides) and Adobe XD for drafting, editing, and sketching. We drafted and edited text using Google Docs, exported early prototypes to PDF format, and used Google Slides and Adobe XD later to create interactive sketches. Final versions were published as web pages, using Google Drive as a content editing platform.
Testing prototypes and collecting data
The three PhD fellows conducted interviews in their respective settings, supported by their research assistants. All had university degrees (Bachelor of Statistics, Master of Epidemiology & Biostatistics, Master of Public Health, and Master of Community Health and Development) and had attended PhD-level courses in qualitative study methods.
Individual interviews employing user testing methods
Using semi-structured guides48 that were pilot tested, the three PhD fellows conducted user tests with individual teachers and students to explore how they experienced prototypes and how we might make improvements. Data were collected face to face, or online when Covid measures restricted face-to-face meetings. When possible, user testing immediately followed a pilot lesson, so the participant had their experiences from the lesson fresh in mind. Interview duration was approximately one hour. Occasionally, teams printed the digital prototypes on paper to collect feedback where there was no access to a device or internet connection.
Each user testing session had three main parts. First, interviewers introduced themselves and the reasons for the interview. Next, they employed a think-aloud approach: a method of qualitative data collection that encourages the participant to articulate their thoughts while performing a task,49–51 in this case reviewing prototypes of the resources. The last part was conducted as a semi-structured interview.
We captured data as observers’ notes and audio recordings. We designed the interview guides to explore different facets of the user experience, including usefulness, understandability, usability, credibility, desirability, and identification, based on a revised version of Morville’s framework of user experience (Table 5).34
Group interviews
The PhD fellows conducted semi-structured interviews with groups of students or teachers to explore their experiences of the prototypes, using pilot-tested interview guides.48 When possible, interviews were scheduled immediately after a pilot lesson when the participants had their experiences fresh in mind. Sessions lasted approximately one hour.
We chose group interviews, rather than user testing or individual interviews, when we anticipated that the former would improve the quality of the data, for instance by increasing students’ confidence in speaking with researchers. Group interviews took place at schools or other locations determined to be practical for the participants. After introducing themselves and explaining reasons for the interview, one researcher moderated and one or more researchers observed and took notes. Sometimes, teachers were present during student interviews. With written informed consent (from parents and teachers) and assent (from students), we audio-recorded sessions.
Piloting and observation
We facilitated pilots of the prototypes to explore how teachers and students used and experienced them in settings that were as natural as possible. Early pilots involved single lessons, with one of the PhD fellows sometimes assuming the role of the teacher. These took place both at schools and in other settings, such as students’ neighbourhoods, when schools were closed due to the pandemic. The final set of pilots included a full set of lessons taught by teachers at schools, over a school term. (In Uganda, although schools were still closed, we got permission for students from the student network and teachers to meet at the schools for the purpose of conducting pilot lessons.) PhD fellows or research assistants observed and took notes using a structured guide,48 without intervening. We followed up pilots with either individual or group interviews.
“Critical Thinking about Health” Test
After piloting a full set of lessons, students took the “Critical Thinking about Health” Test.52 Administering the test had two purposes: validating the items included in the test and giving us a sense of whether the prototypes had the intended effects. We report the validation in detail in a separate article.52
Consulting the advisory groups
Twice annually, we emailed the international advisory group to update them on the project and to ask for feedback on specific parts or prototypes. We entered their feedback in a spreadsheet,47 familiarised ourselves with those data, and discussed and agreed on how to deal with any specific suggestions. We held face-to-face meetings with the national advisory groups to keep them updated and to seek feedback related to ensuring the sustainability and future scaling up of the resources, if shown to be effective.
Data analysis
We transcribed and translated several of the audio recordings from individual and group interviews. The three PhD fellows and their three research assistants reviewed transcriptions, their own interview and observation notes, and recordings, and extracted data about negative and positive experiences. They entered the data into Excel spreadsheets,47 supplemented with quotations where relevant, and tagged the data using pre-determined codes related to the nature of an experience (e.g., “negative”); the “location” of the experience, i.e., the relevant part of the resources (e.g., “illustration/graphics”), and suggested implication (e.g., “consider changes”). We did not anticipate differences in results between male and female students, so coding did not include gender identification. The prototyping team reviewed the coded data, flagged data entries that they did not understand, suggested changes to codes, and discussed the suggestions with the PhD fellows and their research assistants.
After agreeing with the original coders on the final codes, the three-person prototyping team sorted data entries according to the nature, implication, and location, and re-reviewed them, focusing on data tagged with the implication “showstopper” or “consider changes”. New topic codes emerged during this process and were added as thematic labels of issues that needed addressing (e.g., “Time”, “Conceptual misunderstanding”, “student-computer mode”). This coding scheme evolved throughout. For an example from the dataset,47 see the “Topics” column in the “Development Cycle 2 – codes and response options” sheet. When data analysis raised important unanswered questions or resulted in disagreement about interpretation, the team either contacted teacher or student networks for additional input, or adjusted interview guides to include these issues in the next development cycle.
The prototyping team drafted a description and assessment of the most important problems with the prototypes, with an initial set of ideas for solutions. The whole project team discussed these drafts, reached a consensus about which were the most important problems and brainstormed additional ideas for solutions. Based on this input, the prototyping team made the final decision about which solutions to implement.
Checking with stakeholders
Drawing on the method “member checking”,46 we carried out “stakeholder checking”, checking our analysis and ideas with the teacher networks and curriculum developers. At the end of each development cycle, we presented descriptions of what we considered the most important problems and our proposed solutions. We asked about the accuracy of the analysis, their reactions to the proposed solutions, and any suggestions of their own. We modified solutions based on this input.
Ad hoc feedback
At different points within each development cycle, we also contacted people in the teacher and student networks for quick feedback on specific issues, for instance what to call the resources.
Figure 3 and Table 6 provide an overview of what we did in the three development cycles.
We obtained ethics approval for the entire project from local institutional review boards and national research governing councils in Kenya, Rwanda, and Uganda: the Rwanda National Ethics Committee (approval number 916/RNEC/2019); Masinde Muliro University of Science and Technology Institutional Ethics Review Committee and the Kenyan National Commission for Science, Technology and Innovation (approval number NACOSTI/P119/1986); and Makerere University School of Medicine Research Ethics Committee (REC REF 2020-139) and Uganda National Council of Science and Technology (HS916ES).
We present two different types of results that emerged from this work: the designed output (final version of the resources), and the knowledge output that informed our development (findings from Phase 1 and 2). We report the designed output first, to give the reader an overview of the resource features and components we refer to in the findings.
The final set of resources, Be smart about your health, is an open access website for teachers comprising ten lesson plans, a teachers’ guide, and extra resources including materials for teacher training.53 Lesson plans are designed in two modes: for use in classrooms with a blackboard/flipchart (optimised for smartphone) and for classrooms with a projector, with lessons formatted as downloadable Google Slides presentations. See Table 7 and Figures 4–9.
Lesson plans:
- Content: Set of ten lesson plans, two of which are for review and applying what students learned to their daily lives (Table 7)
- Structure: Each Lesson Plan has an Overview, Lesson, and Background section. Each Lesson is designed to be taught in 40 minutes, and has three parts: Introduction, Activity, and Wrap-up (Figure 5)
- Format: To accommodate varied ICT infrastructure in schools, we created two modes of resources for teachers to deliver lessons: Blackboard lesson plans and Projector lesson plans. Blackboard lesson plans are optimised for teachers on mobile devices and also work offline; they can serve as a back-up in the event of electricity outages. Projector mode is for use in classrooms that have access to a projector. We also created sets of optional, downloadable printouts for teachers to use as a paper back-up (Figure 6)
Teachers’ guide: This includes an introduction explaining why the learning goals are important; a description of the content and how it ties to the curricula in Kenya, Rwanda, and Uganda; and guidance on how to navigate the resources, how they were developed, and where to find other relevant resources (Figure 7).
Extra resources:
Below we describe output from the preliminary studies in Phase 1. As explained earlier, data from these studies have already been reported and discussed in other publications. However, for the reader to understand the full development process, and how these results influenced our objectives, content, and design, they are summarised below.
Analysing contexts
We organised key findings from the context analyses in four themes: motivation to teach or learn the content; capability to teach the content; capability to use digital resources; and opportunity to teach the content within existing curricula using digital resources.39–42 Based on these findings, we identified opportunities and challenges, and made decisions about how to resolve them.
Stakeholders considered it important for students to learn to think critically about health, and the curricula included related learning goals. However, teachers lacked the capabilities and opportunities to teach critical thinking in general, and critical thinking about health choices more specifically. Additionally, schools’ digital infrastructures ranged from fully equipped, with student computer labs, to schools with few devices and no internet. Some of the strategies we decided to use included creating a teachers’ guide with in-depth background content and providing extra information about teaching strategies for critical thinking. We created lessons in different modes, with offline functionality, to accommodate variation in access to computers and digital infrastructure. See Table 8 for more detail about these findings and our solutions.
Prioritising IHC Key Concepts
Twelve curriculum specialists, teachers, and researchers in Kenya, Rwanda, and Uganda prioritised nine of the IHC Key Concepts (Box 1) as suitable, relevant, and important for lower secondary school students.54 These nine concepts formed the basis for the learning goals and subsequent content development.
• Treatments can cause harms as well as benefits.
• Widely used treatments or those that have been used for a long time are not necessarily beneficial or safe.
• Treatments that are new or technologically impressive may not be better than available alternatives.
• Personal experiences or anecdotes alone are an unreliable basis for most claims.
• Identifying effects of treatments depends on making comparisons.
• Comparison groups should be as similar as possible.
• Small studies may be misleading.
• Large, dramatic effects are rare.
• Weigh the benefits and savings against the harms and costs of acting or not.
Identifying teaching strategies
In an overview of systematic reviews, we identified teaching strategies for helping primary and secondary school students learn to think critically.44 We experimented with using different strategies in each lesson, but found it added too much to teachers’ and students’ procedural cognitive load. Therefore, we chose a limited set to minimize variation and used most strategies across all lessons (Box 2). Additionally, we prepared summaries of all the teaching strategies we considered using.
Strategies used across all lessons:
‐ Guided notetaking
‐ Small group work
‐ Response cards
‐ Homework – collecting claims and choices about health actions
‐ Standard lesson structure
‐ Setting objectives and providing feedback
‐ Multimedia design
Strategies used in individual lessons:
For more detail, see also descriptions of teaching strategies we considered including (located under the Extra resources menu of the Besmarthealth.org website).
Collecting examples
To illustrate concepts, and make lessons more engaging, we used a broad range of examples in the lessons involving conditions and health actions that students said they found interesting.45 We also created a searchable collection of relevant health actions and health problems or goals that can be used to illustrate key concepts, with plain language summaries of the evidence for each example (Figure 9). The examples collection is part of the “Extra resources”.
Using our findings from the Groundwork phase as a starting point, we conducted three development cycles of idea generation, prototyping, data collection and analysis, and stakeholder checking, which form the basis of this paper. The feedback was generally consistent regarding what people experienced as positive or problematic. Although there were far more positive findings, our main focus in collecting and reporting feedback was to identify problems that would require changes for the next iteration of the resources. We present the main findings thematically below.
Positive, constructive findings
Many teachers and students appreciated the learning in these lessons and experienced it as relevant both for other subjects and for their daily lives. Many students participated actively in class, including students who were normally less active. Many teachers felt the Projector mode of lessons worked best, providing structure and visual support to the lesson and helping to focus class attention. The blackboard mode and print-outs worked satisfactorily for classes without projectors, or as a back-up in the event of power failure. Table 9 provides a detailed description of positive or constructive findings.
Negative findings and how we addressed them
We resolved problems as they appeared and changed the design of the prototypes dramatically from the first to the final development cycle. However, four main thematic challenges persisted throughout development: the student-computer mode of lesson plans, time, comprehension, and examples.
• Computers in classrooms took time to set up; equipment problems were common; students were often distracted by other online content; teachers found class discussions difficult to organise when students were using computers. Since teachers who had tried both the student-computer and Projector modes preferred the latter, we agreed with curriculum developers to drop the student-computer mode.
• School schedules lacked time to teach lessons that were not in the curriculum, and teachers struggled to complete lessons within the allotted 40 minutes. We simplified and shortened lesson content as much as possible, and developed modules for teacher training workshops to increase teachers’ confidence and capability to teach the lessons.
• Students had varying degrees of difficulty understanding lesson content, sometimes displaying a misunderstanding that was exactly the opposite of what they were intended to learn. We added “Common misunderstandings” to the lesson background, and designed lessons to include more review opportunities and informal assessment questions. We are exploring experiences and views of potential adverse effects in separate process evaluations30,31,55 and in a qualitative evidence synthesis.56
• We struggled to find appropriate examples of reliable and unreliable claims, as students often thought the lesson was about the example rather than the underlying Key Concept. Our solutions included balancing real and fictional, clearly labelled examples; providing guidance and prompts for teachers (including suggestions to find their own examples); and developing a collection of alternative examples for teachers.
Tables 10-13 provide a more detailed description of challenges and solutions.
The COVID-19 pandemic disrupted our work in many ways. At times, social distancing, travel restrictions, and school closures prevented in-person meetings, data collection, and piloting. When schools reopened, teachers were pressed to make up for lost time, sometimes rushing to finish a pilot lesson or not completing it. On the other hand, the abundance of unreliable claims about Covid-19 provided timely examples and motivation.
“The lesson itself was very easy to understand because the students have already experienced what was used as example in the content (closing schools to reduce the spread of covid-19 infections).” (Rwanda, Observation pilot lesson)
Workarounds to these challenges included communicating with stakeholders individually via audio call, video call, or chat, and conducting some of the interviews over participants’ home networks.
Using a human-centred design approach, we developed adaptable, digital resources for teaching critical thinking about health in secondary schools in Kenya, Rwanda, and Uganda. In Phase 1 of the design, we gained insight into user and stakeholder needs, and the wider educational contexts. In Phase 2, we iteratively developed prototypes by generating ideas, prototyping, collecting and analysing data, and checking our ideas and analysis with stakeholders.
In many countries, curriculum developers are moving away from traditional knowledge-based curricula to competency-based curricula, with critical thinking among the core competencies,57 including in Kenya, Rwanda, and Uganda.39–41 However, our context analyses showed that teachers lacked training and resources to teach critical thinking.39–41
In their systematic review of strategies for teaching students to think critically, Abrami et al. found that three strategies were most effective, especially when combined: “the opportunity for dialogue, the exposure of students to authentic or situated problems and examples, and mentoring”.58 Based on those findings, and findings from our overview of systematic reviews,44 we focused on using authentic examples and discussion activities in our secondary school resources.
A review of education technology in low-income countries distinguishes between two categories of digital interventions: computer aided learning (CAL) and computer aided instruction (CAI).59 Broadly speaking, CAL interventions are designed for the student-as-user, and either complement or supplement classroom instruction. CAI interventions, on the other hand, are designed for the teacher-as-user, and aim to enhance the quality of instruction. We developed and piloted resources for both CAL (the student-computer mode of lessons) and CAI (the blackboard and projector modes of lessons).
We found a variety of problems with the student-computer (CAL) resources. Many of the problems were logistical or technical and not specific to our project, such as the time needed to set up and charge computers, faulty or missing equipment, and unstable electricity. In addition, some curriculum developers in the settings where the student-computer resources were piloted said that the resources lacked media richness (such as animations or videos) and interactivity. We could not feasibly develop such features in this project, and teachers and students were possibly not experienced enough to use them.
Regardless of the level of ICT preparedness, self-led study strategies and CAL interventions may not be the best starting point for developing critical thinking skills, since it is more challenging to create opportunities for dialogue when students are alone behind a screen, working at different paces from each other. In our case, developing functionality for dialogue to take place on the computer was not an option, due to technical barriers in the settings, such as lacking or unreliable internet access. Computer-based mentoring60–63 would require much more sophisticated programming than we had resources for. Moreover, less literate and less computer-literate students would be at a disadvantage. In their review, Abrami and colleagues found that discussion was especially effective when the teacher posed questions and there were whole-class, teacher-led discussions or teacher-led group discussions.58
Rwandan teachers who piloted both the student-computer and projector resources preferred the latter, with the blackboard and printed resources as a backup for when electricity failed. Besides being technically less demanding, the slides seemed to better activate students, support small and large-group discussion, and focus collective attention. Moreover, teachers can download and tailor slides. The lower-tech CAI modes require much less investment in equipment than high-tech CAL modes and could dramatically lower the threshold for bringing digital learning resources that support critical thinking into low-resource classrooms.
Strengths of this study include comprehensive groundwork; several development cycles, including pilots of all lessons in three countries; input from a wide variety of stakeholders and participants; systematic methods used to prioritise content and identify effective teaching strategies; data collection using complementary, mixed methods; independent initial coding and analysis by more than one researcher; and stakeholder checking to triangulate analyses.
A limitation is that the project team developed the prototypes and collected and coded data. This might have biased participants to give positive feedback, or introduced bias in the analysis, leading to an exaggeration of positive feedback or lack of attention to negative feedback. We took measures to mitigate this: the interview guide included questions that explicitly encouraged participants to provide negative feedback; data were collected in three countries, and coding in each country was reviewed by two researchers to reduce potential for bias; and we introduced “stakeholder checking” to triangulate our data analyses and check that the solutions we developed were appropriate.
It is possible that the amount of data we collected placed an undue burden on participants and other stakeholders. We are exploring this in a separate stakeholder evaluation.38 It is also possible that the prototypes caused harm to some students who misunderstood concepts or examples. We are exploring potential adverse effects of the final resources in process evaluations30,31,55 and a qualitative evidence synthesis.56
Reflexivity considerations
The core project team all have backgrounds as public health researchers, while the stakeholders recruited to collaborate and provide feedback mostly have backgrounds in education, with no previous engagement in this research topic. The core project team members are from many different settings, while all stakeholders are from East Africa, with the exception of the international advisory group. The PhD fellows and several other core team members were working full-time on this project, and therefore highly invested in a positive outcome, while stakeholders were participating voluntarily, in addition to their other full-time work or study commitments.
We have tried to be mindful of our differing agendas, cultural perspectives, and the ways in which the core team’s ambitions might influence the outcome of the work or place undue burden on the participants. We discussed these topics frequently during weekly core team meetings and have also explored some of these questions using more structured methods, designed as separate studies. The context analyses we conducted at the beginning of this work helped us view our research agenda from the perspective of teachers and curriculum developers and understand whether and how our research might better align with existing objectives within the three educational systems. The stakeholder engagement assessment (manuscript under development) provided stakeholders with opportunities to give us feedback about how they experienced their participation while it was ongoing. The qualitative evidence synthesis of post-trial process evaluations (manuscript under development) will help us understand if there have been any adverse effects of the resources, seen from the point of view of the participants.
Relationships between the core team and recruited stakeholders developed over time, and some people who were initially recruited as participants became engaged to the degree that they fulfilled requirements for co-authorship of this study.
Using a human-centred design approach, we created adaptable, digital resources for teaching lower secondary school students to think critically about health actions in ten 40-minute lessons. Teachers and students in Kenya, Rwanda and Uganda found the lessons relevant and useful. The open access resources are free to use and can be translated and adapted to other settings. Designed to accommodate use in classrooms with differing digital infrastructure, such as lack of internet connectivity or unstable electricity, they include materials in two modes for classrooms equipped with blackboards or projectors. Teachers with access to a projector preferred the projector resources. In addition to the lessons, resources include a teachers’ guide, glossary, a searchable set of alternative examples of health actions, summaries of teaching strategies for critical thinking, and teacher training materials. The Covid-19 pandemic disrupted the design of the resources, but also provided timely examples and motivation to learn.
Next research steps are already in progress, including randomised trials to evaluate the effect of using the resources on students’ ability to think critically about health actions, parallel process evaluations, and one-year follow-up studies.
Zenodo. Dataset for “Teaching critical thinking about health information and choices in secondary schools: human-centred design of digital resources”, https://doi.org/10.5281/zenodo.7695782. 47
This project contains the following underlying data:
• DevelopmentCycle1__data feedback and observations.csv (Coded datapoints extracted in Development cycle 1 from individual/group interviews with teachers/students and observations of lesson pilots in Rwanda, Kenya and Uganda)
• DevelopmentCycle1_internal team feedback.csv (Feedback, comments, suggestions from our research team during Development cycle 1)
• DevelopmentCycle1_interntl-advisory-grp-dec-jan-2020.csv (First round of feedback from international advisory group, via email)
• DevelopmentCycle1_interntl-advisory-grp-jun-aug-2020.csv (Second round of feedback from international advisory group, via email)
• DevelopmentCycle1_interntl-advisory-grp-proto1.2.csv (Third round of feedback from international advisory group, via email)
• DevelopmentCycle1_legend.csv (Description of implications codes we used to assess the importance of observations and feedback for the next iteration of the resources in Development Cycle 2, e.g., “Problem/Showstopper/A problem that we should address”)
• DevelopmentCycle2__data feedback and observations data.csv (Coded datapoints extracted in Development cycle 2 from individual/group interviews with teachers/students and observations of lesson pilots in Rwanda, Kenya and Uganda)
• DevelopmentCycle2_codes and response options.csv (List of codes/response options used in Development cycle 2 to code or describe datapoints)
• DevelopmentCycle2_definitions.csv (Description of “nature”, “implications” and “Action for resolving” codes used in Development cycle 2)
• DevelopmentCycle2_internal team feedback.csv (Feedback, comments, suggestions from our research team during Development cycle 2)
• Pilot data and test results - analysis of problems and proposed solutions.pdf (Analysis of problems and proposed solutions after lesson pilots at the end of Development cycle 2 that we presented to curriculum developers for stakeholder checking and input)
Zenodo. Extended dataset and Coreq Checklist for “Teaching critical thinking about health information and choices in secondary schools: human-centred design of digital resources”, https://doi.org/10.5281/zenodo.7806139. 48
This project contains the following extended data:
• COREQ_checklist New.pdf (COREQ checklist for this article)
• Group Interview guide Post-Pilot - Students Group interview guide.docx (Guide used as a basis for interviewing groups of students after they had participated in piloting lessons at the end of Development cycle 2)
• Group Interview guide Post-Pilot - Teachers.docx (Guide used as a basis for interviewing teachers after they had piloted lessons at the end of Development cycle 2)
• Interveiw guide Early - Student network.docx (Guide used as a basis for interviewing students in Development cycle 1)
• Interview guide Early - Curriculum develop Questionnaire.docx (Guide used as a basis for interviewing curriculum developers in Development cycle 1)
• Interview guide Pilot - Teachers User testing interview guide.docx (Guide used as a basis for user test interviews with participating teachers in pilot lessons at the end of Development cycle 2)
• Interview Guide Pre-pilot - Teacher-Student - User test.docx (Guide used as a basis for user test interviews of students and teachers before they participated in pilot lessons at the end of Development Cycle 2)
• Observation form Pilot - Observers.docx (Form used by observers to structure observation data of pilot lessons)
Data are available under the terms of the Creative Commons Attribution 4.0 International Public License.
We would like to thank all of the students, teachers, and schools who generously gave of their time to participate in networks and pilots. In particular, we would like to thank Besweri Wandera, Joseph Jude Agaba, Enock Kiyemba, Deogratious Ssekyole, Dan Bubaale, Fredrick Kiyingi Kinobe, Grace Namakula, Peter Kizza, Edward Kanoonya, Judith Nantongo, Patrick Ssendujja, Steven Musisi, John Bosco Kalangwa, Paul Sendi, Bridgious Kato, Charles Musabi, Ann Murangira, Stanely Shitanda, Lynnet Juma, Pamela Nandi, Robert Onyango, Brian Okoth, Emmanuel Otieno, Rose Omondi, Bridgit Abayo, Fredick Okoth, John Ouko, Lawrence Ariang’a, Innocent Uwimana, Paul Nkundimana, Aloys Kayinamura, Janvier Ntibizerwa, Aaron Habarurema, Donat Nkurunziza, Chui Hsia, and Christine Holst. We also thank the Kenya Institute of Curriculum Development (KICD), Rwanda Basic Education Board (REB), and National Curriculum Development Centre (NCDC) in Uganda for their ongoing encouragement and collaboration.