Teaching critical thinking about health information and choices in secondary schools: human-centred design of digital resources

Background: Learning to think critically about health information and choices can protect people from unnecessary suffering, harm, and resource waste. Earlier work revealed that children can learn these skills, but printing costs and curriculum compatibility remain important barriers to school implementation. We aimed to develop a set of digital learning resources for teaching students to think critically about health that were suitable for use in Kenyan, Rwandan, and Ugandan secondary schools.

Methods: We conducted the work in two phases, collaborating with teachers, students, schools, and national curriculum development offices using a human-centred design approach. First, we conducted context analyses and an overview of teaching strategies, prioritised content, and collected examples. Next, we developed lessons and guidance iteratively, informed by data from user testing, individual and group interviews, and school pilots.

Results: The final resources include online lesson plans, a teachers' guide, and extra resources, with lesson plans in two modes: for use in a classroom equipped with a blackboard/flip-chart or with a projector. The resources are accessible offline for use when electricity or Internet is lacking. Teachers preferred the projector mode, as it provided structure and a focal point for class attention. Feedback was largely positive, with teachers and students appreciating the learning and experiencing it as relevant. Four main challenges were time to teach lessons; incorrect comprehension; identifying suitable examples; and technical, logistical, and behavioural challenges with a student-computer mode that we piloted. We resolved these challenges by simplifying and combining lessons; increasing opportunities for review and assessment; developing teacher training materials; creating a searchable set of examples; and deactivating the student-computer mode.

Conclusion: Using a human-centred design approach, we created digital resources for teaching secondary school students to think critically about health actions and for training teachers. The Be smart about your health resources are open access and can be translated or adapted to other settings.



Introduction
Claims about how to care for our health are everywhere, spread by friends, family, news media, industry, healthcare professionals, policymakers, researchers, and others. Many of these claims are unreliable, 1 but people often lack the skills needed to assess them. 2,3 When we believe unreliable claims, we might take ineffective or harmful actions, or fail to take helpful actions. The Covid-19 pandemic showed how easily unreliable claims 4-6 and research 7-9 spread, impacting public trust and protective behaviours. 10,11 Publicly debunking untrustworthy information has value, but the effectiveness of this retroactive strategy might be limited. Misinformation, once spread, can be resistant to correction; an alternative narrative may not exist (for instance, there may not be a safe or proven alternative treatment); and individuals may reject the scientific process or the source altogether rather than change their established views about some topics. 12 Pre-emptively 'inoculating' people against misinformation 13,14 by teaching them to think critically about claims of what works, and how to make informed decisions, has the potential to provide broader, more long-lasting protection.
This work builds on the earlier Informed Health Choices (IHC) project. 15-20 In that project, we created a framework of principles that are important for people to understand when assessing the reliability of healthcare claims and making informed choices: the IHC Key Concepts. 21,22 This framework is a starting point for designing curricula, learning resources, and evaluation tools. Focusing on a selection of the IHC Key Concepts, we then developed printed learning resources for primary school children (age 10-12 years) and their teachers 15 and a podcast for parents. 16 To evaluate the effect of these resources, we conducted two randomised trials in Uganda, one including 120 schools and more than 10,000 children, and another that included 675 parents and guardians of primary school children. 23 The primary outcome measure was responses to multiple-choice questions that measure respondents' ability to apply IHC Key Concepts: the Claim Evaluation Tools. 17,18 The primary school trial demonstrated that use of the resources led to a large improvement in the ability of children and teachers to apply IHC Key Concepts to hypothetical scenarios. 17 The podcast trial showed a smaller, similar effect among parents. 18 Follow-up data showed that children retained their learning for at least one year, 19 while the performance of the parents declined. 20 The primary school resources have been translated into 12 languages and adapted for use in other countries. 24 Alongside the trials, we undertook process evaluations to explore barriers and facilitators for scaling up use of the learning resources, potential adverse effects, and potential additional benefits. 25,26
We found that children, teachers, parents, and district education officers valued the IHC primary school resources. The human-centred design approach we employed, iteratively addressing user and stakeholder concerns prior to and during development, was likely an important contributing factor in this positive reception. However, the study pointed to two important implementation barriers in Uganda: printing costs and lack of time in the school schedule.
Informed by these findings, we began the current five-year project in 2019 to develop and evaluate resources for lower secondary schools (age 13-16) in Kenya, Rwanda, and Uganda. Since drafting this article, we have evaluated the resources in three parallel randomised trials, 27-29 are carrying out process evaluations, 30-32 and will conduct one-year follow-up trials in each of the three countries. This article describes the development of these resources, which took place prior to evaluation, during the first 2.5 years of the project, in two phases. We began development with this set of objectives (see protocol 33): to explore how we might develop resources that are

• Digital (avoiding printing costs)
• Suitable for use with schools' available digital technology and infrastructure
• Compatible with national curricula
• Based on evidence about effective strategies for teaching critical thinking
• Experienced as accessible, useful, usable, understandable, credible, and desirable by students and teachers, and well-suited to use in their schools
• Easily translatable and adaptable to other contexts
• Sustainable (i.e., not dependent on our team for rolling out at scale)

We organised the work in two phases: 1) a set of preliminary studies (involving data documented and discussed elsewhere) and 2) an iterative development phase with data collection and analysis reported in this article. Although the preliminary studies have already been published elsewhere, findings from those studies resulted in the reframing of some of our objectives and provided the basis for content and design decisions made in the second phase. Therefore, we describe methods and results from both phases (with references for more detail about phase one). In the discussion, we describe how this study may inform development of similar educational resources for teaching critical thinking.

REVISED: Amendments from Version 2
In response to peer review, we have made the three types of findings more explicit and easier to distinguish from each other: the final version of the designed output, findings that emerged from data analyses, and our corresponding design decisions. In the discussion section, we also added a description of why we used a human-centred design approach, its strengths and limitations, and reflections on learning theory as related to this work. Any further responses from the reviewers can be found at the end of the article.

Methods
We employed perspectives and methods from human-centred design. This is an iterative approach to creating products, systems, and services that places users' and other stakeholders' needs and experiences at the centre of the design process. 34,35 Early and continued engagement with stakeholder groups and multidisciplinary collaboration are central components of a human-centred design approach. 36,37 We employed qualitative data collection and analysis methods to explore the user experiences of multiple types of stakeholders for the purpose of informing the development process.
We established a core team with backgrounds in health systems and public health research, design, journalism, education, social science, statistics, and information and communication technology (ICT). Research leadership was shared among senior team members in East Africa and Norway. Three PhD fellows (one female and two male), based in Kisumu (Kenya), Kigali (Rwanda), and Kampala (Uganda), engaged with stakeholders and collected and analysed data, supported by their local teams, which included researchers with experience developing and evaluating the IHC primary school resources. All teams contributed to design and content development, led by the team based in Norway, who also analysed data. The ICT team, based in Chile, developed the technical solutions.
We conducted this work in two phases: 1) Groundwork and 2) Development cycles (Figure 1). The first phase began in August 2019; the second began in January 2020 and ended in April 2022.

Identifying and recruiting participating stakeholders
We defined stakeholders as people or organisations with a vested interest in the process or results of the study. The intended users (secondary school teachers and students) were the key stakeholders. Additional stakeholders included curriculum developers and other education policymakers, school administrators, parents, educational researchers, researchers in public or clinical health, and health professionals.
For the purpose of providing continuous input, we established groups of stakeholders in each country early in the project. We formed teacher networks, recruiting via formal invitation letters teachers who worked in a mix of government-funded and private schools, with varied ICT resources (Uganda), and in rural, semi-rural, and urban environments (Kenya and Rwanda). We formed student networks by asking members of the teacher networks to suggest lower secondary school students who were interested, likely to contribute, and able to participate without adversely affecting their schoolwork, sending them formal invitations and assent forms, and consent forms to their parents. Due to Covid-related school closures during parts of Development Cycle 1, we also established "home networks" of students whom we could reach when schools were closed. Home networks included both students already recruited to the student networks and additional students whom team members could easily reach in nearby communities. We employed convenience sampling and used the same assent and parental consent forms as for the student networks.
We engaged with the national curriculum development offices to establish channels of contact and collaboration. We formed national advisory panels of policymakers at the ministerial, regional, and district levels, school directors, head teachers, leaders of teacher unions, and representatives of parents' groups and civil society. We created an international advisory panel of people from 18 countries who had expertise in education, education policy, and relevant areas of research, such as health literacy, evidence-informed decision making, science communication, and ICT. Additionally, during the final cycle of development, we recruited seven schools across the three countries to pilot use of the resources in one or more classes over a school semester.
Apart from people in the international advisory panel, many of whom were a part of our existing professional networks, and some of the students in home networks, we did not have established relationships with the stakeholders prior to the onset of this project.
Details of how we selected and recruited each stakeholder group and characteristics of participants and schools are described in Tables 1-3. More information about the methods, degrees, and nature of engagement with different stakeholders can be found in a protocol for evaluating stakeholder engagement. 38

Phase 1: Groundwork
The groundwork phase consisted of a series of preliminary studies. More detail about the methods and results can be found in separate publications (see overview in Table 4).

Analysing contexts
To inform the development of the resources, as well as their potential implementation in Kenyan, Rwandan, and Ugandan lower secondary schools, we explored whether there was demand for learning resources to teach critical thinking; whether such resources were in use; how the content maps onto existing curricula; what administrative approval was necessary; and what ICT infrastructure is available in schools and how it is used. We analysed the three country contexts. 41 To synthesise findings from these studies in a way that was useful for developing educational resources, we drew on the behaviour change wheel. 42 This is a framework for characterising and designing interventions for behaviour change, built around three essential conditions for change: Capability, Opportunity, and Motivation (the COM-B system). We organised findings from the three context analyses according to these three themes. The synthesised findings from the three context analyses supplemented our original set of development aims and provided a much more detailed understanding of contextual challenges before we started the development phase.
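To make the COM-B organisation concrete, the grouping step described above can be sketched in code. This is an illustrative sketch only, not tooling the study team used; the example findings are paraphrased from results reported elsewhere in this article.

```python
# Illustrative sketch (not the team's actual workflow): grouping
# context-analysis findings by COM-B component (Capability,
# Opportunity, Motivation). Findings are paraphrased from this article.
from collections import defaultdict

findings = [
    ("Capability", "Teachers lacked prior knowledge of the IHC Key Concepts"),
    ("Capability", "Many teachers lacked ICT training"),
    ("Opportunity", "Unstable electricity limited use of digital resources"),
    ("Motivation", "Teaching was largely exam-focused"),
]

by_component = defaultdict(list)
for component, finding in findings:
    by_component[component].append(finding)

for component in ("Capability", "Opportunity", "Motivation"):
    print(f"{component}: {len(by_component[component])} finding(s)")
```

Sorting findings under the three themes in this way makes it easy to see where an intervention must build skills (Capability), change the environment (Opportunity), or align incentives (Motivation).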

Prioritising IHC Key Concepts
Due to the scope of our research funding, as well as the pandemic and school closures, we could not pilot and evaluate resources that took longer to use than one school semester. To prioritise which of the 49 IHC Key Concepts to include in the learning goals for the resources, we organised a structured, iterative consensus process with curriculum specialists, teachers, and researchers. Detailed methods are reported in a separate publication. 43

Identifying teaching strategies
To identify effective, relevant teaching strategies to inform the resources, we conducted an overview of systematic reviews of strategies for teaching critical thinking. Detailed methods are reported in a separate protocol. 44

Collecting examples
To identify relevant and engaging examples for the resources, we interviewed Kenyan, Rwandan, and Ugandan secondary school students (aged 14 to 16) from the student networks via WhatsApp. We collected examples of health actions, their claimed and actual effects, and information sources. The students rated their interest in a broad range of health actions and conditions, and we used those with higher scores as examples in the resources. Detailed methods are reported in an online report. 45

Phase 2: Development cycles
Starting with what we learned and developed in the preliminary studies, we developed resources in three iterative cycles of content creation and feedback. Each development cycle (Figure 2) included:

- Generating ideas
- Prototyping
- Pilot testing prototypes and collecting data
- Analysing problems
- Checking our analysis of the most important problems and solution ideas with key stakeholders, drawing on the concept of "member checking" 46

We also sought ad hoc feedback from the student and teacher networks as needed.

Generating ideas
We generated content and design ideas in two ways: brainstorming 33 internally, within the team, and identifying ideas suggested in external feedback from participating stakeholders. 47 We selected ideas based on considerations such as consistency with what teachers and students said they valued, and ease of implementation.

Prototyping
A prototype is a sketch or model. We developed increasingly refined prototypes in each cycle, using the platforms Google Drive (including Google Docs and Google Slides) and Adobe XD for drafting, editing, and sketching. We drafted and edited text using Google Docs, exported early prototypes to PDF format, and later used Google Slides and Adobe XD to create interactive sketches. Final versions were published as web pages, using Google Drive as a content editing platform.

Individual interviews employing user testing methods
Using semi-structured guides 48 that were pilot tested, the three PhD fellows conducted user tests with individual teachers and students to explore how they experienced the prototypes and how we might make improvements. Data were collected face to face, or online when Covid measures restricted face-to-face meetings. When possible, user testing immediately followed a pilot lesson, so that the participant had their experiences from the lesson fresh in mind. Interviews lasted approximately one hour. Occasionally, teams printed the digital prototypes on paper to collect feedback where there was no access to a device or Internet connection.
Each user testing session had three main parts.First, interviewers introduced themselves and the reasons for the interview.
Next, they employed a think-aloud approach: a method of qualitative data collection that encourages the participant to articulate their thoughts while performing a task, 49-51 in this case reviewing prototypes of the resources. The last part was conducted as a semi-structured interview.
We captured data as observers' notes and audio recordings. We designed the interview guides to explore different facets of the user experience, including usefulness, understandability, usability, credibility, desirability, and identification, based on a revised version of Morville's framework of user experience (Table 5). 34

Group interviews
The PhD fellows conducted semi-structured interviews with groups of students or teachers to explore their experiences of the prototypes, using pilot-tested interview guides. 48 When possible, interviews were scheduled immediately after a pilot lesson, when the participants had their experiences fresh in mind. Sessions lasted approximately one hour.
We chose group interviews, rather than user testing or individual interviews, when we anticipated that a group setting would improve the quality of the data, for instance by increasing students' confidence in speaking with researchers. Group interviews took place at schools or other locations determined to be practical for the participants. After introducing themselves and explaining the reasons for the interview, one researcher moderated while one or more researchers observed and took notes. Sometimes, teachers were present during student interviews. With written informed consent (from parents and teachers) and assent (from students), we audio-recorded sessions.

Piloting and observation
We facilitated pilots of the prototypes to explore how teachers and students used and experienced them in as natural settings as possible. Early pilots involved single lessons, with one of the PhD fellows sometimes assuming the role of the teacher. These took place both at schools and in other settings, such as students' neighbourhoods, when schools were closed due to the pandemic. In the final pilot, teachers taught all the lessons over a school term, with access to a complete prototype of the resources, including the teachers' guide. (In Uganda, although schools were still closed, we got permission for students from the student network and teachers to meet at the schools for the purpose of conducting pilot lessons.) PhD fellows or research assistants observed and took notes using a structured guide, 48 without intervening. We followed up pilots with either individual or group interviews.
"Critical Thinking about Health" Test
After piloting a full set of lessons, students took the "Critical Thinking about Health" Test. 52 Administering the test had two purposes: validating the items included in the test and giving us a sense of whether the prototypes had the intended effects. We report the validation in detail in a separate article. 52

Consulting the advisory groups
Twice annually, we emailed the international advisory group to update them on the project and ask for feedback on specific parts or prototypes. We entered their feedback in a spreadsheet, 47 familiarised ourselves with those data, and discussed and agreed on how to deal with any specific suggestions. We held face-to-face meetings with the national advisory groups to keep them updated and seek feedback related to ensuring the sustainability and future scaling up of the resources, if shown to be effective.

Data analysis
In Uganda and Kenya, all interviews were carried out in English and transcribed. In Rwanda, interviews were carried out in Kinyarwanda and transcribed in English. There, the PhD fellow and research assistant relied on notes they made during interview sessions and selected some recordings to review when further clarification of their notes was needed. The three PhD fellows and their three research assistants reviewed transcriptions, their own interview and observation notes, and recordings, and extracted data about negative and positive experiences. They entered the data into Excel spreadsheets, 47 supplemented with quotations where relevant, and tagged the data using pre-determined codes related to the nature of an experience (e.g., "negative"); the "location" of the experience, i.e., the relevant part of the resources (e.g., "illustration/graphics"); and the suggested implication (e.g., "consider changes"). We did not anticipate differences in results between male and female students, so coding did not include gender identification. The prototyping team reviewed the coded data, flagged data entries that they did not understand, suggested changes to codes, and discussed the suggestions with the PhD fellows and their research assistants.
After agreeing with the original coders on the final codes, the three-person prototyping team sorted data entries according to nature, implication, and location, and re-reviewed them, focusing on data tagged with the implication "showstopper" or "consider changes". New topic codes emerged during this process and were added as thematic labels for issues that needed addressing (e.g., "Time", "Conceptual misunderstanding", "student-computer mode"). This coding scheme evolved throughout. For an example from the dataset, 47 see the "Topics" column in the Development Cycle 2 codes and response options. When data analysis raised important unanswered questions or resulted in disagreement about interpretation, the team either contacted the teacher or student networks for additional input, or adjusted interview guides to include these issues in the next development cycle.
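As an illustration of the tagging scheme described above, the re-review filter can be sketched as follows. This is not the team's actual workflow (which used Excel spreadsheets); the entries, field names, and notes below are invented placeholders consistent with the codes mentioned in the text.

```python
# Illustrative sketch of the tagging scheme: each feedback entry carries a
# nature, a location in the resources, and a suggested implication.
# The entries are invented placeholders, not study data.
entries = [
    {"nature": "negative", "location": "illustration/graphics",
     "implication": "consider changes", "note": "Diagram hard to read"},
    {"nature": "positive", "location": "lesson text",
     "implication": "none", "note": "Example felt relevant"},
    {"nature": "negative", "location": "student-computer mode",
     "implication": "showstopper", "note": "Computers froze mid-lesson"},
]

# Re-review focuses on entries flagged "showstopper" or "consider changes".
flagged = [e for e in entries
           if e["implication"] in ("showstopper", "consider changes")]
print(len(flagged))  # 2
```

Filtering on the implication code in this way surfaces the small subset of entries that drive design decisions, while positive or neutral feedback is kept for context.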
The prototyping team drafted a description and assessment of the most important problems with the prototypes, with an initial set of ideas for solutions. The whole project team discussed these drafts, reached a consensus about which were the most important problems, and brainstormed additional ideas for solutions. Based on this input, the prototyping team made the final decision about which solutions to implement.

Checking with stakeholders
Drawing on the method of "member checking", 46 we carried out "stakeholder checking": checking our analysis and ideas with the teacher networks and curriculum developers. At the end of each development cycle, we presented descriptions of what we considered the most important problems and our proposed solutions. We asked about the accuracy of the analysis, their reactions to the proposed solutions, and any suggestions of their own. We modified solutions based on this input.

Ad hoc feedback
At different points within each development cycle, we also contacted people in the teacher and student networks for quick feedback on specific issues, for instance what to call the resources.
Figure 3 and Table 6 provide an overview of what we did in the three development cycles.

Results
Drawing on a guideline for reporting design in the context of research, 53 we present three different types of results that emerged from this work: the designed output (the final version of the resources), descriptions of the main findings from data analyses, and the design decisions we made corresponding to those findings. We report the designed output first, to give the reader an overview of the resource features and components we refer to in the rest of the findings.
Designed output: the final version of the "Be smart about your health" resources
The final set of resources, Be smart about your health, is an open access website for teachers comprising ten lesson plans, a teachers' guide, and extra resources including materials for teacher training. 54 Lesson plans are designed in two modes: for use in classrooms with a blackboard/flipchart (optimised for smartphone) and for classrooms with a projector, with lessons formatted as downloadable Google Slides presentations. See Table 7 and Figures 4-9.

Lesson plans:
- Content: a set of ten lesson plans, two of which are for review and for applying what students learned to their daily lives (Table 7)
- Structure: each lesson plan has an Overview, Lesson, and Background section. Each Lesson is designed to be taught in 40 minutes and has three parts: Introduction, Activity, and Wrap-up (Figure 5)
- Format: to accommodate varied ICT infrastructure in schools, we created two modes of resources for teachers to deliver lessons: Blackboard lesson plans and Projector lesson plans. Blackboard lesson plans are optimised for teachers on mobile devices and also work offline. They can also serve as a back-up in the event of electricity outages. Projector mode is for use in classrooms that have access to a projector. We also created sets of optional, downloadable printouts for teachers to use as a paper back-up (Figure 6)

Teachers' guide: This includes an introduction to why the learning goals are important, a description of the content and how it ties to the curricula in Kenya, Rwanda, and Uganda, how to navigate the resources, how they were developed, and where to find other relevant resources (Figure 7).

Extra resources:
- Teacher training materials, designed to be taught in workshops by teachers who participated in pilots (Figure 8)
- Glossary
- Examples collection (alternative examples of health actions for each lesson) (Figure 9)
- Teaching strategies for teaching critical thinking
- Underlying principles

Findings from Phase 1: Groundwork
Below we describe output from the preliminary studies in Phase 1. As explained earlier, data from these studies have already been reported and discussed in other publications. However, for the reader to understand the full development process, and how these results influenced our objectives, content, and design, they are summarised below.

Context analysis findings and corresponding design decisions
Based on these findings, 40-42 we identified opportunities and challenges, and made decisions about how to resolve them.
Stakeholders considered it important for students to learn to think critically about health, and the curricula included related learning goals. But teachers lacked the capabilities and opportunities to teach both critical thinking in general and thinking about health choices more specifically. Additionally, schools' digital infrastructures ranged from fully equipped student computer labs to few devices and no Internet. Some of the strategies we decided to use included creating a teachers' guide with in-depth background content and providing extra information about teaching strategies for critical thinking. We created lessons in different modes, with offline functionality, to accommodate variation in access to computers and digital infrastructure. See Table 8 for more detail about these findings and our solutions.

Stakeholders chose nine IHC Key Concepts to form the basis for the lessons
Twelve curriculum specialists, teachers, and researchers in Kenya, Rwanda, and Uganda prioritised nine of the IHC Key Concepts (Box 1) as suitable, relevant, and important for lower secondary school students. 55 These nine concepts formed the basis for the learning goals and subsequent content development. Curriculum developers, teachers, and students said it was important for students to learn to think critically about health information and choices. However, teaching was largely exam-focused, so teachers and students were unlikely to prioritise content not included in exams.
In the teachers' guide, we included descriptions of how the content mapped onto each country's national curriculum.We communicated regularly with curriculum development offices to facilitate alignment with curricula, ownership of the resources, and future uptake of them.

Capability to teach the content
Teachers lacked prior knowledge of the IHC Key Concepts.
We created a teachers' guide with an introduction and more detail about the IHC Key Concepts.
In each lesson plan, we created a detailed background section describing the respective IHC Key Concepts for that lesson.
Teachers said they lacked experience teaching and evaluating critical thinking in general. We did not identify any existing resources in use for learning or teaching critical thinking.
We created lessons drawing on teaching strategies for critical thinking that we identified in our overview of systematic reviews and included descriptions of these in lesson plans and Extra resources.

Capability to use digital resources
Teachers had varied experience using ICT for teaching. Many lacked ICT training.
To facilitate ease of access, we developed open-access, web-based resources. The design is responsive and therefore suitable for any screen size.
To increase ease of use, we dropped login functionality and simplified the interface as much as possible. We used large font sizes, consistent formatting, and a minimal amount of text to facilitate ease of use during teaching.
In the teachers' guide, we included a help section explaining the navigation and technical features of the site.
We created sets of optional, downloadable printouts for each lesson, for teachers who preferred or were able to make paper copies.

Opportunity to teach the content within existing curricula using digital resources
We identified subjects in the Kenyan, Rwandan, and Ugandan curricula where the IHC Key Concepts could fit in the event of future uptake. Curriculum development offices would need to approve the use of any new teaching resource, and it must be possible to download, adapt, and republish resources on national platforms.
To facilitate tailored implementation, we created an adaptable, translatable solution using Google Drive as an editing platform. Curriculum development offices can install their own versions of the resources for future translations or adaptations.
Schools had very different levels of access to ICT infrastructure for teaching and learning, ranging from almost no access to well-equipped computer labs. About half of Rwandan secondary schools have "smart classrooms": computer labs with laptops and Internet access. Poor Internet connectivity and unstable electricity were persistent barriers to the use of digital resources in many schools in all three countries.
To accommodate varied access to ICT across schools, we created resources for delivering lessons in three different modes: Blackboard, Projector, and Student-computer. Blackboard lesson plans were optimised for teachers on mobile devices; these could also serve as a back-up in the event of electricity outages. Projector lesson plans included downloadable lessons formatted as Google Slides presentations. We also created sets of optional, downloadable printouts. Student-computer lesson plans were designed for computer-lab classrooms.

We embedded teaching strategies for critical thinking in the lessons
In an overview of systematic reviews, we identified teaching strategies for helping primary and secondary school students learn to think critically.44 We experimented with using different strategies in each lesson, but found it added too much to teachers' and students' procedural cognitive load. Therefore, we chose a limited set of strategies to minimise variation and used most of them across all lessons (Box 2). Additionally, we prepared summaries of all the teaching strategies we considered using.

To accommodate users with a poor Internet connection, we built in offline functionality: when the user visits a page online, an automatic download begins, with messages and icons alerting the user. We also created a cached version that loads faster on slow Internet connections.
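The article does not publish the implementation of this offline behaviour. The following minimal Python sketch (all names hypothetical, not the authors' code) models the "cache-first" logic that such offline support typically uses: a page fetched once while online is stored locally and served from that local copy on later visits, so the lessons remain available without connectivity.

```python
# Hedged sketch of a "cache-first" resolution strategy, as commonly used
# for offline-capable web resources. All names are illustrative.

def resolve(url, cache, fetch_online):
    """Return the page for `url`, preferring the local cache.

    On a first (online) visit the page is fetched over the network and
    cached (the "automatic download"); later visits need no connection.
    """
    if url in cache:
        return cache[url]        # served offline from the local copy
    page = fetch_online(url)     # may fail if there is no connection
    cache[url] = page            # store for future offline use
    return page
```

In a production web application this decision order would normally live in a service worker's fetch handler rather than in application code; the sketch only illustrates why a page visited once online remains usable offline.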
Box 1. The nine prioritised IHC Key Concepts.
• Treatments can cause harms as well as benefits.
• Widely used treatments or those that have been used for a long time are not necessarily beneficial or safe.
• Treatments that are new or technologically impressive may not be better than available alternatives.
• Personal experiences or anecdotes alone are an unreliable basis for most claims.
• Identifying effects of treatments depends on making comparisons.
• Comparison groups should be as similar as possible.
• Small studies may be misleading.
• Large, dramatic effects are rare.
• Weigh the benefits and savings against the harms and costs of acting or not.
Box 2. Teaching strategies embedded in lessons.

Strategies used across all lessons:
- Guided note-taking
For more detail, see also the descriptions of teaching strategies we considered including (located under the Extra resources menu of the Besmarthealth.org website).

We compiled a collection of student-relevant examples
To illustrate concepts and make lessons more engaging, we used a broad range of examples in the lessons, involving conditions and health actions that students said they found interesting.45 We also created a searchable collection of relevant health actions and health problems or goals that can be used to illustrate the Key Concepts, with plain-language summaries of the evidence for each example (Figure 9). The examples collection is part of the "Extra resources".
Findings from Phase 2: Development cycles

Using our findings from the Groundwork phase as a starting point, we conducted three development cycles of idea generation, prototyping, data collection and analysis, and stakeholder checking, which form the basis of this paper. The feedback was generally consistent regarding what people experienced as positive or problematic. Although there were far more positive findings, our main focus in collecting and reporting feedback was to identify problems that would require changes for the next iteration of the resources. We present the main findings thematically below, together with the corresponding design decisions that we made based on those findings.

"The lessons enabled the students to think of a personal choice and the advantages and disadvantages of making a certain decision." (Teacher, group interview, Kenya)

Engaging and confidence-building for students
Most students found the lessons engaging and felt that the lessons increased their confidence. In the pilot lessons, students participated actively in quizzes and activities, including some who were normally quiet.
"I am shy, but group work made me stand to present and this will give me confidence."(Student, group interview, Uganda)

Structure and design of the resources
For the most part, the structure of the resources and the lessons was appreciated.
"I am impressed of how you present the content in a clear and easy-to-follow manner. I also like how you organize the concepts in main groups and subgroups, it gives an immediate understanding of the focus areas." (International advisory group member)

In Rwanda, stakeholders favoured a lesson structure that encourages "discovery", where the teacher does not provide explanations before an activity. However, in Kenya, stakeholders favoured a lesson structure where the teacher provides definitions and explanations upfront, which are then reinforced through learning activities.
Design decision: We added a "discovery" question at the start of each lesson and provided definitions and explanations before the activity.

Usefulness for thinking critically about health actions
Many teachers and students said the resources were useful for thinking critically about health actions, and for critical thinking in general.
"It was nice and educative. It will help us not to just follow band wagon. Like we buy certain toothpaste but just because it is many friends use it." (Student, group interview, Uganda)

Satisfaction with projector and blackboard modes
The projector mode worked best, with the illustrated slides helping to focus class attention and aiding understanding. The blackboard mode also seemed to work satisfactorily.
"Projector based lesson plans helped both teacher and students to focus on one activity than the computer-based version. There wasn't any interruption during the lesson." (Observation pilot lesson, Rwanda)

Useful lesson printouts
Some teachers found the printouts a useful way of continuing lessons even when there was a power outage preventing them from using a projector.
"After some few minutes, the Power went off and the projector stopped showing the lesson. The teacher immediately gives the printouts to students to follow the lesson. After 3 minutes, the power returned." (Observation pilot lesson, Rwanda)

Positive, constructive findings
Many teachers and students appreciated the learning in these lessons and experienced it as relevant both for other subjects and for their daily lives. Many students participated actively in class, including students who were normally less active. Many teachers felt the Projector mode of lessons worked best, providing structure and visual support to the lesson and helping to focus class attention. The blackboard mode and printouts worked satisfactorily for classes without projectors, or as a back-up in the event of power failure. Table 9 provides a detailed description of positive or constructive findings.

Negative findings and corresponding design decisions
We resolved problems as they appeared and changed the design of the prototypes dramatically from the first to the final development cycle. However, four main thematic challenges persisted throughout development: the student-computer mode of lesson plans; time; comprehension; and examples.
• Computers in classrooms took time to set up; equipment problems were common; students were often distracted by other online content; and teachers found class discussions difficult to organise when students were using computers. Since teachers who had tried both the Student-computer and Projector modes preferred the latter, we agreed with curriculum developers to drop the Student-computer mode.
• School schedules lacked time to teach lessons that were not in the curriculum, and teachers struggled to complete lessons within the allotted 40 minutes. We simplified and shortened lesson content as much as possible, and developed modules for teacher training workshops to increase teachers' confidence and capability to teach the lessons.
• Students had varying degrees of difficulty understanding lesson content, sometimes displaying a misunderstanding that was exactly the opposite of what they were intended to learn. We added "Common misunderstandings" to the lesson background, and designed lessons to include more review opportunities and informal assessment questions. We are exploring experiences and views of potential adverse effects in separate process evaluations30,31,56 and in a qualitative evidence synthesis.57
• We struggled to find appropriate examples of reliable and unreliable claims, as students often thought the lesson was about the example rather than the underlying Key Concept. Our solutions included striking a balance between real and fictional, well-labelled examples; providing guidance and prompts for teachers (including suggestions to find their own examples); and developing an example collection in which teachers can find alternatives.
Tables 10-13 provide a more detailed description of challenges and corresponding design decisions.

Other noteworthy design decisions
In addition to the main "design decisions" presented in Tables 8-13 above, we also made larger, more sweeping decisions that involved a reframing of the problem we were trying to solve. For example, our original project aim was to design digital learning resources for students to interact with. But over time we gradually increased our focus on designing resources that met teachers' needs, as we became aware of the challenges they experienced teaching this content within the constraints of their skillsets, curricula, class sizes, and varying quality of ICT infrastructure. Ultimately, we eliminated the student-computer mode of the resources altogether, even for schools with computer labs, as a consequence of the scope of technical, logistical, and behavioural challenges we observed in school pilots. We reflect on this decision in the Discussion, in particular as it relates to considerations for teaching critical thinking.

COVID-19 consequences
The COVID-19 pandemic disrupted our work in many ways. At times, social-distancing and travel restrictions, as well as school closures, prevented in-person meetings, data collection, and piloting. When schools reopened, teachers were pressed to make up for lost time, sometimes rushing to finish a pilot lesson or not completing it. On the other hand, the abundance of unreliable claims about COVID-19 provided timely examples and motivation.
"The lesson itself was very easy to understand because the students have already experienced what was used as example in the content (closing schools to reduce the spread of covid-19 infections)." (Observation pilot lesson, Rwanda)

Workarounds to these challenges included communicating with stakeholders individually via audio or video call, or chat, and conducting some interviews over home networks.

Discussion
Using a human-centred design approach, we developed adaptable, digital resources for teaching critical thinking about health in secondary schools in Kenya, Rwanda, and Uganda. In Phase 1 of the design, we gained insight into user and stakeholder needs, and the wider educational contexts. In Phase 2, we iteratively developed prototypes by generating ideas, prototyping, collecting and analysing data, and checking our ideas and analysis with stakeholders. The result was a final set of educational resources, descriptions of findings from data analyses, and corresponding design decisions we made.

Why a human-centred design approach?
Human-centred design is a creative approach to problem solving (e.g. developing artifacts, services, systems) that is characterised by early engagement with multiple stakeholders and iterative development.58 It has been used across a number of different areas of research and development in fields not traditionally associated with design,59-62 and is particularly well suited to addressing complex problems.63 The problem we set out to address was complex in many ways, with many unknowns:
- We knew little about whether the attitudes and perspectives of key stakeholders were aligned with the overall goals of this project.
- We knew little about where in the curricula this new content might fit in or overlap.
- We knew very little about the context of use (e.g. what digital infrastructure was available in secondary schools across the three countries, and how it was being used in teaching).
- We knew very little about the characteristics and needs of end users.
- We knew very little about the desired attributes of the solution, especially given the complexity of teaching critical thinking. Tiruneh et al. argue that "Critical thinking requires complex learning … learning environments that may be successful for simple learning outcomes may not work well for the development of complex learning".64
- We needed to learn what factors regarding the solution could support not just use, but future implementation.
For these reasons, we needed an approach that was not merely task-focused or "user-centered" but that included exploration of the learning context, collaboration with multiple key stakeholders, and that allowed us to learn about, reframe, and resolve problems as we worked.

To support students' and teachers' understanding:
• We iteratively simplified the lesson structure, language, and content.
• We added "Common misunderstandings" to each lesson Background.
• We added review questions and short key messages at the end of lessons, which are repeated at the start of the next lesson.
• We developed printouts with definitions of key terms.
To support teachers' evaluation of students' understanding and progress, we incorporated two review lessons (lessons 5 and 10) with quizzes.
To support teachers' understanding of the content and confidence in using the resources, we added a teacher training component to the intervention: a two-day workshop with presentations and activity materials, introducing teachers to the resources, teaching strategies, and content of each lesson, as well as providing teaching and preparation tips. We designed these training materials to be used by teachers who were already familiar with the resources (e.g., those who participated in pilots), so that training would not be dependent on our team's participation and would be scalable.

Reflections about learning theory
We did not start this work with an explicit theory of learning. Our intention was to understand the existing curricula and teaching practices in the three countries, then develop resources that could fit these (or be adapted to them), rather than imposing a new theoretical perspective from the outside. Some of the curriculum development offices have made their learning theories explicit. For instance, the Republic of Kenya Basic Education Curriculum Framework is clearly founded on a constructivist approach to learning.65

In the background section of each lesson plan, we summarised research evidence about the examples used in that lesson. We also provided summaries of the evidence for the effects of all the health actions used in the resources in the searchable collection of health actions. We did this to ensure that teachers were familiar with the examples, felt comfortable using them, and were prepared to prevent misunderstandings about the reliability of claims or evidence relating to them.
In retrospect, our approach to lesson structure resembles Merrill's First Principles of Instruction.66 BeSmart lessons are taught in the context of health conditions, treatments, and claims that learners are familiar with and interested in; lessons begin by engaging learners' prior knowledge and experiences; teachers provide a brief introduction to new content and demonstrate how it is used; learners apply the new knowledge to new examples; and peer-to-peer and class discussions provide opportunities for reflection on what is learned and how it is relevant to the learners' own lives.
More generally, we are influenced by Bruner's concept of a "spiral curriculum" when conceptualizing how resources to teach the IHC Key Concepts might best be integrated into a curriculum: introducing concepts at a young age and revisiting them repeatedly with increasing degrees of difficulty and reinforcement of previous learning.67 In their systematic review of strategies for teaching students to think critically, Abrami et al. found that three strategies were most effective, especially when combined: "the opportunity for dialogue, the exposure of students to authentic or situated problems and examples, and mentoring".69 Based on those findings, and findings from our overview of systematic reviews,44 we focused on using authentic examples and discussion activities in our secondary school resources.
A review of education technology in low-income countries distinguishes between two categories of digital interventions: computer-aided learning (CAL) and computer-aided instruction (CAI).70 Broadly speaking, CAL interventions are designed for the student-as-user and either complement or supplement classroom instruction. CAI interventions, on the other hand, are designed for the teacher-as-user and aim to enhance the quality of instruction. We developed and piloted resources for both CAL (the Student-computer mode of lessons) and CAI (the Blackboard and Projector modes of lessons).
We found a variety of problems with the student-computer (CAL) resources. Many of the problems were logistical or technical and not specific to our project, such as the time needed to set up and charge computers, faulty or missing equipment, and unstable electricity. In addition, some curriculum developers (where the student-computer resources were piloted) said that the resources lacked media-richness (such as animations or videos) and interactivity, which we could not feasibly develop in this project and which teachers and students were possibly not experienced enough to use.
Regardless of the level of ICT preparedness, it might be that self-led study strategies and CAL interventions are not the best starting point for developing critical thinking skills, since it is more challenging to create opportunities for dialogue when students are alone behind a screen, working at different paces from each other. In our case, developing functionality for dialogue to take place on the computer was not an option, due to technical barriers in the settings, such as lacking or unreliable Internet access. Computer-based mentoring71-74 would require much more sophisticated programming than we had resources for. Moreover, less literate and less computer-literate students would be at a disadvantage. In their review, Abrami and colleagues found that discussion was especially effective when the teacher posed questions and there were whole-class, teacher-led discussions or teacher-led group discussions.69

Rwandan teachers who piloted both the student-computer and projector resources preferred the latter, with the blackboard and printed resources as a back-up for when electricity failed. Besides being technically less demanding, the slides seemed to better activate students, support small- and large-group discussion, and focus collective attention. Moreover, teachers can download and tailor the slides. The lower-tech CAI modes require much less investment in equipment than high-tech CAL modes and could dramatically lower the threshold for bringing digital learning resources that support critical thinking into low-resource classrooms.

Strengths and limitations
Strengths of this study include comprehensive groundwork; several development cycles, including pilots of all lessons in three countries; input from a wide variety of stakeholders and participants; systematic methods used to prioritise content and identify effective teaching strategies; data collection using complementary, mixed methods; independent initial coding and analysis by more than one researcher; and stakeholder checking to triangulate analyses.
A limitation is that the project team developed the prototypes and collected and coded the data. This might have biased participants to give positive feedback, or introduced bias in the analysis, leading to an exaggeration of positive feedback or a lack of attention to negative feedback. We took measures to mitigate this: the interview guide included questions that explicitly encouraged participants to provide negative feedback; data were collected in three countries and coding in each country was reviewed by two researchers to reduce the potential for bias; and we introduced "stakeholder checking" to triangulate our data analyses and check that the solutions we developed were appropriate.
It is possible that the amount of data we collected placed an undue burden on participants and other stakeholders. We are exploring this in a separate stakeholder evaluation.38 It is also possible that the prototypes caused harm to some students who misunderstood concepts or examples. We are exploring potential adverse effects of the final resources in process evaluations30,31,56 and a qualitative evidence synthesis.57

Strengths and limitations of the human-centred design approach
The main strengths of this approach are ensuring we were solving the right set of problems and building ownership of the output by engaging early with a wide range of stakeholders; working iteratively with low-fidelity prototypes before committing to a final solution; and taking a holistic approach that includes early consideration of implementation during development.63,75,76 However, this approach to developing educational resources likely takes much more time to practice than less comprehensive methods.75 When practiced outside of research projects, human-centred design involves processes that resemble qualitative research methods, but with less emphasis on creating an audit trail that can be replicated by others.77 When these design processes are embedded in and framed as research, any point of contact with stakeholders (e.g. observation, feedback, or collaboration) becomes a potential data collection event. The large number of interactions we had with stakeholders resulted in a large amount of data that was at times difficult to analyse, a problem that can also occur in qualitative research.78 We also chose to report the entire 2.5 years of development in this one article, possibly making it difficult to follow the data trail. An alternative strategy may have been to write up each design iteration as a separate manuscript, then link them together as reports summarised in one article.79

Reflexivity considerations
The core project team all have backgrounds as public health researchers, while the stakeholders recruited for the purposes of collaborating and providing feedback mostly have backgrounds in education, with no previous engagement in this research topic. The core project team members are from many different settings, while all stakeholders are from East Africa, with the exception of the international advisory group. The PhD fellows and several other core team members were working full-time on this project, and therefore highly invested in a positive outcome, while stakeholders were participating voluntarily, in addition to their other full-time work or study commitments.
We have tried to be mindful of our differing agendas, cultural perspectives, and the ways in which the core team's ambitions might influence the outcome of the work or place undue burden on the participants. We discussed these topics frequently during weekly core team meetings and have also explored some of these questions using more structured methods, designed as separate studies. The context analyses we conducted at the beginning of this work helped us view our research agenda from the perspective of teachers and curriculum developers and understand whether and how our research might better align with existing objectives within the three educational systems. The stakeholder engagement assessment (manuscript under development) provided stakeholders with opportunities to give us feedback about how they experienced their participation while it was ongoing. The qualitative evidence synthesis of post-trial process evaluations (manuscript under development) will help us understand if there have been any adverse effects of the resources, seen from the point of view of the participants.
Relationships between the core team and recruited stakeholders developed over time, and some people who were initially recruited as participants became engaged to the degree that they fulfilled requirements for co-authorship of this study.

Conclusion
Using a human-centred design approach, we created adaptable, digital, open-access resources for teaching lower-secondary school students to think critically about health actions, which teachers and students in Kenya, Rwanda, and Uganda found relevant and useful. The human-centred design approach provided a helpful framework for incrementally developing solutions that served the needs of teachers, students, and schools in both high- and low-resource contexts. Designed to accommodate use in classrooms with differing digital infrastructure, including lack of Internet connectivity or unstable electricity, the resources include materials for ten 40-minute lessons in two modes, for use in classrooms equipped with a blackboard or a projector. In addition to the lessons, the resources include a teachers' guide, a glossary, a searchable set of alternative examples of health actions, summaries of teaching strategies for critical thinking, and teacher training materials.
Teachers with access to a projector appeared to prefer the projector resources. Persistent challenges during development included issues with the student-computer lessons, finding sufficient time in the school schedule for content not included in the curriculum, misunderstanding of some of the concepts, and problems related to the use of examples.
We provide details of how we approached emerging challenges, with corresponding design decisions. The COVID-19 pandemic disrupted the design of the resources, but also provided timely examples and motivation to learn.
A prospective meta-analysis synthesising the results of the three randomised trials has shown that the resources led to a large improvement in the ability of students and teachers to think critically about health choices, though only 42% of students achieved a passing score.80 The next research steps are well underway: process evaluations of the three trials, reporting of potential adverse effects, and one-year follow-up studies.

Data availability statement
Underlying data

Zenodo. Dataset for "Teaching critical thinking about health information and choices in secondary schools: human-centred design of digital resources", https://doi.org/10.5281/zenodo.7695782.47

This project contains the following underlying data:
• DevelopmentCycle1__data feedback and observations.csv (Coded datapoints extracted in Development cycle 1 from individual/group interviews with teachers/students and observations of lesson pilots in Rwanda, Kenya and Uganda)
• DevelopmentCycle1_internal team feedback.csv (Feedback, comments, suggestions from our research team during Development cycle 1)
• DevelopmentCycle1_interntl-advisory-grp-dec-jan-2020.csv (First round of feedback from international advisory group, via email)
• DevelopmentCycle1_interntl-advisory-grp-jun-aug-2020.csv (Second round of feedback from international advisory group, via email)
• DevelopmentCycle1_interntl-advisory-grp-proto1.2.csv (Third round of feedback from international advisory group, via email)

This project contains the following extended data:
• COREQ_checklist New.pdf (COREQ checklist for this article)
• Design reporting checklist.pdf

Peer review report
The authors provide a detailed description of a multi-phase and multi-step process to develop and test learning modules about health information and health choices for children aged 13-16 in Kenya, Rwanda, and Uganda. The project was developed and tested in partnership with many stakeholders in the three countries. The Introduction opens with a brief discussion of the many challenges of identifying accurate information but ends with a purpose statement that the article will describe the development of the learning modules. The rest of the paper is about the module development process, with Methods and Results being about the modules and learning techniques. Although the paper includes a discussion of teaching strategies in the Results section, what is missing is a theory of learning and expected results that would help explain why the team chose the human-centered design approach that it did, along with some literature justifying this choice. The paper provides a lot of detail about how a human-centered design approach collects feedback and iterates prototypes, but I am unclear why a human-centered design approach was necessary to produce this set of modules. I am also unclear about the lessons the authors want to impart to readers. Should we always use a human-centered design approach for health curriculum for children of these ages? Under what conditions or purposes might a different approach be better? Instead of naming the products developed as the primary results, I expected to read about how well or not a human-centered approach had worked to produce these products. I also lost the thread of what happened to all the interview, group discussion and stakeholder data. The authors provide a rich description of their development process, but I'm not sure what they intend for readers to take away from this report of their process and products.

Is the work clearly and accurately presented and does it cite the current literature? Partly
Is the study design appropriate and is the work technically sound? Yes

PEER REVIEW: Should we always use a human-centered design approach for health curriculum for children of these ages? Under what conditions or purposes might a different approach be better?
RESPONSE: Educating young people in low-resource school settings to think critically about health choices is a complex problem. Developing instructional material for simpler and more well-defined learning tasks might be approached by less resource-intensive methods. I have added text elaborating on this issue under Discussion: "Why a human-centred design approach".

4) PEER REVIEW:
Instead of naming the products developed as the primary results, I expected to read about how well or not a human-centered approach had worked to produce these products.
RESPONSE: "Results" are not only what we found when collecting and analysing data, but also the final solution we created and the noteworthy design decisions we made along the way. These three types of findings can be of interest to a range of readers: people who are designing educational resources based on the IHC Key Concepts; people who are designing related educational resources, such as for teaching critical thinking or for use in low-resource learning environments; or people interested in the design approach more generally within the context of a research inquiry. However, I agree that we could have also included more explicit findings related to the human-centred design approach. Bazzano et al. (2020) proposed a reporting guideline for health research involving design that suggests reporting the following from design activities in the results section: What was designed

Interventions to teach critical thinking are seldom included in the curricula of schools in the countries in which this research was conducted, despite critical thinking being recognized as a valuable skill for young people to acquire, which will help them make decisions through the life course. In the countries in which the research was conducted, and in many other countries, teachers lack skills and resources to enable them to teach critical thinking. Both the intervention and the measure selected to evaluate the effects of the intervention are based on a large body of sound research, which is referenced in the manuscript.
The manuscript provides valuable information about how to use a human-centered design approach to adapt a program that promotes critical thinking about health and to adapt measures to evaluate the program.Stakeholders contributed to the intervention development process is described in sufficient detail and is appropriate.The changes made to the interventions in response to stakeholders' input are detailed.The problems identified during the design process, and the actions taken to address such problems indicate how feedback from stakeholders was used to inform the development cycles of the intervention.
The description of the set of lessons is clear and valuable and together with information about the key concepts that were taught in a sample lesson helps the reader gain more information about the intervention.The manuscript is written clearly, and the research processes are described comprehensively.It is rare to read a manuscript in which the intervention development process is set out systematically and clearly.The manuscript is excellent, and I do not have any suggestions for improvement.strategies to build on, how to format the materials).As we made formatting and content decisions, we gradually developed a full set of lessons, not all of which were piloted individually in the the first development phase."

Major points
Question 3: The authors translated several of the audio recordings from individual and group interviews. Please add information on the selection of the recordings and why it was done.

Reply 3:
I changed the text in the first paragraph under "Data analysis" to reflect more accurately what was done in each country: "In Uganda and Kenya, all interviews were carried out in English and transcribed. In Rwanda, interviews were carried out in Kinyarwanda and transcribed in English. There, the PhD fellow and research assistant relied on notes they made during interview sessions, and selected some recordings to review when further clarification of their notes was needed."

Question 4: The authors provide a teacher's guide and also reported that lack of competences was a relevant issue. Please clarify whether participation in the workshop for teachers was optional or mandatory and whether teachers' competences were assessed.

Reply:
The workshop was a part of the intervention that was tested in three randomised trials evaluating the effects of these resources, and it is reported in those studies. Participation in the workshop was not mandatory. Teachers' competences were not assessed at the end of the workshop, but they were assessed in the trials at the end of all the lessons. You can find the trial results for both students and teachers in Table 3 of this article: Effects of the Informed Health Choices secondary school intervention: A prospective meta-analysis: https://onlinelibrary.wiley.com/doi/10.1111/jebm.12552

I have added the following short summary of the results of this prospective meta-analysis in the Conclusions section, along with the corresponding citation reference: "A prospective meta-analysis synthesizing the results of the three randomised trials has shown that the resources led to a large improvement in the ability of students and teachers to think critically about health choices, though only 42% of students achieved a passing score [64].
Next research steps are well underway: process evaluations of the three trials and one-year follow-up studies."

Other updates to reflect referenced studies that are now published:

○ Reference #52 has been updated, as this article has now been published.

○ Reference #64 in the Conclusion is new, as the trial results are now published.

Figure 1. Phases of the work.

Figure 3. Overview of the three development cycles.

Figure 4. Final version of the "Be smart about your health" resources. Contents include Teachers' guide, Blackboard lesson plans, Projector lesson plans, and Extra resources.

Figure 5. Lessons designed in two modes: Blackboard mode for use in classrooms equipped with a blackboard or a flip-chart, and Projector mode for use in classrooms with a projector.

Figure 6. Lesson plan content. Each lesson plan includes an Overview, Lesson, and Background section for teachers. Each Lesson has three parts: Introduction, Activity, and Wrap-up.

Figure 7. Teachers' guide includes: Introduction, Overview of the lessons, Using the resources, Background for teachers, Development and evaluation, and Other relevant resources.

Figure 8.

Figure 9. The examples collection provides alternative health actions or health problems/goals for each lesson or Key Concept in the resources.

- Small group work
- Response cards
- Homework: collecting claims and choices about health actions
- Standard lesson structure
- Setting objectives and providing feedback
- Multimedia design

Strategies used in individual lessons:
- Concept mapping
- Concept cartoons
- Inquiry-based instruction
- Quiz
- Role play

○ Summary of major insights or reflections from the design process
○ Description of decisions made during the design process

The guidance also suggests reflecting on the application of design to the work, including strengths and limitations. I have added a completed guidance checklist to the Extended data set, which indicates where in the manuscript each item can be found. Additionally, I am making the following changes to the manuscript:

"Results" section changes:
○ Made changes to the 1st paragraph in the Results section to more clearly indicate the type of results we are reporting
○ Added heading and text: "Other noteworthy design decisions"
○ Made changes to some subheadings in the Results section
○ Made changes to some column headings in Tables 9-13, to clarify where results about "design decisions" could be found

"Discussion" section changes:
○ Added "Strengths and limitations of the human-centred design approach for this work"

"Conclusion" section changes:
○ Adjusted text to differentiate more clearly between the three types of results

PEER REVIEW: I also lost the thread of what happened to all the interview, group discussion and stakeholder data.

RESPONSE: Our data are published on Zenodo. See the list of data under "Underlying data", reference 47. However, I acknowledge that the documentation of a project of this nature can be difficult to follow. Therefore I've added content to the Discussion addressing this under the heading "Strengths and limitations of the human-centred design approach".

Competing Interests: No competing interests were disclosed.

Reviewer Report 05 September 2024
https://doi.org/10.5256/f1000research.145511.r250515
© 2024 Mathews C. This is an open access peer review report distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Catherine Mathews
South African Medical Research Council, Cape Town, South Africa

This manuscript describes the processes undertaken to develop a set of digital learning resources to teach students in Kenyan, Rwandan and Ugandan schools to think critically about health and health care, and to develop skills to assess the reliability of claims about interventions to improve health.

Table 1. Participant selection and recruitment methods.

Table 3. Characteristics of schools participating in pilot of ten lessons.

Table 4. Overview of preliminary studies.

Table 5. Revised version of Morville's framework of user experience.

Understandability: Does the user recognize what the product is, and do they understand the content? (Subjective experience of understanding)
Credibility: Is the content trustworthy?
Desirability: Is this product something the user wants/has a positive emotional response to?
Identification: Does the user feel the product is for "someone like me", or is it alienating/foreign-feeling? (e.g., age, gender, culture-appropriate)

Ethics approval
We obtained ethics approval for the entire project from local institutional review boards and national research governing councils in Kenya, Rwanda, and Uganda: the Rwanda National Ethics Committee (approval number 916/RNEC/2019); Masinde Muliro University of Science and Technology Institutional Ethics Review Committee and the Kenyan National Commission for Science, Technology and Innovation (approval number NACOSTI/P119/1986); and Makerere University School of Medicine Research Ethics Committee (REC REF 2020-139) and Uganda National Council of Science and Technology (HS916ES).

Table 6. Scope of teacher and student data collection (interviews, lesson pilots, tests).

Table 7. Final list of ten 40-minute lesson plans.

… Thinking critically to make smart personal choices about health actions
Lesson 9: Community choices. Thinking critically to make smart community health choices.
Lesson 10: Using what we learned. A review of all nine lessons, applying what has been learned to daily life, and recognising its limits.

Table 8. Synthesis from context analyses and solutions. Curricula in Kenya, Rwanda and Uganda included learning goals related to 'critical thinking' as a generic competence and to 'health', but none related specifically to 'critical thinking about health'.

Table 10. Difficulties using the Student-computer mode.

Table 12. Problems with understanding.

Table 13. Unsuitable or distracting examples illustrating the Key Concepts.

• DevelopmentCycle1_legend.csv (Description of implication codes we used to assess the importance of observations and feedback for the next iteration of the resources in Development Cycle 2, e.g., "Problem/Showstopper/A problem that we should address")

Extended data
Zenodo. Extended dataset and Coreq Checklist for "Teaching critical thinking about health information and choices in secondary schools: human-centred design of digital resources", https://doi.org/10.5281/zenodo.7806139 (reference 48).

• DevelopmentCycle2_internal team feedback.csv (Feedback, comments, suggestions from our research team during Development cycle 2)
• Pilot data and test results -analysis of problems and proposed solutions.pdf (Analysis of problems and proposed solutions after lesson pilots at the end of Development cycle 2 that we presented to curriculum developers for stakeholder checking and input)

• Group Interview guide Post-Pilot -Students Group interview guide.docx (Guide used as a basis for interviewing groups of students after they had participated in piloting lessons at the end of Development cycle 2)
• Group Interview guide Post-Pilot -Teachers.docx (Guide used as a basis for interviewing teachers after they had piloted lessons at the end of Development cycle 2)
• Interveiw guide Early -Student network.docx (Guide used as a basis for interviewing students in Development cycle 1)
• Interview guide Early -Curriculum develop Questionnaire.docx (Guide used as a basis for interviewing curriculum developers in Development cycle 1)
• Interview guide Pilot -Teachers User testing interview guide.docx (Guide used as a basis for user test interviews with participating teachers in pilot lessons at the end of Development cycle 2)
• Interview Guide Pre-pilot -Teacher-Student -User test.docx (Guide used as a basis for user test interviews of students and teachers before they participated in pilot lessons at the end of Development Cycle 2)
• Observation form Pilot -Observers.docx (Form used by observers to structure observation data of pilot lessons)

Data are available under the terms of the Creative Commons Attribution 4.0 International Public License.