Background

It is estimated that half of all adult mental health conditions emerge by the age of 14 (Kessler et al., 2007), and so schools are considered key settings for mental health prevention. While the risk of children and adolescents developing these conditions can be reduced via school-based early prevention programs (Werner-Seidler et al., 2021), there is increasing evidence that educators experience many barriers to implementation which undermine potential benefits (Williams et al., 2020). While not exhaustive, these barriers include time constraints (Allara et al., 2019), lack of training, high staff turnover (Dijkman et al., 2017), misalignment between program and regular practice (Coombes et al., 2016), program implementation fatigue, prioritising academic learning over socio-emotional learning (Locke et al., 2019), lack of principal support, lack of financial and/or material resources (Herlitz et al., 2020) and educational policy constraints (Owens et al., 2014). Furthermore, school staff typically do not receive expert support or consultation on how to collect and analyse data to guide and improve program implementation (Bruhn et al., 2015).

To support prevention programs being embedded into schools, there is a need for implementation strategies to be developed and tested for efficacy (Lyon & Bruns, 2019). Implementation strategies are activities, processes, or resources that can be tailored to improve the implementation outcomes of evidence-based programs, which in turn enhance mental health outcomes (Proctor et al., 2009). There are eight implementation outcomes that predict mental health benefits: reach, adoption, sustainability, fidelity, cost, appropriateness, acceptability, and feasibility (Proctor et al., 2011). Implementation strategies can manipulate four types of implementation factors: system/environment (educational policy), organisational (whole-school setting), groups/teams of staff, and individual program providers (teachers) (Proctor et al., 2009).

In educational settings, there is emerging evidence to suggest that monitoring program delivery and providing feedback to staff directly involved in implementation (Solomon et al., 2012), distributing educational materials that explain the program (Nathan et al., 2016), recognising staff effort (Wolfenden et al., 2019), securing executive leadership support (Baffsky et al., 2022b) and using school champions (Dijkman et al., 2017; Nadeem et al., 2018) enhance implementation when used in combination with other strategies such as ongoing training (Sutherland et al., 2020) and coaching (Smith et al., 2018). To date, most school-based implementation studies have focused on physical health programs, and the majority have been conducted in the United States (US) (Barnes et al., 2021).

Strategies that work in other countries (e.g., US) may not work in Australia, due to unique operational and strategic differences between schools. First, the Australian curriculum is considered relatively crowded (Bowles et al., 2017) and mandates teaching socio-emotional skills (Australian Curriculum, 2018). However, there is no guidance on how these skills can be integrated into subject learning to protect teachers’ work hours (Gilbert, 2019), which are already higher than teachers in other OECD countries (Thomson & Hillman, 2019). Second, Australia’s national educational policy lacks targets for socio-emotional learning, such that it is often under-prioritised (Productivity Commission, 2020). Third, Australia does not have national or state-level policy guiding standardised mental health program implementation, as seen in the US (Laurens et al., 2021). Instead, the onus is on Australian principals to select evidence-based programs and develop their own implementation supports (Collie et al., 2017), adding to their already high administrative workload (Thomson & Hillman, 2019). To overcome these implementation challenges, the Australian Productivity Commission has recommended schools consider teacher training, coaching, monitoring program progress, and the use of dedicated program champions to support the uptake of evidence-based mental health programs (Productivity Commission, 2020).

To address implementation knowledge gaps, we are conducting a ‘hybrid type 3’ effectiveness-implementation trial (Curran et al., 2012) of an evidence-based social-emotional universal prevention program, the PAX Good Behaviour Game (PAX GBG), in New South Wales (NSW) government primary schools, Australia. The PAX GBG is a classroom-based intervention designed to support the development of emotional and behavioural regulation, and to teach prosocial decision-making (Simpson et al., 2020). The program consists of 10 evidence-based and trauma-informed strategies that teachers introduce into regular lessons. The first strategy sets expectations for appropriate classroom behaviour (called PAX behaviours) and inappropriate behaviours (called ‘spleems’). The other strategies encourage PAX behaviours and discourage spleems. Once students master the individual strategies, they are combined and simultaneously delivered as a ‘PAX Game’. During the game, students work in teams on an academic task for a fixed time. Teams who display four or fewer spleems at the end of the time period are rewarded collectively.

The standard PAX GBG model is supported by teacher training, coaching, a classroom toolkit to deliver strategies and a printed instructional manual. The 6–8 h teacher training is delivered by an expert from PAXIS, the US-based program developers, before implementation and covers strategy information and modelling. The physical program resources and instructional manual are distributed. Coaching is provided by the PAXIS program experts as needed during the first year of implementation and involves consultations to resolve local challenges. The PAX GBG and its standard implementation process are outlined in a logic model in Coombes et al. (2016).

Our PAX GBG trial is described in detail in the study protocol (Baffsky et al., 2022a). Briefly, we are using a cluster randomised controlled design to test whether an evidence-informed multicomponent implementation strategy (called ‘PAX Plus’) leads to higher rates of program adoption than the standard delivery model. PAX Plus is a toolkit offering access to nine evidence- and user-informed implementation strategies in addition to the training and educational materials provided in the standard model. Schools were randomly allocated to PAX Plus (intervention) or standard delivery (control) upon trial registration. The current study reports on the co-design of the PAX Plus strategy with educational staff (end users).

There have been some efforts, mainly in the US, to co-design strategies to enhance the implementation of evidence-based mental health prevention programs. For example, the standardised PAX GBG coaching was iteratively developed with input from teachers and students (Becker et al., 2013). Other school-based mental health programs (Bruns et al., 2016; DuPaul et al., 2018), and their implementation supports such as educational materials (Coelho et al., 2016), training (Kern et al., 2015), and implementation manuals (Coles et al., 2015) have been informed by feedback from teachers, parents and/or school mental health professionals.

Outside of the US, researchers have worked with health departments to select and adapt PAX GBG and its implementation supports to be culturally appropriate for different countries. In Brazil (Schneider et al., 2016), Estonia (Streimann et al., 2017), England (Coombes et al., 2016) and the Netherlands (Breeman et al., 2016) the language of PAX GBG was adapted to fit local context. An extended 3-day training was used in Estonia (Streimann et al., 2017) and the widely available PAXIS 1-day refresher training utilised in England (Coombes et al., 2016). In Canada, training was modified to include relatable examples of delivering PAX GBG in First Nations communities (Wu et al., 2019). In Sudan, educators and parents worked to co-design rewards they perceived would be more acceptable to students than the standard US rewards (Saigh & Umar, 1983). In Chile, regular team meetings were used to facilitate cultural adaptations of the PAX GBG during early adoption in schools (Pérez et al., 2005).

Whilst it is common for researchers to acknowledge that stakeholder feedback guided intervention development or refinement, studies rarely describe co-design processes in sufficient detail to allow quality assessment, learning, or replication. Furthermore, researchers frequently report that the co-design process improved a program’s acceptability, feasibility, adoption, effectiveness, sustainability (Kern et al., 2015) or fit to context (Becker et al., 2013; Bruns et al., 2016) without substantiating these results with data.

To address these gaps, this study reports the process by which a multicomponent implementation strategy (PAX Plus) was co-designed with educational staff in 15 NSW government primary schools prior to PAX GBG implementation, and then refined following acceptability testing 6 months into the trial.

Methods

Study Design and Ethics

This study used a transdisciplinary action research approach, in which researchers and educators shared power in co-developing the PAX Plus implementation strategy (Stokols, 2006). The development process was guided by Hawkins et al.’s (2017) framework for the co-design of public health interventions, using a qualitative methodology. Ethics approval for this study was granted by the University of New South Wales Human Research Ethics Committee (HC200759) and the NSW Government State Education Research Applications Process (SERAP 2020364).

Participants and Setting

The 15 New South Wales government primary schools involved in the co-design were registered in the wider trial, with plans to have staff trained in PAX GBG. Twelve schools were regional (80%) and three were urban (20%). School sizes ranged from 20 to 437 students (see Supplementary File 1 for further details). From these 15 schools, we worked with 29 educational staff, of whom 13 were teachers, 16 were part of the school leadership team, and 27 were female. Their roles and involvement in the co-design process are described in Supplementary File 1.

Co-design Procedure

Figure 1 shows the methods used across the three co-design phases in our study.

Fig. 1 Three co-design phases and overview of methods

Stage 1: Co-design of draft strategies

Two researchers (RB and MT) worked with the NSW Department of Education wellbeing leadership team (co-authors RS, PK, TP) at fortnightly (every two weeks) meetings for 3 months to shortlist potential strategies for inclusion in the toolkit. The Department of Education is responsible for funding and distributing resources to support the program. Meetings focused on discussing the feasibility of strategies with the strongest evidence of improving program implementation in schools. Strategies were identified through a rapid review of the academic and grey literature (original studies, review studies) published from 1 January 2000 to 28 January 2021. Four electronic databases (PubMed, PsycINFO, CINAHL and ERIC) and Google Scholar were searched. The inclusion criteria were: (a) Population—educators (teachers and school leadership staff); (b) Intervention—a strategy to support implementation of an evidence-based mental health prevention program for children in primary schools; (c) Study designs—randomised controlled trials, quasi-experimental studies, pretest-post-test studies and qualitative studies; (d) Implementation outcome(s)—adoption, reach, fidelity, sustainability, appropriateness, acceptability or feasibility; (e) Publication type—peer-reviewed journal articles; (f) Language—English. RB single-screened titles, abstracts and full texts against these inclusion criteria in a Microsoft Excel spreadsheet. See Supplementary File 2 for the full search strategy.

Stage 2: Co-design of the implementation toolkit

Three focus group discussions (FGDs) with educational staff were conducted to discuss and build upon the strategies drafted in Stage 1. A NSW Department of Education partner (RS) used purposive sampling to identify staff who had adopted PAX GBG and invited them to attend the FGDs. Informed ‘opt-in’ consent was obtained via email. RB and RS co-facilitated the FGDs using a semi-structured interview guide. The guide provided a brief overview of the six strategies identified in Stage 1 and their evidence base, followed by questions about acceptability, feasibility, and the optimal delivery model. A sample question about the ideal delivery mode for the ‘audit and provide feedback’ strategy was, “What type of feedback would you find most motivating (if any)?”. Lastly, the guide included open-ended questions about additional strategies educational staff wanted to use to overcome implementation challenges, to capture new ideas. For example, “What other strategies (not discussed) could help you to implement the PAX GBG?”.

There were six staff in the first focus group, five in the second and five in the third (Supplementary File 1). Each focus group lasted 45–60 min and took place between 25 February 2021 and 9 March 2021. FGDs were conducted online using the Microsoft Teams videoconferencing platform.

Findings from FGDs informed the development of the first iteration of the PAX Plus implementation toolkit, which consisted of eight implementation strategies (see Table 1). The toolkit described the rationale and evidence (if applicable) for each included strategy, and the actions required by schools to support its implementation.

Table 1 The eight strategies comprising the ‘PAX Plus implementation toolkit’ with descriptions and exemplar quotes from FGDs (N = 16)

Stage 3: Acceptability testing (prototyping)

Following the rollout of PAX GBG in schools, and after the intervention group had received the PAX Plus implementation toolkit for six months, RB conducted interviews with educational staff to identify early issues with the acceptability of PAX Plus and to guide refinements. The interviews took place between 25 November and 16 December 2021. Using purposive sampling, RB identified staff from different roles (teacher, principal, assistant principal, instructional leader) and varying geographical locations. Instructional leaders are staff who mentor and train teachers to support student learning. All staff consented to participate and to have their interviews recorded. Interviews were directed by a semi-structured interview guide asking staff to identify strategies that had worked to support PAX GBG in the past 6 months. Staff in the intervention group were asked about the acceptability and feasibility of the initial PAX Plus strategies, and to recommend changes. A sample interview question was, “To what extent did you find the e-newsletters appealing (if at all)?” Staff were also asked to suggest new strategies, not included in the original toolkit, that might be important to program implementation. Suggested strategies were tested for acceptability and feasibility in subsequent interviews. Each interview lasted between 35 and 50 min, and staff were reimbursed with a $40 e-gift voucher. The staff-informed refinements were formalised in version 2.0 of the PAX Plus implementation toolkit in January 2022, and then trialled in a second cohort of schools.

Data Analysis

Focus group discussions and interviews were analysed using a deductive framework analysis approach (Gale et al., 2013). Transcripts were managed and stored in NVivo software (QSR International, 2020). RB performed line-by-line deductive coding of the Stage 2 FGDs, which involved generating codes that related to strategies identified in the rapid review (Stage 1) or identified through this data. These codes were clustered into two a priori categories: (i) school-level strategies and (ii) teacher-level strategies, and then defined in an analytic framework. The analytic framework was used to deductively code the Stage 3 interview data and was iteratively refined based on interview data about which strategies worked in practice. The analytic framework was reviewed and refined in discussions between RB, MT, and PC, resulting in nine codes/strategies.

Results

Figure 2 presents an overview of the strategies from the staged co-design process. These are described below.

Fig. 2 Overview of the findings from the three stages of the co-design process

Stage 1: Co-design of draft strategies

The rapid review screening process is outlined in Supplementary File 3. Forty-eight eligible papers were included, and eleven effective strategies were identified (Supplementary File 4). In collaboration with the Department of Education, we shortlisted six strategies for potential inclusion in the toolkit: using a program champion, audit and provide feedback, a recognition system, executive support, emailed reminders, and implementation planning. The most common reasons for excluding strategies were that they were too resource intensive with respect to personnel (e.g., coaching, consultation) and/or funding.

Stage 2: Co-design of the implementation toolkit

Sixteen educational staff agreed upon eight strategies in focus groups, which formed version 1.0 of the PAX Plus implementation toolkit, sent to intervention schools of the wider trial in July 2021. Staff contributions to the selection, format and delivery mode of strategies are described below (and summarised in Table 1).

(1) Strategies

Staff wanted five out of the six strategies from Stage 1 to be included in the toolkit. They did not perceive implementation planning to be useful but agreed it could be integrated into ‘executive support’. Three additional strategies were put forward in the FGDs. First, a peer learning network to exchange knowledge with staff from other NSW schools. Second, access to PAX Chats, ongoing online consultation sessions about specific program components administered by PAXIS. Third, a continuous progress monitoring system for quality improvement.

(2) Format of delivery

Formatting recommendations were based on educators’ reports of what had worked to support other programs in practice. Educators recommended using ‘tootle boards for teachers’ and/or ‘certificates of achievement’ as recognition systems (Table 1). The ‘tootle board for teachers’ is a blank board in the staff room where staff post affirmations (‘tootle’ notes) acknowledging peers’ PAX GBG implementation efforts. It was suggested that schools use existing administrative data, such as behavioural incident referrals, as evidence of the impact of PAX GBG on students and provide this feedback to teachers as a motivational tool. Some staff also felt it would be useful to have ‘fidelity’ self-report checklists and brief ‘process’ surveys for continuous progress monitoring. It was also raised that the ‘school champion’ should be self-selecting (rather than chosen by the school leadership) and ideally positioned within the leadership team.

(3) Timing

Educational staff explained they had limited time outside of class to support PAX GBG and recommended implementation activities be paced more slowly than suggested in Stage 1. They recommended executive support meetings and e-newsletters (emailed reminders) be provided monthly, rather than fortnightly as suggested by the Department of Education. They also suggested program champions organise fortnightly implementation team meetings with school personnel, rather than weekly meetings as suggested by the researchers.

Stage 3: Acceptability testing (prototyping)

Nineteen educational staff agreed to be interviewed at the 6-month follow-up; however, three withdrew prior to the interview due to time constraints. Table 2 summarises the acceptability of the original PAX Plus implementation toolkit to staff, based on majority opinion. It was consistently reported that program champions, audit and provide feedback, and executive support were acceptable and could continue to be implemented as originally designed. Educational staff liked some components of the training and recognition system strategies, such as learning from experts and positive reinforcement, but thought that the format and timing could be improved. The PAX Chats, peer learning network, e-newsletter and continuous progress monitoring strategies were underutilised by all educational staff, with potential to be improved through reformatting or strategy promotion.

Table 2 Educational staff feedback (N = 16) on the acceptability of original toolkit

Refinements

Below, we explain how staff feedback informed the reformatting of five underutilised strategies (Table 3) and the addition of a new strategy (an all-in-one list of resources) in the second iteration of the PAX Plus implementation toolkit (version 2.0).

Table 3 Comparison of strategies from original and refined toolkits highlighting changes made after acceptability testing

Recognition System

To improve the acceptability of the recognition system, one interviewee suggested using PAX-branded t-shirts to reward teachers. Other teachers agreed that PAX branded t-shirts would be a fun incentive to motivate adoption. The refined toolkit (version 2.0) encouraged principals to select the recognition system most appropriate for their school, with the option of using PAX-branded t-shirts in addition to ‘tootle boards for teachers’ and/or ‘certificates of achievement’ from the original toolkit (version 1.0).

Remind School Personnel

Teachers and the research team decided to reformat the e-newsletter into a physical flip-through desk calendar to improve the acceptability of the reminder system. Teachers felt the calendar would be a helpful visual reminder to deliver PAX GBG daily.

Continuous Progress Monitoring

Program champions suggested reformatting the continuous progress monitoring system from using self-report methods to walk-through observations. This involves the champion observing the teacher’s PAX GBG program implementation, assessing fidelity against a checklist, and providing feedback for improvement after class. Participants recommended walk-through observations be conducted ‘twice a term for 15 min each’.

Peer Learning Network

Educational staff wanted to use a peer learning network and were mostly unaware that an existing network was available. To raise awareness, information about accessing the peer learning network was provided in the desk calendar and ‘all-in-one’ list of resources.

Promoting PAX Chats/Refresher Training

Educational staff asked for refresher videos to revise lessons from the initial workshop, which was considered too much information to digest in one day. Teachers wanted these videos to demonstrate program delivery in an Australian context. Leadership staff suggested it would be efficient to integrate refresher training into pre-recorded PAX Chats and share these with teachers during monthly staff meetings. Accordingly, PAX Chats were reformatted into short refresher training sessions.

All-in-one List of Resources

Teachers wanted streamlined resources to support PAX GBG implementation. Specifically, several teachers wanted to receive a one-page email with a list of hyperlinked resources to save them the time of trawling through websites to find support materials.

Final PAX Plus Implementation Toolkit

Version 2.0 of the PAX Plus implementation toolkit was distributed to principals of intervention schools on 28 February 2022. School recruitment and baseline data collection were completed in April 2022. The leadership team were responsible for selecting the most relevant strategies for supporting PAX GBG in their school and then sharing the resources from the toolkit with teachers and support staff.

Of the nine strategies offered in the final toolkit (Table 3), six targeted teachers’ motivation and self-efficacy with the PAX GBG. These included desk calendar reminders, promotion of the peer learning network, PAX-branded t-shirts for recognition, PAX Chats/refresher training, audit and provide feedback, and an all-in-one list of resources. Three school-level strategies were provided to support school leaders to resolve implementation challenges (through monthly meetings), use program champions, and monitor progress using walk-through observations.

Discussion

Across three stages, a multicomponent toolkit (PAX Plus) was co-designed with educational staff, researchers, and Department of Education wellbeing staff to enhance the implementation of the PAX GBG program in NSW government primary schools. The PAX Plus implementation toolkit was grounded in implementation theory and scientific evidence to enhance potential effectiveness. At the completion of the iterative co-design process, PAX Plus was found to be acceptable to educational staff.

We identified several benefits of co-design that have also been found in other trials of school-based programs. Involving educational staff from conception aligned PAX Plus to the needs of schools (Ponsford et al., 2021). Educational staff provided unique insights into the delivery format and timing of strategies that had worked in practice. Collaborating with the Department of Education wellbeing leadership team during the preparation phase also highlighted which evidence-based strategies were likely to be infeasible because of time and human resource considerations. This minimised wasted research effort by ensuring resources were not allocated to the development and evaluation of strategies that would not be feasible, and thus not effective, in practice (Milton et al., 2022).

We found a slight disconnect between some of the strategies educators selected during the preparation phase and the strategies they used and preferred during implementation. During preparation, educational staff consistently asked for a peer learning network and access to PAX Chats; however, after six months of implementation, few had utilised these strategies. From the interviews, it was clear that staff lacked awareness of the peer learning network and PAX Chats and felt they had no time to utilise the strategies. The outcomes of the co-design phase could have been improved by focusing not only on what strategies educational staff wanted, but also on how they wanted to be informed and reminded of these strategies. Our findings also demonstrate that strategy acceptability is dynamic, changing from preparation to implementation phases (Proctor et al., 2009). For example, whilst staff found a continuous monitoring system acceptable during preparation, it is possible that time constraints made this strategy unacceptable and underutilised during implementation. Our findings also suggest that acceptability alone does not predict strategy utilisation; rather, there are other important factors such as appropriateness and feasibility (Weiner et al., 2017).

Executive support, which involved the researcher acting as a consultant and providing monthly check-ins to the leadership team, was found to be acceptable to educational staff. There was some overlap between our executive support strategy and the coaching/consultation strategy found to improve implementation of the PAX GBG and other school-based mental health programs (Becker et al., 2013; Schneider et al., 2016; Stormont et al., 2015). It is considered best practice for consultants to be school psychologists, behavioural consultants, or other educational staff, as they have the lived experience and relatability to support program implementation (Erchul & Sheridan, 2014; Schneider et al., 2016; Streimann et al., 2017). However, during implementation trials like ours, it is much more common and feasible for consultation to be delivered by researchers (Stormont et al., 2015). We need to consider how monthly check-ins can be transferred to educational staff for program sustainability, without adding to the administrative burden facing Australian educators (Thomson & Hillman, 2019).

Program champions and monitor and provide feedback strategies were highly acceptable to educators, consistent with other school-based trials (Dijkman et al., 2017; Nadeem et al., 2018). Educational staff wanted school champions to be self-selecting program advocates, as seen elsewhere (Nadeem et al., 2018). Educational staff found it motivating to receive feedback about students’ behavioural outcomes, consistent with evidence that providing feedback has a moderate to large effect on program fidelity (Merle et al., 2022; Solomon et al., 2012). Our insight into the positive impact of monitoring and feedback on acceptability extends prior descriptions of this strategy to support the PAX GBG in Estonia (Streimann et al., 2020) and the Netherlands (Breeman et al., 2016).

We found Australian teachers wanted refresher training to reinforce the skills and knowledge acquired during initial training, as seen in the UK (Coombes et al., 2016), and for this training to be culturally adapted to include localised examples of implementation. Similar locally relevant adaptations have been used in Estonia (Streimann et al., 2017) and Canada (Wu et al., 2019) to improve the relevance of the US-developed training.

Strengths and Limitations

A core strength of PAX Plus is that it was co-designed from conception with a range of senior and classroom-level educators and wellbeing staff. Taking a co-designed approach improves the likelihood of PAX Plus being effective (Milton et al., 2022), resulting in better outcomes for children and adolescents (Proctor et al., 2009).

Our multimethod qualitative approach was a strength. Focus group discussions allowed participants to build upon each other’s ideas of strategies that could improve PAX GBG program adoption (Stewart et al., 2007). This had a ‘synergetic effect’, resulting in co-design ideas that might not have been produced through one-on-one interviews (Stewart et al., 2007). During acceptability testing, it was appropriate to use semi-structured interviews to gain in-depth insights into how and why PAX Plus could be improved. In reporting, our findings were justified with data, a strength that distinguished our study from other co-design studies (Bruns et al., 2016; Kern et al., 2015). Lastly, our reporting of the co-design process was relatively in-depth compared to other studies, improving transparency and replicability (Becker et al., 2013; Coles et al., 2015).

The study had several limitations. First, it would have been beneficial to involve educators in Stage 1 of the co-design, rather than only working with the Department of Education. Greater representation of stakeholders from all levels of the educational hierarchy would have allowed for better power distribution between those expected to do the work and those with strategic responsibility for it. Second, our co-design process disproportionately involved regional schools, as regional schools were less affected by COVID-19 restrictions (e.g., school closures) during data collection than urban schools. Certain strategies could be better suited to the characteristics of regional schools, such as truncating the roles of principal and champion to accommodate smaller school sizes. Our upcoming realist evaluation will consider which strategies worked to support PAX GBG implementation in different school settings, and why.

Conclusion

We co-designed a novel multicomponent implementation toolkit (PAX Plus) with educational staff and Department of Education leaders that shows promise for enhancing the implementation of school-based mental health prevention programs. The effects of PAX Plus on implementation and effectiveness outcomes are currently being tested using a cluster randomised hybrid effectiveness-implementation trial. This study demonstrates how involving educational stakeholders in co-design improves the acceptability and feasibility of implementation strategies, although acceptability is not always sufficient for adoption. The clear description of our co-design process provides a roadmap for other researchers and practitioners to co-design strategies with educational staff to enhance school-based program implementation.