EIDM training as a key intervention among researchers to enhance research uptake and policy engagement: an evaluation study

The Evidence-Informed Decision-Making (EIDM) field has evolved rapidly in the past decade. This progress highlights a need for capacity enhancement in EIDM amongst evidence producers and evidence users. Through the Enhance DELTAS programme, led by the African Institute for Development Policy (AFIDEP), the project provided research uptake and policy engagement training, mentorship and webinars to awardees of the Developing Excellence in Leadership, Training and Science (DELTAS) Africa initiative, led by the African Academy of Sciences (AAS). Two workshops were offered: one for individual early career DELTAS researchers on policy engagement and evidence uptake, referred to as ENHD101, and one for research leaders to enhance institutional capacity for policy engagement and evidence uptake (ENHD102). Overall, the programme attracted 31 early career researchers and 20 research leaders over the eight months of training, mentorship and webinars. Following the programme, the early career researchers understood the importance of EIDM for better health policies and programmes. In addition, they appreciated the complexities of policymaking processes as they developed policy engagement strategies for their research. The implementation of EIDM knowledge was reflected during the mentorship of research fellows, with policy briefs as the end product. Notably, research leaders appreciated their role in strengthening the capacity for EIDM in decision-making spaces. Although none of the research leaders took part in strengthening the capacity for EIDM during the programme itself, the team anticipated improvement in the long run. In addition, the research leaders developed and implemented institutional strategies for policy engagement and research uptake through the use of social media to influence policymakers. In conclusion, the project supported the capacity building of African researchers in EIDM.
It was evident that enhancing knowledge and skills in EIDM through an integrated approach comprising training, mentorship, and webinars demonstrated enhanced capacity for policy engagement and evidence uptake.


Amendments from Version 1
We made revisions to the attached document based on feedback as follows:
1. Revision of grammatical errors to make the general reading of the paper easier.
2. Clarification of the methods section by explaining why and how the different evaluations were used (e.g. needs assessment, pre-post, end-line).
3. We revised the discussion section to align with the Findings and Results section.
4. The conclusion has been recast so that it highlights lessons from this evaluation.
5. We made all other revisions as suggested by the two reviewers.
Any further responses from the reviewers can be found at the end of the article

Background
Evidence has an important role to play in improving policy, programme, and practice decisions that ultimately improve development effectiveness 1,2. Evidence-informed decision-making (EIDM) is an evolving discipline that helps translate the best available evidence into context-appropriate recommendations aligned with the priorities of decision-makers. EIDM is defined as a process where high-quality evidence from research, local data, and professional experience is synthesised, disseminated, and applied to decision-making in policy and practice 3-5.
The EIDM process is complex, as it has to compete with many other factors, including the interests of policymakers, politics, value systems, individual and institutional capacities, and financial constraints 6-8. Weak individual and institutional capacity for evidence use in policy and programme decisions has attracted much attention in the last decade as one of the many barriers to evidence use 2,8,9. A substantial body of research has investigated the need for, and efforts towards, strengthening institutional capacity to enhance the use of evidence in decision-making 10,11. These studies showed a need for a better understanding of efforts to strengthen capacity for evidence use, as well as of context-specific lessons and insights in building institutional capacity for evidence use. The technical support was designed to address the gaps in knowledge and skills for knowledge translation and policy engagement among many DELTAS fellows, identified through the Learning Research Programme (LRP) of the DELTAS initiative 12. The LRP report highlighted institutional weaknesses in promoting knowledge translation and policy engagement practices within DELTAS partner institutions. These institutional weaknesses in evidence use are also mirrored in the unpublished PhD research of a DELTAS-funded PhD researcher and AFIDEP staff member, who has documented similar weaknesses in her doctoral research (research in progress).
The Enhance DELTAS team worked with the first DELTAS Africa programmes to enhance the capacity of individuals, support DELTAS institutions in creating enabling environments for policy engagement and research uptake, and facilitate interaction between researchers and policymakers. The DELTAS Africa programme is an initiative implemented by the AAS Alliance for Accelerating Excellence in Science in Africa with the support of the Wellcome Trust and the UK's Foreign, Commonwealth and Development Office (FCDO). Phase I of the programme, which ended in May 2021, was designed to train world-class researchers and research leaders in health sciences in Africa and to strengthen the environments in which they operate. The first DELTAS Africa programme supported 11 collaborative teams, spanning 54 lead and partner institutions. DELTAS Africa Phase II started in early 2021 and introduced a suite of new strategies designed to address gaps identified during Phase I, including balancing equity and excellence within the constitution of the various consortia 13. The integrated learning programme included five three-hour virtual workshop sessions, online self-learning materials including videos, a six-month mentorship phase, and interactive EIDM sessions at individual and institutional levels. These are described in more detail under the Methods section.

Research question
1. Can a multi-faceted intervention that combines training and mentoring improve researchers' knowledge and practice of EIDM?

Methods
The programme implemented a holistic approach at the individual level, intended to strengthen individual capacity and existing institutional systems, structures and processes to enable sustained EIDM. The multi-faceted intervention, including training, mentorship and webinars, adopted a virtual format. The following integrated approach was used:

a) Consultation with AAS to identify potential trainees: In August 2020, we held virtual consultations with AAS to introduce the Enhance DELTAS programme and assess their interest in co-facilitating. The AAS team supported publicising the programme across the DELTAS family.

b) Training for senior researchers and consortium leaders (ENHD102): this training, to enhance institutional capacity for policy engagement and evidence uptake, was designed for research leaders or senior researchers responsible for leading policy engagement and research uptake. The components of ENHD102 were: introduction to EIDM, developing institutional strategies for policy engagement and research uptake, generating demand for evidence uptake, and creating an enabling institutional culture for research uptake.

c) Mentorship programme: As part of the learning process, participants were invited to a virtual mentorship programme to help consolidate the learning, build depth, and, most importantly, help them complete their policy products. The mentorship was provided over eight months to monitor progress in implementing their policy engagement tasks. This approach has been used in several of our programmes with good success rates 9. After the training, fellows were assigned a task to complete; for example, developing a policy brief based on their research, doing a stakeholder mapping and power-interest matrix, or developing a policy engagement strategy.
Out of the participants trained in ENHD101 and ENHD102, six participants from the ENHD101 training for early career researchers (four females and two males) expressed interest in being mentored to develop evidence products. Those who chose not to take part were unsure whether they needed the mentorship yet, as they were only starting their research projects. Among those who accepted mentorship, their research topics ranged from antimicrobial resistance, sexual and reproductive health, and strengthening health research to maternal healthcare service utilisation, among others. The fellows were assigned to mentors who supported them up to the end of the programme. During their first meeting, each mentor-mentee pair was asked to complete an agreement outlining the goals and expectations, and a plan for completing at least one evidence product for sharing with relevant policymakers. Of the six mentees, four developed policy briefs as their evidence product of choice, while two were unable to complete the mentorship.

End-line evaluation: eight months after the project, an end-line evaluation was conducted among the trained team to understand the effectiveness of individual components and of the whole programme in achieving the intended outcomes. The self-administered online questionnaire contained both qualitative and quantitative questions and took an average of 15 minutes to complete. The questionnaire was sent by email for participants to complete at their convenience. It was completed online using the Microsoft Teams questionnaire tools; respondents had one month to send back their responses, after which the survey was closed. Common intended outcomes for ENHD101 and ENHD102 included developing a policy engagement strategy for their research, and developing and implementing institutional strategies for policy engagement and research uptake, respectively. For the end-line evaluation, ENHD101 had 15 (48%) respondents while ENHD102 had three (15%) respondents.

Table 1. Trainee participants and their DELTAS institutions (DELTAS programme affiliation).

The training covered the following topics:

ENHD 101
- Identifying key stakeholders (stakeholder mapping) and the stakeholder power-interest matrix; effective engagement of policymakers; policy engagement toolkits.
- "So What?" tools: embedding monitoring, evaluation and learning in policy engagement and research uptake strategies; practical sessions including role play.
- Knowledge translation and packaging: rapid synthesis of evidence; translating and packaging evidence in suitable formats for policymakers and non-academic audiences; communications plans, strategic communication tools, and collaborating with knowledge translators and the media.
- Involving policymakers in research advisory committees; participating in policy advisory committees.
- Developing a monitoring and evaluation framework: theory of change, and outputs and outcomes to measure.
- Generating demand for evidence uptake; lobbying for research and knowledge translation funding.
- Creating an enabling institutional culture for research uptake: EIDM champions, and incentives and motivations for research uptake.
A copy of the questions can be found in the Underlying/Extended data 14,15.

Results
The findings are presented based on the various evaluations conducted, as highlighted in the methodology section.

Pre and post-evaluation
The pre- and post-training test was administered and analysed using the Survey Monkey software.

a. Technical skills developed during the training
Early career researchers. The ENHD101 pre- and post-course survey results showed that the level of knowledge on EIDM, covering the various domains listed in Table 3, was 66% before training compared to 83% at the end of the training.
In addition to the pre- and post-survey assessment, we also evaluated the overall quality of the training. All the participants rated the quality of the training as very good (30%) or excellent (70%) on a scale of 1 to 5, where 1 is poor (lowest), 2 fair, 3 good, 4 very good, and 5 excellent (highest). Overall, the fellows' understanding of technical aspects improved by the end of the training. For example, knowledge of a well-defined policy question improved by 2.5%, understanding of the streams necessary for the window of opportunity for policy influence increased from 14.7% to 58%, and knowledge of the steps in applying evidence synthesis concepts increased from 51% to 87%.
Senior researchers. Similarly, the ENHD102 pre- and post-course survey results showed that the level of knowledge on EIDM across the domains listed in Table 3 improved by the end of the training, from 56% to 80%. For example, knowledge of the definition of EIDM and of the stages of the policymaking process improved from 57% to 63% and from 57% to 75%, respectively. Understanding of Kingdon's three streams necessary for the window of opportunity for policy influence more than doubled, from about 30% to 62%. Largely, the level of knowledge increased, and the participants were generally interested in follow-up engagement to support their targeted study areas of interest.
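The pre/post comparisons above reduce to simple percentage-point gains. As a minimal illustrative sketch (using only the aggregate figures quoted in the text, not the raw survey data; the domain labels are paraphrased, not the survey's exact wording), the gains could be tabulated as follows:

```python
# Illustrative only: summarising pre/post knowledge scores as
# percentage-point gains. Figures are those quoted in the text;
# domain labels are paraphrased, not the survey's exact items.
scores = {
    "ENHD101 overall": (66, 83),
    "ENHD102 overall": (56, 80),
    "EIDM definition (ENHD102)": (57, 63),
    "Policymaking stages (ENHD102)": (57, 75),
}

def gain(pre: float, post: float) -> float:
    """Percentage-point change from pre- to post-training."""
    return post - pre

for domain, (pre, post) in scores.items():
    print(f"{domain}: {pre}% -> {post}% (gain {gain(pre, post):+g} points)")
```

Note that these are percentage-point differences in aggregate scores, not relative improvements per participant, which is why the "more than doubled" phrasing (30% to 62%) describes a different comparison.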
The surveys also sought to gauge the participants' satisfaction with the training workshop's overall design. There was general consensus: all participants indicated that the training was effective and met their expectations. All the respondents rated the quality of the training as "very good" or "excellent". More results on the training evaluation are included in Table 3.

b. The training quality
In addition to the technical knowledge obtained from the training, participants were asked to assess the training based on the aspects they liked most. The following were some of the responses: "The ease with which the facilitators delivered the training; they are knowledge-packed and interactive, which allowed participants to express themselves freely." "I have learned that it is not always about focusing on publishing but remembering the policy implication. So, for every research, and every protocol that I have developed I will be thinking about what is the policy implication for this. How

"We have worked with researchers for about 5 years. But with COVID it disrupted a lot of things here. I want to say it is possible to improve MNCH with interventions that are evidence-based. We've used evidence to move a lot of processes forward for example communication regarding maternal neonatal and child mortality. The communication of evidence has helped to pull a lot of people to try to see how they can use males in improving access to family planning and child spacing" (Policy makers, ENHD101).
Medium-term impact following project end-line survey
Eight months after the training, an end-line survey was conducted to assess participants' utilisation of the knowledge and skills they obtained during the training. Respondents provided positive feedback on how they had used the skills.

ENHD101:
The participants responded positively with examples of how the skills were used.

Participant 1: I applied the knowledge to write a blog on the potential benefits of my research work (Ph.D student, ENHD101). Participant 2: I'm developing my research protocol with a view to influencing policy using tips from the training (Ph.D student, ENHD101). Participant 3: There was a session during the training that covered how to write for different audiences. Used the skill to write a blog article that would be easily understood by a wide audience, both lay and expert (Postdoctoral student, ENHD101). Participant 4: Yes, the training enabled me to write a better literature review chapter for my PhD proposal. It enabled me to think more critically about my literature search. The training also further re-echoed to me that for any grant proposal I am to write, I have to think about the public health impact of the proposed work and how this will be achieved. The training showed me that for any work, it is important to do stakeholder analysis and take the highlighted stakeholders throughout the research project journey, right from the conception of the idea, to implementation, and this will make writing policy briefs easier. Thank you so much for the training. I am grateful (Ph.D student, ENHD101).
Among the early career researchers, three beneficiaries had written blogs, as illustrated in Figure 1.
ENHD102: Similar to the ENHD101 group, the participants responded positively with examples of how they have used the skills.

Participant 1: I'm better able to translate and compile important information into smaller snippets to share on our social media pages (Program manager, ENHD102). Participant 2: Yes. I was able to identify the stakeholders in relation to their possible influence on the objective of my policy engagement (advocacy for improved mental health service in Oyo state) (Program manager, ENHD102). Participant 3: I have participated in writing a press release to share the result of Sars-Cov 2 ARN sequenced in our Lab in Mali. This brought in the Malian Prime Minister and the Minister of Health to make an official visit to our facilities. Also, our Center was contacted before any communication from the government on the evolution of COVID-19 in Mali (Research leader, ENHD102).
In addition, among the senior researchers, one beneficiary each reported having written a blog, a media brief, and a policy engagement plan following the training, as illustrated in Figure 2. Two senior researchers mentioned that they used social media to influence policy.

Priority area for future training
Further, when participants were asked to make a recommendation for future training, the early career and senior researchers in unison recommended training on accessing, appraising and synthesising research, developing a media brief, and giving an elevator pitch, as shown in Figure 3 and Figure 4 respectively.

Discussion
Interest in EIDM has grown in the current decade, and with it the need for capacity enhancement at both individual and institutional levels amongst evidence producers and evidence users. EIDM is a deliberative process that guides decision-making using the best research evidence 16. Since the 1990s, research evidence has traditionally played an integral part in decision-making 17. Despite knowledge of EIDM, healthcare organisations worldwide have considerable difficulty in translating research evidence into practice 18,19. Previous interventions have been aimed at strengthening the capacity for evidence use to inform the decision-making process, and the results showed that these interventions were successful in building the capacity of individuals to access, understand, and use evidence and data 8. However, there are no frameworks to measure the effect of capacity building across the various levels of the policy-making cycle.
During the intervention phase, the facilitators and researchers acknowledged the need for leadership skills in engaging stakeholders, to enable working better as a team in multi-cultural and multi-sectoral contexts. This is in line with emerging evidence showing that evidence use in decision-making is enabled by strong leadership and 'soft' persuasion skills 22-25.
Leadership can use their power to promote and support the EIDM implementation process. Leadership support is considered an important facilitator that can act as the champion, initiator, and role model of change interventions to enhance the implementation of an EIDM culture at both individual and institutional levels 26. Stakeholder engagement was identified by the target audience as difficult, yet it was also indicated as the most important aspect of EIDM. It is evident that the decision-making process is complex, with difficulty in engaging researchers and policy-makers who have never worked together to hold dialogue 22. However, it is crucial to involve stakeholders from the beginning, and throughout the entire process, to align priorities, foster a common vision toward decision-making, and facilitate the uptake of synthesised evidence 25,27,28.
Following the intervention, the mentorship phase revealed a demand for skills in writing for non-academic audiences and writing policy briefs, and the need to embed these skills in research training. In addition, the early career researchers and research leaders agreed that briefs need to be written in clear and jargon-free language, because many policy-makers are generalists and do not necessarily come from specific research areas. Therefore, as these skills are included in the EIDM process, there is a need to instil writing skills at the start of research, not at the end.
The flexibility of the Enhance DELTAS programme, in being able to adapt to its target audience's requirements, was a key strength of the project. Among both the trained early career researchers and the research leaders, the project identified a critical gap in evidence synthesis and knowledge management capacities, which affected the ability to respond to project objectives. Knowledge gaps recommended for future training include accessing, appraising and synthesising research, developing a media brief, and giving an elevator pitch. These gaps are anticipated to be addressed in the next phase of this project, if it is successfully renewed.
The programme experienced challenges that hindered the intervention comprising training, mentorship, and webinars to increase or enhance the use of evidence in decision-making. One challenge resulted from the ongoing COVID-19 pandemic: the ENHD101 training adopted a virtual format as opposed to the originally planned in-person training. This posed challenges in maintaining interactive participation and keeping the fellows engaged throughout the training. Another challenge was poor internet access and connectivity; as such, there was inconsistency in the number of participants who remained online during the sessions. Some participants also complained about high internet costs within their home countries, hampering them from being fully involved in the training. Time constraints were reported to be a challenge: feedback from participants indicated that the time to complete exercises was not enough, and more time needed to be allocated, especially for the practical sessions. Similarly, some participants were unfamiliar with Microsoft Teams as the training platform, especially in accessing the breakout rooms as well as training material. This resulted in the project team transferring the training resources to Google Drive for ease of access. Along the way, the training was shifted to the Zoom platform, as most of the participants were more familiar with it.
The low participation rate was one of the limitations of this evaluation. Participant dropout was partly attributable to the virtual modality that became the norm during the global COVID-19 pandemic.

Conclusion
Generally, following the intervention, the level of knowledge increased, and some of the participants were interested in follow-up mentorship to support their study areas of interest concerning research uptake and policy engagement. The participants developed respective tools and demonstrated various skills for engaging and communicating with policymakers, such as blogs and policy briefs. The participants also suggested potential areas that they wished to be covered in more detail in future training, including social media engagement, systematic review and meta-analysis, and monitoring and evaluation of policies. Additionally, it was recommended to consider integrating such courses within the university curriculum to train the fellows at an earlier stage.

Bey-Marrie Schmidt
University of the Western Cape, Cape Town, South Africa

This article describes the results of a multi-faceted intervention that combines training and mentoring to improve researchers' knowledge and practice of EIDM. This study consists of consultations with trainees, the implementation and evaluation of training and mentorship aspects, and follow-up webinars. The various aspects are well described, a pre- and post-questionnaire was administered right before and after the training, and an end-line evaluation was conducted. The results are detailed and provide evidence of participants' experiences. The discussion is appropriate and the authors detail gaps in research and practice; however, it would have been more beneficial if they had suggested potential ways of addressing these gaps. For example, the authors state "However, it is crucial to involve stakeholders from the beginning, and throughout the entire process, to align priorities and foster a common vision toward decision-making and facilitate the uptake of synthesized evidence". They could have perhaps noted the role of integrated knowledge translation in this type of study. Overall, it is an interesting article and offers valuable findings.

Is the work clearly and accurately presented and does it cite the current literature? Yes
Is the study design appropriate and is the work technically sound? Yes

Are sufficient details of methods and analysis provided to allow replication by others? Yes
If applicable, is the statistical analysis and its interpretation appropriate? Yes

Are all the source data underlying the results available to ensure full reproducibility? Yes

Are the conclusions drawn adequately supported by the results? Yes
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Evidence-informed decision-making, evidence synthesis, qualitative research, knowledge translation, community engagement.
I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

The major part of the article consists of the description of the organization and implementation of the training. It contains a lot of redundancies and needs to be streamlined. The mass of abbreviations complicates the comprehensibility: why is it important for readers to know (and remember) all of them? This is not at all necessary and I strongly recommend the authors check to what extent these are needed. For example, why is the abbreviation ENHD101 used to identify the first intervention? For readers to understand that there were two different trainings, the terms "intervention 1" and "intervention 2" or the like would be sufficient.
Overall, the article requires a clearer structure. In particular, the description of the training and the context in which it took place belong in the methods section of the article, but make up a large part of the theoretical section of the paper. This section in turn only touches briefly, but accurately, on EIDM; as this is at the heart of this work, it should be given more space. This is a well-established topic of research in different disciplines. An example of an interdisciplinary network dealing with this topic can be found here: https://transformingevidence.org/projects/transforming-evidence-network. What would be far more interesting for international readers (and is missing in the introduction/theory section) is an explanation of the African context and the health sector, and why EIDM is important at all. This can also provide the framework for a much more elaborate discussion of the results (see below).
A lot of the decisions made are not transparent and must be better substantiated. For example, I wasn't sure which stakeholders took part in the training (and why) until the end of the paper: researchers, policy makers, or who? One reason for that was the repeatedly mentioned objective of supporting "research uptake and policy engagement" of "evidence producers and evidence users", but isn't the former part of being a researcher anyway, and why are the participants evidence users? Other aspects that are not sufficiently substantiated are why young and senior researchers were invited to participate in two separate trainings, and why the two different trainings are investigated in the same paper but not at all compared. Accordingly, Fig. 1/2 and Fig. 3/4 should be considered for merging (and statistical analysis carried out). Even though the research design is described in the methods section, it needs to be straightened (redundancies). The difference between the pre- and post-course evaluation and the end-line evaluation is not easy to follow. One reason for this is that the actual survey is not described in detail: What did the survey actually assess? Were there parallel versions available? And why are there differences in the platforms used to carry out the surveys? And what do the authors mean by technical skills or knowledge? Do they mean research capabilities or competencies (which are necessary for well-reflected decision-making) or really technical skills? From what is reported in Table 3, one comes to the conclusion that self-ratings about the ability to carry out certain EIDM steps were assessed, which are not at all an operationalization of skills or knowledge (for which it can be identified what the correct solution is). This means that a more profound description of the research instruments used is necessary, and if the authors in fact assessed self-ratings, they need to be more precise in their terminology. For example (if I am right), they investigated the self-rated
ability to make evidence-informed decisions as an outcome of a training… Here, another theoretical aspect is missing in the theoretical section of this article: which form of learning and learning gains are to be expected from such a training? For example, self-ratings of abilities are closely related to concepts of self-efficacy beliefs (Bandura, 1997) that are (as an indicator of motivation) reportedly of relevance to EIDM. Connected to that, the research question should be revised, too.
The results section contains a lot of interesting results, particularly the quotes on page 8. However, these need to be framed accordingly, both in the introductory part and in the discussion of the paper (see above). In its current form, the article remains at the level of an evaluation report and, as mentioned, does not fully develop the potential to provide insights into how training for EIDM can motivate and impart strategies. And as a researcher in the field of EIDM (even though in another discipline), this is the truly interesting part of this paper, as the research field in general struggles to develop trainings that are able to convince and empower stakeholders to carry out EIDM. If possible, it would be interesting, too, to learn not only about the content of the training that was liked most (which is not associated with knowledge gains, but with the motivation to get involved), but what was missing. In principle, this is echoed in "3. Priority area for future training", but that is not enough, especially as contradictions arise here that need to be addressed. Why, for example, do emerging and senior researchers recommend "training on accessing, appraising and synthesising research"? Haven't they already been educated in that regard, or how did they become researchers? For example, why did one participant report "Yes, the training enabled me to write a better literature review chapter for my PhD proposal", and how was this connected to the objective of the training? At this point at the latest, the suspicion arises that the training was more like a tutoring session for researchers, or was the aspiration more than that? Another contradiction not addressed is the result that senior researchers rated their EIDM knowledge

Kirchuffs Atengble
PACKS Africa, Accra, Ghana

Firstly, the entire article needs to be revised to improve its ease of reading, as grammatical constructs (and obscure connections in sentences) make current reading a bit difficult. Consistency and coherence in idea communication should be critically looked at; e.g. "Through the email circulation from AAS, the interested DELTAS programme expressed interest in our two training modules, ENHD 101 and ENHD 102 which are described fully below". Do the authors mean the programme itself, or that interested programme partners showed interest?
Quite unconventionally also, the article is written in the active voice, using first-person plural pronouns. This is, however, left to the discretion of the journal editors.
The methods described suggest that different kinds of evaluations (needs assessment, pre-post, endline) were being reported. These should be clearly differentiated in the findings/results section and better articulated.
The quality of the (learning/intervention) evaluations could have been improved if demographic data on learners had been used to evaluate their respective degrees of learning, producing insights to support learning among particular demographics.
It is unclear whether the discussion portion of this article was truly dedicated to the findings of the study or is a random review of the literature. These should have a common focus, linked to the different findings/results reported.
The conclusion will need to be adequately revised to coalesce lessons from the different findings of the evaluation. Potentially, the authors could explore implications for programme management, intervention design, among others. This is, after all, an evaluation.

Is the work clearly and accurately presented and does it cite the current literature? Yes
Is the study design appropriate and is the work technically sound?

Are sufficient details of methods and analysis provided to allow replication by others? Partly
If applicable, is the statistical analysis and its interpretation appropriate? Not applicable
Are all the source data underlying the results available to ensure full reproducibility? Partly

Are the conclusions drawn adequately supported by the results? Partly
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Research methods, EIDM, knowledge management, public policy, decision making, international development, information systems, organisation development

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard; however, I have significant reservations, as outlined above.
This paper draws on the Enhance DELTAS programme, led by the African Institute for Development Policy (AFIDEP), which strengthens individual capacity for evidence use through training, mentorship, and webinars as key interventions among researchers and policymakers. These interventions to enhance research uptake and policy engagement were provided to awardees of the Developing Excellence in Leadership, Training and Science (DELTAS) Africa initiative, led by the African Academy of Sciences (AAS).
creation of formal and informal interaction between researchers, policymakers and other decision-makers; and how the team contributed to the use of evidence for decision-making. The data were collected once, on the first day of training, to assess participants' level of understanding of the technical components. This was administered using the SurveyMonkey online platform. Immediately after the training, the participants were encouraged to complete a post-course questionnaire to assess the change in knowledge after the training; it also sought information on the quality of the training, future topics, and potential areas of improvement for the training programme. As with the pre-training evaluation, this was administered at once for all participants using the SurveyMonkey online platform. The data were based on training materials developed by AFIDEP. For ENHD101, out of 31 participants who joined the training, 27 (87%) completed the pre-post course questionnaire after the training. On the other hand, for ENHD102, out of 20 participants who attended the training, 8 (40%) completed the pre-post survey.
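The reported completion rates can be reproduced with a short calculation. This is only an illustrative sketch: the enrolment and completion figures are those stated in the paragraph above, and the function name is ours, not from the study.

```python
def completion_rate(completed: int, enrolled: int) -> int:
    """Survey completion rate as a whole-number percentage."""
    return round(100 * completed / enrolled)

# ENHD101: 27 of 31 participants completed the pre-post questionnaire.
enhd101_rate = completion_rate(27, 31)  # 87 (%)

# ENHD102: 8 of 20 participants completed the pre-post survey.
enhd102_rate = completion_rate(8, 20)   # 40 (%)
```

The gap between the two rates (87% vs. 40%) is one reason the reviewers below question how representative the ENHD102 (research leaders) results are.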

6) Malaria Research Capacity Development in West and Central Africa (MACARD)
7) West African Centre for Cell Biology of Infectious Pathogens (WACCBIP)

"
The content of the presentations and the interactive session were all impactful and engaging.I also like your flexibility in order to achieve the aim of the training.""Learning about what policy is, stakeholder mapping, synthesis of data, writing policy briefs.It has been an amazing course.""I like the teaching (presentations).All the presenters are experts in the field and have a lot of knowledge in policymaking.""The discussion on evidence, how strong is the evidence?Reviews and the practical on writing the policy brief among others.The whole programme was wonderful."b) General reflections from the participants The team conducted a qualitative survey immediately after the training programme to understand the views of the participants following the training.The participants had the following to say for ENHD101 targeting early career researchers: "I am feeling more comfortable to develop a policy brief; I will pay more attention to stakeholder mapping.I will go back to do a paper on systematic reviews which I had initially dropped" (Ph.D student, ENHD101) "I have seen things from another angle, in pushing my work further regardless of low government interest.I am now more aware of the stakeholders I need to target" (Ph.D. student, ENHD101).

Figure 1. Policy engagement and research uptake following training among early career researchers.

Figure 2. Policy engagement and research uptake following training among senior researchers.

Figure 3.

Figure 4. Priority area for future training among senior career researchers.

Reviewer Report 08 April 2024

https://doi.org/10.21956/wellcomeopenres.22431.r74318

© 2024 Groß Ophoff J. This is an open access peer review report distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Jana Groß Ophoff
Institute for Secondary Education, University College of Teacher Education Vorarlberg, Feldkirch, Austria

Dear authors, dear editorial board,

Thank you very much for the opportunity to review the article "EIDM training as a key intervention among researchers to enhance research uptake and policy engagement: an evaluation study", which describes the effects of a training programme for young and senior researchers in Africa to improve their Evidence Informed Decision Making. I recommend a major revision.

○ Were there quantitative results for the 8-month follow-up? I believe the same survey (as the pre- and first post-tests) was administered.
○ Discussion section: Results should be integrated into the wider body of literature.
○ Overall: Edit for grammar, use of consistent verb tense, flow, and concision throughout the manuscript.

Is the work clearly and accurately presented and does it cite the current literature? Partly
Is the study design appropriate and is the work technically sound? Partly
Are sufficient details of methods and analysis provided to allow replication by others? No
If applicable, is the statistical analysis and its interpretation appropriate? Partly
Are all the source data underlying the results available to ensure full reproducibility? No
Are the conclusions drawn adequately supported by the results? No
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Quantitative Methodology, Statistical Analysis, Program Evaluation

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard; however, I have significant reservations, as outlined above.

Reviewer Report 10 March 2023

https://doi.org/10.21956/wellcomeopenres.19978.r54701

© 2023 Atengble K. This is an open access peer review report distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Table 2. ENHD 101 and 102 Training Content.

ENHD 101: Policy Engagement and Evidence Uptake for Early Career Researchers
Number of days: 4-day 3-hourly sessions
Topics:
1. Introduction to Principles of Evidence-Informed Decision-Making
2. Mapping the Health Policymaking Landscape. Understanding the policymaking landscapes; understanding the political, social, and economic contexts which influence policymaking; case studies of national, global and regional health policymaking processes.
3. Developing a Policy Engagement Strategy for Research Project.

School of Social Work, University of Windsor, Windsor, Ontario, Canada

I'm not sure what "There are a number of evidence that have investigated the need and…" means. It is not clear what the pre- and post-test scores mean or what they are scored out of; details for all measures must be provided. There are currently almost no methods details.
○ Results section: Quality of training: "generally, all the participants…" must be explained in greater detail. Provide frequencies for results.