Health Research Priority Setting in Uganda: A Qualitative Study Describing and Evaluating the Processes

Background: Over the years, several approaches to health research priority setting (HRPS) have been devised and applied in low-income countries for national-level research prioritization. However, there is often a disconnect between the evidence that health policymakers require for decision-making and the research that receives funding. Countries therefore need to evaluate their prioritization processes in order to support strategies that translate priority setting into policy practice. While health research priority setting is continuously carried out in Uganda, these processes are rarely reported in the scholarly literature and have not been evaluated. This study aimed to describe and evaluate HRPS in Uganda. Methods: This was a qualitative case study consisting of document review and key informant interviews with stakeholders who had either directly participated in or had specialized knowledge of HRPS in Uganda. Results: While Uganda has established and legitimized a national health research organization to set health research priorities and to coordinate and provide oversight for health research in the country, several institutions independently conduct their own health research priority setting. The evaluation revealed that while the priority setting processes were often based on systematic approaches and tools and tended to be evidence based, most of the prioritization processes lacked stakeholder involvement and implementation. Moreover, the priorities were not publicized, and none had mechanisms for appeals or revisions. In only one case were the priorities implemented. Conclusions: The availability of strong political commitment and a national priority setting institution is an opportunity for strengthening health research priority setting. There should be increased support for the institution to enable it to carry out its duties. The institution should invest not only in participatory, systematic health research priority setting and implementation but also in evaluation, in order to identify areas for improvement.


Introduction
In low-income countries (LIC), there is often a disconnect between the evidence that health policymakers require for decision-making and the research that receives funding. In many ways, this is due to a lack of mechanisms to protect researchers from the mandates of stakeholders such as politicians, interest groups, and foreign funders.
Compounding this problem, projects addressing health issues in LICs are often uncoordinated, resulting in duplication and fragmentation. Research from multiple institutions, therefore, rarely coalesces into focused, evidence-informed policy decisions.
Given the scarcity of funding for health research in LICs, it is critical that resources are optimally allocated (1,2). This can be done via priority setting (PS), a process that seeks to optimize resource allocation by providing stakeholders with a systematic approach to identifying all research possibilities, developing and evaluating criteria to assess those possibilities, and ultimately creating a ranked list that comprehensively captures their particular needs as priorities (3).
Over the years, several approaches have been devised for health research priority setting (HRPS). A systematic review covering the period 2001-2014 found that, across 165 studies, the most frequently used PS approach for health research was the Child Health and Nutrition Research Initiative (26%), followed by the Delphi approach (24%), James Lind Alliance (8%), Combined Approach Matrix (2%), and Essential National Health Research (ENHR) (< 1%) (4). Meanwhile, McGregor et al. (2014) showed that the majority of HRPS approaches were initiated by international organizations or collaborations (46%), with researchers and governments as the most frequently represented stakeholders. This study also showed that there was limited evidence of implementation or follow-up strategies, and that challenges to PS include engagement with stakeholders, data availability, and capacity constraints (5).
The above studies demonstrate that progress has been made in understanding health research prioritization. Specific to low-income countries, some studies have assessed institutional capacities, documenting limited institutional capacity (2,5,6). Other studies have documented limited capacity to implement the identified priorities (5), while others have focused on the criteria used (7) or the stakeholders involved in the process (8,9). However, most of these studies have focused on some but not all of the critical components of health research priority setting. The literature highlights the relevance of understanding the context, the inputs or precursors, the actual process (of which fairness, stakeholder involvement, and explicit guidance tools and criteria are key components), the implementation of the priorities, and the evaluation of the process (10,11). A study based in Zambia described and evaluated health research priority setting based on an internationally validated framework, which featured the key priority setting components discussed above (12). The study demonstrated that the framework provides a systematic, step-by-step approach to assess the degree to which the various key components of health research priority setting are being achieved. The authors were also able to identify key areas where improvement efforts could be focused (10,12). The identified need for countries to evaluate their prioritization processes, in order to obtain evidence upon which to base improvement strategies, was critical to this paper (1,10).
While there have been efforts within the Ugandan research community to systematize HRPS, there have been no systematic evaluations of the same. With regard to financing of health research, a survey undertaken by the East African Health Research Commission found that health research in Uganda is financed from a variety of sources but mostly from external funders. The total health research investment in Uganda for 2014/2015 was USD 116.84 million, of which the share of domestic financing was USD 11.08 million. Therefore, 90.51% of the financing for health research was generated from external sources and only 9.49% from domestic sources (13). It is critical that these resources are allocated efficiently. This paper, based on the framework for evaluating health research priority setting, fills this gap in the literature.
The insights provided in this study can form the basis for designing focused strategies to strengthen health research prioritization and will benefit the Government of Uganda (GOU), other LICs, and partners who support health research systems in LICs.

Objectives
This study aimed to describe and evaluate HRPS in Uganda with the following specific objectives: (1) to describe and evaluate historical practices of HRPS in Uganda based on a validated framework; (2) to identify facilitators and barriers to effective HRPS in Uganda, and best practices that can be shared with other LICs; and (3) to make recommendations for strengthening HRPS in Uganda and similar countries.
The conceptual framework: The evaluation was based on Kapiriri's framework for evaluating HRPS (11,14). While this framework was originally developed to evaluate priority setting for health interventions, it was adapted, validated, and used to evaluate HRPS in Zambia (12). The framework consists of five domains: (i) the priority setting context, (ii) the priority setting pre-requisites, (iii) the priority setting process, (iv) implementation, and (v) outcome and impact. Within each domain, the framework specifies parameters. Each parameter, in turn, has objectively verifiable indicators and respective means of verification. These are summarized in Table 1.
The framework provided a basis both for the development of the study instruments and for data analysis. Key informants who had either directly participated in or had specialized knowledge of HRPS in Uganda were interviewed. Respondents were recruited until theoretical saturation was achieved (15).
Key informant interviews were conducted between February and April 2019. An interview guide, based on the evaluation framework, was used to collect the data. EE conducted the face-to-face interviews, which lasted 30-40 minutes. All interviews, except for one, were audio recorded with permission from the respondents.
The recorded interviews were transcribed verbatim.
Data Analysis: Analysis was conducted manually. To facilitate the description and evaluation of how the current health research priorities of the MoH were identified, EE and LK read through the full transcripts and identified broad themes that matched the study objectives. Further analysis involved the two investigators mapping out the responses according to the parameters in the evaluation framework. Themes related to the various parameters were then grouped under the respective domains. Subsequently, we assessed the degree to which the description met the requirements for a given parameter. The report is organized according to the domains and parameters.
Areas of alignment were identified as lessons of good practice to be shared, and areas of non-alignment were identified as challenges where improvement strategies are required.
Validation: The initial synthesis of the results was presented to and validated by study participants and additional health research stakeholders in a workshop in May 2019. Ethics: Ethical clearance for the study was obtained from the National HIV/AIDS Research Committee, with final approval from the Uganda National Council for Science and Technology (UNCST). The study was also approved by the McMaster University Research Ethics Board.

Results
The findings are based on the document review and analysis of the 33 key informant interviews, including: four respondents from development assistance partners (DAPs), five from health-related ministries, seven from academic institutions, five from private research organizations, four from regulatory institutions, and eight district health officers (DHOs).
The results section is organized according to the five domains of the framework: (1) contextual factors, (2) prerequisites, (3) processes, (4) implementation of priorities, and (5) outcomes and impacts.

Contextual factors for HRPS
This domain reflects on the degree to which the political, social, and cultural contexts are conducive to HRPS. Politically, the reviewed documents and the interviews alluded to the idea that the Ugandan government recognizes the importance of health research for evidence-based policy and decision-making. For example, the 1997 Local Government Act stipulates that DHOs set and implement priorities, which should be aligned with national priorities, while at the national level, health-related ministries manage and implement research.
Furthermore, the health policies also mandate the establishment of a national health research organization with specified roles and responsibilities. The other contextual factors were discussed in terms of barriers to HRPS and implementation, whereby lack of funding and negative cultural beliefs are believed to hamper HRPS and health research.
"it has set up bodies like NDAs, like UNHRO, like us UNCST then there is political will otherwise they would have just shut us down." However, 'Political will is constrained by limited resources." #18.
"Then cultural issues play a role but at a very micro level. They get pieces of research showing that may be there is resistance to immunization because of certain cultural issues and practices or beliefs." #30.

Prerequisites for HRPS
The evaluation framework identifies three prerequisites for successful HRPS: (1) political will, (2) availability of human and financial resources, and (3) an HRPS institution with the capacity to set priorities.
Political will for HRPS
As discussed above, political will is demonstrated by the legal mandates and the establishment of research institutions described under the contextual factors. However, some respondents expressed doubt, referring to the lack of funding. These respondents believed that, relative to other political priorities, health research did not enjoy as much political support.
"I don't think even political will which is there. It's not as much as a high agenda as security, it's not as a high agenda as infrastructure, it's not a high agenda as probably looking for markets, tourism. I don't think it's that level." #28.

Availability of financial and human resources
Several respondents reported that there were limited financial and human resources to support HRPS. Health research priority setting receives funding from the GOU or from donor agencies, but this funding is often meagre. For example, the governmental allocation for health research for 2017-2018 was only 0.17% of the health sector allocation and 0.01% of the overall budget allocation (16).
"There is political will, but the ability to support I know is limited through allocation of resources." #27 External funds often support actual research. However, it is di cult to quantify, as these contributions do not come via GOU budget support and the Ugandan annual national health accounts do not itemize health research as a study category (17): "There's a lot of research work going on, but is funded through project support not budget support, and it is not captured in the government reporting framework-even if they are consistent with national research priorities." #18 Arguably, the limited resources also impact the number and quality of human resources that are available to provide health research oversight: "But you look at the entire ministry, we had challenges with human resource capacity in every aspect-inadequate numbers and inadequate capacity. UNHRO has no representation at sub-national level (e.g., regional, district); they should be represented in all centers where power belongs." #16 "I don't think they [ "…There are multiple layers, a busy landscape-NGOs, donors, development partners-so coordination becomes di cult. Determining priorities becomes di cult…" #18 "…There are so many research institutions that do not talk to each other..". #8 "…I think there is no proper coordination of … either the research institutions or those who generate the research questions. There is no repository for research either-evidence or even research questions. There is no … research hub or a unit … such that these things can be taken up, can be prioritized, can be discussed and funded…" #11 Although the districts are also mandated to identify their own research priorities, their capacity to do so was doubted: "For local governments, some don't have capacities to identify priorities, they are not able to observe and articulate a priority because they are too engrossed in being part of the community and are not exposed to other contexts." 
UNHRO-Busitema University-MoH (2018). We describe these processes in detail, where information was available.
These processes have been conducted independently, although they should, ideally, support the ENHR strategy. For each initiative, we identify the critical parameters relevant to the evaluation framework for a standardized evaluation. Table 2 summarizes the previous HRPS in Uganda, identifying the leaders of each initiative, the stakeholders who were involved, the criteria used to rank the priorities, and the outputs, dissemination, and implementation. Below we organize the descriptions under the parameters according to the evaluation framework.

Stakeholder involvement
Stakeholders in the initiatives included the District Leadership, the community (direct), and the community (indirect: CSOs, NGOs, etc.) (Fig. 1).
All initiatives involved representatives from academia. However, among all initiatives, the Ad Hoc committee, which was the first HRPS initiative in Uganda, was the most participatory, involving focus group discussions with the community representatives and districts, a stakeholder group that is consistently missing from most of the subsequent initiatives.
One DHO recalled attending an HRPS event organized by the Ad Hoc Committee on ENHR at regional level 10+ years earlier. At this meeting, several districts were represented by their political and technical leaders, including DHOs, to seek their perspectives on research priorities.
"Every district would present a list, then we would say among these (which are the priorities)? There would be a scale… some kind of ranking. The forum was interactive, guided by some lectures, experience from some countries…" #29.
Other than that initiative, however, most of the district respondents reported participating in dissemination meetings as opposed to the meetings where priorities are determined.
"So you are called at the end (at national dissemination meetings); basically that you're given the research priorities but have not contributed to determining them." #24. Some key informants noted the limited stakeholder involvement, although they recognized the relevance of including a broad range of participants in the PS process in improving the process's success and the subsequent uptake of the priorities: "…The problem I'm seeing is the way they organize the process of setting the priorities, because it's like they went and sat in [region]. How many were there? How many people come? Instead they should have broken it into regions … they can go to some sub-counties and invite sub-county chiefs, CDOs…" #6 Use of an explicit approach/ method, evidence, and criteria All ve initiatives reported a systematic approach however, these varied from using a speci c approach e.g. the ENHR approach with the Ad hoc committee, to explicit steps in which options are identi ed and ranked based on speci c considerations or criteria, or "consultations" with experts about what they perceived the priorities to be, as exempli ed by a respondent who stated, "…It was more of a consultative process to start research priority setting..." #10 The prioritization processes were all achieved in face-to-face workshops. All the approaches considered research evidence, evidence on the disease burden, and gaps in the literature. However, notably, only the UVRI process, considered the previously identi ed research priorities.
With regard to criteria, while all five initiatives considered several factors when determining their research priorities, these again varied from explicit criteria to questions that the stakeholders were asked to consider. The criteria used by the different initiatives collectively included: feasibility (including costs and capacity), avoiding duplication, urgency, acceptability, potential benefit, disease burden, link to (institutional, national (e.g., UHC), and regional) research priorities, degree of uncertainty, and opportunity for change. While all these criteria could potentially be relevant to any HRPS process, no single initiative considered all of them when identifying its research priorities. Another controversial criterion was the funders' priorities: "…Donor interests will influence what kind of research you would want to do. Remember we do a lot of collaborative research [with] people from the west…" #4. Overall, there was no clear documentation on the origins of the criteria, with the exception of the CHS initiative, which identified Viergever's framework (7) as the source of the factors/criteria considered.

Publicity
All initiatives developed a list of health research priorities (Table 4). The lists show a shift from disease-specific priorities to health system issues. The study team found a list of priorities dated 2005-2010 on the UNHRO website, a narrative summary of the MRC-UVRI process on the MRC website, and the report of the SPEED project PS events on the SPEED website. The other priorities (usually within HRPS event reports) were sourced from key informants, since they seemed to be available only within institutions. This was supported by the key informants, who reported that their priorities are usually disseminated: "…We disseminated to those who were in the space: the Ministry of Health, those who attended.
We had a mail list actually, many of these conferences, these are documents we print and put on the desk for people to pick…" #7 However, one respondent highlighted the need for priorities to be made available to stakeholders who are not usually considered in the dissemination process. Complementing this suggestion, and considering that not everyone may have access to the above mechanisms, were recommendations to pursue more channels for publicizing priorities: "…A research agenda, I believe it would be important to communicate it to key stakeholders who would at least participate in research especially universities, some health facilities. The big ones like national referral hospitals, regional referral hospitals … But also, it should be available to the public in various fora … and should be freely available online…" #15

Implementation of the health research priorities
This domain includes the allocation of resources according to the identified research priorities.
The initiatives did not report clear and explicit implementation plans. For some, the process ended with submitting the list of research priorities to the organizations that contracted the HRPS or to the MoH (for example, the UVRI and MUSPH initiatives). The latter hoped that their priorities would be used by the MoH, recognizing their limited ability to actually ensure resource allocation: "The goal was actually not necessarily around research funding. It was to give the kind of document which speaks to issues that are necessary to move the agenda with HSDP forward. And some of the research is being guided by that, but also other players have come to do the research. But I can't tell you how much money has been spent on those topics because we don't have capacity to tell who is working in this area but at least there was a dissemination of that information for use." #7.
Our respondents recognized the limitation and futility of priority setting without the accompanying allocation of resources: "…I think as long as priority setting is not linked to resources then it can't be a priority…" #28. "…If we do not implement what we have put down, then it is as good as a waste of time. So, the experience in Uganda, you will find a lot of write-ups on priority setting efforts in Uganda, but what follows after the priority setting exercise? It's worrying!" #8. Additional implementation challenges exist at the district level. These issues cut across all components of HRPS, either due to funding constraints or resistance to research efforts: "…Locally there is political will but the ability to support I know is limited; through allocation of resources…" #27. "…When there is poor communication with the community, that one can also affect the implementation of those priorities of research…" #21. However, previous surveys of health research in Uganda seem to suggest that some of the work being undertaken aligns with the priorities that have been set. One study of publications from Makerere University between January 2005 and December 2009 showed that, out of 837 publications, two-thirds (66%) addressed the country's priority health areas (18). There are, however, no equivalent studies on the alignment of research outside the confines of Makerere University.

Outcome and Impact
Our results did not provide clarity with regard to which priorities were actually implemented and had an impact on policy, with the exception of the PEPFAR initiative of priority setting for HIV/AIDS research. In this case, the list was used to identify the annual research focus for funding by PEPFAR. The other output was the recommendation to institutionalize the UNHRO, which was made by the Ad Hoc priority setting initiative. Beyond these two examples, respondents decried the limited translation of research into policy and programming and the disconnect between research and policy needs: "…That connection with the government, constant connection with policy makers and implementers, I feel is lacking. They are not connecting, so that to me is a big weakness in national development. Instead we have got people who do research from outside and talk better about our systems…" #9. "…We have various government structures here where research questions are generated … unfortunately not many of the research questions generated are taken up for research…" #11. The GOU requires each recipient of a budgetary vote to report on specific development indicators based on their responsibilities. For example, the MoH reports on the indicator of "Proportion of research informed policy and guidelines" for funds received for health research, which in 2017/18 was rated as 30%. The same indicator for the same year was reported by UVRI as 20%.
A key informant noted that these numbers do not capture external health research funding: "…The reporting indicator framework for MoH/NPA is an incentive to comply, i.e., you have to report on priorities. This is an in-built mechanism to ensure ministries use resources for what they were purposed, but most research is donor funded and not reported through the government structures, so does not get reported…" #18.

Recommendations for improvement
At the member check and dissemination meeting, the participants, who are stakeholders in HRPS in Uganda, made some recommendations based on the findings. These related to: stakeholder involvement, championing HRPS, establishing an HRPS think tank, and including HRPS in annual work plans. They are summarized in Table 5.

Stakeholder involvement
Wide stakeholder involvement results in nationwide ownership of the set priorities: "…The process needs to be so wide, and so integrated in as many constituencies as possible to be owned nationally by as many people as possible…" #10.

Champions for HRPS
It is not enough to include stakeholders; there must be people whose responsibility it is to drive HRPS: "…It requires a champion. Somebody, a senior researcher to champion this agenda. It needs a lot of lobbying and advocacy for research either at parliamentary level or at ministerial level so that research issues are embraced by all institutions…" #11.

Establishing a think tank
One of the private academic institutions has established a think tank consisting of individuals from different disciplines, including the business and industry communities, to advise on research needs in the country: "…We also have a think tank, which is specifically to us but not the country at large. At that think tank we have multiple stakeholders; the think tank is very much research driven and majority of our priority setting comes from those conversations…" #3.

Capturing research priorities in annual work plans
Including research priorities in the work plan is an important step towards implementation: "…After agreeing on the (research) priorities, we were able also to capture this, to prioritize this and capture in our annual work plan and budget. Where we have the source of fund identified we also include, where we don't have the source of funding, we keep it as unfunded priority then we start looking for the money…" #24.

Discussion
Across the five domains (context, prerequisites, PS process, implementation, and outcome/impact), our study found a number of factors that either facilitated or hindered HRPS in Uganda. In terms of the context domain, the study findings show that both the political and socio-cultural contexts, including the relevant laws, statutes, and health policies, provide an enabling environment for HRPS to take place in Uganda. However, there are some contextual barriers in terms of negative cultural beliefs, which hamper successful HRPS. Furthermore, limited and irregular funding makes it difficult for the PS institution to strengthen its capacity and to regularly conduct health research priority setting activities such as implementation, monitoring, and evaluation.
The evaluation revealed that all the historical processes used a systematic approach to identifying health research priorities and that they based their priorities on the available evidence. This may be an indication of capacity in systematic priority setting. Recent years have seen focused capacity strengthening for HRPS in LICs (3,19-21).
While there is still room for improvement, these findings may be an indication that, with concerted focus on specific areas where there are weaknesses, health research priority setting systems in low-income countries can be strengthened. The framework used in evaluating health research priority setting facilitated the identification of specific areas where such concerted efforts could be focused (11,12,14). We discuss these areas in detail.
The finding that, while a national health research organization is mandated by an Act of Parliament to be the coordinating institution for all public and private research actors in the health sector, limited resources hamper the organization's operations has been documented in similar contexts such as Zambia. This limited institutional support results in various other institutions (often those that are better resourced) assuming the health research priority setting role, which fragments national health priority setting. Furthermore, although the national health research organizations should be coordinating all HRPS within a country, some of these better-resourced institutions do not report to or involve the national organizations in their processes (1,5,12). However, recently the Uganda One Health Strategy has shown promise in bringing a multi-sectoral approach to the health system. The strategy specifically prioritizes the need to build capacity for multidisciplinary, collaborative research within Uganda and could be a step in the right direction of facilitating coordinated HRPS (17).
Another limitation of the HRPS was the limited involvement of the public and sub-national level stakeholders.
When public input was sought, this was limited to consulting with civil society organisations and nongovernmental organisations. These findings are consistent with the HRPS literature (8,22). While this literature emphasizes the relevance of public participation in priority setting, involving the public in health research priority setting may be more challenging, since research (as compared to health interventions or disease programs) is not commonly discussed in public contexts. Since the respondents recognized this as a gap in their PS processes, the national health research organization could take the lead in educating the public about HRPS. Existing and already validated approaches such as the Choosing Healthplans All Together (CHAT) tool could be used as a basis for meaningful public engagement in health research priority setting (23). This approach would be supported within the decentralization framework. Since research implementation often occurs at the district level, decentralization provides opportunities where the public could potentially be meaningfully involved in HRPS within their local settings, and these priorities could contribute to the national-level health research agenda. Such a participatory, bottom-up process would contribute to local capacity strengthening and potential support for research within the districts, which would be relevant to the local context (7,8,24).
Lastly, while all initiatives produced a list of priorities, there was no clarity with regard to: how they reflected on any prior priorities that were set; how they disseminated the list beyond the reports that were produced; and whether and how evaluations were done to assess the degree to which the identified priorities were actually implemented. The literature alludes to the fact that lack of formal publicity of the resulting research priorities, and limited stakeholder buy-in, can jeopardize the priority setting process and the subsequent implementation of the determined priorities (1,25). There were only two documented instances where priorities were actually implemented: PEPFAR's HIV/AIDS priority setting initiative and the Ad Hoc Committee, which recommended the institutionalization of UNHRO. These examples underscore the discussion above with regard to stakeholder buy-in and engagement in implementation.
There is a need to monitor and evaluate priority setting right from the planning process to implementation. Our findings support the scholarly literature, which has found that systematic mechanisms for evaluating health research prioritization processes are lacking (25) and that there is a paucity of published information on the implementation and evaluation of HRPS in LICs (5).
Limited evaluation of the degree to which priorities are actually implemented underscores the importance of a framework that looks beyond HRPS that culminates in a list of priorities, which, in many countries, ends up on shelves until yet another exercise. Systematic evaluation, and publicizing the findings, would examine the degree to which the research that is implemented in a country aligns with the national health research priorities and the health strategic plan. Evaluation would not only support focused allocation of health research funding but also ensure that there is synergy between health research and health policy and practice (26,27).
While, in recent years, there have been efforts to set program- and/or disease-specific health research priorities within Uganda (28,29), to date there has been no recent stock-taking of research in Uganda to better inform future research, to use resources more effectively and efficiently, and to reduce fragmentation across the responsible agencies, research institutions, and research funding agencies.

Limitations
The findings should, however, be interpreted with caution. As this is a qualitative study with a disproportionate representation of national-level respondents, it is difficult for us to generalize the findings to sub-national levels.
Furthermore, the lack of direct observation of the priority setting process may imply that our findings do not accurately reflect actual practice. However, direct observation was beyond the scope of the current study and is a promising area for future research. Finally, there is also the possibility that the researchers' personal experience and knowledge influenced the observations and conclusions; however, researcher reflexivity and the stakeholder validation of the results should have mitigated this.

Conclusions
While there is political will for undertaking HRPS within the GOU, insufficient funding has led to fragmentation of approaches and applications over the years. Health research priority setting in Uganda faces challenges with coordination of HRPS efforts by the research coordinating entity (UNHRO) and with dependence on external funding.
Although all stakeholders have important roles to play in health research management, relevant stakeholders are unclear on the scope of their work, how they are meant to collaborate, and how to best ensure their outputs are coordinated and cost effective.
There is a need for improved coordination of health research across research institutions and other stakeholders.
The One Health Platform, a new initiative, is planning to address the need to collaborate across sectors. While it is still in its infancy and needs support, the initiative shows promise.
While there is sufficient awareness of the different approaches to HRPS at the national level, no unified and accepted methodology has been applied consistently. Regardless of which HRPS approach is chosen, this study points to the fact that the key elements of the process need to be (1) wide stakeholder consultation, (2) agreement on the criteria to be used, and (3) a good ranking system.
In addition, there must be a systematic evaluation across all research implementers, in order to (1) align priorities and (2) ensure the translation of research findings into actions. This will ensure that HRPS leads to the improvement of the population's health.