Keywords
Post-award management, grant management, monitoring and reporting, compliance, assurance, research funding, research bureaucracy, research impact assessment
This article is included in the Research on Research, Policy & Culture gateway.
We have addressed the reviewers' comments to produce an improved version of the scoping review. We have included an additional, important record (identified by the reviewer, Simon Kerridge) describing the roles of research management and administration (RMA) professionals in the UK, which supports our analyses of the tasks and effort involved in post-award management. We have also expanded on the limitations of the review and cleaned our data by removing an interim version of a public review of research bureaucracy in the UK, since the final report was already included. These changes are reflected in Figure 1, the data characterisation and, where applicable, the data analysis (namely Table 4). Finally, following reviewer advice (Caitlin Brandenburg), we have improved the use and readability of Table 6 on issues and solutions in post-award management by summarising its themes in the Results section of the main text. We believe that Version 2 of the paper is now clearer and more readable.
The availability of research funds is declining and is subject to tighter compliance, performance, and fiscal controls.1 As funders are under more pressure to monitor and account for the research they fund, the effort of ‘post-award management’ is also increasing and affecting the productivity and culture in academic research.2–5
Post-award management, also frequently known as ‘compliance’,6 ‘grant management’,7 and ‘monitoring and reporting’,8 refers to the stages of research after a decision to fund has been made. These stages involve many tasks to set up and manage the research, including awardees reporting to funders on its progress, outputs, outcomes, and impacts. The effort involved (particularly for researchers) can be considerable, and administrative support is therefore important to avoid hindering research or consuming too much time in grant-related administrative activity.9 Where support is lacking, however, excessive compliance and reporting requirements can lead to faculty burden9 and to pressure on research careers and the training environment.10 Despite this, practices in post-award management have rarely been explored by the research community, and resources related to funding and ‘grantsmanship’ remain focused on pre-award areas, such as applications and peer review.11
The disruptive effect of the pandemic on research has led to renewed focus on academic support and on removing administrative barriers to mechanisms of funding and research delivery.12,13 In the United Kingdom, plans for reducing ‘research bureaucracy’ include changes to practices that crucially involve post-award management, and organisations are asked to streamline how they collect and share information to reduce duplication and delay in research.14 While there is evidence that funders are indeed making changes to reduce administrative burden, these changes still mostly focus on pre-award processes (e.g., shorter funding applications and contracts),15–17 while post-award management remains an area of few visible changes. Funders and Higher Education Institutions (HEIs) must therefore show that they are addressing their post-award processes and comparing their practices, involving other institutions and sponsors in decision-making (e.g., on methods of information collection), and explaining to researchers why information is requested and how it is used.14
As strategic changes occur in this space,18–20 they should be supported by better understanding of the tasks and effort that go into post-award management. The value of the process needs to be clarified to the research community,21 and so should the roles that funders and HEIs play in supporting more efficient methods of research reporting. The aim of this review was to understand the current position and landscape of post-award management; catalogue and summarise the different activities involved; and explore the purpose of information collection in research. An additional aim was to compare how funders currently approach post-award management; identify any unnecessary effort or need for improvement; and to use evidence of previous solutions to inform recommendations for both funders and HEIs.
The review followed an unpublished protocol developed by the authors and is based on the PRISMA-ScR and JBI methodological and reporting frameworks for scoping reviews.22,23
We used a broad definition of ‘post-award management’ to capture any process relevant to the set-up, management, monitoring, or reporting of an externally funded research award (funds awarded to a HEI by a government agency or charity). Processes are relevant from when the decision to fund has been made (e.g., Notification of Award) and include any type of communication or request for information (e.g., progress report) from the funder or other external research stakeholder (e.g., government sponsor) to inform on the status, outputs, outcomes or impacts of research during award delivery. Relevant internal (HEI) processes include curation and management of research data and the financial management of awards. Other relevant funder processes include methods of collection and tracking of research data (e.g., for research impact assessments). Processes carried out during award close-out and post-completion are also considered ‘post-award’ and relevant, and include end-of-grant reports and tracking the long-term impacts of research.
To be included in the review, records had to be written in English, be accessible in full text or PDF format and describe any process(es) relevant to post-award management and reporting, as per the definition above. Records were excluded if they referenced post-award management only broadly, without detail on the processes involved, or if they were out of the study scope (e.g., focusing on financial management of research awards or frameworks for research impact assessment). Eligible records could be peer-reviewed publications, grey literature (e.g., blogs, reports), presentations, or websites, and no limit was placed on publication date, status, or country, to capture as much of the literature as possible in what we expected to be a sparsely explored area of research.
All authors were involved in developing the electronic database search strategy used to identify relevant literature on post-award management and reporting of funded research.
To test the initial search term and keyword combinations, limited searches were conducted in Embase and Medline by the lead author (KC) to check the availability and relevance of titles and abstracts. Since combining all the search terms returned no results in either database, the search strategy was refined: multiple separate single-term searches were conducted (Table 1) and the results of each search were screened separately. This was followed by searches using search term/keyword combinations to narrow the searches where possible (as shown in Table 1), and these results were also screened.
Full literature searches were undertaken in relevant electronic databases, namely Embase, Medline, PubMed, Web of Science and Google Scholar. Additionally, manual searches of the content tables of key journals (determined using the interquartile rule for outliers) were conducted to find relevant articles published in the last year (2022-23). Initial searches were conducted in March 2022 and the final manual searches were conducted in March 2023.
In addition to electronic and manual searches, the websites of 11 funding organisations (listed in Table 2) were reviewed to obtain information on current funder approaches to post-award management. The funders were chosen to incorporate a range of geographical regions, funder sizes, and monitoring and reporting approaches. The websites were reviewed between March and October 2022, with an updated review in February 2023. The dates the websites were accessed and links to the web pages searched are shown in Table 2.
Records identified through database and manual searches were exported with citations, titles, and abstracts into EndNote 20 (Clarivate, UK). Duplicates were removed using the EndNote duplication function and manually. Records were divided into two groups to screen titles and abstracts against the eligibility criteria. Each group was independently screened by two authors (AJBJ and KF), with the lead author (KC) screening all records. Any disagreements between authors regarding the decision to include or exclude were resolved through discussion until consensus was reached. Full texts were retrieved for all records agreed to be included at this stage. Where full texts could not be retrieved, access was requested from the University of Southampton Library Services. Two authors (KC and AJBJ) screened all full texts, and if agreement to include or exclude was not reached, the third author (KF) was consulted to arbitrate.
A structured data extraction form was created in Microsoft Excel and piloted using five eligible records. Following team discussions, the form was revised before formal extraction began. The lead author extracted the data for each record. Extraction fields (shown in Box 1) included: (i) study identifiers, (ii) study characteristics, (iii) details on post-award management processes, (iv) descriptions of unnecessary effort or need for improvement, and (v) descriptions of any previous solutions or recommendations from authors.
An additional data extraction form was created for information on the current post-award management practices of funders, as obtained from their websites. The lead author extracted data from 11 websites in total, with the relevance of all information checked by all authors. The extraction fields are shown in Box 1.
Extracted information on specific processes, requirements and tasks was analysed thematically, whereby similar processes were categorised and the resulting categories represented the main components of the post-award management process. Each category was then ordered according to its frequency in the literature (i.e., the number of citing records) to give an indication of the effort involved. Adapting a method used by Glonti et al. to explore the roles and tasks of journal reviewers,24 extracted information relating to the purpose of post-award management was collated and used to compose common purpose-related statements (“The purpose of post-award management is …”). To do this, all relevant information was extracted into a Microsoft Excel spreadsheet by KC. Subsequently, information was coded into common statements, and similar statements were categorised into overarching themes and ordered by frequency, again based on the number of citing records. Finally, extracted information relating to unnecessary effort or need for improvement in post-award management was summarised for each publication and thematically categorised according to the component of post-award management (e.g., research impact assessment) to which it was relevant. This information, along with any information on previous solutions or authors’ recommendations, was used to inform broad recommendations for funders and HEIs.
The information obtained from funder websites was extracted into Microsoft Excel, collated, and used to draw out the main points of variation in how funders approach post-award management. This included comparison of reporting requirements, the frequency of reporting, the use of digital systems for award management, funder policies, resources, and evidence of support.
The search strategy is depicted in Figure 1. Database searches yielded 2731 records. After duplicates were removed, 1926 of the remaining 2117 records were excluded through title and abstract screening for not meeting our eligibility criteria. Following full-text screening of the remaining 191 records, a further 144 records were excluded (too broad in focus, n=38; out of scope, n=100; full text could not be retrieved, n=6), leaving 47 records for data extraction. Manual searches identified a further 4 eligible records, giving a total of 51 records included in the review. Following peer review, we included an additional record, a professional development framework for research managers and administrators in the UK (Table 3), bringing the total number of records to 52.
Author(s)/year | Publication type | Funding field (and country) | Organisational level (HEI or Funder) | Post-award management processes, tasks, or requirements described | Systems used to support processes (if any) | Stakeholders mentioned in the process (if any)
---|---|---|---|---|---|---
Abdullahi et al., 202144 | Mixed-methods study | Small grants in health research (Kenya) | Funder | Promotion of funded research on digital and non-digital platforms, reporting collaborations, feedback, and recommendations for program improvement | Not specified | Principal investigators |
Abudu et al., 202258 | Review | Research grants and contracts | Funder | Annual or end-of-grant reports, reporting research outputs (citations/publications, research accomplishments, collaborations/networks, capacity-building, career advancement, future funding, research targeting, media citations/presentations) and outcomes (products/research tools, patents, drugs, clinical practice policy/commission memberships), and impacts (broader health, economic or societal downstream impacts of research, or ROI studies), administrative or financial data related to projects, Researchfish reports, surveys or semi-structured interviews with investigators | Dimensions.ai, Researchfish | Principal investigator or grantees, contracting officer or project representative
Adam et al., 201836 | Opinion piece | Health research contracts/grants (Spain) | Funder | Reporting on research impact, plain English summary in all reports | Not specified | Not specified |
Adam et al., 201235 | Case study | Extramural grants in clinical and health services research (Spain) | Funder | Progress reports on achievements, results, and changes to work plan, declaration of all outputs | Not specified | Principal investigators |
Agostinho et al., 202079 | Case study | Research and Innovation (Portugal & Spain) | Funder and HEI | Contract negotiation, adherence to funder and statutory Ts & Cs, compliance with auditing requirements | Not specified | Institutional research support staff |
Aliyu et al., 202180 | Evaluation | Clinical and health research (USA & Nigeria) | HEI | Annual performance report, research performance progress report, financial tracking and reporting, subcontract management and compliance, fiscal oversight, effort reporting | Research Electronic Data Capture protocol tracking database (REDCap) | Office of Research Administration (grant managers, project coordinators, fiscal accounting staff), Institutional Review Board, Ethics Review Board, Community Advisory Board |
Al Mawali et al., 201681 | Review | Health research (Oman) | HEI and funder | Progress reports, final reports, clinical trial registration | Ministry of Health Centre for Studies and Research (MoHCSR) website | Not specified |
Allen, 201637 | Blog | Research (UK) | Funder | Reporting of grant outputs, products, and impacts (using common taxonomies developed by funders and policy makers) | Researchfish, Pure, Converis, ImpactTracker | Not specified
Association of Research Managers and Administrators, 201125 | Professional Development Framework | Research (UK) | HEI and funder | Drafting, negotiating, and accepting contracts, dealing with project finance, employing staff on research contracts, reporting to funders, collecting data on research outputs and supporting researchers to collect evidence of research impact, promoting knowledge exchange, supporting technology transfer, providing sound administrative support (for researchers and students), contributing to organisational and funder policies (e.g., Open Access), contributing to the Research Excellence Framework, supporting research ethics and governance, working with information systems, supporting audit, and making statutory returns. | Research Excellence Framework (REF) | Research managers and administrators (RMA), postgraduate researchers, funders, research staff |
Bagambe, 201267 | Journal abstract | Biomedical research grants (Uganda) | HEI and funder | Grants acquisition and management (from funding acquisition to project closure), contract negotiation, programmatic and financial compliance with funder requirements, monitoring and evaluation | An automated System for Integrated Grants Management (SIGMA) | Institutional and sponsor grant managers |
Baghurst, 202119 | Commissioned report | Health and social care research (UK) | Funder | Open Access policy compliance, plain English summary, reporting publication DOIs, reporting employer and funder affiliations, acknowledgements, ORCID ID numbers | CRIS, Researchfish | Researchers, research managers, library services
Basner et al., 201377 | Prospective evaluation | Federal clinical research (USA) | Funder | Semi-annual progress reports (overall center progress, research projects, cores, education and training unit, outreach dissemination unit, collaborations, in-progress publications, peer-reviewed publications, leveraged funds, patents, trainees, courses, meetings and outreach activities) | Interdisciplinary Team Reporting Analysis and Query Resource (iTAQR), NIH Research Portfolio Online Reporting Tools Expenditures and Results (RePORTER) | Investigators, trainees
Bates & Jones, 201248 | Guide | Public health and community research (UK) | Funder and HEI | Periodic progress reporting: inputs, day-to-day activities, outputs, and outcomes (e.g., tracking types of events, activities, costs of delivery and characteristics of users), dissemination within community (e.g., social media), reporting of findings for external/internal feedback, publication of articles | Not specified | Project participants, sponsors, commissioners, volunteers |
Bhurke, Chinnery & Raftery, 2018*52 | Report | Health and care research (UK) | Funder | Grant acceptance and special conditions, compliance with T&Cs, final report, progress report, mid-term report, annual report, invoicing, extension request, publication | Not specified | PI/CI, host institution, funder, administrator, officer, programme manager, research advisor, head of research funding, director, business development team, trial steering committee |
Bird, 199582 | Book chapter | Ethics in research | Funder | Intellectual property, authorship, individual contributions, data retention policy, publication, progress milestones, notification of approval for changes (e.g., to research design or objectives) | Not specified | Not specified |
Bonham & Barnes, 202083 | Journal feature | Extramural grants in health research (USA) | Funder | Reporting ‘foreign influence’ such as: financial conflict of interest (including travel to affiliated institutions), foreign or personal funding, ‘time commitment’ to foreign institutions, conflicting IP’s or authorship/expected co-authorship, internal review of grant holders and projects, and reporting corrective actions to agency (in the case of foreign inquiries) | Not specified | Principal investigators, co-investigators, agency managers, government regulators, law enforcement and US Congress |
Briar-Lawson et al., 200884 | Case scenario | Federal social research (USA) | HEI and funder | Expenditure reports, purchasing of materials, hiring staff, travel arrangements, payments to research participants | Not specified | Principal investigator, HEI administrators, business managers, Associate Dean of research |
Brouard and Glass, 201785 | Conceptual article | Philanthropic foundation research (Canada) | Funder | Descriptive reporting of activities and outcomes, including end of project reports, testimonials and success stories, external evaluations of project/program, audited financial statements. Site visits, presentation to foundation board. Evaluation and ‘performance measurement’ of outcomes of activities. | Not specified | Principal investigators, grant managers and foundation staff/board |
Buck, 201438 | Correspondence | Research (UK) | Funder and HEI | Annual output, outcomes, and impact reporting | Researchfish | Not specified |
Burland and Grout, 201639 | Journal abstract | Research (UK) | Funder and HEI | Data management and compliance with policies of open access, accessibility/discoverability, and research administration standards. Reporting of outputs and impacts of research | Jisc, Current Research Information Systems (CRIS/IR), CASRAI, ISNI, Je-S, Gateway to Research (GtR), open-source platforms, repositories, databases/spreadsheets | Not specified
Clements et al., 201786 | Journal abstract | Research (UK) | HEI | Reporting publication output, intellectual property (IP) and engagement activities, acknowledging funding source in publications | Researchfish, Current Research Information System (CRIS/IR) | Investigators, programme managers |
Collado et al., 20178 | Mixed-methods pilot study | Philanthropic health research (UK) | Funder | Reporting impact (narrative via telephone interviews): outputs, grant products, direct grantee outreach, impact activities | Researchfish | Principal investigators, designated grant monitor |
Corona Villalobos, 202029 | Thesis | Administration research (USA) | HEI | Research administrator tasks: serving as a point of contact for federal grant managers, ensuring compliance (e.g., ethics, health and safety), implementing and reporting on federal grants, coordinating site visits or reviews, coordinating internal/external audits, completing financial and programmatic progress reports, grant closures, facilitating training | Not specified | Research administrators, Principal investigators, federal grant managers |
Croxson, Hanney & Buxton, 200149 | Journal article | Health-related R&D (UK) | Funder | Reporting measurable outputs: publications, patents granted, higher degrees awarded, conferences/meetings, references in published policy reports & guidelines, annual reports, self-reporting (e.g., questionnaires) | UK Economic and Social Research Council REGARD system, Research Outputs Database (Wellcome) | Not specified |
Davidson et al., 201440 | Pilot | Research (UK) | HEI | Compliance with data management and sharing policies, Data Management Plan, data registration | Researchfish, Digital Curation Centre Data Management Plan (DMP) online tool, CRIS, Edinburgh DataShare, Research data registry and discovery service (RDRDS), Jisc | Researchers, data managers, funders |
Decker et al., 20077 | Federal report | Federal research (USA) | Funder and HEI | Periodic scientific progress reports, financial reports, certifying the effort of research participants | Not specified | Principal investigators and co-investigators, administrative faculty staff |
DeMoss et al., 20186 | Journal article | Research (USA) | Funder and HEI | Review of award notice Ts & Cs, establishing accounts and allocating budgets, providing account information, effort reporting, setting up subcontracts, monthly account reconciliation, individual project and overall portfolio analysis, process effort changes, annual financial reports, forecasting/burn rates, review and transfer of trailing charges, project inactivation, Open Access policy compliance, financial conflict of interest | Not specified | Post-award accountants |
Dresen, 201230 | Exploratory survey and usability study | Federal research (USA) | Funder and HEI | Monitoring grant outcomes, budget, compliance with HEI and federal requirements, time and effort reporting, grant budgeting and accounting, financial conflict of interest, institutional IRB protocol, expenditure for contracted services | Not specified | Faculty research staff and students, sponsored program administrator, research administrators |
Flores-Rivera, 202031 | Implementation of a faculty services model | Research (USA) | Funder and HEI | Progress reports, subcontract-related invoicing, accounts payable, monitoring award balances, budget forecasting, record preparation and review, processing personnel, purchasing transactions, budget oversight and proposal processing, signoff, compliance, and data security, hiring, Just-In-Time requests, award-set up and financial support, submission of requests to pre- and post-award offices (e.g., for extensions, revisions), effort reporting, cost sharing transfers | New Oracle Cloud HCM and Finance Platforms | Research administrators, faculty staff, researchers |
Fowler & Zitske, 201587 | Presentation | Agricultural and life science research (USA) | HEI | Compliance with grant Ts & Cs, progress reporting, purchase documentations, technical and fiscal closeout, and beyond award activities (record retention, property control, audits) | Not specified | Principal investigators, faculty staff, dean’s offices, accounting and business services, Research and Sponsored Programs office (RSP) |
Guthrie et al., 201947 | Report | Health and medical research (Australia) | Funder | Grantee variation requests (e.g., change in research plan, contracted investigator, supervisor, institution, FTE, commencement date), financial acquittals, end of grant reporting (plain English summary, research achievements, likely impact on health, economy, community, policy, updated CV and featured outputs linked to grant ID), information on commercialisation (patents, spin outs, income from IP, licence agreements, industry co-funding), innovation (collaboration, other disciplines involved, journals cited), development and implementation of novel interventions, tools, therapies, processes or services | Sapphire grant management system | Principal investigators |
Hinrichs, Montague & Grant, 201534 | Report | Research grants & contracts (UK) | Funder | Recording research outputs, outcomes, and impacts | Researchfish | Researchers, evaluators |
Hunter et al., 201433 | Review | Environmental and global health (UK) | HEI and Funder | Financial reporting, ethical clearance, governance | Not specified | Grantees and students, HEI administrators, funder consortium |
Kane et al., 202188 | Observational study | Clinical research (USA) | Funder | Annual research performance progress report (or quarterly, if behind targets), enrolment data | Not specified | Federal program officers |
Kasim et al., 202132 | Case study | Science research (Malaysia) | Funder | Research project performance, financial project report, presentation of outcomes and results after project completion | Not specified | Centre for Research Grant Management, researchers |
Knowles et al., 202028 | Retrospective study | Clinical research (UK) | Funder | Annual compliance monitoring, after end-of-grant reporting of results summary (public), annual output reporting, registration of clinical trials with public trial registries, informing funder of trial registry number | Researchfish, Gateway to Research, trial registries (e.g., ISRCTN, EUCTR, clinicaltrials.gov) | Principal investigators, grant managers, regulators, and policymakers (UK House of Commons Science and Technology Committee) |
Magee, 201089 | Conference abstract | Translational research (USA) | Funder | Registration of clinical trials with public trial registries (e.g., clinicaltrials.gov), budget compliance (and post-award revisions), staff expenses and payments, invoicing, enrolment data, adherence to study protocol | Online clinical trial registries (e.g., clinicaltrials.gov) | Principal investigators, co-investigators, clinical research managers/administrators |
McLaurin & Gray, 202090 | Presentation | Research grants (USA) | HEI | Award and account set up, submission of reports, requests for no-cost extensions, financial management of award, changes in scope of work/key personnel/Ts & Cs/budget/planned or existing subawards | Not specified | Institutional research administrators, advisors, grants and contracts specialists, sub-award analysts
Muller, 200945 | Review | Federal research (USA) | Funder | Reporting, informal discussions and site visits, grantee-funder engagement, closeout (final reports, financial accounting and reconciliation) | Not specified | Grantee, sponsoring agency |
Nature, 202150 | Commentary article | Research (UK and USA) | Funder | Journal outputs and research income | Not specified | Principal investigators and research managers |
Norris, 200691 | Mixed-methods case study | Arts research (USA) | Funder | Final report: final budget, project activities, community involvement/participation, programs (e.g., new programs), administration (e.g., new staff), partnerships/collaborations, publications) | Not specified | Grantees |
Lehman, 201642 | Exploratory sequential mixed-methods study | Research (USA) | HEI | Accounting, accounts payable and receivable, property and inventory control, payroll, and reporting | Electronic Research Administration (eRA) system | Investigator, research administrators, HEI leadership, IT support, sponsoring agency |
Reichhardt, 199851 | Nature news article | Federal research (USA) | Funder and HEI | Institutional grant approvals | Not specified | Grant and top managers at sponsoring agency, government officials (Office of the Inspector General)
Roudebush & Moore, 201875 | Book chapter | Health research grants (USA) | Funder | Compliance with reporting requirements in Notice of Award, progress and final reports (project milestones, personnel changes, analyses/results, unexpected events, enrolment reports, popular media contacts/reports, presentations or manuscripts, preliminary results, What’s Next), financial report (summary narrative and how the money was spent) | Electronic Research Administration (eRA) Commons status page | Principal investigator, program officer, foundation grants manager, funding agency budget manager |
Sajdyk et al., 201446 | Conference abstract | Federal research (USA) | Funder | Progress reporting: barriers to research, key outcomes, feedback on programme (survey) | REDCap (integration software system for data collection) | Programme managers, principal investigators |
Sakraida et al., 201092 | Journal article | Federal health and behavioural research (USA) | Funder and HEI | Compliance with federal guidelines, funding agency standards and patient safety. Participant recruitment and retention procedures, follow-up telephone calls, preparation of information packets, manuscript writing, set up and maintenance of project office, data management and analysis procedures, dissemination of study outcomes, liaising with media/publicity, authorship and acknowledgement of research and support staff and students, progress reports, grant budget management and expenditure recording | Not specified | Research administrative staff/project office, principal investigators, students, project director, oversight and compliance officers, grants and contracts officer, IRB, publicity/media consultants
Scacchi et al., 199793 | Article | Naval research (USA) | HEI | Funds expenditures, grant renewals, no-cost funds extension, invoicing and funds transfers, in-progress procurement actions, regulatory compliance, record keeping practices, and resource control, tracking reports resulting from research grant awards | INRIS, CAMIS, STARS | Office of Naval Research, researchers |
Sundjaja, 201943 | Mixed-methods study | Research grants (Indonesia) | HEI | Monitoring and evaluation of program implementation, evaluation of grant requirements, reporting research activities, revising grant strategy, budgeting, data accessibility, relationship management | Review of systems (Good Done Grant Management System, JK Group Easymatch, SmartSimple GMS360, Npower Foundation Connect, Fluxx, Bromelkamp, Akoya, Versale Grants, MicroEdge GIFTS Alta) | Grant administrator, grantee |
Tan et al., 202194 | Computational modelling | Research (Singapore) | HEI | Reporting milestones and deliverables, expenditure and justifications, publications, and IP | Project Reporting Management System (PRMS) | Principal investigators, higher management |
Thomas et al., 201995 | Report | Health and care research (UK) | Funder | Responding to government requests for evidence, reviews, and inquiries, serving on advisory committees, participating in working groups/consultations, reporting patient and public involvement, collaborations within/outside of UK, publications, consultancy reports, policy briefings, post-award employment | Researchfish | Award holders |
Tickell, 202214 | Final report | Research (UK) | Funder and HEI | Reporting on project finances, data management, due diligence, export control, sponsorship for health and social care research, animal testing licenses, bullying and harassment, institutional concordats and strategic frameworks, budget change requests, extension requests, post-award set up of finances, procurement and recruitment, project change requests, no-cost extensions, collaboration agreements, internal approvals, reporting | Flexi-grant system, Worktribe, UKRI Funding Service, Jisc | Researchers, institutional staff, government, regulators, funders |
Viney, 201318 | Presentation | Health research (UK) | Funder | Reporting outputs, outcomes, and impacts | Researchfish | Not specified |
Just over half of the records on post-award management and reporting were primary research studies and reviews (27 records; 51%). Research included mixed-methods studies, observational studies, case studies, evaluations (prospective and retrospective), needs assessments and computational models. The remaining 25 records (48%) were non-research items, including opinion pieces, blogs, reports, results of pilots (e.g., of funding programmes and grant management systems) and a professional development framework for research managers and administrators. Where research fields were specified, most related to health and clinical research (19; 37%), with less-represented fields including agricultural and life sciences (2; 4%), social research (1; 2%), the arts (1; 2%) and the study of research administration (1; 2%). The remaining records were deemed widely generalisable, as they were not aimed at any specific research field or type of research funding (24; 46%). Records were predominantly from the USA (21; 40%) and the United Kingdom (18; 35%), followed by Europe (3; 6%), Africa (3; 6%), South Asia (3; 6%) and the Middle East (1; 2%).
All records described processes relevant to post-award management; however, there was significant variation in the terminology used by authors (Box 2), with the most frequent terms including ‘grant management’, ‘monitoring’, and ‘monitoring and reporting’. In terms of organisational level, half of the records (26; 50%) focused ‘externally’ and described the tasks and information requirements (e.g., progress reports) that funders or other sponsors external to HEIs request on funded awards. In contrast, only 10 records (19%) focused on the HEI side of the process, describing the administrative and technical operations involved in the internal set up of awards (e.g., financial approvals, staff hiring) and the tasks related to compliance and assurance reporting (e.g., data management, preparing audit reports). In almost a third of the records (16; 30%), authors explored both the funder and HEI sides of the process, describing mechanisms relevant to both.
Administration of awards
Assurance [reporting and monitoring]
Award management [process]
Compliance [and reporting]
Grant administration
Grant monitoring, tracking and reporting
Grantee reporting
Grantmaking practices
Grants and awards management
Grant[s] management
Grants management and reporting
Grants program monitoring and evaluation
Grant[s] reporting
In-grant implementation
In-grant management
Management of external research funds
Management of [grants/awards] and reporting
Monitoring
Monitoring and closeout
Monitoring and [outcome] evaluation
Monitoring and reporting
Monitoring compliance
Ongoing assessment
Post-administration requests
Post-award administration
Post-award grant[s] management
Post-award management
Post-award monitoring
Post-award monitoring and reporting
Post-award phase
Post-award processes
Post-award research activities
Project management
Project reporting process
Project tracking management
Reporting [process]
Reporting and research impact assessment
Research assessment of progress criteria based on reported output
Research contracts reporting
Research management
Research reporting
Scientific reporting
Tracking individual projects
Table 3 includes a summary of the 52 records and the key information extracted from each record. In line with the eligibility criteria, all records provided information on post-award management processes. The majority (44 records; 85%) also referred to the types of stakeholders involved; 30 different types were identified, of which the ‘researcher/principal investigator’ was the most frequently mentioned (20 records; 38%). Twenty-eight records (54%) also provided information on digital systems used by funders and HEIs to support award management and reporting; in these, we captured 37 systems/initiatives in total.
To help simplify the complex process of post-award management and gauge the level of effort that may be involved, the processes, tasks and reporting requirements extracted from records were condensed into a list of ‘items’ and divided into 14 categories representing the components of the post-award management process (Table 4). Items include any processes, considerations or tasks also handled by non-research HEI staff, such as research managers, finance and administrative support staff.25 Based on the number of records citing items in each category, the top five categories were: Award set up, compliance and close-out; Financial management and financial reporting; Progress reporting; End-of-grant reporting; and Digital tracking of outputs, outcomes, and impacts.
Category | List of processes, tasks and information requirements related to post-award management and reporting (items) | References |
---|---|---|
Award set up, compliance and close-out | Notice of Award, Grant acceptance and special conditions, Contract negotiation, Account set-up, Reviewing award T&Cs, Subcontract management compliance, Effort reporting (add/remove effort or personnel), Programmatic and financial compliance, Data management compliance, Monitoring and evaluation compliance, Open Access policy compliance, Grant closure and technical closeout, Due diligence, Export control, Animal testing licenses, Institutional concordats, Auditing, Beyond award (Record retention, Property control, Invention control), Institutional Review Board protocol, Invention statement, Ethical clearance, Governance compliance, Subcontract monitoring and compliance, Inactivating projects, Hiring, Travel arrangements, Payment to research participants, technology transfer, statutory returns | 6,7,14,19,25,29–31,40,42,48,52,65,67,75,79,80,84,87,90,93 |
Financial management and financial reporting | Accounting and monitoring award balances (accounts payable and receivable), Payroll, Financial reports, fiscal oversight, Budgeting and budget forecasting, Burn rates, Financial statements (Staff expenses, Payments, processing of Invoices), Financial conflict of interest, fiscal closeout, Final financial accounting/reconciliation, Financial acquittals, Expenditure for contracted services, Subcontract-related invoicing, Record preparation and review, Processing grant personnel (internal and external), purchasing transactions, Just-in-Time requests, Cost sharing/transfers | 1,6,7,14,30,31,42,45,52,58,65,75,79,80,85,87,89,92,93 |
Progress reporting | Project goal and objectives, Overall Progress, Participants, Milestones and deliverables, Key research outcomes, Achievements and preliminary results, What’s Next, Changes to work plan, Outputs (publications, in-progress publications, non-compliant publications, conference abstracts, books), Authorship, Individual contributions, Presentations, Activities, success stories and testimonials, Barriers to research, Expenditure and justifications, Intellectual Property, Collaborations, Leveraged funds, Patents and products, Courses, meetings and outreach activities, Unexpected events, Personnel changes, Summary of original expected outcomes and planned activities to achieve them, carry over requests, Special reporting requirements | 7,31,35,46,48,52,58,75,81,82,85,90,92,94 |
End-of-grant reporting | Plain English summary, Achievements, Further funding, Engagement activities, Intellectual Property (if relevant), Details on innovation, Collaboration and Interventions/Tools/Services Developed, Impact statement (scientific impact and likely impact from research on the health, economy, community, and policy), Publications linked to grant ID, Summary narrative and how the money was spent, Difficulties that have been encountered, Most challenging/surprising aspects of the project, Advice to others planning a similar project, Strengths and limitations of the project, Post-grant plans | 28,44,47,52,58,75,81,85,86,91 |
Digital tracking of outputs, outcomes, and impacts | Outputs, Products, Outcomes, and Impact reporting on shared digital and open research platforms (Researchfish, ORCID, Gateway to Research, Pure, Converis, ImpactTracker, REDCap, UKRI Funding Service, AMRC Open Research, Wellcome Open Research), PPI question set, Domestic and foreign collaborations, Publications of any type (consultancy reports, policy briefings, journal papers), Post-award employment (Researchfish) | 14,18,19,34,37,38,40,58,95 |
Variation requests | Applications for Changes (Research Plan, Principal Investigator, Institution, Commencement Date), Revision of grant strategy or budgets, Extension requests, Changes in Standard of Work or Key Personnel, Changes in T&Cs of award, Notification of approval for changes to research design or objectives, Submitting requests to pre-/post-award offices for project changes/prior approvals | 14,31,43,47,52,82,89,90 |
In-person and informal reporting | Site visits, presentations to programme officers, Site reviews, Informal discussions, presentation of outcomes and results after project completion, Surveys or interviews with principal investigators (e.g., for qualitative impact assessment) | 8,29,35,42,44,45,58,85 |
Performance reporting | Participant enrolment/recruitment (e.g., for clinical trials), Research income, Journal output and Relative Citation Ratio, Measurable outputs (publications, patents granted, higher degrees awarded, conferences or meetings, references in published policy reports/guidelines), preparing data for reporting to the Research Excellence Framework (REF) | 1,42,49,50,75,79,88,89 |
Data management and accessibility | Data management plan (DMP) and teams, Completion of institutional data repositories (Jisc, Je-S, Gateway to Research, CASRAI, ISNI, CRIS/IR, Integrated Grants Management System, Integrated financial Management System, Worktribe), Open access policy, Accessibility/discoverability of data, Internal grant approval, preparing accessible reports to support institutional and local decision-making | 1,14,39,43,51,67 |
Promoting, publishing, and disseminating research | Selecting Open Access-compliant journals, Promoting projects on digital platforms and social media, or using non-digital methods (posters), publication on organisations’ websites/community platforms, Acknowledging source of funding in publications, Dissemination plan (authorship considerations and acknowledgements), employer affiliation, Plain English summaries of articles, ORCID IDs | 19,44,48,86,92 |
Clinical trial transparency | Adding clinical trial to a public trial registry (ISRCTN, clinicaltrials.gov, EU CTR), Reporting the trial registry number to the funder, Reporting trial outcomes and outputs within a certain timeframe (e.g., within 12 months of study completion) | 28,81,89 |
Security of research | Reporting foreign influence (Financial conflict of interest, Conflicting IP or authorship, Other support, Time commitment to foreign institutions, Foreign government grants or Personal funding), Data security | 14,83 |
Feedback to funders | Written feedback to funder on award experience for programme improvement | 44,46 |
Assessing researcher performance | Researcher CV data (affiliations, outputs, ORCID ID) | 39 |
Based on Table 4, most of the post-award effort for HEIs is likely to centre on the set-up and close-out of research awards. This includes the duties of researchers, research administration (RA) staff, finance departments, and any other stakeholders involved in supporting the processes associated with these post-award stages. Table 4 presents the processes, tasks and information requirements that could be involved, 29 items of which related solely to Award set up, compliance and close-out. Some of these are tasks related to providing ‘assurance’ to funders,14,26 for example due diligence, internal auditing, and reviewing award Terms and Conditions (T&Cs). Others involve funder policies, such as compliance with Open Access of study data, governance, and record retention. Financial management and financial reporting also involve many tasks (17 items; Table 4) required for funder assurance,26 including monitoring award balances, managing payroll, reporting expenditure for contracted services, and producing financial statements.
Reporting is another major focus of post-award effort, based on all the information that may be required for Progress and End-of-grant reporting (Table 4). We found 22 items associated with Progress reporting, referring to specific information requirements for report contents (such as the ‘project goals and objectives’, the ‘overall progress’, ‘milestones and deliverables’, and ‘changes to work plan’), and 15 items associated with End-of-grant reporting (Table 4), which notably include content such as a ‘Plain English summary’, an ‘impact statement’, and ‘post-grant plans’. Some information, such as ‘outputs’, ‘achievements’, and ‘Intellectual Property’, can be requested in both types of report and can also be required for Digital tracking of outputs, outcomes, and impacts (Table 4) on research platforms such as Researchfish.27 Other categories of reporting that add to the overall effort for researchers include In-person and informal reporting (6 items) and Performance reporting (5 items) (Table 4). The former may apply to any project and typically involves researchers presenting their progress to programme officers during site visits, or delivering additional information to funders (e.g., on impact activities) via surveys or interviews. Performance reporting, on the other hand, is more specific to research involving participants (e.g., clinical trials) and involves reporting participant data, such as enrolment and recruitment figures, as well as the outputs and outcomes of these studies (e.g., in compliance with funder clinical trial transparency policies).28
To better understand the value of post-award management and reporting, and why organisations request information from researchers and HEIs throughout and beyond the award period, relevant information from records was collated into common purpose-related statements (i.e., “The purpose of post-award management is …”) (Table 5). In total, 57 purpose-related statements were derived from the collated data, thematically grouped into 14 categories of purpose (Table 5), and arranged in descending order of frequency (based on the number of citing records). Based on this analysis, the most common reasons for needing a post-award management process were: Research impact assessment (14 records) (e.g., “To understand what works in research and leads to impact”), Compliance (12 records) (e.g., “To comply with local legislation and the institutional rules that govern research”), Accountability to sponsors, volunteers and the public (12 records) (e.g., “To satisfy project sponsors and commissioners”), Funder programme development and planning (10 records) (e.g., “For funders to identify gaps where future funding may be needed”), and Ensuring responsible research conduct (9 records) (e.g., “To maintain integrity and ethics in research”). Less common but notable reasons included Securing future funding for researchers (8 records) (e.g., “To maintain funding support and advocate for the need for further research”) and Promoting and protecting the reputations of institutions and researchers (6 records) (e.g., “To promote the profiles of projects and researchers”).
Category | Purpose-related statements reflecting the information collated from publications and reports | References |
---|---|---|
The purpose of post-award management is … | | |
Research impact assessment | To understand what works in research and leads to impact | 34,58 |
| To provide information on broader changes in society and the translation of research (e.g., impact on health) | 47,85,95 |
| To connect original research and grantee outputs and products (e.g., Intellectual Property) with new developments in the field | 8,86,95 |
| For awareness of grantee-reported exchanges with policymakers and public stakeholders (e.g., providing commissioned evidence or serving on advisory committees) | 8,95 |
| To evaluate the impact of research on the number and type (e.g., cross-disciplinary) of collaborations established by investigators as a consequence of the funded research | 77 |
| To advocate for the need to continue funding a particular area of research | 34,58 |
| To measure the translational impact of research on local and international communities | 42 |
Compliance (regulatory, technical, financial, administrative) | To comply with local legislation and the institutional rules that govern research | 7,14,42 |
| To ensure day-to-day compliance with funder conditions and requirements during and beyond the award (e.g., data management, audit activities, record retention) | 30,43,52,87 |
| To comply with funder monitoring policies (e.g., reporting recruitment milestones for trials) | 88 |
| To comply with funders’ fiscal and regulatory policies | 42 |
| To ensure proper use of funds for programmatic operations | 45,65,75 |
Accountability to sponsors, volunteers, and the public | To satisfy project sponsors and commissioners | 34,48 |
| To be transparent and accountable, building trust between funders, research institutions, sponsors and the public | 30,87,91 |
| To track and account for the spending of public and charitable funds | 14,38,49 |
| To demonstrate public and patient involvement in research (e.g., as required by Researchfish) | 95 |
| To motivate research volunteers (e.g., public members) to continue participating in research by demonstrating the impact of their involvement in research | 48 |
| To help governments and public sponsors decide on best allocation of funding resources | 95 |
| To demonstrate payback in the form of new knowledge, contributing to research capacity, patient care and political/economic benefits | 49 |
Funder programme development and planning | For funders to monitor the successes and failures of their programmes, and identify areas for improvement and learning (e.g., develop training) | 52,95 |
| For funders to obtain information that is critical to their goals and the good of society | 82 |
| For funders to identify gaps where future funding may be needed | 40,58 |
| To provide data for annual reporting of funder rates, strategies and impacts, as well as other relevant statistics (e.g., diversity in funding) | 47,84 |
| To use the data gathered to set policies and make the case for more government backing | 19,84 |
| For funders to compare their performance against other organisations | 18,84 |
Ensuring responsible research conduct | To maintain integrity and ethics in research | 14,52 |
| To demonstrate good clinical practice | 48,52,88,89 |
| To demonstrate rigour and effectiveness of studies (e.g., statistical power to answer a research question, high standards of work) | 7 |
| To protect the welfare of research participants | 89 |
| To ensure there is documentation of high-quality research being carried out and standards of good research practice being implemented | 81 |
Transparency and dissemination of research outcomes | To report non-academic publications (e.g., consultancy reports, policy briefings and standards papers) | 95 |
| To provide information on any changes over the grant period, as well as academic outputs, community engagement, and dissemination activities | 47,75,86 |
| To report collaborations (locally and overseas) | 95 |
| To grow the evidence base and inform the work of other organisations | 48 |
| To ensure transparent reporting of studies and outcomes, and to reduce publication bias (e.g., as per clinical trial registration and transparency policies) | 28,38 |
Securing future funding for researchers | To maintain funding support and advocate for the need for further research | 29,38,48,87,89,95 |
| To help demonstrate success in securing further funding as a reported outcome of the award | 86 |
| To build a relationship with the funder through regular communications (e.g., progress reports) | 75 |
Monitoring project progress, achievements, and responding to issues | To monitor the progress of projects against targets and milestones, and course-correct when required (e.g., to help meet enrolment targets) | 52,75,88 |
| To allow researchers to reflect on how well their research is progressing according to their own plans, and what they have discovered | 75 |
| To give researchers the opportunity to ask for help | 75 |
| For funders to respond to issues as they arise with quick intervention (e.g., real-time monitoring) | 46 |
Promoting and protecting the reputations of institutions and researchers | To promote the profile of projects and researchers | 48,87 |
| To track researchers’ careers (e.g., reporting on post-award employment) | 95 |
| To demonstrate good work and responsible use of funds to the sponsor (e.g., through reporting) | 75 |
| To demonstrate the performance of HEIs | 42 |
| To ensure that there are no issues with compliance that may have legal ramifications or jeopardise an investigator’s, or the institution’s, ability to secure future funding | 6 |
Supporting researchers with tasks | For the funder to determine how they can better support the researchers they fund | 52 |
| To capture mentoring activities | 47 |
| To capture how researchers experienced their awards, including how much time was spent on research or indirect research activities (e.g., preparing grant reports) | 47,95 |
| To improve evaluation and monitoring processes (e.g., by capturing and reducing burden for researchers) | 47 |
Reusing information | To reduce duplication and enable data sharing, reproducibility of research, and learning (e.g., by reporting on shared data platforms) | 14,86,95 |
| For universities to collect information on their research and establish repositories for wider access and reuse of data | 40 |
Protecting research from theft | To assess and mitigate potential security risks | 14 |
| To prevent researchers from breaching institutional and government laws on ‘foreign influence’ (e.g., disclosing conflicts of interest or of commitment, receiving other support) | 83 |
Maintaining focus on innovation | To indicate that innovative research has been conducted (e.g., in end-of-grant reports) | 47 |
Improving research management | To gain information on whether/how organisations can improve how they manage research | 49 |
To identify which specific areas of post-award management and reporting may create a perception of unnecessary effort, or may need improvement through solutions or interventions (e.g., new methods of information collection, administrative support), relevant information was extracted from publications and summarised in Table 6. We identified 29 records (55%) reporting unnecessary effort or issues in post-award management, including recommendations or solutions for addressing these issues. These findings were summarised for each record and the summaries categorised based on the area of post-award management (Table 6). The six emerging categories were: Supporting researchers with tasks; Research impact assessment; Data management and accessibility; Reporting and digital tracking of outputs, outcomes, and impacts; Monitoring and evaluation strategies; and Award set up. Of these, issues were most frequently found in ‘Supporting researchers with tasks’ (7 records), ‘Research impact assessment’ (6 records), and ‘Data management and accessibility’ (5 records). Notably, a generalised category, ‘Bureaucracy in research’, summarised a government-commissioned report covering issues across multiple areas of post-award management in the UK.14
Area of post-award management | Publication (and funding field) |
---|---|
Supporting researchers with tasks | Corona Villalobos, 2020 (US federal funding of medical research)29 |
Research impact assessment | Adam et al., 2018 (Health research in Spain)36 |
Data management and accessibility | Burland & Grout, 2016 (UK research)39 |
Reporting and digital tracking of outputs, outcomes, and impacts | Abdullahi et al., 2021 (Small grants in Kenyan health research)44 |
Monitoring and evaluation strategies | Croxson, Hanney and Buxton, 2001 (UK health-related R&D)49 |
Award set up | Riechhardt, 1998 (US federal research)51 |
Bureaucracy in research | Tickell, 2022 (UK research)14 |
The issues identified concern not only post-award processes but also the culture around them. There is a general perception that post-award management involves unnecessary effort with little support, and that numerous areas of the process could benefit from tailored solutions (e.g., new systems, resources or streamlined approaches) to reduce burden or add value to research (Table 6).
The need to better support researchers with post-award tasks was frequently cited.6,7,29–33 This stems from reports of significant workloads, a stressful environment in HEIs and issues in communications and working relationships. Solutions depend on individual HEIs and could range from hiring research administrators (RAs) to improving existing RA-researcher working relationships, ensuring these are collaborative and respectful of the different roles involved in managing and effectively delivering research. For researchers, a better balance in administrative workload would in turn free up more time for research – particularly in overwhelmed settings such as public HEIs and medical schools. This would also ensure that funder investments primarily cover the research rather than grant-related administrative activity, translating into timely research and quicker impact on end-users. Investing in RA support should also help with compliance issues (e.g., late reporting and consequent sanctions), which can occur when HEIs are overburdened with grant management tasks or when researchers are not fully aware of funder requirements. Organising training or onboarding programmes for faculty, or employing research managers with expertise in compliance, are therefore potentially scalable solutions that could apply widely across research settings and disciplines. The efficacy of these solutions, and of HEI grant management functions in general, could also be monitored and measured with metrics – to identify where more support with grant management and other management issues in research (e.g., ethics, recruitment) is needed.
The need to improve methods and culture around research impact assessment (RIA) emerged as another key issue in post-award management.8,34–38 Despite the increased focus on RIA, the criteria organisations apply to measure impact and the methods used to gather award data for RIA may need improvement. Funders are urged to develop better RIA guidelines and address gaps in their frameworks, for example by supplementing standard output-based metrics (e.g., number of publications) with qualitative examples of impact activities. There is also a need for transparency about how impact reports may affect researchers, and for research assessments to include some participation of research end-users. A focus on non-academic impact, and evaluation of the complete research lifecycle, are also encouraged for demonstrating the impact of research on priority setting and funding decisions. At the same time, organisations are urged not to over-focus on impact reporting, to allow more time for the research and improve the impact of research itself. To that end, recognising the value of sharing and re-using research data was also suggested, which would help towards developing common systems, taxonomies and methods of data curation for RIA. Such efforts should be collaborative and maximise the value of existing digital platforms (e.g., Researchfish) to ensure the completeness and quality of the impact data reported in them. Where needed, funders should also ensure they have the expertise and capacity to use these platforms effectively to inform their evaluations and RIA activities.
Issues with the standards, sharing, curation and management of research data were also commonly reported.39–43 With information requirements increasing, but organisations using different, non-interoperable digital systems to store and access the data, researchers find themselves spending more time on reporting because they must manually input information into open repositories, tracked platforms, and funder and HEI grant management systems. In turn, issues in the transfer of information between systems can lead to errors and duplication in data, creating further burden for users. There is therefore a need for a more efficient bi-directional flow of information between HEI systems and those accessed by others (e.g., research sponsors). Suggested solutions include the widespread adoption of data standards and interoperable systems by HEIs, and automated data transfer and grant-linking of outputs. HEIs and funders could also consult data centres and switch to enterprise-wide information and grant management systems, so that all staff are trained and have the access needed for efficient management of award data.
Funders ask researchers to report on awards in a variety of formats, from submitting written progress reports to updating online forms (e.g., Researchfish) with the outputs and impacts of studies. However, overreliance on technology reportedly leads to an increase in reporting requirements, with the extra administrative effort causing delays in research and in the completion of research reports.28,44–46 The need for the information requested is also not always clear, nor are the reasons why funders choose certain reporting strategies over others. Funders could consider what they require as a minimum and strive for ‘risk-proportionate’ monitoring, focusing post-award support on those who need it the most (e.g., early career researchers) and being transparent with researchers about how the information they provide is used. There is also an opportunity to ask researchers for feedback at the time of reporting – for instance, to monitor administrative effort in post-award management or to learn where researchers require more support (e.g., stakeholder engagement). Real-time reporting is another solution that could potentially take the pressure off researchers, lead to better information (e.g., examples of impacts) and enable faster funder responses to any issues with research.
Increasing effort in research goes into ensuring that funders’ investments are justified and demonstrate sufficient return in terms of performance and impact. However, how funders themselves evaluate and report these may not yield accurate indicators of the value and outcomes of research, and will not work equally well across all disciplines and research activities.47–50 For instance, there is a perceived focus on output-based indicators, such as the number of publications, in evaluating performance; however, these will not necessarily correlate with the true benefits of research to health and society. Other factors that are frequently overlooked but may affect the fairness of evaluations include the time lag between outputs and impact (which can be up to 20 years) and administrative burden in the post-award phase. Solutions in the form of improved evaluation strategies have been suggested; for instance, using qualitative indicators of performance or capturing metrics of effort in research and grant management (e.g., “hours of internal administration time”). In developing more equitable and balanced evaluation criteria, funders should also ensure that some groups of researchers (e.g., female investigators) are not disadvantaged, and that performance or research assessments account for unforeseen setbacks to research delivery (e.g., as in the case of Covid-19).
One record described historic issues associated with the set-up of NASA-funded awards; these included significant ‘post-award lag’ in the form of late funding disbursements, a long bureaucratic approvals process, and delay to research.51 These issues, affecting the Space Science Programme, led to grantees labelling the entire grants process ‘unpredictable’ and further caused them post-award burden in the form of a protracted process of grant renewals. Ultimately, NASA achieved reform to the programme and solutions included simplifying the HEI processes required for approvals and set-up, digitising grant tracking and finance, and standardising forms for employees. Shorter grant renewal periods were also given to those requesting minimal changes to funding requirements, and feedback was sought from researchers to inform improvement discussions with other federal agencies.
Tickell’s large-scale review of the UK’s research bureaucracy identified numerous issues in post-award management.14 These concerned the processes of both government funders and HEIs, and included unnecessarily complex and duplicative assurance and reporting requirements, inefficient use of data systems, and administrative pressure on research activities. The biggest contributor to burden in HEIs was in most cases a long and bureaucratic approvals process, while funders were found to place unnecessary scrutiny on assurance and therefore add reporting requirements. Recommendations to the whole sector included simplifying and harmonising data systems and focusing on proportionality and flexibility in grant processes and research assessment. Funders were especially encouraged to streamline their assurance requirements, and HEIs to share their research and data management practices.
Consulting the websites of funders (Table 2) revealed that all websites included information on post-award management processes, although to varying degrees of detail. Overall, we noted considerable variation in the funders’ approaches to monitoring and reporting: differences included the terminology used to describe post-award processes (e.g., ‘scientific reporting’, ‘project management’), specific reporting requirements (e.g., progress reports) and the frequency of reporting (e.g., annual, quarterly), the type of information requested (e.g., impacts) and where it must be reported (e.g., end-of-grant reports vs platforms), and the digital platforms used to support applications and award management. There were also differences in the level of detail relating to guidance and supporting information for researchers and HEIs (such as policies, justification for requirements and explanations of why information is asked for, how it is used, and who uses it). The main points of variation in practices are outlined in more detail below.
• For most funders, specific reporting requirements and the frequency of reporting depend on the grant scheme or funding programme, meaning that the information asked of researchers can vary. Funders may also vary their requirements on a case-by-case basis or use a ‘risk-proportionate’ approach – for instance, as done by the National Institute for Health and Care Research.52 Some funders, however, may use the same approach to monitor all their awards, and we found this to be the case for the Canadian Institutes of Health Research, Alzheimer’s Research UK, the National Research Foundation, and the National Institutes of Health.
• Most funders require submission of periodic progress reports as part of routine project monitoring. As an exception, the Canadian Institutes of Health Research require submission of a single electronic grant report at the end of a study, while the Medical Research Council generally asks that researchers submit study updates annually via Researchfish (although they may ask for updates on progress using other methods).
• In addition to completing progress reports and publishing in journals, some National Institute for Health and Care Research (NIHR) funded researchers must also publish their full study outcomes and outputs on NIHR platforms, namely the NIHR Journals Library (NJL) or the NIHR Open Research platform (depending on programme). Recently, the NJL has transitioned from publishing full study reports to a flexible ‘threaded publication’ approach,20 where for some studies findings can be published as smaller reports and followed by a synopsis of all study outcomes after study completion. Notably, health and social care studies funded by NIHR must also report to the Health Research Authority, and this is in line with the ‘Make it Public’ transparency and openness strategy that the UK now applies to all publicly funded health research.
• In addition to end-of-grant reports, the National Health and Medical Research Council in Australia requires that fellowship award recipients specifically also submit a single-page summary of the research, and all awardees are also required to annually update their electronic CVs to reflect latest grant outputs as part of routine ‘performance reporting’.
• Most of the funders (7 out of 11) require funded clinical trials to be prospectively registered on at least one public trial registry platform (e.g., ISRCTN53) and for trial results to be transparently shared within a feasible time frame of study completion. For the National Institute for Health and Care Research, clinical trial investigators are also required to submit ‘performance reports’ to England’s Clinical Research Network, so that data such as recruitment can be reviewed against national benchmarks and used for publishing annual performance statistics. Trial and intervention study applicants to the Health Research Council New Zealand, on the other hand, are required to plan their own monitoring as part of application and in advance of funding decisions, as studies of this type must undergo periodic review of safety/efficacy and requests for appropriate panels must be made in advance of the project delivery phase.
• Apart from the Canadian Institutes of Health Research and University Grants Committee Hong Kong, most funders use their own in-house digital systems to manage applications and funded awards, as well as for receiving and managing research reports.
• Three funders – National Institute for Health and Care Research, the National Institutes of Health, and the European Research Council – each currently use more than one in-house system for managing studies; however, it must be noted that use of systems can be subject to change, for instance, as funders undergo restructuring or as part of funding programme improvement.
• The National Institute for Health and Care Research and the Medical Research Council both require that researchers register their studies, and annually report their outputs, outcomes, and impacts, via the shared platform Researchfish. For both funders, this is a compulsory reporting requirement for all funded awards that is used for monitoring and research impact assessment in addition to other reports.
• In Singapore, all publicly funded research is managed under a single digital system, the Integrated Grant Management System. This system is accessible to all funders, HEIs and researchers and is used for submission, management and tracking of all funding applications and research reports.
• With the exception of one funder – the University Grants Committee Hong Kong (for whom this information was not found) – all the funders have dedicated web pages for monitoring and reporting requirements. These include guidance and relevant policies, and in some cases detail on what information is asked for (e.g., downloadable report templates). However, the level of detail and notably the focus of policies related to monitoring vary. For instance, the National Institutes of Health and the Canadian Institutes of Health Research both focus their monitoring policy on clinical trial registration and transparent reporting of outputs, whilst the policy of Alzheimer’s Research UK focuses on research impact assessment, and the National Health and Medical Research Council’s on evaluation strategies and ‘innovation’ of grant management practices. Most funders however make sure to update their monitoring policies regularly, although whether this is the case for the National Research Foundation and the European Research Council was unclear from their websites.
• Most of the funders share downloadable report templates on websites, or provide a summary of the type of information they request from researchers. Alzheimer’s Research UK and Health Research Council New Zealand, on the other hand, do not seem to share their report contents on websites, which suggests they may send them directly to grantees (for instance, once the reporting window is open).
• All funders give some indication on their websites as to who in HEIs may be best placed to fulfil certain compliance and monitoring activities (e.g., a project director).
• All funders provide a list of relevant offices and research managers/administrators who researchers and HEIs can contact for information or assistance with managing their research awards.
• Awardees with the National Institute for Health and Care Research receive monitoring support from dedicated research managers and teams, who are specifically assigned to monitor and risk-assess funded contracts and support their researchers with fulfilling reporting requirements. As an example, support is provided by sending researchers reminders of upcoming deadlines for progress and final reports.52
• The Medical Research Council employ a Translational Research board and Research Funding Policy and Delivery team, both of whom help manage awards and respond to researchers’ enquiries or variation requests.
• The National Institutes of Health provide their grant recipients with a ‘Welcome Wagon’ letter as part of early post-award communications, which includes helpful information and resources to help them set up their research and manage their research awards.
• The National Research Foundation in Singapore have a web page with guidance and training videos on award management specifically for researchers.
Post-award management is a basic condition of funding that serves many purposes but varies in the mechanisms and administrative effort involved. This section discusses the findings of the first scoping review on this topic, focusing on their implications for effort in post-award management, the responsibilities involved, and the support that is provided or remains needed. We also discuss the availability of evidence in this space, limitations, and future directions for research, and offer broad recommendations for both funders and HEIs.
Managing funded research involves more than the signing of contracts and completion of progress reports. The landscape of post-award processes and conditions for funding is complex, and we found no clear relationship between organisation or award type and the approach to reporting. Cataloguing and summarising the available evidence, however, allowed us to better understand processes and gauge where most of the effort may lie. We were also able to highlight areas where effort may be perceived as unnecessary and improvements are needed, focusing on solutions and recommendations relevant to funders, HEIs, and researchers.
For HEIs, significant effort is needed for compliance and the post-award set-up of studies, which involves setting the conditions for the award, obtaining necessary approvals and arranging timely funding disbursements,54 as well as ensuring that the correct infrastructure is in place for responsible and compliant management of finances and research data throughout the award. Notably, research has become increasingly digitalised,55 not least because of Covid-19, and HEIs now store vast quantities of research data which they must ensure is accessible, discoverable to others externally, and standardised39 for effective sharing and reuse by the research community. However, siloed approaches to managing data – where funders and HEIs all use their own systems – have led to an overwhelming number of digital platforms, of which 36 were captured in this review alone and most of which lack interoperability, resulting in duplication of effort for users.56 Although numerous collaborative initiatives39–41 (such as Jisc and Current Research Information Systems-Institutional Repositories (CRIS-IR)) now provide HEIs with solutions for better system interoperability and data sharing with funders, these have yet to be standardised across the research sector,14 and there is still room to reduce manual effort in, and improve, the transfer of grant-linked research data between systems. Moreover, while HEIs are encouraged to engage with specialised data services to improve the accessibility and reuse value of the data they hold, evidence that funders also engage with these services seems to be lacking,40 and inconsistency in how funders themselves use technology for tracking research outputs, outcomes, and impacts may explain why research data is still not being efficiently shared across sectors,28 perpetuating unnecessary effort and research waste for users.
Importantly, too many digital sources of data can also affect funders’ abilities to perform research impact assessments (RIA),57,58 which in today’s ‘accountability climate’ are crucial for demonstrating that research impact is ‘measurable’ (e.g., resulting in new policy or technology) and for the continued support of the funders’ research programmes.34,58,59 While the specific reporting requirements of funders may vary, we found they consistently request that researchers anticipate and report on the impacts of the research they fund. Moreover, the reported information must be relevant and updated after study completion, and as such tends to be collected frequently and using multiple methods, including progress reports, end-of-grant reports, impact statements, and online submissions to tracked impact platforms (such as Researchfish). However, research has shown that having a ‘plethora’ of data sources available to funders for RIA does not mean that assessments are always useful to funders, and there remains no consensus on RIA frameworks, or the meaning of ‘impact’.35,36,58,60 This brings into question the end value of impact reporting to stakeholders, and of adding more effort to this activity on both sides of the award.58 For instance, while there is evidence that new frameworks and tools for capturing broader impact data are being developed,8 the ‘value add’ of these versus the costs (in terms of money and effort) should also be considered to avoid placing unnecessary burden on routine funding operations or on the delivery of research activities.
Researchers already report struggling with routine reporting requirements,61 as well as the multiple systems used to track research data and having to manually link study outputs with the identifiers of research awards.39 With respect to the accuracy of the data reported, some argue there is still room for improvement34 and for funders and HEIs this may mean training researchers in ‘impact literacy’62 or explaining more clearly the type of impact information they should be reporting. The need for certain types of reporting, such as progress reports, and the need to include impact data in these reports is also up for debate, as it is suggested that funders mostly rely on end-of-grant reports or tracked platforms to collect data for retrospective analyses of the overall and long-term impacts of studies.58 Indeed, we found that not all funders require progress reports of researchers, and reducing effort in this area may therefore mean giving researchers more autonomy as to how they update funders on the progress of their awards – for instance, allowing them to report in real time or through more direct communication channels with funders.8
Much of the post-award effort discussed in this work is shared between researchers and many other staff within funders and HEIs, who help coordinate and deliver the complicated post-award management process.63,64 However, we found that specific responsibilities for requirements, and the level of support offered to researchers in HEIs, depend on the availability of research management and administrative (RMA) infrastructure (such as the availability of RAs) and other factors in institutional set-up.6,7,29–32,65 As such, while certain award tasks, such as negotiation of contracts and hiring, can be delegated to relevant HEI departments (such as Human Resources and Finance), the level of support offered for other activities – such as review and approval of research operations, managing direct information requests and reporting – is not always clear due to differences in HEI facilities, resources, and internal funding. In addition, RMA appears to vary heavily by country,65–67 and even where it is readily available (such as in dedicated ‘grant offices’ or ‘post-award offices’3) a ‘systematic problem’ of administrative burden and issues with compliance is still being reported,68 with issues stemming from factors such as overburdened central offices,31 poor leadership,42 inadequate training,30,65 and ineffective relationships between researchers and administrators.29 The concern therefore is that lacking the needed support for post-award tasks may affect the timely delivery of research and reduce its impact, as well as the return on funders’ investments. As such, adoption or improvement of RMA and grant support functions in HEIs may be necessary, with the onus then on governments and funders to deliver the infrastructure and training required,69 investing in better research support to fund better quality research.70
There is evidence that strategies to improve funding systems now include efforts to optimise grant management processes15–17,47,71,72; however, as a research area, we believe the post-award phase may still be in its formative stage. Compared with topics like grantsmanship and peer review,11,73,74 evidence on post-award practice is limited in scale and robustness and focuses on interventions (e.g., training, alternative mechanisms), and the literature aimed at researchers is mostly about improving the quality of grant applications rather than what happens post-award or improving post-award skills (such as the writing of progress reports75).
Most literature in the post-award space also focuses on research impact assessment (RIA) and tends to be high-level, exploring RIA frameworks and strategies in isolation of the reporting that funders require (see Refs. 59, 76 for examples). We found only six publications8,34–38 that linked RIA back to the funders’ methods of information collection, and which considered the feasibility of funders being able to collect certain types of data (e.g., qualitative impact data) for the purpose of monitoring and RIA activities. These publications were useful as they showed how some solutions for funders (e.g., improving the accuracy of impact data) may affect individual reporting requirements (e.g., the need for telephone interviews with researchers) and the implications on effort for the evaluators and researchers directly involved in reporting.8 However, literature focusing just on how research is monitored by funders is sparser, and a large proportion of information had to be gathered either directly from funder websites or funding reports, or from publications and theses on award management systems. Much of the evidence on perceived unnecessary effort in post-award management was also gained from grey literature (e.g., opinion pieces37), with little observational data available, such as from faculty interviews and surveys with staff (as in Ref. 29).
Ultimately, we believe the reason for the lack of research in monitoring and reporting is simple: that feeding back to funders on research that has already been funded is generally seen as well-justified and less onerous than applying for funding or undergoing grant application review. For instance, we found that no record argued against funders needing to oversee their research investments in general, and the literature instead provided a catalogue of reasons why monitoring research is important to numerous stakeholders within and beyond academia. As such, despite how researchers may experience the effort that goes into managing research and complying with funders’ requirements, we and others believe they are still likely to see this effort as a ‘necessary burden’ in funding, which will seldom deter them from applying for funding or continuing working in research.3
Nevertheless, it is important to raise awareness of any unnecessary effort or issues with practices that have been accepted in the past but are now impacting on efficiency in funding processes or today’s research culture. To that end, the research sector will benefit from this review of previous work, and more exploratory research,42 independent reviews,14 needs assessments33 and systematic comparisons of practices. Notably for funders and HEIs, strategic changes should focus on ‘grant implementation’, ‘in-grant management’, and ‘digital platforms’,14 and the success of any future interventions (whether it is guidance and training or integration of new systems) should be prospectively evaluated (such as in Ref. 77) or followed up to determine the long-term effects as the research sector evolves and new burdens arise. Continuing to capture ‘effort’ and the experiences and perceptions of stakeholders is also crucial going forward, and in our opinion such assessments could complement the development of administrative ‘indicators’47 to appraise where effort in post-award management is concentrated, assess its end value to research, and identify areas for further improvement.
We have drawn on the evidence of common issues and potential solutions in post-award management (Table 6) to inform key recommendations for funders and HEIs. We believe these recommendations will be relevant to many funders internationally and could facilitate effective future changes to reduce unnecessary effort in research14 or identify where more research is needed to inform feasible opportunities for improvement in post-award management.
• Funders should aim to simplify and harmonise their practices for monitoring and evaluation, ensuring they continue to collaborate, follow evidence of best practice, and attempt to overcome difficulties – such as being able to accommodate organisational differences (e.g., in priorities and monitoring requirements).
• Funders should evaluate and improve the frameworks they use for RIA but consider the effort involved for researchers and evaluators when changing monitoring and reporting requirements.
• Funders should ensure they engage HEIs, researchers, RMA experts, and any other relevant research stakeholders and sponsors, when making future decisions on practices that may impact research activity (such as adding reporting requirements).
• Funders should make sure they clarify for researchers the purpose of post-award information requirements, what happens to the data and who uses it, as well as what data (e.g., related to impact) is relevant to report.
• Funders should ask researchers and HEIs for feedback on their programmes as part of routine reporting activity and include in this the time spent on administrative activities as a ‘metric’ of effort or burden.
• Funders should streamline their compliance and assurance requirements to reduce duplication in HEI processes and delay to the start of research – for instance, by adopting a principle of ‘ask once’ when requesting information from HEIs.
• HEIs should strengthen their grant management capacities and recognise the importance of non-research personnel in assisting with research operations, particularly the role of research managers and administrators in enhancing the quality and success of research.
• More HEIs should make use of existing networks (such as the Association for Research Managers and Administrators) to share information and resources on effective award management. This could help towards harmonising HEI practices, such as through the use of more standardised processes (e.g., the Lambert agreement for research contracts) and common information management systems.
• HEIs should provide regular feedback to funders to drive continuous improvement in post-award management.
• Funders and HEIs should work with relevant suppliers (such as Jisc) to further improve interoperability between digital systems to better share information and reduce duplication in data management and reporting.
• Funders and HEIs should work with relevant organisations (such as euroCRIS) to improve transferability of grant-linked data between HEI and external systems, and promote application of universal standards to validate data.
• Funders and HEIs should both better support researchers through the post-award phase and with reporting activities, helping them build better working relationships, and secure future funding through the ultimate success of their research.
The broad nature of the topic and the breadth of terminology used to describe post-award management in the literature made screening for relevant papers more challenging than initially anticipated. Given that we also had to manage a large number of citations and employ strict criteria for eligible literature, it is possible that some relevant articles and resources have been missed. Nevertheless, the literature sample we obtained contained more research data than anticipated and provides the key information needed to appraise the landscape of post-award management. We attempted to capture current funder practices as accurately as possible but acknowledge, first, that our funder sample is small and predominantly in the biomedical/health space, and second, that funders’ websites will not necessarily capture every operational nuance and policy regarding their post-award practices. It also needs mentioning that research practices are constantly changing, and there is therefore a limit to how much current detail on practices can be obtained without consulting funders or HEIs themselves (for instance, as done in Ref. 78). Finally, we note that the findings of the review, and especially any assessments of ‘effort’, should not be seen as reflective of all real-life experiences in research; the perspectives of researchers on funding and post-award management will vary, and we have recently shown this when interviewing researchers in the UK.61
The overall need to manage and report on research is clear and widely appreciated. However, the effort can be considerable, and reports of where it is perceived as unnecessary need the support of more rigorous evidence, and of consultations between researchers, HEIs, funders and other relevant stakeholders, so that key administrative barriers to efficient research delivery can be identified and addressed more collaboratively in a connected, interoperable research environment. In the meantime, HEIs and researchers could benefit from more administrative support services, and researchers could particularly benefit from guidance on ‘impact’ and training in post-award management. Funders could also find ways of reducing duplication and research waste in reporting, with a goal of minimising the effort required to report whilst increasing its value and accountability to research end-users.
The authors would like to thank the NIHR Coordinating Centre monitoring team based in Southampton, and Sheetal Bhurke, for permission to use an internal report comparing UK funder monitoring practices52 in the review. We also thank all the NIHR staff who took the time to review the relevance of data and for contributing to the revision and editing of early versions of the manuscript.