How incident reporting systems can stimulate social and participative learning: A mixed-methods study.

Incident reporting systems (IRSs) have been widely adopted in healthcare, calling for the investigation of serious incidents to understand what causes patient harm. In this article, we study how the Dutch IRS contributed to social and participative learning from incidents. We integrate quantitative and qualitative data in a mixed-methods design. Between 1 July 2013 and 31 March 2019, Dutch hospitals reported and investigated 4667 incidents. Healthcare inspectors scored all investigations to assess hospitals' learning process following incidents. We analysed whether, and on what aspects, hospitals improved over time. Additionally, we draw on semi-structured interviews with incident investigators, quality managers, healthcare inspectors and healthcare professionals. Healthcare inspectors scored incident investigation reports higher over time, suggesting that hospitals conduct better investigations or have become adept at writing reports in line with inspectors' expectations. Our qualitative data suggest the IRS contributed to practices that support social and participative learning (the professionalisation of incident investigation teams, the increased involvement of patients and families in investigations) and to practices that do not (not linking the learning of investigation teams to that of professionals, not consistently monitoring the recommendations that investigations identify). The IRS thus both hits and misses the mark. We learned that IRSs need to be responsive to the (developing) capabilities of healthcare providers to investigate and learn from incidents, if the IRS is to stimulate social and participative learning from incidents.


Introduction
The idea that incident reporting holds an important key to improving the safety of healthcare is well-established [1,2]. Adapted from high-risk industries, the premise of incident reporting is that by reporting and investigating incidents, we might understand what causes or contributes to patient harm, so that preventive strategies can be devised and healthcare made safer [3,4]. In many countries, incident reporting systems (IRSs) have been set up with the aim to learn from incidents [5,6]. Research has shown, however, that IRSs struggle to foster learning [5,7-9]. In these studies, learning from incidents is understood as being able to prevent future incidents, so that learning is believed to have occurred when fewer incidents are reported. When the effectiveness of IRSs is evaluated in terms of the number of incidents reported, IRSs frustrate or disappoint [10,11]. IRSs fail to demonstrate progress, suggesting that learning has not occurred [12,13]. We argue that such evaluations are problematic as they work with impoverished conceptualisations of what learning is (generally confusing learning with performance [14]), neglect how definitions of what constitutes an incident shift [15,16] and are inattentive to how more reported incidents might reflect a safety-minded organisational culture rather than poor performance [17,18].
In the Netherlands, the Dutch Health and Youth Care Inspectorate (further: Inspectorate), the national regulator tasked with monitoring quality and safety of care, has designed and maintains a national IRS for hospitals. The Dutch IRS focuses on hospitals' learning processes following sentinel events (further: SEs) and was designed with the idea that it should 'lead to social and participative learning at the local level' (see box A in appendix A for the type of incidents reported in the Netherlands and the role of the Inspectorate). [16] Rather than assessing what hospitals learn from SEs, the Inspectorate monitors how hospitals learn from SEs, inquiring if hospitals learn to learn from SEs. [16] Specifically, the Inspectorate monitors hospitals' ability to investigate incidents and identify fitting corrective actions. In order to monitor 'the quality of the learning process' of hospitals [16], the Inspectorate developed a scoring instrument that sets forth key conditions to properly investigate and learn from SEs (Box 1). In line with this instrument, the Inspectorate published a guideline, informing hospitals on what the Inspectorate expects from an investigation. [19] Since July 2013, every SE reported and investigated by hospitals is scored by the Inspectorate. [16]

Box 1: Scoring instrument to assess the quality of the SE analysis report (table columns: Item, Judgement of inspectors).
In this article, we study the effects of the Dutch IRS on the local learning process of hospitals. In line with the aims of the IRS, we approach learning from incidents as a social and participative practice, drawing on the work of Macrae [7] and Ramanujan and Goodman [14]. Learning from incidents, for Macrae, 'involves people actively reflecting on and reorganising shared knowledge, technologies and practices. It is these processes of action and reorganisation that constitute learning and must be supported through investigation and improvement.' [7] For Ramanujan and Goodman, 'learning represents a shared understanding among group members of a new course of action to minimize or prevent the recurrence of negative events. (...) If learning does take place from the event analysis, this new repertoire would be shared, stored, and enacted at the appropriate time.' [14] Our study is guided by the question: How does the Dutch IRS stimulate social and participative learning from incidents?

Methods
To answer our research question, we adopted a sequential mixed-methods study design. Drawing on quantitative and qualitative data, we aim to generate a more comprehensive understanding of the effects of the Dutch IRS. [20,21] We present and integrate quantitative data on scored SE investigation reports and qualitative data on how SE investigators perceive the effects of the IRS on their investigation practices and learning processes.

Database of SE investigation reports
As researchers, we were granted access to an Excel export that listed 4667 scored SE reports from all 96 hospitals in the Netherlands, between 1 July 2013 and 31 March 2019. We received an anonymised version and could not link hospitals to individual SE reports. The database shows how inspectors scored each of the 25 items for each SE investigation report. If an item is adequately addressed, it receives a 'yes' and is scored as '1'. If a report does not adequately address an item, it receives a 'no' and is scored as '0'. When it is unclear to inspectors whether something was or was not done, they score a '?', which also counts as '0'. If an item is deemed inapplicable, it is removed from the set of items that make up the total score the report receives. Based on the items scored, each report receives an overall score, expressed as a percentage from 0% to 100%. Multiple inspectors score individual reports, which are discussed in weekly multidisciplinary meetings, as a result of which scores may be amended. [35] Given our interest in how an IRS might stimulate social and participative learning, the database with scored SE investigation reports potentially indicates whether, and on what items, hospitals improved their capability to investigate SEs. We draw on qualitative research to understand what happens behind the numbers.
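The scoring rule described above can be sketched as a short calculation. The function name and data layout below are our own illustration of the rule, not the Inspectorate's actual tooling.

```python
def score_report(items):
    """Compute the overall score of an SE investigation report.

    `items` maps each scored item to one of:
    'yes' (adequately addressed, counts as 1),
    'no' or '?' (counts as 0), or
    'n/a' (inapplicable: removed from the denominator).
    Returns a percentage between 0 and 100, or None if no item applies.
    """
    applicable = [v for v in items.values() if v != 'n/a']
    if not applicable:
        return None
    points = sum(1 for v in applicable if v == 'yes')
    return 100 * points / len(applicable)

# Illustrative report: 3 items adequately addressed, 1 not,
# 1 unclear ('?'), 1 inapplicable -> 3 of 5 applicable items
report = {'item1': 'yes', 'item2': 'yes', 'item3': 'yes',
          'item4': 'no', 'item5': '?', 'item6': 'n/a'}
print(score_report(report))  # 60.0
```

Note how an inapplicable item shrinks the denominator rather than counting against the report, which is why two reports with the same number of 'yes' scores can receive different percentages.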

Qualitative research on the effects of the Dutch IRS
Since 2015, all authors except MV have been involved in various research projects that studied the effects of the Dutch IRS. [33-36] All of these projects included qualitative, ethnographic research. In all, we conducted 73 semi-structured interviews and 36 hours of ethnographic observations. In this article, we present data collected within two projects specifically (Table 1). In the first project, the objective was to explore how hospitals organise their SE investigation practices, how managers and SE investigators perceive the effects of investigating SEs on their learning processes and what challenges they encounter. In the second project, following the first and other research projects into the Dutch IRS, the objective was to review and synthesise findings from studies conducted in the collaborative on the effects of IRSs on learning and, with stakeholders, to think about how the Dutch IRS could be developed further.
In both projects, sampling was purposive. While depth was strived for in the first project (aiming to reach data saturation), breadth was strived for in the second project (soliciting insights from inspectors supervising a variety of care sectors and other stakeholders). All semi-structured interviews were structured using interview guides. Interview guides listed themes of interest and were amended in light of findings from preceding interviews. Interviews were digitally recorded following respondents' consent and transcribed verbatim.

Table 1. Overview of the two research projects.

Project 1: Respondents were approached via email and informed about the objective of the research in this email. In the email, the voluntary nature of participation was stressed, as was the fact that data would be fully anonymised. All approached respondents agreed to participate. During interviews, internal incident investigation protocols and related documentation (meeting minutes, agendas, report formats etc.) were reviewed and, when possible/appropriate, hard copies were collected for further analysis. We have discussed the methods used to conduct this study more in-depth elsewhere. [33]

Project 2 (Jan 2017 - May 2018; DdK and KG): 8 semi-structured interviews with (former) healthcare inspectors involved in designing and/or monitoring the IRS. Respondents included inspectors involved in scoring SE investigation reports of hospitals, as well as inspectors regulating other healthcare sectors (e.g. mental health care). Interviews lasted 57-103 minutes (total 10 respondents). Respondents were approached via email and informed about the objective of the research in this email. In the email, the voluntary nature of participation was stressed, as was the fact that data would be fully anonymised. All approached respondents agreed to participate. Focus groups were held with 1) healthcare inspectors (3 h), 2) healthcare managers and professionals (3 h), 3) the Dutch Ministry of Health (1.5 h) and 4) citizens (5 h). Field notes were made during the focus groups. Policy documents of the Inspectorate on the Dutch IRS were analysed in order to understand the historical development of the IRS. We have discussed the methods used to conduct this study more in-depth elsewhere. [36]

Database of SE investigation reports
Descriptive statistics were applied in analysing the 4667 scored SE reports. To study changes over time, we calculated, per quarter, how SE reports scored on each of the 25 items scored by the Inspectorate, as the percentage of reports adequately addressing each item. We also determined the average final score awarded to SE reports over time. Following two meetings with inspectors and a statistician of the Inspectorate, who were intimately familiar with the data and with how the scoring instrument was developed and used over time, we revisited the data and constructed groups of hospitals. To construct the groups, the initial year (01-07-2013/01-07-2014) was used to calculate the average score of the SE reports by each of the 96 hospitals. Hospitals that reported fewer than three SEs during the initial year were not assigned to groups (n = 16 hospitals). The 80 remaining hospitals were assigned to one of four quartiles, based on average scores (Table 2). We merged the two groups in between the 'low' (n = 20) and 'high' (n = 20) scoring hospitals, referring to that group as the 'middle' (n = 40). Our reasons for doing so are informed by the Inspectorate's ideas about how hospitals should learn from SEs. [16,35,36] For one, the Inspectorate 'tailors its regulatory practices to the learning capabilities and the developmental stages of healthcare providers.' [18] Second, conducting good SE investigations is thought to be a skill that hospitals develop over time. [16,35,36] So, while hospital performance, in terms of SE scores, might be benchmarked against other hospitals that are in similar developmental stages, the Inspectorate is particularly interested in whether hospitals improve over time. [16,35,36] Plotting the development of average SE scores for all hospitals over time would mask differences between hospitals.
Therefore, we constructed groups of hospitals that remain stable over time: a low and a high scoring group of 20 hospitals each, and a middle group formed by merging the two quartiles in between. We expect that group construction based on the SE scores received during the first year serves as an approximation of hospitals' learning capabilities and the developmental stages they are in.
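The grouping procedure described above can be sketched as follows. The data layout (a mapping from hospital to its initial-year report scores) and the function name are illustrative assumptions, not the study's actual analysis code.

```python
def assign_groups(hospital_scores, min_reports=3):
    """Group hospitals by average SE report score in the initial year.

    `hospital_scores` maps each hospital to a list of the scores its SE
    reports received during the initial year. Hospitals with fewer than
    `min_reports` reports are excluded, mirroring the study's exclusion
    of hospitals reporting fewer than three SEs.
    """
    eligible = {h: sum(s) / len(s)
                for h, s in hospital_scores.items()
                if len(s) >= min_reports}
    # Rank by average initial-year score and split into four quartiles;
    # the two middle quartiles are merged into one 'middle' group.
    ranked = sorted(eligible, key=eligible.get)
    q = len(ranked) // 4
    return {
        'low': ranked[:q],
        'middle': ranked[q:3 * q],
        'high': ranked[3 * q:],
    }
```

With the study's 80 eligible hospitals, each quartile holds 20 hospitals, yielding groups of 20 (low), 40 (middle) and 20 (high).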

Semi-structured interviews
The transcribed interviews were analysed thematically, with the aim of identifying themes. [22] The concept of learning as a social and participative practice functioned as a sensitizing concept that guided but did not restrict our analysis. DdK and JK individually analysed two interviews each, identifying themes. Following that, DdK and JK reviewed the coded material and developed a coding scheme through iterative discussions and multiple meetings between both authors. DdK and JK coded the remaining interviews with the coding scheme in Microsoft Word, at times refining or adding codes to the scheme. The coding scheme and the themes identified were discussed among all authors. Consensus was reached over the course of two meetings with all authors.

Results
We identified five core themes that we formulate as practices the IRS can contribute to. Respondents linked the IRS to: 1) changed staff attitudes and increased reporting, 2) improved SE investigations, 3) participative learning, 4) local learning, and 5) recommendations that improve quality and safety of care. These themes order our results and we present quantitative and qualitative data per theme.

Changed staff attitudes and increased reporting
Several hospital respondents report that the IRS contributed to changed attitudes towards patient safety, helping to generate what they call 'safety thinking'. Safety thinking refers to a way in which professionals approach their work, cognizant of the risks their work holds. Also, respondents credit the IRS with stressing the need for reporting SEs.
R1: When I compare where we were five, six years ago with today, we've really developed. Also just in terms of the SEs we report. We never had SEs... R2: (laughs) R1: You had nothing to worry about when you visited our hospital; things did not go wrong... Now we report 12 SEs each year. (Investigation committee chair and incident investigator, 20-09-2016) Many hospital respondents state that they report and investigate more SEs now than in the past. This is supported by data of the Inspectorate that shows how, since 2009, reported SEs have steadily increased (Figure A in appendix A). The quote also shows that what (the number of) reported SEs tell us has changed. 'Before,' an inspector told us, 'no SEs meant you were the best organisation. Now, when an organisation reports no SEs, something's not right' (Inspector, 30-05-2017). Thought of as reflective of an organisational safety culture, the number of reported SEs becomes a quality metric in its own right, but one that says little about how organisations are able to learn from them. [7,23]

Improved SE investigations
A key aim of the Dutch IRS was to have hospitals improve their capability to investigate SEs, as an important step towards learning from SEs. [16] For how SE reports have been scored by inspectors since 2013, see Fig. 1 in this text and figures B-G in appendix A.
We might conclude that the high scoring group of hospitals already did fairly well, having many of the conditions for conducting SE analysis in place, and that the low scoring group of hospitals, in particular, developed. From Q4 2015 onwards, some two years after SE reports were first scored in accordance with the new scoring instrument, the average SE scores of low and high scoring hospitals converge. The IRS offers the opportunity to zoom in further, on specific items scored. This is potentially insightful given that not all items are equally easy to perform well on. Doing well on some items (e.g. 'Do the corrective actions address the identified root causes?') requires more expertise and work from investigation committees than others (e.g. 'Is the method for analysis specified?'). Moreover, while each item is granted equal weight in the final score of a report, inspectors deem some items more important than others. [34] We selected three specific items scored by the IRS that, according to inspectors, adequately reflect the capability to conduct SE investigations (see figures C-E in appendix A). [34] As to the weight attributed to these items by inspectors, one inspector notes: What happened [leading up to and during the SE] has to be clear (...) so I can tell if the root causes are properly identified. This is where it starts; it determines the next steps and whether or not these steps make sense. (Inspector, 1-11-2016) The items that inspectors emphasise are sequential in the sense that each builds upon the previous. The quality of an investigation, multiple inspectors report, starts with adequately addressing the 'why' question (figure C), so that the root causes might be identified (figure D) and corrective actions devised that address those root causes (figure E).
While the data clearly shows progress of hospital scores over time, we cannot determine based on this data whether hospitals have become better at investigating SEs or if hospitals have become more adept at writing SE reports in line with the scoring instrument of the Inspectorate. From our interviews, we know respondents are well aware of what needs to be in the SE report. Also, the score awarded to SE reports is interpreted by hospital respondents as a 'grade' and the investigation becomes a practice respondents want to score well on.
If the Inspectorate wants us to note down how many hours we have spent doing something, or whatever criteria they have thought of, well then we add it to our checklist of things to add to the report. We want to score 100%. (Committee chair, 20-09-2016) Hospitals have invested in the professionalisation of investigation teams, emphasised and argued for in multiple studies [8,24], by training them in methods on how to conduct SE investigations and by keeping teams consistent, allowing investigators to develop expertise. But dedicated teams are also needed due to the increased numbers of SEs that are reported and need to be investigated.
These investigations take so much time. Medical specialists do them on the side, while a dedicated [investigation] team develops experience [with SE investigations] so that the quality of investigations is consistent. And yeah, it takes an incredible amount of time... and you want the investigations to be of good quality. (...) These reports go to the Inspectorate. (Medical doctor, 18-08-2016) As hospitals increasingly set up dedicated teams in response to the increasing numbers of SEs that need to be investigated, coupled with the desire to 'score' well, conducting SE investigations becomes a particular organisational activity and responsibility, targeted at creating reports that fit the requirements of the Inspectorate. Input from concerned professionals, especially in the recommendation phase, is often not taken seriously.
I: What if professionals don't agree with the root causes you've identified and the recommendations you propose. . . Does that happen? R: Yeah, sure, that happens (laughs). Um, so, with the investigators we'll look at the response [of the professionals]. What do we think? Are they correct? And are we going to change that? If we believe that it does not fit the investigation we conducted, we do not change it in the report. (Committee chair, 28-06-2016) Another hospital respondent told us that when professionals disagree with the recommendations of the investigation team, the team is willing to consider the professionals' perspective when it identifies 'errors' in the report, but that when '[professionals] think our recommendations are radical or something else, well. . ., it's our recommendation' (Medical doctor, 18-08-2016). Investigators develop recommendations in light of how the Inspectorate scores them-as fitting the analysis-rather than if they contribute to the quality and safety of care practices.

Participative learning
The importance of involving patients and families in incident investigations is increasingly recognised, spurred by the idea that healthcare can learn from patients' and families' perspectives [25-27]. In the Dutch IRS, hospitals are expected to involve patients and families in SE investigations; as such, the IRS encouraged hospitals to widen the circle of people able to participate in and contribute to SE investigations.
Yeah, [involving patients and families in SE investigations] it's something we've wanted for some time, thinking 'we need to do this, this is important'. But to actually start doing it is quite a big step. (...) So on the one hand, we were motivated to involve patients and families, having heard how important it is, and on the other hand, the pressure from the Inspectorate to start doing this... it helped. (Medical doctor, 28-06-2016) The quantitative data suggest that, in 2013, involving patients and families in SE investigations was not customary practice (figure F in appendix A). Similarly, the IRS assessed and contributed to the degree to which SE investigation reports are shared with patients and families afterwards (figure G in appendix A). The IRS thus contributed to the normalisation of a practice, the increased involvement of patients and families, that is widely argued for.
But involving patients and families in SE investigations is not the same as learning from them. The IRS operationalises the need 'to engage the patient or a patient representative in SE analysis' [16] by inquiring if 'input was sought from patient/relatives?' The IRS does not specify what constitutes such 'input' or the extent to which hospitals need to involve patients and families. Hospitals, in response to the IRS's encouragement to involve patients and families, have developed different ways of organising said involvement. Typically, however-and we report on practices of patient and family involvement in SE investigations more extensively in our other work [33,37]-incident investigators predetermine the scope and the questions the investigation needs to provide answers to.
[In case of an SE] we [the investigative team] look at: what is the focus of the investigation and, based on that, what do we want to know? We draft the research questions. And then we decide, given all that, who we want to speak to. We schedule appointments with those people and then, basically, we have all the information we need.

Moreover, although hospitals are committed to involving patients and families in SE investigations, when the perspective of patients and families does not align with that of professionals, investigators tend to grant the professional perspective more weight. Hospitals also have different ways of sharing SE investigation reports; while some share reports in full, others provide summaries to patients and families or arrange a face-to-face meeting wherein the investigation's findings are presented to patients and families. While some hospitals explore possibilities for more comprehensive patient and family involvement (e.g. by asking patients and families what kind of questions they would like to see the investigation address), this involvement in SE investigations generally happens on the hospital's terms. Clearly then, the IRS, in inquiring if hospitals solicit input from patients and families, does not attend to or discern between the different ways in which hospitals look to involve patients and families in SE investigations.

Local learning
While investigating SEs is expected to generate learning, the need to investigate SEs is not prompted by the potential learning opportunities an SE holds but by its severity in terms of patient outcome (see Box 1). This, respondents point out, means that organisational resources and time are committed to investigating SEs at the cost of attending to less severe incidents that might hold valuable learning opportunities.

I just came back from a holiday and wanted to get back to my plan on how to take these [SE investigations] to a higher level and then I saw three more SEs in my inbox. (...) It's frustrating; we want to do it the right way... It's like... running; you can train for endurance or for speed. When you do both at the same time, you'll get injured. So we always have to investigate more and, at the same time, the investigations have to be better, because every time we receive feedback [from the Inspectorate] 'you're not doing this well enough'. And it's making me anxious. We get the idea [of the Inspectorate], but we struggle keeping up. (Committee chair, 10-08-2015)

The incessant stream of reported SEs that need to be investigated by hospitals comes at the cost of reflecting on what singular SEs tell a hospital about its quality and safety of care and how findings from particular investigations might generate aggregated learning at a deeper level. Inspectors report similar experiences. As hospitals continue to investigate and report on SEs, inspectors have to keep scoring them. 'What do all these SEs tell us? How might other organisations learn from this? (...) We want to get to those questions, but we don't have the time. We are so caught up in getting these SEs wrapped up... it's overwhelming' (Inspector, 25-09-2017).

Recommendations that improve quality and safety of care
One of the aims of the Dutch IRS was to have hospitals learn to devise corrective actions that fit their context. While figure E seems to suggest hospitals are increasingly capable of doing so, recommendations are scored in light of whether or not they fit the analysis, rather than whether they contribute to safe care practices. Also, hospital respondents acknowledge that it is a challenge to keep track of all the recommendations SE investigations identify.
Sometimes I find out a particular recommendation has just vanished. Then there is a new manager and nobody is able to recall that recommendation. (Incident investigator, 12-07-2016)

Um, we have all these recommendations in an Excel sheet and we try to follow up on these every three months, asking people how they're faring. At times, our annual meeting with the Inspectorate serves as a trigger to think 'oh, right, we still have to do this'. (Incident investigator, 18-05-2016)

Our interviews suggest that hospitals struggle to keep track of and evaluate the effects of the identified recommendations. Respondents suggest that while organisational investment in investigating SEs is considerable, following up on recommendations after the investigation does not receive the same (structured) attention.

Discussion
In drawing on and integrating quantitative and qualitative data on the Dutch IRS, our study suggests that the IRS contributed to a range of practices in hospitals. In terms of its contribution to social and participative learning from SEs, the IRS both hits and misses the mark. Going back to Ramanujan and Goodman's definition of social and participative learning, 'learning represents a shared understanding among group members of a new course of action to minimize or prevent the recurrence of negative events.' [14] Our study finds that while hospitals invest in the training of incident investigators, and while hospital SE investigation reports are scored higher by inspectors over time, the learning process of the investigation teams is not or only poorly connected to that of the involved healthcare professionals. While patients and family members are increasingly involved, their input is not always valued by investigators. The input and perceived value of both patients and professionals are linked to the extent to which they help investigators conduct the investigation as outlined by the IRS. The 'shared understanding of a new course of action' that Ramanujan and Goodman speak of is mostly shared among incident investigators, who, on account of their expertise and the need for an independent investigation, claim ownership over the investigation, which can hamper the participation of others and shared learning. Paradoxically, in the attempt to encourage and measure social and participative learning, the IRS engendered practices of learning that restrict who can truly participate. Investigators can act as gatekeepers of the investigative process; investigations are organisationally cordoned off and participation is valued in light of the standard the Inspectorate holds investigations to.
Moments of reflection and opportunities for aggregated learning, meanwhile, are scarce given the consistent pressure to report and investigate (for hospitals) as well as score (for the Inspectorate) more SEs. This is a trend we can expect to continue as reporting behaviour has become a quality metric in its own right, said to be indicative of a hospital's safety-mindedness and transparency [7]. While corrective actions are adequately identified, they are not consistently monitored or evaluated by hospitals. Also, corrective actions are assessed in terms of their coherence with the SE analysis rather than if or how they are of value for the practice of healthcare professionals. 'If learning does take place from the event analysis,' Ramanujan and Goodman further write, 'this new repertoire would be shared, stored, and enacted at the appropriate time.' [14] The data collected through the IRS shed no light on if and how hospitals share, store or appropriately enact the new repertoire that the investigation ideally results in.
Given that we know that organisations invest in practices that are externally monitored [28,29], it is hardly surprising that hospitals consistently deliver higher scoring SE reports. Still, our findings resist the interpretation that the Dutch IRS is a tick-box exercise hospitals have become increasingly adept at. Asking hospitals whether they sought input from the patient and family generated discussions about the value of patient and family involvement, and hospitals organise for and value such involvement differently. [33] Here we want to point out that the involvement of both patients and professionals in SE investigations is instrumental to the objective of learning from an SE and that the emotional impact of SEs, on patients, families and professionals alike, is not accommodated in these investigations. [30,33,37] As Nicolini et al. already pointed out, failing to engage with and make room for the emotional impact of an SE, in favour of the quest for facts and evidence, can actually hamper learning [30]. Elsewhere, we explore how 'being emotional' renders patients and professionals prone to being disqualified as contributing valuable input in an SE investigation. [37] Now, the IRS does inquire into hospitals' aftercare practices following an SE, for both patients and professionals, that might make room for said impact, even if the IRS does not follow up on how those aftercare practices are organised and valued by those who make use of them. The professionalisation of SE investigators and the reports they deliver is a valuable achievement, even if that also allows a hospital to score well. Our respondents note that knowledge about patient safety has increased as a result of investigations. But although it is acknowledged that investigating incidents 'is just one step in the path to improvement' [16], the IRS risks singling out the investigation as the most important one.
Scoring SE reports as reflective of hospitals' learning process perpetuates, or at least does little to dispel, the mistaken notion that investigating incidents is the same as learning from incidents [7,14,31]. With the aim to encourage and contribute to social and participative learning from incidents, the Dutch IRS monitors a dynamic practice rather than an outcome. However, we conclude that the IRS does not adequately reflect the dynamic practice it monitors. Now that the conditions for hospitals to properly investigate their SEs seem in place, the IRS should redirect its focus to encourage reflection, monitor how shared understanding develops after an SE and stress the linkage between investigating and learning. We propose two ways in which an IRS might further encourage social and participative learning from SEs.
First, there is a need to rethink the emphasis on investigating singular SEs. Investigations are prone to becoming stand-alone activities, disconnected from wider organisational safety practices and learning opportunities. [8,9,32] In the Netherlands, as in other countries, 'the perimeter[s] of patient safety' [15] keep expanding as more events qualify as SEs [16]. As both hospital respondents and inspectors struggle with the number of SEs that have to be investigated and assessed, a continued focus on singular SEs might become untenable. Especially for hospitals that consistently demonstrate the ability to adequately investigate singular SEs, the IRS would do well to accommodate an aggregated level of analysis, encouraging hospitals to reflect on and learn from SEs in connection with their wider safety policies and practices [8,9,32]. Second, there is a need to move beyond investigation practices and monitor how hospitals use SEs to improve daily care practices. Following Ramanujam and Goodman, the IRS can monitor how hospitals work to link the analysis of an SE with learning by posing questions that address how learning is shared, stored and enacted [14]. For example: How did patients and families contribute to your understanding of the SE? How do you link the learning process of the investigation team to that of the professionals working with its solutions? How do you institutionalise and normalise the solutions identified so that they are used in practice? [14] Such open questions encourage hospitals to reflect on how investigation practices (of singular SEs when this is warranted, or at an aggregated level) are meaningful to their safety practices, and enable hospitals to demonstrate ownership of these practices.
Our study has some limitations. The Dutch IRS's focus on the social and participative learning of hospitals following SEs is unique and was developed in response to problems identified in other IRSs, so that our findings are specific to the Dutch IRS. Still, how the Dutch IRS, as a monitoring instrument, encourages and generates particular organisational practices and investments can be valuable for the design and continued development of IRSs with a different focus. Our findings could have been strengthened by the perspectives of healthcare professionals involved in SEs, as well as those of patients. In our focus on how the IRS encourages practices of social and participative learning, we foregrounded the accounts of incident investigators and committee chairs: the professional groups that, in hospitals, organise the investigative practices that aim to support such learning. By conceptualising learning as a social and participative practice, we were able to demonstrate how IRSs can encourage hospitals to develop valuable practices. Drawing from both quantitative and qualitative data, we were able to generate an insightful understanding of the effects of the Dutch IRS.

Conclusion
IRSs can encourage hospitals to develop and invest in practices that contribute to social and participative learning from incidents. IRSs need to be dynamic to accommodate the improved learning capabilities of healthcare providers and encourage continued improvement. When providers succeed in meeting the demands an IRS sets, these demands should be raised to a next level. Continuously raising the bar or adding new elements prevents a plateau effect that would diminish the effectiveness of measures over time and stall further learning. Assessing and stimulating hospitals' learning process with the aid of IRSs is a promising strategy, but its success depends on the consistent evaluation of its effects and on its further development.

Contributors
The authors contributed to the manuscript as follows. DdK, JK, KG, IL and RB contributed to the conception and design of the study. DdK, JK, KG and MV were involved in data collection. DdK, JK, KG, IL and RB performed data analysis and interpretation. DdK and JK drafted the initial manuscript. The manuscript was critically revised by DdK, JK, KG, IL, MV and RB and approved by all authors before being submitted.

Funding
The research projects reported on here were conducted within and supported by the Dutch Academic Collaborative Centre of Supervision, a research collaborative that pairs the Inspectorate with four research institutes with the aim of studying and enhancing the effectivity of the Inspectorate's regulatory practices. The first project was not funded. The second project was supported by ZonMw, the Dutch organisation for Health Research and Development, project number 516004604.

Patient consent for publication
Not required.

Ethics approval
The research presented did not require ethical approval. In the Netherlands, research that does not subject participants to medical interventions and does not dictate particular codes of conduct for participants requires no approval (ccmo.nl). Respondents consented to be interviewed. Contributions were anonymised and respondents were given the opportunity to review their quoted material.

Declaration of competing interest
IL and MV work as inspectors for the Dutch Health and Youth Care Inspectorate. Both IL and MV have been and are involved in the process of designing the IRS and MV is part of a team of inspectors scoring hospital SE reports. IL and MV did not participate in qualitative data collection.