Sustainability in Health care by Allocating Resources Effectively (SHARE) 8: developing, implementing and evaluating an evidence dissemination service in a local healthcare setting

Background: This is the eighth in a series of papers reporting Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. The SHARE Program was a systematic, integrated, evidence-based program for disinvestment within a large Australian health service. One of the aims was to explore methods to deliver existing high-quality synthesised evidence directly to decision-makers to drive decision-making proactively. An Evidence Dissemination Service (EDS) was proposed. While this was conceived as a method to identify disinvestment opportunities, it became clear that it could also be a way to review all practices for consistency with current evidence. This paper reports the development, implementation and evaluation of two models of an in-house EDS.

Methods: Frameworks for development of complex interventions, implementation of evidence-based change, and evaluation and explication of processes and outcomes were adapted and/or applied. Mixed methods including a literature review, surveys, interviews, workshops, audits, document analysis and action research were used to capture barriers, enablers and local needs; identify effective strategies; develop and refine proposals; ascertain feedback and measure outcomes.

Results: Methods to identify, capture, classify, store, repackage, disseminate and facilitate use of synthesised research evidence were investigated. In Model 1, emails containing links to multiple publications were sent to all self-selected participants, who were asked to determine whether they were the relevant decision-maker for any of the topics presented and whether change was required, and to take the relevant action. This voluntary framework did not achieve the aim of ensuring practice was consistent with current evidence. In Model 2, the need for change was established prior to dissemination; a summary of the evidence was then sent to the decision-maker responsible for practice in the relevant area, who was required to take appropriate action and report the outcome. This mandatory governance framework was successful. The factors influencing decisions, processes and outcomes were identified.

Conclusion: An in-house EDS holds promise as a method of identifying disinvestment opportunities and/or reviewing local practice for consistency with current evidence. The resource-intensive nature of delivery of the EDS is a potential barrier. The findings from this study will inform further exploration.

Electronic supplementary material: The online version of this article (10.1186/s12913-018-2932-1) contains supplementary material, which is available to authorized users.


Section 3
Evaluating the change: Evaluation Plans
a. Model 1

Reach
Have Monash Health decision-makers either personally reviewed or nominated a member to receive and report EDS alerts?

Section 4
Survey of decision-makers: Preferred content and format of evidence product
From a survey of Monash Health staff who made decisions about resource allocation.
Full details of all survey questions are in Paper 7 of this series [7].

Type of research publication to inform decisions about health technologies or clinical practices
Respondents were invited to choose as many as applied (n = 106). Results shown as n (%):
- Critical appraisals of primary research: 88 (83.0)
- Full text of secondary research (eg evidence-based guidelines, systematic reviews): 83 (78.3)
- Critical appraisals of secondary research: 79 (74.9)
- Full text of primary research (eg clinical trials): 73 (68.9)
- Abstracts of primary research: 50 (47.2)
- Abstracts of secondary research: 44 (41.5)
- Other*: 7 (6.6)
*Other: consumer perspectives, case studies of other health services, web access to journals, professional guidelines and web access for participation in group-wide trials

Focus of research to inform decisions about health technologies or clinical practices
Respondents were asked to rank at least three preferences, with 1 being the most preferred option.
*Other: consumer-initiated, focused and developed research; international relevance; focus needed depends on the task; skill- or procedure-specific, eg bed management

Format of research dissemination to inform decisions about health technologies or clinical practices
Respondents were asked to rank at least three preferences, with 1 being the most preferred option.

- Decision-makers are frequently unaware of these resources.
- Due to lack of time, knowledge and skills, decision-makers do not actively seek these resources when making decisions and do not differentiate between high- and low-quality resources.
- Cost-effectiveness data is often based on modelling, which is perceived not to reflect reality.

National
- The Medical Services Advisory Committee and Pharmaceutical Benefits Advisory Committee provide evidence-based recommendations for use of medical and surgical procedures and drugs.
- Not all medical and surgical procedures and drugs are covered by these processes.
- Nursing and allied health practices, models of care and clinical consumables are not covered.

State
- Guidance for introduction of new health technologies and clinical practices (TCPs) is provided by DHS. This includes reporting requirements.
- Monash Health has developed tools to implement these processes; DHS has recommended these tools to other health services.
- Monash Health Decision Summaries are published on the health service website.
- DHS requirements and processes are cumbersome.
- There is no sharing of information or decisions. Individual health services duplicate the process of finding and appraising relevant evidence, developing business cases, etc.
- DHS declined to coordinate sharing of information through a central database or website.
- The Victorian Policy Advisory Committee on Technology (VPACT) has an annual funding round for introduction of new high-cost TCPs.
- Respondents were unaware of any long-term state-wide strategic planning for equipment purchases.
- There is a lack of coordination of equipment use and procurement at state level and no communication between health networks.
- Some guidance for purchasing is provided through the Victorian Government Purchasing Guidelines, Medical Equipment Asset Management Framework (MEAMF), Targeted Equipment Replacement Program (TERP) and Health Purchasing Victoria (HPV).
- HPV is responsible for bulk purchasing of pharmaceuticals, clinical equipment and consumables to streamline ordering and reduce costs. If the item required is in the HPV catalogue, the specified brand must be purchased from the designated suppliers at the cost and conditions noted.
- The processes are transparent and accountability is clear.
- The HPV catalogue covers only 30% of Monash Health consumables.
- Inclusion of items in the HPV catalogue is not always based on a rigorous evidence-based process.
- Safer, more effective or more cost-effective alternatives may not be included in the catalogue.
- HPV does not cover large items, so MEAMF and TERP gain no benefits from bulk purchasing and hospitals have to negotiate their own arrangements with suppliers.
- Decision-makers do not know which of these multiple systems are relevant to a particular situation.
- Terminology differs between systems and they are difficult to navigate.
- The Victorian Aids and Equipment Program is administered by Monash Health on behalf of the DHS. The application process is standardised, based on tight explicit criteria for transparency and accountability.
- This is a 'last resort' process after other sources of funding have been exhausted. Clinicians waste valuable time writing funding applications for multiple programs, which could be integrated and allocated centrally.
- The Department of Treasury is interested in supporting disinvestment initiatives but requires details of savings. If savings or reinvestments can be quantified, the department may provide more funding.
- It is hard to measure the savings.
- The savings are rarely realised because they are absorbed and used to treat more patients.

Monash Health environment: General
- Enthusiastic and dedicated staff
- Staff commitment to quality improvement
- Organisational support
- Support from the Executive Management Team
- Support from Directors of Nursing
- Involvement of people who are outside of, or uninterested in, the politics of the organisation

- High staff turnover in the organisation, particularly agency nurses and junior staff, increases difficulty in communication and implementation.
- High staff turnover in projects diminishes organisational knowledge and expertise and increases training requirements.
- Organisational culture is difficult to change.
- Organisational politics.
- Incident reporting software (Riskman) is flawed, does not cover all requirements and does not enable valid aggregation of data related to consumer information.
- Strategic planning provides an opportunity for integrating disinvestment decisions into organisational practices. Monash Health had transparent strategic and business planning processes.
- Confusion about 'who does what'.
- Duplication of some committee and project activities.
- In addition to policies and guidelines, there were supporting documents such as application forms, business case templates, requisition forms and checklists governing activities related to resource allocation, such as purchasing and procurement and development of clinical guidance documents.
- Too much paperwork; existing paperwork is confusing and ambiguous.
- Some documents were not well organised or easily accessible; multiple versions were available and some required considerable skills and resources to complete.
- Emphasis on 'business' aspects and less consideration of evidence of safety, effectiveness and cost-effectiveness in many of these documents.

Transparency and accountability
- Transparency and accountability in decision-making was highly valued by respondents.
- Improved transparency and accountability at Monash Health was desired by most respondents.

Lack of transparency in all aspects
- Lack of transparency and accountability in decision-making reduces confidence.
- Inadequate transparency and accountability was one of the strongest messages from respondents.
- There were clear documented lines of accountability and reporting requirements in some areas.
- Individuals and members of committees at the top of their respective decision-making hierarchies reported that they had a clear understanding of how the processes should work, who is accountable, who makes the decision, etc, and knew the difference between recommendations, decisions and authorisation.
- Many of these respondents also reported that all decision-makers have the same understanding as they do.
- Many individual and group decision-makers lower down the respective hierarchies admitted they were unsure of the processes. Others who said they were sure gave answers that were inconsistent with each other. Some reported ambiguities and inconsistencies in the systems and processes.
- There was confusion between the concepts of 'decision' and 'recommendation', which may lead to uncertainty in accountability. Some committees saw their role as 'recommending' a course of action, with the 'decision' being made by a higher-level committee. In contrast, the higher-level committees saw their role as one of guidance and support in response to robust investigation of decision options, which they expected to occur at the lower-level 'decision-making' committees.
- Individual decision-makers did not always know who to report a decision to and whether formal authorisation was required.
- Some committees recognised the overlap in their work and the potential to work together. These were in two groups: those considering introduction of new TCPs and those involved in purchasing.
- People who were members of more than one committee often provided the links between them.
- There were many examples of cross-unit/department consultation and collaboration for policy and protocol development and implementation.
- Four projects were linked to others with similar aims.

Lack of knowledge and awareness
- Lack of awareness of other committees within Monash Health.
- Other than reporting, there were no documented relationships between committees.
- Other than the committees considering new TCPs, there were no formal processes of referral for issues that might affect, or should be addressed by, other committees.
- Decision-making 'in isolation' was noted to be a problem in multiple settings. 'Fragmentation' and a 'silo mentality' were used in relation to decisions made without consideration of the areas they will impact upon or consultation with relevant stakeholders.
- There were no systematic processes to link projects across the organisation.

Monash Health environment: Stakeholder engagement
- Involvement of a broad range of stakeholders from multiple sites and a range of health professional disciplines.
- Reported benefits of broad stakeholder involvement in decision-making included improved decision-making, more effective dissemination of decisions, and informing and encouraging others about the need to consult with the groups represented.

Lack of/inadequate coordination of current resources
- Some committees had a Secretariat comprised of 1-2 officers from named roles within the organisation. These positions were allocated sufficient time to complete the required tasks.
- Some projects were provided with adequate resources for implementation and evaluation.
- Some wards had additional staffing for education support and clinical nurse support. These were invaluable resources for practice change, protocol development and implementation.
- Some projects had external funding from DHS, universities, etc for staff or infrastructure costs.

Lack of consultation with clinicians in decisions made by managers
- Some committees used the Personal Assistant of the committee Chair in an administrative role. If a new Chair did not have a personal assistant, there would be no resources to support the committee.
- Some respondents found it difficult to separate the role of the committee from the role of their department. Committee work significantly increased their overall workload, particularly administrative matters, and it was not always clear whether these duties were part of, or additional to, their normal duties and what they could cut back in order to accommodate committee obligations.
- Many projects were to be carried out 'within existing resources'. Respondents noted that they either did unpaid overtime or aspects of the project were not undertaken.

Expertise and Training
Lack of/inadequate skills in:
- use of information technology
- finding and appraising evidence from research and data
- project management
- change management

- Staff in the Centre for Clinical Effectiveness (CCE) and Clinical Information Management (CIM) were available to decision-makers to provide expertise in research evidence and local data respectively.
- CCE ran training programs in finding and using evidence, implementation and evaluation.
- Six of 10 projects had training for project staff in change management, leadership or IT skills.
- CCE's funding for training was redirected due to budget cuts, so it was unable to provide free in-house programs (however, many staff attended the fee-paying courses CCE provided).
- Lack of understanding of information systems and project management in senior decision-makers was reported, and training for committee members was suggested.
- Most projects used a staff member from the department involved to deliver the project; most of these did not have project skills or expertise.
- Education and training are not well provided for part-time and night staff.

Provision of extra computers
- Lack of computers and/or access to computers, particularly for nurses.
- Difficulties using the intranet to find organisational data.
- CCE and CIM were available to provide information to decision-makers.
- Monash Health libraries provided access to health databases and electronic journals, as well as advice in searching the health literature.

Lack of research evidence and local data to inform decisions
- Many decision-makers chose not to use these sources of information.
- Priority was given to senior decision-makers and high-level decisions; sometimes decisions at lower levels could not be provided with information due to limited resources.
- General perceptions were that:
  - financial drivers were stronger than clinical drivers
  - impetus for change was ad hoc; there was no systematic or proactive approach
  - internal bureaucracy and red tape stifled ideas
- Some committees had a well-documented application process.

Decision-makers
- Complex and time-consuming nature of application processes.
- People bypass the system, usually not deliberately but due to lack of awareness of the process.
- Some applications are driven by pharmaceutical or equipment manufacturers.

Decision criteria
- Documenting explicit criteria was generally viewed positively.
- The committees with application forms had some documentation of criteria.
- Other decision-making groups and individuals had 'mental checklists' of criteria they considered.
- Only one committee (TCPC) and one individual used explicit, documented decision-making criteria.
- Some committees had no decision-making criteria.
- Some individual decision-makers strongly rejected documentation of explicit criteria as 'another form of paperwork that will waste clinicians' time'.
- Most committees considered the Monash Health Strategic Plan, quality, safety, access and equity.
- All committees considered financial factors.
- Organisational priorities dominated, eg:
  - 'Sound practice is not always affordable practice'
  - 'The operational aspects of nursing (key performance indicators that are reported to DHS) come first and professional aspects come second'
  - There was a perception that there was 'too much emphasis on financial return for investment'

Ascertainment and use of evidence
- Strong knowledge of the literature.
- Attendance at conferences.
- Using research evidence and local data in decision-making was considered to be important.
- All respondents reported using research evidence and data in decision-making to some extent.
- Most committees sought a broad membership in order to utilise expertise in the consideration of research evidence and for decision-making with limited evidence.
- Four out of ten projects sought research evidence from the literature to inform the project.
- Amount of time needed to search the literature or collect data.
- Access to evidence is not easy or coordinated.
- Lag time between what universities teach and the latest research evidence means new staff are not always aware of best practice.
- Drug company marketing.
- Only one committee (TCPC) required explicit inclusion of research and local data and considered the quality and applicability of this evidence. Only one of the projects appraised the evidence used.
- The other committees had no process to seek evidence from research. When evidence from research and data was used, it was not usually appraised for quality or applicability.
- Due to difficulty finding uninterrupted blocks of time, slow computers and lack of skills in finding and analysing evidence, decision-makers relied on clinical expertise and advice from colleagues.
- Appropriate local data was frequently reported to be lacking, unavailable or 'manipulated'.

Reminders and prompts to consider disinvestment
- One application form (TCPC) had an explicit question about what the new technology will replace and what can be disinvested.
- "It's all very well to ask the question but it's very hard to get a clinician to say they will stop doing something."

Deliberative process
- Robust and honest conversations.
- Autonomous decision-making.
- Decision-makers expressed a desire for a documented standard process.
- Many respondents noted that the main goal of discussion was to reach decisions by consensus.
- The process was not seen as a priority by some: some committee members do not attend, and meetings are too short for proper deliberation.
- Some decisions were made reactively, 'on the run', due to lack of consultation or not following process.
- Long lag time between application and decision.
- Lack of a standardised process.
- Many of the current processes were perceived to be unclear, 'ad hoc' and lacking objectivity.
- Lobbying, both covert ('behind the scenes') and overt ('squeaky wheels'), was perceived to result in favourable decisions.
- Most committees required not only the presence of a quorum to make decisions but also attendance of members with knowledge or expertise relevant to the decision at hand.
- Not all committees had a defined quorum. Of those that did, some made decisions in the absence of a quorum and some made decisions even if a meeting was cancelled due to lack of a quorum.
- Some decisions were made outside committee meetings or by the Chair only.

Documentation and dissemination
- One committee (TCPC) published Decision Summaries which were formally distributed to the Therapeutics Committee, EMT, DHS, the Applicant, Department Head and Program Head, and made publicly available on the internet.
- Most committees recorded minutes; these were considered to be confidential and were not published, but were available to appropriate requestors by contacting the committee secretariat.
- All of the individual decision-makers interviewed reported disseminating decisions to people they considered appropriate and, when deemed necessary, disseminating decisions organisation-wide.
- Many respondents reported others disseminating decisions to them.
- The large size, nature and diversity of the organisation increases the difficulty of disseminating information and the frequency and range of communication methods required.
- Not everyone uses email; using email too often dilutes the effect.
- The majority of committees did not publish minutes or anything similar.
- One committee did not keep any records.
- Although some related committees exchanged minutes, there was a lack of formal communication across committees.
- Documentation and dissemination of decisions made by individuals was informal and ad hoc.
- Not all projects communicated decisions to other staff members or the wider organisation. Unless people were directly involved, some projects appeared not to make project work or associated decisions public knowledge.
- Lack of processes for knowledge transfer, especially across sites.

Implementation

Purchasing
- Robust organisational processes that met annual audit requirements.
- Electronic ordering was controlled through an approval hierarchy with delegation thresholds.
- It was assumed that the decision to purchase was made with due process before reaching the purchasing unit.
- Use of evidence in purchasing decisions was not outlined in the Purchasing Policy Guidelines.
- Those making the decision of 'whether to buy' were responsible for ascertaining evidence of safety, effectiveness and cost-effectiveness in the first stage; however, there was no system to check that this had been done before the second stage.
- Health Technology Services, the Product Evaluation Committee and working parties set up to evaluate large individual capital purchases considered appropriateness of equipment to Monash Health, availability of spare parts, life expectancy, servicing requirements, related consumables, availability of technical expertise and fit with the DHS Asset Management Framework. They also had expertise in contract negotiation.
- Difficulty managing expectations, eg 'once something is approved people want it immediately'.
- Some were unaware of this process and went directly to the manufacturer. If the manufacturer was overseas, it might be difficult or expensive to get parts and there might not be relevant skills for local maintenance; this also excludes benefits that may already exist with a local manufacturer that could supply the same product under better terms and conditions. Re-negotiating contracts, or establishing new ones, creates bad feeling and wastes considerable time.
- Purchasing of clinical consumables within budget allocation is done electronically. Electronic authorisation is required for items above individual limits (eg Nurse Unit Manager approval up to $10,000; items above this require higher-level authorisation).
- There is little assessment of safety, effectiveness or cost-effectiveness of clinical consumable items.
Policy and guidance
- Monash Health was developing a new Policy and Procedure Framework.
- Broad support for increased standardisation of practice through policies and procedures.
- The development process was seen as a communication tool between professional groups and across sites.
- Lack of structure and standardisation of processes, especially between sites.

Implementers
- Finding others who have done the same work for support, advice and information.
- Establishing Working Parties and Steering Committees for support, endorsement and troubleshooting.
- A project leader whose primary role is 'at the coal face'.
- Decisions made at program level that involve multiple wards, departments or sites are usually implemented by multidisciplinary teams.
- Project-specific barriers, such as logistical challenges with the product being implemented.
- Some committees provide an approval process only and the applicant is responsible for implementing the decision. In most cases the applicant has control over the process (eg a head of department implementing a new procedure) and is motivated to implement the change.
- Sometimes practice change is required beyond the applicant and their department. Committees do not require applicants to have or acquire knowledge and skills in implementation.
- Training and education activities and 'champions' were reported as the two key strategies used to effect change and encourage sustainability of the intervention.
- Most projects had a champion and/or Executive sponsor. Project champions were generally the head of the relevant department; others included the Chief Executive Officer, Executive Directors who were Steering Committee Chairs, and 'Ward Champions' selected to encourage and promote change.
- Those with champions unanimously considered champions important to the success of the project.
- Training or education included passive methods using posters and memos, interactive learning on new equipment, and participatory approaches involving staff in design and implementation.
- Seven projects involved training for the target group, most of which was done by external providers of new equipment.
- Lack of knowledge and skills in project management, change management and use of information technology were exacerbated when interventions were complex and required high levels of training.
- Lack of known, standardised processes for implementation at Monash Health.
- Most considered their project sustainable and believed the change was embedded in the system. This was reportedly achieved by involving a variety of staff and 'bottom-up' approaches to change.
- Only two considered sustainability in the design of the project.
- Half of the projects tailored the implementation plan to anticipated barriers and enablers sourced from other health services, literature searches and the personal experiences of project staff.
- Half reported that implementation was conducted as planned. Some noted that it mostly went to plan but 'amendments were made continually to improve the process'.
- One project had no implementation plan.
- Half of the projects did not consider barriers and enablers.
- The benefit of the proposed practice change is clear and observable.
- Lack of baseline data meant that potential adopters were unable to see the benefit or relevance to their situation, resulting in less 'buy-in' and poor uptake.

Evaluation of outcomes of decisions: General
- Use of pre-existing (and pre-tested) tools from other organisations, eg audit tools.
- Evaluation and monitoring were considered important and had broad support.
- Monitoring of projects after implementation was thought to increase sustainability.
- Quality and Risk Managers are not included at the beginning to help with collection of baseline data and evaluation design.
- Lack of baseline data.
- A lack of data was seen to contribute to the current state of 'little or no process of evaluation'.
- Limited funds, knowledge and/or skills inhibited both the planning and conduct of evaluation.

Evaluators
- CCE was establishing an in-house Evaluation Service at the time of these interviews.
- No specified evaluators with appropriate training or expertise had been utilised by the respondents.

Requirements for evaluation
- Monitoring, evaluation and reporting of outcomes was required by DHS-sponsored projects and TCPC. The Therapeutics Committee requested reports for some decisions.
- Routine clinical audits and monitoring of adverse events undertaken for hospital accreditation purposes provided indirect evaluation of decisions in some situations.
- Half of the completed projects had been evaluated; all but one project reported achieving its planned objectives.
- Monash Health had no requirements for evaluation of outcomes of decisions or projects.
- Most committees had no planned evaluation of outcomes of decisions or implementation projects.
- The purpose of reports for TCPC and Therapeutics was questioned by some respondents, who noted that it may be inconsistent with the knowledge needed by program staff.
- Only two projects planned evaluation as a project component. Some were evaluated post hoc.

Reinvestment
- Reinvestment or reallocation of resources would be an incentive to disinvestment.
- The SHARE Steering Committee was keen to establish and support methods for reinvestment/reallocation.
- Flexibility and lateral thinking are needed to include novel methods/indicators, such as reducing waiting lists, getting patients out of the Emergency Department faster, freeing up time in procedural/operating suites, and freeing up bed days that are used to treat another patient group faster (eg procedure X saved $Y/bed days, which were used by Z patients).

- Lack of planning for resource reallocation.
- Lack of transparency and consultation in reallocation of savings creates disillusionment.
- Staff dissatisfaction that savings generated are not reallocated.
- A health economist is required to do this properly; Monash Health had no resources for this.
- 'We don't look far enough for downstream effects; we're too simplistic in assessment of savings'.
- It was noted that savings made in a project in one area sometimes increased costs in other areas; hence reallocation of the savings to the project department would be unfair.
- Savings of bed days or time in procedural/operating suites were used immediately to treat another patient group, so were never realised.
- Accounting practices did not enable measurement and/or reallocation of savings in some areas; for example, changes to one TCP may affect multiple cost centres, eg department, ward, ICU, pharmacy.

Six potential opportunities to integrate disinvestment decisions into organisational infrastructure, systems and processes were identified.

Literature SHARE leaders
Investigate methods to implement disinvestment decisions in the six settings identified.
Systems and Processes Undertaking disinvestment projects was a key element of the original proposal. Waiting for investigation of the six settings is too long to delay pilot projects. Some 'quick wins' would be valuable.

SHARE leaders Monash Health Staff
Develop methods to identify and prioritise potential target TCPs in parallel with the investigation of the six settings. Undertake pilot projects to disinvest them.

Disinvestment projects
Current decisions are made 'routinely' or 'reactively'. Introduction of TCPs is based on applications from clinicians or managers and removal of TCPs is based on emerging problems or product alerts and recalls. Research literature and local data could be used 'proactively' to drive health service practice.

Monash Health Staff SHARE leaders Project team
Build on current 'routine/reactive' processes that are done well. Develop new processes to use evidence 'proactively' to drive decisions and/or priority setting. Make these explicit elements of the new program.

Principles
Using evidence 'proactively' requires time and attention from decision-makers. The information provided must be trustworthy, applicable and sufficiently important to

Monash Health Staff SHARE leaders
Develop methods to identify appropriate high-quality information, process and package it for ease of use and deliver it to the

Finding
Source Decision Program element warrant adding to their workload. relevant decision-makers. Decisions for resource allocation are delegated to committees and individuals. There are opportunities for improvement in the governance of these processes and to introduce routine consideration of 'disinvestment'.

Monash Health Staff SHARE leaders Project team
Review processes and governance of decision-making by committees and the authority delegation schedule

Systems and Processes
There is no guidance on consumer participation in disinvestment activities.
Literature

Develop methods to capture and utilise consumer perspectives and integrate them into the new program.

Systems and Processes
With a few exceptions, committees and project teams do not routinely involve consumers in making or implementing decisions and the organisation does not have a framework for engaging consumers.

Monash Health Staff Project team
The systems and processes for evidence-based decision-making cannot be delivered without appropriate and adequate skills and support

Literature Monash Health Staff
Develop support services that enable capacity-building and provide expertise and practical assistance.

Support Services

With a few exceptions, staff do not routinely seek evidence for decisions, are unaware of best practice in implementation and do not evaluate outcomes.

Monash Health Staff Project team
Provide expertise, training and support in accessing and utilising evidence in decisions. Provide expertise, training and support in implementing and evaluating evidence-based change.

Support Services
The main barriers to use of evidence and effective implementation are lack of time, knowledge, skills and resources.

Literature Monash Health Staff
Health service projects are not usually well supported. It is common for funding to be insufficient, timelines to be inadequate, and staff to lack knowledge and skills in project management, data collection and analysis.

Monash Health Staff Project team
Influence planning of disinvestment projects to ensure adequate resources and appropriate timelines. Provide expertise, training and support in project methods and administration.

Support Services

Disinvestment projects are generally based on health economic principles.

Literature

Utilise in-house expertise and take an 'evidence-driven', rather than 'economics-driven', approach to investigation of disinvestment in the health service context.

Principles
Monash Health does not have expertise in health economics and does not intend to fund this in the foreseeable future.

Monash Health Leaders

Safety, effectiveness, local health service utilisation and benchmarking parameters are possible alternative considerations for disinvestment.

SHARE leaders Monash Health Staff Project team

Monash Health has high-level expertise in accessing and using research evidence and health service data to inform decisions. Monash Health does not have the level of expertise in health program evaluation required for SHARE and has no expertise in health economics.

Project team

Engage consultants in health program evaluation and health economics to assist in development and evaluation.

Preconditions
There is no guidance to inform a systematic organisational approach.

Literature

Undertake action research to investigate the process of change in addition to program and economic evaluations. Run a national workshop to learn and share information. Disseminate all findings.
Evaluation and Research

In addition to detailed program and economic evaluation, an understanding of what happened in the process of investigation, what worked, what didn't work and why is required.

SHARE leaders Project team
This large program will need funds. It is consistent with the disinvestment agenda of the Victorian DHS, which is sympathetic to a funding application.

DHS documents DHS staff
Seek funding from the state health department.

Preconditions

To be successful, this ambitious proposal will need endorsement, support and strategic direction from the highest level, and links to those with power and influence in the organisation.

Literature SHARE leaders Project team reflection
Increase membership of the Steering Committee to reflect those best able to provide the appropriate influence, direction and support.

Preconditions
All projects should be aligned to the Monash Health Strategic Goals. Program activities will be facilitated if integrated into the organisation's Business Plan.

SHARE leaders Project team reflection
Align SHARE with the Monash Health Strategic Goals and include program activities in the annual Business Plans.

Principles

Section 7 Factors that influenced development, processes, outcomes and revision of EDS

a. Development
Influencing factors are presented in the matrix below. Decisions are summarised in the table following.
Development, implementation and evaluation of the pilot Data, Capacity Building and Project Support Services are reported in Paper 7 [7]. Matrix reproduced with permission.

Based on sound evidence or expert consensus
There is evidence of desirable characteristics of evidence products, but no clear evidence of effectiveness for the overall model.

Presented by credible organisation
Sources of evidence, such as The Cochrane Library, are considered credible. CCE is considered credible as a knowledge broker.

Able to be tested and adapted
A formal pilot will be undertaken, ongoing feedback will be sought, and systems and processes will be refined based on stakeholder feedback.

Relative advantage is evident
All stakeholders consulted have responded that they would welcome up-to-date evidence being delivered directly to them.

Low complexity
Users only have to register to receive evidence; however, they will have to appraise it. The reporting template is as simple as possible.

Compatible with status quo
There is no current system for receiving disseminated evidence. Reporting is integrated into the existing monthly reporting schedule.

Attractive and accessible format
The email and website formats are attractive and easy to use. The evidence is categorised and readily accessible.

Structure
CCE is an appropriate vehicle to deliver EDS within the organisation. Line management is the appropriate way to report use of evidence, change in practice, etc.

Skills

The CCE team includes systematic reviewers, knowledge brokers and a health librarian. The Monash Health Medical Administration Registrar (trainee), with up-to-date clinical knowledge, was seconded to ensure correct classification within clinical categories. The decision-makers may not have the skills to appraise the evidence appropriately.

Resources
Adequate funding was provided from the SHARE Program and by Monash Health allowing secondment of staff to the EDS.

Commitment
The organisation has demonstrated commitment through endorsement by the Executive Management Team and the Board and representation on the SHARE Steering Committee (3 executive directors, 10 clinical program directors, 4 committee chairs, 5 senior managers, legal counsel and 2 consumer representatives). All senior decision-makers consulted expressed their support.

Leadership
The Executive Director of Medical Services and Quality, Chair of the Technology/Clinical Practice Committee and Director of CCE are leaders of the process. All have credibility within the organisation.

Based on sound evidence or expert consensus
This model addressed the desirable characteristics of evidence products better than Model 1.
No evidence of effectiveness for the overall model, no evidence that it has been done before.

Presented by credible organisation
Sources of evidence, such as The Cochrane Library, are considered credible. CCE is considered credible as a knowledge broker.

Able to be tested and adapted
A formal pilot will be undertaken, ongoing feedback will be sought, and systems and processes will be refined based on stakeholder feedback.

Relative advantage is evident
Changes between Models 1 and 2 are based on stakeholder feedback and the benefits of the changes are clear.

Low complexity
Recipients of Evidence Bulletins only have to check applicability of the evidence and make changes if required. The response form is even simpler and has been reduced from seven responses to two.

Compatible with status quo
There is no current system for receiving disseminated evidence. Designated decision-makers are responsible for making sure practice in their area of authority is up-to-date.

Attractive and accessible format
The Evidence Bulletins are attractive, able to be read at a glance, with key information extracted from the publication and summarised.

Structure
Designated decision-makers for the topic under consideration are the appropriate recipients of Evidence Bulletins.
Program Directors are the appropriate individuals to disseminate the evidence and request a response from the decision-makers who report to them.
The Technology/Clinical Practice Committee (TCPC) is the appropriately authorised group to govern the EDS process.
CCE is an appropriate vehicle to develop the evidence products.

Skills
CCE team have the relevant skills to produce the Evidence Bulletins.
The TCPC and Program Directors have the relevant knowledge to assess applicability of the evidence and need for change within the organisation.

Resources
Funding has been provided by Monash Health for the piloting phase, but ongoing funding to enable continuous delivery of the EDS will be needed.
The current level of funding does not enable dissemination of all available evidence; limitation of selected publications to areas of priority within the organisation will be required.

Commitment
The Chief Executive has made EDS an organisational priority and requires notification of all responses related to evidence of harm.

Leadership
The Executive Director of Medical Services and Quality, Chair of the Technology/Clinical Practice Committee and Director of CCE are leaders of the process. All have credibility within the organisation.

 Users were not certain about the purpose of the EDS and why specific publications were not being disseminated. They were also not using the website search function.
 The EDS explanatory pages were revised and a 'Frequently asked questions' page was introduced.

Knowledge brokering

 The EDS process was complex and only one staff member was familiar with all the requirements, creating problems when they were on leave.
 An administrator's manual was developed and additional staff were trained to improve sustainability of the service.
 The pilot website had no branding, which did not comply with internal standards for Monash Health publications.
 The Public Affairs and Communications Department assisted the EDS team to include Monash Health branding.

Processes and infrastructure

 Executives, Senior Managers and Program Directors required information about policy and management decisions, which was not addressed in the predominantly clinical evidence provided from the sources previously identified.
 The category of 'Evidence-based policy and management advice' was added and criteria to identify high-quality sources of this information were developed (Section 9).
 The need for users to identify publications that recommended ceasing or restricting a TCP for evidence of harm or lack of effect was noted.
 The category of 'Disinvestment' was added.
 The initial taxonomy used first-level ICD10 headings. This did not provide enough detail, and halfway through the pilot period this was changed to the second level. The change to second-level headings within the limitations of the free software made the process of entering data very time intensive and created messy search results for users.
 ICD10 classifications were replaced with MeSH.
 The category of 'Professional Group' was thought to be too broad to be of real use; for example, 'Medicine' was attached to almost every piece of evidence, and it had considerable overlap with the 'Specialty' category.
 'Professional Group' was removed and 'Specialty' was modified slightly to accommodate this change.
 The Medical Administration trainee was unable to undertake the classification due to other commitments which were given greater priority. This was a limitation of the Medical Administration portfolio, where crises requiring immediate attention occurred frequently.
 The EDS paid a medical graduate for one hour per week to ensure categorisation was correct and completed on time.
 Users reported a preference for shorter emails with fewer entries.
 Distribution was changed from fortnightly to weekly with fewer entries.
 Citations in bulletins from EUROSCAN did not point to full text.
 EUROSCAN was removed from the list of sources of evidence.
Evaluation plan

 The free email software had significant limitations related to analysis of available statistics. (Separate email software was needed at the start of the pilot as the website software did not have an email subscription function; this was introduced later, so the separate email software was no longer needed.)
 The email service with the original provider was discontinued and re-established with the website provider.

d. Model 1 Full implementation

Domain, Influencing factors, Decisions/Action

Evidence products

 Although they were recent publications, they may not contain any new evidence, eg updates of SRs or HTAs with no changes.
 Although the sources of evidence were appraised for their requirements of rigorous methods, this does not guarantee that the publication is valid or has low risk of bias.
 There was a large volume of information, including a large number of publications that did not require action.
 The email Alerts did not contain many of the features known to increase use and application of disseminated evidence, ie no targeted message and no specific request for action.
To repackage the evidence to highlight key messages, demonstrate local relevance and implications, and provide actionable recommendations.
Target audience

 Lack of time to appraise for quality and applicability, check for consistency with current documented practice or complete the proposed reporting template.
 Findings were often irrelevant to recipients' areas of practice or already known to them.
 This wasted their time and increased the potential for them to miss findings that mattered.
To reduce the burden on busy decision-makers by filtering publications before dissemination to assess quality, applicability, lack of or inconsistency with policies and procedures, local importance and potential for change.
 Evidence Alerts were not always reaching the right decision-makers, who were self-selected.
To deliver the repackaged evidence to a specified authorised decision-maker responsible for practice in the areas addressed in the publication.

Knowledge brokering

 The EDS team had difficulty processing the large number of eligible publications within the available resources and proposed that the selection criteria be restricted to reduce the volume.
To limit selection criteria for publications to areas of high priority within Monash Health.

Processes and infrastructure
 Lack of governance, particularly a lack of transparency and accountability. EDS broadcasts were developed and disseminated rigorously and systematically, but were not accessed or used rigorously or systematically. Those responsible for decisions within the organisation were required to self-select and take action, but there was no process to ensure that the appropriate person with authority in the area affected by the evidence had considered the information or made a decision. Recipients could choose whether or not to access, use, or report use of the evidence.
To introduce a governance framework for transparency and accountability and to ensure that the appropriate decision-makers are engaged, they address the evidence and take action as required, and the process is documented and reported.

Processes and infrastructure
 Evidence of benefit could not always be classified as clinical or cost effectiveness; for example effective methods to develop or implement guidelines.
 A new category of methodological effectiveness was added.
 There was not enough time to discuss the potential items for dissemination at the TCPC meeting.
 A standing item for the EDS was introduced to the TCPC agenda.
 The EDS was promoted as an organisation-wide priority.
 Responses were mandatory and would be audited and reported to the Chief Executive every month.
 The TCPC had the authority to require action.
 All senior managers were supportive.
 These were enablers.

f. Model 2 Full implementation

Domain, Potential influencing factors, Potential Decisions/Action

Evidence products

 No negative comments were received regarding the Evidence Bulletins.
 The format could be replicated in subsequent models.

Target audience

 The volume of information to each decision-maker was significantly reduced.
 Most bulletins were provided for information only; on average, responses were required only once every few months.
 All the bulletins decision-makers received were relevant to their clinical area.
 Their workloads were reduced to confirming whether change was needed, taking action if required, and reporting the outcomes.
 These were enablers.
 Many decision-makers in the target audience were researchers familiar with the literature and often contributors to systematic reviews or evidence-based guidelines. They were annoyed when receiving material they were already familiar with.
 It is difficult to know how to address this when EDS staff do not know which areas of research staff members are active in, and should not assume, even if they are active, that they are aware of all the evidence in that area.

Knowledge brokering

 Several respondents appeared to be unclear about the purpose of the EDS; in particular, it was perceived that CCE had undertaken the reviews, rather than capturing synthesised evidence as it was published by others.
 A flowchart or text summary of the EDS process within each bulletin may address this.
 Evidence regarding drugs that were not available locally was disseminated.
 Confirmation that drugs or other technologies are available would require an extra step in the process.
 Many publications had more than one conclusion, eg harm plus effect, or effect plus lack of evidence.
 Some complex issues were relevant to multiple decision-makers.
 New methods are needed to address these issues.

Processes and infrastructure
 The governance elements worked smoothly and enabled transparency and accountability of the processes  The methodological issues were addressed successfully; only valid evidence was disseminated in bulletins that highlighted key messages, demonstrated potential inconsistency with local practice, and clearly stated required actions

Section 9 Definitions of evidence products, inclusion criteria and appraisal of publication sources
Inclusion and appraisal criteria were applied to methods published on the websites of potential sources of high quality synthesised evidence.

Evidence-Based Guidelines
Evidence-based guidelines are systematically developed statements that aim to assist practitioner and patient decisions about appropriate health care for specific clinical circumstances. Developed after the systematic retrieval and appraisal of information from the literature, evidence-based guidelines usually include strategies for describing the strength of the evidence, and clearly separate expert opinion from the best available evidence [3]. Evidence-based guidelines have been sourced from sites or organisations that have appropriate methods of development.

Quality criteria
Sources were assessed against a subset of criteria from the AGREE II instrument [4]:
 Systematic methods were used to search for evidence (criterion 7)
 The criteria for selecting the evidence are clearly described (criterion 8)
 The methods used for formulating the recommendations are clearly described (criterion 10)
 There is an explicit link between the recommendations and the supporting evidence (criterion 12)

Horizon scanning documents
Horizon scanning provides short, rapidly completed, 'state of play' documents. These provide current information on technologies to alert planners and policy makers of their advent and potential impact in terms of safety and cost, before they are introduced into the health system. In addition to new and emerging technologies, horizon scanning can also provide timely information about changes in the delivery and use of existing technologies [5].

Quality criteria
Sources were assessed against the eight principles of the HONcode (Health on the Net Foundation Code of Conduct).

Alerts and recalls
An alert is advice regarding a specific situation in which a therapeutic good, whilst performing to meet all specifications and therapeutic indications, might present an unreasonable risk of substantial harm if certain specified precautions in regard to its use are not observed [7]. A recall advises the permanent removal of therapeutic goods from supply or use for reasons relating to deficiencies in the quality, safety or efficacy of the goods [7]. Alerts and recalls were not appraised but were limited to Australian government publications.

Evidence-based policy and management advice
Evidence-based policy and management advice is represented as synthesised research evidence related to governance, financial and delivery arrangements in health systems [8], as well as policies, programs and interventions at public health decision-making levels [9].

Quality criteria

Coding
The titles were coded so the reader could identify the type of publication:
 Systematic reviews and health technology assessments (HTAs) were identified by the prefix SR.
 Evidence-based guidelines were identified by the prefix GL.
 Horizon scanning documents were identified by the prefix HS.
 Alerts and recalls were identified by the prefix AR.
 Evidence-based policy advice was identified by the prefix PL.
The new drug will be implemented following an education program and introduction of revised local guidelines. In the absence of good evidence to retain or discontinue current practice, no changes will be made.

Section 15
Survey of staff enrolling in the EDS: Baseline data

All subscribers had been invited to complete a baseline survey regarding their use of evidence when they registered with the EDS. The findings were very similar to other surveys in this area [23, 24, 29, 41-43, 45-48, 57, 73-78], including others at Monash Health [7]. Users consulted a range of sources to inform their decision-making and believed that EBDM resulted in the best clinical care.
Almost half (18/41) of the respondents found out about the EDS through the advertisement on the Monash Health Intranet, the others found out through the Chief Executive's Newsletter (8), referrals from colleagues (8), posters in the hospital (4), or other means (3). Most (33/45) reported that their role involved decision-making about introducing or changing use of TCPs.
All respondents 'sometimes', 'often' or 'always' included research evidence in their decision making. The internet, The Cochrane Library, and electronic databases were the most commonly used resources. Most respondents spent more than two hours searching for, assessing and appraising evidence for their decisions.

Objective
To test and refine the features of Model 1 for use by individual decision-makers.

Characteristics of the pilot intervention
The scope, components and methods developed initially formed the pilot intervention.
Pilot activities were undertaken with a pragmatic sample of a range of individual decision-makers including executives, clinical program directors and senior managers from the SHARE Steering Committee and Technology/Clinical Practice Committee and clinical managers from one large multi-campus department.

Implementation strategies
EDS staff met with committee and department representatives to seek agreement in principle and then attended meetings to explain the service and obtain agreement from individuals. Personalised emails explaining the project and requirements of participants were sent to those who were not present at the meetings. The project team enrolled each of the designated staff members, but individuals were required to register to establish their account. An email invitation with information about the EDS, an embedded link for registration, and instructions on how to activate the link was sent to each participant.

Evaluation
Evaluation was conducted six months after implementation and included audit of website statistics, electronic survey of individual users, interview with EDS administrator, and reflections of the SHARE Steering Committee and project team. An additional survey was sent two months later to explore reasons for non-use of the EDS in the pilot sample. Details of the survey and interview questions, responses and project team observations are provided below and key messages are summarised.

Reach
Of the 73 individual decision-makers enrolled by the EDS team, 26 activated their email subscription and one created an RSS subscription. Due to problems determining the validity of email addresses, it was difficult to define a denominator for this response. Medical staff frequently used personal email addresses and lists of committee members were not kept up-to-date; some may not have received the invitation and others may have left the organisation.
Users preferred the email to the website with email 'views' growing significantly over the pilot period while the website remained steady with relatively fewer 'views'.
While not officially in the evaluation period, in the eight months between the formal pilot and implementation of the revised EDS, subscriptions more than doubled to 64 participants, with an average of 100 visits to the site per month. The 'Home' page was the most frequently visited page of the site, with the most recent systematic review being the most common destination for users.

Usefulness: User satisfaction
There were only eight responses to an online survey sent to individual participants. While this small number limits generalisability, the themes were very consistent and most respondents replied positively. Users were 'mostly' or 'completely' satisfied with the service. The website was viewed as 'easy' or 'very easy to use' and the amount of information on the website met users' needs. Email alerts were read, and respondents reported accessing full text at least 'sometimes', with one person 'always' doing so. One respondent questioned why there were not more publications in their area of expertise, suggesting that they misunderstood the nature of the service, ie that it captured publications as they were published rather than selecting them by topic.

Usefulness: Service quality
All respondents rated the information as 'trustworthy', 'current' and 'coming from an authoritative source'. One respondent was unaware of the classification system, but the others reported that entries had been classified correctly. Two respondents suggested improvements, both related to identifying information relevant to users' specialty areas.

Use
Two individuals had used the information in making decisions about clinical practice. No one had used it for purchasing clinical consumables or capital equipment, although half thought that they would in the future.
The executives and senior managers reported that the information in the EDS alerts did not influence their decision-making because it was predominantly about clinical practice and their decisions were not. They observed that the different levels of management within the organisation required different types of information and proposed three levels:
1) Department heads and unit managers needed evidence for local policies and protocols related to clinical practice.
2) Program directors required evidence that informed their one to two year planning processes and was relevant to procedural aspects of the health service, such as programs and service delivery, as well as individual practitioners.
3) Executives and senior managers required information to inform three to five year forward planning aligned with the organisation's strategic objectives.
This resulted in the addition of a category for 'Evidence-based policy and management advice' and the development of criteria to identify high-quality sources of this information; details are in the section on Definitions of evidence products above.

Implementation fidelity
The intervention was largely implemented as planned; the only modifications were that some of the sources were not accessed during the pilot period. Barriers and enablers were identified and action taken; almost all were related to technical issues in delivering the service. A number of small things were not implemented due to the nature of the pilot (eg using the full list of original resources); however, the service is fully operational and was implemented without any major changes to the implementation plan.

Interview with EDS Administrator
To what extent has the EDS been implemented as planned?
There are a number of small things that have not been implemented due to the nature of the pilot, however, the service is fully operational and implemented without any major changes to the implementation plan. The full list of resources to be checked has not been implemented yet as only a few of the resources were chosen for the pilot. These were those that met the quality criteria. This list needs to be revisited. New resources have emerged and will be added to the resource manual.
Have there been any unplanned modifications along the way and why or why not?
Second-level ICD10 headings were introduced halfway into the pilot; depending on evaluation results, this will be retained and all entries under the top-level headings removed. EUROSCAN was excluded from the list of resources as it did not meet the quality criteria (it sends out notifications without data, duplicating effort). The taxonomy will always be in development; this is due to the nature of starting with existing classification systems not designed for this purpose. For example, the category for medicine from MeSH is too broad and lacks some specialisations. There is also duplication within the taxonomy which must be addressed. The last month of the pilot was changed to a weekly email due to the amount of new evidence uploaded to the resources we are using over our holiday break.
Is there anything yet to be done that was in the plan?
The full resource list has yet to be searched. Move to a new blog with a new domain name: the current EDS is on a personal account, and moving will allow any EDS team member to update it. Reviewing original resources for quality: we have it listed in the EDS as 'at least annually'; maybe once every two/three years? We did consider how long to keep the posts on the blog. It's not meant to be a repository, the email is the main feature. Could remove after 6 months; need to ask users or the steering committee.

What have been the main barriers and enablers to establishing and continuing the EDS? Barriers
Slowness of the work computer, necessitating work from home one day a week; this has been resolved with a new computer at work. Incorporating this new task into the workload has taken some time. Establishing a routine and developing a more streamlined process of gathering and updating the blog was once a barrier and is now, hopefully, an enabler. This has been facilitated by creating a template for broadcasts, ie keeping the headings in. Setting aside a particular day and time to get evidence and loading it the next day really works. If this was to become a state or national project there would need to be increased leadership and budget from another body. The software would need to be upgraded and an IT technician might be needed. There might need to be some review process for quality assurance of the taxonomy, with a clinical review every so often that checked a few posts to ensure categorisation was correct. This would be necessary for new people administering the service.

Project team and Steering Committee observations
• Executives, Senior Managers and Program Directors required information about policy and management decisions, which was not addressed in the predominantly clinical evidence provided from the sources previously identified.
• The need for users to identify publications that recommended ceasing or restricting a TCP for evidence of harm or lack of effect was noted.
• The Medical Admin trainee was unable to undertake the classification due to other commitments given greater priority. This was a limitation of the Medical Admin portfolio, where crises requiring immediate attention occurred frequently.
• The EDS process was complex and only one staff member was familiar with all the requirements, creating problems when they were on leave.
• The pilot website had no branding, which did not meet internal standards for Monash Health publications.
• Users reported a preference for shorter emails with fewer entries.
• Users were not certain about the purpose of the EDS and why specific publications were not being disseminated. They were also not using the website search function.
• The free email software had significant limitations related to analysis of available statistics. The website software did not have an email subscription function at the start of the pilot but introduced it later.
• The initial taxonomy used first-level ICD-10 headings. This did not provide enough detail and halfway through the pilot period it was changed to the second level. Within the limitations of the free software, the change to second-level headings made the process of entering data very time-intensive and created messy search results for users.
• The category of 'Professional Group' was thought to be too broad to be of real use, for example 'Medicine' was attached to almost every piece of evidence, and had considerable overlap with the 'Specialty' category.
• Citations in bulletins from EUROSCAN did not point to full text.
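The taxonomy problems noted above (first-level ICD-10 headings too coarse, second-level headings too laborious to enter by hand, and duplicate entries across levels) are essentially a data-structure issue. As a minimal illustrative sketch, not a description of the actual EDS taxonomy (the chapter and block names below are invented examples), a two-level mapping lets an entry be tagged once at the specific level and still be retrieved under its broader heading, avoiding double data entry:

```python
# Illustrative two-level taxonomy: second-level "blocks" nested under
# first-level ICD-10-style "chapters". Names are examples only, not the
# actual EDS taxonomy.

TAXONOMY = {
    "Diseases of the circulatory system": [
        "Ischaemic heart diseases",
        "Cerebrovascular diseases",
    ],
    "Diseases of the respiratory system": [
        "Influenza and pneumonia",
        "Chronic lower respiratory diseases",
    ],
}

# Reverse index (block -> chapter) so an entry tagged only at the
# specific level can still be found under its parent heading.
BLOCK_TO_CHAPTER = {
    block: chapter for chapter, blocks in TAXONOMY.items() for block in blocks
}


def chapters_for(entry_blocks):
    """Return the first-level chapters implied by an entry's block tags."""
    return sorted({BLOCK_TO_CHAPTER[b] for b in entry_blocks if b in BLOCK_TO_CHAPTER})


entry = {"title": "Example systematic review", "blocks": ["Cerebrovascular diseases"]}
print(chapters_for(entry["blocks"]))  # ['Diseases of the circulatory system']
```

Tagging only at the second level and deriving the chapter automatically would have avoided the duplicated entries under top-level headings that made the mid-pilot change so time-intensive.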
Follow-up electronic survey to explore non-use of EDS

Users
• Have no issues. Would love more renal/transplant issues but do find the other issues useful.
• When the email is sent, it is very clear at a glance which units may be interested in an article, eg Infectious Disease: Article A... Article B...
• Very good format. Maybe a wider range of topics; more on clinical drug trial reports.
• Define the source of information, eg HS, on the email alert.
• Unable to do this; I am not a staff member (Consumer representative). Some staff might like particular areas to be categorised or highlighted to enable quick access. I did not explore the possibilities here.

Non-users
• I imagine the EDS would provide links to new sources of evidence, references and summaries of noteworthy publications etc. Perhaps the EDS would set up a permanent link on the Clinicians Health Channel or directly on the intranet, or send out a regular e-newsletter.

5. Although you may not have heard of the EDS or may not use it, please comment on how you imagine the EDS could be used to aid decision-making within the organisation more broadly.

Users
• Have already used information to pass on to the head of unit, which has been useful in decision-making for a trial we want to do.
• First point of call prior to development of a new clinical policy/procedure.
• EDS has enormous potential. Sorry I can't be more helpful.

Non-users
• Good idea. Needs to be widely known about. Email updates are more likely to be effective than promoting the web address. Specific topic updates on a regular basis may be helpful.
• Don't know what it is.
• Would be very interested to receive the suggested ?monthly emails.

Section 17 Model 1 Evaluation of full implementation
Evaluation was conducted ten months after implementation of Stage 1 and included audit of website statistics, survey of individual users, interviews and consultations with stakeholders, and reflections of the SHARE Steering Committee and project team.
The project team identified 46 of the 70 subscribers by their Monash Health email addresses (the others used anonymous personal emails) and surveys were sent by internal mail including an addressed return envelope and a chocolate incentive. A two-week response time was stipulated.
The user survey had a 52% (24/46) response rate; all health professional groups and all campuses were represented. All three committee liaison representatives and two senior individual decision-makers participated in interviews.

Reach
Seventy subscribers enrolled during the evaluation period.
Most (20/24) survey respondents received email broadcasts and the others established personal RSS feeds. Although the EDS was set up for users to access information via email or RSS feed, it was encouraging to see the EDS accessed via the Monash Health intranet 182 times and 134 full text articles downloaded this way. It was difficult to interpret other available data as limitations with the free website software meant that 'user' and 'administrator' (EDS staff) traffic to the site could not be separated.
The Therapeutics Committee representative was a member of the SHARE team and received the full EDS email broadcasts; customised RSS feeds were developed to address the specific needs of the Medication Safety and Clinical Risk Committees.

Usefulness
Most (21/24) respondents were satisfied with the EDS and found the website, email broadcast or RSS feed met their needs 'fully' or 'partially'. The majority (17/19) of respondents found the categories useful; those who did not were unaware that this feature was available. Categories were used to quickly identify whether the information was relevant and prevented respondents from looking at irrelevant information.
Committee representatives found that the format was "...clear and relevant", "layout of the bulletins was easy to read", "summary of the findings was very good" and "volume of material is fine".
The majority (22/24) of respondents found the content was 'current' and 'trustworthy', and 'useful' or 'partially useful'. Participants who responded 'partially' or 'no' to any of the options did so because the information provided was not relevant to their area of clinical practice. The large volume of material was noted as a barrier to accessing the information contained in each broadcast. Six survey respondents provided suggestions for how the service could be improved; all related to making the categories more specific to avoid wasting time looking at irrelevant information.
Responses of committee representatives were mixed. Negative comments reflected the survey responses; "A lot of information that wasn't particularly relevant", "too clinical" and was "rarely helpful or useful". Positive findings included "…providing the correct kind of information" and "hitting the mark of what you would expect from an Evidence Dissemination Service".

Use
Less than half (9/24) of the survey respondents had used information from the EDS in decision-making; examples of use included confirming current knowledge, ensuring knowledge is up to date, informing formulary decisions, passing information on to colleagues and using information in research. Only one respondent had used it for purchasing clinical consumables, none for purchasing clinical equipment, and nine for clinical practice change. However, they were optimistic about the possibility of future use for purchasing clinical consumables or equipment, clinical practice change and other resource allocation decisions. The main reasons for not using the EDS information in decision-making were lack of time to read full articles and lack of relevance to the clinical setting.
Committee representatives reported that no information provided by the EDS was discussed at meetings held during the evaluation period. Further tailoring of customised RSS feeds was suggested by committees as a way to increase use, for example the Medication Safety Committee requested publications that demonstrated evidence of harm, evidence of reduction in risk of harm, and evidence regarding use of an effective alternative to a medication in current use. They were not interested in publications reporting lack of effect or insufficient evidence.
Two senior decision-makers responsible for organisation-wide portfolios were consulted regarding the draft reporting tool prior to implementation of Stage 2. They agreed that the volume of work required, accessing the publication to identify whether it was relevant; appraising it for quality, local applicability and consistency with existing policies and procedures; taking appropriate action; and reporting using the proposed tool, was too onerous, and that this model was unlikely to be achievable.

Implementation fidelity
There was one major modification to the planned intervention. Following evaluation of stage 1, it was clear that this model would not meet the objectives and stage 2 was not undertaken.
All the proposed implementation activities for the participating committees were completed as planned and there were only minor changes to the plan for organisation-wide roll-out. Time constraints prevented the project team from delivering demonstrations of the EDS in Monash Health public places, and icons were not placed on all computers.
The barriers and enablers identified in the evaluation are discussed as factors influencing the processes and outcomes below and in Section 7d.

Participants
Individuals (survey): Forty-six paper-based surveys were sent and 24 were returned. The four participants who selected 'Other' came from the Quality Unit, Corporate Office and Research Nursing, and one described their role as a project officer.
A large proportion of respondents were Allied Health staff.
Due to the small numbers of overall respondents this may not be representative of the EDS user population. The majority of survey participants were located at Clayton.

How participants received EDS information
As an email (full bulletin): 20
As an RSS feed (selected topics delivered to inbox or browser): 4
Total participants: 24

The majority (83%) of participants received information from EDS as a full email bulletin.
Groups (interviews): The EDS engaged with three decision-making committees (Medication Safety, Clinical Risk and Therapeutics Committees). One committee representative participated in a face-to-face interview, one in an email interview, and one provided feedback directly as they were also a member of the EDS team.

Reach
The EDS attracted 70 active subscribers during the evaluation period.
The statistics generated by Wordpress.com suggested that users accessed EDS via the Intranet 182 times. The most clicked links included the resource page (19 clicks), the CCE internet homepage (18 clicks) and the CCE email query link (11 clicks). A total of 134 full text articles were accessed via the EDS website.
Access to the EDS website was variable over the 10 months of activity. Although the EDS was set up for users to access information via an email or RSS feed, it is encouraging that users were still visiting the site. The reasons for the peaks and troughs in access are unclear. The high peak in the first month may reflect access by the project team while sorting out initial teething problems, or extra interest from new users that was not sustained. Limitations of the software meant that we could not separate 'users' from 'the administrator' (CCE staff).
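The inability to separate 'user' from 'administrator' traffic is a generic web-analytics problem. As a purely hypothetical sketch, if raw access-log records with visitor addresses had been available (they were not under the free Wordpress.com statistics; the field names and addresses below are invented for illustration), administrator visits could simply be excluded before counting:

```python
# Hypothetical sketch: excluding administrator traffic from access-log
# records before counting user visits. Field names and IP addresses are
# invented; the free Wordpress.com statistics did not expose this detail.

ADMIN_IPS = {"10.0.0.5", "10.0.0.6"}  # assumed addresses of EDS/CCE staff

visits = [
    {"ip": "10.0.0.5", "page": "/resources"},    # administrator
    {"ip": "203.0.113.7", "page": "/"},          # user
    {"ip": "203.0.113.9", "page": "/resources"}, # user
]

# Keep only visits whose source address is not a known staff machine.
user_visits = [v for v in visits if v["ip"] not in ADMIN_IPS]
print(len(user_visits))  # 2
```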
All three committees participated.

Satisfaction
Survey participants were asked to rate their overall satisfaction with the EDS. The majority (21/24) of participants were either 'partially' or 'very' satisfied with the EDS overall.

Content
Survey participants were asked whether the amount of information provided by the EDS met their needs; 8/24 found the website content useful, 12/24 the email alert and 2/24 the RSS feed content. The main message from participants' feedback was that a significant amount of non-specific information was being sent to users, resulting in a time-consuming activity for participants who trawl through each piece of evidence. Participants who answered 'partial' or 'no' provided the following feedback:
• "Probably too much irrelevant stuff (I am not sure whether I selected the correct options when I subscribed)"
• "The amount of emails I receive is quite large and trawling through them is time consuming. I don't have much time to attend to articles"
• "Would be good to group into medical, nursing, allied health specific info if relevant"
• "A lot of irrelevant information - not much specific topical info"
• "Very little information provided for medication safety that was relevant, however this may be a reflection of the lack of evidence for medication safety related topics"
• "It isn't specific like the BMJ Evidence email service. I do not want to know about articles that are not relevant to my practice"
• "So much unfiltered and irrelevant"

Committee representatives were asked about the usefulness of the content of EDS alerts. The responses were:
• "A lot of information that wasn't particularly relevant...I just don't need RCTs but other published articles are also helpful".
• "Too clinical" and "rarely helpful or useful".
• "Providing the correct kind of information" and "hitting the mark of what you would expect from an evidence dissemination service".

Format
The EDS categorises information by healthcare setting, type of technology, professional specialty and special interest groups. Most (17/19) respondents found the EDS categories useful; the categories helped them to quickly determine whether the information was relevant and prevented them from looking at irrelevant information. Those who did not find the categorisation useful had not noticed the feature.

Usefulness of EDS categories: Yes 17, No 2, Missing 5 (Total 24)

The following explanations were provided by participants who found the categories useful:
• "Although would prefer more specific ones"
• "It helps me quickly realise what info is useful to me"
• "Allows quick browsing"
• "Generic covered most areas"
• "Useful so you don't have to sort through irrelevant information"
• "I focus more on the topic presented, not the category"
• "But would like more around current policy environment such as food and nutrition interventions, medicare locals"

The following explanations were provided by participants who did not find the categories useful:
• "Probably would be useful - wasn't aware of this feature"
• "I have never noticed the grouping before"
• "Need to be more specific"

Committee representatives were also asked about the format of the EDS alerts. The responses were:
• "The layout of the alerts was easy to read and OK" but "the abbreviations were a bit confusing eg SR".
• "The summary of findings was very good" and the "volume of material...fine".
• "The format is clear and relevant".

Quality
The majority (22/24) of respondents found the information provided by EDS to be current, trustworthy and useful or partially useful. Participants who responded 'partially' or 'no' to any one of the options indicated that the information provided was not relevant to their practice:
• "Too much irrelevant information"
• "A lot of the information I receive is of little or no use to my practice. Although some items are quite interesting"

Recommended improvements
Six of the 24 survey respondents provided the following suggestions for how the service could be improved:
• "As discussed, further alerts about current health policy environment or health interventions"
• "More categories to be able to focus in on relevant information"
• "Provision of services related to a specific area eg can you please provide relevant research/evidence related to...would be most helpful"
• "Make it clearer with regards to allied health related content"
• "Categories more specific - although realised I should check the website which I will do"
• "It might just be me - need to refine the subscription to make things more relevant"

Accessing EDS content
The majority of survey respondents 'always', 'often' or 'sometimes' browsed email alerts or RSS feeds for interesting items (22/23) and followed links to full-text for items of interest (17/20). A considerable proportion of survey participants did not know they could browse the EDS website for interesting items (11/24), follow links to full-text for items of interest from the website (9/24) or search the website by categories (11/24). The three committee representatives looked at the EDS alerts they received and screened them for relevance to their respective committees.

Use of EDS in decision-making
Less than half (9/24) of the participants had used EDS to guide decision-making; these included formulary decisions, to confirm ideas about certain interventions or to update clinical knowledge.
The main reasons participants had not used the EDS to guide decision-making (15/24) were that they had not had time to read the full articles or that no information relevant to their clinical setting had yet appeared.

Use of EDS in decision-making: Yes 9, No 15, Missing 0 (Total 24)

The following comments were provided by participants regarding how they used EDS in decision-making:
• "Confirm ideas/interventions"
• "Formulary decisions"
• "Only by passing info to medical staff"
• "I ensure my clinical knowledge is up to date and look for further or stronger evidence in key areas"
• "Have used info to add to other research"

The following comments were provided by participants regarding why they did not use EDS in decision-making:
• "Often don't have time to explore further"
• "I often don't have time to read the full articles but if I did it would affect decision-making"
• "Not as yet, I have only been a recent subscriber"
• "I haven't been able to obtain any relevant information to assist decision-making yet"
• "Nothing has been appropriate for me in decision-making but I have seen info which would be useful to others"

Committee representatives reported that no information from the EDS had been discussed and acted upon at meetings.
The Medication Safety Committee representative found that the evidence in the alerts was rarely helpful or useful for their committee. It was too clinical for their area of interest and did not match the committee's areas of concern, so no information was presented to the committee.
The Clinical Risk Committee representative noted that much of the information was not particularly relevant; however, they were happy to screen it and choose areas of interest to the committee. This representative had experienced problems receiving the customised alerts and therefore found it difficult to find relevant information to pass on.
The Therapeutics Committee did not discuss any material at their meeting because the representative and chair of the committee decided that no information was relevant.

Use in decision-making for resource allocation
Only one respondent had used EDS to inform decision-making for purchasing clinical consumables, none for purchasing clinical equipment, and nine for clinical practice change. One participant commented that their non-use was related to the fact that their area of practice was not represented often and that there was a lot of medical and drug information that did not apply to them.

Half (12/24) of respondents felt they would possibly or definitely use EDS to inform decision-making for purchasing clinical consumables and for purchasing clinical equipment. The majority (22/24) said they would use EDS to guide decision-making for clinical practice change in the future. Reasons given for future use or non-use included the following:
• "I am not involved in clinical practice"
• "Not for me but maybe for others"
• "Would always pass on relevant info to relevant medical staff"
• "Possibly clinical practice change if I have time to read and evaluate the evidence"
• "I don't have control over any budget/purchasing"
• "Depends on the information available"

It was interesting to note that, of the participants who had answered 'no' (22/24) to using the EDS in decision-making for purchasing clinical consumables, 11/22 said they would, or possibly would, use it in the future. Of the participants who answered 'no' (23/24) for purchasing clinical equipment, 12/23 said they would possibly use it in the future. Of the participants who answered 'no' (15/24) for clinical practice change, 13/15 said they would, or possibly would, use it in the future.

Implementation
Implementation activities were undertaken for two separate target audiences: all Monash Health staff and the targeted committees. The EDS Manager was responsible for coordinating implementation activities and other EDS project members were responsible for providing technical support to users.
The majority (4/6) of the activities for implementation across the organisation were undertaken as planned.
Advertisements were included online and in print and were promoted on the Monash Health Intranet and specific staff portals. Due to time constraints, the project team was unable to undertake demonstrations of the EDS for specific groups or in Southern Health public places.

Proposal | Achieved | Outcome
Place ad in CE's newsletter | ✓ | One ad was placed when the EDS disseminated its first alert.
Adverts disseminated in the form of flyers via email, eBoards and notice boards | ✓ | Flyers were placed across all campuses in public areas as well as within departments.
Demonstrations of EDS (in public places or for specific groups eg registrar meetings) | x | Time restrictions meant that this activity was not undertaken.
Launch newly modified and updated website across the SH intranet site (also including a message about the brief survey) | ✓ | A logo and brief description were posted on the front page of the Southern Health intranet and permanently placed in the side bar.
Investigate the possibility of putting an icon on all SH computers | x | Icons were not placed on all computers.

All activities for implementation with the target committees were undertaken. The EDS Manager met with the three committee representatives and discussed all elements of the EDS with them. Further work could have been undertaken with the committee representatives to identify potential barriers and enablers to using the EDS.
• One user suggested they like the information delivered by BMJ Evidence Updates because it is relevant to their specialty. Although this is a different resource to the EDS, relevance of information should be considered a priority.
• Particular attention should be paid to demonstrating to users how to receive RSS feeds.

Use of EDS
• The EDS team should investigate other platforms to run the EDS. At the moment users can select only one specialty area, or the full alert, when choosing what information to receive. Users would like to be able to select more than one specialty so that emails are specific to their areas of interest, and Wordpress.com does not support this function.
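The single-selection limitation described above amounts to a simple set-intersection requirement. As a minimal sketch of the desired behaviour on a platform that did support it (the subscription and tagging structures below are assumptions for illustration, not Wordpress.com features), a user would receive an entry whenever any of the entry's specialty tags overlaps their selections:

```python
# Minimal sketch of multi-specialty subscriptions: a user receives an
# entry if any of the entry's specialty tags intersects their selected
# specialties. Data structures are illustrative assumptions only, not
# Wordpress.com features.

subscriptions = {
    "user_a": {"Cardiology", "Renal"},
    "user_b": {"Allied Health"},
}

entry_tags = {"Renal", "Transplantation"}


def recipients(subs, tags):
    """Users whose selected specialties overlap the entry's tags."""
    return sorted(user for user, chosen in subs.items() if chosen & tags)


print(recipients(subscriptions, entry_tags))  # ['user_a']
```

Allowing multiple selections per user in this way would address the survey feedback that broadcasts contained too much material irrelevant to each recipient's practice.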

Resource use -time and skills
• Participants: too much material, too busy, not all relevant, things they knew already, not new evidence or SRs finding lack of evidence, not important, etc.
• KB team: too many publications; cannot process all available.
• If we had followed our plan of getting department heads to do all the follow-up on local policies and protocols, this would have been very time-consuming, particularly for evidence that was not very important and which may already be documented practice for the organisation.
• We proposed that decision-makers appraise the information, check for policies and protocols, and report. Decision-makers do not want any additional work, and we know they do not have the time and skills to appraise; we could do that for them.

Not achieving aims
• Systematically disseminated but not systematically used.
• Not integrated into other decision-making processes; we tried with monthly reporting but it was too onerous.
• Not accountable or transparent.
• Those who did receive it were not always the appropriate decision-makers.
• We cannot be sure practice is evidence-based.
• Individuals may or may not have changed practice, or their own practice may already have been consistent so they did not need to change. SHARE was about a systemic approach, integrating new decision-making systems and processes into existing infrastructure for organisational impact. We needed a process that addressed organisational practice, not individual practice, and that was integrated into existing processes for determining organisational practice.
• We had identified, in a previous project, the designated groups and individuals who made decisions regarding resource allocation for TCPs; targeting them would be a better use of resources and more likely to achieve the aims.
• Use with caution: High risk of bias (few or no criteria fulfilled, or the conclusions of the study are likely or very likely to be affected) or Insufficient information (not enough information provided to be able to determine risk of bias).

Consistency with Southern Health documented practice
• Southern Health policies or procedures appear to be consistent with the evidence
• Southern Health policies or procedures do not appear to be consistent with the evidence
• No Southern Health policies or procedures on this topic were identified

Model 2 (Full implementation)
Drop-down boxes were added to the template so that only findings applicable to this publication are reported. The text incorporates the implications of bias in application of the evidence.

Quality of evidence
Quality of this Systematic Review or Health Technology Assessment CCE staff have appraised the methods used in this publication and found the risk of bias to be LOW. This means that you can use the findings of the review with confidence as all of the quality criteria have been fulfilled or where criteria have not been fulfilled it is very unlikely the conclusions of the study would be affected.

Quality of the evidence contained in this Systematic Review or Health Technology Assessment
The review authors have appraised the available evidence and found it to be Level I Evidence (a systematic review of Level II studies) of high quality.

Consistency with Southern Health documented practice
Southern Health policies or procedures appear to be consistent with the evidence

Section 19 Model 2 Evidence Bulletin template
This bulletin is part of a process to ensure that Southern Health practice is consistent with current evidence. Your response is required by the date below. You can find more information about this process on the TCPC website.
The publication below indicates evidence of [Choose an item] related to ... Responses related to evidence of [Choose an item] are required within [Choose an item]. Please complete and return this bulletin to marie.garrubba@monash.edu by [Click here to enter a date].

Quality of this Systematic Review or Health Technology Assessment
CCE staff have appraised the methods used in this publication and found the risk of bias to be [Choose an item]. This means that you can use the findings of the review with [Choose an item].

Quality of the evidence contained in this Systematic Review or Health Technology Assessment
The review authors have appraised the available evidence and found it to consist of [Choose an item]. The available evidence included in the review is of [Choose an item].

Technology/Clinical Practice Committee Evidence Bulletin_164
Quality of Evidence

Quality of this Systematic Review or Health Technology Assessment
CCE staff appraised the methods used in this publication and found the risk of bias to be LOW. This means that you can use the findings of the review with confidence as all of the quality criteria have been fulfilled or where criteria have not been fulfilled it is very unlikely the conclusions of the study would be affected.

Quality of the evidence contained in this Systematic Review or Health Technology Assessment
The review authors appraised the available evidence and found it to consist of Level II Evidence (one or more randomised controlled trials). The available evidence included in the review is of variable quality.

Consistency with Southern Health documented practice
No Southern Health policies or procedures on this topic were identified.

Not applicable at Southern Health: 7
• Neuromodulators for pain management in rheumatoid arthritis (Potential Harm): 1. "The options mentioned in the conclusion are not available on our PBS, so useless for our patients"
• Botulinum toxin for the treatment of strabismus (Potential Harm): 1. "Botulinum toxin injection is not practised at Southern Health Ophthalmology Department"
• Eslicarbazepine acetate add-on for drug-resistant partial epilepsy (Clinical Effectiveness): 1. "The drug is not in use in Australia and it does not appear in the TGA database. It is not helpful to examine data relating to drugs/devices not available in this country."

Response
• Gonadotropin-releasing hormone agonist versus HCG for oocyte triggering in antagonist assisted reproductive technology cycles (Potential Harm): 1. "IVF not undertaken at Southern Health."
• Interventions for pregnant women with hyperglycaemia not meeting gestational diabetes and type 2 diabetes diagnostic criteria (Clinical Effectiveness): 1. The respondent reported this as 'Not applicable'; however, CCE would categorise this response as 'Not consistent with the evidence, remedial action commenced'. "The diagnosis and management of GDM and hyperglycaemia not meeting GDM guidelines is currently under national and local review. The Pregnancy Diabetes service at Southern Health has already initiated changes to current practice to conform to (new) ADIPS recommendations. The service is also completing on-going research to guide future practice."
• Cabergoline for preventing ovarian hyperstimulation syndrome (Clinical Effectiveness): 1. "Southern Health does not do IVF."
• Milnacipran for neuropathic pain and fibromyalgia in adults (Potential Harm).
• Naftidrofuryl for dementia (Clinical Effectiveness): 1. "Drug not available in Australia"
• Cognitive stimulation to improve cognitive functioning in people with dementia (Clinical Effectiveness): 1. "To my knowledge, specific interventions for patients with dementia, while ideal and what we aspire to, are very limited in the subacute inpatient setting (e.g. GEM) due to lack of resources and time."
• Short and long term effects of tibolone in postmenopausal women (Potential Harm): 1. "Most menopausal women use combined HRT. Select groups need tibolone due to low libido or abnormal bleeding on HRT."

Not consistent with the evidence, remedial action has been undertaken and completed

Not consistent with the evidence and remedial action has been commenced/planned: 1
• Perineal techniques during the second stage of labour for reducing perineal trauma (Clinical Effectiveness): 1. "This Cochrane Review will be looked at by the Maternity Guideline Development Group and existing practices reviewed"

Pilot objective
To test and refine the features of Model 2.

Characteristics of the pilot intervention
The scope, components and methods described formed the pilot intervention. Pilot activities were undertaken with a pragmatic sample of publications containing evidence of harm. A catalogue of disinvestment opportunities had been compiled to identify pilot disinvestment projects for investigation in the SHARE Program [79]. Publications with high quality evidence indicating harm published in the previous two years were selected.

Pilot implementation
The implementation strategies focused on integrating the new processes into existing Monash Health infrastructure and communicating with stakeholders.
The procedure for the new EDS processes was documented and a routine item for discussion of EDS matters was included in the TCPC agenda.
The Director of CCE/SHARE Director made presentations to the Executive Management Team, Medical and Nursing Executive groups, and met with clinical directors of all medical programs, allied health, pharmacy, pathology, diagnostic imaging and procurement. The Chair of the TCPC delivered a presentation to the Monash Health Board. All senior managers expressed their support for the proposed governance structure. A letter outlining the new process was sent to stakeholders by the Executive Director of Medical Services and Quality and a flyer was circulated to the 'All Staff' email list by the Chair of the TCPC.

Pilot evaluation
The stakeholders listed above were asked to provide feedback regarding the new processes, and templates for feedback were included at the end of the Evidence Bulletins.
An audit of responses was undertaken two months after dissemination of the pilot bulletins.

Reach
Six evidence bulletins indicating harm were forwarded by Program Directors to the relevant decision-makers (Medicine Program 3, Women's and Children's Program 1, Specialty Program 1, Critical Care Program 1).
Four out of six responses from decision-makers were received by the due date (one month after receipt). The others were received after reminders were sent. The average time to respond was 28 days.
Bulletins were received and returned by the appropriate decision-makers.

Usefulness
No feedback was received regarding 'what worked, what didn't work and how we can improve the new process'; one person said "Thanks" on the feedback sheet.

Use
Five responses indicated that practice was consistent with the evidence; the sixth reported that the practice was not undertaken at Monash Health. No action was required in these cases.
One respondent indicated that the evidence should be communicated to other programs and it was forwarded accordingly.

Implementation fidelity
There were no modifications to the planned intervention and it was implemented as planned.

Section 23 Model 2 Implementation flyer
Ensuring Southern Health practice is up-to-date The Technology/Clinical Practice Committee (TCPC) is introducing a new process to ensure that practice at Southern Health is consistent with current evidence.
The Centre for Clinical Effectiveness (CCE) has developed an Evidence Dissemination Service to capture high quality evidence as it is published. The TCPC will disseminate this to the relevant decision-makers, who will be asked to consult with colleagues and report back on any action required to align current Southern Health practice with the most up-to-date evidence.
The process has been developed to minimise your time and effort.
- Only synthesised information such as systematic reviews, health technology assessments and evidence based guidelines will be provided. You will not receive trials or other primary studies, editorials or opinion pieces.
- The synthesised evidence is retrieved from high quality sources and will be appraised by CCE staff so that you can be confident the information is trustworthy.
- CCE staff will compare the evidence with current policies and procedures. If Southern Health documentation is consistent with the evidence, you will be informed but no response is required.
- A response will only be required if there are no policies and procedures on this topic or if the current policies and procedures are inconsistent with the latest evidence.
- Action will only be required if current practice is inconsistent with up-to-date high quality evidence that is relevant and applicable to Southern Health.
- Responses will be required within an appropriate time frame. These have been determined to prioritise action to areas of greatest risk to patients, staff or the organisation. Where there is:
  - evidence of harm, a response will be required in 1 month
  - evidence of benefit, a response will be required in 3 months
  - evidence of a more cost-effective alternative, a response will be required in 3 months
  - evidence of lack of effect, a response will be required in 6 months
  - lack of evidence, the publication will be provided for information only; no response is required
The new process will be implemented as a pilot. Your input and suggestions to improve the methods and materials are welcome and encouraged. Please direct your feedback and any questions to:

Model 2 Evaluation of full implementation
The EDS was discontinued prior to implementation of the planned evaluation activities; however, data were collected for the first seven-month period and audited to meet reporting requirements.
Fifty-two bulletins required a response; however, three contained information pertaining to two executive or program portfolios, making the total number of responses required 55. The remaining 123 publications were disseminated to 182 recipients for information only.
Of the 55 requiring responses, the Medicine Program and Women's and Children's Program received the most (n=16, 29% each), followed by Specialty (n=11, 20%), Surgery (n=5, 9%), Mental Health (n=3), Critical Care (n=2), and Emergency and Ambulatory Care and Other (n=1 each). A collation of 56 relevant bulletins was provided to Pharmacy, four to Diagnostic Imaging, two to Pathology and 156 to other programs and departments for their information.
Fifty-one of the 55 responses were due at the time of data collection; four were due in the following month. Forty-three had been received, nine were overdue and three were pending.
Dissemination to the correct recipients was not formally assessed, however responses indicated that bulletins were received by the appropriate decision-makers.
Six of the 43 respondents recommended that the bulletin be forwarded to others, including five internal departments, the Divisions of General Practice, and health professionals across the organisation; one did not specify the distribution.

Usefulness
Respondents reported that local practice was consistent with the evidence (n=32, 74%), the evidence was not applicable at Monash Health (n=6), local practice was not consistent with the evidence for a good reason (n=3), and changes to make practice consistent with the evidence had been commenced or were planned (n=2).
Evidence was not applicable to the Monash Health setting because the practices were not undertaken (n=4) or the specified drugs were unavailable in Australia (n=2). The three good reasons given for local practice being inconsistent with the evidence were a drug that was also unavailable in Australia, a lack of resources and time to implement the proposed interventions, and undertaking the practice but restricting it to a specific patient group who were unable to receive the alternative treatment.
Many respondents included comments and feedback in the free text sections of the bulletins. Five offered positive comments, welcoming future bulletins. Although respondents were not specifically asked to comment on usefulness, many suggested it was not "useful", "helpful" or "valuable" to consider evidence that they were already aware of, that was consistent with current practice, or that addressed drugs that were not locally available.

Use
The 43 respondents had clearly read and understood the bulletins, and had used the bulletins to assess whether current practice was consistent with the evidence.
Given that the aim of the EDS was to use evidence proactively to drive decisions, 'use' in this context could be interpreted as leading to practice change. Two decision-makers noted that local practice was not consistent with the evidence. One department had already "initiated changes to current practice to conform to the recommendations", and the other had tasked their guideline development group to address the inconsistency.
Bulletins could also be 'used' to confirm that current practice does not need to be changed, but the 'usefulness', cost-effectiveness and impact of the resource use in achieving this were questioned in respondents' feedback and in project team and committee reflections.

Resources
Delivery of the EDS was undertaken by:
- the EDS Administration Officer (approximately two days per week to capture and process publications and develop bulletins; three days per month to prepare reports and documents for TCPC meetings and attend the meetings)
- the CCE Director (approximately half a day per week to review processes and bulletins; one day per month to prepare for and attend the TCPC meetings)
- the TCPC Chair (approximately half a day per month to consult with EDS staff and review publications for local applicability)
- the TCPC members (approximately 30 minutes per month discussing EDS issues).

Implementation fidelity
There were two major modifications to the planned intervention, both due to resource limitations. Three months after implementation, the scope was revised to focus only on evidence in areas of high priority to the organisation. Including evidence of harm was essential for patient safety, and adding evidence of cost-effectiveness and lack of effect would complement current Monash Health initiatives ascertaining examples of more cost-effective alternatives and identifying organisational waste in clinical and corporate practices. Only publications with evidence in these three areas would be appraised prior to dissemination and require a response; publications with evidence of clinical effectiveness or methodological effectiveness, or with a lack of evidence, were provided for information only. Three months later, the EDS was suspended altogether due to limited capacity within CCE.
There were no changes to the implementation plan; barriers and enablers are discussed with the factors influencing processes and outcomes below.

Applicability to Southern Health

Patient / Population
Persons working in the operation theatre who are exposed to the risk of percutaneous injuries with suture needles.
N: 2961 participating surgeons. Setting: UK, US, Germany, Italy, Ireland, Netherlands. Four studies focused on abdominal closure, two on vaginal repair and two on hip replacement.

Intervention
Blunt suture needles (we defined blunt needles as suture needles that have a rounded blunt point and that are circular in diameter and that can be either curved or straight).

Comparison
Sharp needles (sharp needles are suture needles that have a tapered point and that can be either circular in diameter or square with cutting edges and that can be either curved or straight).

Outcomes Primary
Exposure of healthcare workers to contaminated blood or bodily fluids was our primary outcome measure. Exposure can be observed either as self-reported needle stick injury or glove perforations.

Secondary
We included satisfaction with, or ease of use of, the needles.

Inclusion Criteria
Randomised clinical trials (RCTs) and cluster-randomised trials (c-RCTs). "Persons working in the operation theatre" "Blunt suture needles (rounded blunt point that are circular in diameter and that can be either curved or straight) compared to sharp suture needles (tapered point, can be circular in diameter or square with cutting edges and can be curved or straight)"

Exclusion Criteria
Intervention was a needle handling device and not a blunt needle, study not randomised or controlled.

Quality of this Systematic Review or Health Technology Assessment
CCE staff appraised the methods used in this publication and found the risk of bias to be LOW. This means that you can use the findings of the review with confidence, as all of the quality criteria have been fulfilled or, where criteria have not been fulfilled, it is very unlikely that the conclusions of the study would be affected.

Quality of the evidence contained in this Systematic Review or Health Technology Assessment
The review authors appraised the available evidence and found it to consist of Level II Evidence (one or more randomised controlled trials). The available evidence included in the review is of high quality.

Consistency with Southern Health documented practice
No Southern Health policies or procedures on this topic were identified.