Challenges and opportunities for bioimage analysis core‐facilities

Recent advances in microscopy imaging and image analysis motivate more and more institutes worldwide to establish dedicated core-facilities for bioimage analysis. To maximise the benefit research groups at these institutes gain from their core-facilities, the facilities should be set up to fit well into their respective environment. In this article, we introduce common collaborator requests and corresponding services core-facilities can offer. We also discuss potential competing interests between the targeted missions and the implementation of services, to guide decision makers and core-facility founders around common pitfalls.

Bioimage analysis core-facilities sit more on the applied-science side and focus on translating algorithms into user-friendly software that scientists less proficient with computational methods can apply to answer research questions. Their names, along the lines of 'bioimage analysis core-facility', 'bioimage analysis hub' or 'bioimage analysis technology development group', highlight that these groups are not pure computer-science research groups. While the idea of establishing computationally oriented core-facilities in biology research institutes is not new,1 pure bioimage-analysis-focused core-facilities remain rare. Reasons to found bioimage analysis core-facilities are manifold and decision makers are increasingly reaching similar conclusions. The German BioImaging society currently lists 41 groups from various institutions having image analysis among their services.1 In this article, we introduce the demands calling for such groups to be set up and potential services these groups can offer. We outline typical scenarios and potentially arising competing interests between provided services and the missions of a core-facility.
The paths towards the establishment of core-facilities are manifold and depend strongly on the involved institutions, funding bodies and the needs of the respective local community. Hence, this manuscript aims to provide guidance for founders, managers and employees of bioimage analysis core-facilities who wish to advocate and structure their activities and offered services. This manuscript is written from the perspective of our experience on the Dresden Life-Science Research Campus.

MISSIONS OF BIOIMAGE ANALYSIS CORE-FACILITIES
Multiple needs that are apparent in the life sciences can be addressed within the scope of bioimage analysis. Often-received requests from wet-lab scientists revolve around programming custom workflows or guiding collaborators towards the right workflow for image analysis, which should be considered an integral part of the experiment itself. Such requests obviously call for wide-ranging expertise. Consequently, peer-to-peer requests for bioimage analysis tasks typically receive more attention and interest from collaborators. However, institutional decision makers also demand other, less visible, yet critically important services to be offered to the core-facility's user base. These services aim at the sustainability of services, infrastructure maintenance and the long-term operation of the group. In this section, we outline these needs. We start with the most common requests and transition towards the more invisible tasks that also need to be covered.

Having image analysis expertise on site
Likely the most common request we bioimage analysts at the Cluster of Excellence 'Physics of Life' at TU Dresden/Germany see is the demand for a custom image segmentation workflow. Workflows are assemblies of reusable image analysis components serving a specific task.2 To design them, the workflow designer needs experience with a large number of relevant components and general methodological knowledge. Moreover, the workflow should be developed in a way that it can be executed reproducibly on the collaborator's computer and potentially later on computers of reviewers, readers and others who aim at using the same strategy in their own research. The three general strategies for distributing written code are scripts, plugins and containers. Scripts are single pieces of code instructions with little or no information on how or in which environment they ought to be used. Plugins or packages embed such scripts in a larger structure that adheres to common conventions; the latter are determined by the used framework (e.g., Fiji3 or napari4). They contain precise information regarding code dependencies and can contain variable amounts of documentation. Lastly, containers can ship not only the used functionality, but the entire required codebase, and can run independently of operating systems and locally installed software. Depending on whether the workflow is provided as script, plugin or container, different levels of software engineering and deployment expertise are required. Finally, the goal of the analysis needs to be critically reviewed: Is image segmentation necessary for answering the scientific question? A common mistake by early career scientists in the field is focusing on improving the image segmentation strategy, thus losing the focus on answering the biological question. Computational experts are necessary to guide life scientists through the entire workflow and afterwards identify potential bottlenecks. Such tasks and requests are best addressed with computational and bioimage analysis experts on site.5 Depending on the diversity of tasks and needed expertise, single individuals may not be able to fulfil all needs. Hence, there might be a need for dedicated bioimage analysis teams.
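To make the script end of this spectrum concrete, the following is a minimal, hypothetical segmentation workflow assembled from three reusable components (smoothing, thresholding and connected-component labelling); the synthetic input, function name and threshold value are illustrative assumptions, not part of any published pipeline.

```python
import numpy as np
from scipy import ndimage

def segment_nuclei(image, sigma=2.0, threshold=0.5):
    """Minimal workflow: smooth, threshold, label connected components."""
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma=sigma)
    binary = smoothed > threshold * smoothed.max()
    labels, count = ndimage.label(binary)
    return labels, count

# Synthetic test image with two bright blobs standing in for nuclei.
image = np.zeros((64, 64))
image[10:20, 10:20] = 1.0
image[40:52, 40:52] = 1.0

labels, count = segment_nuclei(image)
print(count)  # two separated objects -> 2
```

A script like this answers one question on one dataset; packaging the same function as a plugin or shipping it in a container is what turns it into the reusable, reproducible artefact discussed above.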

Knowledge conservation and incubation
A major strategic aim of core-facilities is to serve as a knowledge incubator. Experts working in such a group are expected to exchange knowledge with each other, the local community and other international experts. This aims to prevent developed workflows from being forgotten and reinvented later. A successful strategy for this is key towards sustainable operation of the group. If experts manage to establish standardised workflows and components, future research projects can be executed more efficiently, giving the staff time to develop advanced tools for upcoming challenges.6 Staying up-to-date in the rapidly developing field of bioimage analysis is challenging. For example, deep learning and cloud computing are expected to have a major impact on how computational resources are used in projects in the next 5-10 years.7 Dedicated actions need to be undertaken by service-providing staff to determine which of the upcoming techniques are relevant for the institute, fulfil certain quality criteria and fit into the available computational infrastructure on campus.

Software maintenance
Similar to knowledge conservation, developed software needs to be conserved and maintained mid- to long-term for the benefit of its users and the institute.8 This includes efforts to ensure that software remains functional in the context of an evolving software ecosystem. Moreover, maintenance efforts undertaken by individuals, groups and institutes ensure a degree of control over the code base of strategic, regularly used software components. Software projects such as CellProfiler,9 ilastik,10 QuPath11 and Fiji3 serve as examples that play an important role not just at the institutes where they were developed, but also internationally.

Standardisation
As the distinct field of bioimage analysis is quite young, standardisation of common procedures such as reporting,12-14 benchmarking15,16 and file formats17,18 is still emerging. Standardised workflows for common tasks such as nuclei segmentation or cell shape measurements are currently not widely established. A possible exception is medical imaging and clinical data, which are typically acquired in highly standardised procedures, rendering analysis feasible according to strict standard operating procedures.19 The degree of standardisation present within an institute or field introduces a trade-off between flexibility and comparability. Depending on research questions and input data type, standardised workflows can deliver comparable results that may, however, not yield the right answers. On the other hand, the results of custom workflows are difficult to compare, but they can be tailored to the respective project. A core-facility can choose to establish local standards with limited scope even in the absence of common standards. Such local standards allow avoiding redundant efforts in subsequent projects with similar aims and enable comparison of quantitative analysis results between projects.

Research data management
Computational science relies on research data management (RDM). Crude data storage and management solutions using custom folder structures on the individual or group level need to be left behind to enable cross-group RDM standardisation and reap the benefits of well-organised research data, even more so as the publication of biological imaging data is expected to further advance the field as a whole.20 Journals and funding agencies increasingly require RDM standards. While technical infrastructure for managing biological microscopy imaging data has been developed in the last decade,21-23 human resources are necessary for maintaining this infrastructure and training users. A guide towards state-of-the-art research data management is given, for example, by the FAIR principles:24 image data should be findable (F), accessible (A), stored in interoperable (I) file formats and organised in a way that it can easily be reused (R).
The FAIR principles were also recently reformulated for research software.25 It appears natural for this responsibility to be given into the hands of bioimage analysts, as they profit from well-organised imaging data. This can simplify data analysis workflow design and allow answering questions that were out of reach before.
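As a sketch of what FAIR-oriented annotation can look like in practice, the following hypothetical example stores a minimal machine-readable metadata record alongside an image dataset; the field names are illustrative assumptions, not a community standard (real projects should follow established schemas such as the OME data model).

```python
import json

# A hypothetical, minimal metadata record accompanying an image dataset.
# Field names are illustrative only, chosen to mirror the four FAIR aspects.
record = {
    "identifier": "dataset-0042",                            # findable: stable ID
    "access_url": "https://example.org/data/dataset-0042",   # accessible
    "format": "OME-TIFF",                                    # interoperable: open format
    "license": "CC-BY-4.0",                                  # reusable: clear terms
    "pixel_size_um": 0.65,
    "channels": ["DAPI", "GFP"],
}

serialised = json.dumps(record, indent=2)   # would be written next to the data
restored = json.loads(serialised)
print(restored["format"])  # -> OME-TIFF
```

Even such a simple sidecar file makes datasets searchable and self-describing long after the person who acquired them has left the institute.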

SCIENTIFIC SERVICES
The above-mentioned missions can be tackled by offering specific scientific services. We outline what bioimage analysis core-facilities can offer to satisfy the introduced needs and explain relationships between these services. Figure 1 shows a radial enumeration of possible services and categorises them in accordance with the introduced missions.

Consulting and cosupervision
Every collaborative project in the bioimage analysis context should start with an open-door consultation. Conducting such a session according to defined guidelines can help to synchronise consulting activities within the facility and outline further activities. Furthermore, regular open-desk hours, email hotlines and public online forums such as the Scientific Community Image Forum (https://image.sc)26 can be suitable channels for this kind of interaction.
In general, it is recommended to start discussions about bioimage analysis early in the project. In case image data appears unsuitable for quantifying specific biological properties, modifying the sample preparation and imaging procedures might be required. We also recommend scheduling follow-up appointments to make sure the project develops in the right direction.

Data analysis as a service
Another potential strategy is pursuing data analysis as a service. A data producer ships entire datasets to a bioimage analyst, who can then work with the data independently. Such a service assumes a shared understanding between data provider and analyst about which tasks are to be conducted. Consequently, it is reasonable to provide such services in case scientific questions can be answered using workflows and techniques that are highly standardised. Bioimage analysts working in such environments are typically also more specialised. A freelancing consultant who works for multiple institutes can focus more on a specific topic than generalists working within a single institute. This can, for example, be the case in the pharmaceutical domain, where workflows and used assays are highly standardised. It may not be possible to provide such a service in exploratory bioimage analysis projects, which typically demand a high level of interaction between data provider and bioimage analyst, and flexibility when designing a bioimage data analysis workflow.

Data stewardship
Managing data and software can be a challenging task, especially in the life sciences, which face ever-increasing amounts and sizes of acquired image data. This challenge was encountered in the bioimage analysis context early27-29 and is increasingly being acknowledged by funding bodies and publishers, which commonly require robust RDM plans and their sustainable implementation for funded projects. Consequently, frameworks for research data management (e.g., OMERO21), file formats30 and standards for metadata annotation14 have been developed by the community to address these challenges. This led to the concept of data stewardship, intended to guarantee RDM standards across institutes and the streamlined application of recurrent workflows. Data stewards help groups to comply with common RDM standards. This can be implemented on different levels: On a strategic level, data stewards can help to formulate management guidelines and policies. On an intermediate level, data stewards can be tasked with setting up suitable hardware and software infrastructure; this should be done in close collaboration with the institute's IT support team. Lastly, data stewards can help in a consultation setting to guide collaborators towards using the existing resources effectively. While bioimage analysts can serve all these roles, their expertise lies close to the image data and the associated biology. Hence, the expertise of bioimage analysts is best used by consulting them on how to manage data.
If the institute is new to having data stewards, a first step could be conducted as consulting. From our perspective, many research groups struggle with defining their own folder structures and RDM plans. Thus, dedicated experts can guide scientists in managing their data. After such basic consulting strategies have been established, a second level of support appears worthwhile from an institutional decision maker's perspective: A data steward or RDM officer role can be established at the institute. Due to their responsibilities, these individuals may report directly to the board of directors and have the duty to establish common procedures at the institute, including rules for implementing standards such as the FAIR principles. Given the agreement of the producers and owners of the data, this officer can have a special right: They may be allowed to browse research data of groups at the institute and take a more active role in guiding the scientists, for example, contacting them directly in case they see violations of established rules. As this role comes with substantial responsibilities regarding data protection, safety and integrity, it is recommended to hire these persons in permanent positions. Needless to say, proven long-term experience in handling research data in a responsible fashion has to be a key requirement for such positions.
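On the tooling level, a data steward might automate simple compliance checks against agreed rules. The following sketch validates folder names against a hypothetical local naming convention; the convention, function name and example folders are assumptions for illustration only.

```python
import os
import re
import tempfile
from pathlib import Path

# Hypothetical local convention: project folders named YYYY-MM-DD_group_project.
PATTERN = re.compile(r"^\d{4}-\d{2}-\d{2}_[a-z]+_[a-z0-9-]+$")

def check_folders(root):
    """Return the names of subfolders that violate the naming convention."""
    return [p.name for p in Path(root).iterdir()
            if p.is_dir() and not PATTERN.match(p.name)]

# Demonstration on a temporary directory with one compliant and one
# non-compliant folder.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "2024-05-01_smithlab_screen-a"))
os.makedirs(os.path.join(root, "my_old_data"))
print(check_folders(root))  # -> ['my_old_data']
```

Run periodically, such a check lets the steward contact groups about violations early, rather than discovering unstructured data at publication time.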

IT infrastructure maintenance
The maintenance of software and hardware infrastructure is a common task at institutes where computational methods are used frequently.

Workstations: Dedicated hardware infrastructure, such as image processing workstations, is common equipment at research institutes. If the computational hardware is partially under the control of bioimage analysts, unique solutions exploiting the available hardware can be developed. For example, if the institute has graphics processing units (GPUs) installed in a number of image processing workstations, software libraries that can harness the computational power of GPUs can be used.31 In advanced scenarios, it is reasonable to maintain GPU-accelerated image processing software and hardware infrastructure for the benefit of the institute. Moreover, localising workstations, potentially with commercial software, in central, open, makerspace-like environments sparks collaboration and induces a culture of cooperation and knowledge exchange among scientists across groups and even interdisciplinarily.32 If research groups decide to buy their own computers because booking centrally administered workstations is too expensive, the benefits of shared infrastructure cannot be exploited.
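One reason such GPU investments can pay off with little software effort is that some GPU libraries mirror the NumPy API. The sketch below runs on the CPU as written; assuming CuPy is installed on a GPU workstation, the same function body runs on the GPU by swapping the import. The function itself is a trivial illustration, not an institute-specific tool.

```python
import numpy as np  # swapping this for `import cupy as np` runs the identical
                    # code on the GPU, since CuPy mirrors the NumPy API

def normalise(image):
    """Rescale image intensities to the [0, 1] range."""
    img = np.asarray(image, dtype=float)
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo)

scaled = normalise(np.array([[10, 20], [30, 50]]))
print(scaled.min(), scaled.max())  # -> 0.0 1.0
```

Maintaining such library choices centrally means the core-facility, not every research group, decides when and how GPU acceleration is adopted.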
High-performance computing: The maintenance of high-performance computing (HPC) resources is typically beyond the scope of a bioimage analysis core-facility. Nonetheless, its staff should engage in frequent dialogue with the institute's IT support team. This ensures that HPC infrastructure meets the demands of the life science community and is actively used to support computation-heavy projects. In case of a narrow research focus, specialised imaging and IT hardware can be purchased for the optimal operation of the core-facility. It may furthermore make sense to have joint groups providing imaging and image analysis services. In such groups, infrastructure can be optimised towards unified workflows ranging from sample preparation, through standardised image acquisition, to established image analysis in high-performance computing infrastructure.

Software development and maintenance
The development and maintenance of software can be considered the digital counterpart of infrastructure maintenance. Bioimage analysis experts in core-facilities are often the only programming experts at biological research institutes. Thus, they may be responsible for developing software and maintaining it. There may also be dedicated research software engineering positions at the core-facility for this purpose.
Development: In general, we differentiate two kinds of software development projects: (1) custom workflow development for specific projects and (2) reusable component development for multiple projects. Both can be conducted in a consultation setting or by bioimage analysts taking a more active role in a project, for example, by transforming a custom script developed by a collaborator into a reusable component. The software development process could begin as part of a first exploratory project aimed at programming a custom script for counting cells together with a single collaborator. After receiving requests for similar tasks from a second and third collaborator, the written code can be refactored into a reusable component, for example as a plugin for a common image analysis framework such as Fiji or napari.3,4 The first strategy, custom script development, allows bioimage analysts to become coauthors on specific research publications and demonstrate their wide-ranging skills in their curriculum vitae. The second strategy, development of reusable components, can lead to independent research projects where the software developers become first authors and the collaborators who provided research data and scientific questions can become coauthors. Either way, the collaboration is of mutual benefit.
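The refactoring step from one-off script to reusable component can be sketched as follows; the function, parameter names and defaults are hypothetical illustrations rather than an existing plugin.

```python
import numpy as np
from scipy import ndimage

# One-off script version: hard-coded threshold, not reusable elsewhere.
#   count = ndimage.label(img > 100)[1]

def count_cells(image, threshold, min_size=1):
    """Reusable component: count connected bright objects above `threshold`,
    ignoring objects smaller than `min_size` pixels."""
    labels, n = ndimage.label(np.asarray(image) > threshold)
    sizes = ndimage.sum(np.ones_like(labels), labels, index=range(1, n + 1))
    return int(np.sum(np.asarray(sizes) >= min_size))

img = np.zeros((32, 32))
img[2:6, 2:6] = 200      # a 16-pixel "cell"
img[20, 20] = 200        # a single-pixel speck
print(count_cells(img, threshold=100, min_size=4))  # speck filtered out -> 1
```

Exposing the hard-coded values as documented parameters is precisely what makes the code serve the second and third collaborator without modification.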
Maintenance: Maintaining software is a continuous effort. It must be sustained for software upon which scientists rely, in particular for periods of time beyond the duration of the contracts of the core-facility's members. However, while maintenance is of strategic importance, bioimage analysts are part of the scientific community and are consequently subject to continuously high pressure to deliver scientific contributions. Hence, besides diverting effort to the maintenance of projects of strategic importance, employment in a bioimage analysis core-facility needs to offer staff the opportunity to develop their own profiles, projects and careers. The goal is to provide bioimage analysts a path to stay in academia long term.
Both kinds of efforts, software development and maintenance, are necessary to strengthen the reputation of the core-facility within the local and the international research community. Efforts on the maintenance of software relevant to the host institute, but also to the international community, may appear costly to institutional decision makers. However, investing this effort can pay off in the long run: The development and maintenance of tools through a core-facility ensures that the software meets the demands of as many users on campus as possible, avoids reinventing the wheel and remains up-to-date. Core-facility members maintaining a software package also typically know it well and can teach it to the local community. As maintaining software and related knowledge can come at high cost, keeping the number of maintained and taught software applications small is key towards sustainable software and knowledge maintenance.7 Lastly, the supervision of software and hardware within the scope of a core-facility furthermore allows staff bioimage analysts to oversee the progress of projects as well as identify the possible need for further developments. Thus, deciding which software to develop and maintain should follow strategic goals set by the core-facility members in consultation with group leaders at the institute.

Networking and outreach
In order to use core-facility resources efficiently, effort should be devoted to internal and external networking.
Local: Most importantly, core-facility members must be well connected to all relevant actors on their local campus: Knowing the data analysis needs of the local research community is important for planning and organising all consulting, development, maintenance and teaching efforts. Core-facility members need to talk to local and internal algorithm developers to know what new tools are in the making and to guide their development towards scientific questions that are highly relevant for local collaborators from the biology side.
Cross-institutional: It is an obvious course of action to team up with colleagues from the same region and conduct consulting, development and teaching projects together. Nonetheless, core-facility members should strive for international activities, which have been shown to be highly efficient in teaching bioimage analysis and in establishing the profile of the bioimage analysts themselves.33 If core-facility members are involved in inter-institutional open-source software development projects, they can influence the direction of these projects towards the interests of their institute. This is a social task rather than a technical one: It is reasonable that core-facility members are involved in relevant international societies, such as the Network of European BioImage Analysts (NEUBIAS), and associations of microscopy core-facilities such as EuroBioimaging. Also, national and regional associations, in our case the German BioImaging society (GerBi) and the Biopolis Dresden Imaging Platform (BioDIP), are invaluable partners in the networking context.
In some research contexts, a higher degree of standardisation of image analysis tasks and acquired data is present, for example, in clinical research. There, such institutions can be established on a national and transnational level34,35 to synchronise efforts between participating institutions. This allows for analysing data in a standardised fashion at reduced cost and with better comparability. If the same data analysis workflow is applied to data produced in many research projects, the need for consulting collaborators is lower and fewer custom scripts need to be programmed. Efforts are currently underway to introduce such standards for bioimage analysis, for instance for image data storage.30 In the time of free online courses, open-access shared image data and image analysis software running on every computer, it is obvious that state-of-the-art image data science is not happening exclusively in wealthy countries. Networking activities enable us to work closely together with scientists worldwide, including community members from developing countries. Thus, we would like to motivate everyone to get involved in the NEUBIAS community, which aims to become a major player that can synchronise efforts worldwide and represent the community towards funding bodies, research associations and governments.

Teaching and education
Teaching and education pertain to all tasks regarding the distribution, incubation and conservation of knowledge. These are of particular importance in environments of high personnel fluctuation, such as academic research institutes where scientists only get nonpermanent positions.36 Activities include lectures, workshops and pair programming sessions. Each of these is suitable for a particular audience size, depth of teaching and focus regarding content.
Lectures: For large audiences, lectures on bioimage analysis techniques and programming basics are a suitable way to raise the overall proficiency on campus in a long-term and, thus, sustainable fashion. Naturally, scientists with basic knowledge in image analysis and programming will find it easier to analyse their own data. The content focus and level should be chosen to be of interest to a large target audience and provide entry points of expertise to the attendees. Since lectures may be part of universities' bachelor and master programmes, their content should also be standardised in an international context. Bioimage analysis textbooks can play a key role in defining these standards.37,38

Workshops: For smaller groups of collaborators with a narrower area of expertise, single-day or multiday workshops represent a suitable format of education in more constrained topics. Bioimage analysts can teach a single research group on topics that are relevant to its specific needs. This strategy is more time-efficient than lectures at a university, which take an entire semester to complete. On the other hand, customising teaching materials for such workshops also comes at higher cost. Thus, while lectures in universities can be attended for free in many places, workshops could be financed through fees and contribute to the sustainable long-term operation of a bioimage analysis core-facility. Advanced train-the-trainer workshop formats help to improve the knowledge transfer between attending individuals and experienced, trained-in-teaching staff.39 Selecting participants strategically ensures a lasting incubation of knowledge within the participants' groups and institutes by empowering them to relay the principles of learned skills to a broader audience, a task which would be difficult for members of a bioimage analysis core-facility to shoulder alone. The focus of any offered workshop should be chosen in alignment with the research interests of the intended target audience.
Pair programming: Pair programming is an established technique in the computational sciences.40 An expert in programming and a collaborator meet to work together on one screen for some time. Typically, the nonexpert types while the expert speaks out what they would do if they were alone. In longer sessions, it is also recommended to switch roles regularly. This very involved knowledge transfer technique allows for boosting the education of collaborators and advancing a particular research project at the same time. It is an effective method for introducing collaborators to the specific tools they need to achieve their particular goals and acquire bioimage analysis expertise. Depending on the degree of involvement in the project, the expert should be considered for cosenior authorship, similar to the consulting strategy introduced above. Technically, pair programming is hands-on consulting.
Job shadowing: When substantial knowledge transfer is necessary, for example, for establishing a routine technique in a research lab, job shadowing is a common technique we also exercise on our campus. Embedding bioimage analysts temporarily in a research group, referred to as rent-a-bioimage-analyst, allows interaction on a level far beyond consulting. Conversely, wet-lab scientists spending a couple of weeks in the offices of bioimage analysts, locally referred to as a mini-sabbatical, is an excellent strategy to make both subcommunities of life scientists find a common language and collaborate efficiently. In both settings, projects can be pushed forward substantially, to mutual benefit. Flexible payment schemes might be necessary to establish such a strategy on a larger scale.
Educational activities also include training the core-facility's own staff. If experts are hard to find on the job market, they have to be trained within the core-facility. Postdoctoral fellowship programmes are one opportunity to train core-facility employees in advanced techniques. This strategy has been exercised in imaging core-facilities and appears transferable to other technologies.41 In the bioimage analysis and data science context it appears even more beneficial: Coming back to the above-introduced mini-sabbaticals, which core-facility employees can spend in research labs, they may decide their future career path during these close collaborations.
Data scientists are hard to hire in research groups, too. Hiring a core-facility-trained bioimage analyst into a staff scientist position in a research group can transfer expertise into these groups and enable image data science projects that are unthinkable in smaller PhD or postdoctoral projects.

COMPETING INTERESTS
In this manuscript, we introduced typical needs which life science research institutes express in the context of bioimage analysis and services that dedicated bioimage analysis core-facilities could offer to serve these needs. In the following, we discuss potential caveats arising from design choices to be aware of. First, both decision makers considering funding a core-facility and collaborators requesting services from a core-facility should be aware of the so-called unattainable triangle of project management,42 as shown in Figure 2. It states, in short, that the quality of a project is constrained by time, cost and scope. Not all key parameters of a project can be optimised at the same time: fast-developed and well-designed solutions are expensive. If the costs are too high, either scope or development time must be sacrificed.

Collaboration between life scientists and software developers
When considering tasks such as software development and user education, it is apparent that putting the focus exclusively on either of these two activities leads to a suboptimal distribution of effort between software users and software developers. A bioimage analysis component that requires an end-user to learn a programming language causes maximum effort on the user's side while minimising effort on the developer's side. In contrast, crafting an intuitive, user-friendly software with a graphical user interface (GUI) maximises the effort for the developer while keeping the workload minimal for the user. Algorithm developers may not see it as their responsibility to invest the increased effort of developing GUIs, even more so if the design elements are specific to particular biological research questions. Within the bioimage analysis community, the term going the extra mile was established for investing this effort, which is often undertaken by core-facility employees. Finding the right balance between these two strategies is key. We claim that the total effort of developers and users is minimal in case the developers spend some time on adding accessible user interfaces to their software while users acquire some basic programming skills. The challenge is displayed in Figure 3.

Recharging
When it comes to billing services such as bioimage analysis consulting and software development support, the right model must be negotiated, potentially on a project-by-project basis. Commonly, an initial consultation is free. After a first discussion, an evaluation of the benefits for individuals, research groups and the institute could be the next step towards identifying the right model for billing. The more people profit from the core-facility executing a collaborative project, the lower the price could be for the inquiring research group. Such discounts, or a general open-door policy, can only be granted if other funding sources are available.

F I G U R E 2
The so-called unattainable triangle of project management has been proposed in several varieties. The left triangle identifies the quality of a software product with the area of the enclosed central region, which can be enlarged by spending more time, allocating more resources or widening the scope of the product. The right triangle highlights an inevitable consequence of the left one: Optimising all of a project's boundary conditions at the same time (e.g., by minimising spent time and resources) is impossible without sacrificing at least one of them.

F I G U R E 3
The mutual-effort diagram displays how developed software manifests in terms of effort on the developer's and the user's side. Well-designed GUIs maximise the effort for the developer but considerably decrease the effort for the user. Conversely, software without a GUI increases the user's effort while minimising the workload on the developer's side.
Per definition, the major aim of a core-facility is to serve the entire institute and to treat all groups equally. If working hours are billed, the focus of the core-facility will naturally shift towards the research groups with larger budgets, which are typically senior research groups. To enable young investigator groups to benefit equally, flat-rate models can be established: all groups receive a certain amount of annual working hours for which they can ask core-facility staff to work for them.
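Such a flat-rate model amounts to simple bookkeeping, which can be sketched as follows (a hypothetical illustration; the class and method names are our own, not an established tool):

```python
# Minimal sketch of a flat-rate model: every group receives the same annual
# budget of core-facility working hours, independent of its grant budget.
# All names here are hypothetical illustrations.

class FlatRateLedger:
    def __init__(self, annual_hours):
        self.annual_hours = annual_hours  # identical budget for every group
        self.used = {}                    # group name -> hours consumed this year

    def remaining(self, group):
        """Hours the group can still request this year."""
        return self.annual_hours - self.used.get(group, 0)

    def book(self, group, hours):
        """Book support hours for a group; refuse if the budget is exceeded."""
        if hours > self.remaining(group):
            raise ValueError(f"{group} exceeds its flat-rate budget")
        self.used[group] = self.used.get(group, 0) + hours

ledger = FlatRateLedger(annual_hours=40)
ledger.book("young-investigator-lab", 10)
ledger.book("senior-lab", 35)
print(ledger.remaining("young-investigator-lab"))  # 30
```

The key design choice is that the budget is identical for all groups, so a senior group's larger grant budget cannot crowd out support for younger groups.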
A core-facility may bill its collaborators while also claiming coauthorships. We want to highlight that we consider this appropriate both in terms of good scientific practice and responsibility 43 and for securing the above-described missions of the core-facility in a sustainable fashion: in essence, there is no qualitative difference between hiring an employee to execute a bioimage analysis task and engaging the services of a core-facility for the same task. Both warrant financial compensation as well as the acknowledgement of intellectual contributions in the form of coauthorships. This policy should always be communicated clearly within projects at an early stage. Challenging sources of competing interests appear in the context of billing collaborators for custom data analysis. If workflow development and consulting are billed per hour, collaborators might tend to exchange workflows among each other without consulting the developers in order to save money. However, the limitations of custom image analysis scripts might not be easy to judge for non-experts. If custom scripts do not work on other data, this poses the risk of painting a bad picture of the work done in the core-facility. This scenario can be prevented by two major actions: (1) asking questions in a basic consulting setting should always be free; nobody should fear receiving a bill because they asked a question. (2) Documentation for custom image analysis scripts must explicitly state the intended use. It is also recommended to add a disclaimer to the documentation informing users of the script that its developer is not liable or responsible for its usage. Common open-source software licenses such as BSD or MIT contain such disclaimers and additionally give the collaborators, who potentially paid for the software development, the legal right to publish the code along with their scientific publication without consulting the author.
Joining efforts between core-facilities and research groups is also a suitable approach to secure core-facility operations sustainably: grant proposals written by collaborating groups can feature additional staff and infrastructure funding for data management and data science services, which can then be diverted to the core-facility. Since research data management, scientific data analysis support and consulting are increasingly recommended, or even mandatory, to include in funding proposals, adding resources to sustain a responsible core-facility is of mutual benefit. As funding agencies are aware of these needs, mentioning such items in a grant proposal may increase the chance of securing the funding.

Project prioritisation
Another important point of consideration lies in the nature of the projects a core-facility is tasked with. If the core-facility's core-budget is limited and employees are paid from grants, members of the core-facility are typically expected to contribute to the field in the form of publications. In this scenario, core-facility members have to prioritise support requests. A criterion for deciding between research projects to work on is the exploratory or exploitative nature of the project. If a suitable bioimage data analysis workflow is roughly known and just needs to be developed further to be exploited, for example, by making it executable on HPC infrastructure, the efforts for the project are fairly predictable and allow an informed decision about the project costs. If no image data analysis workflow is known yet and the outcome is hard to predict, the effort for this kind of exploratory project is harder to estimate, and thus it is potentially not the right situation for a core-facility to invest a lot of time in. We recommend consulting and pair programming as methods of choice for exploratory scenarios, and software development efforts for exploitative scenarios.
An important aim of a bioimage analysis core-facility is to keep software solutions available long term. If a PhD project resulted in a new algorithm that appears beneficial beyond the group it was developed in, the core-facility should support maintaining this software. Working hours that core-facility developers invest in making a research group's software sustainable should not be billed entirely to that group, as the institute as a whole benefits from this work. Also in this case, larger groups, who can put more substantial effort into developing new methods, have a higher chance of benefiting from the core-facility's support. Thus, the decision about which software projects to support should be made in close collaboration with group leaders, potentially democratically, for example in a core-facility committee.

Avoiding bottlenecks in scientific workflows
As knowledge incubation and transfer is a major mission of core-facilities, bioimage analysis as a service introduces a conflict of interest as well: if the employees of a core-facility are the only experts who can conduct a certain bioimage data analysis workflow, their service can become a bottleneck. On the one hand, exploiting this bottleneck by billing for the data analysis service is a potential source of funding. On the other hand, it is a risk for the institute: core-facility employees who are constantly overwhelmed by repetitive requests burn out and may seek new opportunities outside the institute. Their expertise may then be lost, and the core-facility has failed in conserving the knowledge. A similar argument can be brought forth from the point of view of good scientific practice: if the expertise to execute a developed workflow resides exclusively with core-facility experts, how reproducible are the results in practice? To circumvent these issues, we recommend that researchers acquire bioimage analysis skills via learning-by-doing under guidance from core-facility employees. The core-facility members can also take user feedback from these sessions to further develop their tools. All involved parties, the software users, its developers and the core-facility, profit from related publications as they ensure the continuous influx of third-party funding.

Teaching and education
The importance of teaching shall also be highlighted. Unfortunately, general computational skills, data management and data science are not part of undergraduate curricula in the life sciences, and thus core-facility staff often explain computational basics to researchers. Depending on the size of a core-facility with respect to the number of potential collaborators on campus, more or less consulting and custom script development services can be offered. In case of an overwhelming number of basic questions and support requests, increasing teaching activities can be considered to answer common bioimage analysis questions in a broadcasting fashion. Establishing a bioimage analysis lecture open to students as well as PhD candidates and postdocs is worth the effort: good computational practice will spread among the research groups on campus and the general situation improves. We have been exercising this for the fourth year on our campus and the feedback from local researchers is very positive. Our impression is in line with a recent international survey: more and more scientists seek to learn bioimage analysis skills themselves, and thus there is a large need for tutorials and workshops. 44

Long-term strategy development
Considering the local needs and demands on a research campus, it may appear feasible to obtain bioimage analysis expertise on a per-project basis: paying a freelance consultant or an image analysis consulting company for consultations and training on topics such as good scientific practice in the domain of image processing 45,46 can be a viable and cost-effective alternative to building expensive internal facilities or hiring staff if demand for image analysis is not constantly high. However, this comes with the disadvantage that an external expert may not always be available when needed, and maintenance cannot be ensured. Nonetheless, external contractors with expertise in the field (e.g., from working in academia themselves) can also help to raise proficiency and awareness of specific topics among the hired staff of a core-facility.
Once a core-facility is established in any of the contexts outlined above, its staff have far-reaching responsibilities. Given the value a core-facility provides to its collaborators, balancing the different services, both in terms of scope and of recipients, is a critical task that should be evaluated regularly. These considerations could be addressed, for example, by an external advisory board (EAB) of core-facility leaders from other institutes. Using their external perspective, they could on the one hand help the core-facility to adjust its goals and missions. On the other hand, an EAB report could give the core-facility and the responsible steering bodies a powerful instrument when negotiating further development of the core-facility and internal funding with institutional decision makers.

CONCLUSIONS
Founders and managers of bioimage analysis core-facilities face challenging tasks in shaping their work environment: primarily, the institute community's needs have to be fulfilled using a well-adjusted selection of services. At the same time, the career perspectives of the core-facility employees are of high importance, as the core-facility cannot operate without motivated personnel who strive to improve their skills and develop along with the bioimage analysis field. The core-facility is not just responsible for maintaining existing tools but also for developing workflows for future research projects. From our perspective, core-facilities that develop technologies work best if they are partially funded through third-party research funding, which may turn the core-facility partially into a research group. This compromise between group structures deserves its own name, which is why we refer to our group as a 'Bioimage Analysis Technology Development Group'.

F I G U R E 1
Classification of bioimage analysis core-facilities according to the introduced services: software development, consulting, data analysis, research data management (RDM), infrastructure maintenance, teaching and outreach. The colour-coded fields ('knowledge-centric', 'service-centric' and 'maintenance-centric') highlight domains of thematic overlap between the offered services.