Molecular Devices' newly released imager, the ImageXpress Micro. (Courtesy of Molecular Devices.)

In the beginning, there were microscopes—and they were good. But almost nothing in science stays 'good enough' for very long. Even as microscope technology has grown more powerful, delivering finer resolution and imaging studies of surpassing elegance, many biologists have started to shift their focus to experiments that 'think big'—looking at hundreds or even thousands of specimens, a scale that demands a new imaging paradigm.

Perhaps some of this can be blamed on the invention of the microarray, which for many redefined the scale that 'thinking big' could actually encompass. But this was also a natural progression—after all, the starting point for many drug discovery or genetic screens often entails observing the cellular effects of a wide range of compounds or DNA constructs, a process which can be arduous, at best, if performed manually.

“When I was in San Francisco in 1996, I did a lot of compound screening with a student by eye,” recalls Harvard investigator Tim Mitchison. “That experience of having gone through thousands of compounds by eye definitely made me want to automate microscopy, but it took a while before we had the technology to do it.” Paul Matsudaira, the Director of the Whitehead Institute-MIT BioImaging Center, tells a similar tale: “We wanted to start a pilot project on imaging the proteome, starting with imaging the cell adhesion machinery... If we think about the several hundred genes that we really want to tag and watch, this is not an operation that you can do by putting cells on slides and watching them on a microscope. We needed some kind of automated method.”

Both investigators have become champions of what is now known as high-throughput imaging, or high-content screening (HCS)—the use of computer-controlled, automated systems for the imaging of cell-based screens on a very large scale. Initially, the novelty and potentially high cost of automation may have deterred academic researchers. Today, however, manufacturers have started to recognize the needs of this important market, even as academics have come to perceive the power these systems offer. Matsudaira says, “I think that the tipping point is that people are realizing that they need to have this kind of equipment to do things at this scale, and the equipment is becoming available.” Indeed, in just the last few years, HCS technology has enabled studies of drug profiling (ref. 1), RNA interference (ref. 2) and functional proteomics (ref. 3) at a magnitude that was previously inconceivable—or at best, terrifying to the unfortunate scientist who was stuck operating the microscope. As these systems become simpler to assemble and operate, and capable of handling a more diverse range of cell-based assays, their popularity seems likely only to grow.

Getting started

In a typical imaging system, a microscope magnifies the sample of interest (which, for high-throughput studies, is presented on a motorized stage), a camera captures the image data, and a digitizer transfers the images to a computer for storage, with each step controlled by software for image acquisition and analysis.
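
For readers who prefer to think in code, the acquire-and-store loop that these systems automate can be sketched in a few lines of Python. The Stage, Autofocus and Camera classes below are hypothetical placeholders for vendor-specific drivers, not any real instrument's programming interface.

```python
# Minimal sketch of an automated acquisition loop. Stage, Autofocus and Camera
# are hypothetical stand-ins for vendor-specific drivers, not a real API.
import numpy as np

class Stage:
    def move_to(self, well, field):          # position the motorized x-y stage
        print(f"stage -> well {well}, field {field}")

class Autofocus:
    def focus(self):                         # image- or laser-based autofocus
        return 0.0                           # best-focus z position (placeholder)

class Camera:
    def snap(self, channel, exposure_ms):    # expose and read out the CCD
        return np.zeros((512, 512), dtype=np.uint16)

def run_plate(wells, fields_per_well, channels):
    stage, af, cam = Stage(), Autofocus(), Camera()
    images = {}                              # in practice, written to disk with metadata
    for well in wells:
        for field in range(fields_per_well):
            stage.move_to(well, field)
            z = af.focus()
            for name, exposure in channels.items():
                images[(well, field, name, z)] = cam.snap(name, exposure)
    return images

stack = run_plate(["A01", "A02"], fields_per_well=4,
                  channels={"DAPI": 10, "GFP": 50})
```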

Some companies offer a combination of proprietary components and custom-designed systems, allowing investigators to design an imaging system that meets their specific needs (see Box 1). But for scientists looking to simplify frequently performed cellular assays, a pre-built and fully integrated imaging system may offer the best solution. In such cases, asks Matsudaira, “why put something together when we can get started off the bat?” Cellomics' KineticScan HCS Reader is one example of an integrated system designed for the automated analysis of live cell populations. The KineticScan can measure subcellular parameters in individual cells over time, allowing analysis of the heterogeneity of responses rather than simply the averaged response of all the cells in each well. It includes a liquid-handling system that takes care of washing and reagent additions during unattended experiments, and you can also tell the KineticScan to execute complete experiments, including data acquisition, processing and analysis, while you're away. Cellomics also offers 'plug-and-play' style software modules that make it easy to carry out your experiment without becoming an expert programmer (see below).
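
The case for per-cell kinetic readouts is easy to make with synthetic numbers: when half the cells in a well respond and half do not, the well average suggests a uniform, intermediate response that no single cell actually shows. The short Python illustration below assumes nothing about the KineticScan's own analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulate a well in which half the cells respond strongly and half do not.
responders = rng.normal(loc=2.0, scale=0.2, size=100)      # ~2-fold change
non_responders = rng.normal(loc=1.0, scale=0.2, size=100)  # ~1-fold change
per_cell = np.concatenate([responders, non_responders])

print(f"well average: {per_cell.mean():.2f}")               # ~1.5, looks uniform
print(f"fraction of cells above 1.5-fold: {(per_cell > 1.5).mean():.2f}")  # ~0.5, two populations
```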

Molecular Devices' ImageXpress 5000A is also designed with ease of use in mind. The ImageXpress' Script Wizards let you automate the acquisition and analysis of images without first learning how to write code. The scripts can also be run from external scheduling software for robotic integration. The ImageXpress has both image-based and laser-based autofocus modes, the latter being faster for screening applications. Additionally, both the x-y and z stages are motorized, with encoders that give feedback control for precise automated positioning. Molecular Devices is also introducing several new products this year that will make high-throughput work even simpler for new users. “We are now offering a complete high-content screening platform, from acquisition control through image analysis, to a full informatics package,” says Michael Sjaastad, director of marketing for imaging at Molecular Devices. “With the new instruments and application modules, we have a turnkey approach for getting into high-content screening.”
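
Image-based autofocus typically acquires a short z-stack and keeps the plane that maximizes a sharpness metric, whereas laser-based schemes sense the reflection from the plate bottom and are faster. The Python sketch below uses one common metric, the variance of a Laplacian-filtered image, purely as an illustration rather than a description of Molecular Devices' own algorithm.

```python
import numpy as np
from scipy import ndimage

def focus_score(image):
    """Sharpness metric: variance of the Laplacian (higher = sharper)."""
    return ndimage.laplace(image.astype(float)).var()

def best_focus(z_stack, z_positions):
    """Return the z position whose image maximizes the sharpness metric."""
    scores = [focus_score(img) for img in z_stack]
    return z_positions[int(np.argmax(scores))]

# Synthetic example: one crisp frame among progressively blurred ones.
rng = np.random.default_rng(1)
sharp = rng.random((256, 256))
stack = [ndimage.gaussian_filter(sharp, s) for s in (4, 2, 0.5, 2, 4)]
print(best_focus(stack, z_positions=[-2, -1, 0, 1, 2]))      # -> 0
```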

Another system designed for ease of use is BD Biosciences' Pathway, a fully automated confocal imaging system for the analysis of fixed and live cells (in suspension or attached). BD Biosciences claims that their confocal system is unique in that it uses white light for illumination, allowing the researcher to use many more fluorescent dyes. “The system is designed with two categories of user in mind,” says BD Biosciences' director of marketing for bioimaging systems, Philip Vanek. New workflows provide the beginning user with a range of applications in an intuitive interface, making the system easy to operate. “For experienced users,” says Vanek, “the system still provides full control of all operational features of the instrument to develop novel cell-based applications. The configuration of the software allows both [types of] users to share an instrument without causing data loss or corrupting settings for each user.”

The eye of the beholder

Obviously, high-throughput imaging would be impossible without effective optical hardware. Many imaging systems use digital charge-coupled device (CCD) cameras to receive and digitize images for transfer to a computer; cooled CCD cameras are popular because they help to minimize noise.

Looking forward, Colin Coates, senior scientist at Andor Technology, expects more companies to start offering high-end electron multiplying charge-coupled device (EMCCD) technology, featured in Andor's iXon series of cameras, for high throughput and HCS applications. “Those that don't,” he says, “will be at a significant technical disadvantage in a very competitive environment.” Pioneered by Andor, EMCCD technology gives cameras single-photon sensitivity by two mechanisms. One principle “involves an electronic insertion called the gain register, which amplifies a signal from even a single photon to a level that is well clear of the CCD read noise floor, hence rendering the read noise essentially negligible,” explains Coates. Another important principle is the vacuum design, which cools the camera to −90 °C. “It is crucial to eliminate dark current in EMCCDs,” adds Coates, “since even a single thermal electron is amplified up by this new mechanism.”
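
Some back-of-the-envelope arithmetic, using representative figures rather than Andor's specifications, shows why electron multiplication renders read noise negligible and why deep cooling matters.

```python
# Illustrative EMCCD arithmetic with representative (not vendor-specified) values.
read_noise_e = 50.0       # CCD read noise at high pixel rates, electrons rms
em_gain = 1000.0          # electron-multiplying gain in the gain register

# A single photoelectron is multiplied before readout, so the read noise
# referred back to the input becomes negligible:
effective_read_noise = read_noise_e / em_gain
print(f"effective read noise: {effective_read_noise:.2f} e-")   # 0.05 e- rms

# But thermally generated (dark-current) electrons are amplified identically,
# so deep cooling is needed to keep them from masquerading as photons:
dark_current_e_per_px_s = 0.001   # plausible value for a deeply cooled sensor
exposure_s = 1.0
print(f"dark electrons per pixel: {dark_current_e_per_px_s * exposure_s:.3f}")
```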

Andor Technology's iXon camera. (Courtesy of Andor Technology.)

In terms of microscopes, Zeiss continues to set a high standard with the Axiovert 200 M, a high-end inverted fluorescence microscope suitable for a wide variety of applications, which includes up to eight motorized components and is designed specifically for automated live-cell imaging. The Axiovert 200 M is also available as part of the Cell Observer system, which pairs the Axiovert scope with a camera, imaging software and other equipment designed by Zeiss to offer users the widest possible range of imaging options. These include time-lapse recording and the simultaneous storage of image data from several fluorescent dyes in up to eight different channels—data that can be freely combined at any time.

Nikon's optical systems are also on offer in the high-throughput market, and GE Healthcare's IN Cell screening systems benefit from the high-throughput capabilities of Nikon's TE2000 microscope systems. “To [allow greater] speed, the microscope objective should be of the lowest magnification possible so that the image [can] cover as much area (field size) as possible to capture as many cells as possible at a time,” explains Joseph LoBiondo, product planning manager at Nikon Instruments. “It also needs to do this with as short an exposure as possible so that we do not kill the cells, and so that we can do this in a short period of time. This is accomplished by the use of high numerical aperture objectives that are color-corrected and have sufficient working distance to look through the bottom of a 96-well plate.”
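
LoBiondo's trade-off can be put into rough numbers. The sketch below assumes, purely for illustration, a CCD with roughly 9 × 6.7 mm of active area and a plating density of 500 cells per square millimeter; the imaged area shrinks with the square of the magnification, and the cell count per frame shrinks with it.

```python
# Illustrative field-of-view arithmetic; sensor and cell densities are
# assumptions, not specifications of any particular camera or objective.
sensor_width_mm, sensor_height_mm = 8.98, 6.71   # e.g. a 1392 x 1040 CCD, 6.45 um pixels

def field_area_mm2(magnification):
    """Approximate imaged area at the sample, ignoring relay optics."""
    return (sensor_width_mm / magnification) * (sensor_height_mm / magnification)

cells_per_mm2 = 500.0        # assumed plating density of an adherent monolayer
for mag in (4, 10, 20, 40):
    area = field_area_mm2(mag)
    print(f"{mag:>2}x objective: {area:6.3f} mm^2 per field, ~{area * cells_per_mm2:5.0f} cells")
```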

Developing your pictures

Hardware may continue to grow more accurate and refined, but access to powerful and efficient software for image acquisition and processing remains an essential foundation—or potential Achilles' heel—for any HCS project.

Some scientists express frustration that their assay requirements continue to outpace the capabilities of modern software. “Anytime we've had a project with somebody who knows much about computers, they pretty much immediately start to write their own software, because they get dissatisfied with the commercial products—but then they generate code that's difficult for anybody else to use!” explains Mitchison. Some laboratories have stepped into the breach and attempted to develop flexible, open-source solutions. In David Sabatini's lab at MIT, for example, postdoc Anne Carpenter has spearheaded the development of CellProfiler, an open-source image analysis program that is now in beta testing and will soon be available through the group's website (http://groups.csail.mit.edu/vision/cellprofiler/).
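
The core job such packages automate is easy to sketch: segment the objects in each image, usually nuclei, and tabulate measurements for every object found. The fragment below, written with scikit-image, is not CellProfiler code; it only indicates the kind of threshold-and-measure pipeline these tools run over thousands of images.

```python
import numpy as np
from skimage import filters, measure, morphology

def measure_nuclei(image, min_area=50):
    """Segment bright nuclei by Otsu thresholding and return per-object features."""
    mask = image > filters.threshold_otsu(image)
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    labels = measure.label(mask)
    return [
        {"area": r.area,
         "eccentricity": r.eccentricity,
         "mean_intensity": r.mean_intensity}
        for r in measure.regionprops(labels, intensity_image=image)
    ]

# Synthetic example: two bright blobs on a dark background.
img = np.zeros((200, 200))
img[40:80, 40:80] = 1.0
img[120:170, 100:160] = 0.8
print(measure_nuclei(img))
```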

In the meantime, software developers continue to work hard at developing more reliable and flexible solutions for efficiently processing the mountains of complex data yielded by a typical high-content imaging study, either in the form of programmable suites for the do-it-yourself type (see Box 2), or software that tries to balance adaptability with ease-of-use for computer novices.

One example of the latter is Beckman Coulter's CytoShop, which features powerful algorithms for defining subcellular compartments through a combination of computational geometric analysis and image segmentation of uniquely stained cellular structures. The software takes into account details of cell morphology, such as vacuoles, pseudopodia and neuronal processes, that might otherwise result in analysis anomalies. “Though simple in concept, the extra architectural step to separate and organize the data as populations of cells is very powerful and better approximates the biology being observed,” says Casey Laris, product manager of cell imaging and analysis at Beckman Coulter. The software is also friendly to customization, and is compatible with plug-ins from such popular tools as MATLAB, ImagePro and MetaMorph, as well as C++.

The IN Cell 3000 Analyzer from GE Healthcare. (Courtesy of GE Healthcare.)

BioImagene's CellMine software is another possibility for labs looking to move into HCS. CellMine is user-friendly, yet flexible, featuring a built-in workflow that can be customized to meet users' specific needs. Of course, HCS poses a serious data management problem—not only for storage, but also for locating or correlating data—and with this in mind, BioImagene offers its SIMS software, which receives measured parameters and images from CellMine. SIMS is a complete image management platform that can be used to store, browse and search images across all your data. Combining CellMine and SIMS makes it easier to find and analyze data because the high-content measurements are aggregated with the associated images.
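
The underlying idea is simply to keep per-image measurements and metadata in a searchable index alongside the images themselves. The toy example below uses an in-memory SQLite table with invented file paths and values; it illustrates the concept, not the SIMS platform.

```python
import sqlite3

# Toy metadata index: images become searchable by well, channel and measurement.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE images (
    path TEXT, plate TEXT, well TEXT, channel TEXT, mean_intensity REAL)""")
rows = [
    ("p1/A01_dapi.tif", "plate1", "A01", "DAPI", 410.2),
    ("p1/A01_gfp.tif",  "plate1", "A01", "GFP",  88.7),
    ("p1/B03_gfp.tif",  "plate1", "B03", "GFP", 352.9),
]
db.executemany("INSERT INTO images VALUES (?, ?, ?, ?, ?)", rows)

# Find every GFP image whose mean intensity exceeds a hit threshold.
hits = db.execute(
    "SELECT path, well FROM images WHERE channel = 'GFP' AND mean_intensity > 200"
).fetchall()
print(hits)   # [('p1/B03_gfp.tif', 'B03')]
```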

A more tailored approach to HCS software is available in Cellomics' family of BioApplications, designed for measuring specific phenomena in cells, such as neurite outgrowth or nuclear translocation, or for more general functions, such as compartmental analysis. Each BioApplication has the capability to incorporate measurements such as the size, shape, amount of fluorescent label and pattern of fluorescence for each cell; data can also be reported at the well and subpopulation levels. Both the acquisition and analysis functions are designed for users with little computer programming experience. Each BioApplication is biologically validated to ensure functionality, and you can purchase only the BioApplications that you need.
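
Reporting at the cell, subpopulation and well levels amounts to aggregating a per-cell feature table in different ways. The short example below, with invented measurements and an arbitrary 'responder' cutoff, illustrates the idea in pandas and is not drawn from any BioApplication.

```python
import pandas as pd

# Toy per-cell measurements for two wells; purely illustrative values.
cells = pd.DataFrame({
    "well":      ["A01"] * 3 + ["A02"] * 3,
    "area":      [310, 295, 420, 300, 305, 290],
    "intensity": [1200, 1150, 4800, 1180, 1210, 1190],
})

# Well-level report: mean of each feature plus the size of a 'responder'
# subpopulation defined by an arbitrary intensity cutoff.
cells["responder"] = cells["intensity"] > 2000
report = cells.groupby("well").agg(
    mean_area=("area", "mean"),
    mean_intensity=("intensity", "mean"),
    responder_fraction=("responder", "mean"),
)
print(report)
```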

Moving forward

Although high-throughput imaging technology is now evolving by leaps and bounds, Sabatini suggests that present hardware formats may only be able to progress so much farther before bumping into the limits of optics and physics. “Microscopes are a fairly mature technology, and there are all sorts of physical barriers... that are difficult to overcome,” he says. “I think these systems operate quite well for what you can possibly do at this time.” If this is the case, a lot of the burden for improving imaging will now depend on the researchers developing new fluorophores and labeling reagents. In many ways, Sabatini concludes, “it's more of a chemistry problem than an equipment problem.”

This is not necessarily true at the computational end, where considerable innovation is still needed. As mentioned in the previous section, there is an ongoing need for more powerful image analysis solutions. There is also, however, the equally pressing issue of data storage and handling. Modern high-throughput imaging experiments typically churn out several terabytes of data, a potentially overwhelming quantity of information. As experiments grow larger and more ambitious, terabyte-scale data collection may well be just the tip of the iceberg, and some in the field, like Matsudaira, suggest that the development of more efficient digital formats for representing complex image data could help considerably.
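
It is easy to see where the terabytes come from. The arithmetic below uses assumed screen dimensions, not figures from any of the studies cited above.

```python
# Rough data-volume estimate for a screen; every number here is an assumption
# chosen for illustration, not a measured figure from any cited study.
plates = 100
wells_per_plate = 384
fields_per_well = 4
channels = 3
bytes_per_image = 1392 * 1040 * 2      # one 16-bit camera frame, ~2.9 MB

total_bytes = plates * wells_per_plate * fields_per_well * channels * bytes_per_image
print(f"{total_bytes / 1e12:.1f} TB")  # ~1.3 TB for this modest example
```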

Matsudaira also cites another important goal along these lines—the standardization of image formats between different platforms to simplify collaboration and the sharing of data. This is one goal of the Open Microscopy Environment (http://www.openmicroscopy.org/), a joint effort between members of the academic research community and commercial imaging specialists, such as PerkinElmer and Applied Precision. “It's an attempt to develop a standard for light microscopy [like the DICOM standard that the medical community has developed for X-rays and MRIs],” says Matsudaira. “Think of the genome project if ABI machines put out data differently than an Amersham machine... and the databases that came out of one genome data center were different than those from another genome data center. We'd be in a mess! But that's the state that we're in now.”

Nonetheless, even in these early days, it seems clear that high-throughput imaging platforms offer a powerful opportunity for academics with ambitious screening projects in mind (Table 1). And as systems grow more affordable and user-friendly—even for the imaging neophyte—this sort of automation seems more practical and less like a fancy novelty. Mitchison concludes: “You can do this in a very sophisticated way, but you can also do it in a very simple way... the opportunities here for academics are really quite large, and people shouldn't feel that this is a big scary thing.”

Table 1 Suppliers guide: companies offering systems for high-throughput imaging