Computational Issues in Biomedical Nanometrics and Nano-Materials

Biomedical nanotechnology is an emerging area of great scientific and technological opportunity, widely recognized as one of the most potentially beneficial applications of nanotechnology to industry and society to date. Work in this area has a number of computational aspects: information-technology-based tools and measurement techniques are used to study biosystems governed by micro- and nano-scale physics and chemistry, and computational methods are generating remarkable new insights into how biological systems function, how metabolic processes interrelate, and how new molecular-scale machines can operate. This paper reviews current advances in computational algorithms and tools applied to biomedical nanometrics and nano-materials. We categorize algorithms into three general areas, describe representative methods, and conclude with several promising directions for future investigation.


Introduction
Nanotechnology is a young area of science that involves working with materials and devices at the nanometer (one billionth of a meter) scale. The term generally describes processes and objects that occur between a fraction of a nanometer and hundreds of nanometers; in biology, this range includes a number of molecules of interest, including DNA, which has a width of approximately 2 nm in its typical "double helix" state [1].
Biomedical nanotechnology is the unification of biotechnology and nanotechnology, encompassing biomedical nanometrics and nano-materials. Diagnostics, drug delivery, and prostheses and implants are three areas where nanotechnology is entering biomedicine. The field is attracting increasing interest as an emerging interdisciplinary research area straddling nanotechnology and biotechnology. As tools for combining nanotechnology with biomedical science, information techniques and algorithms are gaining importance in biomedical nanometrics and nano-materials, with applications ranging from the storage and reproduction of genetic information, to the control of developmental processes, to sophisticated DNA-based computation and engineering.
In this paper, we review computational algorithms and applications in biomedical nanometrics and nano-materials. There are three complementary perspectives on biomedical nano computation: (1) using algorithms to direct and control nanobiology processes, such as synthetic biology, DNA computation, and molecular modeling; (2) using engineering and programming techniques to help design and control nano-devices, such as biomedical nano-devices and bio-nanomachines, nano-magnetic devices, and nano-electronic and molecular electronic devices; and (3) using what we here refer to as "information technology" to solve nano-materials problems, such as those involving design of nanoparticles and bionanoparticles, block copolymer materials, and self-assembled molecular materials. In the current biomedical nano areas, there is a great demand for computational techniques for analyzing and designing new nanostructured materials and for theoretical analysis of basic properties in nano-scale structures and conformation in biomolecules.
The reviewed research areas cover biotechnology, nanotechnology, and information technology: three emerging technology areas that are advancing both independently and through new interfaces among one another. The Bio-Nano-Info interface is illustrated in Table 1. As biotechnology and nanotechnology develop, their demands on information technology are increasing rapidly. Information technology advances are distributed across a multitude of computational research fields, including high-density information storage, high-performance computing, and image and information processing. The outcomes have been equally diverse: miniaturized sensor suites for surveillance, warfighter personal status monitors, high-performance and affordable nanocomposites, and miniaturized robotics for uninhabited platforms.

Nanobiology and Information Technology
Biology has in recent years developed an ever-increasing understanding of the mechanics of living cells. This has led to increased potential for two key types of nano-scale applications, namely DNA-based computation and the engineering and use of synthetic cells, and has also allowed computational modeling of intracellular processes and molecular-level dynamics to advance to new levels of realism.
DNA computing is based on the idea of harnessing, for massively parallel abstract computation, the information storage and processing capabilities that DNA uses for gene expression and regulation. A DNA molecule consists of two intertwined, spiraling backbone strands of sugar and phosphate, each with attached nucleotides that hydrogen-bond to complementary partners on the other strand. In relation to this review, DNA is most interesting as a substrate for computation. An important future advance over "natural" DNA computing, which is based on the use of either DNA hybridization or enzymatic processing of DNA [2] and has been explored in several different approaches, will be the incorporation of synthetic elements into the DNA molecule, in particular metals (such as copper [3], transition metals [4], manganese [5], or nickel [6]) or small molecules.
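The Watson-Crick pairing rule that hybridization-based DNA computing relies on can be sketched in a few lines of Python; the function names here are illustrative, not taken from any cited system:

```python
# Watson-Crick base pairing: A bonds with T, G bonds with C.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(strand: str) -> str:
    """Return the strand that hybridizes antiparallel to `strand`."""
    return "".join(PAIR[base] for base in reversed(strand))

def hybridizes(a: str, b: str) -> bool:
    """True if the two strands are perfect antiparallel complements."""
    return b == reverse_complement(a)

print(reverse_complement("ATGC"))   # GCAT
print(hybridizes("ATGC", "GCAT"))   # True
```

Hybridization-based computation encodes problem variables as sequences so that only strands representing valid partial solutions can anneal to one another.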
These types of DNA manipulation can also serve as a basis for nano-device development, e.g., by inserting appropriate molecules between base pairs or binding them to the double helix itself [7], or by using DNA as a scaffold [1,8,9]. The emerging discipline of synthetic biology, which combines aspects of biology, engineering, and computation, has as one of its goals the use of cells and cellular processes as programmable components [10]: by engineering customized living cells, it will be possible to create biosensors, factories for nano-material creation or assembly, and programmable drug delivery systems. The idea of algorithmic self-assembly arose from the combination of DNA computing [11], the theory of tilings [12], and DNA nanotechnology [13]. The theoretical basis for self-assembly has its roots in the Domino Tiling Problems presented by Wang [14].
DNA self-assembly is a methodology for the construction of molecular-scale structures [15]. The nucleobases commonly used in DNA self-assembly are the purines, adenine (A) and guanine (G), and the pyrimidines, thymine (T) and cytosine (C). There are two complementary perspectives on molecular computation: one is to use biochemical algorithms to direct and control molecular processes, such as complex fabrication tasks; the other is to use the astounding parallelism of chemistry to solve mathematical problems, such as combinatorial search problems [16]. Winfree [16] presented a simple form of biochemical algorithm, based on molecular self-assembly of heterogeneous crystals, that illustrates some aspects of programming in vitro biochemical systems and their potential applications. The self-assembled grids are capable of forming patterns that are relevant to logic circuitry and can serve as templates for metals and semiconductors [17]. DNA self-assembly can be used to execute massively parallel computations at the molecular scale, with concurrent assemblies that may execute computations independently. Due to the very compact form of DNA molecules, the degree of parallelism (due to distinct tiling assemblies) may be up to 10^15 to possibly 10^18.
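Crystal growth of this kind can be mimicked in silico. The sketch below assumes, for simplicity, that each new "tile" deterministically binds the two tiles above it and computes their XOR; seeded with a single 1-tile, such a crystal grows the well-known Sierpinski-triangle pattern. All names are illustrative:

```python
def grow(seed_row, rows):
    """Grow `rows` new rows; each new tile XORs the two tiles it binds above."""
    assembly = [seed_row]
    for _ in range(rows):
        prev = [0] + assembly[-1] + [0]    # empty border sites act as glue 0
        assembly.append([prev[j] ^ prev[j + 1] for j in range(len(prev) - 1)])
    return assembly

for row in grow([1], 4):                   # seed crystal: a single 1-tile
    print("".join("#" if tile else "." for tile in row))
```

In a real assembly, of course, binding is stochastic and error-prone; this deterministic loop only illustrates how local binding rules encode a global computation.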
It is also important for designers to have access to tools capable of modeling self-assembled structures within a framework for complex DNA nanostructure fabrication. In [18], Dwyer introduced a bottom-up circuit patterning process based on DNA self-assembly, discussing both the design tool requirements and the new opportunities self-assembly creates for circuit designers.
Computational modeling of nano-scale architectures and interactions and the resulting simulation of material properties has helped in the development of materials in many areas, such as liquid crystals [19]. It has also led to a greater understanding of how nanoparticles interact with a cellular membrane [20], the outer layer surrounding a living cell, which is a lipid bilayer generally several nm thick that has complex permeability properties. Dubey et al. reported on simulations of the motion and properties of a new viral protein nano-motor [21] that could conceivably function as part of a molecular-scale machine [22].
The theoretical basis for self-assembly of nanostructures has its roots in the Domino Tiling Problems defined by Wang [14,23]; aperiodic Wang tilings have also been used to create non-repeating images and texture maps for computer graphics [24]. The input is a finite set of unit-size square tiles, each of whose sides is labeled with a symbol over a finite alphabet. Additional restrictions may include the initial placement of a subset of these tiles and the dimensions of the layout where tiles must be placed. Assuming an arbitrarily large supply of each tile, the problem is to place tiles so as to completely fill the given layout such that each pair of abutting tiles has identical symbols on their contacting sides [25]. Tiles can represent molecules, macromolecules, or nanoparticles (and different tiles may represent different classes), and they usually interact with each other. In addition, reversibility is another important principle: for self-assembly to generate ordered structures, each association must either be reversible or must allow the components to adjust their positions within an aggregate once it has formed. Previous research has shown that the difficulty of the problem is independent of this condition: the problem remains NP-complete regardless of whether or not a layout contains information about the placement of tiles [26]. Subsequently, several research groups [27-30] proposed the use of self-assembly processes (in the context of DNA tiling and nanostructures) to solve NP-complete combinatorial search problems such as SAT and graph coloring.
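The tiling decision problem just described can be made concrete with a small brute-force sketch. This is a toy backtracking solver; since the problem is NP-complete, it is only usable on tiny layouts:

```python
def tile_layout(tiles, h, w):
    """Backtracking fill of an h x w layout; tiles are (N, E, S, W) glue
    labels and may be reused arbitrarily. Returns a filled grid or None."""
    grid = [[None] * w for _ in range(h)]

    def fits(tile, r, c):
        north, _, _, west = tile
        if r > 0 and grid[r - 1][c][2] != north:   # match south side above
            return False
        if c > 0 and grid[r][c - 1][1] != west:    # match east side at left
            return False
        return True

    def place(k):
        if k == h * w:                             # every cell filled
            return True
        r, c = divmod(k, w)
        for tile in tiles:
            if fits(tile, r, c):
                grid[r][c] = tile
                if place(k + 1):
                    return True
                grid[r][c] = None                  # backtrack
        return False

    return grid if place(0) else None

print(tile_layout([("a", "b", "a", "b")], 2, 2) is not None)  # True
print(tile_layout([("a", "b", "c", "d")], 2, 2) is None)      # True
```

The first tile set tiles any rectangle (its north/south and east/west labels agree); the second cannot abut itself on any side, so no 2x2 layout exists.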

Biomedical Nano-Materials
Computational nano-materials research has developed over recent decades through algorithm design and software system implementation, including nano-materials design engines, design systems for new nano-spintronics and nano-photonics materials [31], and virtual-reality modeling and molecular dynamics of novel nanocomposites [32].
Although living organisms are built of cells that are typically 10 µm across, proteins are much smaller, with a typical size of just 5 nm, which is comparable to the dimensions of the smallest man-made nanoparticles [33]. Understanding biological processes at the nano-scale is a strong driving force behind the development of nanotechnology [34]. In a recent survey [33], Salata reviewed research developments in applied nanomaterials, their applications in biology and medicine, and their commercialization prospects.
Because nanoparticles exist in the same size domain as proteins, nano-materials are potentially suitable for bio-tagging or labeling. However, the fact that nanoparticles are of an appropriate size is rarely sufficient, and in general a bioinorganic interface must be attached to a nanoparticle so that it can interact with a specific biological target [33]. To use computational algorithms and software to aid design and visualization, particle detection and reconstruction methods are required. Detecting single particles is the first step of most 3D single-particle reconstruction algorithms in the literature [35,36]; for example, Gontard et al. used local adaptive thresholding for particle detection in transmission electron microscope images [37]. In a typical procedure, a particle 3D reconstruction process needs thousands or even millions of particle images to produce a reasonable 3D model. Jacob et al. [38,39] proposed an algorithm for the 3D reconstruction of DNA filaments from a pair of stereo cryo-electron micrographs; their method uses a snake-like algorithm to fit the 2D data and specifies the 3D model of the filaments as a spline curve.
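To illustrate the local adaptive thresholding idea used for particle detection in [37] (a toy version, not the authors' implementation; the window radius and offset are illustrative), a pixel is marked as foreground when it exceeds the mean of its neighborhood by an offset:

```python
def adaptive_threshold(img, radius=1, offset=0.0):
    """Binarize `img` (list of lists) against each pixel's local mean."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            # Gather the (clipped) local window around pixel (r, c).
            window = [img[i][j]
                      for i in range(max(0, r - radius), min(h, r + radius + 1))
                      for j in range(max(0, c - radius), min(w, c + radius + 1))]
            local_mean = sum(window) / len(window)
            out[r][c] = 1 if img[r][c] > local_mean + offset else 0
    return out

img = [[10, 10, 10, 10],
       [10, 80, 90, 10],
       [10, 85, 88, 10],
       [10, 10, 10, 10]]
print(adaptive_threshold(img, radius=1, offset=5))
```

Because the threshold adapts to the local mean, the method tolerates the slowly varying background illumination typical of electron micrographs better than a single global threshold.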
The early detection of cancer is one of the keys to achieving a positive outcome in cancer therapy, and many cancer researchers believe that nanomaterials are playing an increasingly important role in improving the ability of oncologists to find cancer in its earliest and most treatable stages. Weihong Tan et al. [40] used aptamer-labeled magnetic nanoparticles to collect and detect small numbers of leukemia cells in whole blood and separate them from other cells in a magnetic field. Ronald Andres et al. [41] developed folic-acid-based targeting agents to use with gold nanoparticles as light-activated thermal scalpels for killing malignant cells, many of which overexpress a receptor for folic acid on their cell surfaces. Florence Delie et al. [42] built a powerful photosensitizer (the active agent in photodynamic therapy, which is used to treat or relieve the symptoms of esophageal cancer and non-small cell lung cancer) that kills cancer cells when irradiated with light. Joseph Wang et al. [43,44] used electrodes modified with carbon nanotubes to create highly sensitive devices capable of detecting specific sequences of DNA, including those associated with the breast cancer gene BRCA1. Michael Strano et al. [45] used carbon nanotubes to create a biomedical sensor that can be implanted into a diabetes sufferer and then used to measure glucose levels at any time, without the need for a blood sample.

Nano-scale Characterization and Analysis
Nano-scale characterization and analysis involves analyzing materials at the atomic, molecular, and macromolecular levels, often using imaging. Atomic force microscope (AFM), scanning probe microscope (SPM), and SEM/TEM (scanning electron/transmission electron microscopy) systems provide high-resolution imaging for nanotechnology research and allow scientists to peer into the nano world. Traditional biomedical imaging techniques are anatomically and functionally oriented and reflect structural features and physiological and pathological phenomena. Topographic imaging at the molecular level involves an indirect mapping of the surface of a material based on a probe that can follow the shape of the surface. Molecular imaging systems can detect, capture, and monitor in vivo the molecular and cellular abnormalities that cause diseases and their associated symptoms. The computational issues in this area range from image acquisition to image processing to image analysis.
Fluorescence microscope imaging is extensively used in the study of biochemical and biological molecules in solution, and many computational techniques have been developed to address its limitations. In [46], Merryman and Kovacevic proposed an algorithm for adaptive, efficient acquisition of fluorescence microscopy data sets using a multirate approach, in the context of classifying proteins by their subcellular locations. Because their acquisition scheme retains high frequencies while saving samples where only low frequencies are present, it outperforms standard downsampling [47]. Lidke, Rieger, et al. [48] derived relationships for the minimum uncertainty in an anisotropy measurement based on the total number of collected photons, and used quantitative methods for the acquisition and processing of anisotropy data that return the expected per-pixel error of the anisotropy based on photon statistics. They presented theoretical and experimental considerations for detecting fluorescence polarization by measurements of anisotropy in the confocal microscope.
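The general adaptive-acquisition idea, spending extra samples only where high-frequency content is suspected, can be illustrated with a 1D toy; this is a loose sketch of the principle, not the multirate scheme of [46]:

```python
def adaptive_acquire(signal, step=4, tol=1.0):
    """Return the indices actually sampled: a coarse grid everywhere,
    refined to full rate wherever adjacent coarse samples differ by > tol."""
    taken = set(range(0, len(signal), step))       # coarse pass
    for i in range(0, len(signal) - step, step):
        if abs(signal[i + step] - signal[i]) > tol:  # detail suspected here
            taken.update(range(i, i + step))         # refine this interval
    return sorted(taken)

flat_then_edge = [0.0] * 8 + [5.0] * 8
print(adaptive_acquire(flat_then_edge, step=4, tol=1.0))
```

On this signal, only the interval containing the jump is sampled densely; the flat regions keep the coarse rate, which is the source of the sample savings.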
Images of nano-structures are often noisy, so segmentation and denoising methods are required for nano-scale images. Gat [49] proposed a method for segmenting an image using a geometric model of the observed structure; the algorithm explores a pre-defined space of segmentations using branch-and-bound, and its results are guaranteed to be globally optimal. Luciani et al. [50] combined multiresolution analysis with a thresholding method to produce a new automatic process for segmenting AFM images, with the aim of precisely measuring the dimensions of nano-domains. Luck, Carlson, et al. [51] combined a model-based algorithm with image enhancement, a Gauss-Markov model, and a Bayesian framework to develop an effective algorithm for automatically segmenting nuclei in confocal reflectance images of cervical tissue. Noise in computer vision applications can usually be modeled quite well as additive, zero-mean white Gaussian noise, but nano-scale images suffer from low intensities and thus mainly from Poisson-like noise. Scharr, Felsberg, and Forssén [52] applied B-spline channel smoothing to handle non-zero-mean noise and meet the requirements imposed by this noise characteristic; compared to the then-popular anisotropic diffusion schemes, their technique performed better on noisy nano-scale images of silicon structures. Yu and Bajaj [53] used an automatic algorithm to detect and isolate the asymmetric units of an icosahedral density map, integrating the fast marching algorithm with rotational transforms for icosahedral and local symmetry axes to segment the boundaries of asymmetric units. With more and more virus maps available at near-atomic resolution, their method has been validated on both real cryo-EM maps and blurred maps of PDB structures. Automated segmentation of tube-like structures is an interesting topic in biomedical imaging [54-58]. Abdul-Karim, Roysam, et al. [59] combined the minimum description length principle with a recursive random search algorithm to select optimal parameter settings for vessel and neurite segmentation. Their method is fully automatic, self-contained, and free of user interaction, as demonstrated on images of human retinas and cultured neurons.
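One standard way to handle the Poisson-like noise noted above (distinct from the channel-smoothing method of [52]) is a variance-stabilizing transform followed by ordinary smoothing. A minimal sketch using the Anscombe transform:

```python
import math

def anscombe(x):
    """Variance-stabilizing transform: Poisson counts -> ~unit-variance noise."""
    return 2.0 * math.sqrt(x + 3.0 / 8.0)

def inverse_anscombe(y):
    """Simple algebraic inverse (an unbiased inverse is more involved)."""
    return (y / 2.0) ** 2 - 3.0 / 8.0

def denoise(counts, radius=1):
    """Stabilize, box-smooth, invert: a minimal Poisson-noise denoiser (1D)."""
    stabilized = [anscombe(c) for c in counts]
    smoothed = []
    for i in range(len(stabilized)):
        lo, hi = max(0, i - radius), min(len(stabilized), i + radius + 1)
        smoothed.append(sum(stabilized[lo:hi]) / (hi - lo))
    return [inverse_anscombe(y) for y in smoothed]

print(denoise([4.0, 6.0, 3.0, 5.0, 4.0]))
```

After stabilization, the noise is approximately zero-mean and signal-independent, so any conventional Gaussian-noise smoother (here a trivial box filter) can be applied before inverting.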
Ryu, Horn, et al. [60] introduced a Voronoi-diagram-based image reconstruction algorithm that employs structured illumination patterns to increase the resolution of an optical microscopy imaging system. Their algorithm combines high-resolution textured illumination with a relatively low-resolution sensor: a high-resolution image of the target is generated computationally by processing a series of low-resolution images obtained by illuminating the target with a sequence of finely textured patterns.
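A loose sketch of the Voronoi partition at the heart of such reconstruction, with the illumination-pattern machinery of [60] omitted and all names illustrative: each reconstructed pixel takes the value of its nearest measured sample, so the samples partition the image into Voronoi cells.

```python
def voronoi_reconstruct(samples, h, w):
    """samples: {(row, col): value}. Fill an h x w grid where each pixel
    copies the value of the nearest sample (its Voronoi cell's site)."""
    def nearest(r, c):
        return min(samples, key=lambda p: (p[0] - r) ** 2 + (p[1] - c) ** 2)
    return [[samples[nearest(r, c)] for c in range(w)] for r in range(h)]

grid = voronoi_reconstruct({(0, 0): 1.0, (3, 3): 9.0}, 4, 4)
print(grid[0][0], grid[3][3])  # 1.0 9.0
```

Real structured-illumination reconstruction interpolates far more carefully across cell boundaries; this only shows how scattered measurements induce a piecewise-constant partition of the image plane.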
Imaging resolution in optical coherence tomography (OCT) is a key element in acquiring clinically useful optical biopsies of tissue. Ralston, Marks, et al. [61] developed Gaussian-beam deconvolution algorithms to mitigate transverse blurring and the apparent loss of transverse resolution in OCT. They also investigated an iterative expectation-maximization algorithm, the Richardson-Lucy algorithm, with a beam-width-dependent iteration scheme. Their algorithms concentrate energy near boundaries, providing a good approximation to cellular boundaries and subcellular features, and tend to be more robust against errors from defocus blur.
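The Richardson-Lucy iteration named above is compact enough to sketch in 1D. This is a generic textbook version, not the beam-width-dependent scheme of [61]:

```python
def conv(x, k):
    """'Same'-size convolution with zero padding; k has odd length."""
    r = len(k) // 2
    return [sum(k[j] * x[i + j - r] for j in range(len(k))
                if 0 <= i + j - r < len(x)) for i in range(len(x))]

def richardson_lucy(observed, psf, iters=50, eps=1e-12):
    """Iterative EM deconvolution: estimate *= (observed / blur) (*) psf_mirror."""
    estimate = [1.0] * len(observed)       # flat, positive initial guess
    psf_mirror = psf[::-1]
    for _ in range(iters):
        blurred = conv(estimate, psf)
        ratio = [o / (b + eps) for o, b in zip(observed, blurred)]
        estimate = [e * c for e, c in zip(estimate, conv(ratio, psf_mirror))]
    return estimate

psf = [0.25, 0.5, 0.25]                    # toy symmetric blur kernel
truth = [0, 0, 0, 10.0, 0, 0, 0]
observed = conv(truth, psf)                # point source spread over 3 pixels
restored = richardson_lucy(observed, psf)
print(max(range(len(restored)), key=restored.__getitem__))  # 3: peak recovered
```

The multiplicative update keeps the estimate non-negative, which matches photon-count data and is one reason the algorithm suits low-light optical imaging.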

Conclusion
The successful development and implementation of computational methods brings real benefits to biomedical nanotechnology. Despite an increasing number of praiseworthy efforts to apply computational methods to biomedical nanotechnology, the field is still very much in its infancy. The future is difficult to predict; however, we believe promising research paradigms will involve novel probes, sensitive data acquisition, multi-modality and multi-mode fusion, advanced reconstruction methods, prior-knowledge-based regularization, sophisticated image analysis, and/or optimized system integration. There is no doubt that nano computing will play an increasingly important role in biomedical nanometrics and nano-materials research.