Construction of a brain–machine hybrid system to evaluate adaptability of an insect
Highlights
► To examine the dynamics of a brain, we developed a brain–machine hybrid system.
► We set a brain as the controller of a robot moving in a real environment.
► The model system we chose was the silkworm moth’s odor searching behavior.
► We selected command signals to make the robot reach the odor source.
► We also recorded compensatory neural responses to the movement of the robot.
Introduction
Insects perform adaptive behavior under changing environmental conditions through information processing in a simple nervous system. Here we define adaptability as the ability to execute a behavioral task under changing environmental conditions. This ability is of interest not only for biology but also for robotics, because robots must likewise execute tasks under changing environmental conditions.
Male silkworm moths, Bombyx mori, orient toward conspecific females by displaying a programmed behavioral pattern (straight-line walking, zigzagging turns and looping) upon detection of the sex pheromone by their antennae [1], [2] (Fig. 1). This programmed behavioral pattern is repeated each time a pheromone plume is encountered, resulting in localization of the goal.
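The programmed pattern described above can be sketched as a simple time-based state machine in which a pheromone hit restarts the sequence. This is an illustrative sketch only; the phase names follow the paper, but the function name and the duration parameters are hypothetical, not measured values.

```python
from enum import Enum, auto

class Phase(Enum):
    SURGE = auto()   # straight-line walking after a pheromone hit
    ZIGZAG = auto()  # alternating turns
    LOOP = auto()    # continuous turning in one direction

def programmed_phase(pheromone_hit: bool, t_since_hit: float,
                     t_surge: float = 0.5, t_zigzag: float = 2.0) -> Phase:
    """Return the current phase of the programmed behavioral pattern.

    A new pheromone hit always restarts the program at SURGE; otherwise
    the moth falls through SURGE -> ZIGZAG -> LOOP as the time since the
    last hit grows. Durations here are placeholders for illustration.
    """
    if pheromone_hit or t_since_hit < t_surge:
        return Phase.SURGE
    if t_since_hit < t_surge + t_zigzag:
        return Phase.ZIGZAG
    return Phase.LOOP
```

Because every hit resets the sequence, repeated plume contacts keep the moth surging upwind, while loss of the plume lets the pattern unfold toward looping, which is the casting-like search described in the text.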
In a previous study, it was reported that the silkworm moth could compensate for motor asymmetry and behave adaptively during orientation toward a pheromone source [3]. In that study, an insect-controlled two-wheeled robot was built to examine the adaptability of the silkworm moth under controllable circumstances. The robot moved according to the locomotion of a silkworm moth walking on a sphere mounted on the robot. With this robot, artificial changes of motor gain could impose unintended movements on the silkworm moth, and under these conditions the silkworm moth adapted to the new circumstances. Moreover, it was revealed that the silkworm moth used visual cues under these conditions, so the integration of visual and olfactory sensory input plays an important role in this adaptive behavior.
In our laboratory, we have examined the input–output pathways in the silkworm moth’s brain underlying this behavior by applying several biological methods, for example molecular genetics, biochemical analysis, electrophysiology and behavioral experiments. Based on the fundamental knowledge from these biological studies, models of the neural pathways are being constructed [4], [5], [6], [7]. Moreover, robots using these models as controllers have been developed to test their function in realistic environments [8].
Currently, however, these brain models describe adaptability only incompletely because of the lack of knowledge about the function of each neuron at the network level. In addition, we think that adaptability is elicited by the interaction between insects and their environments, so a brain that executes adaptive behavior should be analyzed under closed-loop conditions. Recently, in the interdisciplinary field of neuroscience and robotics, closed-loop experimental systems that connect a brain with a robot have been developed [9], [10]. These systems succeeded in studying phototaxis by giving the brain feedback in the form of electrical stimuli reflecting the behavior of the robot. Although they were well suited to the study of phototaxis, because optical stimuli in the environment are easy to translate into feedback stimuli, they would be difficult to apply to chemical plume tracking, whose odor distribution dynamics are far more complicated. We have proposed a new experimental method, a brain–machine hybrid system, to estimate the dynamic characteristics of an insect brain from its input stimuli and output signals and to elucidate the modification of behavior by output feedback (Fig. 2). From this knowledge, we expect to implement the resulting algorithms in artificial systems and to build models based on the function and anatomy of the insect brain.
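The closed-loop arrangement described above (neural commands drive the robot; the robot's new pose determines the sensory input fed back to the brain) can be sketched as a generic loop. All function names here are placeholders: in the real system `decode` would stand for spike-train processing, `plant` for the robot, `sense` for the odor/visual environment, and `stimulate` for the feedback channel.

```python
def run_closed_loop(decode, plant, sense, stimulate, n_steps, pose):
    """Schematic brain-machine closed loop.

    Each cycle: decode neural activity (here abstracted as a function of
    the current stimulus) into a motor command, apply it to the robot,
    sense the environment at the robot's new pose, and feed the result
    back to the brain as stimulation.
    """
    stimulus = sense(pose)
    for _ in range(n_steps):
        command = decode(stimulus)   # brain -> motor command
        pose = plant(pose, command)  # command -> robot movement
        stimulus = sense(pose)       # movement -> new sensory input
        stimulate(stimulus)          # sensory input -> brain feedback
    return pose
```

The point of the sketch is structural: unlike open-loop recording, the brain's output changes its own future input, which is the condition under which we argue adaptability must be studied.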
In this study, we constructed a brain–machine hybrid system as an experimental platform to reproduce the behavior of an insect (Fig. 3), and conducted initial experiments to understand how silkworm moths process information in the brain during adaptive odor searching behavior.
Section snippets
Brain–machine hybrid system
We constructed a brain–machine hybrid system that could reconstruct silkworm moth behavior (Fig. 4). We developed this hybrid system in the following steps: (1) selection of command signals; (2) development of a conversion rule from signals to behavior; (3) development of small amplifiers.
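Step (2), the conversion rule from bilateral command signals to robot behavior, can be illustrated with a simple differential-drive mapping. This is a minimal sketch under assumed conventions: the linear rule, the gain, the speed limit, and the sign convention (higher right-side activity turning the robot left) are all hypothetical, not the rule used in the paper.

```python
def wheel_speeds(rate_left: float, rate_right: float,
                 gain: float = 0.2, v_max: float = 50.0):
    """Map bilateral motor-neuron firing rates (spikes/s) to the wheel
    speeds (mm/s) of a two-wheeled robot.

    Hypothetical linear rule: the summed activity sets forward speed and
    the left-right difference steers, as in a differential drive. Both
    wheel speeds are clamped to +/- v_max.
    """
    forward = gain * (rate_left + rate_right) / 2.0
    turn = gain * (rate_right - rate_left) / 2.0
    left = min(v_max, max(-v_max, forward - turn))
    right = min(v_max, max(-v_max, forward + turn))
    return left, right
```

With symmetric activity the robot walks straight; an activity imbalance slows one wheel and speeds the other, producing the turns of the programmed pattern.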
Experiments in the wind tunnel
To test the behavioral pattern and odor source orientation behavior of the hybrid system, we used an experimental wind tunnel in which reproducible environments could be installed (Fig. 8(a)). The wind tunnel was made of styrofoam blocks (The Dow Chemical Company, Japan), and provided enough space (W 840 mm × H 300 mm × L 1800 mm) for the hybrid system (W 140 mm × H 60 mm × L 130 mm) to move. We connected the wind tunnel to an exhaust duct to produce a constant air flow of 0.7 m/s average speed. Inside the wind
Response of command signals to unintentional movement of the hybrid system
We performed initial experiments on the adaptability of an insect using the hybrid system. The hybrid system can be moved arbitrarily while a moth’s responses to external stimuli are measured. We therefore imposed unintended movements on a moth mounted on the hybrid system and examined whether it could compensate for the externally imposed movement.
In a previous study, it was shown that the silkworm moth compensated for motor asymmetry in orientation behavior using visual feedback [3]. In this study, we
Reconstruction of pheromone tracking by the hybrid system
Using the hybrid system, we observed the programmed behavioral pattern of a male moth following a single pheromone stimulus. We show a typical result by plotting the angle between the direction of the pheromone source from the hybrid system’s start point and the longitudinal axis of the hybrid system (Fig. 9). The average angular velocity was 9.57°/s before the stimulus and 35.8°/s after it. Elements of the moth’s programmed behavioral patterns (straight-line walking, zigzagging turns, and
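The angular-velocity figures quoted above come from a heading trace sampled over time. A sketch of that computation, assuming headings in degrees at a fixed sampling interval (the function name and interface are ours, not from the paper), must unwrap differences across the ±180° seam so a turn through the rear does not register as a huge jump:

```python
def mean_angular_speed(headings_deg, dt):
    """Mean absolute angular velocity (deg/s) of a heading trace
    sampled every dt seconds.

    Each successive difference is mapped to the shortest signed angle
    in (-180, 180] before its magnitude is accumulated.
    """
    total = 0.0
    for a, b in zip(headings_deg, headings_deg[1:]):
        d = (b - a + 180.0) % 360.0 - 180.0  # shortest signed difference
        total += abs(d)
    return total / (dt * (len(headings_deg) - 1))
```

Applied to the heading trace before and after the pheromone stimulus, this kind of statistic distinguishes the slow pre-stimulus drift from the vigorous post-stimulus zigzagging and looping.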
Discussion
In this study, we constructed a brain–machine hybrid system that could execute odor searching behavior. On the artificial body, the moth can interact with the environment, and we can examine how adaptive behavior is produced by manipulating the movement of the robot. This is a new approach to understanding the neural activities underlying adaptive behavior in realistic environments.
First, we selected steering signals corresponding to walking direction that were activated during neck swinging. The 2nd
Conclusion
In this study, we constructed a brain–machine hybrid system by selecting and using appropriate steering signals. First, we showed that the hybrid system could reproduce the moths’ behavioral pattern and orientation behavior. Using the hybrid system, we could intervene in the relationship between brain and environment in which adaptability is produced. As a first step toward understanding adaptive behavior at the level of the nervous system, we showed that compensatory visual feedback existed
Acknowledgment
This study was supported by MEXT (Scientific Research on Priority Areas 454 “Mobiligence”, 17075007, and KAKENHI 09J01188).
We thank Dr. Stephan Shuichi Haupt (The University of Tokyo), for allowing us to use his locomotion recorder system and giving us much advice and Dr. Noriyasu Ando (The University of Tokyo) for allowing us to use his visual stimulation setup. We also thank Shigeru Toriihara (Tokyo Institute of Technology) for designing the basic recording system on a brain–machine hybrid system.
References (19)
- Orientation of the male silkmoth to the sex attractant bombykol
- et al., Crossmodal visual input for odor tracking during fly flight, Current Biology (2008)
- Bombyx mori mating dance: an essential in locating the female, Applied Entomology and Zoology (1979)
- et al., Insect controlled robot - evaluation of adaptation ability, Journal of Robotics and Mechatronics (2007)
- et al., Morphological and physiological properties of pheromone-triggered flipflopping descending interneurons of the male silkworm moth, Bombyx mori, Journal of Comparative Physiology A (1994)
- et al., Physiological and morphological characterization of olfactory descending interneurons of the male silkworm moth, Bombyx mori, Journal of Comparative Physiology A (1999)
- et al., Neural control mechanisms of the pheromone-triggered programmed behavior in male silkmoths revealed by double-labeling of descending interneurons and a motor neuron, Journal of Comparative Neurology (2005)
- et al., Neurons associated with the flip–flop activity in the lateral accessory lobe and ventral protocerebrum of the silkworm moth brain, Journal of Comparative Neurology (2010)
- et al., Neural basis of odor-source searching behavior in insect brain systems evaluated with a mobile robot, Chemical Senses (2005)
Ryo Minegishi received his B.S. and M.Sc. degrees in Neurobiology from the Institute of Biological Sciences, University of Tsukuba, in 2007 and 2009, respectively. Since 2009 he has been a research fellow of the Japan Society for the Promotion of Science. He is currently in the doctoral course of The University of Tokyo. His research interests include neuroethology, bio-robotics and brain–machine interfaces. He is a member of ZSJ, JSCPB and SICE.
Atsushi Takashima received his B.S., M.Sc. and Ph.D. degrees in engineering from the Department of Mechanical and Control Engineering, Tokyo Institute of Technology, in 2003, 2005, and 2011, respectively. He is currently in the doctoral course of the same university. His research interests include multi-robot systems and brain–machine interfaces. He is a member of RSJ and SICE.
Daisuke Kurabayashi received his B.E., M.E., and Ph.D. degrees from the Department of Precision Machinery Engineering, The University of Tokyo, in 1993, 1995, and 1998, respectively. He worked at the Institute of Physical and Chemical Research (RIKEN) from 1998 to 2001 as a postdoctoral researcher. Since 2001 he has been working at Tokyo Institute of Technology as an associate professor. His research interests include distributed autonomous systems, the functional structure of networks, and bio-robotics. He is a member of RSJ, JSME, JSPE, SICE and IEEE.
Ryohei Kanzaki received his B.S., M.S. and D.Sc. degrees in Neurobiology from the Institute of Biological Sciences, University of Tsukuba, in 1980, 1983 and 1986, respectively. From 1987 to 1990 he was a postdoctoral research fellow at the Arizona Research Laboratories, Division of Neurobiology, University of Arizona. From 1991 to 2003 he was successively an assistant professor, associate professor, and full professor at the Institute of Biological Sciences, University of Tsukuba. From 2004 to 2006 he was a full professor at the Department of Mechano-Informatics, Graduate School of Information Science and Technology, The University of Tokyo. Since 2006 he has been a full professor at the Research Center for Advanced Science and Technology (RCAST), The University of Tokyo.