Abstract
In the primate cerebral cortex, depth-rotation-sensitive (DRS) neurons respond preferentially to depth rotation motion, yet this property is rarely exploited in computational models for depth rotation detection. To fill this gap, a novel feedforward visual neural network is developed for detecting objects rotating in depth, based on recent neurophysiological findings on the mammalian visual system. The proposed network consists of two parts: a presynaptic and a postsynaptic neural network. The former comprises multiple lateral inhibition sub-networks that capture visual motion information; the latter extracts cues of translational and depth motion and then synthesizes these cues to perceive the depth rotation of an object. Experimentally, the network is evaluated on different types of depth rotation under multiple conditions and settings. Numerical experiments show that not only can it effectively detect the spatio-temporal energy change of a moving object rotating in depth, but its output excitation curve is also quasi-sinusoidal, consistent with the projective-geometry hypothesis of Johansson and Jansson. This research is a critical step toward the construction of an artificial vision system for depth rotation recognition.
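The two-stage pipeline described in the abstract can be illustrated with a minimal sketch: a presynaptic stage that applies centre-surround lateral inhibition to successive frame differences, and a postsynaptic stage that pools the inhibited spatio-temporal energy into a single excitation value per time step. This is not the authors' model; the kernel weights (`w_center`, `w_surround`), the four-neighbour surround, and the simple energy pooling are all illustrative assumptions standing in for the paper's actual sub-network organization.

```python
import numpy as np

def lateral_inhibition(frame_diff, w_center=1.0, w_surround=0.25):
    """Presynaptic sub-network sketch: each cell's excitation is its
    centre input minus a fraction of its four neighbours (hypothetical
    weights), rectified to non-negative values."""
    padded = np.pad(frame_diff, 1, mode="edge")
    surround = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                padded[1:-1, :-2] + padded[1:-1, 2:])
    return np.maximum(w_center * frame_diff - w_surround * surround, 0.0)

def depth_rotation_excitation(frames):
    """Postsynaptic stage sketch: sum the laterally inhibited energy of
    each successive frame difference into one excitation per step."""
    excitations = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        diff = np.abs(curr.astype(float) - prev.astype(float))
        excitations.append(lateral_inhibition(diff).sum())
    return np.array(excitations)
```

Under orthographic projection, a bar rotating in depth projects as a bar whose width modulates sinusoidally, so feeding such a stimulus through this sketch yields an excitation trace that rises and falls with the width change, loosely analogous to the quasi-sinusoidal output curve reported in the paper.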
References
Yan C, Xie H, Yang D et al (2018) Supervised hash coding with deep neural network for environment perception of intelligent vehicles. IEEE Trans Intell Transp Syst 19:284–295
Vlasits A, Baden T (2019) Motion vision: a new mechanism in the mammalian retina. Curr Biol 29:R933–R935
Koenderink JJ, van Doorn AJ (1976) Local structure of movement parallax of the plane. J Opt Soc Am 66:717–723
Verri A, Girosi F, Torre V (1990) Differential techniques for optical flow. J Opt Soc Am A 7:912–922
Maunsell JH, Van Essen DC (1983) Functional properties of neurons in middle temporal visual area of the macaque monkey. I. Selectivity for stimulus direction, speed, and orientation. J Neurophysiol 49:1127–1147
Rind FC, Simmons PJ (1999) Seeing what is coming: building collision-sensitive neurones. Trends Neurosci 22:215–220
Saito H, Yukie M, Tanaka K et al (1986) Integration of direction signals of image motion in the superior temporal sulcus of the macaque monkey. J Neurosci 6:145–157
Sakata H, Shibutani H, Kawano K, Harrington TL (1985) Neural mechanisms of space vision in the parietal association cortex of the monkey. Vis Res 25:453–463
Sakata H, Shibutani H, Ito Y, Tsurugai K (1986) Parietal cortical neurons responding to rotary movement of visual stimulus in space. Exp Brain Res 61:658–663
Hu B, Yue S, Zhang Z (2017) A rotational motion perception neural network based on asymmetric spatiotemporal visual information processing. IEEE Trans Neural Netw Learn Syst 28:2803–2821
Sakata H, Shibutani H, Ito Y et al (1994) Functional properties of rotation-sensitive neurons in the posterior parietal association cortex of the monkey. Exp Brain Res 101:183–202
Sakata H, Taira M, Kusunoki M et al (1997) The parietal association cortex in depth perception and visual control of hand action. Trends Neurosci 20:350–357
Wang H, Peng J, Zheng X, Yue S (2020) A robust visual system for small target motion detection against cluttered moving backgrounds. IEEE Trans Neural Netw Learn Syst 31:839–853
Shojaei K (2019) Three-dimensional neural network tracking control of a moving target by underactuated autonomous underwater vehicles. Neural Comput Appl 31:509–521
Li L, Zhang Z, Lu J (2021) Artificial fly visual joint perception neural network inspired by multiple-regional collision detection. Neural Netw 135:13–28
Fu Q, Hu C, Peng J et al (2020) A robust collision perception visual neural network with specific selectivity to darker objects. IEEE Trans Cybern 50:5074–5088
Maheshan MS, Harish BS, Nagadarshan N (2019) A convolution neural network engine for sclera recognition. Int J Interact Multimed Artif Intell 6:78–83
Liu D, Bellotto N, Yue S (2020) Deep spiking neural network for video-based disguise face recognition based on dynamic facial movements. IEEE Trans Neural Netw Learn Syst 31:1843–1855
Jha S, Dey A, Kumar R, Kumar-Solanki V (2019) A novel approach on visual question answering by parameter prediction using faster region based convolutional neural network. Int J Interact Multimed Artif Intell 5:30–37
Hu B, Zhang Z, Li L (2019) LGMD-based visual neural network for detecting crowd escape behavior. In: Proceedings 2018 5th IEEE international conference cloud computing intelligent systems, CCIS 2018, vol 6, pp 772–778
Chen J, Su W, Wang Z (2020) Crowd counting with crowd attention convolutional neural network. Neurocomputing 382:210–220
Braunstein ML (1972) Perception of rotation in depth: a process model. Psychol Rev 79:510–524
Hershberger WA, Stewart MR, Laughlin NK (1976) Conflicting motion perspective simulating simultaneous clockwise and counterclockwise rotation in depth. J Exp Psychol Hum Percept Perform 2:174–178
Braunstein ML (1984) Perception of rotation in depth: the psychophysical evidence. ACM SIGGRAPH Comput Graph 18:25–26
Shulman GL (1991) Attentional modulation of mechanisms that analyze rotation in depth. J Exp Psychol Hum Percept Perform 17:726–737
Braunstein ML (1976) Depth perception through motion. Academic Press, London
Petersik JT (1980) Rotation judgments and depth judgments: separate or dependent processes? Percept Psychophys 27:588–590
Andersen GJ, Braunstein ML (1983) Dynamic occlusion in the perception of rotation in depth. Percept Psychophys 34:356–362
Johansson G, Jansson G (1968) Perceived rotary motion from changes in a straight line. Percept Psychophys 4:165–170
Carpenter DL, Dugan MP (1983) Motion parallax information for direction of rotation in depth: order and direction components. Perception 12:559–569
Miles FA (1998) The neural processing of 3-D visual information: evidence from eye movements. Eur J Neurosci 10:811–822
Schaafsma SJ, Duysens J, Gielen CCAM (1997) Responses in ventral intraparietal area of awake macaque monkey to optic flow patterns corresponding to rotation of planes in depth can be explained by translation and expansion effects. Vis Neurosci 14:633–646
Simmons PJ, Rind FC, Santer RD (2010) Escapes with and without preparation: the neuroethology of visual startle in locusts. J Insect Physiol 56:876–883
Rind FC, Bramwell DI (1996) Neural network based on the input organization of an identified neuron signaling impending collision. J Neurophysiol 75:967–985
Yue S, Rind FC (2006) Collision detection in complex dynamic scenes using an LGMD-based visual neural network with feature enhancement. IEEE Trans Neural Netw 17:705–716
Yue S, Rind FC (2006) Visual motion pattern extraction and fusion for collision detection in complex dynamic scenes. Comput Vis Image Underst 104:48–60
Yue S, Rind FC (2013) Postsynaptic organisations of directional selective visual neural networks for collision detection. Neurocomputing 103:50–62
Gabriel JP, Trivedi CA, Maurer CM et al (2012) Layer-specific targeting of direction-selective neurons in the zebrafish optic tectum. Neuron 76:1147–1160
Bereshpolova Y, Stoelzel CR, Su C et al (2019) Activation of a visual cortical column by a directionally selective thalamocortical neuron. Cell Rep 27:3733–3740
Fried SI, Münch TA, Werblin FS (2002) Mechanisms and circuitry underlying directional selectivity in the retina. Nature 420:411–414
Huang X, Rangel M, Briggman KL, Wei W (2019) Neural mechanisms of contextual modulation in the retinal direction selective circuit. Nat Commun 10:1–15
Fu Q, Yue S (2017) Modeling direction selective visual neural network with ON and OFF pathways for extracting motion cues from cluttered background. In: 2017 International joint conference on neural networks (IJCNN), IEEE, pp 831–838
Fu Q, Yue S (2020) Modelling Drosophila motion vision pathways for decoding the direction of translating objects against cluttered moving backgrounds. Biol Cybern 114:443–460
Wei W (2018) Neural mechanisms of motion processing in the mammalian retina. Annu Rev Vis Sci 4:165–192
Vlasits AL, Euler T, Franke K (2019) Function first: classifying cell types and circuits of the retina. Curr Opin Neurobiol 56:8–15
Morrone MC, Burr DC, Vaina LM (1995) Two stages of visual processing for radial and circular motion. Nature 376:507–509
Fu Q, Wang H, Hu C, Yue S (2019) Towards computational models and applications of insect visual systems for motion perception: a review. Artif Life 25:263–311
Grünert U, Martin PR (2020) Cell types and cell circuits in human and non-human primate retina. Prog Retin Eye Res 78:1–33
Field GD, Rieke F (2002) Nonlinear signal transfer from mouse rods to bipolar cells and implications for visual sensitivity. Neuron 34:773–785
Gollisch T, Meister M (2010) Eye smarter than scientists believed: neural computations in circuits of the retina. Neuron 65:150–164
Yang X, Wu SM (1991) Feedforward lateral inhibition in retinal bipolar cells: Input-output relation of the horizontal cell-depolarizing bipolar cell synapse. Proc Natl Acad Sci 88:3310–3313
Thoreson WB, Mangel SC (2012) Lateral interactions in the outer retina. Prog Retin Eye Res 31:407–441
Rind FC, Wernitznig S, Pölt P et al (2016) Two identified looming detectors in the locust: ubiquitous lateral connections among their inputs contribute to selective responses to looming objects. Sci Rep 6:1–16
Hu B, Zhang Z (2018) Bio-plausible visual neural network for spatio-temporally spiral motion perception. Neurocomputing 310:96–114
Yue S, Rind FC (2013) Redundant neural vision systems-competing for collision recognition roles. IEEE Trans Auton Ment Dev 5:173–186
Albright TD, Desimone R, Gross CG (1984) Columnar organization of directionally selective cells in visual area MT of the macaque. J Neurophysiol 51:16–31
Schneider M, Kemper VG, Emmerling TC et al (2019) Columnar clusters in the human motion complex reflect consciously perceived motion axis. Proc Natl Acad Sci U S A 116:5096–5101
Beardsley SA, Ward RL, Vaina LM (2003) A neural network model of spiral-planar motion tuning in MSTd. Vision Res 43:577–595
Acknowledgements
The authors sincerely thank the anonymous reviewers for their valuable comments and the editors for their support. This work was supported by the National Natural Science Foundation of China (Nos. 62063002, 61563009, 62066006).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Hu, B., Zhang, Z. Bio-inspired visual neural network on spatio-temporal depth rotation perception. Neural Comput & Applic 33, 10351–10370 (2021). https://doi.org/10.1007/s00521-021-05796-z