Cognition

Volume 132, Issue 1, July 2014, Pages 1-15
Motor coordination uses external spatial coordinates independent of developmental vision

https://doi.org/10.1016/j.cognition.2014.03.005

Highlights

  • Bimanual coordination is constrained by external coordinates in sighted and congenitally blind individuals.

  • The perceptual code that guides motor coordination does not critically depend on vision.

  • The use of external coordinates by the blind contrasts with their use of anatomical coordinates in perception.

Abstract

The constraints that guide bimanual movement coordination are informative about the processing principles underlying movement planning in humans. For example, symmetry relative to the body midline benefits finger and hand movements independent of hand posture. This symmetry constraint has been interpreted to indicate that movement coordination is guided by a perceptual code. Although it has been assumed implicitly that the perceptual system at the heart of this constraint is vision, this relationship has not been tested. Here, congenitally blind and sighted participants made symmetrical and non-symmetrical (that is, parallel) bimanual tapping and finger oscillation movements. For both groups, symmetrical movements were executed more correctly than parallel movements, independent of anatomical constraints like finger homology and hand posture. For the blind, the reliance on external spatial factors in movement coordination stands in stark contrast to their use of an anatomical reference frame in perceptual processing. Thus, the externally coded symmetry constraint evident in bimanual coordination can develop in the absence of the visual system, suggesting that the visual system is not critical for the establishment of an external-spatial reference frame in movement coordination.

Introduction

To guide actions in the world, the brain faces a difficult challenge: sensory information about objects must be translated into appropriate muscle contractions that bring the effector towards them. However, the spatial coordinate systems inherent in the different senses – for example, an eye-centered reference frame for visual information falling onto the retina (Batista, Buneo, Snyder, & Andersen, 1999) – do not readily define the kinds of muscle activations and joint constellations necessary for movement towards objects (Herbort, Butz, & Pedersen, 2010). Much research has, therefore, been concerned with the question of what kind of coordinate system dominates movement planning and execution.

One field of research in which this debate has been central is the coordination between different effectors. Movements that are mirror symmetrical with respect to the body midline are executed with greater precision, and can be performed at higher speeds, than non-symmetrical movements. For example, when rhythmically flexing and extending the right and left wrists, movement performance is superior when both wrists are flexed and extended in synchrony, compared to when one wrist must be flexed while the other is extended (Cohen, 1971, Kelso, 1984, Kelso et al., 1986). Similar principles govern other types of finger movements like tapping (Mechsner, Kerzel, Knoblich, & Prinz, 2001), finger flexion and extension (Carson and Riek, 1998, Riek et al., 1992), and finger abduction and adduction (Mechsner et al., 2001). Finger flexion and extension are movements that bring the finger down and up, respectively, when the hand is held palm down. Finger abduction and adduction, in contrast, are movements that bring the right index finger to the left and right, respectively, when the right hand is held palm down. In the following, we will refer to the latter as “finger oscillation” for brevity.

There has been an extensive debate about the origin of “mirror symmetry” for bimanual movement coordination. It was originally suggested that symmetry pertained to the use of homologous muscles in the two wrists, hands, or fingers, respectively (Cohen, 1971, Kelso, 1984, Riek et al., 1992). This interpretation rested on the fact that, due to the body’s symmetry, movements towards the midline require the use of homologous muscles in the two hands. Accordingly, performance advantages of symmetric movements may be due simply to synergies in the motor system during co-activation of homologous muscles. However, it was later shown that the tendency for mirror symmetry was preserved when the hands were held in opposite postures – one facing up, and the other facing down (Mechsner et al., 2001). In this situation, bimanual finger movements were still performed most successfully when the fingers of the two hands were directed towards and away from the body midline in synchrony, although this now required the concurrent use of non-homologous muscles in the two hands. Moreover, participants performed significantly worse in a condition in which the fingers had to be moved “in parallel”, that is, in the same direction in space (both fingers towards the left side in space, then both towards the right side in space). Note that this movement requires the use of homologous muscles when the hands are positioned in opposite postures. Thus, the facilitation of coordination by symmetry seemed to depend on perceptual factors, and not on motor-related mechanisms like muscle synergies.
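The dissociation between spatial symmetry and muscle homology described above can be made concrete with a small sketch. The following Python model is our illustration, not part of the original study; it simplifies anatomy to one rule – index-finger abduction moves the finger toward the thumb side, and flipping the hand mirrors that mapping in space – and then checks whether a symmetrical or a parallel oscillation engages homologous muscles in the two hands:

```python
# Illustrative model (not from the paper): which muscle group moves the
# index finger in a given spatial direction, and whether a coordination
# pattern engages homologous muscles in the two hands.

def muscle_for_direction(hand, posture, direction):
    """Return 'abduct' or 'adduct' for moving the index finger of the
    given hand ('left'/'right') in the given spatial direction
    ('left'/'right'), with posture 'palm-down' or 'palm-up'.

    Simplification: index-finger abduction moves the finger toward the
    thumb side; flipping the hand mirrors that mapping in space.
    """
    # Palms down, the right thumb points left and the left thumb right.
    thumb_side = "left" if hand == "right" else "right"
    if posture == "palm-up":
        thumb_side = "right" if thumb_side == "left" else "left"
    return "abduct" if direction == thumb_side else "adduct"

def uses_homologous_muscles(pattern, left_posture, right_posture):
    """First half-cycle of the oscillation: 'symmetrical' moves both
    index fingers toward the body midline, 'parallel' moves both
    leftward in external space."""
    left_dir, right_dir = (("right", "left") if pattern == "symmetrical"
                           else ("left", "left"))
    return (muscle_for_direction("left", left_posture, left_dir) ==
            muscle_for_direction("right", right_posture, right_dir))

# With identical postures, symmetry coincides with muscle homology;
# with opposite postures, the two dissociate.
print(uses_homologous_muscles("symmetrical", "palm-down", "palm-down"))  # True
print(uses_homologous_muscles("symmetrical", "palm-down", "palm-up"))    # False
print(uses_homologous_muscles("parallel", "palm-down", "palm-up"))       # True
```

The point of the sketch is only that posture decouples the two candidate coding schemes: with opposite postures, the externally symmetrical pattern requires non-homologous muscles, while the parallel pattern requires homologous ones.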

Further evidence for the use of perceptual codes in motor coordination comes from a tapping task (Mechsner et al., 2001). In this task, participants had to tap bimanual finger patterns. When participants tapped a mirror-symmetrical pattern (that is, tap the two middle fingers, then the two index fingers), they performed better than when they tapped a parallel pattern (that is, the left middle with the right index finger, then the left index with the right middle finger). If movement coordination were based on muscle homology, then, in this latter case, participants should tap best whenever two homologous fingers are tapped together. To test this hypothesis, tapping patterns were modified such that participants used the right middle and ring fingers, rather than the right index and middle fingers. Note that in this case two homologous fingers (the two middle fingers) tap together in a parallel rather than in a symmetrical tapping pattern. However, again, participants preferred the spatially mirror-symmetrical tapping pattern, further supporting the conclusion that movement coordination is governed by “perceptual” rather than by anatomical factors (Mechsner et al., 2001). Others have referred to non-anatomical influences on movement coordination as effects of “mutual movement direction”, which results in “extrinsic”, as opposed to egocentric, muscle-based coordinative constraints (Swinnen et al., 1998). Here, we refer to space-based coordination principles as being based on an external spatial reference frame, and contrast this term with an anatomical, muscle-based coding scheme.
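The pairing logic of the tapping manipulation can likewise be sketched in a few lines. The Python below is our illustration (names and structure are not from the original study); it orders each hand's fingers by their distance from the body midline, with palms down on a table, and derives which fingers tap together under symmetrical versus parallel instructions:

```python
# Illustrative sketch (our names, not from the study): which finger pairs
# tap in synchrony under symmetrical vs. parallel instructions, palms down.

MIDLINE_RANK = {"index": 1, "middle": 2, "ring": 3}  # distance from body midline

def tap_pairs(left_fingers, right_fingers, pattern):
    """Pair the fingers of the two hands that tap together."""
    left = sorted(left_fingers, key=MIDLINE_RANK.get)    # nearest midline first
    right = sorted(right_fingers, key=MIDLINE_RANK.get)
    if pattern == "symmetrical":
        # Mirror about the midline: equal distance ranks pair up.
        return list(zip(left, right))
    if pattern == "parallel":
        # Same left-to-right order in space: the left hand's outermost
        # finger is leftmost, the right hand's innermost finger is leftmost.
        return list(zip(left[::-1], right))
    raise ValueError(pattern)

# Standard condition: index + middle on both hands; symmetry pairs
# homologous fingers.
print(tap_pairs(["index", "middle"], ["index", "middle"], "symmetrical"))
# → [('index', 'index'), ('middle', 'middle')]
# Modified condition: middle + ring on the right hand; now the homologous
# pair (the two middles) taps together only in the parallel pattern.
print(tap_pairs(["index", "middle"], ["middle", "ring"], "parallel"))
# → [('middle', 'middle'), ('index', 'ring')]
```

In the modified condition, a muscle-homology account predicts an advantage for the parallel pattern (which pairs the two middle fingers), whereas participants in fact still preferred the spatially symmetrical one.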

An advantage of certain spatial coordination patterns has been demonstrated for non-homologous limbs, too. For example, participants performed more successfully when they had to move a hand and a foot up and down in synchrony, than when they had to move the two effectors asynchronously, that is, one up and the other down (Baldissera, Cavallari, & Civaschi, 1982). This effect was independent of whether the hand was turned upward or downward. Such coordinative preferences cannot be due to muscle homology, given that they involve different kinds of limbs. Their existence has, therefore, been interpreted as evidence for movement coordination being entirely organized according to perceptual factors (Mechsner, 2004), just like the finger coordination results by Mechsner and colleagues. In contrast, others have suggested that movement coordination is subject to several different types of constraints, among them both anatomical and perceptual factors (Amazeen et al., 2008, Riek and Woolley, 2005, Swinnen et al., 1998, Temprado et al., 2003).

However, it remains unspecified which types of perceptual information might be at the heart of the external spatial biases that have been observed. The identification of an impact of external coordinates on coordination does not in itself reveal the cognitive functions which underlie such organizational principles as mirror symmetry, nor the perceptual systems which establish them. Intuitively, our perception of space is tightly linked with our visual sense: for example, the description of symmetry in terms of movement direction usually coincides with a visual description of the respective movements. Moreover, vision is intricately linked to movement planning and execution. For example, it is known that movements that are aimed directly at a visual target can be executed extremely fast and, possibly, bypass cortical control (Day and Lyon, 2000, Pruszynski et al., 2010). Furthermore, movements are regularly corrected online based on visual input with seemingly little effort during execution (Day & Reynolds, 2005), and movement trajectories are adjusted such that they appear visually approximately straight (Wolpert, Ghahramani, & Jordan, 1995).

External coordinates may, however, be derived from other sensory systems, such as proprioception, the vestibular system, and even audition. In the bimanual tapping paradigm, emphasizing visual symmetry or parallelism by visually marking the respective fingers did not alter the advantage of mirror-symmetrical movements independent of whether homologous or non-homologous fingers were tapped (Mechsner & Knoblich, 2004). Moreover, occluding vision of the hands did not change the tapping pattern (Mechsner et al., 2001). Yet such independence of immediate visual information does not exclude that symmetry is defined visually. This can be convincingly illustrated with a different perceptual process, namely, tactile localization. There is ample evidence that touch is recoded from skin-based (that is, anatomically-based) into external coordinates. Such recoding is demonstrable by localization impairments induced by body postures which lead to incongruence between skin coordinates and external coordinates, as, for example, during hand crossing (Heed and Azañón, 2014, Shore et al., 2002, Yamamoto and Kitazawa, 2001). Recoding appears to be initiated automatically in sighted humans, as crossing effects are observed even in tasks which do not require any external coding for stimulus and response (Azañón, Camacho, & Soto-Faraco, 2010). In congenitally blind adults, tactile localization is unaffected by posture, suggesting that the lack of the visual system impedes the establishment of automatic external recoding of touch (Röder et al., 2008, Röder et al., 2004). Crucially, performance of participants who have become blind later in life is affected by posture just like that in the sighted, even after decades of blindness. Thus, in late blind individuals, the visual system seems to have induced the use of visual space during ontogeny, and, once established, this coordinate system is used even if visual information is no longer available (Röder et al., 2007, Röder et al., 2004).

In analogy to these findings, we ask here whether the perceptual characteristics that constrain bimanual coordination, as for example symmetry towards the body midline, depend on availability of visual input during ontogenetic development. To this end, we tested whether congenitally blind participants coordinate bimanual movements using externally-based spatial principles like the sighted, or whether their performance relies, instead, on a muscle-based reference frame. If symmetry were determined by muscle homology in congenitally blind individuals, this would suggest that the external symmetry in the sighted develops based on visual input during ontogeny. In contrast, if blind individuals showed externally coded symmetry, this would suggest that the perceptual code that determines movement preferences does not crucially depend on developmental vision.

Section snippets

Experiment 1: Finger tapping

Experiment 1 was the finger tapping task used by Mechsner et al. (2001). In this task, different fingers must be tapped in synchrony. The tapping patterns are arranged to be either symmetrical or parallel. The crucial manipulation is that different fingers are used in different conditions. In some conditions, homologous fingers are tapped together under symmetrical instructions. In other conditions, homologous fingers are instead tapped together under parallel instructions. If tapping is most

Experiment 2: Finger oscillation

Experiment 2 was the finger oscillation task used by Mechsner et al. (2001). In this task, the index fingers must be moved in the left–right direction (that is, “oscillated”) in synchrony. Oscillations are to be either symmetrical or parallel. The crucial manipulation is that the hands are held in different postures – palm up and palm down – in different conditions. When the two hands have the same posture, symmetrical oscillations require the use of identical muscles. In contrast, when the two

General discussion

We tested whether the tendency towards spatially defined mirror symmetry in bimanual coordination depends on developmental vision. Sighted and congenitally blind participants performed a bimanual tapping task with different sets of fingers on each hand, and a finger oscillation task (finger adduction and abduction) during which hand posture was varied. The performance patterns of the blind participants in both experiments suggest that they were strongly biased towards using an external code for

Acknowledgements

We thank Franziska H. Rudzik, Jonathan T. W. Schubert, and Honsum Lam for acquiring data. This work was funded by the German Research Collaborative SFB 936 “Multi-site communication in the brain”, Projects B1 and B2. TH is funded by the Emmy Noether program of the German Research Foundation (HE 6368/1-1).

References (58)

  • J.A.S. Kelso et al. Nonequilibrium phase transitions in coordinated biological motion: Critical fluctuations. Physics Letters A (1986)
  • C. Klaes et al. Choosing goals, not rules: Deciding among rule-based action plans. Neuron (2011)
  • A. Pouget et al. Multisensory spatial representations in eye-centered coordinates for reaching. Cognition (2002)
  • S. Riek et al. Hierarchical organisation of neuro-anatomical constraints in interlimb coordination. Human Movement Science (2005)
  • G. Rizzolatti et al. Reorienting attention across the horizontal and vertical meridians: Evidence in favor of a premotor theory of attention. Neuropsychologia (1987)
  • B. Röder et al. Early vision impairs tactile perception in the blind. Current Biology (2004)
  • A. Schindler et al. Parietal cortex codes for egocentric space beyond the field of view. Current Biology (2013)
  • D.I. Shore et al. Confusing the mind by crossing the hands. Cognitive Brain Research (2002)
  • S.P. Swinnen et al. Exploring interlimb constraints during bimanual graphic performance: Effects of muscle grouping and direction. Behavioural Brain Research (1998)
  • J. Temprado et al. Interaction of directional, neuromuscular and egocentric constraints on the stability of preferred bimanual coordination patterns. Human Movement Science (2003)
  • E.L. Amazeen et al. Visual–spatial and anatomical constraints interact in a bimanual coordination task with transformed visual feedback. Experimental Brain Research (2008)
  • E. Azañón et al. Tactile remapping beyond space. European Journal of Neuroscience (2010)
  • S. Badde et al. Multiple spatial representations determine touch localization on the fingers. Journal of Experimental Psychology: Human Perception and Performance (2013)
  • D.J. Barr. Random effects structure for testing interactions in linear mixed-effects models. Frontiers in Psychology (2013)
  • D. Bates, M. Maechler, & B. Bolker. lme4: Linear mixed-effects models using S4 classes. (2013)
  • A.P. Batista et al. Reach plans in eye-centered coordinates. Science (1999)
  • R.G. Carson et al. The influence of joint position on the dynamics of perception–action coupling. Experimental Brain Research (1998)
  • L. Cohen. Synchronous bimanual movements performed by homologous and non-homologous muscles. Perceptual and Motor Skills (1971)
  • T. Collins et al. Eye-movement-driven changes in the perception of auditory space. Attention, Perception, and Psychophysics (2010)
Cited by (14)

    • The shared numerical representation for action and perception develops independently from vision

      2020, Cortex
      Citation Excerpt :

      While the present data may first seem at odds with those results, they are well in line with more recent studies showing that blind people are able to use external coordinates when the task involves action (in contrast to mere perception) (Crollen et al., 2017; Crollen, Spruyt, Mahau, Bottini, & Collignon, 2019; Heed, Buchholz, Engel, & Röder, 2015; Heed & Röder, 2014). Bimanual coordination in the congenitally blind is for example constrained by external-spatial factors like in the sighted (Heed & Röder, 2014). External coordinates may similarly affect tactile localization in congenitally blind in the context of an action that requires external spatial coding (Heed et al., 2015).

    • Integrating multisensory information across external and motor-based frames of reference

      2018, Cognition
      Citation Excerpt :

      Why would information from an external frame of reference be more strongly weighted than information from a motor-based frame of reference (i.e. we > wm), regardless of motor outflow or effort? Prior studies have shown that information about movements in external space, instead of motor-based information centered on muscles or joints, dominates movement coordination (Brandes et al., 2016; Heed & Röder, 2014; Mechsner et al., 2001). For example, in a bimanual finger oscillation task, movements are typically more coordinated when the movements are symmetric (the index fingers simultaneously move towards or away from the body midline) versus parallel (the index fingers move simultaneously in the same direction in external space).

    • Motor priming by movement observation with contralateral concurrent action execution

      2018, Human Movement Science
      Citation Excerpt :

      One can argue that the compatibility effect observed in the contralateral condition was simply caused by the nature of bilateral motor control. Previous studies have reported that participants show greater precision in rhythmic movements that are mirror symmetrical with respect to the body midline than non-symmetrical (parallel) movements, regardless of anatomical posture (Heed & Röder, 2014; Mechsner, Kerzel, Knoblich, & Prinz, 2001). Although their tasks are largely different from ours, it is possible that difficulties of motor control in non-symmetrical movements delayed right hand responses and yielded the positive compatibility effect that supports our hypothesis.

    • Oscillatory activity reflects differential use of spatial reference frames by sighted and blind individuals in tactile attention

      2015, NeuroImage
      Citation Excerpt :

      The apparent lack of the use of external coordinates during tactile attentional orienting in congenitally blind humans corroborates previous evidence suggesting that the absence of vision from birth significantly changes tactile spatial processing (Röder et al., 2004, 2008). Although congenitally blind individuals can make use of an external reference frame when task instructions suggest or require its use (Eardley and van Velzen, 2011; Heed and Röder, 2014; Röder et al., 2007), they appear to rely on an anatomical reference frame otherwise, as in the current study. The neural structures thought to generate oscillatory alpha-band activity (Lopes da Silva et al., 1973, 1980; Lőrincz et al., 2009), including the visual thalamus as well as the lower layers of the visual cortex, have been found to be atrophied in congenitally blind individuals (Ptito et al., 2008; Shimony et al., 2006).

    • Space and time in the sighted and blind

      2015, Cognition
      Citation Excerpt :

      Our data are in accord with data from perceptual-motor tasks in which early blind participants have been found, under some circumstances, to adopt an external FoR like their sighted counterparts. For instance, both sighted and blind people rely on an external FoR for coordinating bimanual movements such as finger tapping and finger oscillation (Heed & Röder, 2014). Additionally, it has been shown that sighted people, too, tend to rely on different FoRs to estimate the location of a tactile stimulus depending on the salience of a particular frame of reference (external or anatomical) in a secondary task (Badde, Röder, & Heed, 2014).
