
Gestures Offer Insight

Hand and arm movements do much more than accent words; they provide context for understanding

Our body movements always convey something about us to other people. The body "speaks" whether we are sitting or standing, talking or just listening. On a blind date, how the two individuals position themselves tells a great deal about how the evening will unfold: Is she leaning in to him or away? Is his smile genuine or forced?

The same is true of gestures. Almost always involuntary, they tip us off to love, hate, humility and deceit. Yet for years, scientists spent surprisingly little time studying them, because the researchers presumed that hand and arm movements were mere by-products of verbal communication. That view changed during the 1990s, in part because of the influential work of psycholinguist David McNeill at the University of Chicago. For him, gestures are "windows into thought processes." McNeill's work, along with numerous studies since then, has shown that the body can underscore, undermine or even contradict what a person says. Experts increasingly agree that gestures and speech spring from a common cognitive process to become inextricably interwoven. Understanding that relationship is crucial to understanding how people communicate overall.

The Visual Information Channel
Most of us would find it difficult and uncomfortable to converse for any extended period without using our hands and arms. Gestures play a role whenever we attempt to explain something. At the very least, such motions are co-verbal; they accompany our speech, conveying information that is hard to get across with words. Hand movements can display complex spatial relations, directions, the shape of objects. They enable us to draw maps in the air that tell a puzzled motorist how to reach the turnpike. People who do not gesture rob themselves and their listeners of an important informational channel.


Neurological findings on individuals with communication disorders also demonstrate a fundamental connection between speech and gestures. Brain damage that leads to the loss of mobility in limbs can compromise verbal communication. Patients with aphasia--who have lost the ability to speak or to understand speech--also find it difficult to gesture or to understand signs made by others. These cases and others suggest that gestures are controlled by the very brain regions responsible for speech.

The interpretations of sounds and movements are closely related for the listener as well. For years, the link could be demonstrated only indirectly by asking test subjects what information they gleaned from others who were speaking and gesticulating. Recent brain research has provided much better insight. For example, neuroscientist Spencer D. Kelly of Colgate University has studied gestures with the help of event-related potentials--characteristic brain waves consisting of a sequence of peaks and valleys--that occur in certain patterns when one person observes another communicating. The patterns reveal neuronal-processing steps in particular brain regions. One of the negative peaks (a valley), referred to as N400, is especially significant. It occurs when we stumble over an inappropriate and unexpected word, for example, when we hear a sentence like "He spread his toast with socks."

Kelly hooked test subjects to an electroencephalograph and charted their event-related potentials while they watched a video. In it, an actor spoke while using gestures to indicate characteristics of an object. A hand movement might fit a word semantically, such as when the word "tall" was illustrated by gesturing at a long-stemmed glass on a table. A gesture might also be used to convey additional information, such as when "tall" was accompanied by finger movements that indicated the thinness of the elongated stem of the glass. Viewers saw contradictory scenes, too, in which an actor combined the word "tall" with a gesture that referred to a short object on the table. And sometimes an actor made no gesture at all; in this control situation, the test subjects heard only the spoken word.

Subjects exhibited substantially different brain-wave patterns depending on the situation. The researchers found strong negative peaks--a so-called N400 effect--whenever speech and gesture contradicted one another. They interpreted this phenomenon to mean that gestures and words are in fact processed together: observers factor the meaning of a gesture into their interpretation of a word.

This conclusion was supported by the finding that the event-related potentials exhibited no comparable negativity in the control situation. Even during early processing, the curves differ depending on whether the hand movement fits the word, complements it or contradicts it. "The semantic content" of hand gestures, Kelly says, "contributes to the processing of word meaning in the brain."

Which Came First?
Despite intriguing progress, scientists from various disciplines can still only guess at the origins of the close coupling between gestures and speech. Because primates possess a particularly rich repertoire of gestures--young chimpanzees, for example, typically hold out an open hand when begging from their mothers--it may be that gestures preceded speech in humans. Other researchers advance the notion that "vocal gestures"--simple sounds that could be used as units of meaning, much like hand movements or grimaces--arose first in humans.

Observing young children can provide clues to the common development of oral and visual communication. Up to the age of nine to 12 months, babies reach out with all the fingers of their open hand for whatever object they want--similar to the chimpanzee begging for food. A neuronal maturational shift occurs at about 10 or 11 months in girls, somewhat later in boys: babies begin to point with one finger rather than all the fingers. The effort to get hold of an object is transformed into directed pointing, usually to get the attention of a caregiver. The pointing also usually accompanies a baby's initial attempts at verbal symbolization ("da," "wawa"), even though the early attempts frequently fail. A more nuanced gesturing vocabulary begins to develop as fine-motor finger control improves, between nine and 14 months, yet the spoken word continues to lag behind.

Tabling a Topic with One Hand
Synchronized word-gesture combinations begin to be seen in parallel with the child's developing word usage at 16 to 18 months, ultimately leading to children and adults who "embody" with their hands and arms the shape of an object, how people in a group exercise are positioned relative to one another in space, even abstract and metaphorical thoughts. Put your two palms together, rest them beside your right ear, close your eyes, and lean your head to the side--most people will understand that posture as a symbol for "sleep."

Regardless of whether speech or gestures came first in evolution or which of them develops first in babies, humans have come to rely on many varieties of co-verbal gestures. McNeill, in his influential 1992 book Hand and Mind: What Gestures Reveal about Thought, subdivided co-verbal gestures into four basic types: deictic, iconic, metaphorical and beats.

Deictic (pointing) gestures often accompany words such as "here," "there" or "this" and also "I" and "you." In each case, the speaker points to something concrete (this table) or to something figurative ("in this case"). When people say "I," they often point toward themselves with a slightly open hand. And even when a person points toward herself without saying "I," we generally assume that she is talking about herself.

Iconic gestures express images. These movements may relate to something spatial but also to an event, as when someone says, "Susie chased the cat with an umbrella," while poking about with an imaginary umbrella. Such gestures may provide additional information by depicting more precisely just how the poor animal was chased away--with a stabbing or swatting motion--and whether the cat ran off to the right or left.

Metaphorical gestures look similar to iconic ones but generally relate to abstractions. When we say, "The next topic...," we may define an invisible object with a half-opened hand. The abstraction then seems to move into the real world, where it becomes tangible. But if we then say, "...will be tabled for the time being," we may flatten the hand and place it down onto an imaginary table or even metaphorically sweep the object off the table. Iconic and metaphorical gestures may become just as conventional in their meanings as words; a hand wiping imaginary sweat from one's brow means, "That was a close one!"

Finally, we are familiar with beats from political speeches; the punctuated arm or hand movements that are closely linked with the rhythm of the speech give emphasis to what is being said and apparent power to an argument, regardless of its actual content.

These conventionalized gestures can work without our having to say anything. But McNeill is particularly interested in the connection between spontaneous gestures and the spoken word. That both might stem from the same thought was hypothesized in the 1980s by Adam Kendon, a cognitive scientist and founder of gesture research who now divides his time between the University of Pennsylvania and the University of Naples L'Orientale in Italy. He observed that the so-called gesture stroke of a co-verbal hand sign--the actual conveyor of meaning, such as mopping one's brow--is enacted shortly before, or at the latest when, its verbal affiliate is enunciated. Whether a cat is being chased away with an umbrella, or a subject is being tabled, the listener is given visual information just before the verbal information.

According to McNeill's theory, the process of speech production and the process of gesture production have a common mental source in which a mixture of preverbal symbols and mental images form the point of origin for the thought that is to be expressed. This growth point, as McNeill calls it, represents a kind of seed out of which words and gestures develop.

Think First, Gesture Later
McNeill also points out that the various language families differ in how they distribute components of meaning between speech and gesture--at least when referring to directional kinds of information. In Romance languages such as Spanish, the gesture stroke is more likely to be coupled with the verb, that is, with "climbs" in "he climbs the ladder" (accompanied by the speaker's hand moving upward). In Germanic languages such as German and English, the same gesture stroke is more likely to be used to indicate the locus of action: "he climbs up the ladder" (accompanied by an upward hand thrust).

The languages clearly differ in how information about paths is conveyed, McNeill says. His former doctoral student, Gale Stam, now at National-Louis University in Chicago, uses this finding to determine whether a Spanish speaker who is learning English is beginning to think in English. If his gesture stroke continues to fall on the verb "climb" while speaking English, he is probably still thinking in Spanish and thus is purely translating. If the gesture stroke spontaneously falls on the preposition "up," she assumes that the transition to thinking in English has occurred.

The growing appreciation among scientists for the tight interweave between speech, thought and gesture is giving rise to theories about how the brain creates and coordinates these functions. One influential new model comes from psychologist Willem Levelt of the Max Planck Institute for Psycholinguistics in Nijmegen, the Netherlands. According to Levelt, the brain produces a verbal utterance in three stages. First the brain conceptualizes an intended message as purely preverbal information--as a concept that is not yet formulated linguistically. In the second stage, the brain finds words for this concept and constructs sentences--again, a purely internal process. Only in the third stage do the organs of articulation come into play, producing the desired utterance via the lungs and vocal cords.

One of Levelt's students, Jan-Peter de Ruiter, has incorporated gestures into this model. He assumes that the initial conceptualization stage also encompasses a visual precursor for gestures. According to de Ruiter, the brain creates gestural sketches. In the second stage, the sketch is transformed into a gestural plan--a set of movement instructions--that leads to muscle motor programs in the third stage. These programs tell our arms and hands how to move.

This model helps us to understand why gestures may precede the speech they are meant to accompany. The words first have to be assembled into a grammatically sensible expression, whereas the motion is conveyed by standard motor instructions. In a sentence such as "Susie chased the cat with an umbrella," the brain needs time to construct the proper word sequence, which takes longer than issuing the simple motor instruction for "sweep the right arm."

De Ruiter is examining in greater detail the presumed interaction between speech and gesture for pointing motions. He has recorded dialogues between two people telling each other stories and has found that an extended gesture--such as when someone points up toward the sky--tends to delay the verbalization to which it refers ("the plane ascended at a steep angle"). Gestures also adapt to speech; when a storyteller has misspoken and stumbles momentarily, a prepared gesture appears to be held in abeyance until the speech component is running smoothly again.

These kinds of insights show that understanding how the body communicates is crucial to understanding verbal communication. Spoken words are not the only way humans convey meaning. As professional orators have known for centuries, a well-placed gesture can be the most effective way to make a point hit home. The more we learn about how the body communicates, the better we will become as communicators and observers.

(Further Reading)

  • Neural Correlates of Bimodal Speech and Gesture Comprehension. Spencer D. Kelly, Corinne Kravitz and Michael Hopkins in Brain and Language, Vol. 89, pages 253–260; 2004.

  • Gesture and Thought. David McNeill. University of Chicago Press, 2005.

 


This article was originally published with the title "Gestures Offer Insight" in SA Mind Vol. 17 No. 5, p. 20.
doi:10.1038/scientificamericanmind1006-20