"Slight" of hand: the processing of visually degraded gestures with speech

PLoS One. 2012;7(8):e42620. doi: 10.1371/journal.pone.0042620. Epub 2012 Aug 9.

Abstract

Co-speech hand gestures influence language comprehension. The present experiment explored which part of the visual processing system is optimized for processing these gestures. Participants viewed short video clips of speech and gestures (e.g., a person saying "chop" or "twist" while making a chopping gesture) and had to determine whether the two modalities were congruent or incongruent. Gesture videos were designed to stimulate the parvocellular or magnocellular visual pathways by filtering out low or high spatial frequencies (HSF versus LSF) at two levels of degradation severity (moderate and severe). Participants were less accurate and slower at processing gesture and speech at severe versus moderate levels of degradation. In addition, they were slower for LSF versus HSF stimuli, and this difference was most pronounced in the severely degraded condition. However, exploratory item analyses showed that the HSF advantage was modulated by the range of motion and amount of motion energy in each video. The results suggest that hand gestures exploit a wide range of spatial frequencies, and that, depending on which frequencies carry the most motion energy, the parvocellular or magnocellular visual pathway is engaged to extract meaning quickly and optimally.

MeSH terms

  • Female
  • Gestures*
  • Humans
  • Male
  • Movement
  • Photic Stimulation
  • Range of Motion, Articular
  • Reaction Time
  • Semantics
  • Speech*
  • Videotape Recording
  • Visual Perception*

Grants and funding

These authors have no support or funding to report.