Coordination of gaze and action during high-speed steering and obstacle avoidance

PLoS One. 2024 Mar 8;19(3):e0289855. doi: 10.1371/journal.pone.0289855. eCollection 2024.

Abstract

When humans navigate through complex environments, they coordinate gaze and steering to sample the visual information needed to guide movement. Gaze and steering behavior have been extensively studied in the context of automobile driving along a winding road, leading to accounts of movement along well-defined paths over flat, obstacle-free surfaces. However, humans are also capable of visually guiding self-motion in environments that are cluttered with obstacles and lack an explicit path. An extreme example of such behavior occurs during first-person-view drone racing, in which pilots maneuver at high speeds through a dense forest. In this study, we explored the gaze and steering behavior of skilled drone pilots. Subjects guided a simulated quadcopter along a racecourse embedded within a custom-designed, forest-like virtual environment. The environment was viewed through a head-mounted display equipped with an eye tracker to record gaze behavior. In two experiments, subjects performed the task in multiple conditions that varied the presence of obstacles (trees), waypoints (hoops to fly through), and a path to follow. Subjects often looked in the general direction of things that they wanted to steer toward, but gaze fell on nearby objects and surfaces more often than on the actual path or hoops. Nevertheless, subjects performed the task successfully, steering at high speeds while remaining on the path, passing through hoops, and avoiding collisions. In conditions that contained hoops, subjects adapted their approach to the most immediate hoop in anticipation of the position of the subsequent hoop. Taken together, these findings challenge existing models of steering that assume that steering is tightly coupled to where actors look. We consider the study's broader implications as well as its limitations, including the focus on a small sample of highly skilled subjects and the inherent noise in the measurement of gaze direction.
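For readers unfamiliar with the class of models the abstract refers to, the gaze-steering coupling is often formalized in the driving literature as a proportional control law: the actor's turn rate tracks the visual angle between the current heading and the point of gaze. The sketch below is a minimal, hypothetical illustration of that idea only; it is not the authors' model, and the function name, gain k, and coordinates are assumptions introduced for exposition.

```python
import math

# Illustrative "steer toward where you look" control law, assuming a planar
# agent. Turn rate is proportional to the angular error between the agent's
# heading and the bearing of its gaze point. The gain k is arbitrary here.

def steering_rate(heading, agent_xy, gaze_xy, k=2.0):
    """Return a turn rate (rad/s) that rotates heading toward the gaze point."""
    gx = gaze_xy[0] - agent_xy[0]
    gy = gaze_xy[1] - agent_xy[1]
    gaze_bearing = math.atan2(gy, gx)  # direction from agent to gazed-at point
    # Wrap the heading error to [-pi, pi] so the agent turns the short way.
    error = math.atan2(math.sin(gaze_bearing - heading),
                       math.cos(gaze_bearing - heading))
    return k * error  # tight coupling: steering is slaved to gaze direction

# Example: agent heading along +x (0 rad), gaze on a point up and to the right.
print(steering_rate(heading=0.0, agent_xy=(0.0, 0.0), gaze_xy=(10.0, 5.0)))
```

Under a law like this, gaze and steering are locked together by construction; the paper's observation that gaze frequently fell on nearby objects and surfaces rather than on the path or hoops is what puts pressure on models of this form.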

MeSH terms

  • Automobile Driving*
  • Fixation, Ocular
  • Humans
  • Motion
  • Movement*
  • Psychomotor Performance

Grants and funding

This material is based upon work supported by the National Science Foundation (https://www.nsf.gov/) under Grant No. 2218220 to BRF and by the Office of Naval Research (https://www.nre.navy.mil/) under Grant No. N00014-18-1-2283 to BRF. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation or the Office of Naval Research. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.