Active vision shapes and coordinates flight motor responses in flies
Animals use active sensing to respond to sensory inputs and guide future motor decisions. In flight, flies generate a pattern of head and body movements to stabilize gaze. How the brain distributes visual information to control head and body movements, and how active head movements influence downstream motor control, remain elusive. Using a control-theoretic framework, we studied the optomotor gaze stabilization reflex in virtual reality and quantified how head movements track visual motion and shape wing steering efforts in Drosophila. By shaping visual inputs, head movements increased the gain of wing steering responses and the coordination between stimulus and wings, suggesting a synergy between head and wing movements. Following stimulus onset, the head responded in as little as 10 ms, a latency comparable to the primate vestibulo-ocular reflex, whereas wing steering responses lagged by more than 40 ms. This timing difference suggests a temporal order in the flow of visual information, such that the head filters the visual information that elicits downstream wing steering responses. Head fixation significantly decreased flight mechanical power by reducing both wingbeat frequency and overall thrust. By simulating an Elementary Motion Detector (EMD) array, we show that head movements reduce the effective dynamic range of the visual input, mapping it onto the sensitivity optimum of the motion vision pathway, and yield motor responses more in phase with the stimulus than in a head-fixed simulation. Taken together, our results reveal a synergy between active visual sensing and flight motor responses in Drosophila. Our work provides a framework for understanding how to coordinate moving sensors on a moving body.
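The Elementary Motion Detector mentioned above is conventionally modeled as a Hassenstein-Reichardt correlator: the signal from one photoreceptor is delayed (here by a first-order low-pass filter) and multiplied with the undelayed signal from its neighbor, and the mirror-symmetric product is subtracted. The sketch below is a minimal, generic illustration of that correlator, not the specific EMD array used in the paper; the function name, the time constant, and the sampling parameters are illustrative assumptions.

```python
import numpy as np

def hassenstein_reichardt_emd(signal_a, signal_b, dt, tau):
    """Minimal Hassenstein-Reichardt correlator (illustrative sketch).

    signal_a, signal_b : luminance time series from two neighboring
                         photoreceptors (1-D arrays, same length)
    dt  : sample step in seconds
    tau : time constant of the first-order low-pass "delay" filter, seconds

    Returns the detector output; a positive mean indicates motion
    in the direction from receptor a toward receptor b.
    """
    # First-order low-pass filter serves as the delay line of each arm.
    alpha = dt / (tau + dt)
    delayed_a = np.zeros_like(signal_a, dtype=float)
    delayed_b = np.zeros_like(signal_b, dtype=float)
    for t in range(1, len(signal_a)):
        delayed_a[t] = delayed_a[t - 1] + alpha * (signal_a[t] - delayed_a[t - 1])
        delayed_b[t] = delayed_b[t - 1] + alpha * (signal_b[t] - delayed_b[t - 1])
    # Correlate each delayed arm with the opposite undelayed input,
    # then take the opponent difference of the two half-detectors.
    return delayed_a * signal_b - delayed_b * signal_a
```

Sampling a moving sinusoidal grating at two nearby points gives phase-shifted inputs; the sign of the mean output then follows the direction of motion, which is the basic direction-selective property the abstract's EMD simulation relies on.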
License: CC BY 4.0 (Attribution)
September 1, 2020
March 25, 2020