Hemicordulia tau dragonfly by JJ Harrison

Recent news is abuzz with that latest of fascinating technologies, the driverless car. Designing a navigation system that can scan the environment, process surrounding traffic, and execute maneuvers in real time to avoid collisions is no small task! A new paper published in the Journal of the Royal Society Interface by a research team from the University of Adelaide offers valuable insight into how animals handle the challenges of complex locomotion by studying one of nature's most keen-eyed hunters, Australia's native Hemicordulia tau dragonfly.

Dragonfly vision and locomotion have inspired cutting-edge technology in recent years, informing everything from updated drone designs to compound camera lenses. Beyond mechanical inspiration, however, lies the evolving frontier of computational biology, a field that includes the study of sensing, signal processing, and the other means by which the living world interacts with its environment. Just how does a dragonfly see, and how does that information translate into fast reflexes and a prey capture rate of over 95%?

The study, led by PhD student Zahra Bagheri, set out to answer this question by looking specifically at an array of small target motion detector (STMD) neurons in the third optic neuropil, a visual processing area of the dragonfly's brain. These neurons respond selectively to small targets and are remarkably robust at discriminating them by contrast, velocity, and other features. After observing the neurons during visual stimulation, the group created a computer model with three parts: an immersive environment of complex outdoor scenes, a moving target object, and an algorithm that mimics the information processing of a dragonfly in pursuit. Remarkably, the model can filter out visual "noise" from a busy, changing background and stay locked on the target despite split-second fluctuations in its speed and trajectory.
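To make the idea concrete, here is a minimal, hypothetical sketch (in Python with NumPy and SciPy) of the kind of size-selective motion filtering such a model might perform; the function names, parameters, and center-surround approach are illustrative assumptions, not the team's published algorithm.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def small_target_response(prev_frame, curr_frame, target_size=3, surround_size=15):
    """Toy small-target motion detector (hypothetical illustration).

    Highlights small moving objects against a cluttered background by
    combining frame-to-frame change with a center-surround size filter,
    loosely analogous to the size selectivity of STMD neurons.
    """
    # Motion signal: absolute luminance change between consecutive frames
    motion = np.abs(curr_frame.astype(float) - prev_frame.astype(float))

    # Center-surround antagonism: respond strongly only where motion is
    # concentrated in a small patch (a candidate target) rather than
    # spread across the scene (background clutter such as swaying foliage)
    center = uniform_filter(motion, size=target_size)
    surround = uniform_filter(motion, size=surround_size)
    return np.clip(center - surround, 0.0, None)

def locate_target(response):
    """Return the (row, col) pixel with the strongest small-target response."""
    return np.unravel_index(np.argmax(response), response.shape)
```

Subtracting a broad surround average from a narrow center average suppresses wide-field background motion while preserving responses to small, isolated movers, which captures the spirit of how a clutter-rejecting target detector can stay on a tiny prey item against a busy scene.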

The researchers found that, instead of focusing narrowly on the target, the dragonfly employs an "active vision" system that takes in the background and compares the target's movement against that wider field. The authors also note the insect's ability to keep its eyes locked forward while in flight, compensating for body motion by rotating its head through twists and turns, which stabilizes its gaze and keeps the target at the center of its visual field. These findings produced a simulation nearly twenty times faster at target pursuit than competing models, and the group is working to refine both the algorithm and the software required to mount such a system on a working vehicle.
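The gaze-stabilization idea can also be sketched in a few lines: a simple proportional controller that rotates a simulated head toward the target's bearing each time step, keeping the target near the center of view while the body maneuvers. This is a hedged illustration rather than the published model; the angles, gain, and update rule are assumed for the example.

```python
import numpy as np

def update_gaze(gaze_angles, target_angles, gain=0.8):
    """Toy gaze-stabilization step (hypothetical illustration).

    Rotates the simulated 'head' a fraction of the way toward the
    target's bearing each time step, so the target stays near the
    center of the visual field while the body twists and turns.
    """
    gaze = np.asarray(gaze_angles, dtype=float)
    target = np.asarray(target_angles, dtype=float)
    return gaze + gain * (target - gaze)

# Example: the target drifts off-center; the gaze catches up within a few steps
gaze = np.array([0.0, 0.0])        # (yaw, pitch) of the head, in degrees
target = np.array([10.0, -4.0])    # target bearing relative to the body
for _ in range(5):
    gaze = update_gaze(gaze, target)
print(gaze)  # approaches [10, -4], i.e. the target is re-centered
```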

Beyond artificial sensing, this research has implications for the study of the brain and motor control, and may someday contribute to medical advances in treating conditions such as blindness in humans.

Image: JJ Harrison/Wikimedia
