STEP2DYNA researcher Jiannan Zhao of the University of Lincoln recently published a paper titled “Enhancing LGMD’s Looming Selectivity for UAV with Spatial-temporal Distributed Presynaptic Connections” in IEEE Transactions on Neural Networks and Learning Systems. The journal is a top-tier venue for technical articles on the theory, design, and applications of neural networks and related learning systems, and is highly influential in the field.
Research Summary
Collision detection is one of the most challenging tasks for Unmanned Aerial Vehicles (UAVs). This is especially true for small or micro UAVs, due to their limited computational power. In nature, flying insects with compact and simple visual systems demonstrate a remarkable ability to navigate and avoid collisions in complex environments. Locusts are a good example: they avoid collisions within dense swarms through the activity of a motion-sensitive visual neuron called the Lobula Giant Movement Detector (LGMD). The defining feature of the LGMD neuron is its preference for looming. As a flying insect’s visual neuron, the LGMD is considered an ideal basis for building UAV collision detection systems. However, existing LGMD models cannot clearly distinguish looming from other visual cues, such as the complex background movement caused by agile UAV flight. To address this issue, this research proposes a new model with distributed spatial-temporal synaptic interactions, inspired by recent findings on locust synaptic morphology. We first introduce locally distributed excitation to enhance the excitation caused by visual motion at preferred velocities. Then, radially extending temporal latency for inhibition is incorporated to compete with the distributed excitation and selectively suppress non-preferred visual motion. Through these distributed synaptic interactions, the spatial-temporal competition between excitation and inhibition tunes the model to the preferred image angular velocities that represent looming rather than background movement. A series of experiments systematically analysed the proposed model during agile UAV flight. The results demonstrate that the new model considerably enhances looming selectivity in complex flying scenes and has the potential to be implemented in embedded collision detection systems for small or micro UAVs.
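The following is a minimal sketch, not the authors’ released code, of the core idea described above: excitation driven by frame-to-frame luminance change is spread locally, while inhibition arrives with a latency that grows with radial distance, so only motion near a preferred image angular velocity survives the linear excitation/inhibition competition. The kernel shapes, delays, and weights are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

def lgmd_response(frames, w_i=0.6):
    """Return a per-frame membrane potential for a sequence of grayscale frames."""
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0))          # photoreceptor layer: luminance change

    # Locally distributed excitation: spread each change signal to its neighbourhood.
    exc_kernel = np.array([[0.05, 0.1, 0.05],
                           [0.1,  0.4, 0.1 ],
                           [0.05, 0.1, 0.05]])

    # Inhibition kernels indexed by temporal delay; farther surround -> longer latency.
    inh_kernels = {
        1: np.array([[0.0,  0.25, 0.0 ],
                     [0.25, 0.0,  0.25],
                     [0.0,  0.25, 0.0 ]]),           # nearest surround, 1-frame delay
        2: np.array([[0.125, 0.0, 0.125],
                     [0.0,   0.0, 0.0  ],
                     [0.125, 0.0, 0.125]]),          # diagonal surround, 2-frame delay
    }

    potentials = []
    for t in range(len(diffs)):
        excitation = convolve(diffs[t], exc_kernel, mode="constant")
        inhibition = np.zeros_like(excitation)
        for delay, kernel in inh_kernels.items():
            if t - delay >= 0:                        # radially extending, delayed inhibition
                inhibition += convolve(diffs[t - delay], kernel, mode="constant")
        # Linear competition: summation cells subtract weighted inhibition from excitation.
        s_layer = np.maximum(excitation - w_i * inhibition, 0.0)
        potentials.append(s_layer.sum())
    return np.array(potentials)
```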
Research Highlights
To overcome whole-field-of-view image motion during agile UAV flight, this research proposes novel synaptic computing strategies that filter visual input by image angular velocity. With the proposed spatial-temporal distributed synaptic interconnections, the LGMD neuron model is able to select looming patterns using only linear synaptic computation. The neural model is depicted in Figure 1, performance on video trials is shown in Figure 2, and on-board UAV experiments are demonstrated in the supplementary videos.
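As a hypothetical usage example building on the lgmd_response sketch above, a synthetic looming sequence (a growing bright square, standing in for an approaching obstacle) can be fed to the model and a collision flag raised when the normalised membrane potential exceeds a threshold; the sequence generator and the threshold value are illustrative assumptions, not the authors’ experimental setup.

```python
import numpy as np

def make_looming_sequence(n_frames=20, size=64):
    """Generate a synthetic looming stimulus: a square that expands frame by frame."""
    frames = np.zeros((n_frames, size, size))
    c = size // 2
    for t in range(n_frames):
        half = 2 + t                                   # edge expands, mimicking approach
        frames[t, c - half:c + half, c - half:c + half] = 1.0
    return frames

frames = make_looming_sequence()
potential = lgmd_response(frames)                      # model sketched above
normalised = potential / (potential.max() + 1e-9)
collision_warning = normalised > 0.8                   # illustrative threshold
print("warning frames:", np.where(collision_warning)[0])
```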
Supplementary Video
Further demonstrations and analyses are provided in the video.