The 2019 International Joint Conference on Neural Networks (IJCNN) was held at the InterContinental Budapest Hotel in Budapest, Hungary, on 14-19 July 2019. The full Program with Abstracts is available online.
This conference was attended by STEP2DYNA researchers from the University of Lincoln, Huatian Wang and Hongxin Wang.
Neural Models of Perception, Cognition and Action
Tuesday, July 16, 5:30PM-7:30PM
Hongxin Wang presented the following:
Hongxin Wang, Jigen Peng, Qinbing Fu, Huatian Wang and Shigang Yue, University of Lincoln, United Kingdom; Guangzhou University, China.
The robust detection of small targets against cluttered backgrounds is important for future artificial visual systems in searching and tracking applications. Insects' visual systems have demonstrated an excellent ability to avoid predators, find prey or identify conspecifics - all of which appear as small dim speckles in the visual field. Building a computational model of the insects' visual pathways could therefore provide effective solutions for detecting small moving targets. Although a few visual system models have been proposed, they make use only of small-field visual features for motion detection, and their detection results often contain a number of false positives. To address this issue, we develop a new visual system model for small target motion detection against cluttered moving backgrounds. Unlike existing models, it extracts small-field and wide-field visual features separately via two motion-sensitive neurons, detecting small target motion and background motion respectively. These two types of motion information are then integrated to filter out false positives. Extensive experiments showed that the proposed model outperforms existing models in terms of detection rates.
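The integration step the abstract describes - using the wide-field (background) motion response to veto spurious small-field detections - can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, the subtractive inhibition, and the parameter values are all illustrative assumptions.

```python
def integrate_motion(small_field, wide_field, inhibition=1.0, threshold=0.3):
    """Combine the two motion channels: the wide-field (background) response
    inhibits the small-field response, so only responses that stand out
    against the background survive thresholding. All parameters are
    illustrative, not taken from the paper."""
    detections = []
    for s, w in zip(small_field, wide_field):
        r = s - inhibition * w  # background motion suppresses the response
        detections.append(1 if r > threshold else 0)
    return detections

# A small target over a mostly static background (index 1) is kept, while a
# strong response driven by background motion (index 2) is filtered out.
small = [0.1, 0.9, 0.8, 0.2]
wide = [0.1, 0.1, 0.9, 0.1]
print(integrate_motion(small, wide))  # -> [0, 1, 0, 0]
```

The point of the sketch is only the gating logic: a small-field-only model would flag indices 1 and 2 alike, whereas the wide-field channel removes the background-driven false positive.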
Plenary Poster Session POS2: Poster Session 2
Thursday, July 18, 10:00AM-11:40AM
Huatian Wang presented the following:
Angular Velocity Estimation of Image Motion Mimicking the Honeybee Tunnel Centring Behaviour [#19326]
Huatian Wang, Qinbing Fu, Hongxin Wang, Jigen Peng, Paul Baxter, Cheng Hu and Shigang Yue, University of Lincoln, United Kingdom; Guangzhou University, China
Insects use visual information to estimate the angular velocity of retinal image motion, which determines a variety of flight behaviours including speed regulation, tunnel centring and visual navigation. In angular velocity estimation, honeybees show strong independence from the spatial structure of visual stimuli, an ability that previous models have not reproduced. To address this issue, we propose a biologically plausible model for estimating image motion velocity, based on behavioural experiments with honeybees flying through patterned tunnels. The proposed model consists of three main parts: a texture estimation layer for spatial information extraction, a delay-and-correlate layer for temporal information extraction, and a decoding layer for angular velocity estimation. The model produces responses that are largely independent of spatial frequency in grating experiments, and it has been implemented in a virtual bee for tunnel centring simulations. The results coincide with both electrophysiological neuron-spike and behavioural path recordings, indicating that the proposed method provides a better explanation of the honeybee's image motion detection mechanism guiding the tunnel centring behaviour.
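The delay-and-correlate layer the abstract names follows the classic Hassenstein-Reichardt scheme: each photoreceptor signal is delayed and correlated with its undelayed neighbour, and the mirror term is subtracted to make the output direction-selective. The sketch below is a minimal stand-in for that layer only (a one-sample delay, two photoreceptors, a drifting sinusoidal grating); it is not the paper's full three-layer model, and all names and constants are illustrative.

```python
import math

def emd_response(signal_a, signal_b):
    """Delay-and-correlate (Hassenstein-Reichardt-style) elementary motion
    detector: correlate the delayed signal of one photoreceptor with the
    undelayed signal of its neighbour, minus the mirror term. The one-sample
    delay is an illustrative simplification."""
    out = []
    for t in range(1, len(signal_a)):
        out.append(signal_a[t - 1] * signal_b[t] - signal_b[t - 1] * signal_a[t])
    return out

# Two photoreceptors viewing a drifting sinusoidal grating; B lags A when
# the grating moves from A towards B, so the correlator's mean output is
# positive for this direction and would flip sign for the reverse motion.
n, phase_lag = 200, 0.3
a = [math.sin(0.2 * t) for t in range(n)]
b = [math.sin(0.2 * t - phase_lag) for t in range(n)]
mean_response = sum(emd_response(a, b)) / (n - 1)
print(round(mean_response, 3))  # -> 0.059
```

A plain correlator of this kind is known to respond to temporal frequency rather than angular velocity, which is exactly the spatial-frequency dependence the paper's texture estimation and decoding layers are introduced to remove.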