A likelihood-based data fusion model for the integration of multiple sensor data: A case study with vision and lidar sensors
Author(s)
Jo, Jun
Tsunoda, Yukito
Stantic, Bela
Liew, Alan Wee-Chung
Year published
2017
Metadata
Abstract
Sensors have been developed and applied in a wide range of fields such as robotics and autonomous vehicle navigation (AVN). Because no single sensor can fully sense its surroundings, multiple sensors with complementary specialties are commonly combined to offset individual shortcomings and enrich perception. However, integrating heterogeneous types of sensory information into useful results is challenging. This research aims to achieve a high degree of accuracy with minimal false-positive and false-negative rates for the sake of reliability and safety. This paper introduces a likelihood-based data fusion model, which integrates information from various sensors, maps it into an integrated data space, and generates a solution that considers all the information from the sensors. Two distinct sensors, an optical camera and a Light Detection and Ranging (Lidar) sensor, were used for the experiment. The experimental results showed the usefulness of the proposed model in comparison with single-sensor outcomes.
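The abstract's fusion idea, combining per-sensor likelihoods over a shared (integrated) data space so that the fused decision uses all sensor evidence, can be illustrated with a minimal sketch. This is not the paper's actual method; it assumes independent sensors, a common grid of cells, and a standard Bayes product rule for fusing likelihoods. All numeric values are illustrative.

```python
import numpy as np

# Illustrative per-cell likelihoods P(measurement | object present),
# one array per sensor, defined over the same integrated data space.
camera_lik = np.array([0.9, 0.6, 0.2, 0.5])
lidar_lik = np.array([0.8, 0.3, 0.1, 0.7])

def fuse_likelihoods(liks, prior=0.5):
    """Fuse per-sensor likelihoods assuming conditional independence.

    liks:  list of arrays of P(z_i | object) for each sensor i.
    prior: P(object) before seeing any measurement.
    Returns the posterior P(object | all measurements) per cell.
    """
    num = prior * np.prod(liks, axis=0)
    den = num + (1.0 - prior) * np.prod([1.0 - l for l in liks], axis=0)
    return num / den

posterior = fuse_likelihoods([camera_lik, lidar_lik])
detections = posterior > 0.5
```

Under this toy model, cells where both sensors agree (e.g. the first cell, 0.9 and 0.8) yield a high fused posterior, while a cell where one sensor is uncertain can still be resolved by the other, which is the complementary behavior the abstract attributes to multi-sensor fusion.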
Conference Title
ROBOT INTELLIGENCE TECHNOLOGY AND APPLICATIONS 4
Volume
447
Subject
Artificial intelligence not elsewhere classified