A likelihood-based data fusion model for the integration of multiple sensor data: A case study with vision and lidar sensors
Sensors have been developed and applied in a wide range of fields, such as robotics and autonomous vehicle navigation (AVN). Because a single sensor cannot fully perceive its surroundings, multiple sensors with individual specialties are commonly combined to offset each other's shortcomings and enrich perception. However, integrating heterogeneous types of sensory information into useful results remains challenging. This research aims to achieve a high degree of accuracy with minimal false-positive and false-negative rates for the sake of reliability and safety. This paper introduces a likelihood-based data fusion model that integrates information from multiple sensors, maps it into an integrated data space, and generates a solution that accounts for all of the sensor information. Two distinct sensors, an optical camera and a Light Detection and Ranging (Lidar) sensor, were used for the experiment. The experimental results demonstrate the usefulness of the proposed model in comparison with single-sensor outcomes.
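To make the idea of likelihood-based fusion concrete, the following is a minimal sketch, not the paper's actual model: per-cell object likelihoods from two sensors are combined under an assumed conditional-independence model, so the posterior odds are the product of each sensor's likelihood ratio with the prior odds. The function name, the prior, and the toy likelihood values are all illustrative assumptions.

```python
import numpy as np

def fuse_likelihoods(camera_lik, lidar_lik, prior=0.5):
    """Fuse per-cell object likelihoods from two sensors.

    Assumes the sensors are conditionally independent given the cell
    state, so posterior odds = prior odds * product of likelihood ratios.
    Inputs are probabilities in (0, 1) that each cell contains an object.
    """
    camera_lik = np.asarray(camera_lik, dtype=float)
    lidar_lik = np.asarray(lidar_lik, dtype=float)
    prior_odds = prior / (1.0 - prior)
    # Likelihood ratio per sensor: P(object) / P(no object)
    ratio = (camera_lik / (1.0 - camera_lik)) * (lidar_lik / (1.0 - lidar_lik))
    odds = prior_odds * ratio
    return odds / (1.0 + odds)  # posterior probability per cell

# Toy example: in cell 0 both sensors agree an object is likely, so the
# fused estimate exceeds either alone; in cell 1 they disagree, pulling
# the fused estimate back toward the prior.
camera = [0.9, 0.8]
lidar = [0.7, 0.2]
posterior = fuse_likelihoods(camera, lidar)
```

When both sensors agree, the fused probability is sharper than either input (here about 0.95 for cell 0), while conflicting evidence cancels out (cell 1 returns to 0.5), which is one way such a model can suppress single-sensor false positives.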
Robot Intelligence Technology and Applications 4: Results from the 4th International Conference on Robot Intelligence Technology and Applications
Artificial Intelligence and Image Processing not elsewhere classified