A Multimodal Fusion System for Object Identification in Point Clouds with Density and Coverage Differences
Abstract
Data fusion, which involves integrating information from multiple sources to achieve a specific objective, is an essential area of contemporary scientific research. This article presents a multimodal fusion system for object identification in point clouds in a controlled environment. Several stages were implemented, including downsampling and denoising techniques, to prepare the data before fusion. Two denoising approaches were tested and compared: one based on a neighborhood technique and the other applying a median filter to the x, y, and z coordinates of each point. The downsampling techniques included Random, Grid Average, and Nonuniform Grid Sample. To achieve precise alignment of the sensor data in a common coordinate system, registration techniques such as Iterative Closest Point (ICP), Coherent Point Drift (CPD), and Normal Distribution Transform (NDT) were employed. Despite limitations such as variations in density and differences in coverage among the point clouds generated by the sensors, the system achieved an integrated and coherent representation of the objects in the controlled environment. This result establishes a robust foundation for future research in point cloud data fusion.
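
For illustration only, below is a minimal Python sketch of two of the preprocessing steps named in the abstract: grid-average downsampling and a per-coordinate median filter computed over each point's nearest neighbors. It assumes NumPy and SciPy (not mentioned in the article); the function names, parameters, and synthetic data are hypothetical and do not reproduce the authors' implementation.

import numpy as np
from scipy.spatial import cKDTree

def grid_average_downsample(points, cell_size=0.05):
    """Grid-average downsampling: points falling in the same grid cell
    are replaced by their centroid (illustrative sketch)."""
    keys = np.floor(points / cell_size).astype(np.int64)
    # Group points by cell index and average each group.
    _, inverse, counts = np.unique(keys, axis=0, return_inverse=True, return_counts=True)
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inverse, points)
    return sums / counts[:, None]

def median_filter_denoise(points, k=8):
    """Per-coordinate median filter: each point's x, y, and z values are
    replaced by the medians over its k nearest neighbors (point included)."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    return np.median(points[idx], axis=1)

# Toy usage on a synthetic noisy cloud (hypothetical data, not from the article).
rng = np.random.default_rng(0)
cloud = rng.uniform(0, 1, size=(5000, 3)) + rng.normal(0, 0.01, size=(5000, 3))
down = grid_average_downsample(cloud, cell_size=0.1)
clean = median_filter_denoise(down, k=8)
print(down.shape, clean.shape)

The registration stage (ICP, CPD, or NDT) would then align the preprocessed clouds from the different sensors into a common coordinate system; it is omitted here because the article does not specify the implementation details.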
More information
Title according to WOS: A Multimodal Fusion System for Object Identification in Point Clouds with Density and Coverage Differences
Journal: PROCESSES
Volume: 12
Issue: 2
Publisher: MDPI Open Access Publishing
Publication date: 2024
DOI: 10.3390/pr12020248
Notes: ISI