Extended Target Tracking with 3D-INSEG and its Benefits in Dense Scenarios
Keywords: tracking, segmentation, stereo vision, Multiple Extended Object Tracking, Depth Estimation
Abstract
In Multiple Extended Object Tracking (MEOT), a single target is assumed to produce multiple measurements, and the quality of these measurements is paramount for obtaining accurate track estimates over time. State-of-the-art MEOT algorithms have been tested with both simulated and real laser data recorded in open spaces. They work well in these scenarios, but in more cluttered or restricted spaces they often fail to produce good results, because measurements from targets in close proximity are attributed to the same origin. To address these cases, this article applies the 3D INstance SEGmentation (3D-INSEG) algorithm to MEOT, processing stereo image sequences to extract 3D information for each detected target from camera data. The algorithm selects the pixels of each detected target, computes the disparity map from the stereo pair, and projects those pixels into 3D space using the disparity map. The resulting measurements are then processed by an extended-target Poisson multi-Bernoulli mixture (PMBM) filter with a gamma Gaussian inverse-Wishart (GGIW) implementation. The advantages of MEOT with 3D-INSEG-generated data are demonstrated via a comparison with MEOT based on Velodyne LiDAR data points recorded in the same scenario and processed by the same MEOT algorithm.
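The disparity-based projection step described in the abstract can be illustrated with a minimal sketch. It assumes a rectified stereo pair and a pinhole camera model, using the standard relations Z = f·B/d and (X, Y) = ((u − cx)·Z/f, (v − cy)·Z/f); the function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def reproject_to_3d(pixels, disparity_map, f, baseline, cx, cy):
    """Project segmented target pixels into 3D using a stereo disparity map.

    pixels        : (N, 2) array of (u, v) pixel coordinates of one target
    disparity_map : (H, W) disparity image, in pixels
    f             : focal length in pixels; baseline in metres
    cx, cy        : principal point, in pixels
    Returns an (M, 3) array of (X, Y, Z) points; pixels with non-positive
    disparity are dropped, since they carry no valid depth.
    """
    u = pixels[:, 0].astype(int)
    v = pixels[:, 1].astype(int)
    d = disparity_map[v, u]            # disparity at each segmented pixel
    valid = d > 0
    u, v, d = u[valid], v[valid], d[valid]
    Z = f * baseline / d               # stereo depth: Z = f * B / d
    X = (u - cx) * Z / f               # pinhole back-projection
    Y = (v - cy) * Z / f
    return np.stack([X, Y, Z], axis=1)
```

In a pipeline of this kind, such a projection would be applied per detected instance mask, and each resulting 3D point set would serve as one target's measurement cluster for the extended-target filter.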
More information
| Title (WOS): | Extended Target Tracking with 3D-INSEG and its Benefits in Dense Scenarios |
| Title (SCOPUS): | Extended Target Tracking with 3D-INSEG and its Benefits in Dense Scenarios |
| Venue: | FUSION 2024 - 27th International Conference on Information Fusion |
| Publisher: | Institute of Electrical and Electronics Engineers Inc. |
| Publication date: | 2024 |
| Language: | English |
| DOI: | 10.23919/FUSION59988.2024.10706460 |
| Notes: | ISI, SCOPUS |