
Vision-Enhanced Low-Cost Localization in Crowdsourced Maps

Benedict Flade, Axel Koppert, Gorka Isasmendi, Anweshan Das, David Bétaille, Gijs Dubbelman, Oihana Otaegui, Julian Eggert, "Vision-Enhanced Low-Cost Localization in Crowdsourced Maps", IEEE Intelligent Transportation Systems Magazine, vol. 12, no. 3, pp. 70-80, 2020.

Abstract

Lane-level localization of vehicles with low-cost sensors is a challenging task. In situations in which Global Navigation Satellite Systems (GNSS) suffer from weak observation geometry or from the influence of reflected signals, the fusion of heterogeneous information is a suitable approach to improving localization accuracy. We propose a solution based on a monocular front-facing camera, a low-cost inertial measurement unit (IMU), and a single-frequency GNSS receiver. The sensor-data fusion is implemented as a tightly coupled Kalman filter that corrects the IMU-based trajectory with GNSS observations while employing EGNOS correction data. Furthermore, we consider complementary vision-based data as an additional source of information. In contrast to other approaches, the camera is not used to infer the motion of the vehicle but to directly correct the localization results using map information. More specifically, this so-called camera-to-map alignment compares virtual 3D views (candidates), created from projected map data, with lane-geometry features extracted from the camera image. One strength of the proposed solution is its compatibility with state-of-the-art map data that is publicly available from different sources. We validate the approach on real-world data recorded in The Netherlands and show that it is a promising, cost-efficient means of supporting future advanced driver assistance systems.
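The abstract mentions a tightly coupled Kalman filter, in which raw GNSS observations (pseudoranges) correct the IMU-propagated state directly, instead of fusing an already-computed position fix. The following is a minimal sketch of one such measurement update for an illustrative state of 3D position plus receiver clock bias; the state layout, noise values, and simplified measurement model are assumptions for illustration, not details taken from the paper:

```python
import numpy as np

def pseudorange_update(x, P, sats, rho, sigma):
    """One tightly coupled Kalman measurement update (illustrative sketch).

    x    : state [px, py, pz, clock_bias] in meters
    P    : 4x4 state covariance
    sats : (m, 3) satellite positions in meters
    rho  : (m,) measured pseudoranges in meters
    sigma: pseudorange noise standard deviation in meters
    """
    pos, clock_bias = x[:3], x[3]
    ranges = np.linalg.norm(sats - pos, axis=1)
    h = ranges + clock_bias                       # predicted pseudoranges
    # Linearized measurement matrix: unit line-of-sight vectors plus a
    # column of ones for the clock-bias term.
    H = np.hstack([(pos - sats) / ranges[:, None], np.ones((len(sats), 1))])
    R = sigma**2 * np.eye(len(sats))              # measurement noise
    S = H @ P @ H.T + R                           # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x_new = x + K @ (rho - h)                     # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P          # corrected covariance
    return x_new, P_new
```

In the paper's pipeline this kind of update would run alongside IMU propagation and the vision-based map-alignment correction; the sketch covers only the GNSS side, and omits EGNOS corrections, atmospheric models, and outlier rejection.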


