Map-Based Probabilistic Visual Self-Localization
Accurate and efficient self-localization is a critical problem for autonomous systems. This paper describes an affordable solution to vehicle self-localization which uses odometry computed from two video cameras and road maps as the sole inputs. The core of the method is a probabilistic model for which an efficient approximate inference algorithm is derived. The inference algorithm is able to utilize distributed computation in order to meet the real-time requirements of autonomous systems in some instances. Because of the probabilistic nature of the model, the method is capable of coping with various sources of uncertainty, including noise in the visual odometry and inherent ambiguities in the map (e.g., in a Manhattan world). By exploiting freely available, community-developed maps and visual odometry measurements, the proposed method is able to localize a vehicle to within 4 m on average after 52 seconds of driving on maps which contain more than 2,150 km of drivable roads.
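The abstract describes a Bayesian filtering approach that fuses noisy visual odometry with a road map. The sketch below is only an illustrative aside: a simple particle filter over poses weighted by proximity to a hypothetical road network. The segment list, noise parameters, and helper functions are assumptions for illustration and do not reproduce the paper's actual probabilistic model or inference algorithm.

# Minimal sketch of Bayesian filtering for map-based localization.
# NOT the paper's algorithm; it only illustrates the general idea of
# fusing noisy odometry increments with a road map via a particle filter.
# The map, noise levels, and helpers are hypothetical.

import math
import random

# Hypothetical map: straight road segments given by (x0, y0, x1, y1) in metres.
road_segments = [(0.0, 0.0, 100.0, 0.0), (100.0, 0.0, 100.0, 100.0)]

def nearest_segment_distance(x, y):
    """Distance from (x, y) to the closest road segment."""
    best = float("inf")
    for x0, y0, x1, y1 in road_segments:
        dx, dy = x1 - x0, y1 - y0
        # Project the point onto the segment and clamp to its endpoints.
        t = max(0.0, min(1.0, ((x - x0) * dx + (y - y0) * dy) / (dx * dx + dy * dy)))
        px, py = x0 + t * dx, y0 + t * dy
        best = min(best, math.hypot(x - px, y - py))
    return best

def filter_step(particles, odo_dist, odo_dtheta,
                sigma_d=0.5, sigma_t=0.02, sigma_map=2.0):
    """One predict/update/resample step for an odometry increment (distance, heading change)."""
    # Predict: move each particle (x, y, theta) by the noisy odometry increment.
    predicted = []
    for x, y, theta in particles:
        d = odo_dist + random.gauss(0.0, sigma_d)
        t = theta + odo_dtheta + random.gauss(0.0, sigma_t)
        predicted.append((x + d * math.cos(t), y + d * math.sin(t), t))

    # Update: weight particles by how close they stay to the road network.
    weights = [math.exp(-0.5 * (nearest_segment_distance(x, y) / sigma_map) ** 2)
               for x, y, _ in predicted]
    if sum(weights) == 0.0:          # all particles off-road: fall back to uniform weights
        weights = [1.0] * len(predicted)

    # Resample proportionally to the weights.
    return random.choices(predicted, weights=weights, k=len(predicted))

if __name__ == "__main__":
    random.seed(0)
    # Start with particles spread along the first segment, heading east.
    particles = [(random.uniform(0.0, 100.0), random.gauss(0.0, 3.0), 0.0) for _ in range(500)]
    for _ in range(20):              # feed 20 odometry increments of 5 m straight ahead
        particles = filter_step(particles, odo_dist=5.0, odo_dtheta=0.0)
    mx = sum(p[0] for p in particles) / len(particles)
    my = sum(p[1] for p in particles) / len(particles)
    print(f"estimated position: ({mx:.1f}, {my:.1f})")

In practice the paper constrains the state to the road network itself rather than filtering in free space, which is what makes localization from odometry and a map alone tractable at city scale; the sketch above only conveys the predict/update intuition.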
Author(s): Marcus A. Brubaker, Andreas Geiger and Raquel Urtasun
Journal: IEEE Trans. on Pattern Analysis and Machine Intelligence (PAMI)
Year: 2016
Department(s): Autonomous Vision, Perceiving Systems
Research Project(s): Global Localization and Affordance Learning
Bibtex Type: Article (article)
Links: pdf
BibTex:
@article{Brubaker2016PAMI,
  title = {Map-Based Probabilistic Visual Self-Localization},
  author = {Brubaker, Marcus A. and Geiger, Andreas and Urtasun, Raquel},
  journal = {IEEE Trans. on Pattern Analysis and Machine Intelligence (PAMI)},
  year = {2016},
  doi = {}
}