dc.contributor.author | Hübner, Thomas | en_US |
dc.contributor.author | Pajarola, Renato | en_US |
dc.contributor.editor | P. Alliez and M. Magnor | en_US |
dc.date.accessioned | 2015-07-09T11:07:29Z | |
dc.date.available | 2015-07-09T11:07:29Z | |
dc.date.issued | 2009 | en_US |
dc.identifier.uri | http://dx.doi.org/10.2312/egs.20091037 | en_US |
dc.description.abstract | A major drawback in many robotics projects is the dependence on a specific environment and the otherwise uncertain behavior of the hardware. Simple navigation tasks like driving in a straight line can lead to a strong lateral drift over time in an unknown environment. In this paper we propose a fast and simple solution to the lateral drift problem for vision-guided robots through real-time scene analysis. Without an environment-specific calibration of the robot's drive system, we balance the differential drive speed on the fly. To this end, a feature detector is applied to consecutive images. The detected feature points determine the focus of expansion (FOE), which is used for locating and correcting the robot's lateral drift. Results for an unmodified real-world indoor environment demonstrate that our method corrects most lateral drift based solely on real-time vision processing. | en_US |
dc.publisher | The Eurographics Association | en_US |
dc.title | Real-time Vision-based Lateral Drift Correction | en_US |
dc.description.seriesinformation | Eurographics 2009 - Short Papers | en_US |
dc.description.sectionheaders | Imaging, Perception, Display | en_US |
dc.identifier.doi | 10.2312/egs.20091037 | en_US |
dc.identifier.pages | 13-16 | en_US |
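The abstract outlines a pipeline of feature tracking between consecutive frames, focus-of-expansion (FOE) estimation, and a differential drive-speed correction derived from the FOE's horizontal offset. The sketch below illustrates that general idea only; it is not the authors' implementation, and the function names, parameters, gain, and sign convention are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation): estimate the focus of
# expansion (FOE) from sparse optical flow between two consecutive frames
# and derive a differential-drive speed correction from its horizontal
# offset. Names, thresholds, and the gain are illustrative assumptions.
import cv2
import numpy as np

def estimate_foe(prev_gray, curr_gray):
    """Track corner features and intersect their flow lines in a
    least-squares sense to locate the focus of expansion."""
    pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                   qualityLevel=0.01, minDistance=8)
    if pts0 is None:
        return None
    pts1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts0, None)
    ok = status.ravel() == 1
    p0 = pts0.reshape(-1, 2)[ok]
    p1 = pts1.reshape(-1, 2)[ok]
    flow = p1 - p0
    norms = np.linalg.norm(flow, axis=1)
    keep = norms > 0.5                      # ignore near-static points
    p0, flow, norms = p0[keep], flow[keep], norms[keep]
    if len(p0) < 8:
        return None
    # Each flow vector defines a line through its feature point; the FOE is
    # the point minimizing the squared perpendicular distance to all lines.
    n = np.stack([-flow[:, 1], flow[:, 0]], axis=1) / norms[:, None]  # unit normals
    b = np.sum(n * p0, axis=1)
    foe, *_ = np.linalg.lstsq(n, b, rcond=None)
    return foe                              # (x, y) in image coordinates

def drift_correction(foe_x, image_width, base_speed, gain=0.002):
    """Balance the two wheel speeds from the FOE's horizontal offset.
    Sign convention (assumed): FOE left of centre -> steer back to the
    right by speeding up the left wheel and slowing the right wheel."""
    error = foe_x - image_width / 2.0
    return base_speed - gain * error, base_speed + gain * error  # (left, right)
```

In use, one would grab grayscale frames from the robot's camera, call estimate_foe on each consecutive pair, and feed the returned FOE x-coordinate into drift_correction to update the two wheel speed commands; the paper itself reports doing this in real time without any environment-specific calibration.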