Please use this identifier to cite or link to this item: http://repositorio.inesctec.pt/handle/123456789/5879
Title: A voting method for stereo egomotion estimation
Authors: Hugo Miguel Silva
A. Bernardino
Eduardo Silva
Issue Date: 2017
Abstract: The development of vision-based navigation systems for mobile robotics applications in outdoor scenarios is a challenging problem due to frequent changes in contrast and illumination, image blur, pixel noise, lack of image texture, low image overlap, and other effects that make the interpretation of motion from image data ambiguous. To mitigate the problems arising from multiple possible interpretations of the data in outdoor stereo egomotion, we present a fully probabilistic method, the probabilistic stereo egomotion transform. Our method computes 6-degree-of-freedom motion parameters solely from probabilistic correspondences, without the need to track or commit to keypoint matches between two consecutive frames. Probabilistic correspondence methods maintain several match hypotheses for each point, which is an advantage when ambiguous matches occur (the rule rather than the exception in image feature correspondence), because no commitment is made before all image information has been analysed. Experimental validation is performed in simulated and real outdoor scenarios in the presence of image noise and image blur, and a comparison with a current state-of-the-art visual motion estimation method is also provided. Our method significantly reduces estimation errors, particularly under harsh conditions of noise and blur. © The Author(s) 2017.
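As an informal illustration of the voting idea summarised in the abstract (not the paper's actual implementation), the Python sketch below keeps a full matrix of soft correspondence likelihoods between two point sets and lets every match hypothesis vote for every candidate motion it is consistent with; the motion gathering the most probability mass wins. For brevity the toy is planar (3 degrees of freedom, grid-searched) rather than the paper's 6-DoF stereo setting, and all names (correspondence_likelihoods, vote_for_motion) and parameters (sigma, sigma_geo) are illustrative assumptions.

    # Illustrative sketch of voting over probabilistic correspondences.
    # Assumed toy setup; not the paper's PSET implementation.
    import numpy as np

    rng = np.random.default_rng(0)

    def correspondence_likelihoods(desc_a, desc_b, sigma=0.5):
        # (N, M) matrix of soft match likelihoods: entry (i, j) is the
        # unnormalised probability that point i in frame A corresponds
        # to point j in frame B. No hard assignment is made here.
        d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
        return np.exp(-0.5 * (d / sigma) ** 2)

    def vote_for_motion(pts_a, pts_b, likelihoods, candidates, sigma_geo=0.5):
        # Every (i, j) match hypothesis votes for each candidate motion,
        # weighted by how well the motion maps pts_a[i] onto pts_b[j]
        # and by the appearance likelihood of the match itself.
        scores = np.zeros(len(candidates))
        for k, (R, t) in enumerate(candidates):
            pred = pts_a @ R.T + t
            err = np.linalg.norm(pred[:, None, :] - pts_b[None, :, :], axis=2)
            scores[k] = np.sum(likelihoods * np.exp(-0.5 * (err / sigma_geo) ** 2))
        return scores

    # Toy scene: planar points moved by a known rotation and translation.
    pts_a = rng.uniform(-10.0, 10.0, size=(30, 2))
    theta, t_true = 0.1, np.array([1.0, -0.5])
    R_true = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
    pts_b = pts_a @ R_true.T + t_true + rng.normal(0.0, 0.05, size=pts_a.shape)

    # Noisy descriptors stand in for image appearance around each point.
    desc_a = rng.normal(size=(30, 8))
    desc_b = desc_a + rng.normal(0.0, 0.2, size=desc_a.shape)
    lik = correspondence_likelihoods(desc_a, desc_b)

    # Coarse grid of motion hypotheses; the true motion lies on the grid.
    candidates = [
        (np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]]),
         np.array([tx, ty]))
        for th in np.linspace(-0.2, 0.2, 9)
        for tx in np.linspace(0.0, 2.0, 5)
        for ty in np.linspace(-1.0, 0.0, 5)
    ]
    scores = vote_for_motion(pts_a, pts_b, lik, candidates)
    R_best, t_best = candidates[int(np.argmax(scores))]
    print("estimated translation:", t_best)  # expected: near [1.0, -0.5]

Note that no single best match is ever selected: an ambiguous point with two plausible correspondences simply spreads its vote across both, which is the property the abstract credits for robustness to noise and blur.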
URI: http://repositorio.inesctec.pt/handle/123456789/5879
http://dx.doi.org/10.1177/1729881417710795
Type: Article
Appears in Collections:CRAS - Articles in International Journals

Files in This Item:
P-00M-W6X.pdf (1.62 MB, Adobe PDF)

