Incremental texture mapping for autonomous driving

dc.contributor.author Oliveira, Miguel Riem en
dc.contributor.author Santos, V. en
dc.contributor.author Sappa, A.D. en
dc.contributor.author Dias, P. en
dc.contributor.author Moreira, António Paulo en
dc.date.accessioned 2017-12-28T11:22:54Z
dc.date.available 2017-12-28T11:22:54Z
dc.date.issued 2016 en
dc.description.abstract Autonomous vehicles carry a large number of on-board sensors, not only to provide coverage all around the vehicle but also to ensure multi-modal observation of the scene. Consequently, it is not trivial to build a single, unified representation that integrates the data from all of these sensors. We propose an algorithm that maps texture collected from vision-based sensors onto a geometric description of the scenario constructed from data provided by 3D sensors. The algorithm uses a constrained Delaunay triangulation to produce a mesh, which is updated through a specially devised sequence of operations. These operations enforce a partial configuration of the mesh that avoids poor-quality textures and ensures that there are no gaps in the texture. Results show that the algorithm produces textures of fine quality. en
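For readers skimming the record, the following is a minimal sketch of the general idea described in the abstract: project 3D points into a camera image, triangulate them on the image plane, and keep per-vertex texture (UV) coordinates. It is not the authors' implementation; the camera intrinsics and point cloud are hypothetical example data, and an unconstrained Delaunay triangulation (scipy) is used as a stand-in for the paper's constrained triangulation and incremental mesh-update operations.

    # Sketch only: texture a mesh by projecting 3D points into a camera image.
    import numpy as np
    from scipy.spatial import Delaunay

    def project_points(points_3d, K):
        """Project Nx3 camera-frame points to pixel coordinates with intrinsics K."""
        uvw = points_3d @ K.T            # homogeneous image coordinates
        return uvw[:, :2] / uvw[:, 2:3]  # perspective divide -> Nx2 pixel coordinates

    def texture_mesh(points_3d, K, image_size):
        """Return mesh faces and normalized texture (UV) coordinates per vertex."""
        pixels = project_points(points_3d, K)
        faces = Delaunay(pixels).simplices           # unconstrained stand-in for CDT
        uv = pixels / np.array(image_size, float)    # normalize to [0, 1] texture space
        return faces, uv

    if __name__ == "__main__":
        # Hypothetical intrinsics and a small cloud of points in front of the camera.
        K = np.array([[800.0, 0.0, 320.0],
                      [0.0, 800.0, 240.0],
                      [0.0, 0.0, 1.0]])
        rng = np.random.default_rng(0)
        pts = np.column_stack([rng.uniform(-2, 2, 50),
                               rng.uniform(-1, 1, 50),
                               rng.uniform(4, 8, 50)])  # z > 0, in front of the camera
        faces, uv = texture_mesh(pts, K, image_size=(640, 480))
        print(faces.shape, uv.shape)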
dc.identifier.uri http://repositorio.inesctec.pt/handle/123456789/5048
dc.identifier.uri http://dx.doi.org/10.1016/j.robot.2016.06.009 en
dc.language eng en
dc.relation 5157 en
dc.relation 6438 en
dc.rights info:eu-repo/semantics/embargoedAccess en
dc.title Incremental texture mapping for autonomous driving en
dc.type article en
dc.type Publication en
Files
Original bundle
Name: P-00M-3AE.pdf
Size: 8.24 MB
Format: Adobe Portable Document Format