Mobile robotic systems need to perceive their surroundings in order to act autonomously. This work develops a perception framework that interprets the data of a binocular camera and transforms it into a compact, expressive model of the environment. This model enables a mobile system to move purposefully and to interact with its surroundings. It is also shown how the developed methods provide a solid basis for technical assistive aids for visually impaired people.
Extent: VII, 129 pages
Preis: €41.00 | £38.00 | $72.00
Schwarze, T. 2018. Compact Environment Modelling from Unconstrained Camera Platforms. Karlsruhe: KIT Scientific Publishing. DOI: https://doi.org/10.5445/KSP/1000083235
This book is licensed under a Creative Commons Attribution-ShareAlike 4.0 license.
This book is peer reviewed.
Published on 25 September 2018
English
158
Paperback | ISBN 978-3-7315-0801-4