Book chapter, 2018

SLAM and Vision-based Humanoid Navigation

Abstract

For humanoid robots to move autonomously in a complex environment, they have to perceive it, build an appropriate representation, localize themselves in it, and decide which motion to realize. The relationship between the environment and the robot is rather complex, as some parts are obstacles to avoid, others possible supports for locomotion, or objects to manipulate. The affordances of objects and the environment may result in quite complex motions, ranging from bimanual manipulation to whole-body motion generation. In this chapter, we introduce tools to realize vision-based humanoid navigation. The general structure of such a system is depicted in Fig. 1. It classically represents the perception-action loop in which a set of features is extracted from the sensor signals. This information is used to localize the robot and build a representation of the environment; this process is the subject of the second section. Finally, a motion is planned and sent to the robot control system. The third section describes several approaches to implementing visual navigation in the context of humanoid robotics.
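As a rough illustration of the perception-action loop described in the abstract (sensing, feature extraction, localization and mapping, motion planning, control), the following Python sketch shows how such a pipeline could be organized. It is a minimal sketch under stated assumptions, not the chapter's implementation: all names (RobotState, extract_features, update_slam, plan_motion) are hypothetical placeholders.

```python
# Minimal, hypothetical sketch of a perception-action loop for vision-based
# navigation. Every function is a placeholder standing in for a real module
# (keypoint extraction, visual SLAM, footstep/motion planning, walking control).
from dataclasses import dataclass, field


@dataclass
class RobotState:
    x: float = 0.0
    y: float = 0.0
    theta: float = 0.0


@dataclass
class WorldMap:
    landmarks: list = field(default_factory=list)  # estimated landmark positions


def extract_features(image):
    """Placeholder feature extraction: here an 'image' is just a list of pixel coordinates of interest."""
    return [(u, v) for u, v in image]


def update_slam(state, world_map, features):
    """Placeholder SLAM update: refine the pose estimate and grow the map."""
    world_map.landmarks.extend(features)
    return state, world_map


def plan_motion(state, goal, step=0.1):
    """Placeholder planner: a small step straight toward the goal."""
    dx, dy = goal[0] - state.x, goal[1] - state.y
    norm = max((dx ** 2 + dy ** 2) ** 0.5, 1e-9)
    return step * dx / norm, step * dy / norm


def perception_action_loop(images, goal):
    state, world_map = RobotState(), WorldMap()
    for image in images:
        features = extract_features(image)
        state, world_map = update_slam(state, world_map, features)
        vx, vy = plan_motion(state, goal)
        # In a real system this command would be sent to the whole-body / walking controller.
        state.x += vx
        state.y += vy
    return state, world_map


if __name__ == "__main__":
    fake_images = [[(10, 20), (30, 40)], [(12, 22)]]
    final_state, final_map = perception_action_loop(fake_images, goal=(1.0, 0.0))
    print(final_state, len(final_map.landmarks))
```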
Main file: hrr-slam.pdf (2.31 MB). Origin: files produced by the author(s).

Dates and versions

hal-01674512, version 1 (03-01-2018)

Identifiers

HAL Id: hal-01674512
DOI: 10.1007/978-94-007-6046-2_59

Cite

Olivier Stasse. SLAM and Vision-based Humanoid Navigation. Humanoid Robotics: A Reference, pp. 1739-1761, 2018, ISBN 978-94-007-6047-9. ⟨10.1007/978-94-007-6046-2_59⟩. ⟨hal-01674512⟩