Robust Navigation Using Markov Models - INRIA - Institut National de Recherche en Informatique et en Automatique
Journal article in International Journal of Advanced Robotic Systems, Year: 2008

Robust Navigation Using Markov Models

Abstract

To reach a given goal, a mobile robot first computes a motion plan (i.e. a sequence of actions that will take it to its goal), and then executes it. Markov Decision Processes (MDPs) have been successfully used to solve these two problems. Their main advantage is that they provide a theoretical framework to deal with the uncertainties related to the robot's motor and perceptive actions during both the planning and execution stages. This paper describes a navigation approach using an MDP-based planning method and Markov Localisation. The planning method uses a hierarchical representation of the robot's state space. In addition, the actions used better integrate the kinematic constraints of a wheeled mobile robot. These two features yield a motion planner that is more efficient and better suited to planning robust motion strategies. This paper also focuses on the experimental aspects of using Markov techniques, with particular emphasis on how two key elements were obtained by learning, namely the transition function (which encodes the uncertainties related to the robot's actions) and the sensor model. Experiments carried out with a real robot demonstrate the robustness of the whole navigation approach.
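The MDP machinery the abstract refers to can be illustrated with a toy example. The sketch below is not the paper's planner: the corridor world, the 0.8 success probability standing in for a learned transition function, and the step cost are all invented for illustration. It shows value iteration producing a policy that accounts for motion uncertainty, which is the core idea behind MDP-based motion planning.

```python
# Value iteration on a toy 1x4 corridor MDP -- an illustrative sketch only,
# not the planner described in the paper. The transition probabilities and
# rewards below are assumptions made for this example.

STATES = [0, 1, 2, 3]
GOAL = 3                # absorbing goal state
ACTIONS = ["left", "right"]
P_SUCCESS = 0.8         # assumed motor uncertainty: the action succeeds 80% of the time
GAMMA = 0.95            # discount factor


def transitions(s, a):
    """Return [(next_state, probability)]: the (hypothetical) transition function."""
    if a == "left":
        intended = max(s - 1, 0)
    else:
        intended = min(s + 1, len(STATES) - 1)
    # With probability 1 - P_SUCCESS the robot fails to move and stays put.
    return [(intended, P_SUCCESS), (s, 1.0 - P_SUCCESS)]


def value_iteration(eps=1e-6):
    """Compute the optimal state values under a -1 cost per step."""
    V = [0.0] * len(STATES)
    while True:
        delta = 0.0
        for s in STATES:
            if s == GOAL:
                continue  # the goal is absorbing, so its value stays 0
            best = max(
                sum(p * (-1.0 + GAMMA * V[s2]) for s2, p in transitions(s, a))
                for a in ACTIONS
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            return V


def greedy_policy(V):
    """Extract the greedy policy (the motion strategy) from the values."""
    pi = {}
    for s in STATES:
        if s == GOAL:
            continue
        pi[s] = max(
            ACTIONS,
            key=lambda a: sum(p * (-1.0 + GAMMA * V[s2])
                              for s2, p in transitions(s, a)),
        )
    return pi
```

Running `greedy_policy(value_iteration())` yields "right" in every non-goal state: because the uncertainty is encoded in the transition function, the resulting plan remains optimal even though each action may fail. The paper's contribution is to learn such a transition function (and a sensor model for Markov Localisation) from a real robot rather than assume it.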
Main file: 08-ars-burlet-etal.pdf (1.18 MB)
Origin: Files produced by the author(s)

Dates and versions

inria-00259299 , version 1 (27-02-2008)
inria-00259299 , version 2 (28-02-2008)

Identifiers

  • HAL Id : inria-00259299 , version 2

Cite

Julien Burlet, Thierry Fraichard, Olivier Aycard. Robust Navigation Using Markov Models. International Journal of Advanced Robotic Systems, 2008, 5 (2). ⟨inria-00259299v2⟩
170 views
251 downloads
