Multi-Modal User Interactions in Controlled Environments - INRIA - Institut National de Recherche en Informatique et en Automatique
Book, Year: 2010

Multi-Modal User Interactions in Controlled Environments

Abstract

- One of the first books to cover multimodality and behavioral data, rather than mono-modality tracking and analysis
- Discusses a video-based system that boosts productivity and increases satisfaction by automating repetitive human tasks
- Focuses on the presentation of information to the user

Multi-Modal User Interactions in Controlled Environments investigates the capture and analysis of users' multimodal behavior (mainly eye gaze, eye fixation, eye blink, and body movements) within a real controlled environment (a controlled supermarket, a personal environment) in order to adapt the response of the computer/environment to the user. Such data is captured using non-intrusive sensors (for example, cameras on supermarket stands) installed in the environment. This multimodal, video-based behavioral data is analyzed to infer user intentions and to assist users in their day-to-day tasks by seamlessly adapting the system's response to their requirements. The book also focuses on the presentation of information to the user.

Multi-Modal User Interactions in Controlled Environments is designed for professionals in industry, including the domains of security and interactive web television. It is also suitable for graduate-level students in computer science and electrical engineering.
File not deposited

Dates and versions

hal-01856780 , version 1 (13-08-2018)

Identifiers

  • HAL Id: hal-01856780, version 1

Cite

Chaabane Djeraba, Adel Lablack, Yassine Benabbas. Multi-Modal User Interactions in Controlled Environments. Multimedia Systems and Applications, vol. 34, Springer US, 216 p., 2010, ISBN 978-1-4419-0316-7. ⟨hal-01856780⟩
