  • Book
  • © 2010

Multi-Modal User Interactions in Controlled Environments

  • One of the first books to focus primarily on multimodality and behavioral data (mainly eye gaze, eye fixation, eye blink and body movements), rather than on mono-modality tracking and analysis
  • Discusses a video-based system that boosts productivity and increases satisfaction by automating repetitive human tasks, optimizing gestures to obtain the information we need, and enabling us to work with other people across space and time

Part of the book series: Multimedia Systems and Applications (MMSA, volume 34)

Buying options

eBook USD 84.99
Price excludes VAT (USA)
  • Available as PDF
  • Read on any device
  • Instant download
  • Own it forever
Softcover Book USD 169.00
Price excludes VAT (USA)
  • Compact, lightweight edition
  • Dispatched in 3 to 5 business days
  • Free shipping worldwide
Hardcover Book USD 109.99
Price excludes VAT (USA)
  • Durable hardcover edition
  • Dispatched in 3 to 5 business days
  • Free shipping worldwide



Table of contents (6 chapters)

  1. Front Matter

    Pages 1-1
  2. Introduction

    • Chaabane Djeraba, Adel Lablack, Yassine Benabbas
    Pages 1-10
  3. Abnormal Event Detection

    • Chaabane Djeraba, Adel Lablack, Yassine Benabbas
    Pages 11-58
  4. Flow Estimation

    • Chaabane Djeraba, Adel Lablack, Yassine Benabbas
    Pages 59-98
  5. Estimation of Visual Gaze

    • Chaabane Djeraba, Adel Lablack, Yassine Benabbas
    Pages 99-141
  6. Visual Field Projection and Region of Interest Analysis

    • Chaabane Djeraba, Adel Lablack, Yassine Benabbas
    Pages 143-170
  7. Conclusion

    • Chaabane Djeraba, Adel Lablack, Yassine Benabbas
    Pages 171-173
  8. Back Matter

    Pages 174-174

About this book

Multi-Modal User Interactions in Controlled Environments investigates the capture and analysis of users' multimodal behavior (mainly eye gaze, eye fixation, eye blink and body movements) within a real controlled environment (such as a controlled supermarket or a personal environment) in order to adapt the response of the computer/environment to the user. The data are captured by non-intrusive sensors installed in the environment (for example, cameras placed in the stands of a supermarket). This multi-modal, video-based behavioral data is analyzed to infer user intentions and to assist users in their day-to-day tasks by seamlessly adapting the system's response to their requirements. The book also addresses how information is presented to the user. Multi-Modal User Interactions in Controlled Environments is designed for professionals in industry, including those working in security and interactive web television, and is also suitable for graduate-level students in computer science and electrical engineering.
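
As a rough illustration of the kind of camera-based capture-and-analysis loop described above (and not the authors' method), the Python sketch below uses OpenCV's bundled Haar cascades to detect face and eye regions from a fixed camera. The camera index, the choice of cascade classifiers and the final logging step are placeholders for the abnormal event detection, flow estimation and gaze estimation techniques developed in the book's chapters.

    # Illustrative sketch only; OpenCV Haar cascades stand in for the book's
    # gaze/fixation/blink and body-movement analysis.
    import cv2

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    cap = cv2.VideoCapture(0)  # any fixed, non-intrusive camera in the environment
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
                eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
                # A full system would feed these observations (head pose, eye
                # regions, blink events) to a model that infers user intention
                # and adapts the environment's response; here we only log them.
                print(f"face at ({x},{y}) size {w}x{h}, {len(eyes)} eye region(s)")
    finally:
        cap.release()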

Authors and Affiliations

  • Laboratoire d'Informatique Fondamentale de Lille (LIFL), University Lille 1, Villeneuve d'Ascq, France

    Chaabane Djeraba

  • Laboratoire d'Informatique Fondamentale de Lille (LIFL), University Lille 1, Villeneuve d'Ascq, France

    Adel Lablack, Yassine Benabbas
