  • Book
  • © 2016

Human Activity Recognition and Prediction

Editors: Yun Fu

  • Covers state-of-the-art topics in activity recognition and prediction
  • Discusses both the methodology and real-world practice of human activity recognition
  • Contains contributions from top experts in the field, whose unique perspectives are voiced throughout


Table of contents (8 chapters)

  1. Front Matter

    Pages i-vii
  2. Introduction

    • Yu Kong, Yun Fu
    Pages 1-22
  3. Action Recognition and Human Interaction

    • Yu Kong, Yun Fu
    Pages 23-48
  4. Subspace Learning for Action Recognition

    • Chengcheng Jia, Yun Fu
    Pages 49-69
  5. Multimodal Action Recognition

    • Chengcheng Jia, Wei Pang, Yun Fu
    Pages 71-85
  6. RGB-D Action Recognition

    • Chengcheng Jia, Yu Kong, Zhengming Ding, Yun Fu
    Pages 87-106
  7. Activity Prediction

    • Yu Kong, Yun Fu
    Pages 107-122
  8. Actionlets and Activity Prediction

    • Kang Li, Yun Fu
    Pages 123-151
  9. Time Series Modeling for Activity Prediction

    • Kang Li, Sheng Li, Yun Fu
    Pages 153-174

About this book

This book provides a unique view of human activity recognition, especially fine-grained human activity structure learning, human-interaction recognition, RGB-D-based action recognition, temporal decomposition, and causality learning in unconstrained human activity videos. The techniques discussed give readers tools that improve significantly on existing video content understanding methodologies by taking advantage of activity recognition. The book links several popular research fields: computer vision, machine learning, human-centered computing, human-computer interaction, image classification, and pattern recognition. In addition, it includes key chapters covering multiple emerging topics in the field. Contributed by top experts and practitioners, the chapters present key topics from different angles and blend methodology with application, forming a solid overview of human activity recognition techniques.

Editors and Affiliations

  • Northeastern University, Boston, USA

    Yun Fu

Bibliographic Information

  • Book Title: Human Activity Recognition and Prediction

  • Editors: Yun Fu

  • DOI: https://doi.org/10.1007/978-3-319-27004-3

  • Publisher: Springer Cham

  • eBook Packages: Engineering, Engineering (R0)

  • Copyright Information: Springer International Publishing Switzerland 2016

  • Hardcover ISBN: 978-3-319-27002-9, Published: 06 January 2016

  • Softcover ISBN: 978-3-319-80055-4, Published: 30 March 2018

  • eBook ISBN: 978-3-319-27004-3, Published: 23 December 2015

  • Edition Number: 1

  • Number of Pages: VII, 174

  • Number of Illustrations: 6 b/w illustrations, 64 illustrations in colour

  • Topics: Signal, Image and Speech Processing, Image Processing and Computer Vision, Biometrics
