  • Textbook
  • © 2009

The Elements of Statistical Learning

Data Mining, Inference, and Prediction, Second Edition

  • The many topics include neural networks, support vector machines, classification trees, and boosting, with the first comprehensive treatment of boosting in any book
  • Includes more than 200 pages of four-color graphics
  • Includes supplementary material: sn.pub/extras

Part of the book series: Springer Series in Statistics (SSS)


Table of contents (18 chapters)

  1. Front Matter

    Pages i-xxii
  2. Introduction

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 1-8
  3. Overview of Supervised Learning

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 9-41
  4. Linear Methods for Regression

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 43-99
  5. Linear Methods for Classification

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 101-137
  6. Basis Expansions and Regularization

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 139-189
  7. Kernel Smoothing Methods

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 191-218
  8. Model Assessment and Selection

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 219-259
  9. Model Inference and Averaging

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 261-294
  10. Additive Models, Trees, and Related Methods

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 295-336
  11. Boosting and Additive Trees

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 337-387
  12. Neural Networks

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 389-416
  13. Support Vector Machines and Flexible Discriminants

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 417-458
  14. Prototype Methods and Nearest-Neighbors

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 459-483
  15. Unsupervised Learning

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 485-585
  16. Random Forests

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 587-604
  17. Ensemble Learning

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 605-624
  18. Undirected Graphical Models

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 625-648
  19. High-Dimensional Problems: p ≫ N

    • Trevor Hastie, Robert Tibshirani, Jerome Friedman
    Pages 649-698
  20. Back Matter

    Pages 699-745

About this book

This book describes the important ideas in a variety of fields such as medicine, biology, finance, and marketing in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of colour graphics. It is a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees, and boosting, the first comprehensive treatment of boosting in any book.

This major new edition features many topics not covered in the original, including graphical models, random forests, ensemble methods, least angle regression & path algorithms for the lasso, non-negative matrix factorisation, and spectral clustering. There is also a chapter on methods for "wide" data (p bigger than n), including multiple testing and false discovery rates.
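As a rough, hypothetical illustration of the lasso path algorithms mentioned above (this is not material from the book, whose software ties in with R/S-PLUS), the sketch below assumes Python with scikit-learn and uses synthetic data; the problem size, coefficient values, and variable names are all assumptions made for illustration.

    # Minimal sketch (assumption: Python + scikit-learn, synthetic data):
    # trace the lasso coefficient path and see which predictors enter the model.
    import numpy as np
    from sklearn.linear_model import lasso_path

    rng = np.random.default_rng(0)
    n_obs, n_pred = 100, 10                                    # hypothetical problem size
    X = rng.standard_normal((n_obs, n_pred))
    beta = np.array([3.0, -2.0, 1.5] + [0.0] * (n_pred - 3))   # only 3 truly nonzero coefficients
    y = X @ beta + 0.5 * rng.standard_normal(n_obs)

    # lasso_path computes coefficients over a decreasing grid of penalty values,
    # showing how variables enter the fit as the penalty relaxes.
    alphas, coefs, _ = lasso_path(X, y)
    print(coefs.shape)   # (n_pred, number of penalty values on the path)

The returned coefficient matrix has one row per predictor and one column per penalty value; plotting its rows against the penalty grid gives the familiar coefficient-path picture.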

Authors and Affiliations

  • Dept. of Statistics, Stanford University, Stanford, USA

    Trevor Hastie, Robert Tibshirani, Jerome Friedman

About the authors

Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie co-developed much of the statistical modeling software and environment in R/S-PLUS and invented principal curves and surfaces. Tibshirani proposed the lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, projection pursuit and gradient boosting.

