Book · © 2004

Statistical Learning Theory and Stochastic Optimization

Ecole d'Eté de Probabilités de Saint-Flour XXXI - 2001

Author: Olivier Catoni

Part of the book series: Lecture Notes in Mathematics (LNM, volume 1851)

Part of the book sub series: École d'Été de Probabilités de Saint-Flour (LNMECOLE)


Table of contents (11 chapters; all chapters by Olivier Catoni)

  • Front Matter (pages I-VIII)
  • Introduction (pages 1-4)
  • Chapter 1. Universal lossless data compression (pages 5-54)
  • Chapter 3. Non cumulated mean risk (pages 71-95)
  • Chapter 4. Gibbs estimators (pages 97-154)
  • Chapter 6. Deviation inequalities (pages 199-222)
  • Chapter 7. Markov chains with exponential transitions (pages 223-260)
  • References (pages 261-265)
  • Index (pages 267-269)
  • Back Matter (pages 277-280)

About this book

Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea to use, as is often done in practice, a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of the stochastic optimization algorithms commonly used in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
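To make the second of these objects concrete, here is a minimal sketch (not taken from the book) of the Gibbs-measure idea behind a Gibbs estimator: a finite family of candidate predictors is weighted in proportion to exp(-beta × empirical risk), a Gibbs measure with inverse temperature beta, and predictions are aggregated under those weights. The function names and toy data below are illustrative assumptions, not the author's notation.

```python
import math

def gibbs_weights(empirical_risks, beta):
    """Gibbs measure over candidates: weight_i proportional to exp(-beta * risk_i)."""
    # Subtract the minimum risk before exponentiating, for numerical stability.
    m = min(empirical_risks)
    unnorm = [math.exp(-beta * (r - m)) for r in empirical_risks]
    z = sum(unnorm)  # normalizing constant (partition function)
    return [w / z for w in unnorm]

def gibbs_predict(candidate_preds, empirical_risks, beta):
    """Aggregate candidate predictions under the Gibbs weights."""
    weights = gibbs_weights(empirical_risks, beta)
    return sum(w * p for w, p in zip(weights, candidate_preds))

# Toy usage: three candidate models with empirical risks 0.1, 0.5, 0.9.
risks = [0.1, 0.5, 0.9]
preds = [1.0, 2.0, 3.0]
print(gibbs_weights(risks, beta=0.0))    # beta = 0: uniform weights over candidates
print(gibbs_weights(risks, beta=100.0))  # large beta: mass concentrates on the lowest risk
print(gibbs_predict(preds, risks, beta=0.0))
```

Varying beta interpolates between plain averaging (beta = 0) and empirical risk minimization (beta large), which is one way to see why Gibbs measures are a natural tool for the adaptive model selection the book studies.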

Reviews

From the reviews:

"This book is based on a course of lectures given by the author on a circle of ideas lying at the interface of information theory, statistical learning theory and statistical inference. … The book is perhaps the first ever compendium of this circle of ideas and will be a valuable resource for researchers in information theory, statistical learning theory and statistical inference." (Vivek S. Borkar, Mathematical Reviews, Issue 2006d)
