École d'Été de Probabilités de Saint-Flour XXXI - 2001
Series: Lecture Notes in Mathematics, Vol. 1851
Catoni, Olivier
Picard, Jean (Ed.)
2004, VIII, 284 p.
ISBN 978-3-540-44507-4
digitally watermarked, no DRM
The eBook version of this title will be available soon
Softcover (also known as softback) version.
ISBN 978-3-540-22572-0
usually dispatched within 3 to 5 business days
Statistical learning theory aims to analyze complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea to use, as is often done in practice, a notoriously "wrong" (i.e., over-simplified) model to predict, estimate, or classify. This point of view has its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of the stochastic optimization algorithms commonly used to compute estimators. The author focuses on non-asymptotic bounds on the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
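The Gibbs measures mentioned above can be illustrated with a toy computation. A minimal sketch, assuming a finite model class: weights proportional to exp(-beta * empirical risk) tilt a uniform prior toward low-risk models; the risks and the inverse temperature beta below are invented for illustration, not taken from the book.

```python
import math

def gibbs_weights(empirical_risks, beta):
    """Gibbs measure over a finite model class: weight_i is proportional
    to exp(-beta * r_i), tilting a uniform prior toward low empirical risk."""
    logits = [-beta * r for r in empirical_risks]
    m = max(logits)                        # subtract the max to stabilize exp
    unnorm = [math.exp(l - m) for l in logits]
    z = sum(unnorm)
    return [u / z for u in unnorm]

risks = [0.40, 0.25, 0.10, 0.35]           # made-up empirical risks
for beta in (0.0, 5.0, 50.0):
    w = gibbs_weights(risks, beta)
    print(beta, [round(x, 3) for x in w])
```

With beta = 0 the measure is the uniform prior; as beta grows it concentrates on the empirical risk minimizer (here index 2), interpolating between prior knowledge and fitting the data.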
Content Level » Research
Keywords » Estimator - Measure - Probability theory - algorithms - complexity - information theory - learning - learning theory - optimization
Related subjects » Applications - Artificial Intelligence - Computational Science & Engineering - Mathematics - Probability Theory and Stochastic Processes - Statistical Theory and Methods