
Information Theory and Statistical Learning

Emmert-Streib, Frank, Dehmer, Matthias (Eds.)

2009, X, 439 p.

Available Formats:

eBook
$119.00 (net) price for USA
ISBN 978-0-387-84816-7
Digitally watermarked, no DRM
Included format: PDF
Download immediately after purchase

Hardcover
$149.00 (net) price for USA
ISBN 978-0-387-84815-0
Free shipping for individuals worldwide
Usually dispatched within 3 to 5 business days

Softcover
$149.00 (net) price for USA
ISBN 978-1-4419-4650-8
Free shipping for individuals worldwide
Usually dispatched within 3 to 5 business days

  • Combines information theory and statistical learning components in one volume
  • Many chapters are contributed by authors who pioneered the presented methods themselves
  • Interdisciplinary approach makes this book accessible to researchers and professionals in many areas of study

Information Theory and Statistical Learning presents theoretical and practical results on information-theoretic methods used in the context of statistical learning.

The book presents a comprehensive overview of the broad range of methods that have been developed in a multitude of contexts, with each chapter written by an expert in the field. It is intended for an interdisciplinary readership working in machine learning, applied statistics, artificial intelligence, biostatistics, computational biology, bioinformatics, web mining, or related disciplines.

Advance Praise for Information Theory and Statistical Learning:

"A new epoch has arrived for information sciences to integrate various disciplines such as information theory, machine learning, statistical inference, data mining, model selection etc. I am enthusiastic about recommending the present book to researchers and students, because it summarizes most of these new emerging subjects and methods, which are otherwise scattered in many places."

-- Shun-ichi Amari, RIKEN Brain Science Institute, Professor Emeritus at the University of Tokyo

Content Level » Research

Keywords » algorithms - combinatorial optimization - data compression - information - information theory - kernel method - machine learning - optimization - stability

Related subjects » Artificial Intelligence - Robotics - Signals & Communication - Theoretical Computer Science

Table of contents 

  • Algorithmic Probability: Theory and Applications
  • Model Selection and Testing by the MDL Principle
  • Normalized Information Distance
  • The Application of Data Compression-Based Distances to Biological Sequences
  • MIC: Mutual Information Based Hierarchical Clustering
  • A Hybrid Genetic Algorithm for Feature Selection Based on Mutual Information
  • Information Approach to Blind Source Separation and Deconvolution
  • Causality in Time Series: Its Detection and Quantification by Means of Information Theory
  • Information Theoretic Learning and Kernel Methods
  • Information-Theoretic Causal Power
  • Information Flows in Complex Networks
  • Models of Information Processing in the Sensorimotor Loop
  • Information Divergence Geometry and the Application to Statistical Machine Learning
  • Model Selection and Information Criterion
  • Extreme Physical Information as a Principle of Universal Stability
  • Entropy and Cloning Methods for Combinatorial Optimization, Sampling and Counting Using the Gibbs Sampler
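To give a flavor of the methods covered, the normalized information distance (and its compression-based application to sequences) is commonly approximated in practice by the normalized compression distance, which substitutes a real compressor for the uncomputable Kolmogorov complexity. The sketch below is an illustration of that standard approximation using Python's zlib, not an implementation taken from the book's chapters:

```python
import zlib


def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a practical stand-in for the
    normalized information distance, with zlib as the compressor.

    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    where C(s) is the compressed length of s.
    """
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)


# Similar sequences compress well when concatenated, so their NCD is small.
a = b"the quick brown fox jumps over the lazy dog " * 20
b_ = b"the quick brown fox jumps over the lazy cat " * 20
c = b"colorless green ideas sleep furiously tonight " * 20

print(ncd(a, b_) < ncd(a, c))  # → True: a is closer to b_ than to c
```

Because no real compressor is ideal, NCD values are only approximately in [0, 1], but the relative ordering of distances is what clustering and classification methods built on it actually use.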
