
Information Science and Statistics

Information Theoretic Learning

Renyi's Entropy and Kernel Perspectives

Authors: Principe, Jose C.

Buy this book

eBook $139.00
price for USA (gross)
  • ISBN 978-1-4419-1570-2
  • Digitally watermarked, DRM-free
  • Included format: EPUB, PDF
  • eBooks can be used on all reading devices
  • Immediate eBook download after purchase
Hardcover $179.00
price for USA
  • ISBN 978-1-4419-1569-6
  • Free shipping for individuals worldwide
  • Usually dispatched within 3 to 5 business days.
Softcover $179.00
price for USA
  • ISBN 978-1-4614-2585-4
  • Free shipping for individuals worldwide
  • Usually dispatched within 3 to 5 business days.
About this book

This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms for adapting linear or nonlinear learning machines in both supervised and unsupervised paradigms. ITL is a framework in which the conventional concepts of second-order statistics (covariance, L2 distances, correlation functions) are replaced by scalars and functions with information-theoretic underpinnings: entropy, mutual information, and correntropy, respectively.
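To make that substitution concrete, the sketch below (not taken from the book) shows one common way correntropy is estimated from paired samples as the average of a Gaussian kernel applied to their differences; the function name and the kernel width sigma are illustrative choices, not the book's notation.

    import numpy as np

    def correntropy(x, y, sigma=1.0):
        """Sample estimate of correntropy: the mean of a Gaussian kernel
        evaluated at the paired differences x_i - y_i. It plays the role
        that correlation plays in second-order statistics."""
        e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
        return float(np.mean(np.exp(-e**2 / (2.0 * sigma**2))))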

ITL quantifies the stochastic structure of the data beyond second-order statistics, yielding improved performance without resorting to full-blown Bayesian approaches, which carry a much larger computational cost. This is possible because of a non-parametric estimator of Renyi's quadratic entropy that depends only on pairwise differences between samples. The book compares the performance of ITL algorithms with their second-order counterparts in many engineering and machine learning applications.
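To illustrate what "a function of pairwise differences between samples" means in practice, here is a minimal sketch of a Parzen-window (Gaussian kernel) estimator of Renyi's quadratic entropy; the function name and the kernel width sigma are illustrative, not the book's notation.

    import numpy as np

    def renyi_quadratic_entropy(x, sigma=1.0):
        """Plug-in estimate of Renyi's quadratic entropy H2(X) = -log E[p(X)].
        The information potential is the average of a Gaussian kernel of
        bandwidth sigma*sqrt(2) (two sigma-kernels convolved) evaluated at
        every pairwise difference x_i - x_j."""
        x = np.asarray(x, dtype=float).reshape(-1)
        diffs = x[:, None] - x[None, :]                  # all pairwise differences
        s2 = 2.0 * sigma**2                              # variance of the convolved kernel
        ip = np.mean(np.exp(-diffs**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2))
        return -np.log(ip)                               # entropy = -log(information potential)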

Students, practitioners and researchers interested in statistical signal processing, computational intelligence, and machine learning will find in this book the theory to understand the basics, the algorithms to implement applications, and exciting but still unexplored leads that will provide fertile ground for future research.

About the authors

José C. Principe is Distinguished Professor of Electrical and Biomedical Engineering and BellSouth Professor at the University of Florida, and the Founder and Director of the Computational NeuroEngineering Laboratory. He is an IEEE and AIMBE Fellow, Past President of the International Neural Network Society, Past Editor-in-Chief of the IEEE Transactions on Biomedical Engineering, and the founding Editor-in-Chief of IEEE Reviews in Biomedical Engineering. He has written an interactive electronic book on Neural Networks, a book on Brain Machine Interface Engineering, and more recently a book on Kernel Adaptive Filtering, and was awarded the 2011 IEEE Neural Network Pioneer Award.

Reviews

From the book reviews:

“The book is remarkable in various ways in the information it presents on the concept and use of entropy functions and their applications in signal processing and solution of statistical problems such as M-estimation, classification, and clustering. Students of engineering and statistics will greatly benefit by reading it.” (C. R. Rao, Technometrics, Vol. 55 (1), February, 2013)


Table of contents (11 chapters)

  • Information Theory, Machine Learning, and Reproducing Kernel Hilbert Spaces

    Principe, José C.

    Pages 1-45

  • Renyi’s Entropy, Divergence and Their Nonparametric Estimators

    Xu, Dongxin (et al.)

    Pages 47-102

  • Adaptive Information Filtering with Error Entropy and Error Correntropy Criteria

    Erdogmus, Deniz (et al.)

    Pages 103-140

  • Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems

    Erdogmus, Deniz (et al.)

    Pages 141-179

  • Nonlinear Adaptive Filtering with MEE, MCC, and Applications

    Erdogmus, Deniz (et al.)

    Pages 181-218


Bibliographic Information

Book Title
Information Theoretic Learning
Book Subtitle
Renyi's Entropy and Kernel Perspectives
Authors
Principe, Jose C.
Series Title
Information Science and Statistics
Copyright
2010
Publisher
Springer-Verlag New York
Copyright Holder
Springer-Verlag New York
eBook ISBN
978-1-4419-1570-2
DOI
10.1007/978-1-4419-1570-2
Hardcover ISBN
978-1-4419-1569-6
Softcover ISBN
978-1-4614-2585-4
Series ISSN
1613-9011
Edition Number
1
Number of Pages
XIV, 448
Topics