
Neural Networks: Tricks of the Trade

Montavon, Grégoire, Orr, Geneviève, Müller, Klaus-Robert (Eds.)

2nd ed. 2012, XII, 769 pages, 223 illustrations.

Available Formats:

Springer eBooks may be purchased by end customers only and are sold without copy protection (DRM-free). Instead, all eBooks include personalized watermarks, so you can read them across numerous devices such as laptops, e-readers, and tablets.

You can pay for Springer eBooks with Visa, Mastercard, American Express, or PayPal.

After the purchase, you can directly download the eBook file or read it online in our Springer eBook Reader. Your eBook will also be stored in your MySpringer account, so you can re-download it at any time.



ISBN 978-3-642-35289-8

digitally watermarked, no DRM

Included formats: PDF and EPUB

download immediately after purchase




Softcover (also known as softback) version.

You can pay for Springer books with Visa, Mastercard, American Express, or PayPal.

Standard shipping is free of charge for individual customers.



ISBN 978-3-642-35288-1

free shipping for individuals worldwide

usually dispatched within 3 to 5 business days


  • The second edition "reloads" the first edition with more tricks
  • Provides a timely snapshot of tricks, theory, and algorithms of practical use

The last twenty years have been marked by an increase in available data and computing power. In parallel with this trend, the focus of neural network research and the practice of training neural networks have undergone a number of important changes, for example, the adoption of deep learning machines.

The second edition of the book augments the first edition with more tricks, which have resulted from 14 years of theory and experimentation by some of the world's most prominent neural network researchers. These tricks can make a substantial difference (in terms of speed, ease of implementation, and accuracy) when it comes to putting algorithms to work on real problems.
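To give a flavor of the kind of trick the book collects (for instance, the chapter "Early Stopping — But When?"), here is a minimal, hypothetical sketch of early stopping; it is not code from the book, and the callables `train_step` and `val_loss` are illustrative names.

```python
# Illustrative early-stopping sketch (not code from the book): halt training
# when the validation loss has not improved for `patience` consecutive epochs.
# `train_step` and `val_loss` are hypothetical callables supplied by the user.

def fit_with_early_stopping(train_step, val_loss, max_epochs=100, patience=5):
    best_loss = float("inf")
    best_epoch = 0
    for epoch in range(max_epochs):
        train_step()                      # one epoch of training
        loss = val_loss()                 # loss on held-out validation data
        if loss < best_loss:
            best_loss, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            break                         # validation loss stalled: stop
    return best_epoch, best_loss
```

The point of the trick is that the model saved at `best_epoch`, not the final one, is the one expected to generalize best.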

Content Level » Research

Keywords » back-propagation - graphics processing unit - multilayer perceptron - neural reinforcement learning - optimization

Related subjects » Artificial Intelligence - Complexity - Database Management & Information Retrieval - Image Processing - Theoretical Computer Science

Table of contents 

Introduction
Preface on Speeding Learning
1. Efficient BackProp
Preface on Regularization Techniques to Improve Generalization
2. Early Stopping — But When?
3. A Simple Trick for Estimating the Weight Decay Parameter
4. Controlling the Hyperparameter Search in MacKay's Bayesian Neural Network Framework
5. Adaptive Regularization in Neural Network Modeling
6. Large Ensemble Averaging
Preface on Improving Network Models and Algorithmic Tricks
7. Square Unit Augmented, Radially Extended, Multilayer Perceptrons
8. A Dozen Tricks with Multitask Learning
9. Solving the Ill-Conditioning in Neural Network Learning
10. Centering Neural Network Gradient Factors
11. Avoiding Roundoff Error in Backpropagating Derivatives
12. Transformation Invariance in Pattern Recognition – Tangent Distance and Tangent Propagation
13. Combining Neural Networks and Context-Driven Search for On-line, Printed Handwriting Recognition in the Newton
14. Neural Network Classification and Prior Class Probabilities
15. Applying Divide and Conquer to Large Scale Pattern Recognition Tasks
Preface on Tricks for Time Series
16. Forecasting the Economy with Neural Nets: A Survey of Challenges and Solutions
17. How to Train Neural Networks
Preface on Big Learning in Deep Neural Networks
18. Stochastic Gradient Descent Tricks
19. Practical Recommendations for Gradient-Based Training of Deep Architectures
20. Training Deep and Recurrent Networks with Hessian-Free Optimization
21. Implementing Neural Networks Efficiently
Preface on Better Representations: Invariant, Disentangled and Reusable
22. Learning Feature Representations with K-Means
23. Deep Big Multilayer Perceptrons for Digit Recognition
24. A Practical Guide to Training Restricted Boltzmann Machines
25. Deep Boltzmann Machines and the Centering Trick
26. Deep Learning via Semi-supervised Embedding
Preface on Identifying Dynamical Systems for Forecasting and Control
27. A Practical Guide to Applying Echo State Networks
28. Forecasting with Recurrent Neural Networks: 12 Tricks
29. Solving Partially Observable Reinforcement Learning Problems with Recurrent Neural Networks
30. 10 Steps and Some Tricks to Set up Neural Reinforcement Controllers
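As one concrete example of the material in the "Big Learning" part (the chapter "Stochastic Gradient Descent Tricks"), a commonly cited trick from that literature is a 1/t learning-rate decay. The sketch below is an illustration under my own assumptions, not code from the book; the hyperparameter names `eta0` and `lam` are hypothetical.

```python
# Illustrative SGD sketch (not from the book): the learning rate decays as
# eta_t = eta0 / (1 + eta0 * lam * t), a schedule often recommended in the
# SGD-tricks literature. Fits a 1-D linear model y ~ w * x by squared loss.

def sgd_fit(samples, w0=0.0, eta0=0.1, lam=0.01):
    w = w0
    for t, (x, y) in enumerate(samples):
        eta = eta0 / (1.0 + eta0 * lam * t)   # decaying step size
        grad = (w * x - y) * x                # gradient of 0.5 * (w*x - y)**2
        w -= eta * grad
    return w
```

On synthetic data generated from y = 2x, the fitted weight converges to 2; the decay keeps early steps large and late steps small, which is the essence of the trick.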
