Neural Networks: Tricks of the Trade

Book, © 2012

  • The second edition of the book "reloads" the first edition with more tricks
  • Provides a timely snapshot of tricks, theory, and algorithms that are of use

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 7700)

Part of the book sub series: Theoretical Computer Science and General Issues (LNTCS)

Table of contents (39 chapters)

  1. Front Matter

  2. Introduction

    1. Introduction

      • Klaus-Robert Müller
      Pages 1-5
  3. Speeding Learning

    1. Speeding Learning

      • Klaus-Robert Müller
      Pages 7-8
    2. Efficient BackProp

      • Yann A. LeCun, Léon Bottou, Genevieve B. Orr, Klaus-Robert Müller
      Pages 9-48
  4. Regularization Techniques to Improve Generalization

    1. Regularization Techniques to Improve Generalization

      • Klaus-Robert Müller
      Pages 49-51
    2. Early Stopping — But When?

      • Lutz Prechelt
      Pages 53-67
    3. A Simple Trick for Estimating the Weight Decay Parameter

      • Thorsteinn S. Rögnvaldsson
      Pages 69-89
    4. Adaptive Regularization in Neural Network Modeling

      • Jan Larsen, Claus Svarer, Lars Nonboe Andersen, Lars Kai Hansen
      Pages 111-130
    5. Large Ensemble Averaging

      • David Horn, Ury Naftaly, Nathan Intrator
      Pages 131-137
  5. Improving Network Models and Algorithmic Tricks

    1. Improving Network Models and Algorithmic Tricks

      • Klaus-Robert Müller
      Pages 139-141
    2. A Dozen Tricks with Multitask Learning

      • Rich Caruana
      Pages 163-189
    3. Solving the Ill-Conditioning in Neural Network Learning

      • Patrick van der Smagt, Gerd Hirzinger
      Pages 191-203
    4. Centering Neural Network Gradient Factors

      • Nicol N. Schraudolph
      Pages 205-223
  6. Representing and Incorporating Prior Knowledge in Neural Network Training

    1. Transformation Invariance in Pattern Recognition – Tangent Distance and Tangent Propagation

      • Patrice Y. Simard, Yann A. LeCun, John S. Denker, Bernard Victorri
      Pages 235-269
    2. Neural Network Classification and Prior Class Probabilities

      • Steve Lawrence, Ian Burns, Andrew Back, Ah Chung Tsoi, C. Lee Giles
      Pages 295-309

About this book

The last twenty years have been marked by an increase in available data and computing power. In parallel with this trend, the focus of neural network research and the practice of training neural networks have undergone a number of important changes, for example, the use of deep learning machines.

The second edition of the book augments the first edition with more tricks, which have resulted from 14 years of theory and experimentation by some of the world's most prominent neural network researchers. These tricks can make a substantial difference (in terms of speed, ease of implementation, and accuracy) when it comes to putting algorithms to work on real problems.

Editors and Affiliations

  • Dept. of Computer Science, Technische Universität Berlin, Berlin, Germany

    Grégoire Montavon, Klaus-Robert Müller

  • Dept. of Computer Science, Willamette University, Salem, USA

    Geneviève B. Orr
