Second-Order Methods for Neural Networks

Fast and Reliable Training Methods for Multi-Layer Perceptrons

  • Book
  • © 1997

Overview

Part of the book series: Perspectives in Neural Computing (PERSPECT.NEURAL)

Table of contents (6 chapters)

About this book

This book is about training methods - in particular, fast second-order training methods - for multi-layer perceptrons (MLPs). MLPs (also known as feed-forward neural networks) are the most widely used class of neural network. Over the past decade MLPs have achieved increasing popularity among scientists, engineers and other professionals as tools for tackling a wide variety of information-processing tasks. In common with all neural networks, MLPs are trained (rather than programmed) to carry out the chosen information-processing function. Unfortunately, the 'traditional' method for training MLPs - the well-known backpropagation method - is notoriously slow and unreliable when applied to many practical tasks. The development of fast and reliable training algorithms for MLPs is one of the most important areas of research within the entire field of neural computing.

The main purpose of this book is to bring to a wider audience a range of alternative methods for training MLPs, methods which have proved orders of magnitude faster than backpropagation when applied to many training tasks. The book also addresses the well-known 'local minima' problem, and explains ways in which fast training methods can be combined with strategies for avoiding (or escaping from) local minima. All the methods described in this book have a strong theoretical foundation, drawing on such diverse mathematical fields as classical optimisation theory, homotopic theory and stochastic approximation theory.
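The contrast the book draws between first-order and second-order training can be seen in a minimal sketch (not taken from the book; the toy least-squares problem, variable names, step size and step counts below are invented for illustration): gradient descent, the update underlying backpropagation, uses only the gradient and takes many small steps, while a Newton step also uses second-derivative (Hessian) information and, on a quadratic loss, reaches the minimum in a single step.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem: fit weights w so that X @ w approximates y.
X = rng.normal(size=(100, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=100)

def loss(w):
    r = X @ w - y                  # residuals
    return 0.5 * float(r @ r)      # sum-of-squares error

def gradient(w):
    return X.T @ (X @ w - y)       # first-order information

hessian = X.T @ X                  # second-order information (constant for this quadratic loss)

# First-order update (the kind of step backpropagation performs):
# many small moves along the negative gradient.
w_gd = np.zeros(3)
for _ in range(20):
    w_gd -= 0.001 * gradient(w_gd)

# Second-order (Newton) update: solve H dw = -g; one step suffices here.
w_newton = np.zeros(3)
w_newton += np.linalg.solve(hessian, -gradient(w_newton))

print(f"loss after 20 gradient-descent steps: {loss(w_gd):.6f}")
print(f"loss after 1 Newton step:             {loss(w_newton):.6f}")

For a real MLP the loss is not quadratic and the exact Hessian is expensive, which is why the methods the book surveys (conjugate-gradient, quasi-Newton, Levenberg-Marquardt and related schemes) approximate or avoid forming it; the sketch only illustrates why curvature information can cut the step count by orders of magnitude.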

Authors and Affiliations

  • Department of Biochemistry and Molecular Biology, University College London, London, UK

    Adrian J. Shepherd

Bibliographic Information

  • Book Title: Second-Order Methods for Neural Networks

  • Book Subtitle: Fast and Reliable Training Methods for Multi-Layer Perceptrons

  • Authors: Adrian J. Shepherd

  • Series Title: Perspectives in Neural Computing

  • DOI: https://doi.org/10.1007/978-1-4471-0953-2

  • Publisher: Springer London

  • eBook Packages: Springer Book Archive

  • Copyright Information: Springer-Verlag London 1997

  • Softcover ISBN: 978-3-540-76100-6 (published 28 April 1997)

  • eBook ISBN: 978-1-4471-0953-2 (published 06 December 2012)

  • Series ISSN: 1431-6854

  • Edition Number: 1

  • Number of Pages: XIV, 145

  • Number of Illustrations: 30 b/w illustrations

  • Topics: Artificial Intelligence, Special Purpose and Application-Based Systems
