Book · © 1995

Feed-Forward Neural Networks

Vector Decomposition Analysis, Modelling and Analog Implementation

Authors: Anne-Johan Annema

Part of the book series: The Springer International Series in Engineering and Computer Science (SECS, volume 314)


Table of contents (14 chapters)

  1. Front Matter

    Pages i-xiii
  2. Introduction

    • Anne-Johan Annema
    Pages 1-26
  3. The Vector Decomposition Method

    • Anne-Johan Annema
    Pages 27-37
  4. Dynamics of Single Layer Nets

    • Anne-Johan Annema
    Pages 39-56
  5. Cost Functions for Two-Layer Neural Networks

    • Anne-Johan Annema
    Pages 167-176
  6. Some issues for f’(x)

    • Anne-Johan Annema
    Pages 177-185
  7. Feed-forward hardware

    • Anne-Johan Annema
    Pages 187-214
  8. Analog weight adaptation hardware

    • Anne-Johan Annema
    Pages 215-228
  9. Conclusions

    • Anne-Johan Annema
    Pages 229-234
  10. Back Matter

    Pages 235-238

About this book

Feed-Forward Neural Networks: Vector Decomposition Analysis, Modelling and Analog Implementation presents a novel method for the mathematical analysis of neural networks that learn according to the back-propagation algorithm. The book also discusses some recent alternative algorithms for hardware-implemented, perceptron-like neural networks. The method permits a simple analysis of the learning behaviour of neural networks, so that specifications for their building blocks can be readily obtained.
Analog, hard-wired, feed-forward neural networks with on-chip back-propagation learning are designed in their entirety, starting with the derivation of a specification and ending with its hardware implementation. On-chip learning is necessary in circumstances where fixed weight configurations cannot be used. It also helps eliminate the effects of most mismatches and parameter tolerances that occur in hard-wired neural network chips.
Fully analog neural networks have several advantages over other implementations: low chip area, low power consumption, and high-speed operation.
Feed-Forward Neural Networks is an excellent source of reference and may be used as a text for advanced courses.
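For orientation, the following is a minimal sketch of the plain back-propagation algorithm referred to above, written for a two-layer feed-forward network with a sigmoid activation f(x). It illustrates the standard algorithm only, not the book's vector decomposition analysis or its analog implementation; the network sizes, learning rate, and XOR toy task are arbitrary choices made for this example.

```python
# Minimal back-propagation sketch for a two-layer feed-forward network.
# Illustration of the standard algorithm only; not the book's vector
# decomposition method. Sizes, learning rate and the XOR task are
# arbitrary example choices.
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """Sigmoid activation f(x)."""
    return 1.0 / (1.0 + np.exp(-x))

def f_prime(y):
    """Derivative f'(x) expressed in the activation value y = f(x)."""
    return y * (1.0 - y)

# Toy training set: XOR, a classic task that requires a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Weight matrices with an extra row for the bias input.
W1 = rng.normal(scale=0.5, size=(3, 4))   # 2 inputs + bias -> 4 hidden units
W2 = rng.normal(scale=0.5, size=(5, 1))   # 4 hidden + bias -> 1 output

eta = 0.5                                  # learning rate
for _ in range(10000):
    # Forward pass.
    X1 = np.hstack([X, np.ones((len(X), 1))])   # inputs with bias term
    H = f(X1 @ W1)                              # hidden activations
    H1 = np.hstack([H, np.ones((len(H), 1))])   # hidden outputs with bias
    Y = f(H1 @ W2)                              # network outputs

    # Backward pass for the quadratic cost E = 0.5 * sum((Y - T)**2).
    d_out = (Y - T) * f_prime(Y)                # output-layer error term
    d_hid = (d_out @ W2[:-1].T) * f_prime(H)    # error back-propagated to hidden layer

    # Gradient-descent weight updates.
    W2 -= eta * (H1.T @ d_out)
    W1 -= eta * (X1.T @ d_hid)

print(np.round(Y, 2))   # approaches the XOR targets [[0], [1], [1], [0]]
```

Note that each weight update depends on the derivative f'(x); practical issues with realising this derivative are treated in the chapter "Some issues for f’(x)".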

Authors and Affiliations

  • MESA Research Institute, University of Twente, Netherlands

    Anne-Johan Annema

Bibliographic Information

Book Title: Feed-Forward Neural Networks
Book Subtitle: Vector Decomposition Analysis, Modelling and Analog Implementation
Authors: Anne-Johan Annema
Series Title: The Springer International Series in Engineering and Computer Science (SECS, volume 314)
Copyright Information: © 1995
Number of Pages: XIII, 238
