
Neural Networks and Analog Computation

Beyond the Turing Limit

Siegelmann, Hava

1999, XIV, 181 p.

A product of Birkhäuser Basel
Available Formats:
eBook
$89.99

(net) price for USA

ISBN 978-1-4612-0707-8

digitally watermarked, no DRM

Included Format: PDF

download immediately after purchase



Hardcover
$169.00

(net) price for USA

ISBN 978-0-8176-3949-5

free shipping for individuals worldwide

usually dispatched within 3 to 5 business days



Softcover
$119.00

(net) price for USA

ISBN 978-1-4612-6875-8

free shipping for individuals worldwide

usually dispatched within 3 to 5 business days



The theoretical foundations of Neural Networks and Analog Computation conceptualize neural networks as a particular type of computer consisting of multiple assemblies of basic processors interconnected in an intricate structure. Examining these networks under various resource constraints reveals a continuum of computational devices, several of which coincide with well-known classical models. What emerges is a Church-Turing-like thesis, applied to the field of analog computation, which features the neural network model in place of the digital Turing machine. This new concept can serve as a point of departure for the development of alternative, supra-Turing computational theories. On a mathematical level, the treatment of neural computations not only enriches the theory of computation but also explicates the computational complexity associated with biological networks, adaptive engineering tools, and related models from the fields of control theory and nonlinear dynamics.
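To make the "network as computer" picture concrete, the model at the heart of this theory is a discrete-time recurrent net whose processors synchronously apply a saturated-linear activation to an affine combination of the current states and external inputs. The following sketch is illustrative only; the weights, dimensions, and function names are assumptions of the example, not taken from the book.

```python
import numpy as np

def sigma(x):
    """Saturated-linear activation: identity on [0, 1], clamped outside it."""
    return np.clip(x, 0.0, 1.0)

def step(x, u, A, B, c):
    """One synchronous update of all processors: x' = sigma(A x + B u + c)."""
    return sigma(A @ x + B @ u + c)

# Toy run: a two-processor network driven by a constant one-bit input.
A = np.array([[0.5, 0.25],
              [0.0, 0.5]])      # processor-to-processor weights (illustrative)
B = np.array([[1.0],
              [0.0]])           # input weights (illustrative)
c = np.array([0.0, 0.1])        # biases (illustrative)

x = np.zeros(2)                 # initial state
for _ in range(3):
    x = step(x, np.array([1.0]), A, B, c)
print(x)                        # state after three update steps
```

The computational power of such a net then tracks the arithmetic class of its weights (integer, rational, or real), which is the hierarchy developed in the chapters listed below.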

The topics covered in this work will appeal to a wide readership from a variety of disciplines. Special care has been taken to explain the theory clearly and concisely. The first chapter reviews the fundamental terms of modern computational theory from the point of view of neural networks and serves as a reference for the remainder of the book. Each of the subsequent chapters opens with introductory material and proceeds to explain the chapter’s connection to the development of the theory; thereafter, the relevant concepts are defined in mathematical terms.

Although the notion of a neural network essentially arises from biology, many engineering applications have been found through highly idealized and simplified models of neuron behavior. Particular areas of application have been as diverse as explosives detection in airport security, signature verification, financial and medical time series prediction, vision, speech processing, robotics, nonlinear control, and signal processing. The focus in all of these models is entirely on the behavior of networks as computers.

The material in this book will be of interest to researchers in a variety of engineering and applied science disciplines. In addition, the work may serve as the basis for a graduate-level seminar in neural networks for computer science students.

Content Level » Research

Keywords » Nature - Theory - complexity - computer science - development - model - robot - robotics - science - simulation

Related subjects » Birkhäuser Computer Science - Birkhäuser Engineering - Birkhäuser Mathematics

Table of contents 

1 Computational Complexity
1.1 Neural Networks
1.2 Automata: A General Introduction
1.2.1 Input Sets in Computability Theory
1.3 Finite Automata
1.3.1 Neural Networks and Finite Automata
1.4 The Turing Machine
1.4.1 Neural Networks and Turing Machines
1.5 Probabilistic Turing Machines
1.5.1 Neural Networks and Probabilistic Machines
1.6 Nondeterministic Turing Machines
1.6.1 Nondeterministic Neural Networks
1.7 Oracle Turing Machines
1.7.1 Neural Networks and Oracle Machines
1.8 Advice Turing Machines
1.8.1 Circuit Families
1.8.2 Neural Networks and Advice Machines
1.9 Notes
2 The Model
2.1 Variants of the Network
2.1.1 A “System Diagram” Interpretation
2.2 The Network’s Computation
2.3 Integer Weights
3 Networks with Rational Weights
3.1 The Turing Equivalence Theorem
3.2 Highlights of the Proof
3.2.1 Cantor-like Encoding of Stacks
3.2.2 Stack Operations
3.2.3 General Construction of the Network
3.3 The Simulation
3.3.1 P-Stack Machines
3.4 Network with Four Layers
3.4.1 A Layout Of The Construction
3.5 Real-Time Simulation
3.5.1 Computing in Two Layers
3.5.2 Removing the Sigmoid From the Main Layer
3.5.3 One Layer Network Simulates TM
3.6 Inputs and Outputs
3.7 Universal Network
3.8 Nondeterministic Computation
4 Networks with Real Weights
4.1 Simulating Circuit Families
4.1.1 The Circuit Encoding
4.1.2 A Circuit Retrieval
4.1.3 Circuit Simulation By a Network
4.1.4 The Combined Network
4.2 Networks Simulation by Circuits
4.2.1 Linear Precision Suffices
4.2.2 The Network Simulation by a Circuit
4.3 Networks versus Threshold Circuits
4.4 Corollaries
5 Kolmogorov Weights: Between P and P/poly
5.1 Kolmogorov Complexity and Reals
5.2 Tally Oracles and Neural Networks
5.3 Kolmogorov Weights and Advice Classes
5.4 The Hierarchy Theorem
6 Space and Precision
6.1 Equivalence of Space and Precision
6.2 Fixed Precision Variable Sized Nets
7 Universality of Sigmoidal Networks
7.1 Alarm Clock Machines
7.1.1 Adder Machines
7.1.2 Alarm Clock and Adder Machines
7.2 Restless Counters
7.3 Sigmoidal Networks are Universal
7.3.1 Correctness of the Simulation
7.4 Conclusions
8 Different-limits Networks
8.1 At Least Finite Automata
8.2 Proof of the Interpolation Lemma
9 Stochastic Dynamics
9.1 Stochastic Networks
9.1.1 The Model
9.2 The Main Results
9.2.1 Integer Networks
9.2.2 Rational Networks
9.2.3 Real Networks
9.3 Integer Stochastic Networks
9.4 Rational Stochastic Networks
9.4.1 Rational Set of Choices
9.4.2 Real Set of Choices
9.5 Real Stochastic Networks
9.6 Unreliable Networks
9.7 Nondeterministic Stochastic Networks
10 Generalized Processor Networks
10.1 Generalized Networks: Definition
10.2 Bounded Precision
10.3 Equivalence with Neural Networks
10.4 Robustness
11 Analog Computation
11.1 Discrete Time Models
11.2 Continuous Time Models
11.3 Hybrid Models
11.4 Dissipative Models
12 Computation Beyond the Turing Limit
12.1 The Analog Shift Map
12.2 Analog Shift and Computation
12.3 Physical Relevance
12.4 Conclusions
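As an aside on Chapter 3 (Sections 3.2.1 and 3.2.2), the Turing-equivalence proof rests on holding an entire binary stack in a single processor via a Cantor-like base-4 encoding, so that push, pop, and top reduce to affine maps composed with the saturated-linear activation. The sketch below follows the commonly cited version of that construction; the function names and the exact-arithmetic choice are this example's assumptions, not the book's notation.

```python
from fractions import Fraction

def sigma(x):
    """Saturated-linear activation: clamp to the interval [0, 1]."""
    return min(max(x, Fraction(0)), Fraction(1))

def encode(bits):
    """Encode a binary stack (top of stack first) as sum_i (2*b_i + 1) / 4**i."""
    q = Fraction(0)
    for b in reversed(bits):           # build from the bottom of the stack up
        q = q / 4 + Fraction(2 * b + 1, 4)
    return q

def top(q):
    """Read the top bit: sigma(4q - 2) is 1 iff the leading base-4 digit is 3."""
    return sigma(4 * q - 2)

def push(q, b):
    """Push bit b: shift digits right and prepend the digit 2b + 1."""
    return q / 4 + Fraction(2 * b + 1, 4)

def pop(q):
    """Pop the top bit: remove the leading base-4 digit."""
    return 4 * q - (2 * top(q) + 1)

s = encode([1, 0, 1])                  # stack with 1 on top
assert top(s) == 1
assert pop(push(s, 0)) == s            # push then pop restores the encoding
```

Because the base-4 digits used are 1 and 3 rather than 0 and 1, non-empty stack encodings stay bounded away from digit boundaries, which is what lets clamped-linear processors read the top bit reliably.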
