  • Book
  • © 1991

Connectionist Approaches to Language Learning

Editors:

  • David S. Touretzky
Part of the book series: The Springer International Series in Engineering and Computer Science (SECS, volume 154)


Table of contents (6 chapters)

  1. Front Matter

    Pages i-iv
  2. Introduction

    • David S. Touretzky
    Pages 1-3
  3. Learning Automata from Ordered Examples

    • Sara Porat, Jerome A. Feldman
    Pages 5-34
  4. Graded State Machines: The Representation of Temporal Contingencies in Simple Recurrent Networks

    • David Servan-Schreiber, Axel Cleeremans, James L. McClelland
    Pages 57-89
  5. The Induction of Dynamical Recognizers

    • Jordan B. Pollack
    Pages 123-148
  6. Back Matter

    Pages 149-149

About this book

arise automatically as a result of the recursive structure of the task and the continuous nature of the SRN's state space. Elman also introduces a new graphical technique for studying network behavior based on principal components analysis. He shows that sentences with multiple levels of embedding produce state space trajectories with an intriguing self-similar structure.

The development and shape of a recurrent network's state space is the subject of Pollack's paper, the most provocative in this collection. Pollack looks more closely at a connectionist network as a continuous dynamical system. He describes a new type of machine learning phenomenon: induction by phase transition. He then shows that under certain conditions, the state space created by these machines can have a fractal or chaotic structure, with a potentially infinite number of states. This is graphically illustrated using a higher-order recurrent network trained to recognize various regular languages over binary strings. Finally, Pollack suggests that it might be possible to exploit the fractal dynamics of these systems to achieve a generative capacity beyond that of finite-state machines.
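
Two of the techniques mentioned above can be made concrete with a short sketch. The Python snippet below is illustrative only and is not code from the book: the state size, the random weights, and the helper names are assumptions. It iterates a second-order (higher-order) recurrent state update over binary strings, of the general kind Pollack studies, and then projects the visited states onto their first two principal components, in the spirit of Elman's graphical technique.

import numpy as np

rng = np.random.default_rng(0)

STATE_DIM = 4    # number of state units (illustrative choice)
ALPHABET = 2     # binary input symbols '0' and '1'

# Second-order weights W[i, j, k]: contribution of (state unit j, input symbol k)
# to the next value of state unit i. Random here; in practice they would be trained.
W = rng.normal(scale=1.5, size=(STATE_DIM, STATE_DIM, ALPHABET))
s0 = rng.uniform(size=STATE_DIM)   # arbitrary initial state

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def run(string, W, s0):
    # Iterate s <- sigmoid(W[:, :, symbol] @ s) over the string and
    # return the full state trajectory, one row per time step.
    s = s0.copy()
    trajectory = [s.copy()]
    for ch in string:
        s = sigmoid(W[:, :, int(ch)] @ s)
        trajectory.append(s.copy())
    return np.array(trajectory)

# State trajectories for a batch of random binary strings.
strings = ["".join(rng.choice(list("01"), size=12)) for _ in range(50)]
states = np.vstack([run(w, W, s0) for w in strings])

# PCA in the spirit of Elman's technique: project the visited states onto
# the two directions of greatest variance and inspect the trajectories.
centered = states - states.mean(axis=0)
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
projected = centered @ Vt[:2].T    # (number of states, 2) coordinates

print(projected[:5])

Plotting the two projected coordinates per string, with successive time steps joined by lines, is the kind of state space trajectory picture the blurb refers to; with trained weights such plots can reveal the clustered, self-similar, or fractal structure discussed in Elman's and Pollack's chapters.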

Editors and Affiliations

  • Carnegie Mellon University, USA

    David Touretzky
