
Novelty, Information and Surprise

  • Book
  • © 2022

Overview

  • Provides definitions of useful new concepts: description, novelty, surprise, template
  • Discusses new viewpoints on information theory in relation to the natural sciences
  • Demonstrates a method of analyzing neuronal spike trains (burst surprise)

Part of the book series: Information Science and Statistics (ISS)

Table of contents (18 chapters)

  1. Surprise and Information of Descriptions

  2. Coding and Information Transmission

  3. Information Rate and Channel Capacity

  4. Repertoires and Covers

  5. Information, Novelty and Surprise in Science

About this book

This revised edition offers an approach to information theory that is more general than the classical approach of Shannon. Classically, information is defined for an alphabet of symbols or for a set of mutually exclusive propositions (a partition of the probability space Ω) whose probabilities add up to 1. The new definition is given for an arbitrary cover of Ω, i.e. for a set of possibly overlapping propositions. The generalized information concept is called novelty, and it is accompanied by two derived concepts, designated information and surprise, which describe "opposite" versions of novelty: information is related more closely to classical information theory, while surprise is related more closely to the classical concept of statistical significance. In the discussion of these three concepts and their interrelations, several properties or classes of covers are defined, which turn out to be lattices. The book also presents applications of these concepts, mostly in statistics and in neuroscience.
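
To make the contrast concrete, here is a minimal, purely illustrative Python sketch (not taken from the book): it computes the classical expected surprisal, i.e. the Shannon entropy, for a partition, and an analogous cover-based quantity for overlapping propositions. The aggregation rule used here (charging each outcome the largest -log2 P(A) over the propositions A of the cover containing it) and the toy distribution are assumptions made for illustration only; the book's precise definitions of novelty, information and surprise may differ.

  import math

  def surprisal_bits(p):
      # Classical surprisal of an event with probability p, in bits.
      return -math.log2(p)

  def expected_cover_novelty(prob, cover):
      # prob: dict mapping each outcome to its probability.
      # cover: list of sets of outcomes; sets may overlap (a partition is a special case).
      # ASSUMPTION (illustrative, not the book's formula): each outcome is charged the
      # largest surprisal among the cover elements containing it, then averaged.
      total = 0.0
      for omega, p_omega in prob.items():
          containing = [A for A in cover if omega in A]
          if not containing:
              raise ValueError(f"no set in the cover contains {omega!r}")
          total += p_omega * max(surprisal_bits(sum(prob[x] for x in A)) for A in containing)
      return total

  prob = {"a": 0.25, "b": 0.25, "c": 0.5}
  partition = [{"a"}, {"b"}, {"c"}]        # mutually exclusive propositions
  cover = [{"a", "b"}, {"b", "c"}]         # overlapping propositions

  print(expected_cover_novelty(prob, partition))  # 1.5 bits = Shannon entropy of prob
  print(expected_cover_novelty(prob, cover))      # ~0.71 bits for the overlapping cover

For a partition, the quantity reduces to the Shannon entropy, which is the sense in which a cover-based notion can generalize the classical one.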

Authors and Affiliations

  • Neural Information Processing, University of Ulm, Ulm, Germany

    Günther Palm

About the author

Günther Palm studied mathematics in Hamburg and Tübingen. After completing his studies in mathematics (Master in 1974, Ph.D. in 1975) he worked on nonlinear systems, associative memory and brain theory at the MPI for Biological Cybernetics in Tübingen. In 1983/84 he was a Fellow at the Wissenschaftskolleg in Berlin. From 1988 to 1991 he was professor of theoretical brain research at the University of Düsseldorf. Since then he has served as a professor of computer science and Director of the Institute of Neural Information Processing at the University of Ulm, where his focus is on information theory, pattern recognition, neural networks, and brain modelling.
