
Image Analysis, Random Fields and Dynamic Monte Carlo Methods

A Mathematical Introduction

Winkler, Gerhard

Softcover reprint of the original 1st ed. 1995, XIV, 324 pp. 59 figs.

Available Formats:
eBook

Springer eBooks may be purchased by end-customers only and are sold without copy protection (DRM free). Instead, all eBooks include personalized watermarks. This means you can read Springer eBooks across numerous devices such as laptops, eReaders, and tablets.

You can pay for Springer eBooks with Visa, Mastercard, American Express or PayPal.

After the purchase you can download the eBook file directly or read it online in our Springer eBook Reader. Furthermore, your eBook will be stored in your MySpringer account, so you can always re-download your eBooks.

 
$99.00

(net) price for USA

ISBN 978-3-642-97522-6

digitally watermarked, no DRM

Included Format: PDF

download immediately after purchase



Softcover

Softcover (also known as softback) version.

You can pay for Springer Books with Visa, Mastercard, American Express or PayPal.

Standard shipping is free of charge for individual customers.

 
$129.00

(net) price for USA

ISBN 978-3-642-97524-0

free shipping for individuals worldwide

usually dispatched within 3 to 5 business days



About this book

The book is mainly concerned with the mathematical foundations of Bayesian image analysis and its algorithms. This amounts to the study of Markov random fields and dynamic Monte Carlo algorithms such as sampling, simulated annealing and stochastic gradient algorithms. The approach is introductory and elementary: given basic concepts from linear algebra and real analysis, it is self-contained. No previous knowledge of image analysis is required. Knowledge of elementary probability theory and statistics is beneficial but not absolutely necessary. The necessary background from imaging is sketched and illustrated by a number of concrete applications such as restoration, texture segmentation and motion analysis.
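To give a flavour of the kind of algorithm this description refers to, the Python sketch below runs a single-site Gibbs sampler with a crude logarithmic cooling schedule to restore a binary image under an Ising-type Markov random field prior and Gaussian noise. It is a minimal illustrative example, not code from the book; the function name gibbs_restore, the parameters beta and sigma, and the cooling schedule are assumptions chosen only for the demo.

import numpy as np

rng = np.random.default_rng(0)

def gibbs_restore(y, beta=1.5, sigma=0.7, sweeps=30, anneal=True):
    # Restore a {-1, +1} image from noisy observations y by single-site Gibbs
    # sampling from the posterior; with anneal=True the temperature is lowered
    # each sweep (a crude logarithmic cooling schedule), i.e. simulated annealing.
    x = np.sign(y)
    x[x == 0] = 1.0           # start from the thresholded data
    H, W = y.shape
    for t in range(sweeps):
        T = 1.0 / np.log(2.0 + t) if anneal else 1.0
        for i in range(H):
            for j in range(W):
                # Sum over the 4-neighbourhood (the Markov property of the prior).
                nb = 0.0
                if i > 0:
                    nb += x[i - 1, j]
                if i < H - 1:
                    nb += x[i + 1, j]
                if j > 0:
                    nb += x[i, j - 1]
                if j < W - 1:
                    nb += x[i, j + 1]
                # Log posterior odds of x[i, j] = +1 versus -1
                # (Ising prior plus Gaussian noise model).
                log_odds = 2.0 * beta * nb + 2.0 * y[i, j] / sigma**2
                a = np.clip(log_odds / T, -30.0, 30.0)
                p_plus = 1.0 / (1.0 + np.exp(-a))
                x[i, j] = 1.0 if rng.random() < p_plus else -1.0
    return x

# Toy usage: a bright square on a dark background, corrupted by Gaussian noise.
truth = -np.ones((32, 32))
truth[8:24, 8:24] = 1.0
noisy = truth + 0.7 * rng.standard_normal(truth.shape)
restored = gibbs_restore(noisy)
print("pixel agreement with truth:", np.mean(restored == truth))

With anneal=True the sweeps drive the chain towards an approximate MAP restoration; with anneal=False the same loop simply samples from the posterior at temperature one. The book develops the theory behind both uses rigorously.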

Content Level » Research

Keywords » Markov Random Field - Monte Carlo - Monte Carlo method - Monte Carlo Methods - Probability theory - algorithms - image analysis - imaging - statistics

Related subjects » Image Processing - Physical & Information Science - Probability Theory and Stochastic Processes - Radiology - Software Engineering - Theoretical Computer Science

Table of contents 

I. Bayesian Image Analysis: Introduction.-
1. The Bayesian Paradigm.- 1.1 The Space of Images.- 1.2 The Space of Observations.- 1.3 Prior and Posterior Distribution.- 1.4 Bayesian Decision Rules.-
2. Cleaning Dirty Pictures.- 2.1 Distortion of Images.- 2.1.1 Physical Digital Imaging Systems.- 2.1.2 Posterior Distributions.- 2.2 Smoothing.- 2.3 Piecewise Smoothing.- 2.4 Boundary Extraction.-
3. Random Fields.- 3.1 Markov Random Fields.- 3.2 Gibbs Fields and Potentials.- 3.3 More on Potentials.-
II. The Gibbs Sampler and Simulated Annealing.-
4. Markov Chains: Limit Theorems.- 4.1 Preliminaries.- 4.2 The Contraction Coefficient.- 4.3 Homogeneous Markov Chains.- 4.4 Inhomogeneous Markov Chains.-
5. Sampling and Annealing.- 5.1 Sampling.- 5.2 Simulated Annealing.- 5.3 Discussion.-
6. Cooling Schedules.- 6.1 The ICM Algorithm.- 6.2 Exact MAPE Versus Fast Cooling.- 6.3 Finite Time Annealing.-
7. Sampling and Annealing Revisited.- 7.1 A Law of Large Numbers for Inhomogeneous Markov Chains.- 7.1.1 The Law of Large Numbers.- 7.1.2 A Counterexample.- 7.2 A General Theorem.- 7.3 Sampling and Annealing under Constraints.- 7.3.1 Simulated Annealing.- 7.3.2 Simulated Annealing under Constraints.- 7.3.3 Sampling with and without Constraints.-
III. More on Sampling and Annealing.-
8. Metropolis Algorithms.- 8.1 The Metropolis Sampler.- 8.2 Convergence Theorems.- 8.3 Best Constants.- 8.4 About Visiting Schemes.- 8.4.1 Systematic Sweep Strategies.- 8.4.2 The Influence of Proposal Matrices.- 8.5 The Metropolis Algorithm in Combinatorial Optimization.- 8.6 Generalizations and Modifications.- 8.6.1 Metropolis-Hastings Algorithms.- 8.6.2 Threshold Random Search.-
9. Alternative Approaches.- 9.1 Second Largest Eigenvalues.- 9.1.1 Convergence Reproved.- 9.1.2 Sampling and Second Largest Eigenvalues.- 9.1.3 Continuous Time and Space.-
10. Parallel Algorithms.- 10.1 Partially Parallel Algorithms.- 10.1.1 Synchronous Updating on Independent Sets.- 10.1.2 The Swendsen-Wang Algorithm.- 10.2 Synchronous Algorithms.- 10.2.1 Introduction.- 10.2.2 Invariant Distributions and Convergence.- 10.2.3 Support of the Limit Distribution.- 10.3 Synchronous Algorithms and Reversibility.- 10.3.1 Preliminaries.- 10.3.2 Invariance and Reversibility.- 10.3.3 Final Remarks.-
IV. Texture Analysis.-
11. Partitioning.- 11.1 Introduction.- 11.2 How to Tell Textures Apart.- 11.3 Features.- 11.4 Bayesian Texture Segmentation.- 11.4.1 The Features.- 11.4.2 The Kolmogorov-Smirnov Distance.- 11.4.3 A Partition Model.- 11.4.4 Optimization.- 11.4.5 A Boundary Model.- 11.5 Julesz’s Conjecture.- 11.5.1 Introduction.- 11.5.2 Point Processes.-
12. Texture Models and Classification.- 12.1 Introduction.- 12.2 Texture Models.- 12.2.1 The ?-Model.- 12.2.2 The Autobinomial Model.- 12.2.3 Automodels.- 12.3 Texture Synthesis.- 12.4 Texture Classification.- 12.4.1 General Remarks.- 12.4.2 Contextual Classification.- 12.4.3 MPM Methods.-
V. Parameter Estimation.-
13. Maximum Likelihood Estimators.- 13.1 Introduction.- 13.2 The Likelihood Function.- 13.3 Objective Functions.- 13.4 Asymptotic Consistency.-
14. Spatial ML Estimation.- 14.1 Introduction.- 14.2 Increasing Observation Windows.- 14.3 The Pseudolikelihood Method.- 14.4 The Maximum Likelihood Method.- 14.5 Computation of ML Estimators.- 14.6 Partially Observed Data.-
VI. Supplement.-
15. A Glance at Neural Networks.- 15.1 Introduction.- 15.2 Boltzmann Machines.- 15.3 A Learning Rule.-
16. Mixed Applications.- 16.1 Motion.- 16.2 Tomographic Image Reconstruction.- 16.3 Biological Shape.-
VII. Appendix.-
A. Simulation of Random Variables.- A.1 Pseudo-random Numbers.- A.2 Discrete Random Variables.- A.3 Local Gibbs Samplers.- A.4 Further Distributions.- A.4.1 Binomial Variables.- A.4.2 Poisson Variables.- A.4.3 Gaussian Variables.- A.4.4 The Rejection Method.- A.4.5 The Polar Method.-
B. The Perron-Frobenius Theorem.-
C. Concave Functions.-
D. A Global Convergence Theorem for Descent Algorithms.-
References.
