
Probability, Statistical Optics, and Data Testing

A Problem Solving Approach

Frieden, B.R.

1983

eBook

(net) price for USA: $69.95

ISBN 978-3-642-96732-0

digitally watermarked, no DRM

Included Format: PDF

download immediately after purchase


About this textbook

A basic skill in probability is practically demanded nowadays in many branches of optics, especially in image science. On the other hand, there is no text presently available that develops probability, and its companion fields stochastic processes and statistics, from the optical perspective. [Short of a book, a chapter was recently written for this purpose; see B. R. Frieden (ed.): The Computer in Optical Research, Topics in Applied Physics, Vol. 41 (Springer, Berlin, Heidelberg, New York 1980) Chap. 3.] Most standard texts use illustrative examples and problems drawn either from electrical engineering or from the life sciences. The present book is meant to remedy this situation by teaching probability with the specific needs of the optical researcher in mind. Virtually all the illustrative examples and applications of the theory are from image science and other fields of optics. One might say that photons have replaced electrons in nearly all considerations here. We hope, in this manner, to make the learning of probability a pleasant and absorbing experience for optical workers. Some of the remaining applications are from information theory, a concept which complements image science in particular. As will be seen, there are numerous tie-ins between the two fields. Students will be adequately prepared for the material in this book if they have had a course in calculus and know the basics of matrix manipulation.

Content Level » Research

Keywords » Optics - Probability - Statistics - Stochastic Processes - Probability Theory

Related subjects » Complexity - Optics & Lasers - Theoretical, Mathematical & Computational Physics

Table of contents 

1. Introduction.- 1.1 What Is Chance, and Why Study It?.- 1.1.1 Chance vs Determinism.- 1.1.2 Probability Problems in Optics.- 1.1.3 Statistical Problems in Optics.- 2. The Axiomatic Approach.- 2.1 Notion of an Experiment; Events.- 2.1.1 Event Space; The Space Event.- 2.1.2 Disjoint Events.- 2.1.3 The Certain Event.- Exercise 2.1.- 2.2 Definition of Probability.- 2.3 Relation to Frequency of Occurrence.- 2.4 Some Elementary Consequences.- 2.4.1 Additivity Property.- 2.4.2 Normalization Property.- 2.5 Marginal Probability.- 2.6 The “Traditional” Definition of Probability.- 2.7 Illustrative Problem: A Dice Game.- 2.8 Illustrative Problem: Let’s (try to) Take a Trip.- 2.9 Law of Large Numbers.- 2.10 Optical Objects and Images as Probability Laws.- 2.11 Conditional Probability.- Exercise 2.2.- 2.12 The Quantity of Information.- 2.13 Statistical Independence.- 2.13.1 Illustrative Problem: Let’s (try to) Take a Trip (Continued).- 2.14 Informationless Messages.- 2.15 A Definition of Noise.- 2.16 “Additivity” Property of Information.- 2.17 Partition Law.- 2.18 Illustrative Problem: Transmittance Through a Film.- 2.19 How to Correct a Success Rate for Guesses.- Exercise 2.3.- 2.20 Bayes’ Rule.- 2.21 Some Optical Applications.- 2.22 Information Theory Application.- 2.23 Application to Markov Events.- 2.24 Complex Number Events.- Exercise 2.4.- 3. Continuous Random Variables.- 3.1 Definition of Random Variable.- 3.2 Probability Density Function, Basic Properties.- 3.3 Information Theory Application: Continuous Limit.- 3.4 Optical Application: Continuous Form of Imaging Law.- 3.5 Expected Values; Moments.- 3.6 Optical Application: Moments of the Slit Diffraction Pattern.- 3.7 Information Theory Application.- 3.8 Case of Statistical Independence.- 3.9 Mean of a Sum.- 3.10 Optical Application.- 3.11 Deterministic Limit; Representations of the Dirac δ-Function.- 3.12 Correspondence Between Discrete and Continuous Cases.- 3.13 Cumulative Probability.- 3.14 The Means of an Algebraic Expression: A Simplified Approach.- 3.15 A Potpourri of Probability Laws.- 3.15.1 Poisson.- 3.15.2 Binomial.- 3.15.3 Uniform.- 3.15.4 Exponential.- 3.15.5 Normal (One-Dimensional).- 3.15.6 Normal (Two-Dimensional).- 3.15.7 Normal (Multi-Dimensional).- 3.15.8 Skewed Gaussian Case; Gram-Charlier Expansion.- 3.15.9 Optical Application.- 3.15.10 Geometric Law.- 3.15.11 Cauchy Law.- 3.15.12 Sinc² Law.- Exercise 3.1.- 4.
Fourier Methods in Probability.- 4.1 Characteristic Function Defined.- 4.2 Use in Generating Moments.- 4.3 An Alternative to Describing RV x.- 4.4 On Optical Applications.- 4.5 Shift Theorem.- 4.6 Poisson Case.- 4.7 Binomial Case.- 4.8 Uniform Case.- 4.9 Exponential Case.- 4.10 Normal Case (One Dimension).- 4.11 Multidimensional Cases.- 4.12 Normal Case (Two Dimensions).- 4.13 Convolution Theorem, Transfer Theorem.- 4.14 Probability Law for the Sum of Two Independent RV’s.- 4.15 Optical Applications.- 4.15.1 Imaging Equation as the Sum of Two Random Displacements.- 4.15.2 Unsharp Masking.- 4.16 Sum of n Independent RV’s; the “Random Walk” Phenomenon.- Exercise 4.1.- 4.17 Resulting Mean and Variance: Normal, Poisson, and General Cases.- 4.18 Sum of n Dependent RV’s.- 4.19 Case of Two Gaussian Bivariate RV’s.- 4.20 Sampling Theorems for Probability.- 4.21 Case of Limited Range of x; Derivation.- 4.22 Discussion.- 4.23 Optical Application.- 4.24 Case of Limited Range of ω.- 4.25 Central Limit Theorem.- 4.26 Derivation.- Exercise 4.2.- 4.27 How Large Does n Have to be?.- 4.28 Optical Applications.- 4.28.1 Cascaded Optical Systems.- 4.28.2 Laser Resonator.- 4.28.3 Atmospheric Turbulence.- 4.29 Generating Normally Distributed Numbers from Uniformly Random Numbers.- 4.30 The Error Function.- Exercise 4.3.- 5. Functions of Random Variables.- 5.1 Case of a Single Random Variable.- 5.2 Unique Root.- 5.3 Application from Geometrical Optics.- 5.4 Multiple Roots.- 5.5 Illustrative Example.- 5.6 Case of n Random Variables, r Roots.- 5.7 Optical Applications.- 5.8 Statistical Modeling.- 5.9 Application of Transformation Theory to Laser Speckle.- 5.9.1 Physical Layout.- 5.9.2 Plan.- 5.9.3 Statistical Model.- 5.9.4 Marginal Probabilities for Light Amplitudes Ure, Uim.- 5.9.5 Correlation Between Ure and Uim.- 5.9.6 Joint Probability Law for Ure, Uim.- 5.9.7 Probability Laws for Intensity and Phase; Transformation of the RV’s.- 5.9.8 Marginal Law for Intensity and Phase.- 5.9.9 Signal-to-Noise (S/N) Ratio in the Speckle Image.- 5.10 Speckle Reduction by Use of a Scanning Aperture.- 5.10.1 Statistical Model.- 5.10.2 Probability Density for Output Intensity pI(v).- 5.10.3 Moments and S/N Ratio.- 5.10.4 Standard Form for the Chi-Square Distribution.- 5.11 Calculation of Spot Intensity Profiles Using Transformation Theory.- 5.11.1 Illustrative Example.- 5.11.2 Implementation by Ray-Trace.- 5.12 Application of Transformation Theory to a Satellite-Ground Communication Problem.- Exercise 5.1.- 6. Bernoulli Trials and Its Limiting Cases.- 6.1 Analysis.- 6.2 Illustrative Problems.- 6.2.1 Illustrative Problem: Let’s (try to) Take a Trip: The Last Word.- 6.2.2 Illustrative Problem: Mental Telepathy as a Communication Link?.- 6.3 Characteristic Function and Moments.- 6.4 Optical Application: Checkerboard Model of Granularity.- 6.5 The Poisson Limit.- 6.5.1 Analysis.- 6.5.2 Example of Degree of Approximation.- 6.6 Optical Application: The Shot Effect.- 6.7 Optical Application: Combined Sources.- 6.8 Poisson Joint Count for Two Detectors — Intensity Interferometry.- 6.9 The Normal Limit (DeMoivre-Laplace Law).- 6.9.1 Derivation.- 6.9.2 Conditions of Use.- 6.9.3 Use of the Error Function.- Exercise 6.1.- 7.
The Monte Carlo Calculation.- 7.1 Producing Random Numbers that Obey a Prescribed Probability Law.- 7.1.1 Illustrative Case.- 7.1.2 Normal Case.- 7.2 Analysis of the Photographic Emulsion by Monte Carlo Calculation.- 7.3 Application of the Monte Carlo Calculation to Remote Sensing.- 7.4 Monte Carlo Formation of Optical Images.- 7.4.1 An Example.- 7.5 Monte Carlo Simulation of Speckle Patterns.- Exercise 7.1.- 8. Stochastic Processes.- 8.1 Definition of Stochastic Process.- 8.2 Definition of Power Spectrum.- 8.2.1 Some Examples of Power Spectra.- 8.3 Definition of Autocorrelation Function; Kinds of Stationarity.- 8.4 Fourier Transform Theorem.- 8.5 Case of a “White” Power Spectrum.- 8.6 Application: Average Transfer Function Through Atmospheric Turbulence.- 8.6.1 Statistical Model for Phase Fluctuations.- 8.6.2 A Transfer Function for Turbulence.- 8.7 Transfer Theorems for Power Spectra.- 8.7.1 Determining the MTF Using Random Objects.- 8.7.2 Speckle Interferometry of Labeyrie.- 8.7.3 Resolution Limits of Speckle Interferometry.- Exercise 8.1.- 8.8 Transfer Theorem for Autocorrelation: The Knox-Thompson Method.- 8.9 Additive Noise.- 8.10 Random Noise.- 8.11 Ergodic Property.- Exercise 8.2.- 8.12 Optimum Restoring Filter.- 8.12.1 Definition of Restoring Filter.- 8.12.2 Model.- 8.12.3 Solution.- Exercise 8.3.- 8.13 Information Content in the Optical Image.- 8.13.1 Statistical Model.- 8.13.2 Analysis.- 8.13.3 Noise Entropy.- 8.13.4 Data Entropy.- 8.13.5 The Answer.- 8.13.6 Interpretation.- 8.14 Data Information and Its Ability to be Restored.- 8.15 Superposition Processes; the Shot Noise Process.- 8.15.1 Probability Law for i.- 8.15.2 Some Important Averages.- 8.15.3 Mean Value ⟨i(x0)⟩.- 8.15.4 Shot Noise Case.- 8.15.5 Second Moment ⟨i²(x0)⟩.- 8.15.6 Variance σ²(x0).- 8.15.7 Shot Noise Case.- 8.15.8 Signal-to-Noise (S/N) Ratio.- Exercise 8.4.- 8.15.9 Autocorrelation Function.- 8.15.10 Shot Noise Case.- 8.15.11 Application: An Overlapping Circular Grain Model for the Emulsion.- 8.15.12 Application: Light Fluctuations due to Randomly Tilted Waves, the “Swimming Pool” Effect.- Exercise 8.5.- 9. Introduction to Statistical Methods: Estimating the Mean, Median, Variance, S/N, and Simple Probability.- 9.1 Estimating a Mean from a Finite Sample.- 9.2 Statistical Model.- 9.3 Analysis.- 9.4 Discussion.- 9.5 Error in a Discrete, Linear Processor: Why Linear Methods Often Fail.- 9.6 Estimating a Probability: Derivation of the Law of Large Numbers.- 9.7 Variance of Error.- 9.8 Illustrative Uses of Error Expression.- 9.8.1 Estimating Probabilities from Empirical Rates.- 9.8.2 Aperture Size for Required Accuracy in Transmittance Readings.- 9.9 Probability Law for the Estimated Probability; Confidence Limits.- 9.10 Calculation of the Sample Variance.- 9.10.1 Unbiased Estimate of the Variance.- 9.10.2 Expected Error in the Sample Variance.- 9.10.3 Illustrative Problems.- 9.11 Estimating the Signal-to-Noise Ratio; Student’s Probability Law.- 9.11.1 Probability Law for SNR.- 9.11.2 Moments of SNR.- 9.11.3 Limit c → 0; a Student Probability Law.- 9.12 Properties of a Median Window.- 9.13 Statistics of the Median.- 9.13.1 Probability Law for the Median.- 9.13.2 Laser Speckle Case: Exponential Probability Law.- Exercise 9.1.- 10.
Estimating a Probability Law.- 10.1 Estimating Probability Densities Using Orthogonal Expansions.- 10.2 Karhunen-Loève Expansion.- 10.3 The Multinomial Probability Law.- 10.3.1 Derivation.- 10.3.2 Illustrative Example.- 10.4 Estimating a Probability Law Using Maximum Likelihood.- 10.4.1 Principle of Maximum Likelihood.- 10.4.2 Maximum Entropy Estimate.- 10.4.3 The Search for “Maximum Prior Ignorance”.- 10.4.4 Other Types of Estimates (Summary).- 10.4.5 Return to Maximum Entropy Estimation, Discrete Case.- 10.4.6 Transition to a Continuous Random Variable.- 10.4.7 Solution.- 10.4.8 Maximized H.- 10.4.9 Illustrative Example: Significance of the Normal Law.- 10.4.10 The Smoothness Property; Least Biased Aspect.- 10.4.11 A Well Known Distribution Derived.- 10.4.12 When Does the Maximum Entropy Estimate Equal the True Law?.- 10.4.13 Maximum Likelihood Estimation of Optical Objects.- 10.4.14 Case of Nearly Featureless Objects.- Exercise 10.1.- 11. The Chi-Square Test of Significance.- 11.1 Forming the χ² Statistic.- 11.2 Probability Law for the χ² Statistic.- 11.3 When is a Coin Fixed?.- 11.4 Equivalence of Chi-Square to Other Statistics; Sufficient Statistics.- 11.5 When is a Vote Decisive?.- 11.6 Generalization to N Voters.- 11.7 Use as an Image Detector.- Exercise 11.1.- 12. The Student t-Test on the Mean.- 12.1 Cases Where Data Accuracy is Unknown.- 12.2 Philosophy of the Approach: Statistical Inference.- 12.3 Forming the Statistic.- 12.4 Student’s t-Distribution; Derivation.- 12.5 Some Properties of Student’s t-Distribution.- 12.6 Application to the Problem; Student’s t-Test.- 12.7 Illustrative Example.- 12.8 Other Applications.- Exercise.- 13. The F-Test on Variance.- 13.1 Snedecor’s F-Distribution; Derivation.- 13.2 Some Properties of Snedecor’s F-Distribution.- 13.3 The F-Test.- 13.4 Illustrative Example.- 13.5 Application to Image Detection.- Exercise 13.1.- 14. Least-Squares Curve Fitting — Regression Analysis.- 14.1 Summation Model for the Physical Effect.- 14.2 Linear Regression Model for the Noise.- 14.3 Equivalence of ML and Least-Squares Solutions.- 14.4 Solution.- 14.5 Return to Film Problem.- 14.6 “Significant” Factors; the R-statistic.- 14.7 Example: Was T2 an Insignificant Factor?.- 14.8 Accuracy of the Estimated Coefficients.- 14.8.1 Absorptance of an Optical Fiber.- 14.8.2 Variance of Error in the General Case.- 14.8.3 Error in the Estimated Absorptance of an Optical Fiber.- Exercise 14.1.- 15. Principal Components Analysis.- 15.1 A Photographic Problem.- 15.2 Equivalent Eigenvalue Problem.- 15.3 The Eigenvalues as Sample Variances.- 15.4 The Data in Terms of Principal Components.- 15.5 Reduction in Data Dimensionality.- 15.6 Return to the H-D Problem.- 15.7 Application to Multispectral Imagery.- 15.8 Error Analysis.- Exercise 15.1.- 16. The Controversy Between Bayesians and Classicists.- 16.1 Bayesian Approach to Confidence Limits for an Estimated Probability.- 16.1.1 Probability Law for the Unknown Probability.- 16.1.2 Assumption of a Uniform Prior.- 16.1.3 Irrelevance of Choice of Prior Statistic p0(x) if N is Large.- 16.1.4 Limiting Form for N Large.- 16.1.5 Illustrative Problem.- 16.2 Laplace’s Rule of Succession.- 16.2.1 Derivation.- Exercise 16.1.- 16.2.2 Role of the Prior.- Exercise 16.2.- Appendix A. Error Function and Its Derivative [4.12].- Appendix E. A Crib Sheet of Statistical Parameters and Their Errors.- Appendix F. Synopsis of Statistical Tests.- References.
