
Probability, Statistical Optics, and Data Testing

A Problem Solving Approach

Frieden, B. Roy

2nd ed. 1991, XX, 443 p., 110 illus.


ISBN 978-3-642-97289-8

This new edition incorporates corrections of all known typographical errors in the first edition, as well as some more substantive changes. Chief among the latter is the addition of Chap. 17, on methods of estimation. As with the rest of the text, most applications and examples cited in the new chapter are from the optical perspective. The intention behind this new chapter is to give the optical researcher a yet broader range of research tools; a basic knowledge of estimation methods should certainly be among these. In particular, the sections on likelihood theory and Fisher information prepare readers for the problems of optical parameter estimation and probability law estimation. Physicists and optical scientists may find this material particularly useful, since the subject of Fisher information is generally not covered in standard physical science curricula.

Since the words "statistical optics" are prominent in the title of this book, their meaning needs to be clarified. There is a general tendency to overly emphasize the statistics of photons as the sine qua non of statistical optics. In this text a wider view is taken, which equally emphasizes the random medium that surrounds the photon, be it a photographic emulsion, the turbulent atmosphere, a vibrating lens holder, etc. Also included are random interpretations of ostensibly deterministic phenomena, such as the Hurter-Driffield (H and D) curve of photography. Such a "random interpretation" sometimes breaks new ground, as in Chap.
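The likelihood and Fisher-information machinery described above can be illustrated on the simplest optical case, Poisson-distributed photon counts. The following sketch is not from the book; the parameter values and variable names are assumptions chosen for illustration. For Poisson data the maximum-likelihood estimate of the mean count is the sample mean, and the Fisher information 1/λ per observation gives the Cramér-Rao lower bound λ/n on the estimator's variance.

```python
import numpy as np

# Illustrative sketch: ML estimation of a Poisson photon-count mean.
# true_lam and n are assumed values, not figures from the text.
rng = np.random.default_rng(0)
true_lam = 50.0              # true mean photon count per frame (assumed)
n = 10_000                   # number of recorded frames (assumed)
counts = rng.poisson(true_lam, size=n)

# For Poisson data the log-likelihood is maximized by the sample mean.
lam_hat = counts.mean()

# Fisher information per observation is I(lam) = 1/lam, so the
# Cramer-Rao lower bound on the variance of any unbiased estimator
# is lam/n; the sample mean attains it (it is efficient).
crlb = true_lam / n
print(f"ML estimate: {lam_hat:.2f}, CRLB on variance: {crlb}")
```

With n = 10,000 frames the bound predicts a standard error of about sqrt(0.005) ≈ 0.07 counts, so the printed estimate lands very close to the true mean of 50.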

Content Level » Research

Keywords » Likelihood - Mathematical methods in physics - Optical information - Quantum optics - Statistical optics - Statistics - Probability calculus - image formation and analysis - optics - probability theory

Related subjects » Complexity - Optics & Lasers - Theoretical, Mathematical & Computational Physics

Table of contents 

1. Introduction.- 1.1 What Is Chance, and Why Study It?.- 1.1.1 Chance vs Determinism.- 1.1.2 Probability Problems in Optics.- 1.1.3 Statistical Problems in Optics.- 2. The Axiomatic Approach.- 2.1 Notion of an Experiment; Events.- 2.1.1 Event Space; The Space Event.- 2.1.2 Disjoint Events.- 2.1.3 The Certain Event.- Exercise 2.1.- 2.2 Definition of Probability.- 2.3 Relation to Frequency of Occurrence.- 2.4 Some Elementary Consequences.- 2.4.1 Additivity Property.- 2.4.2 Normalization Property.- 2.5 Marginal Probability.- 2.6 The “Traditional” Definition of Probability.- 2.7 Illustrative Problem: A Dice Game.- 2.8 Illustrative Problem: Let’s (Try to) Take a Trip.- 2.9 Law of Large Numbers.- 2.10 Optical Objects and Images as Probability Laws.- 2.11 Conditional Probability.- Exercise 2.2.- 2.12 The Quantity of Information.- 2.13 Statistical Independence.- 2.13.1 Illustrative Problem: Let’s (Try to) Take a Trip (Continued).- 2.14 Informationless Messages.- 2.15 A Definition of Noise.- 2.16 “Additivity” Property of Information.- 2.17 Partition Law.- 2.18 Illustrative Problem: Transmittance Through a Film.- 2.19 How to Correct a Success Rate for Guesses.- Exercise 2.3.- 2.20 Bayes’ Rule.- 2.21 Some Optical Applications.- 2.22 Information Theory Application.- 2.23 Application to Markov Events.- 2.24 Complex Number Events.- Exercise 2.4.- 3. 
Continuous Random Variables.- 3.1 Definition of a Random Variable.- 3.2 Probability Density Function, Basic Properties.- 3.3 Information Theory Application: Continuous Limit.- 3.4 Optical Application: Continuous Form of Imaging Law.- 3.5 Expected Values, Moments.- 3.6 Optical Application: Moments of the Slit Diffraction Pattern.- 3.7 Information Theory Application.- 3.8 Case of Statistical Independence.- 3.9 Mean of a Sum.- 3.10 Optical Application.- 3.11 Deterministic Limit; Representations of the Dirac δ-Function.- 3.12 Correspondence Between Discrete and Continuous Cases.- 3.13 Cumulative Probability.- 3.14 The Means of an Algebraic Expression: A Simplified Approach.- 3.15 A Potpourri of Probability Laws.- 3.15.1 Poisson.- 3.15.2 Binomial.- 3.15.3 Uniform.- 3.15.4 Exponential.- 3.15.5 Normal (One-Dimensional).- 3.15.6 Normal (Two-Dimensional).- 3.15.7 Normal (Multi-Dimensional).- 3.15.8 Skewed Gaussian Case; Gram-Charlier Expansion.- 3.15.9 Optical Application.- 3.15.10 Geometric Law.- 3.15.11 Cauchy Law.- 3.15.12 Sinc² Law.- Exercise 3.1.- 4. 
Fourier Methods in Probability.- 4.1 Characteristic Function Defined.- 4.2 Use in Generating Moments.- 4.3 An Alternative to Describing RV x.- 4.4 On Optical Applications.- 4.5 Shift Theorem.- 4.6 Poisson Case.- 4.7 Binomial Case.- 4.8 Uniform Case.- 4.9 Exponential Case.- 4.10 Normal Case (One Dimension).- 4.11 Multidimensional Cases.- 4.12 Normal Case (Two Dimensions).- 4.13 Convolution Theorem, Transfer Theorem.- 4.14 Probability Law for the Sum of Two Independent RV’s.- 4.15 Optical Applications.- 4.15.1 Imaging Equation as the Sum of Two Random Displacements.- 4.15.2 Unsharp Masking.- 4.16 Sum of n Independent RV’s; the “Random Walk” Phenomenon.- Exercise 4.1.- 4.17 Resulting Mean and Variance: Normal, Poisson, and General Cases.- 4.18 Sum of n Dependent RV’s.- 4.19 Case of Two Gaussian Bivariate RV’s.- 4.20 Sampling Theorems for Probability.- 4.21 Case of Limited Range of x, Derivation.- 4.22 Discussion.- 4.23 Optical Application.- 4.24 Case of Limited Range of ω.- 4.25 Central Limit Theorem.- 4.26 Derivation.- Exercise 4.2.- 4.27 How Large Does n Have to be?.- 4.28 Optical Applications.- 4.28.1 Cascaded Optical Systems.- 4.28.2 Laser Resonator.- 4.28.3 Atmospheric Turbulence.- 4.29 Generating Normally Distributed Numbers from Uniformly Random Numbers.- 4.30 The Error Function.- Exercise 4.3.- 5. 
Functions of Random Variables.- 5.1 Case of a Single Random Variable.- 5.2 Unique Root.- 5.3 Application from Geometrical Optics.- 5.4 Multiple Roots.- 5.5 Illustrative Example.- 5.6 Case of n Random Variables, r Roots.- 5.7 Optical Applications.- 5.8 Statistical Modeling.- 5.9 Application of Transformation Theory to Laser Speckle.- 5.9.1 Physical Layout.- 5.9.2 Plan.- 5.9.3 Statistical Model.- 5.9.4 Marginal Probabilities for Light Amplitudes U_re, U_im.- 5.9.5 Correlation Between U_re and U_im.- 5.9.6 Joint Probability Law for U_re, U_im.- 5.9.7 Probability Laws for Intensity and Phase; Transformation of the RV’s.- 5.9.8 Marginal Law for Intensity and Phase.- 5.9.9 Signal-to-Noise (S/N) Ratio in the Speckle Image.- 5.10 Speckle Reduction by Use of a Scanning Aperture.- 5.10.1 Statistical Model.- 5.10.2 Probability Density for Output Intensity p_I(v).- 5.10.3 Moments and S/N Ratio.- 5.10.4 Standard Form for the Chi-Square Distribution.- 5.11 Calculation of Spot Intensity Profiles Using Transformation Theory.- 5.11.1 Illustrative Example.- 5.11.2 Implementation by Ray-Trace.- 5.12 Application of Transformation Theory to a Satellite-Ground Communication Problem.- Exercise 5.1.- 6. Bernoulli Trials and Limiting Cases.- 6.1 Analysis.- 6.2 Illustrative Problems.- 6.2.1 Illustrative Problem: Let’s (Try to) Take a Trip: The Last Word.- 6.2.2 Illustrative Problem: Mental Telepathy as a Communication Link?.- 6.3 Characteristic Function and Moments.- 6.4 Optical Application: Checkerboard Model of Granularity.- 6.5 The Poisson Limit.- 6.5.1 Analysis.- 6.5.2 Example of Degree of Approximation.- 6.6 Optical Application: The Shot Effect.- 6.7 Optical Application: Combined Sources.- 6.8 Poisson Joint Count for Two Detectors — Intensity Interferometry.- 6.9 The Normal Limit (De Moivre-Laplace Law).- 6.9.1 Derivation.- 6.9.2 Conditions of Use.- 6.9.3 Use of the Error Function.- Exercise 6.1.- 7. 
The Monte Carlo Calculation.- 7.1 Producing Random Numbers That Obey a Prescribed Probability Law.- 7.1.1 Illustrative Case.- 7.1.2 Normal Case.- 7.2 Analysis of the Photographic Emulsion by Monte Carlo Calculation.- 7.3 Application of the Monte Carlo Calculation to Remote Sensing.- 7.4 Monte Carlo Formation of Optical Images.- 7.4.1 An Example.- 7.5 Monte Carlo Simulation of Speckle Patterns.- Exercise 7.1.- 8. Stochastic Processes.- 8.1 Definition of a Stochastic Process.- 8.2 Definition of Power Spectrum.- 8.2.1 Some Examples of Power Spectra.- 8.3 Definition of Autocorrelation Function; Kinds of Stationarity.- 8.4 Fourier Transform Theorem.- 8.5 Case of a “White” Power Spectrum.- 8.6 Application: Average Transfer Function Through Atmospheric Turbulence.- 8.6.1 Statistical Model for Phase Fluctuations.- 8.6.2 A Transfer Function for Turbulence.- 8.7 Transfer Theorems for Power Spectra.- 8.7.1 Determining the MTF Using Random Objects.- 8.7.2 Speckle Interferometry of Labeyrie.- 8.7.3 Resolution Limits of Speckle Interferometry.- Exercise 8.1.- 8.8 Transfer Theorem for Autocorrelation: The Knox-Thompson Method.- 8.9 Additive Noise.- 8.10 Random Noise.- 8.11 Ergodic Property.- Exercise 8.2.- 8.12 Optimum Restoring Filter.- 8.12.1 Definition of Restoring Filter.- 8.12.2 Model.- 8.12.3 Solution.- Exercise 8.3.- 8.13 Information Content in the Optical Image.- 8.13.1 Statistical Model.- 8.13.2 Analysis.- 8.13.3 Noise Entropy.- 8.13.4 Data Entropy.- 8.13.5 The Answer.- 8.13.6 Interpretation.- 8.14 Data Information and Its Ability to be Restored.- 8.15 Superposition Processes; the Shot Noise Process.- 8.15.1 Probability Law for i.- 8.15.2 Some Important Averages.- 8.15.3 Mean Value.- 8.15.4 Shot Noise Case.- 8.15.5 Second Moment.- 8.15.6 Variance σ²(x₀).- 8.15.7 Shot Noise Case.- 8.15.8 Signal-to-Noise (S/N) Ratio.- Exercise 8.4.- 8.15.9 Autocorrelation Function.- 8.15.10 Shot Noise Case.- 8.15.11 Application: An Overlapping Circular Grain Model for the Emulsion.- 
8.15.12 Application: Light Fluctuations due to Randomly Tilted Waves, the “Swimming Pool” Effect.- Exercise 8.5.- 9. Introduction to Statistical Methods: Estimating the Mean, Median, Variance, S/N, and Simple Probability.- 9.1 Estimating a Mean from a Finite Sample.- 9.2 Statistical Model.- 9.3 Analysis.- 9.4 Discussion.- 9.5 Error in a Discrete, Linear Processor: Why Linear Methods Often Fail.- 9.6 Estimating a Probability: Derivation of the Law of Large Numbers.- 9.7 Variance of Error.- 9.8 Illustrative Uses of the Error Expression.- 9.8.1 Estimating Probabilities from Empirical Rates.- 9.8.2 Aperture Size for Required Accuracy in Transmittance Readings.- 9.9 Probability Law for the Estimated Probability: Confidence Limits.- 9.10 Calculation of the Sample Variance.- 9.10.1 Unbiased Estimate of the Variance.- 9.10.2 Expected Error in the Sample Variance.- 9.10.3 Illustrative Problems.- 9.11 Estimating the Signal-to-Noise Ratio; Student’s Probability Law.- 9.11.1 Probability Law for SNR.- 9.11.2 Moments of SNR.- 9.11.3 Limit c → 0; a Student Probability Law.- 9.12 Properties of a Median Window.- 9.13 Statistics of the Median.- 9.13.1 Probability Law for the Median.- 9.13.2 Laser Speckle Case: Exponential Probability Law.- Exercise 9.1.- 10. 
Estimating a Probability Law.- 10.1 Estimating Probability Densities Using Orthogonal Expansions.- 10.2 Karhunen-Loeve Expansion.- 10.3 The Multinomial Probability Law.- 10.3.1 Derivation.- 10.3.2 Illustrative Example.- 10.4 Estimating a Probability Law Using Maximum Likelihood.- 10.4.1 Principle of Maximum Likelihood.- 10.4.2 Maximum Entropy Estimate.- 10.4.3 The Search for “Maximum Prior Ignorance”.- 10.4.4 Other Types of Estimates (Summary).- 10.4.5 Return to Maximum Entropy Estimation, Discrete Case.- 10.4.6 Transition to a Continuous Random Variable.- 10.4.7 Solution.- 10.4.8 Maximized H.- 10.4.9 Illustrative Example: Significance of the Normal Law.- 10.4.10 The Smoothness Property; Least Biased Aspect.- 10.4.11 A Well Known Distribution Derived.- 10.4.12 When Does the Maximum Entropy Estimate Equal the True Law?.- 10.4.13 Maximum Likelihood Estimation of Optical Objects.- 10.4.14 Case of Nearly Featureless Objects.- Exercise 10.1.- 11. The Chi-Square Test of Significance.- 11.1 Forming the χ² Statistic.- 11.2 Probability Law for χ² Statistic.- 11.3 When is a Coin Fixed?.- 11.4 Equivalence of Chi-Square to Other Statistics; Sufficient Statistics.- 11.5 When is a Vote Decisive?.- 11.6 Generalization to N Voters.- 11.7 Use as an Image Detector.- Exercise 11.1.- 12. The Student t-Test on the Mean.- 12.1 Cases Where Data Accuracy is Unknown.- 12.2 Philosophy of the Approach: Statistical Inference.- 12.3 Forming the Statistic.- 12.4 Student’s t-Distribution; Derivation.- 12.5 Some Properties of Student’s t-Distribution.- 12.6 Application to the Problem; Student’s t-Test.- 12.7 Illustrative Example.- 12.8 Other Applications.- Exercise 12.1.- 13. The F-Test on Variance.- 13.1 Snedecor’s F-Distribution; Derivation.- 13.2 Some Properties of Snedecor’s F-Distribution.- 13.3 The F-Test.- 13.4 Illustrative Example.- 13.5 Application to Image Detection.- Exercise 13.1.- 14. 
Least-Squares Curve Fitting — Regression Analysis.- 14.1 Summation Model for the Physical Effect.- 14.2 Linear Regression Model for the Noise.- 14.3 Equivalence of ML and Least-Squares Solutions.- 14.4 Solution.- 14.5 Return to Film Problem.- 14.6 “Significant” Factors; the R-Statistic.- 14.7 Example: Was T² an Insignificant Factor?.- 14.8 Accuracy of the Estimated Coefficients.- 14.8.1 Absorptance of an Optical Fiber.- 14.8.2 Variance of Error in the General Case.- 14.8.3 Error in the Estimated Absorptance of an Optical Fiber.- Exercise 14.1.- 15. Principal Components Analysis.- 15.1 A Photographic Problem.- 15.2 Equivalent Eigenvalue Problem.- 15.3 The Eigenvalues as Sample Variances.- 15.4 The Data in Terms of Principal Components.- 15.5 Reduction in Data Dimensionality.- 15.6 Return to the H-D Problem.- 15.7 Application to Multispectral Imagery.- 15.8 Error Analysis.- Exercise 15.1.- 16. The Controversy Between Bayesians and Classicists.- 16.1 Bayesian Approach to Confidence Limits for an Estimated Probability.- 16.1.1 Probability Law for the Unknown Probability.- 16.1.2 Assumption of a Uniform Prior.- 16.1.3 Irrelevance of Choice of Prior Statistic p₀(x) if N is Large.- 16.1.4 Limiting Form for N Large.- 16.1.5 Illustrative Problem.- 16.2 Laplace’s Rule of Succession.- 16.2.1 Derivation.- Exercise 16.1.- 16.2.2 Role of the Prior.- Exercise 16.2.- 16.3 Prior Probability Law for the Physical Constants.- 17. 
Introduction to Estimation Methods.- 17.1 Deterministic Parameters: Likelihood Theory.- 17.1.1 Unbiased Estimators.- 17.1.2 Maximum Likelihood Estimators.- Exercise 17.1.- 17.1.3 Cramer-Rao Lower Bound on Error.- 17.1.4 Achieving the Lower Bound.- 17.1.5 Testing for Efficient Estimators.- 17.2 Random Parameters: Bayesian Estimation Theory.- 17.2.1 Cost Functions.- 17.2.2 Risk.- 17.2.3 MAP Estimates.- Exercise 17.2.- 17.3 Estimating Probability Laws: the Use of Fisher Information.- 17.3.1 Equilibrium States.- 17.3.2 Application to Quantum Mechanics: The Schrödinger Wave Equation.- 17.3.3 The Klein-Gordon Equation.- 17.3.4 Application to Diffraction Optics: The Helmholtz Wave Equation.- 17.3.5 Application to Thermodynamics: The Maxwell-Boltzmann Law.- 17.3.6 Discussion.- Exercise 17.3.- Appendix A. Error Function and Its Derivative.- Appendix E. A Crib Sheet of Statistical Parameters and Their Errors.- Appendix F. Synopsis of Statistical Tests.- References.
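Two of the chapters above fit together naturally in a few lines: Chap. 7's technique of producing random numbers that obey a prescribed probability law (inverse-transform sampling) applied to the exponential intensity law of fully developed laser speckle from Sect. 5.9, for which the S/N ratio is 1. A minimal sketch, not a worked example from the book; the mean intensity I0 and sample size are assumed values.

```python
import numpy as np

# Inverse-transform sampling, sketched for the exponential law
# p(I) = exp(-I/I0)/I0. Its CDF is F(I) = 1 - exp(-I/I0), so
# setting F(I) = u and solving gives I = -I0 * ln(1 - u).
rng = np.random.default_rng(1)
I0 = 2.0                      # assumed mean speckle intensity
u = rng.random(100_000)       # uniform random numbers on [0, 1)

I = -I0 * np.log(1.0 - u)     # exponentially distributed intensities

# For fully developed (polarized) speckle, mean equals standard
# deviation, so the S/N ratio of the intensity tends to 1.
snr = I.mean() / I.std()
print(f"sample mean {I.mean():.3f}, S/N {snr:.3f}")
```

The same recipe generalizes to any law with an invertible cumulative distribution; when the inverse has no closed form, a numerical root-find of F(I) = u serves instead.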
