Summary
Images contain information about the spatial properties of the scene they depict. When coupled with suitable assumptions, images can be used to infer three-dimensional information. For instance, if the scene contains objects made with homogeneous material, such as marble, variations in image intensity can be associated with variations in shape, and hence the "shading" in the image can be exploited to infer the "shape" of the scene (shape from shading). Similarly, if the scene contains (statistically) regular structures, variations in image intensity can be used to infer shape (shape from texture). Shading, texture, cast shadows, and occluding boundaries are all "cues" that can be exploited to infer spatial properties of the scene from a single image, when the underlying assumptions are satisfied. In addition, one can obtain spatial cues from multiple images of the same scene taken with changing conditions. For instance, changes in the image due to a moving light source are used in "photometric stereo," changes in the image due to changes in the position of the cameras are used in "stereo," "structure from motion," and "motion blur." Finally, changes in the image due to changes in the geometry of the camera are used in "shape from defocus." In this book, we will concentrate on the latter two approaches, motion blur and defocus, which are referred to collectively as "accommodation cues."
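The defocus cue described above follows from the thin-lens geometry developed in Chapter 2: a scene point at depth Z focuses at a distance v0 = fZ/(Z - f) behind the lens, so on a sensor held at distance v it spreads into a blur circle of diameter D|v - v0|/v0, where D is the aperture diameter. The sketch below illustrates this relation; the focal length, aperture, and sensor distance are illustrative values, not taken from the book.

```python
def blur_diameter(Z, f=0.035, D=0.010, v=0.0363):
    """Blur-circle diameter (meters) for a point at depth Z (meters),
    under the thin-lens model.

    A point at depth Z comes into focus at v0 = f*Z/(Z - f); the cone
    of rays admitted by an aperture of diameter D intersects a sensor
    at distance v in a circle of diameter D * |v - v0| / v0.
    Illustrative defaults: f = 35 mm lens, D = 10 mm aperture,
    sensor at v = 36.3 mm (in focus for Z of roughly 0.98 m).
    """
    v0 = f * Z / (Z - f)          # in-focus image distance for depth Z
    return D * abs(v - v0) / v0   # blur grows as Z departs from focus

# The in-focus depth for the default settings, from v = f*Z/(Z - f):
Z_focus = 0.035 * 0.0363 / (0.0363 - 0.035)
```

Note that the blur diameter is zero only at Z_focus and grows monotonically on either side of it, which is exactly why measuring blur across two or more images taken with different camera settings constrains depth.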
Table of Contents
Preface | p. vii |
1 Introduction | p. 1 |
1.1 The sense of vision | p. 1 |
1.1.1 Stereo | p. 4 |
1.1.2 Structure from motion | p. 5 |
1.1.3 Photometric stereo and other techniques based on controlled light | p. 5 |
1.1.4 Shape from shading | p. 6 |
1.1.5 Shape from texture | p. 6 |
1.1.6 Shape from silhouettes | p. 6 |
1.1.7 Shape from defocus | p. 6 |
1.1.8 Motion blur | p. 7 |
1.1.9 On the relative importance and integration of visual cues | p. 7 |
1.1.10 Visual inference in applications | p. 8 |
1.2 Preview of coming attractions | p. 9 |
1.2.1 Estimating 3-D geometry and photometry with a finite aperture | p. 9 |
1.2.2 Testing the power and limits of models for accommodation cues | p. 10 |
1.2.3 Formulating the problem as optimal inference | p. 11 |
1.2.4 Choice of optimization criteria, and the design of optimal algorithms | p. 12 |
1.2.5 Variational approach to modeling and inference from accommodation cues | p. 12 |
2 Basic models of image formation | p. 14 |
2.1 The simplest imaging model | p. 14 |
2.1.1 The thin lens | p. 14 |
2.1.2 Equifocal imaging model | p. 16 |
2.1.3 Sensor noise and modeling errors | p. 18 |
2.1.4 Imaging models and linear operators | p. 19 |
2.2 Imaging occlusion-free objects | p. 20 |
2.2.1 Image formation nuisances and artifacts | p. 22 |
2.3 Dealing with occlusions | p. 23 |
2.4 Modeling defocus as a diffusion process | p. 26 |
2.4.1 Equifocal imaging as isotropic diffusion | p. 28 |
2.4.2 Nonequifocal imaging model | p. 29 |
2.5 Modeling motion blur | p. 30 |
2.5.1 Motion blur as temporal averaging | p. 30 |
2.5.2 Modeling defocus and motion blur simultaneously | p. 34 |
2.6 Summary | p. 35 |
3 Some analysis: When can 3-D shape be reconstructed from blurred images? | p. 37 |
3.1 The problem of shape from defocus | p. 38 |
3.2 Observability of shape | p. 39 |
3.3 The role of radiance | p. 41 |
3.3.1 Harmonic components | p. 42 |
3.3.2 Band-limited radiances and degree of resolution | p. 42 |
3.4 Joint observability of shape and radiance | p. 46 |
3.5 Regularization | p. 46 |
3.6 On the choice of objective function in shape from defocus | p. 47 |
3.7 Summary | p. 49 |
4 Least-squares shape from defocus | p. 50 |
4.1 Least-squares minimization | p. 50 |
4.2 A solution based on orthogonal projectors | p. 53 |
4.2.1 Regularization via truncation of singular values | p. 53 |
4.2.2 Learning the orthogonal projectors from images | p. 55 |
4.3 Depth-map estimation algorithm | p. 58 |
4.4 Examples | p. 60 |
4.4.1 Explicit kernel model | p. 60 |
4.4.2 Learning the kernel model | p. 61 |
4.5 Summary | p. 65 |
5 Enforcing positivity: Shape from defocus and image restoration by minimizing I-divergence | p. 69 |
5.1 Information-divergence | p. 70 |
5.2 Alternating minimization | p. 71 |
5.3 Implementation | p. 76 |
5.4 Examples | p. 76 |
5.4.1 Examples with synthetic images | p. 76 |
5.4.2 Examples with real images | p. 78 |
5.5 Summary | p. 79 |
6 Defocus via diffusion: Modeling and reconstruction | p. 87 |
6.1 Blurring via diffusion | p. 88 |
6.2 Relative blur and diffusion | p. 89 |
6.3 Extension to space-varying relative diffusion | p. 90 |
6.4 Enforcing forward diffusion | p. 91 |
6.5 Depth-map estimation algorithm | p. 92 |
6.5.1 Minimization of the cost functional | p. 94 |
6.6 On the extension to multiple images | p. 95 |
6.7 Examples | p. 96 |
6.7.1 Examples with synthetic images | p. 97 |
6.7.2 Examples with real images | p. 99 |
6.8 Summary | p. 99 |
7 Dealing with motion: Unifying defocus and motion blur | p. 106 |
7.1 Modeling motion blur and defocus in one go | p. 107 |
7.2 Well-posedness of the diffusion model | p. 109 |
7.3 Estimating radiance, depth, and motion | p. 110 |
7.3.1 Cost functional minimization | p. 111 |
7.4 Examples | p. 113 |
7.4.1 Synthetic data | p. 114 |
7.4.2 Real images | p. 117 |
7.5 Summary | p. 118 |
8 Dealing with multiple moving objects | p. 120 |
8.1 Handling multiple moving objects | p. 121 |
8.2 A closer look at camera exposure | p. 124 |
8.3 Relative motion blur | p. 125 |
8.3.1 Minimization algorithm | p. 126 |
8.4 Dealing with changes in motion | p. 127 |
8.4.1 Matching motion blur along different directions | p. 129 |
8.4.2 A look back at the original problem | p. 131 |
8.4.3 Minimization algorithm | p. 132 |
8.5 Image restoration | p. 135 |
8.5.1 Minimization algorithm | p. 137 |
8.6 Examples | p. 138 |
8.6.1 Synthetic data | p. 138 |
8.6.2 Real data | p. 141 |
8.7 Summary | p. 146 |
9 Dealing with occlusions | p. 147 |
9.1 Inferring shape and radiance of occluded surfaces | p. 148 |
9.2 Detecting occlusions | p. 150 |
9.3 Implementation of the algorithm | p. 151 |
9.4 Examples | p. 152 |
9.4.1 Examples on a synthetic scene | p. 152 |
9.4.2 Examples on real images | p. 154 |
9.5 Summary | p. 157 |
10 Final remarks | p. 159 |
A Concepts of radiometry | p. 161 |
A.1 Radiance, irradiance, and the pinhole model | p. 161 |
A.1.1 Foreshortening and solid angle | p. 161 |
A.1.2 Radiance and irradiance | p. 162 |
A.1.3 Bidirectional reflectance distribution function | p. 163 |
A.1.4 Lambertian surfaces | p. 163 |
A.1.5 Image intensity for a Lambertian surface and a pinhole lens model | p. 164 |
A.2 Derivation of the imaging model for a thin lens | p. 164 |
B Basic primer on functional optimization | p. 168 |
B.1 Basics of the calculus of variations | p. 169 |
B.1.1 Functional derivative | p. 170 |
B.1.2 Euler-Lagrange equations | p. 171 |
B.2 Detailed computation of the gradients | p. 172 |
B.2.1 Computation of the gradients in Chapter 6 | p. 172 |
B.2.2 Computation of the gradients in Chapter 7 | p. 174 |
B.2.3 Computation of the gradients in Chapter 8 | p. 176 |
B.2.4 Computation of the gradients in Chapter 9 | p. 185 |
C Proofs | p. 190 |
C.1 Proof of Proposition 3.2 | p. 190 |
C.2 Proof of Proposition 3.5 | p. 191 |
C.3 Proof of Proposition 4.1 | p. 192 |
C.4 Proof of Proposition 5.1 | p. 194 |
C.5 Proof of Proposition 7.1 | p. 195 |
D Calibration of defocused images | p. 197 |
D.1 Zooming and registration artifacts | p. 197 |
D.2 Telecentric optics | p. 200 |
E Matlab implementation of some algorithms | p. 202 |
E.1 Least-squares solution (Chapter 4) | p. 202 |
E.2 I-divergence solution (Chapter 5) | p. 212 |
E.3 Shape from defocus via diffusion (Chapter 6) | p. 221 |
E.4 Initialization: A fast approximate method | p. 229 |
F Regularization | p. 232 |
F.1 Inverse problems | p. 232 |
F.2 Ill-posed problems | p. 234 |
F.3 Regularization | p. 235 |
F.3.1 Tikhonov regularization | p. 237 |
F.3.2 Truncated SVD | p. 238 |
References | p. 239 |
Index | p. 247 |