Computational Imaging for Scene Understanding: Transient, Spectral, and Polarimetric Analysis

Most cameras are designed to mimic what the human eye sees: they have three RGB channels and can achieve up to around 30 frames per second (FPS). However, some cameras are designed to capture other modalities: some can capture spectra from near UV to near IR r...


Bibliographic Details
Main Author: Funatomi, Takuya
Other Authors: Okabe, Takahiro
Format: eBook
Language: English
Published: Newark : John Wiley & Sons, Incorporated, 2024.
Edition: 1st ed.
Series: Image. Sensors and image processing
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009828025406719
Table of Contents:
  • Cover
  • Title Page
  • Copyright Page
  • Contents
  • Introduction
  • Part 1. Transient Imaging and Processing
  • Chapter 1. Transient Imaging
  • 1.1. Introduction
  • 1.2. Mathematical formulation
  • 1.2.1. Analysis of transient light transport propagation
  • 1.2.2. Sparsity of the impulse response function T(x, t)
  • 1.3. Capturing light in flight
  • 1.3.1. Single-photon avalanche diodes (SPAD)
  • 1.4. Applications
  • 1.4.1. Range imaging
  • 1.4.2. Material estimation and classification
  • 1.4.3. Light transport decomposition
  • 1.5. Non-line-of-sight imaging
  • 1.5.1. Backprojection
  • 1.5.2. Confocal NLOS and the light-cone transform
  • 1.5.3. Surface-based methods
  • 1.5.4. Virtual waves and phasor fields
  • 1.5.5. Discussion
  • 1.6. Conclusion
  • 1.7. References
  • Chapter 2. Transient Convolutional Imaging
  • 2.1. Introduction
  • 2.2. Time-of-flight imaging
  • 2.2.1. Correlation image sensors
  • 2.2.2. Convolutional ToF depth imaging
  • 2.2.3. Multi-path interference
  • 2.3. Transient convolutional imaging
  • 2.3.1. Global convolutional transport
  • 2.3.2. Transient imaging using correlation image sensors
  • 2.3.3. Spatio-temporal modulation
  • 2.4. Transient imaging in scattering media
  • 2.5. Present and future directions
  • 2.6. References
  • Chapter 3. Time-of-Flight and Transient Rendering
  • 3.1. Introduction
  • 3.2. Mathematical modeling
  • 3.2.1. Mathematical modeling for time-of-flight cameras
  • 3.3. How to render time-of-flight cameras?
  • 3.3.1. Challenges and solutions in time-of-flight rendering
  • 3.4. Open-source implementations
  • 3.5. Applications of transient rendering
  • 3.6. Future directions
  • 3.7. References
  • Part 2. Spectral Imaging and Processing
  • Chapter 4. Hyperspectral Imaging
  • 4.1. Introduction
  • 4.2. 2D (raster scanning) architectures
  • 4.2.1. Czerny-Turner grating spectrometers
  • 4.2.2. Transmission grating/prism spectrometers
  • 4.2.3. Coded aperture spectrometers
  • 4.2.4. Echelle spectrometers
  • 4.3. 1D scanning architectures
  • 4.3.1. Dispersive spectrometers
  • 4.3.2. Interferometric methods
  • 4.3.3. Interferometric filter methods
  • 4.3.4. Polarization-based filter methods
  • 4.3.5. Active illumination methods
  • 4.4. Snapshot architectures
  • 4.4.1. Bowen-Walraven image slicer
  • 4.4.2. Image slicing and image mapping
  • 4.4.3. Integral field spectrometry with coherent fiber bundles (IFS-F)
  • 4.4.4. Integral field spectroscopy with lenslet arrays (IFS-L)
  • 4.4.5. Filter array camera (FAC)
  • 4.4.6. Computed tomography imaging spectrometry (CTIS)
  • 4.4.7. Coded aperture snapshot spectral imager (CASSI)
  • 4.5. Comparison of snapshot techniques
  • 4.5.1. The disadvantages of snapshot
  • 4.6. Conclusion
  • 4.7. References
  • Chapter 5. Spectral Modeling and Separation of Reflective-Fluorescent Scenes
  • 5.1. Introduction
  • 5.2. Related work
  • 5.3. Separation of reflection and fluorescence
  • 5.3.1. Reflection and fluorescence models
  • 5.3.2. Separation using high-frequency illumination
  • 5.3.3. Discussion on the illumination frequency
  • 5.3.4. Error analysis
  • 5.4. Estimating the absorption spectra
  • 5.5. Experiment results and analysis
  • 5.5.1. Experimental setup
  • 5.5.2. Quantitative evaluation of recovered spectra
  • 5.5.3. Visual separation and relighting results
  • 5.5.4. Separation by using high-frequency filters
  • 5.5.5. Ambient illumination
  • 5.6. Limitations and conclusion
  • 5.7. References
  • Chapter 6. Shape from Water
  • 6.1. Introduction
  • 6.2. Related works
  • 6.3. Light absorption in water
  • 6.4. Bispectral light absorption for depth recovery
  • 6.4.1. Bispectral depth imaging
  • 6.4.2. Depth accuracy and surface reflectance
  • 6.5. Practical shape from water
  • 6.5.1. Non-collinear/perpendicular light-camera configuration
  • 6.5.2. Perspective camera with a point source
  • 6.5.3. Non-ideal narrow-band filters
  • 6.6. Co-axial bispectral imaging system and experiment results
  • 6.6.1. System configuration and calibration
  • 6.6.2. Depth and shape accuracy
  • 6.6.3. Complex static and dynamic objects
  • 6.7. Trispectral light absorption for depth recovery
  • 6.7.1. Trispectral depth imaging
  • 6.7.2. Evaluation on the reflectance spectra database
  • 6.8. Discussions
  • 6.9. Conclusion
  • 6.10. References
  • Chapter 7. Far Infrared Light Transport Decomposition and Its Application for Thermal Photometric Stereo
  • 7.1. Introduction
  • 7.1.1. Contributions
  • 7.2. Related work
  • 7.2.1. Light transport decomposition
  • 7.2.2. Computational thermal imaging
  • 7.2.3. Photometric stereo
  • 7.3. Far infrared light transport
  • 7.4. Decomposition and application
  • 7.4.1. Far infrared light transport decomposition
  • 7.4.2. Separating the ambient component
  • 7.4.3. Separating reflection and radiation
  • 7.4.4. Separating diffuse and global radiations
  • 7.4.5. Other options
  • 7.4.6. Thermal photometric stereo
  • 7.5. Experiments
  • 7.5.1. Decomposition result
  • 7.5.2. Surface normal estimation
  • 7.6. Conclusion
  • 7.7. References
  • Chapter 8. Synthetic Wavelength Imaging: Utilizing Spectral Correlations for High-Precision Time-of-Flight Sensing
  • 8.1. Introduction
  • 8.2. Synthetic wavelength imaging
  • 8.3. Synthetic wavelength interferometry
  • 8.4. Synthetic wavelength holography
  • 8.4.1. Imaging around corners with synthetic wavelength holography
  • 8.4.2. Imaging through scattering media with synthetic wavelength holography
  • 8.4.3. Discussion and comparison with the state of the art
  • 8.5. Fundamental performance limits of synthetic wavelength imaging
  • 8.6. Conclusion and future directions
  • 8.7. Acknowledgment
  • 8.8. References
  • Part 3. Polarimetric Imaging and Processing
  • Chapter 9. Polarization-Based Shape Estimation
  • 9.1. Fundamental theory of polarization
  • 9.2. Reflection component separation
  • 9.3. Phase angle of polarization
  • 9.4. Surface normal estimation from the phase angle
  • 9.5. Degree of polarization
  • 9.6. Surface normal estimation from the degree of polarization
  • 9.7. Stokes vector
  • 9.8. Surface normal estimation from the Stokes vector
  • 9.9. References
  • Chapter 10. Shape from Polarization and Shading
  • 10.1. Introduction
  • 10.2. Related works
  • 10.2.1. Shading and polarization fusion
  • 10.2.2. Shape estimation under uncalibrated light sources
  • 10.3. Problem setting and assumptions
  • 10.4. Shading stereoscopic constraint
  • 10.5. Polarization stereoscopic constraint
  • 10.6. Normal estimation with two constraints
  • 10.6.1. Algorithm 1: Recovering individual surface points
  • 10.6.2. Algorithm 2: Recovering shape and light directions
  • 10.7. Experiments
  • 10.7.1. Simulation experiments with weights for two constraints
  • 10.7.2. Real-world experiments
  • 10.8. Conclusion and future works
  • 10.9. References
  • Chapter 11. Polarization Imaging in the Wild: Beyond the Unpolarized World Assumption
  • 11.1. Introduction
  • 11.2. Mueller calculus
  • 11.3. Polarizing filters
  • 11.3.1. Linear polarizers
  • 11.3.2. Reflectors
  • 11.4. Polarization imaging
  • 11.5. Image formation model
  • 11.5.1. Partially linearly polarized incident illumination
  • 11.5.2. Unpolarized incident illumination
  • 11.5.3. Discussion
  • 11.6. Polarization imaging reflectometry in the wild
  • 11.7. Digital Single-Lens Reflex (DSLR) setup
  • 11.7.1. Data acquisition
  • 11.7.2. Calibration
  • 11.7.3. Polarization processing pipeline
  • 11.8. Reflectance recovery
  • 11.8.1. Surface normal estimation
  • 11.8.2. Diffuse albedo estimation
  • 11.8.3. Specular component estimation
  • 11.9. Results and analysis
  • 11.9.1. Results
  • 11.9.2. Discussion and error analysis
  • 11.10. References
  • Chapter 12. Multispectral Polarization Filter Array
  • 12.1. Introduction
  • 12.2. Multispectral polarization filter array with a photonic crystal
  • 12.3. Generalization of imaging and demosaicking with multispectral
  • 12.4. Demonstration
  • 12.5. Conclusion
  • 12.6. References
  • List of Authors
  • Index
  • EULA