Nonlinear Filters: Theory and Applications

"This book fills the gap between the literature on nonlinear filters and nonlinear observers by presenting a new state estimation strategy, the smooth variable structure filter (SVSF). The book is a valuable resource to researchers outside of the control society, where literature on nonlinear o...

Bibliographic Details
Other Authors: Setoodeh, Peyman, 1974- (author), Habibi, Saeid (author), Haykin, Simon S., 1931- (author)
Format: Electronic book
Language: English
Published: Hoboken, New Jersey: John Wiley & Sons, Inc., [2022]
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009657537606719
Table of Contents:
  • Cover
  • Title Page
  • Copyright
  • Contents
  • List of Figures
  • List of Tables
  • Preface
  • Acknowledgments
  • Acronyms
  • Chapter 1 Introduction
  • 1.1 State of a Dynamic System
  • 1.2 State Estimation
  • 1.3 Construals of Computing
  • 1.4 Statistical Modeling
  • 1.5 Vision for the Book
  • Chapter 2 Observability
  • 2.1 Introduction
  • 2.2 State‐Space Model
  • 2.3 The Concept of Observability
  • 2.4 Observability of Linear Time‐Invariant Systems
  • 2.4.1 Continuous‐Time LTI Systems
  • 2.4.2 Discrete‐Time LTI Systems
  • 2.4.3 Discretization of LTI Systems
  • 2.5 Observability of Linear Time‐Varying Systems
  • 2.5.1 Continuous‐Time LTV Systems
  • 2.5.2 Discrete‐Time LTV Systems
  • 2.5.3 Discretization of LTV Systems
  • 2.6 Observability of Nonlinear Systems
  • 2.6.1 Continuous‐Time Nonlinear Systems
  • 2.6.2 Discrete‐Time Nonlinear Systems
  • 2.6.3 Discretization of Nonlinear Systems
  • 2.7 Observability of Stochastic Systems
  • 2.8 Degree of Observability
  • 2.9 Invertibility
  • 2.10 Concluding Remarks
  • Chapter 3 Observers
  • 3.1 Introduction
  • 3.2 Luenberger Observer
  • 3.3 Extended Luenberger‐Type Observer
  • 3.4 Sliding‐Mode Observer
  • 3.5 Unknown‐Input Observer
  • 3.6 Concluding Remarks
  • Chapter 4 Bayesian Paradigm and Optimal Nonlinear Filtering
  • 4.1 Introduction
  • 4.2 Bayes' Rule
  • 4.3 Optimal Nonlinear Filtering
  • 4.4 Fisher Information
  • 4.5 Posterior Cramér-Rao Lower Bound
  • 4.6 Concluding Remarks
  • Chapter 5 Kalman Filter
  • 5.1 Introduction
  • 5.2 Kalman Filter
  • 5.3 Kalman Smoother
  • 5.4 Information Filter
  • 5.5 Extended Kalman Filter
  • 5.6 Extended Information Filter
  • 5.7 Divided‐Difference Filter
  • 5.8 Unscented Kalman Filter
  • 5.9 Cubature Kalman Filter
  • 5.10 Generalized PID Filter
  • 5.11 Gaussian‐Sum Filter
  • 5.12 Applications
  • 5.12.1 Information Fusion
  • 5.12.2 Augmented Reality
  • 5.12.3 Urban Traffic Network
  • 5.12.4 Cybersecurity of Power Systems
  • 5.12.5 Incidence of Influenza
  • 5.12.6 COVID‐19 Pandemic
  • 5.13 Concluding Remarks
  • Chapter 6 Particle Filter
  • 6.1 Introduction
  • 6.2 Monte Carlo Method
  • 6.3 Importance Sampling
  • 6.4 Sequential Importance Sampling
  • 6.5 Resampling
  • 6.6 Sample Impoverishment
  • 6.7 Choosing the Proposal Distribution
  • 6.8 Generic Particle Filter
  • 6.9 Applications
  • 6.9.1 Simultaneous Localization and Mapping
  • 6.10 Concluding Remarks
  • Chapter 7 Smooth Variable‐Structure Filter
  • 7.1 Introduction
  • 7.2 The Switching Gain
  • 7.3 Stability Analysis
  • 7.4 Smoothing Subspace
  • 7.5 Filter Corrective Term for Linear Systems
  • 7.6 Filter Corrective Term for Nonlinear Systems
  • 7.7 Bias Compensation
  • 7.8 The Secondary Performance Indicator
  • 7.9 Second‐Order Smooth Variable Structure Filter
  • 7.10 Optimal Smoothing Boundary Design
  • 7.11 Combination of SVSF with Other Filters
  • 7.12 Applications
  • 7.12.1 Multiple Target Tracking
  • 7.12.2 Battery State‐of‐Charge Estimation
  • 7.12.3 Robotics
  • 7.13 Concluding Remarks
  • Chapter 8 Deep Learning
  • 8.1 Introduction
  • 8.2 Gradient Descent
  • 8.3 Stochastic Gradient Descent
  • 8.4 Natural Gradient Descent
  • 8.5 Neural Networks
  • 8.6 Backpropagation
  • 8.7 Backpropagation Through Time
  • 8.8 Regularization
  • 8.9 Initialization
  • 8.10 Convolutional Neural Network
  • 8.11 Long Short‐Term Memory
  • 8.12 Hebbian Learning
  • 8.13 Gibbs Sampling
  • 8.14 Boltzmann Machine
  • 8.15 Autoencoder
  • 8.16 Generative Adversarial Network
  • 8.17 Transformer
  • 8.18 Concluding Remarks
  • Chapter 9 Deep Learning‐Based Filters
  • 9.1 Introduction
  • 9.2 Variational Inference
  • 9.3 Amortized Variational Inference
  • 9.4 Deep Kalman Filter
  • 9.5 Backpropagation Kalman Filter
  • 9.6 Differentiable Particle Filter
  • 9.7 Deep Rao-Blackwellized Particle Filter
  • 9.8 Deep Variational Bayes Filter
  • 9.9 Kalman Variational Autoencoder
  • 9.10 Deep Variational Information Bottleneck
  • 9.11 Wasserstein Distributionally Robust Kalman Filter
  • 9.12 Hierarchical Invertible Neural Transport
  • 9.13 Applications
  • 9.13.1 Prediction of Drug Effect
  • 9.13.2 Autonomous Driving
  • 9.14 Concluding Remarks
  • Chapter 10 Expectation Maximization
  • 10.1 Introduction
  • 10.2 Expectation Maximization Algorithm
  • 10.3 Particle Expectation Maximization
  • 10.4 Expectation Maximization for Gaussian Mixture Models
  • 10.5 Neural Expectation Maximization
  • 10.6 Relational Neural Expectation Maximization
  • 10.7 Variational Filtering Expectation Maximization
  • 10.8 Amortized Variational Filtering Expectation Maximization
  • 10.9 Applications
  • 10.9.1 Stochastic Volatility
  • 10.9.2 Physical Reasoning
  • 10.9.3 Speech, Music, and Video Modeling
  • 10.10 Concluding Remarks
  • Chapter 11 Reinforcement Learning‐Based Filter
  • 11.1 Introduction
  • 11.2 Reinforcement Learning
  • 11.3 Variational Inference as Reinforcement Learning
  • 11.4 Application
  • 11.4.1 Battery State‐of‐Charge Estimation
  • 11.5 Concluding Remarks
  • Chapter 12 Nonparametric Bayesian Models
  • 12.1 Introduction
  • 12.2 Parametric vs Nonparametric Models
  • 12.3 Measure‐Theoretic Probability
  • 12.4 Exchangeability
  • 12.5 Kolmogorov Extension Theorem
  • 12.6 Extension of Bayesian Models
  • 12.7 Conjugacy
  • 12.8 Construction of Nonparametric Bayesian Models
  • 12.9 Posterior Computability
  • 12.10 Algorithmic Sufficiency
  • 12.11 Applications
  • 12.11.1 Multiple Object Tracking
  • 12.11.2 Data‐Driven Probabilistic Optimal Power Flow
  • 12.11.3 Analyzing Single‐Molecule Tracks
  • 12.12 Concluding Remarks
  • References
  • Index
  • EULA