Discrete Stochastic Processes and Optimal Filtering
Optimal filtering applied to stationary and non-stationary signals provides the most efficient means of dealing with problems arising from the extraction of signals observed in noise. Moreover, it is a fundamental feature in a range of applications, such as navigation in aerospace and aeronautics, filter pro...
Other Authors:
Format: Electronic book
Language: English
Published: Newport Beach, California : ISTE, 2007
Edition: 1st edition
Series: ISTE
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009626917506719
Table of Contents:
- Preface
- Introduction
- Chapter 1. Random Vectors
  - 1.1. Definitions and general properties
  - 1.2. Spaces L1(dP) and L2(dP)
    - 1.2.1. Definitions
    - 1.2.2. Properties
  - 1.3. Mathematical expectation and applications
    - 1.3.1. Definitions
    - 1.3.2. Characteristic functions of a random vector
  - 1.4. Second order random variables and vectors
  - 1.5. Linear independence of vectors of L2(dP)
  - 1.6. Conditional expectation (concerning random vectors with density function)
  - 1.7. Exercises for Chapter 1
- Chapter 2. Gaussian Vectors
  - 2.1. Some reminders regarding random Gaussian vectors
  - 2.2. Definition and characterization of Gaussian vectors
  - 2.3. Results relative to independence
  - 2.4. Affine transformation of a Gaussian vector
  - 2.5. The existence of Gaussian vectors
  - 2.6. Exercises for Chapter 2
- Chapter 3. Introduction to Discrete Time Processes
  - 3.1. Definition
  - 3.2. WSS processes and spectral measure
    - 3.2.1. Spectral density
  - 3.3. Spectral representation of a WSS process
    - 3.3.1. Problem
    - 3.3.2. Results
      - 3.3.2.1. Process with orthogonal increments and associated measurements
      - 3.3.2.2. Wiener stochastic integral
      - 3.3.2.3. Spectral representation
  - 3.4. Introduction to digital filtering
  - 3.5. Important example: autoregressive process
  - 3.6. Exercises for Chapter 3
- Chapter 4. Estimation
  - 4.1. Position of the problem
  - 4.2. Linear estimation
  - 4.3. Best estimate - conditional expectation
  - 4.4. Example: prediction of an autoregressive process AR(1)
  - 4.5. Multivariate processes
  - 4.6. Exercises for Chapter 4
- Chapter 5. The Wiener Filter (a numerical sketch follows this list)
  - 5.1. Introduction
    - 5.1.1. Problem position
  - 5.2. Resolution and calculation of the FIR filter
  - 5.3. Evaluation of the least error
  - 5.4. Resolution and calculation of the IIR filter
  - 5.5. Evaluation of least mean square error
  - 5.6. Exercises for Chapter 5
- Chapter 6. Adaptive Filtering: Algorithm of the Gradient and the LMS (see the LMS sketch after this list)
  - 6.1. Introduction
  - 6.2. Position of problem
  - 6.3. Data representation
  - 6.4. Minimization of the cost function
    - 6.4.1. Calculation of the cost function
  - 6.5. Gradient algorithm
  - 6.6. Geometric interpretation
  - 6.7. Stability and convergence
  - 6.8. Estimation of gradient and LMS algorithm
    - 6.8.1. Convergence of the algorithm of the LMS
  - 6.9. Example of the application of the LMS algorithm
  - 6.10. Exercises for Chapter 6
- Chapter 7. The Kalman Filter (see the scalar Kalman sketch after this list)
  - 7.1. Position of problem
  - 7.2. Approach to estimation
    - 7.2.1. Scalar case
    - 7.2.2. Multivariate case
  - 7.3. Kalman filtering
    - 7.3.1. State equation
    - 7.3.2. Observation equation
    - 7.3.3. Innovation process
    - 7.3.4. Covariance matrix of the innovation process
    - 7.3.5. Estimation
    - 7.3.6. Riccati's equation
    - 7.3.7. Algorithm and summary
  - 7.4. Exercises for Chapter 7
- Table of Symbols and Notations
- Bibliography
- Index
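The linear estimation of Chapters 4-5 culminates in the FIR Wiener filter, whose coefficients solve the Wiener-Hopf normal equations built from the input autocorrelation and the desired/input cross-correlation. The following is a minimal NumPy/SciPy sketch, not the book's own development: the AR(1) signal, the noise level, the filter order `p`, and all variable names are invented for illustration.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

# Hypothetical setup: estimate a desired signal d(n) from a noisy
# observation x(n) = d(n) + v(n) with an order-p FIR Wiener filter.
rng = np.random.default_rng(0)
n, p = 10_000, 8

# AR(1) "desired" signal d(n) = 0.9 d(n-1) + u(n), plus white noise.
u = rng.standard_normal(n)
d = np.zeros(n)
for k in range(1, n):
    d[k] = 0.9 * d[k - 1] + u[k]
x = d + 0.5 * rng.standard_normal(n)

# Sample estimates of r_x(k) (autocorrelation of the observation)
# and r_dx(k) = E[d(n) x(n-k)] (cross-correlation).
r_x = np.array([np.mean(x[k:] * x[: n - k]) for k in range(p)])
r_dx = np.array([np.mean(d[k:] * x[: n - k]) for k in range(p)])

# Wiener-Hopf normal equations R_x w = r_dx; R_x is Toeplitz,
# so a fast Levinson-type solver applies.
w = solve_toeplitz(r_x, r_dx)

# Filter the observation and compare mean square errors.
d_hat = np.convolve(x, w)[:n]
print("MSE raw   :", np.mean((d - x) ** 2))
print("MSE Wiener:", np.mean((d - d_hat) ** 2))
```

The Toeplitz structure exploited by `solve_toeplitz` is exactly what makes the FIR Wiener problem cheap relative to a general linear system.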
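Chapter 6's LMS algorithm is a stochastic-gradient approximation to that Wiener solution: instead of solving the normal equations in batch, it nudges the weight vector along the instantaneous error gradient at each sample. Below is a minimal sketch under assumed settings; the step size `mu`, filter order `p`, and the system-identification scenario are illustrative choices, not the book's.

```python
import numpy as np

def lms(x, d, p=8, mu=0.01):
    """LMS adaptive FIR filter: w <- w + mu * e(n) * x_vec(n)."""
    w = np.zeros(p)
    e = np.zeros(len(x))
    for n in range(p, len(x)):
        x_vec = x[n - p + 1 : n + 1][::-1]  # most recent sample first
        y = w @ x_vec                        # filter output
        e[n] = d[n] - y                      # estimation error
        w += mu * e[n] * x_vec               # stochastic-gradient update
    return w, e

# Hypothetical system identification: recover an unknown FIR channel h.
rng = np.random.default_rng(1)
h = np.array([0.5, -0.3, 0.2, 0.1])
x = rng.standard_normal(20_000)
d = np.convolve(x, h)[: len(x)] + 0.01 * rng.standard_normal(len(x))
w, e = lms(x, d, p=4, mu=0.02)
print("true h     :", h)
print("LMS weights:", np.round(w, 3))
```

For a stationary input, convergence in the mean requires the step size to be small relative to the input power; this stability condition is the subject of sections 6.7-6.8.1.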
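Chapter 7's Kalman filter alternates a prediction step with an innovation-driven update, the estimation error variance evolving by a Riccati recursion. The scalar sketch below mirrors the structure of sections 7.3.1-7.3.7 rather than reproducing any listing from the book; the model parameters `a`, `c`, `q`, `r` are made up for the example.

```python
import numpy as np

# State equation        x(k+1) = a x(k) + w(k),  w ~ N(0, q)
# Observation equation  y(k)   = c x(k) + v(k),  v ~ N(0, r)
a, c, q, r = 0.95, 1.0, 0.1, 1.0

rng = np.random.default_rng(2)
n = 200
x = np.zeros(n)
for k in range(1, n):
    x[k] = a * x[k - 1] + rng.normal(scale=np.sqrt(q))
y = c * x + rng.normal(scale=np.sqrt(r), size=n)

x_hat, p = 0.0, 1.0          # initial estimate and error variance
x_filt = np.zeros(n)
for k in range(n):
    # Prediction step.
    x_pred = a * x_hat
    p_pred = a * a * p + q              # Riccati recursion (prediction)
    # Innovation process and its variance.
    innov = y[k] - c * x_pred
    s = c * c * p_pred + r
    # Kalman gain and update step.
    gain = p_pred * c / s
    x_hat = x_pred + gain * innov
    p = (1.0 - gain * c) * p_pred       # Riccati recursion (update)
    x_filt[k] = x_hat

print("MSE raw observations:", np.mean((y / c - x) ** 2))
print("MSE Kalman filtered :", np.mean((x_filt - x) ** 2))
```

The same predict/update structure carries over to the multivariate case of section 7.2.2, with the scalar divisions replaced by matrix inverses of the innovation covariance.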