System parameter identification: information criteria and algorithms

Recently, criterion functions based on information theoretic measures (entropy, mutual information, information divergence) have attracted attention and become an emerging area of study in the signal processing and system identification domains. This book presents a systematic framework for system identification…


Bibliographic Details
Other Authors: Chen, Badong (author)
Format: eBook
Language: English
Published: Amsterdam; Boston; London: Elsevier, 2013.
Edition: 1st ed.
Series: Elsevier insights
See on Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009627920906719
Table of Contents:
  • Front Cover; System Parameter Identification; Copyright Page; Contents; About the Authors; Preface; Symbols and Abbreviations
  • 1 Introduction
      1.1 Elements of System Identification
      1.2 Traditional Identification Criteria
      1.3 Information Theoretic Criteria
      1.3.1 MEE Criteria
      1.3.2 Minimum Information Divergence Criteria
      1.3.3 Mutual Information-Based Criteria
      1.4 Organization of This Book
      Appendix A: Unifying Framework of ITL
  • 2 Information Measures
      2.1 Entropy
      2.2 Mutual Information
      2.3 Information Divergence
      2.4 Fisher Information
      2.5 Information Rate
      Appendix B: α-Stable Distribution
      Appendix C: Proof of (2.17)
      Appendix D: Proof of Cramer-Rao Inequality
  • 3 Information Theoretic Parameter Estimation
      3.1 Traditional Methods for Parameter Estimation
      3.1.1 Classical Estimation
      3.1.1.1 ML Estimation
      3.1.1.2 Method of Moments
      3.1.2 Bayes Estimation
      3.2 Information Theoretic Approaches to Classical Estimation
      3.2.1 Entropy Matching Method
      3.2.2 Maximum Entropy Method
      3.2.2.1 Parameter Estimation of Exponential Type Distribution
      3.2.2.2 Maximum Spacing Estimation
      3.2.2.3 Maximum Equality Estimation
      3.2.3 Minimum Divergence Estimation
      3.3 Information Theoretic Approaches to Bayes Estimation
      3.3.1 Minimum Error Entropy Estimation
      3.3.1.1 Some Properties of MEE Criterion
      3.3.1.2 Relationship to Conventional Bayes Risks
      3.3.2 MC Estimation
      3.4 Information Criteria for Model Selection
      Appendix E: EM Algorithm
      Appendix F: Minimum MSE Estimation
      Appendix G: Derivation of AIC Criterion
  • 4 System Identification Under Minimum Error Entropy Criteria
      4.1 Brief Sketch of System Parameter Identification
      4.1.1 Model Structure
      4.1.2 Criterion Function
      4.1.3 Identification Algorithm
      4.1.3.1 Batch Identification
      4.1.3.2 Online Identification
      4.1.3.3 Recursive Least Squares Algorithm
      4.1.3.4 Least Mean Square Algorithm
      4.1.3.5 Kernel Adaptive Filtering Algorithms
      4.2 MEE Identification Criterion
      4.2.1 Common Approaches to Entropy Estimation
      4.2.1.1 Integral Estimate
      4.2.1.2 Resubstitution Estimate
      4.2.1.3 Splitting Data Estimate
      4.2.1.4 Cross-validation Estimate
      4.2.2 Empirical Error Entropies Based on KDE
      4.3 Identification Algorithms Under MEE Criterion
      4.3.1 Nonparametric Information Gradient Algorithms
      4.3.1.1 BIG Algorithm
      4.3.1.2 Sliding Information Gradient Algorithm
      4.3.1.3 FRIG Algorithm
      4.3.1.4 SIG Algorithm
      4.3.2 Parametric IG Algorithms
      4.3.3 Fixed-Point Minimum Error Entropy Algorithm
      4.3.4 Kernel Minimum Error Entropy Algorithm
      4.3.5 Simulation Examples
      4.4 Convergence Analysis
      4.4.1 Convergence Analysis Based on Approximate Linearization
      4.4.2 Energy Conservation Relation
      4.4.3 Mean Square Convergence Analysis Based on Energy Conservation Relation
      4.4.3.1 Sufficient Condition for Mean Square Convergence
      4.4.3.2 Mean Square Convergence Curve
      4.4.3.3 Mean Square Steady-State Performance
      4.5 Optimization of φ-Entropy Criterion
  • 4.6 Survival Information Potential Criterion
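Several of the Chapter 4 entries (empirical error entropies based on KDE, the information gradient algorithms) revolve around one construction from information theoretic learning: estimate Rényi's quadratic entropy of the identification error from samples via a Gaussian kernel, then adapt the model parameters by gradient ascent on the resulting "information potential". A minimal sketch of that idea for a linear-in-parameters model follows; the function names, step size, and kernel width are illustrative assumptions, not taken from the book:

```python
import numpy as np

def information_potential(e, sigma=1.0):
    """KDE-based quadratic information potential of an error sample:
    V(e) = (1/N^2) * sum_{i,j} G(e_i - e_j) with a Gaussian kernel;
    -log V(e) is an empirical estimate of Renyi's quadratic entropy."""
    diff = e[:, None] - e[None, :]                       # all pairwise error differences
    kern = np.exp(-diff**2 / (4 * sigma**2)) / (2 * sigma * np.sqrt(np.pi))
    return kern.mean()

def renyi_quadratic_entropy(e, sigma=1.0):
    """Empirical quadratic error entropy; the MEE criterion minimizes this."""
    return -np.log(information_potential(e, sigma))

def mee_gradient_step(w, X, d, sigma=1.0, eta=0.3):
    """One batch gradient step under the MEE criterion for the linear model
    e = d - X @ w: ascend the information potential (descend the entropy)."""
    e = d - X @ w
    diff = e[:, None] - e[None, :]                       # e_i - e_j
    kern = np.exp(-diff**2 / (4 * sigma**2))             # unnormalized Gaussian kernel
    # dV/dw: kernel-weighted pairwise error differences times input differences
    pair = (kern * diff)[:, :, None] * (X[:, None, :] - X[None, :, :])
    grad = pair.mean(axis=(0, 1)) / (2 * sigma**2)
    return w + eta * grad
```

Note that entropy is invariant to a constant shift of the error, so practical MEE identification schemes typically adjust the model's bias term separately after the weights converge.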