Time Series: Modeling, Computation, and Inference

Focusing on Bayesian approaches and computations using analytic and simulation-based methods for inference, Time Series: Modeling, Computation, and Inference, Second Edition integrates mainstream approaches for time series modeling with significant recent developments in methodology and applications...


Bibliographic Details
Other Authors: Prado, Raquel (author); West, Mike, 1959- (author); Ferreira, Marco Antonio Rosa, 1969- (author)
Format: eBook
Language: English
Published: Boca Raton, Florida ; Abingdon, Oxon : CRC Press, [2021]
Edition: 2nd ed.
Series: Chapman and Hall/CRC Texts in Statistical Science
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009757928306719
Table of Contents:
  • Cover
  • Half Title
  • Series Page
  • Title Page
  • Copyright Page
  • Contents
  • Preface
  • Authors
  • 1. Notation, definitions, and basic inference
  • 1.1. Problem Areas and Objectives
  • 1.2. Stochastic Processes and Stationarity
  • 1.3. Autocorrelation and Cross-correlation
  • 1.4. Smoothing and Differencing
  • 1.5. A Primer on Likelihood and Bayesian Inference
  • 1.5.1. ML, MAP, and LS Estimation
  • 1.5.2. Traditional Least Squares
  • 1.5.3. Full Bayesian Analysis
  • 1.5.3.1. Reference Bayesian Analysis
  • 1.5.3.2. Conjugate Bayesian Analysis
  • 1.5.4. Nonconjugate Bayesian Analysis
  • 1.5.5. Posterior Sampling
  • 1.5.5.1. The Metropolis-Hastings Algorithm
  • 1.5.5.2. Gibbs Sampling
  • 1.5.5.3. Convergence
  • 1.6. Appendix
  • 1.6.1. The Uniform Distribution
  • 1.6.2. The Univariate Normal Distribution
  • 1.6.3. The Multivariate Normal Distribution
  • 1.6.4. The Gamma and Inverse-gamma Distributions
  • 1.6.5. The Exponential Distribution
  • 1.6.6. The Chi-square Distribution
  • 1.6.7. The Inverse Chi-square Distributions
  • 1.6.8. The Univariate Student-t Distribution
  • 1.6.9. The Multivariate Student-t Distribution
  • 1.7. Problems
  • 2. Traditional time domain models
  • 2.1. Structure of Autoregressions
  • 2.1.1. Stationarity in AR Processes
  • 2.1.2. State-Space Representation of an AR(p)
  • 2.1.3. Characterization of AR(2) Processes
  • 2.1.4. Autocorrelation Structure of an AR(p)
  • 2.1.5. The Partial Autocorrelation Function
  • 2.2. Forecasting
  • 2.3. Estimation in AR Models
  • 2.3.1. Yule-Walker and Maximum Likelihood
  • 2.3.2. Basic Bayesian Inference for AR Models
  • 2.3.3. Simulation of Posterior Distributions
  • 2.3.4. Order Assessment
  • 2.3.5. Initial values and Missing Data
  • 2.3.6. Imputing Initial Values via Simulation
  • 2.4. Further Issues in Bayesian Inference for AR Models
  • 2.4.1. Sensitivity to the Choice of Prior Distributions
  • 2.4.1.1. Analysis Based on Normal Priors
  • 2.4.1.2. Discrete Normal Mixture Prior and Subset Models
  • 2.4.2. Alternative Prior Distributions
  • 2.4.2.1. Scale-mixtures and Smoothness Priors
  • 2.4.2.2. Priors Based on AR Latent Structure
  • 2.5. Autoregressive Moving Average Models (ARMA)
  • 2.5.1. Structure of ARMA Models
  • 2.5.2. Autocorrelation and Partial Autocorrelation Functions
  • 2.5.3. Inversion of AR Components
  • 2.5.4. Forecasting and Estimation of ARMA Processes
  • 2.5.4.1. Forecasting ARMA Models
  • 2.5.4.2. MLE and Least Squares Estimation
  • 2.5.4.3. State-space Representation
  • 2.5.4.4. Bayesian Estimation of ARMA Processes
  • 2.6. Other Models
  • 2.7. Appendix
  • 2.7.1. The Reversible Jump MCMC Algorithm
  • 2.7.2. The Binomial Distribution
  • 2.7.3. The Beta Distribution
  • 2.7.4. The Dirichlet Distribution
  • 2.7.5. The Beta-binomial Distribution
  • 2.8. Problems
  • 3. The frequency domain
  • 3.1. Harmonic Regression
  • 3.1.1. The One-component Model
  • 3.1.1.1. Reference Analysis
  • 3.1.2. The Periodogram
  • 3.1.3. Some Data Analyses
  • 3.1.4. Several Uncertain Frequency Components
  • 3.1.5. Harmonic Component Models of Known Period
  • 3.1.6. The Periodogram (revisited)
  • 3.2. Some Spectral Theory
  • 3.2.1. Spectral Representation of a Time Series Process
  • 3.2.2. Representation of Autocorrelation Functions
  • 3.2.3. Other Facts and Examples
  • 3.2.4. Traditional Nonparametric Spectral Analysis
  • 3.3. Discussion and Extensions
  • 3.3.1. Long Memory Time Series Models
  • 3.4. Appendix
  • 3.4.1. The F Distribution
  • 3.4.2. Distributions of Quadratic Forms
  • 3.4.3. Orthogonality of Harmonics
  • 3.4.4. Complex Valued Random Variables
  • 3.4.5. Orthogonal Increments Processes
  • 3.4.5.1. Real-valued Orthogonal Increments Processes
  • 3.4.5.2. Complex-valued Orthogonal Increments Processes
  • 3.5. Problems
  • 4. Dynamic linear models
  • 4.1. General Linear Model Structures
  • 4.2. Forecast Functions and Model Forms
  • 4.2.1. Superposition of Models
  • 4.2.2. Time Series Models
  • 4.3. Inference in DLMs: Basic Normal Theory
  • 4.3.1. Sequential Updating: Filtering
  • 4.3.2. Learning a Constant Observation Variance
  • 4.3.3. Missing and Unequally Spaced Data
  • 4.3.4. Forecasting
  • 4.3.5. Retrospective Updating: Smoothing
  • 4.3.6. Discounting for DLM State Evolution Variances
  • 4.3.7. Stochastic Variances and Discount Learning
  • 4.3.7.1. References and additional comments
  • 4.3.8. Intervention, Monitoring, and Model Performance
  • 4.3.8.1. Intervention
  • 4.3.8.2. Model monitoring and performance
  • 4.4. Extensions: Non-Gaussian and Nonlinear Models
  • 4.5. Posterior Simulation: MCMC Algorithms
  • 4.5.1. Examples
  • 4.6. Problems
  • 5. State-space TVAR models
  • 5.1. Time-Varying Autoregressions and Decompositions
  • 5.1.1. Basic DLM Decomposition
  • 5.1.2. Latent Structure in TVAR Models
  • 5.1.2.1. Decompositions for standard autoregressions
  • 5.1.2.2. Decompositions in the TVAR case
  • 5.1.3. Interpreting Latent TVAR Structure
  • 5.2. TVAR Model Specification and Posterior Inference
  • 5.3. Extensions
  • 5.4. Problems
  • 6. SMC methods for state-space models
  • 6.1. General State-Space Models
  • 6.2. Posterior Simulation: Sequential Monte Carlo
  • 6.2.1. Sequential Importance Sampling and Resampling
  • 6.2.2. The Auxiliary Particle Filter
  • 6.2.3. SMC for Combined State and Parameter Estimation
  • 6.2.3.1. Algorithm of Liu and West
  • 6.2.3.2. Storvik's algorithm
  • 6.2.3.3. Practical filtering
  • 6.2.3.4. Particle learning methods
  • 6.2.4. Smoothing
  • 6.2.5. Examples
  • 6.3. Problems
  • 7. Mixture models in time series
  • 7.1. Markov Switching Models
  • 7.1.1. Parameter Estimation
  • 7.1.2. Other Models
  • 7.2. Multiprocess Models
  • 7.2.1. Definitions and Examples
  • 7.2.2. Posterior Inference
  • 7.2.2.1. Posterior inference in class I models
  • 7.2.2.2. Posterior inference in class II models
  • 7.3. Mixtures of General State-Space Models
  • 7.4. Case Study: Detecting Fatigue from EEGs
  • 7.4.1. Structured Priors in Multi-AR Models
  • 7.4.2. Posterior Inference
  • 7.5. Univariate Stochastic Volatility Models
  • 7.5.1. Zero-Mean AR(1) SV Model
  • 7.5.2. Normal Mixture Approximation
  • 7.5.3. Centered Parameterization
  • 7.5.4. MCMC Analysis
  • 7.5.5. Further Comments
  • 7.6. Problems
  • 8. Topics and examples in multiple time series
  • 8.1. Multichannel Modeling of EEG Data
  • 8.1.1. Multiple Univariate TVAR Models
  • 8.1.2. A Simple Factor Model
  • 8.2. Some Spectral Theory
  • 8.2.1. The Cross-Spectrum and Cross-Periodogram
  • 8.3. Dynamic Lag/Lead Models
  • 8.4. Other Approaches
  • 8.5. Problems
  • 9. Vector AR and ARMA models
  • 9.1. Vector Autoregressive Models
  • 9.1.1. State-Space Representation of a VAR Process
  • 9.1.2. The Moving Average Representation of a VAR Process
  • 9.1.3. VAR Time Series Decompositions
  • 9.2. Vector ARMA Models
  • 9.2.1. Autocovariances and Cross-covariances
  • 9.2.2. Partial Autoregression Matrix Function
  • 9.2.3. VAR(1) and DLM Representations
  • 9.3. Estimation in VARMA
  • 9.3.1. Identifiability
  • 9.3.2. Least Squares Estimation
  • 9.3.3. Maximum Likelihood Estimation
  • 9.3.3.1. Conditional likelihood
  • 9.3.3.2. Exact likelihood
  • 9.4. Bayesian VAR, TV-VAR, and DDNMs
  • 9.5. Mixtures of VAR Processes
  • 9.6. PARCOR Representations and Spectral Analysis
  • 9.6.1. Spectral Matrix of VAR and VARMA Processes
  • 9.7. Problems
  • 10. General classes of multivariate dynamic models
  • 10.1. Theory of Multivariate and Matrix Normal DLMs
  • 10.1.1. Multivariate Normal DLMs
  • 10.1.2. Matrix Normal DLMs and Exchangeable Time Series
  • 10.2. Multivariate DLMs and Exchangeable Time Series
  • 10.2.1. Sequential Updating
  • 10.2.2. Forecasting and Retrospective Smoothing
  • 10.3. Learning Cross-Series Covariances
  • 10.3.1. Sequential Updating
  • 10.3.2. Forecasting and Retrospective Smoothing
  • 10.4. Time-Varying Covariance Matrices
  • 10.4.1. Introductory Discussion
  • 10.4.2. Wishart Matrix Discounting Models
  • 10.4.3. Matrix Beta Evolution Model
  • 10.4.4. DLM Extension and Sequential Updating
  • 10.4.5. Retrospective Analysis
  • 10.4.6. Financial Time Series Volatility Example
  • 10.4.6.1. Data and model
  • 10.4.6.2. Trajectories of multivariate stochastic volatility
  • 10.4.6.3. Time-varying principal components analysis
  • 10.4.6.4. Latent components in multivariate volatility
  • 10.4.7. Short-term Forecasting for Portfolio Decisions
  • 10.4.7.1. Additional comments and extensions
  • 10.4.8. Beta-Bartlett Wishart Models for Stochastic Volatility
  • 10.4.8.1. Discount model variants
  • 10.4.8.2. Additional comments and current research areas
  • 10.5. Multivariate Dynamic Graphical Models
  • 10.5.1. Gaussian Graphical Models
  • 10.5.2. Dynamic Graphical Models
  • 10.6. Selected Recent Developments
  • 10.6.1. Simultaneous Graphical Dynamic Models
  • 10.6.2. Models for Multivariate Time Series of Counts
  • 10.6.3. Models for Flows on Dynamic Networks
  • 10.6.4. Dynamic Multiscale Models
  • 10.7. Appendix
  • 10.7.1. The Matrix Normal Distribution
  • 10.7.2. The Wishart Distribution
  • 10.7.3. The Inverse Wishart Distribution
  • 10.7.3.1. Point estimates of variance matrices
  • 10.7.4. The Normal, Inverse Wishart Distribution
  • 10.7.5. The Matrix Normal, Inverse Wishart Distribution
  • 10.7.6. Hyper-Inverse Wishart Distributions
  • 10.7.6.1. Decomposable graphical models
  • 10.7.6.2. The hyper-inverse Wishart distribution