Time series forecasting in Python

Build predictive models from time-based patterns in your data. Master statistical models, including new deep learning approaches, for time series forecasting. In Time Series Forecasting in Python you will learn how to: Recognize a time series forecasting problem and build a performant predictive m...

Full description

Bibliographic Details
Other Authors: Peixeiro, Marco (author)
Format: Electronic book
Language: English
Published: Shelter Island, New York : Manning Publications Co. [2022]
Edition: [First edition]
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009701328306719
Table of Contents:
  • Intro
  • inside front cover
  • Time Series Forecasting in Python
  • Copyright
  • dedication
  • contents
  • front matter
  • preface
  • acknowledgments
  • about this book
  • Who should read this book?
  • How this book is organized: A roadmap
  • About the code
  • liveBook discussion forum
  • Author online
  • about the author
  • about the cover illustration
  • Part 1. Time waits for no one
  • 1 Understanding time series forecasting
  • 1.1 Introducing time series
  • 1.1.1 Components of a time series
  • 1.2 Bird's-eye view of time series forecasting
  • 1.2.1 Setting a goal
  • 1.2.2 Determining what must be forecast to achieve your goal
  • 1.2.3 Setting the horizon of the forecast
  • 1.2.4 Gathering the data
  • 1.2.5 Developing a forecasting model
  • 1.2.6 Deploying to production
  • 1.2.7 Monitoring
  • 1.2.8 Collecting new data
  • 1.3 How time series forecasting is different from other regression tasks
  • 1.3.1 Time series have an order
  • 1.3.2 Time series sometimes do not have features
  • 1.4 Next steps
  • Summary
  • 2 A naive prediction of the future
  • 2.1 Defining a baseline model
  • 2.2 Forecasting the historical mean
  • 2.2.1 Setup for baseline implementations
  • 2.2.2 Implementing the historical mean baseline
  • 2.3 Forecasting last year's mean
  • 2.4 Predicting using the last known value
  • 2.5 Implementing the naive seasonal forecast
  • 2.6 Next steps
  • Summary
  • 3 Going on a random walk
  • 3.1 The random walk process
  • 3.1.1 Simulating a random walk process
  • 3.2 Identifying a random walk
  • 3.2.1 Stationarity
  • 3.2.2 Testing for stationarity
  • 3.2.3 The autocorrelation function
  • 3.2.4 Putting it all together
  • 3.2.5 Is GOOGL a random walk?
  • 3.3 Forecasting a random walk
  • 3.3.1 Forecasting on a long horizon
  • 3.3.2 Forecasting the next timestep
  • 3.4 Next steps
  • 3.5 Exercises
  • 3.5.1 Simulate and forecast a random walk
  • 3.5.2 Forecast the daily closing price of GOOGL
  • 3.5.3 Forecast the daily closing price of a stock of your choice
  • Summary
  • Part 2. Forecasting with statistical models
  • 4 Modeling a moving average process
  • 4.1 Defining a moving average process
  • 4.1.1 Identifying the order of a moving average process
  • 4.2 Forecasting a moving average process
  • 4.3 Next steps
  • 4.4 Exercises
  • 4.4.1 Simulate an MA(2) process and make forecasts
  • 4.4.2 Simulate an MA(q) process and make forecasts
  • Summary
  • 5 Modeling an autoregressive process
  • 5.1 Predicting the average weekly foot traffic in a retail store
  • 5.2 Defining the autoregressive process
  • 5.3 Finding the order of a stationary autoregressive process
  • 5.3.1 The partial autocorrelation function (PACF)
  • 5.4 Forecasting an autoregressive process
  • 5.5 Next steps
  • 5.6 Exercises
  • 5.6.1 Simulate an AR(2) process and make forecasts
  • 5.6.2 Simulate an AR(p) process and make forecasts
  • Summary
  • 6 Modeling complex time series
  • 6.1 Forecasting bandwidth usage for data centers
  • 6.2 Examining the autoregressive moving average process
  • 6.3 Identifying a stationary ARMA process
  • 6.4 Devising a general modeling procedure
  • 6.4.1 Understanding the Akaike information criterion (AIC)
  • 6.4.2 Selecting a model using the AIC
  • 6.4.3 Understanding residual analysis
  • 6.4.4 Performing residual analysis
  • 6.5 Applying the general modeling procedure
  • 6.6 Forecasting bandwidth usage
  • 6.7 Next steps
  • 6.8 Exercises
  • 6.8.1 Make predictions on the simulated ARMA(1,1) process
  • 6.8.2 Simulate an ARMA(2,2) process and make forecasts
  • Summary
  • 7 Forecasting non-stationary time series
  • 7.1 Defining the autoregressive integrated moving average model
  • 7.2 Modifying the general modeling procedure to account for non-stationary series
  • 7.3 Forecasting a non-stationary time series
  • 7.4 Next steps
  • 7.5 Exercises
  • 7.5.1 Apply the ARIMA(p,d,q) model on the datasets from chapters 4, 5, and 6
  • Summary
  • 8 Accounting for seasonality
  • 8.1 Examining the SARIMA(p,d,q)(P,D,Q)m model
  • 8.2 Identifying seasonal patterns in a time series
  • 8.3 Forecasting the number of monthly air passengers
  • 8.3.1 Forecasting with an ARIMA(p,d,q) model
  • 8.3.2 Forecasting with a SARIMA(p,d,q)(P,D,Q)m model
  • 8.3.3 Comparing the performance of each forecasting method
  • 8.4 Next steps
  • 8.5 Exercises
  • 8.5.1 Apply the SARIMA(p,d,q)(P,D,Q)m model on the Johnson & Johnson dataset
  • Summary
  • 9 Adding external variables to our model
  • 9.1 Examining the SARIMAX model
  • 9.1.1 Exploring the exogenous variables of the US macroeconomics dataset
  • 9.1.2 Caveat for using SARIMAX
  • 9.2 Forecasting the real GDP using the SARIMAX model
  • 9.3 Next steps
  • 9.4 Exercises
  • 9.4.1 Use all exogenous variables in a SARIMAX model to predict the real GDP
  • Summary
  • 10 Forecasting multiple time series
  • 10.1 Examining the VAR model
  • 10.2 Designing a modeling procedure for the VAR(p) model
  • 10.2.1 Exploring the Granger causality test
  • 10.3 Forecasting real disposable income and real consumption
  • 10.4 Next steps
  • 10.5 Exercises
  • 10.5.1 Use a VARMA model to predict realdpi and realcons
  • 10.5.2 Use a VARMAX model to predict realdpi and realcons
  • Summary
  • 11 Capstone: Forecasting the number of antidiabetic drug prescriptions in Australia
  • 11.1 Importing the required libraries and loading the data
  • 11.2 Visualizing the series and its components
  • 11.3 Modeling the data
  • 11.3.1 Performing model selection
  • 11.3.2 Conducting residual analysis
  • 11.4 Forecasting and evaluating the model's performance
  • Next steps
  • Part 3. Large-scale forecasting with deep learning
  • 12 Introducing deep learning for time series forecasting
  • 12.1 When to use deep learning for time series forecasting
  • 12.2 Exploring the different types of deep learning models
  • 12.3 Getting ready to apply deep learning for forecasting
  • 12.3.1 Performing data exploration
  • 12.3.2 Feature engineering and data splitting
  • 12.4 Next steps
  • 12.5 Exercise
  • Summary
  • 13 Data windowing and creating baselines for deep learning
  • 13.1 Creating windows of data
  • 13.1.1 Exploring how deep learning models are trained for time series forecasting
  • 13.1.2 Implementing the DataWindow class
  • 13.2 Applying baseline models
  • 13.2.1 Single-step baseline model
  • 13.2.2 Multi-step baseline models
  • 13.2.3 Multi-output baseline model
  • 13.3 Next steps
  • 13.4 Exercises
  • Summary
  • 14 Baby steps with deep learning
  • 14.1 Implementing a linear model
  • 14.1.1 Implementing a single-step linear model
  • 14.1.2 Implementing a multi-step linear model
  • 14.1.3 Implementing a multi-output linear model
  • 14.2 Implementing a deep neural network
  • 14.2.1 Implementing a deep neural network as a single-step model
  • 14.2.2 Implementing a deep neural network as a multi-step model
  • 14.2.3 Implementing a deep neural network as a multi-output model
  • 14.3 Next steps
  • 14.4 Exercises
  • Summary
  • 15 Remembering the past with LSTM
  • 15.1 Exploring the recurrent neural network (RNN)
  • 15.2 Examining the LSTM architecture
  • 15.2.1 The forget gate
  • 15.2.2 The input gate
  • 15.2.3 The output gate
  • 15.3 Implementing the LSTM architecture
  • 15.3.1 Implementing an LSTM as a single-step model
  • 15.3.2 Implementing an LSTM as a multi-step model
  • 15.3.3 Implementing an LSTM as a multi-output model
  • 15.4 Next steps
  • 15.5 Exercises
  • Summary
  • 16 Filtering a time series with CNN
  • 16.1 Examining the convolutional neural network (CNN)
  • 16.2 Implementing a CNN
  • 16.2.1 Implementing a CNN as a single-step model
  • 16.2.2 Implementing a CNN as a multi-step model
  • 16.2.3 Implementing a CNN as a multi-output model
  • 16.3 Next steps
  • 16.4 Exercises
  • Summary
  • 17 Using predictions to make more predictions
  • 17.1 Examining the ARLSTM architecture
  • 17.2 Building an autoregressive LSTM model
  • 17.3 Next steps
  • 17.4 Exercises
  • Summary
  • 18 Capstone: Forecasting the electric power consumption of a household
  • 18.1 Understanding the capstone project
  • 18.1.1 Objective of this capstone project
  • 18.2 Data wrangling and preprocessing
  • 18.2.1 Dealing with missing data
  • 18.2.2 Data conversion
  • 18.2.3 Data resampling
  • 18.3 Feature engineering
  • 18.3.1 Removing unnecessary columns
  • 18.3.2 Identifying the seasonal period
  • 18.3.3 Splitting and scaling the data
  • 18.4 Preparing for modeling with deep learning
  • 18.4.1 Initial setup
  • 18.4.2 Defining the DataWindow class
  • 18.4.3 Utility function to train our models
  • 18.5 Modeling with deep learning
  • 18.5.1 Baseline models
  • 18.5.2 Linear model
  • 18.5.3 Deep neural network
  • 18.5.4 Long short-term memory (LSTM) model
  • 18.5.5 Convolutional neural network (CNN)
  • 18.5.6 Combining a CNN with an LSTM
  • 18.5.7 The autoregressive LSTM model
  • 18.5.8 Selecting the best model
  • 18.6 Next steps
  • Part 4. Automating forecasting at scale
  • 19 Automating time series forecasting with Prophet
  • 19.1 Overview of the automated forecasting libraries
  • 19.2 Exploring Prophet
  • 19.3 Basic forecasting with Prophet
  • 19.4 Exploring Prophet's advanced functionality
  • 19.4.1 Visualization capabilities
  • 19.4.2 Cross-validation and performance metrics
  • 19.4.3 Hyperparameter tuning
  • 19.5 Implementing a robust forecasting process with Prophet
  • 19.5.1 Forecasting project: Predicting the popularity of "chocolate" searches on Google