Bayesian Analysis with Python: A Practical Guide to Probabilistic Modeling
The third edition of Bayesian Analysis with Python serves as an introduction to the main concepts of applied Bayesian modeling using PyMC, a state-of-the-art probabilistic programming library, together with libraries that support and facilitate modeling, such as ArviZ for exploratory analysis of Bayesian models.
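To give a flavor of the workflow the description refers to, here is a minimal sketch of the kind of PyMC model the book opens with (the coin-flipping problem listed in Chapters 1 and 2 of the contents below). The data and variable names are illustrative assumptions, not taken from the book:

```python
import pymc as pm
import arviz as az

# Synthetic coin-flip data (1 = heads, 0 = tails); illustrative only
data = [1, 0, 0, 1, 1, 1, 0, 1, 1, 0]

with pm.Model() as coin_model:
    # Beta(1, 1) prior on the probability of heads (uniform over [0, 1])
    theta = pm.Beta("theta", alpha=1.0, beta=1.0)
    # Bernoulli likelihood for the observed flips
    y = pm.Bernoulli("y", p=theta, observed=data)
    # Sample from the posterior; returns an ArviZ InferenceData object
    idata = pm.sample(1000)

# Summarize the posterior with ArviZ
print(az.summary(idata))
```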
Format: eBook
Language: English
Published: Birmingham, England: Packt Publishing, January 2024
Edition: Third edition
Series: Expert insight
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009805128606719
Table of Contents:
- Cover
- Copyright
- Foreword
- Contributors
- Table of Contents
- Preface
- Who this book is for
- What this book covers
- What's new in this edition?
- Installation instructions
- Conventions used
- Chapter 1: Thinking Probabilistically
- Statistics, models, and this book's approach
- Working with data
- Bayesian modeling
- A probability primer for Bayesian practitioners
- Sample space and events
- Random variables
- Discrete random variables and their distributions
- Continuous random variables and their distributions
- Cumulative distribution function
- Conditional probability
- Expected values
- Bayes' theorem
- Interpreting probabilities
- Probabilities, uncertainty, and logic
- Single-parameter inference
- The coin-flipping problem
- Choosing the likelihood
- Choosing the prior
- Getting the posterior
- The influence of the prior
- How to choose priors
- Communicating a Bayesian analysis
- Model notation and visualization
- Summarizing the posterior
- Summary
- Exercises
- Chapter 2: Programming Probabilistically
- Probabilistic programming
- Flipping coins the PyMC way
- Summarizing the posterior
- Posterior-based decisions
- Savage-Dickey density ratio
- Region Of Practical Equivalence
- Loss functions
- Gaussians all the way down
- Gaussian inferences
- Posterior predictive checks
- Robust inferences
- Degrees of normality
- A robust version of the Normal model
- InferenceData
- Groups comparison
- The tips dataset
- Cohen's d
- Probability of superiority
- Posterior analysis of mean differences
- Summary
- Exercises
- Chapter 3: Hierarchical Models
- Sharing information, sharing priors
- Hierarchical shifts
- Water quality
- Shrinkage
- Hierarchies all the way up
- Summary
- Exercises
- Chapter 4: Modeling with Lines
- Simple linear regression
- Linear bikes
- Interpreting the posterior mean
- Interpreting the posterior predictions
- Generalizing the linear model
- Counting bikes
- Robust regression
- Logistic regression
- The logistic model
- Classification with logistic regression
- Interpreting the coefficients of logistic regression
- Variable variance
- Hierarchical linear regression
- Centered vs. noncentered hierarchical models
- Multiple linear regression
- Summary
- Exercises
- Chapter 5: Comparing Models
- Posterior predictive checks
- The balance between simplicity and accuracy
- Many parameters (may) lead to overfitting
- Too few parameters lead to underfitting
- Measures of predictive accuracy
- Information criteria
- Akaike Information Criterion
- Widely applicable information criteria
- Other information criteria
- Cross-validation
- Approximating cross-validation
- Calculating predictive accuracy with ArviZ
- Model averaging
- Bayes factors
- Some observations
- Calculation of Bayes factors
- Analytically
- Sequential Monte Carlo
- Savage-Dickey ratio
- Bayes factors and inference
- Regularizing priors
- Summary
- Exercises
- Chapter 6: Modeling with Bambi
- One syntax to rule them all
- The bikes model, Bambi's version
- Polynomial regression
- Splines
- Distributional models
- Categorical predictors
- Categorical penguins
- Relation to hierarchical models
- Interactions
- Interpreting models with Bambi
- Variable selection
- Projection predictive inference
- Projection predictive with Kulprit
- Summary
- Exercises
- Chapter 7: Mixture Models
- Understanding mixture models
- Finite mixture models
- The Categorical distribution
- The Dirichlet distribution
- Chemical mixture
- The non-identifiability of mixture models
- How to choose K
- Zero-Inflated and hurdle models
- Zero-Inflated Poisson regression
- Hurdle models
- Mixture models and clustering
- Non-finite mixture model
- Dirichlet process
- Continuous mixtures
- Some common distributions are mixtures
- Summary
- Exercises
- Chapter 8: Gaussian Processes
- Linear models and non-linear data
- Modeling functions
- Multivariate Gaussians and functions
- Covariance functions and kernels
- Gaussian processes
- Gaussian process regression
- Gaussian process regression with PyMC
- Setting priors for the length scale
- Gaussian process classification
- GPs for space flu
- Cox processes
- Coal mining disasters
- Red wood
- Regression with spatial autocorrelation
- Hilbert space GPs
- HSGP with Bambi
- Summary
- Exercises
- Chapter 9: Bayesian Additive Regression Trees
- Decision trees
- BART models
- Bartian penguins
- Partial dependence plots
- Individual conditional plots
- Variable selection with BART
- Distributional BART models
- Constant and linear response
- Choosing the number of trees
- Summary
- Exercises
- Chapter 10: Inference Engines
- Inference engines
- The grid method
- Quadratic method
- Markovian methods
- Monte Carlo
- Markov chain
- Metropolis-Hastings
- Hamiltonian Monte Carlo
- Sequential Monte Carlo
- Diagnosing the samples
- Convergence
- Trace plot
- Rank plot
- R̂ (R hat)
- Effective Sample Size (ESS)
- Monte Carlo standard error
- Divergences
- Keep calm and keep trying
- Summary
- Exercises
- Chapter 11: Where to Go Next
- Other Books You May Enjoy
- Index