Bayesian Statistics and Marketing

Bibliographic Details
Main Author: Rossi, Peter E.
Other Authors: Allenby, Greg M.; Misra, Sanjog
Format: Electronic book
Language: English
Published: Newark : John Wiley & Sons, Incorporated, 2024.
Edition: 2nd ed.
Series: Wiley Series in Probability and Statistics
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009840471006719
Table of Contents:
  • Cover
  • Title Page
  • Copyright
  • Contents
  • Chapter 1 Introduction
  • 1.1 A Basic Paradigm for Marketing Problems
  • 1.2 A Simple Example
  • 1.3 Benefits and Costs of the Bayesian Approach
  • 1.4 An Overview of Methodological Material and Case Studies
  • 1.5 Approximate Bayes Methods and This Book
  • 1.6 Computing and This Book
  • Acknowledgments
  • Chapter 2 Bayesian Essentials
  • 2.1 Essential Concepts from Distribution Theory
  • 2.2 The Goal of Inference and Bayes Theorem
  • 2.2.1 Bayes Theorem
  • 2.3 Conditioning and the Likelihood Principle
  • 2.4 Prediction and Bayes
  • 2.5 Summarizing the Posterior
  • 2.6 Decision Theory, Risk, and the Sampling Properties of Bayes Estimators
  • 2.7 Identification and Bayesian Inference
  • 2.8 Conjugacy, Sufficiency, and Exponential Families
  • 2.9 Regression and Multivariate Analysis Examples
  • 2.9.1 Multiple Regression
  • 2.9.2 Assessing Priors for Regression Models
  • 2.9.3 Bayesian Inference for Covariance Matrices
  • 2.9.4 Priors and the Wishart Distribution
  • 2.9.5 Multivariate Regression
  • 2.9.6 The Limitations of Conjugate Priors
  • 2.10 Integration and Asymptotic Methods
  • 2.11 Importance Sampling
  • 2.11.1 GHK Method for Evaluation of Certain Integrals of MVN
  • 2.12 Simulation Primer for Bayesian Problems
  • 2.12.1 Uniform, Normal, and Gamma Generation
  • 2.12.2 Truncated Distributions
  • 2.12.3 Multivariate Normal and Student t Distributions
  • 2.12.4 The Wishart and Inverted Wishart Distributions
  • 2.12.5 Multinomial Distributions
  • 2.12.6 Dirichlet Distribution
  • 2.13 Simulation from Posterior of Multivariate Regression Model
  • Chapter 3 MCMC Methods
  • 3.1 MCMC Methods
  • 3.2 A Simple Example: Bivariate Normal Gibbs Sampler
  • 3.3 Some Markov Chain Theory
  • 3.4 Gibbs Sampler
  • 3.5 Gibbs Sampler for the SUR Regression Model
  • 3.6 Conditional Distributions and Directed Graphs
  • 3.7 Hierarchical Linear Models
  • 3.8 Data Augmentation and a Probit Example
  • 3.9 Mixtures of Normals
  • 3.9.1 Identification in Normal Mixtures
  • 3.9.2 Performance of the Unconstrained Gibbs Sampler
  • 3.10 Metropolis Algorithms
  • 3.10.1 Independence Metropolis Chains
  • 3.10.2 Random Walk Metropolis Chains
  • 3.10.3 Scaling of the Random Walk Metropolis
  • 3.11 Metropolis Algorithms Illustrated with the Multinomial Logit Model
  • 3.12 Hybrid MCMC Methods
  • 3.13 Diagnostics
  • Chapter 4 Unit‐Level Models and Discrete Demand
  • 4.1 Latent Variable Models
  • 4.2 Multinomial Probit Model
  • 4.2.1 Understanding the Autocorrelation Properties of the MNP Gibbs Sampler
  • 4.2.2 The Likelihood for the MNP Model
  • 4.3 Multivariate Probit Model
  • 4.4 Demand Theory and Models Involving Discrete Choice
  • 4.4.1 A Nonhomothetic Choice Model
  • 4.4.2 Demand for Discrete Quantities
  • 4.4.3 Demand for Variety
  • Chapter 5 Hierarchical Models for Heterogeneous Units
  • 5.1 Heterogeneity and Priors
  • 5.2 Hierarchical Models
  • 5.3 Inference for Hierarchical Models
  • 5.4 A Hierarchical Multinomial Logit Example
  • 5.5 Using Mixtures of Normals
  • 5.5.1 A Hybrid Sampler
  • 5.5.2 Identification of the Number of Mixture Components
  • 5.5.3 Application to Hierarchical Models
  • 5.6 Further Elaborations of the Normal Model of Heterogeneity
  • 5.7 Diagnostic Checks of the First Stage Prior
  • 5.8 Findings and Influence on Marketing Practice
  • Chapter 6 Model Choice and Decision Theory
  • 6.1 Model Selection
  • 6.2 Bayes Factors in the Conjugate Setting
  • 6.3 Asymptotic Methods for Computing Bayes Factors
  • 6.4 Computing Bayes Factors Using Importance Sampling
  • 6.5 Bayes Factors Using MCMC Draws from the Posterior
  • 6.6 Bridge Sampling Methods
  • 6.7 Posterior Model Probabilities with Unidentified Parameters
  • 6.8 Chib's Method
  • 6.9 An Example of Bayes Factor Computation: Diagonal MNP models
  • 6.10 Marketing Decisions and Bayesian Decision Theory
  • 6.10.1 Plug‐In vs Full Bayes Approaches
  • 6.10.2 Use of Alternative Information Sets
  • 6.10.3 Valuation of Disaggregate Information
  • 6.11 An Example of Bayesian Decision Theory: Valuing Household Purchase Information
  • Chapter 7 Simultaneity
  • 7.1 A Bayesian Approach to Instrumental Variables
  • 7.2 Structural Models and Endogeneity/Simultaneity
  • 7.2.1 Demand Model
  • 7.2.2 Supply Model - Profit Maximizing Prices
  • 7.2.3 Bayesian Estimation
  • 7.3 Non‐Random Marketing Mix Variables
  • 7.3.1 A General Framework
  • 7.3.2 An Application to Detailing Allocation
  • 7.3.3 Conditional Modeling Approach
  • 7.3.4 Beyond the Conditional Model
  • Chapter 8 A Bayesian Perspective on Machine Learning
  • 8.1 Introduction
  • 8.2 Regularization
  • 8.2.1 The LASSO and Bayes
  • 8.2.2 Discussion: Informative Regularizers
  • 8.2.3 Bayesian Inference
  • 8.3 Bagging
  • 8.3.1 Bagging for Regression
  • 8.3.2 Bagging, Bayesian Model Averaging and Ensembles
  • 8.4 Boosting
  • 8.4.1 Boosting as Bayes
  • 8.5 Deep Learning
  • 8.5.1 A Primer on Deep Learning
  • 8.5.2 Bayes and Deep Learning
  • 8.6 Applications
  • 8.6.1 Bayes/ML for Flexible Heterogeneity
  • 8.6.2 The Need for ML
  • 8.6.3 Discussion
  • Chapter 9 Bayesian Analysis for Text Data
  • 9.1 Introduction
  • 9.2 Consumer Demand
  • 9.2.1 The Latent Dirichlet Allocation (LDA) Model
  • 9.2.2 Full Gibbs Sampler
  • 9.2.3 Processing Text Data for Analysis
  • 9.2.4 Collapsed Gibbs Sampler
  • 9.2.5 The Sentence Constrained LDA Model
  • 9.2.6 Conjunctions and Punctuation
  • 9.3 Integrated Models
  • 9.3.1 Text and Conjoint Data
  • 9.3.2 R Code for Text and Conjoint Data
  • 9.3.3 Text and Product Ratings
  • 9.3.4 Text and Scaled Response Data
  • 9.4 Discussion
  • Chapter 10 Case Study 1: Analysis of Choice‐Based Conjoint Data Using A Hierarchical Logit Model
  • 10.1 Choice‐Based Conjoint
  • 10.2 A Random Coefficient Logit
  • 10.3 Sign Constraints and Priors
  • 10.4 The Camera Data
  • 10.4.1 Panel Data in bayesm
  • 10.5 Running the Model
  • 10.6 Describing the Draws of Respondent Partworths
  • 10.7 Predictive Posteriors
  • 10.7.1 Respondent‐Level Partworth Inferences
  • 10.7.2 Posterior Predictive Distributions
  • 10.8 Comparison of Stan and Sawtooth Software to bayesm Routines
  • 10.8.1 Comparison to Stan
  • 10.8.2 Comparison with Sawtooth Software
  • Chapter 11 Case Study 2: WTP and Equilibrium Analysis with Conjoint Demand
  • 11.1 The Demand for Product Features
  • 11.1.1 The Standard Choice Model for Differentiated Product Demand
  • 11.1.2 Estimating Demand
  • 11.2 Conjoint Surveys and Demand Estimation
  • 11.2.1 Conjoint Design
  • 11.3 WTP Properly Defined
  • 11.3.1 Pseudo‐WTP
  • 11.3.2 Pseudo‐WTP for Heterogeneous Consumers
  • 11.3.3 True WTP
  • 11.3.4 Problems with All WTP Measures
  • 11.4 Nash Equilibrium Prices - Computation and Assumptions
  • 11.4.1 Assumptions
  • 11.4.2 A Standard Logit Model for Demand
  • 11.4.3 Computing Equilibrium Prices
  • 11.5 Camera Example
  • 11.5.1 WTP Computations
  • 11.5.2 Equilibrium Price Calculations
  • 11.5.3 Lessons for Conjoint Design from WTP and Equilibrium Price Computations
  • Chapter 12 Case Study 3: Scale Usage Heterogeneity
  • 12.1 Background
  • 12.2 Model
  • 12.3 Priors and MCMC Algorithm
  • 12.4 Data
  • 12.4.1 Scale Usage Heterogeneity
  • 12.4.2 Correlation Analysis
  • 12.5 Discussion
  • 12.6 R Implementation
  • Chapter 13 Case Study 4: Volumetric Conjoint
  • 13.1 Introduction
  • 13.2 Model Development
  • 13.3 Estimation
  • 13.4 Empirical Analysis
  • 13.4.1 Ice Cream
  • 13.4.2 Frozen Pizza
  • 13.5 Discussion
  • 13.6 Using the Code
  • 13.7 Concluding Remarks
  • Chapter 14 Case Study 5: Approximate Bayes and Personalized Pricing
  • 14.1 Heterogeneity and Heterogeneous Treatment Effects
  • 14.2 The Framework
  • 14.2.1 Introducing the ML Element
  • 14.3 Context and Data
  • 14.4 Does the Bayesian Bootstrap Work?
  • 14.5 A Bayesian Bootstrap Procedure for the HTE Logit
  • 14.5.1 The Estimator
  • 14.5.2 Results
  • 14.6 Personalized Pricing
  • A An Introduction to R and bayesm
  • A.1 Setting Up the R Environment and bayesm
  • A.1.1 Obtaining R
  • A.1.2 Getting Started in RStudio
  • A.1.3 Obtaining Help in RStudio
  • A.1.4 Installing bayesm
  • A.2 The R Language
  • A.2.1 Using Built‐In Functions: Running a Regression
  • A.2.2 Inspecting Objects and the R Workspace
  • A.2.3 Vectors, Matrices, and Lists
  • A.2.4 Accessing Elements and Subsetting Vectors, Arrays, and Lists
  • A.2.5 Loops
  • A.2.6 Implicit Loops
  • A.2.7 Matrix Operations
  • A.2.8 Other Useful Built‐In R Functions
  • A.2.9 User‐defined Functions
  • A.2.10 Debugging Functions
  • A.2.11 Elementary Graphics
  • A.2.12 System Information
  • A.2.13 More Lessons Learned from Timing
  • A.3 Using bayesm
  • A.4 Obtaining Help with bayesm
  • A.5 Tips on Using MCMC Methods
  • A.6 Extending and Adapting Our Code
  • References
  • Index
  • EULA