Econometric Analysis

For first-year graduate courses in Econometrics for Social Scientists. Designed to bridge the gap between social science studies and field econometrics, Econometric Analysis, 8th Edition, Global Edition, presents this ever-grow...


Bibliographic Details
Other Authors: Greene, William H., 1951- (author)
Format: Electronic book
Language: English
Published: Harlow, England : Pearson [2020]
Edition: Eighth, Global edition
Subjects:
View at the Universitat Ramon Llull Library: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009632535506719
Table of Contents:
  • Cover
  • Title Page
  • Copyright Page
  • Brief Contents
  • Contents
  • Examples and Applications
  • Preface
  • Part I: The Linear Regression Model
  • CHAPTER 1 Econometrics
  • 1.1 Introduction
  • 1.2 The Paradigm of Econometrics
  • 1.3 The Practice of Econometrics
  • 1.4 Microeconometrics and Macroeconometrics
  • 1.5 Econometric Modeling
  • 1.6 Plan of the Book
  • 1.7 Preliminaries
  • 1.7.1 Numerical Examples
  • 1.7.2 Software and Replication
  • 1.7.3 Notational Conventions
  • CHAPTER 2 The Linear Regression Model
  • 2.1 Introduction
  • 2.2 The Linear Regression Model
  • 2.3 Assumptions of the Linear Regression Model
  • 2.3.1 Linearity of the Regression Model
  • 2.3.2 Full Rank
  • 2.3.3 Regression
  • 2.3.4 Homoscedastic and Nonautocorrelated Disturbances
  • 2.3.5 Data Generating Process for the Regressors
  • 2.3.6 Normality
  • 2.3.7 Independence and Exogeneity
  • 2.4 Summary and Conclusions
  • CHAPTER 3 Least Squares Regression
  • 3.1 Introduction
  • 3.2 Least Squares Regression
  • 3.2.1 The Least Squares Coefficient Vector
  • 3.2.2 Application: An Investment Equation
  • 3.2.3 Algebraic Aspects of the Least Squares Solution
  • 3.2.4 Projection
  • 3.3 Partitioned Regression and Partial Regression
  • 3.4 Partial Regression and Partial Correlation Coefficients
  • 3.5 Goodness of Fit and the Analysis of Variance
  • 3.5.1 The Adjusted R-Squared and a Measure of Fit
  • 3.5.2 R-Squared and the Constant Term in the Model
  • 3.5.3 Comparing Models
  • 3.6 Linearly Transformed Regression
  • 3.7 Summary and Conclusions
  • CHAPTER 4 Estimating the Regression Model by Least Squares
  • 4.1 Introduction
  • 4.2 Motivating Least Squares
  • 4.2.1 Population Orthogonality Conditions
  • 4.2.2 Minimum Mean Squared Error Predictor
  • 4.2.3 Minimum Variance Linear Unbiased Estimation
  • 4.3 Statistical Properties of the Least Squares Estimator
  • 4.3.1 Unbiased Estimation
  • 4.3.2 Omitted Variable Bias
  • 4.3.3 Inclusion of Irrelevant Variables
  • 4.3.4 Variance of the Least Squares Estimator
  • 4.3.5 The Gauss-Markov Theorem
  • 4.3.6 The Normality Assumption
  • 4.4 Asymptotic Properties of the Least Squares Estimator
  • 4.4.1 Consistency of the Least Squares Estimator of β
  • 4.4.2 The Estimator of Asy. Var[b]
  • 4.4.3 Asymptotic Normality of the Least Squares Estimator
  • 4.4.4 Asymptotic Efficiency
  • 4.4.5 Linear Projections
  • 4.5 Robust Estimation and Inference
  • 4.5.1 Consistency of the Least Squares Estimator
  • 4.5.2 A Heteroscedasticity Robust Covariance Matrix for Least Squares
  • 4.5.3 Robustness to Clustering
  • 4.5.4 Bootstrapped Standard Errors with Clustered Data
  • 4.6 Asymptotic Distribution of a Function of b: The Delta Method
  • 4.7 Interval Estimation
  • 4.7.1 Forming a Confidence Interval for a Coefficient
  • 4.7.2 Confidence Interval for a Linear Combination of Coefficients: The Oaxaca Decomposition
  • 4.8 Prediction and Forecasting
  • 4.8.1 Prediction Intervals
  • 4.8.2 Predicting y when the Regression Model Describes Log y
  • 4.8.3 Prediction Interval for y when the Regression Model Describes Log y
  • 4.8.4 Forecasting
  • 4.9 Data Problems
  • 4.9.1 Multicollinearity
  • 4.9.2 Principal Components
  • 4.9.3 Missing Values and Data Imputation
  • 4.9.4 Measurement Error
  • 4.9.5 Outliers and Influential Observations
  • 4.10 Summary and Conclusions
  • CHAPTER 5 Hypothesis Tests and Model Selection
  • 5.1 Introduction
  • 5.2 Hypothesis Testing Methodology
  • 5.2.1 Restrictions and Hypotheses
  • 5.2.2 Nested Models
  • 5.2.3 Testing Procedures
  • 5.2.4 Size, Power, and Consistency of a Test
  • 5.2.5 A Methodological Dilemma: Bayesian Versus Classical Testing
  • 5.3 Three Approaches to Testing Hypotheses
  • 5.3.1 Wald Tests Based on the Distance Measure
  • 5.3.1.a Testing a Hypothesis About a Coefficient
  • 5.3.1.b The F Statistic
  • 5.3.2 Tests Based on the Fit of the Regression
  • 5.3.2.a The Restricted Least Squares Estimator
  • 5.3.2.b The Loss of Fit from Restricted Least Squares
  • 5.3.2.c Testing the Significance of the Regression
  • 5.3.2.d Solving Out the Restrictions and a Caution About R²
  • 5.3.3 Lagrange Multiplier Tests
  • 5.4 Large-Sample Tests and Robust Inference
  • 5.5 Testing Nonlinear Restrictions
  • 5.6 Choosing Between Nonnested Models
  • 5.6.1 Testing Nonnested Hypotheses
  • 5.6.2 An Encompassing Model
  • 5.6.3 Comprehensive Approach-The J Test
  • 5.7 A Specification Test
  • 5.8 Model Building-A General to Simple Strategy
  • 5.8.1 Model Selection Criteria
  • 5.8.2 Model Selection
  • 5.8.3 Classical Model Selection
  • 5.8.4 Bayesian Model Averaging
  • 5.9 Summary and Conclusions
  • CHAPTER 6 Functional Form, Difference in Differences, and Structural Change
  • 6.1 Introduction
  • 6.2 Using Binary Variables
  • 6.2.1 Binary Variables in Regression
  • 6.2.2 Several Categories
  • 6.2.3 Modeling Individual Heterogeneity
  • 6.2.4 Sets of Categories
  • 6.2.5 Threshold Effects and Categorical Variables
  • 6.2.6 Transition Tables
  • 6.3 Difference in Differences Regression
  • 6.3.1 Treatment Effects
  • 6.3.2 Examining the Effects of Discrete Policy Changes
  • 6.4 Using Regression Kinks and Discontinuities to Analyze Social Policy
  • 6.4.1 Regression Kinked Design
  • 6.4.2 Regression Discontinuity Design
  • 6.5 Nonlinearity in the Variables
  • 6.5.1 Functional Forms
  • 6.5.2 Interaction Effects
  • 6.5.3 Identifying Nonlinearity
  • 6.5.4 Intrinsically Linear Models
  • 6.6 Structural Break and Parameter Variation
  • 6.6.1 Different Parameter Vectors
  • 6.6.2 Robust Tests of Structural Break with Unequal Variances
  • 6.6.3 Pooling Regressions
  • 6.7 Summary and Conclusions
  • CHAPTER 7 Nonlinear, Semiparametric, and Nonparametric Regression Models
  • 7.1 Introduction
  • 7.2 Nonlinear Regression Models
  • 7.2.1 Assumptions of the Nonlinear Regression Model
  • 7.2.2 The Nonlinear Least Squares Estimator
  • 7.2.3 Large-Sample Properties of the Nonlinear Least Squares Estimator
  • 7.2.4 Robust Covariance Matrix Estimation
  • 7.2.5 Hypothesis Testing and Parametric Restrictions
  • 7.2.6 Applications
  • 7.2.7 Loglinear Models
  • 7.2.8 Computing the Nonlinear Least Squares Estimator
  • 7.3 Median and Quantile Regression
  • 7.3.1 Least Absolute Deviations Estimation
  • 7.3.2 Quantile Regression Models
  • 7.4 Partially Linear Regression
  • 7.5 Nonparametric Regression
  • 7.6 Summary and Conclusions
  • CHAPTER 8 Endogeneity and Instrumental Variable Estimation
  • 8.1 Introduction
  • 8.2 Assumptions of the Extended Model
  • 8.3 Instrumental Variables Estimation
  • 8.3.1 Least Squares
  • 8.3.2 The Instrumental Variables Estimator
  • 8.3.3 Estimating the Asymptotic Covariance Matrix
  • 8.3.4 Motivating the Instrumental Variables Estimator
  • 8.4 Two-Stage Least Squares, Control Functions, and Limited Information Maximum Likelihood
  • 8.4.1 Two-Stage Least Squares
  • 8.4.2 A Control Function Approach
  • 8.4.3 Limited Information Maximum Likelihood
  • 8.5 Endogenous Dummy Variables: Estimating Treatment Effects
  • 8.5.1 Regression Analysis of Treatment Effects
  • 8.5.2 Instrumental Variables
  • 8.5.3 A Control Function Estimator
  • 8.5.4 Propensity Score Matching
  • 8.6 Hypothesis Tests
  • 8.6.1 Testing Restrictions
  • 8.6.2 Specification Tests
  • 8.6.3 Testing for Endogeneity: The Hausman and Wu Specification Tests
  • 8.6.4 A Test for Overidentification
  • 8.7 Weak Instruments and LIML
  • 8.8 Measurement Error
  • 8.8.1 Least Squares Attenuation
  • 8.8.2 Instrumental Variables Estimation
  • 8.8.3 Proxy Variables
  • 8.9 Nonlinear Instrumental Variables Estimation
  • 8.10 Natural Experiments and the Search for Causal Effects
  • 8.11 Summary and Conclusions
  • Part II: Generalized Regression Model and Equation Systems
  • CHAPTER 9 The Generalized Regression Model and Heteroscedasticity
  • 9.1 Introduction
  • 9.2 Robust Least Squares Estimation and Inference
  • 9.3 Properties of Least Squares and Instrumental Variables
  • 9.3.1 Finite-Sample Properties of Least Squares
  • 9.3.2 Asymptotic Properties of Least Squares
  • 9.3.3 Heteroscedasticity and Var[b|X]
  • 9.3.4 Instrumental Variable Estimation
  • 9.4 Efficient Estimation by Generalized Least Squares
  • 9.4.1 Generalized Least Squares (GLS)
  • 9.4.2 Feasible Generalized Least Squares (FGLS)
  • 9.5 Heteroscedasticity and Weighted Least Squares
  • 9.5.1 Weighted Least Squares
  • 9.5.2 Weighted Least Squares with Known Ω
  • 9.5.3 Estimation When Ω Contains Unknown Parameters
  • 9.6 Testing for Heteroscedasticity
  • 9.6.1 White's General Test
  • 9.6.2 The Lagrange Multiplier Test
  • 9.7 Two Applications
  • 9.7.1 Multiplicative Heteroscedasticity
  • 9.7.2 Groupwise Heteroscedasticity
  • 9.8 Summary and Conclusions
  • CHAPTER 10 Systems of Regression Equations
  • 10.1 Introduction
  • 10.2 The Seemingly Unrelated Regressions Model
  • 10.2.1 Ordinary Least Squares and Robust Inference
  • 10.2.2 Generalized Least Squares
  • 10.2.3 Feasible Generalized Least Squares
  • 10.2.4 Testing Hypotheses
  • 10.2.5 The Pooled Model
  • 10.3 Systems of Demand Equations: Singular Systems
  • 10.3.1 Cobb-Douglas Cost Function
  • 10.3.2 Flexible Functional Forms: The Translog Cost Function
  • 10.4 Simultaneous Equations Models
  • 10.4.1 Systems of Equations
  • 10.4.2 A General Notation for Linear Simultaneous Equations Models
  • 10.4.3 The Identification Problem
  • 10.4.4 Single Equation Estimation and Inference
  • 10.4.5 System Methods of Estimation