Examples and Problems in Mathematical Statistics
Provides the necessary skills to solve problems in mathematical statistics through theory, concrete examples, and exercises. With a clear and detailed approach to the fundamentals of statistical theory, Examples and Problems in Mathematical Statistics uniquely bridges the gap between theory and applic...
Main author: | |
---|---|
Format: | E-book |
Language: | English |
Published: | Hoboken, New Jersey : Wiley, 2014 |
Edition: | 1st ed. |
Series: | Wiley series in probability and statistics |
Subjects: | |
View in Biblioteca Universitat Ramon Llull: | https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009684436406719 |
Table of Contents:
- Intro
- Examples and Problems in Mathematical Statistics
- Contents
- Preface
- List of Random Variables
- List of Abbreviations
- 1 Basic Probability Theory
- PART I: THEORY
- 1.1 OPERATIONS ON SETS
- 1.2 ALGEBRA AND σ-FIELDS
- 1.3 PROBABILITY SPACES
- 1.4 CONDITIONAL PROBABILITIES AND INDEPENDENCE
- 1.5 RANDOM VARIABLES AND THEIR DISTRIBUTIONS
- 1.6 THE LEBESGUE AND STIELTJES INTEGRALS
- 1.6.1 General Definition of Expected Value: The Lebesgue Integral
- 1.6.2 The Stieltjes-Riemann Integral
- 1.6.3 Mixtures of Discrete and Absolutely Continuous Distributions
- 1.6.4 Quantiles of Distributions
- 1.6.5 Transformations
- 1.7 JOINT DISTRIBUTIONS, CONDITIONAL DISTRIBUTIONS AND INDEPENDENCE
- 1.7.1 Joint Distributions
- 1.7.2 Conditional Expectations: General Definition
- 1.7.3 Independence
- 1.8 MOMENTS AND RELATED FUNCTIONALS
- 1.9 MODES OF CONVERGENCE
- 1.10 WEAK CONVERGENCE
- 1.11 LAWS OF LARGE NUMBERS
- 1.11.1 The Weak Law of Large Numbers (WLLN)
- 1.11.2 The Strong Law of Large Numbers (SLLN)
- 1.12 CENTRAL LIMIT THEOREM
- 1.13 MISCELLANEOUS RESULTS
- 1.13.1 Law of the Iterated Logarithm
- 1.13.2 Uniform Integrability
- 1.13.3 Inequalities
- 1.13.4 The Delta Method
- 1.13.5 The Symbols o_p and O_p
- 1.13.6 The Empirical Distribution and Sample Quantiles
- PART II: EXAMPLES
- PART III: PROBLEMS
- PART IV: SOLUTIONS TO SELECTED PROBLEMS
- 2 Statistical Distributions
- PART I: THEORY
- 2.1 INTRODUCTORY REMARKS
- 2.2 FAMILIES OF DISCRETE DISTRIBUTIONS
- 2.2.1 Binomial Distributions
- 2.2.2 Hypergeometric Distributions
- 2.2.3 Poisson Distributions
- 2.2.4 Geometric, Pascal, and Negative Binomial Distributions
- 2.3 SOME FAMILIES OF CONTINUOUS DISTRIBUTIONS
- 2.3.1 Rectangular Distributions
- 2.3.2 Beta Distributions
- 2.3.3 Gamma Distributions
- 2.3.4 Weibull and Extreme Value Distributions
- 2.3.5 Normal Distributions
- 2.3.6 Normal Approximations
- 2.4 TRANSFORMATIONS
- 2.4.1 One-to-One Transformations of Several Variables
- 2.4.2 Distribution of Sums
- 2.4.3 Distribution of Ratios
- 2.5 VARIANCES AND COVARIANCES OF SAMPLE MOMENTS
- 2.6 DISCRETE MULTIVARIATE DISTRIBUTIONS
- 2.6.1 The Multinomial Distribution
- 2.6.2 Multivariate Negative Binomial
- 2.6.3 Multivariate Hypergeometric Distributions
- 2.7 MULTINORMAL DISTRIBUTIONS
- 2.7.1 Basic Theory
- 2.7.2 Distribution of Subvectors and Distributions of Linear Forms
- 2.7.3 Independence of Linear Forms
- 2.8 DISTRIBUTIONS OF SYMMETRIC QUADRATIC FORMS OF NORMAL VARIABLES
- 2.9 INDEPENDENCE OF LINEAR AND QUADRATIC FORMS OF NORMAL VARIABLES
- 2.10 THE ORDER STATISTICS
- 2.11 t-DISTRIBUTIONS
- 2.12 F-DISTRIBUTIONS
- 2.13 THE DISTRIBUTION OF THE SAMPLE CORRELATION
- 2.14 EXPONENTIAL TYPE FAMILIES
- 2.15 APPROXIMATING THE DISTRIBUTION OF THE SAMPLE MEAN: EDGEWORTH AND SADDLEPOINT APPROXIMATIONS
- 2.15.1 Edgeworth Expansion
- 2.15.2 Saddlepoint Approximation
- PART II: EXAMPLES
- PART III: PROBLEMS
- PART IV: SOLUTIONS TO SELECTED PROBLEMS
- 3 Sufficient Statistics and the Information in Samples
- PART I: THEORY
- 3.1 INTRODUCTION
- 3.2 DEFINITION AND CHARACTERIZATION OF SUFFICIENT STATISTICS
- 3.2.1 Introductory Discussion
- 3.2.2 Theoretical Formulation
- 3.3 LIKELIHOOD FUNCTIONS AND MINIMAL SUFFICIENT STATISTICS
- 3.4 SUFFICIENT STATISTICS AND EXPONENTIAL TYPE FAMILIES
- 3.5 SUFFICIENCY AND COMPLETENESS
- 3.6 SUFFICIENCY AND ANCILLARITY
- 3.7 INFORMATION FUNCTIONS AND SUFFICIENCY
- 3.7.1 The Fisher Information
- 3.7.2 The Kullback-Leibler Information
- 3.8 THE FISHER INFORMATION MATRIX
- 3.9 SENSITIVITY TO CHANGES IN PARAMETERS
- 3.9.1 The Hellinger Distance
- PART II: EXAMPLES
- PART III: PROBLEMS
- PART IV: SOLUTIONS TO SELECTED PROBLEMS
- 4 Testing Statistical Hypotheses
- PART I: THEORY
- 4.1 THE GENERAL FRAMEWORK
- 4.2 THE NEYMAN-PEARSON FUNDAMENTAL LEMMA
- 4.3 TESTING ONE-SIDED COMPOSITE HYPOTHESES IN MLR MODELS
- 4.4 TESTING TWO-SIDED HYPOTHESES IN ONE-PARAMETER EXPONENTIAL FAMILIES
- 4.5 TESTING COMPOSITE HYPOTHESES WITH NUISANCE PARAMETERS-UNBIASED TESTS
- 4.6 LIKELIHOOD RATIO TESTS
- 4.6.1 Testing in Normal Regression Theory
- 4.6.2 Comparison of Normal Means: The Analysis of Variance
- 4.7 THE ANALYSIS OF CONTINGENCY TABLES
- 4.7.1 The Structure of Multi-Way Contingency Tables and the Statistical Model
- 4.7.2 Testing the Significance of Association
- 4.7.3 The Analysis of Tables
- 4.7.4 Likelihood Ratio Tests for Categorical Data
- 4.8 SEQUENTIAL TESTING OF HYPOTHESES
- 4.8.1 The Wald Sequential Probability Ratio Test
- PART II: EXAMPLES
- PART III: PROBLEMS
- PART IV: SOLUTIONS TO SELECTED PROBLEMS
- 5 Statistical Estimation
- PART I: THEORY
- 5.1 GENERAL DISCUSSION
- 5.2 UNBIASED ESTIMATORS
- 5.2.1 General Definition and Example
- 5.2.2 Minimum Variance Unbiased Estimators
- 5.2.3 The Cramér-Rao Lower Bound for the One-Parameter Case
- 5.2.4 Extension of the Cramér-Rao Inequality to Multiparameter Cases
- 5.2.5 General Inequalities of the Cramér-Rao Type
- 5.3 THE EFFICIENCY OF UNBIASED ESTIMATORS IN REGULAR CASES
- 5.4 BEST LINEAR UNBIASED AND LEAST-SQUARES ESTIMATORS
- 5.4.1 BLUEs of the Mean
- 5.4.2 Least-Squares and BLUEs in Linear Models
- 5.4.3 Best Linear Combinations of Order Statistics
- 5.5 STABILIZING THE LSE: RIDGE REGRESSIONS
- 5.6 MAXIMUM LIKELIHOOD ESTIMATORS
- 5.6.1 Definition and Examples
- 5.6.2 MLEs in Exponential Type Families
- 5.6.3 The Invariance Principle
- 5.6.4 MLE of the Parameters of Tolerance Distributions
- 5.7 EQUIVARIANT ESTIMATORS
- 5.7.1 The Structure of Equivariant Estimators
- 5.7.2 Minimum MSE Equivariant Estimators
- 5.7.3 Minimum Risk Equivariant Estimators
- 5.7.4 The Pitman Estimators
- 5.8 ESTIMATING EQUATIONS
- 5.8.1 Moment-Equations Estimators
- 5.8.2 General Theory of Estimating Functions
- 5.9 PRETEST ESTIMATORS
- 5.10 ROBUST ESTIMATION OF THE LOCATION AND SCALE PARAMETERS OF SYMMETRIC DISTRIBUTIONS
- PART II: EXAMPLES
- PART III: PROBLEMS
- PART IV: SOLUTIONS OF SELECTED PROBLEMS
- 6 Confidence and Tolerance Intervals
- PART I: THEORY
- 6.1 GENERAL INTRODUCTION
- 6.2 THE CONSTRUCTION OF CONFIDENCE INTERVALS
- 6.3 OPTIMAL CONFIDENCE INTERVALS
- 6.4 TOLERANCE INTERVALS
- 6.5 DISTRIBUTION FREE CONFIDENCE AND TOLERANCE INTERVALS
- 6.6 SIMULTANEOUS CONFIDENCE INTERVALS
- 6.7 TWO-STAGE AND SEQUENTIAL SAMPLING FOR FIXED WIDTH CONFIDENCE INTERVALS
- PART II: EXAMPLES
- PART III: PROBLEMS
- PART IV: SOLUTIONS TO SELECTED PROBLEMS
- 7 Large Sample Theory for Estimation and Testing
- PART I: THEORY
- 7.1 CONSISTENCY OF ESTIMATORS AND TESTS
- 7.2 CONSISTENCY OF THE MLE
- 7.3 ASYMPTOTIC NORMALITY AND EFFICIENCY OF CONSISTENT ESTIMATORS
- 7.4 SECOND-ORDER EFFICIENCY OF BAN ESTIMATORS
- 7.5 LARGE SAMPLE CONFIDENCE INTERVALS
- 7.6 EDGEWORTH AND SADDLEPOINT APPROXIMATIONS TO THE DISTRIBUTION OF THE MLE: ONE-PARAMETER CANONICAL EXPONENTIAL FAMILIES
- 7.7 LARGE SAMPLE TESTS
- 7.8 PITMAN'S ASYMPTOTIC EFFICIENCY OF TESTS
- 7.9 ASYMPTOTIC PROPERTIES OF SAMPLE QUANTILES
- PART II: EXAMPLES
- PART III: PROBLEMS
- PART IV: SOLUTIONS OF SELECTED PROBLEMS
- 8 Bayesian Analysis in Testing and Estimation
- PART I: THEORY
- 8.1 THE BAYESIAN FRAMEWORK
- 8.1.1 Prior, Posterior, and Predictive Distributions
- 8.1.2 Noninformative and Improper Prior Distributions
- 8.1.3 Risk Functions and Bayes Procedures
- 8.2 BAYESIAN TESTING OF HYPOTHESES
- 8.2.1 Testing Simple Hypotheses
- 8.2.2 Testing Composite Hypotheses
- 8.2.3 Bayes Sequential Testing of Hypotheses
- 8.3 BAYESIAN CREDIBILITY AND PREDICTION INTERVALS
- 8.3.1 Credibility Intervals
- 8.3.2 Prediction Intervals
- 8.4 BAYESIAN ESTIMATION
- 8.4.1 General Discussion and Examples
- 8.4.2 Hierarchical Models
- 8.4.3 The Normal Dynamic Linear Model
- 8.5 APPROXIMATION METHODS
- 8.5.1 Analytical Approximations
- 8.5.2 Numerical Approximations
- 8.6 EMPIRICAL BAYES ESTIMATORS
- PART II: EXAMPLES
- PART III: PROBLEMS
- PART IV: SOLUTIONS OF SELECTED PROBLEMS
- 9 Advanced Topics in Estimation Theory
- PART I: THEORY
- 9.1 MINIMAX ESTIMATORS
- 9.2 MINIMUM RISK EQUIVARIANT, BAYES EQUIVARIANT, AND STRUCTURAL ESTIMATORS
- 9.2.1 Formal Bayes Estimators for Invariant Priors
- 9.2.2 Equivariant Estimators Based on Structural Distributions
- 9.3 THE ADMISSIBILITY OF ESTIMATORS
- 9.3.1 Some Basic Results
- 9.3.2 The Inadmissibility of Some Commonly Used Estimators
- 9.3.3 Minimax and Admissible Estimators of the Location Parameter
- 9.3.4 The Relationship of Empirical Bayes and Stein-Type Estimators of the Location Parameter in the Normal Case
- PART II: EXAMPLES
- PART III: PROBLEMS
- PART IV: SOLUTIONS OF SELECTED PROBLEMS
- References
- Author Index
- Subject Index
- WILEY SERIES IN PROBABILITY AND STATISTICS