A course in statistics with R

Bibliographic Details
Other Authors: Tattar, Prabhanjan, 1979- (author); Ramaiah, Suresh, 1979- (author); Manjunath, B. G., 1981- (author)
Format: eBook
Language: English
Published: Chichester, West Sussex : John Wiley & Sons, Incorporated, 2016.
Edition: 1st ed.
Series: THEi Wiley ebooks.
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009849102406719
Table of Contents:
  • Cover
  • Title Page
  • Copyright
  • Dedication
  • Contents
  • List of Figures
  • List of Tables
  • Preface
  • Acknowledgments
  • Part I The Preliminaries
  • Chapter 1 Why R?
  • 1.1 Why R?
  • 1.2 R Installation
  • 1.3 There is Nothing such as PRACTICALS
  • 1.4 Datasets in R and Internet
  • 1.4.1 List of Web-sites containing DATASETS
  • 1.4.2 Antique Datasets
  • 1.5 http://cran.r-project.org
  • 1.5.1 http://r-project.org
  • 1.5.2 http://www.cran.r-project.org/web/views/
  • 1.5.3 Is subscribing to R-Mailing List useful?
  • 1.6 R and its Interface with other Software
  • 1.7 help and/or ?
  • 1.8 R Books
  • 1.9 A Road Map
  • Chapter 2 The R Basics
  • 2.1 Introduction
  • 2.2 Simple Arithmetics and a Little Beyond
  • 2.2.1 Absolute Values, Remainders, etc.
  • 2.2.2 round, floor, etc.
  • 2.2.3 Summary Functions
  • 2.2.4 Trigonometric Functions
  • 2.2.5 Complex Numbers
  • 2.2.6 Special Mathematical Functions
  • 2.3 Some Basic R Functions
  • 2.3.1 Summary Statistics
  • 2.3.2 is, as, is.na, etc.
  • 2.3.3 factors, levels, etc.
  • 2.3.4 Control Programming
  • 2.3.5 Other Useful Functions
  • 2.3.6 Calculus*
  • 2.4 Vectors and Matrices in R
  • 2.4.1 Vectors
  • 2.4.2 Matrices
  • 2.5 Data Entering and Reading from Files
  • 2.5.1 Data Entering
  • 2.5.2 Reading Data from External Files
  • 2.6 Working with Packages
  • 2.7 R Session Management
  • 2.8 Further Reading
  • 2.9 Complements, Problems, and Programs
  • Chapter 3 Data Preparation and Other Tricks
  • 3.1 Introduction
  • 3.2 Manipulation with Complex Format Files
  • 3.3 Reading Datasets of Foreign Formats
  • 3.4 Displaying R Objects
  • 3.5 Manipulation Using R Functions
  • 3.6 Working with Time and Date
  • 3.7 Text Manipulations
  • 3.8 Scripts and Text Editors for R
  • 3.8.1 Text Editors for Linuxians
  • 3.9 Further Reading
  • 3.10 Complements, Problems, and Programs
  • Chapter 4 Exploratory Data Analysis
  • 4.1 Introduction: The Tukey's School of Statistics
  • 4.2 Essential Summaries of EDA
  • 4.3 Graphical Techniques in EDA
  • 4.3.1 Boxplot
  • 4.3.2 Histogram
  • 4.3.3 Histogram Extensions and the Rootogram
  • 4.3.4 Pareto Chart
  • 4.3.5 Stem-and-Leaf Plot
  • 4.3.6 Run Chart
  • 4.3.7 Scatter Plot
  • 4.4 Quantitative Techniques in EDA
  • 4.4.1 Trimean
  • 4.4.2 Letter Values
  • 4.5 Exploratory Regression Models
  • 4.5.1 Resistant Line
  • 4.5.2 Median Polish
  • 4.6 Further Reading
  • 4.7 Complements, Problems, and Programs
  • Part II Probability and Inference
  • Chapter 5 Probability Theory
  • 5.1 Introduction
  • 5.2 Sample Space, Set Algebra, and Elementary Probability
  • 5.3 Counting Methods
  • 5.3.1 Sampling: The Diverse Ways
  • 5.3.2 The Binomial Coefficients and Pascal's Triangle
  • 5.3.3 Some Problems Based on Combinatorics
  • 5.4 Probability: A Definition
  • 5.4.1 The Prerequisites
  • 5.4.2 The Kolmogorov Definition
  • 5.5 Conditional Probability and Independence
  • 5.6 Bayes Formula
  • 5.7 Random Variables, Expectations, and Moments
  • 5.7.1 The Definition
  • 5.7.2 Expectation of Random Variables
  • 5.8 Distribution Function, Characteristic Function, and Moment Generating Function
  • 5.9 Inequalities
  • 5.9.1 The Markov Inequality
  • 5.9.2 The Jensen Inequality
  • 5.9.3 The Chebyshev Inequality
  • 5.10 Convergence of Random Variables
  • 5.10.1 Convergence in Distributions
  • 5.10.2 Convergence in Probability
  • 5.10.3 Convergence in rth Mean
  • 5.10.4 Almost Sure Convergence
  • 5.11 The Law of Large Numbers
  • 5.11.1 The Weak Law of Large Numbers
  • 5.12 The Central Limit Theorem
  • 5.12.1 The de Moivre-Laplace Central Limit Theorem
  • 5.12.2 CLT for iid Case
  • 5.12.3 The Lindeberg-Feller CLT
  • 5.12.4 The Liapounov CLT
  • 5.13 Further Reading
  • 5.13.1 Intuitive, Elementary, and First Course Source
  • 5.13.2 The Classics and Second Course Source
  • 5.13.3 The Problem Books
  • 5.13.4 Other Useful Sources
  • 5.13.5 R for Probability
  • 5.14 Complements, Problems, and Programs
  • Chapter 6 Probability and Sampling Distributions
  • 6.1 Introduction
  • 6.2 Discrete Univariate Distributions
  • 6.2.1 The Discrete Uniform Distribution
  • 6.2.2 The Binomial Distribution
  • 6.2.3 The Geometric Distribution
  • 6.2.4 The Negative Binomial Distribution
  • 6.2.5 Poisson Distribution
  • 6.2.6 The Hypergeometric Distribution
  • 6.3 Continuous Univariate Distributions
  • 6.3.1 The Uniform Distribution
  • 6.3.2 The Beta Distribution
  • 6.3.3 The Exponential Distribution
  • 6.3.4 The Gamma Distribution
  • 6.3.5 The Normal Distribution
  • 6.3.6 The Cauchy Distribution
  • 6.3.7 The t-Distribution
  • 6.3.8 The Chi-square Distribution
  • 6.3.9 The F-Distribution
  • 6.4 Multivariate Probability Distributions
  • 6.4.1 The Multinomial Distribution
  • 6.4.2 Dirichlet Distribution
  • 6.4.3 The Multivariate Normal Distribution
  • 6.4.4 The Multivariate t Distribution
  • 6.5 Populations and Samples
  • 6.6 Sampling from the Normal Distributions
  • 6.7 Some Finer Aspects of Sampling Distributions
  • 6.7.1 Sampling Distribution of Median
  • 6.7.2 Sampling Distribution of Mean of Standard Distributions
  • 6.8 Multivariate Sampling Distributions
  • 6.8.1 Noncentral Univariate Chi-square, t, and F Distributions
  • 6.8.2 Wishart Distribution
  • 6.8.3 Hotelling's T2 Distribution
  • 6.9 Bayesian Sampling Distributions
  • 6.10 Further Reading
  • 6.11 Complements, Problems, and Programs
  • Chapter 7 Parametric Inference
  • 7.1 Introduction
  • 7.2 Families of Distribution
  • 7.2.1 The Exponential Family
  • 7.2.2 Pitman Family
  • 7.3 Loss Functions
  • 7.4 Data Reduction
  • 7.4.1 Sufficiency
  • 7.4.2 Minimal Sufficiency
  • 7.5 Likelihood and Information
  • 7.5.1 The Likelihood Principle
  • 7.5.2 The Fisher Information
  • 7.6 Point Estimation
  • 7.6.1 Maximum Likelihood Estimation
  • 7.6.2 Method of Moments Estimator
  • 7.7 Comparison of Estimators
  • 7.7.1 Unbiased Estimators
  • 7.7.2 Improving Unbiased Estimators
  • 7.8 Confidence Intervals
  • 7.9 Testing Statistical Hypotheses-The Preliminaries
  • 7.10 The Neyman-Pearson Lemma
  • 7.11 Uniformly Most Powerful Tests
  • 7.12 Uniformly Most Powerful Unbiased Tests
  • 7.12.1 Tests for the Means: One- and Two-Sample t-Test
  • 7.13 Likelihood Ratio Tests
  • 7.13.1 Normal Distribution: One-Sample Problems
  • 7.13.2 Normal Distribution: Two-Sample Problem for the Mean
  • 7.14 Behrens-Fisher Problem
  • 7.15 Multiple Comparison Tests
  • 7.15.1 Bonferroni's Method
  • 7.15.2 Holm's Method
  • 7.16 The EM Algorithm*
  • 7.16.1 Introduction
  • 7.16.2 The Algorithm
  • 7.16.3 Introductory Applications
  • 7.17 Further Reading
  • 7.17.1 Early Classics
  • 7.17.2 Texts from the Last 30 Years
  • 7.18 Complements, Problems, and Programs
  • Chapter 8 Nonparametric Inference
  • 8.1 Introduction
  • 8.2 Empirical Distribution Function and Its Applications
  • 8.2.1 Statistical Functionals
  • 8.3 The Jackknife and Bootstrap Methods
  • 8.3.1 The Jackknife
  • 8.3.2 The Bootstrap
  • 8.3.3 Bootstrapping Simple Linear Model*
  • 8.4 Non-parametric Smoothing
  • 8.4.1 Histogram Smoothing
  • 8.4.2 Kernel Smoothing
  • 8.4.3 Nonparametric Regression Models*
  • 8.5 Non-parametric Tests
  • 8.5.1 The Wilcoxon Signed-Ranks Test
  • 8.5.2 The Mann-Whitney Test
  • 8.5.3 The Siegel-Tukey Test
  • 8.5.4 The Wald-Wolfowitz Run Test
  • 8.5.5 The Kolmogorov-Smirnov Test
  • 8.5.6 Kruskal-Wallis Test*
  • 8.6 Further Reading
  • 8.7 Complements, Problems, and Programs
  • Chapter 9 Bayesian Inference
  • 9.1 Introduction
  • 9.2 Bayesian Probabilities
  • 9.3 The Bayesian Paradigm for Statistical Inference
  • 9.3.1 Bayesian Sufficiency and the Principle
  • 9.3.2 Bayesian Analysis and Likelihood Principle
  • 9.3.3 Informative and Conjugate Prior
  • 9.3.4 Non-informative Prior
  • 9.4 Bayesian Estimation
  • 9.4.1 Inference for Binomial Distribution
  • 9.4.2 Inference for the Poisson Distribution
  • 9.4.3 Inference for Uniform Distribution
  • 9.4.4 Inference for Exponential Distribution
  • 9.4.5 Inference for Normal Distributions
  • 9.5 The Credible Intervals
  • 9.6 Bayes Factors for Testing Problems
  • 9.7 Further Reading
  • 9.8 Complements, Problems, and Programs
  • Part III Stochastic Processes and Monte Carlo
  • Chapter 10 Stochastic Processes
  • 10.1 Introduction
  • 10.2 Kolmogorov's Consistency Theorem
  • 10.3 Markov Chains
  • 10.3.1 The m-Step TPM
  • 10.3.2 Classification of States
  • 10.3.3 Canonical Decomposition of an Absorbing Markov Chain
  • 10.3.4 Stationary Distribution and Mean First Passage Time of an Ergodic Markov Chain
  • 10.3.5 Time Reversible Markov Chain
  • 10.4 Application of Markov Chains in Computational Statistics
  • 10.4.1 The Metropolis-Hastings Algorithm
  • 10.4.2 Gibbs Sampler
  • 10.4.3 Illustrative Examples
  • 10.5 Further Reading
  • 10.6 Complements, Problems, and Programs
  • Chapter 11 Monte Carlo Computations
  • 11.1 Introduction
  • 11.2 Generating the (Pseudo-) Random Numbers
  • 11.2.1 Useful Random Generators
  • 11.2.2 Probability Through Simulation
  • 11.3 Simulation from Probability Distributions and Some Limit Theorems
  • 11.3.1 Simulation from Discrete Distributions
  • 11.3.2 Simulation from Continuous Distributions
  • 11.3.3 Understanding Limit Theorems through Simulation
  • 11.3.4 Understanding The Central Limit Theorem
  • 11.4 Monte Carlo Integration
  • 11.5 The Accept-Reject Technique
  • 11.6 Application to Bayesian Inference
  • 11.7 Further Reading
  • 11.8 Complements, Problems, and Programs
  • Part IV Linear Models