Probability, random variables, and random processes : theory and signal processing applications

Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with ...


Bibliographic Details
Main author: Shynk, John Joseph
Format: Electronic book
Language: English
Published: Hoboken, NJ : Wiley, 2012, c2013.
Edition: 1st ed.
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009849075806719
Table of Contents:
  • Intro
  • PROBABILITY, RANDOM VARIABLES, AND RANDOM PROCESSES
  • CONTENTS
  • PREFACE
  • NOTATION
  • 1 Overview and Background
  • 1.1 Introduction
  • 1.1.1 Signals, Signal Processing, and Communications
  • 1.1.2 Probability, Random Variables, and Random Vectors
  • 1.1.3 Random Sequences and Random Processes
  • 1.1.4 Delta Functions
  • 1.2 Deterministic Signals and Systems
  • 1.2.1 Continuous Time
  • 1.2.2 Discrete Time
  • 1.2.3 Discrete-Time Filters
  • 1.2.4 State-Space Realizations
  • 1.3 Statistical Signal Processing with MATLAB®
  • 1.3.1 Random Number Generation
  • 1.3.2 Filtering
  • Problems
  • Further Reading
  • PART I Probability, Random Variables, and Expectation
  • 2 Probability Theory
  • 2.1 Introduction
  • 2.2 Sets and Sample Spaces
  • 2.3 Set Operations
  • 2.4 Events and Fields
  • 2.5 Summary of a Random Experiment
  • 2.6 Measure Theory
  • 2.7 Axioms of Probability
  • 2.8 Basic Probability Results
  • 2.9 Conditional Probability
  • 2.10 Independence
  • 2.11 Bayes' Formula
  • 2.12 Total Probability
  • 2.13 Discrete Sample Spaces
  • 2.14 Continuous Sample Spaces
  • 2.15 Nonmeasurable Subsets of R
  • Problems
  • Further Reading
  • 3 Random Variables
  • 3.1 Introduction
  • 3.2 Functions and Mappings
  • 3.3 Distribution Function
  • 3.4 Probability Mass Function
  • 3.5 Probability Density Function
  • 3.6 Mixed Distributions
  • 3.7 Parametric Models for Random Variables
  • 3.8 Continuous Random Variables
  • 3.8.1 Gaussian Random Variable (Normal)
  • 3.8.2 Log-Normal Random Variable
  • 3.8.3 Inverse Gaussian Random Variable (Wald)
  • 3.8.4 Exponential Random Variable (One-Sided)
  • 3.8.5 Laplace Random Variable (Double-Sided Exponential)
  • 3.8.6 Cauchy Random Variable
  • 3.8.7 Continuous Uniform Random Variable
  • 3.8.8 Triangular Random Variable
  • 3.8.9 Rayleigh Random Variable
  • 3.8.10 Rice Random Variable
  • 3.8.11 Gamma Random Variable (Erlang for r ∈ N)
  • 3.8.12 Beta Random Variable (Arcsine for α = β = 1/2, Power Function for β = 1)
  • 3.8.13 Pareto Random Variable
  • 3.8.14 Weibull Random Variable
  • 3.8.15 Logistic Random Variable (Sigmoid for {μ = 0, α = 1})
  • 3.8.16 Chi Random Variable (Maxwell-Boltzmann, Half-Normal)
  • 3.8.17 Chi-Square Random Variable
  • 3.8.18 F-Distribution
  • 3.8.19 Student's t Distribution
  • 3.8.20 Extreme Value Distribution (Type I: Gumbel)
  • 3.9 Discrete Random Variables
  • 3.9.1 Bernoulli Random Variable
  • 3.9.2 Binomial Random Variable
  • 3.9.3 Geometric Random Variable (with Support Z+ or N)
  • 3.9.4 Negative Binomial Random Variable (Pascal)
  • 3.9.5 Poisson Random Variable
  • 3.9.6 Hypergeometric Random Variable
  • 3.9.7 Discrete Uniform Random Variable
  • 3.9.8 Logarithmic Random Variable (Log-Series)
  • 3.9.9 Zeta Random Variable (Zipf)
  • Problems
  • Further Reading
  • 4 Multiple Random Variables
  • 4.1 Introduction
  • 4.2 Random Variable Approximations
  • 4.2.1 Binomial Approximation of Hypergeometric
  • 4.2.2 Poisson Approximation of Binomial
  • 4.2.3 Gaussian Approximations
  • 4.2.4 Gaussian Approximation of Binomial
  • 4.2.5 Gaussian Approximation of Poisson
  • 4.2.6 Gaussian Approximation of Hypergeometric
  • 4.3 Joint and Marginal Distributions
  • 4.4 Independent Random Variables
  • 4.5 Conditional Distribution
  • 4.6 Random Vectors
  • 4.6.1 Bivariate Uniform Distribution
  • 4.6.2 Multivariate Gaussian Distribution
  • 4.6.3 Multivariate Student's t Distribution
  • 4.6.4 Multinomial Distribution
  • 4.6.5 Multivariate Hypergeometric Distribution
  • 4.6.6 Bivariate Exponential Distributions
  • 4.7 Generating Dependent Random Variables
  • 4.8 Random Variable Transformations
  • 4.8.1 Transformations of Discrete Random Variables
  • 4.8.2 Transformations of Continuous Random Variables
  • 4.9 Important Functions of Two Random Variables
  • 4.9.1 Sum: Z = X + Y
  • 4.9.2 Difference: Z = X - Y
  • 4.9.3 Product: Z = XY
  • 4.9.4 Quotient (Ratio): Z = X/Y
  • 4.10 Transformations of Random Variable Families
  • 4.10.1 Gaussian Transformations
  • 4.10.2 Exponential Transformations
  • 4.10.3 Chi-Square Transformations
  • 4.11 Transformations of Random Vectors
  • 4.12 Sample Mean and Sample Variance S²
  • 4.13 Minimum, Maximum, and Order Statistics
  • 4.14 Mixtures
  • Problems
  • Further Reading
  • 5 Expectation and Moments
  • 5.1 Introduction
  • 5.2 Expectation and Integration
  • 5.3 Indicator Random Variable
  • 5.4 Simple Random Variable
  • 5.5 Expectation for Discrete Sample Spaces
  • 5.6 Expectation for Continuous Sample Spaces
  • 5.7 Summary of Expectation
  • 5.8 Functional View of the Mean
  • 5.9 Properties of Expectation
  • 5.10 Expectation of a Function
  • 5.11 Characteristic Function
  • 5.12 Conditional Expectation
  • 5.13 Properties of Conditional Expectation
  • 5.14 Location Parameters: Mean, Median, and Mode
  • 5.15 Variance, Covariance, and Correlation
  • 5.16 Functional View of the Variance
  • 5.17 Expectation and the Indicator Function
  • 5.18 Correlation Coefficients
  • 5.19 Orthogonality
  • 5.20 Correlation and Covariance Matrices
  • 5.21 Higher Order Moments and Cumulants
  • 5.22 Functional View of Skewness
  • 5.23 Functional View of Kurtosis
  • 5.24 Generating Functions
  • 5.25 Fourth-Order Gaussian Moment
  • 5.26 Expectations of Nonlinear Transformations
  • Problems
  • Further Reading
  • PART II Random Processes, Systems, and Parameter Estimation
  • 6 Random Processes
  • 6.1 Introduction
  • 6.2 Characterizations of a Random Process
  • 6.3 Consistency and Extension
  • 6.4 Types of Random Processes
  • 6.5 Stationarity
  • 6.6 Independent and Identically Distributed
  • 6.7 Independent Increments
  • 6.8 Martingales
  • 6.9 Markov Sequence
  • 6.10 Markov Process
  • 6.11 Random Sequences
  • 6.11.1 Bernoulli Sequence
  • 6.11.2 Bernoulli Scheme
  • 6.11.3 Independent Sequences
  • 6.11.4 Bernoulli Random Walk
  • 6.11.5 Binomial Counting Sequence
  • 6.12 Random Processes
  • 6.12.1 Poisson Counting Process
  • 6.12.2 Random Telegraph Signal
  • 6.12.3 Wiener Process
  • 6.12.4 Gaussian Process
  • 6.12.5 Pulse Amplitude Modulation
  • 6.12.6 Random Sine Signals
  • Problems
  • Further Reading
  • 7 Stochastic Convergence, Calculus, and Decompositions
  • 7.1 Introduction
  • 7.2 Stochastic Convergence
  • 7.3 Laws of Large Numbers
  • 7.4 Central Limit Theorem
  • 7.5 Stochastic Continuity
  • 7.6 Derivatives and Integrals
  • 7.7 Differential Equations
  • 7.8 Difference Equations
  • 7.9 Innovations and Mean-Square Predictability
  • 7.10 Doob-Meyer Decomposition
  • 7.11 Karhunen-Loève Expansion
  • Problems
  • Further Reading
  • 8 Systems, Noise, and Spectrum Estimation
  • 8.1 Introduction
  • 8.2 Correlation Revisited
  • 8.3 Ergodicity
  • 8.4 Eigenfunctions of R_XX(τ)
  • 8.5 Power Spectral Density
  • 8.6 Power Spectral Distribution
  • 8.7 Cross-Power Spectral Density
  • 8.8 Systems with Random Inputs
  • 8.8.1 Nonlinear Systems
  • 8.8.2 Linear Systems
  • 8.9 Passband Signals
  • 8.10 White Noise
  • 8.11 Bandwidth
  • 8.12 Spectrum Estimation
  • 8.12.1 Periodogram
  • 8.12.2 Smoothed Periodogram
  • 8.12.3 Modified Periodogram
  • 8.13 Parametric Models
  • 8.13.1 Autoregressive Model
  • 8.13.2 Moving-Average Model
  • 8.13.3 Autoregressive Moving-Average Model
  • 8.14 System Identification
  • Problems
  • Further Reading
  • 9 Sufficient Statistics and Parameter Estimation
  • 9.1 Introduction
  • 9.2 Statistics
  • 9.3 Sufficient Statistics
  • 9.4 Minimal Sufficient Statistic
  • 9.5 Exponential Families
  • 9.6 Location-Scale Families
  • 9.7 Complete Statistic
  • 9.8 Rao-Blackwell Theorem
  • 9.9 Lehmann-Scheffé Theorem
  • 9.10 Bayes Estimation
  • 9.11 Mean-Square-Error Estimation
  • 9.12 Mean-Absolute-Error Estimation
  • 9.13 Orthogonality Condition
  • 9.14 Properties of Estimators
  • 9.14.1 Unbiased
  • 9.14.2 Consistent
  • 9.14.3 Efficient
  • 9.15 Maximum A Posteriori Estimation
  • 9.16 Maximum Likelihood Estimation
  • 9.17 Likelihood Ratio Test
  • 9.18 Expectation-Maximization Algorithm
  • 9.19 Method of Moments
  • 9.20 Least-Squares Estimation
  • 9.21 Properties of LS Estimators
  • 9.21.1 Minimum ξ_WLS
  • 9.21.2 Uniqueness
  • 9.21.3 Orthogonality
  • 9.21.4 Unbiased
  • 9.21.5 Covariance Matrix
  • 9.21.6 Efficient: Achieves CRLB
  • 9.21.7 BLU Estimator
  • 9.22 Best Linear Unbiased Estimation
  • 9.23 Properties of BLU Estimators
  • Problems
  • Further Reading
  • A Note on Part III of the Book
  • APPENDICES Introduction to Appendices
  • A Summaries of Univariate Parametric Distributions
  • A.1 Notation
  • A.2 Further Reading
  • A.3 Continuous Random Variables
  • A.3.1 Beta (Arcsine for α = β = 1/2, Power Function for β = 1)
  • A.3.2 Cauchy
  • A.3.3 Chi
  • A.3.4 Chi-Square
  • A.3.5 Exponential (Shifted by c)
  • A.3.6 Extreme Value (Type I: Gumbel)
  • A.3.7 F-Distribution
  • A.3.8 Gamma (Erlang for r ∈ N with Γ(r) = (r − 1)!)
  • A.3.9 Gaussian (Normal)
  • A.3.10 Half-Normal (Folded Normal)
  • A.3.11 Inverse Gaussian (Wald)
  • A.3.12 Laplace (Double-Sided Exponential)
  • A.3.13 Logistic (Sigmoid for {μ = 0, α = 1})
  • A.3.14 Log-Normal
  • A.3.15 Maxwell-Boltzmann
  • A.3.16 Pareto
  • A.3.17 Rayleigh
  • A.3.18 Rice
  • A.3.19 Student's t Distribution
  • A.3.20 Triangular
  • A.3.21 Uniform (Continuous)
  • A.3.22 Weibull
  • A.4 Discrete Random Variables
  • A.4.1 Bernoulli (with Support {0, 1})
  • A.4.2 Bernoulli (Symmetric with Support {-1, 1})
  • A.4.3 Binomial
  • A.4.4 Geometric (with Support Z+)
  • A.4.5 Geometric (Shifted with Support N)
  • A.4.6 Hypergeometric.