Introduction to imprecise probabilities

"In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent de...

Full description

Bibliographic Details
Other Authors: Augustin, Thomas (editor)
Format: eBook
Language: English
Published: West Sussex, England: John Wiley & Sons, 2014.
Edition: 1st edition
Series: Wiley Series in Probability and Statistics.
Subjects:
View at the Universitat Ramon Llull Library: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009631568006719
Table of Contents:
  • Cover
  • Title Page
  • Copyright
  • Contents
  • Introduction
  • A brief outline of this book
  • Guide to the reader
  • Contributors
  • Acknowledgements
  • Chapter 1 Desirability
  • 1.1 Introduction
  • 1.2 Reasoning about and with sets of desirable gambles
  • 1.2.1 Rationality criteria
  • 1.2.2 Assessments avoiding partial or sure loss
  • 1.2.3 Coherent sets of desirable gambles
  • 1.2.4 Natural extension
  • 1.2.5 Desirability relative to subspaces with arbitrary vector orderings
  • 1.3 Deriving and combining sets of desirable gambles
  • 1.3.1 Gamble space transformations
  • 1.3.2 Derived coherent sets of desirable gambles
  • 1.3.3 Conditional sets of desirable gambles
  • 1.3.4 Marginal sets of desirable gambles
  • 1.3.5 Combining sets of desirable gambles
  • 1.4 Partial preference orders
  • 1.4.1 Strict preference
  • 1.4.2 Nonstrict preference
  • 1.4.3 Nonstrict preferences implied by strict ones
  • 1.4.4 Strict preferences implied by nonstrict ones
  • 1.5 Maximally committal sets of strictly desirable gambles
  • 1.6 Relationships with other, nonequivalent models
  • 1.6.1 Linear previsions
  • 1.6.2 Credal sets
  • 1.6.3 To lower and upper previsions
  • 1.6.4 Simplified variants of desirability
  • 1.6.5 From lower previsions
  • 1.6.6 Conditional lower previsions
  • 1.7 Further reading
  • Acknowledgements
  • Chapter 2 Lower previsions
  • 2.1 Introduction
  • 2.2 Coherent lower previsions
  • 2.2.1 Avoiding sure loss and coherence
  • 2.2.2 Linear previsions
  • 2.2.3 Sets of desirable gambles
  • 2.2.4 Natural extension
  • 2.3 Conditional lower previsions
  • 2.3.1 Coherence of a finite number of conditional lower previsions
  • 2.3.2 Natural extension of conditional lower previsions
  • 2.3.3 Coherence of an unconditional and a conditional lower prevision
  • 2.3.4 Updating with the regular extension
  • 2.4 Further reading
  • 2.4.1 The work of Williams
  • 2.4.2 The work of Kuznetsov
  • 2.4.3 The work of Weichselberger
  • Acknowledgements
  • Chapter 3 Structural judgements
  • 3.1 Introduction
  • 3.2 Irrelevance and independence
  • 3.2.1 Epistemic irrelevance
  • 3.2.2 Epistemic independence
  • 3.2.3 Envelopes of independent precise models
  • 3.2.4 Strong independence
  • 3.2.5 The formalist approach to independence
  • 3.3 Invariance
  • 3.3.1 Weak invariance
  • 3.3.2 Strong invariance
  • 3.4 Exchangeability
  • 3.4.1 Representation theorem for finite sequences
  • 3.4.2 Exchangeable natural extension
  • 3.4.3 Exchangeable sequences
  • 3.5 Further reading
  • 3.5.1 Independence
  • 3.5.2 Invariance
  • 3.5.3 Exchangeability
  • Acknowledgements
  • Chapter 4 Special cases
  • 4.1 Introduction
  • 4.2 Capacities and n-monotonicity
  • 4.3 2-monotone capacities
  • 4.4 Probability intervals on singletons
  • 4.5 ∞-monotone capacities
  • 4.5.1 Constructing ∞-monotone capacities
  • 4.5.2 Simple support functions
  • 4.5.3 Further elements
  • 4.6 Possibility distributions, p-boxes, clouds and related models
  • 4.6.1 Possibility distributions
  • 4.6.2 Fuzzy intervals
  • 4.6.3 Clouds
  • 4.6.4 p-boxes
  • 4.7 Neighbourhood models
  • 4.7.1 Pari-mutuel
  • 4.7.2 Odds-ratio
  • 4.7.3 Linear-vacuous
  • 4.7.4 Relations between neighbourhood models
  • 4.8 Summary
  • Chapter 5 Other uncertainty theories based on capacities
  • 5.1 Imprecise probability = modal logic + probability
  • 5.1.1 Boolean possibility theory and modal logic
  • 5.1.2 A unifying framework for capacity based uncertainty theories
  • 5.2 From imprecise probabilities to belief functions and possibility theory
  • 5.2.1 Random disjunctive sets
  • 5.2.1.1 Mass functions
  • 5.2.1.2 Belief and plausibility functions
  • 5.2.1.3 Belief functions and lower probabilities
  • 5.2.2 Numerical possibility theory
  • 5.2.2.1 Possibility distributions
  • 5.2.2.2 Possibility and necessity measures
  • 5.2.3 Overall picture
  • 5.3 Discrepancies between uncertainty theories
  • 5.3.1 Objectivist vs. subjectivist standpoints
  • 5.3.2 Discrepancies in conditioning
  • 5.3.3 Discrepancies in notions of independence
  • 5.3.4 Discrepancies in fusion operations
  • 5.4 Further reading
  • Chapter 6 Game-theoretic probability
  • 6.1 Introduction
  • 6.2 A law of large numbers
  • 6.3 A general forecasting protocol
  • 6.4 The axiom of continuity
  • 6.5 Doob's argument
  • 6.6 Limit theorems of probability
  • 6.7 Lévy's zero-one law
  • 6.8 The axiom of continuity revisited
  • 6.9 Further reading
  • Acknowledgements
  • Chapter 7 Statistical inference
  • 7.1 Background and introduction
  • 7.1.1 What is statistical inference?
  • 7.1.2 (Parametric) statistical models and i.i.d. samples
  • 7.1.3 Basic tasks and procedures of statistical inference
  • 7.1.4 Some methodological distinctions
  • 7.1.5 Examples: Multinomial and normal distribution
  • 7.2 Imprecision in statistics, some general sources and motives
  • 7.2.1 Model and data imprecision; sensitivity analysis and ontological views on imprecision
  • 7.2.2 The robustness shock, sensitivity analysis
  • 7.2.3 Imprecision as a modelling tool to express the quality of partial knowledge
  • 7.2.4 The law of decreasing credibility
  • 7.2.5 Imprecise sampling models: Typical models and motives
  • 7.3 Some basic concepts of statistical models relying on imprecise probabilities
  • 7.3.1 Most common classes of models and notation
  • 7.3.2 Imprecise parametric statistical models and corresponding i.i.d. samples
  • 7.4 Generalized Bayesian inference
  • 7.4.1 Some selected results from traditional Bayesian statistics
  • 7.4.1.1 Conjugate families of distributions
  • 7.4.2 Sets of precise prior distributions, robust Bayesian inference and the generalized Bayes rule
  • 7.4.2.1 Robust Bayes and imprecise probabilities
  • 7.4.2.2 Some computational aspects
  • 7.4.3 A closer exemplary look at a popular class of models: The IDM and other models based on sets of conjugate priors in exponential families
  • 7.4.3.1 The general model
  • 7.4.3.2 The IDM and other prior near-ignorance models
  • 7.4.3.3 Substantial prior information and sensitivity to prior-data conflict
  • 7.4.4 Some further comments and a brief look at other models for generalized Bayesian inference
  • 7.5 Frequentist statistics with imprecise probabilities
  • 7.5.1 The nonrobustness of classical frequentist methods
  • 7.5.2 (Frequentist) hypothesis testing under imprecise probability: Huber-Strassen theory and extensions
  • 7.5.2.1 A generalized Neyman-Pearson setting: Basic concepts
  • 7.5.2.2 Globally least favourable pairs and Huber-Strassen theory
  • 7.5.3 Towards a frequentist estimation theory under imprecise probabilities: Some basic criteria and first results
  • 7.5.3.1 Basic traditional criteria
  • 7.5.3.2 Generalized unbiasedness
  • 7.5.3.3 Minimum distance estimation
  • 7.5.3.4 Asymptotics
  • 7.5.4 A brief outlook on frequentist methods
  • 7.6 Nonparametric predictive inference
  • 7.6.1 Overview
  • 7.6.2 Applications and challenges
  • 7.7 A brief sketch of some further approaches and aspects
  • 7.8 Data imprecision, partial identification
  • 7.8.1 Data imprecision
  • 7.8.1.1 The basic setting
  • 7.8.1.2 Basic coarsening model
  • 7.8.1.3 Coarse data in traditional statistics
  • 7.8.2 Cautious data completion
  • 7.8.3 Partial identification and observationally equivalent models
  • 7.8.3.1 (Partial) Identifiability, identification regions
  • 7.8.3.2 Partial identification and misclassification
  • 7.8.4 A brief outlook on some further aspects
  • 7.8.4.1 Partial identification and other concepts of imprecise probabilities
  • 7.8.4.2 A brief outlook on further developments in the literature
  • 7.9 Some general further reading
  • 7.10 Some general challenges
  • Acknowledgements
  • Chapter 8 Decision making
  • 8.1 Non-sequential decision problems
  • 8.1.1 Choosing from a set of gambles
  • 8.1.2 Choice functions for coherent lower previsions
  • 8.2 Sequential decision problems
  • 8.2.1 Static sequential solutions: Normal form
  • 8.2.2 Dynamic sequential solutions: Extensive form
  • 8.3 Examples and applications
  • 8.3.1 Ellsberg's paradox
  • 8.3.2 Robust Bayesian statistics
  • Chapter 9 Probabilistic graphical models
  • 9.1 Introduction
  • 9.2 Credal sets
  • 9.2.1 Definition and relation with lower previsions
  • 9.2.2 Marginalization and conditioning
  • 9.2.3 Composition
  • 9.3 Independence
  • 9.4 Credal networks
  • 9.4.1 Nonseparately specified credal networks
  • 9.5 Computing with credal networks
  • 9.5.1 Credal networks updating
  • 9.5.2 Modelling and updating with missing data
  • 9.5.3 Algorithms for credal networks updating
  • 9.5.4 Inference on credal networks as a multilinear programming task
  • 9.6 Further reading
  • Acknowledgements
  • Chapter 10 Classification
  • 10.1 Introduction
  • 10.2 Naive Bayes
  • 10.2.1 Derivation of naive Bayes
  • 10.3 Naive credal classifier (NCC)
  • 10.3.1 Checking credal-dominance
  • 10.3.2 Particular behaviours of NCC
  • 10.3.3 NCC2: Conservative treatment of missing data
  • 10.4 Extensions and developments of the naive credal classifier
  • 10.4.1 Lazy naive credal classifier
  • 10.4.2 Credal model averaging
  • 10.4.3 Profile-likelihood classifiers
  • 10.4.4 Tree-augmented networks (TAN)
  • 10.5 Tree-based credal classifiers
  • 10.5.1 Uncertainty measures on credal sets: The maximum entropy function
  • 10.5.2 Obtaining conditional probability intervals with the imprecise Dirichlet model
  • 10.5.3 Classification procedure
  • 10.6 Metrics, experiments and software