Papers in journals
E. Miranda,
I. Couso, P. Gil. Relationships
between possibility measures and nested random sets. International Journal of
Uncertainty, Fuzziness and Knowledge-Based Systems,
10(1), pp. 1--15, 2002.
Abstract:
Different authors have observed some relationships between consonant random
sets and possibility measures, especially for finite
universes. In this paper, we go deeply into this matter and propose several
possible definitions for the concept of consonant random set. Three of these
definitions are equivalent for finite universes. In that case, the random set
considered is associated with a possibility measure if and only if any of them is
satisfied. However, in a general context, none of the six definitions proposed
here is sufficient for a random set to induce a possibility measure.
Moreover, only one of them seems to be necessary.
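The finite-universe connection described in the abstract can be sketched directly: a random set with nested (consonant) focal sets induces an upper probability whose restriction to singletons is a possibility distribution. The focal sets and masses below are purely illustrative:

```python
from itertools import chain, combinations

# Illustrative consonant (nested) random set on {1, 2, 3}: nested focal
# sets with their probability masses.
focal = [({1}, 0.5), ({1, 2}, 0.3), ({1, 2, 3}, 0.2)]

def upper_prob(event):
    """Upper probability of an event: total mass of the focal sets hitting it."""
    return sum(m for F, m in focal if F & event)

# The restriction to singletons is a possibility distribution.
pi = {x: upper_prob({x}) for x in (1, 2, 3)}
print(pi)  # {1: 1.0, 2: 0.5, 3: 0.2}

# Consonance makes the upper probability maxitive on this finite space:
# P*(A ∪ B) = max(P*(A), P*(B)) for all events A, B.
events = [set(s) for s in chain.from_iterable(
    combinations((1, 2, 3), r) for r in range(1, 4))]
assert all(abs(upper_prob(A | B) - max(upper_prob(A), upper_prob(B))) < 1e-12
           for A in events for B in events)
```

As the paper shows, this equivalence between consonance and maxitivity breaks down once the universe is infinite.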
E. Miranda,
G. de Cooman. Epistemic
independence in numerical possibility theory. International Journal of Approximate
Reasoning, 32(1), pp. 23--42, 2003.
Abstract:
Numerical possibility measures can be interpreted as systems of upper betting
rates for events. As such, they have a special part in the unifying behavioural
theory of imprecise probabilities, proposed by Walley.
On this interpretation, they should arguably satisfy certain rationality, or
consistency, requirements, such as avoiding sure loss and coherence. Using a
version of Walley's notion of epistemic independence
suitable for possibility measures, we study in detail what these rationality
requirements tell us about the construction of independent product possibility
measures from given marginals, and we obtain
necessary and sufficient conditions for a product to satisfy these criteria. In
particular, we show that the well-known minimum and product rules for forming
independent joint distributions from marginal ones are only coherent when at
least one of these distributions assumes only the values zero and one.
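The two joint-formation rules examined in the paper are easy to state on finite spaces; the marginal distributions below are illustrative:

```python
# Two illustrative max-normalised marginal possibility distributions.
pi1 = {'a': 1.0, 'b': 0.4}
pi2 = {'x': 1.0, 'y': 0.7}

# Minimum rule: pi(u, v) = min(pi1(u), pi2(v)).
joint_min = {(u, v): min(p, q) for u, p in pi1.items() for v, q in pi2.items()}

# Product rule: pi(u, v) = pi1(u) * pi2(v).
joint_prod = {(u, v): p * q for u, p in pi1.items() for v, q in pi2.items()}

print(joint_min[('b', 'y')])   # 0.4
print(joint_prod[('b', 'y')])  # ≈ 0.28
```

On the betting interpretation studied in the paper, neither joint is coherent for these marginals, since neither marginal is 0-1-valued.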
E. Miranda, I. Couso, P. Gil. Extreme points
of credal sets generated by 2-alternating capacities. International Journal of Approximate
Reasoning, 33 (1), pp. 95--115, 2003.
Abstract:
The characterization of the extreme points constitutes a crucial issue in the
investigation of convex sets of probabilities, not only from a purely
theoretical point of view, but also as a tool in the management of imprecise
information. In this respect, different authors have found an interesting
relation between the extreme points of the class of probability measures
dominated by a 2-alternating Choquet capacity and the permutations of the
elements in the referential. However, they
have all restricted their work to the case of a finite referential space. In an
infinite setting, some technical complications arise and they have to be
carefully treated. In this paper, we extend the mentioned result to the more
general case of separable metric spaces. Furthermore, we derive some
interesting topological properties about the convex sets of probabilities here
investigated. Finally, a closer look at the case of possibility measures is
given: for them, we prove that the number of extreme points can be reduced even
in the finite case.
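In the finite case, the permutation-based characterisation mentioned above is concrete: for a 2-monotone lower probability, each permutation of the referential yields a candidate extreme point whose masses are successive differences along the permutation's chain of sets. A minimal sketch, using an illustrative linear-vacuous model:

```python
from itertools import permutations

omega = (0, 1, 2)

def low(events_subset):
    """Illustrative linear-vacuous lower probability: 0.8 * uniform on
    proper subsets, 1 on the whole space. It is 2-monotone."""
    A = frozenset(events_subset)
    return 1.0 if A == frozenset(omega) else 0.8 * len(A) / len(omega)

def extreme_points(low, omega):
    """One candidate extreme point per permutation: masses are the
    successive differences of low along the permutation's chain of sets."""
    pts = set()
    for sigma in permutations(omega):
        masses, prev, current_chain = {}, 0.0, []
        for x in sigma:
            current_chain.append(x)
            cur = low(current_chain)
            masses[x] = cur - prev
            prev = cur
        pts.add(tuple(round(masses[x], 10) for x in omega))
    return sorted(pts)

for p in extreme_points(low, omega):
    print(p)
```

Here the six permutations collapse to three distinct extreme points, an instance of the reduction the paper proves for possibility measures and related models.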
E. Miranda, I. Couso, P. Gil. A random set characterisation of
possibility measures. Information Sciences, 168 (1--4), pp.
51-75, 2004.
Abstract:
Several authors have pointed out the relationship between consonant random sets
and possibility measures. However, this relationship has only been proven for
the finite case, where the Möbius inverse of the
upper probability induced by the random set simplifies the computations to a
great extent. In this paper, we study the connection between both concepts for
arbitrary referential spaces. We complete existing results about the lack of an
implication in general with necessary and sufficient conditions for the most interesting cases.
E. Miranda, G. de Cooman, I.
Couso. Lower previsions induced by multi-valued mappings. Journal of
Statistical Planning and Inference, 133 (1), pp. 173-197,
2005.
Abstract:
We discuss how lower previsions induced by multi-valued mappings fit into the
framework of the behavioural theory of imprecise probabilities, and show how
the notions of coherence and natural extension from that theory can be used to
prove and generalise existing results in an elegant and straightforward manner.
This provides a clear example for their explanatory and unifying power.
E. Miranda,
I. Couso, P. Gil, Random
sets as imprecise random variables. Journal of
Mathematical Analysis and Applications, 307 (1), pp. 32-47, 2005.
Abstract:
Given a random set coming from the imprecise observation of a random variable,
we study how to model the information about the probability distribution of
this random variable. Specifically, we investigate whether the information
given by the upper and lower probabilities induced by the random set is
equivalent to the one given by the class of the probabilities induced by the
measurable selections; together with sufficient conditions for this, we also
give examples showing that they are not equivalent in all cases.
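On a finite space, the upper and lower probabilities induced by a random set are simple to compute; the mapping below is an illustrative example, not one from the paper:

```python
# Illustrative finite random set: on each outcome (with its probability)
# we only observe a set known to contain the value of the variable.
gamma = [({1}, 0.5), ({2, 3}, 0.3), ({1, 3}, 0.2)]

def lower(A):
    """Lower probability: mass of the outcomes whose set is contained in A."""
    return sum(m for G, m in gamma if G <= A)

def upper(A):
    """Upper probability: mass of the outcomes whose set meets A."""
    return sum(m for G, m in gamma if G & A)

A = {1, 3}
print(round(lower(A), 10), round(upper(A), 10))  # 0.7 1.0
```

Every measurable selection picks one element from each observed set, so its induced probability of A lies between these two bounds; the paper studies when the bounds carry exactly the same information as the set of selection probabilities.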
E. Miranda, I. Couso, P. Gil. Random
intervals as a model for imprecise information. Fuzzy Sets and Systems, 154(3), pp. 386-412, 2005.
Abstract: Random intervals are one of the most widely applied classes of random
sets. In this paper, we regard them as the imprecise observation of a random
variable, and study how to model the information about the probability
distribution of that variable. Two possible models are the probability distributions
of the measurable selections and those bounded by the upper probability. We
prove that, under some hypotheses, the closures of these two sets in the
topology of the weak convergence coincide, improving results from the
literature. Moreover, we provide examples showing that the two models are not
equivalent in general, and give sufficient conditions for the equality between
them. Finally, we comment on the relationship between random intervals and
fuzzy numbers.
G. de Cooman, M. Troffaes, E. Miranda. n-monotone lower previsions. Journal of
Intelligent and Fuzzy Systems, 16(4) (Special
Issue dedicated to the 60th birthday of Etienne E. Kerre),
pp. 253-263, 2005.
Abstract:
We study n-monotone lower previsions, which constitute a generalisation of
n-monotone lower probabilities. We investigate their relation with the concepts
of coherence and natural extension in the behavioural theory of imprecise
probabilities, and improve along the way upon a number of results from the
literature.
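For set functions, the n = 2 instance of n-monotonicity is the inequality f(A ∪ B) + f(A ∩ B) ≥ f(A) + f(B), which can be checked exhaustively on a small finite space. The lower probability below is an illustrative epsilon-contamination model:

```python
from itertools import chain, combinations

def subsets(xs):
    """All events (subsets) of a finite space, as frozensets."""
    return [frozenset(s) for s in chain.from_iterable(
        combinations(xs, r) for r in range(len(xs) + 1))]

def is_2_monotone(f, omega, tol=1e-12):
    """Check f(A | B) + f(A & B) >= f(A) + f(B) for all events A, B."""
    evs = subsets(omega)
    return all(f(A | B) + f(A & B) + tol >= f(A) + f(B)
               for A in evs for B in evs)

# Illustrative lower probability on {0, 1, 2}: contamination of the uniform.
omega = frozenset({0, 1, 2})
def low(A):
    return 1.0 if A == omega else 0.9 * len(A) / 3

print(is_2_monotone(low, omega))  # True
```

The paper's n-monotone lower previsions generalise this from events to gambles; the set-function check above is only the special case that motivates them.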
E. Miranda, G. de Cooman, E. Quaeghebeur. The Hausdorff moment
problem under finite additivity. Journal of Theoretical Probability, 20(3), pp. 663-693, 2007.
We
investigate to what extent finitely additive probability measures on the unit
interval are determined by their moment sequence. We do this by studying the
lower envelope of all finitely additive probability measures with a given
moment sequence. Our investigation leads to several elegant expressions for
this lower envelope, and it allows us to conclude that the information provided
by the moments is equivalent to the one given by the associated lower and upper
distribution functions.
E. Miranda, G. de Cooman. Marginal
extension in the theory of coherent lower previsions. International Journal of Approximate
Reasoning, 46(1), pp. 188--225, 2007.
We generalise Walley's Marginal
Extension Theorem to the case of any finite number of conditional lower
previsions. Unlike the procedure of natural extension, our marginal extension
always provides the smallest (most conservative) coherent extensions. We show
that they can also be calculated as lower envelopes of marginal extensions of
conditional linear (precise) previsions. Finally, we use our version of the
theorem to study the so-called forward irrelevant product and forward
irrelevant natural extension of a number of marginal lower previsions.
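Marginal extension composes a marginal lower prevision with conditional lower previsions on a partition: P(f) is obtained as P1(P2(f | ·)). A minimal finite-space sketch, with illustrative numbers (vacuous conditionals and a two-point marginal credal set):

```python
# Illustrative two-stage model on {0, 1, 2, 3} with partition {{0,1}, {2,3}}.
partition = [(0, 1), (2, 3)]

# Conditional lower previsions on each partition element: here the vacuous
# model, i.e. the minimum of f over the element.
def cond_lower(f, B):
    return min(f[x] for x in B)

# Marginal lower prevision over the two partition elements: the lower
# envelope of two illustrative mass functions.
marg_extremes = [(0.3, 0.7), (0.6, 0.4)]
def marg_lower(g):
    return min(sum(p * gv for p, gv in zip(pt, g)) for pt in marg_extremes)

def marginal_extension(f):
    """Concatenation P(P(f | partition)): coherent, and the smallest
    coherent extension of the two assessments."""
    return marg_lower([cond_lower(f, B) for B in partition])

f = [1.0, 3.0, 0.0, 2.0]
print(marginal_extension(f))
```

For f above, the conditional lower previsions are 1.0 and 0.0 on the two blocks, and the marginal envelope then yields 0.3.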
G. de Cooman, M.
Troffaes, E. Miranda. A unifying
approach to integration for bounded positive charges. Journal of Mathematical Analysis and Applications, 340(2), 982-999, 2008.
This paper deals with $n$-monotone functionals,
which constitute a generalisation of $n$-monotone set functions. Using the
notion of exactness of a functional, we introduce a new notion of lower and
upper integral which subsumes as particular cases most of the approaches to
integration in the literature. As a consequence, we can characterise which
types of integrals can be used to calculate the natural extension (the lower
envelope of all linear extensions) of a positive bounded charge.
E. Miranda, G. de Cooman, E. Quaeghebeur. Finitely additive extensions of
distribution functions and moment sequences: the coherent lower prevision
approach. International Journal of
Approximate Reasoning, 48(1), 132-155, 2008.
We study the information that a distribution function provides about the
finitely additive probability measure inducing it. We show that in general
there is an infinite number of finitely additive
probabilities associated with the same distribution function. Secondly, we
investigate the relationship between a distribution function and its given
sequence of moments. We provide formulae for the sets of distribution
functions, and finitely additive probabilities, associated with some moment
sequence, and determine under which conditions the moments determine the
distribution function uniquely. We show that all these problems can be
addressed efficiently using the theory of coherent lower previsions.
E. Miranda.
A survey of the theory of coherent lower
previsions. International Journal of Approximate Reasoning,
48(2), 628-658, 2008.
This paper
presents a summary of Peter Walley's theory of
coherent lower previsions. We introduce three representations of coherent
assessments: coherent lower and upper previsions, closed and convex sets of
linear previsions, and sets of desirable gambles. We show also how the notion
of coherence can be used to update our beliefs with new information, and a
number of possibilities to model the notion of independence with coherent lower
previsions. Next, we comment on the connection with other approaches in the
literature: de Finetti's and Williams' earlier work, Kuznetsov's and Weichselberger's
work on interval-valued probabilities, Dempster-Shafer
theory of evidence and Shafer and Vovk's
game-theoretic approach. Finally, we present a brief survey of some
applications and summarize the main strengths and challenges of the theory.
Erratum: there was a typo at the end of Example 6: the lower prevision of (first red, second green) is 2/9 and not 4/9. This does not affect the conclusions of the example (that independence in the selection does not imply strong independence). The version of the file on this website has been corrected. Note that the same example appears as Example 3.4 in the chapter 'Structural judgements' of the book 'Introduction to imprecise probabilities'.
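On a finite space, one of the survey's three representations is immediate to compute with: a coherent lower prevision is the lower envelope of the expectations taken over a closed convex set of linear previsions. The credal set below is illustrative (its extreme points dominate some fixed lower bounds on singletons):

```python
# Illustrative credal set on {0, 1, 2}: all mass functions dominating
# p(0) >= 0.2, p(1) >= 0.1, p(2) >= 0.3. Its extreme points put the
# remaining mass 0.4 entirely on one element.
lower_bounds = [0.2, 0.1, 0.3]
slack = 1 - sum(lower_bounds)
extremes = []
for i in range(3):
    p = list(lower_bounds)
    p[i] += slack
    extremes.append(p)

def lower_prevision(f):
    """Lower envelope of the expectations over the extreme points."""
    return min(sum(p * fx for p, fx in zip(pt, f)) for pt in extremes)

def upper_prevision(f):
    """Conjugacy: the upper prevision is -P(-f)."""
    return -lower_prevision([-x for x in f])

f = [1.0, 0.0, 2.0]
print(lower_prevision(f), upper_prevision(f))
```

For the gamble f above this gives the interval [0.8, 1.6] of acceptable buying and selling prices, the behavioural reading used throughout the survey.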
G. de Cooman, E. Miranda. Weak and strong
laws of large numbers for coherent lower previsions. Journal of Statistical
Planning and Inference, 138(8), 2409-2432, 2008.
We prove
weak and strong laws of large numbers for coherent lower previsions, where the
lower prevision of a random variable is given a behavioural
interpretation as a subject's supremum acceptable
price for buying it. Our laws are a consequence of the rationality criterion of
coherence, and they can be proven under assumptions that are surprisingly weak
when compared to the standard formulation of the laws in more classical
approaches to probability theory.
G. de Cooman, M. Troffaes, E. Miranda. n-Monotone exact functionals.
Journal
of Mathematical Analysis and Applications, 347(1), 143--156, 2008.
We study n-monotone
functionals, which constitute a generalisation
of n-monotone set functions. We investigate their relation to the
concepts of exactness and natural extension, which generalise
the notions of coherence and natural extension in the behavioural
theory of imprecise probabilities. We improve upon a number of results in the
literature, and prove among other things a representation result for exact n-monotone
functionals in terms of Choquet
integrals.
E. Miranda,
M. Zaffalon. Coherence graphs. Artificial
Intelligence, 173(1), 104--144, 2009.
We study
the consistency of a number of probability distributions, which are allowed to
be imprecise. To make the treatment as general as possible, we represent those
probabilistic assessments as a collection of conditional lower previsions.
The problem then becomes proving Walley's (strong)
coherence of the assessments. In order to maintain generality in the
analysis, we assume to be given nearly no information about the numbers that
make up the lower previsions in the collection. Under this condition, we
investigate the extent to which the above global task can be decomposed into
simpler and more local ones. This is done by introducing a graphical
representation of the conditional lower previsions that we call the coherence
graph: we show that the coherence graph allows one to isolate some subsets
of the collection whose coherence is sufficient for the coherence of all the
assessments; and we provide a polynomial-time algorithm that finds the subsets
efficiently. We show some of the implications of our results by focusing on
three models and problems: Bayesian and credal
networks, of which we prove coherence; the compatibility problem,
for which we provide an optimal graphical decomposition; probabilistic satisfiability, of which we show that some intractable
instances can instead be solved efficiently by exploiting coherence graphs.
G. de Cooman, E. Miranda. Forward
irrelevance. Journal of
Statistical Planning and Inference, 139(2), 256--276, 2009.
We
investigate how to combine marginal assessments about the values that random
variables assume separately into a model for the values that they assume
jointly, when (i) these marginal assessments are modelled by means of coherent lower previsions, and (ii) we
have the additional assumption that the random variables are forward epistemically irrelevant to each other. We consider and
provide arguments for two possible combinations, namely the forward irrelevant
natural extension and the forward irrelevant product, and we study the
relationships between them. Our treatment also uncovers an interesting
connection between the behavioural theory of coherent
lower previsions, and Shafer and Vovk's
game-theoretic approach to probability theory.
G. de Cooman, E. Miranda, E. Quaeghebeur. Representation insensitivity in
immediate prediction. International Journal of Approximate
Reasoning, 50(2) special issue on the Imprecise Dirichlet
Model, 204--216, 2009.
We consider
immediate predictive inference, where a subject, using a number of observations
of a finite number of exchangeable random variables, is asked to coherently
model his beliefs about the next observation, in terms of a predictive lower
prevision. We study when such predictive lower previsions are representation
insensitive, meaning that they are essentially independent of the choice of the
(finite) set of possible values for the random variables. We establish that
such representation insensitive predictive models have very interesting
properties, and show that among such models, the ones produced by the Imprecise
Dirichlet (Multinomial) Model are quite special in a
number of ways. In the Conclusion, we discuss the open question as to how
unique the Imprecise Dirichlet (Multinomial) Model
predictive lower previsions are in being representation insensitive.
E. Miranda.
Updating coherent previsions on
finite spaces. Fuzzy Sets and Systems, 160(9),
1286-1307, 2009.
We compare
the different notions of conditional coherence within the behavioural
theory of imprecise probabilities when all the spaces are finite. We show that
the differences between the notions are due to conditioning on sets of (lower,
and in some cases upper) probability zero. Next, we characterise
the range of coherent
extensions in the finite case, proving that the greatest coherent extensions
can always be calculated using the notion of regular extension, and we discuss
the extensions of our results to infinite spaces.
M. Zaffalon, E. Miranda. Conservative inference rule for uncertain reasoning under
incompleteness. Journal of
Artificial Intelligence Research, 34, 757-821, 2009.
In this
paper we formulate the problem of inference under incomplete information in
very general terms. This includes modelling the
process responsible for the incompleteness, which we call the incompleteness
process. We allow the process' behaviour to be partly
unknown. Then we use Walley's theory of coherent
lower previsions, a generalisation of the Bayesian
theory to imprecision, to derive the rule to update beliefs under
incompleteness that logically follows from our assumptions, and that we call
conservative inference rule. This rule has some remarkable properties: it is an
abstract rule to update beliefs that can be applied in any situation or domain;
it gives us the opportunity to be neither too optimistic nor too pessimistic
about the incompleteness process, which is a necessary condition to draw
reliable yet sufficiently strong conclusions; and it is a coherent rule, in the
sense that it cannot lead to inconsistencies. We give examples to show how the
new rule can be applied in expert systems, in parametric statistical inference,
and in pattern classification,
and discuss more generally the view of incompleteness processes defended here
as well as some of its consequences.
G. de Cooman, E. Miranda, E. Quaeghebeur. Exchangeable lower previsions. Bernoulli, 15(3), 721-735,
2009.
We extend
de Finetti's notion of exchangeability to finite and
countable sequences of variables, when a subject's beliefs about them are modelled using coherent lower previsions rather than
(linear) previsions. We derive representation theorems in both the finite and
the countable case, in terms of sampling without and with replacement,
respectively.
E. Miranda, I. Couso, P. Gil. Approximation
of upper and lower probabilities by measurable selections. Information
Sciences, 180(8), 1407-1417, 2010.
A random
set can be regarded as the result of the imprecise observation of a random
variable. Following this interpretation, we study to which extent the upper and
lower probabilities induced by the random set keep all the information about
the values of the probability distribution of the random variable. We link this
problem to the existence of selectors of a multi-valued mapping and with the
inner approximations of the upper probability, and prove that under fairly
general conditions (although not in all cases), the
upper and lower probabilities are an adequate tool for modelling
the available information. In doing this, we generalise
a number of results from the literature. Finally, we study the particular case
of consonant random sets and we also derive a relationship between Aumann and Choquet integrals.
E. Miranda,
M. Zaffalon. Conditional
models: coherence and inference through sequences of joint mass functions. Journal of Statistical Planning and Inference, 140(7), 1805-1833,
2010.
We call a conditional
model any set of statements made of conditional probabilities or
expectations. We take conditional models as primitive compared to unconditional
probability, in the sense that conditional statements do not need to be derived
from an unconditional probability. We focus on two problems: (coherence)
giving conditions to guarantee that a conditional model is self-consistent; (inference)
delivering methods to derive new probabilistic statements from a
self-consistent conditional model. We address these problems in the case where
the probabilistic statements can be specified imprecisely through sets of
probabilities, while restricting the attention to finite spaces of
possibilities. Using Walley's theory of coherent
lower previsions, we fully characterise the
question of coherence, and specialise it for the case
of precisely specified probabilities, which is the most common case addressed
in the literature. This shows that coherent conditional models are equivalent
to sequences of (possibly sets of) unconditional mass functions. In turn, this
implies that the inferences from a conditional model are the limits of the
conditional inferences obtained by applying Bayes'
rule, when possible, to the elements of the sequence. In doing so, we unveil
the tight connection between conditional models and zero-probability events.
Such a connection appears to have been overlooked by most previous works on the
subject, thus preventing until now a full account of coherence and inference
for conditional models.
Enrique
Miranda, Marco Zaffalon. Notes on desirability and conditional lower previsions.
Annals of Mathematics and Artificial Intelligence, 60(3-4),
251-309, 2010.
We detail
the relationship between sets of desirable gambles and conditional lower
previsions. The former is one of the most general models of uncertainty. The
latter corresponds to Walley's celebrated theory of
imprecise probability. We consider two avenues: when a collection of
conditional lower previsions is derived from a set of desirable gambles, and
its converse. In either case, we relate the properties of the derived model
with those of the originating one. Our results constitute basic tools to move
from one formalism to the other, and thus to take advantage of work done in the
two fronts. Sets of desirable gambles are at the same time very powerful and
intuitive models of uncertainty. Given the central role of uncertainty in
artificial intelligence, this work marks a key passage towards the wider
accessibility of those modelling capabilities in
artificial intelligence.
Alessio Benavoli,
Marco Zaffalon, Enrique Miranda, A new robust approach to
filtering based on coherent lower previsions. IEEE
Transactions on Automatic Control, 56(7), 1567-1581, 2011.
The
classical filtering problem is re-examined to take into account imprecision in
the knowledge about the probabilistic relationships involved. To achieve that,
we consider closed convex sets of probabilities, also called coherent lower
previsions. In addition to the general formulation, we study in detail a
particular case of interest: linear-vacuous mixtures. We also show, in a
practical case, that our extension outperforms the Kalman
filter when modelling errors are present in the
system.
Gert de Cooman,
Enrique Miranda, Marco Zaffalon. Independent natural extension. Artificial
Intelligence, 175(12-13), 1911-1950, 2011.
There is no
unique extension of the standard notion of probabilistic independence to the
case where probabilities are indeterminate or imprecisely specified. Epistemic
independence is an extension that formalises the
intuitive idea of mutual irrelevance between different sources of information.
It has a wide scope and great appeal, especially for a field like Artificial
Intelligence, where such an idea or interpretation of independence has already
been employed quite often in precise-probabilistic contexts. Despite this,
epistemic independence has received little attention so far. This paper
develops its foundations for variables assuming values in finite spaces. We
define (epistemically) independent products of marginals (or possibly conditionals) and show that there
always is a unique least-committal such independent product, which we call the independent
natural extension. We
supply an explicit formula for it, and study some of its properties: associativity, marginalisation,
external additivity, which are basic tools to work
with the independent natural extension. Additionally, we consider a number of
ways in which the standard factorisation formula for
independence can be generalised to an imprecise-probabilistic
context. We show, under some mild conditions, that when the focus is on
least-committal models, using the independent natural
extension is equivalent to imposing a so-called strong factorisation property. This is an important outcome for
applications as it gives a simple tool to make sure that inferences are
consistent with epistemic independence judgments. We discuss the potential of
our results for applications in Artificial Intelligence by recalling recent
work by some of us, where the independent natural extension was applied to
graphical models. It has allowed, for the first time, the development of an
exact linear-time algorithm for the imprecise-probability updating of credal trees.
Enrique Miranda, Marco Zaffalon, Gert de Cooman. Conglomerable natural
extension. International Journal of Approximate Reasoning, 53(8), 1200--1227,
2012.
At the
foundations of probability theory lies a question that has been open since de Finetti framed it in 1930: whether or not an uncertainty model
should be required to be conglomerable. Conglomerability is related to accepting infinitely many
conditional bets. Walley is one of the authors who
have argued in favor of conglomerability, while de Finetti rejected the idea. In this paper we study the
extension of the conglomerability condition to two
types of uncertainty models that are more general than the ones envisaged by de
Finetti: sets of desirable gambles and coherent lower
previsions. We focus in particular on the weakest (i.e., the least-committal)
of those extensions, which we call the conglomerable
natural extension. The weakest extension that does not take conglomerability into account is simply called the natural
extension. We show that taking the natural extension of assessments after imposing
conglomerability---the procedure adopted in Walley's theory---does not yield, in general, the conglomerable natural extension (but it does so in the case
of the marginal extension). Iterating this process of imposing conglomerability and taking the natural extension produces
a sequence of models that approach the conglomerable
natural extension, although it is not known, at this point, whether this
sequence converges to it. We give sufficient conditions for this to happen in
some special cases, and study the differences between working with coherent
sets of desirable gambles and coherent lower previsions. Our results indicate
that it is necessary to rethink the foundations of Walley's
theory of coherent lower previsions for infinite partitions of conditioning
events.
Gert de Cooman,
Enrique Miranda. Irrelevant
and independent natural extension for sets of desirable gambles. Journal of Artificial Intelligence Research,
45, 601-640, 2012.
The results
in this paper add useful tools to the theory of sets of desirable gambles, a
growing toolbox for reasoning with partial probability assessments. We
investigate how to combine a number of marginal coherent sets of desirable
gambles into a joint set using the properties of epistemic irrelevance and
independence. We provide formulas for the smallest such joint, called their
independent natural extension, and study its main properties. The independent
natural extension of maximal coherent sets of desirable gambles allows us to
define the strong product of sets of desirable gambles. Finally, we explore an
easy way to generalise these results so that they also apply to the
conditional versions of epistemic irrelevance and independence. Having
such a set of tools that are easily implemented in computer programs is clearly
beneficial to fields, like AI, with a clear interest in coherent reasoning
under uncertainty using general and robust uncertainty models that require no
full specification.
Matthias Troffaes,
Enrique Miranda, Sebastien Destercke.
On the
connection between probability boxes and possibility measures. Information Sciences,
224, 88-108, 2013.
We explore the relationship between p-boxes on totally preordered spaces and possibility measures. We start by demonstrating that only those p-boxes whose lower or upper cumulative distribution function is 0-1-valued can be possibility measures, and we derive expressions for their natural extension in this case. Next, we establish necessary and sufficient conditions for a p-box to be a possibility measure. Finally, we show that almost every possibility measure can be modelled by a p-box. Hence, any techniques for p-boxes can be readily applied to possibility measures. We demonstrate this by deriving joint possibility measures from marginals, under varying assumptions of independence, using a technique known for p-boxes.
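The direction from possibility measures to p-boxes can be sketched on a finite ordered space: the upper cumulative distribution function is the possibility of each lower set, and the lower one its necessity. The possibility distribution below is illustrative:

```python
# Illustrative possibility distribution on the ordered space 1 < 2 < 3 < 4.
pi = {1: 0.2, 2: 1.0, 3: 0.6, 4: 0.3}
xs = sorted(pi)

# Upper cdf: possibility of the lower set {y : y <= x}.
upper_cdf = {x: max(pi[y] for y in xs if y <= x) for x in xs}
# Lower cdf: necessity of the same set, 1 - max possibility outside it.
lower_cdf = {x: round(1 - max((pi[y] for y in xs if y > x), default=0.0), 10)
             for x in xs}

print(upper_cdf)  # {1: 0.2, 2: 1.0, 3: 1.0, 4: 1.0}
print(lower_cdf)  # {1: 0.0, 2: 0.4, 3: 0.7, 4: 1.0}
```

The pair (lower_cdf, upper_cdf) is the p-box representation the paper exploits to carry p-box techniques over to possibility measures.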
Marco Zaffalon, Enrique Miranda. Probability and time. Artificial Intelligence, 198, 1--51, 2013.
Probabilistic reasoning is often attributed a temporal meaning, in which conditioning is regarded as a normative rule to compute future beliefs out of current beliefs and observations. However, the well-established ‘updating interpretation’ of conditioning is not concerned with beliefs that evolve in time, and in particular with future beliefs. On the other hand, a temporal justification of conditioning was already proposed by De Moivre and Bayes, by requiring that current and future beliefs be consistent. We reconsider the latter proposal while dealing with a generalised version of the problem, using a behavioural theory of imprecise probability in the form of coherent lower previsions as well as of coherent sets of desirable gambles, and letting the possibility space be finite or infinite. We obtain that using conditioning is normative, in the imprecise case, only if one establishes future behavioural commitments at the same time as current beliefs. In this case it is also normative that present beliefs be conglomerable, which is a result that touches on a long-standing controversy at the foundations of probability. In the remaining case, where one commits to some future behaviour after establishing present beliefs, we characterise the several possibilities to define consistent future assessments; this shows in particular that temporal consistency does not preclude changes of mind. And yet, our analysis does not support the claim that rationality requires consistency in general, even though pursuing consistency makes sense and is useful, at least as a way to guide and evaluate the assessment process.
These considerations narrow down in the special case of precise probability, because this formalism cannot distinguish the two different situations illustrated above: it turns out that the only consistent rule is conditioning and moreover that it is not rational to be willing to stick to precise probability while using a rule different from conditioning to compute future beliefs; rationality requires in addition the disintegrability of the present-time probability.
We contrast Williams' and Walley's theories
of coherent lower previsions in the light of conglomerability. These are two of
the most credited approaches to a behavioural theory of imprecise probability.
Conglomerability is the notion that distinguishes them the most: Williams'
theory does not consider it, while Walley aims
at embedding it in his theory. This question is important, as conglomerability
is a major point of disagreement at the foundations of probability, since it was
first defined by de Finetti in 1930. We show that Walley's notion of joint
coherence (which is the single axiom of his theory) for conditional lower
previsions does not take all the implications of conglomerability into account.
Considering also some previous results in the literature, we deduce that Williams'
theory should be the one to use when conglomerability is not required; for the
opposite case, we define the new theory of conglomerably coherent lower
previsions, which is arguably the one to use, and of which Walley's theory can
be understood as an approximation. We show that this approximation is exact in
two important cases: when all conditioning events have positive lower
probability, and when conditioning partitions are nested.
Ignacio Montes, Enrique Miranda, Susana Montes. Stochastic
dominance with imprecise information. Computational Statistics and
Data Analysis, 71(C), 867--885, 2014.
Stochastic
dominance, which is based on the comparison of distribution functions, is one
of the most popular preference measures. However, its use is limited to the
case where the goal is to compare pairs of distribution functions, whereas in
many cases it is interesting to compare sets of distribution functions: this
may be the case, for instance, when the available information does not allow us to fully elicit the probability distributions of the
random variables. To deal with these situations, a number of generalisations of the notion of stochastic dominance are
proposed; their connection with an equivalent p-box representation of the sets
of distribution functions is studied; a number of particular cases, such as
sets of distributions associated to possibility measures, are investigated; and
an application to the comparison of the Lorenz curves of countries within the
same region is presented.
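The classical notion being generalised in this paper can be illustrated with a short sketch (hypothetical code, not from the paper): first-order stochastic dominance compares two discrete random variables through their distribution functions, X dominating Y when F_X(t) ≤ F_Y(t) at every threshold t.

```python
# Hypothetical illustration: first-order stochastic dominance between two
# discrete random variables, checked pointwise on their distribution functions.

def cdf(values, probs, t):
    """Distribution function of a discrete variable at point t."""
    return sum(p for v, p in zip(values, probs) if v <= t)

def stochastically_dominates(vals_x, probs_x, vals_y, probs_y):
    """True if X first-order stochastically dominates Y."""
    grid = sorted(set(vals_x) | set(vals_y))
    return all(cdf(vals_x, probs_x, t) <= cdf(vals_y, probs_y, t)
               for t in grid)

# X puts more mass on higher outcomes than Y, so X dominates Y.
X = ([1, 2, 3], [0.25, 0.25, 0.5])
Y = ([1, 2, 3], [0.5, 0.25, 0.25])
print(stochastically_dominates(*X, *Y))  # True
print(stochastically_dominates(*Y, *X))  # False
```

The generalisations studied in the paper replace the two single distribution functions with sets of them, summarised by p-boxes.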
Gert de Cooman,
Enrique Miranda.
Lower
previsions induced by filter maps. Journal of Mathematical Analysis and Applications,
410(1), 101--116, 2014.
We
investigate under which conditions a transformation of an imprecise probability
model of a certain type (coherent lower previsions, n-monotone capacities, minitive measures) produces a
model of the same type. We give a number of necessary and sufficient
conditions, and study in detail a particular class of such transformations,
called filter maps. These maps include, as particular cases, multi-valued mappings as well as other models of interest within
imprecise probability theory, and can be linked to filters of sets and 0--1-valued
lower probabilities.
Ignacio Montes, Enrique Miranda, Susana Montes. Decision making with imprecise utilities and beliefs by means of statistical preference and stochastic dominance. European Journal of Operational Research, 234(1), 209--220, 2014.
A problem of decision making under
uncertainty in which the choice must be made between two sets of alternatives
instead of two single ones is considered. A number of choice rules are proposed
and their main properties are investigated, focusing particularly on the
generalizations of stochastic dominance and statistical preference. The
particular cases where imprecision is present in the utilities or in the beliefs
associated to two alternatives are considered.
Enrique Miranda, Marco Zaffalon. On the problem of computing the conglomerable natural extension. International Journal of Approximate Reasoning, 56(A), 1--27, 2015.
Embedding conglomerability as a rationality requirement in probability was among the aims of Walley's behavioural theory of coherent lower previsions. However, recent work has shown that this attempt has only been partly successful. If we focus in particular on the extension of given assessments to a rational and conglomerable model (in the least-committal way), we have that the procedure used in Walley's theory, the natural extension, provides only an approximation to the model that is actually sought: the so-called conglomerable natural extension. In this paper we consider probabilistic assessments in the form of a coherent lower prevision P, which is another name for a lower expectation functional, and make an in-depth mathematical study of the problem of computing the conglomerable natural extension for this case: that is, where it is defined as the smallest coherent lower prevision F ≥ P that is conglomerable, in case it exists. Past work has shown that F can be approximated by an increasing sequence (Fn)n of coherent lower previsions. We solve an open problem by showing that this sequence can consist of infinitely many distinct elements. Moreover, we give sufficient conditions, of quite broad applicability, to make sure that the point-wise limit of the sequence is F in case P is the lower envelope of finitely many linear previsions. In addition, we study the question of the existence of F and its relationship with the notion of marginal extension.
Enrique Miranda, Ignacio Montes. Coherent updating of non-additive measures. International Journal of Approximate Reasoning, 56 (B), 159--177, 2015.
The conditions under which a 2-monotone lower prevision can be uniquely updated to a conditional lower prevision are determined. Then a number of particular cases are investigated: completely monotone lower previsions, for which equivalent conditions in terms of the focal elements of the associated belief function are established; random sets, for which some conditions in terms of the measurable selections can be given; and minitive lower previsions, which are shown to correspond to the particular case of vacuous lower previsions.
Ignacio Montes, Enrique Miranda, Renato Pelessoni, Paolo Vicig. Sklar's theorem in an imprecise setting. Fuzzy Sets and Systems, 278C, 48--66, 2015.
Sklar's theorem is an important tool that connects bidimensional distribution functions with their marginals by means of a copula. When there is imprecision about the marginals, we can model the available information by means of p-boxes, that are pairs of ordered distribution functions. Similarly, we can consider a set of copulas instead of a single one. We study the extension of Sklar's theorem under these conditions, and link the obtained results to stochastic ordering with imprecision.
Enrique Miranda, Matthias Troffaes, Sébastien Destercke. A geometric and game-theoretic study of the conjunction of possibility measures. Information Sciences, 298, 273--289, 2015.
In this paper, we study the conjunction of possibility measures when they are interpreted as coherent upper probabilities, that is, as upper bounds for some set of probability measures. We identify conditions under which the minimum of two possibility measures remains a possibility measure. We provide graphical ways to check these conditions, by means of a zero-sum game formulation of the problem. This also gives us a nice way to adjust the initial possibility measures so their minimum is guaranteed to be a possibility measure. Finally, we identify conditions under which the minimum of two possibility measures is a coherent upper probability, or in other words, conditions under which the minimum of two possibility measures is an exact upper bound for the intersection of the credal sets of those two possibility measures.
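As a toy illustration of the phenomenon studied in this paper (all names and the example are hypothetical, not taken from it), the event-wise minimum of two possibility measures on a finite space can fail maxitivity, and hence fail to be a possibility measure:

```python
from itertools import combinations

def poss(pi, A):
    """Possibility measure induced by distribution pi: Pi(A) = max of pi over A."""
    return max((pi[x] for x in A), default=0.0)

def min_is_possibility(pi1, pi2, space):
    """Check maxitivity of the event-wise minimum of the two induced measures."""
    events = [set(c) for r in range(len(space) + 1)
              for c in combinations(space, r)]
    m = lambda A: min(poss(pi1, A), poss(pi2, A))
    return all(abs(m(A | B) - max(m(A), m(B))) < 1e-12
               for A in events for B in events)

space = ['a', 'b']
pi1 = {'a': 1.0, 'b': 0.5}
pi2 = {'a': 0.5, 'b': 1.0}
# min fails maxitivity here: min-measure of {a, b} is 1, but both
# singletons get 0.5, so the minimum is not a possibility measure.
print(min_is_possibility(pi1, pi2, space))  # False
```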
Enrique Miranda, Sébastien Destercke. Extreme points of the credal sets generated by comparative probabilities. Journal of Mathematical Psychology, 64-65, 44--57, 2015.
When using convex probability sets (or, equivalently, lower previsions) as uncertainty models, identifying extreme points can help simplify various computations or the use of some algorithms. In general, sets induced by specific models such as possibility distributions, linear vacuous mixtures or 2-monotone measures may have extreme points that are easier to compute than those of generic convex sets. In this paper, we study the extreme points of another specific model: comparative probability orderings between the elements of a finite space. We use these extreme points to study the properties of the lower probability induced by this set, and connect comparative probabilities with other uncertainty models.
Enrique Miranda, Marco Zaffalon. Independent products in infinite spaces. Journal of Mathematical Analysis and Applications, 425(1), 460-488, 2015.
Probabilistic independence, intended as the mutual irrelevance of given
variables, can be solidly founded on a notion of self-consistency of an
uncertainty model, in particular when probabilities go imprecise. There is
nothing in this approach that prevents it from being adopted in very general
setups, and yet it has mostly been detailed for variables taking finitely
many values. In this mathematical study, we complement previous research by
exploring the extent to which such an approach can be generalised. We focus
in particular on the independent products of two variables. We characterise
the main notions, including some of factorisation and productivity, in the
general case where both spaces can be infinite and show that, however,
there are situations---even in the case of precise probability---where no
independent product exists. This is not the case as soon as at least one
space is finite. We study in depth this case at the frontiers of
well-behaviour, detailing the relations among the most important notions; we
show for instance that being an independent product is equivalent to a
certain productivity condition. Then we step back to the general case: we
give conditions for the existence of independent products and study ways to
get around its inherent limitations.
Enrique Miranda, Marco Zaffalon. Conformity and independence with coherent lower previsions. International Journal of Approximate Reasoning, 78C, 125-137, 2016.
We define the conformity of marginal and conditional models with a joint model within Walley's theory of coherent lower previsions. Loosely speaking, conformity means that the joint can reproduce the marginal and conditional models we started from. By studying conformity with and without additional assumptions of epistemic irrelevance and independence, we establish connections with a number of prominent models in Walley's theory: the marginal extension, the irrelevant natural extension, the independent natural extension and the strong product.
Ignacio Montes, Enrique Miranda, Susana Montes. Imprecise stochastic orders and fuzzy rankings. Fuzzy Optimization and Decision Making, 16(4), 297-327, 2017.
We extend the notion of stochastic order to the pairwise comparison of fuzzy random variables. We consider expected utility, stochastic dominance and statistical preference, which are related to the comparisons of the expectations, distribution functions and medians of the underlying variables, and discuss how to generalize these notions to the fuzzy case, when an epistemic interpretation is given to the fuzzy random variables. In passing, we investigate to what extent the earlier extensions of stochastic dominance and expected utility to the comparison of sets of random variables can be useful as fuzzy rankings.
Enrique Miranda, Marco Zaffalon. Full conglomerability. Journal of Statistical Theory and Practice, 11(4), 634-669, 2017.
We do a thorough mathematical study of the notion of full conglomerability, that is, conglomerability with respect to all the partitions of an infinite possibility space, in the sense considered by Peter Walley in his 1991 book. We consider both the cases of precise and imprecise probability (sets of probabilities). We establish relations between conglomerability and countable additivity, continuity, super-additivity and marginal extension. Moreover, we discuss the special case where a model is conglomerable with respect to a subset of all the partitions, and try to sort out the different notions of conglomerability present in the literature. We conclude that countable additivity, which is routinely used to impose full conglomerability in the precise case, appears to be the most well-behaved way to do so in the imprecise case as well, by taking envelopes of countably additive probabilities. Moreover, we characterise these envelopes by means of a number of necessary and sufficient conditions.
Ignacio Montes, Enrique Miranda. Bivariate p-boxes and maxitive functions. International Journal of General Systems, 46(4), 354-385, 2017.
We give necessary and sufficient conditions for a maxitive function to be the upper probability of a bivariate p-box, in terms of its associated possibility distribution and its focal sets. This allows us to derive conditions in terms of the lower and upper distribution functions of the bivariate p-box. In particular, we prove that only bivariate p-boxes with a non-informative lower or upper distribution function may induce a maxitive function. In addition, we also investigate the extension of Sklar’s theorem to this context.
Patrizia Berti, Enrique Miranda, Pietro Rigo. Basic ideas underlying conglomerability and disintegrability. International Journal of Approximate Reasoning, 88C, 387-400, 2017.
The basic mathematical theory underlying the notions of conglomerability and disintegrability is reviewed, in both the precise and the imprecise cases.
Marco Zaffalon, Enrique Miranda. Axiomatisation of incomplete preferences through sets of desirable gambles. Journal of Artificial Intelligence Research, 60, 1057-1126, 2017.
We establish the equivalence of two very general theories: the first is the decision-theoretic formalisation of incomplete preferences based on the mixture independence axiom; the second is the theory of coherent sets of desirable gambles (bounded variables) developed in the context of imprecise probability and extended here to vector-valued gambles. Such an equivalence allows us to analyse the theory of incomplete preferences from the point of view of desirability. Among other things, this leads us to uncover an unexpected and clarifying relation: that the notion of state independence—the traditional assumption that we can have separate models for beliefs (probabilities) and values (utilities)—coincides with that of strong independence in imprecise probability; this connection leads us also to propose much weaker, and arguably more realistic, notions of state independence. Then we simplify the treatment of complete beliefs and values by putting them on a more equal footing. We study the role of the Archimedean condition—which allows us to actually talk of expected utility—, identify some weaknesses and propose alternatives that solve these. More generally speaking, we show that desirability is a valuable alternative foundation to preferences for decision theory that streamlines and unifies a number of concepts while preserving great generality. In addition, the mentioned equivalence shows for the first time how to extend the theory of desirability to imprecise non-linear utility, thus enabling us to formulate one of the most powerful self-consistent theories of reasoning and decision-making available today.
Arthur Van Camp, Gert de Cooman, Enrique Miranda. Lexicographic choice functions. International Journal of Approximate Reasoning, 92, 97-119, 2018.
We investigate a generalisation of the coherent choice functions considered by Seidenfeld et al. (2010), by sticking to the convexity axiom but imposing no Archimedeanity condition. We define our choice functions on vector spaces of options, which allows us to incorporate as special cases both Seidenfeld et al.’s (2010) choice functions on horse lotteries and also pairwise choice—which is equivalent to sets of desirable gambles (Quaeghebeur, 2014)—, and to investigate their connections. We show that choice functions based on sets of desirable options (gambles) satisfy Seidenfeld’s convexity axiom only for very particular types of sets of desirable options, which are exactly those that are representable by lexicographic probability systems that have no non-trivial Savage-null events. We call them lexicographic choice functions. Finally, we prove that these choice functions can be used to determine the most conservative convex choice function associated with a given binary relation.
Arthur van Camp, Gert de Cooman, Enrique Miranda, Erik Quaeghebeur. Coherent choice functions, desirability and indifference. Fuzzy Sets and Systems, 341C, 1-36, 2018.
We investigate how to model indifference with choice functions. We take the coherence axioms for choice functions proposed by Seidenfeld, Schervish and Kadane as a source of inspiration, but modify them to strengthen the connection with desirability. We discuss the properties of choice functions that are coherent under our modified set of axioms and the connection with desirability. Once this is in place, we present an axiomatisation of indifference in terms of desirability. On this we build our definition of indifference in terms of choice functions, which we discuss in some detail.
Ignacio Montes, Enrique Miranda. Extreme points of the core of possibility measures and p-boxes. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 26(6), 107-1051, 2018.
Under an epistemic interpretation, an upper probability can be regarded as equivalent to the set of probability measures it dominates, sometimes referred to as its core. In this paper, we study the properties of the number of extreme points of the core of a possibility measure, and investigate in detail those associated with (uni- and bi-)variate p-boxes, that model the imprecise information about a cumulative distribution function.
Enrique Miranda, Ignacio Montes. Shapley and Banzhaf values as probability transformations. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 26(6), 917-947, 2018.
We investigate the role of some game solutions, such as the Shapley and Banzhaf values, as probability transformations. The first coincides with the pignistic transformation proposed in the Transferable Belief Model; the second is not efficient in general, leading us to consider its normalized version. We study a number of particular models of lower probabilities: minitive measures, coherent lower probabilities, as well as the lower probabilities induced by comparative or distortion models. For these, we provide some alternative expressions of the Shapley and Banzhaf values and study under which conditions they belong to the core of the lower probability.
Ignacio Montes, Enrique Miranda, Paolo Vicig. 2-monotone outer approximations of coherent lower probabilities. International Journal of Approximate Reasoning, 101, 181-205, 2018.
We investigate the problem of approximating a coherent lower probability on a finite space by a 2-monotone capacity that is at the same time as close as possible while not including additional information. We show that this can be tackled by means of a linear programming problem, and investigate the features of the set of undominated solutions. While our approach is based on a distance proposed by Baroni and Vicig, we also discuss a number of alternatives: quadratic programming, extensions of the total variation distance, and the Weber set from game theory. Finally, we show that our work applies to the more general problem of approximating coherent lower previsions.
Ignacio Montes, Enrique Miranda, Sebastien Destercke. Pari-mutuel probabilities as an uncertainty model. Information Sciences, 481, 550-573, 2019.
The pari-mutuel model is a betting scheme that has its origins in horse racing, and that has been applied in a number of contexts, mostly economics. In this paper, we consider the set of probability measures compatible with a pari-mutuel model, characterize its extreme points, and investigate the properties of the associated lower and upper probabilities. We show that the pari-mutuel model can be embedded within the theory of probability intervals, and prove necessary and sufficient conditions for it to be a belief function or a minitive measure. In addition, we also investigate the combination of different pari-mutuel models and their definition on product spaces.
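The pari-mutuel model described in this abstract admits a compact closed form: for a precise probability P0 and loading factor delta > 0, the upper probability of an event A is commonly written min((1+delta)P0(A), 1), with conjugate lower probability max((1+delta)P0(A) - delta, 0). A minimal sketch (function names hypothetical):

```python
# Hedged sketch of the pari-mutuel distortion of a precise probability P0.

def pari_mutuel_upper(p0_A, delta):
    return min((1 + delta) * p0_A, 1.0)

def pari_mutuel_lower(p0_A, delta):
    return max((1 + delta) * p0_A - delta, 0.0)

p0_A, delta = 0.4, 0.25
print(pari_mutuel_lower(p0_A, delta))  # 0.25
print(pari_mutuel_upper(p0_A, delta))  # 0.5
# Conjugacy: lower(A) = 1 - upper(complement of A).
print(abs(pari_mutuel_lower(p0_A, delta)
          - (1 - pari_mutuel_upper(1 - p0_A, delta))) < 1e-12)  # True
```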
Ignacio Montes, Enrique Miranda, Paolo Vicig. Outer approximating coherent lower probabilities with belief functions. International Journal of Approximate Reasoning, 110, 1-30, 2019.
From an epistemic point of view, coherent lower probabilities allow us to model the imprecise information about a partially unknown probability. However, there are some issues that hinder their use in practice. Since belief functions are easier to deal with, we propose to approximate the coherent lower probability by a belief function that is at the same time as close as possible to the initial coherent lower probability while not including additional information. We show that this problem can be tackled by means of linear programming, and investigate the features of the set of optimal solutions. Moreover, we emphasize the differences with the outer approximations by 2-monotone lower probabilities. We also study the problem for two particular cases of belief functions that are computationally easier to handle: necessity measures and probability boxes.
Enrique Miranda, Marco Zaffalon. Compatibility, desirability, and the running intersection property. Artificial Intelligence, 283C, 2020.
Compatibility is the problem of checking whether
some given probabilistic assessments have a common joint probabilistic model.
When the assessments are unconditional, the problem is well established in the
literature and finds a solution through the running intersection property
(RIP). This is not the case for conditional assessments. In this paper, we study
the compatibility problem in a very general setting: any possibility space,
unrestricted domains, imprecise (and possibly degenerate) probabilities. We
extend the unconditional case to our setting, thus generalising most of the previous
results in the literature. The conditional case turns out to be fundamentally
different from the unconditional one. For such a case, we prove that the problem
can still be solved in general by (RIP) but in a more involved way: by
constructing a junction tree and propagating information over it. Still, (RIP)
does not allow us to optimally take advantage of sparsity: in fact, conditional
compatibility can be simplified further by joining junction trees with
coherence graphs.
Ignacio Montes, Enrique Miranda, Sebastien Destercke. Unifying neighbourhood and distortion models: Part I - new results on old models. International Journal of General Systems, 49(6), 602-635, 2020.
Neighbourhoods of precise probabilities are instrumental to perform robustness analysis, as they rely on very few parameters. Many such models, sometimes referred to as distortion models, have been proposed in the literature, such as the pari-mutuel model, the linear vacuous mixture or the constant odds ratio model. This paper is the first part of a two-paper series in which we study the sets of probabilities induced by such models, regarding them as neighbourhoods defined over specific metrics or premetrics. We also compare them in terms of a number of properties: precision, number of extreme points, n-monotonicity, behaviour under conditioning, etc. This first part tackles this study on some of the most popular distortion models in the literature, while the second part studies less known neighbourhood models and summarises our findings.
Ignacio Montes, Enrique Miranda, Sebastien Destercke. Unifying neighbourhood and distortion models: Part II - new models and synthesis. International Journal of General Systems, 49(6), 636-674, 2020.
Neighbourhoods of precise probabilities are instrumental to perform robustness analysis, as they rely on very few parameters. In the first part of this study, we introduced a general, unified view encompassing such neighbourhoods, and revisited some well-known models (pari-mutuel, linear vacuous, constant odds-ratio). In this second part, we study models that have received little to no attention, but are induced by classical distances between probabilities, such as the total variation, the Kolmogorov and the L1 distances. We finish by comparing those models in terms of a number of properties: precision, number of extreme points, n-monotonicity, etc., thus providing possible guidelines for selecting one neighbourhood rather than another.
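One of the distance-based models mentioned in this abstract, the total-variation neighbourhood, can be illustrated with a short sketch (function names hypothetical): for radius delta, the lower probability of an event A other than the whole space is commonly taken as max(P0(A) - delta, 0), with conjugate upper probability min(P0(A) + delta, 1) on non-empty events.

```python
# Hedged sketch of the total-variation neighbourhood of a precise
# probability P0 with radius delta.

def tv_lower(p0_A, delta, is_whole_space=False):
    return 1.0 if is_whole_space else max(p0_A - delta, 0.0)

def tv_upper(p0_A, delta, is_empty=False):
    return 0.0 if is_empty else min(p0_A + delta, 1.0)

print(tv_lower(0.375, 0.125))   # 0.25
print(tv_upper(0.9375, 0.125))  # 1.0
```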
Arthur Van Camp, Enrique Miranda. Modelling epistemic irrelevance with choice functions. International Journal of Approximate Reasoning, 125C, 49-72, 2020.
We consider coherent choice functions under the recent axiomatisation proposed by De Bock and de Cooman that guarantees a representation in terms of binary preferences, and we discuss how to define conditioning in this framework. In a multivariate context, we propose a notion of marginalisation, and its inverse operation called weak (cylindrical) extension. We combine this with our definition of conditioning to define a notion of irrelevance, and we obtain the irrelevant natural extension in this framework: the least informative choice function that satisfies a given irrelevance assessment.
Enrique Miranda, Ignacio Montes, Paolo Vicig. On the selection of an optimal outer approximation of a coherent lower probability. Fuzzy Sets and Systems, 424C, 1-36, 2021.
Coherent lower probabilities are one of the most general
tools within Imprecise Probability Theory, and can be used to model the
available information about an unknown or partially known precise probability.
In spite of their generality, coherent lower probabilities are sometimes
difficult to deal with. For this reason, in previous papers we studied the
problem of outer approximating a given coherent lower probability by a more
tractable model, such as a 2- or completely monotone lower probability.
Unfortunately, such an outer approximation is not unique in general, even if we
restrict our attention to those that are undominated by other models from the
same family. In this paper, we investigate whether a number of approaches may
help in selecting a unique undominated outer approximation. These are based on
minimising
a distance with respect to the initial model, maximising the
specificity, or preserving the same preferences as the original model. We apply
them to 2- and completely monotone approximating lower probabilities, and also to
the particular cases of possibility measures and p-boxes.
Arianna Casanova, Enrique Miranda, Marco Zaffalon. Joint desirability foundations of social choice and decision making. Annals of Mathematics and Artificial Intelligence, 89(10-11), 965-1011, 2021.
We develop joint foundations for the fields of social choice and opinion pooling using coherent sets of desirable gambles, a general uncertainty model that allows us to encompass both complete and incomplete preferences. This leads on the one hand to a new perspective of traditional results of social choice (in particular Arrow’s theorem as well as sufficient conditions for the existence of an oligarchy and democracy) and on the other hand to using the same framework to analyse opinion pooling. In particular, we argue that weak Pareto (unanimity) should be given the status of a rationality requirement and use this to discuss the aggregation of experts’ opinions based on probability and (state-independent) utility, showing some inherent limitation of this framework, with implications for statistics. The connection between our results and earlier work in the literature is also discussed.
Alexander Erreygers, Enrique Miranda. A graphical study of comparative probabilities. Journal of Mathematical Psychology, 104, 102582, 2021.
We consider a set of comparative probability judgements over a finite possibility space and study the structure of the set of probability measures that are compatible with them. We relate the existence of some compatible probability measure to Walley's behavioural theory of imprecise probabilities, and introduce a graphical representation that allows us to bound, and in some cases determine, the extreme points of the set of compatible measures. In doing this, we generalise some earlier work by Miranda and Destercke on elementary comparisons.
Marco Zaffalon, Enrique Miranda. Desirability foundations of robust rational decision making. Synthese, 198(Supp.27), 6529-6570, 2021.
Recent work has formally linked the traditional axiomatisation of incomplete preferences à la Anscombe-Aumann with the theory of desirability developed in the context of imprecise probability, by showing in particular that they are the very same theory. The equivalence has been established under the constraint that the set of possible prizes is finite. In this paper, we relax such a constraint, thus de facto creating one of the most general theories of rationality and decision making available today. We provide the theory with a sound interpretation and with basic notions, and results, for the separation of beliefs and values, and for the case of complete preferences. Moreover, we discuss the role of conglomerability for the presented theory, arguing that it should be a rationality requirement under very broad conditions.
Sébastien Destercke, Ignacio Montes, Enrique Miranda. Processing distortion models: a comparative study. International Journal of Approximate Reasoning, 145C, 91-120, 2022.
When dealing with sets of probabilities, distortion or neighbourhood models are convenient practical tools, as they rely on very few parameters. In this paper, we study their behaviour when such models are combined and processed through some reasoning tools. More specifically, we study their behaviour when merging different distortion models quantifying uncertainty on the same quantity, and when manipulating distortion models defined over multiple variables.