ECSQARU 2017 Accepted Papers

The proceedings are available in the Springer LNAI series (Volume 10369).

Each entry below gives the paper's abstract, followed by the Springer link and, where available, a link to the slides.

Lina Abassi and Imen Boukhris. Iterative aggregation of crowdsourced tasks within the belief function theory.
With the growth of crowdsourcing services, gathering training data for supervised machine learning has become cheaper and faster than engaging experts. However, the quality of the crowd-generated labels remains an open issue, largely due to the wide-ranging expertise levels of the participants in the labeling process. In this paper, we present an iterative label aggregation approach based on the belief function theory that simultaneously estimates the labels, the reliability of the participants, and the difficulty of each task. Our empirical evaluation demonstrates the efficiency of our method, as it yields better-quality labels.

:: SpringerLink ::
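
A minimal illustrative sketch of the evidential aggregation idea (not the authors' exact scheme): each worker's answer becomes a simple mass function discounted by an estimated reliability, and the masses are fused with Dempster's rule.

    # Python sketch: crowd label aggregation with belief functions
    from itertools import product

    def worker_mass(label, labels, reliability):
        """Simple mass function: `reliability` on {label}, the rest on the frame."""
        return {frozenset([label]): reliability,
                frozenset(labels): 1.0 - reliability}

    def dempster(m1, m2):
        """Dempster's rule of combination (assumes not totally conflicting)."""
        joint, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            if a & b:
                joint[a & b] = joint.get(a & b, 0.0) + wa * wb
            else:
                conflict += wa * wb
        return {s: w / (1.0 - conflict) for s, w in joint.items()}

    labels = ["cat", "dog"]
    answers = [("cat", 0.9), ("cat", 0.6), ("dog", 0.7)]  # (label, reliability)
    masses = [worker_mass(lab, labels, rel) for lab, rel in answers]
    combined = masses[0]
    for m in masses[1:]:
        combined = dempster(combined, m)
    print(combined)  # most mass ends up on {"cat"}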

Raoua Abdelkhalek, Imen Boukhris and Zied Elouedi. A clustering approach for collaborative filtering under the belief function framework.
Collaborative Filtering (CF) is one of the most successful approaches in Recommender Systems (RS). It exploits the ratings of similar users or similar items in order to predict the users' preferences. To this end, clustering CF approaches have been proposed to group items or users into different clusters. However, most existing approaches do not consider the impact of the uncertainty involved in the cluster assignments. To tackle this issue, we propose in this paper a clustering approach for CF under the belief function theory. In our approach, we use Evidential C-Means to group the most similar items into different clusters, and the predictions are then performed. Our approach takes into account the items' different cluster memberships while maintaining good scalability and recommendation performance. A comparative evaluation on a real-world data set shows that the proposed method outperforms previous evidential collaborative filtering approaches.

:: SpringerLink :: slides ::

Stefano Aguzzoli, Matteo Bianchi, Brunella Gerla and Diego Valota. Probability measures in GödelΔ logic.
In this paper we define and axiomatise finitely additive probability measures for events described by formulas in GödelΔ (GΔ) propositional logic. In particular, we show that our axioms fully characterise finitely additive probability measures over the free finitely generated algebras in the variety constituting the algebraic semantics of GΔ, as integrals of elements of those algebras (represented canonically as algebras of [0, 1]-valued functions) with respect to Borel probability measures.

:: SpringerLink :: slides ::

Leila Amgoud and Jonathan Ben-Naim. Evaluation of arguments in weighted bipolar graphs.
The paper tackles the issue of argument evaluation in weighted bipolar argumentation graphs (i.e., graphs whose arguments have basic strengths and may be both supported and attacked). We introduce axioms that an evaluation method (or semantics) could satisfy. Such axioms are very useful for judging and comparing semantics. We then analyze existing semantics on the basis of our axioms, and finally propose a new semantics for the class of acyclic graphs.

:: SpringerLink :: slides ::

Christoph Beierle, Christian Eichhorn and Gabriele Kern-Isberner. A transformation system for unique minimal normal forms of conditional knowledge bases.
Conditional knowledge bases consisting of sets of conditionals are used in inductive nonmonotonic reasoning and can represent the defeasible background knowledge of a reasoning agent. For the comparison of the knowledge of different agents, as well as of different approaches to nonmonotonic reasoning, it is beneficial if these knowledge bases are as compact and straightforward as possible. To enable the replacement of a knowledge base R by a simpler, but equivalent, knowledge base R′, we propose to use the notions of elementwise equivalence or model equivalence for conditional knowledge bases. For elementwise equivalence, we present a terminating and confluent transformation system on conditional knowledge bases yielding a unique normal form for every R. We show that an extended version of this transformation system takes model equivalence into account. For both transformation systems, we prove that the obtained normal forms are minimal with respect to subset inclusion and the corresponding notion of equivalence.

:: SpringerLink :: slides ::

Christoph Beierle and Steven Kutsch. Comparison of inference relations defined over different sets of ranking functions.
Skeptical inference in the context of a conditional knowledge base R can be defined with respect to a set of models of R. For the semantics of ranking functions that assign a degree of surprise to each possible world, we develop a method for comparing the inference relations induced by different sets of ranking functions. Using this method, we address the problem of ensuring the correctness of approximating c-inference for R by constraint satisfaction problems (CSPs) over finite domains. While determining a sufficient upper bound for these CSPs is an open problem in general, for a sequence of simple knowledge bases previously investigated only experimentally, we prove that using the number of conditionals in R as an upper bound correctly captures skeptical c-inference.

:: SpringerLink :: slides ::

Nahla Ben Amor, Zeineb Elkhalfi, Helene Fargier and Régis Sabbadin. Efficient policies for stationary possibilistic Markov decision processes.
Possibilistic Markov Decision Processes offer a compact and tractable way to represent and solve problems of sequential decision under qualitative uncertainty. Even though appealing for its ability to handle qualitative problems, this model suffers from the drowning effect that is inherent to possibilistic decision theory. The present paper proposes to escape the drowning effect by extending to stationary possibilistic MDPs the lexicographic preference relations defined by Fargier and Sabbadin (2005) for non-sequential decision problems, and provides a value iteration algorithm to compute policies that are optimal for these new criteria.

:: SpringerLink :: slides ::
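
For context, a sketch of value iteration under the standard optimistic qualitative criterion for a stationary possibilistic MDP, i.e. the baseline that exhibits the drowning effect the paper refines lexicographically (illustrative names; terminal utilities only):

    # Python sketch: optimistic possibilistic value iteration (baseline criterion)
    # pi[s][a][s2]: possibility of reaching s2 from s under action a, in [0, 1]
    # mu[s]:        qualitative (terminal) utility of state s, in [0, 1]

    def possibilistic_value_iteration(states, actions, pi, mu, horizon):
        V = dict(mu)
        for _ in range(horizon):
            V = {s: max(max(min(pi[s][a][s2], V[s2]) for s2 in states)
                        for a in actions)
                 for s in states}
        return V

    states, actions = ["s0", "goal"], ["go"]
    pi = {"s0":   {"go": {"s0": 0.3, "goal": 1.0}},
          "goal": {"go": {"s0": 0.0, "goal": 1.0}}}
    mu = {"s0": 0.0, "goal": 1.0}
    print(possibilistic_value_iteration(states, actions, pi, mu, 3))
    # {'s0': 1.0, 'goal': 1.0}: the best trajectory drowns the others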

Nahla Ben Amor, Fatma Essghaier and Helene Fargier. Algorithms for multi-criteria optimization in possibilistic decision trees.
This paper raises the question of solving multi-criteria sequential decision problems under uncertainty. It proposes to extend to possibilistic decision trees the decision rules presented in Ben Amor et al. (2014) for non-sequential problems. It presents a series of algorithms for this new framework: Dynamic Programming can be used and provides an optimal strategy for rules that satisfy the property of monotonicity. There is no guarantee of optimality for those that do not satisfy it, hence the definition of dedicated algorithms. The paper concludes with an empirical comparison of the algorithms.

:: SpringerLink :: slides ::

Janneke Bolt and Silja Renooij. Structure-based categorisation of Bayesian network parameters.
Bayesian networks typically require thousands of probability parameters for their specification, many of which are bound to be inaccurate. Knowledge of the direction of change in an output probability of a network occasioned by changes in one or more of its parameters, i.e. the qualitative effect of parameter changes, has been shown to be useful both for parameter tuning and in pre-processing for inference in credal networks. In this paper we identify classes of parameters for which the qualitative effect on a given output of interest can be identified based upon graphical considerations.

:: SpringerLink :: slides ::

Nadira Boudjani, Abdelkader Gouaich and Souhila Kaci. Debate-based learning game for constructing mathematical proofs.
Debate is a valuable and effective method of learning. It is an interactive process in which learners cooperate by exchanging arguments and counter-arguments to solve a common question. We propose a debate-based learning game for the mathematics classroom to teach how to structure and build mathematical proofs. Dung's argumentation framework and its extensions are used as a means to extract the acceptable arguments that form the proof. Moreover, this allows instructors to provide continuous feedback to learners without information overload.

:: SpringerLink :: slides ::

Andrey Bronevich and Igor Rozenberg. Incoherence correction and decision making based on generalized credal sets.
While making decisions we face different types of uncertainty. Recently, the concept of a generalized credal set has been proposed for modeling conflict, imprecision and contradiction in information. This concept generalizes the theory of imprecise probabilities, making it possible to process information presented by contradictory (incoherent) lower previsions. In this paper we propose a new way of introducing generalized credal sets: we show that any contradictory lower prevision can be represented as a convex sum of non-contradictory and fully contradictory lower previsions. In this way we can introduce generalized credal sets and apply them to decision problems. Decision making is based on decision rules from the theory of imprecise probabilities and on the contradiction-imprecision transformation, which acts as an incoherence correction.

:: SpringerLink :: slides ::

Andrea Campagner and Davide Ciucci. Measuring uncertainty in orthopairs.
In many situations information comes in bipolar form. Orthopairs are a simple tool to represent and study this kind of information, where objects are classified into three different classes: positive, negative and boundary. The aim of this work is to introduce some uncertainty measures on orthopairs. Two main cases are investigated: a single orthopair and a collection of orthopairs. Some ideas are taken from neighbouring disciplines, such as fuzzy sets, intuitionistic fuzzy sets, rough sets and possibility theory.

:: SpringerLink :: slides ::

Davide Ciucci and Didier Dubois. A two-tiered propositional framework for handling multisource inconsistent information.
This paper proposes a conceptually simple but expressive framework for handling propositional information stemming from several sources, namely a two-tiered propositional logic augmented with classical modal axioms (BC-logic), a fragment of the non-normal modal logic EMN, whose semantics is expressed in terms of two-valued monotonic set-functions called Boolean capacities. We present a theorem-preserving translation of Belnap logic in this setting. As special cases, we can recover previous translations of three-valued logics such as Kleene and Priest logics. Our translation bridges the gap between Belnap logic, epistemic logic, and theories of uncertainty like possibility theory or belief functions, and paves the way to a unified approach to various inconsistency handling methods.

:: SpringerLink :: slides ::

Giulianella Coletti, Davide Petturiti and Barbara Vantaggi. Fuzzy weighted attribute combinations based similarity measures.
Some similarity measures for fuzzy subsets are introduced: they are based on fuzzy set-theoretic operations and on a weight capacity expressing the degree of contribution of each group of attributes. For such measures, the properties of dominance and T-transitivity are investigated.

:: SpringerLink :: slides ::
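
An illustrative special case of such measures: a weighted fuzzy (Jaccard-style) similarity with one weight per attribute, whereas the paper weights groups of attributes with a capacity.

    # Python sketch: weighted fuzzy (Jaccard-style) similarity
    def weighted_fuzzy_similarity(a, b, w):
        """a, b: membership degrees in [0, 1]; w: one weight per attribute."""
        num = sum(wi * min(ai, bi) for ai, bi, wi in zip(a, b, w))
        den = sum(wi * max(ai, bi) for ai, bi, wi in zip(a, b, w))
        return num / den if den else 1.0  # two empty fuzzy sets: fully similar

    print(weighted_fuzzy_similarity([0.8, 0.2, 1.0], [0.6, 0.4, 1.0], [1, 1, 2]))
    # 0.875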

Fabio Cozman and Denis Mauá. The complexity of inferences and explanations in probabilistic logic programming.
A popular family of probabilistic logic programming languages combines logic programs with independent probabilistic facts. We study the complexity of marginal inference, most probable explanations, and maximum a posteriori calculations for propositional/relational probabilistic logic programs that are acyclic/definite/stratified/normal/disjunctive. We show that the complexity classes Σ_k^p and PP^{Σ_k^p} (for various values of k), as well as NP^PP, are all reached by such computations.

:: SpringerLink :: slides ::

Fabio Cozman and Denis Mauá. The descriptive complexity of Bayesian network specifications.
We adapt the theory of descriptive complexity to Bayesian networks, by investigating how expressive specifications based on predicates and quantifiers can be. We show that Bayesian network specifications that employ first-order quantification capture the complexity class PP; that is, any phenomenon that can be simulated with a polynomial time probabilistic Turing machine can also be modeled by such a network. We also show that, by allowing quantification over predicates, the resulting Bayesian network specifications capture the complexity class PP^NP, a result that does not seem to have an equivalent in the literature.

:: SpringerLink :: slides ::

Nadia Creignou, Raida Ktari and Odile Papini. Complexity of model checking for cardinality-based belief revision operators.
This paper deals with the complexity of model checking for belief base revision. We extend the study initiated by Liberatore and Schaerf and introduce two new belief base revision operators stemming from consistent subbases that are maximal with respect to cardinality. We establish the complexity of the model checking problem for various operators within the framework of propositional logic, as well as in the Horn fragment.

:: SpringerLink :: slides ::

Sébastien Destercke. A generic framework to include belief functions in preference handling and multi-criteria decision.
Modelling the preferences of a decision maker over multi-criteria alternatives usually starts by collecting preference information, which is then used to fit a model from a given hypothesis set (weighted average, CP-net). This can lead to inconsistencies, due to inaccurate information provided by the decision maker or to a poor choice of hypothesis set. We propose to quantify and resolve such inconsistencies by allowing the decision maker to express her/his certainty about the provided preferential information in the form of belief functions.

:: SpringerLink :: slides ::

Tommaso Flaminio, Lluis Godo and Hykel Hosni. On the Boolean structure of conditional events and its logical counterpart.
This paper sheds new light on the longstanding problem of investigating the logic of conditional events. Building on the framework of Boolean algebras of conditionals previously introduced by the authors, we make two main new contributions. First, we fully characterise the atomic structure of these algebras of conditionals. Second, we introduce the logic of Boolean conditionals (LBC) and prove its completeness with respect to the natural semantics induced by the structural properties of the atoms in a conditional algebra as described in the first part. In addition, we outline the close connection of LBC with preferential consequence relations, arguably one of the most appreciated systems of non-monotonic reasoning.

:: SpringerLink :: slides ::

Giulia Fragnito, Joaquim Gabarro and Maria Serna. An Angel-daemon approach to assess the uncertainty in the power of a collectivity to act.
We propose the use of the angel-daemon (a/d) framework to assess Coleman's power of a collectivity to act under uncertainty in weighted voting games. In this framework, uncertainty profiles describe the potential changes in the weights of a weighted game and fix the spread of the weight changes. For each uncertainty profile, a strategic a/d game can be considered. This game has two selfish players, the angel a and the daemon d; a selects its action so as to maximize the effect on the measure under consideration, while d acts oppositely. Players a and d give a balance between the best and the worst case. The a/d games associated with Coleman's power are zero-sum games, and therefore the expected utilities of all Nash equilibria are the same. In this way we can assess Coleman's power under uncertainty. Besides introducing the framework for this particular setting, we analyse basic properties and make some computational complexity considerations. We provide several examples based on the evolution of the voting rules of the EU Council of Ministers.

:: SpringerLink :: slides ::
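
As the abstract notes, a/d games are zero-sum, so every Nash equilibrium yields the same expected utility; that common value can be computed by a small linear program. An illustrative numpy/scipy sketch, not the authors' code:

    # Python sketch: value of a zero-sum game by linear programming
    import numpy as np
    from scipy.optimize import linprog

    def zero_sum_value(A):
        """A[i, j]: angel's payoff when the angel plays i and the daemon plays j."""
        m, n = A.shape
        c = np.zeros(m + 1)
        c[-1] = -1.0                                  # maximize v == minimize -v
        A_ub = np.hstack([-A.T, np.ones((n, 1))])     # v <= x^T A e_j for all j
        A_eq = np.hstack([np.ones((1, m)), [[0.0]]])  # mixed strategy sums to 1
        bounds = [(0, None)] * m + [(None, None)]     # v is a free variable
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0], bounds=bounds)
        return res.x[-1], res.x[:-1]

    value, angel = zero_sum_value(np.array([[3.0, 0.0], [1.0, 2.0]]))
    print(value, angel)  # value 1.5, angel mixes [0.25, 0.75]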

John Grant, Cristian Molinaro and Francesco Parisi. Count queries in probabilistic spatio-temporal knowledge bases with capacity constraints.
The problem of managing spatio-temporal data arises in many applications, such as location-based services, environment monitoring, geographic information systems, and many others. In real life, this kind of data is often uncertain. The SPOT framework has been proposed for the representation and processing of probabilistic spatio-temporal data where probability is represented as an interval because the exact value is unknown. In this paper, we enhance the SPOT framework with capacity constraints, which allow users to better model many real-world scenarios. The resulting formalization is called a PST knowledge base. We study the computational complexity of consistency checking, a central problem in this setting. Specifically, we show that the problem is NP-complete and also identify tractable cases. We then consider a relevant kind of query for reasoning over PST knowledge bases, namely count queries, which ask how many objects are in a region at a certain time point. We investigate the computational complexity of answering count queries, and show cases for which consistency checking can be exploited for query answering.

:: SpringerLink :: slides ::

Maroua Haddad, Philippe Leray and Nahla Ben Amor. Possibilistic MDL: a new possibilistic likelihood based score function for imprecise data.
Recent years have seen a surge of interest in methods for representing and reasoning with imprecise data. In this paper, we propose a new possibilistic likelihood function handling this particular form of data, based on the interpretation of a possibility distribution as the contour function of a random set. The proposed function can serve as the foundation for inferring several possibilistic models. In this paper, we apply it to define a new scoring function for learning possibilistic network structure. An experimental study showing the efficiency of the proposed score is also presented.

:: SpringerLink :: slides ::

Nathalie Helal, Frédéric Pichon, Daniel Porumbel, David Mercier and Eric Lefèvre. A recourse approach for the capacitated vehicle routing problem with evidential demands.
The capacitated vehicle routing problem with stochastic demands can be modelled using either the chance-constrained approach or the recourse approach. In previous works, we extended the former approach to address the case where uncertainty on customer demands is represented by belief functions, that is, where customers have so-called evidential demands. In this paper, we propose an extension of the recourse approach for this latter case. We also provide a technique that makes computations tractable for realistic situations. The feasibility of our approach is then shown by solving instances of this difficult problem using a metaheuristic algorithm.

:: SpringerLink :: slides ::

Anthony Hunter and Nico Potyka. Updating probabilistic epistemic states in persuasion dialogues.
In persuasion dialogues, the ability of the persuader to model the persuadee allows the persuader to make better choices of moves. The epistemic approach to probabilistic argumentation is a promising way of modelling the persuadee's belief in arguments, and proposals have been made for update methods that specify how these beliefs can be updated at each step of the dialogue. However, there is a need to better understand these proposals and, moreover, to gain insights into the space of possible update functions. So in this paper, we present a general framework for update functions in which we consider existing and novel update functions.

:: SpringerLink :: slides ::

Christoph Jansen, Thomas Augustin and Georg Schollmeyer. Decision theory meets linear optimization beyond computation.
The paper is concerned with decision making under complex uncertainty. We consider the Hodges-Lehmann criterion, relying on uncertain classical probabilities, and Walley's maximality, relying on imprecise probabilities. We present linear programming based approaches for computing optimal acts as well as for determining least favorable prior distributions in finite decision settings. Further, we apply results from the duality theory of linear programming in order to provide theoretical insights into certain characteristics of these optimal solutions. In particular, we characterize conditions under which randomization pays off when defining optimality in terms of the Gamma-maximin criterion, and investigate how these conditions relate to least favorable priors.

:: SpringerLink :: slides ::
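
An illustrative sketch of the Gamma-maximin idea discussed here, for the simple case of a credal set given by finitely many extreme points and pure acts only (the paper's LP treatment and randomization analysis go further):

    # Python sketch: Gamma-maximin with a finitely generated credal set
    import numpy as np

    def gamma_maximin(U, P):
        """U: acts x states utilities; P: extreme points (priors) x states."""
        expected = U @ P.T               # expected utility, acts x priors
        lower = expected.min(axis=1)     # lower expectation of each act
        return int(lower.argmax()), lower

    U = np.array([[10.0, 0.0],   # risky act
                  [4.0, 4.0]])   # safe act
    P = np.array([[0.8, 0.2],    # extreme points of the credal set
                  [0.3, 0.7]])
    print(gamma_maximin(U, P))   # (1, array([3., 4.])): the safe act wins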

Tuomo Lehtonen, Johannes P. Wallner and Matti Järvisalo. From structured to abstract argumentation: assumption-based acceptance via AF reasoning.
We study the applicability of abstract argumentation (AF) reasoners for efficiently answering acceptability queries over assumption-based argumentation (ABA) frameworks, one of the prevalent forms of structured argumentation. We provide a refined algorithm for translating ABA frameworks to AFs, allowing the use of AF reasoning to answer ABA acceptability queries and covering credulous and skeptical acceptance problems over ABAs in a seamless way under several argumentation semantics. We empirically show that the approach is complementary to a state-of-the-art ABA reasoning system.

:: SpringerLink :: slides ::

Sabrine Mallek, Imen Boukhris, Zied Elouedi and Eric Lefevre. Evidential k-NN for link prediction.
Social networks play a major role in today's society; they have shaped the unfolding of social relationships. To analyze network dynamics, link prediction, i.e., predicting potential new links between actors, is concerned with inspecting the evolution of a network's topology over time. A key issue to be addressed is the imperfection of real-world social network data, which are usually missing, noisy, or partially observed. This uncertainty is naturally handled under the general framework of the belief function theory. Here, link prediction is addressed from a supervised learning perspective by extending the evidential k-nearest neighbors approach. Each nearest neighbor represents a source of information concerning the existence of new links. The overall evidence is pooled via the belief function theory fusion scheme. Experiments are conducted on real social network data, where performance is evaluated along with a comparative study. The experimental results confirm the effectiveness of the proposed framework, especially when handling skewness in data.

:: SpringerLink :: slides ::
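
A sketch of the classical evidential k-NN rule that this paper extends to link prediction (illustrative parameter defaults; the link-prediction features themselves are not modeled here):

    # Python sketch: classical evidential k-NN (Denoeux-style) decision rule
    import math
    from collections import defaultdict

    def evidential_knn(query, data, k=3, alpha=0.95, gamma=1.0):
        """data: list of (feature_vector, class_label) pairs."""
        neighbours = sorted(data, key=lambda xy: math.dist(query, xy[0]))[:k]
        # Each neighbour supports its class with mass alpha * exp(-gamma * d^2).
        # Ranking classes by 1 - prod(1 - m_i) yields the same decision as
        # fusing the simple mass functions with Dempster's rule.
        residual = defaultdict(lambda: 1.0)
        for x, c in neighbours:
            residual[c] *= 1.0 - alpha * math.exp(-gamma * math.dist(query, x) ** 2)
        scores = {c: 1.0 - r for c, r in residual.items()}
        return max(scores, key=scores.get), scores

    data = [([0.0, 0.0], "no-link"), ([1.0, 1.0], "link"), ([0.9, 1.1], "link")]
    print(evidential_knn([1.0, 0.9], data))  # ('link', {...})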

Anders Madsen, Nicolaj Søndberg-Jeppesen, Frank Jensen, Mohamed Sayed, Ulrich Moser, Luis Neto, Joao Reis and Niels Lohse. Parameter learning algorithms for continuous model improvement using operational data.
In this paper, we consider the application of object-oriented Bayesian networks to failure diagnostics in manufacturing systems and continuous model improvement based on operational data. The analysis is based on an object-oriented Bayesian network developed for failure diagnostics of a one-dimensional pick-and-place industrial robot developed by IEF-Werner GmbH. We consider four learning algorithms (batch Expectation-Maximization (EM), incremental EM, online EM and fractional updating) for parameter updating in the object-oriented Bayesian network using a real operational dataset. Also, we evaluate the performance of the considered algorithms on a dataset generated from the model to determine which algorithm is best suited for recovering the underlying generating distribution. The object-oriented Bayesian network has been integrated both into the control software of the robot and into a software architecture that supports the diagnostic and prognostic capabilities of devices in manufacturing systems. We evaluate the time performance of the architecture to determine the feasibility of on-line learning from operational data using each of the four algorithms.

:: SpringerLink :: slides ::

Francesca Mangili, Claudio Bonesana and Alessandro Antonucci. Reliable knowledge-based adaptive testing by credal networks.
An adaptive test is a computer-based testing technique which adjusts the sequence of questions on the basis of the estimated ability level of the test taker. We suggest the use of credal networks, a generalization of Bayesian networks based on sets of probability mass functions, to implement adaptive tests that exploit the knowledge of the test developer instead of training on databases of answers. Compared to Bayesian networks, these models may offer higher expressiveness and hence a more reliable modeling of the qualitative expert knowledge. The counterpart is a less straightforward identification of the information-theoretic measure controlling the question-selection and test-stopping criteria. We elaborate on these issues and propose a sound and computationally feasible procedure. Validation against a Bayesian-network approach on a benchmark about German language proficiency assessments suggests that credal networks can be reliable in assessing the student level and effective in reducing the number of questions required to do so.

:: SpringerLink :: slides ::

Martin Plajner and Jirka Vomlel. Monotonicity in Bayesian networks for computerized adaptive testing.
Artificial intelligence is present in many modern computer science applications, and the question of effectively learning the parameters of the underlying probabilistic models, even with small data samples, is still very active. It turns out that restricting the conditional probabilities of a probabilistic model by monotonicity conditions can be useful in certain situations; moreover, in some cases the modeled reality requires these conditions to hold. In this article we focus on monotonicity conditions in Bayesian network models. We present an algorithm, based on gradient descent optimization, for learning model parameters that satisfy monotonicity conditions. We test the proposed method on two data sets: one synthetic, the other formed by real data collected for computerized adaptive testing. We compare the obtained results with the isotonic regression EM method of Masegosa et al., which also learns BN model parameters satisfying monotonicity, as well as with the standard unrestricted EM algorithm for BN learning. The experimental results clearly justify the monotonicity restrictions: as a consequence of the monotonicity requirements, the resulting models fit the data better.

:: SpringerLink :: slides ::

Jean-Philippe Poli, Laurence Boudet, Bruno Espinosa and Laurence Cornez. Online fuzzy temporal operators for complex system monitoring.
Online fuzzy expert systems can be used to process data and event streams, providing a powerful way to handle their uncertainty and inaccuracy. Moreover, human experts can decide how to process the streams with rules close to natural language. However, to extract high-level information from these streams, they need at least to describe the temporal relations between the data or the events. In this paper, we propose temporal operators which rely on the mathematical definition of some base operators in order to characterize trends and drifts in complex systems. Formalizing temporal relations allows experts to simply describe the behaviors of a system which lead to a breakdown or ineffective exploitation. We finally report an experiment with these operators on wind turbine monitoring.

:: SpringerLink :: slides ::

Gian Luca Pozzato. Reasoning in description logics with typicalities and probabilities of exceptions.
We introduce a nonmonotonic procedure for preferential Description Logics in order to reason about typicality by taking probabilities of exceptions into account. We consider an extension, called ALC + T_R^P, of the logic of typicality ALC + T_R, obtained by adding inclusions of the form T(C) ⊑_p D, whose intuitive meaning is that "typical Cs are Ds with probability p". We consider a notion of extension of an ABox containing only some typicality assertions, and then equip each extension with a probability. We then restrict entailment of a query F to those extensions whose probabilities belong to a given, fixed range. We propose a decision procedure for reasoning in ALC + T_R^P and exploit it to show that entailment is ExpTime-complete, as for the underlying ALC.

:: SpringerLink :: slides ::

Henri Prade and Gilles Richard. Analogical inequalities.
Analogical proportions, i.e., statements of the form "a is to b as c is to d", state that the way a and b possibly differ is the same as the way c and d differ. Thus, they express an equality (between differences). However, expressing inequalities may also be of interest, for stating, for instance, that the difference between a and b is smaller than the one between c and d. The logical modeling of analogical proportions, both in the Boolean case and in the multiple-valued case, has been developed in recent years. This short paper provides a preliminary investigation of the logical modeling of the so-called "analogical inequalities" introduced here, in relation with analogical proportions.

:: SpringerLink :: slides ::

Henri Prade and Gilles Richard. Boolean analogical proportions – axiomatics and algorithmic complexity issues.
Analogical proportions, i.e., statements of the form "a is to b as c is to d", are supposed to obey three axioms expressing reflexivity, symmetry, and stability under central permutation. These axioms alone are not enough to determine a single Boolean model unless a minimality condition is added. After an algebraic discussion of this minimal model and of related expressions, another justification of the model is given in terms of Kolmogorov complexity. It is shown that the six Boolean patterns that make an analogical proportion true have minimal complexity with respect to an expression reflecting the intended meaning of the proportion.

:: SpringerLink :: slides ::
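
A sketch of the minimal Boolean model discussed in the paper; enumerating all 4-bit valuations recovers exactly the six patterns that make a proportion true (illustrative code, not the authors'):

    # Python sketch: the minimal Boolean model of analogical proportion
    from itertools import product

    def analogical_proportion(a, b, c, d):
        """a:b :: c:d  iff  a differs from b exactly as c differs from d."""
        return (bool(a and not b) == bool(c and not d) and
                bool(not a and b) == bool(not c and d))

    print([p for p in product([0, 1], repeat=4) if analogical_proportion(*p)])
    # [(0,0,0,0), (0,0,1,1), (0,1,0,1), (1,0,1,0), (1,1,0,0), (1,1,1,1)]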

Henry Prakken. On relating abstract and structured probabilistic argumentation: a case study.
This paper investigates the relations between Timmer et al.'s proposal for explaining Bayesian networks with structured argumentation and abstract models of probabilistic argumentation. First, some challenges are identified for incorporating probabilistic notions of argument strength into structured models of argumentation. Then it is investigated to what extent Timmer et al.'s approach meets these challenges and satisfies the semantics and rationality conditions proposed in the literature for probabilistic argumentation frameworks. The results are used to draw conclusions about the strengths and limitations of both approaches.

:: SpringerLink ::

Mustapha Ridaoui, Michel Grabisch and Christophe Labreuche. Axiomatization of an importance index for k-ary games.
We consider Multi-Criteria Decision Analysis models which are defined over discrete attributes taking a finite number of values. We do not assume that the model is monotonically increasing with respect to the attribute values. Our aim is to define an importance index for such general models, encompassing Generalized-Additive Independence models as particular cases. They can be seen as being equivalent to k-ary games (multichoice games). We show that classical solutions like the Shapley value are not suitable for such models, essentially because the efficiency axiom does not make sense in this context. We propose an importance index which is a kind of average variation of the model along the attributes, and we give an axiomatic characterization of it.

:: SpringerLink :: slides ::

Tjitze Rienstra. RankPL: a qualitative probabilistic programming language.
In this paper we introduce RankPL, a modeling language that can be thought of as a qualitative variant of a probabilistic programming language with a semantics based on Spohn’s ranking theory. Broadly speaking, RankPL can be used to represent and reason about processes that exhibit uncertainty expressible by distinguishing “normal” from “surprising” events. RankPL allows (iterated) revision of rankings over alternative program states and supports various types of reasoning, including abduction and causal inference. We present the language, its denotational semantics, and a number of practical examples. We also discuss an implementation of RankPL that is available for download.

:: SpringerLink :: slides ::
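
An illustrative sketch of the ranking-theoretic core that RankPL builds on: ranks are degrees of surprise, and conditioning shifts ranks much as Bayesian conditioning rescales probabilities (toy example, not RankPL syntax):

    # Python sketch: Spohn-style ranking functions and conditioning
    INF = float("inf")

    def rank(kappa, prop):
        """Rank (degree of surprise) of a proposition: min over its worlds."""
        return min((r for w, r in kappa.items() if prop(w)), default=INF)

    def condition(kappa, prop):
        """kappa(w | A) = kappa(w) - kappa(A) on A-worlds, infinite elsewhere."""
        base = rank(kappa, prop)
        return {w: (r - base if prop(w) else INF) for w, r in kappa.items()}

    # worlds: (weather, sprinkler); rain is surprising to degree 1
    kappa = {("rain", "on"): 2, ("rain", "off"): 1,
             ("dry", "on"): 0, ("dry", "off"): 0}
    wet = lambda w: w[0] == "rain" or w[1] == "on"
    print(condition(kappa, wet))  # observing "wet" leaves dry+sprinkler normal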

Ahmed Samet, Thomas Guyet, Benjamin Negrevergne, Tien Tuan Dao, Tuan Nha Hoang and Marie Christine Ho Ba Tho. Expert opinion extraction from a biomedical database.
In this paper, we tackle the problem of extracting frequent opinions from uncertain databases. We introduce the foundations of an opinion mining approach, with definitions of a pattern and a support measure. The support measure is derived from the notion of commitment. A new algorithm called OpMiner, which extracts the set of frequent opinions modelled as mass functions, is detailed. Finally, we apply our approach to a real-world biomedical database that stores opinions of experts used to evaluate the reliability level of biomedical data. Performance analysis shows that our model yields better-quality patterns than literature-based methods.

:: SpringerLink :: slides ::

Giuseppe Sanfilippo, Niki Pfeifer and Angelo Gilio. Generalized probabilistic modus ponens.
Modus ponens (from A and “if A then C” infer C) is one of the most basic inference rules. The probabilistic modus ponens allows for managing uncertainty by transmitting assigned uncertainties from the premises to the conclusion (i.e., from P(A) and P(C|A) infer P(C)). In this paper, we generalize the probabilistic modus ponens by replacing A by the conditional event A|H. The resulting inference rule involves iterated conditionals (formalized by conditional random quantities) and propagates previsions from the premises to the conclusion. Interestingly, the propagation rules for the lower and the upper bounds on the conclusion of the generalized probabilistic modus ponens coincide with the respective bounds on the conclusion for the (non-nested) probabilistic modus ponens.

:: SpringerLink :: slides ::
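
For reference, the propagation behind the (non-nested) probabilistic modus ponens mentioned in the abstract can be stated compactly (a standard result, with x = P(A) and y = P(C|A)):

    % Bounds propagated by the (non-nested) probabilistic modus ponens:
    % from P(A) = x and P(C|A) = y, only an interval for P(C) follows.
    \[
      x\,y \;\le\; P(C) \;\le\; x\,y + (1 - x)
    \]
    % e.g. x = 0.9, y = 0.8 gives P(C) in [0.72, 0.82].

Per the abstract, the generalized rule, with A replaced by the conditional event A|H, propagates the same lower and upper bounds.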

Nenad Savic, Dragan Doder and Zoran Ognjanovic. A first-order logic for reasoning about higher-order upper and lower probabilities.
We present a first-order probabilistic logic for reasoning about the uncertainty of events modeled by sets of probability measures. In our language, we have formulas that essentially say "according to agent Ag, for all x, formula α(x) holds with lower probability at least 1/3". The language is also powerful enough to allow reasoning about higher-order upper and lower probabilities. We provide a corresponding Kripke-style semantics, axiomatize the logic, and prove that the axiomatization is sound and strongly complete (every consistent set of formulas is satisfiable).

:: SpringerLink :: slides ::

Asma Trabelsi, Zied Elouedi and Eric Lefevre. Ensemble enhanced evidential k-NN classifier through random subspaces.
Combining an ensemble of classifiers has been shown to be an efficient way of improving performance on several classification problems. The Random Subspace Method, which trains a set of classifiers on different subsets of the feature space, has been shown to be effective in increasing the accuracy of classifiers, notably the nearest neighbor one. Since, in several real-world domains, data may also suffer from several aspects of uncertainty, including incompleteness and inconsistency, an Enhanced Evidential k-Nearest Neighbor classifier has recently been introduced to deal with the uncertainty pervading both the attribute values and the classifier outputs within the belief function framework. In this paper, we build on the Enhanced Evidential k-Nearest Neighbor classifier to construct an ensemble pattern classification system. More precisely, we adopt the Random Subspace Method in our context to build ensemble classifiers from imperfect data.

:: SpringerLink :: slides ::
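
A sketch of the Random Subspace Method with plain majority-voting k-NN base classifiers (the paper instead uses the Enhanced Evidential k-NN and belief-function fusion; the feature-subset sampling is the same idea):

    # Python sketch: Random Subspace ensemble of k-NN voters
    import random
    from collections import Counter

    def knn_predict(train, query, feats, k=3):
        dist = lambda x: sum((x[f] - query[f]) ** 2 for f in feats)
        votes = [c for x, c in sorted(train, key=lambda xy: dist(xy[0]))[:k]]
        return Counter(votes).most_common(1)[0][0]

    def random_subspace_predict(train, query, n_features,
                                n_models=11, k=3, seed=0):
        rng = random.Random(seed)
        size = max(1, n_features // 2)       # half of the features per model
        votes = [knn_predict(train, query,
                             rng.sample(range(n_features), size), k)
                 for _ in range(n_models)]
        return Counter(votes).most_common(1)[0][0]  # majority vote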

Linda C. van der Gaag and Stavros Lopatatzidis. Exploiting stability for compact representations of independency models.
The notion of stability in semi-graphoid independency models was introduced to describe the dynamics of (probabilistic) independency upon inference. We revisit the notion in view of establishing compact representations of semi-graphoid models in general. Algorithms for this purpose typically build upon dedicated operators for constructing new independency statements from a starting set of statements. In this paper, we formulate a generalised strong-contraction operator to supplement existing operators, and prove its soundness. We then embed the operator in a state-of-the-art algorithm and illustrate that the enhanced algorithm may establish more compact model representations.

:: SpringerLink :: slides ::

Jirka Vomlel and Václav Kratochvíl. Solving trajectory optimization problems by influence diagrams.
Influence diagrams are decision-theoretic extensions of Bayesian networks. In this paper we show how influence diagrams can be used to solve trajectory optimization problems. These problems are traditionally solved by methods of optimal control theory but influence diagrams offer an alternative that brings benefits over the traditional approaches. We describe how a trajectory optimization problem can be represented as an influence diagram. We illustrate our approach on two well-known trajectory optimization problems – the Brachistochrone Problem and the Goddard Problem. We present results of numerical experiments on these two problems, compare influence diagrams with optimal control methods, and discuss the benefits of influence diagrams.

:: SpringerLink :: slides ::

Marco Wilhelm, Christian Eichhorn, Richard Niland and Gabriele Kern-Isberner. A semantics for conditionals with default negation.
Ranking functions constitute a powerful formalism for nonmonotonic reasoning based on qualitative conditional knowledge. Conditionals formalize defeasible rules and thus allow one to express that certain individuals or subclasses of some broader concept behave differently. More precisely, in order to model these exceptions by means of ranking functions, it is necessary to state that they behave contrarily with respect to the considered property. This paper proposes conditionals with default negation, which instead enable a knowledge engineer to formalize exceptions without giving more specific information. This is useful when a subclass is indifferent towards a certain property, or when the knowledge engineer wants to exclude a certain subclass because she is not aware of its behavior. Based on this novel type of conditional, we further present and discuss a nonmonotonic inference formalism.

:: SpringerLink :: slides ::
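
A sketch of the standard acceptance condition for ordinary conditionals under a ranking function, which the paper's conditionals with default negation generalize (toy example):

    # Python sketch: acceptance of an ordinary conditional (B|A) by a ranking
    # function kappa: the most normal A-worlds must be B-worlds.
    INF = float("inf")

    def rank(kappa, prop):
        return min((r for w, r in kappa.items() if prop(w)), default=INF)

    def accepts(kappa, A, B):
        """kappa accepts (B|A) iff kappa(A and B) < kappa(A and not B)."""
        return (rank(kappa, lambda w: A(w) and B(w))
                < rank(kappa, lambda w: A(w) and not B(w)))

    # worlds: (kind, mobility); penguins are the surprising birds
    kappa = {("bird", "flies"): 0, ("bird", "walks"): 1,
             ("penguin", "walks"): 1, ("penguin", "flies"): 3}
    bird = lambda w: w[0] in ("bird", "penguin")
    flies = lambda w: w[1] == "flies"
    print(accepts(kappa, bird, flies))  # True: birds normally fly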
