A2 – Probabilistic Causality and Explications

The vision of the project is to develop a new theory of probabilistic causation for stochastic automata models with nondeterminism. This theory will provide the foundations for reasoning about cause-effect relations that, e.g., quantify the degree to which events in one component can cause critical situations in another component. The project will develop algorithms both for probabilistic causality analysis, i.e., the computation of cause-effect assertions, and for causality-based verification with integrated certification and explication facilities. The latter includes reduction techniques that exploit the stochastic (in-)dependence structure of actions as well as algorithms for the incremental recovery of the causal structure of systems. Besides providing foundations for the design of perspicuous systems, the project will also address the use of causality-based reasoning in combination with run-time monitoring algorithms to detect critical event sequences and to support decision making at run time. The long-term vision further includes techniques that support the transition from mathematical explications to visual or verbal explanations, e.g., through the automated generation of annotated graph structures and other graphical representations of causal relationships.
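
To make the intended style of cause-effect assertion more tangible, the sketch below illustrates one common way to formalize probabilistic causality, the probability-raising criterion in a finite Markov chain: an event counts as a cause of an effect if observing it raises the probability of eventually reaching the effect. This is only a hedged illustration; the toy chain, the state names, and the value-iteration helper are invented for this example and do not represent the project's actual definitions or algorithms.

```python
# Illustrative only: probability-raising causality in a toy Markov chain.
#     Pr(reach effect | reach cause) > Pr(reach effect)
import numpy as np

# States: 0 = init, 1 = cause observed, 2 = effect (critical), 3 = safe sink.
# P[s, t] is the one-step transition probability from s to t.
P = np.array([
    [0.0, 0.4, 0.1, 0.5],   # from init
    [0.0, 0.0, 0.6, 0.4],   # from cause
    [0.0, 0.0, 1.0, 0.0],   # effect is absorbing
    [0.0, 0.0, 0.0, 1.0],   # safe sink is absorbing
])

def reach_prob(P, target, n_iter=1000):
    """Probability of eventually reaching `target` from each state,
    computed by value iteration with the target probability fixed to 1."""
    x = np.zeros(len(P))
    x[target] = 1.0
    for _ in range(n_iter):
        x = P @ x
        x[target] = 1.0
    return x

INIT, CAUSE, EFFECT = 0, 1, 2
p_effect = reach_prob(P, EFFECT)

# Prior probability of the effect from the initial state, and the conditional
# probability given the cause, read off at the cause state (valid in this toy
# chain because the cause state can only be entered before the effect).
prior = p_effect[INIT]
conditional = p_effect[CAUSE]

print(f"Pr(effect)         = {prior:.3f}")
print(f"Pr(effect | cause) = {conditional:.3f}")
print("probability-raising cause" if conditional > prior else "not a cause")
```

On this toy chain the conditional probability (0.6) exceeds the prior (0.34), so the cause state satisfies the probability-raising condition.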

Principal Investigators

Technische Universität Dresden
CISPA Helmholtz Center for Information Security
Max Planck Institute for Software Systems

Researchers

Norine Coenen
Clemens Dubslaff
Florian Funke
Simon Jantsch
Nikolai Käfer
Jakob Piribauer
Kallistos Weis
Patrick Wienhöft
Robin Ziemek
