Seamless support for perspicuity is a formidable and largely unsolved scientific challenge. It touches understanding at design-time, plausibilisation at run-time, and – in case of a malfunction – justification at inspection-time. The Collaborative Research Centre/Transregio 248 takes up this challenge. It lays the scientific foundations for designing, analysing and interacting with computerised systems that explicate their functioning. We call them Perspicuous Systems.

The immediate focus of the CRC lies on the scientific foundations of perspicuous computing. It assembles experts in key areas of computer science, spanning what is needed to develop a profound and usable theory of explications – mathematical or computational constructs that describe facets of behaviour – as well as explanations – interactive visualisations and verbalisations that enable the exploration of explications by different user groups. The proposed research will have direct impact across a wide spectrum of cyber-physical applications, making it possible to build software-based systems that act predictably and understandably, and enabling other disciplines to use and reason about those systems. The scope of the CRC will thereby successively broaden into a research agenda on perspicuous cyber-physical systems, reaching out to engineering, psychology, ethics, and law. This will serve our society in its need to stay in well-informed control over the computerised systems we all interact with. The CRC will enable comprehension in a cyber-physical world.

Our research efforts are structured into three project groups, each of them combining the expertise of Principal Investigators across formal methods (FM), artificial intelligence (AI), and human-computer interaction (HCI). Below, you can see the project groups as well as the individual projects themselves.

Construct – Project Group C

The Construct project group will develop system and programming abstractions for the design and implementation of perspicuous systems. The abstractions will connect the computations on the various layers of the system with component and system-level behavioural guarantees, providing a high-level understanding of the relevant factors that determine the system’s behaviour.
The goal of this project is to develop programming abstractions for CPS applications, together with reasoning and verification techniques for correctness, synthesis techniques across continuous…
Principal Investigators: Rupak Majumdar, Bernd Finkbeiner, Stefan Gumhold
A typical embedded device maintains and periodically updates a behaviour model of its surrounding world, on which it bases its decisions. In full generality, this…
Principal Investigators: Holger Hermanns, Christel Baier, Sarah Gaggl
Future dependable systems are expected to be dynamic in the sense that they need to be adaptable in the field, to changing needs and according…
Principal Investigators: Christof Fetzer, Holger Hermanns, Jörg Hoffmann
The mission of this project is to develop the foundational concepts, algorithmic methods, and tools needed to construct cyber-physical systems whose behaviour is described in…
Principal Investigators: Bernd Finkbeiner, Raimund Dachselt, Markus Krötzsch
This project aims to improve the understandability of erroneous behaviour in cyber-physical systems using program analysis techniques. To this end, we will first establish the fundamental tools…
Principal Investigators: Maria Christakis, Christof Fetzer
In dynamic environments, at design-time it is hard or impossible to model and tackle all possible circumstances that may arise; run-time planning serves to take…
Principal Investigators: Jörg Hoffmann, Maria Christakis, Matthias Hein

Analyse & Explicate – Project Group A

The projects of group Analyse & Explicate will lay the foundations for analysis techniques as well as explication support across the entire system lifecycle, identifying and explicating causes and effects, and thereby establishing the algorithmic principles for making system behaviour perspicuous to human stakeholders.
The vision of the project is to provide an algorithmic framework for the analysis of dynamical and hybrid automata models and corresponding explication mechanisms. While…
Principal Investigators: Joël Ouaknine, Christel Baier
The vision of the project is to develop a new theory for probabilistic causation in stochastic automata models with nondeterminism. It will yield the foundations…
Principal Investigators: Christel Baier, Bernd Finkbeiner, Rupak Majumdar
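A classical ingredient of theories of probabilistic causation is the probability-raising principle: a candidate cause should raise the probability of the effect. A minimal, purely illustrative sketch over finite sets of observed traces – all event names and traces below are invented, and the project's actual setting of stochastic automata with nondeterminism is far richer:

```python
def conditional_prob(traces, event, given):
    """Estimate P(event occurs in a trace | given occurs in it)."""
    matching = [t for t in traces if given in t]
    if not matching:
        return 0.0
    return sum(1 for t in matching if event in t) / len(matching)

def probability_raising(traces, cause, effect):
    """Check whether observing `cause` raises the probability of `effect`."""
    p_with = conditional_prob(traces, effect, cause)
    without = [t for t in traces if cause not in t]
    p_without = (sum(1 for t in without if effect in t) / len(without)) if without else 0.0
    return p_with > p_without, p_with, p_without

# Invented toy traces: 'overheat' tends to be followed by 'shutdown'.
traces = [
    {"overheat", "shutdown"},
    {"overheat", "shutdown"},
    {"overheat"},
    {"normal"},
    {"normal", "shutdown"},
    {"normal"},
]
raised, p_with, p_without = probability_raising(traces, "overheat", "shutdown")
```

On these toy traces, shutdowns are twice as likely given an overheat (2/3) as without one (1/3), so the probability-raising condition holds.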
The main goal of the project is to develop novel techniques for the user-adaptive explication of knowledge-based reasoning, and to combine them with the explication…
Principal Investigators: Franz Baader, Stefan Borgwardt, Antonio Krüger, Christoph Weidenbach
This project investigates perspicuous supervision of technical systems, carried out in a read data – analyse – react loop (RAR loop) several times per second. Our vision is a…
Principal Investigators: Christoph Weidenbach, Christof Fetzer, Markus Krötzsch
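The RAR loop mentioned above can be sketched in a few lines. Everything in this sketch – the sensor readings, the threshold, and the actuation choices – is an invented placeholder, not part of the project:

```python
def read_sensor(step):
    """Hypothetical sensor: one temperature reading per loop iteration."""
    return [20.0, 23.5, 27.0, 31.5, 24.0][step]

def analyse(reading, limit=25.0):
    """Check the reading against the limit; also produce a human-readable
    explication of why the verdict was reached."""
    violated = reading > limit
    reason = f"reading {reading} {'exceeds' if violated else 'is within'} limit {limit}"
    return violated, reason

def react(violated):
    """Hypothetical actuation: throttle the plant while the limit is violated."""
    return "throttle" if violated else "continue"

def rar_loop(steps=5):
    """Run the read data - analyse - react loop, keeping the explication
    of every decision for later inspection."""
    log = []
    for step in range(steps):
        reading = read_sensor(step)          # read data
        violated, reason = analyse(reading)  # analyse
        action = react(violated)             # react
        log.append((step, action, reason))
    return log
```

Each log entry records not only the action taken but also the reason behind it, so that every decision of the loop can be justified at inspection time.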
The goal of this project is an explication framework for machine learning techniques with a particular focus on neural networks. As current classifiers are extremely…
Principal Investigators: Matthias Hein, Stefan Gumhold

Explain & Interact – Project Group E

Project group Explain & Interact will advance HCI, visualisation, and NLP research with respect to the seamless design, control, and inspection of cyber-physical systems, with the goal of achieving perspicuity for diverse types of users, in diverse system settings, and throughout the entire lifecycle of such systems.
Besides the common textual representation, graph visualisations present both structural information and multivariate data attributes as a basis for improved understanding and explanation of models.…
Principal Investigators: Raimund Dachselt, Franz Baader
In mixed-initiative control, one important issue is handover. The mission of this project is to better understand how machines can safely hand over control to…
Principal Investigators: Antonio Krüger, Stefan Borgwardt, Vera Demberg, Jörg Hoffmann
The goal of this project is to establish methods and tools that enable information integration and analysis of CPS behaviour at inspection time. We will…
Principal Investigators: Markus Krötzsch, Sarah Gaggl, Antonio Krüger
This project will investigate immersive visualisation techniques for large heterogeneous datasets partially embedded in physical space with a focus on inspection-time analysis. First, we study…
Principal Investigators: Stefan Gumhold, Raimund Dachselt