07.07.2025

Opening symposium of the research training group CAUSE

Joint research on self-explaining systems at the interface of software, hardware, and their design

How can digital systems explain their behavior in a way that is both precise and understandable? This question is the focus of CAUSE, Concepts and Algorithms for, and Usage of Self-Explaining Digitally Controlled Systems (https://rtg-cause.github.io/), a research training group funded by the DFG since November 2024 and jointly run by three institutions: the Hamburg University of Technology, the University of Bremen, and the University of Oldenburg.

Almost every technical system today is digitally controlled, which makes it more capable, yet how it works often remains incomprehensible to developers, users, and cooperating systems. This opacity can lead to incorrect conclusions, inefficiency, and even faulty reactions. The CAUSE research training group tackles these problems by working to make digitally controlled systems self-explaining for developers, users, and other systems.

Researchers have been cooperating within CAUSE since November 2024, jointly investigating self-explanation using energy grids and wind farms as driving examples. An overarching formalization of the notion of “explanation”, modeled on epistemic logic, makes it possible to compare the views of different systems, to determine when explanations are needed, and to provide them.
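To give a flavor of what such a formalization can look like, the following is a minimal sketch in standard multi-agent epistemic logic; the knowledge operator K, the "need" predicate, and the announcement operator are illustrative assumptions, not necessarily the formalization developed within CAUSE:

    % A minimal sketch, assuming standard multi-agent epistemic logic.
    % K_a \varphi reads: agent a knows \varphi; agents may be developers,
    % users, or cooperating systems. The predicate "need" and the agents
    % s (system) and o (observer) are illustrative only.
    %
    % A need for explanation arises when the system knows a fact about
    % its own behavior that an observer does not:
    \[
      \mathit{need}(s, o, \varphi) \;:=\; K_s\,\varphi \;\land\; \lnot K_o\,\varphi
    \]
    % A successful explanation e closes this gap: after e is truthfully
    % announced ([!e] is the public-announcement operator of dynamic
    % epistemic logic), the observer also knows \varphi:
    \[
      [\,!e\,]\,K_o\,\varphi
    \]

In this reading, a need for explanation is a knowledge gap between system and observer, and an explanation is an announcement that closes it; comparing which agents know which facts before and after such announcements is exactly the kind of cross-system view comparison described above.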

CAUSE went public with an opening symposium at the Hamburg University of Technology (TUHH). In his welcoming address, TUHH President Andreas Timm-Giel emphasized the strategic importance of CAUSE for strengthening regional cooperation among the partner institutions and for advancing technically sound approaches to system transparency. Prof. Görschwin Fey, CAUSE spokesperson, and Prof. Martin Fränzle, co-spokesperson from Oldenburg, outlined the research training group's vision: developing algorithmic principles, formal approaches, and engineering methods for systems that can provide meaningful, context-aware explanations of their own behavior.

The morning keynotes set the tone for the technical depth of the day. Prof. Erika Ábrahám (RWTH Aachen University) opened with a talk on formal verification methods as a basis for self-explaining control systems, emphasizing the role of logical structures for traceability and trust. She was followed by Andreas Sommer (WEINMANN Emergency Medical Technology), who explained from an industry perspective how explainability in system design can advance the development of medical devices.

The subsequent presentations introduced current research directions within CAUSE, ranging from fundamental challenges of cyber-physical systems (Prof. Heiko Falk) and recipient-specific explanation models (Prof. Verena Klös) to algorithmic decision procedures (Moritz Buhr) and executable explanations (Ulrike Engeln). Contributions by Ivo David Oliveira and Caroline Dominik showed how context-sensitive self-explanation can be integrated into adaptive software and virtual prototyping.

In the concluding poster session, all CAUSE doctoral students presented their research, illustrating the breadth of topics and the interdisciplinary collaborations already underway. The session provided room for in-depth professional exchange on the shared interests that will shape the program's development.

Researchers from academia and industry met at TUHH to discuss the self-explanation of systems.