20.06.2025

Opening symposium of the CAUSE graduate school

Joint research on self-explaining systems at the interface of software, hardware, and their design.

How can digital systems explain their behavior precisely and understandably? This question is the focus of CAUSE – Concepts and Algorithms for, and Usage of Self-Explaining Digitally Controlled Systems – a graduate school funded by the DFG since November 2024 and run jointly by three institutions: the Hamburg University of Technology, the University of Bremen, and the University of Oldenburg.

Today, almost every technical system is digitally controlled, which makes it increasingly intelligent. Yet how such a system works often remains opaque to developers, users, and cooperating systems, which can lead to false conclusions, inefficiency, and even faulty responses. The CAUSE graduate school addresses these problems by working to make digitally controlled systems self-explaining for developers, users, and other systems.

Researchers from the three institutions have been collaborating within CAUSE since November 2024, investigating self-explanation using energy networks and wind farms as examples. A comprehensive formalization of the concept of "explanation", in the spirit of epistemic logic, allows the views of different systems to be compared, the need for an explanation to be detected, and explanations to be provided.
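To illustrate the flavor of such a formalization (a generic sketch in standard epistemic logic, not the specific CAUSE calculus), let $K_a \varphi$ denote "agent $a$ knows $\varphi$". A mismatch between the knowledge of two systems can then be read as a trigger for an explanation:

$$
K_a \varphi \;\wedge\; \neg K_b \varphi \;\longrightarrow\; \mathit{explain}(a \to b, \varphi)
$$

That is, if system $a$ knows a fact $\varphi$ about its own behavior that system (or user) $b$ does not know, $a$ should provide an explanation of $\varphi$ to $b$. Comparing the knowledge operators $K_a$ and $K_b$ is what makes the views of different systems formally comparable.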

CAUSE was launched with a symposium at the Hamburg University of Technology (TUHH). In his welcoming address, TUHH President Andreas Timm-Giel emphasized the strategic importance of CAUSE for strengthening regional cooperation between the partner institutions and further developing technically sound approaches to system transparency. Prof. Görschwin Fey, spokesperson for CAUSE, and Prof. Martin Fränzle, co-spokesperson from Oldenburg, outlined the vision of the graduate school: the development of algorithmic principles, formal approaches, and engineering methods for systems that can provide meaningful, context-specific explanations for their own behavior.

The morning keynotes set the tone for the technical depth of the day. Prof. Erika Ábrahám (RWTH Aachen) opened with a presentation on formal verification methods as a basis for self-explaining control systems and emphasized the role of logical structures for traceability and trust. She was followed by Andreas Sommer (WEINMANN Emergency Medical Technology), who explained from an industry perspective how explainability in system design can support the development of medical devices.

The subsequent presentations introduced current research directions within CAUSE. The talks ranged from fundamental challenges of cyber-physical systems (Prof. Heiko Falk) and recipient-specific explanation models (Prof. Verena Klös) to algorithmic decision-making processes (Moritz Buhr) and executable explanations (Ulrike Engeln). Contributions from Ivo David Oliveira and Caroline Dominik showed how context-sensitive self-explanation can be integrated into adaptive software and virtual prototyping.

In the concluding poster session, all CAUSE doctoral candidates presented their research, highlighting the diversity of topics and the interdisciplinary collaborations already underway. The session provided space for detailed technical exchange on the points of contact that will shape the development of the program.

Group photo
Researchers from academia and industry met at TUHH to discuss the self-explanation of systems.