PlexPlain - Explaining Linear Programs

PlexPlain („Erklärende KI für Komplexe Lineare Programme am Beispiel intelligenter Energiesysteme“, i.e. "Explanatory AI for Complex Linear Programs, Exemplified by Intelligent Energy Systems") is an R&D project funded by the German Federal Ministry of Education and Research. Its goal is the automated generation of explanations for complex linear programs, with a focus on applications in the energy sector.

TU Darmstadt’s Centre for Cognitive Science participates with its research groups and PIs: Models of Higher Cognition (Frank Jäkel, Project Leader), Artificial Intelligence and Machine Learning (Kristian Kersting), and Psychology of Information Processing (Constantin Rothkopf). The application domain is represented by TU Darmstadt’s research group Energy Information Networks and Systems (Florian Steinke) and associated partners from the energy industry: Siemens AG (Corporate Technology, Research in Energy and Electronics) in Munich, and Entega AG in Darmstadt.

As technical and social systems grow in complexity, Artificial Intelligence (AI) promises to help us manage them by providing support for planning and decision making. However, predictions and action policies generated by AI and Machine Learning are usually not transparent, i.e. AI algorithms do not provide explanations for their solutions. In the applications where AI support is most needed, systems can easily involve millions of variables, and their interactions are hardly understandable even for experts. This is due both to the sheer size of these systems and to the complexity and opaqueness of AI algorithms.

Objectives: PlexPlain will investigate how human experts understand and explain complex systems and the AI algorithms that support decision making for these systems. The goal is an (at least partially) automated generation of cognitively adequate explanations, so as to also support non-expert users of AI. Applications will focus on examples from the energy sector, e.g. policies for the transition to renewable energy and prediction of the market price for electricity, but will extend to other problem domains over the course of the project.

Methodology: PlexPlain will conduct behavioural studies to examine how humans develop an understanding of linear programs. The observed human strategies will be used to develop algorithms that simplify linear programs, translate them into graphical models, and finally generate cognitively adequate explanations. PlexPlain will exploit the fact that linear programming, i.e. the optimization of linear objective functions under linear constraints, is a fundamental and widely used AI method for optimization and planning in complex systems. In addition, a variety of other current methods in Machine Learning and AI, e.g. neural networks or reinforcement learning, can also be analysed via linear programming. Linear programs therefore represent a large and relevant class of problems for which AI should provide not just solutions but also cognitively adequate explanations.
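To make the class of problems concrete, a linear program in two variables can be solved by hand: the optimum always lies at a vertex of the feasible polygon, so it suffices to enumerate the intersections of the constraint boundaries and pick the cheapest feasible one. The toy energy-dispatch numbers below are invented for illustration and are not taken from the project:

```python
from itertools import combinations

# Toy dispatch LP (illustrative numbers only):
#   minimize 50*gas + 10*solar            (generation cost in EUR/MWh)
#   subject to gas + solar >= 100 (demand), 0 <= gas <= 80, 0 <= solar <= 60.
# Every constraint is written in the form a*x + b*y <= c:
constraints = [
    (-1.0, -1.0, -100.0),  # gas + solar >= 100 (demand)
    ( 1.0,  0.0,   80.0),  # gas <= 80
    (-1.0,  0.0,    0.0),  # gas >= 0
    ( 0.0,  1.0,   60.0),  # solar <= 60
    ( 0.0, -1.0,    0.0),  # solar >= 0
]
cost = (50.0, 10.0)

def intersect(c1, c2):
    """Intersection of two constraint boundary lines, or None if parallel."""
    (a1, b1, d1), (a2, b2, d2) = c1, c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((d1 * b2 - d2 * b1) / det, (a1 * d2 - a2 * d1) / det)

def feasible(p):
    return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in constraints)

# Enumerate all pairwise boundary intersections, keep the feasible vertices,
# and take the one with the lowest objective value.
vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) is not None and feasible(p)]
best = min(vertices, key=lambda p: cost[0] * p[0] + cost[1] * p[1])
print(best)  # cheap solar runs at full capacity, gas covers the remaining demand
```

Vertex enumeration scales exponentially and is only viable for such tiny examples; real energy-system models with millions of variables require simplex or interior-point solvers, which is precisely why their solutions call for automated explanation.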

Contacts:

Frank Jäkel, Florian Steinke

Conference, Journal and Magazine Articles:

Theses:

  • Frodl, E. (2021): The Furniture Company: Building Games to Measure Human Performance in Optimization Problems. Bachelor’s Thesis, Advisors: F. Jäkel, C. Ott. Technische Universität Darmstadt, 2021.
  • Sieben, L. (2021): Natural Language Interface for an Energy System Design Tool. Master’s Thesis, Advisors: F. Steinke, J. Hülsmann. Technische Universität Darmstadt, 2021.
  • Seng, J. (2021): Causal Discovery in Energy System Models. Master’s Thesis, Advisors: F. Steinke, K. Kersting. Technische Universität Darmstadt, 2021.
  • Busch, F. P. (2022): Explaining Neural Network Representations of Linear Programs. Master’s Thesis, Advisors: K. Kersting, M. Zečević. Technische Universität Darmstadt, 2022.
  • Steinmann, D. (2022): Explaining Linear Programs via Neural Attribution Methods. Master’s Thesis, Advisors: K. Kersting, M. Zečević. Technische Universität Darmstadt, 2022.
  • Dotterer, S. (2022): Investigating the Influence of Different Cost-Profit Ratios on Human Performance and Strategies in Optimization Problems in an Eye Tracking Experiment. Bachelor’s Thesis, Advisors: C. A. Rothkopf, I. Ibs. Technische Universität Darmstadt, 2022.
  • Uetz, P. (2022): Investigation of the Vulnerability of Energy System Models to Adversarial Attacks. Master’s Thesis, Advisors: F. Steinke, J. Hülsmann. Technische Universität Darmstadt, 2022.
  • Pohl, A. (2023): Die Heidelberger Struktur-Lege-Technik als Werkzeug zur Analyse subjektiver Theorien im Kontext des Planspiels Energiewende. Bachelor’s Thesis, Advisors: F. Jäkel, C. Ott. Technische Universität Darmstadt, 2023.
  • Rödling, S. (2023): Providing Causal Explanations Over Time: An Extension of SCE for Time-Series Data. Master’s Thesis, Advisors: K. Kersting, M. Zečević. Technische Universität Darmstadt, 2023.
  • Hülsmann, J. (2024): Aspects of Explanations for Optimization-Based Energy System Models. Doctoral Thesis, Referees: F. Steinke, F. Jäkel. Technische Universität Darmstadt, 2024.

Preprints and Accepted Articles:

  • Claire Ott, Inga Ibs, Constantin Rothkopf, Frank Jäkel (submitted). Unveiling the relationship between tasks: Optimization as a case for taxonomic analysis.

Project Details

Project: PlexPlain – Erklärende KI für Komplexe Lineare Programme am Beispiel intelligenter Energiesysteme
Project partners: Technical University of Darmstadt (TU Darmstadt)
Project duration: April 2020 – July 2023
Project funding: EUR 1.23 million
Funded by: German Federal Ministry of Education and Research (BMBF)
Grant no.: 01IS19081
Website: https://www.softwaresysteme.pt-dlr.de/de/ki-erkl-rbarkeit-und-transparenz.php
Final Report Download