TU Darmstadt’s Centre for Cognitive Science participates with its research groups and PIs: Models of Higher Cognition (Frank Jäkel, Project Leader), Artificial Intelligence and Machine Learning (Kristian Kersting), and Psychology of Information Processing (Constantin Rothkopf). The application domain is represented by TU Darmstadt’s research group Energy Information Networks and Systems (Florian Steinke) and associated partners from the energy industry: Siemens AG (Corporate Technology, Research in Energy and Electronics) in Munich, and Entega AG in Darmstadt.
As technical and social systems grow in complexity, Artificial Intelligence (AI) promises to help us manage them by supporting planning and decision making. However, the predictions and action policies generated by AI and Machine Learning are usually not transparent, i.e. AI algorithms do not provide us with explanations for their solutions. In the applications where AI support is needed most, systems easily involve millions of variables, and their interactions are hardly understandable even for experts. This is due in part to the sheer size of these systems, but also to the complexity and opaqueness of AI algorithms.
Objectives: PlexPlain will investigate how human experts understand and explain complex systems and the AI algorithms that support decision making for these systems. The goal is the (at least partially) automated generation of cognitively adequate explanations that also support non-expert users of AI. Applications will focus on examples from the energy sector, e.g. policies for the transition to renewable energy and the prediction of electricity market prices, but will open up to other problem domains during the project.
Methodology: PlexPlain will conduct behavioural studies to examine how humans develop an understanding of linear programs. The observed human strategies will be used to develop algorithms that simplify linear programs, translate them into graphical models, and finally generate cognitively adequate explanations. PlexPlain will exploit the fact that linear programming, i.e. the optimization of a linear objective function under linear constraints, is a fundamental and widely used AI method for optimization and planning in complex systems. In addition, a variety of other current methods in Machine Learning and AI, e.g. neural networks or reinforcement learning, can also be analysed with linear programming. Linear programs therefore represent a large and relevant class of problems for which AI should provide not just solutions but also cognitively adequate explanations.
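To illustrate the class of problems the project targets, the following is a minimal sketch (not taken from the project itself) of an energy-dispatch linear program solved with SciPy's `linprog`; the generator names, costs, and capacities are invented for illustration:

```python
# Toy linear program: dispatch two hypothetical generators to meet a fixed
# demand at minimum cost. All numbers are assumptions for illustration only.
from scipy.optimize import linprog

# Decision variables: x = [gas_MW, solar_MW]
cost = [50.0, 10.0]          # objective: cost per MW of each generator
A_eq = [[1.0, 1.0]]          # equality constraint: gas + solar == demand
b_eq = [100.0]               # demand to be covered: 100 MW
bounds = [(0, 80), (0, 60)]  # capacity limits of each generator

res = linprog(c=cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print(res.x)    # optimal dispatch: [40. 60.] (cheap solar first, gas covers the rest)
print(res.fun)  # total cost: 2600.0
```

Even in this two-variable toy case, the solver returns only the optimal point; explaining *why* the solution looks this way (solar is cheaper, so it is used up to its capacity limit) is exactly the kind of cognitively adequate explanation the project aims to generate automatically for programs with millions of variables.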
Conference, Journal and Magazine Articles:
 Jonas Hülsmann and Florian Steinke (2020): Explaining Complex Energy Systems: A Challenge. Poster presented at: Tackling Climate Change with Machine Learning – NeurIPS; December 11, 2020.
 Matej Zečević, Devendra Singh Dhami, Athresh Karanam, Sriraam Natarajan, Kristian Kersting (2021): Interventional Sum-Product Networks: Causal Inference with Tractable Probabilistic Models. Published in Proceedings of Neural Information Processing Systems 34.
 Jonas Hülsmann, Lennart J. Sieben, Mohsen Mesgar, Florian Steinke (2021): A Natural Language Interface for an Energy System Model. 2021 IEEE PES Innovative Smart Grid Technologies Europe (ISGT Europe), 2021, pp. 1-5, doi: 10.1109/ISGTEurope52324.2021.9640196.
[A1] Frodl, E. (2021): The Furniture Company: Building Games to Measure Human Performance in Optimization Problems. Bachelor’s Thesis, Advisors: F. Jäkel, C. Ott. Technische Universität Darmstadt, 2021.
[A2] Sieben, L. (2021): Natural Language Interface for an Energy System Design Tool. Master’s Thesis, Advisors: F. Steinke, J. Hülsmann. Technische Universität Darmstadt, 2021.
[A3] Seng, J. (2021): Causal Discovery in Energy System Models. Master’s Thesis, Advisors: F. Steinke, K. Kersting. Technische Universität Darmstadt, 2021.