Project EXPECTATION (2021-2024) Accepted.

EXPECTATION is a CHIST-ERA project (supported by ERA-NET and FET) on eXplainable AI (XAI), entitled: Personalized Explainable Artificial Intelligence for decentralized agents with heterogeneous knowledge.

Partners

The project involves 4 partners:

  • University of Luxembourg (PI: Prof. Leon van der Torre),
  • HES-SO, University of Applied Sciences and Arts Western Switzerland,
  • Alma Mater Studiorum Università di Bologna, Italy, and
  • Özyeğin University, Turkey.

Project description

Explainable AI (XAI) has recently emerged as a set of techniques that attempt to explain machine learning (ML) models. The intended recipients (explainees) are humans or other intelligent virtual entities. Transparency, trust, and debugging are the underlying needs motivating XAI. However, in real-world settings, systems are distributed, data are heterogeneous, the "system" knowledge is bounded, and privacy concerns are subject to variable constraints. Current XAI approaches cannot cope with such requirements; hence the need for personalized explainable artificial intelligence. We plan to develop models and mechanisms that reconcile sub-symbolic, symbolic, and semantic representations, leveraging the agent-based paradigm. In particular, the proposed approach combines inter-agent, intra-agent, and human-agent interactions to benefit both from the specialization of ML agents and from agent collaboration mechanisms that integrate heterogeneous knowledge and explanations extracted from efficient black-box AI agents. The project includes the validation of the personalization and heterogeneous knowledge integration approach through a prototype application in the domain of food and nutrition monitoring and recommendation, including the evaluation of agent-human explainability and of the performance of the employed techniques in a collaborative AI environment.