Enhancing Performance and Explainability of Multivariate Time Series Machine Learning Methods: Applications for Social Impact in Dairy Resource Monitoring and Earthquake Early Warning

Type of defense
Thesis
Start date
Location
IRISA Rennes
Room
Métivier
Speaker
Kevin Fauvel (LACODAM)
Subject

The prevalent deployment and usage of sensors in a wide range of sectors generates an abundance of multivariate data, which has proven instrumental for research, business, and policy. More specifically, multivariate data that integrate temporal evolution, i.e. Multivariate Time Series (MTS), have received significant interest in recent years, driven by high-resolution monitoring applications (e.g. healthcare, mobility) and machine learning. However, for many applications, the adoption of machine learning algorithms cannot rely solely on their prediction performance. For example, the European Union's General Data Protection Regulation, which became enforceable on 25 May 2018, introduces a right to explanation for all individuals so that they can obtain "meaningful explanations of the logic involved" when automated decision-making has "legal effects" on them or similarly "significantly affects" them.

The current best-performing state-of-the-art MTS machine learning methods are "black-box" models, i.e. models that are complicated to understand, and they rely on explainability methods that can provide explanations for any machine learning model to support their predictions (post-hoc, model-agnostic). The main line of work in post-hoc model-agnostic explainability methods approximates the decision surface of a model with an explainable surrogate model. However, the explanations from the surrogate model cannot be perfectly faithful to the original model, which is a prerequisite for numerous applications. Faithfulness is critical because it corresponds to the level of trust an end-user can place in the explanations of model predictions, i.e. the degree to which the explanations reflect what the model actually computes.
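
To illustrate the surrogate-based, post-hoc model-agnostic approach and the faithfulness issue mentioned above, the following sketch (not part of the thesis; the synthetic data, the random forest standing in for a black-box MTS classifier, and the decision-tree surrogate are all illustrative assumptions) trains an interpretable surrogate to mimic a black-box model's predictions and reports its fidelity, i.e. how often the surrogate agrees with the black box.

```python
# Illustrative sketch: post-hoc surrogate explanation of a "black-box"
# MTS classifier, with a simple fidelity (faithfulness) check.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Toy MTS dataset: 200 samples, 3 variables, 50 time steps, 2 classes.
X = rng.normal(size=(200, 3, 50))
y = (X[:, 0, :].mean(axis=1) > 0).astype(int)
X_flat = X.reshape(len(X), -1)  # flatten variables x time for the models

# "Black-box" model (stands in for a high-performing MTS classifier).
black_box = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_flat, y)

# Explainable surrogate trained to mimic the black box's predictions
# rather than the true labels (post-hoc, model-agnostic).
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X_flat, black_box.predict(X_flat))

# Fidelity: agreement between surrogate and black box. Anything below 100%
# means the explanations are not perfectly faithful to the original model.
fidelity = (surrogate.predict(X_flat) == black_box.predict(X_flat)).mean()
print(f"Surrogate fidelity to the black box: {fidelity:.1%}")
```

In such a setup, a shallow surrogate is easy to read but rarely matches the black box everywhere, which is exactly the faithfulness gap the abstract refers to.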

This thesis introduces new approaches to enhance both the performance and the explainability of MTS machine learning methods, and derives insights from these new methods in two real-world applications: dairy resource monitoring and earthquake early warning.


Jury composition
- Fosca GIANNOTTI. Director of Research - CNR, Italy
- Véronique MASSON. Associate Professor - University of Rennes, France
- Germain FORESTIER. Full Professor - University of Haute-Alsace, France
- Sébastien LEFÈVRE. Full Professor - University of South Brittany, France
- Aurélien MADOUASSE. Associate Professor - ONIRIS/INRAE, France
- Philippe FAVERDIN. Director of Research - INRAE, France (Thesis Co-Dir.)
- Alexandre TERMIER. Full Professor - University of Rennes, France (Thesis Dir.)