
From SHAP to EBM: Explain your Gradient Boosting Models in Python

Formal Metadata

Title
From SHAP to EBM: Explain your Gradient Boosting Models in Python
Series Title
Number of Parts
18
Author
Contributors
License
CC Attribution 4.0 International:
You are free to use, adapt, and copy, distribute, and transmit the work or content in adapted or unchanged form for any legal purpose, provided you credit the author/rights holder in the manner specified by them.
Identifiers
Publisher
Release Year
Language

Content Metadata

Subject Area
Genre
Abstract
**XGBoost** is considered a state-of-the-art model for regression, classification, and learning-to-rank problems on tabular data. Unfortunately, tree-based ensemble models are notoriously **difficult to explain**, which limits their adoption in critical fields. Techniques such as SHapley Additive exPlanations (SHAP) and the Explainable Boosting Machine (EBM) have become common methods for assessing how much each feature contributes to a model's prediction. This talk introduces SHAP and EBM, **explaining the theory** behind their mechanisms in an accessible way and **discussing the pros and cons** of both techniques. We will also walk through Python snippets where SHAP and EBM are used to explain a gradient boosting model. Attendees will walk away with an understanding of how SHAP and EBM work, the limitations and merits of each technique, and a tutorial on how to use these methods in Python with the [shap](shap.readthedocs.io/en/latest/) and [interpret-ml](interpret.ml/docs/ebm) packages.

Talk outline:

- A brief reminder about gradient boosting and XGBoost (5 mins)
- The challenge of explainability (5 mins)
- EBM: theory and applications (10 mins)
- SHAP: theory and applications (10 mins)

---------------------

About the speaker(s):

Engineer, researcher, entrepreneur. Emanuele earned his PhD in AI researching time series forecasting. He was a guest researcher at EPFL Lausanne and is now Head of AI at xtream. He has published 8 papers in international journals, presented and organized tracks and workshops at international conferences (including AMLD Lausanne, ODSC London, WeAreDevelopers Berlin, PyData Berlin and Paris, and PyCon Florence), and lectured in Italy, Switzerland, and Poland.
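
As a taste of the hands-on part of the talk, below is a minimal sketch of how a gradient boosting model might be explained post hoc with `shap`, and how an EBM might be trained and inspected with `interpret`. The dataset, hyperparameters, and plotting calls are illustrative assumptions, not the exact snippets presented in the talk.

```python
# Minimal sketch (assumed example, not the talk's exact code):
# 1) explain an XGBoost regressor with SHAP, 2) fit a glass-box EBM with interpret.
import shap
import xgboost
from interpret import show
from interpret.glassbox import ExplainableBoostingRegressor
from sklearn.datasets import fetch_california_housing

X, y = fetch_california_housing(return_X_y=True, as_frame=True)

# Post-hoc explanation of a gradient boosting model with SHAP
model = xgboost.XGBRegressor(n_estimators=200, max_depth=4).fit(X, y)
explainer = shap.TreeExplainer(model)   # exact Shapley values for tree ensembles
shap_values = explainer(X)
shap.plots.beeswarm(shap_values)        # global view: one dot per sample and feature

# Glass-box alternative: train an EBM and inspect its learned shape functions
ebm = ExplainableBoostingRegressor()
ebm.fit(X, y)
show(ebm.explain_global())                      # per-feature contribution curves
show(ebm.explain_local(X.head(5), y.head(5)))   # per-sample explanations
```

The key design difference the sketch illustrates: SHAP explains an already-trained black-box model after the fact, while the EBM is itself an interpretable model, a distinction at the heart of the pros and cons discussed in the talk.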