What is SHAP (Shapley Additive Explanations)?

SHAP stands for "Shapley Additive Explanations," a method grounded in cooperative game theory and used to explain the output of machine learning models. SHAP values quantify how much each input feature contributes, positively or negatively, to a given prediction, and these per-feature contributions add up to the difference between the model's prediction and its baseline (average) output.
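For concreteness, here is a minimal sketch of computing SHAP values in Python with the open-source shap package. The model, dataset, and variable names below are illustrative assumptions, not part of the original article.

```python
# Minimal sketch: computing SHAP values with the `shap` package.
# Assumes scikit-learn and shap are installed; the model and dataset
# are illustrative choices, not prescribed by the article.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Train a simple model on a toy regression dataset.
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer computes SHAP values efficiently for tree-based models.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Each row holds one prediction's per-feature contributions, which sum
# (together with the baseline expected value) to the model's output.
print(shap_values[0])            # contributions for the first sample
print(explainer.expected_value)  # baseline: the mean model output
```

A positive value in a row means that feature pushed the prediction above the baseline for that sample; a negative value means it pushed the prediction below it.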
