Glossary of AI Terminology

What is SHAP (Shapley Additive Explanations)?

Shapley Additive Explanations (SHAP)

SHAP stands for “SHapley Additive exPlanations,” a method rooted in cooperative game theory and used to explain the output of machine learning models. Each SHAP value quantifies how much a given feature or input contributes, positively or negatively, to an individual prediction. The values are additive: for any single prediction, the feature contributions sum, together with a base value, to the model’s output.
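As a minimal sketch of how this looks in practice, the snippet below uses the open-source `shap` Python package with a scikit-learn tree model. The toy dataset, model choice, and parameter values are illustrative assumptions, not part of the definition above.

```python
# A minimal sketch of computing SHAP values with the open-source `shap`
# package, assuming scikit-learn and shap are installed. The dataset and
# model here are illustrative choices, not prescribed by the method.
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Toy regression data: 200 samples, 5 features.
X, y = make_regression(n_samples=200, n_features=5, random_state=0)

model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# TreeExplainer computes SHAP values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # shape: (n_samples, n_features)

# For one prediction, the SHAP values plus the base value sum to the
# model's output -- the "additive" property noted above.
i = 0
print("base value + sum of SHAP values:",
      explainer.expected_value + shap_values[i].sum())
print("model prediction:", model.predict(X[i : i + 1])[0])
```

Inspecting `shap_values[i]` for a single sample shows which features pushed that prediction up and which pushed it down.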
