shap (SHapley Additive exPlanations)

The SHAP package has been added to the Python build Python-3.6.5-foss-2016b-fh3. SHAP (SHapley Additive exPlanations) is a unified approach to explaining the output of any machine learning model. It connects game theory with local explanations, uniting several previous methods [1-7], and represents the only possible consistent and locally accurate additive feature attribution method based on expectations (see the SHAP NIPS paper for details).

Project Homepage: https://github.com/slundberg/shap

Usage:

module load Python/3.6.5-foss-2016b-fh3
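
Once the module is loaded, SHAP can be used directly from Python. The sketch below is a minimal, illustrative example only: it assumes scikit-learn and matplotlib are also available in the loaded build (common for fh Python builds, but not guaranteed), and uses a small regression model purely for demonstration.

```python
# Minimal sketch of explaining a tree-ensemble model with SHAP.
# Assumes scikit-learn and matplotlib are present in the loaded Python build.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Fit a small illustrative model
X, y = load_diabetes(return_X_y=True)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# TreeExplainer computes SHAP values efficiently for tree-based models
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Visualize per-feature attributions across the dataset
shap.summary_plot(shap_values, X)
```

For models that are not tree ensembles, shap.KernelExplainer offers a model-agnostic (though slower) alternative.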
