Question about global explanation and local explanation #476
Hi @JWKKWJ123 --

Global feature importances can be obtained via the term_importances function (terms include both individual features and also pairs). Local per-feature score contributions can be obtained with the predict_and_contrib function.

The number in the red box is the value that you assigned to the feature for this sample and then passed in via the X parameter to explain_local. I suspect, given that all the numbers in your example are between 0 and 1, that you're scaling them. The contribution in the blue box does not range between 0 and 1. For classification the score contributions are in logits, so a +1 contribution from a single feature would be fairly significant, and it appears that, at least for this model and this particular sample, no feature has that level of contribution.
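To make the logit point above concrete, here is a small stdlib-only sketch of how much a +1 logit contribution from a single feature moves the predicted probability. The base score of 0.0 is an assumed illustration, not a value taken from the model in the screenshots:

```python
import math

def sigmoid(score: float) -> float:
    """Convert a summed logit score into a probability."""
    return 1.0 / (1.0 + math.exp(-score))

# EBM classification scores are sums of per-term logit contributions.
base = 0.0                      # assumed intercept, for illustration only
p_before = sigmoid(base)        # 0.5
p_after = sigmoid(base + 1.0)   # ~0.731

print(f"{p_before:.3f} -> {p_after:.3f}")  # prints "0.500 -> 0.731"
```

So a single-feature contribution of +1 shifts the predicted probability from 50% to roughly 73%, which is why contributions that large are uncommon for any one term.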
Hi Paul,
Hey @JWKKWJ123, I left some instructions on doing custom image exports here: #161 (comment). You can use any of the supported plotly image export formats via the
Hi Harsha,
Does anyone know how to set the label size in the EBM explanation plots? I am using them for research, but the plot labels are very small, making them unreadable in a document at 100% resolution.
Actually I have the same question; the features in my experiment have long names. For now I train the EBM first, use the ebm.term_importances() function to output the feature importances, and use the seaborn package to draw the plots myself.
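A minimal matplotlib sketch of that do-it-yourself workflow, which also addresses the small-label problem. The feature names and importance values here are made up; in practice they would come from ebm.term_names_ and ebm.term_importances():

```python
import matplotlib
matplotlib.use("Agg")            # render without a display
import matplotlib.pyplot as plt

# Stand-ins for ebm.term_names_ and ebm.term_importances()
names = ["systolic blood pressure", "body mass index", "age & glucose"]
importances = [0.42, 0.31, 0.17]

fig, ax = plt.subplots(figsize=(8, 4))
ax.barh(names, importances)
ax.set_xlabel("Mean absolute score", fontsize=14)
ax.tick_params(labelsize=14)     # larger tick labels for readability
fig.tight_layout()               # keep long feature names inside the canvas
fig.savefig("term_importances.png", dpi=300)  # high dpi for print documents
```

Raising the dpi in savefig, rather than scaling the exported image afterwards, is what keeps the labels crisp at 100% zoom in a document.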
I have actually reached out to some researchers in my field who have used the EBM plots and had good-resolution plots in their articles; they referred me to the plotly and matplotlib API docs.
Hi all,
The show(ebm_global) and show(ebm_local) functions display the plots of global feature importance and local (subject-wise) predictions very well, and I like the plots a lot. But I still need to output the global feature importances and the local predictions so I can draw plots that suit my needs. Are there functions to output the feature importances and local predictions?
By the way, I am confused by the local explanation. For the number in the red box, is it the output of each feature? Does the contribution to the prediction in the blue box range between 0 and 1? Is the contribution to the prediction related both to the global feature importance and to the local prediction?