Hi Dylan,
When I run BayesSHAP on my regression task, I hit a convergence error in the self._enumerate_initial_shap function. Any idea why this happens?
I saw that enumerate_initial is set to True by default, which is what leads into self._enumerate_initial_shap. Is it necessary when running BayesSHAP?
Also, does BayesSHAP scale to high input dimensions?
Best
Robin
Traceback (most recent call last):
File "/home/miniconda3/envs/hpmodel/lib/python3.7/site-packages/IPython/core/interactiveshell.py", line 3441, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-15-414d493a31bd>", line 14, in <module>
e_i = exp.explain(X_bayesshap[i], **exp_kwargs)
File "/home/Robin/BNN_Torch/bayesshap/explanations.py", line 651, in explain
l2=l2)
File "/home/Robin/BNN_Torch/bayesshap/explanations.py", line 457, in _explain_bayes_shap
data_init, inverse_init = self._enumerate_initial_shap(data, max_coefs)
File "/home/Robin/BNN_Torch/bayesshap/explanations.py", line 417, in _enumerate_initial_shap
inverse = self.shap_info.discretizer.undiscretize(data)
File "/home/miniconda3/envs/hpmodel/lib/python3.7/site-packages/lime/discretize.py", line 145, in undiscretize
feature, ret[:, feature].astype(int)
File "/home/miniconda3/envs/hpmodel/lib/python3.7/site-packages/lime/discretize.py", line 132, in get_undiscretize_values
random_state=self.random_state
File "/home/miniconda3/envs/hpmodel/lib/python3.7/site-packages/scipy/stats/_distn_infrastructure.py", line 980, in rvs
vals = self._rvs(*args)
File "/home/miniconda3/envs/hpmodel/lib/python3.7/site-packages/scipy/stats/_distn_infrastructure.py", line 913, in _rvs
Y = self._ppf(U, *args)
File "/home/miniconda3/envs/hpmodel/lib/python3.7/site-packages/scipy/stats/_continuous_distns.py", line 7163, in _ppf
return _truncnorm_ppf(q, a, b)
File "/home/miniconda3/envs/hpmodel/lib/python3.7/site-packages/scipy/stats/_continuous_distns.py", line 6933, in vf_wrapper
return vf(*args)
File "/home/miniconda3/envs/hpmodel/lib/python3.7/site-packages/numpy/lib/function_base.py", line 2163, in __call__
return self._vectorize_call(func=func, args=vargs)
File "/home/miniconda3/envs/hpmodel/lib/python3.7/site-packages/numpy/lib/function_base.py", line 2246, in _vectorize_call
outputs = ufunc(*inputs)
File "/home/miniconda3/envs/hpmodel/lib/python3.7/site-packages/scipy/stats/_continuous_distns.py", line 7113, in _truncnorm_ppf
maxiter=TRUNCNORM_MAX_BRENT_ITERS)
File "/home/miniconda3/envs/hpmodel/lib/python3.7/site-packages/scipy/optimize/zeros.py", line 780, in brentq
r = _zeros._brentq(f, a, b, xtol, rtol, maxiter, args, full_output, disp)
RuntimeError: Failed to converge after 40 iterations.
Would you mind providing some more detail so I can reproduce this? I don't immediately see why it should occur. Is this tabular data?
It doesn't look like enumerate_initial is causing the issue; it's the feature discretization (but you can turn enumerate_initial off if you want to; it just tends to make the Shapley value estimates a bit better by fitting some higher-weight coalitions first).
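For context on the discretization angle: LIME's `undiscretize` samples from a truncated normal fitted to each discretization bin, and the traceback bottoms out in scipy's `truncnorm` PPF, whose Brent root-finding is known to struggle when the standardized truncation bounds are extreme, e.g. when a feature is nearly constant inside a bin. A minimal sketch of that regime (the bin edges and statistics below are hypothetical, not taken from your data):

```python
from scipy.stats import truncnorm

# Hypothetical bin statistics: a feature that is nearly constant inside a
# discretization bin has std ~ 0, so the standardized truncation bounds
# (low - mean) / std and (high - mean) / std become enormous.
low, high = 0.0, 1.0
mean, std = 0.5, 1e-12
a, b = (low - mean) / std, (high - mean) / std
print(a, b)  # astronomically large bounds: where truncnorm's PPF can fail

# With moderate bounds, sampling behaves as expected:
x = truncnorm.rvs(-2.0, 2.0, loc=mean, scale=0.1, random_state=0)
print(0.3 <= x <= 0.7)  # prints True: sample stays inside loc +/- 2*scale
```

So it's worth checking whether some feature in `X_bayesshap` is (near-)constant or has outliers that produce degenerate bin statistics.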
It can take a while to fit the model for large feature dimensions (that's why we use super-pixels for images), so depending on how large you're talking, you could potentially run into issues.
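To put rough numbers on the scaling point: exact Shapley values would require enumerating all 2**d feature coalitions, which is why grouping features (like the super-pixels mentioned above) or sampling-based estimation matters as d grows. A quick illustration:

```python
# Exact Shapley enumeration needs 2**d coalitions; grouping d raw features
# into g groups (super-pixel style) shrinks that to 2**g.
for d in (10, 20, 50):
    print(f"d={d}: {2 ** d:,} coalitions")
# e.g. grouping 50 features into 10 groups: 2**50 -> 2**10 = 1,024 coalitions.
```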