Support forced splits with data and voting parallel versions of LightGBM #4260
Comments
It looks like the `ForceSplits` function is also only implemented in the serial tree learner: https://github.com/microsoft/LightGBM/blob/master/src/treelearner/serial_tree_learner.cpp#L450. It seems I would need to do something similar in the data parallel tree learner (https://github.com/microsoft/LightGBM/blob/master/src/treelearner/data_parallel_tree_learner.cpp) and in the voting parallel learner as well.
For data and voting distributed training, we need to synchronize the histograms before the step shown in LightGBM/src/treelearner/serial_tree_learner.cpp, lines 483 to 489 (at commit f831808). So the logic for implementing forced splits in the parallel learners would need to include that synchronization.
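To make the synchronization point above concrete, here is a minimal, purely illustrative sketch (NOT LightGBM code) of why per-worker histograms must be merged before a forced split can be evaluated in data-parallel training. The function name, the (gradient, hessian) bin layout, and all values are assumptions for illustration only.

```python
# Conceptual sketch: in data-parallel training each worker builds a
# histogram over its own data shard, so before a forced split at a given
# feature/threshold can be evaluated consistently, the per-worker
# histograms must be combined (an allreduce-style elementwise sum).

def allreduce_sum(per_worker_histograms):
    """Elementwise sum of each worker's per-bin (grad, hess) histogram."""
    n_bins = len(per_worker_histograms[0])
    global_hist = [(0.0, 0.0)] * n_bins
    for hist in per_worker_histograms:
        global_hist = [
            (g0 + g1, h0 + h1)
            for (g0, h0), (g1, h1) in zip(global_hist, hist)
        ]
    return global_hist

# Two workers, three bins each: (sum of gradients, sum of hessians) per bin.
worker_a = [(0.5, 1.0), (-0.2, 1.0), (0.3, 1.0)]
worker_b = [(0.1, 1.0), (0.4, 1.0), (-0.6, 1.0)]

merged = allreduce_sum([worker_a, worker_b])
# Each bin of `merged` now reflects statistics over the full dataset, so a
# forced split's gain would be computed identically on every worker.
```

This is only meant to show the shape of the problem: the serial learner can read its single local histogram directly, while the parallel learners must perform a reduction like this first.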
Closed in favor of being tracked in #2302. We decided to keep all feature requests in one place. You are welcome to contribute this feature! Please re-open this issue (or post a comment if you are not the topic starter) if you are actively working on implementing it.
Summary
I'm unable to add forced splits to the data and voting parallel versions of LightGBM in MMLSpark; I hit the error thrown here:
https://github.com/microsoft/LightGBM/blob/master/src/io/config.cpp#L318
Motivation
I would like to add this feature, but I'm not sure why I can't simply remove that check in the config and enable it. What is special about the data and voting parallel learners that would prevent this config from being specified on each node?
Description
The fix would be to remove that thrown exception. It would also be great if we could specify the forced splits directly as a string instead of a file path:
https://github.com/microsoft/LightGBM/blob/master/src/boosting/gbdt.cpp#L769
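For context on the file-based workflow this request wants to relax, here is a small sketch of building a forced-splits file. The nested JSON schema (`feature`, `threshold`, optional `left`/`right` children) and the `forcedsplits_filename` parameter are from LightGBM's documentation; all concrete values, and the idea of a string-accepting parameter, are illustrative assumptions.

```python
import json
import tempfile

# Force the root to split on feature 0 at threshold 7.5, and the right
# child on feature 1 at threshold 100; deeper nodes are left to LightGBM.
# (Feature indices and thresholds here are made up for illustration.)
forced_splits = {
    "feature": 0,
    "threshold": 7.5,
    "right": {"feature": 1, "threshold": 100},
}

with tempfile.NamedTemporaryFile(
    "w", suffix=".json", delete=False
) as f:
    json.dump(forced_splits, f)
    path = f.name

# Today the structure must be passed by file path. The request above is to
# also accept the JSON payload itself as a string, which would avoid
# shipping a file to every node in a distributed (e.g. MMLSpark) setup.
params = {
    "objective": "binary",
    "tree_learner": "data",         # currently rejected with forced splits
    "forcedsplits_filename": path,  # real parameter; a file path is required
}
```

Being able to inline the JSON would sidestep the per-node file distribution problem entirely, which is likely why the issue author raises it alongside lifting the parallel-learner restriction.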