Custom Loss Function for LGBMRegressor #5256
Hi @jdtrebbien, thank you for your interest in LightGBM. If you only want a custom objective, it is enough to specify it via the `objective` parameter, as shown in LightGBM/tests/python_package_test/test_sklearn.py, lines 207 to 214 at commit 4971a06.
Keep in mind that the results will be a bit different at first because of the different init score (#5114 (comment)), but if you train for enough iterations you should get the same results. You can also use the built-in metrics if you want; there's no need for a custom metric if a built-in one already exists. Please let us know if this helps.
Thank you very much, that is basically what I was looking for. But are you sure I don't need to change `metric` or something to also change the loss function used on the validation set for early_stopping? Since it still says:

One more thing, if you have the time:
That's the default metric for regression. If you do want to use a custom metric as well, you have to set it as shown in LightGBM/tests/python_package_test/test_sklearn.py, lines 913 to 917 at commit 4971a06.
TBH I don't know about the l1/2 loss; the square root is defined only for non-negative numbers, so I don't think it'll be a good objective, but maybe someone else here can comment on it.
@jdtrebbien You can find an end-to-end example of how to use custom loss and evaluation functions in my LightGBMLSS repo. In the linked example, I use PyTorch's autograd, so you can derive gradients and hessians for any user-defined loss. Let me know if that is useful.
This issue has been automatically closed because it has been awaiting a response for too long. When you have time to work with the maintainers to resolve this issue, please post a new comment and it will be re-opened. If the issue has been locked for editing by the time you return to it, please open a new issue and reference this one. Thank you for taking the time to improve LightGBM!
This issue has been automatically locked since there has not been any recent activity since it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues including a reference to this one.
I want to use a custom loss function for LGBMRegressor, but I can't find any documentation on it. If I understand correctly, I need to use the params 'objective' and 'metric' to completely change the loss function in training and evaluation: the function for 'objective' returns (grad, hess), and the function for 'metric' returns ('<loss_name>', loss, uses_max). I am searching for the two functions that are used when the default objective 'regression' (l2 loss) is in effect, so that I can reproduce and then modify them. I already found the C++ code for the regression objective, but I am unable to reproduce it using two custom functions written in Python.
This would be my approach:
Does someone know how to reproduce the l2 loss in Python?
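A hedged sketch of two plain Python functions that reproduce the built-in l2 behavior described in the question. To my understanding, LightGBM's C++ `"regression"` objective computes grad = prediction - label and hess = 1 per sample; the names `l2_objective` and `l2_metric` are illustrative:

```python
import numpy as np

def l2_objective(y_true, y_pred):
    # Derivatives of 0.5 * (pred - label)^2: grad = pred - label, hess = 1.
    grad = y_pred - y_true
    hess = np.ones_like(y_pred)
    return grad, hess

def l2_metric(y_true, y_pred):
    # ('<loss_name>', loss, uses_max) as described in the question;
    # lower l2 is better, so the last element is False.
    return "l2", float(np.mean((y_pred - y_true) ** 2)), False

# Tiny worked example to sanity-check the derivatives and the metric.
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.5, 1.5, 3.0])
grad, hess = l2_objective(y_true, y_pred)      # grad = [0.5, -0.5, 0.0]
name, value, uses_max = l2_metric(y_true, y_pred)
```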