
Commit

Fixed boosting binomial loss (neoml-lib#1019)
* added test to branch

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* added rca22824

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* Build Framework (NeoML-master 2.0.207.0): Incrementing version number.

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* Linux counters (neoml-lib#1003)

* Revert "Build Framework (NeoML-master 2.0.204.0): Incrementing version number."

This reverts commit 9a29b52.

* added test to branch

* added rca22824

* Linux counters modified

* fixing Linux counters

* small change

* another counter change

* added flag for Linux counter performance

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* added flag for Linux counter performance

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* added flag for Linux counter performance

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* added flag for Linux counter performance

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* added flag for Linux counter performance

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* fixed adding flag to performance counter

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* another counting fix

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* added flag for Linux counters

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

---------

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>
Co-authored-by: daniyalaliev <daniial.aliev@abbyy.com>
Co-authored-by: Valeriy Fedyunin <valery.fedyunin@abbyy.com>
Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* Build Framework (NeoML-master 2.0.208.0): Incrementing version number.

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* Reduced CUDA dropout memory usage (neoml-lib#1005)

* Revert "Build Framework (NeoML-master 2.0.204.0): Incrementing version number."

This reverts commit 9a29b52.

* added test to branch

* added rca22824

* started

* revert some changes

* added some new changes

* reducing dropout memory usage on CUDA

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* design changes have been made

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

---------

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>
Co-authored-by: daniyalaliev <daniial.aliev@abbyy.com>
Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* [CudaMathEngine] Fix warp-size iterations in max reducing functions (neoml-lib#1010)

Signed-off-by: Kirill Golikov <kirill.golikov@abbyy.com>
Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* [NeoML] Add min-max gradient clipping (neoml-lib#1009)

Signed-off-by: Kirill Golikov <kirill.golikov@abbyy.com>
Co-authored-by: Valeriy Fedyunin <valery.fedyunin@abbyy.com>
Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* [CudaMathEngine] Fix restrict modifier for function arguments (neoml-lib#1011)

Signed-off-by: Kirill Golikov <kirill.golikov@abbyy.com>
Co-authored-by: Valeriy Fedyunin <valery.fedyunin@abbyy.com>
Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* Fix CImageToPixelLayer::SetImageHeight/Width (neoml-lib#1012)

Signed-off-by: Valerii Fediunin <valery.fedyunin@abbyy.com>
Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* Build Framework (NeoML-master 2.0.209.0): Incrementing version number.

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* [CudaMathEngine] Fix shared buffers initializations and expf calls (neoml-lib#1004)

Signed-off-by: Kirill Golikov <kirill.golikov@abbyy.com>
Co-authored-by: Valeriy Fedyunin <valery.fedyunin@abbyy.com>
Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* update fineobj to 15 47 (neoml-lib#1013)

* update fineobj to 15 47

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>

* Upgrade NeoMLTest

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>

* Call set_global_variables before include(FineInstall)

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>

* Remove unused variable

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>

* Fix protobuf compilation errors on Darwin

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>

* Remove invalid unicode symbols from copyright (+update)

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>

* Remove unused variable

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>

* Update NeoMLTest version

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>

* Switch to next NeoMLTest

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>

* Update NeoMLTest

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>

---------

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>
Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* Build Framework (NeoML-master 2.0.210.0): Incrementing version number.

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* Fix iOS CMake toolchain for newer versions of CMake (neoml-lib#1016)

Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>
Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* [NeoML] Transfer CDnnBlob data in threads pools (neoml-lib#1014)

Signed-off-by: Kirill Golikov <kirill.golikov@abbyy.com>
Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

* fixed the boosting binomial loss

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>

---------

Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>
Signed-off-by: Kirill Golikov <kirill.golikov@abbyy.com>
Signed-off-by: Valerii Fediunin <valery.fedyunin@abbyy.com>
Signed-off-by: Valery Fedyunin <valery.fedyunin@abbyy.com>
Co-authored-by: buildtech <buildtech@abbyy.com>
Co-authored-by: daniyalaliev <daniial.aliev@abbyy.com>
Co-authored-by: Valeriy Fedyunin <valery.fedyunin@abbyy.com>
Co-authored-by: Kirill Golikov <kirill.golikov@abbyy.com>
Co-authored-by: NeoML-maintainer <65914319+NeoML-maintainer@users.noreply.github.com>
Signed-off-by: daniyalaliev <daniial.aliev@abbyy.com>
6 people committed Mar 1, 2024
1 parent bc3593d · commit 736c729
Showing 3 changed files with 3 additions and 3 deletions.
In the gradient boosting documentation (English; the corrected formula is checked right after this hunk):

@@ -49,7 +49,7 @@ Note that the *L1RegFactor*, *L2RegFactor*, *PruneCriterionValue* parameters are
 The following loss functions are supported:

 - *LF_Exponential*[classification only] exponential loss function: `L(x, y) = exp(-(2y - 1) * x)`;
-- *LF_Binomial*[classification only] binomial loss function: `L(x, y) = ln(1 + exp(-x)) - x * y`;
+- *LF_Binomial*[classification only] binomial loss function: `L(x, y) = ln(1 + exp(-x)) + x * (1 - y)`;
 - *LF_SquaredHinge*[classification only] smoothed square hinge: `L(x, y) = max(0, 1 - (2y - 1)* x) ^ 2`;
 - *LF_L2* — quadratic loss function: `L(x, y) = (y - x)^2 / 2`.
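For reference, a short check (not part of the commit) that the corrected formula is the negative Bernoulli log-likelihood; it assumes the label convention implied by the other formulas above, i.e. y ∈ {0, 1} and x the raw model prediction. With p = \sigma(x) = 1 / (1 + e^{-x}):

\begin{aligned}
-\bigl[\, y \ln p + (1 - y) \ln(1 - p) \,\bigr]
  &= y \ln\bigl(1 + e^{-x}\bigr) + (1 - y)\bigl(x + \ln\bigl(1 + e^{-x}\bigr)\bigr) \\
  &= \ln\bigl(1 + e^{-x}\bigr) + x\,(1 - y)
\end{aligned}

which is exactly the corrected `LF_Binomial` expression. The old `ln(1 + exp(-x)) - x * y` is missing the `+ x` contribution coming from the `(1 - y) * ln(1 - p)` term; the familiar log-loss form `ln(1 + exp(x)) - x * y` (positive exponent) expands to the same corrected expression.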
The same change in the Russian version of the documentation:

@@ -51,7 +51,7 @@
 The following loss functions can be used when building the model:

 - *LF_Exponential*[classification only] exponential loss function: `L(x, y) = exp(-(2y - 1) * x)`;
-- *LF_Binomial*[classification only] binomial loss function: `L(x, y) = ln(1 + exp(-x)) - x * y`;
+- *LF_Binomial*[classification only] binomial loss function: `L(x, y) = ln(1 + exp(-x)) + x * (1 - y)`;
 - *LF_SquaredHinge*[classification only] smoothed squared hinge: `L(x, y) = max(0, 1 - (2y - 1)* x) ^ 2`;
 - *LF_L2* — quadratic loss function: `L(x, y) = (y - x)^2 / 2`.
2 changes: 1 addition & 1 deletion NeoML/src/TraditionalML/GradientBoost.cpp
Original file line number Diff line number Diff line change
Expand Up @@ -90,7 +90,7 @@ double CGradientBoostingBinomialLossFunction::CalcLossMean( const CArray< CArray
for( int i = 0; i < predicts.Size(); ++i ) {
double sum = 0;
for( int j = 0; j < predicts[i].Size(); ++j ) {
sum += log1p( exp( min( -predicts[i][j], MaxExpArgument ) ) ) - predicts[i][j] * answers[i][j];
sum += log1p( exp( min( -predicts[i][j], MaxExpArgument ) ) ) - predicts[i][j] * answers[i][j] + predicts[i][j];
}
overallSum += getMean( sum, predicts[i].Size() );
}
Expand Down
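Below is a minimal standalone sanity check (not part of the commit) comparing the corrected per-sample term against the negative Bernoulli log-likelihood computed directly from the sigmoid probability. `binomialLossTerm`, `referenceNll`, the `maxExpArgument` default of 30.0, and the test values are illustrative assumptions, not NeoML API:

#include <algorithm>
#include <cmath>
#include <cstdio>

// Corrected per-sample binomial loss term, mirroring the fixed line in
// CGradientBoostingBinomialLossFunction::CalcLossMean (with the exp-argument clamp).
static double binomialLossTerm( double predict, double answer, double maxExpArgument = 30.0 )
{
	return std::log1p( std::exp( std::min( -predict, maxExpArgument ) ) )
		- predict * answer + predict;
}

// Reference value: negative Bernoulli log-likelihood with p = sigmoid( predict ).
static double referenceNll( double predict, double answer )
{
	const double p = 1.0 / ( 1.0 + std::exp( -predict ) );
	return -( answer * std::log( p ) + ( 1.0 - answer ) * std::log( 1.0 - p ) );
}

int main()
{
	const double predicts[] = { -2.0, -0.5, 0.0, 0.7, 3.0 };
	const double answers[] = { 0.0, 1.0, 1.0, 0.0, 1.0 };
	for( int i = 0; i < 5; ++i ) {
		// The two columns should agree up to floating-point rounding.
		std::printf( "x=%+.1f y=%.0f  fixed=%.6f  reference=%.6f\n",
			predicts[i], answers[i], binomialLossTerm( predicts[i], answers[i] ),
			referenceNll( predicts[i], answers[i] ) );
	}
	return 0;
}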
