autograd - efficiently computes derivatives of NumPy code.
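To make the idea concrete, here is a toy reverse-mode automatic differentiator in plain Python. It illustrates the technique autograd is built on, not autograd's actual internals: every arithmetic operation records how to propagate gradients back to its inputs.

```python
# Toy reverse-mode autodiff (a sketch of the idea behind autograd,
# not its implementation). Each Var remembers its parents and the
# local derivative of its value with respect to each parent.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # (input Var, local gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Accumulate d(output)/d(self), then push the chain rule
        # contribution seed * local down to each parent.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = x * x + x      # f(x) = x^2 + x
y.backward()
print(x.grad)      # df/dx = 2x + 1 = 7.0 at x = 3
```

Real systems like autograd build the same kind of graph behind NumPy's API, so ordinary-looking array code becomes differentiable.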
CatBoost - Yandex's open-source library for gradient boosting on decision trees, with out-of-the-box categorical feature support, for Python and R.
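The "categorical features out of the box" part rests on target statistics computed in an ordered fashion. The sketch below is a loose simplification of that idea, not CatBoost's actual algorithm: each row's category is encoded using only the labels of earlier rows, so a row never leaks its own label into its encoding.

```python
# Toy ordered target-statistic encoding for one categorical column
# (a simplification of the idea CatBoost uses, not its algorithm).

def ordered_target_encode(categories, targets, prior=0.5):
    counts, sums, encoded = {}, {}, []
    for cat, y in zip(categories, targets):
        n = counts.get(cat, 0)
        s = sums.get(cat, 0.0)
        # Smoothed mean of the targets seen so far for this category.
        encoded.append((s + prior) / (n + 1))
        counts[cat] = n + 1
        sums[cat] = s + y
    return encoded

cats = ["a", "a", "b", "a"]
ys   = [1, 0, 1, 1]
print(ordered_target_encode(cats, ys))
# First "a" has no history, so it gets only the prior: (0 + 0.5) / 1 = 0.5
```

Because the encoding depends on row order, CatBoost averages over several random permutations; this toy uses a single fixed order.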
LightGBM - A fast, distributed, high-performance gradient boosting (GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks. It is under the umbrella of Microsoft's DMTK project.
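For readers new to GBDT, the core loop is easy to state: each round fits a small tree to the residuals of the current ensemble. The sketch below uses depth-1 trees (stumps) and squared error; it shows the boosting idea only and has nothing of LightGBM's histogram-based, leaf-wise implementation.

```python
# Minimal gradient boosting with decision stumps on squared error
# (a conceptual sketch of GBDT, not how LightGBM works internally).

def fit_stump(xs, residuals):
    # Try every split point; each leaf predicts the mean residual.
    best = None
    for threshold in xs:
        left  = [r for x, r in zip(xs, residuals) if x < threshold]
        right = [r for x, r in zip(xs, residuals) if x >= threshold]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, threshold, lmean, rmean)
    _, threshold, lmean, rmean = best
    return lambda x: lmean if x < threshold else rmean

def gbdt_fit(xs, ys, rounds=20, lr=0.5):
    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(rounds):
        # For squared error, the negative gradient is the residual.
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

model = gbdt_fit([0.0, 1.0, 2.0, 3.0], [0.0, 0.0, 1.0, 1.0])
```

Real GBDT libraries replace the brute-force split search with histogram binning and add regularization, subsampling and deeper trees, but the residual-fitting loop is the same.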
Stochastic Gradient Descent (version 2) - Leon Bottou's implementation of stochastic gradient descent.
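The technique itself fits in a few lines. Here is a minimal SGD loop for least-squares linear regression in plain Python; Bottou's package targets linear SVMs and CRFs and is far more refined (learning-rate schedules, averaging), so treat this only as an illustration of the update rule.

```python
# Minimal stochastic gradient descent for 1-D linear regression:
# visit examples one at a time and step against the per-example
# gradient of the squared error.

import random

def sgd_linear(samples, lr=0.1, epochs=100, seed=0):
    rng = random.Random(seed)
    samples = list(samples)        # local copy so shuffling is private
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(samples)
        for x, y in samples:
            err = (w * x + b) - y  # prediction error on one example
            w -= lr * err * x      # gradient of 0.5 * err**2 w.r.t. w
            b -= lr * err          # gradient of 0.5 * err**2 w.r.t. b
    return w, b

# Noiseless data from y = 2x + 1; SGD recovers the line.
data = [(x, 2.0 * x + 1.0) for x in [-1.0, 0.0, 1.0, 2.0]]
w, b = sgd_linear(data)
```

The appeal Bottou's work highlights is that each update touches one example, so the cost per step is independent of the dataset size.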
Torch-autograd - automatically differentiates native Torch code.
XGBoost - an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable.
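A piece of XGBoost that is small enough to show directly is the leaf-weight rule from its regularized objective: with gradient sum G, Hessian sum H and L2 penalty lambda over the examples in a leaf, the optimal leaf value is -G / (H + lambda). The helper below just evaluates that formula; the names are ours, not XGBoost's API.

```python
# The closed-form optimal leaf value from XGBoost's regularized
# second-order objective: w* = -G / (H + lambda).

def leaf_weight(grads, hessians, reg_lambda=1.0):
    G, H = sum(grads), sum(hessians)
    return -G / (H + reg_lambda)

# For squared error, grad = pred - y and hessian = 1 per example.
# With all predictions at 0 and targets [1.0, 3.0]:
print(leaf_weight([-1.0, -3.0], [1.0, 1.0]))   # 4 / 3, pulled toward 0 by lambda
```

Split gains in XGBoost are scored with the same G and H sums, which is why the library only needs per-example gradients and Hessians from the loss function.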
YellowFin - auto-tuning momentum SGD optimizer (can replace any standard optimizer in TensorFlow).
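What YellowFin auto-tunes are the two hyperparameters of classic momentum (heavy-ball) SGD: the learning rate and the momentum coefficient. The update rule itself, shown below on a toy 1-D quadratic with hand-picked constants, is unchanged; YellowFin's contribution is choosing `lr` and `mu` on the fly, which this sketch does not attempt.

```python
# Classic momentum SGD (the update YellowFin tunes), minimizing
# f(w) = (w - 3)^2 with fixed, hand-picked lr and mu.

def momentum_sgd(grad, w=0.0, lr=0.05, mu=0.9, steps=200):
    v = 0.0
    for _ in range(steps):
        v = mu * v - lr * grad(w)   # velocity accumulates past gradients
        w = w + v
    return w

w = momentum_sgd(lambda w: 2.0 * (w - 3.0))   # converges toward 3
```

Momentum damps oscillations and speeds travel along shallow directions, but only when `lr` and `mu` suit the local curvature, which is exactly the tuning burden YellowFin is meant to remove.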