catboost v0.16 Release Notes
Release Date: 2019-07-24
Breaking changes:
- `MultiClass` loss now has the same sign as `Logloss`. It had the opposite sign before and was maximized; now it is minimized.
- `CatBoostRegressor.score` now returns the value of the R2 metric instead of RMSE, for consistency with the behavior of scikit-learn regressors.
- Changed the default value of the metric parameter `use_weights` to `false` (except for ranking metrics).
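To make the `score` change concrete, here is a minimal, dependency-free sketch of the two quantities involved: the R2 value that `CatBoostRegressor.score` now returns (higher is better, 1.0 is perfect) versus the RMSE it returned before (lower is better). The helper functions are written here for illustration; they are not part of the catboost API.

```python
def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

def rmse(y_true, y_pred):
    """Root mean squared error."""
    n = len(y_true)
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n) ** 0.5

y_true = [3.0, -0.5, 2.0, 7.0]
y_pred = [2.5, 0.0, 2.0, 8.0]
print(r2_score(y_true, y_pred))  # ~0.9486, higher is better
print(rmse(y_true, y_pred))      # ~0.6124, lower is better
```

Note that the two metrics point in opposite directions, which is exactly why code that compared `score` values across model versions needs updating after this release.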
New features:
- It is now possible to apply the model on GPU.
- We have published two new real-world datasets with monotonic constraints, `catboost.datasets.monotonic1()` and `catboost.datasets.monotonic2()`. Previously, the only open-source dataset with monotonic constraints was `california_housing`; you can now use these two to benchmark algorithms with monotonic constraints.
- We've added several new metrics to catboost, including `DCG`, `FairLoss`, `HammingLoss`, `NormalizedGini` and `FilteredNDCG`.
- Introduced efficient `GridSearch` and `RandomSearch` implementations.
- The `get_all_params()` Python function returns the values of all training parameters, both user-defined and default.
- Added more synonyms for training parameters for better compatibility with other GBDT libraries.
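Among the new metrics, `DCG` is worth a brief illustration. The sketch below implements the common log2-discount form of DCG and its normalized variant NDCG; note that CatBoost supports several DCG parameterizations (different gain and discount types), so treat this as an illustration of the idea rather than a reproduction of CatBoost's internal computation.

```python
import math

def dcg(relevances):
    """Discounted cumulative gain in its common log2-discount form:
    sum over positions i (1-based) of rel_i / log2(i + 1)."""
    return sum(rel / math.log2(pos + 1)
               for pos, rel in enumerate(relevances, start=1))

def ndcg(relevances):
    """DCG normalized by the DCG of the ideal (descending) ordering."""
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal else 0.0

ranked = [3, 2, 3, 0, 1]  # relevance labels in the model's predicted order
print(dcg(ranked))   # ~6.149
print(ndcg(ranked))  # ~0.972
```

The log2 discount means a relevant document at position 1 contributes its full relevance, while the same document further down the ranking contributes progressively less.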
Speedups:
- The AUC metric is computationally expensive. We've implemented a parallelized calculation of this metric; it is now about 4x faster and can be calculated on every iteration (or every k-th iteration).
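The release notes don't describe the implementation, but the cost structure of AUC is easy to see: the naive all-pairs formulation is O(n^2), while the rank-sum (Mann-Whitney U) formulation sketched below is O(n log n), dominated by a sort. This is the kind of per-iteration work that CatBoost's parallelized calculation speeds up; the code here is an independent illustration, not CatBoost's code.

```python
def roc_auc(labels, scores):
    """ROC AUC via the rank-sum formulation:
    AUC = (rank_sum_of_positives - n_pos*(n_pos+1)/2) / (n_pos * n_neg),
    with average ranks assigned to tied scores."""
    n = len(scores)
    order = sorted(range(n), key=lambda i: scores[i])  # O(n log n) sort
    ranks = [0.0] * n
    i = 0
    while i < n:
        # Find the run of tied scores and give all of them the average rank.
        j = i
        while j + 1 < n and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # 1-based average rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    n_pos = sum(labels)
    n_neg = n - n_pos
    rank_sum = sum(r for r, y in zip(ranks, labels) if y == 1)
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```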
Educational materials:
- We've improved our command-line tutorial; it now includes example files and more information.
Fixes:
- Automatic `Logloss` or `MultiClass` loss function deduction for `CatBoostClassifier.fit` now also works if the training dataset is specified as a `Pool` or a filename string.
- And some other fixes
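The gist of that deduction fix can be sketched as follows. This is a hypothetical illustration of the rule (two distinct target values suggest binary `Logloss`, more suggest `MultiClass`), not CatBoost's actual internal logic, and `deduce_loss_function` is an invented name. The fix in this release means the labels can now be inspected regardless of whether they arrive as arrays, a `Pool`, or a file on disk.

```python
def deduce_loss_function(labels):
    """Hypothetical sketch: pick a classification loss from the targets.
    Two or fewer distinct labels -> binary Logloss; more -> MultiClass.
    (Illustrative only -- not CatBoost's internal implementation.)"""
    n_classes = len(set(labels))
    return "Logloss" if n_classes <= 2 else "MultiClass"

print(deduce_loss_function([0, 1, 1, 0]))     # Logloss
print(deduce_loss_function([0, 1, 2, 1, 0]))  # MultiClass
```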