Description
CatBoost is a machine learning method based on gradient boosting over decision trees.
Main advantages of CatBoost:
* Categorical features support
* Reduced overfitting
* User-friendly API
* Fast model inference
CatBoost alternatives and similar libraries
* mxnet: Lightweight, portable, flexible distributed/mobile deep learning with a dynamic, mutation-aware dataflow dependency scheduler; for Python, R, Julia, Scala, Go, JavaScript and more.
* xgboost: Scalable, portable and distributed gradient boosting (GBDT, GBRT or GBM) library, for Python, R, Java, Scala, C++ and more. Runs on a single machine, Hadoop, Spark, Flink and DataFlow. [Apache2]
* Dlib: A general-purpose cross-platform C++ library designed using contract programming and modern C++ techniques. [Boost]
* Fido: A highly modular C++ machine learning library for embedded electronics and robotics. [MIT]
* Recommender: A C library for product recommendations/suggestions using collaborative filtering (CF). [BSD]
* sofia-ml: A suite of fast incremental algorithms for machine learning. [Apache2]
README
Website | Documentation | Tutorials | Installation | Release Notes
CatBoost is a machine learning method based on gradient boosting over decision trees.
Main advantages of CatBoost:
- Superior quality when compared with other GBDT libraries on many datasets.
- Best in class prediction speed.
- Support for both numerical and categorical features.
- Fast GPU and multi-GPU support for training out of the box.
- Visualization tools included.
Get Started and Documentation
All CatBoost documentation is available here.
Install CatBoost by following the installation guide.
Next you may want to investigate:
- Tutorials
- Training modes and metrics
- Cross-validation
- Parameters tuning
- Feature importance calculation
- Regular and staged predictions
If you cannot open the documentation in your browser, try adding yastatic.net and yastat.net to the list of allowed domains in Privacy Badger.
CatBoost models in production
If you want to evaluate a CatBoost model in your application, read the model API documentation.
Questions and bug reports
- For reporting bugs please use the catboost/bugreport page.
- Ask a question on Stack Overflow with the catboost tag; we monitor it for new questions.
- Seek prompt advice in the Telegram group or the Russian-speaking Telegram chat.
Help to Make CatBoost Better
- Check out open problems and help-wanted issues to see what can be improved, or open an issue if something is missing.
- Add your stories and experience to [Awesome CatBoost](AWESOME.md).
- To contribute to CatBoost, first read the CLA text and state in your pull request that you agree to its terms. More information can be found in CONTRIBUTING.md.
- Instructions for contributors can be found here.
News
The latest news is published on Twitter.
Reference Paper
Anna Veronika Dorogush, Andrey Gulin, Gleb Gusev, Nikita Kazeev, Liudmila Ostroumova Prokhorenkova, Aleksandr Vorobev. "Fighting biases with dynamic boosting". arXiv:1706.09516, 2017.
Anna Veronika Dorogush, Vasily Ershov, Andrey Gulin. "CatBoost: gradient boosting with categorical features support". Workshop on ML Systems at NIPS 2017.
License
© YANDEX LLC, 2017-2019. Licensed under the Apache License, Version 2.0. See LICENSE file for more details.