Description
CatBoost is a machine learning method based on gradient boosting over decision trees.
Main advantages of CatBoost:
* Categorical features support (see the usage sketch below)
* Reduced overfitting
* User-friendly API interface
* Fast model inference
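As a quick illustration of the categorical-features support listed above, here is a minimal sketch of training a classifier; the toy data, column indices, and parameter values are illustrative and not taken from the CatBoost documentation:

```python
# Minimal sketch: CatBoost consumes string-valued categorical columns directly
# once their indices are passed via `cat_features` -- no manual encoding needed.
from catboost import CatBoostClassifier

X_train = [["sunny", 25, "weekend"],
           ["rainy", 18, "weekday"],
           ["sunny", 30, "weekday"],
           ["cloudy", 22, "weekend"]]
y_train = [1, 0, 1, 0]

model = CatBoostClassifier(iterations=100, learning_rate=0.1, verbose=False)
model.fit(X_train, y_train, cat_features=[0, 2])   # columns 0 and 2 are categorical
print(model.predict([["rainy", 20, "weekend"]]))
```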
catboost alternatives and similar libraries
Based on the "Machine Learning" category.
Alternatively, view catboost alternatives based on common mentions on social networks and blogs.
- xgboost - Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on a single machine, Hadoop, Spark, Dask, Flink and DataFlow.
- mxnet - DISCONTINUED. Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Scala, Go, Javascript and more.
- Caffe2 - DISCONTINUED. A lightweight, modular, and scalable deep learning framework. [Apache2]
- vowpal_wabbit - Vowpal Wabbit is a machine learning system which pushes the frontier of machine learning with techniques such as online, hashing, allreduce, reductions, learning2search, active, and interactive learning.
- RNNLIB - RNNLIB is a recurrent neural network library for sequence learning problems. Forked from Alex Graves' work at http://sourceforge.net/projects/rnnl/
- OpenHotspot - DISCONTINUED. OpenHotspot is a machine learning, crime analysis framework written in C++11.
README
Website | Documentation | Tutorials | Installation | Release Notes
CatBoost is a machine learning method based on gradient boosting over decision trees.
Main advantages of CatBoost:
- Superior quality when compared with other GBDT libraries on many datasets.
- Best-in-class prediction speed.
- Support for both numerical and categorical features.
- Fast GPU and multi-GPU support for training out of the box (see the GPU sketch after this list).
- Visualization tools included.
- Fast and reproducible distributed training with Apache Spark and CLI.
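The GPU support mentioned in the list above is enabled through training parameters. A minimal sketch, assuming a CUDA-capable device is available; parameter values are illustrative:

```python
# Sketch: switch training from CPU to GPU via the task_type parameter.
from catboost import CatBoostRegressor

model = CatBoostRegressor(
    iterations=500,
    task_type="GPU",   # use the GPU training implementation
    devices="0",       # GPU device id(s); see the docs for multi-GPU formats
    verbose=100,
)
# model.fit(X_train, y_train)   # X_train / y_train are placeholders for your data
```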
Get Started and Documentation
All CatBoost documentation is available here.
Install CatBoost by following the installation guide for the Python package, the R package, or the command-line version.
Next you may want to investigate:
- Tutorials
- Training modes and metrics
- Cross-validation (see the sketch after this list)
- Parameters tuning
- Feature importance calculation
- Regular and staged predictions
- CatBoost for Apache Spark videos: Introduction and Architecture
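For the cross-validation and feature-importance items above, here is a short sketch of the Python API; the toy pool and parameter values are illustrative:

```python
# Sketch: built-in cross-validation and feature-importance calculation.
from catboost import CatBoostClassifier, Pool, cv

X = [["a", 1.0], ["b", 2.0], ["a", 3.0], ["c", 4.0], ["b", 5.0], ["c", 6.0]]
y = [0, 1, 0, 1, 0, 1]
pool = Pool(X, y, cat_features=[0])          # column 0 is categorical

# 3-fold CV; returns a DataFrame of per-iteration train/test metric values.
cv_results = cv(pool,
                params={"iterations": 50, "loss_function": "Logloss", "verbose": False},
                fold_count=3)
print(cv_results.tail(1))

# Per-feature importance from a fitted model.
model = CatBoostClassifier(iterations=50, verbose=False)
model.fit(pool)
print(model.get_feature_importance(pool))
```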
If you cannot open the documentation in your browser, try adding yastatic.net and yastat.net to the list of allowed domains in your Privacy Badger.
CatBoost models in production
If you want to evaluate a CatBoost model in your application, read the model API documentation.
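For instance, here is a Python-side sketch of exporting a trained model and applying it later in a serving process; the file name and toy data are illustrative:

```python
# Sketch: save a trained model in CatBoost's native binary format and
# load it again in the application that serves predictions.
from catboost import CatBoostClassifier

model = CatBoostClassifier(iterations=50, verbose=False)
model.fit([["a", 1.0], ["b", 2.0], ["a", 3.0], ["b", 4.0]], [0, 1, 0, 1], cat_features=[0])
model.save_model("model.cbm")                  # native CatBoost binary format

loaded = CatBoostClassifier()
loaded.load_model("model.cbm")
print(loaded.predict_proba([["b", 2.5]]))      # class probabilities for one object
```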
Questions and bug reports
- For reporting bugs, please use the catboost/bugreport page.
- Ask a question on Stack Overflow with the catboost tag; we monitor it for new questions.
- Seek prompt advice in the Telegram group or the Russian-speaking Telegram chat.
Help to Make CatBoost Better
- Check out the open problems and "help wanted" issues to see what can be improved, or open an issue to request something.
- Add your stories and experience to [Awesome CatBoost](AWESOME.md).
- To contribute to CatBoost you need to first read the CLA text and state in your pull request that you agree to its terms. More information can be found in CONTRIBUTING.md.
- Instructions for contributors can be found here.
News
The latest news is published on Twitter.
Reference Paper
Anna Veronika Dorogush, Andrey Gulin, Gleb Gusev, Nikita Kazeev, Liudmila Ostroumova Prokhorenkova, Aleksandr Vorobev. "Fighting biases with dynamic boosting". arXiv:1706.09516, 2017.
Anna Veronika Dorogush, Vasily Ershov, Andrey Gulin. "CatBoost: gradient boosting with categorical features support". Workshop on ML Systems at NIPS 2017.
License
© YANDEX LLC, 2017-2022. Licensed under the Apache License, Version 2.0. See LICENSE file for more details.
*Note that all licence references and agreements mentioned in the catboost README section above
are relevant to that project's source code only.