XGBoost
| XGBoost | |
|---|---|
| Developer(s) | The XGBoost Contributors |
| Initial release | March 27, 2014 |
| Stable release | 3.0.0 / 15 March 2025 |
| Repository | |
| Written in | C++ |
| Operating system | Linux, macOS, Microsoft Windows |
| Type | Machine learning |
| License | Apache License 2.0 |
| Website | xgboost |
XGBoost (eXtreme Gradient Boosting) is an open-source software library that provides a regularizing gradient boosting framework for C++, Java, Python, R, Julia, Perl, and Scala. It works on Linux, Microsoft Windows, and macOS. From the project description, it aims to provide a "Scalable, Portable and Distributed Gradient Boosting (GBM, GBRT, GBDT) Library". It runs on a single machine, as well as on the distributed processing frameworks Apache Hadoop, Apache Spark, Apache Flink, and Dask.
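The gradient boosting idea behind libraries like XGBoost can be sketched in plain Python: each new weak learner is fit to the residuals (the negative gradient of the squared loss) of the current ensemble, and its predictions are added with a shrinkage factor. This is a simplified illustration of the general GBM technique using decision stumps, not XGBoost's actual regularized, sparsity-aware tree learner; all function names here are hypothetical.

```python
import numpy as np

def fit_stump(x, residual):
    """Fit a one-split regression stump minimizing squared error on the residual."""
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= t, left.mean(), right.mean())
        err = ((residual - pred) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda xs: np.where(xs <= t, lv, rv)

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    """Boost stumps on squared loss: each round fits the current residuals."""
    base = y.mean()
    pred = np.full_like(y, base, dtype=float)
    stumps = []
    for _ in range(n_rounds):
        residual = y - pred          # negative gradient of 1/2 * (y - pred)^2
        stump = fit_stump(x, residual)
        pred += lr * stump(x)        # shrinkage damps each stump's contribution
        stumps.append(stump)
    return lambda xs: base + lr * sum(s(xs) for s in stumps)

# Toy 1-D regression: the ensemble learns y = x^2 from stumps alone.
x = np.linspace(-1, 1, 50)
y = x ** 2
predict = gradient_boost(x, y)
```

After 50 rounds the ensemble's mean absolute error on this toy problem is far below that of the constant baseline, which is the behavior that deeper, regularized trees and second-order gradients make efficient at scale in XGBoost itself.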
XGBoost gained much popularity and attention in the mid-2010s as the algorithm of choice for many winning teams of machine learning competitions.