XGBoost

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, MPI, Dask) and can solve problems beyond billions of examples.

Logo

../_images/dmlc_xgboost-small.png

Website

https://xgboost.ai/

Repository

https://github.com/dmlc/xgboost

Byline

Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Flink and DataFlow.

License

Apache 2.0

Project age

7 years 4 months

Backers

AWS (sponsor), Distributed (Deep) Machine Learning Community @ University of Washington (creator and maintainer), Intel (sponsor), NVIDIA (sponsor)

Latest News (2021-04-11)

Release 1.4.0 stable. Release includes a pre-built binary package for R with GPU support, as well as improvements on prediction …

Size score (1 to 10, higher is better)

5.75

Trend score (1 to 10, higher is better)

3.75

Education Resources

URL                                                  Resource Type   Description
https://xgboost.readthedocs.io/en/latest/index.html  Documentation   Official project documentation.

Git Commit Statistics

Statistics computed using Git data through May 31, 2021.

Statistic           Lifetime    Last 12 Months
Commits             34,817      10,913
Lines committed     7,301,502   2,349,350
Unique committers   526         83
Core committers     11          6
../_images/dmlc_xgboost-monthly-commits.png

Similar Projects

Project    Size Score   Trend Score   Byline
CatBoost   9.25         7.75         A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for ranking, classification, regression and other machine learning tasks for Python, R, Java, C++. Supports computation on CPU and GPU.