Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on a single machine, Hadoop, Spark, Flink and Dataflow

Installers

conda install

  • linux-ppc64le  v1.5.0
  • linux-64  v1.5.0
  • win-64  v1.5.0
  • osx-64  v1.5.0
To install this package with conda run:
conda install -c anaconda libxgboost

Description

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can solve problems with billions of examples.


© 2022 Anaconda, Inc. All Rights Reserved.