r-gbm

An implementation of extensions to Freund and Schapire's AdaBoost algorithm and Friedman's gradient boosting machine. Includes regression methods for least squares, absolute loss, t-distribution loss, quantile regression, logistic, multinomial logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and Learning to Rank measures (LambdaMart). Originally developed by Greg Ridgeway.
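For context, a minimal usage sketch in R (assuming the package is already installed; the simulated data and tuning values such as `n.trees` and `shrinkage` are illustrative, not recommendations):

```r
library(gbm)

# Simulated regression data (illustrative only)
set.seed(1)
d <- data.frame(x1 = runif(200), x2 = runif(200))
d$y <- 2 * d$x1 + sin(2 * pi * d$x2) + rnorm(200, sd = 0.1)

# Fit a gradient boosting machine with squared-error loss
fit <- gbm(y ~ x1 + x2, data = d,
           distribution = "gaussian",   # least-squares regression
           n.trees = 500, interaction.depth = 2,
           shrinkage = 0.05, verbose = FALSE)

# Predict using all 500 trees
pred <- predict(fit, newdata = d, n.trees = 500)
```

Other losses listed above are selected the same way via `distribution`, e.g. `"laplace"` for absolute loss, `"poisson"`, or `"coxph"`.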

Installation

To install this package, run the following command:

Conda:
$ conda install conda-forge::r-gbm

Versions

2.2.3
2.2.2
2.1.9
2.1.8.1
2.1.8

About

Last Updated

Jan 24, 2026 at 13:36

License

GPL-2.0-or-later

Total Downloads

169.8K

Version Downloads

546

Supported Platforms

linux-aarch64
linux-ppc64le
osx-64
win-64
linux-64