r-gbm

An implementation of extensions to Freund and Schapire's AdaBoost algorithm and Friedman's gradient boosting machine. Includes regression methods for least squares, absolute loss, t-distribution loss, quantile regression, logistic, multinomial logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and Learning to Rank measures (LambdaMart). Originally developed by Greg Ridgeway.
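In practice, the loss function is chosen through the distribution argument of gbm(). The following is a minimal, illustrative sketch, not taken from this page: it fits a Bernoulli (logistic) boosted model on simulated data; the column names and tuning values are arbitrary assumptions.

    # Minimal sketch (assumptions: simulated data, arbitrary tuning values).
    library(gbm)

    set.seed(1)
    n  <- 500
    df <- data.frame(x1 = rnorm(n), x2 = rnorm(n))
    df$y <- rbinom(n, 1, plogis(1.5 * df$x1 - df$x2))

    # distribution selects the loss, e.g. "gaussian", "laplace", "tdist",
    # "quantile", "poisson", "coxph", "adaboost", "huberized", "pairwise".
    fit <- gbm(
      y ~ x1 + x2,
      data              = df,
      distribution      = "bernoulli",
      n.trees           = 200,
      interaction.depth = 2,
      shrinkage         = 0.05,
      cv.folds          = 3
    )

    best <- gbm.perf(fit, method = "cv")  # number of trees chosen by cross-validation
    p    <- predict(fit, newdata = df, n.trees = best, type = "response")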

Installation

To install this package with conda, run:

    conda install conda-forge::r-gbm
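
Once installed into an active conda environment, the package should load from R as usual; a quick sanity check, assuming the environment's R is the one on the path:

    library(gbm)
    packageVersion("gbm")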

Versions

Recent versions (5 of 8 shown): 2.2.2, 2.1.9, 2.1.8.1, 2.1.8, 2.1.5

About

Last Updated

Jul 19, 2024 at 06:42

License

GPL-2.0-or-later

Total Downloads

159.0K

Supported Platforms

linux-aarch64
linux-ppc64le
osx-64
win-64
linux-64