perpetual

A self-generalizing gradient boosting machine which doesn't need hyperparameter optimization

Installation

To install this package, run the following command:

Conda
conda install sfe1ed40::perpetual

Description

PerpetualBooster is a gradient boosting machine (GBM) algorithm that, unlike other GBM algorithms, does not require hyperparameter optimization. Similar to AutoML libraries, it exposes a single budget parameter: increasing the budget increases the predictive power of the algorithm and gives better results on unseen data.
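
As a minimal Python sketch of what this looks like in practice (assuming the package's Python API follows the project's documented pattern, with a PerpetualBooster class whose fit method accepts a budget argument; scikit-learn is used here only to supply sample data and is not part of this package):

from sklearn.datasets import fetch_california_housing
from perpetual import PerpetualBooster

# Load a sample regression dataset (scikit-learn used only for data).
X, y = fetch_california_housing(return_X_y=True)

# No learning rate, tree depth, or other hyperparameters to tune;
# the budget alone controls how much predictive power the model seeks.
model = PerpetualBooster(objective="SquaredLoss")
model.fit(X, y, budget=1.0)  # a larger budget gives better results on unseen data

predictions = model.predict(X)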

About

Summary

A self-generalizing gradient boosting machine which doesn't need hyperparameter optimization

Last Updated

Mar 3, 2025 at 14:58

License

GPL-3.0-or-later

Total Downloads

5.4K

Supported Platforms

linux-s390x
linux-64
osx-arm64
linux-aarch64
osx-64
win-64