perpetual
A self-generalizing gradient boosting machine which doesn't need hyperparameter optimization
PerpetualBooster is a gradient boosting machine (GBM) algorithm that, unlike other GBM algorithms, does not need hyperparameter optimization. Similar to AutoML libraries, it has a single budget parameter: increasing the budget increases the predictive power of the algorithm and gives better results on unseen data.
Summary
A self-generalizing gradient boosting machine which doesn't need hyperparameter optimization
Last Updated
Mar 3, 2025 at 14:58
License
GPL-3.0-or-later
Total Downloads
5.4K
Supported Platforms
GitHub Repository
https://github.com/perpetual-ml/perpetual
Documentation
https://perpetual-ml.github.io/perpetual/