
Implements the Hierarchical Incremental GRAdient Descent (HiGrad) algorithm, a first-order method for finding the minimizer of a function in online learning, in the same spirit as stochastic gradient descent (SGD). In addition to a point estimate, HiGrad attaches a confidence interval to quantify the uncertainty of its estimates. See Su and Zhu (2018) <arXiv:1802.04876> for details.
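
The rough idea behind HiGrad is to run SGD along a tree: a shared initial segment of iterations, followed by several independent threads whose final iterates are combined into a point estimate and a Student-t confidence interval. The sketch below is a deliberately simplified, hypothetical illustration of that idea in Python (not the higrad package API, and not the exact weighting scheme of the paper), using squared-error loss to estimate a streaming mean with one split into K threads.

```python
# Simplified sketch of the HiGrad idea (Su & Zhu, 2018); assumptions: one split
# into K threads, squared-error loss for a streaming mean, and a plain
# Student-t interval across the thread averages.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def sgd_segment(theta, data_stream, step0, start_iter):
    """Run plain SGD on the loss f(theta) = E[(x - theta)^2] / 2."""
    for i, x in enumerate(data_stream, start=start_iter):
        eta = step0 / (i ** 0.55)          # slowly decaying step size
        theta = theta - eta * (theta - x)  # stochastic gradient at theta
    return theta

true_mean, n0, K, n1 = 3.0, 500, 4, 500

# Shared initial segment of SGD.
theta0 = sgd_segment(0.0, rng.normal(true_mean, 1.0, n0), step0=0.5, start_iter=1)

# Split into K independent threads, each continuing SGD from the shared iterate.
leaves = np.array([
    sgd_segment(theta0, rng.normal(true_mean, 1.0, n1), step0=0.5, start_iter=n0 + 1)
    for _ in range(K)
])

# Point estimate: average of the leaves; CI: t-interval from their spread.
est = leaves.mean()
se = leaves.std(ddof=1) / np.sqrt(K)
half = stats.t.ppf(0.975, df=K - 1) * se
print(f"estimate = {est:.3f}, 95% CI = [{est - half:.3f}, {est + half:.3f}]")
```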

