
An implementation of various learning algorithms based on gradient descent for dealing with regression tasks. The gradient descent variants are:

- Mini-Batch Gradient Descent (MBGD), an optimization that uses only part of the training data per update to reduce the computation load.
- Stochastic Gradient Descent (SGD), an optimization that uses a single randomly chosen data point per update to reduce the computation load drastically.
- Stochastic Average Gradient (SAG), an SGD-based algorithm that averages the stochastic gradient steps.
- Momentum Gradient Descent (MGD), an optimization that adds a momentum term to speed up gradient descent learning.
- Accelerated Gradient Descent (AGD), an optimization that accelerates gradient descent learning.
- Adagrad, a gradient-descent-based algorithm that accumulates past gradient information to perform adaptive learning.
- Adadelta, a gradient-descent-based algorithm that uses a Hessian approximation to perform adaptive learning.
- RMSprop, a gradient-descent-based algorithm that combines the adaptive-learning abilities of Adagrad and Adadelta.
- Adam, a gradient-descent-based algorithm that uses mean and variance moment estimates of the gradient to perform adaptive learning.
- Stochastic Variance Reduced Gradient (SVRG), an SGD-based optimization that accelerates convergence by reducing the variance of the gradient estimates.
- Semi Stochastic Gradient Descent (SSGD), an SGD-based algorithm that combines GD and SGD, choosing one of the gradients at a time, to accelerate convergence.
- Stochastic Recursive Gradient Algorithm (SARAH), an optimization algorithm similar to SVRG that accelerates convergence using accumulated stochastic gradient information.
- Stochastic Recursive Gradient Algorithm+ (SARAHPlus), a practical variant of SARAH that accelerates convergence and provides the possibility of earlier termination.
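As a concrete illustration of the mini-batch variant listed above, here is a minimal, self-contained R sketch of mini-batch gradient descent for linear regression. It does not use the r-graddescent API; the function name mbgd_lm and all parameter names are hypothetical and chosen only for this example.

# Minimal sketch of Mini-Batch Gradient Descent (MBGD) for linear regression.
# Illustrative and stand-alone; it does not call the gradDescent package,
# and the names below (mbgd_lm, alpha, batch_size, epochs) are hypothetical.
mbgd_lm <- function(X, y, alpha = 0.1, batch_size = 16, epochs = 200) {
  X <- cbind(1, as.matrix(X))          # add intercept column
  n <- nrow(X)
  theta <- rep(0, ncol(X))             # initialize coefficients at zero

  for (epoch in seq_len(epochs)) {
    idx <- sample(n)                   # shuffle rows each epoch
    for (start in seq(1, n, by = batch_size)) {
      batch <- idx[start:min(start + batch_size - 1, n)]
      Xb <- X[batch, , drop = FALSE]
      yb <- y[batch]
      grad <- t(Xb) %*% (Xb %*% theta - yb) / length(batch)  # mean-squared-error gradient on the batch
      theta <- theta - alpha * as.vector(grad)               # gradient step
    }
  }
  theta
}

# Usage on a small synthetic regression problem:
set.seed(42)
x <- runif(200)
y <- 2 + 3 * x + rnorm(200, sd = 0.1)
coef_hat <- mbgd_lm(data.frame(x), y)
print(coef_hat)   # should be roughly c(2, 3)

Each epoch shuffles the data and updates the coefficients once per batch, so only batch_size rows enter each gradient computation, which is the computational saving the MBGD description refers to.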

copied from cf-staging / r-graddescent
Type  | Size     | Name                                            | Uploaded                  | Downloads | Labels
conda | 157.3 kB | noarch/r-graddescent-3.0-r43hc72bb7e_3.conda    | 4 months and 28 days ago  | 405       | main
conda | 160.2 kB | noarch/r-graddescent-3.0-r44hc72bb7e_3.conda    | 4 months and 28 days ago  | 386       | main
conda | 157.3 kB | noarch/r-graddescent-3.0-r43hc72bb7e_2.conda    | 1 year and 5 months ago   | 840       | main
conda | 157.4 kB | noarch/r-graddescent-3.0-r42hc72bb7e_2.conda    | 1 year and 5 months ago   | 848       | main
conda | 160.4 kB | noarch/r-graddescent-3.0-r42hc72bb7e_1.tar.bz2  | 2 years and 1 month ago   | 1233      | main
conda | 160.3 kB | noarch/r-graddescent-3.0-r41hc72bb7e_1.tar.bz2  | 2 years and 1 month ago   | 1228      | main
conda | 159.5 kB | noarch/r-graddescent-3.0-r41hc72bb7e_0.tar.bz2  | 3 years and 1 month ago   | 1895      | main
conda | 159.5 kB | noarch/r-graddescent-3.0-r40hc72bb7e_0.tar.bz2  | 3 years and 1 month ago   | 1893      | main
