gradDescent 3.0
Gradient Descent for Regression Tasks
Released Jan 25, 2018 by Lala Septem Riza
An implementation of various learning algorithms based on gradient descent for regression tasks. The gradient descent variants are:
- Mini-Batch Gradient Descent (MBGD), an optimization that uses only part of the training data in each step to reduce the computation load.
- Stochastic Gradient Descent (SGD), an optimization that uses a single randomly chosen data point per step to reduce the computation load drastically.
- Stochastic Average Gradient (SAG), an SGD-based algorithm that averages past stochastic gradients.
- Momentum Gradient Descent (MGD), an optimization that speeds up gradient descent learning.
- Accelerated Gradient Descent (AGD), an optimization that accelerates gradient descent learning.
- Adagrad, a gradient-descent-based algorithm that accumulates previous gradients to adapt the learning rate.
- Adadelta, a gradient-descent-based algorithm that uses a Hessian approximation for adaptive learning.
- RMSprop, a gradient-descent-based algorithm that combines the adaptive-learning abilities of Adagrad and Adadelta.
- Adam, a gradient-descent-based algorithm that uses mean and variance moment estimates for adaptive learning.
- Stochastic Variance Reduced Gradient (SVRG), an SGD-based optimization that accelerates convergence by reducing the variance of the gradients.
- Semi-Stochastic Gradient Descent (SSGD), an SGD-based algorithm that combines GD and SGD, accelerating convergence by choosing one of the gradients at a time.
- Stochastic Recursive Gradient Algorithm (SARAH), an optimization algorithm that, similarly to SVRG, accelerates convergence by accumulating stochastic information.
- Stochastic Recursive Gradient Algorithm+ (SARAHPlus), a practical variant of SARAH that accelerates convergence and allows earlier termination.
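The package's learner function, gradDescentR.learn (it appears in the examples listed under Test Results below), selects a variant by name. The following is a minimal sketch, not the package's documented usage: the learningMethod argument name and the variant codes are assumptions based on the list above, and the synthetic data frame is purely illustrative.

library(gradDescent)  ## under Renjin: library('org.renjin.cran:gradDescent'), see below

## Synthetic regression data; the last column is assumed to hold the target.
set.seed(1)
dataSet <- data.frame(x = runif(50))
dataSet$y <- 3 * dataSet$x + rnorm(50, sd = 0.05)

## learningMethod (assumed argument name) picks the variant, e.g. "SGD" or "ADAM".
modelSGD  <- gradDescentR.learn(dataSet, learningMethod = "SGD")
modelAdam <- gradDescentR.learn(dataSet, learningMethod = "ADAM")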
Installation
Maven
This package can be included as a dependency from a Java or Scala project by including the following in your project's pom.xml file. Read more about embedding Renjin in JVM-based projects.
<dependencies>
  <dependency>
    <groupId>org.renjin.cran</groupId>
    <artifactId>gradDescent</artifactId>
    <version>3.0-b4</version>
  </dependency>
</dependencies>
<repositories>
  <repository>
    <id>bedatadriven</id>
    <name>bedatadriven public repo</name>
    <url>https://nexus.bedatadriven.com/content/groups/public/</url>
  </repository>
</repositories>
Renjin CLI
If you're using Renjin from the command line, you load this library by invoking:
library('org.renjin.cran:gradDescent')
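Once the library is loaded, a typical workflow splits the data, fits a model, and predicts on held-out rows. This is a hedged sketch: the function names (splitData, gradDescentR.learn, predict.gradDescentRObject) come from the examples listed under Test Results below, but the argument names, the $dataTrain/$dataTest structure of splitData's return value, and the last-column-is-target convention are assumptions.

library('org.renjin.cran:gradDescent')

## Synthetic regression data, purely illustrative.
set.seed(7)
d <- data.frame(x1 = runif(100), x2 = runif(100))
d$y <- 2 * d$x1 - 3 * d$x2 + rnorm(100, sd = 0.1)

parts <- splitData(d)  ## assumed to return list(dataTrain = ..., dataTest = ...)

## Fit with one of the variants described above.
model <- gradDescentR.learn(parts$dataTrain, learningMethod = "SGD")

## predict() dispatches to predict.gradDescentRObject.
pred <- predict(model, parts$dataTest)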
Test Results
This package was last tested against Renjin 0.9.2644 on Jun 2, 2018.
- ADADELTA-examples
- ADAGRAD-examples
- ADAM-examples
- AGD-examples
- GD-examples
- MBGD-examples
- MGD-examples
- RMSE-examples
- RMSPROP-examples
- SAGD-examples
- SARAH-examples
- SARAHPlus-examples
- SGD-examples
- SSGD-examples
- SVRG-examples
- gradDescentR.learn-examples
- minmaxDescaling-examples
- minmaxScaling-examples
- predict.gradDescentRObject-examples
- prediction-examples
- splitData-examples
- varianceDescaling-examples
- varianceScaling-examples