jml.optimization
Class AcceleratedGradientDescent
java.lang.Object
jml.optimization.AcceleratedGradientDescent
public class AcceleratedGradientDescent
- extends java.lang.Object
A Java implementation of the accelerated gradient descent method.
It exposes a general algorithm interface: only the gradient and the
objective function value need to be computed outside the class.
A simple example:
double epsilon = ...; // Convergence tolerance
RealMatrix W = ...; // Initial matrix (vector) to be optimized
RealMatrix G = ...; // Gradient at the initial W
double fval = ...; // Initial objective function value
boolean[] flags = null;
while (true) {
  flags = AcceleratedGradientDescent.run(G, fval, epsilon, W); // Updates W in place
  if (flags[0]) // flags[0] indicates whether the algorithm has converged
    break;
  fval = ...; // Compute the objective function value at the updated W
  if (flags[1]) // flags[1] indicates whether the gradient at the updated W is required
    G = ...; // Compute the gradient at the updated W
}
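The driver loop above follows Nesterov's accelerated gradient scheme. As a rough, self-contained illustration of what such an iteration does internally (plain `double[]` arrays stand in for jml's matrices; the class name `AgdSketch`, the fixed step size, and the quadratic objective are my own assumptions, not part of the library):

```java
// Illustrative sketch of Nesterov's accelerated gradient method on
// f(x) = 0.5 * ||x - a||^2, whose gradient is x - a. Plain double[]s
// stand in for jml's matrices; names and the step size t are assumptions.
public class AgdSketch {
    public static double[] minimize(double[] a, double epsilon) {
        int n = a.length;
        double[] x = new double[n];      // current iterate, starts at 0
        double[] xPrev = x.clone();      // previous iterate
        double t = 0.5;                  // fixed step size (f has Lipschitz constant 1 here)
        for (int k = 1; k <= 10000; k++) {
            // Momentum (look-ahead) point: y = x + ((k-1)/(k+2)) * (x - xPrev)
            double beta = (k - 1.0) / (k + 2.0);
            double[] y = new double[n];
            for (int i = 0; i < n; i++)
                y[i] = x[i] + beta * (x[i] - xPrev[i]);
            // Gradient step taken from the look-ahead point
            double[] xNext = new double[n];
            double gradNorm = 0;
            for (int i = 0; i < n; i++) {
                double g = y[i] - a[i];  // gradient of f at y
                xNext[i] = y[i] - t * g;
                gradNorm += g * g;
            }
            xPrev = x;
            x = xNext;
            if (Math.sqrt(gradNorm) < epsilon) // converged
                break;
        }
        return x;
    }
}
```

The momentum coefficient `(k-1)/(k+2)` is the classical choice that gives the O(1/k²) convergence rate associated with accelerated methods.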
- Version:
- 1.0 Mar. 11th, 2013
- Author:
- Mingjie Qian
Method Summary
static boolean[] run(org.apache.commons.math.linear.RealMatrix Grad_t,
                     double fval_t,
                     double epsilon,
                     org.apache.commons.math.linear.RealMatrix X_t)
Main entry for the accelerated gradient descent algorithm.
Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Field Detail
prox
private static ProximalMapping prox
Constructor Detail
AcceleratedGradientDescent
public AcceleratedGradientDescent()
run
public static boolean[] run(org.apache.commons.math.linear.RealMatrix Grad_t,
double fval_t,
double epsilon,
org.apache.commons.math.linear.RealMatrix X_t)
- Main entry for the accelerated gradient descent algorithm.
The matrix variable to be optimized is updated in place
to a better solution point with a lower objective function value.
- Parameters:
Grad_t - gradient at X_t, required on the first invocation
fval_t - objective function value at X_t: f(X_t)
epsilon - convergence precision
X_t - current matrix variable to be optimized; updated in place
to a better solution point with a lower objective function value
- Returns:
- a boolean array with two elements: {converge, gradientRequired}
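The caller is responsible for supplying the objective value and the gradient between calls to run. As a hedged sketch of those two caller-side pieces for a concrete objective f(X) = 0.5·||X − A||²_F (whose gradient is X − A), using plain `double[][]` arrays in place of `RealMatrix`; the class name `Callbacks`, the target matrix `A`, and the method names are assumptions for illustration only:

```java
// Hypothetical caller-side pieces for the run() loop: the objective value
// ("fval = ...") and gradient ("G = ...") of f(X) = 0.5 * ||X - A||_F^2.
// Plain double[][] stands in for RealMatrix; A and the names are assumptions.
public class Callbacks {
    // Objective function value at X: half the squared Frobenius distance to A
    public static double objective(double[][] X, double[][] A) {
        double s = 0;
        for (int i = 0; i < X.length; i++)
            for (int j = 0; j < X[i].length; j++) {
                double d = X[i][j] - A[i][j];
                s += d * d;
            }
        return 0.5 * s;
    }
    // Gradient at X: for this objective it is simply X - A
    public static double[][] gradient(double[][] X, double[][] A) {
        double[][] G = new double[X.length][];
        for (int i = 0; i < X.length; i++) {
            G[i] = new double[X[i].length];
            for (int j = 0; j < X[i].length; j++)
                G[i][j] = X[i][j] - A[i][j];
        }
        return G;
    }
}
```

Since flags[1] signals when a fresh gradient is needed, the gradient callback only has to be invoked in that branch, which can save work for objectives where the gradient is expensive.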