Category: Gradient methods

Gradient flow
Gradient flow is the continuous-time counterpart of gradient descent: the trajectory x(t) defined by the ordinary differential equation x'(t) = −∇f(x(t)), along which f decreases monotonically. Gradient descent arises as its forward-Euler discretization.
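A minimal sketch of this discretization (the objective f(x) = 0.5·‖x‖², step size, and iteration count below are illustrative assumptions, not part of the entry):

    import numpy as np

    def grad_f(x):                 # gradient of f(x) = 0.5 * ||x||^2
        return x

    x = np.array([2.0, -1.0])      # initial condition x(0)
    dt = 0.1                       # Euler step size; each step is one gradient descent update
    for _ in range(100):
        x = x - dt * grad_f(x)     # x(t + dt) ~ x(t) - dt * grad f(x(t))
    print(x)                       # approaches the minimizer at the origin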
Stochastic variance reduction
(Stochastic) variance reduction is an algorithmic approach to minimizing functions that can be decomposed into finite sums. By exploiting the finite sum structure, variance reduction techniques are able to achieve faster convergence than methods that treat the objective as a generic expectation, as in classical stochastic approximation.
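A minimal sketch of one representative method, SVRG, on an assumed least-squares finite sum (the data, step size, and epoch counts are illustrative choices):

    import numpy as np

    # f(x) = (1/n) * sum_i 0.5 * (a_i . x - b_i)^2  (assumed least-squares components)
    rng = np.random.default_rng(0)
    n, d = 200, 5
    A = rng.normal(size=(n, d))
    b = rng.normal(size=n)

    def grad_i(x, i):                      # gradient of the i-th component
        return (A[i] @ x - b[i]) * A[i]

    def full_grad(x):                      # full gradient, computed once per epoch
        return A.T @ (A @ x - b) / n

    x = np.zeros(d)
    step = 0.01
    for epoch in range(30):
        snapshot = x.copy()
        mu = full_grad(snapshot)           # anchor gradient at the snapshot
        for _ in range(n):
            i = rng.integers(n)
            # variance-reduced estimate: unbiased, with variance shrinking near the optimum
            g = grad_i(x, i) - grad_i(snapshot, i) + mu
            x = x - step * g
    print(np.linalg.norm(full_grad(x)))    # gradient norm shrinks toward 0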
Stochastic gradient descent
Stochastic gradient descent (often abbreviated SGD) is an iterative method for optimizing an objective function with suitable smoothness properties (e.g. differentiable or subdifferentiable). It can be regarded as a stochastic approximation of gradient descent optimization, since it replaces the actual gradient (calculated from the entire data set) with an estimate computed from a randomly selected subset of the data.
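A minimal sketch on an assumed least-squares objective (the data, decaying step-size schedule, and single-sample batches are illustrative choices):

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 500, 3
    A = rng.normal(size=(n, d))
    x_true = np.array([1.0, -2.0, 0.5])
    b = A @ x_true + 0.01 * rng.normal(size=n)

    x = np.zeros(d)
    for t in range(1, 5001):
        i = rng.integers(n)                 # one random sample per step
        g = (A[i] @ x - b[i]) * A[i]        # stochastic estimate of the full gradient
        x = x - (0.1 / np.sqrt(t)) * g      # decaying (Robbins-Monro style) step size
    print(x)                                # close to x_true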
Frank–Wolfe algorithm
The Frank–Wolfe algorithm is an iterative first-order optimization algorithm for constrained convex optimization. Also known as the conditional gradient method, reduced gradient algorithm and the convex combination algorithm, the method was originally proposed by Marguerite Frank and Philip Wolfe in 1956.
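A minimal sketch over the probability simplex, where the linear minimization oracle returns a vertex (the quadratic objective and target point are illustrative assumptions):

    import numpy as np

    # minimize 0.5 * ||x - c||^2 over the probability simplex
    c = np.array([0.2, 0.5, 0.9, -0.3])
    x = np.ones(4) / 4                     # feasible starting point

    for k in range(200):
        g = x - c                          # gradient of the objective
        s = np.zeros(4)
        s[np.argmin(g)] = 1.0              # linear minimization oracle: best simplex vertex
        gamma = 2.0 / (k + 2)              # classic step-size rule
        x = (1 - gamma) * x + gamma * s    # convex combination keeps x feasible
    print(x)                               # approaches the projection of c onto the simplex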
Gradient method
In optimization, a gradient method is an algorithm to solve problems of the form min_{x ∈ ℝⁿ} f(x) with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
Nonlinear conjugate gradient method
In numerical optimization, the nonlinear conjugate gradient method generalizes the conjugate gradient method to nonlinear optimization. For a quadratic function f(x) = ‖Ax − b‖², the minimum of f is obtained when the gradient is 0: ∇f(x) = 2Aᵀ(Ax − b) = 0.
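A minimal sketch using the Fletcher–Reeves coefficient with a simple backtracking line search (the Rosenbrock test function, line-search constants, and restart rule are illustrative assumptions):

    import numpy as np

    def f(x):
        return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

    def grad(x):
        return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                         200 * (x[1] - x[0]**2)])

    x = np.array([-1.2, 1.0])
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(500):
        if g @ d >= 0:                        # restart if d is not a descent direction
            d = -g
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):   # Armijo backtracking
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves coefficient
        d = -g_new + beta * d                 # conjugate search direction
        g = g_new
    print(x)                                  # approaches the minimizer (1, 1)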
Gradient descent
In mathematics, gradient descent (also often called steepest descent) is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient (or approximate gradient) of the function at the current point, because this is the direction of steepest descent.
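A minimal sketch on an assumed quadratic f(x) = 0.5·xᵀQx − bᵀx (the matrix, data, and step size are illustrative choices):

    import numpy as np

    Q = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive-definite
    b = np.array([1.0, -1.0])

    x = np.zeros(2)
    step = 0.2                                # must be < 2 / lambda_max(Q) to converge
    for _ in range(200):
        g = Q @ x - b                         # gradient of the quadratic
        x = x - step * g                      # step opposite the gradient
    print(x, np.linalg.solve(Q, b))           # iterate vs. exact minimizer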
Landweber iteration
The Landweber iteration or Landweber algorithm is an algorithm to solve ill-posed linear inverse problems, and it has been extended to solve non-linear problems that involve constraints. The method was first proposed in the 1950s by Louis Landweber, and it can now be viewed as a special case of many more general methods.
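A minimal sketch of the basic update xₖ₊₁ = xₖ + ω·Aᵀ(b − Axₖ) (the random test problem and relaxation parameter are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(20, 10))
    x_true = rng.normal(size=10)
    b = A @ x_true

    omega = 1.0 / np.linalg.norm(A, 2)**2    # need 0 < omega < 2 / sigma_max(A)^2
    x = np.zeros(10)
    for _ in range(2000):
        x = x + omega * A.T @ (b - A @ x)    # Landweber update
    print(np.linalg.norm(A @ x - b))         # residual decreases toward 0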
Contour currents
The term contour currents was first introduced by Heezen et al. in 1966 as bottom currents along the continental shelf driven by Coriolis effects and temperature/salinity dependent density gradients.
Biconjugate gradient stabilized method
In numerical linear algebra, the biconjugate gradient stabilized method, often abbreviated as BiCGSTAB, is an iterative method developed by H. A. van der Vorst for the numerical solution of nonsymmetric linear systems. It is a variant of the biconjugate gradient method (BiCG) with faster and smoother convergence than the original BiCG.
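A minimal usage sketch with SciPy's implementation on an assumed nonsymmetric tridiagonal system (the matrix and default tolerances are illustrative; tolerance keyword names vary across SciPy versions, so defaults are used here):

    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import bicgstab

    n = 100
    # nonsymmetric tridiagonal test matrix (sub- and super-diagonals differ)
    A = diags([-1.0, 2.0, -0.5], offsets=[-1, 0, 1], shape=(n, n), format="csr")
    b = np.ones(n)

    x, info = bicgstab(A, b)                 # info == 0 signals convergence
    print(info, np.linalg.norm(A @ x - b))   # small residual on success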
Derivation of the conjugate gradient method
In numerical linear algebra, the conjugate gradient method is an iterative method for numerically solving the linear system Ax = b, where A is symmetric positive-definite. The conjugate gradient method can be derived from several different perspectives, including specialization of the conjugate direction method and variation of the Arnoldi/Lanczos iteration for eigenvalue problems.
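A brief sketch of the optimization viewpoint the entry alludes to (standard material, stated here for orientation):

    \[
    \varphi(x) = \tfrac{1}{2} x^\top A x - b^\top x, \qquad
    \nabla \varphi(x) = A x - b,
    \]
    so the unique minimizer of \(\varphi\) over \(\mathbb{R}^n\) is exactly the solution of
    \(A x = b\); the conjugate gradient method minimizes \(\varphi\) successively over growing
    Krylov subspaces using mutually \(A\)-conjugate search directions,
    \(p_i^\top A p_j = 0\) for \(i \neq j\).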
Coordinate descent
Coordinate descent is an optimization algorithm that successively minimizes along coordinate directions to find the minimum of a function. At each iteration, the algorithm determines a coordinate or coordinate block via a coordinate selection rule, then exactly or inexactly minimizes over the corresponding coordinate hyperplane while fixing all other coordinates or coordinate blocks.
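A minimal cyclic sketch with exact one-dimensional minimization on an assumed quadratic f(x) = 0.5·xᵀQx − bᵀx (the matrix and data are illustrative choices):

    import numpy as np

    Q = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])          # symmetric positive-definite (assumed)
    b = np.array([1.0, 2.0, 3.0])

    x = np.zeros(3)
    for _ in range(50):
        for i in range(3):                   # cycle through the coordinates
            # exact minimizer over x_i with all other coordinates held fixed
            x[i] = (b[i] - Q[i] @ x + Q[i, i] * x[i]) / Q[i, i]
    print(x, np.linalg.solve(Q, b))          # iterate vs. exact minimizer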
Conjugate gradient method
In mathematics, the conjugate gradient method is an algorithm for the numerical solution of particular systems of linear equations, namely those whose matrix is symmetric and positive-definite. The conjugate gradient method is often implemented as an iterative algorithm, applicable to sparse systems that are too large to be handled by a direct implementation or other direct methods such as the Cholesky decomposition.
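A minimal sketch of the standard iteration (the small test system and tolerance are illustrative assumptions):

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10):
        x = np.zeros_like(b)
        r = b - A @ x                        # residual: the negative gradient of the quadratic
        p = r.copy()                         # first search direction
        rs = r @ r
        for _ in range(len(b)):              # converges in at most n steps in exact arithmetic
            Ap = A @ p
            alpha = rs / (p @ Ap)            # exact line search along p
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p        # next A-conjugate direction
            rs = rs_new
        return x

    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(conjugate_gradient(A, b), np.linalg.solve(A, b))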
Mirror descent
In mathematics, mirror descent is an iterative optimization algorithm for finding a local minimum of a differentiable function. It generalizes algorithms such as gradient descent and multiplicative weights update.
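A minimal sketch with the entropy mirror map on the probability simplex, which recovers the multiplicative-weights flavor (the linear objective and step size are illustrative assumptions):

    import numpy as np

    c = np.array([0.8, 0.1, 0.4])             # minimize c . x over the probability simplex
    x = np.ones(3) / 3
    step = 0.5
    for _ in range(200):
        g = c                                  # gradient of the linear objective
        x = x * np.exp(-step * g)              # mirror step in the dual (log) space
        x = x / x.sum()                        # Bregman projection back onto the simplex
    print(x)                                   # mass concentrates on argmin_i c_i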
Biconjugate gradient method
In mathematics, more specifically in numerical linear algebra, the biconjugate gradient method is an algorithm to solve systems of linear equations Ax = b. Unlike the conjugate gradient method, this algorithm does not require the matrix A to be self-adjoint, but instead one needs to perform multiplications by the conjugate transpose A*.
Proximal gradient method
Proximal gradient methods are a generalized form of projection used to solve non-differentiable convex optimization problems. Many interesting problems can be formulated as convex optimization problems of the form min_x f₁(x) + ⋯ + fₙ(x), where the fᵢ are convex functions, some of which may be non-differentiable.
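A minimal sketch of the proximal gradient (ISTA) iteration for the lasso problem min_x 0.5·‖Ax − b‖² + λ‖x‖₁ (the random data and λ are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(30, 10))
    b = rng.normal(size=30)
    lam = 0.5

    def soft_threshold(v, t):                 # proximal operator of t * ||.||_1
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    step = 1.0 / np.linalg.norm(A, 2)**2      # 1 / Lipschitz constant of the smooth part
    x = np.zeros(10)
    for _ in range(500):
        g = A.T @ (A @ x - b)                 # gradient of the differentiable part
        x = soft_threshold(x - step * g, step * lam)  # gradient step, then prox step
    print(x)                                  # typically sparse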
Random coordinate descent
The randomized (block) coordinate descent method is an optimization algorithm popularized by Nesterov (2010) and Richtárik and Takáč (2011). The first analysis of this method, when applied to the problem of minimizing a smooth convex function, was performed by Nesterov (2010).
Stochastic gradient Langevin dynamics
Stochastic gradient Langevin dynamics (SGLD) is an optimization and sampling technique composed of characteristics from stochastic gradient descent, a Robbins–Monro optimization algorithm, and Langevin dynamics, a mathematical extension of molecular dynamics models.
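A minimal sketch sampling a one-dimensional Gaussian target by Langevin updates with injected noise (the target density, step size, and use of a full gradient in place of a minibatch estimate are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)

    def grad_log_p(x):                        # gradient of the log-density of N(1, 1)
        return -(x - 1.0)

    x = 0.0
    step = 0.01
    samples = []
    for _ in range(20000):
        noise = rng.normal() * np.sqrt(step)  # injected Gaussian noise, variance = step
        x = x + 0.5 * step * grad_log_p(x) + noise   # Langevin update
        samples.append(x)
    # after burn-in, the empirical mean and variance approach the target's (both ~1.0)
    print(np.mean(samples[5000:]), np.var(samples[5000:]))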