Unconstrained Optimization
Theoretical Foundations
Necessary Optimality Conditions
First-Order Necessary Conditions
Gradient Conditions
Critical Points
Second-Order Necessary Conditions
Hessian Conditions
Sufficient Optimality Conditions
Second-Order Sufficient Conditions
Strict Convexity Implications
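For reference, the conditions named in this subsection can be stated compactly for a twice continuously differentiable f: R^n -> R (these are the standard statements; "PSD"/"PD" abbreviate positive semidefinite/definite):

```latex
\begin{align*}
\text{FONC:}\quad & x^* \text{ a local minimizer} \;\Rightarrow\; \nabla f(x^*) = 0, \\
\text{SONC:}\quad & x^* \text{ a local minimizer} \;\Rightarrow\; \nabla^2 f(x^*) \succeq 0 \ \text{(PSD)}, \\
\text{SOSC:}\quad & \nabla f(x^*) = 0 \ \text{and}\ \nabla^2 f(x^*) \succ 0 \ \text{(PD)} \;\Rightarrow\; x^* \text{ a strict local minimizer.}
\end{align*}
```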
Line Search Strategies
Exact Line Search
One-Dimensional Optimization
Golden Section Search
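A minimal sketch of golden-section search for the one-dimensional subproblem, e.g. minimizing phi(t) = f(x + t*d) over a bracket [a, b]. The function name and tolerance are illustrative choices of our own, not from any particular library:

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden-section search.

    Each iteration shrinks the bracket by the constant factor
    1/phi ~= 0.618 and needs only one new function evaluation,
    because one interior point is reused.
    """
    invphi = (math.sqrt(5) - 1) / 2           # 1/phi
    x1 = b - invphi * (b - a)                 # interior points
    x2 = a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                           # minimizer lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
        else:                                 # minimizer lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# e.g. an exact line search for the step length t along a direction
print(golden_section_search(lambda t: (t - 2.0) ** 2 + 1.0, 0.0, 5.0))  # ~2.0
```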
Fibonacci Search
Inexact Line Search
Armijo Rule (Sufficient Decrease)
Wolfe Conditions
Curvature Condition
Strong Wolfe Conditions
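For reference, the three conditions above, for a trial step length t > 0 along a descent direction d at the current iterate x, with constants 0 < c1 < c2 < 1 (typical choices are c1 = 1e-4 and c2 = 0.9 for quasi-Newton methods, or c2 = 0.1 for conjugate gradient):

```latex
\begin{align*}
\text{Armijo (sufficient decrease):}\quad & f(x + t d) \le f(x) + c_1\, t\, \nabla f(x)^{\top} d, \\
\text{curvature:}\quad & \nabla f(x + t d)^{\top} d \ge c_2\, \nabla f(x)^{\top} d, \\
\text{strong Wolfe:}\quad & \bigl|\nabla f(x + t d)^{\top} d\bigr| \le c_2\, \bigl|\nabla f(x)^{\top} d\bigr|.
\end{align*}
```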
Goldstein Conditions
Backtracking Line Search
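A minimal backtracking sketch enforcing the Armijo condition; the function name and the default constants (c1 = 1e-4, shrink factor 0.5) are illustrative choices of our own:

```python
import numpy as np

def backtracking_line_search(f, grad, x, d, t0=1.0, c1=1e-4, rho=0.5):
    """Backtracking line search enforcing the Armijo condition
    f(x + t d) <= f(x) + c1 * t * grad(x)^T d.

    Starts from t0 and shrinks t geometrically until the condition holds;
    d must be a descent direction (grad(x) @ d < 0).
    """
    fx = f(x)
    slope = grad(x) @ d            # directional derivative at x along d
    if slope >= 0:
        raise ValueError("d must be a descent direction")
    t = t0
    while f(x + t * d) > fx + c1 * t * slope:
        t *= rho                   # shrink the step
    return t
```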
Step Size Selection
Fixed Step Size
Adaptive Step Size
Diminishing Step Size Rules
Gradient-Based Methods
Steepest Descent Method
Algorithm Description
Convergence Analysis
Linear Convergence Rate
Dependence on Condition Number
Advantages and Limitations
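A sketch of steepest descent with Armijo backtracking, run on an ill-conditioned quadratic to illustrate how the linear rate degrades with the condition number; all names and constants are our own choices:

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=10_000):
    """Steepest descent: step along -grad(x) with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:          # stationarity test
            break
        t = 1.0
        while f(x - t * g) > f(x) - 1e-4 * t * (g @ g):
            t *= 0.5                          # backtrack to sufficient decrease
        x = x - t * g
    return x

# Ill-conditioned quadratic: iterations zig-zag, and the number needed
# grows with the condition number (here 100).
A = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
print(steepest_descent(f, grad, [1.0, 1.0]))  # -> approximately [0, 0]
```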
Conjugate Gradient Methods
Linear Conjugate Gradient
Conjugate Directions
Krylov Subspaces
Nonlinear Conjugate Gradient
Fletcher-Reeves Formula
Polak-Ribière Formula
Hestenes-Stiefel Formula
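A sketch combining the two most common beta formulas. Textbook implementations pair nonlinear CG with a strong-Wolfe line search to guarantee descent directions; the simple backtracking and restart safeguard here are simplifications of our own:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, variant="PR", tol=1e-6, max_iter=5000):
    """Nonlinear conjugate gradient with Fletcher-Reeves ("FR") or
    clipped Polak-Ribiere ("PR", i.e. PR+) choices of beta."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                   # safeguard: restart if not descent
            d = -g
        t = 1.0                          # Armijo backtracking along d
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        if variant == "FR":
            beta = (g_new @ g_new) / (g @ g)
        else:                            # PR+: reset beta to 0 when negative
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```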
Momentum Methods
Heavy Ball Method
Nesterov Acceleration
Convergence Acceleration Properties
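The heavy ball method adds a momentum term beta * (x_k - x_{k-1}) to the gradient step; Nesterov's variant instead evaluates the gradient at an extrapolated look-ahead point, improving the convex rate from O(1/k) to O(1/k^2). A sketch of the Nesterov scheme for an L-smooth function, using the standard momentum sequence t_k; the other names are illustrative:

```python
import numpy as np

def nesterov(grad, x0, L, n_iter=500):
    """Nesterov's accelerated gradient for an L-smooth convex function.

    The gradient step is taken at the extrapolated point y rather than
    at x itself; the extrapolation weight (t_k - 1) / t_{k+1} grows
    toward 1 as the iteration proceeds.
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        x_new = y - grad(y) / L                        # step at look-ahead point
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2       # standard t_k recursion
        y = x_new + ((t - 1) / t_new) * (x_new - x)    # momentum extrapolation
        x, t = x_new, t_new
    return x

# Quadratic example: f(x) = 0.5 x^T A x, smoothness constant L = lambda_max(A)
A = np.diag([1.0, 100.0])
print(nesterov(lambda x: A @ x, [1.0, 1.0], L=100.0))  # -> close to [0, 0]
```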
Newton-Type Methods
Newton's Method
Algorithm Derivation
Quadratic Convergence
Computational Requirements
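A sketch of the pure Newton iteration. The Rosenbrock demo is started near the minimizer (1, 1) to show local quadratic convergence; far from a solution the full step can fail (the Hessian may be indefinite), which is what the modified methods below address:

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-10, max_iter=50):
    """Pure Newton's method: solve H(x) p = -g(x), then x <- x + p.

    Quadratically convergent near a minimizer with positive definite
    Hessian; each iteration costs one Hessian evaluation plus an
    O(n^3) linear solve.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = np.linalg.solve(hess(x), -g)   # Newton step
        x = x + p
    return x

# Rosenbrock function: f(x, y) = (1 - x)^2 + 100 (y - x^2)^2
grad = lambda v: np.array([-2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0] ** 2),
                           200 * (v[1] - v[0] ** 2)])
hess = lambda v: np.array([[2 - 400 * v[1] + 1200 * v[0] ** 2, -400 * v[0]],
                           [-400 * v[0], 200.0]])
print(newton(grad, hess, [1.2, 1.2]))      # -> approximately [1, 1]
```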
Modified Newton Methods
Quasi-Newton Methods
Secant Equation
BFGS Method
Update Formula
Positive Definiteness Preservation
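A sketch of BFGS maintaining the inverse-Hessian approximation H. The update preserves positive definiteness only when the curvature condition s^T y > 0 holds; a Wolfe line search guarantees this, whereas the simple backtracking used here does not, so the update is skipped otherwise. Names and constants are our own:

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """BFGS quasi-Newton method with the inverse-Hessian update
    H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T, rho = 1/(s^T y)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                        # inverse Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                       # quasi-Newton direction
        t = 1.0                          # Armijo backtracking
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        s = t * d                        # step:    s_k = x_{k+1} - x_k
        x = x + s
        g_new = grad(x)
        y = g_new - g                    # gradient change along the step
        g = g_new
        sy = s @ y
        if sy > 1e-10:                   # curvature condition -> H stays PD
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
    return x
```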
DFP Method
Limited-Memory BFGS (L-BFGS)
Broyden Family of Updates
Trust Region Methods
Trust Region Subproblem
Cauchy Point
Dogleg Method
Trust Region Radius Updates
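A sketch of a basic trust-region loop with a dogleg subproblem solver, assuming the model Hessian B is positive definite (real implementations must also handle indefinite B); the radius-update thresholds (0.25, 0.75) and acceptance threshold (0.1) are conventional choices:

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Approximately solve  min g^T p + 0.5 p^T B p  s.t. ||p|| <= delta,
    with B positive definite, by following the dogleg path."""
    pB = np.linalg.solve(B, -g)                  # full (Newton) step
    if np.linalg.norm(pB) <= delta:
        return pB
    pU = -(g @ g) / (g @ B @ g) * g              # unconstrained Cauchy point
    if np.linalg.norm(pU) >= delta:
        return delta * pU / np.linalg.norm(pU)   # clip steepest-descent leg
    # find tau in [0, 1] with ||pU + tau (pB - pU)|| = delta (quadratic in tau)
    d = pB - pU
    a, b, c = d @ d, 2 * pU @ d, pU @ pU - delta ** 2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return pU + tau * d

def trust_region(f, grad, hess, x0, delta=1.0, tol=1e-8, max_iter=100):
    """Trust-region loop: compare actual vs. predicted decrease (rho),
    then shrink, keep, or expand the radius accordingly."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g, B = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        p = dogleg_step(g, B, delta)
        pred = -(g @ p + 0.5 * p @ B @ p)        # predicted model decrease
        rho = (f(x) - f(x + p)) / pred           # actual / predicted
        if rho < 0.25:
            delta *= 0.25                        # poor model fit: shrink
        elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
            delta *= 2.0                         # good fit at boundary: expand
        if rho > 0.1:                            # accept only useful steps
            x = x + p
    return x
```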
Specialized First-Order Methods
Stochastic Gradient Descent
Algorithm Variants
Convergence in Expectation
Mini-Batch Variants
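A sketch of mini-batch SGD with per-epoch reshuffling and a diminishing O(1/k) step size, one standard schedule under which convergence in expectation can be shown for convex problems; the least-squares demo and all constants are illustrative choices of our own:

```python
import numpy as np

def sgd(grad_batch, x0, n_samples, lr0=0.1, batch_size=32, n_epochs=20, seed=0):
    """Mini-batch SGD. grad_batch(x, idx) must return the average gradient
    over the samples indexed by idx; the step size decays like O(1/k)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    k = 0
    for _ in range(n_epochs):
        order = rng.permutation(n_samples)          # reshuffle each epoch
        for start in range(0, n_samples, batch_size):
            idx = order[start:start + batch_size]
            k += 1
            x = x - (lr0 / (1 + 0.01 * k)) * grad_batch(x, idx)
    return x

# Least squares f(x) = (1/2n) ||A x - b||^2 with noise-free data
rng = np.random.default_rng(1)
A, x_true = rng.normal(size=(500, 5)), np.arange(5.0)
b = A @ x_true
grad_batch = lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
print(sgd(grad_batch, np.zeros(5), n_samples=500))  # -> approx x_true
```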
Adaptive Gradient Methods
AdaGrad
RMSprop
Adam Optimizer
AdaDelta
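A sketch of Adam following the update rule from Kingma and Ba's paper: per-coordinate steps scaled by bias-corrected running estimates of the gradient's first and second moments. A deterministic gradient callback is used for brevity; in practice g would be a stochastic mini-batch gradient:

```python
import numpy as np

def adam(grad, x0, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8, n_iter=5000):
    """Adam optimizer with bias-corrected first and second moments."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)                 # first moment (running mean of g)
    v = np.zeros_like(x)                 # second moment (running mean of g^2)
    for k in range(1, n_iter + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** k)     # correct the zero-initialization bias
        v_hat = v / (1 - beta2 ** k)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x
```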
Coordinate Descent Methods
Cyclic Coordinate Descent
Random Coordinate Descent
Block Coordinate Descent
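A sketch of coordinate descent on the strictly convex quadratic f(x) = 0.5 x^T A x - b^T x with A positive definite, where exact minimization over one coordinate has a closed form (the classic Gauss-Seidel update); the cyclic/random toggle illustrates the two orderings named above, and all names are our own:

```python
import numpy as np

def coordinate_descent(A, b, x0=None, n_epochs=100, cyclic=True, seed=0):
    """Minimize 0.5 x^T A x - b^T x one coordinate at a time.

    Setting the i-th partial derivative to zero gives the exact update
    x_i = (b_i - sum_{j != i} A_ij x_j) / A_ii.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    rng = np.random.default_rng(seed)
    for _ in range(n_epochs):
        # cyclic sweep, or n coordinates drawn uniformly with replacement
        coords = range(n) if cyclic else rng.integers(0, n, size=n)
        for i in coords:
            x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(coordinate_descent(A, b))        # -> the solution of A x = b
```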