Computational Statistics
1. Foundations of Computational Statistics
2. Monte Carlo Methods
3. Resampling Methods
4. Numerical Optimization in Statistics
5. Bayesian Computational Methods
6. High-Dimensional Data Analysis
7. Advanced Computational Topics
Numerical Optimization in Statistics
The Role of Optimization in Statistical Modeling
Maximum Likelihood Estimation (MLE)
Likelihood Functions
Log-Likelihood
Score Functions
Fisher Information
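The likelihood, score, and Fisher information items above can be illustrated with the simplest case, a Bernoulli sample. This is a minimal sketch with toy data of my own choosing, not an example from the course; at the MLE (the sample proportion) the score is exactly zero.

```python
import numpy as np

# Sketch: MLE quantities for a Bernoulli success probability p.
# Log-likelihood: l(p) = s*log(p) + (n - s)*log(1 - p),  s = number of successes
# Score:          l'(p) = s/p - (n - s)/(1 - p)
# Fisher info:    I(p) = n / (p * (1 - p))

def log_likelihood(p, x):
    s, n = x.sum(), x.size
    return s * np.log(p) + (n - s) * np.log(1.0 - p)

def score(p, x):
    s, n = x.sum(), x.size
    return s / p - (n - s) / (1.0 - p)

def fisher_info(p, n):
    return n / (p * (1.0 - p))

x = np.array([1, 0, 1, 1, 0, 1, 0, 1])  # toy data: 5 successes in 8 trials
p_hat = x.mean()                        # closed-form MLE: the sample proportion
```

The score function vanishes at `p_hat`, which is how the MLE is characterized when no closed form exists and a root-finder or optimizer must be used instead.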
Least Squares Estimation
Linear Regression
Nonlinear Regression
Weighted Least Squares
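Ordinary and weighted least squares both reduce to solving normal equations. The sketch below uses simulated data and arbitrary weights of my own choosing purely to show the two linear solves side by side.

```python
import numpy as np

# Sketch: ordinary and weighted least squares via the normal equations.
rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one covariate
beta_true = np.array([2.0, -1.0])
y = X @ beta_true + 0.1 * rng.normal(size=n)

# OLS: solve (X'X) beta = X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# WLS with (hypothetical) precision weights w: solve (X'WX) beta = X'Wy
w = rng.uniform(0.5, 2.0, size=n)
beta_wls = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
```

For nonlinear regression there is no such closed form, and the iterative methods listed under Optimization Algorithms below take over.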
Method of Moments
Moment Equations
Generalized Method of Moments
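Moment equations match sample moments to their model expressions. As a hedged illustration (the Gamma example and sample size are my own choice), the Gamma(shape k, scale θ) family gives E[X] = kθ and Var[X] = kθ², which invert in closed form:

```python
import numpy as np

# Sketch: method-of-moments estimates for a Gamma(shape k, scale theta) sample.
# Moment equations: E[X] = k*theta, Var[X] = k*theta**2, hence
#   theta_hat = s2 / xbar,   k_hat = xbar**2 / s2.
rng = np.random.default_rng(1)
x = rng.gamma(shape=3.0, scale=2.0, size=50_000)

xbar = x.mean()
s2 = x.var()
theta_hat = s2 / xbar
k_hat = xbar ** 2 / s2
```

When there are more moment conditions than parameters, no exact solve exists and the generalized method of moments instead minimizes a weighted quadratic form in the moment discrepancies.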
Bayesian Estimation
Maximum A Posteriori (MAP)
Posterior Mode Finding
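MAP estimation finds the posterior mode rather than the likelihood maximum. In the conjugate Bernoulli–Beta case (a toy example of my own choosing, not from the source) the mode is available in closed form, which makes the shrinkage effect of the prior easy to see:

```python
import numpy as np

# Sketch: MAP estimate for a Bernoulli probability with a Beta(a, b) prior.
# The posterior is Beta(a + s, b + n - s); its mode (the MAP) is
#   (a + s - 1) / (a + b + n - 2),   valid when a + s > 1 and b + n - s > 1.
def map_bernoulli(x, a=2.0, b=2.0):
    s, n = x.sum(), x.size
    return (a + s - 1.0) / (a + b + n - 2.0)

x = np.array([1, 1, 1, 0, 1])   # toy data: 4 successes in 5 trials
p_map = map_bernoulli(x)        # shrinks the raw proportion 0.8 toward 0.5
```

In non-conjugate models the posterior mode has no closed form, and the same numerical optimizers used for MLE are applied to the log-posterior.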
Optimization Algorithms
Gradient-Based Methods
Gradient Descent
Step Size Selection
Convergence Criteria
Line Search Methods
Stochastic Gradient Descent
Mini-batch Methods
Online Learning
Adaptive Learning Rates
Conjugate Gradient Methods
Application to Large Systems
Preconditioning
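The gradient-descent items above (step size selection, convergence criteria, line search) can be sketched together on a small quadratic. The problem instance is my own toy choice; the step size is chosen by Armijo backtracking and the loop stops on a gradient-norm criterion.

```python
import numpy as np

# Sketch: gradient descent with backtracking (Armijo) line search on
# f(x) = 0.5 * x'Ax - b'x, whose gradient is Ax - b and minimizer A^{-1}b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x = np.zeros(2)
for _ in range(200):
    g = grad(x)
    if np.linalg.norm(g) < 1e-8:          # convergence criterion on the gradient
        break
    t = 1.0                               # backtracking line search for the step size
    while f(x - t * g) > f(x) - 0.5 * t * (g @ g):
        t *= 0.5
    x = x - t * g

x_star = np.linalg.solve(A, b)            # exact minimizer, for comparison
```

Stochastic and mini-batch variants replace the full gradient `grad(x)` with a noisy estimate from a data subsample, trading per-iteration cost for more iterations.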
Newton's Method and Variants
Newton-Raphson Method
Hessian Matrix Computation
Convergence Properties
Quasi-Newton Methods
BFGS Algorithm
L-BFGS Algorithm
DFP Algorithm
Fisher Scoring
Application in Generalized Linear Models
Information Matrix
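Newton-Raphson and Fisher scoring coincide for a GLM with canonical link, so logistic regression makes a compact illustration of both. The simulated data below is a sketch of my own construction; note the small iteration count that quadratic convergence permits.

```python
import numpy as np

# Sketch: Newton-Raphson (= Fisher scoring, canonical link) for logistic regression.
rng = np.random.default_rng(2)
n = 2000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([-0.5, 1.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

beta = np.zeros(2)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))
    score_vec = X.T @ (y - mu)             # gradient of the log-likelihood
    W = mu * (1.0 - mu)
    info = X.T @ (W[:, None] * X)          # information matrix (observed = expected here)
    step = np.linalg.solve(info, score_vec)
    beta = beta + step
    if np.linalg.norm(step) < 1e-10:       # quadratic convergence: very few iterations
        break
```

Quasi-Newton methods such as BFGS and L-BFGS avoid forming `info` at all, building a curvature approximation from successive gradients instead, which matters when the Hessian is expensive or high-dimensional.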
Derivative-Free Methods
Nelder-Mead Simplex
Simulated Annealing
Genetic Algorithms
Constrained Optimization
Lagrange Multipliers
Penalty Methods
Barrier Methods
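Penalty methods convert a constrained problem into a sequence of unconstrained ones. The equality-constrained toy problem below is my own choice; because each penalized subproblem is quadratic, every inner minimization is an exact linear solve, and the iterates approach the constrained solution as the penalty grows.

```python
import numpy as np

# Sketch: quadratic penalty method for
#   minimize x1**2 + x2**2   subject to   x1 + x2 = 1   (solution x = (0.5, 0.5)).
# Penalized objective: x'x + (rho/2) * (x1 + x2 - 1)**2, minimized for growing rho.
# Setting the gradient to zero gives (2I + rho*11') x = rho*1.
x = np.zeros(2)
ones = np.ones(2)
for rho in [1.0, 10.0, 100.0, 1e4, 1e6]:
    A = 2.0 * np.eye(2) + rho * np.outer(ones, ones)
    x = np.linalg.solve(A, rho * ones)    # exact minimizer of the penalized problem
```

Barrier methods work from the feasible side instead, adding a term that blows up at the constraint boundary; Lagrange multipliers characterize the limit point that both approaches converge to.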
The Expectation-Maximization (EM) Algorithm
The E-Step (Expectation)
Calculating Expected Values
Complete Data Log-Likelihood
The M-Step (Maximization)
Maximizing the Expected Log-Likelihood
Parameter Updates
Theory and Convergence
Monotonicity of Likelihood
Convergence Criteria
Rate of Convergence
Variants of EM
Generalized EM (GEM)
Expectation Conditional Maximization (ECM)
Monte Carlo EM (MCEM)
Applications with Missing Data and Latent Variables
Mixture Models
Hidden Markov Models
Factor Analysis
Incomplete Data Problems
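The E-step/M-step structure outlined above can be made concrete with the classic mixture-model application. The sketch below fits a two-component Gaussian mixture with known unit variances (data, starting values, and the simplification are my own assumptions): the E-step computes each point's responsibility, and the M-step's weighted averages maximize the expected complete-data log-likelihood.

```python
import numpy as np

# Sketch: EM for a two-component Gaussian mixture with known variances (= 1),
# estimating the mixing weight pi and the two component means.
rng = np.random.default_rng(3)
n = 5000
z = rng.random(n) < 0.4                         # latent labels, hidden from EM
x = np.where(z, rng.normal(-2.0, 1.0, n), rng.normal(2.0, 1.0, n))

def normal_pdf(x, mu):
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

pi, mu1, mu2 = 0.5, -1.0, 1.0                   # starting values
for _ in range(200):
    # E-step: posterior responsibility of component 1 for each observation
    r = pi * normal_pdf(x, mu1)
    r = r / (r + (1.0 - pi) * normal_pdf(x, mu2))
    # M-step: weighted averages maximize the expected complete-data log-likelihood
    pi_new = r.mean()
    mu1_new = (r * x).sum() / r.sum()
    mu2_new = ((1.0 - r) * x).sum() / (1.0 - r).sum()
    if abs(pi_new - pi) + abs(mu1_new - mu1) + abs(mu2_new - mu2) < 1e-10:
        break                                   # convergence criterion on parameter change
    pi, mu1, mu2 = pi_new, mu1_new, mu2_new
```

Each iteration cannot decrease the observed-data likelihood (the monotonicity property above), though convergence is only linear, with rate governed by the fraction of missing information, which is why the ECM and Monte Carlo EM variants exist for harder M-steps and E-steps.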