by Neil E. Cotter

Research Assistant Professor

Last Updated: 8/31/09

(† = missing, * = article)

Optimization

Lagrange Multipliers

Karush-Kuhn-Tucker Theorem

Descent Algorithms

Newton's method

Line search

Gradient (or steepest) descent (sketch after this list)

Conjugate gradient methods

Polak-Ribière method

Modified Newton's method
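
A minimal sketch of the "Gradient (or steepest) descent" entry above, paired with a backtracking (Armijo) line search. Illustrative only, not taken from the linked notes; the quadratic objective is made up:

    import numpy as np

    A = np.array([[3.0, 1.0], [1.0, 2.0]])  # made-up positive-definite matrix

    def f(x):
        return 0.5 * x @ A @ x              # quadratic test objective

    def grad_f(x):
        return A @ x                        # its gradient

    def gradient_descent(x, steps=100, tol=1e-8):
        for _ in range(steps):
            g = grad_f(x)
            if np.linalg.norm(g) < tol:
                break
            t = 1.0
            # Backtracking line search: halve the step until the
            # Armijo sufficient-decrease condition holds.
            while f(x - t * g) > f(x) - 0.5 * t * (g @ g):
                t *= 0.5
            x = x - t * g                   # step along the steepest-descent direction
        return x

    print(gradient_descent(np.array([1.0, 1.0])))  # -> approximately [0, 0]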

Linear Optimization

Linear programming (example after this list)

Example Problem 1 (pdf)

Simplex algorithm

Interior-point algorithms
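
A small worked example for the "Linear programming" entries above, using scipy.optimize.linprog (whichever solver scipy selects by default, not necessarily the simplex algorithm); the problem data are made up:

    from scipy.optimize import linprog

    # Maximize x + 2y subject to x + y <= 4, y <= 3, x, y >= 0.
    # linprog minimizes, so the objective is negated.
    res = linprog(c=[-1.0, -2.0],
                  A_ub=[[1.0, 1.0], [0.0, 1.0]],
                  b_ub=[4.0, 3.0],
                  bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)  # optimum at x = 1, y = 3, value 7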

Quadratic Programming
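
A minimal equality-constrained quadratic-programming sketch, solved through its Karush-Kuhn-Tucker system (see the KKT entry above); illustrative only, with made-up data:

    import numpy as np

    # minimize 0.5 x'Qx + c'x  subject to  Ax = b
    Q = np.array([[2.0, 0.0], [0.0, 2.0]])
    c = np.array([-2.0, -5.0])
    A = np.array([[1.0, 1.0]])
    b = np.array([1.0])

    # KKT conditions: Qx + c + A'lam = 0 and Ax = b, a linear system
    # in the primal variables x and the Lagrange multiplier lam.
    K = np.block([[Q, A.T], [A, np.zeros((1, 1))]])
    sol = np.linalg.solve(K, np.concatenate([-c, b]))
    x, lam = sol[:2], sol[2:]
    print(x)  # -> [-0.25, 1.25], the constrained minimizer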

Global Optimization

Downhill simplex algorithm

Genetic algorithms

Diffusion algorithms

Random fields

Markov random fields (MRF)

Gibbs random fields (GRF)

Equivalence of MRF and GRF

Convergence

Notation
Chapman-Kolmogorov equation (stated below)
Forgetting initial conditions
Approaches invariant distribution
Invariant distribution is Gibbs
Freezing to global minima
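
For reference, the standard statement of the Chapman-Kolmogorov equation, with $p_{ij}^{(n)} = \Pr(X_{m+n} = j \mid X_m = i)$ denoting the $n$-step transition probabilities of a Markov chain:

$$p_{ij}^{(m+n)} = \sum_k p_{ik}^{(m)} \, p_{kj}^{(n)}$$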

Metropolis algorithm

Simulated annealing (sketch after this list)

*Prejudicial search
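
A minimal sketch for the "Metropolis algorithm" and "Simulated annealing" entries above; the cost function, move distribution, and cooling schedule are all made up for illustration:

    import math, random

    def cost(x):
        # Made-up non-convex objective with many local minima.
        return x * x + 10.0 * math.sin(3.0 * x)

    def anneal(x=5.0, T=10.0, cooling=0.99, steps=5000):
        for _ in range(steps):
            x_new = x + random.uniform(-0.5, 0.5)    # random local move
            dE = cost(x_new) - cost(x)
            # Metropolis rule: always accept downhill moves; accept
            # uphill moves with probability exp(-dE/T).
            if dE < 0 or random.random() < math.exp(-dE / T):
                x = x_new
            T *= cooling                             # geometric cooling
        return x

    print(anneal())  # ends near the global minimum of cost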

Discrete Optimization

Dynamic programming (sketch after this list)

NP-complete problems
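
A minimal dynamic-programming sketch (the 0/1 knapsack recursion, a standard textbook example; the weights, values, and capacity are made up):

    def knapsack(weights, values, capacity):
        # best[c] = best value achievable with capacity c using items seen so far
        best = [0] * (capacity + 1)
        for w, v in zip(weights, values):
            # Sweep capacity downward so each item is used at most once.
            for c in range(capacity, w - 1, -1):
                best[c] = max(best[c], best[c - w] + v)
        return best[capacity]

    print(knapsack([2, 3, 4], [3, 4, 6], 6))  # -> 9 (take weights 2 and 4)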


Continuation

Dynamic programming: Werbos

Random optimization

Stochastic approximation

CMAC learning

Gradient descent

K-means (squared-error clustering)
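
A minimal K-means sketch alternating the two squared-error steps (assign each point to its nearest center, then move each center to its cluster mean); the data and K are made up:

    import numpy as np

    def kmeans(X, k, iters=100, seed=0):
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            # Assignment step: nearest center by squared distance.
            d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            labels = d.argmin(axis=1)
            # Update step: each center moves to the mean of its points.
            new = np.array([X[labels == j].mean(axis=0)
                            if (labels == j).any() else centers[j]
                            for j in range(k)])
            if np.allclose(new, centers):
                break
            centers = new
        return centers, labels

    X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 5.0])
    centers, labels = kmeans(X, k=2)
    print(centers)  # roughly [0, 0] and [5, 5]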