This post collects my notes on a number of optimization methods, both first-order and second-order. I have been interested in the theoretical analysis of these methods and wrote several notes on them, and I will keep updating the notes in this post.

Gradient Descent

Gradient Descent and Line Search

Gradient Descent on Well Conditioned Functions
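
For quick reference, here is a minimal sketch of gradient descent with a backtracking (Armijo) line search. It is not taken from the notes above; the objective `f`, its gradient `grad`, and the constants `alpha0`, `beta`, `c` are illustrative choices.

```python
import numpy as np

def gradient_descent(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                     tol=1e-8, max_iter=1000):
    """Gradient descent with backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # stop when the gradient is small
            break
        alpha = alpha0
        # Shrink the step until the Armijo sufficient-decrease condition holds
        while f(x - alpha * g) > f(x) - c * alpha * g.dot(g):
            alpha *= beta
        x = x - alpha * g
    return x

# Illustrative example: a poorly conditioned quadratic f(x) = 0.5 * x^T A x
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
print(gradient_descent(f, grad, x0=[3.0, -2.0]))
```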

Conjugate Descent

Conjugate Descent Method
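
As a companion to the note, here is a small sketch of the linear conjugate gradient method for a quadratic objective 0.5 x^T A x - b^T x with A symmetric positive definite, which is equivalent to solving A x = b. The function name and the test matrix are my own illustrative choices, not part of the note.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Linear conjugate gradient for minimizing 0.5 x^T A x - b^T x
    (equivalently, solving A x = b) with A symmetric positive definite."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x          # residual, equal to the negative gradient
    p = r.copy()           # first search direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact step along direction p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # next A-conjugate direction
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # approximately [0.0909, 0.6364]
```

In exact arithmetic this terminates in at most n steps, one reason the method is usually analyzed on quadratics first.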

Steepest Descent

Steepest Descent
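
Below is a minimal sketch of steepest descent with an exact line search on a convex quadratic, where the minimizing step along -g is alpha = (g^T g) / (g^T A g). The matrix and starting point are illustrative; a larger condition number produces the classic zig-zag behaviour and slow convergence.

```python
import numpy as np

def steepest_descent_quadratic(A, b, x0, tol=1e-10, max_iter=10000):
    """Steepest descent with exact line search for f(x) = 0.5 x^T A x - b^T x,
    with A symmetric positive definite."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = A @ x - b                       # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))     # exact line search step
        x = x - alpha * g
    return x

A = np.diag([1.0, 25.0])    # condition number 25: noticeable zig-zagging
b = np.zeros(2)
print(steepest_descent_quadratic(A, b, x0=[5.0, 1.0]))
```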

Subgradient Methods

Subgradient Methods
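
Here is a small sketch of the subgradient method with a diminishing step size alpha_k = alpha0 / sqrt(k+1), applied to a least-absolute-deviations problem f(x) = ||A x - b||_1. The step-size rule, the data, and the particular subgradient used are illustrative assumptions; since the objective need not decrease monotonically, the best iterate seen so far is returned.

```python
import numpy as np

def subgradient_method(subgrad, f, x0, alpha0=1.0, max_iter=5000):
    """Subgradient method with diminishing step size alpha0 / sqrt(k+1).
    Returns the best iterate, since subgradient steps are not descent steps."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(max_iter):
        g = subgrad(x)
        x = x - (alpha0 / np.sqrt(k + 1)) * g
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Illustrative example: nonsmooth f(x) = ||A x - b||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true
f = lambda x: np.abs(A @ x - b).sum()
subgrad = lambda x: A.T @ np.sign(A @ x - b)   # one valid subgradient
x_hat, f_hat = subgradient_method(subgrad, f, x0=np.zeros(3))
print(x_hat, f_hat)
```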

Quasi-Newton Method

Quasi-Newton Method
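
To close, a minimal sketch of the BFGS quasi-Newton method, which maintains an approximation H of the inverse Hessian and updates it from the difference vectors s = x_{k+1} - x_k and y = grad_{k+1} - grad_k. The backtracking line search, the curvature safeguard, and the Rosenbrock test problem are illustrative choices rather than the exact algorithm from the note.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """BFGS quasi-Newton method with a simple backtracking line search."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                        # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                       # quasi-Newton search direction
        alpha = 1.0
        while f(x + alpha * p) > f(x) + 1e-4 * alpha * g @ p:
            alpha *= 0.5                 # backtracking (Armijo) line search
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                   # skip update if curvature is poor
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Illustrative example: Rosenbrock function, minimized at (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(bfgs(f, grad, x0=np.array([-1.2, 1.0])))
```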