Introduction
The conjugate gradient method lies between the steepest descent method and Newton's method. It uses only first-derivative information, which overcomes the slow convergence of steepest descent while avoiding Newton's method's need to store and invert the Hessian matrix. The conjugate gradient method is not only one of the most useful methods for solving large systems of linear equations, it is also one of the most effective algorithms for large-scale nonlinear optimization. Among optimization algorithms it is a very important one. Its advantages are low memory requirements, finite-step convergence (in exact arithmetic it solves an n-dimensional quadratic problem in at most n steps), high stability, and no need for externally tuned parameters.
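As a concrete illustration of the linear case described above, here is a minimal sketch of the conjugate gradient method for a symmetric positive-definite system A x = b. The function name conjugate_gradient and its parameters are illustrative and not taken from this package:

    import numpy as np

    def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
        """Solve A x = b for symmetric positive-definite A (minimal sketch)."""
        n = b.shape[0]
        x = np.zeros(n) if x0 is None else x0.astype(float)
        if max_iter is None:
            max_iter = n  # in exact arithmetic, CG converges in at most n steps
        r = b - A @ x     # residual, equal to the negative gradient of the quadratic
        p = r.copy()      # initial search direction is the steepest descent direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)      # exact line search step along p
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p  # update to the next A-conjugate direction
            rs_old = rs_new
        return x

    if __name__ == "__main__":
        # small SPD example (hypothetical test data)
        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        x = conjugate_gradient(A, b)
        print(x)                      # approx. [0.0909, 0.6364]
        print(np.allclose(A @ x, b))  # True

Note how the sketch reflects the advantages listed above: it stores only a few vectors of length n (no Hessian or inverse matrix), and it needs no parameters beyond a stopping tolerance.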