In optimization, a descent direction is a vector $\mathbf{p} \in \mathbb{R}^n$ that points towards a local minimum $\mathbf{x}^*$ of an objective function $f : \mathbb{R}^n \to \mathbb{R}$.
Computing $\mathbf{x}^*$ by an iterative method, such as line search, defines a descent direction $\mathbf{p}_k \in \mathbb{R}^n$ at the $k$th iterate to be any $\mathbf{p}_k$ such that $\langle \mathbf{p}_k, \nabla f(\mathbf{x}_k) \rangle < 0$, where $\langle \cdot , \cdot \rangle$ denotes the inner product. The motivation for such an approach is that, by Taylor's theorem, sufficiently small steps along $\mathbf{p}_k$ guarantee that $f$ is reduced.
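To make the definition concrete, here is a minimal Python sketch; the objective `f`, its gradient `grad_f`, the iterate `x_k`, and the candidate direction `p_k` are illustrative assumptions, not taken from the article. It checks the inner-product condition and verifies that a small step along the direction reduces the objective.

```python
import numpy as np

def f(x):
    # Assumed example objective: a simple quadratic bowl.
    return x[0] ** 2 + 3.0 * x[1] ** 2

def grad_f(x):
    # Analytic gradient of the quadratic above.
    return np.array([2.0 * x[0], 6.0 * x[1]])

x_k = np.array([1.0, -2.0])   # current iterate (assumed)
p_k = np.array([-1.0, 5.0])   # candidate direction (assumed)

# Descent condition: <p_k, grad f(x_k)> < 0
print("descent direction:", np.dot(p_k, grad_f(x_k)) < 0)

# By Taylor's theorem, a sufficiently small step along a descent
# direction decreases the objective value.
alpha = 1e-3
print("f decreases:", f(x_k + alpha * p_k) < f(x_k))
```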
Using this definition, the negative of a non-zero gradient is always a descent direction, as $\langle -\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\langle \nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\|\nabla f(\mathbf{x}_k)\|_2^2 < 0$.
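As a quick numerical illustration of that last point, the sketch below (reusing the same assumed quadratic objective as above) takes a few steps along the negative gradient with a fixed, small step size; the inner product of $-\nabla f$ with $\nabla f$ is negative whenever the gradient is non-zero, and the objective value decreases at each iterate.

```python
import numpy as np

def f(x):
    # Same assumed quadratic objective as above.
    return x[0] ** 2 + 3.0 * x[1] ** 2

def grad_f(x):
    return np.array([2.0 * x[0], 6.0 * x[1]])

x = np.array([1.0, -2.0])   # starting point (assumed)
alpha = 0.1                 # fixed, small step size (assumed)

for k in range(5):
    g = grad_f(x)
    # <-g, g> = -||g||^2 < 0 whenever g != 0, so -g is a descent direction.
    assert np.dot(-g, g) < 0
    x = x - alpha * g       # step along the negative gradient
    print(k, f(x))          # the objective shrinks at every iterate
```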