In optimization, a descent direction is a vector $\mathbf{p} \in \mathbb{R}^n$ that points towards a local minimum $\mathbf{x}^*$ of an objective function $f : \mathbb{R}^n \to \mathbb{R}$.
Computing $\mathbf{x}^*$ by an iterative method, such as line search, defines a descent direction $\mathbf{p}_k \in \mathbb{R}^n$ at the $k$th iterate to be any $\mathbf{p}_k$ such that $\langle \mathbf{p}_k, \nabla f(\mathbf{x}_k) \rangle < 0$, where $\langle \cdot , \cdot \rangle$ denotes the inner product. The motivation for such an approach is that small steps along $\mathbf{p}_k$ guarantee that $f$ is reduced, by Taylor's theorem.
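As a concrete illustration, the following sketch checks the descent condition numerically; the quadratic objective, its gradient, the candidate direction, and the step size are illustrative assumptions, not part of the definition.

```python
import numpy as np

# Illustrative objective f(x) = x1^2 + 4*x2^2 and its gradient (assumed for this sketch).
def f(x):
    return x[0]**2 + 4.0 * x[1]**2

def grad_f(x):
    return np.array([2.0 * x[0], 8.0 * x[1]])

x_k = np.array([1.0, 1.0])
p_k = np.array([-1.0, -2.0])              # candidate direction

# Descent condition: <p_k, grad f(x_k)> < 0.
slope = np.dot(p_k, grad_f(x_k))
print("directional derivative:", slope)   # negative, so p_k is a descent direction

# By Taylor's theorem, a sufficiently small step along p_k reduces f.
alpha = 1e-3
print(f(x_k + alpha * p_k) < f(x_k))      # True
```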
Using this definition, the negative of a non-zero gradient is always a descent direction, as $\langle -\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\langle \nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle < 0$.
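A minimal gradient descent loop built on this fact might look as follows; it reuses the illustrative quadratic objective from the sketch above, and the fixed step size is likewise an assumption for the example.

```python
import numpy as np

# Same illustrative objective as above: f(x) = x1^2 + 4*x2^2.
def grad_f(x):
    return np.array([2.0 * x[0], 8.0 * x[1]])

x = np.array([1.0, 1.0])
alpha = 0.1                       # fixed step size, assumed for the sketch
for k in range(50):
    p = -grad_f(x)                # the negative gradient is a descent direction
    x = x + alpha * p
print(x)                          # approaches the minimizer at the origin
```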
Numerous methods exist to compute descent directions, all with differing merits, such as gradient descent or the conjugate gradient method.
More generally, if $P$ is a positive definite matrix, then $\mathbf{p}_k = -P \nabla f(\mathbf{x}_k)$ is a descent direction at $\mathbf{x}_k$, since $\langle -P \nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\nabla f(\mathbf{x}_k)^\top P \, \nabla f(\mathbf{x}_k) < 0$ whenever $\nabla f(\mathbf{x}_k) \neq 0$. This generality is used in preconditioned gradient descent methods.
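The sketch below illustrates the preconditioned case with an arbitrary symmetric positive definite matrix $P$ (chosen only for illustration), again reusing the example gradient from above; the resulting direction has negative inner product with the gradient.

```python
import numpy as np

# Same illustrative gradient as above: f(x) = x1^2 + 4*x2^2.
def grad_f(x):
    return np.array([2.0 * x[0], 8.0 * x[1]])

P = np.array([[2.0, 0.5],
              [0.5, 1.0]])        # illustrative symmetric positive definite matrix
x_k = np.array([1.0, 1.0])
g = grad_f(x_k)
p_k = -P @ g                      # preconditioned descent direction
print(np.dot(p_k, g) < 0)         # True: -g^T P g < 0 because P is positive definite
```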