As the name suggests, gradient-based methods require the gradient of the objective function with respect to the design variables. The gradient can be evaluated by the finite difference method, the linearized method, or the adjoint method. Both the finite difference method and the linearized method have a computational cost proportional to the number of design variables, so they are not suitable for design optimization with a large number of design variables. In addition, the finite difference method suffers from the well-known problem of subtractive cancellation and is not recommended for practical design applications.

Suppose a cost function $J$ is defined as follows,

$J=J(U,\alpha)$

where $U$ and $\alpha$ are the flow variable vector and the design variable vector respectively. $U$ and $\alpha$ are implicitly related through the flow equation, which is represented by a residual function driven to zero.

${R}(U(\alpha),\alpha)=0$
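To make the implicit relation concrete, the following is a minimal sketch with a hypothetical toy model (not from the source): a scalar residual $R(U,\alpha)=U-\alpha^2$ whose root defines $U(\alpha)$, and a cost $J=U+\alpha$ evaluated at the converged state. The function names and the Newton iteration are illustrative assumptions standing in for a flow solver.

```python
def residual(U, alpha):
    """Toy flow residual R(U, alpha) = U - alpha**2 (assumed for illustration)."""
    return U - alpha**2

def solve_flow(alpha, tol=1e-12):
    """Drive R to zero with Newton's method; dR/dU = 1 for this toy residual."""
    U = 0.0
    while abs(residual(U, alpha)) > tol:
        U = U - residual(U, alpha) / 1.0  # Newton update with dR/dU = 1
    return U

def cost(alpha):
    """Cost J(U(alpha), alpha) = U + alpha, evaluated at the converged state U."""
    U = solve_flow(alpha)
    return U + alpha
```

For this toy model $U(\alpha)=\alpha^2$ exactly, so the analytic sensitivity is $\frac{DJ}{D\alpha}=2\alpha+1$, which the methods below can be checked against.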

The sensitivity of the cost function $J$ with respect to the design variables $\alpha$, that is $\frac{D J }{D \alpha_{i}}$, is needed for design purposes. There are three main methods to obtain this sensitivity.

The finite difference method is the most straightforward approach: the sensitivity is approximated by a finite difference of cost function values evaluated at perturbed design variable inputs,

$\frac{DJ}{D\alpha_{i}}\approx\frac{J(\alpha_{i}+\delta \alpha)-J(\alpha_{i})}{\delta \alpha}$

This approach has two drawbacks. First, it is inefficient: the computational cost grows linearly with the number of design variables, which is prohibitively expensive in practice. Second, the interval $\delta\alpha$ is difficult to choose, because it must balance truncation accuracy against machine round-off error.
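The step-size difficulty can be seen on a toy cost function (an illustrative assumption, here $J(\alpha)=\sin\alpha$ with exact derivative $\cos\alpha$): a large $\delta\alpha$ incurs truncation error, while a very small $\delta\alpha$ amplifies round-off through subtractive cancellation.

```python
import math

def forward_diff(J, alpha, delta):
    """One-sided finite difference approximation of dJ/dalpha."""
    return (J(alpha + delta) - J(alpha)) / delta

alpha = 1.0
exact = math.cos(alpha)  # exact derivative of the toy cost J = sin(alpha)
for delta in (1e-1, 1e-8, 1e-15):
    approx = forward_diff(math.sin, alpha, delta)
    print(f"delta={delta:.0e}  error={abs(approx - exact):.2e}")
```

The error is smallest at an intermediate $\delta\alpha$: the largest step is dominated by truncation error, and the smallest step by cancellation between two nearly equal cost values.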

Linearized method: