Newton's method in optimization

Figure: A comparison of gradient descent (green) and Newton's method (red) for minimizing a function (with small step sizes). Newton's method uses curvature information to take a more direct route.

In mathematics, Newton's method is a well-known algorithm for finding roots of equations in one or more dimensions. It can also be used to find local maxima and local minima of functions, by noticing that if a real number x* is a stationary point of a function f(x), then x* is a root of the derivative f'(x), and therefore one can solve for x* by applying Newton's method to f'(x). The second-order Taylor expansion of f(x) around x

f(x+\Delta x) = f(x) + f'(x)\,\Delta x + \frac{1}{2} f''(x)\,\Delta x^2,

attains its extremum when Δx solves the linear equation:

f'(x) + f''(x)\,\Delta x = 0

and f''(x) is positive. Thus, provided that f(x) is a twice-differentiable function and the initial guess x_0 is chosen close enough to x*, the sequence (x_n) defined by

x_{n+1} = x_n - \frac{f'(x_n)}{f''(x_n)}, \ n \ge 0

will converge towards x*.
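As a minimal sketch (not part of the original article), the one-dimensional iteration can be implemented as follows; the test function f(x) = x^4 - 3x^3 + 2, the starting point, and the tolerance are illustrative choices.

    # Sketch of one-dimensional Newton's method for optimization.
    # The test function, starting point and tolerance are illustrative
    # assumptions, not taken from the article above.

    def newton_1d(f_prime, f_double_prime, x0, tol=1e-10, max_iter=50):
        """Iterate x_{n+1} = x_n - f'(x_n)/f''(x_n) until f'(x_n) is small."""
        x = x0
        for _ in range(max_iter):
            g = f_prime(x)
            if abs(g) < tol:
                break
            x = x - g / f_double_prime(x)
        return x

    # Example: minimize f(x) = x**4 - 3*x**3 + 2, whose stationary points
    # satisfy f'(x) = 4*x**3 - 9*x**2 = 0.
    x_star = newton_1d(lambda x: 4*x**3 - 9*x**2,   # f'(x)
                       lambda x: 12*x**2 - 18*x,    # f''(x)
                       x0=3.0)
    print(x_star)  # approaches 9/4 = 2.25, where f''(2.25) > 0 (a local minimum)

Starting from x_0 = 3.0 keeps the iterates away from x = 0, where f''(x) vanishes and the update would be undefined.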

This iterative scheme can be generalized to several dimensions by replacing the derivative with the gradient, \nabla f(\mathbf{x}), and the reciprocal of the second derivative with the inverse of the Hessian matrix, H f(\mathbf{x}). One obtains the iterative scheme

\mathbf{x}_{n+1} = \mathbf{x}_n - [H f(\mathbf{x}_n)]^{-1} \nabla f(\mathbf{x}_n), \ n \ge 0.

Usually Newton's method is modified to include a small step size 0 < γ ≤ 1 instead of γ = 1:

\mathbf{x}_{n+1} = \mathbf{x}_n - \gamma[H f(\mathbf{x}_n)]^{-1} \nabla f(\mathbf{x}_n).

This is often done to ensure that the Wolfe conditions are satisfied at each step \mathbf{x}_n \to \mathbf{x}_{n+1} of the iteration.
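A minimal sketch of the multidimensional iteration above, assuming NumPy; the Rosenbrock test function, the starting point, the iteration count, and the default γ = 1 (which recovers the unmodified scheme) are illustrative choices, not part of the article.

    # Sketch of the (optionally damped) multidimensional Newton iteration
    # x_{n+1} = x_n - gamma * [H f(x_n)]^{-1} grad f(x_n), assuming NumPy.
    # The Rosenbrock test function, starting point and iteration count are
    # illustrative assumptions.
    import numpy as np

    def newton_step(grad, hess, x, gamma=1.0):
        """One Newton step; solves H p = grad rather than forming H^{-1}."""
        p = np.linalg.solve(hess(x), grad(x))
        return x - gamma * p

    # Rosenbrock function f(x, y) = (1 - x)^2 + 100 (y - x^2)^2
    def grad(v):
        x, y = v
        return np.array([-2*(1 - x) - 400*x*(y - x**2),
                         200*(y - x**2)])

    def hess(v):
        x, y = v
        return np.array([[2 - 400*(y - x**2) + 800*x**2, -400*x],
                         [-400*x, 200.0]])

    x = np.array([-1.2, 1.0])
    for _ in range(20):
        x = newton_step(grad, hess, x)
    print(x)  # close to the minimizer (1, 1)

Passing gamma < 1 gives the damped variant; in practice γ is usually chosen by a line search so that conditions such as the Wolfe conditions hold.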

The geometric interpretation of Newton's method is that at each iteration one approximates f(\mathbf{x}) by a quadratic function around \mathbf{x}_n, and then takes a step towards the maximum/minimum of that quadratic function. (If f(\mathbf{x}) happens to be a quadratic function, then the exact extremum is found in one step.)
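For example (with the quadratic chosen here purely for illustration), if f(\mathbf{x}) = \frac{1}{2}\mathbf{x}^T A \mathbf{x} - \mathbf{b}^T \mathbf{x} with A symmetric and invertible, then \nabla f(\mathbf{x}) = A\mathbf{x} - \mathbf{b} and H f(\mathbf{x}) = A, so a single step with γ = 1 from any starting point \mathbf{x}_0 gives

\mathbf{x}_1 = \mathbf{x}_0 - A^{-1}(A\mathbf{x}_0 - \mathbf{b}) = A^{-1}\mathbf{b},

which is exactly the stationary point of f (a minimum when A is positive definite).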

Newton's method converges much faster towards a local maximum or minimum than gradient descent. In fact, every local minimum has a neighborhood N such that, if we start with \mathbf{x}_0 \in N, Newton's method with step size γ = 1 converges quadratically (provided the Hessian is invertible, and a Lipschitz continuous function of \mathbf{x}, in that neighborhood).

Finding the inverse of the Hessian in high dimensions is an expensive operation, so the Newton step \mathbf{p}_n = \mathbf{x}_{n+1} - \mathbf{x}_n is usually computed instead as the solution of the system of linear equations

[H f(\mathbf{x}_n)]\, \mathbf{p}_n = -\nabla f(\mathbf{x}_n), \ n \ge 0,

which is often solved approximately (but to great accuracy) using an iterative method such as conjugate gradient. There also exist various quasi-Newton methods, where an approximation for the Hessian is used instead.
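As a sketch of this idea (assuming NumPy; the hand-written conjugate gradient routine, the example Hessian and gradient, and the tolerances are illustrative, not a prescribed implementation), the step \mathbf{p}_n can be obtained without ever forming the inverse Hessian:

    # Sketch: obtain the Newton step p_n by (approximately) solving
    # [H f(x_n)] p_n = -grad f(x_n) with the conjugate gradient method,
    # valid when the Hessian is symmetric positive definite.
    # The example matrices and tolerances are illustrative assumptions.
    import numpy as np

    def conjugate_gradient(H, b, tol=1e-10, max_iter=None):
        """Solve H x = b for symmetric positive-definite H."""
        x = np.zeros_like(b)
        r = b - H @ x            # residual
        d = r.copy()             # search direction
        for _ in range(max_iter or b.size):
            Hd = H @ d
            alpha = (r @ r) / (d @ Hd)
            x = x + alpha * d
            r_new = r - alpha * Hd
            if np.linalg.norm(r_new) < tol:
                break
            beta = (r_new @ r_new) / (r @ r)
            d = r_new + beta * d
            r = r_new
        return x

    H = np.array([[4.0, 1.0], [1.0, 3.0]])   # example Hessian H f(x_n)
    g = np.array([1.0, 2.0])                 # example gradient grad f(x_n)
    p = conjugate_gradient(H, -g)            # Newton step: x_{n+1} = x_n + p
    print(p)                                 # agrees with -np.linalg.solve(H, g)

In large problems the Hessian is typically applied only through Hessian-vector products, which is what makes conjugate gradient attractive here.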

If the Hessian is close to a non-invertible matrix, the inverted Hessian can be numerically unstable and the iteration may diverge. In this case, certain workarounds have been tried, with varying success on particular problems. One can, for example, modify the Hessian by adding a correction matrix B_n so as to make H f(\mathbf{x}_n) + B_n positive definite. One approach is to diagonalize H f(\mathbf{x}_n) and choose B_n so that H f(\mathbf{x}_n) + B_n has the same eigenvectors as the Hessian, but with each negative eigenvalue replaced by some ε > 0.
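A minimal sketch of the eigenvalue-replacement idea just described, assuming NumPy; the cutoff ε and the example matrix are illustrative choices.

    # Sketch of the Hessian correction described above: diagonalize H f(x_n)
    # and raise every eigenvalue below a small epsilon up to epsilon, giving
    # a positive-definite matrix with the same eigenvectors.
    # The epsilon value and the example matrix are illustrative assumptions.
    import numpy as np

    def make_positive_definite(H, eps=1e-6):
        """Return H + B_n where B_n lifts eigenvalues below eps up to eps."""
        eigvals, eigvecs = np.linalg.eigh(H)         # H assumed symmetric
        eigvals_fixed = np.maximum(eigvals, eps)     # replace lambda < eps by eps
        return eigvecs @ np.diag(eigvals_fixed) @ eigvecs.T

    H = np.array([[1.0, 2.0], [2.0, 1.0]])   # eigenvalues 3 and -1: indefinite
    H_mod = make_positive_definite(H)
    print(np.linalg.eigvalsh(H_mod))         # approximately [1e-06, 3.0]

Note that this variant also raises very small positive eigenvalues to ε, which keeps the corrected matrix safely invertible.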
