Runge's phenomenon
In the mathematical field of numerical analysis, Runge's phenomenon (German: [ˈʁʊŋə]) is a problem of oscillation at the edges of an interval that occurs when using polynomial interpolation with polynomials of high degree over a set of equispaced interpolation points. It was discovered by Carl David Tolmé Runge (1901) when exploring the behavior of errors when using polynomial interpolation to approximate certain functions. The discovery shows that going to higher degrees does not always improve accuracy. The phenomenon is similar to the Gibbs phenomenon in Fourier series approximations.
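As a concrete illustration, the following is a minimal Python sketch (not from Runge's 1901 paper): it interpolates the classic example function 1/(1 + 25x²), the standard test case associated with the phenomenon, at 17 equispaced nodes on [-1, 1] and compares the degree-16 interpolant with the function near the centre and near the edge of the interval. The function, interval, and degree are illustrative choices, not taken from the text above.

```python
import numpy as np

def runge(x):
    # Classic example function 1/(1 + 25 x^2); the standard illustration
    # of the phenomenon (an assumption of this sketch, not stated above).
    return 1.0 / (1.0 + 25.0 * x**2)

def lagrange_eval(nodes, values, x):
    # Evaluate the unique degree-n polynomial interpolating (nodes, values)
    # at the points x, using the Lagrange form directly.
    x = np.asarray(x, dtype=float)
    total = np.zeros_like(x)
    for i, xi in enumerate(nodes):
        basis = np.ones_like(x)
        for j, xj in enumerate(nodes):
            if j != i:
                basis *= (x - xj) / (xi - xj)
        total += values[i] * basis
    return total

n = 16                                   # polynomial degree
nodes = np.linspace(-1.0, 1.0, n + 1)    # n + 1 equispaced nodes on [-1, 1]
values = runge(nodes)

# The interpolant matches f well near the centre of the interval but
# deviates strongly near the edges.
for x0 in (0.0, 0.3, 0.96):
    p = lagrange_eval(nodes, values, [x0])[0]
    print(f"x = {x0:4.2f}   f(x) = {runge(x0):8.5f}   P_16(x) = {p:10.5f}")
```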
The Weierstrass approximation theorem states that for every continuous function $f(x)$ defined on an interval $[a, b]$, there exists a set of polynomial functions $P_n(x)$ for $n = 0, 1, 2, \ldots$, each of degree at most $n$, that approximates $f(x)$ with uniform convergence over $[a, b]$ as $n$ tends to infinity. This can be expressed as:

$$\lim_{n \to \infty} \left( \max_{a \le x \le b} \left| f(x) - P_n(x) \right| \right) = 0.$$
Consider the case where one desires to interpolate through $n + 1$ equispaced points of a function $f(x)$ using the $n$-degree polynomial $P_n(x)$ that passes through those points. Naturally, one might expect from Weierstrass' theorem that using more points would lead to a more accurate reconstruction of $f(x)$. However, this particular set of polynomial functions $P_n(x)$ is not guaranteed to have the property of uniform convergence; the theorem only states that a set of polynomial functions exists, without providing a general method of finding one.
The $P_n(x)$ produced in this manner may in fact diverge away from $f(x)$ as $n$ increases; this divergence typically occurs in an oscillating pattern that magnifies near the ends of the interpolation interval. The discovery of this phenomenon is attributed to Runge.
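This divergence can be checked numerically. The sketch below is again illustrative and assumes the same example function 1/(1 + 25x²) on [-1, 1]; it evaluates the equispaced interpolant with SciPy's BarycentricInterpolator (a numerically stable way to evaluate the same interpolating polynomial) and reports the maximum error on a fine grid, which grows rather than shrinks as the degree $n$ increases.

```python
import numpy as np
from scipy.interpolate import BarycentricInterpolator

def runge(x):
    # Classic example function 1/(1 + 25 x^2) (an assumption of this sketch).
    return 1.0 / (1.0 + 25.0 * x**2)

xs = np.linspace(-1.0, 1.0, 2001)   # fine grid for measuring the error

for n in (4, 8, 12, 16, 20):
    nodes = np.linspace(-1.0, 1.0, n + 1)          # n + 1 equispaced nodes
    P_n = BarycentricInterpolator(nodes, runge(nodes))
    max_err = np.max(np.abs(runge(xs) - P_n(xs)))  # max |f - P_n| on [-1, 1]
    print(f"n = {n:2d}   max error = {max_err:.3e}")
```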