06. Solution of Non-Linear Equations (Newton-Raphson Method)
Dated: 15-08-2025
The simplest way to derive the Newton-Raphson formula is to use the first two terms of the Taylor series:
\[
f(x_{n+1}) = f(x_n) + (x_{n+1} - x_n)f^\prime(x_n)
\]
Setting \(f(x_{n + 1}) = 0\) gives
\[
f(x_n) + (x_{n+1} - x_n)f^\prime(x_n) = 0
\]
\[
x_{n+1} = x_n - \frac{f(x_n)}{f^\prime(x_n)} \quad \text{for } n=0,1,2,\ldots
\]
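As an illustration, here is a minimal Python sketch of this iteration. The helper name `newton_raphson`, the tolerance, and the iteration cap are illustrative choices, not part of the notes.

```python
def newton_raphson(f, df, x0, tol=1e-6, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until |f(x_n)| < tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        dfx = df(x)
        if dfx == 0:
            raise ZeroDivisionError("f'(x) = 0: the Newton-Raphson step is undefined")
        x = x - fx / dfx
    return x
```

For example, `newton_raphson(lambda x: x**3 - x - 1, lambda x: 3*x**2 - 1, 2.0)` returns approximately 1.324718, the root found by hand later in this section.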
Geometrical Interpretation
Let the curve \(y = f(x)\) meet the \(x\)-axis at \(x = \alpha\), so that \(f(\alpha) = 0\), and let \(x_0\) be a point near \(\alpha\). The equation of the tangent to the curve at \(x_0\) is
\[
y - f(x_0) = f^\prime(x_0)(x - x_0)
\]
This tangent cuts the \(x\)-axis (where \(y = 0\)) at
\[
x_1 = x_0 - \frac{f(x_0)}{f^\prime(x_0)}
\]
Similarly,
\[
x_2 = x_1 - \frac{f(x_1)}{f^\prime(x_1)}
\]
Notes
- When \(f^\prime(x)\) is very large, i.e. the slope is large, the correction \(h = x_{n+1} - x_n = -\frac{f(x_n)}{f^\prime(x_n)}\) is small (as assumed in the Taylor expansion), and hence the root can be calculated in even less time.
- If \(x_0\) is close to the root, then we get the root very quickly.
- The process will evidently fail if \(f^\prime(x) = 0\) in the neighborhood of the root. In such cases the regula-falsi method should be used.
- If the initial approximation to the root is not given, choose two values, say \(a\) and \(b\), such that \(f(a) \times f(b) < 0\). If \(|f(a)| < |f(b)|\), then take \(a\) as the initial approximation (see the sketch after this list).
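A minimal sketch of the rule in the last note, assuming a hypothetical helper `choose_initial_guess` (the worked example below instead takes the endpoint where \(f\) and \(f^{\prime\prime}\) have the same sign):

```python
def choose_initial_guess(f, a, b):
    """Pick x0 from a bracketing pair [a, b] with f(a) * f(b) < 0.

    Following the note above: take the endpoint where |f| is smaller,
    on the assumption that it lies closer to the root.
    """
    if f(a) * f(b) >= 0:
        raise ValueError("a and b must bracket a root: need f(a) * f(b) < 0")
    return a if abs(f(a)) < abs(f(b)) else b
```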
The Newton-Raphson method is also referred to as the method of tangents.
Example
Find a real root of the equation \(x^3 - x - 1 = 0\) using the Newton-Raphson method, correct to four decimal places.
Solution
\[
f(x) = x^3 - x - 1
\]
\[
f(1) = 1^3 - 1 - 1 = -1 < 0
\]
\[
f(2) = 2^3 - 2 - 1 = 8 - 2 - 1 = 5 > 0
\]
So the root lies in \([1, 2]\).
\[f^\prime(x) = 3x^2 - 1\]
\[f^{\prime\prime}(x) = 6x\]
\[f^\prime(1) = 3 \times 1^2 - 1 = 2\]
\[f^\prime(2) = 3 \times 2^2 - 1 = 11\]
\[f^{\prime\prime}(1) = 6(1) = 6\]
\[f^{\prime\prime}(2) = 6(2) = 12\]
Here \(f(2)\) and \(f^{\prime\prime}(2)\) have the same sign, so we take \(x_0 = 2\).
\[x_1 = x_0 - \frac{f(x_0)}{f^\prime(x_0)} = 2 - \frac 5 {11} = 1.54545\]
\[f(x_1) = 1.54545^3 - 1.54545 - 1 = 3.69121 - 1.54545 - 1 = 1.14576\]
\[f^\prime(x_1) = 3x_1^2 - 1 = 3(1.54545)^2 - 1 = 3(2.38842) - 1 = 7.16525 - 1 = 6.16525\]
\[x_2 = 1.54545 - \frac {1.14576}{6.16525} = 1.35961\]
\[f(x_2) = 1.35961^3 - 1.35961 - 1 = 2.51330 - 1.35961 - 1 = 0.15369\]
\[f^\prime(x_2) = 3x_2^2 - 1 = 3(1.35961)^2 - 1 = 3(1.84854) - 1 = 5.54562 - 1 = 4.54562\]
\[x_3 = 1.35961 - \frac{0.15369}{4.54562} = 1.32579, \quad f(x_3) = 4.60959 \times 10^{-3}, \quad f^\prime(x_3) = 4.27316\]
\[x_4 = 1.32579 - \frac{4.60959 \times 10^{-3}}{4.27316} = 1.32471, \quad f(x_4) = -3.39345 \times 10^{-5}, \quad f^\prime(x_4) = 4.26457\]
\[x_5 = 1.32471 + \frac {3.39345 \times 10^{-5}}{4.26457} = 1.324718, \quad f(x_5) = 1.823 \times 10^{-7}\]
The required root is \(1.3247\).
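The hand computation above can be checked numerically; a minimal sketch (the loop and variable names are mine) follows.

```python
# Reproduce the Newton-Raphson iterates for f(x) = x^3 - x - 1 with x0 = 2.
f = lambda x: x**3 - x - 1
df = lambda x: 3 * x**2 - 1

x = 2.0
for n in range(5):
    x = x - f(x) / df(x)
    print(f"x_{n + 1} = {x:.6f}")
# The iterates 1.545455, 1.359615, 1.325801, 1.324719, 1.324718
# agree with the tabulated values to four decimal places.
```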
The Newton-Raphson method falls in the category of open methods, since no bracketing of an interval is required.
Drawbacks of the N-R Method
Divergence at Inflection Points
If the initial guess or an iterated value happens to be close to an inflection point of \(f(x)\), where \(f^{\prime\prime}(x) = 0\), the iterates may begin to diverge away from the root.
\[x_{i + 1} = x_i - \frac{f(x_i)}{f^\prime(x_i)}\]
Division by Zero or Near Zero
If, during an iteration, we come across division by zero or by a number close to zero, the next iterate \(x_{i+1}\) acquires a very large magnitude.
Root Jumping
In some cases where the function \(f(x)\) is oscillating and has a number of roots, one may choose an initial guess close to a particular root. However, the iterates may jump and converge to some other root.
Oscillations near Local Maximum and Minimum
Results obtained from the N-R method may oscillate about a local maximum or minimum without converging on a root, hovering around the local extremum instead. Eventually, this may lead to division by a number close to zero, and the iteration may diverge.
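A small illustration of this failure mode (my own example, not from the notes): \(f(x) = x^2 + 2\) has a minimum at \(x = 0\) and no real root, so the iterates can only wander around the minimum, with large jumps whenever \(f^\prime(x)\) is close to zero.

```python
# f(x) = x^2 + 2: minimum at x = 0, no real root.
f = lambda x: x**2 + 2
df = lambda x: 2 * x

x = 0.5
for n in range(6):
    x = x - f(x) / df(x)
    print(f"x_{n + 1} = {x: .4f}")
# Output oscillates between positive and negative values (-1.75, -0.30, 3.14,
# 1.25, -0.17, 5.74, ...): the iteration never converges, and whenever x is
# near 0 the division by f'(x) = 2x produces a large jump.
```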
Convergence of the N-R Method
\[\because x_{n + 1} = x_n - \frac{f(x_n)}{f^\prime(x_n)}\]
\[\because x_{n+1} = \phi(x_n)\]
\[\phi(x_n) = x_n - \frac{f(x_n)}{f^\prime(x_n)}\]
\[\phi(x) = x - \frac{f(x)}{f^\prime(x)}\]
The iteration method converges if
\[|\phi^\prime(x)| < 1\]
Here
\[
\phi^\prime(x) = 1 - \frac{[f^\prime(x)]^2 - f(x) f^{\prime\prime}(x)}{[f^\prime(x)]^2} = \frac{f(x) f^{\prime\prime}(x)}{[f^\prime(x)]^2}
\]
Therefore, the N-R formula converges provided
\[|f(x) f^{\prime\prime}(x)| < |f^\prime(x)|^2\]
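As a quick numerical check (my own illustration, not in the notes), this condition can be evaluated for the example \(f(x) = x^3 - x - 1\) used earlier:

```python
# Check |f(x) f''(x)| < [f'(x)]^2 for f(x) = x^3 - x - 1 at a few points.
f   = lambda x: x**3 - x - 1
df  = lambda x: 3 * x**2 - 1
d2f = lambda x: 6 * x

for x in (1.0, 1.3247, 2.0):
    lhs = abs(f(x) * d2f(x))
    rhs = df(x) ** 2
    print(f"x = {x}: |f f''| = {lhs:.4f}, (f')^2 = {rhs:.4f}, holds = {lhs < rhs}")
# The condition holds at x = 2 (the starting guess used above) and near the
# root 1.3247, but fails at x = 1, where |f f''| = 6 exceeds (f')^2 = 4.
```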
Definition
Let
\[x_n = \alpha + \epsilon_n\]
\[x_{n + 1} = \alpha + \epsilon_{n + 1}\]
Where \(f(\alpha) = 0\)
\[\epsilon_{n + 1} = K \epsilon_n^p\]
Terms
- \(\epsilon_n\) is the error involved at the \(n^{th}\) step.
- \(K\) is a constant.
- \(p\) is the rate of convergence of the method.
\[\because x_{n + 1} = x_n - \frac{f(x_n)}{f^\prime(x_n)}\]
\[
\alpha + \epsilon_{n+1} = \alpha + \epsilon_n - \frac{f(\alpha+\epsilon_n)}{f'(\alpha+\epsilon_n)}
\]
\[
\epsilon_{n+1} = \epsilon_n - \frac{f(\alpha+\epsilon_n)}{f'(\alpha+\epsilon_n)} = \frac{\epsilon_n f'(\alpha+\epsilon_n) - f(\alpha+\epsilon_n)}{f'(\alpha+\epsilon_n)}
\]
Using Taylor's series, we have
\[
\epsilon_{n+1} = \frac{1}{f'(\alpha) + \epsilon_n f''(\alpha) + \dots} \left\{ \epsilon_n [f'(\alpha) + \epsilon_n f''(\alpha) + \dots] - \left[ f(\alpha) + \epsilon_n f'(\alpha) + \frac{\epsilon_n^2}{2} f''(\alpha) + \dots \right] \right\}
\]
Since \(f(\alpha) = 0\), the expression simplifies to
\[
\epsilon_{n+1} = \frac{\epsilon_n^2}{2} f''(\alpha) \frac{1}{f'(\alpha) + \epsilon_n f''(\alpha)}
\]
Dividing the numerator and denominator by \(f^\prime(\alpha)\),
\[
\epsilon_{n+1} = \frac{\epsilon_n^2}{2} f''(\alpha) \frac{1}{f'(\alpha) + \epsilon_n f''(\alpha)} \cdot \frac{f^\prime(\alpha)}{f^\prime(\alpha)}
\]
\[
\epsilon_{n+1} = \frac{\epsilon_n^2}{2} \frac{f''(\alpha)}{f^\prime(\alpha)} \cdot \frac{f^\prime(\alpha)}{f'(\alpha) + \epsilon_n f''(\alpha)}
\]
\[
\epsilon_{n+1} = \frac{\epsilon_n^2}{2} \frac{f''(\alpha)}{f^\prime(\alpha)} \left(\frac{f'(\alpha) + \epsilon_n f''(\alpha)}{f^\prime(\alpha)}\right)^{-1}
\]
\[
\epsilon_{n+1} = \frac{\epsilon_n^2}{2} \frac{f''(\alpha)}{f^\prime(\alpha)} \left(1 + \frac{\epsilon_n f''(\alpha)}{f^\prime(\alpha)}\right)^{-1}
\]
Using the binomial expansion \((1 + x)^{-1} \approx 1 - x\) for \(|x| \ll 1\):
- \((1 + x)^{-1} = 1 - x + x^2 - x^3 + \cdots\)
- \((1 + x)^{-1} = (1 - x) + O(x^2)\) as \(x \to 0\)
\[
\epsilon_{n+1} = \frac{\epsilon_n^2}{2} \frac{f''(\alpha)}{f^\prime(\alpha)} \left(1 - \frac{\epsilon_n f''(\alpha)}{f^\prime(\alpha)} + O\left(\epsilon_n^2\right)\right)
\]
\[
\epsilon_{n + 1} = \frac{\epsilon_n^2}{2} \frac{f^{\prime\prime}(\alpha)}{f^\prime(\alpha)} - \frac{\epsilon_n^3}{2}\left(\frac{f^{\prime\prime}(\alpha)}{f^\prime(\alpha)}\right)^2 + O\left(\epsilon_n^4\right)
\]
\[
\epsilon_{n + 1} = \frac{\epsilon_n^2}{2} \frac{f^{\prime\prime}(\alpha)}{f^\prime(\alpha)} + O\left(\epsilon_n^3\right)
\]
Neglecting the terms of order \(\epsilon_n^3\) and higher, we get
\[\epsilon_{n + 1} = K \epsilon_n^2\]
\[K = \frac{f^{\prime\prime}(\alpha)}{2f^\prime(\alpha)}\]
This shows that the N-R method has second-order convergence, i.e. it converges quadratically.
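The quadratic convergence can also be observed numerically (my own check, reusing the example \(f(x) = x^3 - x - 1\)); the ratio \(\epsilon_{n+1} / \epsilon_n^2\) should settle near \(K = \frac{f^{\prime\prime}(\alpha)}{2 f^\prime(\alpha)} \approx 0.93\).

```python
# Empirical check of epsilon_{n+1} ~ K * epsilon_n^2 for f(x) = x^3 - x - 1.
alpha = 1.3247179572447460  # root of x^3 - x - 1, precomputed to high accuracy
f  = lambda x: x**3 - x - 1
df = lambda x: 3 * x**2 - 1

x = 2.0
prev_err = abs(x - alpha)
for n in range(5):
    x = x - f(x) / df(x)
    err = abs(x - alpha)
    print(f"n = {n + 1}: error = {err:.3e}, error / prev_error^2 = {err / prev_err**2:.4f}")
    prev_err = err
# The later ratios approach K = f''(alpha) / (2 f'(alpha)) ~ 0.93,
# consistent with second-order convergence.
```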
Example
Set up Newton's scheme of iteration for finding the square root of a positive number \(N\).
Solution
\[x = \sqrt N\]
\[x^2 = N\]
\[x^2 - N = 0\]
\[f(x) = x^2 - N\]
By Newton's method, we have
\[x_{n + 1} = x_n - \frac{f(x_n)}{f^\prime(x_n)}\]
\[\because f(x) = x^2 - N, \quad f^\prime(x) = 2x\]
\[ x_{n+1} = x_n - \frac{x_n^2 - N}{2x_n} = \frac{1}{2} \left( x_n + \frac{N}{x_n} \right) \]
Example
Evaluate \(\sqrt{12}\) by Newton's formula.
Solution
\[\because \sqrt 9 = 3, \quad \sqrt{16} = 4\]
\[x_0 = \frac {3 + 4}{2} = 3.5\]
\[
x_1 = \frac{1}{2} \left( x_0 + \frac{N}{x_0} \right) = \frac{1}{2} \left( 3.5 + \frac{12}{3.5} \right) = 3.4643
\]
\[
x_2 = \frac{1}{2} \left( 3.4643 + \frac{12}{3.4643} \right) = 3.4641
\]
\[
x_3 = \frac{1}{2} \left( 3.4641 + \frac{12}{3.4641} \right) = 3.4641
\]
\[\therefore \sqrt{12} = 3.4641\]
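The same scheme can be checked in a few lines of Python (my own illustration):

```python
# Newton's scheme for sqrt(N): x_{n+1} = (x_n + N / x_n) / 2
N = 12.0
x = 3.5  # initial guess: midpoint of sqrt(9) = 3 and sqrt(16) = 4
for n in range(3):
    x = (x + N / x) / 2
    print(f"x_{n + 1} = {x:.4f}")
# Prints 3.4643, 3.4641, 3.4641, matching the hand computation above.
```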