Newton's Method Calculator
Find roots of any equation using the Newton-Raphson method. Pick a preset function or enter your own, set an initial guess, and watch the method converge step by step with full iteration details.
f(x) = x³ - 2
f'(x) = 3x²
Step-by-Step Iterations
| n | x_n | f(x_n) | f'(x_n) | x_{n+1} | |x_{n+1} - x_n| |
|---|---|---|---|---|---|
| 1 | 1.0000000000 | -1.0000e+0 | 3.0000e+0 | 1.3333333333 | 3.3333e-1 |
| 2 | 1.3333333333 | 3.7037e-1 | 5.3333e+0 | 1.2638888889 | 6.9444e-2 |
| 3 | 1.2638888889 | 1.8955e-2 | 4.7922e+0 | 1.2599334934 | 3.9554e-3 |
| 4 | 1.2599334934 | 5.9259e-5 | 4.7623e+0 | 1.2599210500 | 1.2443e-5 |
| 5 | 1.2599210500 | 5.8526e-10 | 4.7622e+0 | 1.2599210499 | 1.2290e-10 |
Result
Approximate Root
1.2599210499
Known root (∛2)
1.2599210499
Newton-Raphson Formula
x_{n+1} = x_n - f(x_n) / f'(x_n)
Start with an initial guess x₀ near the root. Each iteration refines the estimate by moving along the tangent line to the x-axis. Convergence is quadratic when close to the root.
Step-by-Step Table
See every iteration with x_n, f(x_n), f'(x_n), x_{n+1} and the error at each step. Understand exactly how convergence happens.
Custom Functions
Enter any expression with x as the variable. Supports powers, trig, log, exp, sqrt and more. Derivative computed automatically.
8 Preset Functions
Start with classic examples like x^3 - 2, cos(x) - x, or e^x - 3. Great for learning or checking your homework.
How to Use Newton's Method
Newton's method finds where a function crosses zero. You give it a starting point and it repeatedly improves the estimate until it lands on the root.
Understanding Newton-Raphson: A Complete Guide
The Core Idea
Suppose you have a function f(x) and you want to find where it equals zero. Newton's method starts with a guess x₀, draws the tangent line to the curve at that point, and finds where the tangent crosses the x-axis. That crossing point becomes the next guess x₁. Repeat the process and each new guess gets closer to the actual root.
The formula is straightforward: x_{n+1} = x_n - f(x_n) / f'(x_n). You evaluate the function and its derivative at the current guess, then subtract the ratio. That single line of math does all the work.
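That single line of math translates directly into a short loop. The sketch below is illustrative, not the calculator's actual implementation; the names `f`, `df`, and `newton` are chosen for this example, using the page's f(x) = x³ - 2.

```python
# Minimal sketch of the Newton-Raphson iteration for f(x) = x^3 - 2,
# whose derivative is f'(x) = 3x^2.

def f(x):
    return x**3 - 2

def df(x):
    return 3 * x**2

def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Apply x_{n+1} = x_n - f(x_n)/f'(x_n) until the step is below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

root = newton(f, df, 1.0)
print(root)  # converges to the cube root of 2 ≈ 1.2599210499
```

Starting from x₀ = 1.0 reproduces the iteration table shown above.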
Why It Converges So Fast
Near a simple root where f'(root) is not zero, Newton's method has quadratic convergence. In practical terms, the number of accurate decimal places roughly doubles every step. If iteration 4 gives you 6 correct digits, iteration 5 gives you about 12. Most problems reach full machine precision in under 10 steps, making it one of the fastest root-finding algorithms available.
This speed comes from the fact that the tangent line is the best linear approximation to the curve at any point. Each step uses local information (the slope) to make an informed correction, rather than simply narrowing an interval like the bisection method.
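The digit-doubling behavior is easy to observe directly. This small loop (an illustrative sketch, not part of the calculator) prints the absolute error at each step for f(x) = x³ - 2 starting from x₀ = 1; each printed error is roughly the square of the previous one.

```python
# Track the absolute error |x_n - root| at each Newton step for
# f(x) = x^3 - 2, starting from x0 = 1.
root = 2 ** (1 / 3)
x = 1.0
for n in range(1, 6):
    x = x - (x**3 - 2) / (3 * x**2)
    print(n, abs(x - root))
```

By the fifth step the error is already at the limit of double precision.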
When It Goes Wrong
Newton's method is not foolproof. Here are the main failure modes:
- Zero derivative: If f'(x_n) = 0, the tangent line is horizontal and never crosses the x-axis. The update would require dividing by zero, so the method breaks down at that point.
- Bad initial guess: Starting too far from the root can send the iterates spiraling away. The method may jump to a completely different region of the function.
- Cycling: For some functions and starting points, the iterates bounce back and forth between two values without settling down.
- Multiple roots: If the root has multiplicity greater than 1 (the function just touches zero without crossing), convergence slows to linear instead of quadratic.
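The cycling failure mode can be reproduced with a classic textbook example (not one of this page's presets): f(x) = x³ - 2x + 2 started at x₀ = 0 bounces between 0 and 1 forever.

```python
# Demonstration of Newton's method cycling: for f(x) = x^3 - 2x + 2,
# the iterates from x0 = 0 alternate between 0 and 1 indefinitely.
def f(x):
    return x**3 - 2 * x + 2

def df(x):
    return 3 * x**2 - 2

x = 0.0
history = []
for _ in range(6):
    history.append(x)
    x = x - f(x) / df(x)
print(history)  # [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
```

From 0 the tangent lands exactly on 1, and from 1 it lands exactly back on 0, so the method never settles.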
Practical Applications
Newton's method shows up everywhere in applied mathematics and engineering:
- Square roots: The Babylonian method for computing square roots is a special case. Set f(x) = x^2 - c and the iteration becomes x_{n+1} = (x_n + c/x_n) / 2.
- Optimization: Finding where the derivative of a function is zero (critical points) turns an optimization problem into a root-finding problem for f'(x).
- Physics and engineering: Solving implicit equations in fluid dynamics, structural analysis, and circuit design often relies on Newton-Raphson iteration.
- Computer graphics: Ray tracing against implicit surfaces uses Newton's method to find intersection points efficiently.
- Finance: Computing internal rate of return (IRR) and implied volatility in options pricing both use Newton-Raphson.
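The square-root special case is worth seeing in code. Substituting f(x) = x² - c and f'(x) = 2x into the Newton formula simplifies to the Babylonian update below; the function name `babylonian_sqrt` is just for this sketch.

```python
# Babylonian method: Newton's iteration for f(x) = x^2 - c simplifies to
#   x_{n+1} = x_n - (x_n^2 - c) / (2 x_n) = (x_n + c / x_n) / 2
def babylonian_sqrt(c, x0=1.0, tol=1e-12):
    x = x0
    while abs(x * x - c) > tol:
        x = (x + c / x) / 2
    return x

print(babylonian_sqrt(2))  # ≈ 1.4142135623730951
```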
Comparison with Other Methods
The bisection method is the safest alternative. It requires an interval [a, b] where f(a) and f(b) have opposite signs, then repeatedly halves the interval. Convergence is guaranteed but slow (linear, gaining about one decimal per 3.3 iterations).
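For comparison, here is a minimal bisection sketch (illustrative, with names chosen for this example). It needs only a sign change on [a, b], and convergence is guaranteed but much slower than Newton's.

```python
# Bisection: repeatedly halve an interval [a, b] where f changes sign.
def bisect(f, a, b, tol=1e-10):
    fa = f(a)
    while b - a > tol:
        m = (a + b) / 2
        fm = f(m)
        if fa * fm <= 0:
            b = m          # root lies in [a, m]
        else:
            a, fa = m, fm  # root lies in [m, b]
    return (a + b) / 2

root = bisect(lambda x: x**3 - 2, 1.0, 2.0)
print(root)  # ≈ 1.2599210499
```

Reaching the same tolerance takes roughly 34 halvings here, versus about 5 Newton steps in the table above.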
The secant method is a close relative that avoids computing the derivative. Instead of f'(x_n), it uses the slope between the two most recent points. Convergence is superlinear (order about 1.618) rather than quadratic, but it does not require a derivative formula.
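A secant-method sketch looks almost identical to Newton's, except the derivative is replaced by a finite-difference slope. This is an illustrative implementation, not a library routine.

```python
# Secant method: use the slope between the two most recent iterates
# in place of f'(x_n), so no derivative formula is required.
def secant(f, x0, x1, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:  # flat secant line; cannot take another step
            return x1
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

print(secant(lambda x: x**3 - 2, 1.0, 2.0))  # ≈ 1.2599210499
```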
The fixed-point iteration rewrites f(x) = 0 as x = g(x) and iterates x_{n+1} = g(x_n). Convergence depends on the choice of g and is usually linear. Newton's method can be viewed as a specific fixed-point iteration with optimal convergence properties.
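Fixed-point iteration can be demonstrated with this page's cos(x) - x preset. Rewriting cos(x) - x = 0 as x = cos(x) gives the sketch below; convergence is linear because |g'(root)| = |sin(root)| ≈ 0.674 < 1.

```python
# Fixed-point iteration x_{n+1} = g(x_n) with g(x) = cos(x),
# which solves cos(x) - x = 0.
import math

x = 1.0
for _ in range(100):
    x = math.cos(x)
print(x)  # ≈ 0.7390851332 (the Dottie number)
```

Note how many more iterations this takes than Newton's method needs for comparable accuracy.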
Tips for Students
When working through homework problems, always write out the iteration table with columns for n, x_n, f(x_n), f'(x_n), and x_{n+1}. Show the error |x_{n+1} - x_n| at each step to demonstrate convergence. Professors typically want to see 4 to 6 iterations. If the problem asks you to approximate a specific root, make sure your initial guess is on the correct side of that root to avoid converging to a different one.