Calculate partial derivatives of multivariable functions online. Supports first, second, and higher order derivatives, mixed partials, and evaluation at a point. Enter your function and get the answer with steps.
Use * for multiplication and ^ for powers. Supported functions: sin, cos, tan, exp, log (natural log), and sqrt.
These are the same differentiation rules you already know. The only difference is that all other variables act as constants.
Find any partial derivative in three quick steps.
Enter a function using x, y, z, or any variable names. Use ^ for powers, * for multiplication, and standard function names like sin, cos, exp, log, sqrt. For example: x^2*y + sin(x*y) or exp(x*y*z).
Select which variable to differentiate with respect to (x, y, z, u, v, or w). Choose the order: 1st for a regular partial, 2nd for a second order partial derivative, and so on up to 5th. For mixed partials, switch to Mixed mode and add each differentiation step.
The calculator returns the symbolic derivative and shows each step of the computation. You can optionally evaluate the result at a specific point by entering numeric values for each variable.
Get exact symbolic answers, not decimal approximations. The calculator applies differentiation rules algebraically so your result is in closed form.
See each differentiation step written out. Useful for checking your own work or understanding how the rules were applied.
Compute second, third, fourth, or fifth order partial derivatives with a single click. The tool differentiates repeatedly and shows every intermediate step.
Calculate mixed partial derivatives like f_xy or f_xyz. Specify the exact sequence of variables and verify Clairaut's theorem on your own functions.
After computing the symbolic derivative, plug in numbers to evaluate it at any point. Helpful for gradient calculations and optimization problems.
Not limited to x and y. Use x, y, z, u, v, w, or any combination. The calculator detects all variables in your expression automatically.
When a function depends on more than one variable, a partial derivative tells you how the output changes when you nudge just one of those variables. Imagine a hilly landscape described by a height function f(x, y). The partial derivative with respect to x, written as ∂f/∂x, gives the slope of the terrain in the east-west direction at a given point. The partial derivative with respect to y gives the slope in the north-south direction.
The notation uses the curly ∂ symbol instead of a regular d to signal that other variables are being held constant. You might also see the subscript notation f_x or f_y, which means the same thing. In practice, computing a partial derivative is no harder than a regular derivative. You just pretend the other variables are fixed numbers and differentiate as usual.
If a function has one input, like f(x) = x³ + 2x, its derivative is f'(x) = 3x² + 2. That is an ordinary derivative. There is only one direction to move, so there is nothing to hold constant.
Now consider f(x, y) = x³y + 2xy. This function lives in three-dimensional space. To get the partial derivative with respect to x, treat y as a constant: ∂f/∂x = 3x²y + 2y. To get the partial with respect to y, treat x as a constant: ∂f/∂y = x³ + 2x.
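If you want to double-check a worked example like this independently of the calculator, a symbolic library such as SymPy (an assumed external tool here, not this calculator's engine) reproduces both partials in a couple of lines:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 * y + 2 * x * y

# Partial with respect to x: y is treated as a constant.
fx = sp.diff(f, x)   # 3*x**2*y + 2*y
# Partial with respect to y: x is treated as a constant.
fy = sp.diff(f, y)   # x**3 + 2*x
```

Note that `sp.diff` applies exactly the "hold the other variable constant" rule described above.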
Every ordinary derivative is technically a partial derivative of a one-variable function. The distinction only matters once you have multiple independent variables.
A second order partial derivative is the derivative of a derivative. If you differentiate f with respect to x and then differentiate the result with respect to x again, you get ∂²f/∂x², also written as f_xx. This tells you about the concavity of the function in the x-direction.
You can keep going. Third order, fourth order, and higher derivatives measure increasingly detailed information about how the function curves and twists. In practice, most applications in physics and engineering use first and second order partial derivatives. Our calculator supports up to fifth order to handle advanced problems.
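Repeated differentiation is a one-liner in symbolic software. The sketch below (again using SymPy as an assumed cross-check, with the same example function as above) computes a second- and third-order partial:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 * y + 2 * x * y

# diff(f, x, n) differentiates n times with respect to x.
fxx = sp.diff(f, x, 2)    # second order: 6*x*y
fxxx = sp.diff(f, x, 3)   # third order: 6*y
```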
A mixed partial derivative involves differentiating with respect to one variable first, then a different variable. For example, f_xy means differentiate with respect to x, then with respect to y.
Clairaut's theorem (also called Schwarz's theorem) says that if both mixed partials are continuous, then the order does not matter. That is, f_xy = f_yx. This is true for the vast majority of functions you will encounter in a calculus course. You can verify this yourself using our mixed partial mode: compute ∂²f/∂x∂y and ∂²f/∂y∂x and check that they match.
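The same verification can be scripted. This minimal SymPy sketch (the test function is an arbitrary smooth choice, picked only for illustration) confirms that the two differentiation orders agree:

```python
import sympy as sp

x, y = sp.symbols('x y')
# An arbitrary smooth test function, chosen for illustration.
f = sp.exp(x * y) + x**2 * sp.sin(y)

fxy = sp.diff(f, x, y)   # differentiate in x first, then y
fyx = sp.diff(f, y, x)   # differentiate in y first, then x

# Clairaut's theorem: the two mixed partials agree.
same = sp.simplify(fxy - fyx) == 0
```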
When variables depend on each other through intermediate functions, the chain rule for partial derivatives comes into play. Suppose z = f(u, v) where u = g(x, y) and v = h(x, y). To find ∂z/∂x, you add up the contributions from each path:
∂z/∂x = (∂f/∂u)(∂u/∂x) + (∂f/∂v)(∂v/∂x)
This pattern extends to any number of intermediate variables. Tree diagrams can help you keep track of all the paths. The chain rule for partial derivatives shows up constantly in physics, especially in thermodynamics and coordinate transformations.
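The two-path sum above can be checked symbolically. In this sketch the intermediate functions g and h are hypothetical choices made up for illustration; the point is that the chain-rule sum matches direct differentiation:

```python
import sympy as sp

x, y, U, V = sp.symbols('x y U V')

# Hypothetical intermediate functions, chosen only for illustration.
u = x**2 + y           # u = g(x, y)
v = x * y              # v = h(x, y)
f = sp.sin(U) + V**2   # z = f(u, v)

# Chain rule: dz/dx = (df/du)(du/dx) + (df/dv)(dv/dx)
chain = sp.diff(f, U) * sp.diff(u, x) + sp.diff(f, V) * sp.diff(v, x)
chain = chain.subs({U: u, V: v})

# Direct computation after substituting u and v into f.
direct = sp.diff(f.subs({U: u, V: v}), x)

match = sp.simplify(direct - chain) == 0
```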
Partial derivatives work the same way regardless of how many variables are involved. For f(x, y, z), you have three first-order partial derivatives: f_x, f_y, and f_z. Together they form the gradient vector, which points in the direction of steepest increase.
With three variables, the number of second-order partials grows. You have f_xx, f_yy, f_zz, f_xy, f_xz, and f_yz (plus their symmetric partners, which are equal by Clairaut's theorem). This collection of second derivatives forms the Hessian matrix, which is central to optimization and tells you whether a critical point is a maximum, minimum, or saddle point.
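Both objects are easy to build programmatically. This SymPy sketch (the sample function is an arbitrary choice) collects the gradient as a list of first partials and the Hessian as the matrix of second partials:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x**2 + x*y + y**2 + z**2   # a sample three-variable function

variables = [x, y, z]
gradient = [sp.diff(f, v) for v in variables]   # [2*x + y, x + 2*y, 2*z]
H = sp.hessian(f, variables)                    # 3x3 matrix of second partials
```

By Clairaut's theorem the Hessian of this function is symmetric, so only six of its nine entries are distinct, matching the count in the paragraph above.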
Partial derivatives appear throughout science and engineering. In physics, Maxwell's equations, the heat equation, and the wave equation are all written using partial derivatives. In economics, marginal cost and marginal utility are partial derivatives of cost and utility functions. Machine learning relies on partial derivatives to compute gradients for training neural networks through backpropagation.
In fluid dynamics, the Navier-Stokes equations describe how velocity and pressure in a fluid change with position and time, all expressed as partial derivatives. Even in everyday data science, the gradient descent algorithm uses partial derivatives to find the parameters that minimize a loss function.
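To make the gradient descent idea concrete, here is a minimal hand-rolled sketch on a toy loss function (an invented example, not production machine-learning code): each update steps against the partial derivatives.

```python
# Toy loss f(x, y) = (x - 3)^2 + (y + 1)^2, minimized at (3, -1).
# Its partial derivatives are 2*(x - 3) and 2*(y + 1).
def gradient_descent(steps=200, lr=0.1):
    x, y = 0.0, 0.0
    for _ in range(steps):
        dx = 2 * (x - 3)   # df/dx at the current point
        dy = 2 * (y + 1)   # df/dy at the current point
        x -= lr * dx       # step against the gradient
        y -= lr * dy
    return x, y

x_min, y_min = gradient_descent()   # approaches (3, -1)
```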
If you are studying multivariable calculus, differential equations, or any applied math course, you will work with partial derivatives regularly. Having a reliable calculator to check your answers speeds up the learning process and helps catch algebraic mistakes early.
There are several equivalent ways to write a partial derivative, and different textbooks prefer different styles: Leibniz notation ∂f/∂x, subscript notation f_x, and operator notation D_x f or ∂_x f. They all mean the same thing.
Start by clearly identifying which variable you are differentiating with respect to. Circle it or underline it so you do not lose track. Rewrite the function if needed to make the target variable more visible. For example, rewrite xy² + yz as (y²)x + yz when taking the partial with respect to x.
Watch out for the product rule when two or more factors contain the target variable. For instance, in x²sin(xy), both x² and sin(xy) depend on x, so you need the product rule.
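You can verify that exact example symbolically. This SymPy sketch (an assumed cross-checking tool, not part of the calculator) confirms the product-rule result for x²sin(xy):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * sp.sin(x * y)

# Product rule: d/dx[x^2] * sin(xy) + x^2 * d/dx[sin(xy)]
fx = sp.diff(f, x)
expected = 2*x*sp.sin(x*y) + x**2 * y * sp.cos(x*y)
ok = sp.simplify(fx - expected) == 0
```

The second term also needs the chain rule, since the inner function xy contributes a factor of y.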
After you finish, check your work by evaluating the original function and the derivative at a simple point. If f(1, 1) = 3 and ∂f/∂x at (1, 1) gives 5, then f(1.001, 1) should be approximately 3.005. This quick numerical check catches most algebraic errors.
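The numerical check described above takes only a few lines of plain Python. Using the earlier example f(x, y) = x³y + 2xy, which happens to satisfy f(1, 1) = 3 and ∂f/∂x = 5 at (1, 1), a finite-difference nudge should land close to the claimed derivative:

```python
def f(x, y):
    return x**3 * y + 2 * x * y   # the sample function from earlier

def fx(x, y):
    return 3 * x**2 * y + 2 * y   # claimed partial with respect to x

# Nudge x by a small h and compare the observed change with the derivative.
h = 1e-6
numeric = (f(1 + h, 1) - f(1, 1)) / h  # close to 5
symbolic = fx(1, 1)                    # exactly 5
close = abs(numeric - symbolic) < 1e-4
```

If the two values disagree badly, the symbolic derivative almost certainly contains an algebra slip.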
Answers to common questions about partial derivatives and how to use this calculator.
This partial derivative calculator is for educational purposes. Always verify results for critical work. The tool assumes standard mathematical conventions and may not handle every edge case in symbolic computation.