Multivariable Calculus Unit 3 Review

3.3 Partial Derivatives and the Gradient

Written by the Fiveable Content Team • Last updated August 2025

Partial Derivatives

A partial derivative measures how a function changes when you nudge one variable while freezing all the others. Think of it like asking: "If I only change x, how does the output respond?"

Notation: You'll see partial derivatives written several ways, all meaning the same thing:

∂f/∂x,  f_x,  ∂f/∂y,  f_y

How to compute a partial derivative:

  1. Pick the variable you're differentiating with respect to.
  2. Treat every other variable as a constant.
  3. Differentiate using the single-variable rules you already know.

Quick examples:

  • f(x,y) = x^2 y: Taking f_x, treat y as a constant. You get f_x = 2xy. For f_y, treat x^2 as a constant, giving f_y = x^2.
  • f(x,y) = sin(xy): By the chain rule, f_x = y cos(xy) and f_y = x cos(xy).
  • f(x,y) = e^{x+y}: Both partials give e^{x+y}, since the derivative of x + y with respect to either variable is 1.
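The examples above can be sanity-checked numerically: a central-difference quotient that nudges one variable while holding the other fixed should reproduce each partial. A quick sketch (the function and evaluation point are just illustrations):

```python
# Sanity-check the partials of f(x, y) = x^2 * y with central differences:
# nudge one variable, hold the other fixed.

def partial_x(f, x, y, h=1e-6):
    """Approximate ∂f/∂x at (x, y): vary x only."""
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def partial_y(f, x, y, h=1e-6):
    """Approximate ∂f/∂y at (x, y): vary y only."""
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

def f(x, y):
    return x**2 * y

# Exact values at (2, 3): f_x = 2xy = 12, f_y = x^2 = 4.
print(partial_x(f, 2, 3))  # ≈ 12.0
print(partial_y(f, 2, 3))  # ≈ 4.0
```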

The chain rule extends naturally to partial derivatives for composite functions. If f depends on u and v, which themselves depend on x and y, you chain the partials together. Implicit differentiation also carries over: if a relationship like x^2 + y^2 + z^2 = 1 defines z implicitly, you can differentiate both sides with respect to x (treating z as a function of x) to find ∂z/∂x.
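To make the implicit-differentiation claim concrete: differentiating x^2 + y^2 + z^2 = 1 with respect to x gives 2x + 2z ∂z/∂x = 0, so ∂z/∂x = -x/z. A small numerical check on the upper-hemisphere branch (the specific point is arbitrary):

```python
import math

def z(x, y):
    # Upper-hemisphere branch of x^2 + y^2 + z^2 = 1, solved for z.
    return math.sqrt(1 - x**2 - y**2)

x0, y0, h = 0.3, 0.4, 1e-6

# Numerical ∂z/∂x: nudge x, hold y fixed.
dzdx_numeric = (z(x0 + h, y0) - z(x0 - h, y0)) / (2 * h)

# Implicit differentiation of x^2 + y^2 + z^2 = 1 predicts ∂z/∂x = -x/z.
dzdx_implicit = -x0 / z(x0, y0)

print(dzdx_numeric, dzdx_implicit)  # both ≈ -0.3464
```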

Interpretation of partial derivatives

Geometrically, f_x(a,b) is the slope of the tangent line to the surface z = f(x,y) at the point (a,b), sliced along the direction of the x-axis (with y held fixed). Similarly, f_y(a,b) gives the slope along the y-direction.

Partial derivatives connect directly to directional derivatives, which measure rates of change in any direction, not just along the coordinate axes.

Applications show up everywhere:

  • Physics: Velocity components in 3D motion, where position depends on time and spatial coordinates.
  • Economics: Marginal cost or marginal revenue, measuring how cost changes when you adjust one input (like labor) while holding others fixed.
  • Sensitivity analysis: Determining which variable has the biggest impact on the output.

The linear approximation formula ties this together practically. For small changes Δx and Δy:

Δf ≈ f_x Δx + f_y Δy

This tells you that the total change in f is approximately the sum of each partial derivative times the corresponding small change. It's the multivariable version of the single-variable approximation Δf ≈ f'(x) Δx.
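A quick sketch comparing the exact change in f(x,y) = x^2 y against the linear approximation (the base point and the nudges are arbitrary choices):

```python
def f(x, y):
    return x**2 * y

x0, y0 = 2.0, 3.0
dx, dy = 0.01, -0.02

# Partials of f = x^2 y at (2, 3): f_x = 2xy = 12, f_y = x^2 = 4.
fx, fy = 2 * x0 * y0, x0**2

exact = f(x0 + dx, y0 + dy) - f(x0, y0)   # true change in f
approx = fx * dx + fy * dy                 # f_x Δx + f_y Δy

print(exact, approx)  # ≈ 0.0395 vs 0.04 — close for small nudges
```

Shrink dx and dy and the two values agree even more closely, which is exactly what "linear approximation" promises.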

The Gradient and Higher-Order Derivatives

Higher-order partial derivatives

Just like in single-variable calculus, you can differentiate more than once. Second-order partial derivatives come in two flavors:

  • Unmixed: Differentiate twice with respect to the same variable, e.g., f_xx = ∂²f/∂x²
  • Mixed: Differentiate with respect to different variables, e.g., f_xy = ∂²f/(∂y ∂x) (differentiate first with respect to x, then y)

Clairaut's Theorem: If f_xy and f_yx are both continuous, then f_xy = f_yx. In practice, this holds for nearly every function you'll encounter in this course, so the order of mixed partials usually doesn't matter.
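Clairaut's theorem can be illustrated numerically: build f_x and f_y by finite differences, then differentiate each in the other variable; the two mixed partials should agree up to rounding. A sketch with f(x,y) = sin(xy) at an arbitrary point:

```python
import math

def f(x, y):
    return math.sin(x * y)

h = 1e-5

def fx(x, y):
    # ∂f/∂x by central difference.
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def fy(x, y):
    # ∂f/∂y by central difference.
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

x0, y0 = 0.7, 1.2

f_xy = (fx(x0, y0 + h) - fx(x0, y0 - h)) / (2 * h)  # x first, then y
f_yx = (fy(x0 + h, y0) - fy(x0 - h, y0)) / (2 * h)  # y first, then x

print(abs(f_xy - f_yx))  # ≈ 0: the order of differentiation doesn't matter
```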

What second-order partials tell you:

  • f_xx > 0 at a point means the function is concave up in the x-direction there; f_xx < 0 means concave down.
  • Mixed partials like f_xy capture how the slope in one direction changes as you move in the other direction.
  • These derivatives are essential for the second derivative test in optimization and for building Taylor series expansions of multivariable functions.
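The concavity claim above can be checked with a second-difference quotient, which approximates f_xx directly (the function and points are illustrative):

```python
def f(x, y):
    return x**2 * y

def fxx(x, y, h=1e-4):
    # Second central difference in x: approximates ∂²f/∂x².
    return (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h**2

# For f = x^2 y, f_xx = 2y: concave up in x where y > 0, down where y < 0.
print(fxx(1.0, 3.0))   # ≈ 6.0  (concave up in the x-direction)
print(fxx(1.0, -3.0))  # ≈ -6.0 (concave down)
```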

Gradient vector computation and geometry

The gradient packages all the partial derivatives of a function into a single vector:

∇f = (∂f/∂x, ∂f/∂y, ∂f/∂z)

For a function of two variables, drop the z-component.

Key properties of the gradient:

  • It points in the direction of steepest ascent of f. If you're standing on a hillside and want to go uphill as fast as possible, walk in the direction of ∇f.
  • Its magnitude ‖∇f‖ gives the rate of steepest ascent. A large magnitude means the function is changing rapidly.
  • It is perpendicular to level curves (in 2D) and level surfaces (in 3D). This is why the gradient provides a normal vector to a level surface at any point.
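The perpendicularity property is easy to see for f(x,y) = x^2 + y^2, whose level curves are circles: the gradient at a point is radial, while the tangent to the circle there is (-y, x). A sketch:

```python
def grad_f(x, y):
    # Gradient of f(x, y) = x^2 + y^2 (its level curves are circles).
    return (2 * x, 2 * y)

x0, y0 = 0.6, 0.8               # a point on the level curve f = 1
gx, gy = grad_f(x0, y0)
tangent = (-y0, x0)             # tangent direction to the circle at (x0, y0)

dot = gx * tangent[0] + gy * tangent[1]
print(dot)  # ≈ 0: the gradient is perpendicular to the level curve
```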

Connection to directional derivatives: The directional derivative of f in the direction of a unit vector u is:

D_u f = ∇f · u

This dot product is maximized when u points in the same direction as ∇f (confirming the steepest-ascent property) and equals zero when u is tangent to a level curve (no change along a contour).
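A sketch that scans unit directions u = (cos θ, sin θ) and confirms D_u f = ∇f · u peaks in the gradient direction, with peak value ‖∇f‖ (the gradient values come from the earlier example f = x^2 y at (2, 3); the scan resolution is arbitrary):

```python
import math

# Gradient of f(x, y) = x^2 y at (2, 3): ∇f = (12, 4).
gx, gy = 12.0, 4.0

def directional_derivative(theta):
    # D_u f = ∇f · u for the unit vector u = (cos θ, sin θ).
    return gx * math.cos(theta) + gy * math.sin(theta)

# Scan directions and find the one with the largest rate of change.
thetas = [2 * math.pi * k / 3600 for k in range(3600)]
best = max(thetas, key=directional_derivative)

grad_angle = math.atan2(gy, gx)   # direction of ∇f
grad_norm = math.hypot(gx, gy)    # ‖∇f‖

print(best, grad_angle)                        # nearly equal angles
print(directional_derivative(best), grad_norm)  # nearly equal rates
```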

Where the gradient shows up:

  • Optimization: Setting ∇f = 0 finds critical points, where the function could have a maximum, minimum, or saddle point.
  • Conservative vector fields: A vector field F is conservative if F = ∇f for some scalar function f. One test for this is checking that the curl of F is zero.
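The curl test can be sketched in 2D, where "curl is zero" means ∂F₂/∂x − ∂F₁/∂y = 0. Taking F = ∇f for f = x^2 y from the earlier examples (so F = (2xy, x^2)), a numerical check at an arbitrary point:

```python
# F = ∇f for f(x, y) = x^2 y, so F should pass the conservative-field test.
def F1(x, y):
    return 2 * x * y   # f_x

def F2(x, y):
    return x**2        # f_y

def scalar_curl(x, y, h=1e-6):
    # 2D curl: ∂F2/∂x − ∂F1/∂y, by central differences.
    dF2_dx = (F2(x + h, y) - F2(x - h, y)) / (2 * h)
    dF1_dy = (F1(x, y + h) - F1(x, y - h)) / (2 * h)
    return dF2_dx - dF1_dy

print(scalar_curl(1.7, -0.5))  # ≈ 0: consistent with F being conservative
```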