
Calculus IV

Partial Derivative Rules


Why This Matters

Partial derivatives are the foundation of everything you'll do in Calculus IV—they're how we extend the power of differentiation to functions with multiple inputs. When you're analyzing a surface, optimizing a function of several variables, or understanding how physical systems respond to changes, you're using partial derivatives. The rules governing them connect directly to gradient vectors, directional derivatives, optimization, and vector field analysis—all major exam topics.

Here's the key insight: you're being tested on more than just computation. Examiners want to see that you understand when to apply each rule and why it works. The chain rule for partials shows up in related rates problems; Clairaut's theorem saves you time on mixed partials; the gradient ties everything together for optimization. Don't just memorize formulas—know what concept each rule illustrates and when it becomes your go-to tool.


Foundational Concepts and Notation

Before applying any rules, you need rock-solid understanding of what partial derivatives actually measure and how to communicate them precisely. A partial derivative isolates the rate of change in one direction while treating all other variables as constants.

Definition of Partial Derivatives

  • Measures single-variable change: a partial derivative tells you how $f$ changes as one variable varies while all others remain fixed (see the limit definition after this list)
  • Enables independent analysis of each variable's contribution to the function's behavior, essential for understanding multivariable relationships
  • Notation $\frac{\partial f}{\partial x}$ explicitly indicates differentiation with respect to $x$, distinguishing it from total derivatives
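
To pin the first bullet down, here is the standard limit definition for a function of two variables, stated for reference:

$$\frac{\partial f}{\partial x}(a, b) = \lim_{h \to 0} \frac{f(a + h, b) - f(a, b)}{h}$$

Only the $x$-input moves by $h$; the $y$-input stays frozen at $b$, which is exactly what "treating $y$ as a constant" means.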

Partial Derivative Notation

  • First-order partials use $\frac{\partial f}{\partial x}$, $\frac{\partial f}{\partial y}$, or subscript notation like $f_x$, $f_y$
  • Mixed second derivatives written as $\frac{\partial^2 f}{\partial x \partial y}$ indicate sequential differentiation with respect to different variables
  • Higher-order notation extends naturally: $\frac{\partial^3 f}{\partial x^2 \partial y}$ means differentiate once with respect to $y$, then twice with respect to $x$ (Leibniz notation reads right to left)

Compare: $\frac{\partial f}{\partial x}$ vs. $\frac{df}{dx}$. Both use Leibniz-style notation, but the curly $\partial$ signals multivariable context where other variables are held constant. If an FRQ uses $\partial$, you know you're in partial derivative territory.
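
As a concrete notation check (the function here is an illustrative choice, not one from the guide), take $f(x, y) = x^3 y^2$:

$$f_x = \frac{\partial f}{\partial x} = 3x^2 y^2, \qquad f_y = \frac{\partial f}{\partial y} = 2x^3 y, \qquad f_{xy} = \frac{\partial^2 f}{\partial y\, \partial x} = 6x^2 y.$$

Subscript notation reads left to right while Leibniz notation reads right to left, so $f_{xy}$ and $\frac{\partial^2 f}{\partial y\, \partial x}$ name the same derivative.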


Differentiation Rules for Partial Derivatives

The computational rules you learned in single-variable calculus—product rule, quotient rule, chain rule—all extend to partial derivatives with one crucial modification: treat all variables except the one you're differentiating with respect to as constants.
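
For instance (with an illustrative function chosen for this example, not prescribed by the guide), differentiating $f(x, y) = x^2 \sin(xy)$ with respect to $x$ uses the product rule while $y$ rides along as a constant:

$$\frac{\partial f}{\partial x} = 2x \sin(xy) + x^2 \cdot y\cos(xy).$$

The chain rule supplies the inner factor $y$ from $\frac{\partial}{\partial x}(xy) = y$.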

Partial Derivatives of Multivariable Functions

  • Compute each partial separately by treating other variables as constants: for $f(x, y)$, find $\frac{\partial f}{\partial x}$ by treating $y$ as a constant
  • Reveals directional behavior showing how the function changes along coordinate axes
  • Critical point analysis requires setting all first partials equal to zero simultaneously (a short worked case follows this list)
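
A minimal illustration of that last condition, using a made-up function $f(x, y) = x^2 + y^2 - 4x$:

$$f_x = 2x - 4 = 0 \;\Rightarrow\; x = 2, \qquad f_y = 2y = 0 \;\Rightarrow\; y = 0,$$

so the only critical point is $(2, 0)$.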

Chain Rule for Partial Derivatives

  • Composite function differentiation: if $z = f(x, y)$ where $x = x(t)$ and $y = y(t)$, then $\frac{dz}{dt} = \frac{\partial f}{\partial x}\frac{dx}{dt} + \frac{\partial f}{\partial y}\frac{dy}{dt}$ (worked example after this list)
  • Tree diagrams help track dependencies when variables depend on multiple intermediate variables
  • Essential for implicit relationships and parametric surfaces where variables aren't independent
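
Here is the formula above in action, with illustrative choices $z = x^2 y$, $x = \cos t$, $y = t^2$:

$$\frac{dz}{dt} = (2xy)(-\sin t) + (x^2)(2t) = -2t^2 \cos t \sin t + 2t\cos^2 t,$$

which matches what you get by substituting first and differentiating $z = t^2 \cos^2 t$ directly.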

Implicit Differentiation for Multivariable Functions

  • No explicit solving required: differentiate the equation $F(x, y, z) = 0$ directly with respect to your variable of interest
  • Formula shortcut: if $F(x, y, z) = 0$, then $\frac{\partial z}{\partial x} = -\frac{F_x}{F_z}$ (provided $F_z \neq 0$); see the sphere example after this list
  • Handles constraint equations that define surfaces implicitly, common in optimization problems
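
Applied to the sphere that appears in the comparison below, with $F(x, y, z) = x^2 + y^2 + z^2 - 1$:

$$\frac{\partial z}{\partial x} = -\frac{F_x}{F_z} = -\frac{2x}{2z} = -\frac{x}{z} \quad (z \neq 0),$$

with no need to solve for $z$ first.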

Compare: Standard partial differentiation vs. implicit differentiation. Both find partial derivatives, but implicit differentiation works when you can't isolate the dependent variable. Use implicit when you see equations like $x^2 + y^2 + z^2 = 1$ rather than $z = \sqrt{1 - x^2 - y^2}$.


Symmetry and Higher-Order Derivatives

When you take multiple partial derivatives, the order can matter—or not. Clairaut's theorem tells us exactly when we can swap the order, which is most of the time for functions you'll encounter.

Clairaut's Theorem (Equality of Mixed Partials)

  • Order doesn't matter when the mixed partials are continuous: $\frac{\partial^2 f}{\partial x \partial y} = \frac{\partial^2 f}{\partial y \partial x}$ (quick verification after this list)
  • Computation shortcut—choose whichever order is easier to calculate; the result is identical
  • Continuity requirement is almost always satisfied for functions on exams, but watch for piecewise definitions at boundaries
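
A quick verification with an illustrative function, $f(x, y) = x^2 e^y$:

$$f_x = 2x e^y \;\Rightarrow\; f_{xy} = 2x e^y, \qquad f_y = x^2 e^y \;\Rightarrow\; f_{yx} = 2x e^y,$$

so the two mixed partials agree, as Clairaut's theorem promises when they are continuous.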

Higher-Order Partial Derivatives

  • Second partials $\frac{\partial^2 f}{\partial x^2}$ measure concavity in the $x$-direction, analogous to single-variable second derivatives
  • Mixed partials capture how the rate of change in one direction varies as you move in another direction
  • Hessian matrix collects all second partials and determines the nature of critical points (saddle, max, or min)

Compare: $\frac{\partial^2 f}{\partial x^2}$ vs. $\frac{\partial^2 f}{\partial x \partial y}$. Pure second partials measure curvature along axes, while mixed partials measure "twist." Both appear in the second derivative test: $D = f_{xx}f_{yy} - (f_{xy})^2$.
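
For instance, with the illustrative saddle $f(x, y) = x^2 - y^2$ at its critical point $(0, 0)$:

$$f_{xx} = 2, \quad f_{yy} = -2, \quad f_{xy} = 0, \qquad D = (2)(-2) - 0^2 = -4 < 0,$$

so the second derivative test classifies $(0, 0)$ as a saddle point.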


Gradient and Directional Analysis

The gradient packages all first partial derivatives into a single vector, unlocking powerful geometric interpretations. The gradient points toward steepest increase and its magnitude tells you how steep.

Gradient Vector

  • Definition $\nabla f = \langle f_x, f_y, f_z \rangle$ collects all first-order partial derivatives into one vector
  • Points toward steepest ascent: move in the direction of $\nabla f$ to increase $f$ as rapidly as possible
  • Magnitude $|\nabla f|$ equals the maximum rate of change of $f$ at that point (worked example after this list)
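
A small worked case with an illustrative function, $f(x, y) = x^2 + 3y$ at the point $(1, 2)$:

$$\nabla f = \langle 2x, 3 \rangle, \qquad \nabla f(1, 2) = \langle 2, 3 \rangle, \qquad |\nabla f(1, 2)| = \sqrt{13},$$

so $f$ increases fastest in the direction $\frac{1}{\sqrt{13}}\langle 2, 3 \rangle$, at a rate of $\sqrt{13}$ per unit step.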

Directional Derivatives

  • Rate of change in any direction: computed as $D_{\mathbf{u}}f = \nabla f \cdot \mathbf{u}$ where $\mathbf{u}$ is a unit vector
  • Maximum value equals $|\nabla f|$ and occurs when $\mathbf{u}$ points in the gradient direction
  • Zero directional derivative occurs perpendicular to $\nabla f$, along level curves/surfaces

Compare: Gradient vs. directional derivative. The gradient gives you the direction of maximum increase, while the directional derivative gives you the rate of change in any specified direction. FRQs often ask: "In what direction does $f$ increase most rapidly?" Answer: the gradient direction.
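
Continuing the illustrative gradient $\nabla f(1, 2) = \langle 2, 3 \rangle$ from above, the rate of change toward the unit vector $\mathbf{u} = \langle \tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}} \rangle$ is

$$D_{\mathbf{u}}f = \nabla f \cdot \mathbf{u} = \frac{2 + 3}{\sqrt{2}} = \frac{5}{\sqrt{2}} \approx 3.54,$$

which is less than the maximum possible rate $|\nabla f| = \sqrt{13} \approx 3.61$, as it must be.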


Vector-Valued Extensions

When your function outputs a vector instead of a scalar, partial derivatives apply component-by-component. This extends naturally to analyzing vector fields and parametric surfaces.

Partial Derivatives of Vector-Valued Functions

  • Component-wise differentiation: for $\mathbf{r}(u, v) = \langle x(u,v), y(u,v), z(u,v) \rangle$, take partials of each component separately
  • Tangent vectors $\mathbf{r}_u$ and $\mathbf{r}_v$ span the tangent plane to parametric surfaces (see the cylinder sketch after this list)
  • Applications in physics include velocity fields, force fields, and electromagnetic analysis
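
A short sketch with an illustrative parametrization of a cylinder, $\mathbf{r}(u, v) = \langle \cos u, \sin u, v \rangle$:

$$\mathbf{r}_u = \langle -\sin u, \cos u, 0 \rangle, \qquad \mathbf{r}_v = \langle 0, 0, 1 \rangle, \qquad \mathbf{r}_u \times \mathbf{r}_v = \langle \cos u, \sin u, 0 \rangle,$$

so the two partials span the tangent plane and their cross product gives an outward normal to the surface.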

Quick Reference Table

| Concept | Best Examples |
| --- | --- |
| Basic computation | $\frac{\partial f}{\partial x}$, $\frac{\partial f}{\partial y}$ with other variables constant |
| Chain rule | Composite functions, parametric dependencies, related rates |
| Implicit differentiation | Constraint equations, level surfaces, $F(x,y,z) = 0$ |
| Clairaut's theorem | Mixed partials $f_{xy} = f_{yx}$, simplifying calculations |
| Higher-order derivatives | Second derivative test, Hessian matrix, curvature analysis |
| Gradient vector | Steepest ascent, optimization, normal to level curves |
| Directional derivatives | Rate of change along arbitrary paths, $D_{\mathbf{u}}f = \nabla f \cdot \mathbf{u}$ |

Self-Check Questions

  1. If $f(x, y) = x^2 y + e^{xy}$, which rule do you use to find $\frac{\partial f}{\partial x}$, and what do you treat $y$ as during the calculation?

  2. Compare and contrast $\frac{\partial^2 f}{\partial x \partial y}$ and $\frac{\partial^2 f}{\partial y \partial x}$. Under what condition are they equal, and why does this matter computationally?

  3. Given $\nabla f = \langle 3, -4 \rangle$ at a point, what is the maximum rate of change of $f$, and in what direction does it occur?

  4. When would you choose implicit differentiation over standard partial differentiation? Give an example equation where implicit differentiation is the better approach.

  5. FRQ-style: A surface is defined by $z = f(x, y)$. Explain how you would use the gradient to find a vector normal to the level curve $f(x, y) = c$ and a vector normal to the surface itself. What's the relationship between these two normals?