Partial derivatives are the foundation of everything you'll do in Calculus IV—they're how we extend the power of differentiation to functions with multiple inputs. When you're analyzing a surface, optimizing a function of several variables, or understanding how physical systems respond to changes, you're using partial derivatives. The rules governing them connect directly to gradient vectors, directional derivatives, optimization, and vector field analysis—all major exam topics.
Here's the key insight: you're being tested on more than just computation. Examiners want to see that you understand when to apply each rule and why it works. The chain rule for partials shows up in related rates problems; Clairaut's theorem saves you time on mixed partials; the gradient ties everything together for optimization. Don't just memorize formulas—know what concept each rule illustrates and when it becomes your go-to tool.
Foundational Concepts and Notation
Before applying any rules, you need rock-solid understanding of what partial derivatives actually measure and how to communicate them precisely. A partial derivative isolates the rate of change in one direction while treating all other variables as constants.
Definition of Partial Derivatives
Measures single-variable change—a partial derivative tells you how f changes as one variable varies while all others remain fixed
Enables independent analysis of each variable's contribution to the function's behavior, essential for understanding multivariable relationships
Notation ∂f/∂x explicitly indicates differentiation with respect to x, distinguishing it from total derivatives (made concrete in the sketch below)
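To see "hold everything else constant" in action, here is a minimal SymPy sketch; the function f(x, y) = x²y is an illustrative choice, not from any exam. It computes ∂f/∂x two ways: with diff and straight from the limit definition.

```python
import sympy as sp

x, y, h = sp.symbols('x y h')
f = x**2 * y  # illustrative example function

# Partial with respect to x: y is treated as a constant
fx = sp.diff(f, x)

# Same result from the limit definition of a partial derivative
fx_limit = sp.limit((f.subs(x, x + h) - f) / h, h, 0)

print(fx, fx_limit)  # both print 2*x*y
```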
Partial Derivative Notation
First-order partials use ∂f/∂x, ∂f/∂y, or subscript notation like fx, fy
Mixed second derivatives written as ∂²f/∂x∂y indicate sequential differentiation with respect to different variables
Higher-order notation extends naturally: ∂³f/∂x²∂y means differentiate once with respect to y, then twice with respect to x (Leibniz denominators read right to left)
Compare: ∂f/∂x vs. df/dx—both use Leibniz-style notation, but the curly ∂ signals multivariable context where other variables are held constant. If an FRQ uses ∂, you know you're in partial derivative territory.
Differentiation Rules for Partial Derivatives
The computational rules you learned in single-variable calculus—product rule, quotient rule, chain rule—all extend to partial derivatives with one crucial modification: treat all variables except the one you're differentiating with respect to as constants.
Partial Derivatives of Multivariable Functions
Compute each partial separately by treating other variables as constants—for f(x,y), find ∂f/∂x by treating y as a constant (see the sketch after this list)
Reveals directional behavior showing how the function changes along coordinate axes
Critical point analysis requires setting all first partials equal to zero simultaneously
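As a sketch of the full workflow, here is SymPy finding both partials and then a critical point; the quadratic f below is an illustrative choice.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + x*y + y**2 - 3*x  # illustrative function

# First partials: differentiate in one variable, hold the other constant
fx = sp.diff(f, x)  # 2*x + y - 3
fy = sp.diff(f, y)  # x + 2*y

# Critical points: set all first partials to zero simultaneously
print(sp.solve([fx, fy], [x, y]))  # {x: 2, y: -1}
```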
Chain Rule for Partial Derivatives
Composite function differentiation—if z = f(x,y) where x = x(t) and y = y(t), then dz/dt = (∂f/∂x)(dx/dt) + (∂f/∂y)(dy/dt); see the sketch after this list
Tree diagrams help track dependencies when variables depend on multiple intermediate variables
Essential for implicit relationships and parametric surfaces where variables aren't independent
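Here is a quick check of the chain rule formula above; the choices f(x, y) = x²y, x = cos t, y = sin t are illustrative. Computing dz/dt by the chain rule and by substituting first should agree.

```python
import sympy as sp

t, x, y = sp.symbols('t x y')

f = x**2 * y                   # z = f(x, y)
xt, yt = sp.cos(t), sp.sin(t)  # x = x(t), y = y(t)

# Chain rule: dz/dt = (∂f/∂x)(dx/dt) + (∂f/∂y)(dy/dt)
dzdt_chain = (sp.diff(f, x) * sp.diff(xt, t)
              + sp.diff(f, y) * sp.diff(yt, t)).subs({x: xt, y: yt})

# Direct check: substitute first, then take an ordinary derivative
dzdt_direct = sp.diff(f.subs({x: xt, y: yt}), t)

print(sp.simplify(dzdt_chain - dzdt_direct))  # 0
```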
Implicit Differentiation for Multivariable Functions
No explicit solving required—differentiate the equation F(x,y,z)=0 directly with respect to your variable of interest
Formula shortcut: if F(x,y,z) = 0, then ∂z/∂x = −Fx/Fz (provided Fz ≠ 0)
Handles constraint equations that define surfaces implicitly, common in optimization problems
Compare: Standard partial differentiation vs. implicit differentiation—both find partial derivatives, but implicit differentiation works when you can't isolate the dependent variable. Use implicit when you see equations like x² + y² + z² = 1 rather than z = √(1 − x² − y²).
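Here is the shortcut formula applied to the sphere from the comparison above, with a cross-check on the upper hemisphere where z can be isolated; using SymPy this way is just one convenient verification.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
F = x**2 + y**2 + z**2 - 1  # the sphere, written as F(x, y, z) = 0

# Shortcut: dz/dx = -Fx/Fz, valid where Fz != 0
dzdx = -sp.diff(F, x) / sp.diff(F, z)
print(sp.simplify(dzdx))  # -x/z

# Cross-check on the upper hemisphere, where z = sqrt(1 - x**2 - y**2)
z_explicit = sp.sqrt(1 - x**2 - y**2)
print(sp.simplify(sp.diff(z_explicit, x) - dzdx.subs(z, z_explicit)))  # 0
```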
Symmetry and Higher-Order Derivatives
When you take multiple partial derivatives, the order can matter—or not. Clairaut's theorem tells us exactly when we can swap the order, which is most of the time for functions you'll encounter.
Clairaut's Theorem (Equality of Mixed Partials)
Order doesn't matter when mixed partials are continuous: ∂²f/∂x∂y = ∂²f/∂y∂x (verified in the sketch after this list)
Computation shortcut—choose whichever order is easier to calculate; the result is identical
Continuity requirement is almost always satisfied for functions on exams, but watch for piecewise definitions at boundaries
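A quick verification of Clairaut's theorem; the smooth function below is an arbitrary illustrative pick, and any function with continuous second partials would do.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x*y) * sp.sin(x)  # smooth, so mixed partials are continuous

fxy = sp.diff(f, x, y)  # differentiate in x, then y
fyx = sp.diff(f, y, x)  # differentiate in y, then x

# Clairaut's theorem: both orders give the same mixed partial
print(sp.simplify(fxy - fyx))  # 0
```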
Higher-Order Partial Derivatives
Second partials ∂²f/∂x² measure concavity in the x-direction, analogous to single-variable second derivatives
Mixed partials capture how the rate of change in one direction varies as you move in another direction
Hessian matrix collects all second partials and determines the nature of critical points (saddle, max, or min)
Compare: ∂²f/∂x² vs. ∂²f/∂x∂y—pure second partials measure curvature along axes, while mixed partials measure "twist." Both appear in the second derivative test: D = fxx·fyy − (fxy)².
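Here is the second derivative test run end to end on an illustrative function chosen to produce one local minimum and one saddle; the discriminant D follows the formula in the comparison above.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 - 3*x + y**2  # illustrative: has a local min and a saddle

fxx = sp.diff(f, x, 2)
fyy = sp.diff(f, y, 2)
fxy = sp.diff(f, x, y)
D = fxx * fyy - fxy**2  # second derivative test discriminant

for pt in sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y], dict=True):
    print(pt, 'D =', D.subs(pt), 'fxx =', fxx.subs(pt))
# {x: -1, y: 0}: D = -12           -> saddle (D < 0)
# {x: 1, y: 0}:  D = 12, fxx = 6   -> local min (D > 0, fxx > 0)
```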
Gradient and Directional Analysis
The gradient packages all first partial derivatives into a single vector, unlocking powerful geometric interpretations. The gradient points toward steepest increase and its magnitude tells you how steep.
Gradient Vector
Definition ∇f = ⟨fx, fy, fz⟩ collects all first-order partial derivatives into one vector (see the sketch after this list)
Points toward steepest ascent—move in the direction of ∇f to increase f as rapidly as possible
Magnitude ∣∇f∣ equals the maximum rate of change of f at that point
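A short sketch computing a gradient and its magnitude at a point; the scalar field f and the point (1, 2) are illustrative choices.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + y**3  # illustrative scalar field

# Gradient: all first partials collected into one vector
grad = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])

grad_p = grad.subs({x: 1, y: 2})
print(grad_p.T)       # Matrix([[4, 13]]), the direction of steepest ascent
print(grad_p.norm())  # sqrt(185), the maximum rate of change at (1, 2)
```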
Directional Derivatives
Rate of change in any direction—computed as Duf = ∇f · u, where u is a unit vector (see the sketch below)
Maximum value equals ∣∇f∣ and occurs when u points in the gradient direction
Zero directional derivative occurs perpendicular to ∇f, along level curves/surfaces
Compare: Gradient vs. directional derivative—the gradient gives you the direction of maximum increase, while the directional derivative gives you the rate of change in any specified direction. FRQs often ask: "In what direction does f increase most rapidly?" Answer: the gradient direction.
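Here is the dot-product formula in action; the function, point, and direction ⟨3, 4⟩/5 are illustrative. Note how the directional derivative taken in the gradient's own direction recovers |∇f|.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y  # illustrative function
grad = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])

g = grad.subs({x: 1, y: 2})  # gradient of f at the point (1, 2)

u = sp.Matrix([3, 4]) / 5    # a unit vector (|u| = 1)
print(g.dot(u))              # Duf = grad f . u = 16/5

# Maximum over all directions is |grad f|, attained along the gradient
print(g.norm(), g.dot(g / g.norm()))  # sqrt(17) sqrt(17)
```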
Vector-Valued Extensions
When your function outputs a vector instead of a scalar, partial derivatives apply component-by-component. This extends naturally to analyzing vector fields and parametric surfaces.
Partial Derivatives of Vector-Valued Functions
Component-wise differentiation—for r(u,v)=⟨x(u,v),y(u,v),z(u,v)⟩, take partials of each component separately
Tangent vectors ru and rv span the tangent plane to parametric surfaces (computed in the sketch below)
Applications in physics include velocity fields, force fields, and electromagnetic analysis
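Here is component-wise differentiation on an illustrative parametric paraboloid; the cross product of the two tangent vectors gives a normal to the tangent plane.

```python
import sympy as sp

u, v = sp.symbols('u v')
r = sp.Matrix([u, v, u**2 + v**2])  # illustrative surface r(u, v)

# Partials taken component by component give tangent vectors
r_u = r.diff(u)  # Matrix([1, 0, 2*u])
r_v = r.diff(v)  # Matrix([0, 1, 2*v])

# Their cross product is normal to the tangent plane
print(r_u.cross(r_v).T)  # Matrix([[-2*u, -2*v, 1]])
```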
Quick Reference Table
Concept                 | Best Examples
Basic computation       | ∂f/∂x, ∂f/∂y with other variables constant
Chain rule              | Composite functions, parametric dependencies, related rates
Higher-order partials   | Second derivative test, Hessian matrix, curvature analysis
Gradient vector         | Steepest ascent, optimization, normal to level curves
Directional derivatives | Rate of change along arbitrary paths, Duf = ∇f · u
Self-Check Questions
If f(x,y) = x²y + e^(xy), which rule do you use to find ∂f/∂x, and what do you treat y as during the calculation?
Compare and contrast ∂²f/∂x∂y and ∂²f/∂y∂x. Under what condition are they equal, and why does this matter computationally?
Given ∇f = ⟨3, −4⟩ at a point, what is the maximum rate of change of f, and in what direction does it occur?
When would you choose implicit differentiation over standard partial differentiation? Give an example equation where implicit differentiation is the better approach.
FRQ-style: A surface is defined by z=f(x,y). Explain how you would use the gradient to find a vector normal to the level curve f(x,y)=c and a vector normal to the surface itself. What's the relationship between these two normals?