# Total derivative

In the mathematical field of differential calculus, a total derivative or full derivative of a function ${\displaystyle f}$ of several variables, e.g., ${\displaystyle t}$, ${\displaystyle x}$, ${\displaystyle y}$, etc., with respect to an exogenous argument, e.g., ${\displaystyle t}$, is the limiting ratio of the change in the function's value to the change in the exogenous argument's value (for arbitrarily small changes), taking into account the exogenous argument's direct effect as well as indirect effects via the other arguments of the function.

The total derivative of a function is different from its corresponding partial derivative (${\displaystyle \partial }$). Calculation of the total derivative of ${\displaystyle f}$ with respect to ${\displaystyle t}$ does not assume that the other arguments are constant while ${\displaystyle t}$ varies; instead, it assumes that the other arguments too depend on ${\displaystyle t}$. The total derivative includes these indirect dependencies to find the overall dependency of ${\displaystyle f}$ on ${\displaystyle t.}$[1]:198–203 For example, the total derivative of ${\displaystyle f(t,x(t),y(t))}$ with respect to ${\displaystyle t}$ is

${\displaystyle {\frac {\operatorname {d} f}{\operatorname {d} t}}={\frac {\partial f}{\partial t}}{\frac {\operatorname {d} t}{\operatorname {d} t}}+{\frac {\partial f}{\partial x}}{\frac {\operatorname {d} x}{\operatorname {d} t}}+{\frac {\partial f}{\partial y}}{\frac {\operatorname {d} y}{\operatorname {d} t}},}$

which simplifies to

${\displaystyle {\frac {\operatorname {d} f}{\operatorname {d} t}}={\frac {\partial f}{\partial t}}+{\frac {\partial f}{\partial x}}{\frac {\operatorname {d} x}{\operatorname {d} t}}+{\frac {\partial f}{\partial y}}{\frac {\operatorname {d} y}{\operatorname {d} t}}.}$
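The simplified formula can be checked numerically for a concrete choice of functions. The sketch below uses hypothetical functions (not from the article): ${\displaystyle f(t,x,y)=t+xy}$ with ${\displaystyle x(t)=t^{2}}$ and ${\displaystyle y(t)=\sin t}$, and compares the chain-rule expression against a central finite-difference approximation of ${\displaystyle {\tfrac {\operatorname {d} }{\operatorname {d} t}}f(t,x(t),y(t))}$:

```python
import math

# Hypothetical example: f(t, x, y) = t + x*y with x(t) = t^2, y(t) = sin(t).
def f(t, x, y):
    return t + x * y

def x(t):
    return t * t

def y(t):
    return math.sin(t)

def total_derivative(t):
    # df/dt = ∂f/∂t + (∂f/∂x)(dx/dt) + (∂f/∂y)(dy/dt)
    df_dt = 1.0                 # ∂f/∂t (x, y held fixed)
    df_dx = y(t)                # ∂f/∂x
    df_dy = x(t)                # ∂f/∂y
    dx_dt = 2.0 * t
    dy_dt = math.cos(t)
    return df_dt + df_dx * dx_dt + df_dy * dy_dt

def numerical_derivative(t, h=1e-6):
    # Central difference of the composite g(t) = f(t, x(t), y(t))
    g = lambda s: f(s, x(s), y(s))
    return (g(t + h) - g(t - h)) / (2 * h)

assert abs(total_derivative(1.3) - numerical_derivative(1.3)) < 1e-5
```

The two values agree to within the finite-difference error, as the formula predicts.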

Consider multiplying both sides of the equation by the differential ${\displaystyle \operatorname {d} t}$:

${\displaystyle \operatorname {d} f={\frac {\partial f}{\partial t}}\operatorname {d} t+{\frac {\partial f}{\partial x}}\operatorname {d} x+{\frac {\partial f}{\partial y}}\operatorname {d} y.}$

The result is the differential change ${\displaystyle \operatorname {d} f}$ in, or total differential of, the function ${\displaystyle f}$. Because ${\displaystyle f}$ depends on ${\displaystyle t}$, some of that change will be due to the partial derivative of ${\displaystyle f}$ with respect to ${\displaystyle t}$. However, some of that change will also be due to the partial derivatives of ${\displaystyle f}$ with respect to the variables ${\displaystyle x}$ and ${\displaystyle y}$. So, the differential ${\displaystyle \operatorname {d} t}$ is applied to the total derivatives of ${\displaystyle x}$ and ${\displaystyle y}$ to find differentials ${\displaystyle \operatorname {d} x}$ and ${\displaystyle \operatorname {d} y}$, which can then be used to find the contribution to ${\displaystyle \operatorname {d} f}$.

"Total derivative" is sometimes also used as a synonym for the material derivative, ${\displaystyle {\frac {\operatorname {D} }{\operatorname {D} t}}}$, in fluid mechanics.

## Differentiation with indirect dependencies

Suppose that f is a function of two variables, x and y. Normally these variables are assumed to be independent. However, in some situations they may depend on each other. For example, y could be a function of x, constraining the domain of f to a curve in ${\displaystyle \mathbb {R} ^{2}}$. In this case, the partial derivative of f with respect to x does not give the true rate of change of f with respect to changing x, because changing x necessarily changes y. The total derivative takes such dependencies into account.

For example, suppose

${\displaystyle f(x,y)=xy}$.

When x and y are independent, the rate of change of f with respect to x is the partial derivative of f with respect to x; in this case,

${\displaystyle {\frac {\partial f}{\partial x}}=y}$.

However, if y depends on x, the partial derivative does not give the true rate of change of f as x changes because it holds y fixed.

Suppose we are constrained to the line

${\displaystyle y=x;}$

then

${\displaystyle f(x,y)=f(x,x)=x^{2}}$.

In that case, the total derivative of f with respect to x is

${\displaystyle {\frac {\mathrm {d} f}{\mathrm {d} x}}=2x}$.

Instead of immediately substituting for y in terms of x, this can be found equivalently using the chain rule:

${\displaystyle {\frac {\mathrm {d} f}{\mathrm {d} x}}={\frac {\partial f}{\partial x}}+{\frac {\partial f}{\partial y}}{\frac {\mathrm {d} y}{\mathrm {d} x}}=y+x\cdot 1=x+y=2x.}$

Notice that this is not equal to the partial derivative:

${\displaystyle {\frac {\mathrm {d} f}{\mathrm {d} x}}=2x\neq {\frac {\partial f}{\partial x}}=y=x}$.
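This worked example translates directly into a small numerical check. The sketch below evaluates the total derivative of ${\displaystyle f(x,y)=xy}$ along the constraint ${\displaystyle y=x}$ and contrasts it with the partial derivative, which holds y fixed:

```python
def f(x, y):
    return x * y

def partial_f_x(x, y):
    # ∂f/∂x: y is held fixed
    return y

def total_df_dx(x):
    # Along y = x: df/dx = ∂f/∂x + (∂f/∂y)(dy/dx) = y + x·1 = 2x
    y = x
    return partial_f_x(x, y) + x * 1.0

x0 = 3.0
assert total_df_dx(x0) == 2 * x0          # total derivative: 2x
assert partial_f_x(x0, x0) == x0          # partial derivative: y = x, not 2x

# Cross-check against a central finite difference of g(x) = f(x, x) = x^2
h = 1e-6
g = lambda x: f(x, x)
assert abs((g(x0 + h) - g(x0 - h)) / (2 * h) - total_df_dx(x0)) < 1e-6
```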

While one can often perform substitutions to eliminate indirect dependencies, the chain rule provides a more efficient and general technique. Suppose M(t, p1, ..., pn) is a function of time t and n variables ${\displaystyle p_{i}}$ which themselves depend on time. Then, the total time derivative of M is

${\displaystyle {\operatorname {d} M \over \operatorname {d} t}={\frac {\operatorname {d} }{\operatorname {d} t}}M{\bigl (}t,p_{1}(t),\ldots ,p_{n}(t){\bigr )}.}$

The chain rule for differentiating a function of several variables implies that

${\displaystyle {\operatorname {d} M \over \operatorname {d} t}={\frac {\partial M}{\partial t}}+\sum _{i=1}^{n}{\frac {\partial M}{\partial p_{i}}}{\frac {\operatorname {d} p_{i}}{\operatorname {d} t}}={\biggl (}{\frac {\partial }{\partial t}}+\sum _{i=1}^{n}{\frac {\operatorname {d} p_{i}}{\operatorname {d} t}}{\frac {\partial }{\partial p_{i}}}{\biggr )}(M).}$
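The bracketed operator can be sketched generically in code: the total time derivative is the partial derivative in t plus the sum of partials in each ${\displaystyle p_{i}}$ weighted by ${\displaystyle \operatorname {d} p_{i}/\operatorname {d} t}$. In this sketch all partial derivatives are approximated by central differences, and the concrete M and ${\displaystyle p_{i}}$ used for the check are hypothetical:

```python
def total_time_derivative(M, ps, t, h=1e-6):
    """dM/dt of M(t, p_1(t), ..., p_n(t)) via the chain rule,
    with all derivatives approximated by central differences."""
    vals = [p(t) for p in ps]
    # ∂M/∂t with every p_i held fixed
    dM = (M(t + h, *vals) - M(t - h, *vals)) / (2 * h)
    for i, p in enumerate(ps):
        hi = vals.copy(); lo = vals.copy()
        hi[i] += h; lo[i] -= h
        dM_dpi = (M(t, *hi) - M(t, *lo)) / (2 * h)   # ∂M/∂p_i
        dpi_dt = (p(t + h) - p(t - h)) / (2 * h)     # dp_i/dt
        dM += dM_dpi * dpi_dt
    return dM

# Hypothetical check: M(t, p1, p2) = t*p1 + p2^2 with p1 = t^2, p2 = 3t.
# The composite is t^3 + 9t^2, so dM/dt = 3t^2 + 18t.
M = lambda t, p1, p2: t * p1 + p2 ** 2
ps = [lambda t: t ** 2, lambda t: 3 * t]
t0 = 2.0
assert abs(total_time_derivative(M, ps, t0) - (3 * t0 ** 2 + 18 * t0)) < 1e-3
```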

This expression is often used in physics for a gauge transformation of the Lagrangian, as two Lagrangians that differ only by the total time derivative of a function of time and the n generalized coordinates lead to the same equations of motion. An interesting example is the resolution of causality in the Wheeler–Feynman time-symmetric theory. The operator in brackets (in the final expression above) is also called the total derivative operator (with respect to t).

For example, the total derivative of f(x(t), y(t)) is

${\displaystyle {\frac {\operatorname {d} f}{\operatorname {d} t}}={\partial f \over \partial x}{\operatorname {d} x \over \operatorname {d} t}+{\partial f \over \partial y}{\operatorname {d} y \over \operatorname {d} t}.}$

Here there is no ∂f/∂t term since f itself does not depend on the independent variable t directly.

## The total derivative via differentials

Differentials provide a simple way to understand the total derivative. For instance, suppose ${\displaystyle M(t,p_{1},\dots ,p_{n})}$ is a function of time t and n variables ${\displaystyle p_{i}}$ as in the previous section. Then, the differential of M is

${\displaystyle \operatorname {d} M={\frac {\partial M}{\partial t}}\operatorname {d} t+\sum _{i=1}^{n}{\frac {\partial M}{\partial p_{i}}}\operatorname {d} p_{i}.}$

This expression is often interpreted heuristically as a relation between infinitesimals. However, if the variables t and ${\displaystyle p_{i}}$ are interpreted as functions, and ${\displaystyle M(t,p_{1},\dots ,p_{n})}$ is interpreted to mean the composite of M with these functions, then the above expression makes perfect sense as an equality of differential 1-forms, and is immediate from the chain rule for the exterior derivative. The advantage of this point of view is that it takes into account arbitrary dependencies between the variables. For example, if ${\displaystyle p_{1}^{2}=p_{2}p_{3}}$ then ${\displaystyle 2p_{1}\operatorname {d} p_{1}=p_{3}\operatorname {d} p_{2}+p_{2}\operatorname {d} p_{3}}$. In particular, if the variables ${\displaystyle p_{i}}$ are all functions of t, as in the previous section, then

${\displaystyle \operatorname {d} M={\frac {\partial M}{\partial t}}\operatorname {d} t+\sum _{i=1}^{n}{\frac {\partial M}{\partial p_{i}}}{\frac {\partial p_{i}}{\partial t}}\,\operatorname {d} t.}$

Dividing through by dt gives the total derivative dM/dt.

## The total derivative as a linear map

Let ${\displaystyle U\subseteq \mathbb {R} ^{n}}$ be an open subset. Then a function ${\displaystyle f:U\rightarrow \mathbb {R} ^{m}}$ is said to be (totally) differentiable at a point ${\displaystyle u\in U}$, if there exists a linear map ${\displaystyle \operatorname {d} f_{u}:\mathbb {R} ^{n}\rightarrow \mathbb {R} ^{m}}$ (also denoted Duf or Df(u)) such that

${\displaystyle \lim _{x\rightarrow u}{\frac {\|f(x)-f(u)-\operatorname {d} f_{u}(x-u)\|}{\|x-u\|}}=0.}$

The linear map ${\displaystyle \operatorname {d} f_{u}}$ is called the (total) derivative or (total) differential of ${\displaystyle f}$ at ${\displaystyle u}$. A function is (totally) differentiable if its total derivative exists at every point in its domain.

Note that f is differentiable if and only if each of its components ${\displaystyle f_{i}:U\rightarrow \mathbb {R} }$ is differentiable. For this it is necessary, but not sufficient, that the partial derivatives of each function ${\displaystyle f_{i}}$ exist. However, if these partial derivatives exist and are continuous, then f is differentiable and its differential at any point is the linear map determined by the Jacobian matrix of partial derivatives at that point.[2]
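The limit in the definition can be illustrated numerically. For a hypothetical map ${\displaystyle f:\mathbb {R} ^{2}\rightarrow \mathbb {R} ^{2}}$, ${\displaystyle f(x,y)=(xy,\,x+y)}$, the Jacobian at u is ${\displaystyle {\begin{pmatrix}y&x\\1&1\end{pmatrix}}}$; the sketch below checks that the remainder quotient shrinks as x approaches u:

```python
import math

def f(v):
    x, y = v
    return (x * y, x + y)            # hypothetical f : R^2 -> R^2

def jacobian(v):
    x, y = v
    return ((y, x), (1.0, 1.0))      # matrix of partial derivatives at v

u = (1.0, 2.0)
ratios = []
for eps in (1e-1, 1e-2, 1e-3):
    x = (u[0] + eps, u[1] - eps)                 # a point approaching u
    dx = (x[0] - u[0], x[1] - u[1])
    J = jacobian(u)
    lin = (J[0][0] * dx[0] + J[0][1] * dx[1],    # df_u(x - u)
           J[1][0] * dx[0] + J[1][1] * dx[1])
    fu, fx = f(u), f(x)
    err = math.hypot(fx[0] - fu[0] - lin[0], fx[1] - fu[1] - lin[1])
    ratios.append(err / math.hypot(*dx))

# The quotient ||f(x) - f(u) - df_u(x - u)|| / ||x - u|| tends to 0
assert ratios[0] > ratios[1] > ratios[2]
assert ratios[2] < 1e-3
```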

## Total differential equation

A total differential equation is a differential equation expressed in terms of total derivatives. Since the exterior derivative is a natural operator, in a sense that can be given a technical meaning, such equations are intrinsic and geometric.

## Application to equation systems

In economics, it is common for the total derivative to arise in the context of a system of equations.[1]:217–220 For example, a simple supply-demand system might specify the quantity q of a product demanded as a function D of its price p and consumers' income I, the latter being an exogenous variable, and might specify the quantity supplied by producers as a function S of its price and two exogenous resource cost variables r and w. The resulting system of equations,

${\displaystyle q=D(p,I),}$
${\displaystyle q=S(p,r,w),}$

determines the market equilibrium values of the variables p and q. The total derivative of, for example, p with respect to r, ${\displaystyle {\frac {\operatorname {d} p}{\operatorname {d} r}},}$ gives the sign and magnitude of the reaction of the market price to the exogenous variable r. In the indicated system, there are a total of six possible total derivatives, also known in this context as comparative static derivatives: dp/dr, dp/dw, dp/dI, dq/dr, dq/dw, and dq/dI. The total derivatives are found by totally differentiating the system of equations, dividing through by, say, dr, treating dq/dr and dp/dr as the unknowns, setting dI = dw = 0, and solving the two totally differentiated equations simultaneously, typically by using Cramer's rule.
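As an illustrative sketch (not drawn from the cited text), take hypothetical linear forms D(p, I) = a − bp + cI and S(p, r, w) = d + ep − gr − kw. Totally differentiating, setting dI = dw = 0, and dividing through by dr gives a 2×2 linear system in dq/dr and dp/dr, which Cramer's rule solves:

```python
# Hypothetical linear forms: dq = -b dp + c dI (demand), dq = e dp - g dr - k dw (supply).
# With dI = dw = 0, dividing through by dr:
#   (dq/dr) + b (dp/dr) = 0
#   (dq/dr) - e (dp/dr) = -g
b, e, g = 2.0, 3.0, 1.5          # hypothetical slope parameters (all positive)

A = ((1.0, b), (1.0, -e))        # coefficient matrix on (dq/dr, dp/dr)
rhs = (0.0, -g)

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]              # = -(b + e), nonzero
dq_dr = (rhs[0] * A[1][1] - A[0][1] * rhs[1]) / det      # Cramer's rule
dp_dr = (A[0][0] * rhs[1] - rhs[0] * A[1][0]) / det

# A rise in the resource cost r shifts supply in, raising price and cutting quantity.
assert dp_dr > 0 and dq_dr < 0
assert abs(dp_dr - g / (b + e)) < 1e-12                  # closed form: g/(b+e)
assert abs(dq_dr + b * g / (b + e)) < 1e-12              # closed form: -bg/(b+e)
```

The signs match the economic intuition: dp/dr > 0 and dq/dr < 0 whenever demand slopes down (b > 0) and supply slopes up (e > 0).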

## References

1. ^ a b Chiang, Alpha C. (1984). Fundamental Methods of Mathematical Economics (Third ed.). McGraw-Hill. ISBN 0-07-010813-7.
2. ^ Abraham, Ralph; Marsden, J. E.; Ratiu, Tudor (2012). Manifolds, Tensor Analysis, and Applications. Springer Science & Business Media. p. 78.
• A. D. Polyanin and V. F. Zaitsev, Handbook of Exact Solutions for Ordinary Differential Equations (2nd edition), Chapman & Hall/CRC Press, Boca Raton, 2003. ISBN 1-58488-297-2