# Key Concepts

## 4.1 Related Rates

- To solve a related rates problem, first draw a picture that illustrates the relationship between the two or more related quantities that are changing with respect to time.
- In terms of the quantities, state the information given and the rate to be found.
- Find an equation relating the quantities.
- Use differentiation, applying the chain rule as necessary, to find an equation that relates the rates.
- Be sure not to substitute a variable quantity for one of the variables until after finding an equation relating the rates.
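
The steps above can be sketched with a classic example, an expanding circle whose radius grows at a known rate (the function name and numbers here are illustrative, not from the text):

```python
import math

def area_rate(r, dr_dt):
    """Rate of change of a circle's area, dA/dt = 2*pi*r * dr/dt,
    obtained by differentiating A = pi*r**2 with the chain rule."""
    return 2 * math.pi * r * dr_dt

# Radius 5 m, growing at 0.1 m/s: dA/dt = 2*pi*5*0.1 = pi m^2/s.
rate = area_rate(5.0, 0.1)
```

Note that the radius value 5 is substituted only after differentiating, exactly as the last step above warns.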

## 4.2 Linear Approximations and Differentials

- A differentiable function $y=f(x)$ can be approximated at $a$ by the linear function

$$L(x)=f(a)+f'(a)(x-a).$$

- For a function $y=f(x),$ if $x$ changes from $a$ to $a+dx,$ then

$$dy=f'(x)\,dx$$

is an approximation for the change in $y.$ The actual change in $y$ is

$$\Delta y=f(a+dx)-f(a).$$

- A measurement error $dx$ can lead to an error in a calculated quantity $f(x).$ The error in the calculated quantity is known as the *propagated error*. The propagated error can be estimated by

$$dy\approx f'(x)\,dx.$$

- To estimate the relative error of a particular quantity $q,$ we estimate $\frac{\Delta q}{q}.$
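
A small numeric check makes the distinction between $dy$ and $\Delta y$ concrete. This sketch (names chosen here for illustration) uses $f(x)=\sqrt{x}$ near $a=4$:

```python
import math

def f(x):
    return math.sqrt(x)

def fprime(x):
    return 0.5 / math.sqrt(x)

a, dx = 4.0, 0.1
L = f(a) + fprime(a) * dx      # linear approximation of f(a + dx)
dy = fprime(a) * dx            # differential: estimated change in y
delta_y = f(a + dx) - f(a)     # actual change in y
```

Here `dy = 0.025` while `delta_y` is about `0.02485`, so the differential overestimates the true change only slightly.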

## 4.3 Maxima and Minima

- A function may have both an absolute maximum and an absolute minimum, have just one absolute extremum, or have no absolute maximum or absolute minimum.
- If a function has a local extremum, the point at which it occurs must be a critical point. However, a function need not have a local extremum at a critical point.
- A continuous function over a closed, bounded interval has an absolute maximum and an absolute minimum. Each extremum occurs at a critical point or an endpoint.
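
The last point is the basis of the closed-interval method: compare the function's values at every critical point and at both endpoints. A minimal sketch for $f(x)=x^3-3x$ on $[0,2]$ (the setup is my own example, not from the text):

```python
def f(x):
    return x**3 - 3*x

# Candidates: the one critical point in [0, 2], x = 1
# (from f'(x) = 3x**2 - 3 = 0), plus the two endpoints.
candidates = [0.0, 1.0, 2.0]
values = {x: f(x) for x in candidates}
abs_max_at = max(values, key=values.get)   # x = 2, f(2) = 2
abs_min_at = min(values, key=values.get)   # x = 1, f(1) = -2
```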

## 4.4 The Mean Value Theorem

- If $f$ is continuous over $[a,b]$ and differentiable over $(a,b)$ and $f(a)=f(b),$ then there exists a point $c\in (a,b)$ such that $f'(c)=0.$ This is Rolle’s theorem.
- If $f$ is continuous over $[a,b]$ and differentiable over $(a,b),$ then there exists a point $c\in (a,b)$ such that

$$f'(c)=\frac{f(b)-f(a)}{b-a}.$$

This is the Mean Value Theorem.

- If $f'(x)=0$ over an interval $I,$ then $f$ is constant over $I.$
- If two differentiable functions $f$ and $g$ satisfy $f'(x)=g'(x)$ over $I,$ then $f(x)=g(x)+C$ for some constant $C.$
- If $f'(x)>0$ over an interval $I,$ then $f$ is increasing over $I.$ If $f'(x)<0$ over $I,$ then $f$ is decreasing over $I.$
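
The Mean Value Theorem guarantees a point $c$ where the tangent slope equals the average slope; when $f' - \text{slope}$ changes sign on $(a,b)$, bisection can locate such a $c$ numerically. A sketch under that assumption (function and names are illustrative):

```python
def mvt_point(f, fprime, a, b, tol=1e-12):
    """Find c in (a, b) with f'(c) equal to the average slope of f
    over [a, b], by bisection. Assumes f'(x) - slope changes sign
    on (a, b), which holds for the example below."""
    slope = (f(b) - f(a)) / (b - a)
    lo, hi = a, b
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if fprime(mid) - slope < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# f(x) = x**2 on [0, 2]: average slope is 2, and f'(c) = 2c = 2 at c = 1.
c = mvt_point(lambda x: x**2, lambda x: 2*x, 0.0, 2.0)
```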

## 4.5 Derivatives and the Shape of a Graph

- If $c$ is a critical point of $f$ and $f'(x)>0$ for $x<c$ and $f'(x)<0$ for $x>c,$ then $f$ has a local maximum at $c.$
- If $c$ is a critical point of $f$ and $f'(x)<0$ for $x<c$ and $f'(x)>0$ for $x>c,$ then $f$ has a local minimum at $c.$
- If $f''(x)>0$ over an interval $I,$ then $f$ is concave up over $I.$
- If $f''(x)<0$ over an interval $I,$ then $f$ is concave down over $I.$
- If $f'(c)=0$ and $f''(c)>0,$ then $f$ has a local minimum at $c.$
- If $f'(c)=0$ and $f''(c)<0,$ then $f$ has a local maximum at $c.$
- If $f'(c)=0$ and $f''(c)=0,$ then evaluate $f'(x)$ at a test point $x$ to the left of $c$ and a test point $x$ to the right of $c$ to determine whether $f$ has a local extremum at $c.$
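
The decision procedure in these bullets, second derivative test first, falling back to the first derivative test, can be sketched directly (the helper name and test-point spacing `h` are my own choices):

```python
def classify(fprime, fsecond, c, h=1e-3):
    """Classify a critical point c by the second derivative test,
    falling back to sign checks of f' at c - h and c + h when
    f''(c) == 0."""
    s = fsecond(c)
    if s > 0:
        return "local minimum"
    if s < 0:
        return "local maximum"
    left, right = fprime(c - h), fprime(c + h)
    if left > 0 > right:
        return "local maximum"
    if left < 0 < right:
        return "local minimum"
    return "no local extremum"

# f(x) = x**3 - 3x: critical points at x = -1 (max) and x = 1 (min).
fp = lambda x: 3*x**2 - 3
fpp = lambda x: 6*x
```

For $f(x)=x^3$ at $c=0$, both derivatives vanish and $f'$ is positive on both sides, so the fallback correctly reports no local extremum.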

## 4.6 Limits at Infinity and Asymptotes

- The limit of $f(x)$ is $L$ as $x\to \infty$ (or as $x\to -\infty$) if the values $f(x)$ become arbitrarily close to $L$ as $x$ becomes sufficiently large.
- The limit of $f(x)$ is $\infty$ as $x\to \infty$ if $f(x)$ becomes arbitrarily large as $x$ becomes sufficiently large. The limit of $f(x)$ is $-\infty$ as $x\to \infty$ if $f(x)<0$ and $|f(x)|$ becomes arbitrarily large as $x$ becomes sufficiently large. We can define the limit of $f(x)$ as $x$ approaches $-\infty$ similarly.
- For a polynomial function $p(x)=a_n x^n+a_{n-1}x^{n-1}+\cdots +a_1 x+a_0,$ where $a_n\ne 0,$ the end behavior is determined by the leading term $a_n x^n.$ If $n\ne 0,$ $p(x)$ approaches $\infty$ or $-\infty$ at each end.
- For a rational function $f(x)=\frac{p(x)}{q(x)},$ the end behavior is determined by the relationship between the degree of $p$ and the degree of $q.$ If the degree of $p$ is less than the degree of $q,$ the line $y=0$ is a horizontal asymptote for $f.$ If the degree of $p$ is equal to the degree of $q,$ then the line $y=\frac{a_n}{b_n}$ is a horizontal asymptote, where $a_n$ and $b_n$ are the leading coefficients of $p$ and $q,$ respectively. If the degree of $p$ is greater than the degree of $q,$ then $f$ approaches $\infty$ or $-\infty$ at each end.
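
The three-way degree comparison for rational functions translates into a short helper (a sketch with a name of my own choosing; coefficients are listed highest degree first):

```python
def horizontal_asymptote(p, q):
    """End behavior of p(x)/q(x) from coefficient lists (highest
    degree first, leading coefficients nonzero). Returns the
    horizontal asymptote's y-value, or None when deg p > deg q
    (the function approaches infinity or -infinity at each end)."""
    deg_p, deg_q = len(p) - 1, len(q) - 1
    if deg_p < deg_q:
        return 0.0
    if deg_p == deg_q:
        return p[0] / q[0]
    return None

# (3x**2 + 1) / (2x**2 - x): equal degrees, so y = 3/2.
```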

## 4.7 Applied Optimization Problems

- To solve an optimization problem, begin by drawing a picture and introducing variables.
- Find an equation relating the variables.
- Find a function of one variable to describe the quantity that is to be minimized or maximized.
- Look for critical points to locate local extrema.
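
As a worked instance of these steps (my own example, not from the text): maximize the area of a rectangle with perimeter 100. With one side $x$, the other side is $50-x$, giving a single-variable objective:

```python
def area(x):
    """Area of a rectangle with perimeter 100 and one side x,
    so the other side is 50 - x. Valid for 0 <= x <= 50."""
    return x * (50 - x)

# A'(x) = 50 - 2x = 0 gives the critical point x = 25,
# a local (and absolute) maximum since A''(x) = -2 < 0.
best_x = 25.0
best_area = area(best_x)   # 625.0
```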

## 4.8 L’Hôpital’s Rule

- L’Hôpital’s rule can be used to evaluate the limit of a quotient when the indeterminate form $\frac{0}{0}$ or $\infty/\infty$ arises.
- L’Hôpital’s rule can also be applied to other indeterminate forms if they can be rewritten in terms of a limit involving a quotient that has the indeterminate form $\frac{0}{0}$ or $\infty/\infty.$
- The exponential function $e^x$ grows faster than any power function $x^p,$ $p>0.$
- The logarithmic function $\ln x$ grows more slowly than any power function $x^p,$ $p>0.$
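
The last two growth-rate facts, which repeated applications of L’Hôpital’s rule establish, are easy to see numerically. A small illustration (the sample point and exponent are arbitrary):

```python
import math

# l'Hopital's rule shows e**x / x**p -> infinity and
# ln(x) / x**p -> 0 as x -> infinity, for any p > 0.
x = 100.0
p = 3
exp_ratio = math.exp(x) / x**p   # already astronomically large
log_ratio = math.log(x) / x**p   # already tiny
```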

## 4.9 Newton’s Method

- Newton’s method approximates roots of $f(x)=0$ by starting with an initial approximation $x_0,$ then using tangent lines to the graph of $f$ to create a sequence of approximations $x_1, x_2, x_3, \ldots.$
- Typically, Newton’s method is an efficient method for finding a particular root. In certain cases, Newton’s method fails because the list of numbers $x_0, x_1, x_2, \ldots$ does not approach a finite value or approaches a value other than the root sought.
- Any process in which a list of numbers $x_0, x_1, x_2, \ldots$ is generated by defining an initial number $x_0$ and defining the subsequent numbers by the equation $x_n=F(x_{n-1})$ for some function $F$ is an iterative process. Newton’s method is an example of an iterative process, where $F(x)=x-\frac{f(x)}{f'(x)}$ for a given function $f.$
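
The iteration $x_n = x_{n-1} - f(x_{n-1})/f'(x_{n-1})$ can be sketched in a few lines (the function name and stopping rule are my own choices; the loop bound guards against the non-convergent cases mentioned above):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: iterate x = x - f(x)/f'(x) until the step
    is smaller than tol, or give up after max_iter iterations."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton's method did not converge")

# Approximate sqrt(2) as the positive root of f(x) = x**2 - 2.
root = newton(lambda x: x*x - 2, lambda x: 2*x, 1.0)
```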

## 4.10 Antiderivatives

- If $F$ is an antiderivative of $f,$ then every antiderivative of $f$ is of the form $F\left(x\right)+C$ for some constant $C.$
- Solving the initial-value problem

$$\frac{dy}{dx}=f(x),\quad y(x_0)=y_0$$

requires us first to find the set of antiderivatives of $f$ and then to look for the particular antiderivative that also satisfies the initial condition.
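
For a concrete instance of this two-step procedure (my own example): to solve $\frac{dy}{dx}=2x,$ $y(x_0)=y_0,$ every antiderivative has the form $x^2+C,$ and the initial condition pins down $C$:

```python
def solve_ivp_for_2x(x0, y0):
    """Solve dy/dx = 2x with y(x0) = y0. The general antiderivative
    of 2x is x**2 + C; the initial condition gives C = y0 - x0**2."""
    C = y0 - x0**2
    return lambda x: x**2 + C

# With y(1) = 3, we get C = 2, so y(x) = x**2 + 2.
y = solve_ivp_for_2x(1.0, 3.0)
```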