MA 1024: Partial derivatives, directional
derivatives, and the gradient

Purpose

The purpose of this lab is to acquaint you with using Maple to compute partial derivatives, directional derivatives, and the gradient.

Getting Started

To assist you, there is a worksheet associated with this lab that contains examples and even solutions to some of the exercises. You can copy that worksheet to your home directory by going to your computer's Start menu, choosing Run, and typing the following in the Run field:

\\storage\academics\math\calclab\MA1024\Pardiff_grad_start_B17.mw

Remember to save it immediately in your own home directory. Once you've copied and saved the worksheet, read through the Background section of this lab and the background in the worksheet before starting the exercises.

Background

For a function $f(x)$ of a single real variable, the derivative $f'(x)$ gives information on whether the graph of $f$ is increasing or decreasing. Finding where the derivative is zero was important in finding extreme values. For a function $F(x,y)$ of two (or more) variables, the situation is more complicated.

Partial derivatives

A differentiable function, $F(x,y)$, of two variables has two partial derivatives: $\partial F /\partial x$ and $\partial F /\partial
y$. As you have learned in class, computing partial derivatives is very much like computing regular derivatives. The main difference is that when you are computing $\partial F /\partial x$, you must treat the variable $y$ as if it were a constant, and vice versa when computing $\partial F /\partial
y$.

The Maple commands for computing partial derivatives are D and diff. The Getting Started worksheet has examples of how to use these commands to compute partial derivatives.
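If you want to check a hand computation outside Maple, the same partial derivatives can be computed symbolically in Python with SymPy. This is only an illustrative stand-in for the Maple `diff` and `D` commands, and the function `F` below is a sample choice, not one from the lab:

```python
# Computing partial derivatives symbolically with SymPy.
# NOTE: the lab itself uses Maple's diff/D commands; this Python
# version is an illustrative cross-check with a sample function.
import sympy as sp

x, y = sp.symbols('x y')
F = x**2 * y + sp.sin(x * y)   # sample function, chosen for illustration

Fx = sp.diff(F, x)   # y is treated as a constant
Fy = sp.diff(F, y)   # x is treated as a constant

print(Fx)   # 2*x*y + y*cos(x*y)
print(Fy)   # x**2 + x*cos(x*y)
```

Note that, just as the text says, each partial derivative is an ordinary derivative with the other variable held fixed.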

Directional derivatives

The partial derivatives $\partial F /\partial x$ and $\partial F /\partial
y$ of $F$ can be thought of as the rate of change of $F$ in the direction parallel to the $x$ and $y$ axes, respectively. The directional derivative $D_{\mathbf{u}}F(\mathbf{p})$, where $\mathbf{u}$ is a unit vector, is the rate of change of $F$ in the direction $\mathbf{u}$. There are several different ways that the directional derivative can be computed. The method most often used for hand calculation relies on the gradient, which will be described below. It is also possible to simply use the definition

\begin{displaymath}D_{\mathbf{u}}F(\mathbf{p}) = \lim_{h \rightarrow 0}
\frac{F(\mathbf{p}+h\mathbf{u}) - F(\mathbf{p})}{h} \end{displaymath}

to compute the directional derivative. However, the following computation, based on the definition, is often simpler to use.

\begin{displaymath}D_{\mathbf{u}}F(\mathbf{p}) = \left. \frac{d}{dt}
F(\mathbf{p}+t\mathbf{u}) \right\vert _{t=0} \end{displaymath}

A helpful way to understand directional derivatives is to note that $\mathbf{p}+t\mathbf{u}$ is a straight line in the $x,y$ plane. The plane perpendicular to the $x,y$ plane that contains this line intersects the surface $z =
F(x,y)$ in a curve whose $z$ coordinate is $F(\mathbf{p}+t\mathbf{u})$. The derivative of $F(\mathbf{p}+t\mathbf{u})$ at $t=0$ is therefore the rate of change of $F$ at the point $\mathbf{p}$ moving in the direction $\mathbf{u}$.

Maple doesn't have a simple command for computing directional derivatives. There is a command in the tensor package that can be used, but it is a little confusing unless you know something about tensors. Fortunately, the method described above and the method using the gradient described below are both easy to implement in Maple. Examples are given in the Getting Started worksheet.
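The single-variable method above is easy to carry out in any computer algebra system. Here is a sketch in Python with SymPy (rather than Maple); the function, point, and unit vector are sample choices made for illustration:

```python
# Directional derivative via d/dt F(p + t*u) evaluated at t = 0.
# Illustrative SymPy version of the method described above; the lab's
# worksheet does the corresponding computation in Maple, and F, p,
# and u here are sample choices.
import sympy as sp

x, y, t = sp.symbols('x y t')
F = x**2 + 3*x*y                              # sample function
p = (1, 2)                                    # sample point
u = (sp.Rational(3, 5), sp.Rational(4, 5))    # unit vector: (3/5)^2 + (4/5)^2 = 1

g = F.subs({x: p[0] + t*u[0], y: p[1] + t*u[1]})   # F(p + t u), a function of t alone
DuF = sp.diff(g, t).subs(t, 0)

print(DuF)   # 36/5
```

As a check, the gradient of this sample $F$ at $(1,2)$ is $\langle 8, 3 \rangle$, and $\langle 8, 3 \rangle \cdot \langle 3/5, 4/5 \rangle = 36/5$, which agrees with the gradient method described below.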

The Gradient

The gradient of $F$, written $\nabla F$, is most easily computed as

\begin{displaymath}\nabla F(\mathbf{p}) = \frac{\partial F}{\partial x}(\mathbf{p}) \mathbf{i} + \frac{\partial F}{\partial y}(\mathbf{p}) \mathbf{j} \end{displaymath}

As described in the text, the gradient has several important properties, including the following: for a unit vector $\mathbf{u}$, the directional derivative satisfies $D_{\mathbf{u}}F(\mathbf{p}) = \nabla F(\mathbf{p}) \cdot \mathbf{u}$; at each point, $\nabla F$ points in the direction in which $F$ increases most rapidly, and its magnitude is that maximum rate of increase; and $\nabla F$ is perpendicular to the level curves of $F$.

Maple has a fairly simple command, grad, in the linalg package (which we used earlier for curve computations). Examples of computing gradients, using the gradient to compute directional derivatives, and plotting the gradient field are all in the Getting Started worksheet.
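The gradient gives the hand-calculation method for directional derivatives mentioned earlier: $D_{\mathbf{u}}F(\mathbf{p}) = \nabla F(\mathbf{p}) \cdot \mathbf{u}$ for a unit vector $\mathbf{u}$. A SymPy sketch of this dot-product computation (Python rather than Maple's grad; the function, point, and direction are sample choices):

```python
# Gradient and gradient-based directional derivative in SymPy.
# Illustrative only: the lab uses Maple's grad command from the
# linalg package; F, p, and u below are sample choices.
import sympy as sp

x, y = sp.symbols('x y')
F = x * y**2                                   # sample function
grad_F = [sp.diff(F, x), sp.diff(F, y)]        # [y**2, 2*x*y]

p = {x: 2, y: 1}                               # sample point (2, 1)
u = (sp.Rational(1, 2), sp.sqrt(3)/2)          # unit vector at 60 degrees

grad_at_p = [g.subs(p) for g in grad_F]        # [1, 4]
DuF = grad_at_p[0]*u[0] + grad_at_p[1]*u[1]    # gradient dotted with u

print(DuF)   # 1/2 + 2*sqrt(3)
```

The same answer would come from differentiating $F(\mathbf{p}+t\mathbf{u})$ at $t=0$, which makes the dot-product formula a convenient shortcut once the gradient is known.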

Classifying Local Extrema

In single-variable calculus, we found that we could locate candidates for local extreme values by finding points where the first derivative vanishes. For functions of two variables, the condition is that both first-order partial derivatives must vanish at a local extreme value candidate point. Such a point is called a stationary point; it is one of the types of points called critical points. Note carefully that the condition does not say that a point where the partial derivatives vanish must be a local extreme point. Rather, it says that stationary points are candidates for local extrema. Just as was the case for functions of a single variable, there can be stationary points that are not extrema. For example, the saddle surface $f(x,y) = x^2-y^2$ has a stationary point at the origin, but the origin is not a local extremum.

Each critical point $(x_0,y_0)$ can be classified as a local maximum, local minimum, or a saddle point using the second-partials test:

If $f_{xx}(x_0,y_0)f_{yy}(x_0,y_0)-[f_{xy}(x_0,y_0)]^2 >0 $ and $f_{xx}(x_0,y_0) > 0$ then $f(x_0,y_0)$ is a local minimum.

If $f_{xx}(x_0,y_0)f_{yy}(x_0,y_0)-[f_{xy}(x_0,y_0)]^2 >0 $ and $f_{xx}(x_0,y_0) < 0$ then $f(x_0,y_0)$ is a local maximum.

If $f_{xx}(x_0,y_0)f_{yy}(x_0,y_0)-[f_{xy}(x_0,y_0)]^2 <0$ then $f(x_0,y_0)$ is a saddle point.

If $f_{xx}(x_0,y_0)f_{yy}(x_0,y_0)-[f_{xy}(x_0,y_0)]^2 =0$ then no conclusion can be made.

The examples in the Getting Started worksheet are intended to help you learn how to use Maple to simplify these tasks.
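The second-partials test is mechanical enough to script. Here is a sketch in Python with SymPy (an illustrative stand-in for the Maple steps in the worksheet), applied to the saddle example $f(x,y) = x^2 - y^2$ at its stationary point $(0,0)$:

```python
# Second-partials test at a stationary point, using SymPy.
# Illustrative stand-in for the Maple computations in the worksheet;
# f is the saddle example from the text.
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 - y**2

fxx = sp.diff(f, x, 2)              # 2
fyy = sp.diff(f, y, 2)              # -2
fxy = sp.diff(f, x, y)              # 0

# Discriminant D = fxx*fyy - fxy^2 evaluated at the stationary point (0, 0)
pt = {x: 0, y: 0}
D = (fxx*fyy - fxy**2).subs(pt)

if D > 0 and fxx.subs(pt) > 0:
    label = "local minimum"
elif D > 0 and fxx.subs(pt) < 0:
    label = "local maximum"
elif D < 0:
    label = "saddle point"
else:
    label = "inconclusive"

print(D, label)   # -4 saddle point
```

Here $D = (2)(-2) - 0^2 = -4 < 0$, so the test classifies the origin as a saddle point, in agreement with the discussion above.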

Exercises

For the function $\displaystyle f(x,y)=\exp(-\frac{1}{3}y^3+y-x^2)$,
  1. Using the method from the Getting Started worksheet, compute the directional derivative of $f$ at the point $\displaystyle (-1,1)$ in each of the directions below. Explain your results in terms of being positive, negative or zero and what that tells about the surface at that point in the given direction.
    1. $\displaystyle \mathbf{u} = \langle -1, 0 \rangle$
    2. $\displaystyle \mathbf{u} = \langle 1, 0 \rangle$
    3. $\displaystyle \mathbf{u} = \langle 0, 1 \rangle$
    4. $\displaystyle \mathbf{u} = \langle 0, -1 \rangle$

  2. Now, find the directional derivative at each of the points $\displaystyle (0,1)$ and $\displaystyle (0,-1)$ in the directions $\displaystyle \mathbf{u} = \langle -1, 1 \rangle$ and $\displaystyle \mathbf{u} = \langle 1, 1 \rangle$. What do your results suggest about the surface at these points? Explain.

  3. Using the method from the Getting Started worksheet, plot the gradient field and the contours of $f$ on the same plot over the intervals $-2 \leq x \leq 2$ and $-2 \leq y \leq 2$. Use 30 contours, a $[30,30]$ grid and fieldstrength=fixed for the gradient plot. Use two sentences, one for each point in the previous exercise, to classify the point using both the gradient field and the contour plot in your explanation.

  4. Use the second derivative test for the same two points above to confirm your classification.

Dina J. Solitro-Rassias
2017-11-12