Week 2: Differentiability#

Key Terms#

  • Vector functions of multiple variables

  • Directional derivative

  • Differentiability

  • The Jacobian matrix, the gradient vector

  • The chain rule

  • The Hessian matrix

Preparation and Syllabus#


Exercises – Long Day#

1: Level Curves and Directional Derivative for Scalar Functions#

A function \(f:\mathbb{R}^2\rightarrow\mathbb{R}\) is given by the expression

\[\begin{equation*} f(x,y)=x^2+y^2. \end{equation*}\]

Another function \(g:\mathbb{R}^2\rightarrow\mathbb{R}\) is given by the expression

\[\begin{equation*} g(x,y)=x^2-4x+y^2. \end{equation*}\]

Question a#

Describe the level curves given by \(f(x,y)=c\) for the values \(c\in\{1,2,3,4,5\}\).

Question b#

Determine the gradient of \(f\) at the point \((1,1)\) and determine the directional derivative of \(f\) at the point \((1,1)\) in the direction that is given by the unit direction vector \(\pmb{e}=(1,0)\).

Question c#

Describe the level curves given by \(g(x,y)=c\) for the values \(c \in\{-3,-2,-1,0,1\}\).

Question d#

Compute the gradient of \(g\) at the point \((1,2)\) and compute the directional derivative of \(g\) at the point \((1,2)\) in the direction towards the origin \((0,0)\).
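The gradient and directional-derivative computations in this exercise can be checked in SymPy. A minimal sketch for \(f\) at the point \((1,1)\) along \(\pmb{e}=(1,0)\) (variable names are our own choice):

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**2 + y**2

# Gradient of f as a vector of partial derivatives
grad_f = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])

# Evaluate the gradient at the point (1, 1)
grad_at_point = grad_f.subs({x: 1, y: 1})

# Directional derivative at (1,1) along the unit vector e = (1, 0):
# the dot product of the gradient with e
e = sp.Matrix([1, 0])
dir_deriv = grad_at_point.dot(e)
print(grad_at_point, dir_deriv)
```

The same pattern applies to \(g\); only the expression, the point, and the (normalised!) direction vector change.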

2: Jacobian Matrices for Different Functions#

Below we define functions of the form \(\pmb{f}: \mathbb{R}^n \to \mathbb{R}^k\), where \(n\) and \(k\) can be read off from the functional expression.

Question a#

  1. Let \({f}(x_1, x_2, x_3) = x_1^2x_2 + 2x_3\). Compute the Jacobian matrix \(J_{f}(\pmb{x})\) and evaluate it at the point \(\pmb{x} = (1, -1, 3)\). Confirm that the Jacobian matrix of a scalar function of multiple variables only has one row.

  2. Let \(\pmb{f}(x) = (3x, x^2, \sin(2x))\). Compute the Jacobian matrix \(J_{\pmb{f}}(x)\) and evaluate it at the point \(x = 2\). Confirm that the Jacobian matrix of a vector function of one variable only has one column.

  3. Let \(\pmb{f}(x_1, x_2) = (x_1^2, -3x_2, 12x_1)\). Compute the Jacobian matrix \(J_{\pmb{f}}(\pmb{x})\) and evaluate it at the point \(\pmb{x} = (2, 0)\).

  4. Let \(\pmb{f}(x_1, x_2, x_3) = (x_2 \sin(x_3), 3x_1x_2 \ln(x_3))\). Compute the Jacobian matrix \(J_{\pmb{f}}(\pmb{x})\) and evaluate it at the point \(\pmb{x} = (-1, 3, 2)\).

  5. Let \(\pmb{f}(x_1, x_2, x_3) = (x_1 e^{x_2}, 3x_2 \sin(x_2), -x_1^2 \ln(x_2 + x_3))\). Compute the Jacobian matrix \(J_{\pmb{f}}(\pmb{x})\) and evaluate it at the point \(\pmb{x} = (1, 0, 1)\).
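Hand computations of these Jacobian matrices can be verified with SymPy's `Matrix.jacobian` method. A sketch for the function in item 4 (the other items follow the same pattern):

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3', real=True)

# The function from item 4, written as a column vector in SymPy
F = sp.Matrix([x2*sp.sin(x3), 3*x1*x2*sp.ln(x3)])

# Matrix.jacobian differentiates each entry with respect to each variable
J = F.jacobian([x1, x2, x3])

# Evaluate at the point (-1, 3, 2); a 2x3 matrix, as expected
# for a function from R^3 to R^2
J_at_point = J.subs({x1: -1, x2: 3, x3: 2})
print(J_at_point)
```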

Question b#

All functions from the previous question are differentiable. How can one argue for this? For which of the functions can we determine the Hessian matrix? Compute the Hessian matrix of the functions for which it is defined.

Question c#

Let \(\pmb{v} = (1,1,1)\). Normalise the vector \(\pmb{v}\) and denote the result by \(\pmb{e}\). Check that \(||\pmb{e}||=1\). Compute the directional derivative of the scalar function \({f}(x_1, x_2, x_3) = x_1^2x_2 + 2x_3\) at the point \(\pmb{x} = (1, -1, 3)\) in the direction along \(\pmb{v}\). Then compute \(J_f(\pmb{x}) \pmb{e}\). Compare with the directional derivative. Are they equal? If so, is that a coincidence?

3: Description of Sets in the Plane#

For each of the four cases below, draw a sketch of the given set \(\,A\,\), its interior \(\,A^{\circ}\,\), its boundary \(\,\partial A\,\) and its closure \(\,\bar{A}\,\). Investigate further whether \(\,A\,\) is open, closed, or neither. Finally, state whether or not \(\,A\,\) is bounded.

  1. \(\{(x,y)\,\vert\, xy\neq 0\}\)

  2. \(\{(x,y)\,\vert\, 0<x<1\,\,\,\mathrm{and}\,\,\,1\leq y\leq 3\}\)

  3. \(\{(x,y)\,\vert\, y\geq x^2 \,\,\,\mathrm{and}\,\,\,y<2 \}\)

  4. \(\{(x,y)\,\vert\, x^2+y^2-2x+6y\leq 15 \}\)

4: All Linear Maps from \(\mathbb{R}^n\) to \(\mathbb{R}\)#

Let \(L: \mathbb{R}^n \to \mathbb{R}\) be an arbitrary linear map. Let \(e = \pmb{e}_1, \pmb{e}_2, \dots, \pmb{e}_n\) be the standard basis of \(\mathbb{R}^n\), and let \(\beta\) be the standard basis of \(\mathbb{R}\). Recall the standard basis from Mathematics 1a. Note that since the dimension of \(\mathbb{R}\) (over \(\mathbb{R}\)) is 1, the standard basis of \(\mathbb{R}\) is just the number \(1\).

Show that a column vector \(\pmb{c} \in \mathbb{R}^n\) exists such that

\[\begin{equation*} L(\pmb{x}) = \pmb{c}^T \pmb{x} = \langle \pmb{x}, \pmb{c} \rangle , \end{equation*}\]

where \(\langle \cdot, \cdot \rangle\) denotes the usual inner product on \(\mathbb{R}^n\). (The column vector is uniquely determined, but arguing for that is not part of this exercise.)

5: Linear(?) Vector Functions#

We consider the following two functions:

  1. \(f: \mathbb{R}^{2 \times 2} \to \mathbb{R}^{2 \times 2}, f(X) = C X B\), where \(C = \mathrm{diag}(2,1) \in \mathbb{R}^{2 \times 2}\) and \(B = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}\).

  2. \(g: \mathbb{R}^n \to \mathbb{R}, g(\pmb{x}) = \pmb{x}^T A \pmb{x}\), where \(A\) is an \(n \times n\) matrix (and not the zero matrix).

Determine for each function whether it is a linear map. If the map is linear, then find the mapping matrix with respect to, respectively:

  1. the standard basis \(E=\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}\) in \(\mathbb{R}^{2 \times 2}\). Remember this example from Math1a.

  2. the standard basis \(e\) in \(\mathbb{R}^n\). Remember this result from Math1a.
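If you conclude that a map is linear, its mapping matrix can be checked in SymPy by mapping each standard basis matrix and collecting the coordinate vectors of the images as columns. A sketch for the map \(f(X) = CXB\) in case 1 (assuming, for the sake of the sketch, that it is linear; verifying linearity is part of the exercise):

```python
import sympy as sp

C = sp.diag(2, 1)
B = sp.Matrix([[1, 1], [0, 1]])

def f(X):
    return C * X * B

# Standard basis of R^{2x2}, in the order given in the exercise
E = [sp.Matrix(2, 2, lambda i, j: 1 if (i, j) == (r, c) else 0)
     for r in range(2) for c in range(2)]

# The mapping matrix has the coordinate vectors of f(E_1), ..., f(E_4)
# with respect to E as its columns. Reading a 2x2 matrix entry by entry,
# row by row, gives exactly those coordinates.
columns = [sp.Matrix([f(Ek)[i, j] for i in range(2) for j in range(2)])
           for Ek in E]
M = sp.Matrix.hstack(*columns)
print(M)
```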

6: The Chain Rule for an Altitude Function#

We consider a real function of two real variables given by the expression

\[\begin{equation*} f(x,y)=\ln(9-x^2-y^2). \end{equation*}\]

Question a#

Determine the domain of \(f\), and characterise it using terms such as open, closed, bounded, unbounded.

We now consider a parametrized curve \(\pmb{r}\) in the \((x,y)\)-plane given by

\[\begin{equation*} \pmb{r}(u)=(u,u^3)\,,\,u\in \left[-1.2\,,\,1.2\right]. \end{equation*}\]

Question b#

Which curve are we dealing with here (you are familiar with its equation!)?

We now consider the composite function

\[\begin{equation*} h(u)=f(\pmb{r}(u)). \end{equation*}\]

Question c#

Why is it reasonable to call \(h\) an altitude function?

Question d#

Determine \(h'(1)\,\) using two different methods:

  1. Determine a functional expression for \(h(u)\), and differentiate it as usual.

  2. Use the chain rule in Section 3.7.
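Both methods can be checked against each other in SymPy. A sketch (the two computations below should agree):

```python
import sympy as sp

u, x, y = sp.symbols('u x y', real=True)
f = sp.ln(9 - x**2 - y**2)
r = sp.Matrix([u, u**3])

# Method 1: substitute the curve into f and differentiate as usual
h = f.subs({x: r[0], y: r[1]})
h1 = sp.diff(h, u).subs(u, 1)

# Method 2 (chain rule): gradient of f at r(1) dotted with r'(1)
grad_f = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])
chain = grad_f.subs({x: 1, y: 1}).dot(r.diff(u).subs(u, 1))

print(sp.simplify(h1), sp.simplify(chain))
```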

7: Partial Derivatives but not Differentiable#

The function \(f:\mathbb{R}^2 \to \mathbb{R}\) where

\[\begin{equation*} f(x_1,x_2)=x_1^2-4x_1+x_2^2 \end{equation*}\]

is given.

Question a#

Let \(\pmb{x}_0 = (x_1,x_2) \in \mathbb{R}^2\) be an arbitrary point. Justify that \(f\) is differentiable at \(\pmb{x}_0\), and compute the gradient of \(f\) at \(\pmb{x}_0\).

Hard version: Solve the task directly from the definition of differentiability in Section 3.6.

Soft version: Use the result in this theorem.

Question b#

To conclude differentiability from the partial derivatives according to this theorem, the partial derivatives are required to be continuous. Why is it not enough that the partial derivatives merely exist? We will investigate this question via a concrete example. But first we generalize a statement, well known from high school, about a function of one variable: if it is differentiable at a point, then it is also continuous at that point.

Show that if a function of two variables is differentiable at a point \(\pmb{x}_0\), then it is also continuous at that point.

And now for the example. We consider the function

\[\begin{equation*} f(x_1,x_2) = \begin{cases} \frac{x_1^2x_2}{x_1^4+x_2^2}, & \text{for } (x_1,x_2) \neq (0,0) \\ 0, & \text{for } (x_1,x_2)=(0,0). \end{cases} \end{equation*}\]

Question c#

Show that the partial derivatives of \(f\) exist at \((0,0)\), but that \(f\) is not differentiable at this point.
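As a hint for the second part, it can pay off to study how \(f\) behaves along different paths through the origin. A SymPy sketch of that idea (the conclusions to draw from it are left to the exercise):

```python
import sympy as sp

x1, x2, t = sp.symbols('x1 x2 t', real=True)
f = x1**2 * x2 / (x1**4 + x2**2)

# Along the coordinate axes the function is identically zero,
# which is relevant for the partial derivatives at (0,0)
along_x1_axis = f.subs(x2, 0)
along_x2_axis = f.subs(x1, 0)

# Along the parabola x2 = x1^2 the function is constant,
# which is relevant for the behaviour of f near (0,0)
along_parabola = sp.simplify(f.subs({x1: t, x2: t**2}))
print(along_x1_axis, along_x2_axis, along_parabola)
```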

8: The Generalized Chain Rule#

In this exercise we will be using the theorem: Generalized chain rule.

Given functions:

  1. \(\pmb{g} : \mathbb{R}^3 \to \mathbb{R}^2\) defined by \(\pmb{g}(x_1, x_2, x_3) = (g_1(x_1, x_2, x_3), g_2(x_1, x_2, x_3))\), where:

    \[\begin{align*} g_1(x_1, x_2, x_3) &= x_1^2 + x_2^2 + x_3^2, \\ g_2(x_1, x_2, x_3) &= e^{x_1 + x_2} \, \cos(x_3). \end{align*}\]
  2. \(f : \mathbb{R}^2 \to \mathbb{R}\) defined by \(f(y_1, y_2) = y_1 \, \sin(y_2)\).

  3. The composition of these functions: \(h = f \circ \pmb{g}\).

In this exercise we will compute the Jacobian matrix of \(h\) (with respect to the variables \(x_1, x_2,\) and \(x_3\)) using the generalized chain rule. You may carry out the computations in SymPy.

Question a#

Find the functional expression of \(h\) as well as the domain and co-domain. Compute the gradient of \(h\).

Question b#

Compute the Jacobian matrix of \(\pmb{g}\). Compute the Jacobian matrix of \(f\). What is the connection between the gradient and the Jacobian matrix of \(f\)?

Question c#

Now use the chain rule and the Jacobian matrices from the previous question to find the Jacobian matrix of \(h\). Compare with the answer to Question a.
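Since the exercise allows SymPy, here is a sketch of how the chain-rule product of Jacobians can be compared against direct differentiation of the composite:

```python
import sympy as sp

x1, x2, x3, y1, y2 = sp.symbols('x1 x2 x3 y1 y2', real=True)

g = sp.Matrix([x1**2 + x2**2 + x3**2, sp.exp(x1 + x2) * sp.cos(x3)])
f = sp.Matrix([y1 * sp.sin(y2)])

Jg = g.jacobian([x1, x2, x3])             # 2x3 matrix
Jf = f.jacobian([y1, y2])                 # 1x2 matrix
Jf_at_g = Jf.subs({y1: g[0], y2: g[1]})   # J_f evaluated at g(x)

# Generalized chain rule: J_h(x) = J_f(g(x)) * J_g(x)
Jh_chain = Jf_at_g * Jg

# Direct computation on the composite h = f(g(x))
h = f.subs({y1: g[0], y2: g[1]})
Jh_direct = h.jacobian([x1, x2, x3])

print(sp.simplify(Jh_chain - Jh_direct))  # difference should vanish
```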

9: Gradient Vector Field and Hessian Matrix#

Question a#

The gradient vector of \(f(x_1, x_2) = x_1^2 \sin(x_2)\) is \(\nabla f(\pmb{x})=(2x_1 \sin(x_2),x_1^2 \cos(x_2))\). The gradient vector can thus be considered as a map \(\nabla f : \mathrm{dom}(f) \to \mathbb{R}^2\). Write down the map as a function (where you state \(\mathrm{dom}(f)\)) and plot it as a vector field.

Question b#

Now compute the Jacobian matrix of \(\nabla f : \mathbb{R}^2 \to \mathbb{R}^2\) at the point \((x_1,x_2)\).

Question c#

Compute the Hessian matrix of \(f : \mathbb{R}^2 \to \mathbb{R}\) at the point \((x_1,x_2)\), and compare with the answer to the previous question.
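The comparison asked for in Questions b and c can be carried out in SymPy, which has a built-in `hessian` function. A sketch:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**2 * sp.sin(x2)

# The gradient, regarded as a vector field R^2 -> R^2
grad_f = sp.Matrix([sp.diff(f, x1), sp.diff(f, x2)])

# Jacobian matrix of the gradient field ...
J_grad = grad_f.jacobian([x1, x2])

# ... and the Hessian matrix of f computed directly
H = sp.hessian(f, (x1, x2))

print(J_grad == H)  # the two matrices coincide
```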


Theme Exercise – Short Day#

Today we will work through Theme Exercise 1.