Week 3: Inner-Product Spaces#

Key Terms#

  • Vector spaces with inner product and norm

  • \(\mathbb{R}^n\) and \(\mathbb{C}^n\)

  • Projections on the line

  • Orthonormal bases

  • The Gram-Schmidt procedure

  • Orthogonal and unitary matrices

Preparation and Syllabus#


Exercises – Long Days#

1: Orthonormal Basis (Coordinates). By Hand.#

Question a#

Do the vectors

\[\begin{equation*} \pmb{u}_1=\left( \frac{1}{3},\frac{2}{3},\frac{2}{3}\right), \quad \pmb{u}_2=\left( \frac{2}{3}, \frac{1}{3}, -\frac{2}{3}\right),\quad \pmb{u}_3=\left( \frac{2}{3}, -\frac{2}{3}, \frac{1}{3}\right) \end{equation*}\]

constitute an orthonormal basis in \(\mathbb{R}^3\)?

Question b#

Consider the vector \(\pmb{x} = [1,2,3]^T\). Compute the inner products \(\langle \pmb{x}, \pmb{u}_k \rangle\) for \(k=1,2,3\).

Question c#

Let us denote the basis by \(\beta = \pmb{u}_1, \pmb{u}_2, \pmb{u}_3\). State the coordinate vector \({}_{\beta} \pmb{x}\) for \(\pmb{x}\) with respect to \(\beta\). Compute the norm of both \(\pmb{x}\) and the coordinate vector \({}_{\beta} \pmb{x}\).

Question d#

Create the \(3\times 3\) matrix \(U = [\pmb{u}_1 \vert \pmb{u}_2 \vert \pmb{u}_3]\) with \(\pmb{u}_1, \pmb{u}_2, \pmb{u}_3\) as its three columns. Compute \(U^T \pmb{x}\) and compare the result with those of the previous questions.
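This exercise is meant to be done by hand, but the results can afterwards be checked in SymPy; a minimal sketch, with the matrix typed in using exact `Rational` entries:

```python
from sympy import Matrix, Rational

# the basis vectors u1, u2, u3 as the columns of U (exact rational entries)
U = Matrix([[Rational(1, 3),  Rational(2, 3),  Rational(2, 3)],
            [Rational(2, 3),  Rational(1, 3), -Rational(2, 3)],
            [Rational(2, 3), -Rational(2, 3),  Rational(1, 3)]])
x = Matrix([1, 2, 3])

U.T * U, U.T * x   # the identity matrix if the columns are orthonormal; the inner products <x, u_k>
```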

2: Orthonormal Basis (Construction). By Hand.#

Create an orthonormal basis in \(\mathbb{R}^3\), in which

\[\begin{equation*} \left(\frac {\sqrt 2}2,\frac {\sqrt 2}2,0\right) \end{equation*}\]

is the first basis vector.

3: Orthonormalizing. By Hand.#

Determine the solution set to the homogeneous equation

\[\begin{equation*} x_1+x_2+x_3=0 \end{equation*}\]

and justify that it is a subspace of \(\mathbb{R}^3\). Find an orthonormal basis for this solution space.
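A possible SymPy check of your hand calculation, using `nullspace` for the solution space and the built-in `GramSchmidt` function (the nullspace vectors are just one choice of basis):

```python
from sympy import Matrix, GramSchmidt

A = Matrix([[1, 1, 1]])     # coefficient matrix of x1 + x2 + x3 = 0
ns = A.nullspace()          # a basis for the solution space
B = GramSchmidt(ns, orthonormal=True)
B
```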

4: Orthogonal Projections#

Let \(\boldsymbol{y}=(2,1,2) \in \mathbb{R}^3\) be given. Then the projection of \(\boldsymbol{x} \in \mathbb{R}^3\) onto the line \(Y = \mathrm{span}\{\boldsymbol{y}\}\) is given by:

\[\begin{equation*} \operatorname{Proj}_Y(\boldsymbol{x}) = \frac{\left<\boldsymbol{x},\boldsymbol{y} \right>}{\left<\boldsymbol{y},\boldsymbol{y} \right>}\boldsymbol{y} = \left<\boldsymbol{x},\boldsymbol{u}\right>\boldsymbol{u}, \end{equation*}\]

where \(\boldsymbol{u} = \frac{\boldsymbol{y}}{||\boldsymbol{y}||}\).
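The projection formula can be sketched directly in SymPy; the helper name `proj_Y` below is our own choice, and the standard dot product on \(\mathbb{R}^3\) is assumed:

```python
from sympy import Matrix

y = Matrix([2, 1, 2])
u = y / y.norm()            # unit vector spanning the line Y

def proj_Y(x):
    # orthogonal projection of x onto span{y}
    return x.dot(u) * u

proj_Y(Matrix([1, 2, 3]))
```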

Question a#

Let \(\boldsymbol{x} = (1,2,3) \in \mathbb{R}^3\). Compute \(\operatorname{Proj}_Y(\boldsymbol{x})\), \(\operatorname{Proj}_Y(\boldsymbol{y})\) and \(\operatorname{Proj}_Y(\boldsymbol{u})\).

Question b#

We, as usual, consider all vectors as column vectors. Now set up the \(3 \times 3\) matrix \(P = \boldsymbol{u} \boldsymbol{u}^T\) and compute both \(P\boldsymbol{x}\) and \(P\boldsymbol{y}\).

5: An Orthonormal Basis for a Subspace of \(\mathbb{C}^4\)#

Question a#

Find an orthonormal basis \(\pmb{u}_1, \pmb{u}_2\) for the subspace \(Y = \mathrm{span}\{\pmb{v}_1, \pmb{v}_2\}\) spanned by the vectors:

v1 = Matrix([I, 1, 1, 0])
v2 = Matrix([0, I, I, sqrt(2)])
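For checking your result, a small hand-rolled Gram-Schmidt step keeps the complex inner-product convention explicit; a sketch, assuming the convention \(\langle \pmb{u}, \pmb{v} \rangle = \pmb{v}^* \pmb{u}\) and our own helper name `inner`:

```python
from sympy import Matrix, I, sqrt, simplify

def inner(u, v):
    # complex inner product <u, v> = v* u (assumed convention)
    return (v.H * u)[0]

v1 = Matrix([I, 1, 1, 0])
v2 = Matrix([0, I, I, sqrt(2)])

u1 = v1 / sqrt(inner(v1, v1))
w2 = v2 - inner(v2, u1) * u1          # remove the component of v2 along u1
u2 = simplify(w2 / sqrt(inner(w2, w2)))
u1, u2
```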

Question b#

Let

\[\begin{equation*} \pmb{x} = \left[\begin{matrix}3 i\\3 - 2 i\\3 - 2 i\\-2\sqrt{2}\end{matrix}\right]. \end{equation*}\]

Compute \(\langle \pmb{x}, \pmb{u}_1 \rangle\), \(\langle \pmb{x}, \pmb{u}_2 \rangle\) as well as

\[\begin{equation*} \langle \pmb{x}, \pmb{u}_1 \rangle \pmb{u}_1 + \langle \pmb{x}, \pmb{u}_2 \rangle \pmb{u}_2 . \end{equation*}\]

What does this linear combination yield? Does \(\pmb{x}\) belong to the subspace \(Y\)?

6: A Python Algorithm#

Question a#

Consider the following code and explain what it does. Before you run the code in a Jupyter Notebook, you must explain what the output will be.

from sympy import *
from dtumathtools import *
init_printing()

x1, x2, x3 = symbols('x1:4', real=True)
eqns = [Eq(1*x1 + 2*x2 + 3*x3, 1), Eq(4*x1 + 5*x2 + 6*x3, 0), Eq(5*x1 + 7*x2 + 8*x3, -1)]
eqns
A, b = linear_eq_to_matrix(eqns,x1,x2,x3)
T = A.row_join(b)  # augmented matrix

A, b, T

Question b#

We continue the Jupyter Notebook with the following code (do not run it yet). Go through the code by hand, carrying out the for-loop iterations manually. Which matrix \(T\) will be the output? Copy/paste the code into a chatbot such as https://copilot.microsoft.com/ (log in with your DTU account) and ask it to explain the code line by line. Then check the result by running the code in the Jupyter Notebook. Remember that T.shape[0] gives the number of rows in the matrix \(T\).

for col in range(T.shape[0]):
    for row in range(col + 1, T.shape[0]):
        T[row, :] = T[row, :] - T[row, col] / T[col, col] * T[col, :]
    T[col, :] = T[col, :] / T[col, col]

T

Question c#

Write Python code that ensures zeroes above the diagonal in the matrix \(T\) such that \(T\) ends up in reduced row echelon form.

Note

You need not take into account potential divisions by zero (which can occur for general \(T\) matrices). We will assume that the calculations work out.

Question d#

Which algorithm have we implemented? Test the same algorithm on:

x1, x2, x3, x4 = symbols('x1:5', real=True)
eqns = [Eq(1*x1 + 2*x2 + 3*x3, 1), Eq(4*x1 + 5*x2 + 6*x3, 0), Eq(4*x1 + 5*x2 + 6*x3, 0), Eq(5*x1 + 7*x2 + 8*x3, -1)]
A, b = linear_eq_to_matrix(eqns,x1,x2,x3,x4)
T = A.row_join(b)  # augmented matrix

7: Orthogonal Polynomials#

This is an exercise from the textbook. You can find help there.

Consider the list \(\alpha=1,x,x^{2},x^{3}\) of polynomials in \(P_{3}([-1,1])\) equipped with the \(L^{2}\) inner product.

Question a#

Argue why \(\alpha\) is a list of linearly independent vectors.

Question b#

Apply the Gram-Schmidt procedure to \(\alpha\) and show that the procedure yields normalized versions of the Legendre polynomials.
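A sketch of the procedure in SymPy, using symbolic integration for the \(L^2\) inner product on \([-1,1]\) (the helper name `inner` is our own choice, and this is one possible implementation, not the only one):

```python
from sympy import symbols, integrate, sqrt, simplify

x = symbols('x', real=True)

def inner(p, q):
    # the L^2 inner product on [-1, 1]
    return integrate(p * q, (x, -1, 1))

alpha = [1, x, x**2, x**3]
basis = []
for v in alpha:
    w = v - sum(inner(v, u) * u for u in basis)   # subtract the projections onto the vectors found so far
    basis.append(simplify(w / sqrt(inner(w, w))))
basis
```

The resulting polynomials can be compared, up to the normalization constants, with the Legendre polynomials from the textbook.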


Exercises – Short Day#

1: Matrix Multiplications. By Hand.#

Define

\[\begin{equation*} A = \left[\begin{matrix}1 & 2 & 3 & 4\\5 & 6 & 7 & 8\\4 & 4 & 4 & 4\end{matrix}\right], \quad \pmb{x} = \left[\begin{matrix} x_1\\ x_2\\ x_3\\ x_4\end{matrix}\right] = \left[\begin{matrix}1\\2\\-1\\1\end{matrix}\right]. \end{equation*}\]

Let \(\pmb{a}_1, \pmb{a}_2,\pmb{a}_3,\pmb{a}_4\) denote the columns of \(A\). Let \(\pmb{b}_1, \pmb{b}_2,\pmb{b}_3\) denote the rows of \(A\). We will now calculate \(A\pmb{x}\) in two different ways.

Question a#

Method 1: As a linear combination of the columns. Compute the linear combination

\[\begin{equation*} x_1 \pmb{a}_1 + x_2 \pmb{a}_2 + x_3 \pmb{a}_3 + x_4 \pmb{a}_4. \end{equation*}\]

Question b#

Method 2: As a “dot product” of the rows of \(A\) with \(\pmb{x}\). Compute

\[\begin{equation*} \left[\begin{matrix} \pmb{b}_1 \pmb{x} \\ \pmb{b}_2 \pmb{x} \\ \pmb{b}_3 \pmb{x} \end{matrix}\right]. \end{equation*}\]

Note

Since \(\pmb{b}_k\) is a row vector, \((\pmb{b}_k)^T\) is a column vector. The product \(\pmb{b}_k \pmb{x}\) hence corresponds to the dot product of \(\pmb{x}\) and \((\pmb{b}_k)^T\).

Question c#

Compute \(A\pmb{x}\) using SymPy and compare with your calculations in the previous questions.
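One way to set up the SymPy computation, with the matrix and vector typed in directly:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3, 4],
            [5, 6, 7, 8],
            [4, 4, 4, 4]])
x = Matrix([1, 2, -1, 1])
A * x
```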

2: A Subspace in \(\mathbb{C}^4\) and its Orthogonal Complement#

In \(\mathbb{C}^4\,\) the following vectors are given

\[\begin{equation*} \pmb{v}_1=(1,1,1,1),\,\pmb{v}_2=(3 i ,i,i,3 i),\,\pmb{v}_3=(2,0,-2,4)\,\,\,\,\mathrm{and}\,\,\,\,\pmb{v}_4=(4-3i,2-i,-i,6-3i). \end{equation*}\]

A subspace \(Y\) in \(\mathbb{C}^4\) is determined by \(Y=\mathrm{span}\lbrace\pmb{v}_1,\pmb{v}_2,\pmb{v}_3,\pmb{v}_4\rbrace\).

Question a#

v1 = Matrix([1,1,1,1])
v2 = Matrix([3*I,I,I,3*I])
v3 = Matrix([2,0,-2,4])
v4 = Matrix([4-3*I,2-I,-I,6-3*I])

Run the command GramSchmidt([v1,v2,v3,v4], orthonormal=True) in Python. What does Python tell you?

# GramSchmidt([v1, v2, v3, v4], orthonormal = True)   

Question b#

Now show that \((\pmb{v}_1,\pmb{v}_2,\pmb{v}_3)\,\) is a basis for \(Y\), and determine the coordinate vector of \(\pmb{v}_4\,\) with respect to this basis.

Question c#

State an orthonormal basis for \(Y\).

Question d#

Find the coordinate vector of \(\pmb{v}_4 \in Y\) with respect to the orthonormal basis for \(Y\).

Question e#

Determine the orthogonal complement \(Y^\perp\) in \(\mathbb{C}^4\) to \(Y\).

Question f#

Choose a vector \(\pmb{y}\) in \(Y^\perp\) and choose a vector \(\pmb{x}\) in \(Y\). Compute \(\Vert \pmb{x} \Vert\), \(\Vert \pmb{y} \Vert\) and \(\Vert \pmb{x} + \pmb{y} \Vert\). Check that \(\Vert \pmb{x} \Vert^2 +\Vert \pmb{y} \Vert^2 = \Vert \pmb{x} + \pmb{y} \Vert^2\).

3: Orthogonal Projection on a Plane#

Let a matrix \(U = [\pmb{u}_1, \pmb{u}_2]\) be given by:

\[\begin{equation*} U = \left[\begin{matrix}\frac{\sqrt{3}}{3} & \frac{\sqrt{2}}{2}\\ \frac{\sqrt{3}}{3} & 0\\- \frac{\sqrt{3}}{3} & \frac{\sqrt{2}}{2}\end{matrix}\right]. \end{equation*}\]

Question a#

Show that \(\pmb{u}_1, \pmb{u}_2\) is an orthonormal basis for \(Y = \mathrm{span}\{\pmb{u}_1, \pmb{u}_2\}\).

Question b#

Let \(P = U U^* \in \mathbb{R}^{3 \times 3}\). This will give us a projection matrix that describes the orthogonal projection \(\pmb{x} \mapsto P \pmb{x}\), \(\mathbb{R}^3 \to \mathbb{R}^3\) onto the plane \(Y = \mathrm{span}\{\pmb{u}_1, \pmb{u}_2\}\). Verify that \(P^2 = P\), \(P \pmb{u}_1 = \pmb{u}_1\), and \(P \pmb{u}_2 = \pmb{u}_2\).
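The verification can be carried out symbolically; a minimal sketch:

```python
from sympy import Matrix, sqrt, simplify

U = Matrix([[ sqrt(3)/3, sqrt(2)/2],
            [ sqrt(3)/3, 0],
            [-sqrt(3)/3, sqrt(2)/2]])
P = U * U.T                  # U is real, so U* = U^T

simplify(P*P - P), simplify(P*U - U)   # both zero matrices if P projects onto Y
```

Note that \(P \pmb{u}_1 = \pmb{u}_1\) and \(P \pmb{u}_2 = \pmb{u}_2\) are checked in one go via \(PU = U\).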

Question c#

Choose a vector \(\pmb{x} \in \mathbb{R}^3\) that does not belong to \(Y\), and find the projection \(\mathrm{proj}_Y(\pmb{x})\) of \(\pmb{x}\) down onto the plane \(Y\). Illustrate \(\pmb{x}\), \(Y\), and \(\mathrm{proj}_Y(\pmb{x})\) in a plot.

Question d#

Show that \(\pmb{x} - \mathrm{proj}_Y(\pmb{x})\) belongs to \(Y^\perp\).

4: Unitary Matrices#

Let a matrix \(F\) be given by:

n = 4
F = 1/sqrt(n) * Matrix(n, n, lambda k,j: exp(2*pi*I*k*j/n))
F
\[\begin{split}\displaystyle \left[\begin{matrix}\frac{1}{2} & \frac{1}{2} & \frac{1}{2} & \frac{1}{2}\\\frac{1}{2} & \frac{i}{2} & - \frac{1}{2} & - \frac{i}{2}\\\frac{1}{2} & - \frac{1}{2} & \frac{1}{2} & - \frac{1}{2}\\\frac{1}{2} & - \frac{i}{2} & - \frac{1}{2} & \frac{i}{2}\end{matrix}\right]\end{split}\]

Determine whether the following propositions are true or false:

  1. \(F\) is unitary

  2. \(F\) is invertible

  3. \(F\) is orthogonal

  4. \(F\) is symmetric

  5. \(F\) is Hermitian

  6. The columns of \(F\) constitute an orthonormal basis for \(\mathbb{C}^4\)

  7. The columns of \(F\) constitute an orthonormal basis for \(\mathbb{R}^4\)

  8. \(-F = F^{-1}\)
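Several of the propositions can be checked directly in SymPy before you reason them through; a minimal sketch:

```python
from sympy import Matrix, exp, pi, I, sqrt, simplify

n = 4
F = 1/sqrt(n) * Matrix(n, n, lambda k, j: exp(2*pi*I*k*j/n))

simplify(F.H * F), F == F.T   # compare the first result with the identity matrix
```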