Week 3: Exercises#

The exercises are intended to be done by hand unless otherwise stated (such as when you are asked to plot a graph or run a script).

Exercises – Long Day#

from sympy import *
from dtumathtools import *
init_printing()

def inner(x1, x2):
    '''
    Computes the standard inner product of two vectors.
    '''
    
    return x1.dot(x2, conjugate_convention = 'right')

1: Orthonormal Basis (Coordinates)#

Question a#

Do the vectors

\[\begin{equation*} \pmb{u}_1=\left( \frac{1}{3},\frac{2}{3},\frac{2}{3}\right), \quad \pmb{u}_2=\left( \frac{2}{3}, \frac{1}{3}, -\frac{2}{3}\right),\quad \pmb{u}_3=\left( \frac{2}{3}, -\frac{2}{3}, \frac{1}{3}\right) \end{equation*}\]

constitute an orthonormal basis in \(\mathbb{R}^3\)?
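By hand you check that each vector has norm 1 and that all pairwise inner products are 0. In SymPy this can be sketched as a single matrix check, using that the columns of a matrix \(U\) are orthonormal exactly when \(U^T U = I\):

```python
from sympy import Matrix, eye

# Candidate basis vectors from the exercise
u1 = Matrix([1, 2, 2]) / 3
u2 = Matrix([2, 1, -2]) / 3
u3 = Matrix([2, -2, 1]) / 3

# Stack as columns; the vectors are orthonormal exactly when U^T U = I
U = Matrix.hstack(u1, u2, u3)
U.T * U == eye(3)  # True: the vectors form an orthonormal basis of R^3
```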

Question b#

Consider the vector \(\pmb{x} = [1,2,3]^T\). Calculate the inner products \(\langle \pmb{x}, \pmb{u}_k \rangle\) for \(k=1,2,3\).

Question c#

Let’s denote the basis by \(\beta = \pmb{u}_1, \pmb{u}_2, \pmb{u}_3\). State the coordinate vector \({}_{\beta} \pmb{x}\) of \(\pmb{x}\) with respect to \(\beta\). Calculate the norm of both \(\pmb{x}\) and the coordinate vector \({}_{\beta} \pmb{x}\).

u1 = Matrix([1,2,2])/3
u2 = Matrix([2,1,-2])/3
u3 = Matrix([2,-2,1])/3
x = Matrix([1,2,3])
inner(x, u1), inner(x, u2), inner(x, u3)
\[\displaystyle \left( \frac{11}{3}, \ - \frac{2}{3}, \ \frac{1}{3}\right)\]
# Calculated in the previous question. Check:
U = Matrix.hstack(u1, u2, u3)    # Change-of-basis matrix from the orthonormal beta basis to the standard basis
e_x = x
beta_x = U.T * e_x # U is unitary, so U^T = U^(-1) is the change-of-basis matrix from the standard basis to the orthonormal beta basis
beta_x
\[\begin{split}\displaystyle \left[\begin{matrix}\frac{11}{3}\\- \frac{2}{3}\\\frac{1}{3}\end{matrix}\right]\end{split}\]
# Another check
inner(x, u1) * u1 + inner(x, u2) * u2 + inner(x, u3) * u3
\[\begin{split}\displaystyle \left[\begin{matrix}1\\2\\3\end{matrix}\right]\end{split}\]
x.norm(), beta_x.norm()
\[\displaystyle \left( \sqrt{14}, \ \sqrt{14}\right)\]

Since \(U^T\) is unitary, it preserves the norm of every vector, so \(\Vert {}_{\beta}\pmb{x} \Vert = \Vert \pmb{x} \Vert\).

Question d#

Create the \(3\times 3\) matrix \(U = [\pmb{u}_1 \vert \pmb{u}_2 \vert \pmb{u}_3]\) with \(\pmb{u}_1, \pmb{u}_2, \pmb{u}_3\) as its three columns. Calculate \(U^T \pmb{x}\) and compare the result with that from the previous question.

2: Orthonormal Basis (Construction)#

Create an orthonormal basis for \(\mathbb{R}^3\), within which

\[\begin{equation*} \left(\frac {\sqrt 2}2,\frac {\sqrt 2}2,0\right) \end{equation*}\]

is the first basis vector.
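One possible construction (a sketch; the extra vectors \(\pmb{e}_1\) and \(\pmb{e}_3\) are just one convenient choice) is to extend the given vector to a linearly independent list and orthonormalize it with Gram-Schmidt, which leaves the first vector unchanged since it already has norm 1:

```python
from sympy import Matrix, sqrt, GramSchmidt, eye

# The prescribed first basis vector (already of norm 1)
w1 = Matrix([sqrt(2)/2, sqrt(2)/2, 0])

# Extend with standard basis vectors so the list stays linearly independent,
# then orthonormalize with Gram-Schmidt
basis = GramSchmidt([w1, Matrix([1, 0, 0]), Matrix([0, 0, 1])], orthonormal=True)
basis
```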

3: Orthonormalization#

Find the solution set to the homogeneous equation

\[\begin{equation*} x_1+x_2+x_3=0 \end{equation*}\]

and justify that it constitutes a subspace in \(\mathbb{R}^3\). Find an orthonormal basis for this solution space.

x1, x2, x3 = symbols('x1 x2 x3')
eqn = Eq(x1 + x2 + x3, 0)
A, b = linear_eq_to_matrix(eqn, [x1, x2, x3])
# linsolve((A, b), x1, x2, x3)  # alternatively
A.nullspace()
\[\begin{split}\displaystyle \left[ \left[\begin{matrix}-1\\1\\0\end{matrix}\right], \ \left[\begin{matrix}-1\\0\\1\end{matrix}\right]\right]\end{split}\]
v1, v2 = A.nullspace()
u1 = v1/v1.norm()
w = v2 - inner(v2, u1) * u1
u2 = w/w.norm()
u1, u2 
\[\begin{split}\displaystyle \left( \left[\begin{matrix}- \frac{\sqrt{2}}{2}\\\frac{\sqrt{2}}{2}\\0\end{matrix}\right], \ \left[\begin{matrix}- \frac{\sqrt{6}}{6}\\- \frac{\sqrt{6}}{6}\\\frac{\sqrt{6}}{3}\end{matrix}\right]\right)\end{split}\]
# With Sympy
GramSchmidt([v1, v2], orthonormal = True)
\[\begin{split}\displaystyle \left[ \left[\begin{matrix}- \frac{\sqrt{2}}{2}\\\frac{\sqrt{2}}{2}\\0\end{matrix}\right], \ \left[\begin{matrix}- \frac{\sqrt{6}}{6}\\- \frac{\sqrt{6}}{6}\\\frac{\sqrt{6}}{3}\end{matrix}\right]\right]\end{split}\]

4: Orthogonal Projections#

Let \(\boldsymbol{y}=(2,1,2) \in \mathbb{R}^3\) be given. Then the projection of \(\boldsymbol{x} \in \mathbb{R}^3\) on the line \(Y = \operatorname{span}\{\boldsymbol{y}\}\) is given by:

\[\begin{equation*} \operatorname{Proj}_Y(\boldsymbol{x}) = \frac{\left<\boldsymbol{x},\boldsymbol{y} \right>}{\left<\boldsymbol{y},\boldsymbol{y} \right>}\boldsymbol{y} = \left<\boldsymbol{x},\boldsymbol{u}\right>\boldsymbol{u}, \end{equation*}\]

where \(\boldsymbol{u} = \frac{\boldsymbol{y}}{||\boldsymbol{y}||}\).

Question a#

Let \(\boldsymbol{x} = (1,2,3) \in \mathbb{R}^3\). Calculate \(\operatorname{Proj}_Y(\boldsymbol{x})\), \(\operatorname{Proj}_Y(\boldsymbol{y})\) and \(\operatorname{Proj}_Y(\boldsymbol{u})\).

Question b#

As usual, we consider all vectors as column vectors. Now, find the \(3 \times 3\) matrix \(P = \pmb{u} \pmb{u}^T\) and compute both \(P\boldsymbol{x}\) and \(P\boldsymbol{y}\).

y = Matrix([2,1,2])
x = Matrix([1,2,3])
u = y.normalized()
u, inner(x, u) * u, inner(y, u) * u, inner(u, u) * u
\[\begin{split}\displaystyle \left( \left[\begin{matrix}\frac{2}{3}\\\frac{1}{3}\\\frac{2}{3}\end{matrix}\right], \ \left[\begin{matrix}\frac{20}{9}\\\frac{10}{9}\\\frac{20}{9}\end{matrix}\right], \ \left[\begin{matrix}2\\1\\2\end{matrix}\right], \ \left[\begin{matrix}\frac{2}{3}\\\frac{1}{3}\\\frac{2}{3}\end{matrix}\right]\right)\end{split}\]
# Note that y (and u) already belong to the line, so y = proj_Y(y)
y == inner(y, u) * u
True
P = u * u.T
P, P * x, P * y
\[\begin{split}\displaystyle \left( \left[\begin{matrix}\frac{4}{9} & \frac{2}{9} & \frac{4}{9}\\\frac{2}{9} & \frac{1}{9} & \frac{2}{9}\\\frac{4}{9} & \frac{2}{9} & \frac{4}{9}\end{matrix}\right], \ \left[\begin{matrix}\frac{20}{9}\\\frac{10}{9}\\\frac{20}{9}\end{matrix}\right], \ \left[\begin{matrix}2\\1\\2\end{matrix}\right]\right)\end{split}\]
# P is a projection matrix, so P^2 = P
P * P == P
True

5: An Orthonormal Basis for a Subspace of \(\mathbb{C}^4\)#

Question a#

Find an orthonormal basis \(\pmb{u}_1, \pmb{u}_2\) for the subspace \(Y = \operatorname{span}\{\pmb{v}_1, \pmb{v}_2\}\) that is spanned by the vectors:

v1 = Matrix([I, 1, 1, 0])
v2 = Matrix([0, I, I, sqrt(2)])

Question b#

Let

\[\begin{equation*} \pmb{x} = \left[\begin{matrix}3 i\\3 - 2 i\\3 - 2 i\\-2\sqrt{2}\end{matrix}\right]. \end{equation*}\]

Calculate \(\langle \pmb{x}, \pmb{u}_1 \rangle\), \(\langle \pmb{x}, \pmb{u}_2 \rangle\) as well as

\[\begin{equation*} \langle \pmb{x}, \pmb{u}_1 \rangle \pmb{u}_1 + \langle \pmb{x}, \pmb{u}_2 \rangle \pmb{u}_2. \end{equation*}\]

What does this linear combination give? Does \(\pmb{x}\) belong to the subspace \(Y\)?

x = 3 * v1 - 2 * v2    # This is how x is constructed in the exercise
x
\[\begin{split}\displaystyle \left[\begin{matrix}3 i\\3 - 2 i\\3 - 2 i\\- 2 \sqrt{2}\end{matrix}\right]\end{split}\]
u1 = v1.normalized()
w = v2 - inner(v2, u1) * u1
u2 = w.normalized()
w.norm(), u1, u2 
\[\begin{split}\displaystyle \left( \frac{2 \sqrt{6}}{3}, \ \left[\begin{matrix}\frac{\sqrt{3} i}{3}\\\frac{\sqrt{3}}{3}\\\frac{\sqrt{3}}{3}\\0\end{matrix}\right], \ \left[\begin{matrix}\frac{\sqrt{6}}{6}\\\frac{\sqrt{6} i}{12}\\\frac{\sqrt{6} i}{12}\\\frac{\sqrt{3}}{2}\end{matrix}\right]\right)\end{split}\]
inner(x, u1), inner(x, u2)
(output: the two inner products in unsimplified form)
expand(inner(x, u1)), expand(inner(x, u2))
\[\displaystyle \left( 3 \sqrt{3} - \frac{4 \sqrt{3} i}{3}, \ - \frac{4 \sqrt{6}}{3}\right)\]
expand(inner(x, u1) * u1 + inner(x, u2) * u2)
\[\begin{split}\displaystyle \left[\begin{matrix}3 i\\3 - 2 i\\3 - 2 i\\- 2 \sqrt{2}\end{matrix}\right]\end{split}\]

\(\pmb{x}\) belongs to \(Y\) since \(\pmb{x}\) can be written as a linear combination of \(\pmb{u}_1\) and \(\pmb{u}_2\).

6: A Python Algorithm#

Question a#

Without running the following code, explain what it does and explain what you expect the output to be. Then run the code in a Jupyter Notebook.

from sympy import *
from dtumathtools import *
init_printing()

x1, x2, x3 = symbols('x1:4', real=True)
eqns = [Eq(1*x1 + 2*x2 + 3*x3, 1), Eq(4*x1 + 5*x2 + 6*x3, 0), Eq(5*x1 + 7*x2 + 8*x3, -1)]
eqns
A, b = linear_eq_to_matrix(eqns,x1,x2,x3)
T = A.row_join(b)  # Augmented matrix

A, b, T

Question b#

We will continue the Jupyter Notebook by adding the following code (as before, do not run it yet). Go through the code by hand (think through the for loops). What \(T\) matrix will be the result? Copy-paste the code into an AI tool, such as https://copilot.microsoft.com/ (log in with your DTU account), and ask it to explain the code line by line. Verify the result by running the code in a Jupyter Notebook. Remember that T.shape[0] gives the number of rows in the matrix \(T\).

for col in range(T.shape[0]):
    for row in range(col + 1, T.shape[0]):
        T[row, :] = T[row, :] - T[row, col] / T[col, col] * T[col, :]
    T[col, :] = T[col, :] / T[col, col]

T

Question c#

Write Python code that ensures zeros above the diagonal in the matrix \(T\) so that \(T\) ends up in reduced row-echelon form.

Note

Do not take into account any divisions by zero (for general \(T\) matrices). We will here assume that the computations are possible.
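A sketch of one possible answer, continuing from the \(T\) produced in question b. The backward pass mirrors the forward pass but runs upwards; since the forward pass already scaled each pivot to 1, no extra scaling is needed:

```python
from sympy import Matrix, symbols, Eq, linear_eq_to_matrix

# Rebuild the augmented matrix from question a and run the forward pass
x1, x2, x3 = symbols('x1:4', real=True)
eqns = [Eq(1*x1 + 2*x2 + 3*x3, 1), Eq(4*x1 + 5*x2 + 6*x3, 0), Eq(5*x1 + 7*x2 + 8*x3, -1)]
A, b = linear_eq_to_matrix(eqns, x1, x2, x3)
T = A.row_join(b)

for col in range(T.shape[0]):
    for row in range(col + 1, T.shape[0]):
        T[row, :] = T[row, :] - T[row, col] / T[col, col] * T[col, :]
    T[col, :] = T[col, :] / T[col, col]

# Backward pass: create zeros ABOVE each diagonal entry as well
for col in range(T.shape[0] - 1, 0, -1):
    for row in range(col - 1, -1, -1):
        T[row, :] = T[row, :] - T[row, col] * T[col, :]

T  # now in reduced row-echelon form
```

The result can be compared with SymPy's built-in `T.rref()`.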

Question d#

What kind of algorithm have we implemented? Test the same algorithm on:

x1, x2, x3, x4 = symbols('x1:5', real=True)
eqns = [Eq(1*x1 + 2*x2 + 3*x3, 1), Eq(4*x1 + 5*x2 + 6*x3, 0), Eq(4*x1 + 5*x2 + 6*x3, 0), Eq(5*x1 + 7*x2 + 8*x3, -1)]
A, b = linear_eq_to_matrix(eqns,x1,x2,x3,x4)
T = A.row_join(b)  # Augmented matrix
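We have implemented Gauss-Jordan elimination (without row exchanges). If you trace the loop from question b on this new system, the repeated equation turns the third row into a row of zeros, so the pivot \(T[2,2]\) becomes zero and the naive loop breaks down (SymPy produces `zoo`/`nan` entries instead of numbers). A sketch of checking this system with SymPy's built-in elimination, which does perform row exchanges:

```python
from sympy import Matrix, symbols, Eq, linear_eq_to_matrix, Rational

x1, x2, x3, x4 = symbols('x1:5', real=True)
eqns = [Eq(1*x1 + 2*x2 + 3*x3, 1), Eq(4*x1 + 5*x2 + 6*x3, 0),
        Eq(4*x1 + 5*x2 + 6*x3, 0), Eq(5*x1 + 7*x2 + 8*x3, -1)]
A, b = linear_eq_to_matrix(eqns, x1, x2, x3, x4)
T = A.row_join(b)

# The repeated equation gives rank(A) = 3 < 4, so the naive loop from
# question b hits a zero pivot. rref() handles the required row exchange:
A.rank(), T.rref()[0]
```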

7: Orthogonal Polynomials#

This is an exercise from the textbook. You can find help there.

Consider the list \(\alpha=1,x,x^{2},x^{3}\) of polynomials in \(P_{3}([-1,1])\) equipped with the \(L^{2}\) inner product.

Question a#

Argue that \(\alpha\) is a list of linearly independent vectors.

The students are probably not used to working with linearly independent functions, and they likely do not know that, for example, the monomial basis is linearly independent. This follows from the fact that a nonzero polynomial of degree at most 3 has at most 3 roots (a consequence of the fundamental theorem of algebra, taught in Math1a, although it was not proven there). The task can be solved with the following steps:

x = symbols('x', real=True)
c1,c2,c3,c4 = symbols('c1:5', real=True)
v1 = 1
v2 = x
v3 = x**2
v4 = x**3

eq1 = Eq(c1*v1 + c2*v2.subs(x,0) + c3*v3.subs(x,0) + c4*v4.subs(x,0),0)
eq2 = Eq(c1*v1 + c2*v2.subs(x,1) + c3*v3.subs(x,1) + c4*v4.subs(x,1),0)
eq3 = Eq(c1*v1 + c2*v2.subs(x,-1) + c3*v3.subs(x,-1) + c4*v4.subs(x,-1),0)
eq4 = Eq(c1*v1 + c2*v2.subs(x,S(1)/2) + c3*v3.subs(x,S(1)/2) + c4*v4.subs(x,S(1)/2),0)
eq1, eq2, eq3, eq4
\[\displaystyle \left( c_{1} = 0, \ c_{1} + c_{2} + c_{3} + c_{4} = 0, \ c_{1} - c_{2} + c_{3} - c_{4} = 0, \ c_{1} + \frac{c_{2}}{2} + \frac{c_{3}}{4} + \frac{c_{4}}{8} = 0\right)\]
A,b = linear_eq_to_matrix([eq1,eq2,eq3,eq4],c1,c2,c3,c4)
linsolve((A,b),c1,c2,c3,c4)
\[\displaystyle \left\{\left( 0, \ 0, \ 0, \ 0\right)\right\}\]

Question b#

Apply the Gram-Schmidt procedure on \(\alpha\) and show that the procedure yields a normalized version of the Legendre polynomials.

v1_norm = sqrt(integrate(v1**2, (x,-1,1)))
u1 = v1/v1_norm
u1
w2 = v2 - integrate(v2*u1, (x,-1,1)) * u1
u2 = w2/sqrt(integrate(w2**2, (x,-1,1)))
w3 = v3 - integrate(v3*u1, (x,-1,1)) * u1 - integrate(v3*u2, (x,-1,1)) * u2
u3 = w3/sqrt(integrate(w3**2, (x,-1,1)))
w4 = v4 - integrate(v4*u1, (x,-1,1)) * u1 - integrate(v4*u2, (x,-1,1)) * u2 - integrate(v4*u3, (x,-1,1)) * u3
u4 = w4/sqrt(integrate(w4**2, (x,-1,1)))
u1, u2, u3, u4
\[\displaystyle \left( \frac{\sqrt{2}}{2}, \ \frac{\sqrt{6} x}{2}, \ \frac{\sqrt{10} \left(3 x^{2} - 1\right)}{4}, \ \frac{\sqrt{14} \left(5 x^{3} - 3 x\right)}{4}\right)\]
# This is a normalized version of the Legendre polynomials:
legendre(0,x), legendre(1,x), legendre(2,x).factor(), legendre(3,x).factor()
\[\displaystyle \left( 1, \ x, \ \frac{3 x^{2} - 1}{2}, \ \frac{x \left(5 x^{2} - 3\right)}{2}\right)\]

Exercises – Short Day#

1: Matrix Multiplications#

Define

\[\begin{equation*} A = \left[\begin{matrix}1 & 2 & 3 & 4\\5 & 6 & 7 & 8\\4 & 4 & 4 & 4\end{matrix}\right], \quad \pmb{x} = \left[\begin{matrix} x_1\\ x_2\\ x_3\\ x_4\end{matrix}\right] = \left[\begin{matrix}1\\2\\-1\\1\end{matrix}\right]. \end{equation*}\]

Let \(\pmb{a}_1, \pmb{a}_2,\pmb{a}_3,\pmb{a}_4\) denote the columns in \(A\). Let \(\pmb{b}_1, \pmb{b}_2,\pmb{b}_3\) denote the rows in \(A\). We now calculate \(A\pmb{x}\) in two different ways.

Question a#

Method 1: As a linear combination of the columns. Calculate the linear combination

\[\begin{equation*} x_1 \pmb{a}_1 + x_2 \pmb{a}_2 + x_3 \pmb{a}_3 + x_4 \pmb{a}_4. \end{equation*}\]

Question b#

Method 2: As “dot product” of the rows in \(A\) with \(x\). Calculate

\[\begin{equation*} \left[\begin{matrix} \pmb{b}_1 \pmb{x} \\ \pmb{b}_2 \pmb{x} \\ \pmb{b}_3 \pmb{x} \end{matrix}\right]. \end{equation*}\]

Note

Since \(\pmb{b}_k\) is a row vector, \((\pmb{b}_k)^T\) is a column vector. Hence, the product \(\pmb{b}_k \pmb{x}\) corresponds to the dot product of \(\pmb{x}\) and \((\pmb{b}_k)^T\).

Question c#

Calculate \(A\pmb{x}\) in Sympy and compare with your calculations from the previous questions.

A = Matrix([[1,2,3,4],[5,6,7,8],[4,4,4,4]])
a1, a2, a3, a4 = A[:,0], A[:,1], A[:,2], A[:,3]
b1, b2, b3 = A[0,:], A[1,:], A[2,:]
x = Matrix([1,2,-1,1])

A*x, x[0]*a1 + x[1]*a2 + x[2]*a3 + x[3]*a4, b1.dot(x), b2.dot(x), b3.dot(x)
\[\begin{split}\displaystyle \left( \left[\begin{matrix}6\\18\\12\end{matrix}\right], \ \left[\begin{matrix}6\\18\\12\end{matrix}\right], \ 6, \ 18, \ 12\right)\end{split}\]

2: A Subspace in \(\mathbb{C}^4\) and its Orthogonal Complement#

Let the following vectors be given in \(\mathbb{C}^4\):

\[\begin{equation*} \pmb{v}_1=(1,1,1,1),\,\pmb{v}_2=(3 i ,i,i,3 i),\,\pmb{v}_3=(2,0,-2,4)\,\,\,\,\operatorname{and}\,\,\,\,\pmb{v}_4=(4-3i,2-i,-i,6-3i). \end{equation*}\]

A subspace \(Y\) in \(\mathbb{C}^4\) is determined by \(Y=\operatorname{span}\lbrace\pmb{v}_1,\pmb{v}_2,\pmb{v}_3,\pmb{v}_4\rbrace\).

Question a#

v1 = Matrix([1,1,1,1])
v2 = Matrix([3*I,I,I,3*I])
v3 = Matrix([2,0,-2,4])
v4 = Matrix([4-3*I,2-I,-I,6-3*I])

Run the command GramSchmidt([v1,v2,v3,v4], orthonormal=True) in Python. What does Python tell you?

# GramSchmidt([v1, v2, v3, v4], orthonormal = True)
# Raises a ValueError because the four vectors are linearly dependent

Question b#

Now show that \((\pmb{v}_1,\pmb{v}_2,\pmb{v}_3)\,\) constitutes a basis for \(Y\), and find the coordinate vector of \(\pmb{v}_4\,\) with respect to this basis.

A = Matrix.hstack(v1, v2, v3)
A.rank(), linsolve((A, v4))
\[\displaystyle \left( 3, \ \left\{\left( 2, \ -1, \ 1\right)\right\}\right)\]

Question c#

Provide an orthonormal basis for \(Y\).

u1, u2, u3 = GramSchmidt([v1, v2, v3], orthonormal = True)
u1, u2, u3
\[\begin{split}\displaystyle \left( \left[\begin{matrix}\frac{1}{2}\\\frac{1}{2}\\\frac{1}{2}\\\frac{1}{2}\end{matrix}\right], \ \left[\begin{matrix}\frac{i}{2}\\- \frac{i}{2}\\- \frac{i}{2}\\\frac{i}{2}\end{matrix}\right], \ \left[\begin{matrix}- \frac{1}{2}\\\frac{1}{2}\\- \frac{1}{2}\\\frac{1}{2}\end{matrix}\right]\right)\end{split}\]

Question d#

Determine the coordinate vector of \(\pmb{v}_4 \in Y\) with respect to the orthonormal basis for \(Y\).

Umat = Matrix.hstack(u1, u2, u3)
simplify(Umat.adjoint() * v4 )
\[\begin{split}\displaystyle \left[\begin{matrix}6 - 4 i\\-2 - 4 i\\2\end{matrix}\right]\end{split}\]

Question e#

Determine the orthogonal complement \(Y^\perp\) in \(\mathbb{C}^4\) to \(Y\).
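One way to compute \(Y^\perp\) (a sketch): a vector \(\pmb{x}\) lies in \(Y^\perp\) exactly when \(\langle \pmb{x}, \pmb{v}_k \rangle = 0\) for \(k = 1, 2, 3\), i.e. when \(\pmb{x}\) is in the nullspace of the matrix whose rows are the conjugate-transposed spanning vectors:

```python
from sympy import Matrix, I

v1 = Matrix([1, 1, 1, 1])
v2 = Matrix([3*I, I, I, 3*I])
v3 = Matrix([2, 0, -2, 4])

# x is in Y^perp iff <x, v_k> = 0 for k = 1, 2, 3, i.e. iff x lies in the
# nullspace of the matrix with rows v_k^* (the adjoint of [v1 | v2 | v3])
M = Matrix.hstack(v1, v2, v3).adjoint()
M.nullspace()
```

The nullspace is one-dimensional, consistent with \(\dim Y = 3\) in \(\mathbb{C}^4\).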

Question f#

Choose a vector \(\pmb{y}\) in \(Y^\perp\) and choose a vector \(\pmb{x}\) in \(Y\). Calculate \(\Vert \pmb{x} \Vert\), \(\Vert \pmb{y} \Vert\) and \(\Vert \pmb{x} + \pmb{y} \Vert\). Check that \(\Vert \pmb{x} \Vert^2 +\Vert \pmb{y} \Vert^2 = \Vert \pmb{x} + \pmb{y} \Vert^2\).

*u_vectors, u_perp = GramSchmidt([v1, v2, v3, Matrix([1,0,0,0])], orthonormal = True)
u_perp
\[\begin{split}\displaystyle \left[\begin{matrix}\frac{1}{2}\\\frac{1}{2}\\- \frac{1}{2}\\- \frac{1}{2}\end{matrix}\right]\end{split}\]
c = symbols('c', real=True)
y = c * u_perp    # c=1 or c=2 are ok choices
x = v4    # Could do this more generally
y.norm(), x.norm(), (x+y).norm()
\[\displaystyle \left( \left|{c}\right|, \ 2 \sqrt{19}, \ \sqrt{c^{2} + 76}\right)\]
(x+y).norm()**2 == x.norm()**2 + y.norm()**2
True

3: Orthogonal Projection on a Plane#

Let the matrix \(U = [\pmb{u}_1, \pmb{u}_2]\) be given by:

\[\begin{equation*} U = \left[\begin{matrix}\frac{\sqrt{3}}{3} & \frac{\sqrt{2}}{2}\\ \frac{\sqrt{3}}{3} & 0\\- \frac{\sqrt{3}}{3} & \frac{\sqrt{2}}{2}\end{matrix}\right]. \end{equation*}\]

Question a#

Show that \(\pmb{u}_1, \pmb{u}_2\) is an orthonormal basis for \(Y = \operatorname{span}\{\pmb{u}_1, \pmb{u}_2\}\).
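The by-hand check is that both columns have norm 1 and that their inner product is 0; equivalently (a quick sketch in SymPy), \(U^T U = I_2\):

```python
from sympy import Matrix, sqrt, eye

# The matrix U from the exercise, with u1 and u2 as columns
U = Matrix([[sqrt(3)/3, sqrt(2)/2],
            [sqrt(3)/3, 0],
            [-sqrt(3)/3, sqrt(2)/2]])

# The columns are orthonormal exactly when U^T U is the 2x2 identity
U.T * U == eye(2)  # True
```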

Question b#

Let \(P = U U^* \in \mathbb{R}^{3 \times 3}\). As explained in this section of the textbook, this will give us a projection matrix that describes the orthogonal projection \(\pmb{x} \mapsto P \pmb{x}\), \(\mathbb{R}^3 \to \mathbb{R}^3\) on the plane \(Y = \operatorname{span}\{\pmb{u}_1, \pmb{u}_2\}\). Verify that \(P^2 = P\), \(P \pmb{u}_1 = \pmb{u}_1\), and \(P \pmb{u}_2 = \pmb{u}_2\).

Question c#

Choose a vector \(\pmb{x} \in \mathbb{R}^3\) that does not belong to \(Y\), and find the projection \(\operatorname{proj}_Y(\pmb{x})\) of \(\pmb{x}\) onto the plane \(Y\). Illustrate \(\pmb{x}\), \(Y\), and \(\operatorname{proj}_Y(\pmb{x})\) in a plot.

Question d#

Show that \(\pmb{x} - \operatorname{proj}_Y(\pmb{x})\) belongs to \(Y^\perp\).

v1 = Matrix([1,1,-1])
v2 = Matrix([1,0,1])
u1,u2 = GramSchmidt([v1, v2], orthonormal = True)
U = Matrix.hstack(u1, u2)
P = U * U.T
P, P * v1, P * v2
\[\begin{split}\displaystyle \left( \left[\begin{matrix}\frac{5}{6} & \frac{1}{3} & \frac{1}{6}\\\frac{1}{3} & \frac{1}{3} & - \frac{1}{3}\\\frac{1}{6} & - \frac{1}{3} & \frac{5}{6}\end{matrix}\right], \ \left[\begin{matrix}1\\1\\-1\end{matrix}\right], \ \left[\begin{matrix}1\\0\\1\end{matrix}\right]\right)\end{split}\]
x = Matrix([1,2,3])
P * x
\[\begin{split}\displaystyle \left[\begin{matrix}2\\0\\2\end{matrix}\right]\end{split}\]
x - P * x
\[\begin{split}\displaystyle \left[\begin{matrix}-1\\2\\1\end{matrix}\right]\end{split}\]

This is the orthogonal projection of \(\pmb{x}\) onto \(Y^\perp\). A vector belongs to \(Y^\perp\) if and only if its inner products with both \(\pmb{u}_1\) and \(\pmb{u}_2\) are zero.

4: Unitary Matrices#

Let a matrix \(F\) be given by:

n = 4
F = 1/sqrt(n) * Matrix(n, n, lambda k,j: exp(-2*pi*I*k*j/n))
F
\[\begin{split}\displaystyle \left[\begin{matrix}\frac{1}{2} & \frac{1}{2} & \frac{1}{2} & \frac{1}{2}\\\frac{1}{2} & - \frac{i}{2} & - \frac{1}{2} & \frac{i}{2}\\\frac{1}{2} & - \frac{1}{2} & \frac{1}{2} & - \frac{1}{2}\\\frac{1}{2} & \frac{i}{2} & - \frac{1}{2} & - \frac{i}{2}\end{matrix}\right]\end{split}\]

State whether the following propositions are true or false:

  1. \(F\) is unitary

  2. \(F\) is invertible

  3. \(F\) is orthogonal

  4. \(F\) is symmetric

  5. \(F\) is Hermitian

  6. The columns in \(F\) constitute an orthonormal basis for \(\mathbb{C}^4\)

  7. The columns in \(F\) constitute an orthonormal basis for \(\mathbb{R}^4\)

  8. \(-F = F^{-1}\)

display(F.adjoint() * F == eye(n))
F.adjoint() * F 
True
\[\begin{split}\displaystyle \left[\begin{matrix}1 & 0 & 0 & 0\\0 & 1 & 0 & 0\\0 & 0 & 1 & 0\\0 & 0 & 0 & 1\end{matrix}\right]\end{split}\]
display(F.rank() == n) 
F.inv()
True
\[\begin{split}\displaystyle \left[\begin{matrix}\frac{1}{2} & \frac{1}{2} & \frac{1}{2} & \frac{1}{2}\\\frac{1}{2} & \frac{i}{2} & - \frac{1}{2} & - \frac{i}{2}\\\frac{1}{2} & - \frac{1}{2} & \frac{1}{2} & - \frac{1}{2}\\\frac{1}{2} & - \frac{i}{2} & - \frac{1}{2} & \frac{i}{2}\end{matrix}\right]\end{split}\]
F.T * F == eye(n)
False
F.is_symmetric()
True
F.adjoint() == F
False
True, False    # Statements 6 and 7: True (F is unitary), False (the columns are not real)
(True, False)
-F == F.inv()
False

Note: \(F^{-1} = \overline{F}\).
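Since \(F\) is both symmetric and unitary, \(F^{-1} = F^* = \overline{F}\); a quick check of this identity:

```python
from sympy import Matrix, exp, pi, I, sqrt, eye

n = 4
F = 1/sqrt(n) * Matrix(n, n, lambda k, j: exp(-2*pi*I*k*j/n))

# F symmetric and unitary implies F^{-1} = conjugate(F), i.e.
# F times its entrywise conjugate is the identity
F * F.conjugate() == eye(n)  # True
```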