# 15 In-Class Assignment: Diagonalization

## Agenda for today’s class (80 minutes)

%matplotlib inline
import matplotlib.pylab as plt
import numpy as np
import sympy as sym
sym.init_printing()



## 2. Diagonalization

Reminder: The eigenvalues of triangular (upper or lower) and diagonal matrices are easy to read off:

• The eigenvalues of a triangular matrix are its diagonal entries.

• The eigenvalues of a diagonal matrix are likewise its diagonal entries.
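A quick check of this fact with sympy (the triangular matrix here is an arbitrary example):

```python
import sympy as sym

# Upper-triangular example: the eigenvalues should be the diagonal entries 3, 5, 7
T = sym.Matrix([[3, 1, 2],
                [0, 5, 4],
                [0, 0, 7]])
print(sorted(T.eigenvals()))  # [3, 5, 7] -> exactly the diagonal entries
```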

### Diagonalization

Definition: A square matrix $$A$$ is said to be diagonalizable if there exists an invertible matrix $$C$$ such that $$D=C^{-1}AC$$ is a diagonal matrix.

Definition: $$B$$ is said to be similar to $$A$$ if there exists an invertible matrix $$C$$ such that $$B=C^{-1}AC$$.

Given an $$n\times n$$ matrix $$A$$, can we find another $$n \times n$$ invertible matrix $$C$$ such that $$D=C^{-1}AC$$ is diagonal, i.e., such that $$A$$ is diagonalizable?

• Because $$C$$ is invertible, we have $$C^{-1}AC=D \implies CC^{-1}AC = CD \implies AC = CD.$$

• Form $$C$$ with columns given by $$n$$ linearly independent vectors $$(x_1, \dots, x_n)$$. We can then expand $$AC=CD$$ as follows: $$A\begin{bmatrix} \vdots & \vdots & & \vdots \\ { x }_{ 1 } & { x }_{ 2 } & \dots & { x }_{ n } \\ \vdots & \vdots & & \vdots \end{bmatrix}=\begin{bmatrix} \vdots & \vdots & & \vdots \\ { x }_{ 1 } & { x }_{ 2 } & \dots & { x }_{ n } \\ \vdots & \vdots & & \vdots \end{bmatrix}\begin{bmatrix} { \lambda }_{ 1 } & 0 & \dots & 0 \\ 0 & { \lambda }_{ 2 } & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & { \lambda }_{ n } \end{bmatrix}$$

• Then, comparing the corresponding columns of both sides, we have $$Ax_1 = \lambda_1 x_1,\ \dots,\ Ax_n=\lambda_n x_n$$

• Hence $$A$$ has $$n$$ linearly independent eigenvectors.

• $$A$$ is said to be similar to the diagonal matrix $$D$$, and the transformation of $$A$$ into $$D$$ is called a similarity transformation.
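The construction above can be sketched in sympy; the matrix below is an arbitrary diagonalizable example, not one of the exercise matrices:

```python
import sympy as sym

# Arbitrary diagonalizable example (eigenvalues 2 and 5)
A = sym.Matrix([[4, 1],
                [2, 3]])

# diagonalize() returns C (eigenvectors as columns) and D (eigenvalues on the diagonal)
C, D = A.diagonalize()
print(C.inv() * A * C == D)  # True: D = C^{-1} A C is diagonal
```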

### A simple example

Consider the following: $$A = \begin{bmatrix}7& -10\\3& -4\end{bmatrix},\quad C = \begin{bmatrix}2& 5\\1& 3\end{bmatrix}$$

Do this: Find the similar matrix $$D = C^{-1}AC$$ of $$A$$.

#Put your answer to the above question here.

from answercheck import checkanswer



Do this: Find the eigenvalues and eigenvectors of $$A$$. Set variables e1 and vec1 to be the smallest eigenvalue and its associated eigenvector and e2, vec2 to represent the largest.

#Put your answer to the above question here.

from answercheck import checkanswer



Theorem: Similar matrices have the same eigenvalues.

Proof: Assume $$B=C^{-1}AC$$ is similar to $$A$$, and $$\lambda$$ is an eigenvalue of $$A$$ with corresponding eigenvector $$x$$, that is, $$Ax=\lambda x$$. Then we have $$B(C^{-1}x) = C^{-1}AC(C^{-1}x) = C^{-1}Ax = C^{-1}(\lambda x)= \lambda (C^{-1}x).$$ That is, $$C^{-1}x$$ is an eigenvector of $$B$$ with eigenvalue $$\lambda$$.
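A quick numerical sanity check of the theorem (the matrices $$A$$ and $$C$$ below are arbitrary choices for illustration):

```python
import sympy as sym

# Arbitrary matrices for illustration
A = sym.Matrix([[2, 1],
                [0, 3]])
C = sym.Matrix([[1, 1],
                [1, 2]])  # any invertible matrix will do

B = C.inv() * A * C  # B is similar to A
print(A.eigenvals() == B.eigenvals())  # True: same eigenvalues, same multiplicities
```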

### A second example

Do this: Consider $$A = \begin{bmatrix}-4& -6\\3& 5\end{bmatrix}.$$ Find a matrix $$C$$ such that $$C^{-1}AC$$ is diagonal. (Hint: use the function diagonalize in sympy.)

#Put your answer to the above question here.

#Check the output type
assert(type(C)==sym.Matrix)

from answercheck import checkanswer


### The third example

Do this: Consider $$A = \begin{bmatrix}5& -3\\3& -1\end{bmatrix}.$$ Can we find a matrix $$C$$ such that $$C^{-1}AC$$ is diagonal? (Hint: find the eigenvalues and eigenvectors using sympy.)

#Put your answer to the above question here.


### Dimensions of eigenspaces and diagonalization

Definition: The set of all eigenvectors of an $$n\times n$$ matrix corresponding to an eigenvalue $$\lambda$$, together with the zero vector, is a subspace of $$R^n$$. This subspace is called the eigenspace of $$\lambda$$.

• For the third example, the characteristic equation is $$(\lambda-2)^2=0$$.

• The eigenvalue $$\lambda=2$$ has multiplicity 2, but its eigenspace has dimension 1, since we cannot find two linearly independent eigenvectors for $$\lambda =2$$.

The dimension of an eigenspace of a matrix is less than or equal to the multiplicity of the corresponding eigenvalue as a root of the characteristic equation.

A matrix is diagonalizable if and only if the dimension of every eigenspace is equal to the multiplicity of the corresponding eigenvalue as a root of the characteristic equation.
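For instance, a sketch with an arbitrary defective matrix (not one of the exercise matrices) shows the gap between multiplicity and eigenspace dimension:

```python
import sympy as sym

# Defective example: eigenvalue 2 has algebraic multiplicity 2,
# but its eigenspace is only one-dimensional
A = sym.Matrix([[2, 1],
                [0, 2]])
print(A.eigenvals())              # {2: 2} -> multiplicity 2
print(len(A.eigenvects()[0][2]))  # 1 -> only one independent eigenvector
print(A.is_diagonalizable())      # False
```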

### The fourth example

Do this: Consider $$A = \begin{bmatrix}2& -1\\1& 2\end{bmatrix}.$$ Can we find a matrix $$C$$ such that $$C^{-1}AC$$ is diagonal?

#Put your answer to the above question here.


## 3. The Power of a Matrix

• For a diagonalizable matrix $$A$$, we have $$C^{-1}AC=D$$. Then we have $$A = C D C^{-1}.$$

• It follows that $$A^2 = C D C^{-1} C D C^{-1} = C D^2 C^{-1}$$ and, in general, $$A^n = C D C^{-1} \dots C D C^{-1} = C D^n C^{-1}.$$

• Because the columns of $$C$$ are eigenvectors, the eigenvectors of $$A$$ and $$A^n$$ are the same whenever $$A$$ is diagonalizable.

• If $$x$$ is an eigenvector of $$A$$ with the corresponding eigenvalue $$\lambda$$, then $$x$$ is also an eigenvector of $$A^n$$ with the corresponding eigenvalue $$\lambda^n$$.
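A minimal sketch of the $$A^n = C D^n C^{-1}$$ identity, using an arbitrary diagonalizable matrix:

```python
import sympy as sym

# Arbitrary diagonalizable example
A = sym.Matrix([[4, 1],
                [2, 3]])
C, D = A.diagonalize()

# Powering the diagonal matrix D is cheap (entrywise), then change basis back
print(C * D**10 * C.inv() == A**10)  # True
```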

# Here are some libraries you may need to use
%matplotlib inline
import numpy as np
import sympy as sym
import networkx as nx
import matplotlib.pyplot as plt
sym.init_printing(use_unicode=True)


### Graph Random Walk

• Define the following matrices:

• $$I$$ is the identity matrix

• $$A$$ is the adjacency matrix

• $$D$$ is the diagonal matrix of degrees (the number of edges connected to each node)

The lazy random walk matrix is then
$W=\frac{1}{2}(I + AD^{-1})$
• The lazy random walk matrix, $$W$$, takes a distribution vector of stuff, $$p_{t}$$, and diffuses it to its neighbors:

$p_{t+1}=Wp_{t}$
• For some initial distribution of stuff, $$p_{0}$$, we can compute how much of it would be at each node at time, $$t$$, by powering $$W$$ as follows:

$p_{t}=W^{t}p_{0}$
• Plugging in the above expression yields:

$p_{t}=\left( \frac{1}{2}(I+AD^{-1}) \right)^t p_{0}$
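The diffusion above can be sketched on a toy graph (a 3-node path, chosen here only for illustration):

```python
import numpy as np

# Toy example: lazy random walk on a 3-node path graph
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)   # adjacency matrix
D = np.diag(A.sum(axis=0))               # diagonal degree matrix
W = 0.5 * (np.eye(3) + A @ np.linalg.inv(D))  # lazy random walk matrix

p0 = np.array([1.0, 0.0, 0.0])            # all the "stuff" starts at node 0
pt = np.linalg.matrix_power(W, 100) @ p0  # p_t = W^t p_0
print(pt)  # converges toward d / sum(d) = [0.25, 0.5, 0.25]
```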

DO THIS: Using matrix algebra, show that $$\frac{1}{2}(I + AD^{-1})$$ is similar to $$I-\frac{1}{2}N$$, where $$N=D^{-\frac{1}{2}}(D-A)D^{-\frac{1}{2}}$$ is the normalized graph Laplacian.

### Random Walk on Barbell Graph

To generate the barbell graph, run the following cell.

n = 60 # number of nodes
B = nx.Graph() # initialize graph

## initialize empty edge lists
edge_list_complete_1 = []
edge_list_complete_2 = []
edge_list_path = []

## generate node lists
node_list_complete_1 = np.arange(int(n/3))
node_list_complete_2 = np.arange(int(2*n/3),n)
node_list_path = np.arange(int(n/3)-1,int(2*n/3))

## generate edge sets for barbell graph
for u in node_list_complete_1:
    for v in np.arange(u+1,int(n/3)):
        edge_list_complete_1.append((u,v))

for u in node_list_complete_2:
    for v in np.arange(u+1,n):
        edge_list_complete_2.append((u,v))

for u in node_list_path:
    edge_list_path.append((u,u+1))

## add the edges to the graph
B.add_edges_from(edge_list_complete_1)
B.add_edges_from(edge_list_complete_2)
B.add_edges_from(edge_list_path)

## draw graph
pos=nx.spring_layout(B) # positions for all nodes

### nodes
nx.draw_networkx_nodes(B, pos,
                       nodelist=list(node_list_complete_1),
                       node_color='c',
                       node_size=400,
                       alpha=0.8)
nx.draw_networkx_nodes(B, pos,
                       nodelist=list(node_list_path),
                       node_color='g',
                       node_size=200,
                       alpha=0.8)
nx.draw_networkx_nodes(B, pos,
                       nodelist=list(node_list_complete_2),
                       node_color='b',
                       node_size=400,
                       alpha=0.8)

### edges
nx.draw_networkx_edges(B, pos,
                       edgelist=edge_list_complete_1,
                       width=2, alpha=0.5, edge_color='c')
nx.draw_networkx_edges(B, pos,
                       edgelist=edge_list_path,
                       width=3, alpha=0.5, edge_color='g')
nx.draw_networkx_edges(B, pos,
                       edgelist=edge_list_complete_2,
                       width=2, alpha=0.5, edge_color='b')

plt.axis('off')
plt.show() # display


Do this: Generate the lazy random walk matrix, $$W$$, for the above graph.

A = nx.adjacency_matrix(B)
A = A.todense()

d = np.sum(A,0) # Make a vector of the sums.
D = np.diag(np.asarray(d)[0])

#Put your answer to the above question here.

from answercheck import checkanswer


Do this: Compute the eigenvalues and eigenvectors of $$W$$. Make a diagonal matrix $$J$$ with the eigenvalues on the diagonal. Name the matrix of eigenvectors $$V$$ (each column is an eigenvector).

#Put your answer to the above question here.


Now we make sure we constructed $$V$$ and $$J$$ correctly by double-checking that $$W = VJV^{-1}$$

np.allclose(W, V*J*np.linalg.inv(V))


Do this: Let your $$p_{0}=[1,0,0,\ldots,0]$$. Compute $$p_{t}$$ for $$t=1,2,\ldots,100$$, and plot $$||v_{1}-p_{t}||_{1}$$ versus $$t$$, where $$v_{1}$$ is the eigenvector associated with the largest eigenvalue $$\lambda_{1}=1$$ and whose sum equals 1. (Note: $$||\cdot||_{1}$$ may be computed using np.linalg.norm(v_1-p_t, 1).)

#Put your answer to the above question here.


#### Compare to Complete Graph

If you complete the above, do the same for a complete graph on the same number of nodes.

Question: What do you notice about the graph that is different from that above?