30. Diagonalization
Dated: 13-06-2025
Diagonalization is the process of factoring a square matrix \(A\) as \(A = PDP^{-1}\), where \(D\) is a diagonal matrix and \(P\) is an invertible matrix.
Example
Let
\[
A =
\begin{bmatrix}
7 & 2\\
-4 & 1
\end{bmatrix}
\]
Find a formula for \(A^k\), given
\[
D =
\begin{bmatrix}
5 & 0\\
0 & 3
\end{bmatrix}
\]
\[
P =
\begin{bmatrix}
1 & 1\\
-1 & -2
\end{bmatrix}
\]
\[\because A = PDP^{-1}\]
Finding \(P^{-1}\), we get
\[P^{-1} = \begin{bmatrix} 2 & 1 \\ -1 & -1 \end{bmatrix}\]
\[A^2 = A \cdot A\]
\[= (PDP^{-1})(PDP^{-1})\]
By associativity of matrix multiplication,
\[= PD(P^{-1}P)DP^{-1}\]
\[= PDIDP^{-1}\]
\[= PDDP^{-1}\]
\[= PD^2P^{-1}\]
\[A^3 = A \cdot A^2\]
\[= (PDP^{-1})PD^2P^{-1}\]
Again, by associativity,
\[= PD(P^{-1}P)D^2P^{-1}\]
\[= PDID^2P^{-1}\]
\[= PDD^2P^{-1}\]
\[= PD^3P^{-1}\]
Continuing in this way, by induction,
\[\therefore A^k = PD^kP^{-1}\]
\[
=
\begin{bmatrix}
1 & 1 \\
-1 & -2
\end{bmatrix}
\begin{bmatrix}
5^k & 0 \\
0 & 3^k
\end{bmatrix}
\begin{bmatrix}
2 & 1 \\
-1 & -1
\end{bmatrix}
\]
\[
=
\begin{bmatrix}
5^k & 3^k \\
-5^k & -2 \cdot 3^k
\end{bmatrix}
\begin{bmatrix}
2 & 1 \\
-1 & -1
\end{bmatrix}
\]
\[
=
\begin{bmatrix}
2 \cdot 5^k - 3^k & 5^k - 3^k \\
-2 \cdot 5^k + 2 \cdot 3^k & -5^k + 2 \cdot 3^k
\end{bmatrix}
\]
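As a quick numerical sanity check (not part of the original derivation), the closed form can be verified with NumPy for a sample exponent:

```python
import numpy as np

A = np.array([[7, 2], [-4, 1]])
P = np.array([[1, 1], [-1, -2]])

k = 4
direct = np.linalg.matrix_power(A, k)                 # A^k computed directly
via_d = P @ np.diag([5**k, 3**k]) @ np.linalg.inv(P)  # P D^k P^{-1}
closed = np.array([[2 * 5**k - 3**k, 5**k - 3**k],
                   [-2 * 5**k + 2 * 3**k, -5**k + 2 * 3**k]])

print(np.allclose(direct, via_d), np.allclose(direct, closed))  # True True
```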
Theorem
An \(n \times n\) matrix \(A\) is diagonalizable if and only if \(A\) has \(n\) linearly independent eigenvectors.
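A hedged NumPy sketch of this criterion: `np.linalg.eig` returns eigenvectors as the columns of a matrix, and full column rank means those eigenvectors are linearly independent. (The rank test is a numerical heuristic here, not a symbolic proof.)

```python
import numpy as np

A = np.array([[7, 2], [-4, 1]])
w, V = np.linalg.eig(A)   # eigenvalues in w, eigenvectors as columns of V
n = A.shape[0]

# Full rank <=> the n eigenvector columns are linearly independent,
# in which case A factors as V diag(w) V^{-1}.
if np.linalg.matrix_rank(V) == n:
    print(np.allclose(A, V @ np.diag(w) @ np.linalg.inv(V)))  # True
```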
Diagonalizing Matrices
Diagonalize the following matrix, if possible:
\[
A =
\begin{bmatrix}
1 & 3 & 3\\
-3 & -5 & -3\\
3 & 3 & 1
\end{bmatrix}
\]
Step 1
Find the eigenvalues of \(A\).
\[0 = \det(A - \lambda I)\]
\[= - \lambda^3 - 3 \lambda^2 + 4\]
\[= -(\lambda - 1)(\lambda + 2)^2\]
Therefore, the eigenvalues are \(\lambda = 1\) and \(\lambda = -2\) (with multiplicity \(2\)).
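These eigenvalues can be cross-checked numerically; the sketch below uses NumPy, where `np.poly(A)` returns the coefficients of \(\det(\lambda I - A)\) (the same polynomial as above, up to sign):

```python
import numpy as np

A = np.array([[1, 3, 3],
              [-3, -5, -3],
              [3, 3, 1]])

print(np.linalg.eigvals(A))  # approximately [ 1. -2. -2.]
print(np.poly(A))            # [ 1.  3.  0. -4.], i.e. lambda^3 + 3 lambda^2 - 4
```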
Step 2
Find three linearly independent eigenvectors, since \(A\) is a \(3 \times 3\) matrix.
Basis vector for \(\lambda = 1\)
\[(A - \lambda I) \vec x = \vec 0\]
\[
\begin{bmatrix}
0 & 3 & 3 \\
-3 & -6 & -3 \\
3 & 3 & 0
\end{bmatrix}
\begin{bmatrix}
x_1 \\
x_2 \\
x_3
\end{bmatrix}
=
\begin{bmatrix}
0 \\
0 \\
0
\end{bmatrix}
\]
After applying row operations
\[
\begin{bmatrix}
0 & 1 & 1 \\
3 & 3 & 0 \\
0 & 0 & 0
\end{bmatrix}
\begin{bmatrix}
x_1 \\
x_2 \\
x_3
\end{bmatrix}
=
\begin{bmatrix}
0 \\
0 \\
0
\end{bmatrix}
\]
\[
\begin{aligned}
x_2 + x_3 = 0\\
3x_1 + 3x_2 = 0\\
\end{aligned}
\]
Setting the free variable \(x_3 = t\):
\[x_1 = t, \quad x_2 = -t, \quad x_3 = t\]
\[
\therefore \vec{v}_1 =
\begin{bmatrix}
1 \\
-1 \\
1
\end{bmatrix}
\]
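A numerical cross-check of this eigenspace, assuming SciPy is available: `scipy.linalg.null_space` returns an orthonormal basis of \(\operatorname{Nul}(A - I)\), which should be a scalar multiple of \(\vec v_1\).

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1, 3, 3],
              [-3, -5, -3],
              [3, 3, 1]])

# Orthonormal basis of Nul(A - I); rescale so the first entry is 1.
v = null_space(A - np.eye(3))
print(v / v[0])  # [[ 1.] [-1.] [ 1.]] -- a multiple of v1
```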
Basis vectors for \(\lambda = -2\)
\[(A - \lambda I)\vec x = \vec 0\]
\[
\begin{bmatrix}
3 & 3 & 3 \\
-3 & -3 & -3 \\
3 & 3 & 3
\end{bmatrix}
\begin{bmatrix}
x_1 \\
x_2 \\
x_3
\end{bmatrix}
=
\begin{bmatrix}
0 \\
0 \\
0
\end{bmatrix}
\]
\[
\begin{aligned}
3x_1 + 3x_2 + 3x_3 = 0 \\
-3x_1 - 3x_2 - 3x_3 = 0 \\
3x_1 + 3x_2 + 3x_3 = 0 \\
\end{aligned}
\]
All three equations reduce to \(x_1 + x_2 + x_3 = 0\), so with free variables \(x_2 = s\) and \(x_3 = t\):
\[x_1 = -s - t, \quad x_2 = s, \quad x_3 = t\]
\[
\begin{bmatrix}
x_1 \\
x_2 \\
x_3
\end{bmatrix}
=
\begin{bmatrix}
-s-t \\
s \\
t
\end{bmatrix}
=
\begin{bmatrix}
-s \\
s \\
0
\end{bmatrix}
+
\begin{bmatrix}
-t \\
0 \\
t
\end{bmatrix}
\]
\[= s \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} + t \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}\]
\[= x_2 \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} + x_3 \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}\]
\[
\therefore \vec{v}_2 =
\begin{bmatrix}
-1 \\
1 \\
0
\end{bmatrix}, \quad
\vec{v}_3 =
\begin{bmatrix}
-1 \\
0 \\
1
\end{bmatrix}
\]
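The eigenspace for \(\lambda = -2\) can be checked the same way; its null space should be two-dimensional, matching the two free variables \(s\) and \(t\):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1, 3, 3],
              [-3, -5, -3],
              [3, 3, 1]])

W = null_space(A + 2 * np.eye(3))
print(W.shape)                     # (3, 2): a two-dimensional eigenspace
print(np.allclose(A @ W, -2 * W))  # True: every column is an eigenvector
```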
Step 3
Check that \(\{\vec v_1, \vec v_2, \vec v_3\}\) is linearly independent and use these vectors as the columns of \(P\):
\[
P =
\begin{bmatrix}
\vec{v}_1 & \vec{v}_2 & \vec{v}_3
\end{bmatrix}
=
\begin{bmatrix} 1 & -1 & -1 \\ -1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix}
\]
Step 4
Construct \(D\) from the eigenvalues. Their order must match the order chosen for the eigenvector columns of \(P\):
\[
D = \begin{bmatrix} 1 & 0 & 0 \\ 0 & -2 & 0 \\ 0 & 0 & -2 \end{bmatrix}
\]
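Putting the pieces together, a short NumPy check (an illustration, not part of the notes) confirms both the independence of the columns of \(P\) and the factorization \(A = PDP^{-1}\):

```python
import numpy as np

A = np.array([[1, 3, 3],
              [-3, -5, -3],
              [3, 3, 1]])
P = np.array([[1, -1, -1],
              [-1, 1, 0],
              [1, 0, 1]])
D = np.diag([1, -2, -2])

print(np.linalg.det(P))                          # 1.0 -> columns independent
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```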
Theorem
An \(n \times n\) matrix with \(n\) distinct eigenvalues is diagonalizable.
Theorem
Let \(A\) be an \(n \times n\) matrix whose distinct eigenvalues are \(\lambda_1, \ldots, \lambda_p\).
- \(A\) is diagonalizable \(\iff\) \(A\) has \(n\) linearly independent eigenvectors \(\iff\) the dimension of each eigenspace equals the multiplicity of its eigenvalue.
- To diagonalize \(A\), find a basis of \(\mathbb R^n\) consisting of eigenvectors of \(A\).