About Eigendecomposition

Matrix decompositions are a useful tool for reducing a matrix to its constituent parts in order to simplify a range of more complex operations. Perhaps the most widely used type of matrix decomposition is the eigendecomposition, which decomposes a square matrix into eigenvectors and eigenvalues. This decomposition also plays a role in machine learning methods, such as Principal Component Analysis (PCA). In this tutorial, you will discover the eigendecomposition, eigenvectors, and eigenvalues in linear algebra. After completing this tutorial, you will know:

  • What an eigendecomposition is and the role of eigenvectors and eigenvalues.
  • How to calculate an eigendecomposition in Python with NumPy.
  • How to confirm a vector is an eigenvector and how to reconstruct a matrix from eigenvectors and eigenvalues.

1.1 Tutorial Overview

This tutorial is divided into five parts; they are:

  1. Eigendecomposition of a Matrix
  2. Eigenvectors and Eigenvalues
  3. Calculation of Eigendecomposition
  4. Confirm an Eigenvector and Eigenvalue
  5. Reconstruct Matrix

1.2 Eigendecomposition of a Matrix

Eigendecomposition of a matrix is a type of decomposition that involves decomposing a square matrix into a set of eigenvectors and eigenvalues.

One of the most widely used kinds of matrix decomposition is called eigendecomposition, in which we decompose a matrix into a set of eigenvectors and eigenvalues.

A vector is an eigenvector of a matrix if it satisfies the following equation.

                                        A · v = λ · v

This is called the eigenvalue equation, where A is the parent square matrix that we are decomposing, v is an eigenvector of the matrix, and λ (the lowercase Greek letter lambda) is the eigenvalue, a scalar. Or, without the dot notation.

                                        Av = λv

A matrix can have one eigenvector and eigenvalue for each dimension of the parent matrix. Not all square matrices can be decomposed into eigenvectors and eigenvalues, and some can only be decomposed in a way that requires complex numbers. The parent matrix can be shown to be a product of the matrix of eigenvectors and the diagonal matrix of eigenvalues.
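As a concrete illustration of the eigenvalue equation (a minimal sketch using a hand-picked 2 × 2 matrix, not part of the original tutorial), the vector (1, 1) is an eigenvector of the symmetric matrix [[2, 1], [1, 2]] with eigenvalue 3: multiplying the matrix by the vector simply scales the vector by 3.

```python
# Worked example of the eigenvalue equation A . v = lambda . v
from numpy import array

# hand-picked matrix with a known eigenvector and eigenvalue
A = array([
    [2, 1],
    [1, 2]
])
v = array([1, 1])
lam = 3
# multiplying A by v scales v by the eigenvalue
print(A.dot(v))   # [3 3]
print(lam * v)    # [3 3]
```

Both products give the same vector, so v is an eigenvector of A with eigenvalue 3.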

                                                        A = Q · Λ · Q^-1

Or, without the dot notation.

                                                        A = QΛQ^-1

Here Q is the matrix whose columns are the eigenvectors, Λ (the uppercase Greek letter lambda) is the diagonal matrix of eigenvalues, and Q^-1 is the inverse of the eigenvector matrix.

1.3 Eigenvectors and Eigenvalues

Eigenvectors are unit vectors, which means that their length or magnitude is equal to 1.0. They are often referred to as right vectors, which simply means column vectors (as opposed to row vectors or left vectors). A right vector is a vector as we usually understand them. Eigenvalues are coefficients applied to eigenvectors that give the vectors their length or magnitude. For example, a negative eigenvalue may reverse the direction of the eigenvector as part of scaling it.

A matrix that has only positive eigenvalues is referred to as a positive definite matrix, whereas if the eigenvalues are all negative, it is referred to as a negative definite matrix. Decomposing a matrix in terms of its eigenvalues and eigenvectors gives valuable insight into the properties of the matrix. Certain matrix calculations, like computing a power of the matrix, become much easier when we use the eigendecomposition of the matrix.
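The claim about matrix powers can be checked numerically (a small sketch, not part of the original text): since A = QΛQ^-1, it follows that A^n = QΛ^nQ^-1, and Λ^n is cheap to compute because only the diagonal entries need to be raised to the power.

```python
# Computing a matrix power via the eigendecomposition A^n = Q . L^n . Q^-1
from numpy import array, diag, allclose
from numpy.linalg import eig, inv, matrix_power

A = array([
    [2, 1],
    [1, 2]
])
values, vectors = eig(A)
Q = vectors
# raise only the diagonal eigenvalue matrix to the third power
A3 = Q.dot(diag(values ** 3)).dot(inv(Q))
print(A3)
# compare with the direct computation A . A . A
print(matrix_power(A, 3))
```

Both prints show the same matrix (up to floating-point rounding).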

1.4 Calculation of Eigendecomposition

An eigendecomposition can be calculated in NumPy using the eig() function. The example below first defines a 3 × 3 square matrix, then calculates the eigendecomposition of it.

# eigendecomposition
from numpy import array
from numpy.linalg import eig
# define matrix
A = array([
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9]
])
print(A)
# factorize
values, vectors = eig(A)
print(values)
print(vectors)

Running the example first prints the defined matrix, followed by the eigenvalues and the eigenvectors. More specifically, the eigenvectors are the right-hand side eigenvectors and are normalized to unit length.
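We can check the unit-length claim directly (a small sketch, not in the original): the Euclidean norm of each eigenvector column returned by eig() is 1.

```python
# Confirm that eig() returns eigenvectors normalized to unit length
from numpy import array
from numpy.linalg import eig, norm

A = array([
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9]
])
values, vectors = eig(A)
# each column of the returned matrix is a unit-length eigenvector
for i in range(vectors.shape[1]):
    print(norm(vectors[:, i]))  # approximately 1.0
```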

1.5 Confirm an Eigenvector and Eigenvalue

We can confirm that a vector is indeed an eigenvector of a matrix. We do this by multiplying the matrix by the candidate eigenvector and comparing the result with the candidate eigenvector scaled by its eigenvalue. First, we will define a matrix, then calculate the eigenvalues and eigenvectors. We will then test whether the first vector and value are in fact an eigenvector and eigenvalue for the matrix. We know they are, but it is a good exercise.

The eigenvectors are returned as a matrix with the same dimensions as the parent matrix, where each column is an eigenvector, e.g. the first eigenvector is vectors[:, 0]. Eigenvalues are returned as an array, where value indices are paired with eigenvectors by column index, e.g. the first eigenvalue at values[0] is paired with the first eigenvector at vectors[:, 0].

# Example of calculating a confirmation of an eigendecomposition
# confirm eigenvector
from numpy import array
from numpy.linalg import eig
# define matrix
A = array([
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9]
])
# factorize
values, vectors = eig(A)
B = A.dot(vectors[:,0])
print(B)
C = vectors[:,0] * values[0]
print(C)

The example multiplies the original matrix with the first eigenvector and compares it to the first eigenvector multiplied by the first eigenvalue. Running the example prints the results of these two multiplications that show the same resulting vector, as we would expect.
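The same check can be run across every eigenpair at once using allclose(), which compares arrays to within floating-point tolerance (a small extension of the example above, not part of the original tutorial):

```python
# Confirm every eigenvalue/eigenvector pair of the matrix at once
from numpy import array, allclose
from numpy.linalg import eig

A = array([
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9]
])
values, vectors = eig(A)
for i in range(len(values)):
    # A . v_i should equal lambda_i . v_i for each column i
    print(allclose(A.dot(vectors[:, i]), values[i] * vectors[:, i]))  # True
```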

1.6 Reconstruct Matrix

We can reverse the process and reconstruct the original matrix given only the eigenvectors and eigenvalues. First, the eigenvectors must be taken together as a matrix, where each eigenvector is a column; this is exactly the matrix returned by eig(). The eigenvalues need to be arranged into a diagonal matrix; the NumPy diag() function can be used for this. Next, we need to calculate the inverse of the eigenvector matrix, which we can achieve with the inv() NumPy function. Finally, these elements need to be multiplied together with the dot() function.

# Example of reconstructing a matrix from an eigendecomposition
# reconstruct matrix
from numpy import diag
from numpy.linalg import inv
from numpy import array
from numpy.linalg import eig
# define matrix
A = array([
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9]
])
print(A)

# factorize
values, vectors = eig(A)
# create matrix from eigenvectors
Q = vectors
# create inverse of eigenvectors matrix
R = inv(Q)
# create diagonal matrix from eigenvalues
L = diag(values)
# reconstruct the original matrix
B = Q.dot(L).dot(R)
print(B)

The example calculates the eigenvalues and eigenvectors again and uses them to reconstruct the original matrix. Running the example first prints the original matrix, then the matrix reconstructed from the eigenvalues and eigenvectors, which matches the original.
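As an aside not covered in the original code: for a real symmetric matrix the eigenvector matrix Q is orthogonal, so the inverse can be replaced by the cheaper transpose, giving A = QΛQ^T. A sketch:

```python
# Reconstruct a symmetric matrix using the transpose instead of the inverse
from numpy import array, diag, allclose
from numpy.linalg import eig

# symmetric matrix: A equals its own transpose
A = array([
    [2, 1],
    [1, 2]
])
values, vectors = eig(A)
# for a symmetric matrix, Q^-1 equals Q^T because Q is orthogonal
B = vectors.dot(diag(values)).dot(vectors.T)
print(B)
```

The reconstruction B matches A without ever computing a matrix inverse.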

1.7 Summary

In this tutorial, you discovered the eigendecomposition, eigenvectors, and eigenvalues in linear algebra. Specifically, you learned:

  • What an eigendecomposition is and the role of eigenvectors and eigenvalues.
  • How to calculate an eigendecomposition in Python with NumPy.
  • How to confirm a vector is an eigenvector and how to reconstruct a matrix from eigenvectors and eigenvalues.