Introduction to Linear Algebra (5): Eigenvalues and Eigenvectors

Eigenvalues and Eigenvectors

Eigenvalues

The eigenvalues of a triangular matrix are the entries on its main diagonal.
If $v_1,\cdots,v_r$ are eigenvectors that correspond to distinct eigenvalues $\lambda_1,\cdots,\lambda_r$ of an $n\times n$ matrix $A$, then the set $\{v_1,\cdots,v_r\}$ is linearly independent.
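The triangular-matrix fact is easy to check numerically. A minimal sketch with NumPy (the matrix below is an arbitrary example):

```python
import numpy as np

# An upper-triangular matrix: its eigenvalues are its diagonal entries.
A = np.array([[3.0, 1.0, 4.0],
              [0.0, 2.0, 5.0],
              [0.0, 0.0, 7.0]])

eigenvalues = np.linalg.eigvals(A)
print(sorted(eigenvalues.real))  # the diagonal entries: 2, 3, 7
```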

The characteristic equation

Let $A$ be an $n\times n$ matrix. Then $A$ is invertible if and only if:
s. the number 0 is not an eigenvalue of $A$;
t. the determinant of $A$ is not zero.

A scalar $\lambda$ is an eigenvalue of an $n\times n$ matrix $A$ if and only if $\lambda$ satisfies the characteristic equation $\det(A-\lambda I)=0$.
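For a concrete instance, the characteristic polynomial can be expanded and its roots compared with the eigenvalues. A sketch in NumPy (the $2\times 2$ matrix is an example chosen for illustration):

```python
import numpy as np

# Characteristic equation: det(A - lambda*I) = 0.
A = np.array([[2.0, 3.0],
              [3.0, -6.0]])

# np.poly(A) returns the coefficients of det(lambda*I - A):
# here lambda^2 + 4*lambda - 21 = (lambda - 3)(lambda + 7).
coeffs = np.poly(A)
roots = np.roots(coeffs)  # the roots of the characteristic polynomial
print(coeffs, sorted(roots.real))  # [1, 4, -21] and eigenvalues -7, 3
```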

Similarity

$A$ is similar to $B$ if there is an invertible matrix $P$ such that $P^{-1}AP=B$.
If $n\times n$ matrices $A$ and $B$ are similar, then they have the same characteristic polynomial and hence the same eigenvalues (with the same multiplicities).
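This invariance is easy to verify numerically. A sketch (both matrices are arbitrary example choices):

```python
import numpy as np

# Similar matrices B = P^{-1} A P share the same characteristic polynomial.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # any invertible matrix works
B = np.linalg.inv(P) @ A @ P

# Identical coefficient lists, hence identical eigenvalues.
print(np.poly(A), np.poly(B))
```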

Diagonalization

An $n\times n$ matrix $A$ is diagonalizable if and only if $A$ has $n$ linearly independent eigenvectors.
In fact, $A=PDP^{-1}$, with $D$ a diagonal matrix, if and only if the columns of $P$ are $n$ linearly independent eigenvectors of $A$. In this case, the diagonal entries of $D$ are the eigenvalues of $A$ that correspond, respectively, to the eigenvectors in $P$.
An $n\times n$ matrix with $n$ distinct eigenvalues is diagonalizable.
Let $A$ be an $n\times n$ matrix whose distinct eigenvalues are $\lambda_1,\cdots,\lambda_p$.
a. For $1\leq k\leq p$, the dimension of the eigenspace for $\lambda_k$ is less than or equal to the multiplicity of the eigenvalue $\lambda_k$.
b. The matrix $A$ is diagonalizable if and only if the sum of the dimensions of the eigenspaces equals $n$, and this happens if and only if (i) the characteristic polynomial factors completely into linear factors and (ii) the dimension of the eigenspace for each $\lambda_k$ equals the multiplicity of $\lambda_k$.
c. If $A$ is diagonalizable and $\beta_k$ is a basis for the eigenspace corresponding to $\lambda_k$ for each $k$, then the total collection of vectors in the sets $\beta_1,\cdots,\beta_p$ forms an eigenvector basis for $R^n$.
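The factorization $A=PDP^{-1}$ can be built directly from the eigenvalue decomposition. A sketch for a matrix with distinct eigenvalues (the matrix is an example choice):

```python
import numpy as np

# Diagonalize a matrix with distinct eigenvalues 3 and 5.
A = np.array([[7.0, 2.0],
              [-4.0, 1.0]])

# np.linalg.eig returns the eigenvalues and a matrix P whose
# columns are the corresponding eigenvectors.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# Verify A = P D P^{-1}.
A_reconstructed = P @ D @ np.linalg.inv(P)
print(np.allclose(A, A_reconstructed))  # True
```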

Eigenvectors and Linear Transformations

Let $V$ and $W$ be vector spaces of dimensions $n$ and $m$, so that the coordinate vector $[x]_{\beta}$ is in $R^n$ and $[T(x)]_C$ is in $R^m$. Let $\beta=\{b_1,\cdots,b_n\}$ be a basis for $V$ and $C$ a basis for $W$. Then the matrix $M=[[T(b_1)]_C,\cdots,[T(b_n)]_C]$ satisfies $[T(x)]_C=M[x]_{\beta}$, so the action of $T$ on $x$ may be viewed as left-multiplication by $M$.
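As a concrete sketch of building $M$ column by column (the transformation and bases here are assumptions chosen for illustration): take $T=d/dt$ from the polynomials of degree at most 2 to those of degree at most 1, with $\beta=\{1,t,t^2\}$ and $C=\{1,t\}$.

```python
import numpy as np

# M = [[T(b1)]_C, ..., [T(bn)]_C], one column per basis vector of beta.
# T(1) = 0, T(t) = 1, T(t^2) = 2t, so the coordinate columns are:
M = np.column_stack([[0.0, 0.0],   # [T(1)]_C
                     [1.0, 0.0],   # [T(t)]_C
                     [0.0, 2.0]])  # [T(t^2)]_C

# The action of T is left-multiplication by M on [x]_beta:
# p(t) = 5 + 3t + 2t^2 has coordinates [5, 3, 2], and p'(t) = 3 + 4t.
coords_p = np.array([5.0, 3.0, 2.0])
print(M @ coords_p)  # the coordinates [3, 4] of p'(t) relative to C
```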

Linear Transformations on $R^n$

Suppose $A=PDP^{-1}$, where $D$ is a diagonal $n\times n$ matrix. If $\beta$ is the basis for $R^n$ formed from the columns of $P$, then $D$ is the $\beta$-matrix for the transformation $x\mapsto Ax$.

Similarity of Matrix Representations

More generally, if $A=PCP^{-1}$, then $C$ is the matrix for the transformation $x\mapsto Ax$ relative to the basis formed from the columns of $P$: similar matrices represent the same linear transformation in different bases.

Iterative Estimates For Eigenvalues

The power method applies to an $n\times n$ matrix $A$ with a strictly dominant eigenvalue $\lambda_1$, meaning that $\lambda_1$ is larger in absolute value than all the other eigenvalues; that is, $|\lambda_1|>|\lambda_2|\geq|\lambda_3|\geq\cdots\geq|\lambda_n|$. The following iteration then yields an eigenvector estimate:
1. Select an initial vector $x_0$ whose largest entry is 1.
2. For $k=0,1,\cdots$:
a. Compute $Ax_k$.
b. Let $\mu_k$ be an entry in $Ax_k$ whose absolute value is as large as possible.
c. Compute $x_{k+1}=(1/\mu_k)Ax_k$.
3. For almost all choices of $x_0$, the sequence $\{\mu_k\}$ approaches the dominant eigenvalue, and the sequence $\{x_k\}$ approaches a corresponding eigenvector.
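The steps above can be sketched directly in NumPy (the test matrix, with dominant eigenvalue 7, is an example choice; the fixed iteration count is an assumption in place of a convergence test):

```python
import numpy as np

def power_method(A, x0, num_iterations=50):
    """Power method sketch: returns the eigenvalue estimate mu_k and
    the scaled vector x_k (largest entry 1) after the iterations."""
    x = x0.astype(float)
    mu = 0.0
    for _ in range(num_iterations):
        y = A @ x                     # step a: compute A x_k
        mu = y[np.argmax(np.abs(y))]  # step b: entry of largest absolute value
        x = y / mu                    # step c: x_{k+1} = (1/mu_k) A x_k
    return mu, x

# Example matrix with eigenvalues 7 and 1; 7 is strictly dominant.
A = np.array([[6.0, 5.0],
              [1.0, 2.0]])
mu, x = power_method(A, np.array([1.0, 1.0]))
print(mu, x)  # mu approaches 7; x approaches the eigenvector (1, 0.2)
```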

The Inverse Power Method for Estimating an Eigenvalue $\lambda$ of $A$

  1. Select an initial estimate $\alpha$ sufficiently close to $\lambda$.
  2. Select an initial vector $x_0$ whose largest entry is 1.
  3. For $k=0,1,\cdots$:
    a. Solve $(A-\alpha I)y_k=x_k$ for $y_k$.
    b. Let $\mu_k$ be an entry in $y_k$ whose absolute value is as large as possible.
    c. Compute $v_k=\alpha+(1/\mu_k)$.
    d. Compute $x_{k+1}=(1/\mu_k)y_k$.
  4. For almost all choices of $x_0$, the sequence $\{v_k\}$ approaches the eigenvalue $\lambda$ of $A$, and the sequence $\{x_k\}$ approaches a corresponding eigenvector.
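A matching sketch of the inverse power method (the example matrix and the initial estimate $\alpha=1.5$, chosen near the eigenvalue 1, are assumptions for illustration):

```python
import numpy as np

def inverse_power_method(A, alpha, x0, num_iterations=50):
    """Inverse power method sketch: estimates the eigenvalue of A
    closest to the initial estimate alpha."""
    n = A.shape[0]
    x = x0.astype(float)
    nu = alpha
    for _ in range(num_iterations):
        y = np.linalg.solve(A - alpha * np.eye(n), x)  # step a
        mu = y[np.argmax(np.abs(y))]                   # step b
        nu = alpha + 1.0 / mu                          # step c
        x = y / mu                                     # step d
    return nu, x

# Example matrix with eigenvalues 7 and 1; alpha = 1.5 is closest to 1.
A = np.array([[6.0, 5.0],
              [1.0, 2.0]])
nu, x = inverse_power_method(A, 1.5, np.array([1.0, 1.0]))
print(nu)  # approaches the eigenvalue 1
```

Solving $(A-\alpha I)y_k=x_k$ at each step, rather than inverting $A-\alpha I$ once, mirrors the statement of the algorithm and is also the numerically preferred approach.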