Eigenvalues and Eigenvectors
Eigenvalues
The eigenvalues of a triangular matrix are the entries on its main diagonal.
If v1,⋯,vr are eigenvectors that correspond to distinct eigenvalues λ1,⋯,λr of an n×n matrix A, then the set {v1,⋯,vr} is linearly independent.
The characteristic equation
Let A be an n×n matrix. Then A is invertible if and only if:
s. the number 0 is not an eigenvalue of A
t. the determinant of A is not zero
A scalar λ is an eigenvalue of an n×n matrix A if and only if λ satisfies the characteristic equation det(A−λI)=0.
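As a quick numerical check (the matrix below is chosen for illustration, not taken from the text), NumPy's eigenvalue routine returns exactly the roots of the characteristic equation:

```python
import numpy as np

# Illustrative matrix: its characteristic equation is
# det(A - lam*I) = lam**2 - 5*lam + 4 = 0, with roots 1 and 4.
A = np.array([[2.0, 2.0],
              [1.0, 3.0]])

eigs = np.sort(np.linalg.eigvals(A).real)
print(eigs)  # [1. 4.]

# det(A - lam*I) vanishes (up to round-off) at each eigenvalue
for lam in eigs:
    assert abs(np.linalg.det(A - lam * np.eye(2))) < 1e-9
```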
Similarity
A is similar to B if there is an invertible matrix P such that P−1AP=B.
If n×n matrices A and B are similar, then they have the same characteristic polynomial and hence the same eigenvalues (with the same multiplicities).
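A small sketch verifying this numerically (A and P are arbitrary choices made here for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])         # triangular: eigenvalues 2 and 3
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])         # any invertible matrix
B = np.linalg.inv(P) @ A @ P       # B = P^{-1} A P is similar to A

# Similar matrices share the same eigenvalues
print(np.sort(np.linalg.eigvals(A).real))  # [2. 3.]
print(np.sort(np.linalg.eigvals(B).real))  # [2. 3.]
```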
Diagonalization
An n×n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors.
In fact, A=PDP−1, with D a diagonal matrix, if and only if the columns of P are linearly independent eigenvectors of A. In this case, the diagonal entries of D are eigenvalues of A that correspond, respectively, to the eigenvectors in P.
An n×n matrix with n distinct eigenvalues is diagonalizable.
Let A be an n×n matrix whose distinct eigenvalues are λ1,⋯,λp.
a. For 1≤k≤p, the dimension of the eigenspace for λk is less than or equal to the multiplicity of the eigenvalue λk.
b. The matrix A is diagonalizable if and only if the sum of the dimensions of the eigenspaces equals n, and this happens if and only if (i) the characteristic polynomial factors completely into linear factors and (ii) the dimension of the eigenspace for each λk equals the multiplicity of λk.
c. If A is diagonalizable and βk is a basis for the eigenspace corresponding to λk for each k, then the total collection of vectors in the sets β1,⋯,βp forms an eigenvector basis for Rn.
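The factorization A=PDP−1 can be sketched numerically (the matrix is an illustrative choice; `np.linalg.eig` returns the eigenvalues and eigenvector columns directly):

```python
import numpy as np

# Illustrative matrix: lower triangular, so its eigenvalues 4 and 2
# are distinct and A is diagonalizable.
A = np.array([[4.0, 0.0],
              [1.0, 2.0]])

lams, P = np.linalg.eig(A)   # columns of P are eigenvectors of A
D = np.diag(lams)            # diagonal entries are the eigenvalues

# A = P D P^{-1}, up to floating-point round-off
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```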
Eigenvectors and Linear Transformations
Let V and W be vector spaces of dimensions n and m, with basis β={b1,⋯,bn} for V and basis C for W, and let T:V→W be linear. Then [x]β is a coordinate vector in Rn and [T(x)]C is in Rm. The matrix M=[[T(b1)]C,⋯,[T(bn)]C] satisfies [T(x)]C=M[x]β, so the action of T on x may be viewed as left-multiplication by M.
Linear Transformations on Rn
Suppose A=PDP−1, where D is a diagonal n×n matrix. If β is the basis for Rn formed from the columns of P, then D is the β-matrix for the transformation x↦Ax.
Similarity of Matrix Representations
A=PCP−1
Iterative Estimates For Eigenvalues
The power method applies to an n×n matrix A with a strictly dominant eigenvalue λ1, meaning that λ1 is larger in absolute value than all the other eigenvalues; that is, ∣λ1∣>∣λ2∣≥∣λ3∣≥⋯≥∣λn∣. The following iteration produces an estimate of λ1 and a corresponding eigenvector:
1. Select an initial vector x0 whose largest entry is 1.
2. For k=0,1,⋯,
a. Compute Axk.
b. Let μk be an entry in Axk whose absolute value is as large as possible.
c. Compute xk+1=(1/μk)Axk.
3. For almost all choices of x0, the sequence {μk} approaches the dominant eigenvalue, and the sequence {xk} approaches a corresponding eigenvector.
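The steps above can be sketched as follows (the test matrix is an illustrative choice with eigenvalues 7 and 1, so λ1=7 is strictly dominant):

```python
import numpy as np

def power_method(A, x0, steps=50):
    """Power method sketch: returns (mu, x), where mu approximates the
    strictly dominant eigenvalue and x a corresponding eigenvector
    scaled so its largest entry is 1."""
    x = x0.astype(float)
    mu = 0.0
    for _ in range(steps):
        y = A @ x                        # step (a): compute A x_k
        mu = y[np.argmax(np.abs(y))]     # step (b): entry of largest absolute value
        x = y / mu                       # step (c): x_{k+1} = (1/mu_k) A x_k
    return mu, x

A = np.array([[6.0, 5.0],
              [1.0, 2.0]])              # eigenvalues 7 and 1
mu, x = power_method(A, np.array([1.0, 0.0]))
print(round(mu, 6))                     # 7.0
print(np.round(x, 6))                   # eigenvector (1, 0.2) for lambda = 7
```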
The Inverse Power Method for Estimating an Eigenvalue λ of A
- Select an initial estimate α sufficiently close to λ.
- Select an initial vector x0 whose largest entry is 1.
- For k=0,1,⋯,
a. Solve (A−αI)yk=xk for yk.
b. Let μk be an entry in yk whose absolute value is as large as possible.
c. Compute vk=α+(1/μk).
d. Compute xk+1=(1/μk)yk.
- For almost all choices of x0, the sequence {vk} approaches the eigenvalue λ of A, and the sequence {xk} approaches a corresponding eigenvector.
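A sketch of this iteration, reusing the illustrative matrix with eigenvalues 7 and 1; with the guess α=1.3 the method converges to the nearby eigenvalue 1:

```python
import numpy as np

def inverse_power_method(A, alpha, x0, steps=20):
    """Inverse power method sketch: estimates the eigenvalue of A
    closest to the initial guess alpha, plus a corresponding eigenvector."""
    n = A.shape[0]
    x = x0.astype(float)
    v = alpha
    for _ in range(steps):
        y = np.linalg.solve(A - alpha * np.eye(n), x)  # step (a): solve (A - alpha*I) y_k = x_k
        mu = y[np.argmax(np.abs(y))]                   # step (b): entry of largest absolute value
        v = alpha + 1.0 / mu                           # step (c): eigenvalue estimate
        x = y / mu                                     # step (d): x_{k+1} = (1/mu_k) y_k
    return v, x

A = np.array([[6.0, 5.0],
              [1.0, 2.0]])                 # eigenvalues 7 and 1
v, x = inverse_power_method(A, 1.3, np.array([1.0, 0.0]))
print(round(v, 6))                         # 1.0
```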