### Eigenvalues and Eigenvectors

- **Definition:** A number $\lambda \in F$ is an **eigenvalue** (latent root) of a linear transformation $T: V \to V$ if there exists a non-zero vector $u \in V$ such that $T(u) = \lambda u$. The non-zero vector $u$ is called an **eigenvector** corresponding to $\lambda$.
- For a square matrix $A$, the relationship is $A\vec{x} = \lambda\vec{x}$.

#### Finding Eigenvalues

1. For a square matrix $A$ of order $n$, if $\vec{x}$ is an eigenvector corresponding to $\lambda$, then $A\vec{x} = \lambda\vec{x} \implies (A - \lambda I)\vec{x} = \vec{0}$.
2. For a non-trivial solution $\vec{x} \neq \vec{0}$, the matrix $(A - \lambda I)$ must be singular, i.e. $\det(A - \lambda I) = 0$.
3. The equation $\det(A - \lambda I) = 0$ is called the **characteristic equation**. Its roots are the **eigenvalues** (or characteristic roots).
   $$|A - \lambda I| = p_n(\lambda) = (-1)^n(\lambda^n - c_1\lambda^{n-1} + c_2\lambda^{n-2} - \dots + (-1)^n c_n) = 0$$
   where $c_1 = \text{Trace}(A) = \sum a_{ii}$ and $c_n = \det(A)$.

#### Definitions

- **Spectrum of $A$:** The set of all eigenvalues of $A$.
- **Spectral radius of $A$:** $\max\{|\lambda| : \lambda \in \text{spectrum of } A\}$.
- **Eigenspace for $\lambda$:** The set of all eigenvectors corresponding to $\lambda$ together with the zero vector $\vec{0}$; it forms a subspace.
  - Given by $\{\vec{x} \in V : T(\vec{x}) = \lambda \vec{x}\}$.

#### Remarks

- If $|A| = 0$, then one of the eigenvalues is $0$.
- If $A$ is a triangular matrix, its eigenvalues are its diagonal entries.

#### Theorem 1.1 (Eigenvalue Properties for Linear Operators)

For a finite-dimensional vector space $V$ over a field $F$ and a linear operator $T: V \to V$, the following are equivalent:

(i) $\lambda$ is an eigenvalue of $T$.
(ii) The operator $(T - \lambda I)$ is singular.
(iii) $|T - \lambda I| = 0$.
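To make the procedure in "Finding Eigenvalues" concrete, here is a minimal numerical sketch (Python with NumPy, an assumed environment; the matrix is chosen purely for illustration). It builds the characteristic polynomial $\det(A - \lambda I)$, solves for its roots, and checks that each eigenpair satisfies $A\vec{x} = \lambda\vec{x}$ and that $(A - \lambda I)$ is singular.

```python
import numpy as np

# Example matrix (chosen only for illustration)
A = np.array([[1.0, 4.0],
              [2.0, 3.0]])
n = A.shape[0]

# Characteristic polynomial: det(A - lambda I) = 0.
# np.poly returns the characteristic-polynomial coefficients of a square matrix.
coeffs = np.poly(A)              # [1, -Trace(A), det(A)] for a 2x2 matrix
eigenvalues = np.roots(coeffs)   # roots of the characteristic equation
print("eigenvalues from characteristic equation:", np.sort(eigenvalues))

# Cross-check with the library eigen-solver and verify A x = lambda x.
vals, vecs = np.linalg.eig(A)
for lam, x in zip(vals, vecs.T):
    assert np.allclose(A @ x, lam * x)                            # definition of an eigenpair
    assert np.isclose(np.linalg.det(A - lam * np.eye(n)), 0.0)    # (A - lambda I) is singular
print("eigenvalues from np.linalg.eig:", np.sort(vals))
```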
### Eigenvalue Problem

The problem of determining the eigenvalues and corresponding eigenvectors of a square matrix $A$ is called the eigenvalue problem.

#### Types of Matrices

For a square matrix $A = [a_{ij}]$:

1. **Unitary:** $A^* = A^{-1}$, i.e. $AA^* = A^*A = I$ (where $A^*$ is the conjugate transpose).
2. **Orthogonal:** $A^T = A^{-1} \implies AA^T = I$.
3. **Hermitian:** $A = A^*$ (for real matrices this reduces to $A = A^T$).
4. **Skew-Hermitian:** $A = -A^*$.
5. **Positive Definite:** $\vec{x}^*A\vec{x} > 0$ for all $\vec{x} \neq \vec{0}$ (so $\vec{x}^*A\vec{x} = 0$ only when $\vec{x} = \vec{0}$).

### Properties of Eigenvalues and Eigenvectors

If $\lambda$ is an eigenvalue of matrix $A$ and $\vec{x}$ is a corresponding eigenvector:

1. **Scalar Multiplication:** $\alpha A$ has eigenvalue $\alpha\lambda$ with the same eigenvector $\vec{x}$. ($A\vec{x} = \lambda\vec{x} \implies \alpha A\vec{x} = \alpha\lambda\vec{x}$)
2. **Powers of a Matrix:** $A^m$ has eigenvalue $\lambda^m$ with the same eigenvector $\vec{x}$. ($A\vec{x} = \lambda\vec{x} \implies A^m\vec{x} = \lambda^m\vec{x}$)
3. **Shifted Matrix:** $(A - kI)$ has eigenvalue $(\lambda - k)$ with the same eigenvector $\vec{x}$. ($(A - kI)\vec{x} = A\vec{x} - kI\vec{x} = (\lambda - k)\vec{x}$)
4. **Inverse Matrix:** If $A$ is invertible, $A^{-1}$ has eigenvalue $\lambda^{-1}$ with the same eigenvector $\vec{x}$. ($A\vec{x} = \lambda\vec{x} \implies A^{-1}A\vec{x} = \lambda A^{-1}\vec{x} \implies \vec{x} = \lambda A^{-1}\vec{x} \implies A^{-1}\vec{x} = \lambda^{-1}\vec{x}$)
5. **Inverse of a Shifted Matrix:** $(A - kI)^{-1}$ has eigenvalue $(\lambda - k)^{-1}$ with the same eigenvector $\vec{x}$.
6. **Transpose:** $A$ and $A^T$ have the same eigenvalues.
7. **Complex Conjugate:** For a real matrix $A$, if $\alpha + i\beta$ is an eigenvalue, then $\alpha - i\beta$ is also an eigenvalue.
8. **Scalar Multiple of an Eigenvector:** If $\vec{x}$ is an eigenvector corresponding to $\lambda$, then $c\vec{x}$ (for $c \neq 0$) is also an eigenvector for the same eigenvalue; eigenvectors are unique only up to a non-zero constant multiple.
9. **Trace:** $\text{Trace}(A) = \sum \lambda_i$ (sum of eigenvalues).
10. **Determinant:** $\det(A) = \prod \lambda_i$ (product of eigenvalues).
11. **Orthogonal Matrix:** If $A$ is orthogonal and $\lambda$ is an eigenvalue, then $\frac{1}{\lambda}$ is also an eigenvalue.
12. **Linearly Independent Eigenvectors:** Eigenvectors corresponding to distinct eigenvalues are linearly independent.

#### Remarks on Eigenspace Dimension

- The number of linearly independent eigenvectors corresponding to an eigenvalue $\lambda$ depends on the rank of $(A - \lambda I)$.
- If $\lambda$ has multiplicity $m$, the number $p$ of LI eigenvectors associated with $\lambda$ satisfies $1 \le p \le m$.
- $p = n - \text{Rank}(A - \lambda I)$.

#### Example: Eigenvalues of the Adjugate Matrix

If $\lambda$ is an eigenvalue of an invertible matrix $A$, then $\frac{|A|}{\lambda}$ is an eigenvalue of $\text{adj}(A)$.

**Proof:** $A\vec{x} = \lambda\vec{x} \implies \text{adj}(A)A\vec{x} = \lambda\,\text{adj}(A)\vec{x} \implies \det(A)I\vec{x} = \lambda\,\text{adj}(A)\vec{x} \implies \text{adj}(A)\vec{x} = \frac{\det(A)}{\lambda}\vec{x}$.

### Cayley-Hamilton Theorem

**Theorem 1.2:** Every square matrix $A$ satisfies its own characteristic equation. If the characteristic equation is $|A - \lambda I| = \lambda^n - c_1\lambda^{n-1} + c_2\lambda^{n-2} - \dots + (-1)^n c_n = 0$, then
$$A^n - c_1A^{n-1} + c_2A^{n-2} - \dots + (-1)^n c_n I = 0.$$

#### Deductions

- **Finding $A^{-1}$:** $A^{-1} = \frac{(-1)^{n-1}}{c_n} \left(A^{n-1} - c_1A^{n-2} + \dots + (-1)^{n-1}c_{n-1}I\right)$.
- **Finding $A^n$ (or higher powers):** $A^n = c_1A^{n-1} - c_2A^{n-2} + \dots - (-1)^n c_n I$.

#### Example: Verify the Cayley-Hamilton Theorem

Given $A = \begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix}$.

Characteristic equation: $|A - \lambda I| = \begin{vmatrix} 1-\lambda & 4 \\ 2 & 3-\lambda \end{vmatrix} = (1-\lambda)(3-\lambda) - 8 = \lambda^2 - 4\lambda + 3 - 8 = \lambda^2 - 4\lambda - 5 = 0$.

By the Cayley-Hamilton theorem, $A^2 - 4A - 5I = 0$.

$A^2 = \begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix} \begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix} = \begin{pmatrix} 1+8 & 4+12 \\ 2+6 & 8+9 \end{pmatrix} = \begin{pmatrix} 9 & 16 \\ 8 & 17 \end{pmatrix}$, $\quad 4A = \begin{pmatrix} 4 & 16 \\ 8 & 12 \end{pmatrix}$, $\quad 5I = \begin{pmatrix} 5 & 0 \\ 0 & 5 \end{pmatrix}$.

$A^2 - 4A - 5I = \begin{pmatrix} 9 & 16 \\ 8 & 17 \end{pmatrix} - \begin{pmatrix} 4 & 16 \\ 8 & 12 \end{pmatrix} - \begin{pmatrix} 5 & 0 \\ 0 & 5 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$.

The theorem is verified.
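The Cayley-Hamilton deductions can also be checked numerically. The following sketch (Python/NumPy, an assumed environment, reusing the same $2 \times 2$ example matrix) verifies $A^2 - 4A - 5I = 0$ and then applies the $n = 2$ specialization of the inverse formula, $A^{-1} = -\frac{1}{c_2}(A - c_1 I)$, without calling a matrix-inversion routine.

```python
import numpy as np

A = np.array([[1.0, 4.0],
              [2.0, 3.0]])
I = np.eye(2)

# Coefficients of the characteristic equation  lambda^2 - c1*lambda + c2 = 0  (n = 2)
c1 = np.trace(A)          # c1 = Trace(A) = 4
c2 = np.linalg.det(A)     # c2 = det(A)  = -5

# Cayley-Hamilton: A^2 - c1*A + c2*I must be the zero matrix.
residual = A @ A - c1 * A + c2 * I
assert np.allclose(residual, 0.0)
print("Cayley-Hamilton residual:\n", residual)

# Deduction for the inverse when n = 2:  A^{-1} = -(A - c1*I) / c2
A_inv = -(A - c1 * I) / c2
assert np.allclose(A_inv @ A, I)
print("A^{-1} from Cayley-Hamilton:\n", A_inv)
```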
### Characteristic Roots of Some Special Matrices

1. **Hermitian Matrix ($A = A^*$):** All eigenvalues are real.
2. **Real Symmetric Matrix:** All eigenvalues are real. (A real symmetric matrix is a special case of a Hermitian matrix.)
3. **Skew-Hermitian Matrix ($A^* = -A$):** All non-zero eigenvalues are purely imaginary.
4. **Orthogonal Matrix ($AA^T = A^T A = I$):** Every eigenvalue has modulus $1$ ($|\lambda| = 1$).
5. **Unitary Matrix ($A^*A = I$):** Every eigenvalue has modulus $1$ ($|\lambda| = 1$).
6. **Positive Definite Matrix:** All eigenvalues are real and positive.
7. **Leading Minors of a Positive Definite Matrix:** All leading principal minors are positive.

### Similar Matrices

- **Definition:** Matrices $A$ and $B$ are **similar** if there exists an invertible matrix $P$ such that $P^{-1}AP = B$ (equivalently, $AP = PB$). $P$ is the **similarity matrix**.
- **Properties:** If $A$ is similar to $B$:
  - $A$ and $B$ have the same eigenvalues.
  - If $\vec{x}$ is an eigenvector of $A$, then $P^{-1}\vec{x}$ is an eigenvector of $B$ corresponding to the same eigenvalue.
- **Note:** The converse is not true; two matrices with the same eigenvalues are not necessarily similar.

### Diagonalizable Matrices

- **Definition:** A matrix $A$ is **diagonalizable** if there exists a non-singular matrix $P$ such that $P^{-1}AP = D$, where $D$ is a diagonal matrix.
- **Theorem:** A square matrix $A$ of order $n$ is diagonalizable if and only if it has $n$ linearly independent eigenvectors.
  - If $X_1, X_2, \dots, X_n$ are linearly independent eigenvectors corresponding to eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$, then $P = [X_1 \ X_2 \ \dots \ X_n]$ and $D = \text{diag}(\lambda_1, \lambda_2, \dots, \lambda_n)$.
  - In this case $AP = PD$; since $P$ has $n$ LI columns it is invertible, so $P^{-1}AP = D$.

#### Remarks

1. If all eigenvalues are distinct, $A$ has $n$ LI eigenvectors and is diagonalizable.
2. $P$ is called the **modal matrix**, and $D$ is the **spectral matrix** of $A$.
3. The diagonal entries of $D$ are the eigenvalues of $A$.
4. The transformation $P^{-1}AP = D$ is called a **similarity transformation**.
5. If the eigenvalues and LI eigenvectors are known, then $A = PDP^{-1}$.

#### How to Form the Modal Matrix $P$

1. Find the eigenvalues of $A$.
2. Find the corresponding eigenvectors.
3. If there are $n$ linearly independent eigenvectors, form $P$ by using these eigenvectors as its columns.

#### Finding the $n^{th}$ Power of a Square Matrix

If $A$ is diagonalizable, $A = PDP^{-1}$ and therefore $A^n = PD^nP^{-1}$. More generally, if $Q(D)$ is a polynomial in $D$, then $Q(A) = PQ(D)P^{-1}$.
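A short numerical sketch of the modal-matrix construction (Python/NumPy, an assumed environment; the matrix is the same illustrative one used above). It forms $P$ from the eigenvectors, checks the similarity transformation $P^{-1}AP = D$, and computes a matrix power via $A^n = PD^nP^{-1}$.

```python
import numpy as np

# Example diagonalizable matrix (distinct eigenvalues 5 and -1).
A = np.array([[1.0, 4.0],
              [2.0, 3.0]])

vals, vecs = np.linalg.eig(A)
P = vecs                          # modal matrix: columns are eigenvectors
D = np.diag(vals)                 # spectral matrix

# Similarity transformation: P^{-1} A P = D
assert np.allclose(np.linalg.inv(P) @ A @ P, D)

# n-th power via the diagonalization: A^n = P D^n P^{-1}
n = 5
A_n = P @ np.diag(vals ** n) @ np.linalg.inv(P)
assert np.allclose(A_n, np.linalg.matrix_power(A, n))
print("A^5 =\n", A_n)
```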
### Quadratic Forms

- **Definition:** A homogeneous polynomial of second degree in $n$ variables.
  - Example for 2 variables: $ax^2 + by^2 + 2hxy$.
  - General form in $n$ variables: $Q(\vec{x}) = \vec{x}^T A \vec{x} = \sum_{i=1}^n \sum_{j=1}^n a_{ij}x_i x_j$.
- If we define $b_{ij} = \frac{a_{ij} + a_{ji}}{2}$, then $Q(\vec{x}) = \vec{x}^T B \vec{x}$, where $B$ is a symmetric matrix.
  - $B$ is the **symmetric matrix of the quadratic form**.
  - $|B|$ is the **determinant of $Q$**.
  - The **rank of $B$** is the **rank of $Q$**; if $\text{Rank}(B) < n$, the quadratic form is called singular.
- **Positive Definite:**
  - $Q(\vec{x}) > 0$ for all $\vec{x} \neq \vec{0}$.
  - $Q(\vec{x}) = 0 \iff \vec{x} = \vec{0}$.
  - Equivalently, all leading principal minors of $A$ are positive, or all eigenvalues of $A$ are positive.
- **Negative Definite:**
  - $Q(\vec{x}) < 0$ for all $\vec{x} \neq \vec{0}$; equivalently, all eigenvalues of $A$ are negative.

### Reduction of a Quadratic Form to Canonical Form

#### Inner Product (Dot Product)

- For $\vec{X} = (x_1, \dots, x_n)^T, \vec{Y} = (y_1, \dots, y_n)^T \in \mathbb{R}^n$:
  - $\langle \vec{X}, \vec{Y} \rangle = \vec{X} \cdot \vec{Y} = \vec{X}^T \vec{Y} = \sum_{i=1}^n x_i y_i$.
- For $\vec{X}, \vec{Y} \in \mathbb{C}^n$: $\langle \vec{X}, \vec{Y} \rangle = \vec{X}^* \vec{Y} = \sum_{i=1}^n \bar{x}_i y_i$.

#### Length (Norm)

- For $\vec{X} = (x_1, \dots, x_n)^T \in \mathbb{R}^n$: $||\vec{X}|| = \sqrt{\sum_{i=1}^n x_i^2}$.
- **Unit Vector:** A vector $\vec{X}$ with $||\vec{X}|| = 1$.

#### Orthogonal and Orthonormal Vectors

- **Orthogonal Vectors:** $\vec{X}$ and $\vec{Y}$ are orthogonal if $\langle \vec{X}, \vec{Y} \rangle = 0$.
- **Orthonormal Vectors:** $\vec{X}$ and $\vec{Y}$ are orthonormal if $\langle \vec{X}, \vec{Y} \rangle = 0$ and $||\vec{X}|| = ||\vec{Y}|| = 1$.

#### Properties of Real Symmetric Matrices

For a quadratic form $Q(\vec{X}) = \vec{X}^T A \vec{X}$ with $A$ a real symmetric matrix:

1. The eigenvalues of $A$ are real.
2. Eigenvectors corresponding to distinct eigenvalues are orthogonal.
3. A symmetric matrix $A$ has $n$ linearly independent eigenvectors, so it is always diagonalizable.
4. There exists an orthogonal matrix $P$ such that $P^{-1}AP = P^TAP = D$ (since $P^{-1} = P^T$ for orthogonal matrices).

#### Canonical Form

- **Definition:** If a quadratic form $\vec{X}^T A \vec{X}$ is reduced to $\vec{Y}^T D \vec{Y}$ by a non-singular linear transformation $\vec{X} = P\vec{Y}$, where $D = \text{diag}(\lambda_1, \dots, \lambda_n)$ and the $\lambda_i$ are the eigenvalues of $A$, then $\vec{Y}^T D \vec{Y}$ is called the **canonical form**.
- **Index ($p$):** Number of positive terms in the canonical form.
- **Negative Index ($q$):** Number of negative terms in the canonical form ($q = r - p$, where $r$ is the rank).
- **Signature:** $p - q = 2p - r$.
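The reduction to canonical form can be carried out numerically with a symmetric eigen-solver. The sketch below (Python/NumPy, an assumed environment; the form $Q = x_1^2 + 4x_1x_2 + x_2^2$ is chosen only for illustration) uses the orthogonal modal matrix to obtain $P^TAP = D$, reads off the canonical coefficients, and computes the index, negative index, and signature.

```python
import numpy as np

# Symmetric matrix of an example quadratic form Q(x) = x^T A x
# (here Q = x1^2 + 4*x1*x2 + x2^2).
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

# eigh is the symmetric eigen-solver: eigenvalues are real and the
# eigenvector matrix P is orthogonal (P^{-1} = P^T).
eigvals, P = np.linalg.eigh(A)
D = np.diag(eigvals)

# Orthogonal similarity: P^T A P = D, so x = P y reduces Q to
# the canonical form  sum_i lambda_i * y_i^2.
assert np.allclose(P.T @ A @ P, D)
print("canonical coefficients (eigenvalues):", eigvals)   # [-1, 3]

# Index, negative index, rank, and signature of the form.
tol = 1e-12
p = int(np.sum(eigvals > tol))     # number of positive terms
q = int(np.sum(eigvals < -tol))    # number of negative terms
r = p + q                          # rank of the form
print("index p =", p, "negative index q =", q, "signature =", p - q)

# Spot-check the change of variables at one point: Q(x) = y^T D y with y = P^T x.
x = np.random.default_rng(0).standard_normal(2)
y = P.T @ x
assert np.isclose(x @ A @ x, y @ D @ y)
```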
### Solving Systems of First-Order Differential Equations

Eigenvalues and eigenvectors can be used to solve systems of first-order linear differential equations, especially when the coefficient matrix is diagonalizable.

#### Method

Consider the system $\vec{X}' = A\vec{X}$.

1. **Diagonalize $A$:** If $A$ is diagonalizable, find its eigenvalues $\lambda_i$ and corresponding eigenvectors $\vec{v}_i$. Form the modal matrix $P = [\vec{v}_1 \ \dots \ \vec{v}_n]$ and the diagonal matrix $D = \text{diag}(\lambda_1, \dots, \lambda_n)$ such that $P^{-1}AP = D$.
2. **Transform Variables:** Introduce a new column vector $\vec{Y}$ such that $\vec{X} = P\vec{Y}$.
3. **Decouple the System:** Substituting $\vec{X} = P\vec{Y}$ into $\vec{X}' = A\vec{X}$ gives $P\vec{Y}' = AP\vec{Y} \implies \vec{Y}' = P^{-1}AP\vec{Y} \implies \vec{Y}' = D\vec{Y}$. This decouples the system into independent equations $y_i' = \lambda_i y_i$.
4. **Solve the Decoupled System:** Each decoupled equation has solution $y_i(t) = c_i e^{\lambda_i t}$.
5. **Transform Back:** Substitute the solutions for $\vec{Y}$ into $\vec{X} = P\vec{Y}$ to obtain the solution for $\vec{X}$.

#### Example: System with Two Variables

Given $\vec{X}' = A\vec{X}$ where $A = \begin{pmatrix} 4 & 2 \\ -1 & 1 \end{pmatrix}$.

1. **Characteristic Equation:** $|A - \lambda I| = (4-\lambda)(1-\lambda) - (-1)(2) = 4 - 5\lambda + \lambda^2 + 2 = \lambda^2 - 5\lambda + 6 = 0$.
2. **Eigenvalues:** $(\lambda - 2)(\lambda - 3) = 0 \implies \lambda_1 = 2, \ \lambda_2 = 3$.
3. **Eigenvectors:**
   - For $\lambda_1 = 2$: $(A - 2I)\vec{v}_1 = \begin{pmatrix} 2 & 2 \\ -1 & -1 \end{pmatrix} \vec{v}_1 = \vec{0} \implies 2v_{11} + 2v_{12} = 0 \implies v_{11} = -v_{12}$. Take $\vec{v}_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$.
   - For $\lambda_2 = 3$: $(A - 3I)\vec{v}_2 = \begin{pmatrix} 1 & 2 \\ -1 & -2 \end{pmatrix} \vec{v}_2 = \vec{0} \implies v_{21} + 2v_{22} = 0 \implies v_{21} = -2v_{22}$. Take $\vec{v}_2 = \begin{pmatrix} 2 \\ -1 \end{pmatrix}$.
4. **Modal Matrix:** $P = \begin{pmatrix} 1 & 2 \\ -1 & -1 \end{pmatrix}$.
5. **Decoupled System:** Let $\vec{X} = P\vec{Y}$. Then $\vec{Y}' = D\vec{Y} = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix} \vec{Y}$.
   - $y_1' = 2y_1 \implies y_1(t) = c_1 e^{2t}$.
   - $y_2' = 3y_2 \implies y_2(t) = c_2 e^{3t}$.
6. **General Solution:** $\vec{X} = P\vec{Y} = \begin{pmatrix} 1 & 2 \\ -1 & -1 \end{pmatrix} \begin{pmatrix} c_1 e^{2t} \\ c_2 e^{3t} \end{pmatrix} = \begin{pmatrix} c_1 e^{2t} + 2c_2 e^{3t} \\ -c_1 e^{2t} - c_2 e^{3t} \end{pmatrix}$.

#### Systems of Second-Order Differential Equations

The same approach extends to systems of second-order differential equations. Let $\vec{X}'' = A\vec{X}$. The transformation $\vec{X} = P\vec{Y}$ yields $P\vec{Y}'' = AP\vec{Y} \implies \vec{Y}'' = D\vec{Y}$, which gives the decoupled equations $y_i'' = \lambda_i y_i$.

- If $\lambda_i > 0$: $y_i(t) = c_{i1} e^{\sqrt{\lambda_i}\,t} + c_{i2} e^{-\sqrt{\lambda_i}\,t}$.
- If $\lambda_i < 0$: $y_i(t) = c_{i1} \cos(\sqrt{-\lambda_i}\,t) + c_{i2} \sin(\sqrt{-\lambda_i}\,t)$.
- If $\lambda_i = 0$: $y_i(t) = c_{i1} + c_{i2} t$.
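A quick numerical check of the first-order method (Python/NumPy, an assumed environment, using the worked example's matrix and arbitrary illustrative constants $c_1, c_2$): it builds the general solution $\vec{X}(t) = P\,\text{diag}(e^{\lambda_i t})\,\vec{c}$ from the eigen-decomposition and confirms that it satisfies $\vec{X}'(t) = A\vec{X}(t)$ at a few sample times.

```python
import numpy as np

# System X' = A X from the worked example above.
A = np.array([[ 4.0, 2.0],
              [-1.0, 1.0]])

vals, P = np.linalg.eig(A)        # eigenvalues 2 and 3, modal matrix P
c = np.array([1.0, -2.0])         # arbitrary constants c_1, c_2 (illustrative)

def X(t):
    """General solution X(t) = P @ diag(exp(lambda_i t)) @ c."""
    return P @ (c * np.exp(vals * t))

def X_prime(t):
    """Its derivative: each component of Y picks up a factor lambda_i."""
    return P @ (c * vals * np.exp(vals * t))

# Check that X'(t) = A X(t) at a few sample times.
for t in (0.0, 0.5, 1.0):
    assert np.allclose(X_prime(t), A @ X(t))
print("X(1) =", X(1.0))
```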