Matrices: Basic Definitions

Definition: A matrix is a rectangular array of numbers, symbols, or expressions. An $m \times n$ matrix has $m$ rows and $n$ columns.

Notation: $A = [a_{ij}]$, where $a_{ij}$ is the element in the $i$-th row and $j$-th column.

$$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}$$

Types of Matrices:
- Row Matrix: A matrix with only one row ($1 \times n$). Example: $[1 \ 2 \ 3]$
- Column Matrix: A matrix with only one column ($m \times 1$). Example: $\begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}$
- Square Matrix: A matrix where $m=n$.
- Zero Matrix: A matrix where all elements are zero, denoted by $O$.
- Identity Matrix: A square matrix $I$ with ones on the main diagonal and zeros elsewhere.

$$I = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix}$$

- Diagonal Matrix: A square matrix where all non-diagonal elements are zero: $a_{ij}=0$ for $i \ne j$.
- Scalar Matrix: A diagonal matrix where all diagonal elements are equal: $a_{ii}=k$.
- Upper Triangular Matrix: A square matrix where all elements below the main diagonal are zero: $a_{ij}=0$ for $i > j$.
- Lower Triangular Matrix: A square matrix where all elements above the main diagonal are zero: $a_{ij}=0$ for $i < j$.
- Symmetric Matrix: A square matrix $A$ such that $A^T = A$ (equivalently, $a_{ij}=a_{ji}$).
- Skew-Symmetric Matrix: A square matrix $A$ such that $A^T = -A$ (equivalently, $a_{ij}=-a_{ji}$, which implies $a_{ii}=0$).

Matrix Operations

Addition/Subtraction: For $A=[a_{ij}]$ and $B=[b_{ij}]$ of the same size $m \times n$: $A \pm B = [a_{ij} \pm b_{ij}]$.
- Commutative: $A+B = B+A$
- Associative: $(A+B)+C = A+(B+C)$

Scalar Multiplication: For a scalar $k$ and matrix $A=[a_{ij}]$: $kA = [ka_{ij}]$.
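The entrywise operations above can be sketched in a few lines of Python, treating a matrix as a list of rows (a minimal illustration; the helper names `mat_add` and `scalar_mul` are my own, and no size checking is done):

```python
def mat_add(A, B):
    """Entrywise sum: (A + B)[i][j] = a_ij + b_ij. Assumes equal sizes."""
    return [[a + b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(A, B)]

def scalar_mul(k, A):
    """Scalar multiple: (kA)[i][j] = k * a_ij."""
    return [[k * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_add(A, B))     # [[6, 8], [10, 12]]
print(scalar_mul(2, A))  # [[2, 4], [6, 8]]
# Commutativity of addition: A + B == B + A
assert mat_add(A, B) == mat_add(B, A)
```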
Matrix Multiplication: For $A$ ($m \times n$) and $B$ ($n \times p$), the product $C=AB$ is an $m \times p$ matrix where $c_{ij} = \sum_{k=1}^n a_{ik}b_{kj}$.
- Not Commutative: $AB \ne BA$ (in general)
- Associative: $(AB)C = A(BC)$
- Distributive: $A(B+C) = AB+AC$
- Identity: $AI = IA = A$
- Zero: $AO = OA = O$

Transpose of a Matrix ($A^T$): Rows become columns and columns become rows. If $A=[a_{ij}]$, then $A^T=[a_{ji}]$.
- $(A^T)^T = A$
- $(A+B)^T = A^T+B^T$
- $(kA)^T = kA^T$
- $(AB)^T = B^T A^T$ (important: the order reverses!)

Determinants

Definition: A scalar value associated with a square matrix $A$, denoted $\det(A)$ or $|A|$.

For a $2 \times 2$ Matrix: If $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$, then $\det(A) = ad - bc$.

For a $3 \times 3$ Matrix (Sarrus' Rule): If $A = \begin{pmatrix} a & b & c \\ d & e & f \\ g & h & i \end{pmatrix}$, then $\det(A) = a(ei-fh) - b(di-fg) + c(dh-eg)$.

Cofactor Expansion: For an $n \times n$ matrix, $\det(A) = \sum_{j=1}^n a_{ij}C_{ij}$ (along row $i$) or $\det(A) = \sum_{i=1}^n a_{ij}C_{ij}$ (along column $j$), where $C_{ij} = (-1)^{i+j}M_{ij}$ is the cofactor and $M_{ij}$ is the minor (the determinant of the submatrix obtained by removing row $i$ and column $j$).

Properties:
- $\det(A^T) = \det(A)$
- $\det(AB) = \det(A)\det(B)$
- $\det(kA) = k^n \det(A)$ for an $n \times n$ matrix $A$.
- If a matrix has a row/column of zeros, $\det(A)=0$.
- If a matrix has two identical rows/columns, $\det(A)=0$.
- If one row/column is a scalar multiple of another, $\det(A)=0$.
- Swapping two rows/columns changes the sign of the determinant.
- Adding a multiple of one row/column to another does not change the determinant.

Inverse of a Matrix

Definition: For a square matrix $A$, its inverse $A^{-1}$ is a matrix such that $AA^{-1} = A^{-1}A = I$.

Existence: $A^{-1}$ exists if and only if $\det(A) \ne 0$. Such a matrix is called non-singular or invertible. If $\det(A)=0$, $A$ is singular.
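A minimal Python sketch of the product formula $c_{ij} = \sum_k a_{ik}b_{kj}$, the transpose, and the determinant by cofactor expansion along the first row (illustrative helper names, no shape validation):

```python
def mat_mul(A, B):
    """Product of A (m x n) and B (n x p): c_ij = sum_k a_ik * b_kj."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def transpose(A):
    """Rows become columns: (A^T)_ij = a_ji."""
    return [list(col) for col in zip(*A)]

def det(A):
    """Determinant by cofactor expansion along the first row."""
    if len(A) == 1:
        return A[0][0]
    # Minor M_0j: drop row 0 and column j; sign (-1)^(0+j) = (-1)^j.
    return sum((-1) ** j * A[0][j]
               * det([row[:j] + row[j + 1:] for row in A[1:]])
               for j in range(len(A)))

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
assert transpose(mat_mul(A, B)) == mat_mul(transpose(B), transpose(A))  # (AB)^T = B^T A^T
assert det(mat_mul(A, B)) == det(A) * det(B)                            # det(AB) = det(A)det(B)
```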
For a $2 \times 2$ Matrix: If $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$, then $A^{-1} = \frac{1}{\det(A)} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$.

General Formula: $A^{-1} = \frac{1}{\det(A)} \text{adj}(A)$, where $\text{adj}(A)$ is the adjugate (or classical adjoint) matrix, the transpose of the cofactor matrix: $\text{adj}(A) = (C_{ij})^T$.

Properties:
- $(A^{-1})^{-1} = A$
- $(AB)^{-1} = B^{-1}A^{-1}$ (important: the order reverses!)
- $(A^T)^{-1} = (A^{-1})^T$
- $\det(A^{-1}) = \frac{1}{\det(A)}$

Systems of Linear Equations

Matrix Form: A system of $m$ linear equations in $n$ variables can be written as $AX=B$.

$$A = \begin{pmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{pmatrix}, \quad X = \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}, \quad B = \begin{pmatrix} b_1 \\ \vdots \\ b_m \end{pmatrix}$$

Solution using Inverse: If $A$ is square and invertible ($\det(A) \ne 0$), then $X = A^{-1}B$.

Cramer's Rule: For a system $AX=B$ where $A$ is an $n \times n$ invertible matrix, the solution is given by $x_j = \frac{\det(A_j)}{\det(A)}$, where $A_j$ is the matrix formed by replacing the $j$-th column of $A$ with the column vector $B$.

Gaussian Elimination / Gauss-Jordan Elimination: Use elementary row operations to transform the augmented matrix $[A|B]$ into row echelon form (Gaussian) or reduced row echelon form (Gauss-Jordan), then read off the solution.

Eigenvalues and Eigenvectors

Definition: For a square matrix $A$, an eigenvector $v$ is a non-zero vector that changes by at most a scalar factor when $A$ is applied to it; that scalar factor is the eigenvalue $\lambda$: $Av = \lambda v$.

Characteristic Equation: To find the eigenvalues $\lambda$, solve $\det(A - \lambda I) = 0$. This is a polynomial equation of degree $n$ in $\lambda$.

Finding Eigenvectors: For each eigenvalue $\lambda_i$, solve the homogeneous system $(A - \lambda_i I)v = 0$ to find the corresponding eigenvectors $v$.

Properties:
- The sum of the eigenvalues equals the trace of $A$ (the sum of its diagonal elements):
  $\sum \lambda_i = \text{tr}(A)$.
- The product of the eigenvalues equals the determinant of $A$: $\prod \lambda_i = \det(A)$.
- Eigenvectors corresponding to distinct eigenvalues are linearly independent.
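For a $2 \times 2$ matrix, the characteristic equation reduces to the quadratic $\lambda^2 - \text{tr}(A)\,\lambda + \det(A) = 0$, so the eigenvalues follow directly from the quadratic formula. A sketch assuming real eigenvalues (the function name `eigvals_2x2` is my own):

```python
import math

def eigvals_2x2(A):
    """Roots of det(A - lam*I) = lam^2 - tr(A)*lam + det(A) = 0."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det          # discriminant of the quadratic
    if disc < 0:
        raise ValueError("complex eigenvalues; this sketch assumes real ones")
    root = math.sqrt(disc)
    return (tr + root) / 2, (tr - root) / 2

A = [[2, 1], [1, 2]]
lam1, lam2 = eigvals_2x2(A)          # (3.0, 1.0)
assert math.isclose(lam1 + lam2, 2 + 2)      # sum of eigenvalues = trace
assert math.isclose(lam1 * lam2, 2 * 2 - 1)  # product of eigenvalues = det
```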
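The inverse formula and Cramer's Rule from the systems section can be illustrated for the $2 \times 2$ case; this sketch uses the standard-library `fractions.Fraction` to keep the results exact, and the helper names are my own:

```python
from fractions import Fraction

def det2(A):
    """Determinant of a 2x2 matrix: ad - bc."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def inverse_2x2(A):
    """A^{-1} = (1/det A) * [[d, -b], [-c, a]]; fails on singular input."""
    (a, b), (c, d) = A
    dA = det2(A)
    if dA == 0:
        raise ValueError("singular matrix: det(A) = 0, no inverse exists")
    return [[Fraction(d, dA), Fraction(-b, dA)],
            [Fraction(-c, dA), Fraction(a, dA)]]

def cramer_2x2(A, B):
    """x_j = det(A_j)/det(A), where A_j has column j replaced by B."""
    dA = det2(A)
    x = Fraction(det2([[B[0], A[0][1]], [B[1], A[1][1]]]), dA)
    y = Fraction(det2([[A[0][0], B[0]], [A[1][0], B[1]]]), dA)
    return [x, y]

# Example system: x + 2y = 5, 3x + 4y = 6
A, B = [[1, 2], [3, 4]], [5, 6]
print(cramer_2x2(A, B))  # [Fraction(-4, 1), Fraction(9, 2)]
```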