### Vectors
- **Definition:** An ordered list of numbers, e.g., $\vec{v} = (v_1, v_2, ..., v_n)$, often written as a column matrix.
- **Magnitude (Length):** For $\vec{v} \in \mathbb{R}^n$, $|\vec{v}| = \sqrt{v_1^2 + v_2^2 + ... + v_n^2}$.
- **Unit Vector:** A vector with magnitude 1: $\hat{u} = \frac{\vec{v}}{|\vec{v}|}$.
- **Vector Addition:** $\vec{a} + \vec{b} = (a_1+b_1, ..., a_n+b_n)$.
- **Scalar Multiplication:** $c\vec{v} = (cv_1, ..., cv_n)$.
- **Dot Product:** $\vec{a} \cdot \vec{b} = \sum_{i=1}^n a_i b_i = |\vec{a}||\vec{b}|\cos\theta$.
  - If $\vec{a} \cdot \vec{b} = 0$, then $\vec{a}$ and $\vec{b}$ are orthogonal.
- **Cross Product (only in $\mathbb{R}^3$):** $\vec{a} \times \vec{b} = (a_2b_3 - a_3b_2,\ a_3b_1 - a_1b_3,\ a_1b_2 - a_2b_1)$.
  - The result is a vector orthogonal to both $\vec{a}$ and $\vec{b}$.
  - Magnitude: $|\vec{a} \times \vec{b}| = |\vec{a}||\vec{b}|\sin\theta$.
  - Area of the parallelogram spanned by $\vec{a}$ and $\vec{b}$: $|\vec{a} \times \vec{b}|$.

### Matrices
- **Definition:** A rectangular array of numbers. $A_{m \times n}$ has $m$ rows and $n$ columns.
- **Matrix Addition:** $A+B$ is element-wise, defined only when the dimensions match.
- **Scalar Multiplication:** $cA$ is element-wise.
- **Matrix Multiplication:** $(AB)_{ij} = \sum_{k=1}^p A_{ik}B_{kj}$.
  - $A_{m \times p} B_{p \times n} = C_{m \times n}$.
  - Not commutative in general: $AB \neq BA$.
- **Identity Matrix ($I$):** Square matrix with 1s on the main diagonal and 0s elsewhere. $AI = IA = A$.
- **Transpose ($A^T$):** Rows become columns: $(A^T)_{ij} = A_{ji}$.
  - $(AB)^T = B^T A^T$.
- **Symmetric Matrix:** $A = A^T$.
- **Inverse Matrix ($A^{-1}$):** For a square matrix $A$, if $A^{-1}$ exists, then $AA^{-1} = A^{-1}A = I$.
  - **For a $2 \times 2$ matrix:** If $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$, then $A^{-1} = \frac{1}{ad-bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$.
  - $A$ is invertible if and only if $\det(A) \neq 0$.
  - $(AB)^{-1} = B^{-1}A^{-1}$.

### Determinants
- **Definition:** A scalar value associated with a square matrix.
- **For a $2 \times 2$ matrix:** $\det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad-bc$.
- **For a $3 \times 3$ matrix (cofactor expansion along the first row; equivalent to Sarrus' rule):** $\det \begin{pmatrix} a & b & c \\ d & e & f \\ g & h & i \end{pmatrix} = a(ei-fh) - b(di-fg) + c(dh-eg)$.
- **Properties:**
  - $\det(A) = \det(A^T)$.
  - $\det(AB) = \det(A)\det(B)$.
  - $\det(A^{-1}) = 1/\det(A)$.
  - If $A$ has a row or column of zeros, $\det(A)=0$.
  - If $A$ has two identical rows or columns, $\det(A)=0$.
  - Swapping two rows (or columns) changes the sign of the determinant.
  - Multiplying a row (or column) by $c$ multiplies the determinant by $c$.
  - Adding a multiple of one row (or column) to another does not change the determinant.

### Linear Systems
- **Form:** $A\vec{x} = \vec{b}$, where $A$ is the coefficient matrix, $\vec{x}$ is the vector of unknowns, and $\vec{b}$ is the constant vector.
- **Solutions:**
  - **Unique Solution:** If $\det(A) \neq 0$, then $\vec{x} = A^{-1}\vec{b}$.
  - **No Solution (Inconsistent):** The equations imply a contradiction (e.g., a row reduces to $0 = 1$).
  - **Infinitely Many Solutions (Consistent):** The rows of $A$ are linearly dependent and free variables exist.
- **Methods** (see the code sketch after this section):
  - **Gaussian Elimination (Row Reduction):** Transform $[A \mid \vec{b}]$ into Row Echelon Form (REF) or Reduced Row Echelon Form (RREF) using elementary row operations:
    1. Swap two rows.
    2. Multiply a row by a non-zero scalar.
    3. Add a multiple of one row to another row.
  - **Cramer's Rule:** For an $n \times n$ system $A\vec{x} = \vec{b}$, if $\det(A) \neq 0$, then $x_i = \frac{\det(A_i)}{\det(A)}$, where $A_i$ is $A$ with the $i$-th column replaced by $\vec{b}$.
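A minimal NumPy sketch of the solution methods above, using a made-up invertible $3 \times 3$ system (the specific numbers are only for illustration). It solves $A\vec{x} = \vec{b}$ with the built-in solver, with the explicit inverse, and with Cramer's rule, and checks that all three agree.

```python
import numpy as np

# A made-up invertible 3x3 system A x = b (values chosen only for illustration).
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])

# det(A) != 0 guarantees a unique solution.
assert abs(np.linalg.det(A)) > 1e-12

x_solve = np.linalg.solve(A, b)      # elimination-based solver
x_inv = np.linalg.inv(A) @ b         # x = A^{-1} b (fine for tiny systems)

# Cramer's rule: x_i = det(A_i) / det(A), where A_i has column i replaced by b.
x_cramer = np.empty(3)
for i in range(3):
    A_i = A.copy()
    A_i[:, i] = b
    x_cramer[i] = np.linalg.det(A_i) / np.linalg.det(A)

print(x_solve)                        # expected: [ 2.  3. -1.]
assert np.allclose(x_solve, x_inv) and np.allclose(x_solve, x_cramer)
```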
### Vector Spaces & Subspaces
- **Vector Space:** A set $V$ with vector addition and scalar multiplication satisfying the 10 vector-space axioms (closure, associativity, commutativity, identity, inverse, distributivity).
- **Subspace:** A subset $W$ of a vector space $V$ that is itself a vector space under the same operations.
  - To prove $W$ is a subspace:
    1. $W$ contains the zero vector.
    2. $W$ is closed under vector addition ($\vec{u}, \vec{v} \in W \implies \vec{u}+\vec{v} \in W$).
    3. $W$ is closed under scalar multiplication ($c \in \mathbb{R}, \vec{u} \in W \implies c\vec{u} \in W$).
- **Span:** The set of all linear combinations of a set of vectors $\{\vec{v}_1, ..., \vec{v}_k\}$: $Span\{\vec{v}_1, ..., \vec{v}_k\} = \{c_1\vec{v}_1 + ... + c_k\vec{v}_k\}$.
- **Linear Independence:** A set of vectors $\{\vec{v}_1, ..., \vec{v}_k\}$ is linearly independent if the only solution to $c_1\vec{v}_1 + ... + c_k\vec{v}_k = \vec{0}$ is $c_1 = ... = c_k = 0$.
- **Basis:** A set of linearly independent vectors that spans the entire vector space.
- **Dimension:** The number of vectors in any basis for the vector space.
- **Row Space ($Row(A)$):** The span of the row vectors of $A$. $\dim(Row(A)) = rank(A)$.
- **Column Space ($Col(A)$):** The span of the column vectors of $A$. $\dim(Col(A)) = rank(A)$.
- **Null Space ($Null(A)$ or $Ker(A)$):** The set of all solutions to $A\vec{x} = \vec{0}$.
  - $\dim(Null(A)) = nullity(A)$.
- **Rank-Nullity Theorem:** For an $m \times n$ matrix $A$, $rank(A) + nullity(A) = n$.

### Eigenvalues & Eigenvectors
- **Definition:** For a square matrix $A$, an eigenvector $\vec{v}$ is a non-zero vector such that $A\vec{v} = \lambda\vec{v}$, where the scalar $\lambda$ is the corresponding eigenvalue.
- **Finding Eigenvalues:** Solve the characteristic equation $\det(A - \lambda I) = 0$.
- **Finding Eigenvectors:** For each eigenvalue $\lambda$, solve $(A - \lambda I)\vec{v} = \vec{0}$ for $\vec{v}$.
- **Eigenspace:** The set of all eigenvectors corresponding to a given eigenvalue $\lambda$, together with the zero vector; it equals $Null(A - \lambda I)$.
- **Diagonalization:** A square matrix $A$ is diagonalizable if there exist an invertible matrix $P$ and a diagonal matrix $D$ such that $A = PDP^{-1}$.
  - The columns of $P$ are eigenvectors of $A$.
  - The diagonal entries of $D$ are the corresponding eigenvalues.
  - An $n \times n$ matrix $A$ is diagonalizable if and only if it has $n$ linearly independent eigenvectors.
  - If an $n \times n$ matrix has $n$ distinct eigenvalues, it is diagonalizable.
- **Properties** (checked numerically in the sketch after this section):
  - The product of the eigenvalues (counted with multiplicity) is $\det(A)$.
  - The sum of the eigenvalues (counted with multiplicity) is the trace of $A$ (the sum of its diagonal entries).
  - Eigenvectors corresponding to distinct eigenvalues are linearly independent.
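A short numerical check of the eigenvalue facts above, assuming a made-up symmetric $2 \times 2$ matrix chosen so the eigenvalues are real and distinct. It verifies $A\vec{v} = \lambda\vec{v}$, the diagonalization $A = PDP^{-1}$, and the trace and determinant identities.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])            # made-up symmetric matrix; real, distinct eigenvalues

eigvals, P = np.linalg.eig(A)         # columns of P are eigenvectors
D = np.diag(eigvals)

# A v = lambda v for each eigenpair.
for lam, v in zip(eigvals, P.T):
    assert np.allclose(A @ v, lam * v)

# Diagonalization: A = P D P^{-1}.
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# Sum of eigenvalues = trace(A); product of eigenvalues = det(A).
assert np.isclose(eigvals.sum(), np.trace(A))
assert np.isclose(eigvals.prod(), np.linalg.det(A))
```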
### Orthogonality
- **Orthogonal Vectors:** $\vec{u} \cdot \vec{v} = 0$.
- **Orthogonal Set:** A set of vectors in which every pair is orthogonal.
- **Orthonormal Set:** An orthogonal set in which every vector is a unit vector.
- **Orthogonal Basis:** A basis that is an orthogonal set.
- **Orthonormal Basis:** A basis that is an orthonormal set.
- **Gram-Schmidt Process:** Converts any basis $\{\vec{v}_1, ..., \vec{v}_k\}$ into an orthogonal (or orthonormal) basis (see the worked sketch at the end of these notes):
  1. $\vec{u}_1 = \vec{v}_1$
  2. $\vec{u}_2 = \vec{v}_2 - proj_{\vec{u}_1}\vec{v}_2 = \vec{v}_2 - \frac{\vec{v}_2 \cdot \vec{u}_1}{\vec{u}_1 \cdot \vec{u}_1}\vec{u}_1$
  3. In general, $\vec{u}_k = \vec{v}_k - \sum_{j=1}^{k-1} \frac{\vec{v}_k \cdot \vec{u}_j}{\vec{u}_j \cdot \vec{u}_j}\vec{u}_j$.
  - Normalize each $\vec{u}_i$ to obtain an orthonormal basis.
- **Orthogonal Matrix ($Q$):** A square matrix such that $Q^T Q = QQ^T = I$.
  - Its columns (and rows) form an orthonormal basis.
  - $\det(Q) = \pm 1$.
  - Preserves dot products and lengths: $(Q\vec{x}) \cdot (Q\vec{y}) = \vec{x} \cdot \vec{y}$ and $|Q\vec{x}| = |\vec{x}|$.
- **Orthogonal Projection:** $proj_{\vec{u}}\vec{v} = \frac{\vec{v} \cdot \vec{u}}{\vec{u} \cdot \vec{u}}\vec{u}$.
  - Projection onto a subspace $W$ with orthonormal basis $\{\vec{u}_1, ..., \vec{u}_k\}$: $proj_W\vec{v} = (\vec{v} \cdot \vec{u}_1)\vec{u}_1 + ... + (\vec{v} \cdot \vec{u}_k)\vec{u}_k$.

### Matrix Factorizations
- **LU Decomposition:** $A = LU$, where $L$ is lower triangular and $U$ is upper triangular. Used to solve linear systems efficiently.
- **QR Decomposition:** $A = QR$, where $Q$ is orthogonal and $R$ is upper triangular. Used for least squares problems and eigenvalue computations.
- **Singular Value Decomposition (SVD):** For any $m \times n$ matrix $A$, $A = U\Sigma V^T$.
  - $U$ is an $m \times m$ orthogonal matrix (its columns are the left singular vectors).
  - $\Sigma$ is an $m \times n$ (rectangular) diagonal matrix with the non-negative singular values $\sigma_i$ on its diagonal.
  - $V$ is an $n \times n$ orthogonal matrix (its columns are the right singular vectors).
  - The singular values are the square roots of the eigenvalues of $A^T A$ (equivalently, of $AA^T$).
  - Applications: dimensionality reduction (PCA), data compression, the pseudo-inverse.

### Least Squares
- **Problem:** Find $\vec{x}$ that minimizes $|A\vec{x} - \vec{b}|^2$ when $A\vec{x} = \vec{b}$ has no exact solution.
- **Normal Equations:** The solution $\hat{\vec{x}}$ satisfies $A^T A \hat{\vec{x}} = A^T \vec{b}$ (see the worked example at the end of these notes).
- **Solution:** If $A^T A$ is invertible, $\hat{\vec{x}} = (A^T A)^{-1} A^T \vec{b}$.
- **Geometric Interpretation:** $A\hat{\vec{x}}$ is the orthogonal projection of $\vec{b}$ onto $Col(A)$.

### Linear Transformations
- **Definition:** A function $T: V \to W$ between vector spaces such that for all $\vec{u}, \vec{v} \in V$ and every scalar $c$:
  1. $T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v})$ (additivity)
  2. $T(c\vec{u}) = cT(\vec{u})$ (homogeneity)
- **Matrix Representation:** Every linear transformation $T: \mathbb{R}^n \to \mathbb{R}^m$ can be represented by an $m \times n$ matrix $A$ such that $T(\vec{x}) = A\vec{x}$.
  - The columns of $A$ are $T(\vec{e}_1), T(\vec{e}_2), ..., T(\vec{e}_n)$, where the $\vec{e}_i$ are the standard basis vectors.
- **Kernel (Null Space):** $Ker(T) = \{\vec{v} \in V \mid T(\vec{v}) = \vec{0}\}$. For $T(\vec{x}) = A\vec{x}$, $Ker(T) = Null(A)$.
- **Image (Range):** $Im(T) = \{T(\vec{v}) \mid \vec{v} \in V\}$. For $T(\vec{x}) = A\vec{x}$, $Im(T) = Col(A)$.
- **Rank-Nullity Theorem (for transformations):** $\dim(Ker(T)) + \dim(Im(T)) = \dim(V)$.
- **Change of Basis:** If $B = \{\vec{b}_1, ..., \vec{b}_n\}$ and $C$ are bases for $V$, the change-of-basis matrix from $B$ to $C$ is $P_{C \leftarrow B} = [\,[\vec{b}_1]_C \ \cdots \ [\vec{b}_n]_C\,]$.
  - $[\vec{x}]_C = P_{C \leftarrow B} [\vec{x}]_B$.
  - The matrix of a linear transformation $T$ relative to basis $B$ is $[T]_B = P_{B \leftarrow E} A P_{E \leftarrow B}$, where $A$ is the standard matrix and $E$ is the standard basis.
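A sketch of the Gram-Schmidt process from the Orthogonality section, written as a small NumPy function; the function name `gram_schmidt` and the example vectors are illustrative choices, not part of the notes. It orthonormalizes the columns of a matrix, checks $Q^T Q = I$, and uses the result to project a vector onto a subspace.

```python
import numpy as np

def gram_schmidt(V):
    """Orthonormalize the columns of V (assumed linearly independent)."""
    n = V.shape[1]
    Q = np.zeros_like(V, dtype=float)
    for k in range(n):
        v = V[:, k].astype(float)
        u = v.copy()
        for j in range(k):
            # Subtract the projection of v_k onto each previously built vector.
            u -= (v @ Q[:, j]) * Q[:, j]     # Q[:, j] already has unit length
        Q[:, k] = u / np.linalg.norm(u)      # normalize -> orthonormal basis
    return Q

# Made-up linearly independent columns to orthonormalize.
V = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = gram_schmidt(V)
assert np.allclose(Q.T @ Q, np.eye(3))       # columns are orthonormal

# Project a vector onto the subspace spanned by the first two columns.
w = np.array([1.0, 2.0, 3.0])
proj = (w @ Q[:, 0]) * Q[:, 0] + (w @ Q[:, 1]) * Q[:, 1]
assert np.allclose(Q[:, :2].T @ (w - proj), 0.0)   # residual is orthogonal to W
```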
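A least-squares sketch, assuming a made-up overdetermined system (fitting a line $y = c_0 + c_1 t$ to four points). It compares the normal-equation solution with NumPy's built-in solvers and checks that the residual $\vec{b} - A\hat{\vec{x}}$ is orthogonal to $Col(A)$, i.e. that $A\hat{\vec{x}}$ is the projection of $\vec{b}$ onto the column space.

```python
import numpy as np

# Fit y = c0 + c1*t to four made-up data points (overdetermined: no exact solution).
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 2.2, 2.9, 4.1])
A = np.column_stack([np.ones_like(t), t])    # design matrix, columns: 1, t

# Normal equations: (A^T A) x = A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Built-in least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

# Equivalently, via the SVD-based pseudo-inverse from the Factorizations section.
x_pinv = np.linalg.pinv(A) @ b

assert np.allclose(x_normal, x_lstsq) and np.allclose(x_normal, x_pinv)

# Geometric check: the residual b - A x_hat is orthogonal to every column of A,
# so A x_hat is the orthogonal projection of b onto Col(A).
residual = b - A @ x_normal
assert np.allclose(A.T @ residual, 0.0, atol=1e-10)
```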
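A quick check of the Linear Transformations section, using a made-up map $T: \mathbb{R}^3 \to \mathbb{R}^2$. The standard matrix is built column by column from $T(\vec{e}_i)$, and the rank-nullity theorem is verified numerically.

```python
import numpy as np

def T(x):
    """A made-up linear map T: R^3 -> R^2, T(x) = (x1 + x2, x2 + x3)."""
    return np.array([x[0] + x[1], x[1] + x[2]])

# Standard matrix: columns are T(e_1), T(e_2), T(e_3).
E = np.eye(3)
A = np.column_stack([T(E[:, i]) for i in range(3)])

# T(x) = A x for any x (spot-check with an arbitrary vector).
x = np.array([2.0, -1.0, 5.0])
assert np.allclose(T(x), A @ x)

# Rank-nullity: dim(Im T) + dim(Ker T) = dim(R^3) = 3.
rank = np.linalg.matrix_rank(A)        # dim(Im T) = dim(Col(A))
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]                 # rows of V^T spanning Null(A)
assert rank + len(null_basis) == 3
```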