# Bessel Functions

The Bessel function of the first kind of order $m$ is given by:

$$ J_m(x) = \sum_{k=0}^{\infty} \frac{(-1)^k}{k! \, \Gamma(m+k+1)} \left(\frac{x}{2}\right)^{2k+m} $$

## Special Cases

$$ J_{1/2}(x) = \sqrt{\frac{2}{\pi x}} \sin x, \qquad J_{-1/2}(x) = \sqrt{\frac{2}{\pi x}} \cos x $$

# Differential Equations

## Helmholtz Equation in Cartesian Coordinates

The Helmholtz equation is given by:

$$ \left(\frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2} + \frac{\partial^2}{\partial z^2}\right) f(x,y,z) + p^2 f(x,y,z) = 0 $$

Separation of variables, $f(x,y,z) = X(x)Y(y)Z(z)$, reduces this partial differential equation to three ordinary differential equations (ODEs).

## Laplace Transforms for ODEs

Laplace transforms convert initial value problems for linear ODEs into algebraic equations. For example, for $y'' - 2y' + 5y = 0$ with $y(0) = -1$, $y'(0) = 7$:

1. Apply the Laplace transform to the equation: $L\{y''\} - 2L\{y'\} + 5L\{y\} = 0$.
2. Use the derivative properties $L\{y'\} = sY(s) - y(0)$ and $L\{y''\} = s^2 Y(s) - s y(0) - y'(0)$.
3. Substitute the initial conditions and solve for $Y(s)$.
4. Find the inverse Laplace transform $y(t) = L^{-1}\{Y(s)\}$.

# Legendre Polynomials

## Generating Function

The generating function for the Legendre polynomials $P_n(x)$ is:

$$ g(x, t) = \frac{1}{\sqrt{1 - 2tx + t^2}} = \sum_{n=0}^{\infty} P_n(x) t^n $$

## Recurrence Relation

From the generating function, the following recurrence relation can be derived:

$$ (2n+1) x P_n(x) - (n+1) P_{n+1}(x) = n P_{n-1}(x) $$

# Linear Algebra

## Basis of a Vector Space

A set of vectors $\{\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_n\}$ forms a basis for a vector space if the vectors are linearly independent and span the entire space.

Example: the vectors $\vec{v}_1 = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$ and $\vec{v}_2 = \begin{pmatrix} -1 \\ 1 \end{pmatrix}$ form a basis of $\mathbb{R}^2$ because the determinant of the matrix with these columns, $1 \cdot 1 - (-1) \cdot 2 = 3$, is non-zero; the vectors are therefore linearly independent and span $\mathbb{R}^2$.

## Orthogonal Matrices

A square matrix $A$ is orthogonal if $A^T A = A A^T = I$, where $I$ is the identity matrix.
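The defining property $A^T A = A A^T = I$ is easy to check numerically. A minimal sketch with NumPy, using an (illustrative) orthogonal matrix obtained from a QR factorization of a random matrix:

```python
import numpy as np

# Build an orthogonal matrix Q via QR factorization of a random 3x3 matrix
# (an arbitrary illustrative choice; any orthogonal matrix would do).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Check the defining property Q^T Q = Q Q^T = I.
print(np.allclose(Q.T @ Q, np.eye(3)))  # True
print(np.allclose(Q @ Q.T, np.eye(3)))  # True
```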
The determinant of an orthogonal matrix is $\pm 1$. The matrix of a rotation of axes through an angle $\phi$ is an example of an orthogonal matrix:

$$ \begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} \cos\phi & \sin\phi \\ -\sin\phi & \cos\phi \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} $$

Any $2 \times 2$ orthogonal matrix is either of this rotation form or of the corresponding reflection form.

## Eigenvalues and Eigenvectors

For a square matrix $A$, an eigenvector $\vec{v}$ satisfies $A\vec{v} = \lambda\vec{v}$, where $\lambda$ is the eigenvalue. To find the eigenvalues, solve the characteristic equation $\det(A - \lambda I) = 0$. For a real symmetric matrix, the eigenvalues are real, and eigenvectors corresponding to distinct eigenvalues are orthogonal.

## Diagonalizing a Matrix

If a matrix $A$ is diagonalizable, it can be written as $A = P D P^{-1}$, where $D$ is a diagonal matrix of eigenvalues and $P$ is a matrix whose columns are the corresponding eigenvectors.

Example: for the matrix

$$ \begin{pmatrix} \cos\theta & \sin\theta & 0 \\ \sin\theta & -\cos\theta & 0 \\ 0 & 0 & -1 \end{pmatrix}, $$

find its eigenvalues, orthonormal eigenvectors, and the diagonalizing matrix.

# Complex Analysis

## Cauchy-Riemann Equations in Polar Form

For an analytic function $f(z) = u(r, \theta) + i v(r, \theta)$, the Cauchy-Riemann equations in polar coordinates are:

$$ \frac{\partial u}{\partial r} = \frac{1}{r} \frac{\partial v}{\partial \theta} \quad \text{and} \quad \frac{\partial v}{\partial r} = -\frac{1}{r} \frac{\partial u}{\partial \theta} $$

## Method of Residues

The residue theorem states that for a function $f(z)$ with isolated singularities inside a simple closed contour $C$,

$$ \oint_C f(z)\, dz = 2\pi i \sum_k \text{Res}(f, z_k). $$

Residue at a simple pole $z_0$:

$$ \text{Res}(f, z_0) = \lim_{z \to z_0} (z - z_0) f(z). $$

Residue at a pole of order $m$ at $z_0$:

$$ \text{Res}(f, z_0) = \frac{1}{(m-1)!} \lim_{z \to z_0} \frac{d^{m-1}}{dz^{m-1}} \left[ (z - z_0)^m f(z) \right]. $$
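The two residue formulas can be checked symbolically. A sketch using SymPy's `residue` function, on illustrative examples (not taken from the text): a simple pole of $1/(z^2+1)$ and an order-2 pole of $e^z/(z-1)^2$.

```python
import sympy as sp

z = sp.symbols('z')

# Simple pole: f(z) = 1/(z^2 + 1) has simple poles at z = +-i.
# Res(f, i) = lim_{z->i} (z - i) f(z) = 1/(2i) = -i/2.
f = 1 / (z**2 + 1)
print(sp.residue(f, z, sp.I))  # -I/2

# Pole of order 2: g(z) = e^z / (z - 1)^2.
# Res(g, 1) = (1/1!) * d/dz [e^z] evaluated at z = 1, i.e. e.
g = sp.exp(z) / (z - 1)**2
print(sp.residue(g, z, 1))  # E
```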
This method is used to evaluate definite integrals, e.g., $\int_0^\infty \frac{dx}{x^4+1}$.

# Fourier Transforms

## Fourier Transform of a Single Square Pulse

Consider the square pulse

$$ f(x) = \begin{cases} 1 & |x| < 1 \\ 0 & |x| > 1 \end{cases} $$

Its Fourier transform $F(\omega)$ is given by:

$$ F(\omega) = \int_{-\infty}^{\infty} f(x) e^{-i\omega x}\, dx = \int_{-1}^{1} e^{-i\omega x}\, dx = \left[ \frac{e^{-i\omega x}}{-i\omega} \right]_{-1}^{1} = \frac{e^{-i\omega} - e^{i\omega}}{-i\omega} = \frac{2 \sin\omega}{\omega} = 2\, \text{sinc}(\omega) $$

where $\text{sinc}(\omega) = \sin\omega/\omega$.

# Group Theory

## Multiplication Table for Permutations

For the group of permutations of $\{1, 2, 3\}$, denoted $S_3$, the elements are:

- $e = (1)(2)(3)$ (identity)
- $(12) = (12)(3)$
- $(13) = (13)(2)$
- $(23) = (1)(23)$
- $(123)$
- $(132)$

The multiplication table shows the result of composing any two permutations: the entry in row $a$, column $b$ is $a \circ b$ (apply $b$ first, then $a$).

| $\circ$ | $e$ | $(12)$ | $(13)$ | $(23)$ | $(123)$ | $(132)$ |
|---------|-----|--------|--------|--------|---------|---------|
| $e$     | $e$ | $(12)$ | $(13)$ | $(23)$ | $(123)$ | $(132)$ |
| $(12)$  | $(12)$ | $e$ | $(132)$ | $(123)$ | $(23)$ | $(13)$ |
| $(13)$  | $(13)$ | $(123)$ | $e$ | $(132)$ | $(12)$ | $(23)$ |
| $(23)$  | $(23)$ | $(132)$ | $(123)$ | $e$ | $(13)$ | $(12)$ |
| $(123)$ | $(123)$ | $(13)$ | $(23)$ | $(12)$ | $(132)$ | $e$ |
| $(132)$ | $(132)$ | $(23)$ | $(12)$ | $(13)$ | $e$ | $(123)$ |

## Abelian Group

An Abelian group (or commutative group) is a group in which the result of applying the group operation to two group elements does not depend on their order: for all $a, b$ in the group $G$, $a \cdot b = b \cdot a$.

Example: the set of integers $\mathbb{Z}$ under addition is an Abelian group; for any integers $a, b$, $a + b = b + a$. The group $S_3$ (permutations of $\{1, 2, 3\}$) is not Abelian, e.g., $(12)(13) = (132)$ but $(13)(12) = (123)$.
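The non-commutativity of $S_3$ claimed above can be verified mechanically. A minimal sketch in plain Python, representing each permutation as a mapping and composing right-to-left (the right factor acts first, matching the table's convention):

```python
def compose(p, q):
    """Return p o q: apply q first, then p."""
    return {x: p[q[x]] for x in q}

# Elements of S3 as mappings on {1, 2, 3}.
s12  = {1: 2, 2: 1, 3: 3}   # (12)
s13  = {1: 3, 2: 2, 3: 1}   # (13)
s123 = {1: 2, 2: 3, 3: 1}   # (123): 1->2, 2->3, 3->1
s132 = {1: 3, 2: 1, 3: 2}   # (132): 1->3, 3->2, 2->1

print(compose(s12, s13) == s132)  # True: (12)(13) = (132)
print(compose(s13, s12) == s123)  # True: (13)(12) = (123)
# The two products differ, so S3 is not Abelian.
```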