Leibnitz's Theorem for the $n^{th}$ Derivative

If $y = uv$, where $u$ and $v$ are functions of $x$ possessing $n^{th}$ derivatives, then:
$$(uv)_n = \sum_{r=0}^n \binom{n}{r} u^{(n-r)}v^{(r)}$$
Expanded form:
$$(uv)_n = u^{(n)}v + \binom{n}{1} u^{(n-1)}v^{(1)} + \binom{n}{2} u^{(n-2)}v^{(2)} + \dots + \binom{n}{r} u^{(n-r)}v^{(r)} + \dots + u v^{(n)}$$
where $\binom{n}{r} = \frac{n!}{r!(n-r)!}$.

Successive Differentiation (Higher Order Derivatives)

The process of differentiating a function successively $n$ times.

Common notations for $y = f(x)$:
$1^{st}$ derivative: $f'(x)$ or $y'$ or $\frac{dy}{dx}$ or $Dy$ or $y_1$
$2^{nd}$ derivative: $f''(x)$ or $y''$ or $\frac{d^2y}{dx^2}$ or $D^2y$ or $y_2$
$3^{rd}$ derivative: $f'''(x)$ or $y'''$ or $\frac{d^3y}{dx^3}$ or $D^3y$ or $y_3$

Calculation of $n^{th}$ Derivatives for Common Functions

1. $y = e^{ax}$
$y_1 = ae^{ax}$, $y_2 = a^2e^{ax}$, $\dots$, $y_n = a^ne^{ax}$

2. $y = (ax+b)^m$ (for $m$ a positive integer, $m \ge n$)
$y_1 = ma(ax+b)^{m-1}$, $y_2 = m(m-1)a^2(ax+b)^{m-2}$, $\dots$, $y_n = m(m-1)\dots(m-n+1)a^n(ax+b)^{m-n}$

3. $y = \sin(ax+b)$
$y_1 = a\cos(ax+b) = a\sin(ax+b+\frac{\pi}{2})$
$y_2 = a^2\cos(ax+b+\frac{\pi}{2}) = a^2\sin(ax+b+\frac{2\pi}{2})$
$\dots$
$y_n = a^n\sin(ax+b+\frac{n\pi}{2})$

4. $y = \cos(ax+b)$
$y_1 = -a\sin(ax+b) = a\cos(ax+b+\frac{\pi}{2})$
$y_2 = -a^2\sin(ax+b+\frac{\pi}{2}) = a^2\cos(ax+b+\frac{2\pi}{2})$
$\dots$
$y_n = a^n\cos(ax+b+\frac{n\pi}{2})$

5. $y = \frac{1}{(x+1)(x+2)}$
Decompose into partial fractions: $y = \frac{1}{x+1} - \frac{1}{x+2}$.
With $u = (x+1)^{-1}$ and $v = (x+2)^{-1}$:
$u_n = (-1)^n n! (x+1)^{-(n+1)}$, $v_n = (-1)^n n! (x+2)^{-(n+1)}$
$y_n = (-1)^n n! \left( \frac{1}{(x+1)^{n+1}} - \frac{1}{(x+2)^{n+1}} \right)$

Maclaurin's Series

If $f(x)$ can be expanded in ascending powers of $x$, then:
$$f(x) = f(0) + xf'(0) + \frac{x^2}{2!}f''(0) + \frac{x^3}{3!}f'''(0) + \dots + \frac{x^n}{n!}f^{(n)}(0) + \dots$$

Common Maclaurin Series Expansions

1. $e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \dots + \frac{x^n}{n!} + \dots$
2. $\sin x = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \dots$
3. $\cos x = 1 - \frac{x^2}{2!} + \frac{x^4}{4!} - \frac{x^6}{6!} + \frac{x^8}{8!} - \dots$
4. $\tan x = x + \frac{x^3}{3} + \frac{2x^5}{15} + \dots$
5. Hyperbolic functions:
$\cosh x = 1 + \frac{x^2}{2!} + \frac{x^4}{4!} + \frac{x^6}{6!} + \dots$
$\sinh x = x + \frac{x^3}{3!} + \frac{x^5}{5!} + \dots$
$\tanh x = x - \frac{x^3}{3} + \frac{2}{15}x^5 + \dots$
6. $\log(1+x) = x - \frac{x^2}{2} + \frac{x^3}{3} - \frac{x^4}{4} + \dots$
7. $\log(1-x) = -x - \frac{x^2}{2} - \frac{x^3}{3} - \frac{x^4}{4} - \dots$
8. $(1+x)^m = 1 + mx + \frac{m(m-1)}{2!}x^2 + \frac{m(m-1)(m-2)}{3!}x^3 + \dots$ (Binomial Series)
9. $\frac{1}{1+x} = 1 - x + x^2 - x^3 + x^4 - \dots$ (Binomial Series with $m=-1$)
10. $\frac{1}{1-x} = 1 + x + x^2 + x^3 + x^4 + \dots$ (Binomial Series with $m=-1$ and $x$ replaced by $-x$)

Taylor's Series

Expansion of $f(x+h)$ about $x$:
$$f(x+h) = f(x) + hf'(x) + \frac{h^2}{2!}f''(x) + \frac{h^3}{3!}f'''(x) + \dots + \frac{h^n}{n!}f^{(n)}(x) + \dots$$
Expansion of $f(x)$ about $a$ (obtained by replacing $x$ with $a$ and $h$ with $x-a$):
$$f(x) = f(a) + (x-a)f'(a) + \frac{(x-a)^2}{2!}f''(a) + \frac{(x-a)^3}{3!}f'''(a) + \dots + \frac{(x-a)^n}{n!}f^{(n)}(a) + \dots$$
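The expansions above are easy to spot-check with a computer algebra system. The following is a minimal Python/SymPy sketch, added here as an illustration rather than part of the original notes; the choice of $\log(1+x)$, the expansion point $a = \pi/6$, and the truncation orders are arbitrary.

```python
import sympy as sp

x = sp.symbols('x')

# Maclaurin series: compare the listed expansion of log(1+x) with sympy.series
listed = x - x**2/2 + x**3/3 - x**4/4
print(sp.series(sp.log(1 + x), x, 0, 5).removeO() - listed)        # 0

# Taylor series of f(x) about x = a, here f = sin and a = pi/6 (arbitrary choices)
a = sp.pi/6
taylor3 = sum((x - a)**k / sp.factorial(k) * sp.diff(sp.sin(x), x, k).subs(x, a)
              for k in range(4))
# The truncated Taylor polynomial should match sympy.series about the same point
print(sp.simplify(sp.series(sp.sin(x), x, a, 4).removeO() - taylor3))   # 0
```

The same pattern (build the partial sum from the derivatives, subtract the CAS expansion, simplify to zero) verifies any of the listed expansions.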
Indeterminate Forms

Limits leading to indeterminate forms can often be evaluated using L'Hopital's Rule.

Types of Indeterminate Forms
$0 \times \infty$ or $\infty - \infty$: convert to $\frac{0}{0}$ or $\frac{\infty}{\infty}$ using algebraic manipulation.
$1^\infty$, $\infty^0$, $0^0$: use logarithms. If $\lim_{x \to a} [f(x)]^{g(x)} = L$, then $\log L = \lim_{x \to a} [g(x) \cdot \log f(x)]$. This converts the form to $0 \times \infty$.

L'Hopital's Rule
If $f(x)$ and $g(x)$ are two functions that can be expanded by Taylor's series in the neighborhood of $x=a$, and if $f(a) = g(a) = 0$ (or $f(a) = g(a) = \infty$), then:
$$\lim_{x \to a} \frac{f(x)}{g(x)} = \lim_{x \to a} \frac{f'(x)}{g'(x)}$$
provided the latter limit exists.

Beta & Gamma Functions

Gamma Function Definition
For $n > 0$, the Gamma function is defined as:
$$\Gamma(n) = \int_0^\infty e^{-x} x^{n-1} dx$$

Properties of the Gamma Function
$\Gamma(1) = 1$
$\Gamma(n+1) = n\Gamma(n)$ (reduction formula)
If $n$ is a positive integer, $\Gamma(n+1) = n!$
$\Gamma\left(\frac{1}{2}\right) = \sqrt{\pi}$
Using the reduction formula, $\Gamma\left(\frac{n}{2}\right)$ for odd positive integers $n$ can be expressed in terms of $\sqrt{\pi}$, e.g. $\Gamma\left(\frac{3}{2}\right) = \frac{1}{2}\sqrt{\pi}$.

Beta Function Definition
For $m>0, n>0$, the Beta function is defined as:
$$B(m,n) = \int_0^1 x^{m-1} (1-x)^{n-1} dx$$

Properties of the Beta Function
$B(m,n) = B(n,m)$
$B(m,n) = 2 \int_0^{\pi/2} \sin^{2m-1}\theta \cos^{2n-1}\theta \, d\theta$
$B(m,n) = \int_0^\infty \frac{x^{m-1}}{(1+x)^{m+n}} dx = \int_0^\infty \frac{x^{n-1}}{(1+x)^{m+n}} dx$
$B(m+1,n) = \frac{m}{m+n} B(m,n)$
$B(m,n+1) = \frac{n}{m+n} B(m,n)$

Relation between Beta and Gamma Functions
$$B(m,n) = \frac{\Gamma(m)\Gamma(n)}{\Gamma(m+n)}$$

Duplication Formula for the Gamma Function (Legendre's Duplication Formula)
$$\Gamma(m)\Gamma\left(m+\frac{1}{2}\right) = \frac{\sqrt{\pi}}{2^{2m-1}} \Gamma(2m)$$

Infinite Series

Series of Positive Terms
An infinite series in which all terms after some particular term are positive.

Zero Test (Necessary Condition for Convergence)
If $\sum u_n$ converges, then $\lim_{n \to \infty} u_n = 0$.
Divergence test: if $\lim_{n \to \infty} u_n \ne 0$, then $\sum u_n$ is divergent.

Integral Test
For a positive term series $f(1) + f(2) + f(3) + \dots$, where $f(n)$ decreases as $n$ increases:
The series converges if $\int_1^\infty f(x)dx$ is finite.
The series diverges if $\int_1^\infty f(x)dx$ is infinite.

Harmonic Series (p-series)
The series $\sum_{n=1}^\infty \frac{1}{n^p} = 1 + \frac{1}{2^p} + \frac{1}{3^p} + \dots$
Converges if $p > 1$. Diverges if $p \le 1$.

Comparison Test
For two positive term series $\sum u_n$ and $\sum v_n$:
Case 1: If $\sum v_n$ converges and $u_n \le v_n$ for all $n$, then $\sum u_n$ also converges.
Case 2: If $\sum v_n$ diverges and $u_n \ge v_n$ for all $n$, then $\sum u_n$ also diverges.

Limit Form of the Comparison Test
If $\sum u_n$ and $\sum v_n$ are two positive term series such that $\lim_{n \to \infty} \frac{u_n}{v_n} = L$ (a finite non-zero quantity), then $\sum u_n$ and $\sum v_n$ converge or diverge together.

Geometric Series
The series $1 + r + r^2 + r^3 + \dots$
Converges if $|r| < 1$. Diverges if $r \ge 1$. Oscillates if $r \le -1$.

D'Alembert's Ratio Test
For a positive term series $\sum u_n$:
Converges if $\lim_{n \to \infty} \frac{u_{n+1}}{u_n} = \lambda < 1$.
Diverges if $\lim_{n \to \infty} \frac{u_{n+1}}{u_n} = \lambda > 1$.
The test fails if $\lim_{n \to \infty} \frac{u_{n+1}}{u_n} = \lambda = 1$.
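As an illustration of the ratio test, the short SymPy sketch below applies it to the sample series $\sum \frac{n!}{n^n}$; the series is my own choice of example, not one from the notes. The limit comes out to $e^{-1} < 1$, so the series converges.

```python
import sympy as sp

n = sp.symbols('n', positive=True, integer=True)

# D'Alembert's ratio test applied to the sample series u_n = n!/n**n
u = sp.factorial(n) / n**n
ratio = sp.combsimp(u.subs(n, n + 1) / u)   # mathematically (n/(n+1))**n
lam = sp.limit(ratio, n, sp.oo)
print(lam)   # exp(-1); since exp(-1) < 1, the series converges
```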
Raabe's Test (used when the Ratio Test fails, i.e. $\lambda = 1$)
For a positive term series $\sum u_n$:
Converges if $\lim_{n \to \infty} n\left(\frac{u_n}{u_{n+1}} - 1\right) = k > 1$.
Diverges if $\lim_{n \to \infty} n\left(\frac{u_n}{u_{n+1}} - 1\right) = k < 1$.
The test fails if $\lim_{n \to \infty} n\left(\frac{u_n}{u_{n+1}} - 1\right) = k = 1$.

Logarithmic Test (used when Raabe's Test fails, i.e. $k = 1$)
For a positive term series $\sum u_n$:
Converges if $\lim_{n \to \infty} \left(n \log \frac{u_n}{u_{n+1}}\right) = k > 1$.
Diverges if $\lim_{n \to \infty} \left(n \log \frac{u_n}{u_{n+1}}\right) = k < 1$.
The test fails if $\lim_{n \to \infty} \left(n \log \frac{u_n}{u_{n+1}}\right) = k = 1$.

Cauchy's Root Test
For a positive term series $\sum u_n$:
Converges if $\lim_{n \to \infty} (u_n)^{1/n} = \lambda < 1$.
Diverges if $\lim_{n \to \infty} (u_n)^{1/n} = \lambda > 1$.
The test fails if $\lim_{n \to \infty} (u_n)^{1/n} = \lambda = 1$.

Alternating Series
A series whose terms are alternately positive and negative, e.g., $u_1 - u_2 + u_3 - u_4 + \dots$

Leibnitz's Rule for Alternating Series
An alternating series converges if:
Each term is numerically less than or equal to its preceding term ($|u_{n+1}| \le |u_n|$).
$\lim_{n \to \infty} u_n = 0$.
If $\lim_{n \to \infty} u_n \ne 0$, the series is oscillatory.

Partial Derivatives

Notation
$\frac{\partial Z}{\partial x} = Z_x$, $\frac{\partial Z}{\partial y} = Z_y$
$\frac{\partial}{\partial x}\left(\frac{\partial Z}{\partial x}\right) = \frac{\partial^2 Z}{\partial x^2} = Z_{xx}$
$\frac{\partial}{\partial y}\left(\frac{\partial Z}{\partial y}\right) = \frac{\partial^2 Z}{\partial y^2} = Z_{yy}$
$\frac{\partial}{\partial y}\left(\frac{\partial Z}{\partial x}\right) = \frac{\partial^2 Z}{\partial y \partial x} = Z_{xy}$
$\frac{\partial}{\partial x}\left(\frac{\partial Z}{\partial y}\right) = \frac{\partial^2 Z}{\partial x \partial y} = Z_{yx}$
Clairaut's Theorem: If $Z_{xy}$ and $Z_{yx}$ are continuous, then $Z_{xy} = Z_{yx}$.

Composite Function (Chain Rule)
If $Z = f(x,y)$ and $x = \phi(t)$, $y = \psi(t)$, then:
$$\frac{dZ}{dt} = \frac{\partial Z}{\partial x} \frac{dx}{dt} + \frac{\partial Z}{\partial y} \frac{dy}{dt}$$

Homogeneous Function & Euler's Theorem

Homogeneous Function Definition
A function $f(x,y,z)$ is called a homogeneous function of degree $n$ if, on putting $X=xt, Y=yt, Z=zt$, the function $f(X,Y,Z)$ becomes $t^n f(x,y,z)$, i.e., $f(xt, yt, zt) = t^n f(x,y,z)$.

Euler's Theorem for Homogeneous Functions
If $u$ is a homogeneous function of $x,y,z,\dots$ of degree $n$, then:
$$x\frac{\partial u}{\partial x} + y\frac{\partial u}{\partial y} + z\frac{\partial u}{\partial z} + \dots = nu$$

Corollaries of Euler's Theorem
Corollary 1: If $z$ is a homogeneous function of two variables $x$ and $y$ of degree $n$, then:
$$x^2\frac{\partial^2 z}{\partial x^2} + 2xy\frac{\partial^2 z}{\partial x \partial y} + y^2\frac{\partial^2 z}{\partial y^2} = n(n-1)z$$
Corollary 2: If $u$ is a homogeneous function of degree $n$ in $x,y,z$ and $u=f(X,Y,Z)$, where $X,Y,Z$ are the first order partial derivatives of $u$ with respect to $x,y,z$, then:
$$X\frac{\partial f}{\partial X} + Y\frac{\partial f}{\partial Y} + Z\frac{\partial f}{\partial Z} = \frac{n}{n-1}u$$
Corollary 3: If $z$ is a homogeneous function of degree $n$ in $x$ and $y$, and $z=f(u)$, then:
$$x\frac{\partial u}{\partial x} + y\frac{\partial u}{\partial y} = n \frac{f(u)}{f'(u)}$$
Corollary 4: If $z$ is a homogeneous function of degree $n$ in $x$ and $y$, and $z=f(u)$, then:
$$x^2\frac{\partial^2 u}{\partial x^2} + 2xy\frac{\partial^2 u}{\partial x \partial y} + y^2\frac{\partial^2 u}{\partial y^2} = g(u)[g'(u)-1]$$
where $g(u) = n \frac{f(u)}{f'(u)}$.
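Euler's theorem and Corollary 1 can be verified mechanically for any concrete homogeneous function. The SymPy sketch below (an added illustration) uses the arbitrary degree-3 example $u = x^3 + x^2y + y^3$.

```python
import sympy as sp

x, y = sp.symbols('x y')

# An arbitrary homogeneous function of degree n = 3, chosen for illustration
u = x**3 + x**2*y + y**3
n = 3

# Euler's theorem: x*u_x + y*u_y = n*u
euler_lhs = x*sp.diff(u, x) + y*sp.diff(u, y)
print(sp.simplify(euler_lhs - n*u))          # 0

# Corollary 1: x^2*u_xx + 2*x*y*u_xy + y^2*u_yy = n*(n-1)*u
cor1_lhs = x**2*sp.diff(u, x, 2) + 2*x*y*sp.diff(u, x, y) + y**2*sp.diff(u, y, 2)
print(sp.simplify(cor1_lhs - n*(n - 1)*u))   # 0
```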
Maxima & Minima (Stationary or Turning Values)

For a function $f(x,y)$:
Solve $\frac{\partial f}{\partial x} = 0$ and $\frac{\partial f}{\partial y} = 0$ simultaneously to find the stationary points $(x_1, y_1)$.
Calculate the second partial derivatives at each stationary point:
$r = \frac{\partial^2 f}{\partial x^2}$, $s = \frac{\partial^2 f}{\partial x \partial y}$, $t = \frac{\partial^2 f}{\partial y^2}$
Evaluate $rt - s^2$:
If $rt - s^2 > 0$ and $r < 0$ (or $t < 0$), $f(x,y)$ has a local maximum.
If $rt - s^2 > 0$ and $r > 0$ (or $t > 0$), $f(x,y)$ has a local minimum.
If $rt - s^2 < 0$, $f(x,y)$ has neither a maximum nor a minimum (saddle point).
If $rt - s^2 = 0$, the test is inconclusive.

Lagrange's Method of Undetermined Multipliers

To find extreme values of $u = f(x,y,z)$ subject to the constraint $\Phi(x,y,z) = 0$:
Form the Lagrangian function $L(x,y,z,\lambda) = f(x,y,z) + \lambda \Phi(x,y,z)$.
Set the first partial derivatives of $L$ with respect to $x,y,z,\lambda$ to zero:
$\frac{\partial L}{\partial x} = \frac{\partial f}{\partial x} + \lambda \frac{\partial \Phi}{\partial x} = 0$
$\frac{\partial L}{\partial y} = \frac{\partial f}{\partial y} + \lambda \frac{\partial \Phi}{\partial y} = 0$
$\frac{\partial L}{\partial z} = \frac{\partial f}{\partial z} + \lambda \frac{\partial \Phi}{\partial z} = 0$
$\frac{\partial L}{\partial \lambda} = \Phi(x,y,z) = 0$
Solve this system of equations for $x,y,z,\lambda$. The values of $x,y,z$ obtained are the points at which $u$ may have extreme values.

Ordinary Differential Equations

Order and Degree of a Differential Equation
Order: the order of the highest derivative present in the equation.
Degree: the highest power of the highest-order derivative, after the equation has been cleared of radicals and fractions involving the derivatives.

Formation of a Differential Equation
Eliminate the arbitrary constants from the general solution by differentiating it successively.
Note: the number of arbitrary constants in the solution equals the order of the differential equation.

Differential Equations of the First Order & First Degree

1. Variable Separable
If a D.E. can be written in the form $f(y)dy = g(x)dx$, integrate both sides: $\int f(y)dy = \int g(x)dx + C$.

2. Homogeneous Differential Equation
A D.E. of the form $\frac{dy}{dx} = \frac{f(x,y)}{g(x,y)}$, where $f(x,y)$ and $g(x,y)$ are homogeneous functions of the same degree.
Method: Substitute $y=vx$, so that $\frac{dy}{dx} = v + x\frac{dv}{dx}$. The equation becomes separable in $v$ and $x$.

3. Equations Reducible to Homogeneous Form
A D.E. of the form $\frac{dy}{dx} = \frac{ax+by+c}{Ax+By+C}$.
Method: Substitute $x=X+h, y=Y+k$ and choose $h,k$ such that $ah+bk+c=0$ and $Ah+Bk+C=0$. The equation becomes homogeneous in $X,Y$.
Case of failure: If $\frac{a}{A} = \frac{b}{B}$, this method fails; substitute $ax+by=Z$ instead to make the equation separable.

4. Linear Differential Equation
A D.E. of the form $\frac{dy}{dx} + P(x)y = Q(x)$, where $P(x)$ and $Q(x)$ are functions of $x$ (or constants).
Method: Calculate the Integrating Factor (I.F.): $I.F. = e^{\int P(x)dx}$. The general solution is $y \cdot (I.F.) = \int Q(x) \cdot (I.F.) dx + C$.

5. Bernoulli's Equation (Reducible to Linear Form)
A D.E. of the form $\frac{dy}{dx} + P(x)y = Q(x)y^n$, where $n \ne 0, 1$.
Method: Divide by $y^n$: $y^{-n}\frac{dy}{dx} + P(x)y^{1-n} = Q(x)$. Substitute $z = y^{1-n}$, so that $\frac{dz}{dx} = (1-n)y^{-n}\frac{dy}{dx}$. The equation becomes linear in $z$: $\frac{1}{1-n}\frac{dz}{dx} + P(x)z = Q(x)$, i.e. $\frac{dz}{dx} + (1-n)P(x)z = (1-n)Q(x)$. Solve this linear D.E. for $z$.
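A quick SymPy check of the linear and Bernoulli recipes above, using illustrative equations chosen here rather than taken from the notes: $\frac{dy}{dx} + \frac{y}{x} = x^2$ for the linear case and $\frac{dy}{dx} + y = y^2$ for the Bernoulli case.

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# Linear D.E. (illustrative): dy/dx + y/x = x**2, so P = 1/x and Q = x**2
P, Q = 1/x, x**2
IF = sp.exp(sp.integrate(P, x))          # integrating factor e^(∫P dx) = x
particular = sp.integrate(Q*IF, x) / IF  # (∫Q·IF dx)/IF = x**3/4; general solution adds C/IF
print(IF, particular)
print(sp.dsolve(sp.Eq(y(x).diff(x) + P*y(x), Q), y(x)))   # y(x) = C1/x + x**3/4

# Bernoulli equation (illustrative): dy/dx + y = y**2  (P = 1, Q = 1, n = 2)
# The substitution z = y**(1-n) = 1/y turns it into the linear equation dz/dx - z = -1.
print(sp.dsolve(sp.Eq(y(x).diff(x) + y(x), y(x)**2), y(x)))
```

The manual integrating-factor computation and `dsolve` should agree, which is the point of the cross-check.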
6. Exact Differential Equations
A D.E. of the form $M(x,y)dx + N(x,y)dy = 0$ is exact if $\frac{\partial M}{\partial y} = \frac{\partial N}{\partial x}$.
Method: The general solution is
$$\int_{y \text{ constant}} M(x,y)dx + \int (\text{terms of } N \text{ not containing } x)\, dy = C$$

7. Equations Reducible to Exact Equations (Integrating Factors)
If $\frac{\frac{\partial M}{\partial y} - \frac{\partial N}{\partial x}}{N} = f(x)$ (a function of $x$ only), then $I.F. = e^{\int f(x)dx}$.
If $\frac{\frac{\partial N}{\partial x} - \frac{\partial M}{\partial y}}{M} = f(y)$ (a function of $y$ only), then $I.F. = e^{\int f(y)dy}$.
If $M = yf_1(xy)$ and $N = xf_2(xy)$, then $I.F. = \frac{1}{Mx-Ny}$ (provided $Mx-Ny \ne 0$).
If the D.E. is homogeneous and $Mx+Ny \ne 0$, then $I.F. = \frac{1}{Mx+Ny}$.
If the D.E. is of the form $x^a y^b (my\,dx + nx\,dy) + x^c y^d (py\,dx + qx\,dy) = 0$, group the terms and find an integrating factor of the form $I.F. = x^h y^k$ that makes each group exact.
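The exactness condition and the first integrating-factor rule can likewise be checked with SymPy. The two equations below, $(2xy+3)dx + x^2dy = 0$ and $y\,dx + 2x\,dy = 0$, are illustrative choices added here, not examples from the notes.

```python
import sympy as sp

x, y = sp.symbols('x y')

# Exactness check for (2*x*y + 3) dx + (x**2) dy = 0
M = 2*x*y + 3
N = x**2
print(sp.diff(M, y) - sp.diff(N, x))    # 0 -> the equation is exact

# General solution per the rule above: integrate M w.r.t. x (treating y as constant),
# then add the terms of N containing no x (here there are none).
F = sp.integrate(M, x)                  # x**2*y + 3*x
print(sp.Eq(F, sp.Symbol('C')))         # x**2*y + 3*x = C

# Integrating-factor rule 1 on a non-exact example: y dx + 2*x dy = 0
M2, N2 = y, 2*x
f = sp.simplify((sp.diff(M2, y) - sp.diff(N2, x)) / N2)   # -1/(2*x), a function of x only
IF = sp.exp(sp.integrate(f, x))                           # mathematically 1/sqrt(x)
# Multiplying through by IF makes the equation exact:
print(sp.simplify(sp.diff(M2*IF, y) - sp.diff(N2*IF, x))) # 0
```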