Sequences and Limits

Definition of a Sequence: An ordered list of numbers $\{a_n\}_{n=1}^{\infty} = a_1, a_2, a_3, \dots$. Often defined by a formula for $a_n$ or recursively.

Formal Definition of Limit: A sequence $\{a_n\}$ converges to a limit $L$ (written $\lim_{n \to \infty} a_n = L$) if for every $\epsilon > 0$, there exists an integer $N$ such that for all $n > N$, $|a_n - L| < \epsilon$.

Divergence: A sequence diverges if it does not converge to a finite limit. This includes sequences that tend to $\pm \infty$ or oscillate without settling.

Limit Laws for Sequences: If $\lim_{n \to \infty} a_n = L$ and $\lim_{n \to \infty} b_n = M$ exist, and $c$ is a constant:
$\lim_{n \to \infty} (a_n \pm b_n) = L \pm M$
$\lim_{n \to \infty} (c \cdot a_n) = c \cdot L$
$\lim_{n \to \infty} (a_n b_n) = LM$
$\lim_{n \to \infty} \frac{a_n}{b_n} = \frac{L}{M}$ (if $M \ne 0$)
$\lim_{n \to \infty} (a_n)^p = L^p$ (if $L^p$ is defined)

Relationship to Functions: If $\lim_{x \to \infty} f(x) = L$ and $f(n) = a_n$ for integers $n$, then $\lim_{n \to \infty} a_n = L$. This allows using L'Hôpital's Rule for sequences.

Squeeze Theorem for Sequences: If $a_n \le b_n \le c_n$ for all $n \ge N_0$ and $\lim_{n \to \infty} a_n = L = \lim_{n \to \infty} c_n$, then $\lim_{n \to \infty} b_n = L$.

Monotonic and Bounded Sequences: A sequence is monotonic if it is either non-decreasing ($a_n \le a_{n+1}$) or non-increasing ($a_n \ge a_{n+1}$). A sequence is bounded above if $a_n \le M$ for some $M$, bounded below if $a_n \ge m$ for some $m$, and bounded if it is bounded both above and below.

Monotonic Sequence Theorem: Every bounded, monotonic sequence converges.

Convergence of Series

Definition of a Series: Given a sequence $\{a_n\}$, a series is the sum $\sum_{n=1}^{\infty} a_n = a_1 + a_2 + a_3 + \dots$.

Partial Sums: The $k$-th partial sum is $S_k = \sum_{n=1}^{k} a_n$.
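The two ideas connect: the partial sums of a positive-term series form an increasing sequence, so if they are bounded above, the Monotonic Sequence Theorem forces the series to converge. A minimal numeric sketch for $\sum 1/n^2$ (whose sum is the known value $\pi^2/6$):

```python
import math

# The partial sums S_k of sum 1/n^2 form an increasing sequence bounded
# above (by 2, since 1/n^2 <= 1/(n(n-1)) for n >= 2 and that telescopes),
# so the Monotonic Sequence Theorem guarantees convergence.
partials = []
s = 0.0
for n in range(1, 100001):
    s += 1 / n ** 2
    partials.append(s)

assert all(a <= b for a, b in zip(partials, partials[1:]))  # monotonic
assert all(p < 2 for p in partials)                          # bounded above
# The limit is pi^2 / 6 (a known closed form, not derived here).
assert abs(partials[-1] - math.pi ** 2 / 6) < 1e-4
```

The variable names are illustrative; the point is that monotonicity plus boundedness is verified term by term, while convergence follows from the theorem.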
Convergence of a Series: The series $\sum a_n$ converges if the sequence of its partial sums $\{S_k\}$ converges to a finite number $S$. If $\lim_{k \to \infty} S_k = S$, then $S$ is the sum of the series. Otherwise, the series diverges.

Divergence Test ($n$-th Term Test): If $\lim_{n \to \infty} a_n \ne 0$ or the limit does not exist, then the series $\sum a_n$ diverges. Crucial Note: If $\lim_{n \to \infty} a_n = 0$, the test is inconclusive. The series may converge (e.g., $\sum 1/n^2$) or diverge (e.g., $\sum 1/n$).

Properties of Convergent Series: If $\sum a_n$ and $\sum b_n$ converge, then $\sum (a_n \pm b_n)$ converges and $\sum (a_n \pm b_n) = \sum a_n \pm \sum b_n$. If $\sum a_n$ converges and $c$ is a constant, then $\sum c \cdot a_n$ converges and $\sum c \cdot a_n = c \sum a_n$.

Geometric Series: $\sum_{n=0}^{\infty} ar^n = a + ar + ar^2 + \dots$ (first term $a$, common ratio $r$). Converges to $\frac{a}{1-r}$ if $|r| < 1$; diverges if $|r| \ge 1$.

$p$-Series: $\sum_{n=1}^{\infty} \frac{1}{n^p} = 1 + \frac{1}{2^p} + \frac{1}{3^p} + \dots$. Converges if $p > 1$; diverges if $p \le 1$. (The Harmonic Series $\sum 1/n$ is the $p$-series with $p=1$, which diverges.)

Integral Test (for positive, decreasing, continuous $f(x)$): If $f(x)$ is positive, continuous, and decreasing for $x \ge 1$, and $a_n = f(n)$, then $\sum_{n=1}^{\infty} a_n$ converges if and only if $\int_1^{\infty} f(x) dx$ converges.

Tests for Convergence (for series with positive terms)

1. Comparison Test: Suppose $\sum a_n$ and $\sum b_n$ are series with positive terms and $a_n \le b_n$ for all $n \ge N_0$ (for some integer $N_0$). If $\sum b_n$ converges, then $\sum a_n$ converges. If $\sum a_n$ diverges, then $\sum b_n$ diverges.

2. Limit Comparison Test: Suppose $\sum a_n$ and $\sum b_n$ are series with positive terms. If $\lim_{n \to \infty} \frac{a_n}{b_n} = L$, where $L$ is a finite number and $L > 0$, then either both series converge or both diverge.
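A quick numeric sketch of the geometric series formula: with first term $a = 3$ and ratio $r = 1/2$, the partial sums should approach $\frac{a}{1-r} = 6$. (The values here are chosen purely for illustration.)

```python
# Partial sums of sum_{n>=0} a*r^n approach a/(1-r) when |r| < 1.
a, r = 3.0, 0.5
partial = 0.0
for n in range(60):
    partial += a * r ** n

# Closed-form sum: a / (1 - r) = 6.
assert abs(partial - a / (1 - r)) < 1e-12
```

After 60 terms the remaining tail $a r^{60} / (1-r)$ is below machine precision, which is why the tolerance can be so tight.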
Additionally: if $\lim_{n \to \infty} \frac{a_n}{b_n} = 0$ and $\sum b_n$ converges, then $\sum a_n$ converges; if $\lim_{n \to \infty} \frac{a_n}{b_n} = \infty$ and $\sum b_n$ diverges, then $\sum a_n$ diverges.

3. Ratio Test (useful for factorials and $n$-th powers): Let $L = \lim_{n \to \infty} \left| \frac{a_{n+1}}{a_n} \right|$. If $L < 1$, the series $\sum a_n$ converges absolutely. If $L > 1$ or $L = \infty$, the series $\sum a_n$ diverges. If $L = 1$, the test is inconclusive (use another test).

4. Root Test (useful for $(\dots)^n$ terms): Let $L = \lim_{n \to \infty} \sqrt[n]{|a_n|} = \lim_{n \to \infty} (|a_n|)^{1/n}$. If $L < 1$, the series $\sum a_n$ converges absolutely. If $L > 1$ or $L = \infty$, the series $\sum a_n$ diverges. If $L = 1$, the test is inconclusive (use another test).

Absolute and Conditional Convergence

Absolute Convergence: A series $\sum a_n$ converges absolutely if the series of absolute values $\sum |a_n|$ converges. Theorem: If a series converges absolutely, then it converges. (The converse is not true.)

Conditional Convergence: A series $\sum a_n$ converges conditionally if $\sum a_n$ converges but $\sum |a_n|$ diverges. Example: The alternating harmonic series $\sum (-1)^{n-1}/n$ converges, but $\sum 1/n$ diverges.

Alternating Series

Alternating Series: A series whose terms alternate in sign, typically of the form $\sum_{n=1}^{\infty} (-1)^{n-1} b_n = b_1 - b_2 + b_3 - \dots$ or $\sum_{n=1}^{\infty} (-1)^n b_n = -b_1 + b_2 - b_3 + \dots$, where $b_n > 0$.

Alternating Series Test (Leibniz Test): An alternating series $\sum_{n=1}^{\infty} (-1)^{n-1} b_n$ (or $\sum (-1)^n b_n$) converges if both of the following conditions are met: the terms $b_n$ are decreasing (i.e., $b_{n+1} \le b_n$ for all $n$), and $\lim_{n \to \infty} b_n = 0$.

Alternating Series Estimation Theorem: If an alternating series satisfies the conditions of the AST, then for $S = \sum (-1)^{n-1} b_n$ and $S_k = \sum_{n=1}^k (-1)^{n-1} b_n$, the remainder (error) $|R_k| = |S - S_k|$ satisfies $|R_k| \le b_{k+1}$.
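The estimation theorem can be checked numerically on the alternating harmonic series, which converges to the known value $\ln 2$: after $k$ terms, the error should not exceed $b_{k+1} = 1/(k+1)$. (The choice $k = 1000$ is arbitrary.)

```python
import math

# Alternating harmonic series: sum (-1)^(n-1)/n = ln 2.
# By the Alternating Series Estimation Theorem, |S - S_k| <= b_{k+1}.
k = 1000
S_k = sum((-1) ** (n - 1) / n for n in range(1, k + 1))
error = abs(S_k - math.log(2))

assert error <= 1 / (k + 1)   # the theorem's bound holds
```

In fact the actual error is close to $1/(2k)$, about half the guaranteed bound, which is typical: the bound $b_{k+1}$ is safe but not tight.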
This means the error is at most the absolute value of the first neglected term.

Power Series

Definition: A series of the form $\sum_{n=0}^{\infty} c_n (x-a)^n = c_0 + c_1(x-a) + c_2(x-a)^2 + \dots$. The constant $a$ is the center of the series, and the $c_n$ are the coefficients.

Convergence: A power series converges for at least $x=a$. For other $x$, its convergence depends on $x$.

Radius of Convergence ($R$): For any power series $\sum c_n (x-a)^n$, there are three possibilities for its convergence: the series converges only when $x=a$ (in this case $R=0$); the series converges for all $x$, i.e. for $x \in (-\infty, \infty)$ (in this case $R=\infty$); or there is a positive number $R$ such that the series converges if $|x-a| < R$ and diverges if $|x-a| > R$.

Finding $R$: The Ratio Test is typically used. Calculate $L = \lim_{n \to \infty} \left| \frac{c_{n+1}(x-a)^{n+1}}{c_n(x-a)^n} \right| = |x-a| \lim_{n \to \infty} \left| \frac{c_{n+1}}{c_n} \right|$. If $L = |x-a| \cdot K$ with $0 < K < \infty$, then for convergence we need $|x-a| \cdot K < 1$, i.e. $|x-a| < 1/K$, so $R = 1/K$. If $K=0$, then $L=0$ for all $x$, so $R=\infty$. If $K=\infty$, then $L=\infty$ for $x \ne a$, so $R=0$.

Alternatively, using the Root Test: $L = \lim_{n \to \infty} \sqrt[n]{|c_n (x-a)^n|} = |x-a| \lim_{n \to \infty} \sqrt[n]{|c_n|}$, so $R = 1/\lim_{n \to \infty} \sqrt[n]{|c_n|}$.

Interval of Convergence: The set of all $x$ for which the power series converges. It is an interval centered at $a$ with radius $R$. The interval is initially $(a-R, a+R)$. The endpoints $x=a-R$ and $x=a+R$ must be checked separately by substituting them into the original series and applying an appropriate convergence test (e.g., $p$-series, AST, or integral test). The series may converge or diverge at either, both, or neither endpoint.

Fundamental Theorem of Calculus (FTC)

FTC Part 1 (Differentiation of an Integral): If $f$ is continuous on $[a, b]$, then the function $g(x) = \int_a^x f(t) dt$ is continuous on $[a, b]$, differentiable on $(a, b)$, and $g'(x) = f(x)$; that is, $g$ is an antiderivative of $f$.
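FTC Part 1 can be sketched numerically even when $f$ has no elementary antiderivative: build $g(x) = \int_0^x f(t) dt$ by numerical quadrature and check that a difference quotient of $g$ reproduces $f$. The `simpson` helper below is an illustrative composite Simpson's rule, not a library function.

```python
import math

def simpson(f, a, b, m=2000):
    # Composite Simpson's rule with m (even) subintervals.
    h = (b - a) / m
    s = f(a) + f(b)
    for i in range(1, m):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# f(t) = e^{-t^2} has no elementary antiderivative, but FTC Part 1 still
# says g'(x) = f(x) for g(x) = integral of f from 0 to x.
f = lambda t: math.exp(-t * t)
g = lambda x: simpson(f, 0.0, x)

x, h = 0.8, 1e-4
numeric = (g(x + h) - g(x - h)) / (2 * h)   # central difference for g'
assert abs(numeric - f(x)) < 1e-6
```

The central difference converges to $g'(x)$ as $h \to 0$; the quadrature error is far below the tolerance, so the check isolates the FTC identity itself.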
More generally, if $F(x) = \int_{u(x)}^{v(x)} f(t) dt$, then $F'(x) = f(v(x))v'(x) - f(u(x))u'(x)$.

FTC Part 2 (Evaluation of a Definite Integral): If $f$ is continuous on $[a, b]$ and $F$ is any antiderivative of $f$ (i.e., $F'(x) = f(x)$), then $\int_a^b f(x) dx = F(b) - F(a)$.

Mean Value Theorems for Integrals

Mean Value Theorem for Integrals: If $f$ is continuous on a closed interval $[a, b]$, then there exists a number $c$ in $[a, b]$ such that $\int_a^b f(x) dx = f(c)(b-a)$. This means that the average value of the function, $f_{avg} = \frac{1}{b-a} \int_a^b f(x) dx$, is actually attained by the function at some point $c$ in the interval.

Evaluation of Definite Integrals - Reduction Formulae

Reduction Formulae: These are recursive formulas used to evaluate integrals, especially definite integrals, by expressing an integral of a certain form (e.g., $\int \sin^n x dx$) in terms of an integral of the same form but with a lower power or index. They are often derived using integration by parts.

Example 1: For $I_n = \int \sin^n(x) dx$:
$$ I_n = -\frac{1}{n}\sin^{n-1}(x)\cos(x) + \frac{n-1}{n} I_{n-2} $$
For the definite integral (Wallis' integrals, $n \ge 2$):
$$ \int_0^{\pi/2} \sin^n(x) dx = \frac{n-1}{n} \int_0^{\pi/2} \sin^{n-2}(x) dx $$

Example 2: For $I_n = \int_0^{\pi/2} \cos^n(x) dx$, the same reduction formula applies.

Example 3: For $I_n = \int x^n e^x dx$:
$$ I_n = x^n e^x - n \int x^{n-1} e^x dx = x^n e^x - n I_{n-1} $$

Example 4: For $I_n = \int \sec^n(x) dx$:
$$ I_n = \frac{\sec^{n-2}(x)\tan(x)}{n-1} + \frac{n-2}{n-1} I_{n-2} \quad (n \ne 1) $$

Applications of Integration

Area between Curves: If $f(x) \ge g(x)$ on $[a, b]$: $A = \int_a^b [f(x) - g(x)] dx$. If $f(y) \ge g(y)$ on $[c, d]$: $A = \int_c^d [f(y) - g(y)] dy$. For regions defined by multiple intersections, sum the integrals over subintervals where one function consistently dominates the other.
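The Wallis reduction formula for $\int_0^{\pi/2} \sin^n x\, dx$ can be checked numerically; the sketch below uses an illustrative composite Simpson's rule (`simpson` is a helper written here, not a library routine) with $n = 6$.

```python
import math

def simpson(f, a, b, m=2000):
    # Composite Simpson's rule with m (even) subintervals.
    h = (b - a) / m
    s = f(a) + f(b)
    for i in range(1, m):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Check I_n = (n-1)/n * I_{n-2} for I_n = integral of sin^n x on [0, pi/2].
n = 6
I_n   = simpson(lambda x: math.sin(x) ** n,       0.0, math.pi / 2)
I_nm2 = simpson(lambda x: math.sin(x) ** (n - 2), 0.0, math.pi / 2)
assert abs(I_n - (n - 1) / n * I_nm2) < 1e-10
```

Applying the formula repeatedly down to $I_0 = \pi/2$ gives $I_6 = \frac{5}{6} \cdot \frac{3}{4} \cdot \frac{1}{2} \cdot \frac{\pi}{2} = \frac{15\pi}{96}$, which the computed value matches.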
Arc Length: For $y=f(x)$ on $[a, b]$: $L = \int_a^b \sqrt{1 + [f'(x)]^2} dx$. For $x=g(y)$ on $[c, d]$: $L = \int_c^d \sqrt{1 + [g'(y)]^2} dy$. For a parametric curve $x=x(t), y=y(t)$ on $[t_1, t_2]$: $L = \int_{t_1}^{t_2} \sqrt{\left(\frac{dx}{dt}\right)^2 + \left(\frac{dy}{dt}\right)^2} dt$.

Volume of Revolution (Disk/Washer Method): Rotation about an axis. About the x-axis: $V = \pi \int_a^b [R(x)^2 - r(x)^2] dx$, where $R(x)$ is the outer radius and $r(x)$ the inner radius (for the disk method, $r(x)=0$). About the y-axis: $V = \pi \int_c^d [R(y)^2 - r(y)^2] dy$ ($R(y)$ outer, $r(y)$ inner).

Volume of Revolution (Cylindrical Shells Method): About the y-axis: $V = 2\pi \int_a^b (\text{radius}) \cdot (\text{height}) dx = 2\pi \int_a^b x \cdot [f(x) - g(x)] dx$. About the x-axis: $V = 2\pi \int_c^d (\text{radius}) \cdot (\text{height}) dy = 2\pi \int_c^d y \cdot [f(y) - g(y)] dy$.

Surface Area of Revolution: For $y=f(x)$ on $[a, b]$ rotated about the x-axis: $S = \int_a^b 2\pi f(x) \sqrt{1 + [f'(x)]^2} dx$. For $x=g(y)$ on $[c, d]$ rotated about the y-axis: $S = \int_c^d 2\pi g(y) \sqrt{1 + [g'(y)]^2} dy$.

Differentiation Under the Integral Sign (Leibniz Integral Rule)

General Form: If $F(x) = \int_{a(x)}^{b(x)} f(x, t) dt$, where $a(x)$ and $b(x)$ are differentiable functions of $x$, and $f(x, t)$ and $\frac{\partial}{\partial x} f(x, t)$ are continuous, then
$$ \frac{dF}{dx} = f(x, b(x)) \cdot b'(x) - f(x, a(x)) \cdot a'(x) + \int_{a(x)}^{b(x)} \frac{\partial}{\partial x} f(x, t) dt $$

Special Case (Constant Limits): If the limits of integration are constants $a$ and $b$, so $F(x) = \int_a^b f(x, t) dt$, then $\frac{dF}{dx} = \int_a^b \frac{\partial}{\partial x} f(x, t) dt$. This allows interchanging differentiation and integration.

Example: To find $\frac{d}{dx} \int_x^{x^2} \sin(xt) dt$: here $a(x)=x$, $b(x)=x^2$, $f(x,t) = \sin(xt)$, so $b'(x)=2x$, $a'(x)=1$, and $\frac{\partial}{\partial x} f(x,t) = \frac{\partial}{\partial x} \sin(xt) = t \cos(xt)$.
$$ \frac{dF}{dx} = \sin(x \cdot x^2) \cdot (2x) - \sin(x \cdot x) \cdot (1) + \int_x^{x^2} t \cos(xt) dt $$ $$ = 2x \sin(x^3) - \sin(x^2) + \int_x^{x^2} t \cos(xt) dt $$
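This result can be verified numerically. Here $F$ happens to have a closed form, $F(x) = (\cos(x^2) - \cos(x^3))/x$ (integrating $\sin(xt)$ in $t$ gives $-\cos(xt)/x$), so a central difference of $F$ can be compared against the Leibniz-rule expression; the `simpson` quadrature helper is an illustrative sketch, not a library function.

```python
import math

def F(x):
    # Closed form of integral of sin(xt) dt from t = x to t = x^2.
    return (math.cos(x ** 2) - math.cos(x ** 3)) / x

def simpson(f, a, b, m=2000):
    # Composite Simpson's rule with m (even) subintervals.
    h = (b - a) / m
    s = f(a) + f(b)
    for i in range(1, m):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

x, h = 1.3, 1e-5
numeric = (F(x + h) - F(x - h)) / (2 * h)   # central-difference F'(x)

# Leibniz rule: 2x sin(x^3) - sin(x^2) + integral of t cos(xt) dt.
integral = simpson(lambda t: t * math.cos(x * t), x, x ** 2)
formula = 2 * x * math.sin(x ** 3) - math.sin(x ** 2) + integral

assert abs(numeric - formula) < 1e-6
```

The test point $x = 1.3$ is arbitrary; any $x > 1$ (so that the limits satisfy $x < x^2$) works the same way.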