Bellman-Gronwall Lemma

Suppose $u, v \geq 0$ are scalar, real-valued functions and $c$ is a positive constant. If $u(t) \leq c + \int_0^t u(\tau)v(\tau)\,d\tau$, then $u(t) \leq c\, e^{\int_0^t v(\tau)d\tau}$.

Remark: The lemma converts an implicit inequality for $u$ into an explicit one.

Application: Used to prove uniqueness of solutions for linear ODEs. For $\dot{x} = A(t)x$, if $w(t)$ and $v(t)$ are two solutions with $w(t_0) = v(t_0) = x_0$, then defining the error $e(t) = w(t) - v(t)$, the lemma shows $||e(t)||_2 = 0$ for all $t$, implying uniqueness.

Example: If $u(t) \leq 5 + \int_0^t u(\tau) \cdot 2\, d\tau$, then $u(t) \leq 5e^{\int_0^t 2\, d\tau} = 5e^{2t}$.

Asymptotic Stability of $\dot{x} = Ax$

Definitions.
Stable: For each $\epsilon > 0$, there exists $\delta > 0$ such that $||x(0)|| < \delta$ implies $||x(t)|| < \epsilon$ for all $t \geq 0$.
Asymptotically Stable: Stable AND $\lim_{t \to \infty} ||x(t)|| = 0$.

Remarks: Stability properties are independent of the chosen vector norm. Time-invariant linear systems that are asymptotically stable are also exponentially stable ($||x(t)|| \leq c e^{-\alpha t} ||x(0)||$).

Theorem: The system $\dot{x} = Ax$ is asymptotically stable if and only if the real parts of all eigenvalues of $A$ are strictly negative ($\text{Re}(\lambda_i(A)) < 0$ for all $i$); such a matrix $A$ is called Hurwitz.

Proof (Necessity): If $A$ has an eigenvalue $\lambda$ with $\text{Re}(\lambda) \geq 0$, then for $x(0) = v$ (the corresponding eigenvector), $x(t) = e^{\lambda t}v$, so $||x(t)|| = e^{\text{Re}(\lambda)t} ||v||$, which does not decay to zero.

Example: For $A = \begin{pmatrix} -1 & 0 \\ 0 & -2 \end{pmatrix}$, the eigenvalues are $-1, -2$. Both have $\text{Re}(\lambda) < 0$, so the system is asymptotically stable.

Example: For $A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$, the eigenvalues are $\pm i$. Since $\text{Re}(\lambda) = 0$, the system is stable but not asymptotically stable (oscillatory behavior).

Lyapunov Stability Theory

Motivation (Damped Spring-Mass System): For $m\ddot{x} + c\dot{x} + kx = 0$, define the energy $V(t) = \frac{1}{2}m\dot{x}^2(t) + \frac{1}{2}kx^2(t)$.
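The eigenvalue condition in the theorem is easy to check numerically. A minimal sketch (using numpy, not part of the original notes) applied to the two example matrices above:

```python
import numpy as np

def is_hurwitz(A):
    """Return True if all eigenvalues of A have strictly negative real part."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

A1 = np.array([[-1.0, 0.0], [0.0, -2.0]])   # eigenvalues -1, -2
A2 = np.array([[0.0, 1.0], [-1.0, 0.0]])    # eigenvalues +/- i

print(is_hurwitz(A1))  # True: asymptotically stable
print(is_hurwitz(A2))  # False: Re(lambda) = 0, stable but not asymptotically
```

Note that for $A_2$ the test correctly rejects asymptotic stability even though the system is (marginally) stable; the eigenvalue test alone cannot distinguish marginal stability from instability when $\text{Re}(\lambda) = 0$.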
Its time derivative along trajectories is $\dot{V}(t) = -c\dot{x}^2(t) \leq 0$ (non-positive). For $c > 0$ the energy decays, so $V(t) \to 0$ as $t \to \infty$, indicating asymptotic stability.

Generalized Lyapunov Function: Define $V(x) = x^*Px$ for a positive definite matrix $P$. Along trajectories of $\dot{x} = Ax$, the time derivative is $\dot{V}(x) = x^*(A^*P + PA)x$. If $A^*P + PA = -Q$ for a positive definite $Q$, then $\dot{V}(x) = -x^*Qx < 0$ for all $x \neq 0$. This implies $V(x(t)) \to 0$ as $t \to \infty$, demonstrating asymptotic stability.

Lyapunov Theorem: The differential equation $\dot{x} = Ax$ is asymptotically stable if and only if for any $Q = Q^* > 0$, there exists a unique $P = P^* > 0$ that satisfies the algebraic Lyapunov equation: $$A^*P + PA = -Q$$

Proof (Sufficiency): If $P, Q > 0$ satisfy $A^*P + PA = -Q$, and $\lambda$ is an eigenvalue of $A$ with eigenvector $v$, then $v^*(A^*P + PA)v = -v^*Qv$, which simplifies to $2\text{Re}(\lambda)\,v^*Pv = -v^*Qv$. Since $P, Q > 0$, both $v^*Pv > 0$ and $v^*Qv > 0$, implying $\text{Re}(\lambda) < 0$.

Rate of Convergence: For an asymptotically stable system, $V(x(t)) \leq V(x(0))e^{-\frac{\lambda_{\min}(Q)}{\lambda_{\max}(P)}t}$, which implies $||x(t)||_2 \leq \sqrt{\kappa(P)}\, ||x(0)||_2\, e^{-\frac{\lambda_{\min}(Q)}{2\lambda_{\max}(P)}t}$.

Proof (Necessity): If $\dot{x} = Ax$ is asymptotically stable, then $||e^{At}|| \leq m e^{-\alpha t}$ for some $m, \alpha > 0$. Define $P = \int_0^\infty e^{A^*t}Qe^{At}\, dt$; this integral converges due to exponential stability. Then $$A^*P + PA = \int_0^\infty (A^*e^{A^*t}Qe^{At} + e^{A^*t}Qe^{At}A)\, dt = \int_0^\infty \frac{d}{dt}(e^{A^*t}Qe^{At})\, dt = [e^{A^*t}Qe^{At}]_0^\infty = 0 - Q = -Q.$$

Example: For $A = \begin{pmatrix} -1 & 0 \\ 0 & -2 \end{pmatrix}$, let $Q = I$. Solving $A^*P + PA = -I$ (for a diagonal $A$, entry-wise: $2\lambda_i p_{ii} = -1$) gives $P = \begin{pmatrix} 1/2 & 0 \\ 0 & 1/4 \end{pmatrix}$, which is positive definite. This confirms asymptotic stability.
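The Lyapunov equation is linear in the entries of $P$, so for real $A$ it can be solved by vectorization. A minimal numpy sketch for the example above (scipy's `solve_continuous_lyapunov` would also work; the helper name is ours):

```python
import numpy as np

def solve_lyapunov(A, Q):
    """Solve A^T P + P A = -Q for P by vectorizing the linear equation.

    Uses vec(A X B) = (A kron B^T) vec(X) for row-major (C-order) flattening.
    """
    n = A.shape[0]
    I = np.eye(n)
    M = np.kron(A.T, I) + np.kron(I, A.T)  # maps vec(P) to vec(A^T P + P A)
    return np.linalg.solve(M, -Q.flatten()).reshape(n, n)

A = np.array([[-1.0, 0.0], [0.0, -2.0]])
Q = np.eye(2)
P = solve_lyapunov(A, Q)
print(P)                                  # diag(1/2, 1/4)
print(np.all(np.linalg.eigvalsh(P) > 0))  # True: P > 0, so A is Hurwitz
```

The vectorized system is uniquely solvable exactly when no two eigenvalues of $A$ sum to zero, which holds whenever $A$ is Hurwitz.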
Observability

Definition: A system $\dot{x} = Ax$, $y = Cx$ is observable if for any unknown initial condition $x(0)$, there exists a finite time $t_1 > 0$ such that measurement of the output $y$ over $[0, t_1]$ suffices to uniquely determine $x(0)$. Otherwise, it is unobservable.

Observability Gramian: If $A$ is asymptotically stable, the observability gramian $G_o(\infty)$ is the unique solution to the Lyapunov equation: $$A^*G_o(\infty) + G_o(\infty)A = -C^*C$$ The system is observable if and only if $G_o(\infty) > 0$. More generally, $G_o(t_1) = \int_0^{t_1} e^{A^*\tau}C^*Ce^{A\tau}\,d\tau$; if $G_o(t_1) > 0$, then $x(0)$ can be uniquely determined from $y$ on $[0, t_1]$.

Unobservable Subspace: The unobservable subspace $S_o$ is $N(G_o(t))$ for any $t > 0$. If $x_0 \in S_o$, then $y(t) = 0$ for all $t \geq 0$. $S_o$ is an $A$-invariant subspace.

Observability Matrix: The observability matrix $\mathcal{O}$ is given by: $$\mathcal{O} = \begin{pmatrix} C \\ CA \\ CA^2 \\ \vdots \\ CA^{n-1} \end{pmatrix}$$ The system is observable if and only if $\text{rank}(\mathcal{O}) = n$. The unobservable subspace is $S_o = N(\mathcal{O})$.

Example: For $A = \begin{pmatrix} 0 & 1 \\ -2 & -3 \end{pmatrix}$, $C = \begin{pmatrix} 1 & 0 \end{pmatrix}$: $CA = \begin{pmatrix} 1 & 0 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ -2 & -3 \end{pmatrix} = \begin{pmatrix} 0 & 1 \end{pmatrix}$, so $\mathcal{O} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$. Since $\text{rank}(\mathcal{O}) = 2 = n$, the system is observable.

PBH Eigenvector Test for Observability: The system is unobservable if and only if there exists an eigenvector $v$ of $A$, associated with some eigenvalue $\lambda$, such that $Cv = 0$. For repeated eigenvalues, check all associated eigenvectors.

Example: If $A = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$, $C = \begin{pmatrix} 0 & 1 \end{pmatrix}$, the eigenvector for $\lambda = 1$ is $v = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$.
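The rank test for the worked example can be verified numerically; a sketch (the helper function is ours, not from the notes):

```python
import numpy as np

def observability_matrix(A, C):
    """Stack C, CA, ..., CA^{n-1} row-wise."""
    n = A.shape[0]
    blocks = [C]
    for _ in range(n - 1):
        blocks.append(blocks[-1] @ A)
    return np.vstack(blocks)

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
C = np.array([[1.0, 0.0]])
Ob = observability_matrix(A, C)
print(Ob)                              # [[1, 0], [0, 1]]
print(np.linalg.matrix_rank(Ob) == 2)  # True: rank n, hence observable
```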
Then $Cv = \begin{pmatrix} 0 & 1 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \end{pmatrix} = 0$, so the system is unobservable (the state $x_1$ cannot be observed). The state $x_2$ is observable, since for $v_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$, $Cv_2 = 1 \neq 0$.

PBH Rank Test for Observability: A system is observable if and only if $\text{rank}\begin{pmatrix} sI - A \\ C \end{pmatrix} = n$ for all $s \in \mathbb{C}$. It suffices to check this condition for $s$ in the spectrum of $A$, since the rank of $sI - A$ alone is $n$ whenever $s$ is not an eigenvalue.

Canonical Decomposition for Unobservable Systems: If $(C, A)$ is unobservable, there exists a similarity transformation $T$ such that in the new coordinates $z = T^{-1}x$, the system matrices take the form: $$\dot{z} = \begin{pmatrix} A_{11} & 0 \\ A_{21} & A_{22} \end{pmatrix} z + \begin{pmatrix} B_1 \\ B_2 \end{pmatrix} u$$ $$y = \begin{pmatrix} C_1 & 0 \end{pmatrix} z + Du$$ The unobservable subspace $S_o$ is spanned by the last $n - r$ columns of $T$, where $r = \text{rank}(\mathcal{O})$. The pair $(C_1, A_{11})$ is observable. The transfer function is $y(s)/u(s) = C_1(sI - A_{11})^{-1}B_1 + D$, showing that unobservable states can be removed without affecting input-output behavior.

Controllability

Definition: A system $\dot{x} = Ax + Bu$ is controllable if for any terminal state $x_1$, there exists an input $u(t)$ defined on an interval $[0, t_1]$ that transfers the state from the origin at $t = 0$ to $x_1$ at $t_1$. Otherwise, it is uncontrollable. The terminal state is given by $x_1 = \int_0^{t_1} e^{A(t_1-\tau)}Bu(\tau)\,d\tau$.

Controllability Gramian: If $A$ is asymptotically stable, the controllability gramian $G_c(\infty)$ is the unique solution to the Lyapunov equation: $$AG_c(\infty) + G_c(\infty)A^* = -BB^*$$ The system is controllable if and only if $G_c(\infty) > 0$. More generally, $G_c(t_1) = \int_0^{t_1} e^{A\tau}BB^*e^{A^*\tau}\,d\tau$; if $G_c(t_1) > 0$, any state $x_1$ can be reached. The set of states reachable from the origin is $R(G_c(t))$.
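For a stable pair, the infinite-horizon controllability gramian can be computed directly from its Lyapunov equation by vectorization. A numpy sketch for the Hurwitz pair $A = \begin{pmatrix} 0 & 1 \\ -2 & -3 \end{pmatrix}$, $B = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$ (the helper is ours):

```python
import numpy as np

def controllability_gramian(A, B):
    """Solve A G + G A^T = -B B^T for the infinite-horizon gramian G.

    Valid when A is Hurwitz; solved by vectorizing the linear equation,
    using vec(A X B) = (A kron B^T) vec(X) for row-major flattening.
    """
    n = A.shape[0]
    I = np.eye(n)
    M = np.kron(A, I) + np.kron(I, A)  # maps vec(G) to vec(A G + G A^T)
    return np.linalg.solve(M, -(B @ B.T).flatten()).reshape(n, n)

A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # eigenvalues -1, -2: Hurwitz
B = np.array([[0.0], [1.0]])
G = controllability_gramian(A, B)
print(np.all(np.linalg.eigvalsh(G) > 0))  # True: G > 0, so (A, B) controllable
```

Working the equation out by hand for this pair gives $G_c(\infty) = \mathrm{diag}(1/12,\, 1/6)$, which is indeed positive definite.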
Controllable Subspace: The controllable subspace is $S_c = R(G_c(t))$ for any $t > 0$. $S_c$ is the smallest $A$-invariant subspace containing $R(B)$.

Controllability Matrix: The controllability matrix $\mathcal{C}$ is given by: $$\mathcal{C} = \begin{pmatrix} B & AB & A^2B & \dots & A^{n-1}B \end{pmatrix}$$ The system is controllable if and only if $\text{rank}(\mathcal{C}) = n$. The controllable subspace is $S_c = R(\mathcal{C})$.

Example: For $A = \begin{pmatrix} 0 & 1 \\ -2 & -3 \end{pmatrix}$, $B = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$: $AB = \begin{pmatrix} 0 & 1 \\ -2 & -3 \end{pmatrix} \begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ -3 \end{pmatrix}$, so $\mathcal{C} = \begin{pmatrix} 0 & 1 \\ 1 & -3 \end{pmatrix}$. Since $\text{rank}(\mathcal{C}) = 2 = n$, the system is controllable.

PBH Eigenvector Test for Controllability: The system is uncontrollable if and only if there exists a left eigenvector $w^*$ of $A$, associated with some eigenvalue $\lambda$, such that $w^*B = 0$.

Example: If $A = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$, $B = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$, the left eigenvector for $\lambda = 1$ is $w^* = \begin{pmatrix} 1 & 0 \end{pmatrix}$, and $w^*B = \begin{pmatrix} 1 & 0 \end{pmatrix} \begin{pmatrix} 0 \\ 1 \end{pmatrix} = 0$. Thus, the system is uncontrollable (the state $x_1$ cannot be affected by the input). The state $x_2$ is controllable, since for $w_2^* = \begin{pmatrix} 0 & 1 \end{pmatrix}$, $w_2^*B = 1 \neq 0$.

PBH Rank Test for Controllability: A system is controllable if and only if $\text{rank}\begin{pmatrix} sI - A & B \end{pmatrix} = n$ for all $s \in \mathbb{C}$. It suffices to check this condition for $s$ in the spectrum of $A$ (the eigenvalues), since the rank can only drop there.
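The PBH rank test applied to both controllability examples above can be sketched as follows (helper name ours):

```python
import numpy as np

def pbh_controllable(A, B):
    """PBH test: rank [sI - A, B] = n at every eigenvalue s of A."""
    n = A.shape[0]
    for s in np.linalg.eigvals(A):
        M = np.hstack([s * np.eye(n) - A, B])
        if np.linalg.matrix_rank(M) < n:
            return False  # a left eigenvector annihilates B: uncontrollable mode
    return True

# Controllable example: rank stays n at both eigenvalues -1, -2
A1 = np.array([[0.0, 1.0], [-2.0, -3.0]])
B1 = np.array([[0.0], [1.0]])
print(pbh_controllable(A1, B1))  # True

# Uncontrollable example: left eigenvector (1 0) for s = 1 annihilates B
A2 = np.array([[1.0, 0.0], [0.0, 2.0]])
B2 = np.array([[0.0], [1.0]])
print(pbh_controllable(A2, B2))  # False
```

The analogous observability test stacks $\begin{pmatrix} sI - A \\ C \end{pmatrix}$ with `np.vstack` instead.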
Canonical Decomposition for Uncontrollable Systems: If $(A, B)$ is uncontrollable, there exists a similarity transformation $T$ such that in the new coordinates $z = T^{-1}x$, the system matrices take the form: $$\dot{z} = \begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix} z + \begin{pmatrix} B_1 \\ 0 \end{pmatrix} u$$ $$y = \begin{pmatrix} C_1 & C_2 \end{pmatrix} z + Du$$ The controllable subspace $S_c$ is spanned by the first $r$ columns of $T$, where $r = \text{rank}(\mathcal{C})$. The pair $(A_{11}, B_1)$ is controllable. The transfer function is $y(s)/u(s) = C_1(sI - A_{11})^{-1}B_1 + D$, showing that uncontrollable states can be removed without affecting input-output behavior.

Kalman Decomposition

The Kalman decomposition combines the canonical decompositions for uncontrollable and unobservable systems into a single theorem. There exists a similarity transform (change of coordinates) such that the state is partitioned into four subspaces: $$\dot{x} = \begin{pmatrix} A_{co} & 0 & A_{13} & 0 \\ A_{21} & A_{c\bar{o}} & A_{23} & A_{24} \\ 0 & 0 & A_{\bar{c}o} & 0 \\ 0 & 0 & A_{43} & A_{\bar{c}\bar{o}} \end{pmatrix} x + \begin{pmatrix} B_{co} \\ B_{c\bar{o}} \\ 0 \\ 0 \end{pmatrix} u$$ $$y = \begin{pmatrix} C_{co} & 0 & C_{\bar{c}o} & 0 \end{pmatrix} x + Du$$

$x_{co}$: controllable and observable
$x_{c\bar{o}}$: controllable and unobservable
$x_{\bar{c}o}$: uncontrollable and observable
$x_{\bar{c}\bar{o}}$: uncontrollable and unobservable

The system has the same transfer function as the controllable and observable subsystem: $$\dot{z} = A_{co}z + B_{co}u$$ $$y = C_{co}z + Du$$
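The claim that only the controllable-and-observable part appears in the transfer function can be spot-checked numerically. A sketch using the uncontrollable pair from the PBH example ($A = \mathrm{diag}(1, 2)$, $B = (0,\ 1)^T$) with an assumed output $C = (1\ \ 1)$ and $D = 0$ (our choice, for illustration): the full transfer function should match that of the controllable subsystem $A_{11} = 2$, $B_1 = 1$, $C_1 = 1$.

```python
import numpy as np

def tf(A, B, C, D, s):
    """Evaluate the transfer function C (sI - A)^{-1} B + D at the point s."""
    n = A.shape[0]
    return C @ np.linalg.solve(s * np.eye(n) - A, B) + D

A = np.array([[1.0, 0.0], [0.0, 2.0]])   # x1 is uncontrollable
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 1.0]])               # assumed output matrix, for illustration
D = np.array([[0.0]])

# Controllable subsystem: keep only the x2 dynamics
A11, B1, C1 = np.array([[2.0]]), np.array([[1.0]]), np.array([[1.0]])

for s in [3.0, 5.0, -1.0 + 2.0j]:        # sample points away from the poles
    print(np.allclose(tf(A, B, C, D, s), tf(A11, B1, C1, D, s)))  # True each time
```

Both evaluate to $1/(s-2)$: the uncontrollable mode at $s = 1$ never appears as a pole, exactly as the decomposition predicts.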