1. Asymptotic Analysis

- Big O Notation ($O$): Upper bound. $O(g(n))$ means $f(n) \le c \cdot g(n)$ for all $n \ge n_0$.
- Big Omega Notation ($\Omega$): Lower bound. $\Omega(g(n))$ means $f(n) \ge c \cdot g(n)$ for all $n \ge n_0$.
- Big Theta Notation ($\Theta$): Tight bound. $\Theta(g(n))$ means $c_1 \cdot g(n) \le f(n) \le c_2 \cdot g(n)$ for all $n \ge n_0$.
- Common Complexities:
  - $O(1)$: Constant
  - $O(\log n)$: Logarithmic
  - $O(n)$: Linear
  - $O(n \log n)$: Linearithmic
  - $O(n^2)$: Quadratic
  - $O(2^n)$: Exponential
  - $O(n!)$: Factorial

2. Arrays

- Description: Contiguous memory block, fixed size (usually).
- Access: $O(1)$ by index.
- Insertion/Deletion: $O(n)$ in the worst case (shifting elements).
- Dynamic Array (ArrayList/Vector): Resizable array. Amortized $O(1)$ for append, $O(n)$ for a resize.

3. Linked Lists

- Description: Nodes containing data and a pointer to the next node.
- Types:
  - Singly: Node $\rightarrow$ Next
  - Doubly: Prev $\leftarrow$ Node $\rightarrow$ Next
  - Circular: Last node points to the first.
- Access: $O(n)$ (traverse from the head).
- Insertion/Deletion: $O(1)$ if the position is known, $O(n)$ to find the position.

4. Stacks

- Description: LIFO (Last In, First Out) data structure.
- Operations:
  - Push: Add an element to the top. $O(1)$.
  - Pop: Remove the element from the top. $O(1)$.
  - Peek/Top: Get the top element without removing it. $O(1)$.
  - isEmpty: Check if the stack is empty. $O(1)$.
- Implementation: Array or Linked List.

5. Queues

- Description: FIFO (First In, First Out) data structure.
- Operations:
  - Enqueue: Add an element to the rear. $O(1)$.
  - Dequeue: Remove the element from the front. $O(1)$.
  - Front/Peek: Get the front element without removing it. $O(1)$.
  - isEmpty: Check if the queue is empty. $O(1)$.
- Implementation: Array (circular array for efficiency) or Linked List.

6. Trees

- Description: Hierarchical data structure; nodes connected by edges.
- Terminology: Root, Parent, Child, Sibling, Leaf, Depth, Height.
- Binary Tree: Each node has at most two children.
- Binary Search Tree (BST): Left child $\le$ Parent $\le$ Right child.
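The BST ordering property can be sketched as a minimal, unbalanced implementation (class and function names here are my own, not a standard API):

```python
class Node:
    """A BST node: keys in the left subtree are <= key, keys in the right are >."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Insert key and return the (possibly new) subtree root. O(h) comparisons."""
    if root is None:
        return Node(key)
    if key <= root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    """Return True if key is in the tree; follows one root-to-leaf path, O(h)."""
    while root is not None:
        if key == root.key:
            return True
        root = root.left if key < root.key else root.right
    return False

def inorder(root):
    """Left -> Root -> Right; yields the keys in sorted order for a BST."""
    if root is not None:
        yield from inorder(root.left)
        yield root.key
        yield from inorder(root.right)
```

Inserting 5, 3, 8, 1 and walking `inorder` yields the keys in sorted order: 1, 3, 5, 8.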
- Search/Insert/Delete: $O(h)$ where $h$ is the height. Worst case $O(n)$ (skewed tree).
- Balanced BSTs (AVL, Red-Black Tree): Maintain $O(\log n)$ height.
- Tree Traversals:
  - Inorder: Left $\rightarrow$ Root $\rightarrow$ Right (sorted order for a BST).
  - Preorder: Root $\rightarrow$ Left $\rightarrow$ Right (copy a tree's structure).
  - Postorder: Left $\rightarrow$ Right $\rightarrow$ Root (delete a tree).
  - Level Order: BFS using a queue.

7. Heaps

- Description: Complete binary tree satisfying the heap property.
- Types:
  - Min-Heap: Parent $\le$ Children.
  - Max-Heap: Parent $\ge$ Children.
- Operations:
  - Insert: $O(\log n)$.
  - Delete Min/Max: $O(\log n)$.
  - Heapify: Build a heap from an array. $O(n)$.
- Applications: Priority Queues, Heap Sort.

8. Hash Tables

- Description: Maps keys to values using a hash function.
- Average Case: Insert, Delete, Search: $O(1)$.
- Worst Case: $O(n)$ (due to collisions).
- Collision Resolution:
  - Chaining: Use linked lists for colliding elements.
  - Open Addressing: Linear Probing, Quadratic Probing, Double Hashing.
- Load Factor ($\alpha$): $\frac{\text{Number of entries}}{\text{Number of buckets}}$. Resize when $\alpha$ exceeds a threshold.

9. Graphs

- Description: Set of vertices (nodes) and edges (connections).
- Terminology: Directed/Undirected, Weighted/Unweighted, Cycle, Path, Degree.
- Representations:
  - Adjacency Matrix: $A[i][j] = 1$ if edge $(i, j)$ exists. $O(V^2)$ space.
  - Adjacency List: Array of lists, where $Adj[u]$ contains the neighbors of $u$. $O(V+E)$ space.
- Graph Traversal:
  - Breadth-First Search (BFS): Uses a Queue. Finds shortest paths in unweighted graphs. $O(V+E)$.
  - Depth-First Search (DFS): Uses a Stack (or recursion). $O(V+E)$.

10. Sorting Algorithms

| Algorithm      | Time Complexity (Avg) | Time Complexity (Worst) | Space Complexity  | Stable |
|----------------|-----------------------|-------------------------|-------------------|--------|
| Bubble Sort    | $O(n^2)$              | $O(n^2)$                | $O(1)$            | Yes    |
| Selection Sort | $O(n^2)$              | $O(n^2)$                | $O(1)$            | No     |
| Insertion Sort | $O(n^2)$              | $O(n^2)$                | $O(1)$            | Yes    |
| Merge Sort     | $O(n \log n)$         | $O(n \log n)$           | $O(n)$            | Yes    |
| Quick Sort     | $O(n \log n)$         | $O(n^2)$                | $O(\log n)$ (Avg) | No     |
| Heap Sort      | $O(n \log n)$         | $O(n \log n)$           | $O(1)$            | No     |

11. Searching Algorithms

- Linear Search: Iterate through the elements one by one. Time Complexity: $O(n)$.
- Binary Search: Requires sorted data. Divides the search space in half at each step. Time Complexity: $O(\log n)$.

12. Dynamic Programming

- Description: Solves complex problems by breaking them into simpler overlapping subproblems and storing their results.
- Key Concepts:
  - Optimal Substructure: The optimal solution to the problem contains optimal solutions to its subproblems.
  - Overlapping Subproblems: The same subproblems are encountered multiple times.
- Techniques:
  - Memoization (Top-down): Store the results of expensive function calls and return the cached result when the same inputs occur again. (Recursion + Caching)
  - Tabulation (Bottom-up): Solve all related subproblems first, typically by filling up a table. (Iteration)
- Examples: Fibonacci sequence, Longest Common Subsequence, Knapsack Problem.

13. Greedy Algorithms

- Description: Makes the locally optimal choice at each stage in the hope of finding a global optimum.
- Characteristics: Does not always guarantee a globally optimal solution.
- Examples: Dijkstra's algorithm, Prim's/Kruskal's algorithms (MST), Activity Selection Problem.

14. Backtracking

- Description: An algorithmic paradigm that builds a solution incrementally, one piece at a time, removing candidates that fail to satisfy the constraints ("backtracking") and trying other options.
- Examples: N-Queens problem, Sudoku Solver, generating all permutations/combinations.

15. Recursion

- Description: A function calling itself. Requires a base case and a recursive step.
- Tail Recursion: The recursive call is the last operation in the function.
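As an illustrative sketch (function names are my own), plain vs. tail-recursive factorial — in the tail-recursive version the partial result travels in an accumulator, so the recursive call is the final operation:

```python
def factorial(n):
    """Plain recursion: the multiply happens AFTER the recursive call returns,
    so every pending frame must stay on the stack."""
    if n <= 1:
        return 1
    return n * factorial(n - 1)

def factorial_tail(n, acc=1):
    """Tail recursion: the recursive call is the last operation; the running
    product is carried in `acc`. (Note: CPython does not perform tail-call
    optimization, but many compilers rewrite this shape into a loop.)"""
    if n <= 1:
        return acc
    return factorial_tail(n - 1, acc * n)
```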
  Can be optimized by compilers into an iterative loop (tail-call optimization).
- Space Complexity: $O(h)$, where $h$ is the maximum depth of the recursion stack.
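The memoization/tabulation contrast from the dynamic-programming section above can be sketched on the Fibonacci example (function names are my own):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    """Top-down: plain recursion plus a cache, so each subproblem is solved once."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

def fib_tab(n):
    """Bottom-up: iterate from the smallest subproblem upward, keeping only the
    last two values instead of a full table."""
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(n - 1):
        prev, curr = curr, prev + curr
    return curr
```

Both versions run in $O(n)$, whereas the naive doubly recursive definition takes exponential time because it recomputes the same subproblems repeatedly.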