Solving Recurrences

Week 4, Tuesday (Video)

January 27, 2026

Introduction

What This Video Covers

We continue the discussion from class: how to read a recurrence off of recursive code, how to visualize it as a recursion tree, and which running times the common recurrence forms produce.

Merge Sort Runtime Analysis

def merge_sort(arr):
    if len(arr) <= 1:          # O(1)
        return arr
    mid = len(arr) // 2        # O(1)
    left = merge_sort(arr[:mid])    # T(n/2)
    right = merge_sort(arr[mid:])   # T(n/2)
    return merge(left, right)       # O(n)
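
The merge helper isn't shown here; a minimal sketch (my own, the usual two-pointer merge) makes the \(O(n)\) cost visible, since every element is copied exactly once:

def merge(left, right):
    # Two-pointer merge of two already-sorted lists.
    # Each element of left and right is appended exactly once,
    # so the total work is O(len(left) + len(right)) = O(n).
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])    # at most one of these is non-empty
    result.extend(right[j:])
    return result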

The recurrence:

\[T(n) = 2T(n/2) + O(n)\]

What Does This Mean?

\[T(n) = 2T(n/2) + n\]

  • We make 2 recursive calls
  • Each on a problem of size n/2
  • Plus n work to merge

This is a recurrence relation: T(n) defined in terms of T(smaller).
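
Unrolling the recurrence a few times already hints at the answer we will confirm with the recursion tree:

\[T(n) = 2T(n/2) + n = 4T(n/4) + 2n = 8T(n/8) + 3n = \cdots = 2^k\,T(n/2^k) + kn\]

After \(k = \log_2 n\) substitutions the subproblems reach size 1, leaving \(n\,T(1) + n\log_2 n = O(n \log n)\).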

Visualizing the Recursion

The Call Tree

                merge_sort([7,1,6,4,5,3,2,8])
                        /              \
        merge_sort([7,1,6,4])    merge_sort([5,3,2,8])
            /        \              /        \
    ms([7,1])    ms([6,4])    ms([5,3])    ms([2,8])
      /  \        /  \        /  \        /  \
   [7]  [1]    [6]  [4]    [5]  [3]    [2]  [8]

How deep does this go?

Recursion Depth

At each level, problem size halves: \(n \to n/2 \to n/4 \to \ldots \to 1\)

Depth = How many times can we halve \(n\)?

\[n \to n/2 \to n/4 \to \ldots \to 1\]

Answer: \(\log_2 n\) levels
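
In symbols: after \(d\) halvings the subproblem size is \(n/2^d\), and the recursion stops when that reaches 1:

\[\frac{n}{2^d} = 1 \quad\Longrightarrow\quad d = \log_2 n\]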

Work at Each Level

Level 0:     [n]                      n work (merge)
            /   \
Level 1:  [n/2] [n/2]                 n work (two merges of n/2)
          / \   / \
Level 2: [n/4]×4                      n work (four merges of n/4)
         ...
Level log n: [1]×n                    n work (base cases)

Total work: \(n \times \log n = O(n \log n)\)
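
Written as a sum over the \(\log_2 n + 1\) levels, each contributing \(n\) work:

\[T(n) = \sum_{i=0}^{\log_2 n} n = n(\log_2 n + 1) = O(n \log n)\]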

Recurrence Forms

The Big Idea

Different recursive structures → different recurrences → different running times!

Form 1: Linear Shrinking

Example: Processing a list one element at a time

def sum_list(arr):
    if len(arr) == 0:
        return 0
    return arr[0] + sum_list(arr[1:]) 

\(T(n) = T(n-1) + O(1)\) (treating the non-recursive work in each call as constant)

Form 2: Halving

Example: Binary search

def binary_search(arr, target, lo, hi):
    if lo > hi:
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        return binary_search(arr, target, mid+1, hi)  
    else:
        return binary_search(arr, target, lo, mid-1)

\(T(n) = T(n/2) + O(1)\)

Form 3: Two Calls, Halving, Constant Work

Example: triangle_double — calls itself twice!

def triangle_double(n):
    if n <= 1:
        return n
    half = n // 2
    top = n - half
    return triangle_double(half) + triangle_double(half) + top * top

\(T(n) = 2T(n/2) + O(1)\)
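
A quick sanity check (my own, assuming the function is meant to compute the \(n\)-th triangular number \(n(n+1)/2\), which its base case and split suggest):

# Compare triangle_double against the closed form n*(n+1)//2.
for n in range(20):
    assert triangle_double(n) == n * (n + 1) // 2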

Form 4: Two Calls, Halving, Linear Work

Example: Merge sort!

def merge_sort(arr):
    ...
    left = merge_sort(arr[:mid])    
    right = merge_sort(arr[mid:])   
    return merge(left, right)       

\(T(n) = 2T(n/2) + O(n)\)

Form 5: Linear Shrinking, Linear Work

Example: Selection sort (find min, then sort rest)

def selection_sort(arr):
    if len(arr) <= 1:
        return arr
    min_idx = find_min_index(arr)  
    swap(arr, 0, min_idx)
    return [arr[0]] + selection_sort(arr[1:])  

\(T(n) = T(n-1) + O(n)\)
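
The helpers find_min_index and swap aren't defined in the video; minimal sketches (my own, matching how they are used above) would be:

def find_min_index(arr):
    # One pass over the list: this is the O(n) work done before recursing.
    min_idx = 0
    for i in range(1, len(arr)):
        if arr[i] < arr[min_idx]:
            min_idx = i
    return min_idx

def swap(arr, i, j):
    # In-place swap of two positions.
    arr[i], arr[j] = arr[j], arr[i]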

Closed Forms

Algorithm         Recurrence                   Solution
Binary search     \(T(n) = T(n/2) + O(1)\)     \(O(\log n)\)
Merge sort        \(T(n) = 2T(n/2) + O(n)\)    \(O(n \log n)\)
Linear search     \(T(n) = T(n-1) + O(1)\)     \(O(n)\)
triangle_double   \(T(n) = 2T(n/2) + O(1)\)    \(O(n)\)
Selection sort    \(T(n) = T(n-1) + O(n)\)     \(O(n^2)\)

Closed Form Reasons

Recurrence                  Solution           Why?
\(T(n) = T(n-1) + 1\)       \(O(n)\)           \(n\) levels × \(O(1)\) work
\(T(n) = T(n/2) + 1\)       \(O(\log n)\)      \(\log n\) levels × \(O(1)\) work
\(T(n) = 2T(n/2) + 1\)      \(O(n)\)           node count doubles each level; the \(n\) leaves dominate
\(T(n) = 2T(n/2) + n\)      \(O(n \log n)\)    \(\log n\) levels × \(O(n)\) work
\(T(n) = T(n-1) + n\)       \(O(n^2)\)         \(n\) levels × \(O(n)\) average work
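
These closed forms can be sanity-checked numerically: evaluate each recurrence for a concrete \(n\) and compare against the predicted growth. A small sketch (my own, with base case \(T(1) = 1\) and a power-of-two \(n\) so the halving stays exact; solve and step are hypothetical names):

def solve(step, n):
    # Numerically evaluate T(n), where step(T, m) returns the
    # right-hand side of the recurrence and T(1) = 1.
    memo = {1: 1}
    def T(m):
        if m not in memo:
            memo[m] = step(T, m)
        return memo[m]
    return T(n)

n = 1024
print(solve(lambda T, m: 2 * T(m // 2) + m, n))  # 11264 ~ n log2 n
print(solve(lambda T, m: T(m // 2) + 1, n))      # 11    ~ log2 n
print(solve(lambda T, m: 2 * T(m // 2) + 1, n))  # 2047  ~ 2n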

Summary

What We Learned Today

  1. Math vs. Time: Same functionality, different running times!

  2. Merge sort: Divide, conquer, merge

  3. MS recurrence: \(T(n) = 2T(n/2) + O(n)\)

  4. Recursion sketches: Visualize depth × work per level

  5. Recurrence forms: Different structures → different runtimes

Coming Up

  • Tuesday video: Solving recurrences formally (review for DSCI 220 folks)
  • Wednesday: Why \(O(n \log n)\) is optimal for comparison sorting