Week 10: Music, Markov, and Mazes

Last week we met graphs. This week — and for the rest of the term — we use them.

Two very different problems this week: can a computer compose music, and can a computer solve Sudoku? Beneath the surface, both reduce to the same question: how do you navigate a space of possibilities? The answer is graphs. Music generation, puzzle solving, route planning, optimization — they’re all graph problems in disguise. This week is where that realization lands.

Your Growing Toolkit

Every problem we solve uses some combination of these tools:

  • Representation — how we encode meaning
  • Collections — how we group things
  • Control flow — how we make decisions and repeat
  • Functions — how we name and reuse logic
  • Abstraction — how we hide complexity
  • Efficiency — how we measure cost

This week: Markov Chains + Depth-First Search — two elegant ways to navigate a graph of possibilities!

Note: The Rest of the Term: Graphs Applied

Weeks 10–14 are all about putting graphs to work on real problems: music generation, puzzle solving, geographic maps, route planning, optimization. The vocabulary from Week 9 (\(V\), \(E\), paths, search) is now your power tool. Every week adds a new application.


Tuesday: A Computer That Composes

Ada’s Prophecy

Portrait of Ada Lovelace.

Ada Lovelace, 1842

We opened with a remarkable quote. In 1842 — nearly a century before the first electronic computer — Ada Lovelace imagined what a programmable engine might do:

“Supposing… that the fundamental relations of pitched sounds… were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.”

She was right. This week, we built exactly that.


Step 1: Listen to the Music

Before generating music, we need to understand it — mathematically.

Sheet music showing the notes of Mary Had a Little Lamb.

The notes of Mary Had a Little Lamb

We started with a classic: Mary Had a Little Lamb. The first step was to build a tally:

Note   Count
C          3
D         10
E         11
G          2

Most notes in Mary are E or D. A random note generator that just sampled from this distribution would hit the right notes in the right proportions — but with no sense of melody.
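This tally can be built in a few lines. Here is a minimal sketch, assuming the standard 26-note transcription of the melody; the names `melody`, `tally`, and `random_notes` are illustrative:

```python
import random
from collections import Counter

# Note letters of Mary Had a Little Lamb (standard transcription)
melody = list("EDCDEEEDDDEGGEDCDEEEEDDEDC")

# Tally how often each note appears
tally = Counter(melody)  # Counter({'E': 11, 'D': 10, 'C': 3, 'G': 2})

# Sample notes independently, weighted by frequency:
# the right notes, but with no sense of melody
notes, weights = zip(*tally.items())
random_notes = random.choices(notes, weights=weights, k=8)
```

Running this a few times makes the problem audible: every sample is a legal note, yet no sample remembers what came before it.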


Step 2: Capture the Transitions

The secret ingredient isn’t which notes appear — it’s which notes follow which.

A transition table showing which notes follow which in Mary Had a Little Lamb.

Characterizing Mary by note transitions

Instead of counting each note in isolation, we count pairs: how often does E follow D? How often does D follow E? This gives us a transition table — a complete picture of the song’s structure.
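The pair-counting idea above can be sketched directly. This is an illustrative implementation, again assuming the standard 26-note transcription; `pair_counts` and `transition` are made-up names:

```python
from collections import defaultdict

melody = list("EDCDEEEDDDEGGEDCDEEEEDDEDC")

# Count pairs: for each note, how often does each other note follow it?
pair_counts = defaultdict(lambda: defaultdict(int))
for current, nxt in zip(melody, melody[1:]):
    pair_counts[current][nxt] += 1

# Normalize each row into probabilities -- the transition table
transition = {
    note: {nxt: n / sum(row.values()) for nxt, n in row.items()}
    for note, row in pair_counts.items()
}
```

By construction, every row of `transition` sums to 1: given the current note, *some* next note must follow.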

Tip: Fill in the blank

The transition table stores, for each note, the probability that the next note will be each possible note. Given you just played D, what are the odds the next note is E? D again? C?

This is all the information you need to generate new music that “sounds like” the original!


Step 3: Build the Generator

Diagram showing the music generator algorithm using a transition table.

The music generator algorithm

The algorithm is beautifully simple:

  1. Start on any note
  2. Look up that note’s row in the transition table
  3. Pick the next note randomly, weighted by the transition probabilities
  4. Repeat

The result is a sequence of notes that respects the statistical structure of the original melody — not a copy, but a plausible variation.
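The four steps above can be sketched as a short random walk. This is an illustrative version, not the exact class code; the transition probabilities below are derived from the song's note pairs:

```python
import random

# Transition table for Mary Had a Little Lamb; each row sums to 1
transition = {
    "C": {"D": 1.0},
    "D": {"C": 0.3, "D": 0.3, "E": 0.4},
    "E": {"D": 5/11, "E": 5/11, "G": 1/11},
    "G": {"E": 0.5, "G": 0.5},
}

def generate(transition, start, length):
    """Walk the transition table: draw each next note from the
    current note's row, weighted by its probabilities."""
    song = [start]
    for _ in range(length - 1):
        row = transition[song[-1]]
        nxt = random.choices(list(row), weights=list(row.values()))[0]
        song.append(nxt)
    return song

song = generate(transition, "E", 20)
```

Every pair of consecutive notes in `song` is a transition that actually occurs in the original melody — which is exactly why the output sounds plausible.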

The music generator producing a sequence of notes one at a time.

Building up a song step by step

A complete sequence of notes output by the generator.

A complete generated song

The Big Idea: Markov Chains

What we just built has a name. A Markov chain is a random process where:

The next state depends only on the current state — not on any history before it.

This “memoryless” property is what makes Markov chains tractable. We don’t need to remember the whole song so far — just the current note.

Important: Markov Chains Are Everywhere
  • 🔍 PageRank — Google’s original search algorithm models a random web surfer following links. Pages with high probability of being visited get high rank.
  • 💬 Natural language processing — Some words are more likely to follow others. “I just ate the whole desert” is probably a spelling error. “For dinner I ___” is probably completed with “ate”.
  • 🧬 DNA sequencing — Nucleotide transitions follow statistical patterns used for alignment and error correction.
  • ⚗️ Chemical reaction simulation — Reaction probabilities model which molecules form next.

Markov Chains Are Graphs

Here’s the most elegant part. A transition table like ours can be read as an adjacency matrix — the representation of a graph!

A transition table showing note-to-note probabilities, interpretable as a graph adjacency matrix.

The transition table as an adjacency matrix
Tip: Fill in the blank

Vertices: Each musical note (C, D, E, G, …) is a vertex.

Edges: There’s a directed edge from note \(u\) to note \(v\) if the transition probability from \(u\) to \(v\) is non-zero — i.e., \(v\) ever follows \(u\) in the song.

Special characteristics: The graph is directed (edges have direction — “E follows D” is different from “D follows E”) and weighted (each edge carries a probability). The probabilities on outgoing edges from each vertex sum to 1.
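Reading the table as an adjacency matrix is one dictionary comprehension away. A minimal sketch, reusing the same illustrative transition probabilities as before:

```python
# Transition table for Mary Had a Little Lamb (rows sum to 1)
transition = {
    "C": {"D": 1.0},
    "D": {"C": 0.3, "D": 0.3, "E": 0.4},
    "E": {"D": 5/11, "E": 5/11, "G": 1/11},
    "G": {"E": 0.5, "G": 0.5},
}

notes = ["C", "D", "E", "G"]

# Weighted adjacency matrix: entry [u][v] is the edge weight u -> v,
# and 0.0 means there is no edge at all
matrix = [[transition[u].get(v, 0.0) for v in notes] for u in notes]

# Each row lists one vertex's outgoing edges; the weights sum to 1
row_sums = [sum(row) for row in matrix]
```

The zero entries are just as informative as the non-zero ones: a zero says that transition never happens, i.e. that edge is missing from the graph.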

A random walk on this graph is the Markov chain. See it in action at setosa.io/markov.