This class was an overview of probabilistic modeling and learning: Bayesian networks, Markov models, HMMs, EM, and information theory (entropy, KL divergence, mutual information). What made it especially fun was how abstract math kept turning into very concrete little systems.

Some of the more interesting/unique bits:

Using the knowledge from this class, I built a Wordle solver that uses letter frequencies and positional information to pick the guess with the highest expected information gain on each turn; a sketch of the core computation is below.

[Wordle solver screenshot]

The solver's source is available on GitHub.
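To make the information-gain idea concrete, here is a minimal sketch of how such a solver can score guesses (not the actual solver's code; the function names and the tiny word list are illustrative). For each candidate guess, compute the distribution of Wordle feedback patterns it would produce over the remaining possible answers, and score the guess by the entropy of that distribution, i.e., the expected number of bits the feedback reveals.

```python
import math
from collections import Counter

def feedback(guess: str, answer: str) -> tuple:
    """Wordle feedback pattern: 2 = green, 1 = yellow, 0 = gray."""
    result = [0] * len(guess)
    unmatched = Counter()
    # Pass 1: mark greens; tally answer letters not matched in place.
    for i, (g, a) in enumerate(zip(guess, answer)):
        if g == a:
            result[i] = 2
        else:
            unmatched[a] += 1
    # Pass 2: mark yellows, consuming unmatched letters (handles duplicates).
    for i, g in enumerate(guess):
        if result[i] == 0 and unmatched[g] > 0:
            result[i] = 1
            unmatched[g] -= 1
    return tuple(result)

def expected_info_gain(guess: str, candidates: list[str]) -> float:
    """Entropy (in bits) of the feedback distribution a guess induces
    over the remaining candidate answers."""
    counts = Counter(feedback(guess, answer) for answer in candidates)
    n = len(candidates)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def best_guess(guesses: list[str], candidates: list[str]) -> str:
    """Pick the guess whose feedback distribution has maximum entropy."""
    return max(guesses, key=lambda g: expected_info_gain(g, candidates))

# Toy example: choose the most informative opener from a tiny answer list.
answers = ["crane", "slate", "trace", "crate", "caret"]
print(best_guess(answers, answers))
```

Note that maximizing feedback entropy captures the letter-frequency and positional heuristics automatically: a guess built from common letters in common positions splits the candidate set most evenly, which is exactly what maximizes the entropy of the resulting partition.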