The DRP is running in Spring Term 2023!

Stay tuned for updates!

Markov Chain Monte Carlo and Bayesian Computation

Mentor: Jonathan Lindbloom

The use of Markov Chain Monte Carlo (MCMC) methods for solving problems arising in Bayesian computation is ubiquitous. Our guiding question for this DRP is: how can we characterize a high-dimensional probability distribution for which we possess only the unnormalized density function? We will begin with an in-depth study of Monte Carlo methods and the theoretical basis for the Metropolis-Hastings algorithm. We will then proceed with a survey of many specialized instances of MCMC algorithms, including: random walk MH, adaptive methods, Langevin algorithms, Gibbs sampling, Hamiltonian Monte Carlo, the NUTS sampler, and ensemble methods such as parallel tempering and affine-invariant samplers. To supplement our theoretical understanding, we will implement some of these methods from scratch to sample the posteriors of some simple Bayesian models. This program would be a great jumping-off point for anyone interested in the intersection of applied mathematics, computational science, and statistics.
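
To give a taste of the kind of implementation work involved, here is a minimal pure-Python sketch of random-walk Metropolis-Hastings sampling a standard normal from only its unnormalized log-density. The function names, step size, and target are illustrative choices, not prescriptions from the reading list.

```python
import math
import random

def metropolis_hastings(log_density, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings targeting an unnormalized log-density."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)  # symmetric Gaussian proposal
        # Accept with probability min(1, p(proposal)/p(x)), computed in log space.
        if math.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Unnormalized log-density of a standard normal: log p(x) = -x^2/2 + const.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=50_000)
mean = sum(samples) / len(samples)
```

Note that the normalizing constant 1/sqrt(2*pi) never appears: only ratios of densities matter, which is exactly why MCMC suits unnormalized posteriors.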

Suggested Reference: Monte Carlo Statistical Methods (Robert and Casella)

Prerequisites: Math 13, Math 22/24, Math 20/40/60, and familiarity with a programming language.

Modular Forms

Mentor: Eran Assaf

In how many ways can you represent a number as a sum of four squares? Can you hear the shape of a drum? What are all the integers x, y, z satisfying x^n + y^n = z^n for n a positive integer? Is the number 1 + 1/2^3 + 1/3^3 + 1/4^3 + … rational? All these questions (and more) are answered by the beautiful theory of modular forms, which will be explored in this reading course. We will learn about elliptic curves, modular curves, and classical modular forms, with a view towards the modularity theorem: every rational elliptic curve arises from a modular form. This program would be good preparation for those interested in pursuing research in number theory in the future.
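
The first question has a classical closed-form answer, Jacobi's four-square theorem: the number of ordered representations of n as a sum of four integer squares is 8 times the sum of the divisors of n not divisible by 4. A quick brute-force check in Python (written for clarity, not speed):

```python
import math

def r4(n):
    """Ordered ways to write n as a^2 + b^2 + c^2 + d^2 with integer a, b, c, d."""
    m = math.isqrt(n)
    return sum(1 for a in range(-m, m + 1) for b in range(-m, m + 1)
                 for c in range(-m, m + 1) for d in range(-m, m + 1)
                 if a * a + b * b + c * c + d * d == n)

def jacobi(n):
    """Jacobi's formula: 8 times the sum of divisors of n not divisible by 4."""
    return 8 * sum(d for d in range(1, n + 1) if n % d == 0 and d % 4 != 0)

assert r4(1) == 8  # (+/-1) placed in any of the four slots
```

The theory of modular forms explains *why* this formula holds: the counting function is a coefficient of a theta series, which is a modular form.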

Suggested References: A First Course in Modular Forms (Diamond and Shurman), Modular Forms, a Computational Approach (Stein)

Prerequisites: Math 25, Math 31, Math 35, Math 43

Representation Theory

Mentor: Eran Assaf

Symmetry is an important tool used throughout mathematics in various ways, and it is formalized through the notion of a group action. Groups can act on many different objects, such as sets, topological spaces, or vector spaces. An action of a group on a vector space gives rise to a representation of the group, and representation theory studies these representations. Representations have interesting applications in both physics and quantum chemistry, and it turns out that one can infer quite a lot about a group from its representations. In this program, we will embark on a journey that begins with the theory of finite-dimensional representations of finite groups and, time permitting, continues to finite-dimensional representations of Lie groups and Lie algebras.

Suggested References: Representation Theory: A First Course (Fulton and Harris), Linear Representations of Finite Groups (Serre), Representations and Characters of Groups (James and Liebeck)

Prerequisites: Math 31

Category Theory

Mentor: Richard Haburack

Category theory takes a bird's-eye view of math by considering collections of objects all at once (e.g. vector spaces, groups, topological spaces...) and the functions between them (linear transformations, group homomorphisms, continuous maps...). Furthermore, there are relationships between different collections of objects: for example, associated to every topological space is a group of paths on the space called the fundamental group, and continuous maps between topological spaces induce homomorphisms between the associated fundamental groups. Abstracting ideas like this and making them precise is at the heart of category theory. Many subjects in math require a basic understanding of the notions of category theory, and for some, such as algebraic geometry, it is essential. Whilst category theory might initially sound wilfully abstract, an introduction is full of hands-on examples and exercises.

Suggested References: Category Theory in Context (Riehl), Basic Category Theory (Leinster)

Prerequisites: Math 31

A Mathematical Foundation for Synthetic Aperture Radar

Mentor: Dylan Green

Synthetic aperture radar (SAR) imaging is a day-and-night, all-weather method of forming high-resolution images of large swaths of land and sea. SAR has many applications and is widely used by the military, academia, and private industry. The problem of SAR image formation has been an ongoing area of research in both engineering and applied mathematics for decades now, and advances are still being made to this day. In this reading course, we will develop a mathematical framework for the basic principles of SAR imaging and work our way up to forming an image using real SAR data.

Suggested Reference: Spotlight-Mode Synthetic Aperture Radar: A Signal Processing Approach (Jakowatz, Wahl, Eichel, Ghiglia, and Thompson)

Prerequisites: Math 11 or 13, Math 22 or 24, some coding. Previous exposure to Fourier transforms is preferred, but not required.

Abstract Nonsense

Mentor: Luke Askew

Come join a book club! This reading project can serve as an introduction to higher mathematics through its preferred language: category theory. We'll see how the very human ideas of objects, relationships, and analogies are made mathematically formal before embarking on a whirlwind tour of beautiful abstract algebraic structures. We'll be using a fresh-off-the-press book, which gives us the opportunity to participate in a book club with the author through the Topos Institute!

Suggested Reference: The Joy of Abstraction: An Exploration of Math, Category Theory, and Life (Cheng)

Prerequisites: None

Statistical Learning

Mentor: Zhen Chen

The program will range from statistical/machine learning basics, such as linear regression and classification, to more advanced topics, such as deep learning.
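
As a preview of the starting point, here is a minimal sketch of ordinary least squares for a single feature, using the closed-form formulas for the slope and intercept. The helper name and toy data are invented for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form, one feature)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept makes the line pass
    # through the mean point (mx, my).
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b  # (intercept, slope)

# Noise-free data on the line y = 1 + 2x recovers the coefficients exactly.
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```

The reading generalizes this in two directions: many features (matrix form of least squares) and nonlinear models such as neural networks.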

Suggested Reference: The Elements of Statistical Learning: Data Mining, Inference, and Prediction (Hastie, Tibshirani, and Friedman)

Prerequisites: Math 13, Math 22, and Math 20 or equivalent

Mathematical Art

Mentor: Brian Mintz

As students advance in their mathematical careers, they learn to solve more abstract problems by being more imaginative. However, few people get to see this creative side of mathematics, so most think math is just a series of increasingly complex calculations. In this project, we'll explore some of the many ways art has been infused with mathematical ideas, creating some of our own along the way!

Suggested References: To be determined based on participants' interests

Prerequisites: None

Quantum Information Theory

Mentor: Travis Russel

Classical information theory concerns how one can efficiently transfer information from point A to point B. Often, information is transmitted over noisy channels which can corrupt the intended message. The basic techniques for correcting these random errors were developed in an elegant theory by Claude Shannon in the 1940s, and these techniques remain a fundamental ingredient of modern computing and communication devices. With the development of quantum computing and quantum communication devices, it has become necessary to consider the efficient transfer of quantum information in the form of qubits. While qubits can store more information than classical bits, they are notoriously fragile and can be easily corrupted by noise. The fragility of qubits is the main obstacle preventing the development of powerful quantum computers today. In this reading course, we will learn what quantum information is, how it is affected by noise, and how "quantum" errors can be corrected. Along the way, we will learn the basics of quantum mechanics and quantum computing.
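
The simplest classical example of Shannon-style error correction, and the warm-up for the quantum bit-flip code treated in Nielsen and Chuang, is the three-bit repetition code: send three copies of each bit and decode by majority vote, so any single flip is corrected. A tiny sketch (function names are ours, not the book's):

```python
def encode(bit):
    """Repetition code: protect one bit by sending three copies."""
    return [bit, bit, bit]

def decode(received):
    """Majority vote corrects any single bit flip."""
    return 1 if sum(received) >= 2 else 0

codeword = encode(1)
codeword[0] ^= 1              # a noisy channel flips the first bit
assert decode(codeword) == 1  # the error is corrected
```

The quantum version is subtler: qubits cannot be copied (the no-cloning theorem), so the quantum bit-flip code entangles one logical qubit across three physical qubits and diagnoses errors without measuring the encoded state.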

Suggested Reference: Quantum Computation and Quantum Information (Nielsen and Chuang)

Prerequisites: Math 22 or Math 24. No physics background required.

Quantum Computation and Quantum Information

Mentor: Casey Dowdle

You may have heard about the recent hype around quantum computing in the news. This is your chance to study the mathematics behind quantum computing and quantum information with a graduate student who is actively doing quantum computing research. We will be reading the classic "Quantum Computation and Quantum Information" by Nielsen and Chuang. No prior background in quantum mechanics is necessary! We will cover topics such as the Bloch sphere, measurement of quantum states, quantum gates and algorithms, and quantum error correction. Along the way you will also gain a better understanding of classical computation and information theory. We will also program simple quantum algorithms using quantum circuit simulator libraries such as Braket or Qiskit.
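
Under the hood, simulators like Braket and Qiskit just do linear algebra: a qubit is a length-2 complex vector and a gate is a 2x2 unitary matrix. As a hedged sketch of the idea (not either library's actual API), here is the Hadamard gate acting on |0> in plain Python, with measurement probabilities given by the Born rule:

```python
import math

# A single-qubit state is a list of complex amplitudes in the {|0>, |1>} basis.
ket0 = [1 + 0j, 0 + 0j]

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a single-qubit state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
h = 1 / math.sqrt(2)
H = [[h, h], [h, -h]]

plus = apply_gate(H, ket0)
probs = [abs(amp) ** 2 for amp in plus]  # Born rule: |amplitude|^2
```

Measuring the resulting state gives 0 or 1 with probability 1/2 each, and applying H a second time returns the qubit to |0>, since H is its own inverse.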

Suggested Reference: Quantum Computation and Quantum Information (Nielsen and Chuang)

Prerequisites: Math 13, Math 24 (Math 40 and some familiarity with programming are helpful but not strictly required)

Deep Reinforcement Learning

Mentor: Mark Lovett

Reinforcement learning has been described as a key ingredient of true AI. Deep reinforcement learning is a relatively new and rapidly growing field of machine learning. It has been applied to many current problems and recent advances in machine learning, including but not limited to autonomous cars, the generalizability of AI, and ChatGPT. This project will introduce the student to the basics of reinforcement learning and explore the mathematics of deep reinforcement learning and its implementation in Python.
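
Before "deep" enters the picture, the core idea can be seen in tabular Q-learning, which deep RL extends by replacing the table with a neural network. Below is a minimal sketch on a toy corridor environment; the environment, reward, and all parameters are invented for illustration.

```python
import random

# A tiny corridor MDP: states 0..4, actions left/right, reward 1 at the goal.
N_STATES, GOAL = 5, 4

def step(state, action):
    """action 0 = left, 1 = right; the episode ends at the goal state."""
    nxt = max(0, state - 1) if action == 0 else min(GOAL, state + 1)
    return nxt, (1.0 if nxt == GOAL else 0.0), nxt == GOAL

def q_learning(episodes=500, alpha=0.5, gamma=0.9, seed=0):
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q[state][action]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            a = rng.randrange(2)  # explore with a uniformly random behavior policy
            s2, r, done = step(s, a)
            # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a').
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

Q = q_learning()
```

After training, the greedy policy (pick the action with the larger Q-value) walks right in every state, which is optimal here. Deep RL tackles problems where the state space is far too large for such a table.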

Suggested Reference: Deep Reinforcement Learning (Plaat)

Prerequisites: Math 13, Math 23, Math 24. Programming experience in Python is encouraged.