- “You were so excited about the material and I liked that a lot! I enjoyed the part that you covered some basic stuff in the class when you realized that there are people from engineering and science, sitting there and they have absolutely no clue about what’s going on! Thank you very much!”
- “He did a great job of trying to incorporate real-world connections to help make what we were learning have value to us. He asked us questions to help us learn for ourselves and would take the time to show us alternative solutions or alternative problems so that we would be exposed to multiple ways of analyzing problems. Thank you for an enjoyable math experience from someone who hadn’t been expecting one.”
- “He was amazing and gave helpful tips for not only this course but for future math and science courses.”
- “I hope you can find accomplishment in the fact that I have decided to double major in CS and Math because of what you have taught me.”
- “I really enjoyed the physics examples provided by Yoonsang. He created a very entertaining format to learn calculus.”
- “Extremely intelligent Yoonsang Lee gave lessons outside of math that pertained to many subjects. He related math to many applications, which at times complicated things.”
Teaching at Dartmouth College
I regularly teach the following courses at Dartmouth:
- Math 23: Differential Equations
- Math 40: Probability and Statistical Inference (every Winter term)
- Math 53/126: Partial Differential Equations (every Fall term; course developer)
- Math 76.02: Computational Inverse Problems
- Math 106: Data-driven Uncertainty Quantification (course developer)
Math 40 is a core course for the mathematical data science major.
Math 106 and 126 are core courses for applied math graduate students.
Mentoring
I was fortunate to advise the following talented undergraduate students on their honors theses.
Undergraduate Honors Thesis
- Yunjin Tong (‘22): Bayesian Inference for Stochastic Predictions of Non-Gaussian Systems with Applications in Climate Change, Spring 2022 (currently PhD student at Stanford Business)
- Matthew Landry (‘24): Non-Gaussian Bayesian Update in High-Dimensional Space for Stock Prediction, Spring 2024 (currently data scientist)
- Brian Wang (‘23): Capturing dynamical systems using deep learning and understanding optimal data size in training time series prediction models, Spring 2023 (currently data scientist)
- Nandini Prasad (‘22): Using Hierarchical Neural Networks to Predict Parkinson’s Disease Severity, Spring 2022 (currently MS student at Columbia Statistics)
- Andreas Louskos (‘21): Neural network approach for the Black-Scholes equation, Spring 2021 (currently PhD student at NYU)
I have also advised the following undergraduate research assistants.
Undergraduate Research Assistants
- Paul Chirkov (’26): Regression in high dimensional space, Spring 2024.
- Fangzhou Yu (’25): Investigation of network structures for hierarchical learning, Spring 2023.
- Daniel Carstensen (’24): NSF project Sub-linear complexity method for multiscale problems without scale separation.
- Tooryanand Seetohul (’22): Modeling a fat-tailed large-scale fluctuation in anisotropic turbulent processes, Summer 2021.
- Stephanie Finley (’23): Bayesian parameter estimation of dynamical systems, Fall 2020, Winter 2021.
- Scott Dayton (’23): Filtering non-Gaussian Systems, Summer 2020, Fall 2020.
- Zachary J Couvillion (’23): Stability of Variable Coefficient Diffusion Problems, Summer 2020.
- Winston Wang (’21): GPU-based acceleration of computations in complex dynamical models, Summer 2020.
Publication (Book)
- Y. Lee, Y. Kim, D. S., and H. C., Science for High School Students (in Korean), ETOOS, 2006, ISBN-13: 9788957352571.