Sampling Theory (Yuxin Chen)

Yuxin Chen (yuxinc@wharton.upenn.edu) is a professor of statistics and of electrical and systems engineering at the University of Pennsylvania (UPenn). Before joining UPenn, he was an assistant professor of electrical and computer engineering at Princeton. He completed his Ph.D. in Electrical Engineering at Stanford University and was a postdoctoral scholar in Statistics at Stanford. Dr. Chen’s expertise includes diffusion models, reinforcement learning theory, high-dimensional statistics, optimization, statistical learning theory, and information theory. His research has been recognized by the Alfred P. Sloan Fellowship, the SIAM Imaging Science Best Paper Prize, the ICCM Best Paper Award (Gold Medal), and the IEEE Transactions on Power Electronics Prize Paper Award (first place), and he was a finalist for the Best Paper Prize for Young Researchers in Continuous Optimization. He has also received the Google Research Scholar Award and the Amazon Research Award, as well as six teaching awards and a Princeton Graduate Mentoring Award.
Dr. Chen’s Related Work and Experience:
Dr. Chen’s recent work develops state-of-the-art convergence theory for diffusion models. For instance, his work [5] provides the first sharp convergence guarantees for the probability flow ODE sampler, while his work [1] is the first to show that the DDIM sampler adapts to the unknown low-dimensional structure of the target data distribution. Dr. Chen has presented a number of tutorials at conferences and workshops such as SIGMETRICS, the Joint Statistical Meetings, and ISIT.
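For context, the probability flow ODE and DDIM samplers referenced above generate data by running a deterministic reverse-time update driven by a learned noise (score) estimate. The sketch below is a minimal, generic illustration of one such deterministic update (a DDIM step with η = 0); it is not code from the cited papers, and the zero-noise predictor used in the toy loop is a hypothetical stand-in for a trained network.

```python
import numpy as np

def ddim_step(x_t, eps_pred, alpha_bar_t, alpha_bar_prev):
    """One deterministic DDIM update (eta = 0), i.e., a time-discretized step of
    the probability flow ODE, moving from noise level t to the previous level."""
    # Clean-sample estimate implied by the current iterate and the noise prediction.
    x0_pred = (x_t - np.sqrt(1.0 - alpha_bar_t) * eps_pred) / np.sqrt(alpha_bar_t)
    # Deterministically re-noise that estimate to the previous (lower) noise level.
    return np.sqrt(alpha_bar_prev) * x0_pred + np.sqrt(1.0 - alpha_bar_prev) * eps_pred


# Toy usage: alpha_bar_t decreases as the timestep t grows (more noise at larger t).
rng = np.random.default_rng(0)
alpha_bars = np.linspace(0.999, 0.01, 50)
x = rng.standard_normal(8)                  # start from (approximately) pure Gaussian noise
for t in range(len(alpha_bars) - 1, 0, -1):
    eps_pred = np.zeros_like(x)             # hypothetical predictor: a trained network in practice
    x = ddim_step(x, eps_pred, alpha_bars[t], alpha_bars[t - 1])
```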
Papers
- Jiadong Liang, Zhihan Huang, Yuxin Chen, “Low-Dimensional Adaptation of Diffusion Models: Convergence in Total Variation,” Conference on Learning Theory (COLT), 2025.
- Gen Li*, Yuchen Zhou*, Yuting Wei, Yuxin Chen, “Faster Diffusion Models via Higher-Order Approximation,” arXiv preprint arXiv:2506.24042, 2025. (* = equal contributions)
- Zhihan Huang, Yuting Wei, Yuxin Chen, “Denoising Diffusion Probabilistic Models Are Optimally Adaptive to Unknown Low Dimensionality,” arXiv preprint arXiv:2408.02320, 2024.
- Yuchen Wu, Yuxin Chen, Yuting Wei, “Stochastic Runge-Kutta Methods: Provable Acceleration of Diffusion Models,” arXiv preprint arXiv:2410.04760, 2024.
- Gen Li, Yuting Wei, Yuejie Chi, Yuxin Chen, “A Sharp Convergence Theory for the Probability Flow ODEs of Diffusion Models,” arXiv preprint arXiv:2408.02320, 2024.
- Gen Li*, Yu Huang*, Timofey Efimov, Yuting Wei, Yuejie Chi, Yuxin Chen, “Accelerating Convergence of Score-Based Diffusion Models, Provably,” International Conference on Machine Learning (ICML), 2024. (* = equal contributions)
- Gen Li, Yuting Wei, Yuxin Chen, Yuejie Chi, “Towards Non-Asymptotic Convergence for Diffusion-Based Generative Models,” International Conference on Learning Representations (ICLR), 2024.