Machine Learning and Data Science PhD Student Forum (Session 66): An Introduction to Diffusion Distillation Models
Speaker: Weijian Luo (Peking University)
Time: 2024-01-18, 16:00-17:00
Venue: Tencent Meeting 551-1675-5419
Abstract:
Diffusion models have become one of the foundation models for generative modeling, with numerous successful applications such as image and video generation, molecular design, and controllable data creation. The generation process of diffusion models requires solving generative ordinary differential equations (ODEs) or stochastic differential equations (SDEs), which is often computationally expensive for high-dimensional data.
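For readers unfamiliar with this setup, the generative dynamics referred to above can be written in the standard score-based formulation of Song et al. (2021); the drift f, noise schedule g, and score \nabla_x \log p_t below are generic placeholders rather than notation from the talk. Sampling integrates the reverse-time SDE
\[
\mathrm{d}x = \bigl[f(x,t) - g(t)^2 \nabla_x \log p_t(x)\bigr]\,\mathrm{d}t + g(t)\,\mathrm{d}\bar{w}_t,
\]
or the probability-flow ODE with the same marginals,
\[
\mathrm{d}x = \bigl[f(x,t) - \tfrac{1}{2} g(t)^2 \nabla_x \log p_t(x)\bigr]\,\mathrm{d}t,
\]
from t = T down to t = 0, which typically requires tens to hundreds of evaluations of the learned score network per sample.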
In recent years, there has been a wide range of attempts to improve the data generation efficiency of diffusion models from different perspectives. In this talk, we will give a brief introduction to existing diffusion distillation methods, which have been proposed from different perspectives to accelerate the sampling of diffusion models. In particular, we will focus on the Diff-Instruct method, which achieves strong one-step diffusion distillation performance in a distribution-matching manner. We will also compare and summarize other distillation approaches to give an intuitive understanding of diffusion distillation.
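At a high level, the distribution-matching idea behind Diff-Instruct can be sketched as minimizing an integral KL divergence along the diffusion process; the notation here is illustrative and may differ from that used in the talk or paper:
\[
\mathcal{D}_{\mathrm{IKL}}(q \,\|\, p) = \int_0^T w(t)\, \mathbb{E}_{x_t \sim q_t}\!\left[\log \frac{q_t(x_t)}{p_t(x_t)}\right] \mathrm{d}t,
\]
where p_t is the teacher diffusion model's marginal at noise level t, q_t is the corresponding marginal obtained by diffusing the one-step student generator's outputs, and w(t) is a weighting function. Minimizing this objective over the student's parameters matches the two distributions across all noise levels, and its gradient can be expressed through the difference of the two score functions, which is what makes one-step distillation tractable without access to real data.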
About the forum: This online forum is organized by Professor Zhihua Zhang's machine learning lab and is held once every two weeks (except during public holidays). Each session invites a PhD student to give a relatively systematic and in-depth introduction to a frontier topic; topics include, but are not limited to, machine learning, high-dimensional statistics, operations research and optimization, and theoretical computer science.