Diffusion-Based Neural Samplers: A Systematic Review and Open Questions
Computational Statistics and Machine Learning seminar at Imperial College London
Abstract: Sampling from unnormalized densities is a fundamental task in machine learning. Recently, motivated by the success of diffusion models, diffusion-based neural samplers have started to gain attention. This talk provides a systematic review of these samplers, organized by their choice of sampling process and training objective. By combining different sampling processes with different objectives, we can recover almost all diffusion- and control-based neural samplers in the recent literature. We then consider a potential approach to achieving simulation-free training. Although promising in theory, this method ultimately suffers from severe mode collapse. In fact, on closer inspection, we find that nearly all successful neural samplers rely on Langevin preconditioning to avoid mode collapse, raising important questions about their efficiency and shedding light on future directions for these neural samplers.