Learn to Generate Data by Estimating Gradients of Data Distribution – Machine Learning Conference | Events
Generating data with complex patterns, such as images, audio, and molecular structures, requires fitting highly flexible statistical models to the data distribution. Even in the age of deep neural networks, building such models is difficult because they usually require an intractable normalization procedure to represent a probability distribution.
To address this challenge, I propose to model the gradient vector field of the data distribution (known as the score function), which does not require normalization and can therefore take full advantage of the flexibility of deep neural networks. I will show how to (1) estimate the score function from data with flexible deep neural networks and efficient statistical methods, (2) generate new data using stochastic differential equations and Markov chain Monte Carlo, and even (3) evaluate probability values exactly as in a traditional statistical model.
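As a toy illustration of step (2), the sketch below samples from a distribution using only its score function via (unadjusted) Langevin dynamics, one of the MCMC procedures mentioned above. To keep the example self-contained, it uses a 1-D Gaussian whose score is known in closed form rather than a learned neural network; the target parameters and step sizes are illustrative choices, not values from the talk.

```python
import numpy as np

# Toy target: N(mu, sigma^2), whose score d/dx log p(x) = (mu - x) / sigma^2
# is available in closed form. In score-based generative modeling, a deep
# neural network estimated from data would play this role instead.
def score(x, mu=2.0, sigma=1.0):
    return (mu - x) / sigma**2

def langevin_sample(n_chains=5000, n_steps=500, step=0.1, seed=0):
    """Run unadjusted Langevin dynamics: x <- x + step * score(x) + noise."""
    rng = np.random.default_rng(seed)
    x = 5.0 * rng.standard_normal(n_chains)  # arbitrary initialization
    for _ in range(n_steps):
        noise = rng.standard_normal(n_chains)
        x = x + step * score(x) + np.sqrt(2.0 * step) * noise
    return x

samples = langevin_sample()
# The empirical mean and std should be close to the target's mu=2, sigma=1
# (up to a small discretization bias from the finite step size).
print(samples.mean(), samples.std())
```

Note that only the score is ever evaluated; the normalizing constant of the target density never appears, which is exactly what makes this approach attractive for unnormalized deep models.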
The resulting method, called score-based generative modeling, achieves record-breaking performance in applications such as image generation, audio synthesis, time-series prediction, and point cloud generation, challenging the long-standing dominance of generative adversarial networks (GANs) on many of these tasks. Moreover, unlike GANs, score-based generative models are suitable for Bayesian reasoning tasks such as solving ill-posed inverse problems, and I have demonstrated their superior performance on sparse-view computed tomography and accelerated magnetic resonance imaging.
Yang Song is a final-year Ph.D. student at Stanford University. His research interests focus on deep generative models and their applications to inverse problem solving and AI security. His first-author papers have been recognized with an Outstanding Paper Award at ICLR 2021 and an oral presentation at NeurIPS 2019. He is a recipient of the Apple PhD Fellowship in AI/ML and the JP Morgan PhD Fellowship.