https://arxiv.org/abs/1804.00779 Neural Autoregressive Flows
  
Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time, via Inverse Autoregressive Flows (IAF). We unify and generalize these approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. We demonstrate that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions. Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST. https://github.com/CW-Huang/NAF
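For intuition, below is a minimal sketch (assuming PyTorch) of the kind of monotonic univariate transformer NAF substitutes for the affine transform of MAF/IAF: a single deep-sigmoidal-flow step. In the paper the per-dimension parameters come from an autoregressive conditioner network; here free parameters stand in for them, and all names are illustrative.

<code python>
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeepSigmoidalFlow(nn.Module):
    """Sketch of one NAF-style monotonic univariate transformer.

    tau(x) = logit( sum_i w_i * sigmoid(a_i * x + b_i) ),
    with a_i > 0 and w on the simplex, so tau is strictly increasing
    in x and hence invertible.
    """

    def __init__(self, hidden=8):
        super().__init__()
        # Unconstrained parameters; constraints applied in forward().
        # In NAF these would be emitted by the conditioner network.
        self.pre_a = nn.Parameter(torch.zeros(hidden))  # softplus -> a > 0
        self.b = nn.Parameter(torch.zeros(hidden))
        self.pre_w = nn.Parameter(torch.zeros(hidden))  # softmax -> w in simplex

    def forward(self, x):
        a = F.softplus(self.pre_a)             # positive slopes
        w = torch.softmax(self.pre_w, dim=0)   # convex combination weights
        s = torch.sigmoid(a * x.unsqueeze(-1) + self.b)  # (..., hidden)
        y = (w * s).sum(-1)                    # monotonic mixture in (0, 1)
        return torch.logit(y, eps=1e-6)        # invert the outer sigmoid

x = torch.linspace(-3.0, 3.0, 5)
flow = DeepSigmoidalFlow()
print(flow(x))  # strictly increasing in x by construction
</code>

Because the transformer is monotonic rather than affine, each conditional can be pushed to an arbitrary (e.g. multimodal) shape, which is what underlies the universal-approximation claim above; conditioning a, b, w on the preceding dimensions keeps the Jacobian triangular, so the density remains cheap to evaluate.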
  
https://arxiv.org/abs/1806.05575 Autoregressive Quantile Networks for Generative Modeling