In this work, we propose a novel generative model that jointly learns discrete syntactic structure and continuous word representations in an unsupervised fashion by cascading an invertible neural network with a structured generative prior. We show that the invertibility condition allows for efficient exact inference and marginal likelihood computation in our model so long as the prior is well-behaved.

https://arxiv.org/pdf/1808.09111v1.pdf Unsupervised Learning of Syntactic Structure with Invertible Neural Projections
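
To illustrate why invertibility makes exact inference and marginal likelihood computation tractable, here is a minimal sketch, not the paper's implementation: a simple invertible linear projection stands in for the invertible neural network, and a standard Gaussian prior stands in for the structured prior. The likelihood of an observed embedding is computed exactly with the change-of-variables formula; all names and parameters below are illustrative assumptions.

```python
# Sketch only: invertible projection f(z) = W z + b from latent embeddings z
# to observed word embeddings x. Because f is invertible, the marginal
# likelihood of x is exact:
#   log p(x) = log p_prior(f^{-1}(x)) - log |det W|
import numpy as np

rng = np.random.default_rng(0)
dim = 4

# Invertible projection parameters (W kept close to the identity so it is
# well-conditioned; the paper uses an invertible neural network instead).
W = np.eye(dim) + 0.1 * rng.standard_normal((dim, dim))
b = rng.standard_normal(dim)

def project(z):
    """Forward projection f(z) = W z + b."""
    return z @ W.T + b

def invert(x):
    """Exact inverse f^{-1}(x) = W^{-1} (x - b)."""
    return np.linalg.solve(W, (x - b).T).T

def log_prior(z):
    """Standard Gaussian prior over latents (stand-in for the structured prior)."""
    return -0.5 * (z ** 2).sum(-1) - 0.5 * dim * np.log(2 * np.pi)

def log_marginal(x):
    """Exact log p(x) via change of variables; no approximate inference needed."""
    z = invert(x)
    _, logdet_W = np.linalg.slogdet(W)  # log |det J_{f^{-1}}| = -log |det W|
    return log_prior(z) - logdet_W

# Usage: score a small batch of "word embeddings".
x = project(rng.standard_normal((3, dim)))
print(log_marginal(x))
```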