Code & tools available at http://edwardlib.org.
Paper available at http://openreview.net/pdf?id=Hy6b4Pqee
We propose Edward, a new Turing-complete probabilistic programming language built on two compositional representations: random variables and inference.
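A minimal sketch of the first representation, assuming Edward 1.x on TensorFlow 1.x with the `loc`/`scale` parameterization of its documented modeling API; the Bayesian linear regression model and the sizes `N`, `D` are illustrative, not from the paper:

```python
# Modeling sketch (assumed: Edward 1.x, TensorFlow 1.x; model is illustrative).
import edward as ed
import tensorflow as tf
from edward.models import Normal

N, D = 50, 5  # number of data points and features (assumed)
X = tf.placeholder(tf.float32, [N, D])

# Every random variable is a node in the TensorFlow computational graph,
# so priors and likelihood compose like ordinary tensor operations.
w = Normal(loc=tf.zeros(D), scale=tf.ones(D))
b = Normal(loc=tf.zeros(1), scale=tf.ones(1))
y = Normal(loc=ed.dot(X, w) + b, scale=tf.ones(N))
```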
- We show how to integrate our language into existing computational graph frameworks such as TensorFlow; this provides significant speedups over existing probabilistic systems.
- We also show how Edward makes it easy to fit the same model using a variety of composable inference methods, ranging from point estimation to variational inference to MCMC. By treating inference as a first-class citizen, on a par with modeling, we show that probabilistic programming can be as computationally efficient and flexible as traditional deep learning (see the inference sketch after this list).
- For flexibility, we show how to reuse the modeling representation within inference to design rich variational models and generative adversarial networks (see the GAN sketch below).
- For efficiency, we show that our implementation of Hamiltonian Monte Carlo is 35x faster than hand-optimized software such as Stan (the HMC call appears in the inference sketch below).
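To make the composability claim concrete, here is a hedged sketch continuing the model above, using Edward 1.x's documented inference classes (`ed.KLqp`, `ed.HMC`, `Empirical`); the toy data, iteration counts, and variational parameterization are illustrative assumptions:

```python
# Inference sketch, continuing the modeling sketch above (Edward 1.x API).
import numpy as np
from edward.models import Empirical

X_train = np.random.randn(N, D).astype(np.float32)  # toy data (assumed)
y_train = np.random.randn(N).astype(np.float32)

# Variational inference: the variational model is built from the same
# random-variable representation as the model itself.
qw = Normal(loc=tf.get_variable("qw/loc", [D]),
            scale=tf.nn.softplus(tf.get_variable("qw/scale", [D])))
qb = Normal(loc=tf.get_variable("qb/loc", [1]),
            scale=tf.nn.softplus(tf.get_variable("qb/scale", [1])))
inference = ed.KLqp({w: qw, b: qb}, data={X: X_train, y: y_train})
inference.run(n_iter=1000)

# MCMC on the very same model: swap the inference object, keep the model.
# (Point estimation is analogous, e.g. ed.MAP([w, b], data=...).)
T = 500  # number of posterior samples (assumed)
qw_hmc = Empirical(params=tf.get_variable("qw_hmc", [T, D]))
qb_hmc = Empirical(params=tf.get_variable("qb_hmc", [T, 1]))
inference = ed.HMC({w: qw_hmc, b: qb_hmc}, data={X: X_train, y: y_train})
inference.run(step_size=0.01, n_iter=T)
```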
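The flexibility point can be sketched the same way: Edward 1.x ships a `GANInference` that takes the model's generated sample and a discriminator function. Everything below (network shapes, the `next_batch` iterator, hyperparameters) is a hypothetical stand-in, not the paper's code:

```python
# GAN sketch (assumed: Edward 1.x ed.GANInference; networks and the
# next_batch helper are hypothetical stand-ins).
import edward as ed
import tensorflow as tf
from edward.models import Normal

M, d_noise, d_data = 100, 64, 784  # minibatch size and dimensions (assumed)

def generative_network(eps):
    h = tf.layers.dense(eps, 128, activation=tf.nn.relu, name="g_h")
    return tf.layers.dense(h, d_data, activation=tf.sigmoid, name="g_out")

def discriminative_network(x):
    h = tf.layers.dense(x, 128, activation=tf.nn.relu, name="d_h")
    return tf.layers.dense(h, 1, name="d_logit")  # real-valued logit

# The generator reuses the modeling representation: a random variable
# (noise) pushed through ordinary graph operations.
eps = Normal(loc=tf.zeros([M, d_noise]), scale=tf.ones([M, d_noise]))
x = generative_network(eps)

x_ph = tf.placeholder(tf.float32, [M, d_data])  # real-data placeholder
inference = ed.GANInference(data={x: x_ph},
                            discriminator=discriminative_network)
inference.initialize(n_iter=1000)

sess = ed.get_session()
sess.run(tf.global_variables_initializer())
for _ in range(inference.n_iter):
    x_batch = next_batch(M)  # hypothetical minibatch iterator
    inference.update(feed_dict={x_ph: x_batch})
```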