MIT Debuts Gen, a Julia-Based Language for Artificial Intelligence

In a recent paper, MIT researchers introduced Gen, a general-purpose probabilistic programming language based on Julia that aims to let users express models and create inference algorithms using high-level programming constructs.

To this end, Gen includes a number of novel language constructs, such as a generative function interface to encapsulate probabilistic models, combinators to create new generative functions from existing ones, and an inference library providing high-level inference algorithms users can choose from.

Although Gen is not the first probabilistic programming language, MIT researchers say existing ones either lack generality at the modeling level or lack algorithmic efficiency when they do support generic modeling. Gen aims to be both expressive at the modeling level and efficient at the algorithmic level. Efficiency here entails two distinct goals: algorithmic efficiency, i.e. the number of iterations an algorithm requires to produce a result, and implementation efficiency, i.e. the time each iteration takes.

The approach Gen takes to achieve those two goals departs from the usual one of representing models directly as programs. Instead, Gen models are black boxes, called generative functions (GFs), that expose an interface (the GFI) providing the capabilities inference requires. Inference algorithms are then written against that interface. Gen models can be expressed in a number of different ways, each striking a different flexibility/efficiency trade-off. Gen provides a built-in modeling language that extends Julia's syntax for function definition:

@gen function foo(prob::Float64)
    # each @trace call makes a named random choice that is recorded in the trace
    z1 = @trace(bernoulli(prob), :a)
    z2 = @trace(bernoulli(prob), :b)
    return z1 || z2
end
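
To give an idea of how the GFI is used, the following sketch, based on Gen's documented simulate, get_choices, choicemap, and importance_resampling functions, samples a trace of the foo model above and then conditions it on an observation:

using Gen

# Sample one execution of the model; the trace records the random
# choices made at addresses :a and :b along with the return value.
trace = Gen.simulate(foo, (0.3,))
choices = Gen.get_choices(trace)

# Fix the choice at address :b and infer the rest with a generic
# algorithm from the inference library.
observations = Gen.choicemap((:b, true))
(posterior_trace, _) = Gen.importance_resampling(foo, (0.3,), observations, 100)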

A variant of this modeling language, known as the Static Modeling Language, aims to provide better time and memory performance by restricting programs to a subset of Julia's intermediate representation (IR), flagging any construct outside that subset as an error.
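
As a sketch of what this restriction looks like, the foo model above could be rewritten with the (static) annotation described in the Gen manual, which limits the function body to simple statements:

@gen (static) function foo_static(prob::Float64)
    z1 = @trace(bernoulli(prob), :a)
    z2 = @trace(bernoulli(prob), :b)
    # the static language requires the return value to be bound to a
    # variable rather than computed inside the return statement
    result = z1 || z2
    return result
end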

The generative function approach is key to making Gen applicable to a wide range of problems, since a generative function can wrap models created in TensorFlow, algorithms written in an ordinary programming language, or the results of simulations. Models can to some extent be combined and reused using combinators, which capture recurring modeling patterns, such as a dataset of independently sampled data points. Combinators are in some sense akin to higher-order generative functions, in that they manipulate GFs, although they are not themselves GFs. Custom inference algorithms over the resulting models can then be created on top of Gen's existing inference library.
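
As an illustration of the independent-data-points pattern, Gen's documented Map combinator lifts a per-datapoint generative function into one over whole vectors; the datum kernel below is a hypothetical example:

@gen function datum(x::Float64, slope::Float64)
    return @trace(normal(slope * x, 1.0), :y)
end

# Map produces a new generative function that applies the kernel
# independently to each element of its vector arguments.
data_model = Gen.Map(datum)

xs = [1.0, 2.0, 3.0]
trace = Gen.simulate(data_model, (xs, fill(0.5, 3)))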

MIT is targeting Gen at a wide range of problems, from Bayesian statistics to machine learning to computer vision. Indeed, its flexibility is touted as one of its major improvements over alternative approaches, which are usually narrowly focused on one specific technique, e.g. deep neural networks. According to MIT, Gen has already shown better performance than existing probabilistic programming systems on a number of problems, such as tracking objects in space, estimating 3D body pose from a depth image, and inferring the structure of a time series.

Gen is built on top of Julia, a language specialized for numerical analysis that was also created at MIT, and is distributed as a Julia package. For a quick introduction to probabilistic programming with Gen, have a look at MIT's introduction to Gen.
