Uppsala University Publications
Measure transformer semantics for Bayesian machine learning
Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Computing Science. (Mobility)
Microsoft Research, Cambridge.
University of Pennsylvania. (Programming Languages)
Microsoft Research, Cambridge.
2013 (English). In: Logical Methods in Computer Science, ISSN 1860-5974, E-ISSN 1860-5974, Vol. 9, no. 3, p. 11. Article in journal (Refereed). Published.
Abstract [en]

The Bayesian approach to machine learning amounts to computing posterior distributions of random variables from a probabilistic model of how the variables are related (that is, a prior distribution) and a set of observations of variables. There is a trend in machine learning towards expressing Bayesian models as probabilistic programs. As a foundation for this kind of programming, we propose a core functional calculus with primitives for sampling prior distributions and observing variables. We define measure-transformer combinators inspired by theorems in measure theory, and use these to give a rigorous semantics to our core calculus. The original features of our semantics include its support for discrete, continuous, and hybrid measures, and, in particular, for observations of zero-probability events. We compile our core language to a small imperative language that is processed by an existing inference engine for factor graphs, which are data structures that enable many efficient inference algorithms. This allows efficient approximate inference of posterior marginal distributions, treating thousands of observations per second for large instances of realistic models.
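The sample/observe programming style described in the abstract can be illustrated with a toy program. The sketch below is not the paper's core calculus or its factor-graph compilation; it is a minimal Python approximation in which observation is modelled by likelihood weighting, over a made-up dataset of coin flips:

```python
import random

def model():
    """One run of a toy probabilistic program in the sample/observe style.

    Returns the sampled value together with an importance weight that
    accumulates the likelihood of each observation.
    """
    bias = random.random()          # sample: bias ~ Uniform(0, 1), the prior
    weight = 1.0
    observations = [1, 1, 0, 1]     # hypothetical observed coin flips
    for flip in observations:
        # observe: weight this run by the likelihood of the observation
        weight *= bias if flip == 1 else 1.0 - bias
    return bias, weight

def posterior_mean(num_runs=100_000):
    """Estimate E[bias | observations] by likelihood weighting."""
    total_w = 0.0
    total_wx = 0.0
    for _ in range(num_runs):
        x, w = model()
        total_w += w
        total_wx += w * x
    return total_wx / total_w

# Analytically, a Uniform(0, 1) prior with 3 heads and 1 tail gives a
# Beta(4, 2) posterior, whose mean is 4/6 ≈ 0.667.
print(posterior_mean())
```

Likelihood weighting only works when each observation has positive probability or density under the sampled parameters; giving a rigorous semantics to observations of zero-probability events is precisely one of the contributions the abstract claims for the measure-transformer approach.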

Place, publisher, year, edition, pages
2013. Vol. 9, no. 3, p. 11
Keyword [en]
Machine learning, Probabilistic programming, Programming languages, Bayesian modelling
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:uu:diva-201425
DOI: 10.2168/LMCS-9(3:11)2013
ISI: 000327472100016
OAI: oai:DiVA.org:uu-201425
DiVA: diva2:627163
Note

An abridged version of this paper appears in the proceedings of the 20th European Symposium on Programming (ESOP’11), part of ETAPS 2011, held in Saarbrücken, Germany, March 26–April 3, 2011.

Available from: 2013-06-11. Created: 2013-06-11. Last updated: 2018-01-11. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
arXiv page

Authority records

Borgström, Johannes
