Learning of state-space models with highly informative observations: A tempered sequential Monte Carlo solution
Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Automatic control. (Syscon)
Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Automatic control. (Syscon). ORCID iD: 0000-0001-5183-234X
Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Automatic control. (Syscon)
2018 (English). In: Mechanical Systems and Signal Processing, ISSN 0888-3270, E-ISSN 1096-1216, Vol. 104, p. 915-928. Article in journal (Refereed). Published.
Abstract [en]

Probabilistic (or Bayesian) modeling and learning offers interesting possibilities for systematic representation of uncertainty using probability theory. However, probabilistic learning often leads to computationally challenging problems. Some problems of this type that were previously intractable can now be solved on standard personal computers thanks to recent advances in Monte Carlo methods. In particular, for learning of unknown parameters in nonlinear state-space models, methods based on the particle filter (a Monte Carlo method) have proven very useful. A notoriously challenging problem, however, still occurs when the observations in the state-space model are highly informative, i.e. when there is very little or no measurement noise present relative to the amount of process noise. The particle filter will then struggle to estimate one of the basic components for probabilistic learning, namely the likelihood p(data | parameters). To this end, we suggest an algorithm which initially assumes that there is a substantial amount of artificial measurement noise present. The variance of this noise is sequentially decreased in an adaptive fashion such that we, in the end, recover the original problem, or possibly a very close approximation of it. The main component in our algorithm is a sequential Monte Carlo (SMC) sampler, which gives our proposed method a clear resemblance to the SMC² method. Another natural link is made to the ideas underlying approximate Bayesian computation (ABC). We illustrate the method with numerical examples, and in particular show promising results for a challenging Wiener-Hammerstein benchmark problem.
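To make the tempering idea in the abstract concrete, the following is a minimal sketch (not the authors' code): a bootstrap particle filter estimates the log-likelihood of a toy nonlinear state-space model while an artificial measurement-noise standard deviation sigma_e is shrunk over a fixed schedule. The toy model, parameter values, and the fixed schedule are illustrative assumptions only; the paper embeds such likelihood estimates in an SMC sampler over the parameters and adapts the schedule.

import numpy as np

rng = np.random.default_rng(0)

def simulate(T, theta, sigma_v):
    # Toy model (assumption): x_t = theta * x_{t-1} + v_t,  y_t = x_t^2 / 20 + e_t
    # with almost noise-free observations (highly informative).
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = theta * x[t - 1] + sigma_v * rng.standard_normal()
    return x**2 / 20.0 + 1e-3 * rng.standard_normal(T)

def log_likelihood(y, theta, sigma_v, sigma_e, N=500):
    # Bootstrap particle filter estimate of log p(y | theta), using an
    # artificially inflated measurement-noise standard deviation sigma_e.
    x = np.zeros(N)
    ll = 0.0
    for t in range(len(y)):
        x = theta * x + sigma_v * rng.standard_normal(N)           # propagate particles
        logw = (-0.5 * ((y[t] - x**2 / 20.0) / sigma_e) ** 2
                - np.log(sigma_e) - 0.5 * np.log(2.0 * np.pi))     # log weights
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())                                 # log-likelihood increment
        x = x[rng.choice(N, size=N, p=w / w.sum())]                # multinomial resampling
    return ll

y = simulate(T=100, theta=0.7, sigma_v=1.0)

# Tempering: start with a large artificial measurement noise and decrease it,
# approaching the original (nearly noise-free) problem. Fixed schedule here;
# the paper decreases the variance adaptively.
for sigma_e in [1.0, 0.3, 0.1, 0.03, 0.01]:
    print(f"sigma_e = {sigma_e:4.2f}   log-likelihood estimate = {log_likelihood(y, 0.7, 1.0, sigma_e):9.1f}")

With a large sigma_e the weights are well behaved and the likelihood estimate has low variance; as sigma_e shrinks the estimator degenerates, which is precisely the difficulty the tempered SMC construction is designed to manage.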

Place, publisher, year, edition, pages
Academic Press Ltd - Elsevier Science Ltd, 2018. Vol. 104, p. 915-928
Keyword [en]
Probabilistic modelling, Bayesian methods, Nonlinear system identification, Sequential Monte Carlo, Particle filter, Approximate Bayesian computations, Highly informative observations, Tempering, Wiener-Hammerstein model
National Category
Probability Theory and Statistics
Identifiers
URN: urn:nbn:se:uu:diva-350269
DOI: 10.1016/j.ymssp.2017.09.016
ISI: 000423652800057
OAI: oai:DiVA.org:uu-350269
DiVA, id: diva2:1206003
Funder
Swedish Research Council, 621-2013-5524, 2016-04278, 621-2016-06079
Swedish Foundation for Strategic Research, RIT15-0012
Available from: 2018-05-15. Created: 2018-05-15. Last updated: 2018-05-15. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text

Authority records

Svensson, Andreas; Schön, Thomas B.; Lindsten, Fredrik
