Publications from Uppsala University (DiVA)

Incorporating Sum Constraints into Multitask Gaussian Processes
Author affiliations (all: Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology):
  • Division of Systems and Control; Artificial Intelligence
  • Division of Systems and Control; Automatic control; Artificial Intelligence. ORCID iD: 0000-0002-6028-8961
  • Division of Systems and Control; Automatic control; Artificial Intelligence. ORCID iD: 0000-0001-5183-234X
  • Division of Systems and Control; Artificial Intelligence; Automatic control. ORCID iD: 0000-0002-4634-7240
2022 (English). In: Transactions on Machine Learning Research. Article in journal (Refereed). Epub ahead of print.
Abstract [en]

Machine learning models can be improved by adapting them to respect existing background knowledge. In this paper we consider multitask Gaussian processes, with background knowledge in the form of constraints that require a specific sum of the outputs to be constant. This is achieved by conditioning the prior distribution on the constraint fulfillment. The approach allows for both linear and nonlinear constraints. We demonstrate that the constraints are fulfilled with high precision and that the construction can improve the overall prediction accuracy as compared to the standard Gaussian process.
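
A minimal sketch of the underlying idea for the linear case (not the paper's implementation): treat the sum of the outputs at a set of constraint inputs as additional, (near-)noiseless observations and condition the multitask prior on them with standard GP algebra. The kernel, the coregionalization matrix B, the constraint points, and the constant c below are illustrative assumptions.

```python
import numpy as np

def rbf(xa, xb, ell=0.2, var=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    d = xa[:, None] - xb[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

# Inputs: training data on output 1 only, constraint points, and a test grid.
x_train = np.array([0.1, 0.45, 0.8])
y_train = np.array([0.2, 0.9, 0.4])              # toy measurements of f1
x_con = np.linspace(0.0, 1.0, 10)                # where f1 + f2 = c is enforced
x_test = np.linspace(0.0, 1.0, 50)
c = 1.0                                          # required constant sum

x_all = np.concatenate([x_train, x_con, x_test])
n = x_all.size

# Intrinsic-coregionalization prior over the stacked vector f = [f1(x_all); f2(x_all)],
# with an assumed task-coupling matrix B.
B = np.array([[1.0, 0.3],
              [0.3, 1.0]])
K = np.kron(B, rbf(x_all, x_all))                # shape (2n, 2n)

# Linear observation operator A: rows picking f1 at the training inputs (noisy data)
# and rows adding f1 and f2 at the constraint inputs (their sum is "observed" as c).
m_tr, m_co = x_train.size, x_con.size
idx_tr, idx_co = np.arange(m_tr), m_tr + np.arange(m_co)
A = np.zeros((m_tr + m_co, 2 * n))
A[np.arange(m_tr), idx_tr] = 1.0                 # f1 at training points
A[m_tr + np.arange(m_co), idx_co] = 1.0          # f1 at constraint points
A[m_tr + np.arange(m_co), n + idx_co] = 1.0      # + f2 at constraint points

z = np.concatenate([y_train, c * np.ones(m_co)])
noise = np.concatenate([1e-2 * np.ones(m_tr),    # data noise variance
                        1e-6 * np.ones(m_co)])   # tiny jitter: constraint is (near) exact

# Standard GP conditioning on the linear functionals A f (zero prior mean).
S = A @ K @ A.T + np.diag(noise)
mean = K @ A.T @ np.linalg.solve(S, z)

idx_te = m_tr + m_co + np.arange(x_test.size)
f1_post, f2_post = mean[idx_te], mean[n + idx_te]
print("max |f1 + f2 - c| on the test grid:", np.max(np.abs(f1_post + f2_post - c)))
```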

Place, publisher, year, edition, pages
2022.
National Category
Signal Processing
Research subject
Electrical Engineering with specialization in Signal Processing
Identifiers
URN: urn:nbn:se:uu:diva-491388
OAI: oai:DiVA.org:uu-491388
DiVA, id: diva2:1721088
Available from: 2022-12-20. Created: 2022-12-20. Last updated: 2025-09-19. Bibliographically approved.
In thesis
1. Integrating Prior Knowledge into Machine Learning Models with Applications in Physics
2023 (English). Licentiate thesis, comprehensive summary (Other academic).
Abstract [en]

At the extremes, two antithetical approaches to describing natural processes exist. Theoretical models can be derived from first principles, allowing for clear interpretability; on the downside, this approach may be infeasible or inefficient for complex systems. Alternatively, methods from statistical machine learning can be employed to learn black box models from large amounts of data, while providing little or no understanding of their inner workings.

Both approaches have different desirable properties and weaknesses. It is natural to ask how they may be combined to create better models. This is the question that the field of physics-informed machine learning is concerned with, and which we will consider in this thesis. More precisely, we investigate ways of integrating additional prior knowledge into machine learning models.

In Paper I, we consider multitask Gaussian processes and devise a way to include so-called sum constraints into the model, where a nonlinear sum of the outputs is required to equal a known value. In Paper II, we consider the task of determining unknown parameters from data when solving partial differential equations (PDEs) with physics-informed neural networks. Given the prior knowledge that the measurement noise is homogeneous but otherwise unknown, we demonstrate that it is possible to learn the solution and parameters of the PDE jointly with the noise distribution. In Paper III, we consider generative adversarial networks, which may produce realistic-looking samples but fail to reproduce their true distribution. In our work, we mitigate this issue by matching the true and generated distributions of statistics extracted from the data.
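
As a rough, simplified illustration of the idea in Paper II (a Gaussian special case with illustrative hyperparameters, not the authors' implementation), the sketch below trains a physics-informed network on the toy ODE u'(x) = -k·u(x) while treating both the unknown parameter k and a homogeneous noise scale as trainable quantities via a Gaussian negative log-likelihood data term.

```python
import torch
import torch.nn as nn

# Simplified sketch: a PINN for the toy ODE u'(x) = -k*u(x) in which the unknown
# parameter k and a homogeneous noise scale sigma are learned jointly with the
# network weights. Architecture, data, and weighting are assumptions for the example.
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))
log_k = nn.Parameter(torch.tensor(0.0))          # unknown ODE parameter (log for positivity)
log_sigma = nn.Parameter(torch.tensor(0.0))      # unknown, input-independent noise scale

# Toy data: noisy observations of u(x) = exp(-2x).
x_data = torch.rand(50, 1)
y_data = torch.exp(-2.0 * x_data) + 0.05 * torch.randn_like(x_data)
x_col = torch.rand(200, 1, requires_grad=True)   # collocation points for the ODE residual

opt = torch.optim.Adam(list(net.parameters()) + [log_k, log_sigma], lr=1e-3)
for step in range(5000):
    opt.zero_grad()
    # Data term: Gaussian negative log-likelihood with trainable noise variance.
    sigma2 = torch.exp(2.0 * log_sigma)
    nll = 0.5 * (((net(x_data) - y_data) ** 2) / sigma2 + torch.log(sigma2)).mean()
    # Physics term: residual of u' + k*u = 0 at the collocation points.
    u = net(x_col)
    du = torch.autograd.grad(u, x_col, torch.ones_like(u), create_graph=True)[0]
    residual = du + torch.exp(log_k) * u
    loss = nll + (residual ** 2).mean()
    loss.backward()
    opt.step()

print("learned k:", torch.exp(log_k).item(),
      "learned noise std:", torch.exp(log_sigma).item())
```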

Place, publisher, year, edition, pages
Uppsala University, 2023. p. 65
Series
Information technology licentiate theses: Licentiate theses from the Department of Information Technology, ISSN 1404-5117 ; 2023-002
National Category
Computer Sciences
Research subject
Machine learning
Identifiers
urn:nbn:se:uu:diva-510379 (URN)
Presentation
2023-09-20, 101130, Lägerhyddsvägen 1, Uppsala, 10:15 (English)
Available from: 2023-09-05. Created: 2023-08-29. Last updated: 2023-09-05. Bibliographically approved.
2. Physics-Informed Machine Learning for Regression and Generative Modeling
2025 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Conventionally, the models used in science and technology were obtained through the careful study of nature. Scientists proposed models based on their observations, and the models were understood in detail. In the new paradigm of machine learning (ML), on the other hand, models are instead trained on large amounts of data. They are typically of a black-box nature and come without guarantees on their accuracy and range of validity. 

It is natural to ask if both approaches can be combined beneficially. Incorporating domain knowledge from the sciences may make ML models more accurate and robust. Conversely, models and techniques from ML may constitute useful tools for improving scientific models. These questions are the subject of this dissertation, which therefore belongs to the broad area of physics-informed machine learning.        

The first main contribution of the thesis concerns various problems encountered in regression tasks. Physics-informed neural networks (PINNs) are extended to suit situations with noisy or incomplete data. Firstly, a method to learn the noise distribution in the case of homogeneous measurement noise is developed. Secondly, repulsive ensembles are employed to enable Bayesian uncertainty quantification in PINNs. Furthermore, multitask Gaussian processes (GPs) are considered, and a method to incorporate nonlinear sum constraints is presented.        

The second group of contributions deals with the use of generative models in physical applications. A modification of generative adversarial networks (GANs) is developed, where the distribution of certain high-level statistics chosen from domain knowledge is matched between true and generated data. A modularized architecture tailored to the generation of in-ice radio signals is devised for use as a component in an optimization pipeline. This architecture enables the generation of physically consistent samples as well as the differentiability of the samples in an efficient manner.
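
The statistics-matching modification of GANs mentioned above can be illustrated with a simple stand-in (not the thesis' exact formulation): pick a per-sample summary statistic from domain knowledge, compare its empirical distribution over real and generated batches with a sorted-value (1-D Wasserstein-style) distance, and add the result to the usual generator loss. The statistic, the distance, and the weight lam are illustrative assumptions.

```python
import torch

def statistic(x):
    """Per-sample summary statistic chosen from domain knowledge (here: signal energy)."""
    return (x ** 2).sum(dim=1)

def statistic_matching_loss(x_real, x_fake):
    """Sorted-value distance between the batch distributions of the statistic.
    Assumes equal batch sizes for real and generated samples."""
    s_real, _ = torch.sort(statistic(x_real))
    s_fake, _ = torch.sort(statistic(x_fake))
    return (s_real - s_fake).abs().mean()

# Quick check on random data.
x_real = torch.randn(128, 64)
x_fake = 0.5 * torch.randn(128, 64)
print(statistic_matching_loss(x_real, x_fake))

# In a GAN generator update the term would be added to the adversarial loss, schematically:
#   g_loss = adversarial_loss(discriminator(x_fake), real_labels) \
#            + lam * statistic_matching_loss(x_real, x_fake)
```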

Place, publisher, year, edition, pages
Uppsala: Acta Universitatis Upsaliensis, 2025. p. 82
Series
Digital Comprehensive Summaries of Uppsala Dissertations from the Faculty of Science and Technology, ISSN 1651-6214 ; 2591
Keywords
machine learning, deep learning, prior knowledge, regression, generative modelling, physics-informed neural networks, Gaussian processes, surrogate models
National Category
Computer Sciences
Research subject
Machine learning
Identifiers
urn:nbn:se:uu:diva-567542 (URN)
978-91-513-2596-5 (ISBN)
Public defence
2025-11-07, 101195, Heinz-Otto Kreiss, Regementsvägen 10, Ångströmlaboratoriet, Uppsala, 09:15 (English)
Available from: 2025-10-16. Created: 2025-09-19. Last updated: 2025-10-28.

Open Access in DiVA

No full text in DiVA

Other links

https://openreview.net/forum?id=gzu4ZbBY7S
arXiv:2202.01793

Authority records

Pilar, Philipp; Jidling, Carl; Schön, Thomas B.; Wahlström, Niklas

Search in DiVA

By author/editor
Pilar, Philipp; Jidling, Carl; Schön, Thomas B.; Wahlström, Niklas
By organisation
Division of Systems and Control; Artificial Intelligence; Automatic control
Signal Processing

