Encoding prior knowledge through Gaussian processes

Activity: Talk or presentation
Type: Invited academic talk


With lots of data that you don't know much about, just fitting a reasonably flexible model might be good enough. But what if you don't have as much data, or you already have some knowledge about its properties? Encoding prior knowledge makes a model intentionally less flexible, allowing it to generalise better from less data. In this talk I will give a short introduction to how we can encode prior knowledge in Gaussian processes, for example, invariances in the input space.
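As a rough illustration of the idea mentioned in the abstract, one standard way to encode an invariance in a Gaussian process is to sum a base kernel over a finite group of input transformations; the resulting kernel, and hence the GP prior, is invariant to those transformations. The sketch below is not from the talk itself, just a minimal example assuming an RBF base kernel and invariance to sign flips of the input:

```python
import numpy as np

def rbf(x, y, lengthscale=1.0):
    """Standard RBF (squared-exponential) kernel."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * lengthscale ** 2))

def invariant_kernel(x, y, transforms, base_kernel=rbf):
    """Make a kernel invariant to a finite group of transformations
    by summing the base kernel over all pairs of transformed inputs."""
    return sum(base_kernel(g(x), h(y)) for g in transforms for h in transforms)

# Example group: identity and sign flip, i.e. invariance to x -> -x.
transforms = [lambda x: x, lambda x: -x]

x = np.array([0.5, -1.0])
y = np.array([1.5, 0.2])

# Flipping either input leaves the kernel value unchanged.
print(np.isclose(invariant_kernel(x, y, transforms),
                 invariant_kernel(-x, y, transforms)))  # True
```

A GP with this kernel places zero prior mass on functions that are not invariant under the chosen transformations, which is exactly the kind of reduced flexibility that helps generalisation from less data.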
Period: 8 Dec 2021
Held at: University of Tübingen, Germany