Bézier Curve Gaussian Processes

Published 3 May 2022 in stat.ML and cs.LG | arXiv:2205.01754v3

Abstract: Probabilistic models for sequential data are the basis for a variety of applications concerned with processing temporally ordered information. The predominant approach in this domain is given by recurrent neural networks, implementing either an approximate Bayesian approach (e.g. Variational Autoencoders or Generative Adversarial Networks) or a regression-based approach, i.e. variations of Mixture Density Networks (MDN). In this paper, we focus on the $\mathcal{N}$-MDN variant, which parameterizes (mixtures of) probabilistic B\'ezier curves ($\mathcal{N}$-Curves) for modeling stochastic processes. While favorable in terms of computational cost and stability, MDNs generally fall behind approximate Bayesian approaches in terms of expressiveness. To close this gap, we present an approach for enabling full Bayesian inference on top of $\mathcal{N}$-MDNs. For this, we show that $\mathcal{N}$-Curves are a special case of Gaussian processes (denoted as $\mathcal{N}$-GP) and then derive corresponding mean and kernel functions for different modalities. Following this, we propose the use of the $\mathcal{N}$-MDN as a data-dependent generator for $\mathcal{N}$-GP prior distributions. We demonstrate the advantages granted by this combined model in an application context, using human trajectory prediction as an example.
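The abstract's central claim, that a Bézier curve with Gaussian control points is itself a Gaussian process, can be sketched numerically. The following is a minimal illustration (not the authors' code), assuming independent control points with isotropic variances: the curve's mean is the Bernstein-weighted sum of control-point means, and its kernel follows from the same weights.

```python
import numpy as np
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def n_curve_gp(mu, sigma2, ts):
    """Mean and kernel of the GP induced by Gaussian control points.

    mu:     (n+1, d) array of control-point means
    sigma2: (n+1,)   control-point variances (independent, isotropic -- an assumption)
    ts:     (m,)     curve parameters in [0, 1]
    """
    n = len(mu) - 1
    # Bernstein design matrix: B[k, i] = B_{i,n}(ts[k])
    B = np.array([[bernstein(n, i, t) for i in range(n + 1)] for t in ts])
    mean = B @ mu                 # E[X(t)]  = sum_i B_i(t) mu_i
    kernel = (B * sigma2) @ B.T   # k(s, t)  = sum_i B_i(s) B_i(t) sigma_i^2
    return mean, kernel

# Quadratic curve with 3 Gaussian control points.
mu = np.array([[0.0], [1.0], [0.0]])
sigma2 = np.array([0.1, 0.2, 0.1])
ts = np.linspace(0.0, 1.0, 5)
mean, K = n_curve_gp(mu, sigma2, ts)
# The curve passes through its first and last control points, so the
# endpoint variance equals that control point's variance.
```

Note that with independent control points the kernel is low-rank (rank at most n+1), which is what lets the paper derive closed-form mean and kernel functions for the resulting $\mathcal{N}$-GP.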

Citations (4)
