Degree

Doctor of Philosophy (PhD)

Department

Department of Mathematics

Document Type

Dissertation

Abstract

Stochastic differential equations (SDEs) are essential for modeling systems influenced by both deterministic dynamics and random fluctuations, with applications in a wide variety of disciplines. This thesis develops a Bayesian framework for nonparametric learning in SDEs, addressing key challenges in inference, particularly for complex systems and incomplete data. The thesis begins by establishing a theoretical foundation in optimization over Hilbert spaces, including a generalized representer theorem that addresses the infinite-dimensional optimization problems encountered in nonparametric inference. Building on this, we introduce a Bayesian framework with shrinkage priors to learn drift functions from high-frequency data. The Bayesian approach enables low-cost sparse learning through the proper use of shrinkage priors while allowing proper quantification of uncertainty through posterior distributions. We then extend the framework to scenarios involving sparse and noisy data. A novel bridge method is proposed that integrates Sequential Monte Carlo (SMC) techniques to simulate unobserved states and combines them with an Expectation-Maximization (EM) algorithm for efficient nonparametric estimation. The efficiency of the algorithms is validated through numerical experiments.
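To make the setting of the abstract concrete, the following is a minimal sketch (not taken from the thesis) of the kind of problem described: simulating high-frequency data from an SDE dX = b(X) dt + σ dW via the Euler–Maruyama scheme, then recovering the drift b nonparametrically with a simple binned conditional-mean estimator. The Ornstein–Uhlenbeck drift b(x) = −x, the bin widths, and all function names are illustrative assumptions; the thesis's actual method uses a Bayesian framework with shrinkage priors rather than binning.

```python
import numpy as np

rng = np.random.default_rng(0)

def euler_maruyama(b, sigma, x0, dt, n_steps):
    """Simulate one path of dX = b(X) dt + sigma dW on a uniform time grid."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        x[k + 1] = x[k] + b(x[k]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

def binned_drift_estimate(path, dt, bins):
    """Estimate b(x) ~ E[(X_{t+dt} - X_t) / dt | X_t = x] by averaging within bins."""
    increments = np.diff(path) / dt
    states = path[:-1]
    idx = np.digitize(states, bins)
    centers, estimates = [], []
    for j in range(1, len(bins)):
        mask = idx == j
        if mask.sum() > 20:  # keep only bins with enough observations
            centers.append(0.5 * (bins[j - 1] + bins[j]))
            estimates.append(increments[mask].mean())
    return np.array(centers), np.array(estimates)

# High-frequency data from an Ornstein-Uhlenbeck process with drift b(x) = -x.
dt = 1e-3
path = euler_maruyama(lambda x: -x, 0.5, 1.0, dt, 500_000)
centers, b_hat = binned_drift_estimate(path, dt, np.linspace(-1.5, 1.5, 31))
# The estimates b_hat should roughly track the true drift -centers.
```

This crude estimator illustrates why the infinite-dimensional (function-valued) nature of the unknown drift matters: a principled approach must regularize over a function space, which is where the representer theorem and shrinkage priors discussed in the abstract come in.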

Date

11-1-2024

Committee Chair

Ganguly, Arnab
