Abstract

This thesis is centered on questions coming from Machine Learning (ML) and Statistical Field Theory (SFT). In Machine Learning, we consider the subfield of Supervised Learning (SL), and in particular regression tasks, where one tries to find a regressor that fits given labeled data points as well as possible while simultaneously satisfying other constraints. We consider two well-known regression models: Kernel Methods and Random Features (RF) models. We show that RF models can be used as an effective way to implement Kernel Methods and discuss in detail the robustness of this approximation. Furthermore, we propose a new estimator for analyzing the choice of kernel, based only on the acquired dataset. In SFT we focus on so-called two-dimensional Conformal Field Theories (CFTs). We study these theories via their connection with lattice models, detailing how they emerge as continuum limits of discrete models, as well as the meaning of the central quantities of the theory. Furthermore, we show how to connect CFTs with the theory of Schramm-Loewner Evolutions (SLEs). Finally, we focus on the semi-local theories associated with the Ising and Tricritical Ising models and analyze the probabilistic and geometric meaning of the holomorphic fermions present therein.
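To make the RF-to-kernel connection concrete, the sketch below uses the standard random Fourier features construction of Rahimi and Rechet's type, in which inner products of randomized feature maps approximate a Gaussian kernel. This is a generic illustration, not necessarily the specific construction or estimator studied in the thesis; the bandwidth `sigma` and feature count `D` are illustrative choices.

```python
import numpy as np

def random_fourier_features(X, D=5000, sigma=1.0, seed=None):
    """Map X of shape (n, d) to D random Fourier features whose inner
    products approximate the Gaussian kernel exp(-||x-y||^2 / (2 sigma^2))."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Frequencies sampled from the kernel's spectral density, random phases.
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Compare the RF approximation against the exact Gaussian kernel matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
Z = random_fourier_features(X, D=5000, sigma=1.0, seed=1)
K_rf = Z @ Z.T  # randomized approximation of the kernel Gram matrix

sq = np.sum(X**2, axis=1)
K_exact = np.exp(-(sq[:, None] + sq[None, :] - 2.0 * X @ X.T) / 2.0)
err = np.max(np.abs(K_rf - K_exact))
```

The entrywise error decays as O(1/sqrt(D)), which is the sense in which RF models implement Kernel Methods approximately; the robustness question is how this randomness propagates to the trained regressor.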
