**Antoni Musolas**
**Title**: Differential geometrical approach to covariance estimation
**Abstract**: Given a predefined covariance structure, the problem of finding a covariance matrix or kernel that maximizes the likelihood of the data is a key step in many statistical modeling frameworks, including Gaussian process regression. We develop a differential geometrical interpretation of this problem and reduce it to an optimization problem on matrix manifolds. The main advantage of this viewpoint is that the covariance structure becomes fully flexible; in particular, one can use empirical covariance matrices, estimated from observational or simulation data, to define richer parametric families of covariance kernels. These covariances offer more flexibility for problem-specific tailoring than classical parametric families of covariance kernels. After a comparison with the usual solution, we conclude that the proposed methodology is sound and versatile.
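One way to picture this idea (a minimal sketch, not the speaker's actual method) is to take two empirical covariance matrices as anchors, connect them by the affine-invariant geodesic on the manifold of symmetric positive-definite (SPD) matrices, and maximize the Gaussian log-likelihood of the data over the geodesic parameter. All variable names, the choice of geodesic, and the synthetic anchors below are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power, sqrtm
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Two empirical covariance "anchors" (e.g. from observational or simulation
# data) define a one-parameter family along the SPD geodesic between them.
d = 3
S1 = np.cov(rng.standard_normal((d, 200)))
S2 = np.cov(2.0 * rng.standard_normal((d, 200)))

def geodesic(t):
    """Point at parameter t on the affine-invariant geodesic S1 (t=0) -> S2 (t=1)."""
    R = np.real(sqrtm(S1))
    Rinv = np.linalg.inv(R)
    return np.real(R @ fractional_matrix_power(Rinv @ S2 @ Rinv, t) @ R)

def neg_log_likelihood(t, X):
    """Negative Gaussian log-likelihood of the columns of X under Sigma(t)."""
    S = geodesic(t)
    _, logdet = np.linalg.slogdet(S)
    quad = np.einsum('ij,ik,kj->', X, np.linalg.inv(S), X)  # sum_j x_j^T S^{-1} x_j
    return 0.5 * (X.shape[1] * logdet + quad)

# Synthetic data drawn from an intermediate covariance; the maximizer over t
# should then land strictly inside (0, 1).
X = rng.multivariate_normal(np.zeros(d), geodesic(0.6), size=500).T
res = minimize_scalar(neg_log_likelihood, bounds=(0.0, 1.0),
                      args=(X,), method='bounded')
Sigma_hat = geodesic(res.x)
```

The geodesic guarantees that every member of the family is itself a valid (symmetric positive-definite) covariance, which is the appeal of working on the manifold rather than in the ambient space of matrices.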

**Zheng Wang**
**Title**: Optimization-Based Samplers Using the Framework of Transport Maps
**Abstract**: Markov chain Monte Carlo (MCMC) relies on efficient proposals to sample from a target distribution of interest. Recent optimization-based MCMC algorithms for Bayesian inference, for example implicit sampling and randomize-then-optimize (RTO), repeatedly solve optimization problems to obtain proposal samples. We analyze these algorithms by treating them as the action of an approximate transport map. From this analysis, several new variants of these algorithms, such as transformed random walks, follow naturally.
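A transformed random walk can be sketched as follows (an illustrative toy, not the speaker's algorithm): an approximate transport map T pushes a standard-normal reference variable toward the target, and a plain Gaussian random-walk Metropolis chain is run on the pullback density in reference space, with samples mapped back through T. The target, the lognormal-style map, and its hand-picked parameters m and s below are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Target (unnormalized): Gamma(shape=3) density on x > 0, which has mean 3.
k = 3.0
def log_target(x):
    return (k - 1.0) * np.log(x) - x

# Assumed approximate transport map T(r) = exp(m + s*r), mapping a standard
# normal reference onto (0, inf); m and s are hand-picked, not fitted.
m, s = np.log(k), 0.5
def T(r):
    return np.exp(m + s * r)

def log_pullback(r):
    # log of pi(T(r)) * |T'(r)|, with T'(r) = s * exp(m + s*r)
    return log_target(T(r)) + np.log(s) + m + s * r

# Random-walk Metropolis in reference space; push samples through T.
n_steps, step = 20000, 1.0
r, lp = 0.0, log_pullback(0.0)
xs = np.empty(n_steps)
for i in range(n_steps):
    r_prop = r + step * rng.standard_normal()
    lp_prop = log_pullback(r_prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        r, lp = r_prop, lp_prop
    xs[i] = T(r)
```

Because the random walk is symmetric in reference space, the acceptance ratio reduces to the pullback-density ratio; the better T approximates a true transport from the reference to the target, the closer the pullback is to a standard normal and the better the chain mixes.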