On deep learning, the curse of dimensionality, and stochastic approximation algorithms for PDEs

  • Arnulf Jentzen (ETH Zürich)
A3 01 (Sophus-Lie room)


Partial differential equations (PDEs) are among the most universal tools used in modelling problems in nature and man-made complex systems. In particular, PDEs are a fundamental tool in portfolio optimization problems and in the state-of-the-art pricing and hedging of financial derivatives. The PDEs appearing in such financial engineering applications are often high-dimensional, as the dimensionality of the PDE corresponds to the number of financial assets in the involved hedging portfolio. Such PDEs typically cannot be solved explicitly, and developing efficient numerical algorithms for high-dimensional PDEs is one of the most challenging tasks in applied mathematics. As is well known, the difficulty lies in the so-called "curse of dimensionality": the computational effort of standard approximation algorithms grows exponentially in the dimension of the considered PDE, and only in a very limited number of cases has a practical PDE approximation algorithm been developed whose computational effort grows at most polynomially in the PDE dimension. In the case of linear parabolic PDEs, the curse of dimensionality can be overcome by means of stochastic approximation algorithms and the Feynman-Kac formula. We first review some results for stochastic approximation algorithms for linear PDEs and, thereafter, we present a stochastic approximation algorithm for high-dimensional nonlinear PDEs whose key ingredients are deep artificial neural networks, which are widely used in data science applications. Numerical simulations and first mathematical results indicate the efficiency and the accuracy of the proposed stochastic approximation algorithm in the cases of several high-dimensional PDEs from finance and physics.
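The curse of dimensionality mentioned in the abstract can be made concrete with a back-of-the-envelope count (the grid resolution N = 10 below is an illustrative choice, not from the talk): a tensor-product grid with N points per coordinate needs N^d values in dimension d, whereas the Monte Carlo error of a stochastic approximation scales like sigma/sqrt(M) in the number of samples M, independently of d.

```python
# Number of grid points needed by a tensor-product finite-difference
# grid with N points per coordinate axis, in dimension d.
N = 10  # grid points per coordinate (illustrative choice)
for d in (1, 3, 10, 100):
    grid_points = N ** d  # grows exponentially in the dimension d
    print(f"d = {d:>3}: {grid_points} grid points")

# By contrast, a Monte Carlo estimate with M samples has statistical
# error of order sigma / sqrt(M), so M = (sigma / eps)**2 samples
# suffice for accuracy eps -- with no dependence on d at all.
```

Already at d = 100 the grid would need 10^100 values, more than the number of atoms in the observable universe, which is why grid-based schemes are hopeless for the PDE dimensions arising in hedging portfolios.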
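For linear parabolic PDEs, the Feynman-Kac route sketched in the abstract can be illustrated on the simplest example, the d-dimensional heat equation u_t + (1/2) Δu = 0 with terminal condition u(T, x) = g(x), where Feynman-Kac gives u(t, x) = E[g(x + W_{T-t})] for a standard Brownian motion W. A plain Monte Carlo average then approximates u(0, x) with an error independent of the dimension. A minimal NumPy sketch, assuming an illustrative payoff g(y) = |y|^2 and dimension d = 100 (neither is from the talk):

```python
import numpy as np

def feynman_kac_mc(g, x, T, n_samples, rng):
    """Monte Carlo estimate of u(0, x) = E[g(x + W_T)], the
    Feynman-Kac representation of the solution of the heat equation
    u_t + 0.5 * Laplacian(u) = 0 with u(T, .) = g."""
    d = x.shape[0]
    # Sample W_T ~ N(0, T * I_d): n_samples independent endpoints
    # of a d-dimensional Brownian motion at time T.
    z = rng.standard_normal((n_samples, d))
    endpoints = x + np.sqrt(T) * z
    return g(endpoints).mean()

rng = np.random.default_rng(0)
d, T = 100, 1.0                      # illustrative dimension and horizon
x = np.zeros(d)
g = lambda y: np.sum(y**2, axis=1)   # g(y) = |y|^2; exact u(0, 0) = d * T
est = feynman_kac_mc(g, x, T, 10**5, rng)
print(est)
```

For this payoff the exact value is u(0, 0) = d * T = 100, and the Monte Carlo estimate lands within a fraction of a percent with 10^5 samples, even though d = 100; the cost grows only linearly in d, which is precisely how the curse of dimensionality is avoided in the linear case.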

Contact: Katja Heid, MPI for Mathematics in the Sciences
