Workshop

Machine learning in adaptive domain decomposition methods

  • Axel Klawonn (University of Cologne)
MPI für Mathematik in den Naturwissenschaften Leipzig (Live Stream)

Abstract

The convergence rate of domain decomposition methods is generally determined by the eigenvalues of the preconditioned system. For second-order elliptic partial differential equations, coefficient discontinuities with large contrast can lead to a deterioration of the convergence rate. A remedy is to enhance the coarse space with elements, often called constraints, that are computed by solving small eigenvalue problems on portions of the interface of the domain decomposition, i.e., on edges in two dimensions or on faces and edges in three dimensions. In the present work, without loss of generality, the focus is on two dimensions. In general, it is difficult to predict on which edges these constraints have to be computed. Here, a machine learning-based strategy using neural networks is suggested to predict the geometric location of these edges in a preprocessing step. This reduces the number of eigenvalue problems that have to be solved during the iteration. Numerical experiments for model problems and realistic microsections, using regular decompositions as well as decompositions from graph partitioners, are provided and show very promising results.

This is joint work with Alexander Heinlein, Martin Lanser, and Janine Weber.
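To illustrate the idea, the following is a minimal sketch, not the authors' implementation: a small neural-network classifier decides, per edge of the domain decomposition, whether adaptive constraints (i.e., a local eigenvalue problem) are likely to be needed, based on the coefficient sampled near that edge. The sampling grid, the network architecture, and the synthetic labeling rule are assumptions made only to keep the example self-contained; in the actual approach, training labels would be obtained by solving the edge eigenvalue problems offline.

```python
# Illustrative sketch only: classify edges before solving eigenvalue problems.
# Grid size, contrast value, and network architecture are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def sample_edge_coefficient(high_contrast, grid=(8, 8), contrast=1e6):
    """Synthetic stand-in for sampling the PDE coefficient on a small grid
    of points in the two subdomains adjacent to an edge."""
    rho = np.ones(grid)
    if high_contrast:
        # place a random high-coefficient channel crossing the edge
        row = rng.integers(0, grid[0])
        rho[row, :] = contrast
    return rho.ravel()

# Build a synthetic training set; label 1 means "adaptive constraints needed".
X, y = [], []
for _ in range(2000):
    label = rng.integers(0, 2)
    X.append(sample_edge_coefficient(bool(label)))
    y.append(label)
X, y = np.log10(np.array(X)), np.array(y)  # log scale tames the large contrast

clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
clf.fit(X, y)

# At setup time: classify each edge first, and solve the comparatively
# expensive local eigenvalue problem only where constraints are predicted.
new_edges = [sample_edge_coefficient(bool(rng.integers(0, 2))) for _ in range(5)]
needs_constraints = clf.predict(np.log10(np.array(new_edges)))
print("solve eigenvalue problems on edges:", np.nonzero(needs_constraints)[0])
```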

Conference
9/10/20 – 9/11/20

GAMM AG Workshop Computational and Mathematical Methods in Data Science

MPI für Mathematik in den Naturwissenschaften Leipzig (Live Stream)

  • Valeria Hünniger (Max Planck Institute for Mathematics in the Sciences)
  • Max von Renesse (Leipzig University)
  • André Uschmajew (Max Planck Institute for Mathematics in the Sciences)