Low-rank tensor approximation

  • André Uschmajew
G3 10 (Lecture hall)

Abstract

The concept of matrix rank is fundamental in linear algebra, and the idea of low-rank matrix approximation has found many useful applications in modern applied mathematics, for instance in data analysis and compression. The singular value decomposition is the central tool for this task and the starting point for a rich theory of low-rank matrices that combines linear algebra with analysis and geometry.
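As a concrete illustration of the matrix case, the following sketch computes a best rank-k approximation by truncating the SVD, as guaranteed by the Eckart–Young theorem. (Python with NumPy; the function name and the random test matrix are illustrative assumptions, not material from the lecture.)

# Best rank-k approximation via the truncated SVD (Eckart–Young).
# A minimal illustrative sketch; names and sizes are assumptions.
import numpy as np

def truncated_svd(A, k):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 40))
A5 = truncated_svd(A, 5)

# In the spectral norm the error equals the (k+1)-st singular value.
print(np.linalg.norm(A - A5, 2))
print(np.linalg.svd(A, compute_uv=False)[5])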

If one aims at a similar theory for higher-order tensors, one is faced with several options for generalizing the notion of rank, and for each of them the low-rank approximation task is considerably harder. In this lecture we consider some of these generalizations, including low-rank tree tensor networks such as the (hierarchical) Tucker format and the tensor train format. For them, the concept of higher-order singular value decomposition plays a role in low-rank approximation similar to that of the SVD in the matrix case. It is a main goal of this lecture to become familiar with this concept.
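To give a flavour of the higher-order case, here is a minimal sketch of the truncated higher-order SVD for a third-order tensor in the Tucker format. All names and sizes are illustrative assumptions; the truncation is known to be quasi-optimal, with error within a factor sqrt(d) of the best approximation with the same multilinear ranks, d being the order of the tensor.

# Truncated higher-order SVD (HOSVD) for a third-order tensor.
# An illustrative sketch; variable names and sizes are assumptions.
import numpy as np

def unfold(X, mode):
    # Mode-n matricization: the chosen mode indexes the rows.
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def mode_mult(X, M, mode):
    # Multiply tensor X with matrix M along the given mode.
    return np.moveaxis(np.tensordot(M, np.moveaxis(X, mode, 0), axes=1), 0, mode)

def hosvd(X, ranks):
    # Factor matrices: leading left singular vectors of each unfolding.
    U = [np.linalg.svd(unfold(X, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    # Core tensor: multilinear projection of X onto the factor spaces.
    core = X
    for n, Un in enumerate(U):
        core = mode_mult(core, Un.T, n)
    return core, U

rng = np.random.default_rng(1)
X = rng.standard_normal((10, 12, 14))
core, U = hosvd(X, (3, 4, 5))

# Reconstruct the rank-(3, 4, 5) Tucker approximation and check the error.
Y = core
for n, Un in enumerate(U):
    Y = mode_mult(Y, Un, n)
print(np.linalg.norm(X - Y))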

Additionally, some special topics of interest will be discussed, e.g., the manifold structure of low-rank matrices, the spectral norm of tensors, tensor products of operators, and motivating low-rank approximation tasks arising in high-dimensional scientific computing. Lectures will be as self-contained as possible.

Date and time info
Thursday 11:00 - 12:30

Prerequisites
Basic knowledge of linear algebra, analysis, and functional analysis

Audience
MSc students, PhD students, Postdocs

Language
English

Lecture
01.04.18 – 31.07.18

Regular lectures, Summer semester 2018

MPI for Mathematics in the Sciences / University of Leipzig; see the lecture detail pages

Katharina Matschke
MPI for Mathematics in the Sciences
Contact via email