Talk

Orthogonal decomposition of tensor trains

  • Elina Robeva (University of British Columbia)
Live Stream

Abstract

Tensor decomposition has many applications, but it is often a hard problem. Orthogonally decomposable tensors form a small subfamily of tensors that retains many of the nice properties of matrices that general tensors lack. A symmetric tensor is orthogonally decomposable if it can be written as a linear combination of tensor powers of n orthonormal vectors. We will see that the decomposition of such tensors can be found efficiently, that their eigenvectors can be computed efficiently, and that the set of orthogonally decomposable tensors of low rank is closed and can be described by a set of quadratic equations. One issue with orthogonally decomposable tensors, however, is that they form a very small subset of the set of all tensors. We expand this subset by considering orthogonally decomposable tensor trains. These are formed by placing an orthogonally decomposable tensor at each vertex of a tensor train and then contracting. We give algorithms for decomposing such tensors, both in the setting where the tensors at the vertices of the tensor train are symmetric and in the setting where they are not. This is based on joint work with Karim Halaseh and Tommi Muller.
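
For concreteness, the definition stated verbally in the abstract can be written out: a symmetric tensor T of order d is orthogonally decomposable (odeco) when

    T = \sum_{i=1}^{n} \lambda_i \, v_i^{\otimes d}, \qquad \langle v_i, v_j \rangle = \delta_{ij},

where the v_i are orthonormal vectors and the \lambda_i are scalar coefficients.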
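As a rough illustration of why odeco decompositions can be found efficiently, the following is a minimal Python sketch (using numpy) of the classical tensor power iteration with deflation for a symmetric order-3 odeco tensor. This is not necessarily the algorithm presented in the talk, and all function names and parameters here are illustrative.

    # Illustrative sketch: recovering the components of a symmetric
    # orthogonally decomposable (odeco) order-3 tensor by power iteration.
    import numpy as np

    def odeco_tensor(lambdas, V):
        # Build T = sum_i lambdas[i] * v_i (x) v_i (x) v_i
        # from the orthonormal columns of V.
        n = V.shape[0]
        T = np.zeros((n, n, n))
        for lam, v in zip(lambdas, V.T):
            T += lam * np.einsum("i,j,k->ijk", v, v, v)
        return T

    def power_iteration(T, iters=200, seed=0):
        # Iterate the map x -> T(I, x, x) / ||T(I, x, x)||.
        # For an odeco tensor this converges to one of the components v_i.
        rng = np.random.default_rng(seed)
        x = rng.standard_normal(T.shape[0])
        x /= np.linalg.norm(x)
        for _ in range(iters):
            x = np.einsum("ijk,j,k->i", T, x, x)
            x /= np.linalg.norm(x)
        return x

    # Toy example: build a random odeco tensor, then peel off its components.
    n = 4
    rng = np.random.default_rng(1)
    V, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal columns
    lambdas = np.array([3.0, 2.0, 1.5, 1.0])
    T = odeco_tensor(lambdas, V)
    for r in range(n):
        v = power_iteration(T, seed=r)
        lam = np.einsum("ijk,i,j,k->", T, v, v, v)      # eigenvalue = T(v, v, v)
        T = T - lam * np.einsum("i,j,k->ijk", v, v, v)  # deflate the found term

The orthogonality of the v_i is what makes the deflation step exact here; for general low-rank tensors this greedy strategy does not work, which is one reason the odeco family is computationally attractive.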

Seminar

March 17, 2020

Nonlinear Algebra Seminar Online (NASO)

MPI for Mathematics in the Sciences, Live Stream

Contact: Katharina Matschke, MPI for Mathematics in the Sciences