Abstract for the talk on 03.11.2020 (17:00 h)
Nonlinear Algebra Seminar Online (NASO)
Elina Robeva (University of British Columbia)
Orthogonal decomposition of tensor trains
Tensor decomposition has many applications, but it is often a hard problem. Orthogonally decomposable tensors form a small subfamily of tensors that retains many of the nice properties of matrices which general tensors lack. A symmetric tensor is orthogonally decomposable if it can be written as a linear combination of tensor powers of n orthonormal vectors. We will see that the decomposition of such tensors can be found efficiently, their eigenvectors can be computed efficiently, and the set of orthogonally decomposable tensors of low rank is closed and can be described by a set of quadratic equations. One drawback of orthogonally decomposable tensors, however, is that they form a very small subset of the set of all tensors. We expand this subset and consider orthogonally decomposable tensor trains. These are formed by placing an orthogonally decomposable tensor at each of the vertices of a tensor train and then contracting. We give algorithms for decomposing such tensors in both the setting where the tensors at the vertices of the tensor train are symmetric and the setting where they are non-symmetric. This is based on joint work with Karim Halaseh and Tommi Muller.
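As an illustration of the definition in the abstract (not material from the talk itself), the following sketch builds a symmetric orthogonally decomposable order-3 tensor T = Σᵢ λᵢ vᵢ⊗vᵢ⊗vᵢ from orthonormal vectors vᵢ and checks the eigenvector property mentioned above: contracting T twice with vᵢ returns λᵢ vᵢ. The variable names and the choice of coefficients are illustrative only.

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)
# Orthonormal vectors v_1, ..., v_n: columns of Q from a QR decomposition.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
lams = np.array([2.0, -1.5, 0.5])  # arbitrary illustrative coefficients

# Build the odeco tensor T = sum_i lams[i] * v_i (x) v_i (x) v_i.
T = np.zeros((n, n, n))
for lam, v in zip(lams, Q.T):
    T += lam * np.einsum('i,j,k->ijk', v, v, v)

# Each v_i is an eigenvector of T: T(v_i, v_i, .) = lams[i] * v_i.
for lam, v in zip(lams, Q.T):
    w = np.einsum('ijk,j,k->i', T, v, v)
    assert np.allclose(w, lam * v)
```

For general symmetric tensors neither the decomposition nor the eigenvectors can be computed this directly; the efficiency claims in the abstract are specific to the orthogonally decomposable case.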