Poster Session
Abstract
Low-Rank Approximability and Entropy Area Laws for PDEs
Mazen Ali, Ulm University, Germany
Please see the abstract as a PDF file.
A Jacobi-Davidson Method on the manifold of Tensors with fixed TT-rank
Henrik Eisenmann, Technische Universität Berlin, Germany
We present a generalization of the Jacobi-Davidson method to manifolds of tensors with fixed tensor-train rank. If eigenvectors of symmetric operators are known to be of approximately low rank, this method inherits the advantages of the original Jacobi-Davidson method while remaining computationally feasible. For this purpose, a correction equation is derived by approximating the Riemannian Hessian of the Rayleigh quotient on the tensor-train manifold for a Riemannian Newton method. It is shown how to approximately solve the correction equation, and finally ground states of molecular Hamiltonians are computed as numerical experiments.
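To fix the underlying idea in the simplest setting, the following is a hedged sketch of the classical Jacobi-Davidson iteration in plain matrix form, without the tensor-train manifold structure or the Riemannian Hessian of the abstract; the function name and the least-squares solve of the correction equation are illustrative choices, not the authors' implementation.

```python
import numpy as np

def jacobi_davidson(A, tol=1e-10, max_iter=50):
    """Sketch of classical Jacobi-Davidson for a symmetric matrix A.
    Targets the smallest Ritz value in each step (no TT structure)."""
    n = A.shape[0]
    V = np.random.randn(n, 1)
    V /= np.linalg.norm(V)
    for _ in range(max_iter):
        # Rayleigh-Ritz extraction on the current search space
        theta_all, S = np.linalg.eigh(V.T @ A @ V)
        theta, u = theta_all[0], V @ S[:, 0]
        r = A @ u - theta * u
        if np.linalg.norm(r) < tol:
            break
        # Correction equation: (I - u u^T)(A - theta I)(I - u u^T) t = -r
        P = np.eye(n) - np.outer(u, u)
        t = np.linalg.lstsq(P @ (A - theta * np.eye(n)) @ P, -r, rcond=None)[0]
        t -= u * (u @ t)  # keep the correction orthogonal to u
        # Expand and re-orthonormalize the search space
        V = np.linalg.qr(np.hstack([V, t[:, None]]))[0]
    return theta, u
```

The manifold version of the poster replaces the flat projector `P` by projections onto the tangent space of the fixed-TT-rank manifold.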
Existence of best low rank approximations for tensors of order 3
Eric Evert, KU Leuven (Kortrijk), Belgium
Low rank tensor decomposition is a fundamental tool in many applications, including data analysis and machine learning. In practice, we work with a measurement of a low rank tensor which is corrupted by additive noise. Generically, this measurement has high rank. High rank tensors may fail to have best low rank approximations, and low rank decompositions are uninterpretable when this failure occurs. We provide guarantees for the existence of best low rank approximations over $\mathbb{R}$.
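The failure phenomenon mentioned here can be seen numerically in the classical example of de Silva and Lim: a rank-3 order-3 tensor with no best rank-2 approximation, since rank-2 tensors with diverging factors approach it arbitrarily closely. A small sketch:

```python
import numpy as np

def outer3(x, y, z):
    """Outer product of three vectors (a rank-1 order-3 tensor)."""
    return np.einsum('i,j,k->ijk', x, y, z)

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
# Rank-3 tensor with border rank 2 (de Silva & Lim example):
T = outer3(a, a, b) + outer3(a, b, a) + outer3(b, a, a)

for n in [1, 10, 100, 1000]:
    v = a + b / n
    Tn = n * outer3(v, v, v) - n * outer3(a, a, a)  # rank <= 2
    # The error shrinks like 1/n while the rank-2 factors blow up,
    # so the infimum over rank-2 tensors is not attained.
    print(n, np.linalg.norm(T - Tn))
```

Existence guarantees such as those of the abstract rule out exactly this kind of escape to infinity for suitable classes of measurements.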
Statistics Rooted Decomposition Method: Tridiagonal Folmat Enhanced Multivariance Products Representation
Zeynep Gündoğar, Fatih Sultan Mehmet Vakıf University (Istanbul), Turkey
This research presents a novel statistics-rooted decomposition method for multidimensional arrays, called "Tridiagonal Folmat Enhanced Multivariance Products Representation (TFEMPR)". It was developed for decomposing multidimensional arrays following the philosophy behind Tridiagonal Matrix Enhanced Multivariance Products Representation (TMEMPR). The TMEMPR method decomposes a matrix with finitely or infinitely many rows and columns into a sum of outer products whose coefficients can be gathered into a core matrix which is tridiagonal, in contrast to the diagonal core matrix of the Singular Value Decomposition (SVD). TFEMPR can be considered a higher-order analogue of this matrix decomposition, since it operates on folmats and folvecs, the higher-order array counterparts of ordinary linear algebraic matrices and vectors.
TFEMPR has widespread applications in areas such as image and video processing, reconstruction problems, and data compression. In this research, some experimental results from these applications will be given.
Nonlinear classification: A Kernelized Support Tensor Train Machine
Kirandeep Kour, Max Planck Institute Magdeburg, Germany
Please see the abstract as a PDF file.
Multivariate polynomial root finding with tensor techniques
Patrick Kürschner, KU Leuven (Kortrijk), Belgium
We show how the root finding problem for systems of multivariate polynomials can be solved by means of tensor decompositions. One way is to reformulate the polynomial system as a linear system of equations whose solution vectors admit a representation as a canonical polyadic decomposition of a tensor. As a consequence, this enables the application of specialized numerical solvers which exploit this structure. Another approach uses the Macaulay matrix, whose null space admits a representation as a tensor decomposition which reveals the roots.
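A minimal sketch of the Macaulay-matrix idea on a toy system (this is a standard null-space root-finding construction, not the authors' tensor-based algorithm): the null space of the Macaulay matrix is spanned by monomial-evaluation vectors at the roots, and shift-invariance between monomial blocks turns root extraction into an eigenvalue problem.

```python
import numpy as np

# Toy system: p1 = x^2 + y^2 - 2, p2 = x - y, with roots (1, 1) and (-1, -1).
# Macaulay matrix at degree 2; columns = monomials [1, x, y, x^2, x*y, y^2]:
M = np.array([
    [-2.0, 0.0,  0.0, 1.0,  0.0,  1.0],  # p1
    [ 0.0, 1.0, -1.0, 0.0,  0.0,  0.0],  # p2
    [ 0.0, 0.0,  0.0, 1.0, -1.0,  0.0],  # x * p2
    [ 0.0, 0.0,  0.0, 0.0,  1.0, -1.0],  # y * p2
])

# Null space basis: spanned by [1, x, y, x^2, x*y, y^2] evaluated at the roots
N = np.linalg.svd(M)[2][np.linalg.matrix_rank(M):].T   # 6 x 2

# Shift-invariance: multiplying the rows [1, x, y] by x gives rows [x, x^2, x*y]
A1, Ax, Ay = N[[0, 1, 2]], N[[1, 3, 4]], N[[2, 4, 5]]
Mx = np.linalg.lstsq(A1, Ax, rcond=None)[0]
My = np.linalg.lstsq(A1, Ay, rcond=None)[0]
xs, W = np.linalg.eig(Mx)                    # eigenvalues = x-coordinates
ys = np.diag(np.linalg.solve(W, My @ W))     # shared eigenvectors pair x with y
roots = sorted((float(x), float(y)) for x, y in zip(xs.real, ys.real))
print(roots)
```

For larger systems, the null space (or the solution set of the linear reformulation) inherits a tensor structure that the specialized solvers of the abstract exploit.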
Randomized methods for recompression of low-rank tensors
Lana Perisa, EPFL (Lausanne), Switzerland
Many basic linear algebra operations with low-rank tensors, like addition, matrix-vector multiplication, or element-wise products, tend to significantly increase the rank, even though the resulting tensors admit a good low-rank approximation. We use randomized algorithms to recompress such tensors in low-rank formats such as Tucker and tensor train, employing random vectors with a rank-1 structure that matches the structure of the tensors. For the element-wise product of tensors, this has been shown to significantly reduce the computational effort while achieving an accuracy similar to that of the corresponding deterministic techniques.
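The principle can be sketched in the matrix case: a product of low-rank factors is recompressed by sketching its range with random vectors and never forming the full matrix. This is a generic randomized range-finder sketch with an unstructured Gaussian sketch, not the rank-1-structured random vectors of the poster.

```python
import numpy as np

def randomized_recompress(U, V, target_rank, oversample=5):
    """Recompress A = U @ V.T to target_rank without forming A densely.
    Matrix analogue of randomized low-rank tensor recompression."""
    k = target_rank + oversample
    Omega = np.random.randn(V.shape[0], k)
    Y = U @ (V.T @ Omega)            # sample the range of A cheaply
    Q, _ = np.linalg.qr(Y)
    B = (Q.T @ U) @ V.T              # small projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    r = target_rank
    return Q @ Ub[:, :r] * s[:r], Vt[:r].T   # new factors of rank r
```

In the tensor setting, choosing `Omega` with Kronecker (rank-1) structure is what makes the sketching products cheap to evaluate directly in Tucker or tensor-train format.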
The ED polynomial of the dual varieties of Segre-Veronese varieties
Luca Sodomaco, University of Florence, Italy
We outline the main properties of the ED polynomial of a real algebraic variety, where ED stands for "Euclidean Distance". Then we focus on the ED polynomials of dual varieties of Segre-Veronese varieties, showing a close relationship with the spectral theory of partially symmetric tensors: the roots of these ED polynomials correspond to the singular values of a partially symmetric tensor. In particular, we investigate their lowest and highest coefficients and the corresponding vanishing loci.
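A minimal example of an ED polynomial (our illustration; the notation may differ from the author's): for the unit circle $X = \{x \in \mathbb{R}^2 : \lVert x \rVert = 1\}$ and a general data point $u$, the critical points of the distance function are $\pm u/\lVert u \rVert$, so

```latex
\mathrm{EDpoly}_{X,u}(\varepsilon)
  = \bigl(\varepsilon^{2} - (\lVert u\rVert - 1)^{2}\bigr)
    \bigl(\varepsilon^{2} - (\lVert u\rVert + 1)^{2}\bigr),
```

whose nonnegative roots $\lvert \lVert u\rVert - 1 \rvert$ and $\lVert u\rVert + 1$ are the distances from $u$ to the nearest and farthest points of $X$. Note that the lowest coefficient $(\lVert u\rVert^{2} - 1)^{2}$ vanishes exactly when $u \in X$, which hints at why the lowest and highest coefficients carry geometric information.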
Moment ideals of local Dirac mixtures
Markus Wageringel, Osnabrück University, Germany
Moments are quantities that measure the shape of statistical or stochastic objects and have recently been studied from a more algebraic and combinatorial point of view. We study a special case of finite mixture models, the Dirac local mixtures, highlighting connections to algebraic statistics and signal processing. We focus on the moment varieties of first order local mixtures, providing generators for the ideals and showing a connection to moment ideals of Pareto distributions.
Further, we consider mixture models of these distributions and investigate the problem of recovering the parameters of such a distribution from its low-rank moment matrix, using an extension of Prony's method. We showcase our results with an application in signal processing.
This is joint work with Alexandros Grosdos Koutsoumpelias (Osnabrück University).
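To illustrate the recovery step, here is a hedged sketch of the classical Prony method for plain Dirac mixtures (not the first-order local mixtures of the abstract, whose extension of Prony's method is the authors' contribution): the support points are eigenvalues of a shifted Hankel pencil built from the moments.

```python
import numpy as np

def prony_dirac(moments, r):
    """Recover r Dirac locations t_j and weights c_j from moments
    m_k = sum_j c_j * t_j**k, k = 0, ..., 2r-1 (classical Prony)."""
    H0 = np.array([[moments[i + j] for j in range(r)] for i in range(r)])
    H1 = np.array([[moments[i + j + 1] for j in range(r)] for i in range(r)])
    # Locations: eigenvalues of the shifted Hankel pencil (H1, H0)
    t = np.linalg.eigvals(np.linalg.solve(H0, H1))
    # Weights: solve the Vandermonde system sum_j c_j t_j**k = m_k
    V = np.vander(t, r, increasing=True).T
    c = np.linalg.solve(V, np.asarray(moments[:r]))
    return t, c

# Hypothetical example mixture: weights c_j at locations t_j
t_true, c_true = [-1.0, 0.5, 2.0], [2.0, 1.0, 0.5]
moments = [sum(c * t**k for t, c in zip(t_true, c_true)) for k in range(6)]
t, c = prony_dirac(moments, 3)
```

The rank of the Hankel moment matrix `H0` reveals the number of mixture components, which is the kind of low-rank structure the abstract exploits.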