In the early twentieth century, Élie Cartan solved the equivalence problem for submanifolds under the action of a Lie group. In essence, two (suitably regular) submanifolds can be locally mapped to one another by a group transformation if and only if their differential invariants have identical functional relationships. Cartan's result was subsequently reformulated by the author by introducing the notion of a signature, which is the submanifold parametrized by the fundamental differential invariants. The subsequent equivariant method of moving frames made this result completely algorithmic, and applicable to arbitrary Lie group (and even Lie pseudo-group) actions. In this talk, I will discuss some of the history, survey basic ideas and algorithms, and present a few of the many applications, including the automatic reassembly of objects: jigsaw puzzles, egg shells, and broken bones. I will endeavor to keep the talk accessible to a general audience.
The analysis of animal bones extracted from archaeological sites is important for understanding how sites were formed and how early humans interacted with animals on the landscape. Oftentimes, bones are fragmented. One method for analyzing fragmentary bone is to put the fragments back together. This task is done manually, which is daunting when working with large faunal assemblages composed of thousands of bone fragments. Nonetheless, refitting is important, and 3D imaging of bone fragments offers the opportunity to use powerful mathematical and computational tools to develop more efficient refitting methods. Fully leveraging these tools necessitates collaboration between mathematicians and anthropologists. The Anthropological and Mathematical Analysis of Archaeological and Zooarchaeological Evidence (AMAAZE) is a consortium that promotes such collaborations. Here, I will provide an overview of current AMAAZE projects, emphasizing our work on refitting. I will describe ways in which refitting is used to answer important anthropological questions and highlight some of the specific challenges we encounter when trying to reconstruct fragmented long bones.
We will be concerned with the application of signatures to machine learning. The basic principle of the signature method is to represent multidimensional paths by a graded feature set of their iterated integrals, called the signature. On the one hand, in order to combine signatures with machine learning algorithms, it is necessary to truncate these infinite series. Therefore, we define an estimator of the truncation order and provide theoretical guarantees in a linear functional regression setting. On the other hand, the signature method presents several variations, which can be grouped into "augmentations", "windows", "transforms" and "rescalings". We perform an empirical study on which aspects of this framework typically produce the best results for multivariate time series classification. Combining the top choices produces a canonical pipeline for the generalised signature method, which demonstrates state-of-the-art accuracy.
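As a concrete illustration of the iterated-integral feature set described above, the sketch below computes the signature of a piecewise-linear path truncated at depth 2. It is a minimal example, not the talk's actual pipeline: for a single linear segment the level-2 signature is half the outer product of the increment with itself, and segments are combined with Chen's identity. All function names here are illustrative.

```python
def segment_signature(delta):
    """Depth-2 signature of one linear segment with increment vector delta:
    level 1 is delta itself; level 2 is delta_i * delta_j / 2."""
    d = len(delta)
    s1 = list(delta)
    s2 = [[delta[i] * delta[j] / 2.0 for j in range(d)] for i in range(d)]
    return s1, s2

def chen_concatenate(sig_a, sig_b):
    """Chen's identity at depth 2: level 1 adds; level 2 adds plus
    the cross term (level 1 of a) tensor (level 1 of b)."""
    a1, a2 = sig_a
    b1, b2 = sig_b
    d = len(a1)
    s1 = [a1[i] + b1[i] for i in range(d)]
    s2 = [[a2[i][j] + b2[i][j] + a1[i] * b1[j] for j in range(d)]
          for i in range(d)]
    return s1, s2

def path_signature(points):
    """Depth-2 signature of the piecewise-linear path through `points`."""
    d = len(points[0])
    sig = ([0.0] * d, [[0.0] * d for _ in range(d)])
    for p, q in zip(points, points[1:]):
        delta = [q[k] - p[k] for k in range(d)]
        sig = chen_concatenate(sig, segment_signature(delta))
    return sig
```

Level 1 recovers the total displacement of the path, and the antisymmetric part of level 2 is the Lévy area; truncating at a higher depth, as the estimator discussed in the talk addresses, trades feature richness against dimensionality.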
Signatures describe curves as a sequence of tensors of increasing degree. Consequently random curves, that is stochastic processes, can be studied by a sequence of random tensors of increasing degree. For example, in analogy to the classical moment generating function, their expected value can characterize the underlying probability measure. I will talk about how this leads to interesting theoretical questions and, vice versa, how applications in statistical inference can inspire theoretical results and new questions.
Tim Duff (Georgia Tech, USA): Numerical Algebraic Geometry meets Differential Signatures
We apply numerical algebraic geometry to the invariant-theoretic problem of detecting symmetries between two plane algebraic curves. We describe an efficient equality test which determines, with "probability one", whether or not two rational maps have the same image up to Zariski closure. The application to invariant theory is based on the construction of suitable signature maps associated to a group acting linearly on the respective curves. We consider two versions of this construction: differential and joint signature maps. In our examples and computational experiments, we focus on the complex Euclidean group, and introduce an algebraic joint signature that we prove determines equivalence of curves under this action. We demonstrate that the test is efficient and use it to empirically compare the sensitivity of differential and joint signatures to noise.
Olga Kuznetsova (Aalto University, Finland): Exact solutions in log-concave maximum likelihood estimation
Shape-constrained density estimation has gained attention in recent years. We focus on the case when the densities on R^d are log-concave. It has been shown that the logarithm of the optimal log-concave density is piecewise linear and supported on a regular subdivision of the given data sample (Cule, Samworth, Stewart), and that every regular subdivision arises in the MLE for some set of weights (Robeva, Sturmfels, Uhler). We further the understanding of the log-concave MLE by studying its exact solutions and connecting this research to recent developments in the solution of polynomial-exponential systems.
Darrick Lee (University of Pennsylvania, USA): Path Signatures on Lie Groups
Path signatures are powerful nonparametric tools for time series analysis, shown to form a universal and characteristic feature map for Euclidean-valued time series data. We lift the theory of path signatures to the setting of Lie group-valued time series, adapting these tools for time series with underlying geometric constraints. We prove that this generalized path signature is universal and characteristic. To demonstrate universality, we analyze the human action recognition problem in computer vision, using SO(3) representations for the time series, achieving performance comparable to other shallow learning approaches while offering an easily interpretable feature set. We also provide a two-sample hypothesis test for Lie group-valued random walks to illustrate its characteristic property.
Alexander Schmeding (University of Bergen, Norway): Shape Analysis on Lie Groups (and beyond) with Applications
Miruna-Stefana Sorea (Max-Planck-Institut für Mathematik in den Naturwissenschaften, Germany): The shapes of level curves of real polynomials near strict local minima
We consider a real bivariate polynomial function vanishing at the origin and exhibiting a strict local minimum at this point. We work in a neighbourhood of the origin in which the non-zero level curves of this function are smooth Jordan curves. Whenever the origin is a Morse critical point, the sufficiently small levels become boundaries of convex disks. Otherwise, these level curves may fail to be convex. The aim of this talk is two-fold. Firstly, we study a combinatorial object measuring this non-convexity: a planar rooted tree. Secondly, we characterise all possible topological types of these objects. To this end, we construct a family of polynomial functions with non-Morse strict local minima realising a large class of such trees.
This talk is devoted to two related curve matching problems that have numerous applications in computer vision and shape analysis: (1) the equivalence problem of deciding whether or not two plane curves are equivalent under the actions of the projective group and its subgroups; (2) the projection problem of deciding whether a given plane curve is the image of a given space curve under central and parallel projections. I will discuss solutions to these problems based on differential and integral invariants, as well as their approximations. I will emphasize the differences between the smooth and algebraic cases.
This talk will focus on using the Euclidean signature to determine whether two smooth planar curves are congruent under the special Euclidean group. Work by Emilio Musso and Lorenzo Nicolodi emphasizes that signatures must be used with caution: they construct 1-parameter families of non-congruent curves with degenerate vertices (curve segments of constant curvature) that have identical signatures. We address the claim, made by Mark Hickman, that the Euclidean signature uniquely identifies curves without degenerate vertices. While the claim is true for simple, closed curves with simple signatures, it fails for curves with non-simple signatures. For such curves, we associate a directed graph (a signature quiver) with the signature and show how various paths along the quiver give rise to a family of non-congruent, non-degenerate curves with identical Euclidean signatures. Using this additional structure, we formulate congruence criteria for non-degenerate, closed, simple planar curves.
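For readers unfamiliar with the object of this abstract: the Euclidean signature of a planar curve is the curve traced by the pair (curvature, derivative of curvature with respect to arc length). The sketch below samples it numerically by finite differences for a smooth parametrized curve; it is a rough illustration with an illustrative function name, not the congruence test discussed in the talk.

```python
import math

def euclidean_signature(x, y, t0, t1, n=1000, h=1e-4):
    """Sample the Euclidean signature curve (kappa, d kappa/ds) of the
    parametrized planar curve t -> (x(t), y(t)) using central differences."""
    def kappa(t):
        # Curvature of a parametrized plane curve:
        # (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2)
        xp = (x(t + h) - x(t - h)) / (2 * h)
        yp = (y(t + h) - y(t - h)) / (2 * h)
        xpp = (x(t + h) - 2 * x(t) + x(t - h)) / h**2
        ypp = (y(t + h) - 2 * y(t) + y(t - h)) / h**2
        return (xp * ypp - yp * xpp) / (xp * xp + yp * yp) ** 1.5

    pts = []
    for i in range(n):
        t = t0 + (t1 - t0) * i / n
        xp = (x(t + h) - x(t - h)) / (2 * h)
        yp = (y(t + h) - y(t - h)) / (2 * h)
        speed = math.hypot(xp, yp)
        # Convert d kappa/dt to d kappa/ds by dividing by the speed.
        kappa_s = (kappa(t + h) - kappa(t - h)) / (2 * h * speed)
        pts.append((kappa(t), kappa_s))
    return pts
```

For a circle of radius r, the signature degenerates to the single point (1/r, 0), which is precisely the "degenerate vertex" phenomenon (constant curvature) that makes signature-based congruence testing delicate.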
Time series generation is a challenging problem, as existing generative adversarial networks (GANs) usually cannot capture the temporal dynamics well. Besides, the training of GAN models is computationally expensive and unstable. In this talk, I will present our recent work on a novel GAN framework for time series generation, which uses the principled and universal (log-)signature feature of time series to extract the temporal dependence of time series and to design a more compact generator and discriminator. Numerical results show that our method improves the stability and computational efficiency of training while capturing the temporal dynamics of the observed time series.
When data are geometrical objects, classical statistical tools cannot be employed directly because the object space is no longer Euclidean. Even worse, often one is interested in object descriptors that are different from the original objects, e.g. principal components, which live on real projective spaces. Analogs of such descriptors usually live in even more non-Euclidean spaces. Extending the concept of the Euclidean expected value to generalized Fréchet means, we provide a framework, for instance for dimension reduction and hypothesis testing. Further, we sketch some of the pitfalls coming along with the non-Euclidean geometry, e.g. finite sample smeariness, and how to avoid them. We conclude with a list of open problems.
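The Fréchet mean generalizes the Euclidean expected value by minimizing the expected squared geodesic distance. A minimal sketch, assuming the simplest non-Euclidean object space (angles on the unit circle) and a brute-force grid search in place of a proper optimizer; the function names are illustrative only.

```python
import math

def geodesic_dist(a, b):
    """Arc-length distance between two angles on the unit circle."""
    d = abs(a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

def frechet_mean(angles, n_grid=3600):
    """Fréchet mean on the circle: the point minimizing the sum of
    squared geodesic distances to the sample (grid-search sketch)."""
    best, best_val = None, float("inf")
    for i in range(n_grid):
        c = 2 * math.pi * i / n_grid
        val = sum(geodesic_dist(c, a) ** 2 for a in angles)
        if val < best_val:
            best, best_val = c, val
    return best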
Archaeologists rarely find complete examples of the things from the past that they study. They find instead the broken, discarded and often widely strewn and fragmented remains of once complete objects. Putting these things back together again, when possible, is difficult, requires persons with special skill sets, and is time consuming. Thus, while the insights gained from studying whole objects are often significant, archaeologists have mostly learned to work with assemblages of fragmented remains. Here I will mainly consider the challenges and potentials of putting stone artefacts back together. Stone has the quality of being incredibly durable and has been ubiquitous in space and time since its first use over 2.5 million years ago. It also has the unique property that the breaking of the material (known as knapping) is the actual behavior we want to reconstruct. Thus refitting the pieces in this case tells us much more than what the object looked like at the start: it tells us what people did and to what ends. I will discuss some examples of the challenges and rewards of refitting stone and why stone might be suitable for automated, statistical or mathematical approaches. I will also present an independent research program to simulate knapping that may offer a data set also suitable for testing automated approaches to putting stones back together.
Time-varying shapes exist in abundance especially in life sciences where shape changes within and between individuals are tracked over time to gain insights into dynamic processes, such as aging or disease progression. A major challenge in the digital age is to deal with the increasing amount of such data, for example from large-scale clinical studies. This requires the development of robust, efficient, and consistent analysis and processing tools.
In this talk we will discuss recent approaches for the analysis of longitudinal shape data using generative hierarchical models. Such models describe the within-individual changes as smooth, parametric curves, which in turn are considered as perturbations of a population-average trend.
To this end, we present a principled way of comparing shape trends in terms of a novel Riemannian metric, which increases the computational efficiency and does not require the implementation of the curvature tensor. We propose the corresponding variational time discretization of geodesics and apply it to the estimation of group trends and statistical testing of 3D shapes derived from epidemiological imaging studies.
Consider a complex network of interconnected input-output systems. The input can, for example, be a time-series or a curve in $\mathbb{R}^m$. The objective is to understand the resulting input-output system represented by the network. This is accomplished by providing an extension of the notion of a "Kawski-Sussmann formal system". This is a formal ODE whose solution is viewed as a trajectory evolving in an infinite-dimensional formal Lie group $G$ and whose tangent space at each point is identified with the shuffle algebra over $X$. It is shown that this method can produce the generating series of any of the networks appearing in the literature, but it can also be applied to networks that are too complex for any existing method. (Based on joint work with W. S. Gray.)
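The shuffle algebra mentioned in this abstract has a very concrete combinatorial core: the shuffle product of two words is the formal sum of all interleavings that preserve the internal order of each word. A minimal sketch (representing the product as a plain list of words, with an illustrative function name):

```python
def shuffle(u, v):
    """All interleavings of the words u and v that preserve the internal
    letter order of each; the shuffle product is their formal sum."""
    if not u:
        return [v]
    if not v:
        return [u]
    # Either the first letter of u or the first letter of v comes first.
    return ([u[0] + w for w in shuffle(u[1:], v)] +
            [v[0] + w for w in shuffle(u, v[1:])])
```

For words of lengths m and n there are binomial(m+n, m) interleavings (counted with multiplicity), and in the iterated-integral setting the product of two coordinate integrals expands as the shuffle of the corresponding words.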
Signatures provide a succinct description of certain features of paths in a reparametrization invariant way. We propose a method for classifying shapes based on signatures on Lie groups, and compare it to current approaches based on the SRV transform and dynamic programming. This talk is based on joint work with E. Celledoni and P. E. Lystad.
Suppose that some microphones are placed on a drone inside a room with planar walls/floors/ceilings. A loudspeaker emits a sound impulse and the microphones receive several delayed responses corresponding to the sound bouncing back from each planar surface. These are the first-order echoes. In this talk, we will discuss the problem of reconstructing the shape of the room from the first-order echoes. The time delay for each echo determines the distance from the microphone to a mirror image of the source reflected across a wall. Since we do not know which echo corresponds to which wall, the distances are unlabeled. The problem is to determine under which circumstances, and how, one can recover the correct distance-wall assignments and reconstruct the wall positions. Our algorithm uses a simple echo sorting criterion to recover the wall assignments for the echoes. We prove that, if the position and orientation of the drone are generic, then the wall assignment obtained through our echo sorting criterion must be the right one, and thus the reconstruction obtained through our algorithm is correct. Our proof uses methods from computational commutative algebra. This is joint work with Gregor Kemper.
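The mirror-image relation underlying the echo model above is easy to make explicit. The sketch below computes the reflection of the source across a wall and the resulting first-order echo path length, which equals the straight-line distance from the microphone to the mirrored source; it illustrates only this geometric fact, not the echo sorting criterion or the algebraic proof. Function names are illustrative.

```python
import math

def mirror_image(source, normal, offset):
    """Reflect a point across the plane {x : <normal, x> = offset}.
    `normal` is assumed to be a unit vector."""
    t = sum(n * s for n, s in zip(normal, source)) - offset
    return tuple(s - 2 * t * n for s, n in zip(source, normal))

def first_order_echo_distance(source, mic, normal, offset):
    """Length of the echo path source -> wall -> mic, which equals the
    distance from the mic to the mirror image of the source."""
    return math.dist(mirror_image(source, normal, offset), mic)
```

Measuring these path lengths for every wall, but without the wall labels, is exactly the unlabeled-distance data from which the room must be reconstructed.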