Magnitude is an isometric invariant of metric spaces introduced by Leinster around 2010; it is currently the object of intense research, since it has been shown to encode many known invariants of metric spaces. In recent work, Govc and Hepworth introduced persistent magnitude, a numerical invariant of a filtered simplicial complex associated to a metric space. Inspired by Govc and Hepworth's definition, we introduced alpha magnitude. Alpha magnitude presents computational advantages over both magnitude and Rips magnitude, and is thus an easily computable new measure for the estimation of fractal dimensions of real-world data sets. I will also briefly discuss work in progress on a clustering algorithm based on the alpha magnitude. This is joint work with Miguel O'Malley and Nina Otter.
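For a finite metric space, magnitude has a concrete formula: it is the sum of the entries of the inverse of the similarity matrix. A minimal sketch in Python (the alpha magnitude itself is built from alpha complexes and is not reproduced here):

```python
import numpy as np

def magnitude(D, t=1.0):
    """Magnitude of a finite metric space with distance matrix D at scale t.

    The similarity matrix Z has entries Z_ij = exp(-t * d(i, j)); when Z is
    invertible, the magnitude is the sum of the entries of Z's inverse.
    """
    Z = np.exp(-t * np.asarray(D, dtype=float))
    return np.linalg.inv(Z).sum()

# Two points at distance d have magnitude 2 / (1 + e^{-td}).
D = np.array([[0.0, 1.0], [1.0, 0.0]])
print(magnitude(D))  # 2 / (1 + e^{-1}) ≈ 1.4621
```

Tracking the magnitude as the scale parameter t grows yields the magnitude function, whose growth rate underlies the fractal-dimension estimates mentioned above.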

In this talk, we consider the simplest class of stratified spaces – linearly embedded graphs. We present a method that learns the abstract structure of an embedded graph and models the specific embedding from a point cloud sampled from it. We use tools and inspiration from computational geometry, algebraic topology, and topological data analysis, and prove the correctness of the identified abstract structure under assumptions on the embedding. The algorithm is implemented in the Julia package Skyler, which we used for the numerical simulations.

In this talk I will describe a new model for time-varying graphs (TVGs) based on persistent topology and cosheaves. In its simplest form, this model presents TVGs as matrices with entries in the semi-ring of subsets of time; applying the classic Kleene star construction yields novel summary statistics for space networks (such as STARLINK) called "lifetime curves." In its more complex form, this model leads to a natural featurization and discrimination of certain Earth-Moon-Mars communication scenarios using zig-zag persistent homology. Finally, and if time allows, I will describe recent work with David Spivak and NASA, which provides a complete description of delay tolerant networking (DTN) in terms of an enriched double category.

Permutation patterns appear in a plethora of research areas, including statistics, combinatorics, algebra, geometry, and computer science. As an example, counting the number of occurrences of small patterns is relevant in nonparametric statistics. The naive counting algorithm has polynomial complexity with degree corresponding to the size of the pattern. Recent breakthrough work by Zohar et al. (2021) shows that some patterns can be counted in nearly linear time. To address the problem, Zohar et al. introduced corner trees, finite rooted trees endowed with some additional data. To understand the subspace of permutations expressible by corner trees, we encode both corner trees and permutations as double posets, i.e. sets equipped with two partial order relations.
This approach seems beneficial for several reasons: various corner trees are the same double poset in disguise; we can use results from category theory that concern the factorization of morphisms, and these factorizations yield immediate proofs of results appearing in the article by Zohar et al. (2021); and we introduce a nearly quadratic algorithm to count a double poset which is almost a corner tree. This allows us to count all permutations up to level 4.
This is ongoing work with Joscha Diehl.
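For context, the naive counting algorithm mentioned above fits in a few lines; its cost is O(n^k) for a pattern of size k, which is precisely what corner trees improve upon. A sketch (not the corner-tree algorithm itself):

```python
from itertools import combinations

def pattern_count(sigma, pi):
    """Naively count occurrences of the pattern pi in the permutation sigma.

    An occurrence is a subsequence of sigma whose entries are in the same
    relative order as pi; checking every length-k subsequence costs O(n^k).
    """
    def standardize(seq):
        # Replace entries by their ranks, e.g. (4, 1, 3) -> (3, 1, 2).
        ranks = sorted(range(len(seq)), key=lambda i: seq[i])
        out = [0] * len(seq)
        for r, i in enumerate(ranks):
            out[i] = r + 1
        return tuple(out)

    return sum(1 for sub in combinations(sigma, len(pi))
               if standardize(sub) == tuple(pi))

# Occurrences of the ascent pattern 12 in 2413: (2,4), (2,3), (1,3).
print(pattern_count((2, 4, 1, 3), (1, 2)))  # 3
```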

Classical vector valued paths are widespread across pure and applied mathematics: from stochastic processes in probability to time series data in machine learning. Parallel transport (or path development) and path signatures provide an effective method to characterize such paths while preserving the concatenation structure of paths. In this talk, we extend this framework to build structure-preserving characterizations of random and possibly nonsmooth surfaces using surface holonomy. This is based on joint work with Harald Oberhauser.

In this presentation, we study mixing times and the so-called cutoff phenomenon for an ergodic overdamped nonlinear non-gradient Langevin dynamics with a strongly coercive potential, driven by an additive noise with small amplitude.
The cutoff phenomenon was introduced by Diaconis and Aldous in the study of quantitative convergence to equilibrium for card-shuffling Markov models. When the driving noise is Brownian motion, the total variation distance between the current state and its equilibrium distribution decays abruptly from one to zero around the mixing time. When the noise is $\alpha$-stable with index $\alpha > 3/2$, the cutoff phenomenon still holds, while for $\alpha \leq 3/2$ our coupling techniques do not apply, and we therefore cannot conclude whether the cutoff phenomenon still holds.
In the case of a degenerate potential, we show that the cutoff phenomenon does not hold.
The talk is based on a series of papers with Milton Jara (IMPA, Brazil), Michael Högele (Universidad de los Andes, Colombia), Juan Carlos Pardo (CIMAT, Mexico) and Conrado da Costa (Durham University, UK).

We generalize the theory of $k$-minimal models in rational homotopy theory to the context of persistence. Our main motivation is the computation of barcode decompositions of rational homotopy groups of a space. We give an explicit construction of minimal models of maps between commutative differential-graded algebras, which we then use to construct minimal models of tame persistent CDGAs. We then build a model structure on persistent CDGAs for which these minimal models become cell complexes. Applications of our results include an explicit construction of Postnikov towers for copersistent spaces and a decomposition of copersistent Postnikov invariants.
This is joint work with Kathryn Hess and Samuel Lavenir.

The study of regular objects, such as polytopes, and their symmetries attracts researchers from different areas of mathematics, such as geometry and algebra, but also from other areas of knowledge such as chemistry, thanks to the high symmetry of molecules. An abstract polytope is a structure that combinatorially describes a classical polytope. Abstract regular polytopes can be described as posets, as incidence geometries, or as C-groups with linear diagrams. Hypertopes were introduced as polytope-like structures whose symmetry group is a C-group that need not have a linear diagram. The groups of these structures can be represented as faithful transitive permutation representation graphs.

Nowadays, the scientific literature discusses applications of Topological Data Analysis (TDA) in neuroscience. Nevertheless, fundamental questions in the field remain: how different are fMRI scans of one individual over a short time? Are they similar? What changes between individuals? This talk presents an approach to studying resting-state functional Magnetic Resonance Images (fMRI) with TDA methods, using the Vietoris-Rips filtration over a weighted network and looking for statistical differences between Betti curves, as well as a vectorization method using the minimum spanning tree.

Persistent homology provides a robust methodology to infer topological structures from point cloud data. In this talk I will explore the persistent homology of point clouds embedded into a probabilistic setting, exploiting the theory of point processes. I will introduce measures on the space of persistence diagrams and the self-similar scaling of a one-parameter family of these. As the main result I will discuss a packing relation between the occurring scaling exponents.

Despite intense research in stochastic topology, little is known about the case when the connectivity threshold does not shrink with sample size. In this talk I will characterise the high-dimensional topology arising from a simple low-dimensional manifold: the circle. We observe a fascinating phase transition going from a 1-sphere, to a bouquet of 2-spheres, to a 3-sphere, to a bouquet of 4-spheres, and so on. Our main tool is the expected Euler characteristic, which is computed exactly for any fixed point count. This systematic behaviour cannot be regarded as "topological noise", and calls for deeper investigation from the TDA community.

Using the second moment method in Costa and Farber's approach to random simplicial complexes, we will examine the appearance of a geometric configuration needed for the "rigid expansion" procedure in the curve complex of a surface.

This talk will summarize two recent works at the crossroads of information theory and geometry.
The first one deals with the “typical realizations” of a memoryless random source whose law is a stratified measure: a convex combination of rectifiable measures. Stratified measures generalize discrete-continuous mixtures and may have a singular continuous part (which is “carried” by manifolds of various dimensions). We state an asymptotic equipartition property for stratified measures that shows concentration of probability on subsets of a few "typical dimensions" and quantifies the volume growth of typical sequences in each stratum. This gives a concrete interpretation of Rényi’s information dimension in some cases.
The second concerns the magnitude of (possibly enriched) categories, which gives, in particular, a new isometric invariant of metric spaces. Magnitude can be seen as a categorical generalization of cardinality and shares many properties with it. In turn, the Boltzmann-Shannon entropy is a probabilistic extension of (the logarithm of) cardinality. We aim to connect the two ideas by considering a generalization of Shannon entropy to finite categories that also serves as a probabilistic extension of magnitude.

In this talk we study bounds and approximations to the change of the spectral radius of a graph when a node or edge is removed. We characterize the graphs for which some of these approximations are exact.
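As a toy illustration of the quantity being bounded (not of the bounds or characterizations from the talk), one can compute the spectral radius of a small graph before and after deleting a node:

```python
import numpy as np

def spectral_radius(A):
    """Largest absolute eigenvalue of the adjacency matrix A."""
    return max(abs(np.linalg.eigvals(A)))

def remove_node(A, v):
    """Adjacency matrix of the graph with node v (and its edges) deleted."""
    keep = [i for i in range(len(A)) if i != v]
    return A[np.ix_(keep, keep)]

# Star graph K_{1,3}: spectral radius sqrt(3). Removing the hub leaves an
# empty graph (radius 0); removing a leaf leaves K_{1,2} (radius sqrt(2)).
A = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], dtype=float)
print(spectral_radius(A))                  # ≈ 1.732
print(spectral_radius(remove_node(A, 0)))  # 0.0
print(spectral_radius(remove_node(A, 1)))  # ≈ 1.414
```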

Spectral clustering is a widely employed technique for uncovering underlying data structures. Researchers have explored various tools to analyze its theoretical properties. Recently, a variational method was employed to demonstrate the convergence of spectral clustering results towards a continuum operator that captures the geometry of the underlying manifold. This line of research not only offers an efficient explanation of spectral clustering but also inspires algorithm design.
In this talk, I will present two significant applications of this convergence, focusing on algorithm design and interpretation. First, I will discuss two sufficient conditions for designing an algorithm capable of solving multi-manifold clustering problems. I will then present an explicit algorithm that satisfies these sufficient conditions.
Moving forward, I will introduce a recent technique known as the Spectral Neural Network, which showcases state-of-the-art performance in contrastive learning. I will demonstrate that the number of neurons required to approximate the manifold's geometry does not suffer from the curse of dimensionality. Furthermore, I will explore the optimization perspective of Spectral Neural Networks, revealing the convexity of the ambient problem in a quotient geometry. Additionally, I will illustrate the existence of a locally strongly convex regime in the parameterized problem space. To support these theoretical findings, I will present some simulation results.

The self-avoiding walk is a model of statistical physics which has been studied extensively on the hypercubic lattice $\mathbb{Z}^d$. Over the last few decades, the study of self-avoiding walk beyond $\mathbb{Z}^d$ has received increasing attention. In this talk, we will consider the case of regular tessellations of the hyperbolic plane. We will show that there are exponentially fewer self-avoiding polygons than self-avoiding walks, and we will deduce that the self-avoiding walk is ballistic.

In this work, we provide an approach to signal compression and reconstruction on chain complexes that leverages the tools of algebraic discrete Morse theory. The main goal is to reduce and reconstruct a based chain complex together with a set of signals on its cells via deformation retracts, with the aim of preserving parts of the global topological structure of both the complex and the signals.
We show that any deformation retract of a real degree-wise finite-dimensional based chain complex is equivalent to a Morse matching. We then study how the signal changes under particular types of Morse matchings, showing that the reconstruction error is trivial on specific components of the Hodge decomposition. Furthermore, we provide an algorithm to compute Morse matchings that locally minimize the reconstruction error.
This is joint work with Stefania Ebli and Celia Hacker.

Start with a square grid having n rows and n columns. Split the grid into at least two squares and rectangles, without using the same sized square or the same sized rectangle twice. Once you are done, compute your score M(n): the difference between the area of the largest piece and the area of the smallest piece. What is the best score you can achieve for a fixed value of n? Is there an n such that M(n) = 0?
We will show some recent computational results that give a lower bound on n.
This is joint work in progress with Natalia Garcia Colin and Erika Roldan.

High-dimensional noisy datasets are commonly encountered in many scientific fields, and a critical step in data analysis is denoising. Under the white noise assumption, optimal shrinkage has been well developed and widely applied to many problems. However, in practice, noise is usually colored and dependent, and the algorithm needs modification. We introduce a novel fully data-driven optimal shrinkage algorithm for noise satisfying a separable covariance structure. The novelty lies in a precise rank estimation and an accurate imputation strategy. In addition to providing theoretical support under the random matrix framework, we show the performance of our algorithm on simulated datasets and apply it to extract the fetal electrocardiogram from the benchmark trans-abdominal maternal electrocardiogram, a special single-channel blind source separation challenge.
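As a point of reference (and emphatically not the data-driven optimal shrinkage of the talk), the simplest form of singular-value shrinkage is hard truncation of the SVD:

```python
import numpy as np

def svd_denoise(Y, rank):
    """Hard truncation of singular values: keep the top `rank`, zero the rest.

    A crude baseline; optimal shrinkage instead shrinks each singular value
    adaptively, and the talk's algorithm further handles separable (colored,
    dependent) noise and estimates the rank from the data.
    """
    U, s, Vt = np.linalg.svd(np.asarray(Y, dtype=float), full_matrices=False)
    s[rank:] = 0.0
    return (U * s) @ Vt

# A rank-1 signal corrupted by small noise, then truncated back to rank 1.
rng = np.random.default_rng(0)
X = np.outer(np.arange(1.0, 5.0), np.arange(1.0, 4.0))
Y = X + 0.01 * rng.standard_normal(X.shape)
X_hat = svd_denoise(Y, 1)
```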

A strong hypothesis in neuroscience is that many aspects of brain function are determined by the “map of the brain" and that its computational power relies on its connectivity architecture. Impressive scientific and engineering advances in recent years generated a plethora of large brain networks of incredibly complex architectures. A crucial aspect of the architecture is its inherent directionality reflecting the direction of information flow. One of the stark differences between directed and undirected networks is the presence of reciprocal connections. It has been shown that reciprocal connections are an overrepresented motif in neural networks and that these are formed selectively rather than randomly.
This brings forward the mathematical question: how to build appropriate null-models for directed brain networks and how the amount and location of reciprocal connections affect these? We take this question to its core and ask: how does the presence of reciprocal connections in a graph change the number of directed simplex counts (a relevant motif in neuroscience)? This inquiry is linked to deep mathematical questions in the fields of combinatorics and computational complexity as it can be phrased as counting the number of possible linear extensions of certain posets, or the number of inversions of certain permutations.
This is joint work with Jason Smith and Matteo Santoro.

(Part I) Neuromodulation is a mechanism that can alter the dynamics of a neural network without changing the underlying network’s connectivity structure. This can be modeled with simple recurrent neural networks. Using threshold-linear networks (TLNs), we investigate the extent to which neuromodulation alone can change a network’s dynamics, as well as how a network’s connectivity constrains the possible dynamics achievable via neuromodulation. We show that the bifurcation diagram over the neuromodulatory phase space can be described via a system of overlapping simplicial cones.
(Part II) Using single-photon calcium imaging, we record the activity of thousands of neurons in the optic tectum of zebrafish larvae for an hour in the absence of stimulation. We observe spontaneous activity of neuronal assemblies that are both functionally coordinated and localized in the optic tectum. We analyze the neural correlations using topological signatures that identify distinct correlation structures during spontaneous activity.
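The threshold-linear networks of Part I follow the standard dynamics dx/dt = -x + [Wx + b]_+. A minimal Euler-integration sketch of this model (an illustration of the dynamics, not the talk's bifurcation analysis):

```python
import numpy as np

def tln_trajectory(W, b, x0, dt=0.01, steps=1000):
    """Euler integration of a threshold-linear network:
    dx/dt = -x + max(0, W @ x + b).

    W is the connectivity matrix, b the external (neuromodulatory) input.
    """
    x = np.asarray(x0, dtype=float).copy()
    b = np.asarray(b, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * (-x + np.maximum(0.0, W @ x + b))
        traj.append(x.copy())
    return np.array(traj)

# A single neuron with no recurrence and input 1 relaxes to the fixed
# point x = 1; varying b (neuromodulation) moves this fixed point.
traj = tln_trajectory(np.array([[0.0]]), [1.0], [0.0])
```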

For a finite point set $P \subset \mathbb{R}^d$, let $M_{r,k}$ denote the set of points in $\mathbb{R}^d$ that are within distance $r$ of at least $k$ points in $P$. Allowing $r$ and $k$ to vary yields a 2-parameter family of spaces $M$, called the multicover bifiltration of $P$. It is a density-sensitive extension of the union-of-balls filtration commonly considered in TDA. It is robust to outliers in a strong sense, which motivates the problem of efficiently computing it (up to homotopy). A recent algorithm of Edelsbrunner and Osang computes a polyhedral model of $M$ called the rhomboid bifiltration. In this work, we introduce an approximation of $M$ (up to homotopy) which extends a version of the rhomboid tiling, and devise an efficient algorithm to compute it.
The talk is based on joint work with Uli Bauer, Tamal Dey and Michael Lesnick.
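The defining membership test for $M_{r,k}$ is simple to state in code (the bifiltration algorithms in the talk are of course far more involved):

```python
import numpy as np

def in_multicover(x, P, r, k):
    """Is x in M_{r,k}, i.e. within distance r of at least k points of P?"""
    P = np.asarray(P, dtype=float)
    dists = np.linalg.norm(P - np.asarray(x, dtype=float), axis=1)
    return int((dists <= r).sum()) >= k

P = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
print(in_multicover((0.2, 0.2), P, r=1.0, k=3))  # True: all three are close
print(in_multicover((0.2, 0.2), P, r=0.5, k=2))  # False: only (0,0) is
```

Increasing k shrinks $M_{r,k}$ toward denser regions of P, which is exactly the density sensitivity and outlier robustness described above.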

The Morse index associated to a critical point of a smooth function is a local quantity equal to the number of negative eigenvalues of the Hessian evaluated at that point. As second derivatives might be difficult to compute or unavailable in real-world contexts, one can use the fundamental results of Morse theory to bypass the need for second derivatives. Unfortunately, this involves computing the relative homology of pairs of sub-level sets, which is no longer a local quantity. In this talk, we propose a new algorithm that combines the best of both worlds by reducing the computation of the Morse index to a homology inference problem. The key ingredient is the theory of Gromoll-Meyer pairs, which facilitates this transition from global sublevel sets to local submanifolds with corners. Finally, we describe an upper bound on the density of sample points needed in order to recover the Morse index.
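When the Hessian is available, the local definition is a one-liner; the talk's contribution is recovering this number without second derivatives. For reference:

```python
import numpy as np

def morse_index(hessian):
    """Morse index: the number of negative eigenvalues of the Hessian at a
    critical point (assumed nondegenerate, i.e. no zero eigenvalues)."""
    eig = np.linalg.eigvalsh(np.asarray(hessian, dtype=float))
    return int((eig < 0).sum())

# Saddle of f(x, y) = x^2 - y^2 at the origin: Hessian diag(2, -2), index 1.
print(morse_index([[2.0, 0.0], [0.0, -2.0]]))  # 1
```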

Directed flag complexes are semisimplicial complexes which have recently been used as a tool to explore the global structure of directed graphs, most notably those arising from neuroscience. It is a folklore observation that random flag complexes often have the homotopy type of a wedge of spheres. One might therefore wonder whether this is also the case for graphs arising from nature. To explore this idea, we will take a look at the brain network of the C. Elegans nematode, an important model organism in biology, and show that the homotopy type of its directed flag complex can be computed in its entirety by an iterative approach using elementary techniques of algebraic topology such as homology, simplicial collapses and coning operations. Along the way, we will encounter some other interesting examples and properties of semisimplicial complexes.

Shape is foundational to biology. Observing and documenting shape has fueled biological understanding, as the shape of biomolecules, cells, tissues, and organisms arises from the effects of genetics, development, and the environment. The vision of Topological Data Analysis (TDA), that data is shape and shape is data, will be relevant as biology transitions into a data-driven era where meaningful interpretation of large datasets is a limiting factor. We focus first on quantifying the morphology of X-ray CT scans of barley spikes and seeds using topological descriptors based on the Euler Characteristic Transform. We then successfully train a support vector machine to distinguish and classify 28 different varieties of barley based solely on the 3D shape of their grains. This shape characterization will later allow us to link genotype with phenotype, furthering our understanding of how physical shape is genetically specified in DNA. There is also ongoing work on the shape of citrus fruits and walnuts.
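The Euler Characteristic Transform is built from Euler characteristics of sublevel sets in each direction. As a building block, the Euler characteristic of a single 2D binary image (a toy stand-in for the 3D CT scans used in the talk) can be computed from the vertex, edge, and face counts of its cubical complex:

```python
import numpy as np

def euler_characteristic(img):
    """Euler characteristic of a 2D binary image, viewing each foreground
    pixel as a closed unit square in a cubical complex: chi = V - E + F."""
    img = np.asarray(img, dtype=bool)
    padded = np.pad(img, 1)          # border of background pixels
    F = int(img.sum())               # faces: foreground pixels
    # An edge belongs to the complex if either pixel touching it does.
    E_h = int((padded[1:, :] | padded[:-1, :]).sum())
    E_v = int((padded[:, 1:] | padded[:, :-1]).sum())
    # A vertex belongs if any of the four pixels around it does.
    V = int((padded[1:, 1:] | padded[1:, :-1]
             | padded[:-1, 1:] | padded[:-1, :-1]).sum())
    return V - (E_h + E_v) + F

print(euler_characteristic([[1]]))                            # 1 (a disk)
print(euler_characteristic([[1, 1, 1],
                            [1, 0, 1],
                            [1, 1, 1]]))                       # 0 (an annulus)
```

Sweeping a direction over the sphere and recording such Euler characteristics of sublevel sets, direction by direction, is what yields the ECT descriptors mentioned above.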

I will argue, supported by an experiment, that theoretical biology is moving (very slowly) in a direction where predictions (as in theoretical physics) become possible. We start with a simple theoretical mechanical-genetic model of a protein. Using the interpretation of its spectral properties, one can formulate predictions on the possible effects of mutating the protein. The results suggest that only mutations at specific positions in the gene sequence can have a relevant effect on the function of the protein. These predictions were then checked in a delicate experiment with actual mutations in the protein guanylate kinase. A clear signal confirms the theoretical prediction.

The classical $q$-state Potts model of interacting spins in the integer lattice $\mathbb{Z}^d$ – of which the Ising model is the special case $q=2$ – is one of the most important models in statistical physics and probability. It is often studied via a coupling with the Fortuin-Kasteleyn random cluster model of dependent bond percolation. In our talk, we describe how to generalize these models to higher-dimensional cubical complexes by defining $q$-state Potts lattice gauge theory and the plaquette random cluster model. When $q$ is a prime integer, we show that the expectation of a Wilson loop variable in Potts lattice gauge theory (an analogue of its namesake in quantum field theory) equals the probability that the loop is null-homologous in the corresponding plaquette random cluster model. We also prove that the $i$-dimensional plaquette random cluster model in the $2i$-dimensional torus exhibits a sharp phase transition in the sense of homological percolation: that is, the emergence of giant cycles which are non-trivial homology classes in the ambient torus.

The Persistent Homology Transform (PHT) is a topological transform which can be used to quantify the difference between subsets of Euclidean space. To each unit vector, the transform assigns the persistence module of the height function over the shape with respect to that direction. The PHT is injective on piecewise-linear subsets of Euclidean space, and it has been demonstrably useful in diverse applications. One shortcoming is that shapes with different essential homology (i.e., Betti numbers) have an infinite distance between them. The theory of extended persistence for Morse functions on a manifold was developed by Cohen-Steiner, Edelsbrunner and Harer in 2009 to quantify the support of the essential homology classes. By using extended persistence modules of height functions over a shape, we obtain the extended persistent homology transform (XPHT), which provides a finite distance between shapes even when they have different Betti numbers. I will discuss how the XPHT of a manifold with boundary can be deduced from the XPHT of its boundary, which allows for efficient calculation. James Morgan has implemented the required algorithms for 2-dimensional binary images as a forthcoming R package. This is also joint work with Vanessa Robins.