Topological Data Analysis is a relatively new paradigm in data science that models datasets as point clouds sampled from a shape embedded in a Euclidean space. The inference problem is to estimate the essential topological features of the underlying shape from a point cloud sampled from it. This is done through a technique called persistent homology, a mathematical formalism based on algebraic topology. A necessary step in the persistent homology pipeline is the reduction of a so-called boundary matrix, a process reminiscent of Gaussian elimination. In this talk, I present a number of structural dependencies in boundary matrices and use them to design a novel parallel algorithm for their reduction, which is especially well suited to GPUs. Simulations on synthetic examples show that the reduction can be carried out in a small fraction of the number of iterations needed by traditional methods. For example, numerical experiments show that for a boundary matrix with 10^4 columns, the reduction was completed to within 1% in about ten iterations, as opposed to approximately eight thousand iterations for traditional methods.
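
For context, the sketch below illustrates the standard serial column reduction of a boundary matrix over Z/2 (the baseline the talk compares against), not the parallel algorithm presented in the talk; the dense NumPy representation and function names are illustrative assumptions.

```python
import numpy as np

def low(R, j):
    """Row index of the lowest nonzero entry of column j, or -1 if the column is zero."""
    rows = np.flatnonzero(R[:, j])
    return rows[-1] if rows.size else -1

def reduce_boundary_matrix(D):
    """Standard left-to-right column reduction of a binary boundary matrix over Z/2."""
    R = D.copy() % 2
    pivot = {}  # maps low(j') -> j' for previously reduced nonzero columns
    for j in range(R.shape[1]):
        l = low(R, j)
        # Add earlier columns sharing the same lowest entry until the pivot is unique.
        while l != -1 and l in pivot:
            R[:, j] = (R[:, j] + R[:, pivot[l]]) % 2
            l = low(R, j)
        if l != -1:
            pivot[l] = j
    return R
```

Because each column may need to wait for all columns to its left, this baseline processes the matrix essentially one column at a time; the structural dependencies discussed in the talk are what allow many such column operations to be performed concurrently on a GPU.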