Projections, Learning, and Sparsity for Efficient Data Processing

Formal Metadata

Title
Projections, Learning, and Sparsity for Efficient Data Processing
Title of Series
Part Number
9
Number of Parts
10
Author
License
CC Attribution 3.0 Unported:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers
Publisher
Release Date
2016
Language
English

Content Metadata

Subject Area
Genre
Abstract
The talk will discuss recent generalizations of sparse recovery guarantees and compressive sensing to the context of machine learning. Assuming a "low-dimensional model" on the probability distribution of the data, we will see that in certain scenarios it is indeed (empirically) possible to compress a large data collection into a reduced representation, whose size is driven by the complexity of the learning task, while preserving the essential information necessary to process it. Two case studies will be given: compressive clustering and compressive Gaussian Mixture Model estimation, with an illustration on large-scale model-based speaker verification. Time permitting, some recent results on compressive spectral clustering will also be discussed.
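
To make the idea of a reduced representation concrete, below is a minimal sketch (not from the talk) of the kind of compression such methods typically rely on: averaging random Fourier features of the data to obtain a fixed-size summary of the empirical distribution. All names and parameter values here are illustrative assumptions; the actual fitting stage (e.g. recovering cluster centers or GMM parameters from the sketch) is not shown.

import numpy as np

def compute_sketch(X, Omega):
    """Average complex exponentials of random projections of the data.

    X     : (n, d) data matrix
    Omega : (d, m) matrix of random frequencies
    Returns a length-m complex vector whose size is independent of n.
    """
    return np.exp(1j * X @ Omega).mean(axis=0)

# Illustrative usage: summarize 100,000 points in 10 dimensions with m = 500 numbers.
rng = np.random.default_rng(0)
X = rng.normal(size=(100_000, 10))              # placeholder data collection
Omega = rng.normal(scale=1.0, size=(10, 500))   # randomly drawn frequencies
z = compute_sketch(X, Omega)                    # fixed-size sketch of the whole collection

# A clustering or Gaussian mixture model would then be estimated by matching its
# (analytically computable) sketch to z, without revisiting the original data.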