Randomized Low-Rank Approximation and PCA: Beyond Sketching, Cameron Musco

Published June 4, 2024
I will discuss recent work on randomized algorithms for low-rank approximation and principal component analysis (PCA). The talk will focus on efforts that move beyond the extremely fast, but relatively crude, approximations offered by random sketching algorithms.
In particular, we will see how advances in Johnson-Lindenstrauss projection methods have provided tools for improving the analysis of classic iterative SVD algorithms, including the block power method and block Krylov methods. The key insight is to view the iterative algorithms as denoising procedures for coarse sketching methods. I will discuss how this view can be used to analyze a simple block Krylov method, showing that the algorithm gives (1+ε) near-optimal PCA and low-rank approximation in just O(1/√ε) iterations. Despite their long history, this analysis is the first of a Krylov subspace method that does not depend on the matrix's spectral gaps.
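The block Krylov iteration described above can be sketched in a few lines of NumPy. This is a minimal illustration in the spirit of the method the talk analyzes, not code from the talk: the function name and parameters are my own, the starting block is a Gaussian random sketch, and the extra powers of AAᵀ act as the "denoising" step applied to that coarse sketch. (Production implementations typically re-orthonormalize between blocks for numerical stability; this sketch skips that.)

```python
import numpy as np

def block_krylov_lowrank(A, k, q):
    """Sketch of a randomized block Krylov method for rank-k approximation.

    Returns Z (m x k) with orthonormal columns spanning an approximate
    top-k left singular subspace of A, so that Z @ Z.T @ A is a
    near-optimal rank-k approximation.
    """
    m, n = A.shape
    # Coarse sketch: a random Gaussian starting block.
    Pi = np.random.randn(n, k)
    # Build the Krylov block [A Pi, (A A^T) A Pi, ..., (A A^T)^q A Pi];
    # the repeated multiplications by A A^T denoise the crude sketch.
    B = A @ Pi
    blocks = [B]
    for _ in range(q):
        B = A @ (A.T @ B)
        blocks.append(B)
    K = np.hstack(blocks)
    # Orthonormal basis for the Krylov subspace.
    Q, _ = np.linalg.qr(K)
    # Rayleigh-Ritz step: best rank-k approximation within span(Q).
    U, _, _ = np.linalg.svd(Q.T @ A, full_matrices=False)
    return Q @ U[:, :k]
```

The gap-free analysis discussed in the talk says that q = O(1/√ε) iterations suffice for a (1+ε) Frobenius-norm low-rank approximation guarantee, regardless of gaps between A's singular values.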
I will also survey open questions in the analysis of iterative methods, promising work on approximate PCA via stochastic optimization, fast sampling methods for low-rank kernel matrix approximation, and faster techniques for singular value decomposition targeted at specific downstream tasks, such as principal component regression.
