Seminars & Colloquia
"Convexity, Sparsity, Nullity and all that in Machine Learning"
Thursday September 14, 2017 04:00 PM
Location: 322, Daniels Hall NCSU Main Campus
(Visitor parking instructions)
This talk is part of the Data Science series
Given the parsimonious degrees of freedom of high-dimensional data relative to its ambient dimension, we study the union-of-subspaces (UoS) model as a generalization of the linear subspace model. The UoS model preserves the simplicity of the linear subspace model while gaining the ability to address nonlinear data. We present a sufficient condition under which l1 minimization reveals the underlying UoS structure, and further propose a bi-sparsity model, together with an effective algorithm (RoSure), to recover data characterized by the UoS model from errors/corruptions.
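As a minimal illustration of the l1 minimization mentioned above (this is the classic basis-pursuit formulation, not the speaker's RoSure algorithm), the problem min ||x||_1 subject to Ax = b can be recast as a linear program by splitting x into positive and negative parts. The matrix dimensions and sparsity level below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 subject to Ax = b as a linear program.

    Write x = u - v with u, v >= 0, so that ||x||_1 = sum(u + v).
    """
    m, n = A.shape
    c = np.ones(2 * n)                 # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])          # encodes A(u - v) = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
    u, v = res.x[:n], res.x[n:]
    return u - v

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60))      # underdetermined system: 30 equations, 60 unknowns
x_true = np.zeros(60)
x_true[[3, 17, 42]] = [2.0, -1.5, 0.7] # 3-sparse ground truth
b = A @ x_true
x_hat = basis_pursuit(A, b)
```

With random Gaussian measurements and a sufficiently sparse signal, the l1 minimizer coincides with the sparse ground truth, which is the kind of recovery guarantee the sufficient conditions in the talk address.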
As an interesting twist on the related dictionary learning problem, we discuss the sparse null space (SNS) problem. Defined by a linear equality constraint, it first appeared in 1986 and has since inspired results such as sparse basis pursuit. We investigate its relation to the analysis dictionary learning problem and show that the SNS problem plays a central role and may naturally be exploited to solve dictionary learning problems.
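To make the SNS problem concrete, a minimal sketch (not from the talk): plant a sparse vector in the null space of a matrix, observe that a generic SVD-based null-space basis is dense, and recover the sparse vector with a common l1 heuristic that fixes one coordinate. The specific matrices and the coordinate-fixing trick are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import linprog
from scipy.linalg import null_space

# Plant a sparse null vector z alongside a dense null direction d,
# then build A whose rows span the orthogonal complement, so A @ z = 0.
z = np.array([1., -1., 0., 0., 0., 0.])
d = np.array([0.3, -0.2, 1.0, -1.5, 0.8, 1.2])
N = np.column_stack([z, d])
A = null_space(N.T).T                  # 4 x 6; null(A) = span(z, d)

# An SVD-based null-space basis mixes z and d and is generically dense.
# l1 heuristic for a sparse null vector: min ||x||_1 s.t. Ax = 0, x[0] = 1,
# solved as an LP with the split x = u - v, u, v >= 0.
n = A.shape[1]
c = np.ones(2 * n)
pin = np.zeros(2 * n)
pin[0], pin[n] = 1.0, -1.0             # (u - v)[0] = 1 fixes the scale
A_eq = np.vstack([np.hstack([A, -A]), pin])
b_eq = np.concatenate([np.zeros(A.shape[0]), [1.0]])
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (2 * n))
x = res.x[:n] - res.x[n:]              # recovers the planted sparse vector z
```

The heuristic succeeds here because adding any amount of the dense direction d strictly increases the l1 norm; characterizing when such l1 relaxations find the sparsest null vector is exactly the flavor of question the SNS problem raises.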
Substantiating examples are provided, and the application and performance of these approaches are demonstrated on a wide range of problems, such as face clustering and video segmentation.
Host: Joseph Hart, SIAM Student Chapter Data Science Lecture Series