Department of Mathematics
Publications [#264718] of Guillermo Sapiro

Papers Published

  1. Sprechmann, P; Litman, R; Ben Yakar, T; Bronstein, A; Sapiro, G, Efficient supervised sparse analysis and synthesis operators, Advances in Neural Information Processing Systems (January, 2013), ISSN 1049-5258
    (last updated on 2019/06/25)

    In this paper, we propose a new computationally efficient framework for learning sparse models. We formulate a unified approach that contains as particular cases models promoting sparse synthesis- and analysis-type priors, and mixtures thereof. The supervised training of the proposed model is formulated as a bilevel optimization problem, in which the operators are optimized to achieve the best possible performance on a specific task, e.g., reconstruction or classification. By restricting the operators to be shift invariant, our approach can be thought of as a way of learning sparsity-promoting convolutional operators. Leveraging recent ideas on fast trainable regressors designed to approximate exact sparse codes, we propose a way of constructing feed-forward networks capable of approximating the learned models at a fraction of the computational cost of exact solvers. In the shift-invariant case, this leads to a principled way of constructing a form of task-specific convolutional networks. We illustrate the proposed models on several experiments in music analysis and image processing applications.
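    The "fast trainable regressors" mentioned in the abstract refer to the idea of unrolling an iterative sparse solver into a fixed-depth feed-forward network whose matrices and thresholds become learnable parameters. The sketch below is an illustration of that general idea, not the paper's actual model: it contrasts classical ISTA (an exact-style solver for the lasso problem) with an untrained unrolled encoder initialized from the same ISTA quantities. All names, dimensions, and parameter choices here are illustrative assumptions.

    ```python
    import numpy as np

    def soft_threshold(v, theta):
        # Elementwise shrinkage: the proximal operator of the l1 norm.
        return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

    def ista(x, D, lam, n_iter=200):
        # Classical ISTA for min_z 0.5 * ||x - D z||^2 + lam * ||z||_1.
        L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the quadratic term
        z = np.zeros(D.shape[1])
        for _ in range(n_iter):
            z = soft_threshold(z + D.T @ (x - D @ z) / L, lam / L)
        return z

    def unrolled_encoder(x, W, S, theta, n_layers=10):
        # A LISTA-style feed-forward approximation: a few unrolled iterations
        # with matrices (W, S) and threshold theta that would normally be
        # *learned*; here they are set from ISTA quantities for illustration.
        z = soft_threshold(W @ x, theta)
        for _ in range(n_layers - 1):
            z = soft_threshold(W @ x + S @ z, theta)
        return z

    rng = np.random.default_rng(0)
    D = rng.standard_normal((20, 50))
    D /= np.linalg.norm(D, axis=0)             # unit-norm dictionary atoms
    z_true = np.zeros(50)
    z_true[[3, 17, 41]] = [1.5, -2.0, 1.0]     # sparse ground-truth code
    x = D @ z_true

    # Exact-style solve vs. a fixed-depth (here, 10-layer) approximation.
    z_exact = ista(x, D, lam=0.1)
    L = np.linalg.norm(D, 2) ** 2
    W, S, theta = D.T / L, np.eye(50) - D.T @ D / L, 0.1 / L
    z_fast = unrolled_encoder(x, W, S, theta, n_layers=10)
    ```

    Training the unrolled encoder end-to-end on a task loss (rather than fixing `W`, `S`, `theta` as above) is what allows a small number of layers to approach the quality of many exact-solver iterations at much lower cost.
    
    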