Publications of Robert Calderbank

Conference articles (published)
  1. Nokleby, M; Beirami, A; Calderbank, R, A rate-distortion framework for supervised learning, IEEE International Workshop on Machine Learning for Signal Processing, MLSP, vol. 2015-November (November, 2015), IEEE.

    Abstract:
    An information-theoretic framework is presented for bounding the number of samples needed for supervised learning in a parametric Bayesian setting. This framework is inspired by an analogy with rate-distortion theory, which characterizes tradeoffs in the lossy compression of random sources. In a parametric Bayesian environment, the maximum a posteriori (MAP) classifier can be viewed as a random function of the model parameters. Labeled training data can be viewed as a finite-rate encoding of that source, and the excess loss due to using the learned classifier instead of the MAP classifier can be viewed as distortion. A strict bound on the loss, measured in terms of the expected total variation, is derived, providing a minimum number of training samples needed to drive the expected total variation to within a specified tolerance. The tightness of this bound is demonstrated on the classification of Gaussians, for which closed-form expressions for the bound can be derived.
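
    The following is a minimal Monte Carlo sketch of the analogy described above; it is not the paper's construction. It assumes a symmetric two-class model with class-conditional densities N(-mu, 1) and N(+mu, 1), equal class priors, and mu drawn from a known prior, so the MAP classifier thresholds at zero while a plug-in "learned" classifier thresholds at the midpoint of the empirical class means. The function expected_tv and all parameter choices are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def expected_tv(n_train, mu_prior_std=1.0, n_trials=2000, n_test=500):
            """Monte Carlo estimate of the expected total variation between
            the MAP decision rule and a plug-in rule learned from n_train
            labeled samples per class (illustrative setup, not the paper's)."""
            tv = 0.0
            for _ in range(n_trials):
                mu = abs(rng.normal(0.0, mu_prior_std))   # draw the model parameter
                x_neg = rng.normal(-mu, 1.0, n_train)     # labeled training data,
                x_pos = rng.normal(+mu, 1.0, n_train)     # n_train samples per class
                # plug-in threshold from empirical means; the MAP threshold is 0
                t_hat = 0.5 * (x_neg.mean() + x_pos.mean())
                # test points drawn from the marginal distribution over x
                labels = rng.integers(0, 2, n_test)
                x = rng.normal(np.where(labels == 1, mu, -mu), 1.0)
                # for two deterministic rules, the total variation between the
                # induced label distributions at x is their disagreement indicator
                tv += np.mean((x > 0.0) != (x > t_hat))
            return tv / n_trials

        for n in [2, 8, 32, 128]:
            print(f"n = {n:4d}  expected TV ≈ {expected_tv(n):.4f}")

    As n_train grows, the estimated expected total variation decays toward zero, which is the qualitative behavior that the paper's sample-complexity bound quantifies.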
