Department of Mathematics




Publications [#287108] of Ingrid Daubechies

Papers Published

  1. Rudin, C; Daubechies, I; Schapire, RE, On the dynamics of boosting, edited by Thrun, S; Saul, LK; Schölkopf, B, Advances in Neural Information Processing Systems (January, 2004), pp. 1101-1108, MIT Press, ISBN 0262201526
    (last updated on 2019/08/23)

    In order to understand AdaBoost's dynamics, especially its ability to maximize margins, we derive an associated simplified nonlinear iterated map and analyze its behavior in low-dimensional cases. We find stable cycles for these cases, which can explicitly be used to solve for AdaBoost's output. By considering AdaBoost as a dynamical system, we are able to prove Rätsch and Warmuth's conjecture that AdaBoost may fail to converge to a maximal-margin combined classifier when given a 'nonoptimal' weak learning algorithm. AdaBoost is known to be a coordinate descent method, but other known algorithms that explicitly aim to maximize the margin (such as AdaBoost* and arc-gv) are not. We consider a differentiable function for which coordinate ascent will yield a maximum margin solution. We then make a simple approximation to derive a new boosting algorithm whose updates are slightly more aggressive than those of arc-gv.
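    The dynamical-system view described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual analysis: the 3×3 matrix `M` (with `M[i][j] = y_i h_j(x_i)`) is a hypothetical toy dataset, and the loop is the standard AdaBoost weight update, written so that the weight vector `d` evolves as an iterated map.

    ```python
    import math

    # Hypothetical toy setup: M[i][j] = y_i * h_j(x_i) in {-1, +1} records
    # whether weak classifier j is correct on training example i.
    M = [
        [+1, +1, -1],
        [+1, -1, +1],
        [-1, +1, +1],
    ]
    m = len(M)          # number of training examples
    n = len(M[0])       # number of weak classifiers

    d = [1.0 / m] * m   # AdaBoost's weight vector over the examples
    lam = [0.0] * n     # accumulated coefficients of the weak classifiers

    for t in range(20):
        # Edge r_j = sum_i d_i M[i][j]; the "optimal" weak learner picks
        # the classifier with the largest edge at each round.
        edges = [sum(d[i] * M[i][j] for i in range(m)) for j in range(n)]
        j = max(range(n), key=lambda k: edges[k])
        r = edges[j]
        if abs(r) >= 1.0:   # a perfect weak classifier makes alpha diverge
            break
        alpha = 0.5 * math.log((1 + r) / (1 - r))
        lam[j] += alpha
        # Iterated-map step: reweight the examples and renormalize,
        # so d_{t+1} is a nonlinear function of d_t alone.
        d = [d[i] * math.exp(-alpha * M[i][j]) for i in range(m)]
        z = sum(d)
        d = [w / z for w in d]

    # Normalized margin of each example under the combined classifier.
    margins = [sum(M[i][j] * lam[j] for j in range(n)) / sum(lam)
               for i in range(m)]
    ```

    Iterating this map on small symmetric cases like the one above is the kind of low-dimensional setting in which the paper finds stable cycles in AdaBoost's weight dynamics.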