Math @ Duke

Publications [#287108] of Ingrid Daubechies
Papers Published
Rudin, C; Daubechies, I; Schapire, RE, On the dynamics of boosting, edited by Thrun, S; Saul, LK; Schölkopf, B,
Advances in Neural Information Processing Systems
(January, 2004),
pp. 1101-1108, MIT Press, ISBN 0262201526
(last updated on 2019/08/23)
Abstract: In order to understand AdaBoost's dynamics, especially its ability to maximize margins, we derive an associated simplified nonlinear iterated map and analyze its behavior in low-dimensional cases. We find stable cycles for these cases, which can be used to explicitly solve for AdaBoost's output. By considering AdaBoost as a dynamical system, we are able to prove Rätsch and Warmuth's conjecture that AdaBoost may fail to converge to a maximal-margin combined classifier when given a 'non-optimal' weak learning algorithm. AdaBoost is known to be a coordinate descent method, but other known algorithms that explicitly aim to maximize the margin (such as AdaBoost* and arc-gv) are not. We consider a differentiable function for which coordinate ascent will yield a maximum margin solution. We then make a simple approximation to derive a new boosting algorithm whose updates are slightly more aggressive than those of arc-gv.
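The iterated map studied in the abstract acts on AdaBoost's vector of example weights. As a point of reference, the following is a minimal sketch of the standard AdaBoost update (not the paper's simplified map or its new algorithm), using hypothetical 1-D toy data and threshold stumps chosen so that no single weak learner is perfect:

```python
import numpy as np

# Toy 1-D sample whose labels run +,+,-,-,-,+,+,+ : no single
# threshold stump classifies it perfectly, so boosting must combine several.
X = np.arange(1.0, 9.0)
y = np.array([1, 1, -1, -1, -1, 1, 1, 1])

# Weak learners: signed threshold stumps h(x) = s * sign(x > t).
stumps = [lambda X, t=t, s=s: s * np.where(X > t, 1, -1)
          for t in np.arange(0.5, 8.5) for s in (1, -1)]

def adaboost(X, y, stumps, T=200):
    """Plain AdaBoost: the example-weight vector d is the state of the
    nonlinear iterated map that the paper analyzes."""
    n = len(y)
    d = np.full(n, 1.0 / n)          # uniform initial weights
    alphas, chosen = [], []
    for _ in range(T):
        # weighted error of every weak learner under current weights d
        errs = np.array([d @ (h(X) != y) for h in stumps])
        j = int(errs.argmin())       # "optimal" weak learner selection
        eps = errs[j]
        if eps == 0 or eps >= 0.5:   # degenerate weak learner; stop
            break
        alpha = 0.5 * np.log((1 - eps) / eps)
        # AdaBoost's map on d: upweight mistakes, downweight correct points
        d *= np.exp(-alpha * y * stumps[j](X))
        d /= d.sum()
        alphas.append(alpha)
        chosen.append(j)
    return alphas, chosen

alphas, chosen = adaboost(X, y, stumps)
F = sum(a * stumps[j](X) for a, j in zip(alphas, chosen))
margins = y * F / sum(alphas)        # normalized margins in [-1, 1]
print(margins.min())                 # the minimum margin AdaBoost achieves
```

Whether this minimum margin is actually maximal over convex combinations of the stumps is exactly the question the paper answers in the negative for non-optimal weak learner choices.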


dept@math.duke.edu
ph: 919.660.2800
fax: 919.660.2821
 
Mathematics Department
Duke University, Box 90320
Durham, NC 27708-0320

