Subgradient Optimization for
Max Margin Structured Classification
Fast structured prediction:
Training associative Markov networks by repeated inference.
Nathan Ratliff, J. Andrew Bagnell, Martin Zinkevich, Daniel Munoz, Nicholas Vandapel, Martial Hebert
Maximum margin planning (MMP) arose as a reduction of imitation learning to maximum margin structured classification (MMSC) (Taskar et al., 2004). The LEARCH class of functional gradient optimization routines, which we developed to make learning efficient for MMP, therefore applies to the entire class of MMSC problems. The memory requirements of previous approaches made MMSC infeasible for large-scale problems; our algorithms provide efficient stochastic and online gradient-based alternatives with practical nonlinear generalizations. Because these algorithms scale only linearly in the size of the problem, they can be applied to a wider range of problems than previous techniques.
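The core computational pattern behind these gradient-based alternatives can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the actual LEARCH or MMP implementation: it uses plain multiclass classification as the simplest structured problem, a hypothetical joint feature map `feat`, and a 0/1 margin loss, whereas real MMP problems perform loss-augmented inference over a planning graph. It shows the generic subgradient step for a regularized structured hinge loss: run loss-augmented inference, then step along the difference of feature vectors.

```python
import numpy as np

# Hypothetical joint feature map for a toy multiclass problem: one copy
# of the input vector per class slot. Real MMSC problems use features of
# full structured outputs (e.g. paths or labelings) instead.
def feat(x, y, n_classes):
    f = np.zeros(n_classes * len(x))
    f[y * len(x):(y + 1) * len(x)] = x
    return f

def subgradient_step(w, x, y_true, n_classes, lam=0.01, eta=0.1):
    # Loss-augmented inference: argmax over outputs of score + margin loss.
    # Here the loss is 0/1; structured settings use a structured loss.
    scores = [w @ feat(x, y, n_classes) + (y != y_true)
              for y in range(n_classes)]
    y_hat = int(np.argmax(scores))
    # Subgradient of the regularized structured hinge objective:
    # lam * w + feat(x, y_hat) - feat(x, y_true).
    g = lam * w + feat(x, y_hat, n_classes) - feat(x, y_true, n_classes)
    return w - eta * g

# Usage: a few online passes over separable toy data.
rng = np.random.default_rng(0)
data = [(rng.normal(size=3) + 2 * np.eye(3)[y], y)
        for y in range(3) for _ in range(10)]
w = np.zeros(9)
for _ in range(50):
    for x, y in data:
        w = subgradient_step(w, x, y, 3)
```

Because each step touches only one training example, memory and per-iteration cost grow linearly with problem size, which is the property the paragraph above attributes to the stochastic and online variants.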
The above image depicts the resulting generalization performance of the LEARCH class of optimization procedures for maximum margin structured classification applied to classifying LADAR point clouds into three classes: facade, bush, and tree (Ratliff et al., 2007c). Munoz et al. independently performed a similar study using LEARCH for LADAR classification (Munoz et al., 2008). They collected a large-scale data set across the Carnegie Mellon University campus and devised a novel associative Markov network model that facilitated the detection of linear structures such as power lines. The image below depicts the learned classifications across a large hold-out region of the LADAR data set. LADAR point cloud classification was first studied by Anguelov et al. using commercial QP solvers to solve the MMSC optimization problem (Anguelov et al., 2005).