Latent Variable Perceptron Algorithm for Structured Classification
Xu Sun, Takuya Matsuzaki, Daisuke Okanohara, Jun'ichi Tsujii

Abstract
We propose a perceptron-style algorithm for fast discriminative training of structured latent variable models. The method extends the perceptron algorithm to learning with latent dependencies, as an alternative to existing probabilistic latent variable models. It relies on Viterbi decoding over latent variables, combined with simple additive updates. Its training cost is significantly lower than that of probabilistic latent variable models, while it achieves comparable or even superior classification accuracy on our tasks. Experiments on natural language processing problems demonstrate that its results are among the best reported on the corresponding data sets.
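The abstract's description of the training procedure (decode the best latent structure, then apply a simple additive update) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the feature function phi, and the candidate sets are hypothetical, and the exhaustive enumeration over labels and latent assignments stands in for the Viterbi decoding described above.

```python
# Minimal sketch of a latent variable perceptron update (illustrative only;
# feature extraction, candidate sets, and names are hypothetical).
from collections import defaultdict
from itertools import product

def score(w, feats):
    """Dot product between weight vector and a sparse feature dict."""
    return sum(w[f] * v for f, v in feats.items())

def train_latent_perceptron(data, phi, labels, latents, epochs=5):
    """data: list of (x, y) pairs; phi(x, y, h) -> sparse feature dict.
    labels/latents: candidate output labels and latent assignments.
    The paper obtains the argmax over (y, h) by Viterbi decoding; here
    candidates are enumerated exhaustively for clarity."""
    w = defaultdict(float)
    for _ in range(epochs):
        for x, y in data:
            # Best (label, latent) pair under the current weights.
            y_hat, h_hat = max(product(labels, latents),
                               key=lambda yh: score(w, phi(x, *yh)))
            # Best latent assignment for the gold label.
            h_gold = max(latents, key=lambda h: score(w, phi(x, y, h)))
            if y_hat != y:
                # Simple additive update toward the gold structure.
                for f, v in phi(x, y, h_gold).items():
                    w[f] += v
                for f, v in phi(x, y_hat, h_hat).items():
                    w[f] -= v
    return w
```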