Online Latent Structure Training for Language Acquisition
Michael Connor, Cynthia Fisher, Dan Roth

Abstract
A fundamental step in sentence comprehension involves assigning semantic roles to sentence constituents. To accomplish this, the listener must parse the sentence, find constituents that are candidate arguments, and assign semantic roles to those constituents. Where do children learning their first languages begin in solving this problem? Even assuming children can derive a rough meaning for the sentence from the situation, how do they begin to map this meaning to the structure, and the structure to the form, of the sentence? In this paper we use feedback from a semantic role labeling (SRL) task to improve the intermediate syntactic representations that feed the SRL classifier. We accomplish this by training an intermediate classifier using signals derived from latent structure optimization techniques. By using a separate classifier to predict the internal structure, we gain the benefit of the knowledge embedded in that classifier's feature representation. This extra structure allows the system to begin to learn using weaker, more plausible semantic feedback.
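To make the training signal described above concrete, the following is a minimal sketch, not the authors' implementation, of one style of online latent-structure update: the model's weights are moved toward the highest-scoring latent structure that is consistent with the weak semantic feedback, and away from the current unconstrained prediction. All names here (candidate structures, feature dictionaries, the consistency test) are illustrative assumptions rather than the paper's actual representations.

```python
from collections import defaultdict
from typing import Callable, Dict, List, Tuple

Features = Dict[str, float]
Candidate = Tuple[object, Features]  # (latent structure, its feature vector)


def score(weights: Dict[str, float], feats: Features) -> float:
    """Linear score of one candidate structure."""
    return sum(weights[f] * v for f, v in feats.items())


def argmax(weights: Dict[str, float], candidates: List[Candidate]) -> Candidate:
    """Return the candidate with the highest model score."""
    return max(candidates, key=lambda c: score(weights, c[1]))


def latent_update(weights: Dict[str, float],
                  candidates: List[Candidate],
                  consistent: Callable[[object], bool],
                  lr: float = 1.0) -> None:
    """One online update driven by weak semantic feedback.

    `consistent` returns True for latent structures that agree with the
    feedback (e.g., the inferred sentence meaning); the best such structure
    serves as the training target for the intermediate classifier.
    """
    predicted, pred_feats = argmax(weights, candidates)
    good = [c for c in candidates if consistent(c[0])]
    if not good:
        return  # feedback rules out every candidate; skip this example
    _, target_feats = argmax(weights, good)
    if not consistent(predicted):
        # Promote the best feedback-consistent structure,
        # demote the current (inconsistent) prediction.
        for f, v in target_feats.items():
            weights[f] += lr * v
        for f, v in pred_feats.items():
            weights[f] -= lr * v


# Toy usage: two candidate structures for one sentence; feedback favors "A".
weights: Dict[str, float] = defaultdict(float)
cands: List[Candidate] = [("A", {"first-noun=agent": 1.0}),
                          ("B", {"first-noun=patient": 1.0})]
latent_update(weights, cands, consistent=lambda s: s == "A")
print(dict(weights))
```

The key design point this sketch illustrates is that the latent structure is never observed directly: only the weak consistency signal is, and the structure used for the update is inferred under the current model.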