Learning Tag Dependencies for Sequence Tagging
Yuan Zhang, Hongshen Chen, Yihong Zhao, Qun Liu, Dawei Yin
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 4581-4587.
https://doi.org/10.24963/ijcai.2018/637
Sequence tagging is the basis for multiple applications in natural language processing. Despite successes in learning long-term token dependencies with neural networks, tag dependencies have rarely been considered. Sequence tagging in fact involves complex dependencies and interactions among the input tokens and the output tags. We propose a novel multi-channel model that handles different ranges of token-tag dependencies and their interactions simultaneously. A tag LSTM is introduced to model the output tag dependencies and word-tag interactions, and three mechanisms are presented to efficiently incorporate token context representations and tag dependencies. Extensive experiments on part-of-speech tagging and named entity recognition tasks show that the proposed model outperforms the BiLSTM-CRF baseline by effectively incorporating the tag dependency feature.
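The core idea of feeding output tag dependencies back into the tagger can be illustrated with a minimal sketch. This is not the authors' implementation: the dimensions, the simple tanh recurrent cell (the paper uses LSTMs), the greedy decoding, and all parameter names here are assumptions for illustration only. Each step consumes both the current token embedding and the embedding of the previously predicted tag, so earlier tag decisions influence later ones.

```python
import numpy as np

# Illustrative sketch only (assumed dimensions and a plain tanh RNN cell,
# not the paper's tag LSTM): a greedy tagger whose recurrent state sees
# both the current token and the previously predicted tag.
rng = np.random.default_rng(0)

VOCAB, TAGS, EMB, HID = 10, 4, 8, 16

W_tok = rng.normal(0, 0.1, (VOCAB, EMB))   # token embedding table
W_tag = rng.normal(0, 0.1, (TAGS, EMB))    # tag embedding table
W_x = rng.normal(0, 0.1, (2 * EMB, HID))   # input-to-hidden weights
W_h = rng.normal(0, 0.1, (HID, HID))       # hidden-to-hidden weights
W_o = rng.normal(0, 0.1, (HID, TAGS))      # hidden-to-tag scores

def tag_sequence(tokens):
    """Greedy decode: step i sees token i plus the embedding of tag i-1."""
    h = np.zeros(HID)
    prev_tag = 0  # assumed start-of-sequence tag id
    tags = []
    for t in tokens:
        # Concatenate token context with the previous tag's embedding,
        # so the output tag dependency enters the recurrent state.
        x = np.concatenate([W_tok[t], W_tag[prev_tag]])
        h = np.tanh(x @ W_x + h @ W_h)
        prev_tag = int(np.argmax(h @ W_o))
        tags.append(prev_tag)
    return tags
```

In a trained model this channel would run alongside a token BiLSTM and the weights would be learned; here the random weights merely show the data flow from predicted tags back into subsequent predictions.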
Keywords:
Natural Language Processing: Phonology, Morphology, and word segmentation
Natural Language Processing: Tagging, chunking, and parsing
Natural Language Processing: Named Entities