Parsing with Transformer
❏ Supervised Syntactic Structure Analysis using a Tree-Structured Transformer
Overview
BERT is a language model that captures dependencies between words through the Transformer's self-attention mechanism, and it is believed to encode latent syntactic information inside the model. However, few studies have exploited this information for syntactic parsing. Tree-Transformer, an unsupervised syntactic parser, has been proposed to infer the syntactic structure of an input sentence from the self-attention mechanism. At the same time, research on syntactic parsing in natural language processing has produced large amounts of annotated parse data. We therefore propose a hierarchical error back-propagation method that, in addition to Tree-Transformer's unsupervised objective, exploits the loss between the output of each Transformer layer and the gold-standard parses, and we develop a supervised method for syntactic structure analysis with the Transformer.
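To make the training scheme concrete, the sketch below shows one plausible reading of the per-layer supervised loss: each layer scores, for every pair of adjacent tokens, whether the pair belongs to the same constituent, and the losses of all layers against gold labels derived from a treebank are summed so the error signal reaches every layer directly. This is a minimal PyTorch sketch under our own assumptions, not the authors' implementation; TreeTransformerLayer, link_scorer, hierarchical_parse_loss, and the per-depth gold_links labels are all illustrative names.

```python
# Minimal sketch (assumed, not the authors' code) of hierarchical
# per-layer supervision for a Tree-Transformer-style parser.
import torch
import torch.nn as nn

class TreeTransformerLayer(nn.Module):
    """One encoder layer that also scores, for every adjacent token
    pair, how likely the pair belongs to the same constituent."""
    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.encoder = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=2 * d_model, batch_first=True)
        self.link_scorer = nn.Linear(2 * d_model, 1)  # hypothetical scorer

    def forward(self, x):
        h = self.encoder(x)
        # Concatenate each token with its right neighbour, score the link.
        pair = torch.cat([h[:, :-1, :], h[:, 1:, :]], dim=-1)
        link_logits = self.link_scorer(pair).squeeze(-1)  # (batch, seq-1)
        return h, link_logits

class SupervisedTreeTransformer(nn.Module):
    def __init__(self, vocab_size: int, d_model: int = 128, n_layers: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.layers = nn.ModuleList(
            TreeTransformerLayer(d_model) for _ in range(n_layers))
        self.mlm_head = nn.Linear(d_model, vocab_size)  # unsupervised objective

    def forward(self, token_ids):
        x = self.embed(token_ids)
        per_layer_logits = []
        for layer in self.layers:
            x, link_logits = layer(x)
            per_layer_logits.append(link_logits)
        return self.mlm_head(x), per_layer_logits

def hierarchical_parse_loss(per_layer_logits, gold_links):
    """Sum the loss between each layer's link scores and the gold labels,
    so the parsing error is back-propagated from every layer, not only
    the top one. gold_links[k][b, i] = 1 if tokens i and i+1 share a
    constituent at tree depth k (assumed to be derived from a treebank)."""
    bce = nn.BCEWithLogitsLoss()
    return sum(bce(logits, gold)
               for logits, gold in zip(per_layer_logits, gold_links))

# Toy usage: batch of 2 sentences, length 6, vocabulary of 100 tokens.
model = SupervisedTreeTransformer(vocab_size=100)
tokens = torch.randint(0, 100, (2, 6))
mlm_logits, layer_logits = model(tokens)
gold = [torch.randint(0, 2, (2, 5)).float() for _ in layer_logits]  # dummy labels
loss = hierarchical_parse_loss(layer_logits, gold)
loss.backward()
```

Summing the per-layer losses is what makes the back-propagation "hierarchical" in this reading: lower layers are supervised toward small constituents and higher layers toward larger ones, rather than receiving gradients only through the top of the stack.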
Slides
[slides: narita-1]

Momoka Narita
Momoka Narita, Tomoe Taniguchi, Daichi Mochihashi, and Ichiro Kobayashi, "Supervised Syntactic Structure Analysis using a Tree-Structured Transformer," The 36th Annual Conference of the Japanese Society for Artificial Intelligence (JSAI 2022), Kyoto International Conference Center, Kyoto, June 2022. (in Japanese)