Data-to-Text
❏ A Study on Topic-Based Abstractive Summarization Generation
Overview
In our previous research, we generated summaries of a document conditioned on the various topics it contains. For summarization, we adopt the encoder of PreSumm (Liu et al., 2019), a Transformer-based abstractive summarization model, and use the general-purpose language model GPT-2 as the decoder. We also employ Plug and Play Language Models (PPLM) to generate summaries that reflect a given topic. The objective of our current research is to control text generation based on Systemic Functional Linguistics (SFL), a theory of language that considers not only grammar but also the context of communication. In addition to topic control, we aim to control the rhetorical structure of the text and the difficulty of its vocabulary according to the reader's proficiency.
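As a rough illustration of the architecture, the sketch below (not the authors' code) wires a pretrained BERT encoder to GPT-2 as a decoder using the Hugging Face transformers library; here, bert-base-uncased is merely a stand-in for PreSumm's fine-tuned document encoder, and the newly added cross-attention layers would still need to be trained on a summarization corpus before the model produces sensible output.

```python
# Minimal sketch: couple a pretrained encoder with GPT-2 as the decoder,
# in the spirit of a PreSumm encoder + GPT-2 decoder. Assumes the
# Hugging Face transformers library; "bert-base-uncased" stands in for
# PreSumm's fine-tuned document encoder.
from transformers import EncoderDecoderModel, BertTokenizerFast, GPT2TokenizerFast

enc_tok = BertTokenizerFast.from_pretrained("bert-base-uncased")
dec_tok = GPT2TokenizerFast.from_pretrained("gpt2")
dec_tok.pad_token = dec_tok.eos_token

# GPT-2 gains randomly initialized cross-attention layers here, so the
# combined model must be fine-tuned on a summarization corpus before use.
model = EncoderDecoderModel.from_encoder_decoder_pretrained("bert-base-uncased", "gpt2")
model.config.decoder_start_token_id = dec_tok.bos_token_id
model.config.pad_token_id = dec_tok.pad_token_id

article = "Researchers proposed a new model for topic-based summarization ..."
inputs = enc_tok(article, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_new_tokens=60, num_beams=4)
print(dec_tok.decode(summary_ids[0], skip_special_tokens=True))
```

The second sketch condenses PPLM's idea into a single gradient step: perturb the decoder's final hidden state so that a topic bag of words becomes more probable, then re-score the next-token distribution. The real PPLM instead perturbs the cached key-value history over several steps per token and adds a KL penalty to preserve fluency; the topic word list below is purely hypothetical.

```python
# Toy PPLM-style topic step (an illustrative simplification, not the
# original algorithm): one gradient step on the final hidden state toward
# a topic bag of words, then re-score the next-token logits.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tok = GPT2TokenizerFast.from_pretrained("gpt2")
lm = GPT2LMHeadModel.from_pretrained("gpt2")
lm.eval()
lm.requires_grad_(False)  # only the hidden state needs a gradient

# Hypothetical bag of words for a "science" topic.
bow_ids = torch.tensor(
    [ids[0] for ids in tok([" science", " research", " experiment"],
                           add_special_tokens=False)["input_ids"]]
)

def topic_biased_logits(prompt, step_size=0.03):
    input_ids = tok(prompt, return_tensors="pt").input_ids
    hidden = lm.transformer(input_ids).last_hidden_state   # (1, T, d)
    h_last = hidden[:, -1, :].detach().requires_grad_(True)
    log_probs = torch.log_softmax(lm.lm_head(h_last), dim=-1)
    # Loss: negative log of the total probability mass on the topic words.
    loss = -torch.logsumexp(log_probs[0, bow_ids], dim=0)
    loss.backward()
    h_pert = h_last - step_size * h_last.grad               # nudge toward topic
    return lm.lm_head(h_pert).detach()

logits = topic_biased_logits("The summary of the article is about")
print(tok.decode(logits.argmax(-1)))
```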
Slides
yokogawa-01 – yokogawa-40 (40 slides)
Yuka Yokogawa
Yuka Yokogawa and Ichiro Kobayashi, "A Study on Topic-Based Abstractive Summarization Generation," 36th Annual Conference of the Japanese Society for Artificial Intelligence (JSAI 2022), Kyoto International Conference Center, Kyoto, June 2022. (in Japanese)