Previous research has generated summaries of a document's contents based on the various topics in the text. As the sentence summarization method, we adopt the encoder of PreSumm (Liu et al., 2019), which produces abstractive summaries with a Transformer, and apply the general-purpose language model GPT-2 as the decoder. We also employ Plug and Play Language Models (PPLM) to generate summaries that reflect a given topic. The objective of the current research is to control text generation based on Systemic Functional Linguistics (SFL), a theory of language that considers not only grammar but also the context of communication. In addition to topic control, we control the rhetorical structure of the text and the difficulty of the vocabulary according to the reader's proficiency.
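PPLM steers a language model toward an attribute (here, a topic) by perturbing its hidden states with gradients from an attribute model, rather than by retraining the LM. The following is a minimal toy sketch of that bag-of-words steering idea; the linear `lm_head`, the topic-word ids, and the step size are illustrative assumptions, not the actual PreSumm/GPT-2 pipeline described here.

```python
# Toy sketch of PPLM-style bag-of-words steering (illustrative only).
# A hidden state h is nudged by gradient descent so that a hypothetical
# set of topic words becomes more probable under the LM output head.
import torch

torch.manual_seed(0)
vocab_size, hidden_dim = 50, 16
lm_head = torch.nn.Linear(hidden_dim, vocab_size)  # stand-in for GPT-2's output layer
topic_ids = torch.tensor([3, 7, 11])               # hypothetical topic-word ids

h = torch.zeros(hidden_dim, requires_grad=True)    # current hidden state
init_mass = torch.softmax(lm_head(h), dim=-1)[topic_ids].sum().item()

for _ in range(30):
    log_probs = torch.log_softmax(lm_head(h), dim=-1)
    loss = -log_probs[topic_ids].sum()             # maximize topic-word log-likelihood
    loss.backward()
    with torch.no_grad():
        h -= 0.5 * h.grad                          # perturb the hidden state
        h.grad.zero_()

steered_mass = torch.softmax(lm_head(h), dim=-1)[topic_ids].sum().item()
print(init_mass, steered_mass)                     # topic probability mass before/after
```

In the full PPLM setting, the same perturbation is applied to the key-value history of GPT-2 at each decoding step, so the steered states bias the next-token distribution toward the topic without modifying any model weights.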