Multi-Hop Question Generation Using Hierarchical Encoding-Decoding and Context Switch Mechanism

Neural auto-regressive sequence-to-sequence models have been dominant in text generation tasks, especially the question generation task. However, neural generation models suffer from global and local semantic drift problems. Hence, we propose a hierarchical encoding–decoding mechanism that aims at encoding the rich structural information of the input passages and reducing the variance in the decoding phase. In the encoder, we hierarchically encode the input passages according to their structure at four granularity levels: word, chunk, sentence, and document. In the decoding phase, we progressively select the context vector at each time step, moving from the document-level representations down to the word-level representations. We also propose a context switch mechanism that enables the decoder to reuse the context vector from the previous step when generating the current word; this improves the stability of the text generation process when generating a run of consecutive words. Additionally, we inject syntactic parsing knowledge to enrich the word representations. Experimental results show that our proposed model substantially improves performance and outperforms previous baselines in both automatic and human evaluation. We also present a deep and comprehensive analysis of the generated questions by type.
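As a rough, non-authoritative sketch of the two mechanisms described in the abstract, the toy code below builds word/chunk/sentence/document representations by pooling (plain averaging stands in for the paper's learned hierarchical encoder) and gates between the previous and current context vectors at decode time. All names, the averaging, and the gate parametrisation are illustrative assumptions, not the authors' implementation.

```python
import math

def mean_pool(vectors):
    """Element-wise average of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def hierarchical_encode(doc):
    """Encode a passage at four granularities: word, chunk, sentence, document.

    `doc` is a list of sentences; each sentence is a list of chunks;
    each chunk is a list of word vectors. Each higher level is built by
    pooling the level below it.
    """
    word_reps = [w for sent in doc for chunk in sent for w in chunk]
    chunk_reps = [mean_pool(chunk) for sent in doc for chunk in sent]
    sent_reps = [mean_pool([mean_pool(chunk) for chunk in sent]) for sent in doc]
    doc_rep = mean_pool(sent_reps)
    return word_reps, chunk_reps, sent_reps, doc_rep

def context_switch(c_prev, c_curr, decoder_state, w_gate):
    """Gate between the last step's context vector and the current one.

    g = sigmoid(w_gate . decoder_state); context = g*c_prev + (1-g)*c_curr.
    A gate near 1 reuses the previous context, keeping the context stable
    across a run of consecutive generated words.
    """
    g = 1.0 / (1.0 + math.exp(-sum(w * s for w, s in zip(w_gate, decoder_state))))
    return [g * p + (1.0 - g) * c for p, c in zip(c_prev, c_curr)], g

# Toy passage: 2 sentences x 2 chunks x 3 two-dimensional word vectors.
doc = [[[[float(s), float(c)] for _ in range(3)] for c in range(2)]
       for s in range(2)]
words, chunks, sents, doc_vec = hierarchical_encode(doc)
print(len(words), len(chunks), len(sents), doc_vec)   # 12 4 2 [0.5, 0.5]

# With a zero decoder state the gate is neutral (0.5): equal mix of contexts.
ctx, g = context_switch([1.0, 1.0], [0.0, 0.0], [0.0, 0.0], [0.5, 0.5])
print(g, ctx)                                         # 0.5 [0.5, 0.5]
```

In the paper the gate would be learned jointly with the decoder; here a fixed dot-product gate is used only to show the interpolation between consecutive context vectors.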

Bibliographic Details
Main Authors: Tianbo Ji, Chenyang Lyu, Zhichao Cao, Peng Cheng
Format: article
Language: EN
Published: MDPI AG, 2021
Subjects:
multi-hop question generation
hierarchical encoding-decoding
syntactic knowledge
Science (Q)
Astrophysics (QB460-466)
Physics (QC1-999)
Online Access: https://doaj.org/article/9ea3193bfe014b1ba4773bde0b5c6f0d
DOI: 10.3390/e23111449
ISSN: 1099-4300
Article URL: https://www.mdpi.com/1099-4300/23/11/1449
Journal: Entropy, Vol. 23, Iss. 11, p. 1449 (2021)
Publication Date: 2021-10-01