A BERT-Based Approach for Extracting Prerequisite Relations among Wikipedia Concepts


Bibliographic Details
Main Authors: Youheng Bai, Yan Zhang, Kui Xiao, Yuanyuan Lou, Kai Sun
Format: Article
Language: English
Published: Hindawi Limited, 2021
Online Access: https://doaj.org/article/fecebe4516a943d3850ea298306679af
Summary: Concept prerequisite relation prediction is a common task in the field of knowledge discovery. Concept prerequisite relations can be used to rank learning resources and help learners plan their learning paths. As the largest Internet encyclopedia, Wikipedia is composed of many articles edited in multiple languages, and basic knowledge concepts in a wide variety of subjects can be found there. Although each field contains many knowledge concepts, the prerequisite relations between them are not explicit: when we browse the pages of an area on Wikipedia, we do not know which page to start from. In this paper, we propose a BERT-based Wikipedia concept prerequisite relation prediction model. First, we create two types of concept-pair features: one based on BERT sentence embeddings and the other on the attributes of Wikipedia articles. Then, we use these two types of concept-pair features to predict the prerequisite relation between two concepts. Experimental results show that our proposed method outperforms state-of-the-art methods on both English and Chinese datasets.
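The feature-construction step described in the summary can be sketched as follows. This is a minimal illustration, not the paper's exact design: the embedding dimension, the difference-vector feature, and the link-count attributes are all assumptions, and stand-in vectors replace real BERT sentence embeddings so the sketch stays self-contained.

```python
import numpy as np

EMB_DIM = 8  # stand-in for BERT's 768-dimensional sentence embeddings


def pair_features(emb_a, emb_b, attrs_a, attrs_b, a_links_to_b, b_links_to_a):
    """Concatenate the two feature types for a concept pair (A, B):
    BERT-style embedding features and Wikipedia-article attribute features."""
    return np.concatenate([
        emb_a, emb_b, emb_a - emb_b,                  # embedding-based features
        attrs_a, attrs_b,                             # per-article attributes
        [float(a_links_to_b), float(b_links_to_a)],   # cross-link attributes
    ])


rng = np.random.default_rng(0)
emb_a = rng.normal(size=EMB_DIM)   # placeholder for BERT embedding of concept A
emb_b = rng.normal(size=EMB_DIM)   # placeholder for BERT embedding of concept B
attrs_a = np.array([120.0, 35.0])  # hypothetical attributes, e.g. in/out-link counts
attrs_b = np.array([40.0, 12.0])

x = pair_features(emb_a, emb_b, attrs_a, attrs_b, True, False)
print(x.shape)  # 3 * EMB_DIM + 2 + 2 + 2 = 30 features
```

The resulting vector would then be fed to a binary classifier that decides whether concept A is a prerequisite of concept B.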