A clinical specific BERT developed using a huge Japanese clinical text corpus.
Generalized language models pre-trained on a large corpus have achieved great performance on natural language tasks. While many pre-trained transformers for English have been published, few models are available for Japanese text, especially in clinical medicine. In this work, we demonstrate the...
Saved in:
Main Authors: Yoshimasa Kawazoe, Daisaku Shibata, Emiko Shinohara, Eiji Aramaki, Kazuhiko Ohe
Format: article
Language: EN
Published: Public Library of Science (PLoS), 2021
Subjects:
Online Access: https://doaj.org/article/2e0cce2d87e84b0e9269723053006110
Similar Items

- A clinical specific BERT developed using a huge Japanese clinical text corpus
  by: Yoshimasa Kawazoe, et al.
  Published: (2021)
- Bert-Enhanced Text Graph Neural Network for Classification
  by: Yiping Yang, et al.
  Published: (2021)
- Weibo Text Sentiment Analysis Based on BERT and Deep Learning
  by: Hongchan Li, et al.
  Published: (2021)
- BERT based clinical knowledge extraction for biomedical knowledge graph construction and analysis
  by: Ayoub Harnoune, et al.
  Published: (2021)
- D‐BERT: Incorporating dependency‐based attention into BERT for relation extraction
  by: Yuan Huang, et al.
  Published: (2021)