A clinical specific BERT developed using a huge Japanese clinical text corpus
Generalized language models that are pre-trained with a large corpus have achieved great performance on natural language tasks. While many pre-trained transformers for English are published, few models are available for Japanese text, especially in clinical medicine. In this work, we demonstrate the...
Saved in:
Main authors: Yoshimasa Kawazoe, Daisaku Shibata, Emiko Shinohara, Eiji Aramaki, Kazuhiko Ohe
Format: article
Language: English
Published: Public Library of Science (PLoS), 2021
Online access: https://doaj.org/article/d91d1c1105f045dc8aaa84db58182b7f
Similar Items
- A clinical specific BERT developed using a huge Japanese clinical text corpus.
  by: Yoshimasa Kawazoe, et al.
  Published: (2021)
- Bert-Enhanced Text Graph Neural Network for Classification
  by: Yiping Yang, et al.
  Published: (2021)
- Weibo Text Sentiment Analysis Based on BERT and Deep Learning
  by: Hongchan Li, et al.
  Published: (2021)
- BERT based clinical knowledge extraction for biomedical knowledge graph construction and analysis
  by: Ayoub Harnoune, et al.
  Published: (2021)
- D‐BERT: Incorporating dependency‐based attention into BERT for relation extraction
  by: Yuan Huang, et al.
  Published: (2021)