A clinical specific BERT developed using a huge Japanese clinical text corpus

Generalized language models pre-trained on a large corpus have achieved strong performance on natural language tasks. While many pre-trained transformers for English have been published, few models are available for Japanese text, especially in clinical medicine. In this work, we demonstrate the...

Bibliographic Details
Main Authors: Yoshimasa Kawazoe, Daisaku Shibata, Emiko Shinohara, Eiji Aramaki, Kazuhiko Ohe
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2021
Subjects:
R
Q
Online Access: https://doaj.org/article/d91d1c1105f045dc8aaa84db58182b7f