Presentation 2019-09-28
A comparison of Japanese pretrained BERT models
Naoki Shibayama, Rui Cao, Jing Bai, Wen Ma, Hiroyuki Shinnou,
Abstract(in Japanese) (See Japanese page)
Abstract(in English) BERT is a useful pre-training method for neural language models. Pre-trained English models were used in the original BERT paper. Currently, there are three Japanese pre-trained models, which tokenize input texts using SentencePiece, Juman++ with BPE, or MeCab with NEologd. In this paper, we compare these three models on a sentiment analysis task.
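The three models differ mainly in how they split Japanese text into subword units. As a minimal sketch of the greedy longest-match subword segmentation that WordPiece-style tokenizers perform (the vocabulary here is hypothetical and purely illustrative, not taken from any of the three compared models):

```python
# Toy greedy longest-match subword tokenizer (WordPiece-style).
# Illustrates the subword segmentation idea shared by SentencePiece- and
# BPE-based tokenizers; the vocabulary below is hypothetical, not drawn
# from any of the three Japanese pre-trained BERT models.

def subword_tokenize(word, vocab, unk="[UNK]"):
    """Greedily split `word` into the longest pieces found in `vocab`."""
    pieces, start = [], 0
    while start < len(word):
        end, piece = len(word), None
        while start < end:
            candidate = word[start:end]
            if start > 0:
                candidate = "##" + candidate  # continuation-piece marker
            if candidate in vocab:
                piece = candidate
                break
            end -= 1
        if piece is None:        # no piece matched: emit unknown token
            return [unk]
        pieces.append(piece)
        start = end
    return pieces

vocab = {"東京", "##都"}
print(subword_tokenize("東京都", vocab))  # ['東京', '##都']
```

The real tokenizers differ in how the vocabulary is learned (unigram language model for SentencePiece, merge rules for BPE) and in whether a morphological analyzer such as Juman++ or MeCab pre-segments the text first, but the lookup step at inference time follows this longest-match pattern.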
Keyword(in Japanese) (See Japanese page)
Keyword(in English) machine learning / BERT / pre-trained model / natural language processing
Paper # NLC2019-24
Date of Issue 2019-09-20 (NLC)

Conference Information
Committee NLC / IPSJ-DC
Conference Date 2019/9/27 (2 days)
Place (in Japanese) (See Japanese page)
Place (in English) Future Corporation
Topics (in Japanese) (See Japanese page)
Topics (in English) The Thirteenth Text Analytics Symposium
Chair Takeshi Sakaki(Hottolink) / Ryoji Akimoto(Toppan Printing)
Vice Chair Mitsuo Yoshida(Toyohashi Univ. of Tech.) / Kazutaka Shimada(Kyushu Inst. of Tech.)
Secretary Mitsuo Yoshida(Ryukoku Univ.) / Kazutaka Shimada(NTT) / (Future Univ. Hakodate)
Assistant Takeshi Kobayakawa(NHK) / Hiroki Sakaji(Univ. of Tokyo)

Paper Information
Registration To Technical Committee on Natural Language Understanding and Models of Communication / Special Interest Group on Document Communication
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) A comparison of Japanese pretrained BERT models
Sub Title (in English)
Keyword(1) machine learning
Keyword(2) BERT
Keyword(3) pre-trained model
Keyword(4) natural language processing
1st Author's Name Naoki Shibayama
1st Author's Affiliation Ibaraki University(Ibaraki Univ.)
2nd Author's Name Rui Cao
2nd Author's Affiliation Ibaraki University(Ibaraki Univ.)
3rd Author's Name Jing Bai
3rd Author's Affiliation Ibaraki University(Ibaraki Univ.)
4th Author's Name Wen Ma
4th Author's Affiliation Ibaraki University(Ibaraki Univ.)
5th Author's Name Hiroyuki Shinnou
5th Author's Affiliation Ibaraki University(Ibaraki Univ.)
Date 2019-09-28
Paper # NLC2019-24
Volume (vol) vol.119
Number (no) NLC-212
Page pp.89-92 (NLC)
#Pages 4
Date of Issue 2019-09-20 (NLC)