Presentation 2020-02-17
[Tutorial Invited Lecture] BERT for dialogue systems
Koh Mitsuda
Abstract(in English) Transfer learning, a machine learning technique, has been attracting attention in natural language processing. A general-purpose model, called a pre-trained model, is trained on an enormous amount of plain text, and the model is then fine-tuned on downstream tasks. Using released pre-trained models, we can easily build advanced natural language applications. This report gives an overview of transfer learning techniques such as BERT (Bidirectional Encoder Representations from Transformers), developed by Google in 2018, in terms of architecture, usage, and performance.
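The pre-train/fine-tune workflow the abstract describes can be sketched minimally. Everything below is an illustrative stand-in, not the report's actual method: the frozen "encoder" is a fixed random projection playing the role of a pre-trained model such as BERT, the data are toy, and only a small task head is trained, which mirrors lightweight fine-tuning on a downstream task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "pre-trained" encoder (stand-in for a real model like BERT):
# maps 32-dim inputs to 8-dim features and is never updated.
W_enc = rng.normal(size=(32, 8))

def encode(x):
    return np.tanh(x @ W_enc)

# Toy downstream binary-classification data.
X = rng.normal(size=(64, 32))
y = (X[:, 0] > 0).astype(float)

# Trainable task head: logistic regression on the frozen features.
w = np.zeros(8)
b = 0.0

def loss_and_grad(w, b):
    h = encode(X)
    p = 1.0 / (1.0 + np.exp(-(h @ w + b)))
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    g = p - y
    return loss, h.T @ g / len(y), g.mean()

loss0, _, _ = loss_and_grad(w, b)
for _ in range(300):            # "fine-tune" only the head parameters
    _, gw, gb = loss_and_grad(w, b)
    w -= 0.3 * gw
    b -= 0.3 * gb
loss1, _, _ = loss_and_grad(w, b)
print(loss0, loss1)  # downstream loss drops while the encoder stays fixed
```

In a real setting the random projection would be replaced by a released pre-trained checkpoint, and fine-tuning may also update the encoder weights; the point here is only that the expensive general-purpose model is reused and the task-specific part is small.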
Keyword(in English) transfer learning / BERT / dialogue processing / natural language processing
Paper # NLC2019-43
Date of Issue 2020-02-09 (NLC)

Conference Information
Committee NLC
Conference Date 2020/2/16 (2 days)
Place (in English) Seikei University
Topics (in English) Integration of verbal and non-verbal information
Chair Takeshi Sakaki(Hottolink)
Vice Chair Mitsuo Yoshida(Toyohashi Univ. of Tech.) / Kazutaka Shimada(Kyushu Inst. of Tech.)
Secretary Mitsuo Yoshida(Ryukoku Univ.) / Kazutaka Shimada(NTT)
Assistant Takeshi Kobayakawa(NHK) / Hiroki Sakaji(Univ. of Tokyo)

Paper Information
Registration To Technical Committee on Natural Language Understanding and Models of Communication
Language JPN
Title (in English) [Tutorial Invited Lecture] BERT for dialogue systems
Sub Title (in English)
Keyword(1) transfer learning
Keyword(2) BERT
Keyword(3) dialogue processing
Keyword(4) natural language processing
1st Author's Name Koh Mitsuda
1st Author's Affiliation NTT
Date 2020-02-17
Paper # NLC2019-43
Volume (vol) vol.119
Number (no) NLC-415
Page pp.37-37 (NLC)
#Pages 1
Date of Issue 2020-02-09 (NLC)