Presentation 2017-10-12
[Tutorial Lecture] Families of GANs
Tomohiro Takahashi
Abstract(in Japanese) (See Japanese page)
Abstract(in English) Generative Adversarial Networks (GANs) have recently gained popularity due to their ability to synthesize images that closely resemble target images. As a result, they are used in various applications such as image domain transfer and generation of unseen images. However, training a GAN model poses several challenges, such as mode collapse and blurred outputs. In this tutorial, we address these issues and present solutions for them. The tutorial consists of (1) an introduction to how GANs work, (2) modified objectives for GANs, and (3) applications of GANs to image generation. As both the applications of and research on GANs are growing, this tutorial provides researchers with an introduction to GAN models and their common challenges.
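As background for item (1) of the tutorial outline, the standard minimax objective from the original GAN paper (Goodfellow et al., 2014) is sketched below; it is not part of this record and is included only to illustrate how the generator G and discriminator D are trained adversarially.

% Minimax objective of the original GAN (Goodfellow et al., 2014).
% D is trained to maximize V(D, G); G is trained to minimize it.
\min_{G} \max_{D} V(D, G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\log D(x)\big]
  + \mathbb{E}_{z \sim p_{z}(z)}\big[\log\big(1 - D(G(z))\big)\big]

The modified objectives mentioned in the abstract (for example, least-squares or Wasserstein losses) replace the log terms above to mitigate issues such as mode collapse and vanishing gradients.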
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Deep Learning / Generative Adversarial Nets
Paper # PRMU2017-80
Date of Issue 2017-10-05 (PRMU)

Conference Information
Committee PRMU
Conference Date 2017/10/12 (2 days)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair Shinichi Sato(NII)
Vice Chair Hironobu Fujiyoshi(Chubu Univ.) / Yoshihisa Ijiri(Omron)
Secretary Hironobu Fujiyoshi(AIST) / Yoshihisa Ijiri(NAIST)
Assistant Masato Ishii(NEC) / Yusuke Sugano(Osaka Univ.)

Paper Information
Registration To Technical Committee on Pattern Recognition and Media Understanding
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) [Tutorial Lecture] Families of GANs
Sub Title (in English)
Keyword(1) Deep Learning
Keyword(2) Generative Adversarial Nets
1st Author's Name Tomohiro Takahashi
1st Author's Affiliation ABEJA, Inc.(ABEJA)
Date 2017-10-12
Paper # PRMU2017-80
Volume (vol) vol.117
Number (no) PRMU-238
Page pp.95-100 (PRMU)
#Pages 6
Date of Issue 2017-10-05 (PRMU)