|
|
All Technical Committee Conferences (Searched in: All Years)
|
|
Search Results: Conference Papers (Available on Advance Programs), sorted by date descending
|
Each entry lists: Committee / Date & Time / Place / Paper Title & Authors / Abstract / Paper #
Committee: EMM, BioX, ISEC, SITE, ICSS, HWS, IPSJ-CSEC, IPSJ-SPT
Date: 2023-07-24 16:20
Place: Hokkaido Jichiro Kaikan, Hokkaido
Title/Authors: A Random Ensemble Method with Encrypted Models for Improving Robustness against Adversarial Examples. Ryota Iijima, Miki Tanaka, Sayaka Shiota, Hitoshi Kiya (Tokyo Metro. Univ.)
Paper #: ISEC2023-27 SITE2023-21 BioX2023-30 HWS2023-27 ICSS2023-24 EMM2023-27, pp.86-90
Committee: EMM
Date: 2023-01-26 09:55
Place: Tohoku Univ., Miyagi (Primary: On-site, Secondary: Online)
Title/Authors: On the Transferability of Adversarial Examples between Isotropic Network and CNN models. Miki Tanaka (Tokyo Metropolitan Univ.), Isao Echizen (NII), Hitoshi Kiya (Tokyo Metropolitan Univ.)
Abstract: Deep neural networks are well known to be vulnerable to adversarial examples (AEs). In addition, AEs generated for a sou...
Paper #: EMM2022-62, pp.7-12
Committee: CAS, SIP, VLD, MSS
Date: 2022-06-16 14:40
Place: Hachinohe Institute of Technology, Aomori (Primary: On-site, Secondary: Online)
Title/Authors: Adversarial Robustness of Secret Key-Based Defenses against AutoAttack. Miki Tanaka, April Pyone MaungMaung (Tokyo Metro Univ.), Isao Echizen (NII), Hitoshi Kiya (Tokyo Metro Univ.)
Abstract: Deep neural network (DNN) models are well-known to easily misclassify prediction results by using input images with smal...
Paper #: CAS2022-7 VLD2022-7 SIP2022-38 MSS2022-7, pp.34-39
Committee: EMM
Date: 2022-03-07 15:55
Place: Online (Primary: Online, Secondary: On-site)
Title/Authors: [Poster Presentation] Video Forgery Detection Using a Robust Hashing Algorithm. Shoko Niwa, Miki Tanaka, Hitoshi Kiya (Tokyo Metro. Univ.)
Abstract: In this paper, we propose a method to detect the editing of video signals using a robust hashing algorithm. The assumed ...
Paper #: EMM2021-102, pp.58-63
Committee: EMM
Date: 2022-03-07 17:00
Place: Online (Primary: Online, Secondary: On-site)
Title/Authors: Extension of Robust Image Classification System with Adversarial Example Detectors. Miki Tanaka, Takayuki Osakabe, Hitoshi Kiya (Tokyo Metro. Univ.)
Abstract: In image classification with deep learning, there is a risk that an attacker can intentionally manipulate the prediction...
Paper #: EMM2021-105, pp.76-80
Committee: EMM, EA, ASJ-H
Date: 2021-11-15 09:00
Place: Online
Title/Authors: [Poster Presentation] A consideration of training datasets for universal detectors of CNN-generated images. Miki Tanaka, Hitoshi Kiya (Tokyo Metro. Univ.)
Abstract: Recent rapid advances in convolutional neural networks (CNNs) have made manipulating and generating images easy, so synt...
Paper #: EA2021-32 EMM2021-59, pp.31-36
Committee: EMM, IT
Date: 2021-05-20 14:35
Place: Online
Title/Authors: A universal detector of CNN-generated images based on properties of checkerboard artifacts. Miki Tanaka, Hitoshi Kiya (Tokyo Metro. Univ.)
Abstract: We propose a universal detector of images generated by using any CNNs to detect CNN-generated images. We consider prope...
Paper #: IT2021-3 EMM2021-3, pp.13-18
Committee: SIS, ITE-BCT
Date: 2020-10-01 13:20
Place: Online
Title/Authors: Robustness Evaluation of Detection Methods for Image Manipulation with GANs. Miki Tanaka, Hitoshi Kiya (Tokyo Metropolitan Univ.)
Abstract: Recent rapid advances in image manipulation tools and deep image synthesis techniques, such as Generative Adversarial Ne...
Paper #: SIS2020-14, pp.23-28
|
|
|
Copyright and reproduction :
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034)
|
|