Presentation | 2021-12-16 | Anomaly Detection using PatchCore with Self-attention module | Yuki Takena, Yoshiki Nota, Rinpei Mochizuki, Itaru Matsumura, Gosuke Ohashi |
---|---|
Abstract(in Japanese) | (See Japanese page) |
Abstract(in English) | In recent years, many deep learning models for the visual inspection of industrial products have achieved excellent accuracy in detecting local anomalies such as scratches and stains. However, these models remain weak at detecting anomalies in the co-occurrence relationships between parts. We therefore focus on the Transformer's Self-attention module, which can capture relationships between pixels, to enable anomaly detection of co-occurrence relationships. By introducing a Self-attention module into PatchCore, the state of the art on the MVTec AD anomaly detection benchmark, we propose a model that can identify anomalies in the co-occurrence relationships between parts and localize the parts whose relationships differ. |
Keyword(in Japanese) | (See Japanese page) |
Keyword(in English) | Deep learning / Anomaly detection / Unsupervised learning / PatchCore / Self-attention |
Paper # | PRMU2021-29 |
Date of Issue | 2021-12-09 (PRMU) |
Conference Information | |
Committee | PRMU |
---|---|
Conference Date | 2021/12/16 (2 days) |
Place (in Japanese) | (See Japanese page) |
Place (in English) | Online |
Topics (in Japanese) | (See Japanese page) |
Topics (in English) | |
Chair | Seiichi Uchida(Kyushu Univ.) |
Vice Chair | Masakazu Iwamura(Osaka Pref. Univ.) / Mitsuru Anpai(Denso IT Lab.) |
Secretary | Masakazu Iwamura(NTT) / Mitsuru Anpai(Tottori Univ.) |
Assistant | Kouta Yamaguchi(CyberAgent) / Yusuke Matsui(Univ. of Tokyo) |
Paper Information | |
Registration To | Technical Committee on Pattern Recognition and Media Understanding |
---|---|
Language | JPN |
Title (in Japanese) | (See Japanese page) |
Sub Title (in Japanese) | (See Japanese page) |
Title (in English) | Anomaly Detection using PatchCore with Self-attention module |
Sub Title (in English) | |
Keyword(1) | Deep learning |
Keyword(2) | Anomaly detection |
Keyword(3) | Unsupervised learning |
Keyword(4) | PatchCore |
Keyword(5) | Self-attention |
1st Author's Name | Yuki Takena |
1st Author's Affiliation | Shizuoka University(Shizuoka Univ.) |
2nd Author's Name | Yoshiki Nota |
2nd Author's Affiliation | Meidensha Corporation(Meidensha Corp.) |
3rd Author's Name | Rinpei Mochizuki |
3rd Author's Affiliation | Meidensha Corporation(Meidensha Corp.) |
4th Author's Name | Itaru Matsumura |
4th Author's Affiliation | Railway Technical Research Institute(Railway Technical Research Inst.) |
5th Author's Name | Gosuke Ohashi |
5th Author's Affiliation | Shizuoka University(Shizuoka Univ.) |
Date | 2021-12-16 |
Paper # | PRMU2021-29 |
Volume (vol) | vol.121 |
Number (no) | PRMU-304 |
Page | pp.31-36 (PRMU) |
#Pages | 6 |
Date of Issue | 2021-12-09 (PRMU) |