Committee |
Date Time |
Place |
Paper Title / Authors |
Abstract |
Paper # |
PRMU, IBISML, IPSJ-CVIM |
2023-03-02 16:40 |
Hokkaido |
Future University Hakodate (Primary: On-site, Secondary: Online) |
Does the class imbalance in the pre-training always adversely affect transfer learning performance? Shojun Nakayama (Toshiba) PRMU2022-89 IBISML2022-96 |
In this work, we studied how class imbalance in pre-training affects the accuracy of transfer learning. We div... [more] |
PRMU2022-89 IBISML2022-96 pp.151-156 |
NLC |
2022-09-13 14:10 |
Kanagawa |
Keio Univ. Yagami Campus (Primary: On-site, Secondary: Online) |
A Proposal of an Ensemble Learning Method by Using Pre-Trained BERT Models for Authorship Attribution Taisei Kanda, Liu Yejia, Jin Mingzhe (Doshisha Univ.) NLC2022-6 |
Although bidirectional encoder representations from transformers (BERT) is a highly versatile model, there have been rep... [more] |
NLC2022-6 pp.9-12 |
RCS |
2022-06-17 10:25 |
Okinawa |
University of the Ryukyus, Senbaru Campus and online (Primary: On-site, Secondary: Online) |
A Study on Support Vector Classification-Aided Regression with Ensemble Learning for GNSS Positioning Shugo Maruyama, Shinsuke Ibi (Doshisha Univ.), Takumi Takahashi (Osaka Univ.), Hisato Iwai (Doshisha Univ.) RCS2022-60 |
In Global Navigation Satellite System (GNSS) positioning, the receiver position is estimated by solving the nonlinear si... [more] |
RCS2022-60 pp.212-217 |
PRMU |
2021-12-16 14:55 |
Online |
Online |
Fully automatic scoring of handwritten descriptive answers in Japanese language tests Hung Tuan Nguyen, Cuong Tuan Nguyen (TUAT), Haruki Oka (UTokyo), Tsunenori Ishioka (The National Center for University Entrance Examinations), Masaki Nakagawa (TUAT) PRMU2021-32 |
This paper presents an experiment of automatically scoring handwritten descriptive answers in the trial tests for the ne... [more] |
PRMU2021-32 pp.45-50 |
HCS |
2020-11-01 14:30 |
Online |
Online |
Influences of types of lessons on children's music ensembles Satoshi Kawase (YMF/Kobe Gakuin Univ.), Masahiro Okano (JSPS/Ritsumeikan Univ.), Yoshitaka Kumasaka (YMF), Chika Nagisa (YMF/Tokyo Coll. of Mus.) HCS2020-48 |
The aim of this study was to investigate associations between types of musical training for children and coordination in... [more] |
HCS2020-48 pp.32-35 |
MSS, NLP (Joint) |
2020-03-09 15:45 |
Aichi |
(Cancelled but technical report was issued) |
Two convolutional neural networks trained through Co-teaching perform a complementary role Toshikazu Samura, Katsumi Tadamura (Yamaguchi Univ.) NLP2019-123 |
Deep learning technology needs large labeled datasets free of noisy labels to improve its performance. However, the costs for ... [more] |
NLP2019-123 pp.61-64 |
PRMU, IBISML, IPSJ-CVIM |
2018-09-20 09:40 |
Fukuoka |
|
Arrangement of Complementary Weak Learners using Weights Assigned to Data in Parallel Ensemble Learning Shota Utsumi, Keisuke Kameyama (Univ. of Tsukuba) PRMU2018-37 IBISML2018-14 |
The accuracy of each weak learner and acquisition of complementary functions among weak learners are important for impro... [more] |
PRMU2018-37 IBISML2018-14 pp.9-15 |
IA |
2017-11-15 13:50 |
Overseas |
KMITL, Bangkok, Thailand |
Machine Learning Approach for Phishing Detection in SDN Networking Yu-Hung Chen, Jiun-Yu Yang, Po-Chun Hou, Jiann-Liang Chen (National Taiwan University of Science & Technology) IA2017-30 |
People have become increasingly dependent on information technology since the emergence of the Internet. Therefore, many... [more] |
IA2017-30 pp.1-6 |
IBISML |
2017-11-09 13:00 |
Tokyo |
Univ. of Tokyo |
Application of Transfer Learning to Smallscale Data and Its Evaluation Using Open Datasets Arika Fukushima, Toru Yano, Shuuichiro Imahara, Hideyuki Aisu (Toshiba) IBISML2017-41 |
A large sample size of training data is essential for high prediction performance in machine learning. However, in... [more] |
IBISML2017-41 pp.47-53 |
IBISML |
2016-11-16 15:00 |
Kyoto |
Kyoto Univ. |
[Poster Presentation]
Optimization Method of Deep Ensemble Learning using Hierarchical Clustering Natsuki Koda, Sumio Watanabe (Tokyo Tech) IBISML2016-70 |
The method of making predictions by combining many different learning machines generated from the same training d... [more] |
IBISML2016-70 pp.171-176 |
SP |
2016-08-25 13:35 |
Kyoto |
ACCMS, Kyoto Univ. |
Diversity-driven Semi-supervised Ensemble DNN Acoustic Model Training Sheng Li (Kyoto Univ.), Xugang Lu (NICT), Shinsuke Sakai, Tatsuya Kawahara (Kyoto Univ.) SP2016-40 |
We focus on effective training of DNN (Deep Neural Network) acoustic models for Chinese spoken lectures with only limited l... [more] |
SP2016-40 pp.71-76 |
IBISML |
2015-11-27 14:00 |
Ibaraki |
Epochal Tsukuba |
[Poster Presentation]
Recursive Ensemble Land Cover Classification for Few Training Data and Many Class Yu Oya, Katsutoshi Kanamori, Hayato Ohwada (TUS) IBISML2015-77 |
Many global and environmental applications require land use and land cover information. A land cover classification is o... [more] |
IBISML2015-77 pp.183-188 |
IT |
2014-07-17 10:10 |
Hyogo |
Kobe University |
Distance Metric Learning with Low Computational Complexity based on Ensemble of Low-dimensional Matrixes Hiroshi Saito, Fumihiro Yamazaki, Kenta Mikawa, Masayuki Goto (Waseda Univ.) IT2014-12 |
Distance metric learning is an approach for acquiring a good metric for automatic data classification. I... [more] |
IT2014-12 pp.7-12 |
IT |
2014-07-18 10:20 |
Hyogo |
Kobe University |
A Prediction Method based on Weighted Ensemble of Decision Tree on Alternating Decision Forests Shotaro Misawa, Naohiro Fujiwara, Kenta Mikawa, Masayuki Goto (Waseda Univ.) IT2014-29 |
In this study, we focus on Alternating Decision Forests (ADF). ADF introduces weights that represent the de... [more] |
IT2014-29 pp.101-106 |
IBISML |
2013-11-12 15:45 |
Tokyo |
Tokyo Institute of Technology, Kuramae-Kaikan |
[Poster Presentation]
A boosting method considering tolerance against noisy data by weighting each data according to the distance between incidents Shinjiro Fujita, Sayaka Kamei, Satoshi Fujita (Hiroshima Univ.) IBISML2013-38 |
AdaBoost is one of the major ensemble learning methods. It is easy to implement and has high classification accuracy. ... [more] |
IBISML2013-38 pp.15-21 |
AI |
2010-06-25 14:15 |
Tokyo |
|
Refining Noisy Training Examples Based on Ensemble Learning for Intelligent Domain-Specific WEB Search Hiroki Hirabayashi, Koji Iwanuma, Yoshitaka Yamamoto, Hidetomo Nabeshima (Univ. of Yamanashi) AI2010-5 |
The Keyword Spices method, proposed by Oyama et al., is a kind of query-expansion technology that adds pre-computed additional... [more] |
AI2010-5 pp.25-30 |
PRMU |
2009-09-01 09:00 |
Miyagi |
Tohoku Univ. |
Implementation and Experimental Evaluation of Ensemble Minimum Classification Error Training Shin'ichi Taniguchi (Doshisha Univ.), Hideyuki Watanabe (NICT), Shigeru Katagiri, Kohta Yamada (Doshisha Univ.), Atsushi Nakamura, Erik McDermott, Shinji Watanabe (NTT), Naho Nishijima, Miho Ohsaki (Doshisha Univ.) PRMU2009-67 |
Recently, we developed a novel Ensemble-based Minimum Classification Error training method (EMCE) by combining the advan... [more] |
PRMU2009-67 pp.103-108 |
PRMU |
2009-03-13 15:20 |
Miyagi |
Tohoku Institute of Technology |
A Proposal of Ensemble-based Minimum Classification Error Training Hideyuki Watanabe (NICT/ATR), Shigeru Katagiri, Kohta Yamada (Doshisha Univ.), Atsushi Nakamura, Erik McDermott, Shinji Watanabe (NTT), Shin'ichi Taniguchi, Naho Nishijima, Miho Ohsaki (Doshisha Univ.) PRMU2008-250 |
We propose an ensemble-based minimum classification error (MCE) training method to combine multiple weak classifiers in ... [more] |
PRMU2008-250 pp.71-76 |
NC |
2006-01-24 09:00 |
Hokkaido |
Hokkaido Univ. |
Adaptive Classifiers-Ensemble System for Concept-Drifting Environments Kyosuke Nishida, Koichiro Yamauchi, Takashi Omori (Hokkaido Univ.) |
Most machine learning algorithms assume stationary environments, require a large number of training examples in advance,... [more] |
NC2005-98 pp.1-6 |
NLP |
2005-11-18 13:25 |
Fukuoka |
Kyushu Institute of Technology |
Ensemble Self-Generating Neural Networks for Chaotic Time Series Prediction Masaki Nakahara, Hirotaka Inoue (KNCT) |
In this paper, we present the performance characteristics of self-generating neural networks (SGNNs) applied to time series ... [more] |
NLP2005-63 pp.7-12 |