Committee 
Date Time 
Place 
Paper Title / Authors 
Abstract 
Paper # 
IBISML 
2022-03-08 10:55
Online 
Online 
Real log canonical threshold of reduced rank regression when inputs are on a low dimensional hyperplane Joe Hirose, Sumio Watanabe (Tokyo Tech) 
[more] 

NC, MBE (Joint) 
2020-03-05 13:00
Tokyo 
University of Electro-Communications (Cancelled, but the technical report was issued)
Bayesian learning curve for the case when the optimal distribution is not unique Shuya Nagayasu, Sumio Watanabe (Tokyo Tech) NC2019-94
Bayesian inference is a widely used statistical method. Asymptotic behaviors of generalization loss and free energy in B... [more] 
NC2019-94 pp.107-112
IBISML 
2020-01-09 13:00
Tokyo 
ISM 
Asymptotic Behavior of Bayesian Generalization Error in Multinomial Mixtures Takumi Watanabe, Sumio Watanabe (Tokyo Tech) IBISML2019-18
Multinomial mixtures are widely used in the information engineering field. However, they are not subject to the conventiona... [more]
IBISML2019-18 pp.1-8
IBISML 
2020-01-09 13:25
Tokyo 
ISM 
Real Log Canonical Threshold of Three Layered Neural Network with Swish Activation Function Raiki Tanaka, Sumio Watanabe (Tokyo Tech) IBISML2019-19
In neural network learning, it is known that the selection of the activation function affects generalization performance. Althou... [more]
IBISML2019-19 pp.9-15
MBE, NC (Joint) 
2018-03-14 10:25
Tokyo 
Kikai-Shinko-Kaikan Bldg.
Experimental Analysis of Real Log Canonical Threshold in Stochastic Matrix Factorization using Hamiltonian Monte Carlo Method Naoki Hayashi, Sumio Watanabe (Tokyo Tech) NC2017-89
For the real log canonical threshold (RLCT) that gives the Bayesian generalization error of stochastic matrix factorizat... [more] 
NC2017-89 pp.127-131
IBISML 
2018-03-05 13:00
Fukuoka 
Nishijin Plaza, Kyushu University 
Real Log Canonical Threshold and Bayesian Generalization Error of Mixture of Poisson Distributions Kenichiro Sato, Sumio Watanabe (Tokyo Inst. of Tech.) IBISML2017-90
[more] 
IBISML2017-90 pp.1-6
IBISML 
2017-11-09 13:00
Tokyo 
Univ. of Tokyo 
[Poster Presentation]
Real Log Canonical Threshold of Stochastic Matrix Factorization and its Application to Bayesian Learning Naoki Hayashi, Sumio Watanabe (Tokyo Tech) IBISML2017-38
In stochastic matrix factorization (SMF), we deal with problems in which we predict an observed stochastic matrix as a produ... [more]
IBISML2017-38 pp.23-30
MBE, NC (Joint) 
2017-03-13 10:00
Tokyo 
Kikai-Shinko-Kaikan Bldg.
Experimental Analysis of Real Log Canonical Threshold in Nonnegative Matrix Factorization Naoki Hayashi, Sumio Watanabe (Tokyo Tech) NC2016-78
For the real log canonical threshold (RLCT) that gives the Bayesian generalization error of nonnegative matrix factor... [more]
NC2016-78 pp.85-90
IBISML 
2016-11-16 15:00
Kyoto 
Kyoto Univ. 
Estimation of Vehicular Headway-Velocity Characteristics in Mixture of Piecewise Linear Model using Variational Bayes Method Fumito Nakamura, Sumio Watanabe (Tokyo Tech) IBISML2016-65
[more] 
IBISML2016-65 pp.137-142
IBISML 
2016-11-16 15:00
Kyoto 
Kyoto Univ. 
[Poster Presentation]
Optimization Method of Deep Ensemble Learning using Hierarchical Clustering Natsuki Koda, Sumio Watanabe (Tokyo Tech) IBISML2016-70
The method of making predictions by combining many different learning machines generated from the same training d... [more]
IBISML2016-70 pp.171-176
IBISML 
2016-11-17 14:00
Kyoto 
Kyoto Univ. 
[Poster Presentation]
A real log canonical threshold of nonnegative matrix factorization and its application to Bayesian learning Naoki Hayashi, Sumio Watanabe (Tokyo Tech) IBISML2016-76
In nonnegative matrix factorization (NMF), we deal with problems in which we predict a data matrix as a product of two nonne... [more]
IBISML2016-76 pp.215-220
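The abstract above describes NMF as predicting a data matrix as the product of two nonnegative matrices. A minimal, self-contained sketch of that decomposition (Lee-Seung multiplicative updates; an illustration only, not the Bayesian analysis in the report, and the function name and toy data are ours):

```python
import numpy as np

def nmf(X, r, iters=200, eps=1e-9, seed=0):
    """Approximate a nonnegative matrix X (n x m) by W @ H,
    with W: n x r and H: r x m, via Lee-Seung multiplicative
    updates for the squared Euclidean error ||X - WH||^2."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, r)) + eps
    H = rng.random((r, m)) + eps
    for _ in range(iters):
        # each update keeps W and H nonnegative and does not increase the error
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

X = np.random.default_rng(1).random((8, 6))  # toy nonnegative data
W, H = nmf(X, r=3)
err = np.linalg.norm(X - W @ H)
```

The RLCT studied in these reports controls how the Bayesian generalization error of such a factorization scales with sample size; the sketch above shows only the point-estimation side of the model.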
MBE, NC (Joint) 
2016-03-23 13:10
Tokyo 
Tamagawa University 
Evaluation Method of Free Energy Calculation by Replica Monte Carlo Method using Non-Gaussian and Solvable Models Shoji Sugai, Sumio Watanabe (Tokyo Tech) NC2015-83
[more] 
NC2015-83 pp.77-82
IBISML 
2016-03-17 15:45
Tokyo 
Institute of Statistical Mathematics 
Learning and Generalization in Neural Networks using Hamiltonian Monte Carlo Method Fumito Nakamura, Sumio Watanabe (Tokyo Tech) IBISML2015-97
[more] 
IBISML2015-97 pp.25-29
IBISML 
2015-11-26 15:00
Ibaraki 
Epochal Tsukuba 
[Poster Presentation]
Classification of Training Results in Nonlinear Multi-Layer Principal Component Analysis using Sparse Representation Natsuki Koda, Sumio Watanabe (Tokyo Tech) IBISML2015-55
The bottleneck neural network, or Nonlinear Multi-Layer Principal Component Analysis (NMPCA), is used to extract the low di... [more]
IBISML2015-55 pp.19-24
NC, MBE 
2015-03-17 13:50
Tokyo 
Tamagawa University 
Optimization of LASSO Learning using WAIC and Its Application to City Data Analysis Dai Miyazaki, Sumio Watanabe (Tokyo Tech) MBE2014-175 NC2014-126
LASSO (Least Absolute Shrinkage and Selection Operator) is a method adding a penalty term consisting of absolute values o... [more]
MBE2014-175 NC2014-126 pp.331-336
IBISML 
2014-11-18 15:00
Aichi 
Nagoya Univ. 
[Poster Presentation]
Optimization Method of LASSO Hyperparameter using WAIC Dai Miyazaki, Sumio Watanabe (Tokyo Tech) IBISML2014-63
LASSO (Least Absolute Shrinkage and Selection Operator) was proposed as a regression method using a penalty term made of... [more] 
IBISML2014-63 pp.213-218
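The abstract describes the LASSO absolute-value penalty. A minimal coordinate-descent sketch of the penalized regression itself (the report's WAIC-based choice of the hyperparameter is not reproduced here; the function name, toy data, and lam value are ours):

```python
import numpy as np

def lasso_cd(X, y, lam, iters=100):
    """Coordinate descent for (1/2n)||y - Xw||^2 + lam * sum(|w_j|)."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(iters):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]          # residual excluding feature j
            rho = X[:, j] @ r / n
            # soft-thresholding: the absolute-value penalty zeroes small coefficients
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 5))
w_true = np.array([2.0, 0.0, 0.0, -3.0, 0.0])       # sparse ground truth
y = X @ w_true + 0.05 * rng.standard_normal(60)
w_hat = lasso_cd(X, y, lam=0.1)                     # recovers the sparse pattern
```

Larger lam drives more coefficients exactly to zero; choosing it well is exactly the hyperparameter-optimization problem the report addresses with WAIC.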
SP, IPSJ-SLP (Joint)
2014-07-25 13:20
Iwate 
Hotel Hanamaki 
[Invited Talk]
Evaluation Criteria of Statistical Learning when Gaussian Approximation cannot be Applied to Likelihood Function Sumio Watanabe (Tokyo Inst. of Tech.) SP2014-68
Conventional statistical asymptotic theory was established based on the assumption that the likelihood function can be a... [more] 
SP2014-68 pp.31-36
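WAIC, which appears in several reports in this listing, is one evaluation criterion that remains valid when the likelihood cannot be approximated by a Gaussian. A sketch of computing it from posterior draws, assuming a (draws x data-points) matrix `loglik` of pointwise log-likelihoods (this layout and the names are our own, not tied to any report's experiment):

```python
import numpy as np

def waic(loglik):
    """WAIC per data point from loglik[s, i] = log p(x_i | w_s),
    where w_s are draws from the posterior distribution."""
    S = loglik.shape[0]
    # log pointwise predictive density, computed stably in log space
    lppd = np.logaddexp.reduce(loglik, axis=0) - np.log(S)
    v = loglik.var(axis=0)          # functional variance per data point
    return -lppd.mean() + v.mean()  # training loss + variance penalty

# degenerate check: identical draws give a zero variance penalty,
# so WAIC reduces to the plain mean negative log-likelihood
flat = np.full((10, 5), -1.0)
w0 = waic(flat)
```

WBIC, validated by exchange Monte Carlo in a report below, instead averages the negative log-likelihood under a posterior tempered at inverse temperature 1/log n.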
NC, MBE (Joint) 
2014-03-18 13:40
Tokyo 
Tamagawa University 
Computational validation of the information criterion WBIC by the exchange Monte Carlo method Satoru Tokuda, Kenji Nagata (Univ. of Tokyo), Sumio Watanabe (Tokyo Inst. of Tech.), Masato Okada (Univ. of Tokyo/RIKEN) NC2013-109
In models with hierarchy, such as artificial neural networks and mixture models, asymptotic normality, which AIC and BIC... [more]
NC2013-109 pp.121-126
NC, MBE (Joint) 
2013-12-21 13:30
Gifu 
Gifu University 
Difference of Enough Numbers for General and Regular Asymptotic Theories in Statistical Learning Sumio Watanabe (Tokyo Tech) NC2013-61
There are two asymptotic theories in statistical learning. One is the regular theory which assumes that the likelihood f... [more] 
NC2013-61 pp.47-52
IBISML 
2013-11-12 15:45
Tokyo 
Tokyo Institute of Technology, Kuramae-Kaikan
[Poster Presentation]
Model Selection of Layered Neural Networks using WBIC based on Steepest Descent and MCMC Method Yusuke Tamai, Sumio Watanabe (Tokyo Inst. of Tech.) IBISML2013-36
Many learning machines such as neural networks, normal mixtures, and hidden Markov Models contain hierarchical layers, h... [more] 
IBISML2013-36 pp.1-6