Paper Abstract and Keywords |
Presentation |
2016-09-06 11:15
Hyper-parameter Optimization with Derivative-free Method Yoshihiko Ozaki, Masaki Yano (Univ. Tsukuba/AIST), Masaki Onishi (AIST), Takahito Kuno (Univ. Tsukuba) PRMU2016-84 IBISML2016-39 |
Abstract |
In machine learning, appropriate hyper-parameter tuning is essential for classifiers to perform at their best.
However, the relationship between a classifier's performance and its hyper-parameters is in general an unknown function.
Consequently, features of the objective function, such as its gradient, are not available for optimizing the hyper-parameters.
In this paper, we apply derivative-free optimization methods, which require no gradient information of the objective function, to the hyper-parameter tuning of classifiers, and we evaluate their performance through computational experiments. |
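As a minimal illustrative sketch (not the authors' experimental code), the following Python snippet shows how a derivative-free method named in the abstract, Nelder-Mead, can tune an SVM's hyper-parameters by treating cross-validated accuracy as a black-box objective. The dataset, search space, and starting point are assumptions chosen only for illustration.

import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

def objective(log_params):
    # Hyper-parameters are searched in log10 space; the objective is the
    # negated cross-validated accuracy, for which no gradient is available.
    C, gamma = 10.0 ** log_params
    score = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()
    return -score

# Nelder-Mead needs only function values, never derivatives.
result = minimize(objective, x0=np.array([0.0, -3.0]), method="Nelder-Mead",
                  options={"maxiter": 50, "xatol": 1e-2, "fatol": 1e-3})
print("best log10(C), log10(gamma):", result.x, "CV accuracy:", -result.fun)

A coordinate-search variant, the other method named in the keywords, would instead perturb one hyper-parameter at a time along coordinate directions, accepting moves that improve the same black-box objective.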
Keyword |
Derivative-Free Optimization / Machine Learning / Hyper-parameter Optimization / Coordinate-Search Method / Nelder-Mead Method / Support Vector Machine / Convolutional Neural Network |
Reference Info. |
IEICE Tech. Rep., vol. 116, no. 208, PRMU2016-84, pp. 227-232, Sept. 2016. |
Paper # |
PRMU2016-84 |
Date of Issue |
2016-08-29 (PRMU, IBISML) |
ISSN |
Print edition: ISSN 0913-5685 Online edition: ISSN 2432-6380 |
Copyright and reproduction |
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034) |
|