Presentation 2007-12-22
Optimizing SVR Hyperparameters via Fast Cross-Validation using AOSVR
Masayuki KARASUYAMA, Ryohei NAKANO
Abstract(in Japanese) (See Japanese page)
Abstract(in English) The performance of Support Vector Regression (SVR) strongly depends on its hyperparameters, such as the insensitive-zone thickness, the penalty factor, and kernel parameters. A method called MCV-SVR was previously proposed, which optimizes SVR hyperparameters so that the cross-validation (CV) error is minimized. However, the computational cost of CV is usually high. In this paper we apply Accurate Online Support Vector Regression (AOSVR) to the cross-validation procedure of MCV-SVR. AOSVR enables an efficient update of a trained SVR function. We show that AOSVR dramatically accelerates MCV-SVR. Moreover, our experiments show that the accelerated MCV-SVR achieves better generalization than other existing methods.
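As a rough illustration of the hyperparameter-selection problem described in the abstract, the Python sketch below chooses the insensitive-zone thickness (epsilon), the penalty factor (C), and an RBF kernel parameter (gamma) by minimizing k-fold cross-validation error with a plain grid search over scikit-learn's batch SVR. This is not the paper's MCV-SVR/AOSVR algorithm: MCV-SVR minimizes the CV error directly, and AOSVR incrementally updates the trained SVR when samples are added or removed, avoiding the full retraining done in every fold here. The dataset and parameter grids are invented for illustration only.

# Minimal sketch (not the paper's algorithm): select SVR hyperparameters
# by minimizing 10-fold cross-validation error with exhaustive search.
# MCV-SVR instead minimizes the CV error directly, and AOSVR updates the
# trained SVR incrementally per fold rather than retraining from scratch.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))            # toy 1-D inputs
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(200)  # noisy targets

best_params, best_mse = None, np.inf
for eps in (0.01, 0.1, 0.5):            # insensitive-zone thickness
    for C in (1.0, 10.0, 100.0):        # penalty factor
        for gamma in (0.1, 1.0, 10.0):  # RBF kernel parameter
            model = SVR(kernel="rbf", epsilon=eps, C=C, gamma=gamma)
            # mean squared CV error (scorer returns negated MSE)
            mse = -cross_val_score(model, X, y, cv=10,
                                   scoring="neg_mean_squared_error").mean()
            if mse < best_mse:
                best_params, best_mse = (eps, C, gamma), mse

print("best (epsilon, C, gamma):", best_params, "CV MSE:", best_mse)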
Keyword(in Japanese) (See Japanese page)
Keyword(in English) support vector machines / support vector regression / minimum cross-validation
Paper # NC2007-73
Date of Issue

Conference Information
Committee NC
Conference Date 2007/12/15 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) Optimizing SVR Hyperparameters via Fast Cross-Validation using AOSVR
Sub Title (in English)
Keyword(1) support vector machines
Keyword(2) support vector regression
Keyword(3) minimum cross-validation
1st Author's Name Masayuki KARASUYAMA
1st Author's Affiliation Nagoya Institute of Technology
2nd Author's Name Ryohei NAKANO
2nd Author's Affiliation Nagoya Institute of Technology
Date 2007-12-22
Paper # NC2007-73
Volume (vol) vol.107
Number (no) 410
Page pp.-
#Pages 6
Date of Issue