Presentation | 2012-11-08, Optimization of Feature Allocation on Parallel Stochastic Gradient Descent for Sparse Data, Kohei HAYASHI, Ryohei FUJIMAKI |
---|---|
Abstract(in Japanese) | (See Japanese page) |
Abstract(in English) | In feature-wise parallelization of stochastic gradient descent, the feature allocation governs the total computational time, which in general does not scale with the number of parallel units as one would expect. We tackle this problem and propose efficient feature-allocation methods, evaluating their performance both theoretically and empirically on synthetic data. |
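The load-imbalance issue the abstract points to can be illustrated with a toy makespan model (an assumption for illustration only, not the paper's proposed method): each feature's cost is taken as its nonzero count, the total time of one SGD sweep is the load of the busiest parallel unit, and allocation is compared between naive round-robin and a greedy longest-processing-time heuristic.

```python
import random

def makespan(loads):
    # Feature-wise parallel SGD finishes when the busiest unit finishes,
    # so total time per sweep is the maximum per-unit load (the makespan).
    return max(loads)

def allocate_round_robin(costs, p):
    # Naive allocation: feature i goes to unit i mod p, ignoring cost.
    loads = [0.0] * p
    for i, c in enumerate(costs):
        loads[i % p] += c
    return loads

def allocate_greedy(costs, p):
    # LPT heuristic: assign each feature, heaviest first,
    # to the currently least-loaded unit.
    loads = [0.0] * p
    for c in sorted(costs, reverse=True):
        loads[loads.index(min(loads))] += c
    return loads

random.seed(0)
# Hypothetical per-feature work: heavy-tailed nonzero counts,
# mimicking the skewed column densities of sparse data.
costs = [random.paretovariate(1.5) for _ in range(1000)]
p = 8
print("round-robin makespan:", makespan(allocate_round_robin(costs, p)))
print("greedy makespan:     ", makespan(allocate_greedy(costs, p)))
```

With skewed costs the round-robin makespan is dominated by whichever unit happens to receive the heaviest features, while the greedy allocation keeps all units nearly equally loaded, which is why allocation, not just the unit count, determines the speedup.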
Keyword(in Japanese) | (See Japanese page) |
Keyword(in English) | Stochastic gradient descent / parallel computing / discrete optimization |
Paper # | IBISML2012-77 |
Date of Issue |
Conference Information | |
Committee | IBISML |
---|---|
Conference Date | 2012/10/31 (1 day) |
Place (in Japanese) | (See Japanese page) |
Place (in English) | |
Topics (in Japanese) | (See Japanese page) |
Topics (in English) | |
Chair | |
Vice Chair | |
Secretary | |
Assistant |
Paper Information | |
Registration To | Information-Based Induction Sciences and Machine Learning (IBISML) |
---|---|
Language | JPN |
Title (in Japanese) | (See Japanese page) |
Sub Title (in Japanese) | (See Japanese page) |
Title (in English) | Optimization of Feature Allocation on Parallel Stochastic Gradient Descent for Sparse Data |
Sub Title (in English) | |
Keyword(1) | Stochastic gradient descent |
Keyword(2) | parallel computing |
Keyword(3) | discrete optimization |
1st Author's Name | Kohei HAYASHI |
1st Author's Affiliation | Graduate School of Information Science and Technology, University of Tokyo / JSPS |
2nd Author's Name | Ryohei FUJIMAKI |
2nd Author's Affiliation | NEC Laboratories America |
Date | 2012-11-08 |
Paper # | IBISML2012-77 |
Volume (vol) | vol.112 |
Number (no) | 279 |
Page | pp.- |
#Pages | 8 |