Presentation 2010-03-10
Autonomous Composition of State Space Based on Action History Table in Reinforcement Learning
Michio Kimura, Hidehiro Nakano, Arata Miyauchi
Abstract(in English) In reinforcement learning, a discrete state space is generally constructed for real environments, and learning agents learn a policy for each state. Real environments often require a high-dimensional and finely discretized state input, so learning practical tasks demands reducing the huge computational complexity caused by the resulting increase in the number of states. This study therefore aims to reduce the computational complexity in large-scale environments and presents a more efficient learning method based on an autonomous composition of the state space. Concretely, a method for designing state sets that takes state transitions into account is proposed.
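The abstract only outlines the approach, so the sketch below is a rough, generic illustration of state-space aggregation in tabular reinforcement learning rather than the paper's algorithm: epsilon-greedy Q-learning in which a hypothetical action history table groups raw discretized states by their recent actions so that they share a single Q-table entry. The merging rule, the history length, and all identifiers (aggregate, action_history, select_action, update) are assumptions invented here for illustration.

    import random
    from collections import defaultdict

    ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1   # learning rate, discount factor, exploration rate
    N_ACTIONS = 4
    HISTORY_LEN = 3                          # assumed length of the recorded action history

    q_table = defaultdict(lambda: [0.0] * N_ACTIONS)   # aggregated state -> action values
    action_history = defaultdict(tuple)                # raw state -> recent actions taken there

    def aggregate(raw_state):
        """Map a raw discretized state to an aggregated key.

        Raw states whose recorded action histories coincide share one Q-table
        entry; states with no history yet keep their own entry. This merging
        rule is a hypothetical stand-in for an autonomous composition of the
        state space, not the method proposed in the paper."""
        hist = action_history[raw_state]
        return ("hist", hist) if hist else ("raw", raw_state)

    def select_action(raw_state):
        """Epsilon-greedy selection over the aggregated state's action values."""
        if random.random() < EPSILON:
            return random.randrange(N_ACTIONS)
        values = q_table[aggregate(raw_state)]
        return values.index(max(values))

    def update(raw_state, action, reward, next_raw_state):
        """One Q-learning backup, then record the action in the history table."""
        s, s_next = aggregate(raw_state), aggregate(next_raw_state)
        td_target = reward + GAMMA * max(q_table[s_next])
        q_table[s][action] += ALPHA * (td_target - q_table[s][action])
        # Keep only the most recent HISTORY_LEN actions for this raw state.
        action_history[raw_state] = (action_history[raw_state] + (action,))[-HISTORY_LEN:]

    # Usage inside an episode loop (env is any environment with reset()/step()):
    #   state = env.reset()
    #   action = select_action(state)
    #   next_state, reward, done = env.step(action)
    #   update(state, action, reward, next_state)

Merging states that behave alike reduces the number of table entries the agent must store and update, which is the kind of computational-complexity reduction in large-scale environments that the abstract targets.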
Keyword(in English) Reinforcement learning / Autonomous composition of state space / Speeding up
Paper # NLP2009-185
Date of Issue

Conference Information
Committee NLP
Conference Date 2010/3/2 (1 day)
Place (in English)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Nonlinear Problems (NLP)
Language JPN
Title (in English) Autonomous Composition of State Space Based on Action History Table in Reinforcement Learning
Sub Title (in English)
Keyword(1) Reinforcement learning
Keyword(2) Autonomous composition of state space
Keyword(3) Speeding up
1st Author's Name Michio Kimura
1st Author's Affiliation Tokyo City University
2nd Author's Name Hidehiro Nakano
2nd Author's Affiliation Tokyo City University
3rd Author's Name Arata Miyauchi
3rd Author's Affiliation Tokyo City University
Date 2010-03-10
Paper # NLP2009-185
Volume (vol) vol.109
Number (no) 458
Page pp.-
#Pages 6
Date of Issue