Presentation 2000/12/1
INDEPENDENCE OF UNSCALED BASIS FUNCTIONS AND FINITE MAPPINGS BY NEURAL NETWORKS
Yoshifusa Ito
Abstract(in Japanese) (See Japanese page)
Abstract(in English) In this paper, the activation functions of neural networks are extended to higher-dimensional functions, so that the sigmoid function and the radial basis function can be treated on a common basis. We show that a three-layered feedforward neural network having n hidden-layer units can implement a mapping of n points of R^d into R without scaling the activation function. Even under this restriction, the class of functions that can be used as activation functions for the finite mapping by the neural network is still wide. However, they must be slowly increasing, because the theory of distributions is used in the proofs. The result extends Ito and Saito (1996b), where sigmoidal activation functions are not scaled.
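A minimal numerical sketch of the kind of finite mapping the abstract describes, not taken from the paper: with n hidden units whose activation is only shifted, never dilated, the output weights obtained from an n-by-n linear system reproduce n prescribed values exactly. The fixed sigmoid g, the randomly chosen unit-norm directions w_j and the shifts t_j below are illustrative assumptions, not the paper's construction.

import numpy as np

rng = np.random.default_rng(0)

d, n = 3, 5                       # input dimension and number of points
X = rng.normal(size=(n, d))       # n distinct points of R^d
y = rng.normal(size=n)            # target values in R

def g(u):
    """Unscaled (fixed) sigmoid: only shifted, never dilated."""
    return 1.0 / (1.0 + np.exp(-u))

# Hidden units: unit-norm direction w_j and shift t_j; g itself is not scaled.
W = rng.normal(size=(n, d))
W /= np.linalg.norm(W, axis=1, keepdims=True)
t = rng.normal(size=n)

# Hidden-layer outputs on the data: A[i, j] = g(w_j . x_i + t_j).
A = g(X @ W.T + t)

# If A is nonsingular (generically true for distinct points and suitably
# chosen hidden units), the output weights c implement the mapping exactly.
c = np.linalg.solve(A, y)

print(np.allclose(A @ c, y))      # the n-point mapping is reproduced

The sketch only illustrates the interpolation step; the paper's contribution concerns which activation functions (necessarily slowly increasing) admit such a construction without scaling.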
Keyword(in Japanese) (See Japanese page)
Keyword(in English) Three Layered Neural Network / Higher Dimensional Activation Function / Unscaled Activation Function / Finite Mapping
Paper # NC2000-73
Date of Issue

Conference Information
Committee NC
Conference Date 2000/12/1 (1 day)
Place (in Japanese) (See Japanese page)
Place (in English)
Topics (in Japanese) (See Japanese page)
Topics (in English)
Chair
Vice Chair
Secretary
Assistant

Paper Information
Registration To Neurocomputing (NC)
Language JPN
Title (in Japanese) (See Japanese page)
Sub Title (in Japanese) (See Japanese page)
Title (in English) INDEPENDENCE OF UNSCALED BASIS FUNCTIONS AND FINITE MAPPINGS BY NEURAL NETWORKS
Sub Title (in English)
Keyword(1) Three Layered Neural Network
Keyword(2) Higher Dimensional Activation Function
Keyword(3) Unscaled Activation Function
Keyword(4) Finite Mapping
1st Author's Name Yoshifusa Ito
1st Author's Affiliation Aichi-Gakuin University
Date 2000/12/1
Paper # NC2000-73
Volume (vol) vol.100
Number (no) 490
Page
#Pages 7
Date of Issue