Open access paper

SoftMax Neural Best Approximation

Hawraa A Almurieb and Eman S Bhaya

Published under licence by IOP Publishing Ltd
Citation: Hawraa A Almurieb and Eman S Bhaya 2020 IOP Conf. Ser.: Mater. Sci. Eng. 871 012040. DOI: 10.1088/1757-899X/871/1/012040


Abstract

Neural networks play an important role in approximating nonlinear functions, in particular Lebesgue integrable functions approximated by feedforward neural networks (FNNs) with one hidden layer and sigmoidal activation functions. Various neural network operators have been defined and shown to achieve good rates of approximation in terms of the modulus of smoothness. Here we define a new neural network operator with a generalized sigmoidal function (SoftMax) to improve the rate of approximation of functions in Lp, with p < 1, with the error estimated by the modulus of smoothness of order k. The SoftMax function is chosen as the activation function because of its flexible properties and wide range of applications.
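To make the abstract's ingredients concrete, the following is a minimal numerical sketch, not the operator constructed in the paper: it assumes a standard numerically stable SoftMax and a single-hidden-layer FNN whose hidden weights are fixed at random while only the linear output layer is fitted by least squares. The target function f, the unit count n, and all weight scales are illustrative assumptions.

    import numpy as np

    def softmax(z):
        # Numerically stable SoftMax along the last axis.
        e = np.exp(z - z.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    # Illustrative target on [0, 1]; not from the paper.
    def f(x):
        return np.sin(np.pi * x)

    x = np.linspace(0.0, 1.0, 200)[:, None]   # sample points, shape (200, 1)

    # One hidden layer with n SoftMax units; hidden weights are random and
    # fixed, only the output layer is fitted (a random-features style sketch).
    n = 32
    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=5.0, size=(1, n))
    b1 = rng.normal(scale=5.0, size=n)
    H = softmax(x @ W1 + b1)                   # hidden activations, (200, n)
    c, *_ = np.linalg.lstsq(H, f(x), rcond=None)

    print("max approximation error:", np.abs(H @ c - f(x)).max())

Increasing n should shrink the error, in the spirit of the degree-of-approximation estimates that the paper derives in terms of the modulus of smoothness.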


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
