Article: A new optimization algorithm for single hidden layer feedforward neural networks

Title: A new optimization algorithm for single hidden layer feedforward neural networks
Authors: Li, LK; Shao, S; Yiu, KFC
Keywords: Evolutionary Algorithm; Feedforward Neural Networks; Training Of Neural Networks
Issue Date: 2013
Publisher: Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/asoc
Citation: Applied Soft Computing Journal, 2013, v. 13 n. 5, p. 2857-2862
Abstract: Feedforward neural networks are the most commonly used function approximation techniques in neural networks. By the universal approximation theorem, a single-hidden-layer feedforward neural network (FNN) is sufficient to approximate the desired outputs arbitrarily closely. Some researchers use genetic algorithms (GAs) to search for the globally optimal FNN structure. However, using a GA to train an FNN is rather time-consuming. In this paper, we propose a new optimization algorithm for a single-hidden-layer FNN. The method is based on a convex combination algorithm for merging information in the hidden layer. This technique explores a continuum idea that combines the classic mutation and crossover strategies of GAs. The proposed method has an advantage over a GA, which requires substantial preprocessing work to encode the data as binary strings before learning or mutation can be applied. We also set up a new error function to measure the performance of the FNN and to obtain the optimal choice of connection weights, so that the nonlinear optimization problem can be solved directly. Several computational experiments illustrate the proposed algorithm, which has good exploration and exploitation capabilities in searching for the optimal weights of single-hidden-layer FNNs. © 2012.
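The record contains only the abstract, so the paper's exact algorithm and error function are not available here. Below is a minimal sketch of the general idea under stated assumptions: a single-hidden-layer network with sigmoid hidden units and a linear output, a plain mean-squared error standing in for the paper's new error function, and offspring formed as convex combinations of real-valued parent weight vectors, with a small Gaussian perturbation as a continuous analogue of mutation. All names (unpack, predict, train) and parameter choices are hypothetical, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def unpack(theta, d, m):
    """Split a flat parameter vector into W1 (d x m), b1 (m,), w2 (m,), b2."""
    i = d * m
    W1 = theta[:i].reshape(d, m)
    b1 = theta[i:i + m]
    w2 = theta[i + m:i + 2 * m]
    b2 = theta[i + 2 * m]
    return W1, b1, w2, b2

def predict(theta, X, m):
    """Single-hidden-layer FNN: sigmoid hidden units, linear output unit."""
    W1, b1, w2, b2 = unpack(theta, X.shape[1], m)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    return h @ w2 + b2

def mse(theta, X, y, m):
    """Plain mean-squared error (a stand-in for the paper's error function)."""
    return np.mean((predict(theta, X, m) - y) ** 2)

def train(X, y, m=10, pop=30, iters=500, sigma=0.1):
    """Evolve a population of weight vectors by convex combination."""
    d = X.shape[1]
    n_params = d * m + 2 * m + 1
    P = rng.normal(0.0, 1.0, (pop, n_params))      # initial population
    for _ in range(iters):
        errs = np.array([mse(t, X, y, m) for t in P])
        P = P[np.argsort(errs)]                    # best individuals first
        for k in range(pop // 2, pop):             # regenerate the worst half
            ia, ib = rng.integers(0, pop // 2, 2)  # pick two good parents
            lam = rng.uniform()
            # Convex combination: a continuous analogue of crossover.
            child = lam * P[ia] + (1.0 - lam) * P[ib]
            # Small Gaussian perturbation: a continuous analogue of mutation.
            child += rng.normal(0.0, sigma, n_params)
            P[k] = child
    errs = np.array([mse(t, X, y, m) for t in P])
    return P[np.argmin(errs)]
```

For example, with X of shape (n, d) and y of shape (n,), theta = train(X, y, m=10) returns the best weight vector found. Because each child lies on the segment between two good parents (up to the perturbation), recombination operates directly in the continuous weight space, avoiding the binary encoding that a classic GA would require.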
Persistent Identifier: http://hdl.handle.net/10722/155965
ISSN: 1568-4946
2021 Impact Factor: 8.263
2020 SCImago Journal Rankings: 1.290
ISI Accession Number ID: WOS:000319205200055

DC Field | Value | Language
dc.contributor.author | Li, LK | en_US
dc.contributor.author | Shao, S | en_US
dc.contributor.author | Yiu, KFC | en_US
dc.date.accessioned | 2012-08-08T08:38:39Z | -
dc.date.available | 2012-08-08T08:38:39Z | -
dc.date.issued | 2013 | en_US
dc.identifier.citation | Applied Soft Computing Journal, 2013, v. 13 n. 5, p. 2857-2862 | en_US
dc.identifier.issn | 1568-4946 | en_US
dc.identifier.uri | http://hdl.handle.net/10722/155965 | -
dc.description.abstract | Feedforward neural networks are the most commonly used function approximation techniques in neural networks. By the universal approximation theorem, a single-hidden-layer feedforward neural network (FNN) is sufficient to approximate the desired outputs arbitrarily closely. Some researchers use genetic algorithms (GAs) to search for the globally optimal FNN structure. However, using a GA to train an FNN is rather time-consuming. In this paper, we propose a new optimization algorithm for a single-hidden-layer FNN. The method is based on a convex combination algorithm for merging information in the hidden layer. This technique explores a continuum idea that combines the classic mutation and crossover strategies of GAs. The proposed method has an advantage over a GA, which requires substantial preprocessing work to encode the data as binary strings before learning or mutation can be applied. We also set up a new error function to measure the performance of the FNN and to obtain the optimal choice of connection weights, so that the nonlinear optimization problem can be solved directly. Several computational experiments illustrate the proposed algorithm, which has good exploration and exploitation capabilities in searching for the optimal weights of single-hidden-layer FNNs. © 2012. | en_US
dc.language | eng | en_US
dc.publisher | Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/asoc | en_US
dc.relation.ispartof | Applied Soft Computing Journal | en_US
dc.subject | Evolutionary Algorithm | en_US
dc.subject | Feedforward Neural Networks | en_US
dc.subject | Training Of Neural Networks | en_US
dc.title | A new optimization algorithm for single hidden layer feedforward neural networks | en_US
dc.type | Article | en_US
dc.identifier.email | Yiu, KFC: cedric@hkucc.hku.hk | en_US
dc.identifier.authority | Yiu, KFC=rp00206 | en_US
dc.description.nature | link_to_subscribed_fulltext | en_US
dc.identifier.doi | 10.1016/j.asoc.2012.04.034 | en_US
dc.identifier.scopus | eid_2-s2.0-84885640902 | en_US
dc.identifier.isi | WOS:000319205200055 | -
dc.publisher.place | Netherlands | en_US
dc.identifier.scopusauthorid | Li, LK=7501445268 | en_US
dc.identifier.scopusauthorid | Shao, S=7102636557 | en_US
dc.identifier.scopusauthorid | Yiu, KFC=24802813000 | en_US
dc.identifier.citeulike | 10719488 | -
dc.identifier.issnl | 1568-4946 | -
