Article: A computation-efficient on-line training algorithm for neurofuzzy networks

Title: A computation-efficient on-line training algorithm for neurofuzzy networks
Authors: Chan, CW; Cheung, KC; Yeung, WK
Issue Date: 2000
Publisher: Taylor & Francis Ltd. The Journal's web site is located at http://www.tandf.co.uk/journals/titles/00207721.asp
Citation: International Journal of Systems Science, 2000, v. 31 n. 3, p. 297-306
Abstract: Neurofuzzy networks are often used to model linear or nonlinear processes, as they can provide some insights into the underlying processes and can be trained using experimental data. As the training of the networks involves intensive computation, it is often performed off line. However, it is well known that neurofuzzy networks trained off line may not be able to cope successfully with time-varying processes. To overcome this problem, the weights of the networks are trained on line. In this paper, an on-line training algorithm with a computation time that is linear in the number of weights is derived by making full use of the local change property of neurofuzzy networks. It is shown that the estimated weights converge to those obtained from the least-squares method, and that the range of the input domain can be extended without retraining the network. Furthermore, the proposed algorithm tracks time-varying systems better than the recursive least-squares method, since a positive definite submatrix is added to the relevant part of the covariance matrix. The performance of the proposed algorithm is illustrated by simulation examples and compared with that obtained using the recursive least-squares method.
Persistent Identifier: http://hdl.handle.net/10722/156553
ISSN: 0020-7721
2021 Impact Factor: 2.648
2020 SCImago Journal Rankings: 0.591
ISI Accession Number ID: WOS:000086140600003
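The abstract contrasts the proposed linear-time algorithm with the recursive least-squares (RLS) baseline, whose per-step cost grows quadratically with the number of weights. As background, a minimal sketch of one standard RLS update step (the baseline method, not the paper's algorithm; variable names are illustrative):

```python
import numpy as np

def rls_update(w, P, x, y, lam=1.0):
    """One recursive least-squares step: update weights w and
    covariance matrix P given regressor x (e.g. basis-function
    activations of a neurofuzzy network) and target y.
    lam is the forgetting factor. Cost is O(n^2) in the number
    of weights n, which is what an O(n) algorithm improves on."""
    Px = P @ x
    k = Px / (lam + x @ Px)           # gain vector
    e = y - w @ x                     # a-priori prediction error
    w = w + k * e                     # weight update
    P = (P - np.outer(k, Px)) / lam   # covariance update
    return w, P

# Fit y = 2*x0 - x1 from a stream of noiseless samples.
rng = np.random.default_rng(0)
w = np.zeros(2)
P = np.eye(2) * 1e3   # large initial covariance: weak prior
for _ in range(200):
    x = rng.standard_normal(2)
    y = 2.0 * x[0] - 1.0 * x[1]
    w, P = rls_update(w, P, x, y)
print(np.round(w, 3))  # close to [2, -1]
```

With lam=1 the recursion reproduces the batch least-squares solution, which is the convergence target the abstract refers to; for time-varying systems, practitioners typically set lam below 1 so that old data are discounted.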

 

DC Field | Value | Language
dc.contributor.author | Chan, CW | en_HK
dc.contributor.author | Cheung, KC | en_HK
dc.contributor.author | Yeung, WK | en_HK
dc.date.accessioned | 2012-08-08T08:42:56Z |
dc.date.available | 2012-08-08T08:42:56Z |
dc.date.issued | 2000 | en_HK
dc.identifier.citation | International Journal Of Systems Science, 2000, v. 31 n. 3, p. 297-306 | en_HK
dc.identifier.issn | 0020-7721 | en_HK
dc.identifier.uri | http://hdl.handle.net/10722/156553 |
dc.description.abstract | Neurofuzzy networks are often used to model linear or nonlinear processes, as they can provide some insights into the underlying processes and can be trained using experimental data. As the training of the networks involves intensive computation, it is often performed off line. However, it is well known that neurofuzzy networks trained off line may not be able to cope successfully with time-varying processes. To overcome this problem, the weights of the networks are trained on line. In this paper, an on-line training algorithm with a computation time that is linear in the number of weights is derived by making full use of the local change property of neurofuzzy networks. It is shown that the estimated weights converge to those obtained from the least-squares method, and that the range of the input domain can be extended without retraining the network. Furthermore, the proposed algorithm tracks time-varying systems better than the recursive least-squares method, since a positive definite submatrix is added to the relevant part of the covariance matrix. The performance of the proposed algorithm is illustrated by simulation examples and compared with that obtained using the recursive least-squares method. | en_HK
dc.language | eng | en_US
dc.publisher | Taylor & Francis Ltd. The Journal's web site is located at http://www.tandf.co.uk/journals/titles/00207721.asp | en_HK
dc.relation.ispartof | International Journal of Systems Science | en_HK
dc.title | A computation-efficient on-line training algorithm for neurofuzzy networks | en_HK
dc.type | Article | en_HK
dc.identifier.email | Chan, CW: mechan@hkucc.hku.hk | en_HK
dc.identifier.email | Cheung, KC: kccheung@hkucc.hku.hk | en_HK
dc.identifier.authority | Chan, CW=rp00088 | en_HK
dc.identifier.authority | Cheung, KC=rp01322 | en_HK
dc.description.nature | link_to_subscribed_fulltext | en_US
dc.identifier.doi | 10.1080/002077200291145 | en_HK
dc.identifier.scopus | eid_2-s2.0-0034161023 | en_HK
dc.identifier.hkuros | 49552 |
dc.relation.references | http://www.scopus.com/mlt/select.url?eid=2-s2.0-0034161023&selection=ref&src=s&origin=recordpage | en_HK
dc.identifier.volume | 31 | en_HK
dc.identifier.issue | 3 | en_HK
dc.identifier.spage | 297 | en_HK
dc.identifier.epage | 306 | en_HK
dc.identifier.isi | WOS:000086140600003 |
dc.publisher.place | United Kingdom | en_HK
dc.identifier.scopusauthorid | Chan, CW=7404814060 | en_HK
dc.identifier.scopusauthorid | Cheung, KC=7402406698 | en_HK
dc.identifier.scopusauthorid | Yeung, WK=24345897100 | en_HK
dc.identifier.issnl | 0020-7721 |
