Article: On the principles of Parsimony and Self-consistency for the emergence of intelligence

Title: On the principles of Parsimony and Self-consistency for the emergence of intelligence
Authors: Ma, Yi; Tsao, Doris; Shum, Heung Yeung
Keywords: Closed-loop transcription; Deep networks; Intelligence; Parsimony; Rate reduction; Self-consistency; TP18
Issue Date: 2022
Citation: Frontiers of Information Technology and Electronic Engineering, 2022, v. 23, n. 9, p. 1298-1323
Abstract: Ten years into the revival of deep networks and artificial intelligence, we propose a theoretical framework that sheds light on understanding deep networks within a bigger picture of intelligence in general. We introduce two fundamental principles, Parsimony and Self-consistency, which address two fundamental questions regarding intelligence: what to learn and how to learn, respectively. We believe the two principles serve as the cornerstone for the emergence of intelligence, artificial or natural. While they have rich classical roots, we argue that they can be stated anew in entirely measurable and computable ways. More specifically, the two principles lead to an effective and efficient computational framework, compressive closed-loop transcription, which unifies and explains the evolution of modern deep networks and most practices of artificial intelligence. While we use mainly visual data modeling as an example, we believe the two principles will unify understanding of broad families of autonomous intelligent systems and provide a framework for understanding the brain.
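
Note: The keyword "Rate reduction" names the measurable objective by which the abstract's Parsimony principle is made computable. As a rough illustration only (this sketch is not part of the record), the following NumPy code computes the coding rate and the rate-reduction gap in the form used in the authors' related rate-reduction (MCR²) line of work; the function names and the eps parameter are illustrative assumptions.

    import numpy as np

    def coding_rate(Z, eps=0.5):
        # R(Z) = 1/2 * logdet(I + d/(m * eps^2) * Z @ Z.T) for d x m features Z:
        # roughly, the number of bits needed to code Z up to distortion eps.
        d, m = Z.shape
        _, logdet = np.linalg.slogdet(np.eye(d) + (d / (m * eps ** 2)) * Z @ Z.T)
        return 0.5 * logdet

    def rate_reduction(Z, labels, eps=0.5):
        # Delta R = R(Z) - sum_j (m_j / m) * R(Z_j): the rate of coding all
        # features jointly minus the average rate of coding each class
        # separately. Maximizing Delta R expands the representation as a
        # whole while compressing each class -- the parsimony objective.
        m = Z.shape[1]
        conditional = sum(
            (np.sum(labels == c) / m) * coding_rate(Z[:, labels == c], eps)
            for c in np.unique(labels)
        )
        return coding_rate(Z, eps) - conditional

Roughly speaking, in the compressive closed-loop transcription framework the abstract describes, an encoder and a decoder play a game over rate-reduction-based measures of this kind, which is what makes the Parsimony and Self-consistency principles checkable by the system itself.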
Persistent Identifier: http://hdl.handle.net/10722/327786
ISSN: 2095-9184
2023 Impact Factor: 2.7
2023 SCImago Journal Rankings: 0.700
ISI Accession Number ID: WOS:000840009700002

 

DC Field | Value | Language
dc.contributor.author | Ma, Yi | -
dc.contributor.author | Tsao, Doris | -
dc.contributor.author | Shum, Heung Yeung | -
dc.date.accessioned | 2023-05-08T02:26:48Z | -
dc.date.available | 2023-05-08T02:26:48Z | -
dc.date.issued | 2022 | -
dc.identifier.citation | Frontiers of Information Technology and Electronic Engineering, 2022, v. 23, n. 9, p. 1298-1323 | -
dc.identifier.issn | 2095-9184 | -
dc.identifier.uri | http://hdl.handle.net/10722/327786 | -
dc.description.abstract | Ten years into the revival of deep networks and artificial intelligence, we propose a theoretical framework that sheds light on understanding deep networks within a bigger picture of intelligence in general. We introduce two fundamental principles, Parsimony and Self-consistency, which address two fundamental questions regarding intelligence: what to learn and how to learn, respectively. We believe the two principles serve as the cornerstone for the emergence of intelligence, artificial or natural. While they have rich classical roots, we argue that they can be stated anew in entirely measurable and computable ways. More specifically, the two principles lead to an effective and efficient computational framework, compressive closed-loop transcription, which unifies and explains the evolution of modern deep networks and most practices of artificial intelligence. While we use mainly visual data modeling as an example, we believe the two principles will unify understanding of broad families of autonomous intelligent systems and provide a framework for understanding the brain. | -
dc.language | eng | -
dc.relation.ispartof | Frontiers of Information Technology and Electronic Engineering | -
dc.subject | Closed-loop transcription | -
dc.subject | Deep networks | -
dc.subject | Intelligence | -
dc.subject | Parsimony | -
dc.subject | Rate reduction | -
dc.subject | Self-consistency | -
dc.subject | TP18 | -
dc.title | On the principles of Parsimony and Self-consistency for the emergence of intelligence | -
dc.type | Article | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1631/FITEE.2200297 | -
dc.identifier.scopus | eid_2-s2.0-85135851367 | -
dc.identifier.volume | 23 | -
dc.identifier.issue | 9 | -
dc.identifier.spage | 1298 | -
dc.identifier.epage | 1323 | -
dc.identifier.eissn | 2095-9230 | -
dc.identifier.isi | WOS:000840009700002 | -
