Article: A synaptic learning rule for exploiting nonlinear dendritic computation

Title: A synaptic learning rule for exploiting nonlinear dendritic computation
Authors: Bicknell, Brendan A.; Häusser, Michael
Keywords: biophysical model; cable theory; dendritic computation; feature-binding problem; learning rule; morphology; NMDA receptors; pyramidal neuron; synaptic plasticity
Issue Date: 2021
Citation: Neuron, 2021, v. 109, n. 24, p. 4001-4017.e10
Abstract: Information processing in the brain depends on the integration of synaptic input distributed throughout neuronal dendrites. Dendritic integration is a hierarchical process, proposed to be equivalent to integration by a multilayer network, potentially endowing single neurons with substantial computational power. However, whether neurons can learn to harness dendritic properties to realize this potential is unknown. Here, we develop a learning rule from dendritic cable theory and use it to investigate the processing capacity of a detailed pyramidal neuron model. We show that computations using spatial or temporal features of synaptic input patterns can be learned, and even synergistically combined, to solve a canonical nonlinear feature-binding problem. The voltage dependence of the learning rule drives coactive synapses to engage dendritic nonlinearities, whereas spike-timing dependence shapes the time course of subthreshold potentials. Dendritic input-output relationships can therefore be flexibly tuned through synaptic plasticity, allowing optimal implementation of nonlinear functions by single neurons.
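The abstract describes a plasticity rule whose weight updates are gated by local dendritic voltage, so that coactive synapses on depolarized branches are preferentially strengthened. As a purely illustrative sketch, not the paper's actual derivation, a minimal error-driven rule with a sigmoidal voltage gate (all function names and parameter values here are hypothetical) might look like:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(inputs, local_v, target, steps=200, eta=0.1, v_half=-40.0, k=5.0):
    """Toy voltage-gated, error-driven plasticity rule (illustrative only).

    Each synapse's weight change is scaled by a sigmoid of its local
    dendritic voltage (in mV), loosely mimicking the voltage dependence
    of NMDA-receptor conductance. Parameters are hypothetical, not taken
    from the paper.
    """
    weights = [0.0] * len(inputs)
    # Fixed voltage gate per synapse: larger for depolarized locations.
    gates = [sigmoid((v - v_half) / k) for v in local_v]
    for _ in range(steps):
        output = sum(w * x for w, x in zip(weights, inputs))
        error = target - output  # supervised error signal
        for i, x in enumerate(inputs):
            # Update is proportional to error, presynaptic activity,
            # and the local voltage gate.
            weights[i] += eta * error * x * gates[i]
    return weights

# Two coactive synapses: the one at a depolarized site (-30 mV) learns
# much faster than the one at a hyperpolarized site (-60 mV).
w = train(inputs=[1.0, 1.0], local_v=[-30.0, -60.0], target=1.0)
```

In this sketch both synapses see the same input and error, yet the voltage gate alone determines which one captures most of the weight, echoing the abstract's claim that the rule's voltage dependence drives coactive synapses to engage dendritic nonlinearities.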
Persistent Identifier: http://hdl.handle.net/10722/343519
ISSN: 0896-6273
2023 Impact Factor: 14.7
2023 SCImago Journal Rankings: 7.728


DC Field: Value
dc.contributor.author: Bicknell, Brendan A.
dc.contributor.author: Häusser, Michael
dc.date.accessioned: 2024-05-10T09:08:45Z
dc.date.available: 2024-05-10T09:08:45Z
dc.date.issued: 2021
dc.identifier.citation: Neuron, 2021, v. 109, n. 24, p. 4001-4017.e10
dc.identifier.issn: 0896-6273
dc.identifier.uri: http://hdl.handle.net/10722/343519
dc.description.abstract: Information processing in the brain depends on the integration of synaptic input distributed throughout neuronal dendrites. Dendritic integration is a hierarchical process, proposed to be equivalent to integration by a multilayer network, potentially endowing single neurons with substantial computational power. However, whether neurons can learn to harness dendritic properties to realize this potential is unknown. Here, we develop a learning rule from dendritic cable theory and use it to investigate the processing capacity of a detailed pyramidal neuron model. We show that computations using spatial or temporal features of synaptic input patterns can be learned, and even synergistically combined, to solve a canonical nonlinear feature-binding problem. The voltage dependence of the learning rule drives coactive synapses to engage dendritic nonlinearities, whereas spike-timing dependence shapes the time course of subthreshold potentials. Dendritic input-output relationships can therefore be flexibly tuned through synaptic plasticity, allowing optimal implementation of nonlinear functions by single neurons.
dc.language: eng
dc.relation.ispartof: Neuron
dc.subject: biophysical model
dc.subject: cable theory
dc.subject: dendritic computation
dc.subject: feature-binding problem
dc.subject: learning rule
dc.subject: morphology
dc.subject: NMDA receptors
dc.subject: pyramidal neuron
dc.subject: synaptic plasticity
dc.title: A synaptic learning rule for exploiting nonlinear dendritic computation
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1016/j.neuron.2021.09.044
dc.identifier.pmid: 34715026
dc.identifier.scopus: eid_2-s2.0-85120912816
dc.identifier.volume: 109
dc.identifier.issue: 24
dc.identifier.spage: 4001
dc.identifier.epage: 4017.e10
dc.identifier.eissn: 1097-4199
