author and presenter: Willem Wybo, INM-6

[paper](https://www.biorxiv.org/content/10.1101/2022.11.25.517941v1)
|
|
Zoom link: https://fz-juelich-de.zoom.us/j/94480107394?pwd=dFdFZTRwaytONzJrcEp3WkJ0bmNqUT09<br>
Meeting ID: 944 8010 7394<br>
Passcode: 092935

**Abstract:**<br>
While sensory representations in the brain depend on context, it remains unclear how such modulations are implemented at the biophysical level, and how processing layers further in the hierarchy can extract useful features for each possible contextual state. Here, we first demonstrate that thin dendritic branches are well suited to implementing contextual modulation of feedforward processing. Such neuron-specific modulations exploit prior knowledge, encoded in stable feedforward weights, to achieve transfer learning across contexts. In a network of biophysically realistic neuron models with context-independent feedforward weights, we show that modulatory inputs to thin dendrites can solve linearly non-separable learning problems with a Hebbian, error-modulated learning rule. Finally, we demonstrate that local prediction of whether representations originate either from different inputs, or from different contextual modulations of the same input, results in representation learning of hierarchical feedforward weights across processing layers that accommodate a multitude of contexts.
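
To give a rough feeling for the learning scheme described in the abstract, here is a minimal, purely illustrative rate-based sketch in Python/NumPy: fixed, context-independent feedforward weights project the input onto "dendritic branches", and only context-specific modulatory weights are plastic, updated with a Hebbian, error-modulated (three-factor) rule on a linearly non-separable task (XOR) and a second task (AND) in another context. This is not the biophysically detailed model presented in the paper; the network size, tasks, learning rate, and the reduction of dendritic gating to a single modulatory weight per branch are simplifying assumptions made here only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Four binary input patterns and the target output for each context
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
targets = {
    0: np.array([0., 1., 1., 0.]),  # context 0: XOR, linearly non-separable in x
    1: np.array([0., 0., 0., 1.]),  # context 1: AND
}

# Fixed, context-independent feedforward weights onto n_branch "dendritic branches"
# (illustrative choice of size; these weights are never updated)
n_branch = 30
W = rng.normal(size=(n_branch, 2))
b = rng.normal(size=n_branch)

def branch_activity(x):
    """Rectified feedforward drive of each branch for input x."""
    return np.maximum(0.0, W @ x + b)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# Context-specific modulatory weights: the only plastic parameters in this sketch
M = {ctx: np.zeros(n_branch) for ctx in targets}

eta, n_epochs = 0.2, 5000
for _ in range(n_epochs):
    for ctx, y in targets.items():
        for x, t in zip(X, y):
            h = branch_activity(x)        # feedforward branch activity
            y_hat = sigmoid(M[ctx] @ h)   # somatic output under the current modulation
            # Hebbian, error-modulated (three-factor) update:
            # presynaptic branch activity h times the global error signal (t - y_hat)
            M[ctx] += eta * (t - y_hat) * h

for ctx, y in targets.items():
    preds = np.array([sigmoid(M[ctx] @ branch_activity(x)) > 0.5 for x in X])
    print(f"context {ctx}: predictions {preds.astype(int)}, targets {y.astype(int)}")
```

In this toy setting the four input patterns are almost surely linearly separable in the branch-activity space, so the error-modulated Hebbian rule (which here coincides with the gradient of a logistic readout) finds context-specific modulations that solve both tasks while the feedforward weights stay untouched; the full biophysical picture is the subject of the talk.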
|
|
|
|
|
|
## Past Meetings
|
|