Weight

Sources:

1. Jeffrey Shainline: Neuromorphic Computing and Optoelectronic Intelligence | Lex Fridman Podcast #225 (Lex Fridman Podcast)
In neuroscience and especially in neuromorphic computing, "weight" typically refers to synaptic weight, the strength of the connection between two neurons, which is central to memory formation. Jeffrey Shainline discusses how synaptic weights in a neural network can be adjusted dynamically based on certain criteria in both supervised and unsupervised learning, and how this adjustment of connection strength is essential for storing and recalling memory patterns [1].
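As a rough illustration of that distinction, the sketch below contrasts an unsupervised Hebbian update with a supervised error-driven (delta rule) update. The function names, learning rate, and layer sizes are arbitrary choices for this example and are not taken from the episode.

```python
import numpy as np

# Illustrative only: two standard weight-update rules, not the specific
# mechanism used in Shainline's optoelectronic hardware.

def hebbian_update(w, pre, post, lr=0.01):
    """Unsupervised: strengthen a synapse when the pre- and post-synaptic
    neurons are active together (Hebbian learning)."""
    return w + lr * np.outer(post, pre)

def delta_rule_update(w, pre, target, lr=0.01):
    """Supervised: adjust weights in proportion to the error between
    the produced output and a provided target."""
    post = w @ pre                        # linear neuron output
    error = target - post                 # supervision signal
    return w + lr * np.outer(error, pre)

# Example: 3 input neurons driving 2 output neurons.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(2, 3))
pre = np.array([1.0, 0.0, 1.0])

w = hebbian_update(w, pre, post=w @ pre)                       # unsupervised step
w = delta_rule_update(w, pre, target=np.array([1.0, -1.0]))    # supervised step
```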

Memory Formation

Jeffrey explains the two poles of synaptic weight update and why changing the strength of the connection between neurons matters for memory formation. He also discusses the difference between supervised and unsupervised learning and how synaptic weights change dynamically in the network based on its physical properties. Finally, he talks about the formation of new synaptic connections, noting that it is not the primary mechanism by which the brain learns. A toy example of memory stored in connection strengths follows below.
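The following Hopfield-style sketch shows how adjusting connection strengths can store a pattern and later recall it from a corrupted cue. It is a generic textbook illustration of memory held in synaptic weights, not the superconducting hardware mechanism Shainline describes in the episode.

```python
import numpy as np

# Toy associative memory: store a pattern by setting symmetric connection
# weights, then recall it from a noisy cue (generic illustration only).

def store(patterns):
    """Hebbian outer-product rule: each stored pattern strengthens the
    connections between neurons that are co-active in it."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0.0)             # no self-connections
    return w / len(patterns)

def recall(w, cue, steps=10):
    """Iteratively drive neuron states toward a stored pattern."""
    s = cue.astype(float).copy()
    for _ in range(steps):
        s = np.sign(w @ s)
        s[s == 0] = 1
    return s

pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
w = store(pattern[None, :])
noisy = pattern.copy()
noisy[:2] *= -1                          # corrupt two entries of the cue
print(recall(w, noisy))                  # recovers the stored pattern
```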
