backpropagation
English
Etymology
From back + propagation.
Noun
backpropagation (countable and uncountable, plural backpropagations)
- (machine learning) An error-correction technique used to train neural networks, in which error gradients are propagated backward through the network's layers (see the sketch after these definitions).
- 2020, Timothy P. Lillicrap, Adam Santoro, Luke Marris, Colin J. Akerman, Geoffrey Hinton, “Backpropagation and the brain”, in Nature[1]:
- In machine learning, backpropagation of error (‘backprop’) is the algorithm most often used to train deep neural networks and is the most successful learning procedure for these networks.
- (neurology) A phenomenon in which the action potential of a neuron creates a voltage spike both at the end of the axon, as it normally does, and also back through the dendrites from which much of the original input current originated.
- 2000 October 27, Idan Segev, Michael London, “Untangling Dendrites with Quantitative Models”, in Science[2], volume 290, number 5492, pages 744–750:
- Experiments show that these ion channels furnish the dendrites with a rich repertoire of electrical behaviors, from essentially passive responses, to subthreshold active responses, to active backpropagation of the action potential (AP) from the soma into the dendrites, to the initiation of APs in the dendritic tree.
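As a sketch of the machine-learning sense, the following minimal NumPy example trains a tiny two-layer network by hand-coded backpropagation: the output error is differentiated and its gradient is passed backward through each layer to update the weights. The network size, learning rate, and toy sine-regression data are illustrative assumptions, not details from this entry.

```python
# Minimal backpropagation sketch: a 1-16-1 tanh network fit to y = sin(x).
# All sizes, data, and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data.
X = rng.uniform(-3, 3, size=(64, 1))
y = np.sin(X)

# Network parameters.
W1 = rng.normal(0, 0.5, size=(1, 16))
b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1))
b2 = np.zeros(1)
lr = 0.05
n = X.shape[0]

for step in range(2000):
    # Forward pass.
    h_pre = X @ W1 + b1        # hidden pre-activations
    h = np.tanh(h_pre)         # hidden activations
    y_hat = h @ W2 + b2        # network output
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: propagate the error gradient from the output
    # back toward the input-side weights (the "backpropagation").
    d_y_hat = 2 * (y_hat - y) / n        # dLoss / d y_hat
    d_W2 = h.T @ d_y_hat                 # dLoss / d W2
    d_b2 = d_y_hat.sum(axis=0)
    d_h = d_y_hat @ W2.T                 # error sent back to hidden layer
    d_h_pre = d_h * (1 - h ** 2)         # chain rule through tanh
    d_W1 = X.T @ d_h_pre
    d_b1 = d_h_pre.sum(axis=0)

    # Gradient-descent update.
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

print(f"final mean squared error: {loss:.4f}")
```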
Derived terms
- backprop (back-formation)
- backpropagational
Related terms
Translations
in computing
Further reading
- backpropagation on Wikipedia