
Backpropagation

Known as: Error back-propagation, Backpropogation, Back prop 
Backpropagation, an abbreviation for "backward propagation of errors", is a common method of training artificial neural networks used in conjunction… 
Wikipedia
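The mechanism the definition above abbreviates can be sketched in a few lines: a forward pass computes activations, the output error is propagated backward via the chain rule, and weights are adjusted by gradient descent. A minimal illustrative sketch, assuming a tiny 2-2-1 sigmoid network and the XOR task (neither is from this page):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(epochs=2000, lr=0.5, seed=0):
    """Backward propagation of errors on a 2-2-1 sigmoid network (XOR data)."""
    rng = random.Random(seed)
    w1 = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # input -> hidden
    b1 = [0.0, 0.0]
    w2 = [rng.uniform(-1, 1) for _ in range(2)]                      # hidden -> output
    b2 = 0.0
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

    def predict(x):
        h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
        return h, sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)

    def total_error():
        return sum((predict(x)[1] - t) ** 2 for x, t in data)

    before = total_error()
    for _ in range(epochs):
        for x, t in data:
            h, y = predict(x)                       # forward pass
            dy = (y - t) * y * (1 - y)              # error at the output unit
            dh = [dy * w2[j] * h[j] * (1 - h[j])    # error propagated backward
                  for j in range(2)]
            for j in range(2):                      # gradient-descent updates
                w2[j] -= lr * dy * h[j]
                b1[j] -= lr * dh[j]
                for i in range(2):
                    w1[j][i] -= lr * dh[j] * x[i]
            b2 -= lr * dy
    return before, total_error()
```

Training reduces the squared error on the four XOR patterns; the per-weight updates are exactly the "errors propagated backward" that give the method its name.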

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2001
This paper provides guidance on some of the concepts surrounding recurrent neural networks. In contrast to feedforward networks… 
Highly Cited
1996
We saw in the last chapter that multilayered networks are capable of computing a wider range of Boolean functions than networks… 
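The snippet's claim that multilayer networks compute a wider range of Boolean functions is classically illustrated by XOR, which no single threshold unit can compute but a two-layer threshold network can. The hand-picked weights below are one standard construction, not taken from the cited chapter:

```python
def step(x):
    # Heaviside threshold unit
    return 1 if x > 0 else 0

def xor(x1, x2):
    # Hidden layer: OR and AND of the two inputs
    h_or = step(x1 + x2 - 0.5)
    h_and = step(x1 + x2 - 1.5)
    # Output: OR but not AND, i.e. exclusive or
    return step(h_or - h_and - 0.5)
```

A single threshold unit cannot separate {(0,1), (1,0)} from {(0,0), (1,1)} with one line, which is precisely why the hidden layer is needed.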
Highly Cited
1995
Since the publication of the PDP volumes in 1986, learning by backpropagation has become the most popular method of training… 
Highly Cited
1995
The sigmoid function is the most commonly used activation function in feedforward neural networks because of its nonlinearity and the… 
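One property behind the sigmoid's popularity in backpropagation is that its derivative can be written in terms of its own output, sigma'(x) = sigma(x) * (1 - sigma(x)), so the backward pass can reuse forward-pass activations. A small sketch:

```python
import math

def sigmoid(x):
    # Logistic sigmoid: smooth, nonlinear, output in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # Derivative expressed through the forward value: s * (1 - s)
    s = sigmoid(x)
    return s * (1.0 - s)
```

The derivative peaks at 0.25 when x = 0 and vanishes for large |x|, which also explains the saturation behavior often discussed alongside sigmoid-based training.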
Highly Cited
1992
A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks. The framework… 
Highly Cited
1992
The authors propose a theoretical framework for backpropagation (BP) in order to identify some of its limitations as a general… 
Highly Cited
1991
The backpropagation (BP) algorithm provides a popular method for the design of a multilayer neural network to include… 
Review
1990
Fundamental developments in feedforward artificial neural networks from the past thirty years are reviewed. The history… 
Highly Cited
1990
This paper explores the effect of initial weight selection on feed-forward networks learning simple functions with the back… 
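Why initial weight selection matters can be seen in the symmetry problem: if every weight starts at the same value, the hidden units compute identical activations and receive identical error signals, so a gradient step leaves them interchangeable; small distinct initial values break the tie. An illustrative sketch, assuming a tiny 2-2-1 sigmoid setup (not the networks studied in the paper):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def hidden_weights_after_one_step(init):
    """One backprop step on a 2-2-1 sigmoid net; init() supplies each initial weight."""
    w1 = [[init() for _ in range(2)] for _ in range(2)]  # input -> hidden
    w2 = [init() for _ in range(2)]                      # hidden -> output
    x, t, lr = (1.0, 0.5), 1.0, 0.1
    # forward pass
    h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1]) for j in range(2)]
    y = sigmoid(w2[0] * h[0] + w2[1] * h[1])
    # backward pass and hidden-layer update
    dy = (y - t) * y * (1 - y)
    dh = [dy * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
    for j in range(2):
        for i in range(2):
            w1[j][i] -= lr * dh[j] * x[i]
    return w1
```

With `init = lambda: 0.5` both hidden weight rows come out identical after the update, so the two hidden units never differentiate; distinct initial values produce distinct rows.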
Review
1989
  • R. Hecht-Nielsen
  • 1989
  • Corpus ID: 5691634