Backpropagation

In 1986 [[David E. Rumelhart|Rumelhart]], [[Geoffrey E. Hinton|Hinton]] and [[Ronald J. Williams|Williams]] showed experimentally that this method can generate useful internal representations of incoming data in hidden layers of neural networks.{{sfn|Rumelhart|Hinton|Williams|1986a}}{{sfn|Rumelhart|Hinton|Williams|1986b}}<ref>{{cite book|url={{google books |plainurl=y |id=4j9GAQAAIAAJ}}|title=Introduction to Machine Learning|last=Alpaydin|first=Ethem|publisher=MIT Press|year=2010|isbn=978-0-262-01243-0}}</ref> [[Yann LeCun]], inventor of the convolutional neural network architecture, proposed the modern form of the backpropagation learning algorithm for neural networks in his 1987 PhD thesis. However, it was not until 1993 that Wan won an international pattern recognition contest using backpropagation.<ref name="schmidhuber2015"/><ref>{{cite book |first=Eric A. |last=Wan |chapter=Time Series Prediction by Using a Connectionist Network with Internal Delay Lines |title=Time Series Prediction: Forecasting the Future and Understanding the Past |editor-first=Andreas S. |editor-last=Weigend |editor-link=Andreas Weigend |editor2-first=Neil A. |editor2-last=Gershenfeld |editor2-link=Neil Gershenfeld |series=Proceedings of the NATO Advanced Research Workshop on Comparative Time Series Analysis |volume=15 |location=Reading |publisher=Addison-Wesley |year=1994 |isbn=0-201-62601-2 |pages=195–217 |chapterurl=https://pdfs.semanticscholar.org/667c/2b372d3387011510a21cc7e0b267e36259dd.pdf }}</ref>
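
The effect Rumelhart, Hinton and Williams described can be illustrated with a minimal sketch (an illustrative reconstruction, not code from the 1986 papers): a small network trained by backpropagation on the XOR task, which is not linearly separable and therefore forces the hidden units to develop an internal representation of the input. The architecture and hyperparameters below (three hidden units, sigmoid activations, learning rate 0.5) are illustrative assumptions, not choices taken from the cited work.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# XOR: the target is not linearly separable in the raw inputs, so the
# network must invent a useful hidden-layer representation to solve it.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(size=(2, 3))   # input -> hidden weights (3 hidden units, assumed)
b1 = np.zeros(3)
W2 = rng.normal(size=(3, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5                        # learning rate (illustrative choice)
for _ in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)    # hidden activations: the learned representation
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error with the chain rule.
    d_out = out - y                     # cross-entropy gradient at the output
    d_h = (d_out @ W2.T) * h * (1 - h)  # error signal reaching the hidden layer

    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

h = sigmoid(X @ W1 + b1)
print(np.round(h, 2))                     # hidden code for each input pattern
print(np.round(sigmoid(h @ W2 + b2), 2))  # output, close to [0, 1, 1, 0]
</syntaxhighlight>

After training, the hidden activations recode the four input patterns so that XOR becomes linearly separable for the output unit, which is the sense in which backpropagation "generates useful internal representations".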
 
During the 2000s it fell out of favour, but returned in the 2010s, benefiting from cheap, powerful [[GPU]]-based computing systems. This has been especially true in language structure learning research, where connectionist models trained with this algorithm have been used to explain a variety of phenomena related to first<ref>{{Cite journal|last=Chang|first=Franklin|last2=Dell|first2=Gary S.|last3=Bock|first3=Kathryn|date=2006|title=Becoming syntactic.|journal=Psychological Review|volume=113|issue=2|pages=234–272|doi=10.1037/0033-295x.113.2.234|pmid=16637761}}</ref> and second language learning.<ref>{{Cite journal|last=Janciauskas|first=Marius|last2=Chang|first2=Franklin|date=2017-07-26|title=Input and Age-Dependent Variation in Second Language Learning: A Connectionist Account|journal=Cognitive Science|volume=42|pages=519–554|doi=10.1111/cogs.12519|pmid=28744901|pmc=6001481}}</ref> In particular, it has been suggested that error backpropagation can explain human brain [[Event-related potential|ERP]] components such as the [[N400 (neuroscience)|N400]] and [[P600 (neuroscience)|P600]].<ref>{{Cite journal|last=Fitz|first=Hartmut|last2=Chang|first2=Franklin|date=2019|title=Language ERPs reflect learning through prediction error propagation|url=https://linkinghub.elsevier.com/retrieve/pii/S0010028518300124|journal=Cognitive Psychology|language=en|volume=111|pages=15–52|doi=10.1016/j.cogpsych.2019.03.002}}</ref>
 
==See also==