Connectionism: Difference between revisions
Revision as of 23:15, 18 December 2019
===Biological realism===
Connectionist work in general does not need to be biologically realistic, and it has therefore been criticized for lacking neuroscientific plausibility.<ref>{{Cite web|url=http://www.encephalos.gr/48-1-01e.htm|title=Encephalos Journal|website=www.encephalos.gr|access-date=2018-02-20}}</ref><ref>{{Cite book|url=https://books.google.com/books?id=s-OCCwAAQBAJ&pg=PT18&lpg=PT18&dq=%22accurate%22&f=false#v=onepage&q=%22accurate%22&f=false|title=Neural Geographies: Feminism and the Microstructure of Cognition|last=Wilson|first=Elizabeth A.|date=2016-02-04|publisher=Routledge|isbn=9781317958765|language=en}}</ref><ref>{{Cite web|url=https://pdfs.semanticscholar.org/e953/59bc80e624a963a3d8c943e3b2898a397ef7.pdf|title=Organismically-inspired robotics: homeostatic adaptation and teleology beyond the closed sensorimotor loop|access-date=}}</ref><ref>{{Cite journal|last=Zorzi|first=Marco|last2=Testolin|first2=Alberto|last3=Stoianov|first3=Ivilin P.|date=2013-08-20|title=Modeling language and cognition with deep unsupervised learning: a tutorial overview|journal=Frontiers in Psychology|volume=4|doi=10.3389/fpsyg.2013.00515|issn=1664-1078|pmc=3747356|pmid=23970869}}</ref><ref>{{Cite web|url=http://scholarworks.sjsu.edu/cgi/viewcontent.cgi?article=1015&context=comparativephilosophy|title=ANALYTIC AND CONTINENTAL PHILOSOPHY|access-date=}}</ref><ref>{{Cite book|url=https://books.google.com/books?id=uV9TZzOITMwC&pg=PA17&lpg=PA17&dq=%22biological%20plausibility%22&f=false#v=onepage&q=%22biological%20plausibility%22&f=false|title=Neural Network Perspectives on Cognition and Adaptive Robotics|last=Browne|first=A.|date=1997-01-01|publisher=CRC Press|isbn=9780750304559|language=en}}</ref><ref>{{Cite book|url=https://books.google.com/books?id=7pPv0STSos8C&pg=PA63&lpg=PA63&dq=%22biological+realism%22#v=onepage&q=%22biological%20realism%22&f=false|title=Connectionism in Perspective|last=Pfeifer|first=R.|last2=Schreter|first2=Z.|last3=Fogelman-Soulié|first3=F.|last4=Steels|first4=L.|date=1989-08-23|publisher=Elsevier|isbn=9780444598769|language=en}}</ref> However, the structure of neural networks is derived from that of biological [[Neuron|neurons]], and this parallel in low-level structure is often argued to be an advantage of connectionism in modeling cognitive structures compared with other approaches.<ref name=":2" /> One area where connectionist models are thought to be biologically implausible is with respect to the error-propagation networks that are needed to support learning,<ref>{{Cite journal|last=Crick|first=Francis|date=1989-01|title=The recent excitement about neural networks|url=https://www.nature.com/articles/337129a0|journal=Nature|language=en|volume=337|issue=6203|pages=129–132|doi=10.1038/337129a0|issn=1476-4687}}</ref><ref name=":4">{{Cite journal|last=Rumelhart|first=David E.|last2=Hinton|first2=Geoffrey E.|last3=Williams|first3=Ronald J.|date=1986-10|title=Learning representations by back-propagating errors|url=https://www.nature.com/articles/323533a0|journal=Nature|language=en|volume=323|issue=6088|pages=533–536|doi=10.1038/323533a0|issn=1476-4687}}</ref> but error propagation can explain some of the biologically generated electrical activity seen at the scalp in [[Event-related potential|event-related potentials]] such as the [[N400 (neuroscience)|N400]] and [[P600 (neuroscience)|P600]],<ref>{{Cite journal|last=Fitz|first=Hartmut|last2=Chang|first2=Franklin|date=2019-06-01|title=Language ERPs reflect learning through prediction error propagation|url=http://www.sciencedirect.com/science/article/pii/S0010028518300124|journal=Cognitive Psychology|volume=111|pages=15–52|doi=10.1016/j.cogpsych.2019.03.002|issn=0010-0285}}</ref> and this provides some biological support for one of the key assumptions of connectionist learning procedures.
 
===Learning===
The weights in a neural network are adjusted according to some [[learning rule]] or algorithm, such as [[Hebbian theory|Hebbian learning]]. Learning always involves modifying the connection weights, and connectionists have created many sophisticated learning procedures for neural networks. In general, these procedures use mathematical formulas to determine the change in weights, given sets of data consisting of activation vectors for some subset of the neural units. Several studies have focused on designing teaching-learning methods based on connectionism.<ref>{{Cite journal|last=Novo|first=María-Luisa|last2=Alsina|first2=Ángel|last3=Marbán|first3=José-María|last4=Berciano|first4=Ainhoa|date=2017|title=Connective Intelligence for Childhood Mathematics Education|url=https://doi.org/10.3916/C52-2017-03|journal=Comunicar|language=es|volume=25|issue=52|pages=29–39|doi=10.3916/c52-2017-03|issn=1134-3478}}</ref>
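A weight update driven by activation vectors can be sketched as follows. This is a minimal illustration of a Hebbian rule, assuming a single linear unit and binary input patterns; the article names Hebbian learning but does not give a specific formulation, so the learning rate, sizes, and data here are hypothetical.

```python
import numpy as np

# Hebb's principle: the weight between two units is strengthened when
# they are active together: delta_w = eta * pre * post.

rng = np.random.default_rng(0)
n_inputs = 4
eta = 0.1                                   # learning rate (illustrative)
weights = rng.normal(0.0, 0.1, n_inputs)    # small random initial weights
w0 = weights.copy()

# "Sets of data consisting of activation vectors", per the text above.
patterns = rng.integers(0, 2, size=(20, n_inputs)).astype(float)

for pre in patterns:
    post = weights @ pre                    # postsynaptic activation (linear unit)
    weights += eta * pre * post             # co-activity strengthens the connection
```

Note that a pure Hebbian rule like this only ever grows the weight norm, which is why practical variants add normalization or decay terms.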
 
By formalizing learning in this way, connectionists gain access to a large set of mathematical tools. A very common strategy in connectionist learning methods is to incorporate [[gradient descent]] over an error surface in a space defined by the weight matrix. All gradient descent learning in connectionist models involves changing each weight in proportion to the negative of the [[partial derivative]] of the error surface with respect to that weight. [[Backpropagation]] (BP), first made popular in the 1980s, is probably the most commonly known connectionist gradient descent algorithm today.<ref name=":4" />
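The gradient descent scheme described above can be sketched for a tiny two-layer network. The XOR task, sigmoid units, layer sizes, and learning rate below are illustrative assumptions, not details from the article; the update for each weight follows the negative partial derivative of a squared-error surface, as the text describes.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)    # XOR targets (illustrative)

W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)  # weight matrices define
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)  # the space of the error surface
eta = 0.5                                            # learning rate (illustrative)

initial_mse = float(np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2))

for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error and move each weight opposite the
    # partial derivative of the squared error with respect to that weight.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= eta * (h.T @ d_out); b2 -= eta * d_out.sum(axis=0)
    W1 -= eta * (X.T @ d_h);   b1 -= eta * d_h.sum(axis=0)

final_mse = float(np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2))
```

The backward pass is just the chain rule applied layer by layer, which is why the error at the output can be "propagated" to weights deeper in the network.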
 
==History==