Connectionism
Revision as of 20:17, 17 June 2020
Basic principles
By formalizing learning in this way, connectionists gain access to a large toolkit of mathematical methods. A very common strategy in connectionist learning methods is to apply [[gradient descent]] over an error surface in a space defined by the weight matrix. All gradient descent learning in connectionist models involves changing each weight in proportion to the negative of the [[partial derivative]] of the error surface with respect to that weight. [[Backpropagation]] (BP), first popularized in the 1980s, is probably the most widely known connectionist gradient descent algorithm today.<ref name=":4" />
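The weight-update rule described above can be sketched for the simplest case, a single linear unit with a squared-error surface. This is an illustrative example, not taken from the article; the function names and learning rate are assumptions for the sketch.

```python
# Minimal sketch of gradient descent on one linear unit (illustrative).
# Error surface: E = 1/2 * (y - t)^2, where y = sum_i w_i * x_i.
# dE/dw_i = (y - t) * x_i, so each weight moves against that gradient.

def predict(weights, inputs):
    """Output of a linear unit: weighted sum of its inputs."""
    return sum(w * x for w, x in zip(weights, inputs))

def train_step(weights, inputs, target, lr=0.1):
    """One gradient descent step: w_i <- w_i - lr * dE/dw_i."""
    error = predict(weights, inputs) - target
    return [w - lr * error * x for w, x in zip(weights, inputs)]

weights = [0.0, 0.0]
for _ in range(100):
    weights = train_step(weights, [1.0, 2.0], target=1.0)
# After repeated steps the unit's output approaches the target.
```

Backpropagation generalizes this idea to multi-layer networks by applying the chain rule, so that hidden-layer weights also receive partial derivatives of the error.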
Connectionism can be traced to ideas more than a century old, which were little more than speculation until the mid-to-late 20th century.