Details

Connectionism

A Hands-on Approach
1st edition

by: Michael R. W. Dawson

98,99 €

Publisher: Wiley-Blackwell
Format: PDF
Published: 15.04.2008
ISBN/EAN: 9781405143899
Language: English
Number of pages: 208

DRM-protected eBook; to read it you need e.g. Adobe Digital Editions and an Adobe ID.

Description

<p><b><i>Connectionism</i> is a "hands on" introduction to connectionist modeling through practical exercises in different types of connectionist architectures.</b></p> <ul> <li>explores three different types of connectionist architectures – distributed associative memory, perceptron, and multilayer perceptron</li> <li>provides a brief overview of each architecture, a detailed introduction on how to use a program to explore this network, and a series of practical exercises that are designed to highlight the advantages, and disadvantages, of each</li> <li>accompanied by a website at <b>http://www.bcp.psych.ualberta.ca/~mike/Book3/</b> that includes practice exercises and software, as well as the files and blank exercise sheets required for performing the exercises</li> <li>designed to be used as a stand-alone volume or alongside <i>Minds and Machines: Connectionism and Psychological Modeling</i> (by Michael R.W. Dawson, Blackwell 2004)</li> </ul>
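<p>The core idea behind the first of these architectures, the distributed associative memory, can be sketched in a few lines of NumPy. The sketch below is illustrative only, with invented cue and response vectors; it is not the book's James program:</p>

```python
import numpy as np

# Two orthonormal cue ("basis") vectors, as in the Hebb-learning exercises.
c1 = np.array([1.0, 0.0, 0.0, 0.0])
c2 = np.array([0.0, 1.0, 0.0, 0.0])

# Arbitrary response vectors to be associated with each cue.
r1 = np.array([0.5, -0.5, 0.5, -0.5])
r2 = np.array([0.5, 0.5, -0.5, -0.5])

# Hebb learning: the weight matrix is the sum of the outer products
# of each response vector with its cue vector.
W = np.outer(r1, c1) + np.outer(r2, c2)

# Recall: presenting a cue retrieves its paired response exactly,
# because the cues are orthonormal (zero crosstalk between pairs).
print(np.allclose(W @ c1, r1))  # True
print(np.allclose(W @ c2, r2))  # True
```

<p>With correlated (non-orthonormal) cues, recall is contaminated by crosstalk between the stored pairs; that limitation of Hebb learning is what motivates the delta rule in the chapters that follow.</p>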
<b>1. Hands-on Connectionism</b>. <p>1.1 Connectionism In Principle And In Practice.</p> <p>1.2 The Organization Of This Book.</p> <p><b>2. The Distributed Associative Memory.</b></p> <p>2.1 The Paired Associates Task.</p> <p>2.2 The Standard Pattern Associator.</p> <p>2.3 Exploring The Distributed Associative Memory.</p> <p><b>3. The James Program.</b></p> <p>3.1 Introduction.</p> <p>3.2 Installing The Program.</p> <p>3.3 Teaching A Distributed Memory.</p> <p>3.4 Testing What The Memory Has Learned.</p> <p>3.5 Using The Program.</p> <p><b>4. Introducing Hebb Learning.</b></p> <p>4.1 Overview Of The Exercises.</p> <p>4.2 Hebb Learning Of Basis Vectors.</p> <p>4.3 Hebb Learning Of Orthonormal, Non-Basis Vectors.</p> <p><b>5. Limitations Of Hebb Learning.</b></p> <p>5.1 Introduction.</p> <p>5.2 The Effect Of Repetition.</p> <p>5.3 The Effect Of Correlation.</p> <p><b>6. Introducing The Delta Rule.</b></p> <p>6.1 Introduction.</p> <p>6.2 The Delta Rule.</p> <p>6.3 The Delta Rule And The Effect Of Repetition.</p> <p>6.4 The Delta Rule And The Effect Of Correlation.</p> <p><b>7. Distributed Networks And Human Memory.</b></p> <p>7.1 Background On The Paired Associate Paradigm.</p> <p>7.2 The Effect Of Similarity On The Distributed Associative Memory.</p> <p><b>8. Limitations Of Delta Rule Learning.</b></p> <p>8.1 Introduction.</p> <p>8.2 The Delta Rule And Linear Dependency.</p> <p><b>9. The Perceptron.</b></p> <p>9.1 Introduction.</p> <p>9.2 The Limits Of Distributed Associative Memories, And Beyond.</p> <p>9.3 Properties Of The Perceptron.</p> <p>9.4 What Comes Next.</p> <p><b>10. The Rosenblatt Program.</b></p> <p>10.1 Introduction.</p> <p>10.2 Installing The Program.</p> <p>10.3 Training A Perceptron.</p> <p>10.4 Testing What The Memory Has Learned.</p> <p><b>11. Perceptrons And Logic Gates.</b></p> <p>11.1 Introduction.</p> <p>11.2 Boolean Algebra.</p> <p>11.3 Perceptrons And Two-Valued Algebra.</p> <p><b>12. 
Performing More Logic With Perceptrons.</b></p> <p>12.1 Two-Valued Algebra And Pattern Spaces.</p> <p>12.2 Perceptrons And Linear Separability.</p> <p>12.3 Appendix Concerning The DawsonJots Font.</p> <p><b>13. Value Units And Linear Nonseparability.</b></p> <p>13.1 Linear Separability And Its Implications.</p> <p>13.2 Value Units And The Exclusive-Or Relation.</p> <p>13.3 Value Units And Connectedness.</p> <p><b>14. Network By Problem Type Interactions.</b></p> <p>14.1 All Networks Were Not Created Equally.</p> <p>14.2 Value Units And The Two-Valued Algebra.</p> <p><b>15. Perceptrons And Generalization.</b></p> <p>15.1 Background.</p> <p>15.2 Generalization And Savings For The 9-Majority Problem.</p> <p><b>16. Animal Learning Theory And Perceptrons.</b></p> <p>16.1 Discrimination Learning.</p> <p>16.2 Linearly Separable Versions Of Patterning.</p> <p><b>17. The Multilayer Perceptron.</b></p> <p>17.1 Creating Sequences Of Logical Operations.</p> <p>17.2 Multilayer Perceptrons And The Credit Assignment Problem.</p> <p>17.3 The Implications Of The Generalized Delta Rule.</p> <p><b>18. The Rumelhart Program.</b></p> <p>18.1 Introduction.</p> <p>18.2 Installing The Program.</p> <p>18.3 Training A Multilayer Perceptron.</p> <p>18.4 Testing What The Network Has Learned.</p> <p><b>19. Beyond The Perceptron’s Limits.</b></p> <p>19.1 Introduction.</p> <p>19.2 The Generalized Delta Rule And Exclusive-Or.</p> <p><b>20. Symmetry As A Second Case Study.</b></p> <p>20.1 Background.</p> <p>20.2 Solving Symmetry Problems With Multilayer Perceptrons.</p> <p><b>21. How Many Hidden Units?</b>.</p> <p>21.1 Background.</p> <p>21.2 How Many Hidden Value Units Are Required For 5-Bit Parity?.</p> <p><b>22. Scaling Up With The Parity Problem.</b></p> <p>22.1 Overview Of The Exercises.</p> <p>22.2 Background.</p> <p>22.3 Exploring The Parity Problem.</p> <p><b>23. Selectionism And Parity.</b></p> <p>23.1 Background.</p> <p>23.2 From Connectionism To Selectionism.</p> <p><b>24. 
Interpreting A Small Network.</b></p> <p>24.1 Background.</p> <p>24.2 A Small Network.</p> <p>24.3 Interpreting This Small Network.</p> <p><b>25. Interpreting Networks Of Value Units.</b></p> <p>25.1 Background.</p> <p>25.2 Banding In The First Monks Problem.</p> <p>25.3 Definite Features In The First Monks Problem.</p> <p><b>26. Interpreting Distributed Representations.</b></p> <p>26.1 Background.</p> <p>26.2 Interpreting A 5-Parity Network.</p> <p><b>27. Creating Your Own Training Sets.</b></p> <p>27.1 Background.</p> <p>27.2 Designing And Building A Training Set.</p> <p>References.</p>
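<p>The perceptron chapters above turn on training a single threshold unit to compute logic gates. A minimal sketch of Rosenblatt-style error-correction learning on the linearly separable AND gate is given below; the variable names are invented for illustration, and this is not the book's Rosenblatt program:</p>

```python
import numpy as np

# Training set for logical AND: two inputs plus a constant bias input,
# with targets 0/1.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
t = np.array([0, 0, 0, 1], dtype=float)

w = np.zeros(3)   # weights, including the bias weight
lr = 0.1          # learning rate

for _ in range(25):  # a handful of epochs suffices for AND
    for x, target in zip(X, t):
        out = 1.0 if x @ w > 0 else 0.0   # threshold activation
        w += lr * (target - out) * x      # error-correction update

preds = [(1.0 if x @ w > 0 else 0.0) for x in X]
print(preds)  # [0.0, 0.0, 0.0, 1.0]
```

<p>The same loop never converges on exclusive-or, the linearly nonseparable case that motivates value units and the multilayer perceptron in the later chapters.</p>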
“This is a first-rate textbook. Enabling readers to perform the simulations described, it provides a very user-friendly introduction to the essential material, which it sets in an engaging, historically informed context.” <i>Anne Jaap Jacobson, University of Houston</i>
<b>Michael R. W. Dawson</b> is a member of the Department of Psychology and the Biological Computation Project at the University of Alberta, Canada. He is the author of <i>Understanding Cognitive Science</i> (Blackwell, 1998) and <i>Minds and Machines</i> (Blackwell, 2004).
<b>CONNECTIONISM</b> is a “hands on” introduction to connectionist modeling. Three different types of connectionist architectures – distributed associative memory, perceptron, and multilayer perceptron – are explored. In an accessible style, Dawson provides a brief overview of each architecture, a detailed introduction on how to use a program to explore this network, and a series of practical exercises that are designed to highlight the advantages and disadvantages of each and to provide a “road map” to the field of cognitive modeling.<br /> <p><br /> </p> <p>This book is designed to be used as a stand-alone volume, or alongside <i>Minds and Machines: Connectionism and Psychological Modeling</i> (Blackwell Publishing, 2004). An accompanying website is available at www.bcp.psych.ualberta.ca/~mike/book3/index.html and includes practice exercises and software, as well as the files and blank exercise sheets that are required for performing the exercises.</p>

You may also be interested in these products:

Empirical Research in Teaching and Learning
by: Debra Mashek, Elizabeth Yost Hammer
PDF ebook
90,99 €
Prejudice
by: Rupert Brown
EPUB ebook
34,99 €
The Wiley-Blackwell Handbook of Childhood Social Development
by: Peter K. Smith, Craig H. Hart
EPUB ebook
136,99 €