Details

Cognitive Modeling of Human Memory and Learning

A Non-invasive Brain-Computer Interfacing Approach
IEEE Press, 1st edition

By: Lidia Ghosh, Amit Konar, Pratyusha Rakshit

117,99 €

Publisher: Wiley
Format: EPUB
Published: 02.09.2020
ISBN/EAN: 9781119705918
Language: English
Number of pages: 272

DRM-protected eBook; reading it requires e.g. Adobe Digital Editions and an Adobe ID.

Description

Proposes computational models of human memory and learning using a brain-computer interfacing (BCI) approach

Human memory modeling is important from two perspectives. First, precisely fitting a model to an individual's short-term or working memory may help in predicting the subject's future memory performance. Second, memory models provide biological insight into the encoding and recall mechanisms carried out by the neurons in the active brain lobes that participate in memorization. This book models human memory from a cognitive standpoint, using brain activations acquired from the cortex by electroencephalographic (EEG) and functional near-infrared spectroscopic (fNIRS) means.

Cognitive Modeling of Human Memory and Learning: A Non-invasive Brain-Computer Interfacing Approach begins with an overview of the early models of memory. The authors then propose a simple model of Working Memory (WM) built with fuzzy Hebbian learning. A second perspective on memory models concerns Short-Term Memory (STM) modeling in the context of two-dimensional object-shape reconstruction from visually examined, memorized instances. A third model assesses subjective motor learning skill in driving from erroneous motor actions. Further models introduce a novel strategy for designing a two-layered deep Long Short-Term Memory (LSTM) classifier network and address cognitive load assessment in motor learning tasks associated with driving. The book ends with concluding remarks based on the principles and experimental results of the preceding chapters.

- Examines the scope of computational models of memory and learning, with special emphasis on the classification of memory tasks by deep learning-based models
- Proposes two type-2 fuzzy reasoning algorithms: Interval Type-2 Fuzzy Reasoning (IT2FR) and General Type-2 Fuzzy Set (GT2FS) based reasoning
- Considers three classes of cognitive load in the motor learning tasks for driving learners

Cognitive Modeling of Human Memory and Learning: A Non-invasive Brain-Computer Interfacing Approach will appeal to researchers in cognitive neuroscience and human/brain-computer interfaces. It is also beneficial to graduate students of computer science and electrical/electronic engineering.
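The description above mentions fuzzy relational working-memory modeling, built around max–min composition and fuzzy Hebbian learning (developed in Chapters 1 and 2). The Python sketch below is only a generic illustration of those two operations under stated assumptions of our own; the function names, the weight-update rule, and the toy data are hypothetical and do not reproduce the book's algorithms.

```python
# Illustrative sketch only: a generic max-min fuzzy relational composition and a
# simple Hebbian-style update with a min t-norm. NOT the authors' algorithm;
# names, update rule, and toy data are assumptions made purely for illustration.
import numpy as np

def max_min_composition(R: np.ndarray, S: np.ndarray) -> np.ndarray:
    """Max-min composition: (R o S)[i, k] = max_j min(R[i, j], S[j, k])."""
    # Broadcast to shape (i, j, k), take element-wise min, then max over j.
    return np.max(np.minimum(R[:, :, None], S[None, :, :]), axis=1)

def fuzzy_hebbian_update(W: np.ndarray, x: np.ndarray, y: np.ndarray,
                         lr: float = 0.1) -> np.ndarray:
    """One Hebbian-style step: pull W[i, j] toward min(x[i], y[j])."""
    coactivation = np.minimum.outer(x, y)   # fuzzy conjunction of pre/post activity
    return np.clip(W + lr * (coactivation - W), 0.0, 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.random(4)            # hypothetical encoding-phase feature memberships
    y = rng.random(3)            # hypothetical recall-phase feature memberships
    W = np.zeros((4, 3))
    for _ in range(20):
        W = fuzzy_hebbian_update(W, x, y)
    # Relational "recall": compose the stimulus (as a 1xN relation) with W.
    y_hat = max_min_composition(x[None, :], W)[0]
    print(np.round(y_hat, 3))
```

In this toy setting, repeated fuzzy Hebbian updates drive W toward the min-conjunction of the encoding and recall activations, and the max–min composition then acts as a crude relational recall step.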
Table of Contents

Preface xi
Acknowledgments xvii
About the Authors xix

1 Introduction to Brain-Inspired Memory and Learning Models 1
1.1 Introduction 1
1.2 Philosophical Contributions to Memory Research 3
1.2.1 Atkinson and Shiffrin's Model 4
1.2.2 Tveter's Model 5
1.2.3 Tulving's Model 6
1.2.4 The Parallel and Distributed Processing (PDP) Approach 6
1.2.5 Procedural and Declarative Memory 8
1.3 Brain-Theoretic Interpretation of Memory Formation 10
1.3.1 Coding for Memory 10
1.3.2 Memory Consolidation 12
1.3.3 Location of Stored Memories 14
1.3.4 Isolation of Information in Memory 15
1.4 Cognitive Maps 16
1.5 Neural Plasticity 17
1.6 Modularity 18
1.7 The Cellular Process Behind STM Formation 18
1.8 LTM Formation 20
1.9 Brain Signal Analysis in the Context of Memory and Learning 20
1.9.1 Association of EEG α and θ Band with Memory Performances 21
1.9.2 Oscillatory β and γ Frequency Band Activation in STM Performance 24
1.9.3 Change in EEG Band Power with Changing Working Memory Load 24
1.9.4 Effects of Electromagnetic Field on the EEG Response of Working Memory 27
1.9.5 EEG Analysis to Discriminate Focused Attention and WM Performance 28
1.9.6 EEG Power Changes in Memory Repetition Effect 29
1.9.7 Correlation Between LTM Retrieval and EEG Features 32
1.9.8 Impact of Math Anxiety on WM Response: An EEG Study 34
1.10 Memory Modeling by Computational Intelligence Techniques 35
1.11 Scope of the Book 39
References 43

2 Working Memory Modeling Using Inverse Fuzzy Relational Approach 51
2.1 Introduction 52
2.2 Problem Formulation and Approach 54
2.2.1 Independent Component Analysis as a Source Localization Tool 55
2.2.2 Independent Component Analysis vs. Principal Component Analysis 58
2.2.3 Feature Extraction 58
2.2.4 Phase 1: WM Modeling 59
2.2.4.1 Step I: WM Modeling of Subject Using EEG Signals During Full Face Encoding and Recall from Specific Part of Same Face 60
2.2.4.2 Step II: WM Modeling of Subject Using EEG Signals During Full Face Encoding and Recall from All Parts of Same Face 62
2.2.5 Phase 2: WM Analysis 62
2.2.6 Finding Max–Min Compositional Inverse of Weight Matrix W^c_k 65
2.3 Experiments and Performance Analysis 70
2.3.1 Experimental Set-up 71
2.3.2 Source Localization Using eLORETA 73
2.3.3 Pre-processing 74
2.3.4 Selection of EEG Features 74
2.3.5 WM Model Consistency Across Partial Face Stimuli 77
2.3.6 Inter-person Variability of W 77
2.3.7 Variation in Imaging Attributes 77
2.3.8 Comparative Analysis with Existing Fuzzy Inverse Relations 84
2.4 Discussion 85
2.5 Conclusions 86
References 88

3 Short-Term Memory Modeling in Shape-Recognition Task by Type-2 Fuzzy Deep Brain Learning 93
3.1 Introduction 94
3.2 System Overview 96
3.3 Brain Functional Mapping Using Type-2 Fuzzy DBLN 101
3.3.1 Overview of Type-2 Fuzzy Sets 103
3.3.2 Type-2 Fuzzy Mapping and Parameter Adaptation by Perceptron-Like Learning 104
3.3.2.1 Construction of the Proposed Interval Type-2 Fuzzy Membership Function (IT2MF) 104
3.3.2.2 Construction of IT2FS-Induced Mapping Function 105
3.3.2.3 Secondary Membership Function Computation of Proposed GT2FS 107
3.3.2.4 Proposed General Type-2 Fuzzy Mapping 108
3.3.3 Perceptron-Like Learning for Weight Adaptation 110
3.3.4 Training of the Proposed Shape-Reconstruction Architecture 111
3.3.5 The Test Phase of the Memory Model 113
3.4 Experiments and Results 113
3.4.1 Experimental Set-up 113
3.4.2 Experiment 1: Validation of the STM Model with Respect to Error Metric ξ 116
3.4.3 Experiment 2: Similar Encoding by a Subject for Similar Input Object Shapes 116
3.4.4 Experiment 3: Study of Subjects' Learning Ability with Increasing Complexity in Object Shape 117
3.4.5 Experiment 4: Convergence Time of the Weight Matrix G for Increased Complexity of the Input Shape Stimuli 118
3.4.6 Experiment 5: Abnormality in G Matrix for the Subjects with Brain Impairment 119
3.5 Biological Implications 120
3.6 Performance Analysis 122
3.6.1 Performance Analysis of the Proposed T2FS Methods 123
3.6.2 Computational Performance Analysis of the Proposed T2FS Methods 123
3.6.3 Statistical Validation Using Wilcoxon Signed-Rank Test 124
3.6.4 Optimal Parameter Selection and Robustness Study 126
3.7 Conclusions 127
References 130

4 EEG Analysis for Subjective Assessment of Motor Learning Skill in Driving Using Type-2 Fuzzy Reasoning 137
4.1 Introduction 138
4.2 System Overview 140
4.2.1 Rule Design to Determine the Degree of Learning 141
4.2.2 Single Trial Detection of Brain Signals 144
4.2.2.1 Feature Extraction 144
4.2.2.2 Feature Selection 145
4.2.2.3 Classification 145
4.2.3 Type-2 Fuzzy Reasoning 146
4.2.4 Training and Testing of the Classifiers 146
4.3 Determining Type and Degree of Learning by Type-2 Fuzzy Reasoning 147
4.3.1 Preliminaries on IT2FS and GT2FS 147
4.3.2 Proposed Reasoning Method 1: CIT2FS-Based Reasoning 148
4.3.3 Computation of Percentage Normalized Degree of Learning 150
4.3.4 Optimal λ Selection in IT2FS Reasoning 151
4.3.5 Proposed Reasoning Method 2: Triangular Vertical Slice (TVS)-Based CGT2FS Reasoning 151
4.3.5.1 Closed General Type-2 Fuzzy Inference Generation 151
4.3.5.2 Time Complexity 154
4.3.6 Proposed Reasoning Method 3: CGT2FS Reasoning with Gaussian Secondary MF 154
4.3.6.1 Time Complexity 156
4.4 Experiments and Results 157
4.4.1 The Experimental Set-up 157
4.4.2 Stimulus Presentation 157
4.4.3 Experiment 1: Source Localization Using eLORETA 158
4.4.4 Experiment 2: Validation of the Rules 159
4.4.5 Experiment 3: Pre-processing and Artifact Removal Using ICA 159
4.4.6 Experiment 4: N400 Old/New Effect Observation over the Successive Trials 163
4.4.7 Experiment 5: Selection of the Discriminating EEG Features Using PCA 163
4.5 Performance Analysis and Statistical Validation 164
4.5.1 Performance Analysis of the LSVM Classifiers 164
4.5.2 Robustness Study 165
4.5.3 Performance Analysis of the Proposed T2FS Reasoning Methods 166
4.5.4 Computational Performance Analysis of the Proposed T2FS Reasoning Methods 166
4.5.5 Statistical Validation Using Wilcoxon Signed-Rank Test 168
4.6 Conclusions 169
References 169

5 EEG Analysis to Decode Human Memory Responses in Face Recognition Task Using Deep LSTM Network 175
5.1 Introduction 176
5.2 CSP Modeling 179
5.2.1 The Standard CSP Algorithm 179
5.2.2 The Proposed CSP Algorithm 180
5.3 Proposed LSTM Classifier with Attention Mechanism 183
5.3.1 Attention Mechanism in Each LSTM Unit 184
5.4 Experiments and Results 188
5.4.1 Experimental Set-up 188
5.4.2 Experiment 1: Activated Brain Region Selection Using eLORETA 188
5.4.3 Experiment 2: Detection of the ERP Signals Associated with the Familiar and Unfamiliar Face Discrimination 190
5.4.4 Experiment 3: Performance Analysis of the Proposed CSP Algorithm as a Feature Extraction Technique 191
5.4.5 Experiment 4: Performance Analysis of the Proposed LSTM-Based Classifier 192
5.4.6 Experiment 5: Classifier Performance Analysis with Varying EEG Time-Window Length 194
5.4.7 Statistical Validation of the Proposed LSTM Classifier Using McNemar's Test 195
5.5 Conclusions 196
References 197

6 Cognitive Load Assessment in Motor Learning Tasks by Near-Infrared Spectroscopy Using Type-2 Fuzzy Sets 203
6.1 Introduction 203
6.2 Principles and Methodologies 206
6.2.1 Normalization of the Raw Data 206
6.2.2 Pre-processing 207
6.2.3 Feature Extraction 208
6.2.4 Training Instance Generation for Offline Training 208
6.2.5 Feature Selection Using Evolutionary Algorithm 209
6.2.6 Classifier Training and Testing 210
6.3 Classifier Design 211
6.3.1 Preliminaries on IT2FS and GT2FS 211
6.3.2 IT2FS-Induced Classifier Design 212
6.3.3 GT2FS-Induced Classifier Design 216
6.4 Experiments and Results 219
6.4.1 Experimental Set-up 219
6.4.2 Participants 219
6.4.3 Stimulus Presentation for Online Classification 221
6.4.4 Experiment 1: Demonstration of Decreasing Cognitive Load with Increasing Learning Epochs for Similar Stimulus 221
6.4.5 Experiment 2: Automatic Extraction of Discriminating fNIRs Features 223
6.4.6 Experiment 3: Optimal Parameter Setting of Feature Selection and Classifier Units 223
6.5 Biological Implications 226
6.6 Performance Analysis 226
6.6.1 Performance Analysis of the Proposed IT2FS and GT2FS Classifier 226
6.6.2 Statistical Validation of the Classifier Using McNemar's Test 229
6.7 Conclusions 232
References 232

7 Conclusions and Future Directions of Research on BCI-Based Memory and Learning 239
7.1 Self-Review of the Works Undertaken in the Book 239
7.2 Limitations of EEG BCI-Based Memory Experiments 242
7.3 Further Scope of Future Research on Memory and Learning 242
References 245

Index 247
About the Authors

LIDIA GHOSH, PhD, is currently a post-doctoral research fellow in Brain Science and Memory Research, under a grant awarded by Liverpool Hope University to Jadavpur University, India.

AMIT KONAR, PhD, is currently a Professor in the Department of Electronics and Tele-Communication Engineering (ETCE), Jadavpur University. He is the author of 15 books, including the Wiley title Emotion Recognition: A Pattern Analysis Approach.

PRATYUSHA RAKSHIT, PhD, is an Assistant Professor in the ETCE Department, Jadavpur University, India, and is currently on lien to the Basque Centre for Applied Mathematics, Bilbao, Spain.

You might also be interested in these products:

Bandwidth Efficient Coding
By: John B. Anderson
PDF ebook
114,99 €

Bandwidth Efficient Coding
By: John B. Anderson
EPUB ebook
114,99 €