Proposes computational models of human memory and learning using a brain-computer interfacing (BCI) approach
Human memory modeling is important from two perspectives. First, precisely fitting a model to an individual's short-term or working memory may help predict that subject's future memory performance. Second, memory models provide biological insight into the encoding and recall mechanisms carried out by the neurons of the active brain lobes that participate in memorization. This book models human memory from a cognitive standpoint, utilizing brain activations acquired from the cortex by electroencephalographic (EEG) and functional near-infrared spectroscopic (fNIRS) means.
Cognitive Modeling of Human Memory and Learning: A Non-invasive Brain-Computer Interfacing Approach begins with an overview of early models of memory. The authors then propose a simple model of Working Memory (WM) built with fuzzy Hebbian learning. A second perspective on memory modeling concerns Short-Term Memory (STM) in the context of reconstructing two-dimensional object shapes from visually examined, memorized instances. A third model assesses a subject's motor learning skill in driving from his or her erroneous motor actions. Further models introduce a novel strategy for designing a two-layered deep Long Short-Term Memory (LSTM) classifier network and address cognitive load assessment in motor learning tasks associated with driving. The book ends with concluding remarks drawn from the principles and experimental results of the preceding chapters.
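The WM model mentioned above rests on fuzzy relations trained by Hebbian-style co-activation. The snippet below is a minimal illustrative sketch, not the book's exact algorithm: it assumes a max-min fuzzy Hebbian update of a relational weight matrix and recall by max-min composition, with all array names and sizes chosen purely for illustration.

```python
import numpy as np

def fuzzy_hebbian_update(W, pre, post):
    """One fuzzy Hebbian step: strengthen W[i, j] by the co-activation
    min(pre[i], post[j]) and accumulate with max (a fuzzy-relational
    analogue of the classical Hebbian product rule)."""
    coactivation = np.minimum.outer(pre, post)   # min t-norm of every feature pair
    return np.maximum(W, coactivation)           # max-accumulation keeps the strongest trace

# Toy usage: encode one stimulus/response pair of normalized EEG feature memberships.
W = np.zeros((4, 3))                             # hypothetical 4 encoding x 3 recall features
pre = np.array([0.9, 0.2, 0.7, 0.4])             # encoding-phase memberships (illustrative)
post = np.array([0.8, 0.5, 0.1])                 # recall-phase memberships (illustrative)
W = fuzzy_hebbian_update(W, pre, post)

# Recall via max-min composition: y[j] = max_i min(pre[i], W[i, j]).
recall = np.max(np.minimum(pre[:, None], W), axis=0)
print(W)
print(recall)
```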
- Examines the scope of computational models of memory and learning with special emphasis on classification of memory tasks by deep learning-based models
- Proposes two algorithms for type-2 fuzzy reasoning: Interval Type-2 Fuzzy Reasoning (IT2FR) and reasoning with General Type-2 Fuzzy Sets (GT2FS); see the sketch after this list
- Considers three classes of cognitive loads in the motor learning tasks for driving learners
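The type-2 fuzzy machinery highlighted above works with membership intervals rather than single membership values. As a rough illustration only (an assumed Gaussian parameterization, not the construction proposed in the book), an interval type-2 fuzzy set can be described by a lower and an upper membership function whose gap forms the footprint of uncertainty:

```python
import numpy as np

def gaussian_mf(x, mean, sigma):
    """Primary Gaussian membership value of x."""
    return np.exp(-0.5 * ((x - mean) / sigma) ** 2)

def it2_membership(x, mean, sigma_lower, sigma_upper):
    """Interval type-2 membership of x: a [lower, upper] pair obtained from
    two Gaussians sharing a mean but with uncertain spread."""
    lower = gaussian_mf(x, mean, sigma_lower)    # lower membership function (LMF)
    upper = gaussian_mf(x, mean, sigma_upper)    # upper membership function (UMF)
    return lower, upper

# Toy usage: membership interval of a normalized EEG/fNIRS feature value.
x = 0.35
lmf, umf = it2_membership(x, mean=0.5, sigma_lower=0.10, sigma_upper=0.20)
print(f"footprint of uncertainty at x={x}: [{lmf:.3f}, {umf:.3f}]")
```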
Cognitive Modeling of Human Memory and Learning: A Non-invasive Brain-Computer Interfacing Approach will appeal to researchers in cognitive neuroscience and human-computer/brain-computer interfaces. It will also benefit graduate students of computer science and electrical/electronic engineering.
Table of Contents
Preface xi
Acknowledgments xvii
About the Authors xix
1 Introduction to Brain-Inspired Memory and Learning Models 1
1.1 Introduction 1
1.2 Philosophical Contributions to Memory Research 3
1.2.1 Atkinson and Shiffrin’s Model 4
1.2.2 Tveter’s Model 5
1.2.3 Tulving’s Model 6
1.2.4 The Parallel and Distributed Processing (PDP) Approach 6
1.2.5 Procedural and Declarative Memory 8
1.3 Brain-Theoretic Interpretation of Memory Formation 10
1.3.1 Coding for Memory 10
1.3.2 Memory Consolidation 12
1.3.3 Location of Stored Memories 14
1.3.4 Isolation of Information in Memory 15
1.4 Cognitive Maps 16
1.5 Neural Plasticity 17
1.6 Modularity 18
1.7 The Cellular Process Behind STM Formation 18
1.8 LTM Formation 20
1.9 Brain Signal Analysis in the Context of Memory and Learning 20
1.9.1 Association of EEG α and θ Band with Memory Performances 21
1.9.2 Oscillatory β and γ Frequency Band Activation in STM Performance 24
1.9.3 Change in EEG Band Power with Changing Working Memory Load 24
1.9.4 Effects of Electromagnetic Field on the EEG Response of Working Memory 27
1.9.5 EEG Analysis to Discriminate Focused Attention and WM Performance 28
1.9.6 EEG Power Changes in Memory Repetition Effect 29
1.9.7 Correlation Between LTM Retrieval and EEG Features 32
1.9.8 Impact of Math Anxiety on WM Response: An EEG Study 34
1.10 Memory Modeling by Computational Intelligence Techniques 35
1.11 Scope of the Book 39
References 43
2 Working Memory Modeling Using Inverse Fuzzy Relational Approach 51
2.1 Introduction 52
2.2 Problem Formulation and Approach 54
2.2.1 Independent Component Analysis as a Source Localization Tool 55
2.2.2 Independent Component Analysis vs. Principal Component Analysis 58
2.2.3 Feature Extraction 58
2.2.4 Phase 1: WM Modeling 59
2.2.4.1 Step I: WM Modeling of Subject Using EEG Signals During Full Face Encoding and Recall from Specific Part of Same Face 60
2.2.4.2 Step II: WM Modeling of Subject Using EEG Signals During Full Face Encoding and Recall from All Parts of Same Face 62
2.2.5 Phase 2: WM Analysis 62
2.2.6 Finding Max-Min Compositional Inverse of Weight Matrix Wck 65
2.3 Experiments and Performance Analysis 70
2.3.1 Experimental Set-up 71
2.3.2 Source Localization Using eLORETA 73
2.3.3 Pre-processing 74
2.3.4 Selection of EEG Features 74
2.3.5 WM Model Consistency Across Partial Face Stimuli 77
2.3.6 Inter-person Variability of W 77
2.3.7 Variation in Imaging Attributes 77
2.3.8 Comparative Analysis with Existing Fuzzy Inverse Relations 84
2.4 Discussion 85
2.5 Conclusions 86
References 88
3 Short-Term Memory Modeling in Shape-Recognition Task by Type-2 Fuzzy Deep Brain Learning 93
3.1 Introduction 94
3.2 System Overview 96
3.3 Brain Functional Mapping Using Type-2 Fuzzy DBLN 101
3.3.1 Overview of Type-2 Fuzzy Sets 103
3.3.2 Type-2 Fuzzy Mapping and Parameter Adaptation by Perceptron-Like Learning 104
3.3.2.1 Construction of the Proposed Interval Type-2 Fuzzy Membership Function (IT2MF) 104
3.3.2.2 Construction of IT2FS-Induced Mapping Function 105
3.3.2.3 Secondary Membership Function Computation of Proposed GT2FS 107
3.3.2.4 Proposed General Type-2 Fuzzy Mapping 108
3.3.3 Perceptron-Like Learning for Weight Adaptation 110
3.3.4 Training of the Proposed Shape-Reconstruction Architecture 111
3.3.5 The Test Phase of the Memory Model 113
3.4 Experiments and Results 113
3.4.1 Experimental Set-up 113
3.4.2 Experiment 1: Validation of the STM Model with Respect to Error Metric 𝜉 116
3.4.3 Experiment 2: Similar Encoding by a Subject for Similar Input Object Shapes 116
3.4.4 Experiment 3: Study of Subjects’ Learning Ability with Increasing Complexity in Object Shape 117
3.4.5 Experiment 4: Convergence Time of the Weight Matrix G for Increased Complexity of the Input Shape Stimuli 118
3.4.6 Experiment 5: Abnormality in the G Matrix for Subjects with Brain Impairment 119
3.5 Biological Implications 120
3.6 Performance Analysis 122
3.6.1 Performance Analysis of the Proposed T2FS Methods 123
3.6.2 Computational Performance Analysis of the Proposed T2FS Methods 123
3.6.3 Statistical Validation Using Wilcoxon Signed-Rank Test 124
3.6.4 Optimal Parameter Selection and Robustness Study 126
3.7 Conclusions 127
References 130
4 EEG Analysis for Subjective Assessment of Motor Learning Skill in Driving Using Type-2 Fuzzy Reasoning 137
4.1 Introduction 138
4.2 System Overview 140
4.2.1 Rule Design to Determine the Degree of Learning 141
4.2.2 Single Trial Detection of Brain Signals 144
4.2.2.1 Feature Extraction 144
4.2.2.2 Feature Selection 145
4.2.2.3 Classification 145
4.2.3 Type-2 Fuzzy Reasoning 146
4.2.4 Training and Testing of the Classifiers 146
4.3 Determining Type and Degree of Learning by Type-2 Fuzzy Reasoning 147
4.3.1 Preliminaries on IT2FS and GT2FS 147
4.3.2 Proposed Reasoning Method 1: CIT2FS-Based Reasoning 148
4.3.3 Computation of Percentage Normalized Degree of Learning 150
4.3.4 Optimal 𝜆 Selection in IT2FS Reasoning 151
4.3.5 Proposed Reasoning Method 2: Triangular Vertical Slice (TVS)-Based CGT2FS Reasoning 151
4.3.5.1 Closed General Type-2 Fuzzy Inference Generation 151
4.3.5.2 Time Complexity 154
4.3.6 Proposed Reasoning Method 3: CGT2FS Reasoning with Gaussian Secondary MF 154
4.3.6.1 Time Complexity 156
4.4 Experiments and Results 157
4.4.1 The Experimental Set-up 157
4.4.2 Stimulus Presentation 157
4.4.3 Experiment 1: Source Localization Using eLORETA 158
4.4.4 Experiment 2: Validation of the Rules 159
4.4.5 Experiment 3: Pre-processing and Artifact Removal Using ICA 159
4.4.6 Experiment 4: N400 Old/New Effect Observation over the Successive Trials 163
4.4.7 Experiment 5: Selection of the Discriminating EEG Features Using PCA 163
4.5 Performance Analysis and Statistical Validation 164
4.5.1 Performance Analysis of the LSVM Classifiers 164
4.5.2 Robustness Study 165
4.5.3 Performance Analysis of the Proposed T2FS Reasoning Methods 166
4.5.4 Computational Performance Analysis of the Proposed T2FS Reasoning Methods 166
4.5.5 Statistical Validation Using Wilcoxon Signed-Rank Test 168
4.6 Conclusions 169
References 169
5 EEG Analysis to Decode Human Memory Responses in Face Recognition Task Using Deep LSTM Network 175
5.1 Introduction 176
5.2 CSP Modeling 179
5.2.1 The Standard CSP Algorithm 179
5.2.2 The Proposed CSP Algorithm 180
5.3 Proposed LSTM Classifier with Attention Mechanism 183
5.3.1 Attention Mechanism in Each LSTM Unit 184
5.4 Experiments and Results 188
5.4.1 Experimental Set-up 188
5.4.2 Experiment 1: Activated Brain Region Selection Using eLORETA 188
5.4.3 Experiment 2: Detection of the ERP Signals Associated with the Familiar and Unfamiliar Face Discrimination 190
5.4.4 Experiment 3: Performance Analysis of the Proposed CSP Algorithm as a Feature Extraction Technique 191
5.4.5 Experiment 4: Performance Analysis of the Proposed LSTM-Based Classifier 192
5.4.6 Experiment 5: Classifier Performance Analysis with Varying EEG Time-Window Length 194
5.4.7 Statistical Validation of the Proposed LSTM Classifier Using McNemar’s Test 195
5.5 Conclusions 196
References 197
6 Cognitive Load Assessment in Motor Learning Tasks by Near-Infrared Spectroscopy Using Type-2 Fuzzy Sets 203
6.1 Introduction 203
6.2 Principles and Methodologies 206
6.2.1 Normalization of the Raw Data 206
6.2.2 Pre-processing 207
6.2.3 Feature Extraction 208
6.2.4 Training Instance Generation for Offline Training 208
6.2.5 Feature Selection Using Evolutionary Algorithm 209
6.2.6 Classifier Training and Testing 210
6.3 Classifier Design 211
6.3.1 Preliminaries on IT2FS and GT2FS 211
6.3.2 IT2FS-Induced Classifier Design 212
6.3.3 GT2FS-Induced Classifier Design 216
6.4 Experiments and Results 219
6.4.1 Experimental Set-up 219
6.4.2 Participants 219
6.4.3 Stimulus Presentation for Online Classification 221
6.4.4 Experiment 1: Demonstration of Decreasing Cognitive Load with Increasing Learning Epochs for Similar Stimulus 221
6.4.5 Experiment 2: Automatic Extraction of Discriminating fNIRS Features 223
6.4.6 Experiment 3: Optimal Parameter Setting of Feature Selection and Classifier Units 223
6.5 Biological Implications 226
6.6 Performance Analysis 226
6.6.1 Performance Analysis of the Proposed IT2FS and GT2FS Classifier 226
6.6.2 Statistical Validation of the Classifier Using McNemar’s Test 229
6.7 Conclusions 232
References 232
7 Conclusions and Future Directions of Research on BCI-Based Memory and Learning 239
7.1 Self-Review of the Works Undertaken in the Book 239
7.2 Limitations of EEG BCI-Based Memory Experiments 242
7.3 Further Scope of Future Research on Memory and Learning 242
References 245
Index 247