Details
Brain–Computer Interfaces 1
Methods and Perspectives, 1st edition
Price: | 139,99 € |
Publisher: | Wiley |
Format: | EPUB |
Published: | 14.07.2016 |
ISBN/EAN: | 9781119144991 |
Language: | English |
Pages: | 330 |
DRM-protected eBook; to read it you will need e.g. Adobe Digital Editions and an Adobe ID.
Description
<p>Brain–computer interfaces (BCI) are devices which measure brain activity and translate it into messages or commands, thereby opening up many investigation and application possibilities. This book provides keys for understanding and designing these multi-disciplinary interfaces, which require many fields of expertise such as neuroscience, statistics, informatics and psychology.</p> <p>This first volume, <i>Methods and Perspectives</i>, presents all the basic knowledge underlying the working principles of BCI. It opens with the anatomical and physiological organization of the brain, followed by the brain activity involved in BCI, and continues with information extraction, which involves signal processing and machine learning methods. BCI usage is then described, from the angle of human learning and human–machine interfaces.</p> <p>The basic notions developed in this reference book are intended to be accessible to all readers interested in BCI, whatever their background. More advanced material is also offered, for readers who want to expand their knowledge in the disciplinary fields underlying BCI.</p> <p>This first volume will be followed by a second volume, entitled <i>Technology and Applications</i>.</p>
<p>Foreword xiii<br /><i>José DEL R. MILLAN</i></p> <p>Introduction xv<br /><i>Maureen CLERC, Laurent BOUGRAIN and Fabien LOTTE</i></p> <p><b>Part 1. Anatomy and Physiology 1</b></p> <p><b>Chapter 1. Anatomy of the Nervous System 3</b><br /><i>Matthieu KANDEL and Maude TOLLET</i></p> <p>1.1. General description of the nervous system 4</p> <p>1.2. The central nervous system 5</p> <p>1.2.1. The telencephalon 6</p> <p>1.2.2. The diencephalon 10</p> <p>1.2.3. The brain stem 12</p> <p>1.3. The cerebellum 14</p> <p>1.4. The spinal cord and its roots 15</p> <p>1.5. The peripheral nervous system 18</p> <p>1.5.1. Nerves 18</p> <p>1.5.2. General organization of the PNS 19</p> <p>1.5.3. The autonomic nervous system 20</p> <p>1.6. Some syndromes and pathologies targeted by Brain–Computer Interfaces 21</p> <p>1.6.1. Motor syndromes 21</p> <p>1.6.2. Some pathologies that may be treated with BCIs 22</p> <p>1.7. Conclusions 23</p> <p>1.8. Bibliography 24</p> <p><b>Chapter 2. Functional Neuroimaging 25</b><br /><i>Christian BÉNAR</i></p> <p>2.1. Functional MRI 26</p> <p>2.1.1. Basic principles of MRI 26</p> <p>2.1.2. Principles of fMRI 26</p> <p>2.1.3. Statistical data analysis: the linear model 27</p> <p>2.1.4. Independent component analysis 29</p> <p>2.1.5. Connectivity measures 30</p> <p>2.2. Electrophysiology: EEG and MEG 31</p> <p>2.2.1. Basic principles of signal generation 31</p> <p>2.2.2. Event-related potentials and fields 31</p> <p>2.2.3. Source localization 32</p> <p>2.2.4. Independent component analysis 34</p> <p>2.2.5. Time–frequency analysis 34</p> <p>2.2.6. Connectivity 35</p> <p>2.2.7. Statistical analysis 36</p> <p>2.3. Simultaneous EEG-fMRI 37</p> <p>2.3.1. Basic principles 37</p> <p>2.3.2. Applications and data analysis 37</p> <p>2.3.3. Connections between EEG and fMRI 38</p> <p>2.4. Discussion and outlook for the future 38</p> <p>2.5. Bibliography 40</p> <p><b>Chapter 3. Cerebral Electrogenesis 45</b><br /><i>Franck VIDAL</i></p> <p>3.1. 
Electrical neuronal activity detected in EEG 45</p> <p>3.1.1. Action and postsynaptic potentials 46</p> <p>3.1.2. Resting potential, electrochemical gradient and PSPs 47</p> <p>3.1.3. From PSPs to EEG 48</p> <p>3.2. Dipolar and quadrupole fields 51</p> <p>3.2.1. Field created by an ion current due to the opening of ion channels 51</p> <p>3.2.2. Factors determining the value of the potential created by an ion current 56</p> <p>3.3. The importance of geometry 57</p> <p>3.3.1. Spatial summation, closed fields and open fields 57</p> <p>3.3.2. Effect of synapse position on the polarity of EEG 60</p> <p>3.3.3. Effect of active areas’ position 61</p> <p>3.4. The influence of conductive media 62</p> <p>3.4.1. Influence of glial cells 62</p> <p>3.4.2. Influence of skull bones 63</p> <p>3.5. Conclusions 64</p> <p>3.6. Bibliography 64</p> <p><b>Chapter 4. Physiological Markers for Controlling Active and Reactive BCIs 67</b><br /><i>François CABESTAING and Philippe DERAMBURE</i></p> <p>4.1. Introduction 67</p> <p>4.2. Markers that enable active interface control 72</p> <p>4.2.1. Spatiotemporal variations in potential 72</p> <p>4.2.2. Spatiotemporal wave variations 74</p> <p>4.3. Markers that make it possible to control reactive interfaces 77</p> <p>4.3.1. Sensory evoked potentials 77</p> <p>4.3.2. Endogenous P300 potential 80</p> <p>4.4. Conclusions 81</p> <p>4.5. Bibliography 82</p> <p><b>Chapter 5. Neurophysiological Markers for Passive Brain–Computer Interfaces 85</b><br /><i>Raphaëlle N. ROY and Jérémy FREY</i></p> <p>5.1. Passive BCI and mental states 85</p> <p>5.1.1. Passive BCI: definition 85</p> <p>5.1.2. The notion of mental states 86</p> <p>5.1.3. General categories of neurophysiological markers 87</p> <p>5.2. Cognitive load 87</p> <p>5.2.1. Definition 87</p> <p>5.2.2. Behavioral markers 87</p> <p>5.2.3. EEG markers 87</p> <p>5.2.4. Application example: air traffic control 88</p> <p>5.3. Mental fatigue and vigilance 89</p> <p>5.3.1. Definition 89</p> <p>5.3.2. 
Behavioral markers 89</p> <p>5.3.3. EEG markers 89</p> <p>5.3.4. Application example: driving 90</p> <p>5.4. Attention 90</p> <p>5.4.1. Definition 90</p> <p>5.4.2. Behavioral markers 91</p> <p>5.4.3. EEG markers 91</p> <p>5.4.4. Application example: teaching 92</p> <p>5.5. Error detection 92</p> <p>5.5.1. Definition 92</p> <p>5.5.2. Behavioral markers 92</p> <p>5.5.3. EEG markers 93</p> <p>5.5.4. Application example: tactile and robotic interfaces 93</p> <p>5.6. Emotions 94</p> <p>5.6.1. Definition 94</p> <p>5.6.2. Behavioral markers 94</p> <p>5.6.3. EEG markers 94</p> <p>5.6.4. Application example: communication and personal development 95</p> <p>5.7. Conclusions 96</p> <p>5.8. Bibliography 96</p> <p><b>Part 2. Signal Processing and Machine Learning 101</b></p> <p><b>Chapter 6. Electroencephalography Data Preprocessing 103</b><br /><i>Maureen CLERC</i></p> <p>6.1. Introduction 103</p> <p>6.2. Principles of EEG acquisition 104</p> <p>6.2.1. Montage 104</p> <p>6.2.2. Sampling and quantification 105</p> <p>6.3. Temporal representation and segmentation 105</p> <p>6.3.1. Segmentation 106</p> <p>6.3.2. Time domain preprocessing 106</p> <p>6.4. Frequency representation 107</p> <p>6.4.1. Fourier transform 107</p> <p>6.4.2. Frequency filtering 108</p> <p>6.5. Time–frequency representations 109</p> <p>6.5.1. Time–frequency atom 109</p> <p>6.5.2. Short-time Fourier transform 111</p> <p>6.5.3. Wavelet transform 112</p> <p>6.5.4. Time–frequency transforms of discrete signals 114</p> <p>6.5.5. Toward other redundant representations 114</p> <p>6.6. Spatial representations 115</p> <p>6.6.1. Topographic representations 115</p> <p>6.6.2. Spatial filtering 116</p> <p>6.6.3. Source reconstruction 118</p> <p>6.6.4. Using spatial representations in BCI 120</p> <p>6.7. Statistical representations 121</p> <p>6.7.1. Principal component analysis 121</p> <p>6.7.2. Independent component analysis 122</p> <p>6.7.3. Using statistical representations in BCI 122</p> <p>6.8. 
Conclusions 123</p> <p>6.9. Bibliography 124</p> <p><b>Chapter 7. EEG Feature Extraction 127</b><br /><i>Fabien LOTTE and Marco CONGEDO</i></p> <p>7.1. Introduction 127</p> <p>7.2. Feature extraction 127</p> <p>7.3. Feature extraction for BCIs employing oscillatory activity 130</p> <p>7.3.1. Basic design for BCI using oscillatory activity 130</p> <p>7.3.2. Toward more advanced, multiple electrode BCIs 131</p> <p>7.3.3. The CSP algorithm 133</p> <p>7.3.4. Illustration on real data 135</p> <p>7.4. Feature extraction for the BCIs employing EPs 137</p> <p>7.4.1. Spatial filtering for BCIs employing EPs 138</p> <p>7.5. Alternative methods and the Riemannian geometry approach 139</p> <p>7.6. Conclusions 141</p> <p>7.7. Bibliography 142</p> <p><b>Chapter 8. Analysis of Extracellular Recordings 145</b><br /><i>Christophe POUZAT</i></p> <p>8.1. Introduction 145</p> <p>8.1.1. Why is recording neuronal populations desirable? 146</p> <p>8.1.2. How can neuronal populations be recorded? 146</p> <p>8.1.3. The properties of extracellular data and the necessity of spike sorting 147</p> <p>8.2. The origin of the signal and its consequences 148</p> <p>8.2.1. Relationship between current and potential in a homogeneous medium 148</p> <p>8.2.2. Relationship between the derivatives of the membrane potential and the transmembrane current 150</p> <p>8.2.3. “From electrodes to tetrodes” 154</p> <p>8.3. Spike sorting: a chronological presentation 155</p> <p>8.3.1. Naked eye sorting 155</p> <p>8.3.2. Window discriminator (1963) 155</p> <p>8.3.3. Template matching (1964) 156</p> <p>8.3.4. Dimension reduction and clustering (1965) 157</p> <p>8.3.5. Principal component analysis (1968) 158</p> <p>8.3.6. Resolving superposition (1972) 160</p> <p>8.3.7. Dynamic amplitude profiles of action potentials (1973) 161</p> <p>8.3.8. Optimal filters (1975) 162</p> <p>8.3.9. Stereotrodes and amplitude ratios (1983) 165</p> <p>8.3.10. Sampling jitter (1984) 168</p> <p>8.3.11. 
Graphical tools 170</p> <p>8.3.12. Automatic clustering 171</p> <p>8.4. Recommendations 179</p> <p>8.5. Bibliography 181</p> <p><b>Chapter 9. Statistical Learning for BCIs 185</b><br /><i>Rémi FLAMARY, Alain RAKOTOMAMONJY and Michèle SEBAG</i></p> <p>9.1. Supervised statistical learning 185</p> <p>9.1.1. Training data and the predictor function 186</p> <p>9.1.2. Empirical risk and regularization 187</p> <p>9.1.3. Classical methods of classification 190</p> <p>9.2. Specific training methods 192</p> <p>9.2.1. Selection of variables and sensors 192</p> <p>9.2.2. Multisubject learning, information transfer 194</p> <p>9.3. Performance metrics 194</p> <p>9.3.1. Classification performance metrics 195</p> <p>9.3.2. Regression performance metrics 196</p> <p>9.4. Validation and model selection 197</p> <p>9.4.1. Estimation of the performance metric 197</p> <p>9.4.2. Optimization of hyperparameters 200</p> <p>9.5. Conclusions 202</p> <p>9.6. Bibliography 202</p> <p><b>Part 3. Human Learning and Human–Machine Interaction 207</b></p> <p><b>Chapter 10. Adaptive Methods in Machine Learning 209</b><br /><i>Maureen CLERC, Emmanuel DAUCÉ and Jérémie MATTOUT</i></p> <p>10.1. The primary sources of variability 209</p> <p>10.1.1. Intrasubject variability 210</p> <p>10.1.2. Intersubject variability 211</p> <p>10.2. Adaptation framework for BCIs 213</p> <p>10.3. Adaptive statistical decoding 214</p> <p>10.3.1. Covariate shift 214</p> <p>10.3.2. Classifier adaptation 216</p> <p>10.3.3. Subject-adapted calibration 218</p> <p>10.3.4. Optimal tasks 219</p> <p>10.3.5. Correspondence between task and command 221</p> <p>10.4. Generative model and adaptation 221</p> <p>10.4.1. Bayesian approach 221</p> <p>10.4.2. Sequential decision 224</p> <p>10.4.3. Online optimization of stimulations 226</p> <p>10.5. Conclusions 229</p> <p>10.6. Bibliography 229</p> <p><b>Chapter 11. Human Learning for Brain–Computer Interfaces 233</b><br /><i>Camille JEUNET, Fabien LOTTE and Bernard N’KAOUA</i></p> <p>11.1. 
Introduction 233</p> <p>11.2. Illustration: two historical BCI protocols 235</p> <p>11.3. Limitations of standard protocols used for BCIs 237</p> <p>11.4. State-of-the-art in BCI learning protocols 238</p> <p>11.4.1. Instructions 238</p> <p>11.4.2. Training tasks 239</p> <p>11.4.3. Feedback 239</p> <p>11.4.4. Learning environment 242</p> <p>11.4.5. In summary: guidelines for designing more effective training protocols 243</p> <p>11.5. Perspectives: toward user-adapted and user-adaptable learning protocols 244</p> <p>11.6. Conclusions 247</p> <p>11.7. Bibliography 247</p> <p><b>Chapter 12. Brain–Computer Interfaces for Human–Computer Interaction 251</b><br /><i>Andéol EVAIN, Nicolas ROUSSEL, Géry CASIEZ, Fernando ARGELAGUET-SANZ and Anatole LÉCUYER</i></p> <p>12.1. A brief introduction to human–computer interaction 251</p> <p>12.1.1. Interactive systems, interface and interaction 252</p> <p>12.1.2. Elementary tasks and interaction techniques 252</p> <p>12.1.3. Theory of action feedback 253</p> <p>12.1.4. Usability 254</p> <p>12.2. Properties of BCIs from the perspective of HCI 255</p> <p>12.3. Which pattern for which task? 257</p> <p>12.4. Paradigms of interaction for BCIs 259</p> <p>12.4.1. BCI interaction loop 259</p> <p>12.4.2. Main paradigms of interaction for BCIs 260</p> <p>12.5. Conclusions 265</p> <p>12.6. Bibliography 266</p> <p><b>Chapter 13. Brain Training with Neurofeedback 271</b><br /><i>Lorraine PERRONNET, Anatole LÉCUYER, Fabien LOTTE, Maureen CLERC and Christian BARILLOT</i></p> <p>13.1. Introduction 271</p> <p>13.2. How does it work? 274</p> <p>13.2.1. Design of an NF training program 274</p> <p>13.2.2. Course of an NF session: where the eyes “look” at the brain 275</p> <p>13.2.3. A learning procedure that we still do not fully understand 276</p> <p>13.3. Fifty years of history 278</p> <p>13.3.1. A premature infatuation 278</p> <p>13.3.2. Diversification of approaches 279</p> <p>13.4. Where NF meets BCI 281</p> <p>13.5. 
Applications 283</p> <p>13.6. Conclusions 287</p> <p>13.7. Bibliography 288</p> <p>List of Authors 293</p> <p>Index 295</p> <p>Contents of Volume 2 299</p>
<p><strong>Maureen Clerc</strong> is Senior Researcher at Inria Sophia Antipolis, France.</p> <p><strong>Laurent Bougrain</strong> is Assistant Professor at the University of Lorraine, France.</p> <p><strong>Fabien Lotte</strong> is Junior Researcher at Inria Bordeaux, France.</p>