Details

Deep Learning Approaches for Security Threats in IoT Environments


1st edition

by: Mohamed Abdel-Basset, Nour Moustafa, Hossam Hawash

107,99 €

Publisher: Wiley
Format: PDF
Published: 18.11.2022
ISBN/EAN: 9781119884156
Language: English
Number of pages: 384

DRM-protected eBook; to read it you will need, for example, Adobe Digital Editions and an Adobe ID.

Description

<b>Deep Learning Approaches for Security Threats in IoT Environments</b> <p><b>An expert discussion of the application of deep learning methods in the IoT security environment</b> <p>In <i>Deep Learning Approaches for Security Threats in IoT Environments</i>, a team of distinguished cybersecurity educators delivers an insightful and robust exploration of how to approach and measure the security of Internet-of-Things (IoT) systems and networks. In this book, readers will examine critical concepts in artificial intelligence (AI) and IoT, and apply effective strategies to help secure and protect IoT networks. The authors discuss supervised, semi-supervised, and unsupervised deep learning techniques, as well as reinforcement and federated learning methods for privacy preservation. <p>The book applies deep learning approaches to IoT networks to solve the security problems that professionals frequently encounter in the field, and shows how smart devices can themselves help address cybersecurity issues. <p>Readers will also get access to a companion website with PowerPoint presentations, links to supporting videos, and additional resources. 
They’ll also find: <ul><li> A thorough introduction to artificial intelligence and the Internet of Things, including key concepts like deep learning, security, and privacy</li> <li> Comprehensive discussions of the architectures, protocols, and standards that form the foundation of deep learning for securing modern IoT systems and networks</li> <li> In-depth examinations of the architectural design of cloud, fog, and edge computing networks</li> <li> Detailed presentations of the security requirements, threats, and countermeasures relevant to IoT networks</li></ul> <p>Perfect for professionals working in the AI, cybersecurity, and IoT industries, <i>Deep Learning Approaches for Security Threats in IoT Environments</i> will also earn a place in the libraries of undergraduate and graduate students studying deep learning, cybersecurity, privacy preservation, and the security of IoT networks.
<p>About the Authors xv</p> <p><b>1 Introducing Deep Learning for IoT Security </b><b>1</b></p> <p>1.1 Introduction 1</p> <p>1.2 Internet of Things (IoT) Architecture 1</p> <p>1.2.1 Physical Layer 3</p> <p>1.2.2 Network Layer 4</p> <p>1.2.3 Application Layer 5</p> <p>1.3 Internet of Things’ Vulnerabilities and Attacks 6</p> <p>1.3.1 Passive Attacks 6</p> <p>1.3.2 Active Attacks 7</p> <p>1.4 Artificial Intelligence 11</p> <p>1.5 Deep Learning 14</p> <p>1.6 Taxonomy of Deep Learning Models 15</p> <p>1.6.1 Supervision Criterion 15</p> <p>1.6.1.1 Supervised Deep Learning 15</p> <p>1.6.1.2 Unsupervised Deep Learning 17</p> <p>1.6.1.3 Semi-Supervised Deep Learning 18</p> <p>1.6.1.4 Deep Reinforcement Learning 19</p> <p>1.6.2 Incrementality Criterion 19</p> <p>1.6.2.1 Batch Learning 20</p> <p>1.6.2.2 Online Learning 21</p> <p>1.6.3 Generalization Criterion 21</p> <p>1.6.3.1 Model-Based Learning 22</p> <p>1.6.3.2 Instance-Based Learning 22</p> <p>1.6.4 Centralization Criterion 22</p> <p>1.7 Supplementary Materials 25</p> <p>References 25</p> <p><b>2 Deep Neural Networks </b><b>27</b></p> <p>2.1 Introduction 27</p> <p>2.2 From Biological Neurons to Artificial Neurons 28</p> <p>2.2.1 Biological Neurons 28</p> <p>2.2.2 Artificial Neurons 30</p> <p>2.3 Artificial Neural Network 31</p> <p>2.3.1 Input Layer 34</p> <p>2.3.2 Hidden Layer 34</p> <p>2.3.3 Output Layer 34</p> <p>2.4 Activation Functions 35</p> <p>2.4.1 Types of Activation 35</p> <p>2.4.1.1 Binary Step Function 35</p> <p>2.4.1.2 Linear Activation Function 36</p> <p>2.4.1.3 Nonlinear Activation Functions 36</p> <p>2.5 The Learning Process of ANN 40</p> <p>2.5.1 Forward Propagation 41</p> <p>2.5.2 Backpropagation (Gradient Descent) 42</p> <p>2.6 Loss Functions 49</p> <p>2.6.1 Regression Loss Functions 49</p> <p>2.6.1.1 Mean Absolute Error (MAE) Loss 50</p> <p>2.6.1.2 Mean Squared Error (MSE) Loss 50</p> <p>2.6.1.3 Huber Loss 50</p> <p>2.6.1.4 Mean Bias Error (MBE) Loss 51</p> <p>2.6.1.5 Mean Squared Logarithmic Error 
(MSLE) 51</p> <p>2.6.2 Classification Loss Functions 52</p> <p>2.6.2.1 Binary Cross Entropy (BCE) Loss 52</p> <p>2.6.2.2 Categorical Cross Entropy (CCE) Loss 52</p> <p>2.6.2.3 Hinge Loss 53</p> <p>2.6.2.4 Kullback–Leibler Divergence (KL) Loss 53</p> <p>2.7 Supplementary Materials 53</p> <p>References 54</p> <p><b>3 Training Deep Neural Networks </b><b>55</b></p> <p>3.1 Introduction 55</p> <p>3.2 Gradient Descent Revisited 56</p> <p>3.2.1 Gradient Descent 56</p> <p>3.2.2 Stochastic Gradient Descent 57</p> <p>3.2.3 Mini-batch Gradient Descent 59</p> <p>3.3 Gradient Vanishing and Explosion 60</p> <p>3.4 Gradient Clipping 61</p> <p>3.5 Parameter Initialization 62</p> <p>3.5.1 Zero Initialization 62</p> <p>3.5.2 Random Initialization 63</p> <p>3.5.3 Lecun Initialization 65</p> <p>3.5.4 Xavier Initialization 65</p> <p>3.5.5 Kaiming (He) Initialization 66</p> <p>3.6 Faster Optimizers 67</p> <p>3.6.1 Momentum Optimization 67</p> <p>3.6.2 Nesterov Accelerated Gradient 69</p> <p>3.6.3 AdaGrad 69</p> <p>3.6.4 RMSProp 70</p> <p>3.6.5 Adam Optimizer 70</p> <p>3.7 Model Training Issues 71</p> <p>3.7.1 Bias 72</p> <p>3.7.2 Variance 72</p> <p>3.7.3 Overfitting Issues 72</p> <p>3.7.4 Underfitting Issues 73</p> <p>3.7.5 Model Capacity 74</p> <p>3.8 Supplementary Materials 74</p> <p>References 75</p> <p><b>4 Evaluating Deep Neural Networks </b><b>77</b></p> <p>4.1 Introduction 77</p> <p>4.2 Validation Dataset 78</p> <p>4.3 Regularization Methods 79</p> <p>4.3.1 Early Stopping 79</p> <p>4.3.2 <i>L</i>1 and <i>L</i>2 Regularization 80</p> <p>4.3.3 Dropout 81</p> <p>4.3.4 Max-Norm Regularization 82</p> <p>4.3.5 Data Augmentation 82</p> <p>4.4 Cross-Validation 83</p> <p>4.4.1 Hold-Out Cross-Validation 84</p> <p>4.4.2 <i>k</i>-Folds Cross-Validation 85</p> <p>4.4.3 Stratified <i>k</i>-Folds’ Cross-Validation 86</p> <p>4.4.4 Repeated <i>k</i>-Folds’ Cross-Validation 87</p> <p>4.4.5 Leave-One-Out Cross-Validation 88</p> <p>4.4.6 Leave-<i>p</i>-Out Cross-Validation 89</p> <p>4.4.7 Time 
Series Cross-Validation 90</p> <p>4.4.8 Rolling Cross-Validation 90</p> <p>4.4.9 Block Cross-Validation 90</p> <p>4.5 Performance Metrics 92</p> <p>4.5.1 Regression Metrics 92</p> <p>4.5.1.1 Mean Absolute Error (MAE) 92</p> <p>4.5.1.2 Root Mean Squared Error (RMSE) 93</p> <p>4.5.1.3 Coefficient of Determination (<i>R</i><sup>2</sup>) 93</p> <p>4.5.1.4 Adjusted <i>R</i><sup>2</sup> 94</p> <p>4.5.2 Classification Metrics 94</p> <p>4.5.2.1 Confusion Matrix 94</p> <p>4.5.2.2 Accuracy 96</p> <p>4.5.2.3 Precision 96</p> <p>4.5.2.4 Recall 97</p> <p>4.5.2.5 Precision–Recall Curve 97</p> <p>4.5.2.6 <i>F</i>1-Score 97</p> <p>4.5.2.7 Beta <i>F</i>1 Score 98</p> <p>4.5.2.8 False Positive Rate (FPR) 98</p> <p>4.5.2.9 Specificity 99</p> <p>4.5.2.10 Receiver Operating Characteristic (ROC) Curve 99</p> <p>4.6 Supplementary Materials 99</p> <p>References 100</p> <p><b>5 Convolutional Neural Networks </b><b>103</b></p> <p>5.1 Introduction 103</p> <p>5.2 Shift from Fully Connected to Convolutional 104</p> <p>5.3 Basic Architecture 106</p> <p>5.3.1 The Cross-Correlation Operation 106</p> <p>5.3.2 Convolution Operation 107</p> <p>5.3.3 Receptive Field 108</p> <p>5.3.4 Padding and Stride 109</p> <p>5.3.4.1 Padding 109</p> <p>5.3.4.2 Stride 111</p> <p>5.4 Multiple Channels 113</p> <p>5.4.1 Multi-Channel Inputs 113</p> <p>5.4.2 Multi-Channel Output 114</p> <p>5.4.3 Convolutional Kernel 1 × 1 115</p> <p>5.5 Pooling Layers 116</p> <p>5.5.1 Max Pooling 117</p> <p>5.5.2 Average Pooling 117</p> <p>5.6 Normalization Layers 119</p> <p>5.6.1 Batch Normalization 119</p> <p>5.6.2 Layer Normalization 122</p> <p>5.6.3 Instance Normalization 124</p> <p>5.6.4 Group Normalization 126</p> <p>5.6.5 Weight Normalization 126</p> <p>5.7 Convolutional Neural Networks (LeNet) 127</p> <p>5.8 Case Studies 129</p> <p>5.8.1 Handwritten Digit Classification (One Channel Input) 129</p> <p>5.8.2 Dog vs. 
Cat Image Classification (Multi-Channel Input) 130</p> <p>5.9 Supplementary Materials 130</p> <p>References 130</p> <p><b>6 Dive Into Convolutional Neural Networks </b><b>133</b></p> <p>6.1 Introduction 133</p> <p>6.2 One-Dimensional Convolutional Network 134</p> <p>6.2.1 One-Dimensional Convolution 134</p> <p>6.2.2 One-Dimensional Pooling 135</p> <p>6.3 Three-Dimensional Convolutional Network 136</p> <p>6.3.1 Three-Dimensional Convolution 136</p> <p>6.3.2 Three-Dimensional Pooling 136</p> <p>6.4 Transposed Convolution Layer 137</p> <p>6.5 Atrous/Dilated Convolution 144</p> <p>6.6 Separable Convolutions 145</p> <p>6.6.1 Spatially Separable Convolutions 146</p> <p>6.6.2 Depth-wise Separable (DS) Convolutions 148</p> <p>6.7 Grouped Convolution 150</p> <p>6.8 Shuffled Grouped Convolution 152</p> <p>6.9 Supplementary Materials 154</p> <p>References 154</p> <p><b>7 Advanced Convolutional Neural Network </b><b>157</b></p> <p>7.1 Introduction 157</p> <p>7.2 AlexNet 158</p> <p>7.3 Block-wise Convolutional Network (VGG) 159</p> <p>7.4 Network in Network 160</p> <p>7.5 Inception Networks 162</p> <p>7.5.1 GoogLeNet 163</p> <p>7.5.2 Inception Network v2 (Inception v2) 166</p> <p>7.5.3 Inception Network v3 (Inception v3) 170</p> <p>7.6 Residual Convolutional Networks 170</p> <p>7.7 Dense Convolutional Networks 173</p> <p>7.8 Temporal Convolutional Network 176</p> <p>7.8.1 One-Dimensional Convolutional Network 177</p> <p>7.8.2 Causal and Dilated Convolution 180</p> <p>7.8.3 Residual Blocks 185</p> <p>7.9 Supplementary Materials 188</p> <p>References 188</p> <p><b>8 Introducing Recurrent Neural Networks </b><b>189</b></p> <p>8.1 Introduction 189</p> <p>8.2 Recurrent Neural Networks 190</p> <p>8.2.1 Recurrent Neurons 190</p> <p>8.2.2 Memory Cell 192</p> <p>8.2.3 Recurrent Neural Network 193</p> <p>8.3 Different Categories of RNNs 194</p> <p>8.3.1 One-to-One RNN 195</p> <p>8.3.2 One-to-Many RNN 195</p> <p>8.3.3 Many-to-One RNN 196</p> <p>8.3.4 Many-to-Many RNN 197</p> <p>8.4 
Backpropagation Through Time 198</p> <p>8.5 Challenges Facing Simple RNNs 202</p> <p>8.5.1 Vanishing Gradient 202</p> <p>8.5.2 Exploding Gradient 204</p> <p>8.5.2.1 Truncated Backpropagation Through Time (TBPTT) 204</p> <p>8.5.2.2 Penalty on the Recurrent Weights <i>W<sub>hh</sub></i> 205</p> <p>8.5.2.3 Clipping Gradients 205</p> <p>8.6 Case Study: Malware Detection 205</p> <p>8.7 Supplementary Material 206</p> <p>References 207</p> <p><b>9 Dive Into Recurrent Neural Networks </b><b>209</b></p> <p>9.1 Introduction 209</p> <p>9.2 Long Short-Term Memory (LSTM) 210</p> <p>9.2.1 LSTM Gates 211</p> <p>9.2.2 Candidate Memory Cells 213</p> <p>9.2.3 Memory Cell 214</p> <p>9.2.4 Hidden State 216</p> <p>9.3 LSTM with Peephole Connections 217</p> <p>9.4 Gated Recurrent Units (GRU) 218</p> <p>9.4.1 GRU Cell Gates 218</p> <p>9.4.2 Candidate State 220</p> <p>9.4.3 Hidden State 221</p> <p>9.5 ConvLSTM 222</p> <p>9.6 Unidirectional vs. Bidirectional Recurrent Network 223</p> <p>9.7 Deep Recurrent Network 226</p> <p>9.8 Insights 227</p> <p>9.9 Case Study of Malware Detection 228</p> <p>9.10 Supplementary Materials 229</p> <p>References 229</p> <p><b>10 Attention Neural Networks </b><b>231</b></p> <p>10.1 Introduction 231</p> <p>10.2 From Biological to Computerized Attention 232</p> <p>10.2.1 Biological Attention 232</p> <p>10.2.2 Queries, Keys, and Values 234</p> <p>10.3 Attention Pooling: Nadaraya–Watson Kernel Regression 235</p> <p>10.4 Attention-Scoring Functions 237</p> <p>10.4.1 Masked Softmax Operation 239</p> <p>10.4.2 Additive Attention (AA) 239</p> <p>10.4.3 Scaled Dot-Product Attention 240</p> <p>10.5 Multi-Head Attention (MHA) 240</p> <p>10.6 Self-Attention Mechanism 242</p> <p>10.6.1 Self-Attention (SA) Mechanism 242</p> <p>10.6.2 Positional Encoding 244</p> <p>10.7 Transformer Network 244</p> <p>10.8 Supplementary Materials 247</p> <p>References 247</p> <p><b>11 Autoencoder Networks </b><b>249</b></p> <p>11.1 Introduction 249</p> <p>11.2 Introducing Autoencoders 250</p> 
<p>11.2.1 Definition of Autoencoder 250</p> <p>11.2.2 Structural Design 253</p> <p>11.3 Convolutional Autoencoder 256</p> <p>11.4 Denoising Autoencoder 258</p> <p>11.5 Sparse Autoencoders 260</p> <p>11.6 Contractive Autoencoders 262</p> <p>11.7 Variational Autoencoders 263</p> <p>11.8 Case Study 268</p> <p>11.9 Supplementary Materials 269</p> <p>References 269</p> <p><b>12 Generative Adversarial Networks (GANs) </b><b>271</b></p> <p>12.1 Introduction 271</p> <p>12.2 Foundation of Generative Adversarial Network 272</p> <p>12.3 Deep Convolutional GAN 279</p> <p>12.4 Conditional GAN 281</p> <p>12.5 Supplementary Materials 285</p> <p>References 285</p> <p><b>13 Dive Into Generative Adversarial Networks </b><b>287</b></p> <p>13.1 Introduction 287</p> <p>13.2 Wasserstein GAN 288</p> <p>13.2.1 Distance Functions 289</p> <p>13.2.2 Distance Function in GANs 291</p> <p>13.2.3 Wasserstein Loss 293</p> <p>13.3 Least-Squares GAN (LSGAN) 298</p> <p>13.4 Auxiliary Classifier GAN (ACGAN) 300</p> <p>13.5 Supplementary Materials 301</p> <p>References 301</p> <p><b>14 Disentangled Representation GANs </b><b>303</b></p> <p>14.1 Introduction 303</p> <p>14.2 Disentangled Representations 304</p> <p>14.3 InfoGAN 306</p> <p>14.4 StackedGAN 309</p> <p>14.5 Supplementary Materials 316</p> <p>References 316</p> <p><b>15 Introducing Federated Learning for Internet of Things (IoT) </b><b>317</b></p> <p>15.1 Introduction 317</p> <p>15.2 Federated Learning in the Internet of Things 319</p> <p>15.3 Taxonomic View of Federated Learning 322</p> <p>15.3.1 Network Structure 322</p> <p>15.3.1.1 Centralized Federated Learning 322</p> <p>15.3.1.2 Decentralized Federated Learning 323</p> <p>15.3.1.3 Hierarchical Federated Learning 324</p> <p>15.3.2 Data Partition 325</p> <p>15.3.3 Horizontal Federated Learning 326</p> <p>15.3.4 Vertical Federated Learning 327</p> <p>15.3.5 Federated Transfer Learning 328</p> <p>15.4 Open-Source Frameworks 330</p> <p>15.4.1 TensorFlow Federated 330</p> <p>15.4.2 PySyft and 
PyGrid 331</p> <p>15.4.3 FedML 331</p> <p>15.4.4 LEAF 332</p> <p>15.4.5 PaddleFL 332</p> <p>15.4.6 Federated AI Technology Enabler (FATE) 333</p> <p>15.4.7 OpenFL 333</p> <p>15.4.8 IBM Federated Learning 333</p> <p>15.4.9 NVIDIA Federated Learning Application Runtime Environment (NVIDIA FLARE) 334</p> <p>15.4.10 Flower 334</p> <p>15.4.11 Sherpa.ai 335</p> <p>15.5 Supplementary Materials 335</p> <p>References 335</p> <p><b>16 Privacy-Preserved Federated Learning </b><b>337</b></p> <p>16.1 Introduction 337</p> <p>16.2 Statistical Challenges in Federated Learning 338</p> <p>16.2.1 Nonindependent and Identically Distributed (Non-IID) Data 338</p> <p>16.2.1.1 Class Imbalance 338</p> <p>16.2.1.2 Distribution Imbalance 341</p> <p>16.2.1.3 Size Imbalance 346</p> <p>16.2.2 Model Heterogeneity 346</p> <p>16.2.2.1 Extracting the Essence of a Subject 346</p> <p>16.2.3 Block Cycles 348</p> <p>16.3 Security Challenge in Federated Learning 348</p> <p>16.3.1 Untargeted Attacks 349</p> <p>16.3.2 Targeted Attacks 349</p> <p>16.4 Privacy Challenges in Federated Learning 350</p> <p>16.4.1 Secure Aggregation 351</p> <p>16.4.1.1 Homomorphic Encryption (HE) 351</p> <p>16.4.1.2 Secure Multiparty Computation 352</p> <p>16.4.1.3 Blockchain 352</p> <p>16.4.2 Perturbation Method 353</p> <p>16.5 Supplementary Materials 355</p> <p>References 355</p> <p>Index 357</p>
<p><b>Mohamed Abdel-Basset, PhD,</b> is an Associate Professor in the Faculty of Computers and Informatics at Zagazig University, Egypt. He is a Senior Member of the IEEE. <p><b>Nour Moustafa, PhD,</b> is a Postgraduate Discipline Coordinator (Cyber) and Senior Lecturer in Cybersecurity and Computing at the School of Engineering and Information Technology at the University of New South Wales, UNSW Canberra, Australia. <p><b>Hossam Hawash</b> is an Assistant Lecturer in the Department of Computer Science, Faculty of Computers and Informatics at Zagazig University, Egypt.
