Details

Machine Learning and Big Data

Concepts, Algorithms, Tools and Applications
1st edition

By: Uma N. Dulhare, Khaleel Ahmad, Khairol Amali Bin Ahmad

197,99 €

Publisher: Wiley
Format: EPUB
Published: September 2, 2020
ISBN/EAN: 9781119654797
Language: English
Number of pages: 544

DRM-protected eBook. To read it you will need, for example, Adobe Digital Editions and an Adobe ID.

Description

This book is intended for academic and industrial developers exploring and developing applications in the area of big data and machine learning, including those solving technology requirements, evaluating methodology advances and demonstrating algorithms.

The intent of this book is to raise awareness of the algorithms used for machine learning and big data in the academic and professional community. The 17 chapters are divided into 5 sections: Theoretical Fundamentals; Big Data and Pattern Recognition; Machine Learning: Algorithms & Applications; Machine Learning's Next Frontier; and Hands-On and Case Study. While it dwells on the foundations of machine learning and big data as a part of analytics, it also focuses on contemporary topics for research and development. In this regard, the book covers machine learning algorithms and their modern applications in developing automated systems.

Subjects covered in detail include:

- Mathematical foundations of machine learning, with various examples.
- An empirical study of supervised learning algorithms such as Naïve Bayes and KNN, and of semi-supervised learning algorithms such as S3VM, graph-based methods and multiview learning.
- A precise study of unsupervised learning algorithms such as GMM, K-means clustering, the Dirichlet process mixture model and X-means, and of reinforcement learning algorithms including Q-learning, R-learning, TD learning and SARSA learning.
- Hands-on open-source machine learning tools, namely Apache Mahout and H2O.
- Case studies in which readers analyze the prescribed cases and present their own solutions or interpretations, including intrusion detection in MANETs using machine learning.
- A showcase of novel use cases: implications of electronic governance as well as a pragmatic study of big data/machine learning technologies for agriculture, healthcare, social media, industry, banking, insurance and so on.

Audience: Researchers and engineers in artificial intelligence, information technologies and computer science, as well as software developers and product/process managers.
Table of Contents

Preface xix

Section 1: Theoretical Fundamentals 1

1 Mathematical Foundation 3
Afroz and Basharat Hussain
1.1 Concept of Linear Algebra 3
1.1.1 Introduction 3
1.1.2 Vector Spaces 5
1.1.3 Linear Combination 6
1.1.4 Linearly Dependent and Independent Vectors 7
1.1.5 Linear Span, Basis and Subspace 8
1.1.6 Linear Transformation (or Linear Map) 9
1.1.7 Matrix Representation of Linear Transformation 10
1.1.8 Range and Null Space of Linear Transformation 13
1.1.9 Invertible Linear Transformation 15
1.2 Eigenvalues, Eigenvectors, and Eigendecomposition of a Matrix 15
1.2.1 Characteristic Polynomial 16
1.2.1.1 Some Results on Eigenvalues 16
1.2.2 Eigendecomposition 18
1.3 Introduction to Calculus 20
1.3.1 Function 20
1.3.2 Limits of Functions 21
1.3.2.1 Some Properties of Limits 22
1.3.2.2 Infinite Limits 25
1.3.2.3 Limits at Infinity 26
1.3.3 Continuous Functions and Discontinuous Functions 26
1.3.3.1 Discontinuous Functions 27
1.3.3.2 Properties of Continuous Functions 27
1.3.4 Differentiation 28
References 29

2 Theory of Probability 31
Parvaze Ahmad Dar and Afroz
2.1 Introduction 31
2.1.1 Definition 31
2.1.1.1 Statistical Definition of Probability 31
2.1.1.2 Mathematical Definition of Probability 32
2.1.2 Some Basic Terms of Probability 32
2.1.2.1 Trial and Event 32
2.1.2.2 Exhaustive Events (Exhaustive Cases) 33
2.1.2.3 Mutually Exclusive Events 33
2.1.2.4 Equally Likely Events 33
2.1.2.5 Certain Event or Sure Event 33
2.1.2.6 Impossible Event or Null Event (ϕ) 33
2.1.2.7 Sample Space 34
2.1.2.8 Permutation and Combination 34
2.1.2.9 Examples 35
2.2 Independence in Probability 38
2.2.1 Independent Events 38
2.2.2 Examples: Solve the Following Problems 38
2.3 Conditional Probability 41
2.3.1 Definition 41
2.3.2 Mutually Independent Events 42
2.3.3 Examples 42
2.4 Cumulative Distribution Function 43
2.4.1 Properties 44
2.4.2 Example 44
2.5 Bayes' Theorem 46
2.5.1 Theorem 46
2.5.1.1 Examples 47
2.6 Multivariate Gaussian Function 50
2.6.1 Definition 50
2.6.1.1 Univariate Gaussian (i.e., One-Variable Gaussian) 50
2.6.1.2 Degenerate Univariate Gaussian 51
2.6.1.3 Multivariate Gaussian 51
References 51

3 Correlation and Regression 53
Mohd. Abdul Haleem Rizwan
3.1 Introduction 53
3.2 Correlation 54
3.2.1 Positive Correlation and Negative Correlation 54
3.2.2 Simple Correlation and Multiple Correlation 54
3.2.3 Partial Correlation and Total Correlation 54
3.2.4 Correlation Coefficient 55
3.3 Regression 57
3.3.1 Linear Regression 64
3.3.2 Logistic Regression 64
3.3.3 Polynomial Regression 65
3.3.4 Stepwise Regression 66
3.3.5 Ridge Regression 67
3.3.6 Lasso Regression 67
3.3.7 Elastic Net Regression 68
3.4 Conclusion 68
References 69

Section 2: Big Data and Pattern Recognition 71

4 Data Preprocess 73
Md. Sharif Hossen
4.1 Introduction 73
4.1.1 Need of Data Preprocessing 74
4.1.2 Main Tasks in Data Preprocessing 75
4.2 Data Cleaning 77
4.2.1 Missing Data 77
4.2.2 Noisy Data 78
4.3 Data Integration 80
4.3.1 χ² Correlation Test 82
4.3.2 Correlation Coefficient Test 82
4.3.3 Covariance Test 83
4.4 Data Transformation 83
4.4.1 Normalization 83
4.4.2 Attribute Selection 85
4.4.3 Discretization 86
4.4.4 Concept Hierarchy Generation 86
4.5 Data Reduction 88
4.5.1 Data Cube Aggregation 88
4.5.2 Attribute Subset Selection 90
4.5.3 Numerosity Reduction 91
4.5.4 Dimensionality Reduction 95
4.6 Conclusion 101
Acknowledgements 101
References 101

5 Big Data 105
R. Chinnaiyan
5.1 Introduction 105
5.2 Big Data Evaluation With Its Tools 107
5.3 Architecture of Big Data 107
5.3.1 Big Data Analytics Framework Workflow 107
5.4 Issues and Challenges 109
5.4.1 Volume 109
5.4.2 Variety of Data 110
5.4.3 Velocity 110
5.5 Big Data Analytics Tools 110
5.6 Big Data Use Cases 114
5.6.1 Banking and Finance 114
5.6.2 Fraud Detection 114
5.6.3 Customer Division and Personalized Marketing 114
5.6.4 Customer Support 115
5.6.5 Risk Management 116
5.6.6 Life Time Value Prediction 116
5.6.7 Cyber Security Analytics 117
5.6.8 Insurance Industry 118
5.6.9 Health Care Sector 118
5.6.9.1 Big Data Medical Decision Support 120
5.6.9.2 Big Data–Based Disorder Management 120
5.6.9.3 Big Data–Based Patient Monitoring and Control 120
5.6.9.4 Big Data–Based Human Routine Analytics 120
5.6.10 Internet of Things 121
5.6.11 Weather Forecasting 121
5.7 Where IoT Meets Big Data 122
5.7.1 IoT Platform 122
5.7.2 Sensors or Devices 123
5.7.3 Device Aggregators 123
5.7.4 IoT Gateway 123
5.7.5 Big Data Platform and Tools 124
5.8 Role of Machine Learning for Big Data and IoT 124
5.8.1 Typical Machine Learning Use Cases 125
5.9 Conclusion 126
References 127

6 Pattern Recognition Concepts 131
Ambeshwar Kumar, R. Manikandan and C. Thaventhiran
6.1 Classifier 132
6.1.1 Introduction 132
6.1.2 Explanation-Based Learning 133
6.1.3 Isomorphism and Clique Method 135
6.1.4 Context-Dependent Classification 138
6.1.5 Summary 139
6.2 Feature Processing 140
6.2.1 Introduction 140
6.2.2 Detection and Extracting Edge With Boundary Line 141
6.2.3 Analyzing the Texture 142
6.2.4 Feature Mapping in Consecutive Moving Frame 143
6.2.5 Summary 145
6.3 Clustering 145
6.3.1 Introduction 145
6.3.2 Types of Clustering Algorithms 146
6.3.2.1 Dynamic Clustering Method 148
6.3.2.2 Model-Based Clustering 148
6.3.3 Application 149
6.3.4 Summary 150
6.4 Conclusion 151
References 151

Section 3: Machine Learning: Algorithms & Applications 153

7 Machine Learning 155
Elham Ghanbari and Sara Najafzadeh
7.1 History and Purpose of Machine Learning 155
7.1.1 History of Machine Learning 155
7.1.1.1 What is Machine Learning? 156
7.1.1.2 When is Machine Learning Needed? 157
7.1.2 Goals and Achievements in Machine Learning 158
7.1.3 Applications of Machine Learning 158
7.1.3.1 Practical Machine Learning Examples 159
7.1.4 Relation to Other Fields 161
7.1.4.1 Data Mining 161
7.1.4.2 Artificial Intelligence 162
7.1.4.3 Computational Statistics 162
7.1.4.4 Probability 163
7.1.5 Limitations of Machine Learning 163
7.2 Concept of Well-Defined Learning Problem 164
7.2.1 Concept Learning 164
7.2.1.1 Concept Representation 166
7.2.1.2 Instance Representation 167
7.2.1.3 The Inductive Learning Hypothesis 167
7.2.2 Concept Learning as Search 167
7.2.2.1 Concept Generality 168
7.3 General-to-Specific Ordering Over Hypotheses 169
7.3.1 Basic Concepts: Hypothesis, Generality 169
7.3.2 Structure of the Hypothesis Space 169
7.3.2.1 Hypothesis Notations 169
7.3.2.2 Hypothesis Evaluations 170
7.3.3 Ordering on Hypotheses: General to Specific 170
7.3.3.1 Most Specific Generalized 171
7.3.3.2 Most General Specialized 173
7.3.3.3 Generalization and Specialization Operators 173
7.3.4 Hypothesis Space Search by Find-S Algorithm 174
7.3.4.1 Properties of the Find-S Algorithm 176
7.3.4.2 Limitations of the Find-S Algorithm 176
7.4 Version Spaces and Candidate Elimination Algorithm 177
7.4.1 Representing Version Spaces 177
7.4.1.1 General Boundary 178
7.4.1.2 Specific Boundary 178
7.4.2 Version Space as Search Strategy 179
7.4.3 The List-Eliminate Method 179
7.4.4 The Candidate-Elimination Method 180
7.4.4.1 Example 181
7.4.4.2 Convergence of Candidate-Elimination Method 183
7.4.4.3 Inductive Bias for Candidate-Elimination 184
7.5 Concepts of Machine Learning Algorithm 185
7.5.1 Types of Learning Algorithms 185
7.5.1.1 Incremental vs. Batch Learning Algorithms 186
7.5.1.2 Offline vs. Online Learning Algorithms 188
7.5.1.3 Inductive vs. Deductive Learning Algorithms 189
7.5.2 A Framework for Machine Learning Algorithms 189
7.5.2.1 Training Data 190
7.5.2.2 Target Function 190
7.5.2.3 Construction Model 191
7.5.2.4 Evaluation 191
7.5.3 Types of Machine Learning Algorithms 194
7.5.3.1 Supervised Learning 196
7.5.3.2 Unsupervised Learning 198
7.5.3.3 Semi-Supervised Learning 200
7.5.3.4 Reinforcement Learning 200
7.5.3.5 Deep Learning 202
7.5.4 Types of Machine Learning Problems 203
7.5.4.1 Classification 204
7.5.4.2 Clustering 204
7.5.4.3 Optimization 205
7.5.4.4 Regression 205
Conclusion 205
References 206

8 Performance of Supervised Learning Algorithms on Multi-Variate Datasets 209
Asif Iqbal Hajamydeen and Rabab Alayham Abbas Helmi
8.1 Introduction 209
8.2 Supervised Learning Algorithms 210
8.2.1 Datasets and Experimental Setup 211
8.2.2 Data Treatment/Preprocessing 212
8.3 Classification 212
8.3.1 Support Vector Machines (SVM) 213
8.3.2 Naive Bayes (NB) Algorithm 214
8.3.3 Bayesian Network (BN) 214
8.3.4 Hidden Markov Model (HMM) 215
8.3.5 K-Nearest Neighbour (KNN) 216
8.3.6 Training Time 216
8.4 Neural Network 217
8.4.1 Artificial Neural Networks Architecture 219
8.4.2 Application Areas 222
8.4.3 Artificial Neural Networks and Time Series 224
8.5 Comparisons and Discussions 225
8.5.1 Comparison of Classification Accuracy 225
8.5.2 Forecasting Efficiency Comparison 226
8.5.3 Recurrent Neural Network (RNN) 226
8.5.4 Backpropagation Neural Network (BPNN) 228
8.5.5 General Regression Neural Network 229
8.6 Summary and Conclusion 230
References 231

9 Unsupervised Learning 233
M. Kumara Swamy and Tejaswi Puligilla
9.1 Introduction 233
9.2 Related Work 234
9.3 Unsupervised Learning Algorithms 235
9.4 Classification of Unsupervised Learning Algorithms 238
9.4.1 Hierarchical Methods 238
9.4.2 Partitioning Methods 239
9.4.3 Density-Based Methods 242
9.4.4 Grid-Based Methods 245
9.4.5 Constraint-Based Clustering 245
9.5 Unsupervised Learning Algorithms in ML 246
9.5.1 Parametric Algorithms 246
9.5.2 Non-Parametric Algorithms 246
9.5.3 Dirichlet Process Mixture Model 247
9.5.4 X-Means 248
9.6 Summary and Conclusions 248
References 248

10 Semi-Supervised Learning 251
Manish Devgan, Gaurav Malik and Deepak Kumar Sharma
10.1 Introduction 252
10.1.1 Semi-Supervised Learning 252
10.1.2 Comparison With Other Paradigms 255
10.2 Training Models 257
10.2.1 Self-Training 257
10.2.2 Co-Training 259
10.3 Generative Models—Introduction 261
10.3.1 Image Classification 264
10.3.2 Text Categorization 266
10.3.3 Speech Recognition 268
10.3.4 Baum-Welch Algorithm 268
10.4 S3VMs 270
10.5 Graph-Based Algorithms 274
10.5.1 Mincut 275
10.5.2 Harmonic 276
10.5.3 Manifold Regularization 277
10.6 Multiview Learning 277
10.7 Conclusion 278
References 279

11 Reinforcement Learning 281
Amandeep Singh Bhatia, Mandeep Kaur Saggi, Amit Sundas and Jatinder Ashta
11.1 Introduction: Reinforcement Learning 281
11.1.1 Elements of Reinforcement Learning 283
11.2 Model-Free RL 284
11.2.1 Q-Learning 285
11.2.2 R-Learning 286
11.3 Model-Based RL 287
11.3.1 SARSA Learning 289
11.3.2 Dyna-Q Learning 290
11.3.3 Temporal Difference 291
11.3.3.1 TD(0) Algorithm 292
11.3.3.2 TD(1) Algorithm 293
11.3.3.3 TD(λ) Algorithm 294
11.3.4 Monte Carlo Method 294
11.3.4.1 Monte Carlo Reinforcement Learning 296
11.3.4.2 Monte Carlo Policy Evaluation 296
11.3.4.3 Monte Carlo Policy Improvement 298
11.4 Conclusion 298
References 299

12 Application of Big Data and Machine Learning 305
Neha Sharma, Sunil Kumar Gautam, Azriel A. Henry and Abhimanyu Kumar
12.1 Introduction 306
12.2 Motivation 307
12.3 Related Work 308
12.4 Application of Big Data and ML 309
12.4.1 Healthcare 309
12.4.2 Banking and Insurance 312
12.4.3 Transportation 314
12.4.4 Media and Entertainment 316
12.4.5 Education 317
12.4.6 Ecosystem Conservation 319
12.4.7 Manufacturing 321
12.4.8 Agriculture 322
12.5 Issues and Challenges 324
12.6 Conclusion 326
References 326

Section 4: Machine Learning's Next Frontier 335

13 Transfer Learning 337
Riyanshi Gupta, Kartik Krishna Bhardwaj and Deepak Kumar Sharma
13.1 Introduction 338
13.1.1 Motivation, Definition, and Representation 338
13.2 Traditional Learning vs. Transfer Learning 338
13.3 Key Takeaways: Functionality 340
13.4 Transfer Learning Methodologies 341
13.5 Inductive Transfer Learning 342
13.6 Unsupervised Transfer Learning 344
13.7 Transductive Transfer Learning 346
13.8 Categories in Transfer Learning 347
13.9 Instance Transfer 348
13.10 Feature Representation Transfer 349
13.11 Parameter Transfer 349
13.12 Relational Knowledge Transfer 350
13.13 Relationship With Deep Learning 351
13.13.1 Transfer Learning in Deep Learning 351
13.13.2 Types of Deep Transfer Learning 352
13.13.3 Adaptation of Domain 352
13.13.4 Domain Confusion 353
13.13.5 Multitask Learning 354
13.13.6 One-Shot Learning 354
13.13.7 Zero-Shot Learning 355
13.14 Applications: Allied Classical Problems 355
13.14.1 Transfer Learning for Natural Language Processing 356
13.14.2 Transfer Learning for Computer Vision 356
13.14.3 Transfer Learning for Audio and Speech 357
13.15 Further Advancements and Conclusion 357
References 358

Section 5: Hands-On and Case Study 361

14 Hands-On MAHOUT—Machine Learning Tool
Uma N. Dulhare and Sheikh Gouse
14.1 Introduction to Mahout 363
14.1.1 Features 366
14.1.2 Advantages 366
14.1.3 Disadvantages 366
14.1.4 Application 366
14.2 Installation Steps of Apache Mahout Using Cloudera 367
14.2.1 Installation of VMware Workstation 367
14.2.2 Installation of Cloudera 368
14.2.3 Installation of Mahout 383
14.2.4 Installation of Maven 384
14.2.5 Testing Mahout 386
14.3 Installation Steps of Apache Mahout Using Windows 10 386
14.3.1 Installation of Java 386
14.3.2 Installation of Hadoop 387
14.3.3 Installation of Mahout 387
14.3.4 Installation of Maven 387
14.3.5 Path Setting 388
14.3.6 Hadoop Configuration 391
14.4 Installation Steps of Apache Mahout Using Eclipse 395
14.4.1 Eclipse Installation 395
14.4.2 Installation of Maven Through Eclipse 396
14.4.3 Maven Setup for Mahout Configuration 399
14.4.4 Building the Path 402
14.4.5 Modifying the pom.xml File 405
14.4.6 Creating the Data File 407
14.4.7 Adding External Jar Files 408
14.4.8 Creating the New Package and Classes 410
14.4.9 Result 411
14.5 Mahout Algorithms 412
14.5.1 Classification 412
14.5.2 Clustering 413
14.5.3 Recommendation 415
14.6 Conclusion 418
References 418

15 Hands-On H2O Machine Learning Tool 423
Uma N. Dulhare, Azmath Mubeen and Khaleel Ahmed
15.1 Introduction 424
15.2 Installation 425
15.2.1 The Process of Installation 425
15.3 Interfaces 431
15.4 Programming Fundamentals 432
15.4.1 Data Manipulation 432
15.4.1.1 Data Types 432
15.4.1.2 Data Import 435
15.4.2 Models 436
15.4.2.1 Model Training 436
15.4.3 Discovering Aspects 437
15.4.3.1 Converting Data Frames 437
15.4.4 H2O Cluster Actions 438
15.4.4.1 H2O Key Value Retrieval 438
15.4.4.2 H2O Cluster Connection 438
15.4.5 Commands 439
15.4.5.1 Cluster Information 439
15.4.5.2 General Data Operations 441
15.4.5.3 String Manipulation Commands 442
15.5 Machine Learning in H2O 442
15.5.1 Supervised Learning 442
15.5.2 Unsupervised Learning 443
15.6 Applications of H2O 443
15.6.1 Deep Learning 443
15.6.2 K-Fold Cross-Authentication or Validation 448
15.6.3 Stacked Ensemble and Random Forest Estimator 450
15.7 Conclusion 452
References 453

16 Case Study: Intrusion Detection System Using Machine Learning 455
Syeda Hajra Mahin, Fahmina Taranum and Reshma Nikhat
16.1 Introduction 456
16.1.1 Components Used to Design the Scenario 456
16.1.1.1 Black Hole 456
16.1.1.2 Intrusion Detection System 457
16.1.1.3 Components Used From MATLAB Simulator 458
16.2 System Design 465
16.2.1 Three Sub-Network Architecture 465
16.2.2 Using Classifiers of MATLAB 465
16.3 Existing Proposals 467
16.4 Approaches Used in Designing the Scenario 469
16.4.1 Algorithm Used in QualNet 469
16.4.2 Algorithm Applied in MATLAB 471
16.5 Result Analysis 471
16.5.1 Results From QualNet 471
16.5.1.1 Deployment 471
16.5.1.2 Detection 472
16.5.1.3 Avoidance 473
16.5.1.4 Validation of Conclusion 473
16.5.2 Applying Results to MATLAB 473
16.5.2.1 K-Nearest Neighbor 475
16.5.2.2 SVM 477
16.5.2.3 Decision Tree 477
16.5.2.4 Naive Bayes 479
16.5.2.5 Neural Network 479
16.6 Conclusion 484
References 484

17 Inclusion of Security Features for Implications of Electronic Governance Activities 487
Prabal Pratap and Nripendra Dwivedi
17.1 Introduction 487
17.2 Objective of E-Governance 491
17.3 Role of Identity in E-Governance 493
17.3.1 Identity 493
17.3.2 Identity Management and its Buoyancy Against Identity Theft in E-Governance 494
17.4 Status of E-Governance in Other Countries 496
17.4.1 E-Governance Services in Other Countries Like Australia and South Africa 496
17.4.2 Adaptation of Processes and Methodology for Developing Countries 496
17.4.3 Different Programs Related to E-Governance 499
17.5 Pros and Cons of E-Governance 501
17.6 Challenges of E-Governance in Machine Learning 502
17.7 Conclusion 503
References 503

Index 505
About the Authors

Uma N. Dulhare is a Professor in the Department of Computer Science & Engineering, MJCET, affiliated to Osmania University, Hyderabad, India. She has more than 20 years of teaching experience, with many publications in reputed international conferences and journals as well as book chapter contributions. She received her PhD from Osmania University, Hyderabad.

Khaleel Ahmad is an Assistant Professor in the Department of Computer Science & Information Technology at Maulana Azad National Urdu University, Hyderabad, India. He holds a PhD in Computer Science & Engineering. He has published more than 25 papers in refereed journals and conferences and has edited two books.

Khairol Amali bin Ahmad obtained a BSc in Electrical Engineering in 1992 from the United States Military Academy, West Point, an MSc in Military Electronic Systems Engineering in 1999 from Cranfield University, England, and a PhD from ISAE-SUPAERO, France, in 2015. Currently, he is the Dean of the Engineering Faculty at the National Defense University of Malaysia.

You might also be interested in these products:

MDX Solutions
By: George Spofford, Sivakumar Harinath, Christopher Webb, Dylan Hai Huang, Francesco Civardi
PDF ebook
53,99 €
Concept Data Analysis
By: Claudio Carpineto, Giovanni Romano
PDF ebook
107,99 €
Handbook of Virtual Humans
By: Nadia Magnenat-Thalmann, Daniel Thalmann
PDF ebook
150,99 €