Details

Deep Learning for the Earth Sciences

A Comprehensive Approach to Remote Sensing, Climate Science and Geosciences
1st edition

By: Gustau Camps-Valls, Devis Tuia, Xiao Xiang Zhu, Markus Reichstein

117,99 €

Publisher: Wiley
Format: EPUB
Published: 18.08.2021
ISBN/EAN: 9781119646167
Language: English
Number of pages: 432

DRM-protected eBook. You will need, for example, Adobe Digital Editions and an Adobe ID to read it.

Description

DEEP LEARNING FOR THE EARTH SCIENCES

Explore this insightful treatment of deep learning in the field of earth sciences, from four leading voices.

Deep learning is a fundamental technique in modern Artificial Intelligence and is being applied to disciplines across the scientific spectrum; earth science is no exception. Yet the link between deep learning and the Earth sciences has only recently entered academic curricula and has therefore not yet proliferated. Deep Learning for the Earth Sciences delivers a unique perspective and treatment of the concepts, skills, and practices necessary to quickly become familiar with the application of deep learning techniques to the Earth sciences, and it prepares readers to use the technologies and principles described in their own research.

The distinguished editors have also included resources that explain and provide new ideas and recommendations for future research, especially useful to those involved in advanced research education or those seeking PhD thesis orientations. Readers will also benefit from the inclusion of:

- An introduction to deep learning for classification purposes, including advances in image segmentation and encoding priors, anomaly detection and target detection, and domain adaptation
- An exploration of learning representations and unsupervised deep learning, including deep learning image fusion, image retrieval, and matching and co-registration
- Practical discussions of regression, fitting, parameter retrieval, forecasting and interpolation
- An examination of physics-aware deep learning models, including emulation of complex codes and model parametrizations

Perfect for PhD students and researchers in the fields of geosciences, image processing, remote sensing, electrical engineering and computer science, and machine learning, Deep Learning for the Earth Sciences will also earn a place in the libraries of machine learning and pattern recognition researchers, engineers, and scientists.
Foreword xvi
by Vipin Kumar, Regents Professor, University of Minnesota

Acknowledgments xvii

List of Contributors xviii

List of Acronyms xxiv

1 Introduction 1
Gustau Camps-Valls, Xiao Xiang Zhu, Devis Tuia, and Markus Reichstein
1.1 A Taxonomy of Deep Learning Approaches 2
1.2 Deep Learning in Remote Sensing 3
1.3 Deep Learning in Geosciences and Climate 7
1.4 Book Structure and Roadmap 9

Part I Deep Learning to Extract Information from Remote Sensing Images 13

2 Learning Unsupervised Feature Representations of Remote Sensing Data with Sparse Convolutional Networks 15
Jose E. Adsuara, Manuel Campos-Taberner, Javier García-Haro, Carlo Gatta, Adriana Romero, and Gustau Camps-Valls
2.1 Introduction 15
2.2 Sparse Unsupervised Convolutional Networks 17
2.2.1 Sparsity as the Guiding Criterion 17
2.2.2 The EPLS Algorithm 18
2.2.3 Remarks 18
2.3 Applications 19
2.3.1 Hyperspectral Image Classification 19
2.3.2 Multisensor Image Fusion 21
2.4 Conclusions 22

3 Generative Adversarial Networks in the Geosciences 24
Gonzalo Mateo-García, Valero Laparra, Christian Requena-Mesa, and Luis Gómez-Chova
3.1 Introduction 24
3.2 Generative Adversarial Networks 25
3.2.1 Unsupervised GANs 25
3.2.2 Conditional GANs 26
3.2.3 Cycle-consistent GANs 27
3.3 GANs in Remote Sensing and Geosciences 28
3.3.1 GANs in Earth Observation 28
3.3.2 Conditional GANs in Earth Observation 30
3.3.3 CycleGANs in Earth Observation 30
3.4 Applications of GANs in Earth Observation 31
3.4.1 Domain Adaptation Across Satellites 31
3.4.2 Learning to Emulate Earth Systems from Observations 33
3.5 Conclusions and Perspectives 36

4 Deep Self-taught Learning in Remote Sensing 37
Ribana Roscher
4.1 Introduction 37
4.2 Sparse Representation 38
4.2.1 Dictionary Learning 39
4.2.2 Self-taught Learning 40
4.3 Deep Self-taught Learning 40
4.3.1 Application Example 43
4.3.2 Relation to Deep Neural Networks 44
4.4 Conclusion 45

5 Deep Learning-based Semantic Segmentation in Remote Sensing 46
Devis Tuia, Diego Marcos, Konrad Schindler, and Bertrand Le Saux
5.1 Introduction 46
5.2 Literature Review 47
5.3 Basics on Deep Semantic Segmentation: Computer Vision Models 49
5.3.1 Architectures for Image Data 49
5.3.2 Architectures for Point-clouds 52
5.4 Selected Examples 55
5.4.1 Encoding Invariances to Train Smaller Models: The Example of Rotation 55
5.4.2 Processing 3D Point Clouds as a Bundle of Images: SnapNet 59
5.4.3 Lake Ice Detection from Earth and from Space 62
5.5 Concluding Remarks 66

6 Object Detection in Remote Sensing 67
Jian Ding, Jinwang Wang, Wen Yang, and Gui-Song Xia
6.1 Introduction 67
6.1.1 Problem Description 67
6.1.2 Problem Settings of Object Detection 69
6.1.3 Object Representation in Remote Sensing 69
6.1.4 Evaluation Metrics 69
6.1.4.1 Precision-Recall Curve 70
6.1.4.2 Average Precision and Mean Average Precision 71
6.1.5 Applications 71
6.2 Preliminaries on Object Detection with Deep Models 72
6.2.1 Two-stage Algorithms 72
6.2.1.1 R-CNNs 72
6.2.1.2 R-FCN 73
6.2.2 One-stage Algorithms 73
6.2.2.1 YOLO 73
6.2.2.2 SSD 73
6.3 Object Detection in Optical RS Images 75
6.3.1 Related Works 75
6.3.1.1 Scale Variance 75
6.3.1.2 Orientation Variance 75
6.3.1.3 Oriented Object Detection 75
6.3.1.4 Detecting in Large-size Images 76
6.3.2 Datasets and Benchmark 77
6.3.2.1 DOTA 77
6.3.2.2 VisDrone 77
6.3.2.3 DIOR 77
6.3.2.4 xView 77
6.3.3 Two Representative Object Detectors in Optical RS Images 78
6.3.3.1 Mask OBB 78
6.3.3.2 RoI Transformer 82
6.4 Object Detection in SAR Images 86
6.4.1 Challenges of Detection in SAR Images 86
6.4.2 Related Works 86
6.4.3 Datasets and Benchmarks 88
6.5 Conclusion 89

7 Deep Domain Adaptation in Earth Observation 90
Benjamin Kellenberger, Onur Tasar, Bharath Bhushan Damodaran, Nicolas Courty, and Devis Tuia
7.1 Introduction 90
7.2 Families of Methodologies 91
7.3 Selected Examples 93
7.3.1 Adapting the Inner Representation 93
7.3.2 Adapting the Inputs Distribution 97
7.3.3 Using (Few, Well Chosen) Labels from the Target Domain 100
7.4 Concluding Remarks 104

8 Recurrent Neural Networks and the Temporal Component 105
Marco Körner and Marc Rußwurm
8.1 Recurrent Neural Networks 106
8.1.1 Training RNNs 107
8.1.1.1 Exploding and Vanishing Gradients 107
8.1.1.2 Circumventing Exploding and Vanishing Gradients 109
8.2 Gated Variants of RNNs 111
8.2.1 Long Short-term Memory Networks 111
8.2.1.1 The Cell State c_t and the Hidden State h_t 112
8.2.1.2 The Forget Gate f_t 112
8.2.1.3 The Modulation Gate v_t and the Input Gate i_t 112
8.2.1.4 The Output Gate o_t 112
8.2.1.5 Training LSTM Networks 113
8.2.2 Other Gated Variants 113
8.3 Representative Capabilities of Recurrent Networks 114
8.3.1 Recurrent Neural Network Topologies 114
8.3.2 Experiments 115
8.4 Application in Earth Sciences 117
8.5 Conclusion 118

9 Deep Learning for Image Matching and Co-registration 120
Maria Vakalopoulou, Stergios Christodoulidis, Mihir Sahasrabudhe, and Nikos Paragios
9.1 Introduction 120
9.2 Literature Review 123
9.2.1 Classical Approaches 123
9.2.2 Deep Learning Techniques for Image Matching 124
9.2.3 Deep Learning Techniques for Image Registration 125
9.3 Image Registration with Deep Learning 126
9.3.1 2D Linear and Deformable Transformer 126
9.3.2 Network Architectures 127
9.3.3 Optimization Strategy 128
9.3.4 Dataset and Implementation Details 129
9.3.5 Experimental Results 129
9.4 Conclusion and Future Research 134
9.4.1 Challenges and Opportunities 134
9.4.1.1 Dataset with Annotations 134
9.4.1.2 Dimensionality of Data 135
9.4.1.3 Multitemporal Datasets 135
9.4.1.4 Robustness to Changed Areas 135

10 Multisource Remote Sensing Image Fusion 136
Wei He, Danfeng Hong, Giuseppe Scarpa, Tatsumi Uezato, and Naoto Yokoya
10.1 Introduction 136
10.2 Pansharpening 137
10.2.1 Survey of Pansharpening Methods Employing Deep Learning 137
10.2.2 Experimental Results 140
10.2.2.1 Experimental Design 140
10.2.2.2 Visual and Quantitative Comparison in Pansharpening 140
10.3 Multiband Image Fusion 143
10.3.1 Supervised Deep Learning-based Approaches 143
10.3.2 Unsupervised Deep Learning-based Approaches 145
10.3.3 Experimental Results 146
10.3.3.1 Comparison Methods and Evaluation Measures 146
10.3.3.2 Dataset and Experimental Setting 146
10.3.3.3 Quantitative Comparison and Visual Results 147
10.4 Conclusion and Outlook 148

11 Deep Learning for Image Search and Retrieval in Large Remote Sensing Archives 150
Gencer Sumbul, Jian Kang, and Begüm Demir
11.1 Introduction 150
11.2 Deep Learning for RS CBIR 152
11.3 Scalable RS CBIR Based on Deep Hashing 156
11.4 Discussion and Conclusion 159
Acknowledgement 160

Part II Making a Difference in the Geosciences with Deep Learning 161

12 Deep Learning for Detecting Extreme Weather Patterns 163
Mayur Mudigonda, Prabhat Ram, Karthik Kashinath, Evan Racah, Ankur Mahesh, Yunjie Liu, Christopher Beckham, Jim Biard, Thorsten Kurth, Sookyung Kim, Samira Kahou, Tegan Maharaj, Burlen Loring, Christopher Pal, Travis O'Brien, Kenneth E. Kunkel, Michael F. Wehner, and William D. Collins
12.1 Scientific Motivation 163
12.2 Tropical Cyclone and Atmospheric River Classification 166
12.2.1 Methods 166
12.2.2 Network Architecture 167
12.2.3 Results 169
12.3 Detection of Fronts 170
12.3.1 Analytical Approach 170
12.3.2 Dataset 171
12.3.3 Results 172
12.3.4 Limitations 174
12.4 Semi-supervised Classification and Localization of Extreme Events 175
12.4.1 Applications of Semi-supervised Learning in Climate Modeling 175
12.4.1.1 Supervised Architecture 176
12.4.1.2 Semi-supervised Architecture 176
12.4.2 Results 176
12.4.2.1 Frame-wise Reconstruction 176
12.4.2.2 Results and Discussion 178
12.5 Detecting Atmospheric Rivers and Tropical Cyclones Through Segmentation Methods 179
12.5.1 Modeling Approach 179
12.5.1.1 Segmentation Architecture 180
12.5.1.2 Climate Dataset and Labels 181
12.5.2 Architecture Innovations: Weighted Loss and Modified Network 181
12.5.3 Results 183
12.6 Challenges and Implications for the Future 184
12.7 Conclusions 185

13 Spatio-temporal Autoencoders in Weather and Climate Research 186
Xavier-Andoni Tibau, Christian Reimers, Christian Requena-Mesa, and Jakob Runge
13.1 Introduction 186
13.2 Autoencoders 187
13.2.1 A Brief History of Autoencoders 188
13.2.2 Archetypes of Autoencoders 189
13.2.3 Variational Autoencoders (VAE) 191
13.2.4 Comparison Between Autoencoders and Classical Methods 192
13.3 Applications 193
13.3.1 Use of the Latent Space 193
13.3.1.1 Reduction of Dimensionality for the Understanding of the System Dynamics and its Interactions 195
13.3.1.2 Dimensionality Reduction for Feature Extraction and Prediction 199
13.3.2 Use of the Decoder 199
13.3.2.1 As a Random Sample Generator 201
13.3.2.2 Anomaly Detection 201
13.3.2.3 Use of a Denoising Autoencoder (DAE) Decoder 202
13.4 Conclusions and Outlook 203

14 Deep Learning to Improve Weather Predictions 204
Peter D. Dueben, Peter Bauer, and Samantha Adams
14.1 Numerical Weather Prediction 204
14.2 How Will Machine Learning Enhance Weather Predictions? 207
14.3 Machine Learning Across the Workflow of Weather Prediction 208
14.4 Challenges for the Application of ML in Weather Forecasts 213
14.5 The Way Forward 216

15 Deep Learning and the Weather Forecasting Problem: Precipitation Nowcasting 218
Zhihan Gao, Xingjian Shi, Hao Wang, Dit-Yan Yeung, Wang-chun Woo, and Wai-Kin Wong
15.1 Introduction 218
15.2 Formulation 220
15.3 Learning Strategies 221
15.4 Models 223
15.4.1 FNN-based Models 223
15.4.2 RNN-based Models 225
15.4.3 Encoder-forecaster Structure 226
15.4.4 Convolutional LSTM 226
15.4.5 ConvLSTM with Star-shaped Bridge 227
15.4.6 Predictive RNN 228
15.4.7 Memory in Memory Network 229
15.4.8 Trajectory GRU 231
15.5 Benchmark 233
15.5.1 HKO-7 Dataset 234
15.5.2 Evaluation Methodology 234
15.5.3 Evaluated Algorithms 235
15.5.4 Evaluation Results 236
15.6 Discussion 236
Appendix 238
Acknowledgement 239

16 Deep Learning for High-dimensional Parameter Retrieval 240
David Malmgren-Hansen
16.1 Introduction 240
16.2 Deep Learning Parameter Retrieval Literature 242
16.2.1 Land 242
16.2.2 Ocean 243
16.2.3 Cryosphere 244
16.2.4 Global Weather Models 244
16.3 The Challenge of High-dimensional Problems 244
16.3.1 Computational Load of CNNs 247
16.3.2 Mean Square Error or Cross-entropy Optimization? 249
16.4 Applications and Examples 250
16.4.1 Utilizing High-dimensional Spatio-spectral Information with CNNs 250
16.4.2 The Effect of Loss Functions in Retrieval of Sea Ice Concentrations 253
16.5 Conclusion 257

17 A Review of Deep Learning for Cryospheric Studies 258
Lin Liu
17.1 Introduction 258
17.2 Deep-learning-based Remote Sensing Studies of the Cryosphere 260
17.2.1 Glaciers 260
17.2.2 Ice Sheet 261
17.2.3 Snow 262
17.2.4 Permafrost 263
17.2.5 Sea Ice 264
17.2.6 River Ice 265
17.3 Deep-learning-based Modeling of the Cryosphere 265
17.4 Summary and Prospect 266
Appendix: List of Data and Codes 267

18 Emulating Ecological Memory with Recurrent Neural Networks 269
Basil Kraft, Simon Besnard, and Sujan Koirala
18.1 Ecological Memory Effects: Concepts and Relevance 269
18.2 Data-driven Approaches for Ecological Memory Effects 270
18.2.1 A Brief Overview of Memory Effects 270
18.2.2 Data-driven Methods for Memory Effects 271
18.3 Case Study: Emulating a Physical Model Using Recurrent Neural Networks 272
18.3.1 Physical Model Simulation Data 272
18.3.2 Experimental Design 273
18.3.3 RNN Setup and Training 274
18.4 Results and Discussion 276
18.4.1 The Predictive Capability Across Scales 276
18.4.2 Prediction of Seasonal Dynamics 279
18.5 Conclusions 281

Part III Linking Physics and Deep Learning Models 283

19 Applications of Deep Learning in Hydrology 285
Chaopeng Shen and Kathryn Lawson
19.1 Introduction 285
19.2 Deep Learning Applications in Hydrology 286
19.2.1 Dynamical System Modeling 286
19.2.1.1 Large-scale Hydrologic Modeling with Big Data 286
19.2.1.2 Data-limited LSTM Applications 290
19.2.2 Physics-constrained Hydrologic Machine Learning 292
19.2.3 Information Retrieval for Hydrology 293
19.2.4 Physically-informed Machine Learning for Subsurface Flow and Reactive Transport Modeling 294
19.2.5 Additional Observations 296
19.3 Current Limitations and Outlook 296

20 Deep Learning of Unresolved Turbulent Ocean Processes in Climate Models 298
Laure Zanna and Thomas Bolton
20.1 Introduction 298
20.2 The Parameterization Problem 299
20.3 Deep Learning Parameterizations of Subgrid Ocean Processes 300
20.3.1 Why DL for Subgrid Parameterizations? 300
20.3.2 Recent Advances in DL for Subgrid Parameterizations 300
20.4 Physics-aware Deep Learning 301
20.5 Further Challenges Ahead for Deep Learning Parameterizations 303

21 Deep Learning for the Parametrization of Subgrid Processes in Climate Models 307
Pierre Gentine, Veronika Eyring, and Tom Beucler
21.1 Introduction 307
21.2 Deep Neural Networks for Moist Convection (Deep Clouds) Parametrization 309
21.3 Physical Constraints and Generalization 312
21.4 Future Challenges 314

22 Using Deep Learning to Correct Theoretically-derived Models 315
Peter A. G. Watson
22.1 Experiments with the Lorenz '96 System 317
22.1.1 The Lorenz '96 Equations and Coarse-scale Models 318
22.1.1.1 Theoretically-derived Coarse-scale Model 318
22.1.1.2 Models with ANNs 319
22.1.2 Results 320
22.1.2.1 Single-timestep Tendency Prediction Errors 320
22.1.2.2 Forecast and Climate Prediction Skill 321
22.1.3 Testing Seamless Prediction 324
22.2 Discussion and Outlook 324
22.2.1 Towards Earth System Modeling 325
22.2.2 Application to Climate Change Studies 326
22.3 Conclusion 327

23 Outlook 328
Markus Reichstein, Gustau Camps-Valls, Devis Tuia, and Xiao Xiang Zhu

Bibliography 331

Index 401
Gustau Camps-Valls is Professor of Electrical Engineering and Lead Researcher in the Image Processing Laboratory (IPL) at the Universitat de València. His interests include the development of statistical learning methods, mainly kernel machines and neural networks, for the Earth sciences, from remote sensing to geoscience data analysis. Model efficiency and accuracy, but also interpretability, consistency, and causal discovery, drive his agenda on AI for Earth and climate.

Devis Tuia, PhD, is Associate Professor at the Ecole Polytechnique Fédérale de Lausanne (EPFL). He leads the Environmental Computational Science and Earth Observation laboratory, which focuses on processing Earth observation data with computational methods to advance environmental science.

Xiao Xiang Zhu is Professor of Data Science in Earth Observation and Director of the Munich AI Future Lab AI4EO at the Technical University of Munich, and heads the Department EO Data Science at the German Aerospace Center. Her lab develops innovative machine learning methods and big data analytics solutions to extract large-scale geo-information from big Earth observation data, aiming to tackle societal grand challenges such as urbanization, the UN's Sustainable Development Goals, and climate change.

Markus Reichstein is Director of the Biogeochemical Integration Department at the Max Planck Institute for Biogeochemistry and Professor for Global Geoecology at the University of Jena. His main research interests include the response and feedback of ecosystems (vegetation and soils) to climatic variability from an Earth system perspective, considering coupled carbon, water, and nutrient cycles. He has been tackling these topics with applied statistical learning for more than 15 years.

These products may also interest you:

Bandwidth Efficient Coding
By: John B. Anderson
PDF ebook
114,99 €

Bandwidth Efficient Coding
By: John B. Anderson
EPUB ebook
114,99 €