Details

Distributed Source Coding

Theory and Practice
1st edition

By: Shuang Wang, Yong Fang, Samuel Cheng

81,99 €

Publisher: Wiley
Format: PDF
Published: 05.01.2017
ISBN/EAN: 9781118705971
Language: English
Number of pages: 384

DRM-protected eBook. To read it you will need, for example, Adobe Digital Editions and an Adobe ID.

Description

Distributed source coding (DSC) is one of the key enablers for efficient cooperative communication. Its potential applications range from wireless sensor networks, ad-hoc networks, and surveillance networks to robust low-complexity video coding, stereo/multiview video coding, HDTV, hyperspectral and multispectral imaging, and biometrics.

The book is divided into three parts: theory, algorithms, and applications. Part I covers the background of information theory with an emphasis on DSC; Part II discusses the design of algorithmic solutions for DSC problems, covering the three most important of them: Slepian–Wolf, Wyner–Ziv, and multiterminal (MT) source coding; and Part III is dedicated to a variety of potential DSC applications.

Key features:

- Clear explanation of distributed source coding theory and algorithms, including both lossless and lossy designs.
- Rich applications of distributed source coding, covering multimedia communication and data security.
- Self-contained content for beginners, from basic information theory to practical code implementation.

The book provides the fundamental knowledge engineers and computer scientists need to approach the topic of distributed source coding. It is also suitable for senior undergraduate and first-year graduate students in electrical engineering, computer engineering, signal processing, image/video processing, and information theory and communications.
Preface xiii
Acknowledgment xv
About the Companion Website xvii

1 Introduction 1
1.1 What is Distributed Source Coding? 2
1.2 Historical Overview and Background 2
1.3 Potential and Applications 3
1.4 Outline 4

Part I Theory of Distributed Source Coding 7

2 Lossless Compression of Correlated Sources 9
2.1 Slepian–Wolf Coding 10
2.1.1 Proof of the SW Theorem 15
  Achievability of the SW Theorem 16
  Converse of the SW Theorem 19
2.2 Asymmetric and Symmetric SW Coding 21
2.3 SW Coding of Multiple Sources 22

3 Wyner–Ziv Coding Theory 25
3.1 Forward Proof of WZ Coding 27
3.2 Converse Proof of WZ Coding 29
3.3 Examples 30
3.3.1 Doubly Symmetric Binary Source 30
  Problem Setup 30
  A Proposed Scheme 31
  Verify the Optimality of the Proposed Scheme 32
3.3.2 Quadratic Gaussian Source 35
  Problem Setup 35
  Proposed Scheme 36
  Verify the Optimality of the Proposed Scheme 37
3.4 Rate Loss of the WZ Problem 38
  Binary Source Case 39
  Rate Loss of General Cases 39

4 Lossy Distributed Source Coding 41
4.1 Berger–Tung Inner Bound 42
4.1.1 Berger–Tung Scheme 42
  Codebook Preparation 42
  Encoding 42
  Decoding 43
4.1.2 Distortion Analysis 43
4.2 Indirect Multiterminal Source Coding 45
4.2.1 Quadratic Gaussian CEO Problem with Two Encoders 45
  Forward Proof of Quadratic Gaussian CEO Problem with Two Terminals 46
  Converse Proof of Quadratic Gaussian CEO Problem with Two Terminals 48
4.3 Direct Multiterminal Source Coding 54
4.3.1 Forward Proof of Gaussian Multiterminal Source Coding Problem with Two Sources 55
4.3.2 Converse Proof of Gaussian Multiterminal Source Coding Problem with Two Sources 63
  Bounds for R1 and R2 64
  Collaborative Lower Bound 66
  μ-sum Bound 67

Part II Implementation 75

5 Slepian–Wolf Code Designs Based on Channel Coding 77
5.1 Asymmetric SW Coding 77
5.1.1 Binning Idea 78
5.1.2 Syndrome-based Approach 79
  Hamming Binning 80
  SW Encoding 80
  SW Decoding 80
  LDPC-based SW Coding 81
5.1.3 Parity-based Approach 82
5.1.4 Syndrome-based Versus Parity-based Approach 84
5.2 Non-asymmetric SW Coding 85
5.2.1 Generalized Syndrome-based Approach 86
5.2.2 Implementation using IRA Codes 88
5.3 Adaptive Slepian–Wolf Coding 90
5.3.1 Particle-based Belief Propagation for SW Coding 91
5.4 Latest Developments and Trends 93

6 Distributed Arithmetic Coding 97
6.1 Arithmetic Coding 97
6.2 Distributed Arithmetic Coding 101
6.3 Definition of the DAC Spectrum 103
6.3.1 Motivations 103
6.3.2 Initial DAC Spectrum 104
6.3.3 Depth-i DAC Spectrum 105
6.3.4 Some Simple Properties of the DAC Spectrum 107
6.4 Formulation of the Initial DAC Spectrum 107
6.5 Explicit Form of the Initial DAC Spectrum 110
6.6 Evolution of the DAC Spectrum 113
6.7 Numerical Calculation of the DAC Spectrum 116
6.7.1 Numerical Calculation of the Initial DAC Spectrum 117
6.7.2 Numerical Estimation of DAC Spectrum Evolution 118
6.8 Analyses on DAC Codes with Spectrum 120
6.8.1 Definition of DAC Codes 121
6.8.2 Codebook Cardinality 122
6.8.3 Codebook Index Distribution 123
6.8.4 Rate Loss 123
6.8.5 Decoder Complexity 124
6.8.6 Decoding Error Probability 126
6.9 Improved Binary DAC Codec 130
6.9.1 Permutated BDAC Codec 130
  Principle 130
  Proof of SW Limit Achievability 131
6.9.2 BDAC Decoder with Weighted Branching 132
6.10 Implementation of the Improved BDAC Codec 134
6.10.1 Encoder 134
  Principle 134
  Implementation 135
6.10.2 Decoder 135
  Principle 135
  Implementation 136
6.11 Experimental Results 138
  Effect of Segment Size on Permutation Technique 139
  Effect of Surviving-Path Number on WB Technique 139
  Comparison with LDPC Codes 139
  Application of PBDAC to Nonuniform Sources 140
6.12 Conclusion 141

7 Wyner–Ziv Code Design 143
7.1 Vector Quantization 143
7.2 Lattice Theory 146
7.2.1 What is a Lattice? 146
  Examples 146
  Dual Lattice 147
  Integral Lattice 147
  Lattice Quantization 148
7.2.2 What is a Good Lattice? 149
  Packing Efficiency 149
  Covering Efficiency 150
  Normalized Second Moment 150
  Kissing Number 150
  Some Good Lattices 151
7.3 Nested Lattice Quantization 151
  Encoding/decoding 152
  Coset Binning 152
  Quantization Loss and Binning Loss 153
  SW Coded NLQ 154
7.3.1 Trellis Coded Quantization 154
7.3.2 Principle of TCQ 155
  Generation of Codebooks 156
  Generation of Trellis from Convolutional Codes 156
  Mapping of Trellis Branches onto Sub-codebooks 157
  Quantization 157
  Example 158
7.4 WZ Coding Based on TCQ and LDPC Codes 159
7.4.1 Statistics of TCQ Indices 159
7.4.2 LLR of Trellis Bits 162
7.4.3 LLR of Codeword Bits 163
7.4.4 Minimum MSE Estimation 163
7.4.5 Rate Allocation of Bit-planes 164
7.4.6 Experimental Results 166

Part III Applications 167

8 Wyner–Ziv Video Coding 169
8.1 Basic Principle 169
8.2 Benefits of WZ Video Coding 170
8.3 Key Components of WZ Video Decoding 171
8.3.1 Side-information Preparation 171
  Bidirectional Motion Compensation 172
8.3.2 Correlation Modeling 173
  Exploiting Spatial Redundancy 174
8.3.3 Rate Controller 175
8.4 Other Notable Features of Miscellaneous WZ Video Coders 175

9 Correlation Estimation in DVC 177
9.1 Background to Correlation Parameter Estimation in DVC 177
9.1.1 Correlation Model in WZ Video Coding 177
9.1.2 Offline Correlation Estimation 178
  Pixel Domain Offline Correlation Estimation 178
  Transform Domain Offline Correlation Estimation 180
9.1.3 Online Correlation Estimation 181
  Pixel Domain Online Correlation Estimation 182
  Transform Domain Online Correlation Estimation 184
9.2 Recap of Belief Propagation and Particle Filter Algorithms 185
9.2.1 Belief Propagation Algorithm 185
9.2.2 Particle Filtering 186
9.3 Correlation Estimation in DVC with Particle Filtering 187
9.3.1 Factor Graph Construction 187
9.3.2 Correlation Estimation in DVC with Particle Filtering 190
9.3.3 Experimental Results 192
9.3.4 Conclusion 197
9.4 Low Complexity Correlation Estimation using Expectation Propagation 199
9.4.1 System Architecture 199
9.4.2 Factor Graph Construction 199
  Joint Bit-plane SW Coding (Region II) 200
  Correlation Parameter Tracking (Region I) 201
9.4.3 Message Passing on the Constructed Factor Graph 202
  Expectation Propagation 203
9.4.4 Posterior Approximation of the Correlation Parameter using Expectation Propagation 204
  Moment Matching 205
9.4.5 Experimental Results 206
9.4.6 Conclusion 211

10 DSC for Solar Image Compression 213
10.1 Background 213
10.2 Related Work 215
10.3 Distributed Multi-view Image Coding 217
10.4 Adaptive Joint Bit-plane WZ Decoding of Multi-view Images with Disparity Estimation 217
10.4.1 Joint Bit-plane WZ Decoding 217
10.4.2 Joint Bit-plane WZ Decoding with Disparity Estimation 219
10.4.3 Joint Bit-plane WZ Decoding with Correlation Estimation 220
10.5 Results and Discussion 221
10.6 Summary 224

11 Secure Distributed Image Coding 225
11.1 Background 225
11.2 System Architecture 227
11.2.1 Compression of Encrypted Data 228
11.2.2 Joint Decompression and Decryption Design 230
11.3 Practical Implementation Issues 233
11.4 Experimental Results 233
11.4.1 Experiment Setup 234
11.4.2 Security and Privacy Protection 235
11.4.3 Compression Performance 236
11.5 Discussion 239

12 Secure Biometric Authentication Using DSC 241
12.1 Background 241
12.2 Related Work 243
12.3 System Architecture 245
12.3.1 Feature Extraction 246
12.3.2 Feature Pre-encryption 248
12.3.3 SeDSC Encrypter/decrypter 248
12.3.4 Privacy-preserving Authentication 249
12.4 SeDSC Encrypter Design 249
12.4.1 Non-asymmetric SW Codes with Code Partitioning 250
12.4.2 Implementation of SeDSC Encrypter using IRA Codes 251
12.5 SeDSC Decrypter Design 252
12.6 Experiments 256
12.6.1 Dataset and Experimental Setup 256
12.6.2 Feature Length Selection 257
12.6.3 Authentication Accuracy 257
  Authentication Performances on Small Feature Length (i.e., N = 100) 257
  Performances on Large Feature Lengths (i.e., N ≥ 300) 258
12.6.4 Privacy and Security 259
12.6.5 Complexity Analysis 261
12.7 Discussion 261

A Basic Information Theory 263
A.1 Information Measures 263
A.1.1 Entropy 263
A.1.2 Relative Entropy 267
A.1.3 Mutual Information 268
A.1.4 Entropy Rate 269
A.2 Independence and Mutual Information 270
A.3 Venn Diagram Interpretation 273
A.4 Convexity and Jensen's Inequality 274
A.5 Differential Entropy 277
A.5.1 Gaussian Random Variables 278
A.5.2 Entropy Power Inequality 278
A.6 Typicality 279
A.6.1 Jointly Typical Sequences 282
A.7 Packing Lemmas and Covering Lemmas 284
A.8 Shannon's Source Coding Theorem 286
A.9 Lossy Source Coding – Rate-distortion Theorem 289
A.9.1 Rate-distortion Problem with Side Information 291

B Background on Channel Coding 293
B.1 Linear Block Codes 294
B.1.1 Syndrome Decoding of Block Codes 295
B.1.2 Hamming Codes, Packing Bound, and Perfect Codes 295
B.2 Convolutional Codes 297
B.2.1 Viterbi Decoding Algorithm 298
B.3 Shannon's Channel Coding Theorem 301
B.3.1 Achievability Proof of the Channel Coding Theorem 303
B.3.2 Converse Proof of Channel Coding Theorem 305
B.4 Low-density Parity-check Codes 306
B.4.1 A Quick Summary of LDPC Codes 306
B.4.2 Belief Propagation Algorithm 307
B.4.3 LDPC Decoding using BP 312
B.4.4 IRA Codes 314

C Approximate Inference 319
C.1 Stochastic Approximation 319
C.1.1 Importance Sampling Methods 320
C.1.2 Markov Chain Monte Carlo 321
  Markov Chains 321
  Markov Chain Monte Carlo 321
C.2 Deterministic Approximation 322
C.2.1 Preliminaries 322
  Exponential Family 322
  Kullback–Leibler Divergence 323
  Assumed-density Filtering 324
C.2.2 Expectation Propagation 325
  Relationship with BP 326
C.2.3 Relationship with Other Variational Inference Methods 328

D Multivariate Gaussian Distribution 331
D.1 Introduction 331
D.2 Probability Density Function 331
D.3 Marginalization 332
D.4 Conditioning 333
D.5 Product of Gaussian pdfs 334
D.6 Division of Gaussian pdfs 337
D.7 Mixture of Gaussians 337
D.7.1 Reduce the Number of Components in Gaussian Mixtures 338
  Which Components to Merge? 340
  How to Merge Components? 341
D.8 Summary 342

Appendix: Matrix Equations 343
Bibliography 345
Index 357
SHUANG WANG, University of California, San Diego, USA
YONG FANG, Northwest A&F University, China
SAMUEL CHENG, University of Oklahoma, USA
Understanding distributed source coding from theory to practice
