This edition first published 2020
© 2020 John Wiley & Sons Ltd
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.
The right of Hosameldin Ahmed and Asoke K. Nandi to be identified as the authors of this work has been asserted in accordance with law.
Registered Offices
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA
John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK
Editorial Office
The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK
For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.
Wiley also publishes its books in a variety of electronic formats and by print‐on‐demand. Some content that appears in standard print versions of this book may not be available in other formats.
Limit of Liability/Disclaimer of Warranty
In view of ongoing research, equipment modifications, changes in governmental regulations, and the constant flow of information relating to the use of experimental reagents, equipment, and devices, the reader is urged to review and evaluate the information provided in the package insert or instructions for each chemical, piece of equipment, reagent, or device for, among other things, any changes in the instructions or indication of usage and for added warnings and precautions. While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
Library of Congress Cataloging‐in‐Publication Data
Names: Ahmed, Hosameldin, author. | Nandi, Asoke Kumar, 1976- author.
Title: Condition monitoring with vibration signals : compressive sampling
and learning algorithms for rotating machines / Hosameldin Ahmed and
Asoke K. Nandi, Brunel University London, UK.
Description: Hoboken, NJ, USA : Wiley-IEEE Press, 2019. | Includes
bibliographical references and index.
Identifiers: LCCN 2019024456 (print) | LCCN 2019024457 (ebook) | ISBN
9781119544623 (cloth) | ISBN 9781119544630 (adobe pdf) | ISBN
9781119544647 (epub)
Subjects: LCSH: Machinery-Monitoring. | Machinery-Vibration-Measurement.
| Rotors-Maintenance and repair.
Classification: LCC TJ177 .N36 2019 (print) | LCC TJ177 (ebook) | DDC
621.8028/7-dc23
LC record available at https://lccn.loc.gov/2019024456
LC ebook record available at https://lccn.loc.gov/2019024457
Cover Design: Wiley
Cover Image: © PIRO4D/2442 images/Pixabay
To
My parents, my wife Intesar, and our children – Monia, Malaz, Mohamed, and Abubaker Ahmed.
Hosameldin Ahmed
My wife, Marion, and our children – Robin, David, and Anita Nandi.
Asoke K. Nandi
Rotating machines are essential elements of most engineering processes and of many critical industrial functions, so monitoring their condition is a key technique for ensuring the efficiency and quality of any product. Machine condition monitoring is the process of assessing the health of a machine, and it is incorporated into various sensitive applications of rotating machines, e.g. wind turbines, oil and gas, aerospace and defence, automotive, and marine. The increased complexity of modern rotating machines requires more effective and efficient condition monitoring techniques. For that reason, a growing body of literature has resulted from the research and development efforts of many research groups around the world. These publications have made a direct impact on current and future developments in machine condition monitoring. However, there has been no collection of works, spanning both established and recently developed methods, devoted to the field of condition monitoring for rotating machines using vibration signals. As this field is still developing, such a book cannot be definitive or complete. Nevertheless, this book attempts to bring many techniques together in one place, providing a complete guide from the basics of rotating machines to the generation of knowledge using vibration signals. It introduces rotating machines and the vibration signals they produce at a level that can be easily understood by postgraduate students, researchers, and practicing engineers, and equips such readers with the basic knowledge needed to appreciate the specific applications of the methods in this book.
Based on the stages of the machine condition monitoring framework, and with the aim of designing effective techniques for detecting and classifying faults in rotating machines, a major part of the book covers various feature‐extraction, feature‐selection, and feature‐classification methods as well as their applications to machine vibration datasets. Moreover, this book presents the latest methods, including machine learning and compressive sampling, which offer significant improvements in accuracy with reduced computational costs. It is important that these be made available to researchers, practitioners, and newcomers to the field, to help improve safety, reliability, and performance. Although this is not intended to be a textbook, examples and case studies using vibration data are given throughout to show the use and application of the included methods in monitoring the condition of rotating machines.
The layout of the book is as follows:
Chapter 1 offers an introduction to machine condition monitoring and its application in condition‐based maintenance. The chapter explains the importance of machine condition monitoring and its use in various rotating machine applications, machine maintenance approaches, and machine condition monitoring techniques that can be used to identify machine health conditions.
Chapter 2 is concerned with the principles of rotating machine vibration and acquisition techniques. The first part of this chapter is a presentation of the basics of vibration, vibration signals produced by rotating machines, and types of vibration signals. The second part is concerned with vibration data acquisition techniques and highlights the advantages and limitations of vibration signals.
Chapter 3 introduces signal processing in the time domain by giving an explanation of mathematical and statistical functions and other advanced techniques that can be used to extract basic signal information from time‐indexed raw vibration signals that can sufficiently represent machine health conditions.
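As a flavour of the time‐domain indicators in question, the short sketch below (a minimal illustration using standard definitions, not code from the book) computes a few widely used statistical features, namely RMS, peak, crest factor, and kurtosis, from a raw vibration signal:

```python
import math

def time_domain_features(signal):
    """Basic statistical condition indicators from a raw vibration signal."""
    n = len(signal)
    mean = sum(signal) / n
    # Root mean square: overall vibration energy level
    rms = math.sqrt(sum(x * x for x in signal) / n)
    # Peak and crest factor: sensitive to impulsive faults
    peak = max(abs(x) for x in signal)
    crest_factor = peak / rms
    # Kurtosis (normalised fourth central moment): impulsiveness indicator;
    # approximately 3 for a Gaussian-like (healthy) signal
    var = sum((x - mean) ** 2 for x in signal) / n
    kurtosis = sum((x - mean) ** 4 for x in signal) / n / var ** 2
    return {"rms": rms, "peak": peak,
            "crest_factor": crest_factor, "kurtosis": kurtosis}

# Example: a 50 Hz sine sampled at 1 kHz for one second
sine = [math.sin(2 * math.pi * 50 * t / 1000) for t in range(1000)]
feats = time_domain_features(sine)
```

For a pure sine wave the crest factor is √2 and the kurtosis is 1.5; impulsive bearing faults typically push both indicators well above these baseline values.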
Chapter 4 presents signal processing in the frequency domain, which has the ability to extract information based on frequency characteristics that are not easy to observe in the time domain. The first part describes the Fourier transform, the most commonly used signal‐transformation technique, which allows one to transform the time domain signal to the frequency domain. In addition, this chapter gives an explanation of different techniques that can be used to extract various frequency spectrum features that can more efficiently represent machine health conditions.
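To give a concrete sense of this kind of analysis, the sketch below (an illustration using the textbook DFT definition, not code from the book; a practical implementation would use an FFT routine) locates the dominant frequency of a sampled signal:

```python
import cmath
import math

def dft_magnitudes(signal):
    """One-sided magnitude spectrum via the naive O(n^2) DFT.
    Adequate for illustration; practical code would use an FFT."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]

def dominant_frequency(signal, fs):
    """Frequency (Hz) of the largest spectral peak, ignoring the DC bin."""
    mags = dft_magnitudes(signal)
    k = max(range(1, len(mags)), key=lambda i: mags[i])
    return k * fs / len(signal)

# A 50 Hz tone sampled at fs = 1 kHz for 200 samples (exactly 10 cycles,
# so the peak falls cleanly into a single DFT bin)
fs = 1000
sig = [math.sin(2 * math.pi * 50 * t / fs) for t in range(200)]
# dominant_frequency(sig, fs) -> 50.0
```

The frequency resolution is fs/n; when the record does not contain a whole number of cycles, the peak energy leaks into neighbouring bins, which is why windowing is used in practice.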
Chapter 5 introduces signal processing in the time‐frequency domain and explains several techniques for examining the time‐frequency characteristics of time‐indexed signals; these characteristics can be captured more effectively than with the Fourier transform and its corresponding frequency spectrum features.
Chapter 6 is concerned with vibration‐based machine condition monitoring using machine learning algorithms. The first part of this chapter gives an overview of the vibration‐based machine condition monitoring process, and describes the fault detection and diagnosis framework and the types of learning that can be applied to vibration data. The second part defines the main problems of learning from vibration data for the purpose of fault diagnosis and describes techniques to prepare vibration data for analysis to overcome the aforementioned problems.
Chapter 7 presents common, appropriate methods for linear subspace learning that can be used to reduce a large amount of collected vibration data to a few dimensions without significant loss of information.
Chapter 8 introduces common, suitable methods for nonlinear subspace learning that can be used to reduce a large amount of collected vibration data to a few dimensions without significant loss of information.
Chapter 9 introduces generally applicable methods for selecting the most important features, i.e. those that effectively represent the original feature set. It also explains feature ranking and feature subset selection techniques.
Chapter 10 is concerned with the basic theory of the decision tree as a diagnosis tool, its data structure, the ensemble model that combines decision trees into a decision forest, and their applications in diagnosing machine faults.
Chapter 11 is devoted to a description of two probabilistic models for classification: (i) the hidden Markov model (HMM) as a probabilistic generative model, and (ii) the logistic regression model and generalised logistic regression model, also called multinomial logistic regression or multiple logistic regression, as probabilistic discriminative models, and their applications in diagnosing machine faults.
Chapter 12 begins with a discussion of the basic principles of the learning method known as artificial neural networks (ANNs). Then, the chapter describes three different types of ANNs (i.e. multi‐layer perceptron, radial basis function network, and Kohonen network), which can be used for fault classification. In addition, the applications of these methods in diagnosing machine faults are described.
Chapter 13 presents the support vector machine (SVM) classifier, by giving a brief description of the basic idea of the SVM model for binary classification problems. Then, the chapter explains the multiclass SVM approach and the different techniques that can be used for multiclass SVMs. Examples of their applications in diagnosing machine faults are provided.
Chapter 14 describes recent trends of deep learning in the field of machine condition monitoring and provides an explanation of commonly used techniques and examples of their applications in diagnosing machine faults.
Chapter 15 provides an overview of the efficacy of the classification algorithms introduced in this book. This chapter describes different validation techniques that can be used to validate the efficacy of classification algorithms in terms of classification results.
Chapter 16 presents new feature‐learning frameworks based on compressive sampling and subspace learning techniques for machine condition monitoring. The chapter starts with a concise exposition of the basic theory of compressive sampling and shows how to perform compressive sampling for sparse frequency representations and sparse time‐frequency representations. Then, the chapter presents an overview of compressive sampling in machine condition monitoring. The second part of the chapter describes three frameworks based on compressive sampling and presents different method implementations based on these frameworks. In the third part, two case studies and applications of these methods to different classes of machine health conditions are considered.
Chapter 17 presents an original framework combining compressive sampling and a deep neural network based on a sparse autoencoder. Overcomplete features with a different number of hidden layers in the deep neural network are considered in the application of this method to different classes of machine health conditions, using the same two case studies as Chapter 16.
Chapter 18 provides conclusions and recommendations for the application of the different methods studied in this book. These will benefit practitioners and researchers involved in the field of vibration‐based machine condition monitoring.
This book is up‐to‐date and covers many techniques used for machine condition monitoring, including recently developed methods. It also presents new methods, including machine learning and compressive sampling, which cover various topics of current research interest. In addition to the material in the chapters, the appendix provides publicly accessible software for most of the techniques introduced in this book, together with links to publicly available vibration datasets.
A work of this magnitude will unfortunately contain errors and omissions. We would like to take this opportunity to apologise unreservedly for all such indiscretions in advance. We would welcome comments and corrections; please send them by email to a.k.nandi@ieee.org or by any other means.
February 2019
Hosameldin Ahmed and Asoke K. Nandi
London, UK
Hosameldin Ahmed received the B.Sc. (Hons.) degree in Engineering Technology, with specialisation in Electronic Engineering, from the Faculty of Science and Technology, University of Gezira, Sudan, in 1999, and the M.Sc. degree in Computer Engineering and Networks from the University of Gezira, Sudan, in 2010. He recently received the Ph.D. degree in Electronic and Computer Engineering from Brunel University London, UK. Since 2014, he has been working with his supervisor, Professor Asoke K. Nandi, in the area of machine condition monitoring. Their collaboration has made several contributions to the advancement of vibration‐based machine condition monitoring using compressive sampling and modern machine learning algorithms. His work has been published in high‐quality journals and international conferences. His research interests lie in the areas of signal processing, compressive sampling, and machine learning with application to vibration‐based machine condition monitoring.
Asoke K. Nandi received the degree of Ph.D. in Physics from the University of Cambridge (Trinity College), Cambridge, UK. He has held academic positions in several universities, including Oxford, Imperial College London, Strathclyde, and Liverpool as well as the Finland Distinguished Professorship in Jyvaskyla (Finland). In 2013, he moved to Brunel University London, to become the Chair and Head of Electronic and Computer Engineering. Professor Nandi is a Distinguished Visiting Professor at Tongji University (China) and an Adjunct Professor at the University of Calgary (Canada).
In 1983, Professor Nandi jointly discovered the three fundamental particles known as W+, W−, and Z0, providing the evidence for the unification of the electromagnetic and weak forces, for which the Nobel Committee for Physics in 1984 awarded the prize to his two team leaders for their decisive contributions. His current research interests lie in the areas of signal processing and machine learning, with applications to communications, gene expression data, functional magnetic resonance data, machine condition monitoring, and biomedical data. He has made many fundamental theoretical and algorithmic contributions to many aspects of signal processing and machine learning. He has much expertise in ‘Big Data’, dealing with heterogeneous data, and extracting information from multiple datasets obtained in different laboratories and at different times. Professor Nandi has authored over 590 technical publications, including 240 journal papers, as well as 4 books: Automatic Modulation Classification: Principles, Algorithms and Applications (Wiley, 2015), Integrative Cluster Analysis in Bioinformatics (Wiley, 2015), Blind Estimation Using Higher‐Order Statistics (Springer, 1999), and Automatic Modulation Recognition of Communications Signals (Springer, 1996). The h‐index of his publications is 73 (Google Scholar) and his Erdős number is 2.
Professor Nandi is a Fellow of the Royal Academy of Engineering (UK) and of seven other institutions. Among the many awards he has received are the Institute of Electrical and Electronics Engineers (USA) Heinrich Hertz Award in 2012; the Glory of Bengal Award for his outstanding achievements in scientific research in 2010; an award from the Society for Machinery Failure Prevention Technology, a division of the Vibration Institute (USA), in 2000; the Water Arbitration Prize of the Institution of Mechanical Engineers (UK) in 1999; and the Mountbatten Premium of the Institution of Electrical Engineers (UK) in 1998. Professor Nandi is an IEEE Distinguished Lecturer (2018–2019).
AANN | Auto‐associative neural network |
ACO | Ant colony optimisation |
ADC | Analog to digital converter |
AE | Acoustic emission |
AFSA | Artificial fish swarm algorithm |
AI | Artificial intelligence |
AIC | Akaike's information criterion |
AID | Automatic interaction detector |
AM | Amplitude modulation |
ANC | Adaptive noise cancellation |
ANFIS | Adaptive neuro‐fuzzy inference system |
ANN | Artificial neural network |
ANNC | Adaptive nearest neighbour classifier |
AR | Autoregressive |
ARIMA | Autoregressive integrated moving average |
ARMA | Autoregressive moving average |
ART2 | Adaptive resonance theory‐2 |
AUC | Area under a ROC curve |
BFDF | Bearing fundamental defect frequency |
BPFI | Bearing pass frequency of inner race |
BPFO | Bearing pass frequency of outer race |
BPNN | Backpropagation neural network |
BS | Binary search |
BSF | Ball spin frequency |
BSS | Blind source separation |
CAE | Contractive autoencoder |
CART | Classification and regression tree |
CBLSTM | Convolutional bi‐directional long short‐term memory |
CBM | Condition‐based maintenance |
CBR | Case‐based reasoning |
CCA | Canonical correlation analysis |
CDF | Characteristic defect frequency |
CF | Crest factor |
CFT | Continuous Fourier transform |
CHAID | Chi‐square automatic interaction detector |
Chi‐2 | Chi‐squared |
cICA | Constrained independent component analysis |
CLF | Clearance factor |
CM | Condition monitoring |
CMF | Combined mode function |
CMFE | Composite multiscale fuzzy entropy |
CNN | Convolutional neural network |
CoSaMP | Compressive sampling matching pursuit |
CS‐Chi‐2 | Compressive sampling and Chi‐square feature selection algorithm |
CS‐CMDS | Compressive sampling and classical multidimensional scaling |
CS‐CPDC | Compressive sampling and correlated principal and discriminant components |
CS‐FR | Compressive sampling and feature ranking |
CS‐FS | Compressive sampling and Fisher score |
CS‐GSN | Compressive sampling and GMST, SPE, and neighbourhood component analysis |
CS‐KLDA | Compressive sampling and kernel linear discriminant analysis algorithm |
CS‐KPCA | Compressive sampling and kernel principal component analysis method |
CS‐LDA | Compressive sampling and linear discriminant analysis method |
CS‐LS | Compressive sampling and Laplacian score |
CS‐LSL | Compressive sampling and linear subspace learning |
CS‐NLSL | Compressive sampling and nonlinear subspace learning |
CS‐PCA | Compressive sampling and principal component analysis |
CS‐PCC | Compressive sampling and Pearson correlation coefficients |
CS‐Relief‐F | Compressive sampling and Relief‐F algorithm |
CS‐SAE‐DNN | Compressive sampling and sparse autoencoder‐based deep neural network |
CS‐SPE | Compressive sampling and stochastic proximity embedding |
CVM | Cross‐validation method |
CWT | Continuous wavelet transform |
DAG | Directed acyclic graph |
DBN | Deep belief network |
DDMA | Discrete diffusion maps analysis |
DFA | Detrended‐fluctuation analysis |
DFT | Discrete Fourier transform |
DIFS | Difference signal |
DM | Diffusion map |
DNN | Deep neural network |
DPCA | Dynamic principal component analysis |
DRFF | Deep random forest fusion |
DT | Decision tree |
DTCWPT | Dual‐tree complex wavelet packet transform |
DWT | Discrete wavelet transform |
EBP | Error backpropagation |
EDAE | Ensemble deep autoencoder |
EEMD | Ensemble empirical mode decomposition |
ELM | Extreme learning machine |
ELU | Exponential linear unit |
EMA | Exponential moving average |
EMD | Empirical mode decomposition |
ENT | Entropy |
EPGS | Electrical power generation and storage |
EPSO | Enhanced particle swarm optimisation |
ESVM | Ensemble support vector machine |
FC‐WTA | Fully connected winner‐take‐all autoencoder |
FDA | Fisher discriminant analysis |
FDK | Frequency domain kurtosis |
FFNN | Feedforward neural network |
FFT | Fast Fourier transform |
FHMM | Factorial hidden Markov model |
FIR | Finite impulse response |
FKNN | Fuzzy k‐nearest neighbour |
FM | Frequency modulation |
FMM | Fuzzy min‐max |
FR | Feature ranking |
Fs | Sampling frequency |
FS | Fisher score |
FSVM | Fuzzy support vector machine |
FTF | Fundamental train frequency |
GA | Genetic algorithm |
GMM | Gaussian mixture model |
GMST | Geodesic minimal spanning tree |
GP | Genetic programming |
GR | Gain ratio |
GRU | Gated recurrent unit |
HE | Hierarchical entropy |
HFD | Higher‐frequency domain |
HHT | Hilbert‐Huang transform |
HIST | Histogram |
HLLE | Hessian‐based local linear embedding |
HMM | Hidden Markov model |
HOC | Higher‐order cumulant |
HOM | Higher‐order moment |
HOS | Higher‐order statistics |
HT | Hilbert transform |
ICA | Independent component analysis |
ICDSVM | Inter‐cluster distance support vector machine |
ID3 | Iterative Dichotomiser 3 |
I‐ESLLE | Incremental enhanced supervised locally linear embedding |
IF | Impulse factor |
IG | Information gain |
IGA | Immune genetic algorithm |
IIR | Infinite impulse response |
IMF | Intrinsic mode function |
IMFE | Improved multiscale fuzzy entropy |
IMPE | Improved multiscale permutation entropy |
ISBM | Improved slope‐based method |
ISOMAP | Isometric feature mapping |
KA | Kernel Adatron |
KCCA | Kernel canonical correlation analysis |
KFCM | Kernel fuzzy c‐means |
KICA | Kernel independent component analysis |
K‐L | Kullback–Leibler divergence |
KLDA | Kernel linear discriminant analysis |
KNN | Kohonen neural network |
k‐NN | k‐nearest neighbours |
KPCA | Kernel principal component analysis |
KURT | Kurtosis |
LB | Lower bound |
LCN | Local connection network |
LDA | Linear discriminant analysis |
LE | Laplacian eigenmap |
Lh | Likelihood |
LLE | Local linear embedding |
LMD | Local mean decomposition |
LOOCV | Leave‐one‐out cross‐validation |
LPP | Locality preserving projection |
LR | Logistic regression |
LRC | Logistic regression classifier |
LS | Laplacian score |
LSL | Linear subspace learning |
LSSVM | Least‐square support vector machine |
LSTM | Long short‐term memory |
LTSA | Local tangent space alignment |
MA | Moving average |
MCCV | Monte Carlo cross‐validation |
MCE | Minimum classification error |
MCM | Machine condition monitoring |
MDS | Multidimensional scaling |
MED | Minimum entropy deconvolution |
MEISVM | Multivariable ensemble‐based incremental support vector machine |
MF | Margin factor |
MFB | Modulated filter‐bank structure |
MFD | Multi‐scale fractal dimension |
MFE | Multi‐scale fuzzy entropy |
MHD | Multilayer hybrid denoising |
MI | Mutual information |
MLP | Multilayer perceptron |
MLR | Multinomial logistic regression |
MLRC | Multinomial logistic regression classifier |
MMV | Multiple measurement vectors |
MRA | Multiresolution analysis |
MRF | Markov random field |
MSE | Multiscale entropy |
MSE | Mean square error |
MVU | Maximum variance unfolding |
NCA | Neighbourhood component analysis |
NILES | Nonlinear estimation by iterative least square |
NIPALS | Nonlinear iterative partial least squares |
NLSL | Nonlinear subspace learning |
NN | Neural network |
Nnl | Normal negative log‐likelihood |
NNR | Nearest neighbour rule |
NRS | Neighbourhood rough set |
NSAE | Normalised sparse autoencoder |
O&M | Operation and maintenance |
OLS | Ordinary least squares |
OMP | Orthogonal matching pursuit |
ONPE | Orthogonal neighbourhood preserving embedding |
ORDWT | Overcomplete rational dilation discrete wavelet transform |
ORT | Orthogonal criterion |
OSFCM | Optimal supervised fuzzy C‐means clustering |
OSLLTSA | Orthogonal supervised local tangent space alignment analysis |
PCA | Principal component analysis |
PCC | Pearson correlation coefficient |
PCHI | Piecewise cubic Hermite interpolation |
PDF | Probability density function |
PF | Product function |
PHM | Prognostics and health management |
PLS | Partial least squares |
PLS‐PM | Partial least squares path modelling |
PLS‐R | Partial least squares regression |
PNN | Probabilistic neural network |
p–p | Peak to peak |
PReLU | Parametric rectified linear unit |
PSO | Particle swarm optimisation |
PSVM | Proximal support vector machine |
PWVD | Pseudo Wigner‐Ville distribution |
QP | Quadratic programming |
QPSO‐LSSVM | Quantum‐behaved particle swarm optimisation least‐square support vector machine |
RBF | Radial basis function |
RBM | Restricted Boltzmann machine |
RCMFE | Refined composite multi‐scale fuzzy entropy |
ReLU | Rectified linear unit |
RES | Residual signal |
RF | Random forest |
RFE | Recursive feature elimination |
RIP | Restricted isometry property |
RL | Reinforcement learning |
RMS | Root mean square |
RMSE | Root mean square error |
RNN | Recurrent neural network |
ROC | Receiver operating characteristic |
RPM | Revolutions per minute |
RSA | Rescaled range analysis |
RSGWPT | Redundant second‐generation wavelet packet transform |
RUL | Remaining useful life |
RVM | Relevance vector machine |
SAE | Sparse autoencoder |
S‐ANC | Self‐adaptive noise cancellation |
SBFS | Sequential backward floating selection |
SBS | Sequential backward selection |
SCADA | Supervisory control and data acquisition system |
SCG | Scaled conjugate gradient |
SDA | Stacked denoising autoencoder |
SDE | Semidefinite embedding |
SDOF | Single degree of freedom |
SDP | Semidefinite programming |
SELTSA | Supervised extended local tangent space alignment |
SF | Shape factor |
SFFS | Sequential forward floating selection |
SFS | Sequential forward selection |
SGWD | Second generation wavelet denoising |
SIDL | Shift‐invariant dictionary learning |
SILTSA | Supervised incremental local tangent space alignment |
SK | Skewness |
SK | Spectral kurtosis |
S‐LLE | Statistical local linear embedding |
SLLTA | Supervised learning local tangent space alignment |
SM | Sammon mapping |
SMO | Sequential minimal optimisation |
SMV | Single measurement vector |
SNR | Signal‐to‐noise ratio |
SOM | Self‐organising map |
SP | Subspace pursuit |
SpaEIAD | Sparse extraction of impulse by adaptive dictionary |
SPE | Stochastic proximity embedding |
SPWVD | Smoothed pseudo Wigner‐Ville distribution |
SSC | Slope sign change |
STD | Standard deviation |
STE | Standard error |
STFT | Short‐time Fourier transform |
STGS | Steam turbine‐generators |
StOMP | Stagewise orthogonal matching pursuit |
SU‐LSTM | Stacked unidirectional long short‐term memory |
SVD | Singular value decomposition |
SVDD | Support vector domain description |
SVM | Support vector machine |
SVR | Support vector regression |
SWSVM | Shannon wavelet support vector machine |
TAWS | Time average wavelet spectrum |
TBM | Time‐based maintenance |
TDIDT | Top‐down induction on decision trees |
TEO | Teager energy operator |
TSA | Time synchronous average |
UB | Upper bound |
VKF | Vold‐Kalman filter |
VMD | Variational mode decomposition |
VPMCD | Variable predictive model‐based class discrimination |
VR | Variance |
WA | Willison amplitude |
WD | Wigner distribution |
WFE | Waveform entropy |
WKLFDA | Wavelet kernel function and local Fisher discriminant analysis |
WL | Wavelength |
WPA | Wavelet packet analysis |
WPE | Wavelet packet energy |
WPT | Wavelet packet transform |
WSN | Wireless sensor network |
WSVM | Wave support vector machine |
WT | Wavelet transform |
WTD | Wavelet thresholding denoising |
WVD | Wigner‐Ville distribution |
ZC | Zero crossing |