Journal articles on the topic 'Lempel-Ziv complexity method'

Consult the top 50 journal articles for your research on the topic 'Lempel-Ziv complexity method.'
1

Du, Jianxi, Lingli Cui, Jianyu Zhang, Jin Li, and Jinfeng Huang. "The Method of Quantitative Trend Diagnosis of Rolling Bearing Fault Based on Protrugram and Lempel–Ziv." Shock and Vibration 2018 (November 1, 2018): 1–8. http://dx.doi.org/10.1155/2018/4303109.

Abstract:
This paper proposes a new method to realize quantitative trend diagnosis of bearings based on the Protrugram and Lempel–Ziv complexity. Firstly, the fault features of the original fault signals of the bearing inner and outer races with different severities are extracted using the Protrugram algorithm, and the optimal analysis frequency band reflecting the fault characteristics is selected. Then, the Lempel–Ziv complexity of that frequency band is calculated. Finally, the relationship between Lempel–Ziv complexity and fault size is obtained. The analysis results show that fault severity is proportional to the Lempel–Ziv complexity index for the different fault types, and the Lempel–Ziv complexity follows a distinct trend rule for the inner and outer races respectively, realizing quantitative trend diagnosis of bearing faults.
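Most of the entries below quantify signals with the 1976 Lempel–Ziv phrase count. As background, here is a minimal sketch (not drawn from any of the cited papers) of the Kaspar–Schuster counting algorithm, together with the usual binary normalization c(n)·log2(n)/n; it assumes the signal has already been binarized, for example against its median:

```python
import math

def lz76_phrase_count(s: str) -> int:
    """Kaspar-Schuster count of phrases in the exhaustive LZ76 parse of s."""
    n = len(s)
    i, u, v, vmax, c = 0, 1, 1, 1, 1
    while u + v <= n:
        if s[i + v - 1] == s[u + v - 1]:
            v += 1                      # current candidate still matches history
        else:
            vmax = max(v, vmax)
            i += 1
            if i == u:                  # no earlier start extends the match
                c += 1                  # close the phrase ...
                u += vmax               # ... and move past it
                i, v, vmax = 0, 1, 1
            else:
                v = 1
    if v != 1:
        c += 1                          # an unfinished final phrase still counts
    return c

def normalized_lzc(s: str) -> float:
    """Normalized complexity c(n) / (n / log2 n) for a binary string."""
    n = len(s)
    return lz76_phrase_count(s) * math.log2(n) / n

# Kaspar and Schuster's worked example parses into 6 phrases:
# 0 | 001 | 10 | 100 | 1000 | 101
print(lz76_phrase_count("0001101001000101"))  # -> 6
```

Individual papers differ mainly in how the raw signal is symbolized and normalized before this count is taken.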
2

Han, Bing, Shun Wang, Qingqi Zhu, Xiaohui Yang, and Yongbo Li. "Intelligent Fault Diagnosis of Rotating Machinery Using Hierarchical Lempel-Ziv Complexity." Applied Sciences 10, no. 12 (2020): 4221. http://dx.doi.org/10.3390/app10124221.

Abstract:
Health condition monitoring of rotating machinery can avoid disastrous failures and guarantee safe operation. Vibration-based fault diagnosis is the most attractive approach to fault diagnosis of rotating machinery (FDRM). Recently, Lempel-Ziv complexity (LZC) has been investigated as an effective tool for FDRM. However, LZC performs only single-scale analysis, which is not suitable for extracting the fault features embedded in a vibration signal over multiple scales. In this paper, a novel complexity analysis algorithm, called hierarchical Lempel-Ziv complexity (HLZC), was developed to extract the fault characteristics of rotating machinery. The proposed HLZC method considers the fault information hidden in both low-frequency and high-frequency components, resulting in more accurate fault feature extraction. The superiority of the proposed HLZC method in detecting periodical impulses was validated using simulated signals. Meanwhile, two experimental signals were utilized to prove the effectiveness of the proposed HLZC method in extracting fault information. Results show that the proposed HLZC method had the best diagnostic performance compared with the LZC and multiscale Lempel-Ziv complexity methods.
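In the related hierarchical-entropy literature, hierarchical schemes like the one this abstract describes are usually built by recursively applying an averaging operator (low-frequency part) and a differencing operator (high-frequency part). A sketch of that decomposition follows, under the assumption that HLZC uses this standard construction; an LZC estimator would then be applied to each resulting node:

```python
def hierarchical_nodes(x, levels):
    """Recursively split x into low (pairwise average) and high (pairwise
    difference) halves; returns all 2**levels leaf components."""
    nodes = [list(x)]
    for _ in range(levels):
        nxt = []
        for node in nodes:
            pairs = list(zip(node[0::2], node[1::2]))
            nxt.append([(a + b) / 2 for a, b in pairs])  # low-frequency part
            nxt.append([(a - b) / 2 for a, b in pairs])  # high-frequency part
        nodes = nxt
    return nodes

comps = hierarchical_nodes(range(16), 2)
# 4 components, each of length 4
```

Unlike plain multiscale coarse-graining, which keeps only the averaged (low-frequency) branch, this tree retains the difference branches, which is how the method keeps high-frequency fault information.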
3

Tang, Youfu, Feng Lin, and Qian Zou. "Complexity Analysis of Time-Frequency Features for Vibration Signals of Rolling Bearings Based on Local Frequency." Shock and Vibration 2019 (July 10, 2019): 1–13. http://dx.doi.org/10.1155/2019/7190568.

Abstract:
The multisource impact signals of rolling bearings often exhibit nonlinear and nonstationary characteristics, and a quantitative description of their complexity is difficult to obtain with traditional spectrum analysis methods. In this study, firstly, a novel concept of local frequency is defined to overcome the limitations of the traditional notion of frequency. Then, an adaptive waveform decomposition method is proposed to extract the time-frequency features of nonstationary signals with multiple components. Finally, the normalized Lempel–Ziv complexity method is applied to quantitatively measure the time-frequency features of vibration signals of rolling bearings. The results indicate that the time-frequency features extracted by the proposed method have clear physical meanings and can accurately distinguish the different fault states of rolling bearings. Furthermore, the normalized Lempel–Ziv complexity method can quantitatively measure the nonlinearity of the multisource impact signal, supplying an effective basis for fault diagnosis of rolling bearings.
4

Zhao, Huan, Gangjin Wang, Cheng Xu, and Fei Yu. "Voice activity detection method based on multivalued coarse-graining Lempel-Ziv complexity." Computer Science and Information Systems 8, no. 3 (2011): 869–88. http://dx.doi.org/10.2298/csis100906032z.

Abstract:
One of the key issues in practical speech processing is to locate precisely the endpoints of the input utterance so as to be free of nonspeech regions. Although many studies have addressed this problem, the performance of existing voice activity detection (VAD) algorithms is still far from ideal. This paper proposes a novel robust feature for VAD based on multi-valued coarse-graining Lempel-Ziv complexity (MLZC), an improvement on the binary coarse-graining Lempel-Ziv complexity (BLZC). In addition, we use the fuzzy c-means clustering algorithm and the Bayesian information criterion to estimate the thresholds of the MLZC characteristic, and adopt a dual-threshold method for VAD. Experimental results on the TIMIT continuous speech database show that in low-SNR environments, the detection performance of the proposed MLZC method is superior to the VAD of GSM AMR, G.729, and the BLZC method.
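The multi-valued coarse-graining step replaces the binary thresholding of BLZC with quantization into several amplitude levels before the complexity count. A hypothetical sketch using equal-width bins follows; the paper's exact quantizer (and its clustering-derived thresholds) may differ:

```python
def coarse_grain_symbols(x, m=4):
    """Map samples to m equal-width amplitude levels; return a symbol string."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / m or 1.0          # guard against a constant signal
    return "".join(str(min(int((v - lo) / width), m - 1)) for v in x)

print(coarse_grain_symbols([0.0, 0.3, 0.6, 1.0], m=2))  # -> "0011"
```

The resulting symbol string can then be fed to any Lempel-Ziv phrase counter; with m = 2 this reduces to the binary coarse-graining case.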
5

Wang, Haobo, Tongguang Yang, Qingkai Han, and Zhong Luo. "Approach to the Quantitative Diagnosis of Rolling Bearings Based on Optimized VMD and Lempel–Ziv Complexity under Varying Conditions." Sensors 23, no. 8 (2023): 4044. http://dx.doi.org/10.3390/s23084044.

Abstract:
The quantitative diagnosis of rolling bearings is essential to automating maintenance decisions. In recent years, Lempel–Ziv complexity (LZC) has been widely used for the quantitative assessment of mechanical failures, as one of the most valuable indicators for detecting dynamic changes in nonlinear signals. However, LZC relies on a binary 0–1 conversion, which can easily lose effective information from the time series and cannot fully mine the fault characteristics. Additionally, the immunity of LZC to noise cannot be ensured, and it is difficult to quantitatively characterize the fault signal under strong background noise. To overcome these limitations, a quantitative bearing fault diagnosis method based on optimized variational mode decomposition Lempel–Ziv complexity (VMD-LZC) was developed to fully extract the vibration characteristics and quantitatively characterize bearing faults under variable operating conditions. First, to compensate for the fact that the main parameters of variational mode decomposition (VMD) must otherwise be selected by human experience, a genetic algorithm (GA) is used to optimize the VMD parameters and adaptively determine the optimal parameters [k, α] for the bearing fault signal. Furthermore, the IMF components that contain the maximum fault information are selected for signal reconstruction based on kurtosis. The Lempel–Ziv index of the reconstructed signal is calculated and then weighted and summed to obtain a composite Lempel–Ziv index. The experimental results show that the proposed method is of high application value for the quantitative assessment and classification of bearing faults in turbine rolling bearings under various operating conditions, such as mild and severe crack faults and variable loads.
6

Xiao, Leilei. "A New Feature Extraction Method of Marine Ambient Noise Based on Multiscale Dispersion Entropy." Mathematical Problems in Engineering 2022 (October 8, 2022): 1–11. http://dx.doi.org/10.1155/2022/7618380.

Abstract:
Marine ambient noise (AN) is a nonlinear and unstable signal, and traditional dispersion entropy can analyze it only at a single scale, which easily causes a loss of information. To address this problem, we introduce multiscale dispersion entropy (MDE) and propose a new feature extraction method for marine ambient noise based on MDE. We used MDE, multiscale permutation entropy (MPE), multiscale permutation Lempel–Ziv complexity (MPLZC), and multiscale dispersion Lempel–Ziv complexity (MDLZC) to carry out feature extraction and classification recognition experiments on six ANs. The experimental results show that for the feature extraction methods based on MDE, MPE, MDLZC, and MPLZC, the feature extraction effect improves and the average recognition rate (ARR) rises as the number of features increases; compared with the other three methods, the feature extraction method based on MDE has the best feature extraction effect and the highest ARR for the six ANs under the same number of features.
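The multiscale variants compared in entries like this one share the same coarse-graining front end: at scale factor τ the series is replaced by the means of non-overlapping windows of length τ, and the chosen measure (dispersion entropy, permutation entropy, LZC, and so on) is computed at each scale. A minimal sketch of that front end:

```python
def coarse_grain(x, tau):
    """Means of consecutive non-overlapping windows of length tau."""
    n = len(x) // tau                     # trailing partial window is dropped
    return [sum(x[i * tau:(i + 1) * tau]) / tau for i in range(n)]

print(coarse_grain([1, 2, 3, 4, 5, 6], 2))  # -> [1.5, 3.5, 5.5]
```

Note that the coarse-grained series at scale τ is only 1/τ as long as the original, which is the shrinking-data problem that several of the entries below (e.g. the variable-step multiscale fusion method) set out to mitigate.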
7

AHMED, SULTAN UDDIN, MD SHAHJAHAN, and KAZUYUKI MURASE. "A LEMPEL-ZIV COMPLEXITY-BASED NEURAL NETWORK PRUNING ALGORITHM." International Journal of Neural Systems 21, no. 05 (2011): 427–41. http://dx.doi.org/10.1142/s0129065711002936.

Abstract:
This paper presents a pruning method for artificial neural networks (ANNs) based on the 'Lempel-Ziv complexity' (LZC) measure. We call this method the 'silent pruning algorithm' (SPA). The term 'silent' is used in the sense that SPA prunes ANNs without causing much disturbance during the network training. SPA prunes hidden units during the training process according to their ranks computed from LZC. LZC extracts the number of unique patterns in a time sequence obtained from the output of a hidden unit and a smaller value of LZC indicates higher redundancy of a hidden unit. SPA has a great resemblance to biological brains since it encourages higher complexity during the training process. SPA is similar to, yet different from, existing pruning algorithms. The algorithm has been tested on a number of challenging benchmark problems in machine learning, including cancer, diabetes, heart, card, iris, glass, thyroid, and hepatitis problems. We compared SPA with other pruning algorithms and we found that SPA is better than the 'random deletion algorithm' (RDA) which prunes hidden units randomly. Our experimental results show that SPA can simplify ANNs with good generalization ability.
8

Şener, Somay Kübra, and Emine Doğru Bolat. "Sleep-Apnea Detection with the Lempel-Ziv Complexity Analysis of the Electrocardiogram and Respiratory Signals." Euroasia Journal of Mathematics, Engineering, Natural & Medical Sciences 9, no. 25 (2022): 109–20. https://doi.org/10.5281/zenodo.7474702.

Abstract:
Sleep apnea is a common and life-threatening disease, and its diagnosis is as important as its treatment. A remarkable increase is observed in the number of diagnosed patients with growing public awareness and a higher rate of recognition by physicians. The polysomnography measurements used to diagnose sleep apnea disturb the patient and require collecting multiple physiological signals. Due to such problems, new analysis methods are being investigated. Since Lempel-Ziv is a fast, nonlinear signal processing method, it is well suited to processing physiological data. By using the Lempel-Ziv complexity method, the aim is to diagnose the disease in less time and with less data, bringing the treatment process forward. Disease detection studies were carried out using ECG and respiratory data from the PhysioNet database. The analyses showed a significant difference in the apneic time intervals of the ECG, chest respiration (Resp C), and abdominal respiration (Resp A) data. With this method, sleep apnea can be detected from ECG, Resp C, and Resp A.
9

Yan, Xiaoan, Daoming She, Yadong Xu, and Minping Jia. "Application of Generalized Composite Multiscale Lempel–Ziv Complexity in Identifying Wind Turbine Gearbox Faults." Entropy 23, no. 11 (2021): 1372. http://dx.doi.org/10.3390/e23111372.

Abstract:
Wind turbine gearboxes operate in harsh environments; therefore, the resulting gear vibration signal has characteristics of strong nonlinearity, is non-stationary, and has a low signal-to-noise ratio, which indicates that it is difficult to identify wind turbine gearbox faults effectively by the traditional methods. To solve this problem, this paper proposes a new fault diagnosis method for wind turbine gearboxes based on generalized composite multiscale Lempel–Ziv complexity (GCMLZC). Within the proposed method, an effective technique named multiscale morphological-hat convolution operator (MHCO) is firstly presented to remove the noise interference information of the original gear vibration signal. Then, the GCMLZC of the filtered signal was calculated to extract gear fault features. Finally, the extracted fault features were input into softmax classifier for automatically identifying different health conditions of wind turbine gearboxes. The effectiveness of the proposed method was validated by the experimental and engineering data analysis. The results of the analysis indicate that the proposed method can identify accurately different gear health conditions. Moreover, the identification accuracy of the proposed method is higher than that of traditional multiscale Lempel–Ziv complexity (MLZC) and several representative multiscale entropies (e.g., multiscale dispersion entropy (MDE), multiscale permutation entropy (MPE) and multiscale sample entropy (MSE)).
10

Amigó, José M., Janusz Szczepański, Elek Wajnryb, and Maria V. Sanchez-Vives. "Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity." Neural Computation 16, no. 4 (2004): 717–36. http://dx.doi.org/10.1162/089976604322860677.

Abstract:
Normalized Lempel-Ziv complexity, which measures the generation rate of new patterns along a digital sequence, is closely related to such important source properties as entropy and compression ratio, but, in contrast to these, it is a property of individual sequences. In this article, we propose to exploit this concept to estimate (or, at least, to bound from below) the entropy of neural discharges (spike trains). The main advantages of this method include fast convergence of the estimator (as supported by numerical simulation) and the fact that there is no need to know the probability law of the process generating the signal. Furthermore, we present numerical and experimental comparisons of the new method against the standard method based on word frequencies, providing evidence that this new approach is an alternative entropy estimator for binned spike trains.
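The estimator sketched in this abstract rests on the classical result that, for a stationary ergodic source, the LZ76 phrase count converges to the entropy rate. In the normalized form usually quoted for binary sequences (a standard statement of the theorem, not a quotation from the paper):

```latex
% c(n) is the LZ76 phrase count of the first n symbols of a binary sequence;
% for stationary ergodic sources the normalized complexity converges a.s.
% to the entropy rate h, so finite-n values estimate (and, per the paper,
% can be used to bound from below) the entropy of binned spike trains:
\hat{h}(n) \;=\; \frac{c(n)\,\log_2 n}{n}
\;\xrightarrow[\,n\to\infty\,]{}\; h .
```

This is what makes the approach distribution-free: no probability law for the spike-generating process needs to be known, only the observed sequence itself.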
11

Su, Zhou, Juanjuan Shi, Weiguo Huang, et al. "Bearing fault severity assessment using variable-step multiscale fusion Lempel-Ziv complexity." Journal of Physics: Conference Series 2184, no. 1 (2022): 012002. http://dx.doi.org/10.1088/1742-6596/2184/1/012002.

Abstract:
Most research on bearing health condition monitoring is devoted to distinguishing the defective condition from the healthy condition. In practice, fault severity assessment is also critical for the prognostics and maintenance of bearings. Lempel-Ziv complexity (LZC) has been widely used for quantitative bearing fault diagnosis. However, the original LZC extracts fault information only at a single scale and often fails to portray the fault features. The multiscale LZC was therefore proposed to extract the fault information more comprehensively. However, multiscale analysis shortens the time series and leads to inaccurate calculation results as the scale factor increases. As such, this paper proposes a novel bearing fault severity assessment method using variable-step multiscale fusion Lempel-Ziv complexity (VSMFLZC) to facilitate the quantitative fault diagnosis of bearings. A variable step-length strategy is developed to optimize the coarse-graining procedure. Then, the Laplacian score is applied to evaluate the features and weights at each scale to obtain the proposed VSMFLZC. Through this fusion, the sequence is converted into a single but comprehensive evaluation indicator for fault severity assessment. The experimental results indicate that the proposed method outperforms the original LZC and multiscale LZC: the fault features are extracted more comprehensively, and the fault severity assessment of rolling bearings is successfully realized.
12

Ji, Guoli, Yong Zeng, Zijiang Yang, Congting Ye, and Jingci Yao. "A multiple sequence alignment method with sequence vectorization." Engineering Computations 31, no. 2 (2014): 283–96. http://dx.doi.org/10.1108/ec-01-2013-0026.

Abstract:
Purpose – The time complexity of most multiple sequence alignment algorithms is O(N²) or O(N³), where N is the number of sequences. In addition, with the development of biotechnology, the amount of biological sequence data has grown significantly, and traditional methods have difficulty handling large-scale sequences. The proposed LemK_MSA method aims to reduce the time complexity, especially for large-scale sequences, while keeping a similar accuracy level to the traditional methods.

Design/methodology/approach – LemK_MSA converts multiple sequence alignment into a corresponding 10D vector alignment using ten types of copy modes based on Lempel-Ziv. It then uses the k-means algorithm and the NJ algorithm to divide the sequences into several groups and calculate a guide tree for each group. A complete guide tree for multiple sequence alignment is constructed by merging the guide trees of all groups. Moreover, for large-scale multiple sequences, LemK_MSA provides a GPU-based parallel approach to distance matrix calculation.

Findings – Under this approach, the time efficiency of multiple sequence alignment can be improved. High-throughput mouse antibody sequences are used to validate the proposed method. Compared to ClustalW, MAFFT, and Mbed, LemK_MSA is more than ten times as efficient while maintaining alignment accuracy.

Originality/value – This paper proposes a novel method with sequence vectorization for multiple sequence alignment based on Lempel-Ziv. A GPU-based parallel method has been designed for large-scale distance matrix calculation. It provides a new way forward for multiple sequence alignment research.
13

Yin, Jiancheng, Xuye Zhuang, Wentao Sui, and Yunlong Sheng. "Manifold learning and Lempel-Ziv complexity-based fault severity recognition method for bearing." Measurement 213 (May 2023): 112714. http://dx.doi.org/10.1016/j.measurement.2023.112714.

14

Borowska, Marta. "Multiscale Permutation Lempel–Ziv Complexity Measure for Biomedical Signal Analysis: Interpretation and Application to Focal EEG Signals." Entropy 23, no. 7 (2021): 832. http://dx.doi.org/10.3390/e23070832.

Abstract:
This paper analyses the complexity of electroencephalogram (EEG) signals at different temporal scales for the analysis and classification of focal and non-focal EEG signals. Features from an original multiscale permutation Lempel–Ziv complexity measure (MPLZC) were obtained. The MPLZC measure combines a multiscale structure, ordinal analysis, and permutation Lempel–Ziv complexity to quantify the dynamic changes of an EEG. We also show the dependency of MPLZC on several straightforward signal processing concepts via a set of synthetic signals. The main material of the study consists of EEG signals obtained from the Bern-Barcelona EEG database. The signals were divided into two groups: focal EEG signals (n = 100) and non-focal EEG signals (n = 100); statistical analysis was performed by means of the non-parametric Mann–Whitney test. The mean MPLZC values in the non-focal group are significantly higher than those in the focal group for scales above 1 (p < 0.05), indicating that the non-focal EEG signals are more complex. MPLZC feature sets are used with a least squares support vector machine (LS-SVM) classifier to classify focal and non-focal EEG signals. Our experimental results confirmed the usefulness of the MPLZC method for distinguishing focal and non-focal EEG signals with a classification accuracy of 86%.
15

Liu, Wei Dong, and Hu Sheng Wu. "Study on Mechanical Fault Diagnosis Based on IMF Complexity Feature and Support Vector Machine." Applied Mechanics and Materials 246-247 (December 2012): 37–42. http://dx.doi.org/10.4028/www.scientific.net/amm.246-247.37.

Abstract:
In view of the non-stationary characteristics of vibration signals from reciprocating machinery, a fault diagnosis method based on empirical mode decomposition (EMD), Lempel-Ziv complexity, and a support vector machine (SVM) is proposed. Firstly, the vibration signals were decomposed into a finite number of intrinsic mode functions (IMFs); then some IMF components were chosen according to the mutual correlation coefficient between each IMF component and the denoised signal. Finally, the complexity feature of each selected IMF component was calculated as a fault eigenvector and served as input to the SVM classifier so that machine faults could be classified. Practical experimental data are used to verify this method, and the diagnosis results and comparative tests fully validate its effectiveness and generalization ability.
16

Silva, Ana Carolina de Sousa, Aldo Ivan Céspedes Arce, Hubert Arteaga, Valeria Cristina Rodrigues Sarnighausen, Gustavo Voltani von Atzingen, and Ernane José Xavier Costa. "Improving Behavior Monitoring of Free-Moving Dairy Cows Using Noninvasive Wireless EEG Approach and Digital Signal Processing Techniques." Applied Sciences 13, no. 19 (2023): 10722. http://dx.doi.org/10.3390/app131910722.

Abstract:
Electroencephalography (EEG) is the most common method to access brain information. Techniques to monitor and extract brain signal characteristics in farm animals are not as developed as those for humans and laboratory animals. The objective of this study was to develop a noninvasive method for monitoring brain signals in cattle, allowing the animals to move freely, and to characterize these signals. Brain signals from six Holstein heifers that could move freely in a paddock compartment were acquired. The control group consisted of the same number of bovines, contained in a climatic chamber (restrained group). In the second step, the signals were characterized by power spectral density, the short-time Fourier transform, and Lempel–Ziv complexity. The preliminary results revealed an optimal electrode position, referred to as POS2, located at the center of the frontal region of the animal's head. This positioning allowed the electrodes to be attached to the front of the bovine's head, resulting in the acquisition of longer artifact-free signal sections. The signals showed typical EEG frequency bands, like the bands found in humans. The Lempel–Ziv complexity values indicated that the bovine brain signals contained random and chaotic components. As expected, the signals acquired from the restrained bovine group displayed sections with a larger number of artifacts due to the 32 °C temperature in the climatic chamber. We present a method that helps to monitor and extract brain signal features in unrestrained bovines. The method could be applied to investigate changes in brain electrical activity during animal farming, to monitor brain pathologies, and to other situations related to animal behavior.
17

KEH, LANCE ONG-SIONG CO TING, ANA MARIA AQUINO CHUPUNGCO, and JOSE PERICO ESGUERRA. "NONLINEAR TIME SERIES ANALYSIS OF ELECTROENCEPHALOGRAM TRACINGS OF CHILDREN WITH AUTISM." International Journal of Bifurcation and Chaos 22, no. 03 (2012): 1250044. http://dx.doi.org/10.1142/s0218127412500447.

Abstract:
Three methods of nonlinear time series analysis, Lempel–Ziv complexity, prediction error and covariance complexity were employed to distinguish between the electroencephalograms (EEGs) of normal children, children with mild autism, and children with severe autism. Five EEG tracings per cluster of children aged three to seven medically diagnosed with mild, severe and no autism were used in the analysis. A general trend seen was that the EEGs of children with mild autism were significantly different from those with severe or no autism. No significant difference was observed between normal children and children with severe autism. Among the three methods used, the method that was best able to distinguish between EEG tracings of children with mild and severe autism was found to be the prediction error, with a t-Test confidence level of above 98%.
18

Yu, Lan Lan, and Tian Xing Meng. "The Sleep EEG Partition by Stages Based on Complexity Measure." Applied Mechanics and Materials 29-32 (August 2010): 2720–25. http://dx.doi.org/10.4028/www.scientific.net/amm.29-32.2720.

Abstract:
Sleep is an important phenomenon in human life: about a third of a human life is spent sleeping, during which many important physiological processes occur and develop, so research on the sleep EEG has attracted increasing attention. Sleep research typically begins with a correct partition into stages. Different sleep stages correspond to different brain states, so sleep staging is important for research on the sleep EEG. In this paper, a method for sleep segmentation using Lempel-Ziv complexity is given. The simulation results show that the complexity measure can help distinguish the sleep stages, so it can play an active role in finding a reliable guideline for the automatic partition of sleep stages.
19

Kedadouche, Mourad, Marc Thomas, Antoine Tahan, and Raynald Guilbault. "Nonlinear Parameters for Monitoring Gear: Comparison Between Lempel-Ziv, Approximate Entropy, and Sample Entropy Complexity." Shock and Vibration 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/959380.

Abstract:
Vibration analysis is the most widely used technique for monitoring defects and failures in industrial gearboxes. Detection and diagnosis of gear defects are thus crucial to avoid catastrophic failures, so it is important to detect early fault symptoms. This paper introduces signal processing methods based on approximate entropy (ApEn), sample entropy (SampEn), and Lempel-Ziv complexity (LZC) for the detection of gear defects. These methods are based on statistical measurements exploring the regularity of vibratory signals. Applied to gear signals, the parameter selection for the ApEn, SampEn, and LZC calculations is first numerically investigated, and appropriate parameters are suggested. Finally, an experimental study investigating the effectiveness of these indicators is presented, along with a comparative study against traditional time-domain indicators. The results demonstrate that ApEn, SampEn, and LZC provide alternative features for signal processing. A new methodology combining kurtosis and LZC for early fault detection is presented; the results show that the proposed method may be used as an effective tool for early detection of gear faults.
20

YANG, GE, and JUN WANG. "COMPLEXITY AND MULTIFRACTAL OF VOLATILITY DURATION FOR AGENT-BASED FINANCIAL DYNAMICS AND REAL MARKETS." Fractals 24, no. 04 (2016): 1650052. http://dx.doi.org/10.1142/s0218348x16500523.

Abstract:
A random agent-based financial model is developed and investigated via a finite-range multitype contact dynamic system, in an attempt to reproduce and study the dynamics of financial markets. An analysis method for detecting the duration-intensity relationship in volatility series, called volatility duration analysis, is introduced. The auto-correlation analysis suggests that there is an evident volatility clustering feature in absolute volatility durations for both the simulated data and the real data. In addition, Lempel–Ziv complexity analysis is applied to study the complexity of the returns, the corresponding absolute returns, and the volatility duration returns, which reflect the fluctuation behaviors, the volatility behaviors, and the volatility duration behaviors. Finally, the multifractal phenomena of the volatility durations of returns are comparatively studied for the Shanghai Composite Index and the proposed model using multifractal detrended fluctuation analysis.
21

Ji, Guanni, Yu Wang, and Fei Wang. "Comparative Study on Feature Extraction of Marine Background Noise Based on Nonlinear Dynamic Features." Entropy 25, no. 6 (2023): 845. http://dx.doi.org/10.3390/e25060845.

Abstract:
Marine background noise (MBN) is the background noise of the marine environment, which can be used to invert the parameters of the marine environment. However, due to the complexity of the marine environment, it is difficult to extract the features of the MBN. In this paper, we study the feature extraction method of MBN based on nonlinear dynamics features, where the nonlinear dynamical features include two main categories: entropy and Lempel–Ziv complexity (LZC). We have performed single feature and multiple feature comparative experiments on feature extraction based on entropy and LZC, respectively: for entropy-based feature extraction experiments, we compared feature extraction methods based on dispersion entropy (DE), permutation entropy (PE), fuzzy entropy (FE), and sample entropy (SE); for LZC-based feature extraction experiments, we compared feature extraction methods based on LZC, dispersion LZC (DLZC) and permutation LZC (PLZC), and dispersion entropy-based LZC (DELZC). The simulation experiments prove that all kinds of nonlinear dynamics features can effectively detect the change of time series complexity, and the actual experimental results show that regardless of the entropy-based feature extraction method or LZC-based feature extraction method, they both present better feature extraction performance for MBN.
22

Abu-Taieh, Evon, and Issam AlHadid. "CRUSH: A New Lossless Compression Algorithm." Modern Applied Science 12, no. 11 (2018): 387. http://dx.doi.org/10.5539/mas.v12n11p387.

Abstract:
Multimedia is a highly competitive world, and one property in which this is reflected is the speed of downloading and uploading multimedia elements: text, sound, pictures, and animation. This paper presents CRUSH, a lossless compression algorithm that can be used to compress files. The CRUSH method is fast and simple, with time complexity O(n), where n is the number of elements being compressed. Furthermore, the compressed file is independent of the algorithm and requires no unnecessary data structures. The paper compares CRUSH with other compression algorithms: Shannon–Fano coding, Huffman coding, Run-Length Encoding (RLE), Arithmetic Coding, Lempel-Ziv-Welch (LZW), the Burrows-Wheeler Transform, the Move-to-Front (MTF) Transform, Haar, wavelet tree, Delta Encoding, Rice & Golomb Coding, Tunstall coding, the DEFLATE algorithm, and Run-Length Golomb-Rice (RLGR).
APA, Harvard, Vancouver, ISO, and other styles
23

Abu-Taieh, Evon, and Issam AlHadid. "CRUSH: A New Lossless Compression Algorithm." Modern Applied Science 12, no. 11 (2018): 406. http://dx.doi.org/10.5539/mas.v12n11p406.

Full text
Abstract:
Multimedia is a highly competitive world, and one of the properties in which this is reflected is the speed of downloading and uploading multimedia elements: text, sound, pictures, and animation. This paper presents the CRUSH algorithm, a lossless compression algorithm that can be used to compress files. The CRUSH method is fast and simple, with time complexity O(n), where n is the number of elements being compressed. Furthermore, the compressed file is independent of the algorithm and free of unnecessary data structures. The paper presents a comparison with other compression algorithms: Shannon–Fano coding, Huffman coding, Arithmetic Coding, Lempel-Ziv-Welch (LZW), Run-Length Encoding (RLE), the Burrows-Wheeler Transform, the Move-to-Front (MTF) Transform, Haar, wavelet trees, Delta Encoding, Rice & Golomb Coding, Tunstall coding, the DEFLATE algorithm, and Run-Length Golomb-Rice (RLGR).
APA, Harvard, Vancouver, ISO, and other styles
24

Mitina, A., N. Orlova, A. Dergilev, and Yuriy Orlov. "COMPUTATIONAL TOOLS FOR THE DNA TEXT COMPLEXITY ESTIMATES FOR MICROBIAL GENOMES STRUCTURE ANALYSIS." Russian Journal of Biological Physics and Chemistry 8, no. 4 (2024): 408–16. http://dx.doi.org/10.29039/rusjbpc.2023.0640.

Full text
Abstract:
One of the fundamental tasks in bioinformatics involves searching for repeats, which are statistically heterogeneous segments within DNA sequences and complete genomes of microorganisms. Theoretical approaches to analyzing the complexity of macromolecule sequences (DNA, RNA, and proteins) were established prior to the availability of complete genomic sequences. These approaches have experienced a resurgence due to the proliferation of mass parallel sequencing technologies and the exponential growth of accessible data. This article explores contemporary computer methods and existing programs designed to assess DNA text complexity as well as construct profiles of properties for analysing the genomic structures of microorganisms. The article offers a comprehensive overview of available online programs designed for detecting and visualising repeats within genetic text. Furthermore, the paper introduces a novel computer-based implementation of a method to evaluate the linguistic complexity of text and its compression using Lempel-Ziv. This approach aims to identify structural features and anomalies within the genomes of microorganisms. The article also provides examples of profiles generated through the analysis of text complexity. Application of these complexity estimates in the analysis of genome sequences, such as those of the SARS-CoV-2 coronavirus and the Mumps Orthorubulavirus, is discussed. Specific areas of low complexity within the genetic text have been successfully identified in this research.
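Linguistic complexity of a DNA text is commonly defined as the ratio of the number of distinct substrings actually observed to the maximum number possible, and a windowed profile of this ratio highlights low-complexity regions. A minimal sketch under that standard formulation (the paper's exact profile construction, window sizes, and word-length bounds are assumptions here):

```python
def linguistic_complexity(seq, k_max=5):
    """Ratio of distinct k-mers observed in seq to the maximum possible,
    summed over word lengths k = 1..k_max (4-letter DNA alphabet assumed)."""
    observed, possible = 0, 0
    for k in range(1, k_max + 1):
        if k > len(seq):
            break
        kmers = {seq[i:i + k] for i in range(len(seq) - k + 1)}
        observed += len(kmers)
        # at most 4**k distinct words exist, and at most len(seq)-k+1 fit in seq
        possible += min(4 ** k, len(seq) - k + 1)
    return observed / possible

def complexity_profile(genome, window=40, step=10, k_max=5):
    """Sliding-window profile; dips mark low-complexity (repeat-rich) regions."""
    return [linguistic_complexity(genome[i:i + window], k_max)
            for i in range(0, len(genome) - window + 1, step)]
```

A homopolymer run such as poly-A scores near the minimum, while a region using all k-mers available at each length scores 1.0.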
APA, Harvard, Vancouver, ISO, and other styles
25

Dou, Xun, and Yu He. "A Short-Term Electricity Load Complementary Forecasting Method Based on Bi-Level Decomposition and Complexity Analysis." Mathematics 13, no. 7 (2025): 1066. https://doi.org/10.3390/math13071066.

Full text
Abstract:
With the increasing complexity of the power system and increasing load volatility, accurate load forecasting plays a vital role in ensuring the safety of the power supply and optimizing scheduling decisions and resource allocation. However, traditional single models have limitations in extracting the multi-frequency features of load data and processing components of varying complexity. Therefore, this paper proposes a complementary forecasting method based on bi-level decomposition and complexity analysis, in which Pyraformer is used as a complementary model for the Single Channel Enhanced Periodicity Decoupling Framework (SCEPDF). Firstly, a Hodrick–Prescott filter (HP filter) is used to decompose the electricity data, extracting the trend and periodic components. Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) is used to further decompose the periodic components into several IMF components. Secondly, based on sample entropy, spectral entropy, and Lempel–Ziv complexity, a complexity evaluation index system is constructed to comprehensively analyze the complexity of each IMF component. Then, based on the comprehensive complexity of each IMF component, the different components are fed into the complementary model. The predicted values of each component are combined to obtain the final result. Finally, the proposed method is tested on a quarterly electrical load dataset. The effectiveness of the proposed method is verified through comparative and ablation experiments. The experimental results show that the proposed method demonstrates excellent performance in short-term electricity load forecasting tasks.
APA, Harvard, Vancouver, ISO, and other styles
26

Sabeti, Elyas, Sehong Oh, Peter Song, and Alfred Hero. "A Pattern Dictionary Method for Anomaly Detection." Entropy 24, no. 8 (2022): 1095. http://dx.doi.org/10.3390/e24081095.

Full text
Abstract:
In this paper, we propose a compression-based anomaly detection method for time series and sequence data using a pattern dictionary. The proposed method is capable of learning complex patterns in a training data sequence, using these learned patterns to detect potentially anomalous patterns in a test data sequence. The proposed pattern dictionary method uses a measure of complexity of the test sequence as an anomaly score that can be used to perform stand-alone anomaly detection. We also show that when combined with a universal source coder, the proposed pattern dictionary yields a powerful atypicality detector that is equally applicable to anomaly detection. The pattern dictionary-based atypicality detector uses an anomaly score defined as the difference between the complexity of the test sequence data encoded by the trained pattern dictionary (typical) encoder and the universal (atypical) encoder, respectively. We consider two complexity measures: the number of parsed phrases in the sequence, and the length of the encoded sequence (codelength). Specializing to a particular type of universal encoder, the Tree-Structured Lempel–Ziv (LZ78), we obtain a novel non-asymptotic upper bound, in terms of the Lambert W function, on the number of distinct phrases resulting from the LZ78 parser. This non-asymptotic bound determines the range of anomaly score. As a concrete application, we illustrate the pattern dictionary framework for constructing a baseline of health against which anomalous deviations can be detected.
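The LZ78 parsing on which the paper's phrase-count complexity measure rests is straightforward to sketch: each new phrase is the longest previously seen phrase extended by one symbol. A minimal sketch of the universal parse only (the trained pattern dictionary and the codelength-difference atypicality score are beyond this illustration):

```python
def lz78_parse(seq):
    """Return the LZ78 phrase list; the number of phrases grows roughly like
    n / log(n) for an i.i.d. source, and much more slowly for regular data."""
    dictionary, phrases, w = set(), [], ""
    for ch in seq:
        if w + ch in dictionary:
            w += ch                    # keep extending the current match
        else:
            dictionary.add(w + ch)     # register the new phrase
            phrases.append(w + ch)
            w = ""
    if w:                              # trailing phrase, possibly a repeat
        phrases.append(w)
    return phrases

def lz78_phrase_count(seq):
    return len(lz78_parse(seq))
```

An anomaly score in the spirit of the paper would compare the parse of a test sequence under a dictionary pre-trained on normal data against the from-scratch universal parse shown here.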
APA, Harvard, Vancouver, ISO, and other styles
27

Al Attar, Tara Nawzad Ahmad. "A Hybrid Genetic Algorithm-Particle Swarm Optimization Approach for Enhanced Text Compression." UHD Journal of Science and Technology 8, no. 2 (2024): 63–74. http://dx.doi.org/10.21928/uhdjst.v8n2y2024.pp63-74.

Full text
Abstract:
Text compression is a necessity for efficient data storage and transmission, especially in the digital era, when volumes of digital text have grown enormously. Traditional text compression methods, including Huffman coding and Lempel-Ziv-Welch, have certain limitations in their adaptability and efficiency when dealing with such complexity and diversity of data. In this paper, we propose a hybrid method that combines a Genetic Algorithm (GA) with Particle Swarm Optimization (PSO) to optimize text compression, using the broad exploration capabilities of GA and the fast convergence of PSO. The experimental results show that the proposed hybrid GA-PSO approach yields much better compression-ratio performance than the standalone methods, reducing the size to about 65% while retaining the integrity of the original content. The proposed method is also highly adaptable to various text forms and outperformed other state-of-the-art methods such as the Grey Wolf Optimizer, the Whale Optimization Algorithm, and the African Vulture Optimization Algorithm. These results suggest that the hybrid GA-PSO method is promising for modern text compression.
APA, Harvard, Vancouver, ISO, and other styles
28

Yakovleva, Tatiana V., Ilya E. Kutepov, Antonina Yu Karas, et al. "EEG Analysis in Structural Focal Epilepsy Using the Methods of Nonlinear Dynamics (Lyapunov Exponents, Lempel–Ziv Complexity, and Multiscale Entropy)." Scientific World Journal 2020 (February 11, 2020): 1–13. http://dx.doi.org/10.1155/2020/8407872.

Full text
Abstract:
This paper analyzes the case of a patient with focal structural epilepsy by processing electroencephalogram (EEG) fragments containing the “sharp wave” pattern of brain activity. EEG signals were recorded using 21 channels. Based on the fact that EEG signals are time series, an approach was developed for their analysis using nonlinear dynamics tools: calculation of the Lyapunov exponent spectrum, multiscale entropy, and Lempel–Ziv complexity. The first Lyapunov exponent is calculated by three methods (Wolf, Rosenstein, and Sano–Sawada) to obtain reliable results. The spectra of seven Lyapunov exponents are calculated by the Sano–Sawada method. For the observed patient, the studies showed that his condition did not improve under medical treatment, and as a result it was recommended to switch from conservative to surgical treatment. The results of the patient’s EEG study obtained with the indicated nonlinear dynamics methods are in good agreement with the medical report and MRI data. The approach developed for the analysis of EEG signals by nonlinear dynamics methods can be applied for early detection of structural changes.
APA, Harvard, Vancouver, ISO, and other styles
29

Stępień, Kuklik, Żebrowski, Sanders, Derejko, and Podziemski. "Kolmogorov Complexity of Coronary Sinus Atrial Electrograms before Ablation Predicts Termination of Atrial Fibrillation after Pulmonary Vein Isolation." Entropy 21, no. 10 (2019): 970. http://dx.doi.org/10.3390/e21100970.

Full text
Abstract:
Atrial fibrillation (AF) is related to a very complex local electrical activity reflected in the rich morphology of intracardiac electrograms. The link between electrogram complexity and efficacy of the catheter ablation is unclear. We test the hypothesis that the Kolmogorov complexity of a single atrial bipolar electrogram recorded during AF within the coronary sinus (CS) at the beginning of the catheter ablation may predict AF termination directly after pulmonary vein isolation (PVI). The study population consisted of 26 patients for whom 30 s baseline electrograms were recorded. In all cases PVI was performed. If AF persisted after PVI, ablation was extended beyond PVs. Kolmogorov complexity estimated by Lempel–Ziv complexity and the block decomposition method was calculated and compared with other measures: Shannon entropy, AF cycle length, dominant frequency, regularity, organization index, electrogram fractionation, sample entropy and wave morphology similarity index. A 5 s window length was chosen as optimal in calculations. There was a significant difference in Kolmogorov complexity between patients with AF termination directly after PVI compared to patients undergoing additional ablation (p < 0.01). No such difference was seen for remaining complexity parameters. Kolmogorov complexity of CS electrograms measured at baseline before PVI can predict self-termination of AF directly after PVI.
APA, Harvard, Vancouver, ISO, and other styles
30

Ge, Jianghua, Guibin Yin, Yaping Wang, Di Xu, and Fen Wei. "Rolling-Bearing Fault-Diagnosis Method Based on Multimeasurement Hybrid-Feature Evaluation." Information 10, no. 11 (2019): 359. http://dx.doi.org/10.3390/info10110359.

Full text
Abstract:
To improve the accuracy of rolling-bearing fault diagnosis and solve the problem of incomplete information in feature-evaluation methods based on a single measurement model, this paper combines the advantages of various measurement models and proposes a fault-diagnosis method based on multi-measurement hybrid-feature evaluation. In this study, an original feature set was first obtained by analyzing a collected vibration signal. The feature set included time- and frequency-domain features, as well as energy and Lempel–Ziv complexity features from the time–frequency domain obtained by empirical mode decomposition (EMD). Second, a feature-evaluation framework of multiplicative hybrid models was constructed based on correlation, distance, information, and other measures. The framework was used to rank features and obtain rank weights. The weights were then multiplied by the features to obtain a new feature set. Finally, the fault-feature set was used as the input of the category-divergence fault-diagnosis model based on kernel principal component analysis (KPCA), and the fault-diagnosis model was based on a support vector machine (SVM). The clustering effect of different fault categories was more obvious and classification accuracy was improved.
APA, Harvard, Vancouver, ISO, and other styles
31

Li, Rui, Jun Wang, and Guochao Wang. "Complex Similarity and Fluctuation Dynamics of Financial Markets on Voter Interacting Dynamic System." International Journal of Bifurcation and Chaos 28, no. 13 (2018): 1850156. http://dx.doi.org/10.1142/s0218127418501560.

Full text
Abstract:
A financial price dynamics model is developed based on the voter interacting system, in an attempt to investigate and reproduce the complex similarity and the fluctuation dynamics of financial markets. The complexity-invariance distance (CID) is applied to study the similarity of stock pairs. A simple classification of seven real indexes and the simulated data is obtained according to the CID values for each stock pair. The corresponding multiscale dynamical behaviors of CID values are also studied by combining CID with the multiscale method. Further, the similarity of the newest and the historical return data is investigated by a novel auto-CID analysis, and a corresponding exponent relationship is exhibited. Moreover, the cross-correlation function (CCF) is applied to study the correlation of stock pairs, and the causalities of these stock pairs are investigated by the Granger causality method. Besides, the complexity and randomness of fluctuations of returns, surrogate returns, shuffled returns and intrinsic mode functions (derived from empirical mode decomposition) are explored at different thresholds with Lempel–Ziv complexity. The empirical study shows complex similarity and similar random properties between the proposed price model and the real stock markets, which indicates that the proposed model is feasible to some extent.
APA, Harvard, Vancouver, ISO, and other styles
32

Diez, Pablo F., Vicente A. Mut, Eric Laciar, Abel Torres, and Enrique M. Avila Perona. "FEATURES EXTRACTION METHOD FOR BRAIN-MACHINE COMMUNICATION BASED ON THE EMPIRICAL MODE DECOMPOSITION." Biomedical Engineering: Applications, Basis and Communications 25, no. 06 (2013): 1350058. http://dx.doi.org/10.4015/s1016237213500580.

Full text
Abstract:
A brain-machine interface (BMI) is a communication system that translates human brain activity into commands, which are then conveyed to a machine or a computer. This paper proposes a technique for extracting features from electroencephalographic (EEG) signals and, afterward, classifying them into different mental tasks. The empirical mode decomposition (EMD) is a method capable of processing non-stationary and nonlinear signals such as the EEG. The EMD was applied to EEG signals of seven subjects performing five mental tasks. Six features were computed, namely, root mean square (RMS), variance, Shannon entropy, Lempel–Ziv complexity value, and central and maximum frequencies. In order to reduce the dimensionality of the feature vector, the Wilks' lambda (WL) parameter was used for the selection of the most important variables. The classification of mental tasks was performed using linear discriminant analysis (LDA) and neural networks (NN). Using this method, the average classification accuracy over all subjects in the database is 91 ± 5% and 87 ± 5% using LDA and NN, respectively. The bit rate ranged from 0.24 bits/trial up to 0.84 bits/trial. The proposed method achieves higher performance in the classification of mental tasks than other traditional methods on the same database. This represents an improvement in the brain-machine communication system.
APA, Harvard, Vancouver, ISO, and other styles
33

Barile, Claudia, Giovanni Pappalettera, Vimalathithan Paramsamy Kannan, and Caterina Casavola. "A Neural Network Framework for Validating Information–Theoretics Parameters in the Applications of Acoustic Emission Technique for Mechanical Characterization of Materials." Materials 16, no. 1 (2022): 300. http://dx.doi.org/10.3390/ma16010300.

Full text
Abstract:
A multiparameter approach is preferred when utilizing the Acoustic Emission (AE) technique for mechanical characterization of composite materials, and it is essential to utilize a statistical parameter that is independent of the sensor characteristics for this purpose. Thus, a new information-theoretic parameter, Lempel–Ziv (LZ) complexity, is used in this research work for mechanical characterization of Carbon Fibre Reinforced Plastic (CFRP) composites. CFRP specimens in plain weave fabric configurations were tested and the acoustic activity during loading was recorded. The AE signals were classified based on their peak amplitudes, counts, and LZ complexity indices using the k-means++ data clustering algorithm. The clustered data were compared with the mechanical results of the tensile tests on the CFRP specimens. The results show that the clustered data are capable of identifying critical regions of failure, and the LZ complexity indices of the AE signals can be used as an AE descriptor for mechanical characterization. This is validated by studying the clustered signals in the time–frequency domain using the wavelet transform. Finally, a neural network framework based on SqueezeNet was trained on the wavelet scalograms for a quantitative validation of the data clustering approach proposed in this research work. The results show that the proposed method functions at an efficiency of more than 85% for three of the four clusters. This validates the application of LZ complexity as an AE descriptor for AE signal data analysis.
APA, Harvard, Vancouver, ISO, and other styles
34

Li, Yuxing, Yingmin Yi, Junxian Wu, and Yunpeng Gu. "A novel feature extraction method for ship-radiated noise based on hierarchical refined composite multi-scale dispersion entropy-based Lempel-Ziv complexity." Deep Sea Research Part I: Oceanographic Research Papers 199 (September 2023): 104111. http://dx.doi.org/10.1016/j.dsr.2023.104111.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Zhang, Xian, Diquan Li, Jin Li, et al. "Magnetotelluric Signal-Noise Separation Using IE-LZC and MP." Entropy 21, no. 12 (2019): 1190. http://dx.doi.org/10.3390/e21121190.

Full text
Abstract:
Eliminating the noise signals of the magnetotelluric (MT) method is bound to improve the quality of MT data. However, existing de-noising methods are designed for use on whole MT data sets, causing the loss of low-frequency information and severe mutation of the apparent resistivity-phase curve in low-frequency bands. In this paper, we used information entropy (IE), Lempel–Ziv complexity (LZC), and matching pursuit (MP) to distinguish and suppress MT noise signals. Firstly, we extracted IE and LZC characteristic parameters from each segment of the MT signal in the time series. Then, the characteristic parameters were input into fuzzy c-means (FCM) clustering to automatically distinguish between signal and noise. Next, the MP de-noising algorithm was used independently to eliminate MT signal segments identified as interference. Finally, the identified useful signal segments were combined with the denoised data segments to reconstruct the signal. The proposed method was validated through clustering analysis based on signal samples collected at the Qinghai test site and at the measured sites, where the results were compared to those obtained using the remote reference method and independent use of the MP method. The findings show that strong interference is purposefully removed and the apparent resistivity-phase curve is continuous and stable. Moreover, the processed data can accurately reflect the geoelectrical information and improve the level of geological interpretation.
APA, Harvard, Vancouver, ISO, and other styles
36

Matcharashvili, Teimuraz, Takahiro Hatano, Tamaz Chelidze, and Natalia Zhukova. "Simple statistics for complex Earthquake time distributions." Nonlinear Processes in Geophysics 25, no. 3 (2018): 497–510. http://dx.doi.org/10.5194/npg-25-497-2018.

Full text
Abstract:
Abstract. Here we investigated a statistical feature of earthquake time distributions in the southern California earthquake catalog. As the main data analysis tool, we used a simple statistical approach based on the calculation of integral deviation times (IDT) from the time distribution of regular markers. The research objective is to define whether and when the process of earthquake time distribution approaches randomness. The effectiveness of the IDT calculation method was tested on a set of simulated color noise data sets with different extents of regularity, as well as on Poisson process data sets. Standard methods of complex data analysis have also been used, such as power spectrum regression, Lempel–Ziv complexity, and recurrence quantification analysis, as well as multiscale entropy calculations. After testing the IDT calculation method on simulated model data sets, we analyzed the variation in the extent of regularity in the southern California earthquake catalog. The analysis was carried out for different periods and at different magnitude thresholds. It was found that the extent of order in earthquake time distributions fluctuates over the catalog. In particular, we show that in most cases the process of earthquake time distribution is less random in periods of strong earthquake occurrence compared to periods of relatively decreased local seismic activity. We also noticed that the strongest earthquakes occur in periods when IDT values increase.
APA, Harvard, Vancouver, ISO, and other styles
37

Cuesta-Frau, David, Daniel Novák, Vacláv Burda, et al. "Influence of Duodenal–Jejunal Implantation on Glucose Dynamics: A Pilot Study Using Different Nonlinear Methods." Complexity 2019 (February 14, 2019): 1–10. http://dx.doi.org/10.1155/2019/6070518.

Full text
Abstract:
Diabetes is a disease of great and rising prevalence, with the obesity epidemic being a significant contributing risk factor. Duodenal–jejunal bypass liner (DJBL) is a reversible implant that mimics the effects of more aggressive surgical procedures, such as gastric bypass, to induce weight loss. We hypothesized that DJBL also influences the glucose dynamics in type II diabetes, based on the induced changes already demonstrated in other physiological characteristics and parameters. In order to assess the validity of this assumption, we conducted a quantitative analysis based on several nonlinear algorithms (Lempel–Ziv Complexity, Sample Entropy, Permutation Entropy, and modified Permutation Entropy), well suited to the characterization of biomedical time series. We applied them to glucose records drawn from two extreme cases available of DJBL implantation: before and after 10 months. The results confirmed the hypothesis and an accuracy of 86.4% was achieved with modified Permutation Entropy. Other metrics also yielded significant classification accuracy results, all above 70%, provided a suitable parameter configuration was chosen. With the Leave–One–Out method, the results were very similar, between 72% and 82% classification accuracy. There was also a decrease in entropy of glycaemia records during the time interval studied. These findings provide a solid foundation to assess how glucose metabolism may be influenced by DJBL implantation and opens a new line of research in this field.
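Of the nonlinear measures used above, permutation entropy is the easiest to sketch: each window of m samples is reduced to the rank order of its values, and Shannon entropy is computed over the pattern frequencies. A minimal sketch (embedding delay fixed at 1 and no explicit tie-breaking rule, both simplifying assumptions; the modified Permutation Entropy variant in the paper handles ties differently):

```python
import math
from collections import Counter

def permutation_entropy(x, m=3, normalize=True):
    """Shannon entropy of ordinal patterns of length m (embedding delay = 1)."""
    counts = Counter()
    for i in range(len(x) - m + 1):
        window = x[i:i + m]
        # rank order of the window, e.g. (1.2, 3.4, 0.5) -> (2, 0, 1)... as argsort
        pattern = tuple(sorted(range(m), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    # divide by log2(m!) so the result lies in [0, 1]
    return h / math.log2(math.factorial(m)) if normalize else h
```

A monotone glucose ramp yields a single ordinal pattern and entropy 0, while erratic records approach 1; that ordering is what the classification between pre- and post-implantation records exploits.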
APA, Harvard, Vancouver, ISO, and other styles
38

Huang, Zhehao, Benhuan Nie, Yuqiao Lan, and Changhong Zhang. "A Decomposition-Integration Framework of Carbon Price Forecasting Based on Econometrics and Machine Learning Methods." Mathematics 13, no. 3 (2025): 464. https://doi.org/10.3390/math13030464.

Full text
Abstract:
Carbon price forecasting and pricing are critical for stabilizing carbon markets, mitigating investment risks, and fostering economic development. This paper presents an advanced decomposition-integration framework which seamlessly integrates econometric models with machine learning techniques to enhance carbon price forecasting. First, the complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN) method is employed to decompose carbon price data into distinct modal components, each defined by specific frequency characteristics. Then, Lempel–Ziv complexity and dispersion entropy algorithms are applied to analyze these components, facilitating the identification of their unique frequency attributes. The framework subsequently employs GARCH models for predicting high-frequency components and a gated recurrent unit (GRU) neural network optimized by the grey wolf algorithm for low-frequency components. Finally, the optimized GRU model is utilized to integrate these predictive outcomes nonlinearly, ensuring a comprehensive and precise forecast. Empirical evidence demonstrates that this framework not only accurately captures the diverse characteristics of different data components but also significantly outperforms traditional benchmark models in predictive accuracy. By optimizing the GRU model with the grey wolf optimizer (GWO) algorithm, the framework enhances both prediction stability and adaptability, while the nonlinear integration approach effectively mitigates error accumulation. This innovative framework offers a scientifically rigorous and efficient tool for carbon price forecasting, providing valuable insights for policymakers and market participants in carbon trading.
APA, Harvard, Vancouver, ISO, and other styles
39

Saxena, Sulekha, Vijay Kumar Gupta, and P. N. Hrisheekesha. "CORONARY HEART DISEASE DETECTION USING NONLINEAR FEATURES AND ONLINE SEQUENTIAL EXTREME LEARNING MACHINE." Biomedical Engineering: Applications, Basis and Communications 31, no. 06 (2019): 1950046. http://dx.doi.org/10.4015/s1016237219500467.

Full text
Abstract:
In this paper, we propose an automated approach that combines generalized discriminant analysis (GDA) with a radial basis function (RBF) kernel as a feature reduction scheme and the online sequential extreme learning machine (OSELM) with Sigmoid, Hardlim, RBF, and Sine activation functions as a binary classifier for the detection of congestive heart failure (CHF) and coronary artery disease (CAD). For this analysis, 13 nonlinear features were extracted from Heart Rate Variability (HRV) signals: Correlation Dimension (CD), the Detrended Fluctuation Analysis (DFA) exponents DFA-α1 and DFA-α2, Bubble Entropy (BBEn), Sample Entropy (SampEn), Dispersion Entropy (DISEn), Lempel–Ziv Complexity (LZ), Sinai Entropy (SIEn), Improved Multiscale Permutation Entropy (IMPE), Hurst Exponent (HE), Permutation Entropy (PE), Approximate Entropy (ApEn), and Standard Deviation (SD1/SD2). For validation of the proposed method, HRV data were obtained from standard databases of normal sinus rhythm (NSR), CHF, and CAD subjects. Numerical experiments were done on the combinations of database sets NSR-CAD, CHF-CAD, and NSR-CHF. The simulation results show a clear separation of the combined database sets when using GDA with RBF and Gaussian kernel functions and the OSELM binary classifier with Sigmoid, RBF, and Sine activation functions, achieving an accuracy of 98.17% for NSR-CAD and 100% for NSR-CHF and CAD-CHF subjects.
APA, Harvard, Vancouver, ISO, and other styles
40

Zhou, Jianguo, and Dongfeng Chen. "Carbon Price Forecasting Based on Improved CEEMDAN and Extreme Learning Machine Optimized by Sparrow Search Algorithm." Sustainability 13, no. 9 (2021): 4896. http://dx.doi.org/10.3390/su13094896.

Full text
Abstract:
Carbon pricing policies have become an effective tool for many countries to encourage emission reduction. An accurate carbon price prediction model is helpful for the implementation of energy conservation and emission reduction policies and for the decision-making of governments and investors. However, it is difficult for a single prediction model to achieve high prediction accuracy because of the high complexity of the carbon price series. Many studies have proved the nonlinear characteristics of carbon trading prices, but there are very few studies on the chaotic nature of carbon price series. Consequently, this paper proposes an innovative hybrid model for carbon price prediction. A decomposition-reconstruction-prediction-integration scheme is designed to predict carbon prices. Firstly, several intrinsic mode functions (IMFs) and one residue were obtained from the raw data decomposed by ICEEMDAN. Next, the decomposed subsections are reconstructed into new sequences according to the results of the Lempel-Ziv complexity algorithm. Then, considering the chaotic characteristics of the sequences, the input variables of the models are determined through the phase space reconstruction (PSR) algorithm combined with the partial autocorrelation function (PACF). Finally, the sparrow search algorithm (SSA) is introduced to optimize the extreme learning machine (ELM) model, which is applied to carbon price prediction in the Hubei, Beijing, and Guangdong pilot markets to verify the validity of the proposed combination model. The empirical results show that the combination model outperformed the 13 other models in prediction accuracy, speed, and stability. The decomposition-reconstruction-prediction-integration strategy is an efficient method for predicting the carbon price.
APA, Harvard, Vancouver, ISO, and other styles
41

Fiedor, Paweł, and Artur Hołda. "The Effects of Bankruptcy on the Predictability of Price Formation Processes on Warsaw’s Stock Market." e-Finanse 12, no. 1 (2016): 32–42. http://dx.doi.org/10.1515/fiqf-2016-0134.

Full text
Abstract:
In this study we investigate how bankruptcy affects the market behaviour of stock prices on the Warsaw Stock Exchange. As the behaviour of prices can be seen in a myriad of ways, we investigate a particular aspect of this behaviour, namely the predictability of the price formation processes. We approximate their predictability as the structural complexity of logarithmic returns. This method of analysing the predictability of price formation processes using information theory follows closely the mathematical definition of predictability, and is equal to the degree to which redundancy is present in the time series describing stock returns. We use Shannon's entropy rate (approximating the Kolmogorov-Sinai entropy) to measure this redundancy, and estimate it using the Lempel-Ziv algorithm, computing it with a running-window approach over the entire price history of 50 companies listed on the Warsaw market which have gone bankrupt in the last few years. This enables us not only to compare the predictability of price formation processes before and after the filing for bankruptcy, but also to compare changes in predictability over time, as well as across different categories of companies and bankruptcies. There exists a large body of research analysing the efficiency of the whole market and the predictability of price changes at large, but only a few detailed studies analysing the influence of external stimuli on the efficiency of price formation processes. This study fills this gap in the knowledge of financial markets and their response to extreme external events.
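The pipeline the study describes (discretize returns, then estimate Shannon's entropy rate with a Lempel-Ziv-based estimator over a running window) can be roughly approximated with an off-the-shelf LZ-family compressor. A sketch under stated assumptions: zlib's DEFLATE stands in for the paper's Lempel-Ziv estimator, and the ternary discretization of returns is a choice made here, so the number is an upper-bound proxy rather than the exact estimator:

```python
import zlib

def discretize(returns, eps=1e-4):
    """Map each log-return to one of three symbols: down, flat, up."""
    return "".join("d" if r < -eps else "u" if r > eps else "f" for r in returns)

def entropy_rate_proxy(symbols):
    """Compressed bits per symbol; lower values mean more redundancy,
    i.e., a more predictable price formation process."""
    data = symbols.encode("ascii")
    return 8 * len(zlib.compress(data, 9)) / len(data)

def running_window(symbols, window=500, step=100):
    """Proxy entropy rate over a sliding window, as in the running-window
    analysis of predictability over a company's price history."""
    return [entropy_rate_proxy(symbols[i:i + window])
            for i in range(0, len(symbols) - window + 1, step)]
```

A strictly alternating symbol stream scores far lower than an irregular one, mirroring the redundancy-as-predictability reading used in the study.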
42

Ruffini, Giulio, Giada Damiani, Diego Lozano-Soldevilla, et al. "LSD-induced increase of Ising temperature and algorithmic complexity of brain dynamics." PLOS Computational Biology 19, no. 2 (2023): e1010811. http://dx.doi.org/10.1371/journal.pcbi.1010811.

Full text
Abstract:
A topic of growing interest in computational neuroscience is the discovery of fundamental principles underlying global dynamics and the self-organization of the brain. In particular, the notion that the brain operates near criticality has gained considerable support, and recent work has shown that the dynamics of different brain states may be modeled by pairwise maximum entropy Ising models at various distances from a phase transition, i.e., from criticality. Here we aim to characterize two brain states (psychedelics-induced and placebo) as captured by functional magnetic resonance imaging (fMRI), with features derived from the Ising spin model formalism (system temperature, critical point, susceptibility) and from algorithmic complexity. We hypothesized, along the lines of the entropic brain hypothesis, that psychedelics drive brain dynamics into a more disordered state at a higher Ising temperature and increased complexity. We analyze resting state blood-oxygen-level-dependent (BOLD) fMRI data collected in an earlier study from fifteen subjects in a control condition (placebo) and during ingestion of lysergic acid diethylamide (LSD). Working with the automated anatomical labeling (AAL) brain parcellation, we first create “archetype” Ising models representative of the entire dataset (global) and of the data in each condition. Remarkably, we find that such archetypes exhibit a strong correlation with an average structural connectome template obtained from dMRI (r = 0.6). We compare the archetypes from the two conditions and find that the Ising connectivity in the LSD condition is lower than the placebo one, especially in homotopic links (interhemispheric connectivity), reflecting a significant decrease of homotopic functional connectivity in the LSD condition. The global archetype is then personalized for each individual and condition by adjusting the system temperature. 
The resulting temperatures are all near but above the critical point of the model in the paramagnetic (disordered) phase. The individualized Ising temperatures are higher in the LSD condition than the placebo condition (p = 9 × 10⁻⁵). Next, we estimate the Lempel-Ziv-Welch (LZW) complexity of the binarized BOLD data and the synthetic data generated with the individualized model using the Metropolis algorithm for each participant and condition. The LZW complexity computed from experimental data reveals a weak statistical relationship with condition (p = 0.04 one-tailed Wilcoxon test) and none with Ising temperature (r(13) = 0.13, p = 0.65), presumably because of the limited length of the BOLD time series. Similarly, we explore complexity using the block decomposition method (BDM), a more advanced method for estimating algorithmic complexity. The BDM complexity of the experimental data displays a significant correlation with Ising temperature (r(13) = 0.56, p = 0.03) and a weak but significant correlation with condition (p = 0.04, one-tailed Wilcoxon test). This study suggests that the effects of LSD increase the complexity of brain dynamics by loosening interhemispheric connectivity, especially homotopic links. In agreement with earlier work using the Ising formalism with BOLD data, we find the brain state in the placebo condition is already above the critical point, with LSD resulting in a shift further away from criticality into a more disordered state.
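The LZW complexity of a binarized signal can be approximated by counting the phrases emitted by Lempel-Ziv-Welch parsing, as in this minimal sketch (a generic illustration, not the authors' pipeline):

```python
def lzw_complexity(s):
    """Complexity of a symbol string as the number of phrases emitted by
    Lempel-Ziv-Welch (LZW) parsing; richer strings grow the dictionary faster."""
    dictionary = {ch: i for i, ch in enumerate(sorted(set(s)))}
    w = ''
    phrases = 0
    for ch in s:
        if w + ch in dictionary:
            w += ch                       # extend the current match
        else:
            phrases += 1                  # emit a code for w, learn w + ch
            dictionary[w + ch] = len(dictionary)
            w = ch
    if w:
        phrases += 1                      # code for the final pending match
    return phrases
```

A constant string is covered by a few long phrases, while an irregular string of the same length needs many more, which is what makes the phrase count usable as a complexity index.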
43

T, Sujatha, and Selvam K. "LOSSLESS IMAGE COMPRESSION USING DIFFERENT ENCODING ALGORITHM FOR VARIOUS MEDICAL IMAGES." ICTACT Journal on Image and Video Processing 12, no. 4 (2022): 2704–9. https://doi.org/10.21917/ijivp.2022.0384.

Full text
Abstract:
In the medical industry, the amount of data that can be collected and kept is currently increasing. As a result, in order to handle these large amounts of data efficiently, compression methods must be re-examined while taking the algorithm complexity into account. An image processing strategy should be explored to eliminate duplicated image content, thus boosting the capability to store or transmit data in the best possible manner. Image Compression (IC) is a method of compressing images as they are being stored and processed. A lossless image compression technique preserves the information, allowing exact image reconstruction from the compressed data and retaining image quality to the highest possible extent, but it does not significantly decrease the size of the image. In this research work, encoding algorithms are applied to various medical images, such as brain images, dental X-ray images, hand X-ray images, breast mammogram images, and skin images, to minimize the bit size of the image pixels; the algorithms considered are Huffman, Lempel-Ziv-Welch (LZW), and Run Length Encoding (RLE), used for effective compression and decompression without any quality loss in reconstructing the image. The image processing toolbox in MATLAB is used to apply the compression algorithms. The compression efficiency for the various medical images is assessed using performance indicators such as Compression Ratio (CR) and Compression Factor (CF). The LZW technique compresses binary images; however, it fails to generate a lossless image in this implementation. The Huffman and RLE algorithms have a lower CR value, which means they compress data more efficiently than LZW, and RLE has a larger CF value than LZW and Huffman. When a lower CR and a higher CF are recorded, RLE coding becomes more viable. Finally, using state-of-the-art methodologies for the sample medical images, performance measures such as PSNR and MSE are retrieved and assessed.
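A minimal run-length encoding sketch with the compression metrics mentioned above. The CR/CF conventions here (CR = compressed/original, CF = original/compressed) are inferred from the abstract's usage of "lower CR, higher CF" and may differ from the paper's exact definitions:

```python
def rle_encode(data: bytes) -> bytes:
    """Run-length encode as (count, value) byte pairs, runs capped at 255."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)


def rle_decode(enc: bytes) -> bytes:
    """Invert rle_encode: expand each (count, value) pair."""
    out = bytearray()
    for count, value in zip(enc[::2], enc[1::2]):
        out += bytes([value]) * count
    return bytes(out)


def compression_ratio(original: bytes, compressed: bytes) -> float:
    """CR = compressed size / original size (lower is better under this convention)."""
    return len(compressed) / len(original)


def compression_factor(original: bytes, compressed: bytes) -> float:
    """CF = original size / compressed size (higher is better)."""
    return len(original) / len(compressed)
```

RLE shines on images with long uniform runs (e.g., the black background of an X-ray), which is consistent with the abstract's finding that it is viable when CR is low and CF is high.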
44

Cheng, Qiqi, Wenwei Yang, Kezhou Liu, et al. "Increased Sample Entropy in EEGs During the Functional Rehabilitation of an Injured Brain." Entropy 21, no. 7 (2019): 698. http://dx.doi.org/10.3390/e21070698.

Full text
Abstract:
Complex nerve remodeling occurs in the injured brain area during functional rehabilitation after a brain injury; however, its mechanism has not been thoroughly elucidated. Neural remodeling can lead to changes in electrophysiological activity, which can be detected in an electroencephalogram (EEG). In this paper, we used EEG band energy, approximate entropy (ApEn), sample entropy (SampEn), and Lempel–Ziv complexity (LZC) features to characterize the intrinsic rehabilitation dynamics of the injured brain area, thus providing a means of detecting and exploring the mechanism of neurological remodeling during the recovery process after brain injury. Bilateral symmetrical EEGs were recorded from awake model rats in the injury group (n = 12) and the sham group (n = 12) on days 1, 4, and 7 after a unilateral brain injury. Open field test (OFT) experiments were performed in three groups: an injury group, a sham group, and a control group (n = 10). An analysis of the EEG data using the energy, ApEn, SampEn, and LZC features demonstrated that the increase in SampEn was associated with functional recovery. After the brain injury, the energy values of the delta1 bands on day 4; the delta2 bands on days 4 and 7; and the theta, alpha, and beta bands, as well as the values of ApEn, SampEn, and LZC of the cortical EEG signal, on days 1, 4, and 7 were significantly lower in the injured brain area than in the non-injured area. During the recovery of the injured brain area, the values of the beta bands, ApEn, and SampEn of the injury group increased significantly and gradually became equal to those of the sham group. The improvement in the motor function of the model rats correlated significantly with the increase in SampEn. This study provides a method based on nonlinear EEG features for measuring neural remodeling in injured brain areas during brain function recovery. The results may aid in the study of neural remodeling mechanisms.
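Sample entropy, the feature found most informative above, can be sketched as follows. This is a textbook SampEn(m, r) implementation with the common r = 0.2·SD default, not the study's code:

```python
import math


def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points (within tolerance r, Chebyshev
    distance, no self-matches) also match for m + 1 points."""
    n = len(x)
    if r is None:
        mean = sum(x) / n
        r = 0.2 * (sum((v - mean) ** 2 for v in x) / n) ** 0.5  # 0.2 * SD default

    def matches(mm):
        # Count template pairs (i < j) whose length-mm windows stay within r.
        c = 0
        for i in range(n - mm):
            for j in range(i + 1, n - mm):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c

    b, a = matches(m), matches(m + 1)
    return math.log(b / a) if a > 0 and b > 0 else float('inf')
```

Regular signals (most length-m matches extend to m + 1) give values near zero, while irregular signals give larger values, which is the direction of change the study associates with recovery.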
45

Xynidis, Michael A., Brian F. Goldiez, Jack E. Norfleet, and Nina Rothstein. "Methodology for quantitative assessment of combat casualty care." SIMULATION 95, no. 4 (2018): 289–95. http://dx.doi.org/10.1177/0037549718777898.

Full text
Abstract:
Evaluating proficiency in simulation-based combat casualty training includes the assessment of hands-on training with mannequins through instructor observation. The evaluation process is error-prone due to high student–instructor ratios as well as the subjective nature of the evaluation process. Other logistical inconsistencies, such as the short amount of time available to observe individual student performance, can lessen training effectiveness as well. The simulation-based methodology described in this article addresses these challenges by way of quantitative assessment of training effectiveness in combat casualty training. The methodology discusses the adaptation of Lempel–Ziv (LZ) complexity indexing to quantify psychomotor activity that is otherwise only subjectively estimated by an instructor. LZ indexing has been successfully used to assess proficiency in related studies of simulation-based training conducted by Bann et al. at the Imperial College of Science, Technology and Medicine in London, and more recently by Watson at the University of North Carolina at Chapel Hill. This type of analysis has been applied to using simulation as a tool to assess not only mastery of a task, but also whether a particular simulator and training approach actually works. Data have been gathered from nearly 100 military combat medic trainees at the Joint Base Lewis-McChord Medical Simulation Training Center. Participant hand-acceleration data from an emergency surgical cricothyrotomy reveal a statistically significant difference in ability between expertise levels: the higher the LZ score and self-reported expertise level, the better the participant performed. The results show that when presented with demographic and video performance-based data, it is possible to gauge experience by applying LZ scoring to motion data. The methodology provides an objective measure that complements the subjective component of simulation-based cricothyrotomy training assessments.
Further study is needed to determine whether this methodology would provide similar assessment advantages in other medical training in which speed and accuracy would be significant factors in determining procedural expertise.
46

RAPP, P. E., C. J. CELLUCCI, T. A. A. WATANABE, and A. M. ALBANO. "QUANTITATIVE CHARACTERIZATION OF THE COMPLEXITY OF MULTICHANNEL HUMAN EEGS." International Journal of Bifurcation and Chaos 15, no. 05 (2005): 1737–44. http://dx.doi.org/10.1142/s0218127405012764.

Full text
Abstract:
In this contribution, eleven different measures of the complexity of multichannel EEGs are described, and their effectiveness in discriminating between two behavioral conditions (eyes open resting versus eyes closed resting) is compared. Ten of the methods were variants of the algorithmic complexity and the covariance complexity. The eleventh measure was a multivariate complexity measure proposed by Tononi and Edelman. The most significant between-condition change was observed with Tononi–Edelman complexity which decreased in the eyes open condition. Of the algorithmic complexity measures tested, the binary Lempel–Ziv complexity and the binary Lempel–Ziv redundancy of the first principal component following mean normalization and normalization against the standard deviation gave the most significant between-group discrimination. A time-dependent generalization of the covariance complexity that can be applied to nonstationary multichannel signals is also described.
47

Zhao, Lulu, Licai Yang, Baimin Li, Zhonghua Su, and Chengyu Liu. "Frontal Alpha Complexity of Different Severity Depression Patients." Journal of Healthcare Engineering 2020 (September 22, 2020): 1–8. http://dx.doi.org/10.1155/2020/8854725.

Full text
Abstract:
Depression is a leading cause of disability worldwide, and objective biomarkers are required for future computer-aided diagnosis. This study aims to assess the variation of frontal alpha complexity among depression patients of different severity and healthy subjects, and thereby to explore depressed neuronal activity and suggest valid biomarkers. Three-channel resting electroencephalogram signals were collected from 69 depression patients (divided into three groups according to disease severity) and 14 healthy subjects. Sample entropy and Lempel–Ziv complexity methods were employed to evaluate electroencephalogram complexity across the different-severity depression groups and the healthy group. The Kruskal–Wallis rank test and group t-tests were performed to test the significance of differences among the four groups and between each pair of groups separately. All index values show that depression patients have significantly increased complexity compared to healthy subjects, and furthermore, the complexity keeps increasing as the depression deepens. Sample entropy measures exhibit superiority in distinguishing mild depression from the healthy group, with a significant difference even between the nondepressive-state group and the healthy group. The results confirm the altered neuronal activity influenced by depression severity and suggest sample entropy and Lempel–Ziv complexity as promising biomarkers for future depression evaluation and diagnosis.
48

Tosun, Pinar Deniz, Derk-Jan Dijk, Raphaelle Winsky-Sommerer, and Daniel Abasolo. "Effects of Ageing and Sex on Complexity in the Human Sleep EEG: A Comparison of Three Symbolic Dynamic Analysis Methods." Complexity 2019 (January 2, 2019): 1–12. http://dx.doi.org/10.1155/2019/9254309.

Full text
Abstract:
Symbolic dynamic analysis (SDA) methods have been applied to biomedical signals and have been proven efficient in characterising differences in the electroencephalogram (EEG) in various conditions (e.g., epilepsy, Alzheimer’s, and Parkinson’s diseases). In this study, we investigated the use of SDA on EEGs recorded during sleep. Lempel-Ziv complexity (LZC), permutation entropy (PE), and permutation Lempel-Ziv complexity (PLZC), as well as power spectral analysis based on the fast Fourier transform (FFT), were applied to 8-h sleep EEG recordings in healthy men (n=31) and women (n=29), aged 20-74 years. The results of the SDA methods and FFT analysis were compared and the effects of age and sex were investigated. Surrogate data were used to determine whether the findings with SDA methods truly reflected changes in nonlinear dynamics of the EEG and not merely changes in the power spectrum. The surrogate data analysis showed that LZC merely reflected spectral changes in EEG activity, whereas PE and PLZC reflected genuine changes in the nonlinear dynamics of the EEG. All three SDA techniques distinguished the vigilance states (i.e., wakefulness, REM sleep, NREM sleep, and its sub-stages: stage 1, stage 2, and slow wave sleep). Complexity of the sleep EEG increased with ageing. Sex on the other hand did not affect the complexity values assessed with any of these three SDA methods, even though FFT detected sex differences. This study shows that SDA provides additional insights into the dynamics of sleep EEG and how it is affected by ageing.
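Permutation entropy, one of the three SDA measures compared above, can be sketched with a Bandt-Pompe ordinal-pattern count. Unit embedding delay is assumed here; this is a generic illustration, not the study's implementation:

```python
import math


def permutation_entropy(x, order=3, normalize=True):
    """Bandt-Pompe permutation entropy with unit delay: the Shannon
    entropy of the distribution of ordinal patterns of length `order`."""
    counts = {}
    for i in range(len(x) - order + 1):
        # Ordinal pattern = argsort of the window (ties broken by position).
        pattern = tuple(sorted(range(order), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return h / math.log2(math.factorial(order)) if normalize else h
```

Because only the ordering within each window matters, the measure is robust to monotone amplitude distortions; a strictly monotone signal uses a single pattern and scores 0, while noise approaches the normalized maximum of 1.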
49

Wong, Nicole YE, Hanna van Waart, Jamie W. Sleigh, Simon J. Mitchell, and Xavier CE Vrijdag. "A systematic review of electroencephalography in acute cerebral hypoxia: clinical and diving implications." Diving and Hyperbaric Medicine Journal 53, no. 3 (2023): 268–80. http://dx.doi.org/10.28920/dhm53.3.268-280.

Full text
Abstract:
Introduction: Hypoxia can cause central nervous system dysfunction and injury, and is a particular risk during rebreather diving. Given its subtle symptom profile and its catastrophic consequences, there is a need for reliable hypoxia monitoring. Electroencephalography (EEG) is being investigated as a real-time monitor for multiple diving problems related to inspired gas, including hypoxia. Methods: A systematic literature search identified articles investigating the relationship between EEG changes and acute cerebral hypoxia in healthy adults. The quality of clinical evidence was assessed using the Newcastle-Ottawa scale. Results: Eighty-one studies were included for analysis; only one investigated divers. Twelve studies described quantitative EEG spectral power differences. Moderate hypoxia tended to result in increased alpha activity. With severe hypoxia, alpha activity decreased whilst delta and theta activities increased. However, since studies that utilised cognitive testing during the hypoxic exposure more frequently reported opposite results, it appears cognitive processing might mask hypoxic EEG changes. Other analysis techniques (evoked potentials and electrical equivalents of dipole signals) demonstrated sustained regulation of autonomic responses despite worsening hypoxia. Other studies utilised quantitative EEG analysis techniques (bispectral index [BIS™], approximate entropy, and Lempel-Ziv complexity). No change was reported in the BIS™ value, whilst approximate entropy and Lempel-Ziv complexity increased with worsening hypoxia. Conclusions: Electroencephalographic frequency patterns change in response to acute cerebral hypoxia. There is a paucity of literature on the relationship between quantitative EEG analysis techniques and cerebral hypoxia. Because of the conflicting results in EEG power frequency analysis, future research needs to quantitatively define a hypoxia-EEG response curve and how it is altered by concurrent cognitive task loading.
50

Signorini, M., G. Magenes, and M. Ferrario. "Comparison between Fetal Heart Rate Standard Parameters and Complexity Indexes for the Identification of Severe Intrauterine Growth Restriction." Methods of Information in Medicine 46, no. 02 (2007): 186–90. http://dx.doi.org/10.1055/s-0038-1625404.

Full text
Abstract:
Summary. Objectives: Intrauterine growth restriction (IUGR) is a pathological state: the fetus is at risk of hypoxia, and this condition is associated with increased perinatal morbidity and mortality. However, evidence-based guidelines for clinical surveillance are poor and lack reliable indexes. This study introduces new procedures to extract parameters from the fetal heart rate signal in order to identify severely growth-restricted (IUGR) fetuses. Methods: Standard parameters (time-domain and frequency-domain indexes) are compared to a new parameter, the Lempel-Ziv complexity, and to two regularity estimators (approximate entropy and sample entropy). The paper analyzes the robustness of the indexes resulting from the parameter extraction procedure. Results and Conclusions: The results show that the LZ complexity is a stable parameter and is able to significantly discriminate severe IUGR (preterm delivered) from moderate IUGR (at term delivered) and from healthy fetuses.
