870 result(s) for "sample entropy"
Analysis Of Portevin-Le Châtelier Effect Data Using Different Sample Entropy Measures
This work is focused on calculating entropy measures for signals in order to identify Portevin-Le Châtelier (PLC) effect types. The PLC effect is a phenomenon occurring in metals, in particular steel and aluminum alloys, within a certain range of strain rates and temperatures. It is characterized by serrations (repetitive changes from hardening to softening) visible in a load-displacement diagram and associated strain rate bands moving through a sample. Three main PLC types are distinguished: A, B and C. Type A occurs at low temperature and high strain rate; the strain rate bands then propagate continuously. Type B occurs at medium temperature and strain rate; the bands then have a hopping character. Type C occurs at high temperature and low strain rate; the bands then nucleate in a random manner. The entropy analysis is used as a way to distinguish the types. The so-called Sample Entropy, Sample Entropy 2D and Multiscale Sample Entropy are measures used in signal analysis to look for patterns in data. Sample Entropy takes into consideration only force values, which need to be sampled at equal intervals. Sample Entropy 2D, on the other hand, also accounts for the distances between points. Multiscale Sample Entropy extends the standard approach by analyzing the signal across multiple time scales. For the computations, experimental results in the form of load-displacement diagrams for tensile tests performed on bone-shaped samples are used. The experimental tests were performed at room temperature for three strain rates. The band types are first identified from DIC data by observing the band movement. It is found that for the high strain rate type A is observed, for the medium strain rate first type A and then type B, and for the low strain rate type C. The Sample Entropy and Sample Entropy 2D measures are low for type C and high for type A. The different behavior of these two types is also visible at higher time scales. It is also found that more experiments are needed to assess type B of the PLC effect.
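The Sample Entropy measure referred to throughout these results is simple to state: count template pairs of length m and m + 1 whose Chebyshev distance stays within a tolerance r, and take the negative log of the ratio of the two counts. The sketch below is a minimal NumPy illustration of that idea, not code from any of the papers listed here; the 0.2-standard-deviation default for r is a common convention assumed for illustration.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Minimal Sample Entropy sketch.

    Counts template pairs of length m and m + 1 whose Chebyshev
    distance stays within the tolerance r, excluding self-matches,
    and returns -ln(A / B).
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()              # assumed default: 20% of the series SD
    n = len(x)

    def count_matches(dim):
        # All overlapping templates of length `dim`
        templates = np.array([x[i:i + dim] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev (max-norm) distance to all later templates
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)        # matches of length m
    a = count_matches(m + 1)    # matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# A regular signal gives a low value, white noise a much higher one:
# sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 2000)))   # close to 0
# sample_entropy(np.random.randn(2000))                      # roughly 2
```

This low-versus-high contrast between regular and irregular signals is the kind of behavior the band-type analysis above relies on.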
New Fast ApEn and SampEn Entropy Algorithms Implementation and Their Application to Supercomputer Power Consumption
Approximate Entropy and especially Sample Entropy are frequently used algorithms for calculating a measure of complexity of a time series. A lesser known fact is that there are also accelerated modifications of these two algorithms, namely Fast Approximate Entropy and Fast Sample Entropy. All of these algorithms are efficiently implemented in the R software package TSEntropies. This paper contains not only an explanation of all these algorithms, but also the principle of their acceleration. Furthermore, the paper describes the functions of this software package and their parameters, and gives simple examples of using the package to calculate these measures of complexity for an artificial time series and for the time series of a complex real-world system, represented by the course of supercomputer infrastructure power consumption. These time series were also used to test the speed of the package and to compare it with another R package, pracma. The results show that TSEntropies is up to 100 times faster than pracma, and, as another important result, that the computational times of the new Fast Approximate Entropy and Fast Sample Entropy algorithms are up to 500 times lower than those of their original versions. At the very end of the paper, possible uses of the TSEntropies package are proposed.
Coarse-Graining Approaches in Univariate Multiscale Sample and Dispersion Entropy
The evaluation of complexity in univariate signals has attracted considerable attention in recent years. This is often done using the framework of Multiscale Entropy, which entails two basic steps: coarse-graining, to consider multiple temporal scales, and evaluation of irregularity at each of those scales with entropy estimators. Recent developments in the field have proposed modifications to this approach to facilitate the analysis of short time series. However, the role of downsampling in the classical coarse-graining process and its relationship with alternative filtering techniques has not been systematically explored yet. Here, we assess the impact of coarse-graining in multiscale entropy estimations based on both Sample Entropy and Dispersion Entropy. We compare the classical moving average approach with low-pass Butterworth filtering, both with and without downsampling, and with empirical mode decomposition in Intrinsic Multiscale Entropy, on selected synthetic data and two real physiological datasets. The results show that when the sampling frequency is low or high, downsampling respectively decreases or increases the entropy values. Our results suggest that, when dealing with long signals and relatively low levels of noise, the refined composite method makes little difference in the quality of the entropy estimation, at the expense of considerable additional computational cost. It is also found that downsampling within the coarse-graining procedure may not be required to quantify the complexity of signals, especially short ones. Overall, we expect these results to contribute to the ongoing discussion about the development of stable, fast and noise-robust multiscale entropy techniques suited for either short or long recordings.
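For reference, the two coarse-graining variants contrasted above can be sketched as follows. This is an illustrative NumPy fragment under the usual textbook definitions (non-overlapping means for the classical procedure), not code from the study; the Butterworth and empirical-mode-decomposition alternatives are omitted.

```python
import numpy as np

def coarse_grain(x, scale):
    """Classical coarse-graining: non-overlapping means of length `scale`
    (a moving average followed by downsampling by the same factor)."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // scale) * scale
    return x[:n].reshape(-1, scale).mean(axis=1)

def moving_average(x, scale):
    """The same low-pass step but without downsampling, one of the
    alternatives compared in the paper."""
    x = np.asarray(x, dtype=float)
    kernel = np.ones(scale) / scale
    return np.convolve(x, kernel, mode='valid')

# Multiscale entropy then applies an estimator (e.g. the sample_entropy
# sketch above, or a Dispersion Entropy estimator) at each scale:
# mse = [sample_entropy(coarse_grain(x, s)) for s in range(1, 21)]
```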
Novel multiscale E-metric cross-sample entropy-based cardiac arrhythmia detection and its performance investigation in reference to multiscale cross-sample entropy-based analysis
Cardiac arrhythmia is a common disorder of the human cardiovascular system and can be evaluated using cardiac rate variability. Multiscale Cross Sample Entropy (MCSEn) is used as a reference to quantify cardiac arrhythmia on the basis of the complexity of a pair of interval series at multiple scales. This measure fails to provide complexity at reduced scale factors for large data lengths. To adapt this measure to two-series cardiac data using a coarse-graining process, Multiscale E-metric Cross Sample Entropy (MECSEn) has been proposed; it is used to measure complexity between arrhythmia subjects, namely atrial fibrillation (AF) and congestive heart failure (CHF), and healthy subjects at multiple scales. Besides handling short series data and undefined values, MECSEn introduces the new idea of dispensing with a large number of scale factors when evaluating the complexity between two different interval series across multiple scales, which makes the proposed algorithm less time-consuming. Both measures find that data from AF subjects behave as white noise and data from CHF subjects behave as pink noise. The t-test validates MCSEn and the proposed MECSEn algorithm, with p < 0.00001. Moreover, the MCSEn and MECSEn algorithms are compared with the multiscale sample entropy algorithm (MSEn), which uses a single cardiac series to evaluate the complexity of healthy and arrhythmia subjects.
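Both MCSEn and the proposed MECSEn build on cross-sample entropy, which matches templates drawn from one series against templates drawn from the other. The sketch below shows only the standard cross-sample entropy step, as a NumPy illustration under a common convention of z-scoring both series; the E-metric refinement and the coarse-graining across scales described in the abstract are paper-specific and only indicated in comments.

```python
import numpy as np

def cross_sample_entropy(u, v, m=2, r=0.2):
    """Standard cross-sample entropy between two equal-length series.

    Templates from `u` are matched against templates from `v`
    (Chebyshev distance <= r) for lengths m and m + 1, and the
    negative log of the ratio of match counts is returned.
    Both series are z-scored so r is expressed in SD units.
    """
    u = (np.asarray(u, float) - np.mean(u)) / np.std(u)
    v = (np.asarray(v, float) - np.mean(v)) / np.std(v)
    n = min(len(u), len(v))

    def matches(dim):
        tu = np.array([u[i:i + dim] for i in range(n - m)])
        tv = np.array([v[i:i + dim] for i in range(n - m)])
        count = 0
        for i in range(len(tu)):
            dist = np.max(np.abs(tv - tu[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = matches(m)
    a = matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# A multiscale version (as in MCSEn) would apply this to coarse-grained
# copies of both series at each scale; the E-metric modification proposed
# in the paper is not reproduced here.
```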
Inverse sample entropy analysis for stock markets
Entropy has been an important tool for the complexity analysis of time series from various fields. Based on studying all the template mismatches, a modified sample entropy (SE) method, named inverse sample entropy (ISE), is proposed in this paper for investigating the complexity of financial time series. Unlike SE, ISE considers the far neighbors of templates; it also provides more comprehensive information when combined with SE. Stock markets usually fluctuate with economic policies, and ISE allows us to detect financial crises through changes in complexity. In experiments on both simulated data and real-world stock data, ISE shows that the threshold r is more flexible than that of SE, which allows ISE to be applied to a wider range of data. In addition, it is more robust to a high embedding dimension m, so ISE can be extended to high-dimensional analysis. To study the impact of the embedding dimension m at multiple scales on both artificial and real-world data, we compared the use of SE and ISE. Both SE and ISE are able to distinguish time series with different features and characteristics. While SE is sensitive in high-dimensional analysis, ISE remains robust.
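The abstract describes ISE as working with template mismatches, i.e. the far neighbors of each template rather than its close ones. The fragment below is one plausible reading of that idea, offered purely for illustration; the published definition of ISE may normalize or orient the ratio differently.

```python
import numpy as np

def inverse_sample_entropy(x, m=2, r=0.2):
    """Illustrative 'far neighbour' variant of Sample Entropy.

    ASSUMPTION: mismatches (Chebyshev distance > r) are counted instead
    of matches, with the ratio taken the same way as in SE; the authors'
    exact ISE definition may differ in detail.
    """
    x = np.asarray(x, float)
    x = (x - x.mean()) / x.std()        # r is then in SD units
    n = len(x)

    def mismatches(dim):
        t = np.array([x[i:i + dim] for i in range(n - m)])
        count = 0
        for i in range(len(t) - 1):
            dist = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            count += np.sum(dist > r)   # far neighbours, not close ones
        return count

    b = mismatches(m)
    a = mismatches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```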
A New Entropy-Based Atrial Fibrillation Detection Method for Scanning Wearable ECG Recordings
Entropy-based atrial fibrillation (AF) detectors have been applied to short-term electrocardiogram (ECG) analysis. However, existing methods suffer from several limitations. To enhance the performance of entropy-based AF detectors, we have developed a new entropy measure, named EntropyAF, which includes the following improvements: (1) use of a ranged function rather than the Chebyshev function to define vector distance, (2) use of a fuzzy function to determine vector similarity, (3) replacement of the probability estimation with density estimation for the entropy calculation, (4) use of a flexible distance threshold parameter, and (5) adjustment of the entropy results for the heart rate effect. EntropyAF was trained using the MIT-BIH Atrial Fibrillation (AF) database and tested on clinical wearable long-term AF recordings. Three previous entropy-based AF detectors were used for comparison: sample entropy (SampEn), fuzzy measure entropy (FuzzyMEn) and the coefficient of sample entropy (COSEn). For classifying AF and non-AF rhythms in the MIT-BIH AF database, EntropyAF achieved the highest area under the receiver operating characteristic curve (AUC) value of 98.15% when using a 30-beat time window, which was higher than COSEn with an AUC of 91.86%. SampEn and FuzzyMEn resulted in much lower AUCs of 74.68% and 79.24%, respectively. For classifying AF and non-AF rhythms in the clinical wearable AF database, EntropyAF also generated the largest values of the Youden index (77.94%), sensitivity (92.77%), specificity (85.17%), accuracy (87.10%), positive predictivity (68.09%) and negative predictivity (97.18%). COSEn had the second-best accuracy of 78.63%, followed by accuracies of 65.08% for FuzzyMEn and 59.91% for SampEn. The newly proposed EntropyAF also generated the highest classification accuracy when using a 12-beat time window. In addition, the results of the time cost analysis verified the efficiency of the new EntropyAF. This study showed the better discrimination ability of the EntropyAF method for identifying AF, indicating that it would be useful for practical clinical wearable AF scanning.
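Improvement (2), replacing the hard distance threshold with a fuzzy similarity function, follows the general idea popularized by fuzzy entropy measures such as FuzzyMEn: each template pair contributes a graded weight that decays with distance instead of a 0/1 count. The sketch below illustrates that idea with an exponential membership function; it is an assumption-laden illustration, not the EntropyAF definition, whose ranged distance function, density estimation and heart-rate adjustment are not reproduced.

```python
import numpy as np

def fuzzy_similarity_entropy(x, m=2, r=0.2, p=2):
    """Fuzzy (graded) template similarity, in the spirit of improvement (2).

    Instead of a hard threshold (distance <= r counts as 1, else 0),
    each template pair contributes exp(-(d / r) ** p), so near-misses
    still add partial weight.  This mirrors the classic FuzzyEn idea,
    not EntropyAF itself.
    """
    x = np.asarray(x, float)
    x = (x - x.mean()) / x.std()
    n = len(x)

    def weighted(dim):
        t = np.array([x[i:i + dim] for i in range(n - m)])
        t = t - t.mean(axis=1, keepdims=True)   # remove each template's baseline
        total = 0.0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            total += np.sum(np.exp(-(d / r) ** p))
        return total

    return -np.log(weighted(m + 1) / weighted(m))
```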
Multivariate multiscale sample entropy of traffic time series
Multivariate time series are common in the traffic system and are necessary for understanding its properties. This paper introduces multivariate multiscale sample entropy (MMSE) to evaluate the complexity of multiple data channels over different timescales. We illustrate the necessity and advantages of the MMSE method by comparing MMSE results with multiscale sample entropy results on original and shuffled traffic time series, respectively. MMSE is capable of revealing long-range correlations and provides robust estimates of the complexity of traffic time series. We then use MMSE to assess the relative complexity of normalized multichannel temporal data in the traffic system and to reveal the weekday and weekend patterns contained in traffic signals. MMSE can provide more accurate and helpful knowledge about the complexity of traffic time series, and thus about the dynamics and inner mechanisms of the traffic system, from the perspective of multiple variables.
A Novel Blind Signal Detector Based on the Entropy of the Power Spectrum Subband Energy Ratio
In this paper, we present a novel blind signal detector based on the entropy of the power spectrum subband energy ratio (PSER), whose detection performance is significantly better than that of the classical energy detector. This detector is a full power spectrum detection method and does not require the noise variance or prior information about the signal to be detected. Based on an analysis of the statistical characteristics of the power spectrum subband energy ratio, this paper proposes concepts such as interval probability, interval entropy, sample entropy, joint interval entropy, PSER entropy, and sample entropy variance. Based on the multinomial distribution, the formulas for calculating the PSER entropy and the variance of the sample entropy in the case of pure noise are derived. Based on the mixture multinomial distribution, the formulas for calculating the PSER entropy and the variance of the sample entropy in the case of signals mixed with noise are also derived. Under the constant false alarm strategy, the detector based on the entropy of the power spectrum subband energy ratio is derived. The experimental results for primary signal detection are consistent with the theoretical calculations, which shows that the detection method is correct.
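As a rough illustration of the quantities named above, the fragment below computes a power spectrum, splits it into equal-width subbands, treats each subband's share of the total energy (the PSER) as a probability, and returns the Shannon entropy of that distribution. The interval probabilities, the multinomial-based variance formulas and the constant-false-alarm threshold derived in the paper are not reproduced; this is only a sketch of the basic statistic, with the subband count chosen arbitrarily.

```python
import numpy as np

def pser_entropy(signal, num_subbands=16):
    """Illustrative power spectrum subband energy ratio (PSER) entropy.

    Splits the power spectrum into equal-width subbands, takes each
    subband's share of the total energy as a probability, and returns
    the Shannon entropy of that distribution.
    """
    spectrum = np.abs(np.fft.rfft(np.asarray(signal, float))) ** 2
    n = (len(spectrum) // num_subbands) * num_subbands
    subband_energy = spectrum[:n].reshape(num_subbands, -1).sum(axis=1)
    ratios = subband_energy / subband_energy.sum()
    ratios = ratios[ratios > 0]                 # avoid log(0)
    return -np.sum(ratios * np.log(ratios))
```

Intuitively, pure noise spreads its energy roughly evenly across subbands (high entropy), whereas a narrowband signal concentrates it in a few subbands (lower entropy), which is the kind of contrast an entropy-based detector can exploit.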
Composite Multiscale Partial Cross-Sample Entropy Analysis for Quantifying Intrinsic Similarity of Two Time Series Affected by Common External Factors
In this paper, we propose a new cross-sample entropy, namely the composite multiscale partial cross-sample entropy (CMPCSE), for quantifying the intrinsic similarity of two time series affected by common external factors. First, in order to test the validity of CMPCSE, we apply it to three sets of artificial data. Experimental results show that CMPCSE can accurately measure the intrinsic cross-sample entropy of two simultaneously recorded time series by removing the effects of a third time series. CMPCSE is then employed to investigate the partial cross-sample entropy of the Shanghai Securities Composite Index (SSEC) and the Shenzhen Stock Exchange Component Index (SZSE) by eliminating the effect of the Hang Seng Index (HSI). Compared with the composite multiscale cross-sample entropy, the results obtained by CMPCSE show that SSEC and SZSE have stronger similarity. We believe that CMPCSE is an effective tool for studying the intrinsic similarity of two time series.
Predicting suicidality in late‐life depression by 3D convolutional neural network and cross‐sample entropy analysis of resting‐state fMRI
Background: Predicting suicide is a pressing issue among older adults; however, predicting its risk is difficult. Capitalizing on recent developments in machine learning, considerable progress has been made in predicting complex behavior such as suicide. As depression remains the strongest risk factor for suicide, we aimed to apply deep learning algorithms to identify suicidality in a group with late-life depression (LLD). Methods: We enrolled 83 patients with LLD, 35 of whom were non-suicidal and 48 suicidal (26 with only suicidal ideation and 22 with past suicide attempts), for resting-state functional magnetic resonance imaging (MRI). Cross-sample entropy (CSE) analysis was conducted to examine the complexity of MRI signals among brain regions. Three-dimensional (3D) convolutional neural networks (CNNs) were used, and the classification accuracy in each brain region was averaged to predict suicidality after sixfold cross-validation. Results: We found that the brain regions with a mean accuracy above 75% for predicting suicidality were located mostly in the default mode, fronto-parietal, and cingulo-opercular resting-state networks. The models with the right amygdala and left caudate provided the most reliable accuracy in all cross-validation folds, indicating their neurobiological importance in late-life suicide. Conclusion: Combining CSE analysis and the 3D CNN, several brain regions were found to be associated with suicidality. Predicting suicide in older adults is difficult; using machine learning, we can predict suicidality from the complexity of resting-state fMRI data in certain brain regions.