Entropy-based probabilistic fatigue damage prognosis and algorithmic performance comparison
National Aeronautics and Space Administration — In this paper, a maximum entropy-based general framework for probabilistic fatigue damage prognosis is investigated. The proposed methodology is based on an...
Entropy Evaluation Based on Value Validity
Directory of Open Access Journals (Sweden)
Tarald O. Kvålseth
2014-09-01
Besides its importance in statistical physics and information theory, the Boltzmann-Shannon entropy S has become one of the most widely used and misused summary measures of various attributes (characteristics) in diverse fields of study. It has also been the subject of extensive and perhaps excessive generalizations. This paper introduces the concept and criteria of value validity as a means of determining whether an entropy takes on values that reasonably reflect the attribute being measured and that permit different types of comparisons to be made for different probability distributions. While neither S nor its relative-entropy equivalent S* meets the value-validity conditions, certain power functions of S and S* do so to a considerable extent. No parametric generalization offers any advantage over S in this regard. A measure based on Euclidean distances between probability distributions is introduced as a potential entropy that complies fully with the value-validity requirements, and its statistical inference procedure is discussed.
Discretization Based on Entropy and Multiple Scanning
Directory of Open Access Journals (Sweden)
Jerzy W. Grzymala-Busse
2013-04-01
In this paper we present an entropy-driven methodology for discretization. Recently, the original entropy-based discretization was enhanced by including two options for selecting the best numerical attribute. In the first option, Dominant Attribute, the attribute with the smallest conditional entropy of the concept given the attribute is selected for discretization, and then the best cut point is determined. In the second option, Multiple Scanning, all attributes are scanned a number of times, and at the same time the best cut points are selected for all attributes. The results of experiments on 17 benchmark data sets, including large data sets with 175 attributes or 25,931 cases, are presented. For comparison, the results of experiments on the same data sets using the global versions of the well-known discretization methods of Equal Interval Width and Equal Frequency per Interval are also included. The entropy-driven technique enhanced both of these methods by converting them into globalized methods. Results of our experiments show that the Multiple Scanning methodology is significantly better than both Dominant Attribute and the better of the globalized Equal Interval Width and Equal Frequency per Interval methods (using a two-tailed test at the 0.01 level of significance).
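The Dominant Attribute step described above — pick the cut point that minimizes the conditional entropy of the concept given the discretized attribute — can be sketched in a few lines. The data and the midpoint candidate rule below are illustrative, not the paper's exact procedure:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def conditional_entropy(values, labels, cut):
    """H(concept | attribute cut at `cut`): weighted entropy of the
    two label subsets induced by the cut point."""
    left = [l for v, l in zip(values, labels) if v <= cut]
    right = [l for v, l in zip(values, labels) if v > cut]
    n = len(labels)
    return (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)

def best_cut(values, labels):
    """Candidate cuts are midpoints between consecutive distinct values;
    the best cut minimizes the conditional entropy."""
    vs = sorted(set(values))
    cuts = [(a + b) / 2 for a, b in zip(vs, vs[1:])]
    return min(cuts, key=lambda c: conditional_entropy(values, labels, c))

values = [1.0, 1.2, 3.5, 3.7, 3.9]
labels = ['A', 'A', 'B', 'B', 'B']
print(best_cut(values, labels))  # 2.35, the perfectly separating cut
```

The same two helpers, applied repeatedly per attribute, give the Multiple Scanning variant.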
Laplante, Jérémie; Groth, Clinton P. T.
2016-11-01
The Navier-Stokes-Fourier (NSF) equations are conventionally used to model continuum flow near local thermodynamic equilibrium. For more rarefied flows, there exists a transitional regime in which the NSF equations no longer hold, and where particle-based methods become too expensive for practical problems. To close this gap, moment closure techniques having the potential to be both valid and computationally tractable for these applications are sought. In this study, a number of five-moment closures for a model one-dimensional kinetic equation are assessed and compared. In particular, four different moment closures are applied to the solution of stationary shocks. The first of these is a Grad-type moment closure, which is known to fail for moderate departures from equilibrium. The second is an interpolative closure based on maximization of thermodynamic entropy, which has previously been shown to provide excellent results for 1D gas-kinetic theory. Additionally, two quadrature methods of moments (QMOM) are considered. One is based on representing the distribution function as a combination of three Dirac delta functions. The other, an extended QMOM (EQMOM), extends the quadrature-based approach by assuming a bi-Maxwellian representation of the distribution function. The closing fluxes are analyzed in each case and the region of physical realizability is examined for the closures. Numerical simulations of stationary shock structures as predicted by each moment closure are compared to reference kinetic and corresponding NSF-like equation solutions. It is shown that the bi-Maxwellian and interpolative maximum-entropy-based moment closures closely reproduce the results of the true maximum-entropy distribution closure for this case, whereas the other methods do not. For moderate departures from local thermodynamic equilibrium, the Grad-type and QMOM closures produced unphysical subshocks and were…
A comparison of EEG spectral entropy with conventional quantitative ...
African Journals Online (AJOL)
A comparison of EEG spectral entropy with conventional quantitative EEG at varying depths of sevoflurane anaesthesia. PR Bartel, FJ Smith, PJ Becker. Abstract. Background and Aim: Recently an electroencephalographic (EEG) spectral entropy module (M-ENTROPY) for an anaesthetic monitor has become commercially ...
Energy Technology Data Exchange (ETDEWEB)
Larche, Michael R.; Prowant, Matthew S.; Bruillard, Paul J.; Hagge, Tobias J.; Fifield, Leonard S.; Hughes, Michael S.; Sun, Xin
2017-04-19
This study compares different approaches for imaging the internal architecture of graphite/epoxy composites using backscattered ultrasound. Two cases are studied. In the first, near-surface defects in thin graphite/epoxy plates are imaged. The same backscattered waveforms were used to produce peak-to-peak, logarithm-of-signal-energy, and entropy images of different types. All of the entropy images exhibit better border delineation and defect contrast than either the peak-to-peak or logarithm-of-signal-energy images. The best results are obtained using the joint entropy of the backscattered waveforms with a reference function. Two different references are examined. The first is a reflection of the insonifying pulse from a stainless steel reflector. The second is an approximate optimum obtained from an iterative parametric search. The joint entropy images produced using this reference exhibit three times the contrast obtained in previous studies. These plates were later destructively analyzed to determine the size and location of near-surface defects, and the results were found to agree with the defect location and shape indicated by the entropy images. In the second study, images of long carbon graphite fibers (50% by weight) in polypropylene thermoplastic are obtained as a first step toward ultrasonic determination of the distributions of fiber position and orientation.
Directory of Open Access Journals (Sweden)
Jasleen Kaur
2013-01-01
Background: The induction dose of propofol is reduced with concomitant use of opioids as a result of a possible synergistic action. Aim and Objectives: The present study compared the effect of fentanyl and two doses of butorphanol pre-treatment on the induction dose of propofol, with specific emphasis on entropy. Methods: Three groups of 40 patients each, of American Society of Anaesthesiologists physical status I and II, were randomized to receive fentanyl 2 μg/kg (Group F), butorphanol 20 μg/kg (Group B20) or 40 μg/kg (Group B40) as pre-treatment. Five minutes later, the degree of sedation was assessed by the observer's assessment of alertness scale (OAA/S). Induction of anesthesia was done with propofol (30 mg/10 s) till the loss of response to verbal commands. Thereafter, rocuronium 1 mg/kg was administered and endotracheal intubation was performed 2 min later. OAA/S, propofol induction dose, heart rate, blood pressure, oxygen saturation and entropy (response and state) were compared in the three groups. Statistical Analysis: Data were analyzed using the ANOVA test with post hoc significance, the Kruskal-Wallis test, the Chi-square test and the Fisher exact test. P < 0.05 was considered significant. Results: The induction dose of propofol (mg/kg) was 1.1±0.50 in Group F, 1.05±0.35 in Group B20 and 1.18±0.41 in Group B40. Induction with propofol occurred at higher entropy values on pre-treatment with both fentanyl and butorphanol. Hemodynamic variables were comparable in all three groups. Conclusion: Butorphanol 20 μg/kg and 40 μg/kg reduce the induction requirement of propofol, comparably to fentanyl 2 μg/kg, and confer hemodynamic stability at induction and intubation.
The Entropy-Based Quantum Metric
Directory of Open Access Journals (Sweden)
Roger Balian
2014-07-01
The von Neumann entropy S(D̂) generates in the space of quantum density matrices D̂ the Riemannian metric ds² = −d²S(D̂), which is physically founded and which characterises the amount of quantum information lost by mixing D̂ and D̂ + dD̂. A rich geometric structure is thereby implemented in quantum mechanics. It includes a canonical mapping between the spaces of states and of observables, which involves the Legendre transform of S(D̂). The Kubo scalar product is recovered within the space of observables. Applications are given to equilibrium and non-equilibrium quantum statistical mechanics. There the formalism is specialised to the relevant space of observables and to the associated reduced states issued from the maximum entropy criterion, which result from the exact states through an orthogonal projection. Von Neumann's entropy specialises into a relevant entropy. Comparison is made with other metrics. The Riemannian properties of the metric ds² = −d²S(D̂) are derived. The curvature arises from the non-Abelian nature of quantum mechanics; its general expression and its explicit form for qubits are given, as well as geodesics.
Entropy based fingerprint for local crystalline order
Piaggi, Pablo M.; Parrinello, Michele
2017-09-01
We introduce a new fingerprint that allows distinguishing between liquid-like and solid-like atomic environments. This fingerprint is based on an approximate expression for the entropy projected on individual atoms. When combined with local enthalpy, this fingerprint acquires an even finer resolution and it is capable of discriminating between different crystal structures.
Robust Discriminant Analysis Based on Nonparametric Maximum Entropy
He, Ran; Hu, Bao-Gang; Yuan, Xiao-Tong
In this paper, we propose a Robust Discriminant Analysis based on the maximum entropy (MaxEnt) criterion (MaxEnt-RDA), which is derived from a nonparametric estimate of Rényi's quadratic entropy. MaxEnt-RDA uses entropy as both objective and constraint; thus the structural information of classes is preserved while information loss is minimized. It is a natural extension of LDA from the Gaussian assumption to any distribution assumption. Like LDA, the optimal solution of MaxEnt-RDA can be obtained by an eigen-decomposition method, where feature extraction is achieved by designing two Parzen probability matrices that characterize the within-class and between-class variation, respectively. Furthermore, MaxEnt-RDA makes use of high-order statistics (entropy) to estimate the probability matrix, so it is robust to outliers. Experiments on toy problems, UCI datasets and face datasets demonstrate the effectiveness of the proposed method in comparison with other state-of-the-art methods.
Directory of Open Access Journals (Sweden)
Jose Antonio Urigüen
Idiopathic epilepsy is characterized by generalized seizures with no apparent cause. One of its main problems is the lack of biomarkers to monitor the evolution of patients. The only tools clinicians can use are limited to inspecting the number of seizures during previous periods of time and assessing the existence of interictal discharges. As a result, there is a need for better tools to assist the diagnosis and follow-up of these patients. The goal of the present study is to compare and find a way to differentiate between two groups of patients suffering from idiopathic epilepsy: one group that could be followed up by means of specific electroencephalographic (EEG) signatures (intercritical activity present), and another that could not due to the absence of these markers. To do that, we analyzed the background EEG activity of each group in the absence of seizures and epileptic intercritical activity. We used the Shannon spectral entropy (SSE) as a metric to discriminate between the two groups and performed permutation-based statistical tests to detect the set of frequencies that show significant differences. By constraining the spectral entropy estimation to the [6.25–12.89] Hz range, we detect statistical differences (below the 0.05 alpha level) between both types of epileptic patients at all available recording channels. Interestingly, entropy values follow a trend that is inversely related to the elapsed time from the last seizure. Indeed, this trend shows asymptotic convergence to the SSE values measured in a group of healthy subjects, who present SSE values lower than either of the two groups of patients. All these results suggest that the SSE, measured in a specific range of frequencies, could serve to follow up the evolution of patients suffering from idiopathic epilepsy. Future studies remain to be conducted in order to assess the predictive value of this approach for the anticipation of seizures.
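A band-limited Shannon spectral entropy of the kind described above can be sketched as follows. The FFT-based power spectral density estimate, the normalization to [0, 1], and the test signals are our own simplifications, not the authors' exact pipeline:

```python
import numpy as np

def spectral_entropy(x, fs, band=(6.25, 12.89)):
    """Normalized Shannon spectral entropy (SSE) of signal x, restricted
    to `band` (Hz). Power in the band is normalized to a probability-like
    distribution; the result is scaled to [0, 1]."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    p = psd[mask]
    p = p / p.sum()                       # normalize band power to sum to 1
    sse = -np.sum(p * np.log2(p + 1e-30))
    return sse / np.log2(len(p))          # scale by max entropy of the band

fs = 256
t = np.arange(0, 4, 1 / fs)
alpha = np.sin(2 * np.pi * 10 * t)        # narrowband 10 Hz rhythm
noise = np.random.default_rng(0).standard_normal(len(t))
print(spectral_entropy(alpha, fs))        # low: power concentrated at 10 Hz
print(spectral_entropy(noise, fs))        # high: power spread across the band
```

A narrowband rhythm inside the band yields a low SSE, while broadband activity yields a value near 1, which is the contrast the study exploits.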
Entropy-based financial asset pricing.
Ormos, Mihály; Zibriczky, Dávid
2014-01-01
We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios more simply and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define the continuous entropy as an alternative measure of risk. Our results show that entropy decreases as a function of the number of securities in a portfolio, in a similar way to the standard deviation, and that efficient portfolios are situated on a hyperbola in the expected return-entropy system. For the empirical investigation we use daily returns of 150 randomly selected securities over a period of 27 years. Our regression results show that entropy has higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore, we show the time-varying behavior of beta along with entropy.
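A histogram-based estimate of the differential entropy of a return series illustrates why entropy can behave like a dispersion-style risk measure; this is an illustrative stand-in, not the authors' exact continuous-entropy estimator, and the return series are simulated:

```python
import numpy as np

def hist_entropy(returns, bins=50):
    """Differential entropy estimate (nats) of a return series via a
    normalized histogram: -sum p(x) ln p(x) dx over nonempty bins."""
    p, edges = np.histogram(returns, bins=bins, density=True)
    w = np.diff(edges)                     # bin widths
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * w[mask])

rng = np.random.default_rng(1)
low_risk = rng.normal(0.0, 0.01, 5000)     # calm asset, 1% daily vol
high_risk = rng.normal(0.0, 0.03, 5000)    # volatile asset, 3% daily vol
print(hist_entropy(low_risk) < hist_entropy(high_risk))  # True
```

For Gaussian returns the differential entropy is ½ ln(2πeσ²), so it grows monotonically with volatility, matching the standard-deviation-like behavior reported above.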
Entropy-based adaptive attitude estimation
Kiani, Maryam; Barzegar, Aylin; Pourtakdoust, Seid H.
2018-03-01
Gaussian approximation filters have increasingly been developed to enhance the accuracy of attitude estimation in space missions. The effective employment of these algorithms demands accurate knowledge of system dynamics and measurement models, as well as their noise characteristics, which are usually unavailable or unreliable. An innovation-based adaptive filtering approach has been adopted as a solution to this problem; however, it exhibits two major challenges, namely appropriate window-size selection and assurance of positive definiteness for the estimated noise covariance matrices. The current work presents two novel techniques based on relative entropy and confidence level concepts in order to address the abovementioned drawbacks. The proposed adaptation techniques are applied to two nonlinear state estimation algorithms, the extended Kalman filter and the cubature Kalman filter, for attitude estimation of a low-Earth-orbit satellite equipped with three-axis magnetometers and Sun sensors. The effectiveness of the proposed adaptation scheme is demonstrated by means of comprehensive sensitivity analysis on the system and environmental parameters, using extensive independent Monte Carlo simulations.
A Deterministic Entropy Based on the Instantaneous Phase Space Volume
Diebner, Hans H.; Rössler, Otto E.
1998-02-01
A deterministic entropic measure is derived for the time evolution of Newtonian N-particle systems based on the volume of the instantaneously occupied phase space (IOPS). This measure is found as a natural extension of Boltzmann's entropy. The instantaneous arrangement of the particles is exploited in the form of spatial correlations. The new entropy is a bridge between the time-dependent Boltzmann entropy, formulated on the basis of densities in the one-particle phase space, and the static Gibbs entropy which uses densities in the full phase space. We apply the new concept in a molecular dynamics simulation (MDS) using an exactly time reversible "discrete Newtonian equation of motion" recently derived from the fundamental principle of least action in discretized space-time. The simulation therefore is consistent with micro-time-reversibility. Entropy becomes an exact momentary observable in both time directions in fulfillment of a dream of Boltzmann.
New class of entropy-power-based uncertainty relations
Jizba, P.; Hayes, A.; Dunningham, JA
2017-08-01
We use the concept of entropy power to introduce a new one-parameter class of information-theoretic uncertainty relations. This class constitutes an infinite hierarchy of uncertainty relations, which allows one to determine the shape of the underlying information-distribution function by measuring the relevant entropy powers. The efficiency of such uncertainty relations in quantum mechanics is illustrated with two examples: superpositions of two squeezed states and a Lévy-type heavy-tailed wave function. Improvement over both the variance-based and Shannon-entropy-based uncertainty relations is demonstrated in both cases.
Eukaryotic promoter prediction based on relative entropy and positional information.
Wu, Shuanhu; Xie, Xudong; Liew, Alan Wee-Chung; Yan, Hong
2007-04-01
Eukaryotic promoter prediction is one of the most important problems in DNA sequence analysis, but also a very difficult one. Although a number of algorithms have been proposed, their performance is still limited by low sensitivity and high false-positive rates. We present a method for improving the performance of promoter-region prediction, focusing on the selection of the most effective features for different functional regions in DNA sequences. Our feature selection algorithm is based on relative entropy, or Kullback-Leibler divergence, and a system combined with position-specific information for promoter-region prediction is developed. The results of testing on large genomic sequences and comparisons with PromoterInspector and Dragon Promoter Finder show that our algorithm is efficient, with higher sensitivity and specificity in predicting promoter regions.
Entropy based file type identification and partitioning
2017-06-01
…the identification of file types and file partitioning. This approach has applications in cybersecurity as it allows for a quick determination of… As a worked example, a binary source with P(X = 0) = 0.25 and P(X = 1) = 0.75 yields H(X) = −[0.25 log₂ 0.25 + 0.75 log₂ 0.75] = 0.8113. Entropy analysis offers a convenient and quick method
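A minimal sketch of byte-level entropy as used for file type identification, which also reproduces the two-symbol H(X) = 0.8113 figure worked out above; the sample byte strings are fabricated for illustration:

```python
import math
from collections import Counter

def byte_entropy(chunk: bytes) -> float:
    """Shannon entropy (bits per byte, 0..8) of a chunk's byte distribution.
    High values suggest compressed or encrypted content; low values suggest
    text or structured headers -- the basis of entropy-based file typing."""
    n = len(chunk)
    return -sum((c / n) * math.log2(c / n) for c in Counter(chunk).values())

# Two-symbol source with p = 0.25 / 0.75, as in the worked example.
h = -sum(q * math.log2(q) for q in [0.25, 0.75])
print(round(h, 4))                          # 0.8113

print(byte_entropy(b"AAAABBBBCCCCDDDD"))    # 2.0: four equiprobable bytes
print(byte_entropy(bytes(range(256)) * 4))  # 8.0: uniform over all byte values
```

Sliding this measure over fixed-size windows of a file gives the per-region entropy profile used for partitioning.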
Deceiving entropy-based DoS detection
Özçelik, İlker; Brooks, Richard R.
2014-06-01
Denial of Service (DoS) attacks disable network services for legitimate users. A McAfee report shows that eight out of ten Critical Infrastructure Providers (CIPs) surveyed had a significant Distributed DoS (DDoS) attack in 2010 [1]. Researchers have proposed many approaches for detecting these attacks in the past decade. Anomaly-based DoS detection is the most common. In this approach, the detector uses statistical features, such as the entropy of incoming packet header fields like source IP addresses or protocol type. It calculates the observed statistical feature and triggers an alarm if an extreme deviation occurs. However, intrusion detection systems (IDS) using entropy-based detection can be fooled by spoofing. An attacker can sniff the network to collect header field data of network packets coming from distributed nodes on the Internet and fuse them to calculate the entropy of normal background traffic. Then s/he can spoof attack packets to keep the entropy value in the expected range during the attack. In this study, we present a proof-of-concept entropy spoofing attack that deceives entropy-based detection approaches. Our preliminary results show that spoofing attacks cause significant detection performance degradation.
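The detection-versus-spoofing mechanism can be illustrated with a toy entropy calculation over source IP addresses; the traffic windows and addresses below are fabricated for illustration only:

```python
import math
from collections import Counter

def header_entropy(ips):
    """Shannon entropy of the source-IP distribution in a traffic window."""
    n = len(ips)
    return -sum((c / n) * math.log2(c / n) for c in Counter(ips).values())

# Hypothetical traffic windows of 1000 packets each.
normal = [f"10.0.{i % 50}.{i % 200}" for i in range(1000)]   # diverse sources
attack = ["10.0.0.1"] * 900 + normal[:100]                   # one flooder dominates
spoofed = [normal[i % len(normal)] for i in range(1000)]     # attacker mimics the
                                                             # background distribution
print(header_entropy(normal))    # high: many sources
print(header_entropy(attack))    # collapses: an entropy detector would alarm
print(header_entropy(spoofed))   # matches normal: entropy alone sees nothing
```

The third window is the essence of the spoofing attack: by matching the empirical header distribution, the attacker keeps the detector's statistic inside its expected range.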
Fuzzy 2-partition entropy threshold selection based on Big Bang–Big Crunch Optimization algorithm
Directory of Open Access Journals (Sweden)
Baljit Singh Khehra
2015-03-01
The fuzzy 2-partition entropy approach has been widely used to select threshold values for image segmentation. This approach uses two parameterized fuzzy membership functions to form a fuzzy 2-partition of the image. The optimal threshold is selected by searching for an optimal combination of parameters of the membership functions such that the entropy of the fuzzy 2-partition is maximized. In this paper, a new fuzzy 2-partition entropy thresholding approach based on Big Bang–Big Crunch Optimization (BBBCO) is proposed. The new thresholding approach is called the BBBCO-based fuzzy 2-partition entropy thresholding algorithm. BBBCO is used to search for an optimal combination of parameters of the membership functions that maximizes the entropy of the fuzzy 2-partition. BBBCO is inspired by a theory of the evolution of the universe, namely the Big Bang and Big Crunch theory. The proposed algorithm is tested on a number of standard test images. For comparison, three other algorithms, Genetic Algorithm (GA)-based, Biogeography-Based Optimization (BBO)-based and recursive approaches, are also implemented. The experimental results show that the proposed algorithm is more effective than the GA-based, BBO-based and recursion-based approaches.
A new entropy based method for computing software structural complexity
Roca, J L
2002-01-01
In this paper a new methodology for the evaluation of software structural complexity is described. It is based on the entropy evaluation of the random uniform response function associated with the so-called software characteristic function (SCF). The behavior of the SCF with different software structures and its relationship with the number of inherent errors is investigated. It is also investigated how the entropy concept can be used to evaluate the complexity of a software structure, considering the SCF as a canonical representation of the graph associated with the control flow diagram. The functions, parameters and algorithms that allow one to carry out this evaluation are also introduced. This analytic phase is followed by an experimental phase, verifying the consistency of the proposed metric and its boundary conditions. The conclusion is that the degree of software structural complexity can be measured as the entropy of the random uniform response function of the SCF. That entropy is in direct relation...
Comparison of transfer entropy methods for financial time series
He, Jiayi; Shang, Pengjian
2017-09-01
There is a certain relationship between the global financial markets, which creates an interactive network of global finance. Transfer entropy, a measure of information transfer, offers a good way to analyse this relationship. In this paper, we analysed the relationships between 9 stock indices from the U.S., Europe and China (from 1995 to 2015) by using transfer entropy (TE), effective transfer entropy (ETE), Rényi transfer entropy (RTE) and effective Rényi transfer entropy (ERTE). We compared the four methods in terms of their effectiveness in identifying the relationships between stock markets. Two kinds of information flows are given. One reveals that the U.S. took the leading position in the lagged-current cases, but for same-date comparisons China is the most influential. ERTE provided superior results.
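A plug-in estimator of transfer entropy for binary series (history length 1) illustrates the TE definition used above: TE(Y→X) = Σ p(x₍t+1₎, x_t, y_t) log₂[p(x₍t+1₎ | x_t, y_t) / p(x₍t+1₎ | x_t)]. The driver/driven series are synthetic, and this simple counting estimator is not the authors' (effective/Rényi) variant:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in TE(Y -> X) for two equal-length discrete series, history 1."""
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_t+1, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_t+1, x_t)
    singles = Counter(x[:-1])                       # x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * math.log2(p_cond_xy / p_cond_x)
    return te

random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]  # driver: fair coin flips
x = [0] + y[:-1]                                 # x copies y with one step of lag
print(transfer_entropy(x, y))   # near 1 bit: y's past determines x's future
print(transfer_entropy(y, x))   # near 0: no flow in the reverse direction
```

For real returns one would first discretize (e.g. sign of return) and, for ETE, subtract the TE of shuffled surrogates to remove the small-sample bias visible in the reverse direction.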
Fractal Image Compression Based on High Entropy Values Technique
Directory of Open Access Journals (Sweden)
Douaa Younis Abbaas
2018-04-01
Many attempts have been made to improve the encoding stage of fractal image compression (FIC), because it is time-consuming. These attempts reduce the size of the search pool for range-domain matching, but most of them lead to poor quality or a lower compression ratio of the reconstructed image. This paper presents a method to improve the performance of the full search algorithm by combining FIC (lossy compression) with a lossless technique (in this case, entropy coding). The entropy technique reduces the size of the domain pool (i.e., the number of domain blocks) based on the entropy values of each range block and domain block. The results of the full search algorithm and the proposed entropy-based algorithm are then compared to see which gives the better results, such as reduced encoding time with acceptable values of both compression quality parameters, C.R (Compression Ratio) and PSNR (Image Quality). The experimental results prove that the proposed entropy technique reduces the encoding time while keeping the compression ratio and reconstructed image quality as good as possible.
Feature Selection with Neighborhood Entropy-Based Cooperative Game Theory
Directory of Open Access Journals (Sweden)
Kai Zeng
2014-01-01
Feature selection plays an important role in machine learning and data mining. In recent years, various feature measurements have been proposed to select significant features from high-dimensional datasets. However, most traditional feature selection methods ignore features that have strong classification ability as a group but are weak as individuals. To deal with this problem, we redefine the redundancy, interdependence, and independence of features by using neighborhood entropy. Then the neighborhood entropy-based feature contribution is proposed under the framework of cooperative games. The evaluative criteria of features can be formalized as the product of contribution and other classical feature measures. Finally, the proposed method is tested on several UCI datasets. The results show that the neighborhood entropy-based cooperative game theory model (NECGT) yields better performance than classical ones.
Directory of Open Access Journals (Sweden)
Xiaowei Wang
2014-01-01
This paper proposes a stepped selection method based on spectral kurtosis (SK) relative energy entropy. First, the length and type of window function are set. When a fault occurs, step 1 is entered: the polarity of the first half-wave extremes is analyzed; if the ratios of extremes between neighboring lines are positive, the bus bar is the fault line; otherwise, the SK relative energy entropies are calculated and step 2 is entered: if the obtained entropy multiple is greater than or equal to the threshold, the overhead line corresponding to the maximum entropy is the fault line; if not, step 3 is entered: the line corresponding to the maximum entropy is the fault line. Finally, the applicability of the proposed algorithm is presented, and the comparison results are discussed.
DDoS Attack Detection Algorithms Based on Entropy Computing
Li, Liying; Zhou, Jianying; Xiao, Ning
Distributed Denial of Service (DDoS) attacks pose a severe threat to the Internet. It is difficult to find an exact signature of such attacks, and it is hard to determine whether an unusually high volume of traffic is caused by an attack or by a huge number of users occasionally accessing the target machine at the same time. Entropy detection is an effective method for detecting DDoS attacks. It is mainly used to calculate the randomness of the distribution of some attributes in the network packets' headers. In this paper, we focus on DDoS detection technology. We improve the previous entropy detection algorithm and propose two enhanced detection methods, based on cumulative entropy and on time, respectively. Experimental results show that these methods lead to more accurate and effective DDoS detection.
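One way to read the cumulative-entropy idea is as a CUSUM-style accumulation of per-window entropy drops below the normal baseline, so that a sustained collapse triggers an alarm even if no single window is extreme. The sketch below is our interpretation with made-up parameters, not the authors' algorithm:

```python
def cusum_alarm(entropies, mean, slack=0.1, threshold=1.0):
    """Accumulate entropy deficits below (mean - slack) across windows and
    flag every window index at which the accumulated deficit exceeds
    `threshold`. All parameters are illustrative."""
    s, alarms = 0.0, []
    for i, h in enumerate(entropies):
        s = max(0.0, s + (mean - slack - h))  # grows only on suspicious drops
        if s > threshold:
            alarms.append(i)
    return alarms

# Per-window source-IP entropy: near 3.0 normally, collapsing under attack.
stream = [3.0, 2.95, 3.05, 3.0, 1.2, 1.1, 1.0, 1.1]
print(cusum_alarm(stream, mean=3.0))  # [4, 5, 6, 7]: alarms once the deficit accumulates
```

Because the statistic only grows on deviations in the suspicious direction, brief benign dips reset to zero while a sustained attack accumulates quickly.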
Image coding based on maximum entropy partitioning for identifying ...
Indian Academy of Sciences (India)
A new coding scheme based on maximum entropy partitioning is proposed in our work, particularly to identify the improbable intensities related to different emotions. The improbable intensities, when used as a mask, decode the facial expression correctly, providing an effective platform for future emotion categorization ...
Entropy-Based Privacy against Profiling of User Mobility
Directory of Open Access Journals (Sweden)
Alicia Rodriguez-Carrion
2015-06-01
Location-based services (LBSs) flood mobile phones nowadays, but their use poses an evident privacy risk. The locations accompanying the LBS queries can be exploited by the LBS provider to build a profile of the user's visited locations, which might disclose sensitive data, such as work or home locations. The classic concept of entropy is widely used to evaluate privacy in these scenarios, where the information is represented as a sequence of independent samples of categorized data. However, since the LBS queries might be sent very frequently, location profiles can be improved by adding temporal dependencies, thus becoming mobility profiles, where location samples are no longer independent and might disclose the user's mobility patterns. Since the time dimension is factored in, the classic entropy concept falls short of evaluating the real privacy level, which also depends on the time component. Therefore, we propose to extend the entropy-based privacy metric to the entropy rate to evaluate mobility profiles. Then, two perturbative mechanisms are considered to preserve location and mobility profiles under gradual utility constraints. We further use the proposed privacy metric and compare it to classic ones to evaluate both synthetic and real mobility profiles when the proposed perturbative methods are applied. The results prove the usefulness of the proposed metric for mobility profiles and the need to tailor the perturbative methods to the features of mobility profiles in order to improve privacy without completely losing utility.
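The gap between classic entropy and entropy rate for a mobility profile can be seen on a toy two-location Markov model; the locations and probabilities are illustrative. For a stationary first-order chain, the entropy rate is H = Σᵢ πᵢ H(row i), where π is the stationary distribution:

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Two locations (say, home and work), visited 50/50 overall, but with
# strongly predictable transitions: stay where you are with prob 0.95.
stationary = [0.5, 0.5]
transition_rows = [[0.95, 0.05],
                   [0.05, 0.95]]

h_classic = entropy(stationary)            # 1.0 bit: profile looks private
h_rate = sum(pi * entropy(row)             # ~0.29 bit: movements are predictable
             for pi, row in zip(stationary, transition_rows))
print(h_classic, round(h_rate, 2))
```

The classic (i.i.d.) entropy sees a maximally uncertain 50/50 profile, while the entropy rate exposes the temporal regularity, which is exactly why the paper argues for the entropy rate as the privacy metric for mobility profiles.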
Epoch-based Entropy for Early Screening of Alzheimer's Disease.
Houmani, N; Dreyfus, G; Vialatte, F B
2015-12-01
In this paper, we introduce a novel entropy measure, termed epoch-based entropy. This measure quantifies the disorder of EEG signals at both the time level and the spatial level, using local density estimation by a hidden Markov model on inter-channel stationary epochs. The investigation is conducted on a multi-centric EEG database recorded from patients at an early stage of Alzheimer's disease (AD) and age-matched healthy subjects. We investigate the classification performance of this method, its robustness to noise, and its sensitivity to sampling frequency and to variations of hyperparameters. The measure is compared to two alternative complexity measures, Shannon's entropy and correlation dimension. The classification accuracies for the discrimination of AD patients from healthy subjects were estimated using a linear classifier designed on a development dataset and subsequently tested on an independent test set. Epoch-based entropy reached a classification accuracy of 83% on the test dataset (specificity = 83.3%, sensitivity = 82.3%), outperforming the two other complexity measures. Furthermore, it was shown to be more stable to hyperparameter variations and less sensitive to noise and sampling frequency disturbances than the other two complexity measures.
A new entropy based method for computing software structural complexity
International Nuclear Information System (INIS)
Roca, Jose L.
2002-01-01
In this paper a new methodology for the evaluation of software structural complexity is described. It is based on the entropy evaluation of the random uniform response function associated with the so-called software characteristic function (SCF). The behavior of the SCF with different software structures and its relationship with the number of inherent errors is investigated. It is also investigated how the entropy concept can be used to evaluate the complexity of a software structure, considering the SCF as a canonical representation of the graph associated with the control flow diagram. The functions, parameters and algorithms that allow this evaluation to be carried out are also introduced. The analytic phase is followed by an experimental phase, verifying the consistency of the proposed metric and its boundary conditions. The conclusion is that the degree of software structural complexity can be measured as the entropy of the random uniform response function of the SCF. That entropy is directly related to the number of inherent software errors and implies a basic hazard failure rate, so that a minimal structure assures a certain stability and maturity of the program. This metric can be used either to evaluate the product or the process of software development, as a development tool or for monitoring the stability and the quality of the final product. (author)
An Entropy-Based Network Anomaly Detection Method
Directory of Open Access Journals (Sweden)
Przemysław Bereziński
2015-04-01
Full Text Available Data mining is an interdisciplinary subfield of computer science involving methods at the intersection of artificial intelligence, machine learning and statistics. One of the data mining tasks is anomaly detection, which is the analysis of large quantities of data to identify items, events or observations which do not conform to an expected pattern. Anomaly detection is applicable in a variety of domains, e.g., fraud detection, fault detection and system health monitoring, but this article focuses on the application of anomaly detection in the field of network intrusion detection. The main goal of the article is to prove that an entropy-based approach is suitable to detect modern botnet-like malware based on anomalous patterns in network traffic. This aim is achieved by realization of the following points: (i) preparation of a concept of an original entropy-based network anomaly detection method, (ii) implementation of the method, (iii) preparation of an original dataset, (iv) evaluation of the method.
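As a minimal illustration of the entropy-based idea (not the authors' method), the normalized entropy of a categorical traffic feature, such as destination ports observed in a time window, jumps toward 1 during a port scan. The two windows below are made up:

```python
import math
from collections import Counter

def normalized_entropy(values):
    """Sample entropy of a categorical feature, normalized to [0, 1]."""
    counts = Counter(values)
    n = len(values)
    h = -sum(c / n * math.log2(c / n) for c in counts.values())
    max_h = math.log2(len(counts)) if len(counts) > 1 else 1.0
    return h / max_h

# Hypothetical windows of destination ports seen at a network sensor.
normal_window = [80, 443, 80, 22, 443, 80, 8080, 443, 80, 53]
scan_window = list(range(1024, 1034))  # a port scan touches many distinct ports

print(normalized_entropy(normal_window))  # moderate: a few popular ports
print(normalized_entropy(scan_window))    # 1.0: every port distinct, flag it
```

A detector would track this statistic per window and alert on sudden shifts in either direction (scans raise it, single-target floods lower it).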
Wu, Yue; Shang, Pengjian; Li, Yilong
2018-03-01
A modified multiscale sample entropy measure based on symbolic representation and similarity (MSEBSS) is proposed in this paper to research the complexity of stock markets. The modified algorithm reduces the probability of inducing undefined entropies and is confirmed to be robust to strong noise. Considering its validity and accuracy, MSEBSS is more reliable than multiscale entropy (MSE) for time series mingled with much noise, like financial time series. We apply MSEBSS to financial markets, and results show American stock markets have the lowest complexity compared with European and Asian markets. There are exceptions to the regularity that stock markets show a decreasing complexity over the time scale, indicating a periodicity at certain scales. Based on MSEBSS, we introduce the modified multiscale cross-sample entropy measure based on symbolic representation and similarity (MCSEBSS) to consider the degree of asynchrony between distinct time series. Stock markets from the same area have higher synchrony than those from different areas. For stock markets having relatively high synchrony, the entropy values decrease with increasing scale factor, while for stock markets having high asynchrony, the entropy values do not always decrease with increasing scale factor; sometimes they tend to increase. So both MSEBSS and MCSEBSS are able to distinguish stock markets of different areas, and they are more helpful if used together for studying other features of financial time series.
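The building blocks shared by all multiscale entropy variants are coarse-graining and sample entropy. The sketch below implements plain MSE, not the symbolic MSEBSS modification described in the paper, on a synthetic noise series:

```python
import math
import random

def coarse_grain(series, scale):
    """Average non-overlapping windows of length `scale`."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(series, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B); tolerance r is a fraction of the SD."""
    n = len(series)
    mean = sum(series) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    tol = r * sd

    def count_matches(length):
        # Use n - m templates for both lengths so counts are comparable.
        templates = [series[i:i + length] for i in range(n - m)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= tol:
                    count += 1
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return float('inf') if a == 0 or b == 0 else -math.log(a / b)

# Multiscale entropy: SampEn of each coarse-grained series.
random.seed(0)
noise = [random.gauss(0, 1) for _ in range(500)]
mse = [sample_entropy(coarse_grain(noise, s)) for s in (1, 2, 3)]
print(mse)  # one SampEn value per scale
```

The "undefined entropies" the abstract mentions correspond to the `A == 0 or B == 0` branch, which plain SampEn hits on short or very irregular series.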
International Nuclear Information System (INIS)
Zhu, Qingjun; Song, Fengquan; Ren, Jie; Chen, Xueyong; Zhou, Bin
2014-01-01
To further expand the application of artificial neural networks in the field of neutron spectrometry, criteria for choosing between an artificial neural network and the maximum entropy method for the purpose of unfolding neutron spectra were presented. The counts of the Bonner spheres for IAEA neutron spectra were used as a database, and the artificial neural network and the maximum entropy method were used to unfold neutron spectra; the mean squares of the spectra were defined as the differences between the desired and unfolded spectra. After the information entropy of each spectrum was calculated using information entropy theory, the relationship between the mean squares of the spectra and the information entropy was acquired. Useful information from the information entropy guided the selection of unfolding methods. Due to the importance of the information entropy, a method for predicting the information entropy using the Bonner spheres' counts was established. The criteria based on information entropy theory can be used to choose between the artificial neural network and the maximum entropy unfolding methods. The application of artificial neural networks to unfold neutron spectra was thus expanded. - Highlights: • Two neutron spectrum unfolding methods, ANN and MEM, were compared. • The spectrum's entropy offers useful information for selecting unfolding methods. • For spectra with low entropy, the ANN was generally better than MEM. • The spectrum's entropy was predicted based on the Bonner spheres' counts.
An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression
Weiss, Brandi A.; Dardick, William
2016-01-01
This article introduces an entropy-based measure of data-model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify…
Introducing E-tec: Ensemble-based Topological Entropy Calculation
Roberts, Eric; Smith, Spencer; Sindi, Suzanne; Smith, Kevin
2017-11-01
Topological entropy is a measurement of orbit complexity in a dynamical system that can be estimated in 2D by embedding an initial material curve L0 in the fluid and estimating its growth under the evolution of the flow. This growth is given by L(t) = |L0| e^(ht), where L(t) is the length of the curve as a function of t and h is the topological entropy. In order to develop a method for computing this growth rate that will efficiently scale up in both system size and modeling time, one must be clever about extracting the maximum information from the limited trajectories available. The relative motion of trajectories through phase space encodes global information that is not contained in any individual trajectory. That is, extra information is ''hiding'' in an ensemble of classical trajectories, which is not exploited in a trajectory-by-trajectory approach. Using tools from computational geometry, we introduce a new algorithm designed to take advantage of such additional information that requires only potentially sparse sets of particle trajectories as input and no reliance on any detailed knowledge of the velocity field: the Ensemble-Based Topological Entropy Calculation, or E-tec.
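Given the growth law above, h can be recovered from measured curve lengths by a least-squares fit of ln L(t) against t. A self-check on synthetic lengths generated with a chosen h (the data below are fabricated for the check, not E-tec output):

```python
import math

def estimate_topological_entropy(times, lengths):
    """Least-squares slope of ln L(t) vs t, since L(t) = |L0| * exp(h t)."""
    logs = [math.log(L) for L in lengths]
    n = len(times)
    t_mean = sum(times) / n
    l_mean = sum(logs) / n
    num = sum((t - t_mean) * (l - l_mean) for t, l in zip(times, logs))
    den = sum((t - t_mean) ** 2 for t in times)
    return num / den

# Synthetic check: a curve of initial length 1 stretched with h = 0.7.
h_true = 0.7
times = [0.5 * k for k in range(10)]
lengths = [1.0 * math.exp(h_true * t) for t in times]
print(estimate_topological_entropy(times, lengths))  # ≈ 0.7
```

E-tec's contribution is producing reliable L(t) values from sparse trajectory ensembles; the fit itself is this simple.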
Comparison Between Bayesian and Maximum Entropy Analyses of Flow Networks†
Directory of Open Access Journals (Sweden)
Steven H. Waldrip
2017-02-01
Full Text Available We compare the application of Bayesian inference and the maximum entropy (MaxEnt) method for the analysis of flow networks, such as water, electrical and transport networks. The two methods have the advantage of allowing a probabilistic prediction of flow rates and other variables, when there is insufficient information to obtain a deterministic solution, and also allow the effects of uncertainty to be included. Both methods of inference update a prior to a posterior probability density function (pdf) by the inclusion of new information, in the form of data or constraints. The MaxEnt method maximises an entropy function subject to constraints, using the method of Lagrange multipliers, to give the posterior, while the Bayesian method finds its posterior by multiplying the prior with likelihood functions incorporating the measured data. In this study, we examine MaxEnt using soft constraints, either included in the prior or as probabilistic constraints, in addition to standard moment constraints. We show that when the prior is Gaussian, both Bayesian inference and the MaxEnt method with soft prior constraints give the same posterior means, but their covariances are different. In the Bayesian method, the interactions between variables are applied through the likelihood function, using second or higher-order cross-terms within the posterior pdf. In contrast, the MaxEnt method incorporates interactions between variables using Lagrange multipliers, avoiding second-order correlation terms in the posterior covariance. The MaxEnt method with soft prior constraints, therefore, has a numerical advantage over Bayesian inference, in that the covariance terms are avoided in its integrations. The second MaxEnt method with soft probabilistic constraints is shown to give posterior means of similar, but not identical, structure to the other two methods, due to its different formulation.
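A toy version of the MaxEnt machinery with a single moment constraint: maximize entropy over a finite support subject to a prescribed mean, with the Lagrange multiplier found by bisection. This is Jaynes' classic loaded-die example, not one of the flow-network formulations compared in the paper:

```python
import math

def maxent_mean_constrained(support, target_mean, lo=-50.0, hi=50.0):
    """Maximum-entropy pmf p_i proportional to exp(-lam * x_i) on a finite
    support, with the Lagrange multiplier lam chosen so E[X] = target_mean."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in support]
        z = sum(w)
        return sum(x * wi for x, wi in zip(support, w)) / z

    for _ in range(200):            # bisection on the multiplier
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid                # mean decreases as lam increases
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

# A die whose long-run average roll is constrained to 4.5.
p = maxent_mean_constrained([1, 2, 3, 4, 5, 6], 4.5)
print([round(pi, 4) for pi in p])                     # tilted toward high faces
print(sum(x * pi for x, pi in zip(range(1, 7), p)))   # ≈ 4.5
```

The exponential-family form of the solution is exactly what the Lagrange-multiplier step in the paper's MaxEnt method produces, scaled up to network constraints.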
Permutation entropy analysis based on Gini-Simpson index for financial time series
Jiang, Jun; Shang, Pengjian; Zhang, Zuoquan; Li, Xuemei
2017-11-01
In this paper, a new coefficient is proposed with the objective of quantifying the level of complexity of financial time series. To study complexity measures from the viewpoint of entropy, we propose a new permutation entropy based on the Gini-Simpson index (GPE). The logistic map is used to generate simulated time series that demonstrate the accuracy of the GPE method, and the results on these simulated series show the strong robustness of GPE. We also compare the effect of different orders of GPE, and then apply it to US, European and Chinese stock markets in order to reveal the inner mechanism hidden in the original financial time series. From a comparison of the results for these stock indexes, it can be concluded that the relationships among different stock markets are evident. Studying the complexity features and properties of financial time series can provide valuable information for understanding the inner mechanism of financial markets.
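The Gini-Simpson index of the ordinal-pattern distribution gives a quadratic stand-in for permutation entropy. The sketch below captures that general flavor; the paper's exact GPE definition may differ, and the toy series are illustrative:

```python
import math
import random
from collections import Counter

def ordinal_patterns(series, order):
    """Map each length-`order` window to its rank (ordinal) pattern."""
    pats = []
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        pats.append(tuple(sorted(range(order), key=lambda k: window[k])))
    return pats

def gini_simpson_permutation(series, order=3):
    """Gini-Simpson index 1 - sum(p^2) of the ordinal-pattern distribution:
    0 for a single repeated pattern, at most 1 - 1/order! when uniform."""
    pats = ordinal_patterns(series, order)
    n = len(pats)
    counts = Counter(pats)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

random.seed(1)
noisy = [random.random() for _ in range(2000)]   # irregular series
trend = list(range(2000))                        # fully predictable series
print(gini_simpson_permutation(noisy))           # near the 0.833 maximum
print(gini_simpson_permutation(trend))           # exactly 0.0
```

Unlike Shannon permutation entropy, the quadratic index needs no logarithms and is bounded without zero-probability corrections, which is one motivation for Gini-Simpson-type measures.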
2D Tsallis Entropy for Image Segmentation Based on Modified Chaotic Bat Algorithm
Directory of Open Access Journals (Sweden)
Zhiwei Ye
2018-03-01
Full Text Available Image segmentation is a significant step in image analysis and computer vision. Many entropy-based approaches have been presented on this topic; among them, Tsallis entropy is one of the best performing methods. However, 1D Tsallis entropy does not make use of the spatial correlation information within the neighborhood, so results might be ruined by noise. Therefore, 2D Tsallis entropy is proposed to solve the problem, and results are compared with 1D Fisher, 1D maximum entropy, 1D cross entropy, 1D Tsallis entropy, fuzzy entropy, 2D Fisher, 2D maximum entropy and 2D cross entropy. On the other hand, due to the huge computational costs, meta-heuristic algorithms like the genetic algorithm (GA), particle swarm optimization (PSO), the ant colony optimization algorithm (ACO) and the differential evolution algorithm (DE) are used to accelerate the 2D Tsallis entropy thresholding method. In this paper, treating 2D Tsallis entropy thresholding as a constrained optimization problem, the optimal thresholds are acquired by maximizing the objective function using a modified chaotic bat algorithm (MCBA). The proposed algorithm has been tested on some actual and infrared images. The results are compared with those of PSO, GA, ACO and DE and demonstrate that the proposed method outperforms the other approaches involved in the paper, making it a feasible and effective option for image segmentation.
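The 1D flavor of Tsallis thresholding can be shown on a toy histogram; here an exhaustive search over thresholds stands in for the paper's bat-algorithm optimizer, and the entropic index q = 0.8 is an arbitrary choice:

```python
def tsallis_threshold(histogram, q=0.8):
    """1D Tsallis entropy thresholding: pick t maximizing the sum of the
    background and foreground Tsallis entropies S_q = (1 - sum p^q) / (q - 1).
    (Exhaustive search replaces the paper's modified chaotic bat algorithm.)"""
    total = sum(histogram)
    probs = [h / total for h in histogram]

    def s_q(ps):
        w = sum(ps)
        if w == 0:
            return 0.0
        return (1.0 - sum((p / w) ** q for p in ps if p > 0)) / (q - 1.0)

    best_t, best_val = 0, float('-inf')
    for t in range(1, len(probs)):
        val = s_q(probs[:t]) + s_q(probs[t:])
        if val > best_val:
            best_t, best_val = t, val
    return best_t

# Bimodal toy histogram: dark peak near bin 3, bright peak near bin 12.
hist = [0, 2, 9, 14, 8, 2, 1, 0, 0, 1, 3, 10, 15, 9, 2, 0]
print(tsallis_threshold(hist))  # a threshold index separating the two modes
```

For a real 256-bin histogram this loop is already cheap; the metaheuristics in the paper matter for the 2D formulation, where the search space is a joint gray-level/neighborhood-average plane.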
Optimal Entropy-Based Cooperative Spectrum Sensing for Maritime Cognitive Radio Networks
Directory of Open Access Journals (Sweden)
Waleed Ejaz
2013-11-01
Full Text Available Maritime cognitive radio networks (MCRNs) have recently been proposed for opportunistic utilization of the licensed band. Spectrum sensing is one of the key issues for the successful deployment of MCRNs. The maritime environment is unique in terms of radio wave propagation over water, surface reflection and wave occlusions. In order to deal with the challenging maritime environment, we propose an optimal entropy-based cooperative spectrum sensing scheme. As the results of spectrum sensing are sensitive to the number of samples in an entropy-based local detection scheme, we first calculate the optimal number of samples. Next, a cooperative spectrum sensing scheme considering the conditions of the sea environment is proposed. Finally, the throughput optimization of the m-out-of-n rule is considered. Results revealed that although the existing schemes work well for the lower sea states, they fail to perform at higher sea states. Moreover, simulation results also indicated the robustness of the entropy-based scheme and the proposed cooperative spectrum sensing scheme at higher sea states in comparison with the traditional energy detector.
Entropy based classifier for cross-domain opinion mining
Directory of Open Access Journals (Sweden)
Jyoti S. Deshmukh
2018-01-01
Full Text Available In recent years, the growth of social networks has increased people's interest in analyzing reviews and opinions about products before they buy them. Consequently, this has given rise to domain adaptation as a prominent area of research in sentiment analysis. A classifier trained on one domain often gives poor results on data from another domain, as the expression of sentiment differs in every domain, and labeling each domain separately is very expensive as well as time consuming. Therefore, this study proposes an approach that extracts and classifies opinion words from one domain, called the source domain, and predicts opinion words of another domain, called the target domain, using a semi-supervised approach that combines modified maximum entropy and bipartite graph clustering. A comparison of opinion classification on reviews from four different product domains is presented. The results demonstrate that the proposed method performs relatively well in comparison to the other methods. Comparison with SentiWordNet on domain-specific and domain-independent words reveals that on average 72.6% and 88.4% of words, respectively, are correctly classified.
Directory of Open Access Journals (Sweden)
Muhammad Shoaib
2016-10-01
Full Text Available Proper knowledge of the wind characteristics of a site is of fundamental importance in estimating the wind energy output from a selected wind turbine. The present paper focuses on assessing the suitability and accuracy of the fitted distribution function for the measured wind speed data at the Baburband site in Sindh, Pakistan. A comparison is made between the wind power densities obtained using fitted functions based on the Maximum Entropy Principle (MEP) and the Weibull distribution. In the case of the MEP-based function, a system of (N+1) non-linear equations containing (N+1) Lagrange multipliers is defined as the probability density function. The maximum entropy probability density function is calculated for 3–9 low-order moments obtained from the measured wind speed data. The annual actual wind power density (PA) is found to be 309.25 W/m2 while the Weibull-based wind power density (PW) is 297.25 W/m2. The MEP-based density for orders 5, 7, 8 and 9 (PE) is 309.21 W/m2, whereas for order 6 it is 309.43 W/m2. To validate the MEP-based function, the results are compared with the Weibull function and the measured data. A Kolmogorov–Smirnov test is performed between the cdf of the measured wind data and the fitted distribution function (Q95 = 0.01457 > Q = 10−4). The test confirms the suitability of the MEP-based function for modeling measured wind speed data and for the estimation of wind energy output from a wind turbine. An R2 test is also performed, showing analogous behavior of the fitted MEP-based pdf to the actual wind speed data (R2 ~ 0.9). The annual energy extracted using the chosen wind turbine based on the Weibull function is PW = 2.54 GWh and that obtained using the MEP-based function is PE = 2.57–2.67 GWh, depending on the order of moments.
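The Weibull-based power density used above follows from E[v^3] = c^3 * Gamma(1 + 3/k), and it can be checked against the direct estimate 0.5 * rho * mean(v^3). The shape and scale parameters below are hypothetical, not the fitted Baburband values:

```python
import math
import random

RHO = 1.225  # air density, kg/m^3

def wind_power_density(speeds):
    """Actual power density 0.5 * rho * mean(v^3), in W/m^2."""
    return 0.5 * RHO * sum(v ** 3 for v in speeds) / len(speeds)

def weibull_power_density(k, c):
    """Weibull-based power density, using E[v^3] = c^3 * Gamma(1 + 3/k)."""
    return 0.5 * RHO * c ** 3 * math.gamma(1 + 3 / k)

# Hypothetical fitted Weibull parameters for a windy coastal site.
k, c = 2.0, 8.5
print(weibull_power_density(k, c))

# Compare against a synthetic speed record drawn from that same Weibull
# via its inverse CDF, v = c * (-ln(1 - u))^(1/k).
random.seed(42)
speeds = [c * (-math.log(1 - random.random())) ** (1 / k) for _ in range(100000)]
print(wind_power_density(speeds))  # close to the analytic value
```

The MEP-based density in the paper replaces the Weibull pdf with an exponential of a polynomial in v whose Lagrange multipliers are fitted to the sample moments; the power-density integral is then evaluated numerically instead of in closed form.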
A rumor spreading model based on information entropy.
Wang, Chao; Tan, Zong Xuan; Ye, Ye; Wang, Lu; Cheong, Kang Hao; Xie, Neng-Gang
2017-08-29
Rumor spreading can have a significant impact on people's lives, distorting scientific facts and influencing political opinions. With technologies that have democratized the production and reproduction of information, the rate at which misinformation can spread has increased significantly, leading many to describe contemporary times as a 'post-truth era'. Research into rumor spreading has primarily been based either on models of social and biological contagion or on models of opinion dynamics. Here we present a comprehensive model that is based on information entropy, which allows for the incorporation of considerations like the role of memory, conformity effects, differences in the subjective propensity to produce distortions, and variations in the degree of trust that people place in each other. Variations in the degree of trust are controlled by a confidence factor β, while the propensity to produce distortions is controlled by a conservation factor K. Simulations were performed using a Barabási-Albert (BA) scale-free network seeded with a single piece of information. The influence of β and K upon the temporal evolution of the system was subsequently analyzed in terms of average information entropy, opinion fragmentation, and the range of rumor spread. These results can aid in decision-making to limit the spread of rumors.
Shape modeling and analysis with entropy-based particle systems.
Cates, Joshua; Fletcher, P Thomas; Styner, Martin; Shenton, Martha; Whitaker, Ross
2007-01-01
This paper presents a new method for constructing compact statistical point-based models of ensembles of similar shapes that does not rely on any specific surface parameterization. The method requires very little preprocessing or parameter tuning, and is applicable to a wider range of problems than existing methods, including nonmanifold surfaces and objects of arbitrary topology. The proposed method is to construct a point-based sampling of the shape ensemble that simultaneously maximizes both the geometric accuracy and the statistical simplicity of the model. Surface point samples, which also define the shape-to-shape correspondences, are modeled as sets of dynamic particles that are constrained to lie on a set of implicit surfaces. Sample positions are optimized by gradient descent on an energy function that balances the negative entropy of the distribution on each shape with the positive entropy of the ensemble of shapes. We also extend the method with a curvature-adaptive sampling strategy in order to better approximate the geometry of the objects. This paper presents the formulation; several synthetic examples in two and three dimensions; and an application to the statistical shape analysis of the caudate and hippocampus brain structures from two clinical studies.
Multiscale Permutation Entropy Based Rolling Bearing Fault Diagnosis
Directory of Open Access Journals (Sweden)
Jinde Zheng
2014-01-01
Full Text Available A new rolling bearing fault diagnosis approach based on multiscale permutation entropy (MPE), the Laplacian score (LS), and support vector machines (SVMs) is proposed in this paper. Permutation entropy (PE) was recently proposed and defined to measure the randomness and detect dynamical changes of time series. However, owing to the complexity of mechanical systems, the randomness and dynamic changes of the vibration signal will exist at different scales. Thus, the definition of MPE is introduced and employed to extract the nonlinear fault characteristics from the bearing vibration signal at different scales. Besides, the SVM is utilized to accomplish the fault feature classification to fulfill the diagnostic procedure automatically. Meanwhile, in order to avoid a high dimension of features, the Laplacian score (LS) is used to refine the feature vector by ranking the features according to their importance and correlations with the main fault information. Finally, the rolling bearing fault diagnosis method based on MPE, LS, and SVM is proposed and applied to experimental data. The analysis results indicate that the proposed method can identify the fault categories effectively.
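The MPE feature-extraction step itself is compact to state: coarse-grain the signal at each scale and take the normalized permutation entropy of the ordinal patterns. This sketch covers only that step, not the Laplacian-score selection or SVM classification, and the synthetic "vibration" signal is illustrative:

```python
import math
import random
from collections import Counter

def permutation_entropy(series, order=3):
    """Shannon permutation entropy of ordinal patterns, normalized to [0, 1]."""
    pats = Counter(
        tuple(sorted(range(order), key=lambda k: series[i + k]))
        for i in range(len(series) - order + 1)
    )
    n = sum(pats.values())
    h = -sum(c / n * math.log(c / n) for c in pats.values())
    return h / math.log(math.factorial(order))

def multiscale_permutation_entropy(series, max_scale=4, order=3):
    """One PE value per scale: PE of each coarse-grained (window-averaged) series."""
    out = []
    for s in range(1, max_scale + 1):
        cg = [sum(series[i * s:(i + 1) * s]) / s for i in range(len(series) // s)]
        out.append(permutation_entropy(cg, order))
    return out

random.seed(7)
vibration = [math.sin(0.3 * t) + 0.5 * random.gauss(0, 1) for t in range(3000)]
print(multiscale_permutation_entropy(vibration))  # feature vector, one entry per scale
```

The resulting vector is what the LS step would rank and the SVM would consume, one such vector per vibration record.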
Humeau-Heurtier, Anne; Mahé, Guillaume; Abraham, Pierre
2015-12-01
Laser speckle contrast imaging (LSCI) enables a noninvasive monitoring of microvascular perfusion. Some studies have proposed to extract information from LSCI data through their multiscale entropy (MSE). However, for reaching a large range of scales, the original MSE algorithm may require long recordings for reliability. Recently, a novel approach to compute MSE with shorter data sets has been proposed: the short-time MSE (sMSE). Our goal is to apply, for the first time, the sMSE algorithm to LSCI data and to compare results with those given by the original MSE. Moreover, we apply the original MSE algorithm to data of different lengths and compare results with those given by longer recordings. For this purpose, synthetic signals and 192 LSCI regions of interest (ROIs) of different sizes are processed. Our results show that the sMSE algorithm is valid to compute the MSE of LSCI data. Moreover, with time series shorter than those initially proposed, the sMSE and original MSE algorithms give results with no statistical difference from those of the original MSE algorithm with longer data sets. The minimal acceptable length depends on the ROI size. Comparisons of MSE from healthy and pathological subjects can be performed with shorter data sets than those proposed until now.
Cao, Yuzhen; Cai, Lihui; Wang, Jiang; Wang, Ruofan; Yu, Haitao; Cao, Yibin; Liu, Jing
2015-08-01
In this paper, experimental neurophysiologic recording and statistical analysis are combined to investigate the nonlinear characteristics and the cognitive function of the brain. Fuzzy approximate entropy and fuzzy sample entropy are applied to characterize model-based simulated series and electroencephalograph (EEG) series of Alzheimer's disease (AD). The effectiveness and advantages of these two kinds of fuzzy entropy are first verified through simulated EEG series generated by the alpha rhythm model, including stronger relative consistency and robustness. Furthermore, in order to detect the abnormality of irregularity and chaotic behavior in the AD brain, the complexity features based on these two fuzzy entropies are extracted in the delta, theta, alpha, and beta bands. It is demonstrated that, due to the introduction of fuzzy set theory, the fuzzy entropies can better distinguish EEG signals of AD patients from those of normal subjects than the approximate entropy and sample entropy. Moreover, the entropy values of AD are significantly decreased in the alpha band, particularly in the temporal brain region, at electrodes such as T3 and T4. In addition, fuzzy sample entropy could achieve higher group differences in different brain regions and a higher average classification accuracy of 88.1% with a support vector machine classifier. The obtained results prove that fuzzy sample entropy may be a powerful tool to characterize the complexity abnormalities of AD, which could be helpful in further understanding of the disease.
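Fuzzy sample entropy differs from ordinary sample entropy mainly in replacing the hard tolerance test with a smooth membership function and removing each template's local baseline. A minimal sketch under that reading; the parameter choices and toy series are illustrative, not those of the study:

```python
import math
import random

def fuzzy_sample_entropy(series, m=2, r=0.2, p=2):
    """Fuzzy SampEn: SampEn's hard tolerance is replaced by a smooth
    membership exp(-(d/tol)^p), computed on mean-subtracted templates."""
    n = len(series)
    mean = sum(series) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    tol = r * sd

    def phi(mm):
        tmpl = []
        for i in range(n - m):              # same template count for both lengths
            seg = series[i:i + mm]
            mu = sum(seg) / mm
            tmpl.append([x - mu for x in seg])   # local baseline removed
        sims = []
        for i in range(len(tmpl)):
            for j in range(i + 1, len(tmpl)):
                d = max(abs(a - b) for a, b in zip(tmpl[i], tmpl[j]))
                sims.append(math.exp(-((d / tol) ** p)))
        return sum(sims) / len(sims)

    return -math.log(phi(m + 1) / phi(m))

random.seed(3)
regular = [math.sin(0.5 * t) for t in range(300)]       # highly ordered
irregular = [random.gauss(0, 1) for _ in range(300)]    # disordered
print(fuzzy_sample_entropy(regular), fuzzy_sample_entropy(irregular))
```

Because the membership decays continuously instead of switching at the tolerance, the statistic varies smoothly with its parameters, which is the "relative consistency" advantage the abstract reports.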
Directory of Open Access Journals (Sweden)
Suparna Bharadwaj
2016-01-01
Full Text Available Introduction: Depth of anaesthesia (DOA) monitors are shown to reduce the intra-operative dose of anaesthetic agents, provide haemodynamic stability and shorten emergence times. Electroencephalography (EEG)-based DOA monitors such as the bispectral index (BIS) and entropy have been calibrated and validated in healthy subjects. Hence the clinical effectiveness of these monitors may be affected when monitoring patients with neurological disorders (e.g., epilepsy, dystonia, dementia and Parkinson's disease). The aim of this study was to determine whether BIS and entropy correlate with each other and with clinical indices of DOA in patients with movement disorders under general anaesthesia (GA). Materials and Methods: We conducted a prospective, observational study in patients with movement disorders undergoing internalization of deep brain stimulators. All patients received standard GA with an age-adjusted mean alveolar concentration (aaMAC) of an inhalational agent between 0.7 and 1.1. BIS and entropy sensors were applied on the patient's left forehead. Data collected included clinical parameters and EEG-based DOA indices. Correlation analysis was performed between entropy, BIS and the clinical indices of DOA. Bland-Altman analysis was performed to determine the agreement between BIS and entropy. Results: Thirty patients were studied (mean age 58.4 ± 11 years, male:female 18:12, weight 79.2 ± 17 kg). Indications for deep brain stimulation were Parkinson's disease (n = 25), essential tremors (n = 2) and dystonia (n = 3). There was a very strong positive correlation between BIS and response entropy (RE) (r = 0.932) and between BIS and state entropy (SE) (r = 0.950), and a strong negative correlation among aaMAC and BIS, RE and SE, with r values of −0.686, −0.788 and −0.732, respectively. However, there was no correlation between BIS, RE, SE and haemodynamic values. Conclusion: Our study showed that BIS and entropy perform well in patients with movement disorders.
A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring.
Su, Cui; Liang, Zhenhu; Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro
2016-01-01
Multiscale permutation entropy (MSPE) has become an interesting tool to explore neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characters of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Six MSPE algorithms were studied, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of tracking the dynamical changes in EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of EEG recordings and statistically distinguish the awake state, unconsciousness and the recovery of consciousness (RoC) state. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed other measures with a faster tracking speed of the loss of consciousness. MA-based multiscale permutation entropies have the potential for on-line anesthesia EEG analysis with their simple computation and sensitivity to drug effect changes.
Giauque, Alexis; Birbaud, Anne Laure; Pitsch, Heinz
2007-11-01
Combustion noise is one of the main noise sources of aircraft auxiliary power unit gas turbine engines, and with the better understanding of jet and fan noise, combustion-generated noise is also becoming increasingly important for the main aircraft engines. Combustion-generated noise comes from two major sources: direct noise, produced by the unsteady heat release, and indirect noise, produced by entropy fluctuations passing through pressure gradients in the turbine. Two approaches are considered to investigate the fundamental aspects of indirect noise. For the first, high-order fully compressible simulations of a modulated entropy wave passing through a converging-diverging nozzle are performed. The second approach uses a flow solver based on Goldstein's analogy to propagate the acoustic and entropy waves using the perturbed velocity, pressure, and entropy components coming from low-Mach-number or fully compressible calculations. The capability of these two methods to predict the acoustics produced by the modulated entropy wave is finally discussed. Numerical results are compared to the experiment performed by Bake et al. (GT2005-69029, ASME Turbo Expo 2005).
Some Comments on the Entropy-Based Criteria for Piping
Directory of Open Access Journals (Sweden)
Emöke Imre
2015-04-01
Full Text Available This paper is an extension of previous work which characterises soil behaviours using the grading entropy diagram. The present work looks at the piping process in granular soils, by considering some new data from flood-protection dikes. The piping process is divided into three parts here: particle movement at the micro scale to segregate free water; sand boil development (which is the initiation of the pipe); and pipe growth. In the first part of the process, which occurs during the rising flood, the increase in shear stress along the dike base may cause segregation of water into micro pipes if the subsoil in the dike base is relatively loose. This occurs in the zone of maximum dike base shear stress level (the ratio of shear stress to strength), which is close to the toe. In the second part of the process, the shear strain increment causes a sudden, asymmetric slide and cracking of the dike, leading to localized excess pore pressure, liquefaction and the formation of a sand boil. In the third part of the process, the soil erosion initiated through the sand boil continues, and the pipe grows. Piping in the Hungarian dikes often occurs in a two-layer system, where the base layer is coarser with higher permeability and the cover layer is finer with lower permeability. The new data presented here show that the soils ejected from the sand boils are generally silty sands and sands, which are prone to both erosion (on the basis of the entropy criterion) and liquefaction. They originate from the cover layer, which is basically identical to the soil used in the Dutch backward erosion experiments.
IN-cross Entropy Based MAGDM Strategy under Interval Neutrosophic Set Environment
Directory of Open Access Journals (Sweden)
Shyamal Dalapati
2017-12-01
Full Text Available A cross entropy measure is one of the best ways to calculate the divergence of any variable from a prior variable. We define a new cross entropy measure under the interval neutrosophic set (INS) environment, which we call the IN-cross entropy measure, and prove its basic properties. We also develop a weighted IN-cross entropy measure and investigate its basic properties. Based on the weighted IN-cross entropy measure, we develop a novel strategy for multi-attribute group decision making (MAGDM) under an interval neutrosophic environment. The proposed multi-attribute group decision making strategy is compared with the existing cross entropy measure based strategy in the literature under the interval neutrosophic set environment. Finally, an illustrative example of a multi-attribute group decision making problem is solved to show the feasibility, validity and efficiency of the proposed MAGDM strategy.
LIBOR troubles: Anomalous movements detection based on maximum entropy
Bariviera, Aurelio F.; Martín, María T.; Plastino, Angelo; Vampa, Victoria
2016-05-01
According to the definition of the London Interbank Offered Rate (LIBOR), contributing banks should give fair estimates of their own borrowing costs in the interbank market. Between 2007 and 2009, several banks made inappropriate submissions of LIBOR, sometimes motivated by profit-seeking from their trading positions. In 2012, several newspaper articles began to cast doubt on LIBOR integrity, leading surveillance authorities to conduct investigations into banks' behavior. These procedures resulted in severe fines imposed on the banks involved, which acknowledged their inappropriate financial conduct. In this paper, we uncover such unfair behavior by using a forecasting method based on the Maximum Entropy principle. Our results are robust against changes in parameter settings and could be of great help for market surveillance.
Entropy-Based Block Processing for Satellite Image Registration
Directory of Open Access Journals (Sweden)
Ikhyun Lee
2012-11-01
Full Text Available Image registration is an important task in many computer vision applications such as fusion systems, 3D shape recovery and earth observation. In particular, registering satellite images is challenging and time-consuming due to limited resources and large image sizes. In such scenarios, state-of-the-art image registration methods such as the scale-invariant feature transform (SIFT) may not be suitable due to their high processing time. In this paper, we propose an algorithm based on block processing via entropy to register satellite images. The performance of the proposed method is evaluated using different real images. The comparative analysis shows that it not only reduces the processing time but also enhances the accuracy.
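The core idea of entropy-based block selection can be sketched as follows. This is a minimal illustration, not the authors' implementation; `block_entropy`, `select_blocks` and their parameters are names invented here. Blocks are ranked by the Shannon entropy of their gray-level histograms, so that only the most informative blocks enter the (expensive) matching stage:

```python
import math

def block_entropy(block):
    """Shannon entropy (bits) of an image block, from its gray-level histogram."""
    hist = {}
    for v in block:
        hist[v] = hist.get(v, 0) + 1
    n = len(block)
    return -sum((c / n) * math.log2(c / n) for c in hist.values())

def select_blocks(image, rows, cols, bs, top_k):
    """Split a row-major image into bs x bs blocks and keep the top_k
    highest-entropy (most informative) blocks as registration candidates."""
    scored = []
    for r in range(0, rows - bs + 1, bs):
        for c in range(0, cols - bs + 1, bs):
            block = [image[(r + i) * cols + (c + j)]
                     for i in range(bs) for j in range(bs)]
            scored.append(((r, c), block_entropy(block)))
    scored.sort(key=lambda t: t[1], reverse=True)
    return scored[:top_k]
```

A flat (constant-intensity) block scores zero entropy and is discarded early, which is where the processing-time saving comes from.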
Multi-Level Wavelet Shannon Entropy-Based Method for Single-Sensor Fault Location
Directory of Open Access Journals (Sweden)
Qiaoning Yang
2015-10-01
Full Text Available In actual applications, sensors are prone to failure because of harsh environments, battery drain, and sensor aging. Sensor fault location is an important step for follow-up sensor fault detection. In this paper, two new multi-level wavelet Shannon entropies (multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy) are defined. They take full advantage of the sensor fault frequency distribution and energy distribution across multiple subbands in the wavelet domain. Based on the multi-level wavelet Shannon entropy, a method is proposed for single-sensor fault location. The method first uses the criterion of maximum energy-to-Shannon entropy ratio to select the appropriate wavelet base for signal analysis. Then multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy are used to locate the fault. The method is validated using practical chemical gas concentration data from a gas sensor array. Compared with wavelet time Shannon entropy and wavelet energy Shannon entropy, the experimental results demonstrate that the proposed method can achieve accurate location of a single sensor fault and has good anti-noise ability. The proposed method is feasible and effective for single-sensor fault location.
Coherence and entanglement measures based on Rényi relative entropies
International Nuclear Information System (INIS)
Zhu, Huangjun; Hayashi, Masahito; Chen, Lin
2017-01-01
We study systematically resource measures of coherence and entanglement based on Rényi relative entropies, which include the logarithmic robustness of coherence, geometric coherence, and conventional relative entropy of coherence together with their entanglement analogues. First, we show that each Rényi relative entropy of coherence is equal to the corresponding Rényi relative entropy of entanglement for any maximally correlated state. By virtue of this observation, we establish a simple operational connection between entanglement measures and coherence measures based on Rényi relative entropies. We then prove that all these coherence measures, including the logarithmic robustness of coherence, are additive. Accordingly, all these entanglement measures are additive for maximally correlated states. In addition, we derive analytical formulas for Rényi relative entropies of entanglement of maximally correlated states and bipartite pure states, which reproduce a number of classic results on the relative entropy of entanglement and logarithmic robustness of entanglement in a unified framework. Several nontrivial bounds for Rényi relative entropies of coherence (entanglement) are further derived, which improve over results known previously. Moreover, we determine all states whose relative entropy of coherence is equal to the logarithmic robustness of coherence. As an application, we provide an upper bound for the exact coherence distillation rate, which is saturated for pure states. (paper)
A comparison of EEG spectral entropy with conventional quantitative ...
African Journals Online (AJOL)
Adele
[Only fragments of this record were extracted: Table I, the Modified Observer's Assessment of Alertness/Sedation Scale (MOAAS), with response descriptions and scores (e.g. "Responds readily to name spoken in normal tone" = 5), and a passage on conceptual memory and quantitative electroencephalographic measures during recovery from sevoflurane- and remifentanil-based anaesthesia.]
Upper entropy axioms and lower entropy axioms
International Nuclear Information System (INIS)
Guo, Jin-Li; Suo, Qi
2015-01-01
The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely, upper entropy axioms, inspired by the axioms of metric spaces, and also formulate lower entropy axioms. We also develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of the Shannon–Khinchin axioms and the Tsallis axioms, while being stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case satisfying our axioms. Moreover, different forms of information measures, such as Shannon entropy, Daroczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics.
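As a concrete illustration of how one parametric family unifies several of the measures named above (a generic sketch, not tied to the paper's specific axioms), the Tsallis entropy S_q reduces to the Boltzmann–Shannon entropy in the limit q → 1:

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1).
    In the limit q -> 1 it recovers the Boltzmann-Shannon entropy
    S = -sum_i p_i * ln(p_i), so Shannon entropy is one member of the family."""
    if abs(q - 1.0) < 1e-12:
        # take the q -> 1 limit explicitly to avoid 0/0
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)
```

For a fair coin, S_2 = 0.5 while S_1 = ln 2, and evaluating near q = 1 converges to the Shannon value.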
Fundamental limits on quantum dynamics based on entropy change
Das, Siddhartha; Khatri, Sumeet; Siopsis, George; Wilde, Mark M.
2018-01-01
It is well known in the realm of quantum mechanics and information theory that the entropy is non-decreasing for the class of unital physical processes. However, in general, the entropy does not exhibit monotonic behavior. This has restricted the use of entropy change in characterizing evolution processes. Recently, a lower bound on the entropy change was provided in the work of Buscemi, Das, and Wilde [Phys. Rev. A 93(6), 062314 (2016)]. We explore the limit that this bound places on the physical evolution of a quantum system and discuss how these limits can be used as witnesses to characterize quantum dynamics. In particular, we derive a lower limit on the rate of entropy change for memoryless quantum dynamics, and we argue that it provides a witness of non-unitality. This limit on the rate of entropy change leads to definitions of several witnesses for testing memory effects in quantum dynamics. Furthermore, from the aforementioned lower bound on entropy change, we obtain a measure of non-unitarity for unital evolutions.
Towards an entropy-based analysis of log variability
DEFF Research Database (Denmark)
Back, Christoffer Olling; Debois, Søren; Slaats, Tijs
2017-01-01
…the development of hybrid miners: given a (sub-)log, can we determine a priori whether the log is best suited for imperative or declarative mining? We propose using the concept of entropy, commonly used in information theory. We consider different measures for entropy that could be applied and show through experimentation on both synthetic and real-life logs that these entropy measures do indeed give insights into the complexity of the log and can act as an indicator of which mining paradigm should be used.
Towards an Entropy-based Analysis of Log Variability
DEFF Research Database (Denmark)
Back, Christoffer Olling; Debois, Søren; Slaats, Tijs
2018-01-01
…the development of hybrid miners: given a log, can we determine a priori whether the log is best suited for imperative or declarative mining? We propose using the concept of entropy, commonly used in information theory. We consider different measures for entropy that could be applied and show through experimentation on both synthetic and real-life logs that these entropy measures do indeed give insights into the complexity of the log and can act as an indicator of which mining paradigm should be used.
Properties of Risk Measures of Generalized Entropy in Portfolio Selection
Directory of Open Access Journals (Sweden)
Rongxi Zhou
2017-12-01
Full Text Available This paper systematically investigates the properties of six kinds of entropy-based risk measures: Information Entropy and Cumulative Residual Entropy in the probability space, Fuzzy Entropy, Credibility Entropy and Sine Entropy in the fuzzy space, and Hybrid Entropy in the hybridized uncertainty of both fuzziness and randomness. We find that none of the risk measures satisfy all six of the following properties, which various scholars have associated with effective risk measures: Monotonicity, Translation Invariance, Sub-additivity, Positive Homogeneity, Consistency and Convexity. The measures based on Fuzzy Entropy, Credibility Entropy, and Sine Entropy all exhibit the same properties: Sub-additivity, Positive Homogeneity, Consistency, and Convexity. The measures based on Information Entropy and Hybrid Entropy, meanwhile, exhibit only Sub-additivity and Consistency. Cumulative Residual Entropy satisfies just Sub-additivity, Positive Homogeneity, and Convexity. After identifying these properties, we develop seven portfolio models based on the different risk measures and make empirical comparisons using samples from both the Shenzhen Stock Exchange of China and the New York Stock Exchange of America. The comparisons show that the Mean Fuzzy Entropy Model performs best among the seven models with respect to both daily returns and relative cumulative returns. Overall, these results could provide an important reference for both constructing effective risk measures and rationally selecting the appropriate risk measure under different portfolio selection conditions.
Calculating the Entropy of Solid and Liquid Metals, Based on Acoustic Data
Tekuchev, V. V.; Kalinkin, D. P.; Ivanova, I. V.
2018-05-01
The entropies of iron, cobalt, rhodium, and platinum are studied for the first time, based on acoustic data and using the Debye theory and rigid-sphere model, from 298 K up to the boiling point. A formula for the melting entropy of metals is validated. Good agreement between the research results and the literature data is obtained.
Sample Entropy-Based Approach to Evaluate the Stability of Double-Wire Pulsed MIG Welding
Directory of Open Access Journals (Sweden)
Ping Yao
2014-01-01
Full Text Available Based on sample entropy, this paper presents a quantitative method to evaluate current stability in double-wire pulsed MIG welding. First, the sample entropy of current signals with different stability but the same parameters is calculated. The results show that the more stable the current, the smaller the value and standard deviation of the sample entropy. Second, four parameters, namely pulse width, peak current, base current, and frequency, are selected for a four-level, three-factor orthogonal experiment. The calculation and analysis of the desired signals indicate that sample entropy values are affected by the welding current parameters. A quantitative method based on sample entropy is then proposed. The experimental results show that the method can reliably quantify welding current stability.
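Sample entropy itself can be computed as follows. This is a minimal textbook-style sketch, not the paper's code; the defaults for `m` and `r` follow common convention rather than the paper:

```python
import math
import statistics

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D signal: -ln(A / B), where B is the
    number of template pairs of length m matching within tolerance r
    (Chebyshev distance), and A is the number of those pairs still matching
    at length m + 1.  A lower value indicates a more regular (stable) signal."""
    if r is None:
        r = 0.2 * statistics.pstdev(x)  # a common default tolerance
    n = len(x)

    def matches(length):
        # use n - m templates for both lengths, as in the standard definition
        tpl = [x[i:i + length] for i in range(n - m)]
        count = 0
        for i in range(len(tpl)):
            for j in range(i + 1, len(tpl)):
                if max(abs(a - b) for a, b in zip(tpl[i], tpl[j])) <= r:
                    count += 1
        return count

    b, a = matches(m), matches(m + 1)
    return math.inf if a == 0 or b == 0 else -math.log(a / b)
```

A perfectly periodic current trace yields SampEn = 0; irregularity in the signal drives the value up, which is exactly the stability ordering the paper exploits.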
Entropy-based particle correspondence for shape populations.
Oguz, Ipek; Cates, Josh; Datar, Manasi; Paniagua, Beatriz; Fletcher, Thomas; Vachet, Clement; Styner, Martin; Whitaker, Ross
2016-07-01
Statistical shape analysis of anatomical structures plays an important role in many medical image analysis applications such as understanding the structural changes in anatomy in various stages of growth or disease. Establishing accurate correspondence across object populations is essential for such statistical shape analysis studies. In this paper, we present an entropy-based correspondence framework for computing point-based correspondence among populations of surfaces in a groupwise manner. This robust framework is parameterization-free and computationally efficient. We review the core principles of this method as well as various extensions to deal effectively with surfaces of complex geometry and application-driven correspondence metrics. We apply our method to synthetic and biological datasets to illustrate the concepts proposed and compare the performance of our framework to existing techniques. Through the numerous extensions and variations presented here, we create a very flexible framework that can effectively handle objects of various topologies, multi-object complexes, open surfaces, and objects of complex geometry such as high-curvature regions or extremely thin features.
Aho, A J; Kamata, K; Jäntti, V; Kulkas, A; Hagihira, S; Huhtala, H; Yli-Hankala, A
2015-08-01
Concomitantly recorded Bispectral Index® (BIS) and Entropy™ values sometimes show discordant trends during general anaesthesia. Previously, no attempt had been made to discover which EEG characteristics cause discrepancies between BIS and Entropy. We compared BIS and Entropy values, and analysed the changes in the raw EEG signal during surgical anaesthesia with sevoflurane. In this prospective, open-label study, 65 patients receiving general anaesthesia with sevoflurane were enrolled. BIS, Entropy and multichannel digital EEG were recorded. Concurrent BIS and State Entropy (SE) values were selected. Whenever BIS and SE values showed ≥10-unit disagreement for ≥60 s, the raw EEG signal was analysed both in time and frequency domain. A ≥10-unit disagreement ≥60 s was detected 428 times in 51 patients. These 428 episodes accounted for 5158 (11%) out of 45 918 analysed index pairs. During EEG burst suppression, SE was higher than BIS in 35 out of 49 episodes. During delta-theta dominance, BIS was higher than SE in 141 out of 157 episodes. During alpha or beta activity, SE was higher than BIS in all 49 episodes. During electrocautery, both BIS and SE changed, sometimes in the opposite direction, but returned to baseline values after electrocautery. Electromyography caused index disagreement four times (BIS > SE). Certain specific EEG patterns, and artifacts, are associated with discrepancies between BIS and SE. Time and frequency domain analyses of the original EEG improve the interpretation of studies involving BIS, Entropy and other EEG-based indices. NCT01077674.
Information Theoretic Approach Based on Entropy for Classification of Bioacoustics Signals
Han, Ng Chee; Muniandy, Sithi V.; Dayou, Jedol; Mun, Ho Chong; Ahmad, Abdul Hamid; Dalimin, Mohd. Noh
2010-07-01
A new hybrid method for automated frog sound identification, incorporating entropy and the spectral centroid concept, is proposed. Entropy has important physical implications as the amount of "disorder" in a system. This study explores the use of various definitions of entropy, such as the Shannon entropy, Kolmogorov-Rényi entropy and Tsallis entropy, as measures of information content or complexity for the purpose of pattern recognition of bioacoustics signals. Each of these definitions of entropy characterizes different aspects of the signal. The entropies are combined with other standard pattern recognition tools, such as Fourier spectral analysis, to form a hybrid spectral-entropic classification scheme. The efficiency of the system is tested using a database of sound syllables obtained from a number of species of Microhylidae frogs. A nonparametric k-NN classifier is used to recognize the frog species based on the spectral-entropic features. The results showed that the k-NN classifier based on the selected features is able to identify the species of the frogs with relatively good accuracy compared to features relying on spectral content alone. The robustness of the developed system is also tested for different noise levels.
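A spectral-entropic feature vector of the kind described can be sketched as below. This is an illustrative reconstruction, not the authors' feature set: the choice of entropy order `q` and the plain-DFT spectrum are assumptions made here.

```python
import cmath
import math

def power_spectrum(signal):
    """Power spectrum via a plain DFT (positive-frequency bins only)."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2
            for k in range(1, n // 2 + 1)]

def spectral_features(signal, q=2.0):
    """Feature vector mixing the spectral centroid with Shannon, Renyi (order q)
    and Tsallis (index q) entropies of the normalized power spectrum."""
    p = power_spectrum(signal)
    total = sum(p)
    probs = [v / total for v in p]
    centroid = sum((k + 1) * pr for k, pr in enumerate(probs))  # in bin units
    shannon = -sum(pr * math.log(pr) for pr in probs if pr > 0)
    renyi = math.log(sum(pr ** q for pr in probs)) / (1.0 - q)
    tsallis = (1.0 - sum(pr ** q for pr in probs)) / (q - 1.0)
    return centroid, shannon, renyi, tsallis
```

For a pure tone the spectrum concentrates in one bin, so all three entropies are near zero while the centroid sits at the tone's frequency bin; broadband frog calls spread the spectrum and raise the entropies, which is what gives the hybrid features their discriminative power.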
An entropy-based analysis of lane changing behavior: An interactive approach.
Kosun, Caglar; Ozdemir, Serhan
2017-05-19
As a novelty, this article proposes the nonadditive entropy framework for the description of driver behaviors during lane changing. The authors also state that this entropy framework governs lane changing behavior in traffic flow in accordance with long-range vehicular interactions and traffic safety. The nonadditive entropy framework is the new generalized theory of thermostatistical mechanics. Vehicular interactions during lane changing are considered within this framework. The interactive approach to the lane changing behavior of drivers is illustrated through the traffic flow scenarios presented in the article. From these scenarios, 4 categories of traffic flow and driver behaviors are obtained. Through the scenarios, comparative analyses of the nonadditive and additive entropy domains are also provided. Two quadrants of the categories belong to nonadditive entropy; the rest fall in the additive entropy domain. Driving behaviors are extracted, and the scenarios depict that nonadditivity matches safe driving well, whereas additivity corresponds to unsafe driving. Furthermore, the cooperative traffic system is considered in nonadditivity, where long-range interactions are present, while the uncooperative traffic system falls into the additivity domain. The analyses also indicate possible traffic flow transitions among the quadrants. This article shows that lane changing behavior could be generalized as nonadditive, with additivity as a special case, based on the given traffic conditions. The nearest and close neighbor models are well within the conventional additive entropy framework. In this article, both the long-range vehicular interactions and safe driving behavior in traffic are handled in the nonadditive entropy domain. It is also inferred that the Tsallis entropy region would correspond to mandatory lane changing behavior, whereas additive and either the extensive or nonextensive entropy region would
Wavelet Entropy-Based Traction Inverter Open Switch Fault Diagnosis in High-Speed Railways
Directory of Open Access Journals (Sweden)
Keting Hu
2016-03-01
Full Text Available In this paper, a diagnosis scheme is proposed to address the detection and isolation of open switch faults in the traction inverters of high-speed railway traction systems. Five entropy forms are discussed and compared with the traditional fault detection methods, namely, the discrete wavelet transform and the discrete wavelet packet transform. The traditional fault detection methods cannot efficiently detect open switch faults in traction inverters because of their low resolution or sudden changes in the current. The performances of Wavelet Packet Energy Shannon Entropy (WPESE), Wavelet Packet Energy Tsallis Entropy (WPETE) with different non-extensive parameters, Wavelet Packet Energy Shannon Entropy with a specific sub-band (WPESE3,6), Empirical Mode Decomposition Shannon Entropy (EMDESE), and Empirical Mode Decomposition Tsallis Entropy (EMDETE) with non-extensive parameters in detecting the open switch fault are evaluated by the evaluation parameter. Comparison experiments are carried out to select the best entropy form for traction inverter open switch fault detection. In addition, the DC component is adopted to isolate the faulty Insulated Gate Bipolar Transistor (IGBT). The simulation experiments show that the proposed scheme can diagnose single and simultaneous open switch faults correctly and in a timely manner.
Permutation entropy based speckle analysis in metal cutting
Nair, Usha; Krishna, Bindu M.; Namboothiri, V. N. N.; Nampoori, V. P. N.
2008-08-01
…for regular, chaotic, noisy or reality-based signals. PE works efficiently even in the presence of dynamical and/or observational noise. Unlike other nonlinear techniques, PE is easier and faster to calculate, as reconstruction of the state space from the time series is not required. An increasing value of PE indicates an increase in the complexity of the system dynamics. The PE of the time series is calculated using a one-sample-shift sliding window technique. PE of order n >= 2 is calculated from the Shannon entropy, where the sum runs over all n! permutations of order n. PE quantifies the information contained in comparing n consecutive values of the time series. The calculation of PE is fast and robust in nature. In situations where the data sets are huge and there is no time for preprocessing and fine-tuning, PE can effectively detect dynamical changes in the system. This makes PE an ideal choice for online detection of chatter, which is not possible with other conventional methods.
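The PE computation described above (ordinal patterns over sliding windows of n consecutive values, Shannon entropy over the n! possible patterns) can be written compactly as follows. This is a standard implementation sketch, not the authors' code; normalization by log(n!) is one common convention.

```python
import math

def permutation_entropy(series, n=3):
    """Permutation entropy of order n: the Shannon entropy of the distribution
    of ordinal patterns seen in sliding windows of n consecutive samples
    (one-sample shift), normalized by log(n!) to lie in [0, 1]."""
    counts = {}
    for i in range(len(series) - n + 1):
        window = series[i:i + n]
        # ordinal pattern: the permutation that sorts the window
        pattern = tuple(sorted(range(n), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(n))
```

A monotone series produces a single ordinal pattern and hence PE = 0; as the dynamics grow more complex (e.g. at chatter onset), more patterns appear and PE rises toward 1.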
Shannon Entropy-Based Prediction of Solar Cycle 25
Kakad, Bharati; Kakad, Amar; Ramesh, Durbha Sai
2017-07-01
A new model is proposed to forecast the peak sunspot activity of the upcoming solar cycle (SC) using Shannon entropy estimates related to the declining phase of the preceding SC. Daily and monthly smoothed international sunspot numbers are used in the present study. The Shannon entropy is the measure of inherent randomness in the SC and is found to vary with the phase of an SC as it progresses. In this model each SC with length T_{cy} is divided into five equal parts of duration T_{cy}/5. Each part is considered as one phase, and they are sequentially termed P1, P2, P3, P4, and P5. The Shannon entropy estimates for each of these five phases are obtained for the nth SC starting from n=10 - 23. We find that the Shannon entropy during the ending phase (P5) of the nth SC can be efficiently used to predict the peak smoothed sunspot number of the (n+1)th SC, i.e. S_{max}^{n+1}. The prediction equation derived in this study has a good correlation coefficient of 0.94. A noticeable decrease in entropy from 4.66 to 3.89 is encountered during P5 of SCs 22 to 23. The entropy value for P5 of the present SC 24 is not available as it has not yet ceased. However, if we assume that the fall in entropy continues for SC 24 at the same rate as that for SC 23, then we predict the peak smoothed sunspot number of 63±11.3 for SC 25. It is suggested that the upcoming SC 25 will be significantly weaker and comparable to the solar activity observed during the Dalton minimum in the past.
Zheng, Jinde; Pan, Haiyang; Yang, Shubao; Cheng, Junsheng
2018-01-01
Multiscale permutation entropy (MPE) is a recently proposed nonlinear dynamic method for measuring the randomness and detecting nonlinear dynamic changes in time series, and it can be used effectively to extract nonlinear dynamic fault features from the vibration signals of rolling bearings. To overcome the drawback of the coarse-graining process in MPE, an improved MPE method called generalized composite multiscale permutation entropy (GCMPE) is proposed in this paper. The influence of parameters on GCMPE and its comparison with MPE are also studied by analyzing simulated data. GCMPE is applied to fault feature extraction from the vibration signal of a rolling bearing; then, based on GCMPE, the Laplacian score for feature selection and a particle swarm optimization-based support vector machine, a new fault diagnosis method for rolling bearings is put forward. Finally, the proposed method is applied to analyze experimental rolling bearing data. The analysis results show that the proposed method can effectively realize fault diagnosis of rolling bearings and has a higher fault recognition rate than existing methods.
Entropy-Based Economic Denial of Sustainability Detection
Directory of Open Access Journals (Sweden)
Marco Antonio Sotelo Monge
2017-11-01
Full Text Available In recent years, an important increase in the amount and impact of Distributed Denial of Service (DDoS) threats has been reported by different information security organizations. They typically target the depletion of the computational resources of the victims, hence drastically harming their operational capabilities. Inspired by these methods, Economic Denial of Sustainability (EDoS) attacks pose a similar motivation, but adapted to Cloud computing environments, where the denial is achieved by damaging the economy of both suppliers and customers. The most common EDoS approach is therefore to make the offered services unsustainable by exploiting their auto-scaling algorithms. In order to contribute to their mitigation, this paper introduces a novel EDoS detection method based on the study of entropy variations related to the metrics taken into account when deciding auto-scaling actuations. Through the prediction and definition of adaptive thresholds, unexpected behaviors capable of fraudulently demanding new resource hiring are distinguished. To demonstrate the effectiveness of the proposal, an experimental scenario adapted to the singularities of EDoS threats and the assumptions driven by their original definition is described in depth. The preliminary results showed high accuracy.
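The entropy-variation idea with adaptive thresholds can be sketched as follows. This is a simplified illustration, not the paper's detector: `EntropyDetector`, its `k` and `warmup` parameters, and the use of per-source request counts as the monitored metric are all assumptions made here.

```python
import math
from collections import Counter, deque

def shannon(counts):
    """Shannon entropy (bits) of a frequency table."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

class EntropyDetector:
    """Track the entropy of a windowed event distribution (e.g. requests per
    source IP) and flag a window whose entropy deviates from an adaptive
    threshold derived from recent history (mean +/- k standard deviations)."""

    def __init__(self, history=20, k=3.0, warmup=5):
        self.history = deque(maxlen=history)
        self.k = k
        self.warmup = warmup

    def observe(self, events):
        h = shannon(Counter(events))
        anomalous = False
        if len(self.history) >= self.warmup:
            mean = sum(self.history) / len(self.history)
            var = sum((v - mean) ** 2 for v in self.history) / len(self.history)
            # small epsilon keeps a zero-variance history from firing on noise
            anomalous = abs(h - mean) > self.k * math.sqrt(var) + 1e-9
        self.history.append(h)
        return anomalous
```

A fraudulent workload that concentrates requests (or spreads them abnormally) shifts the window entropy away from the learned baseline and would trigger an auto-scaling review rather than an automatic scale-out.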
Formulating the shear stress distribution in circular open channels based on the Renyi entropy
Khozani, Zohreh Sheikh; Bonakdari, Hossein
2018-01-01
The principle of maximum entropy is employed to derive the shear stress distribution by maximizing the Renyi entropy subject to some constraints and by assuming that dimensionless shear stress is a random variable. A Renyi entropy-based equation can be used to model the shear stress distribution along the entire wetted perimeter of circular channels and circular channels with flat beds and deposited sediments. A wide range of experimental results for 12 hydraulic conditions with different Froude numbers (0.375 to 1.71) and flow depths (20.3 to 201.5 mm) were used to validate the derived shear stress distribution. For circular channels, model performance enhanced with increasing flow depth (mean relative error (RE) of 0.0414) and only deteriorated slightly at the greatest flow depth (RE of 0.0573). For circular channels with flat beds, the Renyi entropy model predicted the shear stress distribution well at lower sediment depth. The Renyi entropy model results were also compared with Shannon entropy model results. Both models performed well for circular channels, but for circular channels with flat beds the Renyi entropy model displayed superior performance in estimating the shear stress distribution. The Renyi entropy model was highly precise and predicted the shear stress distribution in a circular channel with RE of 0.0480 and in a circular channel with a flat bed with RE of 0.0488.
Entropy-Based Video Steganalysis of Motion Vectors
Directory of Open Access Journals (Sweden)
Elaheh Sadat Sadat
2018-04-01
Full Text Available In this paper, a new method is proposed for motion vector steganalysis using the entropy value and its combination with the features of the optimized motion vector. In this method, the entropy of blocks is calculated to determine their texture and the precision of their motion vectors. Then, using fuzzy clustering, the blocks are clustered into blocks with high and low texture, where the membership function of each block in the high-texture class indicates the texture of that block. These membership functions are used to weight the effective features that are extracted by reconstructing the motion estimation equations. The results indicate that using the entropy and the irregularity of each block increases the precision of the final classification of videos into cover and stego classes.
ISAR Image Formation Based on Minimum Entropy Criterion and Fractional Fourier Transform
Naghsh, Mohammad Mahdi; Modarres-Hashemi, Mahmood
Conventional radar imaging systems use the Fourier transform for image formation, but due to the target's complicated motion the Doppler spectrum is time-varying, and thus the reconstructed image becomes blurred even after applying standard motion compensation algorithms. Therefore, sophisticated algorithms such as polar reformatting are usually employed to produce clear images. Alternatively, Joint Time-Frequency (JTF) analysis can be used for image formation, which produces a clear image without resorting to a polar reformatting algorithm. In this paper, a new JTF-based method is proposed for image formation in inverse synthetic aperture radar (ISAR). The method uses a minimum entropy criterion for optimum parameter adjustment of the JTF algorithms. The Short Time Fourier Transform (STFT) and the Fractional Fourier Transform (FrFT) are applied as JTFs for time-varying Doppler spectrum analysis. Both the width of the Gaussian window of the STFT and the order of the FrFT, α, are adjusted using minimum entropy as local and total measures. Furthermore, a new statistical parameter, called normalized correlation, is defined for comparing images reconstructed by different methods. Simulation results show that the α-order FrFT with local adjustment performs much better than the other methods in this category, even at low SNR.
Directory of Open Access Journals (Sweden)
Zou-Qing Tan
2014-09-01
Full Text Available An entropy-controlled bending mechanism is presented to study the nanomechanics of microcantilever-based single-stranded DNA (ssDNA) sensors. First, the conformational free energy of the ssDNA layer is given with an improved scaling theory of thermal blobs considering the curvature effect, and the mechanical energy of the non-biological layer is described by Zhang's two-variable method for laminated beams. Then, an analytical model for the static deflections of ssDNA microcantilevers is formulated using the principle of minimum energy. Comparisons of the deflections predicted by the proposed model, Utz–Begley's model and Hagan's model are also examined. Numerical results show that the effect of conformational entropy on microcantilever deflections cannot be ignored, especially under conditions of high packing density or long chain systems, and that the variation of deflection predicted by the proposed analytical model not only accords qualitatively with that observed in the related experiments, but is also quantitatively closer to the experimental values than that of the preexisting models. To improve the sensitivity of static-mode biosensors, the substrate stiffness should be made as small as possible.
Directory of Open Access Journals (Sweden)
Dong Cui
2015-09-01
Full Text Available EEG characteristics that correlate with cognitive functions are important in detecting mild cognitive impairment (MCI) in T2DM. To investigate the complexity differences between an aMCI group and an age-matched non-aMCI control group in T2DM, six entropies combined with empirical mode decomposition (EMD) were used in the study: Approximate entropy (ApEn), Sample entropy (SaEn), Fuzzy entropy (FEn), Permutation entropy (PEn), Power spectrum entropy (PsEn) and Wavelet entropy (WEn). A feature extraction technique based on maximization of the area under the curve (AUC) and a support vector machine (SVM) were subsequently used for feature selection and classification. Finally, Pearson's linear correlation was employed to study associations between these entropies and cognitive functions. Compared to the other entropies, FEn had a higher classification accuracy, sensitivity and specificity of 68%, 67.1% and 71.9%, respectively. The top 43 salient features achieved classification accuracy, sensitivity and specificity of 73.8%, 72.3% and 77.9%, respectively. P4, T4 and C4 were the highest-ranking salient electrodes. Correlation analysis showed that FEn based on EMD was positively correlated with memory at electrodes F7, F8 and P4, and PsEn based on EMD was positively correlated with the Montreal cognitive assessment (MoCA) and memory at electrode T4. In sum, FEn based on EMD in the right temporal and occipital regions may be more suitable for early diagnosis of MCI in T2DM.
A concept of heat dissipation coefficient for thermal cloak based on entropy generation approach
Directory of Open Access Journals (Sweden)
Guoqiang Xu
2016-09-01
Full Text Available In this paper, we design a 3D spherical thermal cloak with eight material layers based on transformation thermodynamics; it operates at steady state before approaching the ‘static limit’. Departing from existing research, we introduce local entropy generation to represent the randomness in the cloaking system and propose the concept of a heat dissipation coefficient, which describes the capacity for heat diffusion in the ‘cloaking’ and ‘protected’ regions and characterizes the cloaking performance on the basis of non-equilibrium thermodynamics. We show that the heat dissipation capacity of the thermal cloak responds to changes in anisotropy (caused by changing the number of layers) and in the applied temperature difference. In addition, we compare the results of different cloaks and argue that the heat dissipation coefficient can serve as an evaluation criterion for thermal cloaks.
Bayesian Maximum Entropy Based Algorithm for Digital X-ray Mammogram Processing
Directory of Open Access Journals (Sweden)
Radu Mutihac
2009-06-01
Full Text Available The basics of Bayesian statistics for inverse problems using the maximum entropy principle are summarized in connection with the restoration of positive, additive images from various types of data, such as X-ray digital mammograms. An efficient iterative algorithm for image restoration from large data sets, based on the conjugate gradient method and Lagrange multipliers in the nonlinear optimization of a specific potential function, was developed. The point spread function of the imaging system was determined by numerical simulations of inhomogeneous breast-like tissue with microcalcification inclusions of various opacities. The processed digital and digitized mammograms proved superior to their raw counterparts in terms of contrast, resolution, noise, and visibility of details.
Directory of Open Access Journals (Sweden)
Zhendong Mu
2017-02-01
Full Text Available Driver fatigue has become one of the major causes of traffic accidents and is a complicated physiological process. However, there is no effective method to detect driving fatigue. Electroencephalography (EEG) signals are complex, unstable, and non-linear; non-linear analysis methods, such as entropy, may therefore be more appropriate. This study evaluates a combined entropy-based processing method of EEG data to detect driver fatigue. In this paper, 12 subjects were selected to take part in an experiment, undergoing driving training in a virtual environment under the instruction of the operator. Four types of entropy (spectrum entropy, approximate entropy, sample entropy and fuzzy entropy) were used to extract features for the purpose of driver fatigue detection. An electrode selection process and a support vector machine (SVM) classification algorithm were also proposed. The average recognition accuracy was 98.75%. Retrospective analysis of the EEG showed that features extracted from electrodes T5, TP7, TP8 and FP1 may yield better performance. The SVM classifier with a radial basis function kernel obtained better results. The combined entropy-based method demonstrates good classification performance for driver fatigue detection.
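Spectrum entropy, the first of the four features above, treats the normalized power spectrum as a probability distribution. A minimal sketch follows; normalizing the result to [0, 1] and dropping the DC bin are our choices, not necessarily the paper's:

```python
import numpy as np

def spectral_entropy(x):
    """Normalized spectrum entropy in [0, 1] of a 1-D signal."""
    # Power spectrum via the real FFT; drop the DC bin.
    psd = np.abs(np.fft.rfft(x - np.mean(x)))[1:] ** 2
    p = psd / psd.sum()                 # spectrum as a probability distribution
    logp = np.log2(p, out=np.zeros_like(p), where=p > 0)
    return float(-np.sum(p * logp) / np.log2(len(p)))
```

A narrowband signal concentrates its power in a few bins and scores near 0; broadband noise spreads power across the spectrum and scores near 1.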
Sharmila, A; Aman Raj, Suman; Shashank, Pandey; Mahalakshmi, P
2018-01-01
In this work, we have used a time-frequency domain analysis method called the discrete wavelet transform (DWT). This method stands out compared to other proposed methods because of its algorithmic elegance and accuracy. A wavelet is a mathematical function based on time-frequency analysis in signal processing. It is useful particularly because it allows a weak signal to be recovered from a noisy signal without much distortion. Wavelet analysis works by decomposing the signal into mathematical functions that can be decoded by the receiver. Furthermore, we have used Shannon entropy and approximate entropy (ApEn) for extracting the complexities associated with electroencephalographic (EEG) signals. ApEn is a suitable feature for characterising EEGs because its value drops suddenly during the excessive synchronous discharge of neurons in the brain that accompanies epileptic activity. In this study, EEG signals are decomposed into six sub-bands, namely D1–D5 and A5, using the DWT technique. Non-linear features such as ApEn and Shannon entropy are calculated from these sub-bands, and support vector machine classifiers are used for classification. This scheme is tested using EEG data recorded from five healthy subjects and five epileptic patients during the inter-ictal and ictal periods. The data were acquired from the University of Bonn, Germany. The proposed method is evaluated through 15 classification problems and obtains high classification accuracy, reaching 100% in two cases, which indicates the good classification performance of the proposed method.
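The decomposition step can be sketched with the simplest possible filter bank. The Haar wavelet below is a stand-in (the study's mother wavelet is not stated in this abstract), and the histogram-based Shannon entropy is one common way to summarize a sub-band:

```python
import numpy as np

def haar_dwt(x, levels=5):
    """Multi-level Haar DWT: returns details [D1..Dk] and the final
    approximation Ak (a minimal stand-in for a full DWT)."""
    approx = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        n = min(len(even), len(odd))
        details.append((even[:n] - odd[:n]) / np.sqrt(2.0))  # detail D_k
        approx = (even[:n] + odd[:n]) / np.sqrt(2.0)         # next approximation
    return details, approx

def subband_shannon_entropy(coeffs, bins=16):
    """Histogram-based Shannon entropy (bits) of one sub-band."""
    hist, _ = np.histogram(coeffs, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))
```

For a 1024-sample epoch this yields sub-bands D1–D5 and A5, whose entropies would feed the SVM. The transform is orthonormal, so signal energy is exactly preserved across the sub-bands.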
Application of entropy measurement technique in grey based ...
African Journals Online (AJOL)
Gupta et al. (2014) also used the entropy measurement method for optimizing the process parameters of friction stir welding of aluminium alloy. In the present ..... Nomenclature: FY, F-ratio of parameter Y; SY′, pure sum of squares; CY, percentage (%) contribution of parameter Y; Ce, percentage (%) contribution of the error term; C.F.
Entropy Viscosity and L1-based Approximations of PDEs: Exploiting Sparsity
2015-10-23
AFRL-AFOSR-VA-TR-2015-0337. Entropy Viscosity and L1-based Approximations of PDEs: Exploiting Sparsity. Jean-Luc Guermond, Texas A&M University. Contract number: FA9550-12-1-0358. Conservation equations can be stabilized by using the so-called entropy viscosity method, and we proposed to investigate this new technique.
Directory of Open Access Journals (Sweden)
She Xiaoqiang
2017-10-01
Full Text Available This paper proposes a classification method for the intertidal area using quad-polarimetric synthetic aperture radar data. A systematic comparison of four well-known multipolarization features is provided so that appropriate features can be selected according to the characteristics of the intertidal area. Analysis results show that the two most powerful multipolarization features are polarimetric entropy and anisotropy. Furthermore, through detailed analysis of the scattering mechanisms underlying the polarimetric entropy, the Generalized Extreme Value (GEV) distribution is employed to describe the statistical characteristics of the intertidal area based on extreme value theory. Consequently, a new classification method is proposed by combining GEV mixture models with the EM algorithm. Finally, experiments are performed on Radarsat-2 quad-polarization data of the Dongtan intertidal area, Shanghai, to validate the method.
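The two leading features identified above, polarimetric entropy and anisotropy, derive from the eigenvalues of the coherency matrix. A minimal sketch of the Cloude–Pottier-style computation (illustrative only; the paper's exact processing chain is not reproduced):

```python
import numpy as np

def h_a_decomposition(T):
    """Polarimetric entropy H and anisotropy A from a 3x3 Hermitian
    coherency matrix T (Cloude-Pottier style sketch)."""
    lam = np.sort(np.linalg.eigvalsh(T))[::-1]        # eigenvalues, descending
    p = lam / lam.sum()                               # pseudo-probabilities
    logp = np.log(p, out=np.zeros_like(p), where=p > 0)
    H = float(-np.sum(p * logp) / np.log(3))          # entropy, scaled to [0, 1]
    A = float((lam[1] - lam[2]) / (lam[1] + lam[2]))  # anisotropy
    return H, A
```

Equal eigenvalues (fully depolarized scattering) give H = 1; a single dominant scattering mechanism gives H near 0, with A separating the two minor mechanisms.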
Bearing Fault Diagnosis Based on Multiscale Permutation Entropy and Support Vector Machine
Directory of Open Access Journals (Sweden)
Jian-Jiun Ding
2012-07-01
Full Text Available Bearing fault diagnosis has attracted significant attention over the past few decades. It consists of two major parts: vibration signal feature extraction and condition classification for the extracted features. In this paper, multiscale permutation entropy (MPE was introduced for feature extraction from faulty bearing vibration signals. After extracting feature vectors by MPE, the support vector machine (SVM was applied to automate the fault diagnosis procedure. Simulation results demonstrated that the proposed method is a very powerful algorithm for bearing fault diagnosis and has much better performance than the methods based on single scale permutation entropy (PE and multiscale entropy (MSE.
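MPE coarse-grains the signal at several scales and computes the permutation entropy of the ordinal patterns at each scale. A minimal sketch (embedding dimension m = 3 and five scales are illustrative defaults, not the paper's tuned values):

```python
import numpy as np
from math import factorial

def permutation_entropy(x, m=3, normalize=True):
    """Permutation entropy over ordinal patterns of length m."""
    patterns = {}
    for i in range(len(x) - m + 1):
        key = tuple(np.argsort(x[i:i + m]))   # ordinal pattern of the window
        patterns[key] = patterns.get(key, 0) + 1
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    h = -np.sum(p * np.log(p))
    return float(h / np.log(factorial(m))) if normalize else float(h)

def multiscale_permutation_entropy(x, m=3, max_scale=5):
    """PE of coarse-grained versions of x at scales 1..max_scale."""
    x = np.asarray(x, dtype=float)
    out = []
    for s in range(1, max_scale + 1):
        # Coarse-grain: average non-overlapping windows of length s.
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        out.append(permutation_entropy(coarse, m))
    return out
```

A monotonic signal contains a single ordinal pattern, so its permutation entropy is zero, while white noise approaches the normalized maximum of 1.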
Towards an information extraction and knowledge formation framework based on Shannon entropy
Directory of Open Access Journals (Sweden)
Iliescu Dragoș
2017-01-01
Full Text Available The subject of information quantity is addressed in this paper, with the specific domain of nonconforming product management considered as the information source. The work represents a case study: raw data were gathered from a heavy industrial works company, and information extraction and knowledge formation are considered herein. The method used for information quantity estimation is based on the Shannon entropy formula. The information and entropy spectra are decomposed and analysed for the extraction of specific information and the formation of knowledge. The results of the entropy analysis point out the information that the involved organisation needs to acquire, presented as a specific knowledge type.
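The Shannon entropy at the core of such an analysis is straightforward to compute from a log of categorical events. In the sketch below, the nonconformity causes and their counts are hypothetical placeholders:

```python
import numpy as np
from collections import Counter

def shannon_entropy_bits(events):
    """Shannon entropy (bits) of a sequence of categorical events."""
    counts = np.array(list(Counter(events).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# Hypothetical nonconformity log for a batch of rejected products.
defects = ["porosity"] * 8 + ["crack"] * 2 + ["misalignment"] * 2
```

With these counts the entropy is about 1.25 bits; a more even spread of causes would push it toward log2 of the number of categories, signalling less concentrated information about where nonconformity arises.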
Jiang, Quansheng; Shen, Yehu; Li, Hua; Xu, Fengyu
2018-01-24
Feature recognition and fault diagnosis play an important role in equipment safety and the stable operation of rotating machinery. To cope with the complexity of the vibration signals of rotating machinery, a feature fusion model based on information entropy and a probabilistic neural network is proposed in this paper. The new method first uses information entropy theory to extract three kinds of characteristic entropy from vibration signals, namely singular spectrum entropy, power spectrum entropy, and approximate entropy. A feature fusion model is then constructed to classify and diagnose the fault signals. The proposed approach combines comprehensive information from different aspects and is more sensitive to the fault features. Experimental results on simulated fault signals verified the better performance of the proposed approach. On real two-span rotor data, the fault detection accuracy of the new method is more than 10% higher than that of methods using the three kinds of information entropy separately. The new approach is shown to be an effective fault recognition method for rotating machinery.
Xu, Pengcheng; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi; Liu, Jiufu; Zou, Ying; He, Ruimin
2017-12-01
Hydrometeorological data are needed for obtaining point and areal means, quantifying the spatial variability of hydrometeorological variables, and calibrating and verifying hydrometeorological models. Hydrometeorological networks are used to collect such data. Since data collection is expensive, it is essential to design an optimal network based on the minimal number of hydrometeorological stations in order to reduce costs. This study proposes a two-phase copula entropy-based multiobjective optimization approach that includes: (1) copula entropy-based directional information transfer (CDIT) for clustering the potential hydrometeorological gauges into several groups, and (2) a multiobjective method for selecting the optimal combination of gauges for regionalized groups. Although entropy theory has been employed for network design before, the joint histogram method used for mutual information estimation has several limitations. The copula entropy-based mutual information (MI) estimation method is shown to be more effective for quantifying the uncertainty of redundant information than the joint histogram (JH) method. The effectiveness of this approach is verified by applying it to one type of hydrometeorological gauge network, with the use of three model evaluation measures: the Nash-Sutcliffe Coefficient (NSC), the arithmetic mean of the negative copula entropy (MNCE), and MNCE/NSC. Results indicate that the two-phase copula entropy-based multiobjective technique is capable of evaluating the performance of regional hydrometeorological networks and can enable decision makers to develop strategies for water resources management.
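For contrast with the copula approach, the joint-histogram (JH) estimator that the study identifies as limited can be sketched in a few lines; the bin count and sample data are arbitrary choices:

```python
import numpy as np

def mutual_information_jh(x, y, bins=10):
    """Joint-histogram (JH) mutual information estimator: the baseline
    whose limitations motivate the copula-entropy estimator."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                          # empirical joint distribution
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(pxy > 0, pxy / (px * py), 1.0)
    return float(np.sum(pxy * np.log(ratio)))
```

The estimator is sensitive to the bin count and is biased upward for small samples, which is one of the limitations the copula-based method addresses.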
Fozouni, Niloufar; Chopp, Michael; Nejad-Davarani, Siamak P; Zhang, Zheng Gang; Lehman, Norman L; Gu, Steven; Ueno, Yuji; Lu, Mei; Ding, Guangliang; Li, Lian; Hu, Jiani; Bagher-Ebadian, Hassan; Hearshen, David; Jiang, Quan
2013-01-01
To overcome the limitations of conventional diffusion tensor magnetic resonance imaging resulting from the assumption of a Gaussian diffusion model for characterizing voxels containing multiple axonal orientations, Shannon's entropy was employed to evaluate white matter structure in the human brain and in brain remodeling after traumatic brain injury (TBI) in a rat. Thirteen healthy subjects were investigated using a Q-ball based DTI data sampling scheme. FA and entropy values were measured in white matter bundles, white matter fiber crossing areas, different gray matter (GM) regions, and cerebrospinal fluid (CSF). Axonal densities from the same regions of interest (ROIs) were evaluated in Bielschowsky and Luxol fast blue stained autopsy (n = 30) brain sections by light microscopy. As a case demonstration, a Wistar rat subjected to TBI and treated with bone marrow stromal cells (MSC) 1 week after TBI was employed to illustrate the superior ability of entropy over FA in detecting reorganized crossing axonal bundles, as confirmed by histological analysis with Bielschowsky and Luxol fast blue staining. Unlike FA, entropy was less affected by axonal orientation and more affected by axonal density. A significant agreement (r = 0.91) was detected between entropy values from in vivo human brain and axonal density measured histologically post mortem in the same brain structures. The MSC-treated TBI rat demonstrated that the entropy approach is superior to FA in detecting axonal remodeling after injury. Compared with FA, entropy detected new axonal remodeling regions with crossing axons, confirmed with immunohistological staining. Entropy measurement is more effective in distinguishing axonal remodeling after injury when compared with FA. Entropy is also more sensitive to axonal density than to axonal orientation, and thus may provide a more accurate reflection of axonal changes that occur in neurological injury and disease.
The Grading Entropy-based Criteria for Structural Stability of Granular Materials and Filters
Directory of Open Access Journals (Sweden)
Janos Lőrincz
2015-05-01
Full Text Available This paper deals with three grading entropy-based rules that describe different soil structure stability phenomena: an internal stability rule, a filtering rule and a segregation rule. These rules are elaborated on the basis of a large amount of laboratory testing and of existing knowledge in the field. Use is made of the theory of grading entropy to condense all of the information of the grading curve into a pair of entropy-based parameters that allow soils with common behaviours to be grouped into domains on an entropy diagram. Applications of the derived entropy-based rules are presented by examining the cause of a dam failure, by testing against the existing filter rules from the literature, and by giving some examples for the design of non-segregating grading curves (discrete particle size distributions by dry weight). A physical basis for the internal stability rule is established, wherein the higher values of base entropy required for granular stability are shown to reflect the closeness between the mean and maximum grain diameters, which explains how there are sufficient coarser grains to achieve a stable grain skeleton.
Improved Ordinary Measure and Image Entropy Theory based intelligent Copy Detection Method
Directory of Open Access Journals (Sweden)
Dengpan Ye
2011-10-01
Full Text Available Nowadays, more and more multimedia websites appear in social networks, bringing security problems such as privacy, piracy, and disclosure of sensitive content. With copyright protection as the aim, copy detection technology for multimedia content has become a hot topic. In our previous work, a new computer-based copyright control system for detecting media was proposed. Building on this system, this paper proposes an improved media feature matching measure and an entropy-based copy detection method. The Levenshtein distance is used to enhance the matching degree in the feature matching measure for copy detection. For entropy-based copy detection, we fuse two features of the entropy matrix of the extracted entropy feature: first, we extract the entropy matrix of the image and normalize it; then, we fuse the eigenvalue feature and the transfer matrix feature of the entropy matrix. The fused features are used for image copy detection. Experiments show that, compared with using either of these two features alone, the fused-feature matching method is apparently more robust and effective. The fused feature yields a high detection rate for copy images that have undergone attacks such as noise, compression, zooming, and rotation. Compared with the referenced methods, the proposed method is more intelligent and achieves good performance.
An Entropy-Based Approach to Path Analysis of Structural Generalized Linear Models: A Basic Idea
Directory of Open Access Journals (Sweden)
Nobuoki Eshima
2015-07-01
Full Text Available A path analysis method for causal systems based on generalized linear models is proposed by using entropy. A practical example is introduced, and a brief explanation of the entropy coefficient of determination is given. Direct and indirect effects of explanatory variables are discussed as log odds ratios, i.e., relative information, and a method for summarizing the effects is proposed. The example dataset is re-analyzed by using the method.
Directory of Open Access Journals (Sweden)
Hou Hucan
2017-01-01
Full Text Available Inspired by the wide application of the second law of thermodynamics to flow and heat transfer devices, the local entropy production analysis method was introduced into the energy assessment of a centrifugal water pump. Based on a Reynolds stress turbulence model and the energy equation, a steady numerical simulation of the whole flow passage of an IS centrifugal pump was carried out. The local entropy production terms were calculated by user-defined functions, mainly including wall entropy production, turbulent entropy production, and viscous entropy production. The numerical results indicated that the irreversible energy loss calculated by the local entropy production method agreed well with that calculated by the traditional method, with some deviations probably caused by the strong rotation and high curvature of the impeller and volute. The wall entropy production and turbulent entropy production accounted for a large part of the total entropy production, about 48.61% and 47.91% respectively, which indicates that wall friction and turbulent fluctuation are the major factors affecting irreversible energy loss. The entropy production rate distribution was also discussed and compared with the turbulent kinetic energy dissipation rate distribution: the turbulent entropy production rate increased sharply in the near-wall regions, while both were otherwise distributed rather uniformly. The blade leading edge near the suction side, the trailing edge, and the volute tongue were the main regions generating irreversible exergy loss. This research opens a new perspective on evaluating energy loss and further optimizing pumps using entropy production minimization.
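For reference, a common formulation of the local entropy production terms evaluated in such CFD post-processing (standard practice in entropy generation analysis; the paper's exact user-defined-function expressions are not given in the abstract) is:

```latex
\dot{S}'''_{\mathrm{visc}} = \frac{\mu}{T}\,\Phi ,\qquad
\dot{S}'''_{\mathrm{turb}} = \frac{\rho\,\varepsilon}{T} ,\qquad
\dot{S}'''_{\mathrm{heat}} = \frac{\lambda}{T^{2}}\,\lvert\nabla T\rvert^{2},
```

where Φ is the viscous dissipation function, ε the turbulent dissipation rate, and λ the thermal conductivity; the wall contribution is obtained by integrating these terms over the near-wall cells.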
Entropy-based critical reaction time for mixing-controlled reactive transport
DEFF Research Database (Denmark)
Chiogna, Gabriele; Rolle, Massimo
2017-01-01
Entropy-based metrics, such as the dilution index, have been proposed to quantify dilution and reactive mixing in solute transport problems. In this work, we derive the transient advection dispersion equation for the entropy density of a reactive plume. We restrict our analysis to the case where the concentration distribution of the transported species is Gaussian, and we observe that, even in the case of an instantaneous complete bimolecular reaction, dilution caused by dispersive processes dominates the entropy balance at early times and results in a net increase of the entropy density of a reactive species. Our results show that, differently from the critical dilution index, the critical reaction time depends on solute transport processes such as advection and hydrodynamic dispersion.
Zhao, Yong; Hong, Wen-Xue
2011-11-01
Fast, nondestructive and accurate identification of special quality eggs is an urgent problem. This paper proposes a new feature extraction method based on symbolic entropy to identify special quality eggs from their near-infrared spectra. The authors selected normal eggs, free range eggs, selenium-enriched eggs and zinc-enriched eggs as research objects and measured the near-infrared diffuse reflectance spectra in the range of 12,000-4,000 cm(-1). Raw spectra were symbolically represented with an aggregate approximation algorithm, and symbolic entropy was extracted as the feature vector. An error-correcting output codes multiclass support vector machine classifier was designed to identify the spectra. The symbolic entropy feature is robust to parameter changes, and the highest recognition rate reaches 100%. The results show that near-infrared identification of special quality eggs is feasible and that symbolic entropy can be used as a new feature extraction method for near-infrared spectra.
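A SAX-style reading of the "aggregate approximation" step (our interpretation of the abstract; the segment count, alphabet size, and quantile breakpoints are assumptions) can be sketched as:

```python
import numpy as np

def symbolize(spectrum, n_symbols=4, word_len=32):
    """SAX-style symbolic representation of a spectrum: piecewise-aggregate
    segment means, then quantile-based alphabet assignment."""
    x = np.asarray(spectrum, dtype=float)
    x = (x - x.mean()) / x.std()                                   # z-normalize
    seg = len(x) // word_len
    paa = x[:seg * word_len].reshape(word_len, seg).mean(axis=1)   # segment means
    cuts = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])  # breakpoints
    return np.digitize(paa, cuts)                                  # symbols 0..n-1

def symbol_entropy(symbols, n_symbols=4):
    """Shannon entropy (bits) of the symbol distribution: the feature vector
    component fed to the classifier in this sketch."""
    p = np.bincount(symbols, minlength=n_symbols) / len(symbols)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))
```

The symbolization compresses each spectrum to a short word, and the entropy of the word's symbol distribution summarizes how evenly the spectral shape visits the alphabet.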
A New Feature Extraction Algorithm Based on Entropy Cloud Characteristics of Communication Signals
Directory of Open Access Journals (Sweden)
Jingchao Li
2015-01-01
Full Text Available Identifying communication signals in low-SNR environments has become more difficult due to the increasingly complex communication environment. Most of the relevant literature revolves around signal recognition under stable SNR and is not applicable in time-varying SNR environments. To solve this problem, we propose a new feature extraction method based on entropy cloud characteristics of communication modulation signals. The proposed algorithm first extracts the Shannon entropy and index entropy characteristics of the signals and then effectively combines entropy theory with cloud model theory. Compared with traditional feature extraction methods, the instability distribution characteristics of the signals' entropy features can be further extracted from the cloud model's digital characteristics in low-SNR environments by the proposed algorithm, which improves signal recognition significantly. Numerical simulations show that the entropy cloud feature extraction algorithm achieves better signal recognition; even when the SNR is −11 dB, the signal recognition rate can still reach 100%.
Constraints of Compound Systems: Prerequisites for Thermodynamic Modeling Based on Shannon Entropy
Directory of Open Access Journals (Sweden)
Martin Pfleger
2014-05-01
Full Text Available Thermodynamic modeling of extensive systems usually implicitly assumes the additivity of entropy. Furthermore, if this modeling is based on the concept of Shannon entropy, additivity of the latter function must also be guaranteed. In this case, the constituents of a thermodynamic system are treated as subsystems of a compound system, and the Shannon entropy of the compound system must be subjected to constrained maximization. The scope of this paper is to clarify prerequisites for applying the concept of Shannon entropy and the maximum entropy principle to thermodynamic modeling of extensive systems. This is accomplished by investigating how the constraints of the compound system have to depend on mean values of the subsystems in order to ensure additivity. Two examples illustrate the basic ideas behind this approach, comprising the ideal gas model and condensed phase lattice systems as limiting cases of fluid phases. The paper is the first step towards developing a new approach for modeling interacting systems using the concept of Shannon entropy.
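The additivity prerequisite can be stated concretely: when the compound system factorizes into independent subsystems with distributions $p_i$ and $q_j$, the Shannon entropy of the joint distribution separates,

```latex
S(AB) = -\sum_{i,j} p_i q_j \ln\!\left(p_i q_j\right)
      = -\sum_i p_i \ln p_i \;-\; \sum_j q_j \ln q_j
      = S(A) + S(B),
```

using $\sum_i p_i = \sum_j q_j = 1$; the constraints imposed during the constrained maximization must be compatible with this factorization for additivity to survive.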
Population entropies estimates of proteins
Low, Wai Yee
2017-05-01
The Shannon entropy equation provides a way to estimate the variability of amino acid sequences in a multiple sequence alignment of proteins. Knowledge of protein variability is useful in many areas such as vaccine design, identification of antibody binding sites, and exploration of protein 3D structural properties. In cases where the population entropies of a protein are of interest but only a small sample size can be obtained, a method based on linear regression and random subsampling can be used to estimate the population entropy. This method is useful for comparisons of entropies where the actual sequence counts differ and thus correction for alignment size bias is needed. In the current work, an R-based package named EntropyCorrect that enables estimation of population entropy is presented, and an empirical study of how well this new algorithm performs on simulated datasets of various combinations of population and sample sizes is discussed. The package is available at https://github.com/lloydlow/EntropyCorrect. This article, which was originally published online on 12 May 2017, contained an error in Eq. (1), where the summation sign was missing. The corrected equation appears in the Corrigendum attached to the pdf.
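The regression-and-subsampling idea can be sketched as follows. This is not EntropyCorrect's exact model; the assumed bias model is the standard result that the plug-in entropy of an n-sample is depressed roughly linearly in 1/n, so the intercept of a fit against 1/n estimates the population entropy:

```python
import numpy as np

def column_entropy(column):
    """Plug-in Shannon entropy (nats) of one alignment column."""
    _, counts = np.unique(column, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def estimate_population_entropy(column, sizes=(20, 40, 80, 160), reps=50, seed=0):
    """Extrapolate the small-sample bias away: fit mean subsample entropy
    against 1/n and return the intercept as the population estimate."""
    rng = np.random.default_rng(seed)
    xs, ys = [], []
    for n in sizes:
        h = [column_entropy(rng.choice(column, size=n, replace=True))
             for _ in range(reps)]
        xs.append(1.0 / n)
        ys.append(np.mean(h))
    slope, intercept = np.polyfit(xs, ys, 1)   # entropy ~ H - c/n
    return float(intercept)
```

Because the plug-in estimator underestimates entropy by roughly (K-1)/(2n) for K symbols, the fitted line has a negative slope and its intercept lands near the population value.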
ACCUMULATED DEFORMATION MODELING OF PERMANENT WAY BASED ON ENTROPY SYSTEM
Directory of Open Access Journals (Sweden)
D. M. Kurhan
2015-07-01
Full Text Available Purpose. This work provides theoretical research on the possibility of using methods that determine the lifetime of a railway track not only in terms of total stresses but also accounting for its structure and dynamic characteristics. The aim of these studies is to create a model of deformation accumulation for assessing the service life of a railway track, taking these features into account. Methodology. To simulate the gradual change of state during operation (the accumulation of deformations), the railway track is presented as a system that consists of many particles of different materials assembled into a coherent structure. It is appropriate to speak not of the appearance of deformations of a certain size in a certain section of the track, but of the probability of such an event on the site. If we work with the probability of occurrence of deviations, the state of the system is conveniently characterized by the number of breaks of conditional internal connections. The same state of the system may correspond to different combinations of breaks: the more breaks, the greater the number of structural variants corresponding to the current state. Such a process can be represented as a gradual transition from an ordered state to a chaotic one. The numerical value of entropy is used to describe the state of the system. Findings. The entropy of the system constantly increases as it ages. The growth of entropy is expressed through changes in the internal energy of the system, which can be determined from the mechanical work of the forces that lead to deformation. This makes it possible to quantify the breaking of bonds in the system as a consequence of performed mechanical work. According to the results of the theoretical research, methods for estimating the life cycles of railway operation consider such factors as the structure of the train flow, the construction of the permanent way, the movement of trains at high
International Nuclear Information System (INIS)
Giangaspero, Giorgio; Sciubba, Enrico
2013-01-01
This paper presents an application of the entropy generation minimization method to the pseudo-optimization of the configuration of the heat exchange surfaces in a Solar Rooftile. An initial “standard” commercial configuration is gradually improved by introducing design changes aimed at the reduction of the thermodynamic losses due to heat transfer and fluid friction. Different geometries (pins, fins and others) are analysed with a commercial CFD (Computational Fluid Dynamics) code that also computes the local entropy generation rate. The design improvement process is carried out on the basis of a careful analysis of the local entropy generation maps and the rationale behind each step of the process is discussed in this perspective. The results are compared with other entropy generation minimization techniques available in the recent technical literature. It is found that the geometry with pin-fins has the best performance among the tested ones, and that the optimal pin array shape parameters (pitch and span) can be determined by a critical analysis of the integrated and local entropy maps and of the temperature contours. - Highlights: ► An entropy generation minimization method is applied to a solar heat exchanger. ► The approach is heuristic and leads to a pseudo-optimization process with CFD as main tool. ► The process is based on the evaluation of the local entropy generation maps. ► The geometry with pin-fins in general outperforms all other configurations. ► The entropy maps and temperature contours can be used to determine the optimal pin array design parameters
International Nuclear Information System (INIS)
Han, J; Dong, F; Xu, Y Y
2009-01-01
This paper introduces the fundamentals of a cross-section measurement system based on Electrical Resistance Tomography (ERT). Measured data for four flow regimes of gas/liquid two-phase flow in a horizontal pipe were obtained by an ERT system. From the measured data, five entropies are extracted to analyze the experimental data according to the different flow regimes, and the analysis method is examined and compared from three different perspectives. The results indicate that all three perspectives of entropy-based feature extraction are sensitive to the flow pattern transition in gas/liquid two-phase flow. By analyzing the results of the three perspectives as the gas/liquid two-phase flow parameters change, the dynamic structures of the flow are obtained, and they also provide an efficient supplement for revealing the flow pattern transition mechanism of gas/liquid two-phase flow. Comparison of the three feature extraction methods shows that the appropriate entropy should be used for the identification and prediction of flow regimes.
An entropy-based improved k-top scoring pairs (TSP) method for ...
African Journals Online (AJOL)
An entropy-based improved k-top scoring pairs (TSP) (Ik-TSP) method was presented in this study for the classification and prediction of human cancers based on gene-expression data. We compared Ik-TSP classifiers with 5 different machine learning methods and the k-TSP method based on 3 different feature selection ...
Directory of Open Access Journals (Sweden)
Yingjun Zhang
2013-01-01
Full Text Available Multiattribute decision making (MADM) is one of the central problems in artificial intelligence, specifically in management fields. In most cases, this problem arises from uncertainty both in the data derived from the decision maker and in the actions performed in the environment. Fuzzy sets and higher-order fuzzy sets have proven to be effective approaches for solving decision-making problems with uncertainty. Therefore, in this paper, we investigate the MADM problem with completely unknown attribute weights in the framework of interval-valued intuitionistic fuzzy sets (IVIFSs). We first propose a new definition of IVIF entropy and some calculation methods for it. Furthermore, we propose an entropy-based decision-making method to solve IVIF MADM problems with completely unknown attribute weights. Particular emphasis is put on assessing the attribute weights based on IVIF entropy. Instead of the traditional methods, which use divergence among attributes or the probabilistic discrimination of attributes to obtain attribute weights, we utilize the IVIF entropy to assess the attribute weights based on the credibility of the decision-making matrix. Finally, a supplier selection example is given to demonstrate the feasibility and validity of the proposed MADM method.
Value at risk estimation with entropy-based wavelet analysis in exchange markets
He, Kaijian; Wang, Lijun; Zou, Yingchao; Lai, Kin Keung
2014-08-01
In recent years, exchange markets have become increasingly integrated. Fluctuations and risks across different exchange markets exhibit co-moving and complex dynamics. In this paper we propose entropy-based multivariate wavelet approaches to analyze the multiscale characteristics in the multidimensional domain and further improve the reliability of Value at Risk estimation. Wavelet analysis is introduced to construct an entropy-based multiscale portfolio Value at Risk estimation algorithm that accounts for the multiscale dynamic correlation. The entropy measure, combined with the error-minimization principle, is proposed as the more effective measure for selecting the best basis when determining the wavelet family and decomposition level to use. The empirical studies conducted in this paper provide positive evidence of the superior performance of the proposed approach on the closely related Chinese Renminbi and European Euro exchange markets.
Directory of Open Access Journals (Sweden)
Lars Ronn Olsen
2011-12-01
Full Text Available Broad coverage of the pathogen population is particularly important when designing CD8+ T-cell epitope vaccines against viral pathogens. Traditional approaches to assembling broadly covering sets of peptides are commonly based on assembling highly conserved epitopes. Peptide block entropy analysis is a novel approach to assembling sets of broadly covering antigens. Since T-cell epitopes are recognized as peptides rather than individual residues, this method is based on calculating the information content of blocks of peptides from a multiple sequence alignment of homologous proteins rather than of individual residues. The block entropy analysis provides broad coverage by variant inclusion, since high frequency may not be the sole determinant of the immunogenic potential of a predicted MHC class I binder. We applied the block entropy analysis method to the proteomes of the four serotypes of dengue virus and found 1,551 blocks of 9-mer peptides that covered all available sequences with five or fewer unique peptides. In contrast, the benchmark study by Khan et al. (2008) determined only 165 9-mers to be conserved. Many of the blocks are located consecutively in the proteins, so connecting these blocks resulted in 78 conserved regions which can be covered with 457 subunit peptides. Of the 1,551 blocks of 9-mer peptides, 110 blocks consisted of peptides all predicted to bind to MHC with similar affinity and the same HLA restriction. In total, we identified a pool of 333 peptides as T-cell epitope candidates. This set could form the basis for a broadly neutralizing dengue virus vaccine. The peptide block entropy analysis approach significantly increases the number of conserved peptide regions in comparison to traditional conservation analysis of individual residues. We determined 457 subunit peptides with the capacity to encompass the diversity of all sequenced DENV strains.
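The core computation, the information content of aligned k-mer blocks, can be sketched as follows. This is an illustrative reimplementation assuming a gap-free alignment of equal-length sequences; it is not the authors' code.

```python
import math
from collections import Counter

def block_entropy(alignment, k=9):
    """Shannon entropy (bits) of each k-mer block in a multiple
    sequence alignment (list of equal-length strings). Low-entropy
    blocks can be covered with few peptide variants. Returns
    (start, entropy, number_of_unique_peptides) per block."""
    length = len(alignment[0])
    entropies = []
    for start in range(length - k + 1):
        peptides = Counter(seq[start:start + k] for seq in alignment)
        total = sum(peptides.values())
        h = -sum((c / total) * math.log2(c / total)
                 for c in peptides.values())
        entropies.append((start, h, len(peptides)))
    return entropies
```

A fully conserved block yields entropy 0 with a single unique peptide; a block with variants yields positive entropy and a variant count that bounds the peptides needed to cover it.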
Minimum Entropy-Based Cascade Control for Governing Hydroelectric Turbines
Directory of Open Access Journals (Sweden)
Mifeng Ren
2014-06-01
Full Text Available In this paper, an improved cascade control strategy is presented for hydroturbine speed governors. Different from traditional proportional-integral-derivative (PID) control and model predictive control (MPC) strategies, the performance index of the outer controller is constructed by integrating the entropy and mean value of the tracking error with constraints on control energy. The inner controller is implemented as a proportional controller. Compared with the conventional PID-P and MPC-P cascade control methods, the proposed cascade control strategy can effectively decrease fluctuations of hydro-turbine speed under non-Gaussian disturbance conditions in practical hydropower plants. Simulation results show the advantages of the proposed cascade control method.
Branch length similarity entropy-based descriptors for shape representation
Kwon, Ohsung; Lee, Sang-Hee
2017-11-01
In previous studies, we showed that the branch length similarity (BLS) entropy profile could be successfully used for shape recognition of, for example, battle tanks, facial expressions, and butterflies. In the present study, we propose new descriptors, roundness, symmetry, and surface roughness, which are more accurate and faster to compute than the previous descriptors. Roundness represents how closely a shape resembles a circle, symmetry characterizes how similar a shape is to its flipped counterpart, and surface roughness quantifies the degree of vertical deviation of a shape's boundary. To evaluate the performance of the descriptors, we used a database of leaf images from 12 species. Each species consisted of 10-20 leaf images, and the total number of images was 160. The evaluation showed that the new descriptors successfully discriminated the leaf species. We believe the descriptors can be a useful tool in the field of pattern recognition.
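The BLS entropy underlying these descriptors can be sketched for a single boundary point as follows. This illustration follows the general BLS definition from the authors' earlier work; the roundness, symmetry, and roughness descriptors are built on profiles of this quantity and are not reproduced here.

```python
import math

def bls_entropy(points, index):
    """Branch length similarity (BLS) entropy at one boundary point:
    the distances ("branch lengths") from that point to every other
    boundary point are normalized into probabilities and fed into a
    Shannon entropy, normalized to [0, 1] by log(N). Equal branch
    lengths (a perfectly central point) give entropy 1."""
    x0, y0 = points[index]
    lengths = [math.hypot(x - x0, y - y0)
               for i, (x, y) in enumerate(points) if i != index]
    total = sum(lengths)
    probs = [l / total for l in lengths]
    h = -sum(p * math.log(p) for p in probs if p > 0)
    return h / math.log(len(probs))
```

Sweeping `index` along the boundary produces the BLS entropy profile that the descriptors summarize.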
Entropy based statistical inference for methane emissions released from wetland
Czech Academy of Sciences Publication Activity Database
Sabolová, R.; Sečkárová, Vladimíra; Dušek, Jiří; Stehlík, M.
2015-01-01
Roč. 141, č. 1 (2015), s. 125-133 ISSN 0169-7439 R&D Projects: GA ČR GA13-13502S; GA ČR(CZ) GAP504/11/1151; GA MŠk(CZ) ED1.1.00/02.0073 Grant - others:GA ČR(CZ) GA201/12/0083; GA UK(CZ) SVV 2014-260105 Institutional support: RVO:67985556 ; RVO:67179843 Keywords : chaos * entropy * Kullback-Leibler divergence * Pareto distribution * saddlepoint approximation * wetland ecosystem Subject RIV: BB - Applied Statistics, Operational Research; EH - Ecology, Behaviour (UEK-B) Impact factor: 2.217, year: 2015 http://library.utia.cas.cz/separaty/2014/AS/seckarova-0438651.pdf
A Dynamic and Adaptive Selection Radar Tracking Method Based on Information Entropy
Directory of Open Access Journals (Sweden)
Ge Jianjun
2017-12-01
Full Text Available Nowadays, the battlefield environment has become much more complex and variable. This paper presents a quantitative method, and a lower bound, for the amount of target information acquired from multiple radar observations, in order to adaptively and dynamically organize the detection of battlefield resources based on the principle of information entropy. Furthermore, to minimize the lower bound of the information entropy of the target measurement at every moment, a method is proposed to dynamically and adaptively select radars carrying a high amount of information for target tracking. The simulation results indicate that the proposed method has higher tracking accuracy than tracking without entropy-based adaptive radar selection.
International Nuclear Information System (INIS)
He, Z J; Zhang, X L; Chen, X F
2012-01-01
Aiming at reliability evaluation of condition identification of mechanical equipment, it is necessary to analyze condition monitoring information. A new method of reliability evaluation is proposed, based on wavelet information entropy extracted from vibration signals of mechanical equipment. The method is quite different from traditional reliability evaluation models, which depend on probability-statistics analysis of large samples of data. The vibration signals of the mechanical equipment were analyzed by means of the second generation wavelet package (SGWP). We take the relative energy in each frequency band of the decomposed signal, i.e., its percentage of the whole signal energy, as a probability. Normalized information entropy (IE) is obtained from the relative energies to describe the uncertainty of a system instead of a probability. The reliability degree is transformed from the normalized wavelet information entropy. A successful application evaluated the assembled-quality reliability for a kind of dismountable disk-drum aero-engine. The reliability degree indicates the assembled quality satisfactorily.
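The relative-energy-to-entropy step can be sketched as below, assuming the wavelet packet decomposition has already produced per-band energies (the SGWP step itself is omitted; the function name is illustrative).

```python
import math

def band_energy_entropy(band_energies):
    """Normalized information entropy from sub-band energies: the
    relative energy of each frequency band acts as a probability.
    Returns a value in [0, 1]; energy spread evenly across bands
    gives 1, energy concentrated in one band gives values near 0."""
    total = sum(band_energies)
    p = [e / total for e in band_energies]
    h = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return h / math.log(len(p))
```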
Entropy Based Analysis of DNS Query Traffic in the Campus Network
Directory of Open Access Journals (Sweden)
Dennis Arturo Ludeña Romaña
2008-10-01
Full Text Available We carried out an entropy-based study of the DNS query traffic from a university campus network from January 1st, 2006 through March 31st, 2007. The results are summarized as follows: (1) the source IP address- and query keyword-based entropies change symmetrically in the DNS query traffic from outside the campus network when spam bot activity on the campus network is detected; on the other hand, (2) the source IP address- and query keyword-based entropies change similarly to each other when detecting large DNS query traffic caused by prescanning or a distributed denial of service (DDoS) attack from the campus network. Therefore, we can detect spam bots and/or DDoS attack bots by watching DNS query traffic alone.
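The per-attribute entropy the study tracks is plain Shannon entropy over a window of observed values, e.g. source IPs or query keywords; a minimal sketch:

```python
import math
from collections import Counter

def traffic_entropy(observations):
    """Shannon entropy (bits) of a stream of DNS-query attributes
    (source IPs, query keywords, ...). Sudden joint shifts of the
    IP-based and keyword-based entropies are the kind of signature
    the study associates with spam bot or DDoS activity."""
    counts = Counter(observations)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())
```

Many distinct clients querying one keyword drives the IP entropy up and the keyword entropy down; a few bots hammering many names does the reverse.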
Entropy-Based Registration of Point Clouds Using Terrestrial Laser Scanning and Smartphone GPS
Directory of Open Access Journals (Sweden)
Maolin Chen
2017-01-01
Full Text Available Automatic registration of terrestrial laser scanning point clouds is a crucial but unresolved topic of great interest in many domains. This study combines a terrestrial laser scanner with a smartphone for the coarse registration of leveled point clouds with small roll and pitch angles and height differences, which is a novel sensor combination mode for terrestrial laser scanning. The approximate distance between two neighboring scan positions is first calculated from smartphone GPS coordinates. Then, 2D distribution entropy is used to measure the distribution coherence between the two scans and to search for the optimal initial transformation parameters. To this end, we propose a method called Iterative Minimum Entropy (IME) to correct the initial transformation parameters based on two criteria: the difference between the average and minimum entropy, and the deviation of the minimum entropy from the expected entropy. Finally, the presented method is evaluated on two data sets containing tens of millions of points from panoramic and non-panoramic, vegetation-dominated and building-dominated cases, achieving high accuracy and efficiency.
Monte Carlo comparison of four normality tests using different entropy estimates
Czech Academy of Sciences Publication Activity Database
Esteban, M. D.; Castellanos, M. E.; Morales, D.; Vajda, Igor
2001-01-01
Roč. 30, č. 4 (2001), s. 761-785 ISSN 0361-0918 R&D Projects: GA ČR GA102/99/1137 Institutional research plan: CEZ:AV0Z1075907 Keywords : test of normality * entropy test and entropy estimator * table of critical values Subject RIV: BD - Theory of Information Impact factor: 0.153, year: 2001
Entropy-Based Modeling of Velocity Lag in Sediment-Laden Open Channel Turbulent Flow
Directory of Open Access Journals (Sweden)
Manotosh Kumbhakar
2016-08-01
Full Text Available In the last few decades, a wide variety of instruments with laser-based techniques have been developed that enable experimentally measuring particle velocity and fluid velocity separately in particle-laden flow. Experiments have revealed that stream-wise particle velocity differs from fluid velocity, and this velocity difference is commonly known as "velocity lag" in the literature. A number of experimental as well as theoretical investigations have been carried out to formulate deterministic mathematical models of velocity lag based on several turbulent features. However, a probabilistic study of velocity lag does not seem to have been reported, to the best of our knowledge. The present study therefore focuses on the modeling of velocity lag in open channel turbulent flow laden with sediment, using entropy theory along with a hypothesis on the cumulative distribution function. This function contains a parameter η, which is shown to be a function of specific gravity, particle diameter and shear velocity. The velocity lag model is tested using twenty-two experimental runs collected from the literature, covering a wide range of conditions, and is also compared with other models of velocity lag. An error analysis is then performed to further evaluate the prediction accuracy of the proposed model, especially in comparison to other models. The model is also able to explain the physical characteristics of velocity lag caused by the interaction between the particles and the fluid.
Fuzzy Shannon Entropy: A Hybrid GIS-Based Landslide Susceptibility Mapping Method
Directory of Open Access Journals (Sweden)
Majid Shadman Roodposhti
2016-09-01
Full Text Available Assessing Landslide Susceptibility Mapping (LSM) contributes to reducing the risk of living with landslides. Handling the vagueness associated with LSM is a challenging task. Here we show the application of a hybrid GIS-based LSM. The hybrid approach embraces fuzzy membership functions (FMFs) in combination with Shannon entropy, a well-known information theory-based method. Nine landslide-related criteria, along with an inventory of landslides containing 108 recent and historic landslide points, are used to prepare a susceptibility map. A random split into training (≈70%) and testing (≈30%) samples is used for training and validation of the LSM model. The study area, Izeh, is located in the Khuzestan province of Iran, a highly susceptible landslide zone. The performance of the hybrid method is evaluated using receiver operating characteristic (ROC) curves in combination with the area under the curve (AUC). The proposed hybrid method, with an AUC of 0.934, is superior to multi-criteria evaluation approaches using a subjective weighting scheme: a previous study on the same dataset using extended fuzzy multi-criteria evaluation, built on decision makers' evaluations in the same study area, achieved an AUC of 0.894.
Risk Contagion in Chinese Banking Industry: A Transfer Entropy-Based Analysis
Directory of Open Access Journals (Sweden)
Jianping Li
2013-12-01
Full Text Available What is the impact of a bank failure on the whole banking industry? To resolve this issue, the paper develops a transfer entropy-based method to determine the interbank exposure matrix between banks. This method constructs the interbank market structure by calculating the transfer entropy matrix from bank stock price sequences. The paper also evaluates the stability of the Chinese banking system by simulating the risk contagion process. This paper contributes to the literature on interbank contagion mainly in two ways. First, it establishes a convincing connection between the interbank market and transfer entropy, and exploits market information (stock prices) rather than presumptions to determine the interbank exposure matrix. Second, the empirical analysis provides an in-depth understanding of the stability of the current Chinese banking system.
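A plug-in estimator of transfer entropy between two already-discretized series can be sketched as follows. In the paper the inputs would be binned bank stock-return sequences; this sketch shows only the lag-1 estimator itself, with invented toy data in the test.

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy TE(X -> Y), lag 1, for two
    symbolic sequences: the reduction in uncertainty about y[t+1] from
    knowing x[t] in addition to y[t]. Always >= 0 for the empirical
    distribution; larger values mean X's past helps predict Y."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    singles_y = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_yx = c / pairs_yx[(y0, x0)]
        p_y1_given_y = pairs_yy[(y1, y0)] / singles_y[y0]
        te += p_joint * math.log2(p_y1_given_yx / p_y1_given_y)
    return te
```

When y simply copies x with a one-step delay, TE(X -> Y) is large; when x carries no extra information about y's next value, it is zero.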
Tuck, Adrian F
2017-09-07
There is no widely agreed definition of entropy, and consequently Gibbs energy, in open systems far from equilibrium. One recent approach has sought to formulate an entropy and Gibbs energy based on observed scale invariances in geophysical variables, particularly in atmospheric quantities, including the molecules constituting stratospheric chemistry. The Hamiltonian flux dynamics of energy in macroscopic open nonequilibrium systems maps to energy in equilibrium statistical thermodynamics, and the corresponding equivalences of scale invariant variables with other relevant statistical mechanical variables, such as entropy, Gibbs energy, and 1/(k_B T), are not just formally analogous but are also mappings. Three proof-of-concept representative examples from available adequate stratospheric chemistry observations (temperature, wind speed and ozone) are calculated, with the aim of applying these mappings and equivalences. Potential applications of the approach to scale invariant observations from the literature, involving scales from molecular through laboratory to astronomical, are considered. Theoretical support for the approach from the literature is discussed.
DEFF Research Database (Denmark)
Olsen, Lars Rønn; Zhang, Guang Lan; Keskin, Derin B.
2011-01-01
Broad coverage of the pathogen population is particularly important when designing CD8+ T-cell epitope vaccines against viral pathogens. Traditional approaches are based on combinations of highly conserved T-cell epitopes. Peptide block entropy analysis is a novel approach for assembling sets of ...
Activity-Based Approach for Teaching Aqueous Solubility, Energy, and Entropy
Eisen, Laura; Marano, Nadia; Glazier, Samantha
2014-01-01
We describe an activity-based approach for teaching aqueous solubility to introductory chemistry students that provides a more balanced presentation of the roles of energy and entropy in dissolution than is found in most general chemistry textbooks. In the first few activities, students observe that polar substances dissolve in water, whereas…
A Text Steganographic System Based on Word Length Entropy Rate
Directory of Open Access Journals (Sweden)
Francis Xavier Kofi Akotoye
2017-10-01
Full Text Available The widespread adoption of electronic distribution of material is accompanied by illicit copying and distribution. This is why individuals, businesses and governments have come to think about how to protect their work, prevent such illicit activities and trace the distribution of a document. It is in this context that much attention is being focused on steganography. Implementing steganography in a text document is not an easy undertaking, considering that a text document has very few places in which to embed hidden data; any minute change introduced to text objects can easily be noticed, attracting attention from possible attackers. This study investigates the possibility of embedding data in a text document by employing the entropy rate of the constituent characters of words at least four characters long. The scheme embeds bits in text according to the alphabetic structure of the words: each character is compared with its neighbouring character and, if the first character is alphabetically lower than the succeeding character according to their ASCII codes, a zero bit is embedded; otherwise, a 1 is embedded after the characters have been transposed. Before embedding, the secret message is encrypted with a secret key to add a layer of security, and a pseudorandom number is generated from the word counts of the text to mark the starting point of the embedding process. The embedding capacity of the scheme is relatively high compared with space encoding and semantic methods.
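The embedding rule described above (compare two characters' ASCII order, transposing them to force the order that encodes the desired bit) can be sketched as follows. This is a simplified single-pair illustration that assumes the two characters differ; the paper's full scheme also encrypts the payload and restricts embedding to words of at least four characters.

```python
def embed_bit(word, bit):
    """Encode one bit in the alphabetical (ASCII) order of a word's
    first two characters, transposing them when needed: ascending
    order encodes 0, descending order encodes 1. Assumes the two
    characters differ (equal characters cannot carry a bit)."""
    a, b = word[0], word[1]
    ascending = ord(a) < ord(b)
    want_ascending = (bit == 0)
    if ascending != want_ascending:
        word = b + a + word[2:]
    return word

def extract_bit(word):
    """Recover the bit: ascending order of the first two chars = 0."""
    return 0 if ord(word[0]) < ord(word[1]) else 1
```

For example, embedding 0 in "text" transposes the first pair to "etxt", while embedding 1 leaves "text" unchanged.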
A Real-Time Analysis Method for Pulse Rate Variability Based on Improved Basic Scale Entropy
Directory of Open Access Journals (Sweden)
Yongxin Chou
2017-01-01
Full Text Available Base scale entropy analysis (BSEA) is a nonlinear method to analyze heart rate variability (HRV) signals. However, the time consumption of BSEA is too long, and it is unknown whether BSEA is suitable for analyzing pulse rate variability (PRV) signals. Therefore, we propose a method named sliding window iterative base scale entropy analysis (SWIBSEA), combining BSEA and sliding window iterative theory. The blood pressure signals of healthy young and old subjects are chosen from the authoritative international database MIT/PhysioNet/Fantasia to generate PRV signals as the experimental data. Then, BSEA and SWIBSEA are used to analyze the experimental data; the results show that SWIBSEA reduces the time consumption and the buffer cache space while producing the same entropy as BSEA. Meanwhile, the changes of base scale entropy (BSE) for healthy young and old subjects are the same as those of the HRV signal. Therefore, SWIBSEA can be used to derive information from long-term and short-term PRV signals in real time, which has potential for dynamic PRV signal analysis in some portable and wearable medical devices.
On the Entropy Based Associative Memory Model with Higher-Order Correlations
Directory of Open Access Journals (Sweden)
Masahiro Nakagawa
2010-01-01
Full Text Available In this paper, an entropy-based associative memory model is proposed and applied to memory retrieval with an orthogonal learning model, for comparison with the conventional model based on a quadratic Lyapunov functional minimized during the retrieval process. In the present approach, the updating dynamics are constructed on the basis of an entropy minimization strategy, which reduces asymptotically to the above-mentioned conventional dynamics as a special case when higher-order correlations are ignored. By introducing the entropy functional, one may involve higher-order correlation effects between neurons in a self-contained manner, without the heuristic coupling coefficients of the conventional approach. In fact, we show that such higher-order coupling tensors are uniquely determined in the framework of the entropy-based approach. Numerical results show that the proposed approach realizes a much larger memory capacity than the quadratic Lyapunov functional approach, e.g., the associatron.
A Weighted Belief Entropy-Based Uncertainty Measure for Multi-Sensor Data Fusion.
Tang, Yongchuan; Zhou, Deyun; Xu, Shuai; He, Zichang
2017-04-22
In real applications, how to measure the uncertainty of sensor reports before applying sensor data fusion is a big challenge. In this paper, within the frame of Dempster-Shafer evidence theory, a weighted belief entropy based on Deng entropy is proposed to quantify the uncertainty of uncertain information. The weight of the proposed belief entropy is based on the relative scale of a proposition with regard to the frame of discernment (FOD). Compared with some other uncertainty measures in the Dempster-Shafer framework, the new measure focuses on the uncertain information represented not only by the mass function but also by the scale of the FOD, which means less information loss in information processing. After that, a new multi-sensor data fusion approach based on the weighted belief entropy is proposed. The rationality and superiority of the new multi-sensor data fusion method are verified by an experiment on artificial data and an application to fault diagnosis of a motor rotor.
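Deng entropy, the base measure that the weighted belief entropy modifies, can be sketched as follows. The paper's additional weighting by each proposition's scale relative to the FOD is not reproduced here; only the underlying Deng entropy is shown.

```python
import math

def deng_entropy(bba):
    """Deng entropy of a basic belief assignment. `bba` maps focal
    elements (frozensets) to mass values. For a Bayesian BBA (all
    focal elements singletons) it reduces to Shannon entropy; mass
    on larger focal elements contributes extra uncertainty via the
    2^|A| - 1 term."""
    return -sum(m * math.log2(m / (2 ** len(A) - 1))
                for A, m in bba.items() if m > 0)
```

Assigning all mass to the two-element set {a, b} yields log2(3) bits, more than the 1 bit of an even split between the singletons, reflecting the added ambiguity of non-specific evidence.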
A Real-Time Analysis Method for Pulse Rate Variability Based on Improved Basic Scale Entropy
Zhang, Ruilei; Feng, Yufeng; Lu, Zhenli
2017-01-01
Base scale entropy analysis (BSEA) is a nonlinear method to analyze heart rate variability (HRV) signal. However, the time consumption of BSEA is too long, and it is unknown whether the BSEA is suitable for analyzing pulse rate variability (PRV) signal. Therefore, we proposed a method named sliding window iterative base scale entropy analysis (SWIBSEA) by combining BSEA and sliding window iterative theory. The blood pressure signals of healthy young and old subjects are chosen from the authoritative international database MIT/PhysioNet/Fantasia to generate PRV signals as the experimental data. Then, the BSEA and the SWIBSEA are used to analyze the experimental data; the results show that the SWIBSEA reduces the time consumption and the buffer cache space while it gets the same entropy as BSEA. Meanwhile, the changes of base scale entropy (BSE) for healthy young and old subjects are the same as that of HRV signal. Therefore, the SWIBSEA can be used for deriving some information from long-term and short-term PRV signals in real time, which has the potential for dynamic PRV signal analysis in some portable and wearable medical devices. PMID:29065639
Entropy of the Mixture of Sources and Entropy Dimension
Smieja, Marek; Tabor, Jacek
2011-01-01
We investigate the problem of the entropy of a mixture of sources. An estimate is given of the entropy and entropy dimension of a convex combination of measures. The proof is based on our alternative definition of entropy, which is based on measures instead of partitions.
Fault Detection and Diagnosis for Gas Turbines Based on a Kernelized Information Entropy Model
Directory of Open Access Journals (Sweden)
Weiying Wang
2014-01-01
Full Text Available Gas turbines are among the most important devices in power engineering and have been widely used in power generation, airplanes, naval ships, and oil drilling platforms. However, in most cases they are monitored with no operator on duty. It is highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms, based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflects the overall state of the gas path of the gas turbine. In addition, we extend the entropy to compute the information quantity of features in kernel spaces, which helps to select informative features for a given recognition task. Finally, we introduce an information entropy-based decision tree algorithm to extract rules from fault samples. Experiments on real-world data show the effectiveness of the proposed algorithms.
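The uniformity measure can be sketched as a normalized Shannon entropy over thermocouple readings. This is an illustration of the uniformity idea only, not the paper's kernelized formulation; the function name and readings are invented.

```python
import math

def uniformity_entropy(temps):
    """Shannon-entropy measure of exhaust-temperature uniformity:
    each thermocouple reading is normalized to its share of the
    total, and the normalized entropy is 1 when the gas path heats
    evenly, dropping as one burner or path runs hot."""
    total = sum(temps)
    p = [t / total for t in temps]
    h = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return h / math.log(len(p))
```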
Fault detection and diagnosis for gas turbines based on a kernelized information entropy model.
Wang, Weiying; Xu, Zhiqiang; Tang, Rui; Li, Shuying; Wu, Wei
2014-01-01
Gas turbines are among the most important devices in power engineering and have been widely used in power generation, airplanes, naval ships, and oil drilling platforms. However, in most cases they are monitored with no operator on duty. It is highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms, based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflects the overall state of the gas path of the gas turbine. In addition, we extend the entropy to compute the information quantity of features in kernel spaces, which helps to select informative features for a given recognition task. Finally, we introduce an information entropy-based decision tree algorithm to extract rules from fault samples. Experiments on real-world data show the effectiveness of the proposed algorithms.
Shi, Bin; Jiang, Jiping; Sivakumar, Bellie; Zheng, Yi; Wang, Peng
2018-05-01
A field monitoring strategy is critical for disaster preparedness and watershed emergency environmental management; however, developing one is highly challenging. Despite the efforts and progress thus far, no definitive guidelines or solutions are available worldwide for quantitatively designing a monitoring network in response to river chemical spill incidents, beyond general rules based on administrative divisions or arbitrary interpolation on routine monitoring sections. To address this gap, a novel framework for spatial-temporal network design is proposed in this study. The framework combines contaminant transport modelling with discrete entropy theory and spectral analysis. The water quality model was applied to forecast the spatio-temporal distribution of contaminant after a spill, and the corresponding information transfer indexes (ITIs) and Fourier-approximation periodic functions were then estimated as the critical measures for setting sampling locations and times. The results indicate that the framework can produce scientific preparedness plans for emergency monitoring based on scenario analysis of spill risks, as well as rapid designs once an incident has occurred without prior preparation. The framework was applied to a hypothetical spill case based on a tracer experiment and to a real nitrobenzene spill incident to demonstrate its suitability and effectiveness. The newly designed temporal-spatial monitoring network captured the major pollution information at relatively low cost, with obvious benefits for follow-up early warning and treatment as well as for aftermath recovery and assessment. The underlying drivers of the ITIs, and the limitations and uncertainty of the approach, were analyzed based on the case studies. Comparisons with existing monitoring network design approaches, management implications, and generalized applicability are also discussed.
Zhao, Hui; Qu, Weilu; Qiu, Weiting
2018-03-01
In order to evaluate the sustainable development level of resource-based cities, an evaluation method using Shapley entropy and the Choquet integral is proposed. First, a systematic index system is constructed and the importance of each attribute is calculated based on the maximum Shapley entropy principle; the Choquet integral is then introduced to calculate the comprehensive evaluation value of each city from the bottom up. Finally, the method is applied to 10 typical resource-based cities in China. The empirical results show that the evaluation method is scientific and reasonable, providing theoretical support for the sustainable development path and reform direction of resource-based cities.
COLLAGE-BASED INVERSE PROBLEMS FOR IFSM WITH ENTROPY MAXIMIZATION AND SPARSITY CONSTRAINTS
Directory of Open Access Journals (Sweden)
Herb Kunze
2013-11-01
Full Text Available We consider the inverse problem associated with IFSM: given a target function f, find an IFSM such that its invariant fixed point is sufficiently close to f in the Lp distance. In this paper, we extend the collage-based method developed by Forte and Vrscay (1995) along two different directions. We first search for a set of mappings that not only minimizes the collage error but also maximizes the entropy of the dynamical system. We then include an extra term in the minimization process which takes into account the sparsity of the set of mappings. In this new formulation, the minimization of collage error is treated as a multi-criteria problem: we consider three different and conflicting criteria, i.e., collage error, entropy and sparsity. To solve this multi-criteria program we proceed by scalarization, reducing the model to a single-criterion program by combining all objective functions with different trade-off weights. The results of some numerical computations are presented. Numerical studies indicate that a maximum entropy principle exists for this approximation problem, i.e., the suboptimal solutions produced by collage coding can be improved at least slightly by adding a maximum entropy criterion.
Energy Technology Data Exchange (ETDEWEB)
Zunino, Luciano, E-mail: lucianoz@ciop.unlp.edu.ar [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: olivaresfe@gmail.com [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Scholkmann, Felix, E-mail: Felix.Scholkmann@gmail.com [Research Office for Complex Physical and Biological Systems (ROCoS), Mutschellenstr. 179, 8038 Zurich (Switzerland); Biomedical Optics Research Laboratory, Department of Neonatology, University Hospital Zurich, University of Zurich, 8091 Zurich (Switzerland); Rosso, Osvaldo A., E-mail: oarosso@gmail.com [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)
2017-06-15
A symbolic encoding scheme, based on the ordinal relation between the amplitudes of neighboring values of a given data sequence, must be implemented before estimating the permutation entropy. Consequently, equalities in the analyzed signal, i.e. repeated equal values, deserve special attention and treatment. In this work, we carefully study the effect that the presence of equalities has on permutation entropy estimates when these ties are symbolized, as is commonly done, according to their order of appearance. On the one hand, computer-generated time series are analyzed to understand the incidence of repeated values on permutation entropy estimates in controlled scenarios. The presence of temporal correlations is erroneously concluded when truly pseudorandom time series with low amplitude resolutions are considered. On the other hand, the analysis of real-world data is included to illustrate how the presence of a significant number of equal values can give rise to false conclusions regarding the underlying temporal structures in practical contexts. - Highlights: • The impact of repeated values on permutation entropy estimates is studied. • Numerical and experimental tests are included to characterize this limitation. • Non-negligible temporal correlations can be spuriously concluded from repeated values. • Data digitized with low amplitude resolutions can be especially affected. • Analysis with shuffled realizations can help to overcome this limitation.
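As a minimal sketch (not the authors' code; the function and parameter names are ours), the following computes normalized permutation entropy while breaking ties by order of appearance, the very convention whose side effects this abstract studies:

```python
import math

def permutation_entropy(series, order=3):
    """Normalized permutation entropy; ties are symbolized by their order of
    appearance (via a stable sort), the convention examined in this work."""
    n = len(series) - order + 1
    counts = {}
    for i in range(n):
        window = series[i:i + order]
        # Python's sort is stable, so equal values keep their temporal order
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(order))
```

A heavily quantized pseudorandom series maps many distinct values onto ties; under this convention the ties concentrate probability on a few ordinal patterns and lower the estimate, which can mimic temporal correlation.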
Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy
Yujun, Yang; Jianping, Li; Yimei, Yang
This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) method to analyze the short-range and long-range characteristics of financial time series, and then applies this method to five time series of five properties in four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of the Rényi entropy and the generalized Hurst exponent for the five properties of the four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing the curves of the Rényi entropy and the profiles of the logarithmic distribution of MFDFA for the five properties of the four stock indices, the 3MPAR method reveals fluctuation characteristics of the financial time series and the stock markets. It also extracts richer information from the financial time series by comparing the profiles of the five properties of the four stock indices. In this paper, we focus not only on the multifractality of the time series but also on the fluctuation characteristics of the financial time series and the subtle differences between the time series of different properties. We find that financial time series are far more complex than reported in research works that use a single property of the time series.
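For reference, the Rényi entropy underlying the 3MPAR curves can be sketched as follows (a generic implementation for a discrete distribution, not the paper's full multiscale pipeline; names are ours):

```python
import math

def renyi_entropy(p, q):
    """Rényi entropy of order q for a discrete distribution p; q = 1 is the
    Shannon limit."""
    support = [pi for pi in p if pi > 0]
    if q == 1:
        return -sum(pi * math.log(pi) for pi in support)
    return math.log(sum(pi ** q for pi in support)) / (1.0 - q)
```

Sweeping q traces the entropy curve the method inspects: for a uniform distribution the curve is flat at log N, while for a non-uniform distribution it decreases with q, and the spread of the curve reflects the heterogeneity of the fluctuations.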
Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis
Chen, Lu; Singh, Vijay P.
2018-02-01
Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes, but no single distribution has been accepted as a global standard. Employing entropy theory, this study derived five generalized distributions for frequency analysis, using different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. The entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used in their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, indicating that the five generalized distributions fit the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. These generalized distributions are therefore among the best choices for frequency analysis, and the entropy-based derivation opens a new way for frequency analysis of hydrometeorological extremes.
A quantile-based study of cumulative residual Tsallis entropy measures
Sunoj, S. M.; Krishnan, Aswathy S.; Sankaran, P. G.
2018-03-01
In the present paper we introduce a quantile-based cumulative residual Tsallis entropy (CRTE) and a quantile-based CRTE for order statistics. Unlike the cumulative residual Tsallis entropy measures in the distribution function approach due to Sati and Gupta (2015) and Rajesh and Sunoj (2016), the corresponding quantile versions possess some unique properties. In many applied problems a tractable distribution function is not available while the quantile function exists, and in such cases the proposed measures become more useful in measuring the uncertainty of random variables. We obtain some characterizations of distributions based on the quantile versions of CRTE and derive certain bounds. We also study various properties of the quantile-based CRTE for order statistics.
Ai, Yan-Ting; Guan, Jiao-Yue; Fei, Cheng-Wei; Tian, Jing; Zhang, Feng-Ling
2017-05-01
To monitor the operating status of rolling bearings with casings in real time, efficiently and accurately, a fusion method based on n-dimensional characteristic parameter distance (n-DCPD) was proposed for rolling bearing fault diagnosis using two types of signals: vibration signals and acoustic emission signals. The n-DCPD was investigated based on four information entropies (singular spectrum entropy in the time domain, power spectrum entropy in the frequency domain, and wavelet space characteristic spectrum entropy and wavelet energy spectrum entropy in the time-frequency domain), and the basic idea of the fusion information entropy fault diagnosis method with n-DCPD was given. Using a rotor simulation test rig, the vibration and acoustic emission signals of six rolling bearing conditions (ball fault, inner race fault, outer race fault, inner-ball faults, inner-outer faults and normal) were collected under different operating conditions, with rotation speeds from 800 rpm to 2000 rpm. With the proposed fusion information entropy method with n-DCPD, the diagnosis of rolling bearing faults was completed. The fault diagnosis results show that the fusion entropy method achieves high precision in the recognition of rolling bearing faults. The efforts of this study provide a novel and useful methodology for the fault diagnosis of aeroengine rolling bearings.
EEG-Based Computer Aided Diagnosis of Autism Spectrum Disorder Using Wavelet, Entropy, and ANN.
Djemal, Ridha; AlSharabi, Khalil; Ibrahim, Sutrisno; Alsuwailem, Abdullah
2017-01-01
Autism spectrum disorder (ASD) is a type of neurodevelopmental disorder with core impairments in social relationships, communication, imagination, or flexibility of thought, and a restricted repertoire of activity and interest. In this work, a new computer aided diagnosis (CAD) method for autism based on electroencephalography (EEG) signal analysis is investigated. The proposed method is based on the discrete wavelet transform (DWT), entropy (En), and an artificial neural network (ANN). The DWT is used to decompose EEG signals into approximation and detail coefficients to obtain the EEG subbands. The feature vector is constructed by computing Shannon entropy values from each EEG subband. The ANN classifies the corresponding EEG signal as normal or autistic based on the extracted features. The experimental results show the effectiveness of the proposed method for assisting autism diagnosis. A receiver operating characteristic (ROC) curve metric is used to quantify the performance of the proposed method. The proposed method obtained promising results when tested on a real dataset provided by King Abdulaziz Hospital, Jeddah, Saudi Arabia.
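The DWT-plus-entropy feature pipeline can be sketched as below. This illustrative version uses a hand-rolled one-level Haar transform in place of whichever mother wavelet the authors chose, and all function names are our own assumptions:

```python
import math

def haar_dwt(x):
    """One level of the Haar DWT: returns (approximation, detail) coefficients."""
    a = [(x[i] + x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    d = [(x[i] - x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    return a, d

def shannon_entropy_of_band(coeffs):
    """Shannon entropy of a subband's normalized energy distribution."""
    energy = sum(c * c for c in coeffs)
    if energy == 0.0:
        return 0.0
    p = [c * c / energy for c in coeffs if c != 0.0]
    return -sum(pi * math.log(pi) for pi in p)

def feature_vector(signal, levels=3):
    """One entropy feature per subband: each level's details, then the final
    approximation -- the vector fed to the classifier."""
    feats, approx = [], list(signal)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        feats.append(shannon_entropy_of_band(detail))
    feats.append(shannon_entropy_of_band(approx))
    return feats
```

The Haar step preserves signal energy (Parseval), so the per-subband energy distributions, and hence the entropy features, partition the signal's total energy.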
Cooperative Localization for Multi-AUVs Based on GM-PHD Filters and Information Entropy Theory.
Zhang, Lichuan; Wang, Tonghao; Zhang, Feihu; Xu, Demin
2017-10-08
Cooperative localization (CL) is considered a promising method for underwater localization with multiple autonomous underwater vehicles (multi-AUVs). In this paper, we propose a CL algorithm based on information entropy theory and the probability hypothesis density (PHD) filter, aiming to enhance the global localization accuracy of the followers. In the proposed framework, the followers carry lower-cost navigation systems, whereas the leaders carry better ones. Meanwhile, the leaders acquire the followers' observations, including both measurements and clutter. The PHD filters are then run on the leaders and the results are communicated to the followers. Each follower then performs a weighted summation over all received messages and obtains its final positioning result. Based on information entropy theory and the PHD filter, the follower is thus able to acquire precise knowledge of its position.
Comparison of Two Entropy Spectral Analysis Methods for Streamflow Forecasting in Northwest China
Directory of Open Access Journals (Sweden)
Zhenghong Zhou
2017-11-01
Monthly streamflow has elements of stochasticity, seasonality, and periodicity. Spectral analysis and time series analysis can, respectively, be employed to characterize the periodical pattern and the stochastic pattern. Both Burg entropy spectral analysis (BESA) and configurational entropy spectral analysis (CESA) combine spectral analysis and time series analysis. This study compared the predictive performance of BESA and CESA for monthly streamflow forecasting in six basins in Northwest China. Four criteria were selected to evaluate the performance of these two entropy spectral analyses: relative error (RE), root mean square error (RMSE), coefficient of determination (R2), and Nash–Sutcliffe efficiency coefficient (NSE). It was found that in Northwest China both BESA and CESA forecast monthly streamflow well for series with strong correlation, with BESA being the more accurate; for streamflow series with weak correlation, the opposite holds.
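Two of the four evaluation criteria are standard and can be written down directly; as a small sketch (helper names are ours), RMSE and NSE are:

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and forecast series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect forecast; 0 means no better
    than always predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den
```

A forecast that simply outputs the observed mean scores NSE = 0, which is the usual baseline against which both spectral methods are judged.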
A Novel MADM Approach Based on Fuzzy Cross Entropy with Interval-Valued Intuitionistic Fuzzy Sets
Directory of Open Access Journals (Sweden)
Xin Tong
2015-01-01
The paper presents a novel multiple attribute decision-making (MADM) approach for problems with completely unknown attribute weights in the framework of interval-valued intuitionistic fuzzy sets (IVIFS). First, the fuzzy cross entropy and the discrimination degree of IVIFS are defined. Subsequently, based on the discrimination degree of IVIFS, a nonlinear programming model that minimizes the total deviation of the discrimination degrees between the alternatives and the positive ideal solution (PIS) as well as the negative ideal solution (NIS) is constructed to obtain the attribute weights and, with them, the weighted discrimination degree. Finally, all the alternatives are ranked according to their relative closeness coefficients using the extended TOPSIS method, and the most desirable alternative is chosen. The proposed approach extends the research on MADM based on the IVIF cross entropy. We illustrate the feasibility and validity of the proposed method with two examples.
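The final ranking step, relative closeness to the two ideal solutions, is the classical TOPSIS idea. The sketch below uses crisp numbers with benefit-type criteria as a stand-in for the paper's interval-valued intuitionistic fuzzy version (the matrix, weights, and names are our own illustration):

```python
import math

def topsis_rank(matrix, weights):
    """Score alternatives by relative closeness to the positive/negative ideal
    solutions (crisp TOPSIS, all criteria treated as benefit-type)."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each column, then apply the attribute weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    pis = [max(v[i][j] for i in range(m)) for j in range(n)]  # positive ideal (PIS)
    nis = [min(v[i][j] for i in range(m)) for j in range(n)]  # negative ideal (NIS)
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((v[i][j] - pis[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((v[i][j] - nis[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))  # relative closeness coefficient
    return scores
```

An alternative that dominates on every criterion coincides with the PIS and scores exactly 1; one dominated on every criterion coincides with the NIS and scores 0.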
Special Issue on Entropy-Based Applied Cryptography and Enhanced Security for Ubiquitous Computing
Directory of Open Access Journals (Sweden)
James (Jong Hyuk Park
2016-09-01
Entropy is a basic and important concept in information theory. It is also often used as a measure of the unpredictability of a cryptographic key in cryptography research. Ubiquitous computing (Ubi-comp) has emerged rapidly as an exciting new paradigm. For this special issue, we mainly selected and discussed papers on core theories, based on graph theory, for solving computational problems in cryptography and security, and on practical technologies, applications and services for Ubi-comp, including secure encryption techniques; identity and authentication; credential cloning attacks and countermeasures; a switching generator resistant to algebraic and side-channel attacks; entropy-based network anomaly detection; applied cryptography using chaos functions; information hiding and watermarking; secret sharing; message authentication; detection and modeling of cyber attacks with Petri nets; and quantum flows for secret key distribution.
Entropy conservative finite element schemes
Tadmor, E.
1986-01-01
The question of entropy stability for discrete approximations to hyperbolic systems of conservation laws is studied. The amount of numerical viscosity present in such schemes is quantified and related to their entropy stability by means of comparison. To this end, two main ingredients are used: entropy variables and the construction of certain entropy conservative schemes in terms of piecewise-linear finite element approximations. It is then shown that conservative schemes are entropy stable if, and (for three-point schemes) only if, they contain more numerical viscosity than the abovementioned entropy conservative ones.
Information Entropy- and Average-Based High-Resolution Digital Storage Oscilloscope
Directory of Open Access Journals (Sweden)
Jun Jiang
2014-01-01
Vertical resolution is an essential indicator of a digital storage oscilloscope (DSO), and the key to improving resolution is to increase the number of digitizing bits and to lower the noise. Averaging is a typical method to improve the signal-to-noise ratio (SNR) and the effective number of bits (ENOB). The existing averaging algorithm is restricted by the repetitiveness of the signal and is influenced by gross errors in quantization, and therefore its effect on suppressing noise and improving resolution is limited. An information entropy-based data fusion algorithm combined with average-based decimation filtering, which improves on the averaging algorithm using the relevant theory of information entropy, is proposed in this paper to improve the resolution of the oscilloscope. For a single acquired signal, resolution is improved by eliminating gross errors in quantization, utilizing the maximum entropy of the sample data, with further noise filtering via average-based decimation after data fusion of the efficient sample data under the premise of oversampling. No subjective assumptions or constraints are imposed on the signal under test in the whole process, and there is no impact on the analog bandwidth of the oscilloscope at the actual sampling rate.
Comparison of tomography reconstruction by maximum entropy and filtered back-projection
International Nuclear Information System (INIS)
Abdala, F.J.P.; Simpson, D.M.; Roberty, N.C.
1992-01-01
Tomographic reconstruction with few projections is studied, comparing the maximum entropy method with filtered back-projection. Simulations with and without the presence of noise, and also with the presence of a high-density object inside the skull, are shown. (C.G.C.)
BRISENT: An Entropy-Based Model for Bridge-Pier Scour Estimation under Complex Hydraulic Scenarios
Directory of Open Access Journals (Sweden)
Alonso Pizarro
2017-11-01
The goal of this paper is to introduce the first clear-water scour model based on both the informational entropy concept and the principle of maximum entropy, showing that a variational approach is ideal for describing erosional processes under complex situations. The proposed bridge-pier scour entropic (BRISENT) model is capable of reproducing the main dynamics of scour depth evolution under steady hydraulic conditions, step-wise hydrographs, and flood waves. For the calibration process, 266 clear-water scour experiments from 20 precedent studies were considered, in which the dimensionless parameters varied widely. Simple formulations are proposed to estimate BRISENT's fitting coefficients, in which the ratio between pier diameter and sediment size was the most important physical characteristic controlling scour model parametrization. A validation process considering highly unsteady and multi-peaked hydrographs was carried out, showing that the proposed BRISENT model reproduces scour evolution with high accuracy.
Droplet Size Distribution in Sprays Based on Maximization of Entropy Generation
Directory of Open Access Journals (Sweden)
Meishen Li
2003-12-01
Abstract: The maximum entropy principle (MEP), which has been popular in the modeling of droplet size and velocity distributions in sprays, is, strictly speaking, only applicable to isolated systems in thermodynamic equilibrium, whereas spray formation processes are irreversible and non-isolated, with interaction between the atomizing liquid and its surrounding gas medium. In this study, a new model for the droplet size distribution has been developed based on a thermodynamically consistent concept: the maximization of entropy generation during the liquid atomization process. The model prediction compares favorably with the experimentally measured size distribution of droplets, near the liquid bulk breakup region, produced by an air-blast annular nozzle and a practical gas turbine nozzle. Therefore, the present model can be used to predict the initial droplet size distribution in sprays.
The prediction of engineering cost for green buildings based on information entropy
Liang, Guoqiang; Huang, Jinglian
2018-03-01
Green building is the developing trend in the world building industry, and construction costs are an essential consideration in building construction. Therefore, it is necessary to investigate the problem of cost prediction for green buildings. On the basis of analyzing the costs of green buildings, this paper proposes a method of forecasting the actual cost of a green building based on information entropy and provides the forecasting procedure. Using the probability densities obtained from statistical data, such as labor costs, material costs, machinery costs, administration costs, profits and risk costs of a unit project quotation, situations that lead to variations between the budgeted cost and the actual cost of a construction project can be predicted by estimating the information entropy of the budgeted cost and the actual cost. The research results of this article have practical significance for cost control in green building. Additionally, the method proposed in this article can be generalized and applied to a variety of other aspects of building management.
Rolling Bearing Fault Diagnosis Based on ELCD Permutation Entropy and RVM
Directory of Open Access Journals (Sweden)
Jiang Xingmeng
2016-01-01
Aiming at the nonstationary characteristics of fault vibration signals, a recognition method based on the permutation entropy of ensemble local characteristic-scale decomposition (ELCD) and a relevance vector machine (RVM) is proposed. First, the vibration signal is decomposed by ELCD to obtain a series of intrinsic scale components (ISCs). Second, according to the kurtosis of the ISCs, principal ISCs are selected, their permutation entropies are calculated, and these are combined into a feature vector. Finally, the feature vectors are input into the RVM classifier for training and testing to identify the type of rolling bearing fault. Experimental results show that this method can effectively diagnose four kinds of working conditions, and its performance is better than that of the local characteristic-scale decomposition (LCD) method.
Evaluation of Intensive Construction Land Use in the Emerging City Based on PSR-Entropy model
Jia, Yuanyuan; Lei, Guangyu
2018-01-01
A comprehensive understanding of land utilization in an emerging city and an evaluation of its intensive land use provide a comprehensive and reliable technical basis for planning and management. Based on the land use of Hancheng from 2008 to 2016 and a PSR-Entropy-based land use evaluation system, the entropy method is used to determine the index weights, and a comprehensive index method is introduced to evaluate the degree of land use. The results show that the comprehensive evaluation index of intensive land use in Hancheng increased from 2008 to 2015, but land use has not yet reached the intensive-use standard, and there remains considerable room for improvement.
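The entropy weight method used to determine the index weights can be sketched as follows (a generic implementation on a made-up indicator matrix, not the study's actual PSR indicators):

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: rows are years/alternatives, columns are
    indicators. An indicator whose values barely vary across rows has entropy
    near 1, hence low divergence and low weight."""
    m, n = len(matrix), len(matrix[0])
    divergences = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        total = sum(col)
        p = [c / total for c in col]
        # normalized Shannon entropy of the indicator's value distribution
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        divergences.append(1.0 - e)
    s = sum(divergences)
    return [d / s for d in divergences]
```

Intuitively, an indicator that is constant across all years carries no discriminating information, so the method assigns it (near) zero weight.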
An Entropy-Based Adaptive Hybrid Particle Swarm Optimization for Disassembly Line Balancing Problems
Directory of Open Access Journals (Sweden)
Shanli Xiao
2017-11-01
In order to improve product disassembly efficiency, the disassembly line balancing problem (DLBP) is transformed into a problem of searching for the optimum path in a directed and weighted graph by constructing the disassembly hierarchy information graph (DHIG). Then, combining the characteristics of the disassembly sequence, an entropy-based adaptive hybrid particle swarm optimization algorithm (AHPSO) is presented. In this algorithm, entropy is introduced to measure the changing tendency of population diversity, and dimension learning, crossover and mutation operators are used to increase the probability of producing feasible disassembly solutions (FDS). The performance of the proposed methodology is tested on the primary problem instances available in the literature, and the results are compared with other evolutionary algorithms. The results show that the proposed algorithm is efficient in solving the complex DLBP.
Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS
Directory of Open Access Journals (Sweden)
Moshen Kuai
2018-03-01
Because planetary gears are small in volume and light in weight and provide large transmission ratios, they are widely used in high-speed, high-power mechanical systems. Poor working conditions result in frequent failures of planetary gears. In this paper, a method is proposed for diagnosing planetary gear faults based on the permutation entropy of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and an Adaptive Neuro-Fuzzy Inference System (ANFIS). The original signal is decomposed into six intrinsic mode functions (IMFs) and a residual component by CEEMDAN. Since the IMFs contain the main characteristic information of planetary gear faults, the time complexity of each IMF is quantified by its permutation entropy to characterize the fault features. The permutation entropies of the IMF components are taken as the input of the ANFIS, whose parameters and membership functions are adaptively adjusted according to the training samples. Finally, the fuzzy inference rules are determined and the optimal ANFIS is obtained. The overall recognition rate of the test samples used for the ANFIS is 90%, and the recognition rate for a gear with one missing tooth is relatively high. The recognition rates for different fault gears based on the method also achieve good results. Therefore, the proposed method can be applied effectively to planetary gear fault diagnosis.
Directory of Open Access Journals (Sweden)
Hujun He
2017-01-01
The prediction and risk classification of collapse is an important issue in the process of highway construction in mountainous regions. Based on the principles of information entropy and Mahalanobis distance discriminant analysis, we have produced a collapse hazard prediction model. We used the entropy measure method to reduce the number of indexes of collapse activity and extracted the nine main indexes affecting collapse activity as the discriminant factors of the distance discriminant analysis model (i.e., slope shape, aspect, gradient, and height, along with exposure of the structural face, stratum lithology, relationship between the weakness face and the free face, vegetation cover rate, and degree of rock weathering). We employ post-earthquake collapse data from the construction of the Yingxiu-Wolong highway, Hanchuan County, China, as training samples for the analysis. The results were analyzed using the back-substitution estimation method, showing high accuracy and no errors, and agreed with the prediction of the uncertainty measure method. The results show that the classification model based on information entropy and distance discriminant analysis achieves the purpose of index optimization and has excellent performance, high prediction accuracy, and a zero false-positive rate. The model can be used as a tool for future evaluation of collapse risk.
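The distance-discriminant classification step can be sketched as below; this is a generic nearest-class-mean rule under Mahalanobis distance (two dimensions and a shared covariance matrix for illustration, not the paper's nine-factor model):

```python
import numpy as np

def mahalanobis_classify(x, class_means, cov):
    """Assign a sample to the class whose mean is nearest in Mahalanobis
    distance, using a pooled covariance matrix shared by all classes."""
    x = np.asarray(x, dtype=float)
    inv_cov = np.linalg.inv(np.asarray(cov, dtype=float))
    means = np.asarray(class_means, dtype=float)
    d2 = [float((x - mu) @ inv_cov @ (x - mu)) for mu in means]
    return int(np.argmin(d2))  # index of the predicted hazard class
```

Unlike Euclidean distance, the Mahalanobis form accounts for the scale of and correlation between the discriminant factors, which matters when indexes such as gradient and height vary on very different ranges.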
Jamasebi, Reza; Redline, Susan; Patel, Sanjay R; Loparo, Kenneth A.
2008-01-01
Study Objectives: We propose a generation of PSG-derived measures that use entropy to quantify temporal patterns of sleep, and investigate the role of these measures as predictors of hypertension. We also investigate the influence of age on these entropy-based measures as compared to traditional indices. Design and Setting: Cross-sectional analyses of the association of hypertension status with traditional PSG and novel measures using adjusted and unadjusted logistic regression models. The novel measures were developed to quantify the variability of the arousal event process. Patients or Participants: Analyses were based on a subsample of subjects from the Cleveland Family Study with clearly disparate hypertension status. Measurements and Results: Among traditional PSG indices, the apnea-hypopnea index (AHI) has the highest odds ratio (unadjusted and adjusted for age, gender, race and BMI: OR = 2.36 (95% CI: 1.48-3.75, P = 0.0003) and 1.18 (95% CI: 0.76-1.84, P = 0.46), respectively). The best predictor among the entropy-based measures is derived from analysis of the temporal patterns of arousal duration, with unadjusted and adjusted ORs of 1.36 (95% CI: 1.08-1.71, P = 0.0085) and 2.08 (95% CI: 1.19-3.64, P = 0.01), respectively. Conclusions: Our findings suggest that when adjusted for common confounders such as age, gender, race, and BMI, the entropy-based features that quantify the variability of the arousal event process are more strongly associated with hypertension than traditional PSG indices, and they are not as strongly influenced by age as the traditional indices. The result implies that the regularity of arousals may be an important feature associated with hypertension. These measures may provide a powerful tool for discriminating individuals at risk for comorbidities, such as hypertension, associated with sleep disturbances. Citation: Jamasebi R; Redline S; Patel SR; Loparo KA. Entropy-based measures of EEG arousals as biomarkers for sleep
Structure of a Global Network of Financial Companies Based on Transfer Entropy
Directory of Open Access Journals (Sweden)
Leonidas Sandoval
2014-08-01
This work uses the stocks of the 197 largest companies in the world, in terms of market capitalization, in the financial area, from 2003 to 2012. We study the causal relationships between them using Transfer Entropy, which is calculated using the stocks of those companies and their counterparts lagged by one day. With this, we can assess which companies influence others according to sub-areas of the financial sector, which are banks, diversified financial services, savings and loans, insurance, private equity funds, real estate investment companies, and real estate trust funds. We also analyze the exchange of information between those stocks as seen by Transfer Entropy and the network formed by them based on this measure, verifying that they cluster mainly according to countries of origin, and then by industry and sub-industry. We then use data on the stocks of companies in the financial sectors of some of the countries suffering the most from the current credit crisis, namely Greece, Cyprus, Ireland, Spain, Portugal, and Italy, and assess, also using Transfer Entropy, which companies from the largest 197 are most affected by the stocks of these countries in crisis. The aim is to map a network of influences that may be used in the study of possible contagions originating in those countries in financial crisis.
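A minimal plug-in estimator of transfer entropy on daily up/down returns can be sketched as follows (history length 1 and binary symbols for brevity; the paper's exact discretization and estimator may differ, and all names are ours):

```python
import math
from collections import Counter

def transfer_entropy(source, target):
    """Plug-in transfer entropy source -> target on up/down symbols with a
    one-step history: I(target_next ; source_prev | target_prev)."""
    s = [1 if b > a else 0 for a, b in zip(source, source[1:])]
    t = [1 if b > a else 0 for a, b in zip(target, target[1:])]
    # joint observations (target_next, target_prev, source_prev)
    obs = list(zip(t[1:], t[:-1], s[:-1]))
    n = len(obs)
    c_full = Counter(obs)
    c_tt = Counter((tn, tp) for tn, tp, _ in obs)   # (target_next, target_prev)
    c_ts = Counter((tp, sp) for _, tp, sp in obs)   # (target_prev, source_prev)
    c_t = Counter(tp for _, tp, _ in obs)           # target_prev marginal
    te = 0.0
    for (tn, tp, sp), c in c_full.items():
        te += (c / n) * math.log(c * c_t[tp] / (c_tt[(tn, tp)] * c_ts[(tp, sp)]))
    return te
```

Because the estimate is a conditional mutual information of the empirical distribution, it is nonnegative, and it vanishes when the source adds no information beyond the target's own past, which is how asymmetric influence between stocks is detected.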
Quantitative evaluation method of arc sound spectrum based on sample entropy
Yao, Ping; Zhou, Kang; Zhu, Qiang
2017-08-01
Arc sound analysis is an effective way to evaluate the stability of the arc welding process, but current methods cannot effectively quantify the disorder of the process. By studying the characteristics of the arc sound signal, we found that low-frequency random mutations of the arc sound power, resulting from unstable factors such as spatter or short circuits, increase the complexity and randomness of the arc sound signal. The arc sound signals were then visualized on a time-frequency interface by means of the spectrogram, and it was found that the maximum power spectral density (PSD) distribution of the spectrogram is closely related to the stability of the arc welding process. Moreover, a method based on sample entropy was proposed to further quantify this relation. Finally, considering factors such as the average of the maximum PSD and the standard deviation of the sample entropy, a compound quantitative evaluation indicator, arc sound sample entropy (ASSE), which avoids the influence of different parameters on the quantitative results, is proposed, so that the stability of the arc welding process can be quantitatively assessed. Testing results showed that the accuracy of the method is more than 90 percent.
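For reference, the sample entropy statistic at the core of the ASSE indicator can be sketched generically as below (the paper's indicator additionally combines spectrogram PSD statistics, which are not shown; the tolerance convention here is an absolute one, which is our assumption):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B), where B counts pairs of matching templates of
    length m and A those of length m+1 (Chebyshev distance, tolerance r)."""
    def matches(length):
        templates = [x[i:i + length] for i in range(len(x) - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")
```

Low values indicate a regular, repetitive signal; random mutations of the arc sound power break template matches at length m+1 and drive the value up, which is what makes it a disorder measure for the welding process.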
Directory of Open Access Journals (Sweden)
Sheng Lin
2015-07-01
On the basis of an analysis of the high-voltage direct current (HVDC) transmission system and its fault superimposed circuit, the directions of the fault components of the voltage and current measured at one end of the transmission line are shown to differ for internal and external faults. As an estimate of the difference between two signals, relative entropy is an effective parameter for recognizing transient signals in HVDC transmission lines. In this paper, the relative entropy of wavelet energy is applied to distinguish internal faults from external faults. For internal faults, the directions of the fault components of voltage and current are opposite at the two ends of the transmission line, yielding a large wavelet energy relative entropy; for external faults, the directions are identical, yielding a small one. Simulation results based on PSCAD/EMTDC show that the proposed pilot protection system acts accurately for faults under different conditions, and its performance is not affected by fault type, fault location, fault resistance or noise.
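The decision quantity can be sketched as the relative entropy (KL divergence) between the normalized wavelet-energy distributions obtained at the two line ends (a generic illustration with made-up energy vectors, not the paper's wavelet decomposition or protection thresholds):

```python
import math

def wavelet_energy_relative_entropy(e1, e2):
    """Relative entropy (KL divergence) between two wavelet subband energy
    vectors after normalizing each into a probability distribution."""
    p = [e / sum(e1) for e in e1]
    q = [e / sum(e2) for e in e2]
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

For an internal fault, the opposite fault-component directions at the two ends produce dissimilar energy distributions and a large value; for an external fault the distributions are nearly identical and the value stays near zero, which is the discrimination criterion.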
Efficient algorithms and implementations of entropy-based moment closures for rarefied gases
Energy Technology Data Exchange (ETDEWEB)
Schaerer, Roman Pascal, E-mail: schaerer@mathcces.rwth-aachen.de; Bansal, Pratyuksh; Torrilhon, Manuel
2017-07-01
We present efficient algorithms and implementations of the 35-moment system equipped with the maximum-entropy closure in the context of rarefied gases. While closures based on the principle of entropy maximization have been shown to yield very promising results for moderately rarefied gas flows, the computational cost of these closures is in general much higher than for closure theories with explicit closed-form expressions of the closing fluxes, such as Grad's classical closure. Following a similar approach as Garrett et al. (2015), we investigate efficient implementations of the computationally expensive numerical quadrature method used for the moment evaluations of the maximum-entropy distribution by exploiting its inherent fine-grained parallelism with the parallelism offered by multi-core processors and graphics cards. We show that using a single graphics card as an accelerator allows speed-ups of two orders of magnitude when compared to a serial CPU implementation. To accelerate the time-to-solution for steady-state problems, we propose a new semi-implicit time discretization scheme. The resulting nonlinear system of equations is solved with a Newton type method in the Lagrange multipliers of the dual optimization problem in order to reduce the computational cost. Additionally, fully explicit time-stepping schemes of first and second order accuracy are presented. We investigate the accuracy and efficiency of the numerical schemes for several numerical test cases, including a steady-state shock-structure problem.
Jenifer, M. Annie; Jha, Madan K.
2017-05-01
Groundwater is a treasured underground resource that plays a central role in sustainable water management. However, because it is hidden and dynamic in nature, its sustainable development and management call for precise quantification of this precious resource at an appropriate scale. This study demonstrates the efficacy of three GIS-based multi-criteria decision analysis (MCDA) techniques, viz., the Analytic Hierarchy Process (AHP), Catastrophe and Entropy, in evaluating groundwater potential through a case study in hard-rock aquifer systems. Using satellite imagery and relevant field data, eight thematic layers (rainfall, land slope, drainage density, soil, lineament density, geology, proximity to surface water bodies and elevation) of the factors having significant influence on groundwater occurrence were prepared. These thematic layers and their features were assigned suitable weights based on the conceptual frameworks of the AHP, Catastrophe and Entropy techniques, and were then integrated in the GIS environment to generate an integrated raster layer depicting the groundwater potential index of the study area. The three groundwater prospect maps thus yielded by these MCDA techniques were verified using a novel approach (the concept of 'Dynamic Groundwater Potential'). The validation results revealed that the groundwater potential predicted by the AHP technique has a pronounced accuracy of 87% compared to the Catastrophe (46% accuracy) and Entropy (51% accuracy) techniques. It is concluded that the AHP technique is the most reliable for the assessment of groundwater resources, followed by the Entropy method. The developed groundwater potential maps can serve as a scientific guideline for the cost-effective siting of wells and the effective planning of groundwater development at a catchment or basin scale.
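The Entropy technique referred to above is commonly realized as the entropy weight method; the sketch below makes that assumption, taking a decision matrix of positive values (rows = locations, columns = thematic criteria) as given:

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: a criterion whose values vary more across
    alternatives carries more information and so receives more weight."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)          # normalizes each entropy into [0, 1]
    degrees = []
    for j in range(n):
        col = [row[j] for row in matrix]
        s = sum(col)
        p = [x / s for x in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        degrees.append(1.0 - e)    # degree of diversification
    total = sum(degrees)
    return [d / total for d in degrees]
```

A perfectly uniform criterion gets entropy 1 and therefore weight 0; all weights sum to 1.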
International Nuclear Information System (INIS)
Clariá, F; Vallverdú, M; Caminal, P; Baranowski, R; Chojnowska, L
2008-01-01
In hypertrophic cardiomyopathy (HCM) patients there is an increased risk of premature death, which can occur with little or no warning. Furthermore, classification for sudden cardiac death in patients with HCM is very difficult. The aim of our study was to improve the prognostic value of heart rate variability (HRV) in HCM patients, giving insight into changes of the autonomic nervous system. To this end, the suitability of linear and nonlinear measures was studied to assess the HRV. These measures were based on time–frequency representation (TFR) and on Shannon and Rényi entropies, and compared with traditional HRV measures. Holter recordings of 64 patients with HCM and 55 healthy subjects were analyzed. The HCM patients consisted of two groups: 13 high-risk patients, after aborted sudden cardiac death (SCD), and 51 low-risk patients, without SCD. Five-hour RR signals, corresponding to the sleep period of the subjects, were considered for the analysis as a comparable standard situation. These RR signals were filtered in three frequency bands: the very low frequency band (VLF, 0–0.04 Hz), the low frequency band (LF, 0.04–0.15 Hz) and the high frequency band (HF, 0.15–0.45 Hz). TFR variables based on instantaneous frequency and energy functions were able to classify HCM patients and healthy subjects (control group). Results revealed that measures obtained from TFR analysis of the HRV classified the groups of subjects better than traditional HRV parameters. Moreover, nonlinear measures further improved group classification. Entropies calculated in the HF band showed the highest statistically significant levels when comparing the HCM group and the control group (p-value < 0.0005). The entropy measures calculated in the HCM group presented lower values than those calculated from the control group, indicating a decrease in complexity. Moreover, similar behavior was observed comparing high and low risk of premature death, the values of
Adjoint entropy vs topological entropy
Giordano Bruno, Anna
2012-01-01
Recently the adjoint algebraic entropy of endomorphisms of abelian groups was introduced and studied. We generalize the notion of adjoint entropy to continuous endomorphisms of topological abelian groups. Indeed, the adjoint algebraic entropy is defined using the family of all finite-index subgroups, while we take only the subfamily of all open finite-index subgroups to define the topological adjoint entropy. This allows us to compare the (topological) adjoint entropy with the known topologic...
Cross entropy-based memetic algorithms: An application study over the tool switching problem
Directory of Open Access Journals (Sweden)
Jhon Edgar Amaya
2013-05-01
Full Text Available This paper presents a parameterized schema for building memetic algorithms based on cross-entropy (CE) methods. This novel schema is general in nature, and features multiple probability mass functions and Lamarckian learning. The applicability of the approach is assessed by considering the Tool Switching Problem, a complex combinatorial problem in the field of Flexible Manufacturing Systems. An exhaustive evaluation (including techniques ranging from local search and evolutionary algorithms to constructive methods) provides evidence of the effectiveness of CE-based memetic algorithms.
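A generic cross-entropy method over bit strings illustrates the probability-mass-function machinery such a schema builds on. This sketch omits the memetic component (local search, Lamarckian learning) entirely, uses a single independent Bernoulli mass function per variable, and its parameter defaults are illustrative assumptions:

```python
import random

def cross_entropy_maximize(score, n_bits, pop=50, elite=10, iters=30,
                           alpha=0.7, seed=1):
    """Minimal cross-entropy method: sample bit strings from Bernoulli
    mass functions, keep the elite samples, and shift each probability
    toward its frequency in the elite set (smoothed by alpha)."""
    rng = random.Random(seed)
    p = [0.5] * n_bits          # one mass function per bit
    best = None
    for _ in range(iters):
        samples = [[1 if rng.random() < pi else 0 for pi in p]
                   for _ in range(pop)]
        samples.sort(key=score, reverse=True)
        if best is None or score(samples[0]) > score(best):
            best = samples[0]
        top = samples[:elite]
        freq = [sum(s[i] for s in top) / elite for i in range(n_bits)]
        p = [alpha * f + (1 - alpha) * pi for f, pi in zip(freq, p)]
    return best
```

On a toy OneMax objective (`score=sum`) the mass functions converge toward the all-ones string within a few dozen iterations.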
A comparison of pain level and entropy changes following core stability exercise intervention.
Lee, Taero; Kim, Yoon Hyuk; Sung, Paul S
2011-07-01
As reported in our previous studies, the complexity of physiologic time series is a sensitive measure of muscle fatigability. This study compared the differences between two different analyses following 4 weeks of core stability exercises (CSE) in subjects with and without chronic low back pain (LBP). We examined whether the observed Shannon (information) entropy, as compared with the median frequency (MF), was able to differentiate fatigability of the thoracic and lumbar parts of the erector spinae (ES) muscles following the intervention. In total, 32 subjects participated in this study. There were 13 subjects in the CSE intervention group (average age 50.4 ± 9.1 years) and 19 subjects in the control group (average age 46.6 ± 9.1 years). The CSE group performed the specific exercise intervention, while the control group was asked to maintain their current activity and/or exercise levels. The endurance of the back muscles was determined by using a modified version of the isometric fatigue test originally introduced by Sorensen. Pain level decreased significantly for all subjects (F=25.29, p=0.001), but there was no difference between groups (F=0.42, p=0.52). The MF was not different between groups following treatment (F=0.81, p=0.37). Although there were no changes in entropy level following treatment (F=0.01, p=0.93), the interactions between muscles and groups following treatment were significant (F=7.25, p=0.01). The entropy level decreased in both thoracic ES muscles following intervention in the exercise group, while remaining the same in the control group. Although the change in pain level was not different between groups, the Shannon entropy measure differentiated the exercise intervention more sensitively than did the MF. In addition, the results also suggested that complexity is related to muscle fatigue, which corresponds to the values of entropy between groups. Further studies are needed to investigate the effectiveness of nonlinear time series of EMG data for
IP Packet Size Entropy-Based Scheme for Detection of DoS/DDoS Attacks
Du, Ping; Abe, Shunji
Denial of service (DoS) attacks have become one of the most serious threats to the Internet. Enabling detection of attacks in network traffic is an important and challenging task. However, most existing volume-based schemes cannot detect short-term attacks that have a minor effect on traffic volume. On the other hand, feature-based schemes are not suitable for real-time detection because of their complicated calculations. In this paper, we develop an IP packet size entropy (IPSE)-based DoS/DDoS detection scheme in which the entropy is markedly changed when traffic is affected by an attack. Through our analysis, we find that the IPSE-based scheme is capable of detecting not only long-term attacks but also short-term attacks that are beyond the volume-based schemes' ability to detect. Moreover, we test our proposal using two typical Internet traffic data sets, from DARPA and SINET, and the test results show that the IPSE-based detection scheme can provide detection of DoS/DDoS attacks not only in a local area network (DARPA) but also in an academic backbone network (SINET).
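The IPSE idea can be sketched as follows, assuming windows of observed packet sizes; the fixed deviation threshold and the window handling are simplified assumptions, not the paper's exact scheme:

```python
import math
from collections import Counter

def packet_size_entropy(sizes):
    """Shannon entropy (bits) of the packet-size distribution in one window."""
    counts = Counter(sizes)
    n = len(sizes)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def is_anomalous(window, baseline_entropy, threshold=1.0):
    """Flag a window whose size entropy deviates from the normal baseline.
    A flood of identically sized packets collapses the entropy toward 0."""
    return abs(packet_size_entropy(window) - baseline_entropy) > threshold
```

Because the statistic reacts to the shape of the size distribution rather than to total volume, it can also respond to short-term attacks that barely move the traffic volume.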
Chaos control of ferroresonance system based on RBF-maximum entropy clustering algorithm
International Nuclear Information System (INIS)
Liu Fan; Sun Caixin; Sima Wenxia; Liao Ruijin; Guo Fei
2006-01-01
With regard to the ferroresonance overvoltage of neutral grounded power systems, a maximum-entropy learning algorithm based on radial basis function neural networks is used to control the chaotic system. The algorithm optimizes the objective function to derive the learning rule for the central vectors, and uses the clustering function of the network hidden layers. It improves the regression and learning ability of the neural networks. A numerical experiment on the ferroresonance system testifies to the effectiveness and feasibility of using the algorithm to control chaos in a neutral grounded system.
Ground state of the Hubbard model: a variational approach based on the maximum entropy principle
Energy Technology Data Exchange (ETDEWEB)
Arrachea, L. (Dept. de Fisica, Univ. Nacional de La Plata (Argentina)); Plastino, A. (Dept. de Fisica, Univ. Nacional de La Plata (Argentina)); Canosa, N. (Physik Dept. der Technischen Univ. Muenchen, Garching (Germany)); Rossignoli, R. (Physik Dept. der Technischen Univ. Muenchen, Garching (Germany))
1993-05-17
A variational approach based on maximum entropy considerations is used to approximate the ground state of the Hubbard Hamiltonian. The evaluation of both the ground state energy and the correlation functions is performed with a trial wave function, which is parameterized in terms of a small set of variables associated with the relevant correlation operators of the problem. Results for the one-dimensional case are in very good agreement with the exact ones for arbitrary interaction strengths. It is also shown that the method provides us with better evaluations of the ground state energy and correlation functions than those obtained with the Gutzwiller approximation. (orig.)
Local curvature entropy-based 3D terrain representation using a comprehensive Quadtree
Chen, Qiyu; Liu, Gang; Ma, Xiaogang; Mariethoz, Gregoire; He, Zhenwen; Tian, Yiping; Weng, Zhengping
2018-05-01
Large scale 3D digital terrain modeling is a crucial part of many real-time applications in geoinformatics. In recent years, improved speed and precision in spatial data collection have made the original terrain data more complex and voluminous, which poses challenges for data management, visualization and analysis. In this work, we present an effective and comprehensive 3D terrain representation based on local curvature entropy and a dynamic Quadtree. Level-of-detail (LOD) models of significant terrain features were employed to generate hierarchical terrain surfaces. In order to reduce radical changes of grid density between adjacent LODs, the local entropy of terrain curvature was used as the measure for subdividing terrain grid cells. An efficient approach was then presented to eliminate the cracks among different LODs by directly updating the Quadtree, using an edge-based structure proposed in this work. Furthermore, we utilized a threshold of local entropy stored in each parent node of this Quadtree to flexibly control the depth of the Quadtree and dynamically schedule large-scale LOD terrain. Several experiments were implemented to test the performance of the proposed method. The results demonstrate that our method can be applied to construct LOD 3D terrain models with good performance in terms of computational cost and the maintenance of terrain features. Our method has already been deployed in a geographic information system (GIS) for practical use, and it is able to support the real-time dynamic scheduling of large scale terrain models more easily and efficiently.
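The entropy-driven subdivision can be illustrated with a minimal Quadtree sketch, assuming a square grid of pre-computed curvature values whose side is a power of two; the histogram bin count and the entropy threshold are illustrative assumptions, and crack elimination between LODs is not shown:

```python
import math

def cell_entropy(grid, x, y, size, bins=8):
    """Shannon entropy of the curvature values inside one square cell."""
    vals = [grid[i][j] for i in range(y, y + size) for j in range(x, x + size)]
    lo, hi = min(vals), max(vals)
    if hi == lo:
        return 0.0  # perfectly flat cell carries no information
    counts = [0] * bins
    for v in vals:
        counts[min(int((v - lo) / (hi - lo) * bins), bins - 1)] += 1
    n = len(vals)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def subdivide(grid, x=0, y=0, size=None, threshold=1.0):
    """Return leaf cells (x, y, size): keep splitting while the local
    curvature entropy of a cell exceeds the threshold."""
    if size is None:
        size = len(grid)
    if size == 1 or cell_entropy(grid, x, y, size) <= threshold:
        return [(x, y, size)]
    h = size // 2
    leaves = []
    for dx, dy in ((0, 0), (h, 0), (0, h), (h, h)):
        leaves += subdivide(grid, x + dx, y + dy, h, threshold)
    return leaves
```

Flat regions collapse into large cells while feature-rich regions are refined, which is the LOD behavior the abstract describes.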
Directory of Open Access Journals (Sweden)
Rongxing Zhou
2017-10-01
Full Text Available As a new development in evaluating regional water resources carrying capacity, forewarning regional water resources of their carrying capacities is an important adjustment and control measure for regional water security management. Up to now, most research on this issue has been qualitative, with a lack of quantitative study. For this reason, an index system for forewarning regional water resources of their carrying capacities, together with grade standards, has been established for Anhui Province, China, in this paper. Subjective weights of the forewarning indices can be calculated using a fuzzy analytic hierarchy process based on an accelerating genetic algorithm, while objective weights can be calculated using a projection pursuit method based on an accelerating genetic algorithm. These two kinds of weights are combined into combination weights of the forewarning indices by using the minimum relative information entropy principle. Furthermore, a forewarning model of regional water resources carrying capacity based on entropy combination weights is put forward. The model can fully integrate subjective and objective information in the forewarning process. The results show that the calculation results of the model are reasonable and the method has high adaptability. Therefore, this model is worth studying and popularizing.
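Minimizing the summed relative entropy of the combined weights to both the subjective and the objective weight vectors (subject to normalization) has the standard closed form w_j proportional to sqrt(ws_j * wo_j); the sketch below assumes that is the form used, with both input vectors already normalized:

```python
import math

def combine_weights(subjective, objective):
    """Combination weights from the minimum relative information entropy
    principle: w_j is proportional to sqrt(ws_j * wo_j), renormalized."""
    raw = [math.sqrt(a * b) for a, b in zip(subjective, objective)]
    total = sum(raw)
    return [r / total for r in raw]
```

The geometric-mean form pulls each combined weight between its subjective and objective counterparts, which is how the model "fully integrates" both kinds of information.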
A Roller Bearing Fault Diagnosis Method Based on LCD Energy Entropy and ACROA-SVM
Directory of Open Access Journals (Sweden)
HungLinh Ao
2014-01-01
Full Text Available This study investigates a novel method for roller bearing fault diagnosis based on local characteristic-scale decomposition (LCD) energy entropy, together with a support vector machine designed using an Artificial Chemical Reaction Optimisation Algorithm, referred to as an ACROA-SVM. First, the original acceleration vibration signals are decomposed into intrinsic scale components (ISCs). Second, the concept of LCD energy entropy is introduced. Third, the energy features extracted from a number of ISCs that contain the most dominant fault information serve as input vectors for the support vector machine classifier. Finally, the ACROA-SVM classifier is used to recognize the faulty roller bearing pattern. The analysis of roller bearing signals with inner-race and outer-race faults shows that the diagnostic approach based on the ACROA-SVM, using LCD to extract the energy levels of the various frequency bands as features, can identify roller bearing fault patterns accurately and effectively. The proposed method is superior to approaches based on the Empirical Mode Decomposition (EMD) method and requires less time.
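The energy entropy itself reduces to the Shannon entropy of the normalized per-component energies; a minimal sketch, assuming the ISCs have already been obtained from the decomposition (which is not shown):

```python
import math

def energy_entropy(components):
    """Energy entropy of decomposed components (e.g. ISCs): the Shannon
    entropy of the normalized per-component signal energies."""
    energies = [sum(x * x for x in c) for c in components]
    total = sum(energies)
    p = [e / total for e in energies]
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

Energy spread evenly across components gives the maximum value ln(n); energy concentrated in one component, as a localized fault tends to produce, drives the entropy toward 0.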
EEG entropy measures in anesthesia
Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J.; Sleigh, Jamie W.; Hagihira, Satoshi; Li, Xiaoli
2015-01-01
Highlights: ► Twelve entropy indices were systematically compared in monitoring depth of anesthesia and detecting burst suppression. ► Renyi permutation entropy performed best in tracking EEG changes associated with different anesthesia states. ► Approximate entropy and sample entropy performed best in detecting burst suppression. Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing the effect of anesthetic drugs is lacking. In this study, we compare the capability of 12 entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP) in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State Entropy (SE), three wavelet entropy (WE) measures [Shannon WE (SWE), Tsallis WE (TWE) and Renyi WE (RWE)], Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy, and three permutation entropy (PE) measures [Shannon PE (SPE), Tsallis PE (TPE) and Renyi PE (RPE)]. Two EEG data sets, from sevoflurane-induced and isoflurane-induced anesthesia respectively, were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability (Pk) analysis were applied. The multifractal detrended fluctuation analysis (MDFA), a non-entropy measure, was included for comparison. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. Three PE measures outperformed the other entropy indices, with less baseline variability, higher coefficient of determination (R2) and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, these entropy measures showed an advantage in computation
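Of the indices listed, Shannon permutation entropy is compact enough to sketch. This is a generic implementation with illustrative order and delay defaults, not necessarily the exact variant used in the study:

```python
import math
from collections import Counter

def permutation_entropy(signal, order=3, delay=1):
    """Shannon permutation entropy, normalized to [0, 1] by log(order!).
    Each window is mapped to its ordinal pattern (the argsort of its
    values), and the entropy of the pattern distribution is taken."""
    patterns = Counter()
    for i in range(len(signal) - (order - 1) * delay):
        window = [signal[i + j * delay] for j in range(order)]
        patterns[tuple(sorted(range(order), key=window.__getitem__))] += 1
    n = sum(patterns.values())
    h = -sum(c / n * math.log(c / n) for c in patterns.values())
    return h / math.log(math.factorial(order))
```

A monotone signal uses a single ordinal pattern and scores 0, while irregular signals spread probability over many patterns and score closer to 1, which is what makes PE useful for tracking the transition between anesthesia states.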
Fault Detection for Vibration Signals on Rolling Bearings Based on the Symplectic Entropy Method
Directory of Open Access Journals (Sweden)
Min Lei
2017-11-01
Full Text Available Bearing vibration response studies are crucial for the condition monitoring of bearings and the quality inspection of rotating machinery systems. However, it is still very difficult to diagnose bearing faults, especially rolling element faults, due to the complex, high-dimensional and nonlinear characteristics of vibration signals as well as the strong background noise. A novel nonlinear analysis method, the symplectic entropy (SymEn) measure, is proposed to analyze the measured signals for fault monitoring of rolling bearings. The core technique of the SymEn approach is entropy analysis based on the symplectic principal components. The dynamical characteristics of the rolling bearing data are analyzed using the SymEn method. Unlike other techniques consisting of high-dimensional features in the time-domain, frequency-domain and empirical mode decomposition (EMD)/wavelet-domain, the SymEn approach constructs low-dimensional (i.e., two-dimensional) features based on the SymEn estimate. The vibration signals from our experiments and from the Case Western Reserve University Bearing Data Center are applied to verify the effectiveness of the proposed method. Meanwhile, it is found that faulty bearings have a great influence on the other, normal bearings. To sum up, the results indicate that the proposed method can be used to detect rolling bearing faults.
A Novel Object Tracking Algorithm Based on Compressed Sensing and Entropy of Information
Directory of Open Access Journals (Sweden)
Ding Ma
2015-01-01
Full Text Available Object tracking has always been a hot research topic in the field of computer vision; its purpose is to track objects with specific characteristics or representations and estimate information about the objects, such as their locations, sizes and rotation angles, in the current frame. Object tracking in complex scenes usually encounters various challenges, such as location change, dimension change, illumination change, perception change and occlusion. This paper proposes a novel object tracking algorithm based on compressed sensing and information entropy to address these challenges. First, objects are characterized by Haar (Haar-like) and ORB features. Second, the dimensions of the computation space of the Haar and ORB features are effectively reduced through compressed sensing. Then the above-mentioned features are fused based on information entropy. Finally, in the particle filter framework, an object location is obtained by selecting candidate object locations in the current frame from the local context neighboring the optimal locations in the last frame. Our extensive experimental results demonstrate that this method is able to effectively address the challenges of perception change, illumination change and large-area occlusion, which allows it to achieve better performance than existing approaches such as MIL and CT.
Combined Forecasting of Rainfall Based on Fuzzy Clustering and Cross Entropy
Directory of Open Access Journals (Sweden)
Baohui Men
2017-12-01
Full Text Available Rainfall is an essential index for measuring drought, and it depends on various parameters including geographical environment, air temperature and pressure. The nonlinear nature of climatic variables leads to problems such as poor accuracy and instability in traditional forecasting methods. In this paper, a combined forecasting method based on data mining technology and cross entropy is proposed to forecast rainfall with full consideration of the time-effectiveness of historical data. In view of the flaws of the fuzzy clustering method, which easily falls into locally optimal solutions and operates slowly, the ant colony algorithm is adopted to overcome these shortcomings and thereby refine the model. The method for determining weights is also improved by using the cross entropy. The forecast is conducted by analyzing the weighted average rainfall based on Thiessen polygons in the Beijing–Tianjin–Hebei region. Once the predictive errors are calculated, the results show that the improved ant colony fuzzy clustering can effectively select historical data and enhance the accuracy of prediction, so that the damage caused by extreme weather events like droughts and floods can be greatly lessened and even kept at bay.
An Entropy-Based Multiobjective Evolutionary Algorithm with an Enhanced Elite Mechanism
Directory of Open Access Journals (Sweden)
Yufang Qin
2012-01-01
Full Text Available The multiobjective optimization problem (MOP) is an important and challenging topic in the fields of industrial design and scientific research. Multi-objective evolutionary algorithms (MOEAs) have proved to be among the most efficient algorithms for solving multi-objective optimization. In this paper, we propose an entropy-based multi-objective evolutionary algorithm with an enhanced elite mechanism (E-MOEA), which effectively improves the convergence and diversity of the solution set in MOPs. In this algorithm, an enhanced elite mechanism is applied to guide the direction of the evolution of the population. Specifically, it accelerates the population's approach to the true Pareto front at the early stage of the evolution process. A strategy based on entropy is used to maintain the diversity of the population when the population is near the Pareto front. The proposed algorithm is executed on widely used test problems, and the simulated results show that the algorithm has better or comparable performance in convergence and diversity of solutions compared with three state-of-the-art evolutionary algorithms: NSGA-II, SPEA2 and MOSADE.
Hassan, Mahmoud; Terrien, Jérémy; Marque, Catherine; Karlsson, Brynjar
2011-10-01
Detection of nonlinearity should be the first step before any analysis of nonlinearity or nonlinear behavior in biological signals. The question is which method should be used in each case and which one can best respect the different characteristics of the signals under investigation. In this paper we compare three methods widely used in nonlinearity detection: approximate entropy, correntropy and time reversibility. The false alarm rates, as a function of the number of surrogates, were computed for the three methods on linear, nonlinear stationary and nonlinear nonstationary signals. The results indicate the superiority of time reversibility over the other methods for detecting linearity and nonlinearity in different signal types. The application of time reversibility to uterine electromyographic signals showed very good performance in classifying pregnancy and labor signals. Copyright © 2011 IPEM. Published by Elsevier Ltd. All rights reserved.
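Of the three detectors compared, time reversibility has the simplest common estimator, the third moment of the lagged differences; its expectation vanishes for linear Gaussian processes, so a clearly nonzero value suggests nonlinearity. A sketch (the exact statistic used in the paper may differ):

```python
def time_reversibility(series, tau=1):
    """Third-order time-reversibility statistic: mean cubed difference
    between samples tau steps apart. Near zero for time-symmetric
    (e.g. linear Gaussian) signals; large in magnitude for signals
    with asymmetric rise/fall dynamics."""
    diffs = [(series[i + tau] - series[i]) ** 3
             for i in range(len(series) - tau)]
    return sum(diffs) / len(diffs)
```

In practice the observed value is compared against the distribution of the statistic over surrogate series, which is where the false alarm rates above come from.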
Energy and entropy analysis of closed adiabatic expansion based trilateral cycles
International Nuclear Information System (INIS)
Garcia, Ramon Ferreiro; Carril, Jose Carbia; Gomez, Javier Romero; Gomez, Manuel Romero
2016-01-01
Highlights: • Adiabatic expansion based TCs surpass the Carnot factor at low temperatures. • Surpassing the Carnot factor does not violate the 2nd law. • An entropy analysis is applied to verify the fulfilment of the second law. • Correction of the exergy transfer associated with heat transferred to a cycle. - Abstract: A vast amount of heat energy is available at low cost within the range of medium and low temperatures. Existing thermal cycles cannot make efficient use of such available low grade heat because they are mainly based on conventional organic Rankine cycles, which are limited by Carnot constraints. However, recent developments related to the performance of thermal cycles composed of closed processes have led to the Carnot factor being exceeded. Consequently, once the viability of closed process based thermal cycles that surpass the Carnot factor operating at low and medium temperatures is globally accepted, research work will aim at looking into the consequences that follow from surpassing the Carnot factor while fulfilling the 2nd law, its impact on the definition of 2nd law efficiency, as well as its impact on the exergy transfer from thermal power sources to any heat consumer, including thermal cycles. The methodology used to meet the proposed objectives involves the analysis of energy and entropy in trilateral closed process based thermal cycles. Thus, such energy and entropy analysis is carried out upon non-condensing mode trilateral thermal cycles (TCs) characterised by the conversion of low grade heat into mechanical work undergoing closed adiabatic path functions: isochoric heat absorption, adiabatic heat to mechanical work conversion and isobaric heat rejection. Firstly, a cycle energy analysis is performed to determine the range of some relevant cycle parameters, such as the operating temperatures and their associated pressures, entropies, internal energies and specific volumes. In this way, the ranges of temperatures within which
Constructing a Measurement Method of Differences in Group Preferences Based on Relative Entropy
Directory of Open Access Journals (Sweden)
Shiyu Zhang
2017-01-01
Full Text Available In research and data analysis of differences in group preferences, conventional statistical methods cannot reflect the integrity and preferences of human minds; in particular, it is difficult to exclude humans' irrational factors. This paper introduces a preference amount model based on relative entropy theory. A related expansion is made based on the characteristics of the questionnaire data, and we also construct a parameter to measure overall differences in the data distributions of different groups. In this paper, this parameter is called the center distance, and it effectively reflects the preferences of human minds. Using survey data of securities market participants as an example, this paper analyzes differences in market participants' attitudes toward the effectiveness of securities regulation. Based on this method, differences between groups that were overlooked by analysis of variance are found, as are certain aspects obscured by general data characteristics.
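One natural reading of a relative-entropy-based "center distance" is each group's Kullback-Leibler divergence to a pooled center distribution. This specific construction is an assumption for illustration, not necessarily the paper's exact definition:

```python
import math

def center_distances(groups):
    """For each group's answer-frequency vector, compute its relative
    entropy (KL divergence) to the pooled center distribution, i.e. the
    average of the groups' normalized distributions."""
    k = len(groups)
    norm = [[x / sum(g) for x in g] for g in groups]
    center = [sum(g[j] for g in norm) / k for j in range(len(norm[0]))]
    return [sum(p * math.log(p / c) for p, c in zip(g, center) if p > 0)
            for g in norm]
```

Groups whose answer distributions coincide all sit at distance zero from the center; groups with opposed preferences are pushed symmetrically away from it.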
Analysis of financial time series using multiscale entropy based on skewness and kurtosis
Xu, Meng; Shang, Pengjian
2018-01-01
There is great interest in studying the dynamic characteristics of financial time series of daily stock closing prices in different regions. Multi-scale entropy (MSE) is effective mainly in quantifying the complexity of time series on different time scales. This paper applies a new method for assessing financial stability from the perspective of MSE based on skewness and kurtosis. To better understand the superior coarse-graining method for different kinds of stock indexes, we take into account the developmental characteristics of the stock markets of the three continents of Asia, North America and Europe. We study the volatility of different financial time series and analyze the similarities and differences of the coarsened time series from the perspective of skewness and kurtosis. A correspondence between the entropy values of stock sequences and the degree of stability of financial markets was observed. Three stocks with particular characteristics among the eight stock sequences are discussed, and the findings match the results of applying the MSE method as shown graphically. A comparative study is conducted by simulation over synthetic and real-world data. Results show that the modified method is more sensitive to changes in dynamics and carries more valuable information. At the same time, the discrimination provided by skewness and kurtosis is found to be obvious and also more stable.
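Standard MSE combines coarse-graining with sample entropy. A minimal sketch using the usual mean-based coarse-graining; the paper's variant would presumably replace the window mean with a per-window skewness or kurtosis statistic, which is not reproduced here:

```python
import math

def coarse_grain(series, scale):
    """Standard MSE coarse-graining: non-overlapping window means."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(series, m=2, r=0.2):
    """Rough SampEn(m, r) sketch, with the tolerance r given as a
    fraction of the series' standard deviation."""
    n = len(series)
    mean = sum(series) / n
    sd = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    tol = r * sd

    def matches(length):
        count = 0
        for i in range(n - length + 1):
            for j in range(i + 1, n - length + 1):
                if max(abs(series[i + k] - series[j + k])
                       for k in range(length)) <= tol:
                    count += 1
        return count

    b, a = matches(m), matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)
```

Computing `sample_entropy(coarse_grain(x, s))` for a range of scales `s` yields the MSE curve; regular series score low, irregular (e.g. chaotic) series score high.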
A new qualitative acoustic emission parameter based on Shannon's entropy for damage monitoring
Chai, Mengyu; Zhang, Zaoxiao; Duan, Quan
2018-02-01
An important objective of acoustic emission (AE) non-destructive monitoring is to accurately identify approaching critical damage and to avoid premature failure by means of the evolution of AE parameters. One major drawback of most parameters, such as count and rise time, is that they are strongly dependent on the threshold and other settings employed in the AE data acquisition system. This may hinder the correct reflection of the original waveform generated from AE sources and consequently make it difficult to accurately identify critical damage and early failure. In this investigation, a new qualitative AE parameter based on Shannon's entropy, the AE entropy, is proposed for damage monitoring. Since it derives from the uncertainty of the amplitude distribution of each AE waveform, it is independent of the threshold and other time-driven parameters and can characterize the original micro-structural deformations. A fatigue crack growth test on CrMoV steel and a three-point bending test on a ductile material are conducted to validate the feasibility and effectiveness of the proposed parameter. The results show that the new parameter, compared to AE amplitude, is more effective in discriminating the different damage stages and identifying critical damage.
Leach, James M.; Coulibaly, Paulin; Guo, Yiping
2016-10-01
This study explores the inclusion of a groundwater-recharge-based design objective and its impact on the design of optimum groundwater monitoring networks. The study was conducted in the Hamilton, Halton, and Credit Valley regions of Ontario, Canada, in which the existing Ontario Provincial Groundwater Monitoring Network was augmented with additional monitoring wells. The Dual Entropy-Multiobjective Optimization (DEMO) model was used in these analyses. The value of using this design objective is rooted in the information contained within the estimated recharge. Estimating recharge requires knowledge of the climate, geomorphology, and geology of the area, so using this objective function can help account for these physical characteristics. Two sources of groundwater recharge data were examined and compared: the first was calculated using the Precipitation-Runoff Modeling System (PRMS), and the second was an aggregation of recharge found using both the PRMS and the Hydrological Simulation Program-Fortran (HSP-F). The entropy functions are used to identify optimal trade-offs between the maximum information content and the minimum shared information between the monitoring wells. The recharge objective helps to quantify hydrological characteristics of the vadose zone and thus provides more information to the optimization algorithm. Results show that by including recharge as a design objective, the spatial coverage of the monitoring network can be improved. The study also highlights the flexibility of DEMO and its ability to incorporate additional design objectives such as groundwater recharge.
Analysis of alcoholic EEG signals based on horizontal visibility graph entropy.
Zhu, Guohun; Li, Yan; Wen, Peng Paul; Wang, Shuaifang
2014-12-01
This paper proposes a novel horizontal visibility graph entropy (HVGE) approach to evaluate EEG signals from alcoholic subjects and controlled drinkers and compares it with a sample entropy (SaE) method. Firstly, HVGEs and SaEs are extracted from 1,200 recordings of biomedical signals. A statistical analysis method is employed to choose the optimal channels for identifying the abnormalities in alcoholics. Five groups of channels are selected and forwarded to a K-Nearest Neighbour (K-NN) classifier and a support vector machine (SVM) for classification. The experimental results show that the HVGEs associated with the left hemisphere ([Formula: see text]1, [Formula: see text]3 and FC5 electrodes) of alcoholics are significantly abnormal. The accuracy of classification with 10-fold cross-validation is 87.5 [Formula: see text] with about three HVGE features. Using just the optimal 13-dimensional HVGE features, the accuracy is 95.8 [Formula: see text]. In contrast, the SaE features cannot identify the left-hemisphere disorder in alcoholism, and the maximum classification ratio based on SaE is just 95.2 [Formula: see text] even when using all channel signals. These results demonstrate that the HVGE method is a promising approach for alcoholism identification from EEG signals.
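A horizontal visibility graph links two samples when every sample between them is strictly lower than both. A minimal sketch of that construction, with the HVGE taken here as the Shannon entropy of the resulting degree distribution; that is one plausible reading, and the paper's exact entropy definition may differ:

```python
import numpy as np

def hvg_degrees(x):
    """Node degrees of the horizontal visibility graph of a series:
    i and j are linked if every sample strictly between them is lower
    than both (adjacent samples are always linked)."""
    n = len(x)
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg

def hvg_entropy(x):
    """Shannon entropy (bits) of the HVG degree distribution."""
    _, counts = np.unique(hvg_degrees(x), return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

The double loop is O(n^2) and meant only to make the visibility criterion explicit; real EEG-length series would need the standard divide-and-conquer or stack-based constructions.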
Entropy-Based Voltage Fault Diagnosis of Battery Systems for Electric Vehicles
Directory of Open Access Journals (Sweden)
Peng Liu
2018-01-01
Full Text Available The battery is a key component and the major fault source in electric vehicles (EVs). Since the power battery is one of the core technologies of EVs, ensuring battery safety through effective diagnosis and fault prediction is of great significance. This paper proposes a voltage fault diagnosis mechanism using entropy theory, demonstrated on an EV with a multiple-cell battery system under actual operating conditions. A preliminary analysis, after collecting and preprocessing typical data periods from the Operation Service and Management Center for Electric Vehicle (OSMC-EV) in Beijing, shows that overvoltage faults in Li-ion battery cells can be observed from the voltage curves. To further locate abnormal cells and predict faults, an entropy weight method is established to calculate objective weights, which reduces subjectivity and improves reliability. The result clearly identifies abnormal cell voltages. The proposed diagnostic model can be used for EV real-time diagnosis without laboratory testing. Contrastive analysis shows that it is more effective than traditional methods.
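The entropy weight method referred to above is a standard construction: criteria whose values vary more across samples carry more information and receive larger weights. A minimal sketch, assuming a non-negative decision matrix (here the rows could be sampling instants and the columns battery cells):

```python
import numpy as np

def entropy_weights(X):
    """Objective weights for the columns of a non-negative decision
    matrix X (n_samples x n_criteria) via the entropy weight method."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    P = X / X.sum(axis=0)                       # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)  # 0 * log 0 := 0
    e = -(P * logs).sum(axis=0) / np.log(n)     # entropy of each column
    d = 1.0 - e                                 # degree of divergence
    return d / d.sum()                          # normalized weights
```

A column that is identical across all samples has maximal entropy and therefore receives (near-)zero weight; the columns that actually discriminate dominate the final score.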
Microscopic insights into the NMR relaxation based protein conformational entropy meter
Kasinath, Vignesh; Sharp, Kim A.; Wand, A. Joshua
2013-01-01
Conformational entropy is a potentially important thermodynamic parameter contributing to protein function. Quantitative measures of conformational entropy are necessary for an understanding of its role but have been difficult to obtain. An empirical method that utilizes changes in conformational dynamics as a proxy for changes in conformational entropy has recently been introduced. Here we probe the microscopic origins of the link between conformational dynamics and conformational entropy using molecular dynamics simulations. Simulations of seven proteins gave an excellent correlation with measures of side-chain motion derived from NMR relaxation. The simulations show that the motion of methyl-bearing side chains is sufficiently coupled to that of other side chains to serve as an excellent reporter of the overall side-chain conformational entropy. These results tend to validate the use of experimentally accessible measures of methyl motion - the NMR-derived generalized order parameters - as a proxy from which to derive changes in protein conformational entropy. PMID:24007504
Tsallis Entropy for Geometry Simplification
Directory of Open Access Journals (Sweden)
Miguel Chover
2011-09-01
Full Text Available This paper presents a study and a comparison of the use of different information-theoretic measures for polygonal mesh simplification. Generalized measures from information theory, such as the Havrda–Charvát–Tsallis entropy and mutual information, have been applied. These measures have been used in the error metric of a surface simplification algorithm. We demonstrate that these measures are useful for simplifying three-dimensional polygonal meshes. We have also compared these metrics with the error metrics used in a geometry-based method and in an image-driven method. Quantitative results of the comparison are presented using the root-mean-square error (RMSE).
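The Havrda–Charvát–Tsallis entropy used in such error metrics generalizes Shannon entropy with a parameter q. A minimal sketch of the discrete form, recovering Shannon entropy (in nats) as q → 1:

```python
import math
import numpy as np

def tsallis_entropy(p, q):
    """Havrda-Charvat-Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1)
    of a discrete distribution p; the q -> 1 limit is Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-(p * np.log(p)).sum())  # Shannon limit
    return float((1.0 - (p ** q).sum()) / (q - 1.0))
```

Varying q tunes how strongly rare versus common events contribute, which is the knob such generalized error metrics exploit.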
Directory of Open Access Journals (Sweden)
Yingjun Zhang
2014-01-01
Full Text Available Hesitant fuzzy sets have become an important tool for dealing with multiple attribute decision making (MADM) problems, especially in decision situations where only some membership values are possible for an alternative on each attribute. However, determining attribute weights in hesitant fuzzy MADM is still an open problem. In this paper, we propose an objective weighting approach based on Shannon information entropy, which expresses the relative intensities of attribute importance to signify the average intrinsic information transmitted to the decision maker. Furthermore, we construct a hesitant fuzzy MADM approach based on the TOPSIS method and a weighted correlation coefficient proposed in this paper. Finally, we use a supplier selection example to validate the objective attribute weighting method and the proposed hesitant fuzzy MADM approach.
Nonuniform Sparse Data Clustering Cascade Algorithm Based on Dynamic Cumulative Entropy
Directory of Open Access Journals (Sweden)
Ning Li
2016-01-01
Full Text Available A small amount of prior knowledge and randomly chosen initial cluster centers have a direct impact on the accuracy of iterative clustering algorithms. In this paper we propose a new algorithm that computes initial cluster centers for k-means clustering and the best number of clusters with little prior knowledge, and optimizes the clustering result. It constructs a Euclidean-distance control factor based on the sparseness of the aggregation density to select the initial cluster centers of nonuniform sparse data, and obtains initial data clusters by multidimensional diffusion density distribution. A multiobjective clustering approach based on dynamic cumulative entropy is adopted to optimize the initial data clusters and the number of clusters. The experimental results show that the newly proposed algorithm performs well in obtaining initial cluster centers for the k-means algorithm and effectively improves the clustering accuracy of nonuniform sparse data by about 5%.
Digital Image Stabilization Method Based on Variational Mode Decomposition and Relative Entropy
Directory of Open Access Journals (Sweden)
Duo Hao
2017-11-01
Full Text Available Cameras mounted on vehicles frequently suffer from image shake due to the vehicles' motions. To remove jitter motions and preserve intentional motions, a hybrid digital image stabilization method is proposed that uses variational mode decomposition (VMD) and relative entropy (RE). In this paper, the global motion vector (GMV) is initially decomposed into several narrow-banded modes by VMD. REs, which quantify the difference between the probability distributions of two modes, are then calculated to identify the intentional and jitter motion modes. Finally, the summation of the jitter motion modes constitutes the jitter motions, whereas subtracting this sum from the GMV yields the intentional motions. The proposed stabilization method is compared with several known methods, namely, the median filter (MF), Kalman filter (KF), wavelet decomposition (WD) method, empirical mode decomposition (EMD)-based method, and an enhanced EMD-based method, to evaluate stabilization performance. Experimental results show that the proposed method outperforms the other stabilization methods.
Directory of Open Access Journals (Sweden)
Yuntao Zhao
2016-01-01
Full Text Available DDoS attacks can prevent legitimate users from accessing a service by consuming the resources of the target nodes, posing a significant threat to network and service availability. DDoS traffic perception is therefore the premise and foundation of whole-system security. In this paper, a method of DDoS traffic perception for SOA networks based on time-united conditional entropy is proposed. Exploiting the many-to-one mapping between the source IP addresses and destination IP addresses of DDoS attacks, the traffic characteristics of services are analyzed based on conditional entropy. By introducing the time dimension, the algorithm gains the ability to perceive DDoS attacks on SOA services. Simulation results show that the novel method can realize DDoS traffic perception by analyzing abrupt variations of conditional entropy in the time dimension.
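Conditional entropy captures the many-to-one structure of DDoS traffic: when many sources all target one destination, H(dst | src) collapses toward zero. A minimal sketch over observed (source, destination) pairs; the paper's time-united variant would additionally slice these observations into time windows, which is omitted here:

```python
import math
from collections import Counter

def conditional_entropy(pairs):
    """H(dst | src) in bits from a list of (src_ip, dst_ip) observations:
    H(dst|src) = -sum p(s,d) * log2 p(d|s)."""
    n = len(pairs)
    joint = Counter(pairs)                    # counts of (src, dst)
    src = Counter(s for s, _ in pairs)        # counts of src alone
    h = 0.0
    for (s, d), c in joint.items():
        h -= (c / n) * math.log2(c / src[s])  # p(d|s) = c(s,d) / c(s)
    return h
```

An abrupt drop of this value in successive time windows is the kind of signature such detectors alarm on.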
Nonsymmetric entropy and maximum nonsymmetric entropy principle
International Nuclear Information System (INIS)
Liu Chengshi
2009-01-01
Under the frame of a statistical model, the concept of nonsymmetric entropy, which generalizes the concepts of Boltzmann's entropy and Shannon's entropy, is defined. The maximum nonsymmetric entropy principle is proved. Some important distribution laws, such as the power law, can be derived naturally from this principle. In particular, nonsymmetric entropy is more convenient than other entropies, such as Tsallis's entropy, for deriving power laws.
Al-Abadi, Alaa M; Shahid, Shamsuddin
2015-09-01
In this study, index-of-entropy and catastrophe-theory methods were used for demarcating groundwater potential in an arid region using weighted linear combination techniques in a geographical information system (GIS) environment. A case study from the Badra area in the eastern part of central Iraq was analyzed and discussed. Six factors believed to influence groundwater occurrence, namely elevation, slope, aquifer transmissivity and storativity, soil, and distance to fault, were prepared as raster thematic layers to facilitate integration into the GIS environment. The factors were chosen based on the availability of data and the local conditions of the study area. Both techniques were used for computing the weights and assigning the ranks required for applying the weighted linear combination approach. The results of applying both models indicated that the most influential groundwater occurrence factors were slope and elevation. The other factors have relatively smaller weights, implying that they play a minor role in groundwater occurrence. The groundwater potential index (GPI) values for both models were classified using the natural-break classification scheme into five categories: very low, low, moderate, high, and very high. For validation of the generated GPI, relative operating characteristic (ROC) curves were used. According to the obtained area under the curve, the catastrophe model with 78% prediction accuracy was found to perform better than the entropy model with 77% prediction accuracy. The overall results indicated that both models have good capability for predicting groundwater potential zones.
Directory of Open Access Journals (Sweden)
Aaron Fong
2013-02-01
Full Text Available Previous theoretical studies of Mislow's doubly-bridged biphenyl ketone 1 and dihydrodimethylphenanthrene 2 have determined significant entropic contributions to their normal (1) and inverse (2) conformational kinetic isotope effects (CKIEs). To broaden our investigation, we have used density functional methods to characterize the potential energy surfaces and vibrational frequencies for ground and transition structures of additional systems with measured CKIEs, including [2.2]-metaparacyclophane-d (3), 1,1'-binaphthyl (4), 2,2'-dibromo-[1,1'-biphenyl]-4,4'-dicarboxylic acid (5), and the 2-(N,N,N-trimethyl)-2'-(N,N-dimethyl)-diaminobiphenyl cation (6). We have also computed CKIEs in a number of systems whose experimental CKIEs are unknown. These include analogs of 1 in which the C=O groups have been replaced with CH2 (7), O (8), and S (9) atoms and ring-expanded variants of 2 containing CH2 (10), O (11), S (12), or C=O (13) groups. Vibrational entropy contributes to the CKIEs in all of these systems with the exception of cyclophane 3, whose isotope effect is predicted to be purely enthalpic in origin and whose Bigeleisen-Mayer ZPE term is equivalent to ΔΔH‡. There is variable correspondence between these terms in the other molecules studied, thus identifying additional examples of systems in which the Bigeleisen-Mayer formalism does not correlate with ΔH/ΔS dissections.
An Entropy-based gene selection method for cancer classification using microarray data
Directory of Open Access Journals (Sweden)
Krishnan Arun
2005-03-01
Full Text Available Abstract Background: Accurate diagnosis of cancer subtypes remains a challenging problem. Building classifiers based on gene expression data is a promising approach; yet the selection of non-redundant but relevant genes is difficult. The selected gene set should be small enough to allow diagnosis even in regular clinical laboratories and ideally identify genes involved in cancer-specific regulatory pathways. Here an entropy-based method is proposed that selects genes related to the different cancer classes while at the same time reducing the redundancy among the genes. Results: The present study identifies a subset of features by maximizing the relevance and minimizing the redundancy of the selected genes. A merit called normalized mutual information is employed to measure the relevance and the redundancy of the genes. In order to find a more representative subset of features, an iterative procedure is adopted that incorporates an initial clustering followed by data partitioning and the application of the algorithm to each of the partitions. A leave-one-out approach then selects the most commonly selected genes across all the different runs, and the gene selection algorithm is applied again to pare down the list of selected genes until a minimal subset is obtained that gives a satisfactory classification accuracy. The algorithm was applied to three different data sets, and the results were compared to work done by others using the same data sets. Conclusion: This study presents an entropy-based iterative algorithm for selecting genes from microarray data that are able to classify various cancer subtypes with high accuracy. In addition, the feature set obtained is very compact; that is, the redundancy between genes is reduced to a large extent. This implies that classifiers can be built with a smaller subset of genes.
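The relevance/redundancy trade-off described above can be sketched with a greedy selector over discretized expression vectors. The normalization of mutual information below (by the mean of the two marginal entropies) and the helper names are illustrative assumptions; the paper's exact merit and iteration scheme may differ:

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy (bits) of a sequence of discrete symbols."""
    n = len(xs)
    return -sum(c / n * math.log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def normalized_mi(xs, ys):
    """MI scaled to [0, 1]: 2*I(X;Y) / (H(X) + H(Y))."""
    hx, hy = entropy(xs), entropy(ys)
    if hx == 0 or hy == 0:
        return 0.0
    return 2 * mutual_information(xs, ys) / (hx + hy)

def select_genes(genes, labels, k):
    """Greedy max-relevance / min-redundancy selection.
    genes: dict name -> discretized expression vector."""
    chosen = []
    while len(chosen) < k:
        def score(g):
            rel = normalized_mi(genes[g], labels)
            red = (sum(normalized_mi(genes[g], genes[c]) for c in chosen)
                   / len(chosen)) if chosen else 0.0
            return rel - red
        chosen.append(max((g for g in genes if g not in chosen), key=score))
    return chosen
```

Each round picks the gene most informative about the class labels, penalized by its average similarity to genes already selected.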
Combined SPHARM-PDM and entropy-based particle systems shape analysis framework.
Paniagua, Beatriz; Bompard, Lucile; Cates, Josh; Whitaker, Ross; Datar, Manasi; Vachet, Clement; Styner, Martin
2012-03-23
The NA-MIC SPHARM-PDM Toolbox is an automated set of tools for 3D structural statistical shape analysis. SPHARM-PDM solves the correspondence problem by defining a first-order-ellipsoid-aligned, uniform spherical parameterization for each object, with correspondence established at equivalently parameterized points. However, SPHARM correspondence has been shown to be inadequate for some biological shapes that are not well described by a uniform spherical parameterization. Entropy-based particle systems compute correspondence by representing surfaces as discrete point sets and do not rely on any inherent parameterization. However, they are sensitive to initialization and have little ability to recover from initial errors. By combining both methodologies we compute reliable correspondences in topologically challenging biological shapes. Diverse cohorts of subcortical structures, obtained from MR brain images, were used. The SPHARM-PDM shape analysis toolbox was used to compute point-based correspondence models that were then used as initializing particles for the entropy-based particle systems. The combined framework was implemented as a stand-alone Slicer3 module, which works as an end-to-end shape analysis module. The combined SPHARM-PDM-Particle framework has been shown to improve correspondence in the example dataset over the conventional SPHARM-PDM toolbox. The work presented in this paper demonstrates a two-sided improvement for the scientific community: 1) finding good correspondences among spherically topological shapes, which can be used in many morphometry studies, and 2) offering an end-to-end solution that facilitates access to the shape analysis framework for users without computer expertise.
Assessment of sustainable urban transport development based on entropy and unascertained measure.
Li, Yancang; Yang, Jing; Shi, Huawang; Li, Yijie
2017-01-01
To find a more effective method for the assessment of sustainable urban transport development, a comprehensive assessment model was established based on the unascertained measure. Considering the factors influencing urban transport development, comprehensive assessment indexes were selected, including urban economic development, transport demand, environmental quality, and energy consumption, and an assessment system for sustainable urban transport development was proposed. In view of the different influencing factors of urban transport development, the index weights were calculated through the entropy weight coefficient method. Qualitative and quantitative analyses were conducted according to the actual conditions. Then, the grade was obtained by using the credible degree recognition criterion, from which the urban transport development level can be determined. Finally, a comprehensive assessment method for urban transport development was introduced. Application in practice showed that the method can be used reasonably and effectively for the comprehensive assessment of urban transport development.
Feature extraction and learning using context cue and Rényi entropy based mutual information
DEFF Research Database (Denmark)
Pan, Hong; Olsen, Søren Ingvor; Zhu, Yaping
2015-01-01
Feature extraction and learning play a critical role for visual perception tasks. We focus on improving the robustness of the kernel descriptors (KDES) by embedding context cues and further learning a compact and discriminative feature codebook for feature reduction using Rényi entropy based mutu....... Experimental results show that our method has promising potential for visual object recognition and detection applications....... as the information about the underlying labels of the CKD using CSQMI. Thus the resulting codebook and reduced CKD are discriminative. We verify the effectiveness of our method on several public image benchmark datasets such as YaleB, Caltech-101 and CIFAR-10, as well as a challenging chicken feet dataset of our own...
Directory of Open Access Journals (Sweden)
Rabha W. Ibrahim
2015-07-01
Full Text Available Image splicing is a common operation in image forgery. Different image splicing detection techniques have been developed to regain people's trust. This study introduces a texture enhancement technique involving the use of fractional differential masks based on the Machado entropy. The masks slide over the tampered image, and each pixel of the tampered image is convolved with the fractional mask weight window in eight directions. The fractional differential texture descriptors are then extracted using the gray-level co-occurrence matrix for image splicing detection. A support vector machine is used as the classifier that distinguishes between authentic and spliced images. Results show that the improvements achieved by the proposed algorithm are compatible with other splicing detection methods.
Permutation entropy analysis of financial time series based on Hill's diversity number
Zhang, Yali; Shang, Pengjian
2017-12-01
In this paper, the permutation entropy based on Hill's diversity number (Nn,r) is introduced as a new way to assess the complexity of a complex dynamical system such as a stock market. We test the performance of this method with simulated data. Results show that Nn,r with appropriate parameters is more sensitive to changes in the system and describes the trends of complex systems clearly. In addition, we study stock closing price series comprising six indices (three US stock indices and three Chinese stock indices) over different periods; Nn,r can quantify the changes in complexity of stock market data. Moreover, we obtain richer information from Nn,r, including some properties of the differences between the US and Chinese stock indices.
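Hill's diversity number of order r is defined on a probability distribution as (Σ p_i^r)^{1/(1−r)}, reducing to the exponential of Shannon entropy as r → 1. A minimal sketch applying it to the ordinal-pattern distribution that underlies permutation entropy; the paper's exact Nn,r construction may differ:

```python
import numpy as np
from collections import Counter

def ordinal_distribution(x, m):
    """Relative frequencies of the ordinal patterns of order m
    found in the series x (the ingredient of permutation entropy)."""
    x = np.asarray(x, dtype=float)
    pats = Counter(tuple(np.argsort(x[i:i + m]))
                   for i in range(len(x) - m + 1))
    total = sum(pats.values())
    return np.array([c / total for c in pats.values()])

def hill_number(p, r):
    """Hill's diversity number of order r; the r -> 1 limit is
    exp(Shannon entropy), i.e. the 'effective number' of patterns."""
    p = np.asarray(p, dtype=float)
    if abs(r - 1.0) < 1e-12:
        return float(np.exp(-(p * np.log(p)).sum()))
    return float((p ** r).sum() ** (1.0 / (1.0 - r)))
```

A monotone price series yields a single ordinal pattern (diversity 1), while a complex series spreads probability over many patterns, pushing the Hill number toward m!.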
A Review of Solid-Solution Models of High-Entropy Alloys Based on Ab Initio Calculations
Directory of Open Access Journals (Sweden)
Fuyang Tian
2017-11-01
Full Text Available Just as XRD is important in experiments, ab initio calculations have been applied as a powerful theoretical tool to predict potential new materials and investigate the intrinsic properties of materials. As typical solid-solution materials, high-entropy alloys (HEAs) carry a large degree of uncertainty, which makes the application of ab initio calculations to HEAs difficult. The present review focuses on the available ab initio based solid-solution models (virtual lattice approximation, coherent potential approximation, special quasirandom structure, similar local atomic environment, maximum-entropy method, and hybrid Monte Carlo/molecular dynamics) and their applications and limits in single-phase HEAs.
Kim, Younyoung; Park, Jungdae; Lee, Chaeyoung
2009-10-01
We conducted a simultaneous analysis of candidate genetic loci for their genotypic association with susceptibility to vascular dementia (VaD) to put forth the best model for predicting genetic susceptibility to VaD. Individual-locus effects and their epistatic effects on susceptibility to VaD were simultaneously assessed by multifactor dimensionality reduction (MDR) and an entropy-based method. The 23 loci in 12 genes were studied in 207 VaD patients and 207 age- and sex-matched controls. The MDR analysis revealed that the best single-locus candidate model included angiotensinogen (AGT) Thr235Met with a testing accuracy (TA) of 58.31%; the best two-locus candidate model included AGT Thr235Met and transforming growth factor-beta1 Pro10Leu with a TA of 58.06%; the best three-locus candidate model was not significant (P>0.05); and the best four-locus candidate model included transforming growth factor-beta1 Pro10Leu, AGT Thr235Met, sterol regulatory element binding protein 2 G34995T, and leukemia inhibitory factor T4524G with a TA of 57.13% (P<0.05). The best four-locus model was, however, still in question because of inconsistent best-model selection by cross-validation. The synergistic epistatic effect of the best two-locus model was proven by entropy-based estimation. The best predictor of genetic susceptibility to VaD was the single-locus model of AGT. The best two-locus model, reflecting epistasis, could also be employed for predicting susceptibility. Further studies on the epistasis are needed to elucidate the underlying mechanisms.
Distance-Based Configurational Entropy of Proteins from Molecular Dynamics Simulations.
Fogolari, Federico; Corazza, Alessandra; Fortuna, Sara; Soler, Miguel Angel; VanSchouwen, Bryan; Brancolini, Giorgia; Corni, Stefano; Melacini, Giuseppe; Esposito, Gennaro
2015-01-01
Estimation of configurational entropy from molecular dynamics trajectories is a difficult task which is often performed using quasi-harmonic or histogram analysis. An entirely different approach, proposed recently, estimates the local density distribution around each conformational sample by measuring the distance from its nearest neighbors. In this work we show that this theoretically well-grounded method can be easily applied to estimate entropy from conformational sampling. We consider a set of systems that are representative of important biomolecular processes. In particular: reference entropies for amino acids in unfolded proteins are obtained from a database of residues not participating in secondary structure elements; the conformational entropy of folding of β2-microglobulin is computed from molecular dynamics simulations using reference entropies for the unfolded state; backbone conformational entropy is computed from molecular dynamics simulations of four different states of the EPAC protein and compared with order parameters (often used as a measure of entropy); and the conformational and rototranslational entropy of binding is computed from simulations of 20 tripeptides bound to the peptide-binding protein OppA and of β2-microglobulin bound to a citrate-coated gold surface. This work shows the potential of the method for the most representative biological processes involving proteins, and provides a valuable alternative, particularly in the cases shown, where other approaches are problematic.
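The nearest-neighbor idea mentioned above is usually credited to Kozachenko and Leonenko: the distance from each sample to its nearest neighbor gives a local density estimate, and hence an estimate of differential entropy. A minimal brute-force sketch; the digamma helper is a crude approximation and the cited work's exact estimator may differ in its correction terms:

```python
import math
import numpy as np

def _digamma(x):
    # recurrence plus short asymptotic series; adequate for this sketch
    r = 0.0
    while x < 6:
        r -= 1.0 / x
        x += 1
    return r + math.log(x) - 1.0 / (2 * x) - 1.0 / (12 * x * x)

def kl_entropy(samples):
    """Kozachenko-Leonenko nearest-neighbor estimate of differential
    entropy (nats) from an (N, d) array of samples."""
    X = np.asarray(samples, dtype=float)
    if X.ndim == 1:
        X = X[:, None]
    n, d = X.shape
    # brute-force distance from each point to its nearest neighbor
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(D, np.inf)
    eps = D.min(axis=1)
    # log-volume of the d-dimensional unit ball
    log_cd = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    return _digamma(n) - _digamma(1) + log_cd + d * float(np.log(eps).mean())
```

The O(N^2) distance matrix is only for clarity; trajectory-scale data would use a k-d tree for the neighbor search.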
Han, Yongming; Long, Chang; Geng, Zhiqiang; Zhang, Keyu
2018-01-01
Environmental protection and carbon emission reduction play a crucial role in sustainable development. However, environmental efficiency analysis and evaluation based on the traditional data envelopment analysis (DEA) cross model is subjective and inaccurate, because all elements in a column or a row of the cross evaluation matrix (CEM) in the traditional DEA cross model are given the same weight. Therefore, this paper proposes an improved environmental DEA cross model based on information entropy to analyze and evaluate the carbon emissions of industrial departments in China. The information entropy is applied to build an entropy distance based on the turbulence of the whole system and to calculate the weights in the CEM of the environmental DEA cross model in a dynamic way. The theoretical results, obtained using Monte Carlo simulation, show that the new weights constructed from the information entropy are unique and globally optimal. Finally, compared with the traditional environmental DEA and DEA cross models, the improved environmental DEA cross model shows better efficiency discrimination ability on the data of industrial departments in China. Moreover, the proposed model can quantify the carbon emission reduction potential of industrial departments to improve energy efficiency. Copyright © 2017 Elsevier Ltd. All rights reserved.
Entropy-based complexity measures for gait data of patients with Parkinson's disease
Afsar, Ozgur; Tirnakli, Ugur; Kurths, Juergen
2016-02-01
Shannon, Kullback-Leibler, and Klimontovich's renormalized entropies are applied as three different complexity measures on gait data of patients with Parkinson's disease (PD) and healthy control group. We show that the renormalized entropy of variability of total reaction force of gait is a very efficient tool to compare patients with respect to disease severity. Moreover, it is a good risk predictor such that the sensitivity, i.e., the percentage of patients with PD who are correctly identified as having PD, increases from 25% to 67% while the Hoehn-Yahr stage increases from 2.5 to 3.0 (this stage goes from 0 to 5 as the disease severity increases). The renormalized entropy method for stride time variability of gait is found to correctly identify patients with a sensitivity of 80%, while the Shannon entropy and the Kullback-Leibler relative entropy can do this with a sensitivity of only 26.7% and 13.3%, respectively.
International Nuclear Information System (INIS)
Cao, Guangxi; Zhang, Qi; Li, Qingchen
2017-01-01
Highlights: • Mutual information is used as the edge weights of nodes instead of PCC, which overcomes the shortcomings of linear correlation functions. • SGD turns into a new cluster center and gradually becomes a point connecting the Asian and European clusters during and after the US sub-prime crisis. • Liang's entropy theory, which has not been adopted before in the global foreign exchange market, is considered. - Abstract: The foreign exchange (FX) market is a typical complex dynamic system under the background of exchange rate marketization reform and is an important part of the financial market. This study aims to generate an international FX network based on complex network theory. This study employs the mutual information method to judge the nonlinear characteristics of 54 major currencies in international FX markets. Through this method, we find that the FX network possesses a small average path length and a large clustering coefficient under different thresholds and that it exhibits small-world characteristics as a whole. Results show that the relationship between FX rates is close. Volatility can quickly transfer in the whole market, and the FX volatility of influential individual states transfers at a fast pace and a large scale. The period from July 21, 2005 to March 31, 2015 is subdivided into three sub-periods (i.e., before, during, and after the US sub-prime crisis) to analyze the topology evolution of FX markets using the maximum spanning tree approach. Results show that the USD gradually lost its core position, EUR remained a stable center, and the center of the Asian cluster became unstable. Liang's entropy theory is used to analyze the causal relationship between the four large clusters of the world.
Hypoglycemia-Related Electroencephalogram Changes Assessed by Multiscale Entropy
DEFF Research Database (Denmark)
Fabris, C.; Sparacino, G.; Sejling, A. S.
2014-01-01
physiopathological conditions have never been assessed in hypoglycemia. The present study investigates if properties of the EEG signal measured by nonlinear entropy-based algorithms are altered in a significant manner when a state of hypoglycemia is entered. Subjects and Methods: EEG was acquired from 19 patients...... derivation in the two glycemic intervals was assessed using the multiscale entropy (MSE) approach, obtaining measures of sample entropy (SampEn) at various temporal scales. The comparison of how signal irregularity measured by SampEn varies as the temporal scale increases in the two glycemic states provides...
Lechner, Joseph H.
1999-10-01
This report describes two classroom activities that help students visualize the abstract concept of entropy and apply the second law of thermodynamics to real situations. (i) A sealed "rainbow tube" contains six smaller vessels, each filled with a different brightly colored solution (low entropy). When the tube is inverted, the solutions mix together and react to form an amorphous precipitate (high entropy). The change from low entropy to high entropy is irreversible as long as the tube remains sealed. (ii) When U.S. currency is withdrawn from circulation, intact bills (low entropy) are shredded into small fragments (high entropy). Shredding is quick and easy; the reverse process is clearly nonspontaneous. It is theoretically possible, but time-consuming and energy-intensive, to reassemble one bill from a pile that contains fragments of hundreds of bills. We calculate the probability P of drawing pieces of only one specific bill from a mixture containing one pound of bills, each shredded into n fragments. This result can be related to Boltzmann's entropy formula S = k ln W.
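The drawing probability described above can be worked through numerically; the bill count (about 454 one-gram bills per pound) and the fragment count below are illustrative assumptions:

```python
import math

def prob_one_bill(num_bills, n_frag):
    """P(drawing n_frag fragments that all belong to one specific bill),
    sampling without replacement from num_bills * n_frag fragments."""
    total = num_bills * n_frag
    p = 1.0
    for i in range(n_frag):
        p *= (n_frag - i) / (total - i)
    return p

# Boltzmann analogue: S = k ln W, with W = 1/P as a crude microstate count
k_B = 1.380649e-23
p = prob_one_bill(num_bills=454, n_frag=16)  # ~1 lb of bills at ~1 g each (assumption)
S = k_B * math.log(1.0 / p)
```

Even for this modest pile, the probability is astronomically small, which is the point of the demonstration.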
Directory of Open Access Journals (Sweden)
Lv Xiaogui
2006-11-01
Full Text Available Abstract Background There are many differences between healthy tissue and growing tumor tissue, including metabolic, structural and thermodynamic differences. Both structural and thermodynamic differences can be used to follow the entropy differences in cancerous and normal tissue. Entropy production is a bilinear form of the rates of irreversible processes and the corresponding "generalized forces". Entropy production due to various dissipation mechanisms based on temperature differences, chemical potential gradient, chemical affinity, viscous stress and exerted force is a promising tool for calculations relating to potential targets for tumor isolation and demarcation. Methods The relative importance of five forms of entropy production was assessed through mathematical estimation. Using our mathematical model we demonstrated that the rate of entropy production by a cancerous cell is always higher than that of a healthy cell apart from the case of the application of external energy. Different rates of entropy production by two kinds of cells influence the direction of entropy flow between the cells. Entropy flow from a cancerous cell to a healthy cell transfers information regarding the cancerous cell and propagates its invasive action to the healthy tissues. To change the direction of entropy flow, in addition to designing certain biochemical pathways to reduce the rate of entropy production by cancerous cells, we suggest supplying external energy to the tumor area, changing the relative rate of entropy production by the two kinds of cells and leading to a higher entropy accumulation in the surrounding normal cells than in the tumorous cells. Conclusion Through the use of mathematical models it was quantitatively demonstrated that when no external force field is applied, the rate of entropy production of cancerous cells is always higher than that of healthy cells. However, when the external energy of square wave electric pulses is applied to
Information Distances versus Entropy Metric
Directory of Open Access Journals (Sweden)
Bo Hu
2017-06-01
Full Text Available Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures are different from the entropy metric, as the former are based on Kolmogorov complexity and the latter on Shannon entropy. However, for any computable probability distribution, up to a constant, the expected value of Kolmogorov complexity equals the Shannon entropy. We study an analogous relationship between entropy and information distance. We also study the relationship between entropy and the normalized versions of information distances.
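Since Kolmogorov complexity is uncomputable, applied work often approximates information distance with a real compressor; a common sketch (not taken from this paper) is the normalized compression distance:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance, a computable stand-in for the
    (uncomputable) normalized information distance."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 20
b = b"pack my box with five dozen liquor jugs " * 20
assert ncd(a, a) < ncd(a, b)   # a string is closer to itself than to another
```

In practice the compressor's imperfections mean NCD can slightly exceed 1, but the ordering of distances is usually informative.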
On Measuring the Complexity of Networks: Kolmogorov Complexity versus Entropy
Directory of Open Access Journals (Sweden)
Mikołaj Morzy
2017-01-01
Full Text Available One of the most popular methods of estimating the complexity of networks is to measure the entropy of network invariants, such as adjacency matrices or degree sequences. Unfortunately, entropy and all entropy-based information-theoretic measures have several vulnerabilities. These measures are neither independent of a particular representation of the network nor able to capture the properties of the generative process which produces the network. Instead, we advocate the use of algorithmic entropy as the basis of a complexity definition for networks. Algorithmic entropy (also known as Kolmogorov complexity, or K-complexity for short) evaluates the complexity of the description required for a lossless recreation of the network. This measure is not affected by a particular choice of network features and it does not depend on the method of network representation. We perform experiments on Shannon entropy and K-complexity for gradually evolving networks. The results of these experiments point to K-complexity as the more robust and reliable measure of network complexity. The original contribution of the paper includes the introduction of several new entropy-deceiving networks and the empirical comparison of entropy and K-complexity as fundamental quantities for constructing complexity measures for networks.
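The contrast between an invariant-based entropy and a compression-based proxy for K-complexity can be illustrated on toy graphs (a sketch under our own setup, not the authors' experiments):

```python
import zlib
from collections import Counter

import numpy as np

def degree_entropy(adj):
    """Shannon entropy (bits) of the degree distribution, a network invariant."""
    degs = adj.sum(axis=1)
    counts = np.array(list(Counter(degs.tolist()).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def k_complexity_proxy(adj):
    """Compressed size of the adjacency bitstring, a crude Kolmogorov proxy."""
    return len(zlib.compress(np.packbits(adj).tobytes()))

n = 128
rng = np.random.default_rng(1)
random_adj = (rng.random((n, n)) < 0.5).astype(np.uint8)       # incompressible
ring_adj = np.roll(np.eye(n, dtype=np.uint8), 1, axis=1)       # highly structured

assert degree_entropy(ring_adj) == 0.0                         # invariant sees nothing
assert k_complexity_proxy(ring_adj) < k_complexity_proxy(random_adj)
```

The regular ring has zero degree-sequence entropy yet is also trivially compressible, while the random graph is incompressible; on the paper's entropy-deceiving networks the two measures disagree more sharply.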
Directory of Open Access Journals (Sweden)
Leonid M. Martyushev
2015-06-01
Full Text Available The entropy production (inside the volume bounded by a photosphere) of main-sequence stars, subgiants, giants, and supergiants is calculated based on B–V photometry data. A non-linear inverse relationship of thermodynamic fluxes and forces as well as an almost constant specific (per volume) entropy production of main-sequence stars (for 95% of stars, this quantity lies within 0.5 to 2.2 of the corresponding solar magnitude) is found. The obtained results are discussed from the perspective of known extreme principles related to entropy production.
Entropy Estimation for Optical PUFs Based on Context-Tree Weighting Methods
Tuyls, Pim; Skoric, Boris; Ignatenko, Tanya; Willems, Frans; Schrijen, Geert-Jan
In this chapter we discuss estimation of the secrecy rate of fuzzy sources, more specifically of optical physical unclonable functions (PUFs), using context-tree weighting (CTW) methods [291]. We show that the entropy of a stationary 2-D source is a limit of a series of conditional entropies [6] and extend this result to the conditional entropy of one 2-D source given another one. Furthermore, we show that the general CTW-method approaches the source entropy also in the 2-D stationary case. Moreover, we generalize Maurer's result [196] to the ergodic case, thus showing that we get realistic estimates of the achievable secrecy rate. Finally, we use these results to estimate the secrecy rate of speckle patterns from optical PUFs.
Detection of SNP effects on feed conversion ratio in pigs based on entropy approach
Directory of Open Access Journals (Sweden)
Henry Reyer
2016-09-01
Full Text Available The objectives of the study were to classify SNPs according to their contribution to the feed conversion ratio and to indicate interactions between the most informative SNPs using entropy analysis. The records of 1296 pigs were included. Two selection criteria for molecular data were applied: a call rate of 0.95 and a minor allele frequency of 0.05. After this, 50 951 SNPs were included in the entropy analysis. For each SNP, the entropy and conditional entropy were estimated. For the interaction analyses, the most informative SNPs were selected. For each pair of SNPs, the mutual information was assessed. A majority of the loci studied showed relatively small contributions. The most informative SNPs are mainly located on chromosomes 1, 4, 7, 9 and 14, whereas important interactions between SNP pairs were detected on chromosomes 1, 14, 15 and 16. High mutual information was registered for SNPs located nearby.
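The entropy, conditional entropy, and mutual-information quantities used in the study can be sketched for genotype vectors coded 0/1/2 (the simulated SNPs below are illustrative, not the pig data):

```python
import numpy as np

def entropy(x):
    """Shannon entropy (bits) of a discrete genotype vector."""
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def conditional_entropy(y, x):
    """H(Y | X) = H(X, Y) - H(X), with genotype pairs encoded as x*3 + y."""
    return entropy(x * 3 + y) - entropy(x)

def mutual_info(x, y):
    """I(X; Y) = H(Y) - H(Y | X)."""
    return entropy(y) - conditional_entropy(y, x)

rng = np.random.default_rng(2)
snp_a = rng.integers(0, 3, size=1296)                    # genotypes coded 0/1/2
snp_b = snp_a.copy()
flip = rng.random(1296) < 0.1
snp_b[flip] = rng.integers(0, 3, size=int(flip.sum()))   # nearby SNP in high LD
snp_c = rng.integers(0, 3, size=1296)                    # unlinked SNP
assert mutual_info(snp_a, snp_b) > mutual_info(snp_a, snp_c)
```

High mutual information for the linked pair mirrors the study's finding that nearby SNPs share information.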
Entropy-based artificial viscosity stabilization for non-equilibrium Grey Radiation-Hydrodynamics
Energy Technology Data Exchange (ETDEWEB)
Delchini, Marc O., E-mail: delchinm@email.tamu.edu; Ragusa, Jean C., E-mail: jean.ragusa@tamu.edu; Morel, Jim, E-mail: jim.morel@tamu.edu
2015-09-01
The entropy viscosity method is extended to the non-equilibrium Grey Radiation-Hydrodynamic equations. The method employs a viscous regularization to stabilize the numerical solution. The artificial viscosity coefficient is modulated by the entropy production and peaks at shock locations. The added dissipative terms are consistent with the entropy minimum principle. A new functional form of the entropy residual, suitable for the Radiation-Hydrodynamic equations, is derived. We demonstrate that the viscous regularization preserves the equilibrium diffusion limit. The equations are discretized with a standard Continuous Galerkin Finite Element Method and a fully implicit temporal integrator within the MOOSE multiphysics framework. The method of manufactured solutions is employed to demonstrate second-order accuracy in both the equilibrium diffusion and streaming limits. Several typical 1-D radiation-hydrodynamic test cases with shocks (from Mach 1.05 to Mach 50) are presented to establish the ability of the technique to capture and resolve shocks.
Collaborative Performance Research on Multi-level Hospital Management Based on Synergy Entropy-HoQ
Chen, Lei; Liang, Xuedong; Li, Tao
2015-01-01
Because research on the collaboration performance of multi-level hospital management is generally lacking, this paper proposes a multi-level hospital management Synergy Entropy-House of Quality (HoQ) measurement model by innovatively combining the HoQ measurement model with a Synergy Entropy computing principle. Triangular fuzzy functions are used to determine the importance degree parameter of each hospital management element which combined with the results from the Syne...
Entropy in bimolecular simulations: A comprehensive review of atomic fluctuations-based methods.
Kassem, Summer; Ahmed, Marawan; El-Sheikh, Salah; Barakat, Khaled H
2015-11-01
Entropy of binding constitutes a major, and in many cases a detrimental, component of the binding affinity in biomolecular interactions. While the enthalpic part of the binding free energy is easier to calculate, estimating the entropy of binding is far more complicated. A precise evaluation of entropy requires a comprehensive exploration of the complete phase space of the interacting entities. As this task is extremely hard to accomplish in the context of conventional molecular simulations, calculating entropy has involved many approximations. Most of these gold-standard methods have focused on developing a reliable estimation of the conformational part of the entropy. Here, we review these methods with a particular emphasis on the different techniques that extract entropy from atomic fluctuations. The theoretical formalism behind each method is explained, highlighting its strengths as well as its limitations, followed by a description of a number of case studies for each method. We hope that this brief, yet comprehensive, review provides a useful tool to understand these methods and realize the practical issues that may arise in such calculations. Copyright © 2015 Elsevier Inc. All rights reserved.
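Among the fluctuation-based techniques reviewed, the Schlitter upper bound is one of the simplest to sketch; the snippet below applies it to synthetic coordinates (the trajectory, masses, and temperature are made-up inputs, and this is only one of the methods such a review covers):

```python
import numpy as np

k_B = 1.380649e-23     # J/K
hbar = 1.054571817e-34  # J*s
T = 300.0               # K (assumed temperature)

def schlitter_entropy(coords, masses):
    """Schlitter upper bound on configurational entropy from atomic
    fluctuations. coords: (frames, atoms, 3) in metres; masses in kg."""
    x = coords.reshape(coords.shape[0], -1)
    x = x - x.mean(axis=0)
    sigma = np.cov(x, rowvar=False)                     # 3N x 3N covariance
    m = np.repeat(masses, 3)
    msig = np.sqrt(m)[:, None] * sigma * np.sqrt(m)[None, :]
    arg = np.eye(len(m)) + (k_B * T * np.e**2 / hbar**2) * msig
    _, logdet = np.linalg.slogdet(arg)
    return 0.5 * k_B * logdet                           # J/K

rng = np.random.default_rng(3)
n_atoms = 5
masses = np.full(n_atoms, 12 * 1.66054e-27)             # carbon-like atoms
small = rng.normal(scale=0.1e-10, size=(500, n_atoms, 3))  # ~0.1 Å fluctuations
large = rng.normal(scale=0.5e-10, size=(500, n_atoms, 3))  # ~0.5 Å fluctuations
assert schlitter_entropy(large, masses) > schlitter_entropy(small, masses)
```

Larger atomic fluctuations yield a larger entropy estimate, which is the basic intuition behind all the covariance-based methods in the review.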
Wu, Jingjing; Wu, Xinming; Li, Pengfei; Li, Nan; Mao, Xiaomei; Chai, Lihe
2017-04-01
The meridian system is not only the basis of traditional Chinese medicine (TCM) methods (e.g. acupuncture, massage), but also the core of TCM's basic theory. This paper introduces a new informational perspective to understand the reality and the holographic field of the meridian. Based on the maximum information entropy principle (MIEP), a dynamic equation for the holographic field has been deduced, which reflects the evolutionary characteristics of the meridian. By using a self-organizing artificial neural network as the algorithm, the evolutionary dynamic equation of the holographic field can be resolved to assess properties of meridians and clinically diagnose the health characteristics of patients. Finally, through some cases from clinical patients (e.g. a 30-year-old male patient, an apoplectic patient, an epilepsy patient), we use this model to assess the evolutionary properties of meridians. It is proved that this model not only has significant implications in revealing the essence of the meridian in TCM, but also may play a guiding role in the clinical assessment of patients based on the holographic field of meridians.
Entropy based unsupervised Feature Selection in digital mammogram image using rough set theory.
Velayutham, C; Thangavel, K
2012-01-01
Feature Selection (FS) is a process which attempts to select features that are more informative. In supervised FS methods, various feature subsets are evaluated using an evaluation function or metric to select only those features which are related to the decision classes of the data under consideration. However, for many data mining applications, decision class labels are often unknown or incomplete, thus indicating the significance of unsupervised FS. In unsupervised learning, decision class labels are not provided, and not all features are important: some of the features may be redundant, and others may be irrelevant and noisy. In this paper, a novel unsupervised FS method for mammogram images, using rough set-based entropy measures, is proposed. A typical mammogram image processing system generally consists of mammogram image acquisition, pre-processing of the image, segmentation, and feature extraction from the segmented mammogram image. The proposed method is used to select features from the data set; it is compared with existing rough set-based supervised FS methods, and the classification performance of both is recorded, demonstrating the efficiency of the proposed method.
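A minimal unsupervised, entropy-ranked feature selection can be sketched as follows; note that plain Shannon entropy on discretized features stands in here for the paper's rough set-based measures:

```python
import numpy as np

def feature_entropy(col, bins=10):
    """Shannon entropy (bits) of a feature after equal-width discretization."""
    counts, _ = np.histogram(col, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

def select_features(X, k):
    """Unsupervised selection: keep the k highest-entropy (most informative)
    features, needing no decision-class labels."""
    scores = [feature_entropy(X[:, j]) for j in range(X.shape[1])]
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(4)
informative = rng.normal(size=(300, 3))   # spread-out, informative features
constant = np.full((300, 2), 7.0)         # carry no information at all
X = np.hstack([informative, constant])
assert set(select_features(X, 3)) == {0, 1, 2}
```

Constant or near-constant features score zero entropy and are dropped first, mirroring the paper's goal of discarding uninformative features without class labels.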
Transportation Mode Detection Based on Permutation Entropy and Extreme Learning Machine
Directory of Open Access Journals (Sweden)
Lei Zhang
2015-01-01
Full Text Available With the increasing prevalence of GPS devices and mobile phones, transportation mode detection based on GPS data has been a hot topic in GPS trajectory data analysis. Transportation modes such as walking, driving, bus, and taxi denote an important characteristic of the mobile user. Longitude, latitude, speed, acceleration, and direction are usually used as features in transportation mode detection. In this paper, first, we explore the possibility of using Permutation Entropy (PE) of speed, a measure of complexity and uncertainty of a GPS trajectory segment, as a feature for transportation mode detection. Second, we employ an Extreme Learning Machine (ELM) to distinguish GPS trajectory segments of different transportation modes. Finally, to evaluate the performance of the proposed method, we conduct experiments on the GeoLife dataset. Experimental results show that we can get more than 50% accuracy when only using PE as a feature to characterize trajectory sequences. PE can indeed be effectively used to detect transportation mode from GPS trajectories. The proposed method has much better accuracy and faster running time than the methods based on the other features and an SVM classifier.
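Permutation entropy of a speed sequence can be computed directly from ordinal patterns; the sketch below (with invented speed traces, and without the ELM classifier) illustrates why smooth bus speeds score lower than erratic walking speeds:

```python
import math
import random
from collections import Counter

def permutation_entropy(series, order=3, delay=1):
    """Normalized permutation entropy in [0, 1] of a 1-D sequence."""
    counts = Counter()
    n = len(series) - (order - 1) * delay
    for i in range(n):
        window = [series[i + j * delay] for j in range(order)]
        # ordinal pattern: the argsort of the window values
        counts[tuple(sorted(range(order), key=window.__getitem__))] += 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return h / math.log2(math.factorial(order))

random.seed(5)
walking = [random.gauss(1.4, 0.6) for _ in range(500)]  # erratic walking speeds
bus = [20.0 + 0.01 * i for i in range(500)]             # smooth, monotone bus speeds
assert permutation_entropy(bus) < permutation_entropy(walking)
```

A monotone speed profile produces a single ordinal pattern (entropy near 0), while noisy pedestrian speeds use all patterns (entropy near 1), which is what makes PE a usable mode feature.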
Directory of Open Access Journals (Sweden)
Tommaso Toffoli
2016-06-01
Full Text Available Here we deconstruct, and then in a reasoned way reconstruct, the concept of “entropy of a system”, paying particular attention to where the randomness may be coming from. We start with the core concept of entropy as a count associated with a description; this count (traditionally expressed in logarithmic form for a number of good reasons) is in essence the number of possibilities—specific instances or “scenarios”—that match that description. Very natural (and virtually inescapable) generalizations of the idea of description are the probability distribution and its quantum mechanical counterpart, the density operator. We track the process of dynamically updating entropy as a system evolves. Three factors may cause entropy to change: (1) the system’s internal dynamics; (2) unsolicited external influences on it; and (3) the approximations one has to make when one tries to predict the system’s future state. The latter task is usually hampered by hard-to-quantify aspects of the original description, limited data storage and processing resources, and possibly algorithmic inadequacy. Factors 2 and 3 introduce randomness—often huge amounts of it—into one’s predictions and accordingly degrade them. When forecasting, as long as the entropy bookkeeping is conducted in an honest fashion, this degradation will always lead to an entropy increase. To clarify the above point we introduce the notion of honest entropy, which coalesces much of what is of course already done, often tacitly, in responsible entropy-bookkeeping practice. This notion—we believe—will help to fill an expressivity gap in scientific discourse. With its help, we shall prove that any dynamical system—not just our physical universe—strictly obeys Clausius’s original formulation of the second law of thermodynamics if and only if it is invertible. Thus this law is a tautological property of invertible systems!
Indian Academy of Sciences (India)
It is shown that (i) every probability density is the unique maximizer of relative entropy in an appropriate class and (ii) in the class of all pdfs f that satisfy ∫ f h_i dμ = λ_i for i = 1, 2, …, k, the maximizer of entropy is an f_0 that is proportional to exp(∑ c_i h_i) for some choice of c_i. An extension of this to a continuum of ...
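The exponential-family form of the entropy maximizer can be checked numerically for a single moment constraint h(x) = x on a finite support (an illustrative sketch, not the paper's proof):

```python
import math

def maxent_pmf(c, n=10):
    """Exponential-family pmf p_i proportional to exp(c * i) on {0,...,n-1}."""
    w = [math.exp(c * i) for i in range(n)]
    z = sum(w)
    return [wi / z for wi in w]

def mean(p):
    return sum(i * pi for i, pi in enumerate(p))

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# bisection on c so that the constraint  sum_i p_i * i = 3.0  holds
lo, hi = -5.0, 5.0
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if mean(maxent_pmf(mid)) < 3.0 else (lo, mid)
p_star = maxent_pmf((lo + hi) / 2)

# any other feasible pmf (same total mass, same mean) has lower entropy
q = list(p_star)
q[0] += 0.01; q[2] -= 0.02; q[4] += 0.01   # keeps both constraints unchanged
assert abs(mean(q) - mean(p_star)) < 1e-9
assert entropy(q) < entropy(p_star)
```

The perturbed pmf q satisfies the same constraints yet has strictly lower entropy, consistent with the uniqueness claim in part (ii).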
Yin, Xiang-jun; He, Qing-yong
2015-02-01
To analyze the compatibility regularity of compound traditional Chinese medicine (TCM) patents for treating dyslipidemia, and provide basis for the clinical development and research of new TCM for treating dyslipidemia. Totally 243 compound traditional Chinese medicine patents for treating dyslipidemia were collected from the national patent database from September 1985 to March 2014 and analyzed by using drug frequency, association rules, complex network and entropy method of Traditional Chinese Medicine Inheritance System (V1.1). The commonest single medicine in the treatment of dyslipidemia is Crataegi Fructus 109 (44.86%). The commonest pair medicine is Crataegi Fructus-Salviae Miltiorrhizae Radix et Rhizoma 53 (21.81%). The commonest corner drug is Crataegi Fructus-Cassiae Semen-Polygoni Multiflori Radix 25 (10.29%). The common prescriptions on basis of association rules are Prunellae Spica-->Salviae Miltiorrhizae Radix et Rhizoma (0.833), Rhei Radix et Rhizoma, Alismatis Rhizoma-->Polygoni Multiflori Radix (1.00), Salviae Miltiorrhizae Radix et Rhizoma, Cassiae Semen, Alismatis Rhizoma-->Polygoni Multiflori Radix (0.929). The core drugs based on complex networks are Salviae Miltiorrhizae Radix et Rhizoma and Crataegi Fructus. The new prescriptions extracted by entropy method are Atractylodis Macrocephalae Rhizoma-Glycyrrhizae Radix et Rhizoma-Platycladi Semen-Stephaniae Tetrandrae Radix; Citri Reticulatae Pericarpium-Poria-Coicis Semen-Pinelliae Rhizoma. This study shows the regularity in the compatibility of compound TCM patents treating dyslipidemia, suggesting that future studies on new traditional Chinese medicines treating dyslipidemia should focus on the following six aspects: (1) Single medicine should be preferred: e. g. Crataegi Fructus; (2) Pair medicines should be preferred: e. g. Crataegi Fructus-Salviae Miltiorrhizae Radix et Rhizoma; (3) Corner drugs should be preferred: e. g. Crataegi Fructus, Cassiae Semen, Polygoni Multiflori Radix; (4) The
Naef, Rudolf; Acree, William E
2017-06-25
The calculation of the standard enthalpies of vaporization, sublimation and solvation of organic molecules is presented using a common computer algorithm on the basis of a group-additivity method. The same algorithm is also shown to enable the calculation of their entropy of fusion as well as the total phase-change entropy of liquid crystals. The present method is based on the complete breakdown of the molecules into their constituting atoms and their immediate neighbourhood; the respective calculations of the contributions of the atomic groups by means of the Gauss-Seidel fitting method are based on experimental data collected from the literature. The feasibility of the calculations for each of the mentioned descriptors was verified by means of a 10-fold cross-validation procedure, proving the good to high quality of the predicted values for the three mentioned enthalpies and for the entropy of fusion, whereas the predictive quality for the total phase-change entropy of liquid crystals was poor. The goodness of fit (Q²) and the standard deviation (σ) of the cross-validation calculations for the five descriptors were as follows: 0.9641 and 4.56 kJ/mol (N = 3386 test molecules) for the enthalpy of vaporization, 0.8657 and 11.39 kJ/mol (N = 1791) for the enthalpy of sublimation, 0.9546 and 4.34 kJ/mol (N = 373) for the enthalpy of solvation, 0.8727 and 17.93 J/mol/K (N = 2637) for the entropy of fusion and 0.5804 and 32.79 J/mol/K (N = 2643) for the total phase-change entropy of liquid crystals. The large discrepancy between the results of the two closely related entropies is discussed in detail. Molecules for which both the standard enthalpies of vaporization and sublimation were calculable enabled the estimation of their standard enthalpy of fusion by simple subtraction of the former from the latter enthalpy. For 990 of them the experimental enthalpy-of-fusion values are also known, allowing their comparison with predictions, yielding a correlation coefficient R
Comparison of the role that entropy has played in processes of non-enzymatic and enzymatic catalysis
International Nuclear Information System (INIS)
Dixon Pineda, Manuel Tomas
2012-01-01
The role that entropy has played in processes of non-enzymatic and enzymatic catalysis is compared. The following processes were studied: the kinetics of the acid hydrolysis of 3-pentyl acetate and cyclopentyl acetate catalyzed by hydrochloric acid, and the enzymatic hydrolysis of ethyl acetate and γ-butyrolactone catalyzed by pig liver esterase. The activation parameters of Eyring were determined for each process, and the contribution of the entropy of activation to catalysis in this type of model reactions was interpreted. (author) [es
A Case Study on a Combination NDVI Forecasting Model Based on the Entropy Weight Method
Energy Technology Data Exchange (ETDEWEB)
Huang, Shengzhi; Ming, Bo; Huang, Qiang; Leng, Guoyong; Hou, Beibei
2017-05-05
It is critically meaningful to accurately predict NDVI (Normalized Difference Vegetation Index), which helps guide regional ecological remediation and environmental management. In this study, a combination forecasting model (CFM) was proposed to improve the performance of NDVI predictions in the Yellow River Basin (YRB) based on three individual forecasting models, i.e., the Multiple Linear Regression (MLR), Artificial Neural Network (ANN), and Support Vector Machine (SVM) models. The entropy weight method was employed to determine the weight coefficient for each individual model depending on its predictive performance. Results showed that: (1) ANN exhibits the highest fitting capability among the four forecasting models in the calibration period, whilst its generalization ability becomes weak in the validation period; MLR has a poor performance in both calibration and validation periods; the predicted results of CFM in the calibration period have the highest stability; (2) CFM generally outperforms all individual models in the validation period, and can improve the reliability and stability of predicted results through combining the strengths while reducing the weaknesses of individual models; (3) the performances of all forecasting models are better in dense vegetation areas than in sparse vegetation areas.
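The entropy weight method itself reduces to a few lines: lower-entropy (more discriminating) indicator columns receive larger weights. The skill scores below are hypothetical, and the method is applied here to model-performance columns as in the study's weighting step:

```python
import numpy as np

def entropy_weights(perf):
    """Entropy weight method: columns = candidate models, rows = samples of a
    positive performance indicator. Lower-entropy (more discriminating)
    columns receive larger weights."""
    p = perf / perf.sum(axis=0, keepdims=True)
    n = perf.shape[0]
    e = -(p * np.log(p)).sum(axis=0) / np.log(n)   # column entropies in [0, 1]
    d = 1.0 - e                                     # degree of diversification
    return d / d.sum()

# toy skill scores of MLR / ANN / SVM over 5 validation periods (hypothetical)
scores = np.array([
    [0.61, 0.82, 0.75],
    [0.60, 0.88, 0.74],
    [0.59, 0.70, 0.76],
    [0.62, 0.93, 0.73],
    [0.60, 0.79, 0.77],
])
w = entropy_weights(scores)
assert abs(w.sum() - 1.0) < 1e-12
assert w[1] == w.max()   # the most variable (discriminating) column gets top weight
```

Note that weights reflect how much an indicator varies across samples, not how large its values are; nearly constant columns contribute almost nothing.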
Analysis of QCD sum rule based on the maximum entropy method
International Nuclear Information System (INIS)
Gubler, Philipp
2012-01-01
QCD sum rules were developed about thirty years ago and have been used up to the present to calculate various physical quantities of hadrons. Conventional analyses, however, have had to assume a 'pole + continuum' form for the spectral function. Application of the method therefore runs into difficulties when this assumption is not satisfied. In order to avoid this difficulty, an analysis making use of the maximum entropy method (MEM) has been developed by the present author. It is reported here how far this new method can be successfully applied. In the first section, the general features of the QCD sum rule are introduced. In section 2, it is discussed why the analysis by the QCD sum rule based on the MEM is so effective. In section 3, the MEM analysis process is described; in subsection 3.1 the likelihood function and prior probability are considered, and in subsection 3.2 numerical analyses are discussed. In section 4, some applications are described, starting with ρ mesons, then charmonium at finite temperature and finally recent developments. Some figures of the spectral functions are shown. In section 5, a summary of the present analysis method and an outlook are given. (S. Funahashi)
Risk assessment of security systems based on entropy theory and the Neyman–Pearson criterion
International Nuclear Information System (INIS)
Lv, Haitao; Yin, Chao; Cui, Zongmin; Zhan, Qin; Zhou, Hongbo
2015-01-01
For a security system, risk assessment is an important method to judge whether its protection effectiveness is good or not. In this paper, a security system is regarded abstractly as a network by the name of a security network. A security network is made up of security nodes that are abstract functional units with the ability of detecting, delaying and responding. By the use of risk entropy and the Neyman–Pearson criterion, we construct a model to compute the protection probability of any position in the area where a security network is deployed. We provide a solution to find the most vulnerable path of a security network, and the protection probability on this path is considered as the risk measure. Finally, we study the effect of some parameters on the risk and the breach protection probability of a security network. Ultimately, we can gain insight into the risk assessment of a security system. - Highlights: • A security system is regarded abstractly as a network made up of security nodes. • We construct a model to compute the protection probability provided by a security network. • We provide a better solution to find the most vulnerable path of a security network. • We build a risk assessment model for a security network based on the most vulnerable path
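The most-vulnerable-path idea can be sketched as a shortest-path problem: maximizing the intruder's breach probability ∏(1 − p_detect) over visited nodes is equivalent to minimizing ∑ −log(1 − p_detect). The graph, node names, and detection probabilities below are invented:

```python
import heapq
import math

def most_vulnerable_path(graph, detect, src, dst):
    """Path maximizing the intruder's breach probability prod(1 - detect[v]);
    solved as Dijkstra shortest path under weights -log(1 - detect[v])."""
    w = {v: -math.log(1.0 - detect[v]) for v in graph}
    dist = {v: math.inf for v in graph}
    prev = {}
    dist[src] = w[src]
    pq = [(dist[src], src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue
        for v in graph[u]:
            if d + w[v] < dist[v]:
                dist[v] = d + w[v]
                prev[v] = u
                heapq.heappush(pq, (dist[v], v))
    path, v = [dst], dst
    while v != src:
        v = prev[v]
        path.append(v)
    return path[::-1], math.exp(-dist[dst])

# hypothetical security network: node -> neighbours, node -> detection prob
graph = {"gate": ["hall", "yard"], "hall": ["vault"], "yard": ["vault"], "vault": []}
detect = {"gate": 0.3, "hall": 0.9, "yard": 0.4, "vault": 0.5}
path, breach = most_vulnerable_path(graph, detect, "gate", "vault")
assert path == ["gate", "yard", "vault"]   # avoids the well-guarded hall
```

The returned breach probability on the weakest path (here 0.7 × 0.6 × 0.5 = 0.21) plays the role of the risk measure.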
Shi, Yan-ting; Liu, Jie; Wang, Peng; Zhang, Xu-nuo; Wang, Jun-qiang; Guo, Liang
2017-05-01
With the implementation of water environment management in key basins in China, the monitoring and evaluation systems of basins are in urgent need of innovation and upgrading. In view of the heavy workload of existing evaluation methods and the cumbersome calculation of the multi-factor weighting method, the idea of using the entropy method to assess river health based on aquatic ecological function regionalization was put forward. According to the monitoring data of the Songhua River from 2011 to 2015, the entropy weight method was used to calculate the weights of 9 evaluation factors at 29 monitoring sections, and the river health assessment was carried out. In the study area, the river health status of the biodiversity conservation function area (4.111 points) was good, while the water conservation function area (3.371 points), the habitat maintenance function area (3.262 points), the agricultural production maintenance function area (3.695 points) and the urban supporting function area (3.399 points) showed light pollution.
Directory of Open Access Journals (Sweden)
Rong Jiang
2015-04-01
Full Text Available Complexity is an important factor throughout the software life cycle. It is increasingly difficult to guarantee software quality, cost and development progress as complexity increases. Excessive complexity is one of the main reasons for the failure of software projects, so effective recognition, measurement and control of complexity become the key to project management. This paper first analyzes the current research situation of software complexity systematically and points out existing problems in current research. Then, it proposes a WSR framework of software complexity, which divides the complexity of software into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may have a better understanding of complexity. Man is the main source of complexity, but current research focuses on WL complexity and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the composing factors of RL complexity, but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method for the complexity of the personnel organization hierarchy and the complexity of personnel communication information based on information entropy, and analyzes and validates the scientific soundness and rationality of this measurement method through a large number of cases.
An Entropy-Based Propagation Speed Estimation Method for Near-Field Subsurface Radar Imaging
Directory of Open Access Journals (Sweden)
Pistorius Stephen
2010-01-01
Full Text Available During the last forty years, Subsurface Radar (SR) has been used in an increasing number of noninvasive/nondestructive imaging applications, ranging from landmine detection to breast imaging. To properly assess the dimensions and locations of the targets within the scan area, SR data sets have to be reconstructed. This process usually requires the knowledge of the propagation speed in the medium, which is usually obtained by performing an offline measurement from a representative sample of the materials that form the scan region. Nevertheless, in some novel near-field SR scenarios, such as Microwave Wood Inspection (MWI) and Breast Microwave Radar (BMR), the extraction of a representative sample is not an option due to the noninvasive requirements of the application. A novel technique to determine the propagation speed of the medium based on the use of an information theory metric is proposed in this paper. The proposed method uses the Shannon entropy of the reconstructed images as the focal quality metric to generate an estimate of the propagation speed in a given scan region. The performance of the proposed algorithm was assessed using data sets collected from experimental setups that mimic the dielectric contrast found in BMR and MWI scenarios. The proposed method yielded accurate results and exhibited an execution time in the order of seconds.
An Entropy-Based Propagation Speed Estimation Method for Near-Field Subsurface Radar Imaging
Flores-Tapia, Daniel; Pistorius, Stephen
2010-12-01
During the last forty years, Subsurface Radar (SR) has been used in an increasing number of noninvasive/nondestructive imaging applications, ranging from landmine detection to breast imaging. To properly assess the dimensions and locations of the targets within the scan area, SR data sets have to be reconstructed. This process usually requires the knowledge of the propagation speed in the medium, which is usually obtained by performing an offline measurement from a representative sample of the materials that form the scan region. Nevertheless, in some novel near-field SR scenarios, such as Microwave Wood Inspection (MWI) and Breast Microwave Radar (BMR), the extraction of a representative sample is not an option due to the noninvasive requirements of the application. A novel technique to determine the propagation speed of the medium based on the use of an information theory metric is proposed in this paper. The proposed method uses the Shannon entropy of the reconstructed images as the focal quality metric to generate an estimate of the propagation speed in a given scan region. The performance of the proposed algorithm was assessed using data sets collected from experimental setups that mimic the dielectric contrast found in BMR and MWI scenarios. The proposed method yielded accurate results and exhibited an execution time on the order of seconds.
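The speed-estimation loop this abstract describes can be illustrated with a toy one-dimensional simulation (the geometry, pulse shape and speed values below are invented for illustration, not the authors' experimental setup): reconstruct the scene for a sweep of candidate speeds and keep the speed whose image has the lowest Shannon entropy, i.e. the sharpest focus.

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy of the normalised image intensity, used as the focal metric."""
    p = np.abs(img) / np.abs(img).sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def reconstruct(traces, positions, grid, speed, t):
    """Toy delay-and-sum: back-project each trace assuming a propagation speed."""
    img = np.zeros_like(grid)
    for trace, x in zip(traces, positions):
        delays = 2 * np.abs(grid - x) / speed   # two-way travel time to each pixel
        img += np.interp(delays, t, trace)
    return img

# Synthetic 1-D scan: a point target at 0.30 m, true propagation speed 1.0e8 m/s.
true_speed, target = 1.0e8, 0.30
t = np.linspace(0, 2e-8, 2000)
positions = np.linspace(0.0, 0.2, 8)        # antenna positions along the scan line
grid = np.linspace(0.2, 0.4, 400)           # 1-D imaging grid
traces = [np.exp(-(((t - 2 * (target - x) / true_speed) / 1e-10) ** 2))
          for x in positions]

# Sweep candidate speeds; the entropy of the reconstruction dips near the true one.
candidates = np.linspace(0.7e8, 1.3e8, 13)
entropies = [image_entropy(reconstruct(traces, positions, grid, c, t))
             for c in candidates]
best = candidates[int(np.argmin(entropies))]
```

At a wrong speed the back-projected echoes no longer align, the image smears, and its entropy rises; the minimum-entropy candidate therefore approximates the true speed.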
Directory of Open Access Journals (Sweden)
Jinkyu Kim
Full Text Available The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with real-world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.
Information Entropy-Based Metrics for Measuring Emergences in Artificial Societies
Directory of Open Access Journals (Sweden)
Mingsheng Tang
2014-08-01
Full Text Available Emergence is a common phenomenon, and it is also a general and important concept in complex dynamic systems like artificial societies. Usually, artificial societies are used for assisting in resolving several complex social issues (e.g., emergency management, intelligent transportation systems) with the aid of computer science. The level of an emergence may have an effect on decision making, and the occurrence and degree of an emergence are generally perceived by human observers. However, due to the ambiguity and inaccuracy of human observers, proposing a quantitative method to measure emergences in artificial societies is a meaningful and challenging task. This article mainly concentrates upon three kinds of emergences in artificial societies: emergence of attribution, emergence of behavior, and emergence of structure. Based on information entropy, three metrics have been proposed to measure emergences in a quantitative way. Meanwhile, the correctness of these metrics has been verified through three case studies (the spread of an infectious influenza, a dynamic microblog network, and a flock of birds) with several experimental simulations on the NetLogo platform. These experimental results confirm that these metrics increase with the rising degree of emergences. In addition, this article also discusses the limitations and extended applications of these metrics.
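The flavour of an entropy-based emergence metric can be sketched for the "emergence of attribution" case: measure how the entropy of an agent attribute's empirical distribution changes as the society evolves. The agent states below are hypothetical, not taken from the paper's case studies:

```python
import math
from collections import Counter

def attribute_entropy(states):
    """Entropy (bits) of the empirical distribution of one agent attribute."""
    n = len(states)
    return -sum((c / n) * math.log2(c / n) for c in Counter(states).values())

# Hypothetical infection states of ten agents before and after an epidemic emerges.
before = ["S"] * 10                          # homogeneous population: zero entropy
after = ["S"] * 4 + ["I"] * 4 + ["R"] * 2    # mixed states: entropy has risen

delta = attribute_entropy(after) - attribute_entropy(before)
```

A positive `delta` signals that the attribute's distribution has become more diverse, the kind of shift such metrics are designed to register.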
Kim, Jinkyu; Kim, Gunn; An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh
2013-01-01
The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.
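A minimal plug-in estimator for the transfer entropy between two series can be sketched as follows; this is a generic binary-alphabet illustration with synthetic data, not the authors' multivariate aggregation procedure:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of the transfer entropy T(Y->X), in bits, for binary series."""
    triples = list(zip(x[1:], x[:-1], y[:-1]))      # (x_{t+1}, x_t, y_t)
    n = len(triples)
    c_abc = Counter(triples)
    c_ab = Counter((a, b) for a, b, _ in triples)   # (x_{t+1}, x_t)
    c_bc = Counter((b, c) for _, b, c in triples)   # (x_t, y_t)
    c_b = Counter(b for _, b, _ in triples)         # x_t
    te = 0.0
    for (a, b, c), n_abc in c_abc.items():
        te += (n_abc / n) * math.log2(n_abc * c_b[b] / (c_ab[(a, b)] * c_bc[(b, c)]))
    return te

random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]          # x simply copies y with a one-step lag
```

Because x's next value is fully determined by y, `transfer_entropy(x, y)` comes out near 1 bit, while the reverse direction stays near zero; the asymmetry is what makes TE a directed influence measure.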
Yan, Zhi Gang; Li, Jun Qing
2017-12-01
The areas of the habitat and bamboo forest, and the size of the giant panda wild population, have greatly increased, while habitat fragmentation and local population isolation have also intensified in recent years. Accurate evaluation of the ecosystem status of the giant panda distribution area is important for giant panda conservation. The ecosystems of the distribution area and its six mountain ranges were subdivided into habitat and population subsystems based on hierarchical system theory. Using the panda distribution area as the study area and the three national surveys as time nodes, the evolution of the ecosystems was studied using the entropy method, coefficient of variation, and correlation analysis. We found that, despite continuous improvement, some differences existed in the evolution and present situation of the ecosystems, and the six mountain ranges could be divided into three groups. Ecosystems classified into the same group showed many commonalities, while the differences between the groups were considerable. Problems of habitat fragmentation and local population isolation became more serious, resulting in ecosystem degradation. Individualized ecological protection measures should be formulated and implemented in accordance with the conditions of each mountain system to achieve the best results.
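The "entropy method" used in evaluations of this kind is commonly the entropy-weight scheme: indicators whose values vary more across the alternatives carry more information and receive larger weights. A minimal sketch, with invented indicator values rather than the paper's survey data:

```python
import numpy as np

def entropy_weights(X):
    """Entropy-method weights for an (alternatives x indicators) matrix X > 0."""
    P = X / X.sum(axis=0)                       # share of each alternative per indicator
    n = X.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(n)   # normalised entropy per indicator
    d = 1 - E                                   # divergence: low entropy -> informative
    return d / d.sum()

# Hypothetical positive-oriented indicators for three mountain ranges (rows):
# columns: habitat area, bamboo area, population size (all already normalised).
X = np.array([[0.9, 0.8, 0.85],
              [0.5, 0.6, 0.40],
              [0.1, 0.2, 0.30]])
w = entropy_weights(X)   # indicators that differ more across ranges get more weight
```

Here the first indicator spreads the alternatives most widely, so it receives the largest weight; in a full evaluation the weighted indicator scores are then summed per alternative.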
Tang, Xi; Wu, Zheng-Mao; Wu, Jia-Gui; Deng, Tao; Chen, Jian-Jun; Fan, Li; Zhong, Zhu-Qiang; Xia, Guang-Qiong
2015-12-28
Using the outputs of two mutually coupled semiconductor lasers (MC-SLs) as chaotic entropy sources, a scheme for generating ultra-fast physical random bits (PRBs) at Tbits/s rates is demonstrated and analyzed experimentally. Firstly, two entropy sources originating from the two chaotic outputs of the MC-SLs are obtained in parallel. Secondly, by adopting multiple optimized post-processing methods, two PRB streams with a generation rate of 0.56 Tbits/s are extracted from the two entropy sources, and their randomness is verified using the NIST Special Publication 800-22 statistical tests. By merging the two 0.56 Tbits/s PRB streams with an interleaving operation, a third PRB stream of 1.12 Tbits/s, which meets all the quality criteria of the NIST statistical tests, can be further acquired. Finally, after additionally taking into account the restriction of the min-entropy, the generation rate of the two PRB streams from the two entropy sources can still attain 0.48 Tbits/s, and the third, merged PRB stream then reaches 0.96 Tbits/s. Moreover, for sequence lengths on the order of 10 Gbits, the statistical bias and serial correlation coefficient of the three PRB streams are also analyzed.
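The min-entropy restriction mentioned at the end bounds the extractable randomness per raw sample by H_inf = -log2(max_i p_i). A minimal sketch with a hypothetical biased stream (the bias value and rates are illustrative only):

```python
import math
from collections import Counter

def min_entropy_per_sample(samples):
    """Min-entropy per sample, H_inf = -log2(max_i p_i), of an i.i.d. source model."""
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -math.log2(p_max)

# A hypothetical slightly biased raw bit stream with p(1) = 0.55.
raw = [1] * 55 + [0] * 45
h = min_entropy_per_sample(raw)     # about 0.86 bits of min-entropy per raw bit

# Scaling a 0.56 Tbit/s raw stream by h bounds the securely extractable rate.
secure_rate = 0.56e12 * h
```

An unbiased stream gives exactly 1 bit per sample; any bias pushes the bound, and hence the safe generation rate, below the raw rate.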
Pilge, Stefanie; Kreuzer, Matthias; Karatchiviev, Veliko; Kochs, Eberhard F; Malcharek, Michael; Schneider, Gerhard
2015-05-01
It is claimed that bispectral index (BIS) and state entropy reflect an identical clinical spectrum, the hypnotic component of anaesthesia. So far, it is not known to what extent different devices display similar index values while processing identical electroencephalogram (EEG) signals. To compare BIS and state entropy during analysis of identical EEG data. Inspection of raw EEG input to detect potential causes of erroneous index calculation. Offline re-analysis of EEG data from a randomised, single-centre controlled trial using the Entropy Module and an Aspect A-2000 monitor. Klinikum rechts der Isar, Technische Universität München, Munich. Forty adult patients undergoing elective surgery under general anaesthesia. Blocked randomisation of 20 patients per anaesthetic group (sevoflurane/remifentanil or propofol/remifentanil). Isolated forearm technique for differentiation between consciousness and unconsciousness. Prediction probability (PK) of state entropy to discriminate consciousness from unconsciousness. Correlation and agreement between state entropy and BIS from deep to light hypnosis. Analysis of raw EEG compared with index values that are in conflict with clinical examination, with frequency measures (frequency bands/Spectral Edge Frequency 95) and visual inspection for physiological EEG patterns (e.g. beta or delta arousal), pathophysiological features such as high-frequency signals (electromyogram/high-frequency EEG or eye fluttering/saccades), different types of electro-oculogram or epileptiform EEG and technical artefacts. PK of state entropy was 0.80 and of BIS 0.84; correlation coefficient of state entropy with BIS 0.78. Nine percent BIS and 14% state entropy values disagreed with clinical examination. Highest incidence of disagreement occurred after state transitions, in particular for state entropy after loss of consciousness during sevoflurane anaesthesia. EEG sequences which led to false 'conscious' index values often showed high
Directory of Open Access Journals (Sweden)
Kai Yan
2015-01-01
Full Text Available A predictive model for the droplet size and velocity distributions of a pressure swirl atomizer has been proposed based on the maximum entropy formalism (MEF). The constraint conditions of the MEF model include the conservation laws of mass, momentum, and energy. The effects of liquid swirling strength, Weber number, gas-to-liquid axial velocity ratio and gas-to-liquid density ratio on the droplet size and velocity distributions of a pressure swirl atomizer are investigated. Results show that the model based on the maximum entropy formalism works well to predict droplet size and velocity distributions under different spray conditions. Liquid swirling strength, Weber number, gas-to-liquid axial velocity ratio and gas-to-liquid density ratio have different effects on the droplet size and velocity distributions of a pressure swirl atomizer.
Jianliang Liu; Junhai Ma
2013-01-01
With the development of the social economy, energy intensity in Shandong has decreased markedly. To address the issue with proper measures, this paper analyzes the energy consumption per 10,000 yuan of GDP of the 17 cities in Shandong Province based on the entropy method and ranks them according to overall energy consumption. It shows the differences in energy consumption among the cities and helps in formulating measures to address the current energy crisis. This study has certain theoretical significance and practical appli...
Liu, Jian; Zou, Renling; Zhang, Dongheng; Xu, Xiulin; Hu, Xiufang
2016-06-01
Exercise-induced muscle fatigue is a phenomenon in which the maximum voluntary contraction force or power output of a muscle is temporarily reduced by muscular activity. If the fatigue is not treated properly, it can bring about severe injury to the human body. Using multi-channel collection of lower-limb surface electromyography signals, this article analyzes muscle fatigue by adopting the band spectrum entropy method, which combines electromyographic signal spectral analysis with nonlinear dynamics. The experimental results indicated that, as muscle fatigue increased, the muscle signal spectrum moved toward low frequencies, the energy became concentrated, the system complexity came down, and the band spectrum entropy reflecting that complexity was also reduced. By monitoring the entropy, we can measure the degree of muscle fatigue and provide an indicator of fatigue degree for sports training and clinical rehabilitation training.
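A band spectrum entropy of this kind can be sketched as the Shannon entropy of the signal's power distribution over equal-width frequency bands; the synthetic signals below stand in for fresh (broadband) and fatigued (low-frequency concentrated) EMG and are not the paper's recordings:

```python
import numpy as np

def band_spectrum_entropy(signal, n_bands=16):
    """Shannon entropy (bits) of the power distribution over equal-width FFT bands."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    power = np.array([b.sum() for b in np.array_split(spectrum, n_bands)])
    p = power / power.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
t = np.arange(0, 1, 1 / 1000)                        # 1 s at a notional 1 kHz
fresh = rng.standard_normal(t.size)                  # broadband activity
fatigued = np.sin(2 * np.pi * 40 * t) + 0.1 * rng.standard_normal(t.size)

e_fresh = band_spectrum_entropy(fresh)       # near log2(16) = 4 bits
e_fatigued = band_spectrum_entropy(fatigued) # much lower: power piles up near 40 Hz
```

As the spectrum concentrates into fewer bands, the entropy falls, which is exactly the trend the abstract reports with increasing fatigue.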
Entropy Is Simple, Qualitatively
Lambert, Frank L.
2002-10-01
Qualitatively, entropy is simple. What it is, why it is useful in understanding the behavior of macro systems or of molecular systems is easy to state: Entropy increase from a macro viewpoint is a measure of the dispersal of energy from localized to spread out at a temperature T. The conventional q in qrev/T is the energy dispersed to or from a substance or a system. On a molecular basis, entropy increase means that a system changes from having fewer accessible microstates to having a larger number of accessible microstates. Fundamentally based on statistical and quantum mechanics, this approach is superior to the non-fundamental "disorder" as a descriptor of entropy change. The foregoing in no way denies the subtlety or the difficulty presented by entropy in thermodynamics—to first-year students or to professionals. However, as an aid to beginners in their quantitative study of thermodynamics, the qualitative conclusions in this article give students the advantage of a clear bird’s-eye view of why entropy increases in a wide variety of basic cases: a substance going from 0 K to T, phase change, gas expansion, mixing of ideal gases or liquids, colligative effects, and the Gibbs equation. See Letter re: this article.
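The agreement between the macro (q_rev/T) and micro (accessible microstates) views can be checked numerically for the gas-expansion case the article lists; the constants are standard values, and the calculation is the textbook one, not anything specific to this article:

```python
import math

R = 8.314            # gas constant, J/(mol K)
k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol

# Macro view: reversible isothermal doubling of volume for one mole of ideal gas,
# Delta S = q_rev / T = n R ln(V2 / V1).
dS_macro = 1.0 * R * math.log(2)

# Micro view: each of the N_A molecules gains twice as many accessible positions,
# so W grows by a factor of 2**N_A and S = k ln W gives Delta S = N_A k ln 2.
dS_micro = N_A * k_B * math.log(2)
```

Both routes give about 5.76 J/K, illustrating the article's point that energy dispersal and microstate counting are two descriptions of the same entropy increase.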
Renormalized entanglement entropy
Energy Technology Data Exchange (ETDEWEB)
Taylor, Marika; Woodhead, William [Mathematical Sciences and STAG Research Centre, University of Southampton,Highfield, Southampton, SO17 1BJ (United Kingdom)
2016-08-29
We develop a renormalization method for holographic entanglement entropy based on area renormalization of entangling surfaces. The renormalized entanglement entropy is derived for entangling surfaces in asymptotically locally anti-de Sitter spacetimes in general dimensions and for entangling surfaces in four dimensional holographic renormalization group flows. The renormalized entanglement entropy for disk regions in AdS_4 spacetimes agrees precisely with the holographically renormalized action for AdS_4 with spherical slicing and hence with the F quantity, in accordance with the Casini-Huerta-Myers map. We present a generic class of holographic RG flows associated with deformations by operators of dimension 3/2<Δ<5/2 for which the F quantity increases along the RG flow, hence violating the strong version of the F theorem. We conclude by explaining how the renormalized entanglement entropy can be derived directly from the renormalized partition function using the replica trick, i.e., our renormalization method for the entanglement entropy is inherited directly from that of the partition function. We show explicitly how the entanglement entropy counterterms can be derived from the standard holographic renormalization counterterms for asymptotically locally anti-de Sitter spacetimes.
Li, Jimeng; Li, Ming; Zhang, Jinfeng
2017-08-01
Rolling bearings are the key components in modern machinery, and tough operation environments often make them prone to failure. However, due to the influence of the transmission path and background noise, the useful feature information relevant to the bearing fault contained in the vibration signals is weak, which makes it difficult to identify the fault symptom of rolling bearings in time. Therefore, the paper proposes a novel weak signal detection method based on a time-delayed feedback monostable stochastic resonance (TFMSR) system and adaptive minimum entropy deconvolution (MED) to realize the fault diagnosis of rolling bearings. The MED method is employed to preprocess the vibration signals, which can deconvolve the effect of the transmission path and clarify the defect-induced impulses. And a modified power spectrum kurtosis (MPSK) index is constructed to realize the adaptive selection of filter length in the MED algorithm. By introducing the time-delayed feedback term into an over-damped monostable system, the TFMSR method can effectively utilize the historical information of the input signal to enhance the periodicity of the SR output, which is beneficial to the detection of periodic signals. Furthermore, the influence of time delay and feedback intensity on the SR phenomenon is analyzed, and by selecting appropriate time delay, feedback intensity and re-scaling ratio with a genetic algorithm, the SR can be produced to realize the resonance detection of weak signals. The combination of the adaptive MED (AMED) method and the TFMSR method is conducive to extracting the feature information from strong background noise and realizing the fault diagnosis of rolling bearings. Finally, some experiments and an engineering application are performed to evaluate the effectiveness of the proposed AMED-TFMSR method in comparison with a traditional bistable SR method.
Directory of Open Access Journals (Sweden)
H. T. R. Kurmasha
2017-12-01
Full Text Available An edge-based image quality measure (IQM) technique for the assessment of histogram equalization (HE)-based contrast enhancement techniques has been proposed that outperforms the Absolute Mean Brightness Error (AMBE) and Entropy, which are the most commonly used IQMs to evaluate histogram equalization based techniques, as well as two prominent fidelity-based IQMs, the Multi-Scale Structural Similarity (MSSIM) and Information Fidelity Criterion (IFC) measures. The statistical evaluation results show that the edge-based IQM, which was designed for detecting noise artifact distortion, has a Pearson Correlation Coefficient (PCC) > 0.86 while the others have poor or fair correlation to human opinion, considering Human Visual Perception (HVP). Based on HVP, this paper proposes an enhancement to the classic edge-based IQM by taking into account brightness saturation distortion, which is the most prominent distortion in HE-based contrast enhancement techniques. It is tested and found to have a significantly good correlation (PCC > 0.87), Spearman rank order correlation coefficient (SROCC) > 0.92, Root Mean Squared Error (RMSE) < 0.1054, and Outlier Ratio (OR) = 0%.
Zheng, Jinde; Pan, Haiyang; Cheng, Junsheng
2017-02-01
To timely detect the incipient failure of rolling bearings and find out the accurate fault location, a novel rolling bearing fault diagnosis method is proposed based on the composite multiscale fuzzy entropy (CMFE) and ensemble support vector machines (ESVMs). Fuzzy entropy (FuzzyEn), as an improvement of sample entropy (SampEn), is a new nonlinear method for measuring the complexity of time series. Since FuzzyEn (or SampEn) at a single scale cannot reflect the complexity effectively, multiscale fuzzy entropy (MFE) is developed by defining the FuzzyEns of coarse-grained time series, which represents the system dynamics at different scales. However, the MFE values will be affected by the data length, especially when the data are not long enough. By combining information from multiple coarse-grained time series at the same scale, the CMFE algorithm is proposed in this paper to enhance MFE, as well as FuzzyEn. Compared with MFE, with increasing scale factor, CMFE obtains much more stable and consistent values for a short-term time series. In this paper CMFE is employed to measure the complexity of vibration signals of rolling bearings and is applied to extract the nonlinear features hidden in the vibration signals. Also the physical reasons that make CMFE suitable for rolling bearing fault diagnosis are explored. Based on these, to realize automatic fault diagnosis, an ensemble-SVM-based multi-classifier is constructed for the intelligent classification of fault features. Finally, the proposed fault diagnosis method is applied to experimental data analysis and the results indicate that the proposed method could effectively distinguish different fault categories and severities of rolling bearings.
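The CMFE construction, FuzzyEn averaged over every coarse-graining offset of a scale, can be sketched as follows. This is a simplified illustration, not the authors' exact algorithm: the Gaussian similarity kernel and the parameters m and r are common FuzzyEn conventions, and the signals are synthetic:

```python
import numpy as np

def fuzzy_entropy(x, m=2, r=0.2):
    """FuzzyEn of a 1-D series; similarity exp(-(d/r)^2) on Chebyshev distances."""
    x = (x - x.mean()) / x.std()
    def phi(k):
        emb = np.array([x[i:i + k] for i in range(len(x) - k)])
        emb -= emb.mean(axis=1, keepdims=True)      # remove each template's baseline
        d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
        sim = np.exp(-((d / r) ** 2))
        n = len(emb)
        return (sim.sum() - n) / (n * (n - 1))      # mean similarity, no self-matches
    return float(np.log(phi(m) / phi(m + 1)))

def cmfe(x, scale, m=2, r=0.2):
    """Composite multiscale FuzzyEn: average FuzzyEn over all coarse-grain offsets."""
    vals = []
    for k in range(scale):
        n = (len(x) - k) // scale
        grained = x[k:k + n * scale].reshape(n, scale).mean(axis=1)
        vals.append(fuzzy_entropy(grained, m, r))
    return float(np.mean(vals))

rng = np.random.default_rng(0)
noise = rng.standard_normal(600)                  # irregular: high complexity
tone = np.sin(2 * np.pi * np.arange(600) / 25)    # regular: low complexity
```

Averaging over all offsets is what stabilizes the estimate for short records: each offset contributes a slightly different coarse-grained series, and their FuzzyEns are pooled instead of discarded.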
CoFea: A Novel Approach to Spam Review Identification Based on Entropy and Co-Training
Directory of Open Access Journals (Sweden)
Wen Zhang
2016-11-01
Full Text Available With the rapid development of electronic commerce, spam reviews are rapidly growing on the Internet to manipulate online customers’ opinions on goods being sold. This paper proposes a novel approach, called CoFea (Co-training by Features), to identify spam reviews, based on entropy and the co-training algorithm. After sorting all lexical terms of reviews by entropy, we produce two views on the reviews by dividing the lexical terms into two subsets. One subset contains odd-numbered terms and the other contains even-numbered terms. Using SVM (support vector machine) as the base classifier, we further propose two strategies, CoFea-T and CoFea-S, embedded with the CoFea approach. The CoFea-T strategy uses all terms in the subsets for spam review identification by SVM. The CoFea-S strategy uses a predefined number of terms with small entropy for spam review identification by SVM. The experiment results show that the CoFea-T strategy produces better accuracy than the CoFea-S strategy, while the CoFea-S strategy saves more computing time than the CoFea-T strategy with acceptable accuracy in spam review identification.
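The view-splitting step can be sketched as follows: rank the vocabulary by an entropy score, then assign odd- and even-ranked terms to two views. The tiny corpus and the choice of class-distribution entropy as the score are illustrative assumptions, not the paper's dataset or exact scoring:

```python
import math
from collections import Counter

def term_entropy(term, docs, labels):
    """Entropy (bits) of the class distribution over documents containing the term."""
    classes = [lab for doc, lab in zip(docs, labels) if term in doc]
    if not classes:
        return 0.0
    n = len(classes)
    return -sum((c / n) * math.log2(c / n) for c in Counter(classes).values())

def two_views(vocab, docs, labels):
    """Sort terms by entropy, then split odd/even ranks into two views (CoFea style)."""
    ranked = sorted(vocab, key=lambda term: term_entropy(term, docs, labels))
    return ranked[0::2], ranked[1::2]

# A four-review toy corpus with hypothetical labels.
docs = [{"cheap", "deal"}, {"cheap", "great"},
        {"great", "service"}, {"deal", "service"}]
labels = ["spam", "spam", "ham", "ham"]
view_a, view_b = two_views(["cheap", "deal", "great", "service"], docs, labels)
```

Each view then feeds its own SVM in the co-training loop, with the two classifiers labeling unlabeled reviews for each other.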
McDonald, James G.; Groth, Clinton P. T.
2013-09-01
The ability to predict continuum and transition-regime flows by hyperbolic moment methods offers the promise of several advantages over traditional techniques. These methods offer an extended range of physical validity as compared with the Navier-Stokes equations and can be used for the prediction of many non-equilibrium flows with a lower expense than particle-based methods. Also, the hyperbolic first-order nature of the resulting partial differential equations leads to mathematical and numerical advantages. Moment equations generated through an entropy-maximization principle are particularly attractive due to their apparent robustness; however, their application to practical situations involving viscous, heat-conducting gases has been hampered by several issues. Firstly, the lack of closed-form expressions for closing fluxes leads to numerical expense as many integrals of distribution functions must be computed numerically during the course of a flow computation. Secondly, it has been shown that there exist physically realizable moment states for which the entropy-maximizing problem on which the method is based cannot be solved. Following a review of the theory surrounding maximum-entropy moment closures, this paper shows that both of these problems can be addressed in practice, at least for a simplified one-dimensional gas, and that the resulting flow predictions can be surprisingly good. The numerical results described provide significant motivations for the extension of these ideas to the fully three-dimensional case.
Directory of Open Access Journals (Sweden)
Jun Wu
2014-01-01
Full Text Available The safety of dangerous goods transport is directly related to the operational safety of dangerous goods transport enterprises. Aiming at the high accident rate and large harm in dangerous goods logistics transportation, this paper recasts the group decision-making problem, based on the idea of integration and coordination, as a multiagent multiobjective group decision-making problem; a two-stage decision model was established and applied to the safety assessment of dangerous goods transport enterprises. First, we used a dynamic multivalue background and entropy theory to build the first-stage multiobjective decision model. Second, experts were weighted according to the principle of clustering analysis and, combined with relative entropy theory, a second-stage aggregation optimization model based on relative entropy in group decision making was established, and its solution is discussed. Then, after investigation and analysis, we established the safety evaluation index system for dangerous goods transport enterprises. Finally, a case analysis of five dangerous goods transport enterprises in the Inner Mongolia Autonomous Region validates the feasibility and effectiveness of this model for dangerous goods transport enterprise recognition, providing a vital decision-making basis for recognizing dangerous goods transport enterprises.
Wu, Jun; Li, Chengbing; Huo, Yueying
2014-01-01
The safety of dangerous goods transport is directly related to the operational safety of dangerous goods transport enterprises. Aiming at the high accident rate and large harm in dangerous goods logistics transportation, this paper recasts the group decision-making problem, based on the idea of integration and coordination, as a multiagent multiobjective group decision-making problem; a two-stage decision model was established and applied to the safety assessment of dangerous goods transport enterprises. First, we used a dynamic multivalue background and entropy theory to build the first-stage multiobjective decision model. Second, experts were weighted according to the principle of clustering analysis and, combined with relative entropy theory, a second-stage aggregation optimization model based on relative entropy in group decision making was established, and its solution is discussed. Then, after investigation and analysis, we established the safety evaluation index system for dangerous goods transport enterprises. Finally, a case analysis of five dangerous goods transport enterprises in the Inner Mongolia Autonomous Region validates the feasibility and effectiveness of this model for dangerous goods transport enterprise recognition, providing a vital decision-making basis for recognizing dangerous goods transport enterprises.
Entropy Based Detection of DDoS Attacks in Packet Switching Network Models
Lawniczak, Anna T.; Wu, Hao; di Stefano, Bruno
Distributed denial-of-service (DDoS) attacks are network-wide attacks that cannot be detected or stopped easily. They affect “natural” spatio-temporal packet traffic patterns, i.e. “natural distributions” of packets passing through the routers. Thus, they affect “natural” information entropy profiles, a sort of “fingerprints”, of normal packet traffic. We study if by monitoring information entropy of packet traffic through selected routers one may detect DDoS attacks or anomalous packet traffic in packet switching network (PSN) models. Our simulations show that the considered DDoS attacks of “ping” type cause shifts in information entropy profiles of packet traffic monitored even at small sets of routers and that it is easier to detect these shifts if static routing is used instead of dynamic routing. Thus, network-wide monitoring of information entropy of packet traffic at properly selected routers may provide means for detecting DDoS attacks and other anomalous packet traffics.
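The entropy "fingerprint" idea can be sketched as a per-window Shannon entropy of the destination addresses passing a router: a flood aimed at one victim collapses the entropy. The addresses and threshold below are invented for illustration:

```python
import math
from collections import Counter

def window_entropy(destinations):
    """Shannon entropy (bits) of destination addresses seen in one traffic window."""
    n = len(destinations)
    return -sum((c / n) * math.log2(c / n) for c in Counter(destinations).values())

# Hypothetical per-window destination addresses at one monitored router.
normal = ["10.0.0.%d" % (i % 50) for i in range(1000)]   # traffic to many hosts
attack = ["10.0.0.7"] * 900 + normal[:100]               # flood aimed at one victim

baseline = window_entropy(normal)
observed = window_entropy(attack)
alarm = observed < 0.5 * baseline   # a sharp entropy drop flags a possible attack
```

Monitoring such profiles at several routers, as the study does in simulation, turns anomalous shifts in the entropy time series into a network-wide detection signal.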
Quantum Coherence Quantifiers Based on Rényi α-Relative Entropy
Shao, Lian-He; Li, Yong-Ming; Luo, Yu; Xi, Zheng-Jun
2017-06-01
The resource theories of quantum coherence have attracted a lot of attention in recent years. In particular, the monotonicity property plays a crucial role here. In this paper we investigate the monotonicity property for the coherence measures induced by the Rényi α-relative entropy, which were presented in [Phys. Rev. A 94 (2016) 052336]. We show that the Rényi α-relative entropy of coherence does not in general satisfy the monotonicity requirement under the subselection of measurements condition, and it also does not satisfy the extension of the monotonicity requirement presented in [Phys. Rev. A 93 (2016) 032136]. Since the Rényi α-relative entropy of coherence can act as a coherence monotone quantifier, we examine the trade-off relations between coherence and mixedness. Finally, some properties of the Rényi 2-relative entropy of coherence for a single qubit are derived. Supported by the National Natural Science Foundation of China under Grant Nos. 11271237, 11671244, 61671280, the Higher School Doctoral Subject Foundation of the Ministry of Education of China under Grant No. 20130202110001, the Fundamental Research Funds for the Central Universities (GK201502004 and 2016CBY003), and the Academic Leaders and Academic Backbones, Shaanxi Normal University, under Grant No. 16QNGG013
Entropy: From Thermodynamics to Hydrology
Directory of Open Access Journals (Sweden)
Demetris Koutsoyiannis
2014-02-01
Full Text Available Some known results from statistical thermophysics as well as from hydrology are revisited from a different perspective, trying: (a) to unify the notion of entropy in thermodynamic and statistical/stochastic approaches to complex hydrological systems and (b) to show the power of entropy and the principle of maximum entropy in inference, both deductive and inductive. The capability for deductive reasoning is illustrated by deriving the law of phase-change transition of water (Clausius-Clapeyron) from scratch by maximizing entropy in a formal probabilistic frame. However, such deductive reasoning cannot work in more complex hydrological systems with diverse elements, yet the entropy maximization framework can help in inductive inference, necessarily based on data. Several examples of this type are provided in an attempt to link statistical thermophysics with hydrology with a unifying view of entropy.
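The inductive use of the maximum entropy principle described here follows Jaynes: among all distributions matching the observed constraints, choose the one with the largest entropy. A minimal numerical sketch with a mean constraint (the classic die example, not drawn from the paper):

```python
import math

def maxent_distribution(xs, mean_target, tol=1e-10):
    """Maximum-entropy distribution on support xs with a fixed mean:
    p_i proportional to exp(-lam * x_i), lam found by bisection on the constraint."""
    xs = list(xs)
    def mean(lam):
        w = [math.exp(-lam * x) for x in xs]
        return sum(x * wi for x, wi in zip(xs, w)) / sum(w)
    lo, hi = -5.0, 5.0                 # bracket for the Lagrange multiplier
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) > mean_target:    # mean decreases as lam increases
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes' die: constraining the mean to 3.5 recovers the uniform distribution,
# while a mean of 4.5 tilts probability exponentially toward the high faces.
fair = maxent_distribution(range(1, 7), 3.5)
loaded = maxent_distribution(range(1, 7), 4.5)
```

The exponential-family form p_i ∝ exp(-λx_i) is forced by the entropy maximization itself; only the multiplier λ has to be fitted to the data, which is what makes the framework usable for inductive inference.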
Xue, Huijun; Liu, Miao; Zhang, Yang; Liang, Fulai; Qi, Fugui; Chen, Fuming; Lv, Hao; Wang, Jianqi; Zhang, Yang
2017-09-30
Ultra-wide band (UWB) radar for short-range human target detection is widely used to find and locate survivors in some rescue missions after a disaster. The results of the application of bistatic UWB radar for detecting multi-stationary human targets have shown that human targets close to the radar antennas are very often visible, while those farther from radar antennas are detected with less reliability. In this paper, on account of the significant difference of frequency content between the echo signal of the human target and that of noise in the shadowing region, an algorithm based on wavelet entropy is proposed to detect multiple targets. Our findings indicate that the entropy value of human targets was much lower than that of noise. Compared with the method of adaptive filtering and the energy spectrum, wavelet entropy can accurately detect the person farther from the radar antennas, and it can be employed as a useful tool in detecting multiple targets by bistatic UWB radar.
Directory of Open Access Journals (Sweden)
Huijun Xue
2017-09-01
Full Text Available Ultra-wide band (UWB) radar for short-range human target detection is widely used to find and locate survivors in some rescue missions after a disaster. The results of the application of bistatic UWB radar for detecting multi-stationary human targets have shown that human targets close to the radar antennas are very often visible, while those farther from radar antennas are detected with less reliability. In this paper, on account of the significant difference of frequency content between the echo signal of the human target and that of noise in the shadowing region, an algorithm based on wavelet entropy is proposed to detect multiple targets. Our findings indicate that the entropy value of human targets was much lower than that of noise. Compared with the method of adaptive filtering and the energy spectrum, wavelet entropy can accurately detect the person farther from the radar antennas, and it can be employed as a useful tool in detecting multiple targets by bistatic UWB radar.
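A wavelet entropy of the kind used here can be sketched as the Shannon entropy of the relative wavelet energy across decomposition levels. The sketch below uses a hand-rolled Haar transform and synthetic signals rather than the paper's wavelet family or radar data:

```python
import numpy as np

def haar_level(x):
    """One level of the orthonormal Haar transform: (approximation, detail)."""
    x = x[: len(x) // 2 * 2]
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def wavelet_entropy(x, levels=5):
    """Shannon entropy (bits) of the relative wavelet energy across levels."""
    a = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        a, d = haar_level(a)
        energies.append((d ** 2).sum())   # detail energy at this level
    energies.append((a ** 2).sum())       # remaining approximation energy
    p = np.array(energies) / sum(energies)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(2)
noise = rng.standard_normal(1024)                     # energy in every level
echo = np.sin(2 * np.pi * 0.005 * np.arange(1024))    # slow, regular echo

e_noise = wavelet_entropy(noise)   # high: energy spread across the levels
e_echo = wavelet_entropy(echo)     # low: energy confined to the coarse levels
```

A human echo concentrates its energy in a narrow band of levels, so its wavelet entropy sits well below that of noise, which is the contrast the detector thresholds on.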
Tang, Jinjun; Zhang, Shen; Chen, Xinqiang; Liu, Fang; Zou, Yajie
2018-03-01
Understanding the Origin-Destination (OD) distribution of taxi trips is very important for improving the effects of transportation planning and enhancing the quality of taxi services. This study proposes a new method based on Entropy-Maximizing theory to model the OD distribution in Harbin city using large-scale taxi GPS trajectories. Firstly, a K-means clustering method is utilized to partition raw pick-up and drop-off locations into different zones, and trips are assumed to start from and end at zone centers. A generalized cost function is further defined by considering travel distance, time and fee between each OD pair. GPS data collected from more than 1000 taxis at an interval of 30 s during one month are divided into two parts: data from the first twenty days are treated as the training dataset and the last ten days as the testing dataset. The training dataset is used to calibrate the model, while the testing dataset is used to validate it. Furthermore, three indicators, mean absolute error (MAE), root mean square error (RMSE) and mean percentage absolute error (MPAE), are applied to evaluate the training and testing performance of the Entropy-Maximizing model versus the Gravity model. The results demonstrate that the Entropy-Maximizing model is superior to the Gravity model. Findings of the study validate the feasibility of estimating OD distribution from taxi GPS data in urban systems.
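The entropy-maximizing trip distribution model has a standard doubly-constrained form, T_ij = A_i O_i B_j D_j exp(-beta c_ij), with balancing factors found by iterative proportional fitting. A minimal sketch under assumed zone totals, costs and beta (not the paper's calibrated values):

```python
import math

def entropy_max_od(origins, dests, cost, beta=0.5, iters=100):
    """Doubly-constrained entropy-maximizing trip distribution:
    T_ij = A_i * O_i * B_j * D_j * exp(-beta * c_ij), where the balancing
    factors A_i, B_j are found by iterative balancing so that row and
    column sums match the zonal trip-end totals."""
    n, m = len(origins), len(dests)
    A = [1.0] * n
    B = [1.0] * m
    f = [[math.exp(-beta * cost[i][j]) for j in range(m)] for i in range(n)]
    for _ in range(iters):
        for i in range(n):
            A[i] = 1.0 / sum(B[j] * dests[j] * f[i][j] for j in range(m))
        for j in range(m):
            B[j] = 1.0 / sum(A[i] * origins[i] * f[i][j] for i in range(n))
    return [[A[i] * origins[i] * B[j] * dests[j] * f[i][j] for j in range(m)]
            for i in range(n)]

origins = [100.0, 200.0]            # trips produced by each zone
dests = [150.0, 150.0]              # trips attracted by each zone
cost = [[1.0, 4.0], [4.0, 1.0]]     # generalized cost (distance/time/fee)
T = entropy_max_od(origins, dests, cost, beta=0.5)
row_sums = [sum(r) for r in T]
col_sums = [sum(T[i][j] for i in range(2)) for j in range(2)]
```

The cheaper OD pairs attract a larger share of trips while both trip-end constraints are honored.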
Logarithmic black hole entropy corrections and holographic Renyi entropy
International Nuclear Information System (INIS)
Mahapatra, Subhash
2018-01-01
The entanglement and Renyi entropies for spherical entangling surfaces in CFTs with gravity duals can be explicitly calculated by mapping these entropies first to the thermal entropy on hyperbolic space and then, using the AdS/CFT correspondence, to the Wald entropy of topological black holes. Here we extend this idea by taking into account corrections to the Wald entropy. Using the method based on horizon symmetries and the asymptotic Cardy formula, we calculate corrections to the Wald entropy and find that these corrections are proportional to the logarithm of the area of the horizon. With the corrected expression for the entropy of the black hole, we then find corrections to the Renyi entropies. We calculate these corrections for both Einstein and Gauss-Bonnet gravity duals. Corrections with logarithmic dependence on the area of the entangling surface naturally occur at the order G_D^0. The entropic c-function and the inequalities of the Renyi entropy are also satisfied even with the correction terms. (orig.)
Farokhi, Saeed; Taghavi, Ray; Keshmiri, Shawn
2015-11-01
Stealth technology is developed for military aircraft to minimize their signatures. Attention was primarily focused on the radar signature, followed by the thermal and noise signatures of the vehicle. For radar evasion, advanced configuration designs and extensive use of carbon composites and radar-absorbing material were developed. For the thermal signature, mainly in the infra-red (IR) bandwidth, the solution was found in blended rectangular nozzles of high aspect ratio that are shielded from ground detectors. For noise, quiet jets are integrated into vehicles with low-turbulence configuration design. However, these technologies are incapable of detecting the new generation of revolutionary aircraft. These will use all-electric, distributed propulsion systems that are thermally transparent. In addition, composite skin and non-emitting sensors onboard the aircraft will lead to a low signature. However, based on the second law of thermodynamics, there is no air vehicle that can escape leaving an entropy trail. Entropy is thus the only inevitable signature of any system that, once measured, can reveal its source. By characterizing the entropy field based on its statistical properties, the source may be recognized, akin to face-recognition technology. Direct measurement of entropy is cumbersome; as a derived property, however, it can be readily obtained from measured quantities. The measurement accuracy depends on the probe design and the onboard sensors. A novel air data sensor suite is introduced with promising potential to capture the entropy trail.
Lim, Jongil; Kwon, Ji Young; Song, Juhee; Choi, Hosoon; Shin, Jong Chul; Park, In Yang
2014-02-01
The interpretation of the fetal heart rate (FHR) signal considering labor progression may improve perinatal morbidity and mortality. However, there have been few studies that evaluate the fetus in each labor stage quantitatively. To evaluate whether the entropy indices of FHR differ according to labor progression, a retrospective comparative study of FHR recordings was conducted in three groups: 280 recordings in the second stage of labor before vaginal delivery, 31 recordings in the first stage of labor before emergency cesarean delivery, and 23 recordings in the pre-labor period before elective cesarean delivery. The stored FHR recordings of external cardiotocography during labor were analyzed using approximate entropy (ApEn) and sample entropy (SampEn) for the final 2000 RR intervals. The median ApEn and SampEn for the 2000 RR intervals showed the lowest values in the second stage of labor, followed by the emergency cesarean group and the elective cesarean group, for all time segments (all P values statistically significant). Entropy indices of FHR were significantly different according to labor progression. This result supports the necessity of considering labor progression when developing intrapartum fetal monitoring using the entropy indices of FHR. Copyright © 2013 Elsevier Ltd. All rights reserved.
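Sample entropy, one of the two indices used above, has a compact standard definition: the negative log of the conditional probability that sequences matching for m points also match for m+1 points. A minimal sketch on simulated RR-interval series (the series and parameter choices are illustrative, not the study's data):

```python
import math
import random

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -ln of the conditional probability that runs matching
    for m points (Chebyshev distance <= r, self-matches excluded) also
    match for m+1 points. Lower values indicate a more regular signal."""
    n = len(x)
    if r is None:
        mean = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    a = b = 0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            if max(abs(x[i + k] - x[j + k]) for k in range(m)) <= r:
                b += 1
                if abs(x[i + m] - x[j + m]) <= r:
                    a += 1
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# Simulated RR-interval series (ms): regular quasi-sinusoidal variability
# versus uncorrelated noise of the same spread.
random.seed(0)
regular = [500 + 20 * math.sin(2 * math.pi * i / 20) for i in range(300)]
irregular = [500 + random.gauss(0, 20) for i in range(300)]
se_reg = sample_entropy(regular)
se_irr = sample_entropy(irregular)
```

ApEn differs mainly in that it counts self-matches, which biases it toward regularity on short records; SampEn avoids that bias.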
Evaluation of single and multi-threshold entropy-based algorithms for folded substrate analysis
Directory of Open Access Journals (Sweden)
Magdolna Apro
2011-10-01
Full Text Available This paper presents a detailed evaluation of two variants of the Maximum Entropy image segmentation algorithm (single and multi-thresholding) with respect to their performance on segmenting test images showing folded substrates. The segmentation quality was determined by evaluating values of four different measures: misclassification error, modified Hausdorff distance, relative foreground area error and positive-negative false detection ratio. New normalization methods were proposed in order to combine all parameters into a unique algorithm evaluation rating. The segmentation algorithms were tested on images obtained by three different digitalisation methods covering four different surface textures. In addition, the methods were also tested on three images presenting a perfect fold. The obtained results showed that the Multi-Maximum Entropy algorithm is better suited for the analysis of images showing folded substrates.
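The single-threshold variant evaluated above is usually Kapur's criterion: choose the gray level that maximizes the summed Shannon entropies of the background and foreground histograms. A minimal sketch on a toy 8-bin histogram (the histogram is an illustrative assumption, not the paper's test data):

```python
import math

def max_entropy_threshold(hist):
    """Kapur's maximum-entropy threshold: pick the bin t that maximizes the
    sum of the Shannon entropies of the background (bins < t) and the
    foreground (bins >= t) probability distributions."""
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_h = 0, -1.0
    for t in range(1, len(p)):
        w0 = sum(p[:t])
        w1 = 1.0 - w0
        if w0 <= 0.0 or w1 <= 0.0:
            continue
        h0 = -sum(q / w0 * math.log(q / w0) for q in p[:t] if q > 0)
        h1 = -sum(q / w1 * math.log(q / w1) for q in p[t:] if q > 0)
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t

# Bimodal 8-bin gray-level histogram: dark background mode, bright fold mode.
hist = [5, 40, 5, 0, 0, 6, 30, 4]
t = max_entropy_threshold(hist)
```

The multi-threshold variant repeats the same entropy sum over several split points, partitioning the histogram into more than two classes.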
Predicting the Outcome of NBA Playoffs Based on the Maximum Entropy Principle
Directory of Open Access Journals (Sweden)
Ge Cheng
2016-12-01
Full Text Available Predicting the outcome of National Basketball Association (NBA) matches poses a challenging problem of interest to the research community as well as the general public. In this article, we formalize the problem of predicting NBA game results as a classification problem and apply the principle of Maximum Entropy to construct an NBA Maximum Entropy (NBAME) model that fits to discrete statistics for NBA games, and then predict the outcomes of NBA playoffs using the model. Our results reveal that the model is able to predict the winning team with 74.4% accuracy, outperforming other classical machine learning algorithms that could only afford a maximum prediction accuracy of 70.6% in the experiments that we performed.
Oracle inequalities for SVMs that are based on random entropy numbers
Energy Technology Data Exchange (ETDEWEB)
Steinwart, Ingo [Los Alamos National Laboratory
2009-01-01
In this paper we present a new technique for bounding local Rademacher averages of function classes induced by a loss function and a reproducing kernel Hilbert space (RKHS). At the heart of this technique lies the observation that certain expectations of random entropy numbers can be bounded by the eigenvalues of the integral operator associated to the RKHS. We then work out the details of the new technique by establishing two new oracle inequalities for SVMs, which complement and generalize previous results.
Using entropy measures to characterize human locomotion.
Leverick, Graham; Szturm, Tony; Wu, Christine Q
2014-12-01
Entropy measures have been widely used to quantify the complexity of theoretical and experimental dynamical systems. In this paper, the value of using entropy measures to characterize human locomotion is demonstrated based on their construct validity, predictive validity in a simple model of human walking and convergent validity in an experimental study. Results show that four of the five considered entropy measures increase meaningfully with the increased probability of falling in a simple passive bipedal walker model. The same four entropy measures also experienced statistically significant increases in response to increasing age and gait impairment caused by cognitive interference in an experimental study. Of the considered entropy measures, the proposed quantized dynamical entropy (QDE) and quantization-based approximation of sample entropy (QASE) offered the best combination of sensitivity to changes in gait dynamics and computational efficiency. Based on these results, entropy appears to be a viable candidate for assessing the stability of human locomotion.
Liang, Xuedong; Si, Dongyang; Zhang, Xinli
2017-10-13
From a scientific development perspective, sustainable development needs to consider regional development, economic and social development, and the harmonious development of society and nature, but regional sustainable development is often difficult to quantify. Through an analysis of the structure and functions of a regional system, this paper establishes an evaluation index system, which includes an economic subsystem, an ecological environmental subsystem and a social subsystem, to study regional sustainable development capacity. A sustainable development capacity measure model for Sichuan Province was established by applying the information entropy calculation principle and the Brusselator principle. Each subsystem and the entropy change in each calendar year in Sichuan Province were analyzed to evaluate Sichuan Province's sustainable development capacity. It was found that the established model could effectively show actual changes in sustainable development levels through the entropy change of the reaction system, and that it could clearly demonstrate how the forty-six indicators from the three subsystems impact regional sustainable development, addressing a gap in sustainable development research.
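The information-entropy calculation principle used for such indicator systems is commonly the entropy weight method: indicators whose values vary more across observation years carry more information and receive larger weights. A minimal sketch with two toy indicators (the data matrix is an illustrative assumption; real use requires normalizing indicators to positive, benefit-type values first):

```python
import math

def entropy_weights(data):
    """Entropy weight method for an n x m indicator matrix (rows = years or
    regions, columns = indicators, all values positive benefit-type).
    An indicator with uniform values has entropy 1 and weight 0; the more
    an indicator varies, the lower its entropy and the higher its weight."""
    n, m = len(data), len(data[0])
    entropies = []
    for j in range(m):
        col = [data[i][j] for i in range(n)]
        s = sum(col)
        p = [v / s for v in col]
        e = -sum(q * math.log(q) for q in p if q > 0) / math.log(n)
        entropies.append(e)
    d = [1.0 - e for e in entropies]   # degree of diversification
    sd = sum(d)
    return [v / sd for v in d]

# Rows = years, columns = indicators. Indicator 0 is constant (no
# information); indicator 1 varies strongly and takes all the weight.
data = [[0.2, 5.0],
        [0.2, 1.0],
        [0.2, 9.0]]
w = entropy_weights(data)
```

The resulting weights then combine the subsystem indicators into a single capacity score per year.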
International Nuclear Information System (INIS)
Zambon, Ilaria; Colantoni, Andrea; Carlucci, Margherita; Morrow, Nathan; Sateriano, Adele; Salvati, Luca
2017-01-01
Land Degradation (LD) in socio-environmental systems negatively impacts sustainable development paths. This study proposes a framework for LD evaluation based on indicators of diversification in the spatial distribution of sensitive land. We hypothesize that conditions of spatial heterogeneity in a composite index of land sensitivity are more frequently associated with areas prone to LD than spatial homogeneity. Spatial heterogeneity is supposed to be associated with degraded areas that act as hotspots for future degradation processes. A diachronic analysis (1960–2010) was performed at the Italian agricultural district scale to identify environmental factors associated with spatial heterogeneity in the degree of land sensitivity to degradation based on the Environmentally Sensitive Area Index (ESAI). In 1960, diversification in the level of land sensitivity measured using two common indexes of entropy (Shannon's diversity and Pielou's evenness) increased significantly with the ESAI, indicating a high level of land sensitivity to degradation. In 2010, the surface area classified as “critical” to LD was highest in districts with diversification in the spatial distribution of ESAI values, confirming the hypothesis formulated above. Entropy indexes, based on their observed alignment with the concept of LD, constitute a valuable base to inform mitigation strategies against desertification. - Highlights: • Spatial heterogeneity is supposed to be associated with degraded areas. • Entropy indexes can inform mitigation strategies against desertification. • Assessing spatial diversification in the degree of land sensitivity to degradation. • Mediterranean rural areas have an evident diversity in agricultural systems. • A diachronic analysis carried out at the Italian agricultural district scale.
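The two entropy indexes named above have simple closed forms over class abundances. A minimal sketch contrasting a homogeneous district with a diversified one (the class shares are illustrative assumptions):

```python
import math

def shannon_diversity(counts):
    """Shannon's diversity index H' over class abundances."""
    total = sum(counts)
    return -sum(c / total * math.log(c / total) for c in counts if c > 0)

def pielou_evenness(counts):
    """Pielou's evenness J = H' / ln(S), with S the number of occupied
    classes; J = 1 means classes are perfectly evenly represented."""
    s = sum(1 for c in counts if c > 0)
    return shannon_diversity(counts) / math.log(s) if s > 1 else 0.0

# Shares of land-sensitivity classes within two hypothetical districts.
homogeneous = [97, 1, 1, 1]     # almost all area in one sensitivity class
diversified = [25, 25, 25, 25]  # area spread evenly across classes
h_hom = shannon_diversity(homogeneous)
h_div = shannon_diversity(diversified)
j_div = pielou_evenness(diversified)
```

High values of either index flag districts where sensitivity levels are spatially mixed, which the study associates with degradation hotspots.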
Microscopic entropy and nonlocality
International Nuclear Information System (INIS)
Karpov, E.; Ordonets, G.; Petroskij, T.; Prigozhin, I.
2003-01-01
We have obtained a microscopic expression for entropy in terms of the H function based on the nonunitary Λ transformation, which leads from the time evolution as a unitary group to a Markovian dynamics and unifies the reversible and irreversible aspects of quantum mechanics. This requires a new representation outside the Hilbert space. In terms of H, we show the entropy production and the entropy flow during the emission and absorption of radiation by an atom. Analyzing the time-inversion experiment, we emphasize the importance of pre- and postcollisional correlations, which break the symmetry between incoming and outgoing waves. We consider the angle dependence of the H function in a three-dimensional situation. A model including virtual transitions is discussed in a subsequent paper.
A gravitational entropy proposal
International Nuclear Information System (INIS)
Clifton, Timothy; Tavakol, Reza; Ellis, George F R
2013-01-01
We propose a thermodynamically motivated measure of gravitational entropy based on the Bel–Robinson tensor, which has a natural interpretation as the effective super-energy–momentum tensor of free gravitational fields. The specific form of this measure differs depending on whether the gravitational field is Coulomb-like or wave-like, and reduces to the Bekenstein–Hawking value when integrated over the interior of a Schwarzschild black hole. For scalar perturbations of a Robertson–Walker geometry we find that the entropy goes like the Hubble weighted anisotropy of the gravitational field, and therefore increases as structure formation occurs. This is in keeping with our expectations for the behaviour of gravitational entropy in cosmology, and provides a thermodynamically motivated arrow of time for cosmological solutions of Einstein’s field equations. It is also in keeping with Penrose’s Weyl curvature hypothesis. (paper)
Directory of Open Access Journals (Sweden)
Xiong Luo
2016-07-01
Full Text Available With the recent emergence of wireless sensor networks (WSNs) in the cloud computing environment, it is now possible to monitor and gather physical information via many sensor nodes to meet the requirements of cloud services. Generally, those sensor nodes collect data and send it to the sink node, where end-users can query the information and run cloud applications. Currently, one of the main disadvantages of sensor nodes is their limited physical resources: little memory for storage and a limited power supply. To work around this limitation, it is necessary to develop an efficient data prediction method in WSNs. To serve this purpose, by reducing the redundant data transmission between sensor nodes and the sink node while maintaining acceptable errors, this article proposes an entropy-based learning scheme for data prediction through the use of the kernel least mean square (KLMS) algorithm. The proposed scheme, called E-KLMS, develops a mechanism to keep the predicted data synchronized on both sides. Specifically, the kernel-based method adjusts the coefficients adaptively in accordance with every input, which achieves better performance with smaller prediction errors, while information entropy is employed to remove data that may cause relatively large errors. E-KLMS can effectively solve the tradeoff between prediction accuracy and computational effort while greatly simplifying the training structure compared with some other data prediction approaches. Moreover, the kernel-based method and the entropy technique together ensure the prediction effect by both improving accuracy and reducing errors. Experiments with real data sets have been carried out to validate the efficiency and effectiveness of the E-KLMS learning scheme, and the experimental results show the advantages of our method in prediction accuracy and computational time.
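The KLMS core of the scheme above is a growing kernel expansion updated by the prediction error. A minimal sketch of plain KLMS for one-step-ahead prediction (the entropy-based sample filtering that E-KLMS adds on top is omitted; the step size, kernel width and test series are illustrative assumptions):

```python
import math

class KLMS:
    """Kernel least-mean-square filter with a Gaussian kernel: the
    predictor is a kernel expansion over past inputs, and each new sample
    adds one center weighted by (step size x prediction error)."""
    def __init__(self, step=0.5, sigma=1.0):
        self.step, self.sigma = step, sigma
        self.centers, self.alphas = [], []

    def _kernel(self, u, v):
        d2 = sum((a - b) ** 2 for a, b in zip(u, v))
        return math.exp(-d2 / (2.0 * self.sigma ** 2))

    def predict(self, u):
        return sum(a * self._kernel(u, c)
                   for a, c in zip(self.alphas, self.centers))

    def update(self, u, y):
        e = y - self.predict(u)          # prediction error on the new sample
        self.centers.append(u)           # new sample becomes a kernel center
        self.alphas.append(self.step * e)
        return e

# One-step-ahead prediction of a smooth sensor series from its last 3 values.
series = [math.sin(0.2 * i) for i in range(200)]
f = KLMS(step=0.5, sigma=1.0)
errors = []
for i in range(3, len(series)):
    u = tuple(series[i - 3:i])
    errors.append(abs(f.update(u, series[i])))
early = sum(errors[:20]) / 20
late = sum(errors[-20:]) / 20
```

In the WSN setting, the sink runs the same filter as the node; the node transmits a reading only when the shared prediction misses it by more than the allowed error, which is what cuts the redundant traffic.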
Yuntao Zhao; Hengchi Liu; Yongxin Feng
2016-01-01
DDoS attacks can prevent legitimate users from accessing a service by consuming the resources of the target nodes, exposing the availability of the network and its services to a significant threat. Therefore, DDoS traffic perception is the premise and foundation of whole-system security. In this paper, a method of DDoS traffic perception for SOA networks based on time-united conditional entropy is proposed. According to the many-to-one relationship mapping between the source IP address and destinati...
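The abstract is truncated, but the conditional-entropy idea it names is standard: under a flood, many (often spoofed) sources converge on one destination, so the conditional entropy of the destination given the source collapses. A minimal sketch of plain conditional entropy over (source, destination) pairs (the time-united construction of the paper is not reproduced; addresses below are illustrative):

```python
import math
from collections import Counter

def conditional_entropy(pairs):
    """H(dst | src) over observed (src, dst) address pairs. Under a DDoS
    flood, each source deterministically targets one victim, so
    H(dst | src) collapses toward zero."""
    n = len(pairs)
    joint = Counter(pairs)
    src = Counter(s for s, _ in pairs)
    h = 0.0
    for (s, d), c in joint.items():
        p_joint = c / n           # P(src = s, dst = d)
        p_cond = c / src[s]       # P(dst = d | src = s)
        h -= p_joint * math.log(p_cond)
    return h

# Normal traffic: a few clients spread requests over several services.
normal = [("10.0.0.%d" % (i % 5), "srv%d" % (i % 3)) for i in range(60)]
# Flood: many distinct (spoofed) sources all hitting one victim.
attack = [("spoof%d" % i, "victim") for i in range(60)]
h_normal = conditional_entropy(normal)
h_attack = conditional_entropy(attack)
```

A detector can slide this statistic over time windows and alarm when it drops sharply below its baseline.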
Directory of Open Access Journals (Sweden)
YUE Chunyu
2017-03-01
Full Text Available A matching method for space-borne laser altimeter big-footprint waveforms and terrain based on cross cumulative residual entropy (CCRE) is proposed. Firstly, the waveform data and digital surface model (DSM) data are projected into the statistics domain according to the terrain structure information of the waveform, so that the statistical signal vectors of the two data sets have the same dimension. Then, the waveform data and the DSM image are matched in the statistics domain with CCRE. Experiments show that the proposed algorithm is effective in waveform-terrain matching, with a matching accuracy within 1 pixel.
Entropy Bounds and Field Equations
Directory of Open Access Journals (Sweden)
Alessandro Pesci
2015-08-01
Full Text Available For general metric theories of gravity, we compare the approach that describes/derives the field equations of gravity as a thermodynamic identity with the one which looks at them from entropy bounds. The comparison is made through the consideration of the matter entropy flux across (Rindler) horizons, studied by making use of the notion of a limiting thermodynamic scale l* of matter, previously introduced in the context of entropy bounds. In doing this: (i) a bound for the entropy of any lump of matter with a given energy-momentum tensor Tab is considered, in terms of a quantity, which is independent of the theory of gravity that we use; this quantity is the variation of the Clausius entropy of a suitable horizon when the element of matter crosses it; (ii) by making use of the equations of motion of the theory, the same quantity is then expressed as the variation of Wald’s entropy of that horizon (and this leads to a generalized form of the generalized covariant entropy bound, applicable to general diffeomorphism-invariant theories of gravity); and (iii) a notion of l* for horizons, as well as an expression for it, is given.
Entropy and the Complexity of Graphs Revisited
Directory of Open Access Journals (Sweden)
Matthias Dehmer
2012-03-01
Full Text Available This paper presents a taxonomy and overview of approaches to the measurement of graph and network complexity. The taxonomy distinguishes between deterministic (e.g., Kolmogorov complexity) and probabilistic approaches with a view to placing entropy-based probabilistic measurement in context. Entropy-based measurement is the main focus of the paper. Relationships between the different entropy functions used to measure complexity are examined; and intrinsic (e.g., classical) measures and extrinsic (e.g., Körner entropy) variants of entropy-based models are discussed in some detail.
Ronglian, Yuan; Mingye, Ai; Qiaona, Jia; Yuxuan, Liu
2018-03-01
Sustainable development is the only way forward for human society. As an important part of the national economy, the steel industry is energy-intensive and needs to go further in pursuing sustainable development. In this paper, we use the entropy method and the TOPSIS method to evaluate the development of China’s steel industry during the “12th Five-Year Plan” from four aspects: resource utilization efficiency, main energy and material consumption, pollution status and resource reuse rate. We also put forward some suggestions for the development of China’s steel industry.
DYNAMIC PARAMETER ESTIMATION BASED ON MINIMUM CROSS-ENTROPY METHOD FOR COMBINING INFORMATION SOURCES
Czech Academy of Sciences Publication Activity Database
Sečkárová, Vladimíra
2015-01-01
Roč. 24, č. 5 (2015), s. 181-188 ISSN 0204-9805. [XVI-th International Summer Conference on Probability and Statistics (ISCPS-2014). Pomorie, 21.6.-29.6.2014] R&D Projects: GA ČR GA13-13502S Grant - others:GA UK(CZ) SVV 260225/2015 Institutional support: RVO:67985556 Keywords : minimum cross-entropy principle * Kullback-Leibler divergence * dynamic diffusion estimation Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2015/AS/seckarova-0445817.pdf
Cleavage entropy as quantitative measure of protease specificity.
Directory of Open Access Journals (Sweden)
Julian E Fuchs
2013-04-01
Full Text Available A purely information theory-guided approach to quantitatively characterize protease specificity is established. We calculate an entropy value for each protease subpocket based on sequences of cleaved substrates extracted from the MEROPS database. We compare our results with known subpocket specificity profiles for individual proteases and protease groups (e.g., serine proteases, metalloproteases) and reflect them quantitatively. Summation of subpocket-wise cleavage entropy contributions yields a measure for overall protease substrate specificity. This total cleavage entropy allows ranking of different proteases with respect to their specificity, separating unspecific digestive enzymes showing high total cleavage entropy from specific proteases involved in signaling cascades. The development of a quantitative cleavage entropy score allows an unbiased comparison of subpocket-wise and overall protease specificity. Thus, it enables assessment of relative importance of physicochemical and structural descriptors in protease recognition. We present an exemplary application of cleavage entropy in tracing substrate specificity in protease evolution. This highlights the wide range of substrate promiscuity within homologue proteases and hence the heavy impact of a limited number of mutations on individual substrate specificity.
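The score above reduces to per-position Shannon entropies over amino-acid frequencies in aligned cleavage sites, summed across subpockets. A minimal sketch on toy 4-residue sites (sequences and normalization details are illustrative assumptions, not MEROPS-derived data):

```python
import math
from collections import Counter

def subpocket_entropy(residues, alphabet_size=20):
    """Shannon entropy of amino-acid frequencies at one subpocket position,
    normalized by ln(alphabet size): 0 = strict preference for one residue,
    1 = no preference at all."""
    n = len(residues)
    counts = Counter(residues)
    h = -sum(c / n * math.log(c / n) for c in counts.values())
    return h / math.log(alphabet_size)

def total_cleavage_entropy(substrates):
    """Sum of subpocket-wise entropies across aligned cleavage-site
    positions of a set of cleaved substrate sequences."""
    return sum(subpocket_entropy(pos) for pos in zip(*substrates))

# Toy 4-residue cleavage sites: a specific protease with a fixed P1 = 'K'
# (trypsin-like preference) vs. an unspecific digestive enzyme.
specific = ["AKSG", "VKSA", "LKSG", "IKSA"]
unspecific = ["AGSV", "KLPE", "WYTN", "FHRC"]
```

Ranking proteases by this total then separates promiscuous digestive enzymes (high values) from specific signaling proteases (low values), as the abstract describes.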
Multivariate refined composite multiscale entropy analysis
Energy Technology Data Exchange (ETDEWEB)
Humeau-Heurtier, Anne, E-mail: anne.humeau@univ-angers.fr
2016-04-01
Multiscale entropy (MSE) has become a prevailing method to quantify the complexity of signals. MSE relies on sample entropy. However, MSE may yield imprecise complexity estimates at large scales, because sample entropy does not give a precise estimate of entropy when short signals are processed. A refined composite multiscale entropy (RCMSE) has therefore recently been proposed. Nevertheless, RCMSE is for univariate signals only. The simultaneous analysis of multi-channel (multivariate) data often outperforms studies based on univariate signals. We therefore introduce an extension of RCMSE to multivariate data. Applications of multivariate RCMSE to simulated processes reveal its better performances over the standard multivariate MSE. - Highlights: • Multiscale entropy quantifies data complexity but may be inaccurate at large scale. • A refined composite multiscale entropy (RCMSE) has therefore recently been proposed. • Nevertheless, RCMSE is adapted to univariate time series only. • We herein introduce an extension of RCMSE to multivariate data. • It shows better performances than the standard multivariate multiscale entropy.
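The univariate RCMSE that this work extends can be sketched compactly: at scale τ, pool the sample-entropy match counts over all τ coarse-graining offsets, then take a single logarithm. This is the refinement that keeps the estimate defined on short series. The signal length, tolerance and white-noise test below are illustrative assumptions:

```python
import math
import random

def _match_counts(x, m, r):
    """Template-pair match counts at lengths m and m+1
    (Chebyshev distance <= r, self-matches excluded)."""
    n = len(x)
    a = b = 0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            if max(abs(x[i + k] - x[j + k]) for k in range(m)) <= r:
                b += 1
                if abs(x[i + m] - x[j + m]) <= r:
                    a += 1
    return a, b

def rcmse(x, scale, m=2, r=None):
    """Refined composite multiscale entropy at one scale: sum the match
    counts over all `scale` coarse-graining offsets, then take one log,
    avoiding the undefined values plain MSE can produce on short series."""
    if r is None:
        mean = sum(x) / len(x)
        r = 0.2 * math.sqrt(sum((v - mean) ** 2 for v in x) / len(x))
    A = B = 0
    for offset in range(scale):
        k = (len(x) - offset) // scale
        cg = [sum(x[offset + i * scale: offset + (i + 1) * scale]) / scale
              for i in range(k)]
        a, b = _match_counts(cg, m, r)
        A += a
        B += b
    return -math.log(A / B) if A > 0 and B > 0 else float("inf")

# For white noise, entropy should decrease as the scale grows.
random.seed(1)
white = [random.gauss(0, 1) for _ in range(400)]
e1 = rcmse(white, scale=1)
e3 = rcmse(white, scale=3)
```

The multivariate extension of the paper replaces the scalar templates with multi-channel ones; the pooled-count refinement carries over unchanged.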
Entropy of a system formed in the collision of heavy ions
International Nuclear Information System (INIS)
Gudima, K.K.; Roepke, G.; Toneev, V.D.; Schulz, H.
1987-01-01
In the framework of the cascade model, we study the evolution of the entropy of a system formed in the collision of heavy ions. The method of calculating the entropy is based on a smoothing of the momentum distribution function by means of introducing a temperature field. It is shown that the resulting entropy per nucleon is very sensitive to the specific partitioning of phase space into cells in the free-expansion phase of the reaction. From comparison with experiment it is found that the cascade calculations do not show a preference for a particular model calculation of the entropy, but predict that the entropy is smaller than the values following from equilibrium statistics
Harvey, Raymond A; Hayden, Jennifer D; Kamble, Pravin S; Bouchard, Jonathan R; Huang, Joanna C
2017-04-01
We compared methods to control bias and confounding in observational studies including inverse probability weighting (IPW) and stabilized IPW (sIPW). These methods often require iteration and post-calibration to achieve covariate balance. In comparison, entropy balance (EB) optimizes covariate balance a priori by calibrating weights using the target's moments as constraints. We measured covariate balance empirically and by simulation by using absolute standardized mean difference (ASMD), absolute bias (AB), and root mean square error (RMSE), investigating two scenarios: the size of the observed (exposed) cohort exceeds the target (unexposed) cohort and vice versa. The empirical application weighted a commercial health plan cohort to a nationally representative National Health and Nutrition Examination Survey target on the same covariates and compared average total health care cost estimates across methods. Entropy balance alone achieved balance (ASMD ≤ 0.10) on all covariates in simulation and empirically. In simulation scenario I, EB achieved the lowest AB and RMSE (13.64, 31.19) compared with IPW (263.05, 263.99) and sIPW (319.91, 320.71). In scenario II, EB outperformed IPW and sIPW with smaller AB and RMSE. In scenarios I and II, EB achieved the lowest mean estimate difference from the simulated population outcome ($490.05, $487.62) compared with IPW and sIPW, respectively. Empirically, only EB differed from the unweighted mean cost indicating IPW, and sIPW weighting was ineffective. Entropy balance demonstrated the bias-variance tradeoff achieving higher estimate accuracy, yet lower estimate precision, compared with IPW methods. EB weighting required no post-processing and effectively mitigated observed bias and confounding. Copyright © 2016 John Wiley & Sons, Ltd.
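The "a priori" calibration that distinguishes entropy balance can be seen in its one-covariate special case: the maximum-entropy weights that hit a target moment exactly have the exponential-tilting form w_i ∝ exp(λ x_i), with λ solved from the constraint. A minimal sketch (real EB balances many moments at once, typically via convex optimization; the cohort and target below are illustrative assumptions):

```python
import math

def entropy_balance_1d(x, target_mean, iters=100):
    """Entropy balancing, one-covariate special case: find the weights
    closest (in KL divergence) to uniform whose weighted mean of x equals
    target_mean. Dual: w_i proportional to exp(lam * x_i); solve for lam
    by Newton's method (the derivative of the weighted mean with respect
    to lam is the weighted variance)."""
    lam = 0.0
    for _ in range(iters):
        ws = [math.exp(lam * v) for v in x]
        s = sum(ws)
        mean = sum(w * v for w, v in zip(ws, x)) / s
        var = sum(w * v * v for w, v in zip(ws, x)) / s - mean * mean
        if var < 1e-12:
            break
        lam += (target_mean - mean) / var
    ws = [math.exp(lam * v) for v in x]
    s = sum(ws)
    return [w / s for w in ws]

# Reweight an observed cohort so its mean age matches a target population's.
ages = [30.0, 35.0, 40.0, 45.0, 50.0, 60.0]
w = entropy_balance_1d(ages, 50.0)
wmean = sum(wi * a for wi, a in zip(w, ages))
```

Because the constraint holds exactly at the solution, no IPW-style post-calibration or iteration on balance diagnostics is needed, which is the practical advantage the abstract reports.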
Directory of Open Access Journals (Sweden)
Guoqiang Xu
2017-10-01
Full Text Available Active control of heat flux can be realized with transformation optics (TO) thermal metamaterials. Recently, a new class of metamaterial tunable cells has been proposed, aiming to significantly reduce the difficulty of fabrication and to flexibly switch functions by employing several cells assembled on related positions following the TO design. However, owing to the integration and rotation of materials in tunable cells, they might lead to extra thermal losses as compared with the previous continuum design. This paper focuses on investigating the thermodynamic properties of tunable cells under related design parameters. The universal expression for the local entropy generation rate in such metamaterial systems is obtained considering the influence of rotation. A series of contrast schemes are established to describe the thermodynamic process and thermal energy distributions from the viewpoint of entropy analysis. Moreover, effects of design parameters on thermal dissipations and system irreversibility are investigated. In conclusion, more thermal dissipations and stronger thermodynamic processes occur in a system with larger conductivity ratios and rotation angles. This paper presents a detailed description of the thermodynamic properties of metamaterial tunable cells and provides reference for selecting appropriate design parameters on related positions to fabricate more efficient and energy-economical switchable TO devices.
Railway Container Station Reselection Approach and Application: Based on Entropy-Cloud Model
Directory of Open Access Journals (Sweden)
Wencheng Huang
2017-01-01
Full Text Available A reasonable railway container freight station layout means higher transportation efficiency and lower transportation cost. To obtain more objective and accurate reselection results, a new entropy-cloud approach is formulated to solve the problem. The approach comprises three phases: the Entropy Method is used to obtain the weight of each subcriterion in Phase 1, a cloud model is designed to form the evaluation cloud for each subcriterion in Phase 2, and finally in Phase 3 the weights from Phase 1 multiply the initial evaluation clouds from Phase 2. MATLAB is applied to determine the evaluation figures and help us make the final alternative decision. To test our approach, the railway container stations in the Wuhan Railway Bureau were selected for our case study. The final evaluation result indicates that only Xiangyang Station should be renovated and developed as a Special Transaction Station, five other stations should be kept and developed as Ordinary Stations, and the remaining 16 stations should be closed. Furthermore, the results show that, before the site reselection process, the average distance between two railway container stations was only 74.7 km, but it improved to 182.6 km after using the approach formulated in this paper.
Entropy Based Test Point Evaluation and Selection Method for Analog Circuit Fault Diagnosis
Directory of Open Access Journals (Sweden)
Yuan Gao
2014-01-01
Full Text Available By simplifying the tolerance problem and treating faulty voltages on different test points as independent variables, the integer-coded table technique has been proposed to simplify the test point selection process. Usually, simplifying the tolerance problem may induce a wrong solution, while the independence assumption results in an overly conservative solution. To address these problems, the tolerance problem is thoroughly considered in this paper, and the dependency relationship between different test points is considered at the same time. A heuristic graph search method is proposed to facilitate the test point selection process. First, the information theoretic concept of entropy is used to evaluate the optimality of a test point. The entropy is calculated by using the ambiguity sets and the faulty voltage distribution, determined by component tolerance. Second, the selected optimal test point is used to expand the current graph node by using the dependence relationship between the test point and the graph node. Simulated results indicate that the proposed method finds the optimal set of test points more accurately than other methods; therefore, it is a good solution to minimize the size of the test point set. To simplify and clarify the proposed method, only catastrophic and some specific parametric faults are discussed in this paper.
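The entropy-based evaluation of a candidate test point can be sketched in a simplified discrete form: a test point partitions the fault set into ambiguity sets it cannot distinguish, and the better test point is the one with the larger expected entropy reduction. The paper's actual computation uses faulty-voltage distributions under tolerance; the uniform priors and partitions below are illustrative assumptions:

```python
import math

def entropy_of(ps):
    """Shannon entropy of a (possibly unnormalized) probability list."""
    t = sum(ps)
    return -sum(p / t * math.log(p / t) for p in ps if p > 0)

def information_gain(priors, groups):
    """Expected reduction in fault-set entropy from reading one test point,
    where `groups` partitions fault indices into the ambiguity sets that
    the test point cannot distinguish."""
    total = sum(priors)
    remaining = 0.0
    for g in groups:
        ps = [priors[i] for i in g]
        remaining += sum(ps) / total * entropy_of(ps)
    return entropy_of(priors) - remaining

# Four equally likely faults; one candidate test point splits them evenly,
# another splits them poorly.
priors = [0.25, 0.25, 0.25, 0.25]
g1 = information_gain(priors, [[0, 1], [2, 3]])   # balanced split
g2 = information_gain(priors, [[0, 1, 2], [3]])   # unbalanced split
```

The graph search then greedily expands nodes with the highest-gain test points until every fault pair is separated.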
Directory of Open Access Journals (Sweden)
Varun Oswal
2013-01-01
Full Text Available The segmentation and quantification of cell nuclei are two very significant tasks in the analysis of histological images. Accurate results of cell nuclei segmentation are often adapted to a variety of applications such as the detection of cancerous cell nuclei and the observation of overlapping cellular events occurring during wound healing process in the human body. In this paper, an automated entropy-based thresholding system for segmentation and quantification of cell nuclei from histologically stained images has been presented. The proposed translational computation system aims to integrate clinical insight and computational analysis by identifying and segmenting objects of interest within histological images. Objects of interest and background regions are automatically distinguished by dynamically determining 3 optimal threshold values for the 3 color components of an input image. The threshold values are determined by means of entropy computations that are based on probability distributions of the color intensities of pixels and the spatial similarity of pixel intensities within neighborhoods. The effectiveness of the proposed system was tested over 21 histologically stained images containing approximately 1800 cell nuclei, and the overall performance of the algorithm was found to be promising, with high accuracy and precision values.
2018-01-01
Exhaust gas recirculation (EGR) is one of the main methods of reducing NOX emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate the performance of EGR and to determine its optimal rate, tasks that currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate its feasibility for providing theoretical support and a reference for further EGR optimization. PMID:29377956
Directory of Open Access Journals (Sweden)
Li Pan
2016-03-01
Full Text Available Virtualization technologies make it possible for cloud providers to consolidate multiple IaaS provisions into a single server in the form of virtual machines (VMs. Additionally, in order to fulfill the divergent service requirements from multiple users, a cloud provider needs to offer several types of VM instances, which are associated with varying configurations and performance, as well as different prices. In such a heterogeneous virtual machine placement process, one significant problem faced by a cloud provider is how to optimally accept and place multiple VM service requests into its cloud data centers to achieve revenue maximization. To address this issue, in this paper, we first formulate such a revenue maximization problem during VM admission control as a multiple-dimensional knapsack problem, which is known to be NP-hard to solve. Then, we propose to use a cross-entropy-based optimization approach to address this revenue maximization problem, by obtaining a near-optimal eligible set for the provider to accept into its data centers, from the waiting VM service requests in the system. Finally, through extensive experiments and measurements in a simulated environment with the settings of VM instance classes derived from real-world cloud systems, we show that our proposed cross-entropy-based admission control optimization algorithm is efficient and effective in maximizing cloud providers’ revenue in a public cloud computing environment.
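The cross-entropy optimization loop the abstract describes can be sketched on a toy two-resource knapsack; the instance sizes, parameter values and function names below are invented for illustration and are not the authors' formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instance: 20 VM requests, revenue r, two resource dimensions.
n = 20
r = rng.uniform(1, 10, n)        # revenue per accepted request
w = rng.uniform(1, 5, (2, n))    # CPU and RAM demand of each request
cap = w.sum(axis=1) * 0.5        # capacity = half of the total demand

def value(x):
    """Total revenue of an accept/reject vector, -inf if infeasible."""
    if np.any(w @ x > cap):
        return -np.inf
    return float(r @ x)

def cross_entropy_knapsack(iters=60, samples=200, elite_frac=0.1, alpha=0.7):
    p = np.full(n, 0.5)          # Bernoulli acceptance probabilities
    best_x, best_v = np.zeros(n), -np.inf
    for _ in range(iters):
        X = (rng.random((samples, n)) < p).astype(float)   # sample solutions
        vals = np.array([value(x) for x in X])
        elite = X[np.argsort(vals)[-int(samples * elite_frac):]]
        p = alpha * elite.mean(axis=0) + (1 - alpha) * p   # smoothed CE update
        i = int(np.argmax(vals))
        if vals[i] > best_v:
            best_v, best_x = vals[i], X[i]
    return best_x, best_v
```

The cross-entropy update concentrates the sampling distribution on the elite (highest-revenue feasible) solutions, which is the mechanism behind the near-optimal admission sets reported in the paper.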
Aur, Dorian; Vila-Rodriguez, Fidel
2017-01-01
Complexity measures for time series have been used in many applications to quantify the regularity of one-dimensional time series; however, many dynamical systems are spatially distributed multidimensional systems. We introduce Dynamic Cross-Entropy (DCE), a novel multidimensional complexity measure that quantifies the degree of regularity of EEG signals in selected frequency bands. Time series generated by discrete logistic equations with varying control parameter r are used to test the DCE measure. Sliding-window DCE analyses are able to reveal the specific period-doubling bifurcations that lead to chaos. A similar behavior can be observed in seizures triggered by electroconvulsive therapy (ECT). Sample entropy data show the level of signal complexity in different phases of the ictal ECT. The transition to irregular activity is preceded by the occurrence of cyclic regular behavior. A significant increase of DCE values in successive order, from high frequencies in the gamma band to low frequencies in the delta band, reveals several phase transitions into less ordered states, possibly chaos in the human brain. To our knowledge there are no reliable techniques able to reveal the transition to chaos in the case of multidimensional time series. In addition, DCE based on sample entropy appears to be robust to EEG artifacts compared to DCE based on Shannon entropy. The applied technique may offer new approaches to better understand nonlinear brain activity. Copyright © 2016 Elsevier B.V. All rights reserved.
Towards information inequalities for generalized graph entropies.
Directory of Open Access Journals (Sweden)
Lavanya Sivakumar
Full Text Available In this article, we discuss the problem of establishing relations between information measures for network structures. Two types of entropy-based measures, namely the Shannon entropy and its generalization, the Rényi entropy, are considered in this study. Our main results establish formal relationships, by means of inequalities, between these two kinds of measures. Further, we also state and prove inequalities connecting the classical partition-based graph entropies and partition-independent entropy measures. In addition, several explicit inequalities are derived for special classes of graphs.
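For intuition, the two entropies can be compared on the degree-based probability distribution of a small graph; this degree-based construction is only one of the many graph-entropy definitions the article treats, and is used here purely as an example:

```python
import math
from collections import Counter

def degree_distribution(edges):
    """Probability distribution over vertex degrees of a simple graph."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    total = sum(deg.values())
    return [d / total for d in deg.values()]

def shannon_entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha != 1); tends to Shannon as alpha -> 1."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

# A path on 4 vertices: degrees 1, 2, 2, 1, so p = (1/6, 2/6, 2/6, 1/6).
p = degree_distribution([(0, 1), (1, 2), (2, 3)])
print(round(shannon_entropy(p), 3), round(renyi_entropy(p, 2), 3))  # 1.918 1.848
```

Since the Rényi entropy is non-increasing in its order, the order-2 value lower-bounds the Shannon value, a simple instance of the kind of inequality the article establishes.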
Directory of Open Access Journals (Sweden)
Jorge Pereira
2015-12-01
Full Text Available Biological invasion by exotic organisms has become a key issue, given the deep impacts such processes have across several domains. A better understanding of these processes, the identification of the most susceptible areas, and the definition of preventive or mitigation measures are critical to reducing the associated impacts. Species distribution modeling can help identify areas that are more susceptible to invasion. This paper presents preliminary results on assessing susceptibility to invasion by the exotic species Acacia dealbata Mill. in the Ceira river basin. The results are based on the maximum entropy modeling approach, considered one of the correlative modeling techniques with the best predictive performance. Models validated on independent data sets show better performance; the evaluation is based on the AUC of the ROC accuracy measure.
Energy Technology Data Exchange (ETDEWEB)
Huang, Jian; Hu, Xiaoguang; Geng, Xin [School of Automation Science and Electrical Engineering, Beijing University of Aeronautics and Astronautics, Beijing, 100191 (China)
2011-02-15
Targeting the characteristics of machinery vibration signals of high-voltage circuit breakers (CBs), a new fault-diagnosis method based on improved empirical mode decomposition (EMD) energy entropy and a multi-class support vector machine (MSVM) is proposed. Feature extraction based on improved EMD energy entropy is analyzed in detail. A new multi-layered SVM classification scheme, the 'one against others' algorithm, is proposed and applied to machinery fault diagnosis of high-voltage CBs. The extracted features are fed to the MSVM to estimate the fault type. Compared with a back-propagation network (BPN), the test results demonstrate that applying improved EMD energy entropy to vibration signals is superior to an approach based on wavelet packet analysis (WPT), and hence estimates the fault type of high-voltage CB machinery accurately and quickly. (author)
Directory of Open Access Journals (Sweden)
Kartik V. Bulusu
2015-09-01
Full Text Available The coherent secondary flow structures (i.e., swirling motions) in a curved artery model possess a variety of spatio-temporal morphologies and can be encoded over an infinitely wide range of wavelet scales. Wavelet analysis was applied to the following vorticity fields: (i) a numerically generated system of Oseen-type vortices for which the theoretical solution is known, used for benchmarking and evaluation of the technique; and (ii) experimental two-dimensional particle image velocimetry data. The mother wavelet, a two-dimensional Ricker wavelet, can be dilated to infinitely large or infinitesimally small scales. We approached the problem of coherent structure detection by means of the continuous wavelet transform (CWT) and decomposition (or Shannon) entropy. The main conclusion of this study is that the encoding of coherent secondary flow structures can be achieved by an optimal number of binary digits (or bits) corresponding to an optimal wavelet scale. The optimal wavelet-scale search was driven by a decomposition entropy-based algorithmic approach and led to a threshold-free coherent structure detection method. The method presented in this paper was successfully utilized in the detection of secondary flow structures in three clinically relevant blood flow scenarios involving the curved artery model under a carotid artery-inspired, pulsatile inflow condition. These scenarios were: (i) a clean curved artery; (ii) a stent-implanted curved artery; and (iii) an idealized Type IV stent fracture within the curved artery.
Directory of Open Access Journals (Sweden)
Yan Gao
2014-01-01
Full Text Available With the rapid development and application of medical sensor networks, security has become a big challenge to be resolved. Trust mechanisms, as a method of "soft security", have been proposed to guarantee network security. In this paper, trust models to compute the trustworthiness of a single node and of each path are constructed, respectively. For the trust relationship between nodes, the trust value in every interval is quantified based on Bayesian inference. A node estimates the parameters of the prior distribution by using the collected recommendation information and obtains the posterior distribution by combining it with direct interactions. Further, the weights of trust values are allocated by applying the ordered weighted vector twice, and the overall trust degree is represented. Using the associated properties of Tsallis entropy, the definition of path Tsallis entropy is put forward, which can comprehensively measure the uncertainty of each path. A method to calculate the credibility of each path is then derived. The simulation results show that the proposed models can correctly reflect the dynamics of node behavior, quickly identify malicious attacks, and effectively avoid paths containing low-trust nodes, so as to enhance robustness.
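The Tsallis entropy at the heart of the path measure is easy to state; the path_uncertainty function below is a hypothetical normalization used only for illustration, not the paper's exact trust model:

```python
def tsallis_entropy(p, q=2.0):
    """Tsallis entropy S_q = (1 - sum(p_i**q)) / (q - 1); q -> 1 recovers Shannon."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def path_uncertainty(trust_values, q=2.0):
    """Hypothetical path measure: normalize the per-node trust values on a
    path into a distribution and take its Tsallis entropy. Trust spread
    evenly along the path (maximum entropy) means maximal uncertainty."""
    total = sum(trust_values)
    p = [t / total for t in trust_values]
    return tsallis_entropy(p, q)

# A path with one low-trust node concentrates the distribution and so
# has lower Tsallis entropy than a uniformly trusted path.
print(path_uncertainty([0.9, 0.9, 0.9]) > path_uncertainty([0.9, 0.9, 0.1]))  # True
```

The non-extensivity parameter q controls how strongly concentrated distributions are penalized, which is one reason Tsallis entropy is chosen over Shannon entropy for this kind of aggregate measure.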
International Nuclear Information System (INIS)
Avci, E.
2007-01-01
In this paper, an automatic system is presented for word recognition using real Turkish word signals. The paper especially deals with the combination of feature extraction and classification from real Turkish word signals. A Discrete Wavelet Neural Network (DWNN) model is used, which consists of two layers: a discrete wavelet layer and a multi-layer perceptron. The discrete wavelet layer is used for adaptive feature extraction in the time-frequency domain and is composed of the Discrete Wavelet Transform (DWT) and wavelet entropy. The multi-layer perceptron used for classification is a feed-forward neural network. The performance of the system is evaluated using noisy Turkish word signals. Test results showing the effectiveness of the proposed automatic system are presented. The rate of correct recognition is about 92.5% for the sample speech signals. (author)
Bai, Xiao-ping; Zhang, Xi-wei
2013-01-01
Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes; uses a quantitative method for the cost index; uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes; and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, the paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for the risk computation of building construction projects.
Wavelet-Based Multi-Scale Entropy Analysis of Complex Rainfall Time Series
Directory of Open Access Journals (Sweden)
Chien-Ming Chou
2011-01-01
Full Text Available This paper presents a novel framework to determine the number of resolution levels in the application of a wavelet transformation to a rainfall time series. The rainfall time series are decomposed using the à trous wavelet transform. Then, multi-scale entropy (MSE) analysis, which helps to elucidate some hidden characteristics of the original rainfall time series, is applied to the decomposed series. The analysis shows that the Mann-Kendall (MK) rank correlation test of the MSE curves of residuals at various resolution levels can determine the number of resolution levels in the wavelet decomposition. The multi-scale complexity of rainfall time series at four stations is compared. The results reveal that the suggested number of resolution levels can be obtained using MSE analysis and the MK test. The complexity of rainfall time series at various locations can also be analyzed to provide a reference for water resource planning and application.
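A compact sketch of multi-scale entropy, assuming the common coarse-graining-plus-sample-entropy formulation (the paper's wavelet decomposition and MK test are not reproduced here):

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r): negative log of the ratio of (m+1)- to m-length
    template matches within tolerance r (Chebyshev distance)."""
    x = np.asarray(x, float)
    r = r_frac * x.std()
    def match_count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templ[:, None] - templ[None, :]), axis=2)
        return (d <= r).sum() - len(templ)   # exclude self-matches
    a, b = match_count(m + 1), match_count(m)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=(1, 2, 3, 4)):
    """Coarse-grain by non-overlapping means, then SampEn at each scale."""
    x = np.asarray(x, float)
    return [sample_entropy(x[:len(x) // s * s].reshape(-1, s).mean(axis=1))
            for s in scales]
```

Plotting the MSE curve over scales, rather than a single entropy value, is what lets this kind of analysis separate genuinely complex series from merely noisy ones.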
Entropy-viscosity based LES of turbulent flow in a flexible pipe
Wang, Zhicheng; Xie, Fangfang; Triantafyllou, Michael; Constantinides, Yiannis; Karniadakis, George
2016-11-01
We present large-eddy simulations (LES) of turbulent flow in a flexible pipe conveying incompressible fluid. We are interested in quantifying the flow-structure interaction in terms of mean quantities and their variances. For the LES, we employ an Entropy Viscosity Method (EVM), implemented in a spectral element code. In previous work, we investigated laminar flow, studied the complex interaction between structural and internal flow dynamics, and obtained a phase diagram of the transition between states as a function of three non-dimensional quantities: the fluid-tension parameter, the dimensionless fluid velocity, and the Reynolds number. Here we extend our studies to the turbulent regime, with Re from 5,000 to 50,000. The motion of the flexible pipe greatly affects the turbulence statistics of the pipe flow, with substantial differences between free (self-sustained) vibrations and prescribed (forced) vibrations.
Entropy and graph based modelling of document coherence using discourse entities
DEFF Research Database (Denmark)
Petersen, Casper; Lioma, Christina; Simonsen, Jakob Grue
2015-01-01
We present two novel models of document coherence and their application to information retrieval (IR). Both models approximate document coherence using discourse entities, e.g. the subject or object of a sentence. Our first model views text as a Markov process generating sequences of discourse entities (entity n-grams); we use the entropy of these entity n-grams to approximate the rate at which new information appears in text, reasoning that as more new words appear, the topic increasingly drifts and text coherence decreases. Our second model extends the work of Guinaudeau & Strube [28] that represents text as a graph of discourse entities, linked by different relations, such as their distance or adjacency in text. We use several graph topology metrics to approximate different aspects of the discourse flow that can indicate coherence, such as the average clustering or betweenness of discourse entities.
International Nuclear Information System (INIS)
Ehrfeld, W.; Schelb, W.
1984-09-01
The effects exerted by separation of the heavy component and the light auxiliary gas, strong variations in the state of the gas in the direction of flow, and strong deviations from local equilibrium upon both isotope separation and pressure losses in separation nozzles are studied. The flow field and the separation of the heavy and light components are investigated. This simulation shows that the separation of the heavy component and the auxiliary gas results in a damping of isotope separation near the curved wall of the nozzle. The strong variations in the state of the gas in the direction of flow result in a large velocity slip between the components in the inner regions of flow. The examination of the local production of entropy shows that the diffusion of the heavy component through the light auxiliary gas may cause up to 30% of the total pressure losses. (orig./HP) [de
Asymmetric asynchrony of financial time series based on asymmetric multiscale cross-sample entropy
Yin, Yi; Shang, Pengjian
2015-03-01
The paper proposes the asymmetric multiscale cross-sample entropy (AMCSE) method and applies it to analyze financial time series from the US, Chinese, and European stock markets. The asynchronies of these time series all decrease (the correlations increase) as the scale increases, which indicates that considering larger time scales is capable of revealing the intrinsic relations between these stock markets. Meanwhile, we find a crossover between the upward and downward series in these AMCSE results: when the scale reaches a certain value, the asynchronies of the upward and downward series are equal and symmetric. At the other scales, the asynchronies of the upward and downward series differ from each other, indicating the necessity and importance of multiscale analysis for revealing the most comprehensive information about stock markets. Series with a positive trend show a faster decrease in asynchrony than those with a negative trend, while the asynchrony between series with a positive or negative trend is lower than that between the original series. Moreover, it is noticeable that there are small abnormal rises at some abnormal scales. We find that the asynchronies are highest at scales smaller than 2 when investigating the time series of stock markets with a negative trend. The existence of these asymmetries reveals the inaccuracy and weakness of plain multiscale cross-sample entropy. Comparing the asymmetries of the US, Chinese, and European markets leads to similar conclusions: the asymmetries of the Chinese markets are the smallest and those of the European markets are the biggest. Thus, it is of great value and benefit to investigate series with different trends using the AMCSE method.
Excess Entropy and Diffusivity
Indian Academy of Sciences (India)
Excess entropy scaling of diffusivity (Rosenfeld, 1977). Analogous relationships also exist for viscosity and thermal conductivity.
Davis, Tyler; Love, Bradley C.; Preston, Alison R.
2012-01-01
Category learning is a complex phenomenon that engages multiple cognitive processes, many of which occur simultaneously and unfold dynamically over time. For example, as people encounter objects in the world, they simultaneously engage processes to determine their fit with current knowledge structures, gather new information about the objects, and adjust their representations to support behavior in future encounters. Many techniques that are available to understand the neural basis of category learning assume that the multiple processes that subserve it can be neatly separated between different trials of an experiment. Model-based functional magnetic resonance imaging offers a promising tool to separate multiple, simultaneously occurring processes and bring the analysis of neuroimaging data more in line with category learning’s dynamic and multifaceted nature. We use model-based imaging to explore the neural basis of recognition and entropy signals in the medial temporal lobe and striatum that are engaged while participants learn to categorize novel stimuli. Consistent with theories suggesting a role for the anterior hippocampus and ventral striatum in motivated learning in response to uncertainty, we find that activation in both regions correlates with a model-based measure of entropy. Simultaneously, separate subregions of the hippocampus and striatum exhibit activation correlated with a model-based recognition strength measure. Our results suggest that model-based analyses are exceptionally useful for extracting information about cognitive processes from neuroimaging data. Models provide a basis for identifying the multiple neural processes that contribute to behavior, and neuroimaging data can provide a powerful test bed for constraining and testing model predictions. PMID:22746951
Chemical Engineering Students' Ideas of Entropy
Haglund, Jesper; Andersson, Staffan; Elmgren, Maja
2015-01-01
Thermodynamics, and in particular entropy, has been found to be challenging for students, not least due to its abstract character. Comparisons with more familiar and concrete domains, by means of analogy and metaphor, are commonly used in thermodynamics teaching, in particular the metaphor "entropy is disorder." However, this particular…
Controlling the Shannon Entropy of Quantum Systems
Xing, Yifan; Wu, Jun
2013-01-01
This paper proposes a new quantum control method which controls the Shannon entropy of quantum systems. For both discrete and continuous entropies, controller design methods are proposed based on probability density function control, which can drive the quantum state to any target state. To drive the entropy to any target at any prespecified time, another discretization method is proposed for the discrete entropy case, and the conditions under which the entropy can be increased or decreased are discussed. Simulations are done on both two- and three-dimensional quantum systems, where division and prediction are used to achieve more accurate tracking. PMID:23818819
Interval Entropy and Informative Distance
Directory of Open Access Journals (Sweden)
Fakhroddin Misagh
2012-03-01
Full Text Available The Shannon interval entropy function, a useful dynamic measure of uncertainty for two-sided truncated random variables, has been proposed in the reliability literature. In this paper, we show that the interval entropy can uniquely determine the distribution function. Furthermore, we propose a measure of discrepancy between two lifetime distributions on an interval of time, based on Kullback-Leibler discrimination information. We study various properties of this measure, including its connection with the residual and past measures of discrepancy and with the interval entropy, and we obtain its upper and lower bounds.
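For reference, the interval entropy discussed above is commonly defined as follows for a random variable X with density f and distribution function F; this standard form is stated here as an assumption about the usual definition, not quoted from the paper:

```latex
IH(X; t_1, t_2)
  = -\int_{t_1}^{t_2}
      \frac{f(x)}{F(t_2) - F(t_1)}\,
      \log\!\left( \frac{f(x)}{F(t_2) - F(t_1)} \right) dx .
```

As t_2 grows without bound this reduces to the residual entropy of the lifetime beyond t_1, and as t_1 tends to 0 it reduces to the past entropy before t_2, which is the connection with residual and past measures that the abstract mentions.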
Directory of Open Access Journals (Sweden)
Enrico Sciubba
2011-06-01
Full Text Available In this paper, the entropy generation minimization (EGM) method is applied to an industrial heat transfer problem: the forced convective cooling of a LED-based spotlight. The design specification calls for eighteen diodes arranged on a circular copper plate of 35 mm diameter. Every diode dissipates 3 W, and the maximum allowed temperature of the plate is 80 °C. The cooling relies on the forced convection driven by a jet of air impinging on the plate. An initial complex geometry of plate fins is presented and analyzed with a commercial CFD code that computes the entropy generation rate. A pseudo-optimization process is carried out via a successive series of design modifications based on a careful analysis of the entropy generation maps. One of the advantages of the EGM method is that the rationale behind each step of the design process can be justified on a physical basis. It is found that the best performance is attained when the fins are periodically spaced in the radial direction.
Zucker, M. H.
This paper is a critical analysis and reassessment of entropic functioning as it applies to the question of whether the ultimate fate of the universe will be determined in the future to be "open" (expanding forever to expire in a big chill), "closed" (collapsing to a big crunch), or "flat" (balanced forever between the two). The second law of thermodynamics declares that entropy can only increase and that this principle extends, inevitably, to the universe as a whole. This paper takes the position that this extension is an unwarranted projection based neither on experience nor fact - an extrapolation that ignores the powerful effect of a gravitational force acting within a closed system. Since it was originally presented by Clausius, the thermodynamic concept of entropy has been redefined in terms of "order" and "disorder" - order being equated with a low degree of entropy and disorder with a high degree. This revised terminology, more subjective than precise, has generated considerable confusion in cosmology in several critical instances. For example, the chaotic fireball of the big bang, interpreted by Stephen Hawking as a state of disorder (high entropy), is infinitely hot and, thermally, represents zero entropy (order). Hawking, apparently focusing on the disorderly "chaotic" aspect, equated it with a high degree of entropy - overlooking the fact that the universe is a thermodynamic system and that the key factor in evaluating the big-bang phenomenon is the infinitely high temperature of the early universe, which can only be equated with zero entropy. This analysis resolves this confusion and reestablishes entropy as a cosmological function integrally linked to temperature. The paper goes on to show that, while all subsystems contained within the universe require external sources of energization to have their temperatures raised, this requirement does not apply to the universe as a whole. The universe is the only system that, by itself, can raise its own
Minimum Error Entropy Classification
Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A
2013-01-01
This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners will also find in the book a detailed presentation of practical data classifiers using MEE. These include multi-layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using an MEE-like concept is also presented. Examples, tests, evaluation experiments and comparisons with similar machines using classic approaches complement the descriptions.
Holden, Todd; Gadura, N.; Dehipawala, S.; Cheung, E.; Tuffour, M.; Schneider, P.; Tremberger, G., Jr.; Lieberman, D.; Cheung, T.
2011-10-01
Technologically important extremophiles, including oil-eating microbes, uranium- and rocket-fuel-perchlorate-reducing microbes, electron-producing microbes and electrode-electron-feeding microbes, were compared in terms of their 16S rRNA sequences, a standard targeted sequence in comparative phylogeny studies. Microbes reported to have survived a prolonged dormant duration were also studied; examples include the recently discovered microbe that survives after 34,000 years in a salty environment while feeding off organic compounds from other trapped dead microbes. Shannon entropy of the 16S rRNA nucleotide composition and the fractal dimension of the nucleotide sequence, in terms of its atomic-number fluctuation, suggest a selected range for these extremophiles as compared to other microbes, consistent with the experience of relatively mild evolutionary pressure. However, most of the microbes reported to survive a prolonged dormant duration carry sequences with fractal dimension between 1.995 and 2.005 (N = 10 out of 13). Similar results are observed for halophiles, red-shifted chlorophyll and radiation-resistant microbes. The results suggest that a prolonged dormant duration, analogous to a highly saline or radiation environment, would select for high-fractal 16S rRNA sequences. Path analysis in structural equation modeling supports a causal relation between entropy and fractal dimension for the studied 16S rRNA sequences (N = 7). Candidate choices of high-fractal 16S rRNA microbes could offer protection for prolonged spaceflights. BioBrick gene network manipulation could include extremophile 16S rRNA sequences in synthetic biology and shed more light on exobiology and future colonization in shielded spaceflights. Whether the high-fractal 16S rRNA sequences point to an asteroid-like extra-terrestrial source remains speculative but interesting.
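The Shannon entropy of a nucleotide composition used in the comparison above is straightforward to compute (a sketch only; the fractal-dimension analysis is not reproduced):

```python
from collections import Counter
from math import log2

def sequence_entropy(seq):
    """Shannon entropy (bits per symbol) of a nucleotide composition;
    it is maximal at 2 bits when A, C, G and T are equally frequent."""
    counts = Counter(seq.upper())
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(sequence_entropy("ACGTACGT"))        # 2.0 (uniform composition)
print(sequence_entropy("AAAAAAAC") < 2.0)  # True (skewed composition)
```

Real 16S rRNA sequences sit between these extremes, which is why a narrow selected entropy range across extremophiles is an informative observation.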
Maravall, Darío; de Lope, Javier; Fuentes, Juan P.
2017-01-01
We introduce a hybrid algorithm for the self-semantic location and autonomous navigation of robots using entropy-based vision and visual topological maps. In visual topological maps the visual landmarks are considered as leave points for guiding the robot to reach a target point (robot homing) in indoor environments. These visual landmarks are defined from images of relevant objects or characteristic scenes in the environment. The entropy of an image is directly related to the presence of a unique object or the presence of several different objects inside it: the lower the entropy the higher the probability of containing a single object inside it and, conversely, the higher the entropy the higher the probability of containing several objects inside it. Consequently, we propose the use of the entropy of images captured by the robot not only for the landmark searching and detection but also for obstacle avoidance. If the detected object corresponds to a landmark, the robot uses the suggestions stored in the visual topological map to reach the next landmark or to finish the mission. Otherwise, the robot considers the object as an obstacle and starts a collision avoidance maneuver. In order to validate the proposal we have defined an experimental framework in which the visual bug algorithm is used by an Unmanned Aerial Vehicle (UAV) in typical indoor navigation tasks. PMID:28900394
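The image-entropy criterion can be illustrated with the entropy of a gray-level histogram; this is a plausible reading of the measure described above, not the authors' exact computation:

```python
import numpy as np

def image_entropy(gray):
    """Shannon entropy of the gray-level histogram of an 8-bit image; in
    the paper's reasoning, low entropy suggests a single dominant object
    while high entropy suggests clutter (several objects)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

# A flat patch (one 'object') versus a noisy, cluttered patch.
rng = np.random.default_rng(0)
flat = np.full((64, 64), 128, dtype=np.uint8)
clutter = rng.integers(0, 256, (64, 64), dtype=np.uint8)
print(image_entropy(flat) < image_entropy(clutter))  # True
```

A robot could threshold this scalar to decide whether a captured frame is a candidate landmark (single salient object) or likely an obstacle-laden scene requiring avoidance.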
Directory of Open Access Journals (Sweden)
Darío Maravall
2017-08-01
Full Text Available We introduce a hybrid algorithm for the self-semantic location and autonomous navigation of robots using entropy-based vision and visual topological maps. In visual topological maps the visual landmarks are considered as leave points for guiding the robot to reach a target point (robot homing) in indoor environments. These visual landmarks are defined from images of relevant objects or characteristic scenes in the environment. The entropy of an image is directly related to the presence of a unique object or the presence of several different objects inside it: the lower the entropy the higher the probability of containing a single object inside it and, conversely, the higher the entropy the higher the probability of containing several objects inside it. Consequently, we propose the use of the entropy of images captured by the robot not only for the landmark searching and detection but also for obstacle avoidance. If the detected object corresponds to a landmark, the robot uses the suggestions stored in the visual topological map to reach the next landmark or to finish the mission. Otherwise, the robot considers the object as an obstacle and starts a collision avoidance maneuver. In order to validate the proposal we have defined an experimental framework in which the visual bug algorithm is used by an Unmanned Aerial Vehicle (UAV) in typical indoor navigation tasks.
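As a concrete illustration of the entropy criterion described above, the Shannon entropy of a gray-level histogram can be computed as follows; a minimal sketch in NumPy (the array names are illustrative, not from the paper):

```python
import numpy as np

def image_entropy(gray, levels=256):
    """Shannon entropy (bits) of an image's gray-level histogram."""
    hist, _ = np.histogram(gray, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins: 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))

# A flat patch (one uniform object) has zero histogram entropy, while a
# cluttered multi-object scene approaches the log2(levels) upper bound.
flat = np.full((64, 64), 128, dtype=np.uint8)
busy = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)
```

Thresholding this value is then enough to route a captured frame to either the landmark-matching branch or the obstacle-avoidance branch.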
An Entropy-Based Upper Bound Methodology for Robust Predictive Multi-Mode RCPSP Schedules
Directory of Open Access Journals (Sweden)
Angela Hsiang-Ling Chen
2014-09-01
Full Text Available Projects are an important part of our activities and, regardless of their magnitude, scheduling is at the very core of every project. In an ideal world makespan minimization, the most commonly sought objective, would give us an advantage. However, every time we execute a project we have to deal with uncertainty; part of it coming from known sources and part remaining unknown until it affects us. For this reason, it is much more practical to focus on making our schedules robust, capable of handling uncertainty, and even to determine a range in which the project could be completed. In this paper we focus on an approach to determine such a range for the Multi-mode Resource Constrained Project Scheduling Problem (MRCPSP), a widely researched, NP-complete problem, but without adding any subjective considerations to its estimation. We do this by using entropy, a concept well known in the domain of thermodynamics, and a three-stage approach. First we use Artificial Bee Colony (ABC), an effective and powerful meta-heuristic, to determine a schedule with minimized makespan which serves as a lower bound. The second stage defines buffer times and creates an upper-bound makespan using an entropy function, with the advantage over other methods that it only considers elements which are inherent to the schedule itself and does not introduce any subjectivity into the buffer time generation. In the last stage, we use the ABC algorithm with an objective function that seeks to maximize robustness while staying within the makespan boundaries defined previously, and in some cases even below the lower boundary. We evaluate our approach with two different benchmark sets: when using the PSPLIB for the MRCPSP benchmark set, the computational results indicate that it is possible to generate robust schedules which generally result in an increase of less than 10% over the best known solutions while increasing the robustness by at least 20% for practically every
Vector entropy imaging theory with application to computerized tomography
International Nuclear Information System (INIS)
Wang Yuanmei; Cheng Jianping; Heng, Pheng Ann
2002-01-01
Medical imaging theory for x-ray CT and PET is based on image reconstruction from projections. In this paper a novel vector entropy imaging theory under the framework of multiple criteria decision making is presented. We also study the most frequently used image reconstruction methods, namely, least square, maximum entropy, and filtered back-projection methods under the framework of single performance criterion optimization. Finally, we introduce some of the results obtained by various reconstruction algorithms using computer-generated noisy projection data from the Hoffman phantom and real CT scanner data. Comparison of the reconstructed images indicates that the vector entropy method performs best in error (difference between the original phantom data and the reconstruction), smoothness (suppression of noise) and grey-value resolution, and is free of ghost images. (author)
Directory of Open Access Journals (Sweden)
Yan Jingyi
2016-01-01
Full Text Available The paper studies the connotative meaning, evaluation methods and models for chemical industry parks, based on in-depth analysis of relevant research results in China and abroad. It summarizes the features of menacing vulnerability and structural vulnerability and details influence factors such as personnel vulnerability, infrastructural vulnerability, environmental vulnerability and the vulnerability of safety management failure. Using a vulnerability scoping diagram, 21 evaluation indexes and an index system are established for the vulnerability evaluation of a chemical industrial park. The comprehensive weights are calculated with the entropy method and combined with a matter-element extension model to make the quantitative evaluation, which was then applied successfully to evaluate a chemical industrial park. This method provides new ideas and ways for enhancing the overall safety of chemical industrial parks.
Xiang, Min; Qu, Qinqin; Chen, Cheng; Tian, Li; Zeng, Lingkang
2017-11-01
To improve the reliability of communication service in the smart distribution grid (SDG), an access selection algorithm based on dynamic network status and different service types for heterogeneous wireless networks was proposed. The network performance index values were obtained in real time by a multimode terminal, and the variation trend of the index values was analyzed with a growth matrix. The index weights were calculated by the entropy-weight method and then modified by rough set theory to obtain the final weights. Grey relational analysis was then used to rank the candidate networks, and the optimal communication network was selected. Simulation results show that the proposed algorithm can dynamically and effectively implement access selection in the heterogeneous wireless networks of SDG and reduce the network blocking probability.
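The entropy-weight step mentioned above follows a standard recipe: indices whose values vary more across alternatives carry more information and receive larger weights. A sketch under that usual formulation (the decision matrix is hypothetical; the rough-set correction is omitted):

```python
import numpy as np

def entropy_weights(X):
    """Entropy-weight method: rows = alternatives, columns = indices."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                      # column-wise normalization
    k = X.shape[0]
    logs = np.log(np.where(P > 0, P, 1.0))     # log(1) = 0 stands in for 0*log 0
    E = -(P * logs).sum(axis=0) / np.log(k)    # entropy of each index, in [0, 1]
    d = 1.0 - E                                # diversification: more spread => more weight
    return d / d.sum()

# Hypothetical decision matrix: 4 candidate networks x 3 performance indices.
X = [[0.9, 10.0, 5.0],
     [0.8, 10.0, 1.0],
     [0.7, 10.0, 4.0],
     [0.6, 10.0, 2.0]]
w = entropy_weights(X)   # the constant second index carries no information
```

The resulting weight vector feeds directly into the grey relational ranking of the candidate networks.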
Directory of Open Access Journals (Sweden)
Fugui Qi
2016-08-01
Full Text Available Judgment and early danger warning of obstructive sleep apnea (OSA) is meaningful for the diagnosis of sleep illness. This paper proposes a novel method based on the wavelet information entropy spectrum to make an apnea judgment of the OSA respiratory signal detected by bio-radar in the wavelet domain. It makes full use of the strong irregularity and disorder of the respiratory signal resulting from brain stimulation by the real, low airflow during apnea. The experimental results demonstrate that the proposed method is effective for detecting the occurrence of sleep apnea and is also able to detect some apnea cases that the energy spectrum method cannot. Ultimately, the comprehensive judgment accuracy over 10 groups of OSA data is 93.1%, which is promising for the non-contact aided diagnosis of OSA.
[Evaluation of urban human settlement quality in Ningxia based on AHP and the entropy method].
Li, Shuai; Wei, Hong; Ni, Xi-Lu; Gu, Yan-Wen; Li, Chang-Xiao
2014-09-01
As one of the key indicators of urbanization and the sustainable development of cities, urban human settlement quality has been a hot issue. In this paper, an evaluation system containing indicators related to four aspects (ecological, social, humanities and economic environments) was established to assess the urban human settlement quality in five main cities in Ningxia Hui Autonomous Region, Northwest China. After calculating each indicator's weight in the evaluation system through AHP and the entropy method, the quality of urban human settlement was analyzed. Results showed that Yinchuan had a score of 0.85 for the quality of human settlement, Shizuishan 0.62, Wuzhong 0.43, Zhongwei 0.33, and Guyuan 0.32, respectively. Shizuishan got the highest score in the eco-environment aspect, and Yinchuan had the highest scores for the social, humanities and economic environments. Zhongwei and Guyuan had relatively low scores in all four urban human settlement aspects. Coordination analysis showed that internal coordination was moderate for Yinchuan (0.79) and Shizuishan (0.72), and relatively good for the other cities. However, coordination was relatively poor among the five cities, especially in the social environment (0.48). These results suggested that an unsatisfactory situation existed in terms of the urban human settlement quality in Ningxia, and that corresponding measures should be taken to accelerate the development of vulnerable indicators, so as to coordinate all the urban human settlement aspects within and among cities.
Analysis of Neural Oscillations on Drosophila’s Subesophageal Ganglion Based on Approximate Entropy
Directory of Open Access Journals (Sweden)
Tian Mei
2015-10-01
Full Text Available The suboesophageal ganglion (SOG), which connects to both central and peripheral nerves, is the primary taste-processing center in the Drosophila brain. The neural oscillation in this center may be of great research value, yet it is rarely reported. This work aims to determine the amount of unique information contained within oscillations of the SOG and describe the variability of these patterns. The approximate entropy (ApEn) values of the spontaneous membrane potential (sMP) of SOG neurons were calculated in this paper. The arithmetic mean (MA), standard deviation (SDA) and coefficient of variation (CVA) of ApEn were proposed as the three statistical indicators to describe the irregularity and complexity of the oscillations. The hierarchical clustering method was used to classify them. As a result, the oscillations in the SOG were divided into five categories: (1) continuous spike pattern; (2) mixed oscillation pattern; (3) spikelet pattern; (4) bursting pattern; and (5) sparse spike pattern. A steady oscillation state has a low level of irregularity, and vice versa. Dopamine stimulation can distinctly cut down the complexity of the mixed oscillation pattern. The current study provides a quantitative method and some criteria for mining the information carried in neural oscillations.
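For reference, approximate entropy can be computed with the standard Pincus formulation; a compact sketch (O(n²) pairwise distances, adequate for short recordings; the signals below are synthetic stand-ins, not sMP traces):

```python
import numpy as np

def apen(x, m=2, r_frac=0.2):
    """Approximate entropy ApEn(m, r) with tolerance r = r_frac * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()

    def phi(mm):
        n = len(x) - mm + 1
        templates = np.array([x[i:i + mm] for i in range(n)])
        # Chebyshev distance between every pair of length-mm templates
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        c = (d <= r).sum(axis=1) / n          # self-matches included, as in ApEn
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

# A periodic signal is more regular (lower ApEn) than white noise.
regular = np.sin(np.linspace(0, 8 * np.pi, 300))
noisy = np.random.default_rng(1).standard_normal(300)
```

MA, SDA and CVA are then just the mean, standard deviation and coefficient of variation of ApEn values computed over many recordings.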
Environmental efficiency analysis of power industry in China based on an entropy SBM model
International Nuclear Information System (INIS)
Zhou, Yan; Xing, Xinpeng; Fang, Kuangnan; Liang, Dapeng; Xu, Chunlin
2013-01-01
In order to assess the environmental efficiency of the power industry in China, this paper first proposes a new non-radial DEA approach by integrating the entropy weight and the SBM model. This will improve the assessment reliability and reasonableness. Using the model, this study then evaluates the environmental efficiency of the Chinese power industry at the provincial level during 2005–2010. The results show a marked difference in environmental efficiency of the power industry among Chinese provinces. Although the annual average environmental efficiency level fluctuates, there is an increasing trend. The Tobit regression analysis reveals that the innovation ability of enterprises, the proportion of electricity generated by coal-fired plants and the generation capacity have a significantly positive effect on environmental efficiency. However, the waste fees levied on waste discharge and investment in industrial pollutant treatment are negatively associated with environmental efficiency. - Highlights: ► We assess the environmental efficiency of power industry in China by an E-SBM model. ► Environmental efficiency of the power industry differs among provinces. ► Efficiency stays at a higher level in the eastern and the western areas. ► The proportion of coal-fired plants has a positive effect on the efficiency. ► Waste fees and the investment have a negative effect on the efficiency.
Analysis of ITMS System Impact Mechanism in Beijing Based on FD and Traffic Entropy
Directory of Open Access Journals (Sweden)
Ailing Huang
2012-01-01
Full Text Available Although more attention has been attracted to the benefit evaluation of Intelligent Transportation Systems (ITS) deployment, how ITS impacts the traffic system and produces its effects has received little consideration. In this paper, the Intelligent Transportation Management System (ITMS), a subsystem of ITS, is studied with respect to its impact mechanism on the road traffic system. Firstly, the correlative factors between ITMS and the road traffic system are presented and 3 positive feedback chains are defined. Secondly, we introduce the theory of the Fundamental Diagram (FD) and traffic system entropy to demonstrate the correlative relationship between ITMS and the feedback chains. The analyzed results show that ITMS, as a negative feedback factor, has damping functions on the coupling relationship of all 3 positive feedback chains. It indicates that, with its deployment in Beijing, ITMS has improved the efficiency and safety of the road traffic system. Finally, related benefits brought by ITMS are presented corresponding to the correlative factors, and effect standards are identified for evaluating ITMS comprehensive benefits.
Entropy-Based Application Layer DDoS Attack Detection Using Artificial Neural Networks
Directory of Open Access Journals (Sweden)
Khundrakpam Johnson Singh
2016-10-01
Full Text Available A distributed denial-of-service (DDoS) attack is one of the major threats to the web server. The rapid increase of DDoS attacks on the Internet has clearly pointed out the limitations of current intrusion detection systems and intrusion prevention systems (IDS/IPS), mostly caused by application-layer DDoS attacks. Within this context, the objective of the paper is to detect a DDoS attack using a multilayer perceptron (MLP) classification algorithm with a genetic algorithm (GA) as the learning algorithm. In this work, we analyzed the standard EPA-HTTP (Environmental Protection Agency hypertext transfer protocol) dataset and selected the parameters that will be used as input to the classifier model for differentiating the attack from a normal profile. The parameters selected are the HTTP GET request count, entropy, and variance for every connection. The proposed model can provide a better accuracy of 98.31%, sensitivity of 0.9962, and specificity of 0.0561 when compared to other traditional classification models.
Directory of Open Access Journals (Sweden)
Yimeng Zhang
2013-05-01
Full Text Available A method for blind recognition of the coding parameters of binary Bose-Chaudhuri-Hocquenghem (BCH) codes is proposed in this paper. We consider an intelligent communication receiver which can blindly recognize the coding parameters of the received data stream. The only knowledge is that the stream is encoded using binary BCH codes, while the coding parameters are unknown. The problem can be addressed in the context of non-cooperative communications or adaptive coding and modulation (ACM) for cognitive radio networks. The recognition processing includes two major procedures: code length estimation and generator polynomial reconstruction. A hard-decision method has been proposed in previous literature. In this paper we propose a recognition approach for soft-decision situations with binary phase-shift keying (BPSK) modulation and additive white Gaussian noise (AWGN) channels. The code length is estimated by maximizing the root information dispersion entropy function. We then search for the code roots to reconstruct the primitive and generator polynomials. By utilizing the soft output of the channel, the recognition performance is improved, and simulations show the efficiency of the proposed algorithm.
Cho, Yongrae; Kim, Minsung
2014-01-01
The volatility and uncertainty in the process of technological developments are growing faster than ever due to rapid technological innovations. Such phenomena result in integration among disparate technology fields. At this point, it is a critical research issue to understand the different roles and the propensity of each element technology for technological convergence. In particular, the network-based approach provides a holistic view in terms of technological linkage structures. Furthermore, the development of new indicators based on network visualization can reveal the dynamic patterns among disparate technologies in the process of technological convergence and provide insights for future technological developments. This research attempts to analyze and discover the patterns of the international patent classification codes of the United States Patent and Trademark Office's patent data in printed electronics, which is a representative technology in the technological convergence process. To this end, we apply the physical idea as a new methodological approach to interpret technological convergence. More specifically, the concepts of entropy and gravity are applied to measure the activities among patent citations and the binding forces among heterogeneous technologies during technological convergence. By applying the entropy and gravity indexes, we could distinguish the characteristic role of each technology in printed electronics. At the technological convergence stage, each technology exhibits idiosyncratic dynamics which tend to decrease technological differences and heterogeneity. Furthermore, through nonlinear regression analysis, we have found the decreasing patterns of disparity over a given total period in the evolution of technological convergence. This research has discovered the specific role of each element technology field and has consequently identified the co-evolutionary patterns of technological convergence. These new findings on the evolutionary
Lifescience Database Archive (English)
Full Text Available Gclust Server: clusters based on sequence comparison of homologous proteins of 95 organism species. The data comprise amino acid sequences of predicted proteins and their annotation for the 95 organism species; homologs of each protein are clustered using heuristic estimation of a similarity threshold by the entropy-optimized organism count method (Bioinformatics 2009).
Two-dimensional maximum entropy image restoration
International Nuclear Information System (INIS)
Brolley, J.E.; Lazarus, R.B.; Suydam, B.R.; Trussell, H.J.
1977-07-01
An optical check problem was constructed to test P log P maximum entropy restoration of an extremely distorted image. Useful recovery of the original image was obtained. Comparison with maximum a posteriori restoration is made. 7 figures
Directory of Open Access Journals (Sweden)
Ingo Klein
2016-07-01
are based on the new entropy have been developed. We show that almost all known linear rank tests are special cases, and we introduce certain new tests. Moreover, formulas for different distributions and entropy calculations are presented for CPEφ if the cdf is available in closed form.
Directory of Open Access Journals (Sweden)
Luca Faes
2017-01-01
Full Text Available The most common approach to assess the dynamical complexity of a time series across multiple temporal scales makes use of the multiscale entropy (MSE) and refined MSE (RMSE) measures. In spite of their popularity, MSE and RMSE lack an analytical framework allowing their calculation for known dynamic processes and cannot be reliably computed over short time series. To overcome these limitations, we propose a method to assess RMSE for autoregressive (AR) stochastic processes. The method makes use of linear state-space (SS) models to provide the multiscale parametric representation of an AR process observed at different time scales and exploits the SS parameters to quantify analytically the complexity of the process. The resulting linear MSE (LMSE) measure is first tested in simulations, both theoretically to relate the multiscale complexity of AR processes to their dynamical properties and over short process realizations to assess its computational reliability in comparison with RMSE. Then, it is applied to the time series of heart period, arterial pressure, and respiration measured for healthy subjects monitored in resting conditions and during physiological stress. This application to short-term cardiovascular variability documents that LMSE can describe better than RMSE the activity of physiological mechanisms producing biological oscillations at different temporal scales.
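The coarse-graining step underlying MSE, with a per-scale sample-entropy statistic, can be sketched as follows. This is a simplified plug-in version for illustration, not the state-space LMSE estimator the paper develops:

```python
import numpy as np

def coarse_grain(x, scale):
    """Coarse-grain a series by averaging non-overlapping windows."""
    n = (len(x) // scale) * scale
    return np.asarray(x[:n], dtype=float).reshape(-1, scale).mean(axis=1)

def sampen(x, m=2, r_frac=0.2):
    """Sample entropy (self-matches excluded), the per-scale statistic of MSE.

    Simplified: template counts at m and m+1 differ by one here, unlike the
    canonical definition, which truncates both to N - m templates.
    """
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()

    def matches(mm):
        n = len(x) - mm + 1
        t = np.array([x[i:i + mm] for i in range(n)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return (d <= r).sum() - n          # subtract the diagonal self-matches

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 else np.inf

rng = np.random.default_rng(2)
noise = rng.standard_normal(600)
tone = np.sin(np.linspace(0, 12 * np.pi, 600))
mse_curve = [sampen(coarse_grain(noise, s)) for s in (1, 2, 3)]
```

Note that the tolerance r is recomputed at each scale here, as in RMSE; classic MSE instead fixes r from the original series.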
Electroencephalogram–Electromyography Coupling Analysis in Stroke Based on Symbolic Transfer Entropy
Directory of Open Access Journals (Sweden)
Yunyuan Gao
2018-01-01
Full Text Available The coupling strength between electroencephalogram (EEG) and electromyography (EMG) signals during motion control reflects the interaction between the cerebral motor cortex and muscles. Therefore, neuromuscular coupling characterization is instructive in assessing motor function. In this study, to overcome the limitation of losing the characteristics of signals in conventional time series symbolization methods, a variable-scale symbolic transfer entropy (VS-STE) analysis approach was proposed for corticomuscular coupling evaluation. Post-stroke patients (n = 5) and healthy volunteers (n = 7) were recruited and participated in various tasks (left and right hand gripping, elbow bending). The proposed VS-STE was employed to evaluate the corticomuscular coupling strength between the EEG signal measured from the motor cortex and the EMG signal measured from the upper limb, in both the time-domain and frequency-domain. Results showed a greater strength of the bi-directional (EEG-to-EMG and EMG-to-EEG) VS-STE in post-stroke patients compared to healthy controls. In addition, the strongest EEG–EMG coupling strength was observed in the beta frequency band (15–35 Hz) during the upper limb movement. The predefined coupling strength of EMG-to-EEG in the affected side of the patient was larger than that of EEG-to-EMG. In conclusion, the results suggested that the corticomuscular coupling is bi-directional, and the proposed VS-STE can be used to quantitatively characterize the non-linear synchronization characteristics and information interaction between the primary motor cortex and muscles.
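A fixed-scale cousin of the method, symbolic transfer entropy with ordinal symbolization and one-step history, can be sketched as a plug-in estimator; the variable-scale extension and all EEG/EMG preprocessing are omitted, and the signals below are synthetic:

```python
import numpy as np
from collections import Counter

def symbolize(x, m=3):
    """Replace each length-m window with its ordinal (rank) pattern."""
    x = np.asarray(x, dtype=float)
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def transfer_entropy(src, dst):
    """Plug-in TE(src -> dst) in nats over symbol sequences, 1-step history."""
    n = len(dst) - 1
    p_xyz = Counter(zip(dst[1:], dst[:-1], src[:-1]))
    p_yy = Counter(zip(dst[1:], dst[:-1]))
    p_yx = Counter(zip(dst[:-1], src[:-1]))
    p_y = Counter(dst[:-1])
    te = 0.0
    for (y1, y0, x0), c in p_xyz.items():
        # p(y1|y0,x0) / p(y1|y0), written with raw counts (the n's cancel)
        te += (c / n) * np.log(c * p_y[y0] / (p_yy[(y1, y0)] * p_yx[(y0, x0)]))
    return te

rng = np.random.default_rng(2)
x = rng.standard_normal(2000)
y = np.roll(x, 1) + 0.1 * rng.standard_normal(2000)   # y is driven by x
z = rng.standard_normal(2000)                          # independent control
sx, sy, sz = symbolize(x), symbolize(y), symbolize(z)
```

Directionality then falls out by comparing TE(src → dst) with TE(dst → src), as the paper does for EEG-to-EMG versus EMG-to-EEG.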
Energy Technology Data Exchange (ETDEWEB)
Weinberg, A.M.
1982-10-01
Utopians who use entropy to warn of a vast deterioration of energy and mineral resources seek a self-fulfilling prophecy when they work to deny society access to new energy sources, particularly nuclear power. While theoretically correct, entropy is not the relevant factor for the rest of this century. The more extreme entropists call for a return to an eotechnic society based on decentralized, renewable energy technologies, which rests on the assumptions of a loss in Gibbs free energy, a mineral depletion that will lead to OPEC-like manipulation, and a current technology that is destroying the environment. The author challenges these assumptions and calls for an exorcism of public fears over reactor accidents. He foresees a resurgence in public confidence in nuclear power by 1990 that will resolve Western dependence on foreign oil. (DCK)
Bubble Entropy: An Entropy Almost Free of Parameters.
Manis, George; Aktaruzzaman, Md; Sassi, Roberto
2017-11-01
Objective: A critical point in any definition of entropy is the selection of the parameters employed to obtain an estimate in practice. We propose a new definition of entropy aiming to reduce the significance of this selection. Methods: We call the new definition Bubble Entropy. Bubble Entropy is based on permutation entropy, where the vectors in the embedding space are ranked. We use the bubble sort algorithm for the ordering procedure and count instead the number of swaps performed for each vector. Doing so, we create a more coarse-grained distribution and then compute the entropy of this distribution. Results: Experimental results with both real and synthetic HRV signals showed that bubble entropy presents remarkable stability and exhibits increased descriptive and discriminating power compared to all other definitions, including the most popular ones. Conclusion: The definition proposed is almost free of parameters. The most common ones are the scale factor r and the embedding dimension m. In our definition, the scale factor is totally eliminated and the importance of m is significantly reduced. The proposed method presents increased stability and discriminating power. Significance: After the extensive use of some entropy measures in physiological signals, typical values for their parameters have been suggested, or at least widely used. However, the parameters are still there, application and dataset dependent, influencing the computed value and affecting the descriptive power. Reducing their significance or eliminating them alleviates the problem, decoupling the method from the data and the application, and eliminating subjective factors.
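The recipe described in the abstract can be sketched directly: count bubble-sort swaps for each embedded vector, take a Rényi-2 entropy of the swap-count distribution at m and m + 1, and normalize the growth. The entropy order and normalization constant here follow our reading of the paper; treat them as assumptions:

```python
import numpy as np
from collections import Counter

def bubble_swaps(v):
    """Swaps bubble sort needs to order v (equals the inversion count)."""
    v = list(v)
    swaps = 0
    for i in range(len(v)):
        for j in range(len(v) - 1 - i):
            if v[j] > v[j + 1]:
                v[j], v[j + 1] = v[j + 1], v[j]
                swaps += 1
    return swaps

def swap_entropy(x, m):
    """Renyi-2 entropy of the swap-count distribution over m-embeddings."""
    counts = Counter(bubble_swaps(x[i:i + m]) for i in range(len(x) - m + 1))
    n = sum(counts.values())
    return -np.log(sum((c / n) ** 2 for c in counts.values()))

def bubble_entropy(x, m=10):
    """Normalized entropy growth from m to m + 1."""
    return (swap_entropy(x, m + 1) - swap_entropy(x, m)) / np.log((m + 1) / (m - 1))

rng = np.random.default_rng(3)
noise = rng.standard_normal(500)       # wide swap-count distribution
slow = np.sin(np.linspace(0, 6 * np.pi, 500))  # mostly monotone windows
```

Note there is no tolerance r anywhere in the computation, which is the point of the definition.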
Directory of Open Access Journals (Sweden)
Bernard S. Kay
2015-12-01
Full Text Available We give a review, in the style of an essay, of the author's 1998 matter-gravity entanglement hypothesis which, unlike the standard approach to entropy based on coarse-graining, offers a definition for the entropy of a closed system as a real and objective quantity. We explain how this approach offers an explanation for the Second Law of Thermodynamics in general and a non-paradoxical understanding of information loss during black hole formation and evaporation in particular. It also involves a description of black hole equilibrium states radically different from the usual one, in which the total state of a black hole in a box together with its atmosphere is a pure state, entangled in just such a way that the reduced state of the black hole and of its atmosphere are each separately approximately thermal. We also briefly recall some recent work of the author which involves a reworking of the string-theory understanding of black hole entropy consistent with this alternative description of black hole equilibrium states, and point out that this is free from some unsatisfactory features of the usual string-theory understanding. We also recall the author's recent arguments based on this alternative description which suggest that the anti-de Sitter space (AdS)/conformal field theory (CFT) correspondence is a bijection between the boundary CFT and just the matter degrees of freedom of the bulk theory.
International Nuclear Information System (INIS)
Kawazura, Y.; Yoshida, Z.
2012-01-01
Two different types of self-organizing and sustaining ordered motion in fluids or plasmas, one a Bénard convection (or streamer) and the other a zonal flow, have been compared by introducing a thermodynamic phenomenological model and evaluating the corresponding entropy production rates (EP). These two systems have different topologies in their equivalent circuits: the Bénard convection is modeled by a parallel connection of linear and nonlinear conductances, while the zonal flow is modeled by a series connection. The "power supply" that drives the systems is also a determinant of the operating modes. When the energy flux is a control parameter (as in usual plasma experiments), the driver is modeled by a constant-current power supply, and when the temperature difference between two separate boundaries is controlled (as in usual computational studies), the driver is modeled by a constant-voltage power supply. The parallel (series)-connection system tends to minimize (maximize) the total EP when a constant-current power supply drives the system. This minimum/maximum relation flips when a constant-voltage power supply is connected.
Barbosa, Tiago M; Goh, Wan X; Morais, Jorge E; Costa, Mário J; Pendergast, David
2016-01-01
The aim of this study was to compare the non-linear properties of the four competitive swim strokes. Sixty-eight swimmers performed a set of maximal 4 × 25 m using the four competitive swim strokes. The hip's speed-data as a function of time was collected with a speedo-meter. The speed fluctuation (dv), approximate entropy (ApEn) and the fractal dimension by Higuchi's method (D) were computed. Swimming data exhibited non-linear properties that were different among the four strokes (14.048 ≤ dv ≤ 39.722; 0.682 ≤ ApEn ≤ 1.025; 1.823 ≤ D ≤ 1.919). The ApEn showed the lowest value for front-crawl, followed by breaststroke, butterfly, and backstroke (P < 0.001). The fractal dimension and dv had the lowest values for front-crawl and backstroke, followed by butterfly and breaststroke (P < 0.001). It can be concluded that swimming data exhibits non-linear properties, which are different among the four competitive swimming strokes.
Directory of Open Access Journals (Sweden)
Tiago Barbosa
2016-10-01
Full Text Available The aim of this study was to compare the nonlinear properties of the four competitive swim strokes. Sixty-eight swimmers performed a set of maximal 4 × 25 m using the four competitive swim strokes. The hip's speed-data as a function of time was collected with a speedo-meter. The speed fluctuation (dv), approximate entropy (ApEn) and the fractal dimension by Higuchi's method (D) were computed. Swimming data exhibited nonlinear properties that were different among the four strokes (14.048 ≤ dv ≤ 39.722; 0.682 ≤ ApEn ≤ 1.025; 1.823 ≤ D ≤ 1.919). The ApEn showed the lowest value for front-crawl, followed by breaststroke, butterfly and backstroke (P < 0.001). Fractal dimension and dv had the lowest values for front-crawl and backstroke, followed by butterfly and breaststroke (P < 0.001). It can be concluded that swimming data exhibits nonlinear properties, which are different among the four competitive swimming strokes.
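Higuchi's method, used here for D, estimates the fractal dimension from the slope of log curve-length against log scale; a compact sketch on synthetic signals (not swim data):

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Fractal dimension by Higuchi's method: slope of log L(k) vs log(1/k)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_inv_k, log_len = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # normalized curve length for start offset m and step k
            lengths.append(np.abs(np.diff(x[idx])).sum() * (n - 1)
                           / ((len(idx) - 1) * k * k))
        log_inv_k.append(np.log(1.0 / k))
        log_len.append(np.log(np.mean(lengths)))
    slope, _ = np.polyfit(log_inv_k, log_len, 1)
    return float(slope)

line = np.linspace(0.0, 1.0, 500)                       # a straight line: D = 1
noise = np.random.default_rng(4).standard_normal(500)   # white noise: D near 2
```

Since L(k) scales as k^(-D), fitting log L against log(1/k) yields D directly as the regression slope.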
Quantitative Entropy Study of Language Complexity
Xie, R. R.; Deng, W. B.; Wang, D. J.; Csernai, L. P.
2016-01-01
We study the entropy of Chinese and English texts, based on characters in case of Chinese texts and based on words for both languages. Significant differences are found between the languages and between different personal styles of debating partners. The entropy analysis points in the direction of lower entropy, that is of higher complexity. Such a text analysis would be applied for individuals of different styles, a single individual at different age, as well as different groups of the popul...
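Character-level Shannon entropy of a text is straightforward to compute; a minimal sketch (the example strings are illustrative, not from the study's corpus):

```python
from collections import Counter
from math import log2

def char_entropy(text):
    """Shannon entropy (bits per character) of the character distribution."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Unigram character entropy of English prose is roughly 4 bits/char;
# a repetitive string scores far lower.
h_prose = char_entropy("the quick brown fox jumps over the lazy dog")
h_flat = char_entropy("aaaaaaaa")
```

The same function applied over word tokens instead of characters gives the word-based entropy used for both languages in the study.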
Entropy, related thermodynamic properties, and structure of methylisocyanate
International Nuclear Information System (INIS)
Davis, Phil S.; Kilpatrick, John E.
2013-01-01
Highlights: ► The thermodynamic properties of methylisocyanate have been determined by isothermal calorimetry from 15 to 298.15 K. ► The third law entropy has been compared with the entropy calculated by statistical thermodynamics. ► The comparisons are consistent with selected proposed molecular structures and vibrational frequencies. -- Abstract: The entropy and related thermodynamic properties of methylisocyanate, CH 3 NCO, have been determined by isothermal calorimetry. The entropy in the ideal gas state at 298.15 K and 1 atmosphere is S m o = 284.3 ± 0.6 J/K · mol. Other thermodynamic properties determined include: the heat capacity from 15 to 300 K, the temperature of fusion (T fus = 178.461 ± 0.024 K), the enthalpy of fusion (ΔH fus = 7455.2 ± 14.0 J/mol), the enthalpy of vaporization at 298.15 K (ΔH vap = 28768 ± 54 J/mol), and the vapor pressure from fusion to 300 K. Using statistical thermodynamics, the entropy in this same state has been calculated for various assumed structures for methylisocyante which have been proposed based on several spectroscopic and ab initio results. Comparisons between the experimental and calculated entropy have led to the following conclusions concerning historical differences among problematic structural properties: (1) The CNC/CNO angles can have the paired values of 140/180° or 135/173° respectively. It is not possible to distinguish between the two by this thermodynamic analysis. (2) The methyl group functions as a free rotor or near free rotor against the NCO rigid frame. The barrier to internal rotation is less than 2100 J/mol. (3) The CNC vibrational bending frequency is consistent with the more recently observed assignments at 165 and 172 cm −1 with some degree of anharmonicity or with a pure harmonic at about 158 cm −1
An approximate method for Bayesian entropy estimation for a discrete random variable.
Yokota, Yasunari
2004-01-01
This article proposes an approximate Bayesian entropy estimator for a discrete random variable. An entropy estimator that achieves least-squares error is obtained through Bayesian estimation of the occurrence probabilities of each value taken by the discrete random variable. This Bayesian entropy estimator incurs a large computational cost if the random variable takes many distinct values. The present article therefore proposes a practical method for calculating a Bayesian entropy estimate; the proposed method approximates the entropy function by a truncated Taylor series. Numerical experiments demonstrate that the proposed method improves the estimation precision of entropy remarkably in comparison with the conventional entropy estimation method.
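The contrast between a plug-in entropy estimate and a smoothed Bayesian one can be sketched in a few lines. The sketch below uses the entropy of the posterior-mean distribution under a symmetric Dirichlet prior, a simple surrogate for the least-squares Bayesian estimator described in the abstract; the function names, the prior, and the defaults are illustrative assumptions, not the paper's actual method.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Maximum-likelihood (plug-in) entropy estimate, in nats."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in Counter(samples).values())

def bayes_entropy(samples, alpha=1.0, support=None):
    """Entropy of the posterior-mean distribution under a symmetric
    Dirichlet(alpha) prior over the (known or observed) support.
    A simplified stand-in for the full Bayesian estimator."""
    counts = Counter(samples)
    values = list(support) if support is not None else list(counts)
    n, k = len(samples), len(values)
    probs = [(counts.get(v, 0) + alpha) / (n + alpha * k) for v in values]
    return -sum(p * math.log(p) for p in probs)

data = [0, 0, 1, 1, 1, 2]
print(plugin_entropy(data), bayes_entropy(data, support=[0, 1, 2]))
```

For small samples the smoothed estimate is pulled toward the uniform (maximum-entropy) distribution, which counteracts the well-known downward bias of the plug-in estimator.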
Comparison of metatranscriptomic samples based on k-tuple frequencies.
Directory of Open Access Journals (Sweden)
Ying Wang
Full Text Available The comparison of samples, or beta diversity, is one of the essential problems in ecological studies. Next-generation sequencing (NGS) technologies make it possible to obtain large amounts of metagenomic and metatranscriptomic short read sequences across many microbial communities. De novo assembly of the short reads can be especially challenging because the number of genomes and their sequences are generally unknown and the coverage of each genome can be very low, so traditional alignment-based sequence comparison methods cannot be used. Alignment-free approaches based on k-tuple frequencies, on the other hand, have yielded promising results for the comparison of metagenomic samples. However, it is not known whether these approaches can be used for the comparison of metatranscriptome datasets and which dissimilarity measures perform best. We applied several beta diversity measures based on k-tuple frequencies to real metatranscriptomic datasets from pyrosequencing 454 and Illumina sequencing platforms to evaluate their effectiveness for the clustering of metatranscriptomic samples, including three d2-type dissimilarity measures, the dissimilarity measure in CVTree, a relative entropy based measure S2, and three classical Lp-norm distances. Results showed that the measure d2S can achieve superior performance on clustering metatranscriptomic samples into different groups under different sequencing depths for both 454 and Illumina datasets, recovering environmental gradients affecting microbial samples, classifying coexisting metagenomic and metatranscriptomic datasets, and being robust to sequencing errors. We also investigated the effects of tuple size and the order of the background Markov model. A software pipeline implementing all of the analysis steps is available at http://code.google.com/p/d2-tools/. The k-tuple based sequence signature measures can effectively reveal major groups and gradient variation among
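A minimal alignment-free comparison in the spirit of the d2-type measures can be sketched as follows. Note this plain-count version is a simplification: the d2S and d2* variants discussed in the abstract additionally center the k-tuple counts by their expectations under a background Markov model.

```python
from collections import Counter
from math import sqrt

def kmer_counts(seq, k):
    """Count all overlapping k-tuples (k-mers) in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2_dissimilarity(seq1, seq2, k=3):
    """Simple d2-type dissimilarity: half of (1 - cosine similarity)
    between raw k-tuple count vectors. Ranges from 0 (identical
    k-tuple profiles) to 0.5 (no shared k-tuples)."""
    c1, c2 = kmer_counts(seq1, k), kmer_counts(seq2, k)
    dot = sum(c1[w] * c2[w] for w in c1)
    n1 = sqrt(sum(v * v for v in c1.values()))
    n2 = sqrt(sum(v * v for v in c2.values()))
    return 0.5 * (1.0 - dot / (n1 * n2))
```

Because only k-tuple profiles are compared, the measure needs no assembly or alignment, which is exactly what makes this family of statistics attractive for low-coverage metatranscriptomic reads.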
Black hole entropy functions and attractor equations
International Nuclear Information System (INIS)
Lopes Cardoso, Gabriel; Wit, Bernard de; Mahapatra, Swapna
2007-01-01
The entropy and the attractor equations for static extremal black hole solutions follow from a variational principle based on an entropy function. In the general case such an entropy function can be derived from the reduced action evaluated in a near-horizon geometry. BPS black holes constitute special solutions of this variational principle, but they can also be derived directly from a different entropy function based on supersymmetry enhancement at the horizon. Both functions are consistent with electric/magnetic duality and for BPS black holes their corresponding OSV-type integrals give identical results at the semi-classical level. We clarify the relation between the two entropy functions and the corresponding attractor equations for N = 2 supergravity theories with higher-derivative couplings in four space-time dimensions. We discuss how non-holomorphic corrections will modify these entropy functions
Directory of Open Access Journals (Sweden)
Qing Ye
2015-01-01
Full Text Available This research proposes a novel framework for simultaneous failure diagnosis of final drives, comprising feature extraction, training of paired diagnostic models, decision-threshold generation, and recognition of simultaneous failure modes. In the feature extraction module, wavelet packet transform and fuzzy entropy are adopted to reduce noise interference and extract representative features of each failure mode. Single-failure samples are used to construct probabilistic classifiers based on paired sparse Bayesian extreme learning machines, which are trained only on single failure modes and inherit the high generalization and sparsity of the sparse Bayesian learning approach. To generate the optimal decision threshold, which converts the probabilistic outputs of the classifiers into final simultaneous failure modes, this research uses samples containing both single and simultaneous failure modes together with a grid search method that is superior to traditional techniques in global optimization. Compared with other frequently used diagnostic approaches based on support vector machines and probabilistic neural networks, experimental results based on the F1-measure verify that the diagnostic accuracy and efficiency of the proposed framework, which are crucial for simultaneous failure diagnosis, are superior to those of the existing approaches.
Energy Technology Data Exchange (ETDEWEB)
Marc O Delchini; Jean E. Ragusa; Ray A. Berry
2015-07-01
We present a new version of the entropy viscosity method, a viscous regularization technique for hyperbolic conservation laws, that is well-suited for low-Mach flows. By means of a low-Mach asymptotic study, new expressions for the entropy viscosity coefficients are derived. These definitions are valid for a wide range of Mach numbers, from subsonic flows (with very low Mach numbers) to supersonic flows, and no longer depend on an analytical expression for the entropy function. In addition, the entropy viscosity method is extended to Euler equations with variable area for nozzle flow problems. The effectiveness of the method is demonstrated using various 1-D and 2-D benchmark tests: flow in a converging–diverging nozzle; Leblanc shock tube; slow moving shock; strong shock for liquid phase; low-Mach flows around a cylinder and over a circular hump; and supersonic flow in a compression corner. Convergence studies are performed for smooth solutions and solutions with shocks present.
Entropy Stable Summation-by-Parts Formulations for Compressible Computational Fluid Dynamics
Carpenter, M.H.
2016-11-09
A systematic approach based on a diagonal-norm summation-by-parts (SBP) framework is presented for implementing entropy stable (SS) formulations of any order for the compressible Navier–Stokes equations (NSE). These SS formulations discretely conserve mass, momentum, energy and satisfy a mathematical entropy equality for smooth problems. They are also valid for discontinuous flows provided sufficient dissipation is added at shocks and discontinuities to satisfy an entropy inequality. Admissible SBP operators include all centred diagonal-norm finite-difference (FD) operators and Legendre spectral collocation-finite element methods (LSC-FEM). Entropy stable multiblock FD and FEM operators follow immediately via nonlinear coupling operators that ensure conservation, accuracy and preserve the interior entropy estimates. Nonlinearly stable solid wall boundary conditions are also available. Existing SBP operators that lack a stability proof (e.g. weighted essentially nonoscillatory) may be combined with an entropy stable operator using a comparison technique to guarantee nonlinear stability of the pair. All capabilities extend naturally to a curvilinear form of the NSE provided that the coordinate mappings satisfy a geometric conservation law constraint. Examples are presented that demonstrate the robustness of current state-of-the-art entropy stable SBP formulations.
Permutation Entropy: New Ideas and Challenges
Directory of Open Access Journals (Sweden)
Karsten Keller
2017-03-01
Full Text Available Over recent years, some new variants of permutation entropy have been introduced and applied to EEG analysis, including a conditional variant and variants that use additional metric information or are based on entropies other than the Shannon entropy. In some situations, it is not completely clear what kind of information the new measures and their algorithmic implementations provide. We discuss the new developments and illustrate them for EEG data.
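For readers new to the baseline measure, the original Bandt-Pompe permutation entropy can be computed in a few lines; this sketch ignores the conditional and metric-aware variants the survey discusses, and the parameter names are illustrative.

```python
import math
from collections import Counter

def permutation_entropy(series, order=3, delay=1, normalize=True):
    """Bandt-Pompe permutation entropy: the Shannon entropy of the
    distribution of ordinal patterns (rank orderings) observed in
    sliding windows of the series. Normalized to [0, 1] by dividing
    by log(order!)."""
    patterns = Counter()
    for i in range(len(series) - (order - 1) * delay):
        window = [series[i + j * delay] for j in range(order)]
        # The ordinal pattern is the permutation that sorts the window.
        patterns[tuple(sorted(range(order), key=window.__getitem__))] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order)) if normalize else h
```

A strictly monotone series produces a single ordinal pattern and hence zero permutation entropy, while an unpredictable series approaches the normalized maximum of 1.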
Secondary structural entropy in RNA switch (Riboswitch) identification.
Manzourolajdad, Amirhossein; Arnold, Jonathan
2015-04-28
RNA regulatory elements play a significant role in gene regulation. Riboswitches, a widespread group of regulatory RNAs, are vital components of many bacterial genomes. These regulatory elements generally function by forming a ligand-induced alternative fold that controls access to ribosome binding sites or other regulatory sites in RNA. Riboswitch-mediated mechanisms are ubiquitous across bacterial genomes. A typical class of riboswitch has its own unique structural and biological complexity, making de novo riboswitch identification a formidable task. Traditionally, riboswitches have been identified through comparative genomics based on sequence and structural homology. The limitations of structural-homology-based approaches, coupled with the assumption that there is a great diversity of undiscovered riboswitches, suggest the need for alternative methods for riboswitch identification, possibly based on features intrinsic to their structure. As of yet, no such reliable method has been proposed. We used structural entropy of riboswitch sequences as a measure of their secondary structural dynamics. Entropy values of a diverse set of riboswitches were compared to those of their mutants, their dinucleotide shuffles, and their reverse complement sequences under different stochastic context-free grammar folding models. The significance of our results was evaluated by comparison to other approaches, such as base-pairing entropy and energy landscape dynamics. Classifiers based on structural entropy optimized via sequence and structural features were devised as riboswitch identifiers and tested on Bacillus subtilis, Escherichia coli, and Synechococcus elongatus as an exploration of structural-entropy-based approaches. The unusually long untranslated region of cotH in Bacillus subtilis, as well as upstream regions of certain genes, such as the sucC genes, were associated with significant structural entropy values in genome-wide examinations. Various tests show that there
The role of entropy in magnetotail dynamics
Energy Technology Data Exchange (ETDEWEB)
Birn, Joachim [Los Alamos National Laboratory; Zaharia, Sorin [Los Alamos National Laboratory; Hesse, Michael [NASA/GSFC; Schindler, K [INSTITUT FOR THEORETISCHE
2008-01-01
The role of entropy conservation and loss in magnetospheric dynamics, particularly in relation to substorm phases, is discussed on the basis of MHD theory and simulations, using comparisons with PIC simulations for validation. Entropy conservation appears to be a crucial element leading to the formation of thin embedded current sheets in the late substorm growth phase and the potential loss of equilibrium. Entropy loss (in the form of plasmoids) is essential in the earthward transport of flux tubes (bubbles, bursty bulk flows). Entropy loss also changes the tail stability properties and may render ballooning modes unstable and thus contribute to cross-tail variability. We illustrate these effects through results from theory and simulations. Entropy conservation also governs the accessibility of final states of evolution and the amount of energy that may be released.
Schoenenberger, A W; Erne, P; Ammann, S; Perrig, M; Bürgi, U; Stuck, A E
2008-01-01
Approximate entropy (ApEn) of blood pressure (BP) can be easily measured based on software analysing 24-h ambulatory BP monitoring (ABPM), but the clinical value of this measure is unknown. In a prospective study we investigated whether ApEn of BP predicts, in addition to average and variability of BP, the risk of hypertensive crisis. In 57 patients with known hypertension we measured ApEn, average and variability of systolic and diastolic BP based on 24-h ABPM. Eight of these fifty-seven patients developed hypertensive crisis during follow-up (mean follow-up duration 726 days). In bivariate regression analysis, ApEn of systolic BP (Phypertensive crisis. The incidence rate ratio of hypertensive crisis was 14.0 (95% confidence interval (CI) 1.8, 631.5; Phypertensive crisis. A combination of these two measures had a positive predictive value of 75%, and a negative predictive value of 91%, respectively. ApEn, combined with other measures of 24-h ABPM, is a potentially powerful predictor of hypertensive crisis. If confirmed in independent samples, these findings have major clinical implications since measures predicting the risk of hypertensive crisis define patients requiring intensive follow-up and intensified therapy.
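Approximate entropy itself is straightforward to compute from a 24-h series. Below is a textbook sketch of Pincus's ApEn; the parameter defaults (m = 2, r = 0.2 times the standard deviation) are common conventions in the literature, not necessarily those of the ABPM software used in the study.

```python
import math

def approximate_entropy(series, m=2, r=None):
    """Approximate entropy (ApEn) of a time series, after Pincus.
    Low ApEn means regular/predictable fluctuations; high ApEn means
    irregular ones. r defaults to 0.2 * (population) std deviation."""
    n = len(series)
    if r is None:
        mean = sum(series) / n
        r = 0.2 * math.sqrt(sum((x - mean) ** 2 for x in series) / n)

    def phi(m):
        templates = [series[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for t1 in templates:
            # Count templates (including self) within tolerance r.
            matches = sum(all(abs(a - b) <= r for a, b in zip(t1, t2))
                          for t2 in templates)
            total += math.log(matches / len(templates))
        return total / len(templates)

    return phi(m) - phi(m + 1)
```

A strictly alternating signal such as `[0, 1, 0, 1, ...]` yields ApEn near zero, whereas a chaotic or noisy series yields a clearly positive value, which is the regular-versus-erratic distinction the study exploits for BP profiles.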
Directory of Open Access Journals (Sweden)
Chengquan Zhou
2018-02-01
Full Text Available To obtain an accurate count of wheat spikes, which is crucial for estimating yield, this paper proposes a new algorithm that uses computer vision to achieve this goal from an image. First, a home-built semi-autonomous multi-sensor field-based phenotype platform (FPP) is used to obtain orthographic images of wheat plots at the filling stage. The data acquisition system of the FPP provides high-definition RGB images and multispectral images of the corresponding quadrats. Then, high-definition panchromatic images are obtained by fusing the three RGB channels. The Gram–Schmidt fusion algorithm is then used to fuse these multispectral and panchromatic images, thereby improving the color identification degree of the targets. Next, the maximum entropy segmentation method is used for coarse segmentation. The threshold of this method is determined by a firefly algorithm based on chaos theory (FACT), and a morphological filter is then used to de-noise the coarse-segmentation results. Finally, morphological reconstruction theory is applied to segment the adhesive parts of the de-noised image and realize fine segmentation. The computer-generated counting results for the wheat plots, obtained using the independent-region statistics function in Matlab R2017b, are then compared with field measurements; the comparison indicates that the proposed method provides a more accurate count of wheat spikes than the other traditional fusion and segmentation methods mentioned in this paper.
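The maximum entropy coarse-segmentation step follows Kapur's criterion: choose the gray-level threshold that maximizes the summed entropies of the background and foreground histogram partitions. A sketch on a grayscale histogram is below; note the paper searches this objective with a chaos-based firefly algorithm, whereas this sketch simply scans all thresholds exhaustively.

```python
import math

def max_entropy_threshold(histogram):
    """Kapur-style maximum-entropy threshold: return the bin index t
    that maximizes H(background bins [0, t)) + H(foreground bins [t, end))."""
    total = sum(histogram)
    p = [h / total for h in histogram]
    best_t, best_h = 0, float("-inf")
    for t in range(1, len(p)):
        w0 = sum(p[:t])          # background mass
        w1 = 1.0 - w0            # foreground mass
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(q / w0 * math.log(q / w0) for q in p[:t] if q > 0)
        h1 = -sum(q / w1 * math.log(q / w1) for q in p[t:] if q > 0)
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t
```

On a clearly bimodal histogram the criterion places the threshold inside the valley between the two modes, which is what makes it usable as an automatic coarse-segmentation step before morphological cleanup.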
Entropy, Perception, and Relativity
National Research Council Canada - National Science Library
Jaegar, Stefan
2006-01-01
.... Shannon's notion of entropy is a special case of my more general definition of entropy. I define probability using a so-called performance function, which is de facto an exponential distribution...
Meirovitch, Hagai
2010-01-01
The commonly used simulation techniques, Metropolis Monte Carlo (MC) and molecular dynamics (MD) are of a dynamical type which enables one to sample system configurations i correctly with the Boltzmann probability, P(i)(B), while the value of P(i)(B) is not provided directly; therefore, it is difficult to obtain the absolute entropy, S approximately -ln P(i)(B), and the Helmholtz free energy, F. With a different simulation approach developed in polymer physics, a chain is grown step-by-step with transition probabilities (TPs), and thus their product is the value of the construction probability; therefore, the entropy is known. Because all exact simulation methods are equivalent, i.e. they lead to the same averages and fluctuations of physical properties, one can treat an MC or MD sample as if its members have rather been generated step-by-step. Thus, each configuration i of the sample can be reconstructed (from nothing) by calculating the TPs with which it could have been constructed. This idea applies also to bulk systems such as fluids or magnets. This approach has led earlier to the "local states" (LS) and the "hypothetical scanning" (HS) methods, which are approximate in nature. A recent development is the hypothetical scanning Monte Carlo (HSMC) (or molecular dynamics, HSMD) method which is based on stochastic TPs where all interactions are taken into account. In this respect, HSMC(D) can be viewed as exact and the only approximation involved is due to insufficient MC(MD) sampling for calculating the TPs. The validity of HSMC has been established by applying it first to liquid argon, TIP3P water, self-avoiding walks (SAW), and polyglycine models, where the results for F were found to agree with those obtained by other methods. Subsequently, HSMD was applied to mobile loops of the enzymes porcine pancreatic alpha-amylase and acetylcholinesterase in explicit water, where the difference in F between the bound and free states of the loop was calculated. Currently
Directory of Open Access Journals (Sweden)
Jia Xiao
2016-11-01
Full Text Available Constructing a merged concept lattice with formal concept analysis (FCA) is an important research direction in the field of integrating multi-source geo-ontologies. Extracting essential geographical properties and reducing the concept lattice are two key points of previous research. A formal integration method is proposed to address the challenges in these two areas. We first extract essential properties from multi-source geo-ontologies and use FCA to build a merged formal context. Second, the combined importance weight of each single attribute of the formal context is calculated by introducing the inclusion-degree importance from rough set theory and information entropy; a weighted formal context is then built from the merged formal context. Third, a combined weighted concept lattice is established from the weighted formal context with FCA, and the importance weight of every concept is defined as the sum of the weights of the attributes belonging to the concept's intent. Finally, the semantic granularity of a concept is defined by its importance weight; we then gradually reduce the weighted concept lattice by setting a diminishing threshold of semantic granularity. Additionally, all of the reduced lattices are organized into a regular hierarchical structure based on the threshold of semantic granularity. A workflow is designed to demonstrate this procedure, and a case study is conducted to show the feasibility and validity of this method and the procedure for integrating multi-source geo-ontologies.
Directory of Open Access Journals (Sweden)
Isis Didier Lins
2018-03-01
Full Text Available The Generalized Renewal Process (GRP is a probabilistic model for repairable systems that can represent the usual states of a system after a repair: as new, as old, or in a condition between new and old. It is often coupled with the Weibull distribution, widely used in the reliability context. In this paper, we develop novel GRP models based on probability distributions that stem from the Tsallis’ non-extensive entropy, namely the q-Exponential and the q-Weibull distributions. The q-Exponential and Weibull distributions can model decreasing, constant or increasing failure intensity functions. However, the power law behavior of the q-Exponential probability density function for specific parameter values is an advantage over the Weibull distribution when adjusting data containing extreme values. The q-Weibull probability distribution, in turn, can also fit data with bathtub-shaped or unimodal failure intensities in addition to the behaviors already mentioned. Therefore, the q-Exponential-GRP is an alternative for the Weibull-GRP model and the q-Weibull-GRP generalizes both. The method of maximum likelihood is used for their parameters’ estimation by means of a particle swarm optimization algorithm, and Monte Carlo simulations are performed for the sake of validation. The proposed models and algorithms are applied to examples involving reliability-related data of complex systems and the obtained results suggest GRP plus q-distributions are promising techniques for the analyses of repairable systems.
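The q-exponential building block used by these GRP models is easy to state. A sketch of the density follows; the parameterization below is the common Tsallis convention (valid for q < 2), which may differ in detail from the paper's, and the function names are illustrative.

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential e_q(x) = [1 + (1-q)x]^(1/(1-q)),
    with the convention e_q(x) = 0 when the bracket is non-positive.
    Reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_exponential_pdf(t, lam=1.0, q=1.5):
    """Density f(t) = (2-q) * lam * e_q(-lam * t) of the q-Exponential
    distribution (t >= 0, q < 2). For 1 < q < 2 the tail decays as a
    power law, which is the advantage cited for extreme-valued data."""
    return (2.0 - q) * lam * q_exp(-lam * t, q)
```

At q = 1 the density collapses to the ordinary exponential; for q = 1.5 the tail is polynomial, so large inter-failure times receive far more probability than under an exponential or Weibull fit.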
Ben-Naim, Arieh
2011-01-01
Changes in entropy can "sometimes" be interpreted in terms of changes in disorder. On the other hand, changes in entropy can "always" be interpreted in terms of changes in Shannon's measure of information. Mixing and demixing processes are used to highlight the pitfalls in the association of entropy with disorder. (Contains 3 figures.)
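The mixing example at the heart of this argument can be made concrete: the ideal entropy of mixing per particle is exactly the Shannon measure of information of the composition. The function names below are illustrative.

```python
import math

def shannon_measure(probs):
    """Shannon measure of information (SMI) of a distribution, in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def mixing_entropy_per_particle(x):
    """Ideal entropy of mixing per particle, in units of k_B, for a
    binary mixture with mole fractions (x, 1 - x): identical to the
    SMI of the composition."""
    return shannon_measure([x, 1.0 - x])
```

The maximum at x = 0.5 reads naturally as maximal missing information about which component a randomly picked particle belongs to, rather than as maximal "disorder", which is the article's point.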
Baxa, Michael C.; Haddadian, Esmael J.; Jumper, John M.; Freed, Karl F.; Sosnick, Tobin R.
2014-01-01
The loss of conformational entropy is a major contribution in the thermodynamics of protein folding. However, accurate determination of the quantity has proven challenging. We calculate this loss using molecular dynamics simulations of both the native protein and a realistic denatured state ensemble. For ubiquitin, the total change in entropy is TΔS_Total = 1.4 kcal·mol⁻¹ per residue at 300 K, with only 20% from the loss of side-chain entropy. Our analysis exhibits mixed agreement with prior studies because of the use of more accurate ensembles and contributions from correlated motions. Buried side chains lose only a factor of 1.4 in the number of conformations available per rotamer upon folding (Ω_U/Ω_N). The entropy loss for helical and sheet residues differs due to the smaller motions of helical residues (TΔS_helix−sheet = 0.5 kcal·mol⁻¹), a property not fully reflected in the amide N-H and carbonyl C=O bond NMR order parameters. The results have implications for the thermodynamics of folding and binding, including estimates of solvent ordering and microscopic entropies obtained from NMR. PMID:25313044
Entropy of the system formed in heavy ion collision
International Nuclear Information System (INIS)
Gudima, K.K.; Schulz, H.; Toneev, V.D.
1985-01-01
Within a cascade model, the entropy evolution in a system produced in heavy-ion collisions is investigated. The entropy calculation is based on smoothing the distribution function over momentum space by introducing a temperature field. The resulting entropy per nucleon is shown to be rather sensitive to the phase-space subdivision into cells at the stage of free scattering of reaction products. Compared to recent experimental results for specific entropy values inferred from the composite particle yield of 4π measurements, it is found that cascade calculations do not favour particular entropy model treatments and suggest smaller entropy values than follow from equilibrium statistics
Attractor comparisons based on density
International Nuclear Information System (INIS)
Carroll, T. L.
2015-01-01
Recognizing a chaotic attractor can be seen as a problem in pattern recognition. Some feature vector must be extracted from the attractor and used to compare to other attractors. The field of machine learning has many methods for extracting feature vectors, including clustering methods, decision trees, support vector machines, and many others. In this work, feature vectors are created by representing the attractor as a density in phase space and creating polynomials based on this density. Density is useful in itself because it is a one-dimensional function of phase space position, but representing an attractor as a density is also a way to reduce the size of a large data set before analyzing it with graph theory methods, which can be computationally intensive. The density computation in this paper is also fast to execute. In this paper, as a demonstration of the usefulness of density, the density is used directly to construct phase space polynomials for comparing attractors. Comparisons between attractors could be useful for tracking changes in an experiment when the underlying equations are too complicated for vector field modeling.
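The density-as-feature-vector idea can be sketched with the Hénon map: bin trajectory points on a coarse phase-space grid and compare the resulting occupancy vectors. This is only the first half of the paper's pipeline (it goes on to fit phase-space polynomials to the density); the grid size, map parameters, and function names here are illustrative choices.

```python
import math

def henon(n, a=1.4, b=0.3, x0=0.1, y0=0.0, skip=100):
    """Iterate the Henon map, discarding an initial transient."""
    x, y = x0, y0
    pts = []
    for i in range(n + skip):
        x, y = 1 - a * x * x + y, b * x
        if i >= skip:
            pts.append((x, y))
    return pts

def density_vector(points, bins=8, lo=-1.5, hi=1.5):
    """Normalized occupancy density of the attractor on a bins x bins
    phase-space grid, flattened into a feature vector."""
    grid = [0] * (bins * bins)
    for x, y in points:
        i = min(bins - 1, max(0, int((x - lo) / (hi - lo) * bins)))
        j = min(bins - 1, max(0, int((y - lo) / (hi - lo) * bins)))
        grid[i * bins + j] += 1
    n = len(points)
    return [c / n for c in grid]

def distance(u, v):
    """Euclidean distance between two density feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
```

Two trajectories on the same attractor (different initial conditions) produce nearly identical density vectors, while changing a map parameter changes the invariant density and hence the feature vector, which is the basis for attractor comparison.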
Quantum chaos: entropy signatures
International Nuclear Information System (INIS)
Miller, P.A.; Sarkar, S.; Zarum, R.
1998-01-01
A definition of quantum chaos is given in terms of entropy production rates for a quantum system coupled weakly to a reservoir. This allows the treatment of classical and quantum chaos on the same footing. In the quantum theory the entropy considered is the von Neumann entropy and in classical systems it is the Gibbs entropy. The rate of change of the coarse-grained Gibbs entropy of the classical system with time is given by the Kolmogorov-Sinai (KS) entropy. The relation between KS entropy and the rate of change of von Neumann entropy is investigated for the kicked rotator. For a system which is classically chaotic there is a linear relationship between these two entropies. Moreover it is possible to construct contour plots for the local KS entropy and compare it with the corresponding plots for the rate of change of von Neumann entropy. The quantitative and qualitative similarities of these plots are discussed for the standard map (kicked rotor) and the generalised cat maps. (author)
Novel lossless FMRI image compression based on motion compensation and customized entropy coding.
Sanchez, Victor; Nasiopoulos, Panos; Abugharbieh, Rafeef
2009-07-01
We recently proposed a method for lossless compression of 4-D medical images based on the advanced video coding standard (H.264/AVC). In this paper, we present two major contributions that enhance our previous work for compression of functional MRI (fMRI) data: 1) a new multiframe motion compensation process that employs 4-D search, variable-size block matching, and bidirectional prediction; and 2) a new context-based adaptive binary arithmetic coder designed for lossless compression of the residual and motion vector data. We validate our method on real fMRI sequences of various resolutions and compare the performance to two state-of-the-art methods: 4D-JPEG2000 and H.264/AVC. Quantitative results demonstrate that our proposed technique significantly outperforms current state of the art with an average compression ratio improvement of 13%.
Bayes-Optimal Entropy Pursuit for Active Choice-Based Preference Learning
Pallone, Stephen N.; Frazier, Peter I.; Henderson, Shane G.
2017-01-01
We analyze the problem of learning a single user's preferences in an active learning setting, sequentially and adaptively querying the user over a finite time horizon. Learning is conducted via choice-based queries, where the user selects her preferred option among a small subset of offered alternatives. These queries have been shown to be a robust and efficient way to learn an individual's preferences. We take a parametric approach and model the user's preferences through a linear classifier...
RNA Thermodynamic Structural Entropy.
Directory of Open Access Journals (Sweden)
Juan Antonio Garcia-Martin
Full Text Available Conformational entropy for atomic-level, three dimensional biomolecules is known experimentally to play an important role in protein-ligand discrimination, yet reliable computation of entropy remains a difficult problem. Here we describe the first two accurate and efficient algorithms to compute the conformational entropy for RNA secondary structures, with respect to the Turner energy model, where free energy parameters are determined from UV absorption experiments. An algorithm to compute the derivational entropy for RNA secondary structures had previously been introduced, using stochastic context free grammars (SCFGs). However, the numerical value of derivational entropy depends heavily on the chosen context free grammar and on the training set used to estimate rule probabilities. Using data from the Rfam database, we determine that both of our thermodynamic methods, which agree in numerical value, are substantially faster than the SCFG method. Thermodynamic structural entropy is much smaller than derivational entropy, and the correlation between length-normalized thermodynamic entropy and derivational entropy is moderately weak to poor. In applications, we plot the structural entropy as a function of temperature for known thermoswitches, such as the repression of heat shock gene expression (ROSE) element; we determine that the correlation between hammerhead ribozyme cleavage activity and total free energy is improved by including an additional free energy term arising from conformational entropy; and we plot the structural entropy of windows of the HIV-1 genome. Our software RNAentropy can compute structural entropy for any user-specified temperature, and supports both the Turner'99 and Turner'04 energy parameters. It follows that RNAentropy is state-of-the-art software to compute RNA secondary structure conformational entropy. Source code is available at https://github.com/clotelab/RNAentropy/; a full web server is available at http
RNA Thermodynamic Structural Entropy.
Garcia-Martin, Juan Antonio; Clote, Peter
2015-01-01
Conformational entropy for atomic-level, three dimensional biomolecules is known experimentally to play an important role in protein-ligand discrimination, yet reliable computation of entropy remains a difficult problem. Here we describe the first two accurate and efficient algorithms to compute the conformational entropy for RNA secondary structures, with respect to the Turner energy model, where free energy parameters are determined from UV absorption experiments. An algorithm to compute the derivational entropy for RNA secondary structures had previously been introduced, using stochastic context free grammars (SCFGs). However, the numerical value of derivational entropy depends heavily on the chosen context free grammar and on the training set used to estimate rule probabilities. Using data from the Rfam database, we determine that both of our thermodynamic methods, which agree in numerical value, are substantially faster than the SCFG method. Thermodynamic structural entropy is much smaller than derivational entropy, and the correlation between length-normalized thermodynamic entropy and derivational entropy is moderately weak to poor. In applications, we plot the structural entropy as a function of temperature for known thermoswitches, such as the repression of heat shock gene expression (ROSE) element, we determine that the correlation between hammerhead ribozyme cleavage activity and total free energy is improved by including an additional free energy term arising from conformational entropy, and we plot the structural entropy of windows of the HIV-1 genome. Our software RNAentropy can compute structural entropy for any user-specified temperature, and supports both the Turner'99 and Turner'04 energy parameters. It follows that RNAentropy is state-of-the-art software to compute RNA secondary structure conformational entropy. Source code is available at https://github.com/clotelab/RNAentropy/; a full web server is available at http
An efficient binomial model-based measure for sequence comparison and its application.
Liu, Xiaoqing; Dai, Qi; Li, Lihua; He, Zerong
2011-04-01
Sequence comparison is one of the major tasks in bioinformatics; it can serve as evidence of structural and functional conservation, as well as of evolutionary relations. There are several similarity/dissimilarity measures for sequence comparison, but challenges remain. This paper presents a binomial model-based measure to analyze biological sequences. With the help of a random indicator, the occurrence of a word at any position of a sequence can be regarded as a Bernoulli random variable, and the distribution of the sum of word occurrences is well known to be binomial. Using a recursive formula, we computed the binomial probability of the word count and proposed a binomial model-based measure built on the relative entropy. The proposed measure was tested in extensive experiments, including classification of HEV genotypes and phylogenetic analysis, and further compared with alignment-based and alignment-free measures. The results demonstrate that the proposed binomial model-based measure is more efficient.
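Two ingredients of the abstract can be sketched directly: the recursive computation of binomial word-count probabilities, and a relative-entropy comparison of word-frequency profiles. The dissimilarity below is a simplified symmetrized KL divergence on raw k-word frequencies, a stand-in for the paper's measure (which weights words by their binomial count probabilities); all names here are illustrative.

```python
import math
from collections import Counter

def binomial_pmf(n, p):
    """P(count = m) for m = 0..n via the stable recursion
    P(m+1) = P(m) * (n-m)/(m+1) * p/(1-p), avoiding large factorials."""
    pmf = [(1.0 - p) ** n]
    for m in range(n):
        pmf.append(pmf[-1] * (n - m) / (m + 1) * p / (1.0 - p))
    return pmf

def word_probs(seq, k):
    """Empirical k-word occurrence frequencies of a sequence."""
    n = len(seq) - k + 1
    return {w: c / n for w, c in Counter(seq[i:i + k] for i in range(n)).items()}

def relative_entropy_measure(s1, s2, k=2, eps=1e-9):
    """Symmetrized relative entropy between k-word frequency profiles,
    with a small epsilon floor for unseen words."""
    p, q = word_probs(s1, k), word_probs(s2, k)
    words = set(p) | set(q)
    def kl(a, b):
        return sum(a.get(w, eps) * math.log(a.get(w, eps) / b.get(w, eps))
                   for w in words)
    return 0.5 * (kl(p, q) + kl(q, p))
```

The recursion makes the count distribution cheap to evaluate even for long sequences, which is the computational point the abstract emphasizes.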
Chen, Xiurong; Tian, Yixiang; Zhao, Rubo
2017-01-01
In this paper, we study the cross-market effects of Brexit on the stock and bond markets of nine major countries. Incorporating information theory, we introduce time-varying impact weights based on symbolic transfer entropy to improve the traditional GARCH model. The empirical results show that, under the influence of Brexit, flight-to-quality occurs not only between the stocks and bonds of each country but also simultaneously across countries. We also find that the time-varying symbolic transfer entropy GARCH model proposed in this paper is more accurate than the traditional GARCH model, which indicates that it has practical application value. PMID: 28817712
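The symbolic transfer entropy used above for the time-varying impact weights can be sketched in a few lines. This is a generic plug-in estimator over ordinal (permutation) patterns, not the authors' model; the embedding dimension and the demo series are illustrative assumptions.

```python
import random
from collections import Counter
from math import log

def symbolize(series, m=3):
    """Map each length-m window to its ordinal (permutation) pattern."""
    return [tuple(sorted(range(m), key=lambda j: w[j]))
            for w in (series[i:i+m] for i in range(len(series) - m + 1))]

def transfer_entropy(x, y, m=3):
    """Symbolic transfer entropy from y to x (nats): the plug-in estimate of
    sum p(x1,x0,y0) * log[ p(x1|x0,y0) / p(x1|x0) ] over ordinal patterns."""
    sx, sy = symbolize(x, m), symbolize(y, m)
    n = len(sx) - 1
    triples = Counter(zip(sx[1:], sx[:-1], sy[:-1]))
    pairs_x = Counter(zip(sx[1:], sx[:-1]))
    joint_y = Counter(zip(sx[:-1], sy[:-1]))
    hist_x  = Counter(sx[:-1])
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        te += c / n * log((c / joint_y[(x0, y0)]) /
                          (pairs_x[(x1, x0)] / hist_x[x0]))
    return te

# Demo: x copies y with one step of lag, so information flows y -> x.
random.seed(0)
y = [random.random() for _ in range(500)]
x = [0.0] + y[:-1]
te_y_to_x = transfer_entropy(x, y)
te_x_to_y = transfer_entropy(y, x)
```

The directed asymmetry (te_y_to_x exceeding te_x_to_y) is what the impact weights in the abstract are built from.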
International Nuclear Information System (INIS)
Zhou Yunlong; Chen Fei; Sun Bin
2008-01-01
Based on the property that a wavelet packet transform decomposes an image at different scales, a flow regime identification method based on image wavelet packet information entropy features and a genetic neural network was proposed. Gas-liquid two-phase flow images were captured by a digital high-speed video system in a horizontal pipe. Information entropy features of the transform coefficients were extracted using image processing techniques and multi-resolution analysis. The genetic neural network was trained using these eigenvectors, reduced by principal component analysis, as flow regime samples, and intelligent flow regime identification was realized. The test results showed that the image wavelet packet information entropy features clearly reflect the differences between seven typical flow regimes, and that the genetic neural network, combining the merits of the genetic algorithm and the BP algorithm, converges quickly and avoids local minima. The recognition rate of the network reached about 100%, providing a new and effective method for on-line flow regime identification. (authors)
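The entropy-feature step of such a pipeline can be sketched in pure Python. As an assumption for illustration, the sketch uses a 1-D Haar wavelet packet tree on a signal rather than the 2-D image transform of the paper, and all function names are hypothetical; the genetic neural network stage is not shown.

```python
import random
from math import log, sqrt

def haar_step(x):
    """One Haar analysis step: approximation and detail coefficients."""
    a = [(x[2*i] + x[2*i+1]) / sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / sqrt(2) for i in range(len(x) // 2)]
    return a, d

def wavelet_packet(x, levels):
    """Full wavelet packet tree: both halves are split at every level;
    returns the list of leaf subbands."""
    bands = [x]
    for _ in range(levels):
        bands = [half for b in bands for half in haar_step(b)]
    return bands

def packet_entropy(x, levels=2):
    """Shannon entropy of the relative energies of the packet subbands,
    used here as a single scalar feature."""
    energies = [sum(c * c for c in b) for b in wavelet_packet(x, levels)]
    total = sum(energies) or 1.0
    ps = [e / total for e in energies if e > 0]
    return -sum(p * log(p) for p in ps)

# A constant signal keeps its energy in one subband (entropy ~ 0);
# noise spreads energy across subbands (higher entropy).
random.seed(1)
flat = [1.0] * 16
noisy = [random.random() for _ in range(16)]
```

In the paper, a vector of such entropy features (one per subband or scale) would be reduced by PCA before training the classifier.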
Methods for calculating nonconcave entropies
International Nuclear Information System (INIS)
Touchette, Hugo
2010-01-01
Five different methods which can be used to analytically calculate entropies that are nonconcave as functions of the energy in the thermodynamic limit are discussed and compared. The five methods are based on the following ideas and techniques: (i) microcanonical contraction, (ii) metastable branches of the free energy, (iii) generalized canonical ensembles with specific illustrations involving the so-called Gaussian and Betrag ensembles, (iv) the restricted canonical ensemble, and (v) the inverse Laplace transform. A simple long-range spin model having a nonconcave entropy is used to illustrate each method.
The nexus between geopolitical uncertainty and crude oil markets: An entropy-based wavelet analysis
Uddin, Gazi Salah; Bekiros, Stelios; Ahmed, Ali
2018-04-01
The global financial crisis and the subsequent geopolitical turbulence in energy markets have brought increased attention to proper statistical modeling, especially of the crude oil markets. In particular, we utilize a time-frequency decomposition approach based on wavelet analysis to explore the inherent dynamics and the causal interrelationships between various types of geopolitical, economic and financial uncertainty indices and oil markets. Via the introduction of a mixed discrete-continuous multiresolution analysis, we employ the entropic criterion for selecting the optimal decomposition level of a MODWT, as well as continuous-time coherency and phase measures for detecting business cycle (a)synchronization. Overall, strong heterogeneity in the revealed interrelationships is detected over time and across scales.
International Nuclear Information System (INIS)
Bian Yiwen; Yang Feng
2010-01-01
Data envelopment analysis (DEA) has been widely used in energy efficiency and environmental efficiency analysis in recent years. Based on existing environmental DEA technology, this paper presents several DEA models for estimating the aggregated efficiency of resources and the environment. These models can evaluate DMUs' energy efficiencies and environmental efficiencies simultaneously. However, the efficiency rankings obtained from these models differ, and each model provides valuable information about DMUs' efficiencies that should not be ignored. In this situation, it may be hard to choose a specific model in practice. To address this kind of performance evaluation problem, the current paper extends the Shannon-DEA procedure to establish a comprehensive efficiency measure for appraising DMUs' resource and environmental efficiencies. In the proposed approach, a measure for evaluating a model's importance degree is provided, and an approach for setting input/output targets that help DMU managers improve energy and environmental efficiencies is also discussed. We illustrate the proposed approach using a real data set of 30 provinces in China.
IGENT: efficient entropy based algorithm for genome-wide gene-gene interaction analysis.
Kwon, Min-Seok; Park, Mira; Park, Taesung
2014-01-01
With the development of high-throughput genotyping and sequencing technology, there is growing evidence of association between genetic variants and complex traits. Despite the thousands of genetic variants discovered, such genetic markers have been shown to explain only a very small proportion of the underlying genetic variance of complex traits. Gene-gene interaction (GGI) analysis is expected to unveil a large portion of the unexplained heritability of complex traits. In this work, we propose IGENT, an Information theory-based GEnome-wide gene-gene iNTeraction method. IGENT is an efficient algorithm for identifying genome-wide gene-gene interactions (GGI) and gene-environment interactions (GEI). For detecting significant GGIs at genome-wide scale, it is important to reduce the computational burden significantly. Our method uses information gain (IG) and evaluates its significance without resampling. In our simulation studies, the power of IGENT is shown to be better than or equivalent to that of BOOST. The proposed method successfully detected GGI for bipolar disorder in the Wellcome Trust Case Control Consortium (WTCCC) data and for age-related macular degeneration (AMD). The proposed method is implemented in C++ and available on Windows, Linux and Mac OS X.
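The information-gain statistic at the core of this kind of analysis can be illustrated with simulated genotypes. This is a generic plug-in sketch, not the IGENT package; the variable names and the simulation are assumptions for illustration.

```python
import random
from collections import Counter
from math import log

def mutual_info(xs, ys):
    """Plug-in mutual information I(X;Y) in nats."""
    n = len(xs)
    cx, cy, cxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(c/n * log((c/n) / (cx[a]/n * cy[b]/n))
               for (a, b), c in cxy.items())

def interaction_gain(x1, x2, y):
    """IG = I(X1,X2;Y) - I(X1;Y) - I(X2;Y): phenotype information
    carried only by the SNP pair jointly, not by either marginal."""
    return (mutual_info(list(zip(x1, x2)), y)
            - mutual_info(x1, y) - mutual_info(x2, y))

# Purely epistatic (XOR) model: the marginals are uninformative while
# the pair determines the phenotype, so IG is large. A purely additive
# model gives IG near zero.
random.seed(0)
snp1 = [random.randint(0, 1) for _ in range(2000)]
snp2 = [random.randint(0, 1) for _ in range(2000)]
y_xor = [a ^ b for a, b in zip(snp1, snp2)]
y_add = list(snp1)
ig_xor = interaction_gain(snp1, snp2, y_xor)
ig_add = interaction_gain(snp1, snp2, y_add)
```

IGENT's contribution is evaluating the significance of such IG values without resampling; that part is not sketched here.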
Efficient Multi-Label Feature Selection Using Entropy-Based Label Selection
Directory of Open Access Journals (Sweden)
Jaesung Lee
2016-11-01
Full Text Available Multi-label feature selection is designed to select a subset of features according to their importance to multiple labels. This task can be achieved by ranking the dependencies of features and selecting the features with the highest rankings. In a multi-label feature selection problem, the algorithm may be faced with a dataset containing a large number of labels. Because the computational cost of multi-label feature selection increases according to the number of labels, the algorithm may suffer from a degradation in performance when processing very large datasets. In this study, we propose an efficient multi-label feature selection method based on an information-theoretic label selection strategy. By identifying a subset of labels that significantly influence the importance of features, the proposed method efficiently outputs a feature subset. Experimental results demonstrate that the proposed method can identify a feature subset much faster than conventional multi-label feature selection methods for large multi-label datasets.
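The label-selection idea can be sketched as a two-stage ranking: first prune the label set by an information-theoretic score, then rank features only against the retained labels. This is a simplified, hypothetical sketch (label entropy as the selection score, summed mutual information as the feature score), not the paper's exact strategy.

```python
from collections import Counter
from math import log

def entropy(v):
    """Shannon entropy (nats) of a discrete sequence."""
    n = len(v)
    return -sum(c/n * log(c/n) for c in Counter(v).values())

def mutual_info(x, y):
    """Plug-in mutual information I(X;Y) in nats."""
    n = len(x)
    cx, cy, cxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum(c/n * log((c/n) / (cx[a]/n * cy[b]/n))
               for (a, b), c in cxy.items())

def select_features(features, labels, n_labels, n_features):
    """Keep the n_labels highest-entropy labels, then rank features by
    their summed mutual information with the retained labels only."""
    top = sorted(labels, key=entropy, reverse=True)[:n_labels]
    score = lambda f: sum(mutual_info(f, lab) for lab in top)
    order = sorted(range(len(features)),
                   key=lambda i: score(features[i]), reverse=True)
    return order[:n_features]

# l1 is balanced (informative); l2 is nearly constant and gets dropped,
# so only l1 drives the feature ranking.
l1 = [0, 1] * 50
l2 = [0] * 99 + [1]
f_good = list(l1)              # perfectly predicts l1
f_noise = [0, 0, 1, 1] * 25    # independent of l1
chosen = select_features([f_good, f_noise], [l1, l2], 1, 1)
```

Pruning the label set before scoring is what reduces the cost from growing with the full number of labels.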
Directory of Open Access Journals (Sweden)
Christian Corda
2018-01-01
Full Text Available In this paper we consider the metric entropies of the maps of an iterated function system deduced from a black hole, which are known to be the Bekenstein–Hawking entropy and its subleading corrections. More precisely, we consider the recent model of a Bohr-like black hole that has been analysed in several papers in the literature, obtaining the intriguing result that the metric entropies of a black hole are created by the metric entropies of the functions generated by the black hole principal quantum numbers, i.e., by the black hole quantum levels. We present a new type of topological entropy for general iterated function systems based on a new kind of inverse of covers. The notion of metric entropy for an iterated function system (IFS) is then considered, and we prove that these definitions of topological entropy for IFSs are equivalent. It is shown that this kind of topological entropy retains some properties that hold for the classic definition of topological entropy for a continuous map. We also consider average entropy as another type of topological entropy for an IFS, which is based on the topological entropies of its elements and is also invariant under topological conjugacy. The relation between Axiom A and the average entropy is investigated.
Hepatic Steatosis Assessment with Ultrasound Small-Window Entropy Imaging.
Zhou, Zhuhuang; Tai, Dar-In; Wan, Yung-Liang; Tseng, Jeng-Hwei; Lin, Yi-Ru; Wu, Shuicai; Yang, Kuen-Cheh; Liao, Yin-Yin; Yeh, Chih-Kuang; Tsui, Po-Hsiang
2018-04-02
Nonalcoholic fatty liver disease is a type of hepatic steatosis that is not only associated with critical metabolic risk factors but can also result in advanced liver disease. Ultrasound parametric imaging, which is based on statistical models, assesses fatty liver changes using quantitative visualization of the variations in the statistical properties of backscattered signals caused by hepatic steatosis. One constraint of using statistical models in ultrasound imaging is that the ultrasound data must conform to the distribution employed. Small-window entropy imaging was recently proposed as a non-model-based parametric imaging technique with a physical interpretation of backscattered statistics. In this study, we explored the feasibility of using small-window entropy imaging for the assessment of fatty liver disease and evaluated its performance through comparisons with parametric imaging based on the Nakagami distribution model (currently the most frequently used statistical model). Liver donors (n = 53) and patients (n = 142) were recruited to evaluate hepatic fat fractions (HFFs) using magnetic resonance spectroscopy, and to stage fatty liver disease (normal, mild, moderate and severe) using liver biopsy with histopathology. Livers were scanned using 3-MHz ultrasound to construct B-mode, small-window entropy and Nakagami images for correlation with HFF analyses and fatty liver stages. The diagnostic values of the imaging methods were evaluated using receiver operating characteristic curves. The results demonstrated that the entropy value obtained using small-window entropy imaging correlated well with log10(HFF), with a correlation coefficient r = 0.74, which was higher than those obtained for the B-scan and Nakagami images. Moreover, small-window entropy imaging also resulted in the highest area under the receiver operating characteristic curve (0.80 for stages equal to or more severe than mild; 0.90 for stages equal to or more severe than moderate; 0
A Theoretical Basis for Entropy-Scaling Effects in Human Mobility Patterns.
Osgood, Nathaniel D; Paul, Tuhin; Stanley, Kevin G; Qian, Weicheng
2016-01-01
Characterizing how people move through space has been an important component of many disciplines. With the advent of automated data collection through GPS and other location sensing systems, researchers have the opportunity to examine human mobility at spatio-temporal resolution heretofore impossible. However, the copious and complex data collected through these logging systems can be difficult for humans to fully exploit, leading many researchers to propose novel metrics for encapsulating movement patterns in succinct and useful ways. A particularly salient proposed metric is the mobility entropy rate of the string representing the sequence of locations visited by an individual. However, mobility entropy rate is not scale invariant: entropy rate calculations based on measurements of the same trajectory at varying spatial or temporal granularity do not yield the same value, limiting the utility of mobility entropy rate as a metric by confounding inter-experimental comparisons. In this paper, we derive a scaling relationship for mobility entropy rate of non-repeating straight line paths from the definition of Lempel-Ziv compression. We show that the resulting formulation predicts the scaling behavior of simulated mobility traces, and provides an upper bound on mobility entropy rate under certain assumptions. We further show that this formulation has a maximum value for a particular sampling rate, implying that optimal sampling rates for particular movement patterns exist.
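The entropy-rate estimator underlying this scaling argument derives from Lempel-Ziv compression; a minimal version can be written directly. The sketch below is a simplified variant in which each match must lie entirely in the past window, which is an assumption of this illustration rather than the exact estimator used in the paper.

```python
import random
from math import log2

def lz_entropy_rate(symbols):
    """Lempel-Ziv style entropy-rate estimate in bits/symbol:
    S_est = n * log2(n) / sum_i L_i, where L_i is the length of the
    shortest substring starting at i that does not occur in s[:i]."""
    s = "".join(map(str, symbols))
    n = len(s)
    total = 0
    for i in range(n):
        k = 1
        while i + k <= n and s[i:i+k] in s[:i]:
            k += 1
        total += k
    return n * log2(n) / total

# A periodic location trace is highly predictable (low entropy rate);
# a random binary trace is near the 1 bit/symbol maximum.
random.seed(0)
periodic = "01" * 200
noise = [random.randint(0, 1) for _ in range(400)]
```

The scale-dependence discussed in the abstract enters through `symbols`: resampling the same trajectory at a coarser spatial or temporal granularity changes the string, and hence the estimate.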
Directory of Open Access Journals (Sweden)
Øyvind Grøn
2012-12-01
Full Text Available The effect of gravity upon changes of the entropy of a gravity-dominated system is discussed. In a universe dominated by vacuum energy, gravity is repulsive, and there is accelerated expansion. Furthermore, inhomogeneities are inflated and the universe approaches a state of thermal equilibrium. The difference between the evolution of the cosmic entropy in a co-moving volume in an inflationary era with repulsive gravity and in a matter-dominated era with attractive gravity is discussed. The significance of the conversion of gravitational energy to thermal energy in a process of gravitational clumping, in order that the entropy of the universe shall increase, is made clear. The entropies of black holes and cosmic horizons are considered. The contribution to the gravitational entropy according to the Weyl curvature hypothesis is discussed. The entropy history of the Universe is reviewed.
Rosser, J. Barkley
2016-12-01
Entropy is a central concept of statistical mechanics, the main branch of physics underlying econophysics, the application of physics concepts to understanding economic phenomena. It enters econophysics in an ontological way through the Second Law of Thermodynamics, which drives the world economy from its ecological foundations as solar energy passes through food chains in a dissipative process of rising entropy, with production fundamentally involving the replacement of lower-entropy energy states with higher-entropy ones. By contrast, the mathematics of entropy as it appears in information theory becomes the basis for modeling financial market dynamics as well as income and wealth distribution dynamics. It also provides the basis for an alternative view of stochastic price equilibria in economics, as well as a crucial link between econophysics and sociophysics, keeping in mind the essential unity of the various concepts of entropy.
The Conditional Entropy Power Inequality for Bosonic Quantum Systems
DEFF Research Database (Denmark)
de Palma, Giacomo; Trevisan, Dario
2018-01-01
We prove the conditional Entropy Power Inequality for Gaussian quantum systems. This fundamental inequality determines the minimum quantum conditional von Neumann entropy of the output of the beam-splitter or of the squeezing among all the input states where the two inputs are conditionally...... independent given the memory and have given quantum conditional entropies. We also prove that, for any couple of values of the quantum conditional entropies of the two inputs, the minimum of the quantum conditional entropy of the output given by the conditional Entropy Power Inequality is asymptotically...... achieved by a suitable sequence of quantum Gaussian input states. Our proof of the conditional Entropy Power Inequality is based on a new Stam inequality for the quantum conditional Fisher information and on the determination of the universal asymptotic behaviour of the quantum conditional entropy under...
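For reference, the classical (unconditional) Shannon entropy power inequality that the quantum conditional version parallels can be stated explicitly; the conditional form sketched here is the standard classical analogue, not the quantum statement proved in the paper.

```latex
% Shannon's entropy power inequality: for independent random variables
% X and Y taking values in R^n, with differential entropy h,
N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/n},
\qquad
N(X+Y) \;\ge\; N(X) + N(Y),
% with equality when X and Y are Gaussian with proportional covariances.
% The conditional version replaces h(\cdot) by h(\cdot \mid M) for a
% memory system M, with X and Y conditionally independent given M.
```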
Entropy of black holes with multiple horizons
He, Yun; Ma, Meng-Sen; Zhao, Ren
2018-05-01
We examine the entropy of black holes in de Sitter space and black holes surrounded by quintessence. These black holes have multiple horizons, including at least the black hole event horizon and a horizon outside it (the cosmological horizon for de Sitter black holes and the "quintessence horizon" for black holes surrounded by quintessence). Based on the consideration that the two horizons are not independent of each other, we conjecture that the total entropy of these black holes should not be simply the sum of the entropies of the two horizons, but should have an extra term coming from the correlations between the two horizons. Differently from our previous works, in this paper we treat the cosmological constant as a variable and employ an effective method to derive the explicit form of the entropy. We also discuss the thermodynamic stabilities of these black holes according to the entropy and the effective temperature.
Problems in black-hole entropy interpretation
International Nuclear Information System (INIS)
Liberati, S.
1997-01-01
In this work some proposals for black-hole entropy interpretation are exposed and investigated. In particular, the author first considers the so-called 'entanglement entropy' interpretation, in the framework of the brick wall model, and the divergence problem arising in one-loop calculations of various thermodynamical quantities, such as entropy, internal energy and heat capacity. It is shown that the assumption of equality between entanglement entropy and the Bekenstein-Hawking one appears to give inconsistent results. This is the starting point for a different interpretation of black-hole entropy based on peculiar topological structures of manifolds with 'intrinsic' thermodynamical features. It is possible to show an exact relation between black-hole gravitational entropy and the topology of these Euclidean space-times. The expression for the Euler characteristic, through the Gauss-Bonnet integral, and the one for the entropy of gravitational instantons are proposed in a form that makes the relation between them self-evident. Using this relation, the author proposes a generalization of the Bekenstein-Hawking entropy in which it and the Euler characteristic are related by the equation S = χA / 8. Finally, some conclusions and hypotheses about possible further developments of this research are presented.
Fused Entropy Algorithm in Optical Computed Tomography
Directory of Open Access Journals (Sweden)
Xiong Wan
2014-02-01
Full Text Available In most applications of optical computed tomography (OpCT), limited-view problems are often encountered, which can be solved to a certain extent with typical OpCT reconstruction algorithms. The concept of entropy, which first emerged in information theory, has been introduced into OpCT algorithms, such as maximum entropy (ME) and cross entropy (CE) algorithms, which have demonstrated their superiority over traditional OpCT algorithms yet have their own limitations. A fused entropy (FE) algorithm, which follows an optimized criterion combining ME with CE self-adaptively, is proposed and investigated through comparisons with ME, CE and some traditional OpCT algorithms. Reconstructed results for several physical models show that this FE algorithm converges well and can achieve better precision than the other algorithms, which verifies the feasibility of FE as an approach to optimizing computation, not only for OpCT but also for other image processing applications.
Mei, Zhanyong; Zhao, Guoru; Ivanov, Kamen; Guo, Yanwei; Zhu, Qingsong; Zhou, Yongjin; Wang, Lei
2013-10-10
Motion characteristics of the CoP (Centre of Pressure, the point of application of the resultant ground reaction force acting on the plate) are useful for detecting foot type characteristics. To date, only a few studies have investigated the nonlinear characteristics of CoP velocity and acceleration during the stance phase. The aim of this study is to investigate whether CoP regularity differs among four foot types (normal foot, pes valgus, hallux valgus and pes cavus); this might be useful for the classification and diagnosis of foot injuries and diseases. To meet this goal, sample entropy, a measure of time-series regularity, was used to quantify the CoP regularity of the four foot types. One hundred and sixty-five subjects with the same foot type bilaterally (48 subjects with healthy feet, 22 with pes valgus, 47 with hallux valgus, and 48 with pes cavus) were recruited for this study. A Footscan® system was used to collect CoP data while each subject walked at a normal, steady speed. The velocity and acceleration in the medial-lateral (ML) and anterior-posterior (AP) directions, and the resultant velocity and acceleration, were derived from the CoP. The sample entropy is the negative natural logarithm of the conditional probability that a subseries of length m that matches pointwise within a tolerance r also matches at the next point. This was used to quantify variables of CoP velocity and acceleration for the four foot types. The parameters r (the tolerance) and m (the matching length) for the sample entropy calculation were determined by an optimal method. It was found that, in order to analyze all CoP parameters of velocity and acceleration during the stance phase of walking gait, each variable has a different optimal r value. On the contrary, the value m = 4 is optimal for all variables. Sample entropies of both velocity and acceleration in the AP direction were highly correlated with their corresponding resultant variables for r > 0.91. The sample entropy of the velocity in
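The sample entropy statistic defined in this abstract can be written compactly. The sketch below uses a common simplified template count (all available windows for each length) rather than any particular toolbox implementation, and the demo signals are synthetic assumptions.

```python
import random
from math import log

def sample_entropy(x, m, r):
    """SampEn = -ln(A/B): B counts template pairs of length m within
    Chebyshev tolerance r (self-matches excluded); A counts the pairs
    that still match when the templates are extended to length m+1."""
    def matches(k):
        t = [x[i:i+k] for i in range(len(x) - k + 1)]
        return sum(max(abs(a - b) for a, b in zip(t[i], t[j])) <= r
                   for i in range(len(t)) for j in range(i + 1, len(t)))
    a, b = matches(m + 1), matches(m)
    return -log(a / b) if a and b else float("inf")

# A strictly periodic signal is highly regular (SampEn near zero);
# white noise is irregular (higher SampEn).
random.seed(0)
regular = [0.0, 1.0, 2.0, 1.0] * 50
noise = [random.random() for _ in range(200)]
se_reg = sample_entropy(regular, 2, 0.2)
se_noise = sample_entropy(noise, 2, 0.2)
```

The study's point about the parameters is visible here: the result depends on both r and m, which is why the authors tune an optimal r per variable while m = 4 works throughout.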
Directory of Open Access Journals (Sweden)
Jinlu Sheng
2016-07-01
Full Text Available To effectively extract the typical features of a bearing, a new method was proposed that combines local mean decomposition Shannon entropy with an improved kernel principal component analysis model. First, features are extracted by a time–frequency domain method, local mean decomposition, and the Shannon entropy is used to process the original separated product functions, so as to obtain the original features. However, the extracted features still contain superfluous information, so a nonlinear multi-feature processing technique, kernel principal component analysis, is introduced to fuse them. The kernel principal component analysis is improved by a weight factor. The extracted features were input into a Morlet wavelet kernel support vector machine to obtain a bearing running state classification model, and the bearing running state was thereby identified. Both test cases and actual cases were analyzed.
Gao, Yangde; Karimi, Mohammad; Kudreyko, Aleksey A; Song, Wanqing
2017-12-30
In marine systems, engines represent the most important part of ships; the probability of bearing faults is the highest in the engines, so in bearing vibration analysis, early weak fault detection is very important for long-term monitoring. In this paper, we propose a novel method for the early weak fault diagnosis of bearings. First, we improve the alternating direction method of multipliers (ADMM) by changing the structure of the traditional ADMM, and then apply the improved ADMM to compressed sensing (CS) theory, which realizes the sparse optimization of the bearing signal for a large amount of data. After the sparse signal is reconstructed, the calculated signal is restored with minimum entropy deconvolution (MED) to obtain clear fault information. Finally, we adopt the sample entropy, the morphological mean square amplitude and the root mean square (RMS) to perform early fault diagnosis of the bearing, and we plot a boxplot comparison chart to find the best of the three indicators. The experimental results prove that the proposed method can effectively identify early weak faults. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Chavanis, Pierre-Henri, E-mail: chavanis@irsamc.ups-tlse.fr [Laboratoire de Physique Théorique, Université Paul Sabatier, 118 route de Narbonne, F-31062 Toulouse (France)
2014-12-01
In the context of two-dimensional (2D) turbulence, we apply the maximum entropy production principle (MEPP) by enforcing a local conservation of energy. This leads to an equation for the vorticity distribution that conserves all the Casimirs, the energy, and that increases monotonically the mixing entropy (H-theorem). Furthermore, the equation for the coarse-grained vorticity dissipates monotonically all the generalized enstrophies. These equations may provide a parametrization of 2D turbulence. They do not generally relax towards the maximum entropy state. The vorticity current vanishes for any steady state of the 2D Euler equation. Interestingly, the equation for the coarse-grained vorticity obtained from the MEPP turns out to coincide, after some algebraic manipulations, with the one obtained with the anticipated vorticity method. This shows a connection between these two approaches when the conservation of energy is treated locally. Furthermore, the newly derived equation, which incorporates a diffusion term and a drift term, has a nice physical interpretation in terms of a selective decay principle. This sheds new light on both the MEPP and the anticipated vorticity method. (paper)
Li, Tie; He, Xiaoyang; Tang, Junci; Zeng, Hui; Zhou, Chunying; Zhang, Nan; Liu, Hui; Lu, Zhuoxin; Kong, Xiangrui; Yan, Zheng
2018-02-01
Since the identification of islanding is easily interfered with by grid disturbances, an islanding detection device may make misjudgments, causing the photovoltaic system to go out of service. The detection device must therefore be able to distinguish islanding from grid disturbances. In this paper, the concept of deep learning is introduced into the classification of islanding and grid disturbances for the first time. A novel deep learning framework is proposed to detect and classify islanding and grid disturbances. The framework is a hybrid of wavelet transformation, multi-resolution singular spectrum entropy, and a deep learning architecture. As a signal processing step after the wavelet transformation, multi-resolution singular spectrum entropy combines multi-resolution analysis and spectrum analysis with entropy as the output, from which we can extract the intrinsic features that distinguish islanding from grid disturbances. With the features extracted, deep learning is used to classify islanding and grid disturbances. Simulation results indicate that the method achieves its goal with high accuracy, so mistaken withdrawal of the photovoltaic system from the power grid can be avoided.
Option price calibration from Renyi entropy
International Nuclear Information System (INIS)
Brody, Dorje C.; Buckley, Ian R.C.; Constantinou, Irene C.
2007-01-01
The calibration of the risk-neutral density function for the future asset price, based on the maximisation of the entropy measure of Renyi, is proposed. Whilst the conventional approach based on the use of logarithmic entropy measure fails to produce the observed power-law distribution when calibrated against option prices, the approach outlined here is shown to produce the desired form of the distribution. Procedures for the maximisation of the Renyi entropy under constraints are outlined in detail, and a number of interesting properties of the resulting power-law distributions are also derived. The result is applied to efficiently evaluate prices of path-independent derivatives
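The Rényi entropy family underlying this calibration is easy to state; the sketch below is a generic implementation of the measure itself, not the paper's constrained maximisation (which additionally requires matching observed option prices).

```python
from math import log

def renyi_entropy(p, alpha):
    """Renyi entropy H_a(p) = log(sum_i p_i^a) / (1 - a), in nats.
    The limit a -> 1 recovers the Shannon (logarithmic) entropy."""
    if abs(alpha - 1.0) < 1e-9:
        return -sum(pi * log(pi) for pi in p if pi > 0)
    return log(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)

uniform = [0.25] * 4        # H_a = log(4) for every order a
skewed = [0.7, 0.2, 0.1]    # H_a decreases as a grows
```

The practical point of the abstract is that maximising a Rényi entropy under pricing constraints yields power-law tails, which the Shannon (logarithmic) measure cannot reproduce.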
Logarithmic black hole entropy corrections and holographic Rényi entropy
Mahapatra, Subhash
2018-01-01
The entanglement and Rényi entropies for spherical entangling surfaces in CFTs with gravity duals can be explicitly calculated by mapping these entropies first to the thermal entropy on hyperbolic space and then, using the AdS/CFT correspondence, to the Wald entropy of topological black holes. Here we extend this idea by taking into account corrections to the Wald entropy. Using the method based on horizon symmetries and the asymptotic Cardy formula, we calculate corrections to the Wald entropy and find that these corrections are proportional to the logarithm of the area of the horizon. With the corrected expression for the entropy of the black hole, we then find corrections to the Rényi entropies. We calculate these corrections for both Einstein and Gauss-Bonnet gravity duals. Corrections with logarithmic dependence on the area of the entangling surface naturally occur at the order G_D^0. The entropic c-function and the inequalities of the Rényi entropy are also satisfied even with the correction terms.
Entropy Explained: The Origin of Some Simple Trends
Watson, Lori; Eisenstein, Odile
2002-10-01
Density functional theory computational methods were used to calculate the entropies of various molecules; computed entropies correlated closely with measured values. For organic systems, an average of 8.4 kcal/mol for the reaction entropy (one particle to two at 298.15 K) was observed; this value is largely determined by translational entropy gain. The average reaction entropy is slightly lower for reactions that produce two linear molecules and up to 4 kcal/mol higher when no linear molecules are produced, due to differences in rotational entropy of the reactants and products. Translational and rotational entropy are generally independent of molecular identity except for increases in mass and generation of additional moments of inertia; vibrational entropy, which is more dependent on the molecule itself, is a small contributor to the nearly constant entropy of reaction. A variety of inorganic and non-hydrocarbon main group reaction entropies were also calculated; there is an increased contribution of vibrational entropy in inorganic molecules with "softer" vibrations. The trends discussed in this paper can serve as a basis for understanding the contributions of different sources of entropy to the overall reaction TΔS° for students and practicing chemists; the method employed (i.e., using a commercial program to "discover" trends in a thermodynamic property) can serve as an example of discovery-based learning in the curriculum.
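The dominance of translational entropy noted above can be checked independently with the Sackur–Tetrode equation for the molar translational entropy of an ideal gas. This standalone calculation is an illustration, not the DFT procedure used in the paper; the function name is hypothetical and the constants are standard CODATA values.

```python
from math import log, pi

# Physical constants (SI, CODATA values)
k_B = 1.380649e-23    # Boltzmann constant, J/K
h   = 6.62607015e-34  # Planck constant, J*s
N_A = 6.02214076e23   # Avogadro constant, 1/mol
R   = k_B * N_A       # gas constant, J/(mol K)

def translational_entropy(mass_amu, T=298.15, p=1.0e5):
    """Molar translational entropy of an ideal gas from the
    Sackur-Tetrode equation: S = R * [ln((2*pi*m*kT/h^2)^(3/2) * kT/p) + 5/2],
    returned in J/(mol K)."""
    m = mass_amu * 1.66053906660e-27            # particle mass, kg
    q = (2 * pi * m * k_B * T / h ** 2) ** 1.5  # inverse cubed thermal wavelength
    v = k_B * T / p                             # volume per particle
    return R * (log(q * v) + 2.5)

# Argon: a monatomic gas, so translation is essentially the whole story;
# the result lands close to the tabulated standard molar entropy.
s_ar = translational_entropy(39.948)
```

Heavier gases gain entropy only logarithmically in mass, which is the "generally independent of molecular identity except for increases in mass" behaviour described in the abstract.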
Entropy and Kolmogorov complexity
Moriakov, N.V.
2016-01-01
This thesis is dedicated to studying the theory of entropy and its relation to the Kolmogorov complexity. Originating in physics, the notion of entropy was introduced to mathematics by C. E. Shannon as a way of measuring the rate at which information is coming from a data source. There are, however,
International Nuclear Information System (INIS)
Bao, Ning; Nezami, Sepehr; Ooguri, Hirosi; Stoica, Bogdan; Sully, James; Walter, Michael
2015-01-01
We initiate a systematic enumeration and classification of entropy inequalities satisfied by the Ryu-Takayanagi formula for conformal field theory states with smooth holographic dual geometries. For 2, 3, and 4 regions, we prove that the strong subadditivity and the monogamy of mutual information give the complete set of inequalities. This is in contrast to the situation for generic quantum systems, where a complete set of entropy inequalities is not known for 4 or more regions. We also find an infinite new family of inequalities applicable to 5 or more regions. The set of all holographic entropy inequalities bounds the phase space of Ryu-Takayanagi entropies, defining the holographic entropy cone. We characterize this entropy cone by reducing geometries to minimal graph models that encode the possible cutting and gluing relations of minimal surfaces. We find that, for a fixed number of regions, there are only finitely many independent entropy inequalities. To establish new holographic entropy inequalities, we introduce a combinatorial proof technique that may also be of independent interest in Riemannian geometry and graph theory.
Indian Academy of Sciences (India)
Enthalpy–entropy compensation is the name given to the correlation sometimes observed between the estimates of the enthalpy and entropy of a reaction obtained from temperature-dependence data. Although the mainly artefactual nature of this correlation has been known for many years, the subject enjoys periodical ...
Indian Academy of Sciences (India)
entropy loss during protein folding plays a much larger role in determining the shape of the free energy reaction landscape than it does in most small molecule reactions. For a protein to fold, the loss of entropy must be balanced by the gain in enthalpy for the free energy to favor folding. Strong non- covalent forces from ...
Tsallis Entropy, Escort Probability and the Incomplete Information Theory
Directory of Open Access Journals (Sweden)
Parvin Sadeghi
2010-12-01
Non-extensive statistical mechanics appears as a powerful way to describe complex systems. Tsallis entropy, the main core of this theory, has remained an unproven assumption. Many people have tried to derive the Tsallis entropy axiomatically. Here we follow the work of Wang (EPJB, 2002) and use the incomplete information theory to retrieve the Tsallis entropy. We change the incomplete information axioms to consider the escort probability and obtain a correct form of Tsallis entropy in comparison with Wang’s work.
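The two objects at the heart of this abstract are easy to state concretely. The sketch below is ours, not from the paper: it computes the Tsallis entropy S_q = (1 − Σ p_i^q)/(q − 1) (with k = 1) and the escort distribution P_i = p_i^q / Σ_j p_j^q, and shows that the q → 1 limit recovers the Shannon form.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), with k = 1."""
    if q == 1.0:
        # The q -> 1 limit recovers the Shannon entropy (in nats).
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def escort(p, q):
    """Escort distribution P_i = p_i^q / sum_j p_j^q."""
    w = [pi ** q for pi in p]
    z = sum(w)
    return [wi / z for wi in w]

p = [0.5, 0.3, 0.2]
print(tsallis_entropy(p, 2.0))                 # 1 - sum p_i^2 ≈ 0.62
print(tsallis_entropy(p, 1.0))                 # Shannon entropy ≈ 1.0297
print(abs(sum(escort(p, 2.0)) - 1.0) < 1e-12)  # True: escort is normalized
```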
Density estimation by maximum quantum entropy
Energy Technology Data Exchange (ETDEWEB)
Silver, R.N.; Wallstrom, T.; Martz, H.F.
1993-11-01
A new Bayesian method for non-parametric density estimation is proposed, based on a mathematical analogy to quantum statistical physics. The mathematical procedure is related to maximum entropy methods for inverse problems and image reconstruction. The information divergence enforces global smoothing toward default models, convexity, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing is enforced by constraints on differential operators. The linear response of the estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood (evidence). The method is demonstrated on textbook data sets.
Entropy viscosity method for nonlinear conservation laws
Guermond, Jean-Luc
2011-05-01
A new class of high-order numerical methods for approximating nonlinear conservation laws is described (entropy viscosity method). The novelty is that a nonlinear viscosity based on the local size of an entropy production is added to the numerical discretization at hand. This new approach does not use any flux or slope limiters, applies to equations or systems supplemented with one or more entropy inequalities and does not depend on the mesh type and polynomial approximation. Various benchmark problems are solved with finite elements, spectral elements and Fourier series to illustrate the capability of the proposed method. © 2010 Elsevier Inc.
Permutation Entropy for Random Binary Sequences
Directory of Open Access Journals (Sweden)
Lingfeng Liu
2015-12-01
In this paper, we generalize the permutation entropy (PE) measure, which is based on Shannon’s entropy, to binary sequences, and theoretically analyze this measure for random binary sequences. We deduce the theoretical value of PE for random binary sequences, which can be used to measure the randomness of binary sequences. We also reveal the relationship between this PE measure and other randomness measures, such as Shannon’s entropy and Lempel–Ziv complexity. The results show that PE is consistent with these two measures. Furthermore, we use PE as one of the randomness measures to evaluate the randomness of chaotic binary sequences.
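As a rough illustration of entropy-based randomness measures for binary sequences, the sketch below (ours, a simplification rather than the paper's generalized PE) computes plain block entropy over overlapping length-m words; ordinal patterns in the usual PE degenerate on binary data because of ties, which is why the paper needs a generalization at all.

```python
import math
import random
from collections import Counter

def binary_pattern_entropy(bits, m=3):
    """Shannon entropy (bits) of overlapping length-m words, divided by m.
    An i.i.d. fair-coin sequence approaches 1; structure lowers the value."""
    counts = Counter(tuple(bits[i:i + m]) for i in range(len(bits) - m + 1))
    n = sum(counts.values())
    h = -sum(c / n * math.log2(c / n) for c in counts.values())
    return h / m

random.seed(0)
bits = [random.randint(0, 1) for _ in range(100000)]
print(binary_pattern_entropy(bits, 3))            # close to 1 for random bits
print(binary_pattern_entropy([0, 1] * 50000, 3))  # periodic: only 2 words -> 1/3
```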
International Nuclear Information System (INIS)
Guo Yongfeng; Xu Wei; Li Dongxi; Xie Wenxian
2008-01-01
A stochastic dissipative dynamical system driven by non-Gaussian noise is investigated. A general approximate Fokker-Planck equation of the system is derived through a path-integral approach. Based on the definition of Shannon's information entropy, the exact time dependence of the entropy flux and entropy production of the system is calculated, both in the absence and in the presence of a non-equilibrium constraint. The present calculation can be used to interpret the interplay of the dissipative constant and non-Gaussian noise on the entropy flux and entropy production.
Nonsymmetric entropy I: basic concepts and results
Liu, Chengshi
2006-01-01
A new concept named nonsymmetric entropy, which generalizes the concepts of Boltzmann's entropy and Shannon's entropy, is introduced. The maximal nonsymmetric entropy principle is proven, and some important distribution laws are derived naturally from it.
Entropy resistance minimization: An alternative method for heat exchanger analyses
International Nuclear Information System (INIS)
Cheng, XueTao
2013-01-01
In this paper, the concept of entropy resistance is proposed based on the entropy generation analyses of heat transfer processes. It is shown that smaller entropy resistance leads to larger heat transfer rate with fixed thermodynamic force difference and smaller thermodynamic force difference with fixed heat transfer rate, respectively. For the discussed two-stream heat exchangers in which the heat transfer rates are not given and the three-stream heat exchanger with prescribed heat capacity flow rates and inlet temperatures of the streams, smaller entropy resistance leads to larger heat transfer rate. For the two-stream heat exchangers with fixed heat transfer rate, smaller entropy resistance leads to larger effectiveness. Furthermore, it is shown that smaller values of the concepts of entropy generation numbers and modified entropy generation number do not always correspond to better performance of the discussed heat exchangers. - Highlights: • The concept of entropy resistance is defined for heat exchangers. • The concepts based on entropy generation are used to analyze heat exchangers. • Smaller entropy resistance leads to better performance of heat exchangers. • The applicability of entropy generation minimization is conditional
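Reading the abstract's analogy, the entropy generation rate for steady heat flow Q from a hot stream at T_h to a cold one at T_c is S_gen = Q(1/T_c − 1/T_h), and entropy resistance can be taken as the thermodynamic force difference Δ(1/T) per unit heat flow. The following sketch is our reading, not a quotation of the paper's formulas:

```python
def entropy_generation(q, t_hot, t_cold):
    """Entropy generation rate (W/K) for steady heat flow q (W) from
    a hot reservoir at t_hot (K) to a cold one at t_cold (K)."""
    return q * (1.0 / t_cold - 1.0 / t_hot)

def entropy_resistance(q, t_hot, t_cold):
    """Entropy resistance: thermodynamic force difference per unit heat
    flow (our assumption, analogous to electrical R = V / I)."""
    return (1.0 / t_cold - 1.0 / t_hot) / q

q, th, tc = 1000.0, 400.0, 300.0
print(entropy_generation(q, th, tc))        # ≈ 0.833 W/K
r = entropy_resistance(q, th, tc)
# At fixed force difference, halving the resistance doubles the heat flow:
print((1.0 / tc - 1.0 / th) / (r / 2.0))    # ≈ 2000 W
```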
Tsallis Entropy Theory for Modeling in Water Engineering: A Review
Directory of Open Access Journals (Sweden)
Vijay P. Singh
2017-11-01
Water engineering is an amalgam of engineering (e.g., hydraulics, hydrology, irrigation, ecosystems, environment, water resources) and non-engineering (e.g., social, economic, political) aspects that are needed for planning, designing and managing water systems. These aspects and the associated issues have been dealt with in the literature using different techniques that are based on different concepts and assumptions. A fundamental question that still remains is: Can we develop a unifying theory for addressing these? The second law of thermodynamics permits us to develop a theory that helps address these in a unified manner. This theory can be referred to as the entropy theory. The thermodynamic entropy theory is analogous to the Shannon entropy or the information theory. Perhaps the most popular generalization of the Shannon entropy is the Tsallis entropy. The Tsallis entropy has been applied to a wide spectrum of problems in water engineering. This paper provides an overview of Tsallis entropy theory in water engineering. After some basic description of entropy and Tsallis entropy, a review of its applications in water engineering is presented, based on three types of problems: (1) problems requiring entropy maximization; (2) problems requiring coupling Tsallis entropy theory with another theory; and (3) problems involving physical relations.
Towards operational interpretations of generalized entropies
International Nuclear Information System (INIS)
Topsoee, Flemming
2010-01-01
The driving force behind our study has been to overcome the difficulties you encounter when you try to extend the clear and convincing operational interpretations of classical Boltzmann-Gibbs-Shannon entropy to other notions, especially to generalized entropies as proposed by Tsallis. Our approach is philosophical, based on speculations regarding the interplay between truth, belief and knowledge. The main result demonstrates that, accepting philosophically motivated assumptions, the only possible measures of entropy are those suggested by Tsallis - which, as we know, include classical entropy. This result constitutes, so it seems, a more transparent interpretation of entropy than previously available. However, further research to clarify the assumptions is still needed. Our study points to the thesis that one should never consider the notion of entropy in isolation - in order to enable a rich and technically smooth study, further concepts, such as divergence, score functions and descriptors or controls should be included in the discussion. This will clarify the distinction between Nature and Observer and facilitate a game theoretical discussion. The usefulness of this distinction and the subsequent exploitation of game theoretical results - such as those connected with the notion of Nash equilibrium - is demonstrated by a discussion of the Maximum Entropy Principle.
Coarse Graining Shannon and von Neumann Entropies
Directory of Open Access Journals (Sweden)
Ana Alonso-Serrano
2017-05-01
The nature of coarse graining is intuitively “obvious”, but it is rather difficult to find explicit and calculable models of the coarse graining process (and the resulting entropy flow) discussed in the literature. What we would like to have at hand is some explicit and calculable process that takes an arbitrary system, with specified initial entropy S, and that monotonically and controllably drives the entropy to its maximum value. This does not have to be a physical process; in fact, for some purposes it is better to deal with a gedanken-process, since then it is more obvious how the “hidden information” is hiding in the fine-grain correlations that one is simply agreeing not to look at. We shall present several simple, mathematically well-defined and easy-to-work-with conceptual models for coarse graining. We shall consider both the classical Shannon and quantum von Neumann entropies, including models based on quantum decoherence, and analyse the entropy flow in some detail. When coarse graining the quantum von Neumann entropy, we find it extremely useful to introduce an adaptation of Hawking’s super-scattering matrix. These explicit models that we shall construct allow us to quantify and keep clear track of the entropy that appears when coarse graining the system and the information that can be hidden in unobserved correlations (while not the focus of the current article, in the long run, these considerations are of interest when addressing the black hole information puzzle).
International Nuclear Information System (INIS)
Baccetti, Valentina; Visser, Matt
2013-01-01
Even if a probability distribution is properly normalizable, its associated Shannon (or von Neumann) entropy can easily be infinite. We carefully analyze conditions under which this phenomenon can occur. Roughly speaking, this happens when arbitrarily small amounts of probability are dispersed into an infinite number of states; we shall quantify this observation and make it precise. We develop several particularly simple, elementary, and useful bounds, and also provide some asymptotic estimates, leading to necessary and sufficient conditions for the occurrence of infinite Shannon entropy. We go to some effort to keep technical computations as simple and conceptually clear as possible. In particular, we shall see that large entropies cannot be localized in state space; large entropies can only be supported on an exponentially large number of states. We are for the time being interested in single-channel Shannon entropy in the information theoretic sense, not entropy in a stochastic field theory or quantum field theory defined over some configuration space, on the grounds that this simple problem is a necessary precursor to understanding infinite entropy in a field theoretic context. (paper)
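The abstract's claim that large entropies can only be supported on exponentially many states follows from the elementary bound H ≤ log2(N) for a distribution on N states, saturated by the uniform distribution. A small numeric illustration (ours, not from the paper), including a heavy-tailed distribution of the normalizable-but-infinite-entropy kind the paper analyzes:

```python
import math

def shannon(p):
    """Shannon entropy in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# On N states, H <= log2(N); the uniform distribution saturates the bound,
# so H bits of entropy require at least 2**H states:
print(shannon([1.0 / 1024] * 1024))   # 10.0 == log2(1024)

# A tail p_n proportional to 1/(n (log n)^2) is normalizable but has
# divergent entropy; truncations show H growing as more states are kept:
def truncated_tail_entropy(N):
    w = [1.0 / (n * math.log(n) ** 2) for n in range(2, N)]
    z = sum(w)
    return shannon([wi / z for wi in w])

print(truncated_tail_entropy(10**3) < truncated_tail_entropy(10**5))  # True
```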
Entropy Coherent and Entropy Convex Measures of Risk
Laeven, R.J.A.; Stadje, M.A.
2011-01-01
We introduce two subclasses of convex measures of risk, referred to as entropy coherent and entropy convex measures of risk. We prove that convex, entropy convex and entropy coherent measures of risk emerge as certainty equivalents under variational, homothetic and multiple priors preferences,
Entropy coherent and entropy convex measures of risk
Laeven, R.J.A.; Stadje, M.
2013-01-01
We introduce two subclasses of convex measures of risk, referred to as entropy coherent and entropy convex measures of risk. Entropy coherent and entropy convex measures of risk are special cases of φ-coherent and φ-convex measures of risk. Contrary to the classical use of coherent and convex
A robust method for heart sounds localization using lung sounds entropy.
Yadollahi, Azadeh; Moussavi, Zahra M K
2006-03-01
Heart sounds are the main unavoidable interference in lung sound recording and analysis. Hence, several techniques have been developed to reduce or cancel heart sounds (HS) from lung sound records. The first step in most HS cancellation techniques is to detect the segments including HS. This paper proposes a novel method for HS localization using the entropy of the lung sounds. We investigated both Shannon and Renyi entropies, and the results of the method using Shannon entropy were superior. Another HS localization method, based on the multiresolution product of lung sound wavelet coefficients and adopted from the literature, was also implemented for comparison. The methods were tested on data from 6 healthy subjects recorded at low (7.5 ml/s/kg) and medium (15 ml/s/kg) flow rates. The error of the entropy-based method using Shannon entropy was found to be 0.1 +/- 0.4% and 1.0 +/- 0.7% at low and medium flow rates, respectively, which is significantly lower than that of the multiresolution product method and those of other methods reported in previous studies. The proposed method is fully automated and detects HS-included segments in a completely unsupervised manner.
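A minimal sketch of entropy-based interference localization (ours; the window length, bin count, and synthetic signal are illustrative assumptions, not the paper's parameters): compute the Shannon entropy of the amplitude histogram in sliding windows and flag the windows where it dips, since a strong, nearly deterministic interference concentrates the histogram into few bins.

```python
import math
import random

def window_entropy(x, nbins=16):
    """Shannon entropy (bits) of the amplitude histogram of one window."""
    lo, hi = min(x), max(x)
    if hi == lo:
        return 0.0
    counts = [0] * nbins
    for v in x:
        counts[min(int((v - lo) / (hi - lo) * nbins), nbins - 1)] += 1
    n = len(x)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def entropy_track(signal, win=256, hop=128):
    """Entropy of successive overlapping windows."""
    return [window_entropy(signal[i:i + win])
            for i in range(0, len(signal) - win + 1, hop)]

random.seed(1)
sig = [random.gauss(0, 1) for _ in range(4096)]  # noise-like "lung sound"
for i in range(2000, 2300):                      # large, nearly constant burst
    sig[i] = 20.0                                # as a stand-in interference
track = entropy_track(sig)
dip = min(range(len(track)), key=lambda k: track[k])
print(1920 <= dip * 128 <= 2176)   # True: the entropy dip flags the burst
```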
Directory of Open Access Journals (Sweden)
José Pinto Casquilho
2017-02-01
The search for hypothetical optimal solutions of landscape composition is a major issue in landscape planning, and it can be outlined in a two-dimensional decision space involving economic value and landscape diversity, the latter being considered as a potential safeguard to the provision of services and externalities not accounted for in the economic value. In this paper, we use decision models with different utility valuations combined with weighted entropies respectively incorporating rarity factors associated with Gini-Simpson and Shannon measures. A small example of this framework is provided and discussed for landscape compositional scenarios in the region of Nisa, Portugal. The optimal solutions relative to the different cases considered are assessed in the two-dimensional decision space using a benchmark indicator. The results indicate that the likely best combination is achieved by the solution using Shannon weighted entropy and a square root utility function, corresponding to a risk-averse behavior associated with the precautionary principle linked to safeguarding landscape diversity, anchoring for ecosystem services provision and other externalities. Further developments are suggested, mainly those relative to the hypothesis that the decision models here outlined could be used to revisit the stability-complexity debate in the field of ecological studies.
Combined Power Quality Disturbances Recognition Using Wavelet Packet Entropies and S-Transform
Directory of Open Access Journals (Sweden)
Zhigang Liu
2015-08-01
Aiming at combined power quality disturbance recognition, an automated recognition method based on wavelet packet entropy (WPE) and modified incomplete S-transform (MIST) is proposed in this paper. By combining wavelet packet Tsallis singular entropy, energy entropy and MIST, a 13-dimensional feature vector of different power quality (PQ) disturbances, including single disturbances and combined disturbances, is extracted. Then, a ruled decision tree is designed to recognize the combined disturbances. The proposed method is tested and evaluated using a large number of simulated PQ disturbances and some real-life signals, which include voltage sag, swell, interruption, oscillation transient, impulsive transient, harmonics, voltage fluctuation and their combinations. In addition, a comparison of the proposed recognition approach with some existing techniques is made. The experimental results show that the proposed method can effectively recognize the single and combined PQ disturbances.
Advancing Shannon Entropy for Measuring Diversity in Systems
Directory of Open Access Journals (Sweden)
R. Rajaram
2017-01-01
From economic inequality and species diversity to power laws and the analysis of multiple trends and trajectories, diversity within systems is a major issue for science. Part of the challenge is measuring it. Shannon entropy H has been used to rethink diversity within probability distributions, based on the notion of information. However, there are two major limitations to Shannon’s approach. First, it cannot be used to compare diversity distributions that have different levels of scale. Second, it cannot be used to compare parts of diversity distributions to the whole. To address these limitations, we introduce a renormalization of probability distributions based on the notion of case-based entropy Cc as a function of the cumulative probability c. Given a probability density p(x), Cc measures the diversity of the distribution up to a cumulative probability of c, by computing the length or support of an equivalent uniform distribution that has the same Shannon information as the conditional distribution of p^c(x) up to cumulative probability c. We illustrate the utility of our approach by renormalizing and comparing three well-known energy distributions in physics, namely, the Maxwell-Boltzmann, Bose-Einstein, and Fermi-Dirac distributions for the energy of subatomic particles. The comparison shows that Cc is a vast improvement over H, as it provides a scale-free comparison of these diversity distributions and also allows for a comparison between parts of these diversity distributions.
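The "support of an equivalent uniform distribution with the same Shannon information" is the familiar numbers equivalent exp(H). The discrete sketch below is ours; the accumulation order (rarest cases first) and normalization are assumptions, and the paper's exact construction for continuous densities may differ.

```python
import math

def shannon(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def equivalent_support(p):
    """'Numbers equivalent': length of the uniform distribution whose
    Shannon information equals that of p."""
    return math.exp(shannon(p))

def case_based_entropy(p, c):
    """Sketch of C_c: equivalent support of the conditional distribution
    holding the lowest-probability cases up to cumulative probability c."""
    acc, kept = 0.0, []
    for pi in sorted(p):          # accumulate from the rare cases upward
        if acc >= c:
            break
        kept.append(pi)
        acc += pi
    z = sum(kept)
    return equivalent_support([pi / z for pi in kept])

p = [0.4, 0.3, 0.2, 0.1]
print(equivalent_support([0.25] * 4))   # ≈ 4.0: uniform on 4 states
print(case_based_entropy(p, 0.3))       # diversity among the rarest ~30% of cases
print(abs(case_based_entropy(p, 1.0) - equivalent_support(p)) < 1e-9)  # True
```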
Entropy in Corporate Information Systems
Directory of Open Access Journals (Sweden)
Victor Y. Tsvetkov
2014-03-01
This paper describes the stages of entropy formation and gives basic definitions for corporate information systems. It characterizes the quality of entropy and the duration of entropy in a corporate information system, and gives a paradigmatic description of the action of information entropy over time.
Holographic QCD, entanglement entropy, and critical temperature
Ali-Akbari, M.; Lezgi, M.
2017-10-01
Based on gauge-gravity duality and using holographic entanglement entropy, we carry out a phenomenological study to probe the confinement-deconfinement phase transition in a holographic model resembling quantum chromodynamics with two massless flavors and three colors. Our outcomes are in perfect agreement with the expected results, qualitatively and quantitatively. We find that the (holographic) entanglement entropy is a reliable order parameter for probing the phase transition.
Shimauchi, Akiko; Abe, Hiroyuki; Schacht, David V; Yulei, Jian; Pineda, Federico D; Jansen, Sanaz A; Ganesh, Rajiv; Newstead, Gillian M
2015-08-01
To quantify kinetic heterogeneity of breast masses that were initially detected with dynamic contrast-enhanced MRI, using whole-lesion kinetic distribution data obtained from computer-aided evaluation (CAE), and to compare that with standard kinetic curve analysis. Clinical MR images from 2006 to 2011 with breast masses initially detected with MRI were evaluated with CAE. The relative frequencies of six kinetic patterns (medium-persistent, medium-plateau, medium-washout, rapid-persistent, rapid-plateau, rapid-washout) within the entire lesion were used to calculate kinetic entropy (KE), a quantitative measure of enhancement pattern heterogeneity. Initial uptake (IU) and signal enhancement ratio (SER) were obtained from the most-suspicious kinetic curve. Mann-Whitney U test and ROC analysis were conducted for differentiation of malignant and benign masses. Forty benign and 37 malignant masses comprised the case set. IU and SER were not significantly different between malignant and benign masses (p = 0.748 and p = 0.083, respectively), whereas KE was significantly greater for malignant than benign masses. Quantifying kinetic heterogeneity of whole-lesion time-curve data with KE has the potential to improve differentiation of malignant from benign breast masses on breast MRI. • Kinetic heterogeneity can be quantified by computer-aided evaluation of breast MRI • Kinetic entropy was greater in malignant masses than benign masses • Kinetic entropy has the potential to improve differentiation of breast masses.
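KE as described is the Shannon entropy of the six kinetic-pattern relative frequencies within a lesion. A sketch (ours; the log base and any normalization are assumptions, not taken from the paper):

```python
import math

PATTERNS = ("medium-persistent", "medium-plateau", "medium-washout",
            "rapid-persistent", "rapid-plateau", "rapid-washout")

def kinetic_entropy(freqs):
    """Shannon entropy of the six kinetic-pattern relative frequencies:
    0 for a homogeneous lesion, log(6) when all six patterns occur
    equally often."""
    assert len(freqs) == len(PATTERNS) and abs(sum(freqs) - 1.0) < 1e-9
    return sum(f * math.log(1.0 / f) for f in freqs if f > 0)

print(kinetic_entropy([1.0, 0, 0, 0, 0, 0]))   # 0.0 -- one pattern everywhere
print(kinetic_entropy([1 / 6] * 6))            # ≈ 1.79 -- maximally heterogeneous
```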
Two dissimilar approaches to dynamical systems on hyper MV-algebras and their information entropy
Mehrpooya, Adel; Ebrahimi, Mohammad; Davvaz, Bijan
2017-09-01
Measuring the flow of information that is related to the evolution of a system which is modeled by applying a mathematical structure is of capital significance for science and usually for mathematics itself. Regarding this fact, a major issue concerning hyperstructures is their dynamics and the complexity of the varied possible dynamics that exist over them. Notably, the dynamics and uncertainty of hyper MV-algebras, which are hyperstructures and extensions of a central tool in infinite-valued Łukasiewicz propositional calculus that models many-valued logics, are of primary concern. Tackling this problem, in this paper we focus on the subject of dynamical systems on hyper MV-algebras and their entropy. In this respect, we adopt two varied approaches. One is the set-based approach, in which hyper MV-algebra dynamical systems are developed by employing set functions and set partitions. By the other method, based on points and point partitions, we establish the concept of hyper injective dynamical systems on hyper MV-algebras. Next, we study the notion of entropy for both kinds of systems. Furthermore, we consider essential ergodic characteristics of those systems and their entropy. In particular, we introduce the concept of isomorphic hyper injective and hyper MV-algebra dynamical systems, and we demonstrate that isomorphic systems have the same entropy. We present a couple of theorems in order to help calculate entropy. In particular, we prove a contemporary version of the addition and Kolmogorov-Sinai theorems. Furthermore, we provide a comparison between the indispensable properties of hyper injective and semi-independent dynamical systems. Specifically, we present and prove theorems that draw comparisons between the entropies of such systems. Lastly, we discuss some possible relationships between the theories of hyper MV-algebra and MV-algebra dynamical systems.
Minimum entropy production principle
Czech Academy of Sciences Publication Activity Database
Maes, C.; Netočný, Karel
2013-01-01
Vol. 8, No. 7 (2013), pp. 9664-9677 ISSN 1941-6016 Institutional support: RVO:68378271 Keywords: MINEP Subject RIV: BE - Theoretical Physics http://www.scholarpedia.org/article/Minimum_entropy_production_principle
Energy Technology Data Exchange (ETDEWEB)
Estes, John [Blackett Laboratory, Imperial College, London SW7 2AZ (United Kingdom); Jensen, Kristan [Department of Physics and Astronomy, University of Victoria, Victoria, BC V8W 3P6 (Canada); C.N. Yang Institute for Theoretical Physics, SUNY Stony Brook, Stony Brook, NY 11794-3840 (United States); O'Bannon, Andy [Rudolf Peierls Centre for Theoretical Physics, University of Oxford, 1 Keble Road, Oxford OX1 3NP (United Kingdom); Tsatis, Efstratios [8 Kotylaiou Street, Athens 11364 (Greece); Wrase, Timm [Stanford Institute for Theoretical Physics, Stanford University, Stanford, CA 94305 (United States)]
2014-05-19
We study a number of (3+1)- and (2+1)-dimensional defect and boundary conformal field theories holographically dual to supergravity theories. In all cases the defects or boundaries are planar, and the defects are codimension-one. Using holography, we compute the entanglement entropy of a (hemi-)spherical region centered on the defect (boundary). We define defect and boundary entropies from the entanglement entropy by an appropriate background subtraction. For some (3+1)-dimensional theories we find evidence that the defect/boundary entropy changes monotonically under certain renormalization group flows triggered by operators localized at the defect or boundary. This provides evidence that the g-theorem of (1+1)-dimensional field theories generalizes to higher dimensions.
Microcanonical entropy for classical systems
Franzosi, Roberto
2018-03-01
The entropy definition in the microcanonical ensemble is revisited. We propose a novel definition for the microcanonical entropy that resolves the debate on the correct definition of the microcanonical entropy. In particular, we show that this entropy definition fixes the problem inherent in the exact extensivity of the caloric equation. Furthermore, this entropy reproduces results in agreement with those predicted with the standard Boltzmann entropy when applied to macroscopic systems. In contrast, the predictions obtained with the standard Boltzmann entropy and with the entropy we propose differ for small system sizes. Thus, we conclude that the Boltzmann entropy provides a correct description for macroscopic systems, whereas extremely small systems are better described with the entropy that we propose here.
Directory of Open Access Journals (Sweden)
Nantian Huang
2016-09-01
In order to improve the identification accuracy for the mechanical fault types of high voltage circuit breakers (HVCBs) without training samples, a novel mechanical fault diagnosis method for HVCBs is proposed, using a hybrid classifier constructed with Support Vector Data Description (SVDD) and the fuzzy c-means (FCM) clustering method, based on Local Mean Decomposition (LMD) and time segmentation energy entropy (TSEE). Firstly, LMD is used to decompose the nonlinear and non-stationary vibration signals of HVCBs into a series of product functions (PFs). Secondly, TSEE is chosen for the feature vectors, given the superiority of energy entropy and the characteristics of time-delay faults of HVCBs. Then, SVDD trained with normal samples is applied to judge mechanical faults of HVCBs. If a mechanical fault is confirmed, the new fault sample and all known fault samples are clustered by FCM with the cluster number of known fault types. Finally, another SVDD trained by the specific fault samples is used to judge whether the fault sample belongs to an unknown type or not. The results of experiments carried out on a real SF6 HVCB validate that the proposed fault-detection method is effective for known faults with training samples and unknown faults without training samples.
International Nuclear Information System (INIS)
Lemos, Jose P. S.; Zaslavskii, Oleg B.
2010-01-01
We trace the origin of the black hole entropy S, replacing a black hole by a quasiblack hole. Let the boundary of a static body approach its own gravitational radius, in such a way that a quasihorizon forms. We show that if the body is thermal with the temperature taking the Hawking value at the quasihorizon limit, it follows, in the nonextremal case, from the first law of thermodynamics that the entropy approaches the Bekenstein-Hawking value S=A/4. In this setup, the key role is played by the surface stresses on the quasihorizon and one finds that the entropy comes from the quasihorizon surface. Any distribution of matter inside the surface leads to the same universal value for the entropy in the quasihorizon limit. This can be of some help in the understanding of black hole entropy. Other similarities between black holes and quasiblack holes such as the mass formulas for both objects had been found previously. We also discuss the entropy for extremal quasiblack holes, a more subtle issue.
Multidimensional entropy landscape of quantum criticality
Grube, K.; Zaum, S.; Stockert, O.; Si, Q.; Löhneysen, H. V.
2017-08-01
The third law of thermodynamics states that the entropy of any system in equilibrium has to vanish at absolute zero temperature. At nonzero temperatures, on the other hand, matter is expected to accumulate entropy near a quantum critical point, where it undergoes a continuous transition from one ground state to another. Here, we determine, based on general thermodynamic principles, the spatial-dimensional profile of the entropy S near a quantum critical point and its steepest descent in the corresponding multidimensional stress space. We demonstrate this approach for the canonical quantum critical compound CeCu6-xAux near its onset of antiferromagnetic order. We are able to link the directional stress dependence of S to the previously determined geometry of quantum critical fluctuations. Our demonstration of the multidimensional entropy landscape provides the foundation to understand how quantum criticality nucleates novel phases such as high-temperature superconductivity.
Minimal entropy approximation for cellular automata
International Nuclear Information System (INIS)
Fukś, Henryk
2014-01-01
We present a method for the construction of approximate orbits of measures under the action of cellular automata which is complementary to the local structure theory. The local structure theory is based on the idea of Bayesian extension, that is, construction of a probability measure consistent with given block probabilities and maximizing entropy. If instead of maximizing entropy one minimizes it, one can develop another method for the construction of approximate orbits, at the heart of which is the iteration of finite-dimensional maps, called minimal entropy maps. We present numerical evidence that the minimal entropy approximation sometimes outperforms the local structure theory in characterizing the properties of cellular automata. The density response curve for elementary CA rule 26 is used to illustrate this claim. (paper)
Entropy Generation Analysis of Wildfire Propagation
Directory of Open Access Journals (Sweden)
Elisa Guelpa
2017-08-01
Entropy generation is commonly applied to describe the evolution of irreversible processes, such as heat transfer and turbulence. These are both dominating phenomena in fire propagation. In this paper, entropy generation analysis is applied to a grassland fire event, with the aim of finding possible links between entropy generation and propagation directions. The ultimate goal of such analysis consists in helping one to overcome possible limitations of the models usually applied to the prediction of wildfire propagation. These models are based on the application of the superimposition of the effects due to wind and slope, which has proven to fail in various cases. The analysis presented here shows that entropy generation allows a detailed analysis of the landscape propagation of a fire and can thus be applied to its quantitative description.
Zhang, Li; Wu, Kexin; Liu, Yang
2017-12-01
A multi-objective performance optimization method is proposed to resolve the trade-off, which single structural parameters of a small fan cannot balance, between the static characteristics and the aerodynamic noise. In this method, three structural parameters are selected as the optimization variables, and the static pressure efficiency and the aerodynamic noise of the fan are regarded as the multi-objective performance. Furthermore, the response surface method and the entropy method are used to establish the optimization function between the optimization variables and the multi-objective performances. Finally, the optimized model is found when the optimization function reaches its maximum value. Experimental data show that the optimized model not only enhances the static characteristics of the fan but also obviously reduces the noise. The results of the study will provide some reference for the multi-objective performance optimization of other types of rotating machinery.
Liang, Xuedong; Liu, Canmian; Li, Zhi
2017-01-01
In connection with the sustainable development of scenic spots, this paper, with consideration of resource conditions, economic benefits, auxiliary industry scale and ecological environment, establishes a comprehensive measurement model of the sustainable capacity of scenic spots; optimizes the index system by principal components analysis to extract principal components; assigns the weight of principal components by the entropy method; analyzes the sustainable capacity of scenic spots in each province of China comprehensively in combination with the TOPSIS method; and finally puts forward suggestions to aid decision-making. According to the study, this method provides an effective reference for the study of the sustainable development of scenic spots and is very significant for considering the sustainable development of scenic spots and auxiliary industries so as to establish specific and scientific countermeasures for improvement. PMID:29271947
Manufacturing of High Entropy Alloys
Jablonski, Paul D.; Licavoli, Joseph J.; Gao, Michael C.; Hawk, Jeffrey A.
2015-07-01
High entropy alloys (HEAs) have generated interest in recent years due to their unique positioning within the alloy world. By incorporating a number of elements in high proportion they have high configurational entropy, and thus they hold the promise of interesting and useful properties such as enhanced strength and phase stability. The present study investigates the microstructure of two single-phase face-centered cubic (FCC) HEAs, CoCrFeNi and CoCrFeNiMn, with special attention given to melting, homogenization and thermo-mechanical processing. Large-scale ingots were made by vacuum induction melting to avoid the extrinsic factors inherent in small-scale laboratory button samples. A computationally based homogenization heat treatment was applied to both alloys in order to eliminate segregation due to normal ingot solidification. The alloys fabricated well, with typical thermo-mechanical processing parameters being employed.
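The configurational entropy that motivates these alloys is, under the ideal-mixing assumption, S_conf = -R Σ x_i ln x_i. A minimal sketch (our illustration, not from the paper) evaluating it for the two equiatomic alloys studied:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def ideal_config_entropy(fractions):
    """Ideal-mixing configurational entropy: -R * sum(x_i * ln x_i)."""
    return -R * sum(x * math.log(x) for x in fractions if x > 0)

# Equiatomic compositions of the two FCC alloys in the study
for alloy, n in [("CoCrFeNi", 4), ("CoCrFeNiMn", 5)]:
    s = ideal_config_entropy([1.0 / n] * n)
    print(f"{alloy}: S_conf = {s:.2f} J/(mol K)")  # R ln 4 and R ln 5
```

For an equiatomic N-component alloy this reduces to R ln N, which is why adding Mn raises the configurational entropy from R ln 4 (about 11.5 J/(mol K)) to R ln 5 (about 13.4 J/(mol K)).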
Preserved entropy and fragile magnetism.
Canfield, Paul C; Bud'ko, Sergey L
2016-08-01
A large swath of quantum critical and strongly correlated electron systems can be associated with the phenomena of preserved entropy and fragile magnetism. In this overview we present our thoughts and plans for the discovery and development of lanthanide and transition metal based, strongly correlated systems that are revealed by suppressed, fragile magnetism, quantum criticality, or grow out of preserved entropy. We will present and discuss current examples such as YbBiPt, YbAgGe, YbFe2Zn20, PrAg2In, BaFe2As2, CaFe2As2, LaCrSb3 and LaCrGe3 as part of our motivation and to provide illustrative examples.
Sabbe, Maarten K; De Vleeschouwer, Freija; Reyniers, Marie-Françoise; Waroquier, Michel; Marin, Guy B
2008-11-27
In this work a complete and consistent set of 95 Benson group additive values (GAVs) for standard entropies S° and heat capacities Cp° of hydrocarbons and hydrocarbon radicals is presented. These GAVs include 46 groups, among which 25 radical groups, which, to the best of our knowledge, have not been reported before. The GAVs have been determined from a set of B3LYP/6-311G(d,p) ideal gas statistical thermodynamics values for 265 species, consistently with previously reported GAVs for standard enthalpies of formation. One-dimensional hindered rotor corrections for all internal rotations are included. The computational methodology has been compared to experimental entropies (298 K) for 39 species, with a mean absolute deviation (MAD) between experiment and calculation of 1.2 J mol⁻¹ K⁻¹, and to 46 experimental heat capacities (298 K) with a resulting MAD = 1.8 J mol⁻¹ K⁻¹. The constructed database allowed evaluation of corrections on S° and Cp° for non-nearest-neighbor effects, which have not been determined previously. The group additive model predicts the S° and Cp° within approximately 5 J mol⁻¹ K⁻¹ of the ab initio values for 11 of the 14 molecules of the test set, corresponding to an acceptable maximal deviation of a factor of 1.6 on the equilibrium coefficient. The obtained GAVs can be applied for the prediction of S° and Cp° for a wide range of hydrocarbons and hydrocarbon radicals. The constructed database also allowed determination of a large set of hydrogen bond increments, which can be useful for the prediction of radical thermochemistry.
Behnam, Morteza; Pourghassem, Hossein
2017-01-30
EEG signal analysis of pediatric patients plays a vital role in making a decision to intervene in presurgical stages. In this paper, an offline seizure detection algorithm based on the definition of a seizure-specific wavelet (Seizlet) is presented. After designing the Seizlet, by forming the cone-of-influence map of the EEG signal, four types of layouts are analytically designed, called Seizure Modulus Maxima Patterns (SMMP). By mapping CorrEntropy Induced Metric (CIM) series, four structural features based on least-squares estimation of a fitted non-tilt conic ellipse are extracted, called CorrEntropy Ellipse Features (CEF). The parameters of the SMMP and CEF are tuned by employing a hybrid optimization algorithm based on honeybee hive optimization in combination with a Las Vegas randomized algorithm and an Elman recurrent classifier. Eventually, the optimal features are classified by AdaBoost classifiers in a cascade structure into seizure and non-seizure signals. The proposed algorithm is evaluated on 844 h of signals with 163 seizure events recorded from 23 patients with intractable seizure disorder, and an accuracy rate of 91.44% and a false detection rate of 0.014 per hour are obtained using 7-channel EEG signals. To overcome the restrictions of general kernels and wavelet coefficient-based features, we designed the Seizlet as an exclusive kernel of the seizure signal for the first time. Also, the Seizlet-based patterns of EEG signals have been modeled to extract the seizure. The reported results demonstrate that our proposed Seizlet is effective at extracting the patterns of the epileptic seizure.
Cross-entropy clustering framework for catchment classification
Tongal, Hakan; Sivakumar, Bellie
2017-09-01
There is an increasing interest in catchment classification and regionalization in hydrology, as they are useful for identification of appropriate model complexity and transfer of information from gauged catchments to ungauged ones, among others. This study introduces a nonlinear cross-entropy clustering (CEC) method for classification of catchments. The method specifically considers embedding dimension (m), sample entropy (SampEn), and coefficient of variation (CV) to represent dimensionality, complexity, and variability of the time series, respectively. The method is applied to daily streamflow time series from 217 gauging stations across Australia. The results suggest that a combination of linear and nonlinear parameters (i.e. m, SampEn, and CV), representing different aspects of the underlying dynamics of streamflows, could be useful for determining distinct patterns of flow generation mechanisms within a nonlinear clustering framework. For the 217 streamflow time series, nine hydrologically homogeneous clusters that have distinct patterns of flow regime characteristics and specific dominant hydrological attributes with different climatic features are obtained. Comparison of the results with those obtained using the widely employed k-means clustering method (which results in five clusters, with the loss of some information about the features of the clusters) suggests the superiority of the cross-entropy clustering method. The outcomes from this study provide a useful guideline for employing the nonlinear dynamic approaches based on hydrologic signatures and for gaining an improved understanding of streamflow variability at a large scale.
A comparison between centre-based and expedition-based ...
African Journals Online (AJOL)
A comparison between centre-based and expedition-based (wilderness) adventure experiential learning regarding group effectiveness: A mixed methodology ... it is strongly recommended that a centre-based adventure program be used – mainly on account of active involvement, intensive social interaction and continuous ...
Entropy in quantum information theory - Communication and cryptography
DEFF Research Database (Denmark)
Majenz, Christian
Entropies have been immensely useful in information theory. In this Thesis, several results in quantum information theory are collected, most of which use entropy as the main mathematical tool. The first one concerns the von Neumann entropy. While a direct generalization of the Shannon entropy to density matrices, the von Neumann entropy behaves differently. The latter does not, for example, have the monotonicity property that the former possesses: when adding another quantum system, the entropy can decrease. A long-standing open question is whether there are quantum analogues of unconstrained non-… in quantum Shannon theory. While immensely more entanglement-consuming, the variant of port-based teleportation is interesting for applications like instantaneous non-local computation and attacks on quantum position-based cryptography. Port-based teleportation cannot be implemented perfectly…
On the Conditional Rényi Entropy
S. Fehr (Serge); S. Berens (Stefan)
2014-01-01
The Rényi entropy of general order unifies the well-known Shannon entropy with several other entropy notions, like the min-entropy or the collision entropy. In contrast to the Shannon entropy, there seems to be no commonly accepted definition for the conditional Rényi entropy: several…
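As an illustration of the unification described above, the order-α Rényi entropy reduces to the Shannon, collision and min-entropies at α = 1, 2 and ∞. This is a standard textbook formula coded by us, not code from the paper:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy (bits) of order alpha for a probability vector p.
    alpha = 1 recovers the Shannon entropy (as a limit), alpha = 2 the
    collision entropy, and alpha = inf the min-entropy."""
    if alpha == 1:
        return -sum(x * math.log2(x) for x in p if x > 0)
    if math.isinf(alpha):
        return -math.log2(max(p))
    return math.log2(sum(x ** alpha for x in p if x > 0)) / (1 - alpha)

uniform = [0.25] * 4
# All orders coincide at log2(4) = 2 bits for the uniform distribution
print([renyi_entropy(uniform, a) for a in (1, 2, float("inf"))])
```

For non-uniform distributions the orders separate, with the min-entropy always the smallest.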
Fan Zhang; Oleg N. Senkov; Jonathan D. Miller
2013-01-01
Microstructure and phase composition of a CrMo0.5NbTa0.5TiZr high entropy alloy were studied in the as-solidified and heat treated conditions. In the as-solidified condition, the alloy consisted of two disordered BCC phases and an ordered cubic Laves phase. The BCC1 phase solidified in the form of dendrites enriched with Mo, Ta and Nb, and its volume fraction was 42%. The BCC2 and Laves phases solidified by the eutectic-type reaction, and their volume fractions were 27% and 31%, respectively....
Maximum Entropy Approaches to Living Neural Networks
Directory of Open Access Journals (Sweden)
John M. Beggs
2010-01-01
Understanding how ensembles of neurons collectively interact will be a key step in developing a mechanistic theory of cognitive processes. Recent progress in multineuron recording and analysis techniques has generated tremendous excitement over the physiology of living neural networks. One of the key developments driving this interest is a new class of models based on the principle of maximum entropy. Maximum entropy models have been reported to account for spatial correlation structure in ensembles of neurons recorded from several different types of data. Importantly, these models require only information about the firing rates of individual neurons and their pairwise correlations. If this approach is generally applicable, it would drastically simplify the problem of understanding how neural networks behave. Given the interest in this method, several groups now have worked to extend maximum entropy models to account for temporal correlations. Here, we review how maximum entropy models have been applied to neuronal ensemble data to account for spatial and temporal correlations. We also discuss criticisms of the maximum entropy approach that argue that it is not generally applicable to larger ensembles of neurons. We conclude that future maximum entropy models will need to address three issues: temporal correlations, higher-order correlations, and larger ensemble sizes. Finally, we provide a brief list of topics for future research.
Entropy Production in Stochastics
Directory of Open Access Journals (Sweden)
Demetris Koutsoyiannis
2017-10-01
While the modern definition of entropy is genuinely probabilistic, in entropy production the classical thermodynamic definition, as in heat transfer, is typically used. Here we explore the concept of entropy production within stochastics and, particularly, two forms of entropy production in logarithmic time, unconditionally (EPLT) or conditionally on the past and present having been observed (CEPLT). We study the theoretical properties of both forms, in general and in application to a broad set of stochastic processes. A main question investigated, related to model identification and fitting from data, is how to estimate the entropy production from a time series. It turns out that there is a link of the EPLT with the climacogram, and of the CEPLT with two additional tools introduced here, namely the differenced climacogram and the climacospectrum. In particular, EPLT and CEPLT are related to slopes of log-log plots of these tools, with the asymptotic slopes at the tails being most important as they justify the emergence of scaling laws of second-order characteristics of stochastic processes. As a real-world application, we use an extraordinarily long time series of turbulent velocity and show how a parsimonious stochastic model can be identified and fitted using the tools developed.
The Conditional Entropy Power Inequality for Bosonic Quantum Systems
De Palma, Giacomo; Trevisan, Dario
2018-01-01
We prove the conditional Entropy Power Inequality for Gaussian quantum systems. This fundamental inequality determines the minimum quantum conditional von Neumann entropy of the output of the beam-splitter or of the squeezing among all the input states where the two inputs are conditionally independent given the memory and have given quantum conditional entropies. We also prove that, for any couple of values of the quantum conditional entropies of the two inputs, the minimum of the quantum conditional entropy of the output given by the conditional Entropy Power Inequality is asymptotically achieved by a suitable sequence of quantum Gaussian input states. Our proof of the conditional Entropy Power Inequality is based on a new Stam inequality for the quantum conditional Fisher information and on the determination of the universal asymptotic behaviour of the quantum conditional entropy under the heat semigroup evolution. The beam-splitter and the squeezing are the central elements of quantum optics, and can model the attenuation, the amplification and the noise of electromagnetic signals. This conditional Entropy Power Inequality will have a strong impact in quantum information and quantum cryptography. Among its many possible applications there is the proof of a new uncertainty relation for the conditional Wehrl entropy.
Directory of Open Access Journals (Sweden)
Rong Jiang
2014-09-01
As the early design decision-making structure, a software architecture plays a key role in the final software product quality and the whole project. In the software design and development process, an effective evaluation of the trustworthiness of a software architecture can help make scientific and reasonable decisions on the architecture, which are necessary for the construction of highly trustworthy software. In view of the lack of trustworthiness evaluation and measurement studies for software architecture, this paper provides a trustworthy attribute model of software architecture. Based on this model, the paper proposes to use the Principle of Maximum Entropy (POME) and the Grey Decision-making Method (GDMM) as the trustworthiness evaluation method of a software architecture, demonstrates the scientific soundness and rationality of this method, and verifies its feasibility through case analysis.
Wu, Jia-rui; Guo, Wei-xian; Zhang, Xiao-meng; Zhang, Bing; Zhang, Yue
2015-04-01
In this study, Professor Yan Zhenghua's recipes for treating heart diseases were collected to determine the frequency and association rules among drugs by such data mining methods as the apriori algorithm and complex system entropy cluster, and to summarize Professor Yan Zhenghua's medication experience in treating heart diseases. The results indicated that frequently used drugs included Salviae Miltiorrhizae Radix et Rhizoma, Parched Ziziphi Spinosae Semen, Polygoni Multiflori Caulis, Ostreae Concha, and Poria; frequently used drug combinations included "Ostreae Concha, Draconis Os", "Polygoni Multiflori Caulis, Parched Ziziphi Spinosae Semen", and "Salviae Miltiorrhizae Radix et Rhizoma, Parched Ziziphi Spinosae Semen". The drug combinations with a confidence of 1 included "Dalbergiae Odoriferae Lignum-->Salviae Miltiorrhizae Radix et Rhizoma", "Allii Macrostemonis Bulbus-->Parched Ziziphi Spinosae Semen", "Draconis Os-->Ostreae Concha", and "Salviae Miltiorrhizae Radix et Rhizoma, Draconis Os-->Ostreae Concha". The core drug combinations included "Chrysanthemi Flos-Gastrodiae Rhizoma-Tribuli Fructus", "Dipsaci Radix-Taxillus sutchuenensis-Achyranthis Bidentatae Radix", and "Margaritifera Concha-Polygoni Multiflori Caulis-Platycladi Semen-Draconis Os".
Directory of Open Access Journals (Sweden)
George J. A. Jiang
2015-01-01
Electroencephalogram (EEG) signals, as they can express the human brain's activities and reflect awareness, have been widely used in much research and medical equipment to build a noninvasive monitoring index of the depth of anesthesia (DOA). The Bispectral (BIS) index monitor is one of the famous and important indicators for anesthesiologists, primarily using EEG signals when assessing the DOA. In this study, an attempt is made to build a new indicator using EEG signals to provide a more valuable reference for the DOA for clinical researchers. The EEG signals are collected from patients under anesthetic surgery, filtered using the multivariate empirical mode decomposition (MEMD) method and analyzed using sample entropy (SampEn) analysis. The calculated signals from SampEn are utilized to train an artificial neural network (ANN) model using the expert assessment of consciousness level (EACL), which is assessed by experienced anesthesiologists, as the target to train, validate, and test the ANN. The results achieved using the proposed system are compared to the BIS index. The results show that the proposed system not only has similar characteristics to the BIS index but is also closer to the assessments of experienced anesthesiologists, which illustrates that it reflects the consciousness level and the DOA successfully.
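Sample entropy, the complexity measure used in this work, can be sketched as follows. This is a generic implementation of the standard SampEn definition, not the authors' code, and the defaults m = 2, r = 0.2·std are conventional choices:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn = -ln(A/B): B counts template pairs of length m within
    Chebyshev tolerance r (self-matches excluded), A the same for m+1."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def matches(mm):
        n = len(x) - mm + 1
        t = np.array([x[i:i + mm] for i in range(n)])
        c = 0
        for i in range(n - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            c += int(np.sum(d <= r))
        return c
    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

# A smooth sine is far more predictable than white noise
rng = np.random.default_rng(0)
print(sample_entropy(np.sin(np.linspace(0, 8 * np.pi, 300))),
      sample_entropy(rng.standard_normal(300)))
```

Lower values indicate a more regular, predictable signal, which is the property the study exploits to track the depth of anesthesia.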
Directory of Open Access Journals (Sweden)
Jiaxin Lu
2017-10-01
Implementation of a hybrid energy system (HES) is generally considered a promising way to satisfy the electrification requirements of remote areas. In the present study, a novel decision-making methodology is proposed to identify the best compromise configuration of an HES from a set of feasible combinations obtained from HOMER. For this purpose, a multi-objective function, which comprises four crucial and representative indices, is formulated by applying the weighted sum method. The entropy weight method is employed as a quantitative methodology for weighting factor calculation to enhance the objectivity of decision-making. Moreover, the optimal design of a stand-alone PV/wind/battery/diesel HES in Yongxing Island, China, is conducted as a case study to validate the effectiveness of the proposed method. Both the simulation and optimization results indicate that the optimization method is able to identify the best trade-off configuration among system reliability, economy, practicability and environmental sustainability. Several useful conclusions are given by analyzing the operation of the best configuration.
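The entropy weight method referred to above gives larger weights to indices whose values differ more across the candidate configurations. A minimal sketch with hypothetical numbers (ours, not the paper's model):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method for an (alternatives x criteria) matrix of
    positive benefit-type scores: criteria with more dispersion across
    alternatives get larger weights."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                    # column-wise proportions
    n = X.shape[0]
    logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    E = -(P * logP).sum(axis=0) / np.log(n)  # normalized entropy per criterion
    d = 1.0 - E                              # degree of diversification
    return d / d.sum()

# Hypothetical scores: criterion 1 identical everywhere, criterion 2 varies
X = np.array([[1.0, 10.0], [1.0, 20.0], [1.0, 30.0]])
print(entropy_weights(X))  # nearly all weight goes to the varying criterion
```

A criterion that is constant across all alternatives has maximal entropy and thus carries no weight, which is what makes the weighting "objective" relative to expert-assigned weights.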
Directory of Open Access Journals (Sweden)
CHENG Xiaoqiang
2017-11-01
Thumbnails can greatly increase the efficiency of browsing pictures, videos and other image resources and markedly improve the user experience. A map service is a kind of graphic resource coupling spatial information and representation scale; its crafting, retrieval and management cannot function well without the support of thumbnails. Well-designed thumbnails give users a vivid first impression and help them explore efficiently; by contrast, coarse thumbnails cause negative reactions and discourage users from exploring the map service. Inspired by video summarization, the key position and key scale of a web map service are proposed, together with corresponding quantitative measures and an automatic algorithm that implements them. With this algorithm, the poor visual quality, lack of map information and low automation of current thumbnails are addressed. Information entropy is used to locate areas richer in content, and trans-scale similarity is calculated to judge at which scale the appearance of the map service changes drastically; finally, a series of static pictures is extracted to represent the content of the map service. Experimental results show that this method produces medium-sized, content-rich and representative thumbnails which effectively reflect the content and appearance of a map service.
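The information entropy used to find content-rich key positions is, in essence, the Shannon entropy of a region's pixel-value histogram. A small illustration of ours, not the paper's code:

```python
import math
from collections import Counter

def histogram_entropy(pixels):
    """Shannon entropy (bits) of the histogram of discrete pixel values;
    higher entropy indicates a region richer in visual content."""
    counts = Counter(pixels)
    total = len(pixels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

flat_region = [128] * 64       # uniform background: no information
busy_region = list(range(64))  # 64 distinct values: 6 bits of entropy
print(histogram_entropy(flat_region), histogram_entropy(busy_region))
```

Ranking candidate tile regions by this score is one simple way to pick the region a thumbnail should show.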
Directory of Open Access Journals (Sweden)
Luping Chen
2018-04-01
The degradation of lithium-ion batteries often leads to electrical system failure. Battery remaining useful life (RUL) prediction can effectively prevent this failure. Battery capacity is usually utilized as the health indicator (HI) for RUL prediction. However, battery capacity is often estimated on-line and is difficult to obtain by monitoring on-line parameters. Therefore, there is a great need for a simple and on-line prediction method to solve this issue. In this paper, as a novel HI, permutation entropy (PE) is extracted from the discharge voltage curve for analyzing battery degradation. Then the similarity between PE and battery capacity is judged by Pearson and Spearman correlation analyses. Experimental results illustrate the effectiveness and excellent similarity performance of the novel HI for battery fading indication. Furthermore, we propose a hybrid approach combining the variational mode decomposition (VMD) denoising technique, autoregressive integrated moving average (ARIMA), and GM(1,1) models for RUL prediction. Experimental results illustrate the accuracy of the proposed approach for on-line RUL prediction of lithium-ion batteries.
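Permutation entropy, the proposed health indicator, measures the diversity of ordinal patterns in a series. A generic sketch of the standard algorithm (not the authors' implementation):

```python
import math

def permutation_entropy(x, m=3, delay=1, normalize=True):
    """Permutation entropy of a 1-D sequence: the Shannon entropy of the
    distribution of ordinal patterns of length m."""
    n = len(x) - (m - 1) * delay
    counts = {}
    for i in range(n):
        window = tuple(x[i + j * delay] for j in range(m))
        pattern = tuple(sorted(range(m), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    pe = -sum((c / n) * math.log(c / n) for c in counts.values())
    return pe / math.log(math.factorial(m)) if normalize else pe

# A monotone ramp has a single ordinal pattern, hence zero entropy
print(permutation_entropy(list(range(100))))
```

Because it needs only the ordering of samples within short windows, it can be computed directly from the monitored discharge voltage curve, which is what makes it attractive as an on-line indicator.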
Quantum Entropy and Complexity
Benatti, F.; Oskouei, S. Khabbazi; Abad, A. Shafiei Deh
We study the relations between the recently proposed machine-independent quantum complexity of P. Gacs [1] and the entropy of classical and quantum systems. On one hand, by restricting Gacs complexity to ergodic classical dynamical systems, we retrieve the equality between the Kolmogorov complexity rate and the Shannon entropy rate derived by A. A. Brudno [2]. On the other hand, using the quantum Shannon-McMillan theorem [3], we show that such an equality holds densely in the case of ergodic quantum spin chains.
DEFF Research Database (Denmark)
Yuri, Shtarkov; Justesen, Jørn
1997-01-01
The concept of entropy for an image on a discrete two dimensional grid is introduced. This concept is used as an information theoretic bound on the coding rate for the image. It is proved that this quantity exists as a limit for arbitrary sets satisfying certain conditions.
International Nuclear Information System (INIS)
Ponman, T.J.
1984-01-01
For some years now two different expressions have been in use for maximum entropy image restoration and there has been some controversy over which one is appropriate for a given problem. Here two further entropies are presented and it is argued that there is no single correct algorithm. The properties of the four different methods are compared using simple 1D simulations with a view to showing how they can be used together to gain as much information as possible about the original object. (orig.)
Maximum entropy tokamak configurations
International Nuclear Information System (INIS)
Minardi, E.
1989-01-01
The new entropy concept for the collective magnetic equilibria is applied to the description of the states of a tokamak subject to ohmic and auxiliary heating. The condition for the existence of steady state plasma states with vanishing entropy production implies, on one hand, the resilience of specific current density profiles and, on the other, severe restrictions on the scaling of the confinement time with power and current. These restrictions are consistent with Goldston scaling and with the existence of a heat pinch. (author)
Exact Probability Distribution versus Entropy
Directory of Open Access Journals (Sweden)
Kerstin Andersson
2014-10-01
The problem addressed concerns the determination of the average number of successive attempts at guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
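For tiny alphabets, the average number of guesses under the decreasing-probability strategy can be computed exactly by brute force; the approximations discussed in the paper exist precisely because this enumeration explodes at realistic sizes. A toy sketch of ours:

```python
from itertools import product

def expected_guesses(letter_probs, length):
    """Average number of guesses when words (independent letters, i.e. a
    first-order model) are guessed in decreasing order of probability."""
    word_probs = []
    for combo in product(letter_probs, repeat=length):
        p = 1.0
        for ch in combo:
            p *= letter_probs[ch]
        word_probs.append(p)
    word_probs.sort(reverse=True)
    return sum((i + 1) * p for i, p in enumerate(word_probs))

# Two equiprobable letters, words of length 2: (1+2+3+4)/4 = 2.5 guesses
print(expected_guesses({"a": 0.5, "b": 0.5}, 2))
```

Skewed letter probabilities lower the expected guess count, which is the gap between this guessing measure and plain entropy that the paper examines.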
Directory of Open Access Journals (Sweden)
Antonio G. Ravelo-García
2015-02-01
In this paper the permutation entropy (PE) obtained from heart rate variability (HRV) is analyzed in a statistical model. In this model we also integrate other feature extraction techniques: the cepstrum coefficients derived from the same HRV and a set of band powers obtained from the electrocardiogram-derived respiratory (EDR) signal. The aim of the model is detecting obstructive sleep apnea (OSA) events. For this purpose, we apply two statistical classification methods: Logistic Regression (LR) and Quadratic Discriminant Analysis (QDA). For testing the models we use seventy ECG recordings from the Physionet database, which are divided into equal-size learning and testing sets. Both sets consist of 35 recordings, each containing a single ECG signal. In our experiments we have found that the features extracted from the EDR signal present a sensitivity of 65.6% and specificity of 87.7% (auc = 85) in the LR classifier, and sensitivity of 59.4% and specificity of 90.3% (auc = 83.9) in the QDA classifier. The HRV-based cepstrum coefficients present a sensitivity of 63.8% and specificity of 89.2% (auc = 86) in the LR classifier, and sensitivity of 67.2% and specificity of 86.8% (auc = 86.9) in the QDA. Subsequent tests show that the contribution of the permutation entropy increases the performance of the classifiers, implying that the complexity of RR interval time series plays an important role in breathing pause detection. Particularly, when all features are jointly used, the quantification task reaches a sensitivity of 71.9% and specificity of 92.1% (auc = 90.3) for LR. Similarly, for QDA the sensitivity is 75.1% and the specificity is 90.5% (auc = 91.7).
Differential effects of gender on entropy perception
Satcharoen, Kleddao
2017-12-01
The purpose of this research is to examine differences in the perception of entropy (color intensity) between male and female computer users. The objectives include identifying gender-based differences in entropy perception and exploring the potential effects of these differences (if any) on user interface design. The research is an effort to contribute to an emerging field of interest in gender as it relates to science, engineering and technology (SET), particularly user interface design. Currently, there is limited evidence on the role of gender in user interface design and in the use of technology generally, with most efforts at gender-differentiated or customized design based on stereotypes and assumptions about female use of technology, or on the assumption of a default position based on male preferences. Image entropy was selected as a characteristic where gender could potentially be a factor in perception because of known differences in color perception acuity between male and female individuals, even in the absence of any color perception abnormality (which is more common in males). Although the literature review suggested that training could offset differences in color perception and identification, tests on untrained subject groups routinely show that females are better able to identify, match, and differentiate colors, and that color carries a stronger emotional and psychosocial association for females. Since image entropy is associated with information content and image salience, the ability to identify areas of high entropy could make a difference in user perception and technological capabilities.
Entropy generation impact on peristaltic motion in a rotating frame
Directory of Open Access Journals (Sweden)
H. Zahir
Full Text Available The outcome of entropy generation in peristalsis of a Casson fluid in a rotating frame is investigated. The formulation accounts for thermal radiation, viscous dissipation, and slip conditions for velocity and temperature. The lubrication approach is followed. The velocity components, temperature and trapping are examined. Specifically, the effects of the Taylor number, fluid parameter, slip parameters, Brinkman number, radiation and compliant wall properties are examined. In addition, the entropy generation and Bejan numbers are analyzed. It is observed that entropy is controlled through slip effects. Keywords: Casson fluid, Radiative heat flux, Entropy generation, Rotating frame, Slip conditions, Wall properties
Calculation of Configurational Entropy in Complex Landscapes
Directory of Open Access Journals (Sweden)
Samuel A Cushman
2018-04-01
The configurational entropy of a landscape is highly related to the dimensionality of the landscape, the number of cover classes, the evenness of landscape composition across classes, and landscape heterogeneity. These advances provide a means for researchers to directly estimate the frequency distribution of all possible macrostates of any observed landscape, then directly calculate the relative configurational entropy of the observed macrostate, and understand the ecological meaning of different amounts of configurational entropy. These advances enable scientists to take configurational entropy from a concept to an applied tool to measure and compare the disorder of real landscapes with an objective and unbiased measure based on entropy and the second law.
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 6; Issue 9. Entropy à la Boltzmann. Jayanta K Bhattacharjee. General Article Volume 6 Issue 9 September 2001 pp 19-34. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/006/09/0019-0034. Author Affiliations.
Rescaling Temperature and Entropy
Olmsted, John, III
2010-01-01
Temperature and entropy traditionally are expressed in units of kelvin and joule/kelvin. These units obscure some important aspects of the natures of these thermodynamic quantities. Defining a rescaled temperature using the Boltzmann constant, T' = k[subscript B]T, expresses temperature in energy units, thereby emphasizing the close relationship…
DEFF Research Database (Denmark)
Hansen, Britt Rosendahl; Kuhn, Luise Theil; Bahl, Christian Robert Haffenden
2010-01-01
the effect: the isothermal magnetic entropy change and the adiabatic temperature change. Some of the manifestations and utilizations of the MCE will be touched upon in a general way, and finally I will talk about the results I have obtained on a sample of gadolinium iron garnet (GdIG, Gd3Fe5O12), which
Indian Academy of Sciences (India)
Consider the integral ∫ dQ/T taken over a reversible transformation. We shall call this function the entropy of state A.” — 'Thermodynamics' by Enrico Fermi. “Let Γ be the volume of the region of motion of the states, and ... This is the basic assumption of ...
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 6; Issue 9. Entropy in Biology. Jayant B Udgaonkar. General Article Volume 6 Issue 9 September 2001 pp 61-66. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/006/09/0061-0066. Author Affiliations.
The numerical viscosity of entropy stable schemes for systems of conservation laws. I
Tadmor, Eitan
1987-01-01
Discrete approximations to hyperbolic systems of conservation laws are studied. The amount of numerical viscosity present in such schemes is quantified and related to their entropy stability by means of comparison. To this end, conservative schemes which are also entropy-conservative are constructed. These entropy-conservative schemes enjoy second-order accuracy; moreover, they can be interpreted as piecewise-linear finite-element methods, and hence can be formulated on various mesh configurations. It is then shown that conservative schemes are entropy stable if, and (for three-point schemes) only if, they contain more viscosity than that present in the above-mentioned entropy-conservative ones.
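For the scalar Burgers equation the entropy-conservative two-point flux can be written down explicitly and checked directly. The sketch below verifies the defining entropy-conservation condition numerically; it illustrates the construction in the simplest scalar case, not the paper's general systems framework:

```python
import random

# For Burgers' equation u_t + (u^2/2)_x = 0 with entropy U(u) = u^2/2,
# the entropy variable is v = u and the entropy potential is
# psi(u) = u^3/6. Tadmor's two-point entropy-conservative flux is
# f*(ul, ur) = (ul^2 + ul*ur + ur^2) / 6.

def ec_flux(ul, ur):
    return (ul * ul + ul * ur + ur * ur) / 6.0

def psi(u):
    return u ** 3 / 6.0

# Entropy conservation requires (v_r - v_l) * f* = psi(u_r) - psi(u_l);
# here this reduces to the difference-of-cubes identity.
random.seed(1)
for _ in range(5):
    ul, ur = random.uniform(-2, 2), random.uniform(-2, 2)
    assert abs((ur - ul) * ec_flux(ul, ur) - (psi(ur) - psi(ul))) < 1e-12
print("entropy-conservation condition holds")
```

An entropy-stable scheme is then obtained by adding numerical viscosity on top of this flux, which is precisely the comparison the paper quantifies.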
The numerical viscosity of entropy stable schemes for systems of conservation laws
Tadmor, E.
1985-01-01
Discrete approximations to hyperbolic systems of conservation laws are studied. The amount of numerical viscosity present in such schemes is quantified and related to their entropy stability by means of comparison. To this end, conservative schemes which are also entropy-conservative are constructed. These entropy-conservative schemes enjoy second-order accuracy; moreover, they admit a particular interpretation within the finite-element framework, and hence can be formulated on various mesh configurations. It is then shown that conservative schemes are entropy stable if and only if they contain more viscosity than the above-mentioned entropy-conservative ones.
Numerical viscosity of entropy stable schemes for systems of conservation laws. Final Report
International Nuclear Information System (INIS)
Tadmor, E.
1985-11-01
Discrete approximations to hyperbolic systems of conservation laws are studied. The amount of numerical viscosity present in such schemes is quantified and related to their entropy stability by means of comparison. To this end, conservative schemes which are also entropy-conservative are constructed. These entropy-conservative schemes enjoy second-order accuracy; moreover, they admit a particular interpretation within the finite-element framework, and hence can be formulated on various mesh configurations. It is then shown that conservative schemes are entropy stable if and only if they contain more viscosity than the above-mentioned entropy-conservative ones.
Entropy Measurement for Biometric Verification Systems.
Lim, Meng-Hui; Yuen, Pong C
2016-05-01
Biometric verification systems are designed to accept multiple similar biometric measurements per user due to inherent intra-user variations in the biometric data. This is important to preserve a reasonable acceptance rate for genuine queries and the overall feasibility of the recognition system. However, such acceptance of multiple similar measurements decreases an impostor's difficulty of obtaining a system-acceptable measurement, thus resulting in a degraded security level. This deteriorated security needs to be measurable to provide truthful security assurance to the users. Entropy is a standard measure of security. However, the entropy formula is applicable only when there is a single acceptable possibility. In this paper, we develop an entropy-measuring model for biometric systems that accept multiple similar measurements per user. Based on the idea of guessing entropy, the proposed model quantifies biometric system security in terms of adversarial guessing effort for two practical attacks. Excellent agreement between analytic and experimental simulation-based measurement results on a synthetic and a benchmark face dataset justifies the correctness of our model and thus the feasibility of the proposed entropy-measuring approach.
Shao, Renping; Li, Jing; Hu, Wentao; Dong, Feifei
2013-02-01
Higher-order cumulants (HOC) are a modern theory and technique of signal analysis. Spectrum entropy clustering (SEC) is a statistical data-mining method for extracting useful characteristics from a mass of nonlinear and non-stationary data. Following a discussion of the characteristics of HOC theory and the SEC method, this paper introduces signal processing techniques and the unique merits of nonlinear coupling characteristic analysis for processing random and non-stationary signals. A new clustering analysis and diagnosis method is then proposed for detecting multiple damage states of a gear by introducing the combination of HOC and SEC into the damage detection and diagnosis of the gear system. Noise is restrained by HOC, coupling features are extracted, and the characteristic signal is separated at different speeds and frequency bands. Under such circumstances, the weak signal characteristics in the system are emphasized and the characteristics of multiple faults are extracted. The SEC data-mining method is adopted to conduct analysis and diagnosis at various running states (speeds of 300 r/min, 900 r/min, 1200 r/min, and 1500 r/min) for the following six signals: no fault, short crack in the tooth root, long crack in the tooth root, short crack at the pitch circle, long crack at the pitch circle, and tooth wear. The research shows that this combined detection and diagnosis method can also identify the degree of damage for some faults. On this basis, a virtual instrument for damage detection and fault diagnosis of the gear system is developed by combining the advantages of MATLAB and VC++, employing Component Object Model technology, adopting mixed programming methods, and calling the program transformed from an *.m file under VC++. This software system possesses functions of collecting and importing gear vibration signals, analyzing and processing signals, extracting features, visualizing graphics, detecting and
Entropy resistance analyses of a two-stream parallel flow heat exchanger with viscous heating
International Nuclear Information System (INIS)
Cheng Xue-Tao; Liang Xin-Gang
2013-01-01
Heat exchangers are widely used in industry, and the analysis and optimization of heat exchanger performance are important topics. In this paper, we define the concept of entropy resistance based on the entropy generation analysis of a one-dimensional heat transfer process. With this concept, a two-stream parallel-flow heat exchanger with viscous heating is analyzed and discussed. It is found that minimizing the entropy resistance always leads to the maximum heat transfer rate for the discussed two-stream parallel-flow heat exchanger, while minimizing the entropy generation rate, the entropy generation numbers, and the revised entropy generation number does not always do so. (general)
Self-adjusting entropy-stable scheme for compressible Euler equations
International Nuclear Information System (INIS)
Cheng Xiao-Han; Nie Yu-Feng; Cai Li; Feng Jian-Hu; Luo Xiao-Yu
2015-01-01
In this work, a self-adjusting entropy-stable scheme is proposed for solving compressible Euler equations. The entropy-stable scheme is constructed by combining the entropy conservative flux with a suitable diffusion operator. The entropy has to be preserved in smooth solutions and be dissipated at shocks. To achieve this, a switch function, which is based on entropy variables, is employed to make the numerical diffusion term be automatically added around discontinuities. The resulting scheme is still entropy-stable. A number of numerical experiments illustrating the robustness and accuracy of the scheme are presented. From these numerical results, we observe a remarkable gain in accuracy. (paper)
Permutation entropy of fractional Brownian motion and fractional Gaussian noise
International Nuclear Information System (INIS)
Zunino, L.; Perez, D.G.; Martin, M.T.; Garavaglia, M.; Plastino, A.; Rosso, O.A.
2008-01-01
We have worked out theoretical curves for the permutation entropy of the fractional Brownian motion and fractional Gaussian noise by using the Bandt and Shiha [C. Bandt, F. Shiha, J. Time Ser. Anal. 28 (2007) 646] theoretical predictions for their corresponding relative frequencies. Comparisons with numerical simulations show an excellent agreement. Furthermore, the entropy-gap in the transition between these processes, observed previously via numerical results, has been here theoretically validated. Also, we have analyzed the behaviour of the permutation entropy of the fractional Gaussian noise for different time delays
Liu, Dong-jun; Li, Li
2015-01-01
PM2.5 is the main factor in haze-fog pollution in China. In this study, the trend of PM2.5 concentration was first analyzed qualitatively based on mathematical models and simulation. A comprehensive forecasting model (CFM) was then developed based on combination forecasting ideas. The Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Networks (ANNs) and the Exponential Smoothing Method (ESM) were used to predict the PM2.5 concentration time series. The results of the comprehensive forecasting model were obtained by combining the results of the three methods with weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou, China was quantitatively forecasted with the comprehensive forecasting model. The results were compared with those of the three single models, and PM2.5 concentration values for the next ten days were predicted. The comprehensive forecasting model balanced the deviations of the single prediction methods and had better applicability. It provides a new prediction approach for the air quality forecasting field. PMID:26110332
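The entropy-weighting step can be sketched as follows. The error values, the inverse-error scoring, and the model names are illustrative assumptions for the sketch, not data or the exact weighting formula from the study:

```python
import math

# Assumed absolute forecast errors of three models over five past days
# (illustrative numbers only).
errors = {
    "ARIMA": [4.0, 3.5, 5.0, 4.2, 3.8],
    "ANN":   [3.0, 2.8, 3.5, 4.0, 3.2],
    "ESM":   [5.5, 5.0, 4.8, 5.2, 6.0],
}

def entropy_weights(errors):
    """Entropy weight method sketch: turn errors into accuracy scores,
    normalize each model's scores to a distribution, compute its
    normalized Shannon entropy, and weight by the divergence 1 - h."""
    degrees = {}
    for name, errs in errors.items():
        scores = [1.0 / e for e in errs]        # smaller error -> larger score
        total = sum(scores)
        p = [s / total for s in scores]
        h = -sum(x * math.log(x) for x in p) / math.log(len(p))
        degrees[name] = 1.0 - h                 # degree of divergence
    norm = sum(degrees.values())
    return {name: d / norm for name, d in degrees.items()}

w = entropy_weights(errors)
forecasts = {"ARIMA": 61.0, "ANN": 58.5, "ESM": 64.0}  # assumed next-day forecasts
combined = sum(w[m] * forecasts[m] for m in forecasts)
print(w, combined)
```

The combined forecast is a convex combination of the single-model forecasts, which is what lets the CFM balance the deviations of the individual methods.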
Entropy, neutro-entropy and anti-entropy for neutrosophic information
Patrascu, Vasile
2017-01-01
This approach presents a multi-valued representation of neutrosophic information. It highlights the link between bifuzzy information and neutrosophic information. The constructed deca-valued structure shows the complexity of neutrosophic information. This deca-valued structure led to the construction of two new concepts for neutrosophic information: neutro-entropy and anti-entropy. These two concepts are added to the two existing ones, entropy and non-entropy. Thus, we obtain the following triad: e...
Entropy, neutro-entropy and anti-entropy for neutrosophic information
Vasile Patrascu
2017-01-01
This article shows a deca-valued representation of neutrosophic information in which are defined the following features: truth, falsity, weak truth, weak falsity, ignorance, contradiction, saturation, neutrality, ambiguity and hesitation. Using these features, there are constructed computing formulas for entropy, neutro-entropy and anti-entropy.
Entropy Measures vs. Kolmogorov Complexity
Directory of Open Access Journals (Sweden)
Luís Antunes
2011-03-01
Full Text Available Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it only holds for α = 1. Regarding a time-bounded analogue of this relationship, we show that for some distributions a similar result holds. We prove that, for the universal time-bounded distribution m^t(x), the Tsallis and Rényi entropies converge if and only if α is greater than 1. We also establish the uniform continuity of these entropies.
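The special role of α = 1 can be seen numerically: both generalized entropies reduce to the Shannon entropy in the limit α → 1. A small sketch (the distribution is an arbitrary illustrative choice):

```python
import math

def shannon(p):
    # Shannon entropy in nats.
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, alpha):
    # Rényi entropy of order alpha (alpha != 1).
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

def tsallis(p, alpha):
    # Tsallis entropy of order alpha (alpha != 1).
    return (1 - sum(x ** alpha for x in p)) / (alpha - 1)

p = [0.5, 0.25, 0.125, 0.125]
for alpha in (0.999, 1.001):
    # Both generalized entropies approach the Shannon entropy as alpha -> 1.
    assert abs(renyi(p, alpha) - shannon(p)) < 1e-3
    assert abs(tsallis(p, alpha) - shannon(p)) < 1e-3
print(shannon(p))  # 1.75 * ln 2 nats for this dyadic distribution
```

Away from α = 1 the two families diverge from each other, which is consistent with the paper's finding that the complexity-entropy correspondence is specific to the Shannon case.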
Maximizing entropy over Markov processes
DEFF Research Database (Denmark)
Biondi, Fabrizio; Legay, Axel; Nielsen, Bo Friis
2014-01-01
computation reduces to finding a model of a specification with highest entropy. Entropy maximization for probabilistic process specifications has not been studied before, even though it is well known in Bayesian inference for discrete distributions. We give a characterization of global entropy of a process...... as a reward function, a polynomial algorithm to verify the existence of a system maximizing entropy among those respecting a specification, a procedure for the maximization of reward functions over Interval Markov Chains and its application to synthesize an implementation maximizing entropy. We show how...
Maximizing Entropy over Markov Processes
DEFF Research Database (Denmark)
Biondi, Fabrizio; Legay, Axel; Nielsen, Bo Friis
2013-01-01
computation reduces to finding a model of a specification with highest entropy. Entropy maximization for probabilistic process specifications has not been studied before, even though it is well known in Bayesian inference for discrete distributions. We give a characterization of global entropy of a process...... as a reward function, a polynomial algorithm to verify the existence of a system maximizing entropy among those respecting a specification, a procedure for the maximization of reward functions over Interval Markov Chains and its application to synthesize an implementation maximizing entropy. We show how...
An entropy-assisted musculoskeletal shoulder model.
Xu, Xu; Lin, Jia-Hua; McGorry, Raymond W
2017-04-01
Optimization combined with a musculoskeletal shoulder model has been used to estimate mechanical loading of musculoskeletal elements around the shoulder. Traditionally, the objective function is to minimize the summation of the total activities of the muscles with forces, moments, and stability constraints. Such an objective function, however, tends to neglect the antagonist muscle co-contraction. In this study, an objective function including an entropy term is proposed to address muscle co-contractions. A musculoskeletal shoulder model is developed to apply the proposed objective function. To find the optimal weight for the entropy term, an experiment was conducted. In the experiment, participants generated various 3-D shoulder moments in six shoulder postures. The surface EMG of 8 shoulder muscles was measured and compared with the predicted muscle activities based on the proposed objective function using Bhattacharyya distance and concordance ratio under different weight of the entropy term. The results show that a small weight of the entropy term can improve the predictability of the model in terms of muscle activities. Such a result suggests that the concept of entropy could be helpful for further understanding the mechanism of muscle co-contractions as well as developing a shoulder biomechanical model with greater validity. Copyright © 2017 Elsevier Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Fan Zhang
2013-09-01
Full Text Available Microstructure and phase composition of a CrMo0.5NbTa0.5TiZr high entropy alloy were studied in the as-solidified and heat treated conditions. In the as-solidified condition, the alloy consisted of two disordered BCC phases and an ordered cubic Laves phase. The BCC1 phase solidified in the form of dendrites enriched with Mo, Ta and Nb, and its volume fraction was 42%. The BCC2 and Laves phases solidified by the eutectic-type reaction, and their volume fractions were 27% and 31%, respectively. The BCC2 phase was enriched with Ti and Zr and the Laves phase was heavily enriched with Cr. After hot isostatic pressing at 1450 °C for 3 h, the BCC1 dendrites coagulated into round-shaped particles and their volume fraction increased to 67%. The volume fractions of the BCC2 and Laves phases decreased to 16% and 17%, respectively. After subsequent annealing at 1000 °C for 100 h, submicron-sized Laves particles precipitated inside the BCC1 phase, and the alloy consisted of 52% BCC1, 16% BCC2 and 32% Laves phases. Solidification and phase equilibrium simulations were conducted for the CrMo0.5NbTa0.5TiZr alloy using a thermodynamic database developed by CompuTherm LLC. Some discrepancies were found between the calculated and experimental results and the reasons for these discrepancies were discussed.
On Using Entropy for Enhancing Handwriting Preprocessing
Directory of Open Access Journals (Sweden)
Bernhard Peischl
2012-11-01
Full Text Available Handwriting is an important modality for human-computer interaction. For medical professionals, handwriting is (still) the preferred natural method of documentation. Handwriting recognition has long been a primary research area in computer science. With the tremendous ubiquity of smartphones, along with the renaissance of the stylus, handwriting recognition has received a new impetus. However, recognition rates are still not 100% perfect, and researchers are constantly improving handwriting algorithms. In this paper we evaluate the performance of entropy-based slant- and skew-correction, and compare the results to other methods. We selected 3700 words of 23 writers out of the Unipen-ICROW-03 benchmark set, which we annotated with their associated error angles by hand. Our results show that the entropy-based slant correction method outperforms a window-based approach, with an average precision of 6.02 for the entropy-based method compared with 7.85 for the alternative. On the other hand, the entropy-based skew correction performs worse, with an average precision of 2.86 compared with 2.13 for the alternative LSM-based approach.
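The idea behind entropy-based slant correction can be sketched on synthetic ink pixels: shear the image by candidate angles and keep the shear that minimizes the entropy of the vertical projection histogram, since a well-deslanted word concentrates ink into fewer columns. The stroke data and search grid below are illustrative assumptions, not the paper's setup:

```python
import math

def projection_entropy(points, shear):
    """Entropy of the vertical-projection histogram after shearing the
    ink pixels by `shear` (x' = x - shear * y)."""
    columns = {}
    for x, y in points:
        col = round(x - shear * y)
        columns[col] = columns.get(col, 0) + 1
    total = sum(columns.values())
    return -sum((c / total) * math.log(c / total) for c in columns.values())

# Synthetic "slanted stroke": a vertical bar drawn with slant 0.5.
true_slant = 0.5
stroke = [(10 + true_slant * y, y) for y in range(20)]

# Search candidate shears; the minimum-entropy shear recovers the slant.
best = min((projection_entropy(stroke, s), s)
           for s in [i / 10 for i in range(-10, 11)])
print(best[1])  # recovered shear
```

In a full system the same minimization runs over projection histograms of real binarized word images; the principle is unchanged.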
Spatial-dependence recurrence sample entropy
Pham, Tuan D.; Yan, Hong
2018-03-01
Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines, where the analysis of physiological signals is perhaps the most widely reported in the literature. Sample entropy is a popular measure for quantifying signal irregularity. However, sample entropy does not take sequential information, which is inherently useful, into its calculation of sample similarity. Here, we develop a method that is based on the mathematical principle of sample entropy and enables the capture of the sequential information of a time series in the context of spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
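The baseline measure being extended above can be sketched with a minimal sample entropy implementation. Template length m = 2 and tolerance r = 0.2 are common defaults used here for illustration; the proposed spatial-dependence variant additionally incorporates recurrence-plot co-occurrence information, which this sketch omits:

```python
import math
import random

def sample_entropy(series, m=2, r=0.2):
    """Sample entropy sketch: -log(A/B), where B counts template pairs of
    length m within Chebyshev tolerance r, and A the matching pairs of
    length m + 1. Self-matches are excluded."""
    def count(mm):
        n = len(series) - mm + 1
        templates = [series[i:i + mm] for i in range(n)]
        c = 0
        for i in range(n):
            for j in range(i + 1, n):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

random.seed(2)
regular = [math.sin(0.1 * i) for i in range(200)]      # predictable
noisy = [random.random() for _ in range(200)]          # irregular
s_regular = sample_entropy(regular)
s_noisy = sample_entropy(noisy)
print(s_regular, s_noisy)
```

The regular signal scores lower than noise, which is the sense in which sample entropy quantifies irregularity; note that nothing in the pairwise comparison uses the order in which templates occur, the gap the recurrence-based method addresses.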
Zipf's law, power laws and maximum entropy
International Nuclear Information System (INIS)
Visser, Matt
2013-01-01
Zipf's law, and power laws in general, have attracted and continue to attract considerable attention in a wide variety of disciplines—from astronomy to demographics to software structure to economics to linguistics to zoology, and even warfare. A recent model of random group formation (RGF) attempts a general explanation of such phenomena based on Jaynes' notion of maximum entropy applied to a particular choice of cost function. In the present paper I argue that the specific cost function used in the RGF model is in fact unnecessarily complicated, and that power laws can be obtained in a much simpler way by applying maximum entropy ideas directly to the Shannon entropy subject only to a single constraint: that the average of the logarithm of the observable quantity is specified. (paper)
Zipf's law, power laws and maximum entropy
Visser, Matt
2013-04-01
Zipf's law, and power laws in general, have attracted and continue to attract considerable attention in a wide variety of disciplines—from astronomy to demographics to software structure to economics to linguistics to zoology, and even warfare. A recent model of random group formation (RGF) attempts a general explanation of such phenomena based on Jaynes' notion of maximum entropy applied to a particular choice of cost function. In the present paper I argue that the specific cost function used in the RGF model is in fact unnecessarily complicated, and that power laws can be obtained in a much simpler way by applying maximum entropy ideas directly to the Shannon entropy subject only to a single constraint: that the average of the logarithm of the observable quantity is specified.
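The claim that a power law maximizes Shannon entropy subject only to a fixed mean logarithm can be checked numerically: perturb a power-law distribution while preserving both constraints, and the entropy must drop. A small sketch (N, α, and the perturbed indices are arbitrary illustrative choices):

```python
import math

def entropy(p):
    return -sum(x * math.log(x) for x in p if x > 0)

# Power law p_k proportional to k^(-alpha) over k = 1..N: the
# maximum-entropy distribution subject to a fixed E[log k].
N, alpha = 50, 1.5
w = [k ** -alpha for k in range(1, N + 1)]
Z = sum(w)
p = [x / Z for x in w]
mean_log = sum(pk * math.log(k) for k, pk in enumerate(p, 1))

# Perturb three components so that normalization and E[log k] are both
# preserved; any such feasible perturbation must lower the entropy.
i, j, m = 1, 9, 29                       # zero-based indices of k = 2, 10, 30
li, lj, lm = math.log(2), math.log(10), math.log(30)
t = 1e-3
ei = t
ej = t * (lm - li) / (lj - lm)
em = -ei - ej
q = list(p)
q[i] += ei; q[j] += ej; q[m] += em
assert abs(sum(q) - 1) < 1e-12
assert abs(sum(qk * math.log(k) for k, qk in enumerate(q, 1)) - mean_log) < 1e-12
print(entropy(p) > entropy(q))  # True: the power law maximizes entropy
```

The perturbation coefficients are chosen to lie in the null space of both constraints, so the only thing that changes is the entropy, which strictly decreases away from the power-law maximizer.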
Holographic Entanglement Entropy
Rangamani, Mukund
2016-01-01
We review the developments in the past decade on holographic entanglement entropy, a subject that has garnered much attention owing to its potential to teach us about the emergence of spacetime in holography. We provide an introduction to the concept of entanglement entropy in quantum field theories, review the holographic proposals for computing the same, providing some justification for where these proposals arise from in the first two parts. The final part addresses recent developments linking entanglement and geometry. We provide an overview of the various arguments and technical developments that teach us how to use field theory entanglement to detect geometry. Our discussion is by design eclectic; we have chosen to focus on developments that appear to us most promising for further insights into the holographic map. This is a preliminary draft of a few chapters of a book which will appear sometime in the near future, to be published by Springer. The book in addition contains a discussion of application o...
Directory of Open Access Journals (Sweden)
Shuihua Wang
2015-08-01
Full Text Available Fruit classification is quite difficult because of the various categories and the similar shapes and features of fruit. In this work, we propose two novel machine-learning based classification methods. The developed system consists of wavelet entropy (WE), principal component analysis (PCA), and a feedforward neural network (FNN) trained by a fitness-scaled chaotic artificial bee colony (FSCABC) and by biogeography-based optimization (BBO), respectively. K-fold stratified cross validation (SCV) was utilized for statistical analysis. The classification performance for 1653 fruit images from 18 categories showed that the proposed “WE + PCA + FSCABC-FNN” and “WE + PCA + BBO-FNN” methods achieve the same accuracy of 89.5%, higher than state-of-the-art approaches: “(CH + MP + US) + PCA + GA-FNN” at 84.8%, “(CH + MP + US) + PCA + PSO-FNN” at 87.9%, “(CH + MP + US) + PCA + ABC-FNN” at 85.4%, “(CH + MP + US) + PCA + kSVM” at 88.2%, and “(CH + MP + US) + PCA + FSCABC-FNN” at 89.1%. Moreover, our methods used only 12 features, fewer than the other methods. Therefore, the proposed methods are effective for fruit classification.
Kusaba, Akira; Li, Guanchen; von Spakovsky, Michael R; Kangawa, Yoshihiro; Kakimoto, Koichi
2017-08-15
Clearly understanding elementary growth processes that depend on surface reconstruction is essential to controlling vapor-phase epitaxy more precisely. In this study, ammonia chemical adsorption on GaN(0001) reconstructed surfaces under metalorganic vapor phase epitaxy (MOVPE) conditions (3Ga-H and N_ad-H + Ga-H on a 2 × 2 unit cell) is investigated using steepest-entropy-ascent quantum thermodynamics (SEAQT). SEAQT is a thermodynamic-ensemble based, first-principles framework that can predict the behavior of non-equilibrium processes, even those far from equilibrium where the state evolution is a combination of reversible and irreversible dynamics. SEAQT is an ideal choice to handle this problem on a first-principles basis since the chemical adsorption process starts from a highly non-equilibrium state. A result of the analysis shows that the probability of adsorption on 3Ga-H is significantly higher than that on N_ad-H + Ga-H. Additionally, the growth temperature dependence of these adsorption probabilities and the temperature increase due to the heat of reaction are determined. The non-equilibrium thermodynamic modeling applied can lead to better control of the MOVPE process through the selection of preferable reconstructed surfaces. The modeling also demonstrates the efficacy of DFT-SEAQT coupling for determining detailed non-equilibrium process characteristics with a much smaller computational burden than would be entailed with mechanics-based, microscopic-mesoscopic approaches.
Directory of Open Access Journals (Sweden)
Ayman El Mobacher
2013-01-01
Full Text Available Using local invariant features has been proven by the published literature to be powerful for image processing and pattern recognition tasks. However, in energy-aware environments, these invariant features would not scale easily because of their computational requirements. Motivated to find an efficient building recognition algorithm based on scale invariant feature transform (SIFT) keypoints, we present in this paper uSee, a supervised learning framework which exploits the symmetrical and repetitive structural patterns in buildings to identify subsets of relevant clusters formed by these keypoints. Once an image is captured by a smartphone, uSee preprocesses it using variations in gradient angle- and entropy-based measures before extracting the building signature and comparing its representative SIFT keypoints against a repository of building images. Experimental results on two different databases confirm the effectiveness of uSee in delivering, at a greatly reduced computational cost, the high matching scores for building recognition that local descriptors can achieve. With only 14.3% of image SIFT keypoints, uSee exceeded prior literature results by achieving an accuracy of 99.1% on the Zurich Building Database with no manual rotation, thus saving significantly on the computational requirements of the task at hand.
Entropy and energy quantization: Planck thermodynamic calculation
International Nuclear Information System (INIS)
Mota e Albuquerque, Ivone Freire da.
1988-01-01
This dissertation analyses the origins and development of the concept of entropy and its meaning in the second law of thermodynamics, as well as the thermodynamic derivation of energy quantization. The probabilistic interpretation of that law and its implications for physical theory are made evident. Based on the work of Clausius (which follows Carnot's), we analyse and present the entropy concept in an original way. Boltzmann's work and his probabilistic interpretation of the second law of thermodynamics are examined. The debate between the atomistic and energeticist points of view, current at that time, is also discussed. (author). 38 refs., 3 figs
Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study
Directory of Open Access Journals (Sweden)
Elie Bienenstock
2008-06-01
Full Text Available Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: the plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced; two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in
Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study
Gao, Yun; Kontoyiannis, Ioannis; Bienenstock, Elie
2008-06-01
Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: The plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced; two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word
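Of the estimators compared, the plug-in method is the simplest to state: estimate the empirical distribution of length-k words and divide its Shannon entropy by k. A minimal Python sketch using non-overlapping words (the paper's exact windowing and parameter choices are not reproduced here):

```python
import math
import random
from collections import Counter

def plugin_entropy_rate(bits, word_len):
    """Plug-in (maximum-likelihood) entropy-rate estimate in bits/symbol:
    empirical Shannon entropy of non-overlapping length-word_len words,
    divided by word_len."""
    words = [tuple(bits[i:i + word_len])
             for i in range(0, len(bits) - word_len + 1, word_len)]
    counts = Counter(words)
    n = sum(counts.values())
    h_block = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h_block / word_len

rng = random.Random(0)
coin = [rng.randint(0, 1) for _ in range(20000)]  # fair coin: true rate is 1
```

Even on fair-coin data the estimate typically sits slightly below 1 bit/symbol, a small instance of the downward bias that the paper identifies as the dominant source of error for all estimators considered.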
Entropy region and convolution
Czech Academy of Sciences Publication Activity Database
Matúš, František; Csirmaz, L.
2016-01-01
Roč. 62, č. 11 (2016), s. 6007-6018 ISSN 0018-9448 R&D Projects: GA ČR GA13-20012S Institutional support: RVO:67985556 Keywords : entropy region * information-theoretic inequality * polymatroid Subject RIV: BD - Theory of Information Impact factor: 2.679, year: 2016 http://library.utia.cas.cz/separaty/2016/MTR/matus-0465564.pdf
Equipartition of entropy production
International Nuclear Information System (INIS)
Tondeur, D.
1990-01-01
This paper deals with the optimal design or operation of heat and mass transfer processes and develops the following conjecture: for a given duty, the best configuration of the process is that in which the entropy production rate is most uniformly distributed. This principle is first analyzed in detail on the simple example of tubular heat exchangers, and within the framework of linear irreversible thermodynamics. A main result is established, which states that the total entropy production is minimal when the local production is uniformly distributed (equipartition). Corollaries then result, which relate the entropy production and the variance of its distribution to economic factors such as the duty, the exchange area, the fluid flow-rates, and the temperature changes. The equipartition principle is then extended to multiple independent variables (time and space), multicomponent transfer, and non-linear but concave flux vs force relationship. Chemical Engineering examples are discussed, where the equipartition property has been applied implicitly or explicitly: design of distillation plates, cyclic distillation, optimal state of feed, and flow-sheets in chromatographic separations. Finally, a generalization of the equipartition principle is proposed, for systems with a distributed design variable (such as the size of the various elements of a system). The optimal distribution of investment is such that the investment in each element (properly amortized) is equal to the cost of irreversible energy degradation in this element. This is equivalent to saying that the ratio of these two quantities is uniformly distributed over the system, and reduces to equipartition of entropy production when the cost factors are constant over the whole system
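In the linear regime the equipartition result is easy to check numerically: with fluxes J_i = L·x_i and local production σ_i = L·x_i², fixing the duty Σ J_i and comparing a uniform force distribution against any perturbed one carrying the same duty shows that the uniform case produces less total entropy. The following Python toy check uses an illustrative segment model and numbers, not values from the paper:

```python
import numpy as np

def total_entropy_production(forces, L=1.0):
    """Total entropy production of a transfer device split into segments.
    Linear irreversible regime: flux J_i = L * x_i, local production
    sigma_i = J_i * x_i = L * x_i**2."""
    return L * float(np.sum(np.asarray(forces, dtype=float) ** 2))

# Fix the duty (total flux, sum of J_i) and compare a uniform force
# distribution against a perturbed one rescaled to the same duty.
n, duty, L = 10, 5.0, 1.0
uniform = np.full(n, duty / (L * n))
rng = np.random.default_rng(1)
skewed = uniform + rng.normal(0.0, 0.1, n)
skewed *= duty / (L * skewed.sum())  # rescale back to the same duty
```

The inequality follows from the Cauchy-Schwarz inequality, so any non-uniform allocation of driving forces at fixed duty strictly increases total production.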
Holographic entanglement entropy
Rangamani, Mukund
2017-01-01
This book provides a comprehensive overview of developments in the field of holographic entanglement entropy. Within the context of the AdS/CFT correspondence, it is shown how quantum entanglement is computed by the area of certain extremal surfaces. The general lessons one can learn from this connection are drawn out for quantum field theories, many-body physics, and quantum gravity. An overview of the necessary background material is provided together with a flavor of the exciting open questions that are currently being discussed. The book is divided into four main parts. In the first part, the concept of entanglement, and methods for computing it, in quantum field theories is reviewed. In the second part, an overview of the AdS/CFT correspondence is given and the holographic entanglement entropy prescription is explained. In the third part, the time-dependence of entanglement entropy in out-of-equilibrium systems, and applications to many body physics are explored using holographic methods. The last part f...
Fluctuation of Information Entropy Measures in Cell Image
Directory of Open Access Journals (Sweden)
Ishay Wohl
2017-10-01
Full Text Available A simple, label-free cytometry technique is introduced. It is based on the analysis of the fluctuation of image Gray Level Information Entropy (GLIE), which is shown to reflect intracellular biophysical properties like generalized entropy. In this study, the analytical relations between cellular thermodynamic generalized entropy and diffusivity and GLIE fluctuation measures are explored for the first time. The standard deviation (SD) of GLIE is shown by experiments, simulation and theoretical analysis to be indifferent to microscope system “noise”. Then, the ability of GLIE fluctuation measures to reflect basic cellular entropy conditions of early death and malignancy is demonstrated in a cell model of human, healthy-donor lymphocytes, malignant Jurkat cells, as well as dead lymphocytes and Jurkat cells. Utilization of GLIE-based fluctuation measures seems to have the advantage of displaying biophysical characterization of the tested cells, like diffusivity and entropy, in a novel, unique, simple and illustrative way.
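The GLIE measure itself is the Shannon entropy of an image's gray-level histogram, and the fluctuation measure is its standard deviation over a time series of frames. A hedged Python sketch (the bin count and the exact normalization used in the paper are assumptions):

```python
import numpy as np

def gray_level_entropy(img, bins=256):
    """GLIE: Shannon entropy (bits) of the image's gray-level histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist[hist > 0] / img.size
    return float(-np.sum(p * np.log2(p)))

def glie_sd(frames):
    """Fluctuation measure: SD of GLIE across a time series of frames."""
    return float(np.std([gray_level_entropy(f) for f in frames]))

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(64, 64))  # stand-in for a cell image
```

A static scene gives zero fluctuation by construction; it is the frame-to-frame variation of the entropy, not its absolute level, that carries the intracellular-dynamics signal the paper exploits.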
Ranking DMUs by Comparing DEA Cross-Efficiency Intervals Using Entropy Measures
Directory of Open Access Journals (Sweden)
Tim Lu
2016-12-01
Full Text Available Cross-efficiency evaluation, an extension of data envelopment analysis (DEA), can eliminate unrealistic weighting schemes and provide a ranking for decision-making units (DMUs). In the literature, the unique determination of input and output weights receives most of the attention. However, the problem of choosing between the aggressive (minimal) and benevolent (maximal) formulation for decision-making still remains. In this paper, we develop a procedure to perform cross-efficiency evaluation without the need to make any specific choice of DEA weights. The proposed procedure takes the aggressive and benevolent formulations into account at the same time, so the choice of DEA weights can be avoided. Consequently, a set of cross-efficiency intervals is obtained for each DMU. Entropy, rooted in information theory, is an effective tool for measuring uncertainty. We therefore utilize entropy to construct a numerical index for DMUs with cross-efficiency intervals. A mathematical program is proposed to find the optimal entropy values of DMUs for comparison. With the derived entropy values, we can rank DMUs accordingly. Two examples illustrate the effectiveness of the idea proposed in this paper.
Preimage entropy dimension of topological dynamical systems
Liu, Lei; Zhou, Xiaomin; Zhou, Xiaoyao
2014-01-01
We propose a new definition of preimage entropy dimension for continuous maps on compact metric spaces, investigate fundamental properties of the preimage entropy dimension, and compare it with the topological entropy dimension. The defined preimage entropy dimension retains various basic properties of the topological entropy dimension; for example, the preimage entropy dimension of a subsystem is bounded by that of the original system and topologically conjugated system...
Quantum Dynamical Entropies and Gács Algorithmic Entropy
Directory of Open Access Journals (Sweden)
Fabio Benatti
2012-07-01
Full Text Available Several quantum dynamical entropies have been proposed that extend the classical Kolmogorov–Sinai (KS) dynamical entropy. The same scenario appears in relation to the extension of algorithmic complexity theory to the quantum realm. A theorem of Brudno establishes that the complexity per unit time step along typical trajectories of a classical ergodic system equals the KS entropy. In the following, we establish a similar relation between the Connes–Narnhofer–Thirring quantum dynamical entropy for the shift on quantum spin chains and the Gács algorithmic entropy. We further provide, for the same system, a weaker linkage between the latter algorithmic complexity and a different quantum dynamical entropy proposed by Alicki and Fannes.
Entanglement entropy and differential entropy for massive flavors
International Nuclear Information System (INIS)
Jones, Peter A.R.; Taylor, Marika
2015-01-01
In this paper we compute the holographic entanglement entropy for massive flavors in the D3-D7 system, for arbitrary mass and various entangling region geometries. We show that the universal terms in the entanglement entropy exactly match those computed in the dual theory using conformal perturbation theory. We derive holographically the universal terms in the entanglement entropy for a CFT perturbed by a relevant operator, up to second order in the coupling; our results are valid for any entangling region geometry. We present a new method for computing the entanglement entropy of any top-down brane probe system using Kaluza-Klein holography and illustrate our results with massive flavors at finite density. Finally we discuss the differential entropy for brane probe systems, emphasising that the differential entropy captures only the effective lower-dimensional Einstein metric rather than the ten-dimensional geometry.
Entanglement entropy and differential entropy for massive flavors
Energy Technology Data Exchange (ETDEWEB)
Jones, Peter A.R. [Physics and Astronomy and STAG Research Centre, University of Southampton, Highfield, Southampton, SO17 1BJ (United Kingdom); Taylor, Marika [Mathematical Sciences and STAG Research Centre, University of Southampton, Highfield, Southampton, SO17 1BJ (United Kingdom)
2015-08-04
In this paper we compute the holographic entanglement entropy for massive flavors in the D3-D7 system, for arbitrary mass and various entangling region geometries. We show that the universal terms in the entanglement entropy exactly match those computed in the dual theory using conformal perturbation theory. We derive holographically the universal terms in the entanglement entropy for a CFT perturbed by a relevant operator, up to second order in the coupling; our results are valid for any entangling region geometry. We present a new method for computing the entanglement entropy of any top-down brane probe system using Kaluza-Klein holography and illustrate our results with massive flavors at finite density. Finally we discuss the differential entropy for brane probe systems, emphasising that the differential entropy captures only the effective lower-dimensional Einstein metric rather than the ten-dimensional geometry.
Khosravi, Khabat; Pourghasemi, Hamid Reza; Chapi, Kamran; Bahri, Masoumeh
2016-12-01
Flooding is a very common natural hazard worldwide, causing large-scale casualties every year; Iran is not immune to this threat either. Comprehensive flood susceptibility mapping is very important for reducing losses of lives and property. Thus, the aim of this study is to map susceptibility to flooding by different bivariate statistical methods, including Shannon's entropy (SE), statistical index (SI), and weighting factor (Wf). In this regard, model performance evaluation is also carried out in the Haraz Watershed, Mazandaran Province, Iran. In the first step, 211 flood locations were identified from documentary sources and field inventories, of which 70% (151 positions) were used for flood susceptibility modeling and 30% (60 positions) for evaluation and verification of the models. In the second step, ten factors influencing flooding were chosen, namely slope angle, plan curvature, altitude, topographic wetness index (TWI), stream power index (SPI), distance from river, rainfall, geology, land use, and normalized difference vegetation index (NDVI). In the next step, flood susceptibility maps were prepared with these methods in ArcGIS. As the last step, the receiver operating characteristic (ROC) curve was drawn and the area under the curve (AUC) calculated for quantitative assessment of each model. The results showed that the best model for estimating susceptibility to flooding in the Haraz Watershed was the SI model, with prediction and success rates of 99.71 and 98.72%, respectively, followed by the Wf and SE models with AUC values of 98.1 and 96.57% for the success rate, and 97.6 and 92.42% for the prediction rate, respectively. In the SI and Wf models, the most and least important parameters were distance from river and geology, respectively. Flood susceptibility maps are informative for managers and decision makers in the Haraz Watershed when contemplating measures to reduce human and financial losses.
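The AUC used for model comparison here can be computed without plotting the ROC curve at all, via the rank-sum identity AUC = P(score of a random positive > score of a random negative), with ties counted as one half. A self-contained Python sketch of the O(n·m) pairwise form, adequate for modest validation sets such as the 60 points above (the scores below are illustrative, not the study's data):

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """AUC via the rank-sum (Mann-Whitney) identity:
    fraction of (positive, negative) pairs where the positive scores
    higher, counting ties as 1/2."""
    pos = np.asarray(scores_pos, dtype=float)
    neg = np.asarray(scores_neg, dtype=float)
    greater = np.sum(pos[:, None] > neg[None, :])
    ties = np.sum(pos[:, None] == neg[None, :])
    return float((greater + 0.5 * ties) / (pos.size * neg.size))
```

An AUC of 1.0 means every flood location outranks every non-flood location on the susceptibility map; 0.5 means the map carries no ranking information.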
Entropy Analysis of Kinetic Flux Vector Splitting Schemes for the Compressible Euler Equations
Shiuhong, Lui; Xu, Jun
1999-01-01
Flux Vector Splitting (FVS) schemes form one family of approximate Riemann solvers for the compressible Euler equations. In this paper, the discretized entropy condition of the Kinetic Flux Vector Splitting (KFVS) scheme based on gas-kinetic theory is proved. The proof of the entropy condition involves the difference in the entropy definition between distinguishable and indistinguishable particles.
Scaling-Laws of Flow Entropy with Topological Metrics of Water Distribution Networks
Giovanni Francesco Santonastaso; Armando Di Nardo; Michele Di Natale; Carlo Giudicianni; Roberto Greco
2018-01-01
Robustness of water distribution networks is related to their connectivity and topological structure, which also affect their reliability. Flow entropy, based on Shannon's informational entropy, has been proposed as a measure of network redundancy and adopted as a proxy of reliability in optimal network design procedures. In this paper, the scaling properties of flow entropy of water distribution networks with their size and other topological metrics are studied. To this aim, flow entropy, ma...
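Although the abstract is truncated, flow entropy in this literature is built from Shannon entropies of the flow splits at network nodes, aggregated over the network. The following Python sketch is a simplified, hypothetical version of such a measure; the full Awumah/Tanyimboh-style definition also carries source-node terms that are omitted here:

```python
import math

def node_entropy(outflows):
    """Shannon entropy (nats) of the flow split among a node's outgoing pipes."""
    total = sum(outflows)
    return -sum((q / total) * math.log(q / total) for q in outflows if q > 0)

def flow_entropy(node_outflows):
    """Simplified network flow entropy: node split entropies weighted by
    each node's share of the total routed flow (a sketch, not the full
    definition used in the water-distribution literature)."""
    totals = [sum(q) for q in node_outflows]
    grand = sum(totals)
    return sum((t / grand) * node_entropy(q)
               for q, t in zip(node_outflows, totals))
```

Even splits among many pipes score high (redundant supply paths), while funneling all flow through one pipe scores zero, which is why flow entropy serves as a proxy for redundancy and hence reliability.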
Texture analysis using Renyi's generalized entropies
Grigorescu, SE; Petkov, N
2003-01-01
We propose a texture analysis method based on Renyi's generalized entropies. The method aims at identifying texels in regular textures by searching for the smallest window through which the minimum number of different visual patterns is observed when moving the window over a given texture. The
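The Rényi entropy of order α underlying this texture method generalizes the Shannon entropy as H_α(p) = log₂(Σᵢ pᵢ^α)/(1 − α) for α > 0, α ≠ 1. A minimal Python sketch of the entropy itself; applying it to a histogram of visual patterns per window, and the window search, are not reproduced:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1), in bits,
    of a discrete probability distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # zero-probability outcomes contribute nothing
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))
```

A uniform distribution over n patterns attains the maximum log₂(n), while concentrated distributions score lower, which is what makes the measure sensitive to how many distinct visual patterns a candidate texel window contains.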
Generalized entropy production fluctuation theorems for quantum ...
Indian Academy of Sciences (India)
Based on trajectory-dependent path probability formalism in state space, we derive generalized entropy production fluctuation relations for a quantum system in the presence of measurement and ... Proceedings of the International Workshop/Conference on Computational Condensed Matter Physics and Materials Science
Universal canonical entropy for gravitating systems
Indian Academy of Sciences (India)
... fluctuations, as found earlier) the canonical entropy then has a universal form including logarithmic corrections to the area law. This form is shown to be independent of the index appearing in assumption (b). The index, however, is crucial in ascertaining the domain of validity of our approach based on thermal equilibrium.
Directory of Open Access Journals (Sweden)
Shuan-Feng Zhao
2017-01-01
Full Text Available In driver fatigue monitoring technology, the essence is to capture and analyze driver behavior information, such as eye, face, heart, and EEG activity during driving. However, ECG and EEG monitoring are limited by the electrodes they require to be installed and are not commercially available. The most common fatigue detection method is the analysis of driver behavior, that is, determining whether the driver is tired by recording and analyzing the behavior of the steering wheel and brake. The driver usually adjusts his or her actions based on the observed road conditions, so the road path information is directly contained in the vehicle driving state; to judge the driver's behavior from vehicle driving-state information, the first task is to remove the road information from the vehicle driving-state data. Therefore, this paper proposes an effective intrinsic mode function selection method based on the approximate entropy of empirical mode decomposition, considering the frequency distributions of road and vehicle information and the unsteady, nonlinear characteristics of the driver's closed-loop driving system in vehicle driving-state data. The objective is to extract the effective component of the driving behavior information and to weaken the road information component. Finally, the effectiveness of the proposed method is verified by simulated driving experiments.
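The approximate entropy used for IMF selection is Pincus' ApEn(m, r): the average log-likelihood that sequences close for m samples remain close for m + 1. A direct Python implementation follows; the defaults m = 2, r = 0.2·SD are the conventional choices, and the paper's exact settings are not stated in this abstract:

```python
import math
import random

def approximate_entropy(x, m=2, r=None):
    """Pincus' ApEn(m, r) of a 1-D series: phi(m) - phi(m+1), where phi(k)
    averages the log-frequency of length-k templates matching within
    tolerance r (Chebyshev distance, self-matches included)."""
    n = len(x)
    if r is None:
        mean = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    def phi(k):
        templates = [x[i:i + k] for i in range(n - k + 1)]
        total = 0.0
        for a in templates:
            c = sum(1 for b in templates
                    if max(abs(u - v) for u, v in zip(a, b)) <= r)
            total += math.log(c / len(templates))
        return total / len(templates)
    return phi(m) - phi(m + 1)

regular = [0.0, 1.0] * 50                   # perfectly periodic series
rng = random.Random(0)
noisy = [rng.random() for _ in range(100)]  # white-noise series
```

A regular series scores near zero while noise scores much higher, and it is this contrast that lets the method separate driver-behavior components from road-information components among the decomposed IMFs.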