WorldWideScience

Sample records for entropy based comparison

  1. Entropy-based Probabilistic Fatigue Damage Prognosis and Algorithmic Performance Comparison

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, a maximum entropy-based general framework for probabilistic fatigue damage prognosis is investigated. The proposed methodology is based on an...

  2. Entropy Evaluation Based on Value Validity

    Directory of Open Access Journals (Sweden)

    Tarald O. Kvålseth

    2014-09-01

    Besides its importance in statistical physics and information theory, the Boltzmann-Shannon entropy S has become one of the most widely used and misused summary measures of various attributes (characteristics) in diverse fields of study. It has also been the subject of extensive and perhaps excessive generalizations. This paper introduces the concept and criteria for value validity as a means of determining if an entropy takes on values that reasonably reflect the attribute being measured and that permit different types of comparisons to be made for different probability distributions. While neither S nor its relative entropy equivalent S* meet the value-validity conditions, certain power functions of S and S* do to a considerable extent. No parametric generalization offers any advantage over S in this regard. A measure based on Euclidean distances between probability distributions is introduced as a potential entropy that does comply fully with the value-validity requirements, and its statistical inference procedure is discussed.
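
    As a quick illustration of the quantities discussed in this record, the following minimal Python sketch computes the Boltzmann-Shannon entropy S and a normalized form; S* is assumed here to be S divided by its maximum ln n, one common reading of "relative entropy equivalent" (the power-function corrections mentioned in the abstract are not reproduced, since their exponents are not given in this record):

      import numpy as np

      def shannon_entropy(p):
          """Boltzmann-Shannon entropy S = -sum(p_i * ln p_i), with 0 * ln 0 := 0."""
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          return -np.sum(p * np.log(p))

      p_uniform = np.ones(4) / 4                 # maximal disorder for n = 4
      p_skewed = np.array([0.7, 0.1, 0.1, 0.1])  # more concentrated distribution

      for p in (p_uniform, p_skewed):
          S = shannon_entropy(p)
          S_star = S / np.log(len(p))            # normalized to [0, 1]
          print(S, S_star)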

  3. A comparison of EEG spectral entropy with conventional quantitative ...

    African Journals Online (AJOL)

    A comparison of EEG spectral entropy with conventional quantitative EEG at varying depths of sevoflurane anaesthesia. PR Bartel, FJ Smith, PJ Becker. Abstract. Background and Aim: Recently an electroencephalographic (EEG) spectral entropy module (M-ENTROPY) for an anaesthetic monitor has become commercially ...

  4. Dose sparing of induction dose of propofol by fentanyl and butorphanol: A comparison based on entropy analysis

    Directory of Open Access Journals (Sweden)

    Jasleen Kaur

    2013-01-01

    Background: The induction dose of propofol is reduced with concomitant use of opioids as a result of a possible synergistic action. Aim and Objectives: The present study compared the effect of fentanyl and two doses of butorphanol pre-treatment on the induction dose of propofol, with specific emphasis on entropy. Methods: Three groups of 40 patients each, of American Society of Anaesthesiologists physical status I and II, were randomized to receive fentanyl 2 μg/kg (Group F), butorphanol 20 μg/kg (Group B20) or butorphanol 40 μg/kg (Group B40) as pre-treatment. Five minutes later, the degree of sedation was assessed by the Observer's Assessment of Alertness/Sedation scale (OAA/S). Induction of anesthesia was done with propofol (30 mg/10 s) till the loss of response to verbal commands. Thereafter, rocuronium 1 mg/kg was administered and endotracheal intubation was performed 2 min later. OAA/S, propofol induction dose, heart rate, blood pressure, oxygen saturation and entropy (response and state) were compared in the three groups. Statistical Analysis: Data were analyzed using the ANOVA test with post hoc significance, the Kruskal-Wallis test, the Chi-square test and the Fisher exact test. A P<0.05 was considered significant. Results: The induction dose of propofol (mg/kg) was observed to be 1.1±0.50 in Group F, 1.05±0.35 in Group B20 and 1.18±0.41 in Group B40. Induction with propofol occurred at higher entropy values on pre-treatment with both fentanyl and butorphanol. Hemodynamic variables were comparable in all three groups. Conclusion: Butorphanol 20 μg/kg and 40 μg/kg reduce the induction requirement of propofol, comparably to fentanyl 2 μg/kg, and confer hemodynamic stability at induction and intubation.

  5. Entropy-Based Clutter Rejection for Intrawall Diagnostics

    Directory of Open Access Journals (Sweden)

    Raffaele Solimene

    2012-01-01

    The intrawall diagnostic problem of detecting localized inhomogeneities possibly present within a wall is addressed. As is well known, clutter arising from the masonry structure can impair detection of embedded scatterers due to the high-amplitude reflections that the wall's front face introduces. Moreover, internal multiple reflections can also make interpretation of ground penetrating radar images (radargrams) difficult. To counteract these drawbacks, a clutter rejection method properly tailored to the wall features is mandatory. To this end, here we employ a windowing strategy based on entropy measures of temporal traces' “similarity.” Accordingly, instants of time for which radargrams exhibit entropy values greater than a prescribed threshold are “silenced.” Numerical results are presented in order to show the effectiveness of the entropy-based clutter rejection algorithm. Moreover, a comparison with standard average trace subtraction is also included.

  6. Autonomous entropy-based intelligent experimental design

    Science.gov (United States)

    Malakar, Nabin Kumar

    2011-07-01

    The aim of this thesis is to explore the application of probability and information theory in experimental design, and to do so in a way that combines what we know about inference and inquiry in a comprehensive and consistent manner. Present day scientific frontiers involve data collection at an ever-increasing rate. This requires that we find a way to collect the most relevant data in an automated fashion. By following the logic of the scientific method, we couple an inference engine with an inquiry engine to automate the iterative process of scientific learning. The inference engine involves Bayesian machine learning techniques to estimate model parameters based upon both prior information and previously collected data, while the inquiry engine implements data-driven exploration. By choosing an experiment whose distribution of expected results has the maximum entropy, the inquiry engine selects the experiment that maximizes the expected information gain. The coupled inference and inquiry engines constitute an autonomous learning method for scientific exploration. We apply it to a robotic arm to demonstrate the efficacy of the method. Optimizing inquiry involves searching for an experiment that promises, on average, to be maximally informative. If the set of potential experiments is described by many parameters, the search involves a high-dimensional entropy space. In such cases, a brute force search method will be slow and computationally expensive. We develop an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment. This helps to reduce the number of computations necessary to find the optimal experiment. We also extended the method of maximizing entropy, and developed a method of maximizing joint entropy so that it could be used as a principle of collaboration between two robots. This is a major achievement of this thesis, as it allows the information-based collaboration between two robotic units towards a same

  7. Entropy based fingerprint for local crystalline order

    Science.gov (United States)

    Piaggi, Pablo M.; Parrinello, Michele

    2017-09-01

    We introduce a new fingerprint that allows distinguishing between liquid-like and solid-like atomic environments. This fingerprint is based on an approximate expression for the entropy projected on individual atoms. When combined with local enthalpy, this fingerprint acquires an even finer resolution and it is capable of discriminating between different crystal structures.

  8. Entropy-based benchmarking methods

    NARCIS (Netherlands)

    Temurshoev, Umed

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of Denton (1971) method and the growth

  9. Comparison of background EEG activity of different groups of patients with idiopathic epilepsy using Shannon spectral entropy and cluster-based permutation statistical testing.

    Directory of Open Access Journals (Sweden)

    Jose Antonio Urigüen

    Idiopathic epilepsy is characterized by generalized seizures with no apparent cause. One of its main problems is the lack of biomarkers to monitor the evolution of patients. The only tools clinicians can use are limited to inspecting the number of seizures during previous periods of time and assessing the existence of interictal discharges. As a result, there is a need for improving the tools to assist the diagnosis and follow-up of these patients. The goal of the present study is to compare and find a way to differentiate between two groups of patients suffering from idiopathic epilepsy: one group that could be followed up by means of specific electroencephalographic (EEG) signatures (intercritical activity present), and another one that could not, due to the absence of these markers. To do that, we analyzed the background EEG activity of each group in the absence of seizures and epileptic intercritical activity. We used the Shannon spectral entropy (SSE) as a metric to discriminate between the two groups and performed permutation-based statistical tests to detect the set of frequencies that show significant differences. By constraining the spectral entropy estimation to the 6.25–12.89 Hz range, we detect statistical differences (below the 0.05 alpha-level) between both types of epileptic patients at all available recording channels. Interestingly, entropy values follow a trend that is inversely related to the elapsed time from the last seizure. Indeed, this trend shows asymptotic convergence to the SSE values measured in a group of healthy subjects, who present SSE values lower than either of the two groups of patients. All these results suggest that the SSE, measured in a specific range of frequencies, could serve to follow up the evolution of patients suffering from idiopathic epilepsy. Future studies remain to be conducted in order to assess the predictive value of this approach for the anticipation of seizures.
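
    A minimal sketch of a band-limited Shannon spectral entropy of the kind described above, assuming SSE is the Shannon entropy of the Welch power spectrum normalized to a probability distribution over the selected frequency range; the study's exact estimator and preprocessing are not given in this record:

      import numpy as np
      from scipy.signal import welch

      def shannon_spectral_entropy(x, fs, band=(6.25, 12.89)):
          """Shannon entropy of the normalized power spectrum within `band`."""
          freqs, psd = welch(x, fs=fs, nperseg=2 * fs)   # 2-second segments
          mask = (freqs >= band[0]) & (freqs <= band[1])
          p = psd[mask] / psd[mask].sum()                # normalize to a pmf
          return -np.sum(p * np.log(p))

      fs = 256                                  # hypothetical sampling rate (Hz)
      t = np.arange(0, 10, 1 / fs)
      eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # toy signal
      print(shannon_spectral_entropy(eeg, fs))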

  10. Entropy-based financial asset pricing.

    Directory of Open Access Journals (Sweden)

    Mihály Ormos

    We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios in a simpler way and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define the continuous entropy as an alternative measure of risk. Our results show that entropy decreases as a function of the number of securities in a portfolio, in a similar way to the standard deviation, and that efficient portfolios are situated on a hyperbola in the expected return-entropy system. For the empirical investigation we use daily returns of 150 randomly selected securities over a period of 27 years. Our regression results show that entropy has higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore, we show the time-varying behavior of the beta along with entropy.

  11. Entropy-based financial asset pricing.

    Science.gov (United States)

    Ormos, Mihály; Zibriczky, Dávid

    2014-01-01

    We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios in a simpler way and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define the continuous entropy as an alternative measure of risk. Our results show that entropy decreases as a function of the number of securities in a portfolio, in a similar way to the standard deviation, and that efficient portfolios are situated on a hyperbola in the expected return-entropy system. For the empirical investigation we use daily returns of 150 randomly selected securities over a period of 27 years. Our regression results show that entropy has higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore, we show the time-varying behavior of the beta along with entropy.
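
    These two records do not specify how the "continuous entropy" risk measure is estimated; a minimal sketch, assuming a simple histogram estimate of the differential entropy of daily returns, illustrates why entropy can behave like a risk measure (wider return distributions have higher entropy):

      import numpy as np

      def differential_entropy_hist(returns, bins=50):
          """Histogram estimate of h(X) = -integral f(x) ln f(x) dx."""
          density, edges = np.histogram(returns, bins=bins, density=True)
          widths = np.diff(edges)
          mask = density > 0
          return -np.sum(density[mask] * np.log(density[mask]) * widths[mask])

      rng = np.random.default_rng(0)
      low_risk = rng.normal(0, 0.01, 252 * 27)   # 27 years of daily returns, 1% vol
      high_risk = rng.normal(0, 0.03, 252 * 27)  # same mean, 3% vol
      print(differential_entropy_hist(low_risk), differential_entropy_hist(high_risk))
      # For Gaussian returns h = 0.5 * ln(2 * pi * e * sigma^2), so higher volatility
      # yields higher entropy, consistent with entropy acting as a risk measure.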

  12. Entropy-Based Algorithm for Supply-Chain Complexity Assessment

    Directory of Open Access Journals (Sweden)

    Boris Kriheli

    2018-03-01

    This paper considers a graph model of hierarchical supply chains. The goal is to measure the complexity of links between different components of the chain, for instance, between the principal equipment manufacturer (a root node) and its suppliers (preceding supply nodes). The information entropy is used to serve as a measure of knowledge about the complexity of shortages and pitfalls in the relationship between the supply chain components under uncertainty. The concept of conditional (relative) entropy is introduced, which is a generalization of the conventional (non-relative) entropy. An entropy-based algorithm providing efficient assessment of the supply chain complexity as a function of the supply chain (SC) size is developed.
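
    The paper's conditional (relative) entropy over supply-chain graphs is not specified in this record; the sketch below shows the textbook conditional entropy it generalizes, applied to a hypothetical joint distribution of a supplier state X and root-node demand Y:

      import numpy as np

      def conditional_entropy(joint):
          """H(X | Y) = sum_y p(y) * H(X | Y = y) for a joint pmf p(x, y), in bits."""
          joint = np.asarray(joint, dtype=float)
          p_y = joint.sum(axis=0)
          h = 0.0
          for j, py in enumerate(p_y):
              if py == 0:
                  continue
              px_given_y = joint[:, j] / py
              px_given_y = px_given_y[px_given_y > 0]
              h -= py * np.sum(px_given_y * np.log2(px_given_y))
          return h

      # Hypothetical joint pmf: rows index supplier state X, columns demand Y.
      joint = np.array([[0.3, 0.1],
                        [0.1, 0.5]])
      print(conditional_entropy(joint))  # < H(X): knowing Y reduces uncertainty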

  13. Financial time series analysis based on effective phase transfer entropy

    Science.gov (United States)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, compared with phase transfer entropy, this measure estimates the information flow between systems more accurately. To reflect the application of this method in practice, we apply it to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.
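
    The effective phase transfer entropy involves phase extraction and surrogate correction that this record does not detail; the sketch below shows only the underlying plain transfer entropy with history length 1, estimated by binning, so that the direction of information flow in a toy coupled pair can be checked:

      import numpy as np

      def transfer_entropy(x, y, bins=4):
          """TE_{Y -> X} = sum p(x1, x0, y0) * log[ p(x1 | x0, y0) / p(x1 | x0) ]
          with history length 1, using equiprobable-bin discretization."""
          qx = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
          qy = np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1])
          xd, yd = np.digitize(x, qx), np.digitize(y, qy)
          x1, x0, y0 = xd[1:], xd[:-1], yd[:-1]
          te = 0.0
          for a in range(bins):
              for b in range(bins):
                  for c in range(bins):
                      p_abc = np.mean((x1 == a) & (x0 == b) & (y0 == c))
                      if p_abc == 0:
                          continue
                      p_bc = np.mean((x0 == b) & (y0 == c))
                      p_ab = np.mean((x1 == a) & (x0 == b))
                      p_b = np.mean(x0 == b)
                      te += p_abc * np.log((p_abc / p_bc) / (p_ab / p_b))
          return te

      rng = np.random.default_rng(1)
      y = rng.standard_normal(5000)
      x = np.roll(y, 1) + 0.5 * rng.standard_normal(5000)    # x driven by past of y
      print(transfer_entropy(x, y), transfer_entropy(y, x))  # first value is larger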

  14. Fuzzy 2-partition entropy threshold selection based on Big Bang–Big Crunch Optimization algorithm

    Directory of Open Access Journals (Sweden)

    Baljit Singh Khehra

    2015-03-01

    The fuzzy 2-partition entropy approach has been widely used to select the threshold value for image segmentation. This approach uses two parameterized fuzzy membership functions to form a fuzzy 2-partition of the image. The optimal threshold is selected by searching for an optimal combination of parameters of the membership functions such that the entropy of the fuzzy 2-partition is maximized. In this paper, a new fuzzy 2-partition entropy thresholding approach based on the Big Bang–Big Crunch Optimization (BBBCO) technique is proposed. The new thresholding approach is called the BBBCO-based fuzzy 2-partition entropy thresholding algorithm. BBBCO is used to search for an optimal combination of parameters of the membership functions that maximizes the entropy of the fuzzy 2-partition. BBBCO is inspired by a theory of the evolution of the universe, namely the Big Bang and Big Crunch Theory. The proposed algorithm is tested on a number of standard test images. For comparison, three other algorithms, Genetic Algorithm (GA)-based, Biogeography-based Optimization (BBO)-based and recursive approaches, are also implemented. The experimental results show that the proposed algorithm is more effective than the GA-based, BBO-based and recursion-based approaches.
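
    A brute-force sketch of one simple formulation of fuzzy 2-partition entropy thresholding, with linear Z/S membership functions parameterized by (a, c) and an exhaustive grid search standing in for the Big Bang-Big Crunch optimizer; the paper's exact membership functions and entropy definition may differ:

      import numpy as np

      def fuzzy_2partition_entropy(hist, a, c):
          """Entropy of the two fuzzy-event probabilities induced by a linear
          'dark' membership falling from 1 at gray level a to 0 at level c."""
          g = np.arange(256)
          mu_dark = np.clip((c - g) / (c - a), 0.0, 1.0)
          p = hist / hist.sum()
          p_dark = np.sum(p * mu_dark)
          p_bright = 1.0 - p_dark
          eps = 1e-12
          return -(p_dark * np.log(p_dark + eps) + p_bright * np.log(p_bright + eps))

      # Synthetic bimodal "image": dark and bright pixel populations.
      rng = np.random.default_rng(2)
      img = np.concatenate([rng.normal(60, 10, 4000), rng.normal(180, 15, 6000)])
      hist, _ = np.histogram(np.clip(img, 0, 255), bins=256, range=(0, 256))

      best = max((fuzzy_2partition_entropy(hist, a, c), a, c)
                 for a in range(0, 252, 4) for c in range(a + 4, 256, 4))
      H, a, c = best
      print("threshold =", (a + c) / 2)  # crossover point where mu_dark = 0.5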

  15. A new entropy based method for computing software structural complexity

    CERN Document Server

    Roca, J L

    2002-01-01

    In this paper a new methodology for the evaluation of software structural complexity is described. It is based on the entropy evaluation of the random uniform response function associated with the so-called software characteristic function (SCF). The behavior of the SCF with the different software structures and their relationship with the number of inherent errors is investigated. It is also investigated how the entropy concept can be used to evaluate the complexity of a software structure, considering the SCF as a canonical representation of the graph associated with the control flow diagram. The functions, parameters and algorithms that allow this evaluation to be carried out are also introduced. This analytic phase is followed by an experimental phase, verifying the consistency of the proposed metric and its boundary conditions. The conclusion is that the degree of software structural complexity can be measured as the entropy of the random uniform response function of the SCF. That entropy is in direct relation...

  16. A comparison of EEG spectral entropy with conventional quantitative ...

    African Journals Online (AJOL)

    Adele

    and decrease with increasing depth of anaesthesia. Spectral entropy yields two scales: Response Entropy (RE), ranging between 0 and 100, is an amalgam of EEG and frontal muscle activity, while State Entropy (SE), consisting mainly of EEG activity in a lower frequency band, ranges from 0 to 91. Initial reports have pro-...

  17. Comparison of transfer entropy methods for financial time series

    Science.gov (United States)

    He, Jiayi; Shang, Pengjian

    2017-09-01

    There is a certain relationship between the global financial markets, which creates an interactive network of global finance. Transfer entropy, a measurement of information transfer, offers a good way to analyse this relationship. In this paper, we analysed the relationship between 9 stock indices from the U.S., Europe and China (from 1995 to 2015) by using transfer entropy (TE), effective transfer entropy (ETE), Rényi transfer entropy (RTE) and effective Rényi transfer entropy (ERTE). We compared the four methods in terms of their effectiveness for identifying the relationship between stock markets. Two kinds of information flows are given. One reveals that the U.S. takes the leading position in lagged-current cases, but when it comes to same-date cases, China is the most influential. ERTE could provide superior results.

  18. Fractal Image Compression Based on High Entropy Values Technique

    Directory of Open Access Journals (Sweden)

    Douaa Younis Abbaas

    2018-04-01

    Many attempts have been made to improve the encoding stage of fractal image compression (FIC), because it is time-consuming. These attempts work by reducing the size of the search pool for range-domain matching, but most of them lead to poor quality or a lower compression ratio for the reconstructed image. This paper presents a method that improves the performance of the full search algorithm by combining FIC (lossy compression) with a lossless technique (in this case, entropy coding). The entropy technique reduces the size of the domain pool (i.e., the number of domain blocks) based on the entropy value of each range block and domain block; the results of the full search algorithm and the proposed entropy-based algorithm are then compared to see which gives the better results, such as reduced encoding time with acceptable values of both compression quality parameters, C.R (Compression Ratio) and PSNR (Image Quality). The experimental results prove that using the proposed entropy technique reduces the encoding time while keeping the compression ratio and reconstructed image quality as good as possible.

  19. Image coding based on maximum entropy partitioning for identifying ...

    Indian Academy of Sciences (India)

    A new coding scheme based on maximum entropy partitioning is proposed in our work, particularly to identify the improbable intensities related to different emotions. The improbable intensities, when used as a mask, decode the facial expression correctly, providing an effective platform for future emotion categorization ...

  1. Epoch-based Entropy for Early Screening of Alzheimer's Disease.

    Science.gov (United States)

    Houmani, N; Dreyfus, G; Vialatte, F B

    2015-12-01

    In this paper, we introduce a novel entropy measure, termed epoch-based entropy. This measure quantifies the disorder of EEG signals at both the time level and the spatial level, using local density estimation by a Hidden Markov Model on inter-channel stationary epochs. The investigation is conducted on a multi-centric EEG database recorded from patients at an early stage of Alzheimer's disease (AD) and age-matched healthy subjects. We investigate the classification performance of this method, its robustness to noise, and its sensitivity to sampling frequency and to variations of hyperparameters. The measure is compared to two alternative complexity measures, Shannon's entropy and correlation dimension. The classification accuracies for the discrimination of AD patients from healthy subjects were estimated using a linear classifier designed on a development dataset, and subsequently tested on an independent test set. Epoch-based entropy reached a classification accuracy of 83% on the test dataset (specificity = 83.3%, sensitivity = 82.3%), outperforming the two other complexity measures. Furthermore, it was shown to be more stable to hyperparameter variations, and less sensitive to noise and sampling frequency disturbances than the other two complexity measures.

  2. Entropy-Based Privacy against Profiling of User Mobility

    Directory of Open Access Journals (Sweden)

    Alicia Rodriguez-Carrion

    2015-06-01

    Location-based services (LBSs) flood mobile phones nowadays, but their use poses an evident privacy risk. The locations accompanying the LBS queries can be exploited by the LBS provider to build a profile of the user's visited locations, which might disclose sensitive data, such as work or home locations. The classic concept of entropy is widely used to evaluate privacy in these scenarios, where the information is represented as a sequence of independent samples of categorized data. However, since the LBS queries might be sent very frequently, location profiles can be improved by adding temporal dependencies, thus becoming mobility profiles, where location samples are not independent anymore and might disclose the user's mobility patterns. Since the time dimension is factored in, the classic entropy concept falls short of evaluating the real privacy level, which depends also on the time component. Therefore, we propose to extend the entropy-based privacy metric to the use of the entropy rate to evaluate mobility profiles. Then, two perturbative mechanisms are considered to preserve locations and mobility profiles under gradual utility constraints. We further use the proposed privacy metric and compare it to classic ones to evaluate both synthetic and real mobility profiles when the proposed perturbative methods are applied. The results prove the usefulness of the proposed metric for mobility profiles and the need for tailoring the perturbative methods to the features of mobility profiles in order to improve privacy without completely losing utility.
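
    A minimal sketch of the distinction this abstract draws, contrasting the classic entropy of a location sequence (which treats samples as independent) with a first-order estimate of its entropy rate; the paper's estimators are more elaborate:

      import numpy as np

      def entropy_bits(p):
          p = p[p > 0]
          return -np.sum(p * np.log2(p))

      def location_entropy(seq, n_locs):
          """Classic entropy: treats location samples as i.i.d. draws."""
          counts = np.bincount(seq, minlength=n_locs)
          return entropy_bits(counts / counts.sum())

      def entropy_rate_markov(seq, n_locs):
          """First-order estimate of H(X_t | X_{t-1}), a simple stand-in
          for the entropy rate of the mobility profile."""
          trans = np.zeros((n_locs, n_locs))
          for a, b in zip(seq[:-1], seq[1:]):
              trans[a, b] += 1
          pi = trans.sum(axis=1) / trans.sum()   # empirical state frequencies
          rate = 0.0
          for a in range(n_locs):
              if trans[a].sum() > 0:
                  rate += pi[a] * entropy_bits(trans[a] / trans[a].sum())
          return rate

      # A commuter alternating home (0) and work (1): both locations are equally
      # frequent (high classic entropy) yet fully predictable (near-zero rate).
      seq = np.tile([0, 1], 500)
      print(location_entropy(seq, 2), entropy_rate_markov(seq, 2))  # ~1.0 vs ~0.0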

  3. A Method of Rotating Machinery Fault Diagnosis Based on the Close Degree of Information Entropy

    Institute of Scientific and Technical Information of China (English)

    GENG Jun-bao; HUANG Shu-hong; JIN Jia-shan; CHEN Fei; LIU Wei

    2006-01-01

    This paper presents a method of rotating machinery fault diagnosis based on the close degree of information entropy. From the viewpoint of information entropy, we introduce four information entropy features of rotating machinery, which describe the vibration condition of the machinery. The four features are, respectively, denominated singular spectrum entropy, power spectrum entropy, wavelet space state feature entropy and wavelet power spectrum entropy. The value ranges of the four information entropy features of rotating machinery in some typical fault conditions are obtained by experiments, and can serve as standard features for fault diagnosis. According to the principle that more similar models lie at a shorter distance, a decision-making method based on the close degree of information entropy is put forward to handle the recognition of fault patterns. We demonstrate the effectiveness of this approach in an instance involving the fault pattern recognition of some rotating machinery.

  4. A new entropy based method for computing software structural complexity

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2002-01-01

    In this paper a new methodology for the evaluation of software structural complexity is described. It is based on the entropy evaluation of the random uniform response function associated with the so-called software characteristic function (SCF). The behavior of the SCF with the different software structures and their relationship with the number of inherent errors is investigated. It is also investigated how the entropy concept can be used to evaluate the complexity of a software structure, considering the SCF as a canonical representation of the graph associated with the control flow diagram. The functions, parameters and algorithms that allow this evaluation to be carried out are also introduced. This analytic phase is followed by an experimental phase, verifying the consistency of the proposed metric and its boundary conditions. The conclusion is that the degree of software structural complexity can be measured as the entropy of the random uniform response function of the SCF. That entropy is in direct relationship with the number of inherent software errors, and it implies a basic hazard failure rate, so that a minimal structure assures a certain stability and maturity of the program. This metric can be used either to evaluate the product or the process of software development, as a development tool or for monitoring the stability and the quality of the final product. (author)

  5. An Entropy-Based Network Anomaly Detection Method

    Directory of Open Access Journals (Sweden)

    Przemysław Bereziński

    2015-04-01

    Data mining is an interdisciplinary subfield of computer science involving methods at the intersection of artificial intelligence, machine learning and statistics. One of the data mining tasks is anomaly detection, which is the analysis of large quantities of data to identify items, events or observations which do not conform to an expected pattern. Anomaly detection is applicable in a variety of domains, e.g., fraud detection, fault detection and system health monitoring, but this article focuses on the application of anomaly detection in the field of network intrusion detection. The main goal of the article is to prove that an entropy-based approach is suitable to detect modern botnet-like malware based on anomalous patterns in the network. This aim is achieved by realization of the following points: (i) preparation of a concept of an original entropy-based network anomaly detection method, (ii) implementation of the method, (iii) preparation of an original dataset, (iv) evaluation of the method.
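
    The record does not detail the method itself; the sketch below illustrates the general idea behind entropy-based network anomaly detection, computing the normalized entropy of a per-window traffic feature (here, hypothetical destination-port observations) and flagging windows that deviate from the baseline:

      import numpy as np

      def normalized_entropy(values):
          """Entropy of a categorical feature's distribution, normalized to [0, 1]."""
          _, counts = np.unique(values, return_counts=True)
          p = counts / counts.sum()
          h = -np.sum(p * np.log2(p))
          return h / np.log2(len(p)) if len(p) > 1 else 0.0

      rng = np.random.default_rng(3)
      normal = [rng.integers(0, 1000, 500) for _ in range(20)]  # diverse ports
      burst = [np.full(500, 445)]                               # scan-like traffic
      series = [normalized_entropy(w) for w in normal + burst]

      mu, sigma = np.mean(series[:-1]), np.std(series[:-1])     # baseline statistics
      flagged = [i for i, h in enumerate(series) if abs(h - mu) > 3 * sigma]
      print(flagged)  # the final, low-entropy window stands out as anomalous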

  6. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    Science.gov (United States)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as the concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through the dissimilarity matrix based on modified cross-sample entropy; three-dimensional perceptual maps of the results are then provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, namely multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences between stock indices, which are caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively: Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.

  7. Multiscale sample entropy and cross-sample entropy based on symbolic representation and similarity of stock markets

    Science.gov (United States)

    Wu, Yue; Shang, Pengjian; Li, Yilong

    2018-03-01

    A modified multiscale sample entropy measure based on symbolic representation and similarity (MSEBSS) is proposed in this paper to research the complexity of stock markets. The modified algorithm reduces the probability of inducing undefined entropies and is confirmed to be robust to strong noise. Considering validity and accuracy, MSEBSS is more reliable than multiscale entropy (MSE) for time series mingled with much noise, like financial time series. We apply MSEBSS to financial markets, and results show American stock markets have the lowest complexity compared with European and Asian markets. There are exceptions to the regularity that stock markets show a decreasing complexity over the time scale, indicating a periodicity at certain scales. Based on MSEBSS, we introduce the modified multiscale cross-sample entropy measure based on symbolic representation and similarity (MCSEBSS) to consider the degree of asynchrony between distinct time series. Stock markets from the same area have higher synchrony than those from different areas. For stock markets having relatively high synchrony, the entropy values decrease with increasing scale factor. For stock markets having high asynchrony, the entropy values do not always decrease with increasing scale factor; sometimes they tend to increase. So both MSEBSS and MCSEBSS are able to distinguish stock markets of different areas, and they are more helpful if used together for studying other features of financial time series.

  8. The criteria for selecting a method for unfolding neutron spectra based on the information entropy theory

    International Nuclear Information System (INIS)

    Zhu, Qingjun; Song, Fengquan; Ren, Jie; Chen, Xueyong; Zhou, Bin

    2014-01-01

    To further expand the application of artificial neural networks in the field of neutron spectrometry, criteria for choosing between an artificial neural network and the maximum entropy method for the purpose of unfolding neutron spectra were presented. The counts of the Bonner spheres for IAEA neutron spectra were used as a database, and the artificial neural network and the maximum entropy method were used to unfold the neutron spectra; the mean squares of the spectra were defined as the differences between the desired and unfolded spectra. After the information entropy of each spectrum was calculated using information entropy theory, the relationship between the mean squares of the spectra and the information entropy was obtained. Useful information from the information entropy guided the selection of unfolding methods. Due to the importance of the information entropy, a method for predicting the information entropy from the Bonner spheres' counts was established. The criteria based on information entropy theory can be used to choose between the artificial neural network and the maximum entropy unfolding methods. The application of artificial neural networks to unfold neutron spectra was thus expanded. - Highlights: • Two neutron spectra unfolding methods, ANN and MEM, were compared. • The spectrum's entropy offers useful information for selecting unfolding methods. • For spectra with low entropy, the ANN was generally better than MEM. • The spectrum's entropy was predicted based on the Bonner spheres' counts.

  9. Entropy Based Classifier Combination for Sentence Segmentation

    Science.gov (United States)

    2007-01-01

    speaker diarization system to divide the audio data into hypothetical speakers [17]... the prosodic feature also includes turn-based features which describe the position of a word in relation to diarization segmentation. The speaker... "robust speaker segmentation: the ICSI-SRI fall 2004 diarization system," in Proc. RT-04F Workshop, 2004. [18] "The rich transcription fall 2003," http://nist.gov/speech/tests/rt/rt2003/fall/docs/rt03-fall-eval-plan-v9.pdf.

  10. An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression

    Science.gov (United States)

    Weiss, Brandi A.; Dardick, William

    2016-01-01

    This article introduces an entropy-based measure of data-model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify…
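
    A minimal sketch of an entropy-based fuzziness measure for the predicted probabilities of a logistic regression, assuming the usual mixture-modeling normalization in which 1 means perfectly crisp classification and 0 means every case sits at p = 0.5; the article's exact definition may differ:

      import numpy as np

      def classification_entropy(probs):
          """1 - (mean per-case binary entropy) / (max entropy), in [0, 1]."""
          p = np.clip(np.asarray(probs, dtype=float), 1e-12, 1 - 1e-12)
          h = -(p * np.log(p) + (1 - p) * np.log(1 - p))  # per-case entropy (nats)
          return 1.0 - h.mean() / np.log(2)               # normalize by ln 2

      print(classification_entropy([0.95, 0.02, 0.90, 0.10]))  # sharp model, near 1
      print(classification_entropy([0.55, 0.48, 0.50, 0.52]))  # fuzzy model, near 0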

  11. Comparison Between Bayesian and Maximum Entropy Analyses of Flow Networks

    Directory of Open Access Journals (Sweden)

    Steven H. Waldrip

    2017-02-01

    We compare the application of Bayesian inference and the maximum entropy (MaxEnt) method for the analysis of flow networks, such as water, electrical and transport networks. The two methods have the advantage of allowing a probabilistic prediction of flow rates and other variables, when there is insufficient information to obtain a deterministic solution, and also allow the effects of uncertainty to be included. Both methods of inference update a prior to a posterior probability density function (pdf) by the inclusion of new information, in the form of data or constraints. The MaxEnt method maximises an entropy function subject to constraints, using the method of Lagrange multipliers, to give the posterior, while the Bayesian method finds its posterior by multiplying the prior with likelihood functions incorporating the measured data. In this study, we examine MaxEnt using soft constraints, either included in the prior or as probabilistic constraints, in addition to standard moment constraints. We show that when the prior is Gaussian, both Bayesian inference and the MaxEnt method with soft prior constraints give the same posterior means, but their covariances are different. In the Bayesian method, the interactions between variables are applied through the likelihood function, using second or higher-order cross-terms within the posterior pdf. In contrast, the MaxEnt method incorporates interactions between variables using Lagrange multipliers, avoiding second-order correlation terms in the posterior covariance. The MaxEnt method with soft prior constraints, therefore, has a numerical advantage over Bayesian inference, in that the covariance terms are avoided in its integrations. The second MaxEnt method with soft probabilistic constraints is shown to give posterior means of similar, but not identical, structure to the other two methods, due to its different formulation.

  12. 2D Tsallis Entropy for Image Segmentation Based on Modified Chaotic Bat Algorithm

    Directory of Open Access Journals (Sweden)

    Zhiwei Ye

    2018-03-01

    Image segmentation is a significant step in image analysis and computer vision. Many entropy-based approaches have been presented on this topic; among them, Tsallis entropy is one of the best performing methods. However, 1D Tsallis entropy does not make use of the spatial correlation information within the neighborhood, so results might be ruined by noise. Therefore, 2D Tsallis entropy is proposed to solve the problem, and results are compared with 1D Fisher, 1D maximum entropy, 1D cross entropy, 1D Tsallis entropy, fuzzy entropy, 2D Fisher, 2D maximum entropy and 2D cross entropy. On the other hand, due to the huge computational costs, meta-heuristic algorithms like the genetic algorithm (GA), particle swarm optimization (PSO), the ant colony optimization algorithm (ACO) and the differential evolution algorithm (DE) are used to accelerate the 2D Tsallis entropy thresholding method. In this paper, considering 2D Tsallis entropy as a constrained optimization problem, the optimal thresholds are acquired by maximizing the objective function using a modified chaotic Bat algorithm (MCBA). The proposed algorithm has been tested on some actual and infrared images. The results are compared with those of PSO, GA, ACO and DE, and demonstrate that the proposed method outperforms the other approaches involved in the paper, making it a feasible and effective option for image segmentation.

  13. Entropy-based implied volatility and its information content

    NARCIS (Netherlands)

    X. Xiao (Xiao); C. Zhou (Chen)

    2016-01-01

    This paper investigates the maximum entropy approach to estimating implied volatility. The entropy approach also allows one to measure option-implied skewness and kurtosis nonparametrically, and to construct confidence intervals. Simulations show that the entropy approach outperforms

  14. Inhomogeneity of epidemic spreading with entropy-based infected clusters.

    Science.gov (United States)

    Wen-Jie, Zhou; Xing-Yuan, Wang

    2013-12-01

    Considering the difference in the sizes of the infected clusters in dynamic complex networks, a normalized entropy based on infected clusters (δ*) is proposed to characterize the inhomogeneity of epidemic spreading. δ* gives information on the variability of the infected clusters in the system. We investigate the variation in the inhomogeneity of the distribution of the epidemic with the absolute velocity v of the moving agents, the infection density ρ, and the interaction radius r. By comparing δ* in the dynamic networks with δH* in the homogeneous mode, the simulation experiments show that the inhomogeneity of epidemic spreading becomes smaller with increasing v, ρ and r.
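
    One plausible reading of a normalized entropy over infected-cluster sizes is sketched below; the paper's exact definition of δ* is not given in this record:

      import numpy as np

      def normalized_cluster_entropy(cluster_sizes):
          """Normalized entropy of the infected-cluster size distribution:
          1 when all clusters are equal-sized, lower when inhomogeneous."""
          s = np.asarray(cluster_sizes, dtype=float)
          w = s / s.sum()
          h = -np.sum(w * np.log(w))
          return h / np.log(len(s))

      print(normalized_cluster_entropy([10, 10, 10, 10]))  # 1.0, homogeneous
      print(normalized_cluster_entropy([37, 1, 1, 1]))     # well below 1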

  15. Towards an entropy-based detached-eddy simulation

    Science.gov (United States)

    Zhao, Rui; Yan, Chao; Li, XinLiang; Kong, WeiXuan

    2013-10-01

    A concept of entropy increment ratio (s¯) is introduced for compressible turbulence simulation through a series of direct numerical simulations (DNS). s¯ represents the dissipation rate per unit mechanical energy, with the benefit of independence from the freestream Mach number. Based on this feature, we construct the shielding function f_s to describe the boundary layer region and propose an entropy-based detached-eddy simulation method (SDES). This approach follows the spirit of delayed detached-eddy simulation (DDES) proposed by Spalart et al. in 2005, but it exhibits much better behavior when their performances are compared in the following flows: pure attached flow with a thick boundary layer (a supersonic flat-plate flow with high Reynolds number), fully separated flow (the supersonic base flow), and separated-reattached flow (the supersonic cavity-ramp flow). The Reynolds-averaged Navier-Stokes (RANS) resolved region is reliably preserved, and the modeled stress depletion (MSD) phenomenon which is inherent in DES and DDES is partly alleviated. Moreover, this new hybrid strategy is simple and general, making it applicable to other models related to boundary layer predictions.

  16. Entropy based classifier for cross-domain opinion mining

    Directory of Open Access Journals (Sweden)

    Jyoti S. Deshmukh

    2018-01-01

    In recent years, the growth of social networks has increased the interest of people in analyzing reviews and opinions about products before they buy them. Consequently, this has given rise to domain adaptation as a prominent area of research in sentiment analysis. A classifier trained on one domain often gives poor results on data from another domain, since the expression of sentiment differs in every domain, and labeling each domain separately is very costly as well as time-consuming. Therefore, this study proposes an approach that extracts and classifies opinion words from one domain, called the source domain, and predicts opinion words of another domain, called the target domain, using a semi-supervised approach which combines modified maximum entropy and bipartite graph clustering. A comparison of opinion classification on reviews from four different product domains is presented. The results demonstrate that the proposed method performs relatively well in comparison to the other methods. Comparison with SentiWordNet on domain-specific and domain-independent words reveals that, on average, 72.6% and 88.4% of words, respectively, are correctly classified.

  17. Feedback structure based entropy approach for multiple-model estimation

    Institute of Scientific and Technical Information of China (English)

    Shen-tu Han; Xue Anke; Guo Yunfei

    2013-01-01

    The variable-structure multiple-model (VSMM) approach, one of the multiple-model (MM) methods, is a popular and effective approach for handling problems with mode uncertainties. Model sequence set adaptation (MSA) is the key to designing a better VSMM. However, MSA methods in the literature leave considerable room for improvement, both theoretically and practically. To this end, we propose a feedback-structure-based entropy approach that can find the model sequence sets with the smallest size under certain conditions. The filtered data are fed back in real time and can be used by the minimum entropy (ME) based VSMM algorithms, i.e., MEVSMM. Firstly, full Markov chains are used to achieve optimal solutions. Secondly, the myopic method together with a particle filter (PF) and the challenge match algorithm are also used to achieve sub-optimal solutions, a trade-off between practicability and optimality. The numerical results show that the proposed algorithm provides not only refined model sets but also a good robustness margin and very high accuracy.

  18. Multiscale Permutation Entropy Based Rolling Bearing Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Jinde Zheng

    2014-01-01

    A new rolling bearing fault diagnosis approach based on multiscale permutation entropy (MPE), Laplacian score (LS), and support vector machines (SVMs) is proposed in this paper. Permutation entropy (PE) was recently proposed and defined to measure the randomness and detect dynamical changes of time series. However, given the complexity of mechanical systems, the randomness and dynamic changes of the vibration signal exist at different scales. Thus, the definition of MPE is introduced and employed to extract the nonlinear fault characteristics from the bearing vibration signal at different scales. Besides, the SVM is utilized to accomplish the fault feature classification and fulfill the diagnostic procedure automatically. Meanwhile, in order to avoid a high dimension of features, the Laplacian score (LS) is used to refine the feature vector by ranking the features according to their importance and correlations with the main fault information. Finally, the rolling bearing fault diagnosis method based on MPE, LS, and SVM is proposed and applied to experimental data. The analysis results indicate that the proposed method can identify the fault categories effectively.
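
    A minimal sketch of permutation entropy together with the usual coarse-graining step used in its multiscale extension; the Laplacian score feature selection and SVM classification stages of the proposed method are omitted:

      import numpy as np
      from itertools import permutations
      from math import factorial

      def permutation_entropy(x, order=3, delay=1):
          """Normalized Bandt-Pompe permutation entropy of a 1D series."""
          n = len(x) - (order - 1) * delay
          patterns = {p: 0 for p in permutations(range(order))}
          for i in range(n):
              window = x[i:i + order * delay:delay]
              patterns[tuple(np.argsort(window))] += 1
          p = np.array([c for c in patterns.values() if c > 0]) / n
          return -np.sum(p * np.log(p)) / np.log(factorial(order))

      def coarse_grain(x, scale):
          """Non-overlapping averages used by the multiscale procedure."""
          n = len(x) // scale
          return x[:n * scale].reshape(n, scale).mean(axis=1)

      rng = np.random.default_rng(4)
      noise = rng.standard_normal(3000)                   # irregular signal
      regular = np.sin(2 * np.pi * np.arange(3000) / 50)  # deterministic signal
      for scale in (1, 2, 5):
          print(scale,
                permutation_entropy(coarse_grain(noise, scale)),
                permutation_entropy(coarse_grain(regular, scale)))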

  19. Comparison of bispectral index and entropy monitoring in patients undergoing internalisation of deep brain stimulators

    Directory of Open Access Journals (Sweden)

    Suparna Bharadwaj

    2016-01-01

    Introduction: Depth of anaesthesia (DOA) monitors are shown to reduce the intra-operative dose of anaesthetic agents, provide haemodynamic stability and shorten emergence times. Electroencephalography (EEG) based DOA monitors such as the bispectral index (BIS) and entropy have been calibrated and validated in healthy subjects. Hence, the clinical effectiveness of these monitors may be affected when monitoring patients with neurological disorders (e.g., epilepsy, dystonia, dementia and Parkinson's disease). The aim of this study was to determine whether BIS and entropy correlate with each other and with clinical indices of DOA in patients with movement disorders under general anaesthesia (GA). Materials and Methods: We conducted a prospective, observational study in patients with movement disorders undergoing internalization of deep brain stimulators. All patients received standard GA with an age-adjusted mean alveolar concentration (aaMAC) of an inhalational agent between 0.7 and 1.1. BIS and entropy sensors were applied on the patient's left forehead. Data collected included clinical parameters and EEG-based DOA indices. Correlation analysis was performed between entropy, BIS and the clinical indices of DOA. Bland-Altman analysis was performed to determine the agreement between BIS and entropy. Results: Thirty patients were studied (mean age 58.4 ± 11 years, male:female 18:12, weight 79.2 ± 17 kg). Indications for deep brain stimulation were Parkinson's disease (n = 25), essential tremors (n = 2) and dystonia (n = 3). There was a very strong positive correlation between BIS and response entropy (RE) (r = 0.932) and between BIS and state entropy (SE) (r = 0.950), and a strong negative correlation among aaMAC and BIS, RE and SE, with r values of −0.686, −0.788 and −0.732, respectively. However, there was no correlation between BIS, RE, SE and haemodynamic values. Conclusion: Our study showed that BIS and entropy perform well in patients with movement disorders

  20. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring.

    Science.gov (United States)

    Su, Cui; Liang, Zhenhu; Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Rényi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of tracking the dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among MSPE measures. CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of EEG recordings and statistically distinguish the awake state, unconsciousness and the recovery of consciousness (RoC) state. Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed of the loss of consciousness. MA-based multiscale permutation entropies have the potential for on-line anesthesia EEG analysis, given their simple computation and sensitivity to drug effect changes. CG-based multiscale permutation

  1. Characterization of complexity in the electroencephalograph activity of Alzheimer's disease based on fuzzy entropy.

    Science.gov (United States)

    Cao, Yuzhen; Cai, Lihui; Wang, Jiang; Wang, Ruofan; Yu, Haitao; Cao, Yibin; Liu, Jing

    2015-08-01

    In this paper, experimental neurophysiologic recording and statistical analysis are combined to investigate the nonlinear characteristics and the cognitive function of the brain. Fuzzy approximate entropy and fuzzy sample entropy are applied to characterize model-based simulated series and electroencephalograph (EEG) series of Alzheimer's disease (AD). The effectiveness and advantages of these two kinds of fuzzy entropy are first verified through the simulated EEG series generated by the alpha rhythm model, including stronger relative consistency and robustness. Furthermore, in order to detect the abnormality of irregularity and chaotic behavior in the AD brain, the complexity features based on these two fuzzy entropies are extracted in the delta, theta, alpha, and beta bands. It is demonstrated that, due to the introduction of fuzzy set theory, the fuzzy entropies can better distinguish the EEG signals of AD from those of normal subjects than approximate entropy and sample entropy. Moreover, the entropy values of AD are significantly decreased in the alpha band, particularly in the temporal brain region, such as at electrodes T3 and T4. In addition, fuzzy sample entropy achieves higher group differences in different brain regions and a higher average classification accuracy of 88.1% with a support vector machine classifier. The obtained results prove that fuzzy sample entropy may be a powerful tool to characterize the complexity abnormalities of AD, which could be helpful in further understanding of the disease.

  2. An Alternative to Chaid Segmentation Algorithm Based on Entropy.

    Directory of Open Access Journals (Sweden)

    María Purificación Galindo Villardón

    2010-07-01

    The CHAID (Chi-Squared Automatic Interaction Detection) tree-based segmentation technique has been found to be an effective approach for obtaining meaningful segments that are predictive of a K-category (nominal or ordinal) criterion variable. CHAID was designed to detect, in an automatic way, the interaction between several categorical or ordinal predictors in explaining a categorical response, but this may not be true when Simpson's paradox is present. This is due to the fact that CHAID is a forward selection algorithm based on the marginal counts. In this paper we propose a backwards elimination algorithm that starts with the full set of predictors (or full tree) and eliminates predictors progressively. The elimination procedure is based on conditional independence tests using the concept of entropy. The proposed procedure is compared to CHAID.

  3. Scale-invariant entropy-based theory for dynamic ordering

    International Nuclear Information System (INIS)

    Mahulikar, Shripad P.; Kumari, Priti

    2014-01-01

    Dynamically ordered, self-organized dissipative structures exist in various forms and at different scales. This investigation first introduces the concept of an isolated embedding system, which embeds an open system, e.g., a dissipative structure and its mass and/or energy exchange with its surroundings. Thereafter, a scale-invariant theoretical analysis is presented using thermodynamic principles for Order creation, existence, and destruction. The sustainability criterion for Order existence, based on its structured mass and/or energy interactions with the surroundings, is mathematically defined. This criterion forms the basis for the interrelationship of physical parameters during the sustained existence of dynamic Order. It is shown that the sufficient condition for dynamic Order existence is approached if its sustainability criterion is met, i.e., its destruction path is blocked. This scale-invariant approach has the potential to unify the physical understanding of universal dynamic ordering based on entropy considerations.

  4. Some Comments on the Entropy-Based Criteria for Piping

    Directory of Open Access Journals (Sweden)

    Emöke Imre

    2015-04-01

    This paper is an extension of previous work which characterises soil behaviours using the grading entropy diagram. The present work looks at the piping process in granular soils, by considering some new data from flood-protection dikes. The piping process is divided into three parts here: particle movement at the micro scale to segregate free water; sand boil development (which is the initiation of the pipe); and pipe growth. In the first part of the process, which occurs during the rising flood, the increase in shear stress along the dike base may cause segregation of water into micro pipes if the subsoil in the dike base is relatively loose. This occurs at the maximum dike base shear stress level (ratio of shear stress and strength) zone, which is close to the toe. In the second part of the process, the shear strain increment causes a sudden, asymmetric slide and cracking of the dike, leading to localized excess pore pressure, liquefaction and the formation of a sand boil. In the third part of the process, the soil erosion initiated through the sand boil continues, and the pipe grows. The piping in the Hungarian dikes often occurs in a two-layer system, where the base layer is coarser with higher permeability and the cover layer is finer with lower permeability. The new data presented here show that the soils ejected from the sand boils are generally silty sands and sands, which are prone to both erosion (on the basis of the entropy criterion) and liquefaction. They originate from the cover layer, which is basically identical to the soil used in the Dutch backward erosion experiments.

  5. IN-cross Entropy Based MAGDM Strategy under Interval Neutrosophic Set Environment

    Directory of Open Access Journals (Sweden)

    Shyamal Dalapati

    2017-12-01

    The cross entropy measure is one of the best ways to calculate the divergence of a variable from a prior variable. We define a new cross entropy measure under the interval neutrosophic set (INS) environment, which we call the IN-cross entropy measure, and prove its basic properties. We also develop a weighted IN-cross entropy measure and investigate its basic properties. Based on the weighted IN-cross entropy measure, we develop a novel strategy for multi-attribute group decision making (MAGDM) under the interval neutrosophic environment. The proposed multi-attribute group decision making strategy is compared with the existing cross entropy measure based strategies in the literature under the interval neutrosophic set environment. Finally, an illustrative example of a multi-attribute group decision making problem is solved to show the feasibility, validity and efficiency of the proposed MAGDM strategy.

  6. Refined multiscale fuzzy entropy based on standard deviation for biomedical signal analysis.

    Science.gov (United States)

    Azami, Hamed; Fernández, Alberto; Escudero, Javier

    2017-11-01

    Multiscale entropy (MSE) has been a prevalent algorithm to quantify the complexity of biomedical time series. Recent developments in the field have tried to alleviate the problem of undefined MSE values for short signals. Moreover, there has been a recent interest in using statistical moments other than the mean, e.g., the variance, in the coarse-graining step of the MSE. Building on these trends, here we introduce the so-called refined composite multiscale fuzzy entropy based on the standard deviation (RCMFEσ) and mean (RCMFEμ) to quantify the dynamical properties of spread and mean, respectively, over multiple time scales. We demonstrate the dependency of the RCMFEσ and RCMFEμ, in comparison with other multiscale approaches, on several straightforward signal processing concepts using a set of synthetic signals. The results evidenced that the RCMFEσ and RCMFEμ values are more stable and reliable than the classical multiscale entropy ones. We also inspect the ability of using the standard deviation as well as the mean in the coarse-graining process using magnetoencephalograms in Alzheimer's disease and publicly available electroencephalograms recorded from focal and non-focal areas in epilepsy. Our results indicated that when the RCMFEμ cannot distinguish different types of dynamics of a particular time series at some scale factors, the RCMFEσ may do so, and vice versa. The results showed that RCMFEσ-based features lead to higher classification accuracies in comparison with the RCMFEμ-based ones. We also made freely available all the Matlab codes used in this study at http://dx.doi.org/10.7488/ds/1477.
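
    To make the coarse-graining distinction concrete, the sketch below contrasts mean-based (μ-style) and standard-deviation-based (σ-style) coarse-graining, together with the composite trick of averaging over all shifted starting points. This is a hedged illustration, not the authors' released Matlab code (linked in the abstract): the function names and the `entropy_fn` placeholder are mine, and the paper's refined composite variant averages match counts before taking logarithms, whereas this sketch averages the entropies for brevity.

    ```python
    import numpy as np

    def coarse_grain(x, scale, stat="mean"):
        # Non-overlapping windows of length `scale`, summarized by the mean
        # (classical MSE) or by the standard deviation (the sigma variant).
        n = len(x) // scale
        w = np.reshape(np.asarray(x[:n * scale], dtype=float), (n, scale))
        return w.mean(axis=1) if stat == "mean" else w.std(axis=1)

    def composite_entropy(x, scale, entropy_fn, stat="mean"):
        # "Composite" refinement: average the entropy over the `scale`
        # possible window offsets instead of using a single coarse series.
        vals = [entropy_fn(coarse_grain(x[k:], scale, stat)) for k in range(scale)]
        return float(np.mean(vals))
    ```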

  7. LIBOR troubles: Anomalous movements detection based on maximum entropy

    Science.gov (United States)

    Bariviera, Aurelio F.; Martín, María T.; Plastino, Angelo; Vampa, Victoria

    2016-05-01

    According to the definition of the London Interbank Offered Rate (LIBOR), contributing banks should give fair estimates of their own borrowing costs in the interbank market. Between 2007 and 2009, several banks made inappropriate submissions of LIBOR, sometimes motivated by profit-seeking from their trading positions. In 2012, several newspaper articles began to cast doubt on LIBOR integrity, leading surveillance authorities to conduct investigations into banks' behavior. Such procedures resulted in severe fines imposed on the banks involved, which acknowledged their inappropriate financial conduct. In this paper, we uncover such unfair behavior by using a forecasting method based on the Maximum Entropy principle. Our results are robust against changes in parameter settings and could be of great help for market surveillance.

  8. Entropy-Based Block Processing for Satellite Image Registration

    Directory of Open Access Journals (Sweden)

    Ikhyun Lee

    2012-11-01

    Full Text Available Image registration is an important task in many computer vision applications such as fusion systems, 3D shape recovery and earth observation. Registering satellite images is particularly challenging and time-consuming due to limited resources and large image sizes. In such a scenario, state-of-the-art image registration methods such as the scale-invariant feature transform (SIFT) may not be suitable due to high processing time. In this paper, we propose an algorithm based on block processing via entropy to register satellite images. The performance of the proposed method is evaluated using different real images. The comparative analysis shows that it not only reduces the processing time but also enhances the accuracy.
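
    As a rough illustration of ranking image regions by information content, the sketch below computes the Shannon entropy of each block of a grayscale image; high-entropy (textured) blocks would then be preferred for feature matching. The block size and histogram binning here are arbitrary choices for illustration, not values taken from the paper.

    ```python
    import numpy as np

    def block_entropies(img, block=32, bins=64):
        # Shannon entropy of the gray-level histogram of each block;
        # textured (information-rich) blocks score high, flat ones low.
        h, w = img.shape
        ent = np.zeros((h // block, w // block))
        for i in range(h // block):
            for j in range(w // block):
                patch = img[i * block:(i + 1) * block, j * block:(j + 1) * block]
                p, _ = np.histogram(patch, bins=bins, range=(0, 256))
                p = p[p > 0] / p.sum()
                ent[i, j] = -np.sum(p * np.log2(p))
        return ent
    ```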

  9. Global sensitivity analysis for fuzzy inputs based on the decomposition of fuzzy output entropy

    Science.gov (United States)

    Shi, Yan; Lu, Zhenzhou; Zhou, Yicheng

    2018-06-01

    To analyse the component of fuzzy output entropy, a decomposition method of fuzzy output entropy is first presented. After the decomposition of fuzzy output entropy, the total fuzzy output entropy can be expressed as the sum of the component fuzzy entropy contributed by fuzzy inputs. Based on the decomposition of fuzzy output entropy, a new global sensitivity analysis model is established for measuring the effects of uncertainties of fuzzy inputs on the output. The global sensitivity analysis model can not only tell the importance of fuzzy inputs but also simultaneously reflect the structural composition of the response function to a certain degree. Several examples illustrate the validity of the proposed global sensitivity analysis, which is a significant reference in engineering design and optimization of structural systems.

  10. Entropy-based critical reaction time for mixing-controlled reactive transport

    DEFF Research Database (Denmark)

    Chiogna, Gabriele; Rolle, Massimo

    2017-01-01

    Entropy-based metrics, such as the dilution index, have been proposed to quantify dilution and reactive mixing in solute transport problems. In this work, we derive the transient advection dispersion equation for the entropy density of a reactive plume. We restrict our analysis to the case where the concentration distribution of the transported species is Gaussian, and we observe that, even in the case of an instantaneous complete bimolecular reaction, dilution caused by dispersive processes dominates the entropy balance at early times and results in the net increase of the entropy density of a reactive species…

  11. Multi-Level Wavelet Shannon Entropy-Based Method for Single-Sensor Fault Location

    Directory of Open Access Journals (Sweden)

    Qiaoning Yang

    2015-10-01

    Full Text Available In actual applications, sensors are prone to failure because of harsh environments, battery drain, and sensor aging. Sensor fault location is an important step for follow-up sensor fault detection. In this paper, two new multi-level wavelet Shannon entropies (multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy) are defined. They take full advantage of the sensor fault frequency distribution and energy distribution across multiple subbands in the wavelet domain. Based on the multi-level wavelet Shannon entropy, a method is proposed for single sensor fault location. The method first uses a criterion of maximum energy-to-Shannon-entropy ratio to select the appropriate wavelet base for signal analysis. Then multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy are used to locate the fault. The method is validated using practical chemical gas concentration data from a gas sensor array. Compared with wavelet time Shannon entropy and wavelet energy Shannon entropy, the experimental results demonstrate that the proposed method can achieve accurate location of a single sensor fault and has good anti-noise ability. The proposed method is feasible and effective for single-sensor fault location.
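
    The wavelet-base selection step can be pictured as follows: decompose the signal with each candidate wavelet and keep the one maximizing the ratio of total subband energy to the Shannon entropy of the normalized energy distribution. This is a hedged sketch of that common criterion using PyWavelets; the paper's multi-level time and time-energy entropies are more elaborate, and the candidate list and decomposition depth below are assumptions.

    ```python
    import numpy as np
    import pywt

    def energy_entropy_ratio(signal, wavelet, level=4):
        # Energy-to-Shannon-entropy ratio over the wavelet subbands.
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        p = energies / energies.sum()
        entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
        return energies.sum() / entropy

    # x is a sensor signal; the candidate wavelets are illustrative.
    # best = max(['db2', 'db4', 'sym5', 'coif3'],
    #            key=lambda w: energy_entropy_ratio(x, w))
    ```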

  12. Coherence and entanglement measures based on Rényi relative entropies

    International Nuclear Information System (INIS)

    Zhu, Huangjun; Hayashi, Masahito; Chen, Lin

    2017-01-01

    We study systematically resource measures of coherence and entanglement based on Rényi relative entropies, which include the logarithmic robustness of coherence, geometric coherence, and conventional relative entropy of coherence together with their entanglement analogues. First, we show that each Rényi relative entropy of coherence is equal to the corresponding Rényi relative entropy of entanglement for any maximally correlated state. By virtue of this observation, we establish a simple operational connection between entanglement measures and coherence measures based on Rényi relative entropies. We then prove that all these coherence measures, including the logarithmic robustness of coherence, are additive. Accordingly, all these entanglement measures are additive for maximally correlated states. In addition, we derive analytical formulas for Rényi relative entropies of entanglement of maximally correlated states and bipartite pure states, which reproduce a number of classic results on the relative entropy of entanglement and logarithmic robustness of entanglement in a unified framework. Several nontrivial bounds for Rényi relative entropies of coherence (entanglement) are further derived, which improve over results known previously. Moreover, we determine all states whose relative entropy of coherence is equal to the logarithmic robustness of coherence. As an application, we provide an upper bound for the exact coherence distillation rate, which is saturated for pure states. (paper)
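
    For reference, two standard quantum Rényi relative entropies underlie measures of this kind; since the abstract does not specify which family is used, both the Petz version and the sandwiched version are recalled here. Both reduce to the ordinary relative entropy as α → 1.

    ```latex
    D_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1}\,
      \log \operatorname{Tr}\!\left(\rho^{\alpha}\sigma^{1-\alpha}\right),
    \qquad
    \widetilde{D}_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1}\,
      \log \operatorname{Tr}\!\left[\left(\sigma^{\frac{1-\alpha}{2\alpha}}
      \rho\,\sigma^{\frac{1-\alpha}{2\alpha}}\right)^{\alpha}\right]
    ```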

  13. Parameters Tuning of Model Free Adaptive Control Based on Minimum Entropy

    Institute of Scientific and Technical Information of China (English)

    Chao Ji; Jing Wang; Liulin Cao; Qibing Jin

    2014-01-01

    Dynamic linearization based model-free adaptive control (MFAC) algorithms have been widely used in practical systems, in which some parameters should be tuned before they are successfully applied to process industries. Considering the random noise existing in real processes, a parameter tuning method based on minimum entropy optimization is proposed, and the feature of entropy is used to accurately describe the system uncertainty. For cases of Gaussian stochastic noise and non-Gaussian stochastic noise, an entropy recursive optimization algorithm is derived based on an approximate model or an identified model. The extensive simulation results show the effectiveness of the minimum entropy optimization for the partial form dynamic linearization based MFAC. The parameters tuned by the minimum entropy optimization index show stronger stability and more robustness than those tuned by other traditional indices, such as the integral of the squared error (ISE) or the integral of time-weighted absolute error (ITAE), when system stochastic noise exists.

  14. Properties of Risk Measures of Generalized Entropy in Portfolio Selection

    Directory of Open Access Journals (Sweden)

    Rongxi Zhou

    2017-12-01

    Full Text Available This paper systematically investigates the properties of six kinds of entropy-based risk measures: Information Entropy and Cumulative Residual Entropy in the probability space, Fuzzy Entropy, Credibility Entropy and Sine Entropy in the fuzzy space, and Hybrid Entropy in the hybridized uncertainty of both fuzziness and randomness. We discover that none of the risk measures satisfy all six of the following properties, which various scholars have associated with effective risk measures: Monotonicity, Translation Invariance, Sub-additivity, Positive Homogeneity, Consistency and Convexity. The measures based on Fuzzy Entropy, Credibility Entropy, and Sine Entropy all exhibit the same properties: Sub-additivity, Positive Homogeneity, Consistency, and Convexity. The measures based on Information Entropy and Hybrid Entropy, meanwhile, only exhibit Sub-additivity and Consistency. Cumulative Residual Entropy satisfies just Sub-additivity, Positive Homogeneity, and Convexity. After identifying these properties, we develop seven portfolio models based on the different risk measures and make empirical comparisons using samples from both the Shenzhen Stock Exchange of China and the New York Stock Exchange of America. The comparisons show that the Mean Fuzzy Entropy Model performs the best among the seven models with respect to both daily returns and relative cumulative returns. Overall, these results could provide an important reference for both constructing effective risk measures and rationally selecting the appropriate risk measure under different portfolio selection conditions.

  15. Fundamental limits on quantum dynamics based on entropy change

    Science.gov (United States)

    Das, Siddhartha; Khatri, Sumeet; Siopsis, George; Wilde, Mark M.

    2018-01-01

    It is well known in the realm of quantum mechanics and information theory that the entropy is non-decreasing for the class of unital physical processes. However, in general, the entropy does not exhibit monotonic behavior. This has restricted the use of entropy change in characterizing evolution processes. Recently, a lower bound on the entropy change was provided in the work of Buscemi, Das, and Wilde [Phys. Rev. A 93(6), 062314 (2016)]. We explore the limit that this bound places on the physical evolution of a quantum system and discuss how these limits can be used as witnesses to characterize quantum dynamics. In particular, we derive a lower limit on the rate of entropy change for memoryless quantum dynamics, and we argue that it provides a witness of non-unitality. This limit on the rate of entropy change leads to definitions of several witnesses for testing memory effects in quantum dynamics. Furthermore, from the aforementioned lower bound on entropy change, we obtain a measure of non-unitarity for unital evolutions.

  16. Upper entropy axioms and lower entropy axioms

    International Nuclear Information System (INIS)

    Guo, Jin-Li; Suo, Qi

    2015-01-01

    The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely, upper entropy axioms, inspired by axioms of metric spaces, and also formulate lower entropy axioms. We also develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of Shannon–Khinchin axioms and Tsallis axioms, while these conditions are stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case of satisfying our axioms. Moreover, different forms of information measures, such as Shannon entropy, Daroczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics

  17. Towards an entropy-based analysis of log variability

    DEFF Research Database (Denmark)

    Back, Christoffer Olling; Debois, Søren; Slaats, Tijs

    2017-01-01

    the development of hybrid miners: given a (sub-)log, can we determine a priori whether the log is best suited for imperative or declarative mining? We propose using the concept of entropy, commonly used in information theory. We consider different measures for entropy that could be applied and show through experimentation on both synthetic and real-life logs that these entropy measures do indeed give insights into the complexity of the log and can act as an indicator of which mining paradigm should be used.

  18. Towards an Entropy-based Analysis of Log Variability

    DEFF Research Database (Denmark)

    Back, Christoffer Olling; Debois, Søren; Slaats, Tijs

    2018-01-01

    the development of hybrid miners: given a log, can we determine a priori whether the log is best suited for imperative or declarative mining? We propose using the concept of entropy, commonly used in information theory. We consider different measures for entropy that could be applied and show through experimentation on both synthetic and real-life logs that these entropy measures do indeed give insights into the complexity of the log and can act as an indicator of which mining paradigm should be used.

  19. Symbolic transfer entropy-based premature signal analysis

    International Nuclear Information System (INIS)

    Wang Jun; Yu Zheng-Feng

    2012-01-01

    In this paper, we use symbolic transfer entropy to study the coupling strength between premature signals. Numerical experiments show that the three types of signal couplings are in the same direction. Among them, normal signal coupling is the strongest, followed by that of premature ventricular contractions, while that of atrial premature beats is the weakest. A t-test shows that the entropies of the three signals are distinct. Symbolic transfer entropy requires less data, can distinguish the three types of signals, and has very good computational efficiency. (interdisciplinary physics and related areas of science and technology)
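
    A minimal plug-in estimator of symbolic transfer entropy along these lines might look as follows; the ordinal-pattern length m = 3 and base-2 logarithms are assumptions for illustration, not parameters reported in the abstract.

    ```python
    import numpy as np
    from collections import Counter

    def symbolize(x, m=3):
        # Replace each length-m window by its ordinal (permutation) pattern.
        return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

    def symbolic_transfer_entropy(x, y, m=3):
        # T(Y -> X) = sum p(x1, x0, y0) * log2[ p(x1 | x0, y0) / p(x1 | x0) ],
        # estimated with plug-in (relative frequency) probabilities.
        sx, sy = symbolize(x, m), symbolize(y, m)
        n = min(len(sx), len(sy)) - 1
        triples = Counter((sx[i + 1], sx[i], sy[i]) for i in range(n))
        xx = Counter((sx[i + 1], sx[i]) for i in range(n))
        xy = Counter((sx[i], sy[i]) for i in range(n))
        x0 = Counter(sx[i] for i in range(n))
        return sum((c / n) * np.log2(c * x0[b] / (xx[(a, b)] * xy[(b, g)]))
                   for (a, b, g), c in triples.items())
    ```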

  20. Calculating the Entropy of Solid and Liquid Metals, Based on Acoustic Data

    Science.gov (United States)

    Tekuchev, V. V.; Kalinkin, D. P.; Ivanova, I. V.

    2018-05-01

    The entropies of iron, cobalt, rhodium, and platinum are studied for the first time, based on acoustic data and using the Debye theory and rigid-sphere model, from 298 K up to the boiling point. A formula for the melting entropy of metals is validated. Good agreement between the research results and the literature data is obtained.

  1. Damage detection in rotating machinery by means of entropy-based parameters

    Science.gov (United States)

    Tocarciuc, Alexandru; Bereteu, Liviu; Drăgănescu, Gheorghe Eugen

    2014-11-01

    This paper proposes two new entropy-based parameters, namely the Renyi Entropy Index (REI) and the Sharma-Mittal Entropy Index (SMEI), for detecting the presence of failures (or damage) in rotating machinery, namely: belt structural damage, belt wheel misalignment, failure of the bolt fixing the machine to its baseplate, and eccentricities (e.g., due to the detachment of a small piece of material or bad mounting of the rotating components of the machine). The algorithms to obtain the proposed entropy-based parameters are described, and test data are used in order to assess their sensitivity. A vibration test bench is used for measuring the levels of vibration while artificially inducing damage. The deviation of the two entropy-based parameters is compared in two states of the vibration test bench: not damaged and damaged. At the end of the study, their sensitivity is compared to that of the Shannon entropy index.

  2. Direct comparison of phase-sensitive vibrational sum frequency generation with maximum entropy method: case study of water.

    Science.gov (United States)

    de Beer, Alex G F; Samson, Jean-Sebastièn; Hua, Wei; Huang, Zishuai; Chen, Xiangke; Allen, Heather C; Roke, Sylvie

    2011-12-14

    We present a direct comparison of phase sensitive sum-frequency generation experiments with phase reconstruction obtained by the maximum entropy method. We show that both methods lead to the same complex spectrum. Furthermore, we discuss the strengths and weaknesses of each of these methods, analyzing possible sources of experimental and analytical errors. A simulation program for maximum entropy phase reconstruction is available at: http://lbp.epfl.ch/. © 2011 American Institute of Physics

  3. Comparison of Bispectral Index and Entropy values with electroencephalogram during surgical anaesthesia with sevoflurane.

    Science.gov (United States)

    Aho, A J; Kamata, K; Jäntti, V; Kulkas, A; Hagihira, S; Huhtala, H; Yli-Hankala, A

    2015-08-01

    Concomitantly recorded Bispectral Index® (BIS) and Entropy™ values sometimes show discordant trends during general anaesthesia. Previously, no attempt had been made to discover which EEG characteristics cause discrepancies between BIS and Entropy. We compared BIS and Entropy values, and analysed the changes in the raw EEG signal during surgical anaesthesia with sevoflurane. In this prospective, open-label study, 65 patients receiving general anaesthesia with sevoflurane were enrolled. BIS, Entropy and multichannel digital EEG were recorded. Concurrent BIS and State Entropy (SE) values were selected. Whenever BIS and SE values showed ≥10-unit disagreement for ≥60 s, the raw EEG signal was analysed both in time and frequency domain. A ≥10-unit disagreement ≥60 s was detected 428 times in 51 patients. These 428 episodes accounted for 5158 (11%) out of 45 918 analysed index pairs. During EEG burst suppression, SE was higher than BIS in 35 out of 49 episodes. During delta-theta dominance, BIS was higher than SE in 141 out of 157 episodes. During alpha or beta activity, SE was higher than BIS in all 49 episodes. During electrocautery, both BIS and SE changed, sometimes in the opposite direction, but returned to baseline values after electrocautery. Electromyography caused index disagreement four times (BIS > SE). Certain specific EEG patterns, and artifacts, are associated with discrepancies between BIS and SE. Time and frequency domain analyses of the original EEG improve the interpretation of studies involving BIS, Entropy and other EEG-based indices. NCT01077674. © The Author 2015. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. Entropy-Based Model for Interpreting Life Systems in Traditional Chinese Medicine

    Directory of Open Access Journals (Sweden)

    Guo-lian Kang

    2008-01-01

    Full Text Available Traditional Chinese medicine (TCM) treats qi as the core of the human life systems. Starting with a hypothetical correlation between TCM qi and the entropy theory, we address in this article a holistic model for evaluating and unveiling the rule of TCM life systems. Several new concepts, such as acquired life entropy (ALE), acquired life entropy flow (ALEF) and acquired life entropy production (ALEP), are propounded to interpret TCM life systems. Using the entropy theory, mathematical models are established for ALE, ALEF and ALEP, which reflect the evolution of life systems. Some criteria are given on physiological activities and pathological changes of the body in different stages of life. Moreover, a real data-based simulation shows that the life entropies of human bodies of different ages, with Cold and Hot constitutions, and in different seasons in North China coincide with the manifestations of qi as well as the life evolution in TCM descriptions. Especially, based on the comparative and quantitative analysis, the entropy-based model can nicely describe the evolution of life entropies in Cold and Hot individuals, thereby fitting the Yin–Yang theory in TCM. Thus, this work establishes a novel approach to interpret the fundamental principles in TCM, and provides an alternative understanding for the complex life systems.

  5. Application of entropy measurement technique in grey based ...

    African Journals Online (AJOL)

    For this study, four control variables are selected: current, voltage, gas flow rate and ... Keywords: Metal Inert Gas (MIG) Welding, Grey-Taguchi Method, Entropy ...... of metal inert gas welding on the corrosion and mechanical behaviour of.

  6. Sample Entropy-Based Approach to Evaluate the Stability of Double-Wire Pulsed MIG Welding

    Directory of Open Access Journals (Sweden)

    Ping Yao

    2014-01-01

    Full Text Available Based on sample entropy, this paper presents a quantitative method to evaluate the current stability in double-wire pulsed MIG welding. Firstly, the sample entropy of current signals with different stability but the same parameters is calculated. The results show that the more stable the current, the smaller the value and the standard deviation of the sample entropy. Secondly, four parameters, namely pulse width, peak current, base current, and frequency, are selected for a four-level three-factor orthogonal experiment. The calculation and analysis of the desired signals indicate that sample entropy values are affected by the welding current parameters. Then, a quantitative method based on sample entropy is proposed. The experimental results show that the method can preferably quantify the welding current stability.
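
    For readers unfamiliar with the measure, a compact sample entropy implementation is sketched below; the embedding dimension m = 2 and tolerance r = 0.2·SD are conventional defaults rather than values taken from the paper, and the sketch assumes the series is long enough for at least one template match at both lengths.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=None):
        # SampEn = -ln(A / B), where B counts pairs of length-m templates
        # within tolerance r (Chebyshev distance) and A the same for m + 1.
        x = np.asarray(x, dtype=float)
        r = 0.2 * x.std() if r is None else r
        def matches(dim):
            t = np.array([x[i:i + dim] for i in range(len(x) - dim)])
            return sum(np.sum(np.max(np.abs(t[i + 1:] - t[i]), axis=1) <= r)
                       for i in range(len(t) - 1))
        return -np.log(matches(m + 1) / matches(m))
    ```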

  7. An entropy-based analysis of lane changing behavior: An interactive approach.

    Science.gov (United States)

    Kosun, Caglar; Ozdemir, Serhan

    2017-05-19

    As a novelty, this article proposes the nonadditive entropy framework for the description of driver behaviors during lane changing. The authors also state that this entropy framework governs the lane changing behavior in traffic flow in accordance with the long-range vehicular interactions and traffic safety. The nonadditive entropy framework is the new generalized theory of thermostatistical mechanics. Vehicular interactions during lane changing are considered within this framework. The interactive approach for the lane changing behavior of the drivers is presented in the traffic flow scenarios presented in the article. According to the traffic flow scenarios, 4 categories of traffic flow and driver behaviors are obtained. Through the scenarios, comparative analyses of nonadditive and additive entropy domains are also provided. Two quadrants of the categories belong to the nonadditive entropy; the rest are involved in the additive entropy domain. Driving behaviors are extracted and the scenarios depict that nonadditivity matches safe driving well, whereas additivity corresponds to unsafe driving. Furthermore, the cooperative traffic system is considered in nonadditivity where the long-range interactions are present. However, the uncooperative traffic system falls into the additivity domain. The analyses also state that there would be possible traffic flow transitions among the quadrants. This article shows that lane changing behavior could be generalized as nonadditive, with additivity as a special case, based on the given traffic conditions. The nearest and close neighbor models are well within the conventional additive entropy framework. In this article, both the long-range vehicular interactions and safe driving behavior in traffic are handled in the nonadditive entropy domain. It is also inferred that the Tsallis entropy region would correspond to mandatory lane changing behavior, whereas additive and either the extensive or nonextensive entropy region would

  8. Wavelet Entropy-Based Traction Inverter Open Switch Fault Diagnosis in High-Speed Railways

    Directory of Open Access Journals (Sweden)

    Keting Hu

    2016-03-01

    Full Text Available In this paper, a diagnosis plan is proposed to settle the detection and isolation problem of open switch faults in the traction inverters of high-speed railway traction systems. Five entropy forms are discussed and compared with the traditional fault detection methods, namely, the discrete wavelet transform and the discrete wavelet packet transform. The traditional fault detection methods cannot efficiently detect open switch faults in traction inverters because of the low resolution or the sudden change of the current. The performances of Wavelet Packet Energy Shannon Entropy (WPESE), Wavelet Packet Energy Tsallis Entropy (WPETE) with different non-extensive parameters, Wavelet Packet Energy Shannon Entropy with a specific sub-band (WPESE3,6), Empirical Mode Decomposition Shannon Entropy (EMDESE), and Empirical Mode Decomposition Tsallis Entropy (EMDETE) with non-extensive parameters in detecting the open switch fault are evaluated by the evaluation parameter. Comparison experiments are carried out to select the best entropy form for traction inverter open switch fault detection. In addition, the DC component is adopted to isolate the failed Insulated Gate Bipolar Transistor (IGBT). The simulation experiments show that the proposed plan can diagnose single and simultaneous open switch faults correctly and in a timely manner.
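
    The WPESE feature itself is straightforward to reproduce: decompose the signal into terminal wavelet-packet subbands, normalize the subband energies, and take their Shannon entropy. The sketch below uses PyWavelets; the wavelet ('db4') and decomposition depth are assumptions, since the abstract does not list them.

    ```python
    import numpy as np
    import pywt

    def wpese(signal, wavelet='db4', level=3):
        # Shannon entropy of the normalized terminal-subband energies of a
        # wavelet packet decomposition (the Tsallis variant swaps the log).
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
        energies = np.array([np.sum(node.data ** 2)
                             for node in wp.get_level(level, order='natural')])
        p = energies / energies.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))
    ```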

  9. A comparison of different entransy flow definitions and entropy generation in thermal radiation optimization

    International Nuclear Information System (INIS)

    Zhou Bing; Cheng Xue-Tao; Liang Xin-Gang

    2013-01-01

    In thermal radiation, taking heat flow as an extensive quantity and defining the potential as temperature T or the blackbody emissive power U will lead to two different definitions of radiation entransy flow and the corresponding principles for thermal radiation optimization. The two definitions of radiation entransy flow and the corresponding optimization principles are compared in this paper. When the total heat flow is given, the optimization objectives of the extremum entransy dissipation principles (EEDPs) developed based on potentials T and U correspond to the minimum equivalent temperature difference and the minimum equivalent blackbody emissive power difference respectively. The physical meaning of the definition based on potential U is clearer than that based on potential T, but the latter one can be used for the coupled heat transfer optimization problem while the former one cannot. The extremum entropy generation principle (EEGP) for thermal radiation is also derived, which includes the minimum entropy generation principle for thermal radiation. When the radiation heat flow is prescribed, the EEGP reveals that the minimum entropy generation leads to the minimum equivalent thermodynamic potential difference, which is not the expected objective in heat transfer. Therefore, the minimum entropy generation is not always appropriate for thermal radiation optimization. Finally, three thermal radiation optimization examples are discussed, and the results show that the difference in optimization objective between the EEDPs and the EEGP leads to the difference between the optimization results. The EEDP based on potential T is more useful in practical application since its optimization objective is usually consistent with the expected one. (electromagnetism, optics, acoustics, heat transfer, classical mechanics, and fluid dynamics)

  10. Generalized composite multiscale permutation entropy and Laplacian score based rolling bearing fault diagnosis

    Science.gov (United States)

    Zheng, Jinde; Pan, Haiyang; Yang, Shubao; Cheng, Junsheng

    2018-01-01

    Multiscale permutation entropy (MPE) is a recently proposed nonlinear dynamic method for measuring the randomness and detecting the nonlinear dynamic change of time series, and it can be used effectively to extract nonlinear dynamic fault features from the vibration signals of rolling bearings. To overcome the drawback of the coarse-graining process in MPE, an improved MPE method called generalized composite multiscale permutation entropy (GCMPE) is proposed in this paper. The influence of parameters on GCMPE and its comparison with MPE are also studied by analyzing simulation data. GCMPE was applied to fault feature extraction from the vibration signal of a rolling bearing, and then, based on GCMPE, the Laplacian score for feature selection and a particle swarm optimization based support vector machine, a new fault diagnosis method for rolling bearings is put forward. Finally, the proposed method was applied to analyze experimental data of rolling bearings. The analysis results show that the proposed method can effectively realize the fault diagnosis of rolling bearings and has a higher fault recognition rate than the existing methods.
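
    The building blocks are easy to state in code. The sketch below implements plain permutation entropy and one scale of a composite coarse-graining; note that GCMPE's "generalized" coarse-graining differs in detail from the plain mean used here, so this is only an illustrative reading of the abstract, not the authors' algorithm.

    ```python
    import numpy as np
    from math import factorial
    from collections import Counter

    def permutation_entropy(x, m=4):
        # Normalized Shannon entropy of the ordinal-pattern distribution.
        c = Counter(tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1))
        p = np.array(list(c.values()), dtype=float)
        p /= p.sum()
        return -np.sum(p * np.log(p)) / np.log(factorial(m))

    def composite_mpe(x, scale, m=4):
        # Average permutation entropy over the `scale` shifted coarse-grainings.
        pes = []
        for k in range(scale):
            n = (len(x) - k) // scale
            cg = np.reshape(np.asarray(x[k:k + n * scale]), (n, scale)).mean(axis=1)
            pes.append(permutation_entropy(cg, m))
        return float(np.mean(pes))
    ```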

  11. Giant irreversible positive to large reversible negative magnetic entropy change evolution in Tb-based bulk metallic glass

    International Nuclear Information System (INIS)

    Luo Qiang; Schwarz, Bjoern; Mattern, Norbert; Eckert, Juergen

    2010-01-01

    We study the effects of amorphous structure and random anisotropy on the magnetic entropy change in a series of Tb-based amorphous alloys. The amorphous structure broadens the peak of the magnetic entropy change and facilitates the adjustment of properties. The peak magnetic entropy change above the spin freezing temperature depends first on the average magnetic moment, approximately linearly, and second on the exchange interaction and random anisotropy. Large and broad reversible negative magnetic entropy changes are observed above the spin freezing temperature, and giant positive irreversible magnetic entropy changes, which are associated with internal entropy production, are obtained well below it.

  12. Entropy-Based Economic Denial of Sustainability Detection

    Directory of Open Access Journals (Sweden)

    Marco Antonio Sotelo Monge

    2017-11-01

    Full Text Available In recent years, an important increase in the amount and impact of Distributed Denial of Service (DDoS) threats has been reported by different information security organizations. They typically target the depletion of the computational resources of the victims, hence drastically harming their operational capabilities. Inspired by these methods, Economic Denial of Sustainability (EDoS) attacks pose a similar motivation, but adapted to Cloud computing environments, where the denial is achieved by damaging the economy of both suppliers and customers. Therefore, the most common EDoS approach is making the offered services unsustainable by exploiting their auto-scaling algorithms. In order to contribute to their mitigation, this paper introduces a novel EDoS detection method based on the study of entropy variations in metrics taken into account when deciding auto-scaling actuations. Through the prediction and definition of adaptive thresholds, unexpected behaviors capable of fraudulently demanding the hiring of new resources are distinguished. To demonstrate the effectiveness of the proposal, an experimental scenario adapted to the singularities of EDoS threats and the assumptions driven by their original definition is described in depth. The preliminary results show high accuracy.
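
    In spirit, such a detector watches the entropy of an auto-scaling metric over sliding windows and raises an alert when it leaves an adaptive band. The following toy version uses a simple k-sigma band over recent window entropies; the paper's prediction-based thresholds are more sophisticated, and all names and parameters here are illustrative.

    ```python
    import numpy as np

    def window_entropy(values, bins=16):
        # Shannon entropy of one window of a metric (e.g., requests per client).
        p, _ = np.histogram(values, bins=bins)
        p = p[p > 0] / p.sum()
        return -np.sum(p * np.log2(p))

    def edos_alert(recent_entropies, current_entropy, k=3.0):
        # Adaptive threshold: flag windows whose entropy deviates from the
        # recent mean by more than k standard deviations.
        mu, sigma = np.mean(recent_entropies), np.std(recent_entropies)
        return abs(current_entropy - mu) > k * sigma
    ```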

  13. Entropy-based model for miRNA isoform analysis.

    Directory of Open Access Journals (Sweden)

    Shengqin Wang

    Full Text Available MiRNAs have been widely studied due to their important post-transcriptional regulatory roles in gene expression. Many reports have demonstrated the evidence of miRNA isoform products (isomiRs) in high-throughput small RNA sequencing data. However, the biological functions of these molecules are still not well investigated. Here, we developed a Shannon entropy-based model to estimate isomiR expression profiles of high-throughput small RNA sequencing data extracted from the miRBase webserver. By using the Kolmogorov-Smirnov statistical test (KS test), we demonstrated that the 5p and 3p miRNAs present more variants than the single-arm miRNAs. We also found that the isomiR variants, except the 3' isomiR variant, are strongly correlated with the Minimum Free Energy (MFE) of the pre-miRNA, suggesting that the intrinsic features of the pre-miRNA are among the important factors for miRNA regulation. The functional enrichment analysis showed that the miRNAs with high variation, particularly at the 5' end, are enriched in a set of critical functions, supporting that these molecules are not randomly produced. Our results provide a probabilistic framework for miRNA isoform analysis, and give functional insights into pre-miRNA processing.

  14. Entropy-Based Video Steganalysis of Motion Vectors

    Directory of Open Access Journals (Sweden)

    Elaheh Sadat Sadat

    2018-04-01

    Full Text Available In this paper, a new method is proposed for motion vector steganalysis using the entropy value and its combination with the features of the optimized motion vector. In this method, the entropy of blocks is calculated to determine their texture and the precision of their motion vectors. Then, by using fuzzy clustering, the blocks are clustered into blocks with high and low texture, while the membership degree of each block in the high-texture class indicates the texture of that block. These membership degrees are used to weight the effective features that are extracted by reconstructing the motion estimation equations. The results indicate that the use of the entropy and irregularity of each block increases the precision of the final video classification into cover and stego classes.

  15. Bayesian Maximum Entropy Based Algorithm for Digital X-ray Mammogram Processing

    Directory of Open Access Journals (Sweden)

    Radu Mutihac

    2009-06-01

    Full Text Available The basics of Bayesian statistics in inverse problems using the maximum entropy principle are summarized in connection with the restoration of positive, additive images from various types of data like X-ray digital mammograms. An efficient iterative algorithm for image restoration from large data sets, based on the conjugate gradient method and Lagrange multipliers in nonlinear optimization of a specific potential function, was developed. The point spread function of the imaging system was determined by numerical simulations of inhomogeneous breast-like tissue with microcalcification inclusions of various opacities. The processed digital and digitized mammograms proved superior to their raw counterparts in terms of contrast, resolution, noise, and visibility of details.

  16. Driver Fatigue Detection System Using Electroencephalography Signals Based on Combined Entropy Features

    Directory of Open Access Journals (Sweden)

    Zhendong Mu

    2017-02-01

    Full Text Available Driver fatigue has become one of the major causes of traffic accidents, and is a complicated physiological process. However, there is no effective method to detect driving fatigue. Electroencephalography (EEG) signals are complex, unstable, and non-linear; non-linear analysis methods, such as entropy, may be more appropriate. This study evaluates a combined entropy-based processing method of EEG data to detect driver fatigue. In this paper, 12 subjects were selected to take part in an experiment, undergoing driving training in a virtual environment under the instruction of the operator. Four types of entropies (spectrum entropy, approximate entropy, sample entropy and fuzzy entropy) were used to extract features for the purpose of driver fatigue detection. An electrode selection process and a support vector machine (SVM) classification algorithm were also proposed. The average recognition accuracy was 98.75%. Retrospective analysis of the EEG showed that the extracted features from electrodes T5, TP7, TP8 and FP1 may yield better performance. The SVM classification algorithm using a radial basis function as the kernel function obtained better results. The combined entropy-based method demonstrates good classification performance for studying driver fatigue detection.

  17. Bias-based modeling and entropy analysis of PUFs

    NARCIS (Netherlands)

    van den Berg, R.; Skoric, B.; Leest, van der V.

    2013-01-01

    Physical Unclonable Functions (PUFs) are increasingly becoming a well-known security primitive for secure key storage and anti-counterfeiting. For both applications it is imperative that PUFs provide enough entropy. The aim of this paper is to propose a new model for binary-output PUFs such as SRAM,

  18. Epileptic seizure detection using DWT-based approximate entropy, Shannon entropy and support vector machine: a case study.

    Science.gov (United States)

    Sharmila, A; Aman Raj, Suman; Shashank, Pandey; Mahalakshmi, P

    2018-01-01

    In this work, we have used a time-frequency domain analysis method called the discrete wavelet transform (DWT) technique. This method stands out compared to other proposed methods because of its algorithmic elegance and accuracy. A wavelet is a mathematical function based on time-frequency analysis in signal processing. It is particularly useful because it allows a weak signal to be recovered from a noisy signal without much distortion. Wavelet analysis works by decomposing the signal into a mathematical representation from which the components of interest can be recovered. Furthermore, we have used Shannon entropy and approximate entropy (ApEn) for extracting the complexities associated with electroencephalographic (EEG) signals. The ApEn is a suitable feature to characterise the EEGs in this study because its value drops suddenly due to the excessive synchronous discharge of neurons in the brain during epileptic activity. EEG signals are decomposed into six EEG sub-bands, namely D1-D5 and A5, using the DWT technique. Non-linear features such as ApEn and Shannon entropy are calculated from these sub-bands, and support vector machine classifiers are used for classification. This scheme is tested using EEG data recorded from five healthy subjects and five epileptic patients during the inter-ictal and ictal periods. The data are acquired from the University of Bonn, Germany. The proposed method is evaluated through 15 classification problems, and obtained a high classification accuracy of 100% for two cases, indicating the good classifying performance of the proposed method.
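
    A minimal version of the pipeline (DWT sub-bands → entropy features → SVM) can be sketched with PyWavelets and scikit-learn. The wavelet family and the per-sub-band entropy normalization below are assumptions for illustration, and ApEn would be computed per sub-band in the same loop.

    ```python
    import numpy as np
    import pywt
    from sklearn.svm import SVC

    def dwt_entropy_features(epoch, wavelet='db4', level=5):
        # One Shannon entropy value per sub-band (A5, D5, ..., D1), computed
        # from the normalized squared wavelet coefficients.
        feats = []
        for c in pywt.wavedec(epoch, wavelet, level=level):
            p = c ** 2 / np.sum(c ** 2)
            p = p[p > 0]
            feats.append(-np.sum(p * np.log2(p)))
        return feats

    # Hypothetical usage: `epochs` is a list of EEG segments, `labels` is
    # 0/1 for inter-ictal vs. ictal.
    # clf = SVC(kernel='rbf').fit([dwt_entropy_features(e) for e in epochs], labels)
    ```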

  19. Entropy Viscosity and L1-based Approximations of PDEs: Exploiting Sparsity

    Science.gov (United States)

    2015-10-23

    AFRL-AFOSR-VA-TR-2015-0337. Entropy Viscosity and L1-based Approximations of PDEs: Exploiting Sparsity. Jean-Luc Guermond, Texas A&M University. Final report, 01-07-2012 to 30-06-2015. Hyperbolic systems of conservation equations can be stabilized by using the so-called entropy viscosity method, and we proposed to investigate this new technique.

  20. Towards an information extraction and knowledge formation framework based on Shannon entropy

    Directory of Open Access Journals (Sweden)

    Iliescu Dragoș

    2017-01-01

    Full Text Available The subject of information quantity is approached in this paper, considering the specific domain of nonconforming product management as the information source. This work represents a case study. Raw data were gathered from a heavy industrial works company, with information extraction and knowledge formation being considered herein. The method used for information quantity estimation is based on the Shannon entropy formula. The information and entropy spectra are decomposed and analysed for the extraction of specific information and the formation of knowledge. The result of the entropy analysis points out the information needed to be acquired by the involved organisation, this being presented as a specific knowledge type.

  1. New Fault Recognition Method for Rotary Machinery Based on Information Entropy and a Probabilistic Neural Network.

    Science.gov (United States)

    Jiang, Quansheng; Shen, Yehu; Li, Hua; Xu, Fengyu

    2018-01-24

    Feature recognition and fault diagnosis play an important role in equipment safety and the stable operation of rotating machinery. In order to cope with the complexity of the vibration signals of rotating machinery, a feature fusion model based on information entropy and a probabilistic neural network is proposed in this paper. The new method first uses information entropy theory to extract three kinds of characteristic entropies from the vibration signals, namely, singular spectrum entropy, power spectrum entropy, and approximate entropy. Then the feature fusion model is constructed to classify and diagnose the fault signals. The proposed approach can combine comprehensive information from different aspects and is more sensitive to the fault features. The experimental results on simulated fault signals verified the better performance of our proposed approach. On real two-span rotor data, the fault detection accuracy of the new method is more than 10% higher compared with methods using the three kinds of information entropy separately. The new approach is proved to be an effective fault recognition method for rotating machinery.

  2. A two-phase copula entropy-based multiobjective optimization approach to hydrometeorological gauge network design

    Science.gov (United States)

    Xu, Pengcheng; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi; Liu, Jiufu; Zou, Ying; He, Ruimin

    2017-12-01

    Hydrometeorological data are needed for obtaining point and areal means, quantifying the spatial variability of hydrometeorological variables, and calibrating and verifying hydrometeorological models. Hydrometeorological networks are utilized to collect such data. Since data collection is expensive, it is essential to design an optimal network based on the minimal number of hydrometeorological stations in order to reduce costs. This study proposes a two-phase copula entropy-based multiobjective optimization approach that includes: (1) copula entropy-based directional information transfer (CDIT) for clustering the potential hydrometeorological gauges into several groups, and (2) a multiobjective method for selecting the optimal combination of gauges for the regionalized groups. Although entropy theory has been employed for network design before, the joint histogram method used for mutual information estimation has several limitations. The copula entropy-based mutual information (MI) estimation method is shown to be more effective for quantifying the uncertainty of redundant information than the joint histogram (JH) method. The effectiveness of this approach is verified by applying it to one type of hydrometeorological gauge network, with the use of three model evaluation measures: the Nash-Sutcliffe Coefficient (NSC), the arithmetic mean of the negative copula entropy (MNCE), and MNCE/NSC. Results indicate that the two-phase copula entropy-based multiobjective technique is capable of evaluating the performance of regional hydrometeorological networks and can enable decision makers to develop strategies for water resources management.

  3. Characterizing brain structures and remodeling after TBI based on information content, diffusion entropy.

    Science.gov (United States)

    Fozouni, Niloufar; Chopp, Michael; Nejad-Davarani, Siamak P; Zhang, Zheng Gang; Lehman, Norman L; Gu, Steven; Ueno, Yuji; Lu, Mei; Ding, Guangliang; Li, Lian; Hu, Jiani; Bagher-Ebadian, Hassan; Hearshen, David; Jiang, Quan

    2013-01-01

    To overcome the limitations of conventional diffusion tensor magnetic resonance imaging resulting from the assumption of a Gaussian diffusion model for characterizing voxels containing multiple axonal orientations, Shannon's entropy was employed to evaluate white matter structure in the human brain and brain remodeling after traumatic brain injury (TBI) in a rat. Thirteen healthy subjects were investigated using a Q-ball based DTI data sampling scheme. FA and entropy values were measured in white matter bundles, white matter fiber crossing areas, different gray matter (GM) regions and cerebrospinal fluid (CSF). Axonal densities from the same regions of interest (ROIs) were evaluated in Bielschowsky and Luxol fast blue stained autopsy (n = 30) brain sections by light microscopy. As a case demonstration, a Wistar rat subjected to TBI and treated with bone marrow stromal cells (MSC) 1 week after TBI was employed to illustrate the superior ability of entropy over FA in detecting reorganized crossing axonal bundles, as confirmed by histological analysis with Bielschowsky and Luxol fast blue staining. Unlike FA, entropy was less affected by axonal orientation and more affected by axonal density. A significant agreement (r = 0.91) was detected between entropy values from the in vivo human brain and axonal density measured histologically post mortem in the same brain structures. The MSC-treated TBI rat demonstrated that the entropy approach is superior to FA in detecting axonal remodeling after injury. Compared with FA, entropy detected new axonal remodeling regions with crossing axons, confirmed with immunohistological staining. Entropy measurement is more effective in distinguishing axonal remodeling after injury, when compared with FA. Entropy is also more sensitive to axonal density than axonal orientation, and thus may provide a more accurate reflection of axonal changes that occur in neurological injury and disease.

  4. Characterizing Brain Structures and Remodeling after TBI Based on Information Content, Diffusion Entropy

    Science.gov (United States)

    Fozouni, Niloufar; Chopp, Michael; Nejad-Davarani, Siamak P.; Zhang, Zheng Gang; Lehman, Norman L.; Gu, Steven; Ueno, Yuji; Lu, Mei; Ding, Guangliang; Li, Lian; Hu, Jiani; Bagher-Ebadian, Hassan; Hearshen, David; Jiang, Quan

    2013-01-01

    Background To overcome the limitations of conventional diffusion tensor magnetic resonance imaging resulting from the assumption of a Gaussian diffusion model for characterizing voxels containing multiple axonal orientations, Shannon's entropy was employed to evaluate white matter structure in the human brain and brain remodeling after traumatic brain injury (TBI) in a rat. Methods Thirteen healthy subjects were investigated using a Q-ball based DTI data sampling scheme. FA and entropy values were measured in white matter bundles, white matter fiber crossing areas, different gray matter (GM) regions and cerebrospinal fluid (CSF). Axonal densities from the same regions of interest (ROIs) were evaluated in Bielschowsky and Luxol fast blue stained autopsy (n = 30) brain sections by light microscopy. As a case demonstration, a Wistar rat subjected to TBI and treated with bone marrow stromal cells (MSC) 1 week after TBI was employed to illustrate the superior ability of entropy over FA in detecting reorganized crossing axonal bundles, as confirmed by histological analysis with Bielschowsky and Luxol fast blue staining. Results Unlike FA, entropy was less affected by axonal orientation and more affected by axonal density. A significant agreement (r = 0.91) was detected between entropy values from the in vivo human brain and axonal density measured histologically post mortem in the same brain structures. The MSC-treated TBI rat demonstrated that the entropy approach is superior to FA in detecting axonal remodeling after injury. Compared with FA, entropy detected new axonal remodeling regions with crossing axons, confirmed with immunohistological staining. Conclusions Entropy measurement is more effective in distinguishing axonal remodeling after injury, when compared with FA. Entropy is also more sensitive to axonal density than axonal orientation, and thus may provide a more accurate reflection of axonal changes that occur in neurological injury and disease.

  5. The Grading Entropy-based Criteria for Structural Stability of Granular Materials and Filters

    Directory of Open Access Journals (Sweden)

    Janos Lőrincz

    2015-05-01

    Full Text Available This paper deals with three grading entropy-based rules that describe different soil structure stability phenomena: an internal stability rule, a filtering rule and a segregation rule. These rules are elaborated on the basis of a large amount of laboratory testing and existing knowledge in the field. Use is made of the theory of grading entropy to derive parameters which incorporate all of the information of the grading curve into a pair of entropy-based parameters that allow soils with common behaviours to be grouped into domains on an entropy diagram. Applications of the derived entropy-based rules are presented by examining the cause of a dam failure, by testing against the existing filter rules from the literature, and by giving some examples for the design of non-segregating grading curves (discrete particle size distributions by dry weight). A physical basis for the internal stability rule is established, wherein the higher values of base entropy required for granular stability are shown to reflect the closeness between the mean and maximum grain diameters, which explains how there are sufficient coarser grains to achieve a stable grain skeleton.

  6. Improved Ordinary Measure and Image Entropy Theory based intelligent Copy Detection Method

    Directory of Open Access Journals (Sweden)

    Dengpan Ye

    2011-10-01

    Full Text Available Nowadays, more and more multimedia websites appear in social networks, bringing security problems such as privacy, piracy, and the disclosure of sensitive contents. Aiming at copyright protection, the copy detection technology of multimedia contents has become a hot topic. In our previous work, a new computer-based copyright control system used to detect the media was proposed. Based on this system, this paper proposes an improved media feature matching measure and an entropy-based copy detection method. The Levenshtein distance is used to enhance the matching degree when used as the feature matching measure in copy detection. For entropy-based copy detection, we fuse two features of the extracted entropy matrix. Firstly, we extract the entropy matrix of the image and normalize it. Then, we fuse the eigenvalue feature and the transfer matrix feature of the entropy matrix. The fused features are used for image copy detection. The experiments show that, compared with using either of these two kinds of features singly, the feature fusion matching method has apparent robustness and effectiveness. The fused feature achieves a high detection rate for copy images which have received attacks such as noise, compression, zooming, rotation and so on. Compared with the referenced methods, the proposed method is more intelligent and achieves good performance.

  7. Conflict management based on belief function entropy in sensor fusion.

    Science.gov (United States)

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Wireless sensor networks play an important role in intelligent navigation. They incorporate a group of sensors to overcome the limitation of a single detection system. Dempster-Shafer evidence theory can combine the sensor data of a wireless sensor network by data fusion, which contributes to the improvement of the accuracy and reliability of the detection system. However, due to the different sources of the sensors, there may be conflict among the sensor data in uncertain environments. Thus, this paper proposes a new method combining Deng entropy and evidence distance to address the issue. First, Deng entropy is adopted to measure the uncertain information. Then, evidence distance is applied to measure the conflict degree. The new method can cope with conflict effectively and improve the accuracy and reliability of the detection system. An example is illustrated to show the efficiency of the new method, and the result is compared with those of the existing methods.
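
    Deng entropy generalizes Shannon entropy from probability distributions to basic belief assignments by penalizing mass spread over large focal sets, via Ed = -Σ_A m(A) log2[m(A)/(2^|A| - 1)]. A small sketch of this standard formula follows; the example masses are made up for illustration.

    ```python
    from math import log2

    def deng_entropy(bba):
        # bba maps frozensets of hypotheses (focal elements) to masses.
        return -sum(m * log2(m / (2 ** len(A) - 1))
                    for A, m in bba.items() if m > 0)

    # Illustrative BBA: mass split between a singleton and a 2-element set.
    bba = {frozenset({'a'}): 0.6, frozenset({'a', 'b'}): 0.4}
    print(deng_entropy(bba))
    ```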

  8. An Entropy-Based Statistic for Genomewide Association Studies

    OpenAIRE

    Zhao, Jinying; Boerwinkle, Eric; Xiong, Momiao

    2005-01-01

    Efficient genotyping methods and the availability of a large collection of single-nucleotide polymorphisms provide valuable tools for genetic studies of human disease. The standard χ2 statistic for case-control studies, which uses a linear function of allele frequencies, has limited power when the number of marker loci is large. We introduce a novel test statistic for genetic association studies that uses Shannon entropy and a nonlinear function of allele frequencies to amplify the difference...

  9. A numerically research on energy loss evaluation in a centrifugal pump system based on local entropy production method

    Directory of Open Access Journals (Sweden)

    Hou Hucan

    2017-01-01

    Full Text Available Inspired by the wide application of the second law of thermodynamics to flow and heat transfer devices, the local entropy production analysis method was introduced into the energy assessment system of a centrifugal water pump. Based on the Reynolds stress turbulence model and the energy equation model, a steady numerical simulation of the whole flow passage of an IS centrifugal pump was carried out. The local entropy production terms were calculated by user-defined functions, mainly including wall entropy production, turbulent entropy production, and viscous entropy production. The numerical results indicated that the irreversible energy loss calculated by the local entropy production method agreed well with that calculated by the traditional method, but with some deviations which were probably caused by the high rotatability and high curvature of the impeller and volute. The wall entropy production and turbulent entropy production took up a large part of the whole entropy production, about 48.61% and 47.91%, respectively, which indicated that wall friction and turbulent fluctuation were the major factors affecting irreversible energy loss. Meanwhile, the entropy production rate distribution was discussed and compared with the turbulent kinetic energy dissipation rate distribution; it showed that the turbulent entropy production rate increased sharply in the near-wall regions and that both were otherwise distributed rather uniformly. The blade leading edge near the suction side, the trailing edge and the volute tongue were the main regions generating irreversible exergy loss. This research opens a completely new view for evaluating energy loss and further optimizing pumps using entropy production minimization.

  10. [Identification of special quality eggs with NIR spectroscopy technology based on symbol entropy feature extraction method].

    Science.gov (United States)

    Zhao, Yong; Hong, Wen-Xue

    2011-11-01

    Fast, nondestructive, and accurate identification of special quality eggs is an urgent problem. This paper proposes a new feature extraction method based on symbolic entropy to identify near-infrared spectra of special quality eggs. The authors selected normal eggs, free-range eggs, selenium-enriched eggs, and zinc-enriched eggs as research objects and measured the near-infrared diffuse reflectance spectra in the range of 12 000-4 000 cm(-1). Raw spectra were symbolically represented with an aggregation approximation algorithm, and symbolic entropy was extracted as the feature vector. An error-correcting output codes multiclass support vector machine classifier was designed to identify the spectra. The symbolic entropy feature is robust to parameter changes, and the highest recognition rate reaches 100%. The results show that identification of special quality eggs using near-infrared spectroscopy is feasible and that symbolic entropy can serve as a new feature extraction method for near-infrared spectra.
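
    The exact symbolization used in the paper is not spelled out in the abstract; a plausible minimal version, shown below, reduces a spectrum by piecewise aggregate approximation, quantizes the segment means into a small alphabet, and takes the Shannon entropy of the symbol histogram as the feature. Segment count and alphabet size are assumptions.

        import numpy as np
        from collections import Counter

        def symbol_entropy(spectrum, n_segments=64, n_symbols=4):
            # Normalize, aggregate into segments, quantile-bin the means,
            # and return the Shannon entropy of the symbol distribution.
            x = (spectrum - spectrum.mean()) / spectrum.std()
            segments = np.array_split(x, n_segments)
            paa = np.array([s.mean() for s in segments])
            edges = np.quantile(paa, np.linspace(0, 1, n_symbols + 1)[1:-1])
            symbols = np.digitize(paa, edges)
            counts = np.array(list(Counter(symbols).values()), dtype=float)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        rng = np.random.default_rng(0)
        print(symbol_entropy(rng.normal(size=2000)))  # stand-in for a spectrum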

  11. Generalized sample entropy analysis for traffic signals based on similarity measure

    Science.gov (United States)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper, a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on a similarity distance, matches signal patterns in a different way and reveals distinct complexity behaviors. Simulations are conducted on synthetic data and traffic signals to provide a comparative study showing the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first is that it overcomes the limitation on the relationship between the dimension parameter and the length of the series. The second is that the modified sample entropy functions can quantitatively distinguish time series from different complex systems by the similarity measure.
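
    For reference, a plain sample entropy implementation (Chebyshev distance, the template-match rule the paper replaces with its similarity measure) can be written compactly; the tolerance factor 0.2 and embedding dimension m = 2 are the usual defaults, not values from the paper.

        import numpy as np

        def sample_entropy(x, m=2, r_factor=0.2):
            # Standard SampEn: -ln(A/B), where B counts length-m template
            # matches and A counts length-(m+1) matches within tolerance r.
            x = np.asarray(x, dtype=float)
            r = r_factor * x.std()
            def count_matches(length):
                templates = np.array([x[i:i + length]
                                      for i in range(len(x) - length)])
                count = 0
                for i in range(len(templates) - 1):
                    d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                    count += np.sum(d <= r)
                return count
            B, A = count_matches(m), count_matches(m + 1)
            return -np.log(A / B) if A > 0 and B > 0 else np.inf

        rng = np.random.default_rng(1)
        print(sample_entropy(rng.normal(size=1000)))       # high: white noise
        print(sample_entropy(np.sin(np.arange(1000) / 10)))  # low: regular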

  12. Design and implementation of fuzzy-PD controller based on relation models: A cross-entropy optimization approach

    Science.gov (United States)

    Anisimov, D. N.; Dang, Thai Son; Banerjee, Santo; Mai, The Anh

    2017-07-01

    In this paper, an intelligent system using a fuzzy-PD controller based on relation models is developed for a two-wheeled self-balancing robot. The scaling factors of the fuzzy-PD controller are optimized by the Cross-Entropy optimization method. A Linear Quadratic Regulator is designed for comparison with the fuzzy-PD controller in terms of control quality parameters. The controllers are ported to and run on an STM32F4 Discovery Kit under a real-time operating system. The experimental results indicate that the proposed fuzzy-PD controller runs correctly on the embedded system and achieves the desired performance in terms of fast response, good balance, and stabilization.
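
    The Cross-Entropy optimization loop itself is simple and worth seeing; below is a generic sketch that tunes two scaling factors against a stand-in cost function. In the paper the cost would come from simulating the closed loop on the robot model; the quadratic stub here is purely illustrative.

        import numpy as np

        def cross_entropy_optimize(cost, mu, sigma,
                                   n_samples=50, n_elite=10, n_iter=30):
            # Sample candidate gains from a Gaussian, keep the elite
            # fraction with the lowest cost, refit the Gaussian, repeat.
            mu, sigma = np.array(mu, float), np.array(sigma, float)
            for _ in range(n_iter):
                candidates = np.random.normal(mu, sigma,
                                              size=(n_samples, len(mu)))
                scores = np.array([cost(c) for c in candidates])
                elite = candidates[np.argsort(scores)[:n_elite]]
                mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-9
            return mu

        # Hypothetical closed-loop cost, stubbed by a quadratic whose
        # optimum is at scaling factors (2.0, 0.5).
        cost = lambda k: (k[0] - 2.0) ** 2 + 10 * (k[1] - 0.5) ** 2
        print(cross_entropy_optimize(cost, mu=[1.0, 1.0], sigma=[1.0, 1.0]))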

  13. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    Science.gov (United States)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
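
    The entropy core of such designs is easy to state: under a joint Gaussian assumption the entropy of a station subset follows from the covariance determinant, and the design retains the subset with maximal joint entropy. The sketch below uses a synthetic covariance and exhaustive search; the paper's hierarchical matrix-normal model and its violation-entropy criteria are not reproduced.

        import numpy as np
        from itertools import combinations

        def gaussian_joint_entropy(cov):
            # Joint entropy (nats) of a multivariate normal with covariance cov.
            k = cov.shape[0]
            return 0.5 * (k * np.log(2 * np.pi * np.e)
                          + np.linalg.slogdet(cov)[1])

        def best_subset(cov, size):
            # Exhaustively pick the station subset with maximal joint entropy.
            n = cov.shape[0]
            return max(combinations(range(n), size),
                       key=lambda s: gaussian_joint_entropy(cov[np.ix_(s, s)]))

        rng = np.random.default_rng(2)
        A = rng.normal(size=(6, 6))
        cov = A @ A.T + np.eye(6)      # synthetic inter-station covariance
        print(best_subset(cov, 3))     # 3 most informative of 6 stations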

  14. A novel evaluation of heat-electricity cost allocation in cogenerations based on entropy change method

    International Nuclear Information System (INIS)

    Ye, Xuemin; Li, Chunxi

    2013-01-01

    As one of the most significant measures to improve energy utilization efficiency and save energy, cogeneration or combined heat and power (CHP) has been widely applied and promoted with positive motivations in many countries. A rational cost allocation model should indicate the performance of cogenerations and balance the benefits between electricity generation and heat production. Based on the second law of thermodynamics, the present paper proposes an entropy change method for cost allocation by choosing exhaust steam entropy as a datum point, and the new model works in conjunction with entropy change and irreversibility during energy conversion processes. The allocation ratios of heat cost with the present and existing methods are compared for different types of cogenerations. Results show that the allocation ratios with the entropy change method are more rational and the cost allocation model can make up some limitations involved in other approaches. The future energy policies and innovational directions for cogenerations and heat consumers should be developed. - Highlights: • A rational model of cogeneration cost allocation is established. • Entropy change method integrates the relation of entropy change and exergy losses. • The unity of measuring energy quality and quantity is materialized. • The benefits between electricity generation and heat production are balanced

  15. Harmonic analysis of electric locomotive and traction power system based on wavelet singular entropy

    Science.gov (United States)

    Dun, Xiaohong

    2018-05-01

    With the rapid development of high-speed railway and heavy-haul transport, locomotives and traction power systems have become the main harmonic sources of China's power grid. In response, the power quality of these systems needs timely monitoring, assessment, and governance. Wavelet singular entropy is an organic combination of wavelet transform, singular value decomposition, and information entropy theory, uniting the advantages of all three in signal processing: wavelet transform provides time-frequency localization, singular value decomposition extracts the basic modal characteristics of the data, and information entropy quantifies the resulting features. Based on singular value decomposition theory, the wavelet coefficient matrix obtained after the wavelet transform is decomposed into a series of singular values that reflect the basic characteristics of the original coefficient matrix. The statistical properties of information entropy are then used to analyze the uncertainty of the singular value set, giving a definite measurement of the complexity of the original signal. Wavelet singular entropy therefore has good application prospects in fault detection, classification, and protection. The MATLAB simulation shows that wavelet singular entropy is effective for harmonic analysis of locomotives and traction power systems.
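
    A minimal wavelet singular entropy computation, assuming the PyWavelets package: take a stationary wavelet transform so all levels share one length, stack the detail coefficients into a matrix, and compute the Shannon entropy of the normalized singular values. The wavelet choice, decomposition level, and test signal are assumptions, not the paper's settings.

        import numpy as np
        import pywt

        def wavelet_singular_entropy(signal, wavelet="db4", level=4):
            # Signal length must be divisible by 2**level for pywt.swt.
            coeffs = pywt.swt(signal, wavelet, level=level)  # [(cA, cD), ...]
            W = np.array([cD for _, cD in coeffs])           # level x N matrix
            s = np.linalg.svd(W, compute_uv=False)
            p = s / s.sum()
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        n = 1600
        t = np.arange(n) / n
        clean = np.sin(2 * np.pi * 50 * t)                     # fundamental
        distorted = clean + 0.3 * np.sin(2 * np.pi * 250 * t)  # + 5th harmonic
        print(wavelet_singular_entropy(clean),
              wavelet_singular_entropy(distorted))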

  16. ACCUMULATED DEFORMATION MODELING OF PERMANENT WAY BASED ON ENTROPY SYSTEM

    Directory of Open Access Journals (Sweden)

    D. M. Kurhan

    2015-07-01

    Full Text Available Purpose. This work presents theoretical research on the possibility of using methods that determine the lifetime of a railway track not only in terms of total stresses, but also accounting for its structure and dynamic characteristics. The aim is to create a model of deformation accumulation for assessing the service life of a railway track that takes these features into account. Methodology. To simulate the gradual change of state during operation (the accumulation of deformations), the railway track is represented as a system consisting of many particles of different materials assembled into a coherent structure. It is appropriate to speak not of the appearance of deformations of a certain size in a certain section of the track, but of the probability of such an event on the site. Operating with the probability of occurrence of deviations, the state of the system is characterized by the number of breaks of conditional internal connections. The same state of the system may correspond to different combinations of breaks: the more breaks, the greater the number of possible structural configurations corresponding to the current state. Such a process can be represented as a gradual transition from an ordered state to a chaotic one. The numerical value of the entropy is used to describe this characteristic of the system. Findings. The entropy of the system constantly increases as the system ages. The growth of entropy is expressed through changes in the internal energy of the system, which can be determined from the mechanical work of the forces that cause deformation. This makes it possible to quantify the breaking of bonds in the system as a consequence of performed mechanical work. According to the results of the theoretical research, methods for estimating the life cycles of railway operation consider such factors as the structure of the flow of trains, the construction of the permanent way, the movement of trains at high

  17. A Possible Ethical Imperative Based on the Entropy Law

    Directory of Open Access Journals (Sweden)

    Mehrdad Massoudi

    2016-11-01

    Full Text Available Lindsay, in an article titled "Entropy consumption and values in physical science" (Am. Sci. 1959, 47, 678–696), proposed a Thermodynamic Imperative similar to Kant's Ethical Categorical Imperative. In this paper, after describing the concept of the ethical imperative as elaborated by Kant, we provide a brief discussion of the role of science and its relationship to classical thermodynamics and the physical implications of the first and second laws of thermodynamics. We finally attempt to extend and supplement Lindsay's Thermodynamic Imperative (TI) with another Imperative suggesting simplicity, conservation, and harmony.

  18. Multiscale entropy based study of the pathological time series

    International Nuclear Information System (INIS)

    Wang Jun; Ma Qianli

    2008-01-01

    This paper studies the multiscale entropy (MSE) of the electrocardiogram's ST segment and, for the first time, compares the MSE results of the ST segment with those of the full electrocardiogram. The changing complexity characteristics of the electrocardiogram have important clinical significance for early diagnosis. The study shows that the average MSE values and the fluctuation of their varying scope reveal heart health status more effectively. In particular, the fluctuation of the multiscale values' varying scope is a more sensitive parameter for early heart disease detection and has clinical diagnostic significance. (general)

  19. Entropy feature extraction on flow pattern of gas/liquid two-phase flow based on cross-section measurement

    International Nuclear Information System (INIS)

    Han, J; Dong, F; Xu, Y Y

    2009-01-01

    This paper introduces the fundamentals of a cross-section measurement system based on Electrical Resistance Tomography (ERT). Measured data for four flow regimes of gas/liquid two-phase flow in horizontal pipe flow are obtained by an ERT system. From the measured data, five entropies are extracted to analyze the experimental data according to the different flow regimes, and the analysis method is examined and compared from three different perspectives. The results indicate that all three entropy-based feature extraction perspectives are sensitive to the flow pattern transition in gas/liquid two-phase flow. By analyzing the results of the three perspectives as the gas/liquid two-phase flow parameters change, the dynamic structures of the flow are obtained, providing a useful supplement for revealing the flow pattern transition mechanism of gas/liquid two-phase flow. Comparison of the three feature extraction methods shows that the appropriate entropy should be chosen for the identification and prediction of flow regimes.

  20. Application of the entropy generation minimization method to a solar heat exchanger: A pseudo-optimization design process based on the analysis of the local entropy generation maps

    International Nuclear Information System (INIS)

    Giangaspero, Giorgio; Sciubba, Enrico

    2013-01-01

    This paper presents an application of the entropy generation minimization method to the pseudo-optimization of the configuration of the heat exchange surfaces in a Solar Rooftile. An initial “standard” commercial configuration is gradually improved by introducing design changes aimed at the reduction of the thermodynamic losses due to heat transfer and fluid friction. Different geometries (pins, fins and others) are analysed with a commercial CFD (Computational Fluid Dynamics) code that also computes the local entropy generation rate. The design improvement process is carried out on the basis of a careful analysis of the local entropy generation maps and the rationale behind each step of the process is discussed in this perspective. The results are compared with other entropy generation minimization techniques available in the recent technical literature. It is found that the geometry with pin-fins has the best performance among the tested ones, and that the optimal pin array shape parameters (pitch and span) can be determined by a critical analysis of the integrated and local entropy maps and of the temperature contours. - Highlights: ► An entropy generation minimization method is applied to a solar heat exchanger. ► The approach is heuristic and leads to a pseudo-optimization process with CFD as main tool. ► The process is based on the evaluation of the local entropy generation maps. ► The geometry with pin-fins in general outperforms all other configurations. ► The entropy maps and temperature contours can be used to determine the optimal pin array design parameters

  1. Entropy-based automated classification of independent components separated from fMCG

    International Nuclear Information System (INIS)

    Comani, S; Srinivasan, V; Alleva, G; Romani, G L

    2007-01-01

    Fetal magnetocardiography (fMCG) is a noninvasive technique suitable for the prenatal diagnosis of the fetal heart function. Reliable fetal cardiac signals can be reconstructed from multi-channel fMCG recordings by means of independent component analysis (ICA). However, the identification of the separated components is usually accomplished by visual inspection. This paper discusses a novel automated system based on entropy estimators, namely approximate entropy (ApEn) and sample entropy (SampEn), for the classification of independent components (ICs). The system was validated on 40 fMCG datasets of normal fetuses with the gestational age ranging from 22 to 37 weeks. Both ApEn and SampEn were able to measure the stability and predictability of the physiological signals separated with ICA, and the entropy values of the three categories were significantly different at p < 0.01. The system performances were compared with those of a method based on the analysis of the time and frequency content of the components. The outcomes of this study showed a superior performance of the entropy-based system, in particular for early gestation, with an overall ICs detection rate of 98.75% and 97.92% for ApEn and SampEn respectively, as against a value of 94.50% obtained with the time-frequency-based system. (note)

  2. An entropy-based improved k-top scoring pairs (TSP) method for ...

    African Journals Online (AJOL)

    An entropy-based improved k-top scoring pairs (TSP) (Ik-TSP) method was presented in this study for the classification and prediction of human cancers based on gene-expression data. We compared Ik-TSP classifiers with 5 different machine learning methods and the k-TSP method based on 3 different feature selection ...

  3. A Novel Entropy-Based Centrality Approach for Identifying Vital Nodes in Weighted Networks

    Directory of Open Access Journals (Sweden)

    Tong Qiao

    2018-04-01

    Full Text Available Measuring centrality has recently attracted increasing attention, with algorithms ranging from those that simply calculate the number of immediate neighbors and the shortest paths to complicated iterative refinement processes and objective dynamical approaches. Indeed, vital node identification allows us to understand the roles that different nodes play in the structure of a network. However, quantifying centrality in complex networks with various topological structures is not an easy task. In this paper, we introduce a novel definition of entropy-based centrality, which is applicable to weighted directed networks. By design, the total power of a node is divided into two parts: its local power and its indirect power. The local power is obtained by integrating the structural entropy, which reveals the communication activity and popularity of each node, and the interaction frequency entropy, which indicates its accessibility. In addition, the process of influence propagation is captured by two-hop subnetworks, yielding the indirect power. To evaluate the performance of the entropy-based centrality, we use four weighted real-world networks with various instance sizes, degree distributions, and densities: adolescent health, Bible, United States (US) airports, and Hep-th. Extensive analytical results demonstrate that the entropy-based centrality outperforms degree centrality, betweenness centrality, closeness centrality, and eigenvector centrality.
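
    Of the quantities the abstract lists, the entropy of a node's interactions is the easiest to make concrete: the Shannon entropy of its normalized outgoing edge weights, high when activity is spread evenly over many neighbors. The sketch below shows just that ingredient; the paper's full combination of structural entropy, interaction frequency entropy, and two-hop indirect power is not reproduced.

        import numpy as np

        def local_entropy_power(weights_out):
            # Shannon entropy (bits) of a node's normalized outgoing weights.
            w = np.asarray(weights_out, dtype=float)
            p = w / w.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        # Node u talks to 4 neighbors evenly; node v concentrates on one.
        print(local_entropy_power([5, 5, 5, 5]))   # 2.0 bits
        print(local_entropy_power([17, 1, 1, 1]))  # much lower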

  4. Multiattribute Decision Making Based on Entropy under Interval-Valued Intuitionistic Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Yingjun Zhang

    2013-01-01

    Full Text Available Multiattribute decision making (MADM is one of the central problems in artificial intelligence, specifically in management fields. In most cases, this problem arises from uncertainty both in the data derived from the decision maker and the actions performed in the environment. Fuzzy set and high-order fuzzy sets were proven to be effective approaches in solving decision-making problems with uncertainty. Therefore, in this paper, we investigate the MADM problem with completely unknown attribute weights in the framework of interval-valued intuitionistic fuzzy (IVIF set (IVIFS. We first propose a new definition of IVIF entropy and some calculation methods for IVIF entropy. Furthermore, we propose an entropy-based decision-making method to solve IVIF MADM problems with completely unknown attribute weights. Particular emphasis is put on assessing the attribute weights based on IVIF entropy. Instead of the traditional methods, which use divergence among attributes or the probabilistic discrimination of attributes to obtain attribute weights, we utilize the IVIF entropy to assess the attribute weights based on the credibility of the decision-making matrix for solving the problem. Finally, a supplier selection example is given to demonstrate the feasibility and validity of the proposed MADM method.
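
    The crisp version of entropy-based attribute weighting is a useful reference point for the IVIF construction: attributes whose scores are nearly uniform across alternatives carry little discriminating information and receive small weights. The sketch below is this classical entropy weight method, not the paper's IVIF extension; the decision matrix is invented.

        import numpy as np

        def entropy_weights(decision_matrix):
            # Normalize each attribute column to a distribution, compute its
            # normalized entropy E_j, and weight by w_j = (1-E_j)/sum(1-E_k).
            X = np.asarray(decision_matrix, dtype=float)
            P = X / X.sum(axis=0)
            n = X.shape[0]
            with np.errstate(divide="ignore", invalid="ignore"):
                E = -np.nansum(P * np.log(P), axis=0) / np.log(n)
            return (1 - E) / (1 - E).sum()

        # 4 suppliers scored on 3 attributes (price, quality, delivery).
        X = [[7, 9, 6], [8, 7, 8], [6, 8, 9], [9, 6, 7]]
        print(entropy_weights(X))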

  5. Fault Diagnosis Method Based on Information Entropy and Relative Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoming Xu

    2017-01-01

    Full Text Available In traditional principal component analysis (PCA), the selected principal components (PCs) often fail to be representative because the influence of the dimensions of the different variables in the system is neglected. While relative transformation PCA is able to solve this problem, it is not easy to calculate the weight for each characteristic variable. To address this, this paper proposes a fault diagnosis method based on information entropy and relative principal component analysis. First, the algorithm calculates the information entropy of each characteristic variable in the original dataset based on the information gain algorithm. Second, it standardizes every variable's dimension in the dataset. Then, according to the information entropy, it allocates a weight to each standardized characteristic variable. Finally, it uses the established relative-principal-components model for fault diagnosis. Simulation experiments based on the Tennessee Eastman process and Wine datasets demonstrate the feasibility and effectiveness of the new method.

  6. Value at risk estimation with entropy-based wavelet analysis in exchange markets

    Science.gov (United States)

    He, Kaijian; Wang, Lijun; Zou, Yingchao; Lai, Kin Keung

    2014-08-01

    In recent years, exchange markets have become increasingly integrated. Fluctuations and risks across different exchange markets exhibit co-moving and complex dynamics. In this paper we propose an entropy-based multivariate wavelet approach to analyze the multiscale characteristics in the multidimensional domain and further improve the reliability of Value at Risk estimation. Wavelet analysis is introduced to construct the entropy-based multiscale portfolio Value at Risk estimation algorithm, accounting for the multiscale dynamic correlation. The entropy measure, combined with the error minimization principle, is proposed as the more effective criterion for selecting the best basis when determining the wavelet family and the decomposition level to use. The empirical studies conducted in this paper provide positive evidence of the superior performance of the proposed approach on the closely related Chinese Renminbi and European Euro exchange markets.

  7. Entropy based statistical inference for methane emissions released from wetland

    Czech Academy of Sciences Publication Activity Database

    Sabolová, R.; Sečkárová, Vladimíra; Dušek, Jiří; Stehlík, M.

    2015-01-01

    Roč. 141, č. 1 (2015), s. 125-133 ISSN 0169-7439 R&D Projects: GA ČR GA13-13502S; GA ČR(CZ) GAP504/11/1151; GA MŠk(CZ) ED1.1.00/02.0073 Grant - others:GA ČR(CZ) GA201/12/0083; GA UK(CZ) SVV 2014-260105 Institutional support: RVO:67985556 ; RVO:67179843 Keywords : chaos * entropy * Kullback-Leibler divergence * Pareto distribution * saddlepoint approximation * wetland ecosystem Subject RIV: BB - Applied Statistics, Operational Research; EH - Ecology, Behaviour (UEK-B) Impact factor: 2.217, year: 2015 http://library.utia.cas.cz/separaty/2014/AS/seckarova-0438651.pdf

  8. Minimum Entropy-Based Cascade Control for Governing Hydroelectric Turbines

    Directory of Open Access Journals (Sweden)

    Mifeng Ren

    2014-06-01

    Full Text Available In this paper, an improved cascade control strategy is presented for hydroturbine speed governors. Different from traditional proportional-integral-derivative (PID) control and model predictive control (MPC) strategies, the performance index of the outer controller is constructed by integrating the entropy and mean value of the tracking error with constraints on the control energy. The inner controller is implemented as a proportional controller. Compared with the conventional PID-P and MPC-P cascade control methods, the proposed cascade control strategy can effectively decrease fluctuations of hydro-turbine speed under non-Gaussian disturbance conditions in practical hydropower plants. Simulation results show the advantages of the proposed cascade control method.

  9. Branch length similarity entropy-based descriptors for shape representation

    Science.gov (United States)

    Kwon, Ohsung; Lee, Sang-Hee

    2017-11-01

    In previous studies, we showed that the branch length similarity (BLS) entropy profile could be successfully used for shape recognition, such as battle tanks, facial expressions, and butterflies. In the present study, we propose new descriptors for recognition, roundness, symmetry, and surface roughness, which are more accurate and faster to compute than the previous descriptors. Roundness represents how closely a shape resembles a circle, symmetry characterizes how similar a shape is to its flipped counterpart, and surface roughness quantifies the degree of vertical deviation of a shape boundary. To evaluate the performance of the descriptors, we used a database of leaf images covering 12 species. Each species consisted of 10-20 leaf images, and the total number of images was 160. The evaluation showed that the new descriptors successfully discriminated the leaf species. We believe that the descriptors can be a useful tool in the field of pattern recognition.

  10. A generalized complexity measure based on Rényi entropy

    Science.gov (United States)

    Sánchez-Moreno, Pablo; Angulo, Juan Carlos; Dehesa, Jesus S.

    2014-08-01

    The intrinsic statistical complexities of finite many-particle systems (i.e., those defined in terms of the single-particle density) quantify the degree of structure or patterns, far beyond the entropy measures. They are intuitively constructed to be minima at the opposite extremes of perfect order and maximal randomness. Starting from the pioneering LMC measure, which satisfies these requirements, some extensions of LMC-Rényi type have been published in the literature. The latter measures were shown to describe a variety of physical aspects of the internal disorder in atomic and molecular systems (e.g., quantum phase transitions, atomic shell filling) which are not grasped by their mother LMC quantity. However, they are not minimal for maximal randomness in general. In this communication, we propose a generalized LMC-Rényi complexity which overcomes this problem. Some applications which illustrate this fact are given.
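
    A hedged sketch of an LMC-Rényi type measure: taking C = exp(R_alpha - R_beta) with alpha < beta yields a quantity equal to its minimum of 1 at both the uniform and the one-point distribution, and larger in between, which is the qualitative behavior such complexities aim for. This is one form in the family, not necessarily the exact generalization proposed in the paper.

        import numpy as np

        def renyi_entropy(p, alpha):
            p = np.asarray(p, float); p = p[p > 0]
            if np.isclose(alpha, 1.0):
                return -np.sum(p * np.log(p))     # Shannon limit
            return np.log(np.sum(p ** alpha)) / (1 - alpha)

        def lmc_renyi_complexity(p, alpha, beta):
            # C = exp(R_alpha - R_beta), alpha < beta.
            return np.exp(renyi_entropy(p, alpha) - renyi_entropy(p, beta))

        uniform = np.full(8, 1 / 8)
        peaked  = np.array([0.93] + [0.01] * 7)
        mixed   = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01])
        for p in (uniform, peaked, mixed):
            print(lmc_renyi_complexity(p, alpha=0.5, beta=2.0))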

  11. A Dynamic and Adaptive Selection Radar Tracking Method Based on Information Entropy

    Directory of Open Access Journals (Sweden)

    Ge Jianjun

    2017-12-01

    Full Text Available Nowadays, the battlefield environment has become much more complex and variable. Based on the principle of information entropy, this paper presents a quantitative method, with a lower bound, for the amount of target information acquired from multiple radar observations, so that battlefield detection resources can be organized adaptively and dynamically. Furthermore, to minimize this information entropy lower bound for target measurement at every moment, a method is proposed to dynamically and adaptively select radars carrying a high amount of information for target tracking. The simulation results indicate that the proposed method achieves higher tracking accuracy than tracking without entropy-based adaptive radar selection.

  12. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    For the reliability evaluation of condition identification of mechanical equipment, it is necessary to analyze condition monitoring information. A new method of reliability evaluation is proposed, based on wavelet information entropy extracted from vibration signals of mechanical equipment. The method is quite different from traditional reliability evaluation models, which depend on statistical analysis of large amounts of sample data. The vibration signals of mechanical equipment were analyzed by means of the second generation wavelet package (SGWP). The relative energy in each frequency band of the decomposed signal, which equals a percentage of the whole signal energy, is taken as a probability. Normalized information entropy (IE) is obtained from these relative energies to describe the uncertainty of a system instead of a probability distribution. The reliability degree is then derived from the normalized wavelet information entropy. The method was successfully applied to evaluate the assembly quality reliability of a dismountable disk-drum aero-engine. The reliability degree indicates the assembled quality satisfactorily.

  13. Entropy Based Analysis of DNS Query Traffic in the Campus Network

    Directory of Open Access Journals (Sweden)

    Dennis Arturo Ludeña Romaña

    2008-10-01

    Full Text Available We carried out an entropy-based study of the DNS query traffic of a university campus network from January 1st, 2006 through March 31st, 2007. The results are summarized as follows: (1) the source IP address- and query keyword-based entropies change symmetrically in the DNS query traffic from outside the campus network when spam bot activity on the campus network is detected. On the other hand, (2) the source IP address- and query keyword-based entropies change similarly to each other when heavy DNS query traffic caused by prescanning or a distributed denial of service (DDoS) attack from the campus network is detected. Therefore, we can detect spam bots and/or DDoS attack bots by watching only the DNS query traffic.
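
    The detection signal described is easy to reproduce: per time window, compute the Shannon entropy of the source-IP distribution and of the query-keyword distribution, and watch how the two series move together. A minimal sketch with fabricated log tuples:

        import math
        from collections import Counter

        def entropy_of(items):
            # Shannon entropy (bits) of the empirical distribution of items.
            counts = Counter(items)
            total = sum(counts.values())
            return -sum(c / total * math.log2(c / total)
                        for c in counts.values())

        # Hypothetical per-window query log: (source_ip, query_keyword).
        window = [("10.0.0.%d" % (i % 50), "mail%d.example.com" % (i % 7))
                  for i in range(1000)]
        src_H = entropy_of(ip for ip, _ in window)
        key_H = entropy_of(q for _, q in window)
        print(src_H, key_H)   # symmetric vs. parallel shifts flag bot activity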

  14. Biological Aging and Life Span Based on Entropy Stress via Organ and Mitochondrial Metabolic Loading

    Directory of Open Access Journals (Sweden)

    Kalyan Annamalai

    2017-10-01

    Full Text Available The energy for sustaining life is released through the oxidation of glucose, fats, and proteins. A part of the energy released within each cell is stored as chemical energy of adenosine triphosphate molecules, which is essential for performing life-sustaining functions, while the remainder is released as heat in order to maintain the isothermal state of the body. Earlier literature introduced availability concepts from thermodynamics, related the specific irreversibility and entropy generation rates to the metabolic efficiency and energy release rate of organ k, computed the specific entropy generation rate of the whole body at any given age as a sum of the entropy generation within four vital organs, Brain, Heart, Kidney, and Liver (BHKL), with a fifth organ representing the rest of the organs (R5), and estimated the life span using an upper limit on the lifetime entropy generated per unit mass of body, σM,life. The organ entropy stress, expressed in terms of the lifetime specific entropy generated per unit mass of body organs (kJ/(K kg) of organ k), was used to rank organs; the heart ranked highest while the liver ranked lowest. The present work includes the effects of (1) two additional organs, adipose tissue (AT) and skeletal muscle (SM), which are of importance to athletes; (2) the proportions of nutrients oxidized, which affect blood temperature and metabolic efficiencies; (3) conversion of the entropy stress from the organ/cellular level to the mitochondrial level; and (4) the use of these parameters as metabolism-based biomarkers for quantifying the biological aging process in reaching the limit of σM,life. Based on the 7-organ model and Elia constants for organ metabolic rates for a male of 84 kg steady mass, and using basic and derived allometric constants of organs, the lifetime energy expenditure is estimated to be 2725 MJ/kg body mass while the lifetime entropy generated is 6050 kJ/(K kg body mass), with contributions of 190; 1835.0; 610; 290; 700; 1470 and 95 kJ/K contributed by AT-BHKL-SM-R7 to 1 kg body

  15. Monte Carlo comparison of four normality tests using different entropy estimates

    Czech Academy of Sciences Publication Activity Database

    Esteban, M. D.; Castellanos, M. E.; Morales, D.; Vajda, Igor

    2001-01-01

    Roč. 30, č. 4 (2001), s. 761-785 ISSN 0361-0918 R&D Projects: GA ČR GA102/99/1137 Institutional research plan: CEZ:AV0Z1075907 Keywords : test of normality * entropy test and entropy estimator * table of critical values Subject RIV: BD - Theory of Information Impact factor: 0.153, year: 2001

  16. Fuzzy Shannon Entropy: A Hybrid GIS-Based Landslide Susceptibility Mapping Method

    Directory of Open Access Journals (Sweden)

    Majid Shadman Roodposhti

    2016-09-01

    Full Text Available Assessing landslide susceptibility mapping (LSM) contributes to reducing the risk of living with landslides. Handling the vagueness associated with LSM is a challenging task. Here we show the application of a hybrid GIS-based LSM. The hybrid approach embraces fuzzy membership functions (FMFs) in combination with Shannon entropy, a well-known information theory-based method. Nine landslide-related criteria, along with an inventory of landslides containing 108 recent and historic landslide points, are used to prepare a susceptibility map. A random split into training (≈70%) and testing (≈30%) samples is used for training and validation of the LSM model. The study area, Izeh, is located in the Khuzestan province of Iran, a highly susceptible landslide zone. The performance of the hybrid method is evaluated using receiver operating characteristic (ROC) curves in combination with the area under the curve (AUC). With an AUC of 0.934, the proposed hybrid method outperforms the multi-criteria evaluation approaches using a subjective scheme: a previous study on the same dataset, using an extended fuzzy multi-criteria evaluation built on decision makers' judgments in the same study area, obtained an AUC of 0.894.

  17. Proposed Empirical Entropy and Gibbs Energy Based on Observations of Scale Invariance in Open Nonequilibrium Systems.

    Science.gov (United States)

    Tuck, Adrian F

    2017-09-07

    There is no widely agreed definition of entropy, and consequently Gibbs energy, in open systems far from equilibrium. One recent approach has sought to formulate an entropy and Gibbs energy based on observed scale invariances in geophysical variables, particularly in atmospheric quantities, including the molecules constituting stratospheric chemistry. The Hamiltonian flux dynamics of energy in macroscopic open nonequilibrium systems maps to energy in equilibrium statistical thermodynamics, and the corresponding equivalences of scale invariant variables with other relevant statistical mechanical variables, such as entropy, Gibbs energy, and 1/(k_B T), are not just formally analogous but are also mappings. Three proof-of-concept representative examples from available adequate stratospheric chemistry observations (temperature, wind speed, and ozone) are calculated, with the aim of applying these mappings and equivalences. Potential applications of the approach to scale invariant observations from the literature, involving scales from molecular through laboratory to astronomical, are considered. Theoretical support for the approach from the literature is discussed.

  18. Risk Contagion in Chinese Banking Industry: A Transfer Entropy-Based Analysis

    Directory of Open Access Journals (Sweden)

    Jianping Li

    2013-12-01

    Full Text Available What is the impact of a bank failure on the whole banking industry? To resolve this issue, the paper develops a transfer entropy-based method to determine the interbank exposure matrix between banks. This method constructs the interbank market structure by calculating the transfer entropy matrix from bank stock price sequences. This paper also evaluates the stability of the Chinese banking system by simulating the risk contagion process. The paper contributes to the literature on interbank contagion mainly in two ways: first, it establishes a convincing connection between the interbank market and transfer entropy, and exploits market information (stock prices) rather than presumptions to determine the interbank exposure matrix; second, the empirical analysis provides an in-depth understanding of the stability of the current Chinese banking system.
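
    A pairwise transfer entropy estimator is the building block of such an exposure matrix; below is a hedged discrete version with history length 1 and quantile binning, run on synthetic series where y demonstrably drives x. The bin count and history length are assumptions, and the paper's estimation details may differ.

        import numpy as np
        from collections import Counter

        def transfer_entropy(x, y, bins=3):
            # TE(Y->X) = sum p(x1,x0,y0) * log2[ p(x1|x0,y0) / p(x1|x0) ].
            xs = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
            ys = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
            triples = Counter(zip(xs[1:], xs[:-1], ys[:-1]))
            pairs_xy = Counter(zip(xs[:-1], ys[:-1]))
            pairs_xx = Counter(zip(xs[1:], xs[:-1]))
            singles = Counter(xs[:-1])
            n = len(xs) - 1
            te = 0.0
            for (x1, x0, y0), c in triples.items():
                p_joint = c / n
                p_cond_xy = c / pairs_xy[(x0, y0)]
                p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
                te += p_joint * np.log2(p_cond_xy / p_cond_x)
            return te

        rng = np.random.default_rng(3)
        y = rng.normal(size=5000)
        x = np.roll(y, 1) + 0.5 * rng.normal(size=5000)  # x driven by lagged y
        print(transfer_entropy(x, y), transfer_entropy(y, x))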

  19. Statistical region based active contour using a fractional entropy descriptor: Application to nuclei cell segmentation in confocal microscopy images

    OpenAIRE

    Histace, A; Meziou, B J; Matuszewski, Bogdan; Precioso, F; Murphy, M F; Carreiras, F

    2013-01-01

    We propose an unsupervised statistical region based active contour approach integrating an original fractional entropy measure for image segmentation with a particular application to single channel actin tagged fluorescence confocal microscopy image segmentation. Following description of statistical based active contour segmentation and the mathematical definition of the proposed fractional entropy descriptor, we demonstrate comparative segmentation results between the proposed approach and s...

  20. Activity-Based Approach for Teaching Aqueous Solubility, Energy, and Entropy

    Science.gov (United States)

    Eisen, Laura; Marano, Nadia; Glazier, Samantha

    2014-01-01

    We describe an activity-based approach for teaching aqueous solubility to introductory chemistry students that provides a more balanced presentation of the roles of energy and entropy in dissolution than is found in most general chemistry textbooks. In the first few activities, students observe that polar substances dissolve in water, whereas…

  1. A Text Steganographic System Based on Word Length Entropy Rate

    Directory of Open Access Journals (Sweden)

    Francis Xavier Kofi Akotoye

    2017-10-01

    Full Text Available The widespread adoption of electronic distribution of material is accompanied by illicit copying and distribution. This is why individuals, businesses, and governments have come to consider how to protect their work, prevent such illicit activities, and trace the distribution of a document. It is in this context that much attention is being focused on steganography. Implementing steganography in text documents is not an easy undertaking, considering that a text document has very few places in which to embed hidden data. Any minute change introduced to text objects can easily be noticed, attracting attention from possible attackers. This study investigates the possibility of embedding data in a text document by employing the entropy rate of the constituent characters of words at least four characters long. The scheme embeds bits in text according to the alphabetic structure of the words: each word's first character is compared with the succeeding character, and if the first character is alphabetically lower than the succeeding character according to their ASCII codes, a zero bit is embedded; otherwise, a 1 is embedded after the characters have been transposed. Before embedding, the secret message is encrypted with a secret key to add a layer of security, and then a pseudorandom number generated from the word counts of the text is used to mark the starting point of the embedding process. The embedding capacity of the scheme is relatively high compared with space encoding and semantic methods.
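
    The embedding rule is described concretely enough to sketch. Below is a hedged toy implementation of just the bit-embedding and extraction steps (the payload encryption and pseudorandom start point the study also uses are omitted); words whose first two characters are equal are skipped, since their order cannot encode a bit.

        def embed_bits(cover_words, bits):
            # For each word of >= 4 chars, encode one bit in the ASCII order
            # of its first two characters (bit 0 <-> first char lower),
            # transposing them when needed.
            out, i = [], 0
            for w in cover_words:
                if i < len(bits) and len(w) >= 4 and w[0] != w[1]:
                    want_low = bits[i] == 0
                    is_low = ord(w[0]) < ord(w[1])
                    if want_low != is_low:
                        w = w[1] + w[0] + w[2:]   # transpose first two chars
                    i += 1
                out.append(w)
            return " ".join(out), i

        def extract_bits(stego_words, n_bits):
            bits = []
            for w in stego_words:
                if len(bits) < n_bits and len(w) >= 4 and w[0] != w[1]:
                    bits.append(0 if ord(w[0]) < ord(w[1]) else 1)
            return bits

        words = "their method hides secret payload inside ordinary text".split()
        stego, used = embed_bits(words, [1, 0, 1, 1, 0])
        print(stego)
        print(extract_bits(stego.split(), 5))   # recovers [1, 0, 1, 1, 0]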

  2. Fault detection in nonlinear chemical processes based on kernel entropy component analysis and angular structure

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Qingchao; Yan, Xuefeng; Lv, Zhaomin; Guo, Meijin [East China University of Science and Technology, Shanghai (China)

    2013-06-15

    Considering that kernel entropy component analysis (KECA) is a promising new method of nonlinear data transformation and dimensionality reduction, a KECA-based method is proposed for nonlinear chemical process monitoring. In this method, an angle-based statistic is designed because KECA reveals structure related to the Renyi entropy of the input space data set, and the transformed data sets are produced with a distinct angle-based structure. Based on the angle difference between normal status and current sample data, the current status can be monitored effectively. The confidence limit of the angle-based statistic is determined by kernel density estimation based on sample data of the normal status. The effectiveness of the proposed method is demonstrated by case studies on both a numerical process and a simulated continuous stirred tank reactor (CSTR) process. The KECA-based method can be an effective method for nonlinear chemical process monitoring.

  3. Fault detection in nonlinear chemical processes based on kernel entropy component analysis and angular structure

    International Nuclear Information System (INIS)

    Jiang, Qingchao; Yan, Xuefeng; Lv, Zhaomin; Guo, Meijin

    2013-01-01

    Considering that kernel entropy component analysis (KECA) is a promising new method of nonlinear data transformation and dimensionality reduction, a KECA-based method is proposed for nonlinear chemical process monitoring. In this method, an angle-based statistic is designed because KECA reveals structure related to the Renyi entropy of the input space data set, and the transformed data sets are produced with a distinct angle-based structure. Based on the angle difference between normal status and current sample data, the current status can be monitored effectively. The confidence limit of the angle-based statistic is determined by kernel density estimation based on sample data of the normal status. The effectiveness of the proposed method is demonstrated by case studies on both a numerical process and a simulated continuous stirred tank reactor (CSTR) process. The KECA-based method can be an effective method for nonlinear chemical process monitoring
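
    A minimal KECA transformation, as described in the two records above: eigendecompose a Gaussian kernel matrix and keep the eigenpairs with the largest contribution lambda_i * (1'e_i)^2 to the Rényi quadratic entropy estimate, rather than the largest eigenvalues as in kernel PCA. The angle-based monitoring statistic and its kernel-density confidence limit are not reproduced; gamma and the toy data are assumptions.

        import numpy as np

        def keca(X, n_components=2, gamma=1.0):
            # Gaussian kernel matrix on the training data.
            sq = np.sum(X**2, axis=1)
            K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
            lam, E = np.linalg.eigh(K)                 # ascending eigenvalues
            entropy = lam * (E.sum(axis=0)) ** 2       # entropy contribution
            idx = np.argsort(entropy)[::-1][:n_components]
            return E[:, idx] * np.sqrt(np.abs(lam[idx]))

        rng = np.random.default_rng(4)
        X = np.vstack([rng.normal(0, 0.3, (30, 4)),
                       rng.normal(2, 0.3, (30, 4))])   # two operating regimes
        Z = keca(X, n_components=2, gamma=0.5)
        print(Z.shape)   # (60, 2): angle-based statistics are built on Z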

  4. On the Entropy Based Associative Memory Model with Higher-Order Correlations

    Directory of Open Access Journals (Sweden)

    Masahiro Nakagawa

    2010-01-01

    Full Text Available In this paper, an entropy-based associative memory model is proposed and applied to memory retrievals with an orthogonal learning model, for comparison with the conventional model based on a quadratic Lyapunov functional minimized during the retrieval process. In the present approach, the updating dynamics are constructed on the basis of an entropy minimization strategy, which may be reduced asymptotically to the conventional dynamics as a special case that ignores the higher-order correlations. By introducing the entropy functional, one may involve higher-order correlation effects between neurons in a self-contained manner, without the heuristic coupling coefficients of the conventional approach. In fact, we show that such higher-order coupling tensors are uniquely determined in the framework of the entropy-based approach. Numerical results show that the proposed approach realizes a much larger memory capacity than the quadratic Lyapunov functional approach, e.g., the associatron.

  5. A Real-Time Analysis Method for Pulse Rate Variability Based on Improved Basic Scale Entropy

    Directory of Open Access Journals (Sweden)

    Yongxin Chou

    2017-01-01

    Full Text Available Base scale entropy analysis (BSEA) is a nonlinear method to analyze heart rate variability (HRV) signals. However, the time consumption of BSEA is too long, and it is unknown whether BSEA is suitable for analyzing pulse rate variability (PRV) signals. Therefore, we propose a method named sliding window iterative base scale entropy analysis (SWIBSEA), combining BSEA and sliding window iterative theory. Blood pressure signals of healthy young and old subjects are chosen from the authoritative international database MIT/PhysioNet/Fantasia to generate PRV signals as the experimental data. Then, BSEA and SWIBSEA are used to analyze the experimental data; the results show that SWIBSEA reduces the time consumption and the buffer cache space while producing the same entropy as BSEA. Meanwhile, the changes of base scale entropy (BSE) for healthy young and old subjects are the same as those of the HRV signal. Therefore, SWIBSEA can be used to derive information from long-term and short-term PRV signals in real time, which gives it potential for dynamic PRV signal analysis in portable and wearable medical devices.

  6. Research on Sustainable Development Level Evaluation of Resource-based Cities Based on Shapley Entropy and Choquet Integral

    Science.gov (United States)

    Zhao, Hui; Qu, Weilu; Qiu, Weiting

    2018-03-01

    In order to evaluate the sustainable development level of resource-based cities, an evaluation method based on Shapley entropy and the Choquet integral is proposed. First, a systematic index system is constructed and the importance of each attribute is calculated based on the maximum Shapley entropy principle; the Choquet integral is then introduced to calculate the comprehensive evaluation value of each city from the bottom up. Finally, the method is applied to 10 typical resource-based cities in China. The empirical results show that the evaluation method is scientific and reasonable, providing theoretical support for the sustainable development path and reform direction of resource-based cities.
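
    The Choquet integral step can be made concrete: with a capacity (fuzzy measure) over the criteria, sort an alternative's scores ascending and weight each increment by the capacity of the coalition of criteria scoring at least that much. The capacity below is hand-set for illustration; in the paper it would follow from the maximum Shapley entropy principle.

        def choquet_integral(scores, capacity):
            # scores: {criterion: value}; capacity: {frozenset: weight},
            # with capacity(all criteria) = 1 and capacity(empty) = 0.
            order = sorted(scores, key=scores.get)   # criteria by score, asc
            total, prev = 0.0, 0.0
            for i, c in enumerate(order):
                coalition = frozenset(order[i:])     # criteria scoring >= current
                total += (scores[c] - prev) * capacity[coalition]
                prev = scores[c]
            return total

        # Hypothetical capacity over criteria {eco, env, soc}.
        capacity = {
            frozenset(): 0.0,
            frozenset({"eco"}): 0.3, frozenset({"env"}): 0.4,
            frozenset({"soc"}): 0.2,
            frozenset({"eco", "env"}): 0.8, frozenset({"eco", "soc"}): 0.5,
            frozenset({"env", "soc"}): 0.6,
            frozenset({"eco", "env", "soc"}): 1.0,
        }
        city = {"eco": 0.7, "env": 0.5, "soc": 0.9}
        print(choquet_integral(city, capacity))   # 0.64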

  7. Fault Detection and Diagnosis for Gas Turbines Based on a Kernelized Information Entropy Model

    Directory of Open Access Journals (Sweden)

    Weiying Wang

    2014-01-01

    Full Text Available Gas turbines are considered one of the most important devices in power engineering and have been widely used in power generation, airplanes, naval ships, and oil drilling platforms. However, in most cases they are monitored without an operator on duty. It is highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflect the overall state of the gas paths of the gas turbine. In addition, we extend the entropy to compute the information quantity of features in kernel spaces, which helps to select informative features for a given recognition task. Finally, we introduce an information entropy based decision tree algorithm to extract rules from fault samples. Experiments on real-world data show the effectiveness of the proposed algorithms.

  8. Fault detection and diagnosis for gas turbines based on a kernelized information entropy model.

    Science.gov (United States)

    Wang, Weiying; Xu, Zhiqiang; Tang, Rui; Li, Shuying; Wu, Wei

    2014-01-01

    Gas turbines are considered one of the most important devices in power engineering and have been widely used in power generation, airplanes, naval ships, and oil drilling platforms. However, in most cases they are monitored without an operator on duty. It is highly desirable to develop techniques and systems to remotely monitor their conditions and analyze their faults. In this work, we introduce a remote system for online condition monitoring and fault diagnosis of gas turbines on offshore oil well drilling platforms based on a kernelized information entropy model. Shannon information entropy is generalized for measuring the uniformity of exhaust temperatures, which reflect the overall state of the gas paths of the gas turbine. In addition, we extend the entropy to compute the information quantity of features in kernel spaces, which helps to select informative features for a given recognition task. Finally, we introduce an information entropy based decision tree algorithm to extract rules from fault samples. Experiments on real-world data show the effectiveness of the proposed algorithms.
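
    The uniformity measure these two records describe reduces, in its simplest form, to the normalized Shannon entropy of the ring of exhaust thermocouple readings: near 1 for a healthy uniform profile, lower when a sector runs hot or cold. The readings below are fabricated for illustration.

        import numpy as np

        def exhaust_uniformity_entropy(temps):
            # Normalized Shannon entropy of the exhaust-temperature profile.
            t = np.asarray(temps, dtype=float)
            p = t / t.sum()
            return -np.sum(p * np.log(p)) / np.log(len(t))

        healthy = [812, 815, 810, 813, 811, 814, 812, 813]
        faulty  = [812, 815, 700, 813, 811, 905, 812, 813]  # gas path anomaly
        print(exhaust_uniformity_entropy(healthy),
              exhaust_uniformity_entropy(faulty))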

  9. Entropy of the Mixture of Sources and Entropy Dimension

    OpenAIRE

    Smieja, Marek; Tabor, Jacek

    2011-01-01

    We investigate the problem of the entropy of a mixture of sources. An estimate of the entropy and entropy dimension of a convex combination of measures is given. The proof is based on our alternative definition of entropy, based on measures instead of partitions.

  10. Quantitative design of emergency monitoring network for river chemical spills based on discrete entropy theory.

    Science.gov (United States)

    Shi, Bin; Jiang, Jiping; Sivakumar, Bellie; Zheng, Yi; Wang, Peng

    2018-05-01

    A field monitoring strategy is critical for disaster preparedness and watershed emergency environmental management. However, developing one is also highly challenging. Despite the efforts and progress thus far, no definitive guidelines or solutions are available worldwide for quantitatively designing a monitoring network in response to river chemical spill incidents, beyond general rules based on administrative divisions or arbitrary interpolation on routine monitoring sections. To address this gap, a novel framework for spatial-temporal network design is proposed in this study. The framework combines contaminant transport modelling with discrete entropy theory and spectral analysis. The water quality model was applied to forecast the spatio-temporal distribution of the contaminant after spills, and the corresponding information transfer indexes (ITIs) and Fourier approximation periodic functions were estimated as critical measures for setting sampling locations and times. The results indicate that the framework can produce scientific emergency monitoring preparedness plans based on scenario analysis of spill risks, as well as rapid designs when an unanticipated incident occurs. The framework was applied to a hypothetical spill case based on a tracer experiment and to a real nitrobenzene spill incident to demonstrate its suitability and effectiveness. The newly designed temporal-spatial monitoring network captured the major pollution information at relatively low cost, with obvious benefits for follow-up early warning and treatment as well as for aftermath recovery and assessment. The underlying drivers of the ITIs, as well as the limitations and uncertainty of the approach, were analyzed based on the case studies. Comparisons with existing monitoring network design approaches, management implications, and generalized applicability are also discussed.

  11. COLLAGE-BASED INVERSE PROBLEMS FOR IFSM WITH ENTROPY MAXIMIZATION AND SPARSITY CONSTRAINTS

    Directory of Open Access Journals (Sweden)

    Herb Kunze

    2013-11-01

    Full Text Available We consider the inverse problem associated with IFSM: given a target function f, find an IFSM such that its invariant fixed point f̄ is sufficiently close to f in the Lp distance. In this paper, we extend the collage-based method developed by Forte and Vrscay (1995) along two different directions. We first search for a set of mappings that not only minimizes the collage error but also maximizes the entropy of the dynamical system. We then include an extra term in the minimization process which takes into account the sparsity of the set of mappings. In this new formulation, the minimization of collage error is treated as a multi-criteria problem: we consider three different and conflicting criteria, i.e., collage error, entropy, and sparsity. To solve this multi-criteria program we proceed by scalarization and reduce the model to a single-criterion program by combining all objective functions with different trade-off weights. The results of some numerical computations are presented. Numerical studies indicate that a maximum entropy principle exists for this approximation problem, i.e., the suboptimal solutions produced by collage coding can be improved at least slightly by adding a maximum entropy criterion.

  12. Permutation entropy based time series analysis: Equalities in the input signal can lead to false conclusions

    Energy Technology Data Exchange (ETDEWEB)

    Zunino, Luciano, E-mail: lucianoz@ciop.unlp.edu.ar [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: olivaresfe@gmail.com [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Scholkmann, Felix, E-mail: Felix.Scholkmann@gmail.com [Research Office for Complex Physical and Biological Systems (ROCoS), Mutschellenstr. 179, 8038 Zurich (Switzerland); Biomedical Optics Research Laboratory, Department of Neonatology, University Hospital Zurich, University of Zurich, 8091 Zurich (Switzerland); Rosso, Osvaldo A., E-mail: oarosso@gmail.com [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)

    2017-06-15

    A symbolic encoding scheme, based on the ordinal relation between the amplitude of neighboring values of a given data sequence, should be implemented before estimating the permutation entropy. Consequently, equalities in the analyzed signal, i.e. repeated equal values, deserve special attention and treatment. In this work, we carefully study the effect that the presence of equalities has on permutation entropy estimated values when these ties are symbolized, as it is commonly done, according to their order of appearance. On the one hand, the analysis of computer-generated time series is initially developed to understand the incidence of repeated values on permutation entropy estimations in controlled scenarios. The presence of temporal correlations is erroneously concluded when true pseudorandom time series with low amplitude resolutions are considered. On the other hand, the analysis of real-world data is included to illustrate how the presence of a significant number of equal values can give rise to false conclusions regarding the underlying temporal structures in practical contexts. - Highlights: • Impact of repeated values in a signal when estimating permutation entropy is studied. • Numerical and experimental tests are included for characterizing this limitation. • Non-negligible temporal correlations can be spuriously concluded by repeated values. • Data digitized with low amplitude resolutions could be especially affected. • Analysis with shuffled realizations can help to overcome this limitation.
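
    The effect is easy to reproduce: estimate permutation entropy with ties ranked by order of appearance (a stable argsort) on white noise, then on the same noise quantized to a coarse amplitude resolution; the quantized series shows spuriously reduced entropy, mimicking temporal correlation. The order 3 and the rounding step are assumptions for the demonstration.

        import numpy as np
        from collections import Counter
        from math import factorial

        def permutation_entropy(x, order=3, normalize=True):
            # Ordinal patterns with ties ranked by order of appearance,
            # the common convention the paper scrutinizes.
            x = np.asarray(x, dtype=float)
            patterns = Counter(
                tuple(np.argsort(x[i:i + order], kind="stable"))
                for i in range(len(x) - order + 1)
            )
            p = np.array(list(patterns.values()), dtype=float)
            p /= p.sum()
            H = -np.sum(p * np.log2(p))
            return H / np.log2(factorial(order)) if normalize else H

        rng = np.random.default_rng(5)
        fine = rng.normal(size=10000)
        coarse = np.round(fine)   # very coarse resolution -> many ties
        print(permutation_entropy(fine), permutation_entropy(coarse))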

  13. Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy

    Science.gov (United States)

    Yujun, Yang; Jianping, Li; Yimei, Yang

    This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) method to analyze the short-range and long-range characteristics of financial time series, and applies this method to five time series of five properties in four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of Rényi entropy and the generalized Hurst exponent of five properties of four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing the curves of the Rényi entropy and the profiles of the logarithmic distribution of MFDFA of five properties of four stock indices, the 3MPAR method reveals fluctuation characteristics of the financial time series and the stock markets. It also reveals richer information about the financial time series by comparing the profiles of the five properties of the four stock indices. In this paper, we focus not only on the multifractality of the time series but also on the fluctuation characteristics of the financial time series and the subtle differences between the time series of different properties. We find that financial time series are far more complex than reported in research works that use a single property of the time series.

  14. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing the entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. The entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, which indicates that the five generalized distributions fit the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis. The entropy-based derivation opens a new way for frequency analysis of hydrometeorological extremes.

  15. Application of SNODAS and hydrologic models to enhance entropy-based snow monitoring network design

    Science.gov (United States)

    Keum, Jongho; Coulibaly, Paulin; Razavi, Tara; Tapsoba, Dominique; Gobena, Adam; Weber, Frank; Pietroniro, Alain

    2018-06-01

Snow has a unique characteristic in the water cycle: snow falls during the entire winter season, but the discharge from snowmelt is typically delayed until the melting period and occurs over a relatively short time. Therefore, reliable observations from an optimal snow monitoring network are necessary for efficient management of snowmelt water for flood prevention and hydropower generation. The Dual Entropy and Multiobjective Optimization is applied to design snow monitoring networks in the La Grande River Basin in Québec and the Columbia River Basin in British Columbia. While the networks are optimized to have the maximum amount of information with minimum redundancy based on entropy concepts, this study extends traditional entropy applications to hydrometric network design by introducing several improvements. First, several data quantization cases and their effects on the snow network design problems were explored. Second, the applicability of the Snow Data Assimilation System (SNODAS) products as synthetic datasets of potential stations was demonstrated in the design of the snow monitoring network of the Columbia River Basin. Third, beyond finding the Pareto-optimal networks from the entropy with multi-objective optimization, the networks obtained for the La Grande River Basin were further evaluated by applying three hydrologic models. The calibrated hydrologic models simulated discharges using the updated snow water equivalent data from the Pareto-optimal networks. Then, the model performances for high flows were compared to determine the best optimal network for enhanced spring runoff forecasting.

  16. Cooperative Localization for Multi-AUVs Based on GM-PHD Filters and Information Entropy Theory

    Directory of Open Access Journals (Sweden)

    Lichuan Zhang

    2017-10-01

Cooperative localization (CL) is considered a promising method for underwater localization with respect to multiple autonomous underwater vehicles (multi-AUVs). In this paper, we propose a CL algorithm based on information entropy theory and the probability hypothesis density (PHD) filter, aiming to enhance the global localization accuracy of the follower. In the proposed framework, the follower carries lower-cost navigation systems, whereas the leaders carry better ones. Meanwhile, the leaders acquire the followers’ observations, including both measurements and clutter. The PHD filters are then run on the leaders and the results are communicated to the followers. Each follower performs a weighted summation based on all received messages and obtains a final positioning result. Based on the information entropy theory and the PHD filter, the follower is able to acquire precise knowledge of its position.

  17. Crane Safety Assessment Method Based on Entropy and Cumulative Prospect Theory

    Directory of Open Access Journals (Sweden)

    Aihua Li

    2017-01-01

Assessing the safety status of cranes is an important problem. To overcome the inaccuracies and misjudgments in such assessments, this work describes a safety assessment method for cranes that combines entropy and cumulative prospect theory. Firstly, the proposed method transforms the set of evaluation indices into an evaluation vector. Secondly, a decision matrix is constructed from the evaluation vectors and evaluation standards, and an entropy-based technique is applied to calculate the index weights. Thirdly, positive and negative prospect value matrices are established from reference points based on the positive and negative ideal solutions. This enables the crane safety grade to be determined according to the ranked comprehensive prospect values. Finally, the safety status of four general overhead traveling crane samples is evaluated to verify the rationality and feasibility of the proposed method. The results demonstrate that the method described in this paper can precisely and reasonably reflect the safety status of a crane.
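
    The entropy-based index-weighting step is standard enough to sketch; the following toy example (hypothetical data, assuming numpy) derives weights from a decision matrix the way entropy weight methods typically do, with no claim to match the authors' exact formulation.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: columns of X are evaluation indices,
    rows are samples; indices whose values barely vary across samples
    carry little information and receive a lower weight."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                      # normalize each index column
    m = X.shape[0]
    logs = np.where(P > 0, np.log(P, where=P > 0), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(m)    # entropy per index
    d = 1.0 - e                                # degree of divergence
    return d / d.sum()

# Four crane samples scored on three hypothetical safety indices;
# the nearly constant second index ends up with the smallest weight.
X = [[0.8, 0.6, 0.9],
     [0.7, 0.6, 0.4],
     [0.9, 0.5, 0.8],
     [0.6, 0.6, 0.7]]
print(entropy_weights(X))
```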

  18. A Variational Level Set Approach Based on Local Entropy for Image Segmentation and Bias Field Correction.

    Science.gov (United States)

    Tang, Jian; Jiang, Xiaoliang

    2017-01-01

Image segmentation has always been a considerable challenge in image analysis and understanding due to intensity inhomogeneity, which is also commonly known as bias field. In this paper, we present a novel region-based approach built on local entropy for segmenting images and estimating the bias field simultaneously. Firstly, a local Gaussian distribution fitting (LGDF) energy function is defined as a weighted energy integral, where the weight is the local entropy derived from the grey-level distribution of the local image. The means in this objective function carry a multiplicative factor that estimates the bias field in the transformed domain. The bias field prior is then fully exploited, which allows our model to estimate the bias field more accurately. Finally, by minimizing this energy function with a level set regularization term, image segmentation and bias field estimation are achieved simultaneously. Experiments on images of various modalities demonstrated the superior performance of the proposed method when compared with other state-of-the-art approaches.

  19. Non-Gaussian Systems Control Performance Assessment Based on Rational Entropy

    Directory of Open Access Journals (Sweden)

    Jinglin Zhou

    2018-05-01

Control loop performance assessment (CPA) plays an important role in system operations. Stochastic statistical CPA indices, such as the minimum variance controller (MVC)-based CPA index, are among the most widely used. In this paper, a new minimum entropy controller (MEC)-based CPA method for linear non-Gaussian systems is proposed. In this method, the probability density function (PDF) and the rational entropy (RE) are used to describe the characteristics and the uncertainty of random variables, respectively. To better estimate the performance benchmark, an improved EDA algorithm, which is used to estimate the system parameters and the noise PDF, is given. The effectiveness of the proposed method is illustrated through case studies on an ARMAX system.

  20. A Novel MADM Approach Based on Fuzzy Cross Entropy with Interval-Valued Intuitionistic Fuzzy Sets

    Directory of Open Access Journals (Sweden)

    Xin Tong

    2015-01-01

The paper presents a novel multiple attribute decision-making (MADM) approach for problems with completely unknown attribute weights in the framework of interval-valued intuitionistic fuzzy sets (IVIFS). First, the fuzzy cross entropy and the discrimination degree of IVIFS are defined. Subsequently, based on the discrimination degree of IVIFS, a nonlinear programming model is constructed to minimize the total deviation of the discrimination degrees between alternatives and the positive ideal solution (PIS) as well as the negative ideal solution (NIS), which yields the attribute weights and, in turn, the weighted discrimination degrees. Finally, all the alternatives are ranked according to their relative closeness coefficients using the extended TOPSIS method, and the most desirable alternative is chosen. The proposed approach extends the research method of MADM based on the IVIF cross entropy, and its feasibility and validity are illustrated by two examples.

  1. Special Issue on Entropy-Based Applied Cryptography and Enhanced Security for Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

James (Jong Hyuk) Park

    2016-09-01

Entropy is a basic and important concept in information theory. It is also often used as a measure of the unpredictability of a cryptographic key in cryptography research areas. Ubiquitous computing (Ubi-comp) has emerged rapidly as an exciting new paradigm. For this special issue, we mainly selected and discussed papers related to core theories based on graph theory for solving computational problems in cryptography and security, and to practical technologies, applications and services for Ubi-comp, including secure encryption techniques; identity and authentication; credential cloning attacks and countermeasures; a switching generator with resistance against algebraic and side-channel attacks; entropy-based network anomaly detection; applied cryptography using chaos functions; information hiding and watermarking; secret sharing; message authentication; detection and modeling of cyber attacks with Petri nets; and quantum flows for secret key distribution.

  2. Optimization and large scale computation of an entropy-based moment closure

    Science.gov (United States)

    Kristopher Garrett, C.; Hauck, Cory; Hill, Judith

    2015-12-01

We present computational advances and results in the implementation of an entropy-based moment closure, MN, in the context of linear kinetic equations, with an emphasis on heterogeneous and large-scale computing platforms. Entropy-based closures are known in several cases to yield more accurate results than closures based on standard spectral approximations, such as PN, but the computational cost is generally much higher and often prohibitive. Several optimizations are introduced to improve the performance of entropy-based algorithms over previous implementations. These optimizations include the use of GPU acceleration and the exploitation of the mathematical properties of spherical harmonics, which are used as test functions in the moment formulation. To test the emerging high-performance computing paradigm of communication-bound simulations, we present timing results at the largest computational scales currently available. These results show, in particular, load-balancing issues in scaling the MN algorithm that do not appear for the PN algorithm. We also observe that in weak scaling tests, the ratio in time to solution of MN to PN decreases.

  3. Comparison of tomography reconstruction by maximum entropy and filtered back-projection

    International Nuclear Information System (INIS)

    Abdala, F.J.P.; Simpson, D.M.; Roberty, N.C.

    1992-01-01

The problem of tomographic reconstruction from few projections is studied, comparing the maximum entropy method with filtered back-projection. Simulations with and without the presence of noise, and also with the presence of a high-density object inside the skull, are shown. (C.G.C.)

  4. BRISENT: An Entropy-Based Model for Bridge-Pier Scour Estimation under Complex Hydraulic Scenarios

    Directory of Open Access Journals (Sweden)

    Alonso Pizarro

    2017-11-01

The goal of this paper is to introduce the first clear-water scour model based on both the informational entropy concept and the principle of maximum entropy, showing that a variational approach is ideal for describing erosional processes under complex situations. The proposed bridge-pier scour entropic (BRISENT) model is capable of reproducing the main dynamics of scour-depth evolution under steady hydraulic conditions, step-wise hydrographs, and flood waves. For the calibration process, 266 clear-water scour experiments from 20 previous studies were considered, in which the dimensionless parameters varied widely. Simple formulations are proposed to estimate BRISENT's fitting coefficients, with the ratio between pier diameter and sediment size being the most important physical characteristic controlling the scour model parametrization. A validation process considering highly unsteady and multi-peaked hydrographs was carried out, showing that the proposed BRISENT model reproduces scour evolution with high accuracy.

  5. The prediction of engineering cost for green buildings based on information entropy

    Science.gov (United States)

    Liang, Guoqiang; Huang, Jinglian

    2018-03-01

Green building is the developing trend in the world building industry, and construction costs are an essential consideration in building construction. It is therefore necessary to investigate cost prediction for green buildings. On the basis of analyzing the costs of green building, this paper proposes a method for forecasting the actual cost of green building based on information entropy and provides the forecasting working procedure. Using the probability density obtained from statistical data such as labor costs, material costs, machinery costs, administration costs, profits and risk costs of a unit project quotation, situations that lead to variations between budgeted cost and actual cost in construction can be predicted by estimating the information entropy of the budgeted and actual costs. The research results of this article have practical significance for cost control in green building, and the proposed method can be generalized and applied to a variety of other aspects of building management.

  6. Rolling Bearing Fault Diagnosis Based on ELCD Permutation Entropy and RVM

    Directory of Open Access Journals (Sweden)

    Jiang Xingmeng

    2016-01-01

Aiming at the nonstationary characteristics of gear fault vibration signals, a recognition method based on the permutation entropy of ensemble local characteristic-scale decomposition (ELCD) and a relevance vector machine (RVM) is proposed. First, the vibration signal is decomposed by ELCD to obtain a series of intrinsic scale components (ISCs). Second, according to the kurtosis of the ISCs, the principal ISCs are selected; the permutation entropies of the principal ISCs are then calculated and combined into a feature vector. Finally, the feature vectors are input to the RVM classifier for training and testing to identify the type of rolling bearing fault. Experimental results show that this method can effectively diagnose four kinds of working conditions, and that it performs better than the local characteristic-scale decomposition (LCD) method.

  7. Evaluation of Intensive Construction Land Use in the Emerging City Based on PSR-Entropy model

    Science.gov (United States)

    Jia, Yuanyuan; Lei, Guangyu

    2018-01-01

A comprehensive understanding of land utilization in an emerging city and the evaluation of its intensive land use provide a comprehensive and reliable technical basis for planning and management. Based on land use data for Hancheng, an important node city, from 2008 to 2016, an evaluation system built on the PSR-Entropy model of land use was established, using the entropy method to determine the index weights and introducing the comprehensive index method to evaluate the degree of land use. The results show that the comprehensive evaluation index of intensive land use in Hancheng increased from 2008 to 2015, but intensive land use still did not reach the required standard, so the potential for further improvement remains relatively large.

  8. An Entropy-Based Adaptive Hybrid Particle Swarm Optimization for Disassembly Line Balancing Problems

    Directory of Open Access Journals (Sweden)

    Shanli Xiao

    2017-11-01

In order to improve product disassembly efficiency, the disassembly line balancing problem (DLBP) is transformed into a problem of searching for the optimum path in a directed and weighted graph by constructing the disassembly hierarchy information graph (DHIG). Then, combining the characteristics of the disassembly sequence, an entropy-based adaptive hybrid particle swarm optimization algorithm (AHPSO) is presented. In this algorithm, entropy is introduced to measure the changing tendency of population diversity, and dimension learning, crossover and mutation operators are used to increase the probability of producing feasible disassembly solutions (FDS). The performance of the proposed methodology is tested on the primary problem instances available in the literature, and the results are compared with those of other evolutionary algorithms. The results show that the proposed algorithm solves the complex DLBP efficiently.

  9. Particle swarm optimization-based local entropy weighted histogram equalization for infrared image enhancement

    Science.gov (United States)

    Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian; Maldague, Xavier

    2018-06-01

Infrared image enhancement plays a significant role in intelligent urban surveillance systems for smart city applications. Unlike existing methods that only exaggerate the global contrast, we propose a particle swarm optimization-based local entropy weighted histogram equalization which enhances both local details and the contrast between foreground and background. First of all, a novel local entropy weighted histogram depicting the distribution of detail information is calculated based on a modified hyperbolic tangent function. Then, the histogram is divided into two parts via a threshold maximizing the inter-class variance in order to improve the contrasts of foreground and background, respectively. To avoid over-enhancement and noise amplification, double plateau thresholds for the presented histogram are formulated by means of a particle swarm optimization algorithm. Lastly, each sub-image is equalized independently according to the constrained sub-local-entropy-weighted histogram. Comparative experiments implemented on real infrared images prove that our algorithm outperforms other state-of-the-art methods in terms of both visual and quantitative evaluations.
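
    The double-plateau idea can be illustrated with a stripped-down sketch in which the two plateau thresholds are fixed by hand rather than optimized by particle swarm, and the local entropy weighting is omitted; all names and values are hypothetical.

```python
import numpy as np

def double_plateau_equalize(img, t_up, t_low, levels=256):
    """Histogram equalization with an upper plateau (clips dominant
    background bins) and a lower plateau (lifts sparse detail bins).
    t_up and t_low are fixed here, whereas the record above selects
    them by particle swarm optimization."""
    hist = np.bincount(img.ravel(), minlength=levels).astype(float)
    nz = hist > 0
    hist[nz] = np.clip(hist[nz], t_low, t_up)   # double-plateau limit
    cdf = np.cumsum(hist)
    lut = np.round((levels - 1) * cdf / cdf[-1]).astype(np.uint8)
    return lut[img]

rng = np.random.default_rng(0)
img = rng.integers(90, 120, size=(64, 64), dtype=np.uint8)  # low-contrast scene
out = double_plateau_equalize(img, t_up=200.0, t_low=5.0)
print(img.min(), img.max(), '->', out.min(), out.max())
```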

  10. Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS

    Directory of Open Access Journals (Sweden)

    Moshen Kuai

    2018-03-01

Because the planetary gear has the characteristics of small volume, light weight and large transmission ratio, it is widely used in high-speed, high-power mechanical systems, where poor working conditions result in frequent failures. This paper proposes a method for diagnosing faults in planetary gears based on the permutation entropy of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and an Adaptive Neuro-Fuzzy Inference System (ANFIS). The original signal is decomposed into six intrinsic mode functions (IMFs) and residual components by CEEMDAN. Since the IMFs contain the main characteristic information of planetary gear faults, the time complexity of the IMFs is reflected by permutation entropies in order to quantify the fault features. The permutation entropies of each IMF component are used as the input of the ANFIS, and its parameters and membership functions are adaptively adjusted according to the training samples. Finally, the fuzzy inference rules are determined and the optimal ANFIS is obtained. The overall recognition rate of the test samples used for the ANFIS is 90%, and the recognition rate for a gear with one missing tooth is relatively high. The recognition rates for different fault gears based on the method also achieve good results. Therefore, the proposed method can be applied effectively to planetary gear fault diagnosis.

  11. Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS.

    Science.gov (United States)

    Kuai, Moshen; Cheng, Gang; Pang, Yusong; Li, Yong

    2018-03-05

Because the planetary gear has the characteristics of small volume, light weight and large transmission ratio, it is widely used in high-speed, high-power mechanical systems, where poor working conditions result in frequent failures. This paper proposes a method for diagnosing faults in planetary gears based on the permutation entropy of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and an Adaptive Neuro-Fuzzy Inference System (ANFIS). The original signal is decomposed into six intrinsic mode functions (IMFs) and residual components by CEEMDAN. Since the IMFs contain the main characteristic information of planetary gear faults, the time complexity of the IMFs is reflected by permutation entropies in order to quantify the fault features. The permutation entropies of each IMF component are used as the input of the ANFIS, and its parameters and membership functions are adaptively adjusted according to the training samples. Finally, the fuzzy inference rules are determined and the optimal ANFIS is obtained. The overall recognition rate of the test samples used for the ANFIS is 90%, and the recognition rate for a gear with one missing tooth is relatively high. The recognition rates for different fault gears based on the method also achieve good results. Therefore, the proposed method can be applied effectively to planetary gear fault diagnosis.

  12. Prediction Model of Collapse Risk Based on Information Entropy and Distance Discriminant Analysis Method

    Directory of Open Access Journals (Sweden)

    Hujun He

    2017-01-01

The prediction and risk classification of collapse is an important issue in the process of highway construction in mountainous regions. Based on the principles of information entropy and Mahalanobis distance discriminant analysis, we have produced a collapse hazard prediction model. We used the entropy measure method to reduce the set of indexes of collapse activity, extracting the nine main indexes affecting collapse activity as the discriminant factors of the distance discriminant analysis model (i.e., slope shape, aspect, gradient, and height, along with exposure of the structural face, stratum lithology, relationship between weakness face and free face, vegetation cover rate, and degree of rock weathering). We employ post-earthquake collapse data from the construction of the Yingxiu-Wolong highway, Wenchuan County, China, as training samples for analysis. The results were analyzed using the back-substitution estimation method, showing high accuracy and no errors, and matching the prediction results of the uncertainty measure method. The results show that the classification model based on information entropy and distance discriminant analysis achieves the purpose of index optimization and has excellent performance, high prediction accuracy, and a zero false-positive rate. The model can be used as a tool for future evaluation of collapse risk.

  13. EEG entropy measures in anesthesia

    Directory of Open Access Journals (Sweden)

Zhenhu Liang

    2015-02-01

Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing the effects of anesthetic drugs is lacking. In this study, we compare the capability of twelve entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP) in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely response entropy (RE) and state entropy (SE), three wavelet entropy (WE) measures (Shannon WE (SWE), Tsallis WE (TWE) and Rényi WE (RWE)), Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy, and three permutation entropy (PE) measures (Shannon PE (SPE), Tsallis PE (TPE) and Rényi PE (RPE)). Two EEG data sets, from sevoflurane-induced and isoflurane-induced anesthesia respectively, were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability analysis were applied. Multifractal detrended fluctuation analysis (MDFA), a non-entropy measure, was included for comparison. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. The three PE measures outperformed the other entropy indices, with less baseline variability and higher coefficients of determination and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, the entropy measures showed an advantage in computational efficiency compared with MDFA. Conclusion: Each entropy index has its advantages and disadvantages in estimating DoA. Overall, the results suggest that the RPE index is a superior measure. Significance: Investigating the advantages and disadvantages of these entropy indices could help improve current clinical indices for monitoring DoA.
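
    Of the twelve indices compared, sample entropy is compact enough to sketch here; this is a generic SampEn implementation (assuming only numpy), not the study's code, and the parameters m and r are conventional defaults.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r): -log of the conditional probability that template
    sequences matching for m points also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def count_matches(length):
        # Chebyshev distance between all pairs of templates of this length.
        templates = np.lib.stride_tricks.sliding_window_view(x, length)
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        n = len(templates)
        return (np.sum(d <= r) - n) / 2        # exclude self-matches

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(1)
noise = rng.standard_normal(500)
smooth = np.convolve(noise, np.ones(10) / 10, mode='valid')
print(sample_entropy(noise), sample_entropy(smooth))  # noise scores higher
```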

  14. Efficient algorithms and implementations of entropy-based moment closures for rarefied gases

    Energy Technology Data Exchange (ETDEWEB)

    Schaerer, Roman Pascal, E-mail: schaerer@mathcces.rwth-aachen.de; Bansal, Pratyuksh; Torrilhon, Manuel

    2017-07-01

We present efficient algorithms and implementations of the 35-moment system equipped with the maximum-entropy closure in the context of rarefied gases. While closures based on the principle of entropy maximization have been shown to yield very promising results for moderately rarefied gas flows, the computational cost of these closures is in general much higher than for closure theories with explicit closed-form expressions of the closing fluxes, such as Grad's classical closure. Following a similar approach as Garrett et al. (2015), we investigate efficient implementations of the computationally expensive numerical quadrature method used for the moment evaluations of the maximum-entropy distribution by exploiting its inherent fine-grained parallelism with the parallelism offered by multi-core processors and graphics cards. We show that using a single graphics card as an accelerator allows speed-ups of two orders of magnitude when compared to a serial CPU implementation. To accelerate the time-to-solution for steady-state problems, we propose a new semi-implicit time discretization scheme. The resulting nonlinear system of equations is solved with a Newton-type method in the Lagrange multipliers of the dual optimization problem in order to reduce the computational cost. Additionally, fully explicit time-stepping schemes of first and second order accuracy are presented. We investigate the accuracy and efficiency of the numerical schemes for several numerical test cases, including a steady-state shock-structure problem.

  15. Efficient algorithms and implementations of entropy-based moment closures for rarefied gases

    International Nuclear Information System (INIS)

    Schaerer, Roman Pascal; Bansal, Pratyuksh; Torrilhon, Manuel

    2017-01-01

We present efficient algorithms and implementations of the 35-moment system equipped with the maximum-entropy closure in the context of rarefied gases. While closures based on the principle of entropy maximization have been shown to yield very promising results for moderately rarefied gas flows, the computational cost of these closures is in general much higher than for closure theories with explicit closed-form expressions of the closing fluxes, such as Grad's classical closure. Following a similar approach as Garrett et al. (2015), we investigate efficient implementations of the computationally expensive numerical quadrature method used for the moment evaluations of the maximum-entropy distribution by exploiting its inherent fine-grained parallelism with the parallelism offered by multi-core processors and graphics cards. We show that using a single graphics card as an accelerator allows speed-ups of two orders of magnitude when compared to a serial CPU implementation. To accelerate the time-to-solution for steady-state problems, we propose a new semi-implicit time discretization scheme. The resulting nonlinear system of equations is solved with a Newton-type method in the Lagrange multipliers of the dual optimization problem in order to reduce the computational cost. Additionally, fully explicit time-stepping schemes of first and second order accuracy are presented. We investigate the accuracy and efficiency of the numerical schemes for several numerical test cases, including a steady-state shock-structure problem.

  16. Efficient algorithms and implementations of entropy-based moment closures for rarefied gases

    Science.gov (United States)

    Schaerer, Roman Pascal; Bansal, Pratyuksh; Torrilhon, Manuel

    2017-07-01

We present efficient algorithms and implementations of the 35-moment system equipped with the maximum-entropy closure in the context of rarefied gases. While closures based on the principle of entropy maximization have been shown to yield very promising results for moderately rarefied gas flows, the computational cost of these closures is in general much higher than for closure theories with explicit closed-form expressions of the closing fluxes, such as Grad's classical closure. Following a similar approach as Garrett et al. (2015) [13], we investigate efficient implementations of the computationally expensive numerical quadrature method used for the moment evaluations of the maximum-entropy distribution by exploiting its inherent fine-grained parallelism with the parallelism offered by multi-core processors and graphics cards. We show that using a single graphics card as an accelerator allows speed-ups of two orders of magnitude when compared to a serial CPU implementation. To accelerate the time-to-solution for steady-state problems, we propose a new semi-implicit time discretization scheme. The resulting nonlinear system of equations is solved with a Newton-type method in the Lagrange multipliers of the dual optimization problem in order to reduce the computational cost. Additionally, fully explicit time-stepping schemes of first and second order accuracy are presented. We investigate the accuracy and efficiency of the numerical schemes for several numerical test cases, including a steady-state shock-structure problem.

  17. Structure of a Global Network of Financial Companies Based on Transfer Entropy

    Directory of Open Access Journals (Sweden)

    Leonidas Sandoval

    2014-08-01

This work uses the stocks of the 197 largest companies in the world, in terms of market capitalization, in the financial area, from 2003 to 2012. We study the causal relationships between them using Transfer Entropy, which is calculated using the stocks of those companies and their counterparts lagged by one day. With this, we can assess which companies influence others according to sub-areas of the financial sector: banks, diversified financial services, savings and loans, insurance, private equity funds, real estate investment companies, and real estate trust funds. We also analyze the exchange of information between those stocks as seen by Transfer Entropy and the network formed by them based on this measure, verifying that they cluster mainly according to countries of origin, and then by industry and sub-industry. We then use data on the stocks of companies in the financial sectors of some of the countries suffering most in the recent credit crisis, namely Greece, Cyprus, Ireland, Spain, Portugal, and Italy, and assess, also using Transfer Entropy, which of the 197 largest companies are most affected by the stocks of these countries in crisis. The aim is to map a network of influences that may be used in the study of possible contagions originating in those countries in financial crisis.
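
    A minimal binned estimator conveys the idea of Transfer Entropy with a one-day lag; it is a simplification of whatever estimator the authors used, with hypothetical data in which one series drives the other.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Binned estimate of TE(y -> x) with a one-step lag:
    TE = H(x_t | x_{t-1}) - H(x_t | x_{t-1}, y_{t-1})."""
    def discretize(s):
        edges = np.quantile(s, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(s, edges)
    xb, yb = discretize(x), discretize(y)
    xt, xp, yp = xb[1:], xb[:-1], yb[:-1]      # present and lagged symbols

    def joint_entropy(*cols):
        _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))
    # TE = H(xt, xp) - H(xp) + H(xp, yp) - H(xt, xp, yp)
    return (joint_entropy(xt, xp) - joint_entropy(xp)
            + joint_entropy(xp, yp) - joint_entropy(xt, xp, yp))

rng = np.random.default_rng(2)
y = rng.standard_normal(5000)
x = 0.8 * np.roll(y, 1) + 0.2 * rng.standard_normal(5000)  # y drives x
print(transfer_entropy(x, y))   # TE(y -> x): large
print(transfer_entropy(y, x))   # TE(x -> y): near zero
```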

  18. [Stochastic characteristics of daily precipitation and its spatiotemporal difference over China based on information entropy].

    Science.gov (United States)

    Li, Xin Xin; Sang, Yan Fang; Xie, Ping; Liu, Chang Ming

    2018-04-01

Daily precipitation over China shows obvious randomness and spatiotemporal variation, so it is important to accurately understand the influence of precipitation changes on the control of flood and waterlogging disasters. Using daily precipitation data measured at 520 stations in China during 1961-2013, we quantified the stochastic characteristics of daily precipitation over China with the index of information entropy. Results showed that the randomness of daily precipitation in the southeast region was larger than that in the northwest region. Moreover, the spatial distribution of the stochastic characteristics of precipitation differed among grades. The stochastic characteristics of P0 (precipitation of 0.1-10 mm) were large, but their spatial variation was not obvious. The stochastic characteristics of P10 (precipitation of 10-25 mm) and P25 (precipitation of 25-50 mm) were the largest, and their spatial differences were obvious. P50 (precipitation ≥50 mm) had the smallest stochastic characteristics and the most obvious spatial differences. Generally, the entropy values of precipitation increased markedly over the last five decades, indicating more significant stochastic characteristics of precipitation (especially an obvious increase of heavy precipitation events) in most regions of China under global climate change. Given that the spatial distribution and long-term trend of the entropy values of daily precipitation reflect the spatial distribution of the stochastic characteristics of precipitation, our results can provide a scientific basis for the control of flood and waterlogging disasters, the layout of agricultural planning, and the planning of the ecological environment.

  19. A Pilot Directional Protection for HVDC Transmission Line Based on Relative Entropy of Wavelet Energy

    Directory of Open Access Journals (Sweden)

    Sheng Lin

    2015-07-01

On the basis of an analysis of the high-voltage direct current (HVDC) transmission system and its fault superimposed circuit, the directions of the fault components of the voltage and the current measured at one end of the transmission line are shown to differ between internal and external faults. As an estimate of the difference between two signals, relative entropy is an effective parameter for recognizing transient signals in HVDC transmission lines. In this paper, the relative entropy of wavelet energy is applied to distinguish internal from external faults. For internal faults, the directions of the fault components of voltage and current are opposite at the two ends of the transmission line, giving a large difference in wavelet energy relative entropy; for external faults, the directions are identical, giving a small difference. Simulation results based on PSCAD/EMTDC show that the proposed pilot protection acts accurately for faults under different conditions, and that its performance is not affected by fault type, fault location, fault resistance or noise.
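
    The quantity at the heart of the criterion, the relative entropy between wavelet energy distributions, can be sketched as follows. This assumes the PyWavelets package and synthetic signals, and omits the directional comparison of fault components at the two line ends that the actual protection scheme performs.

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def wavelet_energy_distribution(signal, wavelet='db4', level=4):
    """Normalized energy per wavelet sub-band."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    e = np.array([np.sum(c ** 2) for c in coeffs])
    return e / e.sum()

def relative_entropy(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two energy spectra."""
    p, q = np.asarray(p) + eps, np.asarray(q) + eps
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2048)
ref = np.sin(2 * np.pi * 50 * t)                     # reference transient
alike = ref + 0.05 * rng.standard_normal(t.size)     # similar band content
unlike = np.sin(2 * np.pi * 400 * t)                 # energy in other bands
p = wavelet_energy_distribution(ref)
print(relative_entropy(p, wavelet_energy_distribution(alike)))   # small
print(relative_entropy(p, wavelet_energy_distribution(unlike)))  # large
```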

  20. Heart rate variability analysis based on time–frequency representation and entropies in hypertrophic cardiomyopathy patients

    International Nuclear Information System (INIS)

    Clariá, F; Vallverdú, M; Caminal, P; Baranowski, R; Chojnowska, L

    2008-01-01

In hypertrophic cardiomyopathy (HCM) patients there is an increased risk of premature death, which can occur with little or no warning, and risk classification for sudden cardiac death in patients with HCM is very difficult. The aim of our study was to improve the prognostic value of heart rate variability (HRV) in HCM patients, giving insight into changes of the autonomic nervous system. To this end, the suitability of linear and nonlinear measures was studied to assess the HRV. These measures were based on time–frequency representation (TFR) and on Shannon and Rényi entropies, and were compared with traditional HRV measures. Holter recordings of 64 patients with HCM and 55 healthy subjects were analyzed. The HCM patients consisted of two groups: 13 high-risk patients, after aborted sudden cardiac death (SCD), and 51 low-risk patients, without SCD. Five-hour RR signals, corresponding to the sleep period of the subjects, were considered for the analysis as a comparable standard situation. These RR signals were filtered in three frequency bands: the very low frequency band (VLF, 0–0.04 Hz), the low frequency band (LF, 0.04–0.15 Hz) and the high frequency band (HF, 0.15–0.45 Hz). TFR variables based on instantaneous frequency and energy functions were able to classify HCM patients and healthy subjects (control group). Results revealed that measures obtained from TFR analysis of the HRV classified the groups of subjects better than traditional HRV parameters, and that nonlinear measures improved group classification further. The entropies calculated in the HF band showed the highest statistically significant levels when comparing the HCM group and the control group (p-value < 0.0005), and the entropy measures calculated in the HCM group presented lower values, indicating a decrease of complexity, than those calculated from the control group. Moreover, similar behavior was observed when comparing patients at high and low risk of premature death.

  1. Fault detection of the connection of lithium-ion power batteries based on entropy for electric vehicles

    Science.gov (United States)

    Yao, Lei; Wang, Zhenpo; Ma, Jun

    2015-10-01

This paper proposes an entropy-based method for detecting connection faults in lithium-ion power batteries for electric vehicles. During electric vehicle operation, factors such as road conditions, driving habits and vehicle performance subject the batteries to vibration, which can easily cause loose or virtual (poor-contact) connections between batteries. Voltage fluctuation data were obtained by simulating battery charging and discharging experiments in a vibration environment. Meanwhile, an optimal filtering method based on a discrete cosine filter is adopted to analyze the characteristics of the system noise, using the voltage data recorded while the batteries work under different vibration frequencies. The filtered experimental data are analyzed using local Shannon entropy, ensemble Shannon entropy and sample entropy, in order to determine the most suitable entropy-based scheme for detecting connection faults of lithium-ion batteries in electric vehicles. The experimental data show that ensemble Shannon entropy can predict the accurate time and location of a battery connection failure in real time. Besides the electric-vehicle industry, this method can also be used in other areas with complex vibration environments.
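
    A windowed Shannon entropy of the sort compared here can be sketched in a few lines; the signal, fault injection and parameters below are hypothetical, and this is not the paper's ensemble formulation.

```python
import numpy as np

def local_shannon_entropy(x, window=50, bins=16):
    """Shannon entropy of the amplitude histogram in each sliding window,
    with the bin range fixed over the whole record so that a local spread
    of amplitudes shows up as an entropy jump."""
    lo, hi = x.min(), x.max()
    ent = []
    for i in range(len(x) - window + 1):
        hist, _ = np.histogram(x[i:i + window], bins=bins, range=(lo, hi))
        p = hist[hist > 0] / window
        ent.append(-np.sum(p * np.log(p)))
    return np.array(ent)

rng = np.random.default_rng(3)
voltage = 3.7 + 0.005 * rng.standard_normal(1000)   # nominal cell voltage
voltage[600:620] += 0.2 * rng.standard_normal(20)   # injected contact fault
ent = local_shannon_entropy(voltage)
print("entropy peaks at window", int(np.argmax(ent)))  # near sample 600
```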

  2. Cross entropy-based memetic algorithms: An application study over the tool switching problem

    Directory of Open Access Journals (Sweden)

    Jhon Edgar Amaya

    2013-05-01

This paper presents a parameterized schema for building memetic algorithms based on cross-entropy (CE) methods. This novel schema is general in nature, and features multiple probability mass functions and Lamarckian learning. The applicability of the approach is assessed by considering the Tool Switching Problem, a complex combinatorial problem in the field of Flexible Manufacturing Systems. An exhaustive evaluation (including techniques ranging from local search and evolutionary algorithms to constructive methods) provides evidence of the effectiveness of CE-based memetic algorithms.

  3. Entropy-Based Experimental Design for Optimal Model Discrimination in the Geosciences

    Directory of Open Access Journals (Sweden)

    Wolfgang Nowak

    2016-11-01

Choosing between competing models lies at the heart of scientific work, and is a frequent motivation for experimentation. Optimal experimental design (OD) methods maximize the benefit of experiments towards a specified goal. We advance and demonstrate an OD approach to maximize the information gained towards model selection. We make use of so-called model choice indicators, which are random variables with an expected value equal to Bayesian model weights. Their uncertainty can be measured with Shannon entropy. Since the experimental data are still random variables in the planning phase of an experiment, we use mutual information (the expected reduction in Shannon entropy) to quantify the information gained from a proposed experimental design. For implementation, we use the Preposterior Data Impact Assessor framework (PreDIA), because it is free of the lower-order approximations of mutual information often found in the geosciences. In comparison to other studies in statistics, our framework is not restricted to sequential design or to discrete-valued data, and it can handle measurement errors. As an application example, we optimize an experiment about the transport of contaminants in clay, featuring the problem of choosing between competing isotherms to describe sorption. We compare the results of optimizing towards maximum model discrimination with an alternative OD approach that minimizes the overall predictive uncertainty under model choice uncertainty.

  4. A Novel Entropy-Based Decoding Algorithm for a Generalized High-Order Discrete Hidden Markov Model

    Directory of Open Access Journals (Sweden)

    Jason Chin-Tiong Chan

    2018-01-01

The optimal state sequence of a generalized high-order hidden Markov model (HHMM) is tracked from a given observational sequence using the classical Viterbi algorithm, which is based on the maximum likelihood criterion. We introduce an entropy-based Viterbi algorithm for tracking the optimal state sequence of an HHMM. The entropy of a state sequence is a useful quantity, providing a measure of the uncertainty of an HHMM; there is no uncertainty if there is only one possible optimal state sequence. This entropy-based decoding algorithm can be formulated in either an extended or a reduction approach. We extend the entropy-based algorithm for computing the optimal state sequence from a first-order HMM to a generalized HHMM with a single observational sequence. This extended algorithm scales exponentially with the order of the HMM, a computational complexity due to the growth of the model parameters. We then introduce an efficient entropy-based decoding algorithm that uses the reduction approach, namely the entropy-based order-transformation forward algorithm (EOTFA), to compute the optimal state sequence of any generalized HHMM. The EOTFA algorithm transforms a generalized high-order HMM into an equivalent first-order HMM, for which an entropy-based decoding algorithm is developed. The algorithm operates on the observational sequence and requires O(T·Ñ²) calculations, where Ñ is the number of states in the equivalent first-order model and T is the length of the observational sequence.

  5. Adjoint entropy vs topological entropy

    OpenAIRE

    Giordano Bruno, Anna

    2012-01-01

Recently the adjoint algebraic entropy of endomorphisms of abelian groups was introduced and studied. We generalize the notion of adjoint entropy to continuous endomorphisms of topological abelian groups. Indeed, the adjoint algebraic entropy is defined using the family of all finite-index subgroups, while we take only the subfamily of all open finite-index subgroups to define the topological adjoint entropy. This allows us to compare the (topological) adjoint entropy with the known topological entropy.

  6. Chaos control of ferroresonance system based on RBF-maximum entropy clustering algorithm

    International Nuclear Information System (INIS)

    Liu Fan; Sun Caixin; Sima Wenxia; Liao Ruijin; Guo Fei

    2006-01-01

With regard to the ferroresonance overvoltage of neutral-grounded power systems, a maximum-entropy learning algorithm based on radial basis function neural networks is used to control the chaotic system. The algorithm optimizes the objective function to derive the learning rule for the central vectors, and uses the clustering function of the network's hidden layers, improving the regression and learning ability of the neural network. A numerical experiment on the ferroresonance system verifies the effectiveness and feasibility of using the algorithm to control chaos in neutral-grounded systems.

  7. Local curvature entropy-based 3D terrain representation using a comprehensive Quadtree

    Science.gov (United States)

    Chen, Qiyu; Liu, Gang; Ma, Xiaogang; Mariethoz, Gregoire; He, Zhenwen; Tian, Yiping; Weng, Zhengping

    2018-05-01

Large-scale 3D digital terrain modeling is a crucial part of many real-time applications in geoinformatics. In recent years, improved speed and precision in spatial data collection have made raw terrain data larger and more complex, which poses challenges for data management, visualization and analysis. In this work, we present an effective and comprehensive 3D terrain representation based on local curvature entropy and a dynamic Quadtree. Level-of-detail (LOD) models of significant terrain features are employed to generate hierarchical terrain surfaces. In order to reduce radical changes of grid density between adjacent LODs, the local entropy of terrain curvature is used as the measure for subdividing terrain grid cells. An efficient approach is then presented to eliminate the cracks among different LODs by directly updating the Quadtree, owing to an edge-based structure proposed in this work. Furthermore, we utilize a threshold on the local entropy stored in each parent node of the Quadtree to flexibly control the depth of the Quadtree and dynamically schedule large-scale LOD terrain. Several experiments were implemented to test the performance of the proposed method. The results demonstrate that our method can be applied to construct LOD 3D terrain models with good performance in terms of computational cost and the preservation of terrain features. Our method has already been deployed in a geographic information system (GIS) for practical use, and it is able to support the real-time dynamic scheduling of large-scale terrain models easily and efficiently.
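
    The entropy-driven subdivision rule can be caricatured with a small recursive quadtree; curvature is approximated here by a discrete Laplacian and all thresholds are hypothetical, so this is a sketch of the idea rather than the paper's edge-based structure.

```python
import numpy as np

def curvature_entropy(tile, bins=8):
    """Shannon entropy of the discretized curvature (approximated by a
    wrap-around Laplacian) within a square terrain tile."""
    lap = (np.roll(tile, 1, 0) + np.roll(tile, -1, 0) +
           np.roll(tile, 1, 1) + np.roll(tile, -1, 1) - 4 * tile)
    hist, _ = np.histogram(lap, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log(p))

def build_quadtree(tile, threshold, min_size=8):
    """Subdivide while the tile is feature-rich (high curvature entropy)."""
    if tile.shape[0] <= min_size or curvature_entropy(tile) < threshold:
        return {'leaf': True, 'size': tile.shape}
    h, w = tile.shape[0] // 2, tile.shape[1] // 2
    return {'leaf': False,
            'children': [build_quadtree(q, threshold, min_size)
                         for q in (tile[:h, :w], tile[:h, w:],
                                   tile[h:, :w], tile[h:, w:])]}

def count_leaves(node):
    return 1 if node['leaf'] else sum(count_leaves(c) for c in node['children'])

rng = np.random.default_rng(0)
terrain = np.add.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))  # smooth ramp
terrain[16:32, 16:32] += 0.5 * rng.random((16, 16))                   # rough patch
tree = build_quadtree(terrain, threshold=0.6)
print(count_leaves(tree))   # leaves concentrate around the rough patch
```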

  8. Analysis of optimal Reynolds number for developing laminar forced convection in double sine ducts based on entropy generation minimization principle

    International Nuclear Information System (INIS)

    Ko, T.H.

    2006-01-01

In the present paper, the entropy generation and optimal Reynolds number for developing forced convection in a double sine duct with various wall heat fluxes, a configuration that frequently occurs in plate heat exchangers, are studied based on the entropy generation minimization principle, by analytical thermodynamic analysis as well as numerical investigation. From the thermodynamic analysis, a very simple expression for the optimal Reynolds number for the double sine duct is proposed as a function of mass flow rate, wall heat flux, working fluid and geometric dimensions. In the numerical simulations, the investigated Reynolds number (Re) covers the range from 86 to 2000 and the wall heat flux (q'') takes the values 160, 320 and 640 W/m². From the numerical simulation of the developing laminar forced convection in the double sine duct, the effect of the Reynolds number on entropy generation in the duct is examined, through which the optimal Reynolds number with minimal entropy generation is detected. The optimal Reynolds number obtained from the analytical thermodynamic analysis is compared with the one from the numerical solutions and is verified to yield a magnitude of entropy generation similar to the minimum predicted by the numerical simulations. The optimal analysis provided in the present paper gives valuable information for heat exchanger design, since the thermal system attains the least irreversibility and best exergy utilization when the optimal Re is used according to the practical design conditions.
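
    The entropy generation minimization tradeoff behind the optimal Reynolds number can be sketched numerically: thermal irreversibility falls with Re while frictional irreversibility rises, and the minimum of their sum defines the optimum. The functional form and coefficients below are placeholders, not the paper's duct correlations.

```python
import numpy as np

# Illustrative EGM tradeoff: the first term models heat-transfer
# irreversibility (decreasing with Re), the second frictional
# irreversibility (increasing with Re). A, B and the exponents are
# hypothetical, chosen only to place the optimum inside the studied range.
def entropy_generation(re, A=50.0, a=0.8, B=1e-6, b=2.0):
    return A / re**a + B * re**b

re = np.linspace(86, 2000, 2000)
s = entropy_generation(re)
re_opt = re[np.argmin(s)]
print(f"optimal Re ~ {re_opt:.0f}, minimal S_gen ~ {s.min():.4f}")
```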

  9. Chatter detection in milling process based on VMD and energy entropy

    Science.gov (United States)

    Liu, Changfu; Zhu, Lida; Ni, Chenbing

    2018-05-01

This paper presents a novel approach to detecting milling chatter based on Variational Mode Decomposition (VMD) and energy entropy. VMD has already been employed for feature extraction from non-stationary signals, but parameters such as the number of modes (K) and the quadratic penalty (α) need to be selected empirically when a raw signal is decomposed by VMD. To solve the problem of how to select K and α, an automatic kurtosis-based selection method for the VMD parameters is proposed in this paper. When chatter occurs in the milling process, energy is absorbed into the chatter frequency bands; to detect these bands automatically, a chatter detection method based on energy entropy is presented. A vibration signal containing chatter frequencies is simulated, and three groups of experiments representing three cutting conditions were conducted. To verify the effectiveness of the method presented in this paper, chatter feature extraction was successfully performed on both the simulated and the experimental signals. The simulation and experimental results show that the proposed method can effectively detect chatter.

  10. EEG-Based Computer Aided Diagnosis of Autism Spectrum Disorder Using Wavelet, Entropy, and ANN

    Directory of Open Access Journals (Sweden)

    Ridha Djemal

    2017-01-01

Autism spectrum disorder (ASD) is a type of neurodevelopmental disorder with core impairments in social relationships, communication, imagination or flexibility of thought, and a restricted repertoire of activities and interests. In this work, a new computer-aided diagnosis (CAD) of autism based on electroencephalography (EEG) signal analysis is investigated. The proposed method is based on the discrete wavelet transform (DWT), entropy (En), and an artificial neural network (ANN). The DWT is used to decompose EEG signals into approximation and detail coefficients to obtain the EEG subbands. The feature vector is constructed by computing Shannon entropy values from each EEG subband. The ANN classifies the corresponding EEG signal as normal or autistic based on the extracted features. The experimental results show the effectiveness of the proposed method for assisting autism diagnosis. A receiver operating characteristic (ROC) curve metric is used to quantify the performance of the proposed method. The proposed method obtained promising results when tested using a real dataset provided by King Abdulaziz Hospital, Jeddah, Saudi Arabia.
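
    The feature-extraction stage (DWT decomposition followed by per-subband Shannon entropy) can be sketched as follows, assuming the PyWavelets package; the ANN stage is omitted and the input is a stand-in for a real EEG epoch.

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def subband_entropy_features(eeg, wavelet='db4', level=4):
    """Feature vector: Shannon entropy of the normalized coefficient
    energies within each DWT sub-band (approximation plus details)."""
    feats = []
    for c in pywt.wavedec(eeg, wavelet, level=level):
        e = c ** 2
        p = e / e.sum()
        p = p[p > 0]
        feats.append(-np.sum(p * np.log(p)))
    return np.array(feats)

rng = np.random.default_rng(4)
eeg = rng.standard_normal(1024)          # stand-in for a real EEG epoch
print(subband_entropy_features(eeg))     # five values -> ANN input vector
```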

  11. An Integrated Dictionary-Learning Entropy-Based Medical Image Fusion Framework

    Directory of Open Access Journals (Sweden)

    Guanqiu Qi

    2017-10-01

Image fusion is widely used in different areas and can integrate complementary and relevant information of source images captured by multiple sensors into a unitary synthetic image. Medical image fusion, as an important image fusion application, can extract the details of multiple images from different imaging modalities and combine them into an image that contains complete and non-redundant information for increasing the accuracy of medical diagnosis and assessment. The quality of the fused image directly affects medical diagnosis and assessment. However, existing solutions have some drawbacks in contrast, sharpness, brightness, blur and details. This paper proposes an integrated dictionary-learning and entropy-based medical image-fusion framework that consists of three steps. First, the input image information is decomposed into low-frequency and high-frequency components by using a Gaussian filter. Second, low-frequency components are fused by weighted average algorithm and high-frequency components are fused by the dictionary-learning based algorithm. In the dictionary-learning process of high-frequency components, an entropy-based algorithm is used for informative blocks selection. Third, the fused low-frequency and high-frequency components are combined to obtain the final fusion results. The results and analyses of comparative experiments demonstrate that the proposed medical image fusion framework has better performance than existing solutions.

  12. A Roller Bearing Fault Diagnosis Method Based on LCD Energy Entropy and ACROA-SVM

    Directory of Open Access Journals (Sweden)

    HungLinh Ao

    2014-01-01

This study investigates a novel method for roller bearing fault diagnosis based on local characteristic-scale decomposition (LCD) energy entropy, together with a support vector machine designed using an Artificial Chemical Reaction Optimisation Algorithm, referred to as an ACROA-SVM. First, the original acceleration vibration signals are decomposed into intrinsic scale components (ISCs). Second, the concept of LCD energy entropy is introduced. Third, the energy features extracted from a number of ISCs that contain the most dominant fault information serve as input vectors for the support vector machine classifier. Finally, the ACROA-SVM classifier is used to recognize the faulty roller bearing pattern. The analysis of roller bearing signals with inner-race and outer-race faults shows that the diagnostic approach based on the ACROA-SVM, using LCD to extract the energy levels of the various frequency bands as features, can identify roller bearing fault patterns accurately and effectively. The proposed method is superior to approaches based on Empirical Mode Decomposition and requires less time.

  13. Some Consequences of an Analysis of the Kelvin-Clausius Entropy Formulation Based on Traditional Axiomatics

    Science.gov (United States)

    Jesudason, Christopher G.

    2003-09-01

Recently, there have appeared interesting correctives or challenges [Entropy 1999, 1, 111-147] to the Second Law formulations, especially in the interpretation of the Clausius equivalent transformations, closely related in area to extensions of the Clausius principle to irreversible processes [Chem. Phys. Lett. 1988, 143(1), 65-70]. Since the traditional formulations are central to science, a brief analysis of some of these newer theories along traditional lines is attempted, based on well-attested axioms which have formed the basis of equilibrium thermodynamics. It is deduced that the Clausius analysis leading to the law of increasing entropy does not follow from the given axioms, but it can be proved that for irreversible transitions, the total entropy change of the system and thermal reservoirs (the "Universe") is not negative, even when the reservoirs are not at the same temperature as the system during heat transfer. On the basis of two new simple theorems and three corollaries derived for the correlation between irreversible and reversible pathways and the traditional axiomatics, it is shown that a sequence of reversible states can never be used to describe a corresponding sequence of irreversible states, at least for closed systems, thereby restricting the principle of local equilibrium. It is further shown that some of the newer irreversible entropy forms exhibit paradoxical properties relative to the standard axiomatics. It is deduced that any reconciliation between the traditional approach and the novel theories lies in creating a well-defined set of axioms on which all theoretical developments should be based, unless proven not to be useful, in which case there should be consensus in removing such axioms from the theory. Clausius' theory of equivalent transformations does not contradict the traditional understanding of heat-work efficiency. It is concluded that the intuitively derived assumptions over the last two centuries seem to

  14. Application of Entropy-Based Metrics to Identify Emotional Distress from Electroencephalographic Recordings

    Directory of Open Access Journals (Sweden)

    Beatriz García-Martínez

    2016-06-01

    Full Text Available Recognition of emotions is still an unresolved challenge, which could be helpful to improve current human-machine interfaces. Recently, nonlinear analysis of some physiological signals has been shown to play a more relevant role in this context than their traditional linear exploration. Thus, the present work introduces for the first time the application of three recent entropy-based metrics: sample entropy (SE), quadratic SE (QSE) and distribution entropy (DE) to discern between emotional states of calm and negative stress (also called distress). In the last few years, distress has received growing attention because it is a common negative factor in the modern lifestyle of people from developed countries and, moreover, it may lead to serious mental and physical health problems. Specifically, 279 segments of 32-channel electroencephalographic (EEG) recordings from 32 subjects elicited to be calm or negatively stressed have been analyzed. Results show that QSE is the first single metric presented to date with the ability to identify negative stress. Indeed, this metric has reported a discriminant ability of around 70%, which is only slightly lower than the one obtained by some previous works. Nonetheless, discriminant models from dozens or even hundreds of features have been previously obtained by using advanced classifiers to yield diagnostic accuracies of about 80%. Moreover, in agreement with previous neuroanatomy findings, QSE has also revealed notable differences for all the brain regions in the neural activation triggered by the two considered emotions. Consequently, given these results, as well as the easy interpretation of QSE, this work opens a new standpoint in the detection of emotional distress, which may yield new insights into the brain's behavior under this negative emotion.

  15. Comparison of two views of maximum entropy in biodiversity: Frank (2011) and Pueyo et al. (2007).

    Science.gov (United States)

    Pueyo, Salvador

    2012-05-01

    An increasing number of authors agree that the maximum entropy principle (MaxEnt) is essential for the understanding of macroecological patterns. However, there are subtle but crucial differences among the approaches of several of these authors. This poses a major obstacle for anyone interested in applying the methodology of MaxEnt in this context. In a recent publication, Frank (2011) gives some arguments why his own approach would represent an improvement over the earlier paper by Pueyo et al. (2007) and also over the views of Edwin T. Jaynes, who first formulated MaxEnt in the context of statistical physics. Here I show that his criticisms are flawed and that there are fundamental reasons to prefer the original approach.

  16. Comparison of Boltzmann and Gibbs entropies for the analysis of single-chain phase transitions

    Science.gov (United States)

    Shakirov, T.; Zablotskiy, S.; Böker, A.; Ivanov, V.; Paul, W.

    2017-03-01

    In the last 10 years, flat histogram Monte Carlo simulations have contributed strongly to our understanding of the phase behavior of simple generic models of polymers. These simulations result in an estimate for the density of states of a model system. To connect this result with thermodynamics, one has to relate the density of states to the microcanonical entropy. In a series of publications, Dunkel, Hilbert and Hänggi argued that one obtains a more consistent thermodynamic description of small systems when one uses the Gibbs definition of entropy instead of the Boltzmann one. The latter is the logarithm of the density of states at a certain energy; the former is the logarithm of the integral of the density of states over all energies smaller than or equal to this energy. We will compare the predictions using these two definitions for two polymer models, a coarse-grained model of a flexible-semiflexible multiblock copolymer and a coarse-grained model of the protein poly-alanine. Additionally, it is important to note that while Monte Carlo techniques are normally concerned with the configurational energy only, the microcanonical ensemble is defined for the complete energy. We will show how taking the kinetic energy into account alters the predictions of the analysis. Finally, the microcanonical ensemble is supposed to represent a closed mechanical N-particle system. But due to Galilei invariance, such a system in general has two additional conservation laws: momentum and angular momentum. We will also show how taking these conservation laws into account alters the results.
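
    In units where the Boltzmann constant is 1, the two definitions differ only in whether the density of states is read off pointwise or accumulated; a minimal sketch from a tabulated density of states, as produced by a flat-histogram run (energies assumed sorted in increasing order):

```python
import numpy as np

def boltzmann_and_gibbs_entropy(dos):
    """Boltzmann vs. Gibbs entropy (k_B = 1) from a tabulated density
    of states g(E) on an increasing energy grid:
        S_B(E) = ln g(E),   S_G(E) = ln sum_{E' <= E} g(E')."""
    dos = np.asarray(dos, dtype=float)
    s_boltzmann = np.log(dos)
    s_gibbs = np.log(np.cumsum(dos))
    return s_boltzmann, s_gibbs
```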

  17. Trustworthiness Measurement Algorithm for TWfMS Based on Software Behaviour Entropy

    Directory of Open Access Journals (Sweden)

    Qiang Han

    2018-03-01

    Full Text Available As the virtual mirror of the complex real-time business processes of organisations' underlying information systems, the workflow management system (WfMS) has emerged in recent decades as a new self-autonomous paradigm in the open, dynamic, distributed computing environment. In order to construct a trustworthy workflow management system (TWfMS), the design of a software behaviour trustworthiness measurement algorithm is an urgent task for researchers. Alongside the trustworthiness mechanism, a measurement algorithm that can cope with the uncertain software-behaviour trustworthiness information of the WfMS must be provided as part of the infrastructure. Based on the framework presented in our research prior to this paper, we first introduce a formal model for WfMS trustworthiness measurement, with the main properties reasoned about using calculus operators. Second, this paper proposes a novel measurement algorithm that derives the software behaviour entropy of the calculus operators through the principle of maximum entropy (POME) and a data mining method. Third, the trustworthiness measurement algorithm for incomplete software behaviour tests and runtime information is discussed and compared by means of a detailed explanation. Finally, we provide conclusions and discuss certain future research areas of the TWfMS.

  18. Measuring time series regularity using nonlinear similarity-based sample entropy

    International Nuclear Information System (INIS)

    Xie Hongbo; He Weixing; Liu Hui

    2008-01-01

    Sample entropy (SampEn), a measure quantifying regularity and complexity, is regarded as an effective method for analyzing diverse systems, including both deterministic chaotic and stochastic processes, and is particularly useful for physiological signals involving relatively small amounts of data. However, the similarity definition of vectors is based on the Heaviside function, whose hard, discontinuous boundary may cause problems for the validity and accuracy of SampEn. The sigmoid function is a smoothed, continuous version of the Heaviside function. To overcome the problems SampEn encounters, a modified SampEn (mSampEn) based on the nonlinear sigmoid function was proposed. The performance of mSampEn was tested on independent identically distributed (i.i.d.) uniform random numbers, the MIX stochastic model, the Rössler map, and the Hénon map. The results showed that mSampEn was superior to SampEn in several aspects, including giving a well-defined entropy for small parameter values, better relative consistency, robustness to noise, and less dependence on record length when characterizing time series generated by either deterministic or stochastic systems with different regularities.
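
    A compact sketch of the idea: classic SampEn counts template pairs whose Chebyshev distance falls within the tolerance r (a Heaviside rule), while mSampEn replaces that hard count with a sigmoid-weighted one. The parametrisation of the sigmoid below is an assumption for illustration; the paper's exact form may differ.

```python
import numpy as np

def msampen(x, m=2, r=0.2, slope=0.05, soft=True):
    """Sample entropy with either the hard Heaviside matching rule
    (classic SampEn, soft=False) or a smoothed sigmoid rule (mSampEn).
    r and slope are in units of the series' standard deviation.
    Note: O(N^2) memory, intended for short physiological records."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    width = slope * np.std(x)

    def phi(dim):
        templates = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        # Chebyshev distance between all distinct template pairs
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=-1)
        d = d[np.triu_indices(len(templates), k=1)]
        if soft:
            sim = 1.0 / (1.0 + np.exp((d - tol) / width))  # sigmoid similarity
        else:
            sim = (d <= tol).astype(float)                 # Heaviside similarity
        return sim.mean()

    return -np.log(phi(m + 1) / phi(m))
```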

  19. Combined Forecasting of Rainfall Based on Fuzzy Clustering and Cross Entropy

    Directory of Open Access Journals (Sweden)

    Baohui Men

    2017-12-01

    Full Text Available Rainfall is an essential index to measure drought, and it depends on various parameters including geographical environment, air temperature and pressure. The nonlinear nature of climatic variables leads to problems such as poor accuracy and instability in traditional forecasting methods. In this paper, a combined forecasting method based on data mining technology and cross entropy is proposed to forecast rainfall with full consideration of the time-effectiveness of historical data. In view of the flaws of the fuzzy clustering method, which easily falls into locally optimal solutions and operates slowly, the ant colony algorithm is adopted to overcome these shortcomings and thereby refine the model. The method for determining weights is also improved by using the cross entropy. The forecast is conducted by analyzing the weighted average rainfall based on Thiessen polygons in the Beijing–Tianjin–Hebei region. Once the predictive errors are calculated, the results show that improved ant colony fuzzy clustering can effectively select historical data and enhance the accuracy of prediction, so that the damage caused by extreme weather events like droughts and floods can be greatly lessened and even kept at bay.

  20. A Novel Object Tracking Algorithm Based on Compressed Sensing and Entropy of Information

    Directory of Open Access Journals (Sweden)

    Ding Ma

    2015-01-01

    Full Text Available Object tracking has always been a hot research topic in the field of computer vision; its purpose is to track objects with specific characteristics or representations and estimate information about those objects, such as their locations, sizes, and rotation angles, in the current frame. Object tracking in complex scenes usually encounters various challenges, such as location change, dimension change, illumination change, perception change, and occlusion. This paper proposes a novel object tracking algorithm based on compressed sensing and information entropy to address these challenges. First, objects are characterized by Haar (Haar-like) and ORB features. Second, the dimensions of the computation space of the Haar and ORB features are effectively reduced through compressed sensing. Then the above-mentioned features are fused based on information entropy. Finally, in the particle filter framework, an object location is obtained by selecting candidate object locations in the current frame from the local context neighboring the optimal locations in the last frame. Our extensive experimental results demonstrate that this method effectively addresses the challenges of perception change, illumination change, and large-area occlusion, achieving better performance than existing approaches such as MIL and CT.

  1. EEG entropy measures in anesthesia

    Science.gov (United States)

    Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J.; Sleigh, Jamie W.; Hagihira, Satoshi; Li, Xiaoli

    2015-01-01

    Highlights: ► Twelve entropy indices were systematically compared in monitoring depth of anesthesia and detecting burst suppression. ► Rényi permutation entropy performed best in tracking EEG changes associated with different anesthesia states. ► Approximate entropy and sample entropy performed best in detecting burst suppression. Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing anesthesia drugs' effects is lacking. In this study, we compare the capability of 12 entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP) in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State Entropy (SE), three wavelet entropy (WE) measures [Shannon WE (SWE), Tsallis WE (TWE), and Rényi WE (RWE)], Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy, and three permutation entropy (PE) measures [Shannon PE (SPE), Tsallis PE (TPE) and Rényi PE (RPE)]. Two EEG data sets, from sevoflurane-induced and isoflurane-induced anesthesia respectively, were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability (Pk) analysis were applied. The multifractal detrended fluctuation analysis (MDFA), as a non-entropy measure, was compared. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. Three PE measures outperformed the other entropy indices, with less baseline variability, higher coefficient of determination (R2) and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, these entropy measures showed an advantage in computation …
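
    Among the twelve indices, the permutation-entropy family is the simplest to state: tally the ordinal patterns of short subsequences and take an entropy of their distribution. A minimal Shannon version is sketched below; the Rényi and Tsallis variants differ only in the final sum.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Shannon permutation entropy of a 1-D series."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    # ordinal pattern of each subsequence, encoded as a single integer
    patterns = np.array([np.argsort(x[i:i + order * delay:delay])
                         for i in range(n)])
    codes = patterns @ (order ** np.arange(order))
    _, counts = np.unique(codes, return_counts=True)
    p = counts / counts.sum()
    pe = -np.sum(p * np.log2(p))
    return pe / np.log2(factorial(order)) if normalize else pe
```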

  2. Energy and entropy analysis of closed adiabatic expansion based trilateral cycles

    International Nuclear Information System (INIS)

    Garcia, Ramon Ferreiro; Carril, Jose Carbia; Gomez, Javier Romero; Gomez, Manuel Romero

    2016-01-01

    Highlights: • Adiabatic expansion based TCs surpass the Carnot factor at low temperatures. • Surpassing the Carnot factor doesn't violate the 2nd law. • An entropy analysis is applied to verify fulfilment of the second law. • Correction of the exergy transfer associated with heat transferred to a cycle. - Abstract: A vast amount of heat energy is available at low cost within the range of medium and low temperatures. Existing thermal cycles cannot make efficient use of such available low-grade heat because they are mainly based on conventional organic Rankine cycles, which are limited by Carnot constraints. However, recent developments related to the performance of thermal cycles composed of closed processes have led to the Carnot factor being exceeded. Consequently, once the viability of closed process based thermal cycles that surpass the Carnot factor operating at low and medium temperatures is globally accepted, research work will aim at looking into the consequences of surpassing the Carnot factor while fulfilling the 2nd law, its impact on the definition of 2nd law efficiency, as well as its impact on the exergy transfer from thermal power sources to any heat consumer, including thermal cycles. The methodology used to meet the proposed objectives involves the analysis of energy and entropy in trilateral closed process based thermal cycles. Such energy and entropy analysis is carried out on non-condensing mode trilateral thermal cycles (TCs) characterised by the conversion of low-grade heat into mechanical work undergoing closed adiabatic path functions: isochoric heat absorption, adiabatic heat-to-mechanical-work conversion and isobaric heat rejection. First, a cycle energy analysis is performed to determine the range of some relevant cycle parameters, such as the operating temperatures and their associated pressures, entropies, internal energies and specific volumes. In this way, the ranges of temperatures within which …

  3. Constructing a Measurement Method of Differences in Group Preferences Based on Relative Entropy

    Directory of Open Access Journals (Sweden)

    Shiyu Zhang

    2017-01-01

    Full Text Available In research and data analysis of differences in group preferences, conventional statistical methods cannot reflect the integrity and preferences of human minds; in particular, it is difficult to exclude humans' irrational factors. This paper introduces a preference amount model based on relative entropy theory. A related expansion is made based on the characteristics of the questionnaire data, and we construct a parameter to measure overall differences in the data distributions of different groups. In this paper, this parameter is called the center distance, and it effectively reflects the preferences of human minds. Using survey data from securities market participants as an example, this paper analyzes differences in market participants' attitudes toward the effectiveness of securities regulation. Based on this method, differences between groups that were overlooked by analysis of variance are found, as well as certain aspects obscured by general data characteristics.
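
    The underlying quantity is the relative entropy (Kullback-Leibler divergence) between the response distributions of two groups; the paper's "center distance" is a further construction on top of it. A minimal sketch with hypothetical 5-point Likert counts:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Relative entropy D(P||Q) between two discrete distributions,
    given as (unnormalised) count vectors."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# hypothetical answer counts on a 5-point scale for two groups
print(kl_divergence([4, 10, 25, 40, 21], [12, 20, 30, 25, 13]))
```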

  4. Entropy and equilibrium via games of complexity

    Science.gov (United States)

    Topsøe, Flemming

    2004-09-01

    It is suggested that thermodynamical equilibrium equals game theoretical equilibrium. Aspects of this thesis are discussed. The philosophy is consistent with maximum entropy thinking of Jaynes, but goes one step deeper by deriving the maximum entropy principle from an underlying game theoretical principle. The games introduced are based on measures of complexity. Entropy is viewed as minimal complexity. It is demonstrated that Tsallis entropy (q-entropy) and Kaniadakis entropy (κ-entropy) can be obtained in this way, based on suitable complexity measures. A certain unifying effect is obtained by embedding these measures in a two-parameter family of entropy functions.

  5. Entropy-Based Voltage Fault Diagnosis of Battery Systems for Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Peng Liu

    2018-01-01

    Full Text Available The battery is a key component and the major fault source in electric vehicles (EVs). Since the power battery is one of the core technologies of EVs, ensuring its safety is of great significance for making diagnosis more effective and predicting the occurrence of faults. This paper proposes a voltage fault diagnosis mechanism using entropy theory, demonstrated on an EV with a multiple-cell battery system during actual operation. A preliminary analysis, after collecting and preprocessing typical data periods from the Operation Service and Management Center for Electric Vehicles (OSMC-EV) in Beijing, shows that overvoltage faults for Li-ion battery cells can be observed from the voltage curves. To further locate abnormal cells and predict faults, an entropy weight method is established to calculate objective weights, which reduces subjectivity and improves reliability. The result clearly identifies abnormal cell voltages. The proposed diagnostic model can be used for EV real-time diagnosis without laboratory testing methods. It is more effective than traditional methods based on contrastive analysis.
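
    A minimal sketch of the entropy weight step, assuming a non-negative matrix of observations (for example, sampled cell voltages): criteria whose columns are nearly uniform have high entropy, carry little discriminating information, and therefore receive small weights.

```python
import numpy as np

def entropy_weights(X):
    """Objective criterion weights via the entropy weight method.
    X has shape (n_samples, n_criteria) with non-negative entries."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                      # column-wise proportions
    k = 1.0 / np.log(X.shape[0])
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    e = -k * plogp.sum(axis=0)                 # entropy per criterion
    d = 1.0 - e                                # degree of divergence
    return d / d.sum()
```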

  6. The SSVEP-Based BCI Text Input System Using Entropy Encoding Algorithm

    Directory of Open Access Journals (Sweden)

    Yeou-Jiunn Chen

    2015-01-01

    Full Text Available Amyotrophic lateral sclerosis (ALS), or motor neuron disease (MND), is a neurodegenerative disease with various causes. It is characterized by muscle spasticity, rapidly progressive weakness due to muscle atrophy, and difficulty in speaking, swallowing, and breathing. Beyond their physical impairments, severely disabled patients share a common problem: communication. Steady-state visually evoked potential based brain-computer interfaces (BCIs), which apply visual stimuli, are very well suited to serve as a communication interface for patients with neuromuscular impairments. In this study, an entropy encoding algorithm is proposed to encode the letters of the multilevel selection interface of BCI text input systems. According to the appearance frequency of each letter, the entropy encoding algorithm constructs a variable-length tree for the letter arrangement of the multilevel selection interface. Gaussian mixture models are then applied to recognize the electrical activity of the brain. According to the recognition results, the multilevel selection interface guides the subject in spelling and typing words. The experimental results showed that the proposed approach outperforms the baseline system, which does not consider the appearance frequency of each letter. Hence, the proposed approach can ease the text input interface for patients with neuromuscular impairments.
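
    Variable-length coding driven by symbol frequency is essentially Huffman coding; the sketch below builds a binary prefix code, while the paper's multilevel selection tree is analogous with the branching factor matched to the number of SSVEP stimulus targets. The frequencies are illustrative.

```python
import heapq

def huffman_code(freqs):
    """Build a variable-length prefix code from symbol frequencies;
    frequent symbols end up with shorter codes (fewer selections)."""
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, [w1 + w2, tiebreak, merged])
        tiebreak += 1
    return heap[0][2]

print(huffman_code({"e": 12.7, "t": 9.1, "a": 8.2, "o": 7.5, "q": 0.1}))
```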

  7. Combined Forecasting Method of Landslide Deformation Based on MEEMD, Approximate Entropy, and WLS-SVM

    Directory of Open Access Journals (Sweden)

    Shaofeng Xie

    2017-01-01

    Full Text Available Given the chaotic characteristics of landslide time series, a new method based on modified ensemble empirical mode decomposition (MEEMD), approximate entropy and the weighted least squares support vector machine (WLS-SVM) was proposed. The method starts from time-frequency analysis of the chaotic sequence and improves model performance as follows: first, a deformation time series was decomposed into a series of subsequences with significantly different complexity using MEEMD. Then the approximate entropy method was used to generate new subsequences by combining subsequences of similar complexity, which could effectively concentrate the component feature information and reduce the computational scale. Finally, a WLS-SVM prediction model was established for each new subsequence. At the same time, phase space reconstruction theory and the grid search method were used to select the input dimension and the optimal parameters of the model, and the superposition of the predicted values was taken as the final forecasting result. Taking the landslide deformation data of Danba as an example, experiments were carried out and compared with a wavelet neural network, a support vector machine, a least squares support vector machine and various combination schemes. The experimental results show that the algorithm has high prediction accuracy. It ensures a good prediction effect even in periods of rapidly fluctuating landslide deformation, and it can also better control the residual value and effectively reduce the error interval.

  8. A new qualitative acoustic emission parameter based on Shannon's entropy for damage monitoring

    Science.gov (United States)

    Chai, Mengyu; Zhang, Zaoxiao; Duan, Quan

    2018-02-01

    An important objective of acoustic emission (AE) non-destructive monitoring is to accurately identify approaching critical damage and to avoid premature failure by means of the evolution of AE parameters. One major drawback of most parameters, such as count and rise time, is that they are strongly dependent on the threshold and other settings employed in the AE data acquisition system. This may hinder correct reflection of the original waveform generated from AE sources and consequently make accurate identification of critical damage and early failure difficult. In this investigation, a new qualitative AE parameter based on Shannon's entropy, i.e., the AE entropy, is proposed for damage monitoring. Since it derives from the uncertainty of the amplitude distribution of each AE waveform, it is independent of the threshold and other time-driven parameters and can characterize the original micro-structural deformations. A fatigue crack growth test on CrMoV steel and a three-point bending test on a ductile material are conducted to validate the feasibility and effectiveness of the proposed parameter. The results show that the new parameter, compared to AE amplitude, is more effective in discriminating the different damage stages and identifying critical damage.
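
    Because the AE entropy derives only from the amplitude distribution of a single waveform, it can be sketched in a few lines (the bin count is an assumed choice):

```python
import numpy as np

def ae_entropy(waveform, bins=64):
    """AE entropy: Shannon entropy of the amplitude histogram of one
    acoustic-emission waveform; needs no detection threshold."""
    hist, _ = np.histogram(waveform, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))
```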

  9. Analysis of financial time series using multiscale entropy based on skewness and kurtosis

    Science.gov (United States)

    Xu, Meng; Shang, Pengjian

    2018-01-01

    There is great interest in studying the dynamic characteristics of financial time series of daily stock closing prices in different regions. Multi-scale entropy (MSE) is effective mainly in quantifying the complexity of time series on different time scales. This paper applies a new method for assessing financial stability from the perspective of MSE based on skewness and kurtosis. To better understand which coarse-graining method is superior for different kinds of stock indexes, we take into account the developmental characteristics of stock markets on the three continents of Asia, North America and Europe. We study the volatility of different financial time series in addition to analyzing the similarities and differences of the coarsened time series from the perspective of skewness and kurtosis. A correspondence between the entropy values of stock sequences and the degree of stability of financial markets was observed. Three stocks with particular characteristics among the eight stock sequences are discussed, and their behavior matches the graphical results of applying the MSE method. A comparative study is conducted over synthetic and real-world data. Results show that the modified method is more sensitive to changes in dynamics and carries more valuable information, and that the discrimination provided by skewness and kurtosis is both clear and more stable.
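
    A sketch of the modified coarse-graining step, on the assumption that the usual window mean of multiscale entropy is simply swapped for the window skewness or kurtosis before a sample entropy is computed at each scale:

```python
import numpy as np
from scipy.stats import skew, kurtosis

def coarse_grain(x, scale, stat="mean"):
    """Coarse-grain a series for multiscale entropy: summarise each
    non-overlapping window of length `scale` with one statistic."""
    stats = {"mean": np.mean, "skew": skew, "kurtosis": kurtosis}
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    windows = x[:n * scale].reshape(n, scale)
    return np.array([stats[stat](w) for w in windows])
```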

  10. The Maximum Entropy Principle and the Modern Portfolio Theory

    Directory of Open Access Journals (Sweden)

    Ailton Cassetari

    2003-12-01

    Full Text Available In this work, a capital allocation methodology based on the Principle of Maximum Entropy was developed. Shannon's entropy is used as the measure, and its connections with Modern Portfolio Theory are also discussed. In particular, the methodology is tested by making a systematic comparison to: (1) the mean-variance (Markowitz) approach and (2) the mean-VaR approach (capital allocation based on the Value at Risk concept). In principle, such confrontations show the plausibility and effectiveness of the developed method.
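
    One plausible reading of the allocation scheme is to maximise the Shannon entropy of the weight vector subject to a budget constraint and a target expected return; the constraint set below is an assumption for illustration, not the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import minimize

def max_entropy_weights(mu, target_return):
    """Most diversified (maximum-entropy) long-only portfolio meeting
    a target expected return; mu holds the assets' expected returns."""
    n = len(mu)
    neg_entropy = lambda w: float(np.sum(w * np.log(w + 1e-12)))
    constraints = (
        {"type": "eq", "fun": lambda w: np.sum(w) - 1.0},
        {"type": "eq", "fun": lambda w: w @ mu - target_return},
    )
    result = minimize(neg_entropy, np.full(n, 1.0 / n),
                      bounds=[(0.0, 1.0)] * n, constraints=constraints)
    return result.x

print(max_entropy_weights(np.array([0.05, 0.08, 0.12]), 0.08))
```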

  11. A comparison between index of entropy and catastrophe theory methods for mapping groundwater potential in an arid region.

    Science.gov (United States)

    Al-Abadi, Alaa M; Shahid, Shamsuddin

    2015-09-01

    In this study, index of entropy and catastrophe theory methods were used for demarcating groundwater potential in an arid region using weighted linear combination techniques in a geographical information system (GIS) environment. A case study from the Badra area in the eastern part of central Iraq was analyzed and discussed. Six factors believed to influence groundwater occurrence, namely elevation, slope, aquifer transmissivity and storativity, soil, and distance to faults, were prepared as raster thematic layers to facilitate integration into the GIS environment. The factors were chosen based on the availability of data and the local conditions of the study area. Both techniques were used for computing the weights and assigning the ranks vital for applying the weighted linear combination approach. The results of applying both models indicated that the most influential groundwater occurrence factors were slope and elevation. The other factors have relatively smaller weights, implying that they play a minor role in groundwater occurrence. The groundwater potential index (GPI) values for both models were classified using the natural break classification scheme into five categories: very low, low, moderate, high, and very high. For validation of the generated GPI, relative operating characteristic (ROC) curves were used. According to the obtained area under the curve, the catastrophe model with 78% prediction accuracy was found to perform better than the entropy model with 77% prediction accuracy. The overall results indicated that both models have good capability for predicting groundwater potential zones.

  12. Microscopic insights into the NMR relaxation based protein conformational entropy meter

    Science.gov (United States)

    Kasinath, Vignesh; Sharp, Kim A.; Wand, A. Joshua

    2013-01-01

    Conformational entropy is a potentially important thermodynamic parameter contributing to protein function. Quantitative measures of conformational entropy are necessary for an understanding of its role but have been difficult to obtain. An empirical method that utilizes changes in conformational dynamics as a proxy for changes in conformational entropy has recently been introduced. Here we probe the microscopic origins of the link between conformational dynamics and conformational entropy using molecular dynamics simulations. Simulation of seven proteins gave an excellent correlation with measures of side-chain motion derived from NMR relaxation. The simulations show that the motion of methyl-bearing side-chains is sufficiently coupled to that of other side chains to serve as an excellent reporter of the overall side-chain conformational entropy. These results tend to validate the use of experimentally accessible measures of methyl motion - the NMR-derived generalized order parameters - as a proxy from which to derive changes in protein conformational entropy. PMID:24007504

  13. A High-Precision Time-Frequency Entropy Based on Synchrosqueezing Generalized S-Transform Applied in Reservoir Detection

    Directory of Open Access Journals (Sweden)

    Hui Chen

    2018-06-01

    Full Text Available Based on the fact that high frequencies are abnormally attenuated when seismic signals travel across reservoirs, a new method, named high-precision time-frequency entropy based on the synchrosqueezing generalized S-transform, is proposed for hydrocarbon reservoir detection in this paper. First, the proposed method obtains the time-frequency spectra by the synchrosqueezing generalized S-transform (SSGST), which are concentrated around the real instantaneous frequency of the signals. Then, considering the characteristics and effects of noise, we give a frequency constraint condition for calculating the entropy based on the time-frequency spectra. A synthetic example verifies that the entropy will be abnormally high when seismic signals undergo abnormal attenuation. Moreover, compared with the GST time-frequency entropy and the original SSGST time-frequency entropy on field data, the results of the proposed method show higher precision. The proposed method can not only accurately detect and locate hydrocarbon reservoirs, but also effectively suppress the impact of random noise.
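
    A sketch of a time-frequency entropy with a frequency constraint, using an ordinary STFT spectrogram as a stand-in for the synchrosqueezing generalized S-transform; the band limits and window parameters are assumed values.

```python
import numpy as np
from scipy.signal import spectrogram

def tf_entropy(trace, fs, fmin=10.0, fmax=60.0):
    """Shannon entropy of the normalised spectrum in each time slice,
    restricted to a frequency band to suppress noise. Anomalously
    high values flag the abnormal high-frequency attenuation that the
    paper associates with reservoirs."""
    f, t, S = spectrogram(trace, fs=fs, nperseg=128, noverlap=96)
    band = (f >= fmin) & (f <= fmax)       # frequency constraint
    P = S[band] + 1e-20
    P = P / P.sum(axis=0, keepdims=True)
    return t, -np.sum(P * np.log(P), axis=0)
```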

  14. Change of entropy in the martensitic transformation and its dependence in Cu-based shape memory alloys

    International Nuclear Information System (INIS)

    Romero, R.; Pelegrina, J.L.

    2003-01-01

    A study of the entropy change ΔS between the β phase and the martensite in Cu-based shape memory alloys is presented. From a compilation of available experimental data, the composition dependence of ΔS was studied. The experimental data were analyzed within the frame of a simple model based on the specific heats of the phases. It was demonstrated that the dependence of ΔS on composition enters only through the lattice parameter and the effective mass of the alloy. For the studied composition range, the greater vibrational entropy of the β phase is mainly controlled by the high-mass Cu atoms.

  15. Analysis of Entropy Generation in Flow of Methanol-Based Nanofluid in a Sinusoidal Wavy Channel

    Directory of Open Access Journals (Sweden)

    Muhammad Qasim

    2017-10-01

    Full Text Available The entropy generation due to heat transfer and fluid friction in mixed convective peristaltic flow of a methanol-Al2O3 nanofluid is examined. Maxwell's thermal conductivity model is used in the analysis. Velocity and temperature profiles are utilized in the computation of the entropy generation number. The effects of the involved physical parameters on velocity, temperature, entropy generation number, and Bejan number are discussed and explained graphically.
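
    The Bejan number reported in such analyses is the standard ratio of heat-transfer entropy generation to total entropy generation,

```latex
\mathrm{Be} \;=\; \frac{\dot{S}_{\mathrm{gen,heat}}}{\dot{S}_{\mathrm{gen,heat}} + \dot{S}_{\mathrm{gen,friction}}}\,,
```

    so Be → 1 indicates that heat-transfer irreversibility dominates, while Be → 0 indicates that fluid friction dominates.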

  16. Digital Image Stabilization Method Based on Variational Mode Decomposition and Relative Entropy

    Directory of Open Access Journals (Sweden)

    Duo Hao

    2017-11-01

    Full Text Available Cameras mounted on vehicles frequently suffer from image shake due to the vehicles' motions. To remove jitter motions and preserve intentional motions, a hybrid digital image stabilization method is proposed that uses variational mode decomposition (VMD) and relative entropy (RE). In this paper, the global motion vector (GMV) is initially decomposed into several narrow-banded modes by VMD. REs, which quantify the difference in probability distribution between two modes, are then calculated to identify the intentional and jitter motion modes. Finally, the summation of the jitter motion modes constitutes the jitter motion, whereas subtracting this sum from the GMV yields the intentional motion. The proposed stabilization method is compared with several known methods, namely the median filter (MF), Kalman filter (KF), wavelet decomposition (WD) method, empirical mode decomposition (EMD)-based method, and enhanced EMD-based method, to evaluate stabilization performance. Experimental results show that the proposed method outperforms the other stabilization methods.

  17. Feature extraction and learning using context cue and Rényi entropy based mutual information

    DEFF Research Database (Denmark)

    Pan, Hong; Olsen, Søren Ingvor; Zhu, Yaping

    2015-01-01

    … information. In particular, for feature extraction, we develop a new set of kernel descriptors, Context Kernel Descriptors (CKD), which enhance the original KDES by embedding the spatial context into the descriptors. Context cues contained in the context kernel enforce some degree of spatial consistency, thus improving the robustness of CKD. For feature learning and reduction, we propose a novel codebook learning method, based on a Rényi quadratic entropy based mutual information measure called Cauchy-Schwarz Quadratic Mutual Information (CSQMI), to learn a compact and discriminative CKD codebook. Projecting … as the information about the underlying labels of the CKD using CSQMI. Thus the resulting codebook and reduced CKD are discriminative. We verify the effectiveness of our method on several public image benchmark datasets such as YaleB, Caltech-101 and CIFAR-10, as well as a challenging chicken feet dataset of our own …

  18. [GSH fermentation process modeling using entropy-criterion based RBF neural network model].

    Science.gov (United States)

    Tan, Zuoping; Wang, Shitong; Deng, Zhaohong; Du, Guocheng

    2008-05-01

    The prediction accuracy and generalization of GSH fermentation process models are often deteriorated by noise in the corresponding experimental data. To avoid this problem, we present a novel RBF neural network modeling approach based on an entropy criterion. In contrast to traditional MSE-criterion based parameter learning, it considers the whole distribution structure of the training data set during parameter learning, and thus effectively avoids weak generalization and over-fitting. The proposed approach is then applied to GSH fermentation process modeling. Our results demonstrate that the proposed method has better prediction accuracy, generalization and robustness, and thus offers a potential application merit for GSH fermentation process modeling.

  19. Nonuniform Sparse Data Clustering Cascade Algorithm Based on Dynamic Cumulative Entropy

    Directory of Open Access Journals (Sweden)

    Ning Li

    2016-01-01

    Full Text Available A small amount of prior knowledge and randomly chosen initial cluster centers have a direct impact on the accuracy of iterative clustering algorithms. In this paper we propose a new algorithm that computes initial cluster centers and the best number of clusters for k-means clustering with little prior knowledge, and optimizes the clustering result. It constructs a Euclidean-distance control factor based on the aggregation density sparseness degree to select the initial cluster centers of nonuniform sparse data, and obtains initial data clusters by multidimensional diffusion density distribution. A multiobjective clustering approach based on dynamic cumulative entropy is adopted to optimize the initial data clusters and the number of clusters. The experimental results show that the newly proposed algorithm performs well in obtaining initial cluster centers for the k-means algorithm and effectively improves the clustering accuracy of nonuniform sparse data by about 5%.

  20. An Algorithm of Traffic Perception of DDoS Attacks against SOA Based on Time United Conditional Entropy

    Directory of Open Access Journals (Sweden)

    Yuntao Zhao

    2016-01-01

    Full Text Available DDoS attacks can prevent legitimate users from accessing a service by consuming the resources of target nodes, posing a significant threat to network and service availability. DDoS traffic perception is therefore the premise and foundation of whole-system security. In this paper, a method of DDoS traffic perception for SOA networks based on time united conditional entropy is proposed. Exploiting the many-to-one mapping between source IP addresses and destination IP addresses in DDoS attacks, the traffic characteristics of services are analyzed based on conditional entropy. Introducing the time dimension gives the algorithm the ability to perceive DDoS attacks on SOA services. Simulation results show that the novel method can realize DDoS traffic perception by analyzing abrupt variations of conditional entropy along the time dimension.
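
    A minimal sketch of the per-window statistic, assuming packets are reduced to (source IP, destination IP) pairs; the "time united" aspect then compares this value across successive windows, where a DDoS flood's many-to-one mapping makes it change abruptly.

```python
import math
from collections import Counter

def conditional_entropy(pairs):
    """H(dst | src) in bits over one window of (src, dst) pairs."""
    joint = Counter(pairs)
    src_counts = Counter(s for s, _ in pairs)
    n = len(pairs)
    h = 0.0
    for (s, _), c in joint.items():
        h -= (c / n) * math.log2(c / src_counts[s])
    return h

window = [("10.0.0.1", "svc"), ("10.0.0.2", "svc"), ("10.0.0.3", "svc")]
print(conditional_entropy(window))  # collapses toward 0 under a flood
```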

  1. Enthalpy/entropy contributions to conformational KIEs: theoretical predictions and comparison with experiment.

    Science.gov (United States)

    Fong, Aaron; Meyer, Matthew P; O'Leary, Daniel J

    2013-02-18

    Previous theoretical studies of Mislow's doubly-bridged biphenyl ketone 1 and dihydrodimethylphenanthrene 2 have determined significant entropic contributions to their normal (1) and inverse (2) conformational kinetic isotope effects (CKIEs). To broaden our investigation, we have used density functional methods to characterize the potential energy surfaces and vibrational frequencies for ground and transition structures of additional systems with measured CKIEs, including [2.2]-metaparacyclophane-d (3), 1,1'-binaphthyl (4), 2,2'-dibromo-[1,1'-biphenyl]-4,4'-dicarboxylic acid (5), and the 2-(N,N,N-trimethyl)-2'-(N,N-dimethyl)-diaminobiphenyl cation (6). We have also computed CKIEs in a number of systems whose experimental CKIEs are unknown. These include analogs of 1 in which the C=O groups have been replaced with CH₂ (7), O (8), and S (9) atoms and ring-expanded variants of 2 containing CH₂ (10), O (11), S (12), or C=O (13) groups. Vibrational entropy contributes to the CKIEs in all of these systems with the exception of cyclophane 3, whose isotope effect is predicted to be purely enthalpic in origin and whose Bigeleisen-Mayer ZPE term is equivalent to ΔΔH‡. There is variable correspondence between these terms in the other molecules studied, thus identifying additional examples of systems in which the Bigeleisen-Mayer formalism does not correlate with ΔH/ΔS dissections.

  2. Enthalpy/Entropy Contributions to Conformational KIEs: Theoretical Predictions and Comparison with Experiment

    Directory of Open Access Journals (Sweden)

    Aaron Fong

    2013-02-01

    Full Text Available Previous theoretical studies of Mislow's doubly-bridged biphenyl ketone 1 and dihydrodimethylphenanthrene 2 have determined significant entropic contributions to their normal (1) and inverse (2) conformational kinetic isotope effects (CKIEs). To broaden our investigation, we have used density functional methods to characterize the potential energy surfaces and vibrational frequencies for ground and transition structures of additional systems with measured CKIEs, including [2.2]-metaparacyclophane-d (3), 1,1'-binaphthyl (4), 2,2'-dibromo-[1,1'-biphenyl]-4,4'-dicarboxylic acid (5), and the 2-(N,N,N-trimethyl)-2'-(N,N-dimethyl)-diaminobiphenyl cation (6). We have also computed CKIEs in a number of systems whose experimental CKIEs are unknown. These include analogs of 1 in which the C=O groups have been replaced with CH2 (7), O (8), and S (9) atoms and ring-expanded variants of 2 containing CH2 (10), O (11), S (12), or C=O (13) groups. Vibrational entropy contributes to the CKIEs in all of these systems with the exception of cyclophane 3, whose isotope effect is predicted to be purely enthalpic in origin and whose Bigeleisen-Mayer ZPE term is equivalent to ΔΔH‡. There is variable correspondence between these terms in the other molecules studied, thus identifying additional examples of systems in which the Bigeleisen-Mayer formalism does not correlate with ΔH/ΔS dissections.

  3. An Entropy-based gene selection method for cancer classification using microarray data

    Directory of Open Access Journals (Sweden)

    Krishnan Arun

    2005-03-01

    Full Text Available Abstract Background Accurate diagnosis of cancer subtypes remains a challenging problem. Building classifiers based on gene expression data is a promising approach; yet the selection of non-redundant but relevant genes is difficult. The selected gene set should be small enough to allow diagnosis even in regular clinical laboratories and ideally identify genes involved in cancer-specific regulatory pathways. Here an entropy-based method is proposed that selects genes related to the different cancer classes while at the same time reducing the redundancy among the genes. Results The present study identifies a subset of features by maximizing the relevance and minimizing the redundancy of the selected genes. A figure of merit called normalized mutual information is employed to measure the relevance and the redundancy of the genes. In order to find a more representative subset of features, an iterative procedure is adopted that incorporates an initial clustering followed by data partitioning and the application of the algorithm to each of the partitions. A leave-one-out approach then selects the most commonly selected genes across all the different runs, and the gene selection algorithm is applied again to pare down the list of selected genes until a minimal subset is obtained that gives a satisfactory accuracy of classification. The algorithm was applied to three different data sets and the results were compared to work done by others using the same data sets. Conclusion This study presents an entropy-based iterative algorithm for selecting genes from microarray data that are able to classify various cancer sub-types with high accuracy. In addition, the feature set obtained is very compact; that is, the redundancy between genes is reduced to a large extent. This implies that classifiers can be built with a smaller subset of genes.
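
    A sketch of the relevance/redundancy measure: a histogram-based normalised mutual information, 2·I(X;Y)/(H(X)+H(Y)). The discretisation and the paper's exact normalisation are assumptions here.

```python
import numpy as np

def normalized_mutual_information(x, y, bins=10):
    """NMI between two vectors, e.g. a gene's expression profile and
    class labels (or another gene, for redundancy)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

    def h(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    mi = h(p_x) + h(p_y) - h(p_xy.ravel())
    return 2.0 * mi / (h(p_x) + h(p_y))
```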

  4. Nonsymmetric entropy and maximum nonsymmetric entropy principle

    International Nuclear Information System (INIS)

    Liu Chengshi

    2009-01-01

    Within the framework of a statistical model, the concept of nonsymmetric entropy, which generalizes the concepts of Boltzmann's entropy and Shannon's entropy, is defined. The maximum nonsymmetric entropy principle is proved. Some important distribution laws, such as the power law, can be derived naturally from this principle. In particular, nonsymmetric entropy is more convenient than other entropies, such as Tsallis's entropy, for deriving power laws.

  5. Assessment of sustainable urban transport development based on entropy and unascertained measure.

    Science.gov (United States)

    Li, Yancang; Yang, Jing; Shi, Huawang; Li, Yijie

    2017-01-01

    To find a more effective method for assessing sustainable urban transport development, a comprehensive assessment model was established based on the unascertained measure. Considering the factors influencing urban transport development, comprehensive assessment indexes were selected, covering urban economic development, transport demand, environmental quality and energy consumption, and an assessment system for sustainable urban transport development was proposed. In view of the different influencing factors of urban transport development, the index weights were calculated through the entropy weight coefficient method. Qualitative and quantitative analyses were conducted according to actual conditions. The grade was then obtained by using the credible degree recognition criterion, from which the urban transport development level can be determined. Finally, a comprehensive assessment method for urban transport development was introduced. Application practice showed that the method can be used reasonably and effectively for the comprehensive assessment of urban transport development.

  6. Permutation entropy analysis of financial time series based on Hill's diversity number

    Science.gov (United States)

    Zhang, Yali; Shang, Pengjian

    2017-12-01

    In this paper the permutation entropy based on Hill's diversity number (Nn,r) is introduced as a new way to assess the complexity of a complex dynamical system such as a stock market. We test the performance of this method with simulated data. Results show that Nn,r with appropriate parameters is more sensitive to changes in the system and clearly describes the trends of complex systems. In addition, we study stock closing price series comprising six indices: three US stock indices and three Chinese stock indices over different periods. Nn,r can quantify the changes in complexity of stock market data. Moreover, we obtain richer information from Nn,r and derive some properties of the differences between the US and Chinese stock indices.
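
    For reference, the Hill diversity number of order q for a pattern distribution {p_i} is

```latex
{}^{q}D \;=\; \Bigl(\sum_i p_i^{\,q}\Bigr)^{1/(1-q)}, \qquad q \neq 1,
```

    with the q → 1 limit equal to the exponential of the Shannon entropy; in this setting the p_i are the frequencies of the ordinal (permutation) patterns.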

  7. Analyzing the Performances of Automotive Companies Using Entropy Based MAUT and SAW Methods

    Directory of Open Access Journals (Sweden)

    Nuri Ömürbek

    2016-06-01

    Full Text Available In this study, the performances of automotive companies traded on BİST (Istanbul Stock Exchange) and operating in Turkey have been compared using multi-criteria decision-making techniques. Data of the most important automotive companies operating in Turkey have been analyzed based on capital, stock certificate, market value, sales revenue, number of employees, net profit margin, current ratio, net profit/capital, net profit/sales and net sales/number of employees. The criteria applied in performance measurement were obtained from the companies' 2014 operating reports. The entropy method has been used to determine the weights of the criteria. Those weights have then been used in the MAUT (Multi-Attribute Utility Theory) and SAW (Simple Additive Weighting) methods to rank the automotive companies' performances. The findings highlight that the same companies occupied the first three places under both methods.
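
    A minimal SAW sketch, assuming the entropy-derived criterion weights are already in hand: each column is normalised against the best value for that criterion, and the weighted ratios are summed. The figures below are illustrative, not company data.

```python
import numpy as np

def saw_scores(X, weights, benefit):
    """Simple Additive Weighting. benefit[j] is True for a
    larger-is-better criterion and False for a cost criterion."""
    X = np.asarray(X, dtype=float)
    R = np.empty_like(X)
    for j in range(X.shape[1]):
        col = X[:, j]
        R[:, j] = col / col.max() if benefit[j] else col.min() / col
    return R @ np.asarray(weights, dtype=float)

scores = saw_scores([[9.5, 1.2], [7.1, 0.8], [8.3, 1.5]],
                    weights=[0.6, 0.4], benefit=[True, False])
print(np.argsort(scores)[::-1])  # ranking, best first
```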

  8. Symplectic entropy

    International Nuclear Information System (INIS)

    De Nicola, Sergio; Fedele, Renato; Man'ko, Margarita A; Man'ko, Vladimir I

    2007-01-01

    The tomographic-probability description of quantum states is reviewed. The symplectic tomography of quantum states with continuous variables is studied. The symplectic entropy of states with continuous variables is discussed and its relation to Shannon entropy and information is elucidated. The known entropic uncertainty relations for the probability distributions in position and momentum of a particle are extended, and new uncertainty relations for symplectic entropy are obtained. The partial case of symplectic entropy, which is the optical entropy of quantum states, is considered. The entropy associated with the optical tomogram is shown to satisfy the new entropic uncertainty relation. The example of Gaussian states of the harmonic oscillator is studied, and the entropic uncertainty relations for optical tomograms of the Gaussian state are shown to minimize the uncertainty relation.

  9. A Review of Solid-Solution Models of High-Entropy Alloys Based on Ab Initio Calculations

    Directory of Open Access Journals (Sweden)

    Fuyang Tian

    2017-11-01

    Full Text Available Similar to the importance of XRD in experiments, ab initio calculations have been applied as a powerful tool to predict new potential materials and investigate the intrinsic properties of materials in theory. As typical solid-solution materials, the large degree of uncertainty in high-entropy alloys (HEAs) makes applying ab initio calculations to HEAs difficult. The present review focuses on the available ab initio based solid-solution models (virtual lattice approximation, coherent potential approximation, special quasirandom structure, similar local atomic environment, maximum-entropy method, and hybrid Monte Carlo/molecular dynamics) and their applications and limits in single-phase HEAs.
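
    The "high entropy" in HEAs refers to the ideal configurational entropy of mixing, which for molar fractions c_i is

```latex
\Delta S_{\mathrm{mix}} \;=\; -R \sum_{i=1}^{N} c_i \ln c_i ,
```

    reaching its maximum R ln N for an equiatomic N-component alloy; this compositional randomness is what the solid-solution models above approximate on a lattice.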

  10. Nonequilibrium entropies

    International Nuclear Information System (INIS)

    Maes, Christian

    2012-01-01

    In contrast to the quite unique entropy concept useful for systems in (local) thermodynamic equilibrium, there is a variety of quite distinct nonequilibrium entropies, reflecting different physical points. We disentangle these entropies as they relate to heat, fluctuations, response, time asymmetry, variational principles, monotonicity, volume contraction or statistical forces. However, not all of those extensions yield state quantities as understood thermodynamically. At the end we sketch how aspects of dynamical activity can take over for obtaining an extended Clausius relation.

  11. Carbon emission analysis and evaluation of industrial departments in China: An improved environmental DEA cross model based on information entropy.

    Science.gov (United States)

    Han, Yongming; Long, Chang; Geng, Zhiqiang; Zhang, Keyu

    2018-01-01

    Environmental protection and carbon emission reduction play a crucial role in sustainable development. However, environmental efficiency analysis and evaluation based on the traditional data envelopment analysis (DEA) cross model is subjective and inaccurate, because all elements in a column or a row of the cross evaluation matrix (CEM) in the traditional DEA cross model are given the same weight. Therefore, this paper proposes an improved environmental DEA cross model based on information entropy to analyze and evaluate the carbon emissions of industrial departments in China. The information entropy is applied to build an entropy distance based on the turbulence of the whole system and to calculate the weights in the CEM of the environmental DEA cross model in a dynamic way. Monte Carlo simulation shows that the new weights constructed from the information entropy are unique and globally optimal. Finally, compared with the traditional environmental DEA and DEA cross models, the improved environmental DEA cross model has better efficiency discrimination ability on the data of industrial departments in China. Moreover, the proposed model can estimate the carbon emission reduction potential of industrial departments to improve energy efficiency.

  12. Distance-Based Configurational Entropy of Proteins from Molecular Dynamics Simulations.

    Science.gov (United States)

    Fogolari, Federico; Corazza, Alessandra; Fortuna, Sara; Soler, Miguel Angel; VanSchouwen, Bryan; Brancolini, Giorgia; Corni, Stefano; Melacini, Giuseppe; Esposito, Gennaro

    2015-01-01

    Estimation of configurational entropy from molecular dynamics trajectories is a difficult task which is often performed using quasi-harmonic or histogram analysis. An entirely different approach, proposed recently, estimates the local density distribution around each conformational sample by measuring the distance from its nearest neighbors. In this work we show that this theoretically well-grounded method can be easily applied to estimate the entropy from conformational sampling. We consider a set of systems that are representative of important biomolecular processes. In particular: reference entropies for amino acids in unfolded proteins are obtained from a database of residues not participating in secondary structure elements; the conformational entropy of folding of β2-microglobulin is computed from molecular dynamics simulations using reference entropies for the unfolded state; backbone conformational entropy is computed from molecular dynamics simulations of four different states of the EPAC protein and compared with order parameters (often used as a measure of entropy); the conformational and rototranslational entropy of binding is computed from simulations of 20 tripeptides bound to the peptide binding protein OppA and of β2-microglobulin bound to a citrate-coated gold surface. This work shows the potential of the method for the most representative biological processes involving proteins, and provides a valuable alternative, particularly in the cases shown, where other approaches are problematic.
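
    A sketch of the nearest-neighbour estimate in one common (Kozachenko-Leonenko) form; conventions for the additive constants vary across the literature, so treat the exact offsets as assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(samples):
    """Differential entropy estimate (nats) for samples of shape
    (N, d), e.g. dihedral-angle snapshots from an MD trajectory.
    Duplicate points (zero nearest-neighbour distance) must be
    removed or jittered first."""
    X = np.asarray(samples, dtype=float)
    n, d = X.shape
    # distance to the nearest *other* sample: k=2 because the nearest
    # hit of each query point is the point itself
    eps = cKDTree(X).query(X, k=2)[0][:, 1]
    log_unit_ball = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(1) + log_unit_ball + d * np.mean(np.log(eps))
```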

  13. Conservation analysis of dengue virus T-cell epitope-based vaccine candidates using peptide block entropy

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Zhang, Guang Lan; Keskin, Derin B.

    2011-01-01

    … residues. The block entropy analysis provides broad coverage of variant antigens. We applied the block entropy analysis method to the proteomes of the four serotypes of dengue virus (DENV) and found 1,551 blocks of 9-mer peptides, which cover 99% of available sequences with five or fewer unique peptides …
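
    The block-entropy scan itself is straightforward: slide a 9-mer window across aligned serotype sequences and score each position by the Shannon entropy of the peptides observed there. The sketch below assumes pre-aligned, gap-free sequences.

```python
import math
from collections import Counter

def block_entropies(sequences, k=9):
    """Per-position Shannon entropy of k-mer blocks across aligned
    sequences; low-entropy blocks are conserved across variants and
    need few peptides to cover them."""
    length = min(len(s) for s in sequences)
    out = []
    for start in range(length - k + 1):
        counts = Counter(s[start:start + k] for s in sequences)
        n = sum(counts.values())
        h = -sum(c / n * math.log2(c / n) for c in counts.values())
        out.append((start, h, len(counts)))  # position, entropy, #unique
    return out
```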

  14. Entropy Based Feature Selection for Fuzzy Set-Valued Information Systems

    Science.gov (United States)

    Ahmed, Waseem; Sufyan Beg, M. M.; Ahmad, Tanvir

    2018-06-01

    In Set-valued Information Systems (SIS), several objects contain more than one value for some attributes. The tolerance relation used for handling SIS sometimes leads to loss of certain information. To surmount this problem, the fuzzy rough model was introduced. However, in some cases, SIS may contain real-valued or continuous set-values, so the existing fuzzy rough model for handling information systems with fuzzy set-values needs some changes. In this paper, the Fuzzy Set-valued Information System (FSIS) is proposed and a fuzzy similarity relation for FSIS is defined. Yager's relative conditional entropy is studied to find the significance measure of a candidate attribute of the FSIS. Using these significance values, three greedy forward algorithms are discussed for finding the reduct and relative reduct of the proposed FSIS. An experiment was conducted on a sample population of a real dataset, and the classification accuracies of the proposed FSIS were compared with those of the existing SIS and single-valued Fuzzy Information Systems, which demonstrated the effectiveness of the proposed FSIS.

  15. Coarse-graining using the relative entropy and simplex-based optimization methods in VOTCA

    Science.gov (United States)

    Rühle, Victor; Jochum, Mara; Koschke, Konstantin; Aluru, N. R.; Kremer, Kurt; Mashayak, S. Y.; Junghans, Christoph

    2014-03-01

    Coarse-grained (CG) simulations are an important tool to investigate systems on larger time and length scales. Several methods for systematic coarse-graining were developed, varying in complexity and the property of interest. Thus, the question arises which method best suits a specific class of system and desired application. The Versatile Object-oriented Toolkit for Coarse-graining Applications (VOTCA) provides a uniform platform for coarse-graining methods and allows for their direct comparison. We present recent advances of VOTCA, namely the implementation of the relative entropy method and downhill simplex optimization for coarse-graining. The methods are illustrated by coarse-graining SPC/E bulk water and a water-methanol mixture. Both CG models reproduce the pair distributions accurately. SYM is supported by AFOSR under grant 11157642 and by NSF under grant 1264282. CJ was supported in part by the NSF PHY11-25915 at KITP. K. Koschke acknowledges funding by the Nestle Research Center.

  16. Physicochemical attack against solid tumors based on the reversal of direction of entropy flow: an attempt to introduce thermodynamics in anticancer therapy

    Directory of Open Access Journals (Sweden)

    Lv Xiaogui

    2006-11-01

    Full Text Available Abstract Background There are many differences between healthy tissue and growing tumor tissue, including metabolic, structural and thermodynamic differences. Both structural and thermodynamic differences can be used to follow the entropy differences in cancerous and normal tissue. Entropy production is a bilinear form of the rates of irreversible processes and the corresponding "generalized forces". Entropy production due to various dissipation mechanisms, based on temperature differences, chemical potential gradients, chemical affinity, viscous stress and exerted force, is a promising tool for calculations relating to potential targets for tumor isolation and demarcation. Methods The relative importance of five forms of entropy production was assessed through mathematical estimation. Using our mathematical model we demonstrated that the rate of entropy production by a cancerous cell is always higher than that of a healthy cell, except when external energy is applied. The different rates of entropy production by the two kinds of cells influence the direction of entropy flow between them. Entropy flow from a cancerous cell to a healthy cell transfers information about the cancerous cell and propagates its invasive action to the healthy tissue. To change the direction of entropy flow, in addition to designing certain biochemical pathways to reduce the rate of entropy production by cancerous cells, we suggest supplying external energy to the tumor area, changing the relative rates of entropy production of the two kinds of cells and leading to a higher entropy accumulation in the surrounding normal cells than in the tumorous cells. Conclusion Through the use of mathematical models it was quantitatively demonstrated that when no external force field is applied, the rate of entropy production of cancerous cells is always higher than that of healthy cells. However, when the external energy of square wave electric pulses is applied to …

  17. Causal relationship between the global foreign exchange market based on complex networks and entropy theory

    International Nuclear Information System (INIS)

    Cao, Guangxi; Zhang, Qi; Li, Qingchen

    2017-01-01

    Highlights: • Mutual information is used as the edge weights of nodes instead of PCC, which overcomes the shortcomings of linear correlation functions. • SGD turns into a new cluster center and gradually becomes a point connecting the Asian and European clusters during and after the US sub-prime crisis. • Liang's entropy theory, which has not been adopted before in the global foreign exchange market, is considered. - Abstract: The foreign exchange (FX) market is a typical complex dynamic system under the background of exchange rate marketization reform and is an important part of the financial market. This study aims to generate an international FX network based on complex network theory. This study employs the mutual information method to judge the nonlinear characteristics of 54 major currencies in international FX markets. Through this method, we find that the FX network possesses a small average path length and a large clustering coefficient under different thresholds and that it exhibits small-world characteristics as a whole. Results show that the relationship between FX rates is close. Volatility can quickly transfer in the whole market, and the FX volatility of influential individual states transfers at a fast pace and a large scale. The period from July 21, 2005 to March 31, 2015 is subdivided into three sub-periods (i.e., before, during, and after the US sub-prime crisis) to analyze the topology evolution of FX markets using the maximum spanning tree approach. Results show that the USD gradually lost its core position, EUR remained a stable center, and the center of the Asian cluster became unstable. Liang's entropy theory is used to analyze the causal relationship between the four large clusters of the world.
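
    The record above builds a currency network from pairwise mutual information and extracts its maximum spanning tree. A minimal Python sketch of that pipeline follows; the histogram-based MI estimator, the 0.05 edge threshold, and the variable names (`returns` as a T x N array of exchange-rate log-returns, `labels` as currency codes) are illustrative assumptions, not the paper's exact procedure.

      import numpy as np
      import networkx as nx

      def mutual_information(x, y, bins=16):
          """Estimate mutual information (bits) between two return series
          from a 2-D histogram of their joint distribution."""
          joint, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

      def fx_network(returns, labels, threshold=0.05):
          """Build a graph whose edge weights are pairwise MI values,
          then extract its maximum spanning tree."""
          g = nx.Graph()
          n = returns.shape[1]
          for i in range(n):
              for j in range(i + 1, n):
                  mi = mutual_information(returns[:, i], returns[:, j])
                  if mi > threshold:
                      g.add_edge(labels[i], labels[j], weight=mi)
          return g, nx.maximum_spanning_tree(g)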

  18. On Measuring the Complexity of Networks: Kolmogorov Complexity versus Entropy

    Directory of Open Access Journals (Sweden)

    Mikołaj Morzy

    2017-01-01

    Full Text Available One of the most popular methods of estimating the complexity of networks is to measure the entropy of network invariants, such as adjacency matrices or degree sequences. Unfortunately, entropy and all entropy-based information-theoretic measures have several vulnerabilities. These measures are neither independent of a particular representation of the network nor able to capture the properties of the generative process which produces the network. Instead, we advocate the use of algorithmic entropy as the basis for a complexity definition for networks. Algorithmic entropy (also known as Kolmogorov complexity, or K-complexity for short) evaluates the complexity of the description required for a lossless recreation of the network. This measure is not affected by a particular choice of network features and does not depend on the method of network representation. We perform experiments on Shannon entropy and K-complexity for gradually evolving networks. The results of these experiments point to K-complexity as the more robust and reliable measure of network complexity. The original contribution of the paper includes the introduction of several new entropy-deceiving networks and the empirical comparison of entropy and K-complexity as fundamental quantities for constructing complexity measures for networks.
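
    As an illustration of the two quantities compared in this record, the sketch below contrasts the Shannon entropy of a degree sequence with a compression-based upper bound that is commonly used as a practical proxy for Kolmogorov complexity. The zlib proxy is a standard stand-in, not the specific estimator used by the authors.

      import zlib
      import numpy as np

      def degree_entropy(degrees):
          """Shannon entropy (bits) of a network's degree distribution."""
          _, counts = np.unique(degrees, return_counts=True)
          p = counts / counts.sum()
          return float(-(p * np.log2(p)).sum())

      def k_complexity_proxy(adjacency):
          """Upper-bound proxy for algorithmic (Kolmogorov) complexity: the
          byte length of a losslessly compressed 0/1 adjacency matrix."""
          bits = np.packbits(adjacency.astype(np.uint8))
          return len(zlib.compress(bits.tobytes(), level=9))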

  19. Properties of Fuzzy Entropy Based on the Shape Change of Membership Function

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Modification of a fuzzy partition often changes the fuzziness of a fuzzy system. Research on how the fuzzy entropy of a fuzzy set responds to shape alterations of its membership function therefore plays a significant role in analyzing changes in the fuzziness of a fuzzy system, because a fuzzy partition consists of a set of fuzzy sets satisfying special constraints. This paper presents several results on the entropy changes of a fuzzy set. First, the entropies of two fuzzy sets of the same type have a constant proportional relationship that depends on the ratio of the sizes of their support intervals. Second, for Triangular Fuzzy Numbers (TFNs), the entropies of any two TFNs, which need not be of the same type, also have a constant proportional relationship depending on the ratio of the sizes of their support intervals; hence, any two TFNs with support intervals of equal size have equal entropies. Third, for two Triangular Fuzzy Sets (TFSs) with support intervals of equal size but different heights, the relationship between their entropies depends on their heights. Finally, we point out a mistake in Chen's assertion that the entropy of the fuzzy set resulting from an elevation operation is directly proportional to that of the original one, with the elevation factor simply acting as the proportionality factor. These results should contribute to the analysis and design of fuzzy systems.
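
    The support-interval claim can be checked numerically. The sketch below uses the De Luca-Termini entropy, one common fuzzy entropy (the paper's exact definition may differ), integrated over a discretized triangular membership function; for a TFN (a, b, c) the integral works out to (c - a)/2 regardless of where the peak b sits, so the entropy depends only on the support width.

      import numpy as np

      def shannon_fn(mu):
          """Shannon function S(u) = -u ln u - (1-u) ln(1-u), with S(0)=S(1)=0."""
          mu = np.clip(mu, 1e-12, 1 - 1e-12)
          return -(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))

      def tfn_entropy(a, b, c, n=10001):
          """De Luca-Termini style entropy of a triangular fuzzy number (a, b, c),
          integrated over its support with the trapezoidal rule."""
          x = np.linspace(a, c, n)
          mu = np.where(x <= b, (x - a) / max(b - a, 1e-12),
                                (c - x) / max(c - b, 1e-12))
          return float(np.trapz(shannon_fn(mu), x))

      # Entropies of two TFNs scale with support width (ratio ~ 2 here):
      print(tfn_entropy(0, 1, 2), tfn_entropy(0, 2, 4))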

  20. Gravel Image Segmentation in Noisy Background Based on Partial Entropy Method

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Because of wide variation in gray levels and particle dimensions, the presence of many small gravel objects in the background, and corruption of the image by noise, it is difficult to segment gravel objects. In this paper, we develop a partial entropy method and succeed in segmenting gravel objects. We give entropy principles and the corresponding calculation methods. Moreover, we use the minimum entropy error to select a threshold automatically for segmenting the image. We introduce a filtering method based on mathematical morphology. Segmentation experiments performed with different window dimensions on a group of gravel images demonstrate that this method has a high segmentation rate and low noise sensitivity.
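
    For concreteness, the sketch below implements Kapur's classic maximum-entropy threshold selection, a standard entropy-based criterion that is closely related to, but not identical with, the paper's partial entropy and minimum-entropy-error approach. It assumes an 8-bit grayscale image.

      import numpy as np

      def kapur_threshold(image, bins=256):
          """Kapur's maximum-entropy threshold: pick t maximizing the sum of
          the Shannon entropies of the background and foreground histograms."""
          hist, _ = np.histogram(image, bins=bins, range=(0, bins))
          p = hist / hist.sum()
          best_t, best_h = 0, -np.inf
          for t in range(1, bins - 1):
              w0, w1 = p[:t].sum(), p[t:].sum()
              if w0 < 1e-12 or w1 < 1e-12:
                  continue
              p0, p1 = p[:t] / w0, p[t:] / w1
              h = -(p0[p0 > 0] * np.log(p0[p0 > 0])).sum() \
                  - (p1[p1 > 0] * np.log(p1[p1 > 0])).sum()
              if h > best_h:
                  best_t, best_h = t, h
          return best_t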

  1. Entropy equilibrium equation and dynamic entropy production in environment liquid

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The entropy equilibrium equation is the basis of non-equilibrium thermodynamics. In the classical entropy equilibrium equation, however, the internal energy includes the kinetic energy of the fluid micelle relative to the mass center, which is not the mean kinetic energy of molecular movement in thermodynamics. Here a modified entropy equilibrium equation is deduced, based on the concept that the internal energy is just the mean kinetic energy of molecular movement. A dynamic entropy production term is introduced into the entropy equilibrium equation to describe the dynamic process distinctly. This modified entropy equilibrium equation can describe the entropy variation of both irreversible and reversible processes in a thermodynamic system. It is more reasonable and suitable for wider applications.

  2. Entropy Production of Stars

    Directory of Open Access Journals (Sweden)

    Leonid M. Martyushev

    2015-06-01

    Full Text Available The entropy production (inside the volume bounded by a photosphere) of main-sequence stars, subgiants, giants, and supergiants is calculated based on B–V photometry data. A non-linear inverse relationship between thermodynamic fluxes and forces, as well as an almost constant specific (per volume) entropy production of main-sequence stars (for 95% of stars, this quantity lies within 0.5 to 2.2 of the corresponding solar magnitude), is found. The obtained results are discussed from the perspective of known extreme principles related to entropy production.

  3. Application of Maximum Entropy Distribution to the Statistical Properties of Wave Groups

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    New distributions of the statistics of wave groups based on the maximum entropy principle are presented. The maximum entropy distributions appear to be superior to conventional distributions when applied to a limited amount of information. Applications to wave group properties show the effectiveness of the maximum entropy distribution. An FFT filtering method is employed to obtain the wave envelope quickly and efficiently. Comparisons of both the maximum entropy distribution and the distribution of Longuet-Higgins (1984) with laboratory wind-wave data show that the former gives a better fit.
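
    The FFT-based envelope extraction mentioned above can be realized with the analytic-signal (Hilbert transform) construction. A minimal sketch, assuming `eta` is a zero-mean surface elevation record sampled at uniform intervals:

      import numpy as np

      def wave_envelope(eta):
          """Envelope of a zero-mean elevation record via the analytic signal,
          computed with an FFT-based Hilbert transform."""
          n = len(eta)
          spec = np.fft.fft(eta)
          h = np.zeros(n)              # frequency-domain Hilbert weights
          h[0] = 1.0
          if n % 2 == 0:
              h[n // 2] = 1.0
              h[1:n // 2] = 2.0
          else:
              h[1:(n + 1) // 2] = 2.0
          analytic = np.fft.ifft(spec * h)
          return np.abs(analytic)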

  4. An entropy-based improved k-top scoring pairs (TSP) method for ...

    African Journals Online (AJOL)

    DR. NJ TONUKARI

    2012-06-05

    Jun 5, 2012 ... Key words: Cancer classification, gene expression, k-TSP, information entropy, gene selection.

  5. Design of high entropy alloys based on the experience from commercial superalloys

    Science.gov (United States)

    Wang, Z.; Huang, Y.; Wang, J.; Liu, C. T.

    2015-01-01

    High entropy alloys (HEAs) have been drawing increasing attention recently, and gratifying results have been obtained. However, the existing metallurgical rules for HEAs do not provide specific guidance for selecting candidate alloys for structural applications. Our brief survey reveals that many commercial superalloys have medium to high configurational entropies. The experience with commercial superalloys thus provides a clue to help us develop HEAs for structural applications.
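
    The configurational entropy used to place alloys on the HEA scale is the ideal entropy of mixing. A minimal sketch (the common convention of calling entropies above about 1.5R "high" is a literature convention, not a result of this record):

      import numpy as np

      R = 8.314  # J/(mol K), universal gas constant

      def configurational_entropy(fractions):
          """Ideal configurational entropy of mixing: -R * sum(x_i ln x_i)."""
          x = np.asarray(fractions, dtype=float)
          x = x / x.sum()          # normalize mole fractions
          x = x[x > 0]
          return float(-R * np.sum(x * np.log(x)))

      # Equiatomic five-component HEA (e.g., the Cantor alloy CoCrFeMnNi):
      print(configurational_entropy([0.2] * 5) / R)  # ln(5) ~ 1.61, i.e. 1.61R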

  6. Predicting the Outcome of NBA Playoffs Based on the Maximum Entropy Principle

    OpenAIRE

    Ge Cheng; Zhenyu Zhang; Moses Ntanda Kyebambe; Nasser Kimbugwe

    2016-01-01

    Predicting the outcome of National Basketball Association (NBA) matches poses a challenging problem of interest to the research community as well as the general public. In this article, we formalize the problem of predicting NBA game results as a classification problem and apply the principle of Maximum Entropy to construct an NBA Maximum Entropy (NBAME) model that fits to discrete statistics for NBA games, and then predict the outcomes of NBA playoffs using the model. Our results reveal that...

  7. Thermodynamics based on the principle of least abbreviated action: Entropy production in a network of coupled oscillators

    International Nuclear Information System (INIS)

    Garcia-Morales, Vladimir; Pellicer, Julio; Manzanares, Jose A.

    2008-01-01

    We present some novel thermodynamic ideas based on the Maupertuis principle. By considering Hamiltonians written in terms of appropriate action-angle variables we show that thermal states can be characterized by the action variables and by their evolution in time when the system is nonintegrable. We propose dynamical definitions for the equilibrium temperature and entropy as well as an expression for the nonequilibrium entropy valid for isolated systems with many degrees of freedom. This entropy is shown to increase in the relaxation to equilibrium of macroscopic systems with short-range interactions, which constitutes a dynamical justification of the Second Law of Thermodynamics. Several examples are worked out to show that this formalism yields the right microcanonical (equilibrium) quantities. The relevance of this approach to nonequilibrium situations is illustrated with an application to a network of coupled oscillators (Kuramoto model). We provide an expression for the entropy production in this system, finding that its positive value is directly related to dissipation at the steady state in attaining order through synchronization.

  8. Changes in the Complexity of Heart Rate Variability with Exercise Training Measured by Multiscale Entropy-Based Measurements

    Directory of Open Access Journals (Sweden)

    Frederico Sassoli Fazan

    2018-01-01

    Full Text Available Quantifying complexity from heart rate variability (HRV) series is a challenging task, and multiscale entropy (MSE), along with its variants, has been demonstrated to be one of the most robust approaches to achieve this goal. Although physical training is known to be beneficial, there is little information about the long-term complexity changes induced by physical conditioning. The present study aimed to quantify the changes in physiological complexity elicited by physical training through multiscale entropy-based complexity measurements. Rats were subject to a protocol of medium intensity training (n = 13) or a sedentary protocol (n = 12). One-hour HRV series were obtained from all conscious rats five days after the experimental protocol. We estimated MSE, multiscale dispersion entropy (MDE) and multiscale SDiff_q from the HRV series. Multiscale SDiff_q is a recent approach that accounts for entropy differences between a given time series and its shuffled dynamics. From SDiff_q, three attributes (q-attributes) were derived, namely SDiff_qmax, q_max and q_zero. MSE, MDE and the multiscale q-attributes presented similar profiles, except for SDiff_qmax. q_max showed significant differences between trained and sedentary groups on time scales 6 to 20. Results suggest that physical training increases the system complexity and that multiscale q-attributes provide valuable information about the physiological complexity.
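
    A minimal sketch of the MSE computation used here, combining coarse-graining with sample entropy. The parameter choices (m = 2, r = 0.2 times the standard deviation) follow common practice and are assumptions, and the brute-force distance matrix is only suitable for short series.

      import numpy as np

      def sample_entropy(x, m=2, r_factor=0.2):
          """SampEn(m, r): negative log of the conditional probability that
          sequences similar for m points remain similar for m + 1 points."""
          x = np.asarray(x, dtype=float)
          r = r_factor * x.std()
          def pair_count(mm):
              t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
              d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)
              return (np.sum(d <= r) - len(t)) / 2   # exclude self-matches
          b, a = pair_count(m), pair_count(m + 1)
          return -np.log(a / b) if a > 0 and b > 0 else np.inf

      def multiscale_entropy(x, max_scale=20, m=2):
          """MSE profile: SampEn of block-averaged (coarse-grained) series."""
          x = np.asarray(x, dtype=float)
          return [sample_entropy(x[:len(x) // s * s].reshape(-1, s).mean(axis=1), m)
                  for s in range(1, max_scale + 1)]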

  9. Interval-Valued Hesitant Fuzzy Multiattribute Group Decision Making Based on Improved Hamacher Aggregation Operators and Continuous Entropy

    Directory of Open Access Journals (Sweden)

    Jun Liu

    2017-01-01

    Full Text Available Under the interval-valued hesitant fuzzy information environment, we investigate a multiattribute group decision making (MAGDM) method with continuous entropy weights and improved Hamacher information aggregation operators. Firstly, we introduce the axiomatic definition of entropy for interval-valued hesitant fuzzy elements (IVHFEs) and construct a continuous entropy formula on the basis of the continuous ordered weighted averaging (COWA) operator. Then, based on the Hamacher t-norm and t-conorm, the adjusted operational laws for IVHFEs are defined. In order to aggregate interval-valued hesitant fuzzy information, some new improved interval-valued hesitant fuzzy Hamacher aggregation operators are investigated, including the improved interval-valued hesitant fuzzy Hamacher ordered weighted averaging (I-IVHFHOWA) operator and the improved interval-valued hesitant fuzzy Hamacher ordered weighted geometric (I-IVHFHOWG) operator, the desirable properties of which are discussed. In addition, the relationship among these proposed operators is analyzed in detail. Applying the continuous entropy and the proposed operators, an approach to MAGDM is developed. Finally, a numerical example for emergency operating center (EOC) selection is provided, and comparative analyses with existing methods are performed to demonstrate that the proposed approach is both valid and practical to deal with group decision making problems.

  10. Atomistic-level non-equilibrium model for chemically reactive systems based on steepest-entropy-ascent quantum thermodynamics

    International Nuclear Information System (INIS)

    Li, Guanchen; Al-Abbasi, Omar; Von Spakovsky, Michael R

    2014-01-01

    This paper outlines an atomistic-level framework for modeling the non-equilibrium behavior of chemically reactive systems. The framework, called steepest-entropy-ascent quantum thermodynamics (SEA-QT), is based on the paradigm of intrinsic quantum thermodynamics (IQT), a theory that unifies quantum mechanics and thermodynamics into a single discipline with wide applications to the study of non-equilibrium phenomena at the atomistic level. SEA-QT is a novel approach for describing the state of chemically reactive systems as well as the kinetic and dynamic features of the reaction process without any assumptions of near-equilibrium states or weak interactions with a reservoir or bath. Entropy generation is the basis of the dissipation which takes place internal to the system and is, thus, the driving force of the chemical reaction(s). The SEA-QT non-equilibrium model is able to provide detailed information during the reaction process, giving a picture of the changes occurring in key thermodynamic properties (e.g., the instantaneous species concentrations, entropy and entropy generation, reaction coordinate, chemical affinities, reaction rate, etc.). As an illustration, the SEA-QT framework is applied to an atomistic-level chemically reactive system governed by the reaction mechanism F + H₂ ↔ FH + H.

  11. Entropy in bimolecular simulations: A comprehensive review of atomic fluctuations-based methods.

    Science.gov (United States)

    Kassem, Summer; Ahmed, Marawan; El-Sheikh, Salah; Barakat, Khaled H

    2015-11-01

    Entropy of binding constitutes a major, and in many cases detrimental, component of the binding affinity in biomolecular interactions. While the enthalpic part of the binding free energy is easier to calculate, estimating the entropy of binding is far more complicated. A precise evaluation of entropy requires a comprehensive exploration of the complete phase space of the interacting entities. As this task is extremely hard to accomplish in the context of conventional molecular simulations, calculating entropy has involved many approximations. Most of these gold-standard methods have focused on developing a reliable estimate of the conformational part of the entropy. Here, we review these methods, with particular emphasis on the different techniques that extract entropy from atomic fluctuations. The theoretical formalism behind each method is explained, highlighting its strengths as well as its limitations, followed by a description of a number of case studies for each method. We hope that this brief, yet comprehensive, review provides a useful tool to understand these methods and realize the practical issues that may arise in such calculations. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. A Bayesian maximum entropy-based methodology for optimal spatiotemporal design of groundwater monitoring networks.

    Science.gov (United States)

    Hosseini, Marjan; Kerachian, Reza

    2017-09-01

    This paper presents a new methodology for analyzing the spatiotemporal variability of water table levels and redesigning a groundwater level monitoring network (GLMN) using the Bayesian Maximum Entropy (BME) technique and a multi-criteria decision-making approach based on ordered weighted averaging (OWA). The spatial sampling is determined using a hexagonal gridding pattern, and a new method is proposed to assign a removal priority number to each pre-existing station. To design the temporal sampling, a new approach is applied to account for the uncertainty caused by lack of information. In this approach, different time lag values are tested against another source of information, namely the simulation results of a numerical groundwater flow model. Furthermore, to incorporate the existing uncertainties in the available monitoring data, the flexibility of the BME interpolation technique is exploited by applying soft data, improving the accuracy of the calculations. To examine the methodology, it is applied to the Dehgolan plain in northwestern Iran. Based on the results, a configuration of 33 monitoring stations on a regular hexagonal grid of side length 3600 m is proposed, in which the time lag between samples is equal to 5 weeks. Since the variance estimation errors of the BME method are almost identical for the redesigned and existing networks, the redesigned monitoring network is more cost-effective and efficient than the existing monitoring network with 52 stations and monthly sampling frequency.

  13. An entropy-variables-based formulation of residual distribution schemes for non-equilibrium flows

    Science.gov (United States)

    Garicano-Mena, Jesús; Lani, Andrea; Degrez, Gérard

    2018-06-01

    In this paper we present an extension of Residual Distribution techniques for the simulation of compressible flows in non-equilibrium conditions. The latter are modeled by means of a state-of-the-art multi-species and two-temperature model. An entropy-based variable transformation that symmetrizes the projected advective Jacobian for such a thermophysical model is introduced. Moreover, the transformed advection Jacobian matrix presents a block diagonal structure, with mass-species and electronic-vibrational energy being completely decoupled from the momentum and total energy sub-system. The advantageous structure of the transformed advective Jacobian can be exploited by contour-integration-based Residual Distribution techniques: established schemes that operate on dense matrices can be replaced by the same scheme operating on the momentum-energy subsystem matrix together with repeated application of a scalar scheme to the mass-species and electronic-vibrational energy terms. Finally, the performance gain of the symmetrizing-variables formulation is quantified on a selection of representative test cases, ranging from subsonic to hypersonic, in inviscid or viscous conditions.

  14. An information entropy model on clinical assessment of patients based on the holographic field of meridian

    Science.gov (United States)

    Wu, Jingjing; Wu, Xinming; Li, Pengfei; Li, Nan; Mao, Xiaomei; Chai, Lihe

    2017-04-01

    The meridian system is not only the basis of traditional Chinese medicine (TCM) treatment methods (e.g. acupuncture, massage), but also the core of TCM's basic theory. This paper introduces a new informational perspective to understand the reality and the holographic field of the meridian. Based on the maximum information entropy principle (MIEP), a dynamic equation for the holographic field is deduced, which reflects the evolutionary characteristics of the meridian. Using a self-organizing artificial neural network as the algorithm, the evolutionary dynamic equation of the holographic field can be solved to assess the properties of meridians and clinically diagnose the health characteristics of patients. Finally, through several cases from clinical patients (e.g. a 30-year-old male patient, an apoplectic patient, an epilepsy patient), we use this model to assess the evolutionary properties of meridians. It is shown that this model not only has significant implications in revealing the essence of the meridian in TCM, but also may play a guiding role in the clinical assessment of patients based on the holographic field of meridians.

  15. Transportation Mode Detection Based on Permutation Entropy and Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Lei Zhang

    2015-01-01

    Full Text Available With the increasing prevalence of GPS devices and mobile phones, transportation mode detection based on GPS data has been a hot topic in GPS trajectory data analysis. Transportation modes such as walking, driving, bus, and taxi denote an important characteristic of the mobile user. Longitude, latitude, speed, acceleration, and direction are usually used as features in transportation mode detection. In this paper, we first explore the possibility of using the Permutation Entropy (PE) of speed, a measure of the complexity and uncertainty of a GPS trajectory segment, as a feature for transportation mode detection. Second, we employ an Extreme Learning Machine (ELM) to distinguish GPS trajectory segments of different transportation modes. Finally, to evaluate the performance of the proposed method, we run experiments on the GeoLife dataset. Experimental results show that we can achieve more than 50% accuracy when using PE as the only feature to characterize a trajectory sequence. PE can indeed be effectively used to detect transportation mode from GPS trajectories. The proposed method has much better accuracy and faster running time than methods based on other features and an SVM classifier.
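
    Permutation entropy itself is straightforward to compute. A minimal sketch, with the order-3, delay-1 defaults as illustrative assumptions rather than the paper's tuned settings:

      import math
      from collections import Counter

      import numpy as np

      def permutation_entropy(x, order=3, delay=1, normalize=True):
          """Permutation entropy: Shannon entropy of the ordinal patterns of
          length `order` found in the series, optionally normalized to [0, 1]."""
          x = np.asarray(x, dtype=float)
          n = len(x) - (order - 1) * delay
          patterns = Counter(
              tuple(np.argsort(x[i:i + order * delay:delay])) for i in range(n)
          )
          p = np.array(list(patterns.values()), dtype=float) / n
          h = -(p * np.log2(p)).sum()
          return h / math.log2(math.factorial(order)) if normalize else h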

  16. Entropy-based heavy tailed distribution transformation and visual analytics for monitoring massive network traffic

    Science.gov (United States)

    Han, Keesook J.; Hodge, Matthew; Ross, Virginia W.

    2011-06-01

    For monitoring network traffic, there is an enormous cost in collecting, storing, and analyzing network traffic datasets. Data-mining-based network traffic analysis has attracted growing interest in the cyber security community but is computationally expensive for finding correlations between attributes in massive network traffic datasets. To lower the cost and reduce computational complexity, it is desirable to perform feasible statistical processing on effectively reduced datasets instead of on the original full datasets. Because of the dynamic behavior of network traffic, traffic traces exhibit mixtures of heavy tailed statistical distributions, or overdispersion. Heavy tailed network traffic characterization and visualization are important and essential tasks for measuring network performance for quality of service. However, heavy tailed distributions are limited in their ability to characterize real-time network traffic due to the difficulty of parameter estimation. The Entropy-Based Heavy Tailed Distribution Transformation (EHTDT) was developed to convert the heavy tailed distribution into a transformed distribution admitting a linear approximation. The EHTDT linearization has the advantage of being amenable to characterizing and aggregating overdispersion of network traffic in real time. Results of applying the EHTDT for innovative visual analytics to real network traffic data are presented.
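
    The EHTDT algorithm itself is not specified in this abstract. As a generic illustration of the underlying idea of linearizing a heavy tail, the sketch below fits a straight line to the empirical CCDF of a positive-valued sample in log-log coordinates, where a Pareto-like tail appears linear; this is a standard textbook device, not the EHTDT.

      import numpy as np

      def tail_loglog_fit(samples, tail_fraction=0.1):
          """Fit a line to log(x) vs log(P[X > x]) over the upper tail and
          return the estimated tail exponent alpha (assumes samples > 0)."""
          x = np.sort(np.asarray(samples, dtype=float))
          ccdf = 1.0 - np.arange(1, len(x) + 1) / len(x)
          k = int(len(x) * (1 - tail_fraction))   # keep only the upper tail
          xs, ps = x[k:-1], ccdf[k:-1]            # drop last point (CCDF = 0)
          slope, intercept = np.polyfit(np.log(xs), np.log(ps), 1)
          return -slope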

  17. Comparison of the role that entropy has played in processes of non-enzymatic and enzymatic catalysis

    International Nuclear Information System (INIS)

    Dixon Pineda, Manuel Tomas

    2012-01-01

    The role that entropy plays in processes of non-enzymatic and enzymatic catalysis is compared. The processes studied are: the kinetics of the acid hydrolysis of 3-pentyl acetate and cyclopentyl acetate catalyzed by hydrochloric acid, and the enzymatic hydrolysis of ethyl acetate and γ-butyrolactone catalyzed by pig liver esterase. The Eyring activation parameters were determined for each process, and the contribution of the entropy of activation to catalysis in this type of model reaction is interpreted. (author)

  18. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  19. Entropy and wigner functions

    Science.gov (United States)

    Manfredi; Feix

    2000-10-01

    The properties of an alternative definition of quantum entropy, based on Wigner functions, are discussed. Such a definition emerges naturally from the Wigner representation of quantum mechanics, and can easily quantify the amount of entanglement of a quantum state. It is shown that smoothing of the Wigner function induces an increase in entropy. This fact is used to derive some simple rules to construct positive-definite probability distributions which are also admissible Wigner functions.

  20. Entropy and Wigner Functions

    OpenAIRE

    Manfredi, G.; Feix, M. R.

    2002-01-01

    The properties of an alternative definition of quantum entropy, based on Wigner functions, are discussed. Such a definition emerges naturally from the Wigner representation of quantum mechanics, and can easily quantify the amount of entanglement of a quantum state. It is shown that smoothing of the Wigner function induces an increase in entropy. This fact is used to derive some simple rules to construct positive-definite probability distributions which are also admissible Wigner functions.

  1. Topological nearly entropy

    Science.gov (United States)

    Gulamsarwar, Syazwani; Salleh, Zabidin

    2017-08-01

    The purpose of this paper is to generalize the notion of Adler's topological entropy along with several of its fundamental properties. A function f : X → Y is said to be an R-map if f⁻¹(V) is regular open in X for every regular open set V in Y. On this basis, we introduce a notion of topological nearly entropy for topological R-dynamical systems, which is based on nearly compactness relative to the space and uses R-maps.

  2. Calculation of Five Thermodynamic Molecular Descriptors by Means of a General Computer Algorithm Based on the Group-Additivity Method: Standard Enthalpies of Vaporization, Sublimation and Solvation, and Entropy of Fusion of Ordinary Organic Molecules and Total Phase-Change Entropy of Liquid Crystals.

    Science.gov (United States)

    Naef, Rudolf; Acree, William E

    2017-06-25

    The calculation of the standard enthalpies of vaporization, sublimation and solvation of organic molecules is presented using a common computer algorithm on the basis of a group-additivity method. The same algorithm is also shown to enable the calculation of their entropy of fusion as well as the total phase-change entropy of liquid crystals. The present method is based on the complete breakdown of the molecules into their constituting atoms and their immediate neighbourhood; the respective calculations of the contribution of the atomic groups by means of the Gauss-Seidel fitting method are based on experimental data collected from the literature. The feasibility of the calculations for each of the mentioned descriptors was verified by means of a 10-fold cross-validation procedure, proving the good to high quality of the predicted values for the three mentioned enthalpies and for the entropy of fusion, whereas the predictive quality for the total phase-change entropy of liquid crystals was poor. The goodness of fit (Q²) and the standard deviation (σ) of the cross-validation calculations for the five descriptors were as follows: 0.9641 and 4.56 kJ/mol (N = 3386 test molecules) for the enthalpy of vaporization, 0.8657 and 11.39 kJ/mol (N = 1791) for the enthalpy of sublimation, 0.9546 and 4.34 kJ/mol (N = 373) for the enthalpy of solvation, 0.8727 and 17.93 J/mol/K (N = 2637) for the entropy of fusion and 0.5804 and 32.79 J/mol/K (N = 2643) for the total phase-change entropy of liquid crystals. The large discrepancy between the results of the two closely related entropies is discussed in detail. Molecules for which both the standard enthalpies of vaporization and sublimation were calculable enabled the estimation of their standard enthalpy of fusion by simple subtraction of the former from the latter enthalpy. For 990 of them the experimental enthalpy-of-fusion values are also known, allowing their comparison with predictions, yielding a correlation coefficient R

  3. Calculation of Five Thermodynamic Molecular Descriptors by Means of a General Computer Algorithm Based on the Group-Additivity Method: Standard Enthalpies of Vaporization, Sublimation and Solvation, and Entropy of Fusion of Ordinary Organic Molecules and Total Phase-Change Entropy of Liquid Crystals

    Directory of Open Access Journals (Sweden)

    Rudolf Naef

    2017-06-01

    Full Text Available The calculation of the standard enthalpies of vaporization, sublimation and solvation of organic molecules is presented using a common computer algorithm on the basis of a group-additivity method. The same algorithm is also shown to enable the calculation of their entropy of fusion as well as the total phase-change entropy of liquid crystals. The present method is based on the complete breakdown of the molecules into their constituting atoms and their immediate neighbourhood; the respective calculations of the contribution of the atomic groups by means of the Gauss-Seidel fitting method are based on experimental data collected from the literature. The feasibility of the calculations for each of the mentioned descriptors was verified by means of a 10-fold cross-validation procedure, proving the good to high quality of the predicted values for the three mentioned enthalpies and for the entropy of fusion, whereas the predictive quality for the total phase-change entropy of liquid crystals was poor. The goodness of fit (Q²) and the standard deviation (σ) of the cross-validation calculations for the five descriptors were as follows: 0.9641 and 4.56 kJ/mol (N = 3386 test molecules) for the enthalpy of vaporization, 0.8657 and 11.39 kJ/mol (N = 1791) for the enthalpy of sublimation, 0.9546 and 4.34 kJ/mol (N = 373) for the enthalpy of solvation, 0.8727 and 17.93 J/mol/K (N = 2637) for the entropy of fusion and 0.5804 and 32.79 J/mol/K (N = 2643) for the total phase-change entropy of liquid crystals. The large discrepancy between the results of the two closely related entropies is discussed in detail. Molecules for which both the standard enthalpies of vaporization and sublimation were calculable enabled the estimation of their standard enthalpy of fusion by simple subtraction of the former from the latter enthalpy. For 990 of them the experimental enthalpy-of-fusion values are also known, allowing their comparison with predictions, yielding a correlation
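
    Both copies of this record describe the same group-additivity scheme: a descriptor is estimated as the sum of fitted contributions of the atom groups composing the molecule. A minimal sketch with hypothetical contribution values (the real model uses far more finely resolved atom-neighbourhood groups fitted by the Gauss-Seidel method):

      # Hypothetical group contributions (kJ/mol) -- illustrative values only,
      # not the fitted parameters of Naef and Acree.
      GROUP_DHVAP = {"CH3": 4.7, "CH2": 4.9, "OH": 24.0}

      def estimate_dhvap(group_counts):
          """Group-additivity estimate: a weighted sum of group contributions,
          with weights equal to each group's count in the molecule."""
          return sum(n * GROUP_DHVAP[g] for g, n in group_counts.items())

      # 1-propanol ~ CH3-CH2-CH2-OH:
      print(estimate_dhvap({"CH3": 1, "CH2": 2, "OH": 1}))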

  4. Differences between state entropy and bispectral index during analysis of identical electroencephalogram signals: a comparison with two randomised anaesthetic techniques.

    Science.gov (United States)

    Pilge, Stefanie; Kreuzer, Matthias; Karatchiviev, Veliko; Kochs, Eberhard F; Malcharek, Michael; Schneider, Gerhard

    2015-05-01

    It is claimed that bispectral index (BIS) and state entropy reflect an identical clinical spectrum, the hypnotic component of anaesthesia. So far, it is not known to what extent different devices display similar index values while processing identical electroencephalogram (EEG) signals. To compare BIS and state entropy during analysis of identical EEG data. Inspection of raw EEG input to detect potential causes of erroneous index calculation. Offline re-analysis of EEG data from a randomised, single-centre controlled trial using the Entropy Module and an Aspect A-2000 monitor. Klinikum rechts der Isar, Technische Universität München, Munich. Forty adult patients undergoing elective surgery under general anaesthesia. Blocked randomisation of 20 patients per anaesthetic group (sevoflurane/remifentanil or propofol/remifentanil). Isolated forearm technique for differentiation between consciousness and unconsciousness. Prediction probability (PK) of state entropy to discriminate consciousness from unconsciousness. Correlation and agreement between state entropy and BIS from deep to light hypnosis. Analysis of raw EEG compared with index values that are in conflict with clinical examination, with frequency measures (frequency bands/Spectral Edge Frequency 95) and visual inspection for physiological EEG patterns (e.g. beta or delta arousal), pathophysiological features such as high-frequency signals (electromyogram/high-frequency EEG or eye fluttering/saccades), different types of electro-oculogram or epileptiform EEG, and technical artefacts. PK of state entropy was 0.80 and that of BIS 0.84; the correlation coefficient of state entropy with BIS was 0.78. Nine percent of BIS and 14% of state entropy values disagreed with clinical examination. The highest incidence of disagreement occurred after state transitions, in particular for state entropy after loss of consciousness during sevoflurane anaesthesia. EEG sequences which led to false 'conscious' index values often showed high

  5. Research and Measurement of Software Complexity Based on Wuli, Shili, Renli (WSR) and Information Entropy

    Directory of Open Access Journals (Sweden)

    Rong Jiang

    2015-04-01

    Full Text Available Complexity is an important factor throughout the software life cycle. It is increasingly difficult to guarantee software quality, cost and development progress as complexity increases. Excessive complexity is one of the main reasons for the failure of software projects, so effective recognition, measurement and control of complexity become key issues in project management. This paper first analyzes the current state of research on software complexity systematically and points out existing problems. It then proposes a WSR framework of software complexity, which divides the complexity of software into the three levels of Wuli (WL), Shili (SL) and Renli (RL), so that staff in different roles may better understand complexity. People are the main source of complexity, but current research focuses on WL complexity and research on RL complexity is extremely scarce, so this paper emphasizes the RL complexity of software projects. It not only analyzes the factors composing RL complexity, but also provides a definition of RL complexity. Moreover, it puts forward a quantitative measurement method, based on information entropy, for the complexity of the personnel organization hierarchy and the complexity of personnel communication information, and it analyzes and validates the scientific soundness and rationality of this measurement method through a large number of cases.

  6. [Assessment of ecosystem in giant panda distribution area based on entropy method and coefficient of variation].

    Science.gov (United States)

    Yan, Zhi Gang; Li, Jun Qing

    2017-12-01

    The areas of giant panda habitat and bamboo forest and the size of the wild giant panda population have greatly increased in recent years, while habitat fragmentation and local population isolation have also intensified. Accurate evaluation of the ecosystem status of the giant panda distribution area is important for giant panda conservation. Based on hierarchical system theory, the ecosystems of the distribution area and its six mountain ranges were subdivided into habitat and population subsystems. Using the panda distribution area as the study area and the three national surveys as time nodes, the evolution of the ecosystems was studied using the entropy method, the coefficient of variation, and correlation analysis. We found that, despite continuous improvement, differences existed in the evolution and present situation of the ecosystems, and the six mountain ranges could be divided into three groups. Ecosystems classified into the same group showed many commonalities, while the differences between groups were considerable. The problems of habitat fragmentation and local population isolation became more serious, resulting in ecosystem degradation. Individualized ecological protection measures should be formulated and implemented in accordance with the conditions in each mountain system to achieve the best results.

  7. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    Directory of Open Access Journals (Sweden)

    Jinkyu Kim

    Full Text Available The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.

  8. Robot Evaluation and Selection with Entropy-Based Combination Weighting and Cloud TODIM Approach

    Directory of Open Access Journals (Sweden)

    Jing-Jing Wang

    2018-05-01

    Full Text Available Nowadays robots have been commonly adopted in various manufacturing industries to improve product quality and productivity. The selection of the best robot to suit a specific production setting is a difficult decision making task for manufacturers because of the increase in complexity and number of robot systems. In this paper, we explore two key issues of robot evaluation and selection: the representation of decision makers' diversified assessments and the determination of the ranking of available robots. Specifically, a decision support model which utilizes the cloud model and TODIM (an acronym in Portuguese for interactive and multiple criteria decision making) is developed for the purpose of handling robot selection problems with hesitant linguistic information. Besides, we use an entropy-based combination weighting technique to estimate the weights of evaluation criteria. Finally, we illustrate the proposed cloud TODIM approach with a robot selection example for an automobile manufacturer, and further validate its effectiveness and benefits via a comparative analysis. The results show that the proposed robot selection model has some unique advantages, being more realistic and flexible for robot selection under a complex and uncertain environment.

  9. Instability risk assessment of construction waste pile slope based on fuzzy entropy

    Science.gov (United States)

    Ma, Yong; Xing, Huige; Yang, Mao; Nie, Tingting

    2018-05-01

    Considering the nature and characteristics of construction waste piles, this paper analyzes the factors affecting the stability of construction waste pile slopes and establishes a system of assessment indexes for slope failure risks. Based on the basic principles and methods of fuzzy mathematics, the factor set and the remark set were established. The membership grades of continuous factor indexes are determined using the "ridge row distribution" function, while those of discrete factor indexes are determined by the Delphi method. For the factor weights, the subjective weights were determined by the Analytic Hierarchy Process (AHP) and the objective weights by the entropy weight method, and a distance function was introduced to determine the combination coefficient. This paper establishes a fuzzy comprehensive assessment model of the slope failure risks of construction waste piles, assessing pile slopes in the two dimensions of hazard and vulnerability. The root mean square of the hazard and vulnerability assessment results is the final assessment result. The paper then uses a construction waste pile slope as an example, assesses the risks of the four stages of a landfill, verifies the assessment model, and analyzes the slope's failure risks and preventive measures against a slide.

  10. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    Science.gov (United States)

    Kim, Jinkyu; Kim, Gunn; An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh

    2013-01-01

    The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.
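
    A minimal sketch of a histogram-based transfer entropy estimator with one-step histories, the basic building block described in both copies of this record; the bin count and the base-2 logarithm are illustrative choices, and `source` and `target` are assumed to be equal-length 1-D arrays.

      import numpy as np

      def transfer_entropy(source, target, bins=8):
          """TE(source -> target) = sum p(t1,t0,s0) log[ p(t1|t0,s0) / p(t1|t0) ],
          estimated from an equal-width 3-D histogram."""
          t1, t0, s0 = target[1:], target[:-1], source[:-1]
          joint, _ = np.histogramdd(np.column_stack([t1, t0, s0]), bins=bins)
          p_ts = joint / joint.sum()            # p(t1, t0, s0)
          p_tt = p_ts.sum(axis=2)               # p(t1, t0)
          p_0s = p_ts.sum(axis=0)               # p(t0, s0)
          p_0 = p_tt.sum(axis=0)                # p(t0)
          te = 0.0
          for i, j, k in zip(*np.nonzero(p_ts)):
              num = p_ts[i, j, k] * p_0[j]
              den = p_tt[i, j] * p_0s[j, k]
              if den > 0:
                  te += p_ts[i, j, k] * np.log2(num / den)
          return te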

  11. An Entropy-Based Propagation Speed Estimation Method for Near-Field Subsurface Radar Imaging

    Directory of Open Access Journals (Sweden)

    Pistorius Stephen

    2010-01-01

    Full Text Available During the last forty years, Subsurface Radar (SR) has been used in an increasing number of noninvasive/nondestructive imaging applications, ranging from landmine detection to breast imaging. To properly assess the dimensions and locations of the targets within the scan area, SR data sets have to be reconstructed. This process usually requires the knowledge of the propagation speed in the medium, which is usually obtained by performing an offline measurement from a representative sample of the materials that form the scan region. Nevertheless, in some novel near-field SR scenarios, such as Microwave Wood Inspection (MWI) and Breast Microwave Radar (BMR), the extraction of a representative sample is not an option due to the noninvasive requirements of the application. A novel technique to determine the propagation speed of the medium based on the use of an information theory metric is proposed in this paper. The proposed method uses the Shannon entropy of the reconstructed images as the focal quality metric to generate an estimate of the propagation speed in a given scan region. The performance of the proposed algorithm was assessed using data sets collected from experimental setups that mimic the dielectric contrast found in BMI and MWI scenarios. The proposed method yielded accurate results and exhibited an execution time in the order of seconds.

  12. An Entropy-Based Propagation Speed Estimation Method for Near-Field Subsurface Radar Imaging

    Science.gov (United States)

    Flores-Tapia, Daniel; Pistorius, Stephen

    2010-12-01

    During the last forty years, Subsurface Radar (SR) has been used in an increasing number of noninvasive/nondestructive imaging applications, ranging from landmine detection to breast imaging. To properly assess the dimensions and locations of the targets within the scan area, SR data sets have to be reconstructed. This process usually requires the knowledge of the propagation speed in the medium, which is usually obtained by performing an offline measurement from a representative sample of the materials that form the scan region. Nevertheless, in some novel near-field SR scenarios, such as Microwave Wood Inspection (MWI) and Breast Microwave Radar (BMR), the extraction of a representative sample is not an option due to the noninvasive requirements of the application. A novel technique to determine the propagation speed of the medium based on the use of an information theory metric is proposed in this paper. The proposed method uses the Shannon entropy of the reconstructed images as the focal quality metric to generate an estimate of the propagation speed in a given scan region. The performance of the proposed algorithm was assessed using data sets collected from experimental setups that mimic the dielectric contrast found in BMI and MWI scenarios. The proposed method yielded accurate results and exhibited an execution time in the order of seconds.
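
    A minimal sketch of the entropy-as-focal-quality idea described in both copies of this record: reconstruct the scan for each candidate speed and keep the speed whose image has the lowest Shannon entropy (assuming, as is usual for autofocus metrics, that a well-focused image is sparser and hence lower-entropy). Here `reconstruct(data, v)` stands in for an SR beamformer and is an assumption, not a routine provided by the paper.

      import numpy as np

      def image_entropy(img, bins=64):
          """Shannon entropy (bits) of an image's magnitude histogram."""
          hist, _ = np.histogram(np.abs(img), bins=bins)
          p = hist / hist.sum()
          p = p[p > 0]
          return float(-(p * np.log2(p)).sum())

      def estimate_speed(data, reconstruct, candidates):
          """Grid search: return the candidate propagation speed whose
          reconstructed image has the lowest entropy (sharpest focus)."""
          scores = [(image_entropy(reconstruct(data, v)), v) for v in candidates]
          return min(scores)[1]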

  13. Multiple Sclerosis Identification Based on Fractional Fourier Entropy and a Modified Jaya Algorithm

    Directory of Open Access Journals (Sweden)

    Shui-Hua Wang

    2018-04-01

    Full Text Available Aim: Currently, identification of multiple sclerosis (MS) by human experts may come across the problem of "normal-appearing white matter", which causes a low sensitivity. Methods: In this study, we presented a computer-vision-based approach to identify MS in an automatic way. The proposed method first extracted the fractional Fourier entropy map from a specified brain image. Afterwards, it sent the features to a multilayer perceptron trained by a proposed improved parameter-free Jaya algorithm. We used cost-sensitive learning to handle the imbalanced data problem. Results: The 10 × 10-fold cross validation showed our method yielded a sensitivity of 97.40 ± 0.60%, a specificity of 97.39 ± 0.65%, and an accuracy of 97.39 ± 0.59%. Conclusions: We validated by experiments that the proposed improved Jaya algorithm performs better than the plain Jaya algorithm and other recent bioinspired algorithms in terms of classification performance and training speed. In addition, our method is superior to four state-of-the-art MS identification approaches.

  14. Entropy-based gene ranking without selection bias for the predictive classification of microarray data

    Directory of Open Access Journals (Sweden)

    Serafini Maria

    2003-11-01

    Full Text Available Abstract Background We describe the E-RFE method for gene ranking, which is useful for the identification of markers in the predictive classification of array data. The method supports a practical modeling scheme designed to avoid the construction of classification rules based on the selection of too small gene subsets (an effect known as selection bias, in which the estimated predictive errors are too optimistic because testing is performed on samples already considered in the feature selection process). Results With E-RFE, we speed up recursive feature elimination (RFE) with SVM classifiers by eliminating chunks of uninteresting genes using an entropy measure of the distribution of SVM weights. An optimal subset of genes is selected according to a two-strata model evaluation procedure: modeling is replicated by an external stratified-partition resampling scheme and, within each run, an internal K-fold cross-validation is used for E-RFE ranking. The optimal number of genes can also be estimated from the saturation of Zipf's law profiles. Conclusions Without a decrease in classification accuracy, E-RFE allows a speed-up factor of 100 with respect to standard RFE, while improving on alternative parametric RFE reduction strategies. Thus, a process for gene selection and error estimation is made practical, ensuring control of the selection bias and providing additional diagnostic indicators of gene importance.

  15. A Case Study on a Combination NDVI Forecasting Model Based on the Entropy Weight Method

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Shengzhi; Ming, Bo; Huang, Qiang; Leng, Guoyong; Hou, Beibei

    2017-05-05

    It is critically meaningful to accurately predict the NDVI (Normalized Difference Vegetation Index), which helps guide regional ecological remediation and environmental management. In this study, a combination forecasting model (CFM) was proposed to improve the performance of NDVI predictions in the Yellow River Basin (YRB) based on three individual forecasting models, i.e., the Multiple Linear Regression (MLR), Artificial Neural Network (ANN), and Support Vector Machine (SVM) models. The entropy weight method was employed to determine the weight coefficient for each individual model depending on its predictive performance. Results showed that: (1) ANN exhibits the highest fitting capability among the four forecasting models in the calibration period, whilst its generalization ability becomes weak in the validation period; MLR has a poor performance in both calibration and validation periods; the predicted results of CFM in the calibration period have the highest stability; (2) CFM generally outperforms all individual models in the validation period, and can improve the reliability and stability of predicted results by combining the strengths while reducing the weaknesses of individual models; (3) the performances of all forecasting models are better in dense vegetation areas than in sparse vegetation areas.
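
    A minimal sketch of the entropy weight method used here to set the combination coefficients; treating columns as models and rows as evaluation samples (e.g., per-period forecast accuracies) is an illustrative convention, not necessarily the paper's exact setup.

      import numpy as np

      def entropy_weights(performance):
          """Entropy weight method: models whose scores vary more across the
          samples (lower entropy) receive larger combination weights."""
          x = np.asarray(performance, dtype=float)
          p = x / x.sum(axis=0, keepdims=True)        # normalize each column
          k = 1.0 / np.log(x.shape[0])
          plogp = np.where(p > 0, p * np.log(np.where(p > 0, p, 1.0)), 0.0)
          e = -k * plogp.sum(axis=0)                  # entropy of each model
          d = 1.0 - e                                 # degree of diversification
          return d / d.sum()                          # combination weights

      # Combined forecast = weighted sum of the individual model forecasts:
      # y_cfm = forecasts @ entropy_weights(accuracy_matrix)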

  16. Analysis of QCD sum rule based on the maximum entropy method

    International Nuclear Information System (INIS)

    Gubler, Philipp

    2012-01-01

    The QCD sum rule was developed about thirty years ago and has been used up to the present to calculate various physical quantities, such as the properties of hadrons. Conventional analyses, however, have had to assume a 'pole + continuum' form for the spectral function, and the application of the method therefore runs into difficulties when this assumption is not satisfied. In order to avoid this difficulty, an analysis making use of the maximum entropy method (MEM) has been developed by the present author. It is reported here how far this new method can be successfully applied. In the first section, the general features of the QCD sum rule are introduced. In section 2, it is discussed why analysis of the QCD sum rule based on the MEM is so effective. In section 3, the MEM analysis process is described: in subsection 3.1 the likelihood function and prior probability are considered, and in subsection 3.2 numerical analyses are discussed. In section 4, some applications are described, starting with ρ mesons, then charmonium at finite temperature, and finally recent developments. Some figures of the spectral functions are shown. In section 5, a summary of the present analysis method and a future outlook are given. (S. Funahashi)

  17. Entropy based quantification of Ki-67 positive cell images and its evaluation by a reader study

    Science.gov (United States)

    Niazi, M. Khalid Khan; Pennell, Michael; Elkins, Camille; Hemminger, Jessica; Jin, Ming; Kirby, Sean; Kurt, Habibe; Miller, Barrie; Plocharczyk, Elizabeth; Roth, Rachel; Ziegler, Rebecca; Shana'ah, Arwa; Racke, Fred; Lozanski, Gerard; Gurcan, Metin N.

    2013-03-01

    Presence of Ki-67, a nuclear protein, is typically used to measure cell proliferation. The quantification of the Ki-67 proliferation index is performed visually by the pathologist; however, this is subject to inter- and intra-reader variability. Automated techniques utilizing digital image analysis by computers have emerged. The large variations in specimen preparation, staining, and imaging, as well as true biological heterogeneity of tumor tissue, often result in variable intensities in Ki-67 stained images. These variations affect the performance of currently developed methods. To optimize the segmentation of Ki-67 stained cells, one should define a data-dependent transformation that accounts for these color variations instead of a fixed linear transformation to separate different hues. To address these issues in images of tissue stained with Ki-67, we propose a methodology that exploits the intrinsic properties of the CIE L∗a∗b∗ color space to translate this complex problem into an automatic entropy-based thresholding problem. The developed method was evaluated through two reader studies with pathology residents and expert hematopathologists. Agreement between the proposed method and the expert pathologists was good (CCC = 0.80).

  18. Information Entropy-Based Metrics for Measuring Emergences in Artificial Societies

    Directory of Open Access Journals (Sweden)

    Mingsheng Tang

    2014-08-01

    Full Text Available Emergence is a common phenomenon, and it is also a general and important concept in complex dynamic systems like artificial societies. Usually, artificial societies are used to assist in resolving complex social issues (e.g., emergency management, intelligent transportation systems) with the aid of computer science. The level of an emergence may have an effect on decision making, and the occurrence and degree of an emergence are generally perceived by human observers. However, due to the ambiguity and inaccuracy of human observers, proposing a quantitative method to measure emergences in artificial societies is a meaningful and challenging task. This article mainly concentrates upon three kinds of emergences in artificial societies: emergence of attribution, emergence of behavior, and emergence of structure. Based on information entropy, three metrics are proposed to measure emergences in a quantitative way. Meanwhile, the correctness of these metrics has been verified through three case studies (the spread of an infectious influenza, a dynamic microblog network, and a flock of birds) with several experimental simulations on the NetLogo platform. These experimental results confirm that the metrics increase with the rising degree of emergences. In addition, this article also discusses the limitations and extended applications of these metrics.

  19. Risk assessment of security systems based on entropy theory and the Neyman–Pearson criterion

    International Nuclear Information System (INIS)

    Lv, Haitao; Yin, Chao; Cui, Zongmin; Zhan, Qin; Zhou, Hongbo

    2015-01-01

    For a security system, risk assessment is an important method for judging whether its protection is effective. In this paper, a security system is regarded abstractly as a network, called a security network. A security network is made up of security nodes, which are abstract functional units with the ability of detecting, delaying, and responding. By the use of risk entropy and the Neyman–Pearson criterion, we construct a model to compute the protection probability of any position in the area where a security network is deployed. We provide a solution to find the most vulnerable path of a security network, and the protection probability on that path is taken as the risk measure. Finally, we study the effect of some parameters on the risk and the breach protection probability of a security network. Ultimately, we gain insight into the risk assessment of a security system. - Highlights: • A security system is regarded abstractly as a network made up of security nodes. • We construct a model to compute the protection probability provided by a security network. • We provide a better solution to find the most vulnerable path of a security network. • We build a risk assessment model for a security network based on the most vulnerable path

  20. Entropy maximization

    Indian Academy of Sciences (India)

    Abstract. It is shown that (i) every probability density is the unique maximizer of relative entropy in an appropriate class and (ii) in the class of all pdf f that satisfy ∫ f h_i dμ = λ_i for i = 1, 2, ..., k, the maximizer of entropy is an f_0 that is proportional to exp(∑_i c_i h_i) for some choice of c_i. An extension of this to a continuum of ...
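
    Spelled out, the constrained problem in (ii) and its exponential-family solution take the following form (a standard Lagrange-multiplier derivation in our notation, consistent with the abstract):

```latex
% Maximize entropy subject to normalization and k moment constraints:
\max_{f \ge 0}\; -\int f \ln f \, d\mu
\quad \text{s.t.} \quad
\int f \, d\mu = 1, \qquad
\int f h_i \, d\mu = \lambda_i, \quad i = 1, \dots, k.
% Stationarity of the Lagrangian yields the exponential-family form
f_0 = \frac{\exp\!\left(\textstyle\sum_{i=1}^{k} c_i h_i\right)}
           {\int \exp\!\left(\textstyle\sum_{i=1}^{k} c_i h_i\right) d\mu},
% with the constants c_i chosen so that the k moment constraints hold.
```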

  2. Entropy? Honest!

    Directory of Open Access Journals (Sweden)

    Tommaso Toffoli

    2016-06-01

    Here we deconstruct, and then in a reasoned way reconstruct, the concept of “entropy of a system”, paying particular attention to where the randomness may be coming from. We start with the core concept of entropy as a count associated with a description; this count (traditionally expressed in logarithmic form for a number of good reasons) is in essence the number of possibilities—specific instances or “scenarios”—that match that description. Very natural (and virtually inescapable) generalizations of the idea of description are the probability distribution and its quantum mechanical counterpart, the density operator. We track the process of dynamically updating entropy as a system evolves. Three factors may cause entropy to change: (1) the system’s internal dynamics; (2) unsolicited external influences on it; and (3) the approximations one has to make when one tries to predict the system’s future state. The latter task is usually hampered by hard-to-quantify aspects of the original description, limited data storage and processing resources, and possibly algorithmic inadequacy. Factors 2 and 3 introduce randomness—often huge amounts of it—into one’s predictions and accordingly degrade them. When forecasting, as long as the entropy bookkeeping is conducted in an honest fashion, this degradation will always lead to an entropy increase. To clarify the above point we introduce the notion of honest entropy, which coalesces much of what is of course already done, often tacitly, in responsible entropy-bookkeeping practice. This notion—we believe—will help to fill an expressivity gap in scientific discourse. With its help, we shall prove that any dynamical system—not just our physical universe—strictly obeys Clausius’s original formulation of the second law of thermodynamics if and only if it is invertible. Thus this law is a tautological property of invertible systems!

  3. From Ecology to Finance (and Back?): A Review on Entropy-Based Null Models for the Analysis of Bipartite Networks

    Science.gov (United States)

    Straka, Mika J.; Caldarelli, Guido; Squartini, Tiziano; Saracco, Fabio

    2018-04-01

    Bipartite networks provide an insightful representation of many systems, ranging from mutualistic networks of species interactions to investment networks in finance. The analyses of their topological structures have revealed the ubiquitous presence of properties which seem to characterize many—apparently different—systems. Nestedness, for example, has been observed in biological plant-pollinator networks as well as in country-product export networks. Due to the interdisciplinary character of complex networks, tools developed in one field, for example ecology, can greatly enrich other areas of research, such as economics and finance, and vice versa. With this in mind, we briefly review several entropy-based bipartite null models that have been recently proposed and discuss their application to real-world systems. The focus on these models is motivated by the fact that they show three very desirable features: analytical character, general applicability, and versatility. In this respect, entropy-based methods have been proven to perform satisfactorily both in providing benchmarks for testing evidence-based null hypotheses and in reconstructing unknown network configurations from partial information. Furthermore, entropy-based models have been successfully employed to analyze ecological as well as economic systems. As an example, the application of entropy-based null models has detected early-warning signals, both in economic and financial systems, of the 2007-2008 world crisis. Moreover, they have revealed a statistically significant export specialization phenomenon in country export baskets in international trade, a result that seems to reconcile Ricardo's hypothesis in classical economics with recent findings on the (empirical) diversification of industrial production at the national level. Finally, these null models have shown that the information contained in nestedness is already accounted for by the degree sequence of the corresponding graphs.

  4. Study on Droplet Size and Velocity Distributions of a Pressure Swirl Atomizer Based on the Maximum Entropy Formalism

    Directory of Open Access Journals (Sweden)

    Kai Yan

    2015-01-01

    A predictive model for the droplet size and velocity distributions of a pressure swirl atomizer has been proposed based on the maximum entropy formalism (MEF). The constraint conditions of the MEF model include the conservation laws of mass, momentum, and energy. The effects of liquid swirling strength, Weber number, gas-to-liquid axial velocity ratio, and gas-to-liquid density ratio on the droplet size and velocity distributions of a pressure swirl atomizer are investigated. Results show that the model based on the maximum entropy formalism works well to predict droplet size and velocity distributions under different spray conditions. Liquid swirling strength, Weber number, gas-to-liquid axial velocity ratio, and gas-to-liquid density ratio have different effects on the droplet size and velocity distributions of a pressure swirl atomizer.
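
    The paper's full constrained model is not reproduced here; as a hedged sketch of the maximum entropy formalism it builds on, the following solves a discrete analogue, the most probable droplet-size distribution subject to a single mass-type moment constraint, by finding the Lagrange multiplier numerically (the grid and constraint value are invented for illustration):

```python
import numpy as np
from scipy.optimize import brentq

# Discrete droplet-diameter grid (nondimensional) and a made-up
# mass-conservation constraint: sum_j p_j * d_j**3 = M3.
d = np.linspace(0.1, 3.0, 200)
M3 = 1.0

def moment_gap(c):
    """Mismatch of the d^3 moment for multiplier c; the max-entropy
    solution under this single constraint is p_j ~ exp(-c * d_j**3)."""
    w = np.exp(-c * d**3)
    p = w / w.sum()
    return (p * d**3).sum() - M3

c = brentq(moment_gap, 1e-6, 50.0)    # root gives the Lagrange multiplier
p = np.exp(-c * d**3); p /= p.sum()   # most probable size distribution
```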

  5. Rolling bearing fault diagnosis based on time-delayed feedback monostable stochastic resonance and adaptive minimum entropy deconvolution

    Science.gov (United States)

    Li, Jimeng; Li, Ming; Zhang, Jinfeng

    2017-08-01

    Rolling bearings are key components in modern machinery, and harsh operating environments often make them prone to failure. However, due to the influence of the transmission path and background noise, the feature information relevant to a bearing fault contained in the vibration signals is weak, which makes it difficult to identify the fault symptoms of rolling bearings in time. Therefore, the paper proposes a novel weak-signal detection method based on a time-delayed feedback monostable stochastic resonance (TFMSR) system and adaptive minimum entropy deconvolution (MED) to realize the fault diagnosis of rolling bearings. The MED method is employed to preprocess the vibration signals, which can deconvolve the effect of the transmission path and clarify the defect-induced impulses. A modified power spectrum kurtosis (MPSK) index is constructed to realize the adaptive selection of the filter length in the MED algorithm. By introducing a time-delayed feedback term into an over-damped monostable system, the TFMSR method can effectively utilize the historical information of the input signal to enhance the periodicity of the SR output, which is beneficial to the detection of periodic signals. Furthermore, the influence of time delay and feedback intensity on the SR phenomenon is analyzed, and by selecting an appropriate time delay, feedback intensity, and re-scaling ratio with a genetic algorithm, SR can be produced to realize the resonance detection of weak signals. The combination of the adaptive MED (AMED) method and the TFMSR method is conducive to extracting feature information from strong background noise and realizing the fault diagnosis of rolling bearings. Finally, some experiments and an engineering application are performed to evaluate the effectiveness of the proposed AMED-TFMSR method in comparison with a traditional bistable SR method.

  6. Enhancement of Edge-based Image Quality Measures Using Entropy for Histogram Equalization-based Contrast Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    H. T. R. Kurmasha

    2017-12-01

    An edge-based image quality measure (IQM) technique for the assessment of histogram equalization (HE)-based contrast enhancement techniques has been proposed that outperforms the Absolute Mean Brightness Error (AMBE) and Entropy, which are the most commonly used IQMs to evaluate histogram equalization based techniques, as well as the two prominent fidelity-based IQMs, the Multi-Scale Structural Similarity (MSSIM) and Information Fidelity Criterion-based (IFC) measures. The statistical evaluation results show that the edge-based IQM, which was designed for detecting noise-artifact distortion, has a Pearson Correlation Coefficient (PCC) > 0.86, while the others have poor or fair correlation to human opinion, considering Human Visual Perception (HVP). Based on HVP, this paper proposes an enhancement to the classic edge-based IQM by taking into account brightness saturation distortion, which is the most prominent distortion in HE-based contrast enhancement techniques. It is tested and found to have significantly better correlation (PCC > 0.87), Spearman rank order correlation coefficient (SROCC > 0.92), Root Mean Squared Error (RMSE < 0.1054), and Outlier Ratio (OR = 0%).

  7. [Analysis of the Muscle Fatigue Based on Band Spectrum Entropy of Multi-channel Surface Electromyography].

    Science.gov (United States)

    Liu, Jian; Zou, Renling; Zhang, Dongheng; Xu, Xiulin; Hu, Xiufang

    2016-06-01

    Exercise-induced muscle fatigue is a phenomenon in which the maximum voluntary contraction force or power output of a muscle is temporarily reduced by muscular activity. If the fatigue is not treated properly, it can bring about severe injury to the human body. Using multi-channel recordings of lower-limb surface electromyography signals, this article analyzes muscle fatigue with a band spectrum entropy method that combines electromyographic spectral analysis and nonlinear dynamics. The experimental results indicated that, as muscle fatigue increased, the signal spectrum moved toward low frequencies, the energy concentrated, the system complexity came down, and the band spectrum entropy, which reflects that complexity, was also reduced. By monitoring the entropy, we can measure the degree of muscle fatigue and provide an indicator of fatigue for sports training and clinical rehabilitation training.
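
    The paper's exact banding scheme is not given in this record; a minimal sketch of a band-spectrum-entropy computation of the general kind described (the band layout, window length, and synthetic signals are our assumptions):

```python
import numpy as np
from scipy.signal import welch

def band_spectrum_entropy(emg, fs=1000, n_bands=16, band=(10, 500)):
    """Shannon entropy of the relative power in equal-width frequency
    bands; lower values indicate energy concentrating (fatigue)."""
    f, pxx = welch(emg, fs=fs, nperseg=1024)
    edges = np.linspace(band[0], band[1], n_bands + 1)
    powers = np.array([pxx[(f >= a) & (f < b)].sum()
                       for a, b in zip(edges[:-1], edges[1:])])
    p = powers / powers.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Synthetic check: broadband "fresh" sEMG vs. narrowband "fatigued" sEMG.
rng = np.random.default_rng(1)
t = np.arange(0, 5, 1/1000)
fresh = rng.normal(size=t.size)
fatigued = np.sin(2*np.pi*40*t) + 0.2*rng.normal(size=t.size)
print(band_spectrum_entropy(fresh), band_spectrum_entropy(fatigued))
```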

  8. Quantile-based Bayesian maximum entropy approach for spatiotemporal modeling of ambient air quality levels.

    Science.gov (United States)

    Yu, Hwa-Lung; Wang, Chih-Hsin

    2013-02-05

    Understanding the daily changes in ambient air quality concentrations is important to assessing human exposure and environmental health. However, the fine temporal scales (e.g., hourly) involved in this assessment often lead to high variability in air quality concentrations. This is because of the complex short-term physical and chemical mechanisms among the pollutants. Consequently, high heterogeneity is usually present not only in the averaged pollution levels, but also in the intraday variance levels of the daily observations of ambient concentration across space and time. This characteristic decreases the estimation performance of common techniques. This study proposes a novel quantile-based Bayesian maximum entropy (QBME) method to account for the nonstationary and nonhomogeneous characteristics of ambient air pollution dynamics. The QBME method characterizes the spatiotemporal dependence among ambient air quality levels based on their location-specific quantiles and accounts for spatiotemporal variations using a local weighted smoothing technique. The epistemic framework of the QBME method allows researchers to further consider the uncertainty of space-time observations. This study presents the spatiotemporal modeling of daily CO and PM10 concentrations across Taiwan from 1998 to 2009 using the QBME method. Results show that the QBME method can effectively improve estimation accuracy in terms of lower mean absolute errors and standard deviations over space and time, especially for pollutants with strongly nonhomogeneous variances across space. In addition, the epistemic framework allows researchers to assimilate site-specific secondary information where observations are absent because of the common preferential sampling issues of environmental data. The proposed QBME method provides a practical and powerful framework for the spatiotemporal modeling of ambient pollutants.

  9. Exploring the effects of climatic variables on monthly precipitation variation using a continuous wavelet-based multiscale entropy approach.

    Science.gov (United States)

    Roushangar, Kiyoumars; Alizadeh, Farhad; Adamowski, Jan

    2018-08-01

    Understanding precipitation on a regional basis is an important component of water resources planning and management. The present study outlines a methodology based on continuous wavelet transform (CWT) and multiscale entropy (CWME), combined with self-organizing map (SOM) and k-means clustering techniques, to measure and analyze the complexity of precipitation. Historical monthly precipitation data from 1960 to 2010 at 31 rain gauges across Iran were preprocessed by CWT. The multi-resolution CWT approach segregated the major features of the original precipitation series by unfolding the structure of the time series, which was often ambiguous. The entropy concept was then applied to the components obtained from CWT to measure the dispersion, uncertainty, disorder, and diversification of subcomponents. Based on different validity indices, k-means clustering captured homogeneous areas more accurately, and additional analysis was performed based on the outcome of this approach. The 31 rain gauges in this study were clustered into 6 groups, each one having a unique CWME pattern across different time scales. The results of clustering showed that hydrologic similarity (multiscale variation of precipitation) was not based on geographic contiguity. According to the pattern of entropy across the scales, each cluster was assigned an entropy signature that provided an estimation of the entropy pattern of precipitation data in each cluster. Based on the pattern of mean CWME for each cluster, a characteristic signature was assigned, which provided an estimation of the CWME of a cluster across scales of 1-2, 3-8, and 9-13 months relative to other stations. The validity of the homogeneous clusters demonstrated the usefulness of the proposed approach to regionalize precipitation. Further analysis based on wavelet coherence (WTC) was performed by selecting central rain gauges in each cluster and analyzing against temperature, wind, Multivariate ENSO index (MEI), and East Atlantic (EA) and

  10. Safety assessment of dangerous goods transport enterprise based on the relative entropy aggregation in group decision making model.

    Science.gov (United States)

    Wu, Jun; Li, Chengbing; Huo, Yueying

    2014-01-01

    Safety of dangerous goods transport is directly related to the operational safety of dangerous goods transport enterprises. Aiming at the high accident rate and large potential harm of dangerous goods logistics, this paper casts the group decision-making problem, based on the idea of integration and coordination, as a multi-agent, multi-objective group decision-making problem; a two-stage decision model is established and applied to the safety assessment of dangerous goods transport enterprises. First, we use a dynamic multi-value background and entropy theory to build the first-level multi-objective decision model. Second, experts are weighted according to the principle of clustering analysis and, combined with relative entropy theory, a second-stage aggregation optimization model based on relative entropy in group decision making is established and its solution discussed. Then, after investigation and analysis, we establish the dangerous goods transport enterprise safety evaluation index system. Finally, a case analysis of five dangerous goods transport enterprises in the Inner Mongolia Autonomous Region validates the feasibility and effectiveness of this model for dangerous goods transport enterprise evaluation, which provides a vital decision-making basis for recognizing safe dangerous goods transport enterprises.

  11. Detection System of HTTP DDoS Attacks in a Cloud Environment Based on Information Theoretic Entropy and Random Forest

    Directory of Open Access Journals (Sweden)

    Mohamed Idhammad

    2018-01-01

    Cloud Computing services are often delivered through the HTTP protocol. This facilitates access to services and reduces costs for both providers and end-users. However, it also increases the exposure of Cloud services to HTTP DDoS attacks. HTTP request methods are often used to exploit web servers' vulnerabilities and create multiple scenarios of HTTP DDoS attack, such as Low and Slow or Flooding attacks. Existing HTTP DDoS detection systems are challenged by the large volumes of network traffic generated by these attacks, low detection accuracy, and high false positive rates. In this paper we present a detection system for HTTP DDoS attacks in a Cloud environment based on Information Theoretic Entropy and the Random Forest ensemble learning algorithm. A time-based sliding window algorithm is used to estimate the entropy of the network header features of the incoming network traffic. When the estimated entropy exceeds its normal range, the preprocessing and the classification tasks are triggered. To assess the proposed approach, various experiments were performed on the CIDDS-001 public dataset. The proposed approach achieves satisfactory results with an accuracy of 99.54%, a FPR of 0.4%, and a running time of 18.5 s.
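
    The preprocessing and Random Forest stages are omitted here; a rough sketch of the time-based sliding-window entropy trigger alone (the window size and "normal range" are invented placeholders, in practice learned from benign traffic):

```python
import math
from collections import Counter, deque

def shannon_entropy(values):
    """Entropy (bits) of one window of a header feature, e.g. source IPs."""
    counts = Counter(values)
    n = sum(counts.values())
    return -sum(c/n * math.log2(c/n) for c in counts.values())

WINDOW = 1000                      # requests per window (arbitrary)
NORMAL_RANGE = (3.0, 7.0)          # entropy band of benign traffic (made up)
window = deque(maxlen=WINDOW)

def on_request(src_ip):
    """Feed each incoming request; flag the window for classification
    when its entropy leaves the normal range."""
    window.append(src_ip)
    if len(window) == WINDOW:
        h = shannon_entropy(window)
        if not (NORMAL_RANGE[0] <= h <= NORMAL_RANGE[1]):
            return "suspicious: run preprocessing + Random Forest"
    return "ok"
```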

  12. CoFea: A Novel Approach to Spam Review Identification Based on Entropy and Co-Training

    Directory of Open Access Journals (Sweden)

    Wen Zhang

    2016-11-01

    With the rapid development of electronic commerce, spam reviews are rapidly growing on the Internet to manipulate online customers' opinions on goods being sold. This paper proposes a novel approach, called CoFea (Co-training by Features), to identify spam reviews, based on entropy and the co-training algorithm. After sorting all lexical terms of reviews by entropy, we produce two views on the reviews by dividing the lexical terms into two subsets. One subset contains odd-numbered terms and the other contains even-numbered terms. Using SVM (support vector machine) as the base classifier, we further propose two strategies, CoFea-T and CoFea-S, embedded in the CoFea approach. The CoFea-T strategy uses all terms in the subsets for spam review identification by SVM. The CoFea-S strategy uses a predefined number of terms with small entropy for spam review identification by SVM. The experimental results show that the CoFea-T strategy produces better accuracy than the CoFea-S strategy, while the CoFea-S strategy saves more computing time than the CoFea-T strategy with acceptable accuracy in spam review identification.
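
    The view-construction step can be sketched in a few lines (names are ours; the per-term entropy is assumed to be computed upstream from the review corpus):

```python
def cofea_views(term_entropy):
    """Sort lexical terms by entropy, then split the ranking into two
    views: odd-numbered and even-numbered terms (CoFea's two subsets)."""
    ranked = sorted(term_entropy, key=term_entropy.get)
    view_a = ranked[0::2]   # terms at even positions in the ranking
    view_b = ranked[1::2]   # terms at odd positions
    return view_a, view_b

# Each view then feeds its own SVM inside the co-training loop.
views = cofea_views({'cheap': 0.2, 'great': 0.9, 'refund': 0.5, 'wow': 0.7})
```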

  13. Entropy for Mechanically Vibrating Systems

    Science.gov (United States)

    Tufano, Dante

    , which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy under a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled. In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.

  14. Noisy EEG signals classification based on entropy metrics. Performance assessment using first and second generation statistics.

    Science.gov (United States)

    Cuesta-Frau, David; Miró-Martínez, Pau; Jordán Núñez, Jorge; Oltra-Crespo, Sandra; Molina Picó, Antonio

    2017-08-01

    This paper evaluates the performance of first-generation entropy metrics, represented by the well known and widely used Approximate Entropy (ApEn) and Sample Entropy (SampEn) metrics, and what can be considered an evolution of these, Fuzzy Entropy (FuzzyEn), in the electroencephalogram (EEG) signal classification context. The study uses the commonest artifacts found in real EEGs, such as white noise, and muscular, cardiac, and ocular artifacts. Using two different sets of publicly available EEG records, and a realistic range of amplitudes for interfering artifacts, this work optimises these metrics and assesses their robustness against artifacts in terms of class segmentation probability. The results show that the qualitative behaviour of the two datasets is similar, with SampEn and FuzzyEn performing best, and that noise and muscular artifacts are the most confounding factors. In contrast, there is wide variability with regard to initialization parameters. The poor performance achieved by ApEn suggests that this metric should not be used in these contexts. Copyright © 2017 Elsevier Ltd. All rights reserved.
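
    For reference, a compact SampEn implementation in its usual form (embedding dimension m, tolerance r as a fraction of the signal's standard deviation); this is a generic sketch, not the authors' optimized code:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B), where B counts template matches of
    length m and A of length m+1, self-matches excluded."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    N = len(x)

    def count_matches(dim, n_templates):
        T = np.array([x[i:i+dim] for i in range(n_templates)])
        # Chebyshev distance between all template pairs (i < j).
        d = np.abs(T[:, None, :] - T[None, :, :]).max(axis=2)
        iu = np.triu_indices(n_templates, k=1)
        return np.count_nonzero(d[iu] <= tol)

    B = count_matches(m, N - m)
    A = count_matches(m + 1, N - m)
    return -np.log(A / B) if A and B else np.inf

rng = np.random.default_rng(2)
print(sample_entropy(rng.normal(size=500)))        # higher: irregular signal
print(sample_entropy(np.sin(np.arange(500)/5.0)))  # lower: regular signal
```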

  15. Entropy: From Thermodynamics to Hydrology

    Directory of Open Access Journals (Sweden)

    Demetris Koutsoyiannis

    2014-02-01

    Some known results from statistical thermophysics as well as from hydrology are revisited from a different perspective, trying: (a) to unify the notion of entropy in thermodynamic and statistical/stochastic approaches to complex hydrological systems and (b) to show the power of entropy and the principle of maximum entropy in inference, both deductive and inductive. The capability for deductive reasoning is illustrated by deriving the law of phase-change transition of water (Clausius-Clapeyron) from scratch by maximizing entropy in a formal probabilistic framework. However, such deductive reasoning cannot work in more complex hydrological systems with diverse elements, yet the entropy maximization framework can help in inductive inference, necessarily based on data. Several examples of this type are provided in an attempt to link statistical thermophysics with hydrology with a unifying view of entropy.

  16. Quantitative comparison of entropy analysis of fetal heart rate variability related to the different stages of labor.

    Science.gov (United States)

    Lim, Jongil; Kwon, Ji Young; Song, Juhee; Choi, Hosoon; Shin, Jong Chul; Park, In Yang

    2014-02-01

    The interpretation of the fetal heart rate (FHR) signal in light of labor progression may help reduce perinatal morbidity and mortality. However, there have been few studies that evaluate the fetus in each labor stage quantitatively. To evaluate whether the entropy indices of FHR differ according to labor progression. A retrospective comparative study of FHR recordings in three groups: 280 recordings in the second stage of labor before vaginal delivery, 31 recordings in the first stage of labor before emergency cesarean delivery, and 23 recordings in pre-labor before elective cesarean delivery. The stored FHR recordings of external cardiotocography during labor. Approximate entropy (ApEn) and sample entropy (SampEn) for the final 2000 RR intervals. The median ApEn and SampEn for the 2000 RR intervals showed the lowest values in the second stage of labor, followed by the emergency cesarean group and the elective cesarean group, for all time segments (all P values significant). Entropy indices of FHR were significantly different according to labor progression. This result supports the necessity of considering labor progression when developing intrapartum fetal monitoring using the entropy indices of FHR. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Taxi trips distribution modeling based on Entropy-Maximizing theory: A case study in Harbin city-China

    Science.gov (United States)

    Tang, Jinjun; Zhang, Shen; Chen, Xinqiang; Liu, Fang; Zou, Yajie

    2018-03-01

    Understanding the Origin-Destination (OD) distribution of taxi trips is very important for improving the effects of transportation planning and enhancing the quality of taxi services. This study proposes a new method based on Entropy-Maximizing theory to model the OD distribution in Harbin city using large-scale taxi GPS trajectories. Firstly, a K-means clustering method is utilized to partition raw pick-up and drop-off locations into different zones, and trips are assumed to start from and end at zone centers. A generalized cost function is further defined by considering travel distance, time, and fee between each OD pair. GPS data collected from more than 1000 taxis at an interval of 30 s during one month are divided into two parts: data from the first twenty days are treated as the training dataset and data from the last ten days as the testing dataset. The training dataset is used to calibrate the model, while the testing dataset is used to validate it. Furthermore, three indicators, mean absolute error (MAE), root mean square error (RMSE), and mean percentage absolute error (MPAE), are applied to evaluate the training and testing performance of the Entropy-Maximizing model versus the Gravity model. The results demonstrate that the Entropy-Maximizing model is superior to the Gravity model. The findings of the study validate the feasibility of estimating OD distributions from taxi GPS data in urban systems.
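
    The entropy-maximizing OD model with fixed origin and destination totals and a cost deterrence term has the classic doubly constrained form T_ij = A_i O_i B_j D_j exp(-beta c_ij); a minimal sketch of calibrating the balancing factors by iterative proportional fitting follows (the zone totals, costs, and beta are invented; in the paper beta would be fitted on the training GPS data):

```python
import numpy as np

def doubly_constrained(O, D, cost, beta, iters=200):
    """Entropy-maximizing OD matrix T_ij = A_i*O_i * B_j*D_j * exp(-beta*c_ij),
    with A, B found by alternating (IPF-style) balancing."""
    f = np.exp(-beta * cost)              # deterrence matrix
    A = np.ones(len(O)); B = np.ones(len(D))
    for _ in range(iters):
        A = 1.0 / (f @ (B * D))           # enforce row totals O_i
        B = 1.0 / (f.T @ (A * O))         # enforce column totals D_j
    return (A * O)[:, None] * (B * D)[None, :] * f

O = np.array([300., 200., 100.])          # trips produced per zone
D = np.array([250., 250., 100.])          # trips attracted per zone
cost = np.array([[1., 3., 5.], [3., 1., 4.], [5., 4., 1.]])
T = doubly_constrained(O, D, cost, beta=0.5)
print(T.round(1), T.sum(axis=1), T.sum(axis=0))  # row sums ~O, column sums ~D
```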

  18. Alloying behavior of iron, gold and silver in AlCoCrCuNi-based equimolar high-entropy alloys

    International Nuclear Information System (INIS)

    Hsu, U.S.; Hung, U.D.; Yeh, J.W.; Chen, S.K.; Huang, Y.S.; Yang, C.C.

    2007-01-01

    High-entropy alloys are newly developed alloys that are composed, by definition, of at least five principal elements with concentrations in the range of 5-35 at.%. Therefore, the alloying behavior of any given principal element is significantly affected by all the other principal elements present. In order to elucidate this further, the influence of iron, silver, and gold additions on the microstructure and hardness of AlCoCrCuNi-based equimolar alloys has been examined. The as-cast AlCoCrCuNi base alloy is found to have a dendritic structure in which only solid-solution FCC and BCC phases can be observed. The BCC dendrite has a chemical composition close to that of the nominal alloy, however with a deficiency in copper, which is found to segregate and form an FCC Cu-rich interdendrite. The microstructure of the iron-containing alloy is similar to that of the base alloy. Both of these alloys have hardnesses of about 420 HV, consistent with their similar microstructures. With the addition of silver, the as-cast ingot forms two layers of distinct composition. These layers, which are gold and silver in color, are determined to have a hypoeutectic Ag-Cu composition and a multielement mixture of the other principal elements, respectively. This indicates the chemical incompatibility of silver with the other principal elements. The hardnesses of the gold-colored (104 HV) and silver-colored layers (451 HV) are the lowest and highest of the alloy systems studied. This is attributed to the hypoeutectic Ag-Cu composition of the former and the reduced copper content of the latter. Only multielement mixtures, i.e. without copper segregation, form in the gold-containing alloy. Thus, it may be said that gold acts as a 'mixing agent' between copper and the other elements. Although several of the atom pairs in the gold-containing alloy have positive enthalpies, thermodynamic considerations show that the high entropy contribution is sufficient to counterbalance

  19. Relation Entropy and Transferable Entropy Think of Aggregation on Group Decision Making

    Institute of Scientific and Technical Information of China (English)

    CHENG Qi-yue; QIU Wan-hua; LIU Xiao-feng

    2002-01-01

    In this paper, the aggregation problem between group decision making and single decision making is studied. The theory of entropy is applied to set-pair analysis. The notions of relation entropy and transferable entropy are put forward, and their characteristics are studied. A potential function based on the relation entropy and the transferable entropy is defined; it serves as a consistency measure between the group and a single decision maker. A new, effective aggregation definition for detecting group misjudgment is obtained.

  20. Logarithmic black hole entropy corrections and holographic Renyi entropy

    Energy Technology Data Exchange (ETDEWEB)

    Mahapatra, Subhash [The Institute of Mathematical Sciences, Chennai (India); KU Leuven - KULAK, Department of Physics, Kortrijk (Belgium)

    2018-01-15

    The entanglement and Renyi entropies for spherical entangling surfaces in CFTs with gravity duals can be explicitly calculated by mapping these entropies first to the thermal entropy on hyperbolic space and then, using the AdS/CFT correspondence, to the Wald entropy of topological black holes. Here we extend this idea by taking into account corrections to the Wald entropy. Using the method based on horizon symmetries and the asymptotic Cardy formula, we calculate corrections to the Wald entropy and find that these corrections are proportional to the logarithm of the area of the horizon. With the corrected expression for the entropy of the black hole, we then find corrections to the Renyi entropies. We calculate these corrections for both Einstein and Gauss-Bonnet gravity duals. Corrections with logarithmic dependence on the area of the entangling surface naturally occur at the order G_D^0. The entropic c-function and the inequalities of the Renyi entropy are also satisfied even with the correction terms. (orig.)

  2. Relationship between Entropy and Dimension of Financial Correlation-Based Network

    Directory of Open Access Journals (Sweden)

    Chun-xiao Nie

    2018-03-01

    We analyze the dimension of a financial correlation-based network and apply our analysis to characterize the complexity of the network. First, we generalize the volume-based dimension and find that it is well defined for the correlation-based network. Second, we establish the relationship between the Rényi index and the volume-based dimension. Third, we analyze the meaning of the dimension sequence, which characterizes the level of departure from the comparison benchmark based on randomized time series. Finally, we use real stock market data from three countries for empirical analysis. In some cases, our proposed analysis method can more accurately capture the structural differences of networks than the power-law index commonly used in previous studies.

  3. Evaluation of single and multi-threshold entropy-based algorithms for folded substrate analysis

    Directory of Open Access Journals (Sweden)

    Magdolna Apro

    2011-10-01

    This paper presents a detailed evaluation of two variants of the Maximum Entropy image segmentation algorithm (single and multi-thresholding) with respect to their performance on segmenting test images showing folded substrates. The segmentation quality was determined by evaluating the values of four different measures: misclassification error, modified Hausdorff distance, relative foreground area error, and positive-negative false detection ratio. New normalization methods were proposed in order to combine all parameters into a unique algorithm evaluation rating. The segmentation algorithms were tested on images obtained by three different digitalisation methods covering four different surface textures. In addition, the methods were also tested on three images presenting a perfect fold. The obtained results showed that the Multi-Maximum Entropy algorithm is better suited for the analysis of images showing folded substrates.

  4. Analysis of calculating methods for failure distribution function based on maximal entropy principle

    International Nuclear Information System (INIS)

    Guo Chunying; Lin Yuangen; Jiang Meng; Wu Changli

    2009-01-01

    The computation of failure distribution functions of electronic devices exposed to gamma rays is discussed here. First, the possible device failure distribution models are determined through tests of statistical hypotheses using the test data. The results show that the devices' failure distribution can be consistent with multiple distributions when the test data are few. In order to decide the optimum failure distribution model, the maximal entropy principle is used and the elementary failure models are determined. Then, the Bootstrap estimation method is used to simulate the interval estimation of the mean and the standard deviation. On this basis, the maximal entropy principle is used again and the simulated annealing method is applied to find the optimum values of the mean and the standard deviation. Accordingly, the electronic devices' optimum failure distributions are finally determined and the survival probabilities are calculated. (authors)

  5. Predicting the Outcome of NBA Playoffs Based on the Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    Ge Cheng

    2016-12-01

    Predicting the outcome of National Basketball Association (NBA) matches poses a challenging problem of interest to the research community as well as the general public. In this article, we formalize the problem of predicting NBA game results as a classification problem and apply the principle of Maximum Entropy to construct an NBA Maximum Entropy (NBAME) model that fits discrete statistics for NBA games, and then predict the outcomes of NBA playoffs using the model. Our results reveal that the model is able to predict the winning team with 74.4% accuracy, outperforming other classical machine learning algorithms that could only afford a maximum prediction accuracy of 70.6% in the experiments that we performed.
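
    A conditional maximum-entropy classifier over discrete game statistics is mathematically equivalent to (multinomial) logistic regression, so a hedged stand-in for the NBAME model can be sketched as follows (the features and data are invented; the paper's actual feature set is not reproduced here):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy stand-in features: discretized game statistics for the home team
# (e.g. field-goal %, rebound, and turnover bins), with 1 = home win.
rng = np.random.default_rng(3)
X = rng.integers(0, 5, size=(400, 3)).astype(float)
y = (X[:, 0] - X[:, 2] + rng.normal(0, 1, 400) > 0).astype(int)

# Logistic regression is the conditional maximum-entropy model for
# this classification setting.
model = LogisticRegression(max_iter=1000).fit(X[:300], y[:300])
print("holdout accuracy:", model.score(X[300:], y[300:]))
```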

  6. Using entropy measures to characterize human locomotion.

    Science.gov (United States)

    Leverick, Graham; Szturm, Tony; Wu, Christine Q

    2014-12-01

    Entropy measures have been widely used to quantify the complexity of theoretical and experimental dynamical systems. In this paper, the value of using entropy measures to characterize human locomotion is demonstrated based on their construct validity, predictive validity in a simple model of human walking and convergent validity in an experimental study. Results show that four of the five considered entropy measures increase meaningfully with the increased probability of falling in a simple passive bipedal walker model. The same four entropy measures also experienced statistically significant increases in response to increasing age and gait impairment caused by cognitive interference in an experimental study. Of the considered entropy measures, the proposed quantized dynamical entropy (QDE) and quantization-based approximation of sample entropy (QASE) offered the best combination of sensitivity to changes in gait dynamics and computational efficiency. Based on these results, entropy appears to be a viable candidate for assessing the stability of human locomotion.

  7. Regional Sustainable Development Analysis Based on Information Entropy-Sichuan Province as an Example.

    Science.gov (United States)

    Liang, Xuedong; Si, Dongyang; Zhang, Xinli

    2017-10-13

    According to the implementation of a scientific development perspective, sustainable development needs to consider regional development, economic and social development, and the harmonious development of society and nature, but regional sustainable development is often difficult to quantify. Through an analysis of the structure and functions of a regional system, this paper establishes an evaluation index system, which includes an economic subsystem, an ecological environmental subsystem, and a social subsystem, to study regional sustainable development capacity. A sustainable development capacity measure model for Sichuan Province was established by applying the information entropy calculation principle and the Brusselator principle. Each subsystem and its entropy change for each calendar year in Sichuan Province were analyzed to evaluate Sichuan Province's sustainable development capacity. It was found that the established model could effectively show actual changes in sustainable development levels through the entropy change of the reaction system; at the same time, the model could clearly demonstrate how the forty-six indicators from the three subsystems impact regional sustainable development, which helps make up for a gap in sustainable development research.

  8. Land quality, sustainable development and environmental degradation in agricultural districts: A computational approach based on entropy indexes

    International Nuclear Information System (INIS)

    Zambon, Ilaria; Colantoni, Andrea; Carlucci, Margherita; Morrow, Nathan; Sateriano, Adele; Salvati, Luca

    2017-01-01

    Land Degradation (LD) in socio-environmental systems negatively impacts sustainable development paths. This study proposes a framework for LD evaluation based on indicators of diversification in the spatial distribution of sensitive land. We hypothesize that conditions of spatial heterogeneity in a composite index of land sensitivity are more frequently associated with areas prone to LD than spatial homogeneity. Spatial heterogeneity is supposed to be associated with degraded areas that act as hotspots for future degradation processes. A diachronic analysis (1960–2010) was performed at the Italian agricultural district scale to identify environmental factors associated with spatial heterogeneity in the degree of land sensitivity to degradation based on the Environmentally Sensitive Area Index (ESAI). In 1960, diversification in the level of land sensitivity measured using two common indexes of entropy (Shannon's diversity and Pielou's evenness) increased significantly with the ESAI, indicating a high level of land sensitivity to degradation. In 2010, the surface area classified as “critical” to LD was the highest in districts with diversification in the spatial distribution of ESAI values, confirming the hypothesis formulated above. Entropy indexes, based on the observed alignment with the concept of LD, constitute a valuable base to inform mitigation strategies against desertification. - Highlights: • Spatial heterogeneity is supposed to be associated with degraded areas. • Entropy indexes can inform mitigation strategies against desertification. • Assessing spatial diversification in the degree of land sensitivity to degradation. • Mediterranean rural areas have an evident diversity in agricultural systems. • A diachronic analysis carried out at the Italian agricultural district scale.
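
    The two entropy indexes named are standard; for a vector of class shares (here, shares of ESAI sensitivity classes within a district; the example numbers are invented) they can be computed as:

```python
import numpy as np

def shannon_diversity(shares):
    """H' = -sum(p_i * ln p_i) over the nonzero class shares."""
    p = np.asarray(shares, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum())

def pielou_evenness(shares):
    """J = H' / ln(S): diversity normalized by its maximum for S classes."""
    p = np.asarray(shares, dtype=float)
    S = np.count_nonzero(p)
    return shannon_diversity(p) / np.log(S) if S > 1 else 0.0

# District with land spread across classes vs. one dominated by a single class.
print(shannon_diversity([.25, .25, .25, .25]), pielou_evenness([.25, .25, .25, .25]))
print(shannon_diversity([.91, .03, .03, .03]), pielou_evenness([.91, .03, .03, .03]))
```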

  10. An Entropy-Based Kernel Learning Scheme toward Efficient Data Prediction in Cloud-Assisted Network Environments

    Directory of Open Access Journals (Sweden)

    Xiong Luo

    2016-07-01

    With the recent emergence of wireless sensor networks (WSNs) in the cloud computing environment, it is now possible to monitor and gather physical information via many sensor nodes to meet the requirements of cloud services. Generally, those sensor nodes collect data and send it to a sink node, where end-users can query the information and run cloud applications. Currently, one of the main disadvantages of sensor nodes is their limited physical resources, namely little storage memory and a limited power source. Therefore, in order to avoid such limitations, it is necessary to develop an efficient data prediction method in WSNs. To serve this purpose, by reducing the redundant data transmission between sensor nodes and the sink node while maintaining acceptable errors, this article proposes an entropy-based learning scheme for data prediction through the use of the kernel least mean square (KLMS) algorithm. The proposed scheme, called E-KLMS, develops a mechanism to keep the predicted data synchronous at both sides. Specifically, the kernel-based method is able to adjust the coefficients adaptively in accordance with every input, which will achieve a better performance with smaller prediction errors, while information entropy is employed to remove those data which may cause relatively large errors. E-KLMS can effectively solve the tradeoff problem between prediction accuracy and computational effort while greatly simplifying the training structure compared with some other data prediction approaches. Moreover, the kernel-based method and the entropy technique ensure prediction quality by both improving accuracy and reducing errors. Experiments with some real data sets have been carried out to validate the efficiency and effectiveness of the E-KLMS learning scheme, and the experimental results show the advantages of our method in prediction accuracy and computational time.
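
    The KLMS core (without the entropy-based error screening that E-KLMS adds) can be sketched as a growing radial-basis expansion updated by stochastic gradient; the step size and kernel width below are illustrative, not the paper's values:

```python
import numpy as np

class KLMS:
    """Kernel least mean square with a Gaussian kernel: the prediction is
    a sum of kernels centered at past inputs, weighted by eta * error."""
    def __init__(self, eta=0.5, sigma=1.0):
        self.eta, self.sigma = eta, sigma
        self.centers, self.alphas = [], []

    def _k(self, u, v):
        return np.exp(-np.sum((u - v) ** 2) / (2 * self.sigma ** 2))

    def predict(self, u):
        return sum(a * self._k(u, c) for a, c in zip(self.alphas, self.centers))

    def update(self, u, d):
        e = d - self.predict(u)           # prediction error on the new sample
        self.centers.append(np.asarray(u, dtype=float))
        self.alphas.append(self.eta * e)  # new unit absorbs part of the error
        return e

# One-step prediction of a sensor series from a sliding input vector.
rng = np.random.default_rng(4)
x = np.sin(np.arange(300) * 0.1) + 0.05 * rng.normal(size=300)
model, errs = KLMS(), []
for t in range(5, 299):
    errs.append(model.update(x[t-5:t], x[t]))
print("mean |error|, first vs last 50 steps:",
      np.abs(errs[:50]).mean(), np.abs(errs[-50:]).mean())
```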

  11. A gravitational entropy proposal

    International Nuclear Information System (INIS)

    Clifton, Timothy; Tavakol, Reza; Ellis, George F R

    2013-01-01

    We propose a thermodynamically motivated measure of gravitational entropy based on the Bel–Robinson tensor, which has a natural interpretation as the effective super-energy–momentum tensor of free gravitational fields. The specific form of this measure differs depending on whether the gravitational field is Coulomb-like or wave-like, and reduces to the Bekenstein–Hawking value when integrated over the interior of a Schwarzschild black hole. For scalar perturbations of a Robertson–Walker geometry we find that the entropy goes like the Hubble weighted anisotropy of the gravitational field, and therefore increases as structure formation occurs. This is in keeping with our expectations for the behaviour of gravitational entropy in cosmology, and provides a thermodynamically motivated arrow of time for cosmological solutions of Einstein’s field equations. It is also in keeping with Penrose’s Weyl curvature hypothesis. (paper)

  12. Microscopic entropy and nonlocality

    International Nuclear Information System (INIS)

    Karpov, E.; Ordonets, G.; Petroskij, T.; Prigozhin, I.

    2003-01-01

    We have obtained a microscopic expression for entropy in terms of H function based on nonunitary Λ transformation which leads from the time evolution as a unitary group to a Markovian dynamics and unifies the reversible and irreversible aspects of quantum mechanics. This requires a new representation outside the Hilbert space. In terms of H, we show the entropy production and the entropy flow during the emission and absorption of radiation by an atom. Analyzing the time inversion experiment, we emphasize the importance of pre- and postcollisional correlations, which break the symmetry between incoming and outgoing waves. We consider the angle dependence of the H function in a three-dimensional situation. A model including virtual transitions is discussed in a subsequent paper

  13. Fault diagnosis technology of nuclear power plant based on weighted degree of grey incidence of optimized entropy

    International Nuclear Information System (INIS)

    Kong Yan; Li Zhenjie; Ren Xin; Wang Chuan

    2012-01-01

    Nuclear power plants (NPPs) are very complex grey systems, in which faults and symptoms have no definite one-to-one correspondence, so faults are hard to diagnose. A model based on the weighted degree of grey incidence with optimized entropy was therefore proposed. To validate the system, simulation experiments on typical condenser faults of NPPs were conducted. The results show that the system's conclusions are correct and that it is fast enough for real-time diagnosis, with distinctive features such as good stability and a high resolution rate. (authors)

  14. Order and correlation contributions to the entropy of hydrophobic solvation

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Maoyuan; Besford, Quinn Alexander; Mulvaney, Thomas; Gray-Weale, Angus, E-mail: gusgw@gusgw.net [School of Chemistry, The University of Melbourne, Victoria 3010 (Australia)

    2015-03-21

    The entropy of hydrophobic solvation has been explained as the result of ordered solvation structures, of hydrogen bonds, of the small size of the water molecule, of dispersion forces, and of solvent density fluctuations. We report a new approach to the calculation of the entropy of hydrophobic solvation, along with tests of and comparisons to several other methods. The methods are assessed in the light of the available thermodynamic and spectroscopic information on the effects of temperature on hydrophobic solvation. Five model hydrophobes in SPC/E water give benchmark solvation entropies via Widom’s test-particle insertion method, and other methods and models are tested against these particle-insertion results. Entropies associated with distributions of tetrahedral order, of electric field, and of solvent dipole orientations are examined. We find these contributions are small compared to the benchmark particle-insertion entropy. Competitive with or better than other theories in accuracy, but with no free parameters, is the new estimate of the entropy contributed by correlations between dipole moments. Dipole correlations account for most of the hydrophobic solvation entropy for all models studied and capture the distinctive temperature dependence seen in thermodynamic and spectroscopic experiments. Entropies based on pair and many-body correlations in number density approach the correct magnitudes but fail to describe temperature and size dependences, respectively. Hydrogen-bond definitions and free energies that best reproduce entropies from simulations are reported, but it is difficult to choose one hydrogen bond model that fits a variety of experiments. The use of information theory, scaled-particle theory, and related methods is discussed briefly. Our results provide a test of the Frank-Evans hypothesis that the negative solvation entropy is due to structured water near the solute, complement the spectroscopic detection of that solvation structure by

  15. A New Feature Extraction Method Based on EEMD and Multi-Scale Fuzzy Entropy for Motor Bearing

    Directory of Open Access Journals (Sweden)

    Huimin Zhao

    2016-12-01

    Feature extraction is one of the most important, pivotal, and difficult problems in mechanical fault diagnosis, and it directly relates to the accuracy of fault diagnosis and the reliability of early fault prediction. Therefore, a new fault feature extraction method, called the EDOMFE method, based on integrating ensemble empirical mode decomposition (EEMD), mode selection, and multi-scale fuzzy entropy, is proposed to accurately diagnose faults in this paper. The EEMD method is used to decompose the vibration signal into a series of intrinsic mode functions (IMFs) with different physical significance. The correlation coefficient analysis method is used to calculate and determine three improved IMFs, which are close to the original signal. The multi-scale fuzzy entropy, with its ability to effectively distinguish the complexity of different signals, is used to calculate the entropy values of the selected three IMFs in order to form a feature vector with the complexity measure, which is regarded as the input of the support vector machine (SVM) model for training and constructing an SVM classifier (EOMSMFD, based on EDOMFE and SVM) for fault pattern recognition. Finally, the effectiveness of the proposed method is validated by real bearing vibration signals of a motor with different loads and fault severities. The experimental results show that the proposed EDOMFE method can effectively extract fault features from the vibration signal and that the proposed EOMSMFD method can accurately diagnose the fault types and fault severities for the inner race fault, the outer race fault, and the rolling element fault of a motor bearing. Therefore, the proposed method provides a new fault diagnosis technology for rotating machinery.

  16. A comparison of entropy balance and probability weighting methods to generalize observational cohorts to a population: a simulation and empirical example.

    Science.gov (United States)

    Harvey, Raymond A; Hayden, Jennifer D; Kamble, Pravin S; Bouchard, Jonathan R; Huang, Joanna C

    2017-04-01

    We compared methods to control bias and confounding in observational studies, including inverse probability weighting (IPW) and stabilized IPW (sIPW). These methods often require iteration and post-calibration to achieve covariate balance. In comparison, entropy balance (EB) optimizes covariate balance a priori by calibrating weights using the target's moments as constraints. We measured covariate balance empirically and by simulation by using absolute standardized mean difference (ASMD), absolute bias (AB), and root mean square error (RMSE), investigating two scenarios: the size of the observed (exposed) cohort exceeds the target (unexposed) cohort and vice versa. The empirical application weighted a commercial health plan cohort to a nationally representative National Health and Nutrition Examination Survey target on the same covariates and compared average total health care cost estimates across methods. Entropy balance alone achieved balance (ASMD ≤ 0.10) on all covariates in simulation and empirically. In simulation scenario I, EB achieved the lowest AB and RMSE (13.64, 31.19) compared with IPW (263.05, 263.99) and sIPW (319.91, 320.71). In scenario II, EB outperformed IPW and sIPW with smaller AB and RMSE. In scenarios I and II, EB achieved the lowest mean estimate difference from the simulated population outcome ($490.05, $487.62) compared with IPW and sIPW, respectively. Empirically, only EB differed from the unweighted mean cost, indicating that IPW and sIPW weighting was ineffective. Entropy balance demonstrated the bias-variance tradeoff, achieving higher estimate accuracy, yet lower estimate precision, compared with IPW methods. EB weighting required no post-processing and effectively mitigated observed bias and confounding. Copyright © 2016 John Wiley & Sons, Ltd.
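
    Entropy balance solves for weights on the observed cohort that maximize entropy subject to the target's moment constraints; a minimal sketch via the convex dual (exponential tilting, with the multipliers fitted by a root finder; variable names and data are ours) follows:

```python
import numpy as np
from scipy.optimize import root

def entropy_balance(X, target_means):
    """Weights w_i proportional to exp(Z_i @ lam), normalized to sum to 1,
    with the weighted covariate means equal to target_means (dual form)."""
    Z = X - target_means                 # covariates centered at the target
    def moment_gap(lam):
        w = np.exp(Z @ lam)
        w /= w.sum()
        return w @ Z                     # zero exactly when means match
    lam = root(moment_gap, np.zeros(X.shape[1])).x
    w = np.exp(Z @ lam); w /= w.sum()
    return w

rng = np.random.default_rng(5)
X = rng.normal(0.5, 1.0, size=(1000, 2))          # observed cohort covariates
w = entropy_balance(X, target_means=np.array([0.0, 0.2]))
print((w[:, None] * X).sum(axis=0))                # ~= [0.0, 0.2]
```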

  17. Entropy of a system formed in the collision of heavy ions

    International Nuclear Information System (INIS)

    Gudima, K.K.; Roepke, G.; Toneev, V.D.; Schulz, H.

    1987-01-01

    In the framework of the cascade model, we study the evolution of the entropy of a system formed in the collision of heavy ions. The method of calculating the entropy is based on a smoothing of the momentum distribution function by means of introducing a temperature field. It is shown that the resulting entropy per nucleon is very sensitive to the specific partitioning of phase space into cells in the free-expansion phase of the reaction. From comparison with experiment it is found that the cascade calculations do not show a preference for a particular model calculation of the entropy, but predict that the entropy is smaller than the values following from equilibrium statistics.

  18. Evaluation index system of steel industry sustainable development based on entropy method and TOPSIS method

    Science.gov (United States)

    Ronglian, Yuan; Mingye, Ai; Qiaona, Jia; Yuxuan, Liu

    2018-03-01

    Sustainable development is the only way forward for human society. As an important part of the national economy, the steel industry is energy-intensive and has further to go toward sustainable development. In this paper, we use the entropy method and the TOPSIS method to evaluate the development of China's steel industry during the "12th Five-Year Plan" from four aspects: resource utilization efficiency, main energy and material consumption, pollution status, and resource reuse rate. We also put forward some suggestions for the development of China's steel industry.
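
    As an illustration of the machinery, the sketch below combines the entropy weight method with TOPSIS in Python. The indicator matrix is hypothetical and all criteria are treated as benefit-type; cost-type indicators would need to be inverted first, and the paper's actual indicator set is not reproduced here.

```python
import numpy as np

def entropy_weights(X):
    """Entropy method: criteria that differentiate the alternatives more
    (lower entropy) receive larger weights. X: alternatives x criteria,
    all values positive and benefit-type in this sketch."""
    P = X / X.sum(axis=0)
    k = 1.0 / np.log(X.shape[0])
    e = -k * np.where(P > 0, P * np.log(P), 0.0).sum(axis=0)
    d = 1.0 - e                                  # degree of diversification
    return d / d.sum()

def topsis(X, w):
    """Rank alternatives by relative closeness to the ideal solution."""
    V = X / np.sqrt((X ** 2).sum(axis=0)) * w    # weighted, vector-normalized
    best, worst = V.max(axis=0), V.min(axis=0)   # benefit criteria assumed
    d_best = np.sqrt(((V - best) ** 2).sum(axis=1))
    d_worst = np.sqrt(((V - worst) ** 2).sum(axis=1))
    return d_worst / (d_best + d_worst)

# toy data: 5 years x 4 indicator groups (hypothetical values)
X = np.array([[0.62, 0.55, 0.40, 0.58],
              [0.66, 0.57, 0.45, 0.60],
              [0.70, 0.60, 0.52, 0.63],
              [0.73, 0.66, 0.55, 0.69],
              [0.78, 0.70, 0.61, 0.72]])
scores = topsis(X, entropy_weights(X))
print(scores.argsort()[::-1])   # ranking of the five years, best first
```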

  19. DYNAMIC PARAMETER ESTIMATION BASED ON MINIMUM CROSS-ENTROPY METHOD FOR COMBINING INFORMATION SOURCES

    Czech Academy of Sciences Publication Activity Database

    Sečkárová, Vladimíra

    2015-01-01

    Roč. 24, č. 5 (2015), s. 181-188 ISSN 0204-9805. [XVI-th International Summer Conference on Probability and Statistics (ISCPS-2014). Pomorie, 21.6.-29.6.2014] R&D Projects: GA ČR GA13-13502S Grant - others:GA UK(CZ) SVV 260225/2015 Institutional support: RVO:67985556 Keywords : minimum cross-entropy principle * Kullback-Leibler divergence * dynamic diffusion estimation Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2015/AS/seckarova-0445817.pdf

  20. Comprehensive benefits analysis of steel structure modular residence based on the entropy evaluation

    Science.gov (United States)

    Zhang, Xiaoxiao; Wang, Li; Jiang, Pengming

    2017-04-01

    Steel structure modular residence is an outstanding product of residential industrialization. It has many advantages, such as low whole-life cost, high resource recovery, and a high degree of industrialization. This paper compares the comprehensive benefits of steel structure modular buildings with those of prefabricated reinforced concrete residences across economic, environmental, social, and technical benefits using the entropy evaluation method. It concludes that the comprehensive benefits of steel structure modular buildings are better than those of prefabricated reinforced concrete residences. The conclusions of this study provide a useful reference for the development of steel structure modular buildings in China.

  1. Multivariate refined composite multiscale entropy analysis

    International Nuclear Information System (INIS)

    Humeau-Heurtier, Anne

    2016-01-01

    Multiscale entropy (MSE) has become a prevailing method to quantify signal complexity. MSE relies on sample entropy. However, MSE may yield imprecise complexity estimates at large scales, because sample entropy does not give a precise estimate of entropy when short signals are processed. A refined composite multiscale entropy (RCMSE) has therefore recently been proposed. Nevertheless, RCMSE applies to univariate signals only. The simultaneous analysis of multi-channel (multivariate) data often outperforms analyses based on univariate signals. We therefore introduce an extension of RCMSE to multivariate data. Applications of multivariate RCMSE to simulated processes reveal its better performance over the standard multivariate MSE. - Highlights: • Multiscale entropy quantifies data complexity but may be inaccurate at large scale. • A refined composite multiscale entropy (RCMSE) has therefore recently been proposed. • Nevertheless, RCMSE is adapted to univariate time series only. • We herein introduce an extension of RCMSE to multivariate data. • It shows better performances than the standard multivariate multiscale entropy.
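
    A univariate sketch of the refined composite idea is shown below: at each scale, template match counts are accumulated over all coarse-grained series (one per start offset) before the ratio is taken, which is what makes RCMSE stable on short records. The multivariate extension introduced in the paper additionally embeds across channels and is not shown; parameter defaults are conventional, not the paper's.

```python
import numpy as np

def _match_counts(x, m, r):
    """Numbers of template pairs within tolerance r for lengths m and m+1."""
    N = len(x)
    def count(mm):
        t = np.array([x[i:i + mm] for i in range(N - m)])   # same count for both
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)
        return ((d <= r).sum() - len(t)) / 2.0              # unordered, no self
    return count(m), count(m + 1)

def rcmse(x, scale, m=2, r=0.15):
    """Refined composite multiscale sample entropy at one scale: match counts
    are pooled over all `scale` coarse-grained series before the ratio is
    taken, avoiding undefined values on short signals."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    B = A = 0.0
    for k in range(scale):                # one coarse series per start offset
        n = (len(x) - k) // scale
        cg = x[k:k + n * scale].reshape(n, scale).mean(axis=1)
        b, a = _match_counts(cg, m, tol)
        B += b
        A += a
    return -np.log(A / B)
```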

  2. A Cross-Entropy-Based Admission Control Optimization Approach for Heterogeneous Virtual Machine Placement in Public Clouds

    Directory of Open Access Journals (Sweden)

    Li Pan

    2016-03-01

    Full Text Available Virtualization technologies make it possible for cloud providers to consolidate multiple IaaS provisions into a single server in the form of virtual machines (VMs). Additionally, in order to fulfill the divergent service requirements from multiple users, a cloud provider needs to offer several types of VM instances, which are associated with varying configurations and performance, as well as different prices. In such a heterogeneous virtual machine placement process, one significant problem faced by a cloud provider is how to optimally accept and place multiple VM service requests into its cloud data centers to achieve revenue maximization. To address this issue, in this paper, we first formulate such a revenue maximization problem during VM admission control as a multiple-dimensional knapsack problem, which is known to be NP-hard to solve. Then, we propose to use a cross-entropy-based optimization approach to address this revenue maximization problem, by obtaining a near-optimal eligible set for the provider to accept into its data centers, from the waiting VM service requests in the system. Finally, through extensive experiments and measurements in a simulated environment with the settings of VM instance classes derived from real-world cloud systems, we show that our proposed cross-entropy-based admission control optimization algorithm is efficient and effective in maximizing cloud providers' revenue in a public cloud computing environment.
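
    The cross-entropy method for such a knapsack formulation is compact: sample candidate accept/reject vectors from independent Bernoulli distributions, keep the elite fraction by revenue among feasible samples, and move the sampling probabilities toward the elites. The sketch below is a generic CE solver for a multiple-dimensional 0/1 knapsack with hypothetical parameter choices, not the paper's tuned algorithm.

```python
import numpy as np

def ce_knapsack(values, weights, capacities, n_samples=1000,
                elite_frac=0.1, alpha=0.7, iters=50, seed=0):
    """Cross-entropy method for a multiple-dimensional 0/1 knapsack:
    maximize total value subject to every resource dimension fitting.
    Assumes at least some samples are feasible; a repair step would
    harden this for tightly constrained instances."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)      # (n_items, n_dims)
    capacities = np.asarray(capacities, dtype=float)
    n = len(values)
    p = np.full(n, 0.5)                             # Bernoulli inclusion probs
    n_elite = max(1, int(elite_frac * n_samples))
    best_x, best_v = np.zeros(n), -np.inf
    for _ in range(iters):
        X = (rng.random((n_samples, n)) < p).astype(float)
        feasible = (X @ weights <= capacities).all(axis=1)
        score = np.where(feasible, X @ values, -np.inf)
        elite = X[np.argsort(score)[-n_elite:]]
        p = alpha * elite.mean(axis=0) + (1 - alpha) * p   # smoothed update
        i = int(score.argmax())
        if score[i] > best_v:
            best_v, best_x = score[i], X[i]
    return best_x, best_v

# toy instance: 20 request types, 2 resource dimensions (e.g., CPU, RAM)
rng = np.random.default_rng(1)
v = rng.uniform(1, 10, 20)
w = rng.uniform(0, 5, (20, 2))
x, revenue = ce_knapsack(v, w, capacities=[25.0, 25.0])
```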

  3. An EGR performance evaluation and decision-making approach based on grey theory and grey entropy analysis.

    Science.gov (United States)

    Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan

    2018-01-01

    Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory, and grey entropy analysis to evaluate EGR performance and determine the optimal EGR rate, tasks which currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization.

  4. Identification of small-scale discontinuities based on dip-oriented gradient energy entropy coherence estimation

    Science.gov (United States)

    Peng, Da; Yin, Cheng

    2017-09-01

    Locating small-scale discontinuities is one of the most challenging geophysical tasks; these subtle geological features are significant since they are often associated with subsurface petroleum traps. Subtle faults, fractures, unconformities, reef textures, channel boundaries, thin-bed boundaries and other structural and stratigraphic discontinuities have subtle geological edges which may provide lateral variation in seismic expression. Among the different geophysical techniques available, 3D seismic discontinuity attributes are particularly useful for highlighting discontinuities in the seismic data. Traditional seismic discontinuity attributes are sensitive to noise and are not very appropriate for detecting small-scale discontinuities. Thus, we present a dip-oriented gradient energy entropy (DOGEE) coherence estimation method to detect subtle faults and structural features. The DOGEE coherence estimation method uses the gradient structure tensor (GST) algorithm to obtain local dip information and construct a gradient correlation matrix to calculate gradient energy entropy. The proposed DOGEE coherence estimation method is robust to noise, and also improves the clarity of fault edges. It is effective for small-scale discontinuity characterisation and interpretation.

  5. Investigating the Thermodynamic Performances of TO-Based Metamaterial Tunable Cells with an Entropy Generation Approach

    Directory of Open Access Journals (Sweden)

    Guoqiang Xu

    2017-10-01

    Full Text Available Active control of heat flux can be realized with transformation optics (TO) thermal metamaterials. Recently, a new class of metamaterial tunable cells has been proposed, aiming to significantly reduce the difficulty of fabrication and to flexibly switch functions by employing several cells assembled on related positions following the TO design. However, owing to the integration and rotation of materials in tunable cells, they might introduce extra thermal losses compared with the previous continuum design. This paper focuses on investigating the thermodynamic properties of tunable cells under the relevant design parameters. The universal expression for the local entropy generation rate in such metamaterial systems is obtained considering the influence of rotation. A series of contrast schemes are established to describe the thermodynamic process and thermal energy distributions from the viewpoint of entropy analysis. Moreover, the effects of design parameters on thermal dissipation and system irreversibility are investigated. In conclusion, more thermal dissipation and stronger thermodynamic processes occur in a system with larger conductivity ratios and rotation angles. This paper presents a detailed description of the thermodynamic properties of metamaterial tunable cells and provides a reference for selecting appropriate design parameters on related positions to fabricate more efficient and energy-economical switchable TO devices.

  6. Entropy Based Test Point Evaluation and Selection Method for Analog Circuit Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Yuan Gao

    2014-01-01

    Full Text Available By simplifying the tolerance problem and treating faulty voltages at different test points as independent variables, the integer-coded table technique has been proposed to simplify the test point selection process. However, simplifying the tolerance problem may induce a wrong solution, while the independence assumption results in an overly conservative outcome. To address these problems, the tolerance problem is thoroughly considered in this paper, and the dependency relationship between different test points is considered at the same time. A heuristic graph search method is proposed to facilitate the test point selection process. First, the information-theoretic concept of entropy is used to evaluate the optimality of a test point. The entropy is calculated using the ambiguity sets and the faulty voltage distribution, determined by component tolerance. Second, the selected optimal test point is used to expand the current graph node by using the dependence relationship between the test point and the graph node. Simulation results indicate that the proposed method finds the optimal set of test points more accurately than other methods; it is therefore a good solution for minimizing the size of the test point set. To simplify and clarify the proposed method, only catastrophic and some specific parametric faults are discussed in this paper.
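
    A simplified version of the entropy-driven selection loop can be sketched as a greedy search: each test point partitions the surviving fault candidates into ambiguity sets, the partition entropy scores its discriminating power, and the highest-scoring test point is chosen until every fault is isolated. The fault dictionary below is hypothetical, faults are assumed equiprobable, and the tolerance-induced voltage distributions of the paper are reduced to integer codes for brevity.

```python
import numpy as np

def partition_entropy(labels):
    """Shannon entropy of the partition a test point induces on the
    candidate fault set (uniform fault priors assumed)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def select_test_points(fault_table):
    """Greedy search: repeatedly pick the test point whose ambiguity-set
    partition has maximum entropy, until every fault is isolated.
    fault_table[i, j] = quantized voltage code of fault i at test point j."""
    table = np.asarray(fault_table)
    n_faults, n_tests = table.shape
    chosen, groups = [], [np.arange(n_faults)]
    while any(len(g) > 1 for g in groups):
        # total entropy gain of each test point over current ambiguity groups
        gains = [sum(partition_entropy(table[g, j]) for g in groups)
                 for j in range(n_tests)]
        j = int(np.argmax(gains))
        if gains[j] == 0:                 # no test point can split further
            break
        chosen.append(j)
        groups = [g[table[g, j] == v]
                  for g in groups for v in np.unique(table[g, j])]
    return chosen

# toy fault dictionary: 6 faults x 4 test points, integer voltage codes
table = np.array([[0, 1, 0, 2],
                  [0, 1, 1, 2],
                  [1, 0, 1, 0],
                  [1, 0, 0, 1],
                  [2, 1, 0, 1],
                  [2, 0, 1, 0]])
print(select_test_points(table))   # e.g. [0, 2] isolates all six faults
```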

  7. Investigating dynamical complexity in the magnetosphere using various entropy measures

    Science.gov (United States)

    Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Kalimeri, Maria; Anastasiadis, Anastasios; Eftaxias, Konstantinos

    2009-09-01

    The complex system of the Earth's magnetosphere corresponds to an open, spatially extended, nonequilibrium (input-output) dynamical system. The nonextensive Tsallis entropy has recently been introduced as an appropriate information measure to investigate dynamical complexity in the magnetosphere. The method has been employed for analyzing Dst time series and gave promising results, detecting the complexity dissimilarity among different physiological and pathological magnetospheric states (i.e., prestorm activity and intense magnetic storms, respectively). This paper explores the applicability and effectiveness of a variety of computable entropy measures (e.g., block entropy, Kolmogorov entropy, T complexity, and approximate entropy) for the investigation of dynamical complexity in the magnetosphere. We show that as a magnetic storm approaches there is clear evidence of significantly lower complexity in the magnetosphere. The observed higher degree of organization of the system agrees with that inferred previously from an independent linear fractal spectral analysis based on wavelet transforms. This convergence between nonlinear and linear analyses provides a more reliable detection of the transition from the quiet-time to the storm-time magnetosphere, thus showing evidence that the occurrence of an intense magnetic storm is imminent. More precisely, our results suggest an important principle: a significant decrease in complexity and an increase in persistence in the Dst time series can be confirmed as a magnetic storm approaches, and these can be used as diagnostic tools for magnetospheric injury (global instability). Overall, approximate entropy and Tsallis entropy yield superior results for detecting dynamical complexity changes in the magnetosphere in comparison to the other entropy measures presented herein. Ultimately, the analysis tools developed in the course of this study for the treatment of the Dst index can provide convenience for space weather
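
    Among the measures discussed, the nonextensive Tsallis entropy is the simplest to reproduce. The sketch below estimates it from an amplitude histogram and tracks it over sliding windows, the same style of analysis applied to the Dst index in the paper; the window length, bin count, and entropic index q are illustrative choices, and the input series is assumed rather than real Dst data.

```python
import numpy as np

def tsallis_entropy(x, q=1.8, bins=64):
    """Nonextensive Tsallis entropy from an amplitude histogram;
    q is the entropic index (q -> 1 recovers the Shannon form)."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    if abs(q - 1.0) < 1e-12:
        return -(p * np.log(p)).sum()
    return (1.0 - (p ** q).sum()) / (q - 1.0)

def sliding_tsallis(x, win=256, step=32, q=1.8):
    """Complexity profile over sliding windows of a Dst-like series;
    a sustained drop would flag the approach of a storm."""
    return np.array([tsallis_entropy(x[i:i + win], q)
                     for i in range(0, len(x) - win + 1, step)])
```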

  8. Maximum Entropy Fundamentals

    Directory of Open Access Journals (Sweden)

    F. Topsøe

    2001-09-01

    Full Text Available Abstract: In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow up on here. It leads to the consideration of a certain game, the Code Length Game and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature with hundreds of applications pertaining to several different fields and will also here serve as important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over
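
    For the Mean Energy Model specifically, the maximum-entropy distribution over a finite alphabet has the familiar Gibbs form, with the multiplier fixed by the moment constraint. A minimal numerical sketch follows, assuming the target mean lies strictly between the smallest and largest energy so the root is bracketed:

```python
import numpy as np
from scipy.optimize import brentq

def maxent_mean_energy(E, U):
    """Maximum-entropy distribution on a finite alphabet under the moment
    constraint sum_i p_i E_i = U; the solution is p_i ∝ exp(-beta * E_i)."""
    E = np.asarray(E, dtype=float)

    def mean_energy_gap(beta):
        w = np.exp(-beta * (E - E.min()))   # shift exponents for stability
        p = w / w.sum()
        return p @ E - U

    # the constrained mean is strictly decreasing in beta, so the root
    # is bracketed for any U strictly inside (min E, max E)
    beta = brentq(mean_energy_gap, -50.0, 50.0)
    w = np.exp(-beta * (E - E.min()))
    return w / w.sum(), beta

E = np.linspace(0.0, 3.0, 4)
p, beta = maxent_mean_energy(E, U=1.0)
print(p, beta, p @ E)   # Gibbs weights, multiplier, and the constrained mean
```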

  9. Dynamic Cross-Entropy.

    Science.gov (United States)

    Aur, Dorian; Vila-Rodriguez, Fidel

    2017-01-01

    Complexity measures for time series have been used in many applications to quantify the regularity of one-dimensional time series; however, many dynamical systems are spatially distributed multidimensional systems. We introduced Dynamic Cross-Entropy (DCE), a novel multidimensional complexity measure that quantifies the degree of regularity of EEG signals in selected frequency bands. Time series generated by discrete logistic equations with varying control parameter r are used to test DCE measures. Sliding-window DCE analyses are able to reveal specific period-doubling bifurcations that lead to chaos. A similar behavior can be observed in seizures triggered by electroconvulsive therapy (ECT). Sample entropy data show the level of signal complexity in different phases of the ictal ECT. The transition to irregular activity is preceded by the occurrence of cyclic regular behavior. A significant increase of DCE values in successive order from high frequencies in the gamma band to low frequencies in the delta band reveals several phase transitions into less ordered states, possibly chaos in the human brain. To our knowledge there are no reliable techniques able to reveal the transition to chaos in the case of multidimensional time series. In addition, DCE based on sample entropy appears to be robust to EEG artifacts compared to DCE based on Shannon entropy. The applied technique may offer new approaches to better understand nonlinear brain activity. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Optimized Kernel Entropy Components.

    Science.gov (United States)

    Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau

    2017-06-01

    This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of the importance of kernel eigenvectors by entropy instead of variance, as in kernel principal component analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by compacting the information into very few features (often just one or two). The proposed method produces features with higher expressive power. In particular, it is based on the independent component analysis framework and introduces an extra rotation of the eigendecomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. The results of both methods are illustrated on different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, that the most successful rule for estimating the kernel parameter is based on maximum likelihood, and that OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.

  11. Assessing suitable area for Acacia dealbata Mill. in the Ceira River Basin (Central Portugal based on maximum entropy modelling approach

    Directory of Open Access Journals (Sweden)

    Jorge Pereira

    2015-12-01

    Full Text Available Biological invasion by exotic organisms has become a key issue, a concern associated with the deep impacts on several domains described as resulting from such processes. A better understanding of the processes, the identification of more susceptible areas, and the definition of preventive or mitigation measures are identified as critical for the purpose of reducing the associated impacts. The use of species distribution modeling might help in identifying areas that are more susceptible to invasion. This paper presents preliminary results on assessing the susceptibility to invasion by the exotic species Acacia dealbata Mill. in the Ceira river basin. The results are based on the maximum entropy modeling approach, considered one of the correlative modeling techniques with the best predictive performance. Models whose validation is based on independent data sets show better performance, an evaluation based on the AUC of the ROC accuracy measure.

  12. Shannon Entropy-Based Wavelet Transform Method for Autonomous Coherent Structure Identification in Fluid Flow Field Data

    Directory of Open Access Journals (Sweden)

    Kartik V. Bulusu

    2015-09-01

    Full Text Available The coherent secondary flow structures (i.e., swirling motions) in a curved artery model possess a variety of spatio-temporal morphologies and can be encoded over an infinitely-wide range of wavelet scales. Wavelet analysis was applied to the following vorticity fields: (i) a numerically-generated system of Oseen-type vortices for which the theoretical solution is known, used for benchmarking and evaluation of the technique; and (ii) experimental two-dimensional, particle image velocimetry data. The mother wavelet, a two-dimensional Ricker wavelet, can be dilated to infinitely large or infinitesimally small scales. We approached the problem of coherent structure detection by means of continuous wavelet transform (CWT) and decomposition (or Shannon) entropy. The main conclusion of this study is that the encoding of coherent secondary flow structures can be achieved by an optimal number of binary digits (or bits) corresponding to an optimal wavelet scale. The optimal wavelet-scale search was driven by a decomposition entropy-based algorithmic approach and led to a threshold-free coherent structure detection method. The method presented in this paper was successfully utilized in the detection of secondary flow structures in three clinically-relevant blood flow scenarios involving the curved artery model under a carotid artery-inspired, pulsatile inflow condition. These scenarios were: (i) a clean curved artery; (ii) a stent-implanted curved artery; and (iii) an idealized Type IV stent fracture within the curved artery.

  13. Entropy-Based Method of Choosing the Decomposition Level in Wavelet Threshold De-noising

    Directory of Open Access Journals (Sweden)

    Yan-Fang Sang

    2010-06-01

    Full Text Available In this paper, the energy distributions of various noises following normal, log-normal and Pearson-III distributions are first described quantitatively using the wavelet energy entropy (WEE), and the results are compared and discussed. Then, on the basis of these analytic results, a method for choosing the decomposition level (DL) in wavelet threshold de-noising (WTD) is put forward. Finally, the performance of the proposed method is verified by analysis of both synthetic and observed series. Analytic results indicate that the proposed method is easy to operate and suitable for various signals. Moreover, contrary to traditional white noise testing which depends on "autocorrelations", the proposed method uses energy distributions to distinguish real signals and noise in noisy series, therefore the chosen DL is reliable, and the WTD results of time series can be improved.
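
    A sketch of the WEE computation using the PyWavelets package is given below. Comparing the energy-distribution signature of pure noise against that of a noisy signal across levels is in the spirit of the paper's method, though the actual DL-selection rule is not reproduced here and the test signal is synthetic.

```python
import numpy as np
import pywt

def wavelet_energy_entropy(x, wavelet="db4", level=6):
    """WEE: Shannon entropy of the relative energies of the DWT sub-bands
    (approximation plus detail bands)."""
    coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=level)
    e = np.array([np.sum(c ** 2) for c in coeffs])
    p = e / e.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

# compare the WEE signature of pure noise with that of a noisy signal;
# a level where the two clearly diverge is a candidate decomposition level
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4096)
noise = rng.normal(size=t.size)
noisy = np.sin(2 * np.pi * 25 * t) + 0.5 * noise
for L in range(1, 7):
    print(L, wavelet_energy_entropy(noise, level=L),
             wavelet_energy_entropy(noisy, level=L))
```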

  14. A Novel Evaluation Method for Building Construction Project Based on Integrated Information Entropy with Reliability Theory

    Directory of Open Access Journals (Sweden)

    Xiao-ping Bai

    2013-01-01

    Full Text Available Selecting the construction scheme of a building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.

  15. Maximum entropy based reconstruction of soft X ray emissivity profiles in W7-AS

    International Nuclear Information System (INIS)

    Ertl, K.; Linden, W. von der; Dose, V.; Weller, A.

    1996-01-01

    The reconstruction of 2-D emissivity profiles from soft X ray tomography measurements constitutes a highly underdetermined and ill-posed inversion problem, because of the restricted viewing access, the number of chords and the increased noise level in most plasma devices. An unbiased and consistent probabilistic approach within the framework of Bayesian inference is provided by the maximum entropy method, which is independent of model assumptions, but allows any prior knowledge available to be incorporated. The formalism is applied to the reconstruction of emissivity profiles in an NBI heated plasma discharge to determine the dependence of the Shafranov shift on β, the reduction of which was a particular objective in designing the advanced W7-AS stellarator. (author). 40 refs, 7 figs

  16. An automatic system for Turkish word recognition using Discrete Wavelet Neural Network based on adaptive entropy

    International Nuclear Information System (INIS)

    Avci, E.

    2007-01-01

    In this paper, an automatic system is presented for word recognition using real Turkish word signals. The paper especially deals with the combination of feature extraction and classification from real Turkish word signals. A Discrete Wavelet Neural Network (DWNN) model is used, which consists of two layers: a discrete wavelet layer and a multi-layer perceptron. The discrete wavelet layer is used for adaptive feature extraction in the time-frequency domain and is composed of the Discrete Wavelet Transform (DWT) and wavelet entropy. The multi-layer perceptron used for classification is a feed-forward neural network. The performance of the system is evaluated using noisy Turkish word signals. Test results showing the effectiveness of the proposed automatic system are presented in this paper. The rate of correct recognition is about 92.5% for the sample speech signals. (author)

  17. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    Science.gov (United States)

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting the construction scheme of a building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.

  18. The integration of weighted gene association networks based on information entropy.

    Science.gov (United States)

    Yang, Fan; Wu, Duzhi; Lin, Limei; Yang, Jian; Yang, Tinghong; Zhao, Jing

    2017-01-01

    Constructing genome-scale weighted gene association networks (WGAN) from multiple data sources is one of the research hot spots in systems biology. In this paper, we employ information entropy to describe the uncertainty degree of gene-gene links and propose a strategy for data integration of weighted networks. We use this method to integrate four existing human weighted gene association networks and construct a much larger WGAN, which includes richer biological information while still keeping high functional relevance between linked gene pairs. The new WGAN shows satisfactory performance in disease gene prediction, which suggests the reliability of our integration strategy. Compared with existing integration methods, our method takes advantage of the inherent characteristics of the component networks and pays less attention to the biological background of the data. It can make full use of existing biological networks with low computational effort.

  19. Entropy and Graph Based Modelling of Document Coherence using Discourse Entities

    DEFF Research Database (Denmark)

    Petersen, Casper; Lioma, Christina; Simonsen, Jakob Grue

    2015-01-01

    We present two novel models of document coherence and their application to information retrieval (IR). Both models approximate document coherence using discourse entities, e.g. the subject or object of a sentence. Our first model views text as a Markov process generating sequences of discourse entities (entity n-grams); we use the entropy of these entity n-grams to approximate the rate at which new information appears in text, reasoning that as more new words appear, the topic increasingly drifts and text coherence decreases. Our second model extends the work of Guinaudeau & Strube [28] that represents text as a graph of discourse entities, linked by different relations, such as their distance or adjacency in text. We use several graph topology metrics to approximate different aspects of the discourse flow that can indicate coherence, such as the average clustering or betweenness of discourse entities.

  20. Chemical Engineering Students' Ideas of Entropy

    Science.gov (United States)

    Haglund, Jesper; Andersson, Staffan; Elmgren, Maja

    2015-01-01

    Thermodynamics, and in particular entropy, has been found to be challenging for students, not least due to its abstract character. Comparisons with more familiar and concrete domains, by means of analogy and metaphor, are commonly used in thermodynamics teaching, in particular the metaphor "entropy is disorder." However, this particular…

  1. Application of the EGM Method to a LED-Based Spotlight: A Constrained Pseudo-Optimization Design Process Based on the Analysis of the Local Entropy Generation Maps

    Directory of Open Access Journals (Sweden)

    Enrico Sciubba

    2011-06-01

    Full Text Available In this paper, the entropy generation minimization (EGM) method is applied to an industrial heat transfer problem: the forced convective cooling of a LED-based spotlight. The design specification calls for eighteen diodes arranged on a circular copper plate of 35 mm diameter. Every diode dissipates 3 W and the maximum allowed temperature of the plate is 80 °C. The cooling relies on the forced convection driven by a jet of air impinging on the plate. An initial complex geometry of plate fins is presented and analyzed with a commercial CFD code that computes the entropy generation rate. A pseudo-optimization process is carried out via a successive series of design modifications based on a careful analysis of the entropy generation maps. One of the advantages of the EGM method is that the rationale behind each step of the design process can be justified on a physical basis. It is found that the best performance is attained when the fins are periodically spaced in the radial direction.

  2. Enthalpy-entropy compensation in protein unfolding

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Enthalpy-entropy compensation was found to be a universal law in protein unfolding, based on over 3,000 experimental data points. Water molecular reorganization accompanying protein unfolding was suggested as the origin of the enthalpy-entropy compensation in protein unfolding. It is indicated that enthalpy-entropy compensation constitutes the physical foundation that satisfies the biological need for small free energy changes in protein unfolding, without sacrificing the bio-diversity of proteins. The enthalpy-entropy compensation theory proposed herein also provides valuable insights into Privalov's puzzle of enthalpy and entropy convergence in protein unfolding.

  3. Controlling the Shannon Entropy of Quantum Systems

    Science.gov (United States)

    Xing, Yifan; Wu, Jun

    2013-01-01

    This paper proposes a new quantum control method which controls the Shannon entropy of quantum systems. For both discrete and continuous entropies, controller design methods are proposed based on probability density function control, which can drive the quantum state to any target state. To drive the entropy to any target at any prespecified time, another discretization method is proposed for the discrete entropy case, and the conditions under which the entropy can be increased or decreased are discussed. Simulations are done on both two- and three-dimensional quantum systems, where division and prediction are used to achieve more accurate tracking. PMID:23818819

  4. Controlling the Shannon Entropy of Quantum Systems

    Directory of Open Access Journals (Sweden)

    Yifan Xing

    2013-01-01

    Full Text Available This paper proposes a new quantum control method which controls the Shannon entropy of quantum systems. For both discrete and continuous entropies, controller design methods are proposed based on probability density function control, which can drive the quantum state to any target state. To drive the entropy to any target at any prespecified time, another discretization method is proposed for the discrete entropy case, and the conditions under which the entropy can be increased or decreased are discussed. Simulations are done on both two- and three-dimensional quantum systems, where division and prediction are used to achieve more accurate tracking.

  5. Excess Entropy and Diffusivity

    Indian Academy of Sciences (India)

    Excess entropy scaling of diffusivity (Rosenfeld, 1977). Analogous relationships also exist for viscosity and thermal conductivity.

  6. Explaining the entropy concept and entropy components

    Directory of Open Access Journals (Sweden)

    Marko Popovic

    2018-04-01

    Full Text Available Total entropy of a thermodynamic system consists of two components: thermal entropy due to energy, and residual entropy due to molecular orientation. In this article, a three-step method for explaining entropy is suggested. Step one is to use a classical method to introduce thermal entropy STM as a function of temperature T and heat capacity at constant pressure Cp: STM = ∫(Cp/T) dT. Thermal entropy is the entropy due to uncertainty in the motion of molecules and vanishes at absolute zero (the zero-point energy state). It is also the measure of useless thermal energy that cannot be converted into useful work. The next step is to introduce residual entropy S0 as a function of the number of molecules N and the number of distinct orientations available to them in a crystal m: S0 = N kB ln m, where kB is the Boltzmann constant. Residual entropy quantifies the uncertainty in molecular orientation. Residual entropy, unlike thermal entropy, is independent of temperature and remains present at absolute zero. The third step is to show that thermal entropy and residual entropy add up to the total entropy of a thermodynamic system S: S = S0 + STM. This method of explanation should result in a better comprehension of residual entropy and thermal entropy, as well as of their similarities and differences. The new method was tested in teaching at the Faculty of Chemistry, University of Belgrade, Serbia. The results of the test show that the new method has the potential to improve the quality of teaching.
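
    The three-step decomposition translates directly into a short calculation. The sketch below integrates Cp/T with the trapezoid rule for the thermal part and adds N kB ln m for the residual part; the heat-capacity table is a made-up constant standing in for real Cp(T) data, and m = 2 mimics a CO-like crystal.

```python
import numpy as np

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

def thermal_entropy(T, Cp):
    """S_TM = integral of (Cp/T) dT over tabulated heat-capacity data,
    evaluated with the trapezoid rule."""
    f = Cp / T
    return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(T))

def residual_entropy(N, m):
    """S_0 = N k_B ln m for m equally likely molecular orientations."""
    return N * k_B * np.log(m)

# toy example: one mole of a CO-like crystal (m = 2) with hypothetical Cp
T = np.linspace(10.0, 298.15, 500)        # K
Cp = np.full_like(T, 29.1)                # J/(mol K), made-up constant value
S_total = thermal_entropy(T, Cp) + residual_entropy(N_A, 2)
print(S_total)                            # J/(mol K); residual part ~ R ln 2
```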

  7. Navigation and Self-Semantic Location of Drones in Indoor Environments by Combining the Visual Bug Algorithm and Entropy-Based Vision.

    Science.gov (United States)

    Maravall, Darío; de Lope, Javier; Fuentes, Juan P

    2017-01-01

    We introduce a hybrid algorithm for the self-semantic location and autonomous navigation of robots using entropy-based vision and visual topological maps. In visual topological maps the visual landmarks are considered as leave points for guiding the robot to reach a target point (robot homing) in indoor environments. These visual landmarks are defined from images of relevant objects or characteristic scenes in the environment. The entropy of an image is directly related to the presence of a unique object or the presence of several different objects inside it: the lower the entropy the higher the probability of containing a single object inside it and, conversely, the higher the entropy the higher the probability of containing several objects inside it. Consequently, we propose the use of the entropy of images captured by the robot not only for the landmark searching and detection but also for obstacle avoidance. If the detected object corresponds to a landmark, the robot uses the suggestions stored in the visual topological map to reach the next landmark or to finish the mission. Otherwise, the robot considers the object as an obstacle and starts a collision avoidance maneuver. In order to validate the proposal we have defined an experimental framework in which the visual bug algorithm is used by an Unmanned Aerial Vehicle (UAV) in typical indoor navigation tasks.
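
    The entropy test at the heart of the method is a one-liner over the image histogram. The sketch below computes it for an 8-bit grayscale frame and applies a hypothetical decision threshold; the actual threshold used on the UAV is not given here.

```python
import numpy as np

def image_entropy(gray):
    """Shannon entropy (bits) of an 8-bit grayscale image: low entropy
    suggests a single dominant object, high entropy a cluttered scene."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist[hist > 0] / hist.sum()
    return -(p * np.log2(p)).sum()

def classify_frame(gray, threshold=4.5):
    """Hypothetical rule: route low-entropy frames to landmark matching,
    high-entropy frames to obstacle avoidance."""
    return ("single-object candidate" if image_entropy(gray) < threshold
            else "cluttered scene")

# toy check: a uniform patch vs. random clutter
rng = np.random.default_rng(0)
flat = np.full((64, 64), 120, dtype=np.uint8)
clutter = rng.integers(0, 256, (64, 64), dtype=np.uint8)
print(image_entropy(flat), image_entropy(clutter))   # ~0 bits vs ~8 bits
```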

  8. Navigation and Self-Semantic Location of Drones in Indoor Environments by Combining the Visual Bug Algorithm and Entropy-Based Vision

    Directory of Open Access Journals (Sweden)

    Darío Maravall

    2017-08-01

    Full Text Available We introduce a hybrid algorithm for the self-semantic location and autonomous navigation of robots using entropy-based vision and visual topological maps. In visual topological maps the visual landmarks are considered as leave points for guiding the robot to reach a target point (robot homing in indoor environments. These visual landmarks are defined from images of relevant objects or characteristic scenes in the environment. The entropy of an image is directly related to the presence of a unique object or the presence of several different objects inside it: the lower the entropy the higher the probability of containing a single object inside it and, conversely, the higher the entropy the higher the probability of containing several objects inside it. Consequently, we propose the use of the entropy of images captured by the robot not only for the landmark searching and detection but also for obstacle avoidance. If the detected object corresponds to a landmark, the robot uses the suggestions stored in the visual topological map to reach the next landmark or to finish the mission. Otherwise, the robot considers the object as an obstacle and starts a collision avoidance maneuver. In order to validate the proposal we have defined an experimental framework in which the visual bug algorithm is used by an Unmanned Aerial Vehicle (UAV in typical indoor navigation tasks.

  9. Zero entropy continuous interval maps and MMLS-MMA property

    Science.gov (United States)

    Jiang, Yunping

    2018-06-01

    We prove that the flow generated by any continuous interval map with zero topological entropy is minimally mean-attractable and minimally mean-L-stable. One of the consequences is that any oscillating sequence is linearly disjoint from all flows generated by all continuous interval maps with zero topological entropy. In particular, the Möbius function is linearly disjoint from all flows generated by all continuous interval maps with zero topological entropy (Sarnak’s conjecture for continuous interval maps). Another consequence is a non-trivial example of a flow having discrete spectrum. We also define a log-uniform oscillating sequence and show a result in ergodic theory for comparison. This material is based upon work supported by the National Science Foundation. It is also partially supported by a collaboration grant from the Simons Foundation (grant number 523341) and PSC-CUNY awards and a grant from NSFC (grant number 11571122).

  10. Vector entropy imaging theory with application to computerized tomography

    International Nuclear Information System (INIS)

    Wang Yuanmei; Cheng Jianping; Heng, Pheng Ann

    2002-01-01

    Medical imaging theory for x-ray CT and PET is based on image reconstruction from projections. In this paper a novel vector entropy imaging theory under the framework of multiple criteria decision making is presented. We also study the most frequently used image reconstruction methods, namely, least squares, maximum entropy, and filtered back-projection, under the framework of single performance criterion optimization. Finally, we introduce some of the results obtained by the various reconstruction algorithms using computer-generated noisy projection data from the Hoffman phantom and real CT scanner data. Comparison of the reconstructed images indicates that the vector entropy method gives the best performance in terms of error (the difference between the original phantom data and the reconstruction), smoothness (suppression of noise), and grey value resolution, and is free of ghost images. (author)

  11. An exploration for the macroscopic physical meaning of entropy

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The macroscopic physical meaning of entropy is analyzed based on the exergy (availability) of a combined system (a closed system and its environment), which is the maximum amount of useful work obtainable from the system and the environment as the system is brought into equilibrium with the environment. The process the system experiences can be divided into two sequential sub-processes: the constant-volume process, which represents the heat interaction of the system with the environment, and the adiabatic process, which represents the work interaction of the system with the environment. It is shown that the macroscopic physical meaning of entropy is a measure of the unavailable energy of a closed system for doing useful work through heat interaction. This statement is more precise than those reported in the prior literature. The unavailability function of a closed system can be defined as T0S and p0V in the constant-volume process and the adiabatic process, respectively. Their changes, that is, Δ(T0S) and Δ(p0V), represent the unusable parts of the internal energy of a closed system for doing useful work in the corresponding processes. Finally, the relation between Clausius entropy and Boltzmann entropy is discussed based on a comparison of their expressions for absolute entropy.

  12. Minimum Error Entropy Classification

    CERN Document Server

    Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A

    2013-01-01

    This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners also find in the book a detailed presentation of practical data classifiers using MEE. These include multi‐layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using a MEE‐like concept is also presented. Examples, tests, evaluation experiments and comparison with similar machines using classic approaches, complement the descriptions.

  13. An Entropy-Based Upper Bound Methodology for Robust Predictive Multi-Mode RCPSP Schedules

    Directory of Open Access Journals (Sweden)

    Angela Hsiang-Ling Chen

    2014-09-01

    Full Text Available Projects are an important part of our activities and, regardless of their magnitude, scheduling is at the very core of every project. In an ideal world makespan minimization, which is the most commonly sought objective, would give us an advantage. However, every time we execute a project we have to deal with uncertainty; part of it coming from known sources and part remaining unknown until it affects us. For this reason, it is much more practical to focus on making our schedules robust, capable of handling uncertainty, and even to determine a range in which the project could be completed. In this paper we focus on an approach to determine such a range for the Multi-mode Resource Constrained Project Scheduling Problem (MRCPSP), a widely researched, NP-complete problem, but without adding any subjective considerations to its estimation. We do this by using a concept well known in the domain of thermodynamics, entropy, and a three-stage approach. First we use Artificial Bee Colony (ABC), an effective and powerful meta-heuristic, to determine a schedule with minimized makespan which serves as a lower bound. The second stage defines buffer times and creates an upper-bound makespan using an entropy function, with the advantage over other methods that it only considers elements which are inherent to the schedule itself and does not introduce any subjectivity to the buffer time generation. In the last stage, we use the ABC algorithm with an objective function that seeks to maximize robustness while staying within the makespan boundaries defined previously and in some cases even below the lower boundary. We evaluate our approach with two different benchmark sets: when using the PSPLIB for the MRCPSP benchmark set, the computational results indicate that it is possible to generate robust schedules which generally result in an increase of less than 10% of the best known solutions while increasing the robustness in at least 20% for practically every

  14. Entropy and cosmology.

    Science.gov (United States)

    Zucker, M. H.

    This paper is a critical analysis and reassessment of entropic functioning as it applies to the question of whether the ultimate fate of the universe will be determined in the future to be "open" (expanding forever to expire in a big chill), "closed" (collapsing to a big crunch), or "flat" (balanced forever between the two). The second law of thermodynamics declares that entropy can only increase and that this principle extends, inevitably, to the universe as a whole. This paper takes the position that this extension is an unwarranted projection based neither on experience nor fact - an extrapolation that ignores the powerful effect of a gravitational force acting within a closed system. Since it was originally presented by Clausius, the thermodynamic concept of entropy has been redefined in terms of "order" and "disorder" - order being equated with a low degree of entropy and disorder with a high degree. This revised terminology, more subjective than precise, has generated considerable confusion in cosmology in several critical instances. For example - the chaotic fireball of the big bang, interpreted by Stephen Hawking as a state of disorder (high entropy), is infinitely hot and, thermally, represents zero entropy (order). Hawking, apparently focusing on the disorderly "chaotic" aspect, equated it with a high degree of entropy - overlooking the fact that the universe is a thermodynamic system and that the key factor in evaluating the big-bang phenomenon is the infinitely high temperature of the early universe, which can only be equated with zero entropy. This analysis resolves this confusion and reestablishes entropy as a cosmological function integrally linked to temperature. The paper goes on to show that, while all subsystems contained within the universe require external sources of energization to have their temperatures raised, this requirement does not apply to the universe as a whole. The universe is the only system that, by itself can raise its own

  15. Access Selection Algorithm of Heterogeneous Wireless Networks for Smart Distribution Grid Based on Entropy-Weight and Rough Set

    Science.gov (United States)

    Xiang, Min; Qu, Qinqin; Chen, Cheng; Tian, Li; Zeng, Lingkang

    2017-11-01

    To improve the reliability of communication services in the smart distribution grid (SDG), an access selection algorithm based on dynamic network status and different service types for heterogeneous wireless networks is proposed. The network performance index values are obtained in real time by a multimode terminal and the variation trend of the index values is analyzed by the growth matrix. The index weights are calculated by the entropy-weight method and then modified by rough set theory to get the final weights. Grey relational analysis is then used to rank the candidate networks, and the optimal communication network is selected. Simulation results show that the proposed algorithm can dynamically implement access selection in the heterogeneous wireless networks of SDG effectively and reduce the network blocking probability.

  16. Wavelet entropy characterization of elevated intracranial pressure.

    Science.gov (United States)

    Xu, Peng; Scalzo, Fabien; Bergsneider, Marvin; Vespa, Paul; Miller, Chad; Hu, Xiao

    2008-01-01

    Intracranial hypertension (ICH) often occurs in patients with traumatic brain injury (TBI), stroke, tumor, etc. The pathology of ICH is still controversial. In this work, we used wavelet entropy and relative wavelet entropy to study, for the first time, the difference between the normal and hypertension states of ICP. The wavelet entropy revealed findings similar to those of approximate entropy: entropy during the ICH state is smaller than in the normal state. Moreover, with wavelet entropy, we can see that the ICH state has more focused energy in the low wavelet frequency band (0-3.1 Hz) than the normal state. The relative wavelet entropy shows that the energy distribution in the wavelet bands between these two states is actually different. Based on these results, we suggest that ICH may be formed by the re-allocation of oscillation energy within the brain.

  17. Analysis of Neural Oscillations on Drosophila’s Subesophageal Ganglion Based on Approximate Entropy

    Directory of Open Access Journals (Sweden)

    Tian Mei

    2015-10-01

    Full Text Available The suboesophageal ganglion (SOG), which connects to both central and peripheral nerves, is the primary taste-processing center in the Drosophila brain. The neural oscillation in this center may be of great research value, yet it is rarely reported. This work aims to determine the amount of unique information contained within oscillations of the SOG and describe the variability of these patterns. The approximate entropy (ApEn) values of the spontaneous membrane potential (sMP) of SOG neurons were calculated in this paper. The arithmetic mean (MA), standard deviation (SDA) and coefficient of variation (CVA) of ApEn were proposed as the three statistical indicators to describe the irregularity and complexity of oscillations. The hierarchical clustering method was used to classify them. As a result, the oscillations in the SOG were divided into five categories: (1) continuous spike pattern; (2) mixed oscillation pattern; (3) spikelet pattern; (4) bursting pattern; and (5) sparse spike pattern. A steady oscillation state has a low level of irregularity, and vice versa. Dopamine stimulation can distinctly reduce the complexity of the mixed oscillation pattern. The current study provides a quantitative method and some criteria for mining the information carried in neural oscillations.
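
    Approximate entropy and the three summary indicators are straightforward to reproduce. The sketch below follows Pincus' definition (self-matches included) and then computes MA, SDA, and CVA over sliding windows; the window sizes and the r = 0.2·SD tolerance are conventional defaults, not values from the paper.

```python
import numpy as np

def approximate_entropy(x, m=2, r=0.2):
    """ApEn(m, r) of a 1-D signal; r is given as a fraction of the SD."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    N = len(x)

    def phi(mm):
        t = np.array([x[i:i + mm] for i in range(N - mm + 1)])
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)
        C = (d <= tol).mean(axis=1)        # self-matches included, so C > 0
        return np.log(C).mean()

    return phi(m) - phi(m + 1)

def apen_statistics(x, win=500, step=250):
    """MA, SDA, and CVA of ApEn over sliding windows of the recording."""
    vals = np.array([approximate_entropy(x[i:i + win])
                     for i in range(0, len(x) - win + 1, step)])
    return vals.mean(), vals.std(), vals.std() / vals.mean()
```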

  18. Assessment of Urban Ecosystem Health Based on Entropy Weight Extension Decision Model in Urban Agglomeration

    Directory of Open Access Journals (Sweden)

    Qian Yang

    2016-08-01

    Full Text Available Urban ecosystem health evaluation can assist in sustainable ecological management at a regional level. This study examined urban agglomeration ecosystem health in the middle reaches of the Yangtze River with entropy weight and extension theories. The model overcomes the information omission and subjectivity problems in the evaluation process of urban ecosystem health. Results showed that human capital and education, economic development level, and urban infrastructure have a significant effect on the health states of urban agglomerations. The health status of the urban agglomeration's ecosystem was not optimistic in 2013. The majority of the cities were unhealthy or verging on unhealthy, accounting for 64.52% of the total number of cities in the urban agglomeration. The regional differences in the 31 cities' ecosystem health are significant, originating from imbalanced economic development and the policy guidance of city development. It is necessary to speed up the integration process to promote coordinated regional development. The present study will aid in understanding and improving the health situation of the urban ecosystem in the middle reaches of the Yangtze River and provides an efficient urban ecosystem health evaluation method that can be used in other areas.

  19. Entropy-Based Application Layer DDoS Attack Detection Using Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Khundrakpam Johnson Singh

    2016-10-01

    Full Text Available Distributed denial-of-service (DDoS) attack is one of the major threats to the web server. The rapid increase of DDoS attacks on the Internet has clearly pointed out the limitations in current intrusion detection systems or intrusion prevention systems (IDS/IPS), mostly caused by application-layer DDoS attacks. Within this context, the objective of the paper is to detect a DDoS attack using a multilayer perceptron (MLP) classification algorithm with a genetic algorithm (GA) as the learning algorithm. In this work, we analyzed the standard EPA-HTTP (Environmental Protection Agency-hypertext transfer protocol) dataset and selected the parameters that will be used as input to the classifier model for differentiating the attack from a normal profile. The parameters selected are the HTTP GET request count, entropy, and variance for every connection. The proposed model can provide a better accuracy of 98.31%, a sensitivity of 0.9962, and a specificity of 0.0561 when compared to other traditional classification models.
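
    The per-connection features feeding the classifier can be sketched in a few lines. Below, entropy and variance are computed over the distribution of requested resources within one connection; the paper does not spell out this exact definition, so treat it as a plausible reading rather than the authors' code.

```python
import numpy as np
from collections import Counter

def connection_features(requests):
    """Candidate per-connection features in the spirit of the paper:
    HTTP GET request count, entropy, and variance. `requests` is the
    list of resources requested on one connection."""
    counts = np.array(list(Counter(requests).values()), dtype=float)
    p = counts / counts.sum()
    entropy = -(p * np.log2(p)).sum()      # low for repetitive floods
    return len(requests), entropy, counts.var()

# toy contrast: a repetitive flood vs. a varied legitimate session
flood = ["/index.html"] * 50
legit = ["/", "/css/a.css", "/img/b.png", "/about", "/js/c.js"]
print(connection_features(flood))   # high count, zero entropy
print(connection_features(legit))   # low count, high entropy
```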

  20. A Maximum Entropy-Based Chaotic Time-Variant Fragile Watermarking Scheme for Image Tampering Detection

    Directory of Open Access Journals (Sweden)

    Guo-Jheng Yang

    2013-08-01

    Full Text Available The fragile watermarking technique is used to protect intellectual property rights while also providing security and rigorous protection. In order to protect the copyright of creators, it can be implanted in some representative text or totem. Because all of the media on the Internet are digital, protection has become a critical issue, and determining how to use digital watermarks to protect digital media is thus the topic of our research. This paper uses the logistic map with parameter μ = 4 to generate chaotic dynamic behavior with the maximum entropy of 1. This approach increases the security and rigor of the protection. The main research target of information hiding is determining how to hide confidential data so that the naked eye cannot see the difference. Next, we introduce one method of information hiding. Generally speaking, if the image only goes through Arnold's cat map and the logistic map, it seems to lack sufficient security. Therefore, our emphasis is on controlling Arnold's cat map and the initial value of the chaos system to undergo small changes and generate different chaos sequences. Thus, the current time is used to not only make encryption more stringent but also to enhance the security of the digital media.
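
    The chaotic keystream generator is easy to reproduce: at μ = 4 the logistic map is fully chaotic, and symbolizing its orbit at the threshold 1/2 yields a binary sequence whose entropy approaches 1 bit per symbol. A minimal sketch, with a hypothetical initial value standing in for the time-derived seed described in the paper:

```python
import numpy as np

def logistic_sequence(x0, n, mu=4.0):
    """Iterate x_{k+1} = mu * x_k * (1 - x_k); mu = 4 gives full chaos."""
    x = np.empty(n)
    x[0] = x0
    for k in range(n - 1):
        x[k + 1] = mu * x[k] * (1.0 - x[k])
    return x

def binary_entropy_of_sequence(x):
    """Shannon entropy (bits/symbol) of the orbit thresholded at 1/2."""
    p1 = (x >= 0.5).mean()
    p = np.array([1.0 - p1, p1])
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

x = logistic_sequence(0.3141, 10000)       # initial value is hypothetical
print(binary_entropy_of_sequence(x))       # close to 1 bit per symbol
```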

  1. Pathological brain detection based on wavelet entropy and Hu moment invariants.

    Science.gov (United States)

    Zhang, Yudong; Wang, Shuihua; Sun, Ping; Phillips, Preetha

    2015-01-01

    With the aim of developing an accurate pathological brain detection system, we propose a novel automatic computer-aided diagnosis (CAD) system to detect pathological brains from normal brains obtained by magnetic resonance imaging (MRI) scanning. The problem remains a challenge for technicians and clinicians, since MR imaging generates an exceptionally large information dataset. A new two-step approach is proposed in this study. We used wavelet entropy (WE) and Hu moment invariants (HMI) for feature extraction, and the generalized eigenvalue proximal support vector machine (GEPSVM) for classification. To further enhance classification accuracy, the popular radial basis function (RBF) kernel was employed. The results of 10 runs of stratified k-fold cross validation showed that the proposed "WE + HMI + GEPSVM + RBF" method was superior to existing methods w.r.t. classification accuracy. It obtained average classification accuracies of 100%, 100%, and 99.45% over Dataset-66, Dataset-160, and Dataset-255, respectively. The proposed method is effective and can be applied to realistic use.
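
    For a flavor of the WE feature, the sketch below shows one standard definition of wavelet entropy, the Shannon entropy of the relative wavelet energies across decomposition subbands, computed with the PyWavelets package on a 1-D signal. The paper works on 2-D MR slices and does not spell out its exact variant, so treat this as an assumption:

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_entropy(signal, wavelet="db4", level=3):
    """Shannon entropy of the relative wavelet energy across the
    subbands of a discrete wavelet decomposition."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
print(wavelet_entropy(rng.standard_normal(256)))
```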

  2. Information-Dispersion-Entropy-Based Blind Recognition of Binary BCH Codes in Soft Decision Situations

    Directory of Open Access Journals (Sweden)

    Yimeng Zhang

    2013-05-01

    Full Text Available A method for blind recognition of the coding parameters of binary Bose-Chaudhuri-Hocquenghem (BCH) codes is proposed in this paper. We consider an intelligent communication receiver which can blindly recognize the coding parameters of the received data stream. The only prior knowledge is that the stream is encoded using binary BCH codes, while the coding parameters are unknown. The problem arises in the context of non-cooperative communications and of adaptive coding and modulation (ACM) for cognitive radio networks. The recognition processing includes two major procedures: code length estimation and generator polynomial reconstruction. A hard-decision method has been proposed in earlier literature. In this paper we propose a recognition approach for soft-decision situations with binary phase-shift keying (BPSK) modulation and additive white Gaussian noise (AWGN) channels. The code length is estimated by maximizing the root information dispersion entropy function, and we then search for the code roots to reconstruct the primitive and generator polynomials. By utilizing the soft output of the channel, the recognition performance is improved, and simulations show the efficiency of the proposed algorithm.

  3. Environmental efficiency analysis of power industry in China based on an entropy SBM model

    International Nuclear Information System (INIS)

    Zhou, Yan; Xing, Xinpeng; Fang, Kuangnan; Liang, Dapeng; Xu, Chunlin

    2013-01-01

    In order to assess the environmental efficiency of the power industry in China, this paper first proposes a new non-radial DEA approach by integrating the entropy weight and the SBM model. This improves the reliability and reasonableness of the assessment. Using the model, this study then evaluates the environmental efficiency of the Chinese power industry at the provincial level during 2005–2010. The results show a marked difference in environmental efficiency of the power industry among Chinese provinces. Although the annual average environmental efficiency level fluctuates, there is an increasing trend. The Tobit regression analysis reveals that the innovation ability of enterprises, the proportion of electricity generated by coal-fired plants and the generation capacity have a significantly positive effect on environmental efficiency. However, the waste fees levied on waste discharge and the investment in industrial pollutant treatment are negatively associated with environmental efficiency. - Highlights: ► We assess the environmental efficiency of the power industry in China by an E-SBM model. ► Environmental efficiency of the power industry differs among provinces. ► Efficiency stays at a higher level in the eastern and the western areas. ► The proportion of coal-fired plants has a positive effect on efficiency. ► Waste fees and investment have a negative effect on efficiency.

  4. Tackling Information Asymmetry in Networks: A New Entropy-Based Ranking Index

    Science.gov (United States)

    Barucca, Paolo; Caldarelli, Guido; Squartini, Tiziano

    2018-06-01

    Information is a valuable asset in socio-economic systems, a significant part of which is entailed in the network of connections between agents. The different interlinkage patterns that agents establish may, in fact, lead to asymmetries in the knowledge of the network structure; since this entails a different ability to quantify relevant systemic properties (e.g. the risk of contagion in a network of liabilities), agents capable of providing a better estimation of (otherwise) inaccessible network properties ultimately have a competitive advantage. In this paper, we address the issue of quantifying the information asymmetry of nodes: to this aim, we define a novel index, InfoRank, intended to rank nodes according to their information content. In order to do so, each node's ego-network is enforced as a constraint of an entropy-maximization problem and the subsequent uncertainty reduction is used to quantify the node-specific accessible information. We then test the performance of our ranking procedure in terms of reconstruction accuracy and show that it outperforms other centrality measures in identifying the "most informative" nodes. Finally, we discuss the socio-economic implications of network information asymmetry.

  5. Two-dimensional maximum entropy image restoration

    International Nuclear Information System (INIS)

    Brolley, J.E.; Lazarus, R.B.; Suydam, B.R.; Trussell, H.J.

    1977-07-01

    An optical check problem was constructed to test p log p maximum entropy restoration of an extremely distorted image. Useful recovery of the original image was obtained. Comparison with maximum a posteriori restoration is made. 7 figures

  6. Comparison of entropy production rates in two different types of self-organized flows: Benard convection and zonal flow

    International Nuclear Information System (INIS)

    Kawazura, Y.; Yoshida, Z.

    2012-01-01

    Two different types of self-organizing and self-sustaining ordered motion in fluids or plasmas, the Bénard convection (or streamer) and the zonal flow, have been compared by introducing a thermodynamic phenomenological model and evaluating the corresponding entropy production rates (EP). These two systems have different topologies in their equivalent circuits: the Bénard convection is modeled by a parallel connection of linear and nonlinear conductances, while the zonal flow is modeled by a series connection. The "power supply" that drives the systems is also a determinant of operating modes. When the energy flux is a control parameter (as in usual plasma experiments), the driver is modeled by a constant-current power supply, and when the temperature difference between two separate boundaries is controlled (as in usual computational studies), the driver is modeled by a constant-voltage power supply. The parallel (series)-connection system tends to minimize (maximize) the total EP when a constant-current power supply drives the system. This minimum/maximum relation flips when a constant-voltage power supply is connected.

  7. Efficient Computation of Multiscale Entropy over Short Biomedical Time Series Based on Linear State-Space Models

    Directory of Open Access Journals (Sweden)

    Luca Faes

    2017-01-01

    Full Text Available The most common approach to assess the dynamical complexity of a time series across multiple temporal scales makes use of the multiscale entropy (MSE and refined MSE (RMSE measures. In spite of their popularity, MSE and RMSE lack an analytical framework allowing their calculation for known dynamic processes and cannot be reliably computed over short time series. To overcome these limitations, we propose a method to assess RMSE for autoregressive (AR stochastic processes. The method makes use of linear state-space (SS models to provide the multiscale parametric representation of an AR process observed at different time scales and exploits the SS parameters to quantify analytically the complexity of the process. The resulting linear MSE (LMSE measure is first tested in simulations, both theoretically to relate the multiscale complexity of AR processes to their dynamical properties and over short process realizations to assess its computational reliability in comparison with RMSE. Then, it is applied to the time series of heart period, arterial pressure, and respiration measured for healthy subjects monitored in resting conditions and during physiological stress. This application to short-term cardiovascular variability documents that LMSE can describe better than RMSE the activity of physiological mechanisms producing biological oscillations at different temporal scales.
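
    For reference, here is a plain sketch of the classic (non-refined) MSE pipeline that LMSE is benchmarked against: coarse-grain the series at each scale, then estimate sample entropy. The parameter choices (m = 2, r = 0.2·SD) are the conventional defaults, not necessarily those used in the paper:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) with r = r_factor * std(x) and a
    Chebyshev distance between embedding vectors."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    N = len(x)

    def count_pairs(mm):
        # number of template pairs (i < j) within tolerance r
        templ = np.array([x[i:i + mm] for i in range(N - m)])
        c = 0
        for i in range(len(templ) - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += int(np.sum(d <= r))
        return c

    b, a = count_pairs(m), count_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(x, scales=(1, 2, 3, 4, 5)):
    """Classic MSE: non-overlapping averaging at each scale, then SampEn."""
    out = []
    for s in scales:
        n = len(x) // s
        cg = np.asarray(x[:n * s]).reshape(n, s).mean(axis=1)
        out.append(sample_entropy(cg))
    return out

rng = np.random.default_rng(1)
print(multiscale_entropy(rng.standard_normal(500)))
```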

  8. Electroencephalogram–Electromyography Coupling Analysis in Stroke Based on Symbolic Transfer Entropy

    Directory of Open Access Journals (Sweden)

    Yunyuan Gao

    2018-01-01

    Full Text Available The coupling strength between electroencephalogram (EEG) and electromyography (EMG) signals during motion control reflects the interaction between the cerebral motor cortex and the muscles. Therefore, neuromuscular coupling characterization is instructive in assessing motor function. In this study, to overcome the loss of signal characteristics in conventional time series symbolization methods, a variable scale symbolic transfer entropy (VS-STE) analysis approach was proposed for corticomuscular coupling evaluation. Post-stroke patients (n = 5) and healthy volunteers (n = 7) were recruited and participated in various tasks (left and right hand gripping, elbow bending). The proposed VS-STE was employed to evaluate the corticomuscular coupling strength between the EEG signal measured from the motor cortex and the EMG signal measured from the upper limb, in both the time domain and the frequency domain. Results showed a greater strength of the bi-directional (EEG-to-EMG and EMG-to-EEG) VS-STE in post-stroke patients compared to healthy controls. In addition, the strongest EEG–EMG coupling strength was observed in the beta frequency band (15–35 Hz) during upper limb movement. The coupling strength of EMG-to-EEG in the affected side of the patients was larger than that of EEG-to-EMG. In conclusion, the results suggest that the corticomuscular coupling is bi-directional, and the proposed VS-STE can be used to quantitatively characterize the non-linear synchronization characteristics and information interaction between the primary motor cortex and the muscles.
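
    The paper's VS-STE adds a variable-scale refinement on top of plain symbolic transfer entropy; the sketch below shows only the plain STE baseline (ordinal-pattern symbolization followed by a transfer entropy estimate over symbol sequences), as one plausible reading of the underlying measure:

```python
import numpy as np
from collections import Counter

def symbolize(x, m=3):
    """Map each length-m window to its ordinal pattern (the permutation
    that sorts it), the standard STE symbolization."""
    patterns, symbols = {}, []
    for i in range(len(x) - m + 1):
        key = tuple(np.argsort(x[i:i + m]))
        symbols.append(patterns.setdefault(key, len(patterns)))
    return symbols

def symbolic_transfer_entropy(x, y, m=3):
    """STE from X to Y estimated from joint symbol frequencies:
    sum over p(y1, y0, x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    sx, sy = symbolize(x, m), symbolize(y, m)
    triples = Counter(zip(sy[1:], sy[:-1], sx[:-1]))
    pairs_yy = Counter(zip(sy[1:], sy[:-1]))
    pairs_yx = Counter(zip(sy[:-1], sx[:-1]))
    singles_y = Counter(sy[:-1])
    n = len(sy) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_y = pairs_yy[(y1, y0)] / singles_y[y0]
        te += (c / n) * np.log2(p_cond_full / p_cond_y)
    return te

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(2000)  # y is driven by x
print(symbolic_transfer_entropy(x, y), symbolic_transfer_entropy(y, x))
```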

  9. Bubble Entropy: An Entropy Almost Free of Parameters.

    Science.gov (United States)

    Manis, George; Aktaruzzaman, Md; Sassi, Roberto

    2017-11-01

    Objective: A critical point in any definition of entropy is the selection of the parameters employed to obtain an estimate in practice. We propose a new definition of entropy aiming to reduce the significance of this selection. Methods: We call the new definition Bubble Entropy. Bubble Entropy is based on permutation entropy, where the vectors in the embedding space are ranked. We use the bubble sort algorithm for the ordering procedure and instead count the number of swaps performed for each vector. Doing so, we create a more coarse-grained distribution and then compute the entropy of this distribution. Results: Experimental results with both real and synthetic HRV signals showed that bubble entropy presents remarkable stability and exhibits increased descriptive and discriminating power compared to all other definitions, including the most popular ones. Conclusion: The definition proposed is almost free of parameters. The most common ones are the scale factor r and the embedding dimension m. In our definition, the scale factor is totally eliminated and the importance of m is significantly reduced. The proposed method presents increased stability and discriminating power. Significance: After the extensive use of some entropy measures in physiological signals, typical values for their parameters have been suggested, or at least widely used. However, the parameters are still there, application and dataset dependent, influencing the computed value and affecting the descriptive power. Reducing their significance or eliminating them alleviates the problem, decoupling the method from the data and the application, and eliminating subjective factors.
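
    A small sketch of the procedure as described (embed, count the bubble-sort swaps per vector, take the entropy of the swap-count distribution at dimensions m and m+1). The Rényi-2 entropy and the normalization follow the published definition as best it can be reconstructed from the abstract, so treat the details as an approximation:

```python
import math
import random
from collections import Counter

def bubble_swaps(v):
    """Number of swaps bubble sort needs to order v (its inversion count)."""
    v, swaps = list(v), 0
    for i in range(len(v)):
        for j in range(len(v) - 1 - i):
            if v[j] > v[j + 1]:
                v[j], v[j + 1] = v[j + 1], v[j]
                swaps += 1
    return swaps

def swap_entropy(x, m):
    """Renyi entropy of order 2 of the swap-count distribution over all
    length-m embedding vectors."""
    counts = Counter(bubble_swaps(x[i:i + m]) for i in range(len(x) - m + 1))
    n = sum(counts.values())
    return -math.log(sum((c / n) ** 2 for c in counts.values()))

def bubble_entropy(x, m=10):
    """Normalized difference of swap entropies at dimensions m+1 and m."""
    return (swap_entropy(x, m + 1) - swap_entropy(x, m)) / math.log((m + 1) / (m - 1))

random.seed(0)
print(bubble_entropy([random.random() for _ in range(300)]))
```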

  10. SpatEntropy: Spatial Entropy Measures in R

    OpenAIRE

    Altieri, Linda; Cocchi, Daniela; Roli, Giulia

    2018-01-01

    This article illustrates how to measure the heterogeneity of spatial data presenting a finite number of categories via computation of spatial entropy. The R package SpatEntropy contains functions for the computation of entropy and spatial entropy measures. The extension to spatial entropy measures is a unique feature of SpatEntropy. In addition to the traditional version of Shannon's entropy, the package includes Batty's spatial entropy, O'Neill's entropy, Li and Reynolds' contagion index, Ka...

  11. Entropy, related thermodynamic properties, and structure of methylisocyanate

    International Nuclear Information System (INIS)

    Davis, Phil S.; Kilpatrick, John E.

    2013-01-01

    Highlights: ► The thermodynamic properties of methylisocyanate have been determined by isothermal calorimetry from 15 to 298.15 K. ► The third-law entropy has been compared with the entropy calculated by statistical thermodynamics. ► The comparisons are consistent with selected proposed molecular structures and vibrational frequencies. -- Abstract: The entropy and related thermodynamic properties of methylisocyanate, CH3NCO, have been determined by isothermal calorimetry. The entropy in the ideal gas state at 298.15 K and 1 atmosphere is S°m = 284.3 ± 0.6 J/(K·mol). Other thermodynamic properties determined include: the heat capacity from 15 to 300 K, the temperature of fusion (Tfus = 178.461 ± 0.024 K), the enthalpy of fusion (ΔHfus = 7455.2 ± 14.0 J/mol), the enthalpy of vaporization at 298.15 K (ΔHvap = 28768 ± 54 J/mol), and the vapor pressure from fusion to 300 K. Using statistical thermodynamics, the entropy in this same state has been calculated for various assumed structures of methylisocyanate which have been proposed based on several spectroscopic and ab initio results. Comparisons between the experimental and calculated entropy have led to the following conclusions concerning historical differences among problematic structural properties: (1) The CNC/CNO angles can have the paired values of 140/180° or 135/173°, respectively. It is not possible to distinguish between the two by this thermodynamic analysis. (2) The methyl group functions as a free rotor or near-free rotor against the rigid NCO frame. The barrier to internal rotation is less than 2100 J/mol. (3) The CNC vibrational bending frequency is consistent with the more recently observed assignments at 165 and 172 cm−1 with some degree of anharmonicity, or with a pure harmonic at about 158 cm−1.

  12. Entropy and Quantum Gravity

    Directory of Open Access Journals (Sweden)

    Bernard S. Kay

    2015-12-01

    Full Text Available We give a review, in the style of an essay, of the author’s 1998 matter-gravity entanglement hypothesis which, unlike the standard approach to entropy based on coarse-graining, offers a definition for the entropy of a closed system as a real and objective quantity. We explain how this approach offers an explanation for the Second Law of Thermodynamics in general and a non-paradoxical understanding of information loss during black hole formation and evaporation in particular. It also involves a description of black hole equilibrium states that differs radically from the usual one, in which the total state of a black hole in a box together with its atmosphere is a pure state, entangled in just such a way that the reduced states of the black hole and of its atmosphere are each separately approximately thermal. We also briefly recall some recent work of the author which involves a reworking of the string-theory understanding of black hole entropy consistent with this alternative description of black hole equilibrium states, and point out that it is free from some unsatisfactory features of the usual string-theory understanding. We also recall the author’s recent arguments based on this alternative description which suggest that the Anti-de Sitter space (AdS)/conformal field theory (CFT) correspondence is a bijection between the boundary CFT and just the matter degrees of freedom of the bulk theory.

  13. Entropy in molecular recognition by proteins.

    Science.gov (United States)

    Caro, José A; Harpole, Kyle W; Kasinath, Vignesh; Lim, Jackwee; Granja, Jeffrey; Valentine, Kathleen G; Sharp, Kim A; Wand, A Joshua

    2017-06-20

    Molecular recognition by proteins is fundamental to molecular biology. Dissection of the thermodynamic energy terms governing protein-ligand interactions has proven difficult, with determination of entropic contributions being particularly elusive. NMR relaxation measurements have suggested that changes in protein conformational entropy can be quantitatively obtained through a dynamical proxy, but the generality of this relationship has not been shown. Twenty-eight protein-ligand complexes are used to show a quantitative relationship between measures of fast side-chain motion and the underlying conformational entropy. We find that the contribution of conformational entropy can range from favorable to unfavorable, which demonstrates the potential of this thermodynamic variable to modulate protein-ligand interactions. For about one-quarter of these complexes, the absence of conformational entropy would render the resulting affinity biologically meaningless. The dynamical proxy for conformational entropy or "entropy meter" also allows for refinement of the contributions of solvent entropy and the loss in rotational-translational entropy accompanying formation of high-affinity complexes. Furthermore, structure-based application of the approach can also provide insight into long-lived specific water-protein interactions that escape the generic treatments of solvent entropy based simply on changes in accessible surface area. These results provide a comprehensive and unified view of the general role of entropy in high-affinity molecular recognition by proteins.

  14. Attractor comparisons based on density

    International Nuclear Information System (INIS)

    Carroll, T. L.

    2015-01-01

    Recognizing a chaotic attractor can be seen as a problem in pattern recognition. Some feature vector must be extracted from the attractor and used to compare it to other attractors. The field of machine learning has many methods for extracting feature vectors, including clustering methods, decision trees, support vector machines, and many others. In this work, feature vectors are created by representing the attractor as a density in phase space and creating polynomials based on this density. Density is useful in itself because it is a scalar function of phase-space position, but representing an attractor as a density is also a way to reduce the size of a large data set before analyzing it with graph theory methods, which can be computationally intensive. The density computation in this paper is also fast to execute. In this paper, as a demonstration of the usefulness of density, the density is used directly to construct phase-space polynomials for comparing attractors. Comparisons between attractors could be useful for tracking changes in an experiment when the underlying equations are too complicated for vector field modeling
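
    One simple way to realize density-based feature vectors, offered as an illustration rather than the paper's exact construction: bin the trajectory into a normalized phase-space histogram and take low-order polynomial moments of that density as the features:

```python
import numpy as np

def density_features(traj, bins=10, max_order=2):
    """Normalized phase-space density of a trajectory, summarized by
    low-order polynomial moments along each coordinate."""
    hist, edges = np.histogramdd(traj, bins=bins)
    density = hist / hist.sum()
    centers = [0.5 * (e[:-1] + e[1:]) for e in edges]
    grids = np.meshgrid(*centers, indexing="ij")
    feats = []
    for axis_vals in grids:
        for k in range(1, max_order + 1):
            feats.append(np.sum(density * axis_vals ** k))
    return np.array(feats)

# Features of a noisy 2-D limit cycle.
t = np.linspace(0, 20 * np.pi, 5000)
rng = np.random.default_rng(3)
traj = np.column_stack([np.cos(t), np.sin(t)]) + 0.01 * rng.standard_normal((5000, 2))
print(density_features(traj))
```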

  15. Integrating Entropy-Based Naïve Bayes and GIS for Spatial Evaluation of Flood Hazard.

    Science.gov (United States)

    Liu, Rui; Chen, Yun; Wu, Jianping; Gao, Lei; Barrett, Damian; Xu, Tingbao; Li, Xiaojuan; Li, Linyi; Huang, Chang; Yu, Jia

    2017-04-01

    Regional flood risk caused by intensive rainfall under extreme climate conditions has increasingly attracted global attention. Mapping and evaluation of flood hazard are vital parts in flood risk assessment. This study develops an integrated framework for estimating spatial likelihood of flood hazard by coupling weighted naïve Bayes (WNB), geographic information system, and remote sensing. The north part of Fitzroy River Basin in Queensland, Australia, was selected as a case study site. The environmental indices, including extreme rainfall, evapotranspiration, net-water index, soil water retention, elevation, slope, drainage proximity, and density, were generated from spatial data representing climate, soil, vegetation, hydrology, and topography. These indices were weighted using the statistics-based entropy method. The weighted indices were input into the WNB-based model to delineate a regional flood risk map that indicates the likelihood of flood occurrence. The resultant map was validated by the maximum inundation extent extracted from moderate resolution imaging spectroradiometer (MODIS) imagery. The evaluation results, including mapping and evaluation of the distribution of flood hazard, are helpful in guiding flood inundation disaster responses for the region. The novel approach presented consists of weighted grid data, image-based sampling and validation, cell-by-cell probability inferring and spatial mapping. It is superior to an existing spatial naive Bayes (NB) method for regional flood hazard assessment. It can also be extended to other likelihood-related environmental hazard studies. © 2016 Society for Risk Analysis.
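
    The "statistics-based entropy method" used to weight the indices is, in its usual formulation, the entropy weight method; a minimal sketch follows, assuming a non-negative sample-by-index matrix whose columns have already been normalized to comparable scales:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: indicators whose values are more dispersed
    across samples (lower entropy) receive larger weights.
    X is an (n_samples, n_indices) matrix of non-negative values."""
    P = X / X.sum(axis=0)                    # column-wise proportions
    logs = np.log(np.where(P > 0, P, 1.0))   # log(1) = 0 stands in for 0*log(0)
    e = -(P * logs).sum(axis=0) / np.log(X.shape[0])  # entropy per index
    d = 1.0 - e                              # degree of diversification
    return d / d.sum()

X = np.array([[0.2, 0.9, 0.5],
              [0.4, 1.0, 0.9],
              [0.3, 0.8, 0.2]])
print(entropy_weights(X))
```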

  16. A Framework for Final Drive Simultaneous Failure Diagnosis Based on Fuzzy Entropy and Sparse Bayesian Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Qing Ye

    2015-01-01

    Full Text Available This research proposes a novel framework for final drive simultaneous failure diagnosis containing feature extraction, training of paired diagnostic models, generation of a decision threshold, and recognition of simultaneous failure modes. In the feature extraction module, wavelet packet transform and fuzzy entropy are adopted to reduce noise interference and extract representative features of each failure mode. Probability classifiers based on paired sparse Bayesian extreme learning machines are constructed from single-failure samples only, inheriting the high generalization and sparsity of the sparse Bayesian learning approach. To generate an optimal decision threshold that converts the classifiers' probability outputs into final simultaneous failure modes, this research uses samples containing both single and simultaneous failure modes together with grid search, which is superior to traditional techniques in global optimization. Compared with other frequently used diagnostic approaches based on support vector machines and probability neural networks, experimental results based on the F1-measure verify that the diagnostic accuracy and efficiency of the proposed framework, which are crucial for simultaneous failure diagnosis, are superior to those of existing approaches.

  17. Black hole entropy functions and attractor equations

    International Nuclear Information System (INIS)

    Lopes Cardoso, Gabriel; Wit, Bernard de; Mahapatra, Swapna

    2007-01-01

    The entropy and the attractor equations for static extremal black hole solutions follow from a variational principle based on an entropy function. In the general case such an entropy function can be derived from the reduced action evaluated in a near-horizon geometry. BPS black holes constitute special solutions of this variational principle, but they can also be derived directly from a different entropy function based on supersymmetry enhancement at the horizon. Both functions are consistent with electric/magnetic duality and for BPS black holes their corresponding OSV-type integrals give identical results at the semi-classical level. We clarify the relation between the two entropy functions and the corresponding attractor equations for N = 2 supergravity theories with higher-derivative couplings in four space-time dimensions. We discuss how non-holomorphic corrections will modify these entropy functions

  18. Entropy-based viscous regularization for the multi-dimensional Euler equations in low-Mach and transonic flows

    Energy Technology Data Exchange (ETDEWEB)

    Marc O Delchini; Jean E. Ragusa; Ray A. Berry

    2015-07-01

    We present a new version of the entropy viscosity method, a viscous regularization technique for hyperbolic conservation laws, that is well-suited for low-Mach flows. By means of a low-Mach asymptotic study, new expressions for the entropy viscosity coefficients are derived. These definitions are valid for a wide range of Mach numbers, from subsonic flows (with very low Mach numbers) to supersonic flows, and no longer depend on an analytical expression for the entropy function. In addition, the entropy viscosity method is extended to Euler equations with variable area for nozzle flow problems. The effectiveness of the method is demonstrated using various 1-D and 2-D benchmark tests: flow in a converging–diverging nozzle; Leblanc shock tube; slow moving shock; strong shock for liquid phase; low-Mach flows around a cylinder and over a circular hump; and supersonic flow in a compression corner. Convergence studies are performed for smooth solutions and solutions with shocks present.

  19. Entropy Stable Summation-by-Parts Formulations for Compressible Computational Fluid Dynamics

    KAUST Repository

    Carpenter, M.H.

    2016-11-09

    A systematic approach based on a diagonal-norm summation-by-parts (SBP) framework is presented for implementing entropy stable (SS) formulations of any order for the compressible Navier–Stokes equations (NSE). These SS formulations discretely conserve mass, momentum, and energy and satisfy a mathematical entropy equality for smooth problems. They are also valid for discontinuous flows provided sufficient dissipation is added at shocks and discontinuities to satisfy an entropy inequality. Admissible SBP operators include all centred diagonal-norm finite-difference (FD) operators and Legendre spectral collocation-finite element methods (LSC-FEM). Entropy stable multiblock FD and FEM operators follow immediately via nonlinear coupling operators that ensure conservation and accuracy and preserve the interior entropy estimates. Nonlinearly stable solid wall boundary conditions are also available. Existing SBP operators that lack a stability proof (e.g. weighted essentially nonoscillatory) may be combined with an entropy stable operator using a comparison technique to guarantee nonlinear stability of the pair. All capabilities extend naturally to a curvilinear form of the NSE provided that the coordinate mappings satisfy a geometric conservation law constraint. Examples are presented that demonstrate the robustness of current state-of-the-art entropy stable SBP formulations.

  20. Developing Soil Moisture Profiles Utilizing Remotely Sensed MW and TIR Based SM Estimates Through Principle of Maximum Entropy

    Science.gov (United States)

    Mishra, V.; Cruise, J. F.; Mecikalski, J. R.

    2015-12-01

    Developing accurate vertical soil moisture profiles with minimum input requirements is important to agricultural as well as land surface modeling. Earlier studies show that the principle of maximum entropy (POME) can be utilized to develop vertical soil moisture profiles accurately (MAE of about 1% for a monotonically dry profile; nearly 2% for monotonically wet profiles and 3.8% for mixed profiles) with minimum constraints (surface, mean and bottom soil moisture contents). In this study, the constraints for the vertical soil moisture profiles were obtained from remotely sensed data. Low resolution (25 km) MW soil moisture estimates (AMSR-E) were downscaled to 4 km using a soil evaporation efficiency index based disaggregation approach. The downscaled MW soil moisture estimates served as a surface boundary condition, while 4 km resolution TIR based Atmospheric Land Exchange Inverse (ALEXI) estimates provided the required mean root-zone soil moisture content. Bottom soil moisture content is assumed to be a soil dependent constant. Multi-year (2002-2011) gridded profiles were developed for the southeastern United States using the POME method. The soil moisture profiles were compared to those generated in land surface models (the Land Information System (LIS) and the agricultural model DSSAT) along with available NRCS SCAN sites in the study region. The end product, spatial soil moisture profiles, can be assimilated into agricultural and hydrologic models in lieu of precipitation for data-scarce regions.

  1. Secondary structural entropy in RNA switch (Riboswitch) identification.

    Science.gov (United States)

    Manzourolajdad, Amirhossein; Arnold, Jonathan

    2015-04-28

    RNA regulatory elements play a significant role in gene regulation. Riboswitches, a widespread group of regulatory RNAs, are vital components of many bacterial genomes. These regulatory elements generally function by forming a ligand-induced alternative fold that controls access to ribosome binding sites or other regulatory sites in RNA. Riboswitch-mediated mechanisms are ubiquitous across bacterial genomes. A typical class of riboswitch has its own unique structural and biological complexity, making de novo riboswitch identification a formidable task. Traditionally, riboswitches have been identified through comparative genomics based on sequence and structural homology. The limitations of structural-homology-based approaches, coupled with the assumption that there is a great diversity of undiscovered riboswitches, suggest the need for alternative methods for riboswitch identification, possibly based on features intrinsic to their structure. As of yet, no such reliable method has been proposed. We used the structural entropy of riboswitch sequences as a measure of their secondary structural dynamics. Entropy values of a diverse set of riboswitches were compared to those of their mutants, their dinucleotide shuffles, and their reverse complement sequences under different stochastic context-free grammar folding models. The significance of our results was evaluated by comparison to other approaches, such as base-pairing entropy and energy landscape dynamics. Classifiers based on structural entropy optimized via sequence and structural features were devised as riboswitch identifiers and tested on Bacillus subtilis, Escherichia coli, and Synechococcus elongatus as an exploration of structural entropy based approaches. The unusually long untranslated region of the cotH gene in Bacillus subtilis, as well as upstream regions of certain genes, such as the sucC genes, were associated with significant structural entropy values in genome-wide examinations. Various tests show that there

  2. New Definition and Properties of Fuzzy Entropy

    Institute of Scientific and Technical Information of China (English)

    Qing Ming; Qin Yingbing

    2006-01-01

    Let X = (x1, x2, …, xn) be a universal set and F(X) the class of fuzzy sets on X. A new definition of fuzzy entropy e* for a fuzzy set A in F(X) is given, based on the order relation "≤" on [0, 1/2]^n. It is proved that e* is a σ-entropy under an additional requirement. Besides, some entropy formulas are presented and related properties are discussed.

  3. Permutation Entropy: New Ideas and Challenges

    Directory of Open Access Journals (Sweden)

    Karsten Keller

    2017-03-01

    Full Text Available Over recent years, some new variants of Permutation entropy have been introduced and applied to EEG analysis, including a conditional variant and variants using some additional metric information or being based on entropies that are different from the Shannon entropy. In some situations, it is not completely clear what kind of information the new measures and their algorithmic implementations provide. We discuss the new developments and illustrate them for EEG data.
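
    For readers new to the baseline measure, a minimal sketch of plain (Shannon) permutation entropy of order m, from which the variants discussed above depart:

```python
import math
from collections import Counter

def permutation_entropy(x, m=3, normalize=True):
    """Shannon entropy of the distribution of ordinal patterns over all
    consecutive length-m windows of x."""
    counts = Counter(
        tuple(sorted(range(m), key=lambda k: x[i + k]))
        for i in range(len(x) - m + 1)
    )
    n = sum(counts.values())
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h / math.log2(math.factorial(m)) if normalize else h

print(permutation_entropy([4, 7, 9, 10, 6, 11, 3], m=3))
```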

  4. Entropy In the Universe: A New Approach

    Directory of Open Access Journals (Sweden)

    Antonio Alfonso-Faus

    2000-09-01

    Full Text Available Abstract: We propose a new definition of entropy for any mass m, based on gravitation and the concept of a gravitational cross section. It turns out to be proportional to the mass, and therefore extensive, and to the age of the Universe. It is a Machian approach. It is also the number of gravity quanta the mass has emitted throughout its age. The entropy of the Universe is so determined and the cosmological entropy problem solved.

  5. Recognition of Wheat Spike from Field Based Phenotype Platform Using Multi-Sensor Fusion and Improved Maximum Entropy Segmentation Algorithms

    Directory of Open Access Journals (Sweden)

    Chengquan Zhou

    2018-02-01

    Full Text Available To obtain an accurate count of wheat spikes, which is crucial for estimating yield, this paper proposes a new algorithm that uses computer vision to achieve this goal from an image. First, a home-built semi-autonomous multi-sensor field-based phenotype platform (FPP) is used to obtain orthographic images of wheat plots at the filling stage. The data acquisition system of the FPP provides high-definition RGB images and multispectral images of the corresponding quadrats. Then, high-definition panchromatic images are obtained by fusing the three RGB channels. The Gram–Schmidt fusion algorithm is then used to fuse these multispectral and panchromatic images, thereby improving the color identification degree of the targets. Next, the maximum entropy segmentation method is used to perform the coarse segmentation. The threshold of this method is determined by a firefly algorithm based on chaos theory (FACT), and a morphological filter is then used to de-noise the coarse-segmentation results. Finally, morphological reconstruction theory is applied to segment the adhesive parts of the de-noised image and realize the fine segmentation of the image. The computer-generated counting results for the wheat plots, obtained using the independent regional statistical function in Matlab R2017b, are then compared with field measurements; the comparison indicates that the proposed method provides a more accurate count of wheat spikes than the other traditional fusion and segmentation methods mentioned in this paper.
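
    The coarse segmentation relies on a maximum entropy threshold; a minimal Kapur-style sketch is given below. The paper tunes the threshold with a chaos-based firefly algorithm (FACT), whereas here an exhaustive search stands in, purely to show the entropy criterion:

```python
import numpy as np

def max_entropy_threshold(gray):
    """Kapur-style maximum entropy thresholding: pick the threshold t
    that maximizes the summed Shannon entropies of the background and
    foreground gray-level histograms."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1
        h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
        h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t

img = np.random.default_rng(2).integers(0, 256, size=(64, 64))
print(max_entropy_threshold(img))
```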

  6. The improvement of Clausius entropy and its application in entropy analysis

    Institute of Scientific and Technical Information of China (English)

    WU Jing; GUO ZengYuan

    2008-01-01

    The defects of Clausius entropy, which include a premise of reversible process and a process quantity of heat in its definition, are discussed in this paper. Moreover, the heat temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity although it is numerically equal to the entropy change. The sum of internal energy temperature quotient and work temperature quotient is defined as the improved form of Clausius entropy, and it can be further proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without premise, just like other state functions, for example, pressure p and enthalpy h, etc. It is unnecessary to invent reversible paths when calculating entropy change for irreversible processes based on the improved form of entropy since it is independent of process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced respectively based on the concepts of thermal reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for entropy change calculation.

  7. An efficient binomial model-based measure for sequence comparison and its application.

    Science.gov (United States)

    Liu, Xiaoqing; Dai, Qi; Li, Lihua; He, Zerong

    2011-04-01

    Sequence comparison is one of the major tasks in bioinformatics; it can serve as evidence of structural and functional conservation, as well as of evolutionary relations. There are several similarity/dissimilarity measures for sequence comparison, but challenges remain. This paper presents a binomial model-based measure to analyze biological sequences. With the help of a random indicator, the occurrence of a word at any position of a sequence can be regarded as a Bernoulli random variable, and the distribution of the sum of the word occurrences is well known to be binomial. By using a recursive formula, we compute the binomial probability of the word count and propose a binomial model-based measure based on the relative entropy. The proposed measure was tested by extensive experiments, including classification of HEV genotypes and phylogenetic analysis, and was further compared with alignment-based and alignment-free measures. The results demonstrate that the proposed measure based on the binomial model is more efficient.
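
    A loose sketch of the ingredients described above, not the paper's exact measure: empirical word (k-mer) probabilities, a binomial pmf for word counts built by recursion, and a symmetrized relative entropy comparison between the count distributions of two sequences. The choice of k and the symmetrization are assumptions:

```python
import math

def kmer_probs(seq, k):
    """Empirical occurrence probability of each k-mer (word) in seq."""
    n = len(seq) - k + 1
    counts = {}
    for i in range(n):
        counts[seq[i:i + k]] = counts.get(seq[i:i + k], 0) + 1
    return {w: c / n for w, c in counts.items()}, n

def binomial_pmf(n, p):
    """Binomial(n, p) pmf via the recursion P(j+1) = P(j)*(n-j)/(j+1)*p/(1-p)."""
    if p >= 1.0:
        return [0.0] * n + [1.0]
    pmf = [(1.0 - p) ** n]
    for j in range(n):
        pmf.append(pmf[-1] * (n - j) / (j + 1) * p / (1.0 - p))
    return pmf

def divergence(seq_a, seq_b, k=2, eps=1e-12):
    """Symmetrized relative entropy between the binomial word-count
    distributions of two sequences."""
    pa, na = kmer_probs(seq_a, k)
    pb, nb = kmer_probs(seq_b, k)
    n = min(na, nb)
    d = 0.0
    for w in set(pa) | set(pb):
        P = binomial_pmf(n, pa.get(w, 0.0))
        Q = binomial_pmf(n, pb.get(w, 0.0))
        d += sum(p * math.log((p + eps) / (q + eps)) +
                 q * math.log((q + eps) / (p + eps)) for p, q in zip(P, Q))
    return d

print(divergence("ACGTACGTAC", "ACGGACGGAC"))
```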

  8. Quantum dynamical entropy revisited

    International Nuclear Information System (INIS)

    Hudetz, T.

    1996-10-01

    We define a new quantum dynamical entropy, which is a 'hybrid' of the closely related, physically oriented entropy introduced by Alicki and Fannes in 1994, and of the mathematically well-developed, single-argument entropy introduced by Connes, Narnhofer and Thirring in 1987. We show that this new quantum dynamical entropy has many properties similar to those of the Alicki-Fannes entropy, and also inherits some additional properties from the CNT entropy. In particular, the 'hybrid' entropy interpolates between the two different ways in which both the AF and the CNT entropy of the shift automorphism on the quantum spin chain agree with the usual quantum entropy density, resulting in even better agreement. Also, the new quantum dynamical entropy generalizes the classical dynamical entropy of Kolmogorov and Sinai in the same way as does the AF entropy. Finally, we estimate the 'hybrid' entropy both for the Powers-Price shift systems and for the noncommutative Arnold map on the irrational rotation C*-algebra, leaving some interesting open problems. (author)

  9. Entropy type complexity of quantum processes

    International Nuclear Information System (INIS)

    Watanabe, Noboru

    2014-01-01

    von Neumann entropy represents the amount of information in a quantum state, and it was extended by Ohya for general quantum systems [10]. Umegaki first defined the quantum relative entropy for σ-finite von Neumann algebras, which was extended by Araki and Uhlmann for general von Neumann algebras and *-algebras, respectively. In 1983 Ohya introduced the quantum mutual entropy by using compound states; this describes the amount of information correctly transmitted through a quantum channel, and it was also extended by Ohya for general quantum systems. In this paper, we briefly explain Ohya's S-mixing entropy and the quantum mutual entropy for general quantum systems. By using structure equivalent classes, we introduce entropy type functionals based on quantum information theory to improve the treatment of the Gaussian communication process. (paper)

  10. Hierarchical and Complex System Entropy Clustering Analysis Based Validation for Traditional Chinese Medicine Syndrome Patterns of Chronic Atrophic Gastritis.

    Science.gov (United States)

    Zhang, Yin; Liu, Yue; Li, Yannan; Zhao, Xia; Zhuo, Lin; Zhou, Ajian; Zhang, Li; Su, Zeqi; Chen, Cen; Du, Shiyu; Liu, Daming; Ding, Xia

    2018-03-22

    Chronic atrophic gastritis (CAG) is the precancerous stage of gastric carcinoma. Traditional Chinese Medicine (TCM) has been widely used in treating CAG. This study aimed to reveal core pathogenesis of CAG by validating the TCM syndrome patterns and provide evidence for optimization of treatment strategies. This is a cross-sectional study conducted in 4 hospitals in China. Hierarchical clustering analysis (HCA) and complex system entropy clustering analysis (CSECA) were performed, respectively, to achieve syndrome pattern validation. Based on HCA, 15 common factors were assigned to 6 syndrome patterns: liver depression and spleen deficiency and blood stasis in the stomach collateral, internal harassment of phlegm-heat and blood stasis in the stomach collateral, phlegm-turbidity internal obstruction, spleen yang deficiency, internal harassment of phlegm-heat and spleen deficiency, and spleen qi deficiency. By CSECA, 22 common factors were assigned to 7 syndrome patterns: qi deficiency, qi stagnation, blood stasis, phlegm turbidity, heat, yang deficiency, and yin deficiency. Combination of qi deficiency, qi stagnation, blood stasis, phlegm turbidity, heat, yang deficiency, and yin deficiency may play a crucial role in CAG pathogenesis. In accord with this, treatment strategies by TCM herbal prescriptions should be targeted to regulating qi, activating blood, resolving turbidity, clearing heat, removing toxin, nourishing yin, and warming yang. Further explorations are needed to verify and expand the current conclusions.

  11. Application of support vector machine based on pattern spectrum entropy in fault diagnostics of rolling element bearings

    International Nuclear Information System (INIS)

    Hao, Rujiang; Chu, Fulei; Peng, Zhike; Feng, Zhipeng

    2011-01-01

    This paper presents a novel pattern classification approach for the fault diagnostics of rolling element bearings, which combines the morphological multi-scale analysis and the 'one to others' support vector machine (SVM) classifiers. The morphological pattern spectrum describes the shape characteristics of the inspected signal based on the morphological opening operation with multi-scale structuring elements. The pattern spectrum entropy and the barycenter scale location of the spectrum curve are extracted as the feature vectors presenting different faults of the bearing, which are more effective and representative than the kurtosis and the enveloping demodulation spectrum. The 'one to others' SVM algorithm is adopted to distinguish six kinds of fault signals which were measured in the experimental test rig under eight different working conditions. The recognition results of the SVM are ideal and more precise than those of the artificial neural network even though the training samples are few. The combination of the morphological pattern spectrum parameters and the 'one to others' multi-class SVM algorithm is suitable for the on-line automated fault diagnosis of the rolling element bearings. This application is promising and worth well exploiting

  12. A Concept Lattice for Semantic Integration of Geo-Ontologies Based on Weight of Inclusion Degree Importance and Information Entropy

    Directory of Open Access Journals (Sweden)

    Jia Xiao

    2016-11-01

    Full Text Available Constructing a merged concept lattice with formal concept analysis (FCA) is an important research direction in the field of integrating multi-source geo-ontologies. Extracting essential geographical properties and reducing the concept lattice are two key points of previous research. A formal integration method is proposed to address the challenges in these two areas. We first extract essential properties from multi-source geo-ontologies and use FCA to build a merged formal context. Second, the combined importance weight of each single attribute of the formal context is calculated by introducing the inclusion degree importance from rough set theory and information entropy; a weighted formal context is then built from the merged formal context. Third, a combined weighted concept lattice is established from the weighted formal context with FCA, and the importance weight value of every concept is defined as the sum of the weights of the attributes belonging to the concept’s intent. Finally, the semantic granularity of a concept is defined by its importance weight; we then gradually reduce the weighted concept lattice by setting a diminishing threshold of semantic granularity. Additionally, all of the reduced lattices are organized into a regular hierarchical structure based on the threshold of semantic granularity. A workflow is designed to demonstrate this procedure. A case study is conducted to show the feasibility and validity of this method and the procedure for integrating multi-source geo-ontologies.

  13. Combining Generalized Renewal Processes with Non-Extensive Entropy-Based q-Distributions for Reliability Applications

    Directory of Open Access Journals (Sweden)

    Isis Didier Lins

    2018-03-01

    Full Text Available The Generalized Renewal Process (GRP is a probabilistic model for repairable systems that can represent the usual states of a system after a repair: as new, as old, or in a condition between new and old. It is often coupled with the Weibull distribution, widely used in the reliability context. In this paper, we develop novel GRP models based on probability distributions that stem from the Tsallis’ non-extensive entropy, namely the q-Exponential and the q-Weibull distributions. The q-Exponential and Weibull distributions can model decreasing, constant or increasing failure intensity functions. However, the power law behavior of the q-Exponential probability density function for specific parameter values is an advantage over the Weibull distribution when adjusting data containing extreme values. The q-Weibull probability distribution, in turn, can also fit data with bathtub-shaped or unimodal failure intensities in addition to the behaviors already mentioned. Therefore, the q-Exponential-GRP is an alternative for the Weibull-GRP model and the q-Weibull-GRP generalizes both. The method of maximum likelihood is used for their parameters’ estimation by means of a particle swarm optimization algorithm, and Monte Carlo simulations are performed for the sake of validation. The proposed models and algorithms are applied to examples involving reliability-related data of complex systems and the obtained results suggest GRP plus q-distributions are promising techniques for the analyses of repairable systems.
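
    For reference, the standard Tsallis-literature forms of the two densities follow (they reduce to the exponential and Weibull distributions as q → 1; the paper's parameterization may differ in detail):

```latex
% q-exponential function:
e_q(x) = \bigl[1 + (1-q)\,x\bigr]_{+}^{1/(1-q)}, \qquad e_q(x) \to e^{x} \text{ as } q \to 1.

% q-Exponential pdf, for t \ge 0 and 1 \le q < 2:
f(t) = (2-q)\,\lambda\, e_q(-\lambda t).

% q-Weibull pdf:
f(t) = (2-q)\,\frac{\beta}{\eta}\left(\frac{t}{\eta}\right)^{\beta-1}
       e_q\!\left[-\left(\frac{t}{\eta}\right)^{\beta}\right].
```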

  14. A New Scrambling Evaluation Scheme Based on Spatial Distribution Entropy and Centroid Difference of Bit-Plane

    Science.gov (United States)

    Zhao, Liang; Adhikari, Avishek; Sakurai, Kouichi

    Watermarking is one of the most effective techniques for copyright protection and information hiding. It can be applied in many fields of our society. Nowadays, some image scrambling schemes are used as one part of the watermarking algorithm to enhance the security. Therefore, how to select an image scrambling scheme and what kind of the image scrambling scheme may be used for watermarking are the key problems. Evaluation method of the image scrambling schemes can be seen as a useful test tool for showing the property or flaw of the image scrambling method. In this paper, a new scrambling evaluation system based on spatial distribution entropy and centroid difference of bit-plane is presented to obtain the scrambling degree of image scrambling schemes. Our scheme is illustrated and justified through computer simulations. The experimental results show (in Figs. 6 and 7) that for the general gray-scale image, the evaluation degree of the corresponding cipher image for the first 4 significant bit-planes selection is nearly the same as that for the 8 bit-planes selection. That is why, instead of taking 8 bit-planes of a gray-scale image, it is sufficient to take only the first 4 significant bit-planes for the experiment to find the scrambling degree. This 50% reduction in the computational cost makes our scheme efficient.

  15. ENTROPY - OUR BEST FRIEND

    Directory of Open Access Journals (Sweden)

    Urban Kordes

    2005-10-01

    Full Text Available The paper tries to tackle the question of the connection between entropy and the living. Definitions of life as a phenomenon that defies entropy are reviewed, and the conclusion is reached that life is in a way dependent on entropy - it could not exist without it. Entropy is a sort of medium, a fertile soil, that gives life the possibility to blossom. The paper ends by presenting some consequences for the field of artificial intelligence.

  16. Entropy of Baker's Transformation

    Institute of Scientific and Technical Information of China (English)

    栾长福

    2003-01-01

    Four theorems about four different kinds of entropies for Baker's transformation are presented. The Kolmogorov entropy of Baker's transformation is sensitive to the initial flips by the time. The topological entropy of Baker's transformation is found to be log k. The conditions for a state of Baker's transformation to be forbidden are also derived. The relations among the Shannon, Kolmogorov, topological and Boltzmann entropies are discussed in detail.

  17. Physical entropy, information entropy and their evolution equations

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Inspired by the evolution equation of the nonequilibrium statistical physics entropy and the concise statistical formula of the entropy production rate, we develop a theory of dynamic information entropy and build a nonlinear evolution equation of the information entropy density changing in time and state variable space. Its mathematical form and physical meaning are similar to those of the evolution equation of the physical entropy: the time rate of change of the information entropy density arises jointly from drift, diffusion and production. The concise statistical formula of the information entropy production rate is also similar to that of the physical entropy. Furthermore, we study the similarities and differences between physical entropy and information entropy and the possible unification of the two statistical entropies, and discuss the relationships among the principle of entropy increase, the principle of equilibrium maximum entropy and the principle of maximum information entropy, as well as the connection between them and the entropy evolution equation.

  18. Loss of conformational entropy in protein folding calculated using realistic ensembles and its implications for NMR-based calculations

    Science.gov (United States)

    Baxa, Michael C.; Haddadian, Esmael J.; Jumper, John M.; Freed, Karl F.; Sosnick, Tobin R.

    2014-01-01

    The loss of conformational entropy is a major contribution in the thermodynamics of protein folding. However, accurate determination of the quantity has proven challenging. We calculate this loss using molecular dynamics simulations of both the native protein and a realistic denatured state ensemble. For ubiquitin, the total change in entropy is TΔS_total = 1.4 kcal·mol−1 per residue at 300 K, with only 20% from the loss of side-chain entropy. Our analysis exhibits mixed agreement with prior studies because of the use of more accurate ensembles and contributions from correlated motions. Buried side chains lose only a factor of 1.4 in the number of conformations available per rotamer upon folding (Ω_U/Ω_N). The entropy loss for helical and sheet residues differs due to the smaller motions of helical residues (TΔS_helix−sheet = 0.5 kcal·mol−1), a property not fully reflected in the amide N-H and carbonyl C=O bond NMR order parameters. The results have implications for the thermodynamics of folding and binding, including estimates of solvent ordering and microscopic entropies obtained from NMR. PMID:25313044

  19. Entropy: Order or Information

    Science.gov (United States)

    Ben-Naim, Arieh

    2011-01-01

    Changes in entropy can "sometimes" be interpreted in terms of changes in disorder. On the other hand, changes in entropy can "always" be interpreted in terms of changes in Shannon's measure of information. Mixing and demixing processes are used to highlight the pitfalls in the association of entropy with disorder. (Contains 3 figures.)

  20. Entropy of the system formed in heavy ion collision

    International Nuclear Information System (INIS)

    Gudima, K.K.; Schulz, H.; Toneev, V.D.

    1985-01-01

    Within the framework of a cascade model, the entropy evolution in a system produced in heavy ion collisions is investigated. The entropy calculation is based on smoothing of the distribution function over momentum space by introducing a temperature field. The resulting entropy per nucleon is shown to be rather sensitive to the subdivision of phase space into cells at the stage of free scattering of reaction products. Compared to recent experimental results for specific entropy values inferred from the composite particle yield of 4π measurements, it is found that cascade calculations do not favour particular entropy model treatments and suggest smaller entropy values than follow from equilibrium statistics

  1. Software Component Clustering and Retrieval: An Entropy-based Fuzzy k-Modes Methodology

    OpenAIRE

    Stylianou, Constantinos; Andreou, Andreas S.

    2008-01-01

    The number of software houses attempting to adopt a component-based development approach is rapidly increasing. However many organisations still find it difficult to complete the shift as it requires them to alter their entire software development process and philosophy. Furthermore, to promote component-based software engineering, organisations must be ready to promote reusability and this can only be attained if the proper framework exists from which a developer can access, search and retri...

  2. A Noise Reduction Method for Dual-Mass Micro-Electromechanical Gyroscopes Based on Sample Entropy Empirical Mode Decomposition and Time-Frequency Peak Filtering.

    Science.gov (United States)

    Shen, Chong; Li, Jie; Zhang, Xiaoming; Shi, Yunbo; Tang, Jun; Cao, Huiliang; Liu, Jun

    2016-05-31

    The different noise components in a dual-mass micro-electromechanical system (MEMS) gyroscope structure are analyzed in this paper, including mechanical-thermal noise (MTN), electronic-thermal noise (ETN), flicker noise (FN) and Coriolis signal in-phase noise (IPN). The structure-equivalent electronic model is established, and an improved white Gaussian noise reduction method for dual-mass MEMS gyroscopes is proposed which is based on sample entropy empirical mode decomposition (SEEMD) and time-frequency peak filtering (TFPF). There is a contradiction in TFPF: selecting a short window length leads to good preservation of signal amplitude but poor random noise reduction, whereas selecting a long window length leads to serious attenuation of the signal amplitude but effective random noise reduction. In order to achieve a good tradeoff between valid signal amplitude preservation and random noise reduction, SEEMD is adopted to improve TFPF. First, the original signal is decomposed into intrinsic mode functions (IMFs) by EMD, and the sample entropy of each IMF is calculated in order to classify the numerous IMFs into three different components; then short-window TFPF is employed for the low-frequency IMF components, long-window TFPF is employed for the high-frequency IMF components, and the noise components of the IMFs are discarded directly; finally the de-noised signal is obtained after reconstruction. Rotation and temperature experiments were carried out to verify the proposed SEEMD-TFPF algorithm; the verification and comparison results show that the de-noising performance of SEEMD-TFPF is better than that achievable with traditional wavelet, Kalman filter and fixed-window-length TFPF methods.

  3. A Noise Reduction Method for Dual-Mass Micro-Electromechanical Gyroscopes Based on Sample Entropy Empirical Mode Decomposition and Time-Frequency Peak Filtering

    Directory of Open Access Journals (Sweden)

    Chong Shen

    2016-05-01

    Full Text Available The different noise components in a dual-mass micro-electromechanical system (MEMS) gyroscope structure are analyzed in this paper, including mechanical-thermal noise (MTN), electronic-thermal noise (ETN), flicker noise (FN) and Coriolis signal in-phase noise (IPN). The structure's equivalent electronic model is established, and an improved white Gaussian noise reduction method for dual-mass MEMS gyroscopes is proposed based on sample entropy empirical mode decomposition (SEEMD) and time-frequency peak filtering (TFPF). There is a trade-off in TFPF: selecting a short window length leads to good preservation of the signal amplitude but poor random noise reduction, whereas selecting a long window length leads to serious attenuation of the signal amplitude but effective random noise reduction. In order to achieve a good balance between signal amplitude preservation and random noise reduction, SEEMD is adopted to improve TFPF. Firstly, the original signal is decomposed into intrinsic mode functions (IMFs) by EMD, and the sample entropy (SE) of each IMF is calculated in order to classify the numerous IMFs into three different components; then short-window TFPF is employed for the low-frequency component of the IMFs, long-window TFPF is employed for the high-frequency component, and the noise component is discarded directly; at last the final signal is obtained after reconstruction. Rotation and temperature experiments were carried out to verify the proposed SEEMD-TFPF algorithm; the verification and comparison results show that the de-noising performance of SEEMD-TFPF is better than that achievable with traditional wavelet, Kalman filter and fixed-window-length TFPF methods.

  4. Modeling the Non-Equilibrium Process of the Chemical Adsorption of Ammonia on GaN(0001) Reconstructed Surfaces Based on Steepest-Entropy-Ascent Quantum Thermodynamics

    OpenAIRE

    Kusaba, Akira; Li, Guanchen; von Spakovsky, Michael R.; Kangawa, Yoshihiro; Kakimoto, Koichi

    2017-01-01

    Clearly understanding elementary growth processes that depend on surface reconstruction is essential to controlling vapor-phase epitaxy more precisely. In this study, ammonia chemical adsorption on GaN(0001) reconstructed surfaces under metalorganic vapor phase epitaxy (MOVPE) conditions (3Ga-H and Nad-H + Ga-H on a 2 × 2 unit cell) is investigated using steepest-entropy-ascent quantum thermodynamics (SEAQT). SEAQT is a thermodynamic-ensemble based, first-principles framework that can predict...

  5. A thermodynamic perspective on food webs: Quantifying entropy production within detrital-based ecosystems

    NARCIS (Netherlands)

    Meysman, F.J.R.; Bruers, S.

    2007-01-01

    Because ecosystems fit so nicely the framework of a “dissipative system”, a better integration of thermodynamic and ecological perspectives could benefit the quantitative analysis of ecosystems. One obstacle is that traditional food web models are solely based upon the principles of mass and energy

  6. The improvement of Clausius entropy and its application in entropy analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The defects of Clausius entropy, which include the premise of a reversible process and a process quantity of heat in its definition, are discussed in this paper. Moreover, the heat-temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity, although it is numerically equal to the entropy change. The sum of the internal-energy temperature quotient and the work temperature quotient is defined as the improved form of Clausius entropy, and it can further be proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without any premise, just like other state functions such as pressure p and enthalpy h. It is unnecessary to invent reversible paths when calculating the entropy change for irreversible processes based on the improved form of entropy, since it is independent of the process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced based on the concepts of thermal-reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for entropy-change calculation.
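
    Read literally, the "internal energy temperature quotient plus work temperature quotient" construction amounts to the following definition for a simple compressible system; this is a hedged reconstruction in standard notation, not a formula quoted from the paper:

    ```latex
    \mathrm{d}S = \frac{\mathrm{d}U}{T} + \frac{p\,\mathrm{d}V}{T},
    \qquad
    \Delta S_{1\to 2} = \int_{1}^{2} \frac{\mathrm{d}U + p\,\mathrm{d}V}{T}.
    ```

    Since U, p, V and T are all system properties, the integrand contains no process quantities, which is precisely the state-function character claimed above; for reversible processes the first law reduces it to the usual (δQ/T)rev.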

  7. RFID Privacy Risk Evaluation Based on Synthetic Method of Extended Attack Tree and Information Feature Entropy

    OpenAIRE

    Li, Peng; Xu, Chao; Chen, Long; Wang, Ruchuan

    2015-01-01

    Evaluation of security risks in radio frequency identification (RFID) systems is a challenging problem in the Internet of Things (IoT). This paper proposes an extended attack tree (EAT) model to identify an RFID system's flaws and vulnerabilities. A corresponding formal description of the model is given, which adds a probability SAND node together with the probability attribute of the node attack. In addition, we model the process of an RFID data privacy attack based on EAT, taking a sensitive in...

  8. Bayes-Optimal Entropy Pursuit for Active Choice-Based Preference Learning

    OpenAIRE

    Pallone, Stephen N.; Frazier, Peter I.; Henderson, Shane G.

    2017-01-01

    We analyze the problem of learning a single user's preferences in an active learning setting, sequentially and adaptively querying the user over a finite time horizon. Learning is conducted via choice-based queries, where the user selects her preferred option among a small subset of offered alternatives. These queries have been shown to be a robust and efficient way to learn an individual's preferences. We take a parametric approach and model the user's preferences through a linear classifier...

  9. Quantum chaos: entropy signatures

    International Nuclear Information System (INIS)

    Miller, P.A.; Sarkar, S.; Zarum, R.

    1998-01-01

    A definition of quantum chaos is given in terms of entropy production rates for a quantum system coupled weakly to a reservoir. This allows the treatment of classical and quantum chaos on the same footing. In the quantum theory the entropy considered is the von Neumann entropy and in classical systems it is the Gibbs entropy. The rate of change of the coarse-grained Gibbs entropy of the classical system with time is given by the Kolmogorov-Sinai (KS) entropy. The relation between KS entropy and the rate of change of von Neumann entropy is investigated for the kicked rotator. For a system which is classically chaotic there is a linear relationship between these two entropies. Moreover it is possible to construct contour plots for the local KS entropy and compare it with the corresponding plots for the rate of change of von Neumann entropy. The quantitative and qualitative similarities of these plots are discussed for the standard map (kicked rotor) and the generalised cat maps. (author)

  10. Identification method of gas-liquid two-phase flow regime based on image wavelet packet information entropy and genetic neural network

    International Nuclear Information System (INIS)

    Zhou Yunlong; Chen Fei; Sun Bin

    2008-01-01

    Based on the property that a wavelet packet transform decomposes an image at different scales, a flow-regime identification method using image wavelet packet information entropy features and a genetic neural network was proposed. Gas-liquid two-phase flow images were captured by a digital high-speed video system in a horizontal pipe. The information entropy features of the transform coefficients were extracted using image processing techniques and multi-resolution analysis. The genetic neural network was trained using these eigenvectors, reduced by principal component analysis, as flow-regime samples, and intelligent flow-regime identification was realized. The test results showed that the image wavelet packet information entropy features could excellently reflect the differences between seven typical flow regimes, and that the genetic neural network, combining the merits of the genetic algorithm and the BP algorithm, converges quickly in simulation and avoids local minima. The recognition rate of the network reached about 100%, and a new and effective method was thus presented for on-line flow-regime identification. (authors)
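
    The wavelet-packet information-entropy feature itself is compact enough to sketch (the PCA reduction and genetic neural network stages are not reproduced). The sketch assumes the third-party PyWavelets package; the wavelet name, decomposition level and log base are arbitrary choices rather than the paper's settings.

    ```python
    import numpy as np
    import pywt  # third-party PyWavelets package

    def wp_info_entropy(signal, wavelet="db4", level=3):
        """Shannon entropy of the energy distribution over wavelet-packet sub-bands."""
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                                mode="symmetric", maxlevel=level)
        nodes = wp.get_level(level, order="freq")    # 2**level frequency sub-bands
        energy = np.array([np.sum(np.square(n.data)) for n in nodes])
        p = energy / energy.sum()                    # relative energy per sub-band
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))        # information entropy (bits)
    ```

    Computed block-wise over each flow image, such entropies form the feature vector that is then reduced by principal component analysis and fed to the network.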

  11. Investigation on thermo-acoustic instability dynamic characteristics of hydrocarbon fuel flowing in scramjet cooling channel based on wavelet entropy method

    Science.gov (United States)

    Zan, Hao; Li, Haowei; Jiang, Yuguang; Wu, Meng; Zhou, Weixing; Bao, Wen

    2018-06-01

    As part of our efforts to further improve regenerative cooling technology in scramjets, experiments on the thermo-acoustic instability dynamics of flowing hydrocarbon fuel have been conducted in horizontal circular tubes under different conditions. The experimental results indicate that there is a developing process from thermo-acoustic stability to instability. In order to gain a deep understanding of this developing process, the method of Multi-scale Shannon Wavelet Entropy (MSWE), based on a Wavelet Transform Correlation Filter (WTCF) and Multi-Scale Shannon Entropy (MSE), is adopted in this paper. The results demonstrate that the development of thermo-acoustic instability out of noise and weak signals is well detected by the MSWE method, and that the stable regime, the developing process and the instability can be distinguished. These properties render the method particularly powerful for early warning of thermo-acoustic instability of hydrocarbon fuel flowing in scramjet cooling channels. The mass flow rate and the inlet pressure influence the developing process of the thermo-acoustic instability. This investigation of thermo-acoustic instability dynamics at supercritical pressure based on the wavelet entropy method offers guidance on the control of the scramjet fuel supply, which can secure stable fuel flow in the regenerative cooling system.

  12. Heat transfer and entropy generation analysis of HFE 7000 based nanorefrigerants

    OpenAIRE

    Helvaci, H.; Khan, Zulfiqar Ahmad

    2017-01-01

    In this study, two dimensional numerical simulations of forced convection flow of HFE 7000 based nanofluids in a horizontal circular tube subjected to a constant and uniform heat flux in laminar flow was performed by using single phase homogeneous model. Four different nanofluids considered in the present study are Al2O3, CuO, SiO2 and MgO nanoparticles dispersed in pure HFE 7000. The simulations were performed with particle volumetric concentrations of 0, 1, 4 and 6% and Reynolds number of 4...

  13. Entropy and information

    CERN Document Server

    Volkenstein, Mikhail V

    2009-01-01

    The book "Entropy and Information" deals with the thermodynamical concept of entropy and its relationship to information theory. It succeeds in explaining the universality of the term "entropy" not only as a physical phenomenon but also by revealing its presence in other domains. For example, Volkenstein discusses the "meaning" of entropy in a biological context and shows how entropy is related to artistic activities. Written by the renowned Russian biophysicist Mikhail V. Volkenstein, this book serves as a timely introduction to understanding entropy from a thermodynamic perspective and is definitely an inspiring and thought-provoking book that should be read by every physicist, information theorist, biologist, and even artist.

  14. Evaluation and Comparison of Extremal Hypothesis-Based Regime Methods

    Directory of Open Access Journals (Sweden)

    Ishwar Joshi

    2018-03-01

    Full Text Available Regime channels are important for stable canal design and for determining river response to environmental changes, e.g., due to the construction of a dam, land use change, or climate shifts. A plethora of methods is available describing the hydraulic geometry of alluvial rivers in regime. However, a comparison of these methods using the same set of data has been lacking. In this study, we evaluate and compare four different extremal hypothesis-based regime methods, namely minimization of Froude number (MFN), maximum entropy and minimum energy dissipation rate (ME and MEDR), maximum flow efficiency (MFE), and Millar's method, by dividing regime channel data into sand and gravel beds. The results show that for sand bed channels MFN gives a very high accuracy of prediction for regime channel width and depth. For gravel bed channels we find that MFN and 'ME and MEDR' give a very high accuracy of prediction for width and depth. Therefore the notion that extremal hypotheses lacking bank stability criteria are inappropriate for use is shown to be false, as both MFN and 'ME and MEDR' lack such criteria. We also find that bank vegetation has a significant influence on the prediction of hydraulic geometry by MFN and 'ME and MEDR'.

  15. RNA Thermodynamic Structural Entropy.

    Science.gov (United States)

    Garcia-Martin, Juan Antonio; Clote, Peter

    2015-01-01

    Conformational entropy for atomic-level, three dimensional biomolecules is known experimentally to play an important role in protein-ligand discrimination, yet reliable computation of entropy remains a difficult problem. Here we describe the first two accurate and efficient algorithms to compute the conformational entropy for RNA secondary structures, with respect to the Turner energy model, where free energy parameters are determined from UV absorption experiments. An algorithm to compute the derivational entropy for RNA secondary structures had previously been introduced, using stochastic context free grammars (SCFGs). However, the numerical value of derivational entropy depends heavily on the chosen context free grammar and on the training set used to estimate rule probabilities. Using data from the Rfam database, we determine that both of our thermodynamic methods, which agree in numerical value, are substantially faster than the SCFG method. Thermodynamic structural entropy is much smaller than derivational entropy, and the correlation between length-normalized thermodynamic entropy and derivational entropy is moderately weak to poor. In applications, we plot the structural entropy as a function of temperature for known thermoswitches, such as the repression of heat shock gene expression (ROSE) element, we determine that the correlation between hammerhead ribozyme cleavage activity and total free energy is improved by including an additional free energy term arising from conformational entropy, and we plot the structural entropy of windows of the HIV-1 genome. Our software RNAentropy can compute structural entropy for any user-specified temperature, and supports both the Turner'99 and Turner'04 energy parameters. It follows that RNAentropy is state-of-the-art software to compute RNA secondary structure conformational entropy. Source code is available at https://github.com/clotelab/RNAentropy/; a full web server is available at http

  16. RNA Thermodynamic Structural Entropy.

    Directory of Open Access Journals (Sweden)

    Juan Antonio Garcia-Martin

    Full Text Available Conformational entropy for atomic-level, three dimensional biomolecules is known experimentally to play an important role in protein-ligand discrimination, yet reliable computation of entropy remains a difficult problem. Here we describe the first two accurate and efficient algorithms to compute the conformational entropy for RNA secondary structures, with respect to the Turner energy model, where free energy parameters are determined from UV absorption experiments. An algorithm to compute the derivational entropy for RNA secondary structures had previously been introduced, using stochastic context free grammars (SCFGs). However, the numerical value of derivational entropy depends heavily on the chosen context free grammar and on the training set used to estimate rule probabilities. Using data from the Rfam database, we determine that both of our thermodynamic methods, which agree in numerical value, are substantially faster than the SCFG method. Thermodynamic structural entropy is much smaller than derivational entropy, and the correlation between length-normalized thermodynamic entropy and derivational entropy is moderately weak to poor. In applications, we plot the structural entropy as a function of temperature for known thermoswitches, such as the repression of heat shock gene expression (ROSE) element, we determine that the correlation between hammerhead ribozyme cleavage activity and total free energy is improved by including an additional free energy term arising from conformational entropy, and we plot the structural entropy of windows of the HIV-1 genome. Our software RNAentropy can compute structural entropy for any user-specified temperature, and supports both the Turner'99 and Turner'04 energy parameters. It follows that RNAentropy is state-of-the-art software to compute RNA secondary structure conformational entropy. Source code is available at https://github.com/clotelab/RNAentropy/; a full web server is available at http

  17. The nexus between geopolitical uncertainty and crude oil markets: An entropy-based wavelet analysis

    Science.gov (United States)

    Uddin, Gazi Salah; Bekiros, Stelios; Ahmed, Ali

    2018-04-01

    The global financial crisis and the subsequent geopolitical turbulence in energy markets have brought increased attention to proper statistical modeling, especially of the crude oil markets. In particular, we utilize a time-frequency decomposition approach based on wavelet analysis to explore the inherent dynamics and the causal interrelationships between various types of geopolitical, economic and financial uncertainty indices and oil markets. Via the introduction of a mixed discrete-continuous multiresolution analysis, we employ the entropic criterion for the selection of the optimal decomposition level of a MODWT, as well as the continuous-time coherency and phase measures for the detection of business cycle (a)synchronization. Overall, a strong heterogeneity in the revealed interrelationships is detected over time and across scales.

  18. Methods for calculating nonconcave entropies

    International Nuclear Information System (INIS)

    Touchette, Hugo

    2010-01-01

    Five different methods which can be used to analytically calculate entropies that are nonconcave as functions of the energy in the thermodynamic limit are discussed and compared. The five methods are based on the following ideas and techniques: (i) microcanonical contraction, (ii) metastable branches of the free energy, (iii) generalized canonical ensembles with specific illustrations involving the so-called Gaussian and Betrag ensembles, (iv) the restricted canonical ensemble, and (v) the inverse Laplace transform. A simple long-range spin model having a nonconcave entropy is used to illustrate each method.

  19. Resource and environment efficiency analysis of provinces in China: A DEA approach based on Shannon's entropy

    International Nuclear Information System (INIS)

    Bian Yiwen; Yang Feng

    2010-01-01

    Data envelopment analysis (DEA) has been widely used in energy efficiency and environment efficiency analysis in recent years. Based on existing environmental DEA technology, this paper presents several DEA models for estimating the aggregated efficiency of resource and environment. These models can evaluate DMUs' energy and environment efficiencies simultaneously. However, the efficiency rankings obtained from these models are not the same, and each model provides some valuable information about DMUs' efficiencies which we should not ignore. In this situation, it may be hard to choose a single model in practice. To address this kind of performance evaluation problem, the current paper extends the Shannon-DEA procedure to establish a comprehensive efficiency measure for appraising DMUs' resource and environment efficiencies. In the proposed approach, a measure for evaluating each model's importance degree is provided, and the input/output target-setting approach by which DMU managers can improve DMUs' energy and environmental efficiencies is also discussed. We illustrate the proposed approach using a real data set of 30 provinces in China.
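
    The entropy-weighting step that assigns an importance degree to each candidate DEA model can be sketched as follows. The normalisation and weighting follow the generic Shannon-DEA recipe, and the score matrix is invented for illustration; the paper's exact variant may differ.

    ```python
    import numpy as np

    def shannon_dea_combine(eff):
        """Weight several DEA models by Shannon entropy and aggregate their scores.

        eff: (n_dmus, n_models) array of efficiency scores, one column per model.
        Returns (model_weights, composite_efficiency_per_dmu).
        """
        eff = np.asarray(eff, dtype=float)
        n, _ = eff.shape
        p = eff / eff.sum(axis=0, keepdims=True)   # treat each column as a distribution
        plogp = np.where(p > 0, p * np.log(p), 0.0)
        e = -plogp.sum(axis=0) / np.log(n)         # entropy of each model, in [0, 1]
        w = (1.0 - e) / (1.0 - e).sum()            # importance degree of each model
        return w, eff @ w

    # Toy example: 4 provinces scored under 3 DEA model variants.
    scores = np.array([[1.00, 0.92, 0.95],
                       [0.81, 0.88, 0.79],
                       [0.64, 0.71, 0.66],
                       [0.90, 1.00, 0.97]])
    weights, composite = shannon_dea_combine(scores)
    ```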

  20. Efficient Multi-Label Feature Selection Using Entropy-Based Label Selection

    Directory of Open Access Journals (Sweden)

    Jaesung Lee

    2016-11-01

    Full Text Available Multi-label feature selection is designed to select a subset of features according to their importance to multiple labels. This task can be achieved by ranking the dependencies of features and selecting the features with the highest rankings. In a multi-label feature selection problem, the algorithm may be faced with a dataset containing a large number of labels. Because the computational cost of multi-label feature selection increases according to the number of labels, the algorithm may suffer from a degradation in performance when processing very large datasets. In this study, we propose an efficient multi-label feature selection method based on an information-theoretic label selection strategy. By identifying a subset of labels that significantly influence the importance of features, the proposed method efficiently outputs a feature subset. Experimental results demonstrate that the proposed method can identify a feature subset much faster than conventional multi-label feature selection methods for large multi-label datasets.
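
    One way to make the label-selection idea concrete is sketched below: labels are scored by Shannon entropy as a cheap proxy for their influence on feature importance, and features are then ranked by mutual information with only the selected labels. The scoring rule, function names and budget are illustrative assumptions; the paper's information-theoretic strategy differs in its details.

    ```python
    import numpy as np

    def entropy(v):
        """Shannon entropy (bits) of a discrete integer vector."""
        _, counts = np.unique(v, return_counts=True)
        p = counts / counts.sum()
        return float(-np.sum(p * np.log2(p)))

    def mutual_info(a, b):
        """I(A; B) = H(A) + H(B) - H(A, B) for non-negative integer vectors."""
        joint = a * (b.max() + 1) + b            # unique code for each (a, b) pair
        return entropy(a) + entropy(b) - entropy(joint)

    def rank_features(X, Y, label_budget=5):
        """Rank discrete features against only the highest-entropy labels."""
        label_scores = [entropy(Y[:, j]) for j in range(Y.shape[1])]
        chosen = np.argsort(label_scores)[::-1][:label_budget]
        feat_scores = [sum(mutual_info(X[:, f], Y[:, j]) for j in chosen)
                       for f in range(X.shape[1])]
        return np.argsort(feat_scores)[::-1]     # most relevant features first
    ```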

  1. Entropy of Iterated Function Systems and Their Relations with Black Holes and Bohr-Like Black Holes Entropies

    Directory of Open Access Journals (Sweden)

    Christian Corda

    2018-01-01

    Full Text Available In this paper we consider the metric entropies of the maps of an iterated function system deduced from a black hole, which are known as the Bekenstein–Hawking entropy and its subleading corrections. More precisely, we consider the recent model of a Bohr-like black hole that has been analysed in several papers in the literature, obtaining the intriguing result that the metric entropies of a black hole are created by the metric entropies of the functions generated by the black hole principal quantum numbers, i.e., by the black hole quantum levels. We present a new type of topological entropy for general iterated function systems based on a new kind of inverse of covers. The notion of metric entropy for an iterated function system (IFS) is then considered, and we prove that these definitions of topological entropy for IFSs are equivalent. It is shown that this kind of topological entropy retains some properties which hold for the classic definition of topological entropy of a continuous map. We also consider average entropy as another type of topological entropy for an IFS, based on the topological entropies of its elements, which is likewise an invariant under topological conjugacy. The relation between Axiom A and the average entropy is investigated.

  2. Maximum-entropy clustering algorithm and its global convergence analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Constructing a batch of differentiable entropy functions to uniformly approximate an objective function by means of the maximum-entropy principle, a new clustering algorithm, called the maximum-entropy clustering algorithm, is proposed based on optimization theory. This algorithm is a soft generalization of the hard C-means algorithm and possesses global convergence. Its relations with other clustering algorithms are also discussed.
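
    A minimal sketch of the entropy-regularised soft-assignment idea follows (not the authors' exact algorithm): memberships are Boltzmann weights of squared distances, so a large beta recovers hard C-means assignments while a small beta smooths them.

    ```python
    import numpy as np

    def max_entropy_clustering(X, k, beta=2.0, iters=100, seed=0):
        """Soft clustering with entropy regularisation; beta is an inverse temperature."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)   # (n, k) distances
            u = np.exp(-beta * (d2 - d2.min(axis=1, keepdims=True)))    # stabilised weights
            u /= u.sum(axis=1, keepdims=True)                           # soft memberships
            centers = (u.T @ X) / u.sum(axis=0)[:, None]                # weighted means
        return centers, u
    ```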

  3. Investigation of phase stability of novel equiatomic FeCoNiCuZn based-high entropy alloy prepared by mechanical alloying

    Science.gov (United States)

    Soni, Vinay Kumar; Sanyal, S.; Sinha, S. K.

    2018-05-01

    The present work reports the structural and phase stability analysis of an equiatomic FeCoNiCuZn high entropy alloy (HEA) system prepared by mechanical alloying (MA). In this research effort, some 1287 alloy combinations were extensively studied to arrive at the most favourable combination. The FeCoNiCuZn-based alloy system was selected on the basis of physicochemical parameters such as the enthalpy of mixing (ΔHmix), entropy of mixing (ΔSmix), atomic size difference (ΔX) and valence electron concentration (VEC), such that it fulfils the formation criteria of a stable multi-component high entropy alloy system. In this context, we have investigated the effect of novel alloying additions on microstructure and phase formation. XRD plots of the MA samples show the formation of a stable solid solution with an FCC (face-centred cubic) structure after 20 h of milling, with no indication of any amorphous or intermetallic phase formation. Our results are in good agreement with the calculations and analysis performed on the basis of the physicochemical parameters during selection of the constituent elements of the HEA.
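
    The screening parameters named above are simple to compute for a candidate composition. A small sketch for the equiatomic case follows; the VEC values are standard handbook numbers, and the thresholds quoted in the comments (ΔSmix ≥ 1.5R, VEC ≥ 8 favouring FCC) are the commonly used rules of thumb rather than necessarily the exact criteria applied in the paper.

    ```python
    import numpy as np

    R = 8.314  # gas constant, J/(mol K)

    VEC = {"Fe": 8, "Co": 9, "Ni": 10, "Cu": 11, "Zn": 12}  # valence electron counts

    def mixing_entropy(fractions):
        """Ideal configurational entropy of mixing, -R * sum(c_i ln c_i)."""
        c = np.asarray(fractions, dtype=float)
        c = c / c.sum()
        return float(-R * np.sum(c * np.log(c)))

    def mean_vec(elements, fractions):
        """Composition-weighted valence electron concentration."""
        c = np.asarray(fractions, dtype=float)
        c = c / c.sum()
        return float(np.dot(c, [VEC[e] for e in elements]))

    elements = ["Fe", "Co", "Ni", "Cu", "Zn"]
    c = [0.2] * 5                      # equiatomic composition
    print(mixing_entropy(c))           # R ln 5 ~ 13.38 J/(mol K), above the 1.5R rule
    print(mean_vec(elements, c))       # 10.0; VEC >= 8 is the usual FCC indicator
    ```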

  4. Gravitational entropies in LTB dust models

    International Nuclear Information System (INIS)

    Sussman, Roberto A; Larena, Julien

    2014-01-01

    We consider generic Lemaître–Tolman–Bondi (LTB) dust models to probe the gravitational entropy proposals of Clifton, Ellis and Tavakol (CET) and of Hosoya and Buchert (HB). We also consider a variant of the HB proposal based on a suitable quasi-local scalar weighted average. We show that the conditions for entropy growth for all proposals are directly related to a negative correlation of similar fluctuations of the energy density and Hubble scalar. While this correlation is evaluated locally for the CET proposal, it must be evaluated in a non-local domain dependent manner for the two HB proposals. By looking at the fulfilment of these conditions at the relevant asymptotic limits we are able to provide a well-grounded qualitative description of the full time evolution and radial asymptotic scaling of the three entropies in generic models. The following rigorous analytic results are obtained for the three proposals: (i) entropy grows when the density growing mode is dominant; (ii) all ever-expanding hyperbolic models reach a stable terminal equilibrium characterized by an inhomogeneous entropy maximum in their late time evolution; (iii) regions with decaying modes and collapsing elliptic models exhibit unstable equilibria associated with an entropy minimum; (iv) near singularities the CET entropy diverges while the HB entropies converge; (v) the CET entropy converges for all models in the radial asymptotic range, whereas the HB entropies only converge for models asymptotic to a Friedmann–Lemaître–Robertson–Walker background. The fact that different independent proposals yield fairly similar conditions for entropy production, time evolution and radial scaling in generic LTB models seems to suggest that their common notion of a ‘gravitational entropy’ may be a theoretically robust concept applicable to more general spacetimes. (paper)

  5. A Theoretical Basis for Entropy-Scaling Effects in Human Mobility Patterns.

    Science.gov (United States)

    Osgood, Nathaniel D; Paul, Tuhin; Stanley, Kevin G; Qian, Weicheng

    2016-01-01

    Characterizing how people move through space has been an important component of many disciplines. With the advent of automated data collection through GPS and other location sensing systems, researchers have the opportunity to examine human mobility at spatio-temporal resolution heretofore impossible. However, the copious and complex data collected through these logging systems can be difficult for humans to fully exploit, leading many researchers to propose novel metrics for encapsulating movement patterns in succinct and useful ways. A particularly salient proposed metric is the mobility entropy rate of the string representing the sequence of locations visited by an individual. However, mobility entropy rate is not scale invariant: entropy rate calculations based on measurements of the same trajectory at varying spatial or temporal granularity do not yield the same value, limiting the utility of mobility entropy rate as a metric by confounding inter-experimental comparisons. In this paper, we derive a scaling relationship for mobility entropy rate of non-repeating straight line paths from the definition of Lempel-Ziv compression. We show that the resulting formulation predicts the scaling behavior of simulated mobility traces, and provides an upper bound on mobility entropy rate under certain assumptions. We further show that this formulation has a maximum value for a particular sampling rate, implying that optimal sampling rates for particular movement patterns exist.
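
    A common variant of the Lempel-Ziv entropy-rate estimator discussed above is sketched below. It operates on a sequence of discrete cell IDs, and the quadratic substring search is deliberately naive for clarity; nothing here reproduces the paper's derivation itself.

    ```python
    import numpy as np

    def lz_entropy_rate(symbols):
        """Match-length Lempel-Ziv estimate of the entropy rate (bits per symbol).

        S ~ n * log2(n) / sum(Lambda_i), where Lambda_i is the length of the
        shortest substring starting at position i not seen in symbols[:i].
        """
        s = list(symbols)
        n = len(s)
        total = 0
        for i in range(n):
            k = 1
            # Grow the window until s[i:i+k] is unseen in the past or runs off the end.
            while i + k <= n and _contains(s[:i], s[i:i + k]):
                k += 1
            total += k
        return n * np.log2(n) / total

    def _contains(past, pattern):
        p = len(pattern)
        return any(past[j:j + p] == pattern for j in range(len(past) - p + 1))
    ```

    Because the symbol alphabet depends on the spatial grid and the sampling interval, re-running the estimator on the same trajectory at a different granularity changes the value; this is exactly the scale dependence the paper formalises.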

  6. ENTROPIES AND FLUX-SPLITTINGS FOR THE ISENTROPIC EULER EQUATIONS

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The authors establish the existence of a large class of mathematical entropies (the so-called weak entropies) associated with the Euler equations for an isentropic, compressible fluid governed by a general pressure law. Only a mild assumption on the behavior of the pressure law near the vacuum is required. The analysis is based on an asymptotic expansion of the fundamental solution (called here the entropy kernel) of a highly singular Euler-Poisson-Darboux equation. The entropy kernel is only Hölder continuous and its regularity is carefully investigated. Relying on a notion introduced earlier by the authors, it is also proven that, for the Euler equations, the set of entropy flux-splittings coincides with the set of entropy-entropy flux pairs. These results imply the existence of a flux-splitting consistent with all of the entropy inequalities.

  7. The Conditional Entropy Power Inequality for Bosonic Quantum Systems

    DEFF Research Database (Denmark)

    de Palma, Giacomo; Trevisan, Dario

    2018-01-01

    We prove the conditional Entropy Power Inequality for Gaussian quantum systems. This fundamental inequality determines the minimum quantum conditional von Neumann entropy of the output of the beam-splitter or of the squeezing among all the input states where the two inputs are conditionally independent given the memory and have given quantum conditional entropies. We also prove that, for any couple of values of the quantum conditional entropies of the two inputs, the minimum of the quantum conditional entropy of the output given by the conditional Entropy Power Inequality is asymptotically achieved by a suitable sequence of quantum Gaussian input states. Our proof of the conditional Entropy Power Inequality is based on a new Stam inequality for the quantum conditional Fisher information and on the determination of the universal asymptotic behaviour of the quantum conditional entropy under...

  8. Entropy generation of nanofluid flow in a microchannel heat sink

    Science.gov (United States)

    Manay, Eyuphan; Akyürek, Eda Feyza; Sahin, Bayram

    2018-06-01

    The present study investigates the effect of nano-sized TiO2 particles in the base fluid on the entropy generation rate in a microchannel heat sink. Pure water was chosen as the base fluid, and TiO2 particles were suspended in it at five particle volume fractions: 0.25%, 0.5%, 1.0%, 1.5% and 2.0%. Under laminar, steady-state flow and constant-heat-flux boundary conditions, the thermal, frictional and total entropy generation rates and the entropy generation number ratios of the nanofluids were experimentally analyzed in microchannel flow for channel heights of 200 μm, 300 μm, 400 μm and 500 μm. It was observed that the frictional and total entropy generation rates increased, while the thermal entropy generation rate decreased, with increasing particle volume fraction. In microchannel flows, thermal entropy generation could be neglected because its rate, below 1.10e-07, is a negligible share of the total entropy generation. Larger channel heights caused higher thermal entropy generation rates: increasing the channel height yielded an increase of 30% to 52% in thermal entropy generation, while decreasing the channel height produced an increase of 66%-98% in frictional entropy generation. Adding TiO2 nanoparticles to the base fluid decreased thermal entropy generation by about 1.8%-32.4% and increased frictional entropy generation by about 3.3%-21.6%.

  9. Entropy of black holes with multiple horizons

    Directory of Open Access Journals (Sweden)

    Yun He

    2018-05-01

    Full Text Available We examine the entropy of black holes in de Sitter space and black holes surrounded by quintessence. These black holes have multiple horizons, including at least the black hole event horizon and a horizon outside it (a cosmological horizon for de Sitter black holes and a "quintessence horizon" for black holes surrounded by quintessence). Based on the consideration that the two horizons are not independent of each other, we conjecture that the total entropy of these black holes should not be simply the sum of the entropies of the two horizons, but should have an extra term coming from the correlations between the two horizons. Differently from our previous works, in this paper we consider the cosmological constant as a variable and employ an effective method to derive the explicit form of the entropy. We also discuss the thermodynamic stabilities of these black holes according to the entropy and the effective temperature.

  10. Entropy of black holes with multiple horizons

    Science.gov (United States)

    He, Yun; Ma, Meng-Sen; Zhao, Ren

    2018-05-01

    We examine the entropy of black holes in de Sitter space and black holes surrounded by quintessence. These black holes have multiple horizons, including at least the black hole event horizon and a horizon outside it (a cosmological horizon for de Sitter black holes and a "quintessence horizon" for black holes surrounded by quintessence). Based on the consideration that the two horizons are not independent of each other, we conjecture that the total entropy of these black holes should not be simply the sum of the entropies of the two horizons, but should have an extra term coming from the correlations between the two horizons. Differently from our previous works, in this paper we consider the cosmological constant as a variable and employ an effective method to derive the explicit form of the entropy. We also discuss the thermodynamic stabilities of these black holes according to the entropy and the effective temperature.

  11. Entanglement entropy in top-down models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Peter A.R.; Taylor, Marika [Mathematical Sciences and STAG Research Centre, University of Southampton,Highfield, Southampton, SO17 1BJ (United Kingdom)

    2016-08-26

    We explore holographic entanglement entropy in ten-dimensional supergravity solutions. It has been proposed that entanglement entropy can be computed in such top-down models using minimal surfaces which asymptotically wrap the compact part of the geometry. We show explicitly in a wide range of examples that the holographic entanglement entropy thus computed agrees with the entanglement entropy computed using the Ryu-Takayanagi formula from the lower-dimensional Einstein metric obtained from reduction over the compact space. Our examples include not only consistent truncations but also cases in which no consistent truncation exists and Kaluza-Klein holography is used to identify the lower-dimensional Einstein metric. We then give a general proof, based on the Lewkowycz-Maldacena approach, of the top-down entanglement entropy formula.

  12. Entanglement entropy in top-down models

    International Nuclear Information System (INIS)

    Jones, Peter A.R.; Taylor, Marika

    2016-01-01

    We explore holographic entanglement entropy in ten-dimensional supergravity solutions. It has been proposed that entanglement entropy can be computed in such top-down models using minimal surfaces which asymptotically wrap the compact part of the geometry. We show explicitly in a wide range of examples that the holographic entanglement entropy thus computed agrees with the entanglement entropy computed using the Ryu-Takayanagi formula from the lower-dimensional Einstein metric obtained from reduction over the compact space. Our examples include not only consistent truncations but also cases in which no consistent truncation exists and Kaluza-Klein holography is used to identify the lower-dimensional Einstein metric. We then give a general proof, based on the Lewkowycz-Maldacena approach, of the top-down entanglement entropy formula.

  13. Problems in black-hole entropy interpretation

    International Nuclear Information System (INIS)

    Liberati, S.

    1997-01-01

    In this work some proposals for the interpretation of black-hole entropy are presented and investigated. In particular, the author first considers the so-called 'entanglement entropy' interpretation, in the framework of the brick wall model, and the divergence problem arising in the one-loop calculations of various thermodynamical quantities, like entropy, internal energy and heat capacity. It is shown that the assumption of equality between the entanglement entropy and the Bekenstein-Hawking one appears to give inconsistent results. These are a starting point for a different interpretation of black-hole entropy based on peculiar topological structures of manifolds with 'intrinsic' thermodynamical features. It is possible to show an exact relation between black-hole gravitational entropy and the topology of these Euclidean space-times. The expression for the Euler characteristic, through the Gauss-Bonnet integral, and the one for the entropy of gravitational instantons are proposed in a form which makes the relation between the two self-evident. Using this relation, the author proposes a generalization of the Bekenstein-Hawking entropy in which the latter and the Euler characteristic are related by the equation S = χA / 8. Finally, some conclusions and hypotheses about possible further developments of this research are presented.

  14. Spatiotemporal modeling of ozone levels in Quebec (Canada): a comparison of kriging, land-use regression (LUR), and combined Bayesian maximum entropy-LUR approaches.

    Science.gov (United States)

    Adam-Poupart, Ariane; Brand, Allan; Fournier, Michel; Jerrett, Michael; Smargiassi, Audrey

    2014-09-01

    Ambient air ozone (O3) is a pulmonary irritant that has been associated with respiratory health effects including increased lung inflammation and permeability, airway hyperreactivity, respiratory symptoms, and decreased lung function. Estimation of O3 exposure is a complex task because the pollutant exhibits complex spatiotemporal patterns. To refine the quality of exposure estimation, various spatiotemporal methods have been developed worldwide. We sought to compare the accuracy of three spatiotemporal models in predicting summer ground-level O3 in Quebec, Canada. We developed a land-use mixed-effects regression (LUR) model based on readily available data (air quality and meteorological monitoring data, road network information, latitude), a Bayesian maximum entropy (BME) model incorporating both O3 monitoring station data and the land-use mixed model outputs (BME-LUR), and a kriging model based only on available O3 monitoring station data (BME kriging). We performed leave-one-station-out cross-validation and visually assessed the predictive capability of each model by examining the mean temporal and spatial distributions of the average estimated errors. The BME-LUR was the best predictive model (R2 = 0.653), with the lowest root mean-square error (RMSE = 7.06 ppb), followed by the LUR model (R2 = 0.466, RMSE = 8.747 ppb) and the BME kriging model (R2 = 0.414, RMSE = 9.164 ppb). Our findings suggest that estimation errors in the interpolation of O3 concentrations with BME can be greatly reduced by incorporating outputs from a LUR model developed with readily available data.

  15. Maximum Quantum Entropy Method

    OpenAIRE

    Sim, Jae-Hoon; Han, Myung Joon

    2018-01-01

    Maximum entropy method for analytic continuation is extended by introducing quantum relative entropy. This new method is formulated in terms of matrix-valued functions and therefore invariant under arbitrary unitary transformation of input matrix. As a result, the continuation of off-diagonal elements becomes straightforward. Without introducing any further ambiguity, the Bayesian probabilistic interpretation is maintained just as in the conventional maximum entropy method. The applications o...

  16. Transplanckian entanglement entropy

    International Nuclear Information System (INIS)

    Chang, Darwin; Chu, C.-S.; Lin Fengli

    2004-01-01

    The entanglement entropy of the event horizon is known to be plagued by the UV divergence due to the infinitely blue-shifted near horizon modes. In this Letter we calculate the entanglement entropy using the transplanckian dispersion relation, which has been proposed to model the quantum gravity effects. We show that, very generally, the entropy is rendered UV finite due to the suppression of high energy modes effected by the transplanckian dispersion relation

  17. Entropy-Stabilized Oxides

    Science.gov (United States)

    2015-09-29


  18. Fault feature extraction method based on local mean decomposition Shannon entropy and improved kernel principal component analysis model

    Directory of Open Access Journals (Sweden)

    Jinlu Sheng

    2016-07-01

    Full Text Available To effectively extract the typical features of a bearing, a new method is proposed that combines local mean decomposition (LMD) Shannon entropy with an improved kernel principal component analysis (KPCA) model. First, features are extracted by the time-frequency domain method of local mean decomposition, and Shannon entropy is applied to the separated product functions to obtain the original features. Because the extracted features still contain superfluous information, a nonlinear multi-feature fusion technique, kernel principal component analysis, is introduced to fuse the characteristics; the KPCA is improved by a weight factor. The fused features were input into a Morlet wavelet kernel support vector machine to obtain a bearing running-state classification model, and the bearing running state was thereby identified. Both test and actual cases were analyzed.

  19. Estimation of a tube diameter in a ‘church window’ condenser based on entropy generation minimization

    Directory of Open Access Journals (Sweden)

    Laskowski Rafał

    2015-09-01

    Full Text Available The internal diameter of a tube in a ‘church window’ condenser was estimated using an entropy generation minimization approach. The adopted model took into account the entropy generation due to heat transfer and flow resistance on the cooling-water side. Calculations were performed with two equations for the flow resistance coefficient and four different roughness values of the condenser tube. Following the analysis, the internal diameter of the tube was obtained in the range of 17.5 mm to 20 mm (the current internal diameter of the condenser tube is 22 mm). The calculated diameter depends on, and is positively related to, the roughness assumed in the model.
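
    The entropy-generation-minimisation logic lends itself to a one-dimensional search over candidate diameters. In the sketch below, the correlations (Dittus-Boelter, Blasius) and all property and duty values are illustrative assumptions, not the paper's model or data:

    ```python
    import numpy as np

    # Illustrative water properties and tube duty (assumed, not from the paper).
    rho, mu, k, Pr = 998.0, 1.0e-3, 0.6, 7.0   # kg/m3, Pa s, W/(m K), dimensionless
    T = 300.0                                  # bulk temperature, K
    mdot = 0.25                                # mass flow per tube, kg/s
    q_len = 4.0e3                              # heat input per unit length, W/m

    def entropy_gen_per_length(D):
        """Bejan-style S'_gen(D): thermal plus frictional contributions."""
        Re = 4.0 * mdot / (np.pi * D * mu)
        Nu = 0.023 * Re**0.8 * Pr**0.4         # Dittus-Boelter, turbulent flow
        f = 0.316 * Re**-0.25                  # Blasius (Darcy) friction factor
        s_thermal = q_len**2 / (np.pi * k * Nu * T**2)
        s_friction = 8.0 * f * mdot**3 / (np.pi**2 * rho**2 * T * D**5)
        return s_thermal + s_friction

    D = np.linspace(0.010, 0.030, 400)         # candidate internal diameters, m
    D_opt = D[np.argmin(entropy_gen_per_length(D))]
    ```

    The optimum arises from opposing trends: widening the tube reduces the frictional term (which scales roughly as D**-4.75 at fixed flow) but weakens convection and raises the thermal term.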

  20. Spare optimistic based on improved ADMM and the minimum entropy de-convolution for the early weak fault diagnosis of bearings in marine systems.

    Science.gov (United States)

    Gao, Yangde; Karimi, Mohammad; Kudreyko, Aleksey A; Song, Wanqing

    2017-12-30

    In marine systems, engines are the most important part of a ship, and within engines the probability of bearing faults is the highest; in bearing vibration analysis, early weak fault detection is therefore very important for long-term monitoring. In this paper, we propose a novel method for the early weak fault diagnosis of bearings. Firstly, we improve the alternating direction method of multipliers (ADMM) by changing the structure of the traditional ADMM, and then apply the improved ADMM to compressed sensing (CS) theory, which realizes the sparse optimization of the bearing signal for a large amount of data. After the sparse signal is reconstructed, the recovered signal is processed with minimum entropy de-convolution (MED) to obtain clear fault information. Finally, we adopt the sample entropy, the morphological mean square amplitude and the root mean square (RMS) to identify the early fault, and plot a boxplot comparison chart to find the best of the three indicators. The experimental results prove that the proposed method can effectively identify early weak faults. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
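
    The sparse-optimisation step rests on a standard ADMM splitting; a generic LASSO sketch is given below for orientation. It is not the paper's modified ADMM, whose structural changes are the contribution, and lam, rho and the iteration count are arbitrary placeholders.

    ```python
    import numpy as np

    def admm_lasso(A, b, lam=0.1, rho=1.0, iters=200):
        """Generic ADMM for min 0.5*||Ax - b||^2 + lam*||x||_1 (sparse recovery)."""
        m, n = A.shape
        x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
        L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))   # cached factorisation
        Atb = A.T @ b
        soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
        for _ in range(iters):
            x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
            z = soft(x + u, lam / rho)     # proximal step for the l1 penalty
            u += x - z                     # scaled dual update
        return z
    ```

    In the pipeline described above, the reconstructed sparse signal would then be passed through minimum entropy de-convolution before the indicators are computed.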

  1. More dimensions: Less entropy

    International Nuclear Information System (INIS)

    Kolb, E.W.; Lindley, D.; Seckel, D.

    1984-01-01

    For a cosmological model with d noncompact and D compact spatial dimensions and symmetry R^1 × S^d × S^D, we calculate the entropy produced in the d dimensions due to the compactification of the D dimensions and show it to be too small to be of cosmological interest. Although insufficient entropy is produced in the model we study, the contraction of extra dimensions does lead to entropy production. We discuss modifications of our assumptions, including changing our condition for decoupling of the extra dimensions, which may lead to a large entropy production and change our conclusions.

  2. ENTROPY FUNCTIONAL FOR CONTINUOUS SYSTEMS OF FINITE ENTROPY

    Institute of Scientific and Technical Information of China (English)

    M. Rahimi; A. Riazi

    2012-01-01

    In this article, we introduce the concept of entropy functional for continuous systems on compact metric spaces, and prove some of its properties. We also extract the Kolmogorov entropy from the entropy functional.

  3. A Classification Detection Algorithm Based on Joint Entropy Vector against Application-Layer DDoS Attack

    Directory of Open Access Journals (Sweden)

    Yuntao Zhao

    2018-01-01

    Full Text Available The application-layer distributed denial of service (AL-DDoS) attack poses a great threat to cyberspace security. Attack detection is an important part of security protection, providing effective support for the defense system through rapid and accurate identification of attacks. According to the attacker's choice of URL of the Web service, AL-DDoS attacks are divided into three categories: random-URL, fixed-URL and traverse-URL attacks. In order to identify the attacks, a mapping matrix of the joint entropy vector is constructed. By defining and computing the values of EUPI and jEIPU, a visual coordinate discrimination diagram of the entropy vector is proposed, which also reduces the data dimension from N to two. From the boundary discrimination and the region in which the entropy vectors fall, the class of AL-DDoS attack can be distinguished. The study of a training data set and its classification shows that the novel algorithm can effectively distinguish Web-server DDoS attacks from normal burst traffic.
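
    The entropy features underlying the joint-entropy-vector construction are cheap to compute per time window. The sketch below shows only the generic URL-distribution entropy; the paper's EUPI and jEIPU quantities are its own definitions and are not reproduced here.

    ```python
    import numpy as np
    from collections import Counter

    def url_entropy(urls):
        """Shannon entropy (bits) of the URL request distribution in one window."""
        counts = np.array(list(Counter(urls).values()), dtype=float)
        p = counts / counts.sum()
        return float(-np.sum(p * np.log2(p)))

    # Intuition: different attack classes occupy different regions of entropy space.
    window_a = ["/a", "/b", "/c", "/d"]    # random-URL flood: maximal entropy (2 bits)
    window_b = ["/login"] * 4              # fixed-URL flood: zero entropy
    features = [url_entropy(window_a), url_entropy(window_b)]
    ```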

  4. Deep learning for classification of islanding and grid disturbance based on multi-resolution singular spectrum entropy

    Science.gov (United States)

    Li, Tie; He, Xiaoyang; Tang, Junci; Zeng, Hui; Zhou, Chunying; Zhang, Nan; Liu, Hui; Lu, Zhuoxin; Kong, Xiangrui; Yan, Zheng

    2018-02-01

    Since the identification of islanding is easily disturbed by grid disturbances, an island detection device may misjudge, causing the photovoltaic system to be taken out of service. The detection device must therefore be able to distinguish islanding from grid disturbance. In this paper, the concept of deep learning is introduced into the classification of islanding and grid disturbance for the first time. A novel deep learning framework is proposed to detect and classify islanding or grid disturbance. The framework is a hybrid of wavelet transformation, multi-resolution singular spectrum entropy, and a deep learning architecture. As a signal processing method applied after the wavelet transformation, multi-resolution singular spectrum entropy combines multi-resolution analysis and spectrum analysis with entropy as the output, from which we can extract the intrinsic features that distinguish islanding from grid disturbance. With the features extracted, deep learning is utilized to classify islanding and grid disturbance. Simulation results indicate that the method achieves its goal with high accuracy, so that the photovoltaic system's mistaken withdrawal from the power grid can be avoided.

  5. A parametrization of two-dimensional turbulence based on a maximum entropy production principle with a local conservation of energy

    International Nuclear Information System (INIS)

    Chavanis, Pierre-Henri

    2014-01-01

    In the context of two-dimensional (2D) turbulence, we apply the maximum entropy production principle (MEPP) by enforcing a local conservation of energy. This leads to an equation for the vorticity distribution that conserves all the Casimirs, the energy, and that increases monotonically the mixing entropy (H-theorem). Furthermore, the equation for the coarse-grained vorticity dissipates monotonically all the generalized enstrophies. These equations may provide a parametrization of 2D turbulence. They do not generally relax towards the maximum entropy state. The vorticity current vanishes for any steady state of the 2D Euler equation. Interestingly, the equation for the coarse-grained vorticity obtained from the MEPP turns out to coincide, after some algebraic manipulations, with the one obtained with the anticipated vorticity method. This shows a connection between these two approaches when the conservation of energy is treated locally. Furthermore, the newly derived equation, which incorporates a diffusion term and a drift term, has a nice physical interpretation in terms of a selective decay principle. This sheds new light on both the MEPP and the anticipated vorticity method. (paper)

  6. Downstream-Conditioned Maximum Entropy Method for Exit Boundary Conditions in the Lattice Boltzmann Method

    Directory of Open Access Journals (Sweden)

    Javier A. Dottori

    2015-01-01

    Full Text Available A method for modeling outflow boundary conditions in the lattice Boltzmann method (LBM) based on the maximization of the local entropy is presented. The maximization procedure is constrained by macroscopic values and downstream components. The method is applied to fully developed boundary conditions of the Navier-Stokes equations in rectangular channels. Comparisons are made with other alternative methods. In addition, the new downstream-conditioned entropy is studied and it was found that there is a correlation with the velocity gradient during flow development.

  7. Option price calibration from Renyi entropy

    International Nuclear Information System (INIS)

    Brody, Dorje C.; Buckley, Ian R.C.; Constantinou, Irene C.

    2007-01-01

    The calibration of the risk-neutral density function for the future asset price, based on the maximisation of the entropy measure of Renyi, is proposed. Whilst the conventional approach based on the use of logarithmic entropy measure fails to produce the observed power-law distribution when calibrated against option prices, the approach outlined here is shown to produce the desired form of the distribution. Procedures for the maximisation of the Renyi entropy under constraints are outlined in detail, and a number of interesting properties of the resulting power-law distributions are also derived. The result is applied to efficiently evaluate prices of path-independent derivatives

  8. Logarithmic black hole entropy corrections and holographic Rényi entropy

    Science.gov (United States)

    Mahapatra, Subhash

    2018-01-01

    The entanglement and Rényi entropies for spherical entangling surfaces in CFTs with gravity duals can be explicitly calculated by mapping these entropies first to the thermal entropy on hyperbolic space and then, using the AdS/CFT correspondence, to the Wald entropy of topological black holes. Here we extend this idea by taking into account corrections to the Wald entropy. Using the method based on horizon symmetries and the asymptotic Cardy formula, we calculate corrections to the Wald entropy and find that these corrections are proportional to the logarithm of the area of the horizon. With the corrected expression for the entropy of the black hole, we then find corrections to the Rényi entropies. We calculate these corrections for both Einstein and Gauss-Bonnet gravity duals. Corrections with logarithmic dependence on the area of the entangling surface naturally occur at order G_D^0. The entropic c-function and the inequalities of the Rényi entropy are also satisfied even with the correction terms.
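
    As a hedged template, log-corrected black-hole entropies of the kind referred to here are usually written in the form below, where the coefficient α depends on the theory and on which fluctuations are included; the paper derives its corrections from horizon symmetries and the asymptotic Cardy formula rather than from this generic expression:

    ```latex
    S = \frac{A}{4G} + \alpha \,\ln\!\left(\frac{A}{4G}\right) + \cdots
    ```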

  9. Exploring SVM-based intrusion detection through information entropy theory

    Institute of Scientific and Technical Information of China (English)

    朱文杰; 王强; 翟献军

    2013-01-01

    In traditional SVM-based intrusion detection, kernel function construction and feature selection rely on prior knowledge, so accuracy and efficiency are generally low. Combining information entropy theory with the SVM algorithm yields an entropy-based SVM intrusion detection algorithm that improves both the accuracy and the efficiency of intrusion detection. The algorithm has two aspects. On the one hand, the sample features are unified according to the user information entropy and variance contained in the samples, measured by whether a feature falls within a confidence interval; using the resulting sample-feature confidence vector as the constructor of the SVM kernel function both preserves the correspondence between the training sample set and the optimal classification surface and yields the maximum classification margin required for intrusion detection. On the other hand, using the user information content of the samples as a metric to sharply reduce the sample feature subset not only lowers the computational scale but also speeds up classifier training. Experiments show that this algorithm outperforms the traditional SVM algorithm when applied in intrusion detection systems.

  10. Hypoglycemia-Related Electroencephalogram Changes Assessed by Multiscale Entropy

    DEFF Research Database (Denmark)

    Fabris, C.; Sparacino, G.; Sejling, A. S.

    2014-01-01

    derivation in the two glycemic intervals was assessed using the multiscale entropy (MSE) approach, obtaining measures of sample entropy (SampEn) at various temporal scales. The comparison of how signal irregularity measured by SampEn varies as the temporal scale increases in the two glycemic states provides...

  11. Dispersion entropy for the analysis of resting-state MEG regularity in Alzheimer's disease.

    Science.gov (United States)

    Azami, Hamed; Rostaghi, Mostafa; Fernandez, Alberto; Escudero, Javier

    2016-08-01

    Alzheimer's disease (AD) is a progressive degenerative brain disorder affecting memory, thinking, behaviour and emotion. It is the most common form of dementia and a big social problem in western societies. The analysis of brain activity may help to diagnose this disease. Changes in entropy methods have been reported useful in research studies to characterize AD. We have recently proposed dispersion entropy (DisEn) as a very fast and powerful tool to quantify the irregularity of time series. The aim of this paper is to evaluate the ability of DisEn, in comparison with fuzzy entropy (FuzEn), sample entropy (SampEn), and permutation entropy (PerEn), to discriminate 36 AD patients from 26 elderly control subjects using resting-state magnetoencephalogram (MEG) signals. The results obtained by DisEn, FuzEn, and SampEn, unlike PerEn, show that the AD patients' signals are more regular than controls' time series. The p-values obtained by the DisEn-, FuzEn-, SampEn-, and PerEn-based methods demonstrate the superiority of DisEn over FuzEn, SampEn, and PerEn. Moreover, the computation time for the newly proposed DisEn-based method is noticeably less than for the FuzEn, SampEn, and PerEn based approaches.
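
    For orientation, a compact implementation of dispersion entropy following the published recipe (normal-CDF mapping, class assignment, dispersion-pattern counting) is sketched below; the parameters m, c and d are typical defaults rather than the values used in this study.

    ```python
    import numpy as np
    from math import erf

    def dispersion_entropy(x, m=2, c=6, d=1):
        """Normalised dispersion entropy of a 1-D signal (values in [0, 1])."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        # 1. Map the signal to (0, 1) with the normal CDF of its own statistics.
        y = np.array([0.5 * (1.0 + erf((v - x.mean()) / (x.std() * np.sqrt(2))))
                      for v in x])
        # 2. Assign each sample to one of c classes.
        z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)
        # 3. Collect dispersion patterns of length m with delay d.
        patterns = [tuple(z[i + j * d] for j in range(m))
                    for i in range(n - (m - 1) * d)]
        # 4. Shannon entropy of the pattern distribution, normalised by ln(c**m).
        _, counts = np.unique(patterns, axis=0, return_counts=True)
        p = counts / counts.sum()
        return float(-np.sum(p * np.log(p)) / np.log(c ** m))
    ```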

  12. Enthalpy–entropy compensation

    Indian Academy of Sciences (India)

    Enthalpy–entropy compensation is the name given to the correlation sometimes observed between the estimates of the enthalpy and entropy of a reaction obtained from temperature-dependence data. Although the mainly artefactual nature of this correlation has been known for many years, the subject enjoys periodical ...

  13. Entropy in Biology

    Indian Academy of Sciences (India)

    During the process of ageing, the balance shifts in the direction of anarchy. Death is ... tion of life and the laws of statistical physics and entropy, both of which ... capable of doing work. ... defined by Ludwig Boltzmann in 1877, the entropy of the.

  14. The holographic entropy cone

    Energy Technology Data Exchange (ETDEWEB)

    Bao, Ning [Institute for Quantum Information and Matter, California Institute of Technology,Pasadena, CA 91125 (United States); Walter Burke Institute for Theoretical Physics, California Institute of Technology,452-48, Pasadena, CA 91125 (United States); Nezami, Sepehr [Stanford Institute for Theoretical Physics, Stanford University,Stanford, CA 94305 (United States); Ooguri, Hirosi [Walter Burke Institute for Theoretical Physics, California Institute of Technology,452-48, Pasadena, CA 91125 (United States); Kavli Institute for the Physics and Mathematics of the Universe, University of Tokyo,Kashiwa 277-8583 (Japan); Stoica, Bogdan [Walter Burke Institute for Theoretical Physics, California Institute of Technology,452-48, Pasadena, CA 91125 (United States); Sully, James [Theory Group, SLAC National Accelerator Laboratory, Stanford University,Menlo Park, CA 94025 (United States); Walter, Michael [Stanford Institute for Theoretical Physics, Stanford University,Stanford, CA 94305 (United States)

    2015-09-21

    We initiate a systematic enumeration and classification of entropy inequalities satisfied by the Ryu-Takayanagi formula for conformal field theory states with smooth holographic dual geometries. For 2, 3, and 4 regions, we prove that the strong subadditivity and the monogamy of mutual information give the complete set of inequalities. This is in contrast to the situation for generic quantum systems, where a complete set of entropy inequalities is not known for 4 or more regions. We also find an infinite new family of inequalities applicable to 5 or more regions. The set of all holographic entropy inequalities bounds the phase space of Ryu-Takayanagi entropies, defining the holographic entropy cone. We characterize this entropy cone by reducing geometries to minimal graph models that encode the possible cutting and gluing relations of minimal surfaces. We find that, for a fixed number of regions, there are only finitely many independent entropy inequalities. To establish new holographic entropy inequalities, we introduce a combinatorial proof technique that may also be of independent interest in Riemannian geometry and graph theory.
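
    For concreteness, the two families of inequalities referred to above, where S(X) denotes the entanglement entropy of region X:

        \[ S(AB) + S(BC) \ge S(B) + S(ABC) \quad \text{(strong subadditivity)} \]
        \[ S(AB) + S(BC) + S(AC) \ge S(A) + S(B) + S(C) + S(ABC) \quad \text{(monogamy of mutual information)} \]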

  15. The holographic entropy cone

    International Nuclear Information System (INIS)

    Bao, Ning; Nezami, Sepehr; Ooguri, Hirosi; Stoica, Bogdan; Sully, James; Walter, Michael

    2015-01-01

    We initiate a systematic enumeration and classification of entropy inequalities satisfied by the Ryu-Takayanagi formula for conformal field theory states with smooth holographic dual geometries. For 2, 3, and 4 regions, we prove that the strong subadditivity and the monogamy of mutual information give the complete set of inequalities. This is in contrast to the situation for generic quantum systems, where a complete set of entropy inequalities is not known for 4 or more regions. We also find an infinite new family of inequalities applicable to 5 or more regions. The set of all holographic entropy inequalities bounds the phase space of Ryu-Takayanagi entropies, defining the holographic entropy cone. We characterize this entropy cone by reducing geometries to minimal graph models that encode the possible cutting and gluing relations of minimal surfaces. We find that, for a fixed number of regions, there are only finitely many independent entropy inequalities. To establish new holographic entropy inequalities, we introduce a combinatorial proof technique that may also be of independent interest in Riemannian geometry and graph theory.

  16. Entropy and Digital Installation

    Directory of Open Access Journals (Sweden)

    Susan Ballard

    2005-01-01

    This paper examines entropy as a process which introduces ideas of distributed materiality to digital installation. Beginning from an analysis of entropy as both force and probability measure within information theory and its extension in Rudolf Arnheim’s text “Entropy and Art”, it develops an argument for the positive rather than negative forces of entropy. The paper centres on a discussion of two recent works by New Zealand artists Ronnie van Hout (“On the Run”, Wellington City Gallery, NZ, 2004) and Alex Monteith (“Invisible Cities”, Physics Room Contemporary Art Space, Christchurch, NZ, 2004). Ballard suggests that entropy, rather than being a hindrance to understanding or a random chaotic force, discloses a necessary and material politics of noise present in digital installation.

  17. Entropy viscosity method for nonlinear conservation laws

    KAUST Repository

    Guermond, Jean-Luc

    2011-05-01

    A new class of high-order numerical methods for approximating nonlinear conservation laws is described (entropy viscosity method). The novelty is that a nonlinear viscosity based on the local size of an entropy production is added to the numerical discretization at hand. This new approach does not use any flux or slope limiters, applies to equations or systems supplemented with one or more entropy inequalities and does not depend on the mesh type and polynomial approximation. Various benchmark problems are solved with finite elements, spectral elements and Fourier series to illustrate the capability of the proposed method. © 2010 Elsevier Inc.
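
    A toy one-dimensional illustration of the idea, not the authors' implementation: for Burgers' equation u_t + (u^2/2)_x = 0 with entropy pair E = u^2/2 and entropy flux F = u^3/3, the local entropy residual sets a nonlinear viscosity, capped by a first-order upwind value. The constants c_max and c_e are assumed tuning parameters.

        import numpy as np

        def entropy_viscosity(u, u_prev, dx, dt, c_max=0.5, c_e=1.0):
            """Nonlinear viscosity field from the local entropy residual (sketch)."""
            E, E_prev = 0.5 * u**2, 0.5 * u_prev**2
            F = u**3 / 3.0                                   # entropy flux for Burgers
            residual = np.abs((E - E_prev) / dt + np.gradient(F, dx))
            norm = max(np.abs(E - E.mean()).max(), 1e-12)    # normalization of the residual
            nu_entropy = c_e * dx**2 * residual / norm
            nu_upwind = c_max * dx * np.abs(u)               # first-order ceiling
            return np.minimum(nu_upwind, nu_entropy)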

  18. Entropy viscosity method for nonlinear conservation laws

    KAUST Repository

    Guermond, Jean-Luc; Pasquetti, Richard; Popov, Bojan

    2011-01-01

    A new class of high-order numerical methods for approximating nonlinear conservation laws is described (entropy viscosity method). The novelty is that a nonlinear viscosity based on the local size of an entropy production is added to the numerical discretization at hand. This new approach does not use any flux or slope limiters, applies to equations or systems supplemented with one or more entropy inequalities and does not depend on the mesh type and polynomial approximation. Various benchmark problems are solved with finite elements, spectral elements and Fourier series to illustrate the capability of the proposed method. © 2010 Elsevier Inc.

  19. Permutation Entropy for Random Binary Sequences

    Directory of Open Access Journals (Sweden)

    Lingfeng Liu

    2015-12-01

    In this paper, we generalize the permutation entropy (PE) measure to binary sequences, based on Shannon’s entropy, and theoretically analyze this measure for random binary sequences. We deduce the theoretical value of PE for random binary sequences, which can be used to measure the randomness of binary sequences. We also reveal the relationship between this PE measure and other randomness measures, such as Shannon’s entropy and Lempel–Ziv complexity. The results show that PE is consistent with these two measures. Furthermore, we use PE as one of the randomness measures to evaluate the randomness of chaotic binary sequences.
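
    For orientation, a sketch of the standard Bandt-Pompe permutation entropy for a real-valued series, normalized by log(m!); the paper's contribution, a generalization of this measure to binary sequences, is not reproduced here.

        import numpy as np
        from math import factorial

        def permutation_entropy(x, m=3, delay=1):
            """Normalized Bandt-Pompe permutation entropy (sketch)."""
            x = np.asarray(x, dtype=float)
            n = len(x) - (m - 1) * delay
            counts = {}
            for i in range(n):
                window = x[i : i + m * delay : delay]
                pattern = tuple(np.argsort(window, kind="stable"))  # ordinal pattern
                counts[pattern] = counts.get(pattern, 0) + 1
            p = np.array(list(counts.values())) / n
            return -(p * np.log(p)).sum() / np.log(factorial(m))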

  20. Density estimation by maximum quantum entropy

    International Nuclear Information System (INIS)

    Silver, R.N.; Wallstrom, T.; Martz, H.F.

    1993-01-01

    A new Bayesian method for non-parametric density estimation is proposed, based on a mathematical analogy to quantum statistical physics. The mathematical procedure is related to maximum entropy methods for inverse problems and image reconstruction. The information divergence enforces global smoothing toward default models, convexity, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing is enforced by constraints on differential operators. The linear response of the estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood (evidence). The method is demonstrated on textbook data sets

  1. Time dependence of entropy flux and entropy production for a dynamical system driven by noises with coloured cross-correlation

    Institute of Scientific and Technical Information of China (English)

    Xie Wen-Xian; Xu Wei; Cai Li

    2007-01-01

    This paper presents the Fokker-Planck equation of a dynamical system driven by coloured cross-correlated white noises in the absence and presence of a small external force. Based on the Fokker-Planck equation and the definition of Shannon's information entropy, the time dependence of entropy flux and entropy production can be calculated. The present results can be used to explain the extremal behaviour of the time dependence of entropy flux and entropy production in terms of the dissipative parameter γ of the system, the coloured cross-correlation time τ and the coloured cross-correlation strength λ.

  2. Time Dependence of Entropy Flux and Entropy Production of a Dissipative Dynamical System Driven by Non-Gaussian Noise

    International Nuclear Information System (INIS)

    Guo Yongfeng; Xu Wei; Li Dongxi; Xie Wenxian

    2008-01-01

    A stochastic dissipative dynamical system driven by non-Gaussian noise is investigated. A general approximate Fokker-Planck equation of the system is derived through a path-integral approach. Based on the definition of Shannon's information entropy, the exact time dependence of entropy flux and entropy production of the system is calculated both in the absence and in the presence of a non-equilibrium constraint. The present calculation can be used to interpret the effects of the dissipative constant and non-Gaussian noise on the entropy flux and entropy production.

  3. Entropy resistance minimization: An alternative method for heat exchanger analyses

    International Nuclear Information System (INIS)

    Cheng, XueTao

    2013-01-01

    In this paper, the concept of entropy resistance is proposed based on the entropy generation analyses of heat transfer processes. It is shown that smaller entropy resistance leads to larger heat transfer rate with fixed thermodynamic force difference and smaller thermodynamic force difference with fixed heat transfer rate, respectively. For the discussed two-stream heat exchangers in which the heat transfer rates are not given and the three-stream heat exchanger with prescribed heat capacity flow rates and inlet temperatures of the streams, smaller entropy resistance leads to larger heat transfer rate. For the two-stream heat exchangers with fixed heat transfer rate, smaller entropy resistance leads to larger effectiveness. Furthermore, it is shown that smaller values of the concepts of entropy generation numbers and modified entropy generation number do not always correspond to better performance of the discussed heat exchangers. - Highlights: • The concept of entropy resistance is defined for heat exchangers. • The concepts based on entropy generation are used to analyze heat exchangers. • Smaller entropy resistance leads to better performance of heat exchangers. • The applicability of entropy generation minimization is conditional

  4. Tsallis Entropy Theory for Modeling in Water Engineering: A Review

    Directory of Open Access Journals (Sweden)

    Vijay P. Singh

    2017-11-01

    Water engineering is an amalgam of engineering (e.g., hydraulics, hydrology, irrigation, ecosystems, environment, water resources) and non-engineering (e.g., social, economic, political) aspects that are needed for planning, designing and managing water systems. These aspects and the associated issues have been dealt with in the literature using different techniques that are based on different concepts and assumptions. A fundamental question that still remains is: Can we develop a unifying theory for addressing these? The second law of thermodynamics permits us to develop a theory that helps address these in a unified manner. This theory can be referred to as the entropy theory. The thermodynamic entropy theory is analogous to the Shannon entropy or the information theory. Perhaps the most popular generalization of the Shannon entropy is the Tsallis entropy. The Tsallis entropy has been applied to a wide spectrum of problems in water engineering. This paper provides an overview of Tsallis entropy theory in water engineering. After some basic description of entropy and Tsallis entropy, a review of its applications in water engineering is presented, based on three types of problems: (1) problems requiring entropy maximization; (2) problems requiring coupling Tsallis entropy theory with another theory; and (3) problems involving physical relations.
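
    For reference, the Tsallis entropy of a discrete distribution {p_i}, which recovers the Boltzmann-Gibbs-Shannon entropy in the limit q → 1:

        \[ S_q = \frac{k}{q-1}\left(1 - \sum_i p_i^{\,q}\right), \qquad
           \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i \]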

  5. Combined Power Quality Disturbances Recognition Using Wavelet Packet Entropies and S-Transform

    Directory of Open Access Journals (Sweden)

    Zhigang Liu

    2015-08-01

    Aiming at combined power quality disturbance recognition, an automated recognition method based on wavelet packet entropy (WPE) and modified incomplete S-transform (MIST) is proposed in this paper. By combining wavelet packet Tsallis singular entropy, energy entropy and MIST, a 13-dimension vector of different power quality (PQ) disturbances, including single disturbances and combined disturbances, is extracted. Then, a ruled decision tree is designed to recognize the combined disturbances. The proposed method is tested and evaluated using a large number of simulated PQ disturbances and some real-life signals, which include voltage sag, swell, interruption, oscillation transient, impulsive transient, harmonics, voltage fluctuation and their combinations. In addition, a comparison of the proposed recognition approach with some existing techniques is made. The experimental results show that the proposed method can effectively recognize the single and combined PQ disturbances.

  6. Nonsymmetric entropy I: basic concepts and results

    OpenAIRE

    Liu, Chengshi

    2006-01-01

    A new concept named nonsymmetric entropy, which generalizes the concepts of Boltzmann's entropy and Shannon's entropy, is introduced. The maximal nonsymmetric entropy principle is proven. Some important distribution laws are derived naturally from the maximal nonsymmetric entropy principle.

  7. Towards operational interpretations of generalized entropies

    Science.gov (United States)

    Topsøe, Flemming

    2010-12-01

    The driving force behind our study has been to overcome the difficulties you encounter when you try to extend the clear and convincing operational interpretations of classical Boltzmann-Gibbs-Shannon entropy to other notions, especially to generalized entropies as proposed by Tsallis. Our approach is philosophical, based on speculations regarding the interplay between truth, belief and knowledge. The main result demonstrates that, accepting philosophically motivated assumptions, the only possible measures of entropy are those suggested by Tsallis - which, as we know, include classical entropy. This result constitutes, so it seems, a more transparent interpretation of entropy than previously available. However, further research to clarify the assumptions is still needed. Our study points to the thesis that one should never consider the notion of entropy in isolation - in order to enable a rich and technically smooth study, further concepts, such as divergence, score functions and descriptors or controls should be included in the discussion. This will clarify the distinction between Nature and Observer and facilitate a game theoretical discussion. The usefulness of this distinction and the subsequent exploitation of game theoretical results - such as those connected with the notion of Nash equilibrium - is demonstrated by a discussion of the Maximum Entropy Principle.

  8. Towards operational interpretations of generalized entropies

    International Nuclear Information System (INIS)

    Topsoee, Flemming

    2010-01-01

    The driving force behind our study has been to overcome the difficulties you encounter when you try to extend the clear and convincing operational interpretations of classical Boltzmann-Gibbs-Shannon entropy to other notions, especially to generalized entropies as proposed by Tsallis. Our approach is philosophical, based on speculations regarding the interplay between truth, belief and knowledge. The main result demonstrates that, accepting philosophically motivated assumptions, the only possible measures of entropy are those suggested by Tsallis - which, as we know, include classical entropy. This result constitutes, so it seems, a more transparent interpretation of entropy than previously available. However, further research to clarify the assumptions is still needed. Our study points to the thesis that one should never consider the notion of entropy in isolation - in order to enable a rich and technically smooth study, further concepts, such as divergence, score functions and descriptors or controls should be included in the discussion. This will clarify the distinction between Nature and Observer and facilitate a game theoretical discussion. The usefulness of this distinction and the subsequent exploitation of game theoretical results - such as those connected with the notion of Nash equilibrium - is demonstrated by a discussion of the Maximum Entropy Principle.

  9. Expected Utility and Entropy-Based Decision-Making Model for Large Consumers in the Smart Grid

    Directory of Open Access Journals (Sweden)

    Bingtuan Gao

    2015-09-01

    In the smart grid, large consumers can procure electric energy from various power sources to meet their load demands. To maximize its profit, each large consumer needs to decide on an energy procurement strategy under risks such as price fluctuations from the spot market and power quality issues. In this paper, an electric energy procurement decision-making model is studied for large consumers who can obtain their electric energy from the spot market, generation companies under bilateral contracts, the options market and self-production facilities in the smart grid. Considering the effect of unqualified electric energy, the profit model of large consumers is formulated. In order to measure the risks from price fluctuations and power quality, expected utility and entropy are employed. Consequently, the expected utility and entropy decision-making model is presented, which helps large consumers to minimize the expected cost of electricity procurement while properly limiting its volatility. Finally, a case study verifies the feasibility and effectiveness of the proposed model.

  10. Discussing Landscape Compositional Scenarios Generated with Maximization of Non-Expected Utility Decision Models Based on Weighted Entropies

    Directory of Open Access Journals (Sweden)

    José Pinto Casquilho

    2017-02-01

    The search for hypothetical optimal solutions of landscape composition is a major issue in landscape planning, and it can be outlined in a two-dimensional decision space involving economic value and landscape diversity, the latter being considered a potential safeguard to the provision of services and externalities not accounted for in the economic value. In this paper, we use decision models with different utility valuations combined with weighted entropies respectively incorporating rarity factors associated with the Gini-Simpson and Shannon measures. A small example of this framework is provided and discussed for landscape compositional scenarios in the region of Nisa, Portugal. The optimal solutions relative to the different cases considered are assessed in the two-dimensional decision space using a benchmark indicator. The results indicate that the likely best combination is achieved by the solution using Shannon weighted entropy and a square root utility function, corresponding to a risk-averse behavior associated with the precautionary principle linked to safeguarding landscape diversity, anchoring for ecosystem services provision and other externalities. Further developments are suggested, mainly those relative to the hypothesis that the decision models outlined here could be used to revisit the stability-complexity debate in the field of ecological studies.

  11. Developing the fuzzy c-means clustering algorithm based on maximum entropy for multitarget tracking in a cluttered environment

    Science.gov (United States)

    Chen, Xiao; Li, Yaan; Yu, Jing; Li, Yuxing

    2018-01-01

    For fast and more effective implementation of tracking multiple targets in a cluttered environment, we propose a multiple targets tracking (MTT) algorithm called maximum entropy fuzzy c-means clustering joint probabilistic data association, which combines fuzzy c-means clustering and the joint probabilistic data association (JPDA) algorithm. The algorithm uses the membership value to express the probability of the target originating from a measurement. The membership value is obtained through a fuzzy c-means clustering objective function optimized by the maximum entropy principle. When considering the effect of the public measurement, we use a correction factor to adjust the association probability matrix to estimate the state of the target. As this algorithm avoids confirmation matrix splitting, it can solve the high computational load problem of the JPDA algorithm. The results of simulations and analysis conducted for tracking neighboring parallel targets and crossing targets in cluttered environments of different densities show that the proposed algorithm can realize MTT quickly and efficiently in a cluttered environment. Further, the performance of the proposed algorithm remains constant with increasing process noise variance. The proposed algorithm has the advantages of efficiency and low computational load, which can ensure optimum performance when tracking multiple targets in a dense cluttered environment.
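
    For orientation, in entropy-regularized (maximum entropy) fuzzy clustering the membership u_{ij} of measurement j in cluster i typically takes a Gibbs form, with d_{ij} the distance to the cluster center and λ a temperature-like parameter; the authors' exact objective may differ:

        \[ u_{ij} = \frac{\exp(-d_{ij}^2/\lambda)}{\sum_k \exp(-d_{kj}^2/\lambda)}, \qquad \sum_i u_{ij} = 1 \]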

  12. Advancing Shannon Entropy for Measuring Diversity in Systems

    Directory of Open Access Journals (Sweden)

    R. Rajaram

    2017-01-01

    From economic inequality and species diversity to power laws and the analysis of multiple trends and trajectories, diversity within systems is a major issue for science. Part of the challenge is measuring it. Shannon entropy H has been used to rethink diversity within probability distributions, based on the notion of information. However, there are two major limitations to Shannon’s approach. First, it cannot be used to compare diversity distributions that have different levels of scale. Second, it cannot be used to compare parts of diversity distributions to the whole. To address these limitations, we introduce a renormalization of probability distributions based on the notion of case-based entropy Cc as a function of the cumulative probability c. Given a probability density p(x), Cc measures the diversity of the distribution up to a cumulative probability of c, by computing the length or support of an equivalent uniform distribution that has the same Shannon information as the conditional distribution of p^c(x) up to cumulative probability c. We illustrate the utility of our approach by renormalizing and comparing three well-known energy distributions in physics, namely the Maxwell-Boltzmann, Bose-Einstein, and Fermi-Dirac distributions for the energy of subatomic particles. The comparison shows that Cc is a vast improvement over H, as it provides a scale-free comparison of these diversity distributions and also allows for a comparison between parts of these diversity distributions.
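
    A discrete sketch of the measure described above: take the conditional distribution over the most probable states up to cumulative probability c and return the support of the uniform distribution with the same Shannon information. Sorting in descending order of probability is an assumption of this sketch.

        import numpy as np

        def case_based_entropy(p, c=1.0):
            """Equivalent uniform support of the conditional distribution up to c (sketch)."""
            p = np.sort(np.asarray(p, dtype=float))[::-1]   # descending probabilities
            k = np.searchsorted(np.cumsum(p), c) + 1        # states needed to reach c
            q = p[:k] / p[:k].sum()                         # conditional distribution
            H = -(q * np.log(q)).sum()                      # its Shannon entropy
            return np.exp(H)                                # support of equivalent uniform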

  13. Entropy coherent and entropy convex measures of risk

    NARCIS (Netherlands)

    Laeven, Roger; Stadje, M.A.

    2010-01-01

    We introduce entropy coherent and entropy convex measures of risk and prove a collection of axiomatic characterization and duality results. We show in particular that entropy coherent and entropy convex measures of risk emerge as negative certainty equivalents in (the regular and a generalized

  14. Entropy coherent and entropy convex measures of risk

    NARCIS (Netherlands)

    Laeven, R.J.A.; Stadje, M.

    2013-01-01

    We introduce two subclasses of convex measures of risk, referred to as entropy coherent and entropy convex measures of risk. Entropy coherent and entropy convex measures of risk are special cases of φ-coherent and φ-convex measures of risk. Contrary to the classical use of coherent and convex

  15. Entropy Coherent and Entropy Convex Measures of Risk

    NARCIS (Netherlands)

    Laeven, R.J.A.; Stadje, M.A.

    2011-01-01

    We introduce two subclasses of convex measures of risk, referred to as entropy coherent and entropy convex measures of risk. We prove that convex, entropy convex and entropy coherent measures of risk emerge as certainty equivalents under variational, homothetic and multiple priors preferences,

  16. Infinite Shannon entropy

    International Nuclear Information System (INIS)

    Baccetti, Valentina; Visser, Matt

    2013-01-01

    Even if a probability distribution is properly normalizable, its associated Shannon (or von Neumann) entropy can easily be infinite. We carefully analyze conditions under which this phenomenon can occur. Roughly speaking, this happens when arbitrarily small amounts of probability are dispersed into an infinite number of states; we shall quantify this observation and make it precise. We develop several particularly simple, elementary, and useful bounds, and also provide some asymptotic estimates, leading to necessary and sufficient conditions for the occurrence of infinite Shannon entropy. We go to some effort to keep technical computations as simple and conceptually clear as possible. In particular, we shall see that large entropies cannot be localized in state space; large entropies can only be supported on an exponentially large number of states. We are for the time being interested in single-channel Shannon entropy in the information theoretic sense, not entropy in a stochastic field theory or quantum field theory defined over some configuration space, on the grounds that this simple problem is a necessary precursor to understanding infinite entropy in a field theoretic context. (paper)
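
    A standard example of the phenomenon analyzed here: the distribution below is normalizable, yet its Shannon entropy diverges because probability is spread too thinly over infinitely many states:

        \[ p_n = \frac{C}{n \ln^2 n} \ (n \ge 2), \qquad \sum_{n \ge 2} p_n < \infty, \qquad -\sum_{n \ge 2} p_n \ln p_n = \infty \]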

  17. Black hole thermodynamical entropy

    International Nuclear Information System (INIS)

    Tsallis, Constantino; Cirto, Leonardo J.L.

    2013-01-01

    As early as 1902, Gibbs pointed out that systems whose partition function diverges, e.g. gravitation, lie outside the validity of the Boltzmann-Gibbs (BG) theory. Consistently, since the pioneering Bekenstein-Hawking results, physically meaningful evidence (e.g., the holographic principle) has accumulated that the BG entropy S_BG of a (3+1) black hole is proportional to its area L^2 (L being a characteristic linear length), and not to its volume L^3. Similarly, there exists the area law, so named because, for a wide class of strongly quantum-entangled d-dimensional systems, S_BG is proportional to ln L if d=1, and to L^(d-1) if d>1, instead of being proportional to L^d (d ≥ 1). These results violate the extensivity of the thermodynamical entropy of a d-dimensional system. This thermodynamical inconsistency disappears if we realize that the thermodynamical entropy of such nonstandard systems is not to be identified with the BG additive entropy but with appropriately generalized nonadditive entropies. Indeed, the celebrated usefulness of the BG entropy is founded on hypotheses such as relatively weak probabilistic correlations (and their connections to ergodicity, which by no means can be assumed as a general rule of nature). Here we introduce a generalized entropy which, for the Schwarzschild black hole and the area law, can solve the thermodynamic puzzle. (orig.)

  18. Two dissimilar approaches to dynamical systems on hyper MV -algebras and their information entropy

    Science.gov (United States)

    Mehrpooya, Adel; Ebrahimi, Mohammad; Davvaz, Bijan

    2017-09-01

    Measuring the flow of information related to the evolution of a system modeled by a mathematical structure is of capital significance for science and often for mathematics itself. Regarding this fact, a major issue concerning hyperstructures is their dynamics and the complexity of the varied possible dynamics that exist over them. Notably, the dynamics and uncertainty of hyper MV-algebras, which are hyperstructures and extensions of a central tool in infinite-valued Lukasiewicz propositional calculus that models many-valued logics, are of primary concern. Tackling this problem, in this paper we focus on dynamical systems on hyper MV-algebras and their entropy. In this respect, we adopt two different approaches. One is the set-based approach, in which hyper MV-algebra dynamical systems are developed by employing set functions and set partitions. By the other method, based on points and point partitions, we establish the concept of hyper injective dynamical systems on hyper MV-algebras. Next, we study the notion of entropy for both kinds of systems. Furthermore, we consider essential ergodic characteristics of those systems and their entropy. In particular, we introduce the concept of isomorphic hyper injective and hyper MV-algebra dynamical systems, and we demonstrate that isomorphic systems have the same entropy. We present a couple of theorems to help calculate entropy. In particular, we prove a contemporary version of the addition and Kolmogorov-Sinai theorems. Furthermore, we provide a comparison between the indispensable properties of hyper injective and semi-independent dynamical systems. Specifically, we present and prove theorems that draw comparisons between the entropies of such systems. Lastly, we discuss some possible relationships between the theories of hyper MV-algebra and MV-algebra dynamical systems.

  19. A review of entropy generation in microchannels

    Directory of Open Access Journals (Sweden)

    Mohamed M Awad

    2015-12-01

    In this study, a critical review of the thermodynamic optimization of microchannels based on entropy generation analysis is presented. Using entropy generation analysis as an evaluation parameter of microchannels has been reported by many studies in the literature. In these studies, different working fluids such as nanofluids, air, water, engine oil, aniline, ethylene glycol, and non-Newtonian fluids have been used. For the case of nanofluids, “nanoparticles” of various kinds such as Al2O3 and Cu have been used, together with “base fluids” of various kinds such as water and ethylene glycol. Furthermore, studies on the thermodynamic optimization of microchannels based on entropy generation analysis are summarized in a table. At the end, recommendations for future work on the thermodynamic optimization of microchannels based on entropy generation analysis are given. As a result, this article can not only be used as a starting point for researchers interested in entropy generation in microchannels, but it also includes recommendations for future studies on entropy generation in microchannels.

  20. Some remarks on conditional entropy

    NARCIS (Netherlands)

    Nijst, A.G.P.M.

    1969-01-01

    Using a definition of conditional entropy given by Hanen and Neveu [5, 10, 11], we discuss in this paper some properties of conditional entropy and mean entropy, in particular an integral representation of conditional entropy (§ 2), and the decomposition theorem of the Kolmogorov-Sinaĭ invariant (§

  1. Does Income Diversification Benefit the Sustainable Development of Chinese Listed Banks? Analysis Based on Entropy and the Herfindahl–Hirschman Index

    Directory of Open Access Journals (Sweden)

    Huichen Jiang

    2018-04-01

    We collected data pertaining to Chinese listed commercial banks from 2008 to 2016 and found that the competition between banks is becoming increasingly fierce. Commercial banks have actively carried out diversification strategies for greater returns, and the financial reports show that profits increasingly come from the non-interest income benefits of diversification strategies. However, diversification comes with risk. We built a panel threshold model and investigated the effect of income diversification on a bank’s profitability and risk. Diversification was first measured by the Herfindahl–Hirschman index (HHI), and the results show that a nonlinear relationship between diversification and profitability or risk does exist. We introduced an entropy-based index to test the robustness of our model and found that a threshold effect exists in both our models, which is statistically significant. We believe the combination of the entropy index (ENTI) and the HHI enables more efficient study of the relationship between diversification and profitability or risk. Bankers and their customers have been increasingly interested in income diversification, and they value risk as well. We suggest that banks of different sizes should adopt corresponding diversification strategies to achieve sustainable development.
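
    For concreteness, the two diversification measures combined in this study can be computed from income shares as follows (a sketch; the paper's exact normalizations may differ):

        import numpy as np

        def hhi(shares):
            """Herfindahl-Hirschman index: 1/n (fully even) up to 1 (fully focused)."""
            s = np.asarray(shares, dtype=float)
            s = s / s.sum()
            return (s ** 2).sum()

        def entropy_index(shares):
            """Shannon-entropy diversification index: 0 (focused) up to ln n (even)."""
            s = np.asarray(shares, dtype=float)
            s = s / s.sum()
            s = s[s > 0]                       # ignore empty income lines
            return -(s * np.log(s)).sum()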

  2. Juxta-Vascular Pulmonary Nodule Segmentation in PET-CT Imaging Based on an LBF Active Contour Model with Information Entropy and Joint Vector

    Directory of Open Access Journals (Sweden)

    Rui Hao

    2018-01-01

    The accurate segmentation of pulmonary nodules is an important preprocessing step in computer-aided diagnosis of lung cancers. However, the existing segmentation methods may cause the problem of edge leakage and cannot segment juxta-vascular pulmonary nodules accurately. To address this problem, a novel automatic segmentation method based on an LBF active contour model with information entropy and a joint vector is proposed in this paper. Our method extracts the interest area of pulmonary nodules by the standard uptake value (SUV) in Positron Emission Tomography (PET) images, and automatic threshold iteration is used to construct a rough initial contour. The SUV information entropy and the gray-value joint vector of Positron Emission Tomography–Computed Tomography (PET-CT) images are calculated to drive the evolution of the contour curve. At the edge of pulmonary nodules, evolution stops and accurate results of pulmonary nodule segmentation can be obtained. Experimental results show that our method can achieve a 92.35% average Dice similarity coefficient, a 2.19 mm Hausdorff distance, and a 3.33% false positive rate with respect to manual segmentation results. Compared with the existing methods, our proposed method for segmenting juxta-vascular pulmonary nodules in PET-CT images is more accurate and efficient.

  3. Phase equilibrium of PuO2-x - Pu2O3 based on first-principles calculations and configurational entropy change

    International Nuclear Information System (INIS)

    Minamoto, Satoshi; Kato, Masato; Konashi, Kenji

    2011-01-01

    Combining an oxygen vacancy formation energy calculated using a first-principles approach with the configurational entropy change treated within the framework of statistical mechanics gives an expression for the Gibbs free energy at large deviations from stoichiometry of plutonium oxide PuO2. An oxygen vacancy formation energy of 4.20 eV derived from our previous first-principles calculation was used to evaluate the Gibbs free energy change due to oxygen vacancies in the crystal. The oxygen partial pressures can then be evaluated from the change of the free energy with two fitting parameters (a vacancy-vacancy interaction energy and the vibrational entropy change due to induced vacancies). The derived thermodynamic expression for the free energy, based on the SGTE thermodynamic data for the stoichiometric PuO2 and Pu2O3 compounds, was further incorporated into CALPHAD modeling, and the phase equilibrium between stoichiometric Pu2O3 and non-stoichiometric PuO2-x was then reproduced.
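
    Schematically, for ideal mixing of x oxygen vacancies over the two oxygen sites per formula unit of PuO2-x, the configurational entropy and the resulting free-energy change take the form below; the fitted vacancy-interaction and vibrational-entropy terms described above are added on top of this ideal-solution baseline:

        \[ \Delta S_{\mathrm{conf}} = -2 k_B \left[ \frac{x}{2}\ln\frac{x}{2} + \left(1-\frac{x}{2}\right)\ln\left(1-\frac{x}{2}\right) \right], \qquad \Delta G(x,T) \approx x E_f - T \Delta S_{\mathrm{conf}} \]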

  4. Phase equilibrium of PuO2-x - Pu2O3 based on first-principles calculations and configurational entropy change

    Energy Technology Data Exchange (ETDEWEB)

    Minamoto, Satoshi, E-mail: satoshi.minamoto@ctc-g.co.jp [ITOCHU Techno-Solutions Corporation, Kasumigaseki, 2-5, Kasumigaseki 3-chome, Chiyoda-ku, Tokyo 100-6080 (Japan); Kato, Masato [Japan Atomic Energy Agency, Tokai-mura, Naka-gun, Ibaraki (Japan); Konashi, Kenji [Institute for Materials Research, Tohoku University, Oarai-chou, Ibaraki (Japan)

    2011-05-31

    Combining an oxygen vacancy formation energy calculated using a first-principles approach with the configurational entropy change treated within the framework of statistical mechanics gives an expression for the Gibbs free energy at large deviations from stoichiometry of plutonium oxide PuO2. An oxygen vacancy formation energy of 4.20 eV derived from our previous first-principles calculation was used to evaluate the Gibbs free energy change due to oxygen vacancies in the crystal. The oxygen partial pressures can then be evaluated from the change of the free energy with two fitting parameters (a vacancy-vacancy interaction energy and the vibrational entropy change due to induced vacancies). The derived thermodynamic expression for the free energy, based on the SGTE thermodynamic data for the stoichiometric PuO2 and Pu2O3 compounds, was further incorporated into CALPHAD modeling, and the phase equilibrium between stoichiometric Pu2O3 and non-stoichiometric PuO2-x was then reproduced.

  5. Trajectories entropy in dynamical graphs with memory

    Directory of Open Access Journals (Sweden)

    Francesco Caravelli

    2016-04-01

    In this paper we investigate the application of non-local graph entropy to evolving and dynamical graphs. The measure is based upon the notion of Markov diffusion on a graph, and relies on the entropy applied to trajectories originating at a specific node. In particular, we study the model of reinforcement-decay graph dynamics, which leads to scale-free graphs. We find that the node entropy characterizes the structure of the network in the two-parameter phase space describing the dynamical evolution of the weighted graph. We then apply an adapted version of the entropy measure to purely memristive circuits. We provide evidence that, while in the case of DC voltage the entropy based on the forward probability is enough to characterize the graph properties, in the case of AC voltage generators one needs to consider both forward- and backward-based transition probabilities. We also provide evidence that the entropy highlights the self-organizing properties of memristive circuits, which re-organize themselves to satisfy the symmetries of the underlying graph.

  6. Measurement of Scenic Spots Sustainable Capacity Based on PCA-Entropy TOPSIS: A Case Study from 30 Provinces, China

    Directory of Open Access Journals (Sweden)

    Xuedong Liang

    2017-12-01

    In connection with the sustainable development of scenic spots, this paper, with consideration of resource conditions, economic benefits, auxiliary industry scale and the ecological environment, establishes a comprehensive measurement model of the sustainable capacity of scenic spots; optimizes the index system by principal components analysis to extract principal components; assigns the weights of principal components by the entropy method; analyzes the sustainable capacity of scenic spots in each province of China comprehensively in combination with the TOPSIS method; and finally puts forward suggestions to aid decision-making. According to the study, this method provides an effective reference for the study of the sustainable development of scenic spots and is very significant for considering the sustainable development of scenic spots and auxiliary industries in order to establish specific and scientific countermeasures for improvement.

  7. Measurement of Scenic Spots Sustainable Capacity Based on PCA-Entropy TOPSIS: A Case Study from 30 Provinces, China.

    Science.gov (United States)

    Liang, Xuedong; Liu, Canmian; Li, Zhi

    2017-12-22

    In connection with the sustainable development of scenic spots, this paper, with consideration of resource conditions, economic benefits, auxiliary industry scale and the ecological environment, establishes a comprehensive measurement model of the sustainable capacity of scenic spots; optimizes the index system by principal components analysis to extract principal components; assigns the weights of principal components by the entropy method; analyzes the sustainable capacity of scenic spots in each province of China comprehensively in combination with the TOPSIS method; and finally puts forward suggestions to aid decision-making. According to the study, this method provides an effective reference for the study of the sustainable development of scenic spots and is very significant for considering the sustainable development of scenic spots and auxiliary industries in order to establish specific and scientific countermeasures for improvement.

  8. Investigation on multi-objective performance optimization algorithm application of fan based on response surface method and entropy method

    Science.gov (United States)

    Zhang, Li; Wu, Kexin; Liu, Yang

    2017-12-01

    A multi-objective performance optimization method is proposed, solving the problem that a single structural parameter of a small fan cannot balance the optimization of static characteristics against aerodynamic noise. In this method, three structural parameters are selected as the optimization variables, and the static pressure efficiency and the aerodynamic noise of the fan are regarded as the multi-objective performance. Furthermore, the response surface method and the entropy method are used to establish the optimization function between the optimization variables and the multi-objective performances. Finally, the optimized model is found when the optimization function reaches its maximum value. Experimental data show that the optimized model not only enhances the static characteristics of the fan but also obviously reduces the noise. The results of the study will provide some reference for the multi-objective performance optimization of other types of rotating machinery.

  9. Entropy of international trades

    Science.gov (United States)

    Oh, Chang-Young; Lee, D.-S.

    2017-05-01

    The organization of international trade is highly complex under the collective efforts towards economic profit of participating countries, given inhomogeneous resources for production. Considering the trade flux as the probability of exporting a product from one country to another, we evaluate the entropy of world trade in the period 1950-2000. The trade entropy has increased with time, and we show that this is mainly due to the extension of trade partnership. For a given number of trade partners, the mean trade entropy is about 60% of the maximum possible entropy, independent of time, which can be regarded as a characteristic of the trade fluxes' heterogeneity and is shown to be derived from the scaling and functional behaviors of the universal trade-flux distribution. The correlation and time evolution of individual countries' gross domestic products and numbers of trade partners show that most countries achieved their economic growth partly by extending their trade relationships.
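
    A minimal sketch of the measure described here: treat the normalized trade fluxes as a probability distribution over (exporter, importer) pairs and take its Shannon entropy. Whether normalization is global or per country is not specified in this excerpt; global normalization is assumed.

        import numpy as np

        def trade_entropy(flux):
            """Shannon entropy of a trade-flux matrix (sketch, global normalization)."""
            w = np.asarray(flux, dtype=float)
            p = (w / w.sum()).ravel()          # flux -> export probability
            p = p[p > 0]                       # skip absent trade links
            return -(p * np.log(p)).sum()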

  10. On holographic defect entropy

    International Nuclear Information System (INIS)

    Estes, John; Jensen, Kristan; O’Bannon, Andy; Tsatis, Efstratios; Wrase, Timm

    2014-01-01

    We study a number of (3+1)- and (2+1)-dimensional defect and boundary conformal field theories holographically dual to supergravity theories. In all cases the defects or boundaries are planar, and the defects are codimension-one. Using holography, we compute the entanglement entropy of a (hemi-)spherical region centered on the defect (boundary). We define defect and boundary entropies from the entanglement entropy by an appropriate background subtraction. For some (3+1)-dimensional theories we find evidence that the defect/boundary entropy changes monotonically under certain renormalization group flows triggered by operators localized at the defect or boundary. This provides evidence that the g-theorem of (1+1)-dimensional field theories generalizes to higher dimensions

  11. Minimum entropy production principle

    Czech Academy of Sciences Publication Activity Database

    Maes, C.; Netočný, Karel

    2013-01-01

    Vol. 8, No. 7 (2013), pp. 9664-9677. ISSN 1941-6016. Institutional support: RVO:68378271. Keywords: MINEP. Subject RIV: BE - Theoretical Physics. http://www.scholarpedia.org/article/Minimum_entropy_production_principle

  12. Entropy in halide perovskites

    Science.gov (United States)

    Katan, Claudine; Mohite, Aditya D.; Even, Jacky

    2018-05-01

    Claudine Katan, Aditya D. Mohite and Jacky Even discuss the possible impact of various entropy contributions (stochastic structural fluctuations, anharmonicity and lattice softness) on the optoelectronic properties of halide perovskite materials and devices.

  13. Entropy of Vaidya-deSitter Spacetime

    Institute of Scientific and Technical Information of China (English)

    LI Xiang; ZHAO Zheng

    2001-01-01

    As a statistical model of black hole entropy, the brick-wall method, based on thermal equilibrium at a large scale, cannot be applied to cases out of equilibrium, such as a non-static hole or the case with two horizons. However, the leading term of the hole entropy, called the Bekenstein-Hawking entropy, comes from the contribution of the field near the horizon. According to this idea, the entropy of Vaidya-deSitter spacetime is calculated. A difference from the static case is that the result, proportional to the area of the horizon, relies on a time-dependent cut-off. The condition of local equilibrium near the horizon is used as a working postulate.

  14. Minimal entropy approximation for cellular automata

    International Nuclear Information System (INIS)

    Fukś, Henryk

    2014-01-01

    We present a method for the construction of approximate orbits of measures under the action of cellular automata which is complementary to the local structure theory. The local structure theory is based on the idea of Bayesian extension, that is, construction of a probability measure consistent with given block probabilities and maximizing entropy. If instead of maximizing entropy one minimizes it, one can develop another method for the construction of approximate orbits, at the heart of which is the iteration of finite-dimensional maps, called minimal entropy maps. We present numerical evidence that the minimal entropy approximation sometimes outperforms the local structure theory in characterizing the properties of cellular automata. The density response curve for elementary CA rule 26 is used to illustrate this claim. (paper)

  15. Feasible Histories, Maximum Entropy

    International Nuclear Information System (INIS)

    Pitowsky, I.

    1999-01-01

    We consider the broadest possible consistency condition for a family of histories, which extends all previous proposals. A family that satisfies this condition is called feasible. On each feasible family of histories we choose a probability measure by maximizing entropy, while keeping the probabilities of commuting histories to their quantum mechanical values. This procedure is justified by the assumption that decoherence increases entropy. Finally, a criterion for identifying the nearly classical families is proposed

  16. Manufacturing of High Entropy Alloys

    Science.gov (United States)

    Jablonski, Paul D.; Licavoli, Joseph J.; Gao, Michael C.; Hawk, Jeffrey A.

    2015-07-01

    High entropy alloys (HEAs) have generated interest in recent years due to their unique positioning within the alloy world. By incorporating a number of elements in high proportion they have high configurational entropy, and thus they hold the promise of interesting and useful properties such as enhanced strength and phase stability. The present study investigates the microstructure of two single-phase face-centered cubic (FCC) HEAs, CoCrFeNi and CoCrFeNiMn, with special attention given to melting, homogenization and thermo-mechanical processing. Large-scale ingots were made by vacuum induction melting to avoid the extrinsic factors inherent in small-scale laboratory button samples. A computationally based homogenization heat treatment was applied to both alloys in order to eliminate segregation due to normal ingot solidification. The alloys fabricated well, with typical thermo-mechanical processing parameters being employed.
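
    For reference, the ideal configurational entropy of mixing underlying the "high entropy" designation, which for an equiatomic n-component alloy such as CoCrFeNiMn (n = 5) reduces to R ln n ≈ 1.61R:

        \[ \Delta S_{\mathrm{conf}} = -R \sum_{i=1}^{n} c_i \ln c_i, \qquad \Delta S_{\mathrm{conf}}^{\mathrm{equiatomic}} = R \ln n \]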

  17. Entropy favours open colloidal lattices

    Science.gov (United States)

    Mao, Xiaoming; Chen, Qian; Granick, Steve

    2013-03-01

    Burgeoning experimental and simulation activity seeks to understand the existence of self-assembled colloidal structures that are not close-packed. Here we describe an analytical theory based on lattice dynamics and supported by experiments that reveals the fundamental role entropy can play in stabilizing open lattices. The entropy we consider is associated with the rotational and vibrational modes unique to colloids interacting through extended attractive patches. The theory makes predictions of the implied temperature, pressure and patch-size dependence of the phase diagram of open and close-packed structures. More generally, it provides guidance for the conditions at which targeted patchy colloidal assemblies in two and three dimensions are stable, thus overcoming the difficulty in exploring by experiment or simulation the full range of conceivable parameters.

  18. Preserved entropy and fragile magnetism.

    Science.gov (United States)

    Canfield, Paul C; Bud'ko, Sergey L

    2016-08-01

    A large swath of quantum critical and strongly correlated electron systems can be associated with the phenomena of preserved entropy and fragile magnetism. In this overview we present our thoughts and plans for the discovery and development of lanthanide and transition metal based, strongly correlated systems that are revealed by suppressed, fragile magnetism, quantum criticality, or grow out of preserved entropy. We will present and discuss current examples such as YbBiPt, YbAgGe, YbFe2Zn20, PrAg2In, BaFe2As2, CaFe2As2, LaCrSb3 and LaCrGe3 as part of our motivation and to provide illustrative examples.

  19. Entropy of quasiblack holes

    International Nuclear Information System (INIS)

    Lemos, Jose P. S.; Zaslavskii, Oleg B.

    2010-01-01

    We trace the origin of the black hole entropy S, replacing a black hole by a quasiblack hole. Let the boundary of a static body approach its own gravitational radius, in such a way that a quasihorizon forms. We show that if the body is thermal with the temperature taking the Hawking value at the quasihorizon limit, it follows, in the nonextremal case, from the first law of thermodynamics that the entropy approaches the Bekenstein-Hawking value S=A/4. In this setup, the key role is played by the surface stresses on the quasihorizon and one finds that the entropy comes from the quasihorizon surface. Any distribution of matter inside the surface leads to the same universal value for the entropy in the quasihorizon limit. This can be of some help in the understanding of black hole entropy. Other similarities between black holes and quasiblack holes such as the mass formulas for both objects had been found previously. We also discuss the entropy for extremal quasiblack holes, a more subtle issue.

  20. Entropy, matter, and cosmology.

    Science.gov (United States)

    Prigogine, I; Géhéniau, J

    1986-09-01

    The role of irreversible processes corresponding to creation of matter in general relativity is investigated. The use of Landau-Lifshitz pseudotensors together with conformal (Minkowski) coordinates suggests that this creation took place in the early universe at the stage of the variation of the conformal factor. The entropy production in this creation process is calculated. It is shown that these dissipative processes lead to the possibility of cosmological models that start from empty conditions and gradually build up matter and entropy. Gravitational entropy takes a simple meaning as associated to the entropy that is necessary to produce matter. This leads to an extension of the third law of thermodynamics, as now the zero point of entropy becomes the space-time structure out of which matter is generated. The theory can be put into a convenient form using a supplementary "C" field in Einstein's field equations. The role of the C field is to express the coupling between gravitation and matter leading to irreversible entropy production.

  1. Phase Composition of a CrMo0.5NbTa0.5TiZr High Entropy Alloy: Comparison of Experimental and Simulated Data

    OpenAIRE

    Fan Zhang; Oleg N. Senkov; Jonathan D. Miller

    2013-01-01

    Microstructure and phase composition of a CrMo0.5NbTa0.5TiZr high entropy alloy were studied in the as-solidified and heat treated conditions. In the as-solidified condition, the alloy consisted of two disordered BCC phases and an ordered cubic Laves phase. The BCC1 phase solidified in the form of dendrites enriched with Mo, Ta and Nb, and its volume fraction was 42%. The BCC2 and Laves phases solidified by the eutectic-type reaction, and their volume fractions were 27% and 31%, respectively....

  2. Prediction of Protein Configurational Entropy (Popcoen).

    Science.gov (United States)

    Goethe, Martin; Gleixner, Jan; Fita, Ignacio; Rubi, J Miguel

    2018-03-13

    A knowledge-based method for configurational entropy prediction of proteins is presented; this methodology is extremely fast compared to previous approaches because it does not involve any type of configurational sampling. Instead, the configurational entropy of a query fold is estimated by evaluating an artificial neural network, which was trained on molecular-dynamics simulations of ∼1000 proteins. The predicted entropy can be incorporated into a large class of protein software based on cost-function minimization/evaluation, in which configurational entropy is currently neglected for performance reasons. Software of this type is used for all major protein tasks such as structure prediction, protein design, NMR and X-ray refinement, docking, and mutation effect prediction. Integrating the predicted entropy can yield a significant accuracy increase, as we show exemplarily for native-state identification with the prominent protein software FoldX. The method has been termed Popcoen for Prediction of Protein Configurational Entropy. An implementation is freely available at http://fmc.ub.edu/popcoen/ .

  3. On the Conditional Rényi Entropy

    NARCIS (Netherlands)

    S. Fehr (Serge); S. Berens (Stefan)

    2014-01-01

    The Rényi entropy of general order unifies the well-known Shannon entropy with several other entropy notions, like the min-entropy or the collision entropy. In contrast to the Shannon entropy, there seems to be no commonly accepted definition for the conditional Rényi entropy: several

  4. Entropy in an expanding universe

    International Nuclear Information System (INIS)

    Frautschi, S.

    1982-01-01

    The question of how the observed evolution of organized structures from initial chaos in the expanding universe can be reconciled with the laws of statistical mechanics is studied, with emphasis on effects of the expansion and gravity. Some major sources of entropy increase are listed. An expanding causal region is defined in which the entropy, though increasing, tends to fall further and further behind its maximum possible value, thus allowing for the development of order. The related questions of whether entropy will continue increasing without limit in the future, and whether such increase in the form of Hawking radiation or radiation from positronium might enable life to maintain itself permanently, are considered. Attempts to find a scheme for preserving life based on solid structures fail because events such as quantum tunneling recurrently disorganize matter on a very long but fixed time scale, whereas all energy sources slow down progressively in an expanding universe. However, there remains hope that other modes of life capable of maintaining themselves permanently can be found

  5. Multivariate Generalized Multiscale Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Anne Humeau-Heurtier

    2016-11-01

    Full Text Available Multiscale entropy (MSE) was introduced in the 2000s to quantify systems’ complexity. MSE relies on (i) a coarse-graining procedure to derive a set of time series representing the system dynamics on different time scales; (ii) the computation of the sample entropy for each coarse-grained time series. A refined composite MSE (rcMSE)—based on the same steps as MSE—also exists. Compared to MSE, rcMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy for short time series. The multivariate versions of MSE (MMSE) and rcMSE (MrcMSE) have also been introduced. In the coarse-graining step used in MSE, rcMSE, MMSE, and MrcMSE, the mean value is used to derive representations of the original data at different resolutions. A generalization of MSE was recently published, using the computation of different moments in the coarse-graining procedure. However, so far, this generalization only exists for univariate signals. We therefore herein propose an extension of this generalized MSE to multivariate data. The multivariate generalized algorithms of MMSE and MrcMSE presented herein (MGMSE and MGrcMSE, respectively) are first analyzed through the processing of synthetic signals. We reveal that MGrcMSE shows better performance than MGMSE for short multivariate data. We then study the performance of MGrcMSE on two sets of short multivariate electroencephalograms (EEG) available in the public domain. We report that MGrcMSE may show better performance than MrcMSE in distinguishing different types of multivariate EEG data. MGrcMSE could therefore supplement MMSE or MrcMSE in the processing of multivariate datasets.
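
    A minimal sketch of the coarse-graining step at the heart of these algorithms, shown for a univariate series for brevity (function and argument names are illustrative; the generalized variant replaces the mean with a higher moment such as the variance):

        import numpy as np

        def coarse_grain(x, scale, moment="mean"):
            """Scale-`scale` representation of a 1-D series.
            moment="mean" is the classical MSE coarse-graining;
            moment="var" is the generalized second-moment variant."""
            x = np.asarray(x, dtype=float)
            n = len(x) // scale
            blocks = x[:n * scale].reshape(n, scale)
            return blocks.mean(axis=1) if moment == "mean" else blocks.var(axis=1)

    The sample entropy of each coarse-grained series is then computed and examined as a function of the scale factor; the multivariate versions apply the same idea channel-wise before the multivariate sample-entropy step.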

  6. Gradient Dynamics and Entropy Production Maximization

    Science.gov (United States)

    Janečka, Adam; Pavelka, Michal

    2018-01-01

    We compare two methods for modeling dissipative processes, namely gradient dynamics and entropy production maximization. Both methods require similar physical inputs: how energy (or entropy) is stored and how it is dissipated. Gradient dynamics describes irreversible evolution by means of a dissipation potential and entropy; it automatically satisfies the Onsager reciprocal relations as well as their nonlinear generalization (Maxwell-Onsager relations), and it has a statistical interpretation. Entropy production maximization is based on knowledge of free energy (or another thermodynamic potential) and entropy production. It also leads to the linear Onsager reciprocal relations and has proven successful in the thermodynamics of complex materials. Both methods are thermodynamically sound as they ensure the approach to equilibrium; we compare them and discuss their advantages and shortcomings. In particular, conditions under which the two approaches coincide and are capable of providing the same constitutive relations are identified. Besides, a commonly used but rarely mentioned step in entropy production maximization is pinpointed, and the condition of incompressibility is incorporated into gradient dynamics.
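
    In one common formulation (notation assumed here), gradient dynamics evolves the state variables x through a dissipation potential Ξ(x, x*) and the entropy S(x):

        \[
          \dot{x} \;=\; \left.\frac{\partial \Xi}{\partial x^{*}}\right|_{x^{*} = \frac{\partial S}{\partial x}},
        \]

    so that the entropy production along solutions is non-negative whenever Ξ is convex in the conjugate variable x* and minimal at x* = 0, which is the mechanism behind the guaranteed approach to equilibrium mentioned above.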

  7. A Trustworthiness Evaluation Method for Software Architectures Based on the Principle of Maximum Entropy (POME) and the Grey Decision-Making Method (GDMM)

    Directory of Open Access Journals (Sweden)

    Rong Jiang

    2014-09-01

    Full Text Available As the early design-stage decision structure, a software architecture plays a key role in the quality of the final software product and of the whole project. In the software design and development process, an effective evaluation of the trustworthiness of a software architecture can help in making scientific and reasonable decisions about the architecture, which are necessary for the construction of highly trustworthy software. Given the lack of trustworthiness evaluation and measurement studies for software architectures, this paper provides a trustworthy-attribute model of software architecture. Based on this model, the paper proposes the Principle of Maximum Entropy (POME) and the Grey Decision-making Method (GDMM) as the trustworthiness evaluation method for a software architecture, argues for the soundness and rationality of this method, and verifies its feasibility through a case analysis.

  8. The Conditional Entropy Power Inequality for Bosonic Quantum Systems

    Science.gov (United States)

    De Palma, Giacomo; Trevisan, Dario

    2018-06-01

    We prove the conditional Entropy Power Inequality for Gaussian quantum systems. This fundamental inequality determines the minimum quantum conditional von Neumann entropy of the output of the beam-splitter or of the squeezing among all the input states where the two inputs are conditionally independent given the memory and have given quantum conditional entropies. We also prove that, for any couple of values of the quantum conditional entropies of the two inputs, the minimum of the quantum conditional entropy of the output given by the conditional Entropy Power Inequality is asymptotically achieved by a suitable sequence of quantum Gaussian input states. Our proof of the conditional Entropy Power Inequality is based on a new Stam inequality for the quantum conditional Fisher information and on the determination of the universal asymptotic behaviour of the quantum conditional entropy under the heat semigroup evolution. The beam-splitter and the squeezing are the central elements of quantum optics, and can model the attenuation, the amplification and the noise of electromagnetic signals. This conditional Entropy Power Inequality will have a strong impact in quantum information and quantum cryptography. Among its many possible applications there is the proof of a new uncertainty relation for the conditional Wehrl entropy.
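
    For orientation, the beam-splitter case of the inequality can be sketched as follows (n-mode inputs A and B, memory M, output C, transmissivity λ ∈ [0, 1]; see the paper for the precise hypotheses, which include conditional independence of the inputs given M):

        \[
          \exp\frac{S(C\mid M)}{n} \;\ge\; \lambda\,\exp\frac{S(A\mid M)}{n} \;+\; (1-\lambda)\,\exp\frac{S(B\mid M)}{n},
        \]

    with the squeezing case analogous, the weights becoming κ and κ − 1 for a squeezing parameter κ ≥ 1.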

  9. Precipitation Interpolation by Multivariate Bayesian Maximum Entropy Based on Meteorological Data in Yun-Gui-Guang Region, Mainland China

    Science.gov (United States)

    Wang, Chaolin; Zhong, Shaobo; Zhang, Fushen; Huang, Quanyi

    2016-11-01

    Precipitation interpolation has been an active area of research for many years and is closely related to meteorological factors. In this paper, precipitation records from 91 meteorological stations located in and around Yunnan, Guizhou and Guangxi Zhuang provinces (or autonomous region), Mainland China, were used for spatial interpolation. A multivariate Bayesian maximum entropy (BME) method with auxiliary variables, including mean relative humidity, water vapour pressure, mean temperature, mean wind speed and terrain elevation, was used to obtain a more accurate regional distribution of annual precipitation. The means, standard deviations, skewness and kurtosis of the meteorological factors were calculated. Variograms and cross-variograms were fitted between precipitation and the auxiliary variables. The results showed that the multivariate BME method, which accommodates both hard data and soft data in probability density function form, was precise. Annual mean precipitation was positively correlated with mean relative humidity, mean water vapour pressure, mean temperature and mean wind speed, and negatively correlated with terrain elevation. The results are expected to provide a substantial reference for research on drought and waterlogging in the region.

  10. An Algorithm for Creating Thumbnails for Web Map Services Based on Information Entropy and Trans-scale Similarity

    Directory of Open Access Journals (Sweden)

    CHENG Xiaoqiang

    2017-11-01

    Full Text Available Thumbnails can greatly increase the efficiency of browsing pictures, videos and other image resources, and they markedly improve the user experience. A map service is a kind of graphic resource coupling spatial information with representation scale; its crafting, retrieval and management do not function well without the support of thumbnails. Carefully designed thumbnails give users a vivid first impression and help them explore efficiently; by contrast, coarse thumbnails evoke negative reactions and discourage users from exploring the map service. Inspired by video summarization, the notions of key position and key scale of a web map service are proposed, together with corresponding quantitative measures and an automatic algorithm implementing them. With the help of this algorithm, the poor visual quality, lack of map information, and low automation of current thumbnails are addressed. Information entropy is used to determine areas richer in content, and trans-scale similarity is calculated to judge at which scale the appearance of the map service changes drastically; finally, a series of static pictures is extracted to represent the content of the map service. Experimental results show that this method produces medium-sized, content-rich and well-representative thumbnails which effectively reflect the content and appearance of a map service.
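
    A minimal sketch of the entropy-based part of such an algorithm, assuming a grayscale raster rendering of the map at one scale (the tiling scheme and function names are illustrative, and the paper's full method additionally uses the trans-scale similarity to select the key scale):

        import numpy as np

        def tile_entropy(tile, bins=256):
            """Shannon entropy of a grayscale tile's intensity histogram."""
            hist, _ = np.histogram(tile, bins=bins, range=(0, 256))
            p = hist / hist.sum()
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        def key_position(image, tile=256):
            """Top-left corner of the most information-rich tile, a stand-in
            for the key-position selection described above."""
            best, best_h = (0, 0), -1.0
            rows, cols = image.shape
            for r in range(0, rows - tile + 1, tile):
                for c in range(0, cols - tile + 1, tile):
                    h = tile_entropy(image[r:r + tile, c:c + tile])
                    if h > best_h:
                        best, best_h = (r, c), h
            return best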

  11. Sample Entropy Analysis of EEG Signals via Artificial Neural Networks to Model Patients’ Consciousness Level Based on Anesthesiologists’ Experience

    Directory of Open Access Journals (Sweden)

    George J. A. Jiang

    2015-01-01

    Full Text Available Electroencephalogram (EEG) signals, which express the human brain’s activity and reflect awareness, have been widely used in research and in medical equipment to build noninvasive monitoring indices of the depth of anesthesia (DOA). The Bispectral (BIS) index monitor is one of the best-known EEG-based indicators that anesthesiologists use when assessing the DOA. In this study, an attempt is made to build a new EEG-based indicator that provides a more valuable reference for the DOA to clinical researchers. EEG signals collected from patients during anesthetic surgery are filtered using the multivariate empirical mode decomposition (MEMD) method and analyzed using sample entropy (SampEn) analysis. The SampEn values are used to train an artificial neural network (ANN) model, with the expert assessment of consciousness level (EACL), given by experienced anesthesiologists, as the target for training, validation, and testing. The results achieved with the proposed system are compared to the BIS index. They show that the proposed system not only has characteristics similar to the BIS index but is also closer to the experienced anesthesiologists' assessments, illustrating that it reflects the consciousness level and the DOA successfully.
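
    A minimal sketch of the sample-entropy feature used as the network input, in its naive O(n²) form (the m and r defaults are common conventions, not necessarily those of the study):

        import numpy as np

        def _matches(x, m, r):
            """Count ordered template pairs of length m within Chebyshev distance r."""
            templates = np.array([x[i:i + m] for i in range(len(x) - m)])
            total = 0
            for i in range(len(templates)):
                d = np.max(np.abs(templates - templates[i]), axis=1)
                total += int(np.sum(d <= r)) - 1  # exclude the self-match
            return total

        def sample_entropy(x, m=2, r=None):
            """SampEn(m, r) = -ln(A/B), with A and B the match counts at m+1 and m."""
            x = np.asarray(x, dtype=float)
            if r is None:
                r = 0.2 * x.std()
            B = _matches(x, m, r)
            A = _matches(x, m + 1, r)
            return float(-np.log(A / B)) if A > 0 and B > 0 else float("inf")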

  12. Novel Approach for Lithium-Ion Battery On-Line Remaining Useful Life Prediction Based on Permutation Entropy

    Directory of Open Access Journals (Sweden)

    Luping Chen

    2018-04-01

    Full Text Available The degradation of lithium-ion batteries often leads to electrical system failure. Battery remaining useful life (RUL) prediction can effectively prevent this failure. Battery capacity is usually utilized as the health indicator (HI) for RUL prediction. However, battery capacity is difficult to obtain on-line from monitored parameters, so there is a great need for a simple, on-line prediction method. In this paper, permutation entropy (PE), extracted from the discharge voltage curve, is proposed as a novel HI for analyzing battery degradation. The similarity between PE and battery capacity is then assessed by Pearson and Spearman correlation analyses. Experimental results illustrate the effectiveness and excellent similarity performance of the novel HI for indicating battery fade. Furthermore, we propose a hybrid approach combining the variational mode decomposition (VMD) denoising technique, autoregressive integrated moving average (ARIMA), and GM(1,1) models for RUL prediction. Experimental results illustrate the accuracy of the proposed approach for lithium-ion battery on-line RUL prediction.
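
    A minimal sketch of the Bandt-Pompe permutation entropy that would serve as such an HI when applied to a discharge-voltage sequence (the order and delay defaults are illustrative):

        import numpy as np
        from math import factorial

        def permutation_entropy(x, order=3, delay=1, normalize=True):
            """Permutation entropy of a 1-D series from its ordinal patterns."""
            x = np.asarray(x, dtype=float)
            span = (order - 1) * delay + 1
            n = len(x) - span + 1
            patterns = np.array([np.argsort(x[i:i + span:delay]) for i in range(n)])
            codes = patterns @ (order ** np.arange(order))  # one integer per pattern
            _, counts = np.unique(codes, return_counts=True)
            p = counts / counts.sum()
            h = float(-(p * np.log2(p)).sum())
            return h / np.log2(factorial(order)) if normalize else h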

  13. Multi-Objective Optimal Design of Stand-Alone Hybrid Energy System Using Entropy Weight Method Based on HOMER

    Directory of Open Access Journals (Sweden)

    Jiaxin Lu

    2017-10-01

    Full Text Available Implementation of a hybrid energy system (HES) is generally considered a promising way to satisfy the electrification requirements of remote areas. In the present study, a novel decision-making methodology is proposed to identify the best compromise configuration of an HES from a set of feasible combinations obtained from HOMER. For this purpose, a multi-objective function comprising four crucial and representative indices is formulated by applying the weighted-sum method. The entropy weight method is employed as a quantitative methodology for calculating the weighting factors, in order to enhance the objectivity of decision-making. Moreover, the optimal design of a stand-alone PV/wind/battery/diesel HES on Yongxing Island, China, is conducted as a case study to validate the effectiveness of the proposed method. Both the simulation and optimization results indicate that the optimization method is able to identify the best trade-off configuration among system reliability, economy, practicability and environmental sustainability. Several useful conclusions are drawn by analyzing the operation of the best configuration.
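
    A minimal sketch of the entropy weight calculation, assuming a decision matrix with one row per feasible HOMER configuration and one positive, benefit-oriented column per index (normalization conventions vary across the literature):

        import numpy as np

        def entropy_weights(X):
            """Objective criterion weights from an (alternatives x criteria) matrix."""
            X = np.asarray(X, dtype=float)
            P = X / X.sum(axis=0)              # share of each alternative per criterion
            k = 1.0 / np.log(len(X))           # normalizing constant
            PlogP = np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0)
            e = -k * PlogP.sum(axis=0)         # entropy of each criterion
            d = 1.0 - e                        # degree of diversification
            return d / d.sum()                 # weights for the weighted sum

    The resulting weights then multiply the normalized indices in the weighted-sum objective used to rank the candidate configurations.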

  14. Extensivity of entropy and modern form of Gibbs paradox

    International Nuclear Information System (INIS)

    Home, D.; Sengupta, S.

    1981-01-01

    The extensivity property of entropy is clarified in the light of a critical examination of the entropy formula based on quantum statistics and of the relevant thermodynamic requirements. The modern form of the Gibbs paradox, related to the discontinuous jump in entropy due to the identity or non-identity of particles, is critically investigated. The qualitative framework of a new resolution of this paradox, which analyses the general effect of a distinction mark on the Hamiltonian of a system of identical particles, is outlined. (author)

  15. Entropy of network ensembles

    Science.gov (United States)

    Bianconi, Ginestra

    2009-03-01

    In this paper we generalize the concept of random networks to describe network ensembles with nontrivial features by a statistical mechanics approach. This framework is able to describe undirected and directed network ensembles as well as weighted network ensembles. These networks might have nontrivial community structure or, in the case of networks embedded in a given space, they might have a link probability with a nontrivial dependence on the distance between the nodes. These ensembles are characterized by their entropy, which evaluates the cardinality of networks in the ensemble. In particular, in this paper we define and evaluate the structural entropy, i.e., the entropy of the ensembles of undirected uncorrelated simple networks with given degree sequence. We stress the apparent paradox that scale-free degree distributions are characterized by having small structural entropy while they are so widely encountered in natural, social, and technological complex systems. We propose a solution to the paradox by proving that scale-free degree distributions are the most likely degree distribution with the corresponding value of the structural entropy. Finally, the general framework we present in this paper is able to describe microcanonical ensembles of networks as well as canonical or hidden-variable network ensembles with significant implications for the formulation of network-constructing algorithms.

  16. Entropy Production in Stochastics

    Directory of Open Access Journals (Sweden)

    Demetris Koutsoyiannis

    2017-10-01

    Full Text Available While the modern definition of entropy is genuinely probabilistic, in entropy production the classical thermodynamic definition, as in heat transfer, is typically used. Here we explore the concept of entropy production within stochastics and, particularly, two forms of entropy production in logarithmic time, unconditionally (EPLT) or conditionally on the past and present having been observed (CEPLT). We study the theoretical properties of both forms, in general and in application to a broad set of stochastic processes. A main question investigated, related to model identification and fitting from data, is how to estimate the entropy production from a time series. It turns out that there is a link of the EPLT with the climacogram, and of the CEPLT with two additional tools introduced here, namely the differenced climacogram and the climacospectrum. In particular, EPLT and CEPLT are related to slopes of log-log plots of these tools, with the asymptotic slopes at the tails being most important as they justify the emergence of scaling laws of second-order characteristics of stochastic processes. As a real-world application, we use an extraordinarily long time series of turbulent velocity and show how a parsimonious stochastic model can be identified and fitted using the tools developed.
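
    For orientation, the climacogram referred to above is the variance of the time-averaged process as a function of the averaging time scale k (a standard definition in this author's framework; notation assumed):

        \[
          \gamma(k) \;:=\; \operatorname{Var}\left[\frac{1}{k}\int_{0}^{k} x(u)\,\mathrm{d}u\right],
        \]

    with EPLT and CEPLT read off as slopes of log-log plots of this tool and of the differenced climacogram and climacospectrum, the asymptotic tail slopes being the ones that justify the scaling laws mentioned above.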

  17. Estimating the spatial distribution of soil moisture based on Bayesian maximum entropy method with auxiliary data from remote sensing

    Science.gov (United States)

    Gao, Shengguo; Zhu, Zhongli; Liu, Shaomin; Jin, Rui; Yang, Guangchao; Tan, Lei

    2014-10-01

    Soil moisture (SM) plays a fundamental role in the land-atmosphere exchange process. Spatial estimation based on multiple in situ (network) observations is a critical way to understand the spatial structure and variation of land surface soil moisture. Theoretically, integrating densely sampled auxiliary data that are spatially correlated with soil moisture into the estimation procedure can improve its accuracy. In this study, we present a novel approach to estimate the spatial pattern of soil moisture by using the BME method based on wireless sensor network data and auxiliary information from ASTER (Terra) land surface temperature (LST) measurements. For comparison, three traditional geostatistical methods were also applied: ordinary kriging (OK), which used the wireless sensor network data only, and regression kriging (RK) and ordinary co-kriging (Co-OK), which both integrated the ASTER land surface temperature as a covariate. In Co-OK, the LST entered the estimator linearly; in RK, the estimator is expressed as the sum of the regression estimate and the kriged estimate of the spatially correlated residual; in BME, the ASTER land surface temperature was first converted to soil moisture by linear regression, and then the t-distributed prediction interval (PI) of soil moisture was estimated and used as soft data in probability form. The results indicate that all of the methods provide reasonable estimates. Compared to OK, Co-OK, RK and BME provide more accurate spatial estimates by integrating the auxiliary information. RK and BME show more obvious improvement than Co-OK, and BME can even perform slightly better than RK. The inherent issue of spatial estimation (overestimation in the range of low values and underestimation in the range of high values) is also further mitigated in both RK and BME. We can conclude that integrating auxiliary data into spatial estimation can indeed improve the accuracy; BME and RK take better advantage of the auxiliary

  18. Empirical study on entropy models of cellular manufacturing systems

    Institute of Scientific and Technical Information of China (English)

    Zhifeng Zhang; Renbin Xiao

    2009-01-01

    From a theoretical point of view, the states of manufacturing resources can be monitored and assessed through the amount of information needed to describe their technological structure and operational state. The amount of information needed to describe cellular manufacturing systems is investigated via two measures: the structural entropy and the operational entropy. Based on the Shannon entropy, models of the structural entropy and the operational entropy of cellular manufacturing systems are developed, and the recognition of the states of manufacturing resources is illustrated. Scheduling is introduced to evaluate the entropy models of cellular manufacturing systems, and the concepts of maximum schedule horizon and schedule adherence are advanced to quantitatively evaluate the effectiveness of schedules. Finally, an example is used to demonstrate the validity of the proposed methodology.

  19. Parametric Bayesian Estimation of Differential Entropy and Relative Entropy

    OpenAIRE

    Gupta; Srivastava

    2010-01-01

    Given iid samples drawn from a distribution with known parametric form, we propose the minimization of expected Bregman divergence to form Bayesian estimates of differential entropy and relative entropy, and derive such estimators for the uniform, Gaussian, Wishart, and inverse Wishart distributions. Additionally, formulas are given for a log gamma Bregman divergence and the differential entropy and relative entropy for the Wishart and inverse Wishart. The results, as always with Bayesian est...
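
    For context, in the Gaussian case the quantity being estimated has the familiar closed form (standard result; d is the dimension and Σ the covariance matrix):

        \[
          h(X) \;=\; \tfrac{1}{2}\,\ln\left[(2\pi e)^{d}\,\det\Sigma\right],
        \]

    and the proposed Bayesian estimators replace such plug-in formulas with the minimizers of the expected Bregman divergence over the parameter posterior.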

  20. Algebraic topological entropy

    International Nuclear Information System (INIS)

    Hudetz, T.

    1989-01-01

    As a 'by-product' of the Connes-Narnhofer-Thirring theory of dynamical entropy for (originally non-Abelian) nuclear C*-algebras, the well-known variational principle for topological entropy is equivalently reformulated in purely algebraically defined terms for (separable) Abelian C*-algebras. This 'algebraic variational principle' should not only nicely illustrate the 'feed-back' of methods developed for quantum dynamical systems to the classical theory, but it could also be proved directly by 'algebraic' methods and could thus further simplify the original proof of the variational principle (at least 'in principle'). 23 refs. (Author)