Analysis of thermal systems using the entropy balance method
Energy Technology Data Exchange (ETDEWEB)
Huang, C.L.D.; Fartaj, S.A.; Fenton, D.L. [Kansas State Univ., Manhattan, KS (United States). Dept. of Mechanical Engineering]
1992-04-01
This study investigates the applicability of the second law of thermodynamics, via an entropy balance method, to the analysis and design of thermal systems. As examples, the entropy balance method is used to analyse a single-stage chiller system and a single-stage heat transformer, both with lithium-bromide/water as the working fluid. The entropy method yields not only the same information as the methods of energy and exergy analysis, but also clearly predicts the influence of the irreversibilities of individual components on the coefficient of performance and the effectiveness, based on process properties rather than on ambient conditions. Furthermore, this method presents the overall distribution of the heat input by displaying, without ambiguity, the additional heat required to overcome the irreversibility of each component. (Author).
Application of the maximum entropy method to profile analysis
International Nuclear Information System (INIS)
Armstrong, N.; Kalceff, W.; Cline, J.P.
1999-01-01
Full text: A maximum entropy (MaxEnt) method for analysing crystallite size- and strain-induced x-ray profile broadening is presented. This method treats the problems of determining the specimen profile, crystallite size distribution, and strain distribution in a general way by considering them as inverse problems. A common difficulty faced by many experimenters is their inability to determine a well-conditioned solution of the integral equation, which preserves the positivity of the profile or distribution. We show that the MaxEnt method overcomes this problem, while also enabling a priori information, in the form of a model, to be introduced into it. Additionally, we demonstrate that the method is fully quantitative, in that uncertainties in the solution profile or solution distribution can be determined and used in subsequent calculations, including mean particle sizes and rms strain. An outline of the MaxEnt method is presented for the specific problems of determining the specimen profile and crystallite or strain distributions for the correspondingly broadened profiles. This approach offers an alternative to standard methods such as those of Williamson-Hall and Warren-Averbach. An application of the MaxEnt method is demonstrated in the analysis of alumina size-broadened diffraction data (from NIST, Gaithersburg). It is used to determine the specimen profile and column-length distribution of the scattering domains. Finally, these results are compared with the corresponding Williamson-Hall and Warren-Averbach analyses. Copyright (1999) Australian X-ray Analytical Association Inc
Minimum entropy density method for the time series analysis
Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae
2009-01-01
The entropy density is an intuitive and powerful concept for studying complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, defined as the scale at which the uncertainty is minimized and hence the underlying pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.
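The abstract does not reproduce the MEDM formulas. A minimal sketch of the idea, under the assumptions that the series is first symbolized into up/down moves and that "entropy density" is taken as Shannon block entropy per symbol, might look like this (an illustration, not the authors' exact algorithm):

```python
import numpy as np
from collections import Counter

def entropy_density(symbols, n):
    """Shannon block entropy per symbol (bits) for words of length n."""
    words = [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]
    counts = Counter(words)
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -(p * np.log2(p)).sum() / n

def structure_scale(series, max_scale=8):
    """Scale n that minimizes the entropy density of the binarized returns."""
    returns = np.diff(np.asarray(series, dtype=float))
    symbols = (returns > 0).astype(int)       # 1 = up move, 0 = down move
    densities = {n: entropy_density(symbols, n)
                 for n in range(1, max_scale + 1)}
    return min(densities, key=densities.get), densities
```

For a strictly periodic series the entropy density keeps falling as the word length captures more of the repeating pattern, so the detected structure scale grows with the allowed maximum; for a noisy series it saturates early.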
Maximum Quantum Entropy Method
Sim, Jae-Hoon; Han, Myung Joon
2018-01-01
The maximum entropy method for analytic continuation is extended by introducing the quantum relative entropy. The new method is formulated in terms of matrix-valued functions and is therefore invariant under arbitrary unitary transformations of the input matrix. As a result, the continuation of off-diagonal elements becomes straightforward. Without introducing any further ambiguity, the Bayesian probabilistic interpretation is maintained just as in the conventional maximum entropy method. The applications o...
Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions
Xie, Jigang; Song, Wenyun
The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example, the risk analysis of Chinese listed firms' mergers and acquisitions, is provided to demonstrate the feasibility and practicality of the method.
Fault Diagnosis Method Based on Information Entropy and Relative Principal Component Analysis
Directory of Open Access Journals (Sweden)
Xiaoming Xu
2017-01-01
Full Text Available In traditional principal component analysis (PCA), because the differing dimensions (units of measurement) of the system variables are neglected, the selected principal components (PCs) often fail to be representative. While relative-transformation PCA can solve this problem, it is not easy to calculate the weight for each characteristic variable. To address this, this paper proposes a fault diagnosis method based on information entropy and relative principal component analysis. First, the algorithm calculates the information entropy of each characteristic variable in the original dataset based on the information gain algorithm. Second, it standardizes every variable's dimension in the dataset. Then, according to the information entropy, it allocates a weight to each standardized characteristic variable. Finally, it applies the established relative-principal-components model to fault diagnosis. Simulation experiments based on the Tennessee Eastman process and the Wine dataset demonstrate the feasibility and effectiveness of the new method.
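The entropy-based weighting step that such methods rely on can be sketched with the classical entropy-weight calculation: variables whose standardized values are more dispersed carry more information and receive larger weights. This is a minimal illustration of that idea, not the authors' exact algorithm:

```python
import numpy as np

def entropy_weights(X):
    """Entropy-weight method for an (m samples, n variables) non-negative
    matrix: a variable's weight grows with the dispersion of its values."""
    m = X.shape[0]
    P = X / X.sum(axis=0, keepdims=True)           # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        E = -np.where(P > 0, P * np.log(P), 0.0).sum(axis=0) / np.log(m)
    d = 1.0 - E                                    # degree of diversification
    return d / d.sum()

# A constant variable carries no information and gets zero weight
X = np.array([[1.0, 9.0],
              [1.0, 1.0],
              [1.0, 5.0]])
weights = entropy_weights(X)
```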
International Nuclear Information System (INIS)
Ponman, T.J.
1984-01-01
For some years now two different expressions have been in use for maximum entropy image restoration and there has been some controversy over which one is appropriate for a given problem. Here two further entropies are presented and it is argued that there is no single correct algorithm. The properties of the four different methods are compared using simple 1D simulations with a view to showing how they can be used together to gain as much information as possible about the original object. (orig.)
A Real-Time Analysis Method for Pulse Rate Variability Based on Improved Basic Scale Entropy
Directory of Open Access Journals (Sweden)
Yongxin Chou
2017-01-01
Full Text Available Base scale entropy analysis (BSEA) is a nonlinear method for analyzing heart rate variability (HRV) signals. However, BSEA is time-consuming, and it was unknown whether it is suitable for analyzing pulse rate variability (PRV) signals. We therefore propose a method named sliding window iterative base scale entropy analysis (SWIBSEA), which combines BSEA with sliding window iteration. Blood pressure signals of healthy young and elderly subjects, taken from the authoritative international database MIT/PhysioNet/Fantasia, are used to generate PRV signals as the experimental data. BSEA and SWIBSEA are then used to analyze the experimental data; the results show that SWIBSEA reduces the time consumption and buffer cache space while producing the same entropy as BSEA. Meanwhile, the changes in base scale entropy (BSE) for healthy young and elderly subjects are the same as those of the HRV signal. Therefore, SWIBSEA can be used to derive information from long-term and short-term PRV signals in real time, and has potential for dynamic PRV signal analysis in portable and wearable medical devices.
Multi-Scale Entropy Analysis as a Method for Time-Series Analysis of Climate Data
Directory of Open Access Journals (Sweden)
Heiko Balzter
2015-03-01
Full Text Available Evidence is mounting that the temporal dynamics of the climate system are changing at the same time as the average global temperature is increasing due to multiple climate forcings. A large number of extreme weather events such as prolonged cold spells, heatwaves, droughts and floods have been recorded around the world in the past 10 years. Such changes in the temporal scaling behaviour of climate time-series data can be difficult to detect. While there are easy and direct ways of analysing climate data by calculating the means and variances for different levels of temporal aggregation, these methods can miss more subtle changes in their dynamics. This paper describes multi-scale entropy (MSE analysis as a tool to study climate time-series data and to identify temporal scales of variability and their change over time in climate time-series. MSE estimates the sample entropy of the time-series after coarse-graining at different temporal scales. An application of MSE to Central European, variance-adjusted, mean monthly air temperature anomalies (CRUTEM4v is provided. The results show that the temporal scales of the current climate (1960–2014 are different from the long-term average (1850–1960. For temporal scale factors longer than 12 months, the sample entropy increased markedly compared to the long-term record. Such an increase can be explained by systems theory with greater complexity in the regional temperature data. From 1961 the patterns of monthly air temperatures are less regular at time-scales greater than 12 months than in the earlier time period. This finding suggests that, at these inter-annual time scales, the temperature variability has become less predictable than in the past. It is possible that climate system feedbacks are expressed in altered temporal scales of the European temperature time-series data. A comparison with the variance and Shannon entropy shows that MSE analysis can provide additional information on the
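MSE as described, coarse-graining followed by sample entropy at each scale, can be sketched as follows. The parameter choices m = 2 and r = 0.2·SD are the conventional ones; this is a compact illustration, not the code used in the paper:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r): -ln of the conditional probability that sequences
    matching for m points (Chebyshev distance <= r) also match at m+1."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    N = len(x)
    def count(mm):
        t = np.array([x[i:i + mm] for i in range(N - m)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        iu = np.triu_indices(len(t), k=1)
        return np.count_nonzero(d[iu] <= r)
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

def multiscale_entropy(x, scales=range(1, 11)):
    """Coarse-grain by non-overlapping averaging, then SampEn per scale."""
    x = np.asarray(x, dtype=float)
    out = {}
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        out[s] = sample_entropy(coarse)
    return out
```

A regular signal (e.g. a sinusoid) yields a much lower sample entropy than white noise of the same length, which is the discriminating property MSE exploits across scales.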
He, Jiayi; Shang, Pengjian; Xiong, Hui
2018-06-01
Stocks, as a concrete manifestation of financial time series carrying plenty of potential information, are often used in the study of financial time series. In this paper, we use stock data to recognize patterns through a dissimilarity matrix based on modified cross-sample entropy, and three-dimensional perceptual maps of the results are then produced by multidimensional scaling. Two modified multidimensional scaling methods are proposed: multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based and permutation-based cross-sample entropy to replace the distance or dissimilarity measure in classical multidimensional scaling (MDS). Multidimensional scaling based on the Chebyshev distance (MDSC) is employed as a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. This implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences between stock indices, caused by country or region and by differing financial policies, are reflected in the irregularity of the data. In the synthetic-data experiments, not only can time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial-data experiment, the stock indices are clearly divided into five groups corresponding to five regions: Europe, North America, South America, the Asia-Pacific (excluding mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions than MDSC.
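The entropy-based variants replace only the dissimilarity input; the classical (Torgerson) MDS step itself, which turns any symmetric dissimilarity matrix into low-dimensional coordinates, can be sketched as follows (a generic numpy illustration, not the authors' code):

```python
import numpy as np

def classical_mds(D, k=3):
    """Torgerson classical MDS: find k-dimensional coordinates whose
    Euclidean distances approximate the dissimilarity matrix D
    (symmetric, zero diagonal)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n           # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                   # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:k]            # keep largest eigenvalues
    L = np.sqrt(np.clip(vals[order], 0.0, None))
    return vecs[:, order] * L                     # (n, k) coordinates

# If D comes from genuine Euclidean points, the embedding reproduces it
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
X = classical_mds(D, k=2)
D_hat = np.linalg.norm(X[:, None] - X[None, :], axis=2)
```

With an entropy-based dissimilarity in place of `D`, the eigen-decomposition step is unchanged, which is why swapping the measure is enough to define MDS-KCSE and MDS-PCSE.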
International Nuclear Information System (INIS)
Giangaspero, Giorgio; Sciubba, Enrico
2013-01-01
This paper presents an application of the entropy generation minimization method to the pseudo-optimization of the configuration of the heat exchange surfaces in a Solar Rooftile. An initial “standard” commercial configuration is gradually improved by introducing design changes aimed at the reduction of the thermodynamic losses due to heat transfer and fluid friction. Different geometries (pins, fins and others) are analysed with a commercial CFD (Computational Fluid Dynamics) code that also computes the local entropy generation rate. The design improvement process is carried out on the basis of a careful analysis of the local entropy generation maps and the rationale behind each step of the process is discussed in this perspective. The results are compared with other entropy generation minimization techniques available in the recent technical literature. It is found that the geometry with pin-fins has the best performance among the tested ones, and that the optimal pin array shape parameters (pitch and span) can be determined by a critical analysis of the integrated and local entropy maps and of the temperature contours. - Highlights: ► An entropy generation minimization method is applied to a solar heat exchanger. ► The approach is heuristic and leads to a pseudo-optimization process with CFD as main tool. ► The process is based on the evaluation of the local entropy generation maps. ► The geometry with pin-fins in general outperforms all other configurations. ► The entropy maps and temperature contours can be used to determine the optimal pin array design parameters
Directory of Open Access Journals (Sweden)
Hujun He
2017-01-01
Full Text Available The prediction and risk classification of collapse is an important issue in highway construction in mountainous regions. Based on the principles of information entropy and Mahalanobis-distance discriminant analysis, we have produced a collapse hazard prediction model. We used the entropy measure method to reduce the set of indexes of collapse activity and extracted the nine main indexes affecting collapse activity as the discriminant factors of the distance discriminant analysis model: slope shape, aspect, gradient, and height, along with exposure of the structural face, stratum lithology, the relationship between the weakness face and the free face, vegetation cover rate, and degree of rock weathering. We employed post-earthquake collapse data from the construction of the Yingxiu-Wolong highway, Hanchuan County, China, as training samples for the analysis. The results were checked using the back-substitution estimation method, showing high accuracy and no errors, and agreed with the prediction results of the uncertainty measure method. The results show that the classification model based on information entropy and distance discriminant analysis achieves index optimization and has excellent performance, high prediction accuracy, and a zero false-positive rate. The model can be used as a tool for future evaluation of collapse risk.
Takahashi, Osamu; Nomura, Tetsuo; Tabayashi, Kiyohiko; Yamasaki, Katsuyoshi
2008-07-01
We performed spectral analysis using the maximum entropy method instead of the traditional Fourier transform technique to investigate short-time behavior in molecular systems, such as energy transfer between vibrational modes and chemical reactions. The procedure was applied to direct ab initio molecular dynamics calculations of the decomposition of formic acid. More reactive trajectories were obtained for dehydration than for decarboxylation of Z-formic acid, consistent with the predictions of previous theoretical and experimental studies. Short-time maximum entropy analyses were performed for typical reactive and non-reactive trajectories. The spectrograms obtained for a reactive trajectory clearly showed the reactant, transient, and product regions, especially for the dehydration path.
Directory of Open Access Journals (Sweden)
Kemal Vatansever
2017-07-01
Full Text Available The revolutionary changes in information and communication technologies have triggered an increase in electronic commerce applications. Airline tickets are one of the most popular items purchased on the internet, and airline websites have become a major distribution channel through which companies sustain their competitiveness. Competition is increasing as airlines try to acquire and retain customers, and in such a highly competitive market it is important for airlines to understand their relative levels of quality in terms of the critical elements affecting their competitive advantage. In this study, an integrated two-stage multi-criteria decision-making approach, combining the entropy weight method and grey relational analysis, was used to measure the performance of airline websites. The websites of 11 airline companies operating in Turkey were evaluated against seven criteria. Website quality data were collected in more than 30 trials on various occasions between 1 December 2016 and 31 December 2016. The weights of the attributes were calculated with the entropy weight method, and the websites were ranked by evaluating the alternatives with the grey relational analysis method.
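The two stages can be sketched as follows. The benefit-type normalization and the distinguishing coefficient ρ = 0.5 are conventional choices, and the criteria scores below are invented for illustration (not the study's data):

```python
import numpy as np

def entropy_weights(X):
    """Stage 1: entropy weight method - dispersed criteria get larger weights."""
    m = X.shape[0]
    P = X / X.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        E = -np.where(P > 0, P * np.log(P), 0.0).sum(axis=0) / np.log(m)
    d = 1.0 - E
    return d / d.sum()

def grey_relational_grades(X, w, rho=0.5):
    """Stage 2: grey relational analysis against the ideal alternative,
    treating every criterion as benefit-type."""
    Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    delta = np.abs(1.0 - Z)                        # distance to the ideal row
    coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coef @ w

# Hypothetical scores of three websites on two quality criteria
X = np.array([[9.0, 8.0],
              [5.0, 5.0],
              [1.0, 2.0]])
grades = grey_relational_grades(X, entropy_weights(X))
ranking = np.argsort(grades)[::-1]                 # best website first
```

An alternative that attains the best value on every criterion receives a grey relational grade of 1 and tops the ranking.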
Multivariate refined composite multiscale entropy analysis
International Nuclear Information System (INIS)
Humeau-Heurtier, Anne
2016-01-01
Multiscale entropy (MSE) has become a prevailing method to quantify signal complexity. MSE relies on sample entropy. However, MSE may yield imprecise complexity estimates at large scales, because sample entropy does not give precise estimates of entropy for short signals. A refined composite multiscale entropy (RCMSE) has therefore recently been proposed. Nevertheless, RCMSE handles univariate signals only. The simultaneous analysis of multi-channel (multivariate) data often outperforms studies based on univariate signals. We therefore introduce an extension of RCMSE to multivariate data. Applications of multivariate RCMSE to simulated processes reveal its better performance over the standard multivariate MSE. - Highlights: • Multiscale entropy quantifies data complexity but may be inaccurate at large scales. • A refined composite multiscale entropy (RCMSE) has therefore recently been proposed. • Nevertheless, RCMSE is adapted to univariate time series only. • We herein introduce an extension of RCMSE to multivariate data. • It shows better performance than the standard multivariate multiscale entropy.
Performance Analysis of Entropy Methods on K Means in Clustering Process
Dicky Syahputra Lubis, Mhd.; Mawengkang, Herman; Suwilo, Saib
2017-12-01
K-means is a non-hierarchical data clustering method that partitions data into one or more clusters, so that data with the same characteristics are grouped into the same cluster and data with different characteristics fall into other groups. The purpose of this clustering is to minimize an objective function set for the clustering process, which generally tries to minimize variation within clusters and maximize variation between clusters. However, a main disadvantage of the method is that the number k is often not known beforehand, and randomly chosen starting points may place two initial centroids close together. Therefore, the entropy method is used to determine the starting points for K-means; this method can assign weights and support a decision among a set of alternatives. Entropy can investigate the degree of discrimination among a multitude of data sets: criteria with the highest variation receive the highest weight. The entropy method thus helps the K-means process by determining the starting points, which are usually chosen at random, so that clustering converges faster than the standard K-means process. Using the postoperative patient dataset from the UCI Machine Learning Repository, with only 12 records as a worked example, the entropy-seeded method reached the desired final result in only 2 iterations.
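The abstract does not spell out the seeding rule. One plausible deterministic reading — weight the attributes by the entropy method, score each point, and take evenly spaced quantiles of the score as the initial centroids — can be sketched as follows. Both the seeding rule and the two-blob data are assumptions for illustration, not the authors' algorithm:

```python
import numpy as np

def entropy_seeds(X, k):
    """Hypothetical entropy-based seeding: rank points by an
    entropy-weighted score and take k evenly spaced quantiles."""
    m = X.shape[0]
    P = X / X.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        E = -np.where(P > 0, P * np.log(P), 0.0).sum(axis=0) / np.log(m)
    w = (1.0 - E) / (1.0 - E).sum()
    score = X @ w
    idx = np.argsort(score)[np.linspace(0, m - 1, k, dtype=int)]
    return X[idx].astype(float)

def kmeans(X, centroids, iters=100):
    """Plain Lloyd iterations from the given starting centroids."""
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(axis=0)
                        for j in range(len(centroids))])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Two well-separated blobs of positive data (illustrative only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(1.0, 0.1, (20, 2)),
               rng.normal(10.0, 0.1, (20, 2))])
labels, _ = kmeans(X, entropy_seeds(X, 2))
```

Because the quantile seeds land in different blobs, Lloyd's iteration converges immediately to the correct partition instead of depending on a random draw.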
Analysis of QCD sum rule based on the maximum entropy method
International Nuclear Information System (INIS)
Gubler, Philipp
2012-01-01
The QCD sum rule was developed about thirty years ago and has been used up to the present to calculate various physical quantities of hadrons. Conventional analyses, however, must assume a 'pole + continuum' form for the spectral function, and the method runs into difficulties when this assumption is not satisfied. To avoid this difficulty, an analysis making use of the maximum entropy method (MEM) has been developed by the present author. It is reported here how far this new method can be successfully applied. In the first section, the general features of the QCD sum rule are introduced. In section 2, it is discussed why the analysis of the QCD sum rule based on the MEM is so effective. In section 3, the MEM analysis process is described: subsection 3.1 considers the likelihood function and the prior probability, and subsection 3.2 takes up the numerical analyses. In section 4, some applications are described, starting with ρ mesons, then charmonium at finite temperature, and finally recent developments. Some figures of the spectral functions are shown. In section 5, a summary of the present analysis method and future prospects are given. (S. Funahashi)
Critical Analysis of Non-Nuclear Electron-Density Maxima and the Maximum Entropy Method
de Vries, R.Y.; Briels, Willem J.; Feil, D.; Feil, D.
1996-01-01
Experimental evidence for the existence of non-nuclear maxima in charge densities is questioned. It is shown that the non-nuclear maxima reported for silicon are artifacts of the maximum entropy method that was used to analyze the x-ray diffraction data. This method can be improved by the use of
Bonito, Andrea
2013-10-03
We establish the L2-stability of an entropy viscosity technique applied to nonlinear scalar conservation equations. First-and second-order explicit time-stepping techniques using continuous finite elements in space are considered. The method is shown to be stable independently of the polynomial degree of the space approximation under the standard CFL condition. © 2013 American Mathematical Society.
The Maximum Entropy Method for Optical Spectrum Analysis of Real-Time TDDFT
International Nuclear Information System (INIS)
Toogoshi, M; Kano, S S; Zempo, Y
2015-01-01
The maximum entropy method (MEM) is one of the key techniques for spectral analysis. Its major feature is that the low-frequency part of a spectrum can be described from short time-series data. We therefore applied the MEM to analyse the spectrum obtained from the time-dependent dipole moment of real-time time-dependent density functional theory (TDDFT) calculations, which are intensively studied for computing optical properties. In the MEM analysis, however, the maximum lag of the autocorrelation is restricted by the total number of time-series data. As an improved MEM analysis, we propose using a concatenated data set made from several repetitions of the raw data. We have applied this technique to the spectral analysis of the TDDFT dipole moments of ethylene and oligofluorene with n = 8. As a result, higher resolution can be obtained, closer to that of a Fourier transform of practically time-evolved data with the same total number of time steps. The efficiency and characteristic features of this technique are presented in this paper. (paper)
Methods for calculating nonconcave entropies
International Nuclear Information System (INIS)
Touchette, Hugo
2010-01-01
Five different methods which can be used to analytically calculate entropies that are nonconcave as functions of the energy in the thermodynamic limit are discussed and compared. The five methods are based on the following ideas and techniques: (i) microcanonical contraction, (ii) metastable branches of the free energy, (iii) generalized canonical ensembles with specific illustrations involving the so-called Gaussian and Betrag ensembles, (iv) the restricted canonical ensemble, and (v) the inverse Laplace transform. A simple long-range spin model having a nonconcave entropy is used to illustrate each method
Analysis of calculating methods for failure distribution function based on maximal entropy principle
International Nuclear Information System (INIS)
Guo Chunying; Lin Yuangen; Jiang Meng; Wu Changli
2009-01-01
The computation of failure distribution functions of electronic devices exposed to gamma rays is discussed. First, possible device failure distribution models are determined through statistical hypothesis tests on the test data. The results show that the devices' failure behavior can be fitted by several distributions when test data are scarce. To decide on the optimum failure distribution model, the maximal entropy principle is used and the elementary failure models are determined. Then, the bootstrap estimation method is used to simulate the interval estimation of the mean and the standard deviation. On this basis, the maximal entropy principle is applied again, with the simulated annealing method used to find the optimum values of the mean and the standard deviation. Accordingly, the devices' optimum failure distributions are finally determined and the survival probabilities are calculated. (authors)
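The bootstrap interval-estimation step can be illustrated with a percentile bootstrap. This is a generic sketch with made-up measurements, not the report's device data:

```python
import numpy as np

def bootstrap_ci(data, stat, n_boot=5000, alpha=0.05, seed=0):
    """Percentile-bootstrap (1 - alpha) confidence interval for a statistic:
    resample with replacement, evaluate the statistic, take quantiles."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.array([stat(rng.choice(data, size=n, replace=True))
                     for _ in range(n_boot)])
    return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

# Hypothetical failure-threshold measurements (arbitrary units)
data = np.array([4.8, 5.1, 5.0, 4.7, 5.3, 4.9, 5.2, 5.0, 4.6, 5.4])
mean_ci = bootstrap_ci(data, np.mean)
std_ci = bootstrap_ci(data, np.std)
```

The resulting intervals for the mean and standard deviation are exactly the kind of constraint ranges that the maximal entropy optimization described above would then search within.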
Directory of Open Access Journals (Sweden)
Jinlu Sheng
2016-07-01
Full Text Available To effectively extract the typical features of a bearing, a new method is proposed that combines local mean decomposition, Shannon entropy, and an improved kernel principal component analysis model. First, features are extracted by a time-frequency domain method: local mean decomposition is applied, and the Shannon entropy of the separated product functions is computed to obtain the original features. Because the extracted features still contain superfluous information, the nonlinear multi-feature fusion technique of kernel principal component analysis, improved by a weight factor, is introduced to fuse the features. The extracted characteristic features were input into a Morlet wavelet kernel support vector machine to obtain a bearing running-state classification model, by which the bearing running state was identified. Both test cases and field cases were analyzed.
International Nuclear Information System (INIS)
Fiebig, H. Rudolf
2002-01-01
We study various aspects of extracting spectral information from time correlation functions of lattice QCD by means of Bayesian inference with an entropic prior, the maximum entropy method (MEM). Correlator functions of a heavy-light meson-meson system serve as a repository for lattice data with diverse statistical quality. Attention is given to spectral mass density functions, inferred from the data, and their dependence on the parameters of the MEM. We propose to employ simulated annealing, or cooling, to solve the Bayesian inference problem, and discuss the practical issues of the approach
International Nuclear Information System (INIS)
Nasser, Hassan; Cessac, Bruno; Marre, Olivier
2013-01-01
Understanding the dynamics of neural networks is a major challenge in experimental neuroscience. For that purpose, a modelling of the recorded activity that reproduces the main statistics of the data is required. In the first part, we present a review on recent results dealing with spike train statistics analysis using maximum entropy models (MaxEnt). Most of these studies have focused on modelling synchronous spike patterns, leaving aside the temporal dynamics of the neural activity. However, the maximum entropy principle can be generalized to the temporal case, leading to Markovian models where memory effects and time correlations in the dynamics are properly taken into account. In the second part, we present a new method based on Monte Carlo sampling which is suited for the fitting of large-scale spatio-temporal MaxEnt models. The formalism and the tools presented here will be essential to fit MaxEnt spatio-temporal models to large neural ensembles. (paper)
Xu, Jun; Dang, Chao; Kong, Fan
2017-10-01
This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
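The maximum-entropy step, recovering a PDF from a few (possibly fractional) moments, can be sketched in one dimension by minimizing the convex dual potential with plain gradient descent. This is a simplified illustration under assumed settings (a uniform grid, gradient descent instead of a more robust optimizer, and a single integer moment in the demo), not the paper's RQ-SPM procedure:

```python
import numpy as np

def maxent_pdf(moments, alphas, grid, lr=0.1, iters=5000):
    """Discretized maximum-entropy density p(x) ~ exp(-sum_i l_i x**a_i)
    matching E[x**a_i] = moments[i], found by descending the convex dual
    D(l) = ln Z(l) + l . mu, whose gradient is mu - E_p[x**a]."""
    G = np.stack([grid ** a for a in alphas])       # moment basis on the grid
    dx = grid[1] - grid[0]
    mu = np.asarray(moments, dtype=float)
    lam = np.zeros(len(alphas))
    for _ in range(iters):
        s = -(lam @ G)
        z = np.exp(s - s.max())                     # stabilized exponent
        p = z / (z.sum() * dx)                      # current density
        model = (G * p).sum(axis=1) * dx            # current moments
        lam += lr * (model - mu)                    # dual gradient step
    return p

# One constraint E[x] = 1 on [0, 20] should recover exp(-x)
grid = np.linspace(0.0, 20.0, 2001)
p = maxent_pdf([1.0], [1.0], grid)
```

With fractional exponents in `alphas`, the same loop matches fractional moments, which is the constraint type the paper advocates.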
Entropy methods for diffusive partial differential equations
Jüngel, Ansgar
2016-01-01
This book presents a range of entropy methods for diffusive PDEs devised by many researchers in the course of the past few decades, which allow us to understand the qualitative behavior of solutions to diffusive equations (and Markov diffusion processes). Applications include the large-time asymptotics of solutions, the derivation of convex Sobolev inequalities, the existence and uniqueness of weak solutions, and the analysis of discrete and geometric structures of the PDEs. The purpose of the book is to provide readers an introduction to selected entropy methods that can be found in the research literature. In order to highlight the core concepts, the results are not stated in the widest generality and most of the arguments are only formal (in the sense that the functional setting is not specified or sufficient regularity is supposed). The text is also suitable for advanced master and PhD students and could serve as a textbook for special courses and seminars.
Texture analysis using Renyi's generalized entropies
Grigorescu, SE; Petkov, N
2003-01-01
We propose a texture analysis method based on Renyi's generalized entropies. The method aims at identifying texels in regular textures by searching for the smallest window through which the minimum number of different visual patterns is observed when moving the window over a given texture. The
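Rényi's order-q entropy of a gray-level distribution, the building block of such a method, can be sketched as follows; the texel-search procedure itself is more involved than this minimal illustration:

```python
import numpy as np

def renyi_entropy(patch, q=2.0, levels=256):
    """Renyi entropy (bits) of order q of the gray-level histogram of an
    integer-valued image patch; q -> 1 recovers Shannon entropy."""
    hist = np.bincount(patch.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    if q == 1.0:
        return float(-(p * np.log2(p)).sum())      # Shannon limit
    return float(np.log2((p ** q).sum()) / (1.0 - q))

flat = np.arange(16).reshape(4, 4)     # all 16 levels once: maximal entropy
const = np.zeros((4, 4), dtype=int)    # single level: zero entropy
```

Sliding a window over a texture and tracking how this entropy changes with window size is the mechanism by which the smallest window exposing the fewest distinct patterns, the candidate texel, is sought.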
Maximum entropy and Bayesian methods
International Nuclear Information System (INIS)
Smith, C.R.; Erickson, G.J.; Neudorfer, P.O.
1992-01-01
Bayesian probability theory and Maximum Entropy methods are at the core of a new view of scientific inference. These 'new' ideas, along with the revolution in computational methods afforded by modern computers allow astronomers, electrical engineers, image processors of any type, NMR chemists and physicists, and anyone at all who has to deal with incomplete and noisy data, to take advantage of methods that, in the past, have been applied only in some areas of theoretical physics. The title workshops have been the focus of a group of researchers from many different fields, and this diversity is evident in this book. There are tutorial and theoretical papers, and applications in a very wide variety of fields. Almost any instance of dealing with incomplete and noisy data can be usefully treated by these methods, and many areas of theoretical research are being enhanced by the thoughtful application of Bayes' theorem. Contributions contained in this volume present a state-of-the-art overview that will be influential and useful for many years to come
RELAP-7 Numerical Stabilization: Entropy Viscosity Method
Energy Technology Data Exchange (ETDEWEB)
R. A. Berry; M. O. Delchini; J. Ragusa
2014-06-01
The RELAP-7 code is the next generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios. RELAP-7 utilizes a single-phase model and a novel seven-equation two-phase flow model, as described in the RELAP-7 Theory Manual (INL/EXT-14-31366). The basic equation systems are hyperbolic, which generally requires some type of stabilization (or artificial viscosity) to capture nonlinear discontinuities and to suppress advection-caused oscillations. This report documents one of the available options for this stabilization in RELAP-7 -- a novel approach known as the entropy viscosity method. Because the code is an ongoing development effort in which the physical sub-models, numerics, and coding are evolving, the specific details of the entropy viscosity stabilization method must evolve with them. Here the fundamentals of the method in their current state are presented.
Directory of Open Access Journals (Sweden)
SAURABH KUMAR GUPTA
2015-01-01
The present research focuses on optimization of Friction Stir Welding (FSW) process parameters for joining of AA6061 aluminium alloy using a hybrid approach. The FSW process parameters considered are tool rotational speed, welding speed and axial force. The quality characteristics considered are tensile strength (TS) and percentage of tensile elongation (TE). A Taguchi L9 orthogonal array experimental design is used for obtaining the experimental results. The weight corresponding to each quality characteristic is determined by the entropy measurement method so that its importance can be properly represented. Analysis of Variance (ANOVA) is used to determine the contribution of the FSW process parameters. Confirmation tests have also been carried out to verify the results.
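The entropy measurement (entropy weight) step mentioned above assigns larger weights to quality characteristics whose normalized values are more dispersed across experiments. A minimal sketch follows; the response values are hypothetical stand-ins for an L9-style experiment, not the paper's data.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rows = experiments, cols = quality criteria.

    A criterion with lower entropy (more dispersed normalized values) is
    more discriminating and receives a larger weight.
    """
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                       # normalize each column
    n = X.shape[0]
    logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    E = -(P * logP).sum(axis=0) / np.log(n)     # entropy scaled to [0, 1]
    d = 1.0 - E                                 # degree of diversification
    return d / d.sum()

# Hypothetical responses: tensile strength (MPa), tensile elongation (%).
X = np.array([[210.0, 8.1], [225.0, 7.9], [198.0, 8.4],
              [240.0, 7.2], [232.0, 7.5], [205.0, 8.0],
              [218.0, 7.8], [228.0, 7.6], [215.0, 8.2]])
w = entropy_weights(X)        # weights sum to 1, one per quality criterion
```

These weights would then feed a combined (e.g., grey relational) grade before the ANOVA step.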
Entropy viscosity method for nonlinear conservation laws
Guermond, Jean-Luc
2011-05-01
A new class of high-order numerical methods for approximating nonlinear conservation laws is described (entropy viscosity method). The novelty is that a nonlinear viscosity based on the local size of an entropy production is added to the numerical discretization at hand. This new approach does not use any flux or slope limiters, applies to equations or systems supplemented with one or more entropy inequalities and does not depend on the mesh type and polynomial approximation. Various benchmark problems are solved with finite elements, spectral elements and Fourier series to illustrate the capability of the proposed method. © 2010 Elsevier Inc.
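The key ingredient described above, a viscosity proportional to the local entropy residual and capped by a first-order viscosity, can be illustrated numerically for 1-D Burgers' equation. The constants c_max and c_e and the Lax-Friedrichs predictor step below are assumptions of this sketch, not the authors' exact discretization.

```python
import numpy as np

# Entropy viscosity coefficient for u_t + (u^2/2)_x = 0 with entropy pair
# E = u^2/2, F = u^3/3, on a periodic grid with a square-pulse state.
N = 200
h = 1.0 / N
x = (np.arange(N) + 0.5) * h
u = np.where((x > 0.25) & (x < 0.75), 1.0, 0.0)   # two fronts
dt = 0.4 * h                                      # CFL step for max|u| = 1
up, um = np.roll(u, -1), np.roll(u, 1)
# One stable Lax-Friedrichs step provides the second time level needed for
# the discrete entropy residual R = dE/dt + dF/dx.
u_new = 0.5 * (up + um) - dt / (2 * h) * (0.5 * up**2 - 0.5 * um**2)
E, E_new = 0.5 * u**2, 0.5 * u_new**2
F = u**3 / 3.0
R = (E_new - E) / dt + (np.roll(F, -1) - np.roll(F, 1)) / (2 * h)
c_max, c_e = 0.5, 1.0
nu_max = c_max * h * np.abs(u)                    # first-order viscosity cap
nu_e = c_e * h**2 * np.abs(R) / np.abs(E - E.mean()).max()
nu = np.minimum(nu_max, nu_e)                     # entropy viscosity
# nu vanishes in the smooth (here constant) regions and switches on only at
# the two fronts, where the entropy residual is O(1/h); in a full solver it
# would feed a dissipative term (nu * u_x)_x added to the discretization.
```

No limiter is involved: the residual itself localizes the dissipation, which is what makes the approach independent of mesh type and polynomial degree.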
Entropy viscosity method for nonlinear conservation laws
Guermond, Jean-Luc; Pasquetti, Richard; Popov, Bojan
2011-01-01
A new class of high-order numerical methods for approximating nonlinear conservation laws is described (entropy viscosity method). The novelty is that a nonlinear viscosity based on the local size of an entropy production is added to the numerical discretization at hand. This new approach does not use any flux or slope limiters, applies to equations or systems supplemented with one or more entropy inequalities and does not depend on the mesh type and polynomial approximation. Various benchmark problems are solved with finite elements, spectral elements and Fourier series to illustrate the capability of the proposed method. © 2010 Elsevier Inc.
Combined analysis of steady state and transient transport by the maximum entropy method
Energy Technology Data Exchange (ETDEWEB)
Giannone, L.; Stroth, U; Koellermeyer, J [Association Euratom-Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); and others
1996-04-01
A new maximum entropy approach has been applied to analyse three types of transient transport experiments. For sawtooth propagation experiments in the ASDEX Upgrade and ECRH power modulation and power-switching experiments in the Wendelstein 7-AS Stellarator, either the time evolution of the temperature perturbation or the phase and amplitude of the modulated temperature perturbation are used as non-linear constraints to the {chi}{sub e} profile to be fitted. Simultaneously, the constraints given by the equilibrium temperature profile for steady-state power balance are fitted. In the maximum entropy formulation, the flattest {chi}{sub e} profile consistent with the constraints is found. It was found that {chi}{sub e} determined from sawtooth propagation was greater than the power balance value by a factor of five in the ASDEX Upgrade. From power modulation experiments, employing the measurements of four modulation frequencies simultaneously, the power deposition profile as well as the {chi}{sub e} profile could be determined. A comparison of the predictions of a time-independent {chi}{sub e} model and a power-dependent {chi}{sub e} model is made. The power-switching experiments show that the {chi}{sub e} profile must change within a millisecond to a new value consistent with the power balance value at the new input power. Neither power deposition broadening due to suprathermal electrons nor temperature or gradient dependences of {chi}{sub e} can explain this observation. (author).
International Nuclear Information System (INIS)
Kim, Jongkwang; Kim, Sowun; Lee, Kunsang; Kwon, Younghun
2009-01-01
In this article, we investigate the language structure of the 16 yeast chromosomes. To find it, we apply the entropy analysis for codons (or amino acids), developed in the analysis of natural language by Montemurro et al., to the 16 yeast chromosomes. From the analysis, we can see that a language structure exists in the codons (or amino acids) of the 16 yeast chromosomes. We also find that the grammar structure of the amino acids has a deep relationship with the secondary structure of proteins.
Yu, Hwa-Lung; Chiang, Chi-Ting; Lin, Shu-De; Chang, Tsun-Kuo
2010-02-01
The incidence rate of oral cancer in Changhua County was the highest among the 23 counties of Taiwan during 2001. However, in health data analysis, crude or adjusted incidence rates of a rare event (e.g., cancer) for small populations often exhibit high variances and are, thus, less reliable. We proposed a generalized Bayesian Maximum Entropy (GBME) analysis of spatiotemporal disease mapping under conditions of considerable data uncertainty. GBME was used to study the oral cancer population incidence in Changhua County (Taiwan). Methodologically, GBME is based on an epistematics principles framework and generates spatiotemporal estimates of oral cancer incidence rates. It accounts for the multi-sourced uncertainty of rates, including small-population effects, and the composite space-time dependence of rare events in terms of an extended Poisson-based semivariogram. The results showed that GBME analysis alleviates the noise in oral cancer data arising from the population size effect. Compared to the raw incidence data, the maps of GBME-estimated results can identify high-risk oral cancer regions in Changhua County, where the prevalence of betel quid chewing and cigarette smoking is relatively higher than in the rest of the area. The GBME method is a valuable tool for spatiotemporal disease mapping under conditions of uncertainty. 2010 Elsevier Inc. All rights reserved.
Maximum entropy analysis of EGRET data
DEFF Research Database (Denmark)
Pohl, M.; Strong, A.W.
1997-01-01
EGRET data are usually analysed on the basis of the Maximum-Likelihood method \cite{ma96} in a search for point sources in excess of a model for the background radiation (e.g. \cite{hu97}). This method depends strongly on the quality of the background model, and thus may have high systematic uncertainties in regions of strong and uncertain background like the Galactic Center region. Here we show images of such regions obtained by the quantified Maximum-Entropy method. We also discuss a possible further use of MEM in the analysis of problematic regions of the sky.
Delgado-Villanueva, Kiko Alexi; Romero Gil, Inmaculada
2016-01-01
Environmental conflict analysis (henceforth ECA) has become a key factor for the viability of projects and the welfare of affected populations. In this study, we propose an approach for ECA using an integrated grey clustering and entropy-weight method (the IGCEW method). The case study considered a mining project in northern Peru. Three stakeholder groups and seven criteria were identified. The data were gathered by conducting field interviews. The results revealed that for the groups urban ...
Entropy viscosity method applied to Euler equations
International Nuclear Information System (INIS)
Delchini, M. O.; Ragusa, J. C.; Berry, R. A.
2013-01-01
The entropy viscosity method [4] has been successfully applied to hyperbolic systems of equations such as the Burgers equation and the Euler equations. The method consists in adding dissipative terms to the governing equations, where a viscosity coefficient modulates the amount of dissipation. The entropy viscosity method has been applied to the 1-D Euler equations with variable area using a continuous finite element discretization in the MOOSE framework, and our results show that it has the ability to efficiently smooth out oscillations and accurately resolve shocks. Two equations of state are considered: the ideal gas and stiffened gas equations of state. Results are provided for a second-order implicit time scheme (BDF2). Some typical Riemann problems are run with the entropy viscosity method to demonstrate some of its features. Then, a 1-D convergent-divergent nozzle is considered with open boundary conditions. The correct steady state is reached for the liquid and gas phases with a time implicit scheme. The entropy viscosity method behaves correctly in every problem run. For each test problem, results are shown for both equations of state considered here. (authors)
The maximum-entropy method in superspace
Czech Academy of Sciences Publication Activity Database
van Smaalen, S.; Palatinus, Lukáš; Schneider, M.
2003-01-01
Roč. 59, - (2003), s. 459-469 ISSN 0108-7673 Grant - others:DFG(DE) XX Institutional research plan: CEZ:AV0Z1010914 Keywords: maximum-entropy method * aperiodic crystals * electron density Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 1.558, year: 2003
Entropy-based benchmarking methods
Temurshoev, Umed
2012-01-01
We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of the Denton (1971) method and the growth
An Adaptively Accelerated Bayesian Deblurring Method with Entropy Prior
Directory of Open Access Journals (Sweden)
Yong-Hoon Kim
2008-05-01
The development of an efficient adaptively accelerated iterative deblurring algorithm based on Bayesian statistical concepts is reported. The entropy of the image is used as a "prior" distribution and, instead of the additive form used in conventional acceleration methods, an exponential form of the relaxation constant is used for acceleration. The proposed method is therefore hereafter called adaptively accelerated maximum a posteriori with entropy prior (AAMAPE). Based on empirical observations in different experiments, the exponent is computed adaptively using first-order derivatives of the deblurred image from the previous two iterations. This exponent improves the speed of the AAMAPE method in the early stages and ensures stability in the later stages of iteration. In the AAMAPE method, we also consider the constraints of nonnegativity and flux conservation. The paper discusses the fundamental idea of Bayesian image deblurring with the use of entropy as a prior, and gives an analytical analysis of the superresolution and noise amplification characteristics of the proposed method. The experimental results show that the proposed AAMAPE method gives lower RMSE and higher SNR in 44% fewer iterations as compared to the nonaccelerated maximum a posteriori with entropy prior (MAPE) method. Moreover, AAMAPE followed by wavelet Wiener filtering gives better results than state-of-the-art methods.
Entropy Generation Analysis of Desalination Technologies
Directory of Open Access Journals (Sweden)
John H. Lienhard V
2011-09-01
Increasing global demand for fresh water is driving the development and implementation of a wide variety of seawater desalination technologies. Entropy generation analysis, and specifically, Second Law efficiency, is an important tool for illustrating the influence of irreversibilities within a system on the required energy input. When defining Second Law efficiency, the useful exergy output of the system must be properly defined. For desalination systems, this is the minimum least work of separation required to extract a unit of water from a feed stream of a given salinity. In order to evaluate the Second Law efficiency, entropy generation mechanisms present in a wide range of desalination processes are analyzed. In particular, entropy generated in the run down to equilibrium of discharge streams must be considered. Physical models are applied to estimate the magnitude of entropy generation by component and individual processes. These formulations are applied to calculate the total entropy generation in several desalination systems including multiple effect distillation, multistage flash, membrane distillation, mechanical vapor compression, reverse osmosis, and humidification-dehumidification. Within each technology, the relative importance of each source of entropy generation is discussed in order to determine which should be the target of entropy generation minimization. As given here, the correct application of Second Law efficiency shows which systems operate closest to the reversible limit and helps to indicate which systems have the greatest potential for improvement.
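The Second Law efficiency used in this line of work can be written compactly. In the standard notation (the symbols here are conventional assumptions: least work of separation, environment temperature, and total entropy generation rate):

```latex
\eta_{II} \;=\; \frac{\dot{W}_{\mathrm{least}}}{\dot{W}_{\mathrm{sep}}}
         \;=\; \frac{\dot{W}_{\mathrm{least}}}
                    {\dot{W}_{\mathrm{least}} + T_0\,\dot{S}_{\mathrm{gen}}}
```

so minimizing the total entropy generation rate directly raises the efficiency toward the reversible limit, where the efficiency equals one.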
Directory of Open Access Journals (Sweden)
Enrico Sciubba
2011-06-01
In this paper, the entropy generation minimization (EGM) method is applied to an industrial heat transfer problem: the forced convective cooling of a LED-based spotlight. The design specification calls for eighteen diodes arranged on a circular copper plate of 35 mm diameter. Every diode dissipates 3 W and the maximum allowed temperature of the plate is 80 °C. The cooling relies on the forced convection driven by a jet of air impinging on the plate. An initial complex geometry of plate fins is presented and analyzed with a commercial CFD code that computes the entropy generation rate. A pseudo-optimization process is carried out via a successive series of design modifications based on a careful analysis of the entropy generation maps. One of the advantages of the EGM method is that the rationale behind each step of the design process can be justified on a physical basis. It is found that the best performance is attained when the fins are periodically spaced in the radial direction.
Multivariate Generalized Multiscale Entropy Analysis
Directory of Open Access Journals (Sweden)
Anne Humeau-Heurtier
2016-11-01
Multiscale entropy (MSE) was introduced in the 2000s to quantify systems' complexity. MSE relies on (i) a coarse-graining procedure to derive a set of time series representing the system dynamics on different time scales; (ii) the computation of the sample entropy for each coarse-grained time series. A refined composite MSE (rcMSE), based on the same steps as MSE, also exists. Compared to MSE, rcMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy for short time series. The multivariate versions of MSE (MMSE) and rcMSE (MrcMSE) have also been introduced. In the coarse-graining step used in MSE, rcMSE, MMSE, and MrcMSE, the mean value is used to derive representations of the original data at different resolutions. A generalization of MSE was recently published, using the computation of different moments in the coarse-graining procedure. However, so far, this generalization only exists for univariate signals. We therefore herein propose an extension of this generalized MSE to multivariate data. The multivariate generalized algorithms of MMSE and MrcMSE presented herein (MGMSE and MGrcMSE, respectively) are first analyzed through the processing of synthetic signals. We reveal that MGrcMSE shows better performance than MGMSE for short multivariate data. We then study the performance of MGrcMSE on two sets of short multivariate electroencephalograms (EEG) available in the public domain. We report that MGrcMSE may show better performance than MrcMSE in distinguishing different types of multivariate EEG data. MGrcMSE could therefore supplement MMSE or MrcMSE in the processing of multivariate datasets.
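A univariate sketch of the two shared building blocks, moment-based coarse-graining and sample entropy, may help fix ideas. Function names, parameters, and the test signal are illustrative; the multivariate algorithms in the paper operate on several channels jointly.

```python
import numpy as np

def coarse_grain(x, tau, moment=1):
    """Coarse-grain x at scale tau with the block mean (moment=1) or the
    block variance (moment=2), the latter mimicking 'generalized' MSE."""
    n = len(x) // tau
    blocks = np.asarray(x[: n * tau], dtype=float).reshape(n, tau)
    return blocks.mean(axis=1) if moment == 1 else blocks.var(axis=1)

def sample_entropy(x, m=2, r=None):
    """Plug-in sample entropy -ln(A/B): A and B count template pairs within
    tolerance r at lengths m+1 and m (Chebyshev distance, no self-matches)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    n_t = len(x) - m                       # same template count for m, m+1
    def pairs(mm):
        t = np.lib.stride_tricks.sliding_window_view(x, mm)[:n_t]
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)
        return ((d <= r).sum() - n_t) / 2
    B, A = pairs(m), pairs(m + 1)
    return np.inf if A == 0 or B == 0 else -np.log(A / B)

rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
# Entropy profile over scales with variance-based (moment=2) coarse-graining:
profile = [sample_entropy(coarse_grain(x, tau, moment=2)) for tau in (2, 4)]
```

The refined composite variants average match counts over all tau shifted coarse-grainings before taking the logarithm, which is what reduces undefined entropies on short records.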
Maximum entropy method in momentum density reconstruction
International Nuclear Information System (INIS)
Dobrzynski, L.; Holas, A.
1997-01-01
The Maximum Entropy Method (MEM) is applied to the reconstruction of 3-dimensional electron momentum density distributions observed through the set of Compton profiles measured along various crystallographic directions. It is shown that the reconstruction of electron momentum density may be reliably carried out with the aid of the simple iterative algorithm suggested originally by Collins. A number of distributions have been simulated in order to check the performance of MEM. It is shown that MEM can be recommended as a model-free approach. (author). 13 refs, 1 fig
Precise charge density studies by maximum entropy method
Takata, M
2003-01-01
For production research and development of nanomaterials, structural information is indispensable. Recently, a sophisticated analytical method based on information theory, the Maximum Entropy Method (MEM), has been successfully applied, using synchrotron radiation powder data, to determine precise charge densities of metallofullerenes and nanochannel microporous compounds. The results revealed various endohedral natures of metallofullerenes and the one-dimensional array formation of adsorbed gas molecules in nanochannel microporous compounds. The concept of MEM analysis is also described briefly. (author)
An Entropy-Based Network Anomaly Detection Method
Directory of Open Access Journals (Sweden)
Przemysław Bereziński
2015-04-01
Data mining is an interdisciplinary subfield of computer science involving methods at the intersection of artificial intelligence, machine learning and statistics. One of the data mining tasks is anomaly detection, which is the analysis of large quantities of data to identify items, events or observations which do not conform to an expected pattern. Anomaly detection is applicable in a variety of domains, e.g., fraud detection, fault detection, system health monitoring, but this article focuses on the application of anomaly detection in the field of network intrusion detection. The main goal of the article is to prove that an entropy-based approach is suitable to detect modern botnet-like malware based on anomalous patterns in network traffic. This aim is achieved by realization of the following points: (i) preparation of a concept of an original entropy-based network anomaly detection method, (ii) implementation of the method, (iii) preparation of an original dataset, (iv) evaluation of the method.
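The entropy-based idea can be illustrated in a few lines: compute the Shannon entropy of a traffic feature's distribution per time window and flag windows whose entropy deviates sharply. The port values, window layout, and median-absolute-deviation threshold below are illustrative assumptions, not the article's method or dataset.

```python
import math
from collections import Counter

def shannon_entropy(items):
    """Shannon entropy (bits) of the empirical distribution of items."""
    counts = Counter(items)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def flag_anomalies(windows, k=2.5):
    """Flag windows whose feature entropy deviates from the median entropy
    by more than k median absolute deviations (an assumed threshold rule)."""
    ent = [shannon_entropy(w) for w in windows]
    med = sorted(ent)[len(ent) // 2]
    mad = sorted(abs(e - med) for e in ent)[len(ent) // 2] or 1e-9
    return [i for i, e in enumerate(ent) if abs(e - med) > k * mad]

# Hypothetical per-window destination ports; the last window is dominated
# by a single port, a scan/botnet-like concentration with low entropy.
normal = [[80, 443, 53, 22, 80, 443, 8080, 53],
          [443, 80, 53, 25, 110, 80, 443, 22],
          [80, 53, 443, 22, 8080, 25, 80, 443]]
attack = [[6667] * 7 + [80]]
windows = normal + attack
```

Both concentration (scans, DDoS) and sudden dispersion (spoofed sources) show up as entropy shifts, which is why such features suit botnet-like traffic.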
Texture Analysis Using Rényi’s Generalized Entropies
Grigorescu, S.E.; Petkov, N.
2003-01-01
We propose a texture analysis method based on Rényi’s generalized entropies. The method aims at identifying texels in regular textures by searching for the smallest window through which the minimum number of different visual patterns is observed when moving the window over a given texture. The
Power spectrum of the geomagnetic field by the maximum entropy method
International Nuclear Information System (INIS)
Kantor, I.J.; Trivedi, N.B.
1980-01-01
Monthly mean values of the Vassouras (state of Rio de Janeiro) geomagnetic field are analyzed using the maximum entropy method. The method is described and compared with other methods of spectral analysis, and its advantages and disadvantages are presented. (Author) [pt
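Burg's algorithm is the classical way to compute a maximum entropy power spectrum from a time series such as monthly geomagnetic means. The sketch below uses a synthetic signal and an assumed model order to illustrate the idea, not the paper's data.

```python
import numpy as np

def burg(x, order):
    """Burg's method: fit AR coefficients by minimizing forward + backward
    prediction error; the AR spectrum is the maximum entropy spectrum
    consistent with the estimated autocorrelations."""
    x = np.asarray(x, dtype=float)
    f, b = x.copy(), x.copy()
    a = np.array([1.0])
    E = np.mean(x**2)
    for _ in range(order):
        fp, bp = f[1:], b[:-1]
        k = -2.0 * np.dot(fp, bp) / (np.dot(fp, fp) + np.dot(bp, bp))
        f, b = fp + k * bp, bp + k * fp
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        E *= 1.0 - k**2                    # residual prediction error power
    return a, E

def me_spectrum(a, E, freqs):
    """Maximum entropy PSD = E / |A(e^{j 2 pi f})|^2 on normalized freqs."""
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(len(a))))
    return E / np.abs(z @ a)**2

rng = np.random.default_rng(2)
n = np.arange(256)
x = np.sin(2 * np.pi * 0.125 * n) + 0.01 * rng.standard_normal(256)
a, E = burg(x, order=4)
freqs = np.linspace(0.0, 0.5, 501)
psd = me_spectrum(a, E, freqs)             # sharp peak near f = 0.125
```

The well-known advantage over the periodogram is sharp resolution from short records; the disadvantage is sensitivity to the chosen model order.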
Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory
Rahimi, A.; Zhang, L.
2012-12-01
Rainfall-runoff analysis is the key component for many hydrological and hydraulic designs in which the dependence of rainfall and runoff needs to be studied. It is known that convenient bivariate distributions are often unable to model the rainfall-runoff variables because they either have constraints on the range of the dependence or a fixed form for the marginal distributions. Thus, this paper presents an approach to derive the entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The distribution derived can model the full range of dependence and allow different specified marginals. The modeling and estimation proceed as follows: (i) univariate analysis of marginal distributions, which includes two steps, (a) using the nonparametric statistics approach to detect modes and the underlying probability density, and (b) fitting the appropriate parametric probability density functions; (ii) definition of the constraints based on the univariate analysis and the dependence structure; (iii) derivation and validation of the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in the semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The results of univariate analysis show that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of logarithm-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results of the entropy-based joint distribution indicate: (1) the joint distribution derived successfully preserves the dependence between rainfall and runoff, and (2) the K-S goodness-of-fit statistical tests confirm that the marginal distributions re-derived reveal the underlying univariate probability densities which further
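Given the constraints listed in the abstract (first moments, first moments of the logarithms, and the rainfall-runoff covariance), the maximum entropy principle yields a joint density of exponential family form. In generic notation, with Lagrange multipliers fixed by the constraints (this is the standard form implied by that constraint set, not necessarily the paper's exact parameterization):

```latex
f(x, y) \;=\; \exp\!\big( -\lambda_0 - \lambda_1 x - \lambda_2 y
        - \lambda_3 \ln x - \lambda_4 \ln y - \lambda_5\, x y \big),
\qquad x, y > 0
```

When the coupling multiplier on the product term vanishes, the density factors into two gamma-type marginals, consistent with the gamma and mixed-gamma marginals found in the univariate analysis.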
Maximum entropy analysis of liquid diffraction data
International Nuclear Information System (INIS)
Root, J.H.; Egelstaff, P.A.; Nickel, B.G.
1986-01-01
A maximum entropy method for reducing truncation effects in the inverse Fourier transform of the structure factor, S(q), to the pair correlation function, g(r), is described. The advantages and limitations of the method are explored with the PY hard sphere structure factor as model input data. An example using real data on liquid chlorine is then presented. It is seen that spurious structure is greatly reduced in comparison to traditional Fourier transform methods. (author)
Entropy analysis of floating car data systems
Directory of Open Access Journals (Sweden)
F. Gössel
2004-01-01
The knowledge of the actual traffic state is a basic prerequisite of modern traffic telematic systems. Floating Car Data (FCD) systems are becoming more and more important for the provision of actual and reliable traffic data. In these systems the vehicle velocity is the original variable for the evaluation of the current traffic condition. As real FCD systems operate under conditions of limited transmission and processing capacity, the analysis of the original variable vehicle speed is of special interest. Entropy considerations are especially useful for the deduction of fundamental restrictions and limitations. The paper analyses velocity-time profiles by means of information entropy. It emphasises quantification of the information content of velocity-time profiles and discussion of the entropy dynamics in velocity-time profiles. Investigations are based on empirical data derived during field trials. The analysis of entropy dynamics is carried out in two different ways. On the one hand, velocity differences within a certain interval of time are used; on the other hand, the transinformation between velocities at certain time distances is evaluated. One important result is an optimal sampling rate for the detection of velocity data in FCD systems. The influence of spatial segmentation and of different states of traffic is discussed.
The improvement of Clausius entropy and its application in entropy analysis
Institute of Scientific and Technical Information of China (English)
WU Jing; GUO ZengYuan
2008-01-01
The defects of Clausius entropy which include a premise of reversible process and a process quantity of heat in its definition are discussed in this paper. Moreover, the heat temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity although it is numerically equal to the entropy change. The sum of internal energy temperature quotient and work temperature quotient is defined as the improved form of Clausius entropy and it can be further proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without premise just like other state functions, for example, pressure p and enthalpy h, etc. It is unnecessary to invent reversible paths when calculating entropy change for irreversible processes based on the improved form of entropy since it is independent of process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced respectively based on the concepts of thermal reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for entropy change calculation.
Multi-Level Wavelet Shannon Entropy-Based Method for Single-Sensor Fault Location
Directory of Open Access Journals (Sweden)
Qiaoning Yang
2015-10-01
In actual applications, sensors are prone to failure because of harsh environments, battery drain, and sensor aging. Sensor fault location is an important step for follow-up sensor fault detection. In this paper, two new multi-level wavelet Shannon entropies (multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy) are defined. They take full advantage of the sensor fault frequency distribution and energy distribution across multiple subbands in the wavelet domain. Based on the multi-level wavelet Shannon entropy, a method is proposed for single-sensor fault location. The method first uses a criterion of maximum energy-to-Shannon-entropy ratio to select the appropriate wavelet base for signal analysis. Then multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy are used to locate the fault. The method is validated using practical chemical gas concentration data from a gas sensor array. Compared with wavelet time Shannon entropy and wavelet energy Shannon entropy, the experimental results demonstrate that the proposed method can achieve accurate location of a single sensor fault and has good anti-noise ability. The proposed method is feasible and effective for single-sensor fault location.
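The subband-energy flavor of wavelet Shannon entropy can be sketched with a hand-rolled Haar transform. The signal, level count, and Haar choice are illustrative; the paper selects the wavelet base by a maximum energy-to-Shannon-entropy ratio criterion.

```python
import numpy as np

def haar_dwt_levels(x, levels):
    """Multi-level Haar DWT: returns detail subbands d1..dL plus the final
    approximation (input length must be divisible by 2**levels)."""
    a = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        a_even, a_odd = a[0::2], a[1::2]
        details.append((a_even - a_odd) / np.sqrt(2.0))
        a = (a_even + a_odd) / np.sqrt(2.0)
    return details, a

def wavelet_energy_entropy(x, levels=4):
    """Shannon entropy of the relative energies across wavelet subbands."""
    details, approx = haar_dwt_levels(x, levels)
    e = np.array([np.sum(d**2) for d in details] + [np.sum(approx**2)])
    p = e / e.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(3)
tone = np.sin(2 * np.pi * np.arange(1024) / 4)   # energy in two subbands
noise = rng.standard_normal(1024)                # energy spread broadly
```

A faulty sensor channel whose subband energy distribution departs from its neighbors would stand out in exactly this entropy value, which is the intuition behind the fault-location step.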
Entropy method of measuring and evaluating periodicity of quasi-periodic trajectories
Ni, Yanshuo; Turitsyn, Konstantin; Baoyin, Hexi; Junfeng, Li
2018-06-01
This paper presents a method for measuring the periodicity of quasi-periodic trajectories by applying the discrete Fourier transform (DFT) to the trajectories and analyzing the frequency domain within the concept of entropy. Having introduced the concept of entropy, analytical derivation and numerical results indicate that entropies increase as a logarithmic function of time. Periodic trajectories typically have higher entropies, and trajectories with higher entropies mean that the periodicities of the motions are stronger. Theoretical differences between two trajectories expressed as summations of trigonometric functions are also derived analytically. Trajectories in the Hénon-Heiles system and the circular restricted three-body problem (CRTBP) are analyzed with the entropy indicator and compared with the orthogonal fast Lyapunov indicator (OFLI). The results show that entropy is a better tool for discriminating periodicity in quasi-periodic trajectories than OFLI and can detect periodicity while excluding the spirals that are judged as periodic cases by OFLI. Finally, trajectories in the vicinity of 243 Ida and 6489 Golevka are considered as examples, and the numerical results verify these conclusions. Some trajectories near asteroids look irregular, but their higher entropy values as analyzed by this method serve as evidence of frequency regularity in three directions. Moreover, these results indicate that applying the DFT to trajectories in the vicinity of irregular small bodies and calculating their entropy in the frequency domain provides a useful quantitative analysis method for evaluating orderliness in the periodicity of quasi-periodic trajectories within a given time interval.
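The discriminating idea, entropy computed on the frequency domain of a trajectory component, can be sketched with a common spectral-entropy variant. Note the sign convention differs from the paper's indicator: this variant assigns low entropy to spectra concentrated in a few lines, whereas the authors construct their entropy so that it grows logarithmically in time and is higher for periodic trajectories. The test signals are illustrative, not CRTBP orbits.

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy of the normalized DFT power spectrum of x (nats).

    A quasi-periodic signal concentrates power in a few frequency lines
    (low entropy here); a broadband irregular signal spreads power across
    the whole spectrum (high entropy here).
    """
    X = np.fft.rfft(x - np.mean(x))
    p = np.abs(X)**2
    p = p / p.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

t = np.linspace(0.0, 100.0, 4096)
quasi_periodic = np.cos(t) + 0.5 * np.cos(np.sqrt(2.0) * t)  # two tones
rng = np.random.default_rng(4)
irregular = rng.standard_normal(4096)                        # broadband
```

Either convention separates frequency-regular from irregular motion; only the direction of the comparison changes.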
The improvement of Clausius entropy and its application in entropy analysis
Institute of Scientific and Technical Information of China (English)
2008-01-01
The defects of Clausius entropy which include a premise of reversible process and a process quantity of heat in its definition are discussed in this paper. Moreover, the heat temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity although it is numerically equal to the entropy change. The sum of internal energy temperature quotient and work temperature quotient is defined as the improved form of Clausius entropy and it can be further proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without premise just like other state functions, for example, pressure p and enthalpy h, etc. It is unnecessary to invent reversible paths when calculating entropy change for irreversible processes based on the improved form of entropy since it is independent of process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced respectively based on the concepts of thermal reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for en- tropy change calculation.
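In standard notation, the improved entropy described here is the sum of the internal-energy and work temperature quotients, which for a simple compressible system is the familiar Gibbs relation and hence manifestly a state function (symbols are the usual U, p, V, T; this is a paraphrase, not the authors' exact formulation):

```latex
\mathrm{d}S \;=\; \frac{\mathrm{d}U}{T} \;+\; \frac{p\,\mathrm{d}V}{T}
```

Because every term on the right is a system property, the entropy change of an irreversible process can be evaluated along any path in state space, with no need to construct an auxiliary reversible process.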
Multiscale permutation entropy analysis of electrocardiogram
Liu, Tiebing; Yao, Wenpo; Wu, Min; Shi, Zhaorong; Wang, Jun; Ning, Xinbao
2017-04-01
To enable a comprehensive nonlinear analysis of ECG, multiscale permutation entropy (MPE) was applied to ECG characteristics extraction. Three kinds of ECG from the PhysioNet database, congestive heart failure (CHF) patients, healthy young and elderly subjects, are used in this paper. We set the embedding dimension to 4, adjust the scale factor from 2 to 100 with a step size of 2, and compare MPE with multiscale entropy (MSE). As the scale factor increases, the MPE complexity of the three ECG signals shows a first-decreasing and then-increasing trend. When the scale factor is between 10 and 32, the complexities of the three ECG types show the biggest differences: the entropy of the elderly is on average 0.146 less than that of the CHF patients and 0.025 larger than that of the healthy young, in line with normal physiological characteristics. Test results showed that MPE can be effectively applied in ECG nonlinear analysis and can effectively distinguish different ECG signals.
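Permutation entropy and its multiscale variant are compact to state in code. The sketch below (embedding dimension, scales, and test signals are illustrative choices) shows the two steps: coarse-graining followed by ordinal-pattern entropy.

```python
import numpy as np
from collections import Counter
from math import factorial, log

def permutation_entropy(x, m=4, delay=1):
    """Normalized permutation entropy: Shannon entropy of the ordinal
    patterns of length m, divided by log(m!) so the result is in [0, 1]."""
    x = np.asarray(x, dtype=float)
    patterns = Counter(
        tuple(np.argsort(x[i : i + m * delay : delay]))
        for i in range(len(x) - (m - 1) * delay)
    )
    total = sum(patterns.values())
    h = -sum(c / total * log(c / total) for c in patterns.values())
    return h / log(factorial(m))

def multiscale_pe(x, m=4, scales=(1, 2, 4)):
    """MPE: coarse-grain by averaging tau-blocks, then permutation entropy."""
    out = []
    for tau in scales:
        n = len(x) // tau
        y = np.asarray(x[: n * tau]).reshape(n, tau).mean(axis=1)
        out.append(permutation_entropy(y, m))
    return out

rng = np.random.default_rng(5)
noise = rng.standard_normal(3000)         # near-maximal ordinal disorder
ramp = np.arange(3000, dtype=float)       # monotonic: one ordinal pattern
```

Because only the ordering of samples matters, permutation entropy is robust to the amplitude drift and baseline wander typical of ECG records.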
Comparison of transfer entropy methods for financial time series
He, Jiayi; Shang, Pengjian
2017-09-01
There is a certain relationship between the global financial markets, which creates an interactive network of global finance. Transfer entropy, a measure of information transfer, offers a good way to analyse this relationship. In this paper, we analysed the relationships between 9 stock indices from the U.S., Europe and China (from 1995 to 2015) using transfer entropy (TE), effective transfer entropy (ETE), Rényi transfer entropy (RTE) and effective Rényi transfer entropy (ERTE). We compared the four methods with respect to their effectiveness in identifying relationships between stock markets. Two kinds of information flow are considered: in lagged-current cases the U.S. takes the leading position, but for same-date flows China is the most influential, and ERTE provides the superior results.
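A minimal plugin estimate of transfer entropy with one-step histories, together with the shuffle-based "effective" correction that the paper compares, might look as follows. The histogram discretization, bin count, and shuffle count here are illustrative assumptions, not the authors' exact estimator:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=8):
    """Plugin transfer entropy T_{Y->X} with one-step histories, in nats."""
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    n = len(xd) - 1
    triples = Counter(zip(xd[1:], xd[:-1], yd[:-1]))  # (x_{t+1}, x_t, y_t)
    pairs_s = Counter(zip(xd[:-1], yd[:-1]))          # (x_t, y_t)
    pairs_x = Counter(zip(xd[1:], xd[:-1]))           # (x_{t+1}, x_t)
    singles = Counter(xd[:-1])                        # x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_cond_xy = c / pairs_s[(x0, y0)]             # p(x_{t+1} | x_t, y_t)
        p_cond_x = pairs_x[(x1, x0)] / singles[x0]    # p(x_{t+1} | x_t)
        te += (c / n) * np.log(p_cond_xy / p_cond_x)
    return te

def effective_transfer_entropy(x, y, bins=8, n_shuffle=20, seed=0):
    """ETE: subtract the mean TE obtained after shuffling the source series,
    which removes the finite-sample bias of the plugin estimate."""
    rng = np.random.default_rng(seed)
    baseline = np.mean([transfer_entropy(x, rng.permutation(y), bins)
                        for _ in range(n_shuffle)])
    return transfer_entropy(x, y, bins) - baseline
```

On a pair of series where x is driven by lagged y, the effective estimate in the causal direction clearly exceeds the reverse direction, which is the kind of asymmetry the paper uses to rank markets.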
DEFF Research Database (Denmark)
Olsen, Lars Rønn; Zhang, Guang Lan; Keskin, Derin B.
2011-01-01
residues. The block entropy analysis provides broad coverage of variant antigens. We applied the block entropy analysis method to the proteomes of the four serotypes of dengue virus (DENV) and found 1,551 blocks of 9-mer peptides, which cover 99% of available sequences with five or fewer unique peptides...
Global sensitivity analysis for fuzzy inputs based on the decomposition of fuzzy output entropy
Shi, Yan; Lu, Zhenzhou; Zhou, Yicheng
2018-06-01
To analyse the component of fuzzy output entropy, a decomposition method of fuzzy output entropy is first presented. After the decomposition of fuzzy output entropy, the total fuzzy output entropy can be expressed as the sum of the component fuzzy entropy contributed by fuzzy inputs. Based on the decomposition of fuzzy output entropy, a new global sensitivity analysis model is established for measuring the effects of uncertainties of fuzzy inputs on the output. The global sensitivity analysis model can not only tell the importance of fuzzy inputs but also simultaneously reflect the structural composition of the response function to a certain degree. Several examples illustrate the validity of the proposed global sensitivity analysis, which is a significant reference in engineering design and optimization of structural systems.
Directory of Open Access Journals (Sweden)
Hou Hucan
2017-01-01
Inspired by the wide application of the second law of thermodynamics to flow and heat transfer devices, a local entropy production analysis method was introduced into the energy assessment of a centrifugal water pump. Based on a Reynolds stress turbulence model and the energy equation, a steady numerical simulation of the whole flow passage of an IS centrifugal pump was carried out. The local entropy production terms were calculated by user-defined functions, mainly comprising wall entropy production, turbulent entropy production, and viscous entropy production. The numerical results indicated that the irreversible energy loss calculated by the local entropy production method agreed well with that calculated by the traditional method, with some deviations probably caused by the high rotatability and high curvature of the impeller and volute. The wall entropy production and turbulent entropy production accounted for the bulk of the total entropy production, about 48.61% and 47.91% respectively, indicating that wall friction and turbulent fluctuation were the major contributors to irreversible energy loss. Meanwhile, the entropy production rate distribution was discussed and compared with the turbulent kinetic energy dissipation rate distribution: the turbulent entropy production rate increased sharply in the near-wall regions, while both distributions were otherwise fairly uniform. The blade leading edge near the suction side, the trailing edge, and the volute tongue were the main regions generating irreversible exergy loss. This research opens a new perspective for evaluating energy loss and further optimizing pumps by entropy production minimization.
Statistical-mechanical entropy by the thin-layer method
International Nuclear Information System (INIS)
Feng, He; Kim, Sung Won
2003-01-01
G. 't Hooft first studied the statistical-mechanical entropy of a scalar field in a Schwarzschild black hole background by the brick-wall method and hinted that the statistical-mechanical entropy is the statistical origin of the Bekenstein-Hawking entropy of the black hole. However, in our view the statistical-mechanical entropy is only a quantum correction to the Bekenstein-Hawking entropy of the black hole. The brick-wall method, which is based on thermal equilibrium at a large scale, cannot be applied to cases out of equilibrium, such as a nonstationary black hole. The statistical-mechanical entropy of a scalar field in a nonstationary black hole background is calculated here by the thin-layer method. The condition of local equilibrium near the horizon of the black hole is used as a working postulate and is maintained for a black hole which evaporates slowly enough and whose mass is far greater than the Planck mass. The statistical-mechanical entropy is again proportional to the area of the black hole horizon. The difference from the stationary black hole is that the result relies on a time-dependent cutoff.
On cell entropy inequality for discontinuous Galerkin methods
Jiang, Guangshan; Shu, Chi-Wang
1993-01-01
We prove a cell entropy inequality for a class of high order discontinuous Galerkin finite element methods approximating conservation laws, which implies convergence for the one dimensional scalar convex case.
Entropy generation method to quantify thermal comfort
Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.
2001-01-01
This paper presents a thermodynamic approach to assess the quality of human-environment thermal interaction and to quantify thermal comfort. The approach develops an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and entropy generation. As a result of this deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. To verify the entropy-based thermal comfort model, human thermal physiological responses to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite-element-based KSU human thermal computer model is utilized as a "computational environmental chamber" to conduct a series of simulations examining human thermal responses to different environmental conditions. The simulation outputs (human thermal responses) together with the input environmental conditions are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The OTCI values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values, which are generated by inserting the corresponding air temperatures and vapor pressures used in the simulations into the regression equation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study
Packer Detection for Multi-Layer Executables Using Entropy Analysis
Directory of Open Access Journals (Sweden)
Munkhbayar Bat-Erdene
2017-03-01
Packing algorithms are broadly used to evade anti-malware systems, and the proportion of packed malware has been growing rapidly. However, only a few studies have been conducted on detecting the various types of packing algorithms in a systematic way. Following this understanding, we elaborate a method to classify the packing algorithm of a given executable into three categories: single-layer packing, re-packing, or multi-layer packing. We convert the entropy values of the executable file loaded into memory into symbolic representations using SAX (Symbolic Aggregate Approximation). Based on experiments with 2,196 programs and 19 packing algorithms, the precision (97.7%), accuracy (97.5%), and recall (96.8%) of our method are high enough to confirm that entropy analysis is applicable to identifying packing algorithms.
Rahimi, Alireza; Sepehr, Mohammad; Lariche, Milad Janghorban; Mesbah, Mohammad; Kasaeipoor, Abbas; Malekshah, Emad Hasani
2018-03-01
A lattice Boltzmann simulation of natural convection in an H-shaped cavity filled with nanofluid is performed. Entropy generation analysis and heatline visualization are employed to analyze the considered problem comprehensively. The nanofluid is a SiO2-TiO2/Water-EG (60:40) hybrid nanofluid, whose thermal conductivity and dynamic viscosity are measured experimentally. To use the experimental data, two sets of temperature-based correlations are derived for six solid volume fractions: 0.5, 1, 1.5, 2, 2.5 and 3 vol%. The influences of the governing parameters, such as aspect ratio, solid volume fraction of the nanofluid, and Rayleigh number, on the fluid flow, temperature field, average/local Nusselt number, total/local entropy generation, and heatlines are presented.
Energy Technology Data Exchange (ETDEWEB)
Kawaguchi, K; Egashira, Y; Watanabe, G [Mazda Motor Corp., Hiroshima (Japan)
1997-10-01
Vehicle and unit performance varies according to not only external causes, represented by the environment such as temperature or weather, but also internal causes, namely dispersion of component characteristics and manufacturing processes, and ageing deterioration. We developed a design method that estimates such performance distributions with the maximum entropy method and calculates specifications with high performance robustness using fuzzy theory. This paper describes the details of these methods and an example application to a power window system. 3 refs., 7 figs., 4 tabs.
Financial time series analysis based on effective phase transfer entropy
Yang, Pengbo; Shang, Pengjian; Lin, Aijing
2017-02-01
Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.
A Maximum Entropy Method for a Robust Portfolio Problem
Directory of Open Access Journals (Sweden)
Yingying Xu
2014-06-01
We propose a continuous maximum entropy method to investigate the robust optimal portfolio selection problem for a market with transaction costs and dividends. This robust model aims to maximize the worst-case portfolio return in the case that all asset returns lie within some prescribed intervals. A numerical optimal solution to the problem is obtained by using a continuous maximum entropy method. Furthermore, some numerical experiments indicate that the robust model in this paper can result in better portfolio performance than a classical mean-variance model.
A Tutorial on the Cross-Entropy Method
de Boer, Pieter-Tjerk; Kroese, Dirk; Mannor, Shie; Rubinstein, Reuven Y.
The cross-entropy (CE) method is a new generic approach to combinatorial and multi-extremal optimization and rare event simulation. The purpose of this tutorial is to give a gentle introduction to the CE method. We present the CE methodology, the basic algorithm and its modifications.
Current opinion about maximum entropy methods in Moessbauer spectroscopy
International Nuclear Information System (INIS)
Szymanski, K
2009-01-01
Current opinion about maximum entropy methods in Moessbauer spectroscopy is presented. The most important advantage offered by the method is correct data processing under circumstances of incomplete information. Its disadvantages are the sophisticated algorithm and the need to adapt it to each specific problem.
Generalized sample entropy analysis for traffic signals based on similarity measure
Shang, Du; Xu, Mengjia; Shang, Pengjian
2017-05-01
Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy with surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on a similarity distance, matches signal patterns in a different way and reveals distinct complexity behaviors. Simulations are conducted on synthetic data and traffic signals to provide a comparative study demonstrating the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. First, it overcomes the limitation relating the dimension parameter to the length of the series. Second, the modified sample entropy functions can quantitatively distinguish time series from different complex systems by the similarity measure.
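For reference, the standard sample entropy that the paper generalizes can be sketched as below, using the conventional Chebyshev distance and tolerance r = 0.2·std; the authors' modified similarity measure would replace this distance, so treat this as a baseline sketch rather than their method:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -ln(A/B), where B counts pairs of length-m templates
    within Chebyshev distance r (self-matches excluded) and A counts the
    corresponding length-(m+1) matches."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def matches(k):
        t = np.lib.stride_tricks.sliding_window_view(x, k)
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)  # Chebyshev
        return int((d <= r).sum() - len(t))  # subtract self-matches
    B, A = matches(m), matches(m + 1)
    return float(-np.log(A / B))
```

A regular signal (e.g. a sampled sine) keeps matching at the longer template length and scores low, while white noise loses matches and scores high, which is the basic discrimination the modified measure builds on.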
A simple method for estimating the entropy of neural activity
International Nuclear Information System (INIS)
Berry II, Michael J; Tkačik, Gašper; Dubuis, Julien; Marre, Olivier; Da Silveira, Rava Azeredo
2013-01-01
The number of possible activity patterns in a population of neurons grows exponentially with the size of the population. Typical experiments explore only a tiny fraction of the large space of possible activity patterns in the case of populations with more than 10 or 20 neurons. It is thus impossible, in this undersampled regime, to estimate the probabilities with which most of the activity patterns occur. As a result, the corresponding entropy—which is a measure of the computational power of the neural population—cannot be estimated directly. We propose a simple scheme for estimating the entropy in the undersampled regime, which bounds its value from both below and above. The lower bound is the usual ‘naive’ entropy of the experimental frequencies. The upper bound results from a hybrid approximation of the entropy which makes use of the naive estimate, a maximum entropy fit, and a coverage adjustment. We apply our simple scheme to artificial data, in order to check their accuracy; we also compare its performance to those of several previously defined entropy estimators. We then apply it to actual measurements of neural activity in populations with up to 100 cells. Finally, we discuss the similarities and differences between the proposed simple estimation scheme and various earlier methods. (paper)
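The "naive" lower bound mentioned above is just the plugin entropy of the observed pattern frequencies. A minimal sketch (in bits, with hypothetical pattern encodings; the maximum entropy fit and coverage adjustment of the upper bound are not reproduced here):

```python
import numpy as np
from collections import Counter

def naive_entropy(patterns):
    """Plugin ('naive') entropy of the observed pattern frequencies, in bits.
    In the undersampled regime this underestimates the true entropy, which is
    why it serves as the lower bound in the scheme described above."""
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

With N samples the estimate can never exceed log2(N), regardless of the true entropy of the population, which makes the undersampling bias concrete: 30 observed patterns from a 20-neuron population cap the naive estimate at about 4.9 bits even if the true entropy is far larger.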
Multifractal diffusion entropy analysis: Optimal bin width of probability histograms
Jizba, Petr; Korbel, Jan
2014-11-01
In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin-width in histograms generated from underlying probability distributions of interest. The method presented uses techniques of Rényi’s entropy and the mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on a scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. In order to demonstrate a strength of the method proposed we compare the multifractal δ-spectrum for various bin-widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. Connection between the δ-spectrum and Rényi’s q parameter is also discussed and elucidated on a simple example of multiscale time series.
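The Rényi entropy at the heart of the analysis, together with a histogram estimator whose bin width is the free parameter the paper optimizes, can be sketched as follows. The estimator is a simplified illustration of the bin-width dependence, not the authors' multifractal pipeline:

```python
import numpy as np

def renyi_entropy(p, q):
    """Rényi entropy of order q (in nats) of a discrete distribution p;
    q -> 1 recovers the Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-(p * np.log(p)).sum())
    return float(np.log((p ** q).sum()) / (1.0 - q))

def renyi_entropy_from_samples(x, bin_width, q):
    """Histogram estimate of the Rényi entropy; the bin width is the free
    parameter whose optimal choice is the subject of the paper above."""
    x = np.asarray(x, dtype=float)
    edges = np.arange(x.min(), x.max() + bin_width, bin_width)
    counts, _ = np.histogram(x, bins=edges)
    return renyi_entropy(counts / counts.sum(), q)
```

Note that for large q the sum is dominated by the most probable bins, which is why the estimate for heavy-tailed data is so sensitive to the bin width.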
Gamma-ray spectra deconvolution by maximum-entropy methods
International Nuclear Information System (INIS)
Los Arcos, J.M.
1996-01-01
A maximum-entropy method which includes the response of detectors and the statistical fluctuations of spectra is described and applied to the deconvolution of γ-ray spectra. Resolution enhancement of 25% can be reached for experimental peaks and up to 50% for simulated ones, while the intensities are conserved within 1-2%. (orig.)
Applications of the Maximum Entropy Method in superspace
Czech Academy of Sciences Publication Activity Database
van Smaalen, S.; Palatinus, Lukáš
2004-01-01
Roč. 305, - (2004), s. 57-62 ISSN 0015-0193 Grant - others:DFG and FCI(DE) XX Institutional research plan: CEZ:AV0Z1010914 Keywords : Maximum Entropy Method * modulated structures * charge density Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 0.517, year: 2004
Refined generalized multiscale entropy analysis for physiological signals
Liu, Yunxiao; Lin, Youfang; Wang, Jing; Shang, Pengjian
2018-01-01
Multiscale entropy analysis has become a prevalent complexity measure and has been successfully applied in various fields. However, it only takes into account the mean values (first moment) in the coarse-graining procedure. Generalized multiscale entropy (MSEn), which considers higher moments for coarse-graining a time series, was therefore proposed, and MSEσ2 has been implemented. However, MSEσ2 may sometimes yield an imprecise or undefined entropy estimate, and its statistical reliability decreases as the scale factor increases. For this purpose, we developed a refined model, RMSEσ2, to improve MSEσ2. Simulations on both white noise and 1/f noise show that RMSEσ2 provides higher entropy reliability and reduces the occurrence of undefined entropy, making it especially suitable for short time series. We also discuss how outliers, data loss, and other signal-processing issues affect RMSEσ2 analysis. We apply the proposed model to evaluate the complexity of heartbeat interval time series derived from healthy young and elderly subjects, patients with congestive heart failure, and patients with atrial fibrillation, and compare it to several popular complexity metrics. The results demonstrate that RMSEσ2-measured complexity (a) decreases with aging and disease, and (b) gives significant discrimination between different physiological/pathological states, which may facilitate clinical application.
Entropy analysis on non-equilibrium two-phase flow models
International Nuclear Information System (INIS)
Karwat, H.; Ruan, Y.Q.
1995-01-01
A method of entropy analysis according to the second law of thermodynamics is proposed for the assessment of a class of practical non-equilibrium two-phase flow models. Entropy conditions are derived directly from a local instantaneous formulation for an arbitrary control volume of a structural two-phase fluid, which are finally expressed in terms of the averaged thermodynamic independent variables and their time derivatives as well as the boundary conditions for the volume. On the basis of a widely used thermal-hydraulic system code it is demonstrated with practical examples that entropy production rates in control volumes can be numerically quantified by using the data from the output data files. Entropy analysis using the proposed method is useful in identifying some potential problems in two-phase flow models and predictions as well as in studying the effects of some free parameters in closure relationships.
Entropy analysis on non-equilibrium two-phase flow models
Energy Technology Data Exchange (ETDEWEB)
Karwat, H.; Ruan, Y.Q. [Technische Universitaet Muenchen, Garching (Germany)
1995-09-01
A method of entropy analysis according to the second law of thermodynamics is proposed for the assessment of a class of practical non-equilibrium two-phase flow models. Entropy conditions are derived directly from a local instantaneous formulation for an arbitrary control volume of a structural two-phase fluid, which are finally expressed in terms of the averaged thermodynamic independent variables and their time derivatives as well as the boundary conditions for the volume. On the basis of a widely used thermal-hydraulic system code it is demonstrated with practical examples that entropy production rates in control volumes can be numerically quantified by using the data from the output data files. Entropy analysis using the proposed method is useful in identifying some potential problems in two-phase flow models and predictions as well as in studying the effects of some free parameters in closure relationships.
International Nuclear Information System (INIS)
Zhu, Qingjun; Song, Fengquan; Ren, Jie; Chen, Xueyong; Zhou, Bin
2014-01-01
To further expand the application of artificial neural networks in the field of neutron spectrometry, criteria for choosing between an artificial neural network and the maximum entropy method for unfolding neutron spectra were presented. The counts of the Bonner spheres for IAEA neutron spectra were used as a database, and the artificial neural network and the maximum entropy method were used to unfold the neutron spectra; the mean squares of the spectra were defined as the differences between the desired and unfolded spectra. After the information entropy of each spectrum was calculated using information entropy theory, the relationship between the mean squares of the spectra and the information entropy was acquired. Useful information from the information entropy guided the selection of unfolding methods. Because of the importance of the information entropy, a method for predicting the information entropy from the Bonner spheres' counts was established. The criteria based on information entropy theory can be used to choose between the artificial neural network and the maximum entropy unfolding methods, expanding the application of artificial neural networks to unfolding neutron spectra. - Highlights: • Two neutron spectrum unfolding methods, ANN and MEM, were compared. • The spectrum's entropy offers useful information for selecting unfolding methods. • For spectra with low entropy, the ANN was generally better than MEM. • The spectrum's entropy was predicted based on the Bonner spheres' counts
Friedrich, Lucas; Winters, Andrew R.; Fernández, David C. Del Rey; Gassner, Gregor J.; Parsani, Matteo; Carpenter, Mark H.
2017-01-01
analysis are discretely mimicked. Special attention is given to the coupling between nonconforming elements as we demonstrate that the standard mortar approach for DG methods does not guarantee entropy stability for non-linear problems, which can lead
A Method of Rotating Machinery Fault Diagnosis Based on the Close Degree of Information Entropy
Institute of Scientific and Technical Information of China (English)
GENG Jun-bao; HUANG Shu-hong; JIN Jia-shan; CHEN Fei; LIU Wei
2006-01-01
This paper presents a method of rotating machinery fault diagnosis based on the close degree of information entropy. From the viewpoint of information entropy, we introduce four information entropy features of rotating machinery which describe its vibration condition: singular spectrum entropy, power spectrum entropy, wavelet space state feature entropy, and wavelet power spectrum entropy. The value ranges of the four features under some typical fault conditions are obtained by experiment and can act as standard features for fault diagnosis. According to the principle that more similar modes lie at a shorter distance, a decision-making method based on the close degree of information entropy is put forward for the recognition of fault patterns. We demonstrate the effectiveness of this approach in an instance involving fault pattern recognition of rotating machinery.
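Two of the four features, power spectrum entropy and singular spectrum entropy, can be sketched as normalized Shannon entropies of an energy distribution. The normalization, window length, and preprocessing below are our assumptions rather than the authors' exact definitions:

```python
import numpy as np

def _normalized_entropy(energy):
    """Shannon entropy of an energy distribution, normalized to [0, 1]."""
    p = energy[energy > 0]
    p = p / p.sum()
    if len(p) < 2:
        return 0.0
    return float(-(p * np.log(p)).sum() / np.log(len(p)))

def power_spectrum_entropy(x):
    """Entropy of the power spectral density: low for narrowband vibration
    (e.g. a clean rotor tone), higher for broadband faulty vibration."""
    psd = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
    return _normalized_entropy(psd)

def singular_spectrum_entropy(x, window=20):
    """Entropy of the squared singular values of the trajectory matrix,
    reflecting how many independent components the vibration contains."""
    traj = np.lib.stride_tricks.sliding_window_view(x, window)
    s = np.linalg.svd(traj, compute_uv=False)
    return _normalized_entropy(s ** 2)
```

Both features rise as the vibration energy spreads over more frequencies or more signal components, which is what lets a fixed value range per fault condition serve as a reference feature.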
International Nuclear Information System (INIS)
He, Z J; Zhang, X L; Chen, X F
2012-01-01
Aiming at the reliability evaluation of condition identification of mechanical equipment, it is necessary to analyze condition monitoring information. A new reliability evaluation method based on wavelet information entropy extracted from vibration signals of mechanical equipment is proposed. The method is quite different from traditional reliability evaluation models, which depend on probability statistics over large numbers of samples. The vibration signals of the mechanical equipment were analyzed by means of the second generation wavelet package (SGWP). We take the relative energy in each frequency band of the decomposed signal, i.e. its percentage of the whole signal energy, as a probability, and obtain a normalized information entropy (IE) based on the relative energy to describe the uncertainty of the system instead of a probability model. The reliability degree is then obtained from the normalized wavelet information entropy. The method has been successfully applied to evaluate the assembled quality reliability of a dismountable disk-drum aero-engine; the reliability degree indicates the assembled quality satisfactorily.
The maximum entropy method of moments and Bayesian probability theory
Bretthorst, G. Larry
2013-08-01
The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1 weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments is reviewed, together with some of its problems and the conditions under which it fails. In later sections, the functional form of the maximum entropy method of moments probability distribution is incorporated into Bayesian probability theory. It is shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments: one obtains posterior probabilities for the Lagrange multipliers and, finally, one can put error bars on the resulting estimated density function.
Symbolic phase transfer entropy method and its application
Zhang, Ningning; Lin, Aijing; Shang, Pengjian
2017-10-01
In this paper, we introduce symbolic phase transfer entropy (SPTE) to infer the direction and strength of information flow among systems. The advantages of the proposed method are investigated by simulations on synthetic signals and real-world data. We demonstrate that symbolic phase transfer entropy is a robust and efficient tool for inferring the information flow between complex systems. Based on the study of the synthetic data, we find that a significant advantage of SPTE is its reduced sensitivity to noise. In addition, SPTE requires less data than symbolic transfer entropy (STE). We analyze the direction and strength of information flow between six stock markets during the period from 2006 to 2016. The results indicate that the information flow among stocks varies over different periods, and that the interaction network among the stocks undergoes hierarchical reorganization in the transition from one period to another. The clusters are classified mainly by period and then by region; stocks from the same time period fall into the same cluster.
Path length entropy analysis of diastolic heart sounds.
Griffel, Benjamin; Zia, Mohammad K; Fridman, Vladamir; Saponieri, Cesare; Semmlow, John L
2013-09-01
Early detection of coronary artery disease (CAD) using the acoustic approach, a noninvasive and cost-effective method, would greatly improve the outcome of CAD patients. To detect CAD, we analyze diastolic sounds for possible CAD murmurs. We observed diastolic sounds to exhibit 1/f structure and developed a new method, path length entropy (PLE) and a scaled version (SPLE), to characterize this structure to improve CAD detection. We compare SPLE results to Hurst exponent, Sample entropy and Multiscale entropy for distinguishing between normal and CAD patients. SPLE achieved a sensitivity-specificity of 80%-81%, the best of the tested methods. However, PLE and SPLE are not sufficient to prove nonlinearity, and evaluation using surrogate data suggests that our cardiovascular sound recordings do not contain significant nonlinear properties. Copyright © 2013 Elsevier Ltd. All rights reserved.
Implementation of the entropy viscosity method with the discontinuous Galerkin method
Zingan, Valentin
2013-01-01
The notion of entropy viscosity method introduced in Guermond and Pasquetti [21] is extended to the discontinuous Galerkin framework for scalar conservation laws and the compressible Euler equations. © 2012 Elsevier B.V.
International Nuclear Information System (INIS)
Li Min; Lai, Alvin C.K.
2013-01-01
Highlights: ► A second-law-based analysis is performed for single U-tube ground heat exchangers. ► Two expressions for the optimal length and flow velocity are developed for GHEs. ► Empirical velocities of GHEs are large compared to thermodynamic optimum values. - Abstract: This paper investigates the thermodynamic performance of borehole ground heat exchangers with a single U-tube by the entropy generation minimization method, which requires heat transfer and fluid mechanics information in addition to thermodynamic analysis. The study first derives an expression for the dimensionless entropy generation number as a function of five dimensionless variables: the Reynolds number, the dimensionless borehole length, a scale factor of pressures, and two duty parameters of the ground heat exchanger. The derivation combines a heat transfer model and a hydraulics model for borehole ground heat exchangers with the first and second laws of thermodynamics. Next, the entropy generation number is minimized to produce two analytical expressions for the optimal length and the optimal flow velocity of ground heat exchangers. The paper then discusses and analyzes the implications and applications of these optimization formulas with two case studies. An important finding from the case studies is that the widely used empirical velocities of the circulating fluid are too large for thermodynamically optimal operation of ground-coupled heat pump systems. This paper demonstrates that thermodynamically optimal parameters of ground heat exchangers can probably be determined by using the entropy generation minimization method.
Use of the maximum entropy method in X-ray astronomy
International Nuclear Information System (INIS)
Willingale, R.
1981-01-01
An algorithm used to apply the maximum entropy method in X-ray astronomy is described. It is easy to programme on a digital computer and fast enough to allow processing of two-dimensional images. The method gives good noise suppression without loss of instrumental resolution and has been successfully applied to several data analysis problems in X-ray astronomy. The restoration of a high-resolution image from the Einstein Observatory demonstrates the use of the algorithm. (author)
Multi-scale symbolic transfer entropy analysis of EEG
Yao, Wenpo; Wang, Jun
2017-10-01
From both global and local perspectives, we symbolize two kinds of EEG signals and analyze their dynamic and asymmetrical information using multi-scale transfer entropy. A multi-scale process with scale factors from 1 to 199 in steps of 2 is applied to the EEG of healthy people and of epileptic patients, and the sequences are then symbolized by permutation with embedding dimension 3 and by a global approach. The forward and reverse symbol sequences are taken as the inputs of transfer entropy. The scale-factor intervals over which the two kinds of EEG are satisfactorily distinguished are (37, 57) for permutation and (65, 85) for the global approach. The difference between the healthy and epileptic subjects is largest at scale factor 67 for permutation (transfer entropies of 0.1137 and 0.1028) and at scale factor 165 for the global symbolization (0.0641 and 0.0601). The results show that permutation, which incorporates the contribution of local information, discriminates better and is more effectively applied in our multi-scale transfer entropy analysis of EEG.
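The pipeline described above (coarse-graining, ordinal-pattern symbolization with embedding dimension 3, transfer entropy between symbol sequences) can be sketched as follows. The helper names are our own and the estimator is the plain plug-in version, not necessarily the authors' exact implementation:

```python
import math
from collections import Counter
from itertools import permutations

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def permutation_symbols(x, m=3):
    """Map each length-m window to the index of its ordinal pattern."""
    patterns = {p: i for i, p in enumerate(permutations(range(m)))}
    return [patterns[tuple(sorted(range(m), key=lambda k: x[i + k]))]
            for i in range(len(x) - m + 1)]

def transfer_entropy(src, dst):
    """Plug-in TE(src -> dst) in bits from two aligned symbol sequences."""
    n = len(dst) - 1
    joint3 = Counter((dst[t + 1], dst[t], src[t]) for t in range(n))
    joint2 = Counter((dst[t], src[t]) for t in range(n))
    dpair = Counter((dst[t + 1], dst[t]) for t in range(n))
    dsingle = Counter(dst[t] for t in range(n))
    te = 0.0
    for (d1, d0, s0), c in joint3.items():
        p = c / n
        te += p * math.log2((c / joint2[(d0, s0)]) /
                            (dpair[(d1, d0)] / dsingle[d0]))
    return te
```

For a one-step copy (dst lags src by one sample), TE(src → dst) approaches the source entropy, while the reverse direction stays near zero; plug-in estimates carry a small positive bias on finite data.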
Entropy resistance minimization: An alternative method for heat exchanger analyses
International Nuclear Information System (INIS)
Cheng, XueTao
2013-01-01
In this paper, the concept of entropy resistance is proposed based on the entropy generation analyses of heat transfer processes. It is shown that smaller entropy resistance leads to larger heat transfer rate with fixed thermodynamic force difference and smaller thermodynamic force difference with fixed heat transfer rate, respectively. For the discussed two-stream heat exchangers in which the heat transfer rates are not given and the three-stream heat exchanger with prescribed heat capacity flow rates and inlet temperatures of the streams, smaller entropy resistance leads to larger heat transfer rate. For the two-stream heat exchangers with fixed heat transfer rate, smaller entropy resistance leads to larger effectiveness. Furthermore, it is shown that smaller values of the concepts of entropy generation numbers and modified entropy generation number do not always correspond to better performance of the discussed heat exchangers. - Highlights: • The concept of entropy resistance is defined for heat exchangers. • The concepts based on entropy generation are used to analyze heat exchangers. • Smaller entropy resistance leads to better performance of heat exchangers. • The applicability of entropy generation minimization is conditional
International Nuclear Information System (INIS)
Feng Guangwen; Hu Youhua; Liu Qian
2010-01-01
In this paper, the principle of the TOPSIS method is introduced and applied to rank the given indexes of glazed brick and granite from different areas' decorative building materials, in order to select the optimal low-radioactivity decorative building materials. First, the entropy weight TOPSIS method was used to process the data on sample numbers and radionuclide content, and different weights were then assigned to the different indexes. Finally, using SAS software for data analysis and ranking, the optimal low-radioactivity decorative building materials were found to be Sichuan glazed brick and Henan granite. The results show that applying the entropy weight TOPSIS method to select low-radioactivity decorative building materials is feasible, and the approach also provides a methodological reference. (authors)
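The entropy weight TOPSIS procedure outlined above can be sketched under common textbook conventions (column normalization, entropy-derived weights, closeness to the ideal solution). The data and function name below are illustrative, not the paper's:

```python
import math

def entropy_weight_topsis(matrix, benefit):
    """Rank alternatives (rows) over criteria (columns). benefit[j] is True
    when larger values of criterion j are better; for radionuclide content,
    lower is better, so benefit[j] = False."""
    m, n = len(matrix), len(matrix[0])
    # 1. Normalize each criterion column so it sums to one.
    col = [sum(row[j] for row in matrix) for j in range(n)]
    p = [[matrix[i][j] / col[j] for j in range(n)] for i in range(m)]
    # 2. Entropy weights: criteria that discriminate more get more weight.
    k = 1.0 / math.log(m)
    e = [-k * sum(p[i][j] * math.log(p[i][j])
                  for i in range(m) if p[i][j] > 0) for j in range(n)]
    raw = [1 - ej for ej in e]
    total = sum(raw)
    w = [r / total for r in raw]
    # 3. Weighted matrix, ideal and anti-ideal points.
    v = [[w[j] * p[i][j] for j in range(n)] for i in range(m)]
    best = [max(v[i][j] for i in range(m)) if benefit[j]
            else min(v[i][j] for i in range(m)) for j in range(n)]
    worst = [min(v[i][j] for i in range(m)) if benefit[j]
             else max(v[i][j] for i in range(m)) for j in range(n)]
    # 4. Closeness coefficient to the ideal solution (1 = best).
    def dist(row, ref):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ref)))
    return [dist(v[i], worst) / (dist(v[i], best) + dist(v[i], worst))
            for i in range(m)]
```

An alternative dominated in every cost criterion receives a closeness score near zero, and a dominating one a score near one.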
A Connection Entropy Approach to Water Resources Vulnerability Analysis in a Changing Environment
Directory of Open Access Journals (Sweden)
Zhengwei Pan
2017-11-01
Full Text Available This paper establishes a water resources vulnerability framework based on sensitivity, natural resilience and artificial adaptation, through analyses of the four states of the water system and its accompanying transformation processes. Furthermore, it proposes an analysis method for water resources vulnerability based on connection entropy, which extends the concept of contact entropy. The method is illustrated with the water resources vulnerability of Anhui Province, China; the analysis shows that vulnerability levels fluctuated overall but displayed a clear improving trend from 2001 to 2015. Suggestions are also provided, from the viewpoint of the vulnerability index, for improving the water resources vulnerability level of Anhui Province.
Dispersion entropy for the analysis of resting-state MEG regularity in Alzheimer's disease.
Azami, Hamed; Rostaghi, Mostafa; Fernandez, Alberto; Escudero, Javier
2016-08-01
Alzheimer's disease (AD) is a progressive degenerative brain disorder affecting memory, thinking, behaviour and emotion. It is the most common form of dementia and a major social problem in western societies. The analysis of brain activity may help to diagnose this disease. Changes in entropy measures have been reported useful in research studies to characterize AD. We have recently proposed dispersion entropy (DisEn) as a very fast and powerful tool to quantify the irregularity of time series. The aim of this paper is to evaluate the ability of DisEn, in comparison with fuzzy entropy (FuzEn), sample entropy (SampEn), and permutation entropy (PerEn), to discriminate 36 AD patients from 26 elderly control subjects using resting-state magnetoencephalogram (MEG) signals. The results obtained by DisEn, FuzEn, and SampEn, unlike PerEn, show that the AD patients' signals are more regular than the controls' time series. The p-values obtained by the DisEn, FuzEn, SampEn, and PerEn based methods demonstrate the superiority of DisEn over FuzEn, SampEn, and PerEn. Moreover, the computation time for the newly proposed DisEn-based method is noticeably less than for the FuzEn, SampEn, and PerEn based approaches.
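Dispersion entropy can be sketched compactly. The version below follows the commonly published steps (normal-CDF mapping to c classes, embedding, Shannon entropy of the dispersion-pattern distribution), but the parameter defaults and normalization are our choices, not necessarily those used in this study:

```python
import math
from collections import Counter
from statistics import mean, pstdev

def dispersion_entropy(x, m=2, c=3, delay=1):
    """Normalized dispersion entropy (0..1) of a 1-D signal: map samples
    to c classes through the normal CDF, form embedding vectors of length
    m, then take the Shannon entropy of the pattern distribution."""
    mu, sigma = mean(x), pstdev(x)
    ncdf = [0.5 * (1 + math.erf((v - mu) / (sigma * math.sqrt(2))))
            for v in x]
    z = [min(c, max(1, round(c * y + 0.5))) for y in ncdf]
    n = len(z) - (m - 1) * delay
    patterns = Counter(tuple(z[i + k * delay] for k in range(m))
                       for i in range(n))
    h = -sum((cnt / n) * math.log(cnt / n) for cnt in patterns.values())
    return h / math.log(c ** m)   # normalize by the maximum ln(c^m)
```

White noise yields a value near 1 while a slowly varying deterministic signal scores much lower, which is the irregularity contrast exploited in studies like this one.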
Applicability of the minimum entropy generation method for optimizing thermodynamic cycles
Institute of Scientific and Technical Information of China (English)
Cheng Xue-Tao; Liang Xin-Gang
2013-01-01
Entropy generation is often used as a figure of merit in thermodynamic cycle optimizations. In this paper, it is shown that the applicability of the minimum entropy generation method to optimizing output power is conditional. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power when the total heat into the system of interest is not prescribed. For the cycles whose working medium is heated or cooled by streams with prescribed inlet temperatures and prescribed heat capacity flow rates, it is theoretically proved that both the minimum entropy generation rate and the minimum entropy generation number correspond to the maximum output power when the virtual entropy generation induced by dumping the used streams into the environment is considered. However, the minimum principle of entropy generation is not tenable in the case that the virtual entropy generation is not included, because the total heat into the system of interest is not fixed. An irreversible Carnot cycle and an irreversible Brayton cycle are analysed. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power if the heat into the system of interest is not prescribed.
Applicability of the minimum entropy generation method for optimizing thermodynamic cycles
International Nuclear Information System (INIS)
Cheng Xue-Tao; Liang Xin-Gang
2013-01-01
Entropy generation is often used as a figure of merit in thermodynamic cycle optimizations. In this paper, it is shown that the applicability of the minimum entropy generation method to optimizing output power is conditional. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power when the total heat into the system of interest is not prescribed. For the cycles whose working medium is heated or cooled by streams with prescribed inlet temperatures and prescribed heat capacity flow rates, it is theoretically proved that both the minimum entropy generation rate and the minimum entropy generation number correspond to the maximum output power when the virtual entropy generation induced by dumping the used streams into the environment is considered. However, the minimum principle of entropy generation is not tenable in the case that the virtual entropy generation is not included, because the total heat into the system of interest is not fixed. An irreversible Carnot cycle and an irreversible Brayton cycle are analysed. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power if the heat into the system of interest is not prescribed. (general)
EEG entropy measures in anesthesia
Directory of Open Access Journals (Sweden)
Zhenhu eLiang
2015-02-01
Full Text Available Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing the effect of anesthetic drugs is lacking. In this study, we compare the capability of twelve entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP) in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State Entropy (SE), three wavelet entropy (WE) measures (Shannon WE (SWE), Tsallis WE (TWE) and Renyi WE (RWE)), Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy, and three permutation entropy (PE) measures (Shannon PE (SPE), Tsallis PE (TPE) and Renyi PE (RPE)). Two EEG data sets, from sevoflurane-induced and isoflurane-induced anesthesia respectively, were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability analysis were applied. Multifractal detrended fluctuation analysis (MDFA) was included for comparison as a non-entropy measure. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. The three PE measures outperformed the other entropy indices, with less baseline variability, higher coefficient of determination and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, the entropy measures showed an advantage in computational efficiency compared with MDFA. Conclusion: Each entropy index has its advantages and disadvantages in estimating DoA. Overall, the RPE index was the superior measure. Significance: Investigating the advantages and disadvantages of these entropy indices could help improve current clinical indices for monitoring DoA.
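As one representative of the indices compared above, sample entropy admits a short reference implementation. This is a plain O(n²) sketch with the conventional tolerance r = 0.2 × SD, not the clinical implementation used in the study:

```python
import math

def sample_entropy(x, m=2, r=None):
    """SampEn: negative log of the conditional probability that template
    pairs matching for m points (Chebyshev distance within tolerance r)
    still match at m + 1 points; self-matches are excluded."""
    if r is None:
        mu = sum(x) / len(x)
        r = 0.2 * (sum((v - mu) ** 2 for v in x) / len(x)) ** 0.5
    def pairs(mm):
        limit = len(x) - m          # same template count for both lengths
        hits = 0
        for i in range(limit):
            for j in range(i + 1, limit):
                if max(abs(a - b)
                       for a, b in zip(x[i:i + mm], x[j:j + mm])) <= r:
                    hits += 1
        return hits
    b, a = pairs(m), pairs(m + 1)
    return -math.log(a / b) if a and b else float('inf')
```

A regular signal (for example a sampled sine) scores well below random noise, which is why such indices can track the transition into deeper anesthesia.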
Entropy correlation distance method. The Euro introduction effect on the Consumer Price Index
Miśkiewicz, Janusz
2010-04-01
The idea of entropy was introduced in thermodynamics, but it can also be used in time series analysis. There are various ways to define and measure the entropy of a system. Here the so-called Theil index, often used in economics and finance, is applied as if it were an entropy measure. In this study the time series are remapped through the Theil index. Then the linear correlation coefficient between the remapped time series is evaluated as a function of time and time window size, and the corresponding statistical distance is defined. The results are compared with the usual correlation distance measure for the time series themselves. As an example, this entropy correlation distance method (ECDM) is applied to several series, such as those of the Consumer Price Index (CPI), in order to test some so-called globalisation processes. Distance matrices are calculated in order to construct two network structures, which are then analysed. The role of the two different time scales introduced by the Theil index and the correlation coefficient is also discussed. The evolution of the mean distance between the most developed countries is presented and the globalisation periods of the prices discussed. It is finally shown that the evolution of the mean distance between the most developed countries on several networks follows the process of introducing the European currency, the Euro. This is contrasted with a GDP based analysis. It is stressed that the entropy correlation distance measure is more suitable for detecting significant changes, such as a globalisation process, than the usual (correlation based) statistical measure.
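The remapping-plus-distance construction can be sketched as below. The Theil index is standard; the distance form d = sqrt((1 − C)/2) is a common correlation-distance convention assumed here, and the window size in the usage is illustrative rather than the paper's setting:

```python
import math

def theil(window):
    """Theil index of a window of positive values."""
    mu = sum(window) / len(window)
    return sum((v / mu) * math.log(v / mu) for v in window) / len(window)

def remap_theil(series, win):
    """Remap a time series into its rolling-window Theil-index series."""
    return [theil(series[t - win:t]) for t in range(win, len(series) + 1)]

def entropy_correlation_distance(x, y):
    """Distance d = sqrt((1 - C) / 2) from the Pearson correlation C of
    two (typically Theil-remapped) series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    c = cov / (sx * sy)
    return math.sqrt(max(0.0, (1 - c) / 2))
```

Because the Theil index is scale invariant, two CPI-like series differing only by a multiplicative factor remap identically and sit at distance zero, while unrelated series remain separated.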
Directory of Open Access Journals (Sweden)
Javier A. Dottori
2015-01-01
Full Text Available A method for modeling outflow boundary conditions in the lattice Boltzmann method (LBM based on the maximization of the local entropy is presented. The maximization procedure is constrained by macroscopic values and downstream components. The method is applied to fully developed boundary conditions of the Navier-Stokes equations in rectangular channels. Comparisons are made with other alternative methods. In addition, the new downstream-conditioned entropy is studied and it was found that there is a correlation with the velocity gradient during the flow development.
Risk Contagion in Chinese Banking Industry: A Transfer Entropy-Based Analysis
Directory of Open Access Journals (Sweden)
Jianping Li
2013-12-01
Full Text Available What is the impact of a bank failure on the whole banking industry? To resolve this issue, the paper develops a transfer entropy-based method to determine the interbank exposure matrix between banks. This method constructs the interbank market structure by calculating the transfer entropy matrix from bank stock price sequences. The paper also evaluates the stability of the Chinese banking system by simulating the risk contagion process. The paper contributes to the literature on interbank contagion mainly in two ways. First, it establishes a convincing connection between the interbank market and transfer entropy, and exploits market information (stock prices) rather than presumptions to determine the interbank exposure matrix. Second, the empirical analysis provides an in-depth understanding of the stability of the current Chinese banking system.
Entropy generation analysis of an adsorption cooling cycle
Thu, Kyaw
2013-05-01
This paper discusses the analysis of an adsorption (AD) chiller using system entropy generation as a thermodynamic framework for evaluating the total dissipative losses that occur in a batch-operated AD cycle. The study focuses on an adsorption cycle operating at heat source temperatures ranging from 60 to 85 °C, whilst the chilled water inlet temperature is fixed at 12.5 °C, a temperature of chilled water deemed useful for dehumidification and cooling. The total entropy generation model examines the processes of the key components of the AD chiller, such as heat and mass transfer, flushing and de-superheating of liquid refrigerant. The following key findings are observed: (i) the cycle entropy generation increases with the heat source temperature (10.8 to 46.2 W/K), and the largest share of entropy generation, or rate of energy dissipation, occurs in the adsorption process; (ii) the second highest rate of energy dissipation is in the desorption process; (iii) the remaining dissipation rates belong to the evaporation and condensation processes, respectively. Noteworthy highlights from the study are the inevitable but significant dissipative losses found in the switching processes between adsorption and desorption and vice versa, as well as in the de-superheating of warm condensate that is refluxed at non-thermal-equilibrium conditions from the condenser to the evaporator to complete the refrigeration cycle. © 2012 Elsevier Ltd. All rights reserved.
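The building block of such an entropy generation audit is the loss from heat crossing a finite temperature difference. A minimal sketch, with temperatures and duty that are illustrative rather than the chiller's measured values:

```python
def heat_transfer_entropy_generation(q_watts, t_hot_k, t_cold_k):
    """Entropy generation rate (W/K) of heat q_watts flowing across a
    finite temperature difference: S_gen = Q * (1/T_cold - 1/T_hot)."""
    return q_watts * (1.0 / t_cold_k - 1.0 / t_hot_k)

# Illustrative numbers only: 10 kW absorbed by a 50 C adsorber bed from
# an 85 C heat source dissipates about 3 W/K, and the loss shrinks as
# the temperature gap narrows, consistent with the trend the paper
# reports as heat source temperature rises.
s_gen = heat_transfer_entropy_generation(1.0e4, 358.15, 323.15)
```

Summing such terms over adsorption, desorption, evaporation and condensation is, in outline, how a component-wise entropy generation budget is assembled.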
Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy
Yujun, Yang; Jianping, Li; Yimei, Yang
This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) method to analyze the short-range and long-range characteristics of financial time series, and then applies this method to five time series of five properties in four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of the Rényi entropy and the generalized Hurst exponent of five properties of the four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing these curves and the profiles of the logarithmic distribution of the MFDFA of five properties of the four stock indices, the 3MPAR method reveals some fluctuation characteristics of the financial time series and the stock markets. It also yields richer information about the financial time series by comparing the profiles of the five properties of the four stock indices. In this paper, we focus not only on the multifractality of the time series but also on the fluctuation characteristics of the financial time series and the subtle differences between time series of different properties. We find that financial time series are far more complex than reported in some research works that use only one property of a time series.
Analysis of complex time series using refined composite multiscale entropy
International Nuclear Information System (INIS)
Wu, Shuen-De; Wu, Chiu-Wen; Lin, Shiou-Gwo; Lee, Kung-Yen; Peng, Chung-Kang
2014-01-01
Multiscale entropy (MSE) is an effective algorithm for measuring the complexity of a time series that has been applied successfully in many fields. However, MSE may yield an inaccurate estimation of entropy or induce undefined entropy, because the coarse-graining procedure reduces the length of a time series considerably at large scales. Composite multiscale entropy (CMSE) was recently proposed to improve the accuracy of MSE, but it does not resolve undefined entropy. Here we propose a refined composite multiscale entropy (RCMSE) to improve CMSE. For short time series analyses, we demonstrate that RCMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy.
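The refinement described above, aggregating match counts over all coarse-graining offsets before taking the logarithm, can be sketched as follows. The defaults (m = 2, r = 0.15 × SD of the original series) follow common practice and are assumptions, not necessarily the paper's settings:

```python
import math

def _match_counts(y, m, r):
    """Pairs of templates matching for m points, and also at point m+1."""
    limit = len(y) - m
    n_m = n_m1 = 0
    for i in range(limit):
        for j in range(i + 1, limit):
            if max(abs(a - b)
                   for a, b in zip(y[i:i + m], y[j:j + m])) <= r:
                n_m += 1
                if abs(y[i + m] - y[j + m]) <= r:
                    n_m1 += 1
    return n_m, n_m1

def rcmse(x, scale, m=2, r=None):
    """Refined composite multiscale entropy at one scale: match counts
    are summed over all coarse-graining offsets BEFORE the logarithm,
    which avoids the undefined values CMSE can give on short series."""
    if r is None:
        mu = sum(x) / len(x)
        r = 0.15 * (sum((v - mu) ** 2 for v in x) / len(x)) ** 0.5
    total_m = total_m1 = 0
    for k in range(scale):
        y = [sum(x[i:i + scale]) / scale
             for i in range(k, len(x) - scale + 1, scale)]
        n_m, n_m1 = _match_counts(y, m, r)
        total_m += n_m
        total_m1 += n_m1
    return -math.log(total_m1 / total_m) if total_m1 else float('inf')
```

For white noise the estimate decreases with scale, the classic MSE signature, and the refined estimator stays defined even when a single offset would yield zero matches.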
Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series
Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael
2014-01-01
Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the most computationally heavy aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and
Liu, Jian; Zou, Renling; Zhang, Dongheng; Xu, Xiulin; Hu, Xiufang
2016-06-01
Exercise-induced muscle fatigue is a phenomenon in which the maximum voluntary contraction force or power output of a muscle is temporarily reduced by muscular activity. If the fatigue is not treated properly, it can bring about severe injury to the human body. Using multi-channel recordings of lower-limb surface electromyography signals, this article analyzes muscle fatigue with a band spectrum entropy method that combines electromyographic spectral analysis and nonlinear dynamics. The experimental results indicated that as muscle fatigue increased, the muscle signal spectrum moved toward low frequencies, the energy became concentrated, the system complexity decreased, and the band spectrum entropy, which reflects this complexity, was also reduced. By monitoring the entropy, the degree of muscle fatigue can be measured, providing an indicator of fatigue for sports training and clinical rehabilitation training.
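A band spectrum entropy of the kind described can be sketched as the Shannon entropy of relative band powers. The band count and the naive DFT below are illustrative simplifications, not the authors' processing chain:

```python
import math

def band_spectrum_entropy(signal, n_bands=8):
    """Shannon entropy (bits) of the relative power in n_bands equal-width
    frequency bands; a naive O(n^2) DFT is fine for short sEMG epochs."""
    n = len(signal)
    half = n // 2
    power = []
    for k in range(1, half):  # skip the DC component
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n)
                 for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n)
                  for t in range(n))
        power.append(re * re + im * im)
    band = n_bands * [0.0]
    for k, p in enumerate(power):
        band[min(n_bands - 1, k * n_bands // len(power))] += p
    total = sum(band)
    probs = [b / total for b in band if b > 0]
    return -sum(p * math.log2(p) for p in probs)
```

Power spread across all bands (fresh, broadband sEMG) gives an entropy near log2(n_bands), while power concentrated in a low band (fatigued muscle) drives the entropy toward zero.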
Liu, Yong; Qin, Zhimeng; Hu, Baodan; Feng, Shuai
2018-04-01
Stability analysis is of great significance to landslide hazard prevention, especially analysis of dynamic stability. However, owing to unique landslide geological conditions, it is difficult for many existing stability analysis methods to assess continuous landslide stability and its changing regularities under a uniform criterion. Based on the relationship between displacement monitoring data, deformation states and landslide stability, a state fusion entropy method is herein proposed to derive landslide instability through a comprehensive multi-attribute entropy analysis of deformation states, which are defined by a proposed joint clustering method combining K-means and a cloud model. Taking the Xintan landslide as a detailed case study, the cumulative state fusion entropy presents an obvious increasing trend after the landslide entered the accelerative deformation stage, and historical maxima match well with the landslide's macroscopic deformation behaviour at key time nodes. Reasonable results are also obtained in its application to several other landslides in the Three Gorges Reservoir area in China. Combined with field surveys, state fusion entropy may serve for assessing landslide stability and judging landslide evolutionary stages.
RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS
Institute of Scientific and Technical Information of China (English)
Sun Youchao; Shi Jun
2004-01-01
Reliability assessment linking the neighbouring unit and system levels is the most important part of multi-level reliability synthesis for complex systems. Introducing information theory into system reliability assessment, and using the additivity of information quantity together with the principle of information quantity equivalence, an entropy method of data information conversion is presented for a system consisting of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived from the principle of information quantity equivalence. General models of entropy method synthesis assessment for approximate lower limits of system reliability are established according to the fundamental principle of unit reliability assessment. Applications of the entropy method are discussed by way of practical examples. Compared with traditional methods, the entropy method is found to be valid and practicable, and the assessment results are very satisfactory.
International Nuclear Information System (INIS)
Reginatto, Marcel; Zimbal, Andreas
2008-01-01
In applications of neutron spectrometry to fusion diagnostics, it is advantageous to use methods of data analysis which can extract information from the spectrum that is directly related to the parameters of interest that describe the plasma. We present here methods of data analysis which were developed with this goal in mind, and which were applied to spectrometric measurements made with an organic liquid scintillation detector (type NE213). In our approach, we combine Bayesian parameter estimation methods and unfolding methods based on the maximum entropy principle. This two-step method allows us to optimize the analysis of the data depending on the type of information that we want to extract from the measurements. To illustrate these methods, we analyze neutron measurements made at the PTB accelerator under controlled conditions, using accelerator-produced neutron beams. Although the methods have been chosen with a specific application in mind, they are general enough to be useful for many other types of measurements
A Maximum Entropy Approach to Loss Distribution Analysis
Directory of Open Access Journals (Sweden)
Marco Bee
2013-03-01
Full Text Available In this paper we propose an approach to the estimation and simulation of loss distributions based on Maximum Entropy (ME), a non-parametric technique that maximizes the Shannon entropy of the data under moment constraints. Special cases of the ME density correspond to standard distributions; therefore, this methodology is very general, as it nests most classical parametric approaches. Sampling the ME distribution is essential in many contexts, such as loss models constructed via compound distributions. Given the difficulties in carrying out exact simulation, we propose an innovative algorithm, obtained by means of an extension of Adaptive Importance Sampling (AIS), for the approximate simulation of the ME distribution. Several numerical experiments confirm that the AIS-based simulation technique works well, and an application to insurance data gives further insights into the usefulness of the method for modelling, estimating and simulating loss distributions.
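One closed-form special case of the ME family the abstract mentions: on [0, ∞) with only the mean constrained, the maximum entropy density is the exponential distribution, which can be sampled exactly by inverse-CDF (so the approximate AIS scheme is unnecessary in this special case). The function name is ours:

```python
import math
import random

def maxent_exponential_sampler(mean_loss, seed=None):
    """On [0, inf) with only the mean fixed, the maximum-entropy density
    is the exponential distribution with that mean; draws are produced
    by exact inverse-CDF sampling."""
    rng = random.Random(seed)
    def sample():
        u = rng.random()                 # u in [0, 1)
        return -mean_loss * math.log(1.0 - u)
    return sample
```

Higher-order moment constraints yield exponential-family densities without closed-form samplers, which is where an approximate scheme like the paper's AIS extension becomes necessary.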
A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.
Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S
2016-01-01
Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity that makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) that use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure: an easily implementable calcium time series analysis method that represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than the other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
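A sketch of the measure as described, an entropy rate over empirical Markov transitions between discretized states. State discretization is assumed to happen upstream, and this is the plain plug-in estimator rather than the authors' exact implementation:

```python
import math
from collections import Counter

def markovian_entropy(states):
    """Entropy rate (bits) of the empirical first-order Markov chain of a
    discretized activity trace: H = -sum_i pi_i sum_j p_ij log2 p_ij,
    with pi and p_ij estimated from observed transition counts."""
    n = len(states) - 1
    trans = Counter(zip(states, states[1:]))   # (from, to) counts
    occ = Counter(states[:-1])                 # visits to each state
    h = 0.0
    for (i, _j), c in trans.items():
        p_ij = c / occ[i]
        h -= (occ[i] / n) * p_ij * math.log2(p_ij)
    return h
```

A perfectly predictable trace (a deterministic cycle of states) has entropy rate zero, while an unpredictable binary trace approaches one bit per transition, which is the predictability contrast the measure exploits.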
A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.
Directory of Open Access Journals (Sweden)
John P Marken
Full Text Available Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity that makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) that use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure: an easily implementable calcium time series analysis method that represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than the other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
A new entropy based method for computing software structural complexity
Roca, J L
2002-01-01
In this paper a new methodology for the evaluation of software structural complexity is described. It is based on the entropy of the random uniform response function associated with the so-called software characteristic function (SCF). The behavior of the SCF across different software structures and its relationship with the number of inherent errors is investigated. It is also investigated how the entropy concept can be used to evaluate the complexity of a software structure, considering the SCF as a canonical representation of the graph associated with the control flow diagram. The functions, parameters and algorithms that allow this evaluation to be carried out are also introduced. This analytic phase is followed by an experimental phase, verifying the consistency of the proposed metric and its boundary conditions. The conclusion is that the degree of software structural complexity can be measured as the entropy of the random uniform response function of the SCF. That entropy is in direct relation...
Energy Technology Data Exchange (ETDEWEB)
Jiang, Qingchao; Yan, Xuefeng; Lv, Zhaomin; Guo, Meijin [East China University of Science and Technology, Shanghai (China)
2013-06-15
Considering that kernel entropy component analysis (KECA) is a promising new method of nonlinear data transformation and dimensionality reduction, a KECA based method is proposed for nonlinear chemical process monitoring. In this method, an angle-based statistic is designed because KECA reveals structure related to the Renyi entropy of the input-space data set, and the transformed data sets are produced with a distinct angle-based structure. Based on the angle difference between the normal status and current sample data, the current status can be monitored effectively. The confidence limit of the angle-based statistic is determined by kernel density estimation based on sample data of the normal status. The effectiveness of the proposed method is demonstrated by case studies on both a numerical process and a simulated continuous stirred tank reactor (CSTR) process. The KECA based method can be an effective method for nonlinear chemical process monitoring.
Application of the maximum entropy method to dynamical fermion simulations
Clowser, Jonathan
This thesis presents results for spectral functions extracted from imaginary-time correlation functions obtained from Monte Carlo simulations using the Maximum Entropy Method (MEM). The advantages of this method are that (i) no a priori assumptions or parametrisations of the spectral function are needed, (ii) a unique solution exists, and (iii) the statistical significance of the resulting image can be quantitatively analysed. The Gross-Neveu model in d = 3 spacetime dimensions (GNM3) is a particularly interesting model to study with the MEM because at T = 0 it has a broken phase with a rich spectrum of mesonic bound states and a symmetric phase where there are resonances. Results for the elementary fermion, the Goldstone boson (pion), the sigma, the massive pseudoscalar meson and the symmetric phase resonances are presented. UKQCD Nf = 2 dynamical QCD data are also studied with the MEM. Results are compared to those found in the quenched approximation, where the effects of quark loops in the QCD vacuum are neglected, to search for sea-quark effects in the extracted spectral functions. Information has been extracted from the difficult axial-spatial and scalar channels as well as the pseudoscalar, vector and axial-temporal channels. An estimate for the non-singlet scalar mass in the chiral limit is given, in agreement with the experimental value of M_a0 = 985 MeV.
Maximum entropy method approach to the θ term
International Nuclear Information System (INIS)
Imachi, Masahiro; Shinno, Yasuhiko; Yoneyama, Hiroshi
2004-01-01
In Monte Carlo simulations of lattice field theory with a θ term, one confronts the complex weight problem, or sign problem. This is circumvented by performing the Fourier transform of the topological charge distribution P(Q). This procedure, however, causes a flattening phenomenon of the free energy f(θ), which makes study of the phase structure unfeasible. In order to treat this problem, we apply the maximum entropy method (MEM) to a Gaussian form of P(Q), which serves as a good example to test whether the MEM can be applied effectively to the θ term. We study the case with flattening as well as that without flattening. In the latter case, the results of the MEM agree with those obtained from the direct application of the Fourier transform. For the former, the MEM gives a smoother f(θ) than that of the Fourier transform. Among the various default models investigated, the images which yield the least error do not show flattening, although some others cannot be excluded given the uncertainty related to statistical error. (author)
Directory of Open Access Journals (Sweden)
Jikai Chen
2016-12-01
In a power system, the analysis of transient signals is the theoretical basis of fault diagnosis and transient protection theory. Shannon wavelet entropy (SWE) and Shannon wavelet packet entropy (SWPE) are powerful mathematical tools for transient signal analysis. Combined with recent achievements regarding SWE and SWPE, their applications in feature extraction of transient signals and transient fault recognition are summarized. For wavelet aliasing at adjacent scales of the wavelet decomposition, the impact of aliasing on the feature extraction accuracy of SWE and SWPE is analyzed, and their differences are compared. The analyses are verified by partial discharge (PD) feature extraction for a power cable. Finally, new ideas and further research directions are proposed concerning the wavelet entropy mechanism, computation speed, and how to overcome wavelet aliasing.
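As a minimal sketch of the idea (assuming a hand-rolled Haar transform on power-of-two-length signals, rather than the wavelet families used in the surveyed work), Shannon wavelet entropy can be computed as the Shannon entropy of the relative wavelet energy per decomposition scale:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar wavelet transform: (approximation, detail).
    Assumes an even-length (ideally power-of-two) input."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def shannon_wavelet_entropy(x, levels=4):
    """Shannon entropy of the relative wavelet energy per scale."""
    energies = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        energies.append(np.sum(d ** 2))   # detail energy at this scale
    energies.append(np.sum(a ** 2))       # final approximation energy
    p = np.array(energies) / np.sum(energies)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

t = np.linspace(0, 1, 256, endpoint=False)
print(round(shannon_wavelet_entropy(np.sin(2 * np.pi * 8 * t)), 3))
```

A signal whose energy is concentrated at one scale yields low entropy; broadband transients spread energy across scales and yield high entropy, which is what makes the measure useful for fault features.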
Gravel Image Segmentation in Noisy Background Based on Partial Entropy Method
Institute of Scientific and Technical Information of China (English)
(No author listed)
2000-01-01
Because of the wide variation in gray levels and particle dimensions, the presence of many small gravel objects in the background, and image noise, gravel objects are difficult to segment. In this paper, we develop a partial entropy method and use it to segment gravel objects successfully. We present the entropy principles and their calculation methods. Moreover, we use the minimum entropy error to select a segmentation threshold automatically, and we introduce a filtering method based on mathematical morphology. Segmentation experiments performed with different window dimensions on a group of gravel images demonstrate that this method has a high segmentation rate and low noise sensitivity.
EEG entropy measures in anesthesia
Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J.; Sleigh, Jamie W.; Hagihira, Satoshi; Li, Xiaoli
2015-01-01
Highlights: ► Twelve entropy indices were systematically compared in monitoring depth of anesthesia and detecting burst suppression. ► Renyi permutation entropy performed best in tracking EEG changes associated with different anesthesia states. ► Approximate entropy and sample entropy performed best in detecting burst suppression. Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing the effect of anesthetic drugs is lacking. In this study, we compare the capability of 12 entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP) in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State Entropy (SE), three wavelet entropy (WE) measures [Shannon WE (SWE), Tsallis WE (TWE), and Renyi WE (RWE)], Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy, and three permutation entropy (PE) measures [Shannon PE (SPE), Tsallis PE (TPE) and Renyi PE (RPE)]. Two EEG data sets, from sevoflurane-induced and isoflurane-induced anesthesia respectively, were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability (Pk) analysis were applied. Multifractal detrended fluctuation analysis (MDFA) was compared as a non-entropy measure. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. The three PE measures outperformed the other entropy indices, with less baseline variability and a higher coefficient of determination (R2) and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, these entropy measures showed an advantage in computation
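Permutation entropy, the family that performed best here, is simple to state: count the ordinal patterns of length-`order` windows and take the Shannon entropy of their distribution. A minimal sketch (the normalisation by log(order!) and the toy signals are illustrative choices, not taken from the paper):

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, order=3, delay=1):
    """Shannon permutation entropy, normalised to [0, 1]."""
    x = np.asarray(x)
    n = len(x) - (order - 1) * delay
    patterns = {}
    for i in range(n):
        window = x[i:i + order * delay:delay]
        key = tuple(np.argsort(window))        # ordinal pattern of the window
        patterns[key] = patterns.get(key, 0) + 1
    p = np.array(list(patterns.values())) / n
    return -np.sum(p * np.log(p)) / log(factorial(order))

# a monotone ramp has a single ordinal pattern, so entropy is zero
print(permutation_entropy(np.arange(100)) == 0)                 # True
rng = np.random.default_rng(1)
print(permutation_entropy(rng.normal(size=5000)) > 0.9)         # True
```

White noise uses all order! patterns almost uniformly (entropy near 1), while a deeply anesthetized, regular EEG would concentrate on few patterns (entropy near 0), which is the intuition behind DoA tracking.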
Second Law Analysis of the Optimal Fin by Minimum Entropy Generation
Institute of Scientific and Technical Information of China (English)
(No author listed)
2005-01-01
Based on the entropy generation concept of thermodynamics, this paper establishes a general theoretical model for the analysis of entropy generation to optimize fins, in which the minimum entropy generation is selected as the objective. The irreversibility due to heat transfer and friction is taken into account, so that the minimum entropy generation number is analyzed with respect to the second law of thermodynamics in forced cross-flow. The optimum dimensions of cylindrical pin fins are discussed. It is found that the minimum entropy generation number depends on parameters related to the fluid and the fin physical parameters. Variations of the minimum entropy generation number with different parameters are analyzed.
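The underlying bookkeeping is the classical balance: entropy is generated by heat crossing a finite temperature difference and by flow friction. A minimal sketch, with all numbers and the simple two-term form chosen purely for illustration (the paper's fin model is more detailed):

```python
def entropy_generation(Q, T_hot, T_cold, m_dot=0.0, dp=0.0,
                       rho=1.2, T_fluid=300.0):
    """Entropy generation rate [W/K] (illustrative two-term model):
    heat transfer Q [W] across a finite temperature difference,
    plus friction from mass flow m_dot [kg/s] against pressure drop dp [Pa].
    """
    s_heat = Q * (1.0 / T_cold - 1.0 / T_hot)      # finite-dT heat transfer
    s_friction = m_dot * dp / (rho * T_fluid)      # flow friction
    return s_heat + s_friction

# 100 W crossing from a 350 K fin base to a 300 K stream
print(round(entropy_generation(100.0, 350.0, 300.0), 5))  # 0.04762
```

Minimizing the sum over the fin geometry trades the heat-transfer term (which shrinks with larger surface area) against the friction term (which grows with it), which is exactly the optimization the abstract describes.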
Friedrich, Lucas
2017-12-29
This work presents an entropy stable discontinuous Galerkin (DG) spectral element approximation for systems of non-linear conservation laws with general geometric (h) and polynomial order (p) non-conforming rectangular meshes. The crux of the proofs presented is that the nodal DG method is constructed with collocated Legendre-Gauss-Lobatto nodes. This choice ensures that the derivative/mass matrix pair is a summation-by-parts (SBP) operator, such that entropy stability proofs from the continuous analysis are discretely mimicked. Special attention is given to the coupling between non-conforming elements, as we demonstrate that the standard mortar approach for DG methods does not guarantee entropy stability for non-linear problems, which can lead to instabilities. As such, we describe a precise procedure and modify the mortar method to guarantee entropy stability for general non-linear hyperbolic systems on h/p non-conforming meshes. We verify the high-order accuracy and the entropy conservation/stability of the fully non-conforming approximation with numerical examples.
Entropy for gravitational Chern-Simons terms by squashed cone method
International Nuclear Information System (INIS)
Guo, Wu-Zhong; Miao, Rong-Xin
2016-01-01
In this paper we investigate the entropy of gravitational Chern-Simons terms for horizons with non-vanishing extrinsic curvatures, or the holographic entanglement entropy for an arbitrary entangling surface. In 3D there is no anomaly of entropy, but the original squashed cone method cannot be used directly to get the correct result. In higher dimensions the anomaly of entropy appears, and still we cannot use the squashed cone method directly. That is because the Chern-Simons action is not gauge invariant. To get a reasonable result we suggest two methods. One is to add a boundary term to recover gauge invariance; this boundary term can be derived from the variation of the Chern-Simons action. The other is to use the Chern-Simons relation dΩ_{4n−1} = tr(R^{2n}). We notice that the entropy of tr(R^{2n}) is locally a total derivative, i.e., S = ds_{CS}. We propose to identify s_{CS} with the entropy of the gravitational Chern-Simons term Ω_{4n−1}. With the first method we obtain the correct Wald entropy in arbitrary dimension. With the second approach, in addition to the Wald entropy, we can also obtain the anomaly of entropy with non-zero extrinsic curvatures. Our results imply that the entropy of a topological invariant, such as the Pontryagin term tr(R^{2n}) and the Euler density, is a topological invariant on the entangling surface.
Dynamical noise filter and conditional entropy analysis in chaos synchronization.
Wang, Jiao; Lai, C-H
2006-06-01
It is shown that, in a chaotic synchronization system whose driving signal is exposed to channel noise, the estimation of the drive system states can be greatly improved by applying dynamical noise filtering to the response system states. If the noise is bounded within a certain range, the estimation errors, i.e., the differences between the filtered response states and the driving states, can be made arbitrarily small. This property can be used in designing an alternative digital communication scheme. An analysis based on conditional entropy justifies the application of dynamical noise filtering in generating high-quality synchronization.
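The conditional entropy argument can be illustrated numerically: H(drive | response), estimated from a joint histogram, is small when the filtered response tracks the drive and large for an unrelated signal. A minimal sketch (the bin count and the synthetic signals are illustrative assumptions, not the paper's chaotic system):

```python
import numpy as np

def conditional_entropy(x, y, bins=8):
    """H(X|Y) = H(X, Y) - H(Y), estimated from a joint histogram (bits)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    py = pxy.sum(axis=0)                 # marginal distribution of Y
    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))
    return H(pxy.ravel()) - H(py)

rng = np.random.default_rng(3)
drive = rng.normal(size=5000)
sync = drive + 0.05 * rng.normal(size=5000)   # nearly synchronized response
indep = rng.normal(size=5000)                 # unrelated signal
print(conditional_entropy(drive, sync) < conditional_entropy(drive, indep))  # True
```

A well-synchronized response leaves little residual uncertainty about the drive, which is the quantitative sense in which conditional entropy certifies synchronization quality.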
Maximum entropy technique in the doublet structure analysis
International Nuclear Information System (INIS)
Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.
1998-01-01
The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for solving the system of nonlinear equations encountered in the MENT has been developed and tested. The capabilities of the MENT are demonstrated on the example of the doublet structure analysis of noisy experimental data. A comparison of the MENT results with those of the Fourier algorithm technique without regularization is presented. The tolerable noise level is 30% for the MENT, compared with only 0.1% for the Fourier algorithm
Analysis of entropy extraction efficiencies in random number generation systems
Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu
2016-05-01
Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
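A classic post-processing example in this spirit is the von Neumann extractor, which removes bias from independent raw bits (for example, parities derived from photon arrival times) at the cost of throughput. A minimal sketch:

```python
def von_neumann_extract(bits):
    """Von Neumann debiasing: map raw bit pairs 01 -> 0, 10 -> 1,
    and discard the pairs 00 and 11."""
    out = []
    for b1, b2 in zip(bits[0::2], bits[1::2]):
        if b1 != b2:
            out.append(b1)
    return out

# a biased raw source still yields unbiased output bits, at a rate cost
raw = [1, 1, 0, 1, 1, 0, 0, 0, 1, 0]
print(von_neumann_extract(raw))  # [0, 1, 1]
```

For independent but biased input bits, 01 and 10 are equally likely, so the surviving bits are exactly uniform; the extraction efficiency (output bits per input bit) is at most 1/4, which is the kind of trade-off the review quantifies for the arrival-time schemes.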
Maximum-entropy clustering algorithm and its global convergence analysis
Institute of Scientific and Technical Information of China (English)
(No author listed)
2001-01-01
Constructing a batch of differentiable entropy functions to uniformly approximate an objective function by means of the maximum-entropy principle, a new clustering algorithm, called the maximum-entropy clustering algorithm, is proposed based on optimization theory. This algorithm is a soft generalization of the hard C-means algorithm and possesses global convergence. Its relations with other clustering algorithms are discussed.
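The flavour of the algorithm can be sketched with Gibbs (maximum-entropy) memberships, where a parameter `beta` interpolates between uniform soft assignments and hard C-means. The initialisation and the toy data below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def max_entropy_cluster(X, k=2, beta=5.0, iters=50):
    """Soft C-means with Gibbs (maximum-entropy) memberships.

    As beta -> infinity the memberships harden and each update
    reduces to the classical hard C-means step.
    """
    # deterministic spread-out initialisation (illustrative choice)
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        # Gibbs weights; subtracting the row minimum is for numerical stability
        w = np.exp(-beta * (d2 - d2.min(axis=1, keepdims=True)))
        w /= w.sum(axis=1, keepdims=True)
        centers = (w[:, :, None] * X[:, None, :]).sum(0) / w.sum(0)[:, None]
    return centers, w

X = np.vstack([np.zeros((20, 2)), np.ones((20, 2)) * 5])
centers, w = max_entropy_cluster(X, k=2)
print(np.sort(centers[:, 0]).round(2))  # [0. 5.]
```

The differentiable soft assignments are what make the objective smooth enough for the convergence analysis that the abstract refers to.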
Application of Markov chains-entropy to analysis of depositional environments
Energy Technology Data Exchange (ETDEWEB)
Men Guizhen; Shi Xiaohong; Zhao Shuzhi
1989-01-01
The paper systematically and comprehensively discussed the application of Markov chains-entropy analysis to the depositional environments of the upper Carboniferous Taiyuan Formation in Anjialing, Pingshuo open-cast mine, Shanxi. Definite geological meanings were given to the calculated values of the transition probability matrix, extremity probability matrix, substitution matrix and the entropy. The lithologic successions of coarse-fine-coarse grained layers from bottom upwards in the coal-bearing series made up generally symmetric cyclic patterns, suggesting that the coal-bearing strata were deposited in a coal-forming environment of delta-plain and littoral swamps. A quantitative study of the cyclicity and variation of the formation was conducted, and the assemblage relations among stratigraphic sequences and the significance of predicting vertical changes were emphasized. The results showed that overall Markov chain analysis is an effective method for analyzing the depositional environments of coal-bearing strata. 2 refs., 5 figs.
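The core computation is small: estimate a transition probability matrix from an observed facies sequence and attach an entropy to it. A minimal sketch with a hypothetical three-state lithology coding (the states and the sequence are invented for illustration, not taken from the Taiyuan Formation data):

```python
import numpy as np

# hypothetical lithology sequence: 0 = sandstone, 1 = mudstone, 2 = coal
seq = [0, 1, 2, 1, 0, 1, 2, 1, 0, 1, 0, 1, 2, 1, 0]

k = 3
counts = np.zeros((k, k))
for a, b in zip(seq[:-1], seq[1:]):
    counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)   # transition probability matrix

# Shannon entropy (bits) of the transitions leaving each state
row_entropy = np.array([-(p[p > 0] * np.log2(p[p > 0])).sum() for p in P])
print(P.round(2))
print(row_entropy.round(3))
```

Low row entropy means a state has a nearly deterministic successor (a strong cyclic signal); high row entropy means the succession at that state is close to random, which is how entropy complements the raw transition probabilities in this kind of analysis.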
Jiang, Quansheng; Shen, Yehu; Li, Hua; Xu, Fengyu
2018-01-24
Feature recognition and fault diagnosis play an important role in equipment safety and the stable operation of rotating machinery. In order to cope with the complexity of the vibration signals of rotating machinery, a feature fusion model based on information entropy and a probabilistic neural network is proposed in this paper. The new method first uses information entropy theory to extract three kinds of characteristic entropy from the vibration signals, namely singular spectrum entropy, power spectrum entropy, and approximate entropy. Then the feature fusion model is constructed to classify and diagnose the fault signals. The proposed approach combines comprehensive information from different aspects and is more sensitive to the fault features. Experimental results on simulated fault signals verified the better performance of the proposed approach. On real two-span rotor data, the fault detection accuracy of the new method is more than 10% higher compared with methods using the three kinds of information entropy separately. The new approach is proved to be an effective fault recognition method for rotating machinery.
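Two of the three characteristic entropies are straightforward to sketch: power spectrum entropy is the Shannon entropy of the normalised periodogram, and singular spectrum entropy is the Shannon entropy of the normalised singular values of a trajectory matrix. A minimal sketch (the window length `m` and the test signals are illustrative choices):

```python
import numpy as np

def shannon_entropy(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def power_spectrum_entropy(x):
    """Shannon entropy of the normalised power spectral density."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    return shannon_entropy(psd / psd.sum())

def singular_spectrum_entropy(x, m=10):
    """Shannon entropy of the normalised singular values of the
    trajectory (Hankel) matrix of lagged windows."""
    H = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    s = np.linalg.svd(H, compute_uv=False)
    return shannon_entropy(s / s.sum())

rng = np.random.default_rng(0)
tone = np.sin(2 * np.pi * 5 * np.arange(512) / 512)
noise = rng.normal(size=512)
print(power_spectrum_entropy(tone) < power_spectrum_entropy(noise))        # True
print(singular_spectrum_entropy(tone) < singular_spectrum_entropy(noise))  # True
```

A healthy rotor's nearly periodic vibration concentrates spectral and singular-value energy (low entropy), while fault-induced impacts and modulation spread it (high entropy), which is why these scalars work as fault features.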
Evaluation of single and multi-threshold entropy-based algorithms for folded substrate analysis
Directory of Open Access Journals (Sweden)
Magdolna Apro
2011-10-01
This paper presents a detailed evaluation of two variants of the Maximum Entropy image segmentation algorithm (single and multi-thresholding) with respect to their performance on segmenting test images showing folded substrates. The segmentation quality was determined by evaluating four different measures: misclassification error, modified Hausdorff distance, relative foreground area error and positive-negative false detection ratio. New normalization methods were proposed in order to combine all parameters into a unique algorithm evaluation rating. The segmentation algorithms were tested on images obtained by three different digitalisation methods covering four different surface textures. In addition, the methods were also tested on three images presenting a perfect fold. The obtained results showed that the Multi-Maximum Entropy algorithm is better suited for the analysis of images showing folded substrates.
A new entropy based method for computing software structural complexity
International Nuclear Information System (INIS)
Roca, Jose L.
2002-01-01
In this paper a new methodology for the evaluation of software structural complexity is described. It is based on the entropy of the random uniform response function associated with the so-called software characteristic function (SCF). The behavior of the SCF for different software structures and its relationship with the number of inherent errors is investigated, as is the way the entropy concept can be used to evaluate the complexity of a software structure, considering the SCF as a canonical representation of the graph associated with the control flow diagram. The functions, parameters and algorithms that allow this evaluation to be carried out are also introduced. This analytic phase is followed by an experimental phase that verifies the consistency of the proposed metric and its boundary conditions. The conclusion is that the degree of software structural complexity can be measured as the entropy of the random uniform response function of the SCF. That entropy is in direct relationship with the number of inherent software errors, and it implies a basic hazard failure rate, so that a minimal structure assures a certain stability and maturity of the program. This metric can be used either to evaluate the product or the process of software development, as a development tool or for monitoring the stability and quality of the final product. (author)
Directory of Open Access Journals (Sweden)
Shaofeng Xie
2017-01-01
Given the chaotic characteristics of landslide deformation time series, a new method based on modified ensemble empirical mode decomposition (MEEMD), approximate entropy and the weighted least squares support vector machine (WLS-SVM) is proposed. The method starts from time-frequency analysis of the chaotic sequence and improves model performance as follows: first, a deformation time series is decomposed into a series of subsequences of significantly different complexity using MEEMD. Then the approximate entropy method is used to combine subsequences of similar complexity into new subsequences, which effectively concentrates the component feature information and reduces the computational scale. Finally, a WLS-SVM prediction model is established for each new subsequence. Phase space reconstruction theory and the grid search method are used to select the input dimension and the optimal parameters of the model, and the superposition of the predicted values gives the final forecasting result. Taking the landslide deformation data of Danba as an example, experiments were carried out and compared with a wavelet neural network, a support vector machine, a least squares support vector machine and various combination schemes. The experimental results show that the algorithm has high prediction accuracy. It ensures a good prediction effect even in periods of rapidly fluctuating landslide deformation, and it can also better control the residual value and effectively reduce the error interval.
Efficiency of crude oil markets: Evidences from informational entropy analysis
International Nuclear Information System (INIS)
Ortiz-Cruz, Alejandro; Rodriguez, Eduardo; Ibarra-Valdez, Carlos; Alvarez-Ramirez, Jose
2012-01-01
The role of crude oil as the main energy source for the global economic activity has motivated the discussion about the dynamics and causes of crude oil price changes. An accurate understanding of the issue should provide important guidelines for the design of optimal policies and government budget planning. Using daily data for WTI over the period January 1986–March 2011, we analyze the evolution of the informational complexity and efficiency for the crude oil market through multiscale entropy analysis. The results indicated that the crude oil market is informationally efficient over the scrutinized period except for two periods that correspond to the early 1990s and late 2000s US recessions. Overall, the results showed that deregulation has improved the operation of the market in the sense of making returns less predictable. On the other hand, there is some evidence that the probability of having a severe US economic recession increases as the informational efficiency decreases, which indicates that returns from crude oil markets are less uncertain during economic downturns. - Highlights: ► Entropy concepts are used to characterize crude oil prices. ► An index of market efficiency is introduced. ► Except for periods of economic recession, the crude oil market is informationally efficient.
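Multiscale entropy analysis coarse-grains the series at several scales and computes sample entropy at each. A minimal sketch of that pipeline on synthetic data (the parameters m = 2 and r = 0.2·std are conventional choices, not taken from the paper, and the template counting is a simplified variant):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn: -log of the conditional probability that sequences similar
    for m points remain similar at m + 1 (Chebyshev distance, tolerance r*std)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.abs(templ[:, None, :] - templ[None, :, :]).max(-1)
        return ((d <= tol).sum() - len(templ)) / 2   # exclude self-matches
    B, A = count(m), count(m + 1)
    return -np.log(A / B)                             # requires A > 0

def multiscale_entropy(x, scales=(1, 2, 4)):
    out = []
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)  # coarse-graining
        out.append(sample_entropy(coarse))
    return out

rng = np.random.default_rng(2)
vals = multiscale_entropy(rng.normal(size=600))
print([round(v, 2) for v in vals])
```

High sample entropy across scales corresponds to unpredictable (informationally efficient) returns; a drop in entropy, as the paper reports around recessions, signals increased predictability.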
Zhao, Yong; Hong, Wen-Xue
2011-11-01
Fast, nondestructive and accurate identification of special-quality eggs is an urgent problem. The present paper proposes a new feature extraction method based on symbolic entropy to identify the near-infrared spectra of special-quality eggs. The authors selected normal eggs, free-range eggs, selenium-enriched eggs and zinc-enriched eggs as research objects and measured the near-infrared diffuse reflectance spectra in the range of 12 000-4 000 cm(-1). Raw spectra were symbolically represented with an aggregation approximation algorithm, and symbolic entropy was extracted as the feature vector. An error-correcting output codes multiclass support vector machine classifier was designed to identify the spectra. The symbolic entropy feature is robust to parameter changes, and the highest recognition rate reaches 100%. The results show that near-infrared identification of special-quality eggs is feasible and that symbolic entropy can be used as a new feature extraction method for near-infrared spectra.
International Nuclear Information System (INIS)
Sivia, D.S.; Hamilton, W.A.; Smith, G.S.
1991-01-01
The analysis of neutron reflectivity data to obtain nuclear scattering length density profiles is akin to the notorious phaseless Fourier problem, well known in many fields such as crystallography. Current methods of analysis culminate in the refinement of a few parameters of a functional model, and are often preceded by a long and laborious process of trial and error. We start by discussing the use of maximum entropy for obtaining 'free-form' solutions of the density profile, as an alternative to the trial-and-error phase when a functional model is not available. Next we consider a Bayesian spectral analysis approach, which is appropriate for optimising the parameters of a simple (but adequate) type of model when the number of parameters is not known. Finally, we suggest a novel experimental procedure, the analogue of astronomical speckle holography, designed to alleviate the ambiguity problems inherent in traditional reflectivity measurements. (orig.)
Directory of Open Access Journals (Sweden)
Prasad Radha K.
2017-09-01
This paper presents mathematical modelling and numerical analysis for entropy generation analysis (EGA), considering pressure drop and second-law efficiency, of forced convection heat transfer in the rectangular duct of a solar air heater with wire as artificial roughness in the form of an arc-shaped geometry on the absorber plate. The investigation includes evaluations of the entropy generation, entropy generation number, Bejan number and irreversibilities of roughened as well as smooth absorber plate solar air heaters to compare their relative performances. Furthermore, the effects of various roughness and operating parameters on entropy generation have also been investigated. Entropy generation and irreversibility (exergy destroyed) reach their minimum values at a relative roughness height of 0.0422 and a relative angle of attack of 0.33, which leads to the maximum exergetic efficiency. Entropy generation and exergy based analyses can be adopted for the evaluation of the overall performance of solar air heaters.
Entropy-Based Method of Choosing the Decomposition Level in Wavelet Threshold De-noising
Directory of Open Access Journals (Sweden)
Yan-Fang Sang
2010-06-01
In this paper, the energy distributions of various noises following normal, log-normal and Pearson-III distributions are first described quantitatively using the wavelet energy entropy (WEE), and the results are compared and discussed. Then, on the basis of these analytic results, a method for use in choosing the decomposition level (DL) in wavelet threshold de-noising (WTD) is put forward. Finally, the performance of the proposed method is verified by analysis of both synthetic and observed series. Analytic results indicate that the proposed method is easy to operate and suitable for various signals. Moreover, contrary to traditional white noise testing which depends on “autocorrelations”, the proposed method uses energy distributions to distinguish real signals and noise in noisy series, therefore the chosen DL is reliable, and the WTD results of time series can be improved.
Entropy in biomolecular simulations: A comprehensive review of atomic fluctuations-based methods.
Kassem, Summer; Ahmed, Marawan; El-Sheikh, Salah; Barakat, Khaled H
2015-11-01
Entropy of binding constitutes a major, and in many cases a detrimental, component of the binding affinity in biomolecular interactions. While the enthalpic part of the binding free energy is easier to calculate, estimating the entropy of binding is far more complicated. A precise evaluation of entropy requires a comprehensive exploration of the complete phase space of the interacting entities. As this task is extremely hard to accomplish in the context of conventional molecular simulations, calculating entropy has involved many approximations. Most of these gold-standard methods have focused on developing a reliable estimate of the conformational part of the entropy. Here, we review these methods with a particular emphasis on the different techniques that extract entropy from atomic fluctuations. The theoretical formalism behind each method is explained, highlighting its strengths as well as its limitations, followed by a description of a number of case studies for each method. We hope that this brief, yet comprehensive, review provides a useful tool for understanding these methods and realizing the practical issues that may arise in such calculations. Copyright © 2015 Elsevier Inc. All rights reserved.
A Roller Bearing Fault Diagnosis Method Based on LCD Energy Entropy and ACROA-SVM
Directory of Open Access Journals (Sweden)
HungLinh Ao
2014-01-01
This study investigates a novel method for roller bearing fault diagnosis based on local characteristic-scale decomposition (LCD) energy entropy, together with a support vector machine designed using an Artificial Chemical Reaction Optimisation Algorithm, referred to as an ACROA-SVM. First, the original acceleration vibration signals are decomposed into intrinsic scale components (ISCs). Second, the concept of LCD energy entropy is introduced. Third, the energy features extracted from a number of ISCs that contain the most dominant fault information serve as input vectors for the support vector machine classifier. Finally, the ACROA-SVM classifier is proposed to recognize the faulty roller bearing pattern. The analysis of roller bearing signals with inner-race and outer-race faults shows that the diagnostic approach based on the ACROA-SVM, using LCD to extract the energy levels of the various frequency bands as features, can identify roller bearing fault patterns accurately and effectively. The proposed method is superior to approaches based on the Empirical Mode Decomposition method and requires less time.
Entropy methods for reaction-diffusion equations: slowly growing a-priori bounds
Desvillettes, Laurent; Fellner, Klemens
2008-01-01
In the continuation of [Desvillettes, L., Fellner, K.: Exponential Decay toward Equilibrium via Entropy Methods for Reaction-Diffusion Equations. J. Math. Anal. Appl. 319 (2006), no. 1, 157-176], we study reversible reaction-diffusion equations via entropy methods (based on the free energy functional) for a 1D system of four species. We improve the existing theory by getting 1) almost exponential convergence in L1 to the steady state via a precise entropy-entropy dissipation estimate, 2) an explicit global L∞ bound via interpolation of a polynomially growing H1 bound with the almost exponential L1 convergence, and 3), finally, explicit exponential convergence to the steady state in all Sobolev norms.
Analysis of Entropy Generation in Flow of Methanol-Based Nanofluid in a Sinusoidal Wavy Channel
Directory of Open Access Journals (Sweden)
Muhammad Qasim
2017-10-01
The entropy generation due to heat transfer and fluid friction in mixed convective peristaltic flow of a methanol-Al2O3 nanofluid is examined. Maxwell's thermal conductivity model is used in the analysis. Velocity and temperature profiles are utilized in the computation of the entropy generation number. The effects of the involved physical parameters on velocity, temperature, entropy generation number, and Bejan number are discussed and explained graphically.
International Nuclear Information System (INIS)
Ye, Xuemin; Li, Chunxi
2013-01-01
As one of the most significant measures to improve energy utilization efficiency and save energy, cogeneration or combined heat and power (CHP) has been widely applied and promoted with positive motivations in many countries. A rational cost allocation model should indicate the performance of cogenerations and balance the benefits between electricity generation and heat production. Based on the second law of thermodynamics, the present paper proposes an entropy change method for cost allocation by choosing exhaust steam entropy as a datum point, and the new model works in conjunction with entropy change and irreversibility during energy conversion processes. The allocation ratios of heat cost with the present and existing methods are compared for different types of cogenerations. Results show that the allocation ratios with the entropy change method are more rational and the cost allocation model can make up some limitations involved in other approaches. The future energy policies and innovational directions for cogenerations and heat consumers should be developed. - Highlights: • A rational model of cogeneration cost allocation is established. • Entropy change method integrates the relation of entropy change and exergy losses. • The unity of measuring energy quality and quantity is materialized. • The benefits between electricity generation and heat production are balanced
Improved Ordinary Measure and Image Entropy Theory based intelligent Copy Detection Method
Directory of Open Access Journals (Sweden)
Dengpan Ye
2011-10-01
Full Text Available Nowadays, more and more multimedia websites appear in social networks, bringing security problems such as privacy violations, piracy, and disclosure of sensitive content. Aiming at copyright protection, copy detection technology for multimedia content has become a hot topic. In our previous work, a computer-based copyright control system for detecting media was proposed. Building on this system, this paper proposes an improved media feature matching measure and an entropy-based copy detection method. The Levenshtein distance is used to enhance the matching degree when features are compared during copy detection. For entropy-based copy detection, we fuse two features of the entropy matrix. First, we extract the entropy matrix of the image and normalize it. Then, we fuse the eigenvalue feature and the transfer matrix feature of the entropy matrix, and the fused features are used for image copy detection. Experiments show that, compared with using either feature alone, the fused-feature matching method is markedly more robust and effective. The fused feature achieves a high detection rate for copy images that have undergone attacks such as noise, compression, zooming, and rotation. Compared with the reference methods, the proposed method is more intelligent and achieves good performance.
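The Levenshtein distance used in the feature matching measure is the classic dynamic-programming edit distance; a minimal sketch (the `match_degree` normalization is an illustrative assumption, not the paper's exact measure):

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two sequences."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            cur[j] = min(prev[j] + 1,        # deletion
                         cur[j - 1] + 1,     # insertion
                         prev[j - 1] + cost) # substitution / match
        prev = cur
    return prev[n]

def match_degree(f1, f2):
    """Normalized similarity in [0, 1] between two feature strings."""
    d = levenshtein(f1, f2)
    return 1.0 - d / max(len(f1), len(f2), 1)

print(levenshtein("kitten", "sitting"))   # 3
print(match_degree("kitten", "sitting"))  # ~0.571
```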
Constructing a Measurement Method of Differences in Group Preferences Based on Relative Entropy
Directory of Open Access Journals (Sweden)
Shiyu Zhang
2017-01-01
Full Text Available In research and data analysis of differences in group preferences, conventional statistical methods cannot reflect the integrity of human preferences; in particular, it is difficult to exclude irrational human factors. This paper introduces a preference amount model based on relative entropy theory. A related expansion is made based on the characteristics of the questionnaire data, and we construct a parameter that measures overall differences in the data distributions of different groups. In this paper, this parameter is called the center distance, and it effectively reflects the preferences of human minds. Using survey data from securities market participants as an example, this paper analyzes differences in market participants’ attitudes toward the effectiveness of securities regulation. With this method, differences between groups that were overlooked by analysis of variance are found, as are certain aspects obscured by general data characteristics.
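The center distance is the authors' construct, but its relative entropy core is the standard Kullback-Leibler divergence between two groups' response distributions; a minimal sketch with hypothetical questionnaire data:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Relative entropy D(p || q), in nats, for discrete distributions.

    Counts are normalized; eps guards against log of zero.
    """
    p = np.asarray(p, float) / np.sum(p)
    q = np.asarray(q, float) / np.sum(q)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Hypothetical Likert-scale response frequencies for two participant groups
group_a = [10, 20, 40, 20, 10]
group_b = [30, 30, 20, 10, 10]
print(kl_divergence(group_a, group_b))  # > 0; zero only if the profiles coincide
```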
Unification of field theory and maximum entropy methods for learning probability densities
Kinney, Justin B.
2014-01-01
The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory.
Entropy Viscosity Method for High-Order Approximations of Conservation Laws
Guermond, J. L.
2010-09-17
A stabilization technique for conservation laws is presented. It introduces in the governing equations a nonlinear dissipation function of the residual of the associated entropy equation and bounded from above by a first order viscous term. Different two-dimensional test cases are simulated - a 2D Burgers problem, the "KPP rotating wave" and the Euler system - using high order methods: spectral elements or Fourier expansions. Details on the tuning of the parameters controlling the entropy viscosity are given. © 2011 Springer.
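For reference, the entropy viscosity of Guermond and co-workers is commonly written as follows (notation reconstructed from the standard presentation, not quoted from this abstract):

```latex
% Residual of the entropy pair (E, F) evaluated on the discrete solution u_h:
R_h = \partial_t E(u_h) + \nabla \cdot F(u_h)

% Entropy viscosity on each cell K of size h_K, capped by a first-order viscosity:
\nu_E\big|_K = \min\!\left( c_{\max}\, h_K\, \|u_h\|_{L^\infty(K)},\;
  c_E\, h_K^2\, \frac{\|R_h\|_{L^\infty(K)}}{\|E - \bar{E}\|_{L^\infty(\Omega)}} \right)
```

Here $c_{\max}$ and $c_E$ are the tuning parameters the abstract refers to: the first term is the classical first-order upwind-type bound, and the second is small wherever the entropy equation is well resolved, so dissipation concentrates near shocks.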
Directory of Open Access Journals (Sweden)
Xiao-ping Bai
2013-01-01
Full Text Available Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing process and steps: selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.
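The information entropy step of such an evaluation, deriving objective index weights from a score matrix, is commonly implemented as the entropy weight method; a sketch under that assumption (the matrix values are hypothetical, not the paper's case data):

```python
import numpy as np

def entropy_weights(X):
    """Objective index weights from the Shannon entropy of a decision matrix.

    X: (m schemes) x (n indexes), all entries positive, larger = better.
    An index whose scores vary more across schemes has lower entropy
    and therefore receives a larger weight.
    """
    P = X / X.sum(axis=0)                          # share of each scheme per index
    m = X.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(m)   # entropy per index, in [0, 1]
    d = 1.0 - E                                    # degree of divergence
    return d / d.sum()

# Hypothetical scores of 3 schemes on cost, progress, quality, safety
X = np.array([[0.9, 0.7, 0.8, 0.6],
              [0.5, 0.9, 0.7, 0.8],
              [0.7, 0.8, 0.9, 0.9]])
w = entropy_weights(X)
print(w, w.sum())  # nonnegative weights summing to 1
```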
Maximum entropy methods for extracting the learned features of deep neural networks.
Finnegan, Alex; Song, Jun S
2017-10-01
New architectures of multilayer artificial neural networks and new methods for training them are rapidly revolutionizing the application of machine learning in diverse fields, including business, social science, physical sciences, and biology. Interpreting deep neural networks, however, currently remains elusive, and a critical challenge lies in understanding which meaningful features a network is actually learning. We present a general method for interpreting deep neural networks and extracting network-learned features from input data. We describe our algorithm in the context of biological sequence analysis. Our approach, based on ideas from statistical physics, samples from the maximum entropy distribution over possible sequences, anchored at an input sequence and subject to constraints implied by the empirical function learned by a network. Using our framework, we demonstrate that local transcription factor binding motifs can be identified from a network trained on ChIP-seq data and that nucleosome positioning signals are indeed learned by a network trained on chemical cleavage nucleosome maps. Imposing a further constraint on the maximum entropy distribution also allows us to probe whether a network is learning global sequence features, such as the high GC content in nucleosome-rich regions. This work thus provides valuable mathematical tools for interpreting and extracting learned features from feed-forward neural networks.
Teschendorff, Andrew E; Sollich, Peter; Kuehn, Reimer
2014-06-01
A key challenge in systems biology is the elucidation of the underlying principles, or fundamental laws, which determine the cellular phenotype. Understanding how these fundamental principles are altered in diseases like cancer is important for translating basic scientific knowledge into clinical advances. While significant progress is being made, with the identification of novel drug targets and treatments by means of systems biological methods, our fundamental systems level understanding of why certain treatments succeed and others fail is still lacking. We here advocate a novel methodological framework for systems analysis and interpretation of molecular omic data, which is based on statistical mechanical principles. Specifically, we propose the notion of cellular signalling entropy (or uncertainty), as a novel means of analysing and interpreting omic data, and more fundamentally, as a means of elucidating systems-level principles underlying basic biology and disease. We describe the power of signalling entropy to discriminate cells according to differentiation potential and cancer status. We further argue the case for an empirical cellular entropy-robustness correlation theorem and demonstrate its existence in cancer cell line drug sensitivity data. Specifically, we find that high signalling entropy correlates with drug resistance and further describe how entropy could be used to identify the Achilles' heels of cancer cells. In summary, signalling entropy is a deep and powerful concept, based on rigorous statistical mechanical principles, which, with improved data quality and coverage, will allow a much deeper understanding of the systems biological principles underlying normal and disease physiology. Copyright © 2014 Elsevier Inc. All rights reserved.
Directory of Open Access Journals (Sweden)
Rashidi Mohammad Mehdi
2015-01-01
Full Text Available A similarity solution of the equations of the revised Cheng-Minkowycz problem for natural convective boundary-layer flow of a nanofluid through a porous medium yields, via an analytical method, a system of non-linear partial differential equations, which are solved by the optimal homotopy analysis method. The effects of various parameters on the fluid flow and heat transfer characteristics have been analyzed, and very good agreement is observed between the obtained results and numerical ones. The entropy generation has been derived and a comprehensive parametric analysis performed. Each component of the entropy generation has been analyzed separately, and the contribution of each to the total entropy generation has been determined. It is found that the entropy generation, an important aspect for industrial applications, is affected by various parameters, which should be controlled to minimize it.
International Nuclear Information System (INIS)
Gutierrez, Rafael M.; Useche, Gina M.; Buitrago, Elias
2007-01-01
We present a procedure developed to detect stochastic and deterministic information contained in empirical time series, useful for characterizing and modelling different aspects of complex phenomena represented by such data. This procedure is applied to a seismological time series to obtain new information for studying and understanding geological phenomena. We use concepts and methods from nonlinear dynamics and maximum entropy. The method allows an optimal analysis of the available information.
Ronglian, Yuan; Mingye, Ai; Qiaona, Jia; Yuxuan, Liu
2018-03-01
Sustainable development is the only way forward for human society. As an important part of the national economy, the steel industry is energy-intensive and must go further toward sustainable development. In this paper, we use the entropy method and the TOPSIS method to evaluate the development of China’s steel industry during the “12th Five-Year Plan” from four aspects: resource utilization efficiency, main energy and material consumption, pollution status, and resource reuse rate. We also put forward some suggestions for the development of China’s steel industry.
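The TOPSIS step of such an evaluation ranks each year by its closeness to an ideal solution; a sketch under that assumption (the indicator values and weights below are hypothetical, not the paper's data):

```python
import numpy as np

def topsis(X, w, benefit):
    """Rank alternatives by relative closeness to the ideal solution.

    X: alternatives x criteria; w: weights summing to 1;
    benefit: True where larger is better, False for cost criteria.
    """
    N = X / np.linalg.norm(X, axis=0)      # vector-normalized decision matrix
    V = N * w                              # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - worst, axis=1)
    return d_minus / (d_plus + d_minus)    # closeness coefficient in [0, 1]

# Hypothetical yearly indicators: energy use (cost), emissions (cost),
# recycling rate (benefit) for three successive years
X = np.array([[580.0, 2.1, 0.60],
              [560.0, 1.9, 0.65],
              [530.0, 1.6, 0.72]])
scores = topsis(X, np.array([0.4, 0.3, 0.3]), np.array([False, False, True]))
print(scores.argmax())  # → 2: the last year dominates on every criterion
```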
ENTROPY FLOW CHARACTERISTICS ANALYSIS OF TYPHOON MATSA (0509)
Institute of Scientific and Technical Information of China (English)
XU Hui; LIU Chong-jian
2008-01-01
The evolution of Typhoon Matsa (0509) is examined in terms of entropy flow through an entropy balance equation derived from the Gibbs relation, according to the second law of thermodynamics. The entropy flows in the significant stages of its evolution (genesis, development and decay) are diagnosed based on the outputs of the PSU/NCAR mesoscale model (known as MM5). The results show that: (1) the vertical spatial distribution of entropy flow for Matsa is characterized by a predominantly negative entropy flow in a large portion of the troposphere and a positive flow in the upper levels; (2) the fields of entropy flow in the middle troposphere (500 hPa) show that the growth of the typhoon depends greatly on negative entropy flows from its surroundings; and (3) the simulated centres of heavy rainfall associated with the typhoon match well with the zones of large negative entropy flow, suggesting that they may be a significant indicator of severe weather events.
The use of entropy index for gender inequality analysis
African Journals Online (AJOL)
DJFLEX
years old; the share of females in wage employment in the non-agricultural ... have greater access to education than females and the gap in access ... (1) is referred to as Shannon entropy (see Ciuperca and Girardin, 2005). Shannon entropy has the property of symmetry i.e. the measure is unchanged if the outcomes i.
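The symmetry property of Shannon entropy cited in (1), that the measure is unchanged under any permutation of the outcomes, can be verified directly; a minimal sketch (the shares are illustrative):

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum p_i log p_i, in nats; zero-probability terms contribute 0."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Symmetry: permuting the outcome shares leaves the entropy unchanged
shares = [0.7, 0.2, 0.1]                  # e.g. shares of a labour-market category
print(shannon_entropy(shares))
print(shannon_entropy([0.1, 0.7, 0.2]))   # identical value
```

Maximum entropy is reached at equal shares, which is why the index is a natural yardstick for (in)equality of access.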
Symmetry Analysis of Gait between Left and Right Limb Using Cross-Fuzzy Entropy
Directory of Open Access Journals (Sweden)
Yi Xia
2016-01-01
Full Text Available The purpose of this paper is the investigation of the gait symmetry problem by using cross-fuzzy entropy (C-FuzzyEn), a recently proposed cross entropy with many merits compared to the frequently used cross sample entropy (C-SampleEn). First, we used several simulated signals to test its performance regarding relative consistency and dependence on data length. Second, the gait time series of the left and right stride intervals were used to calculate the C-FuzzyEn values for gait symmetry analysis. Besides the statistical analysis, we also implemented a support vector machine (SVM) classifier to perform the classification of normal and abnormal gaits. The gait dataset consists of 15 patients with Parkinson’s disease (PD) and 16 control (CO) subjects. The results show that the C-FuzzyEn values of the PD patients’ gait are significantly higher than those of the CO subjects, with a p value of less than 10^-5, and the best classification performance, evaluated by leave-one-out (LOO) cross-validation, is an accuracy of 96.77%. Such encouraging results imply that the C-FuzzyEn-based gait symmetry measure is a suitable tool for analyzing abnormal gaits.
Unification of field theory and maximum entropy methods for learning probability densities
Kinney, Justin B.
2015-09-01
The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
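A discrete toy version of maximum entropy estimation conveys the idea: under a single moment constraint the solution is an exponential family, here found by bisection on the Lagrange multiplier (this is a generic illustration, not the paper's field-theory machinery or its released software):

```python
import numpy as np

def maxent_given_mean(x, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Max-entropy distribution on support x subject to a fixed mean.

    The maximizer is the exponential family p_i ∝ exp(lam * x_i);
    lam is located by bisection, since the implied mean is monotone in lam.
    """
    def mean_for(lam):
        w = np.exp(lam * (x - x.max()))   # shift exponent for numerical stability
        p = w / w.sum()
        return (p * x).sum(), p

    for _ in range(200):
        mid = 0.5 * (lo + hi)
        m, p = mean_for(mid)
        if abs(m - target_mean) < tol:
            break
        if m < target_mean:
            lo = mid
        else:
            hi = mid
    return p

x = np.linspace(0.0, 1.0, 101)
p = maxent_given_mean(x, 0.5)
print(p @ x)  # ≈ 0.5; for this symmetric target the solution is uniform
```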
The entropy dissipation method for spatially inhomogeneous reaction-diffusion-type systems
Di Francesco, M.
2008-12-08
We study the long-time asymptotics of reaction-diffusion-type systems that feature a monotone decaying entropy (Lyapunov, free energy) functional. We consider both bounded domains and confining potentials on the whole space for arbitrary space dimensions. Our aim is to derive quantitative expressions for (or estimates of) the rates of convergence towards an (entropy minimizing) equilibrium state in terms of the constants of diffusion and reaction and with respect to conserved quantities. Our method, the so-called entropy approach, seeks to quantify convergence to equilibrium by using functional inequalities, which relate quantitatively the entropy and its dissipation in time. The entropy approach is well suited to nonlinear problems and known to be quite robust with respect to model variations. It has already been widely applied to scalar diffusion-convection equations, and the main goal of this paper is to study its generalization to systems of partial differential equations that contain diffusion and reaction terms and admit fewer conservation laws than the size of the system. In particular, we successfully apply the entropy approach to general linear systems and to a nonlinear example of a reaction-diffusion-convection system arising in solid-state physics as a paradigm for general nonlinear systems. © 2008 The Royal Society.
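The core of the entropy approach sketched above is the entropy-entropy dissipation inequality; in its simplest form (a standard formulation of the method, not quoted verbatim from the paper):

```latex
\frac{d}{dt} E(u(t)) = -D(u(t)) \le -\lambda\, E(u(t))
\quad\Longrightarrow\quad
E(u(t)) \le E(u_0)\, e^{-\lambda t},
```

after which a Csiszár-Kullback type inequality, $\|u(t)-u_\infty\|_{L^1}^2 \le C\, E(u(t))$, converts the exponential decay of the entropy into explicit convergence of the state toward the equilibrium $u_\infty$ at rate $\lambda/2$.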
Directory of Open Access Journals (Sweden)
Milivoje M. Kostic
2016-01-01
Full Text Available There is a growing trend in recently-submitted manuscripts and publications to present calculated results of entropy generation, also known as entropy production, as field quantities in a system or device control volume, based on prior calculation of velocity and temperature fields, frequently using CFD numerical methods. [...
Namazi, Hamidreza; Akrami, Amin; Nazeri, Sina; Kulish, Vladimir V
2016-01-01
An important challenge in brain research is to determine the relation between the features of olfactory stimuli and the electroencephalogram (EEG) signal. Yet, no relation between the structures of olfactory stimuli and the EEG signal has previously been discovered. This study investigates the relation between the structure of the EEG signal and the olfactory stimulus (odorant). We show that the complexity of the EEG signal is coupled with the molecular complexity of the odorant: a more structurally complex odorant produces a less fractal EEG signal, and an odorant with higher entropy causes the EEG signal to have lower approximate entropy. The method discussed here can also be applied to and investigated in patients with brain diseases for rehabilitation purposes.
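The approximate entropy referred to above is Pincus' ApEn; a compact sketch (the signals below are synthetic stand-ins, not EEG recordings):

```python
import numpy as np

def approximate_entropy(x, m=2, r=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D series (Pincus' definition).

    r is given as a fraction of the series' standard deviation.
    """
    x = np.asarray(x, float)
    tol = r * x.std()

    def phi(mm):
        n = len(x) - mm + 1
        templates = np.array([x[i:i + mm] for i in range(n)])
        # Chebyshev distance between all pairs of templates
        d = np.abs(templates[:, None, :] - templates[None, :, :]).max(axis=2)
        c = (d <= tol).mean(axis=1)   # self-matches included, as in the original
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 8 * np.pi, 300))   # predictable signal
noisy = rng.standard_normal(300)                   # erratic signal
print(approximate_entropy(regular) < approximate_entropy(noisy))  # True
```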
Towards an entropy-based analysis of log variability
DEFF Research Database (Denmark)
Back, Christoffer Olling; Debois, Søren; Slaats, Tijs
2017-01-01
the development of hybrid miners: given a (sub-)log, can we determine a priori whether the log is best suited for imperative or declarative mining? We propose using the concept of entropy, commonly used in information theory. We consider different measures for entropy that could be applied and show through...... experimentation on both synthetic and real-life logs that these entropy measures do indeed give insights into the complexity of the log and can act as an indicator of which mining paradigm should be used....
Towards an Entropy-based Analysis of Log Variability
DEFF Research Database (Denmark)
Back, Christoffer Olling; Debois, Søren; Slaats, Tijs
2018-01-01
the development of hybrid miners: given a log, can we determine a priori whether the log is best suited for imperative or declarative mining? We propose using the concept of entropy, commonly used in information theory. We consider different measures for entropy that could be applied and show through...... experimentation on both synthetic and real-life logs that these entropy measures do indeed give insights into the complexity of the log and can act as an indicator of which mining paradigm should be used....
Symbolic transfer entropy-based premature signal analysis
International Nuclear Information System (INIS)
Wang Jun; Yu Zheng-Feng
2012-01-01
In this paper, we use symbolic transfer entropy to study the coupling strength between premature signals. Numerical experiments show that the three types of signal couplings are in the same direction. Among them, the normal signal coupling is the strongest, followed by that of premature ventricular contractions, with that of atrial premature beats the weakest. A t-test shows that the entropies of the three signals are distinct. Symbolic transfer entropy requires less data, can distinguish the three types of signals, and has very good computational efficiency. (interdisciplinary physics and related areas of science and technology)
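Symbolic transfer entropy is conventionally computed by symbolizing each series with ordinal patterns and estimating transfer entropy on the symbol sequences; a sketch under that assumption (the coupled test signals are synthetic, not cardiac data):

```python
import numpy as np
from collections import Counter

def symbolize(x, dim=3):
    """Map each length-`dim` window to its ordinal (permutation) pattern."""
    x = np.asarray(x, float)
    return [tuple(np.argsort(x[i:i + dim])) for i in range(len(x) - dim + 1)]

def symbolic_transfer_entropy(x, y, dim=3):
    """Estimate T_{Y->X} in bits from ordinal symbol sequences."""
    sx, sy = symbolize(x, dim), symbolize(y, dim)
    n = len(sx) - 1
    triples = Counter((sx[t + 1], sx[t], sy[t]) for t in range(n))
    pairs_xx = Counter((sx[t + 1], sx[t]) for t in range(n))
    pairs_xy = Counter((sx[t], sy[t]) for t in range(n))
    singles = Counter(sx[t] for t in range(n))
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        # p(x1|x0,y0) / p(x1|x0) = c * singles[x0] / (pairs_xy * pairs_xx)
        te += (c / n) * np.log2(c * singles[x0] /
                                (pairs_xy[(x0, y0)] * pairs_xx[(x1, x0)]))
    return te

rng = np.random.default_rng(1)
y = rng.standard_normal(2000)
x = np.roll(y, 1) + 0.1 * rng.standard_normal(2000)   # x is driven by the past of y
print(symbolic_transfer_entropy(x, y) > symbolic_transfer_entropy(y, x))  # True
```

The asymmetry of the two estimates identifies the direction of coupling, which is how the abstract's coupling-direction statements are obtained.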
An entropy-based improved k-top scoring pairs (TSP) method for ...
African Journals Online (AJOL)
An entropy-based improved k-top scoring pairs (TSP) (Ik-TSP) method was presented in this study for the classification and prediction of human cancers based on gene-expression data. We compared Ik-TSP classifiers with 5 different machine learning methods and the k-TSP method based on 3 different feature selection ...
Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics
Abe, Sumiyoshi
2014-11-01
The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.
A Dynamic and Adaptive Selection Radar Tracking Method Based on Information Entropy
Directory of Open Access Journals (Sweden)
Ge Jianjun
2017-12-01
Full Text Available Nowadays, the battlefield environment has become much more complex and variable. Based on the principle of information entropy, this paper presents a quantitative measure, with a lower bound, of the amount of target information acquired from multiple radar observations, used to adaptively and dynamically organize the detection resources of the battlefield. Furthermore, a method is proposed that dynamically and adaptively selects radars with a high amount of information for target tracking by minimizing the lower bound of the information entropy of the target measurement at every moment. The simulation results indicate that the proposed method yields higher tracking accuracy than tracking without entropy-based adaptive radar selection.
Entropy-based model for miRNA isoform analysis.
Directory of Open Access Journals (Sweden)
Shengqin Wang
Full Text Available MiRNAs have been widely studied due to their important post-transcriptional regulatory roles in gene expression. Many reports have demonstrated evidence of miRNA isoform products (isomiRs) in high-throughput small RNA sequencing data. However, the biological function of these molecules is still not well investigated. Here, we developed a Shannon entropy-based model to estimate isomiR expression profiles from high-throughput small RNA sequencing data extracted from the miRBase webserver. By using the Kolmogorov-Smirnov statistical test (KS test), we demonstrated that the 5p and 3p miRNAs present more variants than the single-arm miRNAs. We also found that the isomiR variants, except the 3' isomiR variant, are strongly correlated with the Minimum Free Energy (MFE) of the pre-miRNA, suggesting that intrinsic features of the pre-miRNA are among the important factors in miRNA regulation. The functional enrichment analysis showed that the miRNAs with high variation, particularly at the 5' end, are enriched in a set of critical functions, supporting the view that these molecules are not randomly produced. Our results provide a probabilistic framework for miRNA isoform analysis and give functional insights into pre-miRNA processing.
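The Shannon entropy at the core of such a model can be sketched on an isomiR read-count profile; the counts below are hypothetical, not miRBase data:

```python
import numpy as np

def isomir_entropy(counts):
    """Shannon entropy (bits) of an isomiR read-count profile.

    Low entropy: reads concentrate on one dominant isoform;
    high entropy: processing is heterogeneous across variants.
    """
    p = np.asarray(counts, float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())

# Hypothetical read counts over the isoforms of two miRNA loci
dominant = [980, 10, 5, 5]        # one canonical product dominates
variable = [300, 250, 250, 200]   # many 5'/3' variants
print(isomir_entropy(dominant) < isomir_entropy(variable))  # True
```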
Dynamical glucometry: Use of multiscale entropy analysis in diabetes
Costa, Madalena D.; Henriques, Teresa; Munshi, Medha N.; Segal, Alissa R.; Goldberger, Ary L.
2014-09-01
Diabetes mellitus (DM) is one of the world's most prevalent medical conditions. Contemporary management focuses on lowering mean blood glucose values toward a normal range, but largely ignores the dynamics of glucose fluctuations. We probed analyte time series obtained from continuous glucose monitor (CGM) sensors. We show that the fluctuations in CGM values sampled every 5 min are not uncorrelated noise. Next, using multiscale entropy analysis, we quantified the complexity of the temporal structure of the CGM time series from a group of elderly subjects with type 2 DM and age-matched controls. We further probed the structure of these CGM time series using detrended fluctuation analysis. Our findings indicate that the dynamics of glucose fluctuations from control subjects are more complex than those of subjects with type 2 DM over time scales ranging from about 5 min to 5 h. These findings support consideration of a new framework, dynamical glucometry, to guide mechanistic research and to help assess and compare therapeutic interventions, which should enhance complexity of glucose fluctuations and not just lower mean and variance of blood glucose levels.
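Multiscale entropy is conventionally computed as the sample entropy of coarse-grained copies of the series; a sketch under that assumption (a common SampEn variant, applied to synthetic signals rather than CGM data):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.15):
    """SampEn(m, r): negative log of the conditional template-match probability.

    A common variant: templates of lengths m and m+1, self-matches excluded,
    tolerance r times the standard deviation of the series.
    """
    x = np.asarray(x, float)
    tol = r * x.std()

    def matches(mm):
        n = len(x) - mm + 1
        t = np.array([x[i:i + mm] for i in range(n)])
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)  # Chebyshev distance
        np.fill_diagonal(d, np.inf)                            # drop self-matches
        return (d <= tol).sum()

    b, a = matches(m), matches(m + 1)
    return float(-np.log(a / b)) if a > 0 else float("inf")

def multiscale_entropy(x, scales=(1, 2, 3, 4)):
    """Coarse-grain by non-overlapping averaging, then SampEn at each scale."""
    x = np.asarray(x, float)
    out = []
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse))
    return out

rng = np.random.default_rng(3)
noise = rng.standard_normal(1200)                 # erratic series
sine = np.sin(np.linspace(0, 24 * np.pi, 1200))   # highly regular series
print(multiscale_entropy(noise))
print(multiscale_entropy(sine))
```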
Entropy for Mechanically Vibrating Systems
Tufano, Dante
, which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy under a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled. In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.
Analysis of entropy models with equality and inequality constraints
Energy Technology Data Exchange (ETDEWEB)
Jefferson, T R; Scott, C H
1979-06-01
Entropy models are emerging as valuable tools in the study of various social problems of spatial interaction. With the development of the modeling has come diversity. Increased flexibility in the model can be obtained by allowing certain constraints to be relaxed from equality to inequality. To provide a better understanding of these entropy models, they are analyzed by geometric programming. Dual mathematical programs and algorithms are obtained. 7 references.
Variational method for the minimization of entropy generation in solar cells
Smit, S.; Kessels, W.M.M.
2015-01-01
In this work, a method is presented to extend traditional solar cell simulation tools to make it possible to calculate the most efficient design of practical solar cells. The method is based on the theory of nonequilibrium thermodynamics, which is used to derive an expression for the local entropy
Parameterized entropy analysis of EEG following hypoxic-ischemic brain injury
International Nuclear Information System (INIS)
Tong Shanbao; Bezerianos, Anastasios; Malhotra, Amit; Zhu Yisheng; Thakor, Nitish
2003-01-01
In the present study, Tsallis and Renyi entropy methods were used to study the electrical activity of the brain following hypoxic-ischemic (HI) injury. We investigated the performance of these parameterized information measures in describing the electroencephalogram (EEG) signal in a controlled experimental animal model of HI injury. The results show that (a) compared with Shannon and Renyi entropy, the parameterized Tsallis entropy acts like a spatial filter, and the information rate can tune either to long-range rhythms or to short abrupt changes, such as bursts or spikes at the beginning of recovery, via the entropic index q; and (b) Renyi entropy is a compact and predictive indicator for monitoring physiological changes during recovery from brain injury. There is a reduction in the Renyi entropy after brain injury, followed by a gradual recovery upon resuscitation.
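The parameterized entropies used in the study have standard definitions, both reducing to Shannon entropy as q → 1; a minimal sketch with an illustrative histogram (not the EEG data of the paper):

```python
import numpy as np

def tsallis_entropy(p, q):
    """S_q = (1 - sum p_i^q) / (q - 1); tends to Shannon entropy as q -> 1."""
    p = np.asarray(p, float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-(p * np.log(p)).sum())
    return float((1.0 - (p ** q).sum()) / (q - 1.0))

def renyi_entropy(p, q):
    """H_q = log(sum p_i^q) / (1 - q); tends to Shannon entropy as q -> 1."""
    p = np.asarray(p, float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-(p * np.log(p)).sum())
    return float(np.log((p ** q).sum()) / (1.0 - q))

# A spiky amplitude histogram vs. a flat one: q > 1 emphasizes dominant bins
flat = np.full(8, 1 / 8)
spiky = np.array([0.86] + [0.02] * 7)
print(renyi_entropy(flat, 2.0), renyi_entropy(spiky, 2.0))
```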
Music viewed by its entropy content: A novel window for comparative analysis.
Directory of Open Access Journals (Sweden)
Gerardo Febres
Full Text Available Polyphonic music files were analyzed using the set of symbols that produced the Minimal Entropy Description, which we call the Fundamental Scale. This allowed us to create a novel space to represent music pieces by developing: (a) a method to adjust a textual description from its original scale of observation to an arbitrarily selected scale, (b) a method to model the structure of any textual description based on the shape of the symbol frequency profiles, and (c) the concept of higher order entropy as the entropy associated with the deviations of a frequency-ranked symbol profile from a perfect Zipfian profile. We call this diversity index the '2nd Order Entropy'. Applying these methods to a variety of musical pieces showed how the space of 'symbolic specific diversity-entropy' and that of '2nd order entropy' capture characteristics that are unique to each music type, style, composer and genre. Some clustering of these properties around each musical category is shown. These methods allow us to visualize a historical trajectory of academic music across this space, from medieval to contemporary academic music. We show that the description of musical structures using entropy, symbol frequency profiles and specific symbolic diversity allows us to characterize traditional and popular expressions of music. These classification techniques promise to be useful in other disciplines for pattern recognition and machine learning.
Music viewed by its entropy content: A novel window for comparative analysis.
Febres, Gerardo; Jaffe, Klaus
2017-01-01
Polyphonic music files were analyzed using the set of symbols that produced the Minimal Entropy Description, which we call the Fundamental Scale. This allowed us to create a novel space to represent music pieces by developing: (a) a method to adjust a textual description from its original scale of observation to an arbitrarily selected scale, (b) a method to model the structure of any textual description based on the shape of the symbol frequency profiles, and (c) the concept of higher order entropy as the entropy associated with the deviations of a frequency-ranked symbol profile from a perfect Zipfian profile. We call this diversity index the '2nd Order Entropy'. Applying these methods to a variety of musical pieces showed how the space of 'symbolic specific diversity-entropy' and that of '2nd order entropy' capture characteristics that are unique to each music type, style, composer and genre. Some clustering of these properties around each musical category is shown. These methods allow us to visualize a historical trajectory of academic music across this space, from medieval to contemporary academic music. We show that the description of musical structures using entropy, symbol frequency profiles and specific symbolic diversity allows us to characterize traditional and popular expressions of music. These classification techniques promise to be useful in other disciplines for pattern recognition and machine learning.
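The core idea of measuring deviation from a perfect Zipfian profile can be sketched as follows. The function `profile_deviation` below is an illustrative stand-in for the authors' '2nd Order Entropy' (their exact definition may differ); it compares a frequency-ranked symbol profile against an ideal Zipf law:

```python
import numpy as np
from collections import Counter

def rank_profile(text):
    """Frequency-ranked relative-frequency profile of the text's symbols."""
    counts = np.array(sorted(Counter(text).values(), reverse=True), dtype=float)
    return counts / counts.sum()

def zipf_profile(n, s=1.0):
    """Ideal Zipfian profile with exponent s over n ranks."""
    ranks = np.arange(1, n + 1, dtype=float)
    p = ranks ** (-s)
    return p / p.sum()

def profile_deviation(text):
    """Illustrative stand-in for the paper's '2nd Order Entropy': total
    absolute deviation of the observed rank profile from a perfect Zipf law."""
    p = rank_profile(text)
    return float(np.abs(p - zipf_profile(len(p))).sum())

# A symbol stream whose rank profile happens to be exactly Zipfian deviates by 0.
print(profile_deviation("aab"), profile_deviation("aaab"))
```

In the full method the "symbols" are the words of the Fundamental Scale found for each piece rather than raw characters.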
Directory of Open Access Journals (Sweden)
Maman Abdurohman
2017-12-01
Full Text Available This research proposes a new method to enhance detection of Distributed Denial of Service (DDoS) attacks in a Software Defined Network (SDN) environment. It utilizes the OpenFlow controller of the SDN for DDoS attack detection using a modified, entropy-based method. The new method checks whether the traffic is normal traffic or a DDoS attack by measuring the randomness of the packets. It consists of two steps: detecting the attack and checking the entropy. The results show that the new method reduces false positives when there is a temporary, sudden increase in normal traffic, which it correctly does not flag as a DDoS attack. Compared to previous methods, the proposed method enhances DDoS attack detection in an SDN environment.
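The randomness check at the heart of such entropy-based detectors can be sketched as below. The window size, the entropy threshold, and the choice of destination IP as the measured field are assumptions for illustration, not values from the paper:

```python
import math
from collections import Counter

def window_entropy(dst_ips):
    """Shannon entropy (bits) of destination-IP frequencies in a packet window."""
    counts = Counter(dst_ips)
    n = len(dst_ips)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def is_ddos(dst_ips, threshold=1.0):
    """Flag the window when its entropy drops below a tuned threshold:
    an attack concentrates traffic on few destinations, lowering randomness.
    (The threshold here is illustrative, not from the paper.)"""
    return window_entropy(dst_ips) < threshold

normal = ["10.0.0.%d" % (i % 16) for i in range(64)]   # spread-out traffic
attack = ["10.0.0.1"] * 60 + ["10.0.0.2"] * 4          # concentrated on a victim
print(is_ddos(normal), is_ddos(attack))  # prints: False True
```

A sudden but well-spread surge of normal traffic keeps the entropy high, which is what lets this style of detector avoid the false positives the abstract mentions.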
Czech Academy of Sciences Publication Activity Database
Palatinus, Lukáš; Amami, M.; van Smaalen, S.
2004-01-01
Roč. 60, - (2004), s. 127-137 ISSN 0108-7681 Grant - others:DFG(DE) XX Institutional research plan: CEZ:AV0Z1010914 Keywords : incommensurate modulation * superspace * maximum entropy method Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 5.418, year: 2004
Incommensurate modulations made visible by the Maximum Entropy Method in superspace
Czech Academy of Sciences Publication Activity Database
Palatinus, Lukáš; van Smaalen, S.
2004-01-01
Roč. 219, - (2004), s. 719-729 ISSN 0044-2968 Grant - others:DFG(DE) XX Institutional research plan: CEZ:AV0Z1010914 Keywords : Maximum Entropy Method * modulated structures * charge density Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.390, year: 2004
The prior-derived F constraints in the maximum-entropy method
Czech Academy of Sciences Publication Activity Database
Palatinus, Lukáš; van Smaalen, S.
2005-01-01
Roč. 61, - (2005), s. 363-372 ISSN 0108-7673 Institutional research plan: CEZ:AV0Z10100521 Keywords : charge density * maximum-entropy method * sodium nitrite Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.791, year: 2005
The generalized F constraint in the maximum-entropy method - a study on simulated data
Czech Academy of Sciences Publication Activity Database
Palatinus, Lukáš; van Smaalen, S.
2002-01-01
Roč. 58, - (2002), s. 559-567 ISSN 0108-7673 Grant - others:DFG(DE) XX Institutional research plan: CEZ:AV0Z1010914 Keywords : maximum-entropy method * electron density * oxalic acid Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.417, year: 2002
Neurodevelopment in newborns: a sample entropy analysis of electroencephalogram
International Nuclear Information System (INIS)
Zhang, Dandan; Ding, Haiyan; Liu, Yunfeng; Ding, Haishu; Zhou, Congle; Ye, Datian
2009-01-01
The present paper investigates the neural ontogeny of newborns in terms of electroencephalogram (EEG) complexity during active sleep (AS) and quiet sleep (QS). Sample entropy (SampEn) is applied to EEG recordings from 168 newborns with postmenstrual age (PMA) ranging from 25 to 60 weeks. The relationship between neurodevelopment and PMA is then explored through statistical analysis of the median and interquartile range of the SampEn curves. It is found that the SampEn of the EEG during AS is higher than that during QS. SampEn increases during both AS and QS before about 42 weeks PMA, whereas it ceases to increase in QS and even decreases in AS after newborns reach term age. A distinct decrease in the interquartile range of SampEn is found with increasing PMA (from 25 to about 50 weeks), followed by maintenance of low fluctuation in the SampEn curves. This study sets the stage for an exhaustive investigation of the SampEn of EEG during brain maturation in newborns, and SampEn in sleep EEG may prove a useful parameter against which delays and aberrations in brain maturation can be tested. The SampEn changes during brain maturation also offer functional clues about neurodevelopment, on which further explorations could build. The significance of this paper is the discovery of the decrease in EEG complexity after newborns reach term. Although some potential neurophysiologic explanations are given, this new finding requires further study. In addition, the fluctuation of EEG complexity is analyzed for the first time, which helps in understanding EEG maturation in neurodevelopment.
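Sample entropy itself admits a compact definition. The sketch below is a minimal O(N²) implementation with the standard parameters m = 2 and r = 0.2·SD; production EEG pipelines use optimized variants, and minor counting conventions differ between implementations:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) = -ln(A/B), where B counts template
    matches of length m and A matches of length m + 1, with self-matches
    excluded. r defaults to 0.2 * std(x), a common choice."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def count_matches(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)  # Chebyshev
            c += int(np.sum(d <= r))
        return c
    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 8 * np.pi, 300))  # predictable signal
noisy = rng.standard_normal(300)                  # irregular signal
print(sample_entropy(regular) < sample_entropy(noisy))  # regular signal: lower SampEn
```

Higher SampEn indicates a less predictable signal, which is why it can track the complexity changes in neonatal EEG described above.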
Xie, Hong-Bo; Guo, Jing-Yi; Zheng, Yong-Ping
2010-04-01
In the present contribution, a complexity measure is proposed to assess surface electromyography (EMG) in the study of muscle fatigue during sustained, isometric muscle contractions. Approximate entropy (ApEn) is believed to provide quantitative information about the complexity of experimental data that is often corrupted with noise, short data length, and in many cases, has inherent dynamics that exhibit both deterministic and stochastic behaviors. We developed an improved ApEn measure, i.e., fuzzy approximate entropy (fApEn), which utilizes the fuzzy membership function to define the vectors' similarity. Tests were conducted on independent, identically distributed (i.i.d.) Gaussian and uniform noises, a chirp signal, MIX processes, Rossler equation, and Henon map. Compared with the standard ApEn, the fApEn showed better monotonicity, relative consistency, and more robustness to noise when characterizing signals with different complexities. Performance analysis on experimental EMG signals demonstrated that the fApEn significantly decreased during the development of muscle fatigue, which is a similar trend to that of the mean frequency (MNF) of the EMG signal, while the standard ApEn failed to detect this change. Moreover, fApEn of EMG demonstrated a better robustness to the length of the analysis window in comparison with the MNF of EMG. The results suggest that the fApEn of an EMG signal may potentially become a new reliable method for muscle fatigue assessment and be applicable to other short noisy physiological signal analysis.
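The fuzzy membership idea behind fApEn can be sketched as follows. Parameter defaults (n = 2, r = 0.2·SD) and the per-template mean removal follow common fuzzy-entropy formulations and may differ in detail from the authors' implementation:

```python
import numpy as np

def fuzzy_apen(x, m=2, r=0.2, n=2):
    """Fuzzy approximate entropy: like ApEn, but vector similarity is a
    smooth membership exp(-(d/r)^n) of the template distance instead of a
    hard 0/1 threshold."""
    x = np.asarray(x, dtype=float)
    r *= x.std()
    def phi(mm):
        # mean-removed templates, as in fuzzy-entropy formulations
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        t = t - t.mean(axis=1, keepdims=True)
        sims = []
        for i in range(len(t)):
            d = np.max(np.abs(t - t[i]), axis=1)        # Chebyshev distances
            mu = np.exp(-(d / r) ** n)                  # fuzzy membership
            sims.append((mu.sum() - 1) / (len(t) - 1))  # exclude self-match
        return np.mean(sims)
    return np.log(phi(m)) - np.log(phi(m + 1))

rng = np.random.default_rng(1)
print(fuzzy_apen(np.sin(np.linspace(0, 6 * np.pi, 200))),  # regular: low
      fuzzy_apen(rng.standard_normal(200)))                # noisy: high
```

The smooth membership function is what gives fApEn the improved consistency and noise robustness over standard ApEn reported in the abstract.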
Normal Mode Analysis in Zeolites: Toward an Efficient Calculation of Adsorption Entropies.
De Moor, Bart A; Ghysels, An; Reyniers, Marie-Françoise; Van Speybroeck, Veronique; Waroquier, Michel; Marin, Guy B
2011-04-12
An efficient procedure for normal-mode analysis of extended systems, such as zeolites, is developed and illustrated for the physisorption and chemisorption of n-octane and isobutene in H-ZSM-22 and H-FAU using periodic DFT calculations employing the Vienna Ab Initio Simulation Package. Physisorption and chemisorption entropies resulting from partial Hessian vibrational analysis (PHVA) differ at most 10 J mol⁻¹ K⁻¹ from those resulting from full Hessian vibrational analysis, even for PHVA schemes in which only a very limited number of atoms are considered free. To acquire a well-conditioned Hessian, much tighter optimization criteria than commonly used for electronic energy calculations in zeolites are required, i.e., at least an energy cutoff of 400 eV, a maximum force of 0.02 eV/Å, and a self-consistent field loop convergence criterion of 10⁻⁸ eV. For loosely bonded complexes the mobile adsorbate method is applied, in which frequency contributions originating from translational or rotational motions of the adsorbate are removed from the total partition function and replaced by free translational and/or rotational contributions. The frequencies corresponding with these translational and rotational modes can be selected unambiguously based on a mobile block Hessian-PHVA calculation, allowing the prediction of physisorption entropies within an accuracy of 10-15 J mol⁻¹ K⁻¹ as compared to experimental values. The approach presented in this study is useful for studies on other extended catalytic systems.
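Once a (partial) Hessian analysis has produced normal-mode frequencies, the vibrational entropy follows from the standard harmonic-oscillator partition function. The sketch below uses that textbook formula; the frequency values are illustrative, not from the paper, and show why the low-frequency modes of a loosely bound adsorbate dominate the entropy:

```python
import numpy as np

# Physical constants (SI, except c in cm/s for wavenumber conversion)
H = 6.62607015e-34   # Planck constant, J s
KB = 1.380649e-23    # Boltzmann constant, J/K
C = 2.99792458e10    # speed of light, cm/s
R = 8.314462618      # gas constant, J/(mol K)

def vibrational_entropy(freqs_cm1, T=298.15):
    """Harmonic-oscillator vibrational entropy (J/(mol K)) from normal-mode
    frequencies in cm^-1: S = R * sum[x/(e^x - 1) - ln(1 - e^-x)], x = h nu / kB T."""
    nu = np.asarray(freqs_cm1, dtype=float) * C   # wavenumber -> Hz
    x = H * nu / (KB * T)
    return R * np.sum(x / np.expm1(x) - np.log1p(-np.exp(-x)))

# Illustrative mode sets (cm^-1): a loosely physisorbed molecule has soft,
# low-frequency modes; a stiffly bound one does not.
loose = [20.0, 35.0, 50.0, 400.0, 1200.0]
stiff = [200.0, 350.0, 500.0, 400.0, 1200.0]
print(vibrational_entropy(loose), vibrational_entropy(stiff))
```

This sensitivity of S to the softest modes is why the mobile adsorbate method replaces their ill-conditioned frequencies with free translational/rotational contributions.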
Entropy generation analysis of an adsorption cooling cycle
Thu, Kyaw; Kim, Youngdeuk; Myat, Aung; Chun, Wongee; Ng, K. C.
2013-01-01
temperature (10.8 to 46.2 W/K) and the largest share of entropy generation or rate of energy dissipation occurs at the adsorption process, (ii) the second highest energy rate dissipation is the desorption process, (iii) the remaining energy dissipation rates
Bias-based modeling and entropy analysis of PUFs
van den Berg, R.; Skoric, B.; Leest, van der V.
2013-01-01
Physical Unclonable Functions (PUFs) are increasingly becoming a well-known security primitive for secure key storage and anti-counterfeiting. For both applications it is imperative that PUFs provide enough entropy. The aim of this paper is to propose a new model for binary-output PUFs such as SRAM,
Prot, Olivier; SantolíK, OndřEj; Trotignon, Jean-Gabriel; Deferaudy, Hervé
2006-06-01
An entropy regularization algorithm (ERA) has been developed to compute the wave-energy density from electromagnetic field measurements. It is based on the wave distribution function (WDF) concept. To assess its suitability and efficiency, the algorithm is applied to experimental data that has already been analyzed using other inversion techniques. The FREJA satellite data that is used consists of six spectral matrices corresponding to six time-frequency points of an ELF hiss-event spectrogram. The WDF analysis is performed on these six points and the results are compared with those obtained previously. A statistical stability analysis confirms the stability of the solutions. The WDF computation is fast and requires no prespecified parameters. The regularization parameter has been chosen in accordance with Morozov's discrepancy principle. The Generalized Cross Validation and L-curve criteria are then tentatively used to provide a fully data-driven method. However, these criteria fail to determine a suitable value of the regularization parameter. Although the entropy regularization leads to solutions that agree fairly well with those already published, some differences are observed, and these are discussed in detail. The main advantage of the ERA is that it returns the WDF with the largest entropy and avoids the use of a priori models, which sometimes seem to be more accurate but without any justification.
Pan, Keyao; Deem, Michael W.
2011-01-01
Many viruses evolve rapidly. For example, haemagglutinin (HA) of the H3N2 influenza A virus evolves to escape antibody binding. This evolution of the H3N2 virus means that people who have previously been exposed to an influenza strain may be infected by a newly emerged virus. In this paper, we use Shannon entropy and relative entropy to measure the diversity and selection pressure by an antibody in each amino acid site of H3 HA between the 1992–1993 season and the 2009–2010 season. Shannon entropy and relative entropy are two independent state variables that we use to characterize H3N2 evolution. The entropy method estimates future H3N2 evolution and migration using currently available H3 HA sequences. First, we show that the rate of evolution increases with the virus diversity in the current season. The Shannon entropy of the sequence in the current season predicts relative entropy between sequences in the current season and those in the next season. Second, a global migration pattern of H3N2 is assembled by comparing the relative entropy flows of sequences sampled in China, Japan, the USA and Europe. We verify this entropy method by describing two aspects of historical H3N2 evolution. First, we identify 54 amino acid sites in HA that have evolved in the past to evade the immune system. Second, the entropy method shows that epitopes A and B on the top of HA evolve most vigorously to escape antibody binding. Our work provides a novel entropy-based method to predict and quantify future H3N2 evolution and to describe the evolutionary history of H3N2. PMID:21543352
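The two state variables used above, per-site Shannon entropy and relative entropy between seasons, can be sketched on a toy alignment column. The pseudo-count regularization below is an assumption to keep the logarithm finite and is not necessarily the paper's treatment:

```python
import math
from collections import Counter

def site_entropy(column):
    """Shannon entropy (nats) of the amino acids observed at one alignment site."""
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def site_relative_entropy(col_now, col_next, pseudo=1e-6):
    """D(next || now) at one site: selection pressure shows up as divergence
    of the next season's residue distribution from the current one. The small
    pseudo-count is an illustrative regularization, not from the paper."""
    aa = set(col_now) | set(col_next)
    cn, cx = Counter(col_now), Counter(col_next)
    p = {a: (cx[a] + pseudo) / (len(col_next) + pseudo * len(aa)) for a in aa}
    q = {a: (cn[a] + pseudo) / (len(col_now) + pseudo * len(aa)) for a in aa}
    return sum(p[a] * math.log(p[a] / q[a]) for a in aa)

# Toy HA site: conserved one season, then a variant sweeps in the next season.
now, nxt = "KKKKKKKK", "KKKKNNNN"
print(site_entropy(now), site_entropy(nxt), site_relative_entropy(now, nxt))
```

A conserved site has zero entropy; an escaping epitope site shows both rising Shannon entropy and a large season-to-season relative entropy.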
International Nuclear Information System (INIS)
Tang, Pingzhou; Chen, Di; Hou, Yushuo
2016-01-01
As the world’s energy problem becomes more severe day by day, photovoltaic power generation has opened a new door for us. It will provide an effective solution to this severe energy problem and meet humanity’s needs for energy if photovoltaic power generation can be applied in real life. Like wind power generation, photovoltaic power generation is uncertain; forecasting photovoltaic power generation is therefore crucial. In this paper, the entropy method and the extreme learning machine (ELM) method were combined to forecast short-term photovoltaic power generation. First, the entropy method is used to process the initial data; the network is then trained on the unified data and used to forecast electricity generation. Finally, the results obtained through the entropy method with ELM were compared with those generated by the generalized regression neural network (GRNN) and radial basis function (RBF) neural network methods. We found that the entropy method combined with the ELM method achieves higher accuracy and faster calculation.
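A common way the entropy method is used to preprocess such data is the entropy weight method, sketched below. The toy indicator matrix (irradiance plus a near-constant sensor reading) is illustrative and not from the paper:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: indicators (columns) whose values vary more
    across samples carry more information and receive larger weights."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                      # column-wise proportions
    n = X.shape[0]
    logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    e = -(P * logP).sum(axis=0) / np.log(n)    # entropy normalized to [0, 1]
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()

# Toy data: irradiance varies a lot; a near-constant sensor adds little information.
X = np.array([[800.0, 25.0],
              [300.0, 25.1],
              [550.0, 24.9],
              [120.0, 25.0]])
w = entropy_weights(X)
print(w)  # first indicator dominates
```

The resulting weights can then scale the inputs fed to the ELM (or any other) forecasting network.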
Multiscale analysis of heart rate dynamics: entropy and time irreversibility measures.
Costa, Madalena D; Peng, Chung-Kang; Goldberger, Ary L
2008-06-01
Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and non-equilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools, multiscale entropy and multiscale time irreversibility, are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs.
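The shared first step of both multiscale tools is coarse-graining the series over increasing scales; an entropy or irreversibility statistic is then evaluated at each scale. The sketch below pairs coarse-graining with a deliberately simple irreversibility proxy (the imbalance of rises and falls), which is illustrative and not the exact measure used by the authors:

```python
import numpy as np

def coarse_grain(x, tau):
    """Multiscale step: average consecutive non-overlapping windows of length tau."""
    x = np.asarray(x, dtype=float)
    n = len(x) // tau
    return x[:n * tau].reshape(n, tau).mean(axis=1)

def asymmetry_index(x):
    """Simple time-irreversibility proxy: imbalance between rises and falls
    of the increment series (near 0 for a statistically reversible series).
    Illustrative only; the authors' measure differs in detail."""
    d = np.diff(x)
    up, down = np.sum(d > 0), np.sum(d < 0)
    return (up - down) / max(up + down, 1)

# Toy interbeat-like series: a sawtooth is strongly irreversible in time.
saw = np.tile(np.concatenate([np.arange(9.0), [0.0]]), 30)
for tau in (1, 2, 4):
    print(tau, asymmetry_index(coarse_grain(saw, tau)))
```

Plotting the statistic against tau gives the multiscale profile; a white-noise series would stay near zero at all scales, while structured dynamics do not.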
Crane Safety Assessment Method Based on Entropy and Cumulative Prospect Theory
Directory of Open Access Journals (Sweden)
Aihua Li
2017-01-01
Full Text Available Assessing the safety status of cranes is an important problem. To overcome the inaccuracies and misjudgments in such assessments, this work describes a safety assessment method for cranes that combines entropy and cumulative prospect theory. Firstly, the proposed method transforms the set of evaluation indices into an evaluation vector. Secondly, a decision matrix is constructed from the evaluation vectors and evaluation standards, and an entropy-based technique is applied to calculate the index weights. Thirdly, positive and negative prospect value matrices are established from reference points based on the positive and negative ideal solutions. The crane safety grade can thus be determined according to the ranked comprehensive prospect values. Finally, the safety status of four general overhead traveling crane samples is evaluated to verify the rationality and feasibility of the proposed method. The results demonstrate that the method described in this paper can precisely and reasonably reflect the safety status of a crane.
Corrected entropy of Friedmann-Robertson-Walker universe in tunneling method
International Nuclear Information System (INIS)
Zhu, Tao; Ren, Ji-Rong; Li, Ming-Fan
2009-01-01
In this paper, we study the thermodynamic quantities of the Friedmann-Robertson-Walker (FRW) universe by using the tunneling formalism beyond the semiclassical approximation developed by Banerjee and Majhi [25]. For this we first calculate the corrected Hawking-like temperature on the apparent horizon by considering both scalar particle and fermion tunneling. With this corrected Hawking-like temperature, the explicit expressions of the corrected entropy of the apparent horizon for various gravity theories, including Einstein gravity, Gauss-Bonnet gravity, Lovelock gravity, f(R) gravity and scalar-tensor gravity, are computed. Our results show that the corrected entropy formulas for the different gravity theories can be written as a general expression (4.39) of the same form. It is shown that this expression is also valid for black holes. This might imply that the expression for the corrected entropy derived from the tunneling method is independent of the gravity theory, the spacetime and the dimension of the spacetime. Moreover, it is concluded that the basic thermodynamical property that the corrected entropy on the apparent horizon is a state function is satisfied by the FRW universe.
Maximum Entropy Method in Moessbauer Spectroscopy - a Problem of Magnetic Texture
International Nuclear Information System (INIS)
Satula, D.; Szymanski, K.; Dobrzynski, L.
2011-01-01
A reconstruction of the three-dimensional distribution of the hyperfine magnetic field, isomer shift and texture parameter z from Moessbauer spectra by the maximum entropy method is presented. The method was tested on a simulated spectrum consisting of two Gaussian hyperfine field distributions with different values of the texture parameters. It is shown that a proper prior has to be chosen in order to arrive at physically meaningful results. (authors)
Entropy production in a box: Analysis of instabilities in confined hydrothermal systems
Börsing, N.; Wellmann, J. F.; Niederau, J.; Regenauer-Lieb, K.
2017-09-01
We evaluate if the concept of thermal entropy production can be used as a measure to characterize hydrothermal convection in a confined porous medium as a valuable, thermodynamically motivated addition to the standard Rayleigh number analysis. Entropy production has been used widely in the field of mechanical and chemical engineering as a way to characterize the thermodynamic state and irreversibility of an investigated system. Pioneering studies have since adapted these concepts to natural systems, and we apply this measure here to investigate the specific case of hydrothermal convection in a "box-shaped" confined porous medium, as a simplified analog for, e.g., hydrothermal convection in deep geothermal aquifers. We perform various detailed numerical experiments to assess the response of the convective system to changing boundary conditions or domain aspect ratios, and then determine the resulting entropy production for each experiment. In systems close to the critical Rayleigh number, we derive results that are in accordance to the analytically derived predictions. At higher Rayleigh numbers, however, we observe multiple possible convection modes, and the analysis of the integrated entropy production reveals distinct curves of entropy production that provide an insight into the hydrothermal behavior in the system, both for cases of homogeneous materials, as well as for heterogeneous spatial material distributions. We conclude that the average thermal entropy production characterizes the internal behavior of hydrothermal systems with a meaningful thermodynamic measure, and we expect that it can be useful for the investigation of convection systems in many similar hydrogeological and geophysical settings.
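The thermal entropy production the study integrates can be sketched for the conduction part on a discrete temperature field. The formula σ = k|∇T|²/T² is the standard conduction term; the grid, conductivity value, and "plume" perturbation below are illustrative assumptions, and the advective contribution of the convecting fluid is omitted:

```python
import numpy as np

def thermal_entropy_production(T, dx, k=2.0):
    """Mean thermal entropy production density sigma = k |grad T|^2 / T^2
    for a 2-D temperature field T on a uniform grid (conduction term only)."""
    gy, gx = np.gradient(T, dx)
    sigma = k * (gx ** 2 + gy ** 2) / T ** 2
    return float(sigma.mean())

# Conductive reference state in a "box": linear profile between a hot bottom
# (350 K) and a cold top (300 K); grid spacing and conductivity are illustrative.
ny, nx, dx = 50, 50, 0.1
y = np.linspace(350.0, 300.0, ny).reshape(-1, 1)
T_cond = np.broadcast_to(y, (ny, nx)).copy()

# A crude "plume" perturbation steepens gradients and raises entropy production.
T_plume = T_cond + 10.0 * np.exp(-((np.arange(nx) - nx // 2) ** 2) / 20.0)
print(thermal_entropy_production(T_cond, dx),
      thermal_entropy_production(T_plume, dx))
```

Comparing the integrated value across convection modes, as the study does, is what turns this local density into a diagnostic of the whole hydrothermal system.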
Numerical analysis of entropy generation in an annular microcombustor using multistep kinetics
International Nuclear Information System (INIS)
Jejurkar, Swarup Y.; Mishra, D.P.
2013-01-01
Entropy generation by combustion and additional irreversibility due to heat loss were studied numerically for a premixed-flame-based microcombustor. A detailed axisymmetric reactive flow model employing a 21-step, 9-species reaction mechanism for the hydrogen–air mixture was considered. The analysis identified the reactions contributing most of the entropy generated in combustion. These reactions are removed from thermodynamic equilibrium in the low-temperature region between 400 and 700 K of the flame, and a combination of their high affinity and low temperature induces entropy generation in this region. Single-step kinetics and a reduced scheme neglecting HO2 are consequently incapable of accurately calculating the entropy generation and second-law performance. Overall entropy generation rates increased from lean to rich mixtures in the range Φ = 0.5–1.4 and were dominated by combustion reactions. Characterization of combustor performance in terms of second-law efficiency showed that availability reduction by wall heat losses and combustion irreversibility were of the same order for stoichiometric mixtures, and both decreased for rich flames. On the other hand, near-quenching fuel-lean flames (Φ ≤ 0.75) suffered mostly from combustion irreversibility. These trends caused the minimum-efficiency (maximum thermodynamic irreversibility) point to lie near the stoichiometric fuel–air composition. -- Highlights: ► The reaction set dominating heat release and entropy generation involves HO2. ► Entropy generation increased from lean to rich Φ. ► Second-law efficiency is minimum at stoichiometric Φ. ► Post-flame heat loss and transport processes are needed in microcombustor entropy analysis.
A New Feature Extraction Method Based on EEMD and Multi-Scale Fuzzy Entropy for Motor Bearing
Directory of Open Access Journals (Sweden)
Huimin Zhao
2016-12-01
Full Text Available Feature extraction is one of the most important and difficult problems in mechanical fault diagnosis, directly affecting the accuracy of fault diagnosis and the reliability of early fault prediction. Therefore, a new fault feature extraction method, called the EDOMFE method, integrating ensemble empirical mode decomposition (EEMD), mode selection, and multi-scale fuzzy entropy, is proposed in this paper to accurately diagnose faults. The EEMD method is used to decompose the vibration signal into a series of intrinsic mode functions (IMFs) with different physical significance. The correlation coefficient analysis method is used to select the three improved IMFs that are closest to the original signal. The multi-scale fuzzy entropy, with its ability to effectively distinguish the complexity of different signals, is used to calculate the entropy values of the selected three IMFs in order to form a feature vector with the complexity measure, which is used as the input of a support vector machine (SVM) model for training and constructing an SVM classifier (EOMSMFD, based on EDOMFE and SVM) for fault pattern recognition. Finally, the effectiveness of the proposed method is validated with real bearing vibration signals of the motor under different loads and fault severities. The experimental results show that the proposed EDOMFE method can effectively extract fault features from the vibration signal and that the proposed EOMSMFD method can accurately diagnose the fault types and fault severities for the inner race fault, the outer race fault, and the rolling element fault of the motor bearing. Therefore, the proposed method provides a new fault diagnosis technology for rotating machinery.
Entropy production analysis of hysteresis characteristic of a pump-turbine model
International Nuclear Information System (INIS)
Li, Deyou; Wang, Hongjie; Qin, Yonglin; Han, Lei; Wei, Xianzhu; Qin, Daqing
2017-01-01
Highlights: • An interesting hysteresis phenomenon was analyzed using entropy production theory. • A wall function was used to calculate the entropy production in the wall region. • The generation mechanism of the hump and hysteresis characteristics was obtained. - Abstract: The hydraulic loss due to friction and unstable flow patterns in hydro-turbines causes a drop in their efficiency. The traditional method for analyzing the hydraulic loss is to evaluate the pressure drop, which has certain limitations and cannot determine the exact locations at which high hydraulic loss occurs. In this study, entropy production theory was adopted to obtain a detailed distribution of the hydraulic loss in a pump-turbine in the pump mode. In the past, the wall effects on entropy production were not considered, which caused larger errors compared with the pressure-difference method. First, a wall equation was proposed to calculate the hydraulic loss in the wall region; the hydraulic loss calculated by entropy production then agreed better with that obtained from the pressure difference. Then, using entropy production theory, the performance characteristics were determined for a pump-turbine with a 19 mm guide vane opening, and the variation in the entropy production was obtained. Recently, an interesting phenomenon, i.e., a hysteresis characteristic, was observed in the hump region in pump-turbines. Research shows that the hysteresis characteristic results from the Euler momentum and the hydraulic loss, with the hydraulic loss accounting for the major portion. Finally, the hysteresis characteristic in the hump region was analyzed in detail through the entropy production. The results showed that the hump characteristic and the accompanying hysteresis phenomenon are caused by backflow at the runner inlet and the presence of separation vortices close to the hub and the shroud in the stay/guide vanes, which is dependent on the direction of
Variational method for the minimization of entropy generation in solar cells
Energy Technology Data Exchange (ETDEWEB)
Smit, Sjoerd; Kessels, W. M. M., E-mail: w.m.m.kessels@tue.nl [Department of Applied Physics, Eindhoven University of Technology, P.O. Box 513, 5600 MB Eindhoven (Netherlands)
2015-04-07
In this work, a method is presented to extend traditional solar cell simulation tools to make it possible to calculate the most efficient design of practical solar cells. The method is based on the theory of nonequilibrium thermodynamics, which is used to derive an expression for the local entropy generation rate in the solar cell, making it possible to quantify all free energy losses on the same scale. The framework of non-equilibrium thermodynamics can therefore be combined with the calculus of variations and existing solar cell models to minimize the total entropy generation rate in the cell to find the most optimal design. The variational method is illustrated by applying it to a homojunction solar cell. The optimization results in a set of differential algebraic equations, which determine the optimal shape of the doping profile for given recombination and transport models.
Explaining the entropy concept and entropy components
Directory of Open Access Journals (Sweden)
Marko Popovic
2018-04-01
Full Text Available Total entropy of a thermodynamic system consists of two components: thermal entropy due to energy, and residual entropy due to molecular orientation. In this article, a three-step method for explaining entropy is suggested. Step one is to use a classical method to introduce thermal entropy STM as a function of temperature T and heat capacity at constant pressure Cp: STM = ∫(Cp/T) dT. Thermal entropy is the entropy due to uncertainty in the motion of molecules and vanishes at absolute zero (the zero-point energy state). It is also the measure of useless thermal energy that cannot be converted into useful work. The next step is to introduce residual entropy S0 as a function of the number of molecules N and the number of distinct orientations available to them in a crystal m: S0 = N kB ln m, where kB is the Boltzmann constant. Residual entropy quantifies the uncertainty in molecular orientation. Residual entropy, unlike thermal entropy, is independent of temperature and remains present at absolute zero. The third step is to show that thermal entropy and residual entropy add up to the total entropy of a thermodynamic system S: S = S0 + STM. This method of explanation should result in a better comprehension of residual entropy and thermal entropy, as well as of their similarities and differences. The new method was tested in teaching at the Faculty of Chemistry, University of Belgrade, Serbia. The results of the test show that the new method has the potential to improve the quality of teaching.
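The three-step decomposition S = S0 + STM can be made numerical. Per mole, S0 = N kB ln m reduces to R ln m, which for a CO-like crystal with m = 2 orientations gives the classic ~5.76 J/(mol K); the heat-capacity table below is an illustrative stand-in for real Cp(T) data:

```python
import numpy as np

R = 8.314462618  # gas constant, J/(mol K)

def residual_entropy_per_mole(m):
    """S0 = N kB ln m per mole reduces to R ln m."""
    return R * np.log(m)

def thermal_entropy(T, Cp):
    """S_TM = integral of (Cp / T) dT, approximated here by trapezoidal
    quadrature over tabulated (T, Cp) points."""
    T, Cp = np.asarray(T, dtype=float), np.asarray(Cp, dtype=float)
    y = Cp / T
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(T)))

# CO-like crystal: two nearly equivalent orientations per molecule.
S0 = residual_entropy_per_mole(2)  # ~5.76 J/(mol K)
# Toy heat-capacity table from 10 K to 298 K (illustrative values, not data).
T = np.array([10.0, 50.0, 100.0, 200.0, 298.0])
Cp = np.array([0.5, 12.0, 22.0, 28.0, 29.0])
S = S0 + thermal_entropy(T, Cp)    # total entropy S = S0 + S_TM
print(S0, S)
```

The split makes the article's point concrete: cooling to absolute zero removes STM but leaves S0 untouched.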
Fuzzy Entropy Method for Quantifying Supply Chain Networks Complexity
Zhang, Jihui; Xu, Junqin
Supply chain is a special kind of complex network. Its complexity and uncertainty make it very difficult to control and manage. Supply chains are faced with a rising complexity of products, structures, and processes. Because of the strong link between a supply chain’s complexity and its efficiency, supply chain complexity management becomes a major challenge of today’s business management. The aim of this paper is to quantify the complexity and organization level of an industrial network, working towards the development of a ‘Supply Chain Network Analysis’ (SCNA). By measuring flows of goods and interaction costs between different sectors of activity within the supply chain borders, a network of flows is built and successively investigated by network analysis. The result of this study shows that our approach can provide an interesting conceptual perspective in which the modern supply network can be framed, and that network analysis can handle these issues in practice.
Harmonic analysis of electric locomotive and traction power system based on wavelet singular entropy
Dun, Xiaohong
2018-05-01
With the rapid development of high-speed railway and heavy-haul transport, the locomotive and traction power system has become the main harmonic source of China's power grid. In response to this phenomenon, the system's power quality issues need timely monitoring, assessment and governance. Wavelet singular entropy is an organic combination of wavelet transform, singular value decomposition and information entropy theory, combining the unique advantages of the three in signal processing: wavelet transform provides time-frequency localization, singular value decomposition extracts the basic modal characteristics of the data, and information entropy quantifies the resulting features. Based on the theory of singular value decomposition, the wavelet coefficient matrix obtained from the wavelet transform is decomposed into a series of singular values that reflect the basic characteristics of the original coefficient matrix. Then the statistical properties of information entropy are used to analyze the uncertainty of the singular value set, so as to give a definite measure of the complexity of the original signal. Wavelet singular entropy therefore has a good application prospect in fault detection, classification and protection. A MATLAB simulation shows that the use of wavelet singular entropy for harmonic analysis of the locomotive and traction power system is effective.
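The transform-SVD-entropy pipeline described above can be sketched as follows; the wavelet transform step is omitted here and replaced by a stand-in coefficient matrix, so the numbers are purely illustrative:

```python
import numpy as np

def singular_entropy(W):
    """Shannon entropy of the normalized singular-value spectrum of a
    coefficient matrix W (in the method above, W would come from a
    wavelet decomposition of the measured signal)."""
    s = np.linalg.svd(W, compute_uv=False)
    p = s / s.sum()
    p = p[p > 0]                       # drop numerically zero modes
    return -np.sum(p * np.log(p))

# Stand-in "coefficient matrices": rows play the role of scales.
t = np.linspace(0, 1, 256, endpoint=False)
clean = np.vstack([np.sin(2 * np.pi * 50 * t)] * 4)   # rank 1: one dominant mode
noisy = clean + 0.5 * np.random.default_rng(0).standard_normal(clean.shape)

H_clean = singular_entropy(clean)   # essentially zero: a single simple mode
H_noisy = singular_entropy(noisy)   # distortion spreads the singular values
```

A pure tone concentrates the energy in one singular value (entropy near zero), while harmonic distortion or noise spreads it over several, raising the entropy; this is the complexity measure the abstract refers to.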
Directory of Open Access Journals (Sweden)
Yu Ji
2017-03-01
Full Text Available The entropy generation of fully turbulent convective heat transfer to nanofluids in a circular tube is investigated numerically using the Reynolds-Averaged Navier–Stokes (RANS) model. The nanofluids with particle concentrations of 0%, 1%, 2%, 4% and 6% are treated as single phases of effective properties. A uniform heat flux is enforced at the tube wall. To confirm the validity of the numerical approach, the results have been compared with empirical correlations and analytical formulae. The self-similarity profiles of local entropy generation are also studied, in which the peak values of entropy generation by direct dissipation, turbulent dissipation, mean temperature gradients and fluctuating temperature gradients are observed for different Reynolds numbers as well as different particle concentrations. In addition, the effects of Reynolds number, volume fraction of nanoparticles and heat flux on total entropy generation and Bejan number are discussed. In the results, intersection points of total entropy generation for water and the four nanofluids are observed: entropy generation decreases before the intersection and increases after it as the particle concentration increases. Finally, by definition of Ep, which combines the first and second laws of thermodynamics and is intended to evaluate the real performance of heat transfer processes, the optimal Reynolds number Reop corresponding to the best performance and the advisable Reynolds number Read providing the appropriate Reynolds number range for nanofluids in convective heat transfer can be determined.
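The Bejan number discussed above is simply the heat-transfer share of total entropy generation; a toy sketch with hypothetical scalings (not the paper's RANS results):

```python
import numpy as np

def bejan_number(s_heat, s_fric):
    """Be = S_gen,heat / (S_gen,heat + S_gen,friction): the fraction of
    total irreversibility caused by heat transfer."""
    return s_heat / (s_heat + s_fric)

# Illustrative trend only: raising the Reynolds number typically reduces
# the thermal part (better mixing) but inflates the frictional part.
Re = np.array([1e4, 5e4, 1e5])
s_heat = 2.0 / Re**0.2        # hypothetical scaling laws, not from the paper
s_fric = 1e-12 * Re**2.8
be = bejan_number(s_heat, s_fric)   # drops toward 0 as friction dominates
```

Be near 1 means heat-transfer irreversibility dominates; Be near 0 means viscous (friction) irreversibility dominates, which is why Be falls as Re grows in this sketch.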
Information entropies in antikaon-nucleon scattering and optimal state analysis
International Nuclear Information System (INIS)
Ion, D.B.; Ion, M.L.; Petrascu, C.
1998-01-01
It is known that Jaynes interpreted the entropy as the expected self-information of a class of mutually exclusive and exhaustive events, while the probability is considered to be the rational degree of belief we assign to events based on available experimental evidence. The axiomatic derivation of the Jaynes principle of maximum entropy, as well as of the Kullback principle of minimum cross-entropy, has been reported. Moreover, the optimal states in the Hilbert space of the scattering amplitude, which are analogous to the coherent states from the Hilbert space of the wave functions, were introduced and developed. The possibility that each optimal state possesses a specific minimum entropic uncertainty relation, similar to that of the coherent states, was recently conjectured. In earlier work, the (angle and angular momentum) information entropies, as well as the entropic angle-angular momentum uncertainty relations, in hadron-hadron scattering were introduced. The experimental information entropies for pion-nucleon scattering were calculated by using the available phase shift analyses, and compared with the information entropies of the optimal states. Optimal state dominance in pion-nucleon scattering was thus systematically observed for all P_LAB = 0.02-10 GeV/c, and it was shown that the angle-angular momentum entropic uncertainty relations are satisfied with high accuracy by all the experimental information entropies. In this paper the (angle and angular momentum) information entropies of hadron-hadron scattering are experimentally investigated by using the antikaon-nucleon phase shift analysis. It is shown that the experimental entropies are in agreement with the informational entropies of the optimal states. These results can be explained not only by the presence of an optimal background which accompanies the production of the elementary resonances, but also by the presence of the optimal resonances. On the other hand
Reconstruction of the electron momentum density distribution by the maximum entropy method
International Nuclear Information System (INIS)
Dobrzynski, L.
1996-01-01
The application of the Maximum Entropy Algorithm to the analysis of the Compton profiles is discussed. It is shown that the reconstruction of electron momentum density may be reliably carried out. However, there are a number of technical problems which have to be overcome in order to produce trustworthy results. In particular one needs the experimental Compton profiles measured for many directions, and to have efficient computational resources. The use of various cross-checks is recommended. (orig.)
Lattice Field Theory with the Sign Problem and the Maximum Entropy Method
Directory of Open Access Journals (Sweden)
Masahiro Imachi
2007-02-01
Full Text Available Although numerical simulation in lattice field theory is one of the most effective tools to study non-perturbative properties of field theories, it faces serious obstacles coming from the sign problem in some theories such as finite density QCD and lattice field theory with the θ term. We reconsider this problem from the point of view of the maximum entropy method.
Directory of Open Access Journals (Sweden)
Jing Zhou
2014-01-01
Full Text Available According to the characteristics of emergency repair in overhead transmission line accidents, a complexity quantification method for emergency repair schemes is proposed based on the entropy method from software engineering, improved by using the group AHP (analytic hierarchy process) method and Petri nets. Firstly, an information structure chart model and a process control flowchart model are built with Petri nets. Then, the factors affecting the complexity of an emergency repair scheme are quantified into corresponding entropy values. Finally, using the group AHP method, a weight coefficient is assigned to each entropy value before calculating the overall entropy value for the whole emergency repair scheme. Compared with average weighting, experimental results for group AHP weighting showed a stronger correlation between the quantified complexity entropy values and the actual time consumed in repair, which indicates that the new method is more valid.
Chung-Wei, Li; Gwo-Hshiung, Tzeng
To deal with complex problems, structuring them through graphical representations and analyzing causal influences can aid in illuminating complex issues, systems, or concepts. The DEMATEL method is a methodology which can be used for researching and solving complicated and intertwined problem groups. The end product of the DEMATEL process is a visual representation—the impact-relations map—by which respondents organize their own actions in the world. The applicability of the DEMATEL method is widespread, ranging from analyzing world problematique decision making to industrial planning. The most important property of the DEMATEL method used in the multi-criteria decision making (MCDM) field is to construct interrelations between criteria. In order to obtain a suitable impact-relations map, an appropriate threshold value is needed to obtain adequate information for further analysis and decision making. In this paper, we propose a method based on the entropy approach, the maximum mean de-entropy algorithm, to achieve this purpose. Using real cases of finding the interrelationships between the criteria for evaluating effects in E-learning programs as an example, we compare the results obtained from the respondents and from our method, and discuss the differences between the impact-relations maps produced by the two methods.
Yan, Zhi Gang; Li, Jun Qing
2017-12-01
The areas of the habitat and bamboo forest, and the size of the giant panda wild population, have greatly increased, while habitat fragmentation and local population isolation have also intensified in recent years. Accurate evaluation of the ecosystem status of the giant panda distribution area is important for giant panda conservation. The ecosystems of the distribution area and six mountain ranges were subdivided into habitat and population subsystems based on hierarchical system theory. Using the panda distribution area as the study area and the three national surveys as the time nodes, the evolution laws of the ecosystems were studied using the entropy method, coefficient of variation, and correlation analysis. We found that, despite continuous improvement, differences existed in the evolution and present situation of the ecosystems, and the six mountain ranges could be divided into three groups. Ecosystems classified into the same group showed many commonalities, and the differences between groups were considerable. Problems of habitat fragmentation and local population isolation became more serious, resulting in ecosystem degradation. Individualized ecological protection measures should be formulated and implemented in accordance with the conditions in each mountain system to achieve the best results.
de Beer, Alex G F; Samson, Jean-Sebastièn; Hua, Wei; Huang, Zishuai; Chen, Xiangke; Allen, Heather C; Roke, Sylvie
2011-12-14
We present a direct comparison of phase sensitive sum-frequency generation experiments with phase reconstruction obtained by the maximum entropy method. We show that both methods lead to the same complex spectrum. Furthermore, we discuss the strengths and weaknesses of each of these methods, analyzing possible sources of experimental and analytical errors. A simulation program for maximum entropy phase reconstruction is available at: http://lbp.epfl.ch/. © 2011 American Institute of Physics
International Nuclear Information System (INIS)
Feng Guangwen; Hu Youhua; Liu Qian
2009-01-01
In this paper, the application of the entropy weight TOPSIS method to the optimal layout of points for monitoring the Xinjiang radiation environment is introduced. With the help of SAS software, the method was found to be feasible and well suited to the task. It can provide a reference for further radiation environment monitoring in similar regions. As the method brings great convenience and greatly reduces the inspection work, it is simple, flexible and effective for a comprehensive evaluation. (authors)
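The entropy weight TOPSIS procedure referred to above can be sketched as follows; the decision matrix below is hypothetical, since the paper's monitoring data are not given:

```python
import numpy as np

def entropy_weights(X):
    """Objective criterion weights from the Shannon entropy of each column
    of the decision matrix X (rows: alternatives, columns: criteria)."""
    P = X / X.sum(axis=0)
    m = X.shape[0]
    E = -(P * np.log(P, where=P > 0, out=np.zeros_like(P))).sum(axis=0) / np.log(m)
    d = 1.0 - E                  # degree of diversification per criterion
    return d / d.sum()

def topsis(X, w, benefit):
    """Rank alternatives by relative closeness to the ideal solution."""
    R = X / np.linalg.norm(X, axis=0)          # vector normalization
    V = R * w
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)             # closeness coefficient in [0, 1]

# Hypothetical data: 4 candidate monitoring points, 3 criteria
# (coverage and sensitivity are benefits; access cost is a cost criterion)
X = np.array([[0.8, 120.0, 3.2],
              [0.6,  80.0, 2.1],
              [0.9, 150.0, 4.0],
              [0.5,  60.0, 1.5]])
w = entropy_weights(X)
score = topsis(X, w, benefit=np.array([True, True, False]))
best = int(np.argmax(score))
```

Criteria whose values vary strongly across alternatives carry low entropy and therefore high weight, which is the sense in which the weighting is "objective".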
A second law analysis and entropy generation minimization of an absorption chiller
Myat, Aung; Thu, Kyaw; Kim, Youngdeuk; Chakraborty, Anutosh; Chun, Wongee; Ng, K. C.
2011-01-01
This paper presents a performance analysis of an absorption refrigeration system (ARS) using entropy generation analysis. A numerical model predicts the performance of the absorption cycle operating under transient conditions, along with the entropy generation computation at assorted heat source temperatures, and it also captures the dynamic changes of lithium bromide solution properties such as concentration, density, vapor pressure and overall heat transfer coefficients. An optimization tool, namely the genetic algorithm (GA), is used to locate the system minima over the entire defined domain of heat source and cooling water temperatures. The analysis shows that minimization of entropy generation in the absorption cycle leads to the maximization of the COP. © 2011 Elsevier Ltd. All rights reserved.
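The optimization step described above, an evolutionary search for minimum entropy generation over the heat-source and cooling-water temperature domain, can be sketched with SciPy's differential evolution (a GA-like optimizer) on a hypothetical surrogate model, not the paper's LiBr property code:

```python
import numpy as np
from scipy.optimize import differential_evolution

def entropy_generation(x):
    """Hypothetical surrogate for cycle entropy generation (kW/K) as a
    function of heat-source temperature Th and cooling-water temperature Tc.
    A convex bowl with a known minimum stands in for the real cycle model."""
    th, tc = x
    return 0.01 * (th - 358.0) ** 2 + 0.02 * (tc - 303.0) ** 2 + 1.5

# Search domain in kelvin: Th in [333, 373], Tc in [293, 313]
bounds = [(333.0, 373.0), (293.0, 313.0)]
result = differential_evolution(entropy_generation, bounds, seed=1, tol=1e-8)
th_opt, tc_opt = result.x     # operating point of minimum entropy generation
```

In the paper's setting, the objective function would be the transient cycle model itself; the evolutionary optimizer needs only function values, so no gradients of the property code are required.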
A second law analysis and entropy generation minimization of an absorption chiller
Myat, Aung
2011-10-01
This paper presents a performance analysis of an absorption refrigeration system (ARS) using entropy generation analysis. A numerical model predicts the performance of the absorption cycle operating under transient conditions, along with the entropy generation computation at assorted heat source temperatures, and it also captures the dynamic changes of lithium bromide solution properties such as concentration, density, vapor pressure and overall heat transfer coefficients. An optimization tool, namely the genetic algorithm (GA), is used to locate the system minima over the entire defined domain of heat source and cooling water temperatures. The analysis shows that minimization of entropy generation in the absorption cycle leads to the maximization of the COP. © 2011 Elsevier Ltd. All rights reserved.
Reginatto, M; Neumann, S
2002-01-01
MAXED was developed to apply the maximum entropy principle to the unfolding of neutron spectrometric measurements. The approach followed in MAXED has several features that make it attractive: it permits inclusion of a priori information in a well-defined and mathematically consistent way, the algorithm used to derive the solution spectrum is not ad hoc (it can be justified on the basis of arguments that originate in information theory), and the solution spectrum is a non-negative function that can be written in closed form. This last feature permits the use of standard methods for the sensitivity analysis and propagation of uncertainties of MAXED solution spectra. We illustrate its use with unfoldings of NE 213 scintillation detector measurements of photon calibration spectra, and of multisphere neutron spectrometer measurements of cosmic-ray induced neutrons at high altitude (approx 20 km) in the atmosphere.
Extended statistical entropy analysis as a quantitative management tool for water resource systems
Sobantka, Alicja; Rechberger, Helmut
2010-05-01
Entropy has been applied to various problems in hydrology and water resources. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables the development of such a description by providing determination of the least-biased probability distributions with limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. The relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information, or information gain or loss. In the analysis of empirical data, entropy is another measure of dispersion, an alternative to the variance. Also, as an evaluation tool, statistical entropy analysis (SEA) has been developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme, SEA is to be extended to chemical compounds and tested for its deficits and potentials in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTP). Later applications to the emission of substances to water bodies such as groundwater (e.g. leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all possible nitrogen compounds which may occur during the water treatment process are taken into account and quantified in their impact on the environment and human health. It has been shown that entropy-reducing processes are part of modern waste management. Generally, materials management should be performed in a way that avoids a significant rise in entropy. The entropy metric might also be used to perform benchmarking on WWTPs. The result of this management tool would be the determination of the efficiency of WWTPs. By improving and optimizing the efficiency
Nuclear Enhanced X-ray Maximum Entropy Method Used to Analyze Local Distortions in Simple Structures
DEFF Research Database (Denmark)
Christensen, Sebastian; Bindzus, Niels; Christensen, Mogens
We introduce a novel method for reconstructing pseudo nuclear density distributions (NDDs): the Nuclear Enhanced X-ray Maximum Entropy Method (NEXMEM). NEXMEM offers an alternative route to experimental NDDs, exploiting the superior quality of synchrotron X-ray data compared to neutron data. The method...... proposed to result from anharmonic phonon scattering or from local fluctuating dipoles on the Pb site.[1,2] No macroscopic symmetry changes are associated with these effects, rendering them invisible to conventional crystallographic techniques. For this reason PbX was until recently believed to adopt
Time-series analysis of foreign exchange rates using time-dependent pattern entropy
Ishizaki, Ryuji; Inoue, Masayoshi
2013-08-01
Time-dependent pattern entropy is a method that reduces variations to binary symbolic dynamics and considers the pattern of symbols in a sliding temporal window. We use this method to analyze the instability of daily variations in foreign exchange rates, in particular, the dollar-yen rate. The time-dependent pattern entropy of the dollar-yen rate was found to be high in the following periods: before and after the turning points of the yen from strong to weak or from weak to strong, and the period after the Lehman shock.
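The method described above, binary symbolization followed by Shannon entropy of symbol patterns in a sliding window, can be sketched as follows (synthetic series in place of exchange rates; the pattern length and window size are illustrative choices, not the authors'):

```python
import numpy as np

def pattern_entropy(x, k=3, window=50):
    """Time-dependent pattern entropy: binarize daily variations
    (1 if up, 0 otherwise), encode each k-symbol pattern as an integer,
    then compute the Shannon entropy (bits) of the pattern distribution
    inside a sliding window."""
    sym = (np.diff(x) > 0).astype(int)
    codes = np.array([int("".join(map(str, sym[i:i + k])), 2)
                      for i in range(len(sym) - k + 1)])
    ent = []
    for start in range(len(codes) - window + 1):
        counts = np.bincount(codes[start:start + window], minlength=2 ** k)
        p = counts[counts > 0] / window
        ent.append(-np.sum(p * np.log2(p)))
    return np.array(ent)

rng = np.random.default_rng(0)
calm = np.cumsum(0.1 * rng.standard_normal(300)) + 100   # random-walk "rate"
trend = 100 + 0.5 * np.arange(300)                        # monotone rise

H_calm = pattern_entropy(calm)    # near the 3-bit maximum for k = 3
H_trend = pattern_entropy(trend)  # one repeating pattern: zero entropy
```

A steadily trending series produces a single repeating pattern (entropy zero), whereas erratic up/down behavior spreads probability over all 2^k patterns, which is why the entropy spikes around turning points and shocks.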
Time-series analysis of multiple foreign exchange rates using time-dependent pattern entropy
Ishizaki, Ryuji; Inoue, Masayoshi
2018-01-01
Time-dependent pattern entropy is a method that reduces variations to binary symbolic dynamics and considers the pattern of symbols in a sliding temporal window. We use this method to analyze the instability of daily variations in multiple foreign exchange rates. The time-dependent pattern entropy of 7 foreign exchange rates (AUD/USD, CAD/USD, CHF/USD, EUR/USD, GBP/USD, JPY/USD, and NZD/USD) was found to be high in the long period after the Lehman shock, and low in the long period after March 2012. We compared the correlation matrices between exchange rates in periods of high and low time-dependent pattern entropy.
Entropy and complexity analysis of hydrogenic Rydberg atoms
Energy Technology Data Exchange (ETDEWEB)
Lopez-Rosa, S. [Instituto Carlos I de Fisica Teorica y Computacional, Universidad de Granada, 18071-Granada (Spain); Departamento de Fisica Aplicada II, Universidad de Sevilla, 41012-Sevilla (Spain); Toranzo, I. V.; Dehesa, J. S. [Instituto Carlos I de Fisica Teorica y Computacional, Universidad de Granada, 18071-Granada (Spain); Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, 18071-Granada (Spain); Sanchez-Moreno, P. [Instituto Carlos I de Fisica Teorica y Computacional, Universidad de Granada, 18071-Granada (Spain); Departamento de Matematica Aplicada, Universidad de Granada, 18071-Granada (Spain)
2013-05-15
The internal disorder of hydrogenic Rydberg atoms as contained in their position and momentum probability densities is examined by means of the following information-theoretic spreading quantities: the radial and logarithmic expectation values, the Shannon entropy, and the Fisher information. As well, the complexity measures of Cramer-Rao, Fisher-Shannon, and Lopez Ruiz-Mancini-Calvet types are investigated in both reciprocal spaces. The leading term of these quantities is rigorously calculated by use of the asymptotic properties of the concomitant entropic functionals of the Laguerre and Gegenbauer orthogonal polynomials which control the wavefunctions of the Rydberg states in both position and momentum spaces. The associated generalized Heisenberg-like, logarithmic and entropic uncertainty relations are also given. Finally, application to linear (l= 0), circular (l=n- 1), and quasicircular (l=n- 2) states is explicitly done.
Entropy and complexity analysis of hydrogenic Rydberg atoms
International Nuclear Information System (INIS)
López-Rosa, S.; Toranzo, I. V.; Dehesa, J. S.; Sánchez-Moreno, P.
2013-01-01
The internal disorder of hydrogenic Rydberg atoms as contained in their position and momentum probability densities is examined by means of the following information-theoretic spreading quantities: the radial and logarithmic expectation values, the Shannon entropy, and the Fisher information. As well, the complexity measures of Cramér-Rao, Fisher-Shannon, and López Ruiz-Mancini-Calvet types are investigated in both reciprocal spaces. The leading term of these quantities is rigorously calculated by use of the asymptotic properties of the concomitant entropic functionals of the Laguerre and Gegenbauer orthogonal polynomials which control the wavefunctions of the Rydberg states in both position and momentum spaces. The associated generalized Heisenberg-like, logarithmic and entropic uncertainty relations are also given. Finally, application to linear (l = 0), circular (l = n − 1), and quasicircular (l = n − 2) states is explicitly done.
Spectrum unfolding in X-ray spectrometry using the maximum entropy method
International Nuclear Information System (INIS)
Fernandez, Jorge E.; Scot, Viviana; Di Giulio, Eugenio
2014-01-01
The solution of the unfolding problem is an ever-present issue in X-ray spectrometry. The maximum entropy technique solves this problem by taking advantage of known a priori physical information and by ensuring an outcome with only positive values. This method is implemented in MAXED (MAXimum Entropy Deconvolution), a software code contained in the package UMG (Unfolding with MAXED and GRAVEL) developed at PTB and distributed by the NEA Data Bank. This package also contains the code GRAVEL (used to estimate the precision of the solution). This article introduces the new code UMESTRAT (Unfolding Maximum Entropy STRATegy), which applies a semi-automatic strategy to solve the unfolding problem by using a suitable combination of MAXED and GRAVEL for applications in X-ray spectrometry. Some examples of the use of UMESTRAT are shown, demonstrating its capability to remove detector artifacts from the measured spectrum consistently with the model used for the detector response function (DRF). - Highlights: ► A new strategy to solve the unfolding problem in X-ray spectrometry is presented. ► The presented strategy uses a suitable combination of the codes MAXED and GRAVEL. ► The applied strategy provides additional information on the detector response function. ► The code UMESTRAT is developed to apply this new strategy in a semi-automatic mode
Optimized Kernel Entropy Components.
Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau
2017-06-01
This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of the importance of kernel eigenvectors by entropy instead of variance, as in kernel principal component analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by compacting the information into very few features (often just one or two). The proposed method produces features with higher expressive power. In particular, it is based on the independent component analysis framework, and introduces an extra rotation to the eigendecomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. Both methods are illustrated on different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, that the most successful rule for estimating the kernel parameter is based on maximum likelihood, and that OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.
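The core KECA idea referred to above, ranking kernel eigen-axes by their entropy contribution rather than by eigenvalue, can be sketched as follows (toy data; the OKECA rotation and kernel-parameter selection are not shown):

```python
import numpy as np

def keca_order(K):
    """Sort the eigen-axes of a kernel matrix K by their Renyi-entropy
    contribution lambda_i * (e_i^T 1)^2 instead of by eigenvalue alone."""
    lam, E = np.linalg.eigh(K)               # ascending eigenvalues
    contrib = lam * (E.sum(axis=0) ** 2)     # entropy contribution per axis
    order = np.argsort(contrib)[::-1]        # most entropy-relevant first
    return order, contrib

# Gaussian kernel on a toy two-cluster data set (illustrative only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / (2 * 1.0 ** 2))

order, contrib = keca_order(K)
# Sanity check: the contributions sum to the quadratic Renyi entropy
# estimate V = (1/N^2) * 1^T K 1, i.e. the mean of K.
V = contrib.sum() / K.shape[0] ** 2
```

An axis with a large eigenvalue but an eigenvector nearly orthogonal to the all-ones vector contributes little entropy, so KECA may rank it below a smaller-eigenvalue axis, which is exactly the difference from kernel PCA.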
Machine Selection in A Dairy Product Company with Entropy and SAW Method Integration
Directory of Open Access Journals (Sweden)
Aşkın Özdağoğlu
2017-07-01
Full Text Available Machine selection is an important and difficult process for firms, and its results may generate more problems than anticipated. In order to find the best alternative, managers should define the requirements of the factory and determine the necessary criteria. On the other hand, the decision-making criteria for choosing the right equipment may vary according to the type of manufacturing facility, market requirements, and consumer-assigned criteria. This study aims to find the best machine alternative among three machine offerings according to twelve evaluation criteria by integrating the entropy method with the SAW method.
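The entropy-plus-SAW integration described above can be sketched as follows; the machines and criteria below are hypothetical (the study uses twelve criteria, abbreviated here to four):

```python
import numpy as np

def entropy_weights(X):
    """Objective criterion weights from the Shannon entropy of the
    decision matrix (rows: alternatives, columns: criteria)."""
    P = X / X.sum(axis=0)
    E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
    d = 1.0 - E
    return d / d.sum()

def saw_scores(X, w, benefit):
    """Simple Additive Weighting: linear-scale normalization followed by
    a weighted sum per alternative."""
    N = np.where(benefit, X / X.max(axis=0), X.min(axis=0) / X)
    return (N * w).sum(axis=1)

# Hypothetical offers: 3 machines x 4 criteria
# (capacity, speed are benefits; price, noise are costs)
X = np.array([[500.0, 12.0, 80.0, 60.0],
              [450.0, 15.0, 95.0, 55.0],
              [520.0, 10.0, 70.0, 65.0]])
benefit = np.array([True, True, False, False])

w = entropy_weights(X)
scores = saw_scores(X, w, benefit)
best_machine = int(np.argmax(scores))
```

Entropy supplies the weights without any subjective manager input, and SAW then collapses the weighted, normalized criteria into one score per machine.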
A Modified Entropy Generation Number for Heat Exchangers
Institute of Scientific and Technical Information of China (English)
无
1996-01-01
This paper demonstrates the difference between the entropy generation number method proposed by Bejan and the method of entropy generation per unit amount of heat transferred in analyzing the thermodynamic performance of heat exchangers, and points out the reason for this difference. A modified entropy generation number for evaluating the irreversibility of heat exchangers is proposed which is consistent with the entropy generation per unit amount of heat transferred. The entropy generated by friction is also investigated. Results show that when the entropy generated by friction in heat exchangers is taken into account, there is a minimum total entropy generation number as the NTU and the ratio of heat capacity rates vary. The existence of this minimum is the prerequisite of heat exchanger optimization.
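The quantity at issue above, entropy generation per unit of heat transferred, follows directly from an inlet/outlet entropy balance; a minimal sketch with hypothetical stream data (incompressible streams, no phase change):

```python
import numpy as np

def exchanger_entropy_generation(m_h, c_h, Th_in, Th_out,
                                 m_c, c_c, Tc_in, Tc_out):
    """Entropy generation rate (W/K) of an adiabatic two-stream heat
    exchanger from inlet/outlet temperatures only:
    S_gen = m_h*c_h*ln(Th_out/Th_in) + m_c*c_c*ln(Tc_out/Tc_in)."""
    return (m_h * c_h * np.log(Th_out / Th_in)
            + m_c * c_c * np.log(Tc_out / Tc_in))

# Hypothetical balanced counterflow case (energy balance satisfied):
m_h = m_c = 1.0            # kg/s
c_h = c_c = 4180.0         # J/(kg K)
Th_in, Th_out = 360.0, 320.0
Tc_in, Tc_out = 300.0, 340.0   # cold stream absorbs what the hot stream gives up

S_gen = exchanger_entropy_generation(m_h, c_h, Th_in, Th_out,
                                     m_c, c_c, Tc_in, Tc_out)
Q = m_h * c_h * (Th_in - Th_out)   # heat transferred, W
Ns_q = S_gen / Q                   # entropy generation per unit heat, 1/K
```

Normalizing S_gen by Q (rather than by a capacity rate, as in the original entropy generation number) is what makes the measure consistent across exchangers that transfer different amounts of heat, which is the point of the modified number; the friction term discussed in the paper would add a positive pressure-drop contribution to S_gen.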
Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis
Chen, Lu; Singh, Vijay P.
2018-02-01
Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing the entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), the Halphen type A distribution (Hal-A), the Halphen type B distribution (Hal-B), and the Halphen type inverse B distribution (Hal-IB), among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. The entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, indicating that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis. The entropy-based derivation led to a new way of performing frequency analysis of hydrometeorological extremes.
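The fit-and-score step described above can be sketched for the GG distribution with SciPy (synthetic data in place of the rainfall records; SciPy's `gengamma` parameterization is assumed to correspond to the paper's GG only loosely):

```python
import numpy as np
from scipy import stats

# Synthetic "extreme rainfall" sample drawn from a generalized gamma
rng = np.random.default_rng(42)
data = stats.gengamma.rvs(a=2.0, c=1.5, scale=10.0, size=500, random_state=rng)

# Maximum-likelihood fit with the location fixed at zero, so only the two
# shape parameters (a, c) and the scale are estimated
a, c, loc, scale = stats.gengamma.fit(data, floc=0)

# Akaike information criterion: AIC = 2k - 2 ln L, with k = 3 free parameters
loglik = stats.gengamma.logpdf(data, a, c, loc=loc, scale=scale).sum()
aic = 2 * 3 - 2 * loglik
```

Repeating this for each candidate family (GG, GB2, Halphen) and comparing AIC values is the model-selection logic the abstract describes; the family with the lowest AIC balances fit quality against parameter count.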
Directory of Open Access Journals (Sweden)
Konchada Pavan Kumar
2016-06-01
Full Text Available The presence of nanoparticles in heat exchangers has been shown to increase heat transfer. The present work focuses on heat transfer in a longitudinal finned tube heat exchanger. Experimentation is done on a longitudinal finned tube heat exchanger with pure water as the working fluid, and the outcome is compared numerically using a computational fluid dynamics (CFD) package based on the finite volume method for different flow rates. Further, a 0.8% volume fraction of aluminum oxide (Al2O3) nanofluid is considered on the shell side. The simulated nanofluid analysis has been carried out using a single-phase approach in CFD by updating the user-defined functions and expressions with the thermophysical properties of the selected nanofluid. These results are thereafter compared against the results obtained for pure water as the shell-side fluid. The entropy generated due to heat transfer and fluid flow is calculated for the nanofluid. Analysis of entropy generation is carried out using the Taguchi technique. Analysis of variance (ANOVA) results show that the inlet temperature on the shell side has the most pronounced effect on entropy generation.
Entropy Generation Analysis of Natural Convection in Square Enclosures with Two Isoflux Heat Sources
Directory of Open Access Journals (Sweden)
S. Z. Nejad
2017-04-01
Full Text Available This study investigates entropy generation resulting from natural convective heat transfer in square enclosures with local heating of the bottom and symmetrical cooling of the sidewalls. The analysis aims to optimize the heat transfer of two pieces of semiconductor in a square electronic package. In this simulation, the heaters are modeled as isoflux heat sources and the sidewalls of the enclosure as isothermal heat sinks. The top wall and the non-heated portions of the bottom wall are adiabatic. Flow and temperature fields are obtained by numerical simulation of the conservation equations of mass, momentum and energy in laminar, steady, two-dimensional flow. With constant heat energy into the cavity, the effects of Rayleigh number, heater length, heater strength ratio and heater position on the flow and temperature fields and on local entropy generation are evaluated. The results show that the minimum entropy generation rate is obtained under the same conditions as the minimum peak heater temperature.
Value at risk estimation with entropy-based wavelet analysis in exchange markets
He, Kaijian; Wang, Lijun; Zou, Yingchao; Lai, Kin Keung
2014-08-01
In recent years, exchange markets have become increasingly integrated. Fluctuations and risks across different exchange markets exhibit co-moving and complex dynamics. In this paper we propose entropy-based multivariate wavelet approaches to analyze the multiscale characteristics in the multidimensional domain and further improve the reliability of Value at Risk estimation. Wavelet analysis is introduced to construct an entropy-based multiscale portfolio Value at Risk estimation algorithm that accounts for the multiscale dynamic correlation. The entropy measure, together with the error minimization principle, is proposed as the more effective criterion for selecting the best basis when determining the wavelet families and the decomposition level to use. The empirical studies conducted in this paper provide positive evidence of the superior performance of the proposed approach, using the closely related Chinese Renminbi and European Euro exchange markets.
Detrended fluctuation analysis and Kolmogorov–Sinai entropy of electroencephalogram signals
International Nuclear Information System (INIS)
Lim, Jung Ho; Khang, Eun Joo; Lee, Tae Hyun; Kim, In Hye; Maeng, Seong Eun; Lee, Jae Woo
2013-01-01
We measured the electroencephalogram (EEG) of young students in a relaxed state and during mathematical activities, and applied detrended fluctuation analysis and the Kolmogorov–Sinai entropy (KSE) to the EEG signals. We found that the detrended fluctuation functions follow a power law with Hurst exponents larger than 1/2. The Hurst exponents were enhanced at all EEG channels during mathematical activities, while the KSE in the relaxed state was larger than that during mathematical activities. This indicates that the entropy is enhanced in the more disordered state of the brain.
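The detrended fluctuation analysis used here can be sketched as follows. This is a minimal illustration, not the authors' code: the window sizes and first-order (linear) detrending are assumptions, and a Hurst exponent above 1/2 then signals persistent long-range correlations.

```python
import numpy as np

def dfa(x, scales=None):
    """Detrended fluctuation analysis: returns the scaling exponent alpha.

    The series is integrated, split into non-overlapping windows of each
    scale, a linear trend is removed per window, and the RMS fluctuation
    F(n) is fitted as F(n) ~ n**alpha on a log-log plot.
    """
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())               # integrated (profile) series
    if scales is None:
        scales = np.unique(
            np.logspace(2, np.log10(len(x) // 4), 12).astype(int))
    fluct = []
    for n in scales:
        n_win = len(y) // n
        segs = y[:n_win * n].reshape(n_win, n)
        t = np.arange(n)
        ms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)      # local linear trend
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(ms)))
    alpha = np.polyfit(np.log(scales), np.log(fluct), 1)[0]
    return float(alpha)
```

For uncorrelated white noise the exponent comes out near 0.5; correlated EEG-like signals give larger values, as reported in the abstract.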
Local entropy generation analysis of a rotary magnetic heat pump regenerator
International Nuclear Information System (INIS)
Drost, M.K.; White, M.D.
1990-01-01
The rotary magnetic heat pump has attractive thermodynamic performance, but this performance is strongly influenced by the effectiveness of the regenerator. This paper uses local entropy generation analysis to evaluate the regenerator design and to suggest design improvements. The results show that the performance of the proposed design is dominated by heat-transfer-related entropy generation. This suggests that enhancement concepts that improve heat transfer should be considered, even if the enhancement causes a significant increase in viscous losses (pressure drop). One enhancement technique, the use of flow disruptors, was evaluated, and the results showed that flow disruptors can significantly reduce thermodynamic losses.
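The trade-off described above rests on splitting the local volumetric entropy generation into a heat-transfer (conduction) term and a viscous-dissipation term. A tiny worked example of that split, with all property values and gradients being illustrative assumptions rather than numbers from the paper:

```python
# Local volumetric entropy generation rate for a simple 1-D convective
# flow, split into conduction and viscous contributions:
#   s''' = k (dT/dx)^2 / T^2  +  (mu / T) (du/dy)^2
k    = 0.6      # thermal conductivity, W/(m K)      (assumed)
mu   = 1.0e-3   # dynamic viscosity, Pa s            (assumed)
T    = 300.0    # local absolute temperature, K      (assumed)
dTdx = 500.0    # temperature gradient, K/m          (assumed)
dudy = 2000.0   # shear rate, 1/s                    (assumed)

s_heat  = k * dTdx**2 / T**2    # heat-transfer (conduction) term, W/(m^3 K)
s_visc  = mu * dudy**2 / T      # viscous dissipation term, W/(m^3 K)
s_total = s_heat + s_visc
bejan   = s_heat / s_total      # Bejan number: heat-transfer share of losses
```

A Bejan number near 1 indicates that heat-transfer irreversibility dominates, which is the regime in which sacrificing some pressure drop for better heat transfer pays off.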
Digital Image Stabilization Method Based on Variational Mode Decomposition and Relative Entropy
Directory of Open Access Journals (Sweden)
Duo Hao
2017-11-01
Cameras mounted on vehicles frequently suffer from image shake due to the vehicles' motion. To remove jitter motions while preserving intentional motions, a hybrid digital image stabilization method is proposed that uses variational mode decomposition (VMD) and relative entropy (RE). The global motion vector (GMV) is first decomposed into several narrow-band modes by VMD. REs, which express the difference between the probability distributions of two modes, are then calculated to identify the intentional and jitter motion modes. Finally, the sum of the jitter motion modes constitutes the jitter motion, and subtracting this sum from the GMV yields the intentional motion. The proposed stabilization method is compared with several known methods, namely the median filter (MF), Kalman filter (KF), wavelet decomposition (WD) method, empirical mode decomposition (EMD)-based method, and an enhanced EMD-based method, to evaluate stabilization performance. Experimental results show that the proposed method outperforms the other stabilization methods.
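The relative entropy used to compare decomposed motion modes is the Kullback-Leibler divergence between their amplitude distributions. A minimal sketch, assuming histogram-based distributions (the VMD step itself is omitted, and all names and bin settings are illustrative):

```python
import numpy as np

def relative_entropy(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p||q) between two histograms."""
    p = np.asarray(p, float); q = np.asarray(q, float)
    p = p / p.sum(); q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def mode_histogram(mode, bins, value_range):
    """Probability distribution of a motion mode's amplitude values."""
    hist, _ = np.histogram(mode, bins=bins, range=value_range)
    return hist / hist.sum()
```

Modes whose distributions are close to the slowly varying intentional-motion mode get small RE values; jitter modes stand out with large RE.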
Music viewed by its entropy content: A novel window for comparative analysis
Febres, Gerardo; Jaffe, Klaus
2017-01-01
Polyphonic music files were analyzed using the set of symbols that produced the minimal entropy description, which we call the Fundamental Scale. This allowed us to create a novel space in which to represent music pieces, by developing: (a) a method to adjust a textual description from its original scale of observation to an arbitrarily selected scale; (b) a method to model the structure of any textual description based on the shape of the symbol frequency profiles; and (c) the concept of higher-order entropy as the entropy associated with the deviations of a frequency-ranked symbol profile from a perfect Zipfian profile. We call this diversity index the '2nd-order entropy'. Applying these methods to a variety of musical pieces showed how the spaces of 'symbolic specific diversity-entropy' and of '2nd-order entropy' capture characteristics that are unique to each music type, style, composer and genre. Some clustering of these properties around each musical category is shown. These methods allow us to visualize a historic trajectory of academic music across this space, from medieval to contemporary academic music. We show that describing musical structures using entropy, symbol frequency profiles and specific symbolic diversity allows us to characterize traditional and popular expressions of music. These classification techniques promise to be useful in other disciplines for pattern recognition and machine learning. PMID:29040288
Estimation of Lithological Classification in Taipei Basin: A Bayesian Maximum Entropy Method
Wu, Meng-Ting; Lin, Yuan-Chien; Yu, Hwa-Lung
2015-04-01
In environmental and other scientific applications, a certain understanding of the geological lithological composition is required. Because of practical restrictions, only a limited amount of data can be acquired. To determine the lithological distribution in a study area, many spatial statistical methods have been used to estimate the lithological composition at unsampled points or grids. This study applies the Bayesian Maximum Entropy (BME) method, an emerging method in geological spatiotemporal statistics. The BME method can identify the spatiotemporal correlation of the data and combine not only hard data but also soft data to improve estimation. Lithological classification data are discrete categorical data; therefore, this research applies categorical BME to establish a complete three-dimensional lithological estimation model. The limited hard data from cores and the soft data generated from geological dating data and virtual wells are used to estimate the three-dimensional lithological classification in the Taipei Basin. Keywords: Categorical Bayesian Maximum Entropy method, Lithological Classification, Hydrogeological Setting
The entropy concept. A powerful tool for multiphase flow analysis
International Nuclear Information System (INIS)
Kolev, Nikolay Ivanov
2007-01-01
This work summarizes the system of partial differential equations describing multiphase, multi-component flows in arbitrary geometry, including porous structures, with arbitrary thermal and mechanical interactions among the fields and between each field and the structure. Each fluid is modeled as a universal mixture of miscible and immiscible components. The system contains rigorously derived entropy equations, which are used instead of the primitive form of energy conservation. Based on well-established mathematical theorems, the equations are local volume- and time-averaged. The so-called volume conservation equation, which establishes a close coupling between the pressure and density changes of all participating velocity fields, is presented; it replaces one of the mass conservation equations. The system is solved within the computer code system IVA, together with a large number of constitutive relationships needed to close it in arbitrary geometry. The extensive validation against many hundreds of simple and complex experiments, including many industrial applications, demonstrates the versatility and power of this analytical tool for designing complex industrial processes and analyzing complex processes in nature. (author)
Low Streamflow Forecasting using Minimum Relative Entropy
Cui, H.; Singh, V. P.
2013-12-01
Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation function in such a way that the relative entropy of the underlying process is minimized, so that the time series can be forecasted. Different priors, such as uniform, exponential and Gaussian assumptions, are used to estimate the spectral density, depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of the low streamflow series with higher resolution than conventional methods. The forecasted streamflow is compared with predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy, and the advantages and disadvantages of each method in forecasting low streamflow are discussed.
Energy Technology Data Exchange (ETDEWEB)
Zunino, Luciano, E-mail: lucianoz@ciop.unlp.edu.ar [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: olivaresfe@gmail.com [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Scholkmann, Felix, E-mail: Felix.Scholkmann@gmail.com [Research Office for Complex Physical and Biological Systems (ROCoS), Mutschellenstr. 179, 8038 Zurich (Switzerland); Biomedical Optics Research Laboratory, Department of Neonatology, University Hospital Zurich, University of Zurich, 8091 Zurich (Switzerland); Rosso, Osvaldo A., E-mail: oarosso@gmail.com [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)
2017-06-15
A symbolic encoding scheme, based on the ordinal relation between the amplitudes of neighboring values of a given data sequence, must be implemented before estimating the permutation entropy. Consequently, equalities in the analyzed signal, i.e. repeated equal values, deserve special attention and treatment. In this work, we carefully study the effect that the presence of equalities has on permutation entropy estimates when these ties are symbolized, as is commonly done, according to their order of appearance. On the one hand, computer-generated time series are analyzed to understand the incidence of repeated values on permutation entropy estimates in controlled scenarios: the presence of temporal correlations is erroneously concluded when truly pseudorandom time series with low amplitude resolution are considered. On the other hand, real-world data are analyzed to illustrate how the presence of a significant number of equal values can give rise to false conclusions regarding the underlying temporal structures in practical contexts. - Highlights: • Impact of repeated values in a signal when estimating permutation entropy is studied. • Numerical and experimental tests are included for characterizing this limitation. • Non-negligible temporal correlations can be spuriously concluded by repeated values. • Data digitized with low amplitude resolutions could be especially affected. • Analysis with shuffled realizations can help to overcome this limitation.
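The bias described above can be reproduced in a few lines. The sketch below (illustrative, not the authors' code) computes normalized permutation entropy with ties broken by order of appearance via a stable argsort, which is exactly the convention whose side effects the paper analyzes:

```python
import math
import numpy as np

def permutation_entropy(x, m=3, tau=1):
    """Normalized permutation entropy of order m (in [0, 1]).

    Ties are symbolized according to their order of appearance
    (stable argsort), the common convention studied in the paper.
    """
    x = np.asarray(x, float)
    patterns = {}
    for i in range(len(x) - (m - 1) * tau):
        window = x[i:i + m * tau:tau]
        pat = tuple(np.argsort(window, kind="stable"))
        patterns[pat] = patterns.get(pat, 0) + 1
    counts = np.array(list(patterns.values()), float)
    p = counts / counts.sum()
    h = -np.sum(p * np.log(p))
    return float(h / math.log(math.factorial(m)))
```

Coarsely quantizing a pseudorandom series (many repeated values) pushes the estimate below the near-1 value of the full-resolution series, mimicking temporal correlation that is not actually there.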
Directory of Open Access Journals (Sweden)
Kartik V. Bulusu
2015-09-01
The coherent secondary flow structures (i.e., swirling motions) in a curved artery model possess a variety of spatio-temporal morphologies and can be encoded over an infinitely wide range of wavelet scales. Wavelet analysis was applied to the following vorticity fields: (i) a numerically generated system of Oseen-type vortices for which the theoretical solution is known, used for benchmarking and evaluation of the technique; and (ii) experimental two-dimensional particle image velocimetry data. The mother wavelet, a two-dimensional Ricker wavelet, can be dilated to infinitely large or infinitesimally small scales. We approached the problem of coherent structure detection by means of the continuous wavelet transform (CWT) and decomposition (Shannon) entropy. The main conclusion of this study is that the encoding of coherent secondary flow structures can be achieved by an optimal number of binary digits (bits), corresponding to an optimal wavelet scale. The optimal-wavelet-scale search was driven by a decomposition-entropy-based algorithmic approach and led to a threshold-free coherent structure detection method. The method presented in this paper was successfully used to detect secondary flow structures in three clinically relevant blood flow scenarios involving the curved artery model under a carotid-artery-inspired, pulsatile inflow condition: (i) a clean curved artery; (ii) a stent-implanted curved artery; and (iii) an idealized Type IV stent fracture within the curved artery.
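The entropy-driven scale search can be illustrated in one dimension. The sketch below (an assumed 1-D analogue of the paper's 2-D Ricker CWT, with illustrative names and parameters) picks the wavelet scale whose coefficients minimize the Shannon entropy of the normalized coefficient energy:

```python
import numpy as np

def ricker(points, a):
    """Ricker ('Mexican hat') wavelet: 2nd derivative of a Gaussian."""
    t = np.arange(points) - (points - 1) / 2.0
    return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt_row(signal, a):
    """One row of a continuous wavelet transform at scale a."""
    w = ricker(min(10 * int(a), len(signal)), a)
    return np.convolve(signal, w, mode="same")

def shannon_entropy(coeffs):
    """Shannon entropy of the normalized coefficient energy."""
    e = coeffs ** 2
    p = e / e.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def best_scale(signal, scales):
    """Scale whose CWT coefficients minimize the decomposition entropy."""
    ents = [shannon_entropy(cwt_row(signal, a)) for a in scales]
    return scales[int(np.argmin(ents))]
```

A low-entropy scale concentrates the coefficient energy in few coefficients (few bits), which is the threshold-free detection criterion the abstract describes.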
Energy Technology Data Exchange (ETDEWEB)
Holden, Helge; Karlsen, Kenneth H.; Lie, Knut-Andreas
1999-10-01
We present and analyze a numerical method for the solution of a class of scalar, multi-dimensional, nonlinear degenerate convection-diffusion equations. The method is based on operator splitting to separate the convective and the diffusive terms in the governing equation. The nonlinear, convective part is solved using front tracking and dimensional splitting, while the nonlinear diffusion equation is solved by a suitable difference scheme. We verify L{sup 1} compactness of the corresponding set of approximate solutions and derive precise entropy estimates. In particular, these results allow us to pass to the limit in our approximations and recover an entropy solution of the problem in question. The theory presented covers a large class of equations. Important subclasses are hyperbolic conservation laws, porous medium type equations, two-phase reservoir flow equations, and strongly degenerate equations coming from the recent theory of sedimentation-consolidation processes. A thorough numerical investigation of the method analyzed in this paper (and similar methods) is presented in a companion paper. (author)
Entropy Based Test Point Evaluation and Selection Method for Analog Circuit Fault Diagnosis
Directory of Open Access Journals (Sweden)
Yuan Gao
2014-01-01
By simplifying the tolerance problem and treating the faulty voltages at different test points as independent variables, the integer-coded table technique simplifies the test point selection process. However, simplifying the tolerance problem may lead to a wrong solution, while the independence assumption yields overly conservative results. To address these problems, the tolerance problem is considered thoroughly in this paper, and the dependency between different test points is taken into account at the same time. A heuristic graph search method is proposed to facilitate the test point selection process. First, the information-theoretic concept of entropy is used to evaluate the optimality of a test point; the entropy is calculated using the ambiguity sets and the faulty voltage distributions determined by component tolerances. Second, the selected optimal test point is used to expand the current graph node, using the dependence relationship between the test point and the graph node. Simulation results indicate that the proposed method finds the optimal set of test points more accurately than other methods; it is therefore a good solution for minimizing the size of the test point set. To simplify and clarify the proposed method, only catastrophic and some specific parametric faults are discussed in this paper.
Effective updating process of seismic fragilities using Bayesian method and information entropy
International Nuclear Information System (INIS)
Kato, Masaaki; Takata, Takashi; Yamaguchi, Akira
2008-01-01
Seismic probabilistic safety assessment (SPSA) is an effective method for evaluating the overall seismic safety performance of a plant. Seismic fragilities are estimated to quantify the seismically induced accident sequences. It is a great concern that SPSA results involve uncertainties, part of which come from the uncertainty in the seismic fragility of equipment and systems. A straightforward approach to reducing this uncertainty is to perform seismic qualification tests and to reflect the results in the seismic fragility estimate. In this paper, we propose a figure of merit for finding the most cost-effective conditions for seismic qualification tests in terms of the acceleration level and the number of components tested. A mathematical method to reflect the test results in the fragility update is then developed, using a Bayesian method for the update procedure. Since the lognormal distribution used for the fragility model does not have a Bayes conjugate prior, a parameterization method is proposed so that the posterior distribution expresses the characteristics of the fragility. The information entropy is used as the figure of merit to express the importance of the obtained evidence. It is found that the information entropy is strongly associated with the uncertainty of the fragility. (author)
Relations Among Some Fuzzy Entropy Formulae
Institute of Scientific and Technical Information of China (English)
卿铭
2004-01-01
Fuzzy entropy has been widely used to analyze and design fuzzy systems, and many fuzzy entropy formulae have been proposed. For a further in-depth analysis of fuzzy entropy, the axioms and some important formulae of fuzzy entropy are introduced. Some equivalence results among these fuzzy entropy formulae are proved, and it is shown that fuzzy entropy is a special distance measure.
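One of the classical formulae such axiom systems cover is the De Luca-Termini fuzzy entropy. A minimal sketch (the normalization by n·log 2, so that the value lies in [0, 1], is an assumed convention):

```python
import numpy as np

def deluca_termini_entropy(mu):
    """De Luca-Termini fuzzy entropy of a membership vector mu in [0, 1].

    Equals 1 when every membership is 0.5 (maximal fuzziness) and 0 for
    a crisp set (all memberships exactly 0 or 1).
    """
    mu = np.asarray(mu, float)
    with np.errstate(divide="ignore", invalid="ignore"):
        s = mu * np.log(mu) + (1 - mu) * np.log(1 - mu)
    s = np.where((mu == 0) | (mu == 1), 0.0, s)   # convention: 0*log 0 = 0
    return float(-s.sum() / (len(mu) * np.log(2)))
```

The four De Luca-Termini axioms (zero on crisp sets, maximum at 0.5, sharpening monotonicity, symmetry under complement) are easy to check directly against this formula.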
Improvement of the detector resolution in X-ray spectrometry by using the maximum entropy method
International Nuclear Information System (INIS)
Fernández, Jorge E.; Scot, Viviana; Giulio, Eugenio Di; Sabbatucci, Lorenzo
2015-01-01
In every X-ray spectroscopy measurement, the influence of the detection system causes a loss of information. Different mechanisms contribute to form the so-called detector response function (DRF): the detector efficiency, the escape of photons as a consequence of photoelectric or scattering interactions, the spectrum smearing due to the finite energy resolution, and, in solid-state detectors (SSDs), the charge collection artifacts. To recover the original spectrum, it is necessary to remove the detector influence by solving the so-called inverse problem. The maximum entropy unfolding technique solves this problem by imposing a set of constraints, taking advantage of the known a priori information and preserving the positivity of the X-ray spectrum. This method has been included in the tool UMESTRAT (Unfolding Maximum Entropy STRATegy), which adopts a semi-automatic strategy to solve the unfolding problem based on a suitable combination of the codes MAXED and GRAVEL, developed at PTB. In the past, UMESTRAT proved capable of resolving characteristic peaks that appeared overlapped in Si SSD measurements, giving good qualitative results. In order to obtain quantitative results, UMESTRAT has been modified to include the additional constraint of the total number of photons in the spectrum, which can easily be determined by inverting the diagonal efficiency matrix. The features of the improved code are illustrated with examples of unfolding for three commonly used SSDs: Si, Ge, and CdTe. The quantitative unfolding can be considered a software improvement of the detector resolution. - Highlights: • Radiation detection introduces distortions in X- and Gamma-ray spectrum measurements. • UMESTRAT is a graphical tool to unfold X- and Gamma-ray spectra. • UMESTRAT uses the maximum entropy method. • UMESTRAT’s new version produces unfolded spectra with quantitative meaning. • UMESTRAT is a software tool to improve the detector resolution.
Institute of Scientific and Technical Information of China (English)
S.Talebi; M.M.Valoujerdi
2017-01-01
The present paper discusses entropy generation in fully developed turbulent flow through subchannels arranged in square and triangular arrays. Entropy generation is due to the contributions of both heat transfer and pressure drop. Our main objective is to study the effects of key parameters such as the spacer grid, fuel rod power distribution, Reynolds number Re, dimensionless heat power ω, length-to-fuel-diameter ratio λ, and pitch-to-diameter ratio ξ on subchannel entropy generation. The analysis explicitly shows the contributions of heat transfer and pressure drop to the total entropy generation. An analytical formulation of the total entropy generation is introduced for situations with uniform and sinusoidal rod power distributions. It is concluded that the power distribution affects entropy generation: a smoother power profile leads to less entropy generation. Square rod array bundles perform better than triangular rod arrays in terms of entropy generation, and spacer grids generate additional entropy.
Directory of Open Access Journals (Sweden)
John McCamley
2017-01-01
The aim of this investigation was to compare and contrast the use of cross sample entropy (xSE) and cross recurrence quantification analysis (cRQA) measures for assessing the coupling of rhythmical patterns. The measures were assessed using simulated signals with regular, chaotic, and random fluctuations in frequency, amplitude, and a combination of both. Biological data were studied as models of normal and abnormal locomotor-respiratory coupling. Nine signal types were generated for seven frequency ratios. Fifteen patients with COPD (abnormal coupling) and twenty-one healthy controls (normal coupling) walked on a treadmill at three speeds while breathing and walking were recorded. xSE and the cRQA measures of percent determinism, maximum line, mean line, and entropy were quantified for both the simulated and experimental data. In the simulated data, xSE, percent determinism, and entropy were influenced by the frequency manipulation. The 1:1 frequency ratio differed from the other frequency ratios for almost all measures and/or manipulations. The patients with COPD used a 2:3 ratio more often, and xSE, percent determinism, maximum line, mean line, and cRQA entropy were able to discriminate between the groups. Analysis of the effects of walking speed indicated that all measures were able to discriminate between speeds.
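Cross sample entropy quantifies how often patterns in one series recur in the other. A compact sketch (illustrative, not the study's implementation; the embedding dimension m, tolerance r, and z-scoring are assumed conventions):

```python
import numpy as np

def cross_sample_entropy(u, v, m=2, r=0.2):
    """Cross sample entropy of two series (Chebyshev template distance).

    Counts template matches of lengths m and m+1 between series u and v;
    xSE = -ln(A/B). Lower values mean stronger coupling (more shared
    patterns). r is the tolerance after standardizing both series.
    """
    u = np.asarray(u, float); v = np.asarray(v, float)
    u = (u - u.mean()) / u.std()
    v = (v - v.mean()) / v.std()

    def count(mm):
        tu = np.array([u[i:i + mm] for i in range(len(u) - mm + 1)])
        tv = np.array([v[j:j + mm] for j in range(len(v) - mm + 1)])
        # pairwise Chebyshev distance between all template pairs
        d = np.max(np.abs(tu[:, None, :] - tv[None, :, :]), axis=2)
        return np.sum(d <= r)

    b, a = count(m), count(m + 1)
    return float(-np.log(a / b))
```

Tightly coupled rhythms (e.g. breathing locked to stride) give low xSE; independent signals give high xSE, which is the direction of the group differences reported above.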
Analysis of crude oil markets with improved multiscale weighted permutation entropy
Niu, Hongli; Wang, Jun; Liu, Cheng
2018-03-01
Entropy measures have recently been used extensively to study complexity in nonlinear systems. Weighted permutation entropy (WPE) overcomes the neglect of amplitude information in PE and shows a distinctive ability to extract complexity information from data with abrupt changes in magnitude. The improved (sometimes called composite) multiscale (MS) method has the advantage of reducing errors and improving accuracy when evaluating multiscale entropy values of time series that are not sufficiently long. In this paper, we combine the merits of WPE and the improved MS method to propose improved multiscale weighted permutation entropy (IMWPE) for investigating the complexity of a time series. The method is validated as effective on artificial data (white noise and 1/f noise) and on real market data for Brent and Daqing crude oil. The complexity properties of the crude oil markets are then explored for the return series, for volatility series with multiple exponents, and for the EEMD-produced intrinsic mode functions (IMFs), which represent different frequency components of the return series. Moreover, the instantaneous amplitude and frequency of Brent and Daqing crude oil are analyzed by applying the Hilbert transform to each IMF.
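The two ingredients combined above can be sketched briefly. This is an illustrative reading, not the authors' code: WPE weights each ordinal pattern by its window variance, and the improved (composite) multiscale step averages the entropy over all coarse-graining offsets at a given scale:

```python
import math
import numpy as np

def weighted_permutation_entropy(x, m=3, tau=1):
    """Normalized weighted permutation entropy: each ordinal pattern is
    weighted by the variance of its own window, so large-amplitude
    excursions count more than flat segments."""
    x = np.asarray(x, float)
    weights = {}
    for i in range(len(x) - (m - 1) * tau):
        w = x[i:i + m * tau:tau]
        pat = tuple(np.argsort(w))
        weights[pat] = weights.get(pat, 0.0) + np.var(w)
    wv = np.array(list(weights.values()))
    p = wv / wv.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / math.log(math.factorial(m)))

def coarse_grain(x, s):
    """Multiscale coarse-graining: non-overlapping means of length s."""
    n = len(x) // s
    return np.asarray(x[:n * s], float).reshape(n, s).mean(axis=1)

def improved_mwpe(x, s, m=3):
    """Improved (composite) multiscale WPE at scale s: average the WPE
    over the s possible coarse-graining offsets to reduce the error."""
    return float(np.mean([
        weighted_permutation_entropy(coarse_grain(x[k:], s), m)
        for k in range(s)]))
```

White noise stays near the maximum across scales, while strongly regular series score much lower, which is the contrast the validation on artificial data exploits.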
Maximum Entropy Methods as the Bridge Between Microscopic and Macroscopic Theory
Taylor, Jamie M.
2016-09-01
This paper is concerned with an investigation into a function of macroscopic variables known as the singular potential, building on previous work by Ball and Majumdar. The singular potential is a function of the admissible statistical averages of probability distributions on a state space, defined so that it corresponds to the maximum possible entropy given known observed statistical averages, although non-classical entropy-like objective functions will also be considered. First the set of admissible moments must be established, and under the conditions presented in this work the set is open, bounded and convex allowing a description in terms of supporting hyperplanes, which provides estimates on the development of singularities for related probability distributions. Under appropriate conditions it is shown that the singular potential is strictly convex, as differentiable as the microscopic entropy, and blows up uniformly as the macroscopic variable tends to the boundary of the set of admissible moments. Applications of the singular potential are then discussed, and particular consideration will be given to certain free-energy functionals typical in mean-field theory, demonstrating an equivalence between certain microscopic and macroscopic free-energy functionals. This allows statements about L^1-local minimisers of Onsager's free energy to be obtained which cannot be given by two-sided variations, and overcomes the need to ensure local minimisers are bounded away from zero and +∞ before taking L^∞ variations. The analysis also permits the definition of a dual order parameter for which Onsager's free energy allows an explicit representation. Also, the difficulties in approximating the singular potential by everywhere defined functions, in particular by polynomial functions, are addressed, with examples demonstrating the failure of the Taylor approximation to preserve relevant shape properties of the singular potential.
Another Method of Building 2D Entropy to Realize Automatic Segmentation
International Nuclear Information System (INIS)
Zhang, Y F; Zhang, Y
2006-01-01
The 2D entropy formed while building a 2D histogram can be used to realize automatic segmentation. The traditional method uses the grey value of the central pixel together with the mean grey value of some or all of the pixels in its 4-neighbourhood to build the 2D histogram. In fact, the change of greyscale between two fixed position vectors cannot represent the overall characteristics of neighbouring pixels very well. A new method is proposed which uses the minimum grey value in the 4-neighbourhood and the maximum grey value in the 3×3 neighbourhood excluding the pixels of the 4-neighbourhood. The new method and the traditional one are compared in realizing automatic image segmentation, and experimental results on classical images show that the new method is effective.
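The 2D-entropy segmentation idea can be sketched as follows. For brevity this sketch uses the traditional (grey value, 4-neighbour mean) histogram rather than the paper's min/max features; the quantization to 64 levels and the quadrant entropy criterion are assumed conventions:

```python
import numpy as np

def two_d_entropy_threshold(img, levels=64):
    """Simplified 2-D maximum-entropy threshold selection.

    Builds the 2-D histogram of (pixel grey level, 4-neighbour mean),
    then picks the threshold pair (s, t) maximizing the sum of Shannon
    entropies of the background and object diagonal quadrants.
    Returns the grey-level threshold s (in 0..255 units).
    """
    img = np.asarray(img, float)
    pad = np.pad(img, 1, mode="edge")   # clamp edges
    nmean = (pad[:-2, 1:-1] + pad[2:, 1:-1]
             + pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0
    q = lambda a: np.clip((a / 256.0 * levels).astype(int), 0, levels - 1)
    hist, _, _ = np.histogram2d(q(img).ravel(), q(nmean).ravel(),
                                bins=levels, range=[[0, levels], [0, levels]])
    p = hist / hist.sum()

    def ent(block):
        b = block[block > 0]
        if b.size == 0:
            return 0.0
        b = b / b.sum()
        return float(-np.sum(b * np.log(b)))

    best, best_s = -np.inf, 1
    for s in range(1, levels):
        for t in range(1, levels):
            h = ent(p[:s, :t]) + ent(p[s:, t:])  # background + object
            if h > best:
                best, best_s = h, s
    return best_s * 256.0 / levels
```

On a bimodal image the maximizing threshold falls between the two grey-level clusters, splitting the pixels into well-separated classes.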
Directory of Open Access Journals (Sweden)
N. Akmal
2018-05-01
The present study gives an account of the heat transfer characteristics of the squeezing flow of a nanofluid between two flat plates, with the upper plate moving vertically and the lower one horizontally. The Tiwari and Das nanofluid model is utilized to give a comparative analysis of the heat transfer in Cu-water and Al2O3-water nanofluids, with entropy generation. The modeling includes Lorentz forces to capture the effect of a magnetic field on the flow, and the Joule heating effect is included to discuss heat dissipation in the fluid and its effect on the entropy of the system. The nondimensional ordinary differential equations are solved using the Keller box method, and the numerical results are presented in graphs and tables. An interesting observation is that more entropy is generated near the lower plate than near the upper plate. Also, the heat transfer rate is found to be higher for Cu nanoparticles than for Al2O3 nanoparticles.
Roostaee, M.; Deng, Z.
2017-12-01
State environmental agencies are required by the Clean Water Act to assess all waterbodies and evaluate potential sources of impairment. The spatial and temporal distributions of water quality parameters are critical for identifying Critical Source Areas (CSAs). However, due to limited monetary resources and the large number of waterbodies, monitoring stations are typically sparse, with intermittent periods of data collection. The scarcity of water quality data is thus a major obstacle to addressing sources of pollution through management strategies. In this study, the spatiotemporal Bayesian Maximum Entropy (BME) method is employed to model the inherent temporal and spatial variability of measured water quality indicators, such as dissolved oxygen (DO) concentration, for the Turkey Creek Watershed. Turkey Creek is located in northern Louisiana and has been on the 303(d) list for DO impairment since 2014 in the Louisiana Water Quality Inventory Reports, due to agricultural practices. The BME method has been shown to provide more accurate estimates than purely spatial analysis methods by incorporating the space/time distribution and the uncertainty in the available measured soft and hard data. This model is used to estimate DO concentrations at unmonitored locations and times and subsequently to identify CSAs. The USDA's crop-specific land cover data layers for the watershed were then used to determine the practices/changes that led to low DO concentrations in the identified CSAs. Preliminary results revealed that the cultivation of corn and soybean, as well as urban runoff, are the main sources contributing to low dissolved oxygen in the Turkey Creek Watershed.
Directory of Open Access Journals (Sweden)
Shuyu Dai
2017-10-01
As an important implementing body of the national energy strategy, grid enterprises bear the responsibility of optimizing the allocation of energy resources and serving economic and social development, and their level of sustainable development has a direct impact on the national economy and social life. In this paper, a model combining the fuzzy group ideal point method with a combination weighting method, based on an improved group order relation method and the entropy weight method, is proposed to evaluate the sustainable development of power grid enterprises. First, on the basis of an extensive literature review, important criteria for the comprehensive evaluation of the sustainable development of power grid enterprises are preliminarily selected. The opinions of industry experts are collected and fed back over several rounds through the Delphi method, the evaluation criteria system for the sustainable development of power grid enterprises is determined, and the criteria are made consistent and dimensionless. Then, based on the basic order relation method, the judgment matrices of the experts are synthesized to construct compound matter elements; matter element analysis yields the subjective weights of the criteria, while the entropy weight method determines the objective weights of the preprocessed criteria. Combining the subjective and objective information through a combination weighting method based on the consistency of subjectively and objectively weighted attribute values, a more comprehensive, reasonable and accurate combination weight is calculated. Finally, based on the traditional TOPSIS method, triangular fuzzy numbers are introduced to better handle data that are difficult to quantify, and the queuing indication value and ranking of each object are obtained. A numerical example is given to demonstrate the proposed model.
Analyzing the Performances of Automotive Companies Using Entropy Based MAUT and SAW Methods
Directory of Open Access Journals (Sweden)
Nuri Ömürbek
2016-06-01
In this study, the performances of automotive companies traded on the BİST (Istanbul Stock Exchange) and operating in Turkey are compared using multi-criteria decision-making techniques. Data for the most important automotive companies operating in Turkey were analyzed based on capital, stock certificates, market value, sales revenue, number of employees, net profit margin, current ratio, net profit/capital, net profit/sales and net sales/number of employees. The criteria used for performance measurement were obtained from the companies' 2014 operating reports. The entropy method was used to determine the weights of the criteria, and those weights were then used in the MAUT (Multi-Attribute Utility Theory) and SAW (Simple Additive Weighting) methods to rank the automotive companies' performances. The findings highlight that the same companies took the first three places in both methods.
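The entropy-weighting and SAW steps described above can be sketched compactly. This is an illustrative pipeline, not the study's code: the toy decision matrix is invented, and all criteria are treated as benefit criteria (max-normalized), which is an assumption:

```python
import numpy as np

def entropy_weights(X):
    """Objective criteria weights from the Shannon entropy of the
    normalized decision matrix X (rows = alternatives, cols = criteria):
    low-entropy (more discriminating) criteria get larger weights."""
    P = X / X.sum(axis=0)
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)   # entropy per criterion
    d = 1 - e                                 # degree of divergence
    return d / d.sum()

def saw_scores(X, w):
    """Simple Additive Weighting: normalize each benefit criterion by
    its column maximum, then take the weighted sum per alternative."""
    R = X / X.max(axis=0)
    return R @ w
```

Ranking the alternatives by `saw_scores` descending reproduces the SAW ordering; the same entropy weights can feed a MAUT utility function instead.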
Multivariate Multi-Scale Permutation Entropy for Complexity Analysis of Alzheimer’s Disease EEG
Directory of Open Access Journals (Sweden)
Isabella Palamara
2012-07-01
Full Text Available An original multivariate multi-scale methodology for assessing the complexity of physiological signals is proposed. The technique is able to incorporate the simultaneous analysis of multi-channel data as a unique block within a multi-scale framework. The basic complexity measure is performed using Permutation Entropy, a methodology for time series processing based on ordinal analysis. Permutation Entropy is conceptually simple, structurally robust to noise and artifacts, and computationally very fast, which is relevant for designing portable diagnostics. Since time series derived from biological systems show structures on multiple spatial-temporal scales, the proposed technique can also be useful for other types of biomedical signal analysis. In this work, the possibility of distinguishing among the brain states of Alzheimer's disease patients, Mild Cognitive Impairment subjects, and normal healthy elderly controls is tested on a real, although quite limited, experimental database.
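The single-channel Permutation Entropy measure underlying the method (the paper's multivariate multi-scale extension builds on it) can be sketched as:

```python
import math

def permutation_entropy(series, order=3, delay=1):
    """Normalized permutation entropy (Bandt-Pompe) of a 1-D series."""
    counts = {}
    n_windows = len(series) - (order - 1) * delay
    for i in range(n_windows):
        window = [series[i + j * delay] for j in range(order)]
        # Ordinal pattern: argsort of the window values
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = [c / n_windows for c in counts.values()]
    h = -sum(q * math.log(q) for q in probs)
    h = max(h, 0.0)  # guard against -0.0 when a single pattern dominates
    return h / math.log(math.factorial(order))  # normalize to [0, 1]

# A monotone ramp produces a single ordinal pattern, hence zero complexity
print(permutation_entropy(list(range(100))))  # 0.0
```

Because only the relative order of values inside each window matters, the measure is insensitive to monotone distortions of the signal amplitude, which is part of its robustness to artifacts.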
Estimation of typhoon rainfall in GaoPing River: A Multivariate Maximum Entropy Method
Pei-Jui, Wu; Hwa-Lung, Yu
2016-04-01
Heavy rainfall from typhoons is the main cause of natural disasters in Taiwan, resulting in significant loss of human lives and property. On average, 3.5 typhoons strike Taiwan every year, and Typhoon Morakot in 2009 was among the most severe on record. Because the duration, path and intensity of a typhoon affect the temporal and spatial rainfall pattern in a given region, characterizing typhoon rainfall types is advantageous when estimating rainfall quantities. This study developed a rainfall prediction model in three parts. First, extended empirical orthogonal functions (EEOF) are used to classify typhoon events, decomposing the standardized rainfall pattern of all stations for each event into EOFs and principal components (PCs), so that events varying similarly in time and space are grouped into the same typhoon type. Next, based on this classification, probability density functions (PDFs) are constructed over space and time by applying multivariate maximum entropy to the first through fourth statistical moments, yielding the probability at each station and time. Finally, the Bayesian Maximum Entropy (BME) method is used to construct the typhoon rainfall prediction model and to estimate rainfall for the GaoPing River, located in southern Taiwan. This study could be useful for future typhoon rainfall prediction and suitable for government typhoon disaster prevention.
A projection-adapted cross entropy (PACE) method for transmission network planning
Energy Technology Data Exchange (ETDEWEB)
Eshragh, Ali; Filar, Jerzy [University of South Australia, School of Mathematics and Statistics, Mawson Lakes, SA (Australia); Nazar, Asef [University of South Australia, Institute for Sustainable Systems Technologies, School of Mathematics and Statistics, Mawson Lakes, SA (Australia)
2011-06-15
In this paper, we propose an adaptation of the cross entropy (CE) method called projection-adapted CE (PACE) to solve a transmission expansion problem that arises in the management of national and provincial electricity grids. The aim of the problem is to find an expansion policy that is both economical and operationally sound from the technical perspective. The transmission network expansion problem is often formulated mathematically as a mixed integer nonlinear program that is algorithmically very challenging, because a global optimum must be found despite the presence of a possibly huge number of local optima. The PACE method shows promise in solving global optimization problems regardless of continuity or other assumptions. In our approach, we sample the integer variables using the CE mechanism and solve LPs to obtain the matching continuous variables. Numerical results on selected test systems demonstrate the potential of this approach. (orig.)
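The generic CE mechanism for sampling the integer variables can be sketched as a textbook CE loop over binary vectors; this is a minimal sketch, not the PACE algorithm itself, and the LP step for the matching continuous variables is omitted:

```python
import random

def cross_entropy_binary(score, n_bits, n_samples=200, elite_frac=0.1,
                         n_iters=30, smoothing=0.7, seed=0):
    """Basic cross-entropy optimization over binary vectors."""
    rng = random.Random(seed)
    p = [0.5] * n_bits  # Bernoulli sampling parameters, one per bit
    for _ in range(n_iters):
        # Sample candidate solutions from the current distribution
        samples = [[1 if rng.random() < p[j] else 0 for j in range(n_bits)]
                   for _ in range(n_samples)]
        # Keep the elite fraction with the best objective values
        samples.sort(key=score, reverse=True)
        elite = samples[:max(1, int(elite_frac * n_samples))]
        # Move each Bernoulli parameter toward the elite sample frequency
        for j in range(n_bits):
            freq = sum(s[j] for s in elite) / len(elite)
            p[j] = smoothing * freq + (1 - smoothing) * p[j]
    return [round(pj) for pj in p]

# Toy objective: prefer the all-ones vector
best = cross_entropy_binary(score=sum, n_bits=10)
print(best)
```

In a transmission-planning setting, `score` would instead evaluate an expansion policy, e.g. by solving an LP for the continuous operating variables implied by the sampled integer decisions.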
International Nuclear Information System (INIS)
Werner, K.D.
1990-01-01
In this paper we briefly introduce the Geometrical Shock Correction (GSC) method and consider various fields of application, with special emphasis on two-phase flow problems in porous media, from which some test problems are taken. GSC is a very efficient numerical method for constructing the entropy solution of the Cauchy problem for scalar hyperbolic conservation laws (with source term) in one space dimension and in specific two-dimensional cases. The novelty consists in constructing the solution at an arbitrary fixed time t=T>0 in one time step, based on transporting the initial values along characteristics and, if shocks appear, on correcting the multivalued relation by a geometrical averaging technique. (orig.)
Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan
2018-01-01
Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate EGR performance and determine the optimal EGR rate, a task that currently lacks clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters; the optimal compromise between diesel engine combustion and emission performance is thereby transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate its feasibility for providing theoretical support and a reference for further EGR optimization.
CFD Prediction of Airfoil Drag in Viscous Flow Using the Entropy Generation Method
Directory of Open Access Journals (Sweden)
Wei Wang
2018-01-01
Full Text Available A new aerodynamic drag prediction approach was developed to compute airfoil drag via the entropy generation rate in the flow field. From the momentum balance, entropy generation and its relationship to drag were derived for viscous flow. Model equations for calculating the local entropy generation in turbulent flows were obtained by extending the RANS procedure to the entropy balance equation. The accuracy of the algorithm and programs was assessed by simulating the pressure coefficient distribution and drag coefficient of different airfoils at different Reynolds numbers and angles of attack. The numerical data show that the total entropy generation rate in the flow field and the drag coefficient of the airfoil are related by a linear equation, which indicates that the total drag can be resolved into entropy generation based on its physical mechanism of energy loss.
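The linear relation between entropy generation and drag mentioned above is commonly written in far-field analyses as D = T_inf * S_gen / U_inf (the Oswatitsch form). Assuming that form of the relation, a trivial numeric illustration with made-up freestream values:

```python
# Far-field drag from total entropy generation:  D = T_inf * S_gen / U_inf
# (illustrative numbers, not values from the paper)
T_inf = 288.15   # K, freestream temperature
U_inf = 50.0     # m/s, freestream velocity
S_gen = 0.35     # W/K, volume-integrated entropy generation rate

drag = T_inf * S_gen / U_inf  # N
print(round(drag, 3))  # 2.017
```

In a CFD workflow, S_gen would come from integrating the local entropy generation rate (viscous plus turbulent dissipation terms) over the computational domain.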
miRge - A Multiplexed Method of Processing Small RNA-Seq Data to Determine MicroRNA Entropy.
Directory of Open Access Journals (Sweden)
Alexander S Baras
Full Text Available Small RNA-seq for microRNAs (miRNAs) is a rapidly developing field where opportunities still exist to create better bioinformatics tools to process these large datasets and generate new, useful analyses. We built miRge to be a fast, smart small RNA-seq solution that processes samples in a highly multiplexed fashion. miRge employs a Bayesian alignment approach, whereby reads are sequentially aligned against customized mature miRNA, hairpin miRNA, noncoding RNA and mRNA sequence libraries. miRNAs are summarized at the level of raw reads in addition to reads per million (RPM). Reads for all other RNA species (tRNA, rRNA, snoRNA, mRNA) are provided, which is useful for identifying potential contaminants and optimizing small RNA purification strategies. miRge was designed to optimally identify miRNA isomiRs and employs an entropy-based statistical measurement to identify differential production of isomiRs. This allowed us to identify decreasing entropy in isomiRs as stem cells mature into retinal pigment epithelial cells. Conversely, we show that pancreatic tumor miRNAs have entropy similar to that of matched normal pancreatic tissues. In a head-to-head comparison with other miRNA analysis tools (miRExpress 2.0, sRNAbench, omiRAs, miRDeep2, Chimira, UEA small RNA Workbench), miRge was faster (4- to 32-fold) and was among the top two methods in maximally aligning miRNA reads per sample. Moreover, miRge has no inherent limits to its multiplexing; it was capable of simultaneously analyzing 100 small RNA-seq samples in 52 minutes, providing an integrated analysis of miRNA expression across all samples. As miRge was designed for analysis of single as well as multiple samples, it is an ideal tool for high- and low-throughput users. miRge is freely available at http://atlas.pathology.jhu.edu/baras/miRge.html.
Imaging VLBI polarimetry data from Active Galactic Nuclei using the Maximum Entropy Method
Directory of Open Access Journals (Sweden)
Coughlan Colm P.
2013-12-01
Full Text Available Mapping the relativistic jets emanating from AGN requires the use of a deconvolution algorithm to account for the effects of missing baseline spacings. The CLEAN algorithm is the most commonly used algorithm in VLBI imaging today and is suitable for imaging polarisation data. The Maximum Entropy Method (MEM) is presented as an alternative with some advantages over the CLEAN algorithm, including better spatial resolution and a more rigorous and unbiased approach to deconvolution. We have developed a MEM code suitable for deconvolving VLBI polarisation data. Monte Carlo simulations investigating the performance of CLEAN and the MEM code on a variety of source types are being carried out. Real polarisation (VLBA) data taken at multiple wavelengths have also been deconvolved using MEM, and several of the resulting polarisation and Faraday rotation maps are presented and discussed.
The determination of nuclear charge distributions using a Bayesian maximum entropy method
International Nuclear Information System (INIS)
Macaulay, V.A.; Buck, B.
1995-01-01
We treat the inference of nuclear charge densities from measurements of elastic electron scattering cross sections. In order to get the most reliable information from expensively acquired, incomplete and noisy measurements, we use Bayesian probability theory. Very little prior information about the charge densities is assumed. We derive a prior probability distribution which is a generalization of a form used widely in image restoration based on the entropy of a physical density. From the posterior distribution of possible densities, we select the most probable one, and show how error bars can be evaluated. These have very reasonable properties, such as increasing without bound as hypotheses about finer scale structures are included in the hypothesis space. The methods are demonstrated by using data on the nuclei ⁴He and ¹²C. (orig.)
Analysis of positron lifetime spectra using quantified maximum entropy and a general linear filter
International Nuclear Information System (INIS)
Shukla, A.; Peter, M.; Hoffmann, L.
1993-01-01
Two new approaches are used to analyze positron annihilation lifetime spectra. A general linear filter is designed to filter the noise from lifetime data. The quantified maximum entropy method is used to solve the inverse problem of finding the lifetimes and intensities present in data. We determine optimal values of parameters needed for fitting using Bayesian methods. Estimates of errors are provided. We present results on simulated and experimental data with extensive tests to show the utility of this method and compare it with other existing methods. (orig.)
Principle of maximum entropy for reliability analysis in the design of machine components
Zhang, Yimin
2018-03-01
We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
Entropy Analysis of Solar Two-Step Thermochemical Cycles for Water and Carbon Dioxide Splitting
Directory of Open Access Journals (Sweden)
Matthias Lange
2016-01-01
Full Text Available The present study provides a thermodynamic analysis of solar thermochemical cycles for the splitting of H2O or CO2. Such cycles, powered by concentrated solar energy, have the potential to produce fuels in a sustainable way. We extend a previous study on the thermodynamics of water splitting by also taking into account CO2 splitting and the influence of the solar absorption efficiency. Based on this purely thermodynamic approach, efficiency trends are discussed. The comprehensive and vivid representation in T-S diagrams provides researchers in this field with the theoretical background required to improve process development. Furthermore, results on the entropy change required in the redox materials used can serve as a guideline for material developers. The results show that CO2 splitting is advantageous at higher temperature levels, while water splitting is more feasible at lower temperature levels, as it benefits from a large entropy change during the splitting step.
Zhang, Li; Wu, Kexin; Liu, Yang
2017-12-01
A multi-objective performance optimization method is proposed to solve the problem that a single structural parameter of a small fan cannot balance the optimization of static characteristics against aerodynamic noise. In this method, three structural parameters are selected as the optimization variables, and the static pressure efficiency and the aerodynamic noise of the fan are regarded as the multi-objective performance. The response surface method and the entropy method are used to establish the optimization function relating the optimization variables to the multi-objective performance, and the optimized model is found where the optimization function reaches its maximum value. Experimental data show that the optimized model not only enhances the static characteristics of the fan but also markedly reduces the noise. The results of this study provide a reference for the multi-objective performance optimization of other types of rotating machinery.
Directory of Open Access Journals (Sweden)
Ammar Ben Brahim
2011-05-01
Full Text Available Thermosolutal convection in a square cavity filled with air and subjected to an inclined magnetic field is investigated numerically. The cavity is heated and cooled along the active walls with a mass gradient, whereas the two other walls are adiabatic and insulated. Entropy generation due to heat and mass transfer, fluid friction and the magnetic effect has been determined in the transient state for laminar flow by numerically solving the continuity, momentum, energy and mass balance equations using a Control Volume Finite Element Method. The structure of the studied flows depends on four dimensionless parameters: the Grashof number, the buoyancy ratio, the Hartmann number and the inclination angle. The results show that the magnetic field has a retarding effect on the flow in the cavity, leading to a decrease in entropy generation; temperature and concentration decrease with increasing values of the magnetic field parameter.
Directory of Open Access Journals (Sweden)
Rong Jiang
2014-09-01
Full Text Available As the early design decision-making structure, a software architecture plays a key role in the quality of the final software product and the whole project. In the software design and development process, an effective evaluation of the trustworthiness of a software architecture can help in making scientific and reasonable decisions on the architecture, which are necessary for the construction of highly trustworthy software. Given the lack of trustworthiness evaluation and measurement studies for software architecture, this paper provides a trustworthy attribute model of software architecture. Based on this model, the paper proposes using the Principle of Maximum Entropy (POME) and the Grey Decision-making Method (GDMM) to evaluate the trustworthiness of a software architecture, proves the scientific soundness and rationality of this method, and verifies its feasibility through case analysis.
RNA Thermodynamic Structural Entropy.
Garcia-Martin, Juan Antonio; Clote, Peter
2015-01-01
Conformational entropy for atomic-level, three dimensional biomolecules is known experimentally to play an important role in protein-ligand discrimination, yet reliable computation of entropy remains a difficult problem. Here we describe the first two accurate and efficient algorithms to compute the conformational entropy for RNA secondary structures, with respect to the Turner energy model, where free energy parameters are determined from UV absorption experiments. An algorithm to compute the derivational entropy for RNA secondary structures had previously been introduced, using stochastic context free grammars (SCFGs). However, the numerical value of derivational entropy depends heavily on the chosen context free grammar and on the training set used to estimate rule probabilities. Using data from the Rfam database, we determine that both of our thermodynamic methods, which agree in numerical value, are substantially faster than the SCFG method. Thermodynamic structural entropy is much smaller than derivational entropy, and the correlation between length-normalized thermodynamic entropy and derivational entropy is moderately weak to poor. In applications, we plot the structural entropy as a function of temperature for known thermoswitches, such as the repression of heat shock gene expression (ROSE) element, we determine that the correlation between hammerhead ribozyme cleavage activity and total free energy is improved by including an additional free energy term arising from conformational entropy, and we plot the structural entropy of windows of the HIV-1 genome. Our software RNAentropy can compute structural entropy for any user-specified temperature, and supports both the Turner'99 and Turner'04 energy parameters. It follows that RNAentropy is state-of-the-art software to compute RNA secondary structure conformational entropy. Source code is available at https://github.com/clotelab/RNAentropy/; a full web server is available at http
Diks, C.; Fang, H.
2017-01-01
The information-theoretical concept of transfer entropy is an ideal measure for detecting conditional independence, or Granger causality, in a time series setting. The recent literature indeed witnesses an increased interest in applications of entropy-based tests in this direction. However, those tests
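A minimal plug-in estimator of the quantity such tests are built on, discrete transfer entropy with history length 1, can be sketched as:

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy TE(X -> Y), history length 1:
    TE = sum over (y_{t+1}, y_t, x_t) of
         p(y_{t+1}, y_t, x_t) * log[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]
    """
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_hist = Counter(zip(y[:-1], x[:-1]))   # (y_t, x_t)
    pairs_self = Counter(zip(y[1:], y[:-1]))    # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                   # y_t
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_hist[(y0, x0)]
        p_self = pairs_self[(y1, y0)] / singles[y0]
        te += p_joint * math.log(p_full / p_self)
    return te

# y copies x with a one-step lag, so x strongly "Granger-causes" y
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0]
y = [0] + x[:-1]
print(transfer_entropy(x, y) > 0)  # True
```

Tests of the kind the abstract refers to then assess whether an estimate like this is significantly larger than its value under the null of no causal influence, typically via resampling.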
Directory of Open Access Journals (Sweden)
Cheng-Wei Fei
2014-01-01
Full Text Available To improve the diagnosis of rotor vibration faults in stochastic processes, an effective fault diagnosis method combining Process Power Spectrum Entropy (PPSE) and Support Vector Machine (SVM), PPSE-SVM for short, was proposed. The PPSE-SVM fault diagnosis model was established by fusing the PPSE method and SVM theory. Based on a simulation experiment of rotor vibration faults, process data for four typical vibration faults (rotor imbalance, shaft misalignment, rotor-stator rubbing, and pedestal looseness) were collected at multiple points (multiple channels) and multiple speeds. Using the PPSE method, the PPSE values of these data were extracted as fault feature vectors to establish the SVM model for rotor vibration fault diagnosis. The results demonstrate that the proposed method possesses high precision, good learning ability, good generalization ability, and strong fault tolerance (robustness) in four respects: distinguishing fault types, fault severity, fault location, and noise immunity of rotor stochastic vibration. This paper presents a novel method (PPSE-SVM) for rotor vibration fault diagnosis and real-time vibration monitoring. The presented effort is promising to improve the fault diagnosis precision of rotating machinery such as gas turbines.
An Entropy-based gene selection method for cancer classification using microarray data
Directory of Open Access Journals (Sweden)
Krishnan Arun
2005-03-01
Full Text Available Abstract Background Accurate diagnosis of cancer subtypes remains a challenging problem. Building classifiers based on gene expression data is a promising approach, yet the selection of non-redundant but relevant genes is difficult. The selected gene set should be small enough to allow diagnosis even in regular clinical laboratories and should ideally identify genes involved in cancer-specific regulatory pathways. Here an entropy-based method is proposed that selects genes related to the different cancer classes while at the same time reducing the redundancy among the genes. Results The present study identifies a subset of features by maximizing the relevance and minimizing the redundancy of the selected genes. A measure called normalized mutual information is employed to quantify both the relevance and the redundancy of the genes. In order to find a more representative subset of features, an iterative procedure is adopted that incorporates an initial clustering followed by data partitioning and the application of the algorithm to each of the partitions. A leave-one-out approach then selects the genes most commonly chosen across all the different runs, and the gene selection algorithm is applied again to pare down the list of selected genes until a minimal subset is obtained that gives a satisfactory accuracy of classification. The algorithm was applied to three different data sets and the results were compared to work done by others using the same data sets. Conclusion This study presents an entropy-based iterative algorithm for selecting genes from microarray data that are able to classify various cancer subtypes with high accuracy. In addition, the feature set obtained is very compact; that is, the redundancy between genes is reduced to a large extent. This implies that classifiers can be built with a smaller subset of genes.
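The relevance score described above can be sketched with a plug-in estimate of normalized mutual information on discretized expression values. The normalization by max(H_gene, H_class) used here is one common convention and may differ from the paper's exact definition:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (nats) of a discrete label sequence."""
    n = len(labels)
    return -sum(c / n * math.log(c / n) for c in Counter(labels).values())

def normalized_mutual_information(gene, classes):
    """NMI between a discretized gene expression vector and class labels."""
    h_g, h_c = entropy(gene), entropy(classes)
    h_joint = entropy(list(zip(gene, classes)))
    mi = h_g + h_c - h_joint
    denom = max(h_g, h_c)
    return mi / denom if denom > 0 else 0.0

# Gene 1 perfectly tracks the class labels; gene 2 is unrelated to them
classes = [0, 0, 0, 0, 1, 1, 1, 1]
gene1   = [1, 1, 1, 1, 0, 0, 0, 0]
gene2   = [0, 1, 0, 1, 0, 1, 0, 1]
print(normalized_mutual_information(gene1, classes))  # 1.0
print(normalized_mutual_information(gene2, classes))  # 0.0
```

In the selection loop, the same NMI score between two candidate genes serves as the redundancy penalty, so a gene highly informative about the classes but nearly duplicating an already-selected gene is skipped.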
Coarse-graining using the relative entropy and simplex-based optimization methods in VOTCA
Rühle, Victor; Jochum, Mara; Koschke, Konstantin; Aluru, N. R.; Kremer, Kurt; Mashayak, S. Y.; Junghans, Christoph
2014-03-01
Coarse-grained (CG) simulations are an important tool to investigate systems on larger time and length scales. Several methods for systematic coarse-graining were developed, varying in complexity and the property of interest. Thus, the question arises which method best suits a specific class of system and desired application. The Versatile Object-oriented Toolkit for Coarse-graining Applications (VOTCA) provides a uniform platform for coarse-graining methods and allows for their direct comparison. We present recent advances of VOTCA, namely the implementation of the relative entropy method and downhill simplex optimization for coarse-graining. The methods are illustrated by coarse-graining SPC/E bulk water and a water-methanol mixture. Both CG models reproduce the pair distributions accurately. SYM is supported by AFOSR under grant 11157642 and by NSF under grant 1264282. CJ was supported in part by the NSF PHY11-25915 at KITP. K. Koschke acknowledges funding by the Nestle Research Center.
International Nuclear Information System (INIS)
Zhou Yunlong; Chen Fei; Sun Bin
2008-01-01
Based on the property that wavelet packet transform images can be decomposed at different scales, a flow regime identification method using image wavelet packet information entropy features and a genetic neural network was proposed. Gas-liquid two-phase flow images were captured by a digital high-speed video system in a horizontal pipe. Information entropy features were extracted from the transform coefficients using image processing techniques and multi-resolution analysis. The genetic neural network was trained using these eigenvectors, reduced by principal component analysis, as flow regime samples, realizing intelligent flow regime identification. The test results showed that the image wavelet packet information entropy feature clearly reflects the differences between seven typical flow regimes, and that the genetic neural network, combining the merits of the genetic algorithm and the BP algorithm, converges quickly in simulation and avoids local minima. The recognition rate of the network reached approximately 100%, providing a new and effective method for on-line flow regime identification. (authors)
Directory of Open Access Journals (Sweden)
José Ernesto Nájera-Carpio
2015-07-01
Full Text Available In this work, the irreversible processes in the light heating of silicon (Si) and germanium (Ge) thin films are examined. Each film is exposed to light irradiation under radiative and convective boundary conditions. Heat, electron and hole transport and generation-recombination of electron-hole pairs are studied in terms of a phenomenological model obtained from basic principles of irreversible thermodynamics. We present an analysis of the contributions to the entropy production in the stationary state due to the dissipative effects associated with electron and hole transport, generation-recombination of electron-hole pairs, and heat transport. The most significant contribution to the entropy production comes from the interaction of light with the medium in both Si and Ge. This interaction comprises two processes: the generation of electron-hole pairs and the transfer of energy from the absorbed light to the lattice. In Si, the next-largest contribution comes from heat transport, whereas in Ge all the remaining contributions to entropy production have nearly the same order of magnitude. The results are compared and explained by addressing the differences in the magnitudes of the thermodynamic forces, Onsager's coefficients and transport properties of Si and Ge.
Amigó, José M; Hirata, Yoshito; Aihara, Kazuyuki
2017-08-01
In a previous paper, the authors studied the limits of probabilistic prediction in nonlinear time series analysis in a perfect model scenario, i.e., in the ideal case that the uncertainty of an otherwise deterministic model is due to only the finite precision of the observations. The model consisted of the symbolic dynamics of a measure-preserving transformation with respect to a finite partition of the state space, and the quality of the predictions was measured by the so-called ignorance score, which is a conditional entropy. In practice, though, partitions are dispensed with by considering numerical and experimental data to be continuous, which prompts us to trade off in this paper the Shannon entropy for the differential entropy. Despite technical differences, we show that the core of the previous results also hold in this extended scenario for sufficiently high precision. The corresponding imperfect model scenario will be revisited too because it is relevant for the applications. The theoretical part and its application to probabilistic forecasting are illustrated with numerical simulations and a new prediction algorithm.
Liang, Xuedong; Si, Dongyang; Zhang, Xinli
2017-10-13
From the perspective of scientific development, sustainable development needs to consider regional development, economic and social development, and the harmonious development of society and nature, but regional sustainable development is often difficult to quantify. Through an analysis of the structure and functions of a regional system, this paper establishes an evaluation index system, comprising an economic subsystem, an ecological-environmental subsystem and a social subsystem, to study regional sustainable development capacity. A sustainable development capacity measurement model for Sichuan Province was established by applying the information entropy calculation principle and the Brusselator principle. Each subsystem and the entropy change in each calendar year in Sichuan Province were analyzed to evaluate the province's sustainable development capacity. The established model was found to effectively reflect actual changes in sustainable development levels through the entropy-change response of the system, and to clearly demonstrate how the forty-six indicators from the three subsystems affect regional sustainable development, helping to fill a gap in sustainable development research.
Optimization of rainfall networks using information entropy and temporal variability analysis
Wang, Wenqi; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin
2018-04-01
Rainfall networks are the most direct sources of precipitation data, and their optimization and evaluation are essential. Information entropy can not only represent the uncertainty of rainfall distribution but can also reflect the correlation and information transmission between rainfall stations. Using entropy, this study optimizes rainfall networks of similar size located in two big cities in China, Shanghai (in the Yangtze River basin) and Xi'an (in the Yellow River basin), with respect to temporal variability analysis. Through an easy-to-implement greedy ranking algorithm based on the criterion called Maximum Information Minimum Redundancy (MIMR), stations of the networks in the two areas (each further divided into two subareas) are ranked over sliding inter-annual series and under different meteorological conditions. It is found that observation series with different starting days affect the ranking, pointing to temporal variability during network evaluation. We propose a dynamic network evaluation framework that accounts for temporal variability by ranking stations under different starting days with a fixed time window (1-year, 2-year, and 5-year). This allows us to identify rainfall stations that are temporarily important or redundant and to provide useful suggestions for decision makers. The proposed framework can serve as a supplement to the primary MIMR optimization approach. In addition, during different periods (wet season or dry season) the optimal network from MIMR exhibits differences in entropy values, and the optimal network from the wet season tends to produce higher entropy values. Differences in the spatial distribution of the optimal networks suggest that optimizing the rainfall network for changing meteorological conditions is advisable.
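A greedy MIMR-style ranking can be sketched as follows (a simplified illustration, not the authors' exact objective function; the discretization, the trade-off weight `lam`, and all names are our assumptions): start from the highest-entropy station, then repeatedly add the station that maximizes the added joint information minus a redundancy penalty.

```python
import numpy as np

def discretize(x, bins=8):
    # quantize a series into equal-width bins for entropy estimation
    edges = np.histogram_bin_edges(x, bins=bins)
    return np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def joint_entropy(cols):
    # entropy of the tuple of symbols across several stations
    stacked = np.stack(cols, axis=1)
    _, counts = np.unique(stacked, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_info(a, b):
    return entropy(a) + entropy(b) - joint_entropy([a, b])

def greedy_mimr_rank(series, lam=0.8, bins=8):
    """Rank stations greedily: information gain minus lam * redundancy."""
    symb = [discretize(s, bins) for s in series]
    remaining = list(range(len(symb)))
    order = [max(remaining, key=lambda i: entropy(symb[i]))]
    remaining.remove(order[0])
    while remaining:
        def score(i):
            gain = (joint_entropy([symb[j] for j in order] + [symb[i]])
                    - joint_entropy([symb[j] for j in order]))
            red = np.mean([mutual_info(symb[i], symb[j]) for j in order])
            return gain - lam * red
        best = max(remaining, key=score)
        order.append(best)
        remaining.remove(best)
    return order

# Synthetic check: station 1 nearly duplicates station 0, station 2 is
# independent; the near-duplicate should be ranked as redundant (last).
rng = np.random.default_rng(1)
s0 = rng.normal(size=5000)
s1 = s0 + rng.normal(scale=0.05, size=5000)
s2 = rng.normal(size=5000)
print(greedy_mimr_rank([s0, s1, s2]))
```

Re-running the ranking over sliding windows of the observation series is then what exposes the temporal variability the abstract describes.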
Entropy Generation Due to Natural Convection in a Partially Heated Cavity by Local RBF-DQ Method
DEFF Research Database (Denmark)
Soleimani, S.; Qajarjazi, A.; Bararnia, H.
2011-01-01
The Local Radial Basis Function-Differential Quadrature (RBF-DQ) method is applied to the two-dimensional incompressible Navier-Stokes equations in primitive form. Numerical results of heatlines and entropy generation due to heat transfer and fluid friction have been obtained for laminar natural...
International Nuclear Information System (INIS)
Kubo, S.; Narihara, K.; Tomita, Y.; Hasegawa, M.; Tsuzuki, T.; Mohri, A.
1988-01-01
A multichannel HCN laser interferometer system has been developed to investigate the plasma electron confinement properties in the SPAC VII device. The maximum entropy method is applied to reconstruct the electron density profile from measured line-integrated data. The particle diffusion coefficient in the peripheral region of the REB ring core spherator was obtained from the evolution of the density profile. (author)
International Nuclear Information System (INIS)
Ko, T.H.
2006-01-01
In the present paper, the entropy generation and optimal Reynolds number for developing forced convection in a double sine duct with various wall heat fluxes, a configuration which frequently occurs in plate heat exchangers, are studied based on the entropy generation minimization principle, by analytical thermodynamic analysis as well as numerical investigation. According to the thermodynamic analysis, a very simple expression for the optimal Reynolds number for the double sine duct as a function of mass flow rate, wall heat flux, working fluid and geometric dimensions is proposed. In the numerical simulations, the investigated Reynolds number (Re) covers the range from 86 to 2000 and the wall heat flux (q'') takes the values 160, 320 and 640 W/m². From the numerical simulation of the developing laminar forced convection in the double sine duct, the effect of Reynolds number on entropy generation in the duct has been examined, through which the optimal Reynolds number with minimal entropy generation is identified. The optimal Reynolds number obtained from the analytical thermodynamic analysis is compared with the one from the numerical solutions and is verified to yield a magnitude of entropy generation similar to the minimal entropy generation predicted by the numerical simulations. The optimization analysis provided in the present paper gives valuable information for heat exchanger design, since the thermal system attains the least irreversibility and best exergy utilization when the optimal Re is used according to practical design conditions.
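The trade-off behind the optimal Reynolds number can be sketched numerically (an illustration only; the coefficients A and B are hypothetical lumped constants, not values from the paper): the heat-transfer irreversibility falls with Re while the fluid-friction irreversibility rises, so their sum has a minimum.

```python
import numpy as np

A, B = 4.0e3, 1.0e-6           # hypothetical lumped coefficients

def s_gen(re):
    # thermal irreversibility term + frictional irreversibility term
    return A / re + B * re**2

# Analytic optimum of A/Re + B*Re^2: d/dRe = -A/Re^2 + 2*B*Re = 0
re_opt = (A / (2.0 * B)) ** (1.0 / 3.0)

# Cross-check against a coarse numerical scan of the Re range
grid = np.linspace(100.0, 20000.0, 200000)
re_num = grid[np.argmin(s_gen(grid))]
print(re_opt, re_num)
```

The closed-form optimum mirrors the paper's claim that a simple expression for the optimal Re follows directly from the thermodynamic analysis, with the lumped coefficients standing in for the mass flow rate, wall heat flux, fluid properties and duct geometry.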
Directory of Open Access Journals (Sweden)
Yi-Ming Kuo
2011-06-01
Full Text Available Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August, 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005–2007.
Yu, Hwa-Lung; Wang, Chih-Hsih; Liu, Ming-Che; Kuo, Yi-Ming
2011-06-01
Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August, 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005-2007.
Directory of Open Access Journals (Sweden)
Umberto Lucia
2015-03-01
Full Text Available The interest in designing nanosystems is continuously growing. Engineers apply a great number of optimization methods to design macroscopic systems. If these methods could be introduced into the design of small systems, a great improvement in nanotechnologies could be achieved. To do so, however, it is necessary to extend classical thermodynamic analysis to small systems, where irreversibility is also present, as the Loschmidt paradox highlighted. Here, the use of the recent extension of the Gouy-Stodola theorem to complex systems (the GSGL approach), based on entropy generation, is suggested to extend classical thermodynamics to nanothermodynamics. The result is a new approach to nanosystems which avoids the difficulties highlighted in the usual analysis of small systems, such as the definition of temperature for nanosystems.
International Nuclear Information System (INIS)
Clariá, F; Vallverdú, M; Caminal, P; Baranowski, R; Chojnowska, L
2008-01-01
In hypertrophic cardiomyopathy (HCM) patients there is an increased risk of premature death, which can occur with little or no warning. Furthermore, classification for sudden cardiac death in patients with HCM is very difficult. The aim of our study was to improve the prognostic value of heart rate variability (HRV) in HCM patients, giving insight into changes of the autonomic nervous system. To this end, the suitability of linear and nonlinear measures for assessing HRV was studied. These measures were based on time–frequency representation (TFR) and on Shannon and Rényi entropies, and were compared with traditional HRV measures. Holter recordings of 64 patients with HCM and 55 healthy subjects were analyzed. The HCM patients consisted of two groups: 13 high-risk patients, after aborted sudden cardiac death (SCD), and 51 low-risk patients, without SCD. Five-hour RR signals, corresponding to the sleep period of the subjects, were considered for the analysis as a comparable standard situation. These RR signals were filtered in three frequency bands: the very low frequency band (VLF, 0–0.04 Hz), low frequency band (LF, 0.04–0.15 Hz) and high frequency band (HF, 0.15–0.45 Hz). TFR variables based on instantaneous frequency and energy functions were able to classify HCM patients and healthy subjects (control group). Results revealed that measures obtained from TFR analysis of the HRV classified the groups of subjects better than traditional HRV parameters, and that nonlinear measures further improved the classification. Entropies calculated in the HF band showed the highest statistically significant levels comparing the HCM group and the control group, p-value < 0.0005. Entropy measures calculated in the HCM group presented lower values than those calculated from the control group, indicating a decrease of complexity. Moreover, similar behavior was observed comparing high and low risk of premature death, the values of
Fuzzy Shannon Entropy: A Hybrid GIS-Based Landslide Susceptibility Mapping Method
Directory of Open Access Journals (Sweden)
Majid Shadman Roodposhti
2016-09-01
Full Text Available Assessing Landslide Susceptibility Mapping (LSM) contributes to reducing the risk of living with landslides. Handling the vagueness associated with LSM is a challenging task. Here we show the application of a hybrid GIS-based LSM. The hybrid approach embraces fuzzy membership functions (FMFs) in combination with Shannon entropy, a well-known information theory-based method. Nine landslide-related criteria, along with an inventory of landslides containing 108 recent and historic landslide points, are used to prepare a susceptibility map. A random split into training (≈70%) and testing (≈30%) samples is used for training and validation of the LSM model. The study area—Izeh—is located in the Khuzestan province of Iran, a highly susceptible landslide zone. The performance of the hybrid method is evaluated using receiver operating characteristic (ROC) curves in combination with the area under the curve (AUC). With an AUC of 0.934, the proposed hybrid method outperforms a previous study on the same dataset that used extended fuzzy multi-criteria evaluation built on decision makers' subjective evaluations of the same study area and achieved an AUC of 0.894.
Entropy analysis of convective MHD flow of third grade non-Newtonian fluid over a stretching sheet
Directory of Open Access Journals (Sweden)
M.M. Rashidi
2017-03-01
Full Text Available The purpose of this article is to study and analyze the convective flow of a third grade non-Newtonian fluid due to a linearly stretching sheet subject to a magnetic field. The dimensionless entropy generation equation is obtained by solving the reduced momentum and energy equations. The momentum and energy equations are reduced to a system of ordinary differential equations by a similarity method. The optimal homotopy analysis method (OHAM) is used to solve the resulting system of ordinary differential equations. The effects of the magnetic field, Biot number and Prandtl number on the velocity component and temperature are studied. The results show that the thermal boundary-layer thickness decreases with increasing Prandtl number. In addition, Brownian motion plays an important role in improving the thermal conductivity of the fluid. The main purpose of the paper is to study the effects of the Reynolds number, dimensionless temperature difference, Brinkman number, Hartmann number and other physical parameters on the entropy generation. These results are analyzed and discussed.
Adjoint entropy vs topological entropy
Giordano Bruno, Anna
2012-01-01
Recently the adjoint algebraic entropy of endomorphisms of abelian groups was introduced and studied. We generalize the notion of adjoint entropy to continuous endomorphisms of topological abelian groups. Indeed, the adjoint algebraic entropy is defined using the family of all finite-index subgroups, while we take only the subfamily of all open finite-index subgroups to define the topological adjoint entropy. This allows us to compare the (topological) adjoint entropy with the known topologic...
Directory of Open Access Journals (Sweden)
Jasleen Kaur
2013-01-01
Full Text Available Background: The induction dose of propofol is reduced with concomitant use of opioids as a result of a possible synergistic action. Aim and Objectives: The present study compared the effect of fentanyl and two doses of butorphanol pre-treatment on the induction dose of propofol, with specific emphasis on entropy. Methods: Three groups of 40 patients each, of American Society of Anaesthesiologists physical status I and II, were randomized to receive fentanyl 2 μg/kg (Group F), butorphanol 20 μg/kg (Group B20) or butorphanol 40 μg/kg (Group B40) as pre-treatment. Five minutes later, the degree of sedation was assessed by the observer's assessment of alertness scale (OAA/S). Induction of anesthesia was done with propofol (30 mg/10 s) until the loss of response to verbal commands. Thereafter, rocuronium 1 mg/kg was administered and endotracheal intubation was performed 2 min later. OAA/S, propofol induction dose, heart rate, blood pressure, oxygen saturation and entropy (response and state) were compared in the three groups. Statistical Analysis: Data were analyzed using the ANOVA test with post hoc significance, Kruskal-Wallis test, Chi-square test and Fisher exact test. P<0.05 was considered significant. Results: The induction dose of propofol (mg/kg) was observed to be 1.1±0.50 in Group F, 1.05±0.35 in Group B20 and 1.18±0.41 in Group B40. Induction with propofol occurred at higher entropy values on pre-treatment with both fentanyl and butorphanol. Hemodynamic variables were comparable in all three groups. Conclusion: Butorphanol 20 μg/kg and 40 μg/kg reduce the induction requirement of propofol, comparably to fentanyl 2 μg/kg, and confer hemodynamic stability at induction and intubation.
An entropy-based improved k-top scoring pairs (TSP) method for ...
African Journals Online (AJOL)
DR. NJ TONUKARI
2012-06-05
Key words: Cancer classification, gene expression, k-TSP, information entropy, gene selection.
An Entropy-Based Propagation Speed Estimation Method for Near-Field Subsurface Radar Imaging
Directory of Open Access Journals (Sweden)
Pistorius Stephen
2010-01-01
Full Text Available During the last forty years, Subsurface Radar (SR) has been used in an increasing number of noninvasive/nondestructive imaging applications, ranging from landmine detection to breast imaging. To properly assess the dimensions and locations of the targets within the scan area, SR data sets have to be reconstructed. This process usually requires knowledge of the propagation speed in the medium, which is usually obtained by performing an offline measurement on a representative sample of the materials that form the scan region. Nevertheless, in some novel near-field SR scenarios, such as Microwave Wood Inspection (MWI) and Breast Microwave Radar (BMR), the extraction of a representative sample is not an option due to the noninvasive requirements of the application. A novel technique to determine the propagation speed of the medium based on the use of an information theory metric is proposed in this paper. The proposed method uses the Shannon entropy of the reconstructed images as the focal quality metric to generate an estimate of the propagation speed in a given scan region. The performance of the proposed algorithm was assessed using data sets collected from experimental setups that mimic the dielectric contrast found in BMR and MWI scenarios. The proposed method yielded accurate results and exhibited an execution time in the order of seconds.
An Entropy-Based Propagation Speed Estimation Method for Near-Field Subsurface Radar Imaging
Flores-Tapia, Daniel; Pistorius, Stephen
2010-12-01
During the last forty years, Subsurface Radar (SR) has been used in an increasing number of noninvasive/nondestructive imaging applications, ranging from landmine detection to breast imaging. To properly assess the dimensions and locations of the targets within the scan area, SR data sets have to be reconstructed. This process usually requires knowledge of the propagation speed in the medium, which is usually obtained by performing an offline measurement on a representative sample of the materials that form the scan region. Nevertheless, in some novel near-field SR scenarios, such as Microwave Wood Inspection (MWI) and Breast Microwave Radar (BMR), the extraction of a representative sample is not an option due to the noninvasive requirements of the application. A novel technique to determine the propagation speed of the medium based on the use of an information theory metric is proposed in this paper. The proposed method uses the Shannon entropy of the reconstructed images as the focal quality metric to generate an estimate of the propagation speed in a given scan region. The performance of the proposed algorithm was assessed using data sets collected from experimental setups that mimic the dielectric contrast found in BMR and MWI scenarios. The proposed method yielded accurate results and exhibited an execution time in the order of seconds.
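The idea of using image entropy as a focal quality metric can be sketched with a toy forward model (an illustration only, not the paper's reconstruction algorithm; the point-target model, the speed values and all names are our assumptions): a mismatch between the candidate and true propagation speed defocuses a point target, spreading its energy and raising the entropy of the image histogram.

```python
import numpy as np

def image_entropy(img, bins=64):
    """Shannon entropy of the image intensity histogram: a sharply
    focused image concentrates energy in few bins and scores lower."""
    p, _ = np.histogram(np.abs(img), bins=bins)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

# Toy forward model: a point target imaged with a candidate speed;
# speed mismatch blurs the reconstruction into a wider Gaussian.
x = np.linspace(-1.0, 1.0, 2001)
true_speed = 0.12            # hypothetical medium propagation speed

def reconstruct(candidate_speed):
    blur = 0.01 + 2.0 * abs(candidate_speed - true_speed)
    return np.exp(-x**2 / (2.0 * blur**2))

# Scan candidate speeds and keep the one with minimal image entropy
candidates = np.linspace(0.05, 0.20, 151)
estimate = min(candidates, key=lambda v: image_entropy(reconstruct(v)))
print(estimate)
```

Because each candidate speed only requires one reconstruction and one histogram, a scan of this kind is consistent with the execution times in the order of seconds reported in the abstract.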
Liu, Yong; Shu, Chi-Wang; Zhang, Mengping
2018-02-01
We present a discontinuous Galerkin (DG) scheme with suitable quadrature rules [15] for the ideal compressible magnetohydrodynamic (MHD) equations on structured meshes. The semi-discrete scheme is shown to be entropy stable by using the symmetrizable version of the equations as introduced by Godunov [32], the entropy stable DG framework with suitable quadrature rules [15], the entropy conservative flux of [14] inside each cell, and an entropy dissipative approximate Godunov-type numerical flux at cell interfaces. The main difficulty in the generalization of the results in [15] is the appearance of the non-conservative "source terms" added in the modified MHD model introduced by Godunov [32], which do not exist in the general hyperbolic system studied in [15]. Special care must be taken to discretize these "source terms" adequately so that the resulting DG scheme satisfies entropy stability. Total variation diminishing / bounded (TVD/TVB) limiters and bound-preserving limiters are applied to control spurious oscillations. We demonstrate the accuracy and robustness of this new scheme on standard MHD examples.
Maximum-entropy data restoration using both real- and Fourier-space analysis
International Nuclear Information System (INIS)
Anderson, D.M.; Martin, D.C.; Thomas, E.L.
1989-01-01
An extension of the maximum-entropy (ME) data-restoration method is presented that is sensitive to periodic correlations in data. The method takes advantage of the higher signal-to-noise ratio for periodic information in Fourier space, thus enhancing statistically significant frequencies in a manner which avoids the user bias inherent in conventional Fourier filtering. This procedure incorporates concepts underlying new approaches in quantum mechanics that consider entropies in both position and momentum spaces, although the emphasis here is on data restoration rather than quantum physics. After a fast Fourier transform of the image, the phases are saved and the array of Fourier moduli is restored using the maximum-entropy criterion. A first-order continuation method is introduced that speeds convergence of the ME computation. The restored moduli together with the original phases are then Fourier inverted to yield a new image; traditional real-space ME restoration is applied to this new image, completing one stage in the restoration process. In test cases improvement can be obtained from two to four stages of iteration. It is shown that in traditional Fourier filtering spurious features can be induced by selection or elimination of Fourier components without regard to their statistical significance. With the present approach there is no such freedom for the user to exert personal bias, so that features present in the final image and power spectrum are those which have survived the tests of statistical significance in both real and Fourier space. However, it is still possible for periodicities to 'bleed' across sharp boundaries. An 'uncertainty' relation is derived describing the inverse relationship between the resolution of these boundaries and the level of noise that can be eliminated. (orig./BHO)
Electroencephalogram–Electromyography Coupling Analysis in Stroke Based on Symbolic Transfer Entropy
Directory of Open Access Journals (Sweden)
Yunyuan Gao
2018-01-01
Full Text Available The coupling strength between electroencephalogram (EEG) and electromyography (EMG) signals during motion control reflects the interaction between the cerebral motor cortex and muscles. Therefore, neuromuscular coupling characterization is instructive in assessing motor function. In this study, to overcome the limitation of losing the characteristics of signals in conventional time series symbolization methods, a variable scale symbolic transfer entropy (VS-STE) analysis approach was proposed for corticomuscular coupling evaluation. Post-stroke patients (n = 5) and healthy volunteers (n = 7) were recruited and participated in various tasks (left and right hand gripping, elbow bending). The proposed VS-STE was employed to evaluate the corticomuscular coupling strength between the EEG signal measured from the motor cortex and the EMG signal measured from the upper limb, in both the time domain and the frequency domain. Results showed a greater strength of the bi-directional (EEG-to-EMG and EMG-to-EEG) VS-STE in post-stroke patients compared to healthy controls. In addition, the strongest EEG–EMG coupling strength was observed in the beta frequency band (15–35 Hz) during upper limb movement. The coupling strength of EMG-to-EEG in the affected side of the patients was larger than that of EEG-to-EMG. In conclusion, the results suggest that the corticomuscular coupling is bi-directional, and the proposed VS-STE can be used to quantitatively characterize the non-linear synchronization characteristics and information interaction between the primary motor cortex and muscles.
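Symbolic transfer entropy in its basic form can be sketched as follows (a minimal illustration with fixed-scale ordinal-pattern symbolization; the paper's VS-STE uses a variable-scale scheme instead, and all names here are ours): symbolize each series, then estimate how much the source's past reduces uncertainty about the destination's next symbol beyond the destination's own past.

```python
import numpy as np
from collections import Counter

def symbolize(x, m=3):
    """Ordinal-pattern symbolization: map each window of m samples to
    the permutation that sorts it (one common symbolization choice)."""
    patterns = [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]
    codes = {p: k for k, p in enumerate(sorted(set(patterns)))}
    return np.array([codes[p] for p in patterns])

def transfer_entropy(src, dst):
    """TE(src -> dst) = H(dst_next | dst) - H(dst_next | dst, src),
    estimated by plug-in counts over the symbol sequences (bits)."""
    trip = Counter(zip(dst[1:], dst[:-1], src[:-1]))
    pair_ds = Counter(zip(dst[:-1], src[:-1]))
    pair_dd = Counter(zip(dst[1:], dst[:-1]))
    single = Counter(dst[:-1])
    n = len(dst) - 1
    te = 0.0
    for (d1, d0, s0), c in trip.items():
        te += (c / n) * np.log2(
            (c * single[d0]) / (pair_dd[(d1, d0)] * pair_ds[(d0, s0)]))
    return te

# Synthetic check: y is driven by the past of x, so the x -> y transfer
# entropy should exceed the y -> x transfer entropy.
rng = np.random.default_rng(2)
x = rng.normal(size=4000)
y = np.roll(x, 1) + 0.3 * rng.normal(size=4000)
sx, sy = symbolize(x), symbolize(y)
print(transfer_entropy(sx, sy), transfer_entropy(sy, sx))
```

The asymmetry between the two directions is exactly what the abstract exploits when comparing EEG-to-EMG against EMG-to-EEG coupling.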
Comprehensive benefits analysis of steel structure modular residence based on the entropy evaluation
Zhang, Xiaoxiao; Wang, Li; Jiang, Pengming
2017-04-01
Steel structure modular residences are a prominent form of residential industrialization, with many advantages such as low overall cost, high resource recovery and a high degree of industrialization. This paper compares the comprehensive benefits of steel structure modular buildings with those of prefabricated reinforced concrete residences in terms of economic, environmental, social and technical benefits, using the entropy evaluation method. It is concluded that the comprehensive benefits of steel structure modular buildings are better than those of prefabricated reinforced concrete residences. The conclusion of this study provides a useful reference for the development of steel structure modular buildings in China.
Analysis of financial time series using multiscale entropy based on skewness and kurtosis
Xu, Meng; Shang, Pengjian
2018-01-01
There is great interest in studying the dynamic characteristics of financial time series of daily stock closing prices in different regions. Multiscale entropy (MSE) is effective mainly in quantifying the complexity of time series on different time scales. This paper applies a new MSE-based method, using skewness and kurtosis, to assess financial stability. To identify the most suitable coarse-graining method for different kinds of stock indexes, we take into account the developmental characteristics of the stock markets of three continents: Asia, North America and Europe. We study the volatility of different financial time series and analyze the similarities and differences of the coarse-grained time series from the perspective of skewness and kurtosis. A correspondence between the entropy values of stock sequences and the degree of stability of financial markets was observed. Three stocks with particular characteristics among the eight stock sequences are discussed, and the findings match the graphical results of applying the MSE method. A comparative study is conducted over synthetic and real-world data. The results show that the modified method is more sensitive to changes in dynamics and carries more valuable information, and that the skewness- and kurtosis-based discrimination is both evident and stable.
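The modification described above can be sketched as follows (a sketch under stated assumptions: classical MSE coarse-grains each scale with the window mean, and we substitute a higher-moment window statistic; the exact variant, parameters and names are not taken from the paper):

```python
import numpy as np

def coarse_grain(x, scale, stat):
    """Coarse-grain a series at the given scale with a window statistic.
    Classical MSE uses the window mean; the variant replaces it with a
    higher moment such as skewness."""
    n = len(x) // scale
    w = x[:n * scale].reshape(n, scale)
    return stat(w)

def skewness(w):
    m = w.mean(axis=1, keepdims=True)
    s = w.std(axis=1, keepdims=True)
    return (((w - m) / s) ** 3).mean(axis=1)

def sample_entropy(x, m=2, r_factor=0.2):
    """Plain plug-in sample entropy SampEn(m, r) with r = r_factor * std."""
    r = r_factor * np.std(x)
    def matched_pairs(mm):
        emb = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return ((d <= r).sum() - len(emb)) / 2   # pairs, excluding self
    return -np.log(matched_pairs(m + 1) / matched_pairs(m))

# A regular signal scores lower than noise, and the moment-based
# coarse-grained series can be fed to the same entropy estimator.
rng = np.random.default_rng(3)
noise = rng.normal(size=1200)
sine = np.sin(np.linspace(0.0, 40.0 * np.pi, 1200))
cg = coarse_grain(noise, 3, skewness)   # scale-3, skewness-based series
print(sample_entropy(sine), sample_entropy(noise), sample_entropy(cg))
```

Sweeping `scale` and swapping `skewness` for a kurtosis statistic produces the family of entropy-versus-scale curves that MSE-style analyses compare across markets.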
Pilge, Stefanie; Kreuzer, Matthias; Karatchiviev, Veliko; Kochs, Eberhard F; Malcharek, Michael; Schneider, Gerhard
2015-05-01
It is claimed that the bispectral index (BIS) and state entropy reflect an identical clinical spectrum, the hypnotic component of anaesthesia. So far, it is not known to what extent different devices display similar index values while processing identical electroencephalogram (EEG) signals. The aim was to compare BIS and state entropy during analysis of identical EEG data, with inspection of the raw EEG input to detect potential causes of erroneous index calculation. The design was an offline re-analysis of EEG data from a randomised, single-centre controlled trial using the Entropy Module and an Aspect A-2000 monitor, conducted at Klinikum rechts der Isar, Technische Universität München, Munich. Forty adult patients undergoing elective surgery under general anaesthesia were studied, with blocked randomisation of 20 patients per anaesthetic group (sevoflurane/remifentanil or propofol/remifentanil) and the isolated forearm technique for differentiation between consciousness and unconsciousness. Outcome measures were the prediction probability (PK) of state entropy to discriminate consciousness from unconsciousness, and the correlation and agreement between state entropy and BIS from deep to light hypnosis. Raw EEG was analysed wherever index values conflicted with clinical examination, using frequency measures (frequency bands/spectral edge frequency 95) and visual inspection for physiological EEG patterns (e.g. beta or delta arousal), pathophysiological features such as high-frequency signals (electromyogram/high-frequency EEG or eye fluttering/saccades), different types of electro-oculogram or epileptiform EEG, and technical artefacts. The PK of state entropy was 0.80 and of BIS 0.84; the correlation coefficient of state entropy with BIS was 0.78. Nine percent of BIS and 14% of state entropy values disagreed with clinical examination. The highest incidence of disagreement occurred after state transitions, in particular for state entropy after loss of consciousness during sevoflurane anaesthesia. EEG sequences which led to false 'conscious' index values often showed high
Tang, Shaolei; Yang, Xiaofeng; Dong, Di; Li, Ziwei
2015-12-01
Sea surface temperature (SST) is an important variable for understanding interactions between the ocean and the atmosphere. SST fusion is crucial for acquiring SST products of high spatial resolution and coverage. This study introduces a Bayesian maximum entropy (BME) method for blending daily SSTs from multiple satellite sensors. A new spatiotemporal covariance model of an SST field is built to integrate not only single-day SSTs but also time-adjacent SSTs. In addition, AVHRR 30-year SST climatology data are introduced as soft data at the estimation points to improve the accuracy of blended results within the BME framework. The merged SSTs, with a spatial resolution of 4 km and a temporal resolution of 24 hours, are produced in the Western Pacific Ocean region to demonstrate and evaluate the proposed methodology. Comparisons with in situ drifting buoy observations show that the merged SSTs are accurate and the bias and root-mean-square errors for the comparison are 0.15°C and 0.72°C, respectively.
Assessment of the Sustainable Development Capacity with the Entropy Weight Coefficient Method
Directory of Open Access Journals (Sweden)
Qingsong Wang
2015-10-01
Full Text Available Sustainable development is widely accepted in the world. How to reflect the sustainable development capacity of a region is an important issue for enacting policies and plans. An index system for capacity assessment is established by employing the Entropy Weight Coefficient method. The results indicate that the sustainable development capacity of Shandong Province is improving in terms of its economy subsystem, resource subsystem, and society subsystem whilst degrading in its environment subsystem. Shandong Province has shown the general trend towards sustainable development. However, the sustainable development capacity can be constrained by the resources such as energy, land, water, as well as environmental protection. These issues are induced by the economy development model, the security of energy supply, the level of new energy development, the end-of-pipe control of pollution, and the level of science and technology commercialization. Efforts are required to accelerate the development of the tertiary industry, the commercialization of high technology, the development of new energy and renewable energy, and the structure optimization of energy mix. Long-term measures need to be established for the ecosystem and environment protection.
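The entropy weight coefficient calculation underlying an assessment like this can be sketched in a few lines (a standard formulation of the entropy weight method; the paper's exact index construction and normalization may differ, and the sample matrix below is hypothetical):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method for an (alternatives x indicators) matrix
    of positive, benefit-type indicator values: indicators whose values
    vary strongly across alternatives carry more information and get
    larger weights."""
    P = X / X.sum(axis=0)                          # column proportions
    n = X.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(n)   # entropy per indicator
    d = 1.0 - E                                    # degree of divergence
    return d / d.sum()                             # normalized weights

# Hypothetical data: indicator 0 barely varies across the three
# alternatives, indicator 1 varies strongly.
X = np.array([[0.50, 1.0],
              [0.51, 4.0],
              [0.49, 9.0]])
w = entropy_weights(X)
print(w)
```

The near-constant indicator receives a weight close to zero, which is exactly the mechanism by which the method lets the data, rather than the analyst, decide how much each subsystem indicator contributes to the composite capacity score.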
A Case Study on a Combination NDVI Forecasting Model Based on the Entropy Weight Method
Energy Technology Data Exchange (ETDEWEB)
Huang, Shengzhi; Ming, Bo; Huang, Qiang; Leng, Guoyong; Hou, Beibei
2017-05-05
It is critically meaningful to accurately predict NDVI (Normalized Difference Vegetation Index), which helps guide regional ecological remediation and environmental management. In this study, a combination forecasting model (CFM) was proposed to improve the performance of NDVI predictions in the Yellow River Basin (YRB) based on three individual forecasting models, i.e., the Multiple Linear Regression (MLR), Artificial Neural Network (ANN), and Support Vector Machine (SVM) models. The entropy weight method was employed to determine the weight coefficient for each individual model depending on its predictive performance. Results showed that: (1) ANN exhibits the highest fitting capability among the four forecasting models in the calibration period, whilst its generalization ability becomes weak in the validation period; MLR has a poor performance in both calibration and validation periods; the predicted results of CFM in the calibration period have the highest stability; (2) CFM generally outperforms all individual models in the validation period, and can improve the reliability and stability of predicted results through combining the strengths while reducing the weaknesses of individual models; (3) the performances of all forecasting models are better in dense vegetation areas than in sparse vegetation areas.
Energy and entropy analysis of closed adiabatic expansion based trilateral cycles
International Nuclear Information System (INIS)
Garcia, Ramon Ferreiro; Carril, Jose Carbia; Gomez, Javier Romero; Gomez, Manuel Romero
2016-01-01
Highlights: • Adiabatic expansion based TCs surpass the Carnot factor at low temperatures. • Surpassing the Carnot factor doesn’t violate the 2nd law. • An entropy analysis is applied to verify fulfilment of the second law. • Correction of the exergy transfer associated with heat transferred to a cycle. - Abstract: A vast amount of heat energy is available at low cost within the range of medium and low temperatures. Existing thermal cycles cannot make efficient use of such available low grade heat because they are mainly based on conventional organic Rankine cycles, which are limited by Carnot constraints. However, recent developments related to the performance of thermal cycles composed of closed processes have led to the Carnot factor being exceeded. Consequently, once the viability of closed process based thermal cycles that surpass the Carnot factor operating at low and medium temperatures is globally accepted, research work will aim at looking into the consequences that follow from surpassing the Carnot factor while fulfilling the 2nd law, its impact on the definition of 2nd law efficiency, as well as the impact on the exergy transfer from thermal power sources to any heat consumer, including thermal cycles. The methodology used to meet the proposed objectives involves the analysis of energy and entropy in trilateral closed process based thermal cycles. Thus, such energy and entropy analysis is carried out upon non-condensing mode trilateral thermal cycles (TCs) characterised by the conversion of low grade heat into mechanical work undergoing closed adiabatic path functions: isochoric heat absorption, adiabatic heat to mechanical work conversion and isobaric heat rejection. Firstly, a cycle energy analysis is performed to determine the range of some relevant cycle parameters, such as the operating temperatures and their associated pressures, entropies, internal energies and specific volumes. In this way, the ranges of temperatures within which
Directory of Open Access Journals (Sweden)
C Hauman
2014-06-01
Full Text Available The vehicle routing problem with time windows is a widely studied problem with many real-world applications. The problem considered here entails the construction of routes that a number of identical vehicles travel to service different nodes within a certain time window. New benchmark problems with multi-objective features were recently suggested in the literature and the multi-objective optimisation cross-entropy method is applied to these problems to investigate the feasibility of the method and to determine and propose reference solutions for the benchmark problems. The application of the cross-entropy method to the multi-objective vehicle routing problem with soft time windows is investigated. The objectives that are evaluated include the minimisation of the total distance travelled, the number of vehicles and/or routes, the total waiting time and delay time of the vehicles and the makespan of a route.
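The core loop of the cross-entropy method is easy to sketch. The toy below is only an illustration, not the paper's routing implementation: it optimises over binary vectors with a Bernoulli sampling distribution, whereas a VRP version would parameterise route choices; the objective and all parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def cross_entropy_maximise(score, n_bits, n_samples=200, elite_frac=0.1,
                           n_iters=40, smoothing=0.7):
    """Cross-entropy method over binary vectors.

    Maintains a Bernoulli probability per bit, samples candidates, and
    refits the probabilities to the elite samples each iteration.
    """
    p = np.full(n_bits, 0.5)
    n_elite = max(1, int(elite_frac * n_samples))
    for _ in range(n_iters):
        samples = (rng.random((n_samples, n_bits)) < p).astype(int)
        scores = np.array([score(s) for s in samples])
        # Keep the best-scoring samples and move p toward their mean.
        elite = samples[np.argsort(scores)[-n_elite:]]
        p = smoothing * elite.mean(axis=0) + (1 - smoothing) * p
    return (p > 0.5).astype(int)

# Toy objective: prefer the alternating pattern 1,0,1,0,...
target = np.array([1, 0] * 5)
best = cross_entropy_maximise(lambda s: -np.abs(s - target).sum(), 10)
```

The smoothing parameter keeps some exploration in the sampling distribution; in the multi-objective setting of the paper, the scalar score would be replaced by non-dominated ranking of the candidate routes.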
Jesudason, Christopher G.
2003-09-01
Recently, there have appeared interesting correctives or challenges [Entropy 1999, 1, 111-147] to the Second law formulations, especially in the interpretation of the Clausius equivalent transformations, closely related in area to extensions of the Clausius principle to irreversible processes [Chem. Phys. Lett. 1988, 143(1), 65-70]. Since the traditional formulations are central to science, a brief analysis of some of these newer theories along traditional lines is attempted, based on well-attested axioms which have formed the basis of equilibrium thermodynamics. It is deduced that the Clausius analysis leading to the law of increasing entropy does not follow from the given axioms, but it can be proved that for irreversible transitions, the total entropy change of the system and thermal reservoirs (the "Universe") is not negative, even for the case when the reservoirs are not at the same temperature as the system during heat transfer. On the basis of two new simple theorems and three corollaries derived for the correlation between irreversible and reversible pathways and the traditional axiomatics, it is shown that a sequence of reversible states can never be used to describe a corresponding sequence of irreversible states for at least closed systems, thereby restricting the principle of local equilibrium. It is further shown that some of the newer irreversible entropy forms given exhibit some paradoxical properties relative to the standard axiomatics. It is deduced that any reconciliation between the traditional approach and novel theories lies in creating a well-defined set of axioms on which all theoretical developments should be based, unless proven not to be useful, in which case there should be consensus in removing such axioms from theory. Clausius' theory of equivalent transformations does not contradict the traditional understanding of heat-work efficiency. It is concluded that the intuitively derived assumptions over the last two centuries seem to
Directory of Open Access Journals (Sweden)
Wen-Yao Pan
2015-01-01
Full Text Available Obstructive sleep apnea (OSA) is an independent cardiovascular risk factor to which autonomic nervous dysfunction has been reported to be an important contributor. Ninety subjects recruited from the sleep center of a single medical center were divided into four groups: normal snoring subjects without OSA (apnea hypopnea index, AHI < 5, n = 11), mild OSA (5 ≤ AHI < 15, n = 10), moderate OSA (15 ≤ AHI < 30, n = 24), and severe OSA (AHI ≥ 30, n = 45). Demographic (i.e., age, gender), anthropometric (i.e., body mass index, neck circumference), and polysomnographic (PSG) data were recorded and compared among the different groups. For each subject, R-R intervals (RRI) from 10 segments of 10-minute electrocardiogram recordings during non-rapid eye movement sleep at stage N2 were acquired and analyzed for heart rate variability (HRV) and sample entropy using a multiscale entropy index (MEI) that was divided into small scale (MEISS, scale 1–5) and large scale (MEILS, scale 6–10). Our results not only demonstrated that MEISS could successfully distinguish normal snoring subjects and those with mild OSA from those with moderate and severe disease, but also revealed good correlation between MEISS and AHI with Spearman correlation analysis (r = −0.684, p < 0.001). Therefore, using the two parameters of EEG and ECG, MEISS may serve as a simple preliminary screening tool for assessing the severity of OSA before proceeding to PSG analysis.
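The coarse-graining-plus-sample-entropy computation underlying a multiscale entropy index can be sketched as follows. This is an illustrative Python sketch on simulated data, not the study's implementation; the parameter choices (m = 2, r = 0.2·SD) are only conventional defaults:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def count_matches(mm):
        # All overlapping templates of length mm, compared pairwise
        # with the Chebyshev (max-abs) distance.
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        total = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += np.sum(dist <= r)
        return total
    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

def multiscale_entropy(x, scales):
    """Coarse-grain by averaging non-overlapping windows, then compute
    SampEn at each scale (the basis of a multiscale entropy index)."""
    out = []
    for s in scales:
        n = len(x) // s
        coarse = np.asarray(x[:n * s], dtype=float).reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse))
    return out

rng = np.random.default_rng(1)
white = rng.standard_normal(1000)
# Analogue of a "small-scale" index: mean SampEn over scales 1-5.
mei_small = np.mean(multiscale_entropy(white, scales=[1, 2, 3, 4, 5]))
```

Averaging the per-scale values over scales 1–5 versus 6–10 gives small-scale and large-scale summary indices of the kind the study compares across OSA severity groups.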
Directory of Open Access Journals (Sweden)
Mathias Baumert
2014-12-01
Full Text Available Autonomic activity affects beat-to-beat variability of heart rate and QT interval. The aim of this study was to explore whether entropy measures are suitable to detect changes in neural outflow to the heart elicited by two different stress paradigms. We recorded short-term ECG in 11 normal subjects during an experimental protocol that involved head-up tilt and mental arithmetic stress and computed sample entropy, cross-sample entropy and causal interactions based on conditional entropy from RR and QT interval time series. Head-up tilt resulted in a significant reduction in sample entropy of RR intervals and cross-sample entropy, while mental arithmetic stress resulted in a significant reduction in coupling directed from RR to QT. In conclusion, measures of entropy are suitable to detect changes in neural outflow to the heart and decoupling of repolarisation variability from heart rate variability elicited by orthostatic or mental arithmetic stress.
A Bayes-Maximum Entropy method for multi-sensor data fusion
Energy Technology Data Exchange (ETDEWEB)
Beckerman, M.
1991-01-01
In this paper we introduce a Bayes-Maximum Entropy formalism for multi-sensor data fusion, and present an application of this methodology to the fusion of ultrasound and visual sensor data as acquired by a mobile robot. In our approach the principle of maximum entropy is applied to the construction of priors and likelihoods from the data. Distances between ultrasound and visual points of interest in a dual representation are used to define Gibbs likelihood distributions. Both one- and two-dimensional likelihoods are presented, and cast into a form which makes explicit their dependence upon the mean. The Bayesian posterior distributions are used to test a null hypothesis, and Maximum Entropy Maps used for navigation are updated using the resulting information from the dual representation. 14 refs., 9 figs.
Variations mechanism in entropy of wave height field and its relation with thermodynamic entropy
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
This paper gives a brief description of the annual period and seasonal variation in the wave height field entropy in the northeastern Pacific. A calculation of the quantity of the sun's radiation heat received by lithosphere systems in the northern hemisphere is introduced. The wave height field entropy is compared with the difference in the quantity of the sun's radiation heat. Analysis of the transfer method, period and lag of this seasonal variation led to the conclusion that the annual period and seasonal variation in the entropy of the wave height field in the Northwestern Pacific is due to the seasonal variation of the sun's radiation heat. Furthermore, the inconsistency between thermodynamic entropy and information entropy was studied.
Directory of Open Access Journals (Sweden)
Yanzhu Hu
2016-09-01
Full Text Available Complex network methodology is very useful for complex system exploration. However, the relationships among variables in complex systems are usually not clear. Therefore, inferring association networks among variables from their observed data has been a popular research topic. We propose a method, named small-shuffle symbolic transfer entropy spectrum (SSSTES), for inferring association networks from multivariate time series. The method can solve four problems for inferring association networks, i.e., strong correlation identification, correlation quantification, direction identification and temporal relation identification. The method can be divided into four layers. The first layer is the so-called data layer, where data input and processing are carried out. In the second layer, we symbolize the model data, original data and shuffled data from the previous layer and cyclically calculate transfer entropy with different time lags for each pair of time series variables. Thirdly, we compose transfer entropy spectrums for pairwise time series from the previous layer’s output, a list of transfer entropy matrices. We also identify the correlation level between variables in this layer. In the last layer, we build a weighted adjacency matrix, the value of each entry representing the correlation level between pairwise variables, and then get the weighted directed association network. Three sets of numerically simulated data, from a linear system, a nonlinear system and a coupled Rössler system, are used to show how the proposed approach works. Finally, we apply SSSTES to a real industrial system and get a better result than with two other methods.
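The two middle layers, ordinal symbolisation and transfer entropy estimation, can be sketched as follows. This is a simplified illustration, not the SSSTES implementation: it uses order-3 ordinal patterns, a single lag of 1, and a plug-in histogram estimator, and the coupled test signals are invented:

```python
import numpy as np
from collections import Counter
from itertools import permutations
from math import log2

def symbolise(x, order=3):
    """Map each length-`order` window to its ordinal (permutation) pattern."""
    patterns = {p: i for i, p in enumerate(permutations(range(order)))}
    return np.array([patterns[tuple(np.argsort(x[i:i + order]))]
                     for i in range(len(x) - order + 1)])

def transfer_entropy(sym_x, sym_y):
    """TE(X->Y) on symbol sequences: information the past of X adds about
    the next symbol of Y beyond Y's own past (lag 1, plug-in estimate)."""
    trip = Counter(zip(sym_y[1:], sym_y[:-1], sym_x[:-1]))
    pair_yy = Counter(zip(sym_y[1:], sym_y[:-1]))
    pair_yx = Counter(zip(sym_y[:-1], sym_x[:-1]))
    single_y = Counter(sym_y[:-1])
    n = len(sym_y) - 1
    te = 0.0
    for (y1, y0, x0), c in trip.items():
        # p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ], counts cancel n.
        te += (c / n) * log2((c * single_y[y0]) /
                             (pair_yy[(y1, y0)] * pair_yx[(y0, x0)]))
    return te

rng = np.random.default_rng(2)
x = rng.standard_normal(3000)
y = np.roll(x, 1) + 0.1 * rng.standard_normal(3000)  # y follows x by one step
sx, sy = symbolise(x), symbolise(y)
te_xy, te_yx = transfer_entropy(sx, sy), transfer_entropy(sy, sx)
```

Because y lags x, the information flow X→Y dominates Y→X, which is the directional asymmetry the spectrum method exploits; the small-shuffle surrogates of the paper would then be used to test the estimates for significance.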
Scale-specific effects: A report on multiscale analysis of acupunctured EEG in entropy and power
Song, Zhenxi; Deng, Bin; Wei, Xile; Cai, Lihui; Yu, Haitao; Wang, Jiang; Wang, Ruofan; Chen, Yingyuan
2018-02-01
Investigating acupuncture effects contributes to improving clinical application and understanding neuronal dynamics under external stimulation. In this report, we recorded electroencephalography (EEG) signals evoked by acupuncture at the ST36 acupoint with three stimulus frequencies of 50, 100 and 200 times per minute, and selected non-acupuncture EEGs as the control group. Multiscale analyses were introduced to investigate the possible acupuncture effects on complexity and power at the multiscale level. Using multiscale weighted-permutation entropy, we found significant effects of increased complexity degree in EEG signals induced by acupuncture. The comparison of the three stimulation manipulations showed that 100 times/min generated the most obvious effects, and affected the most cortical regions. By estimating average power spectral density, we found decreased power induced by acupuncture. The joint distribution of entropy and power indicated an inverse correlation, and this relationship was weakened by acupuncture effects, especially under the manipulation at 100 times/min. The above findings are more evident and stable at large scales than at small scales, which suggests that multiscale analysis allows evaluating significant effects at specific scales and makes it possible to probe the inherent characteristics underlying physiological signals.
Application of maximum entropy to neutron tunneling spectroscopy
International Nuclear Information System (INIS)
Mukhopadhyay, R.; Silver, R.N.
1990-01-01
We demonstrate the maximum entropy method for the deconvolution of high resolution tunneling data acquired with a quasielastic spectrometer. Given a precise characterization of the instrument resolution function, a maximum entropy analysis of lutidine data obtained with the IRIS spectrometer at ISIS results in an effective factor of three improvement in resolution. 7 refs., 4 figs
Time series analysis of the Antarctic Circumpolar Wave via symbolic transfer entropy
Oh, Mingi; Kim, Sehyun; Lim, Kyuseong; Kim, Soo Yong
2018-06-01
An attempt to interpret a large-scale climate phenomenon in the Southern Ocean (SO), the Antarctic Circumpolar Wave (ACW), has been made using an information entropy method, symbolic transfer entropy (STE). Over the 50-60∘S latitude belt, information flow for four climate variables, sea surface temperature (SST), sea-ice edge (SIE), sea level pressure (SLP) and meridional wind speed (MWS), is examined. We found a tendency for eastward flow of information to be preferred only for the oceanic variables, which is a main characteristic of the ACW, an eastward wave making a circuit around Antarctica. Since the ACW is a coherent pattern in both ocean and atmosphere, it is reasonable to infer that this tendency reflects the Antarctic Circumpolar Current (ACC) encircling Antarctica, rather than being evidence of the ACW. We observed one common feature for all four variables, a strong information flow over the eastern Pacific Ocean, which suggests a signature of the El Niño Southern Oscillation (ENSO).
Tang, Qingxin; Bo, Yanchen; Zhu, Yuxin
2016-04-01
Merging multisensor aerosol optical depth (AOD) products is an effective way to produce more spatiotemporally complete and accurate AOD products. A spatiotemporal statistical data fusion framework based on a Bayesian maximum entropy (BME) method was developed for merging satellite AOD products in East Asia. The advantages of the presented merging framework are that it not only utilizes the spatiotemporal autocorrelations but also explicitly incorporates the uncertainties of the AOD products being merged. The satellite AOD products used for merging are the Moderate Resolution Imaging Spectroradiometer (MODIS) Collection 5.1 Level-2 AOD products (MOD04_L2) and the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Deep Blue Level 2 AOD products (SWDB_L2). The results show that the average completeness of the merged AOD data is 95.2%, which is significantly superior to the completeness of MOD04_L2 (22.9%) and SWDB_L2 (20.2%). By comparing the merged AOD to the Aerosol Robotic Network AOD records, the results show that the correlation coefficient (0.75), root-mean-square error (0.29), and mean bias (0.068) of the merged AOD are close to those (the correlation coefficient (0.82), root-mean-square error (0.19), and mean bias (0.059)) of the MODIS AOD. In the regions where both MODIS and SeaWiFS have valid observations, the accuracy of the merged AOD is higher than those of MODIS and SeaWiFS AODs. Even in regions where both MODIS and SeaWiFS AODs are missing, the accuracy of the merged AOD is also close to the accuracy of the regions where both MODIS and SeaWiFS have valid observations.
International Nuclear Information System (INIS)
Rangel-Hernandez, V.H.; Damian-Ascencio, C.; Juarez-Robles, D.; Gallegos-Munoz, A.; Zaleta-Aguilar, A.; Plascencia-Mora, H.
2011-01-01
The present paper aims at investigating the main sources of irreversibility in a Proton Exchange Membrane Fuel Cell (PEMFC) using a Fermat spiral as flow distributor and also to direct possible improvements in its design. The numerical analysis is based on a finite volume technique with a SIMPLE algorithm as the numerical procedure. In order to have a more complete and rigorous analysis, a new dimensionless parameter, representing the ratio of the entropy generation due to mass transfer to the total entropy generation, is proposed here. Results demonstrate that the main sources of irreversibility in a fuel cell are the concentration losses for most of the operational domain, whereas the heat transfer effect is not dominant. -- Highlights: → PEM Fuel Cell with Fermat spiral as distributor. → Causes of irreversibilities. → A new dimensionless parameter to determine the contribution of mass transfer to entropy generation.
The entropy dissipation method for spatially inhomogeneous reaction-diffusion-type systems
Di Francesco, M.; Fellner, K.; Markowich, P. A
2008-01-01
and reaction terms and admit fewer conservation laws than the size of the system. In particular, we successfully apply the entropy approach to general linear systems and to a nonlinear example of a reaction-diffusion-convection system arising in solid
Directory of Open Access Journals (Sweden)
Rajesh Kumar Bhuyan
2016-06-01
Full Text Available The objective of this paper is to optimize the process parameters by a combined approach of VIKOR and the Entropy weight measurement method during the Electrical discharge machining (EDM) process of Al-18wt.%SiCp metal matrix composite (MMC). The central composite design (CCD) method is considered to evaluate the effect of three process parameters, namely pulse on time (Ton), peak current (Ip) and flushing pressure (Fp), on responses like material removal rate (MRR), tool wear rate (TWR), radial over cut (ROC) and surface roughness (Ra). The Entropy weight measurement method evaluates the individual weights of each response and, using the VIKOR method, the multi-objective responses are optimized to get a single numerical index known as the VIKOR Index. Then the Analysis of Variance (ANOVA) technique is used to determine the significance of the process parameters on the VIKOR Index. Finally, the result of the VIKOR Index is validated by a confirmation test using the linear mathematical model equation developed by response surface methodology to identify the effectiveness of the proposed method.
Analysis of crack propagation in concrete structures with structural information entropy
Institute of Scientific and Technical Information of China (English)
Anonymous
2010-01-01
The propagation of cracks in concrete structures causes energy dissipation and release, and also causes energy redistribution in the structures. Entropy can characterize the energy redistribution. To investigate the relation between the propagation of cracks and the entropy in concrete structures, cracked concrete structures are treated as dissipative structures. Structural information entropy is defined for concrete structures. A compact tension test is conducted. Meanwhile, numerical simulations are also carried out. Both the test and numerical simulation results show that the structural information entropy in the structures can characterize the propagation of cracks in concrete structures.
International Nuclear Information System (INIS)
Papaioannou, V E; Pneumatikos, I A; Chouvarda, I G; Maglaveras, N K; Baltopoulos, G I
2013-01-01
A few studies estimating temperature complexity have found decreased Shannon entropy during severe stress. In this study, we measured both the Shannon and Tsallis entropy of temperature signals in a cohort of critically ill patients and compared these measures with the sequential organ failure assessment (SOFA) score, in terms of intensive care unit (ICU) mortality. Skin temperature was recorded in 21 mechanically ventilated patients, who developed sepsis and septic shock during the first 24 h of an ICU-acquired infection. Shannon and Tsallis entropies were calculated in wavelet-based decompositions of the temperature signal. Statistically significant differences of entropy features were tested between survivors and non-survivors, and classification models were built for predicting final outcome. Significantly reduced Tsallis and Shannon entropies were found in non-survivors (seven patients, 33%) as compared to survivors. Wavelet measurements of both entropy metrics were found to predict ICU mortality better than SOFA, according to a combination of area under the curve, sensitivity and specificity values. Both entropies exhibited similar prognostic accuracy. The combination of SOFA and entropy improved upon the outcome of the univariate models. We suggest that reduced wavelet Shannon and Tsallis entropies of temperature signals may complement SOFA in mortality prediction during the first 24 h of an ICU-acquired infection. (paper)
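The two entropy measures compared in the study can be sketched on a histogrammed signal as follows; this is an illustrative Python sketch, not the paper's wavelet-based pipeline, and the entropic index q = 2 and the test signals are arbitrary choices:

```python
import numpy as np

def shannon_entropy(p):
    """H = -sum p ln p over a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def tsallis_entropy(p, q=2.0):
    """S_q = (1 - sum p^q) / (q - 1); recovers Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    return float((1.0 - (p ** q).sum()) / (q - 1.0))

def signal_entropies(x, bins=16):
    """Histogram a signal into a probability distribution and
    return (Shannon, Tsallis) entropies of that distribution."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    return shannon_entropy(p), tsallis_entropy(p)

rng = np.random.default_rng(3)
flat = signal_entropies(rng.uniform(size=5000))   # irregular signal
peaked = signal_entropies(np.full(5000, 0.5))     # constant "signal"
```

Both measures drop toward zero as the signal's amplitude distribution concentrates, which is the direction of change the study associates with severe stress; the wavelet step of the paper would apply these measures per decomposition level rather than to the raw signal.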
Entropy and Digital Installation
Directory of Open Access Journals (Sweden)
Susan Ballard
2005-01-01
Full Text Available This paper examines entropy as a process which introduces ideas of distributed materiality to digital installation. Beginning from an analysis of entropy as both force and probability measure within information theory, and its extension in Rudolf Arnheim’s text “Entropy and Art”, it develops an argument for the positive rather than negative forces of entropy. The paper centres on a discussion of two recent works by New Zealand artists Ronnie van Hout (“On the Run”, Wellington City Gallery, NZ, 2004) and Alex Monteith (“Invisible Cities”, Physics Room Contemporary Art Space, Christchurch, NZ, 2004). Ballard suggests that entropy, rather than being a hindrance to understanding or a random chaotic force, discloses a necessary and material politics of noise present in digital installation.
2D Tsallis Entropy for Image Segmentation Based on Modified Chaotic Bat Algorithm
Directory of Open Access Journals (Sweden)
Zhiwei Ye
2018-03-01
Full Text Available Image segmentation is a significant step in image analysis and computer vision. Many entropy-based approaches have been presented on this topic; among them, Tsallis entropy is one of the best performing methods. However, 1D Tsallis entropy does not make use of the spatial correlation information within the neighborhood, so results might be ruined by noise. Therefore, 2D Tsallis entropy is proposed to solve the problem, and results are compared with 1D Fisher, 1D maximum entropy, 1D cross entropy, 1D Tsallis entropy, fuzzy entropy, 2D Fisher, 2D maximum entropy and 2D cross entropy. On the other hand, due to the existence of huge computational costs, meta-heuristic algorithms like the genetic algorithm (GA), particle swarm optimization (PSO), the ant colony optimization algorithm (ACO) and the differential evolution algorithm (DE) are used to accelerate the 2D Tsallis entropy thresholding method. In this paper, considering 2D Tsallis entropy as a constrained optimization problem, the optimal thresholds are acquired by maximizing the objective function using a modified chaotic bat algorithm (MCBA). The proposed algorithm has been tested on some actual and infrared images. The results are compared with those of PSO, GA, ACO and DE and demonstrate that the proposed method outperforms the other approaches involved in the paper, and is a feasible and effective option for image segmentation.
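For reference, the 1D Tsallis thresholding criterion that the 2D method extends can be sketched as below. This is an illustrative Python sketch, not the paper's 2D/MCBA implementation; the entropic index q = 0.8 and the synthetic bimodal "image" are arbitrary choices:

```python
import numpy as np

def tsallis_threshold(image, q=0.8, levels=256):
    """1D Tsallis entropy thresholding: choose t maximising
    S_q(background) + S_q(foreground) + (1 - q) * S_q(bg) * S_q(fg)."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    best_t, best_val = 0, -np.inf
    for t in range(1, levels - 1):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        a, b = p[:t] / w0, p[t:] / w1   # class-conditional distributions
        s0 = (1 - (a ** q).sum()) / (q - 1)
        s1 = (1 - (b ** q).sum()) / (q - 1)
        val = s0 + s1 + (1 - q) * s0 * s1
        if val > best_val:
            best_t, best_val = t, val
    return best_t

# Synthetic bimodal "image": dark background near 60, bright object near 190.
rng = np.random.default_rng(4)
img = np.clip(np.concatenate([rng.normal(60, 10, 4000),
                              rng.normal(190, 10, 1000)]), 0, 255)
t = tsallis_threshold(img)
```

The exhaustive loop over 256 thresholds is cheap in 1D; it is the 2D histogram version, with its much larger search space, that motivates the meta-heuristic acceleration discussed in the abstract.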
Nonsymmetric entropy and maximum nonsymmetric entropy principle
International Nuclear Information System (INIS)
Liu Chengshi
2009-01-01
Under the frame of a statistical model, the concept of nonsymmetric entropy, which generalizes the concepts of Boltzmann's entropy and Shannon's entropy, is defined. The maximum nonsymmetric entropy principle is proved. Some important distribution laws, such as the power law, can be derived from this principle naturally. In particular, nonsymmetric entropy is more convenient than other entropies, such as Tsallis's entropy, in deriving power laws.
Directory of Open Access Journals (Sweden)
Guiji Tang
2016-01-01
Full Text Available A novel method of fault diagnosis for rolling bearings, which combines the dual tree complex wavelet packet transform (DTCWPT), the improved multiscale permutation entropy (IMPE), and the linear local tangent space alignment (LLTSA) with the extreme learning machine (ELM), is put forward in this paper. In this method, in order to effectively discover the underlying feature information, DTCWPT, which has attractive properties such as near shift-invariance and reduced aliasing, is first utilized to decompose the original signal into a set of subband signals. Then, IMPE, which is designed to reduce the variability of entropy measures, is applied to characterize the properties of each obtained subband signal at different scales. Furthermore, the feature vectors are constructed by combining the IMPE of each subband signal. After the feature vector construction, LLTSA is employed to compress the high dimensional vectors of the training and the testing samples into low dimensional vectors with better distinguishability. Finally, the ELM classifier is used to automatically accomplish the condition identification with the low dimensional feature vectors. The experimental data analysis results validate the effectiveness of the presented diagnosis method and demonstrate that this method can be applied to distinguish the different fault types and fault degrees of rolling bearings.
International Nuclear Information System (INIS)
De Nicola, Sergio; Fedele, Renato; Man'ko, Margarita A; Man'ko, Vladimir I
2007-01-01
The tomographic-probability description of quantum states is reviewed. The symplectic tomography of quantum states with continuous variables is studied. The symplectic entropy of states with continuous variables is discussed and its relation to Shannon entropy and information is elucidated. The known entropic uncertainty relations of the probability distribution in position and momentum of a particle are extended, and new uncertainty relations for symplectic entropy are obtained. The partial case of symplectic entropy, which is the optical entropy of quantum states, is considered. The entropy associated with the optical tomogram is shown to satisfy the new entropic uncertainty relation. The example of Gaussian states of the harmonic oscillator is studied, and the entropic uncertainty relations for optical tomograms of the Gaussian state are shown to minimize the uncertainty relation.
Permutation entropy analysis of financial time series based on Hill's diversity number
Zhang, Yali; Shang, Pengjian
2017-12-01
In this paper the permutation entropy based on Hill's diversity number (Nn,r) is introduced as a new way to assess the complexity of a complex dynamical system such as a stock market. We test the performance of this method with simulated data. Results show that Nn,r with appropriate parameters is more sensitive to changes in the system and describes the trends of complex systems clearly. In addition, we study stock closing price series drawn from six indices, three US stock indices and three Chinese stock indices, during different periods; Nn,r can quantify the changes of complexity in stock market data. Moreover, we obtain richer information from Nn,r and derive some properties concerning the differences between the US and Chinese stock indices.
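Ordinary permutation entropy, which Nn,r generalises (Hill numbers are, roughly, exponentials of Rényi entropies of the ordinal-pattern frequencies), can be sketched as follows; this illustrative Python sketch is not the paper's Nn,r implementation and uses invented test series:

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, order=3, normalise=True):
    """Permutation entropy: Shannon entropy of ordinal-pattern frequencies."""
    counts = {}
    for i in range(len(x) - order + 1):
        # The ordinal pattern is the argsort of each sliding window.
        pat = tuple(np.argsort(x[i:i + order]))
        counts[pat] = counts.get(pat, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    h = -(p * np.log(p)).sum()
    # Normalise by the maximum ln(order!) so the result lies in [0, 1].
    return h / log(factorial(order)) if normalise else h

rng = np.random.default_rng(5)
noise_pe = permutation_entropy(rng.standard_normal(2000))     # irregular
trend_pe = permutation_entropy(np.arange(2000, dtype=float))  # monotone
```

A monotone series produces a single ordinal pattern and zero entropy, while white noise uses all patterns nearly uniformly; the Hill-number variant of the paper would replace the Shannon step with a diversity number of order r, tuning sensitivity to rare versus common patterns.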
An entropy-based analysis of lane changing behavior: An interactive approach.
Kosun, Caglar; Ozdemir, Serhan
2017-05-19
As a novelty, this article proposes the nonadditive entropy framework for the description of driver behaviors during lane changing. The authors also state that this entropy framework governs the lane changing behavior in traffic flow in accordance with the long-range vehicular interactions and traffic safety. The nonadditive entropy framework is the new generalized theory of thermostatistical mechanics. Vehicular interactions during lane changing are considered within this framework. The interactive approach for the lane changing behavior of the drivers is presented in the traffic flow scenarios given in the article. According to the traffic flow scenarios, 4 categories of traffic flow and driver behaviors are obtained. Through the scenarios, comparative analyses of nonadditive and additive entropy domains are also provided. Two quadrants of the categories belong to the nonadditive entropy domain; the rest fall within the additive entropy domain. Driving behaviors are extracted and the scenarios depict that nonadditivity matches safe driving well, whereas additivity corresponds to unsafe driving. Furthermore, the cooperative traffic system is considered in nonadditivity, where the long-range interactions are present. However, the uncooperative traffic system falls into the additivity domain. The analyses also state that there would be possible traffic flow transitions among the quadrants. This article shows that lane changing behavior could be generalized as nonadditive, with additivity as a special case, based on the given traffic conditions. The nearest and close neighbor models are well within the conventional additive entropy framework. In this article, both the long-range vehicular interactions and safe driving behavior in traffic are handled in the nonadditive entropy domain. It is also inferred that the Tsallis entropy region would correspond to mandatory lane changing behavior, whereas additive and either the extensive or nonextensive entropy region would
International Nuclear Information System (INIS)
Maes, Christian
2012-01-01
In contrast to the quite unique entropy concept useful for systems in (local) thermodynamic equilibrium, there is a variety of quite distinct nonequilibrium entropies, reflecting different physical points. We disentangle these entropies as they relate to heat, fluctuations, response, time asymmetry, variational principles, monotonicity, volume contraction or statistical forces. However, not all of those extensions yield state quantities as understood thermodynamically. At the end we sketch how aspects of dynamical activity can take over for obtaining an extended Clausius relation.
On the maximum-entropy method for kinetic equation of radiation, particle and gas
International Nuclear Information System (INIS)
El-Wakil, S.A.; Madkour, M.A.; Degheidy, A.R.; Machali, H.M.
1995-01-01
The maximum-entropy approach is used to calculate some problems in radiative transfer and reactor physics, such as the escape probability, the emergent and transmitted intensities for a finite slab, as well as the emergent intensity for a semi-infinite medium. Also, it is employed to solve problems involving spherical geometry, such as luminosity (the total energy emitted by a sphere), neutron capture probability and the albedo problem. The technique is also employed in the kinetic theory of gases to calculate the Poiseuille flow and thermal creep of a rarefied gas between two plates. Numerical calculations are carried out and compared with the published data. The comparisons demonstrate that the maximum-entropy results are in good agreement with the exact ones. (orig.)
Information entropy method and the description of echo hologram formation in gaseous media
Garnaeva, G. I.; Nefediev, L. A.; Akhmedshina, E. N.
2018-02-01
The effect of collisions that change the velocity of gas particles on the value of information entropy is associated with the spectral structure of the echo hologram’s response, considered in its temporal form. It is shown that collisions with a change in gas particle velocity increase the ‘parasitical’ information, against the background of which the information contained in the temporal shape of the object laser pulse is lost.
Directory of Open Access Journals (Sweden)
Lijuan Xiang
2017-06-01
Full Text Available This paper identifies the Wireless Power Transfer Network (WPTN) as an ideal model for long-distance Wireless Power Transfer (WPT) in a certain region with multiple pieces of electrical equipment. The schematic circuit and design of each power node and the process of power transmission between two power nodes are elaborated. The Improved Cross-Entropy (ICE) method is proposed as an algorithm to solve for the optimal energy route. Non-dominated sorting is introduced for optimization. A demonstration of the optimization result of a 30-node WPTN system based on the proposed algorithm proves the ICE method to be efficacious and efficient.
DEFF Research Database (Denmark)
Escudero, Javier; Evrim, Acar Ataman; Fernández, Alberto
2015-01-01
dynamics. We consider the "refined composite multiscale entropy" (rcMSE), which computes entropy "profiles" showing levels of physiological complexity over temporal scales for individual signals. We compute the rcMSE of resting-state magnetoencephalogram (MEG) recordings from 36 patients with Alzheimer...
Integrating Entropy and Closed Frequent Pattern Mining for Social Network Modelling and Analysis
Adnan, Muhaimenul; Alhajj, Reda; Rokne, Jon
The recent increase in explicitly available social networks has attracted the attention of the research community to investigate how it would be possible to benefit from such a powerful model in producing effective solutions for problems in other domains where the social network is implicit; we argue that social networks do exist around us but that the key issue is how to realize and analyze them. This chapter presents a novel approach for constructing a social network model by an integrated framework that first prepares the data to be analyzed and then applies entropy and frequent closed pattern mining for network construction. For a given problem, we first prepare the data by identifying items and transactions, which are the basic ingredients for frequent closed pattern mining. Items are the main objects in the problem, and a transaction is a set of items that could exist together at one time (e.g., items purchased in one visit to the supermarket). Transactions can be analyzed to discover frequent closed patterns using any of the well-known techniques. Frequent closed patterns have the advantage that they successfully capture the inherent information content of the dataset and are applicable to a broader set of domains. Entropies of the frequent closed patterns are used to keep the dimensionality of the feature vectors to a reasonable size; this is a kind of feature reduction process. Finally, we analyze the dynamic behavior of the constructed social network. Experiments were conducted on a synthetic dataset and on the Enron corpus email dataset. The results presented in the chapter show that social networks extracted from a feature set of frequent closed patterns successfully carry the community structure information. Moreover, for the Enron email dataset, we present an analysis to dynamically indicate the deviations from each user's individual and community profile. These indications of deviations can be very useful for identifying unusual events.
Comparison of Boltzmann and Gibbs entropies for the analysis of single-chain phase transitions
Shakirov, T.; Zablotskiy, S.; Böker, A.; Ivanov, V.; Paul, W.
2017-03-01
In the last 10 years, flat histogram Monte Carlo simulations have contributed strongly to our understanding of the phase behavior of simple generic models of polymers. These simulations result in an estimate for the density of states of a model system. To connect this result with thermodynamics, one has to relate the density of states to the microcanonical entropy. In a series of publications, Dunkel, Hilbert and Hänggi argued that it would lead to a more consistent thermodynamic description of small systems, when one uses the Gibbs definition of entropy instead of the Boltzmann one. The latter is the logarithm of the density of states at a certain energy, the former is the logarithm of the integral of the density of states over all energies smaller than or equal to this energy. We will compare the predictions using these two definitions for two polymer models, a coarse-grained model of a flexible-semiflexible multiblock copolymer and a coarse-grained model of the protein poly-alanine. Additionally, it is important to note that while Monte Carlo techniques are normally concerned with the configurational energy only, the microcanonical ensemble is defined for the complete energy. We will show how taking the kinetic energy into account alters the predictions from the analysis. Finally, the microcanonical ensemble is supposed to represent a closed mechanical N-particle system. But due to Galilei invariance such a system has two additional conservation laws, in general: momentum and angular momentum. We will also show, how taking these conservation laws into account alters the results.
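The two entropy definitions compared above are easy to evaluate side by side once a density of states is in hand. A minimal numerical sketch (the power-law density of states below is a made-up toy, not the polymer or poly-alanine models of the paper):

```python
import numpy as np

# Hypothetical discretized density of states g(E) on an energy grid
# (illustrative only; not the flat-histogram output discussed in the text).
E = np.arange(1, 51, dtype=float)
g = E**3  # toy power-law density of states

S_boltzmann = np.log(g)           # Boltzmann: S_B(E) = ln g(E)
S_gibbs = np.log(np.cumsum(g))    # Gibbs: S_G(E) = ln sum_{E' <= E} g(E')
```

By construction the Gibbs entropy is non-decreasing in energy even where the Boltzmann entropy is not, which is the root of the Dunkel-Hilbert-Hänggi consistency argument for small systems.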
Institute of Scientific and Technical Information of China (English)
(no author listed)
2007-01-01
Shannon entropy in the time domain is a measure of signal or system uncertainty. When based on spectrum entropy, Shannon entropy can be taken as a measure of signal or system complexity. Therefore, wavelet analysis based on a wavelet entropy measure can signify the complexity of a non-stationary signal or system in both the time and frequency domains. In this paper, in order to meet the requirements of post-analysis on the abundant wavelet transform result data and the need for information fusion, the basic definition of the wavelet entropy measure is proposed, corresponding algorithms of several wavelet entropies, such as wavelet average entropy, wavelet time-frequency entropy and wavelet distance entropy, are put forward, and the physical meanings of these entropies are analyzed as well. The application principles of the wavelet entropy measure in ElectroEncephaloGraphy (EEG) signal analysis, mechanical fault diagnosis, and fault detection and classification in power systems are analyzed. Finally, taking transmission line fault detection in a power system as an example, simulations in two different systems, a 10 kV automatic blocking and continuous power transmission line and a 500 kV Extra High Voltage (EHV) transmission line, are carried out, and the two methods, wavelet entropy and wavelet modulus maxima, are compared; the results show the feasibility and application prospects of the six wavelet entropies.
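The wavelet energy entropy idea described above reduces to taking the Shannon entropy of the relative energies of the wavelet subbands. A self-contained sketch with a hand-rolled multi-level Haar transform (the function names and the 3-level choice are ours, not the paper's):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar wavelet transform: returns (approximation, detail)."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def wavelet_energy_entropy(x, levels=3):
    """Shannon entropy of the relative energy distribution across subbands."""
    energies = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        energies.append(np.sum(d**2))   # energy of each detail subband
    energies.append(np.sum(a**2))       # energy of the final approximation
    p = np.array(energies) / np.sum(energies)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))
```

A slow sine concentrates its energy in the approximation band (low entropy), while white noise spreads energy over all subbands (entropy near the maximum log of the band count), which is what makes the measure useful as a complexity index.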
Han, Yongming; Long, Chang; Geng, Zhiqiang; Zhang, Keyu
2018-01-01
Environmental protection and carbon emission reduction play a crucial role in the sustainable development process. However, environmental efficiency analysis and evaluation based on the traditional data envelopment analysis (DEA) cross model is subjective and inaccurate, because all elements in a column or a row of the cross evaluation matrix (CEM) in the traditional DEA cross model are given the same weight. Therefore, this paper proposes an improved environmental DEA cross model based on information entropy to analyze and evaluate the carbon emissions of industrial departments in China. The information entropy is applied to build the entropy distance based on the turbulence of the whole system, and to calculate the weights in the CEM of the environmental DEA cross model in a dynamic way. The theoretical results show, using Monte Carlo simulation, that the new weight constructed based on information entropy is unique and globally optimal. Finally, compared with the traditional environmental DEA and DEA cross models, the improved environmental DEA cross model has better efficiency discrimination ability based on the data of industrial departments in China. Moreover, the proposed model can determine the potential for carbon emission reduction of industrial departments to improve energy efficiency. Copyright © 2017 Elsevier Ltd. All rights reserved.
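The information-entropy weighting invoked above follows the standard entropy weight method: criteria whose values vary more across decision units carry more information and receive larger weights. A generic sketch (the matrix is invented for illustration; it is not the paper's CEM, and the paper's entropy-distance construction is more elaborate than this baseline):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rows = decision units, columns = criteria."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                       # normalize each criterion column
    n = X.shape[0]
    logs = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)     # entropy of each criterion, in [0, 1]
    d = 1.0 - e                                 # degree of diversification
    return d / d.sum()                          # weights sum to 1

# Illustrative matrix: the second criterion varies much more across units
X = np.array([[10.0, 1.0],
              [10.1, 5.0],
              [ 9.9, 9.0]])
w = entropy_weights(X)
```

The nearly constant first column has entropy close to 1 and thus almost no weight, while the dispersed second column dominates.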
Ishimatsu, N; Takata, M; Nishibori, E; Sakata, M; Hayashi, J; Shirotani, I; Shimomura, O
2002-01-01
The physical properties relating to 4f electrons in cerium phosphide, especially the temperature dependence and the isomorphous transition that occurs at around 10 GPa, were studied by means of x-ray powder diffraction and charge density distribution maps derived by the maximum-entropy method. The compressibility of CeP was exactly determined using a helium pressure medium and the anomaly that indicated the isomorphous transition was observed in the compressibility. We also discuss the anisotropic charge density distribution of Ce ions and its temperature dependence.
Liu, Zhigang; Han, Zhiwei; Zhang, Yang; Zhang, Qiaoge
2014-11-01
Multiwavelets possess better properties than traditional wavelets. Multiwavelet packet transformation has more high-frequency information. Spectral entropy can be applied as an analysis index to the complexity or uncertainty of a signal. This paper tries to define four multiwavelet packet entropies to extract the features of different transmission line faults, and uses a radial basis function (RBF) neural network to recognize and classify 10 fault types of power transmission lines. First, the preprocessing and postprocessing problems of multiwavelets are presented. Shannon entropy and Tsallis entropy are introduced, and their difference is discussed. Second, multiwavelet packet energy entropy, time entropy, Shannon singular entropy, and Tsallis singular entropy are defined as the feature extraction methods of transmission line fault signals. Third, the plan of transmission line fault recognition using multiwavelet packet entropies and an RBF neural network is proposed. Finally, the experimental results show that the plan with the four multiwavelet packet energy entropies defined in this paper achieves better performance in fault recognition. The performance with SA4 (symmetric antisymmetric) multiwavelet packet Tsallis singular entropy is the best among the combinations of different multiwavelet packets and the four multiwavelet packet entropies.
Quantum key distribution with finite resources: Smooth Min entropy vs. Smooth Renyi entropy
Energy Technology Data Exchange (ETDEWEB)
Mertz, Markus; Abruzzo, Silvestre; Bratzik, Sylvia; Kampermann, Hermann; Bruss, Dagmar [Institut fuer Theoretische Physik III, Duesseldorf (Germany)
2010-07-01
We consider different entropy measures that play an important role in the analysis of the security of QKD with finite resources. The smooth min entropy leads to an optimal bound for the length of a secure key. Another bound on the secure key length was derived by using Renyi entropies. Unfortunately, it is very hard or even impossible to calculate these entropies for realistic QKD scenarios. To estimate the security rate it becomes important to find computable bounds on these entropies. Here, we compare a lower bound for the smooth min entropy with a bound using Renyi entropies. We compare these entropies for the six-state protocol with symmetric attacks.
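For intuition about how the quantities compared above relate, the (non-smooth) Rényi entropy of order α interpolates between the Shannon entropy (α → 1) and the min-entropy (α → ∞); the smoothing over nearby states that finite-key analysis actually requires is deliberately omitted in this sketch:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) in bits, for alpha > 0, alpha != 1."""
    p = np.asarray(p, dtype=float)
    return float(np.log2(np.sum(p**alpha)) / (1.0 - alpha))

def min_entropy(p):
    """H_min(p) = -log2 max_i p_i, the alpha -> infinity limit of H_alpha."""
    return float(-np.log2(np.max(p)))

# Illustrative distribution (not from any QKD protocol)
p = np.array([0.5, 0.25, 0.125, 0.125])
```

H_alpha is non-increasing in α, so the min-entropy is the most conservative of the family, which is one reason it yields the tightest secure-key bound.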
DYNAMIC PARAMETER ESTIMATION BASED ON MINIMUM CROSS-ENTROPY METHOD FOR COMBINING INFORMATION SOURCES
Czech Academy of Sciences Publication Activity Database
Sečkárová, Vladimíra
2015-01-01
Roč. 24, č. 5 (2015), s. 181-188 ISSN 0204-9805. [XVI-th International Summer Conference on Probability and Statistics (ISCPS-2014). Pomorie, 21.6.-29.6.2014] R&D Projects: GA ČR GA13-13502S Grant - others:GA UK(CZ) SVV 260225/2015 Institutional support: RVO:67985556 Keywords: minimum cross-entropy principle * Kullback-Leibler divergence * dynamic diffusion estimation Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2015/AS/seckarova-0445817.pdf
The different paths to entropy
International Nuclear Information System (INIS)
Benguigui, L
2013-01-01
In order to understand how the complex concept of entropy emerged, we propose a trip into the past, reviewing the works of Clausius, Boltzmann, Gibbs and Planck. In particular, since Gibbs's work is not very well known we present a detailed analysis, recalling the three definitions of entropy that Gibbs gives. The introduction of entropy in quantum mechanics gives in a compact form all the classical definitions of entropy. Perhaps one of the most important aspects of entropy is to see it as a thermodynamic potential like the others proposed by Callen. The calculation of fluctuations in thermodynamic quantities is thus naturally related to entropy. We close with some remarks on entropy and irreversibility. (paper)
Energy Technology Data Exchange (ETDEWEB)
Xu, Kaixuan, E-mail: kaixuanxubjtu@yeah.net; Wang, Jun
2017-02-26
In this paper, the recently introduced permutation entropy and sample entropy are further developed to the fractional cases, weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional order generalization of information entropy is utilized in the above two complexity approaches to detect the statistical characteristics of fractional order information in complex systems. The effectiveness analysis of the proposed methods on synthetic data and real-world data reveals that tuning the fractional order allows a higher sensitivity and more accurate characterization of the signal evolution, which is useful in describing the dynamics of complex systems. Moreover, the nonlinear complexity behaviors of the returns series of the Potts financial model are numerically compared with those of actual stock markets, and the empirical results confirm the feasibility of the proposed model. - Highlights: • Two new entropy approaches for estimation of nonlinear complexity are proposed for the financial market. • Effectiveness analysis of the proposed methods is presented and their respective features are studied. • Empirical research of the proposed analysis on seven world financial market indices. • Numerical simulation of Potts financial dynamics is performed for nonlinear complexity behaviors.
International Nuclear Information System (INIS)
Xu, Kaixuan; Wang, Jun
2017-01-01
In this paper, the recently introduced permutation entropy and sample entropy are further developed to the fractional cases, weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional order generalization of information entropy is utilized in the above two complexity approaches to detect the statistical characteristics of fractional order information in complex systems. The effectiveness analysis of the proposed methods on synthetic data and real-world data reveals that tuning the fractional order allows a higher sensitivity and more accurate characterization of the signal evolution, which is useful in describing the dynamics of complex systems. Moreover, the nonlinear complexity behaviors of the returns series of the Potts financial model are numerically compared with those of actual stock markets, and the empirical results confirm the feasibility of the proposed model. - Highlights: • Two new entropy approaches for estimation of nonlinear complexity are proposed for the financial market. • Effectiveness analysis of the proposed methods is presented and their respective features are studied. • Empirical research of the proposed analysis on seven world financial market indices. • Numerical simulation of Potts financial dynamics is performed for nonlinear complexity behaviors.
Carricarte Naranjo, Claudia; Sanchez-Rodriguez, Lazaro M; Brown Martínez, Marta; Estévez Báez, Mario; Machado García, Andrés
2017-07-01
Heart rate variability (HRV) analysis is a relevant tool for the diagnosis of cardiovascular autonomic neuropathy (CAN). To our knowledge, no previous investigation on CAN has assessed the complexity of HRV from an ordinal perspective. Therefore, the aim of this work is to explore the potential of permutation entropy (PE) analysis of HRV complexity for the assessment of CAN. For this purpose, we performed a short-term PE analysis of HRV in healthy subjects and type 1 diabetes mellitus patients, including patients with CAN. Standard HRV indicators were also calculated in the control group. A discriminant analysis was used to select the combination of variables with the best discriminative power between the control and CAN patient groups, as well as for classifying cases. We found that for some specific temporal scales, PE indicators were significantly lower in CAN patients than in controls. In such cases, some ordinal patterns had high probabilities of occurrence, while others were hardly found. We posit that this behavior occurs due to a decrease of HRV complexity in the diseased system. Discriminant functions based on PE measures or probabilities of occurrence of ordinal patterns provided average classification accuracies of 75% and 96%, respectively. Correlations of PE and HRV measures were shown to depend only on the temporal scale, regardless of pattern length. PE analysis at some specific temporal scales seems to provide additional information to that obtained with traditional HRV methods. We conclude that PE analysis of HRV is a promising method for the assessment of CAN. Copyright © 2017 Elsevier Ltd. All rights reserved.
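The permutation entropy measure used above can be sketched in a few lines: embed the series with order m, count the ordinal patterns, and normalize the Shannon entropy of their frequencies by log(m!). This is a generic implementation, not the authors' exact code; ties are broken by position and the embedding delay is fixed at 1:

```python
import numpy as np
from collections import Counter
from math import factorial, log

def permutation_entropy(x, m=3):
    """Normalized permutation entropy (order m, delay 1), in [0, 1]."""
    x = np.asarray(x, dtype=float)
    # Each window of length m is mapped to its ordinal pattern
    patterns = Counter(
        tuple(np.argsort(x[i:i + m], kind="stable")) for i in range(len(x) - m + 1)
    )
    total = sum(patterns.values())
    probs = np.array([c / total for c in patterns.values()])
    return float(-(probs * np.log(probs)).sum() / log(factorial(m)))
```

A monotone series produces a single ordinal pattern (entropy 0), while white noise visits all m! patterns almost uniformly (entropy near 1), which is the contrast the clinical analysis exploits.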
Analysis of the anomalous mean-field like properties of Gaussian core model in terms of entropy
Nandi, Manoj Kumar; Maitra Bhattacharyya, Sarika
2018-01-01
Studies of the Gaussian core model (GCM) have shown that it behaves like a mean-field model and the properties are quite different from standard glass former. In this work, we investigate the entropies, namely, the excess entropy (Sex) and the configurational entropy (Sc) and their different components to address these anomalies. Our study corroborates most of the earlier observations and also sheds new light on the high and low temperature dynamics. We find that unlike in standard glass former where high temperature dynamics is dominated by two-body correlation and low temperature by many-body correlations, in the GCM both high and low temperature dynamics are dominated by many-body correlations. We also find that the many-body entropy which is usually positive at low temperatures and is associated with activated dynamics is negative in the GCM suggesting suppression of activation. Interestingly despite the suppression of activation, the Adam-Gibbs (AG) relation that describes activated dynamics holds in the GCM, thus suggesting a non-activated contribution in AG relation. We also find an overlap between the AG relation and mode coupling power law regime leading to a power law behavior of Sc. From our analysis of this power law behavior, we predict that in the GCM the high temperature dynamics will disappear at dynamical transition temperature and below that there will be a transition to the activated regime. Our study further reveals that the activated regime in the GCM is quite narrow.
Directory of Open Access Journals (Sweden)
M. H. Yazdi
2014-01-01
Full Text Available In the present study, the first and second law analyses of power-law non-Newtonian flow over embedded open parallel microchannels within a micropatterned permeable continuous moving surface are examined at prescribed surface temperature. A similarity transformation is used to reduce the governing equations to a set of nonlinear ordinary differential equations. The dimensionless entropy generation number is formulated by an integral of the local rate of entropy generation along the width of the surface, based on an equal number of microchannels and no-slip gaps interspersed between those microchannels. The velocity, the temperature, the velocity gradient, and the temperature gradient adjacent to the wall are substituted into this equation, resulting from the momentum and energy equations obtained numerically by the Dormand-Prince pair and shooting method. Finally, the entropy generation numbers, as well as the Bejan number, are evaluated. It is noted that the presence of shear thinning (pseudoplastic) fluids creates entropy along the surface, with an opposite effect resulting from shear thickening (dilatant) fluids.
Sze, Vivienne; Marpe, Detlev
2014-01-01
Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...
Harikrishnan, K. P.; Misra, R.; Ambika, G.
2009-09-01
We show that the combined use of correlation dimension (D2) and correlation entropy (K2) as discriminating measures can extract more accurate information regarding the different types of noise present in time series data. For this, we make use of an algorithmic approach for computing D2 and K2 proposed by us recently [Harikrishnan KP, Misra R, Ambika G, Kembhavi AK. Physica D 2006;215:137; Harikrishnan KP, Ambika G, Misra R. Mod Phys Lett B 2007;21:129; Harikrishnan KP, Misra R, Ambika G. Pramana - J Phys, in press], which is a modification of the standard Grassberger-Procaccia scheme. While the presence of white noise can be easily identified by computing D2 of the data and surrogates, K2 is a better discriminating measure for detecting colored noise in the data. Analysis of a time series from a real world system involving both white and colored noise is presented as evidence. To our knowledge, this is the first time that such a combined analysis has been undertaken on real world data.
Applications of Entropy in Finance: A Review
Directory of Open Access Journals (Sweden)
Guanqun Tong
2013-11-01
Full Text Available Although the concept of entropy originated in thermodynamics, its concepts and relevant principles, especially the principles of maximum entropy and minimum cross-entropy, have been extensively applied in finance. In this paper, we review the concepts and principles of entropy, as well as their applications in the field of finance, especially in portfolio selection and asset pricing. Furthermore, we review the effects of the applications of entropy and compare them with other traditional and new methods.
Tsallis non-additive entropy and natural time analysis of seismicity
Sarlis, N. V.; Skordas, E. S.; Varotsos, P.
2017-12-01
Within the context of Tsallis non-additive entropy [1] statistical mechanics -in the frame of which kappa distributions arise [2,3]- a derivation of the Gutenberg-Richter (GR) law of seismicity has been proposed [4,5]. Such an analysis leads to a generalized GR law [6,7] which is applied here to the earthquakes in Japan and California. These seismic data are also studied in natural time [6] revealing that although some properties of seismicity may be recovered by the non-additive entropy approach, temporal correlations between successive earthquake magnitudes should be also taken into account [6,8]. The importance of such correlations is strengthened by the observation of periods of long range correlated earthquake magnitude time series [9] a few months before all earthquakes of magnitude 7.6 or larger in the entire Japanese area from 1 January 1984 to 11 March 2011 (the day of the magnitude 9.0 Tohoku-Oki earthquake) almost simultaneously with characteristic order parameter variations of seismicity [10]. These variations appear approximately when low frequency abnormal changes of the electric and magnetic field of the Earth (less than around 1Hz) are recorded [11] before strong earthquakes as the magnitude 9.0 Tohoku-Oki earthquake in Japan in 2011 [12]. 1. C Tsallis, J Stat Phys 52 (1988) 479 2. G Livadiotis, and D J McComas, J Geophys Res 114 (2009) A11105 3. G Livadiotis, Kappa Distributions. (Elsevier, Amsterdam) 2017. doi: 10.1016/B978-0-12-804638-8.01001-9 4. O Sotolongo-Costa, A Posadas, Phys Rev Lett 92 (2004) 048501 5. R Silva, G França, C Vilar, J Alcaniz, Phys Rev E 73 (2006) 026102 6. N Sarlis, E Skordas, P Varotsos, Phys Rev E 82 (2010) 021110 7. L Telesca, Bull Seismol Soc Am 102 (2012) 886-891 8. P Varotsos, N Sarlis, E Skordas, Natural Time Analysis: The new view of time. (Springer, Berlin) 2011. doi: 10.1007/978-3-642-16449-1 9. P Varotsos, N Sarlis, E Skordas, J Geophys Res Space Physics 119 (2014) 9192. 10. N Sarlis, E Skordas, P Varotsos, T
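The non-additive entropy underlying this approach is simple to state. The sketch below (with made-up probabilities, purely for illustration) computes S_q and checks that it recovers the Boltzmann-Gibbs-Shannon entropy in the q → 1 limit:

```python
import numpy as np

def tsallis_entropy(p, q, k=1.0):
    """Tsallis entropy S_q = k (1 - sum_i p_i^q) / (q - 1); S_1 is Shannon."""
    p = np.asarray(p, dtype=float)
    if abs(q - 1.0) < 1e-12:
        return float(-k * np.sum(p * np.log(p)))   # q -> 1 limit (Shannon)
    return float(k * (1.0 - np.sum(p**q)) / (q - 1.0))

# Illustrative distribution, not seismicity data
p = np.array([0.6, 0.3, 0.1])
```

For q != 1 the entropy is non-additive over independent subsystems, which is what allows the generalized Gutenberg-Richter law to capture long-range correlations that the additive form cannot.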
Analysis of Neural Oscillations on Drosophila’s Subesophageal Ganglion Based on Approximate Entropy
Directory of Open Access Journals (Sweden)
Tian Mei
2015-10-01
Full Text Available The suboesophageal ganglion (SOG), which connects to both central and peripheral nerves, is the primary taste-processing center in the Drosophila brain. The neural oscillation in this center may be of great research value, yet it is rarely reported. This work aims to determine the amount of unique information contained within oscillations of the SOG and describe the variability of these patterns. The approximate entropy (ApEn) values of the spontaneous membrane potential (sMP) of SOG neurons were calculated in this paper. The arithmetic mean (MA), standard deviation (SDA) and coefficient of variation (CVA) of ApEn were proposed as three statistical indicators to describe the irregularity and complexity of oscillations. The hierarchical clustering method was used to classify them. As a result, the oscillations in the SOG were divided into five categories: (1) continuous spike pattern; (2) mixed oscillation pattern; (3) spikelet pattern; (4) bursting pattern; and (5) sparse spike pattern. A steady oscillation state has a low level of irregularity, and vice versa. Dopamine stimulation can distinctly reduce the complexity of the mixed oscillation pattern. The current study provides a quantitative method and some criteria for mining the information carried in neural oscillations.
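A compact version of the approximate entropy statistic used above (generic ApEn in Pincus' formulation, with the conventional tolerance r = 0.2·SD; the parameters are the textbook defaults, not necessarily the authors' choices):

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)

    def phi(mm):
        n = len(x) - mm + 1
        emb = np.array([x[i:i + mm] for i in range(n)])
        # Fraction of templates within tolerance r (Chebyshev distance);
        # self-matches are included, so every count is positive.
        counts = [np.mean(np.max(np.abs(emb - emb[i]), axis=1) <= r) for i in range(n)]
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)
```

A regular oscillation yields a low ApEn and an irregular one a high ApEn, which is exactly the irregularity axis along which the five oscillation categories are separated.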
Directory of Open Access Journals (Sweden)
Jinkyu Kim
Full Text Available The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.
Kim, Jinkyu; Kim, Gunn; An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh
2013-01-01
The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.
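The TE construction step can be made concrete with a minimal discrete-state estimator of history length 1 (plug-in probabilities; the coupled binary processes below are synthetic toys, not the economic series analyzed in the paper):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{X->Y} (in nats) for discrete series, history length 1."""
    n = len(x) - 1
    triple = Counter()   # (y_next, y_now, x_now)
    pair_yx = Counter()  # (y_now, x_now)
    pair_yy = Counter()  # (y_next, y_now)
    single = Counter()   # (y_now,)
    for t in range(n):
        triple[(y[t + 1], y[t], x[t])] += 1
        pair_yx[(y[t], x[t])] += 1
        pair_yy[(y[t + 1], y[t])] += 1
        single[y[t]] += 1
    te = 0.0
    for (yn, yc, xc), c in triple.items():
        p_full = c / pair_yx[(yc, xc)]           # p(y_next | y_now, x_now)
        p_hist = pair_yy[(yn, yc)] / single[yc]  # p(y_next | y_now)
        te += (c / n) * np.log(p_full / p_hist)
    return te

# Synthetic coupled binary processes: y copies x with a one-step lag (10% noise)
rng = np.random.default_rng(42)
x = rng.integers(0, 2, 5000)
flip = rng.random(5000) < 0.1
y = np.empty(5000, dtype=int)
y[0] = 0
y[1:] = np.where(flip[1:], 1 - x[:-1], x[:-1])
```

Because the coupling runs only from x to y, the estimated TE is strongly asymmetric, which is the directionality property the paper aggregates across variable pairs.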
Bullwhip Entropy Analysis and Chaos Control in the Supply Chain with Sales Game and Consumer Returns
Directory of Open Access Journals (Sweden)
Wandong Lou
2017-02-01
Full Text Available In this paper, we study a supply chain system which consists of one manufacturer and two retailers, a traditional retailer and an online retailer. In order to gain a larger market share, the retailers often take sales as a decision-making variable in the competition game. We analyze the bullwhip effect in the supply chain with sales game and consumer returns via the theory of entropy and complexity, and take the delayed feedback control method to control the system's chaotic state. The impact of a statutory 7-day no-reason-for-return policy for online retailers is also investigated. Bounded rational expectation is adopted to forecast the future demand in the sales game system with weak noise. Our results show that high return rates will hurt the profits of both retailers and that the adjustment speed of the bounded rational sales expectation has an important impact on the bullwhip effect. There is a stable area for retailers where the bullwhip effect does not appear. The supply chain system suffers a great bullwhip effect in the quasi-periodic state and the quasi-chaotic state. The purpose of chaos control on the sales game can be achieved and the bullwhip effect effectively mitigated by using the delayed feedback control method.
Thermal analysis and entropy generation of pulsating heat pipes using nanofluids
International Nuclear Information System (INIS)
Jafarmadar, Samad; Azizinia, Nazli; Razmara, Nayyer; Mobadersani, Farrokh
2016-01-01
Highlights: • Performance of PHPs containing 0.5% Al2O3, CuO and silver nanofluids is reported. • The rate of entropy generation of PHPs is investigated for different nanofluids. • The effects of particle volume concentration on the entropy generation of PHPs are studied. • The appropriate volume concentration for the best thermal efficiency is 0.5–1%. • Al2O3 and CuO nanofluids show approximately the same rate of entropy generation. - Abstract: The demand for high-performance cooling systems is one of the most challenging and vital issues in industry, and pulsating heat pipes (PHPs) are effective solutions for this concern. Nanofluids have also attracted attention in recent years due to their superior heat transfer properties. In the present study, the flow, heat transfer and entropy generation based on the second law of thermodynamics have been investigated and compared for the flow of Al2O3, CuO and Ag nanofluids and pure water through PHPs. The results show that the silver nanofluid produces the highest entropy generation. Also, the effects of different particle volume concentrations on the heat and flow characteristics of the Al2O3 nanofluid have been studied. It is indicated that the optimal volume concentration of nanoparticles is about 0.5–1% to minimize entropy generation and ensure appropriate thermal operation.
Heat Transfer and Entropy Generation Analysis of an Intermediate Heat Exchanger in ADS
Wang, Yongwei; Huai, Xiulan
2018-04-01
The intermediate heat exchanger for heat transfer enhancement is important equipment in the utilization of nuclear energy. In the present work, the heat transfer and entropy generation of an intermediate heat exchanger (IHX) in the accelerator driven subcritical system (ADS) are investigated experimentally. The variation of the entropy generation number with the performance parameters of the IHX is analyzed, and the effects of the inlet conditions of the IHX on the entropy generation number and heat transfer are discussed. Comparing the results at the two working conditions of constant mass flow rate of liquid lead-bismuth eutectic (LBE) and of helium gas, the total pumping power tends to decrease with decreasing entropy generation number, but the variations of the effectiveness, number of transfer units and thermal capacity rate ratio are inconsistent and need to be analyzed separately. With increasing inlet mass flow rate or LBE inlet temperature, the entropy generation number increases and the heat transfer is enhanced, while the opposite trend occurs with increasing helium gas inlet temperature. Further study is necessary to obtain the optimized operation parameters of the IHX that minimize entropy generation and enhance heat transfer.
Environmental efficiency analysis of power industry in China based on an entropy SBM model
International Nuclear Information System (INIS)
Zhou, Yan; Xing, Xinpeng; Fang, Kuangnan; Liang, Dapeng; Xu, Chunlin
2013-01-01
In order to assess the environmental efficiency of the power industry in China, this paper first proposes a new non-radial DEA approach integrating the entropy weight and the SBM model. This improves the reliability and reasonableness of the assessment. Using the model, this study then evaluates the environmental efficiency of the Chinese power industry at the provincial level during 2005–2010. The results show a marked difference in environmental efficiency of the power industry among Chinese provinces. Although the annual average environmental efficiency level fluctuates, there is an increasing trend. The Tobit regression analysis reveals that the innovation ability of enterprises, the proportion of electricity generated by coal-fired plants and the generation capacity have a significantly positive effect on environmental efficiency. However, the waste fees levied on waste discharge and investment in industrial pollutant treatment are negatively associated with environmental efficiency. - Highlights: ► We assess the environmental efficiency of power industry in China by E-SBM model. ► Environmental efficiency of power industry is different among provinces. ► Efficiency stays at a higher level in the eastern and the western area. ► Proportion of coal-fired plants has a positive effect on the efficiency. ► Waste fees and the investment have a negative effect on the efficiency.
Indian Academy of Sciences (India)
Abstract. It is shown that (i) every probability density is the unique maximizer of relative entropy in an appropriate class and (ii) in the class of all pdf f that satisfy ∫ f h_i dμ = λ_i for i = 1, 2, ..., k, the maximizer of entropy is an f_0 that is proportional to exp(∑ c_i h_i) for some choice of c_i. An extension of this to a continuum of ...
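The characterization above — under moment constraints the entropy maximizer is proportional to exp(∑ c_i h_i) — can be illustrated numerically. The sketch below is a hypothetical example on a discrete support with a single constraint h(x) = x and an assumed target mean λ = 0.7; the coefficient c is found by bisection, which works because the mean of the exponential-family density is monotone in c.

```python
import numpy as np

# Discrete support and a single moment constraint E[h(X)] = lam with
# h(x) = x; the MaxEnt solution has the form f0(x) proportional to exp(c*x).
x = np.linspace(0.0, 1.0, 201)
lam = 0.7                         # target mean (assumed for illustration)

def mean_for(c):
    """Mean of the density proportional to exp(c*x) on the grid."""
    w = np.exp(c * x)
    return (x * w).sum() / w.sum()

# Bisection on c: mean_for is monotone increasing in c.
lo, hi = -50.0, 50.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < lam:
        lo = mid
    else:
        hi = mid
c = 0.5 * (lo + hi)
f0 = np.exp(c * x)
f0 /= f0.sum()                    # normalized MaxEnt density on the grid
```

With k constraints one would solve a k-dimensional system for the c_i (e.g. by Newton's method on the moment map) rather than a scalar bisection.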
Directory of Open Access Journals (Sweden)
Tommaso Toffoli
2016-06-01
Full Text Available Here we deconstruct, and then in a reasoned way reconstruct, the concept of “entropy of a system”, paying particular attention to where the randomness may be coming from. We start with the core concept of entropy as a count associated with a description; this count (traditionally expressed in logarithmic form for a number of good reasons) is in essence the number of possibilities—specific instances or “scenarios”—that match that description. Very natural (and virtually inescapable) generalizations of the idea of description are the probability distribution and its quantum mechanical counterpart, the density operator. We track the process of dynamically updating entropy as a system evolves. Three factors may cause entropy to change: (1) the system’s internal dynamics; (2) unsolicited external influences on it; and (3) the approximations one has to make when one tries to predict the system’s future state. The latter task is usually hampered by hard-to-quantify aspects of the original description, limited data storage and processing resources, and possibly algorithmic inadequacy. Factors 2 and 3 introduce randomness—often huge amounts of it—into one’s predictions and accordingly degrade them. When forecasting, as long as the entropy bookkeeping is conducted in an honest fashion, this degradation will always lead to an entropy increase. To clarify the above point we introduce the notion of honest entropy, which coalesces much of what is of course already done, often tacitly, in responsible entropy-bookkeeping practice. This notion—we believe—will help to fill an expressivity gap in scientific discourse. With its help, we shall prove that any dynamical system—not just our physical universe—strictly obeys Clausius’s original formulation of the second law of thermodynamics if and only if it is invertible. Thus this law is a tautological property of invertible systems!
Damay, Nicolas; Forgez, Christophe; Bichat, Marie-Pierre; Friedrich, Guy
2016-11-01
The entropy-variation of a battery is responsible for heat generation or consumption during operation and its prior measurement is mandatory for developing a thermal model. It is generally done through the potentiometric method which is considered as a reference. However, it requires several days or weeks to get a look-up table with a 5 or 10% SoC (State of Charge) resolution. In this study, a calorimetric method based on the inversion of a thermal model is proposed for the fast estimation of a nearly continuous curve of entropy-variation. This is achieved by separating the heats produced while charging and discharging the battery. The entropy-variation is then deduced from the extracted entropic heat. The proposed method is validated by comparing the results obtained with several current rates to measurements made with the potentiometric method.
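The heat-separation idea can be sketched in a few lines. Under the simplifying assumption that the irreversible (Joule and polarization) heat is the same in charge and discharge at equal current magnitude while the entropic heat only flips sign, the half-sum and half-difference of the measured heats separate the two contributions. All numbers below are hypothetical, not measurements from the study.

```python
import numpy as np

# Hypothetical heats (J) measured calorimetrically over the same SoC
# steps while charging and discharging at the same current magnitude.
q_charge = np.array([12.0, 10.5, 9.8, 11.2])
q_discharge = np.array([15.0, 13.1, 12.0, 14.4])

# Irreversible heat is sign-independent; entropic heat flips sign
# between charge and discharge:
q_irr = 0.5 * (q_charge + q_discharge)
q_ent = 0.5 * (q_discharge - q_charge)

# Entropy variation per SoC step at an isothermal test temperature T:
T = 298.15                       # K
dS = q_ent / T                   # J/K per SoC step
```

The attraction of this approach over the potentiometric method is that one sweep of charge and discharge yields a nearly continuous dS curve, instead of waiting for open-circuit voltage relaxation at each SoC point.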
Al-Abadi, Alaa M; Shahid, Shamsuddin
2015-09-01
In this study, index of entropy and catastrophe theory methods were used for demarcating groundwater potential in an arid region using weighted linear combination techniques in a geographical information system (GIS) environment. A case study from the Badra area in the eastern part of central Iraq was analyzed and discussed. Six factors believed to influence groundwater occurrence, namely elevation, slope, aquifer transmissivity and storativity, soil, and distance to faults, were prepared as raster thematic layers to facilitate integration into the GIS environment. The factors were chosen based on the availability of data and the local conditions of the study area. Both techniques were used for computing the weights and assigning the ranks vital for applying the weighted linear combination approach. The results of the application of both models indicated that the most influential groundwater occurrence factors were slope and elevation. The other factors have relatively smaller weights, implying that they play a minor role in groundwater occurrence. The groundwater potential index (GPI) values for both models were classified using the natural breaks classification scheme into five categories: very low, low, moderate, high, and very high. For validation of the generated GPI, relative operating characteristic (ROC) curves were used. According to the obtained area under the curve, the catastrophe model with 78% prediction accuracy was found to perform better than the entropy model with 77% prediction accuracy. The overall results indicated that both models have good capability for predicting groundwater potential zones.
High-Order Entropy Stable Finite Difference Schemes for Nonlinear Conservation Laws: Finite Domains
Fisher, Travis C.; Carpenter, Mark H.
2013-01-01
Developing stable and robust high-order finite difference schemes requires mathematical formalism and appropriate methods of analysis. In this work, nonlinear entropy stability is used to derive provably stable high-order finite difference methods with formal boundary closures for conservation laws. Particular emphasis is placed on the entropy stability of the compressible Navier-Stokes equations. A newly derived entropy stable weighted essentially non-oscillatory finite difference method is used to simulate problems with shocks and a conservative, entropy stable, narrow-stencil finite difference approach is used to approximate viscous terms.
Introduction to maximum entropy
International Nuclear Information System (INIS)
Sivia, D.S.
1988-01-01
The maximum entropy (MaxEnt) principle has been successfully used in image reconstruction in a wide variety of fields. We review the need for such methods in data analysis and show, by use of a very simple example, why MaxEnt is to be preferred over other regularizing functions. This leads to a more general interpretation of the MaxEnt method, and its use is illustrated with several different examples. Practical difficulties with non-linear problems still remain, this being highlighted by the notorious phase problem in crystallography. We conclude with an example from neutron scattering, using data from a filter difference spectrometer to contrast MaxEnt with a conventional deconvolution. 12 refs., 8 figs., 1 tab
Introduction to maximum entropy
International Nuclear Information System (INIS)
Sivia, D.S.
1989-01-01
The maximum entropy (MaxEnt) principle has been successfully used in image reconstruction in a wide variety of fields. The author reviews the need for such methods in data analysis and shows, by use of a very simple example, why MaxEnt is to be preferred over other regularizing functions. This leads to a more general interpretation of the MaxEnt method, and its use is illustrated with several different examples. Practical difficulties with non-linear problems still remain, this being highlighted by the notorious phase problem in crystallography. He concludes with an example from neutron scattering, using data from a filter difference spectrometer to contrast MaxEnt with a conventional deconvolution. 12 refs., 8 figs., 1 tab
Entropy Analysis of the Coupled Human–Earth System: Implications for Sustainable Development
Directory of Open Access Journals (Sweden)
Weifang Shi
2017-07-01
Full Text Available Finding the basic physical foundation contributing to sustainable development is significantly useful in seeking ways to build an enduring human future. This paper introduces dissipative structure theory to analyze the entropy budgets of the whole coupled human–Earth system and the key processes of its subsystems, and then presents the formulas to calculate these entropy budgets. The results show that the total net negative entropy of the coupled human–Earth system from exchange with space is sufficient, but only about 0.0042% of it is available for sustaining the life activities of the whole coupled system, and even this portion is no more than sufficient for the requirements of human life activities. In addition, the rate of negative entropy consumption by the human subsystem from fossil fuels for sustaining modern civilization is too large, nearly half of the negative entropy rate obtained by photosynthesis on Earth, which indicates that entirely substituting biomass fuels for fossil fuels may be infeasible. Strategies for sustaining human life activities and modern civilization are proposed in the study, which would provide valuable information for humans to realize sustainable development.
Directory of Open Access Journals (Sweden)
George J. A. Jiang
2015-01-01
Full Text Available Electroencephalogram (EEG) signals, which express the human brain’s activities and reflect awareness, have been widely used in research and in medical equipment to build a noninvasive monitoring index of the depth of anesthesia (DOA). The Bispectral (BIS) index monitor is one of the best-known and most important EEG-based indicators used by anesthesiologists when assessing the DOA. In this study, an attempt is made to build a new indicator using EEG signals to provide a more valuable reference for the DOA for clinical researchers. The EEG signals, collected from patients undergoing anesthetic surgery, are filtered using the multivariate empirical mode decomposition (MEMD) method and analyzed using sample entropy (SampEn) analysis. The signals calculated by SampEn are used to train an artificial neural network (ANN) model, with the expert assessment of consciousness level (EACL), given by experienced anesthesiologists, as the target to train, validate, and test the ANN. The results achieved with the proposed system are compared to the BIS index. They show that the proposed system not only has characteristics similar to the BIS index but is also closer to the experienced anesthesiologists’ assessment, illustrating the consciousness level and reflecting the DOA successfully.
Entanglement entropy and differential entropy for massive flavors
International Nuclear Information System (INIS)
Jones, Peter A.R.; Taylor, Marika
2015-01-01
In this paper we compute the holographic entanglement entropy for massive flavors in the D3-D7 system, for arbitrary mass and various entangling region geometries. We show that the universal terms in the entanglement entropy exactly match those computed in the dual theory using conformal perturbation theory. We derive holographically the universal terms in the entanglement entropy for a CFT perturbed by a relevant operator, up to second order in the coupling; our results are valid for any entangling region geometry. We present a new method for computing the entanglement entropy of any top-down brane probe system using Kaluza-Klein holography and illustrate our results with massive flavors at finite density. Finally we discuss the differential entropy for brane probe systems, emphasising that the differential entropy captures only the effective lower-dimensional Einstein metric rather than the ten-dimensional geometry.
Entropy Based Analysis of DNS Query Traffic in the Campus Network
Directory of Open Access Journals (Sweden)
Dennis Arturo Ludeña Romaña
2008-10-01
Full Text Available We carried out an entropy-based study of the DNS query traffic from a university campus network from January 1st, 2006 through March 31st, 2007. The results are summarized as follows: (1) the source IP address- and query keyword-based entropies change symmetrically in the DNS query traffic from outside the campus network when detecting spam bot activity on the campus network; on the other hand, (2) the source IP address- and query keyword-based entropies change similarly to each other when detecting large DNS query traffic caused by prescanning or distributed denial of service (DDoS) attacks from the campus network. Therefore, we can detect spam bots and/or DDoS attack bots simply by watching DNS query traffic.
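The source-IP and keyword entropies used here are plain Shannon entropies of the empirical distribution within a traffic window. A minimal sketch (with invented IP addresses) shows the computation and why concentrating queries on a few sources drives the entropy down:

```python
import math
from collections import Counter

def shannon_entropy(items):
    """Shannon entropy (bits) of the empirical distribution of items."""
    counts = Counter(items)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Hypothetical per-window lists of DNS query source IPs:
normal = ["10.0.0.1", "10.0.0.2", "10.0.0.3", "10.0.0.1", "10.0.0.4"]
skewed = ["10.9.9.9"] * 50 + ["10.0.0.1"]   # one host dominating the window

h_normal = shannon_entropy(normal)
h_skewed = shannon_entropy(skewed)          # much lower: traffic concentrated
```

In a monitoring loop one would compute such entropies per time window for both the source-IP and the query-keyword distributions and alert on sudden joint shifts, which is the essence of the detection scheme described above.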
Comprehensive entropy weight observability-controllability risk ...
African Journals Online (AJOL)
Decision making for water resource planning is often related to social, economic and environmental factors. There are various methods for making decisions about water resource planning alternatives and measures with various shortcomings. A comprehensive entropy weight observability-controllability risk analysis ...
Spatial-dependence recurrence sample entropy
Pham, Tuan D.; Yan, Hong
2018-03-01
Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines, where the analysis of physiological signals is perhaps the most widely reported in the literature. Sample entropy is a popular measure for quantifying signal irregularity. However, the sample entropy does not take sequential information, which is inherently useful, into its calculation of sample similarity. Here, we develop a method that is based on the mathematical principle of the sample entropy and enables the capture of sequential information of a time series in the context of spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
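The baseline sample entropy discussed above is the negative log of the ratio of (m+1)-point to m-point template matches within a tolerance r. The sketch below is a straightforward, unoptimized implementation of that standard construction; the convention r = 0.2·std is a common default, not a parameter taken from this paper.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series: counts pairs of
    templates of length m and m+1 whose Chebyshev distance is below r,
    excluding self-matches, and returns -ln(A/B)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()          # common convention (assumed here)
    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        hits = 0
        for i in range(len(templates) - 1):
            d = np.abs(templates[i + 1:] - templates[i]).max(axis=1)
            hits += int((d < r).sum())
        return hits
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))   # predictable signal
noisy = rng.standard_normal(500)                    # irregular signal
```

A regular signal repeats its templates, so the ratio A/B stays close to one and SampEn is small; white noise rarely extends an m-match to m+1 points, giving a much larger value. The recurrence-plot method of the paper builds on this counting idea while adding spatial-dependence information.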
Directory of Open Access Journals (Sweden)
S. H. Chiang
2016-06-01
Full Text Available Forests are a very important ecosystem and natural resource for living things. Based on forest inventories, governments are able to make decisions to conserve, improve and manage forests in a sustainable way. Field work for forestry investigation is difficult and time consuming because it requires intensive physical labor and the costs are high, especially when surveying in remote mountainous regions. A reliable forest inventory can give us more accurate and timely information to develop new and efficient approaches to forest management. Remote sensing technology has recently been used for forest investigation at a large scale. To produce an informative forest inventory, forest attributes, including tree species, must be considered. The aim of this study is to classify forest tree species in Erdenebulgan County, Huwsgul province, Mongolia, using the Maximum Entropy method. The study area is covered by dense forest comprising almost 70% of the total territorial extension of Erdenebulgan County and is located in a high mountain region in northern Mongolia. For this study, Landsat satellite imagery and a Digital Elevation Model (DEM) were acquired to perform tree species mapping. The forest tree species inventory map was obtained from the Forest Division of the Mongolian Ministry of Nature and Environment as training data and was also used as ground truth for the accuracy assessment of the tree species classification. Landsat images and the DEM were processed for maximum entropy modeling, and the model was applied in two experiments: the first uses Landsat surface reflectance alone for tree species classification; the second incorporates terrain variables in addition to the Landsat surface reflectance. All experimental results were compared with the tree species inventory to assess the classification accuracy. Results show that the second experiment, which uses Landsat surface
Hao Chiang, Shou; Valdez, Miguel; Chen, Chi-Farn
2016-06-01
Forests are a very important ecosystem and natural resource for living things. Based on forest inventories, governments are able to make decisions to conserve, improve and manage forests in a sustainable way. Field work for forestry investigation is difficult and time consuming because it requires intensive physical labor and the costs are high, especially when surveying in remote mountainous regions. A reliable forest inventory can give us more accurate and timely information to develop new and efficient approaches to forest management. Remote sensing technology has recently been used for forest investigation at a large scale. To produce an informative forest inventory, forest attributes, including tree species, must be considered. The aim of this study is to classify forest tree species in Erdenebulgan County, Huwsgul province, Mongolia, using the Maximum Entropy method. The study area is covered by dense forest comprising almost 70% of the total territorial extension of Erdenebulgan County and is located in a high mountain region in northern Mongolia. For this study, Landsat satellite imagery and a Digital Elevation Model (DEM) were acquired to perform tree species mapping. The forest tree species inventory map was obtained from the Forest Division of the Mongolian Ministry of Nature and Environment as training data and was also used as ground truth for the accuracy assessment of the tree species classification. Landsat images and the DEM were processed for maximum entropy modeling, and the model was applied in two experiments: the first uses Landsat surface reflectance alone for tree species classification; the second incorporates terrain variables in addition to the Landsat surface reflectance. All experimental results were compared with the tree species inventory to assess the classification accuracy. Results show that the second experiment, which uses Landsat surface reflectance coupled
Kakemoto, H; Makita, Y; Kino, Y; Tsukamoto, T; Shin, S; Wada, S; Tsurumi, T
2003-01-01
The electronic structure of β-FeSi_2 was investigated by the maximum entropy method (MEM) and photoemission spectroscopy. The electronic structure obtained by MEM using X-ray diffraction data at room temperature (RT) showed covalent bonding of Fe-Si and Si-Si electrons. The photoemission spectra of β-FeSi_2 at RT changed with the incident photon energy. For photon energies between 50 and 100 eV, resonant photoemission spectra caused by a super Coster-Kronig transition were observed. In order to reduce the resonant effect of Fe(3d) in the obtained photoemission spectra, the difference spectrum between 53 and 57 eV was calculated and compared with an ab-initio band calculation and the spectral function.
Relation Entropy and Transferable Entropy Think of Aggregation on Group Decision Making
Institute of Scientific and Technical Information of China (English)
CHENG Qi-yue; QIU Wan-hua; LIU Xiao-feng
2002-01-01
In this paper, the question of aggregation between group decision making and single decision making is studied. The theory of entropy is applied to set pair analysis. The notions of relation entropy and transferable entropy are introduced and their characteristics studied. A potential function based on the relation entropy and the transferable entropy is defined; it serves as a consistency measure between the group and a single decision maker. A new effective definition of aggregation for group misjudgment is obtained.
Schuster, Fabian; Ostermann, Thomas; Emcke, Timo; Schuster, Reinhard
2017-01-01
Diagnostic diversity has been the focus of several studies in health services research. As the fraction of people with statutory health insurance changes with age and gender, it is assumed that diagnostic diversity may be influenced by these parameters. We analyze fractions of patients in Schleswig-Holstein with respect to the chapters of the ICD-10 code in outpatient treatment for quarter 2/2016, by age and gender of the patient. In a first approach we analyzed which diagnosis chapters are most relevant depending on age and gender. To detect diagnostic diversity, we then applied Shannon's entropy measure. Due to multimorbidity we used different standardizations. Shannon entropy increases strongly for women after the age of 15, reaching a limit level at the age of 50 years. Between 15 and 70 years we get higher values for women, after 75 years for men. This article describes a straightforward, pragmatic approach to diagnostic diversity using Shannon's entropy. From a methodological point of view, the use of Shannon's entropy as a measure of diversity should gain more attention from researchers in health services research.
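Shannon's entropy as a diversity measure over ICD-10 chapter frequencies can be sketched directly. Dividing by the log of the number of observed categories is one possible standardization (the paper's exact standardizations are not specified here), and the chapter counts below are invented for illustration:

```python
import math

def shannon_diversity(counts):
    """Shannon entropy (nats) of a count vector, plus its normalized
    form H / ln(k) where k is the number of observed categories."""
    total = sum(counts)
    probs = [n / total for n in counts if n > 0]
    H = -sum(p * math.log(p) for p in probs)
    H_norm = H / math.log(len(probs)) if len(probs) > 1 else 0.0
    return H, H_norm

# Hypothetical ICD-10 chapter counts for two age groups:
young = [120, 5, 3, 2]        # dominated by a single chapter
older = [40, 35, 30, 25]      # spread over several chapters
H_young, _ = shannon_diversity(young)
H_older, _ = shannon_diversity(older)
```

A group whose diagnoses concentrate in one chapter yields low entropy; an evenly spread profile approaches the maximum ln(k), matching the interpretation of rising diversity with age described above.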
Topics in Bayesian statistics and maximum entropy
International Nuclear Information System (INIS)
Mutihac, R.; Cicuttin, A.; Cerdeira, A.; Stanciulescu, C.
1998-12-01
Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)
Straka, Mika J.; Caldarelli, Guido; Squartini, Tiziano; Saracco, Fabio
2018-04-01
Bipartite networks provide an insightful representation of many systems, ranging from mutualistic networks of species interactions to investment networks in finance. The analyses of their topological structures have revealed the ubiquitous presence of properties which seem to characterize many—apparently different—systems. Nestedness, for example, has been observed in biological plant-pollinator as well as in country-product exportation networks. Due to the interdisciplinary character of complex networks, tools developed in one field, for example ecology, can greatly enrich other areas of research, such as economy and finance, and vice versa. With this in mind, we briefly review several entropy-based bipartite null models that have been recently proposed and discuss their application to real-world systems. The focus on these models is motivated by the fact that they show three very desirable features: analytical character, general applicability, and versatility. In this respect, entropy-based methods have been proven to perform satisfactorily both in providing benchmarks for testing evidence-based null hypotheses and in reconstructing unknown network configurations from partial information. Furthermore, entropy-based models have been successfully employed to analyze ecological as well as economic systems. As an example, the application of entropy-based null models has detected early-warning signals, both in economic and financial systems, of the 2007-2008 world crisis. Moreover, they have revealed a statistically-significant export specialization phenomenon of country export baskets in international trade, a result that seems to reconcile Ricardo's hypothesis in classical economics with recent findings on the (empirical) diversification of industrial production at the national level. Finally, these null models have shown that the information contained in the nestedness is already accounted for by the degree sequence of the corresponding graphs.
Logarithmic black hole entropy corrections and holographic Renyi entropy
Energy Technology Data Exchange (ETDEWEB)
Mahapatra, Subhash [The Institute of Mathematical Sciences, Chennai (India); KU Leuven - KULAK, Department of Physics, Kortrijk (Belgium)
2018-01-15
The entanglement and Renyi entropies for spherical entangling surfaces in CFTs with gravity duals can be explicitly calculated by mapping these entropies first to the thermal entropy on hyperbolic space and then, using the AdS/CFT correspondence, to the Wald entropy of topological black holes. Here we extend this idea by taking into account corrections to the Wald entropy. Using the method based on horizon symmetries and the asymptotic Cardy formula, we calculate corrections to the Wald entropy and find that these corrections are proportional to the logarithm of the area of the horizon. With the corrected expression for the entropy of the black hole, we then find corrections to the Renyi entropies. We calculate these corrections for both Einstein and Gauss-Bonnet gravity duals. Corrections with logarithmic dependence on the area of the entangling surface naturally occur at the order G_D^0. The entropic c-function and the inequalities of the Renyi entropy are also satisfied even with the correction terms. (orig.)
Logarithmic black hole entropy corrections and holographic Renyi entropy
International Nuclear Information System (INIS)
Mahapatra, Subhash
2018-01-01
The entanglement and Renyi entropies for spherical entangling surfaces in CFTs with gravity duals can be explicitly calculated by mapping these entropies first to the thermal entropy on hyperbolic space and then, using the AdS/CFT correspondence, to the Wald entropy of topological black holes. Here we extend this idea by taking into account corrections to the Wald entropy. Using the method based on horizon symmetries and the asymptotic Cardy formula, we calculate corrections to the Wald entropy and find that these corrections are proportional to the logarithm of the area of the horizon. With the corrected expression for the entropy of the black hole, we then find corrections to the Renyi entropies. We calculate these corrections for both Einstein and Gauss-Bonnet gravity duals. Corrections with logarithmic dependence on the area of the entangling surface naturally occur at the order G_D^0. The entropic c-function and the inequalities of the Renyi entropy are also satisfied even with the correction terms. (orig.)
Entropy and convexity for nonlinear partial differential equations.
Ball, John M; Chen, Gui-Qiang G
2013-12-28
Partial differential equations are ubiquitous in almost all applications of mathematics, where they provide a natural mathematical description of many phenomena involving change in physical, chemical, biological and social processes. The concept of entropy originated in thermodynamics and statistical physics during the nineteenth century to describe the heat exchanges that occur in the thermal processes in a thermodynamic system, while the original notion of convexity is for sets and functions in mathematics. Since then, entropy and convexity have become two of the most important concepts in mathematics. In particular, nonlinear methods via entropy and convexity have been playing an increasingly important role in the analysis of nonlinear partial differential equations in recent decades. This opening article of the Theme Issue is intended to provide an introduction to entropy, convexity and related nonlinear methods for the analysis of nonlinear partial differential equations. We also provide a brief discussion about the content and contributions of the papers that make up this Theme Issue.
Entropy of balance - some recent results
Directory of Open Access Journals (Sweden)
Laxåback Gerd
2010-07-01
Full Text Available Abstract Background Entropy, when applied to biological signals, is expected to reflect the state of the biological system. However, the physiological interpretation of entropy is not always straightforward. When should high entropy be interpreted as a healthy sign, and when as a marker of deteriorating health? We address this question for the particular case of human standing balance and Center of Pressure data. Methods We measured and analyzed balance data of 136 participants (young, n = 45; elderly, n = 91) comprising in all 1085 trials, and calculated the Sample Entropy (SampEn) for medio-lateral (M/L) and anterior-posterior (A/P) Center of Pressure (COP) together with the Hurst self-similarity (ss) exponent α using Detrended Fluctuation Analysis (DFA). The COP was measured with a force plate in eight 30-second trials with eyes closed, eyes open, foam, self-perturbation and nudge conditions. Results (1) There is a significant difference in SampEn for the A/P-direction between the elderly and the younger groups, old > young. (2) For the elderly we have in general A/P > M/L. (3) For the younger group there was no significant A/P-M/L difference, with the exception of the nudge trials, where we had the reverse situation, A/P < M/L. (4) ... Eyes Open. (5) In the case of the Hurst ss-exponent we have, for the elderly, M/L > A/P. Conclusions These results seem to require some modifications of the more or less established attention-constraint interpretation of entropy. This holds that higher entropy correlates with a more automatic and less constrained mode of balance control, and that higher entropy reflects, in this sense, more efficient balancing.
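The Hurst self-similarity exponent α via DFA used above can be sketched as follows: integrate the centered signal, detrend it linearly in non-overlapping windows of increasing size, and fit the log-log slope of the residual fluctuation against window size. The window sizes and test signal below are illustrative; white noise should give α near 0.5.

```python
import numpy as np

def dfa_alpha(x, scales=None):
    """First-order detrended fluctuation analysis: returns the slope
    of log F(n) versus log n, i.e. the self-similarity exponent."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                  # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(2, 7, 12, base=2).astype(int))
    F = []
    for n in scales:
        k = len(y) // n
        segs = y[:k * n].reshape(k, n)           # non-overlapping windows
        t = np.arange(n)
        ms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)         # local linear trend
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))           # rms fluctuation at scale n
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(1)
white = rng.standard_normal(4096)
alpha = dfa_alpha(white)
```

Persistent (trend-following) signals give α above 0.5 and anti-persistent ones below, which is how the ss-exponent complements SampEn in characterizing COP sway.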
International Nuclear Information System (INIS)
Hudetz, T.
1989-01-01
As a 'by-product' of the Connes-Narnhofer-Thirring theory of dynamical entropy for (originally non-Abelian) nuclear C*-algebras, the well-known variational principle for topological entropy is equivalently reformulated in purely algebraically defined terms for (separable) Abelian C*-algebras. This 'algebraic variational principle' should not only nicely illustrate the 'feed-back' of methods developed for quantum dynamical systems to the classical theory, but it could also be proved directly by 'algebraic' methods and could thus further simplify the original proof of the variational principle (at least 'in principle'). 23 refs. (Author)
Entropy statistics and information theory
Frenken, K.; Hanusch, H.; Pyka, A.
2007-01-01
Entropy measures provide important tools to indicate variety in distributions at particular moments in time (e.g., market shares) and to analyse evolutionary processes over time (e.g., technical change). Importantly, entropy statistics are suitable to decomposition analysis, which renders the
Upper entropy axioms and lower entropy axioms
International Nuclear Information System (INIS)
Guo, Jin-Li; Suo, Qi
2015-01-01
The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely upper entropy axioms, inspired by the axioms of metric spaces, and also formulate lower entropy axioms. We also develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of the Shannon–Khinchin axioms and the Tsallis axioms, while stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case satisfying our axioms. Moreover, different forms of information measures, such as Shannon entropy, Daroczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics.
Directory of Open Access Journals (Sweden)
Jiaxin Lu
2017-10-01
Full Text Available Implementation of a hybrid energy system (HES) is generally considered a promising way to satisfy the electrification requirements of remote areas. In the present study, a novel decision-making methodology is proposed to identify the best compromise configuration of an HES from a set of feasible combinations obtained from HOMER. For this purpose, a multi-objective function comprising four crucial and representative indices is formulated by applying the weighted sum method. The entropy weight method is employed as a quantitative methodology for calculating the weighting factors, to enhance the objectivity of decision-making. Moreover, the optimal design of a stand-alone PV/wind/battery/diesel HES in Yongxing Island, China, is conducted as a case study to validate the effectiveness of the proposed method. Both the simulation and optimization results indicate that the optimization method is able to identify the best trade-off configuration among system reliability, economy, practicability and environmental sustainability. Several useful conclusions are drawn by analyzing the operation of the best configuration.
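The entropy weight method mentioned above derives criterion weights from the data themselves: criteria on which the alternatives differ more carry more information and get larger weights. A minimal sketch follows (the helper name is hypothetical; it assumes positive entries and omits the benefit/cost min-max normalization a production implementation would typically apply first):

```python
import math

def entropy_weights(decision_matrix):
    """Each column is a criterion, each row an alternative (entries > 0).
    Returns one weight per criterion, summing to 1."""
    m = len(decision_matrix)        # number of alternatives
    n = len(decision_matrix[0])     # number of criteria
    divergences = []
    for j in range(n):
        col = [row[j] for row in decision_matrix]
        total = sum(col)
        p = [v / total for v in col]
        # Normalized Shannon entropy of the column, in [0, 1].
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        divergences.append(1.0 - e)  # degree of divergence
    s = sum(divergences)
    return [d / s for d in divergences]
```

A criterion on which all alternatives score identically has maximal entropy and therefore (near-)zero weight, which is exactly the objectivity argument made in the abstract.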
Notes on entanglement entropy in string theory
International Nuclear Information System (INIS)
He, Song; Numasawa, Tokiro; Takayanagi, Tadashi; Watanabe, Kento
2015-01-01
In this paper, we study the conical entropy in string theory in the simplest setup of dividing the nine dimensional space into two halves. This corresponds to the leading quantum correction to the horizon entropy in string theory on the Rindler space. This entropy is also called the conical entropy and includes surface term contributions. We first derive a new simple formula of the conical entropy for any free higher spin fields. Then we apply this formula to computations of conical entropy in open and closed superstring. In our analysis of closed string, we study the twisted conical entropy defined by making use of string theory on Melvin backgrounds. This quantity is easier to calculate owing to the folding trick. Our analysis shows that the conical entropy in closed superstring is UV finite owing to the string scale cutoff.
Directory of Open Access Journals (Sweden)
Pei-Feng Lin
Full Text Available The heart begins to beat before the brain is formed. Whether conventional hierarchical central commands sent by the brain to the heart alone explain all the interplay between these two organs should be reconsidered. Here, we demonstrate correlations between the signal complexity of brain and cardiac activity. Eighty-seven geriatric outpatients with healthy hearts and varied cognitive abilities each provided a 24-hour electrocardiography (ECG) and a 19-channel eye-closed routine electroencephalography (EEG). Multiscale entropy (MSE) analysis was applied to three epochs (resting-awake state, photic stimulation of fast frequencies (fast-PS), and photic stimulation of slow frequencies (slow-PS)) of EEG in the 1-58 Hz frequency range, and to three RR interval (RRI) time series (awake state, sleep, and that concomitant with the EEG) for each subject. The low-to-high frequency power (LF/HF) ratio of the RRI was calculated to represent sympatho-vagal balance. With statistics after Bonferroni corrections, we found that: (a) the summed MSE value on coarse scales of the awake RRI (scales 11-20; RRI-MSE-coarse) was inversely correlated with the summed MSE value on coarse scales of the resting-awake EEG (scales 6-20; EEG-MSE-coarse) at Fp2, C4, T6 and T4; (b) the awake RRI-MSE-coarse was inversely correlated with the fast-PS EEG-MSE-coarse at O1, O2 and C4; (c) the sleep RRI-MSE-coarse was inversely correlated with the slow-PS EEG-MSE-coarse at Fp2; (d) the RRI-MSE-coarse and the LF/HF ratio of the awake RRI were positively correlated with each other; (e) the EEG-MSE-coarse at F8 was proportional to the cognitive test score; (f) the results conform to the cholinergic hypothesis, which states that cognitive impairment causes a reduction in vagal cardiac modulation; (g) fast-PS significantly lowered the EEG-MSE-coarse globally. Whether these heart-brain correlations could be fully explained by the central autonomic network is unknown and needs further exploration.
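The "summed MSE value on coarse scales" above is built by applying a single-scale entropy to progressively coarse-grained copies of the series. A minimal sketch of the standard MSE coarse-graining step (names hypothetical; any single-scale entropy, e.g. SampEn, can be plugged in as entropy_fn):

```python
def coarse_grain(x, scale):
    """Standard MSE coarse-graining: averages over non-overlapping
    windows of the given scale (any remainder is dropped)."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def multiscale_profile(x, entropy_fn, scales):
    """Entropy of the coarse-grained series at each scale; summing the
    returned values over the coarse scales gives a summed-MSE index."""
    return [entropy_fn(coarse_grain(x, s)) for s in scales]
```

At scale s the series shortens by a factor of s, which is why coarse scales (e.g. 11-20 for a 24-hour RRI record) demand long recordings.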
Directory of Open Access Journals (Sweden)
Yonghua You
2018-06-01
Full Text Available In the current work, a novel 2D numerical model of stationary grids was developed for reciprocating magnetic refrigerators with Gd plates, in which the magneto-caloric properties, derived from the Weiss molecular field theory, were adopted for the built-in energy source of the magneto-caloric effect. The numerical simulation was conducted under different structural and operational parameters, and the effects of the relative fluid displacement (φ) on the specific refrigeration capacity (qref) and the Coefficient of Performance (COP) were obtained. In addition, the variations of the entropy generation rate and entropy generation number were studied, and the contours of the local entropy generation rate are presented for discussion. From the current work, it is found that with an increase in φ, both qref and COP follow a convex variation trend, while the entropy generation number (Ns) varies concavely. For the current cases, the maximal qref and COP were 151.2 kW/m³ and 9.11, respectively, while the lowest Ns was 2.4 × 10⁻⁴ K⁻¹. However, the optimal φ for the largest qref and COP and for the lowest Ns were inconsistent; thus, some compromises need to be made in the optimization of magnetic refrigerators.
International Nuclear Information System (INIS)
Nuno Almirantearena, F; Introzzi, A; Clara, F; Burillo Lopez, P
2007-01-01
In this work we use 53 Arterial Diameter Variation (ADV) waves extracted from the radial artery of normotensive males, along with the values of variables that represent the ADV wave, obtained by means of multivariate analysis. Then, we specify the linguistic variables and the linguistic terms. The variables are fuzzified using triangular and trapezoidal fuzzy numbers. We analyze the fuzziness of the linguistic terms by applying discrete and continuous fuzzy entropies. Finally, we infer which variable presents the greatest disorder associated with the loss of arterial elasticity in the radial artery.
Controlling the Shannon Entropy of Quantum Systems
Xing, Yifan; Wu, Jun
2013-01-01
This paper proposes a new quantum control method which controls the Shannon entropy of quantum systems. For both discrete and continuous entropies, controller design methods are proposed based on probability density function control, which can drive the quantum state to any target state. To drive the entropy to any target at any prespecified time, another discretization method is proposed for the discrete entropy case, and the conditions under which the entropy can be increased or decreased are discussed. Simulations are done on both two- and three-dimensional quantum systems, where division and prediction are used to achieve more accurate tracking. PMID:23818819
Controlling the Shannon Entropy of Quantum Systems
Directory of Open Access Journals (Sweden)
Yifan Xing
2013-01-01
Full Text Available This paper proposes a new quantum control method which controls the Shannon entropy of quantum systems. For both discrete and continuous entropies, controller design methods are proposed based on probability density function control, which can drive the quantum state to any target state. To drive the entropy to any target at any prespecified time, another discretization method is proposed for the discrete entropy case, and the conditions under which the entropy can be increased or decreased are discussed. Simulations are done on both two- and three-dimensional quantum systems, where division and prediction are used to achieve more accurate tracking.
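The idea of driving the entropy to a prespecified target can be illustrated classically on a probability vector (a stand-in for the measurement distribution of a quantum state; the mixing scheme and function names are illustrative, not the controller proposed in the paper). Mixing toward the uniform distribution increases Shannon entropy monotonically, so a bisection on the mixing weight reaches any reachable target:

```python
import math

def shannon_entropy(p):
    """Discrete Shannon entropy in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mix_to_target_entropy(p, h_target, tol=1e-10):
    """Find a mixing weight lam so that (1-lam)*p + lam*uniform has
    entropy h_target. Valid because entropy is strictly increasing
    along the straight line from p toward the uniform distribution."""
    n = len(p)
    uniform = [1.0 / n] * n
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        lam = (lo + hi) / 2
        q = [(1 - lam) * pi + lam * ui for pi, ui in zip(p, uniform)]
        if shannon_entropy(q) < h_target:
            lo = lam   # need more mixing to raise entropy
        else:
            hi = lam
    lam = (lo + hi) / 2
    return [(1 - lam) * pi + lam * ui for pi, ui in zip(p, uniform)]
```

The target must lie between the entropy of p and log2(n); decreasing entropy requires a different (sharpening) control direction, mirroring the increase/decrease conditions discussed in the abstract.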
Information Entropy Measures for Stand Structural Diversity:Joint Entropy
Institute of Scientific and Technical Information of China (English)
Lei Xiangdong; Lu Yuanchang
2004-01-01
Structural diversity is a key attribute of a stand. A set of biodiversity measures from ecology has been introduced into forest management for describing stand structure, of which Shannon information entropy (the Shannon index) has been the most widely used measure of species diversity. It is generally thought that tree size diversity could serve as a good proxy for height diversity. However, tree size diversity and height diversity for stand structure are not completely consistent: stand diameter cannot fully reflect height information. Either tree size diversity or height diversity alone is a one-dimensional information entropy measure. This paper discusses a method of multi-dimensional information entropy measurement using the concept of joint entropy. It is suggested that joint entropy is a good measure for describing overall stand structural diversity.
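The joint entropy proposed above is the ordinary Shannon joint entropy applied to observed (diameter-class, height-class) pairs; each distinct pair is treated as one symbol. A minimal sketch:

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of a sample of class labels."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def joint_entropy(pairs):
    """Joint entropy H(D, H) of (diameter-class, height-class) pairs."""
    return entropy([tuple(p) for p in pairs])
```

When height is fully determined by diameter the joint entropy collapses to the one-dimensional diameter entropy, and under independence it is the sum of the two marginal entropies; in between, it captures exactly the structural information that either one-dimensional measure misses.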
Parsani, Matteo; Carpenter, Mark H.; Fisher, Travis C.; Nielsen, Eric J.
2016-01-01
set of points from tensor product, Legendre--Gauss--Lobatto (LGL), to a combination of tensor product Legendre--Gauss (LG) and LGL points. The new semidiscrete operators discretely conserve mass, momentum, energy, and satisfy a mathematical entropy
Directory of Open Access Journals (Sweden)
Lei Chen
2017-06-01
Full Text Available Event-based runoff–pollutant relationships have been the key for water quality management, but the scarcity of measured data results in poor model performance, especially for multiple rainfall events. In this study, a new framework was proposed for event-based non-point source (NPS prediction and evaluation. The artificial neural network (ANN was used to extend the runoff–pollutant relationship from complete data events to other data-scarce events. The interpolation method was then used to solve the problem of tail deviation in the simulated pollutographs. In addition, the entropy method was utilized to train the ANN for comprehensive evaluations. A case study was performed in the Three Gorges Reservoir Region, China. Results showed that the ANN performed well in the NPS simulation, especially for light rainfall events, and the phosphorus predictions were always more accurate than the nitrogen predictions under scarce data conditions. In addition, peak pollutant data scarcity had a significant impact on the model performance. Furthermore, these traditional indicators would lead to certain information loss during the model evaluation, but the entropy weighting method could provide a more accurate model evaluation. These results would be valuable for monitoring schemes and the quantitation of event-based NPS pollution, especially in data-poor catchments.
Directory of Open Access Journals (Sweden)
Shuan-Feng Zhao
2017-01-01
Full Text Available In driver fatigue monitoring technology, the essence is to capture and analyze driver behavior information, such as eye, face, heart, and EEG activity during driving. However, ECG and EEG monitoring are limited by the need to install electrodes and are not commercially available. The most common fatigue detection method is the analysis of driver behavior, that is, to determine whether the driver is tired by recording and analyzing the behavior characteristics of the steering wheel and brake. The driver usually adjusts his or her actions based on the observed road conditions, so the road path information is directly contained in the vehicle driving state; to judge the driver's behavior from vehicle driving state information, the first task is to remove the road information from the vehicle driving state data. Therefore, considering the characteristics of the frequency distribution of road and vehicle information and the unsteady, nonlinear characteristics of the closed-loop driver-vehicle system, this paper proposes an effective method, based on approximate entropy, for selecting intrinsic mode functions from the empirical mode decomposition of vehicle driving state data. The objective is to extract the effective component of the driving behavior information and to weaken the road information component. Finally, the effectiveness of the proposed method is verified by simulated driving experiments.
International Nuclear Information System (INIS)
Yin, Chukai; Su, Bingjing
2001-01-01
Minerbo's maximum entropy Eddington factor (MEEF) method was proposed as a low-order approximation to transport theory, in which the first two moment equations are closed for the scalar flux Φ and the current F through a statistically derived nonlinear Eddington factor f. This closure has the ability to handle various degrees of anisotropy of the angular flux and is well justified both numerically and theoretically. Thus, a lot of effort has been made to use this approximation in transport computations, especially in the radiative transfer and astrophysics communities. However, the method suffers numerical instability and may lead to anomalous solutions if the equations are solved by certain commonly used (implicit) mesh schemes. Studies on numerical stability in one-dimensional cases show that the MEEF equations can be solved satisfactorily by an implicit scheme (of treating ∂Φ/∂x) if the angular flux is not too anisotropic so that f […]. Figures 1 and 2 show the exact transport solution S_32, the classic diffusion solution P_1, the MEEF solution f_M obtained by Riemann solvers, and the NFLD solution D_M for the two problems, respectively. In Fig. 1, NFLD and MEEF quantitatively predict very close results. However, the NFLD solution is qualitatively better because it is continuous, while MEEF predicts unphysical jumps near the middle of the slab. In Fig. 2, the NFLD and MEEF solutions are almost identical, except near the material interface. In summary, the flux-limited diffusion theory derived from the MEEF description is quantitatively as accurate as the MEEF method. However, it is more qualitatively correct and user-friendly than the MEEF method and can be applied efficiently to various steady-state problems. Numerical tests show that this method is widely valid and overall predicts better results than other low-order approximations for various kinds of problems, including eigenvalue problems. Thus, it is an appealing approximate solution technique that is fast computationally and yet is accurate enough for a
Refined multiscale fuzzy entropy based on standard deviation for biomedical signal analysis.
Azami, Hamed; Fernández, Alberto; Escudero, Javier
2017-11-01
Multiscale entropy (MSE) has been a prevalent algorithm to quantify the complexity of biomedical time series. Recent developments in the field have tried to alleviate the problem of undefined MSE values for short signals. Moreover, there has been a recent interest in using other statistical moments than the mean, i.e., variance, in the coarse-graining step of the MSE. Building on these trends, here we introduce the so-called refined composite multiscale fuzzy entropy based on the standard deviation (RCMFE_σ) and mean (RCMFE_μ) to quantify the dynamical properties of spread and mean, respectively, over multiple time scales. We demonstrate the dependency of the RCMFE_σ and RCMFE_μ, in comparison with other multiscale approaches, on several straightforward signal processing concepts using a set of synthetic signals. The results evidenced that the RCMFE_σ and RCMFE_μ values are more stable and reliable than the classical multiscale entropy ones. We also inspect the ability of using the standard deviation as well as the mean in the coarse-graining process using magnetoencephalograms in Alzheimer's disease and publicly available electroencephalograms recorded from focal and non-focal areas in epilepsy. Our results indicated that when the RCMFE_μ cannot distinguish different types of dynamics of a particular time series at some scale factors, the RCMFE_σ may do so, and vice versa. The results showed that RCMFE_σ-based features lead to higher classification accuracies in comparison with the RCMFE_μ-based ones. We also made freely available all the Matlab codes used in this study at http://dx.doi.org/10.7488/ds/1477.
Improved entropy encoding for high efficient video coding standard
Directory of Open Access Journals (Sweden)
B.S. Sunil Kumar
2018-03-01
Full Text Available The High Efficiency Video Coding (HEVC) standard has better coding efficiency, but its encoding performance has to be improved to meet the demands of growing multimedia applications. This paper improves the standard entropy encoding by introducing optimized weighing parameters, so that a higher rate of compression can be accomplished than with the standard entropy encoding. The optimization is performed using the recently introduced firefly algorithm. The experimentation is carried out using eight benchmark video sequences, and the PSNR for varying rates of data transmission is investigated. A comparative analysis based on the performance statistics is made against the standard entropy encoding. From the obtained results, it is clear that the proposed method preserves the originality of the decoded video sequence far better, even though the compression rate is increased. Keywords: Entropy, Encoding, HEVC, PSNR, Compression
Foreign exchange rate entropy evolution during financial crises
Stosic, Darko; Stosic, Dusan; Ludermir, Teresa; de Oliveira, Wilson; Stosic, Tatijana
2016-05-01
This paper examines the effects of financial crises on foreign exchange (FX) markets, where entropy evolution is measured for different exchange rates using the time-dependent block entropy method. Empirical results suggest that financial crises are associated with a significant increase of exchange rate entropy, reflecting instability in FX market dynamics. In accordance with phenomenological expectations, it is found that FX markets with large liquidity and large trading volume are more inert: they recover more quickly from a crisis than markets with small liquidity and small trading volume. Moreover, our numerical analysis shows that periods of economic uncertainty are preceded by periods of low entropy values, which may serve as a tool for anticipating the onset of financial crises.
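The time-dependent block entropy used above can be sketched as a sliding-window Shannon entropy over symbolized returns. The binary sign symbolization and the window/block sizes below are illustrative assumptions, not the paper's exact parameters:

```python
import math
from collections import Counter

def block_entropy(symbols, block_len):
    """Shannon entropy (bits) of overlapping blocks of length block_len."""
    blocks = [tuple(symbols[i:i + block_len])
              for i in range(len(symbols) - block_len + 1)]
    n = len(blocks)
    counts = Counter(blocks)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_evolution(returns, window=100, block_len=3):
    """Time-dependent block entropy: symbolize returns by sign,
    then slide a window over the symbol sequence."""
    symbols = [1 if r > 0 else 0 for r in returns]
    return [block_entropy(symbols[t:t + window], block_len)
            for t in range(len(symbols) - window + 1)]
```

Plotting the returned sequence against time gives the entropy-evolution curve; a crisis would show up as a jump toward the maximum of block_len bits (for binary symbols).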
Wavelet bidomain sample entropy analysis to predict spontaneous termination of atrial fibrillation
International Nuclear Information System (INIS)
Alcaraz, Raúl; Rieta, José Joaquín
2008-01-01
The ability to predict if an atrial fibrillation (AF) episode terminates spontaneously or not through non-invasive techniques is a challenging problem of great clinical interest. This fact could avoid useless therapeutic interventions and minimize the risks for the patient. The present work introduces a robust AF prediction methodology carried out by estimating, through sample entropy (SampEn), the atrial activity (AA) organization increase prior to AF termination from the surface electrocardiogram (ECG). This regularity variation appears as a consequence of the decrease in the number of reentries wandering throughout the atrial tissue. AA was obtained from surface ECG recordings by applying a QRST cancellation technique. Next, a robust and reliable classification process for terminating and non-terminating AF episodes was developed, making use of two different wavelet decomposition strategies. Finally, the AA organization both in time and wavelet domains (bidomain) was estimated via SampEn. The methodology was validated using a training set consisting of 20 AF recordings with known termination properties and a test set of 30 recordings. All the training signals and 93.33% of the test set were correctly classified into terminating and sustained AF, obtaining 93.75% sensitivity and 92.86% specificity. It can be concluded that spontaneous AF termination can be reliably and noninvasively predicted by applying wavelet bidomain sample entropy
International Nuclear Information System (INIS)
Deville, J.P.
1998-01-01
Nowadays there are many surface analysis methods, each having its specificity, its qualities, its constraints (for instance, vacuum) and its limits. Costly in time and investment, these methods have to be used deliberately. This article is aimed at non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use, or the answer to a precise question. After recalling the fundamental principles that govern these analysis methods, based on the interaction of radiation (ultraviolet, X-ray) or particles (ions, electrons) with matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easiest to use, and probably the most productive for the analysis of surfaces of industrial materials or samples submitted to treatments in aggressive media. (O.M.)
Directory of Open Access Journals (Sweden)
Xiaokang Kou
2016-01-01
Full Text Available Land surface temperature (LST) plays a major role in the study of surface energy balances. Remote sensing techniques provide ways to monitor LST at large scales. However, due to atmospheric influences, significant missing data exist in LST products retrieved from satellite thermal infrared (TIR) remotely sensed data. Although passive microwaves (PMWs) are able to overcome these atmospheric influences while estimating LST, the data are constrained by low spatial resolution. In this study, to obtain complete and high-quality LST data, the Bayesian Maximum Entropy (BME) method was introduced to merge 0.01° and 0.25° LSTs inversed from MODIS and AMSR-E data, respectively. The result showed that the missing LSTs in cloudy pixels were filled completely, and the availability of the merged LSTs reaches 100%. Because the depths of LST and soil temperature measurements are different, before validating the merged LST, the station measurements were calibrated with an empirical equation between MODIS LST and 0~5 cm soil temperatures. The results showed that the accuracy of merged LSTs increased with the increasing quantity of utilized data, and as the availability of utilized data increased from 25.2% to 91.4%, the RMSEs of the merged data decreased from 4.53 °C to 2.31 °C. In addition, compared with the gap-filling method in which MODIS LST gaps were filled with AMSR-E LST directly, the merged LSTs from the BME method showed better spatial continuity. The different penetration depths of TIR and PMWs may influence fusion performance and still require further studies.
Bruña, Ricardo; Poza, Jesús; Gómez, Carlos; García, María; Fernández, Alberto; Hornero, Roberto
2012-06-01
Alzheimer's disease (AD) is the most common cause of dementia. Over the last few years, a considerable effort has been devoted to exploring new biomarkers. Nevertheless, a better understanding of brain dynamics is still required to optimize therapeutic strategies. In this regard, the characterization of mild cognitive impairment (MCI) is crucial, due to the high conversion rate from MCI to AD. However, only a few studies have focused on the analysis of magnetoencephalographic (MEG) rhythms to characterize AD and MCI. In this study, we assess the ability of several parameters derived from information theory to describe spontaneous MEG activity from 36 AD patients, 18 MCI subjects and 26 controls. Three entropies (Shannon, Tsallis and Rényi entropies), one disequilibrium measure (based on the Euclidean distance, ED) and three statistical complexities (based on the López-Ruiz-Mancini-Calbet complexity, LMC) were used to estimate the irregularity and statistical complexity of MEG activity. Statistically significant differences between AD patients and controls were obtained with all parameters (p […]); a cross-validation procedure was applied. The accuracies reached 83.9% and 65.9% to discriminate AD and MCI subjects from controls, respectively. Our findings suggest that MCI subjects exhibit an intermediate pattern of abnormalities between normal aging and AD. Furthermore, the proposed parameters provide a new description of brain dynamics in AD and MCI.
Directory of Open Access Journals (Sweden)
Aich Walid
2018-01-01
Full Text Available A computational analysis of the natural ventilation process and entropy generation in a 3-D prismatic greenhouse was performed using CFD. The aim of the study is to investigate how buoyancy forces influence air-flow and temperature patterns inside the greenhouse, which has a lower-level opening in its right, heated façade and an upper-level opening near the roof top in the opposite, cooled façade. The bottom and all other walls are assumed to be perfect thermal insulators. The Rayleigh number is the main parameter, varied from 10³ to 10⁶, and the Prandtl number is fixed at Pr = 0.71. Results are reported in terms of particle trajectories, iso-surfaces of temperature, mean Nusselt number, and entropy generation. It has been found that the flow structure is sensitive to the value of the Rayleigh number and that heat transfer increases with increasing this parameter. It has also been noticed that using asymmetric opening positions improves the natural ventilation and facilitates the occurrence of buoyancy-induced upward cross air-flow (low-level supply and upper-level extraction) inside the greenhouse.
Directory of Open Access Journals (Sweden)
Alicja P. Sobańtka
2014-01-01
Full Text Available Extended statistical entropy analysis (eSEA) is used to assess the nitrogen (N) removal performance of the wastewater treatment (WWT) simulation software Benchmarking Simulation Model No. 2 (BSM No. 2). Six simulations with three different types of wastewater are carried out, which vary in the dissolved oxygen concentration (O2,diss.) during the aerobic treatment. N2O emissions generated during denitrification are included in the model. The N-removal performance is expressed as the reduction in statistical entropy, ΔH, compared to the hypothetical reference situation of direct discharge of the wastewater into the river. The parameters chemical and biological oxygen demand (COD, BOD) and suspended solids (SS) are analogously expressed in terms of the reduction of COD, BOD, and SS compared to a direct discharge of the wastewater to the river (ΔEQrest). The cleaning performance is expressed as ΔEQnew, the weighted average of ΔH and ΔEQrest. The results show that ΔEQnew is a more comprehensive indicator of the cleaning performance because, in contrast to the traditional effluent quality index (EQ), it considers the characteristics of the wastewater and includes all N-compounds and their distribution in the effluent, the off-gas, and the sludge. Furthermore, it is demonstrated that realistically expected N2O emissions have only a moderate impact on ΔEQnew.
International Nuclear Information System (INIS)
Kantar, Yeliz Mert; Usta, Ilhan
2008-01-01
In this study, the minimum cross entropy (MinxEnt) principle is applied for the first time to the wind energy field. This principle allows the inclusion of prior information on a wind speed distribution and covers the maximum entropy (MaxEnt) principle, discussed by Li and Li and by Ramirez in their wind power studies, as a special case. The MinxEnt probability density functions (pdfs) derived from the MinxEnt principle are used to determine the diurnal, monthly, seasonal and annual wind speed distributions. A comparison between MinxEnt pdfs defined on the basis of the MinxEnt principle and the Weibull pdf is conducted on wind speed data taken from different sources and measured in various regions. The wind power densities of the considered regions obtained from the Weibull and MinxEnt pdfs are also compared. The results indicate that the pdfs derived from the MinxEnt principle fit a variety of measured wind speed data better than the conventionally applied empirical Weibull pdf. Therefore, it is shown that the MinxEnt principle can be used as an alternative method to estimate both wind distribution and wind power accurately.
Ream, Allen E.; Slattery, John C.; Cizmas, Paul G. A.
2018-04-01
This paper presents a new method for determining the Arrhenius parameters of a reduced chemical mechanism such that it satisfies the second law of thermodynamics. The strategy is to approximate the progress of each reaction in the reduced mechanism from the species production rates of a detailed mechanism by using a linear least squares method. A series of non-linear least squares curve fittings are then carried out to find the optimal Arrhenius parameters for each reaction. At this step, the molar rates of production are written such that they comply with a theorem that provides the sufficient conditions for satisfying the second law of thermodynamics. This methodology was used to modify the Arrhenius parameters for the Westbrook and Dryer two-step mechanism and the Peters and Williams three-step mechanism for methane combustion. Both optimized mechanisms showed good agreement with the detailed mechanism for species mole fractions and production rates of most major species. Both optimized mechanisms showed significant improvement over previous mechanisms in minor species production rate prediction. Both optimized mechanisms produced no violations of the second law of thermodynamics.
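As a point of reference for the fitting step described above, the classical linearized Arrhenius fit (regressing ln k against 1/T) can be written as follows. This is not the paper's constrained non-linear least-squares procedure, only the standard baseline it builds on; names are illustrative:

```python
import math

def fit_arrhenius(temperatures, rate_constants, R=8.314):
    """Least-squares fit of ln k = ln A - Ea/(R*T): an ordinary linear
    regression of ln k against 1/T recovers A (from the intercept)
    and the activation energy Ea (from the slope)."""
    xs = [1.0 / t for t in temperatures]
    ys = [math.log(k) for k in rate_constants]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return math.exp(intercept), -slope * R  # A, Ea
```

The paper's contribution is precisely that such unconstrained fits can yield rate parameters violating the second law; its extra step re-fits the parameters subject to thermodynamic-consistency conditions.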
Methods of Multivariate Analysis
Rencher, Alvin C
2012-01-01
Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit
Sample Entropy-Based Approach to Evaluate the Stability of Double-Wire Pulsed MIG Welding
Directory of Open Access Journals (Sweden)
Ping Yao
2014-01-01
Full Text Available Based on sample entropy, this paper presents a quantitative method to evaluate the current stability in double-wire pulsed MIG welding. First, the sample entropy of current signals with different stability but the same parameters is calculated. The results show that the more stable the current, the smaller the value and the standard deviation of the sample entropy. Second, four parameters (pulse width, peak current, base current, and frequency) are selected for a four-level three-factor orthogonal experiment. The calculation and analysis of the desired signals indicate that sample entropy values are affected by the welding current parameters. A quantitative method based on sample entropy is then proposed. The experimental results show that the method can reliably quantify the welding current stability.
Zan, Hao; Li, Haowei; Jiang, Yuguang; Wu, Meng; Zhou, Weixing; Bao, Wen
2018-06-01
As part of our efforts to further improve regenerative cooling technology for scramjets, experiments on the thermo-acoustic instability dynamic characteristics of flowing hydrocarbon fuel were conducted in horizontal circular tubes under different conditions. The experimental results indicate that there is a developing process from thermo-acoustic stability to instability. To gain a deeper understanding of this developing process, the method of Multi-scale Shannon Wavelet Entropy (MSWE), based on a Wavelet Transform Correlation Filter (WTCF) and Multi-Scale Shannon Entropy (MSE), is adopted in this paper. The results demonstrate that the MSWE method detects the development of thermo-acoustic instability from noise and weak signals well, and that the stability, the developing process, and the instability can be distinguished from one another. These properties render the method particularly powerful for early warning of thermo-acoustic instability of hydrocarbon fuel flowing in scramjet cooling channels. The mass flow rate and the inlet pressure influence the developing process of the thermo-acoustic instability. This investigation of thermo-acoustic instability dynamic characteristics at supercritical pressure based on the wavelet entropy method offers guidance for the control of the scramjet fuel supply, securing stable fuel flow in the regenerative cooling system.
Receiver function estimated by maximum entropy deconvolution
Institute of Scientific and Technical Information of China (English)
吴庆举; 田小波; 张乃铃; 李卫平; 曾融生
2003-01-01
Maximum entropy deconvolution is presented for estimating the receiver function, with maximum entropy as the criterion for determining the auto-correlation and cross-correlation functions. The Toeplitz equations and the Levinson algorithm are used to derive the iterative formula of the error-prediction filter, from which the receiver function is then estimated. During extrapolation, the reflection coefficient is always less than 1, which keeps the maximum entropy deconvolution stable. Maximizing the entropy of the data outside the window increases the resolution of the receiver function. Both synthetic and real seismograms show that maximum entropy deconvolution is an effective time-domain method for measuring receiver functions.
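The Toeplitz/Levinson machinery referred to above can be sketched as the standard Levinson-Durbin recursion (a generic version, not the authors' receiver-function code; the AR(1) autocorrelation in the test is invented for illustration). Note how the reflection coefficients stay below 1 in magnitude, the property the abstract cites for stability:

```python
import numpy as np

def levinson_durbin(r, order):
    """Levinson-Durbin recursion: solve the Toeplitz normal equations for a
    prediction-error filter, given autocorrelation lags r[0..order].
    Returns (filter coefficients a, reflection coefficients k, error power)."""
    a = np.array([1.0])
    err = r[0]
    ks = []
    for m in range(1, order + 1):
        # Correlation not yet explained by the order-(m-1) filter.
        acc = r[m] + np.dot(a[1:], r[m - 1:0:-1])
        k = -acc / err
        ks.append(k)
        a = np.concatenate([a, [0.0]])
        a = a + k * a[::-1]          # order-update of the filter
        err *= (1.0 - k * k)         # prediction-error power shrinks
    return a, np.array(ks), err
```

For an AR(1) series with coefficient 0.5 the recursion recovers the filter (1, -0.5) exactly and the higher-order reflection coefficients vanish.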
Complexity Analysis of Carbon Market Using the Modified Multi-Scale Entropy
Directory of Open Access Journals (Sweden)
Jiuli Yin
2018-06-01
Carbon markets provide a market-based way to reduce climate pollution. Subject to general market regulations, the major existing emission trading markets present complex characteristics. This paper analyzes the complexity of carbon markets using multi-scale entropy, taking China's pilot carbon markets as the example. A moving average is adopted to extract the scales, owing to the short length of the data set. Results show a low level of complexity, suggesting that China's pilot carbon markets are still immature and lack market efficiency. The complexity, however, varies across time scales: China's carbon markets (except for the Chongqing pilot) are more complex over short periods than over the long term. Furthermore, the complexity level in most pilot markets increases as the markets develop, showing an improvement in market efficiency. These results demonstrate that an effective carbon market is required for emission trading to function fully.
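The scale-extraction step described above can be sketched as follows: coarse-grain by a moving average (which, unlike the usual non-overlapping coarse-graining, barely shortens a short record) and track sample entropy across scales. This is a generic illustration with synthetic noise, not the pilot-market data; the parameter choices (m = 2, r = 0.2) are conventional defaults, not the paper's.

```python
import numpy as np

def sampen(x, m=2, r=0.2):
    """Sample entropy; tolerance is r times the series' standard deviation."""
    x = np.asarray(x, float)
    tol = r * x.std()
    def matches(dim):
        tpl = np.lib.stride_tricks.sliding_window_view(x, dim)
        return sum(int(np.sum(np.max(np.abs(tpl[i + 1:] - tpl[i]), axis=1) <= tol))
                   for i in range(len(tpl) - 1))
    return -np.log(matches(m + 1) / matches(m))

def moving_average_mse(x, scales):
    """Multi-scale entropy where each scale is a moving average of the series,
    suited to short records because the series length is barely reduced."""
    return [sampen(np.convolve(x, np.ones(s) / s, mode='valid'))
            for s in scales]
```

For white noise the entropy curve falls with scale, since smoothing removes the fast irregularity; a complex, long-memory series would hold its entropy across scales, which is the signature the paper looks for.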
Zucker, M. H.
This paper is a critical analysis and reassessment of entropic functioning as it applies to the question of whether the ultimate fate of the universe will be determined in the future to be "open" (expanding forever to expire in a big chill), "closed" (collapsing to a big crunch), or "flat" (balanced forever between the two). The second law of thermodynamics declares that entropy can only increase and that this principle extends, inevitably, to the universe as a whole. This paper takes the position that this extension is an unwarranted projection based neither on experience nor fact - an extrapolation that ignores the powerful effect of a gravitational force acting within a closed system. Since it was originally presented by Clausius, the thermodynamic concept of entropy has been redefined in terms of "order" and "disorder" - order being equated with a low degree of entropy and disorder with a high degree. This revised terminology, more subjective than precise, has generated considerable confusion in cosmology in several critical instances. For example - the chaotic fireball of the big bang, interpreted by Stephen Hawking as a state of disorder (high entropy), is infinitely hot and, thermally, represents zero entropy (order). Hawking, apparently focusing on the disorderly "chaotic" aspect, equated it with a high degree of entropy - overlooking the fact that the universe is a thermodynamic system and that the key factor in evaluating the big-bang phenomenon is the infinitely high temperature of the early universe, which can only be equated with zero entropy. This analysis resolves this confusion and reestablishes entropy as a cosmological function integrally linked to temperature. The paper goes on to show that, while all subsystems contained within the universe require external sources of energization to have their temperatures raised, this requirement does not apply to the universe as a whole. The universe is the only system that, by itself, can raise its own
Excess Entropy and Diffusivity
Indian Academy of Sciences (India)
Excess Entropy and Diffusivity. Excess entropy scaling of diffusivity (Rosenfeld, 1977). Analogous relationships also exist for viscosity and thermal conductivity.
THE LICK AGN MONITORING PROJECT: VELOCITY-DELAY MAPS FROM THE MAXIMUM-ENTROPY METHOD FOR Arp 151
International Nuclear Information System (INIS)
Bentz, Misty C.; Barth, Aaron J.; Walsh, Jonelle L.; Horne, Keith; Bennert, Vardha Nicola; Treu, Tommaso; Canalizo, Gabriela; Filippenko, Alexei V.; Gates, Elinor L.; Malkan, Matthew A.; Minezaki, Takeo; Woo, Jong-Hak
2010-01-01
We present velocity-delay maps for optical H I, He I, and He II recombination lines in Arp 151, recovered by fitting a reverberation model to spectrophotometric monitoring data using the maximum-entropy method. H I response is detected over the range 0-15 days, with the response confined within the virial envelope. The Balmer-line maps have similar morphologies but exhibit radial stratification, with progressively longer delays for Hγ to Hβ to Hα. The He I and He II response is confined within 1-2 days. There is a deficit of prompt response in the Balmer-line cores but strong prompt response in the red wings. Comparison with simple models identifies two classes that reproduce these features: free-falling gas and a half-illuminated disk with a hot spot at small radius on the receding lune. Symmetrically illuminated models with gas orbiting in an inclined disk or an isotropic distribution of randomly inclined circular orbits can reproduce the virial structure but not the observed asymmetry. Radial outflows are also largely ruled out by the observed asymmetry. A warped-disk geometry provides a physically plausible mechanism for the asymmetric illumination and hot spot features. Simple estimates show that a disk in the broad-line region of Arp 151 could be unstable to warping induced by radiation pressure. Our results demonstrate the potential power of detailed modeling combined with monitoring campaigns at higher cadence to characterize the gas kinematics and physical processes that give rise to the broad emission lines in active galactic nuclei.
Investigating dynamical complexity in the magnetosphere using various entropy measures
Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Kalimeri, Maria; Anastasiadis, Anastasios; Eftaxias, Konstantinos
2009-09-01
The complex system of the Earth's magnetosphere corresponds to an open, spatially extended, nonequilibrium (input-output) dynamical system. The nonextensive Tsallis entropy has recently been introduced as an appropriate information measure to investigate dynamical complexity in the magnetosphere. The method has been employed for analyzing Dst time series and gave promising results, detecting the complexity dissimilarity among different physiological and pathological magnetospheric states (i.e., prestorm activity and intense magnetic storms, respectively). This paper explores the applicability and effectiveness of a variety of computable entropy measures (e.g., block entropy, Kolmogorov entropy, T complexity, and approximate entropy) for the investigation of dynamical complexity in the magnetosphere. We show that as the magnetic storm approaches there is clear evidence of significantly lower complexity in the magnetosphere. The observed higher degree of organization of the system agrees with that inferred previously from an independent linear fractal spectral analysis based on wavelet transforms. This convergence between nonlinear and linear analyses provides a more reliable detection of the transition from the quiet-time to the storm-time magnetosphere, thus showing evidence that the occurrence of an intense magnetic storm is imminent. More precisely, we claim that our results suggest an important principle: a significant decrease in complexity and an increase in persistence in the Dst time series can be confirmed as the magnetic storm approaches, and these can be used as diagnostic tools for magnetospheric injury (global instability). Overall, approximate entropy and Tsallis entropy yield superior results for detecting dynamical complexity changes in the magnetosphere in comparison to the other entropy measures presented herein. Ultimately, the analysis tools developed in the course of this study for the treatment of the Dst index can provide convenience for space weather
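Of the measures compared above, approximate entropy is the simplest to state; a compact NumPy version is given below (generic, with a synthetic signal standing in for the Dst index; m = 2 and r = 0.2 times the standard deviation are the usual defaults, not values from the paper):

```python
import numpy as np

def approx_entropy(x, m=2, r=0.2):
    """Approximate entropy ApEn(m, r): regularity of a series; lower = more regular."""
    x = np.asarray(x, float)
    tol = r * x.std()
    def phi(dim):
        tpl = np.lib.stride_tricks.sliding_window_view(x, dim)
        # Fraction of templates within tolerance of each template (self-match included).
        c = np.array([np.mean(np.max(np.abs(tpl - t), axis=1) <= tol) for t in tpl])
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)
```

A highly organized signal (e.g., storm-time Dst with strong persistence) gives a low ApEn, while its randomly shuffled counterpart, with the same amplitude distribution but no temporal order, gives a high one.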
Bubble Entropy: An Entropy Almost Free of Parameters.
Manis, George; Aktaruzzaman, Md; Sassi, Roberto
2017-11-01
Objective: A critical point in any definition of entropy is the selection of the parameters employed to obtain an estimate in practice. We propose a new definition of entropy aiming to reduce the significance of this selection. Methods: We call the new definition Bubble Entropy. Bubble Entropy is based on permutation entropy, where the vectors in the embedding space are ranked. We use the bubble sort algorithm for the ordering procedure and count the number of swaps performed for each vector instead. Doing so, we create a more coarse-grained distribution and then compute the entropy of this distribution. Results: Experimental results with both real and synthetic HRV signals showed that bubble entropy presents remarkable stability and exhibits increased descriptive and discriminating power compared to all other definitions, including the most popular ones. Conclusion: The definition proposed is almost free of parameters. The most common ones are the scale factor r and the embedding dimension m. In our definition, the scale factor is totally eliminated and the importance of m is significantly reduced. The proposed method presents increased stability and discriminating power. Significance: After the extensive use of some entropy measures in physiological signals, typical values for their parameters have been suggested, or at least widely used. However, the parameters are still there, application and dataset dependent, influencing the computed value and affecting the descriptive power. Reducing their significance or eliminating them alleviates the problem, decoupling the method from the data and the application, and eliminating subjective factors.
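The swap-counting idea can be sketched directly. This is a simplified reading of the definition: it uses the Shannon entropy of the swap-count distribution, where the authors use a Rényi-2 form and their own normalization, and m = 10 is only in the spirit of "the parameters hardly matter":

```python
import numpy as np

def bubble_count(v):
    """Number of swaps bubble sort needs to order v ascending."""
    v = list(v)
    swaps = 0
    for i in range(len(v) - 1):
        for j in range(len(v) - 1 - i):
            if v[j] > v[j + 1]:
                v[j], v[j + 1] = v[j + 1], v[j]
                swaps += 1
    return swaps

def swap_entropy(x, m):
    """Entropy of the distribution of swap counts over embedded vectors."""
    counts = [bubble_count(x[i:i + m]) for i in range(len(x) - m + 1)]
    p = np.bincount(counts, minlength=m * (m - 1) // 2 + 1) / len(counts)
    p = p[p > 0]
    return -np.sum(p * np.log(p))   # Shannon form; the paper uses Renyi-2

def bubble_entropy(x, m=10):
    """Conditional entropy of swaps when growing the embedding from m to m+1."""
    return (swap_entropy(x, m + 1) - swap_entropy(x, m)) / np.log((m + 1) / (m - 1))
```

A monotone series needs zero swaps everywhere and scores exactly zero, while an irregular series spreads its swap counts and scores positive; note there is no tolerance r anywhere, which is the point of the definition.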
Aziz, Asim; Jamshed, Wasim; Aziz, Taha
2018-04-01
In the present research, a simplified mathematical model for solar thermal collectors is considered in the form of a non-uniform unsteady stretching surface. The non-Newtonian Maxwell nanofluid model is utilized for the working fluid, along with slip and convective boundary conditions, and a comprehensive analysis of entropy generation in the system is also presented. The effects of thermal radiation and variable thermal conductivity are also included in the model. The mathematical formulation is carried out through a boundary layer approach, and the numerical computations are carried out for Cu-water and TiO2-water nanofluids. Results are presented for the velocity, temperature and entropy generation profiles, the skin friction coefficient and the Nusselt number. The discussion concludes with the effect of the various governing parameters on the motion, temperature variation, entropy generation, velocity gradient and rate of heat transfer at the boundary.
Tan, Chao; Zhao, Jia; Dong, Feng
2015-03-01
Flow behavior characterization is important for understanding gas-liquid two-phase flow mechanics and for establishing its description model. Electrical Resistance Tomography (ERT) provides information on flow conditions in the different directions in which the sensing electrodes are installed. We extracted the multivariate sample entropy (MSampEn) by treating ERT data as a multivariate time series. The dynamic experimental results indicate that MSampEn is sensitive to complexity changes of flow patterns, including bubbly flow, stratified flow, plug flow and slug flow. MSampEn can characterize the flow behavior in different directions of a two-phase flow and reveal the transition between flow patterns when the flow velocity changes. The proposed method is effective for analyzing two-phase flow pattern transitions because it incorporates information from different scales and different spatial directions. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Junhai Ma
2016-11-01
In this research, a model is established to represent a supply chain consisting of one manufacturer and two retailers. A price-sensitive demand model is considered, and the price game system is built according to the rule of bounded rationality and entropy theory. As the price adjustment speed increases, the game system may move from the stable and periodic states into chaos. The bullwhip effect and the inventory variance ratio are compared in real time across the different states the system falls into. We also employ the delayed feedback control method to control the system and succeed in mitigating its bullwhip effect. On the whole, the bullwhip effect and the inventory variance ratio in the stable state are smaller than those under period-doubling and chaos. In the stable state, there is an optimal price adjustment speed that yields both the lowest bullwhip effect and the lowest inventory variance ratio.
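The qualitative route to chaos described above can be reproduced with a one-dimensional toy version of such a game: a single bounded-rational player facing linear demand, not the paper's one-manufacturer-two-retailer model, and with invented demand and cost numbers.

```python
import numpy as np

def price_map(p, alpha, a=4.0, b=1.0, c=1.0):
    """Bounded rationality: adjust the price in proportion to the marginal
    profit of a linear-demand market (demand a - b*p, unit cost c)."""
    marginal_profit = a + b * c - 2.0 * b * p
    return p + alpha * p * marginal_profit

def attractor(alpha, p0=2.0, burn=500, keep=64):
    """Distinct long-run prices (rounded) reached at adjustment speed alpha."""
    p = p0
    for _ in range(burn):
        p = price_map(p, alpha)
    orbit = set()
    for _ in range(keep):
        p = price_map(p, alpha)
        orbit.add(round(p, 6))
    return sorted(orbit)
```

Raising the adjustment speed alpha takes the system from a stable equilibrium through period doubling into chaos, mirroring the bifurcation behaviour the abstract describes.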
DEFF Research Database (Denmark)
Olivarius, Signe
of the transcriptome, 5’ end capture of RNA is combined with next-generation sequencing for high-throughput quantitative assessment of transcription start sites by two different methods. The methods presented here allow for functional investigation of coding as well as noncoding RNA and contribute to future...... RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA...
Isaacson, Eugene
1994-01-01
This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.
Mixing, entropy and competition
International Nuclear Information System (INIS)
Klimenko, A Y
2012-01-01
Non-traditional thermodynamics, applied to random behaviour associated with turbulence, mixing and competition, is reviewed and analysed. Competitive mixing represents a general framework for the study of generic properties of competitive systems and can be used to model a wide class of non-equilibrium phenomena ranging from turbulent premixed flames and invasion waves to complex competitive systems. We demonstrate consistency of the general principles of competition with the thermodynamic description, review and analyse the related entropy concepts, and introduce the corresponding competitive H-theorem. A competitive system can be characterized by a thermodynamic quantity—the competitive potential—which determines the likely direction of evolution of the system. Contested resources tend to move between systems from lower to higher values of the competitive potential. There is, however, an important difference between conventional thermodynamics and competitive thermodynamics. While conventional thermodynamics is constrained by its zeroth law and is fundamentally transitive, the transitivity of competitive thermodynamics depends on the transitivity of the competition rules. Intransitivities are common in the real world and are responsible for complex behaviour in competitive systems. This work follows ideas and methods that have originated from the analysis of turbulent combustion, but reviews a much broader scope of issues linked to mixing and competition, including the thermodynamic characterization of complex competitive systems with self-organization. The approach presented here is interdisciplinary and is addressed to general educated readers; the mathematical details can be found in the appendices. (comment)
Energy Technology Data Exchange (ETDEWEB)
Wintermeyer, Niklas [Mathematisches Institut, Universität zu Köln, Weyertal 86-90, 50931 Köln (Germany); Winters, Andrew R., E-mail: awinters@math.uni-koeln.de [Mathematisches Institut, Universität zu Köln, Weyertal 86-90, 50931 Köln (Germany); Gassner, Gregor J. [Mathematisches Institut, Universität zu Köln, Weyertal 86-90, 50931 Köln (Germany); Kopriva, David A. [Department of Mathematics, The Florida State University, Tallahassee, FL 32306 (United States)
2017-07-01
We design an arbitrarily high-order accurate nodal discontinuous Galerkin spectral element approximation for the non-linear two-dimensional shallow water equations with non-constant, possibly discontinuous, bathymetry on unstructured, possibly curved, quadrilateral meshes. The scheme is derived from an equivalent flux differencing formulation of the split form of the equations. We prove that this discretization exactly preserves the local mass and momentum. Furthermore, combined with a special numerical interface flux function, the method exactly preserves the mathematical entropy, which is the total energy for the shallow water equations. By adding a specific form of interface dissipation to the baseline entropy conserving scheme we create a provably entropy stable scheme. That is, the numerical scheme discretely satisfies the second law of thermodynamics. Finally, with a particular discretization of the bathymetry source term we prove that the numerical approximation is well-balanced. We provide numerical examples that verify the theoretical findings and furthermore provide an application of the scheme for a partial break of a curved dam test problem.
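For the 1-D shallow water system (flat bathymetry), an entropy-conservative two-point flux of the kind used as a baseline here can be written with simple arithmetic averages. This is a sketch of the well-known arithmetic-mean construction in one dimension, not the paper's 2-D curvilinear DG scheme; it is consistent with the physical flux and satisfies the discrete entropy identity exactly.

```python
import numpy as np

g = 9.81  # gravitational acceleration

def physical_flux(h, hu):
    """Exact flux of the 1-D shallow water equations, state q = (h, hu)."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

def ec_flux(qL, qR):
    """Entropy-conservative two-point flux built from arithmetic means."""
    hL, uL = qL[0], qL[1] / qL[0]
    hR, uR = qR[0], qR[1] / qR[0]
    h_avg = 0.5 * (hL + hR)
    u_avg = 0.5 * (uL + uR)
    h2_avg = 0.5 * (hL * hL + hR * hR)
    return np.array([h_avg * u_avg,
                     h_avg * u_avg * u_avg + 0.5 * g * h2_avg])
```

With entropy variables v = (g h - u²/2, u) and entropy flux potential ψ = g h² u / 2, the defining property (v_R - v_L)·f* = ψ_R - ψ_L holds to round-off, which is what "exactly preserves the mathematical entropy" means at the flux level.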
Directory of Open Access Journals (Sweden)
Roberto Siciliano
2017-07-01
Everyone is subject to a process of progressive deterioration of the control mechanisms which supervise the complex network of human physiological functions, reducing the individual's ability to adapt to emerging situations of stress or change. In the light of results obtained in recent years, it appears that some of the tools of nonlinear dynamics, first developed for the physical sciences, are well suited to studies of biological systems. We believe that, by considering the level of order or complexity of the anatomical apparatus through the measurement of a physical quantity, the entropy, we can evaluate the health status, or conversely the fragility, of a biological system. In particular, a reduction in the entropy value indicates a modification of the structural order with a progressive reduction of the functional reserve of the individual, which is associated with a failure to adapt to stress conditions that is difficult to analyze and document with a traditional biochemical or biomolecular approach alone. Therefore, in this paper we present a method that conceptually combines complexity, disease and aging, coupling Poisson statistics, predictive of the personal level of health, with the entropy value indicating the bio-dynamic and functional status of the body, seen as a complex and open thermodynamic system.
de La Sierra, Ruben Ulises
The present study introduces entropy mapping as a comprehensive method to analyze and describe complex interactive systems, and to assess the effect that entropy has on paradigm changes as described by transition theory. The dynamics of interactions among environmental, economic and demographic conditions affect a number of fast-growing locations throughout the world. One of the regions especially affected by accelerated growth in terms of demographic and economic development is the border region between Mexico and the US. As the contrast between these countries provides a significant economic and cultural differential, the dynamics of capital, goods, services and people, and the rates at which they interact, are rather unique. To illustrate the most fundamental economic and political changes affecting the region, a background addressing the causes of these changes, leading to the North American Free Trade Agreement (NAFTA), is presented. Although the concept of thermodynamic entropy was first observed in the physical sciences, a relevant homology exists in the biological, social and economic sciences, as the universal tendency towards disorder, dissipation and equilibrium is present in these disciplines when energy or resources become deficient. In information theory, entropy is expressed as uncertainty and randomness, in terms of efficiency in the transmission of information. Although entropy in closed systems is unavoidable, its increase in open systems can be arrested by a flux of energy, resources and/or information. A critical component of all systems is the boundary. If a boundary is impermeable, it will prevent energy flow from the environment into the system; likewise, if the boundary is too porous, it will not be able to prevent the dissipation of energy and resources into the environment, and will not prevent entropy from entering. Therefore, two expressions of entropy--thermodynamic and informational--are identified and related to systems in transition and to spatial
Maximum entropy reconstruction of spin densities involving non uniform prior
International Nuclear Information System (INIS)
Schweizer, J.; Ressouche, E.; Papoular, R.J.; Zheludev, A.I.
1997-01-01
Diffraction experiments give microscopic information on structures in crystals. A method which uses the concept of maximum entropy (MaxEnt) appears to be a formidable improvement in the treatment of diffraction data. This method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one with the highest prior (intrinsic) probability. Considering all the points of the map to be equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution. This method has been used for the reconstruction of charge densities from X-ray data, for maps of nuclear densities from unpolarized neutron data, as well as for distributions of spin density. The density maps obtained by this method, compared to those resulting from the usual inverse Fourier transformation, are tremendously improved. In particular, any substantial deviation from the background is really contained in the data, as it costs entropy compared to a map that would ignore such features. However, in most cases some knowledge exists, before the measurements are performed, about the distribution under investigation. It can range from simple information on the type of scattering electrons to an elaborate theoretical model. In these cases, the uniform prior, which considers all the different pixels as equally likely, is too weak a requirement and has to be replaced. In a rigorous Bayesian analysis, Skilling has shown that prior knowledge can be encoded into the Maximum Entropy formalism through a model m(r), via a new definition for the entropy given in this paper. In the absence of any data, the maximum of the entropy functional is reached for ρ(r) = m(r). Any substantial departure from the model, observed in the final map, is really contained in the data as, with the new definition, it costs entropy. This paper presents illustrations of model testing
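The role of the model m(r) can be demonstrated on a discrete toy problem: maximize S = -Σ ρ ln(ρ/m) under normalization and a single linear constraint. The solution has the exponential form ρ ∝ m·exp(λf), and when the constraint carries no information beyond the prior the method returns ρ = m, exactly the behaviour described above. The grid, model and constraint below are invented for illustration; real density maps involve many pixels and many data constraints.

```python
import numpy as np

def maxent_with_prior(m, f, F, lam_lo=-50.0, lam_hi=50.0, iters=200):
    """Distribution maximizing S = -sum(rho*ln(rho/m)) subject to
    sum(rho) = 1 and sum(rho*f) = F; the solution is rho ∝ m*exp(lam*f),
    with the multiplier lam found here by bisection."""
    m = np.asarray(m, float) / np.sum(m)
    def rho(lam):
        w = m * np.exp(lam * (f - np.max(f)))   # shift exponent for stability
        return w / w.sum()
    # The constrained mean is monotonically increasing in lam: bisect.
    for _ in range(iters):
        mid = 0.5 * (lam_lo + lam_hi)
        if np.dot(rho(mid), f) < F:
            lam_lo = mid
        else:
            lam_hi = mid
    return rho(0.5 * (lam_lo + lam_hi))
```

When F equals the prior mean of f, the multiplier collapses to zero and the output reproduces the model; any other F pulls the map away from m only as far as the datum demands.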
Distance-Based Configurational Entropy of Proteins from Molecular Dynamics Simulations.
Fogolari, Federico; Corazza, Alessandra; Fortuna, Sara; Soler, Miguel Angel; VanSchouwen, Bryan; Brancolini, Giorgia; Corni, Stefano; Melacini, Giuseppe; Esposito, Gennaro
2015-01-01
Estimation of configurational entropy from molecular dynamics trajectories is a difficult task which is often performed using quasi-harmonic or histogram analysis. An entirely different approach, proposed recently, estimates the local density distribution around each conformational sample by measuring the distance from its nearest neighbors. In this work we show that this theoretically well-grounded method can be easily applied to estimate the entropy from conformational sampling. We consider a set of systems that are representative of important biomolecular processes. In particular: reference entropies for amino acids in unfolded proteins are obtained from a database of residues not participating in secondary structure elements; the conformational entropy of folding of β2-microglobulin is computed from molecular dynamics simulations using reference entropies for the unfolded state; backbone conformational entropy is computed from molecular dynamics simulations of four different states of the EPAC protein and compared with order parameters (often used as a measure of entropy); the conformational and rototranslational entropy of binding is computed from simulations of 20 tripeptides bound to the peptide binding protein OppA and of β2-microglobulin bound to a citrate-coated gold surface. This work shows the potential of the method for the most representative biological processes involving proteins, and provides a valuable alternative, particularly in the cases shown, where other approaches are problematic.
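The nearest-neighbour density idea reduces, in its simplest one-dimensional first-neighbour form, to the Kozachenko-Leonenko estimator. The sketch below is that generic estimator, not the authors' full configurational machinery for high-dimensional torsion-angle spaces:

```python
import numpy as np

def knn_entropy_1d(x):
    """Kozachenko-Leonenko nearest-neighbour differential entropy estimate
    (in nats) for 1-D continuous samples."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    gaps = x[1:] - x[:-1]
    left = np.concatenate([[np.inf], gaps])    # distance to left neighbour
    right = np.concatenate([gaps, [np.inf]])   # distance to right neighbour
    r = np.minimum(left, right)                # nearest-neighbour distance
    euler_gamma = 0.5772156649015329
    # d * mean(ln r) + ln(unit-ball volume) + gamma + ln(n - 1), with d = 1.
    return np.mean(np.log(r)) + np.log(2.0) + euler_gamma + np.log(n - 1)
```

On samples from known densities the estimate converges to the true differential entropy: about 0 for the standard uniform and 0.5·ln(2πe) ≈ 1.419 for the standard normal.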
Caticha, Ariel
2007-11-01
What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs and therefore it is the force that induces us to change our minds. This problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME), which is designed for updating from arbitrary priors given information in the form of arbitrary constraints, includes as special cases both MaxEnt (which allows arbitrary constraints) and Bayes' rule (which allows arbitrary priors). Thus, ME unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme that allows us to handle problems that lie beyond the reach of either of the two methods separately. I conclude with a couple of simple illustrative examples.
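The claim that Bayes' rule is a special case of ME can be seen in a two-state toy example: updating a joint prior under the constraint that the data variable is known to equal its observed value, the minimum-relative-entropy joint keeps the prior's conditional, which is exactly Bayes' posterior. The prior and likelihood numbers below are invented for illustration.

```python
import numpy as np

# Joint prior q(theta, x): rows are parameter values, columns are data values.
prior_theta = np.array([0.4, 0.6])
likelihood = np.array([[0.8, 0.2],    # P(x | theta = 0)
                       [0.3, 0.7]])   # P(x | theta = 1)
q = prior_theta[:, None] * likelihood  # q(theta, x)

def me_update(joint, x_obs):
    """ME update with the constraint p(x) = delta(x, x_obs): among all joints
    with that marginal, the one closest (in relative entropy) to the prior
    keeps the prior conditional q(theta | x_obs) -- i.e., Bayes' rule."""
    posterior = joint[:, x_obs] / joint[:, x_obs].sum()
    p = np.zeros_like(joint)
    p[:, x_obs] = posterior
    return p
```

The same machinery with expectation constraints instead of a data constraint reduces to MaxEnt, which is the unification the abstract describes.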
Parametric optimization of CNC end milling using entropy ...
African Journals Online (AJOL)
Parametric optimization of CNC end milling using entropy measurement technique combined with grey-Taguchi method. ... International Journal of Engineering, Science and Technology ... Keywords: CNC end milling, surface finish, material removal rate (MRR), entropy measurement technique, Taguchi method ...
SpatEntropy: Spatial Entropy Measures in R
Altieri, Linda; Cocchi, Daniela; Roli, Giulia
2018-01-01
This article illustrates how to measure the heterogeneity of spatial data presenting a finite number of categories via computation of spatial entropy. The R package SpatEntropy contains functions for the computation of entropy and spatial entropy measures. The extension to spatial entropy measures is a unique feature of SpatEntropy. In addition to the traditional version of Shannon's entropy, the package includes Batty's spatial entropy, O'Neill's entropy, Li and Reynolds' contagion index, Ka...
International Nuclear Information System (INIS)
Reginatto, M.; Goldhagen, P.
1998-06-01
The problem of analyzing data from a multisphere neutron spectrometer to infer the energy spectrum of the incident neutrons is discussed. The main features of the code MAXED, a computer program developed to apply the maximum entropy principle to the deconvolution (unfolding) of multisphere neutron spectrometer data, are described, and the use of the code is illustrated with an example. A user's guide for the code MAXED is included in an appendix. The code is available from the authors upon request
Directory of Open Access Journals (Sweden)
Meyliana Meyliana
2015-11-01
Purpose: To analyze students' social media preferences in order to improve student engagement with the university, by examining social media implementation quality in terms of information and service quality. Design/methodology/approach: The research methodology starts with the creation of a hierarchy of student engagement with the university, which is then translated into a questionnaire. The questionnaire was distributed to 58 universities in Jakarta (Indonesia's capital). The questionnaire results were analyzed with the entropy and TOPSIS methods. Findings: In social media implementation quality, information quality is more important than service quality, because in social media good information quality is closely tied to the usefulness and comprehensiveness of the information. Regarding service quality, system availability helps students in their interactions with the university, ahead of the service's efficiency and fulfillment. This directly impacts cooperation between students, active learning, and students' expectations. The social media students preferred for improving engagement with universities were, in order, LINE, Facebook, Twitter, Wiki, Blog, Instagram, YouTube, Path, LinkedIn, and Podcast. Research limitations/implications: Social media's role is not only to create student engagement in the learning process, but also covers the other aspects identified by Chickering & Gamson (1987). Practical implications: The Social CRM channel shift from electronic media to social media shows that social media holds an important role for the university, since it eases communication between the university and its students. Good social media management has become an issue the university needs to solve, by creating a unit or delegating a person who can manage social media correctly and promptly, so that students feel they receive the good service they want. Originality/value: Other research focuses on observing
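The entropy-plus-TOPSIS pipeline used for the analysis can be sketched generically: entropy weighting derives criterion weights from the dispersion of the decision matrix, and TOPSIS ranks alternatives by relative closeness to the ideal. The matrix below is invented, not the survey data, and all criteria are treated as benefit-type for simplicity.

```python
import numpy as np

def entropy_weights(X):
    """Objective criterion weights from the Shannon entropy of the decision
    matrix (rows = alternatives, columns = criteria, benefit-type)."""
    P = X / X.sum(axis=0)
    n = X.shape[0]
    with np.errstate(divide='ignore', invalid='ignore'):
        e = -np.nansum(P * np.log(P), axis=0) / np.log(n)  # 0*log(0) -> 0
    return (1.0 - e) / np.sum(1.0 - e)   # low entropy = high discrimination

def topsis(X, w):
    """Closeness of each alternative to the ideal solution (higher = better)."""
    V = X / np.linalg.norm(X, axis=0) * w       # weighted normalized matrix
    best, worst = V.max(axis=0), V.min(axis=0)  # ideal and anti-ideal points
    d_best = np.linalg.norm(V - best, axis=1)
    d_worst = np.linalg.norm(V - worst, axis=1)
    return d_worst / (d_best + d_worst)
```

An alternative that dominates on every criterion receives the highest closeness score, which is how the ranked list of social media channels in the abstract is produced.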
International Nuclear Information System (INIS)
Berrocal T, Mariella J.; Roberty, Nilson C.; Silva Neto, Antonio J.; Universidade Federal, Rio de Janeiro, RJ
2002-01-01
The solution of inverse problems in participating media, where there is emission, absorption and scattering of radiation, has several applications in engineering and medicine. The objective of this work is to estimate the absorption and scattering coefficients in two-dimensional heterogeneous participating media, using, independently, the Generalized Maximum Entropy and Levenberg-Marquardt methods. Both methods are based on the solution of the direct problem, which is modeled by the Boltzmann equation in Cartesian geometry. Some test cases are presented. (author)
Natural time analysis and Tsallis non-additive entropy statistical mechanics.
Sarlis, N. V.; Skordas, E. S.; Varotsos, P.
2016-12-01
Upon analyzing the seismic data in natural time and employing a sliding natural time window comprising the number of events that would occur in a few months, it has recently been uncovered [1] that a precursory Seismic Electric Signals activity [2] initiates almost simultaneously with the appearance of a minimum in the fluctuations of the order parameter of seismicity [3]. Such minima have been ascertained [4] during periods when the magnitude time series exhibits long-range correlations [5], a few months before all earthquakes of magnitude 7.6 or larger that occurred in the entire Japanese area from 1 January 1984 to 11 March 2011 (the day of the M9 Tohoku-Oki earthquake). Before and after these minima, characteristic changes of the temporal correlations between earthquake magnitudes are observed which cannot be captured by Tsallis non-additive entropy statistical mechanics, in the frame of which it has been suggested that kappa distributions arise [6]. Here, we extend the study of the existence of such minima to a large area that includes the Aegean Sea and its surroundings, whose seismo-tectonics [7] differ in general from those of the entire Japanese area. References: P. A. Varotsos et al., Tectonophysics 589 (2013) 116. P. Varotsos and M. Lazaridou, Tectonophysics 188 (1991) 321. P. A. Varotsos et al., Phys Rev E 72 (2005) 041103. N. V. Sarlis et al., Proc Natl Acad Sci USA 110 (2013) 13734. P. A. Varotsos, N. V. Sarlis, and E. S. Skordas, J Geophys Res Space Physics 119 (2014) 9192, doi:10.1002/2014JA020580. G. Livadiotis and D. J. McComas, J Geophys Res 114 (2009) A11105, doi:10.1029/2009JA014352. S. Uyeda et al., Tectonophysics 304 (1999) 41.
Excess costs from functional somatic syndromes in Germany - An analysis using entropy balancing.
Grupp, Helen; Kaufmann, Claudia; König, Hans-Helmut; Bleibler, Florian; Wild, Beate; Szecsenyi, Joachim; Herzog, Wolfgang; Schellberg, Dieter; Schäfert, Rainer; Konnopka, Alexander
2017-06-01
The aim of this study was to calculate disorder-specific excess costs in patients with functional somatic syndromes (FSS). We compared 6-month direct and indirect costs in a patient group with FSS (n=273) to a control group of the general adult population in Germany without FSS (n=2914). Data on the patient group were collected between 2007 and 2009 in a randomized controlled trial (speciAL). Data on the control group were obtained from a telephone survey, representative for the general German population, conducted in 2014. Covariate balance between the patient group and the control group was achieved using entropy balancing. Excess costs were calculated by estimating generalized linear models and two-part models for direct costs and indirect costs. Further, we estimated excess costs according to the level of somatic symptom severity (SSS). FSS patients differed significantly from the control group regarding 6-month costs of outpatient physicians (+€280) and other outpatient providers (+€74). According to SSS, significantly higher outpatient physician costs were found for mild (+€151), moderate (+€306) and severe (+€376) SSS. We also found significantly higher costs of other outpatient providers in patients with mild, moderate and severe SSS. Regarding costs of rehabilitation and hospital treatments, FSS patients did not differ significantly from the control group for any level of SSS. Indirect costs were significantly higher in patients with severe SSS (+€760). FSS were of major importance in the outpatient sector. Further, we found significantly higher indirect costs in patients with severe SSS. Copyright © 2017 Elsevier Inc. All rights reserved.
Directory of Open Access Journals (Sweden)
Zhendong Mu
2017-02-01
Full Text Available Driver fatigue has become one of the major causes of traffic accidents, and it is a complicated physiological process. However, there is no effective method to detect driving fatigue. Electroencephalography (EEG) signals are complex, unstable, and non-linear; non-linear analysis methods, such as entropy, may therefore be more appropriate. This study evaluates a combined entropy-based processing method of EEG data to detect driver fatigue. In this paper, 12 subjects were selected to take part in an experiment, undergoing driving training in a virtual environment under the instruction of an operator. Four types of entropies (spectrum entropy, approximate entropy, sample entropy and fuzzy entropy) were used to extract features for the purpose of driver fatigue detection. An electrode selection process and a support vector machine (SVM) classification algorithm were also proposed. The average recognition accuracy was 98.75%. Retrospective analysis of the EEG showed that the features extracted from electrodes T5, TP7, TP8 and FP1 may yield better performance. An SVM classification algorithm using a radial basis function as the kernel obtained better results. The combined entropy-based method demonstrates good classification performance for driver fatigue detection.
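Of the four entropies named in the abstract, sample entropy is representative of how such a feature is extracted from an EEG channel. A minimal sketch (illustrative only, not the authors' code; parameter choices m=2, r=0.2 are common defaults, not values from the study):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B counts pairs of m-length templates
    within tolerance r*std (Chebyshev distance) and A counts the pairs that
    still match when the templates are extended to length m+1."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def match_count(k):
        # all overlapping templates of length k
        templates = np.array([x[i:i + k] for i in range(len(x) - k)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= tol))
        return count

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```

A regular (e.g. drowsy, low-complexity) signal yields a lower value than an irregular one, which is what makes such measures usable as fatigue features.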
Shannon information entropy in heavy-ion collisions
Ma, Chun-Wang; Ma, Yu-Gang
2018-03-01
The general idea of information entropy provided by C. E. Shannon "hangs over everything we do" and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. The Shannon information entropy essentially quantifies the information carried by a quantity with a specific distribution, and information-entropy-based methods have been deeply developed in many scientific areas, including physics. The dynamical properties of the heavy-ion collision (HIC) process make the nuclear matter and its evolution difficult and complex to study, and Shannon information entropy theory can provide new methods and observables to understand the physical phenomena both theoretically and experimentally. To better understand the processes of HICs, the main characteristics of typical models, including quantum molecular dynamics models, thermodynamic models, and statistical models, are briefly introduced. Typical applications of Shannon information theory in HICs are collected; they cover the chaotic behavior in the branching process of hadron collisions, the liquid-gas phase transition in HICs, and the isobaric difference scaling phenomenon for intermediate mass fragments produced in HICs of neutron-rich systems. Even though the present applications in heavy-ion collision physics are still relatively simple, they shed light on key open questions. It is suggested to further develop information entropy methods in nuclear reaction models, as well as new analysis methods to study the properties of nuclear matter in HICs, especially the evolution of the dynamical system.
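The core quantity behind all of these applications is the same: given any normalized distribution (fragment yields, multiplicities, etc.), compute H = -Σ p ln p. A generic sketch:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon information entropy H = -sum(p * ln p) of a distribution."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()          # normalize to a probability distribution
    p = p[p > 0]             # 0 * ln 0 is taken as 0
    return -np.sum(p * np.log(p))
```

A uniform distribution over n outcomes maximizes H at ln n; a sharply peaked distribution carries less entropy, which is what makes H a useful observable for comparing, e.g., fragment distributions across reaction systems.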
Towse, Clare-Louise; Akke, Mikael; Daggett, Valerie
2017-04-27
Molecular dynamics (MD) simulations contain considerable information with regard to the motions and fluctuations of a protein, the magnitude of which can be used to estimate conformational entropy. Here we survey conformational entropy across protein fold space using the Dynameomics database, which represents the largest existing data set of protein MD simulations for representatives of essentially all known protein folds. We provide an overview of MD-derived entropies accounting for all possible degrees of dihedral freedom on an unprecedented scale. Although different side chains might be expected to impose varying restrictions on the conformational space that the backbone can sample, we found that the backbone entropy and side chain size are not strictly coupled. An outcome of these analyses is the Dynameomics Entropy Dictionary, the contents of which have been compared with entropies derived by other theoretical approaches and experiment. As might be expected, the conformational entropies scale linearly with the number of residues, demonstrating that conformational entropy is an extensive property of proteins. The calculated conformational entropies of folding agree well with previous estimates. Detailed analysis of specific cases identifies deviations in conformational entropy from the average values that highlight how conformational entropy varies with sequence, secondary structure, and tertiary fold. Notably, α-helices have lower entropy on average than do β-sheets, and both are lower than coil regions.
International Nuclear Information System (INIS)
Alverbro, Karin
2010-01-01
Many decision-making situations today affect humans and the environment. In practice, many such decisions are made without an overall view and prioritise one or other of the two areas. Now and then these two areas of regulation come into conflict, e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective and vice versa. This report was prepared within a major project with the aim of developing a framework in which both the environmental aspects and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. The safety risks have to be analysed in order to be successfully avoided and one way of doing this is to use different kinds of risk analysis methods. There is an abundance of existing methods to choose from and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects
Parsani, Matteo
2016-10-04
Staggered grid, entropy stable discontinuous spectral collocation operators of any order are developed for the compressible Euler and Navier--Stokes equations on unstructured hexahedral elements. This generalization of previous entropy stable spectral collocation work [M. H. Carpenter, T. C. Fisher, E. J. Nielsen, and S. H. Frankel, SIAM J. Sci. Comput., 36 (2014), pp. B835--B867, M. Parsani, M. H. Carpenter, and E. J. Nielsen, J. Comput. Phys., 292 (2015), pp. 88--113], extends the applicable set of points from tensor product, Legendre--Gauss--Lobatto (LGL), to a combination of tensor product Legendre--Gauss (LG) and LGL points. The new semidiscrete operators discretely conserve mass, momentum, energy, and satisfy a mathematical entropy inequality for the compressible Navier--Stokes equations in three spatial dimensions. They are valid for smooth as well as discontinuous flows. The staggered LG and conventional LGL point formulations are compared on several challenging test problems. The staggered LG operators are significantly more accurate, although more costly from a theoretical point of view. The LG and LGL operators exhibit similar robustness, as is demonstrated using test problems known to be problematic for operators that lack a nonlinear stability proof for the compressible Navier--Stokes equations (e.g., discontinuous Galerkin, spectral difference, or flux reconstruction operators).
Curvature Entropy for Curved Profile Generation
Directory of Open Access Journals (Sweden)
Koichiro Sato
2012-03-01
Full Text Available In curved surface design, the overall shape features that emerge from combinations of shape elements are important. However, controlling the features of the overall shape of curved profiles is difficult using conventional microscopic shape information such as dimensions. Herein, two types of macroscopic shape information, curvature entropy and quadrature curvature entropy, quantitatively represent the features of the overall shape. The curvature entropy is calculated from the curvature distribution and represents the complexity of a shape (one of the overall shape features). The quadrature curvature entropy improves on the curvature entropy by introducing a Markov process to evaluate the continuity of the curvature and to better approximate human cognition of the shape. Additionally, a shape generation method using a genetic algorithm as a calculator and the entropy as a shape generation index is presented. Finally, the applicability of the proposed method is demonstrated using the side view of an automobile as a design example.
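The basic idea, entropy of a profile's curvature distribution as a complexity measure, can be sketched as follows (a generic illustration with discrete curvature and a histogram, not the authors' exact formulation; bin count is an arbitrary choice):

```python
import numpy as np

def curvature(x, y):
    # discrete curvature of a sampled parametric curve via finite differences;
    # the formula is invariant to the parametrization of (x(t), y(t))
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

def curvature_entropy(x, y, bins=16):
    """Shannon entropy of the curvature histogram: higher values indicate
    a more complex overall shape."""
    k = curvature(x, y)
    counts, _ = np.histogram(k, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))
```

A circle (constant curvature) concentrates the histogram into few bins and scores low, while a wavy profile with widely varying curvature scores high, matching the intended reading of "complexity".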
Harvey, Raymond A; Hayden, Jennifer D; Kamble, Pravin S; Bouchard, Jonathan R; Huang, Joanna C
2017-04-01
We compared methods to control bias and confounding in observational studies, including inverse probability weighting (IPW) and stabilized IPW (sIPW). These methods often require iteration and post-calibration to achieve covariate balance. In comparison, entropy balance (EB) optimizes covariate balance a priori by calibrating weights using the target's moments as constraints. We measured covariate balance empirically and by simulation using the absolute standardized mean difference (ASMD), absolute bias (AB), and root mean square error (RMSE), investigating two scenarios: the size of the observed (exposed) cohort exceeds the target (unexposed) cohort, and vice versa. The empirical application weighted a commercial health plan cohort to a nationally representative National Health and Nutrition Examination Survey target on the same covariates and compared average total health care cost estimates across methods. Entropy balance alone achieved balance (ASMD ≤ 0.10) on all covariates in simulation and empirically. In simulation scenario I, EB achieved the lowest AB and RMSE (13.64, 31.19) compared with IPW (263.05, 263.99) and sIPW (319.91, 320.71). In scenario II, EB outperformed IPW and sIPW with smaller AB and RMSE. In scenarios I and II, EB achieved the lowest mean estimate difference from the simulated population outcome ($490.05, $487.62) compared with IPW and sIPW, respectively. Empirically, only EB differed from the unweighted mean cost, indicating that IPW and sIPW weighting was ineffective. Entropy balance demonstrated the bias-variance tradeoff, achieving higher estimate accuracy yet lower estimate precision compared with IPW methods. EB weighting required no post-processing and effectively mitigated observed bias and confounding. Copyright © 2016 John Wiley & Sons, Ltd.
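The EB step, choosing minimum-divergence weights whose weighted covariate means exactly hit the target's moments, can be sketched through its convex dual (a generic illustration of the technique, not the study's software; covariates and target values below are made up):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def entropy_balance(X, target_means):
    """Weights w_i > 0 summing to 1 on the rows of X, of minimum KL
    divergence from uniform, such that the weighted covariate means equal
    target_means. Solved via the dual: minimize logsumexp(X @ lam) - lam.t,
    whose gradient vanishes exactly when the moment constraints hold."""
    X = np.asarray(X, dtype=float)
    t = np.asarray(target_means, dtype=float)
    dual = lambda lam: logsumexp(X @ lam) - lam @ t
    res = minimize(dual, np.zeros(X.shape[1]), method="BFGS")
    z = X @ res.x
    return np.exp(z - logsumexp(z))   # softmax: positive, sums to 1
```

Because the weights are an exponential tilt of the uniform distribution, balance on the specified moments is achieved by construction, which is why no IPW-style post-calibration loop is needed.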
Arithmetic of quantum entropy function
International Nuclear Information System (INIS)
Sen, Ashoke
2009-01-01
Quantum entropy function is a proposal for computing the entropy associated with the horizon of a black hole in the extremal limit, and is related via the AdS/CFT correspondence to the dimension of the Hilbert space in a dual quantum mechanics. We show that in N = 4 supersymmetric string theories, the quantum entropy function formalism naturally explains the origin of the subtle differences between the microscopic degeneracies of quarter-BPS dyons carrying different torsion, i.e. different arithmetical properties. These arise from additional saddle points in the path integral, whose existence depends on the arithmetical properties of the black hole charges, constructed as freely acting orbifolds of the original AdS_2 × S^2 near-horizon geometry. During this analysis we demonstrate that the quantum entropy function is insensitive to the details of the infrared cutoff used in the computation, and to the details of the boundary terms added to the action. We also discuss the role of the asymptotic symmetries of AdS_2 in carrying out the path integral in the definition of the quantum entropy function. Finally, we show that even though the quantum entropy function is expected to compute the absolute degeneracy in a given charge and angular momentum sector, it can also be used to compute the index. This can then be compared with the microscopic computation of the index.
Permutation Entropy: New Ideas and Challenges
Directory of Open Access Journals (Sweden)
Karsten Keller
2017-03-01
Full Text Available Over recent years, some new variants of permutation entropy have been introduced and applied to EEG analysis, including a conditional variant and variants that use additional metric information or are based on entropies different from the Shannon entropy. In some situations, it is not completely clear what kind of information the new measures and their algorithmic implementations provide. We discuss the new developments and illustrate them for EEG data.
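For reference, the baseline that all of these variants build on, Shannon entropy of the distribution of ordinal patterns, is compact enough to sketch (a standard textbook implementation, not the authors' code):

```python
import math
import numpy as np

def permutation_entropy(x, order=3, normalize=True):
    """Shannon entropy of the ordinal (rank) patterns of length `order`
    found in consecutive windows of the series x."""
    x = np.asarray(x)
    n = len(x) - order + 1
    counts = {}
    for i in range(n):
        # the ordinal pattern is the argsort of the window's values
        pattern = tuple(np.argsort(x[i:i + order]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values())) / n
    h = -np.sum(p * np.log(p))
    # normalize by the maximum ln(order!) so the result lies in [0, 1]
    return h / math.log(math.factorial(order)) if normalize else h
```

A monotone series produces a single pattern (entropy 0), while white noise uses all order! patterns nearly uniformly (normalized entropy near 1); the conditional and metric-aware variants discussed above modify which patterns are counted and how they are weighted.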
International Nuclear Information System (INIS)
Alvarez R, J.T.
1998-01-01
This thesis presents a microscopic model for the non-linear fluctuating hydrodynamics of superfluid helium (⁴He), developed by means of the Maximum Entropy Method (MaxEnt). Chapter 1 demonstrates the need for a microscopic model of the fluctuating hydrodynamics of superfluid helium, starting with a brief overview of the theories and experiments developed to explain its behavior. It also presents Morozov's heuristic method for constructing the non-linear fluctuating hydrodynamics of a simple fluid, a method that is then generalized to the construction of the non-linear fluctuating hydrodynamics of superfluid helium, and closes with a brief summary of the thesis. Chapter 2 reproduces the construction of a Generalized Fokker-Planck (GFP) equation for a distribution function associated with the coarse-grained variables, defined with the aid of a nonequilibrium statistical operator ρ̂_FP that is evaluated as a Wigner function through ρ_CG obtained by MaxEnt. This GFP equation is then reduced to a non-linear local FP equation by assuming a slow, Markovian process in the coarse-grained variables. In this equation appears a matrix D_mn, defined with a nonequilibrium coarse-grained statistical operator ρ̂_CG, whose elements are used in the construction of the non-linear fluctuating hydrodynamics equations of superfluid helium. In Chapter 3, the Lagrange multipliers are evaluated to determine ρ̂_CG by means of the local equilibrium statistical operator ρ̂_l under the hypothesis that the system presents small fluctuations. The currents associated with the coarse-grained variables are also determined, and the matrix elements D_mn are evaluated with the aid of a quasi-equilibrium statistical operator ρ̂_qe instead of the local equilibrium operator ρ̂_l. Matrix
Progress in Preparation and Research of High Entropy Alloys
Directory of Open Access Journals (Sweden)
CHEN Yong-xing
2017-11-01
Full Text Available Current studies of high entropy alloys mostly concern bulk materials, powders, coatings and films; studies in other areas are few, and a unified classification is lacking. Based on the current state of research, this paper classifies the high entropy alloys investigated so far, introduces the principles for selecting their elements, summarizes the preparation methods, and reviews the research institutions, research methods and research contents in the field. It surveys the application prospects of high entropy alloys and puts forward a series of open scientific problems, including limited study of mechanisms, incomplete performance research, unsystematic study of thermal stability, preparation process parameters still to be optimized, the design of lightweight high entropy alloys, and the expansion of the research field, together with possible solutions. These observations offer guidance for expanding the applications of high entropy alloys in future research.
International Nuclear Information System (INIS)
Berthomier, Charles
1975-01-01
A method capable of handling the amplitude and frequency time laws of a certain kind of geophysical signal is described here. The method is based upon the analytic signal idea of Gabor and Ville, which is constructed either in the time domain, by adding an imaginary part to the real signal (the in-quadrature signal), or in the frequency domain, by suppressing the negative frequency components. The instantaneous frequency of the initial signal is then defined as the time derivative of the phase of the analytic signal, and its amplitude, or envelope, as the modulus of this complex signal. The method is applied to three types of magnetospheric signals: chorus, whistlers and pearls. The results obtained by analog and numerical calculations are compared to those obtained by classical filter-based systems, i.e. systems based upon a different definition of the concept of frequency. The precision with which the frequency-time laws are determined then leads to an examination of the principle of the method, to a definition of the instantaneous power density spectrum attached to the signal, and to the first consequences of this definition. In this way, a two-dimensional representation of the signal is introduced which is less deformed by the properties of the analysis system than the usual representation, and which moreover has the advantage of being obtainable practically in real time. [fr]
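The construction described, analytic signal, envelope, and instantaneous frequency as the phase derivative, can be sketched numerically (a generic illustration on a synthetic amplitude-modulated tone, not the original analog/numerical system; the 50 Hz carrier and 2 Hz modulation are arbitrary choices):

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                                # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
# synthetic test signal: 50 Hz carrier with a slow 2 Hz amplitude modulation
signal = (1 + 0.5 * np.sin(2 * np.pi * 2 * t)) * np.sin(2 * np.pi * 50 * t)

analytic = hilbert(signal)                 # x + i*H[x]: negative freqs removed
envelope = np.abs(analytic)                # instantaneous amplitude (modulus)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs   # instantaneous frequency (Hz)
```

The envelope recovers the 0.5-to-1.5 modulation law and the instantaneous frequency stays near the 50 Hz carrier, exactly the two time laws the method extracts from chorus, whistler, and pearl records.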
International Nuclear Information System (INIS)
Torabi, Mohsen; Zhang, Kaili
2014-01-01
This article investigates classical entropy generation in cooled slabs. Two types of materials are assumed for the slab: a homogeneous material and an FGM (functionally graded material). For the homogeneous material, the thermal conductivity is assumed to be a linear function of temperature, while for the FGM slab the thermal conductivity is modeled to vary in accordance with the rule of mixtures. The boundary conditions are assumed to be simultaneously convective and radiative, and the internal heat generation of the slab is a linear function of temperature. Using the DTM (differential transformation method) and the resultant temperature fields, the local and total entropy generation rates within the slabs are derived. The effects of the physically applicable parameters, such as the thermal conductivity parameter for the homogeneous slab, β, the thermal conductivity parameter for the FGM slab, γ, the gradient index, j, the internal heat generation parameter, Q, the Biot number at the right side, Nc2, the conduction-radiation parameter, Nr2, the dimensionless convection sink temperature, δ, and the dimensionless radiation sink temperature, η, on the local and total entropy generation rates are illustrated and explained. The results demonstrate that considering temperature- or coordinate-dependent thermal conductivity and radiation heat transfer at both sides of the slab has great effects on the entropy generation. - Highlights: • The paper investigates entropy generation in a slab due to heat generation and convective-radiative boundary conditions. • Both a homogeneous material and an FGM (functionally graded material) were considered. • The calculations are carried out using the differential transformation method, a well-tested analytical technique.
Zhang, Yin; Liu, Yue; Li, Yannan; Zhao, Xia; Zhuo, Lin; Zhou, Ajian; Zhang, Li; Su, Zeqi; Chen, Cen; Du, Shiyu; Liu, Daming; Ding, Xia
2018-03-22
Chronic atrophic gastritis (CAG) is the precancerous stage of gastric carcinoma. Traditional Chinese Medicine (TCM) has been widely used in treating CAG. This study aimed to reveal core pathogenesis of CAG by validating the TCM syndrome patterns and provide evidence for optimization of treatment strategies. This is a cross-sectional study conducted in 4 hospitals in China. Hierarchical clustering analysis (HCA) and complex system entropy clustering analysis (CSECA) were performed, respectively, to achieve syndrome pattern validation. Based on HCA, 15 common factors were assigned to 6 syndrome patterns: liver depression and spleen deficiency and blood stasis in the stomach collateral, internal harassment of phlegm-heat and blood stasis in the stomach collateral, phlegm-turbidity internal obstruction, spleen yang deficiency, internal harassment of phlegm-heat and spleen deficiency, and spleen qi deficiency. By CSECA, 22 common factors were assigned to 7 syndrome patterns: qi deficiency, qi stagnation, blood stasis, phlegm turbidity, heat, yang deficiency, and yin deficiency. Combination of qi deficiency, qi stagnation, blood stasis, phlegm turbidity, heat, yang deficiency, and yin deficiency may play a crucial role in CAG pathogenesis. In accord with this, treatment strategies by TCM herbal prescriptions should be targeted to regulating qi, activating blood, resolving turbidity, clearing heat, removing toxin, nourishing yin, and warming yang. Further explorations are needed to verify and expand the current conclusions.
International Nuclear Information System (INIS)
Li, Rui; Wang, Jun
2016-01-01
A financial price model is developed based on the voter interacting system in this work. The Lempel–Ziv complexity is introduced to analyze the complex behaviors of the stock market. Some stock market stylized facts, including fat tails, absence of autocorrelation and volatility clustering, are first investigated for the proposed price model. Then the complexity of the fluctuation behaviors of the real stock markets and the proposed price model is explored by Lempel–Ziv complexity (LZC) analysis and multi-scale weighted-permutation entropy (MWPE) analysis. A series of LZC analyses of the returns and the absolute returns of daily closing prices and moving average prices are performed. Moreover, the complexity of the returns, the absolute returns and their corresponding intrinsic mode functions (IMFs) derived from empirical mode decomposition (EMD) is also investigated with MWPE. The numerical empirical study shows similar statistical and complex behaviors between the proposed price model and the real stock markets, which demonstrates that the proposed model is feasible to some extent. - Highlights: • A financial price dynamical model is developed based on the voter interacting system. • Lempel–Ziv complexity is first applied to investigate the stock market dynamical system. • MWPE is employed to explore the complex fluctuation behaviors of the stock market. • Empirical results show the feasibility of the proposed financial model.
Energy Technology Data Exchange (ETDEWEB)
Li, Rui, E-mail: lirui1401@bjtu.edu.cn; Wang, Jun
2016-01-08
A financial price model is developed based on the voter interacting system in this work. The Lempel–Ziv complexity is introduced to analyze the complex behaviors of the stock market. Some stock market stylized facts, including fat tails, absence of autocorrelation and volatility clustering, are first investigated for the proposed price model. Then the complexity of the fluctuation behaviors of the real stock markets and the proposed price model is explored by Lempel–Ziv complexity (LZC) analysis and multi-scale weighted-permutation entropy (MWPE) analysis. A series of LZC analyses of the returns and the absolute returns of daily closing prices and moving average prices are performed. Moreover, the complexity of the returns, the absolute returns and their corresponding intrinsic mode functions (IMFs) derived from empirical mode decomposition (EMD) is also investigated with MWPE. The numerical empirical study shows similar statistical and complex behaviors between the proposed price model and the real stock markets, which demonstrates that the proposed model is feasible to some extent. - Highlights: • A financial price dynamical model is developed based on the voter interacting system. • Lempel–Ziv complexity is first applied to investigate the stock market dynamical system. • MWPE is employed to explore the complex fluctuation behaviors of the stock market. • Empirical results show the feasibility of the proposed financial model.
Comparing Postural Stability Entropy Analyses to Differentiate Fallers and Non-fallers.
Fino, Peter C; Mojdehi, Ahmad R; Adjerid, Khaled; Habibi, Mohammad; Lockhart, Thurmon E; Ross, Shane D
2016-05-01
The health and financial cost of falls has spurred research to differentiate the characteristics of fallers and non-fallers. Postural stability has received much of the attention, with recent studies exploring various measures of entropy. This study compared the discriminatory ability of several entropy methods at differentiating two paradigms in the center-of-pressure of elderly individuals: (1) eyes open (EO) vs. eyes closed (EC) and (2) fallers (F) vs. non-fallers (NF). Methods were compared using the area under the curve (AUC) of the receiver-operating characteristic curves developed from logistic regression models. Overall, multiscale entropy (MSE) and composite multiscale entropy (CompMSE) performed the best, with AUCs of 0.71 for EO/EC and 0.77 for F/NF. When methods were combined together to maximize the AUC, the entropy classifier had an AUC of 0.91 for the F/NF comparison. These results suggest researchers and clinicians attempting to create clinical tests to identify fallers should consider a combination of every entropy method when creating a classifying test. Additionally, MSE and CompMSE classifiers using polar coordinate data outperformed rectangular coordinate data, encouraging more research into the most appropriate time series for postural stability entropy analysis.
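The evaluation pipeline, score a combination of entropy features and measure AUC via the receiver-operating characteristic, can be sketched with synthetic data (the feature values below are made up for illustration; the study's classifiers were logistic regression models, here replaced by a naive summed score):

```python
import numpy as np

def roc_auc(scores, labels):
    """AUC via the rank-sum identity: the probability that a randomly
    chosen positive outscores a randomly chosen negative (ties count 1/2)."""
    scores, labels = np.asarray(scores, float), np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(0)
# hypothetical entropy features: fallers (label 1) shifted upward on average
X0 = rng.normal(0.0, 1.0, size=(100, 3))    # non-fallers
X1 = rng.normal(0.8, 1.0, size=(100, 3))    # fallers
labels = np.array([0] * 100 + [1] * 100)
combined = np.vstack([X0, X1]).sum(axis=1)  # naive combined classifier score
auc = roc_auc(combined, labels)
```

Combining several weakly discriminative features raises the AUC above that of any single feature, which mirrors the paper's finding that the combined entropy classifier (AUC 0.91) beat each individual method.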
Methods for geochemical analysis
Baedecker, Philip A.
1987-01-01
The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.
Analysis of spectral methods for the homogeneous Boltzmann equation
Filbet, Francis
2011-04-01
The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.
Analysis of spectral methods for the homogeneous Boltzmann equation
Filbet, Francis; Mouhot, Clément
2011-01-01
The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.
COMPUTER METHODS OF GENETIC ANALYSIS.
Directory of Open Access Journals (Sweden)
A. L. Osipov
2017-02-01
Full Text Available The basic statistical methods used in the genetic analysis of human traits are considered, including segregation analysis, linkage analysis and allelic association analysis. Software supporting the implementation of these methods was developed.
Energy Technology Data Exchange (ETDEWEB)
Escalante Sandoval, Carlos A.; Dominguez Esquivel, Jose Y. [Universidad Nacional Autonoma de Mexico (Mexico)
2001-09-01
The principle of maximum entropy (POME) is used to derive an alternative method of parameter estimation for the bivariate Gumbel distribution. A simple algorithm for this parameter estimation technique is presented. The method is applied to analyze the precipitation in a region of Mexico. Design events are compared with those obtained by the maximum likelihood procedure. According to the results, the proposed technique is a suitable option to be considered when performing frequency analysis of precipitation with small samples.
Quantum dynamical entropy revisited
International Nuclear Information System (INIS)
Hudetz, T.
1996-10-01
We define a new quantum dynamical entropy, which is a 'hybrid' of the closely related, physically oriented entropy introduced by Alicki and Fannes in 1994, and of the mathematically well-developed, single-argument entropy introduced by Connes, Narnhofer and Thirring in 1987. We show that this new quantum dynamical entropy has many properties similar to the ones of the Alicki-Fannes entropy, and also inherits some additional properties from the CNT entropy. In particular, the 'hybrid' entropy interpolates between the two different ways in which both the AF and the CNT entropy of the shift automorphism on the quantum spin chain agree with the usual quantum entropy density, resulting in even better agreement. Also, the new quantum dynamical entropy generalizes the classical dynamical entropy of Kolmogorov and Sinai in the same way as does the AF entropy. Finally, we estimate the 'hybrid' entropy both for the Powers-Price shift systems and for the noncommutative Arnold map on the irrational rotation C*-algebra, leaving some interesting open problems. (author)
International Nuclear Information System (INIS)
Zhou Yunlong; Zhang Xueqing; Gao Yunpeng; Cheng Yue
2009-01-01
To study the flow regimes of gas/liquid two-phase flow in a vertical upward pipe, the conductance fluctuation information of four typical flow regimes was collected by a measuring system with self-made multiple conductivity probes. Owing to the non-stationarity of the conductance fluctuation signals of gas-liquid two-phase flow, a flow regime identification method based on wavelet packet multi-scale information entropy and a Hidden Markov Model (HMM) was put forward. First, the collected conductance fluctuation signals were decomposed into eight signals in different frequency bands. Second, the wavelet packet multi-scale information entropies of the different frequency-band signals were used as the input characteristic vectors of the previously trained all-state HMMs. Finally, the regime identification of the gas-liquid two-phase flow could be performed. The study showed that the HMM-based identification method was superior to one using a BP neural network, and the results proved that the method is efficient and feasible. (authors)
Directory of Open Access Journals (Sweden)
Montserrat Vallverdú
2017-12-01
Full Text Available The aim of the study was to analyze heart rate variability (HRV) response to high-intensity exercise during a 35-km mountain trail race and to ascertain whether fitness level could influence autonomic nervous system (ANS) modulation. Time-domain, frequency-domain, and multi-scale entropy (MSE) indexes were calculated for eleven mountain-trail runners who completed the race. Many changes were observed, mostly related to exercise load and fatigue. These changes were characterized by increased mean values and standard deviations of the normal-to-normal intervals associated with sympathetic activity, and by decreased differences between successive intervals related to parasympathetic activity. Normalized low frequency (LF) power suggested that ANS modulation varied greatly during the race and between individuals. Normalized high frequency (HF) power, associated with parasympathetic activity, varied considerably over the race, and tended to decrease at the final stages, whereas changes in the LF/HF ratio corresponded to intervals with varying exercise load. MSE indexes, related to system complexity, indicated the existence of many interactions between the heart and its neurological control mechanism. The time-domain, frequency-domain, and MSE indexes were also able to discriminate faster from slower runners, mainly in the more difficult and in the final stages of the race. These findings suggest the use of HRV analysis to study cardiac function mechanisms in endurance sports.
Towards an information extraction and knowledge formation framework based on Shannon entropy
Directory of Open Access Journals (Sweden)
Iliescu Dragoș
2017-01-01
Full Text Available The subject of information quantity is addressed in this paper, with the specific domain of nonconforming product management considered as the information source. The work is a case study: raw data were gathered from a heavy industrial works company, and information extraction and knowledge formation are considered herein. The method used for information quantity estimation is based on the Shannon entropy formula. The information and entropy spectra are decomposed and analysed for the extraction of specific information and the formation of knowledge. The result of the entropy analysis points out the information that the organisation involved needs to acquire, presented as a specific type of knowledge.
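The estimate at the core of this approach is the standard Shannon formula H = -Σ p_i log2 p_i applied to category frequencies. A minimal sketch (illustrative only; the defect categories and counts below are invented, not taken from the case study):

```python
import math
from collections import Counter

def shannon_entropy(observations):
    """Shannon entropy H = -sum(p_i * log2(p_i)) of a discrete sample, in bits."""
    counts = Counter(observations)
    n = len(observations)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical nonconformity categories logged for 16 rejected products
defects = ["scratch"] * 8 + ["dent"] * 4 + ["crack"] * 2 + ["misalign"] * 2
H = shannon_entropy(defects)  # p = (0.5, 0.25, 0.125, 0.125) -> H = 1.75 bits
```

Low entropy indicates that nonconformities concentrate in a few categories; higher entropy signals a more dispersed quality problem requiring broader information acquisition.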
Entropy Spectrum of Black Holes of Heterotic String Theory via Adiabatic Invariance
Institute of Scientific and Technical Information of China (English)
Alexis Larrañaga; Luis Cabarique; Manuel Londoño
2012-01-01
Using adiabatic invariance and the Bohr-Sommerfeld quantization rule, we investigate the entropy spectroscopy of two black holes of heterotic string theory, the charged GMGHS and the rotating Sen solutions. It is shown that the entropy spectrum is equally spaced in both cases, identically to the spectrum obtained before for Schwarzschild, Reissner-Nordström and Kerr black holes. Since the adiabatic invariance method does not use quasinormal mode analysis, there is no need to impose the small charge or small angular momentum limits, and there is no confusion over whether the real part or the imaginary part of the modes is responsible for the entropy spectrum.
Entropy Measures for Stochastic Processes with Applications in Functional Anomaly Detection
Directory of Open Access Journals (Sweden)
Gabriel Martos
2018-01-01
Full Text Available We propose a definition of entropy for stochastic processes. We provide a reproducing kernel Hilbert space model to estimate entropy from a random sample of realizations of a stochastic process, namely functional data, and introduce two approaches to estimate minimum entropy sets. These sets are relevant to detect anomalous or outlier functional data. A numerical experiment illustrates the performance of the proposed method; in addition, we conduct an analysis of mortality rate curves as an interesting application in a real-data context to explore functional anomaly detection.
Improving performance of two-phase natural circulation loops by reducing of entropy generation
International Nuclear Information System (INIS)
Goudarzi, N.; Talebi, S.
2015-01-01
This paper aims to investigate the effects of various parameters on stability behavior and entropy generation through a two-phase natural circulation loop. Two-phase natural circulation systems have low driving head and, consequently, low heat removal capability. To have a higher thermodynamic efficiency, in addition to the stability analysis, minimization of entropy generation by loop should be taken into account in the design of these systems. In the present study, to investigate the stability behavior, the non-linear method (known as the direct solution method or time domain method) which is able to simulate the uniform and non-uniform diameter loops, was applied. To best calculate entropy generation rates, the governing equations of the entropy generation were solved analytically. The effects of various parameters such as operating conditions and geometrical dimensions on the stability behavior and the entropy generation in the two-phase natural circulation loop were then analyzed. - Highlights: • Effects of all important parameters on entropy generation of a loop are studied. • The governing equations of the entropy generation are solved analytically. • Effects of all important parameters on stability of a loop are investigated. • Improvement of two-phase natural circulation loop is investigated.
Automatic maximum entropy spectral reconstruction in NMR
International Nuclear Information System (INIS)
Mobli, Mehdi; Maciejewski, Mark W.; Gryk, Michael R.; Hoch, Jeffrey C.
2007-01-01
Developments in superconducting magnets, cryogenic probes, isotope labeling strategies, and sophisticated pulse sequences together have enabled the application, in principle, of high-resolution NMR spectroscopy to biomolecular systems approaching 1 megadalton. In practice, however, conventional approaches to NMR that utilize the fast Fourier transform, which require data collected at uniform time intervals, result in prohibitively lengthy data collection times in order to achieve the full resolution afforded by high field magnets. A variety of approaches that involve nonuniform sampling have been proposed, each utilizing a non-Fourier method of spectrum analysis. A very general non-Fourier method that is capable of utilizing data collected using any of the proposed nonuniform sampling strategies is maximum entropy reconstruction. A limiting factor in the adoption of maximum entropy reconstruction in NMR has been the need to specify non-intuitive parameters. Here we describe a fully automated system for maximum entropy reconstruction that requires no user-specified parameters. A web-accessible script generator provides the user interface to the system
Directory of Open Access Journals (Sweden)
José C. Iñiguez
1999-10-01
Full Text Available Abstract: This paper, the first in a series of four, will expose the lack of inner consistency of the analysis through which Clausius re-expressed the second law of thermodynamics: "Heat cannot, of itself, pass from a colder to a hotter body", as the law of increasing entropy: "The entropy of the universe tends to a maximum". In the two following papers, the flaw in Clausius' analysis producing the said lack of consistency will be located and corrected, and some of its consequences discussed; among them, the one stating that the identification of the two above statements of the second law is valid only under certain circumstances. In the fourth and final
Text mining by Tsallis entropy
Jamaati, Maryam; Mehri, Ali
2018-01-01
Long-range correlations between the elements of natural languages enable them to convey very complex information. The complex structure of human language, as a manifestation of natural language, motivates us to apply nonextensive statistical mechanics to text mining. Tsallis entropy appropriately ranks terms' relevance to the document subject, taking advantage of their spatial correlation length. We apply this statistical concept as a new powerful word-ranking metric for extracting the keywords of a single document. We carry out an experimental evaluation, which shows the capability of the presented method in keyword extraction. We find that Tsallis entropy has reliable word-ranking performance, on par with the best previous ranking methods.
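For reference, the Tsallis entropy S_q = (1 - Σ p_i^q)/(q - 1) generalizes the Boltzmann-Gibbs/Shannon form, which is recovered in the limit q → 1. A minimal sketch of the functional itself (the paper's full word-ranking pipeline is not reproduced here; the probabilities are placeholders):

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1) for q != 1."""
    if abs(q - 1.0) < 1e-12:
        # The q -> 1 limit recovers the Shannon entropy (in nats)
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

p = [0.5, 0.25, 0.25]              # hypothetical term-occurrence distribution
s_shannon = tsallis_entropy(p, 1.0)   # ~1.0397 nats
s_q2 = tsallis_entropy(p, 2.0)        # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
```

Varying q tunes how strongly the measure weights rare versus frequent terms, which is what makes it useful as a ranking knob.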
Directory of Open Access Journals (Sweden)
Urban Kordes
2005-10-01
Full Text Available The paper tries to tackle the question of the connection between entropy and the living. Definitions of life as the phenomenon that defies entropy are overviewed, and the conclusion is reached that life is in a way dependent on entropy - it could not exist without it. Entropy is a sort of medium, a fertile soil, that gives life the possibility to blossom. The paper ends by presenting some consequences for the field of artificial intelligence.
Entropy of Baker's Transformation
Institute of Scientific and Technical Information of China (English)
栾长福
2003-01-01
Four theorems about four different kinds of entropies for Baker's transformation are presented. The Kolmogorov entropy of Baker's transformation is sensitive to the initial flips over time. The topological entropy of Baker's transformation is found to be log k. The conditions for a state of Baker's transformation to be forbidden are also derived. The relations among the Shannon, Kolmogorov, topological and Boltzmann entropies are discussed in detail.
Physical entropy, information entropy and their evolution equations
Institute of Scientific and Technical Information of China (English)
Anonymous
2001-01-01
Inspired by the evolution equation of nonequilibrium statistical physics entropy and the concise statistical formula of the entropy production rate, we develop a theory of the dynamic information entropy and build a nonlinear evolution equation of the information entropy density changing in time and state variable space. Its mathematical form and physical meaning are similar to the evolution equation of the physical entropy: The time rate of change of information entropy density originates together from drift, diffusion and production. The concise statistical formula of information entropy production rate is similar to that of physical entropy also. Furthermore, we study the similarity and difference between physical entropy and information entropy and the possible unification of the two statistical entropies, and discuss the relationship among the principle of entropy increase, the principle of equilibrium maximum entropy and the principle of maximum information entropy as well as the connection between them and the entropy evolution equation.
Ben-Naim, Arieh
2011-01-01
Changes in entropy can "sometimes" be interpreted in terms of changes in disorder. On the other hand, changes in entropy can "always" be interpreted in terms of changes in Shannon's measure of information. Mixing and demixing processes are used to highlight the pitfalls in the association of entropy with disorder. (Contains 3 figures.)
Methods in quantitative image analysis.
Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M
1996-05-01
The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is made by a camera. The most modern types include a frame-grabber, converting the analog signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits; eight bits are summarised in one byte. Therefore, grey values can range from 0 to 255 (2^8 = 256 levels). The human eye seems to be quite content with a display of 6-bit images (corresponding to 64 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination across the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel per pixel, or division of the original image by the background image]. The brightness of an image represented by its grey values can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range. Rules for transforming the grey value
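The simple contrast stretch described above can be sketched in a few lines (a toy illustration on a flat list of grey values, not production image-processing code):

```python
def stretch_contrast(pixels, out_min=0, out_max=255):
    """Linearly map pixel grey values so they span the full available range."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:                      # flat image: nothing to stretch
        return list(pixels)
    scale = (out_max - out_min) / (hi - lo)
    return [round(out_min + (p - lo) * scale) for p in pixels]

img = [100, 110, 120, 130, 140]       # narrow band of grey values
print(stretch_contrast(img))          # -> [0, 64, 128, 191, 255]
```

The histogram of the output occupies the whole 0-255 scale, which is exactly the brightness-scale expansion the text refers to.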
Kinetics of the Dynamical Information Shannon Entropy for Complex Systems
International Nuclear Information System (INIS)
Yulmetyev, R.M.; Yulmetyeva, D.G.
1999-01-01
Kinetic behaviour of dynamical information Shannon entropy is discussed for complex systems: physical systems with non-Markovian property and memory in correlation approximation, and biological and physiological systems with sequences of the Markovian and non-Markovian random noises. For the stochastic processes, a description of the information entropy in terms of normalized time correlation functions is given. The influence and important role of two mutually dependent channels of the entropy change, correlation (creation or generation of correlations) and anti-correlation (decay or annihilation of correlation) is discussed. The method developed here is also used in analysis of the density fluctuations in liquid cesium obtained from slow neutron scattering data, fractal kinetics of the long-range fluctuation in the short-time human memory and chaotic dynamics of R-R intervals of human ECG. (author)
Entropy Generation Minimization in Dimethyl Ether Synthesis: A Case Study
Kingston, Diego; Razzitte, Adrián César
2018-04-01
Entropy generation minimization is a method that helps improve the efficiency of real processes and devices. In this article, we study the entropy production (due to chemical reactions, heat exchange and friction) in a conventional reactor that synthesizes dimethyl ether and minimize it by modifying different operating variables of the reactor, such as composition, temperature and pressure, while aiming at a fixed production of dimethyl ether. Our results indicate that it is possible to reduce the entropy production rate by nearly 70 % and that, by changing only the inlet composition, it is possible to cut it by nearly 40 %, though this comes at the expense of greater dissipation due to heat transfer. We also study the alternative of coupling the reactor with another, where dehydrogenation of methylcyclohexane takes place. In that case, entropy generation can be reduced by 54 %, when pressure, temperature and inlet molar flows are varied. These examples show that entropy generation analysis can be a valuable tool in engineering design and applications aiming at process intensification and efficient operation of plant equipment.
Maximum Entropy in Drug Discovery
Directory of Open Access Journals (Sweden)
Chih-Yuan Tseng
2014-07-01
Full Text Available Drug discovery applies multidisciplinary approaches either experimentally, computationally or both ways to identify lead compounds to treat various diseases. While conventional approaches have yielded many US Food and Drug Administration (FDA)-approved drugs, researchers continue investigating and designing better approaches to increase the success rate in the discovery process. In this article, we provide an overview of the current strategies and point out where and how the method of maximum entropy has been introduced in this area. The maximum entropy principle has its root in thermodynamics, yet since Jaynes’ pioneering work in the 1950s, the maximum entropy principle has not only been used as a physics law, but also as a reasoning tool that allows us to process information in hand with the least bias. Its applicability in various disciplines has been abundantly demonstrated. We give several examples of applications of maximum entropy in different stages of drug discovery. Finally, we discuss a promising new direction in drug discovery that is likely to hinge on the ways of utilizing maximum entropy.
Bellman, Richard Ernest
1970-01-01
In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat
The nexus between geopolitical uncertainty and crude oil markets: An entropy-based wavelet analysis
Uddin, Gazi Salah; Bekiros, Stelios; Ahmed, Ali
2018-04-01
The global financial crisis and the subsequent geopolitical turbulence in energy markets have brought increased attention to the proper statistical modeling especially of the crude oil markets. In particular, we utilize a time-frequency decomposition approach based on wavelet analysis to explore the inherent dynamics and the causal interrelationships between various types of geopolitical, economic and financial uncertainty indices and oil markets. Via the introduction of a mixed discrete-continuous multiresolution analysis, we employ the entropic criterion for the selection of the optimal decomposition level of a MODWT as well as the continuous-time coherency and phase measures for the detection of business cycle (a)synchronization. Overall, a strong heterogeneity in the revealed interrelationships is detected over time and across scales.
Quantum chaos: entropy signatures
International Nuclear Information System (INIS)
Miller, P.A.; Sarkar, S.; Zarum, R.
1998-01-01
A definition of quantum chaos is given in terms of entropy production rates for a quantum system coupled weakly to a reservoir. This allows the treatment of classical and quantum chaos on the same footing. In the quantum theory the entropy considered is the von Neumann entropy and in classical systems it is the Gibbs entropy. The rate of change of the coarse-grained Gibbs entropy of the classical system with time is given by the Kolmogorov-Sinai (KS) entropy. The relation between KS entropy and the rate of change of von Neumann entropy is investigated for the kicked rotator. For a system which is classically chaotic there is a linear relationship between these two entropies. Moreover it is possible to construct contour plots for the local KS entropy and compare it with the corresponding plots for the rate of change of von Neumann entropy. The quantitative and qualitative similarities of these plots are discussed for the standard map (kicked rotor) and the generalised cat maps. (author)
Volkenstein, Mikhail V
2009-01-01
The book "Entropy and Information" deals with the thermodynamical concept of entropy and its relationship to information theory. It is successful in explaining the universality of the term "Entropy" not only as a physical phenomenon, but reveals its existence also in other domains. E.g., Volkenstein discusses the "meaning" of entropy in a biological context and shows how entropy is related to artistic activities. Written by the renowned Russian bio-physicist Mikhail V. Volkenstein, this book on "Entropy and Information" surely serves as a timely introduction to understand entropy from a thermodynamic perspective and is definitely an inspiring and thought-provoking book that should be read by every physicist, information-theorist, biologist, and even artist.
International Nuclear Information System (INIS)
Bian Yiwen; Yang Feng
2010-01-01
Data envelopment analysis (DEA) has been widely used in energy efficiency and environment efficiency analysis in recent years. Based on the existing environment DEA technology, this paper presents several DEA models for estimating the aggregated efficiency of resource and environment. These models can evaluate DMUs' energy efficiencies and environment efficiencies simultaneously. However, efficiency ranking results obtained from these models are not the same, and each model can provide some valuable information of DMUs' efficiencies, which we could not ignore. Under this situation, it may be hard for us to choose a specific model in practice. To address this kind of performance evaluation problem, the current paper extends Shannon-DEA procedure to establish a comprehensive efficiency measure for appraising DMUs' resource and environment efficiencies. In the proposed approach, the measure for evaluating a model's importance degree is provided, and the targets setting approach of inputs/outputs for DMU managers to improve DMUs' energy and environmental efficiencies is also discussed. We illustrate the proposed approach using real data set of 30 provinces in China.
Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray
2014-05-13
The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.
Entropy of charged dilaton-axion black hole
International Nuclear Information System (INIS)
Ghosh, Tanwi; SenGupta, Soumitra
2008-01-01
Using the brick wall method, the entropy of the charged dilaton-axion black hole is determined for both asymptotically flat and nonflat cases. The entropy turns out to be proportional to the horizon area of the black hole confirming the Bekenstein-Hawking area-entropy formula for black holes. The leading order logarithmic corrections to the entropy are also derived for such black holes.
Nonadditive entropy maximization is inconsistent with Bayesian updating
Pressé, Steve
2014-11-01
The maximum entropy method—used to infer probabilistic models from data—is a special case of Bayes's model inference prescription which, in turn, is grounded in basic propositional logic. By contrast to the maximum entropy method, the compatibility of nonadditive entropy maximization with Bayes's model inference prescription has never been established. Here we demonstrate that nonadditive entropy maximization is incompatible with Bayesian updating and discuss the immediate implications of this finding. We focus our attention on special cases as illustrations.
Lim, Jongil; Kwon, Ji Young; Song, Juhee; Choi, Hosoon; Shin, Jong Chul; Park, In Yang
2014-02-01
The interpretation of the fetal heart rate (FHR) signal considering labor progression may improve perinatal morbidity and mortality. However, there have been few studies that evaluate the fetus quantitatively in each labor stage. The objective was to evaluate whether the entropy indices of FHR differ according to labor progression. A retrospective comparative study of FHR recordings was conducted in three groups: 280 recordings in the second stage of labor before vaginal delivery, 31 recordings in the first stage of labor before emergency cesarean delivery, and 23 recordings in pre-labor before elective cesarean delivery. The stored FHR recordings were obtained by external cardiotocography during labor. Approximate entropy (ApEn) and sample entropy (SampEn) were computed for the final 2000 RR intervals. The median ApEn and SampEn for the 2000 RR intervals showed the lowest values in the second stage of labor, followed by the emergency cesarean group and the elective cesarean group, for all time segments (all statistically significant). Entropy indices of FHR were significantly different according to labor progression. This result supports the necessity of considering labor progression when developing intrapartum fetal monitoring using the entropy indices of FHR. Copyright © 2013 Elsevier Ltd. All rights reserved.
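Sample entropy, one of the two indices used above, can be sketched as follows (a naive O(n^2) implementation for illustration only; clinical implementations typically set the tolerance r as a fraction of the series' standard deviation):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B), where B counts pairs of matching templates of
    length m and A of length m+1 (Chebyshev distance <= r, self-matches excluded)."""
    n = len(x)

    def count(mm):
        total = 0
        for i in range(n - mm + 1):
            for j in range(i + 1, n - mm + 1):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    total += 1
        return total

    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A perfectly regular series yields a SampEn near zero; irregular series score higher
regular = [0, 1] * 50
print(sample_entropy(regular, m=2, r=0.2))   # ≈ 0.02 (fully predictable alternation)
```

Lower SampEn means more regular, more predictable RR-interval dynamics, which is why the second-stage recordings score lowest.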
Directory of Open Access Journals (Sweden)
Tong-Bou Chang
2014-01-01
Full Text Available The effects of wall suction on the entropy generation rate in a two-dimensional steady film condensation flow on a horizontal tube are investigated theoretically. In analyzing the liquid flow, the effects of both the gravitational force and the viscous force are taken into account. In addition, a film thickness reduction ratio, Sf, is introduced to evaluate the effect of wall suction on the thickness of the condensate layer. The analytical results show that, the entropy generation rate depends on the Jakob number Ja, the Rayleigh number Ra, the Brinkman number Br, the dimensionless temperature difference ψ, and the wall suction parameter Sw. In addition, it is shown that in the absence of wall suction, a closed-form correlation for the Nusselt number can be derived. Finally, it is shown that the dimensionless entropy generation due to heat transfer, NT, increases with an increasing suction parameter Sw, whereas the dimensionless entropy generation due to liquid film flow friction, NF, decreases.
Generalized Analysis of a Distribution Separation Method
Directory of Open Access Journals (Sweden)
Peng Zhang
2016-04-01
Full Text Available Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR, there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF, where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence, the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence. Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results demonstrate the advantages of our DSM approach.
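The entropy-related measures named in the extended analysis are standard divergences between discrete distributions. A minimal sketch (the distributions below are invented placeholders; this is not the DSM algorithm itself):

```python
import math

def kl_divergence(p, q):
    """KL(p || q) = sum p_i * log(p_i / q_i); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: a symmetrized, bounded smoothing of KL."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.7, 0.2, 0.1]   # e.g. a seed "irrelevance" term distribution
q = [0.4, 0.4, 0.2]   # e.g. the observed mixture term distribution
sym_kl = kl_divergence(p, q) + kl_divergence(q, p)   # symmetrized KL
```

Unlike KL, the JS-divergence is symmetric and bounded by ln 2, which makes it the better-behaved choice when neither distribution is a natural "reference".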
The maximum entropy production and maximum Shannon information entropy in enzyme kinetics
Dobovišek, Andrej; Markovič, Rene; Brumen, Milan; Fajmut, Aleš
2018-04-01
We demonstrate that the maximum entropy production principle (MEPP) serves as a physical selection principle for the description of the most probable non-equilibrium steady states in simple enzymatic reactions. A theoretical approach is developed, which enables maximization of the density of entropy production with respect to the enzyme rate constants for the enzyme reaction in a steady state. Mass and Gibbs free energy conservations are considered as optimization constraints. In such a way computed optimal enzyme rate constants in a steady state yield also the most uniform probability distribution of the enzyme states. This accounts for the maximal Shannon information entropy. By means of the stability analysis it is also demonstrated that maximal density of entropy production in that enzyme reaction requires flexible enzyme structure, which enables rapid transitions between different enzyme states. These results are supported by an example, in which density of entropy production and Shannon information entropy are numerically maximized for the enzyme Glucose Isomerase.
ENTROPIES AND FLUX-SPLITTINGS FOR THE ISENTROPIC EULER EQUATIONS
Institute of Scientific and Technical Information of China (English)
(none listed)
2001-01-01
The authors establish the existence of a large class of mathematical entropies (the so-called weak entropies) associated with the Euler equations for an isentropic, compressible fluid governed by a general pressure law. Only a mild assumption on the behavior of the pressure law near the vacuum is required. The analysis is based on an asymptotic expansion of the fundamental solution (called here the entropy kernel) of a highly singular Euler-Poisson-Darboux equation. The entropy kernel is only Hölder continuous, and its regularity is carefully investigated. Relying on a notion introduced earlier by the authors, it is also proven that, for the Euler equations, the set of entropy flux-splittings coincides with the set of entropy-entropy flux pairs. These results imply the existence of a flux-splitting consistent with all of the entropy inequalities.
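For orientation, the isentropic Euler system with a general pressure law p(ρ), and the entropy-entropy flux pair condition referenced above, take the standard form:

```latex
% Isentropic Euler equations with a general pressure law p(\rho):
\partial_t \rho + \partial_x(\rho u) = 0, \qquad
\partial_t(\rho u) + \partial_x\bigl(\rho u^2 + p(\rho)\bigr) = 0.
% A pair (\eta, q) is an entropy--entropy flux pair if every smooth
% solution also satisfies the companion conservation law
\partial_t\, \eta(\rho, \rho u) + \partial_x\, q(\rho, \rho u) = 0.
```

The "weak entropies" of the abstract are those entropies η vanishing on the vacuum ρ = 0, which is where the behavior of p(ρ) near the vacuum enters the analysis.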
The pigeon's discrimination of visual entropy: a logarithmic function.
Young, Michael E; Wasserman, Edward A
2002-11-01
We taught 8 pigeons to discriminate 16-icon arrays that differed in their visual variability or "entropy" to see whether the relationship between entropy and discriminative behavior is linear (in which equivalent differences in entropy should produce equivalent changes in behavior) or logarithmic (in which higher entropy values should be less discriminable from one another than lower entropy values). Pigeons received a go/no-go task in which the lower entropy arrays were reinforced for one group and the higher entropy arrays were reinforced for a second group. The superior discrimination of the second group was predicted by a theoretical analysis in which excitatory and inhibitory stimulus generalization gradients fall along a logarithmic, but not a linear scale. Reanalysis of previously published data also yielded results consistent with a logarithmic relationship between entropy and discriminative behavior.
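The entropy of an icon array can be computed directly from the icon frequencies; a small sketch (illustrative, not the authors' analysis code):

```python
import math
from collections import Counter

def array_entropy(icons):
    """Shannon entropy (in bits) of the icon frequencies in a display array."""
    n = len(icons)
    counts = Counter(icons)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# 16-icon arrays of the kind described range from all-identical
# (entropy 0) to all-different (entropy log2(16) = 4 bits).
print(array_entropy(["A"] * 16))                 # 0.0
print(array_entropy(list("ABCDEFGHIJKLMNOP")))   # 4.0
```

The linear-versus-logarithmic question is then whether discriminability tracks differences in this value or differences in its logarithm.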
Density estimation by maximum quantum entropy
International Nuclear Information System (INIS)
Silver, R.N.; Wallstrom, T.; Martz, H.F.
1993-01-01
A new Bayesian method for non-parametric density estimation is proposed, based on a mathematical analogy to quantum statistical physics. The mathematical procedure is related to maximum entropy methods for inverse problems and image reconstruction. The information divergence enforces global smoothing toward default models, convexity, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing is enforced by constraints on differential operators. The linear response of the estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood (evidence). The method is demonstrated on textbook data sets.
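For contrast with the quantum variant described above, the classical discrete maximum-entropy fit under a moment constraint can be sketched as follows (a textbook construction, not the paper's method):

```python
import numpy as np

def maxent_discrete(x, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Classical maximum-entropy distribution on support x with a fixed mean.

    The solution has the exponential-family form p_i ~ exp(lam * x_i); lam
    is found by bisection on the (monotone) mean constraint.  A simplified
    classical analogue of the quantum-entropy approach in the abstract.
    """
    def distribution(lam):
        s = lam * x
        w = np.exp(s - s.max())          # shifted for numerical stability
        return w / w.sum()
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if distribution(mid) @ x < target_mean:
            lo = mid
        else:
            hi = mid
    return distribution(0.5 * (lo + hi))

x = np.arange(1.0, 7.0)        # faces of a die
p = maxent_discrete(x, 4.5)    # Jaynes' loaded-die example: mean fixed at 4.5
print(p, p @ x)
```

The quantum-entropy method replaces the moment constraints of this classical setup with constraints on differential operators, which is what produces the local smoothing.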
Directory of Open Access Journals (Sweden)
David Camarena-Martinez
2016-01-01
For industry, induction motors are essential elements in production chains. Despite their robustness, induction motors are susceptible to failures. The broken rotor bar (BRB) fault in induction motors has received special attention, since one of its characteristics is that the motor can continue operating with apparent normality; at a certain point, however, the fault may cause severe damage to the motor. In this work, a methodology to detect BRBs using vibration signals is proposed. The methodology uses the Shannon entropy to quantify the amount of information provided by the vibration signals, which changes due to the presence of new frequency components associated with the fault. For automatic diagnosis, the K-means clustering algorithm and a decision-making unit that looks for the nearest cluster through the Euclidean distance are applied. Unlike other reported works, the proposal can diagnose the BRB condition during both the startup transient and the steady-state regime of operation. Additionally, the proposal is implemented on a field-programmable gate array in order to offer a low-cost and low-complexity online monitoring system. The obtained results demonstrate the effectiveness of the proposal in diagnosing half, one, and two BRBs.
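A minimal sketch of the two ingredients named above, a Shannon-entropy feature and nearest-cluster assignment, assuming histogram-based entropy and centroids already learned offline by K-means (all names, labels and values are illustrative, not from the paper):

```python
import numpy as np

def signal_entropy(x, bins=32):
    """Shannon entropy (bits) of the amplitude histogram of a signal window."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def nearest_cluster(feature, centroids):
    """Assign a feature vector to the closest centroid (Euclidean distance)."""
    d = np.linalg.norm(centroids - feature, axis=1)
    return int(np.argmin(d))

# Hypothetical 1-D entropy centroids for healthy / one BRB / two BRB,
# standing in for centroids a K-means stage would learn from labelled data.
centroids = np.array([[3.0], [3.6], [4.1]])
rng = np.random.default_rng(0)
window = rng.normal(size=4096)              # stand-in vibration window
label = nearest_cluster(np.array([signal_entropy(window)]), centroids)
print(label)
```

New fault-related frequency components change the amplitude statistics of the window, shifting the entropy feature toward a different centroid; the FPGA implementation amounts to fixed-point versions of these two small kernels.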
Quantum Statistical Entropy of Five-Dimensional Black Hole
Institute of Scientific and Technical Information of China (English)
ZHAO Ren; WU Yue-Qin; ZHANG Sheng-Li
2006-01-01
The generalized uncertainty relation is introduced to calculate the quantum statistical entropy of a black hole. By using the new equation of state density motivated by the generalized uncertainty relation, we discuss the entropies of the Bose field and the Fermi field on the background of the five-dimensional spacetime. In our calculation, no cutoff needs to be introduced, and the divergent logarithmic term of the original brick-wall method does not appear. We obtain that the quantum statistical entropy corresponding to the black hole horizon is proportional to the area of the horizon. It is further shown that the entropy of the black hole is the entropy of the quantum states on the surface of the horizon. The black hole's entropy is an intrinsic property of the black hole, and the entropy is a quantum effect. This deepens the understanding of quantum statistical entropy.
Entropy of black holes with multiple horizons
Directory of Open Access Journals (Sweden)
Yun He
2018-05-01
We examine the entropy of black holes in de Sitter space and black holes surrounded by quintessence. These black holes have multiple horizons, including at least the black hole event horizon and a horizon outside it (the cosmological horizon for de Sitter black holes and the "quintessence horizon" for black holes surrounded by quintessence). Based on the consideration that the two horizons are not independent of each other, we conjecture that the total entropy of these black holes should not simply be the sum of the entropies of the two horizons, but should have an extra term coming from the correlations between them. Differently from our previous works, in this paper we consider the cosmological constant as a variable and employ an effective method to derive the explicit form of the entropy. We also discuss the thermodynamic stability of these black holes according to the entropy and the effective temperature.
Multivariate analysis: models and method
International Nuclear Information System (INIS)
Sanz Perucha, J.
1990-01-01
Data treatment techniques are increasingly used as computer methods become more widely accessible. Multivariate analysis consists of a group of statistical methods that are applied to study objects or samples characterized by multiple values; a final goal is decision making. The paper describes the models and methods of multivariate analysis.
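As a concrete instance of such a method, principal component analysis reduces samples described by multiple values to their directions of largest variance (an illustrative sketch of one standard multivariate technique, not necessarily the paper's focus):

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the k principal directions of largest
    variance, via an eigen-decomposition of the sample covariance matrix."""
    Xc = X - X.mean(axis=0)                   # centre each variable
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)          # ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]        # top-k directions
    return Xc @ vecs[:, order]

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))   # 100 samples, 5 measured values each
Z = pca(X, 2)
print(Z.shape)                  # (100, 2)
```

Such projections are typically a preprocessing step before the decision-making stage (clustering or classification) that the abstract mentions as the final goal.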
Multivariate analysis methods in physics
International Nuclear Information System (INIS)
Wolter, M.
2007-01-01
A review of multivariate methods based on statistical training is given. Several multivariate methods useful in high-energy physics analysis are discussed. Selected examples from current research in particle physics are presented, both from on-line trigger selection and from off-line analysis. Statistical training methods are also presented, and some new applications are suggested.
Methods in algorithmic analysis
Dobrushkin, Vladimir A
2009-01-01
…helpful to any mathematics student who wishes to acquire a background in classical probability and analysis … This is a remarkably beautiful book that would be a pleasure for a student to read, or for a teacher to make into a year's course.-Harvey Cohn, Computing Reviews, May 2010
Communication Network Analysis Methods.
Farace, Richard V.; Mabee, Timothy
This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…
Naef, Rudolf; Acree, William E
2017-06-25
The calculation of the standard enthalpies of vaporization, sublimation and solvation of organic molecules is presented using a common computer algorithm on the basis of a group-additivity method. The same algorithm is also shown to enable the calculation of their entropy of fusion as well as of the total phase-change entropy of liquid crystals. The present method is based on the complete breakdown of the molecules into their constituent atoms and their immediate neighbourhood; the respective calculations of the contributions of the atomic groups by means of the Gauss-Seidel fitting method are based on experimental data collected from the literature. The feasibility of the calculations for each of the mentioned descriptors was verified by means of a 10-fold cross-validation procedure, proving the good to high quality of the predicted values for the three mentioned enthalpies and for the entropy of fusion, whereas the predictive quality for the total phase-change entropy of liquid crystals was poor. The goodness of fit (Q²) and the standard deviation (σ) of the cross-validation calculations for the five descriptors were as follows: 0.9641 and 4.56 kJ/mol (N = 3386 test molecules) for the enthalpy of vaporization, 0.8657 and 11.39 kJ/mol (N = 1791) for the enthalpy of sublimation, 0.9546 and 4.34 kJ/mol (N = 373) for the enthalpy of solvation, 0.8727 and 17.93 J/mol/K (N = 2637) for the entropy of fusion, and 0.5804 and 32.79 J/mol/K (N = 2643) for the total phase-change entropy of liquid crystals. The large discrepancy between the results of the two closely related entropies is discussed in detail. Molecules for which both the standard enthalpies of vaporization and sublimation were calculable enabled the estimation of their standard enthalpy of fusion by simple subtraction of the former from the latter. For 990 of them the experimental enthalpy-of-fusion values are also known, allowing their comparison with predictions, yielding a correlation coefficient R