WorldWideScience

Sample records for maximum entropy analysis

  1. Maximum entropy analysis of EGRET data

    DEFF Research Database (Denmark)

    Pohl, M.; Strong, A.W.

    1997-01-01

    EGRET data are usually analysed on the basis of the Maximum-Likelihood method \cite{ma96} in a search for point sources in excess of a model for the background radiation (e.g. \cite{hu97}). This method depends strongly on the quality of the background model, and thus may have high systematic uncertainties in regions of strong and uncertain background like the Galactic Center region. Here we show images of such regions obtained by the quantified Maximum-Entropy method. We also discuss a possible further use of MEM in the analysis of problematic regions of the sky.

  2. Maximum-entropy clustering algorithm and its global convergence analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Constructing a batch of differentiable entropy functions to uniformly approximate an objective function by means of the maximum-entropy principle, a new clustering algorithm, called the maximum-entropy clustering algorithm, is proposed based on optimization theory. This algorithm is a soft generalization of the hard C-means algorithm and possesses global convergence. Its relations with other clustering algorithms are discussed.
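
    For a concrete picture of the entropy-regularized ("soft") assignment step that generalizes the hard C-means algorithm, the following Python sketch is a minimal illustration; it is not the authors' exact algorithm, and the function name and the temperature parameter T are assumptions made here for illustration.

        import numpy as np

        def maxent_cluster(X, k, T=1.0, n_iter=100, seed=0):
            """Entropy-regularized (soft) C-means: memberships follow a Gibbs
            distribution over squared distances, a maximum-entropy analogue of
            the hard C-means assignment step."""
            rng = np.random.default_rng(seed)
            centers = X[rng.choice(len(X), k, replace=False)]
            for _ in range(n_iter):
                # squared distances from every point to every center
                d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
                # maximum-entropy (Boltzmann) membership weights at temperature T
                logw = -d2 / T
                logw -= logw.max(axis=1, keepdims=True)   # numerical stability
                w = np.exp(logw)
                w /= w.sum(axis=1, keepdims=True)
                # update centers as membership-weighted means
                centers = (w.T @ X) / w.sum(axis=0)[:, None]
            return centers, w

        # usage (illustrative): centers, memberships = maxent_cluster(data, k=3, T=0.5)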

  3. Nonsymmetric entropy and maximum nonsymmetric entropy principle

    International Nuclear Information System (INIS)

    Liu Chengshi

    2009-01-01

    Under the frame of a statistical model, the concept of nonsymmetric entropy, which generalizes the concepts of Boltzmann's entropy and Shannon's entropy, is defined. The maximum nonsymmetric entropy principle is proved. Some important distribution laws, such as the power law, can be derived from this principle naturally. In particular, nonsymmetric entropy is more convenient than other entropies, such as Tsallis's entropy, in deriving power laws.

  4. The maximum entropy production and maximum Shannon information entropy in enzyme kinetics

    Science.gov (United States)

    Dobovišek, Andrej; Markovič, Rene; Brumen, Milan; Fajmut, Aleš

    2018-04-01

    We demonstrate that the maximum entropy production principle (MEPP) serves as a physical selection principle for the description of the most probable non-equilibrium steady states in simple enzymatic reactions. A theoretical approach is developed which enables maximization of the density of entropy production with respect to the enzyme rate constants for the enzyme reaction in a steady state. Mass and Gibbs free energy conservation are considered as optimization constraints. The optimal enzyme rate constants computed in this way also yield the most uniform probability distribution of the enzyme states, which corresponds to the maximal Shannon information entropy. By means of a stability analysis it is also demonstrated that the maximal density of entropy production in that enzyme reaction requires a flexible enzyme structure, which enables rapid transitions between different enzyme states. These results are supported by an example in which the density of entropy production and the Shannon information entropy are numerically maximized for the enzyme Glucose Isomerase.

  5. Maximum Quantum Entropy Method

    OpenAIRE

    Sim, Jae-Hoon; Han, Myung Joon

    2018-01-01

    Maximum entropy method for analytic continuation is extended by introducing quantum relative entropy. This new method is formulated in terms of matrix-valued functions and is therefore invariant under arbitrary unitary transformations of the input matrix. As a result, the continuation of off-diagonal elements becomes straightforward. Without introducing any further ambiguity, the Bayesian probabilistic interpretation is maintained just as in the conventional maximum entropy method. The applications o...

  6. Credal Networks under Maximum Entropy

    OpenAIRE

    Lukasiewicz, Thomas

    2013-01-01

    We apply the principle of maximum entropy to select a unique joint probability distribution from the set of all joint probability distributions specified by a credal network. In detail, we start by showing that the unique joint distribution of a Bayesian tree coincides with the maximum entropy model of its conditional distributions. This result, however, does not hold anymore for general Bayesian networks. We thus present a new kind of maximum entropy models, which are computed sequentially. ...

  7. Maximum Entropy in Drug Discovery

    Directory of Open Access Journals (Sweden)

    Chih-Yuan Tseng

    2014-07-01

    Full Text Available Drug discovery applies multidisciplinary approaches, either experimentally, computationally, or both, to identify lead compounds to treat various diseases. While conventional approaches have yielded many US Food and Drug Administration (FDA)-approved drugs, researchers continue investigating and designing better approaches to increase the success rate in the discovery process. In this article, we provide an overview of the current strategies and point out where and how the method of maximum entropy has been introduced in this area. The maximum entropy principle has its root in thermodynamics, yet since Jaynes’ pioneering work in the 1950s, the maximum entropy principle has not only been used as a physics law, but also as a reasoning tool that allows us to process the information in hand with the least bias. Its applicability in various disciplines has been abundantly demonstrated. We give several examples of applications of maximum entropy in different stages of drug discovery. Finally, we discuss a promising new direction in drug discovery that is likely to hinge on the ways of utilizing maximum entropy.

  8. Application of maximum entropy to neutron tunneling spectroscopy

    International Nuclear Information System (INIS)

    Mukhopadhyay, R.; Silver, R.N.

    1990-01-01

    We demonstrate the maximum entropy method for the deconvolution of high resolution tunneling data acquired with a quasielastic spectrometer. Given a precise characterization of the instrument resolution function, a maximum entropy analysis of lutidine data obtained with the IRIS spectrometer at ISIS results in an effective factor of three improvement in resolution. 7 refs., 4 figs

  9. Maximum Entropy Fundamentals

    Directory of Open Access Journals (Sweden)

    F. Topsøe

    2001-09-01

    Full Text Available Abstract: In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow-up on here. It leads to the consideration of a certain game, the Code Length Game and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature with hundreds of applications pertaining to several different fields and will also here serve as important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over
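
    As a reminder of what the Mean Energy Model looks like in formulas (standard textbook material rather than anything specific to this paper), maximizing the entropy under a single moment ("energy") constraint gives the familiar Gibbs form:

        % Maximum entropy under a mean-energy (moment) constraint:
        \max_{p}\; H(p) = -\sum_i p_i \log p_i
        \quad \text{s.t.} \quad \sum_i p_i = 1, \qquad \sum_i p_i E_i = \bar{E},
        % whose solution, via Lagrange multipliers, is the Gibbs distribution
        p_i = \frac{e^{-\beta E_i}}{Z(\beta)}, \qquad Z(\beta) = \sum_i e^{-\beta E_i},
        % with \beta chosen so that the mean-energy constraint is satisfied.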

  10. Receiver function estimated by maximum entropy deconvolution

    Institute of Scientific and Technical Information of China (English)

    吴庆举; 田小波; 张乃铃; 李卫平; 曾融生

    2003-01-01

    Maximum entropy deconvolution is presented to estimate the receiver function, with maximum entropy as the rule to determine the auto-correlation and cross-correlation functions. The Toeplitz equation and the Levinson algorithm are used to calculate the iterative formula of the error-predicting filter, and the receiver function is then estimated. During extrapolation, the reflection coefficient is always less than 1, which keeps maximum entropy deconvolution stable. The maximum entropy of the data outside the window increases the resolution of the receiver function. Both synthetic and real seismograms show that maximum entropy deconvolution is an effective method to measure the receiver function in the time domain.
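
    As a rough illustration of the Toeplitz/Levinson step mentioned above, here is a minimal Python sketch of the Levinson-Durbin recursion that computes the error-predicting filter and the reflection coefficients from autocorrelation lags; the function and variable names are assumptions, and the full receiver-function deconvolution built on top of this filter is not shown.

        import numpy as np

        def levinson_durbin(r, order):
            """Solve the Toeplitz (Yule-Walker) system by the Levinson recursion,
            returning the prediction-error filter a, the reflection coefficients,
            and the final prediction-error power. r[0..order] are autocorrelation lags."""
            r = np.asarray(r, dtype=float)
            a = np.zeros(order + 1)
            a[0] = 1.0
            err = r[0]
            refl = np.zeros(order)
            for m in range(1, order + 1):
                # reflection coefficient from the current filter and the next lag
                k = -(r[m] + a[1:m] @ r[m - 1:0:-1]) / err
                refl[m - 1] = k                  # |k| < 1 keeps the recursion stable
                a[1:m + 1] += k * a[m - 1::-1]   # symmetric update of the filter taps
                err *= (1.0 - k * k)             # shrink the prediction-error power
            return a, refl, err

        # usage (illustrative): a, refl, err = levinson_durbin(autocorr_lags, order=20)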

  11. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Mutihac, R.; Cicuttin, A.; Cerdeira, A.; Stanciulescu, C.

    1998-12-01

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)

  12. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for the resolution of the nonlinear equation system encountered in the MENT has been developed and tested. The possibilities of the MENT are demonstrated on the example of doublet structure analysis of noisy experimental data. A comparison of the MENT results with results of the Fourier algorithm technique without regularization is presented. The tolerable noise level is 30% for the MENT and only 0.1% for the Fourier algorithm

  13. Automatic maximum entropy spectral reconstruction in NMR

    International Nuclear Information System (INIS)

    Mobli, Mehdi; Maciejewski, Mark W.; Gryk, Michael R.; Hoch, Jeffrey C.

    2007-01-01

    Developments in superconducting magnets, cryogenic probes, isotope labeling strategies, and sophisticated pulse sequences together have enabled the application, in principle, of high-resolution NMR spectroscopy to biomolecular systems approaching 1 megadalton. In practice, however, conventional approaches to NMR that utilize the fast Fourier transform, which require data collected at uniform time intervals, result in prohibitively lengthy data collection times in order to achieve the full resolution afforded by high field magnets. A variety of approaches that involve nonuniform sampling have been proposed, each utilizing a non-Fourier method of spectrum analysis. A very general non-Fourier method that is capable of utilizing data collected using any of the proposed nonuniform sampling strategies is maximum entropy reconstruction. A limiting factor in the adoption of maximum entropy reconstruction in NMR has been the need to specify non-intuitive parameters. Here we describe a fully automated system for maximum entropy reconstruction that requires no user-specified parameters. A web-accessible script generator provides the user interface to the system

  14. Maximum entropy methods

    International Nuclear Information System (INIS)

    Ponman, T.J.

    1984-01-01

    For some years now two different expressions have been in use for maximum entropy image restoration and there has been some controversy over which one is appropriate for a given problem. Here two further entropies are presented and it is argued that there is no single correct algorithm. The properties of the four different methods are compared using simple 1D simulations with a view to showing how they can be used together to gain as much information as possible about the original object. (orig.)

  15. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    Science.gov (United States)

    Cofré, Rodrigo; Maldonado, Cesar

    2018-01-01

    We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. We review large deviations techniques useful in this context to describe properties of accuracy and convergence in terms of sampling size. We use these results to study the statistical fluctuation of correlations, distinguishability and irreversibility of maximum entropy Markov chains. We illustrate these applications using simple examples where the large deviation rate function is explicitly obtained for maximum entropy models of relevance in this field.

  16. Density estimation by maximum quantum entropy

    International Nuclear Information System (INIS)

    Silver, R.N.; Wallstrom, T.; Martz, H.F.

    1993-01-01

    A new Bayesian method for non-parametric density estimation is proposed, based on a mathematical analogy to quantum statistical physics. The mathematical procedure is related to maximum entropy methods for inverse problems and image reconstruction. The information divergence enforces global smoothing toward default models, convexity, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing is enforced by constraints on differential operators. The linear response of the estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood (evidence). The method is demonstrated on textbook data sets

  17. Maximum entropy analysis of liquid diffraction data

    International Nuclear Information System (INIS)

    Root, J.H.; Egelstaff, P.A.; Nickel, B.G.

    1986-01-01

    A maximum entropy method for reducing truncation effects in the inverse Fourier transform of the structure factor, S(q), to the pair correlation function, g(r), is described. The advantages and limitations of the method are explored with the PY hard sphere structure factor as model input data. An example using real data on liquid chlorine is then presented. It is seen that spurious structure is greatly reduced in comparison to traditional Fourier transform methods. (author)

  18. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    Science.gov (United States)

    Xie, Jigang; Song, Wenyun

    The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of the risk analysis of Chinese listed firms’ mergers and acquisitions is provided to demonstrate the feasibility and practicality of the method.

  19. Stationary neutrino radiation transport by maximum entropy closure

    International Nuclear Information System (INIS)

    Bludman, S.A.

    1994-11-01

    The authors obtain the angular distributions that maximize the entropy functional for Maxwell-Boltzmann (classical), Bose-Einstein, and Fermi-Dirac radiation. In the low and high occupancy limits, the maximum entropy closure is bounded by previously known variable Eddington factors that depend only on the flux. For intermediate occupancy, the maximum entropy closure depends on both the occupation density and the flux. The Fermi-Dirac maximum entropy variable Eddington factor shows a scale invariance, which leads to a simple, exact analytic closure for fermions. This two-dimensional variable Eddington factor gives results that agree well with exact (Monte Carlo) neutrino transport calculations out of a collapse residue during early phases of hydrostatic neutron star formation
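
    For the classical (Maxwell-Boltzmann) case mentioned above, the entropy-maximizing angular distribution has a simple exponential form; the lines below are the standard textbook construction of such a closure, not the authors' Fermi-Dirac result.

        % Maximize  S = -\int_{-1}^{1} f(\mu)\,\ln f(\mu)\, d\mu
        % subject to fixed occupancy  e = \int f\, d\mu  and flux  F = \int \mu f\, d\mu .
        % The stationarity condition gives the exponential (Maxwell-Boltzmann) form
        f(\mu) = \exp(\eta + a\mu),
        % with \eta and a fixed by the two constraints; the variable Eddington factor
        % then follows as  \chi = \int \mu^2 f\, d\mu \,\big/\, e .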

  20. MAXIMUM PRINCIPLE FOR SUBSONIC FLOW WITH VARIABLE ENTROPY

    Directory of Open Access Journals (Sweden)

    B. Sizykh Grigory

    2017-01-01

    Full Text Available The maximum principle for subsonic flow holds for stationary irrotational subsonic gas flows. According to this principle, if the value of the velocity is not constant everywhere, then its maximum is achieved on the boundary and only on the boundary of the considered domain. This property is used when designing the form of an aircraft with a maximum critical value of the Mach number: it is believed that if the local Mach number is less than unity in the incoming flow and on the body surface, then the Mach number is less than unity at all points of the flow. The known proof of the maximum principle for subsonic flow is based on the assumption that in the whole considered area of the flow the pressure is a function of density. For an ideal and perfect gas (the role of diffusion is negligible, and the Mendeleev-Clapeyron law is fulfilled), the pressure is a function of density if the entropy is constant in the entire considered area of the flow. An example is shown of a stationary subsonic irrotational flow in which the entropy has different values on different streamlines, and the pressure is not a function of density. The application of the maximum principle for subsonic flow to such a flow would be unjustified. This example shows the relevance of the question about the location of the points of maximum velocity when the entropy is not constant. To clarify the regularities of the location of these points, an analysis of the complete Euler equations (without any simplifying assumptions) was performed in the 3-D case. A new proof of the maximum principle for subsonic flow is proposed. This proof does not rely on the assumption that the pressure is a function of density. Thus, it is shown that the maximum principle for subsonic flow is true for stationary subsonic irrotational flows of an ideal perfect gas with variable entropy.

  1. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    Directory of Open Access Journals (Sweden)

    Rodrigo Cofré

    2018-01-01

    Full Text Available The spiking activity of neuronal networks follows laws that are not time-reversal symmetric; the notions of pre-synaptic and post-synaptic neurons, stimulus correlations and noise correlations have a clear time order. Therefore, a biologically realistic statistical model for the spiking activity should be able to capture some degree of time irreversibility. We use the thermodynamic formalism to build a framework in the context of maximum entropy models to quantify the degree of time irreversibility, providing an explicit formula for the information entropy production of the inferred maximum entropy Markov chain. We provide examples to illustrate our results and discuss the importance of time irreversibility for modeling the spike train statistics.

  2. Short-time maximum entropy method analysis of molecular dynamics simulation: Unimolecular decomposition of formic acid

    Science.gov (United States)

    Takahashi, Osamu; Nomura, Tetsuo; Tabayashi, Kiyohiko; Yamasaki, Katsuyoshi

    2008-07-01

    We performed spectral analysis by using the maximum entropy method instead of the traditional Fourier transform technique to investigate the short-time behavior in molecular systems, such as the energy transfer between vibrational modes and chemical reactions. This procedure was applied to direct ab initio molecular dynamics calculations for the decomposition of formic acid. More reactive trajectories for dehydration than for decarboxylation were obtained for Z-formic acid, which is consistent with the predictions of previous theoretical and experimental studies. Short-time maximum entropy method analyses were performed for typical reactive and non-reactive trajectories. Spectrograms of a reactive trajectory were obtained; these clearly showed the reactant, transient, and product regions, especially for the dehydration path.

  3. On an Objective Basis for the Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    David J. Miller

    2015-01-01

    Full Text Available In this letter, we elaborate on some of the issues raised by a recent paper by Neapolitan and Jiang concerning the maximum entropy (ME) principle and alternative principles for estimating probabilities consistent with known, measured constraint information. We argue that the ME solution for the “problematic” example introduced by Neapolitan and Jiang has a stronger objective basis, rooted in results from information theory, than their alternative proposed solution. We also raise some technical concerns about the Bayesian analysis in their work, which was used to independently support their alternative to the ME solution. The letter concludes by noting some open problems involving maximum entropy statistical inference.

  4. Maximum entropy deconvolution of low count nuclear medicine images

    International Nuclear Information System (INIS)

    McGrath, D.M.

    1998-12-01

    Maximum entropy is applied to the problem of deconvolving nuclear medicine images, with special consideration for very low count data. The physics of the formation of scintigraphic images is described, illustrating the phenomena which degrade planar estimates of the tracer distribution. Various techniques which are used to restore these images are reviewed, outlining the relative merits of each. The development and theoretical justification of maximum entropy as an image processing technique is discussed. Maximum entropy is then applied to the problem of planar deconvolution, highlighting the question of the choice of error parameters for low count data. A novel iterative version of the algorithm is suggested which allows the errors to be estimated from the predicted Poisson mean values. This method is shown to produce the exact results predicted by combining Poisson statistics and a Bayesian interpretation of the maximum entropy approach. A facility for total count preservation has also been incorporated, leading to improved quantification. In order to evaluate this iterative maximum entropy technique, two comparable methods, Wiener filtering and a novel Bayesian maximum likelihood expectation maximisation technique, were implemented. The comparison of results obtained indicated that this maximum entropy approach may produce equivalent or better measures of image quality than the compared methods, depending upon the accuracy of the system model used. The novel Bayesian maximum likelihood expectation maximisation technique was shown to be preferable over many existing maximum a posteriori methods due to its simplicity of implementation. A single parameter is required to define the Bayesian prior, which suppresses noise in the solution and may reduce the processing time substantially. Finally, maximum entropy deconvolution was applied as a pre-processing step in single photon emission computed tomography reconstruction of low count data. Higher contrast results were

  5. Two-dimensional maximum entropy image restoration

    International Nuclear Information System (INIS)

    Brolley, J.E.; Lazarus, R.B.; Suydam, B.R.; Trussell, H.J.

    1977-07-01

    An optical check problem was constructed to test P LOG P maximum entropy restoration of an extremely distorted image. Useful recovery of the original image was obtained. Comparison with maximum a posteriori restoration is made. 7 figures

  6. A Maximum Entropy Approach to Loss Distribution Analysis

    Directory of Open Access Journals (Sweden)

    Marco Bee

    2013-03-01

    Full Text Available In this paper we propose an approach to the estimation and simulation of loss distributions based on Maximum Entropy (ME), a non-parametric technique that maximizes the Shannon entropy of the data under moment constraints. Special cases of the ME density correspond to standard distributions; therefore, this methodology is very general as it nests most classical parametric approaches. Sampling the ME distribution is essential in many contexts, such as loss models constructed via compound distributions. Given the difficulties in carrying out exact simulation, we propose an innovative algorithm, obtained by means of an extension of Adaptive Importance Sampling (AIS), for the approximate simulation of the ME distribution. Several numerical experiments confirm that the AIS-based simulation technique works well, and an application to insurance data gives further insights into the usefulness of the method for modelling, estimating and simulating loss distributions.
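
    A minimal numerical sketch of a moment-constrained ME density in Python follows; it fits the exponential-family form by minimizing the convex dual and is only an illustration of the general idea, not the authors' estimator or their AIS sampler. The function name, the bounded support, and the optimizer choice are assumptions.

        import numpy as np
        from scipy.integrate import quad
        from scipy.optimize import minimize

        def fit_maxent_density(moments, support=(0.0, 10.0)):
            """Maximum-entropy density p(x) proportional to exp(sum_k lam[k]*x**(k+1))
            on a bounded support, with lam chosen so that the first len(moments) raw
            moments match; found by minimizing the convex dual log Z(lam) - lam.m."""
            m = np.asarray(moments, dtype=float)
            a, b = support

            def exponent(lam, x):
                return sum(l * x ** (k + 1) for k, l in enumerate(lam))

            def dual(lam):
                z, _ = quad(lambda x: np.exp(exponent(lam, x)), a, b)
                return np.log(z) - lam @ m

            lam = minimize(dual, x0=np.zeros(len(m)), method="Nelder-Mead").x
            z, _ = quad(lambda x: np.exp(exponent(lam, x)), a, b)
            return lambda x: np.exp(exponent(lam, x)) / z   # normalized density

        # usage (illustrative): p = fit_maxent_density([2.0, 5.5], support=(0.0, 10.0))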

  7. Combining Experiments and Simulations Using the Maximum Entropy Principle

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-01-01

    are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy...... in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results....... Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights to the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges....

  8. Introduction to maximum entropy

    International Nuclear Information System (INIS)

    Sivia, D.S.

    1989-01-01

    The maximum entropy (MaxEnt) principle has been successfully used in image reconstruction in a wide variety of fields. The author reviews the need for such methods in data analysis and shows, by use of a very simple example, why MaxEnt is to be preferred over other regularizing functions. This leads to a more general interpretation of the MaxEnt method, and its use is illustrated with several different examples. Practical difficulties with non-linear problems still remain, this being highlighted by the notorious phase problem in crystallography. He concludes with an example from neutron scattering, using data from a filter difference spectrometer to contrast MaxEnt with a conventional deconvolution. 12 refs., 8 figs., 1 tab

  9. Introduction to maximum entropy

    International Nuclear Information System (INIS)

    Sivia, D.S.

    1988-01-01

    The maximum entropy (MaxEnt) principle has been successfully used in image reconstruction in a wide variety of fields. We review the need for such methods in data analysis and show, by use of a very simple example, why MaxEnt is to be preferred over other regularizing functions. This leads to a more general interpretation of the MaxEnt method, and its use is illustrated with several different examples. Practical difficulties with non-linear problems still remain, this being highlighted by the notorious phase problem in crystallography. We conclude with an example from neutron scattering, using data from a filter difference spectrometer to contrast MaxEnt with a conventional deconvolution. 12 refs., 8 figs., 1 tab

  10. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    Science.gov (United States)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, $H(p) = -\sum_i p_i \log p_i$. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.

  11. Modeling multisite streamflow dependence with maximum entropy copula

    Science.gov (United States)

    Hao, Z.; Singh, V. P.

    2013-10-01

    Synthetic streamflows at different sites in a river basin are needed for planning, operation, and management of water resources projects. Modeling the temporal and spatial dependence structure of monthly streamflow at different sites is generally required. In this study, the maximum entropy copula method is proposed for multisite monthly streamflow simulation, in which the temporal and spatial dependence structure is imposed as constraints to derive the maximum entropy copula. The monthly streamflows at different sites are then generated by sampling from the conditional distribution. A case study for the generation of monthly streamflow at three sites in the Colorado River basin illustrates the application of the proposed method. Simulated streamflow from the maximum entropy copula is in satisfactory agreement with observed streamflow.

  12. Maximum entropy reconstructions for crystallographic imaging; Cristallographie et reconstruction d'images par maximum d'entropie

    Energy Technology Data Exchange (ETDEWEB)

    Papoular, R

    1997-07-01

    The Fourier Transform is of central importance to Crystallography since it allows the visualization in real space of three-dimensional scattering densities pertaining to physical systems from diffraction data (powder or single-crystal diffraction, using x-rays, neutrons, electrons or else). In turn, this visualization makes it possible to model and parametrize these systems, the crystal structures of which are eventually refined by Least-Squares techniques (e.g., the Rietveld method in the case of Powder Diffraction). The Maximum Entropy Method (sometimes called MEM or MaxEnt) is a general imaging technique, related to solving ill-conditioned inverse problems. It is ideally suited for tackling underdetermined systems of linear equations (for which the number of variables is much larger than the number of equations). It is already being applied successfully in Astronomy, Radioastronomy and Medical Imaging. The advantages of using Maximum Entropy over conventional Fourier and 'difference Fourier' syntheses stem from the following facts: MaxEnt takes the experimental error bars into account; MaxEnt incorporates Prior Knowledge (e.g., the positivity of the scattering density in some instances); MaxEnt allows density reconstructions from incompletely phased data, as well as from overlapping Bragg reflections; MaxEnt substantially reduces truncation errors to which conventional experimental Fourier reconstructions are usually prone. The principles of Maximum Entropy imaging as applied to Crystallography are first presented. The method is then illustrated by a detailed example specific to Neutron Diffraction: the search for protons in solids. (author). 17 refs.

  13. Maximum-Entropy Inference with a Programmable Annealer

    Science.gov (United States)

    Chancellor, Nicholas; Szoke, Szilard; Vinci, Walter; Aeppli, Gabriel; Warburton, Paul A.

    2016-03-01

    Optimisation problems typically involve finding the ground state (i.e. the minimum energy configuration) of a cost function with respect to many variables. If the variables are corrupted by noise then this maximises the likelihood that the solution is correct. The maximum entropy solution on the other hand takes the form of a Boltzmann distribution over the ground and excited states of the cost function to correct for noise. Here we use a programmable annealer for the information decoding problem which we simulate as a random Ising model in a field. We show experimentally that finite temperature maximum entropy decoding can give slightly better bit-error-rates than the maximum likelihood approach, confirming that useful information can be extracted from the excited states of the annealer. Furthermore we introduce a bit-by-bit analytical method which is agnostic to the specific application and use it to show that the annealer samples from a highly Boltzmann-like distribution. Machines of this kind are therefore candidates for use in a variety of machine learning applications which exploit maximum entropy inference, including language processing and image recognition.

  14. Maximum-entropy description of animal movement.

    Science.gov (United States)

    Fleming, Chris H; Subaşı, Yiğit; Calabrese, Justin M

    2015-03-01

    We introduce a class of maximum-entropy states that naturally includes within it all of the major continuous-time stochastic processes that have been applied to animal movement, including Brownian motion, Ornstein-Uhlenbeck motion, integrated Ornstein-Uhlenbeck motion, a recently discovered hybrid of the previous models, and a new model that describes central-place foraging. We are also able to predict a further hierarchy of new models that will emerge as data quality improves to better resolve the underlying continuity of animal movement. Finally, we also show that Langevin equations must obey a fluctuation-dissipation theorem to generate processes that fall from this class of maximum-entropy distributions when the constraints are purely kinematic.

  15. Zipf's law, power laws and maximum entropy

    International Nuclear Information System (INIS)

    Visser, Matt

    2013-01-01

    Zipf's law, and power laws in general, have attracted and continue to attract considerable attention in a wide variety of disciplines—from astronomy to demographics to software structure to economics to linguistics to zoology, and even warfare. A recent model of random group formation (RGF) attempts a general explanation of such phenomena based on Jaynes' notion of maximum entropy applied to a particular choice of cost function. In the present paper I argue that the specific cost function used in the RGF model is in fact unnecessarily complicated, and that power laws can be obtained in a much simpler way by applying maximum entropy ideas directly to the Shannon entropy subject only to a single constraint: that the average of the logarithm of the observable quantity is specified. (paper)
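
    The derivation referred to in the abstract is short enough to state explicitly; under the single constraint that the mean of the logarithm of the observable is fixed, the maximum entropy distribution is a power law:

        % Maximize Shannon entropy subject only to a fixed mean of \ln x:
        \max_{p}\; H = -\int p(x)\,\ln p(x)\,dx
        \quad\text{s.t.}\quad \int p(x)\,dx = 1, \qquad \int p(x)\,\ln x\,dx = \mu .
        % Stationarity of the Lagrangian gives
        p(x) \propto e^{-\lambda \ln x} = x^{-\lambda},
        % i.e. a power law, with \lambda fixed by the constraint on \langle \ln x \rangle.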

  16. A Research on Maximum Symbolic Entropy from Intrinsic Mode Function and Its Application in Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Zhuofei Xu

    2017-01-01

    Full Text Available Empirical mode decomposition (EMD) is a self-adaptive analysis method for nonlinear and nonstationary signals. It has been widely applied to machinery fault diagnosis and structural damage detection. A novel feature, the maximum symbolic entropy of intrinsic mode functions based on EMD, is proposed in this paper to enhance the recognition ability of EMD. First, a signal is decomposed into a collection of intrinsic mode functions (IMFs) based on the local characteristic time scale of the signal, and the IMFs are then transformed into a series of symbolic sequences with different parameters. Second, it is found that the entropies of the symbolic IMFs differ considerably, but there is always a maximum value for a certain symbolic IMF. Third, the maximum symbolic entropy is taken as a feature to describe the IMFs of a signal. Finally, the proposed feature is applied to fault diagnosis of rolling bearings, and the maximum symbolic entropy is compared with other standard time-domain features in a contrast experiment. Although the maximum symbolic entropy is only a time-domain feature, it can reveal the characteristic information of a signal accurately. It can also be used in other fields related to the EMD method.
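
    As a rough sketch of the feature construction described above (assuming the IMFs have already been obtained from an EMD implementation), one can symbolize each IMF and take the largest entropy over the symbolization parameters; the Python below is illustrative only, and the parameter names and defaults are assumptions.

        import numpy as np

        def symbolic_entropy(x, n_symbols=4, word_len=3):
            """Shannon entropy of the symbol-word distribution of a signal:
            quantize x into n_symbols equiprobable levels, form overlapping
            words of length word_len, and compute the entropy of the word counts."""
            edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
            symbols = np.digitize(x, edges)
            words = np.array([symbols[i:i + word_len]
                              for i in range(len(symbols) - word_len + 1)])
            _, counts = np.unique(words, axis=0, return_counts=True)
            p = counts / counts.sum()
            return float(-(p * np.log2(p)).sum())

        def max_symbolic_entropy(imf, symbol_range=(2, 9), word_len=3):
            """Maximum symbolic entropy of one IMF over a range of symbol counts."""
            return max(symbolic_entropy(imf, q, word_len) for q in range(*symbol_range))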

  17. Objective Bayesianism and the Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    Jon Williamson

    2013-09-01

    Full Text Available Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities; they should be calibrated to our evidence of physical probabilities; and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism are usually justified in different ways. In this paper, we show that the three norms can all be subsumed under a single justification in terms of minimising worst-case expected loss. This, in turn, is equivalent to maximising a generalised notion of entropy. We suggest that requiring language invariance, in addition to minimising worst-case expected loss, motivates maximisation of standard entropy as opposed to maximisation of other instances of generalised entropy. Our argument also provides a qualified justification for updating degrees of belief by Bayesian conditionalisation. However, conditional probabilities play a less central part in the objective Bayesian account than they do under the subjective view of Bayesianism, leading to a reduced role for Bayes’ Theorem.

  18. On Maximum Entropy and Inference

    Directory of Open Access Journals (Sweden)

    Luigi Gresele

    2017-11-01

    Full Text Available Maximum entropy is a powerful concept that entails a sharp separation between relevant and irrelevant variables. It is typically invoked in inference, once an assumption is made on what the relevant variables are, in order to estimate a model from data that affords predictions on all other (dependent) variables. Conversely, maximum entropy can be invoked to retrieve the relevant variables (sufficient statistics) directly from the data, once a model is identified by Bayesian model selection. We explore this approach in the case of spin models with interactions of arbitrary order, and we discuss how relevant interactions can be inferred. In this perspective, the dimensionality of the inference problem is not set by the number of parameters in the model, but by the frequency distribution of the data. We illustrate the method by showing its ability to recover the correct model in a few prototype cases and discuss its application to a real dataset.

  19. The Maximum Entropy Method for Optical Spectrum Analysis of Real-Time TDDFT

    International Nuclear Information System (INIS)

    Toogoshi, M; Kano, S S; Zempo, Y

    2015-01-01

    The maximum entropy method (MEM) is one of the key techniques for spectral analysis. Its major feature is that spectra in the low-frequency part can be described from short time-series data. Thus, we applied the MEM to analyse the spectrum of the time-dependent dipole moment obtained from a real-time time-dependent density functional theory (TDDFT) calculation, an approach that is intensively studied for computing optical properties. In the MEM analysis, however, the maximum lag of the autocorrelation is restricted by the total number of time-series data points. As an improved MEM analysis, we propose using a concatenated data set made from the several-times repeated raw data. We have applied this technique to the spectral analysis of the TDDFT dipole moment of ethylene and oligo-fluorene with n = 8. As a result, a higher resolution can be obtained, closer to that of a Fourier transform of practically time-evolved data with the same total number of time steps. The efficiency and the characteristic features of this technique are presented in this paper. (paper)

  20. Hydrodynamic Relaxation of an Electron Plasma to a Near-Maximum Entropy State

    International Nuclear Information System (INIS)

    Rodgers, D. J.; Servidio, S.; Matthaeus, W. H.; Mitchell, T. B.; Aziz, T.; Montgomery, D. C.

    2009-01-01

    Dynamical relaxation of a pure electron plasma in a Malmberg-Penning trap is studied, comparing experiments, numerical simulations and statistical theories of weakly dissipative two-dimensional (2D) turbulence. Simulations confirm that the dynamics are approximated well by a 2D hydrodynamic model. Statistical analysis favors a theoretical picture of relaxation to a near-maximum entropy state with constrained energy, circulation, and angular momentum. This provides evidence that 2D electron fluid relaxation in a turbulent regime is governed by principles of maximum entropy.

  1. Maximum entropy beam diagnostic tomography

    International Nuclear Information System (INIS)

    Mottershead, C.T.

    1985-01-01

    This paper reviews the formalism of maximum entropy beam diagnostic tomography as applied to the Fusion Materials Irradiation Test (FMIT) prototype accelerator. The same formalism has also been used with streak camera data to produce an ultrahigh speed movie of the beam profile of the Experimental Test Accelerator (ETA) at Livermore

  2. Application of Maximum Entropy Distribution to the Statistical Properties of Wave Groups

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    New distributions for the statistics of wave groups, based on the maximum entropy principle, are presented. The maximum entropy distributions appear to be superior to conventional distributions when applied to a limited amount of information. Their application to wave group properties shows the effectiveness of the maximum entropy distribution. An FFT filtering method is employed to obtain the wave envelope quickly and efficiently. Comparisons of both the maximum entropy distribution and the distribution of Longuet-Higgins (1984) with laboratory wind-wave data show that the former gives a better fit.
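
    One common FFT-based route to the wave envelope is through the analytic signal; the short Python sketch below illustrates that route, though it is not necessarily the exact filtering scheme used by the authors.

        import numpy as np
        from scipy.signal import hilbert

        def wave_envelope(eta):
            """Envelope of a surface-elevation record via the analytic signal
            (an FFT-based way to obtain the envelope; one common choice only)."""
            analytic = hilbert(eta - np.mean(eta))   # FFT-based analytic signal
            return np.abs(analytic)                  # instantaneous wave envelope

        # wave groups can then be identified as runs where the envelope exceeds a
        # chosen threshold, e.g.: groups = wave_envelope(eta) > threshold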

  3. MAXIMUM-LIKELIHOOD-ESTIMATION OF THE ENTROPY OF AN ATTRACTOR

    NARCIS (Netherlands)

    SCHOUTEN, JC; TAKENS, F; VANDENBLEEK, CM

    In this paper, a maximum-likelihood estimate of the (Kolmogorov) entropy of an attractor is proposed that can be obtained directly from a time series. Also, the relative standard deviation of the entropy estimate is derived; it is dependent on the entropy and on the number of samples used in the

  4. Tsallis distribution as a standard maximum entropy solution with 'tail' constraint

    International Nuclear Information System (INIS)

    Bercher, J.-F.

    2008-01-01

    We show that Tsallis' distributions can be derived from the standard (Shannon) maximum entropy setting, by incorporating a constraint on the divergence between the distribution and another distribution imagined as its tail. In this setting, we find an underlying entropy which is the Renyi entropy. Furthermore, escort distributions and generalized means appear as a direct consequence of the construction. Finally, the 'maximum entropy tail distribution' is identified as a Generalized Pareto Distribution

  5. Maximum entropy reconstruction of spin densities involving non uniform prior

    International Nuclear Information System (INIS)

    Schweizer, J.; Ressouche, E.; Papoular, R.J.; Zheludev, A.I.

    1997-01-01

    Diffraction experiments give microscopic information on structures in crystals. A method which uses the concept of maximum entropy (MaxEnt) appears to be a formidable improvement in the treatment of diffraction data. This method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one which has the highest prior (intrinsic) probability. Considering that all the points of the map are equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution. This method has been used for the reconstruction of charge densities from X-ray data, for maps of nuclear densities from unpolarized neutron data, as well as for distributions of spin density. The density maps obtained by this method, as compared to those resulting from the usual inverse Fourier transformation, are tremendously improved. In particular, any substantial deviation from the background is really contained in the data, as it costs entropy compared to a map that would ignore such features. However, in most cases, before the measurements are performed, some knowledge exists about the distribution which is investigated. It can range from the simple information of the type of scattering electrons to an elaborate theoretical model. In these cases, the uniform prior, which considers all the different pixels as equally likely, is too weak a requirement and has to be replaced. In a rigorous Bayesian analysis, Skilling has shown that prior knowledge can be encoded into the Maximum Entropy formalism through a model $m(\vec r)$, via a new definition for the entropy given in this paper. In the absence of any data, the maximum of the entropy functional is reached for $\rho(\vec r) = m(\vec r)$. Any substantial departure from the model, observed in the final map, is really contained in the data as, with the new definition, it costs entropy. This paper presents illustrations of model testing

  6. How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems.

    Science.gov (United States)

    Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray

    2014-05-13

    The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.

  7. Analysis of neutron reflectivity data: maximum entropy, Bayesian spectral analysis and speckle holography

    International Nuclear Information System (INIS)

    Sivia, D.S.; Hamilton, W.A.; Smith, G.S.

    1991-01-01

    The analysis of neutron reflectivity data to obtain nuclear scattering length density profiles is akin to the notorious phaseless Fourier problem, well known in many fields such as crystallography. Current methods of analysis culminate in the refinement of a few parameters of a functional model, and are often preceded by a long and laborious process of trial and error. We start by discussing the use of maximum entropy for obtaining 'free-form' solutions of the density profile, as an alternative to the trial and error phase when a functional model is not available. Next we consider a Bayesian spectral analysis approach, which is appropriate for optimising the parameters of a simple (but adequate) type of model when the number of parameters is not known. Finally, we suggest a novel experimental procedure, the analogue of astronomical speckle holography, designed to alleviate the ambiguity problems inherent in traditional reflectivity measurements. (orig.)

  8. Maximum entropy beam diagnostic tomography

    International Nuclear Information System (INIS)

    Mottershead, C.T.

    1985-01-01

    This paper reviews the formalism of maximum entropy beam diagnostic tomography as applied to the Fusion Materials Irradiation Test (FMIT) prototype accelerator. The same formalism has also been used with streak camera data to produce an ultrahigh speed movie of the beam profile of the Experimental Test Accelerator (ETA) at Livermore. 11 refs., 4 figs

  9. Unification of field theory and maximum entropy methods for learning probability densities

    Science.gov (United States)

    Kinney, Justin B.

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.

  10. Unification of field theory and maximum entropy methods for learning probability densities.

    Science.gov (United States)

    Kinney, Justin B

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.

  11. Maximum and minimum entropy states yielding local continuity bounds

    Science.gov (United States)

    Hanson, Eric P.; Datta, Nilanjana

    2018-04-01

    Given an arbitrary quantum state σ, we obtain an explicit construction of a state $\rho_\varepsilon^*(\sigma)$ [respectively, $\rho_{*,\varepsilon}(\sigma)$] which has the maximum (respectively, minimum) entropy among all states which lie in a specified neighborhood (ε-ball) of σ. Computing the entropy of these states leads to a local strengthening of the continuity bound of the von Neumann entropy, i.e., the Audenaert-Fannes inequality. Our bound is local in the sense that it depends on the spectrum of σ. The states $\rho_\varepsilon^*(\sigma)$ and $\rho_{*,\varepsilon}(\sigma)$ depend only on the geometry of the ε-ball and are in fact optimizers for a larger class of entropies. These include the Rényi entropy and the minimum- and maximum-entropies, providing explicit formulas for certain smoothed quantities. This allows us to obtain local continuity bounds for these quantities as well. In obtaining this bound, we first derive a more general result which may be of independent interest, namely, a necessary and sufficient condition under which a state maximizes a concave and Gâteaux-differentiable function in an ε-ball around a given state σ. Examples of such a function include the von Neumann entropy and the conditional entropy of bipartite states. Our proofs employ tools from the theory of convex optimization under non-differentiable constraints, in particular Fermat's rule, and majorization theory.

  12. On the maximum entropy distributions of inherently positive nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Taavitsainen, A., E-mail: aapo.taavitsainen@gmail.com; Vanhanen, R.

    2017-05-11

    The multivariate log-normal distribution is used by many authors and statistical uncertainty propagation programs for inherently positive quantities. Sometimes it is claimed that the log-normal distribution results from the maximum entropy principle, if only means, covariances and inherent positiveness of quantities are known or assumed to be known. In this article we show that this is not true. Assuming a constant prior distribution, the maximum entropy distribution is in fact a truncated multivariate normal distribution – whenever it exists. However, its practical application to multidimensional cases is hindered by lack of a method to compute its location and scale parameters from means and covariances. Therefore, regardless of its theoretical disadvantage, use of other distributions seems to be a practical necessity. - Highlights: • Statistical uncertainty propagation requires a sampling distribution. • The objective distribution of inherently positive quantities is determined. • The objectivity is based on the maximum entropy principle. • The maximum entropy distribution is the truncated normal distribution. • Applicability of log-normal or normal distribution approximation is limited.

  13. Rumor Identification with Maximum Entropy in MicroNet

    Directory of Open Access Journals (Sweden)

    Suisheng Yu

    2017-01-01

    Full Text Available The widely used applications of Microblog, WeChat, and other social networking platforms (that we call MicroNet) shorten the period of information dissemination and expand the range of information dissemination, which allows rumors to cause greater harm and have more influence. A hot topic in the information dissemination field is how to identify and block rumors. Based on the maximum entropy model, this paper constructs a recognition mechanism for rumor information in the micro-network environment. First, based on information entropy theory, we obtained the characteristics of rumor information using the maximum entropy model. Next, we optimized the original classifier training set and the feature function to divide the information into rumors and nonrumors. Finally, the experimental simulation results show that the rumor identification results using this method are better than those of the original classifier and other related classification methods.
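
    Since a maximum entropy text classifier with indicator feature functions is equivalent to multinomial logistic regression, the classification step can be sketched with standard tools; the texts, labels and pipeline below are purely illustrative and are not the authors' data or feature design.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Maximum-entropy text classification is equivalent to (multinomial)
        # logistic regression over the chosen feature functions; bag-of-words
        # features and made-up training examples are used here for illustration.
        texts = ["official statement confirms the schedule",
                 "shocking secret they do not want you to know"]
        labels = [0, 1]   # 0 = non-rumor, 1 = rumor (hypothetical labels)

        maxent = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
        maxent.fit(texts, labels)
        print(maxent.predict(["unverified claim spreads fast"]))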

  14. Maximum entropy PDF projection: A review

    Science.gov (United States)

    Baggenstoss, Paul M.

    2017-06-01

    We review maximum entropy (MaxEnt) PDF projection, a method with wide potential applications in statistical inference. The method constructs a sampling distribution for a high-dimensional vector x based on knowing the sampling distribution p(z) of a lower-dimensional feature z = T(x). Under mild conditions, the distribution p(x) having the highest possible entropy among all distributions consistent with p(z) may be readily found. Furthermore, the MaxEnt p(x) may be sampled, making the approach useful in Monte Carlo methods. We review the theorem and present a case study in model order selection and classification for handwritten character recognition.

  15. The maximum-entropy method in superspace

    Czech Academy of Sciences Publication Activity Database

    van Smaalen, S.; Palatinus, Lukáš; Schneider, M.

    2003-01-01

    Roč. 59, - (2003), s. 459-469 ISSN 0108-7673 Grant - others:DFG(DE) XX Institutional research plan: CEZ:AV0Z1010914 Keywords : maximum-entropy method * aperiodic crystals * electron density Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.558, year: 2003

  16. Principle of maximum entropy for reliability analysis in the design of machine components

    Science.gov (United States)

    Zhang, Yimin

    2018-03-01

    We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.

  17. Power spectrum of the geomagnetic field by the maximum entropy method

    International Nuclear Information System (INIS)

    Kantor, I.J.; Trivedi, N.B.

    1980-01-01

    Monthly mean values of the Vassouras (state of Rio de Janeiro) geomagnetic field are analyzed using the maximum entropy method. The method is described and compared with other methods of spectral analysis, and its advantages and disadvantages are presented. (Author) [pt
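    For context (not part of the original record): maximum entropy spectral analysis of a time series is equivalent to fitting an all-pole (autoregressive) model and reading off its spectrum. The sketch below, in Python with SciPy, uses Yule-Walker estimation rather than Burg's recursion, a synthetic monthly series and an arbitrary model order, all of which are illustrative assumptions.

        # Sketch of maximum entropy (all-pole / AR) spectral estimation via Yule-Walker.
        # The synthetic monthly series and the AR order are illustrative assumptions.
        import numpy as np
        from scipy.linalg import solve_toeplitz

        rng = np.random.default_rng(0)
        n, p = 480, 12                                    # series length, AR order
        t = np.arange(n)
        x = np.sin(2 * np.pi * t / 12) + 0.5 * rng.standard_normal(n)
        x = x - x.mean()

        r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p + 1)])   # autocovariance
        a = solve_toeplitz(r[:p], r[1:p + 1])             # Yule-Walker AR coefficients
        sigma2 = r[0] - np.dot(a, r[1:p + 1])             # prediction-error variance

        # Maximum entropy PSD: sigma2 / |1 - sum_k a_k exp(-2*pi*i*f*k)|^2
        freqs = np.linspace(0.0, 0.5, 256)                # cycles per month
        phase = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, p + 1)))
        psd = sigma2 / np.abs(1.0 - phase @ a) ** 2
        print("spectral peak near 1/12 cycles/month:", freqs[np.argmax(psd)])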

  18. Maximum Entropy and Theory Construction: A Reply to Favretti

    Directory of Open Access Journals (Sweden)

    John Harte

    2018-04-01

    Full Text Available In the maximum entropy theory of ecology (METE, the form of a function describing the distribution of abundances over species and metabolic rates over individuals in an ecosystem is inferred using the maximum entropy inference procedure. Favretti shows that an alternative maximum entropy model exists that assumes the same prior knowledge and makes predictions that differ from METE’s. He shows that both cannot be correct and asserts that his is the correct one because it can be derived from a classic microstate-counting calculation. I clarify here exactly what the core entities and definitions are for METE, and discuss the relevance of two critical issues raised by Favretti: the existence of a counting procedure for microstates and the choices of definition of the core elements of a theory. I emphasize that a theorist controls how the core entities of his or her theory are defined, and that nature is the final arbiter of the validity of a theory.

  19. A Maximum Entropy Method for a Robust Portfolio Problem

    Directory of Open Access Journals (Sweden)

    Yingying Xu

    2014-06-01

    Full Text Available We propose a continuous maximum entropy method to investigate the robust optimal portfolio selection problem for the market with transaction costs and dividends. This robust model aims to maximize the worst-case portfolio return in the case that all of the asset returns lie within some prescribed intervals. A numerical optimal solution to the problem is obtained by using a continuous maximum entropy method. Furthermore, some numerical experiments indicate that the robust model in this paper can result in better portfolio performance than a classical mean-variance model.

  20. A Bayes-Maximum Entropy method for multi-sensor data fusion

    Energy Technology Data Exchange (ETDEWEB)

    Beckerman, M.

    1991-01-01

    In this paper we introduce a Bayes-Maximum Entropy formalism for multi-sensor data fusion, and present an application of this methodology to the fusion of ultrasound and visual sensor data as acquired by a mobile robot. In our approach the principle of maximum entropy is applied to the construction of priors and likelihoods from the data. Distances between ultrasound and visual points of interest in a dual representation are used to define Gibbs likelihood distributions. Both one- and two-dimensional likelihoods are presented, and cast into a form which makes explicit their dependence upon the mean. The Bayesian posterior distributions are used to test a null hypothesis, and Maximum Entropy Maps used for navigation are updated using the resulting information from the dual representation. 14 refs., 9 figs.

  1. Precise charge density studies by maximum entropy method

    CERN Document Server

    Takata, M

    2003-01-01

    For the production, research and development of nanomaterials, structural information is indispensable. Recently, a sophisticated analytical method based on information theory, the Maximum Entropy Method (MEM), has been successfully applied to synchrotron radiation powder data to determine precise charge densities of metallofullerenes and nanochannel microporous compounds. The results revealed various endohedral natures of metallofullerenes and the one-dimensional array formation of adsorbed gas molecules in nanochannel microporous compounds. The concept of MEM analysis is also described briefly. (author)

  2. Twenty-five years of maximum-entropy principle

    Science.gov (United States)

    Kapur, J. N.

    1983-04-01

    The strengths and weaknesses of the maximum entropy principle (MEP) are examined and some challenging problems that remain outstanding at the end of the first quarter century of the principle are discussed. The original formalism of the MEP is presented and its relationship to statistical mechanics is set forth. The use of MEP for characterizing statistical distributions, in statistical inference, nonlinear spectral analysis, transportation models, population density models, models for brand-switching in marketing and vote-switching in elections is discussed. Its application to finance, insurance, image reconstruction, pattern recognition, operations research and engineering, biology and medicine, and nonparametric density estimation is considered.

  3. Pareto versus lognormal: a maximum entropy test.

    Science.gov (United States)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

    It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating processes even in the case when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.

  4. The Structure of the Class of Maximum Tsallis–Havrda–Chavát Entropy Copulas

    Directory of Open Access Journals (Sweden)

    Jesús E. García

    2016-07-01

    Full Text Available A maximum entropy copula is the copula associated with the joint distribution, with prescribed marginal distributions on [0, 1], which maximizes the Tsallis–Havrda–Chavát entropy with q = 2. We find necessary and sufficient conditions for each maximum entropy copula to be a copula in the class introduced in Rodríguez-Lallena and Úbeda-Flores (2004), and we also show that each copula in that class is a maximum entropy copula.

  5. Discontinuity of maximum entropy inference and quantum phase transitions

    International Nuclear Information System (INIS)

    Chen, Jianxin; Ji, Zhengfeng; Yu, Nengkun; Zeng, Bei; Li, Chi-Kwong; Poon, Yiu-Tung; Shen, Yi; Zhou, Duanlu

    2015-01-01

    In this paper, we discuss the connection between two genuinely quantum phenomena—the discontinuity of quantum maximum entropy inference and quantum phase transitions at zero temperature. It is shown that the discontinuity of the maximum entropy inference of local observable measurements signals the non-local type of transitions, where local density matrices of the ground state change smoothly at the transition point. We then propose to use the quantum conditional mutual information of the ground state as an indicator to detect the discontinuity and the non-local type of quantum phase transitions in the thermodynamic limit. (paper)

  6. Maximum-entropy data restoration using both real- and Fourier-space analysis

    International Nuclear Information System (INIS)

    Anderson, D.M.; Martin, D.C.; Thomas, E.L.

    1989-01-01

    An extension of the maximum-entropy (ME) data-restoration method is presented that is sensitive to periodic correlations in data. The method takes advantage of the higher signal-to-noise ratio for periodic information in Fourier space, thus enhancing statistically significant frequencies in a manner which avoids the user bias inherent in conventional Fourier filtering. This procedure incorporates concepts underlying new approaches in quantum mechanics that consider entropies in both position and momentum spaces, although the emphasis here is on data restoration rather than quantum physics. After a fast Fourier transform of the image, the phases are saved and the array of Fourier moduli are restored using the maximum-entropy criterion. A first-order continuation method is introduced that speeds convergence of the ME computation. The restored moduli together with the original phases are then Fourier inverted to yield a new image; traditional real-space ME restoration is applied to this new image completing one stage in the restoration process. In test cases improvement can be obtained from two to four stages of iteration. It is shown that in traditional Fourier filtering spurious features can be induced by selection or elimination of Fourier components without regard to their statistical significance. With the present approach there is no such freedom for the user to exert personal bias, so that features present in the final image and power spectrum are those which have survived the tests of statistical significance in both real and Fourier space. However, it is still possible for periodicities to 'bleed' across sharp boundaries. An 'uncertainty' relation is derived describing the inverse relationship between the resolution of these boundaries and the level of noise that can be eliminated. (orig./BHO)

  7. Inferring Pairwise Interactions from Biological Data Using Maximum-Entropy Probability Models.

    Directory of Open Access Journals (Sweden)

    Richard R Stein

    2015-07-01

    Full Text Available Maximum entropy-based inference methods have been successfully used to infer direct interactions from biological datasets such as gene expression data or sequence ensembles. Here, we review undirected pairwise maximum-entropy probability models in two categories of data types, those with continuous and categorical random variables. As a concrete example, we present recently developed inference methods from the field of protein contact prediction and show that a basic set of assumptions leads to similar solution strategies for inferring the model parameters in both variable types. These parameters reflect interactive couplings between observables, which can be used to predict global properties of the biological system. Such methods are applicable to the important problems of protein 3-D structure prediction and association of gene-gene networks, and they enable potential applications to the analysis of gene alteration patterns and to protein design.

  8. Maximum entropy decomposition of quadrupole mass spectra

    International Nuclear Information System (INIS)

    Toussaint, U. von; Dose, V.; Golan, A.

    2004-01-01

    We present an information-theoretic method called generalized maximum entropy (GME) for decomposing mass spectra of gas mixtures from noisy measurements. In this GME approach to the noisy, underdetermined inverse problem, the joint entropies of concentration, cracking, and noise probabilities are maximized subject to the measured data. This provides a robust estimation for the unknown cracking patterns and the concentrations of the contributing molecules. The method is applied to mass spectroscopic data of hydrocarbons, and the estimates are compared with those received from a Bayesian approach. We show that the GME method is efficient and is computationally fast
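    As a point of comparison (not part of the original record, and not the GME method itself): when the cracking patterns are assumed known, the decomposition reduces to a non-negative linear inverse problem, for which a simple non-negative least squares baseline can be sketched as below in Python with SciPy. All patterns and concentrations are invented.

        # Non-negative least squares baseline for decomposing a measured spectrum into
        # known cracking patterns (NOT the GME method, which also infers the patterns).
        # All numbers are invented.
        import numpy as np
        from scipy.optimize import nnls

        # columns = cracking patterns of three candidate molecules over four m/z channels
        patterns = np.array([[0.7, 0.1, 0.0],
                             [0.2, 0.6, 0.1],
                             [0.1, 0.2, 0.3],
                             [0.0, 0.1, 0.6]])
        true_conc = np.array([2.0, 1.0, 0.5])
        noise = 0.01 * np.random.default_rng(4).standard_normal(4)
        measured = patterns @ true_conc + noise

        conc, residual = nnls(patterns, measured)
        print("estimated concentrations:", np.round(conc, 3))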

  9. Maximum entropy estimation via Gauss-LP quadratures

    NARCIS (Netherlands)

    Thély, Maxime; Sutter, Tobias; Mohajerin Esfahani, P.; Lygeros, John; Dochain, Denis; Henrion, Didier; Peaucelle, Dimitri

    2017-01-01

    We present an approximation method to a class of parametric integration problems that naturally appear when solving the dual of the maximum entropy estimation problem. Our method builds on a recent generalization of Gauss quadratures via an infinite-dimensional linear program, and utilizes a

  10. Maximum-entropy networks pattern detection, network reconstruction and graph combinatorics

    CERN Document Server

    Squartini, Tiziano

    2017-01-01

    This book is an introduction to maximum-entropy models of random graphs with given topological properties and their applications. Its original contribution is the reformulation of many seemingly different problems in the study of both real networks and graph theory within the unified framework of maximum entropy. Particular emphasis is put on the detection of structural patterns in real networks, on the reconstruction of the properties of networks from partial information, and on the enumeration and sampling of graphs with given properties.  After a first introductory chapter explaining the motivation, focus, aim and message of the book, chapter 2 introduces the formal construction of maximum-entropy ensembles of graphs with local topological constraints. Chapter 3 focuses on the problem of pattern detection in real networks and provides a powerful way to disentangle nontrivial higher-order structural features from those that can be traced back to simpler local constraints. Chapter 4 focuses on the problem o...

  11. Spectral maximum entropy hydrodynamics of fermionic radiation: a three-moment system for one-dimensional flows

    International Nuclear Information System (INIS)

    Banach, Zbigniew; Larecki, Wieslaw

    2013-01-01

    The spectral formulation of the nine-moment radiation hydrodynamics resulting from using the Boltzmann entropy maximization procedure is considered. The analysis is restricted to the one-dimensional flows of a gas of massless fermions. The objective of the paper is to demonstrate that, for such flows, the spectral nine-moment maximum entropy hydrodynamics of fermionic radiation is not a purely formal theory. We first determine the domains of admissible values of the spectral moments and of the Lagrange multipliers corresponding to them. We then prove the existence of a solution to the constrained entropy optimization problem. Due to the strict concavity of the entropy functional defined on the space of distribution functions, there exists a one-to-one correspondence between the Lagrange multipliers and the moments. The maximum entropy closure of moment equations results in the symmetric conservative system of first-order partial differential equations for the Lagrange multipliers. However, this system can be transformed into the equivalent system of conservation equations for the moments. These two systems are consistent with the additional conservation equation interpreted as the balance of entropy. Exploiting the above facts, we arrive at the differential relations satisfied by the entropy function and the additional function required to close the system of moment equations. We refer to this additional function as the moment closure function. In general, the moment closure and entropy–entropy flux functions cannot be explicitly calculated in terms of the moments determining the state of a gas. Therefore, we develop a perturbation method of calculating these functions. Some additional analytical (and also numerical) results are obtained, assuming that the maximum entropy distribution function tends to the Maxwell–Boltzmann limit. (paper)

  12. Critical Analysis of Non-Nuclear Electron-Density Maxima and the Maximum Entropy Method

    NARCIS (Netherlands)

    de Vries, R.Y.; Briels, Willem J.; Feil, D.; Feil, D.

    1996-01-01

    Experimental evidence for the existence of non-nuclear maxima in charge densities is questioned. It is shown that the non-nuclear maxima reported for silicon are artifacts of the maximum entropy method that was used to analyze the x-ray diffraction data. This method can be improved by the use of

  13. Neutron spectra unfolding with maximum entropy and maximum likelihood

    International Nuclear Information System (INIS)

    Itoh, Shikoh; Tsunoda, Toshiharu

    1989-01-01

    A new unfolding theory has been established on the basis of the maximum entropy principle and the maximum likelihood method. This theory correctly embodies the Poisson statistics of neutron detection, and always yields a positive solution over the whole energy range. Moreover, the theory treats the overdetermined and the underdetermined problems in a unified way. For the latter, the ambiguity in assigning a prior probability, i.e. the initial guess in the Bayesian sense, is removed by virtue of the principle. An approximate expression for the covariance matrix of the resultant spectra is also presented. An efficient algorithm to solve the nonlinear system that appears in the present study has been established. Results of computer simulations showed the effectiveness of the present theory. (author)

  14. Analysis of QCD sum rule based on the maximum entropy method

    International Nuclear Information System (INIS)

    Gubler, Philipp

    2012-01-01

    The QCD sum rule was developed about thirty years ago and has been used up to the present to calculate various physical quantities of hadrons. Conventional analyses, however, have had to assume a 'pole + continuum' form for the spectral function, and application of the method therefore runs into difficulties when this assumption is not satisfied. In order to avoid this difficulty, an analysis making use of the maximum entropy method (MEM) has been developed by the present author. It is reported here how far this new method can be successfully applied. In the first section, the general features of the QCD sum rule are introduced. In section 2, it is discussed why the analysis of the QCD sum rule based on the MEM is so effective. In section 3, the MEM analysis procedure is described: in subsection 3.1 the likelihood function and prior probability are considered, and in subsection 3.2 numerical analyses are discussed. In section 4, some applications are described, starting with ρ mesons, then charmonium at finite temperature, and finally recent developments. Some figures of the spectral functions are shown. In section 5, a summary of the present analysis method and an outlook are given. (S. Funahashi)

  15. Spatio-temporal spike train analysis for large scale networks using the maximum entropy principle and Monte Carlo method

    International Nuclear Information System (INIS)

    Nasser, Hassan; Cessac, Bruno; Marre, Olivier

    2013-01-01

    Understanding the dynamics of neural networks is a major challenge in experimental neuroscience. For that purpose, a modelling of the recorded activity that reproduces the main statistics of the data is required. In the first part, we present a review on recent results dealing with spike train statistics analysis using maximum entropy models (MaxEnt). Most of these studies have focused on modelling synchronous spike patterns, leaving aside the temporal dynamics of the neural activity. However, the maximum entropy principle can be generalized to the temporal case, leading to Markovian models where memory effects and time correlations in the dynamics are properly taken into account. In the second part, we present a new method based on Monte Carlo sampling which is suited for the fitting of large-scale spatio-temporal MaxEnt models. The formalism and the tools presented here will be essential to fit MaxEnt spatio-temporal models to large neural ensembles. (paper)

  16. The maximum entropy method of moments and Bayesian probability theory

    Science.gov (United States)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1 weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue, rather there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments will be reviewed. Some of its problems and conditions under which it fails will be discussed. Then in later sections, the functional form of the maximum entropy method of moments probability distribution will be incorporated into Bayesian probability theory. It will be shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments. One gets posterior probabilities for the Lagrange multipliers, and, finally, one can put error bars on the resulting estimated density function.
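    For a concrete picture of the method reviewed in the introduction (this sketch is not from the article): the maximum entropy density consistent with a set of power moments has the form p(x) ∝ exp(-Σ_k λ_k x^k), and the Lagrange multipliers λ_k can be found by minimizing the convex dual ln Z(λ) + Σ_k λ_k μ_k. The Python/SciPy sketch below does this on a bounded support; the target moments, the support [0, 1] and the quadrature grid are arbitrary assumed values.

        # Sketch of the maximum entropy method of moments on a bounded support.
        # Target moments, the support [0, 1] and the grid are illustrative assumptions.
        import numpy as np
        from scipy.integrate import trapezoid
        from scipy.optimize import minimize

        x = np.linspace(0.0, 1.0, 2001)            # quadrature grid on the assumed support
        mu = np.array([0.4, 0.2])                  # target E[x], E[x^2] (assumed values)
        powers = np.arange(1, len(mu) + 1)

        def dual(lam):
            # Convex dual log Z(lam) + lam . mu, whose minimizer matches the moments
            logp = -(x[:, None] ** powers) @ lam
            return np.log(trapezoid(np.exp(logp), x)) + lam @ mu

        lam = minimize(dual, np.zeros_like(mu), method="Nelder-Mead").x
        p = np.exp(-(x[:, None] ** powers) @ lam)
        p /= trapezoid(p, x)
        print("fitted moments:", [float(trapezoid(p * x**k, x)) for k in powers])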

  17. The constraint rule of the maximum entropy principle

    NARCIS (Netherlands)

    Uffink, J.

    1995-01-01

    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability

  18. Use of the maximum entropy method in X-ray astronomy

    International Nuclear Information System (INIS)

    Willingale, R.

    1981-01-01

    An algorithm used to apply the maximum entropy method in X-ray astronomy is described. It is easy to programme on a digital computer and fast enough to allow processing of two-dimensional images. The method gives good noise suppression without loss of instrumental resolution and has been successfully applied to several data analysis problems in X-ray astronomy. The restoration of a high-resolution image from the Einstein Observatory demonstrates the use of the algorithm. (author)

  19. The Maximum Entropy Principle and the Modern Portfolio Theory

    Directory of Open Access Journals (Sweden)

    Ailton Cassetari

    2003-12-01

    Full Text Available In this work, a capital allocation methodology based on the Principle of Maximum Entropy was developed, with Shannon's entropy used as the measure; its connections with the Modern Portfolio Theory are also discussed. In particular, the methodology is tested by making a systematic comparison to: (1) the mean-variance (Markowitz) approach and (2) the mean-VaR approach (capital allocations based on the Value at Risk concept). In principle, such comparisons show the plausibility and effectiveness of the developed method.
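    A minimal version of such a capital allocation rule (illustrative only, not the authors' implementation) maximizes the Shannon entropy of the portfolio weights subject to full investment and a target expected return. In the Python/SciPy sketch below, the asset returns and the return target are invented numbers.

        # Sketch of a maximum entropy capital allocation: maximize -sum(w log w)
        # subject to sum(w) = 1 and a target expected return. All inputs are invented.
        import numpy as np
        from scipy.optimize import minimize

        mu = np.array([0.04, 0.07, 0.10, 0.12])    # expected asset returns (assumed)
        target = 0.08                              # required portfolio return (assumed)

        def neg_entropy(w):
            w = np.clip(w, 1e-12, None)            # avoid log(0)
            return np.sum(w * np.log(w))

        constraints = [{"type": "eq", "fun": lambda w: np.sum(w) - 1.0},
                       {"type": "eq", "fun": lambda w: w @ mu - target}]
        w0 = np.full(len(mu), 1.0 / len(mu))
        res = minimize(neg_entropy, w0, bounds=[(0.0, 1.0)] * len(mu),
                       constraints=constraints, method="SLSQP")
        print("weights:", np.round(res.x, 3), " expected return:", round(float(res.x @ mu), 4))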

  20. Applications of the Maximum Entropy Method in superspace

    Czech Academy of Sciences Publication Activity Database

    van Smaalen, S.; Palatinus, Lukáš

    2004-01-01

    Roč. 305, - (2004), s. 57-62 ISSN 0015-0193 Grant - others:DFG and FCI(DE) XX Institutional research plan: CEZ:AV0Z1010914 Keywords : Maximum Entropy Method * modulated structures * charge density Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 0.517, year: 2004

  1. Maximum entropy principle and hydrodynamic models in statistical mechanics

    International Nuclear Information System (INIS)

    Trovato, M.; Reggiani, L.

    2012-01-01

    This review presents the state of the art of the maximum entropy principle (MEP) in its classical and quantum (QMEP) formulation. Within the classical MEP we overview a general theory able to provide, in a dynamical context, the macroscopic relevant variables for carrier transport in the presence of electric fields of arbitrary strength. For the macroscopic variables the linearized maximum entropy approach is developed including full-band effects within a total energy scheme. Under spatially homogeneous conditions, we construct a closed set of hydrodynamic equations for the small-signal (dynamic) response of the macroscopic variables. The coupling between the driving field and the energy dissipation is analyzed quantitatively by using an arbitrary number of moments of the distribution function. Analogously, the theoretical approach is applied to many one-dimensional n⁺nn⁺ submicron Si structures by using different band structure models, different doping profiles, different applied biases and is validated by comparing numerical calculations with ensemble Monte Carlo simulations and with available experimental data. Within the quantum MEP we introduce a quantum entropy functional of the reduced density matrix; the principle of quantum maximum entropy is then asserted as a fundamental principle of quantum statistical mechanics. Accordingly, we have developed a comprehensive theoretical formalism to construct rigorously a closed quantum hydrodynamic transport within a Wigner function approach. The theory is formulated both in thermodynamic equilibrium and nonequilibrium conditions, and the quantum contributions are obtained by only assuming that the Lagrange multipliers can be expanded in powers of ħ², where ħ is the reduced Planck constant. In particular, by using an arbitrary number of moments, we prove that: i) on a macroscopic scale all nonlocal effects, compatible with the uncertainty principle, are imputable to high-order spatial derivatives both of the

  2. Maximum entropy production rate in quantum thermodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Beretta, Gian Paolo, E-mail: beretta@ing.unibs.i [Universita di Brescia, via Branze 38, 25123 Brescia (Italy)

    2010-06-01

    In the framework of the recent quest for well-behaved nonlinear extensions of the traditional Schroedinger-von Neumann unitary dynamics that could provide fundamental explanations of recent experimental evidence of loss of quantum coherence at the microscopic level, a recent paper [Gheorghiu-Svirschevski 2001 Phys. Rev. A 63 054102] reproposes the nonlinear equation of motion proposed by the present author [see Beretta G P 1987 Found. Phys. 17 365 and references therein] for quantum (thermo)dynamics of a single isolated indivisible constituent system, such as a single particle, qubit, qudit, spin or atomic system, or a Bose-Einstein or Fermi-Dirac field. As already proved, such nonlinear dynamics entails a fundamental unifying microscopic proof and extension of Onsager's reciprocity and Callen's fluctuation-dissipation relations to all nonequilibrium states, close and far from thermodynamic equilibrium. In this paper we propose a brief but self-contained review of the main results already proved, including the explicit geometrical construction of the equation of motion from the steepest-entropy-ascent ansatz and its exact mathematical and conceptual equivalence with the maximal-entropy-generation variational-principle formulation presented in Gheorghiu-Svirschevski S 2001 Phys. Rev. A 63 022105. Moreover, we show how it can be extended to the case of a composite system to obtain the general form of the equation of motion, consistent with the demanding requirements of strong separability and of compatibility with general thermodynamics principles. The irreversible term in the equation of motion describes the spontaneous attraction of the state operator in the direction of steepest entropy ascent, thus implementing the maximum entropy production principle in quantum theory. The time rate at which the path of steepest entropy ascent is followed has so far been left unspecified. As a step towards the identification of such rate, here we propose a possible

  3. Bistability, non-ergodicity, and inhibition in pairwise maximum-entropy models.

    Science.gov (United States)

    Rostami, Vahid; Porta Mana, PierGianLuca; Grün, Sonja; Helias, Moritz

    2017-10-01

    Pairwise maximum-entropy models have been used in neuroscience to predict the activity of neuronal populations, given only the time-averaged correlations of the neuron activities. This paper provides evidence that the pairwise model, applied to experimental recordings, would produce a bimodal distribution for the population-averaged activity, and for some population sizes the second mode would peak at high activities, that experimentally would be equivalent to 90% of the neuron population active within time-windows of few milliseconds. Several problems are connected with this bimodality: 1. The presence of the high-activity mode is unrealistic in view of observed neuronal activity and on neurobiological grounds. 2. Boltzmann learning becomes non-ergodic, hence the pairwise maximum-entropy distribution cannot be found: in fact, Boltzmann learning would produce an incorrect distribution; similarly, common variants of mean-field approximations also produce an incorrect distribution. 3. The Glauber dynamics associated with the model is unrealistically bistable and cannot be used to generate realistic surrogate data. This bimodality problem is first demonstrated for an experimental dataset from 159 neurons in the motor cortex of macaque monkey. Evidence is then provided that this problem affects typical neural recordings of population sizes of a couple of hundreds or more neurons. The cause of the bimodality problem is identified as the inability of standard maximum-entropy distributions with a uniform reference measure to model neuronal inhibition. To eliminate this problem a modified maximum-entropy model is presented, which reflects a basic effect of inhibition in the form of a simple but non-uniform reference measure. This model does not lead to unrealistic bimodalities, can be found with Boltzmann learning, and has an associated Glauber dynamics which incorporates a minimal asymmetric inhibition.
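    For reference (not part of the original record), the Glauber dynamics discussed above updates one binary unit at a time with a probability set by the pairwise maximum-entropy (Ising-like) energy. The Python sketch below samples such a model; the couplings and fields are random placeholders rather than parameters inferred from any recording.

        # Minimal Glauber-dynamics sampler for a pairwise maximum-entropy model with
        # 0/1 units. Couplings J and fields h are random placeholders, not inferred values.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 50
        J = rng.normal(0.0, 0.1, size=(n, n))
        J = (J + J.T) / 2.0                        # symmetric couplings
        np.fill_diagonal(J, 0.0)
        h = rng.normal(-1.0, 0.5, size=n)          # negative fields keep activity sparse

        s = rng.integers(0, 2, size=n).astype(float)
        activity = []
        for sweep in range(2000):
            for i in rng.permutation(n):
                field = h[i] + J[i] @ s            # local field on unit i
                p_on = 1.0 / (1.0 + np.exp(-field))
                s[i] = 1.0 if rng.random() < p_on else 0.0
            activity.append(s.mean())
        print("mean population activity:", np.mean(activity[500:]))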

  4. Unification of field theory and maximum entropy methods for learning probability densities

    OpenAIRE

    Kinney, Justin B.

    2014-01-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy de...

  5. Predicting the Outcome of NBA Playoffs Based on the Maximum Entropy Principle

    OpenAIRE

    Ge Cheng; Zhenyu Zhang; Moses Ntanda Kyebambe; Nasser Kimbugwe

    2016-01-01

    Predicting the outcome of National Basketball Association (NBA) matches poses a challenging problem of interest to the research community as well as the general public. In this article, we formalize the problem of predicting NBA game results as a classification problem and apply the principle of Maximum Entropy to construct an NBA Maximum Entropy (NBAME) model that fits to discrete statistics for NBA games, and then predict the outcomes of NBA playoffs using the model. Our results reveal that...

  6. Ergodicity, Maximum Entropy Production, and Steepest Entropy Ascent in the Proofs of Onsager's Reciprocal Relations

    Science.gov (United States)

    Benfenati, Francesco; Beretta, Gian Paolo

    2018-04-01

    We show that to prove the Onsager relations using the microscopic time reversibility one necessarily has to make an ergodic hypothesis, or a hypothesis closely linked to that. This is true in all the proofs of the Onsager relations in the literature: from the original proof by Onsager, to more advanced proofs in the context of linear response theory and the theory of Markov processes, to the proof in the context of the kinetic theory of gases. The only three proofs that do not require any kind of ergodic hypothesis are based on additional hypotheses on the macroscopic evolution: Ziegler's maximum entropy production principle (MEPP), the principle of time reversal invariance of the entropy production, or the steepest entropy ascent principle (SEAP).

  7. Gamma-ray spectra deconvolution by maximum-entropy methods

    International Nuclear Information System (INIS)

    Los Arcos, J.M.

    1996-01-01

    A maximum-entropy method which includes the response of detectors and the statistical fluctuations of spectra is described and applied to the deconvolution of γ-ray spectra. Resolution enhancement of 25% can be reached for experimental peaks and up to 50% for simulated ones, while the intensities are conserved within 1-2%. (orig.)

  8. Jarzynski equality in the context of maximum path entropy

    Science.gov (United States)

    González, Diego; Davis, Sergio

    2017-06-01

    In the global framework of finding an axiomatic derivation of nonequilibrium Statistical Mechanics from fundamental principles, such as the maximum path entropy - also known as Maximum Caliber principle -, this work proposes an alternative derivation of the well-known Jarzynski equality, a nonequilibrium identity of great importance today due to its applications to irreversible processes: biological systems (protein folding), mechanical systems, among others. This equality relates the free energy differences between two equilibrium thermodynamic states with the work performed when going between those states, through an average over a path ensemble. In this work the analysis of Jarzynski's equality will be performed using the formalism of inference over path space. This derivation highlights the wide generality of Jarzynski's original result, which could even be used in non-thermodynamical settings such as social systems, financial and ecological systems.
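    A quick numerical check of the equality (not part of the original record): for a Gaussian work distribution, <exp(-βW)> = exp(-βΔF) implies ΔF = <W> - βVar(W)/2. The Python sketch below verifies this by direct sampling; β, the mean work and the work fluctuations are arbitrary illustrative values.

        # Numerical check of Jarzynski's equality for a Gaussian work distribution.
        # beta, the mean work and the work fluctuations are arbitrary illustrative values.
        import numpy as np

        rng = np.random.default_rng(2)
        beta, mean_w, sigma_w = 1.0, 2.0, 1.0
        W = rng.normal(mean_w, sigma_w, size=2_000_000)

        dF_jarzynski = -np.log(np.mean(np.exp(-beta * W))) / beta
        dF_gaussian = mean_w - beta * sigma_w**2 / 2.0     # exact for Gaussian work
        print(dF_jarzynski, dF_gaussian)                   # both close to 1.5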

  9. Application of the maximum entropy method to profile analysis

    International Nuclear Information System (INIS)

    Armstrong, N.; Kalceff, W.; Cline, J.P.

    1999-01-01

    Full text: A maximum entropy (MaxEnt) method for analysing crystallite size- and strain-induced x-ray profile broadening is presented. This method treats the problems of determining the specimen profile, crystallite size distribution, and strain distribution in a general way by considering them as inverse problems. A common difficulty faced by many experimenters is their inability to determine a well-conditioned solution of the integral equation, which preserves the positivity of the profile or distribution. We show that the MaxEnt method overcomes this problem, while also enabling a priori information, in the form of a model, to be introduced into it. Additionally, we demonstrate that the method is fully quantitative, in that uncertainties in the solution profile or solution distribution can be determined and used in subsequent calculations, including mean particle sizes and rms strain. An outline of the MaxEnt method is presented for the specific problems of determining the specimen profile and crystallite or strain distributions for the correspondingly broadened profiles. This approach offers an alternative to standard methods such as those of Williamson-Hall and Warren-Averbach. An application of the MaxEnt method is demonstrated in the analysis of alumina size-broadened diffraction data (from NIST, Gaithersburg). It is used to determine the specimen profile and column-length distribution of the scattering domains. Finally, these results are compared with the corresponding Williamson-Hall and Warren-Averbach analyses. Copyright (1999) Australian X-ray Analytical Association Inc

  10. Current opinion about maximum entropy methods in Moessbauer spectroscopy

    International Nuclear Information System (INIS)

    Szymanski, K

    2009-01-01

    Current opinion about Maximum Entropy Methods in Moessbauer Spectroscopy is presented. The most important advantage offered by the method is the correct data processing under circumstances of incomplete information. A disadvantage is the sophisticated algorithm and its application to specific problems.

  11. Image coding based on maximum entropy partitioning for identifying ...

    Indian Academy of Sciences (India)

    A new coding scheme based on maximum entropy partitioning is proposed in our work, particularly to identify the improbable intensities related to different emotions. The improbable intensities, when used as a mask, decode the facial expression correctly, providing an effective platform for future emotion categorization ...

  12. Combining Experiments and Simulations Using the Maximum Entropy Principle

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-01-01

    in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results...

  13. Predicting the Outcome of NBA Playoffs Based on the Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    Ge Cheng

    2016-12-01

    Full Text Available Predicting the outcome of National Basketball Association (NBA matches poses a challenging problem of interest to the research community as well as the general public. In this article, we formalize the problem of predicting NBA game results as a classification problem and apply the principle of Maximum Entropy to construct an NBA Maximum Entropy (NBAME model that fits to discrete statistics for NBA games, and then predict the outcomes of NBA playoffs using the model. Our results reveal that the model is able to predict the winning team with 74.4% accuracy, outperforming other classical machine learning algorithms that could only afford a maximum prediction accuracy of 70.6% in the experiments that we performed.

  14. Dynamical maximum entropy approach to flocking.

    Science.gov (United States)

    Cavagna, Andrea; Giardina, Irene; Ginelli, Francesco; Mora, Thierry; Piovani, Duccio; Tavarone, Raffaele; Walczak, Aleksandra M

    2014-04-01

    We derive a new method to infer from data the out-of-equilibrium alignment dynamics of collectively moving animal groups, by considering the maximum entropy model distribution consistent with temporal and spatial correlations of flight direction. When bird neighborhoods evolve rapidly, this dynamical inference correctly learns the parameters of the model, while a static one relying only on the spatial correlations fails. When neighbors change slowly and the detailed balance is satisfied, we recover the static procedure. We demonstrate the validity of the method on simulated data. The approach is applicable to other systems of active matter.

  15. Maximum entropy production: Can it be used to constrain conceptual hydrological models?

    Science.gov (United States)

    M.C. Westhoff; E. Zehe

    2013-01-01

    In recent years, optimality principles have been proposed to constrain hydrological models. The principle of maximum entropy production (MEP) is one of the proposed principles and is the subject of this study. It states that a steady state system is organized in such a way that entropy production is maximized. Although successful applications have been reported in...

  16. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    Directory of Open Access Journals (Sweden)

    Ning-Cong Xiao

    2013-12-01

    Full Text Available In this paper a combination of the maximum entropy method and Bayesian inference for the reliability assessment of deteriorating systems is proposed. Due to various uncertainties, limited data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have been proved to be useful for deteriorating systems with small sample sizes. The maximum entropy approach can be used to calculate the maximum entropy density function of the uncertain parameters more accurately, since it does not need any additional information or assumptions. Finally, two optimization models are presented which can be used to determine the lower and upper bounds of the system's probability of failure under vague environmental conditions. Two numerical examples are investigated to demonstrate the proposed method.

  17. Weak scale from the maximum entropy principle

    Science.gov (United States)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2015-03-01

    The theory of the multiverse and wormholes suggests that the parameters of the Standard Model (SM) are fixed in such a way that the radiation of the S³ universe at the final stage, S_rad, becomes maximum, which we call the maximum entropy principle. Although it is difficult to confirm this principle generally, for a few parameters of the SM, we can check whether S_rad actually becomes maximum at the observed values. In this paper, we regard S_rad at the final stage as a function of the weak scale (the Higgs expectation value) v_h, and show that it becomes maximum around v_h = O(300 GeV) when the dimensionless couplings in the SM, i.e., the Higgs self-coupling, the gauge couplings, and the Yukawa couplings are fixed. Roughly speaking, we find that the weak scale is given by v_h ~ T_BBN² / (M_pl y_e⁵), where y_e is the Yukawa coupling of the electron, T_BBN is the temperature at which Big Bang nucleosynthesis starts, and M_pl is the Planck mass.
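    Plugging rough numbers into the quoted scaling illustrates the claimed order of magnitude (the inputs below are generic ballpark values inserted for illustration; the paper's precise conventions may differ):

        # Order-of-magnitude check of v_h ~ T_BBN**2 / (M_pl * y_e**5).
        # The inputs are generic ballpark values, not numbers taken from the paper.
        T_BBN = 1.0e-3      # GeV, roughly the temperature where BBN starts (~1 MeV)
        M_pl = 1.2e19       # GeV, Planck mass
        y_e = 2.9e-6        # electron Yukawa coupling, ~ sqrt(2)*m_e/v

        v_h = T_BBN**2 / (M_pl * y_e**5)
        print(f"v_h ~ {v_h:.0f} GeV")     # a few hundred GeV, consistent with O(300 GeV)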

  18. The prior-derived F constraints in the maximum-entropy method

    Czech Academy of Sciences Publication Activity Database

    Palatinus, Lukáš; van Smaalen, S.

    2005-01-01

    Roč. 61, - (2005), s. 363-372 ISSN 0108-7673 Institutional research plan: CEZ:AV0Z10100521 Keywords : charge density * maximum-entropy method * sodium nitrite Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.791, year: 2005

  19. Maximum non-extensive entropy block bootstrap for non-stationary processes

    Czech Academy of Sciences Publication Activity Database

    Bergamelli, M.; Novotný, Jan; Urga, G.

    2015-01-01

    Roč. 91, 1/2 (2015), s. 115-139 ISSN 0001-771X R&D Projects: GA ČR(CZ) GA14-27047S Institutional support: RVO:67985998 Keywords : maximum entropy * bootstrap * Monte Carlo simulations Subject RIV: AH - Economics

  20. Combined analysis of steady state and transient transport by the maximum entropy method

    Energy Technology Data Exchange (ETDEWEB)

    Giannone, L.; Stroth, U; Koellermeyer, J [Association Euratom-Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); and others

    1996-04-01

    A new maximum entropy approach has been applied to analyse three types of transient transport experiments. For sawtooth propagation experiments in the ASDEX Upgrade and ECRH power modulation and power-switching experiments in the Wendelstein 7-AS Stellarator, either the time evolution of the temperature perturbation or the phase and amplitude of the modulated temperature perturbation are used as non-linear constraints to the χ_e profile to be fitted. Simultaneously, the constraints given by the equilibrium temperature profile for steady-state power balance are fitted. In the maximum entropy formulation, the flattest χ_e profile consistent with the constraints is found. It was found that χ_e determined from sawtooth propagation was greater than the power balance value by a factor of five in the ASDEX Upgrade. From power modulation experiments, employing the measurements of four modulation frequencies simultaneously, the power deposition profile as well as the χ_e profile could be determined. A comparison of the predictions of a time-independent χ_e model and a power-dependent χ_e model is made. The power-switching experiments show that the χ_e profile must change within a millisecond to a new value consistent with the power balance value at the new input power. Neither power deposition broadening due to suprathermal electrons nor temperature or gradient dependences of χ_e can explain this observation. (author).

  1. Reconstruction of the electron momentum density distribution by the maximum entropy method

    International Nuclear Information System (INIS)

    Dobrzynski, L.

    1996-01-01

    The application of the Maximum Entropy Algorithm to the analysis of the Compton profiles is discussed. It is shown that the reconstruction of electron momentum density may be reliably carried out. However, there are a number of technical problems which have to be overcome in order to produce trustworthy results. In particular one needs the experimental Compton profiles measured for many directions, and to have efficient computational resources. The use of various cross-checks is recommended. (orig.)

  2. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    OpenAIRE

    Xiao, Ning-Cong; Li, Yan-Feng; Wang, Zhonglai; Peng, Weiwen; Huang, Hong-Zhong

    2013-01-01

    In this paper the combinations of maximum entropy method and Bayesian inference for reliability assessment of deteriorating system is proposed. Due to various uncertainties, less data and incomplete information, system parameters usually cannot be determined precisely. These uncertainty parameters can be modeled by fuzzy sets theory and the Bayesian inference which have been proved to be useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to cal...

  3. Spectrum unfolding in X-ray spectrometry using the maximum entropy method

    International Nuclear Information System (INIS)

    Fernandez, Jorge E.; Scot, Viviana; Di Giulio, Eugenio

    2014-01-01

    The solution of the unfolding problem is an ever-present issue in X-ray spectrometry. The maximum entropy technique solves this problem by taking advantage of some known a priori physical information and by ensuring an outcome with only positive values. This method is implemented in MAXED (MAXimum Entropy Deconvolution), a software code contained in the package UMG (Unfolding with MAXED and GRAVEL) developed at PTB and distributed by NEA Data Bank. This package contains also the code GRAVEL (used to estimate the precision of the solution). This article introduces the new code UMESTRAT (Unfolding Maximum Entropy STRATegy) which applies a semi-automatic strategy to solve the unfolding problem by using a suitable combination of MAXED and GRAVEL for applications in X-ray spectrometry. Some examples of the use of UMESTRAT are shown, demonstrating its capability to remove detector artifacts from the measured spectrum consistently with the model used for the detector response function (DRF). - Highlights: ► A new strategy to solve the unfolding problem in X-ray spectrometry is presented. ► The presented strategy uses a suitable combination of the codes MAXED and GRAVEL. ► The applied strategy provides additional information on the Detector Response Function. ► The code UMESTRAT is developed to apply this new strategy in a semi-automatic mode

  4. Maximum entropy method in momentum density reconstruction

    International Nuclear Information System (INIS)

    Dobrzynski, L.; Holas, A.

    1997-01-01

    The Maximum Entropy Method (MEM) is applied to the reconstruction of the 3-dimensional electron momentum density distributions observed through the set of Compton profiles measured along various crystallographic directions. It is shown that the reconstruction of the electron momentum density may be reliably carried out with the aid of a simple iterative algorithm suggested originally by Collins. A number of distributions have been simulated in order to check the performance of MEM. It is shown that MEM can be recommended as a model-free approach. (author). 13 refs, 1 fig

  5. Efficient reliability analysis of structures with the rotational quasi-symmetric point- and the maximum entropy methods

    Science.gov (United States)

    Xu, Jun; Dang, Chao; Kong, Fan

    2017-10-01

    This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.

  6. Feasible Histories, Maximum Entropy

    International Nuclear Information System (INIS)

    Pitowsky, I.

    1999-01-01

    We consider the broadest possible consistency condition for a family of histories, which extends all previous proposals. A family that satisfies this condition is called feasible. On each feasible family of histories we choose a probability measure by maximizing entropy, while keeping the probabilities of commuting histories to their quantum mechanical values. This procedure is justified by the assumption that decoherence increases entropy. Finally, a criterion for identifying the nearly classical families is proposed

  7. Applications of the principle of maximum entropy: from physics to ecology.

    Science.gov (United States)

    Banavar, Jayanth R; Maritan, Amos; Volkov, Igor

    2010-02-17

    There are numerous situations in physics and other disciplines which can be described at different levels of detail in terms of probability distributions. Such descriptions arise either intrinsically as in quantum mechanics, or because of the vast amount of details necessary for a complete description as, for example, in Brownian motion and in many-body systems. We show that an application of the principle of maximum entropy for estimating the underlying probability distribution can depend on the variables used for describing the system. The choice of characterization of the system carries with it implicit assumptions about fundamental attributes such as whether the system is classical or quantum mechanical or equivalently whether the individuals are distinguishable or indistinguishable. We show that the correct procedure entails the maximization of the relative entropy subject to known constraints and, additionally, requires knowledge of the behavior of the system in the absence of these constraints. We present an application of the principle of maximum entropy to understanding species diversity in ecology and introduce a new statistical ensemble corresponding to the distribution of a variable population of individuals into a set of species not defined a priori.

  8. Applications of the principle of maximum entropy: from physics to ecology

    International Nuclear Information System (INIS)

    Banavar, Jayanth R; Volkov, Igor; Maritan, Amos

    2010-01-01

    There are numerous situations in physics and other disciplines which can be described at different levels of detail in terms of probability distributions. Such descriptions arise either intrinsically as in quantum mechanics, or because of the vast amount of details necessary for a complete description as, for example, in Brownian motion and in many-body systems. We show that an application of the principle of maximum entropy for estimating the underlying probability distribution can depend on the variables used for describing the system. The choice of characterization of the system carries with it implicit assumptions about fundamental attributes such as whether the system is classical or quantum mechanical or equivalently whether the individuals are distinguishable or indistinguishable. We show that the correct procedure entails the maximization of the relative entropy subject to known constraints and, additionally, requires knowledge of the behavior of the system in the absence of these constraints. We present an application of the principle of maximum entropy to understanding species diversity in ecology and introduce a new statistical ensemble corresponding to the distribution of a variable population of individuals into a set of species not defined a priori. (topical review)

  9. Maximum entropy reconstruction of the configurational density of states from microcanonical simulations

    International Nuclear Information System (INIS)

    Davis, Sergio

    2013-01-01

    In this work we develop a method for inferring the underlying configurational density of states of a molecular system by combining information from several microcanonical molecular dynamics or Monte Carlo simulations at different energies. This method is based on Jaynes' Maximum Entropy formalism (MaxEnt) for Bayesian statistical inference under known expectation values. We present results of its application to measure thermodynamic entropy and free energy differences in embedded-atom models of metals.

  10. Stimulus-dependent maximum entropy models of neural population codes.

    Directory of Open Access Journals (Sweden)

    Einat Granot-Atedgi

    Full Text Available Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model, a minimal extension of the canonical linear-nonlinear model of a single neuron to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single cell responses and in particular significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population.

  11. Spectrum unfolding, sensitivity analysis and propagation of uncertainties with the maximum entropy deconvolution code MAXED

    CERN Document Server

    Reginatto, M; Neumann, S

    2002-01-01

    MAXED was developed to apply the maximum entropy principle to the unfolding of neutron spectrometric measurements. The approach followed in MAXED has several features that make it attractive: it permits inclusion of a priori information in a well-defined and mathematically consistent way, the algorithm used to derive the solution spectrum is not ad hoc (it can be justified on the basis of arguments that originate in information theory), and the solution spectrum is a non-negative function that can be written in closed form. This last feature permits the use of standard methods for the sensitivity analysis and propagation of uncertainties of MAXED solution spectra. We illustrate its use with unfoldings of NE 213 scintillation detector measurements of photon calibration spectra, and of multisphere neutron spectrometer measurements of cosmic-ray induced neutrons at high altitude (approx 20 km) in the atmosphere.
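
    The record above describes entropy maximization under a chi-square constraint on the measurements. As a rough illustration of that idea (not of the MAXED code itself), the following Python sketch unfolds a toy spectrum by maximizing a Skilling-style relative entropy with respect to a default spectrum, subject to chi-square equalling the number of measurement channels; the response matrix, default spectrum and noise model are all invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem: 5 detector channels, 20 energy bins. All numbers are illustrative.
rng = np.random.default_rng(0)
R = rng.uniform(0.0, 1.0, size=(5, 20))                   # response functions
phi_true = np.exp(-0.5 * ((np.arange(20) - 8) / 3.0) ** 2)
sigma = 0.05 * (R @ phi_true) + 1e-3                       # measurement uncertainties
m = R @ phi_true + rng.normal(0.0, sigma)                  # noisy measurements

phi_def = np.full(20, phi_true.mean())                     # a priori (default) spectrum

def neg_entropy(phi):
    # Negative relative entropy of phi with respect to the default spectrum
    return -np.sum(phi - phi_def - phi * np.log(phi / phi_def))

def chi2(phi):
    return np.sum(((m - R @ phi) / sigma) ** 2)

# Maximum-entropy unfolding: maximize entropy subject to chi^2 = number of channels.
cons = {"type": "eq", "fun": lambda phi: chi2(phi) - len(m)}
res = minimize(neg_entropy, phi_def, bounds=[(1e-9, None)] * 20, constraints=cons)
print("unfolded spectrum:", np.round(res.x, 3))
```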

  12. Maximum entropy tokamak configurations

    International Nuclear Information System (INIS)

    Minardi, E.

    1989-01-01

    The new entropy concept for the collective magnetic equilibria is applied to the description of the states of a tokamak subject to ohmic and auxiliary heating. The condition for the existence of steady state plasma states with vanishing entropy production implies, on one hand, the resilience of specific current density profiles and, on the other, severe restrictions on the scaling of the confinement time with power and current. These restrictions are consistent with Goldston scaling and with the existence of a heat pinch. (author)

  13. Maximum entropy methods for extracting the learned features of deep neural networks.

    Science.gov (United States)

    Finnegan, Alex; Song, Jun S

    2017-10-01

    New architectures of multilayer artificial neural networks and new methods for training them are rapidly revolutionizing the application of machine learning in diverse fields, including business, social science, physical sciences, and biology. Interpreting deep neural networks, however, currently remains elusive, and a critical challenge lies in understanding which meaningful features a network is actually learning. We present a general method for interpreting deep neural networks and extracting network-learned features from input data. We describe our algorithm in the context of biological sequence analysis. Our approach, based on ideas from statistical physics, samples from the maximum entropy distribution over possible sequences, anchored at an input sequence and subject to constraints implied by the empirical function learned by a network. Using our framework, we demonstrate that local transcription factor binding motifs can be identified from a network trained on ChIP-seq data and that nucleosome positioning signals are indeed learned by a network trained on chemical cleavage nucleosome maps. Imposing a further constraint on the maximum entropy distribution also allows us to probe whether a network is learning global sequence features, such as the high GC content in nucleosome-rich regions. This work thus provides valuable mathematical tools for interpreting and extracting learned features from feed-forward neural networks.

  14. Halo-independence with quantified maximum entropy at DAMA/LIBRA

    Energy Technology Data Exchange (ETDEWEB)

    Fowlie, Andrew, E-mail: andrew.j.fowlie@googlemail.com [ARC Centre of Excellence for Particle Physics at the Tera-scale, Monash University, Melbourne, Victoria 3800 (Australia)

    2017-10-01

    Using the DAMA/LIBRA anomaly as an example, we formalise the notion of halo-independence in the context of Bayesian statistics and quantified maximum entropy. We consider an infinite set of possible profiles, weighted by an entropic prior and constrained by a likelihood describing noisy measurements of modulated moments by DAMA/LIBRA. Assuming an isotropic dark matter (DM) profile in the galactic rest frame, we find the most plausible DM profiles and predictions for unmodulated signal rates at DAMA/LIBRA. The entropic prior contains an a priori unknown regularisation factor, β, that describes the strength of our conviction that the profile is approximately Maxwellian. By varying β, we smoothly interpolate between a halo-independent and a halo-dependent analysis, thus exploring the impact of prior information about the DM profile.

  15. Analysis of positron lifetime spectra using quantified maximum entropy and a general linear filter

    International Nuclear Information System (INIS)

    Shukla, A.; Peter, M.; Hoffmann, L.

    1993-01-01

    Two new approaches are used to analyze positron annihilation lifetime spectra. A general linear filter is designed to filter the noise from lifetime data. The quantified maximum entropy method is used to solve the inverse problem of finding the lifetimes and intensities present in data. We determine optimal values of parameters needed for fitting using Bayesian methods. Estimates of errors are provided. We present results on simulated and experimental data with extensive tests to show the utility of this method and compare it with other existing methods. (orig.)

  16. Nonequilibrium thermodynamics and maximum entropy production in the Earth system: applications and implications.

    Science.gov (United States)

    Kleidon, Axel

    2009-06-01

    The Earth system is maintained in a unique state far from thermodynamic equilibrium, as, for instance, reflected in the high concentration of reactive oxygen in the atmosphere. The myriad of processes that transform energy, that result in the motion of mass in the atmosphere, in oceans, and on land, processes that drive the global water, carbon, and other biogeochemical cycles, all have in common that they are irreversible in their nature. Entropy production is a general consequence of these processes and measures their degree of irreversibility. The proposed principle of maximum entropy production (MEP) states that systems are driven to steady states in which they produce entropy at the maximum possible rate given the prevailing constraints. In this review, the basics of nonequilibrium thermodynamics are described, as well as how these apply to Earth system processes. Applications of the MEP principle are discussed, ranging from the strength of the atmospheric circulation, the hydrological cycle, and biogeochemical cycles to the role that life plays in these processes. Nonequilibrium thermodynamics and the MEP principle have potentially wide-ranging implications for our understanding of Earth system functioning, how it has evolved in the past, and why it is habitable. Entropy production allows us to quantify an objective direction of Earth system change (closer to vs further away from thermodynamic equilibrium, or, equivalently, towards a state of MEP). When a maximum in entropy production is reached, MEP implies that the Earth system reacts to perturbations primarily with negative feedbacks. In conclusion, this nonequilibrium thermodynamic view of the Earth system shows great promise to establish a holistic description of the Earth as one system. This perspective is likely to allow us to better understand and predict its function as one entity, how it has evolved in the past, and how it is modified by human activities in the future.

  17. Incommensurate modulations made visible by the Maximum Entropy Method in superspace

    Czech Academy of Sciences Publication Activity Database

    Palatinus, Lukáš; van Smaalen, S.

    2004-01-01

    Roč. 219 (2004), s. 719-729. ISSN 0044-2968. Grant - others: DFG (DE). Institutional research plan: CEZ:AV0Z1010914. Keywords: Maximum Entropy Method; modulated structures; charge density. Subject RIV: BM - Solid Matter Physics; Magnetism. Impact factor: 1.390, year: 2004

  18. Comparison of tomographic reconstruction by maximum entropy and filtered back-projection

    International Nuclear Information System (INIS)

    Abdala, F.J.P.; Simpson, D.M.; Roberty, N.C.

    1992-01-01

    Tomographic reconstruction from few projections is studied by comparing the maximum entropy method with filtered back-projection. Simulations with and without noise, and with a high-density object placed inside the skull, are shown. (C.G.C.)

  19. Can the maximum entropy principle be explained as a consistency requirement?

    NARCIS (Netherlands)

    Uffink, J.

    1997-01-01

    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in

  20. Derivation of some new distributions in statistical mechanics using maximum entropy approach

    Directory of Open Access Journals (Sweden)

    Ray Amritansu

    2014-01-01

    Full Text Available The maximum entropy principle has been used earlier to derive the Bose-Einstein (B.E.), Fermi-Dirac (F.D.) and Intermediate Statistics (I.S.) distributions of statistical mechanics. The central idea of these distributions is to predict the distribution of the microstates, which are the particles of the system, on the basis of the knowledge of some macroscopic data. The latter information is specified in the form of some simple moment constraints. One distribution differs from the other in the way in which the constraints are specified. In the present paper, we have derived some new distributions similar to the B.E. and F.D. distributions of statistical mechanics by using the maximum entropy principle. Some proofs of the B.E. and F.D. distributions are shown, and at the end some new results are discussed.
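
    For reference, the standard maximum-entropy route to the Bose-Einstein distribution mentioned in the record above runs as follows (a textbook sketch, not the paper's derivation); replacing the bosonic counting entropy by its fermionic counterpart yields the Fermi-Dirac distribution in the same way.

```latex
% Maximize the occupation entropy of n_i particles distributed over g_i states per level,
% subject to fixed total particle number N and total energy E.
\begin{align}
  S[\{n_i\}] &= \sum_i \left[(n_i+g_i)\ln(n_i+g_i) - n_i\ln n_i - g_i\ln g_i\right],\\
  \sum_i n_i &= N, \qquad \sum_i n_i\,\varepsilon_i = E .
\end{align}
% Stationarity of S - \alpha\sum_i n_i - \beta\sum_i n_i\varepsilon_i with respect to n_i gives
\begin{equation}
  \ln\frac{n_i+g_i}{n_i} = \alpha + \beta\varepsilon_i
  \quad\Longrightarrow\quad
  n_i = \frac{g_i}{e^{\alpha+\beta\varepsilon_i}-1}\,,
\end{equation}
% which is the Bose-Einstein distribution, with \alpha and \beta fixed by the two constraints.
```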

  1. Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics

    Science.gov (United States)

    Abe, Sumiyoshi

    2014-11-01

    The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.

  2. Bayesian interpretation of Generalized empirical likelihood by maximum entropy

    OpenAIRE

    Rochet , Paul

    2011-01-01

    We study a parametric estimation problem related to moment condition models. As an alternative to the generalized empirical likelihood (GEL) and the generalized method of moments (GMM), a Bayesian approach to the problem can be adopted, extending the MEM procedure to parametric moment conditions. We show in particular that a large number of GEL estimators can be interpreted as a maximum entropy solution. Moreover, we provide a more general field of applications by proving the method to be robust...

  3. Maximum Entropy Closure of Balance Equations for Miniband Semiconductor Superlattices

    Directory of Open Access Journals (Sweden)

    Luis L. Bonilla

    2016-07-01

    Full Text Available Charge transport in nanosized electronic systems is described by semiclassical or quantum kinetic equations that are often costly to solve numerically and difficult to reduce systematically to macroscopic balance equations for densities, currents, temperatures and other moments of macroscopic variables. The maximum entropy principle can be used to close the system of equations for the moments, but its accuracy and range of validity are not always clear. In this paper, we compare numerical solutions of balance equations for nonlinear electron transport in semiconductor superlattices. The equations have been obtained from Boltzmann–Poisson kinetic equations very far from equilibrium for strong fields, either by the maximum entropy principle or by a systematic Chapman–Enskog perturbation procedure. Both approaches produce the same current-voltage characteristic curve for uniform fields. When the superlattices are DC voltage biased in a region where there are stable time-periodic solutions corresponding to recycling and motion of electric field pulses, the differences between the numerical solutions of the two types of balance equations are smaller than the expansion parameter used in the perturbation procedure. These results and possible new research avenues are discussed.

  4. On the maximum-entropy method for kinetic equation of radiation, particle and gas

    International Nuclear Information System (INIS)

    El-Wakil, S.A.; Madkour, M.A.; Degheidy, A.R.; Machali, H.M.

    1995-01-01

    The maximum-entropy approach is used to calculate some problems in radiative transfer and reactor physics such as the escape probability, the emergent and transmitted intensities for a finite slab as well as the emergent intensity for a semi-infinite medium. Also, it is employed to solve problems involving spherical geometry, such as luminosity (the total energy emitted by a sphere), neutron capture probability and the albedo problem. The technique is also employed in the kinetic theory of gases to calculate the Poiseuille flow and thermal creep of a rarefied gas between two plates. Numerical calculations are carried out and compared with the published data. The comparisons demonstrate that the maximum-entropy results are in good agreement with the exact ones. (orig.)

  5. Maximum entropy and Bayesian methods

    International Nuclear Information System (INIS)

    Smith, C.R.; Erickson, G.J.; Neudorfer, P.O.

    1992-01-01

    Bayesian probability theory and Maximum Entropy methods are at the core of a new view of scientific inference. These 'new' ideas, along with the revolution in computational methods afforded by modern computers allow astronomers, electrical engineers, image processors of any type, NMR chemists and physicists, and anyone at all who has to deal with incomplete and noisy data, to take advantage of methods that, in the past, have been applied only in some areas of theoretical physics. The title workshops have been the focus of a group of researchers from many different fields, and this diversity is evident in this book. There are tutorial and theoretical papers, and applications in a very wide variety of fields. Almost any instance of dealing with incomplete and noisy data can be usefully treated by these methods, and many areas of theoretical research are being enhanced by the thoughtful application of Bayes' theorem. Contributions contained in this volume present a state-of-the-art overview that will be influential and useful for many years to come

  6. ON A GENERALIZATION OF THE MAXIMUM ENTROPY THEOREM OF BURG

    Directory of Open Access Journals (Sweden)

    JOSÉ MARCANO

    2017-01-01

    Full Text Available In this article we introduce some matrix manipulations that allow us to obtain a version of the original Christoffel-Darboux formula, which is of interest in many applications of linear algebra. Using these matrix developments and Jensen's inequality, we obtain the main result of this proposal, which is the generalization of the maximum entropy theorem of Burg for multivariate processes.

  7. Spectral density analysis of time correlation functions in lattice QCD using the maximum entropy method

    International Nuclear Information System (INIS)

    Fiebig, H. Rudolf

    2002-01-01

    We study various aspects of extracting spectral information from time correlation functions of lattice QCD by means of Bayesian inference with an entropic prior, the maximum entropy method (MEM). Correlator functions of a heavy-light meson-meson system serve as a repository for lattice data with diverse statistical quality. Attention is given to spectral mass density functions, inferred from the data, and their dependence on the parameters of the MEM. We propose to employ simulated annealing, or cooling, to solve the Bayesian inference problem, and discuss the practical issues of the approach

  8. Maximum Entropy Approach in Dynamic Contrast-Enhanced Magnetic Resonance Imaging.

    Science.gov (United States)

    Farsani, Zahra Amini; Schmid, Volker J

    2017-01-01

    In the estimation of physiological kinetic parameters from Dynamic Contrast-Enhanced Magnetic Resonance Imaging (DCE-MRI) data, the determination of the arterial input function (AIF) plays a key role. This paper proposes a Bayesian method to estimate the physiological parameters of DCE-MRI along with the AIF in situations where no measurement of the AIF is available. In the proposed algorithm, the maximum entropy method (MEM) is combined with the maximum a posteriori (MAP) approach. To this end, MEM is used to specify a prior probability distribution of the unknown AIF. The ability of this method to estimate the AIF is validated using the Kullback-Leibler divergence. Subsequently, the kinetic parameters can be estimated with MAP. The proposed algorithm is evaluated with a data set from a breast cancer MRI study. The application shows that the AIF can reliably be determined from the DCE-MRI data using MEM. Kinetic parameters can be estimated subsequently. The maximum entropy method is a powerful tool for reconstructing images from many types of data and for generating probability distributions from given information. The proposed method gives an alternative way to assess the input function from the existing data, allows a good fit of the data and therefore a better estimation of the kinetic parameters. In the end, this allows for a more reliable use of DCE-MRI.

  9. Uncertainty estimation of the self-thinning process by Maximum-Entropy Principle

    Science.gov (United States)

    Shoufan Fang; George Z. Gertner

    2000-01-01

    When available information is scarce, the Maximum-Entropy Principle can estimate the distributions of parameters. In our case study, we estimated the distributions of the parameters of the forest self-thinning process based on literature information, and we derived the conditional distribution functions and estimated the 95 percent confidence interval (CI) of the self-thinning ...

  10. Classic maximum entropy recovery of the average joint distribution of apparent FRET efficiency and fluorescence photons for single-molecule burst measurements.

    Science.gov (United States)

    DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K

    2012-04-05

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.

  11. Test the principle of maximum entropy in constant sum 2×2 game: Evidence in experimental economics

    International Nuclear Information System (INIS)

    Xu, Bin; Zhang, Hongen; Wang, Zhijian; Zhang, Jianbo

    2012-01-01

    By using laboratory experimental data, we test the uncertainty of strategy type in various competing environments with two-person constant sum 2×2 games in the social system. The results first show that, in these competing game environments, the outcome of human decision-making obeys the principle of maximum entropy. -- Highlights: ► Test the uncertainty in two-person constant sum games with experimental data. ► On the game level, the constant sum game fits the principle of maximum entropy. ► On the group level, all empirical entropy values are close to the theoretical maxima. ► The results can be different for games that are not constant sum games.
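
    A minimal sketch of the kind of check reported above: compute the Shannon entropy of the observed frequencies of the four strategy profiles of a 2×2 game and compare it with the theoretical maximum (the counts below are invented for illustration).

```python
import numpy as np

# Hypothetical observed counts of the four strategy profiles (row choice x column choice)
counts = np.array([52, 48, 51, 49], dtype=float)
p = counts / counts.sum()

empirical_entropy = -np.sum(p * np.log2(p))   # Shannon entropy of observed play, in bits
theoretical_max = np.log2(len(p))             # maximum entropy: uniform over the 4 profiles

print(f"empirical entropy   = {empirical_entropy:.3f} bits")
print(f"theoretical maximum = {theoretical_max:.3f} bits")
```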

  12. Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains

    Directory of Open Access Journals (Sweden)

    Erik Van der Straeten

    2009-11-01

    Full Text Available In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.

  13. Maximum entropy restoration of laser fusion target x-ray photographs

    International Nuclear Information System (INIS)

    Brolley, J.E.; Lazarus, R.B.; Suydam, B.R.

    1976-01-01

    Maximum entropy principles were used to analyze the microdensitometer traces of a laser-fusion target photograph. The object is a glowing laser-fusion target microsphere 0.95 cm from a pinhole of radius 2 × 10^-4 cm, the image is 7.2 cm from the pinhole, and the photon wavelength is likely to be 6.2 × 10^-8 cm. Some computational aspects of the problem are also considered

  14. Robust optimum design with maximum entropy method; Saidai entropy ho mochiita robust sei saitekika sekkeiho

    Energy Technology Data Exchange (ETDEWEB)

    Kawaguchi, K; Egashira, Y; Watanabe, G [Mazda Motor Corp., Hiroshima (Japan)

    1997-10-01

    Vehicle and unit performance varies according not only to external causes, represented by the environment such as temperature or weather, but also to internal causes such as dispersion of component characteristics, manufacturing processes, and ageing deterioration. We developed a design method that estimates such performance distributions with the maximum entropy method and calculates specifications with high performance robustness using fuzzy theory. This paper describes the details of these methods and examples applied to a power window system. 3 refs., 7 figs., 4 tabs.

  15. Direct comparison of phase-sensitive vibrational sum frequency generation with maximum entropy method: case study of water.

    Science.gov (United States)

    de Beer, Alex G F; Samson, Jean-Sebastièn; Hua, Wei; Huang, Zishuai; Chen, Xiangke; Allen, Heather C; Roke, Sylvie

    2011-12-14

    We present a direct comparison of phase sensitive sum-frequency generation experiments with phase reconstruction obtained by the maximum entropy method. We show that both methods lead to the same complex spectrum. Furthermore, we discuss the strengths and weaknesses of each of these methods, analyzing possible sources of experimental and analytical errors. A simulation program for maximum entropy phase reconstruction is available at: http://lbp.epfl.ch/. © 2011 American Institute of Physics

  16. Modeling of the Maximum Entropy Problem as an Optimal Control Problem and its Application to Pdf Estimation of Electricity Price

    Directory of Open Access Journals (Sweden)

    M. E. Haji Abadi

    2013-09-01

    Full Text Available In this paper, continuous optimal control theory is used to model and solve the maximum entropy problem for a continuous random variable. The maximum entropy principle provides a method to obtain the least-biased probability density function (PDF) estimate. In this paper, to find a closed form solution for the maximum entropy problem with any number of moment constraints, the entropy is considered as a functional measure and the moment constraints are considered as the state equations. Therefore, the PDF estimation problem can be reformulated as an optimal control problem. Finally, the proposed method is applied to estimate the PDF of the hourly electricity prices of the New England and Ontario electricity markets. The obtained results show the efficiency of the proposed method.
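
    The closed form referred to above is the usual exponential-family solution p(x) ∝ exp(-Σ_k λ_k x^k). A minimal numerical sketch of that classical moment-constrained estimate (not the paper's optimal-control algorithm, and with invented target moments) is:

```python
import numpy as np
from scipy.optimize import minimize

# Classical moment-constrained MaxEnt density on [0, 1]:
# p(x) = exp(-lam_1 x - lam_2 x^2) / Z, with lam chosen so that the first two
# moments match assumed targets. Targets and grid are illustrative only.
x = np.linspace(0.0, 1.0, 401)
dx = x[1] - x[0]
powers = np.array([1, 2])
target_moments = np.array([0.40, 0.20])                  # assumed E[x], E[x^2]

def dual(lam):
    # Convex dual objective: log-partition function plus lam . target_moments;
    # its minimizer yields the maximum-entropy Lagrange multipliers.
    z = np.sum(np.exp(-(x[:, None] ** powers) @ lam)) * dx
    return np.log(z) + lam @ target_moments

lam = minimize(dual, np.zeros(2), method="BFGS").x       # MaxEnt multipliers
p = np.exp(-(x[:, None] ** powers) @ lam)
p /= p.sum() * dx                                        # normalize to a proper pdf

print("recovered moments:", [float(np.sum(x ** k * p) * dx) for k in powers])
```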

  17. On the equivalence between the minimum entropy generation rate and the maximum conversion rate for a reactive system

    International Nuclear Information System (INIS)

    Bispo, Heleno; Silva, Nilton; Brito, Romildo; Manzi, João

    2013-01-01

    Highlights: • The minimum entropy generation (MEG) principle improved the reaction performance. • The equivalence between the MEG rate and the maximum conversion rate is analyzed. • Temperature and residence time are used to establish the domain of validity of MEG. • Satisfying the temperature and residence time relationship results in optimal performance. - Abstract: The analysis of the equivalence between the minimum entropy generation (MEG) rate and the maximum conversion rate for a reactive system is the main purpose of this paper. While being used as a strategy of optimization, the minimum entropy production was applied to the production of propylene glycol in a Continuous Stirred-Tank Reactor (CSTR) with a view to determining the best operating conditions, and under such conditions, a high conversion rate was found. The effects of the key variables and restrictions on the validity domain of MEG were investigated, which raises issues that are included within a broad discussion. The results from simulations indicate that, from the chemical reaction standpoint, a maximum conversion rate can be considered as equivalent to MEG. Such a result can be clearly explained by examining the classical Maxwell–Boltzmann distribution: under the MEG-rate condition the molecules of the reactive system present an energy distribution with reduced dispersion, resulting in better-quality collisions between molecules and hence a higher conversion rate

  18. Maximum entropy networks are more controllable than preferential attachment networks

    International Nuclear Information System (INIS)

    Hou, Lvlin; Small, Michael; Lao, Songyang

    2014-01-01

    A maximum entropy (ME) method to generate typical scale-free networks has been recently introduced. We investigate the controllability of ME networks and Barabási–Albert preferential attachment networks. Our experimental results show that ME networks are significantly more easily controlled than BA networks of the same size and the same degree distribution. Moreover, the control profiles are used to provide insight into the control properties of both classes of network. We identify and classify the driver nodes and analyze the connectivity of their neighbors. We find that driver nodes in ME networks have fewer mutual neighbors and that their neighbors have lower average degree. We conclude that the properties of the neighbors of driver nodes sensitively affect the network controllability. Hence, subtle and important structural differences exist between BA networks and typical scale-free networks of the same degree distribution. - Highlights: • The controllability of maximum entropy (ME) and Barabási–Albert (BA) networks is investigated. • ME networks are significantly more easily controlled than BA networks of the same degree distribution. • The properties of the neighbors of driver nodes sensitively affect the network controllability. • Subtle and important structural differences exist between BA networks and typical scale-free networks

  19. Estimation of Lithological Classification in Taipei Basin: A Bayesian Maximum Entropy Method

    Science.gov (United States)

    Wu, Meng-Ting; Lin, Yuan-Chien; Yu, Hwa-Lung

    2015-04-01

    In environmental and other scientific applications, we must have a certain understanding of the geological lithological composition. Because of restrictions imposed by real conditions, only a limited amount of data can be acquired. To find out the lithological distribution in the study area, many spatial statistical methods are used to estimate the lithological composition at unsampled points or grids. This study applied the Bayesian Maximum Entropy (BME) method, which is an emerging method in the field of geological spatiotemporal statistics. The BME method can identify the spatiotemporal correlation of the data, and combine not only the hard data but also the soft data to improve estimation. The data of lithological classification are discrete categorical data. Therefore, this research applied categorical BME to establish a complete three-dimensional lithological estimation model. We apply the limited hard data from the cores and the soft data generated from the geological dating data and the virtual wells to estimate the three-dimensional lithological classification in the Taipei Basin. Keywords: Categorical Bayesian Maximum Entropy method, Lithological Classification, Hydrogeological Setting

  20. Test the principle of maximum entropy in constant sum 2×2 game: Evidence in experimental economics

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Bin, E-mail: xubin211@zju.edu.cn [Experimental Social Science Laboratory, Zhejiang University, Hangzhou, 310058 (China); Public Administration College, Zhejiang Gongshang University, Hangzhou, 310018 (China); Zhang, Hongen, E-mail: hongen777@163.com [Department of Physics, Zhejiang University, Hangzhou, 310027 (China); Wang, Zhijian, E-mail: wangzj@zju.edu.cn [Experimental Social Science Laboratory, Zhejiang University, Hangzhou, 310058 (China); Zhang, Jianbo, E-mail: jbzhang08@zju.edu.cn [Department of Physics, Zhejiang University, Hangzhou, 310027 (China)

    2012-03-19

    By using laboratory experimental data, we test the uncertainty of strategy type in various competing environments with two-person constant sum 2×2 games in the social system. The results first show that, in these competing game environments, the outcome of human decision-making obeys the principle of maximum entropy. -- Highlights: ► Test the uncertainty in two-person constant sum games with experimental data. ► On the game level, the constant sum game fits the principle of maximum entropy. ► On the group level, all empirical entropy values are close to the theoretical maxima. ► The results can be different for games that are not constant sum games.

  1. Deconvolution in the presence of noise using the Maximum Entropy Principle

    International Nuclear Information System (INIS)

    Steenstrup, S.

    1984-01-01

    The main problem in deconvolution in the presence of noise is the nonuniqueness. This problem is overcome by the application of the Maximum Entropy Principle. The way the noise enters the formulation of the problem is examined in some detail, and the final equations are derived such that the necessary assumptions become explicit. Examples using X-ray diffraction data are shown. (orig.)

  2. Dynamics of non-stationary processes that follow the maximum of the Rényi entropy principle.

    Science.gov (United States)

    Shalymov, Dmitry S; Fradkov, Alexander L

    2016-01-01

    We propose dynamics equations which describe the behaviour of non-stationary processes that follow the maximum Rényi entropy principle. The equations are derived on the basis of the speed-gradient principle originated in the control theory. The maximum of the Rényi entropy principle is analysed for discrete and continuous cases, and both a discrete random variable and probability density function (PDF) are used. We consider mass conservation and energy conservation constraints and demonstrate the uniqueness of the limit distribution and asymptotic convergence of the PDF for both cases. The coincidence of the limit distribution of the proposed equations with the Rényi distribution is examined.

  3. Application of Bayesian Maximum Entropy Filter in parameter calibration of groundwater flow model in PingTung Plain

    Science.gov (United States)

    Cheung, Shao-Yong; Lee, Chieh-Han; Yu, Hwa-Lung

    2017-04-01

    Due to limited hydrogeological observation data and the high levels of uncertainty within them, parameter estimation for groundwater models has been an important issue. There are many methods of parameter estimation; for example, the Kalman filter provides real-time calibration of parameters through measurements at groundwater monitoring wells, and related methods such as the Extended Kalman Filter and the Ensemble Kalman Filter are widely applied in groundwater research. However, Kalman-filter methods are limited to linear (or linearized) systems. This study proposes a novel method, Bayesian Maximum Entropy Filtering, which can take the uncertainty of the data into account during parameter estimation. With these two methods, we can estimate parameters from hard data (certain) and soft data (uncertain) at the same time. We use Python and QGIS with the groundwater model MODFLOW, and implement both the Extended Kalman Filter and Bayesian Maximum Entropy Filtering in Python for parameter estimation. The study was conducted as a numerical model experiment that combines the Bayesian maximum entropy filter with a hypothetical MODFLOW groundwater model, using virtual observation wells to observe the simulated groundwater system periodically. The results showed that, by considering the uncertainty of the data, the Bayesian maximum entropy filter provides good real-time parameter estimates.
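
    As a rough illustration of the sequential (Kalman-type) parameter update mentioned above - not the Bayesian Maximum Entropy filter itself, and with an invented observation operator standing in for a MODFLOW run - a scalar extended Kalman update looks like this:

```python
import numpy as np

# Minimal sketch of a sequential (extended Kalman) update of one model parameter
# from head observations. h(theta) is a stand-in observation operator.
def h(theta):                      # simulated head at an observation well
    return 10.0 - 2.0 * np.log(theta)

def dh_dtheta(theta):              # Jacobian of the observation operator
    return -2.0 / theta

theta, P = 5.0, 4.0                # prior mean and variance of the parameter
R = 0.25                           # observation-error variance
observations = [6.9, 7.1, 7.0]     # illustrative measured heads

for z in observations:
    H = dh_dtheta(theta)
    K = P * H / (H * P * H + R)    # Kalman gain
    theta = theta + K * (z - h(theta))
    P = (1.0 - K * H) * P
    print(f"theta = {theta:.3f}, variance = {P:.3f}")
```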

  4. Application of the maximum entropy production principle to electrical systems

    International Nuclear Information System (INIS)

    Christen, Thomas

    2006-01-01

    For a simple class of electrical systems, the principle of the maximum entropy production rate (MaxEP) is discussed. First, we compare the MaxEP principle and the principle of the minimum entropy production rate and illustrate the superiority of the MaxEP principle for the example of two parallel constant resistors. Secondly, we show that the Steenbeck principle for the electric arc as well as the ohmic contact behaviour of space-charge limited conductors follow from the MaxEP principle. In line with work by Dewar, the investigations seem to suggest that the MaxEP principle can also be applied to systems far from equilibrium, provided appropriate information is available that enters the constraints of the optimization problem. Finally, we apply the MaxEP principle to a mesoscopic system and show that the universal conductance quantum, e^2/h, of a one-dimensional ballistic conductor can be estimated

  5. Mixed memory, (non) Hurst effect, and maximum entropy of rainfall in the tropical Andes

    Science.gov (United States)

    Poveda, Germán

    2011-02-01

    Diverse linear and nonlinear statistical parameters of rainfall under aggregation in time and the kind of temporal memory are investigated. Data sets from the Andes of Colombia at different resolutions (15 min and 1 h) and record lengths (21 months and 8-40 years) are used. A mixture of two timescales is found in the autocorrelation and autoinformation functions, with short-term memory holding for time lags less than 15-30 min, and long-term memory onwards. Consistently, rainfall variance exhibits different temporal scaling regimes separated at 15-30 min and 24 h. Tests for the Hurst effect evidence the frailty of the R/S approach in discerning the kind of memory in high-resolution rainfall, whereas rigorous statistical tests for short-memory processes do reject the existence of the Hurst effect. Rainfall information entropy grows as a power law of aggregation time, S(T) ~ T^β with β = 0.51, up to a timescale T_MaxEnt (70-202 h) at which entropy saturates, with β = 0 onwards. Maximum entropy is reached through a dynamic Generalized Pareto distribution, consistently with the maximum information-entropy principle for heavy-tailed random variables, and with its asymptotically infinitely divisible property. The dynamics towards the limit distribution is quantified. Tsallis q-entropies also exhibit power laws with T, such that S_q(T) ~ T^β(q), with β(q) ⩽ 0 for q ⩽ 0, and β(q) ≃ 0.5 for q ⩾ 1. No clear patterns are found in the geographic distribution within and among the statistical parameters studied, confirming the strong variability of tropical Andean rainfall.
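
    A sketch of the entropy-versus-aggregation analysis described above (synthetic data standing in for the Andean rainfall records; the fitted exponent will not reproduce the reported β = 0.51):

```python
import numpy as np

rng = np.random.default_rng(1)
rain = rng.gamma(shape=0.2, scale=2.0, size=2 ** 16)      # intermittent-looking toy series

def shannon_entropy(values, bins=64):
    # Shannon entropy (nats) of the histogram of aggregated rainfall amounts
    hist, _ = np.histogram(values, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log(p))

windows = np.array([1, 2, 4, 8, 16, 32, 64])              # aggregation lengths, in time steps
S = [shannon_entropy(rain[: rain.size // T * T].reshape(-1, T).sum(axis=1)) for T in windows]

beta = np.polyfit(np.log(windows), np.log(S), 1)[0]       # slope of log S(T) versus log T
print("estimated scaling exponent beta =", round(float(beta), 3))
```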

  6. PNNL: A Supervised Maximum Entropy Approach to Word Sense Disambiguation

    Energy Technology Data Exchange (ETDEWEB)

    Tratz, Stephen C.; Sanfilippo, Antonio P.; Gregory, Michelle L.; Chappell, Alan R.; Posse, Christian; Whitney, Paul D.

    2007-06-23

    In this paper, we describe the PNNL Word Sense Disambiguation system as applied to the English All-Words task in SemEval 2007. We use a supervised learning approach, employing a large number of features and using Information Gain for dimension reduction. Our Maximum Entropy approach combined with a rich set of features produced results that are significantly better than baseline and are the highest F-score for the fine-grained English All-Words subtask.

  7. Physical entropy, information entropy and their evolution equations

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Inspired by the evolution equation of nonequilibrium statistical physics entropy and the concise statistical formula of the entropy production rate, we develop a theory of the dynamic information entropy and build a nonlinear evolution equation of the information entropy density changing in time and state variable space. Its mathematical form and physical meaning are similar to those of the evolution equation of the physical entropy: the time rate of change of the information entropy density arises jointly from drift, diffusion and production. The concise statistical formula of the information entropy production rate is likewise similar to that of the physical entropy. Furthermore, we study the similarities and differences between physical entropy and information entropy and the possible unification of the two statistical entropies, and discuss the relationship among the principle of entropy increase, the principle of equilibrium maximum entropy and the principle of maximum information entropy, as well as the connection between them and the entropy evolution equation.

  8. Venus atmosphere profile from a maximum entropy principle

    Directory of Open Access Journals (Sweden)

    L. N. Epele

    2007-10-01

    Full Text Available The variational method with constraints recently developed by Verkley and Gerkema to describe maximum-entropy atmospheric profiles is generalized to ideal gases but with temperature-dependent specific heats. In so doing, an extended and non-standard potential temperature is introduced that is well suited for tackling the problem under consideration. This new formalism is successfully applied to the atmosphere of Venus. Three well defined regions emerge in this atmosphere up to a height of 100 km from the surface: the lowest one, up to about 35 km, is adiabatic; a transition layer is located at the height of the cloud deck; and finally a third region is practically isothermal.

  9. Maximum Entropy and Probability Kinematics Constrained by Conditionals

    Directory of Open Access Journals (Sweden)

    Stefan Lukits

    2015-03-01

    Full Text Available Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.

  10. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    Science.gov (United States)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α , and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.

  11. Maximum Entropy Method in Moessbauer Spectroscopy - a Problem of Magnetic Texture

    International Nuclear Information System (INIS)

    Satula, D.; Szymanski, K.; Dobrzynski, L.

    2011-01-01

    A reconstruction of the three-dimensional distribution of the hyperfine magnetic field, isomer shift and texture parameter z from the Moessbauer spectra by the maximum entropy method is presented. The method was tested on a simulated spectrum consisting of two Gaussian hyperfine field distributions with different values of the texture parameters. It is shown that a proper prior has to be chosen in order to arrive at physically meaningful results. (authors)

  12. A Two-Stage Maximum Entropy Prior of Location Parameter with a Stochastic Multivariate Interval Constraint and Its Properties

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2016-05-01

    Full Text Available This paper proposes a two-stage maximum entropy prior to elicit uncertainty regarding a multivariate interval constraint of the location parameter of a scale mixture of normal model. Using Shannon’s entropy, this study demonstrates how the prior, obtained by using two stages of a prior hierarchy, appropriately accounts for the information regarding the stochastic constraint and suggests an objective measure of the degree of belief in the stochastic constraint. The study also verifies that the proposed prior plays the role of bridging the gap between the canonical maximum entropy prior of the parameter with no interval constraint and that with a certain multivariate interval constraint. It is shown that the two-stage maximum entropy prior belongs to the family of rectangle screened normal distributions that is conjugate for samples from a normal distribution. Some properties of the prior density, useful for developing a Bayesian inference of the parameter with the stochastic constraint, are provided. We also propose a hierarchical constrained scale mixture of normal model (HCSMN), which uses the prior density to estimate the constrained location parameter of a scale mixture of normal model and demonstrates the scope of its applicability.

  13. Maximum Entropy Methods as the Bridge Between Microscopic and Macroscopic Theory

    Science.gov (United States)

    Taylor, Jamie M.

    2016-09-01

    This paper is concerned with an investigation into a function of macroscopic variables known as the singular potential, building on previous work by Ball and Majumdar. The singular potential is a function of the admissible statistical averages of probability distributions on a state space, defined so that it corresponds to the maximum possible entropy given known observed statistical averages, although non-classical entropy-like objective functions will also be considered. First the set of admissible moments must be established, and under the conditions presented in this work the set is open, bounded and convex allowing a description in terms of supporting hyperplanes, which provides estimates on the development of singularities for related probability distributions. Under appropriate conditions it is shown that the singular potential is strictly convex, as differentiable as the microscopic entropy, and blows up uniformly as the macroscopic variable tends to the boundary of the set of admissible moments. Applications of the singular potential are then discussed, and particular consideration will be given to certain free-energy functionals typical in mean-field theory, demonstrating an equivalence between certain microscopic and macroscopic free-energy functionals. This allows statements about L^1-local minimisers of Onsager's free energy to be obtained which cannot be given by two-sided variations, and overcomes the need to ensure local minimisers are bounded away from zero and +∞ before taking L^∞ variations. The analysis also permits the definition of a dual order parameter for which Onsager's free energy allows an explicit representation. Also, the difficulties in approximating the singular potential by everywhere defined functions, in particular by polynomial functions, are addressed, with examples demonstrating the failure of the Taylor approximation to preserve relevant shape properties of the singular potential.

  14. Applications of the maximum entropy principle in nuclear physics

    International Nuclear Information System (INIS)

    Froehner, F.H.

    1990-01-01

    Soon after the advent of information theory the principle of maximum entropy was recognized as furnishing the missing rationale for the familiar rules of classical thermodynamics. More recently it has also been applied successfully in nuclear physics. As an elementary example we derive a physically meaningful macroscopic description of the spectrum of neutrons emitted in nuclear fission, and compare the well known result with accurate data on 252Cf. A second example, derivation of an expression for resonance-averaged cross sections for nuclear reactions like scattering or fission, is less trivial. Entropy maximization, constrained by given transmission coefficients, yields probability distributions for the R- and S-matrix elements, from which average cross sections can be calculated. If constrained only by the range of the spectrum of compound-nuclear levels it produces the Gaussian Orthogonal Ensemble (GOE) of Hamiltonian matrices that again yields expressions for average cross sections. Both avenues give practically the same numbers in spite of the quite different cross section formulae. These results were employed in a new model-aided evaluation of the 238U neutron cross sections in the unresolved resonance region. (orig.)

  15. Identification of Watershed-scale Critical Source Areas Using Bayesian Maximum Entropy Spatiotemporal Analysis

    Science.gov (United States)

    Roostaee, M.; Deng, Z.

    2017-12-01

    State environmental agencies are required by the Clean Water Act to assess all waterbodies and evaluate potential sources of impairments. Spatial and temporal distributions of water quality parameters are critical in identifying Critical Source Areas (CSAs). However, due to limitations in monetary resources and the large number of waterbodies, available monitoring stations are typically sparse, with intermittent periods of data collection. Hence, scarcity of water quality data is a major obstacle in addressing sources of pollution through management strategies. In this study, the spatiotemporal Bayesian Maximum Entropy (BME) method is employed to model the inherent temporal and spatial variability of measured water quality indicators, such as Dissolved Oxygen (DO) concentration, for the Turkey Creek Watershed. Turkey Creek is located in northern Louisiana and has been listed in the 303(d) list for DO impairment since 2014 in Louisiana Water Quality Inventory Reports due to agricultural practices. The BME method has been shown to provide more accurate estimates than purely spatial analysis methods by incorporating the space/time distribution and the uncertainty in available measured soft and hard data. This model is used to estimate DO concentration at unmonitored locations and times and subsequently to identify CSAs. The USDA's crop-specific land cover data layers of the watershed were then used to determine those practices/changes that led to low DO concentrations in the identified CSAs. Preliminary results revealed that cultivation of corn and soybean, as well as urban runoff, are the main sources contributing to low dissolved oxygen in the Turkey Creek Watershed.

  16. Separation of Stochastic and Deterministic Information from Seismological Time Series with Nonlinear Dynamics and Maximum Entropy Methods

    International Nuclear Information System (INIS)

    Gutierrez, Rafael M.; Useche, Gina M.; Buitrago, Elias

    2007-01-01

    We present a procedure developed to detect stochastic and deterministic information contained in empirical time series, useful for characterizing and modelling different aspects of the complex phenomena represented by such data. This procedure is applied to a seismological time series to obtain new information for studying and understanding geological phenomena. We use concepts and methods from nonlinear dynamics and maximum entropy. The method allows an optimal analysis of the available information

  17. Structure of incommensurate ammonium tetrafluoroberyllate studied by structure refinements and the maximum entropy method

    Czech Academy of Sciences Publication Activity Database

    Palatinus, Lukáš; Amami, M.; van Smaalen, S.

    2004-01-01

    Roč. 60 (2004), s. 127-137. ISSN 0108-7681. Grant - others: DFG (DE). Institutional research plan: CEZ:AV0Z1010914. Keywords: incommensurate modulation; superspace; maximum entropy method. Subject RIV: BM - Solid Matter Physics; Magnetism. Impact factor: 5.418, year: 2004

  18. Quantum maximum-entropy principle for closed quantum hydrodynamic transport within a Wigner function formalism

    International Nuclear Information System (INIS)

    Trovato, M.; Reggiani, L.

    2011-01-01

    By introducing a quantum entropy functional of the reduced density matrix, the principle of quantum maximum entropy is asserted as fundamental principle of quantum statistical mechanics. Accordingly, we develop a comprehensive theoretical formalism to construct rigorously a closed quantum hydrodynamic transport within a Wigner function approach. The theoretical formalism is formulated in both thermodynamic equilibrium and nonequilibrium conditions, and the quantum contributions are obtained by only assuming that the Lagrange multipliers can be expanded in powers of (ℎ/2π)^2. In particular, by using an arbitrary number of moments, we prove that (1) on a macroscopic scale all nonlocal effects, compatible with the uncertainty principle, are imputable to high-order spatial derivatives, both of the numerical density n and of the effective temperature T; (2) the results available from the literature in the framework of both a quantum Boltzmann gas and a degenerate quantum Fermi gas are recovered as a particular case; (3) the statistics for the quantum Fermi and Bose gases at different levels of degeneracy are explicitly incorporated; (4) a set of relevant applications admitting exact analytical equations are explicitly given and discussed; (5) the quantum maximum entropy principle keeps full validity in the classical limit, when (ℎ/2π)→0.

  19. Hydrodynamic equations for electrons in graphene obtained from the maximum entropy principle

    Energy Technology Data Exchange (ETDEWEB)

    Barletti, Luigi, E-mail: luigi.barletti@unifi.it [Dipartimento di Matematica e Informatica “Ulisse Dini”, Università degli Studi di Firenze, Viale Morgagni 67/A, 50134 Firenze (Italy)

    2014-08-15

    The maximum entropy principle is applied to the formal derivation of isothermal, Euler-like equations for semiclassical fermions (electrons and holes) in graphene. After proving general mathematical properties of the equations so obtained, their asymptotic form corresponding to significant physical regimes is investigated. In particular, the diffusive regime, the Maxwell-Boltzmann regime (high temperature), the collimation regime and the degenerate gas limit (vanishing temperature) are considered.

  20. Lattice Field Theory with the Sign Problem and the Maximum Entropy Method

    Directory of Open Access Journals (Sweden)

    Masahiro Imachi

    2007-02-01

    Full Text Available Although numerical simulation in lattice field theory is one of the most effective tools to study non-perturbative properties of field theories, it faces serious obstacles coming from the sign problem in some theories such as finite density QCD and lattice field theory with the θ term. We reconsider this problem from the point of view of the maximum entropy method.

  1. LensEnt2: Maximum-entropy weak lens reconstruction

    Science.gov (United States)

    Marshall, P. J.; Hobson, M. P.; Gull, S. F.; Bridle, S. L.

    2013-08-01

    LensEnt2 is a maximum entropy reconstructor of weak lensing mass maps. The method takes each galaxy shape as an independent estimator of the reduced shear field and incorporates an intrinsic smoothness, determined by Bayesian methods, into the reconstruction. The uncertainties from both the intrinsic distribution of galaxy shapes and galaxy shape estimation are carried through to the final mass reconstruction, and the mass within arbitrarily shaped apertures is calculated with corresponding uncertainties. The input is a galaxy ellipticity catalog with each measured galaxy shape treated as a noisy tracer of the reduced shear field, which is inferred on a fine pixel grid assuming positivity and smoothness on scales of w arcsec, where w is an input parameter. The ICF width w can be chosen by computing the evidence for it.

  2. Information entropies in antikaon-nucleon scattering and optimal state analysis

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.; Petrascu, C.

    1998-01-01

    It is known that Jaynes interpreted the entropy as the expected self-information of a class of mutually exclusive and exhaustive events, while the probability is considered to be the rational degree of belief we assign to events based on available experimental evidence. The axiomatic derivation of Jaynes' principle of maximum entropy as well as of the Kullback principle of minimum cross-entropy have been reported. Moreover, the optimal states in the Hilbert space of the scattering amplitude, which are analogous to the coherent states from the Hilbert space of the wave functions, were introduced and developed. The possibility that each optimal state possesses a specific minimum entropic uncertainty relation similar to that of the coherent states was recently conjectured. In fact, the (angle and angular momenta) information entropies, as well as the entropic angle-angular momentum uncertainty relations, in hadron-hadron scattering, are introduced. The experimental information entropies for pion-nucleon scattering are calculated by using the available phase shift analyses. These results are compared with the information entropies of the optimal states. Then, the optimal state dominance in pion-nucleon scattering is systematically observed for all P_LAB = 0.02-10 GeV/c. Also, it is shown that the angle-angular momentum entropic uncertainty relations are satisfied with high accuracy by all the experimental information entropies. In this paper the (angle and angular momentum) information entropies of hadron-hadron scattering are experimentally investigated by using the antikaon-nucleon phase shift analysis. Then, it is shown that the experimental entropies are in agreement with the informational entropies of optimal states. The results obtained in this paper can be explained not only by the presence of an optimal background which accompanied the production of the elementary resonances but also by the presence of the optimal resonances. On the other hand

  3. The generalized F constraint in the maximum-entropy method - a study on simulated data

    Czech Academy of Sciences Publication Activity Database

    Palatinus, Lukáš; van Smaalen, S.

    2002-01-01

    Vol. 58 (2002), pp. 559-567. ISSN 0108-7673. Grant - others: DFG (DE) XX. Institutional research plan: CEZ:AV0Z1010914. Keywords: maximum-entropy method; electron density; oxalic acid. Subject RIV: BM - Solid Matter Physics; Magnetism. Impact factor: 1.417 (2002).

  4. Maximum entropy estimation of a Benzene contaminated plume using ecotoxicological assays

    International Nuclear Information System (INIS)

    Wahyudi, Agung; Bartzke, Mariana; Küster, Eberhard; Bogaert, Patrick

    2013-01-01

    Ecotoxicological bioassays, e.g. based on Danio rerio teratogenicity (DarT) or the acute luminescence inhibition with Vibrio fischeri, could potentially lead to significant benefits for detecting on-site contamination on a qualitative or semi-quantitative basis. The aim was to use the observed effects of two ecotoxicological assays for estimating the extent of a Benzene groundwater contamination plume. We used a Maximum Entropy (MaxEnt) method to rebuild a bivariate probability table that links the observed toxicity from the bioassays with Benzene concentrations. Compared with direct mapping of the contamination plume as obtained from groundwater samples, the MaxEnt concentration map exhibits on average slightly higher concentrations, though the global pattern is close to it. This suggests MaxEnt is a valuable method for building a relationship between quantitative data, e.g. contaminant concentrations, and more qualitative or indirect measurements, in a spatial mapping framework, which is especially useful when a clear quantitative relation is not at hand. - Highlights: ► Ecotoxicological assays show significant benefits for detecting on-site contamination. ► MaxEnt rebuilds the qualitative link between concentrations and ecotoxicological assays. ► The MaxEnt map shows a pattern similar to the concentration map from groundwater samples. ► MaxEnt is a valuable method, especially when a clear quantitative relation is not at hand. - A Maximum Entropy method to rebuild qualitative relationships between Benzene groundwater concentrations and their ecotoxicological effect.

  5. Maximum Entropy, Word-Frequency, Chinese Characters, and Multiple Meanings

    Science.gov (United States)

    Yan, Xiaoyong; Minnhagen, Petter

    2015-01-01

    The word-frequency distribution of a text written by an author is well accounted for by a maximum entropy distribution, the RGF (random group formation) prediction. The RGF distribution is completely determined by the a priori values of the total number of words in the text (M), the number of distinct words (N) and the number of repetitions of the most common word (kmax). It is here shown that this maximum entropy prediction also describes a text written in Chinese characters. In particular, it is shown that although the same Chinese text written in words and in Chinese characters has quite differently shaped distributions, both are nevertheless well predicted by their respective three a priori characteristic values. It is pointed out that this is analogous to the change in the shape of the distribution when translating a given text to another language. Another consequence of the RGF prediction is that taking a part of a long text will change the input parameters (M, N, kmax) and consequently also the shape of the frequency distribution. This is explicitly confirmed for texts written in Chinese characters. Since the RGF prediction has no system-specific information beyond the three a priori values (M, N, kmax), any specific language characteristic has to be sought in systematic deviations between the RGF prediction and the measured frequencies. One such systematic deviation is identified and, through a statistical information theoretical argument and an extended RGF model, it is proposed that this deviation is caused by multiple meanings of Chinese characters. The effect is stronger for Chinese characters than for Chinese words. The relation between Zipf’s law, the Simon model for texts and the present results are discussed. PMID:25955175

  6. Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory

    Science.gov (United States)

    Rahimi, A.; Zhang, L.

    2012-12-01

    Rainfall-runoff analysis is the key component for many hydrological and hydraulic designs in which the dependence of rainfall and runoff needs to be studied. It is known that convenient bivariate distributions are often unable to model the rainfall-runoff variables, because they either constrain the range of the dependence or fix the form of the marginal distributions. Thus, this paper presents an approach to derive the entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The derived distribution can model the full range of dependence and allows different specified marginals. The modeling and estimation proceed as follows: (i) univariate analysis of the marginal distributions, which includes two steps, (a) using a nonparametric statistical approach to detect modes and the underlying probability density, and (b) fitting appropriate parametric probability density functions; (ii) definition of the constraints based on the univariate analysis and the dependence structure; (iii) derivation and validation of the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in a semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The results of the univariate analysis show that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of the logarithm-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results indicate that (1) the derived joint distribution successfully preserves the dependence between rainfall and runoff, and (2) the K-S goodness-of-fit tests confirm that the re-derived marginal distributions reveal the underlying univariate probability densities.
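    As a hedged illustration of the kind of entropy maximization described above (a univariate sketch, not the authors' bivariate construction), the snippet below fits a maximum-entropy density on a discretized positive support subject to constraints on E[X] and E[ln X], by minimizing the convex dual over the Lagrange multipliers. The support grid and the target moments are assumed values chosen only for the example.

```python
# Minimal sketch: univariate maximum-entropy density with constraints on
# E[X] and E[ln X], solved by minimizing the convex dual over the multipliers.
import numpy as np
from scipy.optimize import minimize

x = np.linspace(1e-3, 50.0, 4000)          # discretized positive support (illustrative)
dx = x[1] - x[0]
features = np.vstack([x, np.log(x)])       # f1(x) = x, f2(x) = ln x
targets = np.array([5.0, 1.2])             # assumed sample moments E[X], E[ln X]

def dual(lam):
    # log-partition function plus lam . targets (convex dual of the maxent problem)
    logp = -features.T @ lam
    logZ = np.log(np.sum(np.exp(logp - logp.max())) * dx) + logp.max()
    return logZ + lam @ targets

lam = minimize(dual, x0=np.zeros(2), method="Nelder-Mead").x
p = np.exp(-features.T @ lam)
p /= np.sum(p) * dx                        # normalized maximum-entropy density
print("fitted moments:", np.sum(x * p) * dx, np.sum(np.log(x) * p) * dx)
```

    With these two constraints the recovered density is gamma-like, consistent with the gamma marginals reported above.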

  7. The Data-Constrained Generalized Maximum Entropy Estimator of the GLM: Asymptotic Theory and Inference

    Directory of Open Access Journals (Sweden)

    Nicholas Scott Cardell

    2013-05-01

    Full Text Available Maximum entropy methods of parameter estimation are appealing because they impose no additional structure on the data, other than that explicitly assumed by the analyst. In this paper we prove that the data constrained GME estimator of the general linear model is consistent and asymptotically normal. The approach we take in establishing the asymptotic properties concomitantly identifies a new computationally efficient method for calculating GME estimates. Formulae are developed to compute asymptotic variances and to perform Wald, likelihood ratio, and Lagrangian multiplier statistical tests on model parameters. Monte Carlo simulations are provided to assess the performance of the GME estimator in both large and small sample situations. Furthermore, we extend our results to maximum cross-entropy estimators and indicate a variant of the GME estimator that is unbiased. Finally, we discuss the relationship of GME estimators to Bayesian estimators, pointing out the conditions under which an unbiased GME estimator would be efficient.

  8. Bayesian and maximum entropy methods for fusion diagnostic measurements with compact neutron spectrometers

    International Nuclear Information System (INIS)

    Reginatto, Marcel; Zimbal, Andreas

    2008-01-01

    In applications of neutron spectrometry to fusion diagnostics, it is advantageous to use methods of data analysis which can extract information from the spectrum that is directly related to the parameters of interest that describe the plasma. We present here methods of data analysis which were developed with this goal in mind, and which were applied to spectrometric measurements made with an organic liquid scintillation detector (type NE213). In our approach, we combine Bayesian parameter estimation methods and unfolding methods based on the maximum entropy principle. This two-step method allows us to optimize the analysis of the data depending on the type of information that we want to extract from the measurements. To illustrate these methods, we analyze neutron measurements made at the PTB accelerator under controlled conditions, using accelerator-produced neutron beams. Although the methods have been chosen with a specific application in mind, they are general enough to be useful for many other types of measurements

  9. Comments on a derivation and application of the 'maximum entropy production' principle

    International Nuclear Information System (INIS)

    Grinstein, G; Linsker, R

    2007-01-01

    We show that (1) an error invalidates the derivation (Dewar 2005 J. Phys. A: Math. Gen. 38 L371) of the maximum entropy production (MaxEP) principle for systems far from equilibrium, for which the constitutive relations are nonlinear; and (2) the claim (Dewar 2003 J. Phys. A: Math. Gen. 36 631) that the phenomenon of 'self-organized criticality' is a consequence of MaxEP for slowly driven systems is unjustified. (comment)

  10. LIBOR troubles: Anomalous movements detection based on maximum entropy

    Science.gov (United States)

    Bariviera, Aurelio F.; Martín, María T.; Plastino, Angelo; Vampa, Victoria

    2016-05-01

    According to the definition of the London Interbank Offered Rate (LIBOR), contributing banks should give fair estimates of their own borrowing costs in the interbank market. Between 2007 and 2009, several banks made inappropriate submissions of LIBOR, sometimes motivated by profit-seeking from their trading positions. In 2012, several newspaper articles began to cast doubt on LIBOR integrity, leading surveillance authorities to conduct investigations into banks' behavior. Such procedures resulted in severe fines imposed on the banks involved, which acknowledged their inappropriate financial conduct. In this paper, we uncover such unfair behavior by using a forecasting method based on the Maximum Entropy principle. Our results are robust against changes in parameter settings and could be of great help for market surveillance.

  11. Entropy generation and thermodynamic analysis of solar air heaters with artificial roughness on absorber plate

    Directory of Open Access Journals (Sweden)

    Prasad Radha K.

    2017-09-01

    Full Text Available This paper presents mathematical modelling and numerical analysis of entropy generation (EGA), considering pressure drop and second-law efficiency based on thermodynamics, for forced convection heat transfer in the rectangular duct of a solar air heater with wire ribs as artificial roughness in the form of an arc-shaped geometry on the absorber plate. The investigation includes evaluations of the entropy generation, entropy generation number, Bejan number, and irreversibilities of roughened as well as smooth absorber-plate solar air heaters to compare their relative performance. Furthermore, the effects of various roughness parameters and operating parameters on entropy generation have also been investigated. Entropy generation and irreversibility (exergy destroyed) have their minimum values at a relative roughness height of 0.0422 and a relative angle of attack of 0.33, which leads to the maximum exergetic efficiency. Entropy generation and exergy-based analyses can be adopted for the evaluation of the overall performance of solar air heaters.

  12. Comparison Between Bayesian and Maximum Entropy Analyses of Flow Networks†

    Directory of Open Access Journals (Sweden)

    Steven H. Waldrip

    2017-02-01

    Full Text Available We compare the application of Bayesian inference and the maximum entropy (MaxEnt) method for the analysis of flow networks, such as water, electrical and transport networks. The two methods have the advantage of allowing a probabilistic prediction of flow rates and other variables, when there is insufficient information to obtain a deterministic solution, and also allow the effects of uncertainty to be included. Both methods of inference update a prior to a posterior probability density function (pdf) by the inclusion of new information, in the form of data or constraints. The MaxEnt method maximises an entropy function subject to constraints, using the method of Lagrange multipliers, to give the posterior, while the Bayesian method finds its posterior by multiplying the prior with likelihood functions incorporating the measured data. In this study, we examine MaxEnt using soft constraints, either included in the prior or as probabilistic constraints, in addition to standard moment constraints. We show that when the prior is Gaussian, both Bayesian inference and the MaxEnt method with soft prior constraints give the same posterior means, but their covariances are different. In the Bayesian method, the interactions between variables are applied through the likelihood function, using second or higher-order cross-terms within the posterior pdf. In contrast, the MaxEnt method incorporates interactions between variables using Lagrange multipliers, avoiding second-order correlation terms in the posterior covariance. The MaxEnt method with soft prior constraints, therefore, has a numerical advantage over Bayesian inference, in that the covariance terms are avoided in its integrations. The second MaxEnt method with soft probabilistic constraints is shown to give posterior means of similar, but not identical, structure to the other two methods, due to its different formulation.

  13. Maximum entropy principle for transportation

    International Nuclear Information System (INIS)

    Bilich, F.; Da Silva, R.

    2008-01-01

    In this work we deal with modeling of the transportation phenomenon for use in the transportation planning process and policy-impact studies. The model developed is based on the dependence concept, i.e., the notion that the probability of a trip starting at origin i is dependent on the probability of a trip ending at destination j given that the factors (such as travel time, cost, etc.) which affect travel between origin i and destination j assume some specific values. The derivation of the solution of the model employs the maximum entropy principle combining a priori multinomial distribution with a trip utility concept. This model is utilized to forecast trip distributions under a variety of policy changes and scenarios. The dependence coefficients are obtained from a regression equation where the functional form is derived based on conditional probability and perception of factors from experimental psychology. The dependence coefficients encode all the information that was previously encoded in the form of constraints. In addition, the dependence coefficients encode information that cannot be expressed in the form of constraints for practical reasons, namely, computational tractability. The equivalence between the standard formulation (i.e., objective function with constraints) and the dependence formulation (i.e., without constraints) is demonstrated. The parameters of the dependence-based trip-distribution model are estimated, and the model is also validated using commercial air travel data in the U.S. In addition, policy impact analyses (such as allowance of supersonic flights inside the U.S. and user surcharge at noise-impacted airports) on air travel are performed.
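    To make the standard (constrained) entropy-maximizing formulation that the paper compares against more concrete, the sketch below solves the classical doubly constrained trip-distribution model by iterative proportional fitting. The origin totals, destination totals, cost matrix and decay parameter beta are made-up illustrative values, not the calibrated coefficients of the paper, and the dependence-coefficient formulation itself is not reproduced.

```python
# Minimal sketch of the entropy-maximizing (doubly constrained) trip-distribution
# model, balanced by iterative proportional fitting.
import numpy as np

O = np.array([400., 300., 300.])       # trips produced at each origin (illustrative)
D = np.array([250., 450., 300.])       # trips attracted to each destination
c = np.array([[1., 3., 5.],
              [3., 1., 4.],
              [5., 4., 1.]])           # generalized travel costs between zones
beta = 0.4                             # assumed cost-sensitivity parameter

T = np.exp(-beta * c)                  # entropy-maximizing seed matrix
for _ in range(200):                   # alternately rescale rows and columns
    T *= (O / T.sum(axis=1))[:, None]
    T *= (D / T.sum(axis=0))[None, :]

print(np.round(T, 1))                  # trip table satisfying both margins
```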

  14. A Hybrid Physical and Maximum-Entropy Landslide Susceptibility Model

    Directory of Open Access Journals (Sweden)

    Jerry Davis

    2015-06-01

    Full Text Available The clear need for accurate landslide susceptibility mapping has led to multiple approaches. Physical models are easily interpreted and have high predictive capabilities but rely on spatially explicit and accurate parameterization, which is commonly not possible. Statistical methods can include other factors influencing slope stability such as distance to roads, but rely on good landslide inventories. The maximum entropy (MaxEnt) model has been widely and successfully used in species distribution mapping, because data on absence are often uncertain. Similarly, knowledge about the absence of landslides is often limited due to mapping scale or methodology. In this paper a hybrid approach is described that combines the physically-based landslide susceptibility model “Stability INdex MAPping” (SINMAP) with MaxEnt. This method is tested in a coastal watershed in Pacifica, CA, USA, with a well-documented landslide history including three inventories: 154 scars on 1941 imagery, 142 in 1975, and 253 in 1983. Results indicate that SINMAP alone overestimated susceptibility due to insufficient data on root cohesion. Models were compared using the SINMAP stability index (SI) or slope alone, and SI or slope in combination with other environmental factors: curvature, a 50-m trail buffer, vegetation, and geology. For 1941 and 1975, using slope alone was similar to using SI alone; however, in 1983 SI alone gives an area under the receiver operating characteristic curve (AUC) of 0.785, compared with 0.749 for slope alone. In maximum-entropy models created using all environmental factors, the stability index (SI) from SINMAP represented the greatest contribution in all three years (1941: 48.1%; 1975: 35.3%; 1983: 48%), with AUC of 0.795, 0.822, and 0.859, respectively; however, using slope instead of SI created similar overall AUC values, likely due to the combined effect with plan curvature indicating focused hydrologic inputs and vegetation identifying the effect of root cohesion.

  15. Reconstruction of calmodulin single-molecule FRET states, dye interactions, and CaMKII peptide binding by MultiNest and classic maximum entropy

    Science.gov (United States)

    DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.

    2013-08-01

    We analyzed single molecule FRET burst measurements using Bayesian nested sampling. The MultiNest algorithm produces accurate FRET efficiency distributions from single-molecule data. FRET efficiency distributions recovered by MultiNest and classic maximum entropy are compared for simulated data and for calmodulin labeled at residues 44 and 117. MultiNest compares favorably with maximum entropy analysis for simulated data, judged by the Bayesian evidence. FRET efficiency distributions recovered for calmodulin labeled with two different FRET dye pairs depended on the dye pair and changed upon Ca2+ binding. We also looked at the FRET efficiency distributions of calmodulin bound to the calcium/calmodulin dependent protein kinase II (CaMKII) binding domain. For both dye pairs, the FRET efficiency distribution collapsed to a single peak in the case of calmodulin bound to the CaMKII peptide. These measurements strongly suggest that consideration of dye-protein interactions is crucial in forming an accurate picture of protein conformations from FRET data.

  16. Reconstruction of Calmodulin Single-Molecule FRET States, Dye-Interactions, and CaMKII Peptide Binding by MultiNest and Classic Maximum Entropy.

    Science.gov (United States)

    Devore, Matthew S; Gull, Stephen F; Johnson, Carey K

    2013-08-30

    We analyze single molecule FRET burst measurements using Bayesian nested sampling. The MultiNest algorithm produces accurate FRET efficiency distributions from single-molecule data. FRET efficiency distributions recovered by MultiNest and classic maximum entropy are compared for simulated data and for calmodulin labeled at residues 44 and 117. MultiNest compares favorably with maximum entropy analysis for simulated data, judged by the Bayesian evidence. FRET efficiency distributions recovered for calmodulin labeled with two different FRET dye pairs depended on the dye pair and changed upon Ca 2+ binding. We also looked at the FRET efficiency distributions of calmodulin bound to the calcium/calmodulin dependent protein kinase II (CaMKII) binding domain. For both dye pairs, the FRET efficiency distribution collapsed to a single peak in the case of calmodulin bound to the CaMKII peptide. These measurements strongly suggest that consideration of dye-protein interactions is crucial in forming an accurate picture of protein conformations from FRET data.

  17. Maximum entropy formalism for the analytic continuation of matrix-valued Green's functions

    Science.gov (United States)

    Kraberger, Gernot J.; Triebl, Robert; Zingl, Manuel; Aichhorn, Markus

    2017-10-01

    We present a generalization of the maximum entropy method to the analytic continuation of matrix-valued Green's functions. To treat off-diagonal elements correctly based on Bayesian probability theory, the entropy term has to be extended for spectral functions that are possibly negative in some frequency ranges. In that way, all matrix elements of the Green's function matrix can be analytically continued; we introduce a computationally cheap element-wise method for this purpose. However, this method cannot ensure important constraints on the mathematical properties of the resulting spectral functions, namely positive semidefiniteness and Hermiticity. To improve on this, we present a full matrix formalism, where all matrix elements are treated simultaneously. We show the capabilities of these methods using insulating and metallic dynamical mean-field theory (DMFT) Green's functions as test cases. Finally, we apply the methods to realistic material calculations for LaTiO3, where off-diagonal matrix elements in the Green's function appear due to the distorted crystal structure.

  18. The simplest maximum entropy model for collective behavior in a neural network

    International Nuclear Information System (INIS)

    Tkačik, Gašper; Marre, Olivier; Mora, Thierry; Amodei, Dario; Bialek, William; Berry II, Michael J

    2013-01-01

    Recent work emphasizes that the maximum entropy principle provides a bridge between statistical mechanics models for collective behavior in neural networks and experiments on networks of real neurons. Most of this work has focused on capturing the measured correlations among pairs of neurons. Here we suggest an alternative, constructing models that are consistent with the distribution of global network activity, i.e. the probability that K out of N cells in the network generate action potentials in the same small time bin. The inverse problem that we need to solve in constructing the model is analytically tractable, and provides a natural ‘thermodynamics’ for the network in the limit of large N. We analyze the responses of neurons in a small patch of the retina to naturalistic stimuli, and find that the implied thermodynamics is very close to an unusual critical point, in which the entropy (in proper units) is exactly equal to the energy. (paper)
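    The model described above has a simple closed form: the maximum-entropy distribution consistent only with the spike-count distribution P(K) assigns equal probability to every binary pattern with the same number of active cells. The hedged sketch below evaluates that model for a small illustrative population; N and P(K) are made-up values, not the retinal data of the paper.

```python
# Minimal sketch: maximum-entropy pattern distribution constrained only by P(K).
import numpy as np
from math import comb

N = 5
PK = np.array([0.55, 0.25, 0.12, 0.05, 0.02, 0.01])   # assumed P(K = 0..N), sums to 1

# probability of any single pattern with k active cells
pattern_prob = {k: PK[k] / comb(N, k) for k in range(N + 1)}

# entropy of the full pattern distribution (bits)
H = -sum(comb(N, k) * p * np.log2(p) for k, p in pattern_prob.items() if p > 0)
print("pattern entropy (bits):", round(H, 3))

# mean probability that any one cell fires in a time bin, implied by the model
mean_rate = sum(k * PK[k] for k in range(N + 1)) / N
print("mean firing probability per bin:", round(mean_rate, 3))
```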

  19. Bayesian Maximum Entropy prediction of soil categories using a traditional soil map as soft information.

    NARCIS (Netherlands)

    Brus, D.J.; Bogaert, P.; Heuvelink, G.B.M.

    2008-01-01

    Bayesian Maximum Entropy was used to estimate the probabilities of occurrence of soil categories in the Netherlands, and to simulate realizations from the associated multi-point pdf. Besides the hard observations (H) of the categories at 8369 locations, the soil map of the Netherlands 1:50 000 was used as soft information.

  20. Numerical analysis of entropy generation in an annular microcombustor using multistep kinetics

    International Nuclear Information System (INIS)

    Jejurkar, Swarup Y.; Mishra, D.P.

    2013-01-01

    Entropy generation by combustion and additional irreversibility due to heat loss were studied numerically for a premixed-flame-based microcombustor. A detailed axisymmetric reactive flow model employing a 21-step, 9-species reaction mechanism for the hydrogen-air mixture was considered. The analysis identified the reactions contributing most of the entropy generated in combustion. These reactions are removed from thermodynamic equilibrium in the low-temperature region between 400 and 700 K of the flame, and a combination of their high affinity and low temperature induces entropy generation in this region. Single-step kinetics and a reduced scheme neglecting HO2 are consequently incapable of accurately calculating the entropy generation and second-law performance. Overall entropy generation rates increased from lean to rich mixtures in the range Φ = 0.5-1.4 and were dominated by combustion reactions. Characterization of combustor performance in terms of second-law efficiency showed that availability reduction by wall heat losses and combustion irreversibility were of the same order for stoichiometric mixtures, and both decreased for rich flames. On the other hand, near-quenching fuel-lean flames (Φ ≤ 0.75) suffered mostly from combustion irreversibility. These trends caused the minimum-efficiency (maximum thermodynamic irreversibility) point to lie near the stoichiometric fuel-air composition. -- Highlights: ► The reaction set dominating heat release and entropy generation involves HO2. ► Entropy generation increased from lean to rich Φ. ► Second-law efficiency is minimum at stoichiometric Φ. ► Post-flame heat loss and transport processes are needed in microcombustor entropy analysis.

  1. Evaluation of single and multi-threshold entropy-based algorithms for folded substrate analysis

    Directory of Open Access Journals (Sweden)

    Magdolna Apro

    2011-10-01

    Full Text Available This paper presents a detailed evaluation of two variants of the Maximum Entropy image segmentation algorithm (single and multi-thresholding) with respect to their performance on segmenting test images showing folded substrates. The segmentation quality was determined by evaluating values of four different measures: misclassification error, modified Hausdorff distance, relative foreground area error and positive-negative false detection ratio. New normalization methods were proposed in order to combine all parameters into a unique algorithm evaluation rating. The segmentation algorithms were tested on images obtained by three different digitalisation methods covering four different surface textures. In addition, the methods were also tested on three images presenting a perfect fold. The obtained results showed that the Multi-Maximum Entropy algorithm is better suited for the analysis of images showing folded substrates.
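    For context, the sketch below implements the classical single-threshold maximum-entropy (Kapur) criterion, the kind of single-thresholding variant evaluated above: the threshold is chosen to maximize the sum of the entropies of the foreground and background gray-level distributions. The test image is synthetic, and the bin count is an assumed default.

```python
# Minimal sketch of Kapur's maximum-entropy thresholding on a synthetic image.
import numpy as np

def kapur_threshold(image, bins=256):
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    p = hist.astype(float) / hist.sum()
    best_t, best_H = 0, -np.inf
    for t in range(1, bins - 1):
        P0, P1 = p[:t].sum(), p[t:].sum()
        if P0 <= 0 or P1 <= 0:
            continue
        q0 = p[:t][p[:t] > 0] / P0          # normalized background distribution
        q1 = p[t:][p[t:] > 0] / P1          # normalized foreground distribution
        H = -(q0 * np.log(q0)).sum() - (q1 * np.log(q1)).sum()
        if H > best_H:
            best_H, best_t = H, t
    return best_t

# synthetic two-class gray-level data: dark folded region on a brighter background
rng = np.random.default_rng(0)
img = np.clip(np.concatenate([rng.normal(60, 10, 5000), rng.normal(170, 15, 5000)]), 0, 255)
print("maximum-entropy threshold:", kapur_threshold(img))
```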

  2. Chaos control of ferroresonance system based on RBF-maximum entropy clustering algorithm

    International Nuclear Information System (INIS)

    Liu Fan; Sun Caixin; Sima Wenxia; Liao Ruijin; Guo Fei

    2006-01-01

    With regard to the ferroresonance overvoltage of a neutral-grounded power system, a maximum-entropy learning algorithm based on radial basis function neural networks is used to control the chaotic system. The algorithm optimizes the objective function to derive the learning rule for the center vectors, and uses the clustering function of the network's hidden layers. It improves the regression and learning ability of the neural networks. A numerical experiment on the ferroresonance system verifies the effectiveness and feasibility of using the algorithm to control chaos in a neutral-grounded system.

  3. How fast can we learn maximum entropy models of neural populations?

    Energy Technology Data Exchange (ETDEWEB)

    Ganmor, Elad; Schneidman, Elad [Department of Neuroscience, Weizmann Institute of Science, Rehovot 76100 (Israel); Segev, Ronen, E-mail: elad.ganmor@weizmann.ac.i, E-mail: elad.schneidman@weizmann.ac.i [Department of Life Sciences and Zlotowski Center for Neuroscience, Ben-Gurion University of the Negev, Beer-Sheva 84105 (Israel)

    2009-12-01

    Most of our knowledge about how the brain encodes information comes from recordings of single neurons. However, computations in the brain are carried out by large groups of neurons. Modelling the joint activity of many interacting elements is computationally hard because of the large number of possible activity patterns and limited experimental data. Recently it was shown in several different neural systems that maximum entropy pairwise models, which rely only on firing rates and pairwise correlations of neurons, are excellent models for the distribution of activity patterns of neural populations, and in particular, their responses to natural stimuli. Using simultaneous recordings of large groups of neurons in the vertebrate retina responding to naturalistic stimuli, we show here that the relevant statistics required for finding the pairwise model can be accurately estimated within seconds. Furthermore, while higher order statistics may, in theory, improve model accuracy, they are, in practice, harmful for times of up to 20 minutes due to sampling noise. Finally, we demonstrate that trading accuracy for entropy may actually improve model performance when data is limited, and suggest an optimization method that automatically adjusts model constraints in order to achieve good performance.

  4. How fast can we learn maximum entropy models of neural populations?

    International Nuclear Information System (INIS)

    Ganmor, Elad; Schneidman, Elad; Segev, Ronen

    2009-01-01

    Most of our knowledge about how the brain encodes information comes from recordings of single neurons. However, computations in the brain are carried out by large groups of neurons. Modelling the joint activity of many interacting elements is computationally hard because of the large number of possible activity patterns and limited experimental data. Recently it was shown in several different neural systems that maximum entropy pairwise models, which rely only on firing rates and pairwise correlations of neurons, are excellent models for the distribution of activity patterns of neural populations, and in particular, their responses to natural stimuli. Using simultaneous recordings of large groups of neurons in the vertebrate retina responding to naturalistic stimuli, we show here that the relevant statistics required for finding the pairwise model can be accurately estimated within seconds. Furthermore, while higher order statistics may, in theory, improve model accuracy, they are, in practice, harmful for times of up to 20 minutes due to sampling noise. Finally, we demonstrate that trading accuracy for entropy may actually improve model performance when data is limited, and suggest an optimization method that automatically adjusts model constraints in order to achieve good performance.

  5. Developing the fuzzy c-means clustering algorithm based on maximum entropy for multitarget tracking in a cluttered environment

    Science.gov (United States)

    Chen, Xiao; Li, Yaan; Yu, Jing; Li, Yuxing

    2018-01-01

    For fast and more effective implementation of tracking multiple targets in a cluttered environment, we propose a multiple targets tracking (MTT) algorithm called maximum entropy fuzzy c-means clustering joint probabilistic data association that combines fuzzy c-means clustering and the joint probabilistic data association (PDA) algorithm. The algorithm uses the membership value to express the probability of the target originating from measurement. The membership value is obtained through fuzzy c-means clustering objective function optimized by the maximum entropy principle. When considering the effect of the public measurement, we use a correction factor to adjust the association probability matrix to estimate the state of the target. As this algorithm avoids confirmation matrix splitting, it can solve the high computational load problem of the joint PDA algorithm. The results of simulations and analysis conducted for tracking neighbor parallel targets and cross targets in a different density cluttered environment show that the proposed algorithm can realize MTT quickly and efficiently in a cluttered environment. Further, the performance of the proposed algorithm remains constant with increasing process noise variance. The proposed algorithm has the advantages of efficiency and low computational load, which can ensure optimum performance when tracking multiple targets in a dense cluttered environment.
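    The membership computation described above can be illustrated with an entropy-regularized ("maximum entropy") fuzzy clustering step, in which memberships are proportional to exp(-d²/T) and cluster centers are membership-weighted means. This is a hedged sketch of that building block only; the data, cluster count and temperature T are illustrative, and the joint probabilistic data association stage of the paper is not reproduced.

```python
# Minimal sketch of entropy-regularized fuzzy clustering (maximum-entropy memberships).
import numpy as np

def maxent_fuzzy_cmeans(X, n_clusters, T=1.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)   # squared distances
        U = np.exp(-d2 / T)
        U /= U.sum(axis=1, keepdims=True)                           # maximum-entropy memberships
        centers = (U.T @ X) / U.sum(axis=0)[:, None]                # weighted center update
    return centers, U

# two noisy "targets" in a 2-D measurement space (illustrative data)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0, 0], 0.3, (50, 2)), rng.normal([3, 3], 0.3, (50, 2))])
centers, U = maxent_fuzzy_cmeans(X, n_clusters=2, T=0.5)
print(np.round(centers, 2))
```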

  6. Improvement of the detector resolution in X-ray spectrometry by using the maximum entropy method

    International Nuclear Information System (INIS)

    Fernández, Jorge E.; Scot, Viviana; Giulio, Eugenio Di; Sabbatucci, Lorenzo

    2015-01-01

    In every X-ray spectroscopy measurement the influence of the detection system causes a loss of information. Different mechanisms contribute to form the so-called detector response function (DRF): the detector efficiency, the escape of photons as a consequence of photoelectric or scattering interactions, the spectrum smearing due to the energy resolution, and, in solid-state detectors (SSD), the charge collection artifacts. To recover the original spectrum, it is necessary to remove the detector influence by solving the so-called inverse problem. The maximum entropy unfolding technique solves this problem by imposing a set of constraints, taking advantage of the known a priori information and preserving the positive-definite character of the X-ray spectrum. This method has been included in the tool UMESTRAT (Unfolding Maximum Entropy STRATegy), which adopts a semi-automatic strategy to solve the unfolding problem based on a suitable combination of the codes MAXED and GRAVEL, developed at PTB. In the past, UMESTRAT proved capable of resolving characteristic peaks that appeared overlapped when measured with a Si SSD, giving good qualitative results. In order to obtain quantitative results, UMESTRAT has been modified to include the additional constraint of the total number of photons of the spectrum, which can be easily determined by inverting the diagonal efficiency matrix. The features of the improved code are illustrated with some examples of unfolding from three commonly used SSDs, namely Si, Ge, and CdTe. The quantitative unfolding can be considered as a software improvement of the detector resolution. - Highlights: • Radiation detection introduces distortions in X- and Gamma-ray spectrum measurements. • UMESTRAT is a graphical tool to unfold X- and Gamma-ray spectra. • UMESTRAT uses the maximum entropy method. • UMESTRAT’s new version produces unfolded spectra with quantitative meaning. • UMESTRAT is a software tool to improve the detector resolution.

  7. Using maximum entropy modeling to identify and prioritize red spruce forest habitat in West Virginia

    Science.gov (United States)

    Nathan R. Beane; James S. Rentch; Thomas M. Schuler

    2013-01-01

    Red spruce forests in West Virginia are found in island-like distributions at high elevations and provide essential habitat for the endangered Cheat Mountain salamander and the recently delisted Virginia northern flying squirrel. Therefore, it is important to identify restoration priorities of red spruce forests. Maximum entropy modeling was used to identify areas of...

  8. On the Five-Moment Hamburger Maximum Entropy Reconstruction

    Science.gov (United States)

    Summy, D. P.; Pullin, D. I.

    2018-05-01

    We consider the Maximum Entropy Reconstruction (MER) as a solution to the five-moment truncated Hamburger moment problem in one dimension. In the case of five monomial moment constraints, the probability density function (PDF) of the MER takes the form of the exponential of a quartic polynomial. This implies a possible bimodal structure in regions of moment space. An analytical model is developed for the MER PDF applicable near a known singular line in a centered, two-component, third- and fourth-order moment (μ_3, μ_4) space, consistent with the general problem of five moments. The model consists of the superposition of a perturbed, centered Gaussian PDF and a small-amplitude packet of PDF-density, called the outlying moment packet (OMP), sitting far from the mean. Asymptotic solutions are obtained which predict the shape of the perturbed Gaussian and both the amplitude and position on the real line of the OMP. The asymptotic solutions show that the presence of the OMP gives rise to an MER solution that is singular along a line in (μ_3, μ_4) space emanating from, but not including, the point representing a standard normal distribution, or thermodynamic equilibrium. We use this analysis of the OMP to develop a numerical regularization of the MER, creating a procedure we call the Hybrid MER (HMER). Compared with the MER, the HMER is a significant improvement in terms of robustness and efficiency while preserving accuracy in its prediction of other important distribution features, such as higher order moments.

  9. Maximum entropy method approach to the θ term

    International Nuclear Information System (INIS)

    Imachi, Masahiro; Shinno, Yasuhiko; Yoneyama, Hiroshi

    2004-01-01

    In Monte Carlo simulations of lattice field theory with a θ term, one confronts the complex weight problem, or the sign problem. This is circumvented by performing the Fourier transform of the topological charge distribution P(Q). This procedure, however, causes a flattening phenomenon of the free energy f(θ), which makes study of the phase structure infeasible. In order to treat this problem, we apply the maximum entropy method (MEM) to a Gaussian form of P(Q), which serves as a good example to test whether the MEM can be applied effectively to the θ term. We study the case with flattening as well as that without flattening. In the latter case, the results of the MEM agree with those obtained from the direct application of the Fourier transform. For the former, the MEM gives a smoother f(θ) than that of the Fourier transform. Among the various default models investigated, the images which yield the least error do not show flattening, although some others cannot be excluded given the uncertainty related to statistical error. (author)

  10. Multivariate refined composite multiscale entropy analysis

    International Nuclear Information System (INIS)

    Humeau-Heurtier, Anne

    2016-01-01

    Multiscale entropy (MSE) has become a prevailing method to quantify signals complexity. MSE relies on sample entropy. However, MSE may yield imprecise complexity estimation at large scales, because sample entropy does not give precise estimation of entropy when short signals are processed. A refined composite multiscale entropy (RCMSE) has therefore recently been proposed. Nevertheless, RCMSE is for univariate signals only. The simultaneous analysis of multi-channel (multivariate) data often over-performs studies based on univariate signals. We therefore introduce an extension of RCMSE to multivariate data. Applications of multivariate RCMSE to simulated processes reveal its better performances over the standard multivariate MSE. - Highlights: • Multiscale entropy quantifies data complexity but may be inaccurate at large scale. • A refined composite multiscale entropy (RCMSE) has therefore recently been proposed. • Nevertheless, RCMSE is adapted to univariate time series only. • We herein introduce an extension of RCMSE to multivariate data. • It shows better performances than the standard multivariate multiscale entropy.
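    A hedged sketch of the univariate refined composite multiscale entropy that the paper extends to multivariate signals is given below: template-match counts are pooled over all coarse-graining offsets before the logarithm is taken, which is what stabilizes the estimate at large scales. The embedding dimension m, tolerance r and the white-noise test signal are illustrative choices, and the multivariate extension itself is not reproduced.

```python
# Minimal sketch of univariate refined composite multiscale entropy (RCMSE).
import numpy as np

def _pair_counts(y, m, rr):
    """Template-match pair counts for lengths m and m+1 (same N-m starting points)."""
    N = len(y)
    counts = []
    for length in (m, m + 1):
        templ = np.array([y[i:i + length] for i in range(N - m)])
        c = 0
        for i in range(len(templ) - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)   # Chebyshev distance
            c += np.sum(d < rr)
        counts.append(c)
    return counts   # [B (length m), A (length m+1)]

def coarse_grain(x, scale, offset=0):
    n = ((len(x) - offset) // scale) * scale
    return x[offset:offset + n].reshape(-1, scale).mean(axis=1)

def rcmse(x, scale, m=2, r=0.15):
    rr = r * np.std(x)              # tolerance from the original signal's spread
    A = B = 0
    for k in range(scale):          # refined composite: pool counts over offsets
        b, a = _pair_counts(coarse_grain(x, scale, k), m, rr)
        B += b
        A += a
    return np.inf if A == 0 or B == 0 else -np.log(A / B)

white = np.random.default_rng(0).normal(size=2000)
print([round(rcmse(white, s), 3) for s in (1, 2, 3, 4, 5)])
```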

  11. Mammographic image restoration using maximum entropy deconvolution

    International Nuclear Information System (INIS)

    Jannetta, A; Jackson, J C; Kotre, C J; Birch, I P; Robson, K J; Padgett, R

    2004-01-01

    An image restoration approach based on a Bayesian maximum entropy method (MEM) has been applied to a radiological image deconvolution problem, that of reduction of geometric blurring in magnification mammography. The aim of the work is to demonstrate an improvement in image spatial resolution in realistic noisy radiological images with no associated penalty in terms of reduction in the signal-to-noise ratio perceived by the observer. Images of the TORMAM mammographic image quality phantom were recorded using the standard magnification settings of 1.8 magnification/fine focus and also at 1.8 magnification/broad focus and 3.0 magnification/fine focus; the latter two arrangements would normally give rise to unacceptable geometric blurring. Measured point-spread functions were used in conjunction with the MEM image processing to de-blur these images. The results are presented as comparative images of phantom test features and as observer scores for the raw and processed images. Visualization of high resolution features and the total image scores for the test phantom were improved by the application of the MEM processing. It is argued that this successful demonstration of image de-blurring in noisy radiological images offers the possibility of weakening the link between focal spot size and geometric blurring in radiology, thus opening up new approaches to system optimization

  12. Population distribution of flexible molecules from maximum entropy analysis using different priors as background information: application to the Φ, Ψ-conformational space of the α-(1-->2)-linked mannose disaccharide present in N- and O-linked glycoproteins.

    Science.gov (United States)

    Säwén, Elin; Massad, Tariq; Landersjö, Clas; Damberg, Peter; Widmalm, Göran

    2010-08-21

    The conformational space available to the flexible molecule α-D-Manp-(1-->2)-α-D-Manp-OMe, a model for the α-(1-->2)-linked mannose disaccharide in N- or O-linked glycoproteins, is determined using experimental data and molecular simulation combined with a maximum entropy approach that leads to a converged population distribution utilizing different input information. A database survey of the Protein Data Bank, in which structures containing the constituent disaccharide were retrieved, resulted in an ensemble of >200 structures. Subsequent filtering removed erroneous structures and gave the database (DB) ensemble having three classes of mannose-containing compounds, viz., N- and O-linked structures, and ligands to proteins. A molecular dynamics (MD) simulation of the disaccharide revealed a two-state equilibrium with a major and a minor conformational state, i.e., the MD ensemble. These two different conformation ensembles of the disaccharide were compared to measured experimental spectroscopic data for the molecule in water solution. However, neither of the two populations was compatible with the experimental data from optical rotation, NMR (1)H,(1)H cross-relaxation rates, or homo- and heteronuclear (3)J couplings. The conformational distributions were subsequently used as background information to generate priors that were used in a maximum entropy analysis. The resulting posteriors, i.e., the population distributions after the application of the maximum entropy analysis, still showed notable deviations that were not anticipated based on the prior information. Therefore, reparameterization of the homo- and heteronuclear Karplus relationships for the glycosidic torsion angles Φ and Ψ was carried out, in which the importance of electronegative substituents on the coupling pathway was deemed essential, resulting in four derived equations, two (3)J(COCC) and two (3)J(COCH), being different for the Φ and Ψ torsions, respectively.
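    The link between the Karplus relationships and the maximum-entropy population analysis is the ensemble-averaged coupling constant: each candidate population distribution predicts a weighted-average (3)J that is compared with experiment. The hedged sketch below uses a generic three-bond Karplus form, 3J(θ) = A cos²θ + B cosθ + C, with textbook-style coefficients and made-up torsion populations for illustration, not the four glycosidic-specific equations derived by the authors.

```python
# Minimal sketch: population-weighted average of a generic Karplus relationship.
import numpy as np

def karplus(theta_deg, A=7.0, B=-1.0, C=0.7):
    # generic three-bond coupling curve; coefficients are illustrative only
    t = np.radians(theta_deg)
    return A * np.cos(t) ** 2 + B * np.cos(t) + C

phi = np.array([-40.0, 60.0, 180.0])        # illustrative torsion states (degrees)
pop = np.array([0.7, 0.2, 0.1])             # illustrative populations (sum to 1)
print("<3J> =", round(float(np.sum(pop * karplus(phi))), 2), "Hz")
```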

  13. Study on Droplet Size and Velocity Distributions of a Pressure Swirl Atomizer Based on the Maximum Entropy Formalism

    Directory of Open Access Journals (Sweden)

    Kai Yan

    2015-01-01

    Full Text Available A predictive model for the droplet size and velocity distributions of a pressure swirl atomizer is proposed based on the maximum entropy formalism (MEF). The constraint conditions of the MEF model include the conservation laws of mass, momentum, and energy. The effects of liquid swirling strength, Weber number, gas-to-liquid axial velocity ratio and gas-to-liquid density ratio on the droplet size and velocity distributions of a pressure swirl atomizer are investigated. Results show that the model based on the maximum entropy formalism works well to predict droplet size and velocity distributions under different spray conditions. Liquid swirling strength, Weber number, gas-to-liquid axial velocity ratio and gas-to-liquid density ratio have different effects on the droplet size and velocity distributions of a pressure swirl atomizer.

  14. Optimized Kernel Entropy Components.

    Science.gov (United States)

    Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau

    2017-06-01

    This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of the importance of kernel eigenvectors by entropy instead of variance, as in the kernel principal components analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by means of compacting the information in very few features (often in just one or two). The proposed method produces features which have higher expressive power. In particular, it is based on the independent component analysis framework, and introduces an extra rotation to the eigen decomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both the methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. The results of both the methods are illustrated in different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, the most successful rule for estimating the kernel parameter is based on maximum likelihood, and OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.

  15. Maximum entropy approach to statistical inference for an ocean acoustic waveguide.

    Science.gov (United States)

    Knobles, D P; Sagers, J D; Koch, R A

    2012-02-01

    A conditional probability distribution suitable for estimating the statistical properties of ocean seabed parameter values inferred from acoustic measurements is derived from a maximum entropy principle. The specification of the expectation value for an error function constrains the maximization of an entropy functional. This constraint determines the sensitivity factor (β) to the error function of the resulting probability distribution, which is a canonical form that provides a conservative estimate of the uncertainty of the parameter values. From the conditional distribution, marginal distributions for individual parameters can be determined from integration over the other parameters. The approach is an alternative to obtaining the posterior probability distribution without an intermediary determination of the likelihood function followed by an application of Bayes' rule. In this paper the expectation value that specifies the constraint is determined from the values of the error function for the model solutions obtained from a sparse number of data samples. The method is applied to ocean acoustic measurements taken on the New Jersey continental shelf. The marginal probability distribution for the values of the sound speed ratio at the surface of the seabed and the source levels of a towed source are examined for different geoacoustic model representations. © 2012 Acoustical Society of America

  16. Calculating the Prior Probability Distribution for a Causal Network Using Maximum Entropy: Alternative Approaches

    Directory of Open Access Journals (Sweden)

    Michael J. Markham

    2011-07-01

    Full Text Available Some problems occurring in Expert Systems can be resolved by employing a causal (Bayesian) network, and methodologies exist for this purpose. These require data in a specific form and make assumptions about the independence relationships involved. Methodologies using Maximum Entropy (ME) are free from these conditions and have the potential to be used in a wider context, including systems consisting of given sets of linear and independence constraints, subject to consistency and convergence. ME can also be used to validate results from the causal network methodologies. Three ME methods for determining the prior probability distribution of causal network systems are considered. The first method is Sequential Maximum Entropy, in which the computation of a progression of local distributions leads to the overall distribution. This is followed by development of the Method of Tribus. The development takes the form of an algorithm that includes the handling of explicit independence constraints. These fall into two groups: those relating parents of vertices, and those deduced from triangulation of the remaining graph. The third method involves a variation in the part of that algorithm which handles independence constraints. Evidence is presented that this adaptation only requires the linear constraints and the parental independence constraints to emulate the second method in a substantial class of examples.

  17. A maximum entropy reconstruction technique for tomographic particle image velocimetry

    International Nuclear Information System (INIS)

    Bilsky, A V; Lozhkin, V A; Markovich, D M; Tokarev, M P

    2013-01-01

    This paper studies a novel approach for reducing tomographic PIV computational complexity. The proposed approach is an algebraic reconstruction technique, termed MENT (maximum entropy). This technique computes the three-dimensional light intensity distribution several times faster than SMART, using at least ten times less memory. Additionally, the reconstruction quality remains nearly the same as with SMART. This paper presents the theoretical computation performance comparison for MENT, SMART and MART, followed by validation using synthetic particle images. Both the theoretical assessment and validation of synthetic images demonstrate significant computational time reduction. The data processing accuracy of MENT was compared to that of SMART in a slot jet experiment. A comparison of the average velocity profiles shows a high level of agreement between the results obtained with MENT and those obtained with SMART. (paper)
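    For context on the family of algebraic reconstruction techniques compared above, the sketch below shows the classical multiplicative (MART-type) update, in which each voxel intensity is rescaled by the ratio of the measured to the predicted projection. This is a hedged toy illustration, not the authors' MENT implementation; the weight matrix, projections and relaxation parameter are made-up values.

```python
# Minimal sketch of a multiplicative algebraic reconstruction (MART-type) update.
import numpy as np

rng = np.random.default_rng(0)
x_true = rng.uniform(0.1, 1.0, 6)          # "true" voxel intensities (toy problem)
W = rng.uniform(0.0, 1.0, (4, 6))          # weight of each voxel in each line of sight
b = W @ x_true                             # recorded projections

x = np.ones_like(x_true)                   # positive initial guess
mu = 0.5                                   # relaxation parameter
for _ in range(200):
    for i in range(len(b)):                # one multiplicative update per ray
        ratio = b[i] / (W[i] @ x)
        x *= ratio ** (mu * W[i])

print("projection residual:", np.round(W @ x - b, 6))   # projections are matched
```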

  18. A parametrization of two-dimensional turbulence based on a maximum entropy production principle with a local conservation of energy

    International Nuclear Information System (INIS)

    Chavanis, Pierre-Henri

    2014-01-01

    In the context of two-dimensional (2D) turbulence, we apply the maximum entropy production principle (MEPP) by enforcing a local conservation of energy. This leads to an equation for the vorticity distribution that conserves all the Casimirs, the energy, and that increases monotonically the mixing entropy (H-theorem). Furthermore, the equation for the coarse-grained vorticity dissipates monotonically all the generalized enstrophies. These equations may provide a parametrization of 2D turbulence. They do not generally relax towards the maximum entropy state. The vorticity current vanishes for any steady state of the 2D Euler equation. Interestingly, the equation for the coarse-grained vorticity obtained from the MEPP turns out to coincide, after some algebraic manipulations, with the one obtained with the anticipated vorticity method. This shows a connection between these two approaches when the conservation of energy is treated locally. Furthermore, the newly derived equation, which incorporates a diffusion term and a drift term, has a nice physical interpretation in terms of a selective decay principle. This sheds new light on both the MEPP and the anticipated vorticity method. (paper)

  19. Measuring Coupling of Rhythmical Time Series Using Cross Sample Entropy and Cross Recurrence Quantification Analysis

    Directory of Open Access Journals (Sweden)

    John McCamley

    2017-01-01

    Full Text Available The aim of this investigation was to compare and contrast the use of cross sample entropy (xSE and cross recurrence quantification analysis (cRQA measures for the assessment of coupling of rhythmical patterns. Measures were assessed using simulated signals with regular, chaotic, and random fluctuations in frequency, amplitude, and a combination of both. Biological data were studied as models of normal and abnormal locomotor-respiratory coupling. Nine signal types were generated for seven frequency ratios. Fifteen patients with COPD (abnormal coupling and twenty-one healthy controls (normal coupling walked on a treadmill at three speeds while breathing and walking were recorded. xSE and the cRQA measures of percent determinism, maximum line, mean line, and entropy were quantified for both the simulated and experimental data. In the simulated data, xSE, percent determinism, and entropy were influenced by the frequency manipulation. The 1 : 1 frequency ratio was different than other frequency ratios for almost all measures and/or manipulations. The patients with COPD used a 2 : 3 ratio more often and xSE, percent determinism, maximum line, mean line, and cRQA entropy were able to discriminate between the groups. Analysis of the effects of walking speed indicated that all measures were able to discriminate between speeds.
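    A hedged sketch of the cross sample entropy (xSE) measure used above is given below: templates taken from one normalized signal are matched against templates from the other, and the entropy is the negative log of the ratio of (m+1)- to m-length match counts. The embedding dimension m, tolerance r and the two coupled test signals are illustrative choices, not the locomotor-respiratory recordings of the study.

```python
# Minimal sketch of cross sample entropy between two normalized signals.
import numpy as np

def cross_sample_entropy(u, v, m=2, r=0.2):
    u = (u - u.mean()) / u.std()
    v = (v - v.mean()) / v.std()
    N = min(len(u), len(v))

    def matches(length):
        tu = np.array([u[i:i + length] for i in range(N - m)])
        tv = np.array([v[i:i + length] for i in range(N - m)])
        c = 0
        for i in range(len(tu)):
            d = np.max(np.abs(tv - tu[i]), axis=1)   # u-template against all v-templates
            c += np.sum(d < r)
        return c

    A, B = matches(m + 1), matches(m)
    return np.inf if A == 0 or B == 0 else -np.log(A / B)

t = np.linspace(0, 20 * np.pi, 2000)
breathing = np.sin(t)                                            # slower rhythm
stride = np.sin(2 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)  # ~2:1 coupled rhythm
print(round(cross_sample_entropy(breathing, stride), 3))
```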

  20. Bayesian Maximum Entropy Based Algorithm for Digital X-ray Mammogram Processing

    Directory of Open Access Journals (Sweden)

    Radu Mutihac

    2009-06-01

    Full Text Available Basics of Bayesian statistics in inverse problems using the maximum entropy principle are summarized in connection with the restoration of positive, additive images from various types of data, such as X-ray digital mammograms. An efficient iterative algorithm for image restoration from large data sets, based on the conjugate gradient method and Lagrange multipliers in the nonlinear optimization of a specific potential function, was developed. The point spread function of the imaging system was determined by numerical simulations of inhomogeneous breast-like tissue with microcalcification inclusions of various opacities. The processed digital and digitized mammograms proved superior to their raw counterparts in terms of contrast, resolution, noise, and visibility of details.

  1. Spatiotemporal analysis and mapping of oral cancer risk in Changhua County (Taiwan): an application of the generalized Bayesian maximum entropy method.

    Science.gov (United States)

    Yu, Hwa-Lung; Chiang, Chi-Ting; Lin, Shu-De; Chang, Tsun-Kuo

    2010-02-01

    The incidence rate of oral cancer in Changhua County was the highest among the 23 counties of Taiwan during 2001. However, in health data analysis, crude or adjusted incidence rates of a rare event (e.g., cancer) for small populations often exhibit high variances and are, thus, less reliable. We proposed a generalized Bayesian Maximum Entropy (GBME) analysis of spatiotemporal disease mapping under conditions of considerable data uncertainty. GBME was used to study the oral cancer population incidence in Changhua County (Taiwan). Methodologically, GBME is based on an epistematics principles framework and generates spatiotemporal estimates of oral cancer incidence rates. In this way, it accounts for the multi-sourced uncertainty of the rates, including small-population effects, and the composite space-time dependence of rare events in terms of an extended Poisson-based semivariogram. The results showed that GBME analysis alleviates the noise in the oral cancer data arising from the population-size effect. Compared to the raw incidence data, maps of the GBME-estimated results can identify high-risk oral cancer regions in Changhua County, where the prevalence of betel quid chewing and cigarette smoking is relatively higher than in the rest of the areas. The GBME method is a valuable tool for spatiotemporal disease mapping under conditions of uncertainty. 2010 Elsevier Inc. All rights reserved.

  2. Low Streamflow Forecasting using Minimum Relative Entropy

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2013-12-01

    Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation structure in such a way that the relative entropy of the underlying process is minimized, so that the time series can be forecasted. Different prior estimates, such as uniform, exponential and Gaussian assumptions, are used to estimate the spectral density, depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of the low streamflow series with higher resolution than the conventional method. The forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.
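    As background to the comparison with Burg's MESA, the sketch below computes a maximum-entropy (autoregressive) spectral estimate; for brevity the AR coefficients are obtained here by the Yule-Walker equations rather than Burg's recursion, and the test series is a synthetic noisy seasonal signal rather than the Colorado River data. The model order and frequency grid are assumed values.

```python
# Minimal sketch of an AR (maximum-entropy) spectral estimate via Yule-Walker.
import numpy as np
from scipy.linalg import solve_toeplitz

def ar_maxent_spectrum(x, order, freqs):
    x = x - np.mean(x)
    n = len(x)
    acov = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    a = solve_toeplitz(acov[:order], acov[1:order + 1])   # Yule-Walker AR coefficients
    sigma2 = acov[0] - a @ acov[1:order + 1]              # innovation variance
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, order + 1)))
    return sigma2 / np.abs(1.0 - z @ a) ** 2              # AR power spectral density

rng = np.random.default_rng(0)
t = np.arange(360)
flow = 1.0 + 0.5 * np.sin(2 * np.pi * t / 12) + 0.2 * rng.normal(size=t.size)  # synthetic monthly flow
freqs = np.linspace(0.01, 0.5, 200)
psd = ar_maxent_spectrum(flow, order=12, freqs=freqs)
print("spectral peak near", round(freqs[int(np.argmax(psd))], 3), "cycles/month")
```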

  3. n-Order and maximum fuzzy similarity entropy for discrimination of signals of different complexity: Application to fetal heart rate signals.

    Science.gov (United States)

    Zaylaa, Amira; Oudjemia, Souad; Charara, Jamal; Girault, Jean-Marc

    2015-09-01

    This paper presents two new concepts for discrimination of signals of different complexity. The first focused initially on solving the problem of setting entropy descriptors by varying the pattern size instead of the tolerance. This led to the search for the optimal pattern size that maximized the similarity entropy. The second paradigm was based on the n-order similarity entropy that encompasses the 1-order similarity entropy. To improve the statistical stability, n-order fuzzy similarity entropy was proposed. Fractional Brownian motion was simulated to validate the different methods proposed, and fetal heart rate signals were used to discriminate normal from abnormal fetuses. In all cases, it was found that it was possible to discriminate time series of different complexity such as fractional Brownian motion and fetal heart rate signals. The best levels of performance in terms of sensitivity (90%) and specificity (90%) were obtained with the n-order fuzzy similarity entropy. However, it was shown that the optimal pattern size and the maximum similarity measurement were related to intrinsic features of the time series. Copyright © 2015 Elsevier Ltd. All rights reserved.
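
    To make the idea of scanning the pattern size concrete, the sketch below computes a standard sample (similarity) entropy and searches for the pattern size m that maximizes it. The fuzzy and n-order extensions of the paper are not reproduced; the tolerance r, the range of m, and the surrogate signal (integrated white noise standing in for fractional Brownian motion) are illustrative assumptions.

```python
import numpy as np

def sample_entropy(x, m, r):
    """SampEn(m, r) = -ln(A/B), where A and B count matching template pairs of
    length m+1 and m (Chebyshev distance, tolerance r)."""
    x = np.asarray(x, float)
    N = len(x)

    def match_pairs(dim):
        templ = np.array([x[i:i + dim] for i in range(N - m)])
        count = 0
        for i in range(len(templ) - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    B, A = match_pairs(m), match_pairs(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def max_similarity_entropy(x, r, m_range=range(1, 8)):
    """Scan the pattern size m and return the size giving the largest finite entropy."""
    vals = {m: sample_entropy(x, m, r) for m in m_range}
    finite = {m: v for m, v in vals.items() if np.isfinite(v)}
    m_opt = max(finite, key=finite.get)
    return m_opt, finite[m_opt]

# Example on integrated white noise (a rough stand-in for fractional Brownian motion).
rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(1000))
print(max_similarity_entropy(x, r=0.2 * np.std(x)))
```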

  4. Identification of a Threshold Value for the DEMATEL Method: Using the Maximum Mean De-Entropy Algorithm

    Science.gov (United States)

    Chung-Wei, Li; Gwo-Hshiung, Tzeng

    To deal with complex problems, structuring them through graphical representations and analyzing causal influences can aid in illuminating complex issues, systems, or concepts. The DEMATEL method is a methodology which can be used for researching and solving complicated and intertwined problem groups. The end product of the DEMATEL process is a visual representation—the impact-relations map—by which respondents organize their own actions in the world. The applicability of the DEMATEL method is widespread, ranging from analyzing world problematique decision making to industrial planning. The most important property of the DEMATEL method used in the multi-criteria decision making (MCDM) field is to construct interrelations between criteria. In order to obtain a suitable impact-relations map, an appropriate threshold value is needed to obtain adequate information for further analysis and decision-making. In this paper, we propose a method based on the entropy approach, the maximum mean de-entropy algorithm, to achieve this purpose. Using a real case of finding the interrelationships between the criteria for evaluating effects in E-learning programs as an example, we compare the results obtained from the respondents with those obtained from our method, and discuss the differences between the impact-relations maps produced by the two approaches.

  5. Entropy and equilibrium via games of complexity

    Science.gov (United States)

    Topsøe, Flemming

    2004-09-01

    It is suggested that thermodynamical equilibrium equals game theoretical equilibrium. Aspects of this thesis are discussed. The philosophy is consistent with maximum entropy thinking of Jaynes, but goes one step deeper by deriving the maximum entropy principle from an underlying game theoretical principle. The games introduced are based on measures of complexity. Entropy is viewed as minimal complexity. It is demonstrated that Tsallis entropy ( q-entropy) and Kaniadakis entropy ( κ-entropy) can be obtained in this way, based on suitable complexity measures. A certain unifying effect is obtained by embedding these measures in a two-parameter family of entropy functions.

  6. Estimation of typhoon rainfall in GaoPing River: A Multivariate Maximum Entropy Method

    Science.gov (United States)

    Pei-Jui, Wu; Hwa-Lung, Yu

    2016-04-01

    The heavy rainfall brought by typhoons is the main cause of natural disasters in Taiwan and leads to significant losses of human lives and property. On average, 3.5 typhoons invade Taiwan every year, and Typhoon Morakot in 2009 had the most severe impact in recorded history. Because the duration, path and intensity of a typhoon affect the temporal and spatial rainfall pattern in a specific region, identifying the characteristics of the typhoon rainfall pattern is advantageous when estimating rainfall amounts. This study developed a rainfall prediction model that can be divided into three parts. First, the EEOF (extended empirical orthogonal function) is used to classify typhoon events by decomposing the standard rainfall patterns of all stations for each typhoon event into EOFs and PCs (principal components), so that events which vary similarly in time and space are grouped into the same typhoon type. Next, according to this classification, the PDF (probability density function) at each station and time is constructed by means of the multivariate maximum entropy method using the first to fourth statistical moments. Finally, the BME (Bayesian Maximum Entropy) method is used to construct the typhoon rainfall prediction model and to estimate rainfall for the case of the GaoPing River, located in southern Taiwan. This study could be useful for future typhoon rainfall prediction and suitable for government typhoon disaster prevention.
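
    As a concrete illustration of the moment-based step, the sketch below constructs a univariate maximum entropy probability density from the first four raw moments by minimizing the convex dual of the entropy functional. It is a generic illustration rather than the authors' implementation; the grid, the example moments (those of a normal distribution), and the optimizer settings are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def maxent_density(moments, grid):
    """Maximum entropy pdf on `grid` matching raw moments E[x^k], k = 1..len(moments)."""
    m = np.asarray(moments, float)
    powers = np.vstack([grid ** k for k in range(1, len(m) + 1)])   # shape (K, n)

    def dual(lam):
        # Convex dual: log-partition function plus lam . m (exponent clipped for safety).
        w = np.exp(np.clip(-lam @ powers, -700.0, 700.0))
        Z = np.trapz(w, grid)
        grad = m - np.trapz(powers * w, grid, axis=1) / Z
        return np.log(Z) + lam @ m, grad

    lam = minimize(dual, np.zeros(len(m)), jac=True, method="BFGS").x
    p = np.exp(np.clip(-lam @ powers, -700.0, 700.0))
    return p / np.trapz(p, grid)

# Illustrative check: the first four raw moments of N(0.5, 1) yield a Gaussian-like pdf.
grid = np.linspace(-5.0, 5.0, 2001)
pdf = maxent_density([0.5, 1.25, 1.625, 4.5625], grid)
print(np.trapz(grid * pdf, grid))   # approximately 0.5
```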

  7. Entropy and transverse section reconstruction

    International Nuclear Information System (INIS)

    Gullberg, G.T.

    1976-01-01

    A new approach to the reconstruction of a transverse section using projection data from multiple views incorporates the concept of maximum entropy. The principle of maximizing information entropy embodies the assurance of minimizing bias or prejudice in the reconstruction. Maximum entropy is used as a necessary condition for the reconstructed image. This entropy criterion is most appropriate for 3-D reconstruction of objects from projections where the system is underdetermined or the data are limited statistically. This is the case in nuclear medicine, where time limitations in patient studies do not yield sufficient projections
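
    For orientation, the entropy criterion described above is commonly stated as the following constrained optimization (a generic formulation of the principle, not a formula quoted from this report), where the f_j are image values, the p_i the measured projections, and the A_ij the projection weights:

```latex
\max_{f_j \ge 0}\; S(f) = -\sum_j f_j \ln f_j
\quad \text{subject to} \quad \sum_j A_{ij} f_j = p_i , \qquad i = 1, \dots, M .
```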

  8. Vertical and horizontal processes in the global atmosphere and the maximum entropy production conjecture

    Directory of Open Access Journals (Sweden)

    S. Pascale

    2012-01-01

    Full Text Available The objective of this paper is to reconsider the Maximum Entropy Production conjecture (MEP) in the context of a very simple two-dimensional zonal-vertical climate model able to represent the total material entropy production due at the same time to both horizontal and vertical heat fluxes. MEP is applied first to a simple four-box model of climate which accounts for both horizontal and vertical material heat fluxes. It is shown that, under conditions of fixed insolation, a MEP solution is found with reasonably realistic temperature and heat fluxes, thus generalising results from independent two-box horizontal or vertical models. It is also shown that the meridional and the vertical entropy production terms are independently involved in the maximisation and thus MEP can be applied to each subsystem with fixed boundary conditions. We then extend the four-box model by increasing its resolution, and compare it with GCM output. A MEP solution is found which is fairly realistic as far as the horizontal large scale organisation of the climate is concerned whereas the vertical structure looks to be unrealistic and presents seriously unstable features. This study suggests that the thermal meridional structure of the atmosphere is predicted fairly well by MEP once the insolation is given but the vertical structure of the atmosphere cannot be predicted satisfactorily by MEP unless constraints are imposed to represent the determination of longwave absorption by water vapour and clouds as a function of the state of the climate. Furthermore, an order-of-magnitude estimate of contributions to the material entropy production due to horizontal and vertical processes within the climate system is provided by using two different methods. In both cases we found that approximately 40 mW m−2 K−1 of material entropy production is due to vertical heat transport and 5–7 mW m−2 K−1 to horizontal heat transport.

  9. On the maximum-entropy/autoregressive modeling of time series

    Science.gov (United States)

    Chao, B. F.

    1984-01-01

    The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (or the z domain) on the one hand, to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (or the frequency domain), is nothing but a convenient, though ambiguous, visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
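
    To make the pole/spectrum relationship concrete, here is a minimal sketch of Burg-type AR estimation and the resulting maximum entropy spectrum. The recursion and the test signal are standard textbook material rather than anything specific to this abstract; the model order and frequency grid are illustrative assumptions.

```python
import numpy as np

def burg_ar(x, order):
    """Burg's method: AR coefficients a (a[0] = 1) and prediction error power E."""
    x = np.asarray(x, float)
    f, b = x[1:].copy(), x[:-1].copy()       # forward / backward prediction errors
    a, E = np.array([1.0]), np.dot(x, x) / len(x)
    for _ in range(order):
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))   # reflection coefficient
        a = np.concatenate([a, [0.0]])
        a = a + k * a[::-1]                   # Levinson-type update of the AR polynomial
        E *= (1.0 - k * k)
        f, b = f[1:] + k * b[1:], b[:-1] + k * f[:-1]
    return a, E

def me_spectrum(a, E, nfreq=512):
    """Maximum entropy (AR) power spectrum on normalized frequencies [0, 0.5]."""
    w = np.linspace(0.0, 0.5, nfreq)
    z = np.exp(-2j * np.pi * np.outer(w, np.arange(len(a))))
    return w, E / np.abs(z @ a) ** 2

# Example: a damped sinusoid in noise; the z-plane poles sit near the true frequency.
rng = np.random.default_rng(0)
n = np.arange(500)
x = np.exp(-0.002 * n) * np.sin(2 * np.pi * 0.1 * n) + 0.1 * rng.standard_normal(500)
a, E = burg_ar(x, order=8)
poles = np.roots(a)                           # complex poles <-> complex frequencies
w, P = me_spectrum(a, E)
print(w[np.argmax(P)])                        # close to 0.1 cycles/sample
```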

  10. Using Maximum Entropy to Find Patterns in Genomes

    Science.gov (United States)

    Liu, Sophia; Hockenberry, Adam; Lancichinetti, Andrea; Jewett, Michael; Amaral, Luis

    The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. To accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. This approach can also easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs which will lead to a better understanding of biological processes. National Institute of General Medical Science, Northwestern University Presidential Fellowship, National Science Foundation, David and Lucile Packard Foundation, Camille Dreyfus Teacher Scholar Award.
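
    A minimal sketch of the exponential-tilting idea behind such a null model is shown below: within each amino acid, synonymous codons are weighted by exp(lambda * GC fraction), and the single Lagrange multiplier lambda is tuned so that the expected GC content matches a target. The tiny codon table, the target GC value, and the bisection search are toy assumptions for illustration and do not reproduce the authors' tool.

```python
import numpy as np

# Toy subset of the synonymous-codon table (illustration only).
SYNONYMS = {"K": ["AAA", "AAG"], "F": ["TTT", "TTC"],
            "G": ["GGT", "GGC", "GGA", "GGG"]}

def gc_frac(codon):
    return sum(c in "GC" for c in codon) / 3.0

def codon_probs(aa, lam):
    """Maximum entropy (exponentially tilted) codon distribution for one amino acid."""
    codons = SYNONYMS[aa]
    w = np.exp(lam * np.array([gc_frac(c) for c in codons]))
    return codons, w / w.sum()

def expected_gc(protein, lam):
    vals = []
    for aa in protein:
        codons, p = codon_probs(aa, lam)
        vals.append(np.dot(p, [gc_frac(c) for c in codons]))
    return float(np.mean(vals))

def fit_lambda(protein, target_gc, lo=-50.0, hi=50.0, tol=1e-8):
    """Bisection on the single Lagrange multiplier that controls expected GC content."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if expected_gc(protein, mid) < target_gc else (lo, mid)
    return 0.5 * (lo + hi)

def sample_sequence(protein, lam, rng):
    out = []
    for aa in protein:
        codons, p = codon_probs(aa, lam)
        out.append(rng.choice(codons, p=p))
    return "".join(out)

protein, rng = "KKFFGG", np.random.default_rng(0)
lam = fit_lambda(protein, target_gc=0.45)
print(sample_sequence(protein, lam, rng), round(expected_gc(protein, lam), 3))
```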

  11. Electron density profile reconstruction by maximum entropy method with multichannel HCN laser interferometer system on SPAC VII

    International Nuclear Information System (INIS)

    Kubo, S.; Narihara, K.; Tomita, Y.; Hasegawa, M.; Tsuzuki, T.; Mohri, A.

    1988-01-01

    A multichannel HCN laser interferometer system has been developed to investigate the plasma electron confinement properties in SPAC VII device. Maximum entropy method is applied to reconstruct the electron density profile from measured line integrated data. Particle diffusion coefficient in the peripheral region of the REB ring core spherator was obtained from the evolution of the density profile. (author)

  12. Exact Maximum-Entropy Estimation with Feynman Diagrams

    Science.gov (United States)

    Netser Zernik, Amitai; Schlank, Tomer M.; Tessler, Ran J.

    2018-02-01

    A longstanding open problem in statistics is finding an explicit expression for the probability measure which maximizes entropy with respect to given constraints. In this paper a solution to this problem is found, using perturbative Feynman calculus. The explicit expression is given as a sum over weighted trees.

  13. Non-equilibrium thermodynamics, maximum entropy production and Earth-system evolution.

    Science.gov (United States)

    Kleidon, Axel

    2010-01-13

    The present-day atmosphere is in a unique state far from thermodynamic equilibrium. This uniqueness is for instance reflected in the high concentration of molecular oxygen and the low relative humidity in the atmosphere. Given that the concentration of atmospheric oxygen has likely increased throughout Earth-system history, we can ask whether this trend can be generalized to a trend of Earth-system evolution that is directed away from thermodynamic equilibrium, why we would expect such a trend to take place and what it would imply for Earth-system evolution as a whole. The justification for such a trend could be found in the proposed general principle of maximum entropy production (MEP), which states that non-equilibrium thermodynamic systems maintain steady states at which entropy production is maximized. Here, I justify and demonstrate this application of MEP to the Earth at the planetary scale. I first describe the non-equilibrium thermodynamic nature of Earth-system processes and distinguish processes that drive the system's state away from equilibrium from those that are directed towards equilibrium. I formulate the interactions among these processes from a thermodynamic perspective and then connect them to a holistic view of the planetary thermodynamic state of the Earth system. In conclusion, non-equilibrium thermodynamics and MEP have the potential to provide a simple and holistic theory of Earth-system functioning. This theory can be used to derive overall evolutionary trends of the Earth's past, identify the role that life plays in driving thermodynamic states far from equilibrium, identify habitability in other planetary environments and evaluate human impacts on Earth-system functioning. This journal is © 2010 The Royal Society

  14. Image Segmentation using a Refined Comprehensive Learning Particle Swarm Optimizer for Maximum Tsallis Entropy Thresholding

    OpenAIRE

    L. Jubair Ahmed; A. Ebenezer Jeyakumar

    2013-01-01

    Thresholding is one of the most important techniques for performing image segmentation. In this paper, to compute optimum thresholds for the maximum Tsallis entropy thresholding (MTET) model, a new hybrid algorithm is proposed by integrating the Comprehensive Learning Particle Swarm Optimizer (CPSO) with Powell's Conjugate Gradient (PCG) method. Here the CPSO acts as the main optimizer for searching the near-optimal thresholds, while the PCG method is used to fine tune the best solutio...

  15. Multiscale permutation entropy analysis of electrocardiogram

    Science.gov (United States)

    Liu, Tiebing; Yao, Wenpo; Wu, Min; Shi, Zhaorong; Wang, Jun; Ning, Xinbao

    2017-04-01

    To make a comprehensive nonlinear analysis of ECG, multiscale permutation entropy (MPE) was applied to ECG feature extraction. Three kinds of ECG from the PhysioNet database, from congestive heart failure (CHF) patients, healthy young and elderly subjects, are used in this paper. We set the embedding dimension to 4, adjust the scale factor from 2 to 100 with a step size of 2, and compare MPE with multiscale entropy (MSE). As the scale factor increases, the MPE complexity of the three ECG signals first decreases and then increases. When the scale factor is between 10 and 32, the complexities of the three ECG signals differ the most: the entropy of the elderly subjects is on average 0.146 less than that of the CHF patients and 0.025 larger than that of the healthy young, in line with normal physiological characteristics. Test results showed that MPE can be effectively applied to ECG nonlinear analysis and can effectively distinguish different ECG signals.
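
    The following sketch shows the core of a multiscale permutation entropy computation: coarse-graining by averaging, followed by the Shannon entropy of ordinal patterns. The embedding dimension and scale range loosely mirror the abstract; the surrogate signal and the normalization are illustrative assumptions.

```python
import numpy as np
from math import factorial

def coarse_grain(x, scale):
    """Average consecutive, non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def permutation_entropy(x, m=4, normalize=True):
    """Shannon entropy of the ordinal patterns of embedding dimension m."""
    counts = {}
    for i in range(len(x) - m + 1):
        key = tuple(np.argsort(x[i:i + m]))
        counts[key] = counts.get(key, 0) + 1
    p = np.array(list(counts.values()), float)
    p /= p.sum()
    h = -np.sum(p * np.log(p))
    return h / np.log(factorial(m)) if normalize else h

def multiscale_permutation_entropy(x, m=4, scales=range(2, 101, 2)):
    x = np.asarray(x, float)
    return [permutation_entropy(coarse_grain(x, s), m) for s in scales]

# Example on a surrogate signal (white noise stands in for an ECG record here).
rng = np.random.default_rng(1)
mpe = multiscale_permutation_entropy(rng.standard_normal(20000), m=4)
print(mpe[:3])
```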

  16. Stochastic modeling and control system designs of the NASA/MSFC Ground Facility for large space structures: The maximum entropy/optimal projection approach

    Science.gov (United States)

    Hsia, Wei-Shen

    1986-01-01

    In the Control Systems Division of the Systems Dynamics Laboratory of the NASA/MSFC, a Ground Facility (GF), in which the dynamics and control system concepts being considered for Large Space Structures (LSS) applications can be verified, was designed and built. One of the important aspects of the GF is to design an analytical model which will be as close to experimental data as possible so that a feasible control law can be generated. Using Hyland's Maximum Entropy/Optimal Projection Approach, a procedure was developed in which the maximum entropy principle is used for stochastic modeling and the optimal projection technique is used for a reduced-order dynamic compensator design for a high-order plant.

  17. Parametric Bayesian Estimation of Differential Entropy and Relative Entropy

    Directory of Open Access Journals (Sweden)

    Maya Gupta

    2010-04-01

    Full Text Available Given iid samples drawn from a distribution with known parametric form, we propose the minimization of expected Bregman divergence to form Bayesian estimates of differential entropy and relative entropy, and derive such estimators for the uniform, Gaussian, Wishart, and inverse Wishart distributions. Additionally, formulas are given for a log gamma Bregman divergence and the differential entropy and relative entropy for the Wishart and inverse Wishart. The results, as always with Bayesian estimates, depend on the accuracy of the prior parameters, but example simulations show that the performance can be substantially improved compared to maximum likelihood or state-of-the-art nonparametric estimators.
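
    For reference, two standard closed-form results that underlie such parametric estimates (they are textbook formulas, not the Bayesian estimators proposed in the paper) are the differential entropy of a d-dimensional Gaussian and the relative entropy between two Gaussians:

```latex
h\!\left(\mathcal{N}(\mu,\Sigma)\right) = \tfrac{1}{2}\ln\!\left[(2\pi e)^{d}\,|\Sigma|\right],
\qquad
D_{\mathrm{KL}}\!\left(\mathcal{N}_0 \,\|\, \mathcal{N}_1\right)
= \tfrac{1}{2}\left[\operatorname{tr}\!\left(\Sigma_1^{-1}\Sigma_0\right)
+ (\mu_1-\mu_0)^{\top}\Sigma_1^{-1}(\mu_1-\mu_0) - d
+ \ln\frac{|\Sigma_1|}{|\Sigma_0|}\right].
```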

  18. Maximum Entropy Production Is Not a Steady State Attractor for 2D Fluid Convection

    Directory of Open Access Journals (Sweden)

    Stuart Bartlett

    2016-12-01

    Full Text Available Multiple authors have claimed that the natural convection of a fluid is a process that exhibits maximum entropy production (MEP). However, almost all such investigations were limited to fixed temperature boundary conditions (BCs). It was found that under those conditions, the system tends to maximize its heat flux, and hence it was concluded that the MEP state is a dynamical attractor. However, since entropy production varies with heat flux and difference of inverse temperature, it is essential that any complete investigation of entropy production allows for variations in heat flux and temperature difference. Only then can we legitimately assess whether the MEP state is the most attractive. Our previous work made use of negative feedback BCs to explore this possibility. We found that the steady state of the system was far from the MEP state. For any system, entropy production can only be maximized subject to a finite set of physical and material constraints. In the case of our previous work, it was possible that the adopted set of fluid parameters were constraining the system in such a way that it was entirely prevented from reaching the MEP state. Hence, in the present work, we used a different set of boundary parameters, such that the steady states of the system were in the local vicinity of the MEP state. If MEP was indeed an attractor, relaxing those constraints of our previous work should have caused a discrete perturbation to the surface of steady state heat flux values near the value corresponding to MEP. We found no such perturbation, and hence no discernible attraction to the MEP state. Furthermore, systems with fixed flux BCs actually minimize their entropy production (relative to the alternative stable state, that of pure diffusive heat transport). This leads us to conclude that the principle of MEP is not an accurate indicator of which stable steady state a convective system will adopt. However, for all BCs considered, the quotient of

  19. Entropy estimates for simple random fields

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Justesen, Jørn

    1995-01-01

    We consider the problem of determining the maximum entropy of a discrete random field on a lattice subject to certain local constraints on symbol configurations. The results are expected to be of interest in the analysis of digitized images and two dimensional codes. We shall present some examples...... of binary and ternary fields with simple constraints. Exact results on the entropies are known only in a few cases, but we shall present close bounds and estimates that are computationally efficient...

  20. Refined generalized multiscale entropy analysis for physiological signals

    Science.gov (United States)

    Liu, Yunxiao; Lin, Youfang; Wang, Jing; Shang, Pengjian

    2018-01-01

    Multiscale entropy analysis has become a prevalent complexity measurement and has been successfully applied in various fields. However, it only takes into account the information of mean values (the first moment) in the coarse-graining procedure. Generalized multiscale entropy (MSEn), which uses higher moments to coarse-grain a time series, was therefore proposed, and MSEσ2 has been implemented. However, MSEσ2 may sometimes yield an imprecise or undefined entropy estimate, and the statistical reliability of the sample entropy estimation decreases as the scale factor increases. For this purpose, we developed a refined model, RMSEσ2, to improve MSEσ2. Simulations on both white noise and 1/f noise show that RMSEσ2 provides higher entropy reliability and reduces the occurrence of undefined entropy, making it especially suitable for short time series. We also discuss how outliers, data loss, and other signal-processing issues affect the RMSEσ2 analysis. We apply the proposed model to evaluate the complexity of heartbeat interval time series derived from healthy young and elderly subjects, patients with congestive heart failure and patients with atrial fibrillation, and compare it with several popular complexity metrics. The results demonstrate that the complexity measured by RMSEσ2 (a) decreases with aging and disease, and (b) gives significant discrimination between different physiological/pathological states, which may facilitate clinical application.

  1. A subjective supply–demand model: the maximum Boltzmann/Shannon entropy solution

    International Nuclear Information System (INIS)

    Piotrowski, Edward W; Sładkowski, Jan

    2009-01-01

    The present authors have put forward a projective geometry model of rational trading. The expected (mean) value of the time that is necessary to strike a deal and the profit strongly depend on the strategies adopted. A frequent trader often prefers maximal profit intensity to the maximization of profit resulting from a separate transaction because the gross profit/income is the adopted/recommended benchmark. To investigate activities that have different periods of duration we define, following the queuing theory, the profit intensity as a measure of this economic category. The profit intensity in repeated trading has a unique property of attaining its maximum at a fixed point regardless of the shape of demand curves for a wide class of probability distributions of random reverse transactions (i.e. closing of the position). These conclusions remain valid for an analogous model based on supply analysis. This type of market game is often considered in research aiming at finding an algorithm that maximizes profit of a trader who negotiates prices with the Rest of the World (a collective opponent), possessing a definite and objective supply profile. Such idealization neglects the sometimes important influence of an individual trader on the demand/supply profile of the Rest of the World and in extreme cases questions the very idea of demand/supply profile. Therefore we put forward a trading model in which the demand/supply profile of the Rest of the World induces the (rational) trader to (subjectively) presume that he/she lacks (almost) all knowledge concerning the market but his/her average frequency of trade. This point of view introduces maximum entropy principles into the model and broadens the range of economic phenomena that can be perceived as a sort of thermodynamical system. As a consequence, the profit intensity has a fixed point with an astonishing connection with Fibonacci classical works and looking for the quickest algorithm for obtaining the extremum of a

  2. A subjective supply-demand model: the maximum Boltzmann/Shannon entropy solution

    Science.gov (United States)

    Piotrowski, Edward W.; Sładkowski, Jan

    2009-03-01

    The present authors have put forward a projective geometry model of rational trading. The expected (mean) value of the time that is necessary to strike a deal and the profit strongly depend on the strategies adopted. A frequent trader often prefers maximal profit intensity to the maximization of profit resulting from a separate transaction because the gross profit/income is the adopted/recommended benchmark. To investigate activities that have different periods of duration we define, following the queuing theory, the profit intensity as a measure of this economic category. The profit intensity in repeated trading has a unique property of attaining its maximum at a fixed point regardless of the shape of demand curves for a wide class of probability distributions of random reverse transactions (i.e. closing of the position). These conclusions remain valid for an analogous model based on supply analysis. This type of market game is often considered in research aiming at finding an algorithm that maximizes profit of a trader who negotiates prices with the Rest of the World (a collective opponent), possessing a definite and objective supply profile. Such idealization neglects the sometimes important influence of an individual trader on the demand/supply profile of the Rest of the World and in extreme cases questions the very idea of demand/supply profile. Therefore we put forward a trading model in which the demand/supply profile of the Rest of the World induces the (rational) trader to (subjectively) presume that he/she lacks (almost) all knowledge concerning the market but his/her average frequency of trade. This point of view introduces maximum entropy principles into the model and broadens the range of economic phenomena that can be perceived as a sort of thermodynamical system. As a consequence, the profit intensity has a fixed point with an astonishing connection with Fibonacci classical works and looking for the quickest algorithm for obtaining the extremum of a

  3. The Maximum Entropy Production Principle: Its Theoretical Foundations and Applications to the Earth System

    Directory of Open Access Journals (Sweden)

    Axel Kleidon

    2010-03-01

    Full Text Available The Maximum Entropy Production (MEP) principle has been remarkably successful in producing accurate predictions for non-equilibrium states. We argue that this is because the MEP principle is an effective inference procedure that produces the best predictions from the available information. Since all Earth system processes are subject to the conservation of energy, mass and momentum, we argue that in practical terms the MEP principle should be applied to Earth system processes in terms of the already established framework of non-equilibrium thermodynamics, with the assumption of local thermodynamic equilibrium at the appropriate scales.

  4. Entropy Generation Analysis of Desalination Technologies

    Directory of Open Access Journals (Sweden)

    John H. Lienhard V

    2011-09-01

    Full Text Available Increasing global demand for fresh water is driving the development and implementation of a wide variety of seawater desalination technologies. Entropy generation analysis, and specifically, Second Law efficiency, is an important tool for illustrating the influence of irreversibilities within a system on the required energy input. When defining Second Law efficiency, the useful exergy output of the system must be properly defined. For desalination systems, this is the minimum least work of separation required to extract a unit of water from a feed stream of a given salinity. In order to evaluate the Second Law efficiency, entropy generation mechanisms present in a wide range of desalination processes are analyzed. In particular, entropy generated in the run down to equilibrium of discharge streams must be considered. Physical models are applied to estimate the magnitude of entropy generation by component and individual processes. These formulations are applied to calculate the total entropy generation in several desalination systems including multiple effect distillation, multistage flash, membrane distillation, mechanical vapor compression, reverse osmosis, and humidification-dehumidification. Within each technology, the relative importance of each source of entropy generation is discussed in order to determine which should be the target of entropy generation minimization. As given here, the correct application of Second Law efficiency shows which systems operate closest to the reversible limit and helps to indicate which systems have the greatest potential for improvement.
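
    One common way to write the Second Law efficiency described here, consistent with the Gouy-Stodola theorem but not necessarily in the paper's exact notation, relates the least work of separation, the actual work of separation, and the entropy generation rate weighted by the ambient (dead-state) temperature T_0:

```latex
\eta_{II} \;=\; \frac{\dot W_{\mathrm{least}}}{\dot W_{\mathrm{sep}}},
\qquad
\dot W_{\mathrm{sep}} \;=\; \dot W_{\mathrm{least}} + T_0\,\dot S_{\mathrm{gen}} ,
```

    so that every source of entropy generation identified in the analysis directly lowers the Second Law efficiency.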

  5. [Analysis of the Muscle Fatigue Based on Band Spectrum Entropy of Multi-channel Surface Electromyography].

    Science.gov (United States)

    Liu, Jian; Zou, Renling; Zhang, Dongheng; Xu, Xiulin; Hu, Xiufang

    2016-06-01

    Exercise-induced muscle fatigue is a phenomenon in which the maximum voluntary contraction force or power output of a muscle is temporarily reduced due to muscular activity. If fatigue is not treated properly, it can bring about severe injury to the human body. Using multi-channel recordings of lower limb surface electromyography signals, this article analyzes muscle fatigue with a band spectrum entropy method that combines electromyographic spectral analysis and nonlinear dynamics. The experimental results indicated that with increasing muscle fatigue, the muscle signal spectrum shifts toward low frequencies, the energy becomes concentrated, the system complexity decreases, and the band spectrum entropy, which reflects this complexity, is also reduced. By monitoring the entropy, we can measure the degree of muscle fatigue and provide an indicator of fatigue for sports training and clinical rehabilitation training.
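
    A minimal sketch of a band spectrum (spectral) entropy of the kind described here is given below; the sampling rate, band layout, normalization, and surrogate signals are assumptions made for illustration rather than details taken from the study.

```python
import numpy as np

def band_spectrum_entropy(signal, fs, n_bands=32, fmax=None):
    """Shannon entropy of the signal power distributed over equal-width frequency bands."""
    fmax = fmax if fmax is not None else fs / 2.0
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    edges = np.linspace(0.0, fmax, n_bands + 1)
    power = np.array([psd[(freqs >= lo) & (freqs < hi)].sum()
                      for lo, hi in zip(edges[:-1], edges[1:])])
    p = power / power.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(n_bands)   # normalized to [0, 1]

# A broadband (noise-like) signal gives higher band spectrum entropy than a
# narrowband one, mirroring the drop in entropy reported with fatigue.
fs = 1000.0
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
broad = rng.standard_normal(t.size)
narrow = np.sin(2 * np.pi * 80.0 * t) + 0.1 * rng.standard_normal(t.size)
print(band_spectrum_entropy(broad, fs), band_spectrum_entropy(narrow, fs))
```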

  6. Online Robot Dead Reckoning Localization Using Maximum Relative Entropy Optimization With Model Constraints

    International Nuclear Information System (INIS)

    Urniezius, Renaldas

    2011-01-01

    The principle of maximum relative entropy optimization was analyzed for dead reckoning localization of a rigid body when observation data from two attached accelerometers were collected. Model constraints were derived from the relationships between the sensors. The experiment's results confirmed that the noise on each accelerometer axis can be successfully filtered by utilizing the dependency between channels and the dependency between time series data. The dependency between channels was used for the a priori calculation, and the a posteriori distribution was derived utilizing the dependency between time series data. Data from the autocalibration experiment were revisited by removing the initial assumption that the instantaneous rotation axis of the rigid body was known. Performance results confirmed that such an approach could be used for online dead reckoning localization.

  7. A parallel implementation of a maximum entropy reconstruction algorithm for PET images in a visual language

    International Nuclear Information System (INIS)

    Bastiens, K.; Lemahieu, I.

    1994-01-01

    The application of a maximum entropy reconstruction algorithm to PET images requires a lot of computing resources. A parallel implementation could seriously reduce the execution time. However, programming a parallel application is still a non trivial task, needing specialized people. In this paper a programming environment based on a visual programming language is used for a parallel implementation of the reconstruction algorithm. This programming environment allows less experienced programmers to use the performance of multiprocessor systems. (authors)

  8. Nuclear Enhanced X-ray Maximum Entropy Method Used to Analyze Local Distortions in Simple Structures

    DEFF Research Database (Denmark)

    Christensen, Sebastian; Bindzus, Niels; Christensen, Mogens

    We introduce a novel method for reconstructing pseudo nuclear density distributions (NDDs): Nuclear Enhanced X-ray Maximum Entropy Method (NEXMEM). NEXMEM offers an alternative route to experimental NDDs, exploiting the superior quality of synchrotron X-ray data compared to neutron data. The method...... proposed to result from anharmonic phonon scattering or from local fluctuating dipoles on the Pb site.[1,2] No macroscopic symmetry change are associated with these effects, rendering them invisible to conventional crystallographic techniques. For this reason PbX was until recently believed to adopt...

  9. Maximum Entropy: Clearing up Mysteries

    Directory of Open Access Journals (Sweden)

    Marian Grendár

    2001-04-01

    Full Text Available Abstract: There are several mystifications and a couple of mysteries pertinent to MaxEnt. The mystifications, pitfalls and traps are set up mainly by an unfortunate formulation of Jaynes' die problem, the cause célèbre of MaxEnt. After discussing the mystifications a new formulation of the problem is proposed. Then we turn to the mysteries. An answer to the recurring question 'Just what are we accomplishing when we maximize entropy?' [8], based on MaxProb rationale of MaxEnt [6], is recalled. A brief view on the other mystery: 'What is the relation between MaxEnt and the Bayesian method?' [9], in light of the MaxProb rationale of MaxEnt suggests that there is not and cannot be a conflict between MaxEnt and Bayes Theorem.

  10. The Kalman Filter Revisited Using Maximum Relative Entropy

    Directory of Open Access Journals (Sweden)

    Adom Giffin

    2014-02-01

    Full Text Available In 1960, Rudolf E. Kalman created what is known as the Kalman filter, which is a way to estimate unknown variables from noisy measurements. The algorithm follows the logic that if the previous state of the system is known, it could be used as the best guess for the current state. This information is first applied a priori to any measurement by using it in the underlying dynamics of the system. Second, measurements of the unknown variables are taken. These two pieces of information are taken into account to determine the current state of the system. Bayesian inference is specifically designed to accommodate the problem of updating what we think of the world based on partial or uncertain information. In this paper, we present a derivation of the general Bayesian filter, then adapt it for Markov systems. A simple example is shown for pedagogical purposes. We also show that by using the Kalman assumptions or “constraints”, we can arrive at the Kalman filter using the method of maximum (relative) entropy (MrE), which goes beyond Bayesian methods. Finally, we derive a generalized, nonlinear filter using MrE, where the original Kalman Filter is a special case. We further show that the variable relationship can be any function, and thus, approximations, such as the extended Kalman filter, the unscented Kalman filter and other Kalman variants are special cases as well.
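
    As a companion to the abstract, here is a minimal scalar Kalman filter sketch showing the predict/update recursion it refers to; the process model, noise variances, and test data are illustrative assumptions, and the maximum relative entropy derivation itself is not reproduced.

```python
import numpy as np

def kalman_1d(measurements, x0=0.0, p0=1.0, q=1e-4, r=0.25, a=1.0):
    """Scalar Kalman filter for x_k = a*x_{k-1} + w_k,  z_k = x_k + v_k."""
    x, p, estimates = x0, p0, []
    for z in measurements:
        # Predict: propagate the prior estimate through the system dynamics.
        x, p = a * x, a * a * p + q
        # Update: blend the prior with the noisy measurement via the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Example: tracking a constant signal observed in noise.
rng = np.random.default_rng(0)
z = 1.0 + rng.normal(0.0, 0.5, 200)
print(kalman_1d(z, r=0.25)[-1])   # close to 1.0
```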

  11. A parallel implementation of a maximum entropy reconstruction algorithm for PET images in a visual language

    Energy Technology Data Exchange (ETDEWEB)

    Bastiens, K; Lemahieu, I [University of Ghent - ELIS Department, St. Pietersnieuwstraat 41, B-9000 Ghent (Belgium)

    1994-12-31

    The application of a maximum entropy reconstruction algorithm to PET images requires a lot of computing resources. A parallel implementation could seriously reduce the execution time. However, programming a parallel application is still a non trivial task, needing specialized people. In this paper a programming environment based on a visual programming language is used for a parallel implementation of the reconstruction algorithm. This programming environment allows less experienced programmers to use the performance of multiprocessor systems. (authors). 8 refs, 3 figs, 1 tab.

  12. The inverse Fourier problem in the case of poor resolution in one given direction: the maximum-entropy solution

    International Nuclear Information System (INIS)

    Papoular, R.J.; Zheludev, A.; Ressouche, E.; Schweizer, J.

    1995-01-01

    When density distributions in crystals are reconstructed from 3D diffraction data, a problem sometimes occurs when the spatial resolution in one given direction is very small compared to that in perpendicular directions. In this case, a 2D projected density is usually reconstructed. For this task, the conventional Fourier inversion method only makes use of those structure factors measured in the projection plane. All the other structure factors contribute zero to the reconstruction of a projected density. On the contrary, the maximum-entropy method uses all the 3D data, to yield 3D-enhanced 2D projected density maps. It is even possible to reconstruct a projection in the extreme case when not one structure factor in the plane of projection is known. In the case of poor resolution along one given direction, a Fourier inversion reconstruction gives very low quality 3D densities 'smeared' in the third dimension. The application of the maximum-entropy procedure reduces the smearing significantly and reasonably well resolved projections along most directions can now be obtained from the MaxEnt 3D density. To illustrate these two ideas, particular examples based on real polarized neutron diffraction data sets are presented. (orig.)

  13. The improvement of Clausius entropy and its application in entropy analysis

    Institute of Scientific and Technical Information of China (English)

    WU Jing; GUO ZengYuan

    2008-01-01

    The defects of Clausius entropy which include a premise of reversible process and a process quantity of heat in its definition are discussed in this paper. Moreover, the heat temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity although it is numerically equal to the entropy change. The sum of internal energy temperature quotient and work temperature quotient is defined as the improved form of Clausius entropy and it can be further proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without premise just like other state functions, for example, pressure p and enthalpy h, etc. It is unnecessary to invent reversible paths when calculating entropy change for irreversible processes based on the improved form of entropy since it is independent of process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced respectively based on the concepts of thermal reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for entropy change calculation.

  14. Regional Analysis of Precipitation by Means of Bivariate Distribution Adjusted by Maximum Entropy; Analisis regional de precipitacion con base en una distribucion bivariada ajustada por maxima entropia

    Energy Technology Data Exchange (ETDEWEB)

    Escalante Sandoval, Carlos A.; Dominguez Esquivel, Jose Y. [Universidad Nacional Autonoma de Mexico (Mexico)

    2001-09-01

    The principle of maximum entropy (POME) is used to derive an alternative method of parameter estimation for the bivariate Gumbel distribution. A simple algorithm for this parameter estimation technique is presented. This method is applied to analyze the precipitation in a region of Mexico. Design events are compared with those obtained by the maximum likelihood procedure. According to the results, the proposed technique is a suitable option to be considered when performing frequency analysis of precipitation with small samples. [Spanish] The principle of maximum entropy, known as POME, is used to derive an alternative parameter estimation procedure for the bivariate extreme-value distribution with Gumbel marginals. The model is applied to the analysis of the maximum 24-hour precipitation in a region of Mexico, and the resulting design events are compared with those provided by the maximum likelihood technique. Based on the results obtained, it is concluded that the proposed technique is a good option, especially for the case of small samples.

  15. Nonadditive entropy maximization is inconsistent with Bayesian updating

    Science.gov (United States)

    Pressé, Steve

    2014-11-01

    The maximum entropy method—used to infer probabilistic models from data—is a special case of Bayes's model inference prescription which, in turn, is grounded in basic propositional logic. By contrast to the maximum entropy method, the compatibility of nonadditive entropy maximization with Bayes's model inference prescription has never been established. Here we demonstrate that nonadditive entropy maximization is incompatible with Bayesian updating and discuss the immediate implications of this finding. We focus our attention on special cases as illustrations.

  16. The improvement of Clausius entropy and its application in entropy analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The defects of Clausius entropy which include a premise of reversible process and a process quantity of heat in its definition are discussed in this paper. Moreover, the heat temperature quotient under reversible conditions, i.e. (δQ/T)rev, is essentially a process quantity although it is numerically equal to the entropy change. The sum of internal energy temperature quotient and work temperature quotient is defined as the improved form of Clausius entropy and it can be further proved to be a state function. Unlike Clausius entropy, the improved definition consists of system properties without premise just like other state functions, for example, pressure p and enthalpy h, etc. It is unnecessary to invent reversible paths when calculating entropy change for irreversible processes based on the improved form of entropy since it is independent of process. Furthermore, entropy balance equations for internally and externally irreversible processes are deduced respectively based on the concepts of thermal reservoir entropy transfer and system entropy transfer. Finally, some examples are presented to show that the improved definition of Clausius entropy provides a clear concept as well as a convenient method for entropy change calculation.

  17. Entropy concentration and the empirical coding game

    NARCIS (Netherlands)

    Grünwald, P.D.

    2008-01-01

    We give a characterization of maximum entropy/minimum relative entropy inference by providing two 'strong entropy concentration' theorems. These theorems unify and generalize Jaynes' 'concentration phenomenon' and Van Campenhout and Cover's 'conditional limit theorem'. The theorems characterize

  18. The Maximum Entropy Limit of Small-scale Magnetic Field Fluctuations in the Quiet Sun

    Science.gov (United States)

    Gorobets, A. Y.; Berdyugina, S. V.; Riethmüller, T. L.; Blanco Rodríguez, J.; Solanki, S. K.; Barthol, P.; Gandorfer, A.; Gizon, L.; Hirzberger, J.; van Noort, M.; Del Toro Iniesta, J. C.; Orozco Suárez, D.; Schmidt, W.; Martínez Pillet, V.; Knölker, M.

    2017-11-01

    The observed magnetic field on the solar surface is characterized by a very complex spatial and temporal behavior. Although feature-tracking algorithms have allowed us to deepen our understanding of this behavior, subjectivity plays an important role in the identification and tracking of such features. In this paper, we continue studies of the temporal stochasticity of the magnetic field on the solar surface without relying either on the concept of magnetic features or on subjective assumptions about their identification and interaction. We propose a data analysis method to quantify fluctuations of the line-of-sight magnetic field by means of reducing the temporal field’s evolution to the regular Markov process. We build a representative model of fluctuations converging to the unique stationary (equilibrium) distribution in the long time limit with maximum entropy. We obtained different rates of convergence to the equilibrium at fixed noise cutoff for two sets of data. This indicates a strong influence of the data spatial resolution and mixing-polarity fluctuations on the relaxation process. The analysis is applied to observations of magnetic fields of the relatively quiet areas around an active region carried out during the second flight of the Sunrise/IMaX and quiet Sun areas at the disk center from the Helioseismic and Magnetic Imager on board the Solar Dynamics Observatory satellite.
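
    The reduction of a fluctuating signal to a regular Markov process can be illustrated with the short sketch below: an empirical transition matrix is estimated from a discretized series, its stationary distribution is extracted, and the entropy of that distribution is compared with the maximum entropy of a uniform one. The eight-state discretization and the surrogate series are assumptions for illustration only and do not use the Sunrise/IMaX or SDO data.

```python
import numpy as np

def transition_matrix(states, n_states):
    """Empirical first-order Markov transition matrix from a symbol sequence."""
    T = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        T[a, b] += 1.0
    rows = T.sum(axis=1, keepdims=True)
    return np.divide(T, rows, out=np.zeros_like(T), where=rows > 0)

def stationary_distribution(T):
    """Left eigenvector of T for eigenvalue 1, normalized to a probability vector."""
    vals, vecs = np.linalg.eig(T.T)
    v = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
    return v / v.sum()

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Example: an 8-state discretization of a correlated surrogate series.
rng = np.random.default_rng(2)
signal = np.cumsum(rng.standard_normal(5000))
edges = np.quantile(signal, np.linspace(0, 1, 9)[1:-1])
states = np.digitize(signal, edges)                  # states in 0..7
T = transition_matrix(states, n_states=8)
pi = stationary_distribution(T)
print(shannon_entropy(pi), np.log(8))                # compare with the maximum ln(8)
```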

  19. Entropy Bounds for Constrained Two-Dimensional Fields

    DEFF Research Database (Denmark)

    Forchhammer, Søren Otto; Justesen, Jørn

    1999-01-01

    The maximum entropy and thereby the capacity of 2-D fields given by certain constraints on configurations are considered. Upper and lower bounds are derived.

  20. 2D Tsallis Entropy for Image Segmentation Based on Modified Chaotic Bat Algorithm

    Directory of Open Access Journals (Sweden)

    Zhiwei Ye

    2018-03-01

    Full Text Available Image segmentation is a significant step in image analysis and computer vision. Many entropy-based approaches have been presented on this topic; among them, Tsallis entropy is one of the best performing methods. However, 1D Tsallis entropy does not make use of the spatial correlation information within the neighborhood, so the results might be ruined by noise. Therefore, 2D Tsallis entropy is proposed to solve the problem, and the results are compared with 1D Fisher, 1D maximum entropy, 1D cross entropy, 1D Tsallis entropy, fuzzy entropy, 2D Fisher, 2D maximum entropy and 2D cross entropy. On the other hand, due to the huge computational cost, meta-heuristic algorithms such as the genetic algorithm (GA), particle swarm optimization (PSO), the ant colony optimization algorithm (ACO) and the differential evolution algorithm (DE) are used to accelerate the 2D Tsallis entropy thresholding method. In this paper, considering 2D Tsallis entropy as a constrained optimization problem, the optimal thresholds are acquired by maximizing the objective function using a modified chaotic bat algorithm (MCBA). The proposed algorithm has been tested on some actual and infrared images. The results are compared with those of PSO, GA, ACO and DE and demonstrate that the proposed method outperforms the other approaches involved in the paper, being a feasible and effective option for image segmentation.
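
    For context, a minimal 1D Tsallis entropy thresholding sketch is shown below; the 2D variant and the modified chaotic bat optimizer of the paper are not reproduced, and the entropic index q, the synthetic bimodal image, and the exhaustive search over thresholds are illustrative assumptions.

```python
import numpy as np

def tsallis_threshold(image, q=0.8, levels=256):
    """1D Tsallis thresholding: maximize S_A + S_B + (1 - q) * S_A * S_B over t."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    best_t, best_val = 0, -np.inf
    for t in range(1, levels - 1):
        pa, pb = p[:t].sum(), p[t:].sum()
        if pa == 0 or pb == 0:
            continue
        sa = (1.0 - np.sum((p[:t] / pa) ** q)) / (q - 1.0)
        sb = (1.0 - np.sum((p[t:] / pb) ** q)) / (q - 1.0)
        val = sa + sb + (1.0 - q) * sa * sb
        if val > best_val:
            best_t, best_val = t, val
    return best_t

# Example on a synthetic bimodal "image": the threshold falls between the two modes.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 15, 5000)])
img = np.clip(img, 0, 255).astype(int)
print(tsallis_threshold(img, q=0.8))
```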

  1. A basic introduction to the thermodynamics of the Earth system far from equilibrium and maximum entropy production

    Science.gov (United States)

    Kleidon, A.

    2010-01-01

    The Earth system is remarkably different from its planetary neighbours in that it shows pronounced, strong global cycling of matter. These global cycles result in the maintenance of a unique thermodynamic state of the Earth's atmosphere which is far from thermodynamic equilibrium (TE). Here, I provide a simple introduction of the thermodynamic basis to understand why Earth system processes operate so far away from TE. I use a simple toy model to illustrate the application of non-equilibrium thermodynamics and to classify applications of the proposed principle of maximum entropy production (MEP) to such processes into three different cases of contrasting flexibility in the boundary conditions. I then provide a brief overview of the different processes within the Earth system that produce entropy, review actual examples of MEP in environmental and ecological systems, and discuss the role of interactions among dissipative processes in making boundary conditions more flexible. I close with a brief summary and conclusion. PMID:20368248

  2. A basic introduction to the thermodynamics of the Earth system far from equilibrium and maximum entropy production.

    Science.gov (United States)

    Kleidon, A

    2010-05-12

    The Earth system is remarkably different from its planetary neighbours in that it shows pronounced, strong global cycling of matter. These global cycles result in the maintenance of a unique thermodynamic state of the Earth's atmosphere which is far from thermodynamic equilibrium (TE). Here, I provide a simple introduction of the thermodynamic basis to understand why Earth system processes operate so far away from TE. I use a simple toy model to illustrate the application of non-equilibrium thermodynamics and to classify applications of the proposed principle of maximum entropy production (MEP) to such processes into three different cases of contrasting flexibility in the boundary conditions. I then provide a brief overview of the different processes within the Earth system that produce entropy, review actual examples of MEP in environmental and ecological systems, and discuss the role of interactions among dissipative processes in making boundary conditions more flexible. I close with a brief summary and conclusion.

  3. Choosing between Higher Moment Maximum Entropy Models and Its Application to Homogeneous Point Processes with Random Effects

    Directory of Open Access Journals (Sweden)

    Lotfi Khribi

    2017-12-01

    Full Text Available In the Bayesian framework, the usual choice of prior in the prediction of homogeneous Poisson processes with random effects is the gamma one. Here, we propose the use of higher order maximum entropy priors. Their advantage is illustrated in a simulation study and the choice of the best order is established by two goodness-of-fit criteria: Kullback–Leibler divergence and a discrepancy measure. This procedure is illustrated on a warranty data set from the automobile industry.

  4. MAXED, a computer code for the deconvolution of multisphere neutron spectrometer data using the maximum entropy method

    International Nuclear Information System (INIS)

    Reginatto, M.; Goldhagen, P.

    1998-06-01

    The problem of analyzing data from a multisphere neutron spectrometer to infer the energy spectrum of the incident neutrons is discussed. The main features of the code MAXED, a computer program developed to apply the maximum entropy principle to the deconvolution (unfolding) of multisphere neutron spectrometer data, are described, and the use of the code is illustrated with an example. A user's guide for the code MAXED is included in an appendix. The code is available from the authors upon request
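
    In outline, a maximum entropy unfolding of this kind leads to a solution of the following form (a schematic statement of the standard result; the treatment of measurement uncertainties in MAXED is simplified away here). Here f_i is the unfolded flux in energy bin i, f_i^DEF the default spectrum, R_ki the detector response functions, N_k the measured readings, and lambda_k the Lagrange multipliers:

```latex
\max_{f>0}\; S(f) = -\sum_i \left[ f_i \ln\frac{f_i}{f_i^{\mathrm{DEF}}} - f_i + f_i^{\mathrm{DEF}} \right]
\quad \text{subject to} \quad \sum_i R_{ki} f_i = N_k
\;\;\Longrightarrow\;\;
f_i = f_i^{\mathrm{DEF}} \exp\!\Big( -\sum_k \lambda_k R_{ki} \Big).
```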

  5. A Maximum Entropy Approach to Assess Debonding in Honeycomb aluminum Plates

    Directory of Open Access Journals (Sweden)

    Viviana Meruane

    2014-05-01

    Full Text Available Honeycomb sandwich structures are used in a wide variety of applications. Nevertheless, due to manufacturing defects or impact loads, these structures can be subject to imperfect bonding or debonding between the skin and the honeycomb core. The presence of debonding reduces the bending stiffness of the composite panel, which causes detectable changes in its vibration characteristics. This article presents a new supervised learning algorithm to identify debonded regions in aluminum honeycomb panels. The algorithm uses a linear approximation method handled by a statistical inference model based on the maximum-entropy principle. The merits of this new approach are twofold: training is avoided and data is processed in a period of time that is comparable to that of neural networks. The honeycomb panels are modeled with finite elements using a simplified three-layer shell model. The adhesive layer between the skin and core is modeled using linear springs, the rigidities of which are reduced in debonded sectors. The algorithm is validated using experimental data of an aluminum honeycomb panel under different damage scenarios.

  6. LQG and maximum entropy control design for the Hubble Space Telescope

    Science.gov (United States)

    Collins, Emmanuel G., Jr.; Richter, Stephen

    Solar array vibrations are responsible for serious pointing control problems on the Hubble Space Telescope (HST). The original HST control law was not designed to attenuate these disturbances because they were not perceived to be a problem prior to launch. However, significant solar array vibrations do occur due to large changes in the thermal environment as the HST orbits the earth. Using classical techniques, Marshall Space Flight Center in conjunction with Lockheed Missiles and Space Company developed modified HST controllers that were able to suppress the influence of the vibrations of the solar arrays on the line-of-sight (LOS) performance. Substantial LOS improvement was observed when two of these controllers were implemented on orbit. This paper describes the development of modified HST controllers by using modern control techniques, particularly linear-quadratic-gaussian (LQG) design and Maximum Entropy robust control design, a generalization of LQG that incorporates robustness constraints with respect to modal errors. The fundamental issues are discussed candidly and controllers designed using these modern techniques are described.

  7. Entropy analysis of floating car data systems

    Directory of Open Access Journals (Sweden)

    F. Gössel

    2004-01-01

    Full Text Available Knowledge of the actual traffic state is a basic prerequisite of modern traffic telematic systems. Floating Car Data (FCD) systems are becoming more and more important for the provision of current and reliable traffic data. In these systems the vehicle velocity is the original variable used to evaluate the current traffic condition. As real FCD systems operate under conditions of limited transmission and processing capacity, the analysis of the original variable, vehicle speed, is of special interest. Entropy considerations are especially useful for deducing fundamental restrictions and limitations. The paper analyses velocity-time profiles by means of information entropy. It emphasises the quantification of the information content of velocity-time profiles and the discussion of entropy dynamics in velocity-time profiles. Investigations are based on empirical data collected during field trials. The analysis of entropy dynamics is carried out in two different ways: on one hand, velocity differences within a certain interval of time are used; on the other hand, the transinformation between velocities at certain time distances is evaluated. One important result is an optimal sampling rate for the detection of velocity data in FCD systems. The influence of spatial segmentation and of different traffic states is also discussed.

  8. ENTROPY PRODUCTION IN COLLISIONLESS SYSTEMS. II. ARBITRARY PHASE-SPACE OCCUPATION NUMBERS

    International Nuclear Information System (INIS)

    Barnes, Eric I.; Williams, Liliya L. R.

    2012-01-01

    We present an analysis of two thermodynamic techniques for determining equilibria of self-gravitating systems. One is the Lynden-Bell (LB) entropy maximization analysis that introduced violent relaxation. Since we do not use the Stirling approximation, which is invalid at small occupation numbers, our systems have finite mass, unlike LB's isothermal spheres. (Instead of Stirling, we utilize a very accurate smooth approximation for ln x!.) The second analysis extends entropy production extremization to self-gravitating systems, also without the use of the Stirling approximation. In addition to the LB statistical family characterized by the exclusion principle in phase space, and designed to treat collisionless systems, we also apply the two approaches to the Maxwell-Boltzmann (MB) families, which have no exclusion principle and hence represent collisional systems. We implicitly assume that all of the phase space is equally accessible. We derive entropy production expressions for both families and give the extremum conditions for entropy production. Surprisingly, our analysis indicates that extremizing entropy production rate results in systems that have maximum entropy, in both LB and MB statistics. In other words, both thermodynamic approaches lead to the same equilibrium structures.

  9. Modelling streambank erosion potential using maximum entropy in a central Appalachian watershed

    Directory of Open Access Journals (Sweden)

    J. Pitchford

    2015-03-01

    Full Text Available We used maximum entropy to model streambank erosion potential (SEP in a central Appalachian watershed to help prioritize sites for management. Model development included measuring erosion rates, application of a quantitative approach to locate Target Eroding Areas (TEAs, and creation of maps of boundary conditions. We successfully constructed a probability distribution of TEAs using the program Maxent. All model evaluation procedures indicated that the model was an excellent predictor, and that the major environmental variables controlling these processes were streambank slope, soil characteristics, bank position, and underlying geology. A classification scheme with low, moderate, and high levels of SEP derived from logistic model output was able to differentiate sites with low erosion potential from sites with moderate and high erosion potential. A major application of this type of modelling framework is to address uncertainty in stream restoration planning, ultimately helping to bridge the gap between restoration science and practice.

  10. Reinterpreting maximum entropy in ecology: a null hypothesis constrained by ecological mechanism.

    Science.gov (United States)

    O'Dwyer, James P; Rominger, Andrew; Xiao, Xiao

    2017-07-01

    Simplified mechanistic models in ecology have been criticised for the fact that a good fit to data does not imply the mechanism is true: pattern does not equal process. In parallel, the maximum entropy principle (MaxEnt) has been applied in ecology to make predictions constrained by just a handful of state variables, like total abundance or species richness. But an outstanding question remains: what principle tells us which state variables to constrain? Here we attempt to solve both problems simultaneously, by translating a given set of mechanisms into the state variables to be used in MaxEnt, and then using this MaxEnt theory as a null model against which to compare mechanistic predictions. In particular, we identify the sufficient statistics needed to parametrise a given mechanistic model from data and use them as MaxEnt constraints. Our approach isolates exactly what mechanism is telling us over and above the state variables alone. © 2017 John Wiley & Sons Ltd/CNRS.
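
    As a concrete, hypothetical illustration of turning a sufficient statistic into a MaxEnt constraint, the sketch below builds the maximum entropy distribution over abundances 1..100 whose mean matches an observed value, by solving for the single Lagrange multiplier; the state space and target mean are invented for illustration.

        import numpy as np
        from scipy.optimize import brentq

        states = np.arange(1, 101)          # possible abundances
        target_mean = 12.0                  # sufficient statistic estimated from data (illustrative)

        def mean_given_lambda(lam):
            w = np.exp(-lam * states)
            p = w / w.sum()
            return p @ states

        # Solve <n> = target_mean for the Lagrange multiplier, then form the MaxEnt distribution.
        lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -1.0, 5.0)
        p = np.exp(-lam * states)
        p /= p.sum()
        print("lambda =", round(lam, 4), " check mean =", round(p @ states, 4))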

  11. Maximum entropy based reconstruction of soft X ray emissivity profiles in W7-AS

    International Nuclear Information System (INIS)

    Ertl, K.; Linden, W. von der; Dose, V.; Weller, A.

    1996-01-01

    The reconstruction of 2-D emissivity profiles from soft X ray tomography measurements constitutes a highly underdetermined and ill-posed inversion problem, because of the restricted viewing access, the number of chords and the increased noise level in most plasma devices. An unbiased and consistent probabilistic approach within the framework of Bayesian inference is provided by the maximum entropy method, which is independent of model assumptions, but allows any prior knowledge available to be incorporated. The formalism is applied to the reconstruction of emissivity profiles in an NBI heated plasma discharge to determine the dependence of the Shafranov shift on β, the reduction of which was a particular objective in designing the advanced W7-AS stellarator. (author). 40 refs, 7 figs
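
    A toy one-dimensional analogue of the entropy-regularized reconstruction can be sketched as follows; this is not the W7-AS code or its full Bayesian formalism, and the geometry matrix, noise level and regularization weight are all invented for illustration.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        n, m, sigma, prior, alpha = 50, 20, 0.05, 1e-2, 1.0
        x = np.linspace(-1, 1, n)
        f_true = np.exp(-((x - 0.1) / 0.3) ** 2)       # "true" emissivity profile
        A = rng.uniform(0, 1, (m, n))                  # stand-in chord-geometry matrix
        d = A @ f_true + rng.normal(0, sigma, m)       # noisy line-integrated data

        def neg_objective(f):
            # Minimize 0.5*chi^2 - alpha*S, with S = -sum f ln(f/prior) (the MEM functional).
            chi2 = np.sum(((A @ f - d) / sigma) ** 2)
            entropy = -np.sum(f * np.log(f / prior))
            return 0.5 * chi2 - alpha * entropy

        res = minimize(neg_objective, x0=np.full(n, 0.1), method="L-BFGS-B",
                       bounds=[(1e-8, None)] * n)
        f_mem = res.x
        print("chi^2 per chord:", np.sum(((A @ f_mem - d) / sigma) ** 2) / m)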

  12. An understanding of human dynamics in urban subway traffic from the Maximum Entropy Principle

    Science.gov (United States)

    Yong, Nuo; Ni, Shunjiang; Shen, Shifei; Ji, Xuewei

    2016-08-01

    We studied the distribution of entry time intervals in Beijing subway traffic by analyzing smart card transaction data, and then deduced the probability distribution function of the entry time interval based on the Maximum Entropy Principle. Both the theoretical derivation and the data statistics indicated that the entry time interval obeys a power-law distribution with an exponential cutoff. In addition, we pointed out the constraint conditions that lead to this distribution form and discussed how the constraints affect the distribution function. It is speculated that for bursts and heavy tails in human dynamics, when the fitted power exponent is less than 1.0, the distribution cannot be a pure power law but must carry an exponential cutoff, which may have been overlooked in previous studies.
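
    One standard way to obtain a power law with an exponential cutoff from the Maximum Entropy Principle is to constrain both the mean and the mean logarithm of the entry time interval; the sketch below shows that route (the paper's exact constraints may differ):

        \begin{aligned}
        &\max_{p}\; -\int_0^\infty p(\tau)\ln p(\tau)\,d\tau
        \quad\text{s.t.}\quad \int_0^\infty p\,d\tau = 1,\;\;
        \langle \ln\tau\rangle = c_1,\;\; \langle \tau\rangle = c_2,\\
        &\Longrightarrow\quad p(\tau)\;\propto\; e^{-\lambda_1\ln\tau-\lambda_2\tau}
        \;=\;\tau^{-\lambda_1}\,e^{-\lambda_2\tau},
        \end{aligned}

    i.e. a power-law distribution with an exponential cutoff, with the exponent and cutoff scale fixed by the Lagrange multipliers.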

  13. Entropy analysis on non-equilibrium two-phase flow models

    International Nuclear Information System (INIS)

    Karwat, H.; Ruan, Y.Q.

    1995-01-01

    A method of entropy analysis according to the second law of thermodynamics is proposed for the assessment of a class of practical non-equilibrium two-phase flow models. Entropy conditions are derived directly from a local instantaneous formulation for an arbitrary control volume of a structural two-phase fluid, which are finally expressed in terms of the averaged thermodynamic independent variables and their time derivatives as well as the boundary conditions for the volume. On the basis of a widely used thermal-hydraulic system code it is demonstrated with practical examples that entropy production rates in control volumes can be numerically quantified by using the data from the output data files. Entropy analysis using the proposed method is useful in identifying some potential problems in two-phase flow models and predictions as well as in studying the effects of some free parameters in closure relationships

  14. Entropy analysis on non-equilibrium two-phase flow models

    Energy Technology Data Exchange (ETDEWEB)

    Karwat, H.; Ruan, Y.Q. [Technische Universitaet Muenchen, Garching (Germany)

    1995-09-01

    A method of entropy analysis according to the second law of thermodynamics is proposed for the assessment of a class of practical non-equilibrium two-phase flow models. Entropy conditions are derived directly from a local instantaneous formulation for an arbitrary control volume of a structural two-phase fluid, which are finally expressed in terms of the averaged thermodynamic independent variables and their time derivatives as well as the boundary conditions for the volume. On the basis of a widely used thermal-hydraulic system code it is demonstrated with practical examples that entropy production rates in control volumes can be numerically quantified by using the data from the output data files. Entropy analysis using the proposed method is useful in identifying some potential problems in two-phase flow models and predictions as well as in studying the effects of some free parameters in closure relationships.

  15. Imaging VLBI polarimetry data from Active Galactic Nuclei using the Maximum Entropy Method

    Directory of Open Access Journals (Sweden)

    Coughlan Colm P.

    2013-12-01

    Full Text Available Mapping the relativistic jets emanating from AGN requires the use of a deconvolution algorithm to account for the effects of missing baseline spacings. The CLEAN algorithm is the most commonly used algorithm in VLBI imaging today and is suitable for imaging polarisation data. The Maximum Entropy Method (MEM) is presented as an alternative with some advantages over the CLEAN algorithm, including better spatial resolution and a more rigorous and unbiased approach to deconvolution. We have developed a MEM code suitable for deconvolving VLBI polarisation data. Monte Carlo simulations investigating the performance of CLEAN and the MEM code on a variety of source types are being carried out. Real polarisation (VLBA) data taken at multiple wavelengths have also been deconvolved using MEM, and several of the resulting polarisation and Faraday rotation maps are presented and discussed.

  16. Optimization of rainfall networks using information entropy and temporal variability analysis

    Science.gov (United States)

    Wang, Wenqi; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-04-01

    Rainfall networks are the most direct sources of precipitation data, and their optimization and evaluation are essential and important. Information entropy can not only represent the uncertainty of the rainfall distribution but can also reflect the correlation and information transmission between rainfall stations. Using entropy, this study performs optimization of rainfall networks of similar size located in two big cities in China, Shanghai (in the Yangtze River basin) and Xi'an (in the Yellow River basin), with respect to temporal variability analysis. Through an easy-to-implement greedy ranking algorithm based on the criterion called Maximum Information Minimum Redundancy (MIMR), stations of the networks in the two areas (each area is further divided into two subareas) are ranked over sliding inter-annual series and under different meteorological conditions. It is found that observation series with different starting days affect the ranking, pointing to temporal variability during network evaluation. We propose a dynamic network evaluation framework that accounts for temporal variability, which ranks stations under different starting days with a fixed time window (1-year, 2-year, and 5-year). Therefore, we can identify rainfall stations which are temporarily important or redundant and provide some useful suggestions for decision makers. The proposed framework can serve as a supplement to the primary MIMR optimization approach. In addition, during different periods (wet season or dry season) the optimal network from MIMR exhibits differences in entropy values, and the optimal network from the wet season tends to produce higher entropy values. Differences in the spatial distribution of the optimal networks suggest that optimizing the rainfall network for changing meteorological conditions may be recommended.
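
    The greedy ranking idea can be sketched in a few lines of Python; this simplified stand-in scores each candidate station by its marginal entropy minus its average mutual information with the already-selected stations, which captures the information/redundancy trade-off but is not the exact MIMR objective of the cited method.

        import numpy as np

        def entropy(x, bins=8):
            c, _ = np.histogram(x, bins=bins)
            p = c[c > 0] / c.sum()
            return float(-np.sum(p * np.log2(p)))

        def mutual_info(x, y, bins=8):
            joint, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = joint / joint.sum()
            px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
            m = pxy > 0
            return float(np.sum(pxy[m] * np.log2(pxy[m] / (px @ py)[m])))

        def greedy_rank(rain, n_select):
            # rain: array (T, S) of rainfall series for S stations; returns a greedy station ranking.
            S = rain.shape[1]
            selected = [int(np.argmax([entropy(rain[:, j]) for j in range(S)]))]
            while len(selected) < n_select:
                best, best_score = None, -np.inf
                for j in range(S):
                    if j in selected:
                        continue
                    info = entropy(rain[:, j])                                        # information added
                    redund = np.mean([mutual_info(rain[:, j], rain[:, k]) for k in selected])
                    if info - redund > best_score:
                        best, best_score = j, info - redund
                selected.append(best)
            return selected

        rng = np.random.default_rng(3)
        rain = rng.gamma(2.0, 5.0, size=(365, 10))   # toy daily rainfall for 10 stations
        print("station ranking:", greedy_rank(rain, 5))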

  17. A Note of Caution on Maximizing Entropy

    Directory of Open Access Journals (Sweden)

    Richard E. Neapolitan

    2014-07-01

    Full Text Available The Principle of Maximum Entropy is often used to update probabilities due to evidence instead of performing Bayesian updating using Bayes’ Theorem, and its use often has efficacious results. However, in some circumstances the results seem unacceptable and unintuitive. This paper discusses some of these cases, and discusses how to identify some of the situations in which this principle should not be used. The paper starts by reviewing three approaches to probability, namely the classical approach, the limiting frequency approach, and the Bayesian approach. It then introduces maximum entropy and shows its relationship to the three approaches. Next, through examples, it shows that maximizing entropy sometimes can stand in direct opposition to Bayesian updating based on reasonable prior beliefs. The paper concludes that if we take the Bayesian approach that probability is about reasonable belief based on all available information, then we can resolve the conflict between the maximum entropy approach and the Bayesian approach that is demonstrated in the examples.

  18. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies.

    Science.gov (United States)

    Lorenz, Ralph D

    2010-05-12

    The 'two-box model' of planetary climate is discussed. This model has been used to demonstrate consistency of the equator-pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the day:night temperature contrast observed on the extrasolar planet HD 189733b.
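
    A minimal version of such a two-box MEP calculation is sketched below (illustrative absorbed fluxes, blackbody emission, no feedbacks): the inter-box heat transport is chosen to maximize the entropy production rate.

        import numpy as np

        SIGMA = 5.67e-8                  # Stefan-Boltzmann constant, W m^-2 K^-4
        S_hot, S_cold = 300.0, 150.0     # absorbed solar flux in the two boxes (W m^-2), illustrative

        def temperatures(F):
            # Steady-state box temperatures for a given inter-box heat flux F (W m^-2).
            T_hot = ((S_hot - F) / SIGMA) ** 0.25
            T_cold = ((S_cold + F) / SIGMA) ** 0.25
            return T_hot, T_cold

        def entropy_production(F):
            T_hot, T_cold = temperatures(F)
            return F * (1.0 / T_cold - 1.0 / T_hot)

        # Maximize entropy production over admissible F by a simple, transparent scan.
        Fs = np.linspace(0.0, S_hot - 1e-6, 20000)
        ep = np.array([entropy_production(F) for F in Fs])
        F_mep = Fs[np.argmax(ep)]
        T1, T2 = temperatures(F_mep)
        print(f"MEP heat transport: {F_mep:.1f} W/m^2, T_hot={T1:.1f} K, T_cold={T2:.1f} K")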

  19. Multi-scale symbolic transfer entropy analysis of EEG

    Science.gov (United States)

    Yao, Wenpo; Wang, Jun

    2017-10-01

    From both global and local perspectives, we symbolize two kinds of EEG and analyze their dynamic and asymmetrical information using multi-scale transfer entropy. A multi-scale process with scale factors from 1 to 199 in steps of 2 is applied to the EEG of healthy people and of epileptic patients, and then permutation symbolization with an embedding dimension of 3 and a global approach are used to symbolize the sequences. The forward and reverse symbol sequences are taken as the inputs of transfer entropy. The scale factor intervals over which the two kinds of EEG show satisfactory entropy distinctions are (37, 57) for the permutation approach and (65, 85) for the global approach. At a scale factor of 67, the permutation-based transfer entropies of the healthy and epileptic subjects, 0.1137 and 0.1028, show the largest difference. The corresponding values for the global symbolization are 0.0641 and 0.0601, at a scale factor of 165. The results show that permutation symbolization, which incorporates the contribution of local information, gives better discrimination and is more effectively applied in our multi-scale transfer entropy analysis of EEG.
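
    A stripped-down version of the permutation (ordinal) symbolization and transfer entropy used above can be sketched as follows; the multi-scale coarse-graining, the global symbolization scheme and the EEG data themselves are not reproduced, and the history length and toy signals are illustrative only.

        import numpy as np
        from collections import Counter

        def permutation_symbols(x, dim=3):
            # Map a series to ordinal-pattern symbols (embedding dimension 3, delay 1).
            patterns = [tuple(np.argsort(x[i:i + dim])) for i in range(len(x) - dim + 1)]
            lookup = {p: k for k, p in enumerate(sorted(set(patterns)))}
            return np.array([lookup[p] for p in patterns])

        def transfer_entropy(source, target):
            # TE(source -> target) in bits from two aligned symbol sequences, history length 1.
            s, t, t_next = source[:-1], target[:-1], target[1:]
            def H(*cols):
                counts = Counter(zip(*cols))
                p = np.array(list(counts.values()), dtype=float)
                p /= p.sum()
                return -np.sum(p * np.log2(p))
            # TE = H(t_next,t) - H(t) + H(t,s) - H(t_next,t,s)
            return H(t_next, t) - H(t) + H(t, s) - H(t_next, t, s)

        rng = np.random.default_rng(7)
        x = rng.normal(size=2000)
        y = np.roll(x, 1) + 0.5 * rng.normal(size=2000)   # y is driven by the past of x
        sx, sy = permutation_symbols(x), permutation_symbols(y)
        print("TE x->y:", transfer_entropy(sx, sy), " TE y->x:", transfer_entropy(sy, sx))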

  20. Linking entropy flow with typhoon evolution: a case-study

    International Nuclear Information System (INIS)

    Liu, C; Xu, H; Liu, Y

    2007-01-01

    This paper is mainly aimed at investigating the relationship of entropy flow with an atmospheric system (typhoon), based on the observational analyses covering its whole life-cycle. The formula for calculating entropy flow is derived starting with the Gibbs relation with data from the NCEP/NCAR reanalysis. The results show that: (i) entropy flow characteristics at different vertical layers of the system are heterogeneous with predominant negative entropy flow in the large portion of the troposphere and positive ones at upper levels during its development; (ii) changes in the maximum surface wind velocity or the intensity of a typhoon are synchronous with the total entropy flow around the typhoon centre and its neighbourhood, suggesting that the growth of a severe atmospheric system relies greatly upon the negative entropy flow being strong enough, and that entropy flow analysis might provide a particular point of view and a powerful tool to understand the mechanism responsible for the life-cycle of an atmospheric system and associated weather events; and (iii) the horizontal pattern of negative entropy flow near the surface might contain some significant information conducive to the track forecast of typhoons

  1. Applications of Entropy in Finance: A Review

    Directory of Open Access Journals (Sweden)

    Guanqun Tong

    2013-11-01

    Full Text Available Although the concept of entropy originated in thermodynamics, its concepts and relevant principles, especially the principles of maximum entropy and minimum cross-entropy, have been extensively applied in finance. In this paper, we review the concepts and principles of entropy, as well as their applications in the field of finance, especially in portfolio selection and asset pricing. Furthermore, we review the effects of these applications of entropy and compare them with other traditional and new methods.

  2. Maximum-Entropy Models of Sequenced Immune Repertoires Predict Antigen-Antibody Affinity.

    Directory of Open Access Journals (Sweden)

    Lorenzo Asti

    2016-04-01

    Full Text Available The immune system has developed a number of distinct complex mechanisms to shape and control the antibody repertoire. One of these mechanisms, the affinity maturation process, works in an evolutionary-like fashion: after binding to a foreign molecule, the antibody-producing B-cells exhibit a high-frequency mutation rate in the genome region that codes for the antibody active site. Eventually, cells that produce antibodies with higher affinity for their cognate antigen are selected and clonally expanded. Here, we propose a new statistical approach based on maximum entropy modeling in which a scoring function related to the binding affinity of antibodies against a specific antigen is inferred from a sample of sequences of the immune repertoire of an individual. We use our inference strategy to infer a statistical model on a data set obtained by sequencing a fairly large portion of the immune repertoire of an HIV-1 infected patient. The Pearson correlation coefficient between our scoring function and the IC50 neutralization titer measured on 30 different antibodies of known sequence is as high as 0.77 (p-value 10⁻⁶), outperforming other sequence- and structure-based models.

  3. Information and Entropy

    Science.gov (United States)

    Caticha, Ariel

    2007-11-01

    What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs and therefore it is the force that induces us to change our minds. This problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME), which is designed for updating from arbitrary priors given information in the form of arbitrary constraints, includes as special cases both MaxEnt (which allows arbitrary constraints) and Bayes' rule (which allows arbitrary priors). Thus, ME unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme that allows us to handle problems that lie beyond the reach of either of the two methods separately. I conclude with a couple of simple illustrative examples.

  4. Applicability of the minimum entropy generation method for optimizing thermodynamic cycles

    Institute of Scientific and Technical Information of China (English)

    Cheng Xue-Tao; Liang Xin-Gang

    2013-01-01

    Entropy generation is often used as a figure of merit in thermodynamic cycle optimizations. In this paper, it is shown that the applicability of the minimum entropy generation method to optimizing output power is conditional. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power when the total heat into the system of interest is not prescribed. For the cycles whose working medium is heated or cooled by streams with prescribed inlet temperatures and prescribed heat capacity flow rates, it is theoretically proved that both the minimum entropy generation rate and the minimum entropy generation number correspond to the maximum output power when the virtual entropy generation induced by dumping the used streams into the environment is considered. However, the minimum principle of entropy generation is not tenable in the case that the virtual entropy generation is not included, because the total heat into the system of interest is not fixed. An irreversible Carnot cycle and an irreversible Brayton cycle are analysed. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power if the heat into the system of interest is not prescribed.

  5. Applicability of the minimum entropy generation method for optimizing thermodynamic cycles

    International Nuclear Information System (INIS)

    Cheng Xue-Tao; Liang Xin-Gang

    2013-01-01

    Entropy generation is often used as a figure of merit in thermodynamic cycle optimizations. In this paper, it is shown that the applicability of the minimum entropy generation method to optimizing output power is conditional. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power when the total heat into the system of interest is not prescribed. For the cycles whose working medium is heated or cooled by streams with prescribed inlet temperatures and prescribed heat capacity flow rates, it is theoretically proved that both the minimum entropy generation rate and the minimum entropy generation number correspond to the maximum output power when the virtual entropy generation induced by dumping the used streams into the environment is considered. However, the minimum principle of entropy generation is not tenable in the case that the virtual entropy generation is not included, because the total heat into the system of interest is not fixed. An irreversible Carnot cycle and an irreversible Brayton cycle are analysed. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power if the heat into the system of interest is not prescribed. (general)

  6. Entropy analysis in yeast DNA

    International Nuclear Information System (INIS)

    Kim, Jongkwang; Kim, Sowun; Lee, Kunsang; Kwon, Younghun

    2009-01-01

    In this article, we investigate the language structure in the 16 yeast chromosomes. In order to find it, we apply to the codons (or amino acids) of the 16 yeast chromosomes the entropy analysis developed for natural language by Montemurro et al. From the analysis, we can see that there exists a language structure in the codons (or amino acids) of the 16 yeast chromosomes. We also find that the grammar structure of the amino acids of the 16 yeast chromosomes has a deep relationship with the secondary structure of proteins.
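
    In the same spirit, the block entropy of codons in a DNA string can be computed with a few lines of Python (a toy sequence is used here; the Montemurro-style analysis in the paper additionally involves comparisons against shuffled sequences and amino-acid translations).

        from collections import Counter
        from math import log2

        def codon_entropy(seq):
            # Shannon entropy (bits per codon) of non-overlapping codons in a DNA string.
            codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
            counts = Counter(codons)
            total = sum(counts.values())
            return -sum((c / total) * log2(c / total) for c in counts.values())

        # Toy sequence; a real analysis would read a chromosome FASTA file instead.
        seq = "ATGGCTGCTAAATGA" * 100
        print(f"{codon_entropy(seq):.3f} bits/codon (maximum possible is log2(64) = 6 bits)")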

  7. Application of Shannon Wavelet Entropy and Shannon Wavelet Packet Entropy in Analysis of Power System Transient Signals

    Directory of Open Access Journals (Sweden)

    Jikai Chen

    2016-12-01

    Full Text Available In a power system, the analysis of transient signals is the theoretical basis of fault diagnosis and transient protection theory. Shannon wavelet entropy (SWE) and Shannon wavelet packet entropy (SWPE) are powerful mathematical tools for transient signal analysis. Drawing on recent achievements regarding SWE and SWPE, their applications to feature extraction from transient signals and to transient fault recognition are summarized. The impact of wavelet aliasing between adjacent scales of the wavelet decomposition on the feature extraction accuracy of SWE and SWPE is analyzed, and their differences are compared. Meanwhile, these analyses are verified by partial discharge (PD) feature extraction for power cables. Finally, some new ideas and directions for further research are proposed concerning the wavelet entropy mechanism, computation speed and how to overcome wavelet aliasing.
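
    A minimal sketch of a Shannon wavelet entropy computation is given below; it assumes the PyWavelets package and uses an illustrative wavelet, level count and test signal, so it should be read as one common formulation (entropy of the relative wavelet energies) rather than the exact scheme of the cited work.

        import numpy as np
        import pywt  # PyWavelets, assumed installed

        def shannon_wavelet_entropy(signal, wavelet="db4", level=5):
            # Shannon entropy of the relative wavelet energy across decomposition levels.
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            p = energies / energies.sum()
            p = p[p > 0]
            return float(-np.sum(p * np.log(p)))

        # Toy transient: a 50 Hz tone with a short high-frequency burst, sampled at 1 kHz.
        t = np.linspace(0, 1, 1000, endpoint=False)
        x = np.sin(2 * np.pi * 50 * t)
        x[400:450] += 0.8 * np.sin(2 * np.pi * 300 * t[400:450])
        print("SWE:", shannon_wavelet_entropy(x))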

  8. Entropy corresponding to the interior of a Schwarzschild black hole

    Directory of Open Access Journals (Sweden)

    Bibhas Ranjan Majhi

    2017-07-01

    Full Text Available The interior volume within the horizon of a black hole is a non-trivial concept which turns out to be very important for explaining several issues in the context of the quantum nature of black holes. Here we show that the entropy contained by the maximum interior volume for massless modes is proportional to the Bekenstein–Hawking expression. The proportionality constant is less than unity, implying that the horizon bears more entropy than the interior. The derivation is systematic and free of any ambiguity. To this end, the precise value of the energy of the modes living in the interior is derived by constraint analysis. Finally, the implications of the result are discussed.

  9. Entropy corresponding to the interior of a Schwarzschild black hole

    Science.gov (United States)

    Majhi, Bibhas Ranjan; Samanta, Saurav

    2017-07-01

    The interior volume within the horizon of a black hole is a non-trivial concept which turns out to be very important for explaining several issues in the context of the quantum nature of black holes. Here we show that the entropy contained by the maximum interior volume for massless modes is proportional to the Bekenstein-Hawking expression. The proportionality constant is less than unity, implying that the horizon bears more entropy than the interior. The derivation is systematic and free of any ambiguity. To this end, the precise value of the energy of the modes living in the interior is derived by constraint analysis. Finally, the implications of the result are discussed.

  10. Network Inference and Maximum Entropy Estimation on Information Diagrams

    Czech Academy of Sciences Publication Activity Database

    Martin, E.A.; Hlinka, J.; Meinke, A.; Děchtěrenko, Filip; Tintěra, J.; Oliver, I.; Davidsen, J.

    2017-01-01

    Vol. 7, No. 1 (2017), pp. 1-15, article No. 7062. ISSN 2045-2322 R&D Projects: GA ČR GA13-23940S Institutional support: RVO:68081740 Keywords: complex networks * mutual information * entropy maximization * fMRI Subject RIV: AN - Psychology OECD field: Cognitive sciences Impact factor: 4.259, year: 2016

  11. Studies of the pressure dependence of the charge density distribution in cerium phosphide by the maximum-entropy method

    CERN Document Server

    Ishimatsu, N; Takata, M; Nishibori, E; Sakata, M; Hayashi, J; Shirotani, I; Shimomura, O

    2002-01-01

    The physical properties relating to 4f electrons in cerium phosphide, especially the temperature dependence and the isomorphous transition that occurs at around 10 GPa, were studied by means of x-ray powder diffraction and charge density distribution maps derived by the maximum-entropy method. The compressibility of CeP was exactly determined using a helium pressure medium and the anomaly that indicated the isomorphous transition was observed in the compressibility. We also discuss the anisotropic charge density distribution of Ce ions and its temperature dependence.

  12. Application of the maximum entropy method to dynamical fermion simulations

    Science.gov (United States)

    Clowser, Jonathan

    This thesis presents results for spectral functions extracted from imaginary-time correlation functions obtained from Monte Carlo simulations using the Maximum Entropy Method (MEM). The advantages of this method are (i) no a priori assumptions or parametrisations of the spectral function are needed, (ii) a unique solution exists and (iii) the statistical significance of the resulting image can be quantitatively analysed. The Gross-Neveu model in d = 3 spacetime dimensions (GNM3) is a particularly interesting model to study with the MEM because at T = 0 it has a broken phase with a rich spectrum of mesonic bound states and a symmetric phase where there are resonances. Results for the elementary fermion, the Goldstone boson (pion), the sigma, the massive pseudoscalar meson and the symmetric-phase resonances are presented. UKQCD Nf = 2 dynamical QCD data are also studied with MEM. Results are compared to those found from the quenched approximation, where the effects of quark loops in the QCD vacuum are neglected, to search for sea-quark effects in the extracted spectral functions. Information has been extracted from the difficult axial-spatial and scalar channels as well as from the pseudoscalar, vector and axial-temporal channels. An estimate for the non-singlet scalar mass in the chiral limit is given which is in agreement with the experimental value of M_a0 = 985 MeV.

  13. Forest Tree Species Distribution Mapping Using Landsat Satellite Imagery and Topographic Variables with the Maximum Entropy Method in Mongolia

    Science.gov (United States)

    Hao Chiang, Shou; Valdez, Miguel; Chen, Chi-Farn

    2016-06-01

    Forest is a very important ecosystem and natural resource for living things. Based on forest inventories, the government is able to make decisions to conserve, improve and manage forests in a sustainable way. Field work for forestry investigation is difficult and time consuming, because it needs intensive physical labor and the costs are high, especially for surveying in remote mountainous regions. A reliable forest inventory can give us more accurate and timely information to develop new and efficient approaches to forest management. Remote sensing technology has recently been used for forest investigation at a large scale. To produce an informative forest inventory, forest attributes, including tree species, unavoidably need to be considered. In this study the aim is to classify forest tree species in Erdenebulgan County, Huwsgul province in Mongolia, using the Maximum Entropy method. The study area is covered by a dense forest which comprises almost 70% of the total territorial extension of Erdenebulgan County and is located in a high mountain region in northern Mongolia. For this study, Landsat satellite imagery and a Digital Elevation Model (DEM) were acquired to perform tree species mapping. The forest tree species inventory map was collected from the Forest Division of the Mongolian Ministry of Nature and Environment as training data and also used as ground truth to perform the accuracy assessment of the tree species classification. Landsat images and the DEM were processed for maximum entropy modeling, and this study applied the model in two experiments. The first one uses Landsat surface reflectance for tree species classification; the second experiment incorporates terrain variables in addition to the Landsat surface reflectance to perform the tree species classification. All experimental results were compared with the tree species inventory to assess the classification accuracy. Results show that the second one which uses Landsat surface reflectance coupled

  14. FOREST TREE SPECIES DISTRIBUTION MAPPING USING LANDSAT SATELLITE IMAGERY AND TOPOGRAPHIC VARIABLES WITH THE MAXIMUM ENTROPY METHOD IN MONGOLIA

    Directory of Open Access Journals (Sweden)

    S. H. Chiang

    2016-06-01

    Full Text Available Forest is a very important ecosystem and natural resource for living things. Based on forest inventories, the government is able to make decisions to conserve, improve and manage forests in a sustainable way. Field work for forestry investigation is difficult and time consuming, because it needs intensive physical labor and the costs are high, especially for surveying in remote mountainous regions. A reliable forest inventory can give us more accurate and timely information to develop new and efficient approaches to forest management. Remote sensing technology has recently been used for forest investigation at a large scale. To produce an informative forest inventory, forest attributes, including tree species, unavoidably need to be considered. In this study the aim is to classify forest tree species in Erdenebulgan County, Huwsgul province in Mongolia, using the Maximum Entropy method. The study area is covered by a dense forest which comprises almost 70% of the total territorial extension of Erdenebulgan County and is located in a high mountain region in northern Mongolia. For this study, Landsat satellite imagery and a Digital Elevation Model (DEM) were acquired to perform tree species mapping. The forest tree species inventory map was collected from the Forest Division of the Mongolian Ministry of Nature and Environment as training data and also used as ground truth to perform the accuracy assessment of the tree species classification. Landsat images and the DEM were processed for maximum entropy modeling, and this study applied the model in two experiments. The first one uses Landsat surface reflectance for tree species classification; the second experiment incorporates terrain variables in addition to the Landsat surface reflectance to perform the tree species classification. All experimental results were compared with the tree species inventory to assess the classification accuracy. Results show that the second one which uses Landsat surface

  15. Least squares autoregressive (maximum entropy) spectral estimation for Fourier spectroscopy and its application to the electron cyclotron emission from plasma

    International Nuclear Information System (INIS)

    Iwama, N.; Inoue, A.; Tsukishima, T.; Sato, M.; Kawahata, K.

    1981-07-01

    A new procedure for the maximum entropy spectral estimation is studied for the purpose of data processing in Fourier transform spectroscopy. The autoregressive model fitting is examined under a least squares criterion based on the Yule-Walker equations. An AIC-like criterion is suggested for selecting the model order. The principal advantage of the new procedure lies in the enhanced frequency resolution particularly for small values of the maximum optical path-difference of the interferogram. The usefulness of the procedure is ascertained by some numerical simulations and further by experiments with respect to a highly coherent submillimeter wave and the electron cyclotron emission from a stellarator plasma. (author)
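
    For orientation, a basic autoregressive (maximum entropy) spectral estimate via the Yule-Walker equations can be sketched as follows; the paper's least-squares variant and its AIC-like order selection are not reproduced, and the model order and test signal are illustrative.

        import numpy as np

        def yule_walker(x, order):
            # Fit AR(order) coefficients and noise variance from the biased autocovariance.
            x = np.asarray(x, float) - np.mean(x)
            n = len(x)
            r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
            R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
            a = np.linalg.solve(R, r[1:order + 1])
            sigma2 = r[0] - a @ r[1:order + 1]
            return a, sigma2

        def ar_spectrum(a, sigma2, freqs):
            # Maximum entropy (AR) power spectral density at normalized frequencies (cycles/sample).
            k = np.arange(1, len(a) + 1)
            denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2
            return sigma2 / denom

        rng = np.random.default_rng(0)
        t = np.arange(2048)
        x = np.sin(2 * np.pi * 0.12 * t) + 0.5 * rng.normal(size=t.size)

        a, s2 = yule_walker(x, order=8)
        freqs = np.linspace(0, 0.5, 512)
        psd = ar_spectrum(a, s2, freqs)
        print("spectral peak at f ≈", freqs[np.argmax(psd)], "cycles/sample")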

  16. Entropy: From Thermodynamics to Hydrology

    Directory of Open Access Journals (Sweden)

    Demetris Koutsoyiannis

    2014-02-01

    Full Text Available Some known results from statistical thermophysics as well as from hydrology are revisited from a different perspective trying: (a) to unify the notion of entropy in thermodynamic and statistical/stochastic approaches of complex hydrological systems and (b) to show the power of entropy and the principle of maximum entropy in inference, both deductive and inductive. The capability for deductive reasoning is illustrated by deriving the law of phase change transition of water (Clausius-Clapeyron) from scratch by maximizing entropy in a formal probabilistic frame. However, such deductive reasoning cannot work in more complex hydrological systems with diverse elements, yet the entropy maximization framework can help in inductive inference, necessarily based on data. Several examples of this type are provided in an attempt to link statistical thermophysics with hydrology with a unifying view of entropy.

  17. Global sensitivity analysis for fuzzy inputs based on the decomposition of fuzzy output entropy

    Science.gov (United States)

    Shi, Yan; Lu, Zhenzhou; Zhou, Yicheng

    2018-06-01

    To analyse the component of fuzzy output entropy, a decomposition method of fuzzy output entropy is first presented. After the decomposition of fuzzy output entropy, the total fuzzy output entropy can be expressed as the sum of the component fuzzy entropy contributed by fuzzy inputs. Based on the decomposition of fuzzy output entropy, a new global sensitivity analysis model is established for measuring the effects of uncertainties of fuzzy inputs on the output. The global sensitivity analysis model can not only tell the importance of fuzzy inputs but also simultaneously reflect the structural composition of the response function to a certain degree. Several examples illustrate the validity of the proposed global sensitivity analysis, which is a significant reference in engineering design and optimization of structural systems.

  18. Using maximum entropy modeling for optimal selection of sampling sites for monitoring networks

    Science.gov (United States)

    Stohlgren, Thomas J.; Kumar, Sunil; Barnett, David T.; Evangelista, Paul H.

    2011-01-01

    Environmental monitoring programs must efficiently describe state shifts. We propose using maximum entropy modeling to select dissimilar sampling sites to capture environmental variability at low cost, and demonstrate a specific application: sample site selection for the Central Plains domain (453,490 km2) of the National Ecological Observatory Network (NEON). We relied on four environmental factors: mean annual temperature and precipitation, elevation, and vegetation type. A “sample site” was defined as a 20 km × 20 km area (equal to NEON’s airborne observation platform [AOP] footprint), within which each 1 km2 cell was evaluated for each environmental factor. After each model run, the most environmentally dissimilar site was selected from all potential sample sites. The iterative selection of eight sites captured approximately 80% of the environmental envelope of the domain, an improvement over stratified random sampling and simple random designs for sample site selection. This approach can be widely used for cost-efficient selection of survey and monitoring sites.

  19. Second Law Analysis of the Optimal Fin by Minimum Entropy Generation

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Based on the entropy generation concept of thermodynamics, this paper establishes a general theoretical model for the analysis of entropy generation to optimize fins, in which the minimum entropy generation is selected as the objective. The irreversibility due to heat transfer and friction is taken into account, so that the minimum entropy generation number is analyzed with respect to the second law of thermodynamics in forced cross-flow. The optimum dimensions of cylinder pins are discussed. It is found that the minimum entropy generation number depends on parameters related to the fluid and the physical parameters of the fin. Variations of the minimum entropy generation number with different parameters are analyzed.

  20. Texture analysis using Renyi's generalized entropies

    NARCIS (Netherlands)

    Grigorescu, SE; Petkov, N

    2003-01-01

    We propose a texture analysis method based on Renyi's generalized entropies. The method aims at identifying texels in regular textures by searching for the smallest window through which the minimum number of different visual patterns is observed when moving the window over a given texture. The

  1. Information theory explanation of the fluctuation theorem, maximum entropy production and self-organized criticality in non-equilibrium stationary states

    CERN Document Server

    Dewar, R

    2003-01-01

    Jaynes' information theory formalism of statistical mechanics is applied to the stationary states of open, non-equilibrium systems. First, it is shown that the probability distribution p_Γ of the underlying microscopic phase space trajectories Γ over a time interval of length τ satisfies p_Γ ∝ exp(τσ_Γ/2k_B), where σ_Γ is the time-averaged rate of entropy production of Γ. Three consequences of this result are then derived: (1) the fluctuation theorem, which describes the exponentially declining probability of deviations from the second law of thermodynamics as τ → ∞; (2) the selection principle of maximum entropy production for non-equilibrium stationary states, empirical support for which has been found in studies of phenomena as diverse as the Earth's climate and crystal growth morphology; and (3) the emergence of self-organized criticality for flux-driven systems in the slowly-driven limit. The explanation of these results on general inf...

  2. A Novel Maximum Entropy Markov Model for Human Facial Expression Recognition.

    Directory of Open Access Journals (Sweden)

    Muhammad Hameed Siddiqi

    Full Text Available Research in video based FER systems has exploded in the past decade. However, most of the previous methods work well when they are trained and tested on the same dataset. Illumination settings, image resolution, camera angle, and physical characteristics of the people differ from one dataset to another. Considering a single dataset keeps the variance, which results from differences, to a minimum. Having a robust FER system, which can work across several datasets, is thus highly desirable. The aim of this work is to design, implement, and validate such a system using different datasets. In this regard, the major contribution is made at the recognition module which uses the maximum entropy Markov model (MEMM for expression recognition. In this model, the states of the human expressions are modeled as the states of an MEMM, by considering the video-sensor observations as the observations of MEMM. A modified Viterbi is utilized to generate the most probable expression state sequence based on such observations. Lastly, an algorithm is designed which predicts the expression state from the generated state sequence. Performance is compared against several existing state-of-the-art FER systems on six publicly available datasets. A weighted average accuracy of 97% is achieved across all datasets.

  3. Electronic structure of β-FeSi₂ obtained by maximum entropy method and photoemission spectroscopy

    CERN Document Server

    Kakemoto, H; Makita, Y; Kino, Y; Tsukamoto, T; Shin, S; Wada, S; Tsurumi, T

    2003-01-01

    The electronic structure of β-FeSi₂ was investigated by the maximum entropy method (MEM) and photoemission spectroscopy. The electronic structure obtained by MEM using X-ray diffraction data at room temperature (RT) showed covalent bonding of Fe-Si and Si-Si electrons. The photoemission spectra of β-FeSi₂ at RT changed with the incident photon energy. For photon energies between 50 and 100 eV, resonant photoemission spectra caused by a super Coster-Kronig transition were observed. In order to reduce the resonant effect of Fe(3d) in the obtained photoemission spectra, the difference spectrum between 53 and 57 eV was calculated and compared with an ab initio band calculation and the spectral function.

  4. A Note on Burg’s Modified Entropy in Statistical Mechanics

    Directory of Open Access Journals (Sweden)

    Amritansu Ray

    2016-02-01

    Full Text Available Burg's entropy plays an important role in this age of information euphoria, particularly in understanding the emergent behavior of complex systems such as those of statistical mechanics. For a discrete or continuous variable, maximization of Burg's entropy subject to only its natural and mean constraints always provides a positive density function, even though the entropy itself is always negative. On the other hand, Burg's modified entropy is a better measure than the standard Burg entropy since it is always positive and poses no computational problem for small probability values. Moreover, the maximum value of Burg's modified entropy increases with the number of possible outcomes. In this paper, a premium has been put on the fact that if Burg's modified entropy is used instead of conventional Burg's entropy in a maximum entropy probability density (MEPD) function, the result yields a better approximation of the probability distribution. An important lemma in basic algebra and a suitable example with tables and graphs in statistical mechanics are given to illustrate the whole idea appropriately.
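
    For reference, the standard Burg-entropy maximization alluded to above (with the natural and mean constraints) takes the form below; the paper's modified measure is not reproduced here:

        \begin{aligned}
        &\max_{p}\; B = \sum_{i=1}^{n} \ln p_i
        \quad\text{s.t.}\quad \sum_{i=1}^{n} p_i = 1,\;\; \sum_{i=1}^{n} x_i\, p_i = m,\\
        &\Longrightarrow\quad p_i = \frac{1}{\lambda + \mu x_i},
        \end{aligned}

    with the multipliers λ and μ fixed by the two constraints; the resulting p_i are positive even though B itself is negative whenever all p_i < 1.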

  5. Multifield stochastic particle production: beyond a maximum entropy ansatz

    Energy Technology Data Exchange (ETDEWEB)

    Amin, Mustafa A.; Garcia, Marcos A.G.; Xie, Hong-Yi; Wen, Osmond, E-mail: mustafa.a.amin@gmail.com, E-mail: marcos.garcia@rice.edu, E-mail: hxie39@wisc.edu, E-mail: ow4@rice.edu [Physics and Astronomy Department, Rice University, 6100 Main Street, Houston, TX 77005 (United States)

    2017-09-01

    We explore non-adiabatic particle production for N_f coupled scalar fields in a time-dependent background with stochastically varying effective masses, cross-couplings and intervals between interactions. Under the assumption of weak scattering per interaction, we provide a framework for calculating the typical particle production rates after a large number of interactions. After setting up the framework, for analytic tractability, we consider interactions (effective masses and cross couplings) characterized by series of Dirac-delta functions in time with amplitudes and locations drawn from different distributions. Without assuming that the fields are statistically equivalent, we present closed form results (up to quadratures) for the asymptotic particle production rates for the N_f = 1 and N_f = 2 cases. We also present results for the general N_f > 2 case, but with more restrictive assumptions. We find agreement between our analytic results and direct numerical calculations of the total occupation number of the produced particles, with departures that can be explained in terms of violation of our assumptions. We elucidate the precise connection between the maximum entropy ansatz (MEA) used in Amin and Baumann (2015) and the underlying statistical distribution of the self and cross couplings. We provide and justify a simple-to-use (MEA-inspired) expression for the particle production rate, which agrees with our more detailed treatment when the parameters characterizing the effective mass and cross-couplings between fields are all comparable to each other. However, deviations are seen when some parameters differ significantly from others. We show that such deviations become negligible for a broad range of parameters when N_f >> 1.

  6. Maximum entropy models of ecosystem functioning

    International Nuclear Information System (INIS)

    Bertram, Jason

    2014-01-01

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense using a savanna plant ecology model as an example

  7. Maximum entropy models of ecosystem functioning

    Energy Technology Data Exchange (ETDEWEB)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au [Research School of Biology, The Australian National University, Canberra ACT 0200 (Australia)

    2014-12-05

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense using a savanna plant ecology model as an example.

  8. Absorption and scattering coefficients estimation in two-dimensional participating media using the generalized maximum entropy and Levenberg-Marquardt methods

    International Nuclear Information System (INIS)

    Berrocal T, Mariella J.; Roberty, Nilson C.; Silva Neto, Antonio J.; Universidade Federal, Rio de Janeiro, RJ

    2002-01-01

    The solution of inverse problems in participating media, where there is emission, absorption and scattering of radiation, has several applications in engineering and medicine. The objective of this work is to estimate the absorption and scattering coefficients in two-dimensional heterogeneous participating media, using independently the Generalized Maximum Entropy and Levenberg-Marquardt methods. Both methods are based on the solution of the direct problem, which is modeled by the Boltzmann equation in Cartesian geometry. Some test cases are presented. (author)

  9. Entropy flow and generation in radiative transfer between surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Z.M.; Basu, S. [Georgia Institute of Technolgy, Atlanta, GA (United States). George W. Woodruff School of Mechanical Engineering

    2007-02-15

    Entropy of radiation has been used to derive the laws of blackbody radiation and determine the maximum efficiency of solar energy conversion. Along with the advancement in thermophotovoltaic technologies and nanoscale heat radiation, there is an urgent need to determine the entropy flow and generation in radiative transfer between nonideal surfaces when multiple reflections are significant. This paper investigates entropy flow and generation when incoherent multiple reflections are included, without considering the effects of interference and photon tunneling. The concept of partial equilibrium is applied to interpret the monochromatic radiation temperature of thermal radiation, T_λ(λ,Ω), which is dependent on both wavelength λ and direction Ω. The entropy flux and generation can thus be evaluated for nonideal surfaces. It is shown that several approximate expressions found in the literature can result in significant errors in entropy analysis even for diffuse-gray surfaces. The present study advances the thermodynamics of nonequilibrium thermal radiation and will have a significant impact on the future development of thermophotovoltaic and other radiative energy conversion devices. (author)

  10. Tsallis Entropy and the Transition to Scaling in Fragmentation

    Science.gov (United States)

    Sotolongo-Costa, Oscar; Rodriguez, Arezky H.; Rodgers, G. J.

    2000-12-01

    By using the maximum entropy principle with Tsallis entropy we obtain a fragment size distribution function which undergoes a transition to scaling. This distribution function reduces to those obtained by other authors using Shannon entropy. The treatment is easily generalisable to any process of fractioning with suitable constraints.
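
    The Tsallis-entropy maximization referred to above has the following generic form (written here with a simple normalization and a q-generalized mean constraint; the paper's specific constraints for fragment sizes may differ):

        \begin{aligned}
        &S_q[p] = \frac{1-\int p(x)^{q}\,dx}{q-1},
        \qquad
        \max_{p} S_q \;\;\text{s.t.}\;\; \int p\,dx = 1,\;\; \langle x\rangle_q = \bar{x},\\
        &\Longrightarrow\quad p(x)\;\propto\;\bigl[1-(1-q)\,\beta x\bigr]^{\tfrac{1}{1-q}},
        \end{aligned}

    a q-exponential whose tail behaves as a power law for q > 1 and which reduces to the usual exponential (Shannon) result in the limit q → 1.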

  11. Normal Mode Analysis in Zeolites: Toward an Efficient Calculation of Adsorption Entropies.

    Science.gov (United States)

    De Moor, Bart A; Ghysels, An; Reyniers, Marie-Françoise; Van Speybroeck, Veronique; Waroquier, Michel; Marin, Guy B

    2011-04-12

    An efficient procedure for normal-mode analysis of extended systems, such as zeolites, is developed and illustrated for the physisorption and chemisorption of n-octane and isobutene in H-ZSM-22 and H-FAU using periodic DFT calculations employing the Vienna Ab Initio Simulation Package. Physisorption and chemisorption entropies resulting from partial Hessian vibrational analysis (PHVA) differ by at most 10 J mol⁻¹ K⁻¹ from those resulting from full Hessian vibrational analysis, even for PHVA schemes in which only a very limited number of atoms are considered free. To acquire a well-conditioned Hessian, much tighter optimization criteria than commonly used for electronic energy calculations in zeolites are required, i.e., at least an energy cutoff of 400 eV, a maximum force of 0.02 eV/Å, and a self-consistent field loop convergence criterion of 10⁻⁸ eV. For loosely bonded complexes the mobile adsorbate method is applied, in which frequency contributions originating from translational or rotational motions of the adsorbate are removed from the total partition function and replaced by free translational and/or rotational contributions. The frequencies corresponding with these translational and rotational modes can be selected unambiguously based on a mobile block Hessian-PHVA calculation, allowing the prediction of physisorption entropies within an accuracy of 10-15 J mol⁻¹ K⁻¹ as compared to experimental values. The approach presented in this study is useful for studies on other extended catalytic systems.

  12. Energy and entropy analysis of closed adiabatic expansion based trilateral cycles

    International Nuclear Information System (INIS)

    Garcia, Ramon Ferreiro; Carril, Jose Carbia; Gomez, Javier Romero; Gomez, Manuel Romero

    2016-01-01

    the Carnot factor is exceeded are determined, where carbon dioxide, nitrogen, helium and hydrogen are considered as real working fluids, followed by an entropic analysis in order to verify 2nd law fulfilment. The results of the analysis show that within a range of relatively low operating temperatures, high thermal efficiency is achieved, reaching 44.9% for helium when the Carnot factor is 33.3% under a ratio of temperatures of 450/300 K. With respect to entropy analysis, it is verified that the results of the latter demonstrate compliance with the second principle, while violating Carnot constraints, since the Carnot factor is constrained only by the Carnot, Stirling and Ericsson cycles and its associated Carnot engine characteristics. However, the most relevant findings through the performed analysis concern the detection of some inconsistencies regarding the conventional 2nd law efficiency definition and the exergy transfer definition from thermal power sources to thermal cycles. In summary, a TC undergoing isochoric heat absorption, adiabatic expansion and isobaric heat rejection under closed transformations can yield improved performance over traditional thermal cycles, even exceeding the Carnot factor under relatively low top temperatures, for which Carnot efficiency is lower. Furthermore, the concept of 2nd law efficiency, defined as the ratio of the thermal to the Carnot efficiency, has been reconsidered in agreement with the results achieved. That is, the definition of 2nd law efficiency lacks both theoretical and practical sense. In the same way, as a result of discarding the Carnot factor as limiting the thermal efficiency, the definition of the exergy transfer to a thermal cycle (the maximum available energy) must be defined as the product of the transferred heat from a heat source and the thermal efficiency.

  13. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    Science.gov (United States)

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
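
    The exponential forms cited above follow from the standard maximum entropy calculation with a single mean constraint, sketched here for the particle velocity magnitude u (the same argument applies to travel times):

        \begin{aligned}
        &\max_{f\ge 0}\; -\int_0^\infty f(u)\ln f(u)\,du
        \quad\text{s.t.}\quad \int_0^\infty f\,du = 1,\;\; \int_0^\infty u\,f\,du = \bar{u},\\
        &\Longrightarrow\quad f(u) = \frac{1}{\bar{u}}\,e^{-u/\bar{u}}.
        \end{aligned}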

  14. Scaling-Laws of Flow Entropy with Topological Metrics of Water Distribution Networks

    Directory of Open Access Journals (Sweden)

    Giovanni Francesco Santonastaso

    2018-01-01

    Full Text Available The robustness of water distribution networks is related to their connectivity and topological structure, which also affect their reliability. Flow entropy, based on Shannon's informational entropy, has been proposed as a measure of network redundancy and adopted as a proxy for reliability in optimal network design procedures. In this paper, the scaling properties of the flow entropy of water distribution networks with their size and other topological metrics are studied. To this aim, flow entropy, maximum flow entropy, link density and average path length have been evaluated for a set of 22 networks, both real and synthetic, with different size and topology. The obtained results led to the identification of suitable scaling laws of flow entropy and maximum flow entropy with water distribution network size, in the form of power laws. The obtained relationships allow comparison of the flow entropy of water distribution networks of different size, and provide an easy tool to define the maximum achievable entropy of a specific water distribution network. An example of application of the obtained relationships to the design of a water distribution network is provided, showing how, with a constrained multi-objective optimization procedure, a tradeoff between network cost and robustness is easily identified.

  15. Comparison of two views of maximum entropy in biodiversity: Frank (2011) and Pueyo et al. (2007).

    Science.gov (United States)

    Pueyo, Salvador

    2012-05-01

    An increasing number of authors agree in that the maximum entropy principle (MaxEnt) is essential for the understanding of macroecological patterns. However, there are subtle but crucial differences among the approaches by several of these authors. This poses a major obstacle for anyone interested in applying the methodology of MaxEnt in this context. In a recent publication, Frank (2011) gives some arguments why his own approach would represent an improvement as compared to the earlier paper by Pueyo et al. (2007) and also to the views by Edwin T. Jaynes, who first formulated MaxEnt in the context of statistical physics. Here I show that his criticisms are flawed and that there are fundamental reasons to prefer the original approach.

  16. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing the entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A distribution (Hal-A), Halphen type B distribution (Hal-B) and Halphen type inverse B distribution (Hal-IB), among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012) and the Halphen family was first derived using entropy theory in this paper. The entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, which indicated that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis. The entropy-based derivation led to a new way for frequency analysis of hydrometeorological extremes.
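
    A hedged sketch of the model-comparison step (synthetic data, scipy's generalized gamma as a stand-in parameterization; the GB2 and Halphen families are omitted here):

```python
# Fit candidate distributions to a stand-in "extreme rainfall" sample and rank them by AIC,
# mirroring the kind of comparison summarized above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rainfall = stats.gamma(2.0, scale=15.0).rvs(200, random_state=rng)  # synthetic sample

def aic(dist, data):
    params = dist.fit(data)                       # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(data, *params))
    return 2 * len(params) - 2 * loglik

for name, dist in [("generalized gamma", stats.gengamma), ("gamma", stats.gamma)]:
    print(f"{name:18s} AIC = {aic(dist, rainfall):8.2f}")
```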

  17. Maximizing entropy of image models for 2-D constrained coding

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Danieli, Matteo; Burini, Nino

    2010-01-01

    This paper considers estimating and maximizing the entropy of two-dimensional (2-D) fields with application to 2-D constrained coding. We consider Markov random fields (MRF), which have a non-causal description, and the special case of Pickard random fields (PRF). The PRF are 2-D causal finite...... context models, which define stationary probability distributions on finite rectangles and thus allow for calculation of the entropy. We consider two binary constraints and revisit the hard square constraint given by forbidding neighboring 1s and provide novel results for the constraint that no uniform 2...... × 2 squares contain all 0s or all 1s. The maximum values of the entropy for the constraints are estimated and binary PRF satisfying the constraint are characterized and optimized w.r.t. the entropy. The maximum binary PRF entropy is 0.839 bits/symbol for the no uniform squares constraint. The entropy...
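
    For orientation, the capacity of a 2-D constraint can be bounded with a standard transfer-matrix calculation over strips of finite width; the sketch below does this for the hard square constraint mentioned above (this is the textbook estimate, not the Pickard random field construction of the paper, and the 0.839 bits/symbol figure refers to the different no-uniform-2 × 2-squares constraint):

```python
# Estimate the entropy per symbol of the hard square constraint (no two horizontally or
# vertically adjacent 1s) from the largest transfer-matrix eigenvalue of strips of
# increasing width.
import numpy as np

def hard_square_entropy(width):
    # admissible rows: no two adjacent 1s within the row
    rows = [r for r in range(1 << width) if r & (r << 1) == 0]
    # two rows may be stacked if they never place a 1 in the same column
    T = np.array([[1.0 if (a & b) == 0 else 0.0 for b in rows] for a in rows])
    lam = np.max(np.linalg.eigvalsh(T))   # largest eigenvalue (T is symmetric)
    return np.log2(lam) / width           # bits per symbol for this strip width

for w in (4, 6, 8, 10):
    print(f"width {w:2d}: {hard_square_entropy(w):.4f} bits/symbol")
# the estimates approach the known hard-square capacity of roughly 0.588 bits/symbol
```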

  18. Entropy-based implied volatility and its information content

    NARCIS (Netherlands)

    X. Xiao (Xiao); C. Zhou (Chen)

    2016-01-01

    This paper investigates the maximum entropy approach to estimating implied volatility. The entropy approach also allows measuring option-implied skewness and kurtosis nonparametrically, and constructing confidence intervals. Simulations show that the entropy approach outperforms
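
    In the same spirit (but not the authors' estimator), a risk-neutral density can be recovered by maximizing entropy on a discretized price grid subject to price constraints; the sketch below uses synthetic option prices, zero interest rates and the convex dual of the MaxEnt problem, all of which are assumptions made for illustration.

```python
# Hedged sketch: maximum-entropy density on a price grid consistent with a forward price
# and a few call prices, solved through the convex dual (log-partition) formulation.
import numpy as np
from scipy import stats
from scipy.optimize import minimize

x = np.linspace(1.0, 300.0, 600)                 # grid of terminal prices
market = stats.lognorm(0.25, scale=100.0)        # stand-in "market" density
w = market.pdf(x); w /= w.sum()

strikes = [90.0, 100.0, 110.0]
features = [x] + [np.maximum(x - K, 0.0) for K in strikes]   # forward + call payoffs
targets = np.array([np.sum(w * g) for g in features])        # synthetic observed prices
G = np.vstack(features)                                       # (n_constraints, n_grid)

def dual(lam):
    # minimize log-partition minus lam . targets; its minimizer gives the MaxEnt density
    z = lam @ G
    m = z.max()
    return m + np.log(np.sum(np.exp(z - m))) - lam @ targets

res = minimize(dual, np.zeros(len(targets)), method="BFGS")
z = res.x @ G
p = np.exp(z - z.max()); p /= p.sum()            # recovered maximum-entropy density

print("recovered forward        :", float(np.sum(p * x)))
print("recovered 100-strike call:", float(np.sum(p * np.maximum(x - 100.0, 0.0))))
```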

  19. An Adaptively Accelerated Bayesian Deblurring Method with Entropy Prior

    Directory of Open Access Journals (Sweden)

    Yong-Hoon Kim

    2008-05-01

    Full Text Available The development of an efficient adaptively accelerated iterative deblurring algorithm based on Bayesian statistical concepts is reported. The entropy of an image is used as a “prior” distribution and, instead of the additive form used in conventional acceleration methods, an exponent form of the relaxation constant is used for acceleration. The proposed method is hereafter called adaptively accelerated maximum a posteriori with entropy prior (AAMAPE). Based on empirical observations in different experiments, the exponent is computed adaptively using first-order derivatives of the deblurred image from the previous two iterations. This exponent improves the speed of the AAMAPE method in early stages and ensures stability at later stages of iteration. In the AAMAPE method, we also consider the constraints of nonnegativity and flux conservation. The paper discusses the fundamental idea of Bayesian image deblurring with the use of entropy as a prior, and the analytical analysis of the superresolution and noise amplification characteristics of the proposed method. The experimental results show that the proposed AAMAPE method gives lower RMSE and higher SNR in 44% fewer iterations compared to the nonaccelerated maximum a posteriori with entropy prior (MAPE) method. Moreover, AAMAPE followed by wavelet Wiener filtering gives better results than the state-of-the-art methods.
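
    A generic sketch of the underlying objective (a plain MAP estimate with an entropy prior on a 1-D toy signal; it does not implement the adaptive exponent or the AAMAPE acceleration described above):

```python
# Deblur a toy 1-D signal by minimizing the data misfit minus an entropy prior, with a
# nonnegativity constraint; the adaptive acceleration of AAMAPE is not reproduced here.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.optimize import minimize

rng = np.random.default_rng(1)
truth = np.zeros(100); truth[30:35] = 1.0; truth[60:70] = 0.5
blurred = gaussian_filter1d(truth, sigma=2.0) + 0.01 * rng.standard_normal(100)

def objective(x, lam=1e-2, eps=1e-12):
    resid = gaussian_filter1d(x, sigma=2.0) - blurred      # known blur as forward model
    entropy = -np.sum(x * np.log(x + eps))                 # image entropy prior
    return 0.5 * np.sum(resid**2) - lam * entropy

x0 = np.full(100, max(float(blurred.mean()), 1e-3))
res = minimize(objective, x0, method="L-BFGS-B", bounds=[(0.0, None)] * 100)
print("reconstruction RMSE:", float(np.sqrt(np.mean((res.x - truth) ** 2))))
```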

  20. A Trustworthiness Evaluation Method for Software Architectures Based on the Principle of Maximum Entropy (POME) and the Grey Decision-Making Method (GDMM)

    Directory of Open Access Journals (Sweden)

    Rong Jiang

    2014-09-01

    Full Text Available As the early design decision-making structure, a software architecture plays a key role in the final software product quality and the whole project. In the software design and development process, an effective evaluation of the trustworthiness of a software architecture can help make scientific and reasonable decisions on the architecture, which are necessary for the construction of highly trustworthy software. Given the lack of trustworthiness evaluation and measurement studies for software architectures, this paper provides a trustworthiness attribute model of software architecture. Based on this model, the paper proposes to use the Principle of Maximum Entropy (POME) and the Grey Decision-making Method (GDMM) as the trustworthiness evaluation method for a software architecture, demonstrates the scientific soundness and rationality of this method, and verifies its feasibility through case analysis.

  1. Multi-Level Wavelet Shannon Entropy-Based Method for Single-Sensor Fault Location

    Directory of Open Access Journals (Sweden)

    Qiaoning Yang

    2015-10-01

    Full Text Available In actual application, sensors are prone to failure because of harsh environments, battery drain, and sensor aging. Sensor fault location is an important step for follow-up sensor fault detection. In this paper, two new multi-level wavelet Shannon entropies (multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy) are defined. They take full advantage of the sensor fault frequency distribution and energy distribution across multiple subbands in the wavelet domain. Based on the multi-level wavelet Shannon entropy, a method is proposed for single sensor fault location. The method first uses a criterion of maximum energy-to-Shannon entropy ratio to select the appropriate wavelet base for signal analysis. Then multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy are used to locate the fault. The method is validated using practical chemical gas concentration data from a gas sensor array. Compared with wavelet time Shannon entropy and wavelet energy Shannon entropy, the experimental results demonstrate that the proposed method can achieve accurate location of a single sensor fault and has good anti-noise ability. The proposed method is feasible and effective for single-sensor fault location.
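
    A simplified sketch of two of the ingredients above, per-subband Shannon entropy and an energy-to-entropy ratio for choosing the wavelet base (PyWavelets assumed available; synthetic signal with an injected transient, and a simplified version of the ratio criterion, not the gas-sensor data or the exact multi-level definitions of the paper):

```python
import numpy as np
import pywt

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 5 * t) + 0.2 * rng.standard_normal(t.size)
signal[500:520] += 2.0                                   # injected "fault" transient

def subband_entropies(sig, wavelet, level=4):
    """Shannon entropy of the normalized coefficient energies in each wavelet subband."""
    out = []
    for c in pywt.wavedec(sig, wavelet, level=level):
        e = c**2
        p = e / e.sum()
        out.append(-np.sum(p * np.log(p + 1e-15)))
    return out

for wavelet in ("db4", "sym5", "coif3"):
    ent = subband_entropies(signal, wavelet)
    ratio = np.sum(signal**2) / np.sum(ent)              # simplified energy-to-entropy ratio
    print(f"{wavelet}: subband entropies {[round(e, 2) for e in ent]}, ratio {ratio:.1f}")
```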

  2. Maximum entropy reconstruction of poloidal magnetic field and radial electric field profiles in tokamaks

    Science.gov (United States)

    Chen, Yihang; Xiao, Chijie; Yang, Xiaoyi; Wang, Tianbo; Xu, Tianchao; Yu, Yi; Xu, Min; Wang, Long; Lin, Chen; Wang, Xiaogang

    2017-10-01

    The Laser-driven Ion beam trace probe (LITP) is a new diagnostic method for measuring poloidal magnetic field (Bp) and radial electric field (Er) in tokamaks. LITP injects a laser-driven ion beam into the tokamak, and Bp and Er profiles can be reconstructed using tomography methods. A reconstruction code has been developed to validate the LITP theory, and both 2D reconstruction of Bp and simultaneous reconstruction of Bp and Er have been attained. To reconstruct from experimental data with noise, Maximum Entropy and Gaussian-Bayesian tomography methods were applied and improved according to the characteristics of the LITP problem. With these improved methods, a reconstruction error level below 15% has been attained with a data noise level of 10%. These methods will be further tested and applied in the following LITP experiments. Supported by the ITER-CHINA program 2015GB120001, CHINA MOST under 2012YQ030142 and the National Natural Science Foundation of China under 11575014 and 11375053.

  3. Scaling of the magnetic entropy change of Fe3−xMnxSi

    International Nuclear Information System (INIS)

    Said, M.R.; Hamam, Y.A.; Abu-Aljarayesh, I.

    2014-01-01

    The magnetic entropy change of Fe3−xMnxSi (for x=1.15, 1.3 and 1.5) has been extracted from isothermal magnetization measurements near the Curie temperature. We used the scaling hypotheses of the thermodynamic potentials to scale the magnetic entropy change to a single universal curve for each sample. The effect of the exchange field and the Curie temperature on the maximum entropy change is discussed. - Highlights: • The maximum of the magnetic entropy change occurs at temperatures T > T_C. • The exchange field enhances the magnetic entropy change. • The magnetic entropy change at T_C is inversely proportional to T_C. • Scaling hypothesis is used to scale the magnetic entropy change
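
    The extraction step described above rests on the Maxwell relation ΔS_M(T) = ∫ (∂M/∂T)_H dH evaluated over the isothermal magnetization curves; a numerical sketch with a synthetic M(T, H) surface (not the Fe3−xMnxSi data) is given below.

```python
# Compute the magnetic entropy change from isothermal magnetization data via the Maxwell
# relation, using finite differences in temperature and trapezoidal integration in field.
import numpy as np
from scipy.integrate import trapezoid

T = np.linspace(250, 350, 21)          # temperatures (K)
H = np.linspace(0, 5, 51)              # applied field (T)
Tc = 300.0
# toy magnetization surface M(T, H): grows with field, collapses through the Curie point
M = np.tanh(H[None, :] + 0.3) * 50.0 / (1.0 + np.exp((T[:, None] - Tc) / 10.0))

dM_dT = np.gradient(M, T, axis=0)              # (dM/dT) at constant H
delta_S = trapezoid(dM_dT, H, axis=1)          # integrate over field at each temperature

i = np.argmax(np.abs(delta_S))
print(f"maximum |dS_M| = {abs(delta_S[i]):.2f} (arb. units) at T = {T[i]:.0f} K")
```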

  4. Generalized sample entropy analysis for traffic signals based on similarity measure

    Science.gov (United States)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on a similarity distance, presents a different way of matching signal patterns and reveals distinct complexity behaviors. Simulations are conducted on synthetic data and traffic signals to provide a comparative study showing the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first is that it overcomes the limitation concerning the relationship between the dimension parameter and the length of the series. The second is that the modified sample entropy functions can be used to quantitatively distinguish time series from different complex systems by the similarity measure.
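
    For reference, a minimal implementation of standard sample entropy (Chebyshev distance, the usual m and r conventions) is sketched below; the similarity-distance modification proposed in the paper would replace the template-matching rule and is not reproduced here.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Standard sample entropy of a 1-D series (not the generalized variant above)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    if r is None:
        r = 0.2 * np.std(x)
    def count_matches(mm):
        # templates of length mm; the same N - m templates are used for both lengths
        templates = np.array([x[i:i + mm] for i in range(N - m)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        n = len(templates)
        return (np.sum(d < r) - n) / 2           # exclude self-matches, count pairs once
    return -np.log(count_matches(m + 1) / count_matches(m))

rng = np.random.default_rng(3)
print("white noise:", round(sample_entropy(rng.standard_normal(500)), 3))
print("sine wave  :", round(sample_entropy(np.sin(np.linspace(0, 30 * np.pi, 500))), 3))
```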

  5. A second law analysis and entropy generation minimization of an absorption chiller

    KAUST Repository

    Myat, Aung; Thu, Kyaw; Kim, Youngdeuk; Chakraborty, Anutosh; Chun, Wongee; Ng, K. C.

    2011-01-01

    This paper presents a performance analysis of an absorption refrigeration system (ARS) using an entropy generation analysis. A numerical model predicts the performance of the absorption cycle operating under transient conditions along with the entropy generation computation at assorted heat source temperatures, and it also captures the dynamic changes of lithium bromide solution properties such as concentration, density, vapor pressure and overall heat transfer coefficients. An optimization tool, namely the genetic algorithm (GA), is used to locate the system minima over the defined domain of heat source and cooling water temperatures. The analysis shows that minimization of entropy generation in the absorption cycle leads to the maximization of the COP. © 2011 Elsevier Ltd. All rights reserved.

  6. A second law analysis and entropy generation minimization of an absorption chiller

    KAUST Repository

    Myat, Aung

    2011-10-01

    This paper presents a performance analysis of an absorption refrigeration system (ARS) using an entropy generation analysis. A numerical model predicts the performance of the absorption cycle operating under transient conditions along with the entropy generation computation at assorted heat source temperatures, and it also captures the dynamic changes of lithium bromide solution properties such as concentration, density, vapor pressure and overall heat transfer coefficients. An optimization tool, namely the genetic algorithm (GA), is used to locate the system minima over the defined domain of heat source and cooling water temperatures. The analysis shows that minimization of entropy generation in the absorption cycle leads to the maximization of the COP. © 2011 Elsevier Ltd. All rights reserved.

  7. Thermoeconomic diagnosis and entropy generation paradox

    DEFF Research Database (Denmark)

    Sigthorsson, Oskar; Ommen, Torben Schmidt; Elmegaard, Brian

    2017-01-01

    In the entropy generation paradox, the entropy generation number, as a function of heat exchanger effectiveness, counter-intuitively approaches zero in two limits symmetrically from a single maximum. In thermoeconomic diagnosis, namely in the characteristic curve method, the exergy destruction...... to the entropy generation paradox, as a decreased heat exchanger effectiveness (as in the case of an operation anomaly in the component) can counter-intuitively result in decreased exergy destruction rate of the component. Therefore, along with an improper selection of independent variables, the heat exchanger...... increases in case of an operation anomaly in a component. The normalised exergy destruction rate as the dependent variable therefore resolves the relation of the characteristic curve method with the entropy generation paradox....

  8. Network Inference and Maximum Entropy Estimation on Information Diagrams

    Czech Academy of Sciences Publication Activity Database

    Martin, E.A.; Hlinka, Jaroslav; Meinke, A.; Děchtěrenko, Filip; Tintěra, J.; Oliver, I.; Davidsen, J.

    2017-01-01

    Roč. 7, č. 1 (2017), č. článku 7062. ISSN 2045-2322 R&D Projects: GA ČR GA13-23940S; GA MZd(CZ) NV15-29835A Grant - others:GA MŠk(CZ) LO1611 Institutional support: RVO:67985807 Keywords : complex networks * mutual information * entropy maximization * fMRI Subject RIV: BD - Theory of Information OBOR OECD: Computer sciences, information science, bioinformathics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 4.259, year: 2016

  9. Entropy of international trades

    Science.gov (United States)

    Oh, Chang-Young; Lee, D.-S.

    2017-05-01

    The organization of international trades is highly complex under the collective efforts towards economic profits of participating countries given inhomogeneous resources for production. Considering the trade flux as the probability of exporting a product from a country to another, we evaluate the entropy of the world trades in the period 1950-2000. The trade entropy has increased with time, and we show that it is mainly due to the extension of trade partnership. For a given number of trade partners, the mean trade entropy is about 60% of the maximum possible entropy, independent of time, which can be regarded as a characteristic of the trade fluxes' heterogeneity and is shown to be derived from the scaling and functional behaviors of the universal trade-flux distribution. The correlation and time evolution of the individual countries' gross-domestic products and the number of trade partners show that most countries achieved their economic growth partly by extending their trade relationship.
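
    A small sketch of the quantity being tracked (made-up trade matrix rather than the 1950-2000 data): the Shannon entropy of trade fluxes treated as probabilities, compared with the maximum entropy achievable for the same number of trade links.

```python
import numpy as np

# hypothetical export fluxes between four countries (rows: exporter, columns: importer)
flux = np.array([
    [0.0, 120.0, 30.0, 10.0],
    [90.0,  0.0, 60.0,  5.0],
    [20.0, 40.0,  0.0, 15.0],
    [ 5.0, 10.0, 25.0,  0.0],
])

p = flux / flux.sum()                      # trade flux as a probability distribution
q = p[p > 0]
H = -np.sum(q * np.log(q))                 # trade entropy
H_max = np.log(q.size)                     # maximum entropy for the same number of links

print(f"trade entropy = {H:.3f} nats ({100 * H / H_max:.1f}% of the maximum)")
```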

  10. Linearized semiclassical initial value time correlation functions with maximum entropy analytic continuation.

    Science.gov (United States)

    Liu, Jian; Miller, William H

    2008-09-28

    The maximum entropy analytic continuation (MEAC) method is used to extend the range of accuracy of the linearized semiclassical initial value representation (LSC-IVR)/classical Wigner approximation for real time correlation functions. LSC-IVR provides a very effective "prior" for the MEAC procedure since it is very good for short times, exact for all time and temperature for harmonic potentials (even for correlation functions of nonlinear operators), and becomes exact in the classical high temperature limit. This combined MEAC+LSC/IVR approach is applied here to two highly nonlinear dynamical systems, a pure quartic potential in one dimension and liquid para-hydrogen at two thermal state points (25 and 14 K under nearly zero external pressure). The former example shows the MEAC procedure to be a very significant enhancement of the LSC-IVR for correlation functions of both linear and nonlinear operators, and especially at low temperature where semiclassical approximations are least accurate. For liquid para-hydrogen, the LSC-IVR is seen already to be excellent at T=25 K, but the MEAC procedure produces a significant correction at the lower temperature (T=14 K). Comparisons are also made as to how the MEAC procedure is able to provide corrections for other trajectory-based dynamical approximations when used as priors.

  11. Entropy jump across an inviscid shock wave

    Science.gov (United States)

    Salas, Manuel D.; Iollo, Angelo

    1995-01-01

    The shock jump conditions for the Euler equations in their primitive form are derived by using generalized functions. The shock profiles for specific volume, speed, and pressure are shown to be the same; density, however, has a different shock profile. Careful study of the equations that govern the entropy shows that the inviscid entropy profile has a local maximum within the shock layer. We demonstrate that because of this phenomenon, the entropy propagation equation cannot be used as a conservation law.

  12. Analysis of thermal systems using the entropy balance method

    Energy Technology Data Exchange (ETDEWEB)

    Huang, C L.D.; Fartaj, S A; Fenton, D L [Kansas State Univ., Manhattan, KS (United States). Dept. of Mechanical Engineering

    1992-04-01

    This study investigates the applicability of the second law of thermodynamics using an entropy balance method to analyse and design thermal systems. As examples, the entropy balance method is used to analyse a single stage chiller system and a single stage heat transformer, both with lithium-bromide/water as the working fluid. The entropy method yields not only the same information as is conveyed by the methods of energy and exergy analysis, but it also predicts clearly the influence of irreversibilities of individual components on the coefficient of performance and its effectiveness, based on the process properties, rather than on ambient conditions. Furthermore, this method is capable of presenting the overall distribution of the heat input by displaying the additional heat required to overcome irreversibility of each component without ambiguity. (Author).

  13. Relation Entropy and Transferable Entropy Think of Aggregation on Group Decision Making

    Institute of Scientific and Technical Information of China (English)

    CHENG Qi-yue; QIU Wan-hua; LIU Xiao-feng

    2002-01-01

    In this paper, the aggregation problem arising in group decision making and single decision making is studied. Entropy theory is applied to set pair analysis. The notions of relation entropy and transferable entropy are introduced and their properties are studied. A potential based on the relation entropy and the transferable entropy is defined; it serves as a consistency measure between the group and a single decision maker. A new, effective aggregation definition for group misjudgement is obtained.

  14. Quantum key distribution with finite resources: Smooth Min entropy vs. Smooth Renyi entropy

    Energy Technology Data Exchange (ETDEWEB)

    Mertz, Markus; Abruzzo, Silvestre; Bratzik, Sylvia; Kampermann, Hermann; Bruss, Dagmar [Institut fuer Theoretische Physik III, Duesseldorf (Germany)

    2010-07-01

    We consider different entropy measures that play an important role in the analysis of the security of QKD with finite resources. The smooth min entropy leads to an optimal bound for the length of a secure key. Another bound on the secure key length was derived by using Renyi entropies. Unfortunately, it is very hard or even impossible to calculate these entropies for realistic QKD scenarios. To estimate the security rate it becomes important to find computable bounds on these entropies. Here, we compare a lower bound for the smooth min entropy with a bound using Renyi entropies. We compare these entropies for the six-state protocol with symmetric attacks.

  15. ENTROPY FLOW CHARACTERISTICS ANALYSIS OF TYPHOON MATSA (0509)

    Institute of Scientific and Technical Information of China (English)

    XU Hui; LIU Chong-jian

    2008-01-01

    The evolution of Typhoon Matsa (0509) is examined in terms of entropy flow through an entropy balance equation derived from the Gibbs relation, according to the second law of thermodynamics. The entropy flows in the significant stages of its evolution (genesis, development and decay) are diagnosed based on the outputs of the PSU/NCAR mesoscale model (known as MM5). The results show that: (1) the vertical spatial distribution of entropy flow for Matsa is characterized by a predominantly negative entropy flow in a large portion of the troposphere and a positive flow in the upper levels; (2) the fields of entropy flows at the middle troposphere (500 hPa) show that the growth of the typhoon is greatly dependent on the negative entropy flows from its surroundings; and (3) the simulated centres of heavy rainfall associated with the typhoon match well with the zones of large negative entropy flows, suggesting that they may be a significant indicator for severe weather events.

  16. Predictive modeling and mapping of Malayan Sun Bear (Helarctos malayanus) distribution using maximum entropy.

    Science.gov (United States)

    Nazeri, Mona; Jusoff, Kamaruzaman; Madani, Nima; Mahmud, Ahmad Rodzi; Bahman, Abdul Rani; Kumar, Lalit

    2012-01-01

    Species distribution models are among the available tools for mapping the geographical distribution and potential suitable habitats of a species. These techniques are very helpful for finding poorly known distributions of species in poorly sampled areas, such as the tropics. Maximum Entropy (MaxEnt) is a recently developed modeling method that can be successfully calibrated using a relatively small number of records. In this research, the MaxEnt model was applied to describe the distribution and identify the key factors shaping the potential distribution of the vulnerable Malayan Sun Bear (Helarctos malayanus) in one of the main remaining habitats in Peninsular Malaysia. MaxEnt results showed that even though Malaysian sun bear habitat is tied to tropical evergreen forests, it lives in a marginal threshold of bio-climatic variables. On the other hand, current protected area networks within Peninsular Malaysia do not cover most of the sun bear's potential suitable habitats. Assuming that the predicted suitability map covers the sun bear's actual distribution, future climate change, forest degradation and illegal hunting could potentially severely affect the sun bear's population.

  17. Predictive modeling and mapping of Malayan Sun Bear (Helarctos malayanus) distribution using maximum entropy.

    Directory of Open Access Journals (Sweden)

    Mona Nazeri

    Full Text Available Species distribution models are among the available tools for mapping the geographical distribution and potential suitable habitats of a species. These techniques are very helpful for finding poorly known distributions of species in poorly sampled areas, such as the tropics. Maximum Entropy (MaxEnt) is a recently developed modeling method that can be successfully calibrated using a relatively small number of records. In this research, the MaxEnt model was applied to describe the distribution and identify the key factors shaping the potential distribution of the vulnerable Malayan Sun Bear (Helarctos malayanus) in one of the main remaining habitats in Peninsular Malaysia. MaxEnt results showed that even though Malaysian sun bear habitat is tied to tropical evergreen forests, it lives in a marginal threshold of bio-climatic variables. On the other hand, current protected area networks within Peninsular Malaysia do not cover most of the sun bear's potential suitable habitats. Assuming that the predicted suitability map covers the sun bear's actual distribution, future climate change, forest degradation and illegal hunting could potentially severely affect the sun bear's population.

  18. Merging daily sea surface temperature data from multiple satellites using a Bayesian maximum entropy method

    Science.gov (United States)

    Tang, Shaolei; Yang, Xiaofeng; Dong, Di; Li, Ziwei

    2015-12-01

    Sea surface temperature (SST) is an important variable for understanding interactions between the ocean and the atmosphere. SST fusion is crucial for acquiring SST products of high spatial resolution and coverage. This study introduces a Bayesian maximum entropy (BME) method for blending daily SSTs from multiple satellite sensors. A new spatiotemporal covariance model of an SST field is built to integrate not only single-day SSTs but also time-adjacent SSTs. In addition, AVHRR 30-year SST climatology data are introduced as soft data at the estimation points to improve the accuracy of blended results within the BME framework. The merged SSTs, with a spatial resolution of 4 km and a temporal resolution of 24 hours, are produced in the Western Pacific Ocean region to demonstrate and evaluate the proposed methodology. Comparisons with in situ drifting buoy observations show that the merged SSTs are accurate and the bias and root-mean-square errors for the comparison are 0.15°C and 0.72°C, respectively.

  19. Extended statistical entropy analysis as a quantitative management tool for water resource systems

    Science.gov (United States)

    Sobantka, Alicja; Rechberger, Helmut

    2010-05-01

    Entropy has been applied to a variety of problems in hydrology and water resources. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables the development of such a description by providing determination of the least-biased probability distributions with limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. The relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information or information gain or loss. In the analysis of empirical data, entropy is another measure of dispersion, an alternative to the variance. Also, as an evaluation tool, the statistical entropy analysis (SEA) has been developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme the SEA is to be extended for application to chemical compounds and tested for its deficits and potential in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTP). Later applications on the emission of substances to water bodies such as groundwater (e.g. leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all possible nitrogen compounds, which may occur during the water treatment process, are taken into account and are quantified in their impact towards the environment and human health. It has been shown that entropy reducing processes are part of modern waste management. Generally, materials management should be performed in a way that significant entropy rise is avoided. The entropy metric might also be used to perform benchmarking on WWTPs. The result of this management tool would be the determination of the efficiency of WWTPs. By improving and optimizing the efficiency

  20. Applications of quantum entropy to statistics

    International Nuclear Information System (INIS)

    Silver, R.N.; Martz, H.F.

    1994-01-01

    This paper develops two generalizations of the maximum entropy (ME) principle. First, Shannon classical entropy is replaced by von Neumann quantum entropy to yield a broader class of information divergences (or penalty functions) for statistics applications. Negative relative quantum entropy enforces convexity, positivity, non-local extensivity and prior correlations such as smoothness. This enables the extension of ME methods from their traditional domain of ill-posed inverse problems to new applications such as non-parametric density estimation. Second, given a choice of information divergence, a combination of ME and Bayes rule is used to assign both prior and posterior probabilities. Hyperparameters are interpreted as Lagrange multipliers enforcing constraints. Conservation principles, such as conservation of information and smoothness, are proposed to set statistical regularization and other hyperparameters. ME provides an alternative to hierarchical Bayes methods.

  1. Entropy for Mechanically Vibrating Systems

    Science.gov (United States)

    Tufano, Dante

    The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy context of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is given, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis develops upon the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes. The results are shown to be analogous to those obtained for simple oscillators

  2. Well posedness and maximum entropy approximation for the dynamics of quantitative traits

    KAUST Repository

    Boďová, Katarína; Haskovec, Jan; Markowich, Peter A.

    2017-01-01

    We study the Fokker–Planck equation derived in the large system limit of the Markovian process describing the dynamics of quantitative traits. The Fokker–Planck equation is posed on a bounded domain and its transport and diffusion coefficients vanish on the domain’s boundary. We first argue that, despite this degeneracy, the standard no-flux boundary condition is valid. We derive the weak formulation of the problem and prove the existence and uniqueness of its solutions by constructing the corresponding contraction semigroup on a suitable function space. Then, we prove that for the parameter regime with high enough mutation rate the problem exhibits a positive spectral gap, which implies exponential convergence to equilibrium. Next, we provide a simple derivation of the so-called Dynamic Maximum Entropy (DynMaxEnt) method for approximation of observables (moments) of the Fokker–Planck solution, which can be interpreted as a nonlinear Galerkin approximation. The limited applicability of the DynMaxEnt method inspires us to introduce its modified version that is valid for the whole range of admissible parameters. Finally, we present several numerical experiments to demonstrate the performance of both the original and modified DynMaxEnt methods. We observe that in the parameter regimes where both methods are valid, the modified one exhibits slightly better approximation properties compared to the original one.

  3. Well posedness and maximum entropy approximation for the dynamics of quantitative traits

    KAUST Repository

    Boďová, Katarína

    2017-11-06

    We study the Fokker–Planck equation derived in the large system limit of the Markovian process describing the dynamics of quantitative traits. The Fokker–Planck equation is posed on a bounded domain and its transport and diffusion coefficients vanish on the domain’s boundary. We first argue that, despite this degeneracy, the standard no-flux boundary condition is valid. We derive the weak formulation of the problem and prove the existence and uniqueness of its solutions by constructing the corresponding contraction semigroup on a suitable function space. Then, we prove that for the parameter regime with high enough mutation rate the problem exhibits a positive spectral gap, which implies exponential convergence to equilibrium. Next, we provide a simple derivation of the so-called Dynamic Maximum Entropy (DynMaxEnt) method for approximation of observables (moments) of the Fokker–Planck solution, which can be interpreted as a nonlinear Galerkin approximation. The limited applicability of the DynMaxEnt method inspires us to introduce its modified version that is valid for the whole range of admissible parameters. Finally, we present several numerical experiments to demonstrate the performance of both the original and modified DynMaxEnt methods. We observe that in the parameter regimes where both methods are valid, the modified one exhibits slightly better approximation properties compared to the original one.

  4. A Revision of Clausius Work on the Second Law. 1. On the Lack of Inner Consistency of Clausius Analysis Leading to the Law of Increasing Entropy

    Directory of Open Access Journals (Sweden)

    José C. Iñiguez

    1999-10-01

    Full Text Available Abstract: This paper, the first in a series of four, will expose the lack of inner consistency of the analysis through which Clausius re-expressed the second law of thermodynamics: "Heat cannot, of itself, pass from a colder to a hotter body", as the law of increasing entropy: "The entropy of the universe tends to a maximum". In the two following papers the flaw in Clausius analysis producing the said lack of consistency will be located, corrected and some of its consequences, discussed. Among them the one stating that the identification of the two above written statements of the second law is valid only under certain circumstances. In the fourth and final

  5. The mechanics of granitoid systems and maximum entropy production rates.

    Science.gov (United States)

    Hobbs, Bruce E; Ord, Alison

    2010-01-13

    A model for the formation of granitoid systems is developed involving melt production spatially below a rising isotherm that defines melt initiation. Production of the melt volumes necessary to form granitoid complexes within 10(4)-10(7) years demands control of the isotherm velocity by melt advection. This velocity is one control on the melt flux generated spatially just above the melt isotherm, which is the control valve for the behaviour of the complete granitoid system. Melt transport occurs in conduits initiated as sheets or tubes comprising melt inclusions arising from Gurson-Tvergaard constitutive behaviour. Such conduits appear as leucosomes parallel to lineations and foliations, and ductile and brittle dykes. The melt flux generated at the melt isotherm controls the position of the melt solidus isotherm and hence the physical height of the Transport/Emplacement Zone. A conduit width-selection process, driven by changes in melt viscosity and constitutive behaviour, operates within the Transport Zone to progressively increase the width of apertures upwards. Melt can also be driven horizontally by gradients in topography; these horizontal fluxes can be similar in magnitude to vertical fluxes. Fluxes induced by deformation can compete with both buoyancy and topographic-driven flow over all length scales and results locally in transient 'ponds' of melt. Pluton emplacement is controlled by the transition in constitutive behaviour of the melt/magma from elastic-viscous at high temperatures to elastic-plastic-viscous approaching the melt solidus enabling finite thickness plutons to develop. The system involves coupled feedback processes that grow at the expense of heat supplied to the system and compete with melt advection. The result is that limits are placed on the size and time scale of the system. Optimal characteristics of the system coincide with a state of maximum entropy production rate. This journal is © 2010 The Royal Society

  6. Tail Risk Constraints and Maximum Entropy

    Directory of Open Access Journals (Sweden)

    Donald Geman

    2015-06-01

    Full Text Available Portfolio selection in the financial literature has essentially been analyzed under two central assumptions: full knowledge of the joint probability distribution of the returns of the securities that will comprise the target portfolio; and investors’ preferences are expressed through a utility function. In the real world, operators build portfolios under risk constraints which are expressed both by their clients and regulators and which bear on the maximal loss that may be generated over a given time period at a given confidence level (the so-called Value at Risk of the position). Interestingly, in the finance literature, a serious discussion of how much or little is known from a probabilistic standpoint about the multi-dimensional density of the assets’ returns seems to be of limited relevance. Our approach in contrast is to highlight these issues and then adopt throughout a framework of entropy maximization to represent the real world ignorance of the “true” probability distributions, both univariate and multivariate, of traded securities’ returns. In this setting, we identify the optimal portfolio under a number of downside risk constraints. Two interesting results are exhibited: (i) the left-tail constraints are sufficiently powerful to override all other considerations in the conventional theory; (ii) the “barbell portfolio” (maximal certainty/low risk in one set of holdings, maximal uncertainty in another), which is quite familiar to traders, naturally emerges in our construction.

  7. Maximum entropy approach to H-theory: Statistical mechanics of hierarchical systems.

    Science.gov (United States)

    Vasconcelos, Giovani L; Salazar, Domingos S P; Macêdo, A M S

    2018-02-01

    A formalism, called H-theory, is applied to the problem of statistical equilibrium of a hierarchical complex system with multiple time and length scales. In this approach, the system is formally treated as being composed of a small subsystem-representing the region where the measurements are made-in contact with a set of "nested heat reservoirs" corresponding to the hierarchical structure of the system, where the temperatures of the reservoirs are allowed to fluctuate owing to the complex interactions between degrees of freedom at different scales. The probability distribution function (pdf) of the temperature of the reservoir at a given scale, conditioned on the temperature of the reservoir at the next largest scale in the hierarchy, is determined from a maximum entropy principle subject to appropriate constraints that describe the thermal equilibrium properties of the system. The marginal temperature distribution of the innermost reservoir is obtained by integrating over the conditional distributions of all larger scales, and the resulting pdf is written in analytical form in terms of certain special transcendental functions, known as the Fox H functions. The distribution of states of the small subsystem is then computed by averaging the quasiequilibrium Boltzmann distribution over the temperature of the innermost reservoir. This distribution can also be written in terms of H functions. The general family of distributions reported here recovers, as particular cases, the stationary distributions recently obtained by Macêdo et al. [Phys. Rev. E 95, 032315 (2017)10.1103/PhysRevE.95.032315] from a stochastic dynamical approach to the problem.

  8. EEG entropy measures in anesthesia

    Science.gov (United States)

    Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J.; Sleigh, Jamie W.; Hagihira, Satoshi; Li, Xiaoli

    2015-01-01

    Highlights: ► Twelve entropy indices were systematically compared in monitoring depth of anesthesia and detecting burst suppression.► Renyi permutation entropy performed best in tracking EEG changes associated with different anesthesia states.► Approximate Entropy and Sample Entropy performed best in detecting burst suppression. Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing anesthesia drugs' effect is lacking. In this study, we compare the capability of 12 entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP), in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State entropy (SE), three wavelet entropy (WE) measures [Shannon WE (SWE), Tsallis WE (TWE), and Renyi WE (RWE)], Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), Fuzzy entropy, and three permutation entropy (PE) measures [Shannon PE (SPE), Tsallis PE (TPE) and Renyi PE (RPE)]. Two EEG data sets from sevoflurane-induced and isoflurane-induced anesthesia respectively were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability (Pk) analysis were applied. The multifractal detrended fluctuation analysis (MDFA) as a non-entropy measure was compared. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. Three PE measures outperformed the other entropy indices, with less baseline variability, higher coefficient of determination (R2) and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, these entropy measures showed an advantage in computation

  9. EEG entropy measures in anesthesia

    Directory of Open Access Journals (Sweden)

    Zhenhu eLiang

    2015-02-01

    Full Text Available Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing anesthesia drugs’ effect is lacking. In this study, we compare the capability of twelve entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP), in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State entropy (SE), three wavelet entropy (WE) measures (Shannon WE (SWE), Tsallis WE (TWE) and Renyi WE (RWE)), Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), Fuzzy entropy, and three permutation entropy (PE) measures (Shannon PE (SPE), Tsallis PE (TPE) and Renyi PE (RPE)). Two EEG data sets from sevoflurane-induced and isoflurane-induced anesthesia respectively were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability analysis were applied. The multifractal detrended fluctuation analysis (MDFA) as a non-entropy measure was compared. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. Three PE measures outperformed the other entropy indices, with less baseline variability, higher coefficient of determination and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, these entropy measures showed an advantage in computation efficiency compared with MDFA. Conclusion: Each entropy index has its advantages and disadvantages in estimating DoA. Overall, it is suggested that the RPE index was a superior measure. Significance: Investigating the advantages and disadvantages of these entropy indices could help improve current clinical indices for monitoring DoA.

  10. Conservation analysis of dengue virust-cell epitope-based vaccine candidates using peptide block entropy

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Zhang, Guang Lan; Keskin, Derin B.

    2011-01-01

    residues. The block entropy analysis provides broad coverage of variant antigens. We applied the block entropy analysis method to the proteomes of the four serotypes of dengue virus (DENV) and found 1,551 blocks of 9-mer peptides, which cover 99% of available sequences with five or fewer unique peptides...

  11. Downstream-Conditioned Maximum Entropy Method for Exit Boundary Conditions in the Lattice Boltzmann Method

    Directory of Open Access Journals (Sweden)

    Javier A. Dottori

    2015-01-01

    Full Text Available A method for modeling outflow boundary conditions in the lattice Boltzmann method (LBM based on the maximization of the local entropy is presented. The maximization procedure is constrained by macroscopic values and downstream components. The method is applied to fully developed boundary conditions of the Navier-Stokes equations in rectangular channels. Comparisons are made with other alternative methods. In addition, the new downstream-conditioned entropy is studied and it was found that there is a correlation with the velocity gradient during the flow development.

  12. Entropy generation analysis of an adsorption cooling cycle

    KAUST Repository

    Thu, Kyaw

    2013-05-01

    This paper discusses the analysis of an adsorption (AD) chiller using system entropy generation as a thermodynamic framework for evaluating total dissipative losses that occurred in a batch-operated AD cycle. The study focuses on an adsorption cycle operating at heat source temperatures ranging from 60 to 85 °C, whilst the chilled water inlet temperature is fixed at 12.5 °C,-a temperature of chilled water deemed useful for dehumidification and cooling. The total entropy generation model examines the processes of key components of the AD chiller such as the heat and mass transfer, flushing and de-superheating of liquid refrigerant. The following key findings are observed: (i) The cycle entropy generation increases with the increase in the heat source temperature (10.8 to 46.2 W/K) and the largest share of entropy generation or rate of energy dissipation occurs at the adsorption process, (ii) the second highest energy rate dissipation is the desorption process, (iii) the remaining energy dissipation rates are the evaporation and condensation processes, respectively. Some of the noteworthy highlights from the study are the inevitable but significant dissipative losses found in switching processes of adsorption-desorption and vice versa, as well as the de-superheating of warm condensate that is refluxed at non-thermal equilibrium conditions from the condenser to the evaporator for the completion of the refrigeration cycle. © 2012 Elsevier Ltd. All rights reserved.

  13. Horton Ratios Link Self-Similarity with Maximum Entropy of Eco-Geomorphological Properties in Stream Networks

    Directory of Open Access Journals (Sweden)

    Bruce T. Milne

    2017-05-01

    Full Text Available Stream networks are branched structures wherein water and energy move between land and atmosphere, modulated by evapotranspiration and its interaction with the gravitational dissipation of potential energy as runoff. These actions vary among climates characterized by Budyko theory, yet have not been integrated with Horton scaling, the ubiquitous pattern of eco-hydrological variation among Strahler streams that populate river basins. From Budyko theory, we reveal optimum entropy coincident with high biodiversity. Basins on either side of optimum respond in opposite ways to precipitation, which we evaluated for the classic Hubbard Brook experiment in New Hampshire and for the Whitewater River basin in Kansas. We demonstrate that Horton ratios are equivalent to Lagrange multipliers used in the extremum function leading to Shannon information entropy being maximal, subject to constraints. Properties of stream networks vary with constraints and inter-annual variation in water balance that challenge vegetation to match expected resource supply throughout the network. The entropy-Horton framework informs questions of biodiversity, resilience to perturbations in water supply, changes in potential evapotranspiration, and land use changes that move ecosystems away from optimal entropy with concomitant loss of productivity and biodiversity.

  14. Weighted fractional permutation entropy and fractional sample entropy for nonlinear Potts financial dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Kaixuan, E-mail: kaixuanxubjtu@yeah.net; Wang, Jun

    2017-02-26

    In this paper, recently introduced permutation entropy and sample entropy are further developed to the fractional cases, weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional order generalization of information entropy is utilized in the above two complexity approaches, to detect the statistical characteristics of fractional order information in complex systems. The effectiveness analysis of proposed methods on the synthetic data and the real-world data reveals that tuning the fractional order allows a high sensitivity and more accurate characterization of the signal evolution, which is useful in describing the dynamics of complex systems. Moreover, the numerical research on nonlinear complexity behaviors is compared between the returns series of Potts financial model and the actual stock markets. And the empirical results confirm the feasibility of the proposed model. - Highlights: • Two new entropy approaches for estimation of nonlinear complexity are proposed for the financial market. • Effectiveness analysis of proposed methods is presented and their respective features are studied. • Empirical research of proposed analysis on seven world financial market indices. • Numerical simulation of Potts financial dynamics is performed for nonlinear complexity behaviors.

  15. Weighted fractional permutation entropy and fractional sample entropy for nonlinear Potts financial dynamics

    International Nuclear Information System (INIS)

    Xu, Kaixuan; Wang, Jun

    2017-01-01

    In this paper, recently introduced permutation entropy and sample entropy are further developed to the fractional cases, weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional order generalization of information entropy is utilized in the above two complexity approaches, to detect the statistical characteristics of fractional order information in complex systems. The effectiveness analysis of proposed methods on the synthetic data and the real-world data reveals that tuning the fractional order allows a high sensitivity and more accurate characterization of the signal evolution, which is useful in describing the dynamics of complex systems. Moreover, the numerical research on nonlinear complexity behaviors is compared between the returns series of Potts financial model and the actual stock markets. And the empirical results confirm the feasibility of the proposed model. - Highlights: • Two new entropy approaches for estimation of nonlinear complexity are proposed for the financial market. • Effectiveness analysis of proposed methods is presented and their respective features are studied. • Empirical research of proposed analysis on seven world financial market indices. • Numerical simulation of Potts financial dynamics is performed for nonlinear complexity behaviors.

  16. The criteria for selecting a method for unfolding neutron spectra based on the information entropy theory

    International Nuclear Information System (INIS)

    Zhu, Qingjun; Song, Fengquan; Ren, Jie; Chen, Xueyong; Zhou, Bin

    2014-01-01

    To further expand the application of an artificial neural network in the field of neutron spectrometry, the criteria for choosing between an artificial neural network and the maximum entropy method for the purpose of unfolding neutron spectra was presented. The counts of the Bonner spheres for IAEA neutron spectra were used as a database, and the artificial neural network and the maximum entropy method were used to unfold neutron spectra; the mean squares of the spectra were defined as the differences between the desired and unfolded spectra. After the information entropy of each spectrum was calculated using information entropy theory, the relationship between the mean squares of the spectra and the information entropy was acquired. Useful information from the information entropy guided the selection of unfolding methods. Due to the importance of the information entropy, the method for predicting the information entropy using the Bonner spheres' counts was established. The criteria based on the information entropy theory can be used to choose between the artificial neural network and the maximum entropy method unfolding methods. The application of an artificial neural network to unfold neutron spectra was expanded. - Highlights: • Two neutron spectra unfolding methods, ANN and MEM, were compared. • The spectrum's entropy offers useful information for selecting unfolding methods. • For the spectrum with low entropy, the ANN was generally better than MEM. • The spectrum's entropy was predicted based on the Bonner spheres' counts
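
    The selection criterion above hinges on the Shannon information entropy of a normalized neutron spectrum; a minimal sketch with a made-up group-flux spectrum:

```python
import numpy as np

# hypothetical group fluxes of a neutron spectrum (arbitrary units)
phi = np.array([0.5, 2.0, 8.0, 15.0, 9.0, 4.0, 1.5, 0.5])

p = phi / phi.sum()                        # normalize to a probability distribution
entropy = -np.sum(p * np.log(p))           # Shannon information entropy of the spectrum
print(f"spectrum entropy = {entropy:.3f} nats (maximum possible {np.log(p.size):.3f})")
```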

  17. Entropy - Some Cosmological Questions Answered by Model of Expansive Nondecelerative Universe

    Directory of Open Access Journals (Sweden)

    Miroslav Sukenik

    2003-01-01

    Full Text Available Abstract: The paper summarizes the background of the Expansive Nondecelerative Universe model and its potential to offer answers to some open cosmological questions related to entropy. Three problems are examined in more detail, namely Hawking's phenomenon of black hole evaporation, the maximum entropy of the Universe during its evolution, and the time evolution of specific entropy.

  18. Towards operational interpretations of generalized entropies

    Science.gov (United States)

    Topsøe, Flemming

    2010-12-01

    The driving force behind our study has been to overcome the difficulties you encounter when you try to extend the clear and convincing operational interpretations of classical Boltzmann-Gibbs-Shannon entropy to other notions, especially to generalized entropies as proposed by Tsallis. Our approach is philosophical, based on speculations regarding the interplay between truth, belief and knowledge. The main result demonstrates that, accepting philosophically motivated assumptions, the only possible measures of entropy are those suggested by Tsallis - which, as we know, include classical entropy. This result constitutes, so it seems, a more transparent interpretation of entropy than previously available. However, further research to clarify the assumptions is still needed. Our study points to the thesis that one should never consider the notion of entropy in isolation - in order to enable a rich and technically smooth study, further concepts, such as divergence, score functions and descriptors or controls should be included in the discussion. This will clarify the distinction between Nature and Observer and facilitate a game theoretical discussion. The usefulness of this distinction and the subsequent exploitation of game theoretical results - such as those connected with the notion of Nash equilibrium - is demonstrated by a discussion of the Maximum Entropy Principle.

  19. Towards operational interpretations of generalized entropies

    International Nuclear Information System (INIS)

    Topsoee, Flemming

    2010-01-01

    The driving force behind our study has been to overcome the difficulties you encounter when you try to extend the clear and convincing operational interpretations of classical Boltzmann-Gibbs-Shannon entropy to other notions, especially to generalized entropies as proposed by Tsallis. Our approach is philosophical, based on speculations regarding the interplay between truth, belief and knowledge. The main result demonstrates that, accepting philosophically motivated assumptions, the only possible measures of entropy are those suggested by Tsallis - which, as we know, include classical entropy. This result constitutes, so it seems, a more transparent interpretation of entropy than previously available. However, further research to clarify the assumptions is still needed. Our study points to the thesis that one should never consider the notion of entropy in isolation - in order to enable a rich and technically smooth study, further concepts, such as divergence, score functions and descriptors or controls should be included in the discussion. This will clarify the distinction between Nature and Observer and facilitate a game theoretical discussion. The usefulness of this distinction and the subsequent exploitation of game theoretical results - such as those connected with the notion of Nash equilibrium - is demonstrated by a discussion of the Maximum Entropy Principle.

  20. Texture Analysis Using Rényi’s Generalized Entropies

    NARCIS (Netherlands)

    Grigorescu, S.E.; Petkov, N.

    2003-01-01

    We propose a texture analysis method based on Rényi’s generalized entropies. The method aims at identifying texels in regular textures by searching for the smallest window through which the minimum number of different visual patterns is observed when moving the window over a given texture. The
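
    The abstract does not give the implementation, but the quantity it relies on can be sketched as follows: the Rényi entropy of the distribution of distinct visual patterns seen through a sliding window. The window size, the order alpha and the toy checkerboard texture below are illustrative assumptions, not the authors' settings.

        import numpy as np
        from collections import Counter

        def renyi_entropy(probs, alpha):
            """Rényi entropy of order alpha; alpha -> 1 recovers Shannon entropy."""
            p = np.asarray(probs, dtype=float)
            p = p[p > 0]
            if np.isclose(alpha, 1.0):
                return -np.sum(p * np.log(p))
            return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

        def window_pattern_entropy(image, win, alpha=2.0):
            """Entropy of the distribution of distinct win x win patterns seen
            while sliding a window over a 2-D integer-valued texture."""
            counts = Counter()
            rows, cols = image.shape
            for i in range(rows - win + 1):
                for j in range(cols - win + 1):
                    counts[image[i:i + win, j:j + win].tobytes()] += 1
            total = sum(counts.values())
            return renyi_entropy([c / total for c in counts.values()], alpha)

        # A perfectly periodic (checkerboard) texture yields few distinct
        # patterns and hence a low pattern entropy for small windows.
        tex = np.indices((16, 16)).sum(axis=0) % 2
        print(window_pattern_entropy(tex, win=2))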

  1. Relations Among Some Fuzzy Entropy Formulae

    Institute of Scientific and Technical Information of China (English)

    卿铭

    2004-01-01

    Fuzzy entropy has been widely used to analyze and design fuzzy systems, and many fuzzy entropy formulae have been proposed. For further in-depth analysis of fuzzy entropy, the axioms and some important formulae of fuzzy entropy are introduced. Some equivalence results among these fuzzy entropy formulae are proved, and it is shown that fuzzy entropy is a special distance measurement.

  2. Black hole entropy, curved space and monsters

    International Nuclear Information System (INIS)

    Hsu, Stephen D.H.; Reeb, David

    2008-01-01

    We investigate the microscopic origin of black hole entropy, in particular the gap between the maximum entropy of ordinary matter and that of black holes. Using curved space, we construct configurations with entropy greater than the area A of a black hole of equal mass. These configurations have pathological properties and we refer to them as monsters. When monsters are excluded we recover the entropy bound on ordinary matter, S < A^(3/4). This bound implies that essentially all of the microstates of a semiclassical black hole are associated with the growth of a slightly smaller black hole which absorbs some additional energy. Our results suggest that the area entropy of black holes is the logarithm of the number of distinct ways in which one can form the black hole from ordinary matter and smaller black holes, but only after the exclusion of monster states.

  3. Entropy and Multifractality in Relativistic Ion-Ion Collisions

    Directory of Open Access Journals (Sweden)

    Shaista Khan

    2018-01-01

    Full Text Available Entropy production in multiparticle systems is investigated by analyzing the experimental data on ion-ion collisions at AGS and SPS energies and comparing the findings with those reported earlier for hadron-hadron, hadron-nucleus, and nucleus-nucleus collisions. It is observed that the entropy produced in limited and full phase space, when normalized to maximum rapidity, exhibits a kind of scaling which is nicely supported by the Monte Carlo model HIJING. Using Rényi’s order q information entropy, multifractal characteristics of particle production are examined in terms of generalized dimensions, Dq. Nearly the same values of multifractal specific heat, c, observed in hadronic and ion-ion collisions over a wide range of incident energies suggest that the quantity c might be used as a universal characteristic of multiparticle production in hadron-hadron, hadron-nucleus, and nucleus-nucleus collisions. The analysis is extended to the study of the spectrum of scaling indices. The findings reveal that Rényi’s order q information entropy could be another way to investigate the fluctuations in multiplicity distributions in terms of the spectral function f(α), which has been argued to be a convenient function for comparison's sake, not only among different experiments but also between the data and theoretical models.

  4. Empirical study on entropy models of cellular manufacturing systems

    Institute of Scientific and Technical Information of China (English)

    Zhifeng Zhang; Renbin Xiao

    2009-01-01

    From the theoretical point of view, the states of manufacturing resources can be monitored and assessed through the amount of information needed to describe their technological structure and operational state. The amount of information needed to describe cellular manufacturing systems is investigated by two measures: the structural entropy and the operational entropy. Based on the Shannon entropy, the models of the structural entropy and the operational entropy of cellular manufacturing systems are developed, and the cognizance of the states of manufacturing resources is also illustrated. Scheduling is introduced to measure the entropy models of cellular manufacturing systems, and the feasible concepts of maximum schedule horizon and schedule adherence are advanced to quantitatively evaluate the effectiveness of schedules. Finally, an example is used to demonstrate the validity of the proposed methodology.

  5. Entropy, Information Theory, Information Geometry and Bayesian Inference in Data, Signal and Image Processing and Inverse Problems

    Directory of Open Access Journals (Sweden)

    Ali Mohammad-Djafari

    2015-06-01

    Full Text Available The main content of this review article is first to review the main inference tools using Bayes rule, the maximum entropy principle (MEP), information theory, relative entropy and the Kullback–Leibler (KL) divergence, Fisher information and its corresponding geometries. For each of these tools, the precise context of their use is described. The second part of the paper is focused on the ways these tools have been used in data, signal and image processing and in the inverse problems which arise in different physical sciences and engineering applications. A few examples of the applications are described: entropy in independent component analysis (ICA) and in blind source separation, Fisher information in data model selection, different maximum entropy-based methods in time series spectral estimation and in linear inverse problems and, finally, the Bayesian inference for general inverse problems. Some original materials concerning the approximate Bayesian computation (ABC) and, in particular, the variational Bayesian approximation (VBA) methods are also presented. VBA is used for proposing an alternative Bayesian computational tool to the classical Markov chain Monte Carlo (MCMC) methods. We will also see that VBA encompasses joint maximum a posteriori (MAP) estimation, as well as the different expectation-maximization (EM) algorithms, as particular cases.

  6. Entropy production in a box: Analysis of instabilities in confined hydrothermal systems

    Science.gov (United States)

    Börsing, N.; Wellmann, J. F.; Niederau, J.; Regenauer-Lieb, K.

    2017-09-01

    We evaluate if the concept of thermal entropy production can be used as a measure to characterize hydrothermal convection in a confined porous medium as a valuable, thermodynamically motivated addition to the standard Rayleigh number analysis. Entropy production has been used widely in the field of mechanical and chemical engineering as a way to characterize the thermodynamic state and irreversibility of an investigated system. Pioneering studies have since adapted these concepts to natural systems, and we apply this measure here to investigate the specific case of hydrothermal convection in a "box-shaped" confined porous medium, as a simplified analog for, e.g., hydrothermal convection in deep geothermal aquifers. We perform various detailed numerical experiments to assess the response of the convective system to changing boundary conditions or domain aspect ratios, and then determine the resulting entropy production for each experiment. In systems close to the critical Rayleigh number, we derive results that are in accordance to the analytically derived predictions. At higher Rayleigh numbers, however, we observe multiple possible convection modes, and the analysis of the integrated entropy production reveals distinct curves of entropy production that provide an insight into the hydrothermal behavior in the system, both for cases of homogeneous materials, as well as for heterogeneous spatial material distributions. We conclude that the average thermal entropy production characterizes the internal behavior of hydrothermal systems with a meaningful thermodynamic measure, and we expect that it can be useful for the investigation of convection systems in many similar hydrogeological and geophysical settings.

  7. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is first to review the main inference tools using Bayesian approach, Entropy, Information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, Fisher information, we will study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation and finally, general linear inverse problems.

  8. Optimization between heating load and entropy-production rate for endoreversible absorption heat-transformers

    International Nuclear Information System (INIS)

    Sun Fengrui; Qin Xiaoyong; Chen Lingen; Wu Chih

    2005-01-01

    For an endoreversible four-heat-reservoir absorption heat-transformer cycle, for which a linear (Newtonian) heat-transfer law applies, an ecological optimization criterion is proposed for the best mode of operation of the cycle. This involves maximizing a function representing the compromise between the heating load and the entropy-production rate. The optimal relation between the ecological criterion and the COP (coefficient of performance), the maximum ecological criterion and the corresponding COP, heating load and entropy production rate, as well as the ecological criterion and entropy-production rate at the maximum heating load are derived using finite-time thermodynamics. Moreover, compared with the heating-load criterion, the effects of the cycle parameters on the ecological performance are studied by numerical examples. These show that achieving the maximum ecological criterion makes the entropy-production rate decrease by 77.0% and the COP increase by 55.4% with only 27.3% heating-load losses compared with the maximum heating-load objective. The results reflect that the ecological criterion has long-term significance for optimal design of absorption heat-transformers

  9. A spatiotemporal dengue fever early warning model accounting for nonlinear associations with meteorological factors: a Bayesian maximum entropy approach

    Science.gov (United States)

    Lee, Chieh-Han; Yu, Hwa-Lung; Chien, Lung-Chang

    2014-05-01

    Dengue fever has been identified as one of the most widespread vector-borne diseases in tropical and sub-tropical regions. In the last decade, dengue has become an emerging infectious disease epidemic in Taiwan, especially in the southern area, where incidence is high every year. For the purpose of disease prevention and control, an early warning system is urgently needed. Previous studies have shown significant relationships between climate variables, in particular rainfall and temperature, and the temporal epidemic patterns of dengue cases. However, the transmission of dengue fever is a complex interactive process, and most studies have understated its composite space-time effects. This study proposes a one-week-ahead warning system for dengue fever epidemics in southern Taiwan that considers nonlinear associations between weekly dengue cases and meteorological factors across space and time. The early warning system is based on an integration of a distributed lag nonlinear model (DLNM) and stochastic Bayesian Maximum Entropy (BME) analysis. The study identified the most significant meteorological measures, namely weekly minimum temperature and maximum 24-hour rainfall with a continuous 15-week lag, in relation to variation in dengue cases under conditions of uncertainty. Subsequently, the combination of nonlinear lagged effects of climate variables and a space-time dependence function is implemented via a Bayesian framework to predict dengue fever occurrences in southern Taiwan during 2012. The results show that the early warning system is useful for providing spatio-temporal predictions of potential outbreaks of dengue fever. In conclusion, the proposed approach can provide a practical disease control tool for environmental regulators seeking more effective strategies for dengue fever prevention.

  10. An entropy approach for evaluating the maximum information content achievable by an urban rainfall network

    Directory of Open Access Journals (Sweden)

    E. Ridolfi

    2011-07-01

    Full Text Available Hydrological models are the basis of operational flood-forecasting systems. The accuracy of these models is strongly dependent on the quality and quantity of the input information represented by rainfall height. Finer space-time rainfall resolution results in more accurate hazard forecasting. In this framework, an optimum raingauge network is essential in predicting flood events.

    This paper develops an entropy-based approach to evaluate the maximum information content achievable by a rainfall network for different sampling time intervals. The procedure is based on the determination of the coefficients of transferred and nontransferred information and on the relative isoinformation contours.

    The nontransferred information value achieved by the whole network is strictly dependent on the sampling time intervals considered. To assess the objective of the research, an empirical curve is defined: the nontransferred information value is plotted versus the associated sampling time on a semi-log scale. The curve shows a linear trend.

    In this paper, the methodology is applied to the high-density raingauge network of the urban area of Rome.

  11. SpatEntropy: Spatial Entropy Measures in R

    OpenAIRE

    Altieri, Linda; Cocchi, Daniela; Roli, Giulia

    2018-01-01

    This article illustrates how to measure the heterogeneity of spatial data presenting a finite number of categories via computation of spatial entropy. The R package SpatEntropy contains functions for the computation of entropy and spatial entropy measures. The extension to spatial entropy measures is a unique feature of SpatEntropy. In addition to the traditional version of Shannon's entropy, the package includes Batty's spatial entropy, O'Neill's entropy, Li and Reynolds' contagion index, Ka...

  12. Analysis of complex time series using refined composite multiscale entropy

    International Nuclear Information System (INIS)

    Wu, Shuen-De; Wu, Chiu-Wen; Lin, Shiou-Gwo; Lee, Kung-Yen; Peng, Chung-Kang

    2014-01-01

    Multiscale entropy (MSE) is an effective algorithm for measuring the complexity of a time series that has been applied in many fields successfully. However, MSE may yield an inaccurate estimation of entropy or induce undefined entropy because the coarse-graining procedure reduces the length of a time series considerably at large scales. Composite multiscale entropy (CMSE) was recently proposed to improve the accuracy of MSE, but it does not resolve undefined entropy. Here we propose a refined composite multiscale entropy (RCMSE) to improve CMSE. For short time series analyses, we demonstrate that RCMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy.
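
    A compact sketch of the idea, under simplifying assumptions (history length m = 2, tolerance r = 0.15 of the standard deviation, Chebyshev distance): coarse-grain the series at every offset for a given scale, accumulate the template-match counts across offsets, and take a single logarithm of the pooled ratio. This is an illustration of the RCMSE recipe as summarized above, not the authors' reference code.

        import numpy as np

        def _coarse_grain(x, scale, offset):
            """Non-overlapping averages of x starting at the given offset."""
            n = (len(x) - offset) // scale
            return x[offset:offset + n * scale].reshape(n, scale).mean(axis=1)

        def _match_counts(y, m, r):
            """Counts of template pairs of length m and m + 1 whose Chebyshev
            distance stays below the tolerance r."""
            n = len(y)
            tm = np.array([y[i:i + m] for i in range(n - m)])
            tm1 = np.array([y[i:i + m + 1] for i in range(n - m)])
            c_m = c_m1 = 0
            for i in range(len(tm) - 1):
                c_m += np.sum(np.max(np.abs(tm[i + 1:] - tm[i]), axis=1) < r)
                c_m1 += np.sum(np.max(np.abs(tm1[i + 1:] - tm1[i]), axis=1) < r)
            return c_m, c_m1

        def rcmse(x, scale, m=2, r_factor=0.15):
            """Refined composite multiscale entropy at one scale factor: match
            counts are pooled over all coarse-graining offsets before the log."""
            x = np.asarray(x, dtype=float)
            r = r_factor * x.std()      # tolerance fixed from the original series
            num = den = 0
            for k in range(scale):
                c_m, c_m1 = _match_counts(_coarse_grain(x, scale, k), m, r)
                den += c_m
                num += c_m1
            return np.inf if num == 0 or den == 0 else -np.log(num / den)

        rng = np.random.default_rng(0)
        noise = rng.standard_normal(2000)
        print([round(rcmse(noise, s), 3) for s in (1, 2, 5)])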

  13. Gravitational entropies in LTB dust models

    International Nuclear Information System (INIS)

    Sussman, Roberto A; Larena, Julien

    2014-01-01

    We consider generic Lemaître–Tolman–Bondi (LTB) dust models to probe the gravitational entropy proposals of Clifton, Ellis and Tavakol (CET) and of Hosoya and Buchert (HB). We also consider a variant of the HB proposal based on a suitable quasi-local scalar weighted average. We show that the conditions for entropy growth for all proposals are directly related to a negative correlation of similar fluctuations of the energy density and Hubble scalar. While this correlation is evaluated locally for the CET proposal, it must be evaluated in a non-local domain dependent manner for the two HB proposals. By looking at the fulfilment of these conditions at the relevant asymptotic limits we are able to provide a well-grounded qualitative description of the full time evolution and radial asymptotic scaling of the three entropies in generic models. The following rigorous analytic results are obtained for the three proposals: (i) entropy grows when the density growing mode is dominant; (ii) all ever-expanding hyperbolic models reach a stable terminal equilibrium characterized by an inhomogeneous entropy maximum in their late time evolution; (iii) regions with decaying modes and collapsing elliptic models exhibit unstable equilibria associated with an entropy minimum; (iv) near singularities the CET entropy diverges while the HB entropies converge; (v) the CET entropy converges for all models in the radial asymptotic range, whereas the HB entropies only converge for models asymptotic to a Friedmann–Lemaître–Robertson–Walker background. The fact that different independent proposals yield fairly similar conditions for entropy production, time evolution and radial scaling in generic LTB models seems to suggest that their common notion of a ‘gravitational entropy’ may be a theoretically robust concept applicable to more general spacetimes. (paper)

  14. Adjoint entropy vs topological entropy

    OpenAIRE

    Giordano Bruno, Anna

    2012-01-01

    Recently the adjoint algebraic entropy of endomorphisms of abelian groups was introduced and studied. We generalize the notion of adjoint entropy to continuous endomorphisms of topological abelian groups. Indeed, the adjoint algebraic entropy is defined using the family of all finite-index subgroups, while we take only the subfamily of all open finite-index subgroups to define the topological adjoint entropy. This allows us to compare the (topological) adjoint entropy with the known topologic...

  15. The different paths to entropy

    International Nuclear Information System (INIS)

    Benguigui, L

    2013-01-01

    In order to understand how the complex concept of entropy emerged, we propose a trip into the past, reviewing the works of Clausius, Boltzmann, Gibbs and Planck. In particular, since Gibbs's work is not very well known we present a detailed analysis, recalling the three definitions of entropy that Gibbs gives. The introduction of entropy in quantum mechanics gives in a compact form all the classical definitions of entropy. Perhaps one of the most important aspects of entropy is to see it as a thermodynamic potential like the others proposed by Callen. The calculation of fluctuations in thermodynamic quantities is thus naturally related to entropy. We close with some remarks on entropy and irreversibility. (paper)

  16. Symbolic transfer entropy-based premature signal analysis

    International Nuclear Information System (INIS)

    Wang Jun; Yu Zheng-Feng

    2012-01-01

    In this paper, we use symbolic transfer entropy to study the coupling strength between premature signals. Numerical experiments show that three types of signal couplings are in the same direction. Among them, normal signal coupling is the strongest, followed by that of premature ventricular contractions, and that of atrial premature beats is the weakest. The T test shows that the entropies of the three signals are distinct. Symbolic transfer entropy requires less data, can distinguish the three types of signals and has very good computational efficiency. (interdisciplinary physics and related areas of science and technology)
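
    A minimal sketch of a symbolic transfer entropy estimator in the spirit described above: both series are mapped to ordinal (rank-order) symbols and the transfer entropy is estimated from plug-in probabilities with a history length of one symbol. The pattern order, the history length and the coupled toy signals are illustrative assumptions rather than the settings used in the paper.

        import numpy as np
        from collections import Counter
        from itertools import permutations

        def symbolize(x, order=3):
            """Map a series to ordinal-pattern (rank-order) symbols."""
            lookup = {p: i for i, p in enumerate(permutations(range(order)))}
            return [lookup[tuple(np.argsort(x[i:i + order]))]
                    for i in range(len(x) - order + 1)]

        def symbolic_transfer_entropy(source, target, order=3):
            """Transfer entropy (nats) from `source` to `target`, estimated on
            the symbol sequences with a one-symbol history."""
            s, t = symbolize(source, order), symbolize(target, order)
            n = min(len(s), len(t)) - 1
            p_joint = Counter((t[i + 1], t[i], s[i]) for i in range(n))
            p_joint = {k: v / n for k, v in p_joint.items()}
            p_tt, p_ts, p_t = Counter(), Counter(), Counter()
            for (x1, x0, y0), p in p_joint.items():
                p_tt[(x1, x0)] += p      # p(target_next, target_now)
                p_ts[(x0, y0)] += p      # p(target_now, source_now)
                p_t[x0] += p             # p(target_now)
            te = 0.0
            for (x1, x0, y0), p in p_joint.items():
                te += p * np.log((p / p_ts[(x0, y0)]) / (p_tt[(x1, x0)] / p_t[x0]))
            return te

        # Illustrative coupled series: `follow` is a noisy, one-step-delayed
        # copy of `drive`, so transfer from drive to follow should dominate.
        rng = np.random.default_rng(1)
        drive = rng.standard_normal(5000)
        follow = np.empty_like(drive)
        follow[0] = 0.0
        follow[1:] = drive[:-1] + 0.5 * rng.standard_normal(4999)
        print(symbolic_transfer_entropy(drive, follow))   # noticeably positive
        print(symbolic_transfer_entropy(follow, drive))   # much smaller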

  17. State fusion entropy for continuous and site-specific analysis of landslide stability changing regularities

    Science.gov (United States)

    Liu, Yong; Qin, Zhimeng; Hu, Baodan; Feng, Shuai

    2018-04-01

    Stability analysis is of great significance to landslide hazard prevention, especially analysis of dynamic stability. However, owing to unique landslide geological conditions, it is difficult for many existing stability analysis methods to assess continuous landslide stability and its changing regularities under a uniform criterion. Based on the relationship between displacement monitoring data, deformation states and landslide stability, a state fusion entropy method is herein proposed to derive landslide instability through a comprehensive multi-attribute entropy analysis of deformation states, which are defined by a proposed joint clustering method combining K-means and a cloud model. Taking the Xintan landslide as a detailed case study, the cumulative state fusion entropy presents an obvious increasing trend after the landslide entered the accelerative deformation stage, and historical maxima match landslide macroscopic deformation behaviours closely at key time nodes. Reasonable results are also obtained in its application to several other landslides in the Three Gorges Reservoir area in China. Combined with field surveys, state fusion entropy may serve for assessing landslide stability and judging landslide evolutionary stages.

  18. Upper entropy axioms and lower entropy axioms

    International Nuclear Information System (INIS)

    Guo, Jin-Li; Suo, Qi

    2015-01-01

    The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely, upper entropy axioms, inspired by axioms of metric spaces, and also formulate lower entropy axioms. We also develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of Shannon–Khinchin axioms and Tsallis axioms, while these conditions are stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case of satisfying our axioms. Moreover, different forms of information measures, such as Shannon entropy, Daroczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics

  19. Application of Markov chains-entropy to analysis of depositional environments

    Energy Technology Data Exchange (ETDEWEB)

    Men Guizhen; Shi Xiaohong; Zhao Shuzhi

    1989-01-01

    The paper systematically and comprehensively discussed application of Markov chains-entropy to analysis of depositional environments of the upper Carboniferous series Taiyuan Formation in Anjialing, Pingshuo open-cast mine, Shanxi. Definite geological meanings were given respectively to calculated values of transition probability matrix, extremity probability matrix, substitution matrix and the entropy. The lithologic successions of coarse-fine-coarse grained layers from bottom upwards in the coal-bearing series made up the general symmetric cyclic patterns. It was suggested that the coal-bearing strata deposited in the coal-forming environment in delta plain-littoral swamps. Quantitative study of cyclic visibility and variation of formation was conducted. The assemblage relation among stratigraphic sequences and the significance of predicting vertical change were emphasized. Results of study showed that overall analysis of Markov chains was an effective method for analysis of depositional environments of coal-bearing strata. 2 refs., 5 figs.
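
    The core computation behind a Markov chain analysis of a vertical succession can be sketched as follows: estimate the upward transition probability matrix from the observed lithologic sequence and compute the Shannon entropy of each row, with low row entropies indicating strongly ordered (cyclic) transitions. The facies codes and the succession below are hypothetical, not data from the Taiyuan Formation.

        import numpy as np

        def transition_matrix(sequence, states):
            """Upward transition probability matrix estimated from a vertical
            lithologic succession (counts of state i followed by state j)."""
            idx = {s: k for k, s in enumerate(states)}
            counts = np.zeros((len(states), len(states)))
            for a, b in zip(sequence[:-1], sequence[1:]):
                counts[idx[a], idx[b]] += 1
            return counts / counts.sum(axis=1, keepdims=True)

        def row_entropies(P):
            """Shannon entropy of each row of a transition matrix (in bits)."""
            logs = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
            return -(P * logs).sum(axis=1)

        # Hypothetical upward succession: S = sandstone, M = mudstone, C = coal.
        succession = list("SMSMCSMSMCSMCSMSMC")
        states = ["S", "M", "C"]
        P = transition_matrix(succession, states)
        print(P)
        print(row_entropies(P))   # low row entropy = strongly cyclic transitions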

  20. Entropy of adsorption of mixed surfactants from solutions onto the air/water interface

    Science.gov (United States)

    Chen, L.-W.; Chen, J.-H.; Zhou, N.-F.

    1995-01-01

    The partial molar entropy change for mixed surfactant molecules adsorbed from solution at the air/water interface has been investigated by surface thermodynamics based upon the experimental surface tension isotherms at various temperatures. Results for different surfactant mixtures of sodium dodecyl sulfate and sodium tetradecyl sulfate, decylpyridinium chloride and sodium alkylsulfonates have shown that the partial molar entropy changes for adsorption of the mixed surfactants were generally negative and decreased with increasing adsorption to a minimum near the maximum adsorption, and then increased abruptly. The entropy decrease can be explained by the adsorption-orientation of surfactant molecules in the adsorbed monolayer, and the abrupt entropy increase at the maximum adsorption is possibly due to the strong repulsion between the adsorbed molecules.

  1. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    Science.gov (United States)

    Almog, Assaf; Garlaschelli, Diego

    2014-09-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information.

  2. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    International Nuclear Information System (INIS)

    Almog, Assaf; Garlaschelli, Diego

    2014-01-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information. (paper)

  3. Detrended fluctuation analysis and Kolmogorov–Sinai entropy of electroencephalogram signals

    International Nuclear Information System (INIS)

    Lim, Jung Ho; Khang, Eun Joo; Lee, Tae Hyun; Kim, In Hye; Maeng, Seong Eun; Lee, Jae Woo

    2013-01-01

    We measured the electroencephalogram (EEG) of young students in the relaxed state and during mathematical activities. We applied detrended fluctuation analysis and the Kolmogorov–Sinai entropy (KSE) to the EEG signals. We found that the detrended fluctuation functions follow a power law with Hurst exponents larger than 1/2. The Hurst exponents were enhanced at all EEG channels during mathematical activities. The KSE in the relaxed state is larger than that during mathematical activities. These results indicate that the entropy is enhanced in the disordered state of the brain.

  4. Quantum Rényi relative entropies affirm universality of thermodynamics.

    Science.gov (United States)

    Misra, Avijit; Singh, Uttam; Bera, Manabendra Nath; Rajagopal, A K

    2015-10-01

    We formulate a complete theory of quantum thermodynamics in the Rényi entropic formalism exploiting the Rényi relative entropies, starting from the maximum entropy principle. In establishing the first and second laws of quantum thermodynamics, we have correctly identified accessible work and heat exchange in both equilibrium and nonequilibrium cases. The free energy (internal energy minus temperature times entropy) remains unaltered, when all the entities entering this relation are suitably defined. Exploiting Rényi relative entropies we have shown that this "form invariance" holds even beyond equilibrium and has profound operational significance in isothermal process. These results reduce to the Gibbs-von Neumann results when the Rényi entropic parameter α approaches 1. Moreover, it is shown that the universality of the Carnot statement of the second law is the consequence of the form invariance of the free energy, which is in turn the consequence of maximum entropy principle. Further, the Clausius inequality, which is the precursor to the Carnot statement, is also shown to hold based on the data processing inequalities for the traditional and sandwiched Rényi relative entropies. Thus, we find that the thermodynamics of nonequilibrium state and its deviation from equilibrium together determine the thermodynamic laws. This is another important manifestation of the concepts of information theory in thermodynamics when they are extended to the quantum realm. Our work is a substantial step towards formulating a complete theory of quantum thermodynamics and corresponding resource theory.

  5. Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series

    Science.gov (United States)

    Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael

    2014-01-01

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption however, is a major obstacle to the application of these estimators in neuroscience as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the computationally most heavy aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and

  6. Risk Contagion in Chinese Banking Industry: A Transfer Entropy-Based Analysis

    Directory of Open Access Journals (Sweden)

    Jianping Li

    2013-12-01

    Full Text Available What is the impact of a bank failure on the whole banking industry? To resolve this issue, the paper develops a transfer entropy-based method to determine the interbank exposure matrix between banks. This method constructs the interbank market structure by calculating the transfer entropy matrix from bank stock price sequences. This paper also evaluates the stability of the Chinese banking system by simulating the risk contagion process. This paper contributes to the literature on interbank contagion mainly in two ways. First, it establishes a convincing connection between the interbank market and transfer entropy, and exploits market information (stock prices) rather than presumptions to determine the interbank exposure matrix. Second, the empirical analysis provides an in-depth understanding of the stability of the current Chinese banking system.

  7. Application of the entropy generation minimization method to a solar heat exchanger: A pseudo-optimization design process based on the analysis of the local entropy generation maps

    International Nuclear Information System (INIS)

    Giangaspero, Giorgio; Sciubba, Enrico

    2013-01-01

    This paper presents an application of the entropy generation minimization method to the pseudo-optimization of the configuration of the heat exchange surfaces in a Solar Rooftile. An initial “standard” commercial configuration is gradually improved by introducing design changes aimed at the reduction of the thermodynamic losses due to heat transfer and fluid friction. Different geometries (pins, fins and others) are analysed with a commercial CFD (Computational Fluid Dynamics) code that also computes the local entropy generation rate. The design improvement process is carried out on the basis of a careful analysis of the local entropy generation maps and the rationale behind each step of the process is discussed in this perspective. The results are compared with other entropy generation minimization techniques available in the recent technical literature. It is found that the geometry with pin-fins has the best performance among the tested ones, and that the optimal pin array shape parameters (pitch and span) can be determined by a critical analysis of the integrated and local entropy maps and of the temperature contours. - Highlights: ► An entropy generation minimization method is applied to a solar heat exchanger. ► The approach is heuristic and leads to a pseudo-optimization process with CFD as main tool. ► The process is based on the evaluation of the local entropy generation maps. ► The geometry with pin-fins in general outperforms all other configurations. ► The entropy maps and temperature contours can be used to determine the optimal pin array design parameters

  8. Entropy, neutro-entropy and anti-entropy for neutrosophic information

    OpenAIRE

    Vasile Patrascu

    2017-01-01

    This approach presents a multi-valued representation of the neutrosophic information. It highlights the link between the bifuzzy information and neutrosophic one. The constructed deca-valued structure shows the neutrosophic information complexity. This deca-valued structure led to construction of two new concepts for the neutrosophic information: neutro-entropy and anti-entropy. These two concepts are added to the two existing: entropy and non-entropy. Thus, we obtained the following triad: e...

  9. Developing Soil Moisture Profiles Utilizing Remotely Sensed MW and TIR Based SM Estimates Through Principle of Maximum Entropy

    Science.gov (United States)

    Mishra, V.; Cruise, J. F.; Mecikalski, J. R.

    2015-12-01

    Developing accurate vertical soil moisture profiles with minimum input requirements is important to agricultural as well as land surface modeling. Earlier studies show that the principle of maximum entropy (POME) can be utilized to develop vertical soil moisture profiles with accuracy (MAE of about 1% for a monotonically dry profile; nearly 2% for monotonically wet profiles and 3.8% for mixed profiles) with minimum constraints (surface, mean and bottom soil moisture contents). In this study, the constraints for the vertical soil moisture profiles were obtained from remotely sensed data. Low-resolution (25 km) MW soil moisture estimates (AMSR-E) were downscaled to 4 km using a soil-evaporation-efficiency-index-based disaggregation approach. The downscaled MW soil moisture estimates served as a surface boundary condition, while 4 km resolution TIR-based Atmospheric Land Exchange Inverse (ALEXI) estimates provided the required mean root-zone soil moisture content. The bottom soil moisture content is assumed to be a soil-dependent constant. Multi-year (2002-2011) gridded profiles were developed for the southeastern United States using the POME method. The soil moisture profiles were compared to those generated by land surface models (the Land Information System, LIS) and an agricultural model (DSSAT), along with available NRCS SCAN sites in the study region. The end product, spatial soil moisture profiles, can be assimilated into agricultural and hydrologic models in lieu of precipitation for data-scarce regions.
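
    The full layered profile construction is not reproduced here, but the POME building block can be sketched as follows: the discrete maximum-entropy distribution over a set of candidate moisture values subject to a fixed mean takes an exponential form, whose multiplier can be found with SciPy's root finder. The depth-bin values and the target root-zone mean below are hypothetical.

        import numpy as np
        from scipy.optimize import brentq

        def maxent_weights(values, target_mean):
            """Maximum-entropy probabilities p_i over `values` with a fixed mean.

            POME gives p_i proportional to exp(-lam * v_i); the multiplier lam
            is obtained by solving mean(p) = target_mean."""
            v = np.asarray(values, dtype=float)

            def mean_for(lam):
                w = np.exp(-lam * v)
                p = w / w.sum()
                return p @ v

            lam = brentq(lambda l: mean_for(l) - target_mean, -50.0, 50.0)
            w = np.exp(-lam * v)
            return w / w.sum()

        # Hypothetical example: candidate moisture levels for depth bins and a
        # remotely sensed root-zone mean of 0.22 (volumetric).
        levels = np.linspace(0.05, 0.40, 8)
        p = maxent_weights(levels, 0.22)
        print(p.round(3), (p @ levels).round(3))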

  10. A Maximum Entropy-Based Chaotic Time-Variant Fragile Watermarking Scheme for Image Tampering Detection

    Directory of Open Access Journals (Sweden)

    Guo-Jheng Yang

    2013-08-01

    Full Text Available The fragile watermarking technique is used to protect intellectual property rights while also providing security and rigorous protection. In order to protect the copyright of the creators, it can be implanted in some representative text or totem. Because all of the media on the Internet are digital, protection has become a critical issue, and determining how to use digital watermarks to protect digital media is thus the topic of our research. This paper uses the Logistic map with parameter u = 4 to generate chaotic dynamic behavior with the maximum entropy 1. This approach increases the security and rigor of the protection. The main research target of information hiding is determining how to hide confidential data so that the naked eye cannot see the difference. Next, we introduce one method of information hiding. Generally speaking, if the image only goes through Arnold’s cat map and the Logistic map, it seems to lack sufficient security. Therefore, our emphasis is on controlling Arnold’s cat map and the initial value of the chaos system to undergo small changes and generate different chaos sequences. Thus, the current time is used to not only make encryption more stringent but also to enhance the security of the digital media.
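
    As a small illustration of why the fully chaotic Logistic map is attractive here: with parameter 4 (written u = 4 above), a thresholded orbit yields an almost perfectly balanced bit stream, so its empirical entropy approaches the stated maximum of 1 bit per symbol. The seed and sequence length are arbitrary; this is a sketch of the chaotic generator only, not the watermarking scheme itself.

        import numpy as np

        def logistic_sequence(x0, n, u=4.0):
            """Iterate the logistic map x -> u * x * (1 - x)."""
            xs = np.empty(n)
            x = x0
            for i in range(n):
                x = u * x * (1.0 - x)
                xs[i] = x
            return xs

        def binary_entropy_bits(bits):
            """Shannon entropy (bits) of a 0/1 sequence."""
            p1 = bits.mean()
            p = np.array([p1, 1.0 - p1])
            p = p[p > 0]
            return -(p * np.log2(p)).sum()

        xs = logistic_sequence(x0=0.123456, n=100_000)
        bits = (xs > 0.5).astype(int)      # threshold symbolization
        print(binary_entropy_bits(bits))   # close to the maximum of 1 bit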

  11. Maximum Kolmogorov-Sinai Entropy Versus Minimum Mixing Time in Markov Chains

    Science.gov (United States)

    Mihelich, M.; Dubrulle, B.; Paillard, D.; Kral, Q.; Faranda, D.

    2018-01-01

    We establish a link between the maximization of the Kolmogorov-Sinai entropy (KSE) and the minimization of the mixing time for general Markov chains. Since the maximization of the KSE is analytical and in general easier to compute than the mixing time, this link provides a new, faster method to approximate the minimum-mixing-time dynamics. This could be of interest in computer science and statistical physics for computations that use random walks on graphs that can be represented as Markov chains.
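
    The quantity being maximized can be sketched directly: for a finite Markov chain, the Kolmogorov-Sinai entropy equals the entropy rate computed from the transition matrix and its stationary distribution. The two-state chains below are illustrative; the slowly mixing (lazy) chain has a low KSE, while the uniform, fast-mixing chain attains the maximum log 2.

        import numpy as np

        def stationary_distribution(P):
            """Left eigenvector of P with eigenvalue 1, normalized to sum to 1."""
            vals, vecs = np.linalg.eig(P.T)
            pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
            return pi / pi.sum()

        def kolmogorov_sinai_entropy(P):
            """Entropy rate h = -sum_i pi_i sum_j P_ij log P_ij (in nats)."""
            P = np.asarray(P, dtype=float)
            pi = stationary_distribution(P)
            logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
            return float(-(pi[:, None] * P * logP).sum())

        # A lazy two-state chain (slow mixing, low KSE) versus a uniform chain
        # (fast mixing, maximal KSE = log 2).
        slow = np.array([[0.95, 0.05], [0.05, 0.95]])
        fast = np.array([[0.50, 0.50], [0.50, 0.50]])
        print(kolmogorov_sinai_entropy(slow),
              kolmogorov_sinai_entropy(fast), np.log(2))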

  12. Multivariate Generalized Multiscale Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Anne Humeau-Heurtier

    2016-11-01

    Full Text Available Multiscale entropy (MSE) was introduced in the 2000s to quantify systems’ complexity. MSE relies on (i) a coarse-graining procedure to derive a set of time series representing the system dynamics on different time scales; (ii) the computation of the sample entropy for each coarse-grained time series. A refined composite MSE (rcMSE)—based on the same steps as MSE—also exists. Compared to MSE, rcMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy for short time series. The multivariate versions of MSE (MMSE) and rcMSE (MrcMSE) have also been introduced. In the coarse-graining step used in MSE, rcMSE, MMSE, and MrcMSE, the mean value is used to derive representations of the original data at different resolutions. A generalization of MSE was recently published, using the computation of different moments in the coarse-graining procedure. However, so far, this generalization only exists for univariate signals. We therefore herein propose an extension of this generalized MSE to multivariate data. The multivariate generalized algorithms of MMSE and MrcMSE presented herein (MGMSE and MGrcMSE, respectively) are first analyzed through the processing of synthetic signals. We reveal that MGrcMSE shows better performance than MGMSE for short multivariate data. We then study the performance of MGrcMSE on two sets of short multivariate electroencephalograms (EEG) available in the public domain. We report that MGrcMSE may show better performance than MrcMSE in distinguishing different types of multivariate EEG data. MGrcMSE could therefore supplement MMSE or MrcMSE in the processing of multivariate datasets.

  13. Notes on entanglement entropy in string theory

    International Nuclear Information System (INIS)

    He, Song; Numasawa, Tokiro; Takayanagi, Tadashi; Watanabe, Kento

    2015-01-01

    In this paper, we study the conical entropy in string theory in the simplest setup of dividing the nine dimensional space into two halves. This corresponds to the leading quantum correction to the horizon entropy in string theory on the Rindler space. This entropy is also called the conical entropy and includes surface term contributions. We first derive a new simple formula of the conical entropy for any free higher spin fields. Then we apply this formula to computations of conical entropy in open and closed superstring. In our analysis of closed string, we study the twisted conical entropy defined by making use of string theory on Melvin backgrounds. This quantity is easier to calculate owing to the folding trick. Our analysis shows that the conical entropy in closed superstring is UV finite owing to the string scale cutoff.

  14. Fault Diagnosis Method Based on Information Entropy and Relative Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoming Xu

    2017-01-01

    Full Text Available In traditional principal component analysis (PCA), because the influence of differing dimensions (units) among the variables in a system is neglected, the selected principal components (PCs) often fail to be representative. While relative-transformation PCA is able to solve this problem, it is not easy to calculate the weight for each characteristic variable. To address this, this paper proposes a fault diagnosis method based on information entropy and Relative Principal Component Analysis. Firstly, the algorithm calculates the information entropy of each characteristic variable in the original dataset based on the information gain algorithm. Secondly, it standardizes every variable’s dimension in the dataset. Then, according to the information entropy, it allocates a weight to each standardized characteristic variable. Finally, it utilizes the established relative-principal-components model for fault diagnosis. Furthermore, simulation experiments based on the Tennessee Eastman process and Wine datasets demonstrate the feasibility and effectiveness of the new method.
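
    A hedged sketch of the weighting idea follows, with one substitution: the paper derives weights from an information gain algorithm, whereas this sketch uses the common entropy-weight formulation as a stand-in before running an ordinary PCA on the standardized, weighted data. All data and parameters are illustrative.

        import numpy as np

        def entropy_weights(X):
            """Entropy-based weight per column: low-entropy (more informative)
            columns receive larger weights (the usual entropy-weight method)."""
            P = X - X.min(axis=0) + 1e-12          # shift columns to be positive
            P = P / P.sum(axis=0)                  # column-wise proportions
            k = 1.0 / np.log(X.shape[0])
            e = -k * (P * np.log(P)).sum(axis=0)   # entropy in [0, 1]
            d = 1.0 - e                            # degree of diversification
            return d / d.sum()

        def weighted_pca(X, n_components=2):
            """Standardize, apply entropy weights, then ordinary PCA via SVD."""
            Z = (X - X.mean(axis=0)) / X.std(axis=0)
            Zw = Z * entropy_weights(X)
            _, _, Vt = np.linalg.svd(Zw, full_matrices=False)
            return Zw @ Vt[:n_components].T        # scores of the first PCs

        rng = np.random.default_rng(2)
        X = rng.normal(size=(200, 5))
        X[:, 0] += np.linspace(0, 3, 200)          # one variable carries a trend
        print(weighted_pca(X).shape)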

  15. Explaining the entropy concept and entropy components

    Directory of Open Access Journals (Sweden)

    Marko Popovic

    2018-04-01

    Full Text Available Total entropy of a thermodynamic system consists of two components: thermal entropy due to energy, and residual entropy due to molecular orientation. In this article, a three-step method for explaining entropy is suggested. Step one is to use a classical method to introduce thermal entropy S_TM as a function of temperature T and heat capacity at constant pressure Cp: S_TM = ∫(Cp/T) dT. Thermal entropy is the entropy due to uncertainty in the motion of molecules and vanishes at absolute zero (the zero-point energy state). It is also the measure of useless thermal energy that cannot be converted into useful work. The next step is to introduce residual entropy S0 as a function of the number of molecules N and the number of distinct orientations available to them in a crystal m: S0 = N kB ln m, where kB is the Boltzmann constant. Residual entropy quantifies the uncertainty in molecular orientation. Residual entropy, unlike thermal entropy, is independent of temperature and remains present at absolute zero. The third step is to show that thermal entropy and residual entropy add up to the total entropy of a thermodynamic system S: S = S0 + S_TM. This method of explanation should result in a better comprehension of residual entropy and thermal entropy, as well as of their similarities and differences. The new method was tested in teaching at the Faculty of Chemistry, University of Belgrade, Serbia. The results of the test show that the new method has the potential to improve the quality of teaching.
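
    The two components can be made concrete with a short numerical sketch: the thermal part as a trapezoidal integral of Cp/T over an illustrative (made-up) heat-capacity curve, and the residual part as N kB ln m per mole, which for m = 2 reproduces the familiar value of roughly 5.8 J/(mol K).

        import numpy as np

        k_B = 1.380649e-23      # Boltzmann constant, J/K
        N_A = 6.02214076e23     # Avogadro constant, 1/mol

        def thermal_entropy(T, Cp):
            """S_TM = integral of (Cp/T) dT, evaluated with the trapezoidal rule."""
            T, Cp = np.asarray(T, dtype=float), np.asarray(Cp, dtype=float)
            integrand = Cp / T
            return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T)))

        def residual_entropy_per_mole(m):
            """S0 = N kB ln m for one mole of molecules with m allowed orientations."""
            return N_A * k_B * np.log(m)

        # Illustrative, made-up heat-capacity curve from 10 K to 298 K.
        T = np.linspace(10.0, 298.0, 200)
        Cp = 0.1 * T                               # crude linear rise, J/(mol K)
        print(thermal_entropy(T, Cp))              # thermal component, J/(mol K)
        print(residual_entropy_per_mole(2))        # ~5.76 J/(mol K), CO-like case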

  16. Analysis of Entropy Generation in Flow of Methanol-Based Nanofluid in a Sinusoidal Wavy Channel

    Directory of Open Access Journals (Sweden)

    Muhammad Qasim

    2017-10-01

    Full Text Available The entropy generation due to heat transfer and fluid friction in mixed convective peristaltic flow of a methanol-Al2O3 nanofluid is examined. Maxwell’s thermal conductivity model is used in the analysis. Velocity and temperature profiles are utilized in the computation of the entropy generation number. The effects of the involved physical parameters on velocity, temperature, entropy generation number, and Bejan number are discussed and explained graphically.

  17. Entropy and Digital Installation

    Directory of Open Access Journals (Sweden)

    Susan Ballard

    2005-01-01

    Full Text Available This paper examines entropy as a process which introduces ideas of distributed materiality to digital installation. Beginning from an analysis of entropy as both force and probability measure within information theory and its extension in Rudolf Arnheim’s text “Entropy and Art”, it develops an argument for the positive rather than negative forces of entropy. The paper centres on a discussion of two recent works by New Zealand artists Ronnie van Hout (“On the Run”, Wellington City Gallery, NZ, 2004) and Alex Monteith (“Invisible Cities”, Physics Room Contemporary Art Space, Christchurch, NZ, 2004). Ballard suggests that entropy, rather than being a hindrance to understanding or a random chaotic force, discloses a necessary and material politics of noise present in digital installation.

  18. Entropy, neutro-entropy and anti-entropy for neutrosophic information

    OpenAIRE

    Vasile Patrascu

    2017-01-01

    This article shows a deca-valued representation of neutrosophic information in which are defined the following features: truth, falsity, weak truth, weak falsity, ignorance, contradiction, saturation, neutrality, ambiguity and hesitation. Using these features, there are constructed computing formulas for entropy, neutro-entropy and anti-entropy.

  19. Energy conservation and maximal entropy production in enzyme reactions.

    Science.gov (United States)

    Dobovišek, Andrej; Vitas, Marko; Brumen, Milan; Fajmut, Aleš

    2017-08-01

    A procedure for maximization of the density of entropy production in a single stationary two-step enzyme reaction is developed. Under the constraints of mass conservation, a fixed equilibrium constant of the reaction and fixed products of the forward and backward enzyme rate constants, the existence of a maximum in the density of entropy production is demonstrated. In the state with maximal density of entropy production, the optimal enzyme rate constants, the stationary concentrations of the substrate and the product, the stationary product yield as well as the stationary reaction flux are calculated. The test of whether these calculated values of the reaction parameters are consistent with their corresponding measured values is performed for the enzyme Glucose Isomerase. It is found that calculated and measured rate constants agree within an order of magnitude, whereas the calculated reaction flux and the product yield differ from their corresponding measured values by less than 20% and 5%, respectively. This indicates that the enzyme Glucose Isomerase, considered in a non-equilibrium stationary state, as found in experiments using continuous stirred-tank reactors, possibly operates close to the state with the maximum in the density of entropy production. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Beyond the second law entropy production and non-equilibrium systems

    CERN Document Server

    Lineweaver, Charles; Niven, Robert; Regenauer-Lieb, Klaus

    2014-01-01

    The Second Law, a cornerstone of thermodynamics, governs the average direction of dissipative, non-equilibrium processes. But it says nothing about their actual rates or the probability of fluctuations about the average. This interdisciplinary book, written and peer-reviewed by international experts, presents recent advances in the search for new non-equilibrium principles beyond the Second Law, and their applications to a wide range of systems across physics, chemistry and biology. Beyond The Second Law brings together traditionally isolated areas of non-equilibrium research and highlights potentially fruitful connections between them, with entropy production playing the unifying role. Key theoretical concepts include the Maximum Entropy Production principle, the Fluctuation Theorem, and the Maximum Entropy method of statistical inference. Applications of these principles are illustrated in such diverse fields as climatology, cosmology, crystal growth morphology, Earth system science, environmental physics, ...

  1. Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy

    Science.gov (United States)

    Yujun, Yang; Jianping, Li; Yimei, Yang

    This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) to analyze the short-range and long-range characteristics of financial time series, and then applies this method to time series of five properties of four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of Rényi entropy and the generalized Hurst exponent of five properties of the four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing the curves of the Rényi entropy and the profiles of the logarithmic distribution of MFDFA of five properties of the four stock indices, the 3MPAR method reveals some fluctuation characteristics of the financial time series and the stock markets. It also exposes richer information in the financial time series by comparing the profiles of the five properties of the four stock indices. In this paper, we focus not only on the multifractality of the time series but also on the fluctuation characteristics of the financial time series and the subtle differences among the time series of different properties. We find that financial time series are far more complex than reported in works that use only one property of the series.

  2. Electron density distribution in Si and Ge using multipole, maximum ...

    Indian Academy of Sciences (India)

    Si and Ge have been studied using multipole, maximum entropy method (MEM) and ... and electron density distribution using the currently available versatile ..... data should be subjected to maximum possible utility for the characterization of.

  3. Force-Time Entropy of Isometric Impulse.

    Science.gov (United States)

    Hsieh, Tsung-Yu; Newell, Karl M

    2016-01-01

    The relation between force and temporal variability in discrete impulse production has been viewed as independent (R. A. Schmidt, H. Zelaznik, B. Hawkins, J. S. Frank, & J. T. Quinn, 1979 ) or dependent on the rate of force (L. G. Carlton & K. M. Newell, 1993 ). Two experiments in an isometric single finger force task investigated the joint force-time entropy with (a) fixed time to peak force and different percentages of force level and (b) fixed percentage of force level and different times to peak force. The results showed that the peak force variability increased either with the increment of force level or through a shorter time to peak force that also reduced timing error variability. The peak force entropy and entropy of time to peak force increased on the respective dimension as the parameter conditions approached either maximum force or a minimum rate of force production. The findings show that force error and timing error are dependent but complementary when considered in the same framework with the joint force-time entropy at a minimum in the middle parameter range of discrete impulse.

  4. Variations mechanism in entropy of wave height field and its relation with thermodynamic entropy

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper gives a brief description of the annual period and seasonal variation in the wave height field entropy in the northeastern Pacific. A calculation of the quantity of the sun's radiation heat received by lithosphere systems in the northern hemisphere is introduced. The wave height field entropy is compared with the difference in the quantity of the sun's radiation heat. Analysis of the transfer method, period and lag of this seasonal variation led to the conclusion that the annual period and seasonal variation in the entropy of the wave height field in the Northwestern Pacific is due to the seasonal variation of the sun's radiation heat. Furthermore, the inconsistency between thermodynamic entropy and information entropy was studied.

  5. Packer Detection for Multi-Layer Executables Using Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Munkhbayar Bat-Erdene

    2017-03-01

    Full Text Available Packing algorithms are broadly used to avoid anti-malware systems, and the proportion of packed malware has been growing rapidly. However, just a few studies have been conducted on detection various types of packing algorithms in a systemic way. Following this understanding, we elaborate a method to classify packing algorithms of a given executable into three categories: single-layer packing, re-packing, or multi-layer packing. We convert entropy values of the executable file loaded into memory into symbolic representations, for which we used SAX (Symbolic Aggregate Approximation. Based on experiments of 2196 programs and 19 packing algorithms, we identify that precision (97.7%, accuracy (97.5%, and recall ( 96.8% of our method are respectively high to confirm that entropy analysis is applicable in identifying packing algorithms.
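
    The core of such an approach can be sketched as follows: compute the Shannon entropy of byte values over sliding windows and then symbolize the resulting entropy profile in a SAX-like way. This is a minimal sketch under assumed parameters (window size, 4-letter alphabet, Gaussian breakpoints) and a fabricated byte buffer, not the paper's implementation.

```python
import numpy as np

def window_entropy(data: bytes, win=256, step=128):
    """Shannon entropy (bits) of byte values in sliding windows."""
    ent = []
    for start in range(0, max(len(data) - win + 1, 1), step):
        chunk = np.frombuffer(data[start:start + win], dtype=np.uint8)
        counts = np.bincount(chunk, minlength=256)
        p = counts[counts > 0] / counts.sum()
        ent.append(float(-(p * np.log2(p)).sum()))
    return np.array(ent)

def sax_symbols(series, alphabet="abcd"):
    """Map a z-normalised series to SAX symbols using Gaussian breakpoints."""
    z = (series - series.mean()) / (series.std() + 1e-12)
    breakpoints = np.array([-0.6745, 0.0, 0.6745])   # quartiles of a standard normal
    idx = np.searchsorted(breakpoints, z)
    return "".join(alphabet[i] for i in idx)

# in practice `data` would be the bytes of the executable image loaded in memory;
# here we fabricate a buffer with a low-entropy header and a high-entropy "packed" body
rng = np.random.default_rng(0)
data = bytes(1024) + rng.integers(0, 256, 8192, dtype=np.uint8).tobytes()

profile = window_entropy(data)
print(sax_symbols(profile))
```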

  6. A review of entropy generation in microchannels

    Directory of Open Access Journals (Sweden)

    Mohamed M Awad

    2015-12-01

    In this study, a critical review of the thermodynamic optimization of microchannels based on entropy generation analysis is presented. The use of entropy generation analysis as an evaluation parameter for microchannels has been reported in many studies in the literature. In these studies, different working fluids such as nanofluids, air, water, engine oil, aniline, ethylene glycol, and non-Newtonian fluids have been used. For the case of nanofluids, nanoparticles of various kinds, such as Al2O3 and Cu, and base fluids of various kinds, such as water and ethylene glycol, have been used. Furthermore, studies on the thermodynamic optimization of microchannels based on entropy generation analysis are summarized in a table. At the end, recommendations for future work on the thermodynamic optimization of microchannels based on entropy generation analysis are given. As a result, this article can not only be used as a starting point for researchers interested in entropy generation in microchannels, but it also includes recommendations for future studies on entropy generation in microchannels.

  7. Towards an entropy-based analysis of log variability

    DEFF Research Database (Denmark)

    Back, Christoffer Olling; Debois, Søren; Slaats, Tijs

    2017-01-01

    the development of hybrid miners: given a (sub-)log, can we determine a priori whether the log is best suited for imperative or declarative mining? We propose using the concept of entropy, commonly used in information theory. We consider different measures for entropy that could be applied and show through...... experimentation on both synthetic and real-life logs that these entropy measures do indeed give insights into the complexity of the log and can act as an indicator of which mining paradigm should be used....

  8. Towards an Entropy-based Analysis of Log Variability

    DEFF Research Database (Denmark)

    Back, Christoffer Olling; Debois, Søren; Slaats, Tijs

    2018-01-01

    the development of hybrid miners: given a log, can we determine a priori whether the log is best suited for imperative or declarative mining? We propose using the concept of entropy, commonly used in information theory. We consider different measures for entropy that could be applied and show through...... experimentation on both synthetic and real-life logs that these entropy measures do indeed give insights into the complexity of the log and can act as an indicator of which mining paradigm should be used....

  9. Local entropy generation analysis of a rotary magnetic heat pump regenerator

    International Nuclear Information System (INIS)

    Drost, M.K.; White, M.D.

    1990-01-01

    The rotary magnetic heat pump has attractive thermodynamic performance but it is strongly influenced by the effectiveness of the regenerator. This paper uses local entropy generation analysis to evaluate the regenerator design and to suggest design improvements. The results show that performance of the proposed design is dominated by heat transfer related entropy generation. This suggests that enhancement concepts that improve heat transfer should be considered, even if the enhancement causes a significant increase in viscous losses (pressure drop). One enhancement technique, the use of flow disruptors, was evaluated and the results showed that flow disruptors can significantly reduce thermodynamic losses

  10. Entropy Inequality Violations from Ultraspinning Black Holes.

    Science.gov (United States)

    Hennigar, Robie A; Mann, Robert B; Kubizňák, David

    2015-07-17

    We construct a new class of rotating anti-de Sitter (AdS) black hole solutions with noncompact event horizons of finite area in any dimension and study their thermodynamics. In four dimensions these black holes are solutions to gauged supergravity. We find that their entropy exceeds the maximum implied from the conjectured reverse isoperimetric inequality, which states that for a given thermodynamic volume, the black hole entropy is maximized for Schwarzschild-AdS space. We use this result to suggest more stringent conditions under which this conjecture may hold.

  11. Predicting the distribution of the Asian tapir in Peninsular Malaysia using maximum entropy modeling.

    Science.gov (United States)

    Clements, Gopalasamy Reuben; Rayan, D Mark; Aziz, Sheema Abdul; Kawanishi, Kae; Traeholt, Carl; Magintan, David; Yazi, Muhammad Fadlli Abdul; Tingley, Reid

    2012-12-01

    In 2008, the IUCN threat status of the Asian tapir (Tapirus indicus) was reclassified from 'vulnerable' to 'endangered'. The latest distribution map from the IUCN Red List suggests that the tapirs' native range is becoming increasingly fragmented in Peninsular Malaysia, but distribution data collected by local researchers suggest a more extensive geographical range. Here, we compile a database of 1261 tapir occurrence records within Peninsular Malaysia, and demonstrate that this species, indeed, has a much broader geographical range than the IUCN range map suggests. However, extreme spatial and temporal bias in these records limits their utility for conservation planning. Therefore, we used maximum entropy (MaxEnt) modeling to elucidate the potential extent of the Asian tapir's occurrence in Peninsular Malaysia while accounting for bias in existing distribution data. Our MaxEnt model predicted that the Asian tapir has a wider geographic range than our fine-scale data and the IUCN range map both suggest. Approximately 37% of Peninsular Malaysia contains potentially suitable tapir habitats. Our results justify a revision to the Asian tapir's extent of occurrence in the IUCN Red List. Furthermore, our modeling demonstrated that selectively logged forests encompass 45% of potentially suitable tapir habitats, underscoring the importance of these habitats for the conservation of this species in Peninsular Malaysia. © 2012 Wiley Publishing Asia Pty Ltd, ISZS and IOZ/CAS.
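
    MaxEnt presence-background modelling of this kind is closely related to penalized logistic regression fitted to presence points against background points. The sketch below uses scikit-learn's LogisticRegression as a stand-in for the MaxEnt software actually used in the study; the covariates, sample sizes, and values are purely illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# illustrative environmental covariates (e.g. elevation, forest cover, distance to roads)
n_presence, n_background = 300, 5000
X_presence = rng.normal([500.0, 0.8, 2.0], [150.0, 0.1, 1.0], size=(n_presence, 3))
X_background = rng.normal([800.0, 0.5, 5.0], [400.0, 0.3, 4.0], size=(n_background, 3))

X = np.vstack([X_presence, X_background])
y = np.concatenate([np.ones(n_presence), np.zeros(n_background)])

# L2-penalised logistic regression as a MaxEnt-like presence/background model
model = LogisticRegression(C=1.0, max_iter=1000).fit(X, y)

# relative habitat suitability over the background (candidate) cells
suitability = model.predict_proba(X_background)[:, 1]
print("mean predicted suitability over background cells:", suitability.mean())
```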

  12. The Dynameomics Entropy Dictionary: A Large-Scale Assessment of Conformational Entropy across Protein Fold Space.

    Science.gov (United States)

    Towse, Clare-Louise; Akke, Mikael; Daggett, Valerie

    2017-04-27

    Molecular dynamics (MD) simulations contain considerable information with regard to the motions and fluctuations of a protein, the magnitude of which can be used to estimate conformational entropy. Here we survey conformational entropy across protein fold space using the Dynameomics database, which represents the largest existing data set of protein MD simulations for representatives of essentially all known protein folds. We provide an overview of MD-derived entropies accounting for all possible degrees of dihedral freedom on an unprecedented scale. Although different side chains might be expected to impose varying restrictions on the conformational space that the backbone can sample, we found that the backbone entropy and side chain size are not strictly coupled. An outcome of these analyses is the Dynameomics Entropy Dictionary, the contents of which have been compared with entropies derived by other theoretical approaches and experiment. As might be expected, the conformational entropies scale linearly with the number of residues, demonstrating that conformational entropy is an extensive property of proteins. The calculated conformational entropies of folding agree well with previous estimates. Detailed analysis of specific cases identifies deviations in conformational entropy from the average values that highlight how conformational entropy varies with sequence, secondary structure, and tertiary fold. Notably, α-helices have lower entropy on average than do β-sheets, and both are lower than coil regions.
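
    A minimal sketch of a dihedral-based conformational entropy estimate of the kind surveyed here is given below: bin the sampled angles of one torsion and evaluate S = -R Σ p ln p; per-residue or per-protein values would then be sums over degrees of freedom. The bin count and toy trajectory are assumptions, and this is not Dynameomics' exact estimator.

```python
import numpy as np

R_GAS = 8.314  # J / (mol K)

def dihedral_entropy(angles_deg, bins=36):
    """Conformational entropy (J/mol/K) of one dihedral from its sampled angles,
    using S = -R * sum(p ln p) over angular bins."""
    hist, _ = np.histogram(angles_deg, bins=bins, range=(-180.0, 180.0))
    p = hist[hist > 0] / hist.sum()
    return -R_GAS * np.sum(p * np.log(p))

# toy trajectory: a phi angle fluctuating around -60 degrees
rng = np.random.default_rng(0)
phi = rng.normal(-60.0, 15.0, size=10000)
print("S(phi) =", dihedral_entropy(phi), "J/mol/K")
```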

  13. Assessment of maximum available work of a hydrogen fueled compression ignition engine using exergy analysis

    International Nuclear Information System (INIS)

    Chintala, Venkateswarlu; Subramanian, K.A.

    2014-01-01

    This work is aimed at the study of the maximum available work and irreversibility (mixing, combustion, unburned fuel, and friction) of a dual-fuel diesel engine (H2 (hydrogen)–diesel) using exergy analysis. The maximum available work increased with H2 addition due to a reduction in combustion irreversibility caused by lower entropy generation. The irreversibility of unburned fuel also decreased with H2 because of the higher combustion temperature, whereas H2 had no effect on mixing and friction irreversibility. The maximum available work of the diesel engine at rated load increased from 29% in conventional base mode (without H2) to 31.7% in dual-fuel mode (18% H2 energy share), whereas the total irreversibility of the engine decreased from 41.2% to 39.3%. The energy efficiency of the engine with H2 increased by about 10%, with a 36% reduction in CO2 emission. The developed methodology could also be applied to assess the effect of different technologies, including exhaust gas recirculation and turbocharging, on the maximum available work and energy efficiency of diesel engines. - Highlights: • Energy efficiency of the diesel engine increases with hydrogen under dual-fuel mode. • Maximum available work of the engine increases significantly with hydrogen. • Combustion and unburned fuel irreversibility decrease with hydrogen. • No significant effect of hydrogen on mixing and friction irreversibility. • Reduction in CO2 emission along with HC, CO and smoke emissions

  14. The Role of Configurational Entropy in Amorphous Systems

    Directory of Open Access Journals (Sweden)

    Kirsten A. Graeser

    2010-05-01

    Configurational entropy is an important parameter in amorphous systems. It is involved in thermodynamic considerations, plays an important role in molecular mobility calculations through its appearance in the Adam-Gibbs equation, and provides information on the solubility increase of an amorphous form compared to its crystalline counterpart. This paper presents a calorimetric method which enables the scientist to quickly determine values for the configurational entropy at any temperature and to obtain the maximum information from these measurements.

  15. A Real-Time Analysis Method for Pulse Rate Variability Based on Improved Basic Scale Entropy

    Directory of Open Access Journals (Sweden)

    Yongxin Chou

    2017-01-01

    Base scale entropy analysis (BSEA) is a nonlinear method to analyze heart rate variability (HRV) signals. However, the time consumption of BSEA is too long, and it is unknown whether BSEA is suitable for analyzing pulse rate variability (PRV) signals. Therefore, we proposed a method named sliding window iterative base scale entropy analysis (SWIBSEA) by combining BSEA and sliding window iterative theory. The blood pressure signals of healthy young and old subjects are chosen from the authoritative international database MIT/PhysioNet/Fantasia to generate PRV signals as the experimental data. Then, BSEA and SWIBSEA are used to analyze the experimental data; the results show that SWIBSEA reduces the time consumption and the buffer cache space while it gets the same entropy as BSEA. Meanwhile, the changes of base scale entropy (BSE) for healthy young and old subjects are the same as those of the HRV signal. Therefore, SWIBSEA can be used for deriving some information from long-term and short-term PRV signals in real time, which has potential for dynamic PRV signal analysis in some portable and wearable medical devices.

  16. Estimation of Fine Particulate Matter in Taipei Using Landuse Regression and Bayesian Maximum Entropy Methods

    Directory of Open Access Journals (Sweden)

    Yi-Ming Kuo

    2011-06-01

    Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August, 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005–2007.

  17. Estimation of fine particulate matter in Taipei using landuse regression and bayesian maximum entropy methods.

    Science.gov (United States)

    Yu, Hwa-Lung; Wang, Chih-Hsih; Liu, Ming-Che; Kuo, Yi-Ming

    2011-06-01

    Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August, 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005-2007.

  18. Bayesian Maximum Entropy space/time estimation of surface water chloride in Maryland using river distances.

    Science.gov (United States)

    Jat, Prahlad; Serre, Marc L

    2016-12-01

    Widespread contamination of surface water chloride is an emerging environmental concern. Consequently accurate and cost-effective methods are needed to estimate chloride along all river miles of potentially contaminated watersheds. Here we introduce a Bayesian Maximum Entropy (BME) space/time geostatistical estimation framework that uses river distances, and we compare it with Euclidean BME to estimate surface water chloride from 2005 to 2014 in the Gunpowder-Patapsco, Severn, and Patuxent subbasins in Maryland. River BME improves the cross-validation R² by 23.67% over Euclidean BME, and river BME maps are significantly different than Euclidean BME maps, indicating that it is important to use river BME maps to assess water quality impairment. The river BME maps of chloride concentration show wide contamination throughout Baltimore and Columbia-Ellicott cities, the disappearance of a clean buffer separating these two large urban areas, and the emergence of multiple localized pockets of contamination in surrounding areas. The number of impaired river miles increased by 0.55% per year in 2005-2009 and by 1.23% per year in 2011-2014, corresponding to a marked acceleration of the rate of impairment. Our results support the need for control measures and increased monitoring of unassessed river miles. Copyright © 2016. Published by Elsevier Ltd.

  19. Decision Aggregation in Distributed Classification by a Transductive Extension of Maximum Entropy/Improved Iterative Scaling

    Directory of Open Access Journals (Sweden)

    George Kesidis

    2008-06-01

    In many ensemble classification paradigms, the function which combines local/base classifier decisions is learned in a supervised fashion. Such methods require common labeled training examples across the classifier ensemble. However, in some scenarios, where an ensemble solution is necessitated, common labeled data may not exist: (i) legacy/proprietary classifiers, and (ii) spatially distributed and/or multiple modality sensors. In such cases, it is standard to apply fixed (untrained) decision aggregation such as voting, averaging, or naive Bayes rules. In recent work, an alternative transductive learning strategy was proposed. There, decisions on test samples were chosen aiming to satisfy constraints measured by each local classifier. This approach was shown to reliably correct for class prior mismatch and to robustly account for classifier dependencies. Significant gains in accuracy over fixed aggregation rules were demonstrated. There are two main limitations of that work. First, feasibility of the constraints was not guaranteed. Second, heuristic learning was applied. Here, we overcome these problems via a transductive extension of maximum entropy/improved iterative scaling for aggregation in distributed classification. This method is shown to achieve improved decision accuracy over the earlier transductive approach and fixed rules on a number of UC Irvine datasets.

  20. Entropy resistance analyses of a two-stream parallel flow heat exchanger with viscous heating

    International Nuclear Information System (INIS)

    Cheng Xue-Tao; Liang Xin-Gang

    2013-01-01

    Heat exchangers are widely used in industry, and analyses and optimizations of the performance of heat exchangers are important topics. In this paper, we define the concept of entropy resistance based on the entropy generation analyses of a one-dimensional heat transfer process. With this concept, a two-stream parallel flow heat exchanger with viscous heating is analyzed and discussed. It is found that the minimization of entropy resistance always leads to the maximum heat transfer rate for the discussed two-stream parallel flow heat exchanger, while the minimizations of entropy generation rate, entropy generation numbers, and revised entropy generation number do not always. (general)

  1. Improving Bayesian credibility intervals for classifier error rates using maximum entropy empirical priors.

    Science.gov (United States)

    Gustafsson, Mats G; Wallman, Mikael; Wickenberg Bolin, Ulrika; Göransson, Hanna; Fryknäs, M; Andersson, Claes R; Isaksson, Anders

    2010-06-01

    Successful use of classifiers that learn to make decisions from a set of patient examples requires robust methods for performance estimation. Recently many promising approaches for determining an upper bound on the error rate of a single classifier have been reported, but the Bayesian credibility interval (CI) obtained from a conventional holdout test still delivers one of the tightest bounds. The conventional Bayesian CI becomes unacceptably large in real-world applications where the test set sizes are less than a few hundred. The source of this problem is the fact that the CI is determined exclusively by the result on the test examples. In other words, no information at all is provided by the uniform prior density distribution employed, which reflects a complete lack of prior knowledge about the unknown error rate. Therefore, the aim of the study reported here was to investigate a maximum entropy (ME) based approach to improved prior knowledge and Bayesian CIs, demonstrating its relevance for biomedical research and clinical practice. It is demonstrated how a refined non-uniform prior density distribution can be obtained by means of the ME principle using empirical results from a few designs and tests using non-overlapping sets of examples. Experimental results show that ME-based priors improve the CIs when applied to four quite different simulated and two real-world data sets. An empirically derived ME prior seems promising for improving the Bayesian CI for the unknown error rate of a designed classifier. Copyright 2010 Elsevier B.V. All rights reserved.
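
    The holdout Bayesian CI itself is easy to sketch: with a Beta prior and k errors out of n test cases, the posterior over the error rate is Beta(a+k, b+n-k), and the CI follows from its quantiles. In the sketch below the "informed" prior is given by illustrative pseudo-counts standing in for the paper's maximum-entropy construction, which is not reproduced here.

```python
import numpy as np
from scipy import stats

def error_rate_ci(errors, n_test, prior_a=1.0, prior_b=1.0, level=0.95):
    """Bayesian credibility interval for an unknown error rate with a
    Beta(prior_a, prior_b) prior and `errors` mistakes out of n_test cases."""
    post = stats.beta(prior_a + errors, prior_b + n_test - errors)
    return post.ppf((1 - level) / 2), post.ppf(1 - (1 - level) / 2)

# uniform prior (no prior knowledge): wide interval for a small test set
print("uniform prior:  ", error_rate_ci(errors=5, n_test=40))

# informed prior: pseudo-counts from earlier designs/tests (illustrative numbers),
# standing in for the maximum-entropy prior derived in the paper
print("informed prior: ", error_rate_ci(errors=5, n_test=40, prior_a=4.0, prior_b=26.0))
```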

  2. Minimal length, Friedmann equations and maximum density

    Energy Technology Data Exchange (ETDEWEB)

    Awad, Adel [Center for Theoretical Physics, British University of Egypt,Sherouk City 11837, P.O. Box 43 (Egypt); Department of Physics, Faculty of Science, Ain Shams University,Cairo, 11566 (Egypt); Ali, Ahmed Farag [Centre for Fundamental Physics, Zewail City of Science and Technology,Sheikh Zayed, 12588, Giza (Egypt); Department of Physics, Faculty of Science, Benha University,Benha, 13518 (Egypt)

    2014-06-16

    Inspired by Jacobson’s thermodynamic approach, Cai et al. have shown the emergence of Friedmann equations from the first law of thermodynamics. We extend the Akbar-Cai derivation http://dx.doi.org/10.1103/PhysRevD.75.084003 of the Friedmann equations to accommodate a general entropy-area law. Studying the resulting Friedmann equations using a specific entropy-area law, which is motivated by the generalized uncertainty principle (GUP), reveals the existence of a maximum energy density close to the Planck density. Allowing for a general continuous pressure p(ρ,a) leads to bounded curvature invariants and a general nonsingular evolution. In this case, the maximum energy density is reached in a finite time and there is no cosmological evolution beyond this point, which leaves the big bang singularity inaccessible from a spacetime perspective. The existence of a maximum energy density and a general nonsingular evolution is independent of the equation of state and the spatial curvature k. As an example we study the evolution of the equation of state p=ωρ through its phase-space diagram to show the existence of a maximum energy density which is reachable in a finite time.

  3. Maximum entropy state of the quasi-geostrophic bi-disperse point vortex system: bifurcation phenomena under periodic boundary conditions

    Energy Technology Data Exchange (ETDEWEB)

    Funakoshi, Satoshi; Sato, Tomoyoshi; Miyazaki, Takeshi, E-mail: funakosi@miyazaki.mce.uec.ac.jp, E-mail: miyazaki@mce.uec.ac.jp [Department of Mechanical Engineering and Intelligent Systems, University of Electro-Communications, 1-5-1, Chofugaoka, Chofu, Tokyo 182-8585 (Japan)

    2012-06-01

    We investigate the statistical mechanics of quasi-geostrophic point vortices of mixed sign (bi-disperse system) numerically and theoretically. Direct numerical simulations under periodic boundary conditions are performed using a fast special-purpose computer for molecular dynamics (GRAPE-DR). Clustering of point vortices of like sign is observed and two-dimensional (2D) equilibrium states are formed. It is shown that they are the solutions of the 2D mean-field equation, i.e. the sinh-Poisson equation. The sinh-Poisson equation is generalized to study the 3D nature of the equilibrium states, and a new mean-field equation with the 3D Laplace operator is derived based on the maximum entropy theory. 3D solutions are obtained at very low energy level. These solution branches, however, cannot be traced up to the higher energy level at which the direct numerical simulations are performed, and transitions to 2D solution branches take place when the energy is increased. (paper)

  4. Precipitation Interpolation by Multivariate Bayesian Maximum Entropy Based on Meteorological Data in Yun-Gui-Guang region, Mainland China

    Science.gov (United States)

    Wang, Chaolin; Zhong, Shaobo; Zhang, Fushen; Huang, Quanyi

    2016-11-01

    Precipitation interpolation has been a hot area of research for many years and is closely related to meteorological factors. In this paper, precipitation from 91 meteorological stations located in and around Yunnan, Guizhou and Guangxi Zhuang provinces (or autonomous regions), Mainland China was taken into consideration for spatial interpolation. A multivariate Bayesian maximum entropy (BME) method with auxiliary variables, including mean relative humidity, water vapour pressure, mean temperature, mean wind speed and terrain elevation, was used to obtain a more accurate regional distribution of annual precipitation. The means, standard deviations, skewness and kurtosis of the meteorological factors were calculated. Variograms and cross-variograms were fitted between precipitation and the auxiliary variables. The results showed that the multivariate BME method was precise when hard and soft data (probability density functions) were used. Annual mean precipitation was positively correlated with mean relative humidity, mean water vapour pressure, mean temperature and mean wind speed, and negatively correlated with terrain elevation. The results are expected to provide a substantial reference for research on drought and waterlogging in the region.

  5. Value at risk estimation with entropy-based wavelet analysis in exchange markets

    Science.gov (United States)

    He, Kaijian; Wang, Lijun; Zou, Yingchao; Lai, Kin Keung

    2014-08-01

    In recent years, exchange markets have become increasingly integrated. Fluctuations and risks across different exchange markets exhibit co-moving and complex dynamics. In this paper we propose entropy-based multivariate wavelet approaches to analyze the multiscale characteristics in the multidimensional domain and to further improve the reliability of Value at Risk estimation. Wavelet analysis has been introduced to construct the entropy-based Multiscale Portfolio Value at Risk estimation algorithm to account for the multiscale dynamic correlation. The entropy measure has been proposed as the more effective measure, with the error minimization principle, to select the best basis when determining the wavelet families and the decomposition level to use. The empirical studies conducted in this paper provide positive evidence of the superior performance of the proposed approach, using the closely related Chinese Renminbi and European Euro exchange markets.

  6. Solutions to the Cosmic Initial Entropy Problem without Equilibrium Initial Conditions

    Directory of Open Access Journals (Sweden)

    Vihan M. Patel

    2017-08-01

    The entropy of the observable universe is increasing. Thus, at earlier times the entropy was lower. However, the cosmic microwave background radiation reveals an apparently high entropy universe close to thermal and chemical equilibrium. A two-part solution to this cosmic initial entropy problem is proposed. Following Penrose, we argue that the evenly distributed matter of the early universe is equivalent to low gravitational entropy. There are two competing explanations for how this initial low gravitational entropy comes about. (1) Inflation and baryogenesis produce a virtually homogeneous distribution of matter with a low gravitational entropy. (2) Dissatisfied with explaining a low gravitational entropy as the product of a ‘special’ scalar field, some theorists argue (following Boltzmann) for a “more natural” initial condition in which the entire universe is in an initial equilibrium state of maximum entropy. In this equilibrium model, our observable universe is an unusual low entropy fluctuation embedded in a high entropy universe. The anthropic principle and the fluctuation theorem suggest that this low entropy region should be as small as possible and have as large an entropy as possible, consistent with our existence. However, our low entropy universe is much larger than needed to produce observers, and we see no evidence for an embedding in a higher entropy background. The initial conditions of inflationary models are as natural as the equilibrium background favored by many theorists.

  7. Causal nexus between energy consumption and carbon dioxide emission for Malaysia using maximum entropy bootstrap approach.

    Science.gov (United States)

    Gul, Sehrish; Zou, Xiang; Hassan, Che Hashim; Azam, Muhammad; Zaman, Khalid

    2015-12-01

    This study investigates the relationship between energy consumption and carbon dioxide emission in a causal framework, as the direction of causality has significant policy implications for developed and developing countries. The study employed the maximum entropy bootstrap (Meboot) approach to examine the causal nexus between energy consumption and carbon dioxide emission in bivariate as well as multivariate frameworks for Malaysia over the period 1975-2013. This is a unified approach that does not require the use of conventional techniques based on asymptotic theory, such as testing for possible unit roots and cointegration. In addition, it can be applied in the presence of non-stationarity of any type, including structural breaks, without any data transformation to achieve stationarity. Thus, it provides more reliable and robust inferences which are insensitive to the time span as well as the lag length used. The empirical results show that there is a unidirectional causality running from energy consumption to carbon emission both in the bivariate model and in the multivariate framework, while controlling for broad money supply and population density. The results indicate that Malaysia is an energy-dependent country and hence energy is a stimulus to carbon emissions.

  8. Quantum entropy and uncertainty for two-mode squeezed, coherent and intelligent spin states

    Science.gov (United States)

    Aragone, C.; Mundarain, D.

    1993-01-01

    We compute the quantum entropy for monomode and two-mode systems set in squeezed states. Thereafter, the quantum entropy is also calculated for angular momentum algebra when the system is either in a coherent or in an intelligent spin state. These values are compared with the corresponding values of the respective uncertainties. In general, quantum entropies and uncertainties have the same minimum and maximum points. However, for coherent and intelligent spin states, it is found that some minima for the quantum entropy turn out to be uncertainty maxima. We feel that the quantum entropy we use provides the right answer, since it is given in an essentially unique way.

  9. Identification of Random Dynamic Force Using an Improved Maximum Entropy Regularization Combined with a Novel Conjugate Gradient

    Directory of Open Access Journals (Sweden)

    ChunPing Ren

    2017-01-01

    We propose a novel mathematical algorithm to offer a solution for the inverse random dynamic force identification problem in practical engineering. To deal with the random dynamic force identification problem using the proposed algorithm, an improved maximum entropy (IME) regularization technique is transformed into an unconstrained optimization problem, and a novel conjugate gradient (NCG) method is applied to solve the objective function; the combination is abbreviated as the IME-NCG algorithm. The result of the IME-NCG algorithm is compared with those of the ME, ME-CG, ME-NCG, and IME-CG algorithms; it is found that the IME-NCG algorithm is well suited to identifying the random dynamic force due to its smaller root-mean-square error (RMSE), lower restoration time, and fewer iterative steps. An example of an engineering application shows that the L-curve method, which performs better than the Generalized Cross Validation (GCV) method, is introduced and applied to select the regularization parameter; thus the proposed algorithm can help alleviate the ill-conditioned problem in the identification of dynamic force and acquire an optimal solution of the inverse problem in practical engineering.

  10. Entropy statistics and information theory

    NARCIS (Netherlands)

    Frenken, K.; Hanusch, H.; Pyka, A.

    2007-01-01

    Entropy measures provide important tools to indicate variety in distributions at particular moments in time (e.g., market shares) and to analyse evolutionary processes over time (e.g., technical change). Importantly, entropy statistics are suitable to decomposition analysis, which renders the

  11. Harmonic analysis of electric locomotive and traction power system based on wavelet singular entropy

    Science.gov (United States)

    Dun, Xiaohong

    2018-05-01

    With the rapid development of high-speed railway and heavy-haul transport, the locomotive and traction power system has become the main harmonic source of China's power grid. In response to this phenomenon, the system's power quality issues need timely monitoring, assessment and governance. Wavelet singular entropy is an organic combination of wavelet transform, singular value decomposition and information entropy theory, which combines the unique advantages of the three in signal processing: the time-frequency local characteristics of wavelet transform, the exploration of the basic modal characteristics of data by singular value decomposition, and the quantification of the feature data by information entropy. Based on the theory of singular value decomposition, the wavelet coefficient matrix obtained after the wavelet transform is decomposed into a series of singular values that reflect the basic characteristics of the original coefficient matrix. Then the statistical properties of information entropy are used to analyze the uncertainty of the singular value set, so as to give a definite measurement of the complexity of the original signal. It can be said that wavelet entropy has good application prospects in fault detection, classification and protection. The MATLAB simulation shows that the use of wavelet singular entropy for harmonic analysis of the locomotive and traction power system is effective.
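
    The chain described above (wavelet transform → singular value decomposition → Shannon entropy of the singular values) can be sketched compactly. The version below uses a hand-rolled Haar decomposition to stay self-contained and a toy "voltage" signal; the wavelet, number of levels, and signal are illustrative assumptions, not the settings of this study.

```python
import numpy as np

def haar_details(x, levels=4):
    """Per-level Haar detail coefficients of a 1-D signal."""
    details, approx = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        if len(approx) % 2:                       # drop a sample if the length is odd
            approx = approx[:-1]
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2))
        approx = (even + odd) / np.sqrt(2)
    return details

def wavelet_singular_entropy(x, levels=4):
    """Shannon entropy of the normalised singular values of the wavelet
    coefficient matrix (rows = levels, zero-padded to equal length)."""
    details = haar_details(x, levels)
    width = max(len(d) for d in details)
    mat = np.zeros((levels, width))
    for i, d in enumerate(details):
        mat[i, :len(d)] = d
    s = np.linalg.svd(mat, compute_uv=False)
    p = s / s.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# toy "voltage" signal: fundamental plus a harmonic and noise
t = np.linspace(0, 0.2, 2000)
v = (np.sin(2*np.pi*50*t) + 0.2*np.sin(2*np.pi*250*t)
     + 0.05*np.random.default_rng(0).normal(size=t.size))
print("wavelet singular entropy:", wavelet_singular_entropy(v))
```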

  12. Financial time series analysis based on effective phase transfer entropy

    Science.gov (United States)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.
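
    For reference, plain (histogram-based, lag-1) transfer entropy between two discretized series can be written as a combination of joint entropies, as sketched below. The effective *phase* transfer entropy proposed in the paper additionally works on instantaneous phases and subtracts a surrogate baseline, which is omitted here; the bin count and coupled toy system are assumptions.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Lag-1 transfer entropy TE(X -> Y) in nats, from joint histograms of
    (y_{t+1}, y_t, x_t). Both series are discretised into equal-width bins."""
    xd = np.digitize(x, np.histogram_bin_edges(x, bins))
    yd = np.digitize(y, np.histogram_bin_edges(y, bins))
    y_next, y_now, x_now = yd[1:], yd[:-1], xd[:-1]

    def H(*cols):
        """Joint Shannon entropy of the given discrete columns."""
        counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)[1]
        p = counts / counts.sum()
        return -(p * np.log(p)).sum()

    # TE(X->Y) = H(y_next, y_now) - H(y_now) + H(y_now, x_now) - H(y_next, y_now, x_now)
    return H(y_next, y_now) + H(y_now, x_now) - H(y_now) - H(y_next, y_now, x_now)

# toy coupled system: y follows x with a one-step delay plus noise
rng = np.random.default_rng(3)
x = rng.normal(size=5000)
y = np.roll(x, 1) * 0.8 + 0.2 * rng.normal(size=5000)
print("TE(x -> y) =", transfer_entropy(x, y))
print("TE(y -> x) =", transfer_entropy(y, x))
```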

  13. Entropy in an expanding universe

    International Nuclear Information System (INIS)

    Frautschi, S.

    1982-01-01

    The question of how the observed evolution of organized structures from initial chaos in the expanding universe can be reconciled with the laws of statistical mechanics is studied, with emphasis on effects of the expansion and gravity. Some major sources of entropy increase are listed. An expanding causal region is defined in which the entropy, though increasing, tends to fall further and further behind its maximum possible value, thus allowing for the development of order. The related questions of whether entropy will continue increasing without limit in the future, and whether such increase in the form of Hawking radiation or radiation from positronium might enable life to maintain itself permanently, are considered. Attempts to find a scheme for preserving life based on solid structures fail because events such as quantum tunneling recurrently disorganize matter on a very long but fixed time scale, whereas all energy sources slow down progressively in an expanding universe. However, there remains hope that other modes of life capable of maintaining themselves permanently can be found

  14. Entropy for the Complexity of Physiological Signal Dynamics.

    Science.gov (United States)

    Zhang, Xiaohua Douglas

    2017-01-01

    Recently, the rapid development of large data storage technologies, mobile network technology, and portable medical devices has made it possible to measure, record, store, and analyze biological dynamics. Portable noninvasive medical devices are crucial to capture individual characteristics of biological dynamics. Wearable noninvasive medical devices and the analysis/management of the related digital medical data will revolutionize the management and treatment of diseases, subsequently resulting in the establishment of a new healthcare system. One of the key features that can be extracted from the data obtained by wearable noninvasive medical devices is the complexity of physiological signals, which can be represented by the entropy of the biological dynamics contained in the physiological signals measured by these continuous monitoring medical devices. Thus, in this chapter I present the major concepts of entropy that are commonly used to measure the complexity of biological dynamics. The concepts include Shannon entropy, Kolmogorov entropy, Renyi entropy, approximate entropy, sample entropy, and multiscale entropy. I also demonstrate an example of using entropy for the complexity of glucose dynamics.
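
    Of the measures named above, sample entropy is one of the most commonly applied to physiological traces; a minimal brute-force sketch (m = 2, r = 0.2·std) is given below on a toy glucose-like signal. The parameters and data are illustrative defaults, not values from the chapter.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal (brute force, O(n^2))."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_matches(length):
        # N - m templates of the given length, compared with the Chebyshev distance
        templates = np.array([x[i:i + length] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1          # exclude the self-match
        return count

    B = count_matches(m)       # matches of length m
    A = count_matches(m + 1)   # matches of length m + 1
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

# toy glucose-like trace: slow oscillation plus noise
rng = np.random.default_rng(0)
t = np.arange(500)
glucose = 5.5 + 0.8 * np.sin(2 * np.pi * t / 120) + 0.2 * rng.normal(size=t.size)
print("SampEn =", sample_entropy(glucose))
```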

  15. Entropy fluxes, endoreversibility, and solar energy conversion

    Science.gov (United States)

    de Vos, A.; Landsberg, P. T.; Baruch, P.; Parrott, J. E.

    1993-09-01

    A formalism illustrating the conversion of radiation energy into work can be obtained in terms of energy and entropy fluxes. Whereas the Landsberg equality was derived for photothermal conversion with zero bandgap, a generalized inequality for photothermal/photovoltaic conversion with a single, but arbitrary, bandgap was deduced. This result was derived for a direct energy and entropy balance. The formalism of endoreversible dynamics was adopted in order to show the correlation with the latter approach. It was a surprising fact that the generalized Landsberg inequality was derived by optimizing some quantity W(sup *), which obtains it maximum value under short-circuit condition.

  16. Dispersion entropy for the analysis of resting-state MEG regularity in Alzheimer's disease.

    Science.gov (United States)

    Azami, Hamed; Rostaghi, Mostafa; Fernandez, Alberto; Escudero, Javier

    2016-08-01

    Alzheimer's disease (AD) is a progressive degenerative brain disorder affecting memory, thinking, behaviour and emotion. It is the most common form of dementia and a major social problem in western societies. The analysis of brain activity may help to diagnose this disease. Entropy-based methods have been reported to be useful in research studies for characterizing AD. We have recently proposed dispersion entropy (DisEn) as a very fast and powerful tool to quantify the irregularity of time series. The aim of this paper is to evaluate the ability of DisEn, in comparison with fuzzy entropy (FuzEn), sample entropy (SampEn), and permutation entropy (PerEn), to discriminate 36 AD patients from 26 elderly control subjects using resting-state magnetoencephalogram (MEG) signals. The results obtained by DisEn, FuzEn, and SampEn, unlike PerEn, show that the AD patients' signals are more regular than the controls' time series. The p-values obtained by the DisEn-, FuzEn-, SampEn-, and PerEn-based methods demonstrate the superiority of DisEn over FuzEn, SampEn, and PerEn. Moreover, the computation time for the newly proposed DisEn-based method is noticeably lower than for the FuzEn-, SampEn-, and PerEn-based approaches.
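
    A compact sketch of dispersion entropy following the published recipe (map samples through the normal CDF to c classes, form embedding patterns of dimension m, and take the normalized Shannon entropy of the pattern distribution) is shown below. The parameters c = 6, m = 2 are typical defaults and the signals are synthetic; they are not necessarily the settings used in this MEG study.

```python
import numpy as np
from scipy.stats import norm

def dispersion_entropy(x, m=2, c=6, delay=1):
    """Normalised dispersion entropy of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    # 1. map samples to classes 1..c through the normal CDF
    y = norm.cdf(x, loc=x.mean(), scale=x.std())
    z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)
    # 2. build dispersion patterns of length m
    n_pat = len(z) - (m - 1) * delay
    patterns = np.column_stack([z[i * delay: i * delay + n_pat] for i in range(m)])
    # 3. Shannon entropy of the pattern frequencies, normalised by ln(c**m)
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum() / np.log(c ** m))

rng = np.random.default_rng(1)
white = rng.normal(size=3000)                              # irregular signal
regular = np.sin(2 * np.pi * 5 * np.arange(3000) / 250)    # regular signal
print("DisEn(white noise):", dispersion_entropy(white))
print("DisEn(sine):       ", dispersion_entropy(regular))
```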

  17. Modeling Electric Discharges with Entropy Production Rate Principles

    Directory of Open Access Journals (Sweden)

    Thomas Christen

    2009-12-01

    Under which circumstances are variational principles based on the entropy production rate useful tools for modeling steady states of electric (gas) discharge systems far from equilibrium? It is first shown how various approaches, such as Steenbeck's minimum voltage and Prigogine's minimum entropy production rate principles, are related to the maximum entropy production rate principle (MEPP). Secondly, three typical examples are discussed, which provide a certain insight into the structure of the models that are candidates for MEPP application. It is then thirdly argued that MEPP, although not being an exact physical law, may provide reasonable model parameter estimates, provided the constraints contain the relevant (nonlinear) physical effects and the parameters to be determined are related to disregarded weak constraints that affect mainly global entropy production. Finally, it is additionally conjectured that a further reason for the success of MEPP in certain far-from-equilibrium systems might be a hidden linearity of the underlying kinetic equation(s).

  18. Evidence of shallow positron traps in ion-implanted InP observed by maximum entropy reconstruction of positron lifetime distribution: a test of MELT

    International Nuclear Information System (INIS)

    Chen, Z.Q.; Wang, S.J.

    1999-01-01

    A newly developed maximum entropy method, which was realized by the computer program MELT introduced by Shukla et al., was used to analyze positron lifetime spectra measured in semiconductors. Several simulation studies were done to test the performance of this algorithm. Reliable reconstruction of positron lifetime distributions can be extracted at relatively lower counts, which shows the applicability and superiority of this method. Two positron lifetime spectra measured in ion-implanted p-InP(Zn) at 140 and 280 K, respectively were analyzed by this program. The lifetime distribution differed greatly for the two temperatures, giving direct evidence of the existence of shallow positron traps at low temperature

  19. Thermal-hydraulic performance analysis of a subchannel with square and triangle fuel rod arrangements using the entropy generation approach

    Institute of Scientific and Technical Information of China (English)

    S.Talebi; M.M.Valoujerdi

    2017-01-01

    The present paper discusses entropy generation in fully developed turbulent flows through a subchannel, arranged in square and triangular arrays. Entropy generation is due to the contributions of both heat transfer and pressure drop. Our main objective is to study the effect of key parameters such as the spacer grid, fuel rod power distribution, Reynolds number Re, dimensionless heat power ω, length-to-fuel-diameter ratio λ, and pitch-to-diameter ratio ξ on subchannel entropy generation. The analysis explicitly shows the contributions of heat transfer and pressure drop to the total entropy generation. An analytical formulation of the total entropy generation is introduced for situations with uniform and sinusoidal rod power distributions. It is concluded that the power distribution affects entropy generation: a smoother power profile leads to less entropy generation. Square rod array bundles are more efficient in terms of entropy generation than triangular rod arrays, and spacer grids generate more entropy.

  20. A Bayesian maximum entropy-based methodology for optimal spatiotemporal design of groundwater monitoring networks.

    Science.gov (United States)

    Hosseini, Marjan; Kerachian, Reza

    2017-09-01

    This paper presents a new methodology for analyzing the spatiotemporal variability of water table levels and redesigning a groundwater level monitoring network (GLMN) using the Bayesian Maximum Entropy (BME) technique and a multi-criteria decision-making approach based on ordered weighted averaging (OWA). The spatial sampling is determined using a hexagonal gridding pattern and a new method, which is proposed to assign a removal priority number to each pre-existing station. To design the temporal sampling, a new approach is also applied to account for uncertainty caused by lack of information. In this approach, different time lag values are tested against another source of information, namely the simulation results of a numerical groundwater flow model. Furthermore, to incorporate the existing uncertainties in the available monitoring data, the flexibility of the BME interpolation technique is exploited by applying soft data and improving the accuracy of the calculations. To examine the methodology, it is applied to the Dehgolan plain in northwestern Iran. Based on the results, a configuration of 33 monitoring stations on a regular hexagonal grid of side length 3600 m is proposed, in which the time lag between samples is equal to 5 weeks. Since the variance estimation errors of the BME method are almost identical for the redesigned and existing networks, the redesigned monitoring network is more cost-effective and efficient than the existing monitoring network with 52 stations and monthly sampling frequency.

  1. Entropy Coherent and Entropy Convex Measures of Risk

    NARCIS (Netherlands)

    Laeven, R.J.A.; Stadje, M.A.

    2011-01-01

    We introduce two subclasses of convex measures of risk, referred to as entropy coherent and entropy convex measures of risk. We prove that convex, entropy convex and entropy coherent measures of risk emerge as certainty equivalents under variational, homothetic and multiple priors preferences,

  2. Application of the EGM Method to a LED-Based Spotlight: A Constrained Pseudo-Optimization Design Process Based on the Analysis of the Local Entropy Generation Maps

    Directory of Open Access Journals (Sweden)

    Enrico Sciubba

    2011-06-01

    In this paper, the entropy generation minimization (EGM) method is applied to an industrial heat transfer problem: the forced convective cooling of a LED-based spotlight. The design specification calls for eighteen diodes arranged on a circular copper plate of 35 mm diameter. Every diode dissipates 3 W and the maximum allowed temperature of the plate is 80 °C. The cooling relies on the forced convection driven by a jet of air impinging on the plate. An initial complex geometry of plate fins is presented and analyzed with a commercial CFD code that computes the entropy generation rate. A pseudo-optimization process is carried out via a successive series of design modifications based on a careful analysis of the entropy generation maps. One of the advantages of the EGM method is that the rationale behind each step of the design process can be justified on a physical basis. It is found that the best performance is attained when the fins are periodically spaced in the radial direction.
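
    For reference, the local (volumetric) entropy generation rate that such EGM maps are typically built from combines a heat-conduction term and a viscous-dissipation term. The standard single-phase expression is given below; it is assumed, not stated in the abstract, that this is the quantity evaluated by the CFD code.

```latex
\dot{S}'''_{\mathrm{gen}} \;=\;
\underbrace{\frac{k}{T^{2}}\,\lvert \nabla T\rvert^{2}}_{\text{heat transfer}}
\;+\;
\underbrace{\frac{\mu}{T}\,\Phi}_{\text{viscous dissipation}},
\qquad
\Phi = 2\!\left[\left(\frac{\partial u}{\partial x}\right)^{2}
+\left(\frac{\partial v}{\partial y}\right)^{2}
+\left(\frac{\partial w}{\partial z}\right)^{2}\right]
+\left(\frac{\partial u}{\partial y}+\frac{\partial v}{\partial x}\right)^{2}
+\left(\frac{\partial u}{\partial z}+\frac{\partial w}{\partial x}\right)^{2}
+\left(\frac{\partial v}{\partial z}+\frac{\partial w}{\partial y}\right)^{2}
```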

  3. Entropy coherent and entropy convex measures of risk

    NARCIS (Netherlands)

    Laeven, Roger; Stadje, M.A.

    2010-01-01

    We introduce entropy coherent and entropy convex measures of risk and prove a collection of axiomatic characterization and duality results. We show in particular that entropy coherent and entropy convex measures of risk emerge as negative certainty equivalents in (the regular and a generalized

  4. Definition and measurement of entropy in high energy heavy ion collisions

    International Nuclear Information System (INIS)

    Remler, E.A.

    1986-01-01

    This talk has two parts: the first on the definition and the second on the measurement of entropy. The connection to nuclear thermodynamics can be retained without the local equilibrium assumption via two steps. The first is relatively simple and goes as follows. The authors make the certainly reasonable assumption that in central collisions, at the moment of maximum compression, the state is similar to one or more fireballs and that the total entropy of each fireball approximates that of an equilibrated system at the same total energy and average density. This entropy, if measurable, would determine many of the thermodynamic properties of nuclear matter. The second step therefore concerns the measurement of this entropy. This paper develops a method by which entropy may be measured using a minimum amount of theory. In particular, it is not based on any assumption of local equilibrium.

  5. Parameterized entropy analysis of EEG following hypoxic-ischemic brain injury

    International Nuclear Information System (INIS)

    Tong Shanbao; Bezerianos, Anastasios; Malhotra, Amit; Zhu Yisheng; Thakor, Nitish

    2003-01-01

    In the present study Tsallis and Renyi entropy methods were used to study the electric activity of the brain following hypoxic-ischemic (HI) injury. We investigated the performance of these parameterized information measures in describing the electroencephalogram (EEG) signal of controlled experimental animal HI injury. The results show that (a) compared with Shannon and Renyi entropy, the parameterized Tsallis entropy acts like a spatial filter, and the information rate can be tuned, via the entropic index q, either to long-range rhythms or to short abrupt changes such as bursts or spikes at the beginning of recovery; and (b) Renyi entropy is a compact and predictive indicator for monitoring the physiological changes during recovery from brain injury. There is a reduction in the Renyi entropy after brain injury followed by a gradual recovery upon resuscitation.
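
    A minimal sketch of the two parameterized measures is given below: Tsallis and Rényi entropies of an amplitude distribution estimated from a histogram of an EEG-like signal, swept over several values of the entropic index q. The synthetic signal, bin count, and q values are illustrative assumptions.

```python
import numpy as np

def prob_from_signal(x, bins=64):
    """Empirical amplitude distribution of a signal (nonzero bins only)."""
    hist, _ = np.histogram(x, bins=bins)
    return hist[hist > 0] / hist.sum()

def tsallis_entropy(p, q):
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def renyi_entropy(p, q):
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** q)) / (1.0 - q)

# toy EEG-like segment: background rhythm plus sparse spike-like events
rng = np.random.default_rng(7)
eeg = np.sin(2 * np.pi * 10 * np.arange(2000) / 250) + 0.3 * rng.normal(size=2000)
eeg[::200] += 4.0                                  # burst-like events
p = prob_from_signal(eeg)
for q in (0.5, 1.0, 2.0, 3.0):
    print(f"q={q}: Tsallis={tsallis_entropy(p, q):.3f}  Renyi={renyi_entropy(p, q):.3f}")
```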

  6. A Connection Entropy Approach to Water Resources Vulnerability Analysis in a Changing Environment

    Directory of Open Access Journals (Sweden)

    Zhengwei Pan

    2017-11-01

    This paper establishes a water resources vulnerability framework based on sensitivity, natural resilience and artificial adaptation, through analysis of the four states of the water system and its accompanying transformation processes. Furthermore, it proposes an analysis method for water resources vulnerability based on connection entropy, which extends the concept of contact entropy. An example is given of the water resources vulnerability in Anhui Province of China; the analysis illustrates that, overall, vulnerability levels fluctuated but showed an apparent improving trend from 2001 to 2015. Some suggestions are also provided for improving the level of water resources vulnerability in Anhui Province from the viewpoint of the vulnerability index.

  7. Multifractal diffusion entropy analysis: Optimal bin width of probability histograms

    Science.gov (United States)

    Jizba, Petr; Korbel, Jan

    2014-11-01

    In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin-width in histograms generated from underlying probability distributions of interest. The method presented uses techniques of Rényi’s entropy and the mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on a scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. In order to demonstrate a strength of the method proposed we compare the multifractal δ-spectrum for various bin-widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. Connection between the δ-spectrum and Rényi’s q parameter is also discussed and elucidated on a simple example of multiscale time series.
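
    The core of diffusion entropy analysis can be sketched briefly: build diffusion trajectories by summing increments over windows of length t, histogram them with a chosen bin width, and read the scaling exponent δ from S(t) ≈ A + δ ln t. The bin-width choice, which is the subject of the paper, is simply exposed as a parameter here; the data and numbers are illustrative (for an uncorrelated random walk one expects δ roughly 0.5).

```python
import numpy as np

def diffusion_entropy(increments, window, bin_width):
    """Shannon entropy (nats) of the diffusion variable x(t) = sum of
    `window` consecutive increments, histogrammed with a fixed bin width."""
    n = (len(increments) // window) * window
    x = increments[:n].reshape(-1, window).sum(axis=1)
    edges = np.arange(x.min(), x.max() + bin_width, bin_width)
    hist, _ = np.histogram(x, bins=edges)
    p = hist[hist > 0] / hist.sum()
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(0)
increments = rng.normal(size=2 ** 17)           # toy uncorrelated "returns"

windows = np.array([2, 4, 8, 16, 32, 64])
S = [diffusion_entropy(increments, w, bin_width=0.5) for w in windows]

# scaling exponent delta from S(t) = A + delta * ln t
delta = np.polyfit(np.log(windows), S, 1)[0]
print("estimated delta:", delta)
```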

  8. Entropy coherent and entropy convex measures of risk

    NARCIS (Netherlands)

    Laeven, R.J.A.; Stadje, M.

    2013-01-01

    We introduce two subclasses of convex measures of risk, referred to as entropy coherent and entropy convex measures of risk. Entropy coherent and entropy convex measures of risk are special cases of φ-coherent and φ-convex measures of risk. Contrary to the classical use of coherent and convex

  9. Entropy Generation and Human Aging: Lifespan Entropy and Effect of Physical Activity Level

    Science.gov (United States)

    Silva, Carlos; Annamalai, Kalyan

    2008-06-01

    The first and second laws of thermodynamics were applied to biochemical reactions typical of human metabolism. An open-system model was used for the human body. Energy conservation, availability and entropy balances were performed to obtain the entropy generated for the main food components. Quantitative results for entropy generation were obtained as a function of age using the databases from the U.S. Food and Nutrition Board (FNB) and the Centers for Disease Control and Prevention (CDC), which provide energy requirements and food intake composition as a function of age, weight and stature. Numerical integration was performed over the human lifespan for different levels of physical activity. The entropy generated over the lifespan of average individuals (natural death) was found to be 11,404 kJ/K per kg of body mass, with the rate of generation three times higher in infants than in the elderly. The entropy generated predicts a lifespan of 73.78 and 81.61 years for the average U.S. male and female individuals respectively, values that closely match the average lifespans from statistics (74.63 and 80.36 years). From the analysis of the effect of different activity levels, it is shown that the entropy generated increases with physical activity, suggesting that exercise should be kept to a “healthy minimum” if entropy generation is to be minimized.

  10. Entropy generation in a diesel engine turbocharging system

    International Nuclear Information System (INIS)

    Nakonieczny, K.

    2002-01-01

    The paper describes a model of entropy production in a diesel engine turbocharging system, discussing the processes occurring in the compressor, turbine, piping system, charge-air cooler and valves with the exclusion of combustion. The charging efficiency of the system is studied in two distinct engine operating states, conforming to maximum torque and nominal power conditions. Unlike in the standard approach, where the irreversibilities are derived from the balance equation for exergy and thus are addressed inexactly, the criterion function based on the notion of entropy generation, introduced in this paper, improves second law analysis of turbocharged engines by accounting for a direct description of the system internal irreversibilities. This function is used for the examination of an impact of the system design parameters on its efficiency. Computations based on the unsteady one-dimensional flow model show that, under the variations of the inlet pipe length, the timings of inlet valve opening and exhaust valve closure, and the valve overlap period, a favourable correlation can be found between the decrease of entropy production and the increase in amount of air charged into the engine cylinders. The other variables under study, including the turbine equivalent area, temperature decrease in intercooler and wastegate effective area ratio, show an opposite correlation, and thus, can be viewed as constraints in the system optimisation

  11. Entropy-based Probabilistic Fatigue Damage Prognosis and Algorithmic Performance Comparison

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, a maximum entropy-based general framework for probabilistic fatigue damage prognosis is investigated. The proposed methodology is based on an...

  12. Entropy-based probabilistic fatigue damage prognosis and algorithmic performance comparison

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, a maximum entropy-based general framework for probabilistic fatigue damage prognosis is investigated. The proposed methodology is based on an...

  13. Maximum entropy algorithm and its implementation for the neutral beam profile measurement

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Wook; Cho, Gyu Seong [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of); Cho, Yong Sub [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A tomography algorithm to maximize the entropy of an image using the Lagrangian multiplier technique and the conjugate gradient method has been designed for the measurement of the 2D spatial distribution of intense neutral beams of the KSTAR NBI (Korea Superconducting Tokamak Advanced Research Neutral Beam Injector), which is now being designed. A possible detection system was assumed and a numerical simulation was implemented to test the reconstruction quality of given beam profiles. This algorithm has good applicability for sparse projection data and thus can be used for neutral beam tomography. 8 refs., 3 figs. (Author)
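
    To make the idea concrete, here is a minimal sketch of maximum-entropy reconstruction from sparse projections, solved with a conjugate-gradient optimizer on a penalised objective. The toy image, the random ray matrix and the regularisation weight are hypothetical stand-ins; they do not represent the KSTAR NBI detection geometry or the authors' exact Lagrangian formulation.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Toy 8x8 "beam profile" and a sparse projection geometry (16 random ray sums).
    true_img = np.outer(np.hanning(8), np.hanning(8)).ravel()
    A = rng.integers(0, 2, size=(16, true_img.size)).astype(float)   # hypothetical ray matrix
    b = A @ true_img                                                 # noiseless projections

    def objective(x, mu=1e-3):
        f = np.exp(x)                                 # positivity by construction
        entropy = -np.sum(f * np.log(f + 1e-12))      # image entropy
        residual = A @ f - b
        return residual @ residual - mu * entropy     # penalise data misfit, reward entropy

    x0 = np.zeros(true_img.size)                      # start from a flat image
    res = minimize(objective, x0, method="CG")        # conjugate-gradient solver
    recon = np.exp(res.x).reshape(8, 8)
    print("max reconstruction error:", np.abs(recon.ravel() - true_img).max())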

  14. Maximum entropy algorithm and its implementation for the neutral beam profile measurement

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Wook; Cho, Gyu Seong [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of); Cho, Yong Sub [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-12-31

    A tomography algorithm to maximize the entropy of an image using the Lagrangian multiplier technique and the conjugate gradient method has been designed for the measurement of the 2D spatial distribution of intense neutral beams of the KSTAR NBI (Korea Superconducting Tokamak Advanced Research Neutral Beam Injector), which is now being designed. A possible detection system was assumed and a numerical simulation was implemented to test the reconstruction quality of given beam profiles. This algorithm has good applicability for sparse projection data and thus can be used for neutral beam tomography. 8 refs., 3 figs. (Author)

  15. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    Science.gov (United States)

    Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S

    2016-01-01

    Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, an easily implementable calcium time series analysis method that represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our and other commonly used calcium analysis methods on a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well-described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
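
    The sketch below shows one plausible reading of such a measure: discretise a trace into amplitude states, fit a first-order Markov transition matrix, and take the entropy of the transitions weighted by state occupancy. The state count and the synthetic traces are assumptions for illustration; this is not the authors' exact estimator.

    import numpy as np

    def markovian_entropy(trace, n_states=4):
        """Entropy rate (bits/step) of a first-order Markov model fitted to a 1-D trace."""
        # Discretise the signal into equal-occupancy amplitude states.
        edges = np.quantile(trace, np.linspace(0, 1, n_states + 1)[1:-1])
        states = np.digitize(trace, edges)
        # Count transitions and normalise rows into conditional probabilities.
        counts = np.zeros((n_states, n_states))
        for a, b in zip(states[:-1], states[1:]):
            counts[a, b] += 1
        row_sums = counts.sum(axis=1, keepdims=True)
        P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
        occupancy = counts.sum(axis=1) / counts.sum()
        with np.errstate(divide="ignore", invalid="ignore"):
            logP = np.where(P > 0, np.log2(P), 0.0)
        return float(-np.sum(occupancy[:, None] * P * logP))

    rng = np.random.default_rng(1)
    irregular = rng.normal(size=2000)                       # irregular, noise-like activity
    regular = np.sin(np.linspace(0, 40 * np.pi, 2000))      # oscillatory activity
    print(markovian_entropy(irregular), markovian_entropy(regular))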

  16. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    Directory of Open Access Journals (Sweden)

    John P Marken

    Full Text Available Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, an easily implementable calcium time series analysis method that represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our and other commonly used calcium analysis methods on a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well-described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.

  17. Multi-Scale Entropy Analysis as a Method for Time-Series Analysis of Climate Data

    Directory of Open Access Journals (Sweden)

    Heiko Balzter

    2015-03-01

    Full Text Available Evidence is mounting that the temporal dynamics of the climate system are changing at the same time as the average global temperature is increasing due to multiple climate forcings. A large number of extreme weather events such as prolonged cold spells, heatwaves, droughts and floods have been recorded around the world in the past 10 years. Such changes in the temporal scaling behaviour of climate time-series data can be difficult to detect. While there are easy and direct ways of analysing climate data by calculating the means and variances for different levels of temporal aggregation, these methods can miss more subtle changes in their dynamics. This paper describes multi-scale entropy (MSE analysis as a tool to study climate time-series data and to identify temporal scales of variability and their change over time in climate time-series. MSE estimates the sample entropy of the time-series after coarse-graining at different temporal scales. An application of MSE to Central European, variance-adjusted, mean monthly air temperature anomalies (CRUTEM4v is provided. The results show that the temporal scales of the current climate (1960–2014 are different from the long-term average (1850–1960. For temporal scale factors longer than 12 months, the sample entropy increased markedly compared to the long-term record. Such an increase can be explained by systems theory with greater complexity in the regional temperature data. From 1961 the patterns of monthly air temperatures are less regular at time-scales greater than 12 months than in the earlier time period. This finding suggests that, at these inter-annual time scales, the temperature variability has become less predictable than in the past. It is possible that climate system feedbacks are expressed in altered temporal scales of the European temperature time-series data. A comparison with the variance and Shannon entropy shows that MSE analysis can provide additional information on the
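
    A minimal sketch of the two ingredients named above, coarse-graining and sample entropy, is given below. The embedding dimension, tolerance and the synthetic anomaly series are assumptions for illustration; the paper's analysis uses the CRUTEM4v record and may differ in detail.

    import numpy as np

    def sample_entropy(x, m=2, r_frac=0.2):
        """Sample entropy SampEn(m, r) of a 1-D series (r as a fraction of its std)."""
        x = np.asarray(x, dtype=float)
        r = r_frac * x.std()
        def count_matches(length):
            templates = np.array([x[i:i + length] for i in range(len(x) - length)])
            d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
            return (np.sum(d <= r) - len(templates)) / 2     # matched pairs, excluding self
        B, A = count_matches(m), count_matches(m + 1)
        return np.inf if A == 0 or B == 0 else -np.log(A / B)

    def multiscale_entropy(x, max_scale=10):
        """Coarse-grain the series at scales 1..max_scale and return SampEn at each scale."""
        out = []
        for tau in range(1, max_scale + 1):
            n = (len(x) // tau) * tau
            coarse = np.asarray(x[:n]).reshape(-1, tau).mean(axis=1)
            out.append(sample_entropy(coarse))
        return out

    rng = np.random.default_rng(2)
    monthly_anomaly = rng.normal(size=600)          # synthetic stand-in for monthly anomalies
    print(multiscale_entropy(monthly_anomaly, max_scale=6))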

  18. Pitfalls of Exergy Analysis

    Science.gov (United States)

    Vágner, Petr; Pavelka, Michal; Maršík, František

    2017-04-01

    The well-known Gouy-Stodola theorem states that a device produces maximum useful power when working reversibly, that is with no entropy production inside the device. This statement then leads to a method of thermodynamic optimization based on entropy production minimization. Exergy destruction (difference between exergy of fuel and exhausts) is also given by entropy production inside the device. Therefore, assessing efficiency of a device by exergy analysis is also based on the Gouy-Stodola theorem. However, assumptions that had led to the Gouy-Stodola theorem are not satisfied in several optimization scenarios, e.g. non-isothermal steady-state fuel cells, where both entropy production minimization and exergy analysis should be used with caution. We demonstrate, using non-equilibrium thermodynamics, a few cases where entropy production minimization and exergy analysis should not be applied.

  19. Logarithmic black hole entropy corrections and holographic Renyi entropy

    Energy Technology Data Exchange (ETDEWEB)

    Mahapatra, Subhash [The Institute of Mathematical Sciences, Chennai (India); KU Leuven - KULAK, Department of Physics, Kortrijk (Belgium)

    2018-01-15

    The entanglement and Renyi entropies for spherical entangling surfaces in CFTs with gravity duals can be explicitly calculated by mapping these entropies first to the thermal entropy on hyperbolic space and then, using the AdS/CFT correspondence, to the Wald entropy of topological black holes. Here we extend this idea by taking into account corrections to the Wald entropy. Using the method based on horizon symmetries and the asymptotic Cardy formula, we calculate corrections to the Wald entropy and find that these corrections are proportional to the logarithm of the area of the horizon. With the corrected expression for the entropy of the black hole, we then find corrections to the Renyi entropies. We calculate these corrections for both Einstein and Gauss-Bonnet gravity duals. Corrections with logarithmic dependence on the area of the entangling surface naturally occur at the order G_D^0. The entropic c-function and the inequalities of the Renyi entropy are also satisfied even with the correction terms. (orig.)

  20. Logarithmic black hole entropy corrections and holographic Renyi entropy

    International Nuclear Information System (INIS)

    Mahapatra, Subhash

    2018-01-01

    The entanglement and Renyi entropies for spherical entangling surfaces in CFTs with gravity duals can be explicitly calculated by mapping these entropies first to the thermal entropy on hyperbolic space and then, using the AdS/CFT correspondence, to the Wald entropy of topological black holes. Here we extend this idea by taking into account corrections to the Wald entropy. Using the method based on horizon symmetries and the asymptotic Cardy formula, we calculate corrections to the Wald entropy and find that these corrections are proportional to the logarithm of the area of the horizon. With the corrected expression for the entropy of the black hole, we then find corrections to the Renyi entropies. We calculate these corrections for both Einstein and Gauss-Bonnet gravity duals. Corrections with logarithmic dependence on the area of the entangling surface naturally occur at the order G_D^0. The entropic c-function and the inequalities of the Renyi entropy are also satisfied even with the correction terms. (orig.)

  1. An entropy approach to size and variance heterogeneity

    NARCIS (Netherlands)

    Balasubramanyan, L.; Stefanou, S.E.; Stokes, J.R.

    2012-01-01

    In this paper, we investigate the effect of bank size differences on cost efficiency heterogeneity using a heteroskedastic stochastic frontier model. This model is implemented by using an information theoretic maximum entropy approach. We explicitly model both bank size and variance heterogeneity

  2. Entropy and cosmology.

    Science.gov (United States)

    Zucker, M. H.

    This paper is a critical analysis and reassessment of entropic functioning as it applies to the question of whether the ultimate fate of the universe will be determined in the future to be "open" (expanding forever to expire in a big chill), "closed" (collapsing to a big crunch), or "flat" (balanced forever between the two). The second law of thermodynamics declares that entropy can only increase and that this principle extends, inevitably, to the universe as a whole. This paper takes the position that this extension is an unwarranted projection based neither on experience nor fact - an extrapolation that ignores the powerful effect of a gravitational force acting within a closed system. Since it was originally presented by Clausius, the thermodynamic concept of entropy has been redefined in terms of "order" and "disorder" - order being equated with a low degree of entropy and disorder with a high degree. This revised terminology, more subjective than precise, has generated considerable confusion in cosmology in several critical instances. For example, the chaotic fireball of the big bang, interpreted by Stephen Hawking as a state of disorder (high entropy), is infinitely hot and, thermally, represents zero entropy (order). Hawking, apparently focusing on the disorderly "chaotic" aspect, equated it with a high degree of entropy - overlooking the fact that the universe is a thermodynamic system and that the key factor in evaluating the big-bang phenomenon is the infinitely high temperature of the early universe, which can only be equated with zero entropy. This analysis resolves this confusion and reestablishes entropy as a cosmological function integrally linked to temperature. The paper goes on to show that, while all subsystems contained within the universe require external sources of energization to have their temperatures raised, this requirement does not apply to the universe as a whole. The universe is the only system that, by itself, can raise its own

  3. Prediction Model of Collapse Risk Based on Information Entropy and Distance Discriminant Analysis Method

    Directory of Open Access Journals (Sweden)

    Hujun He

    2017-01-01

    Full Text Available The prediction and risk classification of collapse is an important issue in the process of highway construction in mountainous regions. Based on the principles of information entropy and Mahalanobis distance discriminant analysis, we have produced a collapse hazard prediction model. We used the entropy measure method to reduce the influence indexes of the collapse activity and extracted the nine main indexes affecting collapse activity as the discriminant factors of the distance discriminant analysis model (i.e., slope shape, aspect, gradient, and height, along with exposure of the structural face, stratum lithology, relationship between weakness face and free face, vegetation cover rate, and degree of rock weathering). We employ post-earthquake collapse data relating to construction of the Yingxiu-Wolong highway, Hanchuan County, China, as training samples for analysis. The results were analyzed using the back substitution estimation method, showing high accuracy and no errors, and were the same as the prediction results of the uncertainty measure method. Results show that the classification model based on information entropy and distance discriminant analysis achieves the purpose of index optimization and has excellent performance, high prediction accuracy, and a zero false-positive rate. The model can be used as a tool for future evaluation of collapse risk.
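
    The sketch below illustrates the two building blocks named above in a generic form: entropy-based weighting of candidate indexes and nearest-centroid classification with the Mahalanobis distance. The feature matrix, class labels and number of indexes are synthetic placeholders, not the Yingxiu-Wolong survey data, and the authors' exact discriminant may differ.

    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.random((40, 9))                 # 40 hypothetical slopes x 9 candidate indexes
    y = rng.integers(0, 2, size=40)         # hypothetical collapse-risk classes (0/1)

    # Entropy weight of each index: low-entropy (more informative) columns get more weight.
    P = X / X.sum(axis=0)
    H = -np.sum(P * np.log(P + 1e-12), axis=0) / np.log(len(X))
    weights = (1 - H) / np.sum(1 - H)

    # Mahalanobis-distance discriminant: assign each sample to the nearest class centroid.
    Xw = X * weights
    cov_inv = np.linalg.pinv(np.cov(Xw, rowvar=False))
    def mahalanobis(v, centroid):
        d = v - centroid
        return float(d @ cov_inv @ d)

    centroids = {c: Xw[y == c].mean(axis=0) for c in np.unique(y)}
    pred = np.array([min(centroids, key=lambda c: mahalanobis(row, centroids[c])) for row in Xw])
    print("back-substitution accuracy:", float((pred == y).mean()))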

  4. Application of maximum entropy to statistical inference for inversion of data from a single track segment.

    Science.gov (United States)

    Stotts, Steven A; Koch, Robert A

    2017-08-01

    In this paper an approach is presented to estimate the constraint required to apply maximum entropy (ME) for statistical inference with underwater acoustic data from a single track segment. Previous algorithms for estimating the ME constraint require multiple source track segments to determine the constraint. The approach is relevant for addressing model mismatch effects, i.e., inaccuracies in parameter values determined from inversions because the propagation model does not account for all acoustic processes that contribute to the measured data. One effect of model mismatch is that the lowest cost inversion solution may be well outside a relatively well-known parameter value's uncertainty interval (prior), e.g., source speed from track reconstruction or towed source levels. The approach requires, for some particular parameter value, the ME constraint to produce an inferred uncertainty interval that encompasses the prior. Motivating this approach is the hypothesis that the proposed constraint determination procedure would produce a posterior probability density that accounts for the effect of model mismatch on inferred values of other inversion parameters for which the priors might be quite broad. Applications to both measured and simulated data are presented for model mismatch that produces minimum cost solutions either inside or outside some priors.

  5. Entropy and Entropy Production: Old Misconceptions and New Breakthroughs

    Directory of Open Access Journals (Sweden)

    Leonid M. Martyushev

    2013-03-01

    Full Text Available Persistent misconceptions existing for dozens of years and influencing progress in various fields of science are sometimes encountered in the scientific and, especially, the popular-science literature. The present brief review deals with two such interrelated misconceptions (misunderstandings). The first misunderstanding: entropy is a measure of disorder. This is an old and very common opinion. The second misconception is that entropy production is minimized in the evolution of nonequilibrium systems. However, as it has recently become clear, evolution (progress) in Nature demonstrates the opposite, i.e., maximization of the entropy production. The principal questions connected with this maximization are considered herein. The two misconceptions mentioned above can lead to an apparent contradiction between the conclusions of modern thermodynamics and the basic conceptions of evolution existing in biology. In this regard, the analysis of these issues seems extremely important and timely as it contributes to a deeper understanding of the laws of development of the surrounding World and the place of humans in it.

  6. Maximizing Entropy of Pickard Random Fields for 2x2 Binary Constraints

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren

    2014-01-01

    This paper considers the problem of maximizing the entropy of two-dimensional (2D) Pickard Random Fields (PRF) subject to constraints. We consider binary Pickard Random Fields, which provide a 2D causal finite context model, and use it to define stationary probabilities for 2x2 squares, thus allowing us to calculate the entropy of the field. All possible binary 2x2 constraints are considered and all constraints are categorized into groups according to their properties. For constraints which can be modeled by a PRF approach and with positive entropy, we characterize and provide statistics of the maximum PRF entropy. As examples, we consider the well known hard square constraint along with a few other constraints.

  7. Giant onsite electronic entropy enhances the performance of ceria for water splitting.

    Science.gov (United States)

    Naghavi, S Shahab; Emery, Antoine A; Hansen, Heine A; Zhou, Fei; Ozolins, Vidvuds; Wolverton, Chris

    2017-08-18

    Previous studies have shown that a large solid-state entropy of reduction increases the thermodynamic efficiency of metal oxides, such as ceria, for two-step thermochemical water splitting cycles. In this context, the configurational entropy arising from oxygen off-stoichiometry in the oxide has been the focus of most previous work. Here we report a different source of entropy, the onsite electronic configurational entropy, arising from coupling between orbital and spin angular momenta in lanthanide f orbitals. We find that onsite electronic configurational entropy is sizable in all lanthanides, and reaches a maximum value of ≈4.7 k_B per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has a high electronic entropy and thus could also be a potential candidate for solar thermochemical reactions. Solid-state entropy of reduction increases the thermodynamic efficiency of ceria for two-step thermochemical water splitting. Here, the authors report a large and different source of entropy, the onsite electronic configurational entropy arising from coupling between orbital and spin angular momenta in f orbitals.

  8. Entropy Generation Analysis and Performance Evaluation of Turbulent Forced Convective Heat Transfer to Nanofluids

    Directory of Open Access Journals (Sweden)

    Yu Ji

    2017-03-01

    Full Text Available The entropy generation analysis of fully turbulent convective heat transfer to nanofluids in a circular tube is investigated numerically using the Reynolds Averaged Navier–Stokes (RANS) model. The nanofluids with particle concentrations of 0%, 1%, 2%, 4% and 6% are treated as single phases with effective properties. A uniform heat flux is enforced at the tube wall. To confirm the validity of the numerical approach, the results have been compared with empirical correlations and analytical formulae. The self-similarity profiles of local entropy generation are also studied, in which the peak values of entropy generation by direct dissipation, turbulent dissipation, mean temperature gradients and fluctuating temperature gradients for different Reynolds numbers as well as different particle concentrations are observed. In addition, the effects of Reynolds number, volume fraction of nanoparticles and heat flux on total entropy generation and Bejan number are discussed. In the results, intersection points of total entropy generation for water and the four nanofluids are observed, with the entropy generation decreasing before the intersection and increasing after the intersection as the particle concentration increases. Finally, by the definition of Ep, which combines the first and second laws of thermodynamics and is used to evaluate the real performance of heat transfer processes, the optimal Reynolds number Reop corresponding to the best performance and the advisable Reynolds number Read providing the appropriate Reynolds number range for nanofluids in convective heat transfer can be determined.

  9. An entropy-based analysis of lane changing behavior: An interactive approach.

    Science.gov (United States)

    Kosun, Caglar; Ozdemir, Serhan

    2017-05-19

    As a novelty, this article proposes the nonadditive entropy framework for the description of driver behaviors during lane changing. The authors also state that this entropy framework governs the lane changing behavior in traffic flow in accordance with the long-range vehicular interactions and traffic safety. The nonadditive entropy framework is the new generalized theory of thermostatistical mechanics. Vehicular interactions during lane changing are considered within this framework. The interactive approach for the lane changing behavior of the drivers is presented in the traffic flow scenarios presented in the article. According to the traffic flow scenarios, 4 categories of traffic flow and driver behaviors are obtained. Through the scenarios, comparative analyses of nonadditive and additive entropy domains are also provided. Two quadrants of the categories belong to the nonadditive entropy; the rest are involved in the additive entropy domain. Driving behaviors are extracted and the scenarios depict that nonadditivity matches safe driving well, whereas additivity corresponds to unsafe driving. Furthermore, the cooperative traffic system is considered in nonadditivity where the long-range interactions are present. However, the uncooperative traffic system falls into the additivity domain. The analyses also state that there would be possible traffic flow transitions among the quadrants. This article shows that lane changing behavior could be generalized as nonadditive, with additivity as a special case, based on the given traffic conditions. The nearest and close neighbor models are well within the conventional additive entropy framework. In this article, both the long-range vehicular interactions and safe driving behavior in traffic are handled in the nonadditive entropy domain. It is also inferred that the Tsallis entropy region would correspond to mandatory lane changing behavior, whereas additive and either the extensive or nonextensive entropy region would
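
    For readers unfamiliar with the nonadditive framework invoked here, the sketch below contrasts the additive Shannon entropy with the Tsallis entropy S_q = (1 - Σ_i p_i^q)/(q - 1), which recovers the Shannon form as q → 1. The probability vector and the value of q are arbitrary illustrations, not quantities estimated from traffic data.

    import numpy as np

    def shannon_entropy(p):
        p = np.asarray(p, dtype=float)
        return float(-np.sum(p * np.log(p + 1e-12)))

    def tsallis_entropy(p, q=1.5):
        """Nonadditive Tsallis entropy; approaches the Shannon entropy as q -> 1."""
        p = np.asarray(p, dtype=float)
        return float((1.0 - np.sum(p ** q)) / (q - 1.0))

    # Arbitrary distribution over, say, four lane-changing manoeuvre outcomes.
    p = np.array([0.5, 0.3, 0.15, 0.05])
    print(shannon_entropy(p), tsallis_entropy(p, q=1.5), tsallis_entropy(p, q=1.001))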

  10. Path length entropy analysis of diastolic heart sounds.

    Science.gov (United States)

    Griffel, Benjamin; Zia, Mohammad K; Fridman, Vladamir; Saponieri, Cesare; Semmlow, John L

    2013-09-01

    Early detection of coronary artery disease (CAD) using the acoustic approach, a noninvasive and cost-effective method, would greatly improve the outcome of CAD patients. To detect CAD, we analyze diastolic sounds for possible CAD murmurs. We observed diastolic sounds to exhibit 1/f structure and developed a new method, path length entropy (PLE) and a scaled version (SPLE), to characterize this structure to improve CAD detection. We compare SPLE results to Hurst exponent, Sample entropy and Multiscale entropy for distinguishing between normal and CAD patients. SPLE achieved a sensitivity-specificity of 80%-81%, the best of the tested methods. However, PLE and SPLE are not sufficient to prove nonlinearity, and evaluation using surrogate data suggests that our cardiovascular sound recordings do not contain significant nonlinear properties. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Assessing suitable area for Acacia dealbata Mill. in the Ceira River Basin (Central Portugal) based on maximum entropy modelling approach

    Directory of Open Access Journals (Sweden)

    Jorge Pereira

    2015-12-01

    Full Text Available Biological invasion by exotic organisms has become a key issue, a concern associated with the deep impacts on several domains described as resulting from such processes. A better understanding of the processes, the identification of more susceptible areas, and the definition of preventive or mitigation measures are identified as critical for the purpose of reducing the associated impacts. The use of species distribution modeling might help to identify areas that are more susceptible to invasion. This paper aims to present preliminary results on assessing the susceptibility to invasion by the exotic species Acacia dealbata Mill. in the Ceira river basin. The results are based on the maximum entropy modeling approach, considered one of the correlative modelling techniques with better predictive performance. Models whose validation is based on independent data sets present better performance, as evaluated with the AUC of the ROC accuracy measure.

  12. Entropy Generation and Human Aging: Lifespan Entropy and Effect of Physical Activity Level

    Directory of Open Access Journals (Sweden)

    Kalyan Annamalai

    2008-06-01

    Full Text Available The first and second laws of thermodynamics were applied to biochemical reactions typical of human metabolism. An open-system model was used for a human body. Energy conservation, availability and entropy balances were performed to obtain the entropy generated for the main food components. Quantitative results for entropy generation were obtained as a function of age using the databases from the U.S. Food and Nutrition Board (FNB) and Centers for Disease Control and Prevention (CDC), which provide energy requirements and food intake composition as a function of age, weight and stature. Numerical integration was performed through the human lifespan for different levels of physical activity. Results were presented and analyzed. Entropy generated over the lifespan of average individuals (natural death) was found to be 11,404 kJ/K per kg of body mass, with a rate of generation three times higher in infants than in the elderly. The entropy generated predicts a lifespan of 73.78 and 81.61 years for the average U.S. male and female individuals respectively, values that closely match the average lifespans from statistics (74.63 and 80.36 years). From the analysis of the effect of different activity levels, it is shown that entropy generated increases with physical activity, suggesting that exercise should be kept to a “healthy minimum” if entropy generation is to be minimized.

  13. Permutation Entropy: New Ideas and Challenges

    Directory of Open Access Journals (Sweden)

    Karsten Keller

    2017-03-01

    Full Text Available Over recent years, some new variants of Permutation entropy have been introduced and applied to EEG analysis, including a conditional variant and variants using some additional metric information or being based on entropies that are different from the Shannon entropy. In some situations, it is not completely clear what kind of information the new measures and their algorithmic implementations provide. We discuss the new developments and illustrate them for EEG data.
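
    As background for the variants discussed above, the sketch below computes the basic Shannon-based permutation entropy of order m; the conditional and metric-weighted variants mentioned in the review modify this core recipe. The embedding dimension, delay and test signals are arbitrary choices for illustration.

    import numpy as np
    from collections import Counter
    from math import factorial, log

    def permutation_entropy(x, m=3, delay=1, normalise=True):
        """Shannon permutation entropy of a 1-D series with embedding dimension m."""
        x = np.asarray(x, dtype=float)
        patterns = Counter(
            tuple(np.argsort(x[i:i + m * delay:delay]))      # ordinal pattern of each window
            for i in range(len(x) - (m - 1) * delay)
        )
        probs = np.array(list(patterns.values()), dtype=float)
        probs /= probs.sum()
        h = -np.sum(probs * np.log(probs))
        return float(h / log(factorial(m))) if normalise else float(h)

    rng = np.random.default_rng(4)
    print(permutation_entropy(rng.normal(size=5000)))              # close to 1 for white noise
    print(permutation_entropy(np.sin(np.linspace(0, 100, 5000))))  # much lower for a sine wave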

  14. ENTROPY FUNCTIONAL FOR CONTINUOUS SYSTEMS OF FINITE ENTROPY

    Institute of Scientific and Technical Information of China (English)

    M. Rahimi; A. Riazi

    2012-01-01

    In this article, we introduce the concept of entropy functional for continuous systems on compact metric spaces, and prove some of its properties. We also extract the Kolmogorov entropy from the entropy functional.

  15. Entropy of the Mixture of Sources and Entropy Dimension

    OpenAIRE

    Smieja, Marek; Tabor, Jacek

    2011-01-01

    We investigate the problem of the entropy of a mixture of sources. We give an estimate of the entropy and the entropy dimension of a convex combination of measures. The proof is based on our alternative definition of entropy, formulated in terms of measures instead of partitions.

  16. Study on spectral entropy of two-phase flow density wave instability

    International Nuclear Information System (INIS)

    Zhang Zuoyi

    1992-05-01

    Using mathematical proofs, spectral entropy calculations for simple examples, and a practical two-phase flow system, it has been proved that under the same stochastic input the output spectral entropy of a stable linear system is at a maximum, while for an unstable linear system the entropy is at a relatively lower level. Because the spectral entropy describes the output uncertainty of a system and the second law of thermodynamics rules the direction of natural tendency, a spontaneous process can develop only in the direction of increasing uncertainty, and the opposite is impossible. It seems that the physical mechanism of the stability of a system can be explained as follows: any deviation of a stable system from its original state will reduce the spectral entropy and violate the natural tendency, so that the system will return to its original state. On the contrary, a deviation of an unstable system from its original state will increase the spectral entropy, which will enhance the deviation, and the system will move further away from its original state.
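
    The sketch below illustrates the quantity being compared: the Shannon entropy of a signal's normalised power spectrum. A broadband, noise-like output (the stable case above) yields a higher spectral entropy than an output dominated by a narrow-band oscillation (the unstable case). The synthetic signals are placeholders, not outputs of the two-phase flow model.

    import numpy as np

    def spectral_entropy(x):
        """Shannon entropy of the normalised power spectral density of a 1-D signal."""
        psd = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
        p = psd / psd.sum()
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)))

    rng = np.random.default_rng(5)
    t = np.linspace(0, 10, 4096)
    stable_output = rng.normal(size=t.size)                                        # broadband response
    unstable_output = np.sin(2 * np.pi * 3 * t) + 0.05 * rng.normal(size=t.size)   # dominant oscillation
    print(spectral_entropy(stable_output), spectral_entropy(unstable_output))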

  17. A Modified Entropy Generation Number for Heat Exchangers

    Institute of Scientific and Technical Information of China (English)

    1996-01-01

    This paper demonstrates the difference between the entropy generation number method proposed by Bejan and the method of entropy generation per unit amount of heat transferred in analyzing the thermodynamic performance of heat exchangers, and points out the reason for the difference. A modified entropy generation number for evaluating the irreversibility of heat exchangers is proposed which is consistent with the entropy generation per unit amount of heat transferred in entropy generation analysis. The entropy generated by friction is also investigated. Results show that when the entropy generated by friction in heat exchangers is taken into account, there is a minimum total entropy generation number while the NTU and the ratio of heat capacity rates vary. The existence of this minimum is the prerequisite of heat exchanger optimization.

  18. Spatiotemporal modeling of PM2.5 concentrations at the national scale combining land use regression and Bayesian maximum entropy in China.

    Science.gov (United States)

    Chen, Li; Gao, Shuang; Zhang, Hui; Sun, Yanling; Ma, Zhenxing; Vedal, Sverre; Mao, Jian; Bai, Zhipeng

    2018-05-03

    Hybrid models combining land use regression (LUR) with Bayesian Maximum Entropy (BME) interpolation of the LUR space-time residuals were developed to estimate concentrations of particulate matter with aerodynamic diameter ≤2.5 μm (PM2.5) on a national scale in China. This hybrid model could potentially provide more valid predictions than a commonly-used LUR model. The LUR/BME model had good performance characteristics, with R^2 = 0.82 and a root mean square error (RMSE) of 4.6 μg/m^3. Prediction errors of the LUR/BME model were reduced by incorporating soft data accounting for data uncertainty, with the R^2 increasing by 6%. The performance of LUR/BME is better than that of OK/BME. The LUR/BME model is the most accurate fine spatial scale PM2.5 model developed to date for China. Copyright © 2018. Published by Elsevier Ltd.

  19. Multivariate Multi-Scale Permutation Entropy for Complexity Analysis of Alzheimer’s Disease EEG

    Directory of Open Access Journals (Sweden)

    Isabella Palamara

    2012-07-01

    Full Text Available An original multivariate multi-scale methodology for assessing the complexity of physiological signals is proposed. The technique is able to incorporate the simultaneous analysis of multi-channel data as a unique block within a multi-scale framework. The basic complexity measure is obtained by using Permutation Entropy, a methodology for time series processing based on ordinal analysis. Permutation Entropy is conceptually simple, structurally robust to noise and artifacts, and computationally very fast, which is relevant for designing portable diagnostics. Since time series derived from biological systems show structures on multiple spatial-temporal scales, the proposed technique can be useful for other types of biomedical signal analysis. In this work, the possibility of distinguishing among the brain states of Alzheimer’s disease patients, Mild Cognitive Impairment subjects and normal healthy elderly subjects is tested on a real, although quite limited, experimental database.

  20. Use of mutual information to decrease entropy: Implications for the second law of thermodynamics

    International Nuclear Information System (INIS)

    Lloyd, S.

    1989-01-01

    Several theorems on the mechanics of gathering information are proved, and the possibility of violating the second law of thermodynamics by obtaining information is discussed in light of these theorems. Maxwell's demon can lower the entropy of his surroundings by an amount equal to the difference between the maximum entropy of his recording device and its initial entropy, without generating a compensating entropy increase. A demon with human-scale recording devices can reduce the entropy of a gas by a negligible amount only, but the proof of the demon's impracticability leaves open the possibility that systems highly correlated with their environment can reduce the environment's entropy by a substantial amount without increasing entropy elsewhere. In the event that a boundary condition for the universe requires it to be in a state of low entropy when small, the correlations induced between different particle modes during the expansion phase allow the modes to behave like Maxwell's demons during the contracting phase, reducing the entropy of the universe to a low value

  1. Entanglement entropy and differential entropy for massive flavors

    International Nuclear Information System (INIS)

    Jones, Peter A.R.; Taylor, Marika

    2015-01-01

    In this paper we compute the holographic entanglement entropy for massive flavors in the D3-D7 system, for arbitrary mass and various entangling region geometries. We show that the universal terms in the entanglement entropy exactly match those computed in the dual theory using conformal perturbation theory. We derive holographically the universal terms in the entanglement entropy for a CFT perturbed by a relevant operator, up to second order in the coupling; our results are valid for any entangling region geometry. We present a new method for computing the entanglement entropy of any top-down brane probe system using Kaluza-Klein holography and illustrate our results with massive flavors at finite density. Finally we discuss the differential entropy for brane probe systems, emphasising that the differential entropy captures only the effective lower-dimensional Einstein metric rather than the ten-dimensional geometry.

  2. Entropy of localized states and black hole evaporation

    International Nuclear Information System (INIS)

    Olum, K.D.

    1997-01-01

    We call a state 'vacuum bounded' if every measurement performed outside a specified interior region gives the same result as in the vacuum. We compute the maximum entropy of a vacuum-bounded state with a given energy for a one-dimensional model, with the aid of numerical calculations on a lattice. The maximum entropy is larger than it would be for rigid wall boundary conditions by an amount δS, which for large energies is ≲ (1/6) ln(L_in T), where L_in is the length of the interior region. Assuming that the state resulting from the evaporation of a black hole is similar to a vacuum-bounded state, and that the similarity between vacuum-bounded and rigid-wall-bounded problems extends from 1 to 3 dimensions, we apply these results to the black hole information paradox. Under these assumptions we conclude that large amounts of information cannot be emitted in the final explosion of a black hole. copyright 1997 The American Physical Society

  3. Vector entropy imaging theory with application to computerized tomography

    International Nuclear Information System (INIS)

    Wang Yuanmei; Cheng Jianping; Heng, Pheng Ann

    2002-01-01

    Medical imaging theory for x-ray CT and PET is based on image reconstruction from projections. In this paper a novel vector entropy imaging theory under the framework of multiple criteria decision making is presented. We also study the most frequently used image reconstruction methods, namely, least square, maximum entropy, and filtered back-projection methods, under the framework of single performance criterion optimization. Finally, we introduce some of the results obtained by various reconstruction algorithms using computer-generated noisy projection data from the Hoffman phantom and real CT scanner data. Comparison of the reconstructed images indicates that the vector entropy method gives the best results in terms of error (difference between the original phantom data and the reconstruction), smoothness (suppression of noise) and grey value resolution, and is free of ghost images. (author)

  4. Analysis of crack propagation in concrete structures with structural information entropy

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The propagation of cracks in concrete structures causes energy dissipation and release, and also causes energy redistribution in the structures. Entropy can characterize the energy redistribution. To investigate the relation between the propagation of cracks and the entropy in concrete structures, cracked concrete structures are treated as dissipative structures. Structural information entropy is defined for concrete structures. A compact tension test is conducted. Meanwhile, numerical simulations are also carried out. Both the test and numerical simulation results show that the structural information entropy in the structures can characterize the propagation of cracks in concrete structures.

  5. [Prediction of potential geographic distribution of Lyme disease in Qinghai province with Maximum Entropy model].

    Science.gov (United States)

    Zhang, Lin; Hou, Xuexia; Liu, Huixin; Liu, Wei; Wan, Kanglin; Hao, Qin

    2016-01-01

    To predict the potential geographic distribution of Lyme disease in Qinghai by using the Maximum Entropy model (MaxEnt). The sero-diagnosis data of Lyme disease in 6 counties (Huzhu, Zeku, Tongde, Datong, Qilian and Xunhua) and the environmental and anthropogenic data including altitude, human footprint, normalized difference vegetation index (NDVI) and temperature in Qinghai province since 1990 were collected. By using the data of Huzhu, Zeku and Tongde, the prediction of the potential distribution of Lyme disease in Qinghai was conducted with MaxEnt. The prediction results were compared with the human sero-prevalence of Lyme disease in Datong, Qilian and Xunhua counties in Qinghai. Three hot spots of Lyme disease were predicted in Qinghai, which were all in the eastern forest areas. Furthermore, the NDVI played the most important role in the model prediction, followed by human footprint. Datong, Qilian and Xunhua counties were all in eastern Qinghai. Xunhua was in hot spot area Ⅱ, Datong was close to the north of hot spot area Ⅲ, while Qilian, with the lowest sero-prevalence of Lyme disease, was not in the hot spot areas. The data were well modeled in MaxEnt (Area Under Curve = 0.980). The actual distribution of Lyme disease in Qinghai was consistent with the results of the model prediction. MaxEnt could be used in predicting the potential distribution patterns of Lyme disease. The distribution of vegetation and the range and intensity of human activity might be related to the Lyme disease distribution.

  6. Bubble Entropy: An Entropy Almost Free of Parameters.

    Science.gov (United States)

    Manis, George; Aktaruzzaman, Md; Sassi, Roberto

    2017-11-01

    Objective: A critical point in any definition of entropy is the selection of the parameters employed to obtain an estimate in practice. We propose a new definition of entropy aiming to reduce the significance of this selection. Methods: We call the new definition Bubble Entropy. Bubble Entropy is based on permutation entropy, where the vectors in the embedding space are ranked. We use the bubble sort algorithm for the ordering procedure and count instead the number of swaps performed for each vector. Doing so, we create a more coarse-grained distribution and then compute the entropy of this distribution. Results: Experimental results with both real and synthetic HRV signals showed that bubble entropy presents remarkable stability and exhibits increased descriptive and discriminating power compared to all other definitions, including the most popular ones. Conclusion: The definition proposed is almost free of parameters. The most common ones are the scale factor r and the embedding dimension m. In our definition, the scale factor is totally eliminated and the importance of m is significantly reduced. The proposed method presents increased stability and discriminating power. Significance: After the extensive use of some entropy measures in physiological signals, typical values for their parameters have been suggested, or at least widely used. However, the parameters are still there, application and dataset dependent, influencing the computed value and affecting the descriptive power. Reducing their significance or eliminating them alleviates the problem, decoupling the method from the data and the application, and eliminating subjective factors.
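
    A simplified sketch of the counting idea is given below: embed the series, count the bubble-sort swaps needed to order each embedded vector, and take the entropy of the resulting swap-count distribution. The paper's full definition combines such entropies at dimensions m and m+1 with a specific normalisation, which this illustration omits; the window length and the synthetic RR-like series are assumptions.

    import numpy as np
    from collections import Counter

    def swap_count(v):
        """Number of swaps bubble sort performs to order a vector (its inversion count)."""
        v = list(v)
        swaps = 0
        for i in range(len(v)):
            for j in range(len(v) - 1 - i):
                if v[j] > v[j + 1]:
                    v[j], v[j + 1] = v[j + 1], v[j]
                    swaps += 1
        return swaps

    def swap_entropy(x, m=10):
        """Shannon entropy of the bubble-sort swap counts over embedded vectors of length m."""
        counts = Counter(swap_count(x[i:i + m]) for i in range(len(x) - m + 1))
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        return float(-np.sum(p * np.log(p)))

    rng = np.random.default_rng(6)
    rr_like = np.cumsum(rng.normal(0, 5, size=800)) + 800   # crude stand-in for an RR series
    print(swap_entropy(rr_like, m=10))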

  7. Logarithmic black hole entropy corrections and holographic Rényi entropy

    Science.gov (United States)

    Mahapatra, Subhash

    2018-01-01

    The entanglement and Rényi entropies for spherical entangling surfaces in CFTs with gravity duals can be explicitly calculated by mapping these entropies first to the thermal entropy on hyperbolic space and then, using the AdS/CFT correspondence, to the Wald entropy of topological black holes. Here we extend this idea by taking into account corrections to the Wald entropy. Using the method based on horizon symmetries and the asymptotic Cardy formula, we calculate corrections to the Wald entropy and find that these corrections are proportional to the logarithm of the area of the horizon. With the corrected expression for the entropy of the black hole, we then find corrections to the Rényi entropies. We calculate these corrections for both Einstein and Gauss-Bonnet gravity duals. Corrections with logarithmic dependence on the area of the entangling surface naturally occur at the order G_D^0. The entropic c-function and the inequalities of the Rényi entropy are also satisfied even with the correction terms.

  8. Symplectic entropy

    International Nuclear Information System (INIS)

    De Nicola, Sergio; Fedele, Renato; Man'ko, Margarita A; Man'ko, Vladimir I

    2007-01-01

    The tomographic-probability description of quantum states is reviewed. The symplectic tomography of quantum states with continuous variables is studied. The symplectic entropy of the states with continuous variables is discussed and its relation to Shannon entropy and information is elucidated. The known entropic uncertainty relations of the probability distribution in position and momentum of a particle are extended and new uncertainty relations for symplectic entropy are obtained. The partial case of symplectic entropy, which is optical entropy of quantum states, is considered. The entropy associated to optical tomogram is shown to satisfy the new entropic uncertainty relation. The example of Gaussian states of harmonic oscillator is studied and the entropic uncertainty relations for optical tomograms of the Gaussian state are shown to minimize the uncertainty relation

  9. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    Science.gov (United States)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
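
    One common way to make a "total system entropy" criterion concrete, sketched below under the assumption of jointly Gaussian water quality variables, is to score a candidate subset of stations by the differential entropy H = (1/2) ln((2πe)^k |Σ|) of its covariance block. The covariance matrix and station count are hypothetical; the paper's hierarchical spatiotemporal Bayesian model and violation-entropy criteria are considerably richer.

    import numpy as np
    from itertools import combinations

    def gaussian_entropy(cov):
        """Differential entropy of a multivariate normal with covariance 'cov' (nats)."""
        k = cov.shape[0]
        return 0.5 * np.log(((2 * np.pi * np.e) ** k) * np.linalg.det(cov))

    rng = np.random.default_rng(7)
    A = rng.normal(size=(6, 6))
    full_cov = A @ A.T + 6 * np.eye(6)      # hypothetical covariance of 6 monitoring stations

    # Pick the 3-station subset that retains the most total system entropy.
    best = max(combinations(range(6), 3),
               key=lambda s: gaussian_entropy(full_cov[np.ix_(s, s)]))
    print("most informative stations:", best)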

  10. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    Science.gov (United States)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as a concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through the dissimilarity matrix based on modified cross-sample entropy; three-dimensional perceptual maps of the results are then provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, that is, multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta based cross-sample entropy and permutation based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. This implies that time series generated by the same model are more likely to show similar irregularity than others, and that differences between stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively, that is, Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.
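
    The sketch below shows the plumbing of such an analysis: build a pairwise dissimilarity matrix and embed it in three dimensions with multidimensional scaling on a precomputed dissimilarity. The placeholder dissimilarity used here is a simple correlation-based distance; in the paper it would be replaced by the Kronecker-delta or permutation cross-sample entropy, and the six synthetic series stand in for the 18 stock indices.

    import numpy as np
    from sklearn.manifold import MDS

    rng = np.random.default_rng(8)
    series = [rng.normal(size=500).cumsum() for _ in range(6)]   # 6 synthetic "index" series

    def dissimilarity(a, b):
        """Placeholder pairwise dissimilarity; the paper uses cross-sample-entropy variants."""
        return float(np.abs(np.corrcoef(np.diff(a), np.diff(b))[0, 1] - 1.0))

    n = len(series)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            D[i, j] = 0.0 if i == j else dissimilarity(series[i], series[j])

    # Embed the dissimilarity structure in 3-D, as in the paper's perceptual maps.
    coords = MDS(n_components=3, dissimilarity="precomputed", random_state=0).fit_transform(D)
    print(coords.shape)   # (6, 3)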

  11. An automatic classifier of emotions built from entropy of noise.

    Science.gov (United States)

    Ferreira, Jacqueline; Brás, Susana; Silva, Carlos F; Soares, Sandra C

    2017-04-01

    The electrocardiogram (ECG) signal has been widely used to study the physiological substrates of emotion. However, searching for better filtering techniques in order to obtain a signal with better quality and with the maximum relevant information remains an important issue for researchers in this field. Signal processing is largely performed for ECG analysis and interpretation, but this process can be susceptible to error in the delineation phase. In addition, it can lead to the loss of important information that is usually considered as noise and, consequently, discarded from the analysis. The goal of this study was to evaluate if the ECG noise allows for the classification of emotions, while using its entropy as an input in a decision tree classifier. We collected the ECG signal from 25 healthy participants while they were presented with videos eliciting negative (fear and disgust) and neutral emotions. The results indicated that the neutral condition showed a perfect identification (100%), whereas the classification of negative emotions indicated good identification performances (60% of sensitivity and 80% of specificity). These results suggest that the entropy of noise contains relevant information that can be useful to improve the analysis of the physiological correlates of emotion. © 2016 Society for Psychophysiological Research.
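
    A minimal sketch of the pipeline described above might look as follows: isolate the residual left after removing a slow trend (treated as the "noise"), summarise it with the Shannon entropy of its amplitude histogram, and feed that single feature to a decision tree classifier. The smoothing window, histogram bins, synthetic signals and labels are all assumptions for illustration, not the authors' ECG processing chain.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def histogram_entropy(x, bins=32):
        """Shannon entropy of the amplitude histogram of a 1-D signal."""
        counts, _ = np.histogram(x, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return float(-np.sum(p * np.log(p)))

    def noise_entropy(signal, window=25):
        """Entropy of the residual left after removing a moving-average trend."""
        trend = np.convolve(signal, np.ones(window) / window, mode="same")
        return histogram_entropy(signal - trend)

    rng = np.random.default_rng(9)
    # Synthetic stand-ins: "neutral" recordings with mild noise, "negative" with stronger noise.
    signals = [np.sin(np.linspace(0, 60, 3000)) + s * rng.normal(size=3000)
               for s in ([0.1] * 20 + [0.4] * 20)]
    labels = np.array([0] * 20 + [1] * 20)

    features = np.array([[noise_entropy(s)] for s in signals])
    clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(features, labels)
    print("training accuracy:", clf.score(features, labels))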

  12. Analysis of optimal Reynolds number for developing laminar forced convection in double sine ducts based on entropy generation minimization principle

    International Nuclear Information System (INIS)

    Ko, T.H.

    2006-01-01

    In the present paper, the entropy generation and optimal Reynolds number for developing forced convection in a double sine duct with various wall heat fluxes, which frequently occurs in plate heat exchangers, are studied based on the entropy generation minimization principle by analytical thermodynamic analysis as well as numerical investigation. According to the thermodynamic analysis, a very simple expression for the optimal Reynolds number for the double sine duct as a function of mass flow rate, wall heat flux, working fluid and geometric dimensions is proposed. In the numerical simulations, the investigated Reynolds number (Re) covers the range from 86 to 2000 and the wall heat flux (q'') varies as 160, 320 and 640 W/m^2. From the numerical simulation of the developing laminar forced convection in the double sine duct, the effect of Reynolds number on entropy generation in the duct has been examined, through which the optimal Reynolds number with minimal entropy generation is detected. The optimal Reynolds number obtained from the analytical thermodynamic analysis is compared with the one from the numerical solutions and is verified to have a magnitude of entropy generation similar to the minimal entropy generation predicted by the numerical simulations. The optimal analysis provided in the present paper gives valuable information for heat exchanger design, since the thermal system could have the least irreversibility and best exergy utilization if the optimal Re is used according to practical design conditions.

  13. Applicability of entropy, entransy and exergy analyses to the optimization of the Organic Rankine Cycle

    International Nuclear Information System (INIS)

    Zhu, Yadong; Hu, Zhe; Zhou, Yaodong; Jiang, Liang; Yu, Lijun

    2014-01-01

    Graphical abstract: Fig. 3a shows the variations of the evaluation parameters with evaporation temperature in the case of prescribed hot and cold streams for R123. It indicates that, among the seven parameters, the minimum entropy generation rate, exergy destruction rate, entransy efficiency and revised entropy generation number and the maximum entransy loss rate correspond to the maximum output power. However, the minimum entransy dissipation rate is not associated with the output power variation; this can be explained as follows: the entransy dissipation is only one part of the entransy loss rate, besides the entransy variation (work entransy), and does not consider the influence of work output on the change of entransy. - Highlights: • Theories of entropy, exergy and entransy are applied to the optimization of the ORC. • Two commonly utilized working fluids, R123 and N-pentane, are chosen for comparison. • Variable evaporation temperature, hot stream temperature and mass flow rate are considered. • 3-D coordinates are utilized to observe the global variation of parameters. • The concept of entransy loss rate is appropriate for all the cases discussed in this paper. - Abstract: Based on the theories of entropy, entransy and exergy, the concepts of entropy generation rate, revised entropy generation number, exergy destruction rate, entransy loss rate, entransy dissipation rate and entransy efficiency are applied to the optimization of the Organic Rankine Cycle. Cycles operating on R123 and N-pentane have been compared in three common cases: variable evaporation temperature, hot stream temperature and hot stream mass flow rate. The optimization goal is to produce maximum output power. Some numerical analyses and simulations are presented, and the results show that when both the hot and cold stream conditions are fixed, the entropy principle, the exergy theory, the entransy loss rate and the entransy efficiency are all applicable to the optimization of the

  14. Permutation entropy based time series analysis: Equalities in the input signal can lead to false conclusions

    Energy Technology Data Exchange (ETDEWEB)

    Zunino, Luciano, E-mail: lucianoz@ciop.unlp.edu.ar [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: olivaresfe@gmail.com [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Scholkmann, Felix, E-mail: Felix.Scholkmann@gmail.com [Research Office for Complex Physical and Biological Systems (ROCoS), Mutschellenstr. 179, 8038 Zurich (Switzerland); Biomedical Optics Research Laboratory, Department of Neonatology, University Hospital Zurich, University of Zurich, 8091 Zurich (Switzerland); Rosso, Osvaldo A., E-mail: oarosso@gmail.com [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)

    2017-06-15

    A symbolic encoding scheme, based on the ordinal relation between the amplitude of neighboring values of a given data sequence, should be implemented before estimating the permutation entropy. Consequently, equalities in the analyzed signal, i.e. repeated equal values, deserve special attention and treatment. In this work, we carefully study the effect that the presence of equalities has on the estimated permutation entropy values when these ties are symbolized, as is commonly done, according to their order of appearance. On the one hand, the analysis of computer-generated time series is initially developed to understand the incidence of repeated values on permutation entropy estimations in controlled scenarios. The presence of temporal correlations is erroneously concluded when truly pseudorandom time series with low amplitude resolutions are considered. On the other hand, the analysis of real-world data is included to illustrate how the presence of a significant number of equal values can give rise to false conclusions regarding the underlying temporal structures in practical contexts. - Highlights: • Impact of repeated values in a signal when estimating permutation entropy is studied. • Numerical and experimental tests are included for characterizing this limitation. • Non-negligible temporal correlations can be spuriously concluded by repeated values. • Data digitized with low amplitude resolutions could be especially affected. • Analysis with shuffled realizations can help to overcome this limitation.
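
    Since the record turns on how ties are symbolized, the following minimal sketch (not the authors' code) implements the standard permutation entropy estimator with ties broken by order of appearance via a stable sort, and contrasts a coarsely digitized pseudorandom series, which has many ties, with a finely resolved one.

```python
import numpy as np
from collections import Counter
from math import log, factorial

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy with ties broken by order of appearance (stable sort),
    the convention whose pitfalls the record examines."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    patterns = Counter()
    for i in range(n):
        window = x[i:i + order * delay:delay]
        # a stable argsort ranks equal values by their order of appearance
        patterns[tuple(np.argsort(window, kind="stable"))] += 1
    p = np.array(list(patterns.values()), dtype=float) / n
    h = -np.sum(p * np.log(p))
    return h / log(factorial(order)) if normalize else h

rng = np.random.default_rng(0)
coarse = np.round(rng.random(5000), 1)   # only 11 amplitude levels -> many ties
fine = rng.random(5000)                  # essentially no ties
print(permutation_entropy(coarse), permutation_entropy(fine))
```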

  15. the use of entropy index for gender inequality analysis

    African Journals Online (AJOL)

    DJFLEX

    years old; the share of females in wage employment in the non-agricultural ... have greater access to education than females and the gap in access ... (1) is referred to as Shannon entropy (see Ciuperca and Girardin, 2005). Shannon entropy has the property of symmetry i.e. the measure is unchanged if the outcomes i.
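
    The fragment refers to the Shannon entropy of shares and its symmetry property. A minimal sketch, assuming the index is applied to a vector of category shares that sum to one (an assumption; only a fragment of the record is available):

```python
import numpy as np

def shannon_entropy(shares):
    """Shannon entropy H = -sum p_i ln p_i of a vector of shares summing to one.
    Symmetry: permuting the shares leaves H unchanged."""
    p = np.asarray(shares, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

shares = [0.32, 0.68]                    # e.g. female/male shares of wage employment
print(shannon_entropy(shares))           # in nats
print(shannon_entropy(shares[::-1]))     # identical, illustrating symmetry
```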

  16. Giant onsite electronic entropy enhances the performance of ceria for water splitting

    DEFF Research Database (Denmark)

    Naghavi, S. Shahab; Emery, Antoine A.; Hansen, Heine Anton

    2017-01-01

    lanthanides, and reaches a maximum value of ≈4.7 kB per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has...... a high electronic entropy and thus could also be a potential candidate for solar thermochemical reactions....

  17. Symmetry Analysis of Gait between Left and Right Limb Using Cross-Fuzzy Entropy

    Directory of Open Access Journals (Sweden)

    Yi Xia

    2016-01-01

    Full Text Available The purpose of this paper is the investigation of the gait symmetry problem by using cross-fuzzy entropy (C-FuzzyEn), which is a recently proposed cross entropy that has many merits as compared to the frequently used cross sample entropy (C-SampleEn). First, we used several simulation signals to test its performance regarding the relative consistency and dependence on data length. Second, the gait time series of the left and right stride interval were used to calculate the C-FuzzyEn values for gait symmetry analysis. Besides the statistical analysis, we also implemented a support vector machine (SVM) classifier to perform the classification of normal and abnormal gaits. The gait dataset consists of 15 patients with Parkinson’s disease (PD) and 16 control (CO) subjects. The results show that the C-FuzzyEn values of the PD patients’ gait are significantly higher than those of the CO subjects with a p value of less than 10^-5, and the best classification performance evaluated by a leave-one-out (LOO) cross-validation method is an accuracy of 96.77%. Such encouraging results imply that the C-FuzzyEn-based gait symmetry measure appears as a suitable tool for analyzing abnormal gaits.
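
    A hedged sketch of cross-fuzzy entropy in one common formulation (zero-mean templates, Chebyshev distance d, fuzzy membership exp(-(d^n)/r)); the embedding dimension m, tolerance r and fuzziness n used here are illustrative defaults rather than the values used in the paper.

```python
import numpy as np

def cross_fuzzy_entropy(u, v, m=2, r=0.2, n=2):
    """Cross-fuzzy entropy of two series in one common formulation: zero-mean
    templates, Chebyshev distance d, fuzzy membership exp(-(d**n)/r).
    m, r and n are illustrative defaults, not the paper's settings."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    r = r * np.sqrt(0.5 * (u.var() + v.var()))        # tolerance scaled to the data

    def phi(dim):
        U = np.array([u[i:i + dim] for i in range(len(u) - dim)])
        V = np.array([v[j:j + dim] for j in range(len(v) - dim)])
        U -= U.mean(axis=1, keepdims=True)            # remove the local baseline
        V -= V.mean(axis=1, keepdims=True)
        d = np.abs(U[:, None, :] - V[None, :, :]).max(axis=2)
        return np.exp(-(d ** n) / r).mean()

    return np.log(phi(m)) - np.log(phi(m + 1))

rng = np.random.default_rng(1)
left = rng.standard_normal(300)                       # stand-in left stride intervals
right = 0.8 * left + 0.2 * rng.standard_normal(300)   # fairly symmetric right side
print(cross_fuzzy_entropy(left, right))
```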

  18. Configurational entropy of charged AdS black holes

    Directory of Open Access Journals (Sweden)

    Chong Oh Lee

    2017-09-01

    Full Text Available We consider charged AdS black holes in higher-dimensional spacetime, and the molecule number density along the coexistence curves is numerically extended to the higher-dimensional cases. It is found that the number density difference between small and large black holes decreases as the total dimension grows. In particular, we find that the configurational entropy is a concave function of the reduced temperature and reaches a maximum value at the critical (second-order) phase transition point. Furthermore, the larger the total dimension becomes, the more concave the configurational entropy is, while the reduced pressure becomes more convex.

  19. Simultaneous State and Parameter Estimation Using Maximum Relative Entropy with Nonhomogenous Differential Equation Constraints

    Directory of Open Access Journals (Sweden)

    Adom Giffin

    2014-09-01

    Full Text Available In this paper, we continue our efforts to show how maximum relative entropy (MrE) can be used as a universal updating algorithm. Here, our purpose is to tackle a joint state and parameter estimation problem where our system is nonlinear and in a non-equilibrium state, i.e., perturbed by varying external forces. Traditional parameter estimation can be performed by using filters, such as the extended Kalman filter (EKF). However, as shown with a toy example of a system with first-order non-homogeneous ordinary differential equations, assumptions made by the EKF algorithm (such as the Markov assumption) may not be valid. The problem can be solved with exponential smoothing, e.g., the exponentially weighted moving average (EWMA). Although this has been shown to produce acceptable filtering results in real exponential systems, it still cannot simultaneously estimate both the state and its parameters and has its own assumptions that are not always valid, for example when jump discontinuities exist. We show that by applying MrE as a filter, we can not only develop the closed-form solutions, but we can also infer the parameters of the differential equation simultaneously with the means. This is useful in real, physical systems, where we want not only to filter the noise from our measurements, but also to simultaneously infer the parameters of the dynamics of a nonlinear and non-equilibrium system. Although many assumptions were made throughout the paper to illustrate that the EKF and exponential smoothing are special cases of MrE, we are not “constrained” by these assumptions. In other words, MrE is completely general and can be used in broader ways.
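
    For reference, the exponentially weighted moving average mentioned in the record is the simple recursion x_t = alpha*z_t + (1 - alpha)*x_{t-1}. The sketch below (with an arbitrary smoothing factor) shows the kind of state filtering that, unlike the MrE filter described in the paper, cannot also infer the parameters of the dynamics.

```python
import numpy as np

def ewma(measurements, alpha=0.3):
    """Exponentially weighted moving average: x_t = alpha*z_t + (1-alpha)*x_{t-1}.
    It filters the state but, unlike the MrE filter of the record, does not also
    infer the parameters of the underlying dynamics. alpha here is arbitrary."""
    x = np.empty(len(measurements), dtype=float)
    x[0] = measurements[0]
    for t in range(1, len(measurements)):
        x[t] = alpha * measurements[t] + (1 - alpha) * x[t - 1]
    return x

rng = np.random.default_rng(2)
true_state = np.exp(-0.05 * np.arange(100))           # a decaying exponential system
noisy = true_state + 0.05 * rng.standard_normal(100)
print(ewma(noisy)[:5])
```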

  20. Quantile-based Bayesian maximum entropy approach for spatiotemporal modeling of ambient air quality levels.

    Science.gov (United States)

    Yu, Hwa-Lung; Wang, Chih-Hsin

    2013-02-05

    Understanding the daily changes in ambient air quality concentrations is important for assessing human exposure and environmental health. However, the fine temporal scales (e.g., hourly) involved in this assessment often lead to high variability in air quality concentrations. This is because of the complex short-term physical and chemical mechanisms among the pollutants. Consequently, high heterogeneity is usually present in not only the averaged pollution levels, but also the intraday variance levels of the daily observations of ambient concentration across space and time. This characteristic decreases the estimation performance of common techniques. This study proposes a novel quantile-based Bayesian maximum entropy (QBME) method to account for the nonstationary and nonhomogeneous characteristics of ambient air pollution dynamics. The QBME method characterizes the spatiotemporal dependence among the ambient air quality levels based on their location-specific quantiles and accounts for spatiotemporal variations using a local weighted smoothing technique. The epistemic framework of the QBME method allows researchers to further consider the uncertainty of space-time observations. This study presents the spatiotemporal modeling of daily CO and PM10 concentrations across Taiwan from 1998 to 2009 using the QBME method. Results show that the QBME method can effectively improve estimation accuracy in terms of lower mean absolute errors and standard deviations over space and time, especially for pollutants with strong nonhomogeneous variances across space. In addition, the epistemic framework allows researchers to assimilate site-specific secondary information where observations are absent because of the common preferential sampling issues of environmental data. The proposed QBME method provides a practical and powerful framework for the spatiotemporal modeling of ambient pollutants.

  1. Entropy equilibrium equation and dynamic entropy production in environment liquid

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The entropy equilibrium equation is the basis of non-equilibrium thermodynamics. In the classical entropy equilibrium equation, however, the internal energy currently includes the kinetic energy of the fluid micelle relative to its mass center; this internal energy is not the mean kinetic energy of molecular motion as defined in thermodynamics. Here a modified entropy equilibrium equation is deduced, based on the concept that the internal energy is just the mean kinetic energy of molecular motion. A dynamic entropy production term is introduced into the entropy equilibrium equation to describe the dynamic process distinctly. The modified entropy equilibrium equation can describe the entropy variation not only of irreversible processes but also of reversible processes in a thermodynamic system, and it is therefore more reasonable and suitable for wider applications.

  2. COLLAGE-BASED INVERSE PROBLEMS FOR IFSM WITH ENTROPY MAXIMIZATION AND SPARSITY CONSTRAINTS

    Directory of Open Access Journals (Sweden)

    Herb Kunze

    2013-11-01

    Full Text Available We consider the inverse problem associated with IFSM: given a target function f, find an IFSM such that its invariant fixed point is sufficiently close to f in the Lp distance. In this paper, we extend the collage-based method developed by Forte and Vrscay (1995) along two different directions. We first search for a set of mappings that not only minimizes the collage error but also maximizes the entropy of the dynamical system. We then include an extra term in the minimization process which takes into account the sparsity of the set of mappings. In this new formulation, the minimization of the collage error is treated as a multi-criteria problem: we consider three different and conflicting criteria, i.e., collage error, entropy and sparsity. To solve this multi-criteria program we proceed by scalarization and reduce the model to a single-criterion program by combining all objective functions with different trade-off weights. The results of some numerical computations are presented. Numerical studies indicate that a maximum entropy principle exists for this approximation problem, i.e., that the suboptimal solutions produced by collage coding can be improved at least slightly by adding a maximum entropy criterion.
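
    The scalarization step described in the record reduces three conflicting criteria to one objective by weighting them. The sketch below illustrates that construction with simplified placeholder terms for collage error, entropy and sparsity; it is not the paper's actual IFSM functional, and the weights are arbitrary.

```python
import numpy as np
from scipy.optimize import minimize

def scalarized_objective(alpha, target, w=(1.0, -0.1, 0.05)):
    """Weighted scalarization of three conflicting criteria: a collage-error
    stand-in (minimized), the entropy of the map weights (maximized, hence the
    negative weight) and an l1 sparsity term (minimized). All terms and weights
    are placeholders, not the paper's IFSM functionals."""
    alpha = np.clip(alpha, 1e-12, None)
    p = alpha / alpha.sum()
    collage_error = np.sum((alpha - target) ** 2)
    entropy = -np.sum(p * np.log(p))
    sparsity = np.sum(np.abs(alpha))
    return w[0] * collage_error + w[1] * entropy + w[2] * sparsity

target = np.array([0.5, 0.3, 0.15, 0.05])             # hypothetical "ideal" weights
res = minimize(scalarized_objective, np.full(4, 0.25), args=(target,),
               method="Nelder-Mead")
print(res.x)
```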

  3. Bayesian maximum entropy integration of ozone observations and model predictions: an application for attainment demonstration in North Carolina.

    Science.gov (United States)

    de Nazelle, Audrey; Arunachalam, Saravanan; Serre, Marc L

    2010-08-01

    States in the USA are required to demonstrate future compliance of criteria air pollutant standards by using both air quality monitors and model outputs. In the case of ozone, the demonstration tests aim at relying heavily on measured values, due to their perceived objectivity and enforceable quality. Weight given to numerical models is diminished by integrating them in the calculations only in a relative sense. For unmonitored locations, the EPA has suggested the use of a spatial interpolation technique to assign current values. We demonstrate that this approach may lead to erroneous assignments of nonattainment and may make it difficult for States to establish future compliance. We propose a method that combines different sources of information to map air pollution, using the Bayesian Maximum Entropy (BME) Framework. The approach gives precedence to measured values and integrates modeled data as a function of model performance. We demonstrate this approach in North Carolina, using the State's ozone monitoring network in combination with outputs from the Multiscale Air Quality Simulation Platform (MAQSIP) modeling system. We show that the BME data integration approach, compared to a spatial interpolation of measured data, improves the accuracy and the precision of ozone estimations across the state.

  4. Differences between state entropy and bispectral index during analysis of identical electroencephalogram signals: a comparison with two randomised anaesthetic techniques.

    Science.gov (United States)

    Pilge, Stefanie; Kreuzer, Matthias; Karatchiviev, Veliko; Kochs, Eberhard F; Malcharek, Michael; Schneider, Gerhard

    2015-05-01

    It is claimed that bispectral index (BIS) and state entropy reflect an identical clinical spectrum, the hypnotic component of anaesthesia. So far, it is not known to what extent different devices display similar index values while processing identical electroencephalogram (EEG) signals. To compare BIS and state entropy during analysis of identical EEG data. Inspection of raw EEG input to detect potential causes of erroneous index calculation. Offline re-analysis of EEG data from a randomised, single-centre controlled trial using the Entropy Module and an Aspect A-2000 monitor. Klinikum rechts der Isar, Technische Universität München, Munich. Forty adult patients undergoing elective surgery under general anaesthesia. Blocked randomisation of 20 patients per anaesthetic group (sevoflurane/remifentanil or propofol/remifentanil). Isolated forearm technique for differentiation between consciousness and unconsciousness. Prediction probability (PK) of state entropy to discriminate consciousness from unconsciousness. Correlation and agreement between state entropy and BIS from deep to light hypnosis. Analysis of raw EEG compared with index values that are in conflict with clinical examination, with frequency measures (frequency bands/Spectral Edge Frequency 95) and visual inspection for physiological EEG patterns (e.g. beta or delta arousal), pathophysiological features such as high-frequency signals (electromyogram/high-frequency EEG or eye fluttering/saccades), different types of electro-oculogram or epileptiform EEG and technical artefacts. PK of state entropy was 0.80 and of BIS 0.84; correlation coefficient of state entropy with BIS 0.78. Nine percent BIS and 14% state entropy values disagreed with clinical examination. Highest incidence of disagreement occurred after state transitions, in particular for state entropy after loss of consciousness during sevoflurane anaesthesia. EEG sequences which led to false 'conscious' index values often showed high

  5. An exploration for the macroscopic physical meaning of entropy

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The macroscopic physical meaning of entropy is analyzed based on the exergy (availability) of a combined system (a closed system and its environment), which is the maximum amount of useful work obtainable from the system and the environment as the system is brought into equilibrium with the environment. The process the system experiences can be divided into two sequential sub-processes: the constant-volume process, which represents the heat interaction of the system with the environment, and the adiabatic process, which represents the work interaction of the system with the environment. It is shown that the macroscopic physical meaning of entropy is a measure of the energy of a closed system that is unavailable for doing useful work through heat interaction. This statement is more precise than those reported in the prior literature. The unavailability function of a closed system can be defined as T0S and p0V in the constant-volume process and the adiabatic process, respectively. Their changes, that is, Δ(T0S) and Δ(p0V), represent the parts of the internal energy of a closed system that are unusable for doing useful work in the corresponding processes. Finally, the relation between Clausius entropy and Boltzmann entropy is discussed based on a comparison of their expressions for the absolute entropy.
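
    For reference, the closed-system exergy (availability) relation from which this reading of entropy follows, in standard textbook notation (subscript 0 denotes the environment, or dead, state); this is a textbook relation rather than a result of the record itself:

```latex
\Phi = (U - U_0) + p_0\,(V - V_0) - T_0\,(S - S_0)
```

    Here the T0(S - S0) term is the portion of the internal energy unavailable for useful work through heat interaction, and the p0(V - V0) term accounts for the non-useful work exchanged with the environment at constant pressure p0, consistent with the record's identification of T0S and p0V as unavailability functions.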

  6. Entropy generation analysis of the revised Cheng-Minkowycz problem for natural convective boundary layer flow of nanofluid in a porous medium

    Directory of Open Access Journals (Sweden)

    Rashidi Mohammad Mehdi

    2015-01-01

    Full Text Available The similarity solution of the equations of the revised Cheng-Minkowycz problem for natural convective boundary layer flow of a nanofluid through a porous medium gives (using an analytical method) a system of non-linear partial differential equations, which are solved by the optimal homotopy analysis method. The effects of various governing parameters on the fluid flow and heat transfer characteristics have been analyzed. A very good agreement is observed between the obtained results and the numerical ones. The entropy generation has been derived and a comprehensive parametric analysis of it has been performed. Each component of the entropy generation has been analyzed separately, and the contribution of each one to the total value of entropy generation has been determined. It is found that the entropy generation, an important aspect in industrial applications, is affected by various parameters, which should be controlled to minimize it.

  7. Understanding Atmospheric Behaviour in Terms of Entropy: A Review of Applications of the Second Law of Thermodynamics to Meteorology

    Directory of Open Access Journals (Sweden)

    Donghai Wang

    2011-01-01

    Full Text Available The concept of entropy and its relevant principles, mainly the principle of maximum entropy production (MEP), the effect of negative entropy flow (NEF) on the organization of atmospheric systems and the principle of the Second Law of thermodynamics, as well as their applications to atmospheric sciences, are reviewed. Some formulations of sub-grid processes such as diffusion parameterization schemes in computational geophysical fluid dynamics that can be improved based on full irreversibility are also discussed, although they have not yet been systematically subjected to scrutiny from the perspective of the entropy budgets. A comparative investigation shows that the principle of MEP applies to the entropy production of macroscopic fluxes and determines the most probable state; a system may nevertheless follow a meta-stable development trajectory with a smaller entropy production, since entropy production behavior involves many specific dynamical and thermodynamic processes in the atmosphere and the extremal principles only provide a general insight into the overall configuration of the atmosphere. In contrast to the principle of MEP, the analysis of NEF is able to provide a new insight into the mechanism responsible for the evolution of a weather system as well as a new approach to predicting its track and intensity trend.

  8. Entropy and information

    CERN Document Server

    Volkenstein, Mikhail V

    2009-01-01

    The book "Entropy and Information" deals with the thermodynamical concept of entropy and its relationship to information theory. It is successful in explaining the universality of the term "Entropy" not only as a physical phenomenon, but reveals its existence also in other domains. E.g., Volkenstein discusses the "meaning" of entropy in a biological context and shows how entropy is related to artistic activities. Written by the renowned Russian bio-physicist Mikhail V. Volkenstein, this book on "Entropy and Information" surely serves as a timely introduction to understand entropy from a thermodynamic perspective and is definitely an inspiring and thought-provoking book that should be read by every physicist, information-theorist, biologist, and even artist.

  9. Parametric Bayesian Estimation of Differential Entropy and Relative Entropy

    OpenAIRE

    Gupta; Srivastava

    2010-01-01

    Given iid samples drawn from a distribution with known parametric form, we propose the minimization of expected Bregman divergence to form Bayesian estimates of differential entropy and relative entropy, and derive such estimators for the uniform, Gaussian, Wishart, and inverse Wishart distributions. Additionally, formulas are given for a log gamma Bregman divergence and the differential entropy and relative entropy for the Wishart and inverse Wishart. The results, as always with Bayesian est...

  10. Entropy production analysis of hysteresis characteristic of a pump-turbine model

    International Nuclear Information System (INIS)

    Li, Deyou; Wang, Hongjie; Qin, Yonglin; Han, Lei; Wei, Xianzhu; Qin, Daqing

    2017-01-01

    Highlights: • An interesting hysteresis phenomenon was analyzed using entropy production theory. • A function was used to calculate the entropy production in the wall region. • Generation mechanism of the hump and hysteresis characteristics was obtained. - Abstract: The hydraulic loss due to friction and unstable flow patterns in hydro-turbines causes a drop in their efficiency. The traditional method for analyzing the hydraulic loss is by evaluating the pressure drop, which has certain limitations and cannot determine the exact locations at which the high hydraulic loss occurs. In this study, entropy production theory was adopted to obtain a detailed distribution of the hydraulic loss in a pump-turbine in the pump mode. In the past, the wall effects of entropy production were not considered, which caused larger errors as compared with the method of pressure difference. First, a wall equation was proposed to calculate the hydraulic loss in the wall region. The comparison of hydraulic loss calculated by entropy production and pressure difference revealed a better result. Then, through the use of the entropy production theory, the performance characteristics were determined for a pump-turbine with 19 mm guide vane opening, and the variation in the entropy production was obtained. Recently, an interesting phenomenon, i.e., a hysteresis characteristic, was observed in the hump region in pump-turbines. Research shows that the hysteresis characteristic is a result of the Euler momentum and hydraulic loss; the hydraulic loss accounts for a major portion of the hysteresis characteristic. Finally, the hysteresis characteristic in the hump region was analyzed in detail through the entropy production. The results showed that the hump characteristic and the accompanying hysteresis phenomenon are caused by backflow at the runner inlet and the presence of separation vortices close to the hub and the shroud in the stay/guide vanes, which is dependent on the direction of
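
    For orientation, the local (direct) entropy production rate that such CFD analyses integrate over the flow domain is usually written, for single-phase flow, in the following textbook form; the wall-function correction proposed in the record is an additional modelling step on top of this:

```latex
\dot{S}'''_{\mathrm{gen}} = \frac{\lambda}{T^{2}}\,(\nabla T)^{2} + \frac{\mu}{T}\,\Phi_{\mathrm{visc}}
```

    The first term is the entropy production by heat conduction and the second by viscous dissipation (Phi_visc is the viscous dissipation function); in RANS computations each term is typically split further into mean-flow and turbulent-fluctuation contributions.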

  11. Perspective: Maximum caliber is a general variational principle for dynamical systems.

    Science.gov (United States)

    Dixit, Purushottam D; Wagoner, Jason; Weistuch, Corey; Pressé, Steve; Ghosh, Kingshuk; Dill, Ken A

    2018-01-07

    We review here Maximum Caliber (Max Cal), a general variational principle for inferring distributions of paths in dynamical processes and networks. Max Cal is to dynamical trajectories what the principle of maximum entropy is to equilibrium states or stationary populations. In Max Cal, you maximize a path entropy over all possible pathways, subject to dynamical constraints, in order to predict relative path weights. Many well-known relationships of non-equilibrium statistical physics-such as the Green-Kubo fluctuation-dissipation relations, Onsager's reciprocal relations, and Prigogine's minimum entropy production-are limited to near-equilibrium processes. Max Cal is more general. While it can readily derive these results under those limits, Max Cal is also applicable far from equilibrium. We give examples of Max Cal as a method of inference about trajectory distributions from limited data, finding reaction coordinates in bio-molecular simulations, and modeling the complex dynamics of non-thermal systems such as gene regulatory networks or the collective firing of neurons. We also survey its basis in principle and some limitations.

  12. Perspective: Maximum caliber is a general variational principle for dynamical systems

    Science.gov (United States)

    Dixit, Purushottam D.; Wagoner, Jason; Weistuch, Corey; Pressé, Steve; Ghosh, Kingshuk; Dill, Ken A.

    2018-01-01

    We review here Maximum Caliber (Max Cal), a general variational principle for inferring distributions of paths in dynamical processes and networks. Max Cal is to dynamical trajectories what the principle of maximum entropy is to equilibrium states or stationary populations. In Max Cal, you maximize a path entropy over all possible pathways, subject to dynamical constraints, in order to predict relative path weights. Many well-known relationships of non-equilibrium statistical physics—such as the Green-Kubo fluctuation-dissipation relations, Onsager's reciprocal relations, and Prigogine's minimum entropy production—are limited to near-equilibrium processes. Max Cal is more general. While it can readily derive these results under those limits, Max Cal is also applicable far from equilibrium. We give examples of Max Cal as a method of inference about trajectory distributions from limited data, finding reaction coordinates in bio-molecular simulations, and modeling the complex dynamics of non-thermal systems such as gene regulatory networks or the collective firing of neurons. We also survey its basis in principle and some limitations.
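
    The reweighting at the heart of Max Cal has the same exponential form as maximum entropy for states: the path distribution maximizing the path entropy relative to a prior, subject to a constraint on a path-averaged observable, is P[path] proportional to P0[path]·exp(lambda·s(path)). A minimal numerical sketch, with placeholder trajectories, observable and constraint (none taken from the paper):

```python
import numpy as np
from scipy.optimize import brentq

# Placeholder prior ensemble of trajectories and a path observable s(path);
# neither is taken from the paper.
rng = np.random.default_rng(3)
paths = rng.standard_normal((1000, 50))      # 1000 prior trajectories, 50 steps each
s = paths.sum(axis=1)                        # path observable: net displacement
target = 1.5                                 # constraint on the path average <s>

def constraint_gap(lam):
    """<s> under weights ~ exp(lam * s), minus the target value."""
    w = np.exp(lam * (s - s.max()))          # shift the exponent for stability
    w /= w.sum()
    return np.dot(w, s) - target

lam = brentq(constraint_gap, -5.0, 5.0)      # Lagrange multiplier enforcing <s>
w = np.exp(lam * (s - s.max())); w /= w.sum()
print(lam, np.dot(w, s))                     # reweighted average matches the target
```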

  13. The maximum entropy determination of nuclear densities of calcium isotopes from elastic scattering of alpha particles

    International Nuclear Information System (INIS)

    Engel, Y.M.; Friedman, E.; Levine, R.D.

    1982-01-01

    Radial moments of the real part of the optical potential for elastic scattering of 104 MeV α particles are used as constraints, in determining the nuclear density of maximal entropy. The potential is related to the density by the folding model. (orig.)

  14. Maximum entropy analysis of cosmic ray composition

    Czech Academy of Sciences Publication Activity Database

    Nosek, D.; Ebr, Jan; Vícha, Jakub; Trávníček, Petr; Nosková, J.

    2016-01-01

    Roč. 76, Mar (2016), s. 9-18 ISSN 0927-6505 R&D Projects: GA ČR(CZ) GA14-17501S Institutional support: RVO:68378271 Keywords : ultra-high energy cosmic rays * extensive air showers * cosmic ray composition Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 3.257, year: 2016

  15. Quantum dynamical entropy revisited

    International Nuclear Information System (INIS)

    Hudetz, T.

    1996-10-01

    We define a new quantum dynamical entropy, which is a 'hybrid' of the closely related, physically oriented entropy introduced by Alicki and Fannes in 1994, and of the mathematically well-developed, single-argument entropy introduced by Connes, Narnhofer and Thirring in 1987. We show that this new quantum dynamical entropy has many properties similar to the ones of the Alicki-Fannes entropy, and also inherits some additional properties from the CNT entropy. In particular, the 'hybrid' entropy interpolates between the two different ways in which both the AF and the CNT entropy of the shift automorphism on the quantum spin chain agree with the usual quantum entropy density, resulting in even better agreement. Also, the new quantum dynamical entropy generalizes the classical dynamical entropy of Kolmogorov and Sinai in the same way as does the AF entropy. Finally, we estimate the 'hybrid' entropy both for the Powers-Price shift systems and for the noncommutative Arnold map on the irrational rotation C*-algebra, leaving some interesting open problems. (author)

  16. Regional Sustainable Development Analysis Based on Information Entropy-Sichuan Province as an Example.

    Science.gov (United States)

    Liang, Xuedong; Si, Dongyang; Zhang, Xinli

    2017-10-13

    According to the implementation of a scientific development perspective, sustainable development needs to consider regional development, economic and social development, and the harmonious development of society and nature, but regional sustainable development is often difficult to quantify. Through an analysis of the structure and functions of a regional system, this paper establishes an evaluation index system, which includes an economic subsystem, an ecological environmental subsystem and a social subsystem, to study regional sustainable development capacity. A sustainable development capacity measure model for Sichuan Province was established by applying the information entropy calculation principle and the Brusselator principle. Each subsystem and its entropy change in each calendar year in Sichuan Province were analyzed to evaluate the province's sustainable development capacity. It was found that the established model could effectively reflect actual changes in sustainable development levels through the entropy change of the system; at the same time, the model clearly demonstrates how the forty-six indicators from the three subsystems affect regional sustainable development, which helps to make up for the lack of quantitative research on sustainable development.
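
    The record's capacity measure applies the information entropy calculation principle to a year-by-indicator data matrix. A hedged sketch of the entropy weight method commonly used for this kind of index construction (the data matrix is synthetic, and the exact weighting scheme of the paper may differ):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: indicators whose values are more dispersed across
    the evaluated years (lower information entropy) receive larger weights.
    X has shape (years, indicators); larger-is-better values are assumed."""
    P = X / X.sum(axis=0, keepdims=True)          # share of each year per indicator
    k = 1.0 / np.log(X.shape[0])
    E = -k * np.sum(np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0), axis=0)
    d = 1.0 - E                                   # degree of diversification
    return d / d.sum()

# Synthetic year-by-indicator matrix (4 years, 3 indicators).
X = np.array([[0.62, 3.1, 120.0],
              [0.71, 3.0, 150.0],
              [0.80, 2.9, 185.0],
              [0.93, 2.7, 240.0]])
w = entropy_weights(X)
composite = (X / X.max(axis=0)) @ w               # simple composite capacity score
print(w, composite)
```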

  17. An implementation of the maximum-caliber principle by replica-averaged time-resolved restrained simulations.

    Science.gov (United States)

    Capelli, Riccardo; Tiana, Guido; Camilloni, Carlo

    2018-05-14

    Inferential methods can be used to integrate experimental information and molecular simulations. The maximum entropy principle provides a framework for using equilibrium experimental data, and it has been shown that replica-averaged simulations, restrained using a static potential, are a practical and powerful implementation of such a principle. Here we show that replica-averaged simulations restrained using a time-dependent potential are equivalent to the principle of maximum caliber, the dynamic version of the principle of maximum entropy, and thus may allow us to integrate time-resolved data in molecular dynamics simulations. We provide an analytical proof of the equivalence as well as a computational validation making use of simple models and synthetic data. Some limitations and possible solutions are also discussed.

  18. ECG contamination of EEG signals: effect on entropy.

    Science.gov (United States)

    Chakrabarti, Dhritiman; Bansal, Sonia

    2016-02-01

    Entropy™ is a proprietary algorithm which uses spectral entropy analysis of electroencephalographic (EEG) signals to produce indices which are used as a measure of depth of hypnosis. We describe a report of electrocardiographic (ECG) contamination of EEG signals leading to fluctuating erroneous Entropy values. An explanation is provided for the mechanism behind this observation by describing the spread of ECG signals in the head and neck and their influence on the EEG/Entropy, correlating the observation with the published Entropy algorithm. While the Entropy algorithm has been well conceived, there are still instances in which it can produce erroneous values. Such erroneous values and their cause may be identified by close scrutiny of the EEG waveform if Entropy values seem out of sync with those expected at a given anaesthetic level.

  19. ENTROPIES AND FLUX-SPLITTINGS FOR THE ISENTROPIC EULER EQUATIONS

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The authors establish the existence of a large class of mathematical entropies (the so-called weak entropies) associated with the Euler equations for an isentropic, compressible fluid governed by a general pressure law. Only a mild assumption on the behavior of the pressure law near the vacuum is required. The analysis is based on an asymptotic expansion of the fundamental solution (called here the entropy kernel) of a highly singular Euler-Poisson-Darboux equation. The entropy kernel is only Hölder continuous and its regularity is carefully investigated. Relying on a notion introduced earlier by the authors, it is also proven that, for the Euler equations, the set of entropy flux-splittings coincides with the set of entropy-entropy flux pairs. These results imply the existence of a flux-splitting consistent with all of the entropy inequalities.

  20. The pigeon's discrimination of visual entropy: a logarithmic function.

    Science.gov (United States)

    Young, Michael E; Wasserman, Edward A

    2002-11-01

    We taught 8 pigeons to discriminate 16-icon arrays that differed in their visual variability or "entropy" to see whether the relationship between entropy and discriminative behavior is linear (in which equivalent differences in entropy should produce equivalent changes in behavior) or logarithmic (in which higher entropy values should be less discriminable from one another than lower entropy values). Pigeons received a go/no-go task in which the lower entropy arrays were reinforced for one group and the higher entropy arrays were reinforced for a second group. The superior discrimination of the second group was predicted by a theoretical analysis in which excitatory and inhibitory stimulus generalization gradients fall along a logarithmic, but not a linear scale. Reanalysis of previously published data also yielded results consistent with a logarithmic relationship between entropy and discriminative behavior.