WorldWideScience

Sample records for maximum entropy based

  1. Kernel-based Maximum Entropy Clustering

    Institute of Scientific and Technical Information of China (English)

    JIANG Wei; QU Jiao; LI Benxi

    2007-01-01

    With the development of the Support Vector Machine (SVM), the "kernel method" has been studied extensively. In this paper, we present a novel Kernel-based Maximum Entropy Clustering algorithm (KMEC). Using Mercer kernel functions, the proposed algorithm first maps the data from their original space to a high-dimensional feature space, where the data are expected to be more separable, and then performs MEC clustering in that feature space. The experimental results show that the proposed method performs better on non-hyperspherical and complex data structures.

  2. A New Detection Approach Based on the Maximum Entropy Model

    Institute of Scientific and Technical Information of China (English)

    DONG Xiaomei; XIANG Guang; YU Ge; LI Xiaohua

    2006-01-01

    The maximum entropy model was introduced and a new intrusion detection approach based on it was proposed. The vector space model was adopted for data representation, and the minimal entropy partitioning method was utilized for attribute discretization. Experiments on the KDD CUP 1999 standard data set were designed and the experimental results were reported. Receiver operating characteristic (ROC) curve analysis was used to analyze the results. The analysis shows that the proposed approach is comparable to those based on support vector machines (SVM) and outperforms those based on C4.5 and naive Bayes classifiers. According to the overall evaluation, the proposed approach is slightly better than those based on SVM.
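
    The core of such an approach is a conditional maximum entropy classifier, which is equivalent to multinomial logistic regression. The sketch below trains one by gradient ascent on the conditional log-likelihood; it is illustrative only and does not reproduce the paper's feature design, discretization, or KDD CUP preprocessing.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract row max for stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_maxent(X, y, n_classes, lr=0.5, epochs=500):
    """Fit a maximum entropy (multinomial logistic) classifier by
    gradient ascent on the conditional log-likelihood."""
    n, d = X.shape
    W = np.zeros((d, n_classes))
    Y = np.eye(n_classes)[y]               # one-hot targets
    for _ in range(epochs):
        P = softmax(X @ W)                 # model p(class | features)
        W += lr * X.T @ (Y - P) / n        # log-likelihood gradient step
    return W

def predict(W, X):
    return softmax(X @ W).argmax(axis=1)

# Toy usage on two linearly separable classes (stand-ins for
# "normal" vs. "attack" feature vectors).
X = np.array([[1.0, 0.0], [1.0, 0.1], [0.0, 1.0], [0.1, 1.0]])
y = np.array([0, 0, 1, 1])
W = train_maxent(X, y, n_classes=2)
```

    In a real intrusion detector, the rows of X would be the discretized KDD CUP attribute vectors rather than this toy data.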

  3. Resolution of overlapping ambiguity strings based on maximum entropy model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Feng; FAN Xiao-zhong

    2006-01-01

    The resolution of overlapping ambiguity strings (OAS) is studied based on the maximum entropy model. There are two model outputs: either the first two characters form a word or the last two characters form a word. The features of the model include one word in the context of the OAS, the current OAS, and the word-probability relation of the two segmentation results. OAS in the training text are found by combining the FMM and BMM segmentation methods. After feature tagging, they are used to train the maximum entropy model. The People's Daily corpus of January 1998 is used for training and testing. Experimental results show a closed-test precision of 98.64% and an open-test precision of 95.01%. The open-test precision is 3.76% higher than that of the common word-probability method.

  4. Color Image Enhancement Based on Maximum Fuzzy Entropy

    Institute of Scientific and Technical Information of China (English)

    QU Yi; XU Li-hong; KANG Qi

    2004-01-01

    A color image enhancement approach based on maximum fuzzy entropy and a genetic algorithm is proposed in this paper. It enhances color images by stretching the contrast of the S and I components respectively in the HSI color representation. The image is transformed from the property domain to the fuzzy domain with an S-function. To preserve as much information as possible in the fuzzy domain, the fuzzy entropy function is used as the objective function of a genetic algorithm that optimizes the three parameters of the S-function. A sigmoid function is applied to intensify the membership values, and the results are transformed back to the property domain to produce the enhanced image. Experiments show the effectiveness of the approach.
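
    The two ingredients of this scheme, the S-function membership transform and the fuzzy entropy objective, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the genetic algorithm that searches over (a, b, c) is omitted.

```python
import numpy as np

def s_function(x, a, b, c):
    """Zadeh S-function mapping gray levels to fuzzy membership.
    b is the crossover point, normally (a + c) / 2."""
    x = np.asarray(x, dtype=float)
    return np.where(x <= a, 0.0,
           np.where(x <= b, 2 * ((x - a) / (c - a)) ** 2,
           np.where(x <= c, 1 - 2 * ((x - c) / (c - a)) ** 2, 1.0)))

def fuzzy_entropy(mu, eps=1e-12):
    """Mean Shannon-style fuzzy entropy of a membership image; it
    peaks at 1.0 when every membership equals 0.5 (maximal fuzziness)."""
    mu = np.clip(mu, eps, 1 - eps)
    return float(-(mu * np.log2(mu) + (1 - mu) * np.log2(1 - mu)).mean())
```

    A genetic algorithm would evaluate `fuzzy_entropy(s_function(img, a, b, c))` as the fitness of each candidate parameter triple.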

  5. A Clustering Method Based on the Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    Edwin Aldana-Bobadilla

    2015-01-01

    Clustering is an unsupervised process to determine which unlabeled objects in a set share interesting properties. The objects are grouped into k subsets (clusters) whose elements optimize a proximity measure. Methods based on information theory have proven to be feasible alternatives. They are based on the assumption that a cluster is a subset with the minimal possible degree of "disorder", and they attempt to minimize the entropy of each cluster. We propose a clustering method based on the maximum entropy principle. Such a method explores the space of all possible probability distributions of the data to find one that maximizes the entropy subject to extra conditions based on prior information about the clusters. The prior information is based on the assumption that the elements of a cluster are "similar" to each other in accordance with some statistical measure. As a consequence of this principle, those distributions of high entropy that satisfy the conditions are favored over others. Searching the space to find the optimal distribution of objects over the clusters is a hard combinatorial problem, which disallows the use of traditional optimization techniques; genetic algorithms are a good alternative for solving it. We benchmark our method against the best theoretical performance, which is given by the Bayes classifier when data are normally distributed, and against a multilayer perceptron network, which offers the best practical performance when data are not normal. In general, a supervised classification method will outperform a non-supervised one, since in the first case the elements of the classes are known a priori. In what follows, we show that our method's effectiveness is comparable to that of a supervised one. This clearly exhibits the superiority of our method.

  6. Promoter recognition based on the maximum entropy hidden Markov model.

    Science.gov (United States)

    Zhao, Xiao-yu; Zhang, Jin; Chen, Yuan-yuan; Li, Qiang; Yang, Tao; Pian, Cong; Zhang, Liang-yun

    2014-08-01

    Since the fast development of genome sequencing has produced large-scale data, current work uses bioinformatics methods to recognize different gene regions, such as exons, introns and promoters, which play an important role in gene regulation. In this paper, we introduce a new method based on the maximum entropy Markov model (MEMM) to recognize the promoter, conditioning on the biological features of the promoter. However, it leads to a high false positive rate (FPR). In order to reduce the FPR, we provide another new method based on the maximum entropy hidden Markov model (ME-HMM) without the independence assumption, which can also accommodate the biological features effectively. To demonstrate the precision, the new methods are implemented in the R language, and the hidden Markov model (HMM) is introduced for comparison. The experimental results show that the new methods not only overcome the shortcomings of HMM, but also have their own advantages: MEMM is excellent for identifying conserved signals, and ME-HMM demonstrably improves the true positive rate.

  7. Adaptive edge image enhancement based on maximum fuzzy entropy

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiu-hua; YANG Kun-tao

    2006-01-01

    Based on the maximum fuzzy entropy principle, an edge image with low contrast is optimally and adaptively classified into two classes, under the conditions of probability partition and fuzzy partition. The optimal threshold is used as the classification threshold, and a local parametric gray-level transformation is applied to the obtained classes. By means of a two-parameter representation, the homogeneity of the regions in the edge image is improved. Simulations on a set of test images show that the proposed technique possesses excellent performance in homogeneity, and that the extracted and enhanced edges provide an efficient edge representation of images.

  8. LIBOR troubles: Anomalous movements detection based on maximum entropy

    Science.gov (United States)

    Bariviera, Aurelio F.; Martín, María T.; Plastino, Angelo; Vampa, Victoria

    2016-05-01

    According to the definition of the London Interbank Offered Rate (LIBOR), contributing banks should give fair estimates of their own borrowing costs in the interbank market. Between 2007 and 2009, several banks made inappropriate submissions of LIBOR, sometimes motivated by profit-seeking from their trading positions. In 2012, several newspaper articles began to cast doubt on LIBOR integrity, leading surveillance authorities to conduct investigations into banks' behavior. These procedures resulted in severe fines imposed on the banks involved, which acknowledged their inappropriate financial conduct. In this paper, we uncover such unfair behavior by using a forecasting method based on the Maximum Entropy principle. Our results are robust against changes in parameter settings and could be of great help for market surveillance.

  9. Beyond maximum entropy: Fractal Pixon-based image reconstruction

    Science.gov (United States)

    Puetter, Richard C.; Pina, R. K.

    1994-01-01

    We have developed a new Bayesian image reconstruction method that has been shown to be superior to the best implementations of other competing methods, including Goodness-of-Fit methods such as Least-Squares fitting and Lucy-Richardson reconstruction, as well as Maximum Entropy (ME) methods such as those embodied in the MEMSYS algorithms. Our new method is based on the concept of the pixon, the fundamental, indivisible unit of picture information. Use of the pixon concept provides an improved image model, resulting in an image prior which is superior to that of standard ME. Our past work has shown how uniform information content pixons can be used to develop a 'Super-ME' method in which entropy is maximized exactly. Recently, however, we have developed a superior pixon basis for the image, the Fractal Pixon Basis (FPB). Unlike the Uniform Pixon Basis (UPB) of our 'Super-ME' method, the FPB basis is selected by employing fractal dimensional concepts to assess the inherent structure in the image. The Fractal Pixon Basis results in the best image reconstructions to date, superior to both UPB and the best ME reconstructions. In this paper, we review the theory of the UPB and FPB pixon and apply our methodology to the reconstruction of far-infrared imaging of the galaxy M51. The results of our reconstruction are compared to published reconstructions of the same data using the Lucy-Richardson algorithm, the Maximum Correlation Method developed at IPAC, and the MEMSYS ME algorithms. The results show that our reconstructed image has a spatial resolution a factor of two better than best previous methods (and a factor of 20 finer than the width of the point response function), and detects sources two orders of magnitude fainter than other methods.

  10. Maximum Entropy Fundamentals

    Directory of Open Access Journals (Sweden)

    F. Topsøe

    2001-09-01

    In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions, and it appears advantageous to bring information-theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow up on here. It leads to the consideration of a certain game, the Code Length Game, and, via standard game-theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategy in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of the fundamentals of both principles, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model, which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature, with hundreds of applications pertaining to several different fields, and will also serve here as an important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over
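
    The Mean Energy Model mentioned above has a concrete numerical form: over a finite set of states with energies E_i, the maximum entropy distribution subject to a fixed mean energy is the Gibbs distribution p_i ∝ exp(-λE_i), with the multiplier λ chosen to hit the constraint. A minimal sketch (not from the paper; the bracket for λ is an assumption that covers ordinary cases):

```python
import numpy as np

def gibbs(E, lam):
    """Maximum entropy distribution over states with energies E,
    subject to a fixed mean energy: p_i ∝ exp(-lam * E_i)."""
    w = np.exp(-lam * (E - E.min()))    # shift energies for stability
    return w / w.sum()

def solve_mean_energy(E, target, lo=-50.0, hi=50.0):
    """Bisect for the Lagrange multiplier so that <E> = target.
    The mean energy is strictly decreasing in lam, so bisection
    applies; target must lie strictly between min(E) and max(E)."""
    E = np.asarray(E, dtype=float)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if gibbs(E, mid) @ E > target:  # mean too high -> raise lam
            lo = mid
        else:
            hi = mid
    return gibbs(E, 0.5 * (lo + hi))
```

    With three equally spaced energy levels and the target mean set to the middle level, the solver recovers the uniform distribution, as the principle predicts when the constraint carries no information.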

  11. OIL MONITORING DIAGNOSTIC CRITERIONS BASED ON MAXIMUM ENTROPY PRINCIPLE

    Institute of Scientific and Technical Information of China (English)

    Huo Hua; Li Zhuguo; Xia Yanchun

    2005-01-01

    A method of applying the maximum entropy probability density estimation approach to constructing diagnostic criteria for oil monitoring data is presented. The method improves the precision of diagnostic criteria for evaluating the wear state of mechanical facilities and for judging abnormal data. From the critical boundary points defined, a new measure for monitoring wear state and identifying probable wear faults is obtained. The method can be applied to spectrometric analysis and direct-reading ferrographic analysis. Through the analysis and discussion of two examples of 8NVD48A-2U diesel engines, the method is shown to be effective and practical in oil monitoring.

  12. Hardware/software partitioning based on dynamic combination of maximum entropy and chaos optimization algorithm

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hong-lie; ZHANG Guo-yin; YAO Ai-hong

    2010-01-01

    This paper presents an algorithm that combines the chaos optimization algorithm with maximum entropy (COA-ME), using an entropy model based on the chaos algorithm in which maximum entropy serves as the secondary method of searching for good solutions. The search direction is improved by the chaos optimization algorithm, which also realizes the selective acceptance of inferior solutions. The experimental results show that the presented algorithm can be used for the hardware/software partitioning of reconfigurable systems. It effectively reduces the local extremum problem, and both the search speed and the quality of the partitioning are improved.

  13. A Load Balancing Algorithm Based on Maximum Entropy Methods in Homogeneous Clusters

    Directory of Open Access Journals (Sweden)

    Long Chen

    2014-10-01

    In order to solve the problems of ill-balanced task allocation, long response time, low throughput rate and poor performance when a cluster system is assigning tasks, we introduce the thermodynamic concept of entropy into load balancing algorithms. This paper proposes a new load balancing algorithm for homogeneous clusters based on the Maximum Entropy Method (MEM). By calculating the entropy of the system and using the maximum entropy principle to ensure that each scheduling and migration step follows the increasing tendency of the entropy, the system can reach the load-balanced state as soon as possible, shorten task execution time and achieve high performance. The results of simulation experiments show that this algorithm improves on traditional algorithms in both the time to reach and the degree of load balance in a homogeneous cluster system. It also provides novel ideas for solving the load balancing problem of homogeneous cluster systems.

  14. Alternative Multiview Maximum Entropy Discrimination.

    Science.gov (United States)

    Chao, Guoqing; Sun, Shiliang

    2016-07-01

    Maximum entropy discrimination (MED) is a general framework for discriminative estimation based on the maximum entropy and maximum margin principles, and can produce hard-margin support vector machines under some assumptions. Recently, the multiview version of MED, multiview MED (MVMED), was proposed. In this paper, we try to explore a more natural MVMED framework by assuming two separate distributions, p1(Θ1) over the first-view classifier parameter Θ1 and p2(Θ2) over the second-view classifier parameter Θ2. We name the new MVMED framework alternative MVMED (AMVMED); it enforces the posteriors of the two view margins to be equal. The proposed AMVMED is more flexible than the existing MVMED because, whereas MVMED optimizes one relative entropy, AMVMED assigns one relative entropy term to each of the two views, thus incorporating a tradeoff between them. We give a detailed solving procedure that can be divided into two steps: the first step solves the optimization problem without considering the equal margin posteriors from the two views, and the second step then enforces the equal posteriors. Experimental results on multiple real-world data sets verify the effectiveness of AMVMED, and comparisons with MVMED are also reported.

  15. Generalized Maximum Entropy

    Science.gov (United States)

    Cheeseman, Peter; Stutz, John

    2005-01-01

    A long-standing mystery in using Maximum Entropy (MaxEnt) is how to deal with constraints whose values are uncertain. This situation arises when constraint values are estimated from data, because of finite sample sizes. One approach to this problem, advocated by E.T. Jaynes [1], is to ignore this uncertainty and treat the empirically observed values as exact. We refer to this as the classic MaxEnt approach. Classic MaxEnt gives point probabilities (subject to the given constraints), rather than probability densities. We develop an alternative approach that assumes that the uncertain constraint values are represented by a probability density (e.g., a Gaussian), and this uncertainty yields a MaxEnt posterior probability density. That is, the classic MaxEnt point probabilities are regarded as a multidimensional function of the given constraint values, and uncertainty on these values is transmitted through the MaxEnt function to give uncertainty over the MaxEnt probabilities. We illustrate this approach by explicitly calculating the generalized MaxEnt density for a simple but common case, then show how this can be extended numerically to the general case. This paper expands the generalized MaxEnt concept introduced in a previous paper [3].

  16. Urban expressway traffic state forecasting based on multimode maximum entropy model

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Accurate and timely traffic state prediction has become increasingly important for traffic participants, and especially for traffic management. In this paper, the traffic state is described by Micro-LOS, and a direct prediction method is introduced. The proposed method is based on Maximum Entropy (ME) models trained for multiple modes. In the Multimode Maximum Entropy (MME) framework, different features of traffic systems, such as temporal and spatial features and the regional traffic state, are integrated simultaneously, and the distinct state behaviors of 14 traffic modes, defined by average speed according to a date-time division, are also handled. Experiments on real data from the Beijing expressway show that the MME models outperform the existing model in both effectiveness and robustness.

  17. Applying rough sets in word segmentation disambiguation based on maximum entropy model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    To solve the problems of complicated feature extraction and long-distance dependency in Word Segmentation Disambiguation (WSD), this paper proposes to apply rough sets to WSD based on the Maximum Entropy model. First, rough set theory is applied to extract complicated features and long-distance features, even from noisy or inconsistent corpora. Second, these features are added to the Maximum Entropy model, so that the feature weights can be assigned according to the performance of the whole disambiguation model. Finally, a semantic lexicon is adopted to build class-based rough set features to overcome data sparseness. The experiments indicate that our method performs better than previous models, which achieved the top rank in WSD in the 863 Evaluation in 2003. This system ranked first and second respectively in the MSR and PKU open tests of the Second International Chinese Word Segmentation Bakeoff held in 2005.

  18. Comparison between experiments and predictions based on maximum entropy for sprays from a pressure atomizer

    Science.gov (United States)

    Li, X.; Chin, L. P.; Tankin, R. S.; Jackson, T.; Stutrud, J.; Switzer, G.

    1991-07-01

    Measurements were made of the droplet size and velocity distributions in a hollow-cone spray from a pressure atomizer using a phase/Doppler particle analyzer. The maximum entropy principle is used to predict these distributions. The constraints imposed in this model involve conservation of mass, momentum, and energy. Estimates of the source terms associated with these constraints are made based on physical reasoning. Agreement between the measurements and the predictions is very good.

  19. SAR Images Unsupervised Change Detection Based on Combination of Texture Feature Vector with Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    ZHUANG Huifu

    2016-03-01

    Because there is significant speckle noise in synthetic aperture radar (SAR) images, spatial-contextual information is generally used in change detection. In this paper, exploiting the rich texture information of SAR images, an unsupervised change detection approach for high-resolution SAR images based on a texture feature vector and the maximum entropy principle is proposed. The difference image is generated using the 32-dimensional texture feature vector of the gray-level co-occurrence matrix (GLCM), and the threshold is obtained automatically by the maximum entropy principle. In this method, the appropriate window size for change detection is 11×11, according to a regression analysis of window size against a precision index. The experimental results show that the proposed approach can both reduce the influence of speckle noise and improve the detection accuracy for high-resolution SAR images effectively, and that it outperforms the Markov random field approach.
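
    Automatic thresholding by the maximum entropy principle is usually Kapur's criterion: choose the threshold that maximizes the sum of the entropies of the two classes it induces on the histogram. A minimal sketch of that criterion (illustrative; the paper's GLCM texture pipeline is not reproduced):

```python
import numpy as np

def _H(q):
    """Shannon entropy of an already-normalized histogram slice."""
    q = q[q > 0]
    return -(q * np.log(q)).sum()

def kapur_threshold(hist):
    """Kapur's maximum entropy threshold: choose t maximizing the sum
    of the entropies of the below-threshold and above-threshold
    class distributions."""
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()
    c = p.cumsum()
    best_t, best_h = 1, -np.inf
    for t in range(1, len(p)):
        w0 = c[t - 1]                 # mass of the lower class
        if w0 <= 0 or w0 >= 1:
            continue
        h = _H(p[:t] / w0) + _H(p[t:] / (1 - w0))
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```

    On a bimodal difference-image histogram, the selected threshold falls in the valley between the two modes, separating "changed" from "unchanged" pixels.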

  20. Optimal Multi-Level Thresholding Based on Maximum Tsallis Entropy via an Artificial Bee Colony Approach

    Directory of Open Access Journals (Sweden)

    Yudong Zhang

    2011-04-01

    This paper proposes a global multi-level thresholding method for image segmentation. As the criterion, the traditional method uses the Shannon entropy, originated from information theory, considering the gray-level image histogram as a probability distribution, while we apply the Tsallis entropy as a generalized information-theoretic entropy formalism. For the search, we use the artificial bee colony approach, since an exhaustive algorithm would be too time-consuming. The experiments demonstrate that: (1) the Tsallis entropy is superior to traditional maximum entropy thresholding, maximum between-class variance thresholding, and minimum cross entropy thresholding; (2) the artificial bee colony is faster than either a genetic algorithm or particle swarm optimization. Therefore, our approach is effective and rapid.
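
    The Tsallis criterion differs from the Shannon one in the entropy formula and in how class entropies combine (pseudo-additively). The single-threshold sketch below shows the criterion itself; the artificial bee colony search is replaced by an exhaustive scan, and the value q=0.8 is only an example.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1)."""
    p = p[p > 0]
    return (1.0 - (p ** q).sum()) / (q - 1.0)

def tsallis_threshold(hist, q=0.8):
    """Single-level Tsallis-entropy threshold.  Class entropies are
    combined with the pseudo-additivity rule of Tsallis statistics."""
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()
    c = p.cumsum()
    best_t, best_s = 1, -np.inf
    for t in range(1, len(p)):
        w0 = c[t - 1]
        if w0 <= 0 or w0 >= 1:
            continue
        h0 = tsallis_entropy(p[:t] / w0, q)
        h1 = tsallis_entropy(p[t:] / (1 - w0), q)
        s = h0 + h1 + (1 - q) * h0 * h1   # pseudo-additive combination
        if s > best_s:
            best_t, best_s = t, s
    return best_t
```

    For multi-level thresholding the scan becomes a search over threshold tuples, which is where a metaheuristic such as the artificial bee colony pays off.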

  1. A Robust Image Tampering Detection Method Based on Maximum Entropy Criteria

    Directory of Open Access Journals (Sweden)

    Bo Zhao

    2015-12-01

    This paper proposes a novel image watermarking method based on local energy and maximum entropy, aiming to improve robustness. First, the image feature distribution is extracted by employing the local energy model, and it is then transformed into a digital watermark by employing a Discrete Cosine Transform (DCT). An offset image is thus obtained from the difference between the extracted digital watermark and the feature distribution of the watermarked image. The entropy of the pixel value distribution is computed first, and the Lorenz curve is used to measure the polarization degree of the pixel value distribution. In the pixel location distribution flow, the maximum entropy criterion is applied to segment the offset image into potentially tampered regions and unchanged regions. A fully connected graph and a 2-D Gaussian probability model are utilized to obtain the probability distribution of the pixel locations. Finally, the factitious tampering probability value of an image pending detection is computed by combining the weighting factors of the pixel value and pixel location distributions. Experimental results show that the proposed method is more robust against commonly used image processing operations, such as Gaussian noise, impulse noise, etc. At the same time, the proposed method achieves high sensitivity to factitious tampering.

  2. Maximum Entropy in Drug Discovery

    Directory of Open Access Journals (Sweden)

    Chih-Yuan Tseng

    2014-07-01

    Drug discovery applies multidisciplinary approaches, experimental, computational or both, to identify lead compounds to treat various diseases. While conventional approaches have yielded many US Food and Drug Administration (FDA)-approved drugs, researchers continue investigating and designing better approaches to increase the success rate of the discovery process. In this article, we provide an overview of the current strategies and point out where and how the method of maximum entropy has been introduced in this area. The maximum entropy principle has its roots in thermodynamics, yet since Jaynes' pioneering work in the 1950s, it has been used not only as a physical law, but also as a reasoning tool that allows us to process the information at hand with the least bias. Its applicability in various disciplines has been abundantly demonstrated. We give several examples of applications of maximum entropy in different stages of drug discovery. Finally, we discuss a promising new direction in drug discovery that is likely to hinge on the ways of utilizing maximum entropy.

  3. Chaos control of ferroresonance system based on RBF-maximum entropy clustering algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Liu Fan [Key Lab of High Voltage and Electrical New Technology of Ministry of Education, Chongqing University, Chongqing 400044 (China)]. E-mail: liufan2003@yahoo.com.cn; Sun Caixin [Key Lab of High Voltage and Electrical New Technology of Ministry of Education, Chongqing University, Chongqing 400044 (China); Sima Wenxia [Key Lab of High Voltage and Electrical New Technology of Ministry of Education, Chongqing University, Chongqing 400044 (China); Liao Ruijin [Key Lab of High Voltage and Electrical New Technology of Ministry of Education, Chongqing University, Chongqing 400044 (China); Guo Fei [Key Lab of High Voltage and Electrical New Technology of Ministry of Education, Chongqing University, Chongqing 400044 (China)

    2006-09-11

    With regard to the ferroresonance overvoltage of neutral-grounded power systems, a maximum-entropy learning algorithm based on radial basis function neural networks is used to control the chaotic system. The algorithm optimizes an objective function to derive the learning rule for the central vectors, and exploits the clustering function of the network hidden layers. It improves the regression and learning ability of the neural networks. Numerical experiments on the ferroresonance system testify to the effectiveness and feasibility of using the algorithm to control chaos in neutral-grounded systems.

  4. Predicting the Outcome of NBA Playoffs Based on the Maximum Entropy Principle

    OpenAIRE

    Ge Cheng; Zhenyu Zhang; Moses Ntanda Kyebambe; Nasser Kimbugwe

    2016-01-01

    Predicting the outcome of National Basketball Association (NBA) matches poses a challenging problem of interest to the research community as well as the general public. In this article, we formalize the problem of predicting NBA game results as a classification problem and apply the principle of Maximum Entropy to construct an NBA Maximum Entropy (NBAME) model that fits the discrete statistics of NBA games, and then predict the outcomes of NBA playoffs using the model. Our results reveal that...

  5. Multi-frequency synthesis algorithm based on Generalized Maximum Entropy Method. Application to 0954+658

    CERN Document Server

    Bajkova, Anisa T

    2011-01-01

    We propose the multi-frequency synthesis (MFS) algorithm with spectral correction of frequency-dependent source brightness distribution based on maximum entropy method. In order to take into account the spectral terms of n-th order in the Taylor expansion for the frequency-dependent brightness distribution, we use a generalized form of the maximum entropy method suitable for reconstruction of not only positive-definite functions, but also sign-variable ones. The proposed algorithm is aimed at producing both improved total intensity image and two-dimensional spectral index distribution over the source. We consider also the problem of frequency-dependent variation of the radio core positions of self-absorbed active galactic nuclei, which should be taken into account in a correct multi-frequency synthesis. First, the proposed MFS algorithm has been tested on simulated data and then applied to four-frequency synthesis imaging of the radio source 0954+658 from VLBA observational data obtained quasi-simultaneously ...

  6. A novel impact identification algorithm based on a linear approximation with maximum entropy

    Science.gov (United States)

    Sanchez, N.; Meruane, V.; Ortiz-Bernardin, A.

    2016-09-01

    This article presents a novel impact identification algorithm that uses a linear approximation handled by a statistical inference model based on the maximum-entropy principle, termed linear approximation with maximum entropy (LME). Unlike other regression algorithms such as artificial neural networks (ANNs) and support vector machines, the proposed algorithm requires only one parameter to be selected, and the impact is identified after solving a convex optimization problem that has a unique solution. In addition, with LME, data are processed in a period of time comparable to that of other algorithms. The performance of the proposed methodology is validated on an experimental aluminum plate. Time-varying strain data are measured using four piezoceramic sensors bonded to the plate. To demonstrate the potential of the proposed approach over existing ones, results obtained via LME are compared with those of ANN and least-squares support vector machines. The results demonstrate that with a low number of sensors it is possible to accurately locate and quantify impacts on a structure, and that LME outperforms the other impact identification algorithms.
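
    Local maximum-entropy (LME) approximation assigns each sample node a weight that maximizes entropy subject to the constraints that the weights sum to one and reproduce linear functions. In one dimension this reduces to a single Lagrange multiplier that can be found by bisection. A minimal 1-D sketch under that formulation (not the authors' multi-sensor implementation; the multiplier bracket is an assumption adequate for moderate locality parameters):

```python
import numpy as np

def lme_weights(nodes, x, beta=4.0):
    """1-D local maximum-entropy weights w_i ∝ exp(-beta*(x_i-x)^2
    + lam*(x_i-x)), with lam bisected so the weights reproduce linear
    functions: sum w_i*(x_i - x) = 0.  The point x must lie strictly
    inside the hull of the nodes."""
    d = np.asarray(nodes, dtype=float) - x

    def weights(lam):
        w = np.exp(-beta * d * d + lam * d)
        return w / w.sum()

    lo, hi = -50.0, 50.0          # assumed bracket for the multiplier
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if weights(mid) @ d < 0:  # the residual increases with lam
            lo = mid
        else:
            hi = mid
    return weights(0.5 * (lo + hi))
```

    The resulting nonnegative weights interpolate like shape functions: they sum to one and reproduce the evaluation point exactly when applied to the node coordinates.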

  7. Maximum joint entropy and information-based collaboration of automated learning machines

    Science.gov (United States)

    Malakar, N. K.; Knuth, K. H.; Lary, D. J.

    2012-05-01

    We are working to develop automated intelligent agents, which can act and react as learning machines with minimal human intervention. To accomplish this, an intelligent agent is viewed as a question-asking machine, which is designed by coupling the processes of inference and inquiry to form a model-based learning unit. In order to select maximally informative queries, the intelligent agent needs to be able to compute the relevance of a question. This is accomplished by employing the inquiry calculus, which is dual to the probability calculus and extends information theory by explicitly requiring context. Here, we consider the interaction between two question-asking intelligent agents, and note that there is a potential information redundancy with respect to the two questions that the agents may choose to pose. We show that the information redundancy is minimized by maximizing the joint entropy of the questions, which simultaneously maximizes the relevance of each question while minimizing the mutual information between them. Maximum joint entropy is therefore an important principle of information-based collaboration, which enables intelligent agents to efficiently learn together.
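
    The link between joint entropy and redundancy follows from the identity H(X,Y) = H(X) + H(Y) - I(X;Y): at fixed marginals, maximizing the joint entropy minimizes the mutual information. A short sketch that computes both quantities from a discrete joint distribution (illustrative; the agents' inquiry calculus is not reproduced):

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def joint_stats(pxy):
    """Entropies and mutual information of a discrete joint
    distribution.  Since H(X,Y) = H(X) + H(Y) - I(X;Y), maximizing
    the joint entropy at fixed marginals minimizes the mutual
    information, i.e. the redundancy between the two questions."""
    pxy = np.asarray(pxy, dtype=float)
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    hx, hy, hxy = entropy(px), entropy(py), entropy(pxy.ravel())
    return hx, hy, hxy, hx + hy - hxy
```

    For two independent fair binary questions the joint entropy is the full 2 bits and the mutual information is zero; for perfectly correlated questions the joint entropy drops to 1 bit and the redundancy is maximal.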

  8. RESEARCH OF PINYIN-TO-CHARACTER CONVERSION BASED ON MAXIMUM ENTROPY MODEL

    Institute of Scientific and Technical Information of China (English)

    Zhao Yan; Wang Xiaolong; Liu Bingquan; Guan Yi

    2006-01-01

    This paper applied the Maximum Entropy (ME) model to Pinyin-To-Character (PTC) conversion instead of the Hidden Markov Model (HMM), which cannot include complicated and long-distance lexical information. Two ME models were built based on simple and complex templates respectively, and the complex one gave better conversion results. Furthermore, the conversion trigger pair yA → yB/cB was proposed to extract long-distance constraint features from the corpus; Average Mutual Information (AMI) was then used to select the conversion trigger pair features added to the ME model. The experiment shows that the conversion error of the ME model with conversion trigger pairs is reduced by 4% on a small training corpus, compared with an HMM smoothed by absolute smoothing.

  9. Image coding based on maximum entropy partitioning for identifying improbable intensities related to facial expressions

    Indian Academy of Sciences (India)

    SEBA SUSAN; NANDINI AGGARWAL; SHEFALI CHAND; AYUSH GUPTA

    2016-12-01

    In this paper we investigate information-theoretic image coding techniques that assign longer codes to improbable, imprecise and non-distinct intensities in the image. The variable-length coding techniques, when applied to cropped facial images of subjects with different facial expressions, highlight the set of low-probability intensities that characterize the facial expression, such as the creases in the forehead, the widening of the eyes and the opening and closing of the mouth. A new coding scheme based on maximum entropy partitioning is proposed in our work, particularly to identify the improbable intensities related to different emotions. The improbable intensities, when used as a mask, decode the facial expression correctly, providing an effective platform for future emotion categorization experiments.

  10. Tissue radiation response with maximum Tsallis entropy.

    Science.gov (United States)

    Sotolongo-Grau, O; Rodríguez-Pérez, D; Antoranz, J C; Sotolongo-Costa, Oscar

    2010-10-08

    The expression of survival factors for radiation damaged cells is currently based on probabilistic assumptions and experimentally fitted for each tumor, radiation, and conditions. Here, we show how the simplest of these radiobiological models can be derived from the maximum entropy principle of the classical Boltzmann-Gibbs expression. We extend this derivation using the Tsallis entropy and a cutoff hypothesis, motivated by clinical observations. The obtained expression shows a remarkable agreement with the experimental data found in the literature.
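A sketch of the first step the abstract describes (the notation here is illustrative, not necessarily the authors'): maximizing the Boltzmann-Gibbs entropy of the distribution of radiation effect per cell, subject only to normalization and a fixed mean effect, yields an exponential law.

```latex
\max_{p}\; -\int_0^\infty p(\varepsilon)\,\ln p(\varepsilon)\,d\varepsilon
\quad\text{s.t.}\quad
\int_0^\infty p(\varepsilon)\,d\varepsilon = 1,\qquad
\int_0^\infty \varepsilon\, p(\varepsilon)\,d\varepsilon = \bar{\varepsilon}
\;\;\Longrightarrow\;\;
p(\varepsilon) = \frac{1}{\bar{\varepsilon}}\, e^{-\varepsilon/\bar{\varepsilon}}
```

This recovers the simplest exponential survival curve, $F(D) = e^{-\alpha D}$; replacing the logarithm by the Tsallis $q$-logarithm in the same variational program is what introduces the cutoff behavior the abstract refers to.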

  11. Economics and Maximum Entropy Production

    Science.gov (United States)

    Lorenz, R. D.

    2003-04-01

    Price differentials, sales volume and profit can be seen as analogues of temperature difference, heat flow and work or entropy production in the climate system. One aspect in which economic systems exhibit more clarity than the climate is that the empirical and/or statistical mechanical tendency for systems to seek a maximum in production is very evident in economics, in that the profit motive is very clear. Noting the common link between 1/f noise, power laws and Self-Organized Criticality with Maximum Entropy Production, the power law fluctuations in security and commodity prices are not inconsistent with the analogy. There is an additional thermodynamic analogy, in that scarcity is valued. A commodity concentrated among a few traders is valued highly by the many who do not have it. The market therefore encourages via prices the spreading of those goods among a wider group, just as heat tends to diffuse, increasing entropy. I explore some empirical price-volume relationships of metals and meteorites in this context.

  12. Information Entropy Production of Spatio-Temporal Maximum Entropy Distributions

    CERN Document Server

    Cofre, Rodrigo

    2015-01-01

    Spiking activity from populations of neurons displays causal interactions and memory effects and is therefore expected to show some degree of irreversibility in time. Motivated by spike train statistics, in this paper we build a framework to quantify the degree of irreversibility of any maximum entropy distribution. Our approach is based on the transfer matrix technique, which enables us to find a homogeneous irreducible Markov chain that shares the same maximum entropy measure. We provide relevant examples in the context of spike train statistics.

  13. Predicting the Outcome of NBA Playoffs Based on the Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    Ge Cheng

    2016-12-01

    Full Text Available Predicting the outcome of National Basketball Association (NBA) matches poses a challenging problem of interest to the research community as well as the general public. In this article, we formalize the problem of predicting NBA game results as a classification problem and apply the principle of Maximum Entropy to construct an NBA Maximum Entropy (NBAME) model that fits the discrete statistics of NBA games, and then predict the outcomes of the NBA playoffs using the model. Our results reveal that the model is able to predict the winning team with 74.4% accuracy, outperforming other classical machine learning algorithms that could only achieve a maximum prediction accuracy of 70.6% in the experiments that we performed.

  14. Receiver function estimated by maximum entropy deconvolution

    Institute of Scientific and Technical Information of China (English)

    吴庆举; 田小波; 张乃铃; 李卫平; 曾融生

    2003-01-01

    Maximum entropy deconvolution is presented to estimate the receiver function, with maximum entropy as the rule for determining the auto-correlation and cross-correlation functions. The Toeplitz equation and the Levinson algorithm are used to derive the iterative formula of the error-predicting filter, from which the receiver function is then estimated. During extrapolation, the reflection coefficient is always less than 1, which keeps maximum entropy deconvolution stable. The maximum entropy of the data outside the window increases the resolution of the receiver function. Both synthetic and real seismograms show that maximum entropy deconvolution is an effective method for measuring receiver functions in the time domain.
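The computational core the abstract mentions — solving the Toeplitz normal equations for an error-predicting filter with a Levinson-type recursion — can be sketched as follows (a minimal illustration, not the authors' seismological pipeline; the AR(1) test signal is invented for the demo):

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def prediction_error_filter(x, order):
    """Prediction-error (maximum-entropy) filter from the Toeplitz normal
    equations; scipy's solver uses a Levinson-style recursion internally."""
    n = len(x)
    # Biased autocorrelation estimates r[0..order].
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    # Solve the symmetric Toeplitz system R a = [r1, ..., rp].
    a = solve_toeplitz((r[:order], r[:order]), r[1:order + 1])
    return np.concatenate(([1.0], -a))   # error filter [1, -a1, ..., -ap]

# AR(1) test signal x[t] = 0.8 x[t-1] + e[t]: the error filter should
# come out close to [1, -0.8].
rng = np.random.default_rng(0)
e = rng.standard_normal(5000)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.8 * x[t - 1] + e[t]

f = prediction_error_filter(x, order=1)
print(f.round(2))
```

Deconvolving a seismogram with this filter whitens it, which is the sense in which the method maximizes the entropy of the data outside the observation window.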

  15. Maximum Entropy Principle Based Estimation of Performance Distribution in Queueing Theory

    Science.gov (United States)

    He, Dayi; Li, Ran; Huang, Qi; Lei, Ping

    2014-01-01

    In related research on queuing systems, in order to determine the system state, there is a widespread practice to assume that the system is stable and that distributions of the customer arrival ratio and service ratio are known information. In this study, the queuing system is looked at as a black box without any assumptions on the distribution of the arrival and service ratios and only keeping the assumption on the stability of the queuing system. By applying the principle of maximum entropy, the performance distribution of queuing systems is derived from some easily accessible indexes, such as the capacity of the system, the mean number of customers in the system, and the mean utilization of the servers. Some special cases are modeled and their performance distributions are derived. Using the chi-square goodness of fit test, the accuracy and generality for practical purposes of the principle of maximum entropy approach is demonstrated. PMID:25207992
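The flavor of the result can be seen in the simplest case: if the only accessible index is the mean number of customers in the system, the Lagrange solution of the maximum entropy program over n = 0, 1, 2, … is geometric. A minimal sketch (a textbook special case, not the paper's full model):

```python
import numpy as np

def maxent_queue_distribution(mean_n, n_max=1000):
    """Maximum-entropy distribution of the number in system when only the
    mean is known: p(n) = (1 - rho) * rho**n with rho = mean/(1 + mean),
    i.e. the geometric law, truncated at n_max for the numeric check."""
    rho = mean_n / (1.0 + mean_n)
    n = np.arange(n_max + 1)
    return (1.0 - rho) * rho**n

p = maxent_queue_distribution(mean_n=3.0)
print(round(p.sum(), 6))                        # normalized: 1.0
print(round(np.dot(np.arange(len(p)), p), 4))   # mean recovered: 3.0
# This coincides with the M/M/1 queue-length law at utilization rho = 0.75,
# recovered here from a single easily accessible index (the mean) alone.
```

Adding further constraints (server utilization, capacity) modifies the Lagrange multipliers but follows the same recipe.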

  16. Maximum entropy principle based estimation of performance distribution in queueing theory.

    Science.gov (United States)

    He, Dayi; Li, Ran; Huang, Qi; Lei, Ping

    2014-01-01

    In related research on queuing systems, in order to determine the system state, there is a widespread practice to assume that the system is stable and that distributions of the customer arrival ratio and service ratio are known information. In this study, the queuing system is looked at as a black box without any assumptions on the distribution of the arrival and service ratios and only keeping the assumption on the stability of the queuing system. By applying the principle of maximum entropy, the performance distribution of queuing systems is derived from some easily accessible indexes, such as the capacity of the system, the mean number of customers in the system, and the mean utilization of the servers. Some special cases are modeled and their performance distributions are derived. Using the chi-square goodness of fit test, the accuracy and generality for practical purposes of the principle of maximum entropy approach is demonstrated.

  17. Maximum entropy principle based estimation of performance distribution in queueing theory.

    Directory of Open Access Journals (Sweden)

    Dayi He

    Full Text Available In related research on queuing systems, in order to determine the system state, there is a widespread practice to assume that the system is stable and that distributions of the customer arrival ratio and service ratio are known information. In this study, the queuing system is looked at as a black box without any assumptions on the distribution of the arrival and service ratios and only keeping the assumption on the stability of the queuing system. By applying the principle of maximum entropy, the performance distribution of queuing systems is derived from some easily accessible indexes, such as the capacity of the system, the mean number of customers in the system, and the mean utilization of the servers. Some special cases are modeled and their performance distributions are derived. Using the chi-square goodness of fit test, the accuracy and generality for practical purposes of the principle of maximum entropy approach is demonstrated.

  18. Duality of Maximum Entropy and Minimum Divergence

    Directory of Open Access Journals (Sweden)

    Shinto Eguchi

    2014-06-01

    Full Text Available We discuss a special class of generalized divergence measures by the use of generator functions. Any divergence measure in the class is separated into the difference between cross and diagonal entropy. The diagonal entropy measure in the class associates with a model of maximum entropy distributions; the divergence measure leads to statistical estimation via minimization, for arbitrarily giving a statistical model. The dualistic relationship between the maximum entropy model and the minimum divergence estimation is explored in the framework of information geometry. The model of maximum entropy distributions is characterized to be totally geodesic with respect to the linear connection associated with the divergence. A natural extension for the classical theory for the maximum likelihood method under the maximum entropy model in terms of the Boltzmann-Gibbs-Shannon entropy is given. We discuss the duality in detail for Tsallis entropy as a typical example.

  19. Maximum-entropy clustering algorithm and its global convergence analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Constructing a batch of differentiable entropy functions to uniformly approximate an objective function by means of the maximum-entropy principle, a new clustering algorithm, called the maximum-entropy clustering algorithm, is proposed based on optimization theory. This algorithm is a soft generalization of the hard C-means algorithm and possesses global convergence. Its relations with other clustering algorithms are discussed.
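The "soft generalization of hard C-means" can be sketched with Gibbs-weight memberships: each point's membership in a cluster is proportional to exp(−β·d²), and as β → ∞ the assignments become hard. This is a hypothetical minimal variant for illustration, not the paper's exact algorithm:

```python
import numpy as np

def maxent_cluster(X, k, beta=5.0, iters=100):
    """Maximum-entropy clustering sketch: memberships are Gibbs weights
    exp(-beta * d^2); beta -> infinity recovers hard C-means."""
    # Deterministic initialization: k points spread across the data order.
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        # Subtract the row minimum before exponentiating for stability.
        w = np.exp(-beta * (d2 - d2.min(axis=1, keepdims=True)))
        w /= w.sum(axis=1, keepdims=True)          # soft memberships
        centers = (w.T @ X) / w.sum(axis=0)[:, None]
    return centers, w

# Two well-separated blobs; the centers should land near their means.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(10, 0.5, (50, 2))])
centers, _ = maxent_cluster(X, k=2)
print(np.sort(centers[:, 0]))   # close to 0 and 10
```

At finite β the memberships are differentiable in the centers, which is what makes convergence analysis (and the counterexamples discussed in record 13 below) tractable.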

  20. Zipf's law, power laws and maximum entropy

    Science.gov (United States)

    Visser, Matt

    2013-04-01

    Zipf's law, and power laws in general, have attracted and continue to attract considerable attention in a wide variety of disciplines—from astronomy to demographics to software structure to economics to linguistics to zoology, and even warfare. A recent model of random group formation (RGF) attempts a general explanation of such phenomena based on Jaynes' notion of maximum entropy applied to a particular choice of cost function. In the present paper I argue that the specific cost function used in the RGF model is in fact unnecessarily complicated, and that power laws can be obtained in a much simpler way by applying maximum entropy ideas directly to the Shannon entropy subject only to a single constraint: that the average of the logarithm of the observable quantity is specified.
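The single-constraint construction the abstract describes is short enough to verify numerically: maximizing the Shannon entropy subject only to a fixed mean of ln k gives p(k) ∝ exp(−α ln k) = k^(−α), a pure power law. A sketch (the support size N and target constraint value are chosen arbitrarily for the demo):

```python
import numpy as np
from scipy.optimize import brentq

N = 10_000                       # support k = 1..N (demo choice)
k = np.arange(1, N + 1)

def mean_log(alpha):
    """<ln k> under the candidate maxent solution p(k) ~ k**(-alpha)."""
    p = k**(-alpha)
    p /= p.sum()
    return float(np.dot(p, np.log(k)))

target = 2.0                     # the single constraint: <ln k> = 2.0
# Solve for the Lagrange multiplier (the exponent alpha).
alpha = brentq(lambda a: mean_log(a) - target, 1.01, 5.0)
p = k**(-alpha)
p /= p.sum()

print(round(float(np.dot(p, np.log(k))), 3))   # constraint satisfied: 2.0
# The log-log slope of the maxent solution is exactly -alpha: a power law.
slope = np.polyfit(np.log(k), np.log(p), 1)[0]
print(round(slope + alpha, 6))                 # 0.0
```

No cost function beyond the single ⟨ln k⟩ constraint is needed, which is the paper's point against the more elaborate RGF construction.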

  1. Zipf's law, power laws, and maximum entropy

    CERN Document Server

    Visser, Matt

    2012-01-01

    Zipf's law, and power laws in general, have attracted and continue to attract considerable attention in a wide variety of disciplines - from astronomy to demographics to economics to linguistics to zoology, and even warfare. A recent model of random group formation [RGF] attempts a general explanation of such phenomena based on Jaynes' notion of maximum entropy applied to a particular choice of cost function. In the present article I argue that the cost function used in the RGF model is in fact unnecessarily complicated, and that power laws can be obtained in a much simpler way by applying maximum entropy ideas directly to the Shannon entropy subject only to a single constraint: that the average of the logarithm of the observable quantity is specified.

  2. A strong test of a maximum entropy model of trait-based community assembly.

    Science.gov (United States)

    Shipley, Bill; Laughlin, Daniel C; Sonnier, Grégory; Otfinowski, Rafael

    2011-02-01

    We evaluate the predictive power and generality of Shipley's maximum entropy (maxent) model of community assembly in the context of 96 quadrats over a 120-km² area having a large (79) species pool and strong gradients. Quadrats were sampled in the herbaceous understory of ponderosa pine forests in the Coconino National Forest, Arizona, U.S.A. The maxent model accurately predicted species relative abundances when observed community-weighted mean trait values were used as model constraints. Although only 53% of the variation in observed relative abundances was associated with a combination of 12 environmental variables, the maxent model based only on the environmental variables provided highly significant predictive ability, accounting for 72% of the variation that was possible given these environmental variables. This predictive ability largely surpassed that of nonmetric multidimensional scaling (NMDS) or detrended correspondence analysis (DCA) ordinations. Using cross-validation with 1000 independent runs, the median correlation between observed and predicted relative abundances was 0.560 (the 2.5% and 97.5% quantiles were 0.045 and 0.825). The qualitative predictions of the model were also noteworthy: dominant species were correctly identified in 53% of the quadrats, 83% of rare species were correctly predicted to have a relative abundance of < 0.05, and the median predicted relative abundance of species actually absent from a quadrat was 5 × 10⁻⁵.

  3. Semisupervised learning for a hybrid generative/discriminative classifier based on the maximum entropy principle.

    Science.gov (United States)

    Fujino, Akinori; Ueda, Naonori; Saito, Kazumi

    2008-03-01

    This paper presents a method for designing semi-supervised classifiers trained on labeled and unlabeled samples. We focus on probabilistic semi-supervised classifier design for multi-class and single-labeled classification problems, and propose a hybrid approach that takes advantage of generative and discriminative approaches. In our approach, we first consider a generative model trained by using labeled samples and introduce a bias correction model, where these models belong to the same model family, but have different parameters. Then, we construct a hybrid classifier by combining these models based on the maximum entropy principle. To enable us to apply our hybrid approach to text classification problems, we employed naive Bayes models as the generative and bias correction models. Our experimental results for four text data sets confirmed that the generalization ability of our hybrid classifier was much improved by using a large number of unlabeled samples for training when there were too few labeled samples to obtain good performance. We also confirmed that our hybrid approach significantly outperformed generative and discriminative approaches when the performance of the generative and discriminative approaches was comparable. Moreover, we examined the performance of our hybrid classifier when the labeled and unlabeled data distributions were different.

  4. Maximum entropy PDF projection: A review

    Science.gov (United States)

    Baggenstoss, Paul M.

    2017-06-01

    We review maximum entropy (MaxEnt) PDF projection, a method with wide potential applications in statistical inference. The method constructs a sampling distribution for a high-dimensional vector x based on knowing the sampling distribution p(z) of a lower-dimensional feature z = T(x). Under mild conditions, the distribution p(x) having highest possible entropy among all distributions consistent with p(z) may be readily found. Furthermore, the MaxEnt p(x) may be sampled, making the approach useful in Monte Carlo methods. We review the theorem and present a case study in model order selection and classification for handwritten character recognition.

  5. Quantile-based Bayesian maximum entropy approach for spatiotemporal modeling of ambient air quality levels.

    Science.gov (United States)

    Yu, Hwa-Lung; Wang, Chih-Hsin

    2013-02-05

    Understanding the daily changes in ambient air quality concentrations is important to assessing human exposure and environmental health. However, the fine temporal scales (e.g., hourly) involved in this assessment often lead to high variability in air quality concentrations. This is because of the complex short-term physical and chemical mechanisms among the pollutants. Consequently, high heterogeneity is usually present in not only the averaged pollution levels, but also the intraday variance levels of the daily observations of ambient concentration across space and time. This characteristic decreases the estimation performance of common techniques. This study proposes a novel quantile-based Bayesian maximum entropy (QBME) method to account for the nonstationary and nonhomogeneous characteristics of ambient air pollution dynamics. The QBME method characterizes the spatiotemporal dependence among the ambient air quality levels based on their location-specific quantiles and accounts for spatiotemporal variations using a local weighted smoothing technique. The epistemic framework of the QBME method can allow researchers to further consider the uncertainty of space-time observations. This study presents the spatiotemporal modeling of daily CO and PM10 concentrations across Taiwan from 1998 to 2009 using the QBME method. Results show that the QBME method can effectively improve estimation accuracy in terms of lower mean absolute errors and standard deviations over space and time, especially for pollutants with strong nonhomogeneous variances across space. In addition, the epistemic framework can allow researchers to assimilate the site-specific secondary information where the observations are absent because of the common preferential sampling issues of environmental data. The proposed QBME method provides a practical and powerful framework for the spatiotemporal modeling of ambient pollutants.

  6. Surface Elevation Distribution of Sea Waves Based on the Maximum Entropy Principle

    Institute of Scientific and Technical Information of China (English)

    戴德君; 王伟; 钱成春; 孙孚

    2001-01-01

    A probability density function of surface elevation is obtained through improvement of the method introduced by Cieslikiewicz, who employed the maximum entropy principle to investigate the surface elevation distribution. The density function can be easily extended to higher order according to demand and is non-negative everywhere, satisfying the basic behavior of the probability. Moreover, because the distribution is derived without any assumption about sea waves, it is found from comparison with several accepted distributions that the new form of distribution can be applied in a wider range of wave conditions. In addition, the density function can be used to fit some observed distributions of surface vertical acceleration, although something remains unsolved.

  7. A Method of LSB substitution based on image blocks and maximum entropy

    Directory of Open Access Journals (Sweden)

    Mohamed Radouane

    2013-01-01

    Full Text Available In this paper we introduce a digital watermarking algorithm based on embedding a watermark into sub-images with the LSB technique. The watermark is embedded into specific blocks of the host image, and the selection of blocks is based on their entropy values. The simulation results show that the visual quality of the watermarked image and of the extracted watermark is good, as demonstrated by a high PSNR value.
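The block-selection idea can be sketched in a few lines: compute the intensity entropy of each block and hide the watermark bits in the LSBs of the most "textured" (highest-entropy) block, where the change is least visible. This is a minimal illustration under assumed parameters (8×8 blocks, a synthetic 16×16 image), not the paper's exact scheme:

```python
import numpy as np

def block_entropy(block):
    """Shannon entropy (bits) of an 8-bit block's intensity histogram."""
    hist = np.bincount(block.ravel(), minlength=256).astype(float)
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log2(p)))

def embed_lsb(image, bits, block=8):
    """Hide watermark bits in the LSBs of the highest-entropy block."""
    h, w = image.shape
    scored = [(block_entropy(image[i:i + block, j:j + block]), i, j)
              for i in range(0, h, block) for j in range(0, w, block)]
    _, i, j = max(scored)                    # most textured block wins
    out = image.copy()
    target = out[i:i + block, j:j + block].ravel()   # copy of the block
    target[:len(bits)] = (target[:len(bits)] & 0xFE) | bits
    out[i:i + block, j:j + block] = target.reshape(block, block)
    return out, (i, j)

rng = np.random.default_rng(0)
img = np.zeros((16, 16), dtype=np.uint8)
img[8:, 8:] = rng.integers(0, 256, (8, 8), dtype=np.uint8)  # noisy block
bits = np.array([1, 0, 1, 1, 0, 1, 0, 0], dtype=np.uint8)
stego, (i, j) = embed_lsb(img, bits)
print((i, j))                      # the noisy block is selected: (8, 8)
print(stego[i, j:j + 8] & 1)       # LSBs recover the watermark bits
```

Flat blocks (entropy near 0) are skipped because a single flipped LSB in a uniform region is easy to notice and easy to attack.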

  8. Maximum entropy production in daisyworld

    Science.gov (United States)

    Maunu, Haley A.; Knuth, Kevin H.

    2012-05-01

    Daisyworld was first introduced in 1983 by Watson and Lovelock as a model that illustrates how life can influence a planet's climate. These models typically involve modeling a planetary surface on which black and white daisies can grow thus influencing the local surface albedo and therefore also the temperature distribution. Since then, variations of daisyworld have been applied to study problems ranging from ecological systems to global climate. Much of the interest in daisyworld models is due to the fact that they enable one to study self-regulating systems. These models are nonlinear, and as such they exhibit sensitive dependence on initial conditions, and depending on the specifics of the model they can also exhibit feedback loops, oscillations, and chaotic behavior. Many daisyworld models are thermodynamic in nature in that they rely on heat flux and temperature gradients. However, what is not well-known is whether, or even why, a daisyworld model might settle into a maximum entropy production (MEP) state. With the aim to better understand these systems, this paper will discuss what is known about the role of MEP in daisyworld models.

  9. Study on Droplet Size and Velocity Distributions of a Pressure Swirl Atomizer Based on the Maximum Entropy Formalism

    Directory of Open Access Journals (Sweden)

    Kai Yan

    2015-01-01

    Full Text Available A predictive model for the droplet size and velocity distributions of a pressure swirl atomizer is proposed based on the maximum entropy formalism (MEF). The constraint conditions of the MEF model include the conservation laws of mass, momentum, and energy. The effects of liquid swirling strength, Weber number, gas-to-liquid axial velocity ratio and gas-to-liquid density ratio on the droplet size and velocity distributions of a pressure swirl atomizer are investigated. Results show that the model based on the maximum entropy formalism predicts droplet size and velocity distributions well under different spray conditions. Liquid swirling strength, Weber number, gas-to-liquid axial velocity ratio and gas-to-liquid density ratio each have distinct effects on the droplet size and velocity distributions of a pressure swirl atomizer.

  10. Pareto versus lognormal: a maximum entropy test.

    Science.gov (United States)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

    It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating processes even in the case when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.

  11. Generalised maximum entropy and heterogeneous technologies

    NARCIS (Netherlands)

    Oude Lansink, A.G.J.M.

    1999-01-01

    Generalised maximum entropy methods are used to estimate a dual model of production on panel data of Dutch cash crop farms over the period 1970-1992. The generalised maximum entropy approach allows a coherent system of input demand and output supply equations to be estimated for each farm in the sample.

  12. Estimation of Wild Fire Risk Area based on Climate and Maximum Entropy in Korean Peninsular

    Science.gov (United States)

    Kim, T.; Lim, C. H.; Song, C.; Lee, W. K.

    2015-12-01

    The number of forest fires and the accompanying human injuries and physical damage have increased owing to frequent drought. In this study, the forest fire danger zones of Korea are estimated to predict and prepare for future forest fire hazard regions. The MaxEnt (Maximum Entropy) model, which estimates the probability distribution of a status, is used to estimate the forest fire hazard regions. The MaxEnt model was designed primarily for the analysis of species distributions, but its applicability to various natural disasters is gaining recognition. Detailed forest fire occurrence data collected by MODIS for the past 5 years (2010-2014) are used as occurrence data for the model, and meteorology, topography and vegetation data are used as environmental variables. In particular, various meteorological variables are used to assess the impact of climate, such as annual average temperature, annual precipitation, precipitation in the dry season, annual effective humidity, effective humidity in the dry season, and aridity index. The result was valid based on the AUC (Area Under the Curve) value of 0.805, which is used to assess prediction accuracy in the MaxEnt model, and the predicted forest fire locations corresponded well with the actual forest fire distribution map. Meteorological variables such as effective humidity showed the greatest contribution, and topographic variables such as TWI (Topographic Wetness Index) and slope also contributed to forest fire occurrence. As a result, the east coast and the southern part of the Korean Peninsula were predicted to have a high risk of forest fire, whereas high-altitude mountain areas and the west coast appeared to be safe from forest fire. These results are consistent with former studies, indicating a high risk of forest fire in accessible areas and reflecting the climatic characteristics of the east and south in the dry season. To sum up, we estimated the forest fire hazard zones from existing forest fire locations and environmental variables.

  13. Counterexamples to convergence theorem of maximum-entropy clustering algorithm

    Institute of Scientific and Technical Information of China (English)

    于剑; 石洪波; 黄厚宽; 孙喜晨; 程乾生

    2003-01-01

    In this paper, we surveyed the development of maximum-entropy clustering algorithm, pointed out that the maximum-entropy clustering algorithm is not new in essence, and constructed two examples to show that the iterative sequence given by the maximum-entropy clustering algorithm may not converge to a local minimum of its objective function, but a saddle point. Based on these results, our paper shows that the convergence theorem of maximum-entropy clustering algorithm put forward by Kenneth Rose et al. does not hold in general cases.

  14. A Maximum Entropy-Based Chaotic Time-Variant Fragile Watermarking Scheme for Image Tampering Detection

    Directory of Open Access Journals (Sweden)

    Guo-Jheng Yang

    2013-08-01

    Full Text Available The fragile watermarking technique is used to protect intellectual property rights while also providing security and rigorous protection. In order to protect the copyright of the creators, it can be implanted in some representative text or totem. Because all of the media on the Internet are digital, protection has become a critical issue, and determining how to use digital watermarks to protect digital media is thus the topic of our research. This paper uses the Logistic map with parameter u = 4 to generate chaotic dynamic behavior with the maximum entropy 1. This approach increases the security and rigor of the protection. The main research target of information hiding is determining how to hide confidential data so that the naked eye cannot see the difference. Next, we introduce one method of information hiding. Generally speaking, if the image only goes through Arnold’s cat map and the Logistic map, it seems to lack sufficient security. Therefore, our emphasis is on controlling Arnold’s cat map and the initial value of the chaos system to undergo small changes and generate different chaos sequences. Thus, the current time is used to not only make encryption more stringent but also to enhance the security of the digital media.
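The chaotic keystream the abstract builds on is easy to demonstrate: at u = 4 the logistic map is fully chaotic, its binary symbol sequence attains the maximum entropy of 1 bit per iterate, and tiny changes of the initial value yield entirely different sequences. A minimal sketch (parameter choices are illustrative, not the paper's):

```python
import numpy as np

def logistic_sequence(x0, n, u=4.0):
    """Iterate the logistic map x <- u*x*(1-x)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = u * x * (1.0 - x)
        xs[i] = x
    return xs

# Sensitive dependence: seeds differing by 1e-9 decorrelate within a few
# dozen iterates, which is why a small controlled change of the initial
# value (as in the time-variant scheme) gives a new chaos sequence.
a = logistic_sequence(0.3, 60)
b = logistic_sequence(0.3 + 1e-9, 60)
print(abs(a[0] - b[0]) < 1e-8)                 # still indistinguishable
print(np.abs(a[-20:] - b[-20:]).max() > 0.1)   # now far apart
# The symbol stream (x > 0.5) is balanced, consistent with entropy ~1 bit:
frac_ones = (logistic_sequence(0.3, 100_000) > 0.5).mean()
print(round(frac_ones, 2))
```

The symbol stream can then drive the pixel permutation (e.g., the iteration count of Arnold's cat map) so the fragile watermark changes with every embedding.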

  15. Maximum Joint Entropy and Information-Based Collaboration of Automated Learning Machines

    CERN Document Server

    Malakar, N K; Lary, D J

    2011-01-01

    We are working to develop automated intelligent agents, which can act and react as learning machines with minimal human intervention. To accomplish this, an intelligent agent is viewed as a question-asking machine, which is designed by coupling the processes of inference and inquiry to form a model-based learning unit. In order to select maximally-informative queries, the intelligent agent needs to be able to compute the relevance of a question. This is accomplished by employing the inquiry calculus, which is dual to the probability calculus, and extends information theory by explicitly requiring context. Here, we consider the interaction between two question-asking intelligent agents, and note that there is a potential information redundancy with respect to the two questions that the agents may choose to pose. We show that the information redundancy is minimized by maximizing the joint entropy of the questions, which simultaneously maximizes the relevance of each question while minimizing the mutual information between them.

  16. A dual method for maximum entropy restoration

    Science.gov (United States)

    Smith, C. B.

    1979-01-01

    A simple iterative dual algorithm for maximum entropy image restoration is presented. The dual algorithm involves fewer parameters than conventional minimization in the image space. Minicomputer test results for Fourier synthesis with inadequate phantom data are given.

  17. The maximum entropy technique. System's statistical description

    CERN Document Server

    Belashev, B Z

    2002-01-01

    The maximum entropy technique (MENT) is applied to find the distribution functions of physical quantities. MENT naturally takes into consideration the requirement of maximum entropy, the characteristics of the system, and the connection conditions. This makes it possible to apply MENT to the statistical description of both closed and open systems. Examples are considered in which MENT has been used to describe equilibrium states, nonequilibrium states, and states far from thermodynamic equilibrium.

  18. A seqlet-based maximum entropy Markov approach for protein secondary structure prediction

    Institute of Scientific and Technical Information of China (English)

    DONG; Qiwen; WANG; Xiaolong; LIN; Lei; GUAN; Yi

    2005-01-01

    A novel method for predicting the secondary structures of proteins from amino acid sequences is presented. Protein secondary structure seqlets, analogous to words in natural language, are extracted. These seqlets capture the relationship between the amino acid sequence and the secondary structures of proteins and together form a protein secondary structure dictionary. To be precise, the dictionary is organism-specific. Protein secondary structure prediction is formulated as an integrated word segmentation and part-of-speech tagging problem. A word lattice is used to represent the results of the word segmentation, and a maximum entropy model is used to calculate the probability of a seqlet tagged as a certain secondary structure type. The method is Markovian in the seqlets, permitting efficient exact calculation of the posterior probability distribution over all possible word segmentations and their tags by the Viterbi algorithm. The optimal segmentations and their tags are computed as the results of protein secondary structure prediction. The method is applied to predict the secondary structures of proteins of four organisms and is compared with the PHD method. The results show that the performance of this method is higher than that of PHD by about 3.9% in Q3 accuracy and 4.6% in SOV accuracy. Combining with locally similar protein sequences obtained by BLAST gives better predictions. The method is also tested on the 50 CASP5 target proteins, with 78.9% Q3 accuracy and 77.1% SOV accuracy. A web server for protein secondary structure prediction has been constructed and is available at http://www.insun.hit.edu.cn:81/demos/biology/index.html.

  19. Droplets diameter distribution using maximum entropy formulation combined with a new energy-based sub-model

    Institute of Scientific and Technical Information of China (English)

    Seyed Mostafa Hosseinalipour; Hadiseh Karimaei; Ehsan Movahednejad

    2016-01-01

    The maximum entropy principle (MEP) is one of the first methods used to predict droplet size and velocity distributions of liquid sprays. The method requires a mean droplet diameter as an input in order to predict the droplet size distribution. This paper presents a new sub-model, based on the deterministic aspects of the liquid atomization process and independent of experimental data, that supplies the mean droplet diameter for use in the maximum entropy formulation (MEF). For this purpose, a theoretical model based on the energy conservation law, called the energy-based model (EBM), is presented. In this approach, atomization occurs due to kinetic energy loss. Predictions of the combined model (MEF/EBM) are in good agreement with the available experimental data. The EBM can be used as a fast and sufficiently reliable way to obtain a good estimate of the mean droplet diameter of a spray, and the combined MEF/EBM model predicts the droplet size distribution at primary breakup well.
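
For orientation, the core MEF step the abstract builds on can be written out explicitly. The following is a textbook sketch, not the paper's full formulation (which also imposes mass, momentum, and energy constraints): maximizing the Shannon entropy of the drop-size distribution subject only to normalization and the mean diameter supplied by a sub-model such as the EBM yields an exponential distribution.

```latex
\max_{f}\; -\int_0^\infty f(D)\,\ln f(D)\,\mathrm{d}D
\quad\text{s.t.}\quad
\int_0^\infty f(D)\,\mathrm{d}D = 1,
\qquad
\int_0^\infty D\,f(D)\,\mathrm{d}D = \bar D
```

Introducing Lagrange multipliers for the two constraints gives \(f(D) = \lambda e^{-\lambda D}\) with \(\lambda = 1/\bar D\): the least-biased size distribution consistent with the prescribed mean.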

  20. Towards realizable hyperbolic moment closures for viscous heat-conducting gas flows based on a maximum-entropy distribution

    Science.gov (United States)

    McDonald, James G.; Groth, Clinton P. T.

    2013-09-01

    The ability to predict continuum and transition-regime flows by hyperbolic moment methods offers the promise of several advantages over traditional techniques. These methods offer an extended range of physical validity as compared with the Navier-Stokes equations and can be used for the prediction of many non-equilibrium flows with a lower expense than particle-based methods. Also, the hyperbolic first-order nature of the resulting partial differential equations leads to mathematical and numerical advantages. Moment equations generated through an entropy-maximization principle are particularly attractive due to their apparent robustness; however, their application to practical situations involving viscous, heat-conducting gases has been hampered by several issues. Firstly, the lack of closed-form expressions for closing fluxes leads to numerical expense as many integrals of distribution functions must be computed numerically during the course of a flow computation. Secondly, it has been shown that there exist physically realizable moment states for which the entropy-maximizing problem on which the method is based cannot be solved. Following a review of the theory surrounding maximum-entropy moment closures, this paper shows that both of these problems can be addressed in practice, at least for a simplified one-dimensional gas, and that the resulting flow predictions can be surprisingly good. The numerical results described provide significant motivations for the extension of these ideas to the fully three-dimensional case.

  1. Assessing suitable area for Acacia dealbata Mill. in the Ceira River Basin (Central Portugal) based on maximum entropy modelling approach

    Directory of Open Access Journals (Sweden)

    Jorge Pereira

    2015-12-01

    Biological invasion by exotic organisms has become a key issue, a concern associated with the deep impacts such processes have across several domains. A better understanding of these processes, the identification of the most susceptible areas, and the definition of preventive or mitigation measures are critical to reducing the associated impacts. Species distribution modelling can help identify the areas most susceptible to invasion. This paper presents preliminary results on assessing the susceptibility of the Ceira river basin to invasion by the exotic species Acacia dealbata Mill. The results are based on the maximum entropy modelling approach, considered one of the correlative modelling techniques with the best predictive performance. Models whose validation is based on independent data sets show better performance, as evaluated by the AUC of the ROC accuracy measure.

  2. Proposed principles of maximum local entropy production.

    Science.gov (United States)

    Ross, John; Corlan, Alexandru D; Müller, Stefan C

    2012-07-12

    Articles have appeared that rely on the application of some form of "maximum local entropy production principle" (MEPP). This is usually an optimization principle that is supposed to compensate for the lack of structural information and measurements about complex systems, even systems as complex and as little characterized as the whole biosphere or the atmosphere of the Earth or even of less known bodies in the solar system. We select a number of claims from a few well-known papers that advocate this principle and we show that they are in error with the help of simple examples of well-known chemical and physical systems. These erroneous interpretations can be attributed to ignoring well-established and verified theoretical results such as (1) entropy does not necessarily increase in nonisolated systems, such as "local" subsystems; (2) macroscopic systems, as described by classical physics, are in general intrinsically deterministic-there are no "choices" in their evolution to be selected by using supplementary principles; (3) macroscopic deterministic systems are predictable to the extent to which their state and structure is sufficiently well-known; usually they are not sufficiently known, and probabilistic methods need to be employed for their prediction; and (4) there is no causal relationship between the thermodynamic constraints and the kinetics of reaction systems. In conclusion, any predictions based on MEPP-like principles should not be considered scientifically founded.

  3. Maximum entropy model for business cycle synchronization

    Science.gov (United States)

    Xi, Ning; Muneepeerakul, Rachata; Azaele, Sandro; Wang, Yougui

    2014-11-01

    The global economy is a complex dynamical system whose cyclical fluctuations can mainly be characterized by simultaneous recessions or expansions of major economies. Research on this synchronization phenomenon is therefore key to understanding and controlling the dynamics of the global economy. Based on a pairwise maximum entropy model, we analyze the business cycle synchronization of the G7 economic system. We obtain a pairwise-interaction network that exhibits a clustering structure and accounts for 45% of the entire structure of the interactions within the G7 system. We also find that pairwise interactions become increasingly inadequate in capturing the synchronization as the size of the economic system grows. Thus, higher-order interactions must be taken into account when investigating the behavior of large economic systems.
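
The pairwise maximum entropy model referred to here is the Ising-type distribution P(s) ∝ exp(Σᵢ hᵢsᵢ + ½ sᵀJs) over binary recession/expansion states. As a hedged illustration (synthetic data, exact enumeration over a tiny system, not the paper's code or data), fitting it reduces to matching first and second moments:

```python
import itertools

import numpy as np

# Hedged illustration (synthetic data, not the paper's code): fit a pairwise
# maximum entropy (Ising-type) model
#     P(s) ∝ exp( sum_i h_i s_i + (1/2) s^T J s ),   s_i in {-1, +1},
# by exact enumeration over all 2^n states and gradient ascent on the
# log-likelihood, which amounts to matching first and second moments.

def fit_pairwise_maxent(samples, lr=0.1, steps=3000):
    n = samples.shape[1]
    states = np.array(list(itertools.product([-1, 1], repeat=n)))
    h, J = np.zeros(n), np.zeros((n, n))
    emp_m = samples.mean(axis=0)                   # empirical <s_i>
    emp_c = samples.T @ samples / len(samples)     # empirical <s_i s_j>
    for _ in range(steps):
        E = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
        p = np.exp(E - E.max())
        p /= p.sum()                               # model probabilities
        mod_m = p @ states
        mod_c = np.einsum('s,si,sj->ij', p, states, states)
        h += lr * (emp_m - mod_m)                  # moment-matching updates
        dJ = lr * (emp_c - mod_c)
        np.fill_diagonal(dJ, 0.0)                  # s_i^2 = 1 carries no info
        J += dJ
    return h, J, states, p

rng = np.random.default_rng(0)
samples = rng.choice([-1, 1], size=(500, 3))       # 3 toy "economies"
h, J, states, p = fit_pairwise_maxent(samples)
```

At convergence the fitted model reproduces the empirical magnetizations and pair correlations, which is exactly the maximum entropy condition under those constraints.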

  4. The maximum entropy production principle: two basic questions.

    Science.gov (United States)

    Martyushev, Leonid M

    2010-05-12

    The overwhelming majority of applications of maximum entropy production to ecological and environmental systems are based on thermodynamics and statistical physics. Here, we briefly discuss the maximum entropy production principle and raise two questions: (i) can this principle be used as the basis for non-equilibrium thermodynamics and statistical mechanics, and (ii) is it possible to 'prove' the principle? We adduce one more proof, which is the most concise to date.

  5. Nonparametric Maximum Entropy Estimation on Information Diagrams

    CERN Document Server

    Martin, Elliot A; Meinke, Alexander; Děchtěrenko, Filip; Davidsen, Jörn

    2016-01-01

    Maximum entropy estimation is of broad interest for inferring properties of systems across many different disciplines. In this work, we significantly extend a technique we previously introduced for estimating the maximum entropy of a set of random discrete variables when conditioning on bivariate mutual informations and univariate entropies. Specifically, we show how to apply the concept to continuous random variables and vastly expand the types of information-theoretic quantities one can condition on. This allows us to establish a number of significant advantages of our approach over existing ones. Not only does our method perform favorably in the undersampled regime, where existing methods fail, but it also can be dramatically less computationally expensive as the cardinality of the variables increases. In addition, we propose a nonparametric formulation of connected informations and give an illustrative example showing how this agrees with the existing parametric formulation in cases of interest. We furthe...

  6. Maximum Entropy-Based Ecological Niche Model and Bio-Climatic Determinants of Lone Star Tick (Amblyomma americanum) Niche.

    Science.gov (United States)

    Raghavan, Ram K; Goodin, Douglas G; Hanzlicek, Gregg A; Zolnerowich, Gregory; Dryden, Michael W; Anderson, Gary A; Ganta, Roman R

    2016-03-01

    The potential distribution of Amblyomma americanum ticks in Kansas was modeled using maximum entropy (MaxEnt) approaches based on museum and field-collected species occurrence data. Various bioclimatic variables were used in the model as potentially influential factors affecting the A. americanum niche. Following reduction of dimensionality among predictor variables using principal components analysis, which revealed that the first two principal axes explain over 87% of the variance, the model indicated that suitable conditions for this medically important tick species cover a larger area in Kansas than currently believed. Soil moisture, temperature, and precipitation were highly correlated with the first two principal components and were influential factors in the A. americanum ecological niche. Assuming that the niche estimated in this study covers the occupied distribution, which needs to be further confirmed by systematic surveys, human exposure to this known disease vector may be considerably under-appreciated in the state.

  7. A parametrization of two-dimensional turbulence based on a maximum entropy production principle with a local conservation of energy

    CERN Document Server

    Chavanis, Pierre-Henri

    2014-01-01

    In the context of two-dimensional (2D) turbulence, we apply the maximum entropy production principle (MEPP) by enforcing a local conservation of energy. This leads to an equation for the vorticity distribution that conserves all the Casimirs, the energy, and that increases monotonically the mixing entropy ($H$-theorem). Furthermore, the equation for the coarse-grained vorticity dissipates monotonically all the generalized enstrophies. These equations may provide a parametrization of 2D turbulence. They do not generally relax towards the maximum entropy state. The vorticity current vanishes for any steady state of the 2D Euler equation. Interestingly, the equation for the coarse-grained vorticity obtained from the MEPP turns out to coincide, after some algebraic manipulations, with the one obtained with the anticipated vorticity method. This shows a connection between these two approaches when the conservation of energy is treated locally. Furthermore, the newly derived equation, which incorporates a diffusion term and a drift term, has a nice physical interpretation in terms of a selective decay principle. This sheds new light on both the MEPP and the anticipated vorticity method.

  8. [Study on the maximum entropy principle and population genetic equilibrium].

    Science.gov (United States)

    Zhang, Hong-Li; Zhang, Hong-Yan

    2006-03-01

    A general mathematical model of population genetic equilibrium at one locus was constructed on the basis of the maximum entropy principle by WANG Xiao-Long et al. They proved that the maximum of the model is exactly the frequency distribution at which a population reaches Hardy-Weinberg genetic equilibrium. This suggests that a population reaches Hardy-Weinberg equilibrium when the genotype entropy of the population reaches its maximal possible value, and that the maximum entropy frequency distribution is equivalent to the distribution given by the Hardy-Weinberg equilibrium law at one locus. They further assumed that the maximum entropy frequency distribution is equivalent to all genetic equilibrium distributions. This is incorrect, however: the maximum entropy distribution is equivalent to the Hardy-Weinberg equilibrium distribution only for one locus or a limited number of loci. The case of limited loci is proved in this paper. Finally, we discuss an example in which the maximum entropy principle is not equivalent to other genetic equilibria.
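
The one-locus equivalence stated above can be checked numerically. The sketch below is illustrative, not the authors' model: it maximizes genotype entropy relative to the multiplicity prior (1, 2, 1) for the genotypes (AA, Aa, aa), subject to a fixed allele frequency p, and recovers the Hardy-Weinberg proportions (p², 2pq, q²):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative numerical check (not the authors' model): for one biallelic
# locus, maximizing the genotype entropy relative to the multiplicity prior
# m = (1, 2, 1) for (AA, Aa, aa), subject to a fixed allele frequency p,
# recovers the Hardy-Weinberg proportions (p^2, 2pq, q^2).

p = 0.3                                     # frequency of allele A (q = 0.7)
m = np.array([1.0, 2.0, 1.0])               # heterozygote multiplicity

def neg_entropy(f):
    f = np.clip(f, 1e-12, None)
    return float(np.sum(f * np.log(f / m)))     # minimize KL(f || m)

constraints = [
    {"type": "eq", "fun": lambda f: f.sum() - 1.0},          # normalization
    {"type": "eq", "fun": lambda f: f[0] + 0.5 * f[1] - p},  # allele-A frequency
]
res = minimize(neg_entropy, x0=np.full(3, 1 / 3), bounds=[(0.0, 1.0)] * 3,
               constraints=constraints)
f_AA, f_Aa, f_aa = res.x
```

For p = 0.3 the optimizer returns approximately (0.09, 0.42, 0.49), i.e. (p², 2pq, q²), which is the claimed equivalence at a single locus.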

  9. Maximum entropy production and the fluctuation theorem

    Energy Technology Data Exchange (ETDEWEB)

    Dewar, R C [Unite EPHYSE, INRA Centre de Bordeaux-Aquitaine, BP 81, 33883 Villenave d' Ornon Cedex (France)

    2005-05-27

    Recently the author used an information theoretical formulation of non-equilibrium statistical mechanics (MaxEnt) to derive the fluctuation theorem (FT) concerning the probability of second law violating phase-space paths. A less rigorous argument leading to the variational principle of maximum entropy production (MEP) was also given. Here a more rigorous and general mathematical derivation of MEP from MaxEnt is presented, and the relationship between MEP and the FT is thereby clarified. Specifically, it is shown that the FT allows a general orthogonality property of maximum information entropy to be extended to entropy production itself, from which MEP then follows. The new derivation highlights MEP and the FT as generic properties of MaxEnt probability distributions involving anti-symmetric constraints, independently of any physical interpretation. Physically, MEP applies to the entropy production of those macroscopic fluxes that are free to vary under the imposed constraints, and corresponds to selection of the most probable macroscopic flux configuration. In special cases MaxEnt also leads to various upper bound transport principles. The relationship between MaxEnt and previous theories of irreversible processes due to Onsager, Prigogine and Ziegler is also clarified in the light of these results. (letter to the editor)

  10. Maximum-entropy description of animal movement.

    Science.gov (United States)

    Fleming, Chris H; Subaşı, Yiğit; Calabrese, Justin M

    2015-03-01

    We introduce a class of maximum-entropy states that naturally includes within it all of the major continuous-time stochastic processes that have been applied to animal movement, including Brownian motion, Ornstein-Uhlenbeck motion, integrated Ornstein-Uhlenbeck motion, a recently discovered hybrid of the previous models, and a new model that describes central-place foraging. We are also able to predict a further hierarchy of new models that will emerge as data quality improves to better resolve the underlying continuity of animal movement. Finally, we also show that Langevin equations must obey a fluctuation-dissipation theorem to generate processes that fall from this class of maximum-entropy distributions when the constraints are purely kinematic.

  11. Automatic maximum entropy spectral reconstruction in NMR.

    Science.gov (United States)

    Mobli, Mehdi; Maciejewski, Mark W; Gryk, Michael R; Hoch, Jeffrey C

    2007-10-01

    Developments in superconducting magnets, cryogenic probes, isotope labeling strategies, and sophisticated pulse sequences together have enabled the application, in principle, of high-resolution NMR spectroscopy to biomolecular systems approaching 1 megadalton. In practice, however, conventional approaches to NMR that utilize the fast Fourier transform, which require data collected at uniform time intervals, result in prohibitively lengthy data collection times in order to achieve the full resolution afforded by high field magnets. A variety of approaches that involve nonuniform sampling have been proposed, each utilizing a non-Fourier method of spectrum analysis. A very general non-Fourier method that is capable of utilizing data collected using any of the proposed nonuniform sampling strategies is maximum entropy reconstruction. A limiting factor in the adoption of maximum entropy reconstruction in NMR has been the need to specify non-intuitive parameters. Here we describe a fully automated system for maximum entropy reconstruction that requires no user-specified parameters. A web-accessible script generator provides the user interface to the system.

  12. Maximum entropy analysis of EGRET data

    DEFF Research Database (Denmark)

    Pohl, M.; Strong, A.W.

    1997-01-01

    EGRET data are usually analysed on the basis of the Maximum-Likelihood method \cite{ma96} in a search for point sources in excess of a model for the background radiation (e.g. \cite{hu97}). This method depends strongly on the quality of the background model, and may thus have high systematic uncertainties in regions of strong and uncertain background, like the Galactic Center region. Here we show images of such regions obtained by the quantified Maximum-Entropy method. We also discuss a possible further use of MEM in the analysis of problematic regions of the sky.

  13. Stable discretization of the Boltzmann equation based on spherical harmonics, box integration, and a maximum entropy dissipation principle

    Science.gov (United States)

    Jungemann, C.; Pham, A. T.; Meinerzhagen, B.; Ringhofer, C.; Bollhöfer, M.

    2006-07-01

    The Boltzmann equation for transport in semiconductors is projected onto spherical harmonics in such a way that the resultant balance equations for the coefficients of the distribution function times the generalized density of states can be discretized over energy and real spaces by box integration. This ensures exact current continuity for the discrete equations. Spurious oscillations of the distribution function are suppressed by stabilization based on a maximum entropy dissipation principle avoiding the H transformation. The derived formulation can be used on arbitrary grids as long as box integration is possible. The approach works not only with analytical bands but also with full band structures in the case of holes. Results are presented for holes in bulk silicon based on a full band structure and electrons in a Si NPN bipolar junction transistor. The convergence of the spherical harmonics expansion is shown for a device, and it is found that the quasiballistic transport in nanoscale devices requires an expansion of considerably higher order than the usual first one. The stability of the discretization is demonstrated for a range of grid spacings in the real space and bias points which produce huge gradients in the electron density and electric field. It is shown that the resultant large linear system of equations can be solved in a memory efficient way by the numerically robust package ILUPACK.

  14. METSP: a maximum-entropy classifier based text mining tool for transporter-substrate identification with semistructured text.

    Science.gov (United States)

    Zhao, Min; Chen, Yanming; Qu, Dacheng; Qu, Hong

    2015-01-01

    The substrates of a transporter are not only useful for inferring the function of the transporter, but also important for discovering compound-compound interactions and for reconstructing metabolic pathways. Though plenty of data has accumulated with the development of new technologies such as in vitro transporter assays, the search for substrates of transporters is far from complete. In this article, we introduce METSP, a maximum-entropy classifier devoted to retrieving transporter-substrate pairs (TSPs) from semistructured text. Based on the high-quality annotation in UniProt, METSP achieves high precision and recall in cross-validation experiments. When METSP is applied to 182,829 human transporter annotation sentences in UniProt, it identifies 3942 sentences with transporter and compound information. Finally, 1547 high-confidence human TSPs are identified for further manual curation, among which 58.37% are pairs with novel substrates not annotated in public transporter databases. METSP is the first efficient tool to extract TSPs from semistructured annotation text in UniProt. This tool can help to determine the precise substrates and drugs of transporters, thus facilitating drug-target prediction, metabolic network reconstruction, and literature classification.
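
A maximum-entropy classifier of the kind METSP describes is equivalent to multinomial logistic regression. The following is a generic, self-contained sketch; the toy bag-of-words features and labels are hypothetical, not METSP's actual features, training data, or code:

```python
import numpy as np

# Generic sketch of a maximum-entropy (multinomial logistic) classifier,
# the standard model behind MaxEnt text-classification tools. The toy
# data below is hypothetical, not METSP's.

def train_maxent(X, y, n_classes, lr=0.5, steps=500):
    W = np.zeros((X.shape[1], n_classes))
    Y = np.eye(n_classes)[y]                    # one-hot labels
    for _ in range(steps):
        Z = X @ W
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)       # softmax class probabilities
        W += lr * X.T @ (Y - P) / len(X)        # log-likelihood gradient step
    return W

# four toy "annotation sentences", three vocabulary terms, two classes
X = np.array([[1, 0, 1], [1, 1, 0], [0, 1, 1], [0, 1, 0]], dtype=float)
y = np.array([0, 0, 1, 1])                      # 0 = TSP sentence, 1 = other
W = train_maxent(X, y, n_classes=2)
pred = (X @ W).argmax(axis=1)
```

On this linearly separable toy set the trained weights classify all four training sentences correctly; a real system would of course add regularization and richer features.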

  15. Projective Power Entropy and Maximum Tsallis Entropy Distributions

    OpenAIRE

    Shinto Eguchi; Shogo Kato; Osamu Komori

    2011-01-01

    We discuss a one-parameter family of generalized cross entropy between two distributions with the power index, called the projective power entropy. The cross entropy is essentially reduced to the Tsallis entropy if two distributions are taken to be equal. Statistical and probabilistic properties associated with the projective power entropy are extensively investigated including a characterization problem of which conditions uniquely determine the projective power entropy up to the power index...

  16. A parametrization of two-dimensional turbulence based on a maximum entropy production principle with a local conservation of energy

    Energy Technology Data Exchange (ETDEWEB)

    Chavanis, Pierre-Henri, E-mail: chavanis@irsamc.ups-tlse.fr [Laboratoire de Physique Théorique, Université Paul Sabatier, 118 route de Narbonne, F-31062 Toulouse (France)

    2014-12-01

    In the context of two-dimensional (2D) turbulence, we apply the maximum entropy production principle (MEPP) by enforcing a local conservation of energy. This leads to an equation for the vorticity distribution that conserves all the Casimirs, the energy, and that increases monotonically the mixing entropy (H-theorem). Furthermore, the equation for the coarse-grained vorticity dissipates monotonically all the generalized enstrophies. These equations may provide a parametrization of 2D turbulence. They do not generally relax towards the maximum entropy state. The vorticity current vanishes for any steady state of the 2D Euler equation. Interestingly, the equation for the coarse-grained vorticity obtained from the MEPP turns out to coincide, after some algebraic manipulations, with the one obtained with the anticipated vorticity method. This shows a connection between these two approaches when the conservation of energy is treated locally. Furthermore, the newly derived equation, which incorporates a diffusion term and a drift term, has a nice physical interpretation in terms of a selective decay principle. This sheds new light on both the MEPP and the anticipated vorticity method. (paper)

  17. Maximum entropy signal restoration with linear programming

    Energy Technology Data Exchange (ETDEWEB)

    Mastin, G.A.; Hanson, R.J.

    1988-05-01

    Dantzig's bounded-variable method is used to express the maximum entropy restoration problem as a linear programming problem. This is done by approximating the nonlinear objective function with piecewise linear segments, then bounding the variables as a function of the number of segments used. The use of a linear programming approach allows equality constraints found in the traditional Lagrange multiplier method to be relaxed. A robust revised simplex algorithm is used to implement the restoration. Experimental results from 128- and 512-point signal restorations are presented.
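
The reduction the abstract describes can be illustrated as follows. This sketch uses a tangent-line (outer) approximation of the concave entropy term rather than Dantzig's bounded-variable segment form, and SciPy's `linprog` in place of a revised simplex implementation; with only a normalization constraint, the maximum entropy solution is uniform and the LP optimum equals ln 4 for four variables:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative sketch (not Mastin & Hanson's exact formulation): approximate
# the concave entropy term f(x) = -x ln x by its tangent lines, so that
# maximizing sum_i f(x_i) subject to linear constraints becomes a pure LP.

n = 4                                    # toy "signal" length
x0s = np.arange(0.05, 1.0, 0.10)         # linearization points
a = -np.log(x0s) - 1.0                   # f'(x0)
b = -x0s * np.log(x0s) - a * x0s         # intercepts: tangent_k(x) = a_k x + b_k

# variables z = [x_1..x_n, t_1..t_n]; maximize sum t  <=>  minimize -sum t
c = np.concatenate([np.zeros(n), -np.ones(n)])
A_ub, b_ub = [], []
for i in range(n):
    for k in range(len(x0s)):
        row = np.zeros(2 * n)
        row[i] = -a[k]                   # t_i - a_k x_i <= b_k  (t under tangents)
        row[n + i] = 1.0
        A_ub.append(row)
        b_ub.append(b[k])
A_eq = np.concatenate([np.ones(n), np.zeros(n)])[None, :]    # sum x = 1
bounds = [(1e-6, 1.0)] * n + [(None, None)] * n
res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              A_eq=A_eq, b_eq=[1.0], bounds=bounds)
x = res.x[:n]
```

Because the tangent envelope touches f at the linearization points, the LP optimum matches the true maximum entropy ln 4 here; finer tangent grids tighten the approximation, mirroring the role of the segment count in the paper.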

  18. Dynamical maximum entropy approach to flocking

    Science.gov (United States)

    Cavagna, Andrea; Giardina, Irene; Ginelli, Francesco; Mora, Thierry; Piovani, Duccio; Tavarone, Raffaele; Walczak, Aleksandra M.

    2014-04-01

    We derive a new method to infer from data the out-of-equilibrium alignment dynamics of collectively moving animal groups, by considering the maximum entropy model distribution consistent with temporal and spatial correlations of flight direction. When bird neighborhoods evolve rapidly, this dynamical inference correctly learns the parameters of the model, while a static one relying only on the spatial correlations fails. When neighbors change slowly and the detailed balance is satisfied, we recover the static procedure. We demonstrate the validity of the method on simulated data. The approach is applicable to other systems of active matter.

  19. A discussion on maximum entropy production and information theory

    Energy Technology Data Exchange (ETDEWEB)

    Bruers, Stijn [Instituut voor Theoretische Fysica, Celestijnenlaan 200D, Katholieke Universiteit Leuven, B-3001 Leuven (Belgium)

    2007-07-06

    We will discuss the maximum entropy production (MaxEP) principle based on Jaynes' information theoretical arguments, as was done by Dewar (2003 J. Phys. A: Math. Gen. 36 631-41, 2005 J. Phys. A: Math. Gen. 38 371-81). With the help of a simple mathematical model of a non-equilibrium system, we will show how to derive minimum and maximum entropy production. Furthermore, the model will help us to clarify some confusing points and to see differences between some MaxEP studies in the literature.

  20. On the maximum entropy principle in non-extensive thermostatistics

    OpenAIRE

    Naudts, Jan

    2004-01-01

    It is possible to derive the maximum entropy principle from thermodynamic stability requirements. Using as a starting point the equilibrium probability distribution currently used in non-extensive thermostatistics, it turns out that the relevant entropy function is Rényi's alpha-entropy, and not Tsallis' entropy.
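
For reference, the two entropy functionals at issue are (standard definitions, not taken from the abstract):

```latex
S_q^{\text{Tsallis}}[p] \;=\; \frac{1}{q-1}\Big(1 - \sum_i p_i^{\,q}\Big),
\qquad
S_\alpha^{\text{R\'enyi}}[p] \;=\; \frac{1}{1-\alpha}\,\ln \sum_i p_i^{\,\alpha}.
```

For \(\alpha = q\) they are monotone functions of one another, \(S_q^{\text{R\'enyi}} = \frac{1}{1-q}\ln\!\big(1 + (1-q)\,S_q^{\text{Tsallis}}\big)\), so they share the same maximizers under the same constraints; the distinction the abstract draws concerns which functional satisfies the thermodynamic stability requirements.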

  1. Exploiting the Maximum Entropy Principle to Increase Retrieval Effectiveness.

    Science.gov (United States)

    Cooper, William S.

    1983-01-01

    Presents information retrieval design approach in which queries of computer-based system consist of sets of terms, either unweighted or weighted with subjective term precision estimates, and retrieval outputs ranked by probability of usefulness estimated by "maximum entropy principle." Boolean and weighted request systems are discussed.…

  2. Influence of Pareto optimality on the maximum entropy methods

    Science.gov (United States)

    Peddavarapu, Sreehari; Sunil, Gujjalapudi Venkata Sai; Raghuraman, S.

    2017-07-01

    Galerkin meshfree schemes are emerging as a viable alternative to the finite element method for solving partial differential equations in large-deformation and crack-propagation problems. The introduction of the Shannon-Jaynes entropy principle into scattered data approximation has changed how the approximation functions are defined, resulting in maximum entropy approximants. In addition, an objective functional that controls the degree of locality yields local maximum entropy approximants. These are based on an information-theoretic Pareto optimum between entropy and degree of locality that defines the basis functions on the scattered nodes. The degree of locality in turn relies on the choice of a locality parameter and a prior (weight) function, and the proper choice of both plays a vital role in attaining the desired accuracy. The present work focuses on the choice of the locality parameter, which defines the degree of locality, and of the priors: Gaussian, cubic spline, and quartic spline functions, and their effect on the behavior of local maximum entropy approximants.
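
In one dimension, the local maximum entropy approximants discussed here reduce to a scalar Newton iteration for the Lagrange multiplier enforcing linear reproduction. The sketch below assumes a Gaussian prior with locality parameter beta; it is illustrative, not the authors' code:

```python
import numpy as np

# Illustrative 1D sketch (not the authors' code): local maximum entropy (LME)
# shape functions with a Gaussian prior of locality parameter beta. The
# Lagrange multiplier lam enforcing first-order (linear) reproduction is
# found by a scalar Newton iteration on the dual problem.

def lme_shape_functions(x, nodes, beta, iters=60):
    d = nodes - x
    lam = 0.0
    for _ in range(iters):
        w = np.exp(-beta * d**2 - lam * d)     # Gaussian prior times tilt
        p = w / w.sum()                        # candidate shape functions
        r = p @ d                              # residual of sum_a p_a (x_a - x) = 0
        if abs(r) < 1e-13:
            break
        var = p @ d**2 - r**2                  # -dr/dlam, always positive
        lam += r / var                         # Newton update
    return p

nodes = np.linspace(0.0, 1.0, 5)
x = 0.37
p = lme_shape_functions(x, nodes, beta=20.0)
```

The returned weights are nonnegative, sum to one (partition of unity), and reproduce the evaluation point exactly; increasing beta sharpens the functions around the nearest nodes, which is the locality trade-off the paper studies.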

  3. Weak Scale From the Maximum Entropy Principle

    CERN Document Server

    Hamada, Yuta; Kawana, Kiyoharu

    2015-01-01

    The theory of multiverse and wormholes suggests that the parameters of the Standard Model are fixed in such a way that the radiation of the $S^{3}$ universe at the final stage $S_{rad}$ becomes maximum, which we call the maximum entropy principle. Although it is difficult to confirm this principle generally, for a few parameters of the Standard Model, we can check whether $S_{rad}$ actually becomes maximum at the observed values. In this paper, we regard $S_{rad}$ at the final stage as a function of the weak scale (the Higgs expectation value) $v_{h}$, and show that it becomes maximum around $v_{h}={\cal{O}}(300\text{ GeV})$ when the dimensionless couplings in the Standard Model, that is, the Higgs self-coupling, the gauge couplings, and the Yukawa couplings are fixed. Roughly speaking, we find that the weak scale is given by $v_{h}\sim T_{BBN}^{2}/(M_{pl}y_{e}^{5})$, where $y_{e}$ is the electron Yukawa coupling, $T_{BBN}$ is the temperature at which Big Bang nucleosynthesis starts, and $M_{pl}$ is the Planck mass.

  4. Weak scale from the maximum entropy principle

    Science.gov (United States)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2015-03-01

    The theory of the multiverse and wormholes suggests that the parameters of the Standard Model (SM) are fixed in such a way that the radiation of the S^3 universe at the final stage, S_rad, becomes maximum, which we call the maximum entropy principle. Although it is difficult to confirm this principle generally, for a few parameters of the SM, we can check whether S_rad actually becomes maximum at the observed values. In this paper, we regard S_rad at the final stage as a function of the weak scale (the Higgs expectation value) v_h, and show that it becomes maximum around v_h = O(300 GeV) when the dimensionless couplings in the SM, i.e., the Higgs self-coupling, the gauge couplings, and the Yukawa couplings are fixed. Roughly speaking, we find that the weak scale is given by v_h ~ T_BBN^2 / (M_pl y_e^5), where y_e is the Yukawa coupling of the electron, T_BBN is the temperature at which the Big Bang nucleosynthesis starts, and M_pl is the Planck mass.

  5. Maximum entropy reconstruction of spin densities involving non uniform prior

    Energy Technology Data Exchange (ETDEWEB)

    Schweizer, J.; Ressouche, E. [DRFMC/SPSMS/MDN CEA-Grenoble (France); Papoular, R.J. [CEA-Saclay, Gif sur Yvette (France). Lab. Leon Brillouin; Tasset, F. [Inst. Laue Langevin, Grenoble (France); Zheludev, A.I. [Brookhaven National Lab., Upton, NY (United States). Physics Dept.

    1997-09-01

    Diffraction experiments give microscopic information on structures in crystals. A method that uses the concept of maximum entropy (MaxEnt) appears to be a formidable improvement in the treatment of diffraction data. The method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one with the highest prior (intrinsic) probability. Considering all points of the map as equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution. The method has been used for the reconstruction of charge densities from X-ray data, of nuclear density maps from unpolarized neutron data, and of spin density distributions. The density maps obtained by this method, compared with those resulting from the usual inverse Fourier transform, are tremendously improved. In particular, any substantial deviation from the background is really contained in the data, as it costs entropy compared with a map that would ignore such features. In most cases, however, some knowledge about the distribution under investigation exists before the measurements are performed, ranging from the simple type of the scattering electrons to an elaborate theoretical model. In these cases the uniform prior, which considers all pixels as equally likely, is too weak a requirement and has to be replaced. In a rigorous Bayesian analysis, Skilling has shown that prior knowledge can be encoded into the maximum entropy formalism through a model m(r), via a new definition of the entropy given in this paper. In the absence of any data, the maximum of the entropy functional is reached for rho(r) = m(r). Any substantial departure from the model observed in the final map is really contained in the data, as, with the new definition, it costs entropy. This paper presents illustrations of model testing.
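
The "new definition" of entropy referred to is, in Skilling's formulation, the relative entropy with respect to the model m(r) (standard form, quoted here for reference):

```latex
S[\rho] \;=\; \int \left[\,\rho(\vec r) - m(\vec r) - \rho(\vec r)\,
\ln\frac{\rho(\vec r)}{m(\vec r)}\,\right] \mathrm{d}^{3}r .
```

Pointwise, the integrand is concave in \(\rho\) with its maximum, zero, at \(\rho = m\), so \(S \le 0\) with equality only for \(\rho(\vec r) = m(\vec r)\). In the absence of data the MaxEnt map therefore reproduces the model, and any departure from it must be paid for in entropy, i.e. demanded by the data.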

  6. Maximum entropy production and plant optimization theories.

    Science.gov (United States)

    Dewar, Roderick C

    2010-05-12

    Plant ecologists have proposed a variety of optimization theories to explain the adaptive behaviour and evolution of plants from the perspective of natural selection ('survival of the fittest'). Optimization theories identify some objective function--such as shoot or canopy photosynthesis, or growth rate--which is maximized with respect to one or more plant functional traits. However, the link between these objective functions and individual plant fitness is seldom quantified and there remains some uncertainty about the most appropriate choice of objective function to use. Here, plants are viewed from an alternative thermodynamic perspective, as members of a wider class of non-equilibrium systems for which maximum entropy production (MEP) has been proposed as a common theoretical principle. I show how MEP unifies different plant optimization theories that have been proposed previously on the basis of ad hoc measures of individual fitness--the different objective functions of these theories emerge as examples of entropy production on different spatio-temporal scales. The proposed statistical explanation of MEP, that states of MEP are by far the most probable ones, suggests a new and extended paradigm for biological evolution--'survival of the likeliest'--which applies from biomacromolecules to ecosystems, not just to individuals.

  7. Maximum entropy principle and texture formation

    CERN Document Server

    Arminjon, M; Arminjon, Mayeul; Imbault, Didier

    2006-01-01

    The macro-to-micro transition in a heterogeneous material is envisaged as the selection of a probability distribution by the Principle of Maximum Entropy (MAXENT). The material is made of constituents, e.g. given crystal orientations. Each constituent is itself made of a large number of elementary constituents. The relevant probability is the volume fraction of the elementary constituents that belong to a given constituent and undergo a given stimulus. Assuming only obvious constraints in MAXENT means describing a maximally disordered material. This is proved to have the same average stimulus in each constituent. By adding a constraint in MAXENT, a new model, potentially interesting e.g. for texture prediction, is obtained.

  8. Video segmentation using Maximum Entropy Model

    Institute of Scientific and Technical Information of China (English)

    QIN Li-juan; ZHUANG Yue-ting; PAN Yun-he; WU Fei

    2005-01-01

    Detecting objects of interest in a video sequence is a fundamental and critical task in automated visual surveillance. Most current approaches focus only on discriminating moving objects by background subtraction, even though objects of interest may be either moving or stationary. In this paper, we propose layer segmentation to detect both moving and stationary target objects in surveillance video. We extend the Maximum Entropy (ME) statistical model to segment layers using features collected by constructing a codebook with a set of codewords for each pixel. We also describe how the trained models are used to discriminate target objects in surveillance video. Our experimental results are presented in terms of success rate and segmentation precision.

  9. Maximum entropy analysis of cosmic ray composition

    CERN Document Server

    Nosek, Dalibor; Vícha, Jakub; Trávníček, Petr; Nosková, Jana

    2016-01-01

    We focus on the primary composition of cosmic rays with the highest energies that cause extensive air showers in the Earth's atmosphere. A way of examining the two lowest order moments of the sample distribution of the depth of shower maximum is presented. The aim is to show that useful information about the composition of the primary beam can be inferred with limited knowledge we have about processes underlying these observations. In order to describe how the moments of the depth of shower maximum depend on the type of primary particles and their energies, we utilize a superposition model. Using the principle of maximum entropy, we are able to determine what trends in the primary composition are consistent with the input data, while relying on a limited amount of information from shower physics. Some capabilities and limitations of the proposed method are discussed. In order to achieve a realistic description of the primary mass composition, we pay special attention to the choice of the parameters of the sup...

  10. A Maximum Entropy Estimator for the Aggregate Hierarchical Logit Model

    Directory of Open Access Journals (Sweden)

    Pedro Donoso

    2011-08-01

    A new approach for estimating the aggregate hierarchical logit model is presented. Though usually derived from random utility theory assuming correlated stochastic errors, the model can also be derived as a solution to a maximum entropy problem. Under the latter approach, the Lagrange multipliers of the optimization problem can be understood as parameter estimators of the model. Based on theoretical analysis and Monte Carlo simulations of a transportation demand model, it is demonstrated that the maximum entropy estimators have statistical properties that are superior to classical maximum likelihood estimators, particularly for small or medium-size samples. The simulations also generated reduced bias in the estimates of the subjective value of time and consumer surplus.

  11. An Interval Maximum Entropy Method for Quadratic Programming Problem

    Institute of Scientific and Technical Information of China (English)

    RUI Wen-juan; CAO De-xin; SONG Xie-wu

    2005-01-01

    With the idea of maximum entropy function and penalty function methods, we transform the quadratic programming problem into an unconstrained differentiable optimization problem, discuss the interval extension of the maximum entropy function, provide the region deletion test rules and design an interval maximum entropy algorithm for quadratic programming problem. The convergence of the method is proved and numerical results are presented. Both theoretical and numerical results show that the method is reliable and efficient.
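    The "maximum entropy function" used in this line of work aggregates many constraints g_1(x), ..., g_m(x) into one smooth function F_p(x) = (1/p) ln ∑_i exp(p g_i(x)), which satisfies max_i g_i ≤ F_p ≤ max_i g_i + (ln m)/p and therefore converges to the max as p grows. A minimal sketch of the aggregate function itself; the interval-arithmetic extension and region-deletion rules of the paper are not reproduced.

```python
import math

def maxent_smooth_max(g, p):
    """Maximum entropy (aggregate) function: a smooth, differentiable
    approximation of max(g_1, ..., g_m), used to replace the nonsmooth
    constraint max_i g_i(x) <= 0 by a single smooth one.
    Values are shifted by max(g) before exponentiating so that the
    largest exponent is 0, avoiding overflow for large p."""
    gmax = max(g)
    s = sum(math.exp(p * (gi - gmax)) for gi in g)
    return gmax + math.log(s) / p
```

    Because the bound gap (ln m)/p shrinks as p grows, replacing max_i g_i(x) ≤ 0 by F_p(x) ≤ 0 yields a differentiable problem whose solutions approach those of the original constrained problem.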

  12. Maximum Entropy Production and Non-Gaussian Climate Variability

    CERN Document Server

    Sura, Philip

    2016-01-01

    Earth's atmosphere is in a state far from thermodynamic equilibrium. For example, the large scale equator-to-pole temperature gradient is maintained by tropical heating, polar cooling, and a midlatitude meridional eddy heat flux predominantly driven by baroclinically unstable weather systems. Based on basic thermodynamic principles, it can be shown that the meridional heat flux, in combination with the meridional temperature gradient, acts to maximize entropy production of the atmosphere. In fact, maximum entropy production (MEP) has been successfully used to explain the observed mean state of the atmosphere and other components of the climate system. However, one important feature of the large scale atmospheric circulation is its often non-Gaussian variability about the mean. This paper presents theoretical and observational evidence that some processes in the midlatitude atmosphere are significantly non-Gaussian to maximize entropy production. First, after introducing the basic theory, it is shown that the ...

  13. MAXIMUM-LIKELIHOOD-ESTIMATION OF THE ENTROPY OF AN ATTRACTOR

    NARCIS (Netherlands)

    SCHOUTEN, JC; TAKENS, F; VANDENBLEEK, CM

    1994-01-01

    In this paper, a maximum-likelihood estimate of the (Kolmogorov) entropy of an attractor is proposed that can be obtained directly from a time series. Also, the relative standard deviation of the entropy estimate is derived; it is dependent on the entropy and on the number of samples used in the estimate.

  14. Statistical optimization for passive scalar transport: maximum entropy production vs. maximum Kolmogorov–Sinai entropy

    Directory of Open Access Journals (Sweden)

    M. Mihelich

    2014-11-01

    We derive rigorous results on the link between the principle of maximum entropy production and the principle of maximum Kolmogorov–Sinai entropy using a Markov model of passive scalar diffusion called the Zero Range Process. We show analytically that both the entropy production and the Kolmogorov–Sinai entropy, seen as functions of f, admit a unique maximum, denoted fmaxEP and fmaxKS. The behavior of these two maxima is explored as a function of the system disequilibrium and the system resolution N. The main result of this article is that fmaxEP and fmaxKS have the same Taylor expansion at first order in the deviation from equilibrium. We find that fmaxEP hardly depends on N, whereas fmaxKS depends strongly on N. In particular, for a fixed difference of potential between the reservoirs, fmaxEP(N) tends towards a non-zero value, while fmaxKS(N) tends to 0 when N goes to infinity. For values of N typical of those adopted by Paltridge and climatologists (N ≈ 10 ~ 100), we show that fmaxEP and fmaxKS coincide even far from equilibrium. Finally, we show that one can find an optimal resolution N* such that fmaxEP and fmaxKS coincide, at least up to a second-order term proportional to the non-equilibrium fluxes imposed at the boundaries. We find that the optimal resolution N* depends on the non-equilibrium fluxes, so that deeper convection should be represented on finer grids. This result points to the inadequacy of using a single grid for representing convection in climate and weather models. Moreover, the application of this principle to passive scalar transport parametrization is therefore expected to provide both the value of the optimal flux and the optimal number of degrees of freedom (resolution) needed to describe the system.

  15. Incorporating Linguistic Structure into Maximum Entropy Language Models

    Institute of Scientific and Technical Information of China (English)

    FANG GaoLin(方高林); GAO Wen(高文); WANG ZhaoQi(王兆其)

    2003-01-01

    In statistical language models, how to integrate diverse linguistic knowledge in a general framework for long-distance dependencies is a challenging issue. In this paper, an improved language model incorporating linguistic structure into maximum entropy framework is presented.The proposed model combines trigram with the structure knowledge of base phrase in which trigram is used to capture the local relation between words, while the structure knowledge of base phrase is considered to represent the long-distance relations between syntactical structures. The knowledge of syntax, semantics and vocabulary is integrated into the maximum entropy framework.Experimental results show that the proposed model improves by 24% for language model perplexity and increases about 3% for sign language recognition rate compared with the trigram model.

  16. Metabolic networks evolve towards states of maximum entropy production.

    Science.gov (United States)

    Unrean, Pornkamol; Srienc, Friedrich

    2011-11-01

    A metabolic network can be described by a set of elementary modes or pathways representing discrete metabolic states that support cell function. We have recently shown that in the most likely metabolic state the usage probability of individual elementary modes is distributed according to the Boltzmann distribution law while complying with the principle of maximum entropy production. To demonstrate that a metabolic network evolves towards such state we have carried out adaptive evolution experiments with Thermoanaerobacterium saccharolyticum operating with a reduced metabolic functionality based on a reduced set of elementary modes. In such reduced metabolic network metabolic fluxes can be conveniently computed from the measured metabolite secretion pattern. Over a time span of 300 generations the specific growth rate of the strain continuously increased together with a continuous increase in the rate of entropy production. We show that the rate of entropy production asymptotically approaches the maximum entropy production rate predicted from the state when the usage probability of individual elementary modes is distributed according to the Boltzmann distribution. Therefore, the outcome of evolution of a complex biological system can be predicted in highly quantitative terms using basic statistical mechanical principles.

  17. Precipitation Interpolation by Multivariate Bayesian Maximum Entropy Based on Meteorological Data in Yun- Gui-Guang region, Mainland China

    Science.gov (United States)

    Wang, Chaolin; Zhong, Shaobo; Zhang, Fushen; Huang, Quanyi

    2016-11-01

    Precipitation interpolation has been a hot area of research for many years, and it is closely related to meteorological factors. In this paper, precipitation from 91 meteorological stations located in and around Yunnan, Guizhou and Guangxi Zhuang provinces (or autonomous regions), Mainland China, was considered for spatial interpolation. A multivariate Bayesian maximum entropy (BME) method with auxiliary variables, including mean relative humidity, water vapour pressure, mean temperature, mean wind speed and terrain elevation, was used to obtain a more accurate regional distribution of annual precipitation. The means, standard deviations, skewness and kurtosis of the meteorological factors were calculated. Variograms and cross-variograms were fitted between precipitation and the auxiliary variables. The results showed that the multivariate BME method, which incorporates both hard and soft data through probability density functions, was precise. Annual mean precipitation was positively correlated with mean relative humidity, mean water vapour pressure, mean temperature and mean wind speed, and negatively correlated with terrain elevation. The results are expected to provide a substantial reference for research on drought and waterlogging in the region.

  18. Collective behaviours in the stock market -- A maximum entropy approach

    CERN Document Server

    Bury, Thomas

    2014-01-01

    Scale invariance, collective behaviours and structural reorganization are crucial for portfolio management (portfolio composition, hedging, alternative definition of risk, etc.). This lack of any characteristic scale and such elaborated behaviours find their origin in the theory of complex systems. There are several mechanisms which generate scale invariance but maximum entropy models are able to explain both scale invariance and collective behaviours. The study of the structure and collective modes of financial markets attracts more and more attention. It has been shown that some agent based models are able to reproduce some stylized facts. Despite their partial success, there is still the problem of rules design. In this work, we used a statistical inverse approach to model the structure and co-movements in financial markets. Inverse models restrict the number of assumptions. We found that a pairwise maximum entropy model is consistent with the data and is able to describe the complex structure of financial...
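    A pairwise maximum entropy model of N binary variables (here, signs of asset returns) is formally an Ising model, p(s) ∝ exp(∑_i h_i s_i + ∑_{i<j} J_ij s_i s_j), fitted so that its first and second moments match the data. A toy sketch for three variables, fitting by exact enumeration and gradient ascent; the target moments below are made up for illustration, and real applications infer them from return time series and need approximate methods for large N.

```python
import itertools
import math

def fit_pairwise_maxent(target_m, target_c, n, lr=0.2, steps=4000):
    """Fit a pairwise maximum entropy (Ising-like) model
    p(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j), s_i in {-1,+1},
    to given magnetisations <s_i> and correlations <s_i s_j> by gradient
    ascent on the concave log-likelihood; the gradient is simply the
    difference between target and model moments."""
    pairs = list(itertools.combinations(range(n), 2))
    h = [0.0] * n
    J = {pr: 0.0 for pr in pairs}
    states = list(itertools.product([-1, 1], repeat=n))
    m, c = [0.0] * n, {pr: 0.0 for pr in pairs}
    for _ in range(steps):
        # exact model moments by enumerating all 2^n states (fine for small n)
        weights = []
        for s in states:
            e = sum(h[i] * s[i] for i in range(n))
            e += sum(J[(i, j)] * s[i] * s[j] for (i, j) in pairs)
            weights.append(math.exp(e))
        z = sum(weights)
        m = [sum(w * s[i] for w, s in zip(weights, states)) / z
             for i in range(n)]
        c = {(i, j): sum(w * s[i] * s[j] for w, s in zip(weights, states)) / z
             for (i, j) in pairs}
        for i in range(n):
            h[i] += lr * (target_m[i] - m[i])
        for pr in pairs:
            J[pr] += lr * (target_c[pr] - c[pr])
    return h, J, m, c
```

    Because the log-likelihood of an exponential family is concave in (h, J), this moment-matching iteration converges to the unique maximum entropy model consistent with the prescribed moments.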

  19. Combining Experiments and Simulations Using the Maximum Entropy Principle

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-01-01

    are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy...

  20. Application of Maximum Entropy Distribution to the Statistical Properties of Wave Groups

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    New distributions of the statistics of wave groups based on the maximum entropy principle are presented. The maximum entropy distributions appear to be superior to conventional distributions when applied to a limited amount of information. Their application to wave group properties shows the effectiveness of the maximum entropy distribution. An FFT filtering method is employed to obtain the wave envelope quickly and efficiently. Comparisons of both the maximum entropy distribution and the distribution of Longuet-Higgins (1984) with laboratory wind-wave data show that the former gives a better fit.

  1. Maximum entropy models of ecosystem functioning

    Energy Technology Data Exchange (ETDEWEB)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au [Research School of Biology, The Australian National University, Canberra ACT 0200 (Australia)

    2014-12-05

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense using a savanna plant ecology model as an example.

  2. Maximum entropy models of ecosystem functioning

    Science.gov (United States)

    Bertram, Jason

    2014-12-01

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes' broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense using a savanna plant ecology model as an example.

  3. Microcanonical origin of the maximum entropy principle for open systems.

    Science.gov (United States)

    Lee, Julian; Pressé, Steve

    2012-10-01

    There are two distinct approaches for deriving the canonical ensemble. The canonical ensemble either follows as a special limit of the microcanonical ensemble or alternatively follows from the maximum entropy principle. We show the equivalence of these two approaches by applying the maximum entropy formulation to a closed universe consisting of an open system plus bath. We show that the target function for deriving the canonical distribution emerges as a natural consequence of partial maximization of the entropy over the bath degrees of freedom alone. By extending this mathematical formalism to dynamical paths rather than equilibrium ensembles, the result provides an alternative justification for the principle of path entropy maximization as well.
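    The partial-maximization argument can be outlined as follows (a schematic sketch, not the paper's exact formalism; here Ω_B denotes the bath density of states and S_B = k ln Ω_B):

```latex
% Universe = open system (s) + bath (b), fixed total energy E.
S[p] = -k \sum_{s,b} p(s,b)\,\ln p(s,b), \qquad \sum_{s,b} p(s,b) = 1 .
% Maximizing S over the bath degrees of freedom alone, with the system
% state s held fixed, makes p(s,b) uniform over the Omega_B(E - E_s)
% bath microstates compatible with each s, so that
p(s) \propto \Omega_B(E - E_s) = e^{S_B(E - E_s)/k}
\approx e^{S_B(E)/k}\, e^{-E_s/(kT)}, \qquad
\frac{1}{T} \equiv \frac{\partial S_B}{\partial E},
% which is the canonical (Boltzmann) distribution for the open system.
```

    The expansion of S_B(E - E_s) to first order in E_s is where the bath's size enters: the result is exact only in the limit of a bath much larger than the system.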

  4. Maximum Entropy Threshold Segmentation Algorithm Based on 2D-WLDH

    Institute of Scientific and Technical Information of China (English)

    邹小林

    2012-01-01

    The traditional 2D maximum entropy threshold segmentation algorithm rests on the insufficiently justified assumption that the probabilities in the main-diagonal regions of the 2D histogram sum approximately to one, and the algorithm is time-consuming. Aiming at this problem, a new maximum entropy segmentation algorithm is proposed in this paper. Based on grey levels and Weber Local Descriptors (WLD), it constructs a 2D WLD Histogram (2D-WLDH) and applies it to maximum entropy threshold segmentation. To further improve the speed of the proposed algorithm, a fast recursive algorithm is derived. Experimental results show that, compared with existing corresponding algorithms, the proposed algorithm reduces running time and achieves better segmentation quality.
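    The underlying idea can be illustrated with the classical 1-D maximum entropy (Kapur) threshold, of which 2D methods such as the one above are two-dimensional refinements: pick the threshold that maximizes the summed Shannon entropies of the two resulting grey-level classes. A minimal 1-D sketch only; the WLD axis and the fast recursion are not reproduced.

```python
import math

def kapur_threshold(hist):
    """1-D maximum entropy (Kapur) threshold: choose t maximizing the sum
    of the Shannon entropies of the background (levels <= t) and foreground
    (levels > t) grey-level distributions. 2-D variants add a local
    descriptor as a second histogram axis and threshold jointly."""
    total = float(sum(hist))
    p = [h / total for h in hist]
    best_t, best_h = 0, float("-inf")
    for t in range(len(p) - 1):
        w0 = sum(p[: t + 1])          # background class probability
        w1 = 1.0 - w0                 # foreground class probability
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(pi / w0 * math.log(pi / w0) for pi in p[: t + 1] if pi > 0)
        h1 = -sum(pi / w1 * math.log(pi / w1) for pi in p[t + 1:] if pi > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t
```

    On a bimodal histogram the maximizing threshold falls in the valley between the two modes, separating the two populations without any assumption about their shapes.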

  5. Maximum entropy production in environmental and ecological systems.

    Science.gov (United States)

    Kleidon, Axel; Malhi, Yadvinder; Cox, Peter M

    2010-05-12

    The coupled biosphere-atmosphere system entails a vast range of processes at different scales, from ecosystem exchange fluxes of energy, water and carbon to the processes that drive global biogeochemical cycles, atmospheric composition and, ultimately, the planetary energy balance. These processes are generally complex with numerous interactions and feedbacks, and they are irreversible in their nature, thereby producing entropy. The proposed principle of maximum entropy production (MEP), based on statistical mechanics and information theory, states that thermodynamic processes far from thermodynamic equilibrium will adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate. This issue focuses on the latest development of applications of MEP to the biosphere-atmosphere system including aspects of the atmospheric circulation, the role of clouds, hydrology, vegetation effects, ecosystem exchange of energy and mass, biogeochemical interactions and the Gaia hypothesis. The examples shown in this special issue demonstrate the potential of MEP to contribute to improved understanding and modelling of the biosphere and the wider Earth system, and also explore limitations and constraints to the application of the MEP principle.

  6. How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems

    CERN Document Server

    Hanel, Rudolf; Gell-Mann, Murray

    2014-01-01

    The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems, by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there exists an ongoing controversy whether the notion of the maximum entropy principle can be extended in a meaningful way to non-extensive, non-ergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for non-ergodic and complex statistical systems if their relative entropy can be factored into a general...

  7. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    Science.gov (United States)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate: H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.

  8. Combining Experiments and Simulations Using the Maximum Entropy Principle

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-01-01

    ... are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy ... in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results ... Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights to the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges.

  9. A Maximum Entropy Modelling of the Rain Drop Size Distribution

    Directory of Open Access Journals (Sweden)

    Francisco J. Tapiador

    2011-01-01

    This paper presents a maximum entropy approach to Rain Drop Size Distribution (RDSD) modelling. It is shown that this approach allows one (1) to use a physically consistent rationale to select a particular probability density function (pdf), (2) to provide an alternative method for parameter estimation based on expectations of the population instead of sample moments, and (3) to develop a progressive method of modelling by updating the pdf as new empirical information becomes available. The method is illustrated with both synthetic and real RDSD data, the latter coming from a laser disdrometer network specifically designed to measure the spatial variability of the RDSD.

  10. Maximum-entropy probability distributions under Lp-norm constraints

    Science.gov (United States)

    Dolinar, S.

    1991-01-01

    Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given L_p norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the L_p norm. The most interesting results are obtained and plotted for unconstrained (real-valued) continuous random variables and for integer-valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight-line relationship between the maximum differential entropy and the logarithm of the L_p norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed-form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer-valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer-valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.

  11. How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems.

    Science.gov (United States)

    Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray

    2014-05-13

    The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.

  12. A Trustworthiness Evaluation Method for Software Architectures Based on the Principle of Maximum Entropy (POME) and the Grey Decision-Making Method (GDMM)

    Directory of Open Access Journals (Sweden)

    Rong Jiang

    2014-09-01

    As an early structure for design decision-making, a software architecture plays a key role in the quality of the final software product and of the whole project. In the software design and development process, an effective evaluation of the trustworthiness of a software architecture can help in making scientific and reasonable decisions about the architecture, which are necessary for the construction of highly trustworthy software. In view of the lack of trustworthiness evaluation and measurement studies for software architectures, this paper provides a trustworthy-attribute model of software architecture. Based on this model, the paper proposes using the Principle of Maximum Entropy (POME) and the Grey Decision-Making Method (GDMM) as the trustworthiness evaluation method for a software architecture, demonstrates the soundness and rationality of this method, and verifies its feasibility through case analysis.

  13. A maximum entropy model for opinions in social groups

    Science.gov (United States)

    Davis, Sergio; Navarrete, Yasmín; Gutiérrez, Gonzalo

    2014-04-01

    We study how the opinions of a group of individuals determine their spatial distribution and connectivity, through an agent-based model. The interaction between agents is described by a Hamiltonian in which agents are allowed to move freely without an underlying lattice (the average network topology connecting them is determined from the parameters). This kind of model was derived using maximum entropy statistical inference under fixed expectation values of certain probabilities that (we propose) are relevant to social organization. Control parameters emerge as Lagrange multipliers of the maximum entropy problem, and they can be associated with the level of consequence between the personal beliefs and external opinions, and the tendency to socialize with peers of similar or opposing views. These parameters define a phase diagram for the social system, which we studied using Monte Carlo Metropolis simulations. Our model presents both first and second-order phase transitions, depending on the ratio between the internal consequence and the interaction with others. We have found a critical value for the level of internal consequence, below which the personal beliefs of the agents seem to be irrelevant.

  14. Maximum Entropy Estimation of n-Year Extreme Waveheights

    Institute of Scientific and Technical Information of China (English)

    徐德伦; 张军; 郑桂珍

    2004-01-01

    A new method for estimating the n (50 or 100)-year return-period waveheight, namely, the extreme waveheight expected to occur once in n years, is presented on the basis of the maximum entropy principle. The main points of the method are as follows: (1) based on the Hamiltonian principle, a maximum entropy probability density function for the extreme waveheight H, f(H) = αH^γ exp(-βH^4), is derived from a Lagrangian function subject to some necessary and rational constraints; (2) the parameters α, β, and γ in the function are expressed in terms of the mean H̄, the variance V = E[(H - H̄)²], and the bias B = E[(H - H̄)³]; and (3) with H̄, V, and B estimated from observed data, the n-year return-period waveheight H_n is computed in accordance with the formula 1/[1 - F(H_n)] = n, where F(H_n) is the cumulative distribution F(H_n) = ∫₀^{H_n} f(H) dH. Examples of estimating the 50- and 100-year return-period waveheights by the present method and by some currently used methods from observed data acquired at two hydrographic stations are given. A comparison of the estimated results shows that the present method is superior to the others.
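Step (3) of the method can be sketched numerically. The shape parameters below are purely illustrative stand-ins, not moment-fitted estimates as in the paper; the code only shows how H_n is recovered from the return-period relation once f(H) is known:

```python
import math

# Hypothetical parameters of f(H) = alpha * H**GAMMA * exp(-BETA * H**4).
# In the paper gamma and beta (and alpha) are fixed by the sample mean,
# variance and bias; here they are simply assumed for illustration.
GAMMA, BETA = 2.0, 0.005
H_MAX = 20.0                     # effective upper limit of the support

def f_unnorm(h):
    return h ** GAMMA * math.exp(-BETA * h ** 4)

def trapz(fn, a, b, n=4000):
    """Simple trapezoidal quadrature."""
    step = (b - a) / n
    s = 0.5 * (fn(a) + fn(b)) + sum(fn(a + i * step) for i in range(1, n))
    return s * step

Z = trapz(f_unnorm, 0.0, H_MAX)  # normalization constant (absorbs alpha)

def F(h):
    """Cumulative distribution F(h) = integral from 0 to h of f(H) dH."""
    return trapz(f_unnorm, 0.0, h) / Z

def return_height(n_years):
    """Solve 1 / (1 - F(H_n)) = n_years for H_n by bisection."""
    target = 1.0 - 1.0 / n_years
    lo, hi = 0.0, H_MAX
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if F(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

h50 = return_height(50)          # 50-year return-period waveheight
```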

  15. Maximum Entropy Method of Image Segmentation Based on Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    王文渊; 王芳梅

    2011-01-01

    The traditional entropy-based threshold method has theoretical shortcomings and high computational complexity, which make image segmentation time-consuming and inefficient. In order to improve the efficiency and accuracy of image segmentation, an image segmentation method is proposed which combines an improved genetic algorithm with the maximum entropy algorithm. First, a two-dimensional histogram based on the image gray-value information is used to extract features; then the three genetic operations of selection, crossover, and mutation are used to search for the optimal threshold for image segmentation. Simulation results show that, compared with the traditional maximum entropy image segmentation algorithm, the improved algorithm markedly increases segmentation efficiency and accuracy, speeding up segmentation.
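The maximum entropy criterion for thresholding can be illustrated with a one-dimensional Kapur-style entropy sum. For brevity, an exhaustive scan stands in for the genetic-algorithm search described in the abstract, and the histogram below is synthetic:

```python
import math

def kapur_threshold(hist):
    """Maximum-entropy (Kapur) threshold: choose t maximizing the sum of
    the Shannon entropies of the two classes [0, t) and [t, L)."""
    total = float(sum(hist))
    p = [h / total for h in hist]
    best_t, best_h = 1, float("-inf")
    for t in range(1, len(p)):          # exhaustive scan over thresholds
        w0 = sum(p[:t])
        w1 = 1.0 - w0
        if w0 <= 0.0 or w1 <= 0.0:
            continue
        h0 = -sum(q / w0 * math.log(q / w0) for q in p[:t] if q > 0)
        h1 = -sum(q / w1 * math.log(q / w1) for q in p[t:] if q > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Synthetic bimodal 16-level histogram: dark peak near level 2,
# bright peak near level 12, with a valley in between.
hist = [5, 30, 80, 30, 5, 2, 1, 1, 1, 2, 5, 30, 80, 30, 5, 2]
t = kapur_threshold(hist)
```

A genetic algorithm, as in the paper, replaces the exhaustive loop with selection, crossover, and mutation over candidate thresholds, which matters once the search space (e.g. a two-dimensional histogram) becomes large.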

  16. Proscriptive Bayesian Programming and Maximum Entropy: a Preliminary Study

    Science.gov (United States)

    Koike, Carla Cavalcante

    2008-11-01

    Some problems found in robotics systems, such as obstacle avoidance, can be better described using proscriptive commands, in which only prohibited actions are indicated, in contrast to prescriptive situations, which demand that a specific command be specified. An interesting question arises regarding the possibility of learning automatically whether proscriptive commands are suitable and which parametric function is best applied. Recently, a great variety of problems in the robotics domain have been the object of research using probabilistic methods, including the use of maximum entropy in automatic learning for robot control systems. This work presents a preliminary study on automatic learning of proscriptive robot control using maximum entropy and Bayesian Programming. It is verified whether maximum entropy and related methods can favour proscriptive commands in an obstacle avoidance task executed by a mobile robot.

  17. Approximate maximum-entropy moment closures for gas dynamics

    Science.gov (United States)

    McDonald, James G.

    2016-11-01

    Accurate prediction of flows that exist between the traditional continuum regime and the free-molecular regime has proven difficult to obtain. Current methods are either inaccurate in this regime or prohibitively expensive for practical problems. Moment closures have long held the promise of providing new, affordable, accurate methods in this regime. The maximum-entropy hierarchy of closures seems to offer particularly attractive physical and mathematical properties. Unfortunately, several difficulties render the practical implementation of maximum-entropy closures very difficult. This work examines the use of simple approximations to these maximum-entropy closures and shows that physical accuracy vastly improved over continuum methods can be obtained without a significant increase in computational cost. Initially the technique is demonstrated for a simple one-dimensional gas. It is then extended to the full three-dimensional setting. The resulting moment equations are used for the numerical solution of shock-wave profiles with promising results.

  18. A Maximum Entropy Method for a Robust Portfolio Problem

    Directory of Open Access Journals (Sweden)

    Yingying Xu

    2014-06-01

    Full Text Available We propose a continuous maximum entropy method to investigate the robust optimal portfolio selection problem for a market with transaction costs and dividends. This robust model aims to maximize the worst-case portfolio return in the case that all asset returns lie within some prescribed intervals. A numerical optimal solution to the problem is obtained by using a continuous maximum entropy method. Furthermore, some numerical experiments indicate that the robust model in this paper can result in better portfolio performance than a classical mean-variance model.

  19. Maximum-entropy distributions of correlated variables with prespecified marginals.

    Science.gov (United States)

    Larralde, Hernán

    2012-12-01

    The problem of determining the joint probability distributions for correlated random variables with prespecified marginals is considered. When the joint distribution satisfying all the required conditions is not unique, the "most unbiased" choice corresponds to the distribution of maximum entropy. The calculation of the maximum-entropy distribution requires the solution of rather complicated nonlinear coupled integral equations, exact solutions to which are obtained for the case of Gaussian marginals; otherwise, the solution can be expressed as a perturbation around the product of the marginals if the marginal moments exist.
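The zeroth-order statement, that the product of the marginals is the maximum-entropy joint when only the marginals are prescribed, can be checked directly on a small discrete example. The marginals and the correlation perturbation below are illustrative:

```python
import math

def entropy(joint):
    """Shannon entropy of a joint distribution given as a nested list."""
    return -sum(q * math.log(q) for row in joint for q in row if q > 0)

px = [0.6, 0.4]                  # marginal of X (illustrative)
py = [0.3, 0.7]                  # marginal of Y (illustrative)

# Maximum-entropy joint under marginal constraints only: the product.
independent = [[px[i] * py[j] for j in range(2)] for i in range(2)]

# A competing joint with the *same* marginals: shift mass eps onto the
# diagonal and off the anti-diagonal, preserving all row/column sums.
eps = 0.05
correlated = [[independent[i][j] + (eps if i == j else -eps)
               for j in range(2)] for i in range(2)]
```

Any such correlated alternative has strictly lower entropy than the product; imposing a correlation constraint, as in the paper, is what forces the maximum-entropy solution away from the product of the marginals.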

  20. Training Concept, Evolution Time, and the Maximum Entropy Production Principle

    Directory of Open Access Journals (Sweden)

    Alexey Bezryadin

    2016-04-01

    Full Text Available The maximum entropy production principle (MEPP) is a type of entropy optimization which demands that complex non-equilibrium systems should organize such that the rate of the entropy production is maximized. Our take on this principle is that to prove or disprove the validity of the MEPP and to test the scope of its applicability, it is necessary to conduct experiments in which the entropy produced per unit time is measured with a high precision. Thus we study electric-field-induced self-assembly in suspensions of carbon nanotubes and realize precise measurements of the entropy production rate (EPR). As a strong voltage is applied, the suspended nanotubes merge together into a conducting cloud which produces Joule heat and, correspondingly, produces entropy. We introduce two types of EPR, which have qualitatively different significance: global EPR (g-EPR) and the entropy production rate of the dissipative cloud itself (DC-EPR). The following results are obtained: (1) As the system reaches the maximum of the DC-EPR, it becomes stable because the applied voltage acts as a stabilizing thermodynamic potential; (2) We discover metastable states characterized by high, near-maximum values of the DC-EPR. Under certain conditions, such efficient entropy-producing regimes can only be achieved if the system is allowed to initially evolve under mildly non-equilibrium conditions, namely at a reduced voltage; (3) Without such a “training” period the system typically is not able to reach the allowed maximum of the DC-EPR if the bias is high; (4) We observe that the DC-EPR maximum is achieved within a time, Te, the evolution time, which scales as a power-law function of the applied voltage; (5) Finally, we present a clear example in which the g-EPR theoretical maximum can never be achieved. Yet, under a wide range of conditions, the system can self-organize and achieve a dissipative regime in which the DC-EPR equals its theoretical maximum.

  1. A Maximum-Entropy Method for Estimating the Spectrum

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Based on the maximum-entropy (ME) principle, a new power spectral estimator for random waves is derived in the form S̃(ω) = (a/8) H̄² (2π)^(d+1) ω^(-(d+2)) exp[-b(2π/ω)^n], by solving a variational problem subject to some quite general constraints. This robust method is comprehensive enough to describe wave spectra even in extreme wave conditions, and it is superior to the periodogram method, which is not suitable for processing comparatively short or intensely unsteady signals because of its tremendous boundary effect and some inherent defects of the FFT. The newly derived method for spectral estimation works fairly well even when the sample data sets are very short and unsteady, and the reliability and efficiency of this spectral estimator have been preliminarily proved.

  2. Maximum entropy method applied to deblurring images on a MasPar MP-1 computer

    Science.gov (United States)

    Bonavito, N. L.; Dorband, John; Busse, Tim

    1991-01-01

    A statistical inference method based on the principle of maximum entropy is developed for the purpose of enhancing and restoring satellite images. The proposed maximum entropy image restoration method is shown to overcome the difficulties associated with image restoration and provide the smoothest and most appropriate solution consistent with the measured data. An implementation of the method on the MP-1 computer is described, and results of tests on simulated data are presented.

  3. Acoustic space dimensionality selection and combination using the maximum entropy principle

    OpenAIRE

    Abdel-Haleem, Yasser H.; Renals, Steve; Lawrence, Neil D.

    2004-01-01

    In this paper we propose a discriminative approach to acoustic space dimensionality selection based on maximum entropy modelling. We form a set of constraints by composing the acoustic space with the space of phone classes, and use a continuous feature formulation of maximum entropy modelling to select an optimal feature set. The suggested approach has two steps: (1) the selection of the best acoustic space that efficiently and economically represents the acoustic data and its variability;...

  4. Segmentation Based on Clustering and Maximum Entropy Method

    Institute of Scientific and Technical Information of China (English)

    陈秋红; 沈云琴

    2012-01-01

    The paper studies the image segmentation optimization problem. Because of computational complexity and other factors, many image segmentation algorithms yield low segmentation resolution and low clarity, and when images contain large amounts of information, segmentation becomes very time-consuming. In order to segment images effectively, a method combining spatial-pattern clustering with the maximum entropy principle is proposed. First, the maximum entropy algorithm is used for image segmentation, and feature quantities are defined for each entropy region. Based on these features, the Euclidean distance and the spatial distance between similar regions are calculated to determine the distance between cluster-center pixels. Then, the segmented image regions are merged by the spatial-pattern clustering scheme, and the image is binarized. Simulation results show that, compared with traditional image segmentation, the proposed method improves segmentation efficiency and produces clear edges, demonstrating the feasibility and effectiveness of the algorithm.

  5. The constraint rule of the maximum entropy principle

    NARCIS (Netherlands)

    Uffink, J.

    2001-01-01

    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions.

  6. A MAXIMUM ENTROPY METHOD FOR CONSTRAINED SEMI-INFINITE PROGRAMMING PROBLEMS

    Institute of Scientific and Technical Information of China (English)

    ZHOU Guanglu; WANG Changyu; SHI Zhenjun; SUN Qingying

    1999-01-01

    This paper presents a new method, called the maximum entropy method, for solving semi-infinite programming problems, in which the semi-infinite programming problem is approximated by one with a single constraint. The convergence properties of this method are discussed. Numerical examples are given to show the high efficiency of the algorithm.
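A common way such methods collapse infinitely many constraints into a single one is the entropic (log-sum-exp) aggregate, which smoothly approximates the maximum of the constraint functions. The sketch below uses an assumed constraint g(x, t); the specific g and the grid over t are illustrative, not from the paper:

```python
import math

def entropic_max(values, p=100.0):
    """Log-sum-exp aggregate: (1/p) * log(sum(exp(p * v))).
    Bounds: max(values) <= result <= max(values) + log(len(values)) / p,
    so the aggregate tends to the exact maximum as p grows."""
    m = max(values)              # shift by the max for numerical stability
    return m + math.log(sum(math.exp(p * (v - m)) for v in values)) / p

# Semi-infinite constraint g(x, t) <= 0 for all t in [0, 1] (illustrative g),
# collapsed into the single smooth constraint G(x) <= 0 on a grid over t.
def g(x, t):
    return x * math.sin(math.pi * t) - 0.5

ts = [i / 200.0 for i in range(201)]

def G(x, p=200.0):
    return entropic_max([g(x, t) for t in ts], p)
```

Because the aggregate over-estimates the true maximum by at most log(n)/p, any x with G(x) ≤ 0 is feasible for all sampled t, which is what makes the single-constraint approximation conservative.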

  7. Filtering Additive Measurement Noise with Maximum Entropy in the Mean

    CERN Document Server

    Gzyl, Henryk

    2007-01-01

    The purpose of this note is to show how the method of maximum entropy in the mean (MEM) may be used to improve parametric estimation when the measurements are corrupted by a large level of noise. The method is developed in the context of a concrete example: the estimation of the parameter of an exponential distribution. We compare the performance of our method with the Bayesian and maximum likelihood approaches.

  8. Enzyme kinetics and the maximum entropy production principle.

    Science.gov (United States)

    Dobovišek, Andrej; Zupanović, Paško; Brumen, Milan; Bonačić-Lošić, Zeljana; Kuić, Domagoj; Juretić, Davor

    2011-03-01

    A general proof is derived that entropy production can be maximized with respect to rate constants in any enzymatic transition. This result is used to test the assumption that the biological evolution of an enzyme is accompanied by an increase of entropy production in its internal transitions and that such an increase can serve to quantify the progress of enzyme evolution. The state of maximum entropy production would correspond to a fully evolved enzyme. As an example, the internal transition ES↔EP in a generalized reversible Michaelis-Menten three-state scheme is analyzed. Good agreement is found among experimentally determined values of the forward rate constant in the internal transition ES→EP for three types of β-lactamase enzymes and their optimal values predicted by the maximum entropy production principle, which agrees with earlier observations that β-lactamase enzymes are nearly fully evolved. The optimization of rate constants as the consequence of a basic physical principle, which is the subject of this paper, is a completely different concept from (a) net metabolic flux maximization or (b) entropy production minimization (in the static head state), both also proposed to be tightly connected to biological evolution.

  9. MB Distribution and its application using maximum entropy approach

    Directory of Open Access Journals (Sweden)

    Bhadra Suman

    2016-01-01

    Full Text Available The Maxwell-Boltzmann distribution with a maximum entropy approach has been used to study the variation of political temperature and heat in a locality. We have observed that the political temperature rises without generating any political heat when political parties increase their attractiveness by intense publicity but voters do not shift their loyalties. It has also been shown that political heat is generated and political entropy increases, with political temperature remaining constant, when parties do not change their attractiveness but voters shift their loyalties to more attractive parties.

  10. Regionalization of Chinese Material Medical Quality Based on Maximum Entropy Model: A case study of Atractylodes lancea

    Science.gov (United States)

    Shoudong, Zhu; Huasheng, Peng; Lanping, Guo; Tongren, Xu; Yan, Zhang; Meilan, Chen; Qingxiu, Hao; Liping, Kang; Luqi, Huang

    2017-01-01

    Atractylodes is an East Asian endemic genus distributed in China, Japan, and the Russian Far East. As an important medicinal plant resource, atractylodes has long been used as an herbal medicine. To examine the significant features of its authentic regional quality and geographical distribution, we explored the relationships between medicine quality and habitat suitability in two classes: samples with atractylodin content lower than the standard of the Chinese Pharmacopoeia (2010), and those with higher content. We found that atractylodin content is negatively related to habitat suitability for atractylodes with lower atractylodin, while it is positively related to habitat suitability for those with higher atractylodin. By analyzing the distribution of atractylodes with atractylodin content lower than the Pharmacopoeia standard, we discovered that the main ecological factors that could inhibit the accumulation of atractylodin were soil type (39.7%), soil clay content (26.7%), mean temperature in December (22.3%), cation-exchange capacity (6%), etc.; these same factors promoted the accumulation of atractylodin for atractylodes with higher atractylodin. By integrating the two classes, we finally predicted the distribution of atractylodin content in China. Our results enable querying atractylodes quality at arbitrary coordinates and satisfy the practical cultivation demand of planting areas based on atractylodin quality. PMID:28205539

  11. Regionalization of Chinese Material Medical Quality Based on Maximum Entropy Model: A case study of Atractylodes lancea

    Science.gov (United States)

    Shoudong, Zhu; Huasheng, Peng; Lanping, Guo; Tongren, Xu; Yan, Zhang; Meilan, Chen; Qingxiu, Hao; Liping, Kang; Luqi, Huang

    2017-02-01

    Atractylodes is an East Asian endemic genus distributed in China, Japan, and the Russian Far East. As an important medicinal plant resource, atractylodes has long been used as an herbal medicine. To examine the significant features of its authentic regional quality and geographical distribution, we explored the relationships between medicine quality and habitat suitability in two classes: samples with atractylodin content lower than the standard of the Chinese Pharmacopoeia (2010), and those with higher content. We found that atractylodin content is negatively related to habitat suitability for atractylodes with lower atractylodin, while it is positively related to habitat suitability for those with higher atractylodin. By analyzing the distribution of atractylodes with atractylodin content lower than the Pharmacopoeia standard, we discovered that the main ecological factors that could inhibit the accumulation of atractylodin were soil type (39.7%), soil clay content (26.7%), mean temperature in December (22.3%), cation-exchange capacity (6%), etc.; these same factors promoted the accumulation of atractylodin for atractylodes with higher atractylodin. By integrating the two classes, we finally predicted the distribution of atractylodin content in China. Our results enable querying atractylodes quality at arbitrary coordinates and satisfy the practical cultivation demand of planting areas based on atractylodin quality.

  12. Propane spectral resolution enhancement by the maximum entropy method

    Science.gov (United States)

    Bonavito, N. L.; Stewart, K. P.; Hurley, E. J.; Yeh, K. C.; Inguva, R.

    1990-01-01

    The Burg algorithm for maximum entropy power spectral density estimation is applied to a time series of data obtained from a Michelson interferometer and compared with a standard FFT estimate for resolution capability. The propane transmittance spectrum was estimated by use of the FFT with a 2^18-sample interferogram, giving a maximum unapodized resolution of 0.06/cm. This estimate was then interpolated by zero filling an additional 2^18 points, and the final resolution was taken to be 0.06/cm. Comparison of the maximum entropy method (MEM) estimate with the FFT was made over a 45/cm region of the spectrum for several increasing record lengths of interferogram data beginning at 2^10. It is found that over this region the MEM estimate with 2^16 data samples is in close agreement with the FFT estimate using 2^18 samples.
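A minimal sketch of the Burg recursion behind such MEM estimates is given below; the test signal, model order, and frequency grid are illustrative, not the interferometer data of the paper:

```python
import numpy as np

def burg(x, order):
    """Burg's method: AR coefficients a (with a[0] = 1) and residual power e
    for the maximum-entropy spectrum S(w) = e / |A(e^{jw})|^2."""
    x = np.asarray(x, dtype=float)
    a = np.array([1.0])
    e = np.dot(x, x) / len(x)
    ef, eb = x.copy(), x.copy()
    for _ in range(order):
        efp, ebp = ef[1:], eb[:-1]          # aligned forward/backward errors
        k = -2.0 * np.dot(efp, ebp) / (np.dot(efp, efp) + np.dot(ebp, ebp))
        ef, eb = efp + k * ebp, ebp + k * efp
        # Levinson-style update of the AR polynomial
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([a, [0.0]])[::-1]
        e *= 1.0 - k * k
    return a, e

def mem_psd(a, e, freqs):
    """Evaluate the AR (maximum-entropy) spectrum at normalized frequencies."""
    w = 2.0 * np.pi * np.asarray(freqs)
    A = np.array([np.sum(a * np.exp(-1j * wi * np.arange(len(a)))) for wi in w])
    return e / np.abs(A) ** 2

# Short noisy sinusoid at normalized frequency 0.2: only 64 samples, where
# MEM typically resolves the line far better than a raw periodogram.
rng = np.random.default_rng(0)
n = np.arange(64)
x = np.cos(2 * np.pi * 0.2 * n) + 0.1 * rng.standard_normal(64)
a, e = burg(x, 4)
freqs = np.linspace(0.01, 0.49, 481)
peak = freqs[np.argmax(mem_psd(a, e, freqs))]
```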

  13. Maximum-Entropy Inference with a Programmable Annealer

    CERN Document Server

    Chancellor, Nicholas; Vinci, Walter; Aeppli, Gabriel; Warburton, Paul A

    2015-01-01

    Optimisation problems in science and engineering typically involve finding the ground state (i.e. the minimum energy configuration) of a cost function with respect to many variables. If the variables are corrupted by noise then this approach maximises the likelihood that the solution found is correct. An alternative approach is to make use of prior statistical information about the noise in conjunction with Bayes's theorem. The maximum entropy solution to the problem then takes the form of a Boltzmann distribution over the ground and excited states of the cost function. Here we use a programmable Josephson junction array for the information decoding problem, which we simulate as a random Ising model in a field. We show experimentally that maximum entropy decoding at finite temperature can in certain cases give competitive and even slightly better bit-error-rates than the maximum likelihood approach at zero temperature, confirming that useful information can be extracted from the excited states of the annealer.

  14. Delocalized Epidemics on Graphs: A Maximum Entropy Approach

    CERN Document Server

    Sahneh, Faryad Darabi; Scoglio, Caterina

    2016-01-01

    The susceptible-infected-susceptible (SIS) epidemic process on complex networks can show metastability, resembling an endemic equilibrium. In a general setting, the metastable state may involve a large portion of the network, or it can be localized on small subgraphs of the contact network. Localized infections are not interesting because a true outbreak concerns network-wide invasion of the contact graph rather than localized infection of certain sites within the contact network. Existing approaches to the localization phenomenon suffer from a major drawback: they rely fully on the steady-state solution of mean-field approximate models in the neighborhood of their phase transition point, where their approximation accuracy is worst, as statistical physics tells us. We propose a dispersion entropy measure that quantifies the localization of infections in a generic contact graph. Formulating a maximum entropy problem, we find an upper bound for the dispersion entropy of the possible metastable state in the exa...

  15. Triadic conceptual structure of the maximum entropy approach to evolution.

    Science.gov (United States)

    Herrmann-Pillath, Carsten; Salthe, Stanley N

    2011-03-01

    Many problems in evolutionary theory are cast in dyadic terms, such as the polar oppositions of organism and environment. We argue that a triadic conceptual structure offers an alternative perspective under which the information-generating role of evolution as a physical process can be analyzed, and propose a new diagrammatic approach. Peirce's natural philosophy was deeply influenced by his reception of both Darwin's theory and thermodynamics. Thus, we elaborate a new synthesis which puts together his theory of signs and modern Maximum Entropy approaches to evolution in a process discourse. Following recent contributions to the naturalization of Peircean semiosis, pointing towards 'physiosemiosis' or 'pansemiosis', we show that triadic structures involve the conjunction of three different kinds of causality: efficient, formal and final. In this way, we accommodate the state-centered thermodynamic framework to a process approach. We apply this to Ulanowicz's analysis of autocatalytic cycles as primordial patterns of life. This paves the way for a semiotic view of thermodynamics which is built on the idea that Peircean interpretants are systems of physical inference devices evolving under natural selection. In this view, the principles of Maximum Entropy, Maximum Power, and Maximum Entropy Production work together to drive the emergence of information-carrying structures, which at the same time maximize information capacity as well as the gradients of energy flows, such that ultimately, contrary to Schrödinger's seminal contribution, the evolutionary process is seen to be a physical expression of the Second Law.

  16. Maximum-Entropy Inference with a Programmable Annealer.

    Science.gov (United States)

    Chancellor, Nicholas; Szoke, Szilard; Vinci, Walter; Aeppli, Gabriel; Warburton, Paul A

    2016-03-03

    Optimisation problems typically involve finding the ground state (i.e. the minimum energy configuration) of a cost function with respect to many variables. If the variables are corrupted by noise then this approach maximises the likelihood that the solution is correct. The maximum entropy solution on the other hand takes the form of a Boltzmann distribution over the ground and excited states of the cost function to correct for noise. Here we use a programmable annealer for the information decoding problem which we simulate as a random Ising model in a field. We show experimentally that finite temperature maximum entropy decoding can give slightly better bit-error-rates than the maximum likelihood approach, confirming that useful information can be extracted from the excited states of the annealer. Furthermore we introduce a bit-by-bit analytical method which is agnostic to the specific application and use it to show that the annealer samples from a highly Boltzmann-like distribution. Machines of this kind are therefore candidates for use in a variety of machine learning applications which exploit maximum entropy inference, including language processing and image recognition.

  17. Stationary properties of maximum-entropy random walks.

    Science.gov (United States)

    Dixit, Purushottam D

    2015-10-01

    Maximum-entropy (ME) inference of state probabilities using state-dependent constraints is popular in the study of complex systems. In stochastic systems, how state space topology and path-dependent constraints affect ME-inferred state probabilities remains unknown. To that end, we derive the transition probabilities and the stationary distribution of a maximum path entropy Markov process subject to state- and path-dependent constraints. A main finding is that the stationary distribution over states differs significantly from the Boltzmann distribution and reflects a competition between path multiplicity and imposed constraints. We illustrate our results with particle diffusion on a two-dimensional landscape. Connections with the path integral approach to diffusion are discussed.
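For the special case of maximizing path entropy on an unweighted graph with no additional constraints, the transition probabilities have a closed form in the leading eigenpair of the adjacency matrix (the maximal-entropy random walk). A sketch on a small assumed graph, showing the stationary distribution differing from the degree-proportional Boltzmann-like one of the ordinary walk:

```python
import numpy as np

def merw_transitions(A):
    """Maximum-entropy random walk on a connected undirected graph:
    P[i, j] = A[i, j] * psi[j] / (lam * psi[i]), where (lam, psi) is the
    leading (Perron) eigenpair of the adjacency matrix A. Among all walks
    on A this choice maximizes the entropy rate of trajectories."""
    A = np.asarray(A, dtype=float)
    vals, vecs = np.linalg.eigh(A)       # A symmetric; vals ascending
    lam = vals[-1]
    psi = np.abs(vecs[:, -1])            # Perron vector, strictly positive
    return A * psi[np.newaxis, :] / (lam * psi[:, np.newaxis]), psi

# Illustrative graph: the path 0-1-2-3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
P, psi = merw_transitions(A)

# Stationary distribution of the MERW is psi_i^2 (normalized), reflecting
# the competition between path multiplicity and the graph's structure.
pi = psi ** 2 / np.sum(psi ** 2)
```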

  18. Maximum information entropy: a foundation for ecological theory.

    Science.gov (United States)

    Harte, John; Newman, Erica A

    2014-07-01

    The maximum information entropy (MaxEnt) principle is a successful method of statistical inference that has recently been applied to ecology. Here, we show how MaxEnt can accurately predict patterns such as species-area relationships (SARs) and abundance distributions in macroecology and be a foundation for ecological theory. We discuss the conceptual foundation of the principle, why it often produces accurate predictions of probability distributions in science despite not incorporating explicit mechanisms, and how mismatches between predictions and data can shed light on driving mechanisms in ecology. We also review possible future extensions of the maximum entropy theory of ecology (METE), a potentially important foundation for future developments in ecological theory.

  19. Maximum-entropy closure of hydrodynamic moment hierarchies including correlations.

    Science.gov (United States)

    Hughes, Keith H; Burghardt, Irene

    2012-06-07

    Generalized hydrodynamic moment hierarchies are derived which explicitly include nonequilibrium two-particle and higher-order correlations. The approach is adapted to strongly correlated media and nonequilibrium processes on short time scales which necessitate an explicit treatment of time-evolving correlations. Closure conditions for the extended moment hierarchies are formulated by a maximum-entropy approach, generalizing related closure procedures for kinetic equations. A self-consistent set of nonperturbative dynamical equations are thus obtained for a chosen set of single-particle and two-particle (and possibly higher-order) moments. Analytical results are derived for generalized Gaussian closures including the dynamic pair distribution function and a two-particle correction to the current density. The maximum-entropy closure conditions are found to involve the Kirkwood superposition approximation.

  20. Optical and terahertz spectra analysis by the maximum entropy method.

    Science.gov (United States)

    Vartiainen, Erik M; Peiponen, Kai-Erik

    2013-06-01

    Phase retrieval is one of the classical problems in various fields of physics including x-ray crystallography, astronomy and spectroscopy. It arises when only an amplitude measurement on electric field can be made while both amplitude and phase of the field are needed for obtaining the desired material properties. In optical and terahertz spectroscopies, in particular, phase retrieval is a one-dimensional problem, which is considered as unsolvable in general. Nevertheless, an approach utilizing the maximum entropy principle has proven to be a feasible tool in various applications of optical, both linear and nonlinear, as well as in terahertz spectroscopies, where the one-dimensional phase retrieval problem arises. In this review, we focus on phase retrieval using the maximum entropy method in various spectroscopic applications. We review the theory behind the method and illustrate through examples why and how the method works, as well as discuss its limitations.

  1. Time series analysis by the Maximum Entropy method

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L.; Rust, B.W.; Van Winkle, W.

    1979-01-01

    The principal subject of this report is the use of the Maximum Entropy method for spectral analysis of time series. The classical Fourier method is also discussed, mainly as a standard for comparison with the Maximum Entropy method. Examples are given which clearly demonstrate the superiority of the latter method over the former when the time series is short. The report also includes a chapter outlining the theory of the method, a discussion of the effects of noise in the data, a chapter on significance tests, a discussion of the problem of choosing the prediction filter length, and, most importantly, a description of a package of FORTRAN subroutines for making the various calculations. Cross-referenced program listings are given in the appendices. The report also includes a chapter demonstrating the use of the programs by means of an example. Real time series like the lynx data and sunspot numbers are also analyzed. 22 figures, 21 tables, 53 references.
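The Burg recursion underlying the Maximum Entropy spectral estimator described above can be sketched in a few lines of numpy. This is a generic illustration, not the report's FORTRAN package; the function names and the test signal are invented for the example. It shows the property the report emphasizes: a sharp spectral line is recovered from a short series where a plain Fourier estimate would smear it.

```python
import numpy as np

def burg_ar(x, order):
    """Fit AR(order) coefficients by Burg's method; return (a, E).

    a[0] = 1, and the maximum-entropy PSD is E / |sum_k a[k] e^{-2*pi*i*f*k}|^2.
    """
    x = np.asarray(x, dtype=float)
    a = np.array([1.0])
    E = np.dot(x, x) / len(x)            # zeroth-order prediction error power
    f, b = x[1:].copy(), x[:-1].copy()   # forward / backward prediction errors
    for _ in range(order):
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))  # reflection coeff.
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        E *= 1.0 - k * k
        f, b = (f + k * b)[1:], (b + k * f)[:-1]   # update, then shift errors
    return a, E

def mem_psd(a, E, freqs):
    """Maximum-entropy PSD on a grid of normalized frequencies in [0, 0.5]."""
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(len(a))))
    return E / np.abs(z @ a) ** 2

# Short noisy sinusoid: only 64 samples, true normalized frequency 0.2
rng = np.random.default_rng(0)
t = np.arange(64)
x = np.sin(2 * np.pi * 0.2 * t) + 0.1 * rng.standard_normal(64)
a, E = burg_ar(x, order=8)
freqs = np.linspace(0.0, 0.5, 501)
psd = mem_psd(a, E, freqs)
print(freqs[np.argmax(psd)])   # peak close to the true frequency 0.2
```

The reflection-coefficient update is the standard Burg/Levinson recursion; the choice of AR order plays the role of the "prediction filter length" discussed in the report.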

  2. Combining experiments and simulations using the maximum entropy principle.

    Directory of Open Access Journals (Sweden)

    Wouter Boomsma

    2014-02-01

    Full Text Available A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights to the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges.

  3. Combining experiments and simulations using the maximum entropy principle.

    Science.gov (United States)

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-02-01

    A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights to the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges.
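The reweighting idea described in this abstract can be sketched in a toy numpy example: given simulation samples of an observable, the minimal (maximum-entropy) perturbation multiplies each sample's weight by exp(-lambda*f), with lambda chosen so the reweighted average matches the experimental value. All names and numbers below are illustrative, not from the papers discussed.

```python
import numpy as np

def maxent_weights(f, target, lam_lo=-50.0, lam_hi=50.0, iters=200):
    """Minimum-relative-entropy weights w_i ~ exp(-lam * f_i) chosen so the
    reweighted mean of the observable f equals `target` (solved by bisection)."""
    f = np.asarray(f, dtype=float)

    def reweighted(lam):
        w = np.exp(-lam * (f - f.mean()))   # shift exponent for numerical stability
        w /= w.sum()
        return np.dot(w, f), w

    # The reweighted mean is monotonically decreasing in lam (its derivative is
    # minus the reweighted variance), so bisection converges.
    for _ in range(iters):
        lam = 0.5 * (lam_lo + lam_hi)
        m, w = reweighted(lam)
        if m > target:
            lam_lo = lam
        else:
            lam_hi = lam
    return w, lam

# Simulated observable whose unweighted mean (~0) disagrees with "experiment" (0.5)
rng = np.random.default_rng(1)
samples = rng.standard_normal(10_000)
w, lam = maxent_weights(samples, target=0.5)
print(np.dot(w, samples))   # reweighted mean matches the target 0.5
```

With several experimental observables the scalar bisection would be replaced by a multidimensional solve for the Lagrange multipliers, but the exponential form of the correction is the same.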

  4. PNNL: A Supervised Maximum Entropy Approach to Word Sense Disambiguation

    Energy Technology Data Exchange (ETDEWEB)

    Tratz, Stephen C.; Sanfilippo, Antonio P.; Gregory, Michelle L.; Chappell, Alan R.; Posse, Christian; Whitney, Paul D.

    2007-06-23

    In this paper, we describe the PNNL Word Sense Disambiguation system as applied to the English All-Word task in SemEval 2007. We use a supervised learning approach, employing a large number of features and using Information Gain for dimension reduction. Our Maximum Entropy approach combined with a rich set of features produced results that are significantly better than baseline and achieve the highest F-score on the fine-grained English All-Words subtask.

  5. Maximum-entropy principle as Galerkin modelling paradigm

    Science.gov (United States)

    Noack, Bernd R.; Niven, Robert K.; Rowley, Clarence W.

    2012-11-01

    We show how the empirical Galerkin method, leading e.g. to POD models, can be derived from maximum-entropy principles building on Noack & Niven 2012 JFM. In particular, principles are proposed (1) for the Galerkin expansion, (2) for the Galerkin system identification, and (3) for the probability distribution of the attractor. Examples will illustrate the advantages of the entropic modelling paradigm. Partially supported by the ANR Chair of Excellence TUCOROM and an ADFA/UNSW Visiting Fellowship.

  6. ON A GENERALIZATION OF THE MAXIMUM ENTROPY THEOREM OF BURG

    Directory of Open Access Journals (Sweden)

    JOSÉ MARCANO

    2017-01-01

    Full Text Available In this article we introduce some matrix manipulations that allow us to obtain a version of the original Christoffel-Darboux formula, which is of interest in many applications of linear algebra. Using these matrix developments and Jensen's inequality, we obtain the main result of this proposal, which is a generalization of the maximum entropy theorem of Burg for multivariate processes.

  7. Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains

    Directory of Open Access Journals (Sweden)

    Erik Van der Straeten

    2009-11-01

    Full Text Available In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.

  8. Present and Last Glacial Maximum climates as states of maximum entropy production

    CERN Document Server

    Herbert, Corentin; Kageyama, Masa; Dubrulle, Berengere

    2011-01-01

    The Earth, like other planets with a relatively thick atmosphere, is not locally in radiative equilibrium and the transport of energy by the geophysical fluids (atmosphere and ocean) plays a fundamental role in determining its climate. Using simple energy-balance models, it was suggested a few decades ago that the meridional energy fluxes might follow a thermodynamic Maximum Entropy Production (MEP) principle. In the present study, we assess the MEP hypothesis in the framework of a minimal climate model based solely on a robust radiative scheme and the MEP principle, with no extra assumptions. Specifically, we show that by choosing an adequate radiative exchange formulation, the Net Exchange Formulation, a rigorous derivation of all the physical parameters can be performed. The MEP principle is also extended to surface energy fluxes, in addition to meridional energy fluxes. The climate model presented here is extremely fast, needs very little empirical data and does not rely on ad hoc parameterizations. We in...

  9. Maximum Tsallis entropy with generalized Gini and Gini mean difference indices constraints

    Science.gov (United States)

    Khosravi Tanak, A.; Mohtashami Borzadaran, G. R.; Ahmadi, J.

    2017-04-01

    Using the maximum entropy principle with Tsallis entropy, some distribution families for modeling income distribution are obtained. By considering income inequality measures, maximum Tsallis entropy distributions under the constraint on generalized Gini and Gini mean difference indices are derived. It is shown that the Tsallis entropy maximizers with the considered constraints belong to generalized Pareto family.

  10. Fast Forward Maximum entropy reconstruction of sparsely sampled data.

    Science.gov (United States)

    Balsgart, Nicholas M; Vosegaard, Thomas

    2012-10-01

    We present an analytical algorithm using fast Fourier transformations (FTs) for deriving the gradient needed as part of the iterative reconstruction of sparsely sampled datasets using the forward maximum entropy reconstruction (FM) procedure by Hyberts and Wagner [J. Am. Chem. Soc. 129 (2007) 5108]. The major drawback of the original algorithm is that it required one FT and one evaluation of the entropy per missing datapoint to establish the gradient. In the present study, we demonstrate that the entire gradient may be obtained using only two FTs and one evaluation of the entropy derivative, thus achieving impressive time savings compared to the original procedure. An example: a 2D dataset with sparse sampling of the indirect dimension, with sampling of only 75 out of 512 complex points (15% sampling), would lack (512-75)×2=874 points per ν2 slice. The original FM algorithm would require 874 FTs and entropy function evaluations to set up the gradient, while the present algorithm is ∼450 times faster in this case, since it requires only two FTs. This allows reduction of the computational time from several hours to less than a minute. Even more impressive time savings may be achieved with 2D reconstructions of 3D datasets, where reconstructions that required days of CPU time on high-performance computing clusters with the original algorithm require only a few minutes of calculation on a regular laptop computer with the new algorithm.

  11. Improved Maximum Entropy Method with an Extended Search Space

    CERN Document Server

    Rothkopf, Alexander

    2012-01-01

    We report on an improvement to the implementation of the Maximum Entropy Method (MEM). It amounts to departing from the search space obtained through a singular value decomposition (SVD) of the Kernel. Based on the shape of the SVD basis functions we argue that the MEM spectrum for given $N_\\tau$ data-points $D(\\tau)$ and prior information $m(\\omega)$ does not in general lie in this $N_\\tau$ dimensional singular subspace. Systematically extending the search basis will eventually recover the full search space and the correct extremum. We illustrate this idea through a mock data analysis inspired by actual lattice spectra, to show where our improvement becomes essential for the success of the MEM. To remedy the shortcomings of Bryan's SVD prescription we propose to use the real Fourier basis, which consists of trigonometric functions. Not only does our approach lead to more stable numerical behavior, as the SVD is not required for the determination of the basis functions, but also the resolution of the MEM beco...

  12. Maximum Entropy for the International Division of Labor.

    Science.gov (United States)

    Lei, Hongmei; Chen, Ying; Li, Ruiqi; He, Deli; Zhang, Jiang

    2015-01-01

    As a result of the international division of labor, the trade value distribution over different products substantiated by international trade flows can be regarded as one country's strategy for competition. According to the empirical data of trade flows, countries may spend a large fraction of export values on ubiquitous and competitive products. Meanwhile, countries may also diversify their export shares across different types of products to reduce risk. In this paper, we report that the export share distribution curves can be derived by maximizing the entropy of shares on different products under the product's complexity constraint once the international market structure (the country-product bipartite network) is given. Therefore, a maximum entropy model provides a good fit to the empirical data. The empirical data is consistent with maximum entropy subject to a constraint on the expected value of the product complexity for each country. One country's strategy is mainly determined by the types of products this country can export. In addition, our model is able to fit the empirical export share distribution curves of nearly every country very well by tuning only one parameter.

  13. Maximum-Entropy Method for Evaluating the Slope Stability of Earth Dams

    Directory of Open Access Journals (Sweden)

    Shuai Wang

    2012-10-01

    Full Text Available The slope stability is a very important problem in geotechnical engineering. This paper presents an approach for slope reliability analysis based on the maximum-entropy method. The key idea is to implement the maximum entropy principle in estimating the probability density function. The performance function is formulated by the Simplified Bishop's method to estimate the slope failure probability. The maximum-entropy method is used to estimate the probability density function (PDF) of the performance function subject to the moment constraints. A numerical example is calculated and compared to the Monte Carlo simulation (MCS) and the Advanced First Order Second Moment Method (AFOSM). The results show the accuracy and efficiency of the proposed method. The proposed method should be valuable for performing probabilistic analyses.
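The core step here, estimating a maximum-entropy density subject to moment constraints, can be sketched on a discretized support with plain numpy. This is a generic illustration, not the paper's slope-stability code: with a mean and second-moment constraint the maximizer is Gaussian, which gives a built-in correctness check.

```python
import numpy as np

# Discretized support and constraint functions f(x) = (x, x^2)
x = np.linspace(-6.0, 6.0, 2001)
F = np.vstack([x, x**2])            # shape (2, n)
targets = np.array([0.0, 1.0])      # desired first and second moments

# Dual ascent: the maxent solution has the form p(x) ~ exp(lam . f(x));
# adjust the multipliers lam until the model moments match the targets.
lam = np.zeros(2)
for _ in range(5000):
    logp = lam @ F
    logp -= logp.max()              # subtract max for numerical stability
    p = np.exp(logp)
    p /= p.sum()
    moments = F @ p
    lam += 0.01 * (targets - moments)   # concave dual -> simple gradient step

# Recompute the density for the final multipliers
logp = lam @ F
logp -= logp.max()
p = np.exp(logp)
p /= p.sum()

# With <x> = 0 and <x^2> = 1 the maximum-entropy density is the standard normal
gauss = np.exp(-x**2 / 2)
gauss /= gauss.sum()
print(np.max(np.abs(p - gauss)))   # small residual
```

With higher-order moment constraints (as typically used for a performance function) the same dual ascent applies; only the rows of `F` change, though the step size may need tuning.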

  14. Maximum-entropy for the laser fusion problem

    Energy Technology Data Exchange (ETDEWEB)

    Madkour, M.A. [Mansoura Univ. (Egypt). Dept. of Phys.]

    1996-09-01

    The problem of heat flux at the critical surfaces and the surfaces of a pellet of deuterium and tritium (conduction zone) heated by laser has been considered. Only ion-electron collisions are allowed for; i.e., the linear transport equation is used to describe the problem with boundary conditions. The maximum-entropy approach is used to calculate the electron density and temperature across the conduction zone as well as the heat flux. Numerical results are given and compared with those of Rouse and Williams and El-Wakil et al. (orig.).

  15. Use of Maximum Entropy Modeling in Wildlife Research

    Directory of Open Access Journals (Sweden)

    Roger A. Baldwin

    2009-11-01

    Full Text Available Maximum entropy (Maxent) modeling has great potential for identifying distributions and habitat selection of wildlife given its reliance on only presence locations. Recent studies indicate Maxent is relatively insensitive to spatial errors associated with location data, requires few locations to construct useful models, and performs better than other presence-only modeling approaches. Further advances are needed to better define model thresholds, to test model significance, and to address model selection. Additionally, development of modeling approaches is needed when using repeated sampling of known individuals to assess habitat selection. These advancements would strengthen the utility of Maxent for wildlife research and management.

  16. Combining Experiments and Simulations Using the Maximum Entropy Principle

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten

    2014-01-01

    applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained...... in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results...

  17. On some problems of the maximum entropy ansatz

    Indian Academy of Sciences (India)

    K Bandyopadhyay; K Bhattacharyya; A K Bhattacharyya

    2000-03-01

    Some problems associated with the use of the maximum entropy principle, namely, (i) possible divergence of the series that is exponentiated, (ii) input-dependent asymptotic behaviour of the density function resulting from the truncation of the said series, and (iii) non-vanishing of the density function at the boundaries of a finite domain are pointed out. Prescriptions for remedying the aforesaid problems are put forward. Pilot calculations involving the ground quantum eigenenergy states of the quartic oscillator, the particle-in-a-box model, and the classical Maxwellian speed and energy distributions lend credence to our approach.

  18. Improving predictability of time series using maximum entropy methods

    Science.gov (United States)

    Chliamovitch, G.; Dupuis, A.; Golub, A.; Chopard, B.

    2015-04-01

    We discuss how maximum entropy methods may be applied to the reconstruction of Markov processes underlying empirical time series and compare this approach to usual frequency sampling. It is shown that, in low dimension, there exists a subset of the space of stochastic matrices for which the MaxEnt method is more efficient than sampling, in the sense that shorter historical samples have to be considered to reach the same accuracy. Considering short samples is of particular interest when modelling smoothly non-stationary processes, which provides, under some conditions, a powerful forecasting tool. The method is illustrated for a discretized empirical series of exchange rates.

  19. Implementation of the Maximum Entropy Method for Analytic Continuation

    CERN Document Server

    Levy, Ryan; Gull, Emanuel

    2016-01-01

    We present $\\texttt{Maxent}$, a tool for performing analytic continuation of spectral functions using the maximum entropy method. The code operates on discrete imaginary axis datasets (values with uncertainties) and transforms this input to the real axis. The code works for imaginary time and Matsubara frequency data and implements the 'Legendre' representation of finite temperature Green's functions. It implements a variety of kernels, default models, and grids for continuing bosonic, fermionic, anomalous, and other data. Our implementation is licensed under GPLv2 and extensively documented. This paper shows the use of the programs in detail.

  20. Implementation of the maximum entropy method for analytic continuation

    Science.gov (United States)

    Levy, Ryan; LeBlanc, J. P. F.; Gull, Emanuel

    2017-06-01

    We present Maxent, a tool for performing analytic continuation of spectral functions using the maximum entropy method. The code operates on discrete imaginary axis datasets (values with uncertainties) and transforms this input to the real axis. The code works for imaginary time and Matsubara frequency data and implements the 'Legendre' representation of finite temperature Green's functions. It implements a variety of kernels, default models, and grids for continuing bosonic, fermionic, anomalous, and other data. Our implementation is licensed under GPLv3 and extensively documented. This paper shows the use of the programs in detail.

  1. Time-Reversal Acoustics and Maximum-Entropy Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Berryman, J G

    2001-08-22

    Target location is a common problem in acoustical imaging using either passive or active data inversion. Time-reversal methods in acoustics have the important characteristic that they provide a means of determining the eigenfunctions and eigenvalues of the scattering operator for either of these problems. Each eigenfunction may often be approximately associated with an individual scatterer. The resulting decoupling of the scattered field from a collection of targets is a very useful aid to localizing the targets, and suggests a number of imaging and localization algorithms. Two of these are linear subspace methods and maximum-entropy imaging.
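The eigenfunction decomposition referred to above can be illustrated with a small numpy model in the DORT spirit (geometry, wavenumber, and reflectivities below are made up for illustration): under the Born approximation, the array response matrix of well-resolved point scatterers has one significant singular value per scatterer, and back-propagating a singular vector focuses on "its" target.

```python
import numpy as np

k = 2 * np.pi   # wavenumber (wavelength = 1, arbitrary units)
elements = np.stack([np.linspace(-8, 8, 33), np.zeros(33)], axis=1)   # linear array
scatterers = np.array([[-3.0, 40.0], [4.0, 40.0]])                    # two targets
reflectivity = np.array([1.0, 0.6])

def green(src, dst):
    """Free-space Green's function exp(ikr)/r between two point sets."""
    r = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=-1)
    return np.exp(1j * k * r) / r

G = green(elements, scatterers)                 # elements -> scatterers
K = G @ np.diag(reflectivity) @ G.T             # Born-approximation response matrix
U, s, Vh = np.linalg.svd(K)
print(s[:4] / s[0])   # two significant singular values; the rest are ~ 0 (rank 2)

# Back-propagate the leading singular vector to a line through the targets:
# the resulting image peaks near the stronger scatterer.
probe = np.stack([np.linspace(-10, 10, 401), np.full(401, 40.0)], axis=1)
image = np.abs(green(elements, probe).T @ U[:, 0].conj())
print(probe[np.argmax(image), 0])   # x-position of the image peak
```

This is the linear-subspace side of the story; the maximum-entropy imaging step mentioned in the abstract would replace the simple matched-filter back-propagation with a regularized inversion.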

  2. Robust optimum design with maximum entropy method; Saidai entropy ho mochiita robust sei saitekika sekkeiho

    Energy Technology Data Exchange (ETDEWEB)

    Kawaguchi, K.; Egashira, Y.; Watanabe, G. [Mazda Motor Corp., Hiroshima (Japan)

    1997-10-01

    Vehicle and unit performance varies according not only to external causes represented by the environment, such as temperature or weather, but also to internal causes such as dispersion in component characteristics and manufacturing processes, or ageing deterioration. We developed a design method to estimate these performance distributions with the maximum entropy method and to calculate specifications with high performance robustness using fuzzy theory. This paper describes the details of these methods and examples applied to a power window system. 3 refs., 7 figs., 4 tabs.

  3. Hierarchical maximum entropy principle for generalized superstatistical systems and Bose-Einstein condensation of light.

    Science.gov (United States)

    Sob'yanin, Denis Nikolaevich

    2012-06-01

    A principle of hierarchical entropy maximization is proposed for generalized superstatistical systems, which are characterized by the existence of three levels of dynamics. If a generalized superstatistical system comprises a set of superstatistical subsystems, each made up of a set of cells, then the Boltzmann-Gibbs-Shannon entropy should be maximized first for each cell, second for each subsystem, and finally for the whole system. Hierarchical entropy maximization naturally reflects the sufficient time-scale separation between different dynamical levels and allows one to find the distribution of both the intensive parameter and the control parameter for the corresponding superstatistics. The hierarchical maximum entropy principle is applied to fluctuations of the photon Bose-Einstein condensate in a dye microcavity. This principle provides an alternative to the master equation approach recently applied to this problem. The possibility of constructing generalized superstatistics based on a statistics different from the Boltzmann-Gibbs statistics is pointed out.

  4. Stimulus-dependent maximum entropy models of neural population codes.

    Science.gov (United States)

    Granot-Atedgi, Einat; Tkačik, Gašper; Segev, Ronen; Schneidman, Elad

    2013-01-01

    Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model, a minimal extension of the canonical linear-nonlinear model of a single neuron to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single cell responses and in particular significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population.
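The (stimulus-independent) pairwise maximum-entropy step that the SDME model extends can be sketched for a tiny population by exact enumeration. This is purely illustrative: three "neurons", an arbitrary invented set of binary spike words as data, and gradient ascent on the log-likelihood, which for maximum-entropy models amounts to moving the model's means and pair correlations toward the data's.

```python
import numpy as np
from itertools import product

n = 3
states = np.array(list(product([0, 1], repeat=n)), dtype=float)  # all 2^n words
pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]

def model_moments(h, J):
    """Means <s_i> and correlations <s_i s_j> under P(s) ~ exp(h.s + sum J_ij s_i s_j)."""
    E = states @ h + sum(J[a] * states[:, i] * states[:, j]
                         for a, (i, j) in enumerate(pairs))
    p = np.exp(E - E.max())
    p /= p.sum()
    means = states.T @ p
    corrs = np.array([np.dot(states[:, i] * states[:, j], p) for i, j in pairs])
    return means, corrs, p

# "Data" moments from an arbitrary small set of binary spike words (invented)
data = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 1],
                 [1, 1, 1], [1, 0, 1], [0, 0, 1], [1, 1, 0]], dtype=float)
m_data = data.mean(axis=0)
c_data = np.array([np.mean(data[:, i] * data[:, j]) for i, j in pairs])

# Gradient ascent on the log-likelihood: the gradient is (data - model) moments
h, J = np.zeros(n), np.zeros(len(pairs))
for _ in range(30000):
    m, c, _ = model_moments(h, J)
    h += 0.5 * (m_data - m)
    J += 0.5 * (c_data - c)

m, c, p = model_moments(h, J)
print(np.max(np.abs(m - m_data)), np.max(np.abs(c - c_data)))  # both tiny after fitting
```

For realistic population sizes the exact enumeration over 2^n states is replaced by Monte Carlo estimates of the model moments, and the SDME model additionally makes `h` a function of the stimulus.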

  5. Stimulus-dependent maximum entropy models of neural population codes.

    Directory of Open Access Journals (Sweden)

    Einat Granot-Atedgi

    Full Text Available Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. For large populations, direct sampling of these distributions is impossible, and so we must rely on constructing appropriate models. We show here that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. We introduce the stimulus-dependent maximum entropy (SDME) model, a minimal extension of the canonical linear-nonlinear model of a single neuron to a pairwise-coupled neural population. We find that the SDME model gives a more accurate account of single cell responses and in particular significantly outperforms uncoupled models in reproducing the distributions of population codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like average surprise and information transmission in a neural population.

  6. Maximum Entropy, Word-Frequency, Chinese Characters, and Multiple Meanings

    CERN Document Server

    Yan, Xiao-Yong

    2014-01-01

    The word-frequency distribution of a text written by an author is well accounted for by a maximum entropy distribution, the RGF (random group formation)-prediction. The RGF-distribution is completely determined by the a priori values of the total number of words in the text (M), the number of distinct words (N) and the number of repetitions of the most common word (k_max). It is here shown that this maximum entropy prediction also describes a text written in Chinese characters. In particular it is shown that although the same Chinese text written in words and Chinese characters have quite differently shaped distributions, they are nevertheless both well predicted by their respective three a priori characteristic values. It is pointed out that this is analogous to the change in the shape of the distribution when translating a given text to another language. Another consequence of the RGF-prediction is that taking a part of a long text will change the input parameters (M, N, k_max) and consequently also the sha...

  7. Regional Analysis of Precipitation by Means of Bivariate Distribution Adjusted by Maximum Entropy; Analisis regional de precipitacion con base en una distribucion bivariada ajustada por maxima entropia

    Energy Technology Data Exchange (ETDEWEB)

    Escalante Sandoval, Carlos A.; Dominguez Esquivel, Jose Y. [Universidad Nacional Autonoma de Mexico (Mexico)

    2001-09-01

    The principle of maximum entropy (POME) is used to derive an alternative method of parameter estimation for the bivariate Gumbel distribution. A simple algorithm for this parameter estimation technique is presented. This method is applied to analyze the precipitation in a region of Mexico. Design events are compared with those obtained by the maximum likelihood procedure. According to the results, the proposed technique is a suitable option to be considered when performing frequency analysis of precipitation with small samples. [Spanish abstract, translated] The principle of maximum entropy, known as POME, is used to derive an alternative parameter-estimation procedure for the bivariate extreme-value distribution with Gumbel marginals. The model is applied to the analysis of the 24-hour maximum precipitation in a region of Mexico, and the resulting design events are compared with those provided by the maximum likelihood technique. According to the results obtained, it is concluded that the proposed technique represents a good option, above all for the case of small samples.

  8. Advisor-advisee relationship identification based on maximum entropy model (基于最大熵模型的导师-学生关系推测)

    Institute of Scientific and Technical Information of China (English)

    李勇军; 刘尊; 于会

    2013-01-01

    [Translated from Chinese] The advisor-advisee relationship is one of the important relationship types in research collaboration networks, and accurately identifying such relationships is significant for promoting academic exchange and collaboration, for reviewer conflict-of-interest avoidance, and so on. Based on the paper co-authorship network, and on the observation that a student usually co-authors papers with his or her advisor, features reflecting the advisor-advisee collaboration relationship are extracted, and an advisor-advisee relationship identification algorithm based on the maximum entropy model is proposed. Validation on DBLP paper data from 1990-2011 shows that: 1) the accuracy of relationship-type identification exceeds 95%; 2) the mean error in the end time of the advisor-advisee relationship is 1.39 years. The method avoids the constraint that features be mutually independent, its accuracy is better than that of other comparable identification algorithms, and the modelling approach is also instructive for identifying other relationship types in social networks. [English abstract] Research collaboration network has become an essential part in our academic activities. We can keep or develop collaboration relationships with other researchers or share research results with them within the research collaboration network. It is generally accepted that different relationships have essentially different influences on the collaboration of researchers. Such a scenario also happens in our daily life. The advisor-advisee relationship plays an important role in the research collaboration network, so identification of the advisor-advisee relationship can benefit the collaboration of researchers. In this paper, we aim to conduct a systematic investigation of the problem of identifying the social relationship types from publication networks, and try to propose an easily computed and effective solution to this problem. Based on the common knowledge that a graduate student always co-authors his papers with his advisor and not vice versa, our study starts with an analysis on the publication network, and retrieves those features that can represent the advisor-advisee relationship. According to these features, an advisor-advisee relationship identification algorithm based on maximum entropy model with

  9. Application of Maximum Entropy Deconvolution to ${\\gamma}$-ray Skymaps

    CERN Document Server

    Raab, Susanne

    2015-01-01

    Skymaps measured with imaging atmospheric Cherenkov telescopes (IACTs) represent the real source distribution convolved with the point spread function of the observing instrument. Current IACTs have an angular resolution in the order of 0.1$^\circ$, which is rather large for the study of morphological structures and for comparing the morphology in $\gamma$-rays to measurements in other wavelengths where the instruments have better angular resolutions. Serendipitously, it is possible to approximate the underlying true source distribution by applying a deconvolution algorithm to the observed skymap, thus effectively improving the instrument's angular resolution. From the multitude of existing deconvolution algorithms several are already used in astronomy, but in the special case of $\gamma$-ray astronomy most of these algorithms are challenged due to the high noise level within the measured data. One promising algorithm for the application to $\gamma$-ray data is the Maximum Entropy Algorithm. The advantages of th...

  10. Venus atmosphere profile from a maximum entropy principle

    Directory of Open Access Journals (Sweden)

    L. N. Epele

    2007-10-01

    Full Text Available The variational method with constraints recently developed by Verkley and Gerkema to describe maximum-entropy atmospheric profiles is generalized to ideal gases but with temperature-dependent specific heats. In so doing, an extended and non-standard potential temperature is introduced that is well suited for tackling the problem under consideration. This new formalism is successfully applied to the atmosphere of Venus. Three well-defined regions emerge in this atmosphere up to a height of 100 km from the surface: the lowest one up to about 35 km is adiabatic, a transition layer located at the height of the cloud deck and finally a third region which is practically isothermal.

  11. Conjugate variables in continuous maximum-entropy inference.

    Science.gov (United States)

    Davis, Sergio; Gutiérrez, Gonzalo

    2012-11-01

    For a continuous maximum-entropy distribution (obtained from an arbitrary number of simultaneous constraints), we derive a general relation connecting the Lagrange multipliers and the expectation values of certain particularly constructed functions of the states of the system. From this relation, an estimator for a given Lagrange multiplier can be constructed from derivatives of the corresponding constraining function. These estimators sometimes lead to the determination of the Lagrange multipliers by way of solving a linear system, and, in general, they provide another tool to widen the applicability of Jaynes's formalism. This general relation, especially well suited for computer simulation techniques, also provides some insight into the interpretation of the hypervirial relations known in statistical mechanics and the recently derived microcanonical dynamical temperature. We illustrate the usefulness of these new relations with several applications in statistics.
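The Lagrange-multiplier machinery the authors build on can be illustrated with Jaynes's classic die example: under a single mean constraint, the maximum-entropy distribution is exponential in the constrained quantity, and the multiplier is fixed by matching the constraint. A minimal numerical sketch (the states and target mean are illustrative, not from the paper):

```python
import numpy as np
from scipy.optimize import brentq

states = np.arange(1, 7)   # faces of a die
target_mean = 4.5          # observed average pip count (illustrative)

def mean_for_lambda(lam):
    # Maximum-entropy distribution p_i ∝ exp(-lam * x_i) under a mean constraint.
    w = np.exp(-lam * states)
    p = w / w.sum()
    return p @ states

# Choose the Lagrange multiplier so the constraint <x> = target_mean holds.
lam = brentq(lambda l: mean_for_lambda(l) - target_mean, -5.0, 5.0)
w = np.exp(-lam * states)
p = w / w.sum()
```

Since the target mean exceeds the uniform mean of 3.5, the recovered multiplier is negative, tilting probability toward the high faces.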

  12. Test images for the maximum entropy image restoration method

    Science.gov (United States)

    Mackey, James E.

    1990-01-01

    One of the major activities of any experimentalist is data analysis and reduction. In solar physics, remote observations are made of the sun in a variety of wavelengths and circumstances. In no case is the data collected free from the influence of the design and operation of the data gathering instrument as well as the ever present problem of noise. The presence of significant noise invalidates the simple inversion procedure regardless of the range of known correlation functions. The Maximum Entropy Method (MEM) attempts to perform this inversion by making minimal assumptions about the data. To provide a means of testing the MEM and characterizing its sensitivity to noise, choice of point spread function, type of data, etc., one would like to have test images of known characteristics that can represent the type of data being analyzed. A means of reconstructing these images is presented.

  13. A strong test of the maximum entropy theory of ecology.

    Science.gov (United States)

    Xiao, Xiao; McGlinn, Daniel J; White, Ethan P

    2015-03-01

    The maximum entropy theory of ecology (METE) is a unified theory of biodiversity that predicts a large number of macroecological patterns using information on only species richness, total abundance, and total metabolic rate of the community. We evaluated four major predictions of METE simultaneously at an unprecedented scale using data from 60 globally distributed forest communities including more than 300,000 individuals and nearly 2,000 species. METE successfully captured 96% and 89% of the variation in the rank distribution of species abundance and individual size but performed poorly when characterizing the size-density relationship and intraspecific distribution of individual size. Specifically, METE predicted a negative correlation between size and species abundance, which is weak in natural communities. By evaluating multiple predictions with large quantities of data, our study not only identifies a mismatch between abundance and body size in METE but also demonstrates the importance of conducting strong tests of ecological theories.

  14. Maximum Entropy and Probability Kinematics Constrained by Conditionals

    Directory of Open Access Journals (Sweden)

    Stefan Lukits

    2015-03-01

    Full Text Available Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.

  15. Triadic Conceptual Structure of the Maximum Entropy Approach to Evolution

    CERN Document Server

    Herrmann-Pillath, Carsten

    2010-01-01

    Many problems in evolutionary theory are cast in dyadic terms, such as the polar oppositions of organism and environment. We argue that a triadic conceptual structure offers an alternative perspective under which the information generating role of evolution as a physical process can be analyzed, and propose a new diagrammatic approach. Peirce's natural philosophy was deeply influenced by his reception of both Darwin's theory and thermodynamics. Thus, we elaborate on a new synthesis which puts together his theory of signs and modern Maximum Entropy approaches to evolution. Following recent contributions to the naturalization of Peircean semiosis, we show that triadic structures involve the conjunction of three different kinds of causality, efficient, formal and final. We apply this on Ulanowicz's analysis of autocatalytic cycles as primordial patterns of life. This paves the way for a semiotic view of thermodynamics which is built on the idea that Peircean interpretants are systems of physical inference device...

  16. Maximum entropy, word-frequency, Chinese characters, and multiple meanings.

    Science.gov (United States)

    Yan, Xiaoyong; Minnhagen, Petter

    2015-01-01

    The word-frequency distribution of a text written by an author is well accounted for by a maximum entropy distribution, the RGF (random group formation)-prediction. The RGF-distribution is completely determined by the a priori values of the total number of words in the text (M), the number of distinct words (N) and the number of repetitions of the most common word (k(max)). It is here shown that this maximum entropy prediction also describes a text written in Chinese characters. In particular it is shown that although the same Chinese text written in words and Chinese characters have quite differently shaped distributions, they are nevertheless both well predicted by their respective three a priori characteristic values. It is pointed out that this is analogous to the change in the shape of the distribution when translating a given text to another language. Another consequence of the RGF-prediction is that taking a part of a long text will change the input parameters (M, N, k(max)) and consequently also the shape of the frequency distribution. This is explicitly confirmed for texts written in Chinese characters. Since the RGF-prediction has no system-specific information beyond the three a priori values (M, N, k(max)), any specific language characteristic has to be sought in systematic deviations from the RGF-prediction and the measured frequencies. One such systematic deviation is identified and, through a statistical information theoretical argument and an extended RGF-model, it is proposed that this deviation is caused by multiple meanings of Chinese characters. The effect is stronger for Chinese characters than for Chinese words. The relation between Zipf's law, the Simon-model for texts and the present results are discussed.
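The three a priori quantities that determine the RGF prediction are directly computable from any tokenized text. A small sketch (whitespace tokenization is a simplification; the paper's texts would need proper word or character segmentation):

```python
from collections import Counter

def rgf_inputs(text):
    # M: total tokens; N: distinct tokens; k_max: count of the most common token.
    counts = Counter(text.lower().split())
    M = sum(counts.values())
    N = len(counts)
    k_max = counts.most_common(1)[0][1]
    return M, N, k_max
```

For example, `rgf_inputs("the cat sat on the mat")` returns `(6, 5, 2)`: six tokens, five distinct, with "the" occurring twice. Taking a part of a longer text changes all three values, which is why the predicted frequency distribution changes shape, as the abstract notes.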

  17. Understanding Peripheral Bat Populations Using Maximum-Entropy Suitability Modeling

    Science.gov (United States)

    Barnhart, Paul R.; Gillam, Erin H.

    2016-01-01

    Individuals along the periphery of a species distribution regularly encounter more challenging environmental and climatic conditions than conspecifics near the center of the distribution. Due to these potential constraints, individuals in peripheral margins are expected to change their habitat and behavioral characteristics. Managers typically rely on species distribution maps when developing adequate management practices. However, these range maps are often too simplistic and do not provide adequate information as to what fine-scale biotic and abiotic factors are driving a species' occurrence. In the last decade, habitat suitability modelling has become widely used as a substitute for simplistic distribution mapping, which allows regional managers the ability to fine-tune management resources. The objectives of this study were to use maximum-entropy modeling to produce habitat suitability models for seven species that have a peripheral margin intersecting the state of North Dakota, according to current IUCN distributions, and determine the vegetative and climatic characteristics driving these models. Mistnetting resulted in the documentation of five species outside the IUCN distribution in North Dakota, indicating that current range maps for North Dakota, and potentially the northern Great Plains, are in need of update. Maximum-entropy modeling showed that temperature and not precipitation were the variables most important for model production. This fine-scale result highlights the importance of habitat suitability modelling as this information cannot be extracted from distribution maps. Our results provide baseline information needed for future research about how and why individuals residing in the peripheral margins of a species’ distribution may show marked differences in habitat use as a result of urban expansion, habitat loss, and climate change compared to more centralized populations. PMID:27935936

  18. Predicting the potential distribution of main malaria vectors Anopheles stephensi, An. culicifacies s.l. and An. fluviatilis s.l. in Iran based on maximum entropy model.

    Science.gov (United States)

    Pakdad, Kamran; Hanafi-Bojd, Ahmad Ali; Vatandoost, Hassan; Sedaghat, Mohammad Mehdi; Raeisi, Ahmad; Moghaddam, Abdolreza Salahi; Foroushani, Abbas Rahimi

    2017-05-01

    Malaria is considered a major public health problem in southern areas of Iran. The goal of this study was to predict the best ecological niches of three main malaria vectors of Iran: Anopheles stephensi, Anopheles culicifacies s.l. and Anopheles fluviatilis s.l. A databank was created which included all published data about Anopheles species of Iran from 1961 to 2015. The suitable environmental niches for the three above mentioned Anopheles species were predicted using the maximum entropy model (MaxEnt). AUC (area under the ROC curve) values were 0.943, 0.974 and 0.956 for An. stephensi, An. culicifacies s.l. and An. fluviatilis s.l., respectively, indicating the high predictive power of the model for these species' niches. The biggest bioclimatic contributor for An. stephensi and An. fluviatilis s.l. was bio 15 (precipitation seasonality), at 25.5% and 36.1% respectively, followed by bio 1 (annual mean temperature) at 20.8% for An. stephensi, and bio 4 (temperature seasonality) with a 49.4% contribution for An. culicifacies s.l. This is the first step in the mapping of the country's malaria vectors; future climatic conditions can change the dispersal maps of Anopheles. Iran is in the elimination phase of malaria, so such spatio-temporal studies are essential and could provide guidelines for decision makers on IVM strategies in problematic areas. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Comparison Between Bayesian and Maximum Entropy Analyses of Flow Networks†

    Directory of Open Access Journals (Sweden)

    Steven H. Waldrip

    2017-02-01

    Full Text Available We compare the application of Bayesian inference and the maximum entropy (MaxEnt) method for the analysis of flow networks, such as water, electrical and transport networks. The two methods have the advantage of allowing a probabilistic prediction of flow rates and other variables, when there is insufficient information to obtain a deterministic solution, and also allow the effects of uncertainty to be included. Both methods of inference update a prior to a posterior probability density function (pdf) by the inclusion of new information, in the form of data or constraints. The MaxEnt method maximises an entropy function subject to constraints, using the method of Lagrange multipliers, to give the posterior, while the Bayesian method finds its posterior by multiplying the prior with likelihood functions incorporating the measured data. In this study, we examine MaxEnt using soft constraints, either included in the prior or as probabilistic constraints, in addition to standard moment constraints. We show that when the prior is Gaussian, both Bayesian inference and the MaxEnt method with soft prior constraints give the same posterior means, but their covariances are different. In the Bayesian method, the interactions between variables are applied through the likelihood function, using second or higher-order cross-terms within the posterior pdf. In contrast, the MaxEnt method incorporates interactions between variables using Lagrange multipliers, avoiding second-order correlation terms in the posterior covariance. The MaxEnt method with soft prior constraints, therefore, has a numerical advantage over Bayesian inference, in that the covariance terms are avoided in its integrations. The second MaxEnt method with soft probabilistic constraints is shown to give posterior means of similar, but not identical, structure to the other two methods, due to its different formulation.

  20. A Microeconomic Interpretation of the Maximum Entropy Estimator of Multinomial Logit Models and Its Equivalence to the Maximum Likelihood Estimator

    Directory of Open Access Journals (Sweden)

    Louis de Grange

    2010-09-01

    Full Text Available Maximum entropy models are often used to describe supply and demand behavior in urban transportation and land use systems. However, they have been criticized for not representing behavioral rules of system agents and because their parameters seem to adjust only to modeler-imposed constraints. In response, it is demonstrated that the solution to the entropy maximization problem with linear constraints is a multinomial logit model whose parameters solve the likelihood maximization problem of this probabilistic model. But this result neither provides a microeconomic interpretation of the entropy maximization problem nor explains the equivalence of these two optimization problems. This work demonstrates that an analysis of the dual of the entropy maximization problem yields two useful alternative explanations of its solution. The first shows that the maximum entropy estimators of the multinomial logit model parameters reproduce rational user behavior, while the second shows that the likelihood maximization problem for multinomial logit models is the dual of the entropy maximization problem.
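The stated equivalence can be checked numerically: maximizing entropy subject to normalization and a linear (mean-utility) constraint returns the same probabilities as the closed-form logit p_i ∝ exp(β x_i), with β playing the role of the Lagrange multiplier. A sketch with made-up utilities:

```python
import numpy as np
from scipy.optimize import minimize, brentq

x = np.array([1.0, 2.0, 3.0])   # utilities of three alternatives (illustrative)
target = 2.2                    # required mean utility (illustrative)

# Direct entropy maximization subject to sum(p) = 1 and p @ x = target.
res = minimize(lambda p: np.sum(p * np.log(p)), np.ones(3) / 3,
               bounds=[(1e-9, 1.0)] * 3,
               constraints=[{'type': 'eq', 'fun': lambda p: p.sum() - 1.0},
                            {'type': 'eq', 'fun': lambda p: p @ x - target}])

# Closed form: multinomial logit with beta solving the same mean constraint.
def mean_utility(beta):
    w = np.exp(beta * x)
    return (w / w.sum()) @ x

beta = brentq(lambda b: mean_utility(b) - target, -10.0, 10.0)
logit = np.exp(beta * x)
logit /= logit.sum()
```

The two probability vectors `res.x` and `logit` agree to solver tolerance, which is the equivalence the article interprets microeconomically.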

  1. Maximum entropy method for solving operator equations of the first kind

    Institute of Scientific and Technical Information of China (English)

    金其年; 侯宗义

    1997-01-01

    The maximum entropy method for linear ill-posed problems with modeling error and noisy data is considered and the stability and convergence results are obtained. When the maximum entropy solution satisfies the "source condition", suitable rates of convergence can be derived. Considering the practical applications, an a posteriori choice for the regularization parameter is presented. As a byproduct, a characterization of the maximum entropy regularized solution is given.
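A hedged sketch of the underlying idea: regularize a noisy, ill-conditioned linear system by adding a relative-entropy penalty against a prior estimate m, which also keeps the solution positive. The operator, data, and regularization parameter below are synthetic, not from the paper (which addresses the a posteriori choice of that parameter):

```python
import numpy as np
from scipy.optimize import minimize

A = np.array([[1.0, 0.9],
              [0.9, 0.82]])                      # nearly rank-deficient operator
x_true = np.array([1.0, 2.0])
rng = np.random.default_rng(1)
b = A @ x_true + 0.01 * rng.standard_normal(2)   # noisy data

m = np.ones(2)        # prior estimate of the solution
alpha = 1e-2          # regularization parameter (illustrative fixed value)

def objective(x):
    # Residual misfit plus maximum-entropy penalty sum_i x_i log(x_i / m_i).
    r = A @ x - b
    return r @ r + alpha * np.sum(x * np.log(x / m))

res = minimize(objective, m, bounds=[(1e-9, None)] * 2)
x_reg = res.x
```

Unlike Tikhonov regularization, the entropy penalty is only defined on the positive orthant, so positivity of the regularized solution comes for free.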

  2. The equivalence of minimum entropy production and maximum thermal efficiency in endoreversible heat engines.

    Science.gov (United States)

    Haseli, Y

    2016-05-01

    The objective of this study is to investigate the thermal efficiency and power production of typical models of endoreversible heat engines at the regime of minimum entropy generation rate. The study considers the Curzon-Ahlborn engine, the Novikov's engine, and the Carnot vapor cycle. The operational regimes at maximum thermal efficiency, maximum power output and minimum entropy production rate are compared for each of these engines. The results reveal that in an endoreversible heat engine, a reduction in entropy production corresponds to an increase in thermal efficiency. The three criteria of minimum entropy production, the maximum thermal efficiency, and the maximum power may become equivalent at the condition of fixed heat input.
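The benchmark efficiencies being compared are elementary to compute; a quick sketch with illustrative reservoir temperatures:

```python
import math

def carnot_efficiency(T_hot, T_cold):
    # Reversible upper bound: 1 - Tc/Th.
    return 1.0 - T_cold / T_hot

def curzon_ahlborn_efficiency(T_hot, T_cold):
    # Efficiency at maximum power for an endoreversible engine: 1 - sqrt(Tc/Th).
    return 1.0 - math.sqrt(T_cold / T_hot)

T_h, T_c = 600.0, 300.0                         # kelvin, illustrative values
eta_carnot = carnot_efficiency(T_h, T_c)        # 0.5
eta_ca = curzon_ahlborn_efficiency(T_h, T_c)    # ≈ 0.293
```

The Curzon-Ahlborn value always lies below the Carnot bound; the paper's point is about where, relative to these benchmarks, the minimum-entropy-production regime sits.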

  3. A maximum entropy theorem with applications to the measurement of biodiversity

    CERN Document Server

    Leinster, Tom

    2009-01-01

    This is a preliminary article stating and proving a new maximum entropy theorem. The entropies that we consider can be used as measures of biodiversity. In that context, the question is: for a given collection of species, which frequency distribution(s) maximize the diversity? The theorem provides the answer. The chief surprise is that although we are dealing not just with a single entropy, but a one-parameter family of entropies, there is a single distribution maximizing all of them simultaneously.

  4. Local image statistics: maximum-entropy constructions and perceptual salience.

    Science.gov (United States)

    Victor, Jonathan D; Conte, Mary M

    2012-07-01

    The space of visual signals is high-dimensional and natural visual images have a highly complex statistical structure. While many studies suggest that only a limited number of image statistics are used for perceptual judgments, a full understanding of visual function requires analysis not only of the impact of individual image statistics, but also, how they interact. In natural images, these statistical elements (luminance distributions, correlations of low and high order, edges, occlusions, etc.) are intermixed, and their effects are difficult to disentangle. Thus, there is a need for construction of stimuli in which one or more statistical elements are introduced in a controlled fashion, so that their individual and joint contributions can be analyzed. With this as motivation, we present algorithms to construct synthetic images in which local image statistics--including luminance distributions, pair-wise correlations, and higher-order correlations--are explicitly specified and all other statistics are determined implicitly by maximum-entropy. We then apply this approach to measure the sensitivity of the human visual system to local image statistics and to sample their interactions.

  5. Inferential permutation tests for maximum entropy models in ecology.

    Science.gov (United States)

    Shipley, Bill

    2010-09-01

    Maximum entropy (maxent) models assign probabilities to states that (1) agree with measured macroscopic constraints on attributes of the states and (2) are otherwise maximally uninformative and are thus as close as possible to a specified prior distribution. Such models have recently become popular in ecology, but classical inferential statistical tests require assumptions of independence during the allocation of entities to states that are rarely fulfilled in ecology. This paper describes a new permutation test for such maxent models that is appropriate for very general prior distributions and for cases in which many states have zero abundance and that can be used to test for conditional relevance of subsets of constraints. Simulations show that the test gives correct probability estimates under the null hypothesis. Power under the alternative hypothesis depends primarily on the number and strength of the constraints and on the number of states in the model; the number of empty states has only a small effect on power. The test is illustrated using two empirical data sets to test the community assembly model of B. Shipley, D. Vile, and E. Garnier and the species abundance distribution models of S. Pueyo, F. He, and T. Zillio.
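The logic of such a permutation test can be sketched generically: recompute the fit statistic after shuffling the observed abundances across states, then report the fraction of permutations that fit at least as well as the actual assignment. All numbers below are invented for illustration and are not Shipley's actual statistic or data:

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_stat(observed, predicted):
    # Squared discrepancy between observed and model-predicted probabilities.
    return float(np.sum((observed - predicted) ** 2))

observed = np.array([0.40, 0.25, 0.15, 0.12, 0.08])   # hypothetical abundances
predicted = np.array([0.35, 0.27, 0.18, 0.12, 0.08])  # hypothetical maxent fit

obs_stat = fit_stat(observed, predicted)
# Null: the model fits a random assignment of abundances to states just as well.
perm_stats = [fit_stat(rng.permutation(observed), predicted)
              for _ in range(999)]
p_value = (1 + sum(s <= obs_stat for s in perm_stats)) / (1 + len(perm_stats))
```

The `+1` in numerator and denominator is the standard correction that counts the observed arrangement as one permutation, keeping the p-value strictly positive.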

  6. On the maximum-entropy/autoregressive modeling of time series

    Science.gov (United States)

    Chao, B. F.

    1984-01-01

    The autoregressive (AR) model of a random process is interpreted in the light of the Prony's relation which relates a complex conjugate pair of poles of the AR process in the z-plane (or the z domain) on the one hand, to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (or the frequency domain), is merely a convenient, though ambiguous, visual representation. It is asserted that the position and shape of a spectral peak is determined by the corresponding complex frequency, and the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
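The pole-to-peak correspondence described here can be verified directly: place a complex-conjugate pole pair near the unit circle at angle θ and the ME/AR spectrum peaks at (approximately) frequency θ. A sketch with illustrative pole parameters:

```python
import numpy as np

# AR(2) with complex-conjugate poles at r * exp(±i*theta):
# x_t = a1 x_{t-1} + a2 x_{t-2} + e_t, where a1 = 2 r cos(theta), a2 = -r^2.
r, theta = 0.95, np.pi / 4
a1, a2 = 2.0 * r * np.cos(theta), -r ** 2

def me_ar_spectrum(omega):
    # ME/AR power spectrum (unit innovation variance) evaluated on the unit circle.
    z = np.exp(-1j * omega)
    return 1.0 / np.abs(1.0 - a1 * z - a2 * z ** 2) ** 2

omega = np.linspace(0.0, np.pi, 2000)
peak_freq = omega[np.argmax(me_ar_spectrum(omega))]   # close to theta
```

As the pole radius r shrinks away from 1, the peak broadens and its location drifts slightly from θ, illustrating the "ambiguous visual representation" point in the abstract.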

  7. Application of the maximum entropy method to profile analysis

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, N.; Kalceff, W. [University of Technology, Department of Applied Physics, Sydney, NSW (Australia); Cline, J.P. [National Institute of Standards and Technology, Gaithersburg, (United States)

    1999-12-01

    Full text: A maximum entropy (MaxEnt) method for analysing crystallite size- and strain-induced x-ray profile broadening is presented. This method treats the problems of determining the specimen profile, crystallite size distribution, and strain distribution in a general way by considering them as inverse problems. A common difficulty faced by many experimenters is their inability to determine a well-conditioned solution of the integral equation, which preserves the positivity of the profile or distribution. We show that the MaxEnt method overcomes this problem, while also enabling a priori information, in the form of a model, to be introduced into it. Additionally, we demonstrate that the method is fully quantitative, in that uncertainties in the solution profile or solution distribution can be determined and used in subsequent calculations, including mean particle sizes and rms strain. An outline of the MaxEnt method is presented for the specific problems of determining the specimen profile and crystallite or strain distributions for the correspondingly broadened profiles. This approach offers an alternative to standard methods such as those of Williamson-Hall and Warren-Averbach. An application of the MaxEnt method is demonstrated in the analysis of alumina size-broadened diffraction data (from NIST, Gaithersburg). It is used to determine the specimen profile and column-length distribution of the scattering domains. Finally, these results are compared with the corresponding Williamson-Hall and Warren-Averbach analyses. Copyright (1999) Australian X-ray Analytical Association Inc.

  8. Constructing Maximum Entropy Language Models for Movie Review Subjectivity Analysis

    Institute of Scientific and Technical Information of China (English)

    Bo Chen; Hui He; Jun Guo

    2008-01-01

    Document subjectivity analysis has become an important aspect of web text content mining. This problem is similar to traditional text categorization, thus many related classification techniques can be adapted here. However, there is one significant difference that more language or semantic information is required for better estimating the subjectivity of a document. Therefore, in this paper, our focuses are mainly on two aspects. One is how to extract useful and meaningful language features, and the other is how to construct appropriate language models efficiently for this special task. For the first issue, we conduct a Global-Filtering and Local-Weighting strategy to select and evaluate language features in a series of n-grams with different orders and within various distance-windows. For the second issue, we adopt Maximum Entropy (MaxEnt) modeling methods to construct our language model framework. Besides the classical MaxEnt models, we have also constructed two kinds of improved models with Gaussian and exponential priors respectively. Detailed experiments given in this paper show that with well selected and weighted language features, MaxEnt models with exponential priors are significantly more suitable for the text subjectivity analysis task.

  9. Application of the maximum relative entropy method to the physics of ferromagnetic materials

    Science.gov (United States)

    Giffin, Adom; Cafaro, Carlo; Ali, Sean Alan

    2016-08-01

    It is known that the Maximum relative Entropy (MrE) method can be used to both update and approximate probability distributions functions in statistical inference problems. In this manuscript, we apply the MrE method to infer magnetic properties of ferromagnetic materials. In addition to comparing our approach to more traditional methodologies based upon the Ising model and Mean Field Theory, we also test the effectiveness of the MrE method on conventionally unexplored ferromagnetic materials with defects.

  10. Scatter factor confidence interval estimate of least square maximum entropy quantile function for small samples

    Institute of Scientific and Technical Information of China (English)

    Wu Fuxian; Wen Weidong

    2016-01-01

    Classic maximum entropy quantile function method (CMEQFM) based on the probability weighted moments (PWMs) can accurately estimate the quantile function of random variable on small samples, but inaccurately on the very small samples. To overcome this weakness, least square maximum entropy quantile function method (LSMEQFM) and that with constraint condition (LSMEQFMCC) are proposed. To improve the confidence level of quantile function estimation, scatter factor method is combined with maximum entropy method to estimate the confidence interval of quantile function. From the comparisons of these methods about two common probability distributions and one engineering application, it is showed that CMEQFM can estimate the quantile function accurately on the small samples but inaccurately on the very small samples (10 samples); LSMEQFM and LSMEQFMCC can be successfully applied to the very small samples; with consideration of the constraint condition on quantile function, LSMEQFMCC is more stable and computationally accurate than LSMEQFM; scatter factor confidence interval estimation method based on LSMEQFM or LSMEQFMCC has good estimation accuracy on the confidence interval of quantile function, and that based on LSMEQFMCC is the most stable and accurate method on the very small samples (10 samples).

  11. Inferring Pairwise Interactions from Biological Data Using Maximum-Entropy Probability Models.

    Directory of Open Access Journals (Sweden)

    Richard R Stein

    2015-07-01

    Full Text Available Maximum entropy-based inference methods have been successfully used to infer direct interactions from biological datasets such as gene expression data or sequence ensembles. Here, we review undirected pairwise maximum-entropy probability models in two categories of data types, those with continuous and categorical random variables. As a concrete example, we present recently developed inference methods from the field of protein contact prediction and show that a basic set of assumptions leads to similar solution strategies for inferring the model parameters in both variable types. These parameters reflect interactive couplings between observables, which can be used to predict global properties of the biological system. Such methods are applicable to the important problems of protein 3-D structure prediction and association of gene-gene networks, and they enable potential applications to the analysis of gene alteration patterns and to protein design.

  12. Inferring Pairwise Interactions from Biological Data Using Maximum-Entropy Probability Models.

    Science.gov (United States)

    Stein, Richard R; Marks, Debora S; Sander, Chris

    2015-07-01

    Maximum entropy-based inference methods have been successfully used to infer direct interactions from biological datasets such as gene expression data or sequence ensembles. Here, we review undirected pairwise maximum-entropy probability models in two categories of data types, those with continuous and categorical random variables. As a concrete example, we present recently developed inference methods from the field of protein contact prediction and show that a basic set of assumptions leads to similar solution strategies for inferring the model parameters in both variable types. These parameters reflect interactive couplings between observables, which can be used to predict global properties of the biological system. Such methods are applicable to the important problems of protein 3-D structure prediction and association of gene-gene networks, and they enable potential applications to the analysis of gene alteration patterns and to protein design.
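For continuous variables, the undirected pairwise maximum-entropy model reviewed here is the multivariate Gaussian, and the direct couplings are (up to sign) the off-diagonal entries of the inverse covariance (precision) matrix. A sketch on synthetic data in which only variables 1 and 2 interact directly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: x2 depends directly on x1, x3 is independent of both.
n = 50_000
x1 = rng.standard_normal(n)
x2 = 0.8 * x1 + 0.6 * rng.standard_normal(n)
x3 = rng.standard_normal(n)
X = np.column_stack([x1, x2, x3])

# Pairwise maxent (Gaussian) fit: couplings live in the precision matrix;
# a zero off-diagonal entry means no direct interaction.
precision = np.linalg.inv(np.cov(X, rowvar=False))
```

The (1,2) entry comes out large in magnitude while the entries involving x3 are near zero, which is the distinction between direct interactions and mere correlations that motivates these methods in protein contact prediction.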

  13. Maximum-Entropy Meshfree Method for Compressible and Near-Incompressible Elasticity

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz, A; Puso, M A; Sukumar, N

    2009-09-04

    Numerical integration errors and volumetric locking in the near-incompressible limit are two outstanding issues in Galerkin-based meshfree computations. In this paper, we present a modified Gaussian integration scheme on background cells for meshfree methods that alleviates errors in numerical integration and ensures patch test satisfaction to machine precision. Secondly, a locking-free small-strain elasticity formulation for meshfree methods is proposed, which draws on developments in assumed strain methods and nodal integration techniques. In this study, maximum-entropy basis functions are used; however, the generality of our approach permits the use of any meshfree approximation. Various benchmark problems in two-dimensional compressible and near-incompressible small strain elasticity are presented to demonstrate the accuracy and optimal convergence in the energy norm of the maximum-entropy meshfree formulation.
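In 1-D, maximum-entropy basis functions of the kind used here can be computed with a single Lagrange multiplier enforcing linear reproducibility on top of the partition of unity. A minimal sketch (the nodes are illustrative, and practical meshfree codes add a locality prior, which this sketch omits):

```python
import numpy as np
from scipy.optimize import brentq

nodes = np.array([0.0, 0.5, 1.0])   # nodal coordinates (illustrative)

def maxent_basis(x):
    # phi_i(x) ∝ exp(-lam * (nodes_i - x)); lam enforces sum_i phi_i * nodes_i = x.
    def constraint(lam):
        w = np.exp(-lam * (nodes - x))
        p = w / w.sum()
        return p @ nodes - x
    lam = brentq(constraint, -200.0, 200.0)
    w = np.exp(-lam * (nodes - x))
    return w / w.sum()

phi = maxent_basis(0.25)
# By construction: phi sums to 1 (partition of unity), reproduces x linearly,
# and every phi_i is strictly positive.
```

Positivity of the basis functions is what makes them behave like probabilities, which is the Shannon-entropy connection the formulation exploits.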

  14. Maximum Relative Entropy Updating and the Value of Learning

    Directory of Open Access Journals (Sweden)

    Patryk Dziurosz-Serafinowicz

    2015-03-01

    Full Text Available We examine the possibility of justifying the principle of maximum relative entropy (MRE considered as an updating rule by looking at the value of learning theorem established in classical decision theory. This theorem captures an intuitive requirement for learning: learning should lead to new degrees of belief that are expected to be helpful and never harmful in making decisions. We call this requirement the value of learning. We consider the extent to which learning rules by MRE could satisfy this requirement and so could be a rational means for pursuing practical goals. First, by representing MRE updating as a conditioning model, we show that MRE satisfies the value of learning in cases where learning prompts a complete redistribution of one’s degrees of belief over a partition of propositions. Second, we show that the value of learning may not be generally satisfied by MRE updates in cases of updating on a change in one’s conditional degrees of belief. We explain that this is so because, contrary to what the value of learning requires, one’s prior degrees of belief might not be equal to the expectation of one’s posterior degrees of belief. This, in turn, points towards a more general moral: that the justification of MRE updating in terms of the value of learning may be sensitive to the context of a given learning experience. Moreover, this lends support to the idea that MRE is neither a universal nor a mechanical updating rule, but rather a rule whose application and justification may be context-sensitive.

  15. The mechanics of granitoid systems and maximum entropy production rates.

    Science.gov (United States)

    Hobbs, Bruce E; Ord, Alison

    2010-01-13

    A model for the formation of granitoid systems is developed involving melt production spatially below a rising isotherm that defines melt initiation. Production of the melt volumes necessary to form granitoid complexes within 10^4-10^7 years demands control of the isotherm velocity by melt advection. This velocity is one control on the melt flux generated spatially just above the melt isotherm, which is the control valve for the behaviour of the complete granitoid system. Melt transport occurs in conduits initiated as sheets or tubes comprising melt inclusions arising from Gurson-Tvergaard constitutive behaviour. Such conduits appear as leucosomes parallel to lineations and foliations, and as ductile and brittle dykes. The melt flux generated at the melt isotherm controls the position of the melt solidus isotherm and hence the physical height of the Transport/Emplacement Zone. A conduit width-selection process, driven by changes in melt viscosity and constitutive behaviour, operates within the Transport Zone to progressively increase the width of apertures upwards. Melt can also be driven horizontally by gradients in topography; these horizontal fluxes can be similar in magnitude to vertical fluxes. Fluxes induced by deformation can compete with both buoyancy- and topography-driven flow over all length scales and result locally in transient 'ponds' of melt. Pluton emplacement is controlled by the transition in constitutive behaviour of the melt/magma from elastic-viscous at high temperatures to elastic-plastic-viscous approaching the melt solidus, enabling finite-thickness plutons to develop. The system involves coupled feedback processes that grow at the expense of heat supplied to the system and compete with melt advection. The result is that limits are placed on the size and time scale of the system. Optimal characteristics of the system coincide with a state of maximum entropy production rate.

  16. Unification of Field Theory and Maximum Entropy Methods for Learning Probability Densities

    CERN Document Server

    Kinney, Justin B

    2014-01-01

    Bayesian field theory and maximum entropy are two methods for learning smooth probability distributions (a.k.a. probability densities) from finite sampled data. Both methods were inspired by statistical physics, but the relationship between them has remained unclear. Here I show that Bayesian field theory subsumes maximum entropy density estimation. In particular, the most common maximum entropy methods are shown to be limiting cases of Bayesian inference using field theory priors that impose no boundary conditions on candidate densities. This unification provides a natural way to test the validity of the maximum entropy assumption on one's data. It also provides a better-fitting nonparametric density estimate when the maximum entropy assumption is rejected.

  17. Maximum entropy principle for stationary states underpinned by stochastic thermodynamics.

    Science.gov (United States)

    Ford, Ian J

    2015-11-01

    The selection of an equilibrium state by maximizing the entropy of a system, subject to certain constraints, is often powerfully motivated as an exercise in logical inference, a procedure where conclusions are reached on the basis of incomplete information. But such a framework can be more compelling if it is underpinned by dynamical arguments, and we show how this can be provided by stochastic thermodynamics, where an explicit link is made between the production of entropy and the stochastic dynamics of a system coupled to an environment. The separation of entropy production into three components allows us to select a stationary state by maximizing the change, averaged over all realizations of the motion, in the principal relaxational or nonadiabatic component, equivalent to requiring that this contribution to the entropy production should become time independent for all realizations. We show that this recovers the usual equilibrium probability density function (pdf) for a conservative system in an isothermal environment, as well as the stationary nonequilibrium pdf for a particle confined to a potential under nonisothermal conditions, and a particle subject to a constant nonconservative force under isothermal conditions. The two remaining components of entropy production account for a recently discussed thermodynamic anomaly between over- and underdamped treatments of the dynamics in the nonisothermal stationary state.

  19. Nonuniform sampling and maximum entropy reconstruction in multidimensional NMR.

    Science.gov (United States)

    Hoch, Jeffrey C; Maciejewski, Mark W; Mobli, Mehdi; Schuyler, Adam D; Stern, Alan S

    2014-02-18

    NMR spectroscopy is one of the most powerful and versatile analytic tools available to chemists. The discrete Fourier transform (DFT) played a seminal role in the development of modern NMR, including the multidimensional methods that are essential for characterizing complex biomolecules. However, it suffers from well-known limitations: chiefly the difficulty in obtaining high-resolution spectral estimates from short data records. Because the time required to perform an experiment is proportional to the number of data samples, this problem imposes a sampling burden for multidimensional NMR experiments. At high magnetic field, where spectral dispersion is greatest, the problem becomes particularly acute. Consequently, multidimensional NMR experiments that rely on the DFT must either sacrifice resolution in order to be completed in reasonable time or use inordinate amounts of time to achieve the potential resolution afforded by high-field magnets. Maximum entropy (MaxEnt) reconstruction is a non-Fourier method of spectrum analysis that can provide high-resolution spectral estimates from short data records. It can also be used with nonuniformly sampled data sets. Since resolution is substantially determined by the largest evolution time sampled, nonuniform sampling enables high resolution while avoiding the need to uniformly sample at large numbers of evolution times. The Nyquist sampling theorem does not apply to nonuniformly sampled data, and artifacts that occur with the use of nonuniform sampling can be viewed as frequency-aliased signals. Strategies for suppressing nonuniform sampling artifacts include the careful design of the sampling scheme and special methods for computing the spectrum. Researchers now routinely report that they can complete an N-dimensional NMR experiment 3^(N-1) times faster (a 3D experiment in one ninth of the time). As a result, high-resolution three- and four-dimensional experiments that were prohibitively time consuming are now practical.
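
    The sampling-time arithmetic quoted above is easy to verify (a trivial sketch of the reported 3^(N-1) speedup factor):

```python
# Uniform sampling of an N-dimensional NMR experiment takes ~3^(N-1) times
# longer than the reported nonuniformly sampled version.
for n_dim in (2, 3, 4):
    speedup = 3 ** (n_dim - 1)
    print(f"{n_dim}D experiment: ~{speedup}x faster")  # 3x, 9x, 27x
```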

  20. A MAXIMUM ENTROPY CHUNKING MODEL WITH N-FOLD TEMPLATE CORRECTION

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This letter presents a new chunking method based on a Maximum Entropy (ME) model with an N-fold template correction model. First, two types of machine learning models are described. Based on an analysis of the two models, a chunking model that combines the benefits of the conditional probability model and the rule-based model is then proposed. The selection of features and rule templates in the chunking model is discussed. Experimental results for the CoNLL-2000 corpus show that this approach achieves impressive accuracy in terms of the F-score: 92.93%. Compared with the ME model and the ME Markov model, the new chunking model achieves better performance.
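
    At its core, a maximum entropy chunking model is a conditional softmax over binary features. A minimal sketch (the features, tags, and training pairs below are invented toy data, and the paper's template-correction stage is omitted):

```python
import numpy as np

feats = ["w=the", "w=dog", "w=runs", "prev=B-NP"]   # hypothetical binary features
tags = ["B-NP", "I-NP", "B-VP"]                     # hypothetical chunk tags

# Tiny training set: (indices of active features, gold tag index).
data = [([0], 0), ([1, 3], 1), ([2], 2)]

W = np.zeros((len(tags), len(feats)))               # one weight per (tag, feature)
for _ in range(500):                                # gradient ascent on log-likelihood
    for active, gold in data:
        scores = W[:, active].sum(axis=1)
        p = np.exp(scores - scores.max())
        p /= p.sum()                                # conditional tag distribution
        for j in active:                            # observed minus expected counts
            W[:, j] -= 0.5 * p
            W[gold, j] += 0.5

def predict(active):
    return tags[int(np.argmax(W[:, active].sum(axis=1)))]

assert predict([0]) == "B-NP"
assert predict([1, 3]) == "I-NP"
assert predict([2]) == "B-VP"
```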

  1. Autonomous entropy-based intelligent experimental design

    Science.gov (United States)

    Malakar, Nabin Kumar

    2011-07-01

    The aim of this thesis is to explore the application of probability and information theory in experimental design, and to do so in a way that combines what we know about inference and inquiry in a comprehensive and consistent manner. Present day scientific frontiers involve data collection at an ever-increasing rate. This requires that we find a way to collect the most relevant data in an automated fashion. By following the logic of the scientific method, we couple an inference engine with an inquiry engine to automate the iterative process of scientific learning. The inference engine involves Bayesian machine learning techniques to estimate model parameters based upon both prior information and previously collected data, while the inquiry engine implements data-driven exploration. By choosing an experiment whose distribution of expected results has the maximum entropy, the inquiry engine selects the experiment that maximizes the expected information gain. The coupled inference and inquiry engines constitute an autonomous learning method for scientific exploration. We apply it to a robotic arm to demonstrate the efficacy of the method. Optimizing inquiry involves searching for an experiment that promises, on average, to be maximally informative. If the set of potential experiments is described by many parameters, the search involves a high-dimensional entropy space. In such cases, a brute force search method will be slow and computationally expensive. We develop an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment. This helps to reduce the number of computations necessary to find the optimal experiment. We also extended the method of maximizing entropy, and developed a method of maximizing joint entropy so that it could be used as a principle of collaboration between two robots. 
This is a major achievement of this thesis, as it allows information-based collaboration between two robotic units toward a shared objective.
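
    The inquiry-engine selection rule described above can be sketched as follows (the candidate experiments and their predicted outcome distributions are hypothetical):

```python
import numpy as np

def shannon_entropy(p):
    """Entropy of a discrete distribution of expected experimental outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Hypothetical predictive outcome distributions for three candidate experiments.
candidates = {
    "exp_A": [0.97, 0.01, 0.01, 0.01],  # outcome nearly certain: little to learn
    "exp_B": [0.25, 0.25, 0.25, 0.25],  # maximally uncertain: most informative
    "exp_C": [0.50, 0.30, 0.10, 0.10],
}

# Pick the experiment whose predicted-outcome distribution has maximum entropy.
best = max(candidates, key=lambda k: shannon_entropy(candidates[k]))
print(best)  # exp_B
```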

  2. A maximum entropy distribution for wave heights of non-linear sea waves

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Based on the maximum entropy principle, a probability density function (PDF) for the zero-crossing wave height (H) of random waves is derived in the simple form f_n(H) = αH^γ e^(-βH^n) (n is a selectable positive integer) by solving a variational problem subject to some quite general constraints. This PDF maximizes the information entropy of H; it is applicable to nonlinear sea waves with large uncertainty, and its parameters α, γ and β can be simply determined from available data. Comparisons between the PDF with n = 3 and n = 4 and the observed distributions of H from wave records measured in the East China Sea and in a wind-wave tunnel show fairly satisfactory agreement.
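
    A numerical sketch of the derived PDF form f_n(H) = αH^γ e^(-βH^n); the parameter values below are illustrative, not those fitted in the paper:

```python
import numpy as np

def trapezoid(y, x):
    """Composite trapezoidal quadrature."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * (x[1:] - x[:-1])))

gamma, beta, n = 2.0, 0.5, 3            # illustrative values, not fitted to data
H = np.linspace(1e-6, 10.0, 200001)
f = H**gamma * np.exp(-beta * H**n)     # unnormalized f_n(H)
alpha = 1.0 / trapezoid(f, H)           # alpha fixed by normalization
pdf = alpha * f

# For gamma = 2, n = 3 the integral is 1/(3*beta), so alpha = 3*beta = 1.5 here.
assert abs(alpha - 1.5) < 1e-3
assert abs(trapezoid(pdf, H) - 1.0) < 1e-9
```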

  3. A Maximum Margin Learning Machine Based on Entropy Concept and Kernel Density Estimation

    Institute of Scientific and Technical Information of China (English)

    刘忠宝; 王士同

    2011-01-01

    In order to circumvent the deficiencies of Support Vector Machine (SVM) and its improved algorithms, this paper presents a Maximum-margin Learning Machine based on Entropy concept and Kernel density estimation (MLMEK). In MLMEK, data distributions in samples are represented by kernel density estimation, and classification uncertainties are represented by entropy. MLMEK takes boundary data between classes and inner data in each class seriously, so it performs better than traditional SVM. MLMEK can handle both two-class and one-class pattern classification. Experimental results obtained from UCI data sets verify that the algorithm proposed in this paper is effective and competitive.

  4. Maximum entropy production: Can it be used to constrain conceptual hydrological models?

    Science.gov (United States)

    M.C. Westhoff; E. Zehe

    2013-01-01

    In recent years, optimality principles have been proposed to constrain hydrological models. The principle of maximum entropy production (MEP) is one of the proposed principles and is subject of this study. It states that a steady state system is organized in such a way that entropy production is maximized. Although successful applications have been reported in...

  5. Degradation analysis of machine tool spindle based on maximum entropy and discrimination information

    Institute of Scientific and Technical Information of China (English)

    董新峰; 李郝林; 余慧杰

    2013-01-01

    The methods of maximum entropy and discrimination information were applied to the analysis of the X2-direction degradation of an M1432B grinding machine. The maximum entropy principle was used to obtain an accurate maximum entropy probability density distribution of the vibration signal. Discrimination information was then used to quantify the variations of this maximum entropy probability density distribution, from which the state of the machine tool spindle system can be judged. The results show that the workpiece spindle in the example exhibits slight degradation in the X2 direction.

  6. Research on Echo Signal Detection Technology Based on Maximum Entropy Spectral Estimation

    Institute of Scientific and Technical Information of China (English)

    李红; 雷志勇

    2011-01-01

    Maximum entropy spectral estimation and an LMS adaptive algorithm were proposed to extract the reflected echo signal of a laser ranging system. The theory of maximum entropy spectral estimation was studied, the Burg algorithm was used to obtain the relevant parameters of the AR model, and an LMS adaptive filter was designed to extract the useful signal from the weak echo. The application of Burg maximum entropy spectral estimation to echo signal detection in laser ranging systems was analyzed. Simulation results show that maximum entropy spectral estimation combined with the LMS adaptive algorithm can effectively extract the laser echo signal from background noise.
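
    Burg's maximum entropy spectral estimation can be sketched as follows (the test signal and model order are made up for illustration; the paper's LMS filtering stage is omitted):

```python
import numpy as np

def arburg(x, order):
    """Burg's method: AR coefficients and residual power for a max-entropy spectrum."""
    x = np.asarray(x, dtype=float)
    a = np.array([1.0])
    E = float(np.mean(x * x))
    ef, eb = x.copy(), x.copy()
    for _ in range(order):
        efp, ebp = ef[1:], eb[:-1]                        # shifted error sequences
        k = -2.0 * np.dot(ebp, efp) / (np.dot(efp, efp) + np.dot(ebp, ebp))
        ef, eb = efp + k * ebp, ebp + k * efp             # update prediction errors
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        E *= 1.0 - k * k                                  # residual power update
    return a, E

rng = np.random.default_rng(0)
t = np.arange(1024)
x = np.sin(2 * np.pi * 0.1 * t) + 0.1 * rng.standard_normal(1024)  # tone in noise
a, E = arburg(x, order=8)

freqs = np.linspace(0.0, 0.5, 2001)
A = np.polyval(a[::-1], np.exp(-2j * np.pi * freqs))      # A(e^{-j2*pi*f})
psd = E / np.abs(A) ** 2                                  # AR (max-entropy) spectrum
assert abs(freqs[np.argmax(psd)] - 0.1) < 0.005           # peak at the tone frequency
```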

  7. The equivalence of minimum entropy production and maximum thermal efficiency in endoreversible heat engines

    Directory of Open Access Journals (Sweden)

    Y. Haseli

    2016-05-01

    Full Text Available The objective of this study is to investigate the thermal efficiency and power production of typical models of endoreversible heat engines at the regime of minimum entropy generation rate. The study considers the Curzon-Ahlborn engine, the Novikov’s engine, and the Carnot vapor cycle. The operational regimes at maximum thermal efficiency, maximum power output and minimum entropy production rate are compared for each of these engines. The results reveal that in an endoreversible heat engine, a reduction in entropy production corresponds to an increase in thermal efficiency. The three criteria of minimum entropy production, the maximum thermal efficiency, and the maximum power may become equivalent at the condition of fixed heat input.
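
    The relation between the Carnot bound and the Curzon-Ahlborn efficiency at maximum power can be checked numerically (the reservoir temperatures below are illustrative, not from the paper):

```python
import math

T_hot, T_cold = 600.0, 300.0                       # illustrative reservoir temperatures

eta_carnot = 1.0 - T_cold / T_hot                  # reversible upper bound
eta_max_power = 1.0 - math.sqrt(T_cold / T_hot)    # Curzon-Ahlborn efficiency at max power

print(f"Carnot efficiency:       {eta_carnot:.3f}")     # 0.500
print(f"Efficiency at max power: {eta_max_power:.3f}")  # 0.293
assert eta_max_power < eta_carnot
```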

  8. Application of maximum entropy method for droplet size distribution prediction using instability analysis of liquid sheet

    Science.gov (United States)

    Movahednejad, E.; Ommi, F.; Hosseinalipour, S. M.; Chen, C. P.; Mahdavi, S. A.

    2011-12-01

    This paper describes the implementation of the instability analysis of wave growth on liquid jet surface, and maximum entropy principle (MEP) for prediction of droplet diameter distribution in primary breakup region. The early stage of the primary breakup, which contains the growth of wave on liquid-gas interface, is deterministic; whereas the droplet formation stage at the end of primary breakup is random and stochastic. The stage of droplet formation after the liquid bulk breakup can be modeled by statistical means based on the maximum entropy principle. The MEP provides a formulation that predicts the atomization process while satisfying constraint equations based on conservations of mass, momentum and energy. The deterministic aspect considers the instability of wave motion on jet surface before the liquid bulk breakup using the linear instability analysis, which provides information of the maximum growth rate and corresponding wavelength of instabilities in breakup zone. The two sub-models are coupled together using momentum source term and mean diameter of droplets. This model is also capable of considering drag force on droplets through gas-liquid interaction. The predicted results compared favorably with the experimentally measured droplet size distributions for hollow-cone sprays.

  9. Nuclear Enhanced X-ray Maximum Entropy Method Used to Analyze Local Distortions in Simple Structures

    DEFF Research Database (Denmark)

    Christensen, Sebastian; Bindzus, Niels; Christensen, Mogens

    …the ideal, undistorted rock-salt structure. NEXMEM employs a simple procedure to normalize extracted structure factors to the atomic form factors. The NDD is reconstructed by performing maximum entropy calculations on the normalized structure factors. NEXMEM has been validated by testing against simulated… In addition, we have applied NEXMEM to multi-temperature synchrotron powder X-ray diffraction data collected on PbX. Based on powder diffraction data, our study demonstrates that NEXMEM successfully improves the atomic resolution over standard MEM. This new tool aids our understanding of the local distortions…

  10. Use of maximum entropy principle with Lagrange multipliers extends the feasibility of elementary mode analysis.

    Science.gov (United States)

    Zhao, Quanyu; Kurata, Hiroyuki

    2010-08-01

    Elementary mode (EM) analysis is potentially effective in integrating transcriptome or proteome data into metabolic network analyses and in exploring the mechanism of how phenotypic or metabolic flux distribution is changed with respect to environmental and genetic perturbations. The EM coefficients (EMCs) indicate the quantitative contribution of their associated EMs and can be estimated by maximizing Shannon's entropy as a general objective function, as in our previous study, but the use of EMCs is still restricted to relatively small-scale networks. We propose a fast and universal method that optimizes hundreds of thousands of EMCs under the constraint of the maximum entropy principle (MEP). Lagrange multipliers (LMs) are applied to maximize the Shannon's entropy-based objective function, analytically solving each EMC as a function of the LMs. Consequently, the number of search variables, the EMC number, is dramatically reduced to the reaction number. To demonstrate the feasibility of the MEP with Lagrange multipliers (MEPLM), it is coupled with enzyme control flux (ECF) to predict the flux distributions of Escherichia coli and Saccharomyces cerevisiae for different conditions (gene deletion, adaptive evolution, temperature, and dilution rate) and to provide a quantitative understanding of how metabolic or physiological states are changed in response to these genetic or environmental perturbations at the elementary mode level. It is shown that the ECF-based method is a feasible framework for the prediction of metabolic flux distribution by integrating enzyme activity data into EMs in response to genetic and environmental perturbations.
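
    The key reduction described above, where each EM coefficient becomes an analytic function of one Lagrange multiplier per constrained reaction, can be sketched on a toy problem (the matrix and target fluxes below are hypothetical, not the paper's networks):

```python
import numpy as np

# Toy problem: 5 elementary-mode coefficients p_i must reproduce 2 measured
# reaction fluxes while maximizing Shannon entropy.
A = np.array([[1.0, 0.0, 1.0, 2.0, 0.5],
              [0.0, 1.0, 1.0, 0.5, 2.0]])   # contribution of each EM to each flux
target = np.array([0.8, 0.6])               # measured fluxes (constraints)

# One Lagrange multiplier per reaction (2), not per EM (5): the search space
# shrinks from the number of modes to the number of constrained reactions.
lam = np.zeros(2)
for _ in range(50000):                      # gradient descent on the convex dual
    w = np.exp(-lam @ A)
    p = w / w.sum()                         # every EMC is analytic in lam
    lam -= 0.05 * (target - A @ p)          # step along the constraint residual

assert np.allclose(A @ p, target, atol=1e-6)        # constraints satisfied
assert p.min() > 0.0 and abs(p.sum() - 1.0) < 1e-12 # valid max-entropy weights
```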

  11. Entropy-based benchmarking methods

    OpenAIRE

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of the Denton (1971) method and the growth preservation method of Causey and Trager (1981) may violate this principle, while its requirements are explicitly taken into account in the proposed entropy-based benchmarking methods. Our illustrati...

  12. Application of maximum-entropy spectral estimation to deconvolution of XPS data. [X-ray Photoelectron Spectroscopy

    Science.gov (United States)

    Vasquez, R. P.; Klein, J. D.; Barton, J. J.; Grunthaner, F. J.

    1981-01-01

    A comparison is made between maximum-entropy spectral estimation and traditional methods of deconvolution used in electron spectroscopy. The maximum-entropy method is found to have higher resolution-enhancement capabilities and, if the broadening function is known, can be used with no adjustable parameters with a high degree of reliability. The method and its use in practice are briefly described, and a criterion is given for choosing the optimal order for the prediction filter based on the prediction-error power sequence. The method is demonstrated on a test case and applied to X-ray photoelectron spectra.

  13. Maximum entropy analysis of flow and reaction networks

    Science.gov (United States)

    Niven, Robert K.; Abel, Markus; Schlegel, Michael; Waldrip, Steven H.

    2015-01-01

    We present a generalised MaxEnt method to infer the stationary state of a flow network, subject to "observable" constraints on expectations of various parameters, as well as "physical" constraints arising from frictional properties (resistance functions) and conservation laws (Kirchhoff laws). The method invokes an entropy defined over all uncertainties in the system, in this case the internal and external flow rates and potential differences. The proposed MaxEnt framework is readily extendable to the analysis of networks with uncertainty in the network structure itself.

  14. Determining Dynamical Path Distributions using Maximum Relative Entropy

    Science.gov (United States)

    2015-05-31

    …The selected joint posterior P_new(x, θ) is that which maximizes the relative entropy S[P, P_old] = −∫ P(x, θ) log [P(x, θ)/P_old(x, θ)] dx dθ, subject to the appropriate constraints (parameters can be discrete as well). P_old(x, θ) contains our prior information, which we call the joint prior. To be explicit, P_old(x, θ) = P_old(x) P_old(θ|x), where P_old(x) is the traditional Bayesian prior and P_old(θ|x) is the likelihood. It is important to…

  15. Disproportionate Allocation of Indirect Costs at Individual-Farm Level Using Maximum Entropy

    Directory of Open Access Journals (Sweden)

    Markus Lips

    2017-08-01

    Full Text Available This paper addresses the allocation of indirect or joint costs among farm enterprises, and elaborates two maximum entropy models, the basic CoreModel and the InequalityModel, which additionally includes inequality restrictions in order to incorporate knowledge from production technology. Representing the indirect costing approach, both models address the individual-farm level and use standard costs from farm-management literature as allocation bases. They provide a disproportionate allocation, with the distinctive feature that enterprises with large allocation bases face stronger adjustments than enterprises with small ones, bringing indirect costing closer to reality. Based on crop-farm observations from the Swiss Farm Accountancy Data Network (FADN), including up to 36 observations per enterprise, both models are compared with a proportional allocation as reference base. The mean differences of the enterprise’s allocated labour inputs and machinery costs are in a range of up to ±35% and ±20% for the CoreModel and InequalityModel, respectively. We conclude that the choice of allocation methods has a strong influence on the resulting indirect costs. Furthermore, the application of inequality restrictions is a precondition to make the merits of the maximum entropy principle accessible for the allocation of indirect costs.

  16. Unification of field theory and maximum entropy methods for learning probability densities.

    Science.gov (United States)

    Kinney, Justin B

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.

  17. Maximum Entropy Production vs. Kolmogorov-Sinai Entropy in a Constrained ASEP Model

    Directory of Open Access Journals (Sweden)

    Martin Mihelich

    2014-02-01

    Full Text Available The asymmetric simple exclusion process (ASEP) has become a paradigmatic toy-model of a non-equilibrium system, and much effort has been made in the past decades to compute exactly its statistics for given dynamical rules. Here, a different approach is developed; analogously to the equilibrium situation, we consider that the dynamical rules are not exactly known. Allowing for the transition rate to vary, we show that the dynamical rules that maximize the entropy production and those that maximize the rate of variation of the dynamical entropy, known as the Kolmogorov-Sinai entropy, coincide with good accuracy. We study the dependence of this agreement on the size of the system and the couplings with the reservoirs, for the original ASEP and a variant with Langmuir kinetics.

  18. On the estimation of the curvatures and bending rigidity of membrane networks via a local maximum-entropy approach

    CERN Document Server

    Fraternali, Fernando; Marcelli, Gianluca

    2011-01-01

    We present a meshfree method for the curvature estimation of membrane networks based on the Local Maximum Entropy approach recently presented in (Arroyo and Ortiz, 2006). A continuum regularization of the network is carried out by balancing the maximization of the information entropy corresponding to the nodal data, with the minimization of the total width of the shape functions. The accuracy and convergence properties of the given curvature prediction procedure are assessed through numerical applications to benchmark problems, which include coarse grained molecular dynamics simulations of the fluctuations of red blood cell membranes (Marcelli et al., 2005; Hale et al., 2009). We also provide an energetic discrete-to-continuum approach to the prediction of the zero-temperature bending rigidity of membrane networks, which is based on the integration of the local curvature estimates. The Local Maximum Entropy approach is easily applicable to the continuum regularization of fluctuating membranes, and the predict...

  19. A maximum entropy framework for non-exponential distributions

    CERN Document Server

    Peterson, Jack; Dill, Ken A

    2015-01-01

    Probability distributions having power-law tails are observed in a broad range of social, economic, and biological systems. We describe here a potentially useful common framework. We derive distribution functions {p_k} for situations in which a 'joiner particle' k pays some form of price to enter a 'community' of size k-1, where costs are subject to economies-of-scale (EOS). Maximizing the Boltzmann-Gibbs-Shannon entropy subject to this energy-like constraint predicts a distribution having a power-law tail; it reduces to the Boltzmann distribution in the absence of EOS. We show that the predicted function gives excellent fits to 13 different distribution functions, ranging from friendship links in social networks, to protein-protein interactions, to the severity of terrorist attacks. This approach may give useful insights into when to expect power-law distributions in the natural and social sciences.
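
    The mechanism in this abstract can be sketched numerically: a logarithmic (economies-of-scale) joining cost turns the Boltzmann factor into a power law, while a linear cost keeps the usual exponential tail (the multiplier lam and cost scale mu below are arbitrary):

```python
import numpy as np

k = np.arange(1.0, 10001.0)            # community sizes
lam, mu = 1.5, 1.0                     # arbitrary multiplier and cost scale

p = np.exp(-lam * mu * np.log(k))      # EOS cost ~ mu*ln(k)  ->  p_k ∝ k^(-lam*mu)
p /= p.sum()
q = np.exp(-lam * k)                   # linear cost -> ordinary Boltzmann tail
q /= q.sum()

# Slope on a log-log plot is exactly -lam*mu: the signature of a power-law tail.
slope = (np.log(p[999]) - np.log(p[99])) / (np.log(k[999]) - np.log(k[99]))
assert abs(slope + lam * mu) < 1e-9
```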

  20. General proof of (maximum) entropy principle in Lovelock gravity

    CERN Document Server

    Cao, Li-Ming

    2014-01-01

    We consider a static self-gravitating perfect fluid system in Lovelock gravity theory. For a spatial region on the hypersurface orthogonal to the static Killing vector, using Tolman's law of temperature, the assumption of a fixed total particle number inside the spatial region, and all variations (of the relevant fields) in which the induced metric and its first derivatives are fixed on the boundary of the spatial region, we can prove, with the help of the gravitational equations of the theory, a theorem stating that the total entropy of the fluid in this region takes an extremum value. A converse theorem can also be obtained by following the reverse process of our proof.

  1. General proof of (maximum) entropy principle in Lovelock gravity

    Science.gov (United States)

    Cao, Li-Ming; Xu, Jianfei

    2015-02-01

    We consider a static self-gravitating perfect fluid system in Lovelock gravity theory. For a spatial region on the hypersurface orthogonal to the static Killing vector, using Tolman's law of temperature, the assumption of a fixed total particle number inside the spatial region, and all variations (of the relevant fields) in which the induced metric and its first derivatives are fixed on the boundary of the spatial region, we can prove, with the help of the gravitational and fluid equations of the theory, a theorem stating that the total entropy of the fluid in this region takes an extremum value. A converse theorem can also be obtained by following the reverse process of our proof. We also propose a quasilocal definition of isolation for the system and explain the physical meaning of the boundary conditions in the proof of the theorems.

  2. The Second Law Today: Using Maximum-Minimum Entropy Generation

    Directory of Open Access Journals (Sweden)

    Umberto Lucia

    2015-11-01

    Full Text Available There are a great number of thermodynamic schools, independent of each other and lacking a unified general approach, with a particular split over non-equilibrium thermodynamics. In 1912, in relation to stationary non-equilibrium states, Ehrenfest posed the fundamental question of whether there exists a functional that achieves its extreme value for stable states, as entropy does for stationary states in equilibrium thermodynamics. Today, the new frontier branches of science and engineering, from power engineering to environmental sciences, from chaos to complex systems, from life sciences to nanosciences, etc., require a unified approach in order to optimize results and obtain a powerful treatment of non-equilibrium thermodynamics and open systems. In this paper, a generalization of the Gouy–Stodola approach is suggested as a possible answer to the Ehrenfest question.

  3. Improvement of the detector resolution in X-ray spectrometry by using the maximum entropy method

    Science.gov (United States)

    Fernández, Jorge E.; Scot, Viviana; Giulio, Eugenio Di; Sabbatucci, Lorenzo

    2015-11-01

    In every X-ray spectroscopy measurement, the influence of the detection system causes loss of information. Different mechanisms contribute to form the so-called detector response function (DRF): the detector efficiency, the escape of photons as a consequence of photoelectric or scattering interactions, the spectrum smearing due to the finite energy resolution, and, in solid-state detectors (SSDs), charge collection artifacts. To recover the original spectrum, it is necessary to remove the detector influence by solving the so-called inverse problem. The maximum entropy unfolding technique solves this problem by imposing a set of constraints, taking advantage of the known a priori information and preserving the positive-definite character of the X-ray spectrum. This method has been included in the tool UMESTRAT (Unfolding Maximum Entropy STRATegy), which adopts a semi-automatic strategy to solve the unfolding problem based on a suitable combination of the codes MAXED and GRAVEL, developed at PTB. In the past, UMESTRAT proved capable of resolving characteristic peaks that a Si SSD recorded as overlapped, giving good qualitative results. In order to obtain quantitative results, UMESTRAT has been modified to include the additional constraint of the total number of photons in the spectrum, which can be easily determined by inverting the diagonal efficiency matrix. The features of the improved code are illustrated with examples of unfolding from three commonly used SSDs: Si, Ge, and CdTe. The quantitative unfolding can be considered a software improvement of the detector resolution.

  4. A New Maximum Entropy Probability Function for the Surface Elevation of Nonlinear Sea Waves

    Institute of Scientific and Technical Information of China (English)

    ZHANG Li-zhen; XU De-lun

    2005-01-01

    Based on the maximum entropy principle, a new probability density function (PDF) f(x) for the surface elevation of nonlinear sea waves, X, is derived by performing a coordinate transform of X and solving a variational problem subject to three constraint conditions on f(x). Compared with the maximum entropy PDFs presented previously, the new PDF has the following merits: (1) it has four parameters to be determined and hence can give a more refined fit to observed data and has wider applicability to nonlinear waves under different conditions; (2) these parameters are expressed in terms of the distribution moments of X in a relatively simple form and hence are easy to determine from observed data; (3) the PDF is free of the restriction of weak nonlinearity and can be used for sea waves in complicated conditions, such as those in shallow waters with complicated topography; and (4) the PDF is simple in form and hence convenient for theoretical and practical use. Laboratory wind-wave experiments have been conducted to test the competence of the new PDF for the surface elevation of nonlinear waves. The experimental results show that the new PDF gives a somewhat better fit to the laboratory wind-wave data than the well-known Gram-Charlier PDF and beta PDF.
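
The derivation strategy described here — maximize entropy subject to moment constraints on $f(x)$ — can be checked in its textbook special case: with only the mean and variance fixed, the maximum entropy density is Gaussian. The numerical comparison below is a generic illustration of the principle, not the paper's four-parameter PDF.

```python
import math

def differential_entropy(pdf, lo, hi, n=200000):
    """Numerical -integral of f ln f dx by the midpoint rule."""
    dx = (hi - lo) / n
    h = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = pdf(x)
        if f > 0:
            h -= f * math.log(f) * dx
    return h

# Two zero-mean, unit-variance densities:
gauss = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
b = 1 / math.sqrt(2)                      # Laplace scale giving unit variance
laplace = lambda x: math.exp(-abs(x) / b) / (2 * b)

h_gauss = differential_entropy(gauss, -12, 12)
h_laplace = differential_entropy(laplace, -12, 12)

# With only mean and variance constrained, the maxent density is Gaussian,
# so its entropy exceeds that of any other unit-variance density.
assert h_gauss > h_laplace
assert abs(h_gauss - 0.5 * math.log(2 * math.pi * math.e)) < 1e-3
```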

  5. A maximum entropy approach to separating noise from signal in bimodal affiliation networks

    CERN Document Server

    Dianati, Navid

    2016-01-01

    In practice, many empirical networks, including co-authorship and collocation networks, are unimodal projections of a bipartite data structure where one layer represents entities, the second layer consists of a number of sets representing affiliations, attributes, groups, etc., and an inter-layer link indicates membership of an entity in a set. The edge weight in the unimodal projection, which we refer to as a co-occurrence network, counts the number of sets to which both end-nodes are linked. Interpreting such dense networks requires statistical analysis that takes into account the bipartite structure of the underlying data. Here we develop a statistical significance metric for such networks based on a maximum entropy null model which preserves both the frequency sequence of the individuals/entities and the size sequence of the sets. Solving the maximum entropy problem is reduced to solving a system of nonlinear equations for which fast algorithms exist, thus eliminating the need for expensive Monte-Carlo sampling...
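
A minimal version of such a bipartite null model can be written down in the sparse regime, where the nonlinear system mentioned in the abstract reduces to the closed form $p_{i\alpha} = d_i s_\alpha / M$. This Chung–Lu-style simplification is our illustrative assumption — the paper solves the full nonlinear equations — but it already preserves both margin sequences in expectation:

```python
def bipartite_maxent_probs(entity_degrees, set_sizes):
    """Sparse-regime maxent link probabilities preserving both margins in
    expectation: p[i][a] = d_i * s_a / M, where M is the total link count."""
    M = sum(entity_degrees)
    assert M == sum(set_sizes), "both margin sequences must sum to M"
    return [[d * s / M for s in set_sizes] for d in entity_degrees]

degrees = [3, 2, 1]   # hypothetical entity (e.g., author) frequencies
sizes = [4, 2]        # hypothetical affiliation-set sizes
P = bipartite_maxent_probs(degrees, sizes)

# Expected margins reproduce the observed sequences exactly.
row_sums = [sum(row) for row in P]
col_sums = [sum(P[i][a] for i in range(len(degrees))) for a in range(len(sizes))]
assert all(abs(r - d) < 1e-12 for r, d in zip(row_sums, degrees))
assert all(abs(c - s) < 1e-12 for c, s in zip(col_sums, sizes))
```

In dense networks the probabilities must be capped below 1, which is exactly where the full nonlinear system becomes necessary.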

  6. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    Science.gov (United States)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α , and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.

  7. Night vision image fusion for target detection with improved 2D maximum entropy segmentation

    Science.gov (United States)

    Bai, Lian-fa; Liu, Ying-bin; Yue, Jiang; Zhang, Yi

    2013-08-01

    Infrared and low-light-level (LLL) images are used for night vision target detection. Given the characteristics of night vision imaging and the inability of traditional detection algorithms to segment and extract targets, we propose a method of infrared and LLL image fusion for target detection with improved 2D maximum entropy segmentation. Firstly, the two-dimensional histogram is improved using the gray level and the maximum gray level in a weighted area, and weights are selected to compute the maximum entropy for infrared and LLL image segmentation from this histogram. Compared with traditional maximum entropy segmentation, the algorithm is significantly more effective at target detection, background suppression, and target extraction. Then, the validity of the multi-dimensional-characteristics AND operation on infrared and LLL image feature-level fusion for target detection is verified. Experimental results show that the detection algorithm performs well for single- and multiple-target detection in complex backgrounds.
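
The 1D ancestor of the 2D criterion used in this paper is Kapur's maximum entropy thresholding: pick the threshold that maximizes the summed entropies of the background and foreground gray-level distributions. A minimal sketch on a hypothetical bimodal histogram follows; the paper's improved 2D variant additionally uses neighborhood gray levels and weighting.

```python
import math

def kapur_threshold(hist):
    """1D maximum-entropy (Kapur) threshold: choose t maximizing the sum of
    background and foreground entropies. `hist` is a gray-level histogram."""
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_h = 0, -1.0
    for t in range(1, len(p)):
        w0 = sum(p[:t])
        w1 = 1.0 - w0
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(q / w0 * math.log(q / w0) for q in p[:t] if q > 0)
        h1 = -sum(q / w1 * math.log(q / w1) for q in p[t:] if q > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Hypothetical bimodal histogram over 8 gray levels: dark background, bright target.
hist = [40, 30, 5, 1, 1, 5, 25, 35]
t = kapur_threshold(hist)
assert 3 <= t <= 5   # threshold falls in the valley between the two modes
```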

  8. An Efficient Algorithm for Maximum-Entropy Extension of Block-Circulant Covariance Matrices

    CERN Document Server

    Carli, Francesca P; Pavon, Michele; Picci, Giorgio

    2011-01-01

    This paper deals with maximum entropy completion of partially specified block-circulant matrices. Since positive definite symmetric circulants happen to be covariance matrices of stationary periodic processes, in particular of stationary reciprocal processes, this problem has applications in signal processing, in particular to image modeling. Maximum entropy completion is closely related to maximum likelihood estimation subject to certain conditional independence constraints. The maximum entropy completion problem for block-circulant matrices is a nonlinear problem which has recently been solved by the authors, although the problem of efficiently computing the solution was left open. The main contribution of this paper is an efficient algorithm for computing the solution. Simulations show that our iterative scheme outperforms various existing approaches, especially for large-dimensional problems. A necessary and sufficient condition for the existence of a positive definite circulant completion...

  9. Analysis of the Velocity Distribution in Partially-Filled Circular Pipe Employing the Principle of Maximum Entropy.

    Science.gov (United States)

    Jiang, Yulin; Li, Bin; Chen, Jie

    2016-01-01

    The flow velocity distribution in partially-filled circular pipes is investigated in this paper. The velocity profile differs from that of full-pipe flow, since the flow is driven by gravity, not by pressure. The research findings show that the position of maximum velocity lies below the water surface and varies with the water depth. In the region near the tube wall, the fluid velocity is mainly influenced by wall friction and the pipe bottom slope, and the velocity variation is similar to that in a full pipe. Near the free water surface, however, the velocity distribution is mainly affected by the converging tube wall and the secondary flow, and the velocity variation is relatively small. A literature search shows that relatively little research has addressed a practical expression for the velocity distribution in partially-filled circular pipes. An expression for the two-dimensional (2D) velocity distribution in partially-filled circular pipe flow was derived based on the principle of maximum entropy (POME). Different entropies were compared in light of fluid-mechanical knowledge, and the non-extensive entropy was chosen. A new cumulative distribution function (CDF) of partially-filled circular pipe velocity in terms of flow depth was hypothesized. Combined with the CDF hypothesis, the 2D velocity distribution was derived, and the position of the maximum velocity was analyzed. The experimental results show that the velocity values estimated via the principle of maximum Tsallis wavelet entropy are in good agreement with measured values.
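
The non-extensive (Tsallis) entropy chosen by the authors replaces the Shannon functional with $S_q = (1 - \sum_i p_i^q)/(q-1)$, recovering Shannon entropy as $q \to 1$. A minimal sketch of this definition on hypothetical discrete distributions, verifying that, like Shannon entropy, it is maximized by the uniform distribution:

```python
def tsallis_entropy(p, q=2.0):
    """Non-extensive (Tsallis) entropy S_q = (1 - sum p_i^q) / (q - 1)."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

uniform = [0.25] * 4          # hypothetical 4-bin distributions
peaked = [0.7, 0.1, 0.1, 0.1]

# S_q is maximized by the uniform distribution, as Shannon entropy is.
assert tsallis_entropy(uniform) > tsallis_entropy(peaked)
# Closed form for the uniform case at q = 2: 1 - n * (1/n)^2 = 1 - 1/n.
assert abs(tsallis_entropy(uniform) - (1 - 4 * 0.25 ** 2)) < 1e-12
```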

  10. Adaptive Statistical Language Modeling; A Maximum Entropy Approach

    Science.gov (United States)

    1994-04-19

    ...recognition systems were built that could recognize vowels or digits, but they could not be successfully extended to handle more realistic language... maximum likelihood of generating the training data. The identity of the ML and ME solutions, apart from being aesthetically pleasing, is extremely...

  11. Most likely maximum entropy for population analysis: A case study in decompression sickness prevention

    Science.gov (United States)

    Bennani, Youssef; Pronzato, Luc; Rendas, Maria João

    2015-01-01

    We estimate the density of a set of biophysical parameters from region-censored observations. We propose a new Maximum Entropy (maxent) estimator formulated as finding the most likely constrained maxent density. By using the Rényi entropy of order two instead of the Shannon entropy, we are led to a quadratic optimization problem with linear inequality constraints that has an efficient numerical solution. We compare the proposed estimator to the NPMLE and to the best-fitting maxent solutions on real data from hyperbaric diving, showing that the resulting distribution has better generalization performance than NPMLE or maxent alone.
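
With the Rényi entropy of order two, maximizing entropy is equivalent to minimizing $\sum_i p_i^2$, so equality constraints make the problem quadratic, and the stationary point is affine in the constraint functions. The toy sketch below imposes only normalization and a mean constraint; the paper's actual constraints come from region-censored likelihoods, and active inequality constraints would require a QP solver.

```python
def renyi2_maxent(xs, mean_target):
    """Maximize order-2 Rényi entropy (minimize sum p_i^2) over p with
    sum p = 1 and sum p_i * x_i = mean_target. With only equality
    constraints the solution is affine in x: p_i = a + b * x_i."""
    n = len(xs)
    sx = sum(xs)
    sxx = sum(x * x for x in xs)
    # Solve the 2x2 Lagrange system [n, sx; sx, sxx] @ [a, b] = [1, mean].
    det = n * sxx - sx * sx
    a = (sxx - sx * mean_target) / det
    b = (n * mean_target - sx) / det
    p = [a + b * x for x in xs]
    assert all(pi >= 0 for pi in p), "active positivity constraints not handled"
    return p

xs = [0, 1, 2, 3]                       # hypothetical support points
p = renyi2_maxent(xs, mean_target=1.2)
assert abs(sum(p) - 1) < 1e-12
assert abs(sum(pi * x for pi, x in zip(p, xs)) - 1.2) < 1e-12
```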

  12. Modeling the Mass Action Dynamics of Metabolism with Fluctuation Theorems and Maximum Entropy

    Science.gov (United States)

    Cannon, William; Thomas, Dennis; Baxter, Douglas; Zucker, Jeremy; Goh, Garrett

    The laws of thermodynamics dictate the behavior of biotic and abiotic systems. Simulation methods based on statistical thermodynamics can provide a fundamental understanding of how biological systems function and are coupled to their environment. While mass action kinetic simulations are based on solving ordinary differential equations using rate parameters, analogous thermodynamic simulations of mass action dynamics are based on modeling states using chemical potentials. The latter have the advantage that standard free energies of formation/reaction and metabolite levels are much easier to determine than rate parameters, allowing one to model across a large range of scales. Bridging theory and experiment, statistical thermodynamics simulations allow us to both predict activities of metabolites and enzymes and use experimental measurements of metabolites and proteins as input data. Even if metabolite levels are not available experimentally, it is shown that a maximum entropy assumption is quite reasonable and in many cases results in both the most energetically efficient process and the highest material flux.

  13. Probability Distribution Model of Regional Agricultural Drought Degree Based on the Maximum Entropy Principle

    Institute of Scientific and Technical Information of China (English)

    陈海涛; 黄鑫; 邱林; 王文川

    2013-01-01

      An evaluation index of drought degree, which comprehensively considers the quantitative relationship between the crop growing period and natural factors, is presented in this paper. The distribution density function of drought degree is established based on the maximum entropy principle; this avoids the arbitrariness of previously constructed probability distributions and achieves a quantitative evaluation of regional agricultural drought degree. Firstly, the quantitative evaluation index of drought degree is established according to the yield reduction rate under deficit irrigation conditions. Secondly, a long series of rainfall data is generated by the Monte Carlo method and the drought degree index is calculated for each year. Finally, the probability density function of the agricultural drought degree distribution is constructed using the maximum entropy principle. As an example, calculated results for the drought degree distribution in the Qucun irrigation area of Puyang City, Henan Province are presented. The results show that the model provides a good evaluation method with a clear concept, a simple and practical approach, and reasonable outcomes.

  14. Entropy-based portfolio models: Practical issues

    Science.gov (United States)

    Shirazi, Yasaman Izadparast; Sabiruzzaman, Md.; Hamzah, Nor Aishah

    2015-10-01

    Entropy is a nonparametric alternative to variance and has been used as a measure of risk in portfolio analysis. In this paper, the computation of entropy risk for a given set of data is discussed with illustrations. A comparison between entropy-based portfolio models is made. We propose a natural extension of the mean-entropy portfolio to make it more general and diversified. In terms of performance, this new model is similar to the mean-entropy portfolio when applied to real and simulated data, and offers a higher return if no constraint is set on the desired return; it is also found to be the most diversified portfolio model.
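
The use of entropy as a diversification measure can be made concrete: the Shannon entropy of the weight vector reaches its maximum, $\ln n$, for the equally-weighted portfolio and is small for a concentrated one. A minimal sketch with hypothetical weights:

```python
import math

def shannon_entropy(weights):
    """Entropy of portfolio weights; higher means more diversified."""
    return -sum(w * math.log(w) for w in weights if w > 0)

concentrated = [0.85, 0.05, 0.05, 0.05]   # hypothetical 4-asset portfolios
uniform = [0.25, 0.25, 0.25, 0.25]

# The equally-weighted portfolio attains the maximum ln(4).
assert abs(shannon_entropy(uniform) - math.log(4)) < 1e-12
assert shannon_entropy(concentrated) < shannon_entropy(uniform)
```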

  15. On the statistical equivalence of restrained-ensemble simulations with the maximum entropy method.

    Science.gov (United States)

    Roux, Benoît; Weare, Jonathan

    2013-02-28

    An issue of general interest in computer simulations is to incorporate information from experiments into a structural model. An important caveat in pursuing this goal is to avoid corrupting the resulting model with spurious and arbitrary biases. While the problem of biasing thermodynamic ensembles can be formulated rigorously using the maximum entropy method introduced by Jaynes, the approach can be cumbersome in practical applications with the need to determine multiple unknown coefficients iteratively. A popular alternative strategy to incorporate the information from experiments is to rely on restrained-ensemble molecular dynamics simulations. However, the fundamental validity of this computational strategy remains in question. Here, it is demonstrated that the statistical distribution produced by restrained-ensemble simulations is formally consistent with the maximum entropy method of Jaynes. This clarifies the underlying conditions under which restrained-ensemble simulations will yield results that are consistent with the maximum entropy method.

  16. Early-Warning Evaluation Model of Water Resources Based on the Maximum-Entropy Projection Pursuit Combination Method

    Institute of Scientific and Technical Information of China (English)

    任永泰; 李丽

    2011-01-01

    The weights of the warning indices are determined by a combination weighting method that couples maximum-entropy-criterion weighting with a projection pursuit method based on a real-coded accelerating genetic algorithm. The analytic hierarchy process is used to calculate the weight of each subsystem in the composite system of sustainable water resources utilization. The sustainable development index of Harbin's water resources is calculated with a comprehensive evaluation model, and early-warning results for the sustainable utilization of Harbin's water resources are finally obtained.

  17. Maximum Entropy Threshold Segmentation for Target Matching Using Speeded-Up Robust Features

    Directory of Open Access Journals (Sweden)

    Mu Zhou

    2014-01-01

    Full Text Available This paper proposes a 2-dimensional (2D) maximum entropy threshold segmentation (2DMETS) based speeded-up robust features (SURF) approach for image target matching. First of all, based on the gray level of each pixel and the average gray level of its neighboring pixels, we construct a 2D gray histogram. Second, after segmenting target from background, we localize the feature points at the interest points which have a local extremum of box filter responses. Third, from the 2D Haar wavelet responses, we generate the 64-dimensional (64D) feature point descriptor vectors. Finally, we perform the target matching by comparing the 64D feature point descriptor vectors. Experimental results show that our proposed approach can effectively enhance the target matching performance while preserving real-time capability.

  18. Test the Principle of Maximum Entropy in Constant Sum 2x2 Game: Evidence in Experimental Economics

    CERN Document Server

    Xu, Bin; Wang, Zhijian; Zhang, Jianbo

    2011-01-01

    Entropy serves as a central observable indicating uncertainty in many chemical, thermodynamical, biological, and ecological systems, and the principle of maximum entropy (MaxEnt) is widely supported in natural science. Recently, entropy has been employed to describe social systems in which human subjects interact with each other, but empirical support for the principle of maximum entropy has not previously been reported in this field. Using laboratory experimental data, we test the uncertainty of strategy type in various competitive environments with two-person constant-sum $2 \times 2$ games. Empirical evidence shows that, in this competitive game environment, the outcome of human decision-making obeys the principle of maximum entropy.
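
The MaxEnt test reported here amounts to comparing the empirical entropy of observed strategy-type frequencies with the theoretical maximum. A minimal sketch with hypothetical tallies (not the paper's data) for the four pure-strategy profiles of a $2 \times 2$ game:

```python
import math

def strategy_entropy(counts):
    """Shannon entropy (in nats) of an observed strategy-profile frequency table."""
    total = sum(counts)
    return -sum(c / total * math.log(c / total) for c in counts if c > 0)

# Hypothetical tallies of the four pure-strategy profiles in a 2x2 game.
observed = [26, 24, 25, 25]
h = strategy_entropy(observed)
h_max = math.log(4)   # entropy of the uniform distribution over 4 profiles

assert h <= h_max
assert h_max - h < 0.01   # near-uniform observations sit near the MaxEnt bound
```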

  19. Research on Text Categorization Based on an Improved Maximum Entropy Algorithm

    Institute of Scientific and Technical Information of China (English)

    李学相

    2012-01-01

    This paper discusses problems of text categorization accuracy. In traditional text classification algorithms, different feature words have the same effect on the classification result, classification accuracy is low, and the time complexity of the algorithm increases. Because the maximum entropy model can integrate various observed relevant or irrelevant probability knowledge, it achieves good results on many processing problems. To solve the above problems, this paper proposes an improved maximum entropy text classification method that combines the advantages of c-means clustering and the maximum entropy algorithm. The algorithm first takes the Shannon entropy as the objective function of the maximum entropy model, simplifying the classifier's expression form, and then uses the c-means algorithm to cluster the optimal features. Simulation results show that, compared with traditional text classification, the proposed method can quickly obtain the optimal classification feature subsets, greatly shortening running time and improving text classification accuracy.

  20. Ecosystem biogeochemistry considered as a distributed metabolic network ordered by maximum entropy production.

    Science.gov (United States)

    Vallino, Joseph J

    2010-05-12

    We examine the application of the maximum entropy production principle for describing ecosystem biogeochemistry. Since ecosystems can be functionally stable despite changes in species composition, we use a distributed metabolic network for describing biogeochemistry, which synthesizes generic biological structures that catalyse reaction pathways, but is otherwise organism independent. Allocation of biological structure and regulation of biogeochemical reactions is determined via solution of an optimal control problem in which entropy production is maximized. However, because synthesis of biological structures cannot occur if entropy production is maximized instantaneously, we propose that information stored within the metagenome allows biological systems to maximize entropy production when averaged over time. This differs from abiotic systems that maximize entropy production at a point in space-time, which we refer to as the steepest descent pathway. It is the spatio-temporal averaging that allows biological systems to outperform abiotic processes in entropy production, at least in many situations. A simulation of a methanotrophic system is used to demonstrate the approach. We conclude with a brief discussion on the implications of viewing ecosystems as self-organizing molecular machines that function to maximize entropy production at the ecosystem level of organization.

  1. Exact computation of the maximum-entropy potential of spiking neural-network models.

    Science.gov (United States)

    Cofré, R; Cessac, B

    2014-05-01

    Understanding how stimuli and synaptic connectivity influence the statistics of spike patterns in neural networks is a central question in computational neuroscience. The maximum-entropy approach has been successfully used to characterize the statistical response of simultaneously recorded spiking neurons responding to stimuli. However, in spite of good performance in terms of prediction, the fitting parameters do not explain the underlying mechanistic causes of the observed correlations. On the other hand, mathematical models of spiking neurons (neuromimetic models) provide a probabilistic mapping between the stimulus, network architecture, and spike patterns in terms of conditional probabilities. In this paper we build an exact analytical mapping between neuromimetic and maximum-entropy models.
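
The maximum-entropy models referred to here are typically pairwise (Ising-form) distributions over binary spike words. The sketch below enumerates a small network exactly, with hypothetical fields and couplings, and shows how a positive coupling induces the pairwise correlation such models are fitted to reproduce:

```python
import itertools
import math

def ising_maxent_probs(h, J):
    """Pairwise maximum-entropy (Ising-form) distribution over binary spike
    words s in {0,1}^n: P(s) proportional to exp(sum h_i s_i + sum J_ij s_i s_j)."""
    n = len(h)
    states = list(itertools.product([0, 1], repeat=n))
    def energy(s):
        e = sum(h[i] * s[i] for i in range(n))
        e += sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))
        return e
    w = [math.exp(energy(s)) for s in states]
    z = sum(w)
    return {s: wi / z for s, wi in zip(states, w)}

# Hypothetical fields and couplings for a 3-neuron toy network.
h = [-1.0, -1.0, -0.5]
J = [[0, 0.8, 0.0], [0, 0, 0.8], [0, 0, 0]]
P = ising_maxent_probs(h, J)
assert abs(sum(P.values()) - 1) < 1e-12

# A positive coupling J[0][1] makes joint firing of neurons 0 and 1 more
# likely than under independence.
p0 = sum(p for s, p in P.items() if s[0] == 1)
p1 = sum(p for s, p in P.items() if s[1] == 1)
p01 = sum(p for s, p in P.items() if s[0] == 1 and s[1] == 1)
assert p01 > p0 * p1
```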

  2. Analysis of simulated fluorescence intensities decays by a new maximum entropy method algorithm.

    Science.gov (United States)

    Esposito, Rosario; Altucci, Carlo; Velotta, Raffaele

    2013-01-01

    A new algorithm for the Maximum Entropy Method (MEM) is proposed for recovering the lifetime distribution in time-resolved fluorescence decays. The procedure is based on seeking the distribution that maximizes the Skilling entropy function subject to the chi-squared constraint χ(2) ~ 1, through iterative linear approximations, LU decomposition of the Hessian matrix of the Lagrangian problem, and the Golden Section Search for backtracking. The accuracy of this algorithm has been investigated through comparisons with simulated fluorescence decays of both narrow and broad lifetime distributions. The proposed approach is capable of analysing datasets of up to 4,096 points with a discretization ranging from 100 to 1,000 lifetimes. Good agreement with nonlinear fitting estimates has been observed when the method is applied to multi-exponential decays. Remarkable results have also been obtained for broad lifetime distributions, where the position is recovered with high accuracy and the distribution width is estimated within 3%. These results indicate that the proposed procedure generates MEM lifetime distributions that can be used to quantify the real heterogeneity of lifetimes in a sample.

  3. Restraint of mid-spatial frequency error in magnetorheological finishing (MRF) process by maximum entropy method

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In order to restrain the mid-spatial frequency error in the magnetorheological finishing (MRF) process, a novel part-random path is designed based on the theory of the maximum entropy method (MEM). Using a KDMRF-1000F polishing machine, one flat workpiece (98 mm in diameter) is polished. The mid-spatial frequency error in the region polished with the part-random path is much lower than that obtained with the common raster path. After one MRF iteration (7.46 min), the peak-to-valley (PV) error is 0.062 wave (1 wave = 632.8 nm), the root-mean-square (RMS) error is 0.010 wave, and no obvious mid-spatial frequency error is found. The result shows that the part-random path is a novel path which yields high form accuracy and low mid-spatial frequency error in the MRF process.

  4. Background adjustment of cDNA microarray images by Maximum Entropy distributions.

    Science.gov (United States)

    Argyropoulos, Christos; Daskalakis, Antonis; Nikiforidis, George C; Sakellaropoulos, George C

    2010-08-01

    Many empirical studies have demonstrated the exquisite sensitivity of both traditional and novel statistical and machine intelligence algorithms to the method of background adjustment used to analyze microarray datasets. In this paper we develop a statistical framework that approaches background adjustment as a classic stochastic inverse problem, whose noise characteristics are given in terms of Maximum Entropy distributions. We derive analytic closed form approximations to the combined problem of estimating the magnitude of the background in microarray images and adjusting for its presence. The proposed method reduces standardized measures of log expression variability across replicates in situations of known differential and non-differential gene expression without increasing the bias. Additionally, it results in computationally efficient procedures for estimation and learning based on sufficient statistics and can filter out spot measures with intensities that are numerically close to the background level resulting in a noise reduction of about 7%.

  5. Restoration of Scanning Tunneling Microscope Images by means of Two-Dimensional Maximum Entropy Method

    Science.gov (United States)

    Matsumoto, Hisanori; Tokiwano, Kazuo; Hosoi, Hirotaka; Sueoka, Kazuhisa; Mukasa, Koichi

    2002-05-01

    We present a new technique for the restoration of scanning tunneling microscopy (STM) images, which is a two-dimensional extension of a recently developed statistical approach based on the one-dimensional least-squares method (LSM). An STM image is regarded as a realization of a stochastic process and assumed to be a composition of an underlying image and noise. We express the underlying image in terms of a two-dimensional generalized trigonometric polynomial suitable for representing the atomic protrusions in STM images. The optimization of the polynomial is performed by the two-dimensional LSM combined with the power spectral density function estimated by means of the maximum entropy method (MEM) iterative algorithm for two-dimensional signals. The restored images are obtained as the optimum least-squares fitting polynomial which is a continuous surface. We apply this technique to modeled and actual STM data. Results show that the present method yields a reasonable restoration of STM images.

  6. Prediction of Rock Burst Using the Method of Optimal Relative Membership Degree Based on the Maximum Entropy Principle

    Institute of Scientific and Technical Information of China (English)

    曾杰; 张永兴; 靳晓光

    2011-01-01

    Based on an analysis of rock burst prediction criteria at home and abroad, the mechanical, integrity, energy-storage, and brittleness conditions required for rock burst occurrence are selected as prediction indices. The concept of relative membership degree for rock burst prediction is introduced, and the fuzzy matrix of relative membership degrees and the weights of the prediction indices are calculated. Information entropy is used to describe and compare the uncertainty in rock burst evaluation, and a weighted generalized distance is defined to characterize differences between rock bursts. A fuzzy optimization model for rock burst prediction is established according to the maximum entropy principle and applied to several underground rock engineering examples; the predictions are essentially consistent with the results of other methods and with the actual situations. The model is then applied to rock burst prediction for the Putaoshan tunnel, where the predictions agree well with the actual rock bursts.

  7. Feature-Opinion Pairs Classification Based on Dependency Relations and the Maximum Entropy Model

    Institute of Scientific and Technical Information of China (English)

    张磊; 李珊; 彭舰; 陈黎; 黎红友

    2014-01-01

    In recent years, the classification of feature-opinion pairs in Chinese product reviews has been one of the most important research topics in Web data mining. In this paper, five types of Chinese dependency relationships for product reviews are derived from traditional English dependency grammar. A maximum entropy model is used to predict the relations between opinion words and the product features they modify; to train the model, a set of feature-symbol combination templates is designed based on the Chinese dependencies. Experimental results show that the recall and F-score of our approach reach 78.68% and 75.36% respectively, clearly superior to Hu's adjacency-based method and Popescu's pattern-based method.

  8. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    Science.gov (United States)

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

    Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made the Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using the publicly available benchmark data sets.

  9. 2-D Maximum Entropy Method of Image Segmentation Based on Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    欧萍; 贺电

    2011-01-01

    In image segmentation aimed at extracting user-specified target features, selecting the optimal threshold is the key to accurate segmentation. Traditional 2-D maximum entropy algorithms find the optimal threshold by exhaustive search, which is time-consuming, inefficient, and prone to false segmentation. To improve the efficiency and accuracy of image segmentation, this paper proposes a 2-D maximum entropy image segmentation method based on a genetic algorithm. The method first applies a gray-level transform to the original image and builds its 2-D histogram. According to the 2-D histogram, appropriate gray values are selected to initialize the population of the genetic algorithm, with each individual represented as a two-dimensional vector. The optimal threshold is then searched for through selection, crossover and mutation operators, and the threshold obtained is used to segment the image. Experimental results show that, compared with the traditional 2-D maximum entropy segmentation algorithm, the method not only runs faster, improving segmentation efficiency, but also greatly improves segmentation accuracy.
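
As background for the criterion being optimized, here is a 1-D simplification of the maximum entropy threshold (Kapur's method) with the exhaustive search the abstract criticizes; the 2-D variant adds a neighborhood-average dimension, and the genetic search would replace the loop below. The histogram is a made-up toy example:

```python
import numpy as np

def max_entropy_threshold(hist):
    """1-D maximum entropy (Kapur) threshold by exhaustive search:
    pick t maximizing the summed entropies of the background and
    foreground gray-level distributions."""
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1
        h = -(p0[p0 > 0] * np.log(p0[p0 > 0])).sum() \
            - (p1[p1 > 0] * np.log(p1[p1 > 0])).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Toy bimodal histogram over 8 gray levels (made-up counts).
hist = np.array([30, 40, 10, 1, 1, 10, 35, 25], dtype=float)
t = max_entropy_threshold(hist)
print(t)
```

The selected threshold falls near the valley between the two modes; the exhaustive loop is exactly the cost that a genetic search over candidate thresholds is meant to avoid.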

  10. A generalized maximum entropy stochastic frontier measuring productivity accounting for spatial dependency

    NARCIS (Netherlands)

    Tonini, A.; Pede, V.

    2011-01-01

    In this paper, a stochastic frontier model accounting for spatial dependency is developed using generalized maximum entropy estimation. An application is made for measuring total factor productivity in European agriculture. The empirical results show that agricultural productivity growth in Europe i

  11. Monitoring of Time-Dependent System Profiles by Multiplex Gas Chromatography with Maximum Entropy Demodulation

    Science.gov (United States)

    Becker, Joseph F.; Valentin, Jose

    1996-01-01

    The maximum entropy technique was successfully applied to the deconvolution of overlapped chromatographic peaks. An algorithm was written in which the chromatogram was represented as a vector of sample concentrations multiplied by a peak shape matrix. Simulation results demonstrated that there is a trade off between the detector noise and peak resolution in the sense that an increase of the noise level reduced the peak separation that could be recovered by the maximum entropy method. Real data originating from a sample storage column were also deconvolved using maximum entropy. Deconvolution is useful in this type of system because the conservation of time dependent profiles depends on the band spreading processes in the chromatographic column, which might smooth out the finer details in the concentration profile. The method was also applied to the deconvolution of previously interpreted Pioneer Venus chromatograms. It was found in this case that the correct choice of peak shape function was critical to the sensitivity of maximum entropy in the reconstruction of these chromatograms.

  12. Maximum-entropy expectation-maximization algorithm for image reconstruction and sensor field estimation.

    Science.gov (United States)

    Hong, Hunsop; Schonfeld, Dan

    2008-06-01

    In this paper, we propose a maximum-entropy expectation-maximization (MEEM) algorithm. We use the proposed algorithm for density estimation. The maximum-entropy constraint is imposed for smoothness of the estimated density function. The derivation of the MEEM algorithm requires determination of the covariance matrix in the framework of the maximum-entropy likelihood function, which is difficult to solve analytically. We, therefore, derive the MEEM algorithm by optimizing a lower-bound of the maximum-entropy likelihood function. We note that the classical expectation-maximization (EM) algorithm has been employed previously for 2-D density estimation. We propose to extend the use of the classical EM algorithm for image recovery from randomly sampled data and sensor field estimation from randomly scattered sensor networks. We further propose to use our approach in density estimation, image recovery and sensor field estimation. Computer simulation experiments are used to demonstrate the superior performance of the proposed MEEM algorithm in comparison to existing methods.

  13. Maximum-entropy parameter estimation for the k-NN modified value-difference kernel

    NARCIS (Netherlands)

    Hendrickx, I.H.E.; van den Bosch, A.; Verbruggen, R.; Taatgen, N.; Schomaker, L.

    2004-01-01

    We introduce an extension of the modified value-difference kernel of $k$-nn by replacing the kernel's default class distribution matrix with the matrix produced by the maximum-entropy learning algorithm. This hybrid algorithm is tested on fifteen machine learning benchmark tasks, comparing the hybrid

  14. Can the maximum entropy principle be explained as a consistency requirement?

    NARCIS (Netherlands)

    Uffink, J.

    2001-01-01

    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathema

  15. Maximum-Entropy Parameter Estimation for the k-nn Modified Value-Difference Kernel

    NARCIS (Netherlands)

    Hendrickx, Iris; Bosch, Antal van den

    2005-01-01

    We introduce an extension of the modified value-difference kernel of k-nn by replacing the kernel's default class distribution matrix with the matrix produced by the maximum-entropy learning algorithm. This hybrid algorithm is tested on fifteen machine learning benchmark tasks, comparing the hybrid

  16. Bayesian Maximum Entropy prediction of soil categories using a traditional soil map as soft information.

    NARCIS (Netherlands)

    Brus, D.J.; Bogaert, P.; Heuvelink, G.B.M.

    2008-01-01

    Bayesian Maximum Entropy was used to estimate the probabilities of occurrence of soil categories in the Netherlands, and to simulate realizations from the associated multi-point pdf. Besides the hard observations (H) of the categories at 8369 locations, the soil map of the Netherlands 1:50 000 was u

  17. Maximum entropy production: can it be used to constrain conceptual hydrological models?

    Directory of Open Access Journals (Sweden)

    M. C. Westhoff

    2013-08-01

    Full Text Available In recent years, optimality principles have been proposed to constrain hydrological models. The principle of maximum entropy production (MEP) is one of the proposed principles and is the subject of this study. It states that a steady state system is organized in such a way that entropy production is maximized. Although successful applications have been reported in the literature, generally little guidance has been given on how to apply the principle. The aim of this paper is to use the maximum power principle, which is closely related to MEP, to constrain the parameters of a simple conceptual (bucket) model. Although we had to conclude that conceptual bucket models could not be constrained with respect to maximum power, this study sheds more light on how to use and how not to use the principle. Several of these issues have been correctly applied in other studies, but have not been explained or discussed as such. While other studies were based on resistance formulations, where the quantity to be optimized is a linear function of the resistance to be identified, our study shows that the approach also works for formulations that are only linear in the log-transformed space. Moreover, we showed that parameters describing process thresholds or influencing boundary conditions cannot be constrained. We furthermore conclude that, in order to apply the principle correctly, the model should be (1) physically based, i.e. fluxes should be defined as a gradient divided by a resistance; (2) the optimized flux should have a feedback on the gradient, i.e. the influence of boundary conditions on gradients should be minimal; (3) the temporal scale of the model should be chosen in such a way that the parameter that is optimized is constant over the modelling period; (4) the fluxes can only be correctly optimized when the correct feedbacks are implemented; and (5) there should be a trade-off between two or more fluxes. Although our application of the maximum power principle did

  18. Error Detection for Statistical Machine Translation Based on Feature Comparison and Maximum Entropy Model Classifier

    Institute of Scientific and Technical Information of China (English)

    杜金华; 王莎

    2013-01-01

    The authors first introduce three typical word posterior probability (WPP) features for translation error detection and classification, namely fixed-position WPP, sliding-window WPP, and alignment-based WPP, and analyze their impact on detection performance. Each WPP feature is then combined with three linguistic features (word, POS, and LG parsing knowledge) in a maximum entropy classifier to predict translation errors. Experimental results on Chinese-to-English NIST datasets show that the influence of the different WPP features on the classification error rate (CER) is significant, and that combining WPP with linguistic features can significantly reduce the CER and improve the prediction capability of the classifier.

  19. Maximum entropy, fractal dimension and lacunarity in quantification of cellular rejection in myocardial biopsy of patients submitted to heart transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Neves, L A [Universidade Estadual Paulista, IGCE, DEMAC, Rio Claro, SP (Brazil); Oliveira, F R; Peres, F A [Faculdade de Tecnologia de Sao Jose do Rio Preto, Sao Jose do Rio Preto, SP (Brazil); Moreira, R D; Moriel, A R; De Godoy, M F [Faculdade de Medicina de Sao Jose do Rio Preto, FAMERP, Sao Jose do Rio Preto, SP (Brazil); Murta Junior, L O, E-mail: laneves@rc.unesp.br [Universidade de Sao Paulo, FFCLRP, Depto Computacao e Matematica, Ribeirao Preto (Brazil)

    2011-03-01

    This paper presents a method for the quantification of cellular rejection in endomyocardial biopsies of patients submitted to heart transplant. The model is based on automatic multilevel thresholding, which employs histogram quantification techniques, histogram slope percentage analysis and the calculation of maximum entropy. The structures were quantified with the aid of the multi-scale fractal dimension and lacunarity for the identification of behavior patterns in myocardial cellular rejection in order to determine the most adequate treatment for each case.

  20. Maximum entropy, fractal dimension and lacunarity in quantification of cellular rejection in myocardial biopsy of patients submitted to heart transplantation

    Science.gov (United States)

    Neves, L. A.; Oliveira, F. R.; Peres, F. A.; Moreira, R. D.; Moriel, A. R.; de Godoy, M. F.; Murta Junior, L. O.

    2011-03-01

    This paper presents a method for the quantification of cellular rejection in endomyocardial biopsies of patients submitted to heart transplant. The model is based on automatic multilevel thresholding, which employs histogram quantification techniques, histogram slope percentage analysis and the calculation of maximum entropy. The structures were quantified with the aid of the multi-scale fractal dimension and lacunarity for the identification of behavior patterns in myocardial cellular rejection in order to determine the most adequate treatment for each case.

  1. Maximum entropy as a consequence of Bayes' theorem in differentiable manifolds

    CERN Document Server

    Davis, Sergio

    2015-01-01

    Bayesian inference and the principle of maximum entropy (PME) are usually presented as separate but complementary branches of inference, the latter playing a central role in the foundations of Statistical Mechanics. In this work it is shown that the PME can be derived from Bayes' theorem and the divergence theorem for systems whose states can be mapped to points in a differentiable manifold. In this view, entropy must be interpreted as the invariant measure (non-informative prior) on the space of probability densities.
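
The variational statement behind this record has a compact standard form; with $m(x)$ the invariant measure (non-informative prior) the abstract mentions, and expectation constraints $F_k$, the PME selects the exponential family:

```latex
% Maximize relative entropy subject to normalization and moment constraints.
\begin{aligned}
&\max_{p}\; S[p] = -\int p(x)\,\ln\frac{p(x)}{m(x)}\,\mathrm{d}x,\\
&\text{s.t. } \int p(x)\,\mathrm{d}x = 1,\qquad \int f_k(x)\,p(x)\,\mathrm{d}x = F_k,\\[4pt]
&\Rightarrow\quad p(x) = \frac{m(x)}{Z(\lambda)}\,\exp\!\Big(-\sum_k \lambda_k f_k(x)\Big),\qquad
Z(\lambda) = \int m(x)\,e^{-\sum_k \lambda_k f_k(x)}\,\mathrm{d}x,
\end{aligned}
```

with the multipliers $\lambda_k$ fixed by the constraints; when $m(x)$ is uniform this reduces to the usual Jaynes form.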

  2. Self-assembled wiggling nano-structures and the principle of maximum entropy production.

    Science.gov (United States)

    Belkin, A; Hubler, A; Bezryadin, A

    2015-02-09

    While behavior of equilibrium systems is well understood, evolution of nonequilibrium ones is much less clear. Yet, many researchers have suggested that the principle of the maximum entropy production is of key importance in complex systems away from equilibrium. Here, we present a quantitative study of large ensembles of carbon nanotubes suspended in a non-conducting non-polar fluid subject to a strong electric field. Being driven out of equilibrium, the suspension spontaneously organizes into an electrically conducting state under a wide range of parameters. Such self-assembly allows the Joule heating and, therefore, the entropy production in the fluid, to be maximized. Curiously, we find that emerging self-assembled structures can start to wiggle. The wiggling takes place only until the entropy production in the suspension reaches its maximum, at which time the wiggling stops and the structure becomes quasi-stable. Thus, we provide strong evidence that the maximum entropy production principle plays an essential role in the evolution of self-organizing systems far from equilibrium.

  3. Nonequilibrium thermodynamics and maximum entropy production in the Earth system: applications and implications.

    Science.gov (United States)

    Kleidon, Axel

    2009-06-01

    The Earth system is maintained in a unique state far from thermodynamic equilibrium, as, for instance, reflected in the high concentration of reactive oxygen in the atmosphere. The myriad of processes that transform energy, that result in the motion of mass in the atmosphere, in oceans, and on land, processes that drive the global water, carbon, and other biogeochemical cycles, all have in common that they are irreversible in their nature. Entropy production is a general consequence of these processes and measures their degree of irreversibility. The proposed principle of maximum entropy production (MEP) states that systems are driven to steady states in which they produce entropy at the maximum possible rate given the prevailing constraints. In this review, the basics of nonequilibrium thermodynamics are described, as well as how these apply to Earth system processes. Applications of the MEP principle are discussed, ranging from the strength of the atmospheric circulation, the hydrological cycle, and biogeochemical cycles to the role that life plays in these processes. Nonequilibrium thermodynamics and the MEP principle have potentially wide-ranging implications for our understanding of Earth system functioning, how it has evolved in the past, and why it is habitable. Entropy production allows us to quantify an objective direction of Earth system change (closer to vs further away from thermodynamic equilibrium, or, equivalently, towards a state of MEP). When a maximum in entropy production is reached, MEP implies that the Earth system reacts to perturbations primarily with negative feedbacks. In conclusion, this nonequilibrium thermodynamic view of the Earth system shows great promise to establish a holistic description of the Earth as one system. This perspective is likely to allow us to better understand and predict its function as one entity, how it has evolved in the past, and how it is modified by human activities in the future.

  4. A Study of the Distribution Law of Runoff Forecasting Error Based on Maximum Entropy Principle

    Institute of Scientific and Technical Information of China (English)

    何洋; 纪昌明; 田开华; 张验科; 李传刚

    2016-01-01

    To study the distribution law of runoff forecasting error in depth, the maximum entropy principle is applied and a maximum entropy model for the distribution of runoff forecasting error is established in this paper. Taking the runoff forecast series of the Guandi Reservoir as an example, the probability density functions and distribution curves of the forecasting error are calculated for different lead times. Comparing these distribution curves with theoretical normal distribution curves and sample histograms shows that the error distribution obtained by the maximum entropy method better describes the distribution characteristics of runoff forecasting error. Considering the intra-annual variation of runoff between wet and dry conditions, the runoff series is divided into dry, flood and transition seasons; the error distribution of each period is analyzed separately, and the confidence levels of the forecasting error at different confidence intervals are given. This provides a better grasp of the distribution law of runoff forecasting error and a new way to improve the accuracy of runoff forecasting.
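
A discrete sketch of the underlying construction: among all distributions on a grid that match a given moment, the maximum entropy one has exponential form, and its multiplier can be found by a simple search. The error grid, target mean, and function name `maxent_with_mean` are illustrative assumptions, not values from the paper, which constrains more moments:

```python
import numpy as np

def maxent_with_mean(x, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum entropy distribution on the grid x subject to a fixed
    mean. The solution is exponential, p_i ∝ exp(lam * x_i); lam is
    found by bisection, since the mean is increasing in lam."""
    def dist(lam):
        z = lam * x
        p = np.exp(z - z.max())      # shift for numerical stability
        return p / p.sum()
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if (dist(mid) * x).sum() < target_mean:
            lo = mid
        else:
            hi = mid
    return dist(0.5 * (lo + hi))

# Hypothetical forecast-error grid (in percent) and a target mean
# error; both are illustrative.
x = np.linspace(-10.0, 10.0, 41)
p = maxent_with_mean(x, target_mean=1.5)
print((p * x).sum())   # ≈ 1.5
```

Adding a variance constraint in the same way yields the familiar Gaussian-shaped maximum entropy density, which is why the method can be compared directly against the theoretical normal curve.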

  5. Chinese Word Segmentation via Word-position Tagging Based on Maximum Entropy Model

    Institute of Scientific and Technical Information of China (English)

    于江德; 王希杰; 樊孝忠

    2011-01-01

    The performance of Chinese word segmentation has been greatly improved in recent years by word-position-based approaches, which treat segmentation as a character-level word-position tagging problem. With the help of powerful sequence tagging models, word-position tagging has quickly become a mainstream technique in this field. The choice of feature templates and of the word-position tag set is crucial in this method. This work studies word-position tagging using different tag sets with a maximum entropy model. Closed evaluations were performed on corpora from the second international Chinese word segmentation bakeoff (Bakeoff-2005), with comparative experiments on different tag sets and feature templates. The results show that the six-tag word-position set combined with the feature template set TMPT-6 performs considerably better than the other tag sets.
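
The word-position formulation reduces segmentation to per-character tagging. A sketch of the tag-assignment step, assuming the six-tag set is the common (S, B, B2, B3, M, E) scheme from the bakeoff literature (the paper's exact TMPT-6 templates are not reproduced here):

```python
def word_position_tags(words):
    """Per-character word-position tags for a segmented sentence,
    using one common six-tag scheme: S = single-character word;
    B, B2, B3 = 1st/2nd/3rd character of a longer word; M = any
    later interior character; E = final character."""
    tags = []
    for w in words:
        n = len(w)
        if n == 1:
            tags.append('S')
            continue
        for i in range(n - 1):
            tags.append(('B', 'B2', 'B3')[i] if i < 3 else 'M')
        tags.append('E')
    return tags

print(word_position_tags(['我', '喜欢', '自然语言处理']))
# → ['S', 'B', 'E', 'B', 'B2', 'B3', 'M', 'M', 'E']
```

Training inverts this mapping: the tagger predicts a tag per character, and word boundaries are read off wherever an E or S tag closes a word.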

  6. Estimation of fine particulate matter in Taipei using landuse regression and bayesian maximum entropy methods.

    Science.gov (United States)

    Yu, Hwa-Lung; Wang, Chih-Hsih; Liu, Ming-Che; Kuo, Yi-Ming

    2011-06-01

    Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August, 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005 to 2007.

  7. Maximum entropy deconvolution of the optical jet of 3C 273

    Science.gov (United States)

    Evans, I. N.; Ford, H. C.; Hui, X.

    1989-01-01

    The technique of maximum entropy image restoration is applied to the problem of deconvolving the point spread function from a deep, high-quality V band image of the optical jet of 3C 273. The resulting maximum entropy image has an approximate spatial resolution of 0.6 arcsec and has been used to study the morphology of the optical jet. Four regularly-spaced optical knots are clearly evident in the data, together with an optical 'extension' at each end of the optical jet. The jet oscillates around its center of gravity, and the spatial scale of the oscillations is very similar to the spacing between the optical knots. The jet is marginally resolved in the transverse direction and has an asymmetric profile perpendicular to the jet axis. The distribution of V band flux along the length of the jet, and accurate astrometry of the optical knot positions are presented.

  8. In-medium dispersion relations of charmonia studied by the maximum entropy method

    Science.gov (United States)

    Ikeda, Atsuro; Asakawa, Masayuki; Kitazawa, Masakiyo

    2017-01-01

    We study in-medium spectral properties of charmonia in the vector and pseudoscalar channels at nonzero momenta on quenched lattices, especially focusing on their dispersion relation and the weight of the peak. We measure the lattice Euclidean correlation functions with nonzero momenta on the anisotropic quenched lattices and study the spectral functions with the maximum entropy method. The dispersion relations of charmonia and the momentum dependence of the weight of the peak are analyzed with the maximum entropy method together with the errors estimated probabilistically in this method. We find a significant increase of the masses of charmonia in medium. We also find that the functional form of the charmonium dispersion relations is not changed from that in the vacuum within the error even at T ≃1.6 Tc for all the channels we analyze.

  9. On the maximum-entropy method for kinetic equation of radiation, particle and gas

    Energy Technology Data Exchange (ETDEWEB)

    El-Wakil, S.A. [Mansoura Univ. (Egypt). Phys. Dept.; Madkour, M.A. [Mansoura Univ. (Egypt). Phys. Dept.; Degheidy, A.R. [Mansoura Univ. (Egypt). Phys. Dept.; Machali, H.M. [Mansoura Univ. (Egypt). Phys. Dept.

    1995-02-01

    The maximum-entropy approach is used to calculate some problems in radiative transfer and reactor physics such as the escape probability, the emergent and transmitted intensities for a finite slab as well as the emergent intensity for a semi-infinite medium. Also, it is employed to solve problems involving spherical geometry, such as luminosity (the total energy emitted by a sphere), neutron capture probability and the albedo problem. The technique is also employed in the kinetic theory of gases to calculate the Poiseuille flow and thermal creep of a rarefied gas between two plates. Numerical calculations are achieved and compared with the published data. The comparisons demonstrate that the maximum-entropy results are in good agreement with the exact ones. (orig.).

  10. In-medium dispersion relations of charmonia studied by maximum entropy method

    CERN Document Server

    Ikeda, Atsuro; Kitazawa, Masakiyo

    2016-01-01

    We study in-medium spectral properties of charmonia in the vector and pseudoscalar channels at nonzero momenta on quenched lattices, especially focusing on their dispersion relation and the weight of the peak. We measure the lattice Euclidean correlation functions with nonzero momenta on the anisotropic quenched lattices and study the spectral functions with the maximum entropy method. The dispersion relations of charmonia and the momentum dependence of the weight of the peak are analyzed with the maximum entropy method together with the errors estimated probabilistically in this method. We find a significant increase of the masses of charmonia in medium. It is also found that the functional form of the charmonium dispersion relations is not changed from that in the vacuum within the error even at $T\simeq1.6T_c$ for all the channels we analyzed.

  11. Maximum Theoretical Efficiency Limit of Photovoltaic Devices: Effect of Band Structure on Excited State Entropy.

    Science.gov (United States)

    Osterloh, Frank E

    2014-10-02

    The Shockley-Queisser analysis provides a theoretical limit for the maximum energy conversion efficiency of single junction photovoltaic cells. But besides the semiconductor bandgap no other semiconductor properties are considered in the analysis. Here, we show that the maximum conversion efficiency is limited further by the excited state entropy of the semiconductors. The entropy loss can be estimated with the modified Sackur-Tetrode equation as a function of the curvature of the bands, the degeneracy of states near the band edges, the illumination intensity, the temperature, and the band gap. The application of the second law of thermodynamics to semiconductors provides a simple explanation for the observed high performance of group IV, III-V, and II-VI materials with strong covalent bonding and for the lower efficiency of transition metal oxides containing weakly interacting metal d orbitals. The model also predicts efficient energy conversion with quantum confined and molecular structures in the presence of a light harvesting mechanism.

  12. Derivation of some new distributions in statistical mechanics using maximum entropy approach

    Directory of Open Access Journals (Sweden)

    Ray Amritansu

    2014-01-01

    Full Text Available The maximum entropy principle has been earlier used to derive the Bose-Einstein (B.E.), Fermi-Dirac (F.D.) and Intermediate Statistics (I.S.) distributions of statistical mechanics. The central idea of these distributions is to predict the distribution of the microstates, which are the particles of the system, on the basis of the knowledge of some macroscopic data. The latter information is specified in the form of some simple moment constraints. One distribution differs from the other in the way in which the constraints are specified. In the present paper, we have derived some new distributions similar to the B.E. and F.D. distributions of statistical mechanics by using the maximum entropy principle. Some proofs of the B.E. and F.D. distributions are shown, and at the end some new results are discussed.
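
As a concrete instance of the procedure this record describes, in its standard textbook form: maximizing the Bose-Einstein entropy of mean occupation numbers $n_i$ subject to fixed particle number and energy recovers the B.E. distribution,

```latex
\begin{aligned}
&\max_{\{n_i\}}\; S = \sum_i \big[(n_i+1)\ln(n_i+1) - n_i\ln n_i\big],
\qquad \text{s.t. } \sum_i n_i = N,\ \ \sum_i n_i\varepsilon_i = E,\\
&\frac{\partial}{\partial n_i}\Big[S - \alpha\sum_j n_j - \beta\sum_j n_j\varepsilon_j\Big] = 0
\;\Longrightarrow\; \ln\frac{n_i+1}{n_i} = \alpha + \beta\varepsilon_i
\;\Longrightarrow\; n_i = \frac{1}{e^{\alpha+\beta\varepsilon_i}-1}.
\end{aligned}
```

Replacing the entropy with $S=\sum_i\big[-n_i\ln n_i-(1-n_i)\ln(1-n_i)\big]$ and repeating the same steps gives the F.D. distribution $n_i = \big(e^{\alpha+\beta\varepsilon_i}+1\big)^{-1}$; new distributions arise from changing how the constraints are specified.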

  13. Exact computation of the Maximum Entropy Potential of spiking neural networks models

    CERN Document Server

    Cofre, Rodrigo

    2014-01-01

    Understanding how stimuli and synaptic connectivity influence the statistics of spike patterns in neural networks is a central question in computational neuroscience. The Maximum Entropy approach has been successfully used to characterize the statistical response of simultaneously recorded spiking neurons responding to stimuli. But, in spite of good performance in terms of prediction, the fitting parameters do not explain the underlying mechanistic causes of the observed correlations. On the other hand, mathematical models of spiking neurons (neuro-mimetic models) provide a probabilistic mapping between stimulus, network architecture and spike patterns in terms of conditional probabilities. In this paper we build an exact analytical mapping between neuro-mimetic and Maximum Entropy models.

  14. Hydrodynamic equations for electrons in graphene obtained from the maximum entropy principle

    Energy Technology Data Exchange (ETDEWEB)

    Barletti, Luigi, E-mail: luigi.barletti@unifi.it [Dipartimento di Matematica e Informatica “Ulisse Dini”, Università degli Studi di Firenze, Viale Morgagni 67/A, 50134 Firenze (Italy)

    2014-08-15

    The maximum entropy principle is applied to the formal derivation of isothermal, Euler-like equations for semiclassical fermions (electrons and holes) in graphene. After proving general mathematical properties of the equations so obtained, their asymptotic form corresponding to significant physical regimes is investigated. In particular, the diffusive regime, the Maxwell-Boltzmann regime (high temperature), the collimation regime and the degenerate gas limit (vanishing temperature) are considered.

  15. The Maximum Entropy Production Principle: Its Theoretical Foundations and Applications to the Earth System

    OpenAIRE

    J. G. Dyke; Kleidon, A.

    2010-01-01

    The Maximum Entropy Production (MEP) principle has been remarkably successful in producing accurate predictions for non-equilibrium states. We argue that this is because the MEP principle is an effective inference procedure that produces the best predictions from the available information. Since all Earth system processes are subject to the conservation of energy, mass and momentum, we argue that in practical terms the MEP principle should be applied to Earth system processes in terms of the ...

  16. Structural modelling and control design under incomplete parameter information: The maximum-entropy approach

    Science.gov (United States)

    Hyland, D. C.

    1983-01-01

    A stochastic structural control model is described. In contrast to the customary deterministic model, the stochastic minimum data/maximum entropy model directly incorporates the least possible a priori parameter information. The approach is to adopt this model as the basic design model, thus incorporating the effects of parameter uncertainty at a fundamental level, and design mean-square optimal controls (that is, choose the control law to minimize the average of a quadratic performance index over the parameter ensemble).

  17. Lattice Field Theory with the Sign Problem and the Maximum Entropy Method

    Directory of Open Access Journals (Sweden)

    Masahiro Imachi

    2007-02-01

    Full Text Available Although numerical simulation in lattice field theory is one of the most effective tools to study non-perturbative properties of field theories, it faces serious obstacles coming from the sign problem in some theories such as finite density QCD and lattice field theory with the θ term. We reconsider this problem from the point of view of the maximum entropy method.

  18. Application of the Maximum Entropy/optimal Projection Control Design Approach for Large Space Structures

    Science.gov (United States)

    Hyland, D. C.

    1985-01-01

    The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modelling and reduced order control design method for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed and the application of the methodology to several large space structure (LSS) problems of representative complexity is illustrated.

  19. Maximum Entropy/Optimal Projection (MEOP) control design synthesis: Optimal quantification of the major design tradeoffs

    Science.gov (United States)

    Hyland, D. C.; Bernstein, D. S.

    1987-01-01

    The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.

  20. Predictive Modeling and Mapping of Malayan Sun Bear (Helarctos malayanus) Distribution Using Maximum Entropy

    OpenAIRE

    Mona Nazeri; Kamaruzaman Jusoff; Nima Madani; Ahmad Rodzi Mahmud; Abdul Rani Bahman; Lalit Kumar

    2012-01-01

Species distribution models are among the available tools for mapping the geographical distribution and potentially suitable habitats of a species. These techniques are very helpful for finding poorly known distributions of species in poorly sampled areas, such as the tropics. Maximum Entropy (MaxEnt) is a recently developed modeling method that can be successfully calibrated using a relatively small number of records. In this research, the MaxEnt model was applied to describe the distribution and identify ...

  1. 一种基于最大熵模型的加权归纳迁移学习方法%A Weighted Algorithm of Inductive Transfer Learning Based on Maximum Entropy Model

    Institute of Scientific and Technical Information of China (English)

    梅灿华; 张玉红; 胡学钢; 李培培

    2011-01-01

Traditional machine learning and data mining algorithms assume that the training and test data lie in the same feature space and follow the same distribution. In real applications, however, data distributions change frequently, so these two assumptions are difficult to satisfy, and most traditional algorithms are no longer applicable, because they would require re-collecting and re-labeling large amounts of data, which is expensive and time consuming. As a new learning framework, transfer learning can effectively solve this problem by transferring knowledge learned from one or more source domains to a target domain. This paper focuses on one of the important branches of this field, namely inductive transfer learning, and proposes a weighted inductive transfer learning algorithm based on the maximum entropy model (WTLME). The algorithm transfers the parameters of the model learned from the source domain to the target domain and adjusts the weights of instances in the target domain, thereby obtaining a more accurate target-domain model while speeding up learning and achieving domain adaptation. Experimental results show the effectiveness of the algorithm.

  2. A pairwise maximum entropy model accurately describes resting-state human brain networks.

    Science.gov (United States)

    Watanabe, Takamitsu; Hirose, Satoshi; Wada, Hiroyuki; Imai, Yoshio; Machida, Toru; Shirouzu, Ichiro; Konishi, Seiki; Miyashita, Yasushi; Masuda, Naoki

    2013-01-01

    The resting-state human brain networks underlie fundamental cognitive functions and consist of complex interactions among brain regions. However, the level of complexity of the resting-state networks has not been quantified, which has prevented comprehensive descriptions of the brain activity as an integrative system. Here, we address this issue by demonstrating that a pairwise maximum entropy model, which takes into account region-specific activity rates and pairwise interactions, can be robustly and accurately fitted to resting-state human brain activities obtained by functional magnetic resonance imaging. Furthermore, to validate the approximation of the resting-state networks by the pairwise maximum entropy model, we show that the functional interactions estimated by the pairwise maximum entropy model reflect anatomical connexions more accurately than the conventional functional connectivity method. These findings indicate that a relatively simple statistical model not only captures the structure of the resting-state networks but also provides a possible method to derive physiological information about various large-scale brain networks.
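The pairwise model described above is an Ising-type model fitted by moment matching. A minimal illustrative sketch, fitting fields and couplings to synthetic binarized data by exact enumeration and Boltzmann learning (sizes and data are invented; real inputs would be thresholded fMRI signals):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for binarized resting-state activity: T samples of n "regions".
n, T = 4, 2000
X = (rng.random((T, n)) < 0.4).astype(float)

# All 2^n binary states, for exact (enumerative) evaluation of the model.
states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)

def model_moments(h, J):
    """<s_i> and <s_i s_j> under P(s) proportional to exp(h.s + 0.5 s'Js)."""
    E = states @ h + 0.5 * np.einsum('ki,ij,kj->k', states, J, states)
    p = np.exp(E - E.max())
    p /= p.sum()
    return p @ states, np.einsum('k,ki,kj->ij', p, states, states)

# Empirical moments the maximum entropy model must reproduce.
d1 = X.mean(axis=0)
d2 = (X.T @ X) / T

# Boltzmann learning: gradient ascent on the log-likelihood.
h, J = np.zeros(n), np.zeros((n, n))
for _ in range(5000):
    m1, m2 = model_moments(h, J)
    h += 0.2 * (d1 - m1)
    G = 0.2 * (d2 - m2)
    np.fill_diagonal(G, 0.0)   # the diagonal is absorbed into h for 0/1 states
    J += G

m1, m2 = model_moments(h, J)   # fitted moments, which should match d1, d2
```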

  3. Maximum entropy analytic continuation for frequency-dependent transport coefficients with nonpositive spectral weight

    Science.gov (United States)

    Reymbaut, A.; Gagnon, A.-M.; Bergeron, D.; Tremblay, A.-M. S.

    2017-03-01

The computation of transport coefficients, even in linear response, is a major challenge for theoretical methods that rely on analytic continuation of correlation functions obtained numerically in Matsubara space. While maximum entropy methods can be used for certain correlation functions, this is not possible in general, important examples being the Seebeck, Hall, Nernst, and Righi-Leduc coefficients. Indeed, positivity of the spectral weight on the positive real-frequency axis is not guaranteed in these cases. The spectral weight can even be complex in the presence of broken time-reversal symmetry. Various workarounds, such as the neglect of vertex corrections or the study of the infinite frequency or Kelvin limits, have been proposed. Here, we show that one can define auxiliary response functions that allow one to extract the desired real-frequency susceptibilities from maximum entropy methods in the most general multiorbital cases with no particular symmetry. As a benchmark case, we study the longitudinal thermoelectric response and corresponding Onsager coefficient in the single-band two-dimensional Hubbard model treated with dynamical mean-field theory and continuous-time quantum Monte Carlo. We thereby extend the maximum entropy analytic continuation with auxiliary functions (MaxEntAux method), developed for the study of the superconducting pairing dynamics of correlated materials, to transport coefficients.

  4. Maximum-Entropy Models of Sequenced Immune Repertoires Predict Antigen-Antibody Affinity.

    Science.gov (United States)

    Asti, Lorenzo; Uguzzoni, Guido; Marcatili, Paolo; Pagnani, Andrea

    2016-04-01

The immune system has developed a number of distinct complex mechanisms to shape and control the antibody repertoire. One of these mechanisms, the affinity maturation process, works in an evolutionary-like fashion: after binding to a foreign molecule, the antibody-producing B-cells exhibit a high-frequency mutation rate in the genome region that codes for the antibody active site. Eventually, cells that produce antibodies with higher affinity for their cognate antigen are selected and clonally expanded. Here, we propose a new statistical approach based on maximum entropy modeling in which a scoring function related to the binding affinity of antibodies against a specific antigen is inferred from a sample of sequences of the immune repertoire of an individual. We use our inference strategy to infer a statistical model on a data set obtained by sequencing a fairly large portion of the immune repertoire of an HIV-1 infected patient. The Pearson correlation coefficient between our scoring function and the IC50 neutralization titer measured on 30 different antibodies of known sequence is as high as 0.77 (p-value 10^-6), outperforming other sequence- and structure-based models.

  5. Revisiting the global surface energy budgets with maximum-entropy-production model of surface heat fluxes

    Science.gov (United States)

    Huang, Shih-Yu; Deng, Yi; Wang, Jingfeng

    2016-10-01

The maximum-entropy-production (MEP) model of surface heat fluxes, based on contemporary non-equilibrium thermodynamics, information theory, and atmospheric turbulence theory, is used to re-estimate the global surface heat fluxes. The surface fluxes predicted by the MEP model automatically balance the surface energy budgets at all time and space scales without the explicit use of near-surface temperature and moisture gradients, wind speed, or surface roughness data. The new MEP-based global annual mean fluxes over the land surface, using input data of surface radiation and temperature from the National Aeronautics and Space Administration Clouds and the Earth's Radiant Energy System (NASA CERES) supplemented by surface specific humidity data from the Modern-Era Retrospective Analysis for Research and Applications (MERRA), agree closely with previous estimates. The new estimate of ocean evaporation, not using the MERRA reanalysis data as model inputs, is lower than previous estimates, while the new estimate of ocean sensible heat flux is higher than previously reported. The MEP model also produces the first global map of ocean surface heat flux, which is not available from existing global reanalysis products.

  6. Maximum entropy estimation of a Benzene contaminated plume using ecotoxicological assays.

    Science.gov (United States)

    Wahyudi, Agung; Bartzke, Mariana; Küster, Eberhard; Bogaert, Patrick

    2013-01-01

Ecotoxicological bioassays, e.g. based on Danio rerio teratogenicity (DarT) or the acute luminescence inhibition with Vibrio fischeri, could potentially lead to significant benefits for detecting on-site contaminations on qualitative or semi-quantitative bases. The aim was to use the observed effects of two ecotoxicological assays for estimating the extent of a benzene groundwater contamination plume. We used a Maximum Entropy (MaxEnt) method to rebuild a bivariate probability table that links the observed toxicity from the bioassays with benzene concentrations. Compared with direct mapping of the contamination plume as obtained from groundwater samples, the MaxEnt concentration map exhibits on average slightly higher concentrations, though the global pattern is close to it. This suggests that MaxEnt is a valuable method for building a relationship between quantitative data, e.g. contaminant concentrations, and more qualitative or indirect measurements in a spatial mapping framework, which is especially useful when a clear quantitative relation is not at hand.
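The core MaxEnt step, choosing the distribution of maximum entropy subject to observed constraints, can be illustrated on a toy discrete variable with a single mean constraint (support and target are invented; the paper's bivariate toxicity-concentration table is more elaborate):

```python
import numpy as np

# Illustrative discrete support and mean constraint (invented values).
x = np.arange(6).astype(float)
target_mean = 1.5

def mean_of(lam):
    """Mean of the MaxEnt distribution p_k proportional to exp(-lam * x_k)."""
    w = np.exp(-lam * x)
    p = w / w.sum()
    return p @ x

# mean_of is strictly decreasing in lam, so bisection finds the Lagrange
# multiplier whose exponential-family distribution matches the mean.
lo, hi = -10.0, 10.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_of(mid) > target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
p = np.exp(-lam * x)
p /= p.sum()          # the MaxEnt distribution satisfying the constraint
```

With the target mean below the uniform mean, the solver returns a geometrically decaying distribution, the MaxEnt answer for a mean constraint on a bounded discrete support.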

  7. Revisiting the global surface energy budgets with maximum-entropy-production model of surface heat fluxes

    Science.gov (United States)

    Huang, Shih-Yu; Deng, Yi; Wang, Jingfeng

    2017-09-01

The maximum-entropy-production (MEP) model of surface heat fluxes, based on contemporary non-equilibrium thermodynamics, information theory, and atmospheric turbulence theory, is used to re-estimate the global surface heat fluxes. The surface fluxes predicted by the MEP model automatically balance the surface energy budgets at all time and space scales without the explicit use of near-surface temperature and moisture gradients, wind speed, or surface roughness data. The new MEP-based global annual mean fluxes over the land surface, using input data of surface radiation and temperature from the National Aeronautics and Space Administration Clouds and the Earth's Radiant Energy System (NASA CERES) supplemented by surface specific humidity data from the Modern-Era Retrospective Analysis for Research and Applications (MERRA), agree closely with previous estimates. The new estimate of ocean evaporation, not using the MERRA reanalysis data as model inputs, is lower than previous estimates, while the new estimate of ocean sensible heat flux is higher than previously reported. The MEP model also produces the first global map of ocean surface heat flux, which is not available from existing global reanalysis products.

  8. Maximum-Entropy Models of Sequenced Immune Repertoires Predict Antigen-Antibody Affinity.

    Directory of Open Access Journals (Sweden)

    Lorenzo Asti

    2016-04-01

Full Text Available The immune system has developed a number of distinct complex mechanisms to shape and control the antibody repertoire. One of these mechanisms, the affinity maturation process, works in an evolutionary-like fashion: after binding to a foreign molecule, the antibody-producing B-cells exhibit a high-frequency mutation rate in the genome region that codes for the antibody active site. Eventually, cells that produce antibodies with higher affinity for their cognate antigen are selected and clonally expanded. Here, we propose a new statistical approach based on maximum entropy modeling in which a scoring function related to the binding affinity of antibodies against a specific antigen is inferred from a sample of sequences of the immune repertoire of an individual. We use our inference strategy to infer a statistical model on a data set obtained by sequencing a fairly large portion of the immune repertoire of an HIV-1 infected patient. The Pearson correlation coefficient between our scoring function and the IC50 neutralization titer measured on 30 different antibodies of known sequence is as high as 0.77 (p-value 10^-6), outperforming other sequence- and structure-based models.

  9. Maximum Entropy Relief Feature Weighting%极大熵Relief特征加权

    Institute of Scientific and Technical Information of China (English)

    张翔; 邓赵红; 王士同; 蔡及时

    2011-01-01

A recent advance in Relief feature weighting techniques is that Relief can be approximately expressed as a margin maximization problem, so its distinctive properties can be investigated with the help of optimization theory. Although Relief feature weighting has been widely used, it lacks a mechanism for dealing with outliers, and how to enhance its robustness and adjustability in noisy environments is still not obvious. In order to enhance Relief's adjustability and robustness, more robust and adaptive Relief feature weighting algorithms are investigated by integrating the maximum entropy technique into Relief feature weighting. First, a new margin-based objective function integrating maximum entropy is proposed within the optimization framework, where two maximum entropy terms control the feature weights and the sample force coefficients, respectively. By applying optimization theory, several useful theoretical results are derived from the proposed objective function, and a set of robust Relief feature weighting algorithms is then developed for two-class data, multi-class data and online data. Experiments on UCI benchmark data sets and gene data sets show that the proposed Relief feature weighting algorithms display better adaptability and robustness to noise and outliers.
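The classical Relief weighting scheme that these algorithms build on can be sketched as follows (a textbook Relief for two-class data on synthetic features, not the paper's maximum entropy variant):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-class data: feature 0 carries the class signal, feature 1 is noise.
n = 200
y = rng.integers(0, 2, size=n)
X = np.column_stack([y + 0.1 * rng.standard_normal(n),
                     rng.standard_normal(n)])

def relief(X, y):
    """Classical Relief: reward features that differ on the nearest miss
    and agree on the nearest hit."""
    X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))  # scale to [0, 1]
    n = X.shape[0]
    w = np.zeros(X.shape[1])
    for i in range(n):
        d = np.abs(X - X[i]).sum(axis=1)        # L1 distances to sample i
        d[i] = np.inf                           # exclude the sample itself
        same, diff = y == y[i], y != y[i]
        hit = np.where(same)[0][np.argmin(d[same])]
        miss = np.where(diff)[0][np.argmin(d[diff])]
        w += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    return w / n

w = relief(X, y)   # the informative feature should get the larger weight
```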

  10. Entropy-based implied volatility and its information content

    NARCIS (Netherlands)

    X. Xiao (Xiao); C. Zhou (Chen)

    2016-01-01

This paper investigates the maximum entropy approach to estimating implied volatility. The entropy approach also allows one to measure option-implied skewness and kurtosis nonparametrically, and to construct confidence intervals. Simulations show that the entropy approach outperforms t

  11. New multifactor spatial prediction method based on Bayesian maximum entropy%基于贝叶斯最大熵的多因子空间属性预测新方法

    Institute of Scientific and Technical Information of China (English)

    杨勇; 张楚天; 贺立源

    2013-01-01

Summary The spatial distributions of soil properties (e.g., organic matter and heavy metal content) are vital to soil quality evaluation and regional environmental assessment. Currently, the spatial distribution of soil properties is usually predicted with classical geostatistics or environmental correlation, two methods that differ in theory. Geostatistics is based on the spatial correlation of sampling points, but it has some deficiencies, such as ineffective use of environmental information, the smoothing effect of predicted results, and the difficulty of satisfying the single-point-to-multipoint Gaussian distribution assumption. The theoretical basis of environmental correlation, on the other hand, is the relationship between soil and environment, but it ignores the spatial correlation among sampling points. The two methods complement each other, so it is important to study how to integrate them, so that both the spatial correlation among sampling points and the soil-environment relationship can be used to improve prediction accuracy. We propose a new spatial prediction method based on the theory of Bayesian maximum entropy (BME), one of the best-known modern spatiotemporal geostatistical techniques. The main objective is to incorporate the results of classical geostatistics and a quantitative soil-landscape model in the BME framework. The result of ordinary kriging is taken as the prior probability density function (pdf), the sampling data as hard data, and the results of environmental correlation as soft data. The posterior pdf is calculated from the prior pdf, hard data and soft data; according to the posterior pdf, predicted values at non-sampled points are obtained that contain both the spatial correlation between sample points and the relationship between soil properties and the environment. Meanwhile, the soil organic matter contents in

  12. Nonextensive random-matrix theory based on Kaniadakis entropy

    Energy Technology Data Exchange (ETDEWEB)

    Abul-Magd, A.Y. [Department of Mathematics, Faculty of Science, Zagazig University, Zagazig (Egypt)]. E-mail: a_y_abul_magd@hotmail.com

    2007-02-12

The joint eigenvalue distributions of random-matrix ensembles are derived by applying the principle of maximum entropy to the Renyi, Abe and Kaniadakis entropies. While the Renyi entropy produces essentially the same matrix-element distributions as the expression previously obtained using the Tsallis entropy, and the Abe entropy does not lead to a closed-form expression, the Kaniadakis entropy leads to a new generalized form of the Wigner surmise that describes a transition of the spacing distribution from chaos to order. This expression is compared with the corresponding expression obtained by assuming Tsallis' entropy, as well as with the results of a previous numerical experiment.

  13. Linearized semiclassical initial value time correlation functions with maximum entropy analytic continuation

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Jian; Miller, William H.

    2008-08-01

The maximum entropy analytic continuation (MEAC) method is used to extend the range of accuracy of the linearized semiclassical initial value representation (LSC-IVR)/classical Wigner approximation for real time correlation functions. The LSC-IVR provides a very effective 'prior' for the MEAC procedure since it is very good for short times, exact for all time and temperature for harmonic potentials (even for correlation functions of nonlinear operators), and becomes exact in the classical high temperature limit. This combined MEAC+LSC/IVR approach is applied here to two highly nonlinear dynamical systems, a pure quartic potential in one dimension and liquid para-hydrogen at two thermal state points (25K and 14K under nearly zero external pressure). The former example shows the MEAC procedure to be a very significant enhancement of the LSC-IVR, for correlation functions of both linear and nonlinear operators, and especially at low temperature where semiclassical approximations are least accurate. For liquid para-hydrogen, the LSC-IVR is seen already to be excellent at T = 25K, but the MEAC procedure produces a significant correction at the lower temperature (T = 14K). Comparisons are also made showing how the MEAC procedure provides corrections for other trajectory-based dynamical approximations when they are used as priors.

  14. Initial system-bath state via the maximum-entropy principle

    Science.gov (United States)

    Dai, Jibo; Len, Yink Loong; Ng, Hui Khoon

    2016-11-01

    The initial state of a system-bath composite is needed as the input for prediction from any quantum evolution equation to describe subsequent system-only reduced dynamics or the noise on the system from joint evolution of the system and the bath. The conventional wisdom is to write down an uncorrelated state as if the system and the bath were prepared in the absence of each other; yet, such a factorized state cannot be the exact description in the presence of system-bath interactions. Here, we show how to go beyond the simplistic factorized-state prescription using ideas from quantum tomography: We employ the maximum-entropy principle to deduce an initial system-bath state consistent with the available information. For the generic case of weak interactions, we obtain an explicit formula for the correction to the factorized state. Such a state turns out to have little correlation between the system and the bath, which we can quantify using our formula. This has implications, in particular, on the subject of subsequent non-completely positive dynamics of the system. Deviation from predictions based on such an almost uncorrelated state is indicative of accidental control of hidden degrees of freedom in the bath.

  15. Estimation of Groundwater Radon in North Carolina Using Land Use Regression and Bayesian Maximum Entropy.

    Science.gov (United States)

    Messier, Kyle P; Campbell, Ted; Bradley, Philip J; Serre, Marc L

    2015-08-18

Radon ((222)Rn) is a naturally occurring, chemically inert, colorless, and odorless radioactive gas produced from the decay of uranium ((238)U), which is ubiquitous in rocks and soils worldwide. Exposure to (222)Rn via inhalation is likely the second leading cause of lung cancer after cigarette smoking; however, exposure through untreated groundwater also contributes to both inhalation and ingestion routes. A land use regression (LUR) model for groundwater (222)Rn with anisotropic geological and (238)U-based explanatory variables is developed, which helps elucidate the factors contributing to elevated (222)Rn across North Carolina. The LUR is also integrated into the Bayesian Maximum Entropy (BME) geostatistical framework to increase accuracy and produce a point-level LUR-BME model of groundwater (222)Rn across North Carolina, including prediction uncertainty. The LUR-BME model of groundwater (222)Rn results in a leave-one-out cross-validation r(2) of 0.46 (Pearson correlation coefficient = 0.68), effectively predicting within the spatial covariance range. Modeled (222)Rn concentrations show variability among intrusive felsic geological formations, likely due to average bedrock (238)U defined on the basis of overlying stream-sediment (238)U concentrations, which are widely distributed and consistently analyzed point-source data.

  16. Maximum entropy production allows a simple representation of heterogeneity in semiarid ecosystems.

    Science.gov (United States)

    Schymanski, Stanislaus J; Kleidon, Axel; Stieglitz, Marc; Narula, Jatin

    2010-05-12

    Feedbacks between water use, biomass and infiltration capacity in semiarid ecosystems have been shown to lead to the spontaneous formation of vegetation patterns in a simple model. The formation of patterns permits the maintenance of larger overall biomass at low rainfall rates compared with homogeneous vegetation. This results in a bias of models run at larger scales neglecting subgrid-scale variability. In the present study, we investigate the question whether subgrid-scale heterogeneity can be parameterized as the outcome of optimal partitioning between bare soil and vegetated area. We find that a two-box model reproduces the time-averaged biomass of the patterns emerging in a 100 x 100 grid model if the vegetated fraction is optimized for maximum entropy production (MEP). This suggests that the proposed optimality-based representation of subgrid-scale heterogeneity may be generally applicable to different systems and at different scales. The implications for our understanding of self-organized behaviour and its modelling are discussed.

  17. Two dimensional IR-FID-CPMG acquisition and adaptation of a maximum entropy reconstruction

    Science.gov (United States)

    Rondeau-Mouro, C.; Kovrlija, R.; Van Steenberge, E.; Moussaoui, S.

    2016-04-01

By acquiring the FID signal in two-dimensional TD-NMR spectroscopy, it is possible to characterize mixtures or complex samples composed of solid and liquid phases. We have developed a new sequence for this purpose, called IR-FID-CPMG, making it possible to correlate spin-lattice T1 and spin-spin T2 relaxation times, including both liquid and solid phases in samples. We demonstrate here the potential of a new algorithm for the 2D inverse Laplace transformation of IR-FID-CPMG data based on an adapted reconstruction of the maximum entropy method, combining the standard decreasing exponential decay function with an additional term drawn from Abragam's FID function. The results show that the proposed IR-FID-CPMG sequence and its related inversion model allow accurate characterization and quantification of both solid and liquid phases in multiphasic and compartmentalized systems. Moreover, it makes it possible to distinguish between solid phases having different T1 relaxation times and to highlight cross-relaxation phenomena.

  18. Maximum Entropy Estimation of Probability Distribution of Variables in Higher Dimensions from Lower Dimensional Data

    Directory of Open Access Journals (Sweden)

    Jayajit Das '

    2015-07-01

Full Text Available A common statistical situation concerns inferring an unknown distribution Q(x) from a known distribution P(y), where X (dimension n) and Y (dimension m) have a known functional relationship. Most commonly, n ≤ m, and the task is relatively straightforward for well-defined functional relationships. For example, if Y1 and Y2 are independent random variables, each uniform on [0, 1], one can determine the distribution of X = Y1 + Y2; here m = 2 and n = 1. However, biological and physical situations can arise where n > m and the functional relation Y→X is non-unique. In general, in the absence of additional information, there is no unique solution to Q in those cases. Nevertheless, one may still want to draw some inferences about Q. To this end, we propose a novel maximum entropy (MaxEnt) approach that estimates Q(x) based only on the available data, namely, P(y). The method has the additional advantage that one does not need to explicitly calculate the Lagrange multipliers. In this paper we develop the approach, for both discrete and continuous probability distributions, and demonstrate its validity. We give an intuitive justification as well, and we illustrate with examples.
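The abstract's simple forward example, the distribution of X = Y1 + Y2 for independent uniforms, can be checked numerically by discretizing and convolving (grid size is an arbitrary choice):

```python
import numpy as np

# Discretize two independent Uniform[0,1] variables on an m-point grid and
# convolve their pmfs to get the (triangular) distribution of X = Y1 + Y2.
m = 1000
p = np.full(m, 1.0 / m)            # pmf of each Y on the grid {0, 1/m, ...}
px = np.convolve(p, p)             # pmf of X on the grid {0, 1/m, ..., 2(m-1)/m}
xs = np.arange(px.size) / m

mean_x = px @ xs                   # close to E[Y1] + E[Y2] = 1
mode_x = xs[np.argmax(px)]         # the triangular density peaks near 1
```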

  19. A Maximum Entropy Approach to Assess Debonding in Honeycomb aluminum Plates

    Directory of Open Access Journals (Sweden)

    Viviana Meruane

    2014-05-01

Full Text Available Honeycomb sandwich structures are used in a wide variety of applications. Nevertheless, due to manufacturing defects or impact loads, these structures can be subject to imperfect bonding or debonding between the skin and the honeycomb core. The presence of debonding reduces the bending stiffness of the composite panel, which causes detectable changes in its vibration characteristics. This article presents a new supervised learning algorithm to identify debonded regions in aluminum honeycomb panels. The algorithm uses a linear approximation method handled by a statistical inference model based on the maximum-entropy principle. The merits of this new approach are twofold: training is avoided, and data is processed in a period of time comparable to that of neural networks. The honeycomb panels are modeled with finite elements using a simplified three-layer shell model. The adhesive layer between the skin and core is modeled using linear springs, the rigidities of which are reduced in debonded sectors. The algorithm is validated using experimental data of an aluminum honeycomb panel under different damage scenarios.

  20. Causal nexus between energy consumption and carbon dioxide emission for Malaysia using maximum entropy bootstrap approach.

    Science.gov (United States)

    Gul, Sehrish; Zou, Xiang; Hassan, Che Hashim; Azam, Muhammad; Zaman, Khalid

    2015-12-01

This study investigates the relationship between energy consumption and carbon dioxide emission in a causal framework, as the direction of causality has significant policy implications for developed and developing countries. The study employed the maximum entropy bootstrap (Meboot) approach to examine the causal nexus between energy consumption and carbon dioxide emission in bivariate as well as multivariate frameworks for Malaysia over the period 1975-2013. This is a unified approach that does not require conventional techniques based on asymptotic theory, such as testing for possible unit roots and cointegration. In addition, it can be applied in the presence of non-stationarity of any type, including structural breaks, without any data transformation to achieve stationarity. Thus, it provides more reliable and robust inferences, which are insensitive to the time span as well as the lag length used. The empirical results show that there is a unidirectional causality running from energy consumption to carbon emission both in the bivariate model and in the multivariate framework, while controlling for broad money supply and population density. The results indicate that Malaysia is an energy-dependent country and hence energy is a stimulus to carbon emissions.
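A simplified sketch of how a single maximum entropy bootstrap replicate is generated (following the general outline of Vinod's meboot but omitting its mean-preserving tail adjustments; knot placement here is an assumption):

```python
import numpy as np

rng = np.random.default_rng(2)

def meboot_replicate(x, rng):
    """One maximum entropy bootstrap replicate (simplified sketch)."""
    n = x.size
    order = np.argsort(x)
    s = x[order]                                   # order statistics
    # n+1 knots of a piecewise-uniform maximum entropy density:
    # the extremes plus midpoints between successive order statistics.
    z = np.concatenate(([s[0]], 0.5 * (s[:-1] + s[1:]), [s[-1]]))
    u = np.sort(rng.random(n))                     # sorted uniform draws
    q = np.interp(u, np.linspace(0.0, 1.0, n + 1), z)  # quantile transform
    y = np.empty(n)
    y[order] = q                                   # keep the original rank order
    return y

x = np.cumsum(rng.standard_normal(60))             # toy nonstationary series
y = meboot_replicate(x, rng)
```

The replicate preserves the rank ordering (and hence the broad time-shape) of the original series while resampling its values, which is what makes the scheme usable on nonstationary data.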

  1. Using maximum entropy model to predict protein secondary structure with single sequence.

    Science.gov (United States)

    Ding, Yong-Sheng; Zhang, Tong-Liang; Gu, Quan; Zhao, Pei-Ying; Chou, Kuo-Chen

    2009-01-01

Prediction of protein secondary structure is somewhat reminiscent of the efforts of many previous investigators, but it is still worth revisiting owing to its importance in protein science. Several studies indicate that knowledge of protein structural classes can provide useful information towards the determination of protein secondary structure. In particular, the performance of recently developed prediction algorithms has improved rapidly by incorporating homologous multiple sequence alignment information. Unfortunately, this kind of information is not available for a significant number of proteins. In view of this, it is necessary to develop methods based on the query protein sequence alone, so-called single-sequence methods. Here, we propose a novel single-sequence approach in which various kinds of contextual information are taken into account and a maximum entropy model classifier is used as the prediction engine. As a demonstration, cross-validation tests have been performed by the new method on datasets containing proteins from different structural classes, and the results thus obtained are quite promising, indicating that the new method may become a useful tool in protein science or at least play a complementary role to the existing protein secondary structure prediction methods.
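A maximum entropy classifier over discrete features is equivalent to multinomial logistic regression. A self-contained toy sketch (synthetic binary features standing in for contextual information; sizes and data are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for "contextual features -> structure class":
# 3 classes, 5 binary features.
n, d, k = 600, 5, 3
X = (rng.random((n, d)) < 0.5).astype(float)
true_W = rng.standard_normal((d, k))
# Gumbel-perturbed argmax labels are consistent with a logit model.
y = (X @ true_W + 0.5 * rng.gumbel(size=(n, k))).argmax(axis=1)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# A maximum entropy classifier is multinomial logistic regression:
# gradient ascent on the conditional log-likelihood.
W = np.zeros((d, k))
Y = np.eye(k)[y]                         # one-hot labels
for _ in range(2000):
    P = softmax(X @ W)
    W += 0.5 * (X.T @ (Y - P)) / n

acc = (softmax(X @ W).argmax(axis=1) == y).mean()
```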

  2. STOCHASTIC ANALYSIS OF RANDOM AD HOC NETWORKS WITH MAXIMUM ENTROPY DEPLOYMENTS

    Directory of Open Access Journals (Sweden)

    Thomas Bourgeois

    2014-10-01

    Full Text Available In this paper, we present the first stochastic analysis of the link performance of an ad hoc network modelled by a single homogeneous Poisson point process (HPPP). According to the maximum entropy principle, the single HPPP model is mathematically the best model for random deployments with a given node density. However, previous works in the literature only consider a modified model which shows a discrepancy in the interference distribution with respect to the more suitable single HPPP model. The main contributions of this paper are as follows. (1) It presents a new mathematical framework leading to closed-form expressions for the probability of success of both one-way transmissions and handshakes for a deployment modelled by a single HPPP. Our approach, based on stochastic geometry, can be extended to complex protocols. (2) From the obtained results, all confirmed by comparison to simulated data, optimal PHY and MAC layer parameters are determined and the relations between them are described in detail. (3) The influence of the routing protocol on handshake performance is taken into account in a realistic manner, leading to confirmation of the intuitive result that the effect of imperfect feedback on the probability of success of a handshake is negligible only for transmissions to the first neighbour node.

  3. Computational design of hepatitis C vaccines using maximum entropy models and population dynamics

    Science.gov (United States)

    Hart, Gregory; Ferguson, Andrew

    Hepatitis C virus (HCV) afflicts 170 million people and kills 350,000 annually. Vaccination offers the most realistic and cost effective hope of controlling this epidemic. Despite 20 years of research, no vaccine is available. A major obstacle is the virus' extreme genetic variability and rapid mutational escape from immune pressure. Improvements in the vaccine design process are urgently needed. Coupling data mining with spin glass models and maximum entropy inference, we have developed a computational approach to translate sequence databases into empirical fitness landscapes. These landscapes explicitly connect viral genotype to phenotypic fitness and reveal vulnerable targets that can be exploited to rationally design immunogens. Viewing these landscapes as the mutational ''playing field'' over which the virus is constrained to evolve, we have integrated them with agent-based models of the viral mutational and host immune response dynamics, establishing a data-driven immune simulator of HCV infection. We have employed this simulator to perform in silico screening of HCV immunogens. By systematically identifying a small number of promising vaccine candidates, these models can accelerate the search for a vaccine by massively reducing the experimental search space.

  4. Applications of the principle of maximum entropy: from physics to ecology.

    Science.gov (United States)

    Banavar, Jayanth R; Maritan, Amos; Volkov, Igor

    2010-02-17

    There are numerous situations in physics and other disciplines which can be described at different levels of detail in terms of probability distributions. Such descriptions arise either intrinsically as in quantum mechanics, or because of the vast amount of details necessary for a complete description as, for example, in Brownian motion and in many-body systems. We show that an application of the principle of maximum entropy for estimating the underlying probability distribution can depend on the variables used for describing the system. The choice of characterization of the system carries with it implicit assumptions about fundamental attributes such as whether the system is classical or quantum mechanical or equivalently whether the individuals are distinguishable or indistinguishable. We show that the correct procedure entails the maximization of the relative entropy subject to known constraints and, additionally, requires knowledge of the behavior of the system in the absence of these constraints. We present an application of the principle of maximum entropy to understanding species diversity in ecology and introduce a new statistical ensemble corresponding to the distribution of a variable population of individuals into a set of species not defined a priori.
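    A minimal worked example of the principle, not taken from the paper: the maximum entropy distribution on die faces {1,...,6} subject only to a prescribed mean of 4.5 (Jaynes' classic example) has the exponential form p_i ∝ exp(-λ x_i), with λ fixed by the mean constraint. Here λ is found by bisection.

```python
import numpy as np

x = np.arange(1, 7)   # die faces

def mean_for(lam):
    """Mean of the maxent distribution p_i ∝ exp(-lam * x_i)."""
    w = np.exp(-lam * x)
    p = w / w.sum()
    return (p * x).sum()

# mean_for is decreasing in lam; bracket the root of mean_for(lam) = 4.5
lo, hi = -5.0, 5.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) > 4.5:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)

p = np.exp(-lam * x)
p /= p.sum()
print(np.round(p, 4))   # weights increase toward the high faces, since 4.5 > 3.5
```

The same recipe (exponential family weights, multipliers fixed by the constraints) generalizes to any set of linear expectation constraints, which is the computation underlying applications such as the species-abundance ensembles discussed in the abstract.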

  5. Polyatomic gases with dynamic pressure: Maximum entropy principle and shock structure

    CERN Document Server

    Pavić-Čolić, Milana; Simić, Srboljub

    2016-01-01

    This paper is concerned with the analysis of polyatomic gases within the framework of kinetic theory. Internal degrees of freedom are modeled using a single continuous variable corresponding to the molecular internal energy. A non-equilibrium velocity distribution function, compatible with the macroscopic field variables, is constructed using the maximum entropy principle. A proper collision cross section is constructed which obeys the micro-reversibility requirement. The source term and entropy production rate are determined in a form that generalizes the results obtained within the framework of extended thermodynamics; owing to the presence of parameters, they can be adapted to appropriate physical situations. They are also compared with the results obtained using the BGK approximation. For the proposed model the shock structure problem is thoroughly analyzed.

  6. Source extension based on ε-entropy

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jian; YU Sheng-sheng; ZHOU Jing-li; ZHENG Xin-wei

    2005-01-01

    It is known from entropy theory that an image is a source correlated with a certain probability characteristic. The entropy rate of the source and the ε-entropy (rate-distortion function) are the information content that identifies the characteristics of video images, and hence are essentially related to video image compression. They are fundamental theories of great significance to image compression, though they cannot be directly turned into a compression method. Based on entropy theory and image compression theory, by applying the rate-distortion mathematical model and Lagrange multipliers to some theoretical problems in the H.264 standard, this paper presents a new rate-distortion coding algorithm model. This model was put through a complete test of the capability of the JM61e test model (JVT Test Model). The result shows that the coding speed increases without significant reduction of the rate-distortion performance of the coder.

  7. Entropy Evaluation Based on Value Validity

    Directory of Open Access Journals (Sweden)

    Tarald O. Kvålseth

    2014-09-01

    Full Text Available Besides its importance in statistical physics and information theory, the Boltzmann-Shannon entropy S has become one of the most widely used, and misused, summary measures of various attributes (characteristics) in diverse fields of study. It has also been the subject of extensive and perhaps excessive generalization. This paper introduces the concept of, and criteria for, value validity as a means of determining whether an entropy takes on values that reasonably reflect the attribute being measured and that permit different types of comparisons to be made between probability distributions. While neither S nor its relative entropy equivalent S* meets the value-validity conditions, certain power functions of S and S* do to a considerable extent. No parametric generalization offers any advantage over S in this regard. A measure based on Euclidean distances between probability distributions is introduced as a potential entropy that complies fully with the value-validity requirements, and its statistical inference procedure is discussed.
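    For reference, the two basic quantities the abstract compares can be computed directly: the Boltzmann-Shannon entropy S and its normalized ("relative") form S* = S / log(n), which lies in [0, 1] for a distribution over n outcomes. The distributions below are arbitrary examples.

```python
import numpy as np

def shannon_entropy(p):
    """Boltzmann-Shannon entropy S = -sum p_i log p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 log 0 = 0
    return -(p * np.log(p)).sum()

p_uniform = np.full(4, 0.25)
p_skewed = np.array([0.7, 0.1, 0.1, 0.1])

S_u = shannon_entropy(p_uniform)      # equals log(4), the maximum for n = 4
S_s = shannon_entropy(p_skewed)

# Normalized entropies S* = S / log(n): 1.0 for uniform, < 1 for skewed.
print(S_u / np.log(4), S_s / np.log(4))
```

The value-validity question raised in the paper is whether S or S*, or some transform of them, changes in a way that tracks intuitive judgments of "evenness" between such distributions.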

  8. Steepest entropy ascent model for far-nonequilibrium thermodynamics: unified implementation of the maximum entropy production principle.

    Science.gov (United States)

    Beretta, Gian Paolo

    2014-10-01

    By suitable reformulations, we cast the mathematical frameworks of several well-known different approaches to the description of nonequilibrium dynamics into a unified formulation valid in all these contexts, which extends to such frameworks the concept of steepest entropy ascent (SEA) dynamics introduced by the present author in previous works on quantum thermodynamics. Actually, the present formulation constitutes a generalization also for the quantum thermodynamics framework. The analysis emphasizes that in the SEA modeling principle a key role is played by the geometrical metric with respect to which to measure the length of a trajectory in state space. In the near-thermodynamic-equilibrium limit, the metric tensor is directly related to the Onsager's generalized resistivity tensor. Therefore, through the identification of a suitable metric field which generalizes the Onsager generalized resistance to the arbitrarily far-nonequilibrium domain, most of the existing theories of nonequilibrium thermodynamics can be cast in such a way that the state exhibits the spontaneous tendency to evolve in state space along the path of SEA compatible with the conservation constraints and the boundary conditions. The resulting unified family of SEA dynamical models is intrinsically and strongly consistent with the second law of thermodynamics. The non-negativity of the entropy production is a general and readily proved feature of SEA dynamics. In several of the different approaches to nonequilibrium description we consider here, the SEA concept has not been investigated before. We believe it defines the precise meaning and the domain of general validity of the so-called maximum entropy production principle. 
Therefore, it is hoped that the present unifying approach may prove useful in providing a fresh basis for effective, thermodynamically consistent numerical models and theoretical treatments of irreversible conservative relaxation towards equilibrium from far-nonequilibrium states.

  9. The Maximum Entropy Approach to Record Abbreviation for Optimal Record Control.

    Science.gov (United States)

    Goyal, P.

    1983-01-01

    Tests performed on 6,260 titles from 3 machine-readable British National Bibliography files using an entropy based technique for abbreviation of text strings for use as a control code found that more than 94 percent of the titles generated a unique seven character code. Six references and an illustrative example are appended. (EJS)
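    The report's abstract does not spell out its abbreviation algorithm, so the following is only a plausible entropy-motivated sketch: keep a title's most informative characters, i.e. the rarest ones across the corpus, truncated to a seven-character control code. The titles and scoring here are illustrative, not from the study.

```python
from collections import Counter

titles = [
    "maximum entropy spectral analysis",
    "kernel based clustering methods",
    "stochastic models of epidemics",
]
# Corpus-wide character frequencies; rare characters carry more information.
freq = Counter(c for t in titles for c in t if c != " ")

def abbreviate(title, length=7):
    """Keep the `length` rarest characters, in their original order."""
    chars = [c for c in title if c != " "]
    # Rank positions by character rarity, breaking ties by position.
    ranked = sorted(range(len(chars)), key=lambda i: (freq[chars[i]], i))
    keep = sorted(ranked[:length])           # restore original order
    return "".join(chars[i] for i in keep)

codes = [abbreviate(t) for t in titles]
print(codes)
```

Uniqueness of such codes over a real file (the >94 percent figure reported) would depend on the corpus statistics; on this tiny example the three codes happen to be distinct.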

  10. Entropy-based Probabilistic Fatigue Damage Prognosis and Algorithmic Performance Comparison

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, a maximum entropy-based general framework for probabilistic fatigue damage prognosis is investigated. The proposed methodology is based on an...

  12. Entropy-based Statistical Analysis of PolSAR Data

    CERN Document Server

    Frery, Alejandro C; Nascimento, Abraão D C

    2012-01-01

    Images obtained from coherent illumination processes are contaminated with speckle noise, with polarimetric synthetic aperture radar (PolSAR) imagery as a prominent example. With an adequacy widely attested in the literature, the scaled complex Wishart distribution is an acceptable model for PolSAR data. In this perspective, we derive analytic expressions for the Shannon, Rényi, and restricted Tsallis entropies under this model. Relationships between the derived measures and the parameters of the scaled Wishart law (i.e., the equivalent number of looks and the covariance matrix) are discussed. In addition, we obtain the asymptotic variances of the Shannon and Rényi entropies when replacing distribution parameters by maximum likelihood estimators. As a consequence, confidence intervals based on these two entropies are also derived and proposed as new ways of capturing contrast. New hypothesis tests are additionally proposed using these results, and their performance is assessed using simulated and real data.

  13. Maximum Entropy Production Modeling of Evapotranspiration Partitioning on Heterogeneous Terrain and Canopy Cover: advantages and limitations.

    Science.gov (United States)

    Gutierrez-Jurado, H. A.; Guan, H.; Wang, J.; Wang, H.; Bras, R. L.; Simmons, C. T.

    2015-12-01

    Quantification of evapotranspiration (ET) and its partitioning over regions of heterogeneous topography and canopy poses a challenge for traditional approaches. In this study, we report the results of a novel field experiment design guided by the Maximum Entropy Production model of ET (MEP-ET), formulated for estimating evaporation and transpiration from homogeneous soil and canopy. A catchment with complex terrain and patchy vegetation in South Australia was instrumented to measure temperature, humidity and net radiation at soil and canopy surfaces. The performance of the MEP-ET model in quantifying transpiration and soil evaporation was evaluated during wet and dry conditions against independently and directly measured transpiration from sapflow and soil evaporation from the Bowen Ratio Energy Balance (BREB) method. MEP-ET transpiration shows remarkable agreement with that obtained through sapflow measurements during wet conditions, but consistently overestimates the flux during dry periods. However, an additional term introduced into the original MEP-ET model to account for stronger stomatal regulation during dry spells, based on differences between leaf and air vapor pressure deficits and temperatures, significantly improves the model performance. On the other hand, MEP-ET soil evaporation is in good agreement with that from BREB regardless of moisture conditions. The experimental design allows plot-scale quantification of evaporation and tree-scale quantification of transpiration. This study confirms for the first time that the MEP-ET, originally developed for homogeneous open bare soil and closed canopy, can be used for modeling ET over heterogeneous land surfaces. Furthermore, we show that with the addition of an empirical function simulating the plants' ability to regulate transpiration, based on the same measurements of temperature and humidity, the method can produce reliable estimates of ET during both wet and dry conditions without compromising its parsimony.

  14. A subjective supply-demand model: the maximum Boltzmann/Shannon entropy solution

    Science.gov (United States)

    Piotrowski, Edward W.; Sładkowski, Jan

    2009-03-01

    The present authors have put forward a projective geometry model of rational trading. The expected (mean) value of the time that is necessary to strike a deal and the profit strongly depend on the strategies adopted. A frequent trader often prefers maximal profit intensity to the maximization of profit resulting from a separate transaction because the gross profit/income is the adopted/recommended benchmark. To investigate activities that have different periods of duration we define, following the queuing theory, the profit intensity as a measure of this economic category. The profit intensity in repeated trading has a unique property of attaining its maximum at a fixed point regardless of the shape of demand curves for a wide class of probability distributions of random reverse transactions (i.e. closing of the position). These conclusions remain valid for an analogous model based on supply analysis. This type of market game is often considered in research aiming at finding an algorithm that maximizes profit of a trader who negotiates prices with the Rest of the World (a collective opponent), possessing a definite and objective supply profile. Such idealization neglects the sometimes important influence of an individual trader on the demand/supply profile of the Rest of the World and in extreme cases questions the very idea of demand/supply profile. Therefore we put forward a trading model in which the demand/supply profile of the Rest of the World induces the (rational) trader to (subjectively) presume that he/she lacks (almost) all knowledge concerning the market but his/her average frequency of trade. This point of view introduces maximum entropy principles into the model and broadens the range of economic phenomena that can be perceived as a sort of thermodynamical system. 
As a consequence, the profit intensity has a fixed point with an astonishing connection with Fibonacci classical works and looking for the quickest algorithm for obtaining the extremum of a

  15. Direct comparison of phase-sensitive vibrational sum frequency generation with maximum entropy method: case study of water.

    Science.gov (United States)

    de Beer, Alex G F; Samson, Jean-Sebastièn; Hua, Wei; Huang, Zishuai; Chen, Xiangke; Allen, Heather C; Roke, Sylvie

    2011-12-14

    We present a direct comparison of phase sensitive sum-frequency generation experiments with phase reconstruction obtained by the maximum entropy method. We show that both methods lead to the same complex spectrum. Furthermore, we discuss the strengths and weaknesses of each of these methods, analyzing possible sources of experimental and analytical errors. A simulation program for maximum entropy phase reconstruction is available at: http://lbp.epfl.ch/.

  16. The SIS and SIR stochastic epidemic models: a maximum entropy approach.

    Science.gov (United States)

    Artalejo, J R; Lopez-Herrero, M J

    2011-12-01

    We analyze the dynamics of infectious disease spread by formulating the maximum entropy (ME) solutions of the susceptible-infected-susceptible (SIS) and the susceptible-infected-removed (SIR) stochastic models. Several scenarios providing helpful insight into the use of the ME formalism for epidemic modeling are identified. The ME results are illustrated with respect to several descriptors, including the number of recovered individuals and the time to extinction. An application to infectious data from outbreaks of extended spectrum beta lactamase (ESBL) in a hospital is also considered.
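    The ME solutions themselves are not reproduced here, but the underlying SIS stochastic model is straightforward to simulate. A minimal Gillespie-style run collecting the time-to-extinction descriptor mentioned in the abstract, with hypothetical rates, might look like:

```python
import random

def sis_extinction_time(N=50, I0=5, beta=0.1, gamma=0.2, rng=None):
    """Simulate one SIS epidemic until extinction; return the extinction time.

    beta: per-contact infection rate, gamma: recovery rate (hypothetical values).
    """
    rng = rng or random.Random(0)
    t, I = 0.0, I0
    while I > 0:
        infect = beta * I * (N - I) / N     # S -> I event rate
        recover = gamma * I                 # I -> S event rate
        total = infect + recover
        t += rng.expovariate(total)         # time to next event
        I += 1 if rng.random() < infect / total else -1
    return t

# With gamma > beta the epidemic is subcritical, so extinction is fast.
times = [sis_extinction_time(rng=random.Random(s)) for s in range(200)]
print(sum(times) / len(times))   # Monte Carlo mean time to extinction
```

An empirical distribution of such extinction times is exactly the kind of descriptor that the ME formalism in the paper summarizes from a few moments.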

  17. A homotopy algorithm for synthesizing robust controllers for flexible structures via the maximum entropy design equations

    Science.gov (United States)

    Collins, Emmanuel G., Jr.; Richter, Stephen

    1990-01-01

    One well known deficiency of LQG compensators is that they do not guarantee any measure of robustness. This deficiency is especially highlighted when considering control design for complex systems such as flexible structures. There has thus been a need to generalize LQG theory to incorporate robustness constraints. Here we describe the maximum entropy approach to robust control design for flexible structures, a generalization of LQG theory, pioneered by Hyland, which has proved useful in practice. The design equations consist of a set of coupled Riccati and Lyapunov equations. A homotopy algorithm that is used to solve these design equations is presented.

  18. The Maximum Entropy Spectrum and the Burg Technique. Technical Report No. 1: Advanced Signal Processing

    Science.gov (United States)

    1975-06-25

    conjugates of the roots of AH V. Thus the forward prediction error filter is a minimum phase filter. Since its output does not precede any of its input points...circle. The inverse of the forward prediction error filter is also a causal minimum phase filter. The inverse filter can be used to construct the...filter is a maximum phase filter (a minimum phase filter if the direction of time is reversed). When the maximum entropy assumption is valid, it
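    The Burg technique the report analyzes estimates AR coefficients by minimizing the summed forward and backward prediction error power, which guarantees the minimum-phase prediction error filter discussed above. A compact sketch of the standard textbook recursion (not the report's own code):

```python
import numpy as np

def burg(x, order):
    """Burg's method: return prediction error filter a (a[0] = 1) and error power."""
    x = np.asarray(x, dtype=float)
    f = x.copy()          # forward prediction errors
    b = x.copy()          # backward prediction errors
    a = np.array([1.0])   # prediction error filter coefficients
    E = float(np.mean(x ** 2))
    for _ in range(order):
        fk, bk = f[1:], b[:-1]
        # reflection coefficient minimizing forward + backward error power
        k = -2.0 * np.dot(fk, bk) / (np.dot(fk, fk) + np.dot(bk, bk))
        a_ext = np.concatenate([a, [0.0]])
        a = a_ext + k * a_ext[::-1]          # Levinson-style update
        f, b = fk + k * bk, bk + k * fk      # update error sequences
        E *= 1.0 - k * k
    return a, E

# AR(1) test signal x[n] = 0.9 x[n-1] + noise; Burg should recover ~0.9.
rng = np.random.default_rng(1)
x = np.zeros(2000)
for n in range(1, len(x)):
    x[n] = 0.9 * x[n - 1] + rng.normal()
a, E = burg(x, order=1)
print(a)   # approximately [1.0, -0.9]
```

Because every reflection coefficient satisfies |k| < 1, the resulting filter has all roots inside the unit circle, i.e. it is minimum phase, as the report's analysis requires.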

  19. Spectral density analysis of time correlation functions in lattice QCD using the maximum entropy method

    CERN Document Server

    Fiebig, H R

    2002-01-01

    We study various aspects of extracting spectral information from time correlation functions of lattice QCD by means of Bayesian inference with an entropic prior, the maximum entropy method (MEM). Correlator functions of a heavy-light meson-meson system serve as a repository for lattice data with diverse statistical quality. Attention is given to spectral mass density functions, inferred from the data, and their dependence on the parameters of the MEM. We propose to employ simulated annealing, or cooling, to solve the Bayesian inference problem, and discuss practical issues of the approach.

  20. High resolution VLBI polarization imaging of AGN with the maximum entropy method

    Science.gov (United States)

    Coughlan, Colm P.; Gabuzda, Denise C.

    2016-12-01

    Radio polarization images of the jets of Active Galactic Nuclei (AGN) can provide a deep insight into the launching and collimation mechanisms of relativistic jets. However, even at VLBI scales, resolution is often a limiting factor in the conclusions that can be drawn from observations. The maximum entropy method (MEM) is a deconvolution algorithm that can outperform the more common CLEAN algorithm in many cases, particularly when investigating structures present on scales comparable to or smaller than the nominal beam size with 'super-resolution'. A new implementation of the MEM suitable for single- or multiple-wavelength VLBI polarization observations has been developed and is described here. Monte Carlo simulations comparing the performances of CLEAN and MEM at reconstructing the properties of model images are presented; these demonstrate the enhanced reliability of MEM over CLEAN when images of the fractional polarization and polarization angle are constructed using convolving beams that are appreciably smaller than the full CLEAN beam. The results of using this new MEM software to image VLBA observations of the AGN 0716+714 at six different wavelengths are presented, and compared to corresponding maps obtained with CLEAN. MEM and CLEAN maps of Stokes I, the polarized flux, the fractional polarization and the polarization angle are compared for convolving beams ranging from the full CLEAN beam down to a beam one-third of this size. MEM's ability to provide more trustworthy polarization imaging than a standard CLEAN-based deconvolution when convolving beams appreciably smaller than the full CLEAN beam are used is discussed.

  1. Potential distribution of Xylella fastidiosa in Italy: a maximum entropy model

    Directory of Open Access Journals (Sweden)

    Luciano BOSSO

    2016-05-01

    Full Text Available Species distribution models may provide realistic scenarios to explain the influence of bioclimatic variables in the context of emerging plant pathogens. Xylella fastidiosa is a xylem-limited Gram-negative bacterium causing severe diseases in many plant species. We developed a maximum entropy model for X. fastidiosa in Italy. Our objectives were to carry out a preliminary analysis of the species' potential geographical distribution and determine which eco-geographical variables may favour its presence in other Italian regions besides Apulia. The analysis of single-variable contributions showed that precipitation of the driest (40.3%) and wettest (30.4%) months were the main factors influencing model performance. Altitude, precipitation of the warmest quarter, mean temperature of the coldest quarter, and land cover provided a total contribution of 19.5%. Based on the model predictions, X. fastidiosa has a high probability (> 0.8) of colonizing areas characterized by: (i) low altitude (0–150 m a.s.l.); (ii) precipitation in the driest month < 10 mm, in the wettest month ranging between 80–110 mm, and during the warmest quarter < 60 mm; (iii) mean temperature of the coldest quarter ≥ 8°C; (iv) agricultural areas comprising intensive agriculture, complex cultivation patterns, olive groves, annual crops associated with permanent crops, orchards and vineyards; forest (essentially oak woodland); and Mediterranean shrubland. Species distribution models showed a high probability of X. fastidiosa occurrence in the regions of Apulia, Calabria, Basilicata, Sicily, Sardinia and the coastal areas of Campania, Lazio and southern Tuscany. Maxent models achieved excellent levels of predictive performance according to the area under curve (AUC), true skill statistic (TSS) and minimum difference between training and testing AUC data (AUCdiff). Our study indicated that X. fastidiosa has the potential to overcome the current boundaries of distribution and affect areas of Italy outside Apulia.

  2. Quantifying extrinsic noise in gene expression using the maximum entropy framework.

    Science.gov (United States)

    Dixit, Purushottam D

    2013-06-18

    We present a maximum entropy framework to separate intrinsic and extrinsic contributions to noisy gene expression solely from the profile of expression. We express the experimentally accessible probability distribution of the copy number of the gene product (mRNA or protein) by accounting for possible variations in extrinsic factors. The distribution of extrinsic factors is estimated using the maximum entropy principle. Our results show that extrinsic factors qualitatively and quantitatively affect the probability distribution of the gene product. We work out, in detail, the transcription of mRNA from a constitutively expressed promoter in Escherichia coli. We suggest that the variation in extrinsic factors may account for the observed wider-than-Poisson distribution of mRNA copy numbers. We successfully test our framework on a numerical simulation of a simple gene expression scheme that accounts for the variation in extrinsic factors. We also make falsifiable predictions, some of which are tested on previous experiments in E. coli whereas others need verification. Application of the presented framework to more complex situations is also discussed.

  3. Maximum entropy reconstructions of dynamic signaling networks from quantitative proteomics data.

    Science.gov (United States)

    Locasale, Jason W; Wolf-Yadlin, Alejandro

    2009-08-26

    Advances in mass spectrometry among other technologies have allowed for quantitative, reproducible, proteome-wide measurements of levels of phosphorylation as signals propagate through complex networks in response to external stimuli under different conditions. However, computational approaches to infer elements of the signaling network strictly from the quantitative aspects of proteomics data are not well established. We considered a method using the principle of maximum entropy to infer a network of interacting phosphotyrosine sites from pairwise correlations in a mass spectrometry data set and derive a phosphorylation-dependent interaction network solely from quantitative proteomics data. We first investigated the applicability of this approach by using a simulation of a model biochemical signaling network whose dynamics are governed by a large set of coupled differential equations. We found that in a simulated signaling system, the method detects interactions with significant accuracy. We then analyzed a growth factor mediated signaling network in a human mammary epithelial cell line that we inferred from mass spectrometry data and observe a biologically interpretable, small-world structure of signaling nodes, as well as a catalog of predictions regarding the interactions among previously uncharacterized phosphotyrosine sites. For example, the calculation places a recently identified tumor suppressor pathway through ARHGEF7 and Scribble, in the context of growth factor signaling. Our findings suggest that maximum entropy derived network models are an important tool for interpreting quantitative proteomics data.
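    The key modeling step, inferring direct interactions from pairwise correlations, can be illustrated with the Gaussian maximum entropy model: for continuous data constrained by means and covariances, the maxent distribution is a multivariate Gaussian, and direct couplings are the off-diagonal entries of the inverse covariance (precision) matrix, which vanish for conditionally independent pairs. The three "phosphosite" variables below are synthetic, not from the paper's data set.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Chain A -> B -> C: A and C are correlated only *through* B.
A = rng.normal(size=n)
B = 0.8 * A + 0.6 * rng.normal(size=n)
C = 0.8 * B + 0.6 * rng.normal(size=n)
data = np.stack([A, B, C])               # rows are variables

# Precision matrix of the Gaussian maxent model fitted to the sample moments.
precision = np.linalg.inv(np.cov(data))
print(np.round(precision, 2))
# Expect sizeable A-B and B-C couplings but a near-zero A-C entry,
# even though the raw correlation corr(A, C) is itself far from zero.
```

This is the essential distinction the maxent network exploits: raw correlations mix direct and indirect effects, while the precision matrix isolates the direct interactions.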

  4. Possible dynamical explanations for Paltridge's principle of maximum entropy production

    Energy Technology Data Exchange (ETDEWEB)

    Virgo, Nathaniel, E-mail: nathanielvirgo@gmail.com; Ikegami, Takashi, E-mail: nathanielvirgo@gmail.com [Ikegami Laboratory, University of Tokyo (Japan)

    2014-12-05

    Throughout the history of non-equilibrium thermodynamics a number of theories have been proposed in which complex, far from equilibrium flow systems are hypothesised to reach a steady state that maximises some quantity. Perhaps the most celebrated is Paltridge's principle of maximum entropy production for the horizontal heat flux in Earth's atmosphere, for which there is some empirical support. There have been a number of attempts to derive such a principle from maximum entropy considerations. However, we currently lack a more mechanistic explanation of how any particular system might self-organise into a state that maximises some quantity. This is in contrast to equilibrium thermodynamics, in which models such as the Ising model have been a great help in understanding the relationship between the predictions of MaxEnt and the dynamics of physical systems. In this paper we show that, unlike in the equilibrium case, Paltridge-type maximisation in non-equilibrium systems cannot be achieved by a simple dynamical feedback mechanism. Nevertheless, we propose several possible mechanisms by which maximisation could occur. Showing that these occur in any real system is a task for future work. The possibilities presented here may not be the only ones. We hope that by presenting them we can provoke further discussion about the possible dynamical mechanisms behind extremum principles for non-equilibrium systems, and their relationship to predictions obtained through MaxEnt.

  5. Modelling and Simulation of Seasonal Rainfall Using the Principle of Maximum Entropy

    Directory of Open Access Journals (Sweden)

    Jonathan Borwein

    2014-02-01

    Full Text Available We use the principle of maximum entropy to propose a parsimonious model for the generation of simulated rainfall during the wettest three-month season at a typical location on the east coast of Australia. The model uses a checkerboard copula of maximum entropy to model the joint probability distribution for total seasonal rainfall and a set of two-parameter gamma distributions to model each of the marginal monthly rainfall totals. The model allows us to match the grade correlation coefficients for the checkerboard copula to the observed Spearman rank correlation coefficients for the monthly rainfalls and, hence, provides a model that correctly describes the mean and variance for each of the monthly totals and also for the overall seasonal total. Thus, we avoid the need for a posteriori adjustment of simulated monthly totals in order to correctly simulate the observed seasonal statistics. Detailed results are presented for the modelling and simulation of seasonal rainfall in the town of Kempsey on the mid-north coast of New South Wales. Empirical evidence from extensive simulations is used to validate this application of the model. A similar analysis for Sydney is also described.
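    One ingredient of the model, fitting two-parameter gamma marginals to monthly rainfall totals, can be sketched with method-of-moments estimates (the paper's own fitting procedure may differ, and the rainfall figures here are synthetic stand-ins, not Kempsey data).

```python
import numpy as np

rng = np.random.default_rng(42)
shape_true, scale_true = 2.0, 60.0                 # hypothetical mm-scale gamma
monthly_totals = rng.gamma(shape_true, scale_true, size=400)

# Method-of-moments estimates for the two-parameter gamma distribution:
# shape k = mean^2 / variance, scale theta = variance / mean.
mean, var = monthly_totals.mean(), monthly_totals.var()
shape_hat = mean ** 2 / var
scale_hat = var / mean
print(round(shape_hat, 2), round(scale_hat, 1))    # roughly 2.0 and 60.0

# Simulate seasons as three monthly draws; the paper's checkerboard copula
# step (not shown) would impose the observed inter-month rank correlation.
season = rng.gamma(shape_hat, scale_hat, size=(1000, 3)).sum(axis=1)
```

With independent draws, the simulated seasonal variance would understate the observed one whenever months are positively correlated, which is precisely what the copula of maximum entropy corrects.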

  6. Maximum entropy production, carbon assimilation, and the spatial organization of vegetation in river basins.

    Science.gov (United States)

    del Jesus, Manuel; Foti, Romano; Rinaldo, Andrea; Rodriguez-Iturbe, Ignacio

    2012-12-18

    The spatial organization of functional vegetation types in river basins is a major determinant of their runoff production, biodiversity, and ecosystem services. The optimization of different objective functions has been suggested to control the adaptive behavior of plants and ecosystems, often without a compelling justification. Maximum entropy production (MEP), rooted in thermodynamics principles, provides a tool to justify the choice of the objective function controlling vegetation organization. The application of MEP at the ecosystem scale results in maximum productivity (i.e., maximum canopy photosynthesis) as the thermodynamic limit toward which the organization of vegetation appears to evolve. Maximum productivity, which incorporates complex hydrologic feedbacks, allows us to reproduce the spatial macroscopic organization of functional types of vegetation in a thoroughly monitored river basin, without the need for a reductionist description of the underlying microscopic dynamics. The methodology incorporates the stochastic characteristics of precipitation and the associated soil moisture on a spatially disaggregated framework. Our results suggest that the spatial organization of functional vegetation types in river basins naturally evolves toward configurations corresponding to dynamically accessible local maxima of the maximum productivity of the ecosystem.

  7. Modeling Non-Equilibrium Dynamics of a Discrete Probability Distribution: General Rate Equation for Maximal Entropy Generation in a Maximum-Entropy Landscape with Time-Dependent Constraints

    Directory of Open Access Journals (Sweden)

    Gian Paolo Beretta

    2008-08-01

Full Text Available A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager theorem of reciprocity and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.
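The steepest-entropy-ascent idea can be illustrated with a small numerical sketch: an Euler integration of the entropy gradient projected onto the subspace that preserves normalization and one linear ("energy") constraint. The four-level system, step size, and Euclidean projection here are illustrative assumptions, not the paper's formalism.

```python
import numpy as np

def project(g, constraints):
    """Orthonormalize the constraint gradients, then strip their
    components from g so an Euler step preserves the constraints."""
    basis = []
    for c in constraints:
        for b in basis:
            c = c - np.dot(c, b) * b
        basis.append(c / np.linalg.norm(c))
    for b in basis:
        g = g - np.dot(g, b) * b
    return g

# Illustrative 4-level system: conserve probability and a mean "energy"
e = np.array([0.0, 1.0, 2.0, 3.0])
p = np.array([0.7, 0.1, 0.1, 0.1])
ones = np.ones_like(p)
E0 = np.dot(p, e)

for _ in range(5000):
    g = -(np.log(p) + 1.0)                  # entropy gradient dS/dp_i
    g = project(g, [ones, e])               # keep sum(p) and <e> fixed
    p = np.clip(p + 1e-3 * g, 1e-12, None)  # small steepest-ascent step

S = -np.sum(p * np.log(p))
print("final entropy:", S)
```

The trajectory relaxes toward the constrained maximum-entropy (Gibbs-like) distribution while both linear constraints stay fixed.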

  8. Modeling Non-Equilibrium Dynamics of a Discrete Probability Distribution: General Rate Equation for Maximal Entropy Generation in a Maximum-Entropy Landscape with Time-Dependent Constraints

    Science.gov (United States)

    Beretta, Gian P.

    2008-09-01

A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager theorem of reciprocity and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.

  9. Vertical and horizontal processes in the global atmosphere and the maximum entropy production conjecture

    Directory of Open Access Journals (Sweden)

    S. Pascale

    2012-01-01

Full Text Available The objective of this paper is to reconsider the Maximum Entropy Production conjecture (MEP) in the context of a very simple two-dimensional zonal-vertical climate model able to represent the total material entropy production due at the same time to both horizontal and vertical heat fluxes. MEP is applied first to a simple four-box model of climate which accounts for both horizontal and vertical material heat fluxes. It is shown that, under the condition of fixed insolation, a MEP solution is found with reasonably realistic temperature and heat fluxes, thus generalising results from independent two-box horizontal or vertical models. It is also shown that the meridional and the vertical entropy production terms are independently involved in the maximisation and thus MEP can be applied to each subsystem with fixed boundary conditions. We then extend the four-box model by increasing its resolution, and compare it with GCM output. A MEP solution is found which is fairly realistic as far as the horizontal large scale organisation of the climate is concerned, whereas the vertical structure looks to be unrealistic and presents seriously unstable features. This study suggests that the thermal meridional structure of the atmosphere is predicted fairly well by MEP once the insolation is given, but the vertical structure of the atmosphere cannot be predicted satisfactorily by MEP unless constraints are imposed to represent the determination of longwave absorption by water vapour and clouds as a function of the state of the climate. Furthermore, an order-of-magnitude estimate of contributions to the material entropy production due to horizontal and vertical processes within the climate system is provided by using two different methods. In both cases we found that approximately 40 mW m−2 K−1 of material entropy production is due to vertical heat transport and 5–7 mW m−2 K−1 to horizontal heat transport.

  10. Image Filtering Based on Improved Information Entropy

    Institute of Scientific and Technical Information of China (English)

    JINGXiaojun; LIUYulin; XIONGYuqing

    2004-01-01

An image filtering method based on improved information entropy is proposed in this paper, which overcomes the shortcomings of hybrid linear and non-linear filtering algorithms. To address the limitations of information entropy in the field of data fusion, we introduce a consistency constraint factor for sub-source reports and a sub-source performance difference parameter, and propose the concept of fusion entropy. Its amendment and regularization of the sub-source decision-making matrix brings into play the competency, redundancy and complementarity of information fusion; suppresses and deletes faulty and invalid information; strengthens and preserves correct and useful information; overcomes the risk of error reporting at single-source critical points and the shortcomings in reliability and error tolerance; and adds decision-making criteria for multiple sub-source fusion, finally improving filtering quality. Subsequent experiments show its validity and improved filtering performance, thus providing a new image filtering technique.

  11. Maximum Entropy Methods as the Bridge Between Microscopic and Macroscopic Theory

    Science.gov (United States)

    Taylor, Jamie M.

    2016-09-01

    This paper is concerned with an investigation into a function of macroscopic variables known as the singular potential, building on previous work by Ball and Majumdar. The singular potential is a function of the admissible statistical averages of probability distributions on a state space, defined so that it corresponds to the maximum possible entropy given known observed statistical averages, although non-classical entropy-like objective functions will also be considered. First the set of admissible moments must be established, and under the conditions presented in this work the set is open, bounded and convex allowing a description in terms of supporting hyperplanes, which provides estimates on the development of singularities for related probability distributions. Under appropriate conditions it is shown that the singular potential is strictly convex, as differentiable as the microscopic entropy, and blows up uniformly as the macroscopic variable tends to the boundary of the set of admissible moments. Applications of the singular potential are then discussed, and particular consideration will be given to certain free-energy functionals typical in mean-field theory, demonstrating an equivalence between certain microscopic and macroscopic free-energy functionals. This allows statements about L^1-local minimisers of Onsager's free energy to be obtained which cannot be given by two-sided variations, and overcomes the need to ensure local minimisers are bounded away from zero and +∞ before taking L^∞ variations. The analysis also permits the definition of a dual order parameter for which Onsager's free energy allows an explicit representation. Also, the difficulties in approximating the singular potential by everywhere defined functions, in particular by polynomial functions, are addressed, with examples demonstrating the failure of the Taylor approximation to preserve relevant shape properties of the singular potential.

  12. Wave scattering through classically chaotic cavities in the presence of absorption: A maximum-entropy model

    Indian Academy of Sciences (India)

    Pier A Mello; Eugene Kogan

    2002-02-01

We present a maximum-entropy model for the transport of waves through a classically chaotic cavity in the presence of absorption. The entropy of the S-matrix statistical distribution is maximized, with the constraint $\langle {\rm Tr}\,SS^{\dagger}\rangle = \alpha n$, where n is the dimensionality of S and 0 ≤ α ≤ 1. For α = 1 the S-matrix distribution concentrates on the unitarity sphere and we have no absorption; for α = 0 the distribution becomes a delta function at the origin and we have complete absorption. For strong absorption our result agrees with a number of analytical calculations already given in the literature. In that limit, the distribution of the individual (angular) transmission and reflection coefficients becomes exponential – Rayleigh statistics – even for n = 1. For n ≫ 1 Rayleigh statistics is attained even with no absorption; here we extend the study to α < 1. The model is compared with random-matrix-theory numerical simulations: it describes the problem very well for strong absorption, but fails for moderate and weak absorptions. The success of the model for strong absorption is understood in the light of a central-limit theorem. For weak absorption, some important physical constraint is missing in the construction of the model.

  13. Maximum entropy approach to statistical inference for an ocean acoustic waveguide.

    Science.gov (United States)

    Knobles, D P; Sagers, J D; Koch, R A

    2012-02-01

    A conditional probability distribution suitable for estimating the statistical properties of ocean seabed parameter values inferred from acoustic measurements is derived from a maximum entropy principle. The specification of the expectation value for an error function constrains the maximization of an entropy functional. This constraint determines the sensitivity factor (β) to the error function of the resulting probability distribution, which is a canonical form that provides a conservative estimate of the uncertainty of the parameter values. From the conditional distribution, marginal distributions for individual parameters can be determined from integration over the other parameters. The approach is an alternative to obtaining the posterior probability distribution without an intermediary determination of the likelihood function followed by an application of Bayes' rule. In this paper the expectation value that specifies the constraint is determined from the values of the error function for the model solutions obtained from a sparse number of data samples. The method is applied to ocean acoustic measurements taken on the New Jersey continental shelf. The marginal probability distribution for the values of the sound speed ratio at the surface of the seabed and the source levels of a towed source are examined for different geoacoustic model representations.
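The canonical form described above — a distribution proportional to exp(−βE(θ)) with the sensitivity factor β fixed by the expectation value of the error function — can be sketched on a one-dimensional parameter grid. The error surface and target expectation below are invented for illustration, not taken from the New Jersey shelf data.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical error-function values over a 1-D grid of seabed parameters
theta = np.linspace(1.0, 2.0, 201)      # e.g. a sound-speed ratio
E = 50.0 * (theta - 1.6) ** 2           # invented error surface E(theta)
E_target = 0.5                          # assumed expectation constraint <E>

def mean_E(beta):
    w = np.exp(-beta * (E - E.min()))   # canonical weights exp(-beta*E)
    p = w / w.sum()
    return np.sum(p * E)

# The sensitivity factor beta solves <E>_beta = E_target
beta = brentq(lambda b: mean_E(b) - E_target, 1e-6, 100.0)
p = np.exp(-beta * (E - E.min()))
p /= p.sum()
print(f"beta = {beta:.3f}, distribution peaks at theta = {theta[np.argmax(p)]:.2f}")
```

Marginal distributions for individual parameters would follow by summing this joint weight over the remaining parameters.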

  14. Application of the maximum entropy principle to determine ensembles of intrinsically disordered proteins from residual dipolar couplings.

    Science.gov (United States)

    Sanchez-Martinez, M; Crehuet, R

    2014-12-21

    We present a method based on the maximum entropy principle that can re-weight an ensemble of protein structures based on data from residual dipolar couplings (RDCs). The RDCs of intrinsically disordered proteins (IDPs) provide information on the secondary structure elements present in an ensemble; however even two sets of RDCs are not enough to fully determine the distribution of conformations, and the force field used to generate the structures has a pervasive influence on the refined ensemble. Two physics-based coarse-grained force fields, Profasi and Campari, are able to predict the secondary structure elements present in an IDP, but even after including the RDC data, the re-weighted ensembles differ between both force fields. Thus the spread of IDP ensembles highlights the need for better force fields. We distribute our algorithm in an open-source Python code.

  15. Source Function Determined from HBT Correlations by the Maximum Entropy Principle

    CERN Document Server

Wu, Yuanfang; Heinz, Ulrich

    1996-01-01

    We study the reconstruction of the source function in space-time directly from the measured HBT correlation function using the Maximum Entropy Principle. We find that the problem is ill-defined without at least one additional theoretical constraint as input. Using the requirement of a finite source lifetime for the latter we find a new Gaussian parametrization of the source function directly in terms of the measured HBT radius parameters and its lifetime, where the latter is a free parameter which is not directly measurable by HBT. We discuss the implications of our results for the remaining freedom in building source models consistent with a given set of measured HBT radius parameters.

  16. Determination of zero-coupon and spot rates from treasury data by maximum entropy methods

    Science.gov (United States)

    Gzyl, Henryk; Mayoral, Silvia

    2016-08-01

An interesting and important inverse problem in finance consists of the determination of spot rates or prices of the zero coupon bonds, when the only information available consists of the prices of a few coupon bonds. A variety of methods have been proposed to deal with this problem. Here we present variants of a non-parametric method to treat such problems, which neither imposes an analytic form on the rates or bond prices, nor imposes a model for the (random) evolution of the yields. The procedure consists of transforming the problem of the determination of the prices of the zero coupon bonds into a linear inverse problem with convex constraints, and then applying the method of maximum entropy in the mean. This method is flexible enough to provide a possible solution to a mispricing problem.

  17. A maximum-entropy approach to the adiabatic freezing of a supercooled liquid.

    Science.gov (United States)

    Prestipino, Santi

    2013-04-28

    I employ the van der Waals theory of Baus and co-workers to analyze the fast, adiabatic decay of a supercooled liquid in a closed vessel with which the solidification process usually starts. By imposing a further constraint on either the system volume or pressure, I use the maximum-entropy method to quantify the fraction of liquid that is transformed into solid as a function of undercooling and of the amount of a foreign gas that could possibly be also present in the test tube. Upon looking at the implications of thermal and mechanical insulation for the energy cost of forming a solid droplet within the liquid, I identify one situation where the onset of solidification inevitably occurs near the wall in contact with the bath.

  18. Optimal resolution in maximum entropy image reconstruction from projections with multigrid acceleration

    Science.gov (United States)

    Limber, Mark A.; Manteuffel, Thomas A.; Mccormick, Stephen F.; Sholl, David S.

    1993-01-01

We consider the problem of image reconstruction from a finite number of projections over the space L^1(Ω), where Ω is a compact subset of ℝ^2. We prove that, given a discretization of the projection space, the function that generates the correct projection data and maximizes the Boltzmann-Shannon entropy is piecewise constant on a certain discretization of Ω, which we call the 'optimal grid'. It is on this grid that one obtains the maximum resolution given the problem setup. The size of this grid grows very quickly as the number of projections and number of cells per projection grow, indicating fast computational methods are essential to make its use feasible. We use a Fenchel duality formulation of the problem to keep the number of variables small while still using the optimal discretization, and propose a multilevel scheme to improve convergence of a simple cyclic maximization scheme applied to the dual problem.

  19. Use of maximum entropy method with parallel processing machine. [for x-ray object image reconstruction

    Science.gov (United States)

    Yin, Lo I.; Bielefeld, Michael J.

    1987-01-01

    The maximum entropy method (MEM) and balanced correlation method were used to reconstruct the images of low-intensity X-ray objects obtained experimentally by means of a uniformly redundant array coded aperture system. The reconstructed images from MEM are clearly superior. However, the MEM algorithm is computationally more time-consuming because of its iterative nature. On the other hand, both the inherently two-dimensional character of images and the iterative computations of MEM suggest the use of parallel processing machines. Accordingly, computations were carried out on the massively parallel processor at Goddard Space Flight Center as well as on the serial processing machine VAX 8600, and the results are compared.

  20. Source Function Determined from Hanbury-Brown/Twiss Correlations by the Maximum Entropy Principle

    Institute of Scientific and Technical Information of China (English)

    吴元芳; 刘连寿

    2002-01-01

We study the reconstruction of the source function in space-time directly from the measured Hanbury-Brown/Twiss (HBT) correlation function using the maximum entropy principle. We find that the problem is ill-defined without at least one additional theoretical constraint as input. Using the requirement of a finite source lifetime for the latter, we find a new Gaussian parametrization of the source function directly in terms of the measured HBT radius parameters and its lifetime, where the latter is a free parameter which is not directly measurable by HBT. We discuss the implications of our results for the remaining freedom in building source models consistent with a given set of measured HBT radius parameters.

  1. Parallelization of Maximum Entropy POS Tagging for Bahasa Indonesia with MapReduce

    Directory of Open Access Journals (Sweden)

    Arif Nurwidyantoro

    2012-07-01

Full Text Available In this paper, the MapReduce programming model is used to parallelize the training and tagging processes in maximum entropy part-of-speech tagging for Bahasa Indonesia. In the training process, the MapReduce model is applied to dictionary, tagtoken, and feature creation. In the tagging process, MapReduce is applied to tag the lines of a document in parallel. The training experiments showed that the total training time using MapReduce is shorter, but the time spent reading its results within the process slows down the total training time. The tagging experiments, using different numbers of map and reduce processes, showed that the MapReduce implementation can speed up the tagging process. The fastest tagging result was obtained when tagging a 1,000,000-word corpus with 30 map processes.
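The map/reduce split behind the dictionary-creation step can be caricatured in a few lines. The three-line corpus and the word-count "mapper" below are toy stand-ins, not the paper's tagger; they only illustrate why independent mappers parallelize cleanly.

```python
from collections import Counter
from functools import reduce

corpus = [
    "saya makan nasi",   # toy Bahasa Indonesia lines
    "dia makan roti",
    "saya minum teh",
]

def map_phase(line):
    # Each mapper independently counts the words on its line
    return Counter(line.split())

def reduce_phase(a, b):
    # The reducer merges partial counts from the mappers
    return a + b

# Mappers are independent of one another, hence trivially parallelizable
partials = [map_phase(line) for line in corpus]
dictionary = reduce(reduce_phase, partials, Counter())
print(dictionary.most_common(2))
```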

  2. Evaluation of Variation Coefficient of Slewing Bearing Starting Torque Using Bootstrap Maximum-Entropy Method

    Directory of Open Access Journals (Sweden)

    Xintao Xia

    2013-07-01

    Full Text Available This study proposed the bootstrap maximum-entropy method to evaluate the uncertainty of the starting torque of a slewing bearing. Addressing the variation coefficient of the slewing bearing starting torque under load, the probability density function, estimated true value and variation domain are obtained through experimental investigation of the slewing bearing starting torque under various loads. The probability density function is found to be characterized by variational figure, scale and location. In addition, the estimated true value and the variation domain vary from large to small along with increasing load, indicating better evolution of the stability and reliability of the starting friction torque. Finally, a sensitive spot exists where the estimated true value and the variation domain rise abnormally, showing a fluctuation in the immunity and a degenerative disorder in the stability and reliability of the starting friction torque.
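A minimal sketch of the bootstrap side of the method, assuming synthetic torque measurements; the paper's maximum-entropy density estimate is replaced here by a plain percentile interval for the variation coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical starting-torque measurements (N*m) for one load case
torque = rng.normal(loc=120.0, scale=6.0, size=40)

B = 2000
cvs = np.empty(B)
for b in range(B):
    resample = rng.choice(torque, size=torque.size, replace=True)
    cvs[b] = resample.std(ddof=1) / resample.mean()  # variation coefficient

est = cvs.mean()                          # bootstrap point estimate
lo, hi = np.percentile(cvs, [2.5, 97.5])  # estimated variation domain
print(f"CV ~ {est:.4f}, 95% interval [{lo:.4f}, {hi:.4f}]")
```

Repeating this per load level would trace out how the estimated true value and variation domain evolve with load, as the abstract describes.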

  3. Continuity of the maximum-entropy inference: Convex geometry and numerical ranges approach

    Energy Technology Data Exchange (ETDEWEB)

    Rodman, Leiba [Department of Mathematics, College of William and Mary, P.O. Box 8795, Williamsburg, Virginia 23187-8795 (United States); Spitkovsky, Ilya M., E-mail: ims2@nyu.edu, E-mail: ilya@math.wm.edu [Department of Mathematics, College of William and Mary, P.O. Box 8795, Williamsburg, Virginia 23187-8795 (United States); Division of Science and Mathematics, New York University Abu Dhabi, Saadiyat Island, P.O. Box 129188, Abu Dhabi (United Arab Emirates); Szkoła, Arleta, E-mail: szkola@mis.mpg.de; Weis, Stephan, E-mail: maths@stephan-weis.info [Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, D-04103 Leipzig (Germany)

    2016-01-15

    We study the continuity of an abstract generalization of the maximum-entropy inference—a maximizer. It is defined as a right-inverse of a linear map restricted to a convex body which uniquely maximizes on each fiber of the linear map a continuous function on the convex body. Using convex geometry we prove, amongst others, the existence of discontinuities of the maximizer at limits of extremal points not being extremal points themselves and apply the result to quantum correlations. Further, we use numerical range methods in the case of quantum inference which refers to two observables. One result is a complete characterization of points of discontinuity for 3 × 3 matrices.

  4. Maximum-entropy weak lens reconstruction improved methods and application to data

    CERN Document Server

    Marshall, P J; Gull, S F; Bridle, S L

    2002-01-01

    We develop the maximum-entropy weak shear mass reconstruction method presented in earlier papers by taking each background galaxy image shape as an independent estimator of the reduced shear field and incorporating an intrinsic smoothness into the reconstruction. The characteristic length scale of this smoothing is determined by Bayesian methods. Within this algorithm the uncertainties due to the intrinsic distribution of galaxy shapes are carried through to the final mass reconstruction, and the mass within arbitrarily shaped apertures can be calculated with corresponding uncertainties. We apply this method to two clusters taken from N-body simulations using mock observations corresponding to Keck LRIS and mosaiced HST WFPC2 fields. We demonstrate that the Bayesian choice of smoothing length is sensible and that masses within apertures (including one on a filamentary structure) are reliable. We apply the method to data taken on the cluster MS1054-03 using the Keck LRIS (Clowe et al. 2000) and HST (Hoekstra e...

  5. Modelling streambank erosion potential using maximum entropy in a central Appalachian watershed

    Science.gov (United States)

    Pitchford, J.; Strager, M.; Riley, A.; Lin, L.; Anderson, J.

    2015-03-01

    We used maximum entropy to model streambank erosion potential (SEP) in a central Appalachian watershed to help prioritize sites for management. Model development included measuring erosion rates, application of a quantitative approach to locate Target Eroding Areas (TEAs), and creation of maps of boundary conditions. We successfully constructed a probability distribution of TEAs using the program Maxent. All model evaluation procedures indicated that the model was an excellent predictor, and that the major environmental variables controlling these processes were streambank slope, soil characteristics, bank position, and underlying geology. A classification scheme with low, moderate, and high levels of SEP derived from logistic model output was able to differentiate sites with low erosion potential from sites with moderate and high erosion potential. A major application of this type of modelling framework is to address uncertainty in stream restoration planning, ultimately helping to bridge the gap between restoration science and practice.

  6. High resolution VLBI polarisation imaging of AGN with the Maximum Entropy Method

    CERN Document Server

    Coughlan, Colm P

    2016-01-01

    Radio polarisation images of the jets of Active Galactic Nuclei (AGN) can provide a deep insight into the launching and collimation mechanisms of relativistic jets. However, even at VLBI scales, resolution is often a limiting factor in the conclusions that can be drawn from observations. The Maximum Entropy Method (MEM) is a deconvolution algorithm that can outperform the more common CLEAN algorithm in many cases, particularly when investigating structures present on scales comparable to or smaller than the nominal beam size with "super-resolution". A new implementation of the MEM suitable for single- or multiple-wavelength VLBI polarisation observations has been developed and is described here. Monte Carlo simulations comparing the performances of CLEAN and MEM at reconstructing the properties of model images are presented; these demonstrate the enhanced reliability of MEM over CLEAN when images of the fractional polarisation and polarisation angle are constructed using convolving beams that are appreciably ...

  7. Imaging VLBI polarimetry data from Active Galactic Nuclei using the Maximum Entropy Method

    Directory of Open Access Journals (Sweden)

    Coughlan Colm P.

    2013-12-01

Full Text Available Mapping the relativistic jets emanating from AGN requires the use of a deconvolution algorithm to account for the effects of missing baseline spacings. The CLEAN algorithm is the most commonly used algorithm in VLBI imaging today and is suitable for imaging polarisation data. The Maximum Entropy Method (MEM) is presented as an alternative with some advantages over the CLEAN algorithm, including better spatial resolution and a more rigorous and unbiased approach to deconvolution. We have developed a MEM code suitable for deconvolving VLBI polarisation data. Monte Carlo simulations investigating the performance of CLEAN and the MEM code on a variety of source types are being carried out. Real polarisation (VLBA) data taken at multiple wavelengths have also been deconvolved using MEM, and several of the resulting polarisation and Faraday rotation maps are presented and discussed.

  8. On the stability of the moments of the maximum entropy wind wave spectrum

    Energy Technology Data Exchange (ETDEWEB)

    Pena, H.G.

    1983-03-01

The stability of some current wind wave parameters as a function of high-frequency cut-off and degrees of freedom of the spectrum has been numerically investigated when computed in terms of the moments of the wave energy spectrum. From the Pierson-Moskowitz wave spectrum type, a sea surface profile is simulated and its wave energy spectrum is estimated by the Maximum Entropy Method (MEM). As the degrees of freedom of the MEM spectral estimation are varied, the results show a much better stability of the wave parameters as compared to the classical periodogram and correlogram spectral approaches. The stability of wave parameters as a function of high-frequency cut-off has the same result as obtained by the classical techniques.
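MEM spectral estimation of the kind used above is commonly implemented with Burg's algorithm, which fits an autoregressive model by minimizing forward and backward prediction errors. The sketch below (synthetic sinusoid, assumed AR order 8) illustrates the sharp spectral peak MEM produces; it is an illustration of the standard Burg recursion, not the paper's code.

```python
import numpy as np

def burg(x, order):
    """Burg's maximum-entropy estimate of AR prediction-error coefficients."""
    f = np.asarray(x, dtype=float).copy()  # forward prediction errors
    b = f.copy()                           # backward prediction errors
    a = np.array([1.0])
    for _ in range(order):
        fp, bp = f[1:], b[:-1]
        k = -2.0 * np.dot(fp, bp) / (np.dot(fp, fp) + np.dot(bp, bp))
        ext = np.concatenate([a, [0.0]])
        a = ext + k * ext[::-1]            # Levinson-style coefficient update
        f, b = fp + k * bp, bp + k * fp
    return a

# Sinusoid in noise: MEM should place a sharp spectral peak near f0
rng = np.random.default_rng(1)
n, f0 = 512, 0.125
x = np.sin(2 * np.pi * f0 * np.arange(n)) + 0.1 * rng.standard_normal(n)

a = burg(x, order=8)
freqs = np.linspace(0.0, 0.5, 1000)
# The MEM power spectrum is proportional to 1 / |A(f)|^2
A = np.exp(-2j * np.pi * np.outer(freqs, np.arange(len(a)))) @ a
peak = freqs[np.argmax(1.0 / np.abs(A) ** 2)]
print("MEM peak frequency:", peak)
```

Varying `order` plays the role of the "degrees of freedom" whose effect on the stability of moment-derived wave parameters the abstract investigates.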

  9. APPLICATION OF MAXIMUM ENTROPY PRINCIPLE METHOD TO THE STUDY OF WAVE CLIMATE STATISTICAL CHARACTERISTICS

    Institute of Scientific and Technical Information of China (English)

    XU Fu-min; XUE Hong-chao

    2004-01-01

The Maximum Entropy Principle (MEP) method is elaborated, and the corresponding probability density evaluation method for random fluctuation systems is introduced; the goal of the article is to find the best fitting method for the wave climate statistical distribution. For the first time, a new maximum entropy probability distribution (MEP distribution) expression is deduced in accordance with the second-order moment of a random process. Different from all previous fitting methods, the MEP distribution can describe the probability distribution of any random fluctuation system conveniently and reasonably. If the moments of the random signal are limited to second order, that is, the ratio of the root-mean-square value to the mean value of the random variable is obtained from the random sample, the corresponding MEP distribution can be computed according to the deduced expression. The concept of wave climate is introduced, and the MEP distribution is applied to fit the probability density distributions of the significant wave height and spectral peak period. Taking the Gulf of Mexico as an example, three stations at different locations, depths and wind-wave strengths are chosen in the half-closed gulf, and the significant wave height and spectral peak period distributions at each station are fitted with the MEP distribution, the Weibull distribution and the Log-normal distribution. The fitted results are compared with field observations, which show that the MEP distribution is the best fitting method and the Weibull distribution the worst when applied to significant wave height and spectral peak period distributions at different locations, water depths and wind-wave strengths in the Gulf. The conclusion shows the feasibility and reasonability of fitting wave climate statistical distributions with the deduced MEP distributions, and furthermore demonstrates the great potential of the MEP method to
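A maximum-entropy distribution constrained by moments up to second order has the form p(x) ∝ exp(−λ₁x − λ₂x²). The sketch below solves for the multipliers on a grid so that the mean and root-mean-square match prescribed values; the wave-height statistics are hypothetical, and the grid-based solver is an illustrative choice, not the paper's closed-form expression.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical wave statistics: mean and root-mean-square of H_s (m)
mean_h, rms_h = 2.0, 2.3
x = np.linspace(0.0, 20.0, 4001)   # support grid for the density

def moment_error(lams):
    l1, l2 = lams
    w = np.exp(-l1 * x - l2 * x * x)
    w /= w.sum()
    return np.array([np.sum(x * w) - mean_h,
                     np.sum(x * x * w) - rms_h ** 2])

# Solve for the multipliers of p(x) ~ exp(-l1*x - l2*x^2); keeping l2 > 0
# ensures the density is normalizable on the half line
res = least_squares(moment_error, x0=[0.1, 0.1],
                    bounds=([-np.inf, 1e-6], [np.inf, np.inf]))
l1, l2 = res.x
p = np.exp(-l1 * x - l2 * x * x)
p /= p.sum()
print("multipliers:", l1, l2)
```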

  10. n-Order and maximum fuzzy similarity entropy for discrimination of signals of different complexity: Application to fetal heart rate signals.

    Science.gov (United States)

    Zaylaa, Amira; Oudjemia, Souad; Charara, Jamal; Girault, Jean-Marc

    2015-09-01

    This paper presents two new concepts for discrimination of signals of different complexity. The first focused initially on solving the problem of setting entropy descriptors by varying the pattern size instead of the tolerance. This led to the search for the optimal pattern size that maximized the similarity entropy. The second paradigm was based on the n-order similarity entropy that encompasses the 1-order similarity entropy. To improve the statistical stability, n-order fuzzy similarity entropy was proposed. Fractional Brownian motion was simulated to validate the different methods proposed, and fetal heart rate signals were used to discriminate normal from abnormal fetuses. In all cases, it was found that it was possible to discriminate time series of different complexity such as fractional Brownian motion and fetal heart rate signals. The best levels of performance in terms of sensitivity (90%) and specificity (90%) were obtained with the n-order fuzzy similarity entropy. However, it was shown that the optimal pattern size and the maximum similarity measurement were related to intrinsic features of the time series.
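Plain sample entropy — the starting point that the n-order fuzzy similarity entropy generalizes — can be sketched as follows. The signals, pattern size m, and tolerance r are illustrative choices; in the fuzzy variant the hard tolerance test would be replaced by a smooth membership function.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Plain sample entropy; the n-order *fuzzy* variant replaces the hard
    tolerance test below with a smooth membership function."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def close_pairs(length):
        # All length-`length` templates; Chebyshev distance between pairs
        tpl = np.array([x[i:i + length] for i in range(len(x) - length)])
        d = np.max(np.abs(tpl[:, None, :] - tpl[None, :, :]), axis=-1)
        return ((d <= tol).sum() - len(tpl)) / 2   # exclude self-matches
    return -np.log(close_pairs(m + 1) / close_pairs(m))

rng = np.random.default_rng(7)
se_white = sample_entropy(rng.standard_normal(1000))        # high complexity
se_smooth = sample_entropy(np.sin(0.05 * np.arange(1000)))  # regular signal
print(se_white, se_smooth)
```

The irregular signal scores a markedly higher entropy than the regular one, which is the discrimination property the paper exploits for fetal heart rate signals.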

  11. Fuzzy tracking algorithm with feedback based on maximum entropy principle

    Institute of Scientific and Technical Information of China (English)

    刘智; 陈丰; 黄继平

    2012-01-01

Aiming at the high computational overhead and poor extensibility of matrix-weighted fusion methods, this paper proposes a multisensor fusion algorithm with feedback based on fuzzy C-means (FCM) clustering and the maximum entropy principle (MEP). The algorithm combines FCM and MEP to calculate the weight of each component of the state vector, considering the influence of every component on the fusion estimate as a whole while avoiding complex matrix operations, so it has good real-time performance. Compared with matrix-weighted algorithms, the fusion algorithm is also easy to extend and can be applied directly to tracking systems comprising more than two sensors. Experiments and simulation results show that the tracking accuracy of the fusion estimate is consistent with that of matrix-weighted fusion methods, validating the effectiveness of the algorithm.

  12. Entropy Based Modelling for Estimating Demographic Trends.

    Directory of Open Access Journals (Sweden)

    Guoqi Li

    Full Text Available In this paper, an entropy-based method is proposed to forecast the demographic changes of countries. We formulate the estimation of future demographic profiles as a constrained optimization problem, anchored on the empirically validated assumption that the entropy of the age distribution increases over time. The proposed method involves three stages: (1) prediction of the age distribution of a country's population based on an "age-structured population model"; (2) estimation of the age distribution of each individual household size with an entropy-based formulation based on an "individual household size model"; and (3) estimation of the number of households of each size based on a "total household size model". The last stage is achieved by projecting the age distribution of the country's population (obtained in stage 1) onto the age distributions of individual household sizes (obtained in stage 2). The effectiveness of the proposed method is demonstrated on real-world data, and it is general and versatile enough to be extended to other time-dependent demographic variables.

  13. Dynamics of non-stationary processes that follow the maximum of the Rényi entropy principle.

    Science.gov (United States)

    Shalymov, Dmitry S; Fradkov, Alexander L

    2016-01-01

    We propose dynamics equations which describe the behaviour of non-stationary processes that follow the maximum Rényi entropy principle. The equations are derived on the basis of the speed-gradient principle originating in control theory. The maximum Rényi entropy principle is analysed for the discrete case (a discrete random variable) and the continuous case (a probability density function, PDF). We consider mass-conservation and energy-conservation constraints and demonstrate the uniqueness of the limit distribution and the asymptotic convergence of the PDF in both cases. The coincidence of the limit distribution of the proposed equations with the Rényi distribution is examined.
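A minimal numerical sketch of the speed-gradient idea, under the simplifying assumption that the only constraint is mass conservation (probabilities sum to one): gradient ascent on the Rényi entropy, projected onto the constraint, drives an arbitrary initial distribution toward the uniform distribution, which is the Rényi maximizer in this restricted case. The entropy order and step size are illustrative choices, not values from the paper.

```python
import math

def renyi_entropy(p, q=2.0):
    """Renyi entropy of order q (natural log) of a discrete distribution."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def speed_gradient_step(p, q=2.0, gamma=0.05):
    """One explicit-Euler step of gradient ascent on the Renyi entropy,
    projected so that sum(p) stays equal to 1 (mass conservation)."""
    s = sum(pi ** q for pi in p)
    grad = [q * pi ** (q - 1) / ((1.0 - q) * s) for pi in p]
    mean_g = sum(grad) / len(grad)          # projection onto the constraint
    return [pi + gamma * (gi - mean_g) for pi, gi in zip(p, grad)]

p = [0.7, 0.2, 0.05, 0.05]                  # arbitrary initial distribution
history = [renyi_entropy(p)]
for _ in range(2000):
    p = speed_gradient_step(p)
    history.append(renyi_entropy(p))
```

The entropy grows along the trajectory and the state converges to the uniform limit distribution while the mass constraint is preserved.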

  14. Bayesian Maximum Entropy Integration of Ozone Observations and Model Predictions: A National Application.

    Science.gov (United States)

    Xu, Yadong; Serre, Marc L; Reyes, Jeanette; Vizuete, William

    2016-04-19

    To improve ozone exposure estimates for ambient concentrations at a national scale, we introduce our novel Regionalized Air Quality Model Performance (RAMP) approach to integrate chemical transport model (CTM) predictions with the available ozone observations using the Bayesian Maximum Entropy (BME) framework. The framework models the nonlinear and nonhomoscedastic relation between air pollution observations and CTM predictions and, for the first time, accounts for variability in CTM model performance. A validation analysis using only noncollocated data outside a validation radius rv was performed, and the R(2) between observations and re-estimated values for two daily metrics, the daily maximum 8-h average (DM8A) and the daily 24-h average (D24A) ozone concentrations, was obtained for the OBS scenario (using ozone observations only) in contrast with the RAMP and Constant Air Quality Model Performance (CAMP) scenarios. We show that, by accounting for the spatial and temporal variability in model performance, our novel RAMP approach extracts more information from CTM predictions than the CAMP approach, which assumes model performance does not change across space and time: the R(2) increase percentage is over 12 times larger for the DM8A and over 3.5 times larger for the D24A ozone concentrations.

  15. Minimum Mutual Information and Non-Gaussianity through the Maximum Entropy Method: Estimation from Finite Samples

    Directory of Open Access Journals (Sweden)

    Carlos A. L. Pires

    2013-02-01

    Full Text Available The Minimum Mutual Information (MinMI) Principle provides the least committed, maximum-joint-entropy (ME) inferential law that is compatible with prescribed marginal distributions and empirical cross constraints. Here, we estimate MI bounds (the MinMI values) generated by constraining sets Tcr comprising mcr linear and/or nonlinear joint expectations, computed from samples of N iid outcomes. Marginals (and their entropy) are imposed by single morphisms of the original random variables. N-asymptotic formulas are given for the distribution of cross-expectation estimation errors as well as for the MinMI estimation bias, its variance and its distribution. A growing Tcr leads to an increasing MinMI, converging eventually to the total MI. Under N-sized samples, the MinMI increment relative to two encapsulated sets Tcr1 ⊂ Tcr2 (with numbers of constraints mcr1

  16. Entropy-based benchmarking methods

    NARCIS (Netherlands)

    Temurshoev, Umed

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a bench-marked series should reproduce the movement and signs in the original series. We show that the widely used variants of Denton (1971) method and the growth pre

  17. Forest Tree Species Distribution Mapping Using Landsat Satellite Imagery and Topographic Variables with the Maximum Entropy Method in Mongolia

    Science.gov (United States)

    Hao Chiang, Shou; Valdez, Miguel; Chen, Chi-Farn

    2016-06-01

    Forest is a very important ecosystem and natural resource for living things. Based on forest inventories, governments are able to make decisions to conserve, improve and manage forests in a sustainable way. Field work for forestry investigation is difficult and time-consuming, because it needs intensive physical labor and the costs are high, especially when surveying in remote mountainous regions. A reliable forest inventory gives us more accurate and timely information with which to develop new and efficient approaches to forest management. Remote sensing technology has recently been used for forest investigation at a large scale. To produce an informative forest inventory, forest attributes, including tree species, must be considered. The aim of this study is to classify forest tree species in Erdenebulgan County, Huwsgul province, Mongolia, using the Maximum Entropy method. The study area is covered by dense forest, comprising almost 70% of the total territorial extension of Erdenebulgan County, and is located in a high mountain region in northern Mongolia. For this study, Landsat satellite imagery and a Digital Elevation Model (DEM) were acquired to perform tree species mapping. The forest tree species inventory map was collected from the Forest Division of the Mongolian Ministry of Nature and Environment as training data and was also used as ground truth for the accuracy assessment of the tree species classification. Landsat images and the DEM were processed for maximum entropy modeling, and this study applied the model in two experiments. The first uses Landsat surface reflectance for tree species classification; the second incorporates terrain variables in addition to the Landsat surface reflectance. All experimental results were compared with the tree species inventory to assess the classification accuracy. 
Results show that the second one which uses Landsat surface reflectance coupled

  18. FOREST TREE SPECIES DISTRIBUTION MAPPING USING LANDSAT SATELLITE IMAGERY AND TOPOGRAPHIC VARIABLES WITH THE MAXIMUM ENTROPY METHOD IN MONGOLIA

    Directory of Open Access Journals (Sweden)

    S. H. Chiang

    2016-06-01

    Full Text Available Forest is a very important ecosystem and natural resource for living things. Based on forest inventories, governments are able to make decisions to conserve, improve and manage forests in a sustainable way. Field work for forestry investigation is difficult and time-consuming, because it needs intensive physical labor and the costs are high, especially when surveying in remote mountainous regions. A reliable forest inventory gives us more accurate and timely information with which to develop new and efficient approaches to forest management. Remote sensing technology has recently been used for forest investigation at a large scale. To produce an informative forest inventory, forest attributes, including tree species, must be considered. The aim of this study is to classify forest tree species in Erdenebulgan County, Huwsgul province, Mongolia, using the Maximum Entropy method. The study area is covered by dense forest, comprising almost 70% of the total territorial extension of Erdenebulgan County, and is located in a high mountain region in northern Mongolia. For this study, Landsat satellite imagery and a Digital Elevation Model (DEM) were acquired to perform tree species mapping. The forest tree species inventory map was collected from the Forest Division of the Mongolian Ministry of Nature and Environment as training data and was also used as ground truth for the accuracy assessment of the tree species classification. Landsat images and the DEM were processed for maximum entropy modeling, and this study applied the model in two experiments. The first uses Landsat surface reflectance for tree species classification; the second incorporates terrain variables in addition to the Landsat surface reflectance. All experimental results were compared with the tree species inventory to assess the classification accuracy. 
Results show that the second one which uses Landsat surface

  19. Decisions in uncertainty based on entropy

    Directory of Open Access Journals (Sweden)

    Vladimir CHERNOV

    2017-07-01

    Full Text Available At present, choosing the best solution out of many possible ones under conditions of uncertainty is a pressing economic task, arising in many economic situations. Well-known classical approaches to its solution are based on various assessments of the practical decision-making situation. However, they often give insufficiently accurate or incorrect results and do not satisfy stability requirements, under which the only reliable result is one that is invariant to the calculation methodology and corresponds to reality. This article describes an alternative approach to the justification of decisions under uncertainty that requires neither the construction nor the use of assumptions about the decision-making situation and conforms to the approaches of stability theory. The problem of multi-criteria decision-making under complete uncertainty, in which the structuring of alternatives is performed using fuzzy entropy, is formulated and conceptually investigated. The idea of the described method is that criterial conformity is estimated by fuzzy numbers and/or linguistic statements, i.e., formalized with the tools of fuzzy set theory. In contrast to classical approaches, this approach does not require hypotheses about the possible circumstances of decision-making and meets the requirements of stability theory. As confirmation, it is shown that calculating the fuzzy entropy by various methods does not lead to contradictory results. The appropriateness and practicality of using a fuzzy entropy criterion for ordering sets of alternatives under fuzzy decision-making conditions is substantiated, and a method for calculating the fuzzy entropy when criteria are evaluated in linguistic form is grounded. The paper presents numerical examples for which the fuzzy entropy calculation allows generating grounded, clear recommendations and choosing the best alternative.
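As an illustration of ordering alternatives by fuzzy entropy, the sketch below uses the classical De Luca-Termini entropy of a fuzzy set, one of several definitions consistent with this record's claim that different calculation methods agree. The alternatives and their membership values are hypothetical.

```python
import math

def fuzzy_entropy(memberships):
    """De Luca-Termini fuzzy entropy of a fuzzy set, normalised to [0, 1].

    0 for a crisp set (all memberships 0 or 1); 1 when every
    membership equals 0.5 (maximal vagueness)."""
    def h(mu):
        if mu in (0.0, 1.0):
            return 0.0
        return -(mu * math.log2(mu) + (1 - mu) * math.log2(1 - mu))
    return sum(h(mu) for mu in memberships) / len(memberships)

# Criterial conformity of three hypothetical alternatives, each scored
# on three criteria by fuzzy membership degrees:
alternatives = {
    "A": [0.9, 0.8, 0.95],   # nearly crisp scores -> low entropy
    "B": [0.5, 0.5, 0.5],    # maximally vague -> entropy 1
    "C": [0.7, 0.4, 0.6],
}
# Structure the set of alternatives: least vague (most decisive) first.
ranking = sorted(alternatives, key=lambda k: fuzzy_entropy(alternatives[k]))
```

Sorting by ascending fuzzy entropy puts the alternative with the most decisive (least vague) criterial scores first.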

  20. SAR image target segmentation based on entropy maximization and morphology

    Institute of Scientific and Technical Information of China (English)

    柏正尧; 刘洲峰; 何佩琨

    2004-01-01

    Entropy maximization thresholding is a simple, effective image segmentation method. The relation between the histogram entropy and the gray level of an image is analyzed, and an approach is proposed that speeds up the computation of the optimal entropy-maximization threshold. The suggested method has been applied to synthetic aperture radar (SAR) image target segmentation; mathematical morphology works well in reducing the residual noise.
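The entropy-maximization thresholding this record builds on can be sketched as Kapur's method: choose the threshold that maximizes the sum of the Shannon entropies of the background and foreground class histograms. The toy histogram below is hypothetical, and the speed-up proposed in the paper is not reproduced here.

```python
import math

def kapur_threshold(hist):
    """Kapur's entropy-maximisation threshold for a grey-level histogram.

    Picks t maximising H(background) + H(foreground), where each H is
    the Shannon entropy of the normalised within-class histogram."""
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_h = 0, float("-inf")
    for t in range(1, len(p)):
        w0 = sum(p[:t])          # background mass, bins [0, t)
        w1 = 1.0 - w0            # foreground mass, bins [t, end)
        if w0 <= 0.0 or w1 <= 0.0:
            continue
        h0 = -sum(pi / w0 * math.log(pi / w0) for pi in p[:t] if pi > 0)
        h1 = -sum(pi / w1 * math.log(pi / w1) for pi in p[t:] if pi > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Bimodal 8-bin toy histogram: dark background peak, bright target peak.
hist = [40, 60, 30, 5, 2, 25, 50, 20]
t = kapur_threshold(hist)
```

The exhaustive scan over thresholds is O(levels^2) as written; the record's contribution is precisely to accelerate this search.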

  1. Test the principle of maximum entropy in constant sum 2×2 game: Evidence in experimental economics

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Bin, E-mail: xubin211@zju.edu.cn [Experimental Social Science Laboratory, Zhejiang University, Hangzhou, 310058 (China); Public Administration College, Zhejiang Gongshang University, Hangzhou, 310018 (China); Zhang, Hongen, E-mail: hongen777@163.com [Department of Physics, Zhejiang University, Hangzhou, 310027 (China); Wang, Zhijian, E-mail: wangzj@zju.edu.cn [Experimental Social Science Laboratory, Zhejiang University, Hangzhou, 310058 (China); Zhang, Jianbo, E-mail: jbzhang08@zju.edu.cn [Department of Physics, Zhejiang University, Hangzhou, 310027 (China)

    2012-03-19

    By using laboratory experimental data, we test the uncertainty of strategy type in various competing environments with the two-person constant sum 2×2 game in a social system. The results show that, in these competing game environments, the outcome of humans' decision-making obeys the principle of maximum entropy. -- Highlights: ► Test the uncertainty in two-person constant sum games with experimental data. ► On the game level, the constant sum game fits the principle of maximum entropy. ► On the group level, all empirical entropy values are close to the theoretical maxima. ► The results can be different for games that are not constant sum games.
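The group-level comparison reduces to computing the empirical Shannon entropy of the observed outcome frequencies and checking it against the theoretical maximum ln K for K outcome cells. A sketch with hypothetical play counts for the four cells of a 2×2 game:

```python
import math

def shannon_entropy(counts):
    """Empirical Shannon entropy (in nats) of observed outcome counts."""
    n = sum(counts)
    return -sum(c / n * math.log(c / n) for c in counts if c > 0)

# Hypothetical play counts for the four outcome cells of a 2x2 game.
outcome_counts = [52, 48, 51, 49]            # near-uniform play
entropy = shannon_entropy(outcome_counts)
max_entropy = math.log(len(outcome_counts))  # theoretical maximum, ln 4
```

When the empirical entropy sits close to ln 4, the observed play is consistent with the maximum entropy principle, as the record reports for constant sum games.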

  2. Stochastic model of the NASA/MSFC ground facility for large space structures with uncertain parameters: The maximum entropy approach

    Science.gov (United States)

    Hsia, Wei-Shen

    1987-01-01

    A stochastic control model of the NASA/MSFC Ground Facility for Large Space Structures (LSS) control verification is presented, based on the Maximum Entropy (ME) principle adopted in Hyland's method. Using ORACLS, a computer program was implemented for this purpose. Four models were then tested and the results are presented.

  3. Charge density study with the Maximum Entropy Method on model data of silicon. A search for non-nuclear attractors

    NARCIS (Netherlands)

    Vries, de R.Y.; Briels, W.J.; Feil, D.; Velde, te G.; Baerends, E.J.

    1996-01-01

    In 1990, Sakata and Sato applied the maximum entropy method (MEM) to a set of structure factors measured earlier by Saka and Kato with the Pendellösung method. They found the presence of non-nuclear attractors, i.e., maxima in the density between two bonded atoms. We applied the MEM to a limited set of

  4. Multi-Level Wavelet Shannon Entropy-Based Method for Single-Sensor Fault Location

    Directory of Open Access Journals (Sweden)

    Qiaoning Yang

    2015-10-01

    Full Text Available In actual application, sensors are prone to failure because of harsh environments, battery drain, and sensor aging. Sensor fault location is an important step for follow-up sensor fault detection. In this paper, two new multi-level wavelet Shannon entropies (multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy) are defined. They take full advantage of the sensor fault frequency distribution and energy distribution across multiple sub-bands in the wavelet domain. Based on the multi-level wavelet Shannon entropy, a method is proposed for single-sensor fault location. The method first uses a criterion of maximum energy-to-Shannon-entropy ratio to select the appropriate wavelet base for signal analysis. Multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy are then used to locate the fault. The method is validated using practical chemical gas concentration data from a gas sensor array. Compared with wavelet time Shannon entropy and wavelet energy Shannon entropy, the experimental results demonstrate that the proposed method achieves accurate location of a single sensor fault and has good anti-noise ability. The proposed method is feasible and effective for single-sensor fault location.
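A simplified sketch of a wavelet energy Shannon entropy, related to but not identical with the multi-level entropies defined in the record: decompose a signal with a Haar wavelet and take the Shannon entropy of the relative energies of the sub-bands. A noisy (faulty) sensor trace spreads energy across more sub-bands and therefore scores higher; the signals and noise level below are assumptions.

```python
import math
import random

def haar_step(x):
    """One level of the Haar DWT: (approximation, detail) coefficients."""
    a = [(x[2*i] + x[2*i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def wavelet_energy_entropy(x, levels=3):
    """Shannon entropy of the relative energies of the wavelet sub-bands."""
    energies, approx = [], list(x)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        energies.append(sum(c * c for c in detail))
    energies.append(sum(c * c for c in approx))
    total = sum(energies)
    return -sum(e / total * math.log(e / total) for e in energies if e > 0)

# A slow, clean trace concentrates energy in the approximation band;
# a noisy (faulty) trace spreads it across the detail bands.
random.seed(1)
clean = [math.sin(2 * math.pi * i / 64) for i in range(256)]
noisy = [c + random.gauss(0.0, 0.5) for c in clean]
e_clean = wavelet_energy_entropy(clean)
e_noisy = wavelet_energy_entropy(noisy)
```

In a fault-location setting, the channel whose entropy departs from the rest of the array is flagged as the faulty sensor.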

  5. Computational Amide I Spectroscopy for Refinement of Disordered Peptide Ensembles: Maximum Entropy and Related Approaches

    Science.gov (United States)

    Reppert, Michael; Tokmakoff, Andrei

    The structural characterization of intrinsically disordered peptides (IDPs) presents a challenging biophysical problem. Extreme heterogeneity and rapid conformational interconversion make traditional methods difficult to interpret. Due to its ultrafast (ps) shutter speed, Amide I vibrational spectroscopy has received considerable interest as a novel technique to probe IDP structure and dynamics. Historically, Amide I spectroscopy has been limited to delivering global secondary structural information. More recently, however, the method has been adapted to study structure at the local level through incorporation of isotope labels into the protein backbone at specific amide bonds. Thanks to the acute sensitivity of Amide I frequencies to local electrostatic interactions, particularly hydrogen bonds, spectroscopic data on isotope-labeled residues directly report on local peptide conformation. Quantitative information can be extracted using electrostatic frequency maps, which translate molecular dynamics trajectories into Amide I spectra for comparison with experiment. Here we present our recent efforts in the development of a rigorous approach to incorporating Amide I spectroscopic restraints into refined molecular dynamics structural ensembles using maximum entropy and related approaches. By combining force field predictions with experimental spectroscopic data, we construct refined structural ensembles for a family of short, strongly disordered, elastin-like peptides in aqueous solution.

  6. Predictive modeling and mapping of Malayan Sun Bear (Helarctos malayanus) distribution using maximum entropy.

    Science.gov (United States)

    Nazeri, Mona; Jusoff, Kamaruzaman; Madani, Nima; Mahmud, Ahmad Rodzi; Bahman, Abdul Rani; Kumar, Lalit

    2012-01-01

    One of the available tools for mapping the geographical distribution and potential suitable habitats is species distribution modeling. These techniques are very helpful for finding poorly known distributions of species in poorly sampled areas, such as the tropics. Maximum Entropy (MaxEnt) is a recently developed modeling method that can be successfully calibrated using a relatively small number of records. In this research, the MaxEnt model was applied to describe the distribution and identify the key factors shaping the potential distribution of the vulnerable Malayan Sun Bear (Helarctos malayanus) in one of the main remaining habitats in Peninsular Malaysia. MaxEnt results showed that even though Malaysian sun bear habitat is tied to tropical evergreen forests, it lives at a marginal threshold of bio-climatic variables. On the other hand, current protected area networks within Peninsular Malaysia do not cover most of the sun bear's potential suitable habitats. Assuming that the predicted suitability map covers the sun bear's actual distribution, future climate change, forest degradation and illegal hunting could potentially severely affect the sun bear's population.

  7. Potential distribution of Mexican primates: modeling the ecological niche with the maximum entropy algorithm.

    Science.gov (United States)

    Vidal-García, Francisca; Serio-Silva, Juan Carlos

    2011-07-01

    We developed a potential distribution model for the tropical rain forest primate species of southern Mexico: the black howler monkey (Alouatta pigra), the mantled howler monkey (Alouatta palliata), and the spider monkey (Ateles geoffroyi). To do so, we applied the maximum entropy algorithm from the ecological niche modeling program MaxEnt. For each species, we used occurrence records from scientific collections and published and unpublished sources, as well as the 19 environmental coverage variables related to precipitation and temperature from WorldClim, to develop the models. The predicted distribution of A. pigra was strongly associated with the mean temperature of the warmest quarter (23.6%), whereas the potential distributions of A. palliata and A. geoffroyi were strongly associated with precipitation during the coldest quarter (52.2 and 34.3%, respectively). The potential distribution of A. geoffroyi is broader than that of the Alouatta spp. The areas with the greatest probability of presence of A. pigra and A. palliata are strongly associated with riparian vegetation, whereas the presence of A. geoffroyi is more strongly associated with the presence of rain forest. Our most significant contribution is the identification of areas with a high probability of the presence of these primate species, information that can be applied in planning future studies and in establishing criteria for the creation of areas for primate conservation in Mexico.

  8. Maximum entropy estimation of glutamate and glutamine in MR spectroscopic imaging.

    Science.gov (United States)

    Rathi, Yogesh; Ning, Lipeng; Michailovich, Oleg; Liao, HuiJun; Gagoski, Borjan; Grant, P Ellen; Shenton, Martha E; Stern, Robert; Westin, Carl-Fredrik; Lin, Alexander

    2014-01-01

    Magnetic resonance spectroscopic imaging (MRSI) is often used to estimate the concentration of several brain metabolites. Abnormalities in these concentrations can indicate specific pathology, which can be quite useful in understanding the disease mechanism underlying those changes. Due to higher concentration, metabolites such as N-acetylaspartate (NAA), Creatine (Cr) and Choline (Cho) can be readily estimated using standard Fourier transform techniques. However, metabolites such as Glutamate (Glu) and Glutamine (Gln) occur in significantly lower concentrations and their resonance peaks are very close to each other making it difficult to accurately estimate their concentrations (separately). In this work, we propose to use the theory of 'Spectral Zooming' or high-resolution spectral analysis to separate the Glutamate and Glutamine peaks and accurately estimate their concentrations. The method works by estimating a unique power spectral density, which corresponds to the maximum entropy solution of a zero-mean stationary Gaussian process. We demonstrate our estimation technique on several physical phantom data sets as well as on in-vivo brain spectroscopic imaging data. The proposed technique is quite general and can be used to estimate the concentration of any other metabolite of interest.
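The maximum entropy power spectral density of a zero-mean stationary Gaussian process is an all-pole (autoregressive) spectrum; a standard way to estimate it is Burg's method, sketched below on a hypothetical AR(1) test signal. This illustrates the general high-resolution principle the record invokes, not the authors' metabolite-fitting pipeline.

```python
import cmath
import math
import random

def burg_ar(x, order):
    """Burg's method: coefficients a[1..p] of the maximum entropy
    all-pole model  x[t] + a1*x[t-1] + ... + ap*x[t-p] = e[t]."""
    n = len(x)
    f = list(x)   # forward prediction errors
    b = list(x)   # backward prediction errors
    a = []
    for m in range(order):
        # Reflection coefficient minimising forward+backward error power.
        num = sum(f[i] * b[i - 1] for i in range(m + 1, n))
        den = sum(f[i] ** 2 + b[i - 1] ** 2 for i in range(m + 1, n))
        k = -2.0 * num / den
        a = [a[j] + k * a[m - 1 - j] for j in range(m)] + [k]
        f, b = ([0.0] + [f[i] + k * b[i - 1] for i in range(1, n)],
                [0.0] + [b[i - 1] + k * f[i] for i in range(1, n)])
    return a

def max_entropy_psd(a, sigma2, freq):
    """Maximum entropy PSD of the AR model at normalised freq in [0, 0.5]."""
    denom = 1.0 + sum(aj * cmath.exp(-2j * math.pi * freq * (j + 1))
                      for j, aj in enumerate(a))
    return sigma2 / abs(denom) ** 2

# Hypothetical AR(1) test signal: x[t] = 0.9*x[t-1] + white noise.
random.seed(2)
x, prev = [], 0.0
for _ in range(2000):
    prev = 0.9 * prev + random.gauss(0.0, 1.0)
    x.append(prev)
a = burg_ar(x, 1)   # expect a[0] close to -0.9
```

Because the model is all-pole, nearby spectral peaks can be resolved with far fewer data points than a Fourier periodogram would need, which is the "spectral zooming" property exploited to separate close resonances.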

  9. Bayesian Maximum Entropy space/time estimation of surface water chloride in Maryland using river distances.

    Science.gov (United States)

    Jat, Prahlad; Serre, Marc L

    2016-12-01

    Widespread contamination of surface water chloride is an emerging environmental concern. Consequently accurate and cost-effective methods are needed to estimate chloride along all river miles of potentially contaminated watersheds. Here we introduce a Bayesian Maximum Entropy (BME) space/time geostatistical estimation framework that uses river distances, and we compare it with Euclidean BME to estimate surface water chloride from 2005 to 2014 in the Gunpowder-Patapsco, Severn, and Patuxent subbasins in Maryland. River BME improves the cross-validation R(2) by 23.67% over Euclidean BME, and river BME maps are significantly different than Euclidean BME maps, indicating that it is important to use river BME maps to assess water quality impairment. The river BME maps of chloride concentration show wide contamination throughout Baltimore and Columbia-Ellicott cities, the disappearance of a clean buffer separating these two large urban areas, and the emergence of multiple localized pockets of contamination in surrounding areas. The number of impaired river miles increased by 0.55% per year in 2005-2009 and by 1.23% per year in 2011-2014, corresponding to a marked acceleration of the rate of impairment. Our results support the need for control measures and increased monitoring of unassessed river miles.

  10. Characterizing species abundance distributions across taxa and ecosystems using a simple maximum entropy model.

    Science.gov (United States)

    White, Ethan P; Thibault, Katherine M; Xiao, Xiao

    2012-08-01

    The species abundance distribution (SAD) is one of the most studied patterns in ecology due to its potential insights into commonness and rarity, community assembly, and patterns of biodiversity. It is well established that communities are composed of a few common and many rare species, and numerous theoretical models have been proposed to explain this pattern. However, no attempt has been made to determine how well these theoretical characterizations capture observed taxonomic and global-scale spatial variation in the general form of the distribution. Here, using data of a scope unprecedented in community ecology, we show that a simple maximum entropy model produces a truncated log-series distribution that can predict between 83% and 93% of the observed variation in the rank abundance of species across 15,848 globally distributed communities including birds, mammals, plants, and butterflies. This model requires knowledge of only the species richness and total abundance of the community to predict the full abundance distribution, which suggests that these factors are sufficient to understand the distribution for most purposes. Since geographic patterns in richness and abundance can often be successfully modeled, this approach should allow the distribution of commonness and rarity to be characterized, even in locations where empirical data are unavailable.
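The truncated log-series prediction can be computed from species richness S and total abundance N alone: the MaxEnt abundance distribution is P(n) ∝ x^n/n for n = 1..N, with x chosen so that the mean abundance equals N/S. A sketch with illustrative values of S and N, solving for x by bisection:

```python
def logseries_sad(S, N, tol=1e-10):
    """MaxEnt (truncated log-series) species abundance distribution:
    P(n) proportional to x**n / n for n = 1..N, with the parameter x
    fixed by the constraint that the mean abundance equals N/S."""
    def mean_abundance(x):
        num = sum(x ** n for n in range(1, N + 1))
        den = sum(x ** n / n for n in range(1, N + 1))
        return num / den
    lo, hi = 1e-12, 1.0 - 1e-12   # mean_abundance is increasing in x
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if mean_abundance(mid) < N / S:
            lo = mid
        else:
            hi = mid
    x = (lo + hi) / 2.0
    z = sum(x ** n / n for n in range(1, N + 1))   # normalising constant
    return [x ** n / (n * z) for n in range(1, N + 1)]

# Illustrative community: 20 species, 400 individuals.
probs = logseries_sad(S=20, N=400)
```

The resulting distribution is strictly decreasing in abundance class, reproducing the "few common, many rare" shape from the two state variables alone.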

  11. Using Maximum Entropy Modeling for Optimal Selection of Sampling Sites for Monitoring Networks

    Directory of Open Access Journals (Sweden)

    Paul H. Evangelista

    2011-05-01

    Full Text Available Environmental monitoring programs must efficiently describe state shifts. We propose using maximum entropy modeling to select dissimilar sampling sites to capture environmental variability at low cost, and demonstrate a specific application: sample site selection for the Central Plains domain (453,490 km2) of the National Ecological Observatory Network (NEON). We relied on four environmental factors: mean annual temperature and precipitation, elevation, and vegetation type. A “sample site” was defined as a 20 km × 20 km area (equal to NEON’s airborne observation platform [AOP] footprint), within which each 1 km2 cell was evaluated for each environmental factor. After each model run, the most environmentally dissimilar site was selected from all potential sample sites. The iterative selection of eight sites captured approximately 80% of the environmental envelope of the domain, an improvement over stratified random sampling and simple random designs for sample site selection. This approach can be widely used for cost-efficient selection of survey and monitoring sites.
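The iterative "most dissimilar site" selection can be caricatured with a greedy max-min (farthest point) rule in standardized environmental space. This stands in for the MaxEnt-based dissimilarity used in the paper, and the toy site coordinates are hypothetical.

```python
def select_dissimilar_sites(candidates, k):
    """Greedy selection: repeatedly add the candidate most dissimilar
    (max-min squared Euclidean distance in standardised environmental
    space) to the sites already chosen."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    # Seed with the candidate farthest from the environmental centroid.
    dims = len(candidates[0])
    centroid = [sum(c[i] for c in candidates) / len(candidates)
                for i in range(dims)]
    chosen = [max(candidates, key=lambda c: dist2(c, centroid))]
    while len(chosen) < k:
        nxt = max((c for c in candidates if c not in chosen),
                  key=lambda c: min(dist2(c, s) for s in chosen))
        chosen.append(nxt)
    return chosen

# Toy sites: (temperature, precipitation, elevation), already standardised.
sites = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.1), (1.0, 1.0, 1.0),
         (0.9, 1.1, 1.0), (-1.0, 0.5, 2.0), (0.0, 2.0, -1.0)]
picked = select_dissimilar_sites(sites, 3)
```

Each iteration adds the site farthest from everything already selected, so near-duplicate candidates are skipped and the picked set spreads across the environmental envelope.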

  12. A Maximum Entropy Model of the Bearded Capuchin Monkey Habitat Incorporating Topography and Spectral Unmixing Analysis

    Science.gov (United States)

    Howard, A. M.; Bernardes, S.; Nibbelink, N.; Biondi, L.; Presotto, A.; Fragaszy, D. M.; Madden, M.

    2012-07-01

    Movement patterns of bearded capuchin monkeys (Cebus (Sapajus) libidinosus) in northeastern Brazil are likely impacted by environmental features such as elevation, vegetation density, or vegetation type. Habitat preferences of these monkeys provide insights regarding the impact of environmental features on species ecology and the degree to which they incorporate these features in movement decisions. In order to evaluate environmental features influencing movement patterns and predict areas suitable for movement, we employed a maximum entropy modelling approach, using observation points along capuchin monkey daily routes as species presence points. We combined these presence points with spatial data on important environmental features from remotely sensed data on land cover and topography. A spectral mixture analysis procedure was used to generate fraction images that represent green vegetation, shade and soil of the study area. A Landsat Thematic Mapper scene of the study area was geometrically and atmospherically corrected and used as input to a Minimum Noise Fraction (MNF) procedure, and a linear spectral unmixing approach was used to generate the fraction images. These fraction images and elevation were the environmental layer inputs for our logistic MaxEnt model of capuchin movement. Our model's predictive power (test AUC) was 0.775. Areas of high elevation (>450 m) showed low probabilities of presence, and percent green vegetation was the greatest overall contributor to model AUC. This work has implications for predicting daily movement patterns of capuchins at our field site, as suitability values from our model may relate to habitat preference and ease of movement.

  13. On the relevance of the maximum entropy principle in non-equilibrium statistical mechanics

    Science.gov (United States)

    Auletta, Gennaro; Rondoni, Lamberto; Vulpiani, Angelo

    2017-07-01

    At first glance, the maximum entropy principle (MEP) apparently allows us to derive, or justify in a simple way, fundamental results of equilibrium statistical mechanics. Because of this, a school of thought considers the MEP a powerful and elegant way to make predictions in physics and other disciplines, rather than a useful technical tool like others in statistical physics. From this point of view, the MEP appears as an alternative and more general predictive method than the traditional ones of statistical physics. Actually, careful inspection shows that such success is due to a series of fortunate facts that characterize the physics of equilibrium systems, but which are absent in situations not described by Hamiltonian dynamics, or generically in nonequilibrium phenomena. Here we discuss several important examples in non-equilibrium statistical mechanics in which the MEP leads to incorrect predictions, proving that it does not have a predictive nature. We conclude that, in these paradigmatic examples, an approach that uses a detailed analysis of the relevant aspects of the dynamics cannot be avoided.

  14. Using maximum entropy modeling for optimal selection of sampling sites for monitoring networks

    Science.gov (United States)

    Stohlgren, Thomas J.; Kumar, Sunil; Barnett, David T.; Evangelista, Paul H.

    2011-01-01

    Environmental monitoring programs must efficiently describe state shifts. We propose using maximum entropy modeling to select dissimilar sampling sites to capture environmental variability at low cost, and demonstrate a specific application: sample site selection for the Central Plains domain (453,490 km2) of the National Ecological Observatory Network (NEON). We relied on four environmental factors: mean annual temperature and precipitation, elevation, and vegetation type. A “sample site” was defined as a 20 km × 20 km area (equal to NEON’s airborne observation platform [AOP] footprint), within which each 1 km2 cell was evaluated for each environmental factor. After each model run, the most environmentally dissimilar site was selected from all potential sample sites. The iterative selection of eight sites captured approximately 80% of the environmental envelope of the domain, an improvement over stratified random sampling and simple random designs for sample site selection. This approach can be widely used for cost-efficient selection of survey and monitoring sites.

  15. Reinterpreting maximum entropy in ecology: a null hypothesis constrained by ecological mechanism.

    Science.gov (United States)

    O'Dwyer, James P; Rominger, Andrew; Xiao, Xiao

    2017-07-01

    Simplified mechanistic models in ecology have been criticised for the fact that a good fit to data does not imply the mechanism is true: pattern does not equal process. In parallel, the maximum entropy principle (MaxEnt) has been applied in ecology to make predictions constrained by just a handful of state variables, like total abundance or species richness. But an outstanding question remains: what principle tells us which state variables to constrain? Here we attempt to solve both problems simultaneously, by translating a given set of mechanisms into the state variables to be used in MaxEnt, and then using this MaxEnt theory as a null model against which to compare mechanistic predictions. In particular, we identify the sufficient statistics needed to parametrise a given mechanistic model from data and use them as MaxEnt constraints. Our approach isolates exactly what mechanism is telling us over and above the state variables alone. © 2017 John Wiley & Sons Ltd/CNRS.
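    The core idea, using the sufficient statistics of a mechanistic model as MaxEnt constraints, can be illustrated with the simplest case of a single constraint. The sketch below solves for the maximum entropy distribution over discrete abundance classes given only a prescribed mean; the state space, target mean, and bisection bracket are illustrative assumptions, not the authors' model:

```python
import numpy as np

def maxent_dist(states, target_mean, iters=200):
    """Maximum entropy distribution over a discrete state space subject
    to a single sufficient-statistic constraint (the mean). The solution
    is the exponential family p_i proportional to exp(-lam * x_i); the
    multiplier lam is found by bisection on the constraint equation."""
    x = np.asarray(states, dtype=float)

    def mean_given(lam):
        w = np.exp(-lam * (x - x.min()))   # shift for numerical stability
        return (w * x).sum() / w.sum()

    lo, hi = 0.0, 5.0   # assumes target is below the unconstrained mean
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_given(mid) > target_mean:
            lo = mid                       # need stronger decay
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    p = np.exp(-lam * (x - x.min()))
    return p / p.sum()

# Abundance classes 1..100 constrained to mean 10: an exponential-like decay
p = maxent_dist(np.arange(1, 101), target_mean=10.0)
print(round(float((p * np.arange(1, 101)).sum()), 3))  # prints 10.0
```

The resulting distribution is the null expectation; systematic deviations of data from it are then attributable to mechanism beyond the constrained statistic.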

  16. Predictive modeling and mapping of Malayan Sun Bear (Helarctos malayanus) distribution using maximum entropy.

    Directory of Open Access Journals (Sweden)

    Mona Nazeri

    Full Text Available One of the available tools for mapping a species' geographical distribution and potential suitable habitat is species distribution modelling. These techniques are very helpful for estimating the poorly known distributions of species in poorly sampled areas, such as the tropics. Maximum Entropy (MaxEnt) is a recently developed modelling method that can be successfully calibrated using a relatively small number of records. In this research, the MaxEnt model was applied to describe the distribution and identify the key factors shaping the potential distribution of the vulnerable Malayan Sun Bear (Helarctos malayanus) in one of the main remaining habitats in Peninsular Malaysia. MaxEnt results showed that even though sun bear habitat is tied to tropical evergreen forests, the species lives within a marginal threshold of bio-climatic variables. On the other hand, the current protected-area network within Peninsular Malaysia does not cover most of the sun bear's potentially suitable habitat. Assuming that the predicted suitability map covers the sun bear's actual distribution, future climate change, forest degradation and illegal hunting could severely affect the sun bear population.

  17. Maximum entropy inference of seabed attenuation parameters using ship radiated broadband noise.

    Science.gov (United States)

    Knobles, D P

    2015-12-01

    The received acoustic field generated by a single passage of a research vessel on the New Jersey continental shelf is employed to infer probability distributions for the parameter values representing the frequency dependence of the seabed attenuation and the source levels of the ship. The statistical inference approach employed in the analysis is a maximum entropy methodology. The average value of the error function, needed to uniquely specify a conditional posterior probability distribution, is estimated with data samples from time periods in which the ship-receiver geometry is dominated by either the stern or bow aspect. The existence of ambiguities between the source levels and the environmental parameter values motivates an attempt to partially decouple these parameter values. The main result is the demonstration that parameter values for the attenuation (α and the frequency exponent), the sediment sound speed, and the source levels can be resolved through a model space reduction technique. The results of this multi-step statistical inference approach developed for ship-radiated noise are then tested by processing towed-source data over the same bandwidth and source track to estimate continuous-wave source levels that were measured independently with a reference hydrophone on the tow body.

  18. Mind the edge! The role of adjacency matrix degeneration in maximum entropy weighted network models

    CERN Document Server

    Sagarra, Oleguer; Díaz-Guilera, Albert

    2015-01-01

    Complex network null models based on entropy maximization are becoming a powerful tool to characterize and analyze data from real systems. However, it is not easy to extract good and unbiased information from these models: a proper understanding of the nature of the underlying events represented in them is crucial. In this paper we emphasize this fact, stressing how an accurate counting of configurations compatible with given constraints is fundamental to building good null models for networks with integer-valued adjacency matrices constructed from the aggregation of one or multiple layers. We show how different assumptions about the elements from which the networks are built give rise to distinctively different statistics, even when considering the same observables to match those of real data. We illustrate our findings by applying the formalism to three datasets using an open-source software package accompanying the present work, and demonstrate how such differences are clearly seen when measuring network...

  19. A Hardware Architecture of a Counter-Based Entropy Coder

    Directory of Open Access Journals (Sweden)

    Armein Z R Langi

    2012-04-01

    Full Text Available This paper describes a hardware architectural design of a real-time counter-based entropy coder at the register transfer level (RTL) computing model. The architecture is based on a lossless compression algorithm called Rice coding, which is optimal for an entropy range of bits per sample. The architecture incorporates a word-splitting scheme to extend the entropy coverage into a range of bits per sample. We have designed a data structure in the form of independent code blocks, allowing a more robust compressed bitstream. The design focuses on an RTL computing model and architecture, utilizing 8-bit buffers, adders, registers, loader-shifters, select-logics, down-counters, up-counters, and multiplexers. We have validated the architecture (both the encoder and the decoder) in a coprocessor for 8 bits/sample data on an FPGA Xilinx XC4005, utilizing 61% of F&G-CLBs, 34% of H-CLBs, 32% of FF-CLBs, and 68% of IO resources. On this FPGA implementation, the encoder and decoder achieve throughputs of 1.74 Mbits/s and 2.91 Mbits/s, respectively. The architecture allows pipelining, resulting in a potential maximum encoding throughput of 200 Mbit/s on typical real-time TTL implementations. In addition, it uses a minimum number of register elements. As a result, this architecture allows realizations with low cost, low energy consumption and reduced silicon area.
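    The Golomb-Rice coding at the heart of the architecture can be illustrated in software. This is a minimal bit-level sketch of Rice encoding and decoding, not the RTL design; the parameter k and the sample values are arbitrary:

```python
def rice_encode(values, k):
    """Rice-encode non-negative integers with parameter k:
    unary quotient (q ones + a terminating 0) + k-bit remainder."""
    bits = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        bits.extend([1] * q + [0])                               # unary part
        bits.extend((r >> i) & 1 for i in range(k - 1, -1, -1))  # binary part
    return bits

def rice_decode(bits, k, count):
    """Invert rice_encode for a known number of symbols."""
    values, pos = [], 0
    for _ in range(count):
        q = 0
        while bits[pos] == 1:          # read the unary quotient
            q += 1
            pos += 1
        pos += 1                       # skip the terminating 0
        r = 0
        for _ in range(k):             # read the k-bit remainder
            r = (r << 1) | bits[pos]
            pos += 1
        values.append((q << k) | r)
    return values

data = [3, 18, 7, 0, 12]
code = rice_encode(data, k=3)
assert rice_decode(code, k=3, count=len(data)) == data
```

The choice of k trades unary length against remainder length, which is why the coder is optimal only within a particular entropy range per sample.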

  20. Identifying English BaseNPs Through a Combination of Maximum Entropy Approach and Brill Approach

    Institute of Scientific and Technical Information of China (English)

    吕琳; 刘玉树

    2006-01-01

    To further improve the identification accuracy of base noun phrases (BaseNPs), an English BaseNP identification algorithm is proposed that combines the maximum entropy approach and the Brill approach, exploiting the distinct strengths of each. The algorithm builds on high-accuracy part-of-speech tagging. In both the training and the testing stage, BaseNPs are first identified with the maximum entropy method, and this already highly accurate output is then used as the initial annotation for the Brill method. Experimental results show that the combined algorithm achieves 94% precision and recall, fully merging the advantages of the maximum entropy and Brill methods, and is comparable with the best published English BaseNP identification results based on the same training and test corpora.

  1. Entropy-based Tuning of Musical Instruments

    CERN Document Server

    Hinrichsen, Haye

    2012-01-01

    The human sense of hearing perceives a combination of sounds 'in tune' if the corresponding harmonic spectra are correlated, meaning that the neuronal excitation pattern in the inner ear exhibits some kind of order. Based on this observation it is suggested that musical instruments such as pianos can be tuned by minimizing the Shannon entropy of suitably preprocessed Fourier spectra. This method reproduces not only the correct stretch curve but also similar pitch fluctuations as in the case of high-quality aural tuning.
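    The quantity the method minimizes can be sketched as the Shannon entropy of a normalized power spectrum. This is a bare-bones illustration that omits the paper's preprocessing of the Fourier spectra, and the test signals are assumptions:

```python
import numpy as np

def spectral_entropy(signal):
    """Shannon entropy (bits) of the normalized power spectrum of a
    signal. Lower entropy corresponds to a more ordered (concentrated)
    harmonic content, the quantity the tuning method minimizes."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    p = power / power.sum()
    p = p[p > 0]                       # avoid log(0)
    return -(p * np.log2(p)).sum()

t = np.linspace(0, 1, 4096, endpoint=False)
pure = np.sin(2 * np.pi * 440 * t)                 # a single harmonic
rng = np.random.default_rng(1)
noisy = pure + 0.5 * rng.normal(size=t.size)       # broadband noise added
assert spectral_entropy(pure) < spectral_entropy(noisy)
```

A concentrated spectrum (one partial) has near-zero entropy, while broadband content spreads probability over many bins and raises it.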

  2. Ecosystem functioning and maximum entropy production: a quantitative test of hypotheses.

    Science.gov (United States)

    Meysman, Filip J R; Bruers, Stijn

    2010-05-12

    The idea that entropy production puts a constraint on ecosystem functioning is quite popular in ecological thermodynamics. Yet, until now, such claims have received little quantitative verification. Here, we examine three 'entropy production' hypotheses that have been put forward in the past. The first states that increased entropy production serves as a fingerprint of living systems. The other two hypotheses invoke stronger constraints. The state selection hypothesis states that when a system can attain multiple steady states, the stable state will show the highest entropy production rate. The gradient response principle requires that when the thermodynamic gradient increases, the system's new stable state should always be accompanied by a higher entropy production rate. We test these three hypotheses by applying them to a set of conventional food web models. Each time, we calculate the entropy production rate associated with the stable state of the ecosystem. This analysis shows that the first hypothesis holds for all the food webs tested: the living state always shows higher entropy production than the abiotic state. In contrast, the state selection and gradient response hypotheses break down when the food web incorporates more than one trophic level, indicating that they are not generally valid.

  3. Entropy coders for image compression based on binary forward classification

    Science.gov (United States)

    Yoo, Hoon; Jeong, Jechang

    2000-12-01

    Entropy coders, as a noiseless compression method, are widely used as the final compression step for images, and there have been many contributions to increasing entropy coder performance and reducing entropy coder complexity. In this paper, we propose several entropy coders based on binary forward classification (BFC). The BFC requires classification overhead, but there is no change between the amount of input information and the total amount of classified output information, a property we prove in this paper. Using this property, we propose entropy coders consisting of the BFC followed by Golomb-Rice coders (BFC+GR) and the BFC followed by arithmetic coders (BFC+A). The proposed entropy coders introduce negligible additional complexity due to the BFC. Simulation results also show better performance than other entropy coders of similar complexity.

  4. Application of a maximum entropy method to estimate the probability density function of nonlinear or chaotic behavior in structural health monitoring data

    Science.gov (United States)

    Livingston, Richard A.; Jin, Shuang

    2005-05-01

    Bridges and other civil structures can exhibit nonlinear and/or chaotic behavior under ambient traffic or wind loadings. The probability density function (pdf) of the observed structural responses thus plays an important role for long-term structural health monitoring, LRFR and fatigue life analysis. However, the actual pdf of such structural response data often has a very complicated shape due to its fractal nature. Various conventional methods to approximate it can often lead to biased estimates. This paper presents recent research progress at the Turner-Fairbank Highway Research Center of the FHWA in applying a novel probabilistic scaling scheme for enhanced maximum entropy evaluation to find the most unbiased pdf. The maximum entropy method is applied with a fractal interpolation formulation based on contraction mappings through an iterated function system (IFS). Based on a fractal dimension determined from the entire response data set by an algorithm involving the information dimension, a characteristic uncertainty parameter, called the probabilistic scaling factor, can be introduced. This allows significantly enhanced maximum entropy evaluation through the added inferences about the fine scale fluctuations in the response data. Case studies using the dynamic response data sets collected from a real world bridge (Commodore Barry Bridge, PA) and from the simulation of a classical nonlinear chaotic system (the Lorenz system) are presented in this paper. The results illustrate the advantages of the probabilistic scaling method over conventional approaches for finding the unbiased pdf especially in the critical tail region that contains the larger structural responses.

  5. Entropy-based financial asset pricing.

    Directory of Open Access Journals (Sweden)

    Mihály Ormos

    Full Text Available We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios in a simpler way and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define continuous entropy as an alternative measure of risk. Our results show that entropy decreases as a function of the number of securities in a portfolio, in a similar way to the standard deviation, and that efficient portfolios lie on a hyperbola in the expected return-entropy coordinate system. For the empirical investigation we use daily returns of 150 randomly selected securities over a period of 27 years. Our regression results show that entropy has higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore, we show the time-varying behavior of beta along with entropy.

  6. Entropy-based financial asset pricing.

    Science.gov (United States)

    Ormos, Mihály; Zibriczky, Dávid

    2014-01-01

    We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios in a simpler way and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define continuous entropy as an alternative measure of risk. Our results show that entropy decreases as a function of the number of securities in a portfolio, in a similar way to the standard deviation, and that efficient portfolios lie on a hyperbola in the expected return-entropy coordinate system. For the empirical investigation we use daily returns of 150 randomly selected securities over a period of 27 years. Our regression results show that entropy has higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore, we show the time-varying behavior of beta along with entropy.
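    The behavior described, entropy growing with dispersion much like the standard deviation, can be sketched with a histogram estimate of the differential entropy of a return series. The estimator, bin count, and simulated returns below are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

def differential_entropy(returns, bins=50):
    """Histogram estimate of the differential entropy (nats) of a
    return series: H is approximately -sum_i p_i ln(p_i) + ln(bin width)."""
    counts, edges = np.histogram(returns, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum() + np.log(edges[1] - edges[0])

# Two simulated Gaussian return series with different volatilities:
# entropy orders them the same way the standard deviation does.
rng = np.random.default_rng(2)
low_vol = rng.normal(0.0, 0.01, 50_000)
high_vol = rng.normal(0.0, 0.03, 50_000)
assert differential_entropy(low_vol) < differential_entropy(high_vol)
```

For Gaussian returns the exact value is 0.5 ln(2*pi*e*sigma^2), so the entropy is a monotone transform of the standard deviation; the interest of the entropy measure lies in non-Gaussian return distributions, where the two diverge.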

  7. Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties

    Directory of Open Access Journals (Sweden)

    Rui A. P. Perdigão

    2012-06-01

    Full Text Available The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X and Y, compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds increasingly approaching the true MI. In particular, using standard bivariate Gaussian marginal distributions, it allows for the decomposition of the MI into two positive terms: the Gaussian MI (Ig), depending upon the Gaussian correlation or the correlation between ‘Gaussianized variables’, and a non-Gaussian MI (Ing), coinciding with the joint negentropy and depending upon nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities, where Ing grows from zero at the ‘Gaussian manifold’, where moments are those of Gaussian distributions, towards infinity at the set’s boundary, where a deterministic relationship holds. Sources of joint non-Gaussianity have been systematized by estimating Ing between the input and output of a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noises, for a full range of signal-to-noise ratio (snr) variances. We have studied the effect of varying snr on Ig and Ing under several signal/noise scenarios.
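    For bivariate Gaussian data the Gaussian term of the decomposition has the closed form Ig = -(1/2) ln(1 - rho^2), and Ing vanishes, so Ig is the full MI. A quick numerical check on simulated data (the seed, correlation, and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]
# Draw a large bivariate Gaussian sample with correlation rho
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=100_000).T

r = np.corrcoef(x, y)[0, 1]          # sample Gaussian correlation
ig = -0.5 * np.log(1.0 - r ** 2)     # Gaussian MI in nats
print(round(ig, 2))                  # close to -0.5*ln(1 - 0.8**2), about 0.51
```

For non-Gaussian pairs the same formula applied to the correlation of 'Gaussianized variables' gives only the Ig term, and the gap to the true MI is the non-Gaussian contribution Ing.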

  8. Simultaneous State and Parameter Estimation Using Maximum Relative Entropy with Nonhomogenous Differential Equation Constraints

    Directory of Open Access Journals (Sweden)

    Adom Giffin

    2014-09-01

    Full Text Available In this paper, we continue our efforts to show how maximum relative entropy (MrE) can be used as a universal updating algorithm. Here, our purpose is to tackle a joint state and parameter estimation problem where our system is nonlinear and in a non-equilibrium state, i.e., perturbed by varying external forces. Traditional parameter estimation can be performed by using filters, such as the extended Kalman filter (EKF). However, as shown with a toy example of a system governed by first-order non-homogeneous ordinary differential equations, assumptions made by the EKF algorithm (such as the Markov assumption) may not be valid. The problem can be addressed with exponential smoothing, e.g., the exponentially weighted moving average (EWMA). Although this has been shown to produce acceptable filtering results in real exponential systems, it still cannot simultaneously estimate both the state and its parameters, and it has its own assumptions that are not always valid, for example when jump discontinuities exist. We show that by applying MrE as a filter, we can not only develop closed-form solutions, but we can also infer the parameters of the differential equation simultaneously with the means. This is useful in real, physical systems, where we want not only to filter the noise from our measurements, but also to simultaneously infer the parameters of the dynamics of a nonlinear and non-equilibrium system. Although many assumptions were made throughout the paper to illustrate that the EKF and exponential smoothing are special cases of MrE, we are not “constrained” by these assumptions. In other words, MrE is completely general and can be used in broader ways.
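    The exponential smoothing baseline that the paper recovers as a special case of MrE is the classical EWMA recursion s_t = alpha * x_t + (1 - alpha) * s_{t-1}. A minimal sketch, with an illustrative signal and noise level:

```python
import numpy as np

def ewma(x, alpha):
    """Exponentially weighted moving average:
    s_t = alpha * x_t + (1 - alpha) * s_{t-1}."""
    s = np.empty(len(x), dtype=float)
    s[0] = x[0]
    for t in range(1, len(x)):
        s[t] = alpha * x[t] + (1 - alpha) * s[t - 1]
    return s

rng = np.random.default_rng(4)
signal = np.sin(np.linspace(0, 6, 300))           # slowly varying truth
noisy = signal + 0.3 * rng.normal(size=300)       # noisy measurements
smooth = ewma(noisy, alpha=0.1)

# Smoothing should reduce the mean squared error against the true signal
assert np.mean((smooth - signal) ** 2) < np.mean((noisy - signal) ** 2)
```

Note what the sketch cannot do, which motivates the paper: the recursion filters the state but has no mechanism for inferring the parameters of the underlying dynamics at the same time.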

  9. Entropy-based consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław Jerzy

    2016-09-01

    A description of software architecture is a plan for the construction of an IT system, so any gaps in the architecture affect the overall success of the entire project. Most definitions describe software architecture as a set of views which are mutually unrelated, and hence potentially inconsistent. Software architecture completeness is also often described ambiguously. As a result, most methods of building IT systems contain many gaps and ambiguities, which are obstacles to the automation of software building. In this article the consistency and completeness of a software architecture are mathematically defined based on calculating the entropy of the architecture description. Following this approach, we also propose a method for automatic verification of the consistency and completeness of the software architecture development method presented in our previous article as Consistent Model Driven Architecture (CMDA). The proposed FBS (Functionality-Behaviour-Structure) entropy-based metric applied in our CMDA approach enables IT architects to decide whether the modelling process is complete and consistent. With this metric, software architects can assess the readiness of the ongoing modelling work for the start of IT system building; it even allows them to assess objectively whether the designed software architecture of the IT system could be implemented at all. The overall benefit of this approach is that it facilitates the preparation of a complete and consistent software architecture more effectively and enables assessment and monitoring of the ongoing modelling development status. We demonstrate this with a few industry examples of IT system designs.

  10. An Entropy-Based Damage Characterization

    Directory of Open Access Journals (Sweden)

    Mehdi Amiri

    2014-12-01

    Full Text Available This paper presents a scientific basis for describing the causes of damage within an irreversible thermodynamic framework, and the effects of damage as observable variables that signify degradation of structural integrity. The approach relies on the fundamentals of irreversible thermodynamics, and specifically on the notion of entropy generation as a measure of degradation and damage. We first review the state-of-the-art advances in the entropic treatment of damage, followed by a discussion of generalizing the entropic concept to damage characterization, which may offer a better definition of the damage metric commonly used for structural integrity assessment. In general, this approach provides the opportunity to describe the reliability and risk of structures in terms of fundamental science concepts. Over the years, many studies have focused on materials damage assessment by determining physics-based cause-and-effect relationships; the goal of this paper is to put this work in perspective and to encourage future work on materials damage based on the entropy concept.

  11. A temperature and emissivity separation algorithm based on maximum entropy estimation of the alpha spectrum's scaling and translation

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    In the thermal infrared (TIR) waveband, solving for the target emissivity spectrum and temperature leads to an ill-posed problem in which the number of unknown parameters is larger than the number of available measurements. The approaches developed for solving this kind of problem are collectively called TES (temperature and emissivity separation) algorithms. As the name indicates, a TES algorithm separates the target temperature and emissivity in the course of the calculation. In this paper, a novel method called the new MaxEnt (maximum entropy) TES algorithm is proposed, which can be considered a development of the MaxEnt TES algorithm proposed by Barducci. Maximum entropy estimation is the basic framework of both algorithms, so both can separate temperature and emissivity without empirical information derived from special databases. As a result, both algorithms can be applied to solve for the temperature and emissivity spectrum of targets that are completely unknown to us. What makes the two algorithms different is that, in the new MaxEnt TES algorithm, the alpha spectrum derived by the ADE (alpha derived emissivity) method is added as prior information. Based on the Wien approximation, the ADE method computes an alpha spectrum whose distribution is similar to that of the true emissivity spectrum. Thanks to this change, the new MaxEnt TES algorithm keeps a simpler mathematical formalism and provides faster computation for large volumes of data (i.e., hyperspectral images of the Earth). Numerical simulations, with Gaussian white noise added at a measured signal-to-noise ratio, show that the maximum RMSE of the emissivity estimate is 0.017 and the maximum absolute error of the temperature estimate is 0.62 K.

  12. Review on Image Segmentation Based on Entropy

    Institute of Scientific and Technical Information of China (English)

    曹建农

    2012-01-01

    Entropy-based approaches to image segmentation are analyzed and reviewed, including one-dimensional maximum entropy, minimum cross-entropy and maximum cross-entropy segmentation methods. The relations among Shannon entropy, Tsallis entropy and Renyi entropy are analyzed and discussed, and the performance of two-dimensional (high-dimensional) entropy and spatial entropy is also appraised. In conclusion, future research directions are identified, such as the computational efficiency of high-dimensional entropy models and the integration of one-dimensional entropy with other theories.
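    The one-dimensional maximum entropy (Kapur) thresholding method surveyed here can be sketched as follows. This is a textbook formulation applied to a toy bimodal histogram, not code from the review:

```python
import numpy as np

def _entropy(part):
    """Shannon entropy of a histogram segment, renormalized to sum to 1."""
    q = part[part > 0] / part.sum()
    return -(q * np.log(q)).sum()

def kapur_threshold(hist):
    """1-D maximum entropy (Kapur) thresholding: pick the gray level t
    that maximizes the summed Shannon entropies of the background
    (bins < t) and foreground (bins >= t) partitions."""
    p = hist.astype(float) / hist.sum()
    best_t, best_h = 1, -np.inf
    for t in range(1, len(p)):
        pb, pf = p[:t].sum(), p[t:].sum()
        if pb == 0.0 or pf == 0.0:
            continue
        h = _entropy(p[:t]) + _entropy(p[t:])
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Toy bimodal histogram: dark peak near 50, bright peak near 200
rng = np.random.default_rng(5)
pixels = np.concatenate([rng.normal(50, 10, 5000), rng.normal(200, 10, 5000)])
hist, _ = np.histogram(pixels, bins=256, range=(0, 256))
t = kapur_threshold(hist)
print(t)  # a threshold between the two modes
```

The two-dimensional variants discussed in the review extend the same criterion to joint gray-level/neighborhood histograms, at a much higher computational cost.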

  13. Neutrino Induced 4He Break-up Reaction -- Application of the Maximum Entropy Method in Calculating Nuclear Strength Function

    CERN Document Server

    Murata, T; Sato, T; Nakamura, S X

    2016-01-01

    The maximum entropy method is examined as a new tool for solving the ill-posed inversion problem involved in the Lorentz integral transform (LIT) method. As an example, we apply the method to the spin-dipole strength function of 4He. We show that the method can be successfully used for the inversion of the LIT, provided the LIT function is available with sufficient accuracy.

  14. MAXED, a computer code for the deconvolution of multisphere neutron spectrometer data using the maximum entropy method

    Energy Technology Data Exchange (ETDEWEB)

    Reginatto, M.; Goldhagen, P.

    1998-06-01

    The problem of analyzing data from a multisphere neutron spectrometer to infer the energy spectrum of the incident neutrons is discussed. The main features of MAXED, a computer program developed to apply the maximum entropy principle to the deconvolution (unfolding) of multisphere neutron spectrometer data, are described, and the use of the code is illustrated with an example. A user's guide for the code MAXED is included in an appendix. The code is available from the authors upon request.

  15. Deep-sea benthic megafaunal habitat suitability modelling: A global-scale maximum entropy model for xenophyophores

    OpenAIRE

    Ashford, Oliver S; Davies, Andrew J.; Jones, Daniel O. B.

    2014-01-01

    Xenophyophores are a group of exclusively deep-sea agglutinating rhizarian protozoans, at least some of which are foraminifera. They are an important constituent of the deep-sea megafauna and are sometimes found in sufficient abundance to act as a significant source of habitat structure for meiofaunal and macrofaunal organisms. This study utilised maximum entropy modelling (Maxent) and a high-resolution environmental database to explore the environmental factors controlling the presence of xenophyophores...

  16. Link prediction based on path entropy

    CERN Document Server

    Xu, Zhongqi; Yang, Jian

    2015-01-01

    Information theory has been taken as a prospective tool for quantifying the complexity of complex networks. In this paper, we first study the information entropy or uncertainty of a path using the information theory. Then we apply the path entropy to the link prediction problem in real-world networks. Specifically, we propose a new similarity index, namely Path Entropy (PE) index, which considers the information entropies of shortest paths between node pairs with penalization to long paths. Empirical experiments demonstrate that PE index outperforms the mainstream link predictors.

  17. Shannon Entropy-Based Evaluation of Meteorological Droughts over China

    Science.gov (United States)

    Sang, Yan-Fang; Singh, Vijay P.

    2017-04-01

    Evaluation of drought is essential for developing mitigation measures. Here we employed an entropy index to investigate the spatiotemporal variability of meteorological droughts over China. Entropy values, which have a reliable hydrogeographical basis, can be classified into ranges reflecting very high, high, mid and low occurrence probabilities of droughts. The occurrence frequency and average magnitude of droughts, based on the standardized precipitation index, consistently decrease as entropy increases. Therefore, southwest China, with smaller entropy values, has a higher occurrence probability of droughts than northwest China, with a break at 38°N latitude. The aggravation of drought in China is reflected in an increase in occurrence frequency but not in magnitude. Entropy, which is determined by the skew variation of precipitation and is easily calculated, can be an effective index for evaluating drought. We therefore identified dominant thresholds for entropy values and statistical characteristics of precipitation, which would help evaluate the occurrence probability of meteorological droughts worldwide.

  18. Entropy-based link prediction in weighted networks

    CERN Document Server

    Xu, Zhongqi; Sharafat, Rajput Ramiz; Li, Lunbo; Yang, Jian

    2016-01-01

    Information entropy has proved to be an effective tool for quantifying the structural importance of complex networks. In previous work (Xu et al., 2016), we measured the contribution of a path in link prediction with information entropy. In this paper, we further quantify the contribution of a path with both path entropy and path weight, and propose a weighted prediction index based on the contributions of paths, namely Weighted Path Entropy (WPE), to improve prediction accuracy in weighted networks. Empirical experiments on six weighted real-world networks show that WPE achieves higher prediction accuracy than three typical weighted indices.

  19. Predicting Changes in Macrophyte Community Structure from Functional Traits in a Freshwater Lake: A Test of Maximum Entropy Model.

    Science.gov (United States)

    Fu, Hui; Zhong, Jiayou; Yuan, Guixiang; Guo, Chunjing; Lou, Qian; Zhang, Wei; Xu, Jun; Ni, Leyi; Xie, Ping; Cao, Te

    2015-01-01

    Trait-based approaches have been widely applied to investigate how community dynamics respond to environmental gradients. In this study, we applied a series of maximum entropy (maxent) models incorporating functional traits to unravel the processes governing macrophyte community structure along a water depth gradient in a freshwater lake. We sampled 42 plots and 1513 individual plants, and measured 16 functional traits and the abundance of 17 macrophyte species. The results showed that the maxent model can be highly robust (99.8%) in predicting the relative abundance of macrophyte species with observed community-weighted mean (CWM) traits as the constraints, whereas robustness is relatively low (about 30%) when CWM traits fitted from the water depth gradient are used as the constraints. The measured traits showed notably distinct importance in predicting species abundances, lowest for the perennial growth form and highest for leaf dry mass content. For tuber and leaf nitrogen content, there were significant shifts in their effects on species relative abundance, from positive in shallow water to negative in deep water. This result suggests that macrophyte species with a tuber organ and greater leaf nitrogen content would become more abundant in shallow water, but less abundant in deep water. Our study highlights how functional traits distributed across gradients provide a robust path towards predictive community ecology.

  20. Dynamics of the Anderson model for dilute magnetic alloys: A quantum Monte Carlo and maximum entropy study

    Energy Technology Data Exchange (ETDEWEB)

    Silver, R.N.; Gubernatis, J.E.; Sivia, D.S. (Los Alamos National Lab., NM (USA)); Jarrell, M. (Ohio State Univ., Columbus, OH (USA). Dept. of Physics)

    1990-01-01

    In this article we describe the results of a new method for calculating the dynamical properties of the Anderson model. Quantum Monte Carlo (QMC) simulation generates data on the Matsubara Green's functions in imaginary time. To obtain dynamical properties, one must analytically continue these data to real time. This is an extremely ill-posed inverse problem, similar to the inversion of a Laplace transform from incomplete and noisy data. Our method is a general one, applicable to the calculation of dynamical properties from a wide variety of quantum simulations. We use Bayesian methods of statistical inference to determine the dynamical properties based on both the QMC data and any prior information we may have, such as sum rules, symmetry, and high-frequency limits. This provides a natural means of combining perturbation theory and numerical simulation in order to understand dynamical many-body problems. Specifically, we use the well-established maximum entropy (ME) method for image reconstruction. We obtain the spectral density and transport coefficients over the entire range of model parameters accessible by QMC, with data having much larger statistical error than required by other proposed analytic continuation methods.

  1. Application of Maximum Entropy Principle to Studying the Distribution of Wave Heights in A Random Wave Field

    Institute of Scientific and Technical Information of China (English)

    周良明; 郭佩芳; 王强; 杜伊

    2004-01-01

    Based on the maximum entropy principle, a probability density function (PDF) is derived for the distribution of wave heights in a random wave field, without any additional hypotheses. The present PDF, which is of non-Rayleigh form, involves two parameters: the average wave height H and the state parameter γ. The role of γ in the distribution of wave heights is examined, and it is found that γ may serve as a measure of sea state. A least squares method for determining γ from measured data is proposed. Using this method, the values of γ are determined for three sea states from data measured in the East China Sea. The present PDF is compared with the well-known Rayleigh PDF of wave height and is shown to fit the data much better. It is expected that the present PDF would also fit some other wave variables, since its derivation is not restricted to the wave height.

  2. Upper bound for the average entropy production based on stochastic entropy extrema

    Science.gov (United States)

    Limkumnerd, Surachate

    2017-03-01

    The second law of thermodynamics, which asserts the non-negativity of the average total entropy production of a combined system and its environment, is a direct consequence of applying Jensen's inequality to a fluctuation relation. It is also possible, through this inequality, to determine an upper bound of the average total entropy production based on the entropies along the most extreme stochastic trajectories. In this work, we construct an upper bound inequality of the average of a convex function over a domain whose average is known. When applied to the various fluctuation relations, the upper bounds of the average total entropy production are established. Finally, by employing the result of Neri, Roldán, and Jülicher [Phys. Rev. X 7, 011019 (2017)], 10.1103/PhysRevX.7.011019, we are able to show that the average total entropy production is bounded only by the total entropy production supremum, and vice versa, for a general nonequilibrium stationary system.
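
    The Jensen route from a fluctuation relation to the second law is easy to check numerically. Below is a toy sketch (not the paper's construction): a Gaussian entropy production with mean equal to half its variance satisfies the integral fluctuation theorem ⟨exp(−Σ)⟩ = 1, and its sample mean is non-negative, as Jensen's inequality demands:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy model: Gaussian total entropy production Sigma. The integral
# fluctuation theorem <exp(-Sigma)> = 1 holds for a Gaussian exactly
# when mean = variance / 2 (since E[exp(-X)] = exp(-mu + var/2)).
var = 2.0
samples = rng.normal(loc=var / 2, scale=np.sqrt(var), size=1_000_000)

print(abs(np.exp(-samples).mean() - 1.0) < 0.05)   # IFT holds (statistically)
print(samples.mean() > 0)                          # second law, via Jensen
```

    Jensen's inequality gives 1 = ⟨exp(−Σ)⟩ ≥ exp(−⟨Σ⟩), hence ⟨Σ⟩ ≥ 0, which the sample mean reproduces.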

  3. A spatiotemporal dengue fever early warning model accounting for nonlinear associations with meteorological factors: a Bayesian maximum entropy approach

    Science.gov (United States)

    Lee, Chieh-Han; Yu, Hwa-Lung; Chien, Lung-Chang

    2014-05-01

    Dengue fever has been identified as one of the most widespread vector-borne diseases in tropical and sub-tropical regions. In the last decade, dengue has been an emerging infectious disease in Taiwan, especially in the southern area, which has high annual incidence. For the purpose of disease prevention and control, an early warning system is urgently needed. Previous studies have shown significant relationships between climate variables, in particular rainfall and temperature, and the temporal epidemic patterns of dengue cases. However, the transmission of dengue fever is a complex interactive process, and the composite space-time effects of dengue fever have mostly been understated. This study proposes a one-week-ahead warning system for dengue fever epidemics in southern Taiwan that considers nonlinear associations between weekly dengue cases and meteorological factors across space and time. The early warning system is based on an integration of a distributed lag nonlinear model (DLNM) and stochastic Bayesian Maximum Entropy (BME) analysis. The study identified the most significant meteorological measures, including weekly minimum temperature and maximum 24-hour rainfall with a continuous 15-week lagged time, associated with variation in dengue cases under conditions of uncertainty. Subsequently, the combination of nonlinear lagged effects of climate variables and a space-time dependence function is implemented via a Bayesian framework to predict dengue fever occurrences in southern Taiwan during 2012. The results show that the early warning system is useful for providing potential outbreak spatio-temporal predictions of the dengue fever distribution. In conclusion, the proposed approach can provide a practical disease control tool for environmental regulators seeking more effective strategies for dengue fever prevention.

  4. A basic introduction to the thermodynamics of the Earth system far from equilibrium and maximum entropy production.

    Science.gov (United States)

    Kleidon, A

    2010-05-12

    The Earth system is remarkably different from its planetary neighbours in that it shows pronounced, strong global cycling of matter. These global cycles result in the maintenance of a unique thermodynamic state of the Earth's atmosphere which is far from thermodynamic equilibrium (TE). Here, I provide a simple introduction of the thermodynamic basis to understand why Earth system processes operate so far away from TE. I use a simple toy model to illustrate the application of non-equilibrium thermodynamics and to classify applications of the proposed principle of maximum entropy production (MEP) to such processes into three different cases of contrasting flexibility in the boundary conditions. I then provide a brief overview of the different processes within the Earth system that produce entropy, review actual examples of MEP in environmental and ecological systems, and discuss the role of interactions among dissipative processes in making boundary conditions more flexible. I close with a brief summary and conclusion.

  5. Oseen vortex as a maximum entropy state of a two dimensional fluid

    Science.gov (United States)

    Montgomery, D. C.; Matthaeus, W. H.

    2011-07-01

    During the last four decades, a considerable number of investigations has been carried out into the evolution of turbulence in two dimensional Navier-Stokes flows. Much of the information has come from numerical solution of the (otherwise insoluble) dynamical equations and thus has necessarily required some kind of boundary conditions: spatially periodic, no-slip, stress-free, or free-slip. The theoretical framework that has proved to be of the most predictive value has been one employing an entropy functional (sometimes called the Boltzmann entropy) whose maximization has been correlated well in several cases with the late-time configurations into which the computed turbulence has relaxed. More recently, flow in the unbounded domain has been addressed by Gallay and Wayne who have shown a late-time relaxation to the classical Oseen vortex (also sometimes called the Lamb-Oseen vortex) for situations involving a finite net circulation or non-zero total integrated vorticity. Their proof involves powerful but difficult mathematics that might be thought to be beyond the preparation of many practicing fluid dynamicists. The purpose of this present paper is to remark that relaxation to the Oseen vortex can also be predicted in the more intuitive framework that has previously proved useful in predicting computational results with boundary conditions: that of an appropriate entropy maximization. The results make no assumption about the size of the Reynolds numbers, as long as they are finite, and the viscosity is treated as finite throughout.
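
    For reference, the Lamb-Oseen vortex the authors predict relaxation to has the closed-form vorticity profile ω(r,t) = Γ/(4πνt) exp(−r²/(4νt)). A quick numerical check (parameter values arbitrary) that this profile conserves the total circulation Γ:

```python
import numpy as np

def oseen_vorticity(r, t, gamma=1.0, nu=1e-3):
    """Vorticity profile of the Lamb-Oseen vortex with circulation gamma."""
    return gamma / (4 * np.pi * nu * t) * np.exp(-(r ** 2) / (4 * nu * t))

def oseen_velocity(r, t, gamma=1.0, nu=1e-3):
    """Azimuthal velocity; approaches the point vortex gamma/(2 pi r) far out."""
    return gamma / (2 * np.pi * r) * (1 - np.exp(-(r ** 2) / (4 * nu * t)))

r = np.linspace(1e-6, 1.0, 4000)
dr = r[1] - r[0]
# Integrating the vorticity over the plane recovers the circulation gamma.
total = np.sum(oseen_vorticity(r, 1.0) * 2 * np.pi * r) * dr
print(abs(total - 1.0) < 1e-2)
```
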

  6. The Grading Entropy-based Criteria for Structural Stability of Granular Materials and Filters

    Directory of Open Access Journals (Sweden)

    Janos Lőrincz

    2015-05-01

    This paper deals with three grading entropy-based rules that describe different soil structure stability phenomena: an internal stability rule, a filtering rule and a segregation rule. These rules are elaborated on the basis of a large amount of laboratory testing and existing knowledge in the field. Use is made of the theory of grading entropy to derive parameters which condense all of the information of the grading curve into a pair of entropy-based parameters, allowing soils with common behaviours to be grouped into domains on an entropy diagram. Applications of the derived entropy-based rules are presented by examining the cause of a dam failure, by testing against existing filter rules from the literature, and by giving some examples of the design of non-segregating grading curves (discrete particle size distributions by dry weight). A physical basis for the internal stability rule is established, wherein the higher values of base entropy required for granular stability are shown to reflect the closeness between the mean and maximum grain diameters, which explains how there are sufficient coarser grains to achieve a stable grain skeleton.

  7. Design and Implementation of Hardware Based Entropy Analysis

    Directory of Open Access Journals (Sweden)

    S. Saravanan

    2012-07-01

    The aim of this study is the hardware implementation of entropy analysis; designing and verifying this hardware is the main contribution of the paper. Entropy indicates how far data can be compressed, and entropy analysis plays a major role in scan-based SoC testing. Size and complexity are the major issues in current System-on-a-Chip (SoC) testing, making test data compression a necessity. Entropy analysis is performed for both specified and unspecified bits (don't-care bits). Unspecified bits are specified using zero-fill and one-fill algorithms. The X-filling technique is applied for fixed-to-fixed codes. The proposed method is successfully tested on ISCAS89 benchmark circuits.
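
    A minimal software sketch of the kind of entropy analysis described, applied to a test cube with don't-care bits (the cube, block size and fill rules below are illustrative, not taken from the study):

```python
from collections import Counter
from math import log2

def entropy(bits, block=4):
    """Shannon entropy (bits per block) of a bit string split into blocks."""
    blocks = [bits[i:i + block] for i in range(0, len(bits) - block + 1, block)]
    n = len(blocks)
    return -sum(c / n * log2(c / n) for c in Counter(blocks).values())

def fill(cube, bit):
    """Specify the don't-care positions ('X') with a fixed bit (0- or 1-fill)."""
    return cube.replace("X", bit)

# Hypothetical scan test cube with unspecified bits.
cube = "1X0XXX10XX0X1XXX0X10XXXX"
e0, e1 = entropy(fill(cube, "0")), entropy(fill(cube, "1"))
# A lower entropy means the filled test set is more compressible;
# for this cube, zero-fill yields the lower entropy.
print(e0 < e1)
```
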

  8. Entropy-Based Credit Evaluation for Mobile Telephone Customers

    Directory of Open Access Journals (Sweden)

    Yang Zong-Chang

    2013-10-01

    The arrears problem puzzles most mobile communication corporations in China. In information theory, the Shannon entropy is a measure of the uncertainty in a signal or random event. Motivated by entropy theory, and in particular the Shannon entropy, this study defines a measure called customer information entropy, based on customers' behavior attributes, and applies it to credit evaluation for arrearage cellular telephone customers. Arrearage customers include both malevolent and non-malevolent ones. 52364 arrearage customers among a total of 400000 in a mobile communication corporation were chosen for the experiment. The proposed measure yields good results in credit evaluation for these 52364 arrearage customers in August and September, with correct evaluation rates above 90.0% for both malevolent and non-malevolent customers: 90.75% of the non-malevolent ones had an entropy change less than zero, while for the malevolent ones the entropy change was equal to zero in 95.36% of cases and greater than zero in 1.57%. The experimental results indicate that the entropy change can be considered negative for non-malevolent customers and non-negative for malevolent ones. The proposed measure thus shows practical potential.

  9. Financial time series analysis based on effective phase transfer entropy

    Science.gov (United States)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.
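
    As background, plain transfer entropy on discrete series can be estimated with a simple plug-in formula; the sketch below illustrates the idea on binary data (this is ordinary transfer entropy with history length 1, not the paper's effective phase transfer entropy):

```python
import numpy as np

def transfer_entropy(x, y):
    """Plug-in estimate of TE(X -> Y) for binary series, history length 1."""
    # Joint counts over (y_{t+1}, y_t, x_t).
    triples = np.stack([y[1:], y[:-1], x[:-1]], axis=1)
    p = np.zeros((2, 2, 2))
    for a, b, c in triples:
        p[a, b, c] += 1
    p /= len(triples)
    te = 0.0
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                if p[a, b, c] == 0:
                    continue
                # TE = sum p(a,b,c) log [ p(a|b,c) / p(a|b) ]
                te += p[a, b, c] * np.log2(
                    p[a, b, c] * p[:, b, :].sum() / (p[:, b, c].sum() * p[a, b, :].sum())
                )
    return te

rng = np.random.default_rng(2)
x = rng.integers(0, 2, 20_000)
y = np.roll(x, 1)                       # y copies x with lag 1: strong coupling
noise = rng.integers(0, 2, 20_000)      # independent series: no coupling
print(transfer_entropy(x, y) > transfer_entropy(x, noise))
```

    With the deterministic lag-1 coupling, the estimate is close to 1 bit, while for the independent series it is close to zero (up to a small plug-in bias).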

  10. The two-box model of climate: limitations and applications to planetary habitability and maximum entropy production studies.

    Science.gov (United States)

    Lorenz, Ralph D

    2010-05-12

    The 'two-box model' of planetary climate is discussed. This model has been used to demonstrate consistency of the equator-pole temperature gradient on Earth, Mars and Titan with what would be predicted from a principle of maximum entropy production (MEP). While useful for exposition and for generating first-order estimates of planetary heat transports, it has too low a resolution to investigate climate systems with strong feedbacks. A two-box MEP model agrees well with the day:night temperature contrast observed on the extrasolar planet HD 189733b.
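
    A hedged sketch of a two-box MEP calculation of the kind described (the absorbed fluxes below are illustrative, not fitted to any planet): each box balances absorbed shortwave flux against black-body emission, and the inter-box heat flux F is chosen to maximize the entropy production F(1/T_pole − 1/T_eq):

```python
from scipy.optimize import minimize_scalar

SIGMA = 5.67e-8               # Stefan-Boltzmann constant, W m^-2 K^-4
I_EQ, I_POLE = 300.0, 170.0   # hypothetical absorbed solar fluxes, W m^-2

def entropy_production(F):
    """Entropy production for meridional heat flux F (W m^-2)."""
    t_eq = ((I_EQ - F) / SIGMA) ** 0.25     # energy balance, warm box
    t_pole = ((I_POLE + F) / SIGMA) ** 0.25  # energy balance, cold box
    return F * (1 / t_pole - 1 / t_eq)

res = minimize_scalar(lambda F: -entropy_production(F), bounds=(0.0, 100.0), method="bounded")
F_mep = res.x
t_eq = ((I_EQ - F_mep) / SIGMA) ** 0.25
t_pole = ((I_POLE + F_mep) / SIGMA) ** 0.25
print(F_mep > 0 and t_eq > t_pole)   # MEP selects an intermediate flux
```

    Zero flux produces no entropy, while a flux large enough to equalize the temperatures also produces none; the MEP state sits in between.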

  11. Molecular dynamics simulations with replica-averaged structural restraints generate structural ensembles according to the maximum entropy principle.

    Science.gov (United States)

    Cavalli, Andrea; Camilloni, Carlo; Vendruscolo, Michele

    2013-03-07

    In order to characterise the dynamics of proteins, a well-established method is to incorporate experimental parameters as replica-averaged structural restraints into molecular dynamics simulations. Here, we justify this approach in the case of interproton distance information provided by nuclear Overhauser effects by showing that it generates ensembles of conformations according to the maximum entropy principle. These results indicate that the use of replica-averaged structural restraints in molecular dynamics simulations, given a force field and a set of experimental data, can provide an accurate approximation of the unknown Boltzmann distribution of a system.

  12. Maximum entropy algorithm and its implementation for the neutral beam profile measurement

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Wook; Cho, Gyu Seong [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of); Cho, Yong Sub [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-12-31

    A tomography algorithm to maximize the entropy of the image using the Lagrangian multiplier technique and the conjugate gradient method has been designed for the measurement of the 2D spatial distribution of intense neutral beams of KSTAR NBI (Korea Superconducting Tokamak Advanced Research Neutral Beam Injector), which is now being designed. A possible detection system was assumed and a numerical simulation was implemented to test the reconstruction quality of given beam profiles. This algorithm is well suited to sparse projection data and thus can be used for neutral beam tomography. 8 refs., 3 figs. (Author)

  13. Improving Estimations of Spatial Distribution of Soil Respiration Using the Bayesian Maximum Entropy Algorithm and Soil Temperature as Auxiliary Data.

    Science.gov (United States)

    Hu, Junguo; Zhou, Jian; Zhou, Guomo; Luo, Yiqi; Xu, Xiaojun; Li, Pingheng; Liang, Junyi

    2016-01-01

    Soil respiration inherently shows strong spatial variability. It is difficult to obtain an accurate characterization of soil respiration with an insufficient number of monitoring points. However, it is expensive and cumbersome to deploy many sensors. To solve this problem, we proposed employing the Bayesian Maximum Entropy (BME) algorithm, using soil temperature as auxiliary information, to study the spatial distribution of soil respiration. The BME algorithm used the soft data (auxiliary information) effectively to improve the estimation accuracy of the spatiotemporal distribution of soil respiration. Based on the functional relationship between soil temperature and soil respiration, the BME algorithm satisfactorily integrated soil temperature data into said spatial distribution. As a means of comparison, we also applied the Ordinary Kriging (OK) and Co-Kriging (Co-OK) methods. The results indicated that the root mean squared errors (RMSEs) and absolute values of bias for both Day 1 and Day 2 were the lowest for the BME method, thus demonstrating its higher estimation accuracy. Further, we compared the performance of the BME algorithm coupled with auxiliary information, namely soil temperature data, and the OK method without auxiliary information in the same study area for 9, 21, and 37 sampled points. The results showed that the RMSEs for the BME algorithm (0.972 and 1.193) were less than those for the OK method (1.146 and 1.539) when the number of sampled points was 9 and 37, respectively. This indicates that the former method using auxiliary information could reduce the required number of sampling points for studying spatial distribution of soil respiration. Thus, the BME algorithm, coupled with soil temperature data, can not only improve the accuracy of soil respiration spatial interpolation but can also reduce the number of sampling points.

  14. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    Science.gov (United States)

    Almog, Assaf; Garlaschelli, Diego

    2014-09-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information.

  15. Quantitative LC-MS of polymers: determining accurate molecular weight distributions by combined size exclusion chromatography and electrospray mass spectrometry with maximum entropy data processing.

    Science.gov (United States)

    Gruendling, Till; Guilhaus, Michael; Barner-Kowollik, Christopher

    2008-09-15

    We report on the successful application of size exclusion chromatography (SEC) combined with electrospray ionization mass spectrometry (ESI-MS) and refractive index (RI) detection for the determination of accurate molecular weight distributions of synthetic polymers, corrected for chromatographic band broadening. The presented method makes use of the ability of ESI-MS to accurately depict the peak profiles and retention volumes of individual oligomers eluting from the SEC column, whereas quantitative information on the absolute concentration of oligomers is obtained from the RI-detector only. A sophisticated computational algorithm based on the maximum entropy principle is used to process the data gained by both detectors, yielding an accurate molecular weight distribution, corrected for chromatographic band broadening. Poly(methyl methacrylate) standards with molecular weights up to 10 kDa serve as model compounds. Molecular weight distributions (MWDs) obtained by the maximum entropy procedure are compared to MWDs, which were calculated by a conventional calibration of the SEC-retention time axis with peak retention data obtained from the mass spectrometer. Comparison showed that for the employed chromatographic system, distributions below 7 kDa were only weakly influenced by chromatographic band broadening. However, the maximum entropy algorithm could successfully correct the MWD of a 10 kDa standard for band broadening effects. Molecular weight averages were between 5 and 14% lower than the manufacturer stated data obtained by classical means of calibration. The presented method demonstrates a consistent approach for analyzing data obtained by coupling mass spectrometric detectors and concentration sensitive detectors to polymer liquid chromatography.

  16. Some Comments on the Entropy-Based Criteria for Piping

    Directory of Open Access Journals (Sweden)

    Emöke Imre

    2015-04-01

    This paper is an extension of previous work which characterises soil behaviours using the grading entropy diagram. The present work looks at the piping process in granular soils, considering some new data from flood-protection dikes. The piping process is divided into three parts here: particle movement at the micro scale to segregate free water; sand boil development (which is the initiation of the pipe); and pipe growth. In the first part of the process, which occurs during the rising flood, the increase in shear stress along the dike base may cause segregation of water into micro pipes if the subsoil in the dike base is relatively loose. This occurs in the zone of maximum dike base shear stress level (the ratio of shear stress to strength), which is close to the toe. In the second part of the process, the shear strain increment causes a sudden, asymmetric slide and cracking of the dike, leading to localized excess pore pressure, liquefaction and the formation of a sand boil. In the third part of the process, the soil erosion initiated through the sand boil continues, and the pipe grows. Piping in the Hungarian dikes often occurs in a two-layer system, where the base layer is coarser with higher permeability and the cover layer is finer with lower permeability. The new data presented here show that the soils ejected from the sand boils are generally silty sands and sands, which are prone to both erosion (on the basis of the entropy criterion) and liquefaction. They originate from the cover layer, which is basically identical to the soil used in the Dutch backward erosion experiments.

  17. Maximum entropy approach for batch-arrival queue under N policy with an un-reliable server and single vacation

    Science.gov (United States)

    Ke, Jau-Chuan; Lin, Chuen-Horng

    2008-11-01

    We consider the M[x]/G/1 queueing system, in which the server operates under an N policy and takes a single vacation. As soon as the system becomes empty, the server leaves for a vacation of random length V. When he returns from the vacation and the system size is greater than or equal to a threshold value N, he starts to serve the waiting customers. If he finds fewer customers than N, he waits in the system until the system size reaches or exceeds N. The server is subject to breakdowns according to a Poisson process and his repair time obeys an arbitrary distribution. We use the maximum entropy principle to derive approximate formulas for the steady-state probability distributions of the queue length. We perform a comparative analysis between the approximate results and established exact results for various batch size, vacation time, service time and repair time distributions. We demonstrate that the maximum entropy approach is efficient enough for practical purposes and is a feasible method for approximating the solution of complex queueing systems.
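
    The flavour of the maximum entropy approximation can be seen in the simplest case: given only the mean queue length L, maximizing entropy over distributions on {0, 1, 2, ...} yields a geometric pmf p_n = (1/(1+L))(L/(1+L))^n. A sketch of this one-constraint case (the paper's M[x]/G/1 derivation uses several constraints, not just the mean):

```python
import numpy as np

def maxent_queue_pmf(mean_len, n_max=200):
    """Maximum entropy pmf on {0, 1, 2, ...} given only the mean queue length L.
    The maxent solution is geometric: p_n = (1/(1+L)) * (L/(1+L))**n."""
    n = np.arange(n_max)
    q = mean_len / (1 + mean_len)
    return (1 - q) * q ** n

p = maxent_queue_pmf(3.0)
n = np.arange(len(p))
# The pmf is normalized and reproduces the mean constraint.
print(abs(p.sum() - 1.0) < 1e-6 and abs((n * p).sum() - 3.0) < 1e-3)
```
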

  18. Enhanced Information Recovery in 2D On-and Off-Resonance Nutation NQR using the Maximum Entropy Method

    Science.gov (United States)

    Maćkowiak, Mariusz; Kątowski, Piotr

    1996-06-01

    Two-dimensional zero-field nutation NQR spectroscopy has been used to determine the full quadrupolar tensor of spin-3/2 nuclei in several molecular crystals containing 35Cl and 75As nuclei. The problems of reconstructing 2D nutation NQR spectra using conventional methods and the advantages of implementing the maximum entropy method (MEM) are analyzed. It is shown that replacing the conventional Fourier transform with alternative data processing by MEM in 2D NQR spectroscopy leads to sensitivity improvement, reduction of instrumental artefacts and truncation errors, shortened data acquisition times and suppression of noise, while at the same time increasing the resolution. The effects of off-resonance irradiation in nutation experiments are demonstrated both experimentally and theoretically. It is shown that off-resonance nutation spectroscopy is a useful extension of the conventional on-resonance experiments, thus facilitating the determination of asymmetry parameters in multiple spectra. The theoretical description of the off-resonance effects in 2D nutation NQR spectroscopy is given, and general exact formulas for the asymmetry parameter are obtained. In off-resonance conditions, the resolution of the nutation NQR spectrum decreases with the spectrometer offset. However, enhanced resolution can be achieved by using the maximum entropy method in the 2D data reconstruction.

  19. Collaborative Personalized Web Recommender System using Entropy based Similarity Measure

    CERN Document Server

    Mehta, Harita; Bedi, Punam; Dixit, V S

    2012-01-01

    On the internet, web surfers searching for information constantly seek recommendations. Generating recommendations becomes harder as the information domain grows exponentially day by day. In this paper, we compute an entropy-based similarity between users to address the scalability problem. Using this concept, we have implemented an online user-based collaborative web recommender system. In this model-based collaborative system, the user session is divided into two levels, and entropy is calculated at both. It is shown that, from the set of valuable recommenders obtained at level I, only those recommenders with lower entropy at level II than at level I serve as trustworthy recommenders. Finally, the top N recommendations are generated from these trustworthy recommenders for an online user.

  20. Power-law distribution functions derived from maximum entropy and a symmetry relationship

    CERN Document Server

    Peterson, G J

    2011-01-01

    Power-law distributions are common, particularly in social physics. Here, we explore whether power-laws might arise as a consequence of a general variational principle for stochastic processes. We describe communities of 'social particles', where the cost of adding a particle to the community is shared equally between the particle joining the cluster and the particles that are already members of the cluster. Power-law probability distributions of community sizes arise as a natural consequence of the maximization of entropy, subject to this 'equal cost sharing' rule. We also explore a generalization in which there is unequal sharing of the costs of joining a community. Distributions change smoothly from exponential to power-law as a function of a sharing-inequality quantity. This work gives an interpretation of power-law distributions in terms of shared costs.

  1. Entropy-Based Search Algorithm for Experimental Design

    CERN Document Server

    Malakar, N K

    2010-01-01

    The scientific method relies on the iterated processes of inference and inquiry. The inference phase consists of selecting the most probable models based on the available data; whereas the inquiry phase consists of using what is known about the models to select the most relevant experiment. Optimizing inquiry involves searching the parameterized space of experiments to select the experiment that promises, on average, to be maximally informative. In the case where it is important to learn about each of the model parameters, the relevance of an experiment is quantified by Shannon entropy of the distribution of experimental outcomes predicted by a probable set of models. If the set of potential experiments is described by many parameters, we must search this high-dimensional entropy space. Brute force search methods will be slow and computationally expensive. We present an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment for efficient experimental design. ...

  2. Shannon Entropy based Randomness Measurement and Test for Image Encryption

    CERN Document Server

    Wu, Yue; Agaian, Sos

    2011-01-01

    The quality of image encryption is commonly measured by the Shannon entropy over the ciphertext image. However, this measurement does not consider the randomness of local image blocks and is inappropriate for scrambling-based image encryption methods. In this paper, a new information entropy-based randomness measurement for image encryption is introduced which, for the first time, answers the question of whether a given ciphertext image is sufficiently random-like. It measures the randomness over the ciphertext in a fairer way by calculating the averaged entropy of a series of small image blocks within the entire test image. In order to fulfill both quantitative and qualitative measurement, the expectation and the variance of this averaged block entropy for a true-random image are strictly derived and corresponding numerical reference tables are also provided. Moreover, a hypothesis test at significance α-level is given to help accept or reject the hypothesis that the test image is ideally encrypted/random-...
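
    The averaged block entropy at the core of the method is straightforward to compute; a sketch with illustrative parameters (non-overlapping 16×16 blocks over an 8-bit grayscale image):

```python
import numpy as np

def block_entropy(img, block=16):
    """Average Shannon entropy (bits/pixel) over non-overlapping blocks."""
    h, w = img.shape
    ents = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            counts = np.bincount(img[i:i + block, j:j + block].ravel(), minlength=256)
            prob = counts[counts > 0] / (block * block)
            ents.append(-(prob * np.log2(prob)).sum())
    return float(np.mean(ents))

rng = np.random.default_rng(3)
random_img = rng.integers(0, 256, (256, 256), dtype=np.uint8)  # random-like ciphertext
flat_img = np.full((256, 256), 128, dtype=np.uint8)            # constant image

print(block_entropy(random_img) > 7.0)   # near the 8-bit maximum, minus plug-in bias
print(block_entropy(flat_img) == 0.0)    # no local randomness at all
```

    Note that the block estimate of a truly random block sits below 8 bits because of the finite sample size, which is exactly why the paper derives the expectation and variance of the block entropy for a true-random image.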

  3. Reliability-based design optimization with Cross-Entropy method

    OpenAIRE

    Ghidey, Hiruy

    2015-01-01

    Implementation of the Cross-entropy (CE) method to solve reliability-based design optimization (RBDO) problems was investigated. The emphasis of this implementation was to solve the reliability and optimization sub-problems within the RBDO problem independently; therefore, the main aim of this study was to evaluate the performance of the Cross-entropy method in terms of efficiency and accuracy in solving RBDO problems. A numerical approach was followed in which the implementatio...
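
    A generic cross-entropy optimization loop, shown on a toy objective (a standard CE sketch under the usual Gaussian sampling distribution, not the thesis's RBDO formulation): sample candidates, keep the elite fraction, and refit the sampling distribution to the elite.

```python
import numpy as np

def cross_entropy_min(f, dim, iters=60, pop=100, elite=10, seed=4):
    """Cross-entropy method: sample, select the elite, refit the Gaussian."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.full(dim, 5.0)
    for _ in range(iters):
        x = rng.normal(mu, sigma, size=(pop, dim))
        best = x[np.argsort([f(xi) for xi in x])[:elite]]   # elite samples
        mu, sigma = best.mean(axis=0), best.std(axis=0) + 1e-8
    return mu

# Minimise a shifted sphere function; the optimum is at (2, -1).
sol = cross_entropy_min(lambda v: (v[0] - 2) ** 2 + (v[1] + 1) ** 2, dim=2)
print(sol)   # close to (2, -1)
```

    In an RBDO setting, the same loop could be wrapped around a penalized objective in which candidate designs violating the reliability constraint are discarded or penalized.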

  4. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    Science.gov (United States)

    Furbish, David J.; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan L.

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.

  5. Deceiving entropy-based DoS detection


    Science.gov (United States)

    Özçelik, İlker; Brooks, Richard R.

    2014-06-01

    Denial of Service (DoS) attacks disable network services for legitimate users. A McAfee report shows that eight out of ten Critical Infrastructure Providers (CIPs) surveyed had a significant Distributed DoS (DDoS) attack in 2010 [1]. Researchers have proposed many approaches for detecting these attacks in the past decade. Anomaly-based DoS detection is the most common. In this approach, the detector uses statistical features, such as the entropy of incoming packet header fields like source IP addresses or protocol type. It calculates the observed statistical feature and triggers an alarm if an extreme deviation occurs. However, intrusion detection systems (IDS) using entropy-based detection can be fooled by spoofing. An attacker can sniff the network to collect header field data of network packets coming from distributed nodes on the Internet and fuse them to calculate the entropy of normal background traffic. Then s/he can spoof attack packets to keep the entropy value in the expected range during the attack. In this study, we present a proof-of-concept entropy spoofing attack that deceives entropy-based detection approaches. Our preliminary results show that spoofing attacks cause significant detection performance degradation.
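
The detection scheme described here — compute the entropy of a header field per traffic window and alarm on an extreme deviation — can be sketched in a few lines. The addresses, window contents, and the 50%-of-baseline alarm rule below are illustrative assumptions, not values from the paper:

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (bits) of the empirical distribution of a header field."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

# Hypothetical traffic windows: normal traffic spreads over many source IPs,
# while a naive (non-spoofed) flood concentrates on one source, collapsing
# the entropy of the source-address field.
normal_window = [f"10.0.{i % 50}.{i % 200}" for i in range(1000)]
flood_window = ["203.0.113.7"] * 950 + normal_window[:50]

baseline = shannon_entropy(normal_window)   # learned from background traffic
observed = shannon_entropy(flood_window)
alarm = observed < 0.5 * baseline           # extreme-deviation alarm rule
```

The spoofing attack the paper demonstrates defeats exactly this rule: by crafting source addresses whose empirical distribution mimics the background, the attacker keeps `observed` near `baseline` during the flood.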

  6. A new entropy based method for computing software structural complexity

    CERN Document Server

    Roca, J L

    2002-01-01

    In this paper a new methodology for the evaluation of software structural complexity is described. It is based on the entropy evaluation of the random uniform response function associated with the so-called software characteristic function (SCF). The behavior of the SCF with different software structures and its relationship with the number of inherent errors is investigated. It is also investigated how the entropy concept can be used to evaluate the complexity of a software structure, considering the SCF as a canonical representation of the graph associated with the control flow diagram. The functions, parameters and algorithms that allow this evaluation to be carried out are also introduced. After this analytic phase follows the experimental phase, verifying the consistency of the proposed metric and its boundary conditions. The conclusion is that the degree of software structural complexity can be measured as the entropy of the random uniform response function of the SCF. That entropy is in direct relation...

  7. Deriving the electron-phonon spectral density of MgB2 from optical data, using maximum entropy techniques.

    Science.gov (United States)

    Hwang, J; Carbotte, J P

    2014-04-23

    We use maximum entropy techniques to extract an electron-phonon density from optical data for the normal state at T = 45 K of MgB2. Limiting the analysis to a range of phonon energies below 110 meV, which is sufficient for capturing all phonon structures, we find a spectral function that is in good agreement with that calculated for the quasi-two-dimensional σ-band. Extending the analysis to higher energies, up to 160 meV, we find no evidence for any additional contributions to the fluctuation spectrum, but find that the data can only be understood if the density of states is taken to decrease with increasing energy.

  8. Superresolution of FT-NMR Spectra by the Maximum Entropy Method and AR Model Fitting with Singular Value Decomposition

    Science.gov (United States)

    Uchiyama, Takanori; Minamitani, Haruyuki; Sakata, Makoto

    1990-01-01

    The complex maximum entropy method (MEM) and complex autoregressive model fitting with the singular value decomposition method (SVD) were applied to free induction decay signal data obtained with a Fourier transform nuclear magnetic resonance spectrometer to estimate superresolved NMR spectra. The practical estimation of superresolved NMR spectra is demonstrated on phosphorus-31 nuclear magnetic resonance data. These methods provide sharper peaks and a higher signal-to-noise ratio than the conventional fast Fourier transform. The SVD method was more suitable than the MEM for estimating superresolved NMR spectra because it allowed high-order estimation without spurious peaks, and it was easy to determine the order and the rank.

  9. A Maximum-Entropy Compound Distribution Model for Extreme Wave Heights of Typhoon-Affected Sea Areas

    Institute of Scientific and Technical Information of China (English)

    WANG Li-ping; SUN Xiao-guang; LU Ke-bo; XU De-lun

    2012-01-01

    A new compound distribution model for extreme wave heights of typhoon-affected sea areas is proposed on the basis of the maximum-entropy principle. The new model is formed by nesting a discrete distribution in a continuous one, having eight parameters which can be determined in terms of observed data of typhoon occurrence-frequency and extreme wave heights by numerically solving two sets of equations derived in this paper. The model is examined by using it to predict the N-year return-period wave height at two hydrology stations in the Yellow Sea, and the predicted results are compared with those predicted by use of some other compound distribution models. Examinations and comparisons show that the model has some advantages for predicting the N-year return-period wave height in typhoon-affected sea areas.
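
The nesting idea can be illustrated with the simplest such compound model: if the yearly typhoon count N follows a discrete distribution (here Poisson with rate lam) and each event produces a wave height with continuous CDF F, the annual maximum has CDF G(h) = Σ_k P(N=k) F(h)^k, which for a Poisson count reduces to exp(-lam·(1-F(h))). The paper's eight-parameter model is more elaborate; the Poisson/Weibull choice and all parameter values below are illustrative assumptions:

```python
import math

lam = 3.2                  # mean typhoon occurrences per year (assumed)
shape, scale = 2.0, 4.0    # per-event Weibull parameters (assumed)

def F(h):
    """Per-event extreme wave height CDF (Weibull, illustrative)."""
    return 1.0 - math.exp(-((h / scale) ** shape)) if h > 0 else 0.0

def G(h):
    """CDF of the annual maximum: sum_k P(N=k) F(h)^k with N ~ Poisson(lam)."""
    return math.exp(-lam * (1.0 - F(h)))

def return_level(T_years, lo=0.0, hi=100.0):
    """Wave height with return period T_years, found by bisection on G."""
    target = 1.0 - 1.0 / T_years
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if G(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

h50 = return_level(50.0)   # 50-year return-period wave height
```

Fitting the discrete and continuous parameters to observed occurrence-frequency and wave-height data, as the paper does, replaces the assumed constants above.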

  10. On the 'fake' inferred entanglement associated with the maximum entropy inference of quantum states

    Energy Technology Data Exchange (ETDEWEB)

    Batle, J.; Casas, M. [Departament de Fisica, Universitat de les Illes Balears, Palma de Mallorca (Spain); Plastino, A.R. [Departament de Fisica, Universitat de les Illes Balears, Palma de Mallorca (Spain); Faculty of Astronomy and Geophysics, National University La Plata, La Plata (Argentina); National Research Council, CONICET (AR)); Plastino, A. [National Research Council (CONICET) (Argentina); Department of Physics, National University La Plata, La Plata (Argentina)

    2001-08-24

    The inference of entangled quantum states by recourse to the maximum entropy (MaxEnt) principle is considered in connection with the recently pointed out problem of fake inferred entanglement (Horodecki R et al 1999 Phys. Rev. A 59 1799). We show that there are operators Â, both diagonal and non-diagonal in the Bell basis, such that, when the expectation value is taken as prior information, the problem of fake entanglement is not solved by adding a new constraint associated with the mean value of Â² (unlike what happens when the partial information is given by the expectation value of a Bell operator). The fake entanglement generated by the MaxEnt principle is also studied quantitatively by comparing the entanglement of formation of the inferred state with that of the original one. (author)

  11. Predicting the potential environmental suitability for Theileria orientalis transmission in New Zealand cattle using maximum entropy niche modelling.

    Science.gov (United States)

    Lawrence, K E; Summers, S R; Heath, A C G; McFadden, A M J; Pulford, D J; Pomroy, W E

    2016-07-15

    The tick-borne haemoparasite Theileria orientalis is the most important infectious cause of anaemia in New Zealand cattle. Since 2012 a previously unrecorded type, T. orientalis type 2 (Ikeda), has been associated with disease outbreaks of anaemia, lethargy, jaundice and deaths on over 1000 New Zealand cattle farms, with most of the affected farms found in the upper North Island. The aim of this study was to model the relative environmental suitability for T. orientalis transmission throughout New Zealand, to predict the proportion of cattle farms potentially suitable for active T. orientalis infection by region, island and the whole of New Zealand and to estimate the average relative environmental suitability per farm by region, island and the whole of New Zealand. The relative environmental suitability for T. orientalis transmission was estimated using the Maxent (maximum entropy) modelling program. The Maxent model predicted that 99% of North Island cattle farms (n=36,257), 64% of South Island cattle farms (n=15,542) and 89% of New Zealand cattle farms overall (n=51,799) could potentially be suitable for T. orientalis transmission. The average relative environmental suitability of T. orientalis transmission at the farm level was 0.34 in the North Island, 0.02 in the South Island and 0.24 overall. The study showed that the potential spatial distribution of T. orientalis environmental suitability was much greater than presumed in the early part of the Theileria associated bovine anaemia (TABA) epidemic. Maximum entropy offers a computationally efficient method of modelling the probability of habitat suitability for an arthropod-vectored disease. This model could help estimate the boundaries of the endemically stable and endemically unstable areas for T. orientalis transmission within New Zealand and be of considerable value in informing practitioner and farmer biosecurity decisions in these respective areas.

  12. A New Maximum Entropy Estimation of Distribution Algorithm to Solve Uncertain Information Job-shop Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Lu Lin

    2009-10-01

    Full Text Available The Estimation of Distribution Algorithm (EDA) is a new kind of evolutionary algorithm: it builds a probability distribution model from statistics of the best individuals in the current population, then samples that model to produce the next generation. To address the NP-hard problem of searching for an optimal network structure within EDA, a new Maximum Entropy Estimation of Distribution Algorithm (MEEDA) is provided. Taking Jaynes's principle as its basis, the algorithm uses the maximum entropy of the random variables to estimate their minimum-bias probability distribution and regards it as the evolution model of the algorithm, which produces optimal or near-optimal solutions. The paper then presents a rough programming model for job-shop scheduling under uncertain information. The method overcomes the defect of traditional methods, which need pre-set characteristics or precisely described attributes; it designs a multi-objective optimization mechanism and expands the application space of rough sets in job-shop scheduling under uncertain information environments. Due to the complexity of the proposed model, traditional algorithms have low capability of producing a feasible solution, so we use MEEDA to obtain a solution within a reasonable amount of time. We assume machine flexibility in processing operations to decrease the complexity of the proposed model. Muth and Thompson's benchmark problems are used to verify and validate the proposed rough programming model and its algorithm. The computational results obtained by MEEDA are compared with GA, and the comparison proves the effectiveness of MEEDA for the job-shop scheduling problem under uncertain information.

  13. Improved maximum entropy method for the analysis of fluorescence spectroscopy data: evaluating zero-time shift and assessing its effect on the determination of fluorescence lifetimes.

    Science.gov (United States)

    Esposito, Rosario; Mensitieri, Giuseppe; de Nicola, Sergio

    2015-12-21

    A new algorithm based on the Maximum Entropy Method (MEM) is proposed for recovering both the lifetime distribution and the zero-time shift from time-resolved fluorescence decay intensities. The developed algorithm allows the analysis of complex time decays through an iterative scheme based on entropy maximization and the Brent method to determine the minimum of the reduced chi-squared value as a function of the zero-time shift. The accuracy of this algorithm has been assessed through comparisons with simulated fluorescence decays both of multi-exponential and broad lifetime distributions for different values of the zero-time shift. The method is capable of recovering the zero-time shift with an accuracy greater than 0.2% over a time range of 2000 ps. The center and the width of the lifetime distributions are retrieved with relative discrepancies that are lower than 0.1% and 1% for the multi-exponential and continuous lifetime distributions, respectively. The MEM algorithm is experimentally validated by applying the method to fluorescence measurements of the time decays of the flavin adenine dinucleotide (FAD).

  14. Fuzzy entropy image segmentation based on particle Swarm optimization

    Institute of Scientific and Technical Information of China (English)

    Linyi Li; Deren Li

    2008-01-01

    Particle swarm optimization is a stochastic global optimization algorithm based on swarm intelligence. Because of its excellent performance, particle swarm optimization is introduced into fuzzy entropy image segmentation to select the optimal fuzzy parameter combination and fuzzy threshold adaptively. In this study, the particles in the swarm are constructed and the swarm search strategy is proposed to meet the needs of the segmentation application. Fuzzy entropy image segmentation based on particle swarm optimization is then implemented, and the proposed method obtains satisfactory results in the segmentation experiments. Compared with the exhaustive search method, particle swarm optimization gives the same optimal fuzzy parameter combination and fuzzy threshold while needing less search time in the segmentation experiments, and it also shows good search stability in repeated experiments. Therefore, fuzzy entropy image segmentation based on particle swarm optimization is an efficient and promising segmentation method.
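
The paper optimizes fuzzy-entropy parameters with PSO; as a simpler illustration of entropy-criterion thresholding, the sketch below exhaustively searches for Kapur's maximum-entropy threshold on a synthetic bimodal histogram (the fuzzy/PSO formulation generalizes this kind of objective, and PSO replaces the exhaustive search). All data and parameters here are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy bimodal "image": two Gaussian grey-level populations (synthetic data).
pixels = np.concatenate([
    rng.normal(60, 10, 5000),     # dark object mode
    rng.normal(180, 15, 5000),    # bright background mode
]).clip(0, 255).astype(int)
p = np.bincount(pixels, minlength=256).astype(float)
p /= p.sum()                      # normalized grey-level histogram

def kapur_objective(t):
    """Sum of the entropies of the two classes obtained by splitting at t."""
    w0, w1 = p[:t].sum(), p[t:].sum()
    if w0 == 0.0 or w1 == 0.0:
        return -np.inf
    q0 = p[:t][p[:t] > 0] / w0
    q1 = p[t:][p[t:] > 0] / w1
    return -np.sum(q0 * np.log(q0)) - np.sum(q1 * np.log(q1))

# Exhaustive search over all 8-bit thresholds; a PSO swarm would instead
# sample this space and converge on the same maximizer in fewer evaluations.
best_t = max(range(1, 256), key=kapur_objective)
```

For a well-separated bimodal histogram the maximum-entropy threshold lands between the two modes, which is what makes the criterion usable as a segmentation objective.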

  15. The maximum entropy method for reliability analysis of slope engineering

    Institute of Scientific and Technical Information of China (English)

    Wang Yu; Zhang Hui; Jia Zhigang

    2012-01-01

    The maximum entropy method for slope engineering reliability analysis maximizes the entropy using the partial information contained in the available samples and makes full use of the higher-order moment information of the random variables: the probability density function of the slope reliability performance function is inferred from the sample moments, and the failure probability of the slope is then solved. The method places no special requirements on the distributions of the basic random variables, and it avoids the defect of conventional methods, which must approximate non-normal random variables by equivalent normal ones at the iteration points. Usually the true probability density function of the performance function is difficult or even impossible to obtain; by introducing the Pearson curve family into the computation of the higher-order moments of the geotechnical random parameters, the higher-order central moments of the performance function can easily be obtained. The maximum entropy density function of the performance function is then fitted on the basis of the maximum entropy principle, and the Lagrange multipliers of the maximum entropy density and the failure probability of the slope are determined by the interval truncation method and Gauss-Kronrod numerical integration, respectively. An example analysis shows that the method is computationally efficient and reliable, overcoming the complexity and low accuracy of traditional solution procedures; applied to the reliability analysis of engineering slopes, it has considerable development potential and practical value.
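
The moment-matching step can be illustrated in miniature: fit a maximum-entropy density of the form exp(θ₁g + θ₂g²) for the performance function g so that its first two moments match prescribed values, then integrate the fitted density over g < 0 to obtain a failure probability. The paper matches higher-order moments via the Pearson family and uses Gauss-Kronrod quadrature; this two-moment Newton iteration on a grid, with assumed target moments, is a simplified sketch:

```python
import numpy as np

# Grid for the performance function g and target moments E[g]=2, E[g^2]=5
# (i.e. mean 2, variance 1 -- assumed values for illustration only).
g = np.linspace(-6.0, 10.0, 4001)
mu = np.array([2.0, 5.0])

T = np.vstack([g, g**2])                  # sufficient statistics
theta = np.zeros(2)                       # Lagrange multipliers
for _ in range(60):                       # Newton iteration on the dual
    w = np.exp(theta @ T)
    p = w / w.sum()                       # current maxent density on the grid
    m = T @ p                             # its moments
    cov = (T * p) @ T.T - np.outer(m, m)  # covariance of T under p
    theta += np.linalg.solve(cov, mu - m)

w = np.exp(theta @ T)
p = w / w.sum()
pf = p[g < 0].sum()                       # failure probability P(g < 0)
```

With mean 2 and variance 1 the fitted density is Gaussian, so `pf` lands near Φ(-2) ≈ 0.023; matching third and fourth moments as in the paper adds θ₃g³ and θ₄g⁴ terms to the same iteration.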

  16. Application of the maximum entropy algorithm to meteorological rainfall prediction

    Institute of Scientific and Technical Information of China (English)

    Wang Haiyan

    2014-01-01

    The calculation method based on the maximum entropy principle is applied to meteorological rainfall prediction, and effective simulation experiments demonstrate the feasibility of the maximum entropy method for rainfall prediction.

  17. Classic maximum entropy recovery of the average joint distribution of apparent FRET efficiency and fluorescence photons for single-molecule burst measurements.

    Science.gov (United States)

    DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K

    2012-04-05

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.

  18. Electromagnetic Nanoscale Metrology Based on Entropy Production and Fluctuations

    Directory of Open Access Journals (Sweden)

    James Baker-Jarvis

    2008-10-01

    Full Text Available The goal in this paper is to show how many high-frequency electromagnetic metrology areas can be understood and formulated in terms of entropy evolution, production, and fluctuations. This may be important in nanotechnology where an understanding of fluctuations of thermal and electromagnetic energy and the effects of nonequilibrium are particularly important. The approach used here is based on a new derivation of an entropy evolution equation using an exact Liouville-based statistical-mechanical theory rooted in the Robertson-Zwanzig-Mori formulations. The analysis begins by developing an exact equation for entropy rate in terms of time correlations of the microscopic entropy rate. This equation is an exact fluctuation-dissipation relationship. We then define the entropy and its production for electromagnetic driving, both in the time and frequency domains, and apply this to study dielectric and magnetic material measurements, magnetic relaxation, cavity resonance, noise, measuring Boltzmann’s constant, and power measurements.

  19. Feature selection with neighborhood entropy-based cooperative game theory.

    Science.gov (United States)

    Zeng, Kai; She, Kun; Niu, Xinzheng

    2014-01-01

    Feature selection plays an important role in machine learning and data mining. In recent years, various feature measurements have been proposed to select significant features from high-dimensional datasets. However, most traditional feature selection methods will ignore some features which have strong classification ability as a group but are weak as individuals. To deal with this problem, we redefine the redundancy, interdependence, and independence of features by using neighborhood entropy. Then the neighborhood entropy-based feature contribution is proposed under the framework of cooperative game. The evaluative criteria of features can be formalized as the product of contribution and other classical feature measures. Finally, the proposed method is tested on several UCI datasets. The results show that the neighborhood entropy-based cooperative game theory model (NECGT) yields better performance than classical methods.

  20. A non-uniformly sampled 4D HCC(CO)NH-TOCSY experiment processed using maximum entropy for rapid protein sidechain assignment

    Science.gov (United States)

    Mobli, Mehdi; Stern, Alan S.; Bermel, Wolfgang; King, Glenn F.; Hoch, Jeffrey C.

    2010-05-01

    One of the stiffest challenges in structural studies of proteins using NMR is the assignment of sidechain resonances. Typically, a panel of lengthy 3D experiments are acquired in order to establish connectivities and resolve ambiguities due to overlap. We demonstrate that these experiments can be replaced by a single 4D experiment that is time-efficient, yields excellent resolution, and captures unique carbon-proton connectivity information. The approach is made practical by the use of non-uniform sampling in the three indirect time dimensions and maximum entropy reconstruction of the corresponding 3D frequency spectrum. This 4D method will facilitate automated resonance assignment procedures and it should be particularly beneficial for increasing throughput in NMR-based structural genomics initiatives.

  1. Maximum entropy modeling risk of anthrax in the Republic of Kazakhstan.

    Science.gov (United States)

    Abdrakhmanov, S K; Mukhanbetkaliyev, Y Y; Korennoy, F I; Sultanov, A A; Kadyrov, A S; Kushubaev, D B; Bakishev, T G

    2017-09-01

    The objective of this study was to zone the territory of the Republic of Kazakhstan (RK) into risk categories according to the probability of anthrax emergence in farm animals as stipulated by the re-activation of preserved natural foci. We used historical data on anthrax morbidity in farm animals during the period 1933 - 2014, collected by the veterinary service of the RK. The database covers the entire territory of the RK and contains 4058 anthrax outbreaks tied to 1798 unique locations. Considering the strongly pronounced natural focality of anthrax, we employed environmental niche modeling (Maxent) to reveal patterns in the outbreaks' linkages to specific combinations of environmental factors. The set of bioclimatic factors BIOCLIM, derived from remote sensing data, the altitude above sea level, the land cover type, the maximum green vegetation fraction (MGVF) and the soil type were examined as explanatory variables. The model demonstrated good predictive ability, while the MGVF, the bioclimatic variables reflecting precipitation level and humidity, and the soil type were found to contribute most significantly to the model. A continuous probability surface was obtained that reflects the suitability of the study area for the emergence of anthrax outbreaks. The surface was turned into a categorical risk map by averaging the probabilities within the administrative divisions at the 2nd level and putting them into four categories of risk, namely: low, medium, high and very high risk zones, where very high risk refers to more than 50% suitability to the disease re-emergence and low risk refers to less than 10% suitability. The map indicated increased risk of anthrax re-emergence in the districts along the northern, eastern and south-eastern borders of the country. 
It was recommended that the national veterinary service uses the risk map for the development of contra-epizootic measures aimed at the prevention of anthrax re-emergence in historically affected regions of

  2. DDoS Attack Detection Algorithms Based on Entropy Computing

    Science.gov (United States)

    Li, Liying; Zhou, Jianying; Xiao, Ning

    Distributed Denial of Service (DDoS) attacks pose a severe threat to the Internet. It is difficult to find the exact signature of an attack, and it is hard to tell whether an unusually high volume of traffic is caused by an attack or occurs because a huge number of users occasionally access the target machine at the same time. Entropy detection is an effective method for detecting DDoS attacks; it is mainly used to calculate the distribution randomness of some attributes in the network packets' headers. In this paper, we focus on the detection technology of DDoS attacks. We improve the previous entropy detection algorithm and propose two enhanced detection methods based on cumulative entropy and time, respectively. Experiment results show that these methods lead to more accurate and effective DDoS detection.
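
The abstract does not spell out the cumulative-entropy algorithm, so the following is an illustrative CUSUM-style sketch of the idea with made-up parameters: accumulate per-window entropy drops below a baseline and alarm only when the running sum exceeds a limit, so a single noisy window does not trigger a false alarm the way a per-window threshold would:

```python
import math
from collections import Counter

def window_entropy(values):
    """Shannon entropy (bits) of the empirical distribution in one window."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def cumulative_entropy_detector(windows, baseline, drift=0.2, limit=3.0):
    """CUSUM-style accumulation of per-window entropy drops below baseline.

    Alarms only when the cumulative deviation exceeds `limit`; `drift` lets
    the sum decay between anomalies.  Both are made-up tuning parameters.
    """
    s, alarms = 0.0, []
    for w in windows:
        drop = baseline - window_entropy(w)   # positive when entropy falls
        s = max(0.0, s + drop - drift)
        alarms.append(s > limit)
    return alarms

# Synthetic traffic: 64 hosts give a 6-bit baseline; one transient dip (a
# benign burst from a single host), then a sustained low-entropy attack.
normal_w = [f"h{j % 64}" for j in range(256)]   # entropy = 6 bits
dip_w    = ["burst"] * 128 + normal_w[:128]     # entropy = 4 bits, one-off
attack_w = ["bot0"] * 128 + normal_w[:128]      # entropy = 4 bits, sustained

windows = [normal_w] * 3 + [dip_w] + [normal_w] * 2 + [attack_w] * 4
alarms = cumulative_entropy_detector(windows, baseline=6.0)
```

The transient dip at index 3 is tolerated, while the sustained attack starting at index 6 drives the cumulative sum over the limit.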

  3. Relating quantum coherence and correlations with entropy-based measures.

    Science.gov (United States)

    Wang, Xiao-Li; Yue, Qiu-Ling; Yu, Chao-Hua; Gao, Fei; Qin, Su-Juan

    2017-09-21

    Quantum coherence and quantum correlations are important quantum resources for quantum computation and quantum information. In this paper, using entropy-based measures, we investigate the relationships between quantum correlated coherence, which is the coherence between subsystems, and two main kinds of quantum correlations as defined by quantum discord as well as quantum entanglement. In particular, we show that quantum discord and quantum entanglement can be well characterized by quantum correlated coherence. Moreover, we prove that the entanglement measure formulated by quantum correlated coherence is lower and upper bounded by the relative entropy of entanglement and the entanglement of formation, respectively, and equal to the relative entropy of entanglement for all the maximally correlated states.

  4. Towards an Entropy-based Analysis of Log Variability

    DEFF Research Database (Denmark)

    Back, Christoffer Olling; Slaats, Tijs; Debois, Søren

    2017-01-01

    ...... and unstructured parts and therefore do not fit strictly in one category or the other. This has led to the recent introduction of hybrid miners, which aim to combine flow- and constraint-based models to provide the best possible representation of a log. In this paper we focus on a core question underlying the development of hybrid miners: given a (sub-)log, can we determine a priori whether the log is best suited for imperative or declarative mining? We propose using the concept of entropy, commonly used in information theory. We consider different measures for entropy that could be applied and show through......

  5. A new parallel algorithm for image matching based on entropy

    Institute of Scientific and Technical Information of China (English)

    董开坤; 胡铭曾

    2001-01-01

    Presents a new parallel image matching algorithm based on the concept of an entropy feature vector and suitable for SIMD computers, which, in comparison with other algorithms, has the following advantages: (1) the spatial information of an image is appropriately introduced into the definition of image entropy; (2) a large number of multiplication operations are eliminated, which speeds up the algorithm; (3) the shortcoming of having to perform a global calculation first is overcome. It is concluded that the algorithm has very good locality and is suitable for parallel processing.

  6. Entropy-growth-based model of emotionally charged online dialogues

    CERN Document Server

    Sienkiewicz, Julian; Paltoglou, Georgios; Holyst, Janusz A

    2012-01-01

    We analyze emotionally annotated massive data from IRC (Internet Relay Chat) and model the dialogues between its participants by assuming that the driving force for the discussion is the entropy growth of emotional probability distribution. This process is claimed to be correlated to the emergence of the power-law distribution of the discussion lengths observed in the dialogues. We perform numerical simulations based on the noticed phenomenon obtaining a good agreement with the real data. Finally, we propose a method to artificially prolong the duration of the discussion that relies on the entropy of emotional probability distribution.

  7. Maximum-Entropy Models of Sequenced Immune Repertoires Predict Antigen-Antibody Affinity

    DEFF Research Database (Denmark)

    Asti, Lorenzo; Uguzzoni, Guido; Marcatili, Paolo

    2016-01-01

    The immune system has developed a number of distinct complex mechanisms to shape and control the antibody repertoire. One of these mechanisms, the affinity maturation process, works in an evolutionary-like fashion: after binding to a foreign molecule, the antibody-producing B-cells exhibit a high...... of an HIV-1 infected patient. The Pearson correlation coefficient between our scoring function and the IC50 neutralization titer measured on 30 different antibodies of known sequence is as high as 0.77 (p-value 10⁻⁶), outperforming other sequence- and structure-based models....

  8. A Maximum Entropy Fixed-Point Route Choice Model for Route Correlation

    Directory of Open Access Journals (Sweden)

    Louis de Grange

    2014-06-01

    Full Text Available In this paper we present a stochastic route choice model for transit networks that explicitly addresses route correlation due to overlapping alternatives. The model is based on a multi-objective mathematical programming problem, the optimality conditions of which generate an extension to the Multinomial Logit models. The proposed model considers a fixed point problem for treating correlations between routes, which can be solved iteratively. We estimated the new model on the Santiago (Chile) Metro network and compared the results with other route choice models that can be found in the literature. The new model has better explanatory and predictive power than many other alternative models, correctly capturing the correlation factor. Our methodology can be extended to private transport networks.

  9. Extracting the near surface stoichiometry of BiFe0.5Mn0.5O3 thin films; a finite element maximum entropy approach

    NARCIS (Netherlands)

    Song, F.; Monsen, A.; Li, Z. S.; Choi, E. -M.; MacManus-Driscoll, J. L.; Xiong, J.; Jia, Q. X.; Wahlstrom, E.; Wells, J. W.

    2012-01-01

    The surface and near-surface chemical composition of BiFe0.5Mn0.5O3 has been studied using a combination of low photon energy synchrotron photoemission spectroscopy, and a newly developed maximum entropy finite element model from which it is possible to extract the depth-dependent chemical composition.

  10. A Unified Theory of Turbulence: Maximum Entropy Increase Due To Turbulent Dissipation In Fluid Systems From Laboratory-scale Turbulence To Global-scale Circulations

    Science.gov (United States)

    Ozawa, Hisashi; Shimokawa, Shinya; Sakuma, Hirofumi

    Turbulence is ubiquitous in nature, yet remains an enigma in many respects. Here we investigate dissipative properties of turbulence so as to find out a statistical "law" of turbulence. Two general expressions are derived for a rate of entropy increase due to thermal and viscous dissipation (turbulent dissipation) in a fluid system. It is found with these equations that phenomenological properties of turbulence such as Malkus's suggestion on maximum heat transport in thermal convection as well as Busse's suggestion on maximum momentum transport in shear turbulence can rigorously be explained by a unique state in which the rate of entropy increase due to the turbulent dissipation is at a maximum (dS/dt = Max.). It is also shown that the same state corresponds to the maximum entropy climate suggested by Paltridge. The tendency to increase the rate of entropy increase has also been confirmed by our recent GCM experiments. These results suggest the existence of a universal law that manifests itself in the long-term statistics of turbulent fluid systems from laboratory-scale turbulence to planetary-scale circulations. Ref.) Ozawa, H., Shimokawa, S., and Sakuma, H., Phys. Rev. E 64, 026303, 2001.

  11. Human Brain Networks: Spiking Neuron Models, Multistability, Synchronization, Thermodynamics, Maximum Entropy Production, and Anesthetic Cascade Mechanisms

    Directory of Open Access Journals (Sweden)

    Wassim M. Haddad

    2014-07-01

    Full Text Available Advances in neuroscience have been closely linked to mathematical modeling beginning with the integrate-and-fire model of Lapicque and proceeding through the modeling of the action potential by Hodgkin and Huxley to the current era. The fundamental building block of the central nervous system, the neuron, may be thought of as a dynamic element that is “excitable”, and can generate a pulse or spike whenever the electrochemical potential across the cell membrane of the neuron exceeds a threshold. A key application of nonlinear dynamical systems theory to the neurosciences is to study phenomena of the central nervous system that exhibit nearly discontinuous transitions between macroscopic states. A very challenging and clinically important problem exhibiting this phenomenon is the induction of general anesthesia. In any specific patient, the transition from consciousness to unconsciousness as the concentration of anesthetic drugs increases is very sharp, resembling a thermodynamic phase transition. This paper focuses on multistability theory for continuous and discontinuous dynamical systems having a set of multiple isolated equilibria and/or a continuum of equilibria. Multistability is the property whereby the solutions of a dynamical system can alternate between two or more mutually exclusive Lyapunov stable and convergent equilibrium states under asymptotically slowly changing inputs or system parameters. In this paper, we extend the theory of multistability to continuous, discontinuous, and stochastic nonlinear dynamical systems. In particular, Lyapunov-based tests for multistability and synchronization of dynamical systems with continuously differentiable and absolutely continuous flows are established. The results are then applied to excitatory and inhibitory biological neuronal networks to explain the underlying mechanism of action for anesthesia and consciousness from a multistable dynamical system perspective, thereby providing a

  12. Bayesian maximum entropy integration of ozone observations and model predictions: an application for attainment demonstration in North Carolina.

    Science.gov (United States)

    de Nazelle, Audrey; Arunachalam, Saravanan; Serre, Marc L

    2010-08-01

    States in the USA are required to demonstrate future compliance of criteria air pollutant standards by using both air quality monitors and model outputs. In the case of ozone, the demonstration tests aim at relying heavily on measured values, due to their perceived objectivity and enforceable quality. Weight given to numerical models is diminished by integrating them in the calculations only in a relative sense. For unmonitored locations, the EPA has suggested the use of a spatial interpolation technique to assign current values. We demonstrate that this approach may lead to erroneous assignments of nonattainment and may make it difficult for States to establish future compliance. We propose a method that combines different sources of information to map air pollution, using the Bayesian Maximum Entropy (BME) Framework. The approach gives precedence to measured values and integrates modeled data as a function of model performance. We demonstrate this approach in North Carolina, using the State's ozone monitoring network in combination with outputs from the Multiscale Air Quality Simulation Platform (MAQSIP) modeling system. We show that the BME data integration approach, compared to a spatial interpolation of measured data, improves the accuracy and the precision of ozone estimations across the state.
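The way BME gives precedence to measurements while folding in model output as a function of model performance can be illustrated with a bare-bones precision-weighted estimate. This is a deliberately simplified sketch of the idea, not the BME framework itself; the function name and all numbers are hypothetical.

```python
def fuse(obs, obs_var, model, model_rmse):
    """Precision-weighted estimate: an accurate observation dominates,
    while the model output is down-weighted by its RMSE."""
    w_obs = 1.0 / obs_var
    w_mod = 1.0 / model_rmse ** 2
    return (w_obs * obs + w_mod * model) / (w_obs + w_mod)

# Monitored site: ozone measured at 62 ppb (variance 1), model says 70 ppb (RMSE 5).
est = fuse(obs=62.0, obs_var=1.0, model=70.0, model_rmse=5.0)
```

As obs_var grows (an unmonitored location), the estimate slides toward the model value, which is the qualitative behaviour the BME integration achieves with full space/time covariance models.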

  13. Electron density distribution in Si and Ge using multipole, maximum entropy method and pair distribution function analysis

    Indian Academy of Sciences (India)

    R Saravanan; K S Syed Ali; S Israel

    2008-04-01

    The local, average and electronic structure of the semiconducting materials Si and Ge has been studied using multipole, maximum entropy method (MEM) and pair distribution function (PDF) analyses, using X-ray powder data. The covalent nature of bonding and the interaction between the atoms are clearly revealed by the two-dimensional MEM maps plotted on (1 0 0) and (1 1 0) planes and one-dimensional density along [1 0 0], [1 1 0] and [1 1 1] directions. The mid-bond electron densities between the atoms are 0.554 e/Å3 and 0.187 e/Å3 for Si and Ge respectively. In this work, the local structural information has also been obtained by analyzing the atomic pair distribution function. An attempt has been made in the present work to utilize the X-ray powder data sets to refine the structure and electron density distribution using the currently available versatile methods, MEM, multipole analysis and determination of pair distribution function for these two systems.

  14. A Relevancy, Hierarchical and Contextual Maximum Entropy Framework for a Data-Driven 3D Scene Generation

    Directory of Open Access Journals (Sweden)

    Mesfin Dema

    2014-05-01

    Full Text Available We introduce a novel Maximum Entropy (MaxEnt) framework that can generate 3D scenes by incorporating objects’ relevancy, hierarchical and contextual constraints in a unified model. This model is formulated by a Gibbs distribution, under the MaxEnt framework, that can be sampled to generate plausible scenes. Unlike existing approaches, which represent a given scene by a single And-Or graph, the relevancy constraint (defined as the frequency with which a given object exists in the training data) requires our approach to sample from multiple And-Or graphs, allowing variability in terms of objects’ existence across synthesized scenes. Once an And-Or graph is sampled from the ensemble, the hierarchical constraints are employed to sample the Or-nodes (style variations) and the contextual constraints are subsequently used to enforce the corresponding relations that must be satisfied by the And-nodes. To illustrate the proposed methodology, we use desk scenes that are composed of objects whose existence, styles and arrangements (position and orientation) can vary from one scene to the next. The relevancy, hierarchical and contextual constraints are extracted from a set of training scenes and utilized to generate plausible synthetic scenes that in turn satisfy these constraints. After applying the proposed framework, scenes that are plausible representations of the training examples are automatically generated.

  15. Soil Moisture and Vegetation Controls on Surface Energy Balance Using the Maximum Entropy Production Model of Evapotranspiration

    Science.gov (United States)

    Wang, J.; Parolari, A.; Huang, S. Y.

    2014-12-01

    The objective of this study is to formulate and test plant water stress parameterizations for the recently proposed maximum entropy production (MEP) model of evapotranspiration (ET) over vegetated surfaces. The MEP model of ET is a parsimonious alternative to existing land surface parameterizations, predicting surface energy fluxes from net radiation, temperature, humidity, and a small number of parameters. The MEP model was previously tested for vegetated surfaces under well-watered and dry, dormant conditions, when the surface energy balance is relatively insensitive to plant physiological activity. Under water stressed conditions, however, the plant water stress response strongly affects the surface energy balance. This effect occurs through plant physiological adjustments that reduce ET to maintain leaf turgor pressure as soil moisture is depleted during drought. To improve the MEP model's ET predictions under water stressed conditions, the model was modified to incorporate this plant-mediated feedback between soil moisture and ET. We compare MEP model predictions to observations under a range of field conditions, including bare soil, grassland, and forest. The results indicate that a water stress function combining the soil water potential in the surface soil layer with the atmospheric humidity successfully reproduces observed ET decreases during drought. In addition to its utility as a modeling tool, the calibrated water stress function also provides a means to infer ecosystem influence on the land surface state. Challenges associated with sampling model input data (i.e., net radiation, surface temperature, and surface humidity) are also discussed.
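One plausible shape for a stress function combining the surface-layer soil water potential with atmospheric humidity is sketched below. The functional form, the reference potential and all numbers are illustrative assumptions, not the calibrated function from the study.

```python
import math

def water_stress(psi_soil, rh, psi_ref=150.0):
    """Illustrative stress factor in [0, 1]: decays as the surface soil dries
    (psi_soil more negative, here in metres of head) and as the air dries."""
    soil_term = math.exp(psi_soil / psi_ref)   # psi_soil <= 0
    return max(0.0, min(1.0, soil_term * rh))

et_potential = 5.0   # mm/day, hypothetical unstressed (MEP) estimate
et_wet = et_potential * water_stress(psi_soil=-10.0, rh=0.8)
et_dry = et_potential * water_stress(psi_soil=-400.0, rh=0.4)
```

Multiplying the unstressed MEP flux by such a factor reproduces the qualitative drought response described above: ET falls sharply as the surface layer dries.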

  16. Electron density distribution and bonding in ZnSe and PbSe using maximum entropy method (MEM)

    Indian Academy of Sciences (India)

    K S Syed Ali; R Saravanan; S Israel; R K Rajaram

    2006-04-01

    The study of electronic structure of materials and bonding is an important part of material characterization. The maximum entropy method (MEM) is a powerful tool for deriving accurate electron density distribution in crystalline materials using experimental data. In this paper, the attention is focused on producing electron density distribution of ZnSe and PbSe using JCPDS X-ray powder diffraction data. The covalent/ionic nature of the bonding and the interaction between the atoms are clearly revealed by the MEM maps. The mid bond electron densities between atoms in these systems are found to be 0.544 e/Å3 and 0.261 e/Å3, respectively for ZnSe and PbSe. The bonding in these two systems has been studied using two-dimensional MEM electron density maps on the (100) and (110) planes, and the one-dimensional electron density profiles along [100], [110] and [111] directions. The thermal parameters of the individual atoms have also been reported in this work. The algorithm of the MEM procedure has been presented.

  17. Binary versus non-binary information in real time series: empirical results and maximum-entropy matrix models

    CERN Document Server

    Almog, Assaf

    2014-01-01

    The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of time series of activity of their fundamental elements (such as stocks or neurons respectively). While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relationships between binary and non-binary properties of financial time series. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to replicate the observed binary/non-binary relations very well, and to mathematically...
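The binary projection in question is simply the sign of the temporal increments; a minimal sketch on a synthetic random walk (not financial data) shows how the signature is extracted and its Shannon entropy measured:

```python
import numpy as np

def binary_signature(x):
    """Sign of the temporal increments (+1/-1; exact zeros dropped)."""
    s = np.sign(np.diff(x))
    return s[s != 0]

def shannon_entropy_bits(symbols):
    """Shannon entropy of a symbol sequence, in bits."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(42)
series = 100.0 + np.cumsum(rng.normal(size=10_000))  # toy "activity" series
sig = binary_signature(series)
h = shannon_entropy_bits(sig)   # close to 1 bit for an unbiased walk
```

Maximum-entropy ensembles of binary matrices then constrain simple statistics of many such sign sequences (e.g. row and column sums) and ask what other observed properties follow for free.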

  18. Location of Cu(2+) in CHA zeolite investigated by X-ray diffraction using the Rietveld/maximum entropy method.

    Science.gov (United States)

    Andersen, Casper Welzel; Bremholm, Martin; Vennestrøm, Peter Nicolai Ravnborg; Blichfeld, Anders Bank; Lundegaard, Lars Fahl; Iversen, Bo Brummerstedt

    2014-11-01

    Accurate structural models of reaction centres in zeolite catalysts are a prerequisite for mechanistic studies and further improvements to the catalytic performance. The Rietveld/maximum entropy method is applied to synchrotron powder X-ray diffraction data on fully dehydrated CHA-type zeolites with and without loading of catalytically active Cu(2+) for the selective catalytic reduction of NOx with NH3. The method identifies the known Cu(2+) sites in the six-membered ring and a not previously observed site in the eight-membered ring. The sum of the refined Cu occupancies for these two sites matches the chemical analysis and thus all the Cu is accounted for. It is furthermore shown that approximately 80% of the Cu(2+) is located in the new 8-ring site for an industrially relevant CHA zeolite with Si/Al = 15.5 and Cu/Al = 0.45. Density functional theory calculations are used to corroborate the positions and identity of the two Cu sites, leading to the most complete structural description of dehydrated silicoaluminate CHA loaded with catalytically active Cu(2+) cations.

  19. Entropy-Based Privacy against Profiling of User Mobility

    Directory of Open Access Journals (Sweden)

    Alicia Rodriguez-Carrion

    2015-06-01

    Full Text Available Location-based services (LBSs) flood mobile phones nowadays, but their use poses an evident privacy risk. The locations accompanying the LBS queries can be exploited by the LBS provider to build the user profile of visited locations, which might disclose sensitive data, such as work or home locations. The classic concept of entropy is widely used to evaluate privacy in these scenarios, where the information is represented as a sequence of independent samples of categorized data. However, since the LBS queries might be sent very frequently, location profiles can be improved by adding temporal dependencies, thus becoming mobility profiles, where location samples are not independent anymore and might disclose the user’s mobility patterns. Since the time dimension is factored in, the classic entropy concept falls short of evaluating the real privacy level, which depends also on the time component. Therefore, we propose to extend the entropy-based privacy metric to the use of the entropy rate to evaluate mobility profiles. Then, two perturbative mechanisms are considered to preserve locations and mobility profiles under gradual utility constraints. We further use the proposed privacy metric and compare it to classic ones to evaluate both synthetic and real mobility profiles when the perturbative methods proposed are applied. The results prove the usefulness of the proposed metric for mobility profiles and the need for tailoring the perturbative methods to the features of mobility profiles in order to improve privacy without completely losing utility.
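The gap between the classic entropy metric and the proposed entropy-rate metric is easy to demonstrate on a toy trace: a perfectly periodic commuter has maximal marginal entropy over three locations yet zero entropy rate, because each location fully determines the next. The trace and the first-order estimator below are illustrative, not the paper's mechanisms.

```python
import numpy as np

def marginal_entropy_bits(seq):
    """Classic metric: entropy of the location histogram, order ignored."""
    _, counts = np.unique(seq, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def entropy_rate_bits(seq):
    """First-order estimate of H(X_t | X_{t-1}) from empirical transitions."""
    states = {s: i for i, s in enumerate(np.unique(seq))}
    n = len(states)
    counts = np.zeros((n, n))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[states[a], states[b]] += 1.0
    total = counts.sum()
    rate = 0.0
    for row in counts:
        if row.sum() == 0.0:
            continue
        p = row / row.sum()
        p = p[p > 0]
        rate += (row.sum() / total) * float(-np.sum(p * np.log2(p)))
    return rate

# A perfectly periodic commuter: home -> work -> gym -> home -> ...
trace = ["home", "work", "gym"] * 200
```

Here marginal_entropy_bits(trace) is log2(3) ≈ 1.58 bits, suggesting good privacy, while entropy_rate_bits(trace) is 0: once temporal dependencies are accounted for, the mobility profile is fully predictable.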

  20. Entropy-Based Search Algorithm for Experimental Design

    Science.gov (United States)

    Malakar, N. K.; Knuth, K. H.

    2011-03-01

    The scientific method relies on the iterated processes of inference and inquiry. The inference phase consists of selecting the most probable models based on the available data; whereas the inquiry phase consists of using what is known about the models to select the most relevant experiment. Optimizing inquiry involves searching the parameterized space of experiments to select the experiment that promises, on average, to be maximally informative. In the case where it is important to learn about each of the model parameters, the relevance of an experiment is quantified by Shannon entropy of the distribution of experimental outcomes predicted by a probable set of models. If the set of potential experiments is described by many parameters, we must search this high-dimensional entropy space. Brute force search methods will be slow and computationally expensive. We present an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment for efficient experimental design. This algorithm is inspired by Skilling's nested sampling algorithm used in inference and borrows the concept of a rising threshold while a set of experiment samples are maintained. We demonstrate that this algorithm not only selects highly relevant experiments, but also is more efficient than brute force search. Such entropic search techniques promise to greatly benefit autonomous experimental design.
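A toy version of the relevance criterion (the Shannon entropy of predicted outcomes over a set of candidate models) is sketched below; the models, the experiment space and the outcome binning are invented, and the nested entropy sampling search itself is not reproduced:

```python
import numpy as np

# Candidate models: lines y = a*x + b that are all consistent with data so far.
models = [(a, b) for a in (0.5, 1.0, 2.0) for b in (0.0, 1.0)]
candidate_x = np.linspace(0.0, 5.0, 51)   # parameterized space of experiments

def outcome_entropy(x, bin_width=0.5):
    """Shannon entropy (bits) of the distribution of predicted outcomes at x."""
    preds = np.array([a * x + b for a, b in models])
    bins = np.round(preds / bin_width)    # coarse-grain continuous outcomes
    _, counts = np.unique(bins, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# The most relevant experiment is where the candidate models disagree most.
best_x = max(candidate_x, key=outcome_entropy)
```

Brute force over 51 candidates is fine here; nested entropy sampling addresses the case where the experiment space is too high-dimensional for this exhaustive scan.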

  1. A Method of Rotating Machinery Fault Diagnosis Based on the Close Degree of Information Entropy

    Institute of Scientific and Technical Information of China (English)

    GENG Jun-bao; HUANG Shu-hong; JIN Jia-shan; CHEN Fei; LIU Wei

    2006-01-01

    This paper presents a method of rotating machinery fault diagnosis based on the close degree of information entropy. From the viewpoint of information entropy, we introduce four information entropy features of the rotating machinery, which describe its vibration condition. The four features are, respectively, the singular spectrum entropy, power spectrum entropy, wavelet space state feature entropy and wavelet power spectrum entropy. The value ranges of the four information entropy features under some typical fault conditions are obtained by experiments and can serve as the standard features for fault diagnosis. Following the principle that the shorter the distance, the more similar the patterns, a decision-making method based on the close degree of information entropy is put forward for the recognition of fault patterns. We demonstrate the effectiveness of this approach in an instance involving the fault pattern recognition of rotating machinery.
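As a sketch, one of the four features (power spectrum entropy) and the shortest-distance decision rule can be written as follows; the signals and standard feature values are synthetic stand-ins, not the paper's experimental data:

```python
import numpy as np

def power_spectrum_entropy(signal):
    """One of the paper's four features: Shannon entropy of the
    normalized power spectrum of a vibration signal."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def diagnose(feature, standards):
    """Close-degree decision rule: pick the standard pattern whose
    feature value is nearest (shorter distance = more similar)."""
    return min(standards, key=lambda name: abs(feature - standards[name]))

rng = np.random.default_rng(7)
t = np.linspace(0.0, 1.0, 2048, endpoint=False)
healthy = np.sin(2 * np.pi * 50 * t)                 # one dominant tone
faulty = healthy + 0.8 * rng.normal(size=t.size)     # broadband vibration

# Hypothetical standard feature values measured on known machine conditions.
standards = {
    "healthy": power_spectrum_entropy(np.sin(2 * np.pi * 50 * t)),
    "fault": power_spectrum_entropy(healthy + 0.8 * rng.normal(size=t.size)),
}
label = diagnose(power_spectrum_entropy(faulty), standards)
```

A concentrated spectrum gives low entropy and a broadband (faulty) spectrum gives high entropy, so the close-degree rule assigns the noisy signal to the fault pattern.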

  2. Seamless measurement technology of transient signals based on approximate entropy

    Science.gov (United States)

    Jiang, Jun; Tian, Shulin; Guo, Lianping; Huang, Wuhuang

    2016-10-01

    The acquisition of waveforms and the analysis of transient characteristics of signals are fundamental tasks in time-domain measurement, and reducing the measurement gap, ultimately to zero (seamless measurement), is extremely important for the acquisition, measurement, and analysis of transient signals. Aimed at seamless time-domain measurement of non-stationary transient signals, this paper proposes an approximate entropy-based characteristic signal extraction algorithm built on information entropy theory. The algorithm quantitatively describes the complexity (amount of information) of the sampled signals through their approximate entropy value, self-adaptively captures characteristic signals in real time under the control of the approximate entropy, extracts the critical or useful information, and removes redundant or useless information, thereby reducing the time spent processing data and displaying waveforms and finally realizing seamless time-domain measurement of transient signals. Experimental results show that the study could provide a new method for the design of electronic measuring instruments with seamless measurement capability.
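A compact reference implementation of approximate entropy, the quantity the algorithm thresholds, is shown below. The real-time capture logic of the paper is not reproduced; parameters follow the common ApEn(m=2, r=0.2·std) convention, and the test signals are synthetic.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """ApEn(m, r): near zero for regular signals, larger for complex ones.
    The tolerance r is conventionally a fraction of the signal's std."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(mm):
        n = len(x) - mm + 1
        templates = np.array([x[i:i + mm] for i in range(n)])
        # Chebyshev distance between every pair of templates (self-matches kept).
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = (dist <= r).sum(axis=1) / n
        return float(np.mean(np.log(counts)))

    return phi(m) - phi(m + 1)

t = np.linspace(0.0, 10 * np.pi, 600)
regular = np.sin(t)                                  # predictable waveform
noise = np.random.default_rng(3).normal(size=600)    # complex transient-like signal
```

The pairwise-distance matrix makes this O(n²) in memory, which is fine for short windows; a streaming instrument would use an incremental variant.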

  3. Does the soil's effective hydraulic conductivity adapt in order to obey the Maximum Entropy Production principle? A lab experiment

    Science.gov (United States)

    Westhoff, Martijn; Zehe, Erwin; Erpicum, Sébastien; Archambeau, Pierre; Pirotton, Michel; Dewals, Benjamin

    2015-04-01

    The Maximum Entropy Production (MEP) principle is a conjecture assuming that a medium is organized in such a way that maximum power is subtracted from a gradient driving a flux (with power being a flux times its driving gradient). This maximum power is also known as the Carnot limit. It has already been shown that the atmosphere operates close to this Carnot limit when it comes to heat transport from the Equator to the poles, or vertically, from the surface to the atmospheric boundary layer. To reach this state close to the Carnot limit, the effective thermal conductivity of the atmosphere is adapted by the creation of convection cells (e.g. wind). The aim of this study is to test if the soil's effective hydraulic conductivity also adapts itself in such a way that it operates close to the Carnot limit. The big difference between atmosphere and soil is the way of adaptation of its resistance. The soil's hydraulic conductivity is either changed by weathering processes, which is a very slow process, or by creation of preferential flow paths. In this study the latter process is simulated in a lab experiment, where we focus on the preferential flow paths created by piping. Piping is the process of backwards erosion of sand particles subject to a large pressure gradient. Since this is a relatively fast process, it is suitable for being tested in the lab. In the lab setup a horizontal sand bed connects two reservoirs that both drain freely at a level high enough to keep the sand bed always saturated. By adding water to only one reservoir, a horizontal pressure gradient is maintained. If the flow resistance is small, a large gradient develops, leading to the effect of piping. When pipes are being formed, the effective flow resistance decreases; the flow through the sand bed increases and the pressure gradient decreases. At a certain point, the flow velocity is small enough to stop the pipes from growing any further. In this steady state, the effective flow resistance of

  4. Equivalent Relation between Normalized Spatial Entropy and Fractal Dimension

    CERN Document Server

    Chen, Yanguang

    2016-01-01

    Fractal dimension is defined on the basis of entropy, including macro-state entropy and information entropy. The generalized dimension of multifractals is based on Renyi entropy. However, the mathematical transform from entropy to fractal dimension is not yet clear in either theory or practice. This paper is devoted to revealing the equivalence relation between spatial entropy and fractal dimension using the box-counting method. Based on various regular fractals, the numerical relationship between spatial entropy and fractal dimension is examined. The results show that the ratio of actual entropy (Mq) to the maximum entropy (Mmax) equals the ratio of actual dimension (Dq) to the maximum dimension (Dmax), that is, Mq/Mmax=Dq/Dmax. For real systems, the spatial entropy and fractal dimension of complex spatial systems such as cities can be converted into one another by means of the functional box-counting method. The theoretical inference is verified by observational data of urban form. A conclusion is that normalized spat...
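The claimed identity Mq/Mmax = Dq/Dmax can be checked numerically on a regular fractal. A sketch with the Sierpinski carpet follows (the fractal, recursion level and equal-weight entropy are our illustrative choices, corresponding to the q = 0 case):

```python
import numpy as np

PATTERN = np.array([[1, 1, 1],
                    [1, 0, 1],
                    [1, 1, 1]], dtype=np.uint8)

def carpet(level):
    """0/1 grid of the Sierpinski carpet, side length 3**level."""
    grid = np.array([[1]], dtype=np.uint8)
    for _ in range(level):
        grid = np.kron(PATTERN, grid)
    return grid

g = carpet(5)
n = g.shape[0]                     # boxes per side at scale eps = 1/n
occupied = int(g.sum())            # N(eps): occupied boxes

M = np.log(occupied)               # spatial (macro-state) entropy, equal weights
M_max = np.log(n * n)              # entropy of a fully occupied 2D grid
D = np.log(occupied) / np.log(n)   # box-counting dimension estimate
D_max = 2.0                        # Euclidean (embedding) dimension

ratio_entropy = M / M_max
ratio_dimension = D / D_max
```

For the carpet, both ratios equal log 8 / (2 log 3) ≈ 0.946 at every recursion level, which is the scale-invariance the equivalence relation rests on.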

  5. Rough K-means Outlier Factor Based on Entropy Computation

    Directory of Open Access Journals (Sweden)

    Djoko Budiyanto Setyohadi

    2014-07-01

    Full Text Available Many studies of outlier detection have been developed based on the cluster-based outlier detection approach, since it does not need any prior knowledge of the dataset. However, previous studies only compute the outlier factor with respect to a single point or a small cluster, which reflects how far it deviates from a common cluster. Furthermore, all objects within an outlier cluster are assumed to be similar. Intuitively, outlier objects can be grouped into outlier clusters, and the outlier factors of the objects within an outlier cluster should differ gradually; it is not natural for the outlierness of every object within an outlier cluster to be the same. This study proposes a new outlier detection method based on a hybrid of the Rough K-Means clustering algorithm and entropy computation. We introduce an outlier degree measure, namely the entropy outlier factor, for cluster-based outlier detection. The proposed algorithm sequentially finds the outlier cluster and calculates the outlier factor degree of the objects within it, evaluating each object with a cluster-based entropy relative to the whole cluster. The performance of the algorithm has been tested on four UCI benchmark data sets and shows superior results, especially in detection rate.

  6. Predicting the current and future potential distributions of lymphatic filariasis in Africa using maximum entropy ecological niche modelling.

    Science.gov (United States)

    Slater, Hannah; Michael, Edwin

    2012-01-01

    Modelling the spatial distributions of human parasite species is crucial to understanding the environmental determinants of infection as well as for guiding the planning of control programmes. Here, we use ecological niche modelling to map the current potential distribution of the macroparasitic disease, lymphatic filariasis (LF), in Africa, and to estimate how future changes in climate and population could affect its spread and burden across the continent. We used 508 community-specific infection presence data collated from the published literature in conjunction with five predictive environmental/climatic and demographic variables, and a maximum entropy niche modelling method to construct the first ecological niche maps describing potential distribution and burden of LF in Africa. We also ran the best-fit model against climate projections made by the HADCM3 and CCCMA models for 2050 under A2a and B2a scenarios to simulate the likely distribution of LF under future climate and population changes. We predict a broad geographic distribution of LF in Africa extending from the west to the east across the middle region of the continent, with high probabilities of occurrence in Western Africa compared to large areas of medium probability interspersed with smaller areas of high probability in Central and Eastern Africa and in Madagascar. We uncovered complex relationships between predictor ecological niche variables and the probability of LF occurrence. We show for the first time that predicted climate change and population growth will expand both the range and risk of LF infection (and ultimately disease) in an endemic region. We estimate that populations at risk of LF may currently range from 543 to 804 million, and that this could rise to between 1.65 and 1.86 billion in the future, depending on the climate scenario used and thresholds applied to signify infection presence.
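The core of maximum entropy niche modelling is a Gibbs distribution over grid cells whose feature expectations match the means observed at presence records. A self-contained sketch on synthetic data follows; this is the bare estimation step only (the Maxent software adds feature classes and regularization), and the landscape, covariates and presence records are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical landscape: 500 grid cells, two standardized climate covariates.
features = rng.normal(size=(500, 2))

# Synthetic presence records, drawn preferentially along a "true" gradient.
lam_true = np.array([1.5, -0.8])
p_true = np.exp(features @ lam_true)
p_true /= p_true.sum()
presences = rng.choice(500, size=300, p=p_true)
target = features[presences].mean(axis=0)   # empirical feature means

# Core MaxEnt step: fit a Gibbs distribution p(x) proportional to exp(lam.f(x))
# whose feature expectations match the means at presence sites.
lam = np.zeros(2)
for _ in range(5000):
    p = np.exp(features @ lam)
    p /= p.sum()
    lam += 0.5 * (target - p @ features)    # gradient ascent on log-likelihood

suitability = np.exp(features @ lam)
suitability /= suitability.sum()            # relative habitat suitability per cell
```

Among all distributions matching the observed feature means, this one has maximum entropy, which is why presence-only records suffice: no absences are ever required.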

  7. Deep-sea benthic megafaunal habitat suitability modelling: A global-scale maximum entropy model for xenophyophores

    Science.gov (United States)

    Ashford, Oliver S.; Davies, Andrew J.; Jones, Daniel O. B.

    2014-12-01

    Xenophyophores are a group of exclusively deep-sea agglutinating rhizarian protozoans, at least some of which are foraminifera. They are an important constituent of the deep-sea megafauna that are sometimes found in sufficient abundance to act as a significant source of habitat structure for meiofaunal and macrofaunal organisms. This study utilised maximum entropy modelling (Maxent) and a high-resolution environmental database to explore the environmental factors controlling the presence of Xenophyophorea and two frequently sampled xenophyophore species that are taxonomically stable: Syringammina fragilissima and Stannophyllum zonarium. These factors were also used to predict the global distribution of each taxon. Areas of high habitat suitability for xenophyophores were highlighted throughout the world's oceans, including in a large number of areas yet to be suitably sampled, but the Northeast and Southeast Atlantic Ocean, Gulf of Mexico and Caribbean Sea, the Red Sea and deep-water regions of the Malay Archipelago represented particular hotspots. The two species investigated showed more specific habitat requirements when compared to the model encompassing all xenophyophore records, perhaps in part due to the smaller number and relatively more clustered nature of the presence records available for modelling at present. The environmental variables depth, oxygen parameters, nitrate concentration, carbon-chemistry parameters and temperature were of greatest importance in determining xenophyophore distributions, but, somewhat surprisingly, hydrodynamic parameters were consistently shown to have low importance, possibly due to the paucity of well-resolved global hydrodynamic datasets. The results of this study (and others of a similar type) have the potential to guide further sample collection, environmental policy, and spatial planning of marine protected areas and industrial activities that impact the seafloor, particularly those that overlap with aggregations of

  8. Non-uniformly under-sampled multi-dimensional spectroscopic imaging in vivo: maximum entropy versus compressed sensing reconstruction.

    Science.gov (United States)

    Burns, Brian; Wilson, Neil E; Furuyama, Jon K; Thomas, M Albert

    2014-02-01

    The four-dimensional (4D) echo-planar correlated spectroscopic imaging (EP-COSI) sequence allows for the simultaneous acquisition of two spatial (ky, kx) and two spectral (t2, t1) dimensions in vivo in a single recording. However, its scan time is directly proportional to the number of increments in the ky and t1 dimensions, and a single scan can take 20–40 min using typical parameters, which is too long to be used for a routine clinical protocol. The present work describes efforts to accelerate EP-COSI data acquisition by application of non-uniform under-sampling (NUS) to the ky–t1 plane of simulated and in vivo EP-COSI datasets, then reconstructing missing samples using maximum entropy (MaxEnt) and compressed sensing (CS). Both reconstruction problems were solved using the Cambridge algorithm, which offers many workflow improvements over other l1-norm solvers. Reconstructions of retrospectively under-sampled simulated data demonstrate that the MaxEnt and CS reconstructions successfully restore data fidelity at signal-to-noise ratios (SNRs) from 4 to 20 and 5× to 1.25× NUS. Retrospectively and prospectively 4× under-sampled 4D EP-COSI in vivo datasets show that both reconstruction methods successfully remove NUS artifacts; however, MaxEnt provides reconstructions equal to or better than CS. Our results show that NUS combined with iterative reconstruction can reduce 4D EP-COSI scan times by 75% to a clinically viable 5 min in vivo, with MaxEnt being the preferred method.

  9. Estimation of Land Surface Temperature through Blending MODIS and AMSR-E Data with the Bayesian Maximum Entropy Method

    Directory of Open Access Journals (Sweden)

    Xiaokang Kou

    2016-01-01

    Full Text Available Land surface temperature (LST) plays a major role in the study of surface energy balances. Remote sensing techniques provide ways to monitor LST at large scales. However, due to atmospheric influences, significant missing data exist in LST products retrieved from satellite thermal infrared (TIR) remotely sensed data. Although passive microwaves (PMWs) are able to overcome these atmospheric influences while estimating LST, the data are constrained by low spatial resolution. In this study, to obtain complete and high-quality LST data, the Bayesian Maximum Entropy (BME) method was introduced to merge 0.01° and 0.25° LSTs inverted from MODIS and AMSR-E data, respectively. The result showed that the missing LSTs in cloudy pixels were filled completely, and the availability of merged LSTs reaches 100%. Because the depths of LST and soil temperature measurements are different, before validating the merged LST, the station measurements were calibrated with an empirical equation between MODIS LST and 0~5 cm soil temperatures. The results showed that the accuracy of merged LSTs increased with the increasing quantity of utilized data, and as the availability of utilized data increased from 25.2% to 91.4%, the RMSEs of the merged data decreased from 4.53 °C to 2.31 °C. In addition, compared with the gap-filling method in which MODIS LST gaps were filled with AMSR-E LST directly, the merged LSTs from the BME method showed better spatial continuity. The different penetration depths of TIR and PMWs may influence fusion performance and still require further studies.

  10. Mapping the Global Potential Geographical Distribution of Black Locust (Robinia Pseudoacacia L.) Using Herbarium Data and a Maximum Entropy Model

    Directory of Open Access Journals (Sweden)

    Guoqing Li

    2014-11-01

    Full Text Available Black locust (Robinia pseudoacacia L.) is a tree species of high economic and ecological value, but is also considered to be highly invasive. Understanding the global potential distribution and ecological characteristics of this species is a prerequisite for its practical exploitation as a resource. Here, maximum entropy modeling (MaxEnt) was used to simulate the potential distribution of this species around the world, and the dominant climatic factors affecting its distribution were selected by using a jackknife test and the regularized gain change during each iteration of the training algorithm. The results show that the MaxEnt model performs better than random, with an average test AUC value of 0.9165 (±0.0088). The coldness index, annual mean temperature and warmth index were the most important climatic factors affecting the species distribution, explaining 65.79% of the variability in the geographical distribution. Species response curves showed unimodal relationships with the annual mean temperature and warmth index, whereas there was a linear relationship with the coldness index. The dominant climatic conditions in the core of the black locust distribution are a coldness index of −9.8 °C–0 °C, an annual mean temperature of 5.8 °C–14.5 °C, a warmth index of 66 °C–168 °C and an annual precipitation of 508–1867 mm. The potential distribution of black locust is located mainly in the United States, the United Kingdom, Germany, France, the Netherlands, Belgium, Italy, Switzerland, Australia, New Zealand, China, Japan, South Korea, South Africa, Chile and Argentina. The predictive map of black locust, climatic thresholds and species response curves can provide globally applicable guidelines and valuable information for policymakers and planners involved in the introduction, planting and invasion control of this species around the world.

  11. Influence of nonlinearity of the phonon dispersion relation on wave velocities in the four-moment maximum entropy phonon hydrodynamics

    Science.gov (United States)

    Larecki, Wieslaw; Banach, Zbigniew

    2014-01-01

    This paper analyzes the propagation of the waves of weak discontinuity in a phonon gas described by the four-moment maximum entropy phonon hydrodynamics involving a nonlinear isotropic phonon dispersion relation. For the considered hyperbolic equations of phonon gas hydrodynamics, the eigenvalue problem is analyzed and the condition of genuine nonlinearity is discussed. The speed of the wave front propagating into the region in thermal equilibrium is first determined in terms of the integral formula dependent on the phonon dispersion relation and subsequently explicitly calculated for the Dubey dispersion-relation model: |k| = (ω/c)(1 + bω^2). The specification of the parameters c and b for sodium fluoride (NaF) and semimetallic bismuth (Bi) then makes it possible to compare the calculated dependence of the wave-front speed on the sample’s temperature with the empirical relations of Coleman and Newman (1988) describing for NaF and Bi the variation of the second-sound speed with temperature. It is demonstrated that the calculated temperature dependence of the wave-front speed resembles the empirical relation and that the parameters c and b obtained from fitting respectively the empirical relation and the original material parameters of Dubey (1973) are of the same order of magnitude, the difference being in the values of the numerical factors. It is also shown that the calculated temperature dependence is in good agreement with the predictions of Hardy and Jaswal’s theory (Hardy and Jaswal, 1971) on second-sound propagation. This suggests that the nonlinearity of a phonon dispersion relation should be taken into account in the theories aiming at the description of the wave-type phonon heat transport and that the Dubey nonlinear isotropic dispersion-relation model can be very useful for this purpose.

  12. Predicting the current and future potential distributions of lymphatic filariasis in Africa using maximum entropy ecological niche modelling.

    Directory of Open Access Journals (Sweden)

    Hannah Slater

    Full Text Available Modelling the spatial distributions of human parasite species is crucial to understanding the environmental determinants of infection as well as for guiding the planning of control programmes. Here, we use ecological niche modelling to map the current potential distribution of the macroparasitic disease, lymphatic filariasis (LF), in Africa, and to estimate how future changes in climate and population could affect its spread and burden across the continent. We used 508 community-specific infection presence data collated from the published literature in conjunction with five predictive environmental/climatic and demographic variables, and a maximum entropy niche modelling method to construct the first ecological niche maps describing the potential distribution and burden of LF in Africa. We also ran the best-fit model against climate projections made by the HADCM3 and CCCMA models for 2050 under the A2a and B2a scenarios to simulate the likely distribution of LF under future climate and population changes. We predict a broad geographic distribution of LF in Africa extending from the west to the east across the middle region of the continent, with high probabilities of occurrence in Western Africa, compared to large areas of medium probability interspersed with smaller areas of high probability in Central and Eastern Africa and in Madagascar. We uncovered complex relationships between predictor ecological niche variables and the probability of LF occurrence. We show for the first time that predicted climate change and population growth will expand both the range and risk of LF infection (and ultimately disease) in an endemic region. We estimate that populations at risk to LF may range between 543 and 804 million currently, and that this could rise to between 1.65 and 1.86 billion in the future depending on the climate scenario used and the thresholds applied to signify infection presence.
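The core of the maximum entropy niche modelling used in this record can be sketched in a few lines: find the Gibbs distribution over grid cells whose expected environmental features match the means observed at the presence sites. The grid, the two "climate layers", and the presence rule below are hypothetical stand-ins, not the study's data.

```python
import numpy as np

def fit_maxent(features, presence_idx, lr=0.1, steps=3000):
    """Fit p_i ∝ exp(lam·f_i) over cells so that feature expectations
    under p match the empirical means at the presence cells."""
    target = features[presence_idx].mean(axis=0)
    lam = np.zeros(features.shape[1])
    for _ in range(steps):
        logits = features @ lam
        p = np.exp(logits - logits.max())
        p /= p.sum()
        lam += lr * (target - p @ features)   # moment-matching gradient
    return lam, p

rng = np.random.default_rng(0)
env = rng.normal(size=(500, 2))               # two hypothetical climate layers
presence = np.where(env[:, 0] > 1.0)[0]       # presences where layer 0 is high
lam, p = fit_maxent(env, presence)
# p now assigns high suitability to cells resembling the presence sites
```

Because the objective is concave, the simple gradient ascent above converges to the unique moment-matching distribution.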

  13. Using maximum entropy to predict suitable habitat for the endangered dwarf wedgemussel in the Maryland Coastal Plain

    Science.gov (United States)

    Campbell, Cara; Hilderbrand, Robert H.

    2017-01-01

    Species distribution modelling can be useful for the conservation of rare and endangered species. Freshwater mussel declines have thinned species ranges producing spatially fragmented distributions across large areas. Spatial fragmentation in combination with a complex life history and heterogeneous environment makes predictive modelling difficult. A machine learning approach (maximum entropy) was used to model occurrences and suitable habitat for the federally endangered dwarf wedgemussel, Alasmidonta heterodon, in Maryland's Coastal Plain catchments. Landscape-scale predictors (e.g. land cover, land use, soil characteristics, geology, flow characteristics, and climate) were used to predict the suitability of individual stream segments for A. heterodon. The best model contained variables at three scales: minimum elevation (segment scale), percentage Tertiary deposits, low intensity development, and woody wetlands (sub-catchment), and percentage low intensity development, pasture/hay agriculture, and average depth to the water table (catchment). Despite a very small sample size owing to the rarity of A. heterodon, cross-validated prediction accuracy was 91%. Most predicted suitable segments occur in catchments not known to contain A. heterodon, which provides opportunities for new discoveries or population restoration. These model predictions can guide surveys toward the streams with the best chance of containing the species or, alternatively, away from those streams with little chance of containing A. heterodon. Developed reaches had low predicted suitability for A. heterodon in the Coastal Plain. Urban and exurban sprawl continues to modify stream ecosystems in the region, underscoring the need to preserve existing populations and to discover and protect new populations.

  14. Local Stereo Matching Based on Information Entropy of Image

    Science.gov (United States)

    Geng, Yingnan

    2016-09-01

    Adaptive support-window algorithm is one of the simplest local algorithms for stereo matching. An important problem for the adaptive support-window algorithm is to determine the appropriate support-window size, which is always hard to do and limits the validity of the algorithm. An appropriate support-window size must be selected adaptively based on image features. In this paper, the information entropy of an image is defined for stereo matching in the RGB vector space. Based on the adaptive support-window, a new support-window selection algorithm is proposed, which uses the information entropy of the image to quantify image features such as illumination, color, and the number of objects contained in an image. Experimental results evaluated on the Middlebury stereo benchmark show that our algorithm outperforms the conventional adaptive support-window algorithms.
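As a minimal illustration of the quantity this record relies on, here is the Shannon entropy of a grey-level histogram; the paper's exact RGB vector-space definition is not reproduced here, this is only the standard scalar version:

```python
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of the grey-level histogram of an image patch —
    the kind of feature used to pick a support-window size adaptively."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

flat = np.full((64, 64), 128)                               # uniform patch
noisy = np.random.default_rng(1).integers(0, 256, (64, 64)) # textured patch
# a flat patch has zero entropy; a textured one carries close to 8 bits
print(image_entropy(flat), image_entropy(noisy))
```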

  15. Improvement of the relative entropy based protein folding method

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The "relative entropy" has been used as a minimization function to predict the tertiary structure of a protein backbone, and good results have been obtained. However, in our previous work, the ensemble average of the contact potential was estimated by an approximate calculation. In order to improve the theoretical integrity of the relative-entropy-based method, a new theoretical calculation method of the ensemble average of the contact potential was presented in this work, which is based on the thermodynamic perturbation theory. Tests of the improved algorithm were performed on twelve small proteins. The root mean square deviations of the predicted versus the native structures from Protein Data Bank range from 0.40 to 0.60 nm. Compared with the previous approximate values, the average prediction accuracy is improved by 0.04 nm.

  16. Improvement of the relative entropy based protein folding method

    Institute of Scientific and Technical Information of China (English)

    QI LiSheng; SU JiGuo; CHEN WeiZu; WANG CunXin

    2009-01-01

    The "relative entropy" has been used as a minimization function to predict the tertiary structure of a protein backbone, and good results have been obtained. However, in our previous work, the ensemble average of the contact potential was estimated by an approximate calculation. In order to improve the theoretical integrity of the relative-entropy-based method, a new theoretical calculation method of the ensemble average of the contact potential was presented in this work, which is based on the thermodynamic perturbation theory. Tests of the improved algorithm were performed on twelve small proteins. The root mean square deviations of the predicted versus the native structures from Protein Data Bank range from 0.40 to 0.60 nm. Compared with the previous approximate values, the average prediction accuracy is improved by 0.04 nm.

  17. Geometrical Interpretation of Shannon's Entropy Based on the Born Rule

    CERN Document Server

    Jankovic, Marko V

    2009-01-01

    In this paper we will analyze discrete probability distributions in which the probabilities of particular outcomes of some experiment (microstates) can be represented by ratios of natural numbers (in other words, probabilities are represented by digital numbers of finite representation length). We will introduce several results that are based on the recently proposed JoyStick Probability Selector, which represents a geometrical interpretation of probability based on the Born rule. The terms generic space and generic dimension of a discrete distribution, as well as effective dimension, will be introduced. It will be shown how this simple geometric representation can lead to optimal code-length coding of a sequence of signals. Then, we will give a new geometrical interpretation of the Shannon entropy of a discrete distribution. We will suggest that the Shannon entropy represents the logarithm of the effective dimension of the distribution. Proposed geometrical interpretation of the Shannon ...
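The claim that Shannon entropy is the logarithm of an effective dimension is easy to check numerically. A sketch, using base-2 logarithms so the effective dimension is 2^H:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def effective_dimension(p):
    """2**H(p): the number of equally likely outcomes that would produce
    the same entropy — the 'effective dimension' reading of H."""
    return 2 ** shannon_entropy(p)

print(effective_dimension([0.25] * 4))         # uniform over 4 outcomes -> 4.0
print(effective_dimension([0.5, 0.25, 0.25]))  # lies between 2 and 3
```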

  18. Stochastic modeling and control system designs of the NASA/MSFC Ground Facility for large space structures: The maximum entropy/optimal projection approach

    Science.gov (United States)

    Hsia, Wei-Shen

    1986-01-01

    In the Control Systems Division of the Systems Dynamics Laboratory of the NASA/MSFC, a Ground Facility (GF), in which the dynamics and control system concepts being considered for Large Space Structures (LSS) applications can be verified, was designed and built. One of the important aspects of the GF is to design an analytical model which will be as close to experimental data as possible so that a feasible control law can be generated. Using Hyland's Maximum Entropy/Optimal Projection Approach, a procedure was developed in which the maximum entropy principle is used for stochastic modeling and the optimal projection technique is used for a reduced-order dynamic compensator design for a high-order plant.

  19. Software Metrics Evaluation Based on Entropy

    CERN Document Server

    Selvarani, R; Ramachandran, Muthu; Prasad, Kamakshi

    2010-01-01

    Software engineering activities in industry have come a long way, with various improvements brought into the stages of the software development life cycle. The complexity of modern software, commercial constraints, and the expectation of high-quality products demand accurate fault prediction based on OO design metrics at the class level in the early stages of software development. The object-oriented class metrics are used as quality predictors throughout the OO software development life cycle, even when a highly iterative, incremental model or agile software process is employed. Recent research has shown that some of the OO design metrics are useful for predicting the fault-proneness of classes. In this paper the empirical validation of a set of metrics proposed by Chidamber and Kemerer is performed to assess their ability to predict software quality in terms of fault proneness and degradation. We have also proposed the design complexity of object-oriented software with Weighted Methods per Class m...

  20. Drought frequency analysis using stochastic simulation with maximum entropy model%基于最大熵分布模拟的干旱频率分析

    Institute of Scientific and Technical Information of China (English)

    张明; 金菊良; 王国庆; 周润娟

    2013-01-01

    From a simulated annual runoff series 10,000 years long for this station, the model predicts a probability of 2.6% and a return period of 203 a for a severe 12 a drought event. Stochastic simulation based on the maximum entropy model, which does not rely on distributional assumptions and has better applicability, could be widely used in frequency analysis for hydrological and water resources systems.
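The workflow the abstract describes — fit a maximum entropy distribution to an observed record, simulate a very long series, and read off an empirical return period — can be sketched as follows. Given only a mean and a variance, the maximum entropy distribution is Gaussian; the record length, runoff statistics, and drought threshold below are all hypothetical, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(42)
observed = rng.normal(500.0, 120.0, size=60)       # hypothetical 60-yr record

# moment constraints -> MaxEnt distribution (normal for mean+variance only)
mu, sigma = observed.mean(), observed.std(ddof=1)
simulated = rng.normal(mu, sigma, size=10_000)     # 10,000-yr synthetic series

threshold = mu - 1.5 * sigma                       # hypothetical drought level
p_exceed = (simulated < threshold).mean()          # empirical probability
return_period = 1.0 / p_exceed                     # in years
print(round(p_exceed, 3), round(return_period, 1))
```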

  1. BIOSMILE: A semantic role labeling system for biomedical verbs using a maximum-entropy model with automatically generated template features

    Directory of Open Access Journals (Sweden)

    Tsai Richard

    2007-09-01

    Full Text Available Abstract Background Bioinformatics tools for automatic processing of biomedical literature are invaluable for both the design and interpretation of large-scale experiments. Many information extraction (IE) systems that incorporate natural language processing (NLP) techniques have thus been developed for use in the biomedical field. A key IE task in this field is the extraction of biomedical relations, such as protein-protein and gene-disease interactions. However, most biomedical relation extraction systems usually ignore adverbial and prepositional phrases and words identifying location, manner, timing, and condition, which are essential for describing biomedical relations. Semantic role labeling (SRL) is a natural language processing technique that identifies the semantic roles of these words or phrases in sentences and expresses them as predicate-argument structures. We construct a biomedical SRL system called BIOSMILE that uses a maximum entropy (ME) machine-learning model to extract biomedical relations. BIOSMILE is trained on BioProp, our semi-automatic, annotated biomedical proposition bank. Currently, we are focusing on 30 biomedical verbs that are frequently used or considered important for describing molecular events. Results To evaluate the performance of BIOSMILE, we conducted two experiments to (1) compare the performance of SRL systems trained on newswire and biomedical corpora; and (2) examine the effects of using biomedical-specific features. The experimental results show that using BioProp improves the F-score of the SRL system by 21.45% over an SRL system that uses a newswire corpus. It is noteworthy that adding automatically generated template features improves the overall F-score by a further 0.52%. Specifically, ArgM-LOC, ArgM-MNR, and Arg2 achieve statistically significant performance improvements of 3.33%, 2.27%, and 1.44%, respectively.
Conclusion We demonstrate the necessity of using a biomedical proposition bank for training

  2. Predicting the spatiotemporal distributions of marine fish species utilizing earth system data in a maximum entropy modeling framework

    Science.gov (United States)

    Wang, L.; Kerr, L. A.; Bridger, E.

    2016-12-01

    Changes in species distributions have been widely associated with climate change. Understanding how ocean conditions influence marine fish distributions is critical for elucidating the role of climate in ecosystem change and forecasting how fish may be distributed in the future. Species distribution models (SDMs) can enable estimation of the likelihood of encountering species in space or time as a function of environmental conditions. Traditional SDMs are applied to scientific-survey data that include both presences and absences. Maximum entropy (MaxEnt) models are promising tools as they can be applied to presence-only data, such as those collected from fisheries or citizen science programs. We used MaxEnt to relate the occurrence records of marine fish species (e.g. Atlantic herring, Atlantic mackerel, and butterfish) from the NOAA Northeast Fisheries Observer Program to environmental conditions. Environmental variables from earth system data, such as sea surface temperature (SST), sea bottom temperature (SBT), Chlorophyll-a, bathymetry, the North Atlantic oscillation (NAO), and the Atlantic multidecadal oscillation (AMO), were matched with species occurrences for MaxEnt modeling of the fish distributions in the Northeast Shelf area. We developed habitat suitability maps for these species, and assessed the relative influence of environmental factors on their distributions. Overall, SST and Chlorophyll-a had the greatest influence on their monthly distributions, with bathymetry and SBT having moderate influence and the climate indices (NAO and AMO) having little influence. Across months, Atlantic herring distribution was most related to the SST 10th percentile, and Atlantic mackerel and butterfish distributions were most related to the previous month's SST. The fish distributions were most affected by the previous month's Chlorophyll-a in summer months, which may indirectly indicate the cumulative impact of primary productivity.
Results highlighted the importance of spatial and temporal scales when using

  3. Information Entropy- and Average-Based High-Resolution Digital Storage Oscilloscope

    Directory of Open Access Journals (Sweden)

    Jun Jiang

    2014-01-01

    Full Text Available Vertical resolution is an essential indicator of a digital storage oscilloscope (DSO), and the key to improving resolution is to increase the number of digitizing bits and to lower noise. Averaging is a typical method to improve the signal-to-noise ratio (SNR) and the effective number of bits (ENOB). The existing averaging algorithm is apt to be restricted by the repetitiveness of the signal and to be influenced by gross error in quantization, and therefore its effect on restricting noise and improving resolution is limited. An information entropy-based data fusion algorithm combined with average-based decimation filtering, proceeding from an improved averaging algorithm and relevant theories of information entropy, is proposed in this paper to improve the resolution of the oscilloscope. For a single acquired signal, resolution is improved by eliminating gross quantization error, utilizing the maximum entropy of the sample data, with further noise filtering via average-based decimation after data fusion of the effective sample data under the premise of oversampling. No subjective assumptions or constraints are imposed on the signal under test in the whole process, and there is no impact on the analog bandwidth of the oscilloscope at the actual sampling rate.
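The baseline that this record improves on — plain acquisition averaging — is easy to demonstrate: quantize a repetitive signal with additive noise, average N acquisitions, and the residual error shrinks roughly as 1/√N. The signal, noise level, and 8-bit quantizer below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 1000, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t)               # repetitive test signal

def acquire(n_acq, noise_std=0.05, bits=8):
    """Average n_acq noisy, quantized acquisitions of the same signal."""
    step = 2.0 / (2 ** bits)                     # 8-bit quantizer over [-1, 1]
    acqs = [np.round((signal + rng.normal(0, noise_std, t.size)) / step) * step
            for _ in range(n_acq)]
    return np.mean(acqs, axis=0)

err1 = np.std(acquire(1) - signal)
err64 = np.std(acquire(64) - signal)
print(err1, err64)   # the 64-average residual error is several times smaller
```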

  4. Information entropy-based classification of triterpenoids and steroids from Ganoderma.

    Science.gov (United States)

    Castellano, Gloria; Torrens, Francisco

    2015-08-01

    A set of 71 triterpenoid and steroid compounds from Ganoderma was periodically classified using a procedure based on information entropy with artificial intelligence. Six features were used in hierarchical order to classify the triterpenoids and steroids structurally. The phytochemicals belonging to the same group in the periodic table present similar antioxidant activity, and those compounds belonging to the same period exhibit maximum resemblance. The periodic classification is related to the experimental bioactivity and antioxidant potency data that are available in the literature: a steroid with a three-ketone group conjugated with two carbon-carbon double bonds on the right side of the periodic table exhibits the greatest antioxidant activity.

  5. Application Research on GERT Network Model in Real Estate Market: Based on Parameter Configuration by Maximum Entropy%房地产市场GERT网络模型应用研究——基于极大熵参数配置

    Institute of Scientific and Technical Information of China (English)

    王传会; 方志耕; 公维凤

    2013-01-01

    This paper constructs a GERT network model with an extracorporeal feedback loop, using the maximum entropy method for parameter configuration, and studies the functional relations, operation rules, and determination methods of the parameters involved in investment decisions. An analytic algorithm for solving this GERT network model is then provided. The model is applied to the real estate market to study the investment decision flow system of its participants. The results show that this GERT network model fully reflects the mutual influence relationships among sectors within the real estate market, and between real estate market sectors and extracorporeal environment sectors, yielding scientifically sound quantitative conclusions.

  6. Maximum-entropy mobility spectrum of two-dimensional hole gas in strained Si1-xGex/Si heterostructures

    CERN Document Server

    Kiatgamolchai, S

    2000-01-01

    γ has a bowl shape with the minimum at x ≈ 0.25-0.3. These characteristics suggest a possible influence of alloy-disorder scattering. The mobilities and activation energies of the carriers in the boron-doped cap vary between samples, and this is believed to be due to a boron spike near the Si/Si-substrate interface in some samples. The source of the electron-like carrier is presently unknown. Magnetotransport properties of modulation-doped p-type Si1-xGex/Si and Si1-xGex/Si1-yGey heterostructures were studied, in the magnetic field range 0-12 T and in the temperature range 0.35-300 K. The experimental data within the classical regime have been analysed by mobility spectrum analysis, in order to separate the influences of different parallel conduction paths. A new method of mobility spectrum analysis has been developed based on the concept of maximum entropy, and this computation has been shown to overcome several drawbacks or limita...

  7. A coupled force-restore model of surface temperature and soil moisture using the maximum entropy production model of heat fluxes

    Science.gov (United States)

    Huang, S.-Y.; Wang, J.

    2016-07-01

    A coupled force-restore model of surface soil temperature and moisture (FRMEP) is formulated by incorporating the maximum entropy production model of surface heat fluxes and including the gravitational drainage term. The FRMEP model, driven by surface net radiation and precipitation, is independent of near-surface atmospheric variables, with reduced sensitivity to the uncertainties of model input and parameters compared to the classical force-restore models (FRMs). The FRMEP model was evaluated using observations from two field experiments with contrasting soil moisture conditions. The modeling errors of the FRMEP-predicted surface temperature and soil moisture are lower than those of the classical FRMs forced by observed or bulk-formula-based surface heat fluxes (bias 1-2°C versus ~4°C, 0.02 m³ m⁻³ versus 0.05 m³ m⁻³). The diurnal variations of surface temperature, soil moisture, and surface heat fluxes are well captured by the FRMEP model, as measured by the high correlations between the model predictions and observations (r ≥ 0.84). Our analysis suggests that the drainage term cannot be neglected under wet soil conditions. A 1 year simulation indicates that the FRMEP model captures the seasonal variation of surface temperature and soil moisture with biases less than 2°C and 0.01 m³ m⁻³ and correlation coefficients of 0.93 and 0.9 with observations, respectively.
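The force-restore idea underlying this record can be sketched with a one-line ODE: surface temperature is "forced" by the surface heat flux and "restored" toward a deep-soil value at the diurnal frequency. This is the generic classical form, not the paper's FRMEP model, and every parameter value below is an illustrative assumption.

```python
import numpy as np

omega = 2 * np.pi / 86400.0     # diurnal angular frequency (1/s)
c = 1.0e6                       # effective surface thermal inertia (J m-2 K-1)
T_deep = 290.0                  # deep-soil restoring temperature (K)

dt = 60.0                       # 1-minute Euler steps over two days
n = int(2 * 86400 / dt)
T = np.empty(n)
T[0] = T_deep
for i in range(1, n):
    # idealized daytime-only forcing flux (W m-2), zero at night
    G = 300.0 * max(np.sin(omega * i * dt), 0.0)
    # force-restore equation: dT/dt = (2/c)*G - omega*(T - T_deep)
    T[i] = T[i - 1] + dt * ((2.0 / c) * G - omega * (T[i - 1] - T_deep))
print(T.min(), T.max())   # a few-kelvin diurnal cycle above T_deep
```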

  8. Stochastic model of the NASA/MSFC ground facility for large space structures with uncertain parameters: The maximum entropy approach, part 2

    Science.gov (United States)

    Hsia, Wei Shen

    1989-01-01

    A validated technology data base is being developed in the areas of control/structures interaction, deployment dynamics, and system performance for Large Space Structures (LSS). A Ground Facility (GF), in which the dynamics and control systems being considered for LSS applications can be verified, was designed and built. One of the important aspects of the GF is to verify the analytical model for the control system design. The procedure is to describe the control system mathematically as well as possible, then to perform tests on the control system, and finally to factor those results into the mathematical model. The reduction of the order of a higher-order control plant was addressed. The computer program for the maximum entropy principle adopted in Hyland's MEOP method was improved and tested against a test problem, resulting in a very close match. Two methods of model reduction were examined: Wilson's model reduction method and Hyland's optimal projection (OP) method. Design of a computer program for Hyland's OP method was attempted. Due to the difficulty encountered at the stage where a special matrix factorization technique is needed in order to obtain the required projection matrix, the program was successful up to the finding of the Linear Quadratic Gaussian solution but not beyond. Numerical results along with computer programs which employed ORACLS are presented.

  9. Distribution of the Habitat Suitability of the Main Malaria Vector in French Guiana Using Maximum Entropy Modeling.

    Science.gov (United States)

    Moua, Yi; Roux, Emmanuel; Girod, Romain; Dusfour, Isabelle; de Thoisy, Benoit; Seyler, Frédérique; Briolant, Sébastien

    2016-12-22

    Malaria is an important health issue in French Guiana. Its principal mosquito vector in this region is Anopheles darlingi Root. Knowledge of the spatial distribution of this species is still very incomplete due to the extent of French Guiana and the difficulty of accessing most of the territory. Species distribution modeling based on the maximal entropy procedure was used to predict the spatial distribution of An. darlingi using 39 presence sites. The resulting model provided significantly high prediction performance (mean 10-fold cross-validated partial area under the curve equal to 1.11, with an omission error level of 20%, and continuous Boyce index equal to 0.42). The model also provided a habitat suitability map and environmental response curves in accordance with the known entomological situation. Several environmental characteristics that had a positive correlation with the presence of An. darlingi were highlighted: nonpermanent anthropogenic changes of the natural environment, the presence of roads and tracks, and opening of the forest. Some geomorphological landforms and high altitude landscapes appear to be unsuitable for An. darlingi. The species distribution modeling was able to reliably predict the distribution of suitable habitats for An. darlingi in French Guiana. Results allowed completion of the knowledge of the spatial distribution of the principal malaria vector in this Amazonian region, and identification of the main factors that favor its presence. They should contribute to the definition of a necessary targeted vector control strategy in a malaria pre-elimination stage, and allow extrapolation of the acquired knowledge to other Amazonian or malaria-endemic contexts.

  10. 基于最大熵值法生态位模型(Maxent)的三种实蝇潜在适生性分布预测%Maximum entropy niche-based modeling (Maxent) of potential geographical distributions of the fruit flies Dacus bivittatus, D. ciliatus and D. vertebratus (Diptera: Tephritidae)

    Institute of Scientific and Technical Information of China (English)

    李白尼; 魏武; 马骏; 张润杰

    2009-01-01

    In order to predict and analyse the potential geographical distributions of three important quarantine invasive pests, Dacus bivittatus, D. ciliatus and D. vertebratus, three ecological niche modeling techniques, BIOCLIM, DOMAIN and maximum entropy niche-based modeling (Maxent), were implemented using distribution records of the three fruit fly species and a set of environmental predictor variables. Differences in prediction performance of the three models with thresholds were observed. An evaluation using independent records of D. bivittatus showed that Maxent offers more accurate predictions than the two other models based on three measures: ROC/AUC, Kappa, and TSS. Prediction outcomes made by Maxent revealed that the three fruit fly species have broadly similar potential ranges in Central America, South America, Southeast Asia, and coastal areas of Australia in general. D. ciliatus has the comparatively widest potential range among the three species, including coastal areas of the Mediterranean Sea, Saudi Arabia, Yemen, Oman and southern Iran, suggesting that it may be tolerant of the widest range of climatic conditions among the three species. In China, large areas of Yunnan and Hainan are very habitable for all three fruit fly species, and the southern part of Guangdong and Taiwan are also habitable areas. D. ciliatus has the widest potential distribution area, with the southern parts of Sichuan, Guizhou and Tibet plus the coastal areas of southern China all being suitable areas. The risk of the three fruit fly species permanently establishing in Guangdong if introduced exists but is low. Jackknife analysis revealed that temperature and its variation have a comparatively significant influence on the distribution patterns of the three fruit fly species at both global and regional scales.

  11. Horton Ratios Link Self-Similarity with Maximum Entropy of Eco-Geomorphological Properties in Stream Networks

    Directory of Open Access Journals (Sweden)

    Bruce T. Milne

    2017-05-01

    Full Text Available Stream networks are branched structures wherein water and energy move between land and atmosphere, modulated by evapotranspiration and its interaction with the gravitational dissipation of potential energy as runoff. These actions vary among climates characterized by Budyko theory, yet have not been integrated with Horton scaling, the ubiquitous pattern of eco-hydrological variation among Strahler streams that populate river basins. From Budyko theory, we reveal optimum entropy coincident with high biodiversity. Basins on either side of optimum respond in opposite ways to precipitation, which we evaluated for the classic Hubbard Brook experiment in New Hampshire and for the Whitewater River basin in Kansas. We demonstrate that Horton ratios are equivalent to Lagrange multipliers used in the extremum function leading to Shannon information entropy being maximal, subject to constraints. Properties of stream networks vary with constraints and inter-annual variation in water balance that challenge vegetation to match expected resource supply throughout the network. The entropy-Horton framework informs questions of biodiversity, resilience to perturbations in water supply, changes in potential evapotranspiration, and land use changes that move ecosystems away from optimal entropy with concomitant loss of productivity and biodiversity.
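The paper's central equivalence — Horton ratios as Lagrange multipliers — rests on the standard constrained maximization of Shannon entropy. A sketch of that setup, with generic constraints f_k rather than the paper's specific eco-hydrological ones:

```latex
% maximize Shannon entropy subject to K moment constraints:
\[
  \max_{p}\; H = -\sum_i p_i\ln p_i
  \quad\text{s.t.}\quad \sum_i p_i = 1,\qquad \sum_i p_i f_k(i) = F_k,
\]
% stationarity of the Lagrangian gives the exponential-family solution
\[
  p_i = \frac{1}{Z(\lambda)}\exp\!\Big(-\sum_{k=1}^{K}\lambda_k f_k(i)\Big),
\]
% where the multipliers lambda_k play the role the paper identifies
% with the Horton ratios.
```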

  12. A Note on k-Limited Maximum Base

    Institute of Scientific and Technical Information of China (English)

    Yang Ruishun; Yang Xiaowei

    2006-01-01

    The problem of k-limited maximum base was specialized into two particular cases; that is, the subset D of the problem of k-limited maximum base was taken to be an independent set and a circuit of the matroid, respectively. It was proved that under these circumstances the collections of k-limited bases satisfy the base axioms. A new matroid was thereby determined, and the problem of k-limited maximum base was transformed into the problem of finding a maximum base of this new matroid. For this problem, two algorithms, which are in essence greedy algorithms based on the former matroid, were presented for the two special cases of k-limited maximum base. They were proved to be correct and more efficient than the algorithm presented by Ma Zhongfan in terms of algorithmic complexity.
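The greedy approach the record relies on can be sketched for any matroid given by an independence oracle: sort by weight and add every element that keeps the set independent. Here the oracle encodes a uniform matroid of rank k (independent = size ≤ k), one simple way to model a "k-limited" constraint; this is an illustrative assumption, not the paper's exact construction.

```python
def greedy_max_base(elements, weight, independent):
    """Greedy maximum-weight base: correct for any matroid whose independent
    sets are recognized by the `independent` oracle."""
    base = []
    for e in sorted(elements, key=weight, reverse=True):
        if independent(base + [e]):
            base.append(e)
    return base

k = 3
weights = {'a': 5, 'b': 9, 'c': 2, 'd': 7, 'e': 4}
base = greedy_max_base(weights, weights.get, lambda s: len(s) <= k)
print(base)   # the k heaviest elements: ['b', 'd', 'a']
```

The matroid exchange property is exactly what guarantees that this greedy choice is globally optimal; for a non-matroid constraint the same loop can fail.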

  13. Population distribution of flexible molecules from maximum entropy analysis using different priors as background information: application to the Φ,Ψ-conformational space of the α-(1→2)-linked mannose disaccharide present in N- and O-linked glycoproteins.

    Science.gov (United States)

    Säwén, Elin; Massad, Tariq; Landersjö, Clas; Damberg, Peter; Widmalm, Göran

    2010-08-21

    The conformational space available to the flexible molecule α-D-Manp-(1→2)-α-D-Manp-OMe, a model for the α-(1→2)-linked mannose disaccharide in N- or O-linked glycoproteins, is determined using experimental data and molecular simulation combined with a maximum entropy approach that leads to a converged population distribution utilizing different input information. A database survey of the Protein Data Bank in which structures having the constituent disaccharide were retrieved resulted in an ensemble with >200 structures. Subsequent filtering removed erroneous structures and gave the database (DB) ensemble, having three classes of mannose-containing compounds, viz., N- and O-linked structures, and ligands to proteins. A molecular dynamics (MD) simulation of the disaccharide revealed a two-state equilibrium with a major and a minor conformational state, i.e., the MD ensemble. These two different conformational ensembles of the disaccharide were compared to measured experimental spectroscopic data for the molecule in water solution. However, neither of the two populations was compatible with experimental data from optical rotation, NMR ¹H,¹H cross-relaxation rates, or homo- and heteronuclear ³J couplings. The conformational distributions were subsequently used as background information to generate priors that were used in a maximum entropy analysis. The resulting posteriors, i.e., the population distributions after the application of the maximum entropy analysis, still showed notable deviations that were not anticipated based on the prior information. Therefore, reparameterization of homo- and heteronuclear Karplus relationships for the glycosidic torsion angles Φ and Ψ was carried out, in which the importance of electronegative substituents on the coupling pathway was deemed essential, resulting in four derived equations, two ³J(COCC) and two ³J(COCH), being different for the Φ and Ψ torsions, respectively. These Karplus relationships are denoted
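The Karplus relationships reparameterized in this record all share the standard three-term cosine form; the generic equation is shown below for reference, with the record's specific fitted coefficients not reproduced here:

```latex
% generic Karplus relationship for a vicinal coupling across a torsion theta
% (theta = Phi or Psi glycosidic torsion; A, B, C are fitted coefficients)
\[
  {}^{3}\!J(\theta) \;=\; A\cos^{2}\theta \;+\; B\cos\theta \;+\; C
\]
```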

  14. Network Entropy Based on Topology Configuration and Its Computation to Random Networks

    Institute of Scientific and Technical Information of China (English)

    LI Ji; WANG Bing-Hong; WANG Wen-Xu; ZHOU Tao

    2008-01-01

    A definition of network entropy is presented and, as an example, the relationship between the network entropy of the ER network model and the connection probability p as well as the total number of nodes N is discussed. The theoretical result and the simulation result based on the network entropy of the ER network agree well with each other. The results indicate that, unlike previously reported network entropies, the network entropy defined here differs markedly between different types of random networks and between networks with different numbers of nodes; it may therefore better portray the characteristics of complex networks. It is also pointed out that, with the aid of the network entropy defined here, the concepts of equilibrium and non-equilibrium networks can be introduced, together with a quantitative measure of a complex network's deviation from the equilibrium state.
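
    The dependence of a network entropy on p and N can be sketched with one common choice of definition, the Shannon entropy of the degree distribution (the paper's exact definition may differ). For an ER graph G(N, p) the degree of each node is Binomial(N - 1, p), so the entropy can be computed in closed form:

```python
from math import comb, log

def er_degree_entropy(N, p):
    """Shannon entropy (in nats) of the degree distribution of an
    Erdos-Renyi random graph G(N, p): each node's degree is
    Binomial(N - 1, p). One plausible reading of a 'network entropy
    based on topology configuration'."""
    S = 0.0
    for k in range(N):
        pk = comb(N - 1, k) * p**k * (1 - p)**(N - 1 - k)
        if pk > 0.0:
            S -= pk * log(pk)
    return S
```

    By symmetry the entropy is equal at p and 1 - p, and it vanishes at p = 0, where the degree sequence is deterministic.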

  15. Entropy: From Thermodynamics to Hydrology

    Directory of Open Access Journals (Sweden)

    Demetris Koutsoyiannis

    2014-02-01

    Full Text Available Some known results from statistical thermophysics as well as from hydrology are revisited from a different perspective, trying: (a) to unify the notion of entropy in thermodynamic and statistical/stochastic approaches to complex hydrological systems and (b) to show the power of entropy and the principle of maximum entropy in inference, both deductive and inductive. The capability for deductive reasoning is illustrated by deriving the law of phase-change transition of water (Clausius-Clapeyron) from scratch by maximizing entropy in a formal probabilistic framework. However, such deductive reasoning cannot work in more complex hydrological systems with diverse elements; yet the entropy maximization framework can help in inductive inference, necessarily based on data. Several examples of this type are provided in an attempt to link statistical thermophysics with hydrology through a unifying view of entropy.
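
    The deductive use of the maximum entropy principle mentioned above can be made concrete with the textbook example: for a non-negative variable with prescribed mean μ, maximizing the Boltzmann-Gibbs-Shannon entropy yields the exponential distribution.

```latex
\max_{p}\; S[p] = -\int_0^\infty p(x)\,\ln p(x)\,dx
\quad \text{subject to} \quad
\int_0^\infty p(x)\,dx = 1, \qquad \int_0^\infty x\,p(x)\,dx = \mu .
```

    Setting the functional derivative of the Lagrangian to zero gives \(\ln p(x) = -1 - \lambda_0 - \lambda_1 x\), i.e. \(p(x) = (1/\mu)\,e^{-x/\mu}\) once the multipliers are eliminated using the two constraints.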

  16. Multiscale Permutation Entropy Based Rolling Bearing Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Jinde Zheng

    2014-01-01

    Full Text Available A new rolling bearing fault diagnosis approach based on multiscale permutation entropy (MPE), Laplacian score (LS), and support vector machines (SVMs) is proposed in this paper. Permutation entropy (PE) was recently proposed to measure the randomness and detect dynamical changes of time series. However, owing to the complexity of mechanical systems, the randomness and dynamic changes of the vibration signal exist at different scales. Thus, the definition of MPE is introduced and employed to extract the nonlinear fault characteristics from the bearing vibration signal at different scales. In addition, an SVM is utilized to accomplish the fault feature classification and complete the diagnostic procedure automatically. Meanwhile, to avoid a high-dimensional feature vector, the Laplacian score is used to refine the feature vector by ranking the features according to their importance and their correlation with the main fault information. Finally, the rolling bearing fault diagnosis method based on MPE, LS, and SVM is applied to the experimental data. The analysis results indicate that the proposed method can identify the fault categories effectively.
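
    PE and MPE follow a standard recipe: count Bandt-Pompe ordinal patterns for PE, and coarse-grain the series by non-overlapping means before computing PE at each scale for MPE. The sketch below is a minimal illustration, not the authors' code.

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, m=3, tau=1):
    """Normalized permutation entropy of series x with embedding
    dimension m and delay tau: the Shannon entropy of the ordinal
    (argsort) patterns, divided by its maximum log(m!)."""
    x = np.asarray(x, float)
    n = len(x) - (m - 1) * tau
    counts = {}
    for i in range(n):
        pattern = tuple(np.argsort(x[i : i + m * tau : tau]))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values()), float) / n
    return -np.sum(probs * np.log(probs)) / log(factorial(m))

def multiscale_pe(x, m=3, tau=1, scales=5):
    """MPE: PE of the coarse-grained series (non-overlapping means of
    length s) for each scale s = 1..scales."""
    x = np.asarray(x, float)
    out = []
    for s in range(1, scales + 1):
        ncg = len(x) // s
        cg = x[: ncg * s].reshape(ncg, s).mean(axis=1)
        out.append(permutation_entropy(cg, m, tau))
    return out
```

    A strictly monotonic series produces a single ordinal pattern and hence PE = 0, while i.i.d. noise makes all m! patterns equally likely and drives the normalized PE toward 1.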

  17. Accelerated echo planar J-resolved spectroscopic imaging in prostate cancer: a pilot validation of non-linear reconstruction using total variation and maximum entropy.

    Science.gov (United States)

    Nagarajan, Rajakumar; Iqbal, Zohaib; Burns, Brian; Wilson, Neil E; Sarma, Manoj K; Margolis, Daniel A; Reiter, Robert E; Raman, Steven S; Thomas, M Albert

    2015-11-01

    The overlap of metabolites is a major limitation in one-dimensional (1D) single-voxel MRS and multivoxel MRSI. By combining echo planar spectroscopic imaging (EPSI) with a two-dimensional (2D) J-resolved spectroscopic (JPRESS) sequence, 2D spectra can be recorded at multiple locations in a single slice of prostate using four-dimensional (4D) echo planar J-resolved spectroscopic imaging (EP-JRESI). The goal of the present work was to independently validate two different non-linear reconstruction methods for compressed sensing-based 4D EP-JRESI in prostate cancer (PCa): maximum entropy (MaxEnt) and total variation (TV). Twenty-two patients with PCa (mean age 63.8 years; range 46-79 years) were investigated in this study. A 4D non-uniformly undersampled (NUS) EP-JRESI sequence was implemented on a Siemens 3-T MRI scanner. The NUS data were reconstructed using the two non-linear reconstruction methods, MaxEnt and TV. With both methods, the following observations were made in cancerous compared with non-cancerous locations: (i) higher mean (choline + creatine)/citrate metabolite ratios; (ii) increased levels of (choline + creatine)/spermine and (choline + creatine)/myo-inositol; and (iii) decreased levels of (choline + creatine)/(glutamine + glutamate). We have shown that the 4D EP-JRESI sequence can be accelerated four-fold and that the data can be reliably reconstructed using the TV and MaxEnt methods. The total acquisition duration was less than 13 min, and several metabolites could be detected and quantified.
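
    As a toy illustration of the TV side of such a reconstruction (in 1D, far simpler than the paper's 4D non-uniformly undersampled problem), total variation regularization can be sketched as gradient descent on a smoothed TV objective; the parameters below are arbitrary and this is not the authors' algorithm:

```python
import numpy as np

def tv_denoise_1d(y, lam=0.5, step=0.02, iters=1000, eps=1e-3):
    """Gradient descent on the smoothed TV objective
        0.5 * ||x - y||^2 + lam * sum_i sqrt((x[i+1] - x[i])^2 + eps).
    The eps smoothing makes the objective differentiable; the step
    size is kept small enough for stability."""
    x = np.asarray(y, float).copy()
    for _ in range(iters):
        d = np.diff(x)
        g = d / np.sqrt(d * d + eps)   # smoothed subgradient of |x[i+1]-x[i]|
        grad = x - y
        grad[:-1] -= lam * g           # d/dx_i     of the TV term
        grad[1:] += lam * g            # d/dx_{i+1} of the TV term
        x -= step * grad
    return x
```

    Applied to a noisy piecewise-constant signal, the result keeps the step edge while suppressing the noise, i.e. the total variation of the output is well below that of the input.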

  18. Physical entropy, information entropy and their evolution equations

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Inspired by the evolution equation of the nonequilibrium statistical-physics entropy and the concise statistical formula for the entropy production rate, we develop a theory of dynamic information entropy and build a nonlinear evolution equation for the information entropy density varying in time and in state-variable space. Its mathematical form and physical meaning are similar to those of the evolution equation of the physical entropy: the time rate of change of the information entropy density arises jointly from drift, diffusion, and production. The concise statistical formula for the information entropy production rate is likewise similar to that for the physical entropy. Furthermore, we study the similarities and differences between physical entropy and information entropy and the possible unification of the two statistical entropies, and we discuss the relationship among the principle of entropy increase, the principle of equilibrium maximum entropy, and the principle of maximum information entropy, as well as their connection to the entropy evolution equation.
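
    A generic form of such an evolution equation, written here from the drift/diffusion/production decomposition stated in the abstract (the paper's exact notation may differ), is

```latex
\frac{\partial s(x,t)}{\partial t}
= -\,\nabla \cdot \big(u(x,t)\, s\big)
+ \nabla \cdot \big(D(x,t)\, \nabla s\big)
+ \sigma(x,t),
```

    where \(s\) is the information entropy density in the state-variable space \(x\), \(u\) a drift velocity, \(D\) a diffusion coefficient, and \(\sigma \ge 0\) the local entropy production rate.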

  19. Lossless Compression Performance of a Simple Counter-Based Entropy Coder

    OpenAIRE

    Armein Z. R. Langi

    2011-01-01

    This paper describes the performance of a simple counter-based entropy coder, as compared to other entropy coders, especially the Huffman coder. Lossless data compressors, such as Huffman and arithmetic coders, are designed to perform well over a wide range of data entropies. As a result, these coders require significant computational resources that can become the performance bottleneck of a compression implementation. In contrast, counter-based coders are designed to be optimal on a limited entro...
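
    For reference, the Huffman coder used as the baseline comparison can be sketched in a few lines; the construction below (repeatedly merge the two least-frequent subtrees) is the standard algorithm, and the example input is illustrative only.

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman code from a {symbol: frequency} mapping by
    repeatedly merging the two least-frequent subtrees. Returns a
    {symbol: bitstring} dict. A unique tie-break counter keeps heap
    comparisons away from the (unorderable) dicts."""
    if len(freqs) == 1:
        return {next(iter(freqs)): "0"}
    heap = [[f, i, {s: ""}] for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}   # left subtree
        merged.update({s: "1" + b for s, b in c2.items()})  # right subtree
        heapq.heappush(heap, [f1 + f2, tie, merged])
        tie += 1
    return heap[0][2]
```

    The resulting code is prefix-free, and its average length lands within one bit of the source entropy, as the noiseless coding theorem guarantees; that per-symbol optimality over arbitrary entropies is what the counter-based coder trades away for simplicity.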

  20. Maximum entropy method and charge flipping, a powerful combination to visualize the true nature of structural disorder from in situ X-ray powder diffraction data.

    Science.gov (United States)

    Samy, Ali; Dinnebier, Robert E; van Smaalen, Sander; Jansen, Martin

    2010-04-01

    In a systematic approach, the ability of the Maximum Entropy Method (MEM) to reconstruct the most probable electron density of highly disordered crystal structures from X-ray powder diffraction data was evaluated. As a case study, the ambient-temperature crystal structures of disordered alpha-Rb(2)[C(2)O(4)] and alpha-Rb(2)[CO(3)] and ordered delta-K(2)[C(2)O(4)] were investigated in detail with the aim of revealing the 'true' nature of the apparent disorder. Different combinations of F constraints (based on phased structure factors) and G constraints (based on structure-factor amplitudes) from different sources were applied in MEM calculations. In particular, a new combination of the MEM with the recently developed charge-flipping algorithm with histogram matching for powder diffraction data (pCF) was successfully introduced to avoid the inevitable bias of the phases of the structure-factor amplitudes by the Rietveld model. Completely ab initio electron-density distributions have been obtained with the MEM applied to a combination of structure-factor amplitudes from Le Bail fits with phases derived from pCF. All features of the crystal structures, in particular the disorder of the oxalate and carbonate anions and the displacements of the cations, are clearly resolved. This approach holds potential as a fast method of electron-density determination, even for highly disordered materials. All the MEM maps obtained in this work were compared with the MEM map derived from the best Rietveld-refined model. In general, the phased observed structure factors obtained from Rietveld refinement (applying F and G constraints) were found to give the closest description of the experimental data and thus lead to the most accurate image of the actual disorder.