Entropy-generated power and its efficiency
DEFF Research Database (Denmark)
Golubeva, N.; Imparato, A.; Esposito, M.
2013-01-01
We propose a simple model for a motor that generates mechanical motion by exploiting an entropic force arising from the topology of the underlying phase space. We show that the generation of mechanical forces in our system is surprisingly robust to local changes in kinetic and topological parameters. Furthermore, we find that the efficiency at maximum power may show discontinuities.
Analysis of entropy extraction efficiencies in random number generation systems
Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu
2016-05-01
Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
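To make the notion of "extracting random bits from a physical entropy source" concrete, one of the simplest classical extraction schemes is von Neumann debiasing, which removes bias from independent raw bits at the cost of throughput. This sketch is illustrative only and is not drawn from the schemes reviewed in the manuscript:

```python
def von_neumann_extract(bits):
    """Debias a raw bit stream: map the pair 01 -> 0, 10 -> 1,
    and discard 00/11 pairs. Output is unbiased if input bits are
    independent with a fixed (possibly unknown) bias."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out
```

Even for unbiased input, half the pairs are discarded and each surviving pair yields one bit, so the extraction efficiency is at most 25%, which is why more efficient extractors matter in practical RNG systems.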
Combining Experiments and Simulations Using the Maximum Entropy Principle
DEFF Research Database (Denmark)
Boomsma, Wouter; Ferkinghoff-Borg, Jesper; Lindorff-Larsen, Kresten
2014-01-01
A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained...
Combining experiments and simulations using the maximum entropy principle.
Directory of Open Access Journals (Sweden)
Wouter Boomsma
2014-02-01
Full Text Available A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing new theoretical and practical insights into the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges.
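The maximum-entropy correction described above has a well-known closed form: among all reweightings of an ensemble that reproduce an experimental average, the one closest (in KL divergence) to the original weights is exponential, w_i ∝ exp(λ f_i), with the multiplier λ fixed by the constraint. A minimal sketch under that standard result; the function name and the bisection solver are illustrative, not from the article:

```python
import math

def maxent_reweight(f, target, lam_lo=-50.0, lam_hi=50.0, tol=1e-10):
    """Given observable values f[i] for ensemble members (uniform prior)
    and an experimental target average, return maximum-entropy weights
    w_i proportional to exp(lam * f_i) matching the target.
    Assumes min(f) < target < max(f)."""
    def weighted_avg(lam):
        w = [math.exp(lam * fi) for fi in f]
        z = sum(w)
        return sum(wi * fi for wi, fi in zip(w, f)) / z
    # weighted_avg is monotone increasing in lam, so bisection works
    lo, hi = lam_lo, lam_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if weighted_avg(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * fi) for fi in f]
    z = sum(w)
    return [wi / z for wi in w]
```

In molecular-simulation practice the observable is an ensemble average of, e.g., a NOE or chemical shift predictor, and several constraints are handled with one multiplier each.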
Informational basis of sensory adaptation: entropy and single-spike efficiency in rat barrel cortex.
Adibi, Mehdi; Clifford, Colin W G; Arabzadeh, Ehsan
2013-09-11
We showed recently that exposure to whisker vibrations enhances coding efficiency in rat barrel cortex despite increasing correlations in variability (Adibi et al., 2013). Here, to understand how adaptation achieves this improvement in sensory representation, we decomposed the stimulus information carried in neuronal population activity into its fundamental components in the framework of information theory. In the context of sensory coding, these components are the entropy of the responses across the entire stimulus set (response entropy) and the entropy of the responses conditional on the stimulus (conditional response entropy). We found that adaptation decreased response entropy and conditional response entropy at both the level of single neurons and the pooled activity of neuronal populations. However, the net effect of adaptation was to increase the mutual information because the drop in the conditional entropy outweighed the drop in the response entropy. The information transmitted by a single spike also increased under adaptation. As population size increased, the information content of individual spikes declined but the relative improvement attributable to adaptation was maintained.
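The decomposition used above is the standard identity I(S;R) = H(R) − H(R|S): mutual information rises if the conditional response entropy drops faster than the response entropy. A plug-in estimate from (stimulus, response) samples can be sketched as follows; this is illustrative and omits the limited-sampling bias corrections that neural data analyses normally require:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(S;R) = H(R) - H(R|S) in bits,
    from a list of (stimulus, response) samples."""
    n = len(pairs)
    count_r = Counter(r for _, r in pairs)
    count_s = Counter(s for s, _ in pairs)
    count_sr = Counter(pairs)
    # response entropy H(R)
    h_r = -sum((c / n) * math.log2(c / n) for c in count_r.values())
    # conditional response entropy H(R|S) = -sum p(s,r) log2 p(r|s)
    h_r_given_s = -sum((c / n) * math.log2(c / count_s[s])
                       for (s, r), c in count_sr.items())
    return h_r - h_r_given_s
```

A perfectly informative code (each stimulus maps to a unique response) gives I equal to the stimulus entropy; an independent response gives I = 0.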
RNA Thermodynamic Structural Entropy.
Garcia-Martin, Juan Antonio; Clote, Peter
2015-01-01
Conformational entropy for atomic-level, three dimensional biomolecules is known experimentally to play an important role in protein-ligand discrimination, yet reliable computation of entropy remains a difficult problem. Here we describe the first two accurate and efficient algorithms to compute the conformational entropy for RNA secondary structures, with respect to the Turner energy model, where free energy parameters are determined from UV absorption experiments. An algorithm to compute the derivational entropy for RNA secondary structures had previously been introduced, using stochastic context free grammars (SCFGs). However, the numerical value of derivational entropy depends heavily on the chosen context free grammar and on the training set used to estimate rule probabilities. Using data from the Rfam database, we determine that both of our thermodynamic methods, which agree in numerical value, are substantially faster than the SCFG method. Thermodynamic structural entropy is much smaller than derivational entropy, and the correlation between length-normalized thermodynamic entropy and derivational entropy is moderately weak to poor. In applications, we plot the structural entropy as a function of temperature for known thermoswitches, such as the repression of heat shock gene expression (ROSE) element; we determine that the correlation between hammerhead ribozyme cleavage activity and total free energy is improved by including an additional free energy term arising from conformational entropy; and we plot the structural entropy of windows of the HIV-1 genome. Our software RNAentropy can compute structural entropy for any user-specified temperature, and supports both the Turner'99 and Turner'04 energy parameters. It follows that RNAentropy is state-of-the-art software to compute RNA secondary structure conformational entropy. Source code is available at https://github.com/clotelab/RNAentropy/; a full web server is available at http
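The quantity in question is the Shannon entropy of the Boltzmann ensemble of secondary structures, S = −Σ p(s) ln p(s) with p(s) = exp(−E(s)/RT)/Z. The algorithms above compute this in polynomial time by dynamic programming over the Turner model; the following brute-force sketch (names and the gas-constant convention are illustrative) works only for an explicitly enumerated list of structure free energies:

```python
import math

R_GAS = 0.0019872  # gas constant in kcal/(mol*K)

def boltzmann_entropy(energies, temp=310.15):
    """Shannon entropy -sum p ln p (in units of R) of a Boltzmann
    ensemble given a list of free energies in kcal/mol."""
    rt = R_GAS * temp
    weights = [math.exp(-e / rt) for e in energies]
    z = sum(weights)  # partition function over the enumerated structures
    probs = [w / z for w in weights]
    return -sum(p * math.log(p) for p in probs if p > 0)
```

Degenerate ensembles (all energies equal over k structures) give the maximum value ln k; a large energy gap between the ground state and the rest drives the entropy toward zero, which is the signature a thermoswitch plot exploits as temperature varies.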
RNA Thermodynamic Structural Entropy.
Directory of Open Access Journals (Sweden)
Juan Antonio Garcia-Martin
Full Text Available Conformational entropy for atomic-level, three dimensional biomolecules is known experimentally to play an important role in protein-ligand discrimination, yet reliable computation of entropy remains a difficult problem. Here we describe the first two accurate and efficient algorithms to compute the conformational entropy for RNA secondary structures, with respect to the Turner energy model, where free energy parameters are determined from UV absorption experiments. An algorithm to compute the derivational entropy for RNA secondary structures had previously been introduced, using stochastic context free grammars (SCFGs). However, the numerical value of derivational entropy depends heavily on the chosen context free grammar and on the training set used to estimate rule probabilities. Using data from the Rfam database, we determine that both of our thermodynamic methods, which agree in numerical value, are substantially faster than the SCFG method. Thermodynamic structural entropy is much smaller than derivational entropy, and the correlation between length-normalized thermodynamic entropy and derivational entropy is moderately weak to poor. In applications, we plot the structural entropy as a function of temperature for known thermoswitches, such as the repression of heat shock gene expression (ROSE) element; we determine that the correlation between hammerhead ribozyme cleavage activity and total free energy is improved by including an additional free energy term arising from conformational entropy; and we plot the structural entropy of windows of the HIV-1 genome. Our software RNAentropy can compute structural entropy for any user-specified temperature, and supports both the Turner'99 and Turner'04 energy parameters. It follows that RNAentropy is state-of-the-art software to compute RNA secondary structure conformational entropy. Source code is available at https://github.com/clotelab/RNAentropy/; a full web server is available at http
Ranking DMUs by Comparing DEA Cross-Efficiency Intervals Using Entropy Measures
Directory of Open Access Journals (Sweden)
Tim Lu
2016-12-01
Full Text Available Cross-efficiency evaluation, an extension of data envelopment analysis (DEA), can eliminate unrealistic weighting schemes and provide a ranking for decision-making units (DMUs). In the literature, the problem of uniquely determining input and output weights has received considerable attention. However, the problem of choosing between the aggressive (minimal) and benevolent (maximal) formulation for decision-making may still remain. In this paper, we develop a procedure to perform cross-efficiency evaluation without the need to make any specific choice of DEA weights. The proposed procedure takes the aggressive and benevolent formulations into account at the same time, so that the choice of DEA weights can be avoided. Consequently, a set of cross-efficiency intervals is obtained for each DMU. Entropy, a concept from information theory, is an effective tool for measuring uncertainty. We therefore use entropy to construct a numerical index for DMUs with cross-efficiency intervals. A mathematical program is proposed to find the optimal entropy values of DMUs for comparison. With the derived entropy values, we can rank DMUs accordingly. Two examples illustrate the effectiveness of the idea proposed in this paper.
Osterloh, Frank E
2014-10-02
The Shockley-Queisser analysis provides a theoretical limit for the maximum energy conversion efficiency of single junction photovoltaic cells. But besides the semiconductor bandgap no other semiconductor properties are considered in the analysis. Here, we show that the maximum conversion efficiency is limited further by the excited state entropy of the semiconductors. The entropy loss can be estimated with the modified Sackur-Tetrode equation as a function of the curvature of the bands, the degeneracy of states near the band edges, the illumination intensity, the temperature, and the band gap. The application of the second law of thermodynamics to semiconductors provides a simple explanation for the observed high performance of group IV, III-V, and II-VI materials with strong covalent bonding and for the lower efficiency of transition metal oxides containing weakly interacting metal d orbitals. The model also predicts efficient energy conversion with quantum confined and molecular structures in the presence of a light harvesting mechanism.
Efficient algorithms and implementations of entropy-based moment closures for rarefied gases
Energy Technology Data Exchange (ETDEWEB)
Schaerer, Roman Pascal, E-mail: schaerer@mathcces.rwth-aachen.de; Bansal, Pratyuksh; Torrilhon, Manuel
2017-07-01
We present efficient algorithms and implementations of the 35-moment system equipped with the maximum-entropy closure in the context of rarefied gases. While closures based on the principle of entropy maximization have been shown to yield very promising results for moderately rarefied gas flows, the computational cost of these closures is in general much higher than for closure theories with explicit closed-form expressions of the closing fluxes, such as Grad's classical closure. Following a similar approach as Garrett et al. (2015), we investigate efficient implementations of the computationally expensive numerical quadrature method used for the moment evaluations of the maximum-entropy distribution by exploiting its inherent fine-grained parallelism with the parallelism offered by multi-core processors and graphics cards. We show that using a single graphics card as an accelerator allows speed-ups of two orders of magnitude when compared to a serial CPU implementation. To accelerate the time-to-solution for steady-state problems, we propose a new semi-implicit time discretization scheme. The resulting nonlinear system of equations is solved with a Newton type method in the Lagrange multipliers of the dual optimization problem in order to reduce the computational cost. Additionally, fully explicit time-stepping schemes of first and second order accuracy are presented. We investigate the accuracy and efficiency of the numerical schemes for several numerical test cases, including a steady-state shock-structure problem.
Directory of Open Access Journals (Sweden)
Xiong Luo
2016-07-01
Full Text Available With the recent emergence of wireless sensor networks (WSNs) in the cloud computing environment, it is now possible to monitor and gather physical information via many sensor nodes to meet the requirements of cloud services. Generally, those sensor nodes collect data and send it to the sink node, where end-users can query all the information and run cloud applications. Currently, one of the main disadvantages of sensor nodes is their limited resources, notably memory for storage and power supply. To work around this limitation, it is necessary to develop an efficient data prediction method in WSNs. To serve this purpose, by reducing redundant data transmission between sensor nodes and the sink node while maintaining acceptable errors, this article proposes an entropy-based learning scheme for data prediction through the use of the kernel least mean square (KLMS) algorithm. The proposed scheme, called E-KLMS, develops a mechanism to keep the predicted data synchronized on both sides. Specifically, the kernel-based method adjusts the coefficients adaptively in accordance with every input, which achieves better performance with smaller prediction errors, while information entropy is employed to remove the data that may cause relatively large errors. E-KLMS can effectively solve the tradeoff between prediction accuracy and computational effort while greatly simplifying the training structure compared with some other data prediction approaches. Moreover, the kernel-based method and the entropy technique together ensure the prediction effect by both improving accuracy and reducing errors. Experiments with real data sets have been carried out to validate the efficiency and effectiveness of the E-KLMS learning scheme, and the results show the advantages of our method in prediction accuracy and computational time.
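The KLMS core that E-KLMS builds on is simple: each training sample adds a kernel center whose coefficient is the learning rate times the prediction error. This one-dimensional Gaussian-kernel sketch is illustrative only; it omits the entropy-based sample pruning and the sensor/sink synchronization mechanism that E-KLMS adds:

```python
import math

def klms_predict(centers, coeffs, x, sigma=1.0):
    """Prediction is a kernel expansion over stored centers."""
    return sum(a * math.exp(-(x - c) ** 2 / (2 * sigma ** 2))
               for a, c in zip(coeffs, centers))

def klms_update(centers, coeffs, x, y, lr=0.5, sigma=1.0):
    """One KLMS step: predict, then store the new input as a kernel
    center weighted by the learning rate times the error."""
    err = y - klms_predict(centers, coeffs, x, sigma)
    centers.append(x)
    coeffs.append(lr * err)
    return err
```

Because every sample adds a center, the model grows linearly with the data; the entropy criterion in E-KLMS is one way to keep only the informative samples.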
Liu, Jie; Liu, Chun; Han, Wei
2016-10-01
Urban soil pollution is evaluated using an efficient and simple algorithmic model referred to as the entropy-method-based Topsis (EMBT) model. The model focuses on pollution source position to enhance the ability to analyze sources of pollution accurately. To our knowledge, this is the first application of EMBT to urban soil pollution analysis. The pollution degree of a sampling point is efficiently calculated by the model through the pollution degree coefficient, which is obtained by first using the Topsis method to determine an evaluation value and then dividing the evaluation value of the sample point by the background value. The Kriging interpolation method combines the coordinates of the sampling points with the corresponding coefficients and facilitates the formation of a heavy metal distribution profile. A case study yields modeling results in accordance with the actual heavy metal pollution, demonstrating the accuracy and practicality of the EMBT model.
Directory of Open Access Journals (Sweden)
Corrado lo Storto
2016-01-01
Full Text Available In this paper, a method is proposed to compute a comprehensive index of the ecological efficiency of a city by combining the measurements provided by several Data Envelopment Analysis (DEA) cross-efficiency models using Shannon's entropy index. The DEA models include non-discretionary uncontrollable inputs, desirable outputs, and undesirable outputs. The method is implemented to compute the ecological efficiency of a sample of 116 Italian provincial capital cities in 2011 as a case study. Results emerging from the case study show that the proposed index has good discrimination power and performs better than the ranking provided by the Sole24Ore, which is generally used in Italy to conduct benchmarking studies. While the sustainability index proposed by the Sole24Ore uses a set of subjective weights to aggregate individual indicators, the adoption of the DEA-based method limits the subjectivity to the selection of the models. The ecological efficiency measurements generated for the Italian cities indicate that they perform very differently, and generally the largest cities in terms of population size achieve a higher efficiency score.
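A common way to combine several model scores with Shannon's entropy index is entropy weighting: columns (models) whose scores vary more across cities carry more information and get more weight. A minimal sketch of that standard technique; the exact aggregation in the paper may differ:

```python
import math

def entropy_weights(score_matrix):
    """Shannon-entropy weighting of model columns.
    Rows = cities, columns = models; all scores must be > 0.
    Returns one normalized weight per column."""
    n = len(score_matrix)          # number of cities
    m = len(score_matrix[0])       # number of DEA models
    k = 1.0 / math.log(n)          # normalizes entropy to [0, 1]
    weights = []
    for j in range(m):
        col = [row[j] for row in score_matrix]
        total = sum(col)
        probs = [x / total for x in col]
        e_j = -k * sum(p * math.log(p) for p in probs if p > 0)
        weights.append(1.0 - e_j)  # more dispersion -> more weight
    s = sum(weights)
    return [w / s for w in weights]
```

A column with identical scores for every city has maximum entropy and receives zero weight: it cannot discriminate between cities, which is exactly the discrimination-power argument the abstract makes.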
Zunino, Luciano; Bariviera, Aurelio F.; Guercio, M. Belén; Martinez, Lisana B.; Rosso, Osvaldo A.
2016-08-01
In this paper the permutation min-entropy has been implemented to unveil the presence of temporal structures in the daily values of European corporate bond indices from April 2001 to August 2015. More precisely, the evolution of the informational efficiency of the prices of fifteen sectorial indices has been carefully studied by estimating this information-theory-derived symbolic tool over a sliding time window. Such a dynamical analysis makes it possible to draw relevant conclusions about the effect that the 2008 credit crisis has had on the different European corporate bond sectors. It is found that the informational efficiency of some sectors, namely banks, financial services, insurance, and basic resources, has been strongly reduced by the financial crisis, whereas another set of sectors, comprising chemicals, automobiles, media, energy, construction, industrial goods & services, technology, and telecommunications, has only suffered a transitory loss of efficiency. Last but not least, the food & beverage, healthcare, and utilities sectors show behavior close to a random walk over practically the entire period of analysis, confirming a remarkable immunity to the 2008 financial crisis.
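The symbolic tool in question maps each short window of prices onto ordinal patterns and takes the min-entropy of their distribution, −log₂ of the most probable pattern's frequency. A minimal sketch under the usual Bandt-Pompe symbolization (embedding delay 1, ties broken by index order); the normalization and window handling here are illustrative:

```python
import math
from collections import Counter

def permutation_min_entropy(window, order=2):
    """Normalized permutation min-entropy of a time-series window:
    -log2(p_max) / log2(order!), where p_max is the frequency of the
    most probable ordinal pattern. Near 1 ~ random walk (efficient),
    near 0 ~ strongly deterministic structure."""
    counts = Counter(
        tuple(sorted(range(order), key=lambda k: window[i + k]))
        for i in range(len(window) - order + 1)
    )
    n = sum(counts.values())
    p_max = max(counts.values()) / n
    return -math.log2(p_max) / math.log2(math.factorial(order))
```

Sliding this estimator along the daily index values, as the paper does, turns each sector's price history into an efficiency-versus-time curve.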
Allnér, Olof; Foloppe, Nicolas; Nilsson, Lennart
2015-01-22
Molecular dynamics simulations of E. coli glutaredoxin1 in water have been performed to relate the dynamical parameters and entropy obtained in NMR relaxation experiments, with results extracted from simulated trajectory data. NMR relaxation is the most widely used experimental method to obtain data on dynamics of proteins, but it is limited to relatively short timescales and to motions of backbone amides or in some cases (13)C-H vectors. By relating the experimental data to the all-atom picture obtained in molecular dynamics simulations, valuable insights on the interpretation of the experiment can be gained. We have estimated the internal dynamics and their timescales by calculating the generalized order parameters (O) for different time windows. We then calculate the quasiharmonic entropy (S) and compare it to the entropy calculated from the NMR-derived generalized order parameter of the amide vectors. Special emphasis is put on characterizing dynamics that are not expressed through the motions of the amide group. The NMR and MD methods suffer from complementary limitations, with NMR being restricted to local vectors and dynamics on a timescale determined by the rotational diffusion of the solute, while in simulations, it may be difficult to obtain sufficient sampling to ensure convergence of the results. We also evaluate the amount of sampling obtained with molecular dynamics simulations and how it is affected by the length of individual simulations, by clustering of the sampled conformations. We find that two structural turns act as hinges, allowing the α helix between them to undergo large, long timescale motions that cannot be detected in the time window of the NMR dipolar relaxation experiments. We also show that the entropy obtained from the amide vector does not account for correlated motions of adjacent residues. Finally, we show that the sampling in a total of 100 ns molecular dynamics simulation can be increased by around 50%, by dividing the
Using Weighted Entropy to Rank Chemicals in Quantitative High Throughput Screening Experiments
Shockley, Keith R.
2014-01-01
Quantitative high throughput screening (qHTS) experiments can simultaneously produce concentration-response profiles for thousands of chemicals. In a typical qHTS study, a large chemical library is subjected to a primary screen in order to identify candidate hits for secondary screening, validation studies or prediction modeling. Different algorithms, usually based on the Hill equation logistic model, have been used to classify compounds as active or inactive (or inconclusive). However, observed concentration-response activity relationships may not adequately fit a sigmoidal curve. Furthermore, it is unclear how to prioritize chemicals for follow-up studies given the large uncertainties that often accompany parameter estimates from nonlinear models. Weighted Shannon entropy can address these concerns by ranking compounds according to profile-specific statistics derived from estimates of the probability mass distribution of response at the tested concentration levels. This strategy can be used to rank all tested chemicals in the absence of a pre-specified model structure or the approach can complement existing activity call algorithms by ranking the returned candidate hits. The weighted entropy approach was evaluated here using data simulated from the Hill equation model. The procedure was then applied to a chemical genomics profiling data set interrogating compounds for androgen receptor agonist activity. PMID:24056003
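The ranking statistic above reduces to a weighted Shannon entropy over the probability mass of response across tested concentrations: a profile whose response is concentrated at a few concentrations scores low entropy (a cleaner candidate hit) while a flat or noisy profile scores high. A minimal sketch; the profile-specific probability estimates and concentration weights in the paper are more elaborate than this uniform-weight default:

```python
import math

def weighted_entropy(responses, weights=None):
    """Weighted Shannon entropy (bits) of a concentration-response
    profile, after normalizing absolute responses to a probability
    mass over the tested concentrations. Returns None for an
    all-zero profile."""
    responses = [abs(r) for r in responses]
    total = sum(responses)
    if total == 0:
        return None
    probs = [r / total for r in responses]
    if weights is None:
        weights = [1.0] * len(probs)  # illustrative: uniform weights
    return -sum(w * p * math.log2(p)
                for w, p in zip(weights, probs) if p > 0)
```

Non-uniform weights can emphasize the high-concentration end of the profile, where genuine activity is expected to concentrate.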
Energy Technology Data Exchange (ETDEWEB)
Santos, Marcio Bueno dos [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Lab. de Integracao e Testes; Saboya, Sergio Mourao [Instituto Tecnologico de Aeronautica, Sao Jose dos Campos, SP (Brazil). Dept. de Energia
1998-07-01
This paper studies the efficiency of a finned-tube solar collector used in artificial satellites and the relation of this efficiency to the entropy generation in the fin. The mathematical modeling of heat transfer in the collector leads to a non-linear integro-differential system of equations, which is solved numerically. The solution gives the efficiency, which is presented as a function of the geometrical and physical characteristics of the collector. It is also shown that, for a collector whose characteristics are subject to constraints, minimum entropy generation in the fins corresponds to an optimum efficiency, that is, an efficiency value advantageous to collector performance. (author)
Importance Sampling Simulations of Markovian Reliability Systems using Cross Entropy
Ridder, Ad
2004-01-01
This paper reports simulation experiments applying the cross-entropy method to importance sampling for efficient estimation of rare event probabilities in Markovian reliability systems. The method is compared to various failure-biasing schemes that have been proved to give
Xu, Jun; Dang, Chao; Kong, Fan
2017-10-01
This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
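The final step described above, integrating a maximum-entropy PDF of the performance function g to get the failure probability P(g < 0), has a simple closed form in the special case of two integer moments: the maximum-entropy density constrained only by mean and variance is Gaussian, so the integral reduces to the standard normal CDF at −β. This two-moment sketch is only an illustrative special case; the paper's method uses fractional moments and a richer maximum-entropy family:

```python
import math

def failure_probability_2moment(mean_g, std_g):
    """Two-moment maximum-entropy estimate of P(g < 0).
    With only mean and variance constrained, the maxent PDF is
    Gaussian and P(g < 0) = Phi(-beta) with beta = mean/std."""
    beta = mean_g / std_g  # classical reliability index
    return 0.5 * math.erfc(beta / math.sqrt(2.0))
```

Fractional-moment constraints matter precisely because real performance functions are often strongly non-Gaussian, where this two-moment estimate can be badly off.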
Information Entropy of Fullerenes.
Sabirov, Denis Sh; Ōsawa, Eiji
2015-08-24
The reasons for the formation of the highly symmetric C60 molecule under nonequilibrium conditions are widely discussed as it dominates over numerous similar fullerene structures. In such conditions, evolution of structure rather than energy defines the processes. We have first studied the diversity of fullerenes in terms of information entropy. Sorting 2079 structures from An Atlas of Fullerenes [ Fowler , P. W. ; Manolopoulos , D. E. An Atlas of Fullerenes ; Oxford : Clarendon , 1995 . ], we have found that the information entropies of only 14 fullerenes (fullerenes. Interestingly, buckminsterfullerene is the only fullerene with zero information entropy, i.e., an exclusive compound among the other members of the fullerene family. Such an efficient sorting demonstrates possible relevance of information entropy to chemical processes. For this reason, we have introduced an algorithm for calculating changes in information entropy at chemical transformations. The preliminary calculations of changes in information entropy at the selected fullerene reactions show good agreement with thermochemical data.
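The information entropy used for this sorting is computed from the partition of a molecule's atoms into symmetry-equivalence classes: h = −Σ (nᵢ/N) log₂(nᵢ/N) over class sizes nᵢ. A minimal sketch under that definition (the class sizes must be supplied, e.g. from a symmetry analysis; the function name is illustrative):

```python
import math

def info_entropy(orbit_sizes):
    """Information entropy (bits) of a molecule from the sizes of its
    equivalence classes (orbits) of atoms: h = -sum (n_i/N) log2 (n_i/N)."""
    n = sum(orbit_sizes)
    return -sum((k / n) * math.log2(k / n) for k in orbit_sizes)
```

For buckminsterfullerene all 60 carbon atoms are equivalent, a single orbit of size 60, so h = 0, consistent with the abstract's observation that C60 is the only fullerene with zero information entropy.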
EEG entropy measures in anesthesia
Directory of Open Access Journals (Sweden)
Zhenhu eLiang
2015-02-01
Full Text Available Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing anesthesia drugs' effect is lacking. In this study, we compare the capability of twelve entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP), in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State entropy (SE), three wavelet entropy (WE) measures (Shannon WE (SWE), Tsallis WE (TWE) and Renyi WE (RWE)), Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), Fuzzy entropy, and three permutation entropy (PE) measures (Shannon PE (SPE), Tsallis PE (TPE) and Renyi PE (RPE)). Two EEG data sets from sevoflurane-induced and isoflurane-induced anesthesia respectively were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability analysis were applied. The multifractal detrended fluctuation analysis (MDFA) as a non-entropy measure was compared. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. Three PE measures outperformed the other entropy indices, with less baseline variability, higher coefficient of determination and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, these entropy measures showed an advantage in computation efficiency compared with MDFA. Conclusion: Each entropy index has its advantages and disadvantages in estimating DoA. Overall, it is suggested that the RPE index was a superior measure. Significance: Investigating the advantages and disadvantages of these entropy indices could help improve current clinical indices for monitoring DoA.
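Since the permutation entropy (PE) family performed best in this comparison, it is worth seeing how little machinery it needs: count the ordinal patterns of short EEG segments and take the Shannon entropy of their distribution. A minimal Bandt-Pompe sketch (embedding delay 1, ties broken by index order; the Renyi and Tsallis variants replace only the final Shannon sum):

```python
import math
from collections import Counter

def permutation_entropy(signal, order=3):
    """Normalized Shannon permutation entropy of a 1-D signal:
    Shannon entropy of the ordinal-pattern distribution, divided
    by its maximum log2(order!). Range [0, 1]."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: signal[i + k]))
        for i in range(len(signal) - order + 1)
    )
    n = sum(patterns.values())
    probs = [c / n for c in patterns.values()]
    h = -sum(p * math.log2(p) for p in probs)
    return h / math.log2(math.factorial(order))
```

The low computational cost visible here, one pass over the signal with O(order log order) work per sample, is the efficiency advantage over MDFA noted in the abstract.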
Adjoint entropy vs topological entropy
Giordano Bruno, Anna
2012-01-01
Recently the adjoint algebraic entropy of endomorphisms of abelian groups was introduced and studied. We generalize the notion of adjoint entropy to continuous endomorphisms of topological abelian groups. Indeed, the adjoint algebraic entropy is defined using the family of all finite-index subgroups, while we take only the subfamily of all open finite-index subgroups to define the topological adjoint entropy. This allows us to compare the (topological) adjoint entropy with the known topologic...
Rose, Michael T.; Crossan, Angus N.; Kennedy, Ivan R.
2008-01-01
Consideration of the property of action is proposed to provide a more meaningful definition of efficient energy use and sustainable production in ecosystems. Action has physical dimensions similar to angular momentum, its magnitude varying with mass, spatial configuration and relative motion. In this article, the relationship of action to…
EEG entropy measures in anesthesia.
Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J; Sleigh, Jamie W; Hagihira, Satoshi; Li, Xiaoli
2015-01-01
► Twelve entropy indices were systematically compared in monitoring depth of anesthesia and detecting burst suppression. ► Renyi permutation entropy performed best in tracking EEG changes associated with different anesthesia states. ► Approximate Entropy and Sample Entropy performed best in detecting burst suppression. Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing anesthesia drugs' effect is lacking. In this study, we compare the capability of 12 entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP), in anesthesia induced by GABAergic agents. Twelve indices were investigated, namely Response Entropy (RE) and State entropy (SE), three wavelet entropy (WE) measures [Shannon WE (SWE), Tsallis WE (TWE), and Renyi WE (RWE)], Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), Fuzzy entropy, and three permutation entropy (PE) measures [Shannon PE (SPE), Tsallis PE (TPE) and Renyi PE (RPE)]. Two EEG data sets from sevoflurane-induced and isoflurane-induced anesthesia respectively were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability (Pk) analysis were applied. The multifractal detrended fluctuation analysis (MDFA) as a non-entropy measure was compared. All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. Three PE measures outperformed the other entropy indices, with less baseline variability, higher coefficient of determination (R²) and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, these entropy measures showed an advantage in computation efficiency compared with MDFA. Each entropy index has its advantages and disadvantages in estimating DoA.
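The ordinal-pattern family to which the best-performing RPE index belongs can be sketched compactly. Below is a minimal Bandt-Pompe (Shannon) permutation entropy in Python; it is an illustrative sketch, not the study's implementation, and the normalization by log(order!) is a common convention assumed here:

```python
import math

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Bandt-Pompe permutation entropy: map each length-`order` window
    to the ordinal pattern of its ranks and take the Shannon entropy of
    the pattern distribution (normalized to [0, 1] by log(order!))."""
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = tuple(x[i + j * delay] for j in range(order))
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    h = -sum(c / n * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(order)) if normalize else h

# A monotone ramp has a single ordinal pattern, hence zero entropy;
# an irregular sequence uses many patterns and scores higher.
print(permutation_entropy(list(range(100))))
print(permutation_entropy([(41 * i) % 101 for i in range(500)]))
```

Renyi or Tsallis variants replace only the final Shannon sum over pattern probabilities.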
EEG entropy measures in anesthesia
Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J.; Sleigh, Jamie W.; Hagihira, Satoshi; Li, Xiaoli
2015-01-01
efficiency compared with MDFA. Conclusion: Each entropy index has its advantages and disadvantages in estimating DoA. Overall, it is suggested that the RPE index was a superior measure. Investigating the advantages and disadvantages of these entropy indices could help improve current clinical indices for monitoring DoA. PMID:25741277
Cheng, Xiaojun; Ma, Xujun; Yépez, Miztli; Genack, Azriel Z.; Mello, Pier A.
2017-11-01
The single-parameter scaling hypothesis relating the average and variance of the logarithm of the conductance is a pillar of the theory of electronic transport. We use a maximum-entropy ansatz to explore the logarithm of the particle, or energy, density ln W(x) at a depth x into a random one-dimensional system. Single-parameter scaling would be the special case in which x = L (the system length). We find the result, confirmed in microwave measurements and computer simulations, that the average of ln W(x) is independent of L and equal to -x/ℓ, with ℓ the mean free path. At the beginning of the sample, var[ln W(x)] rises linearly with x and is also independent of L, with a sublinear increase and then a drop near the sample output. At x = L we find a correction to the value of var[ln T] predicted by single-parameter scaling.
Lechner, Joseph H.
1999-10-01
This report describes two classroom activities that help students visualize the abstract concept of entropy and apply the second law of thermodynamics to real situations. (i) A sealed "rainbow tube" contains six smaller vessels, each filled with a different brightly colored solution (low entropy). When the tube is inverted, the solutions mix together and react to form an amorphous precipitate (high entropy). The change from low entropy to high entropy is irreversible as long as the tube remains sealed. (ii) When U.S. currency is withdrawn from circulation, intact bills (low entropy) are shredded into small fragments (high entropy). Shredding is quick and easy; the reverse process is clearly nonspontaneous. It is theoretically possible, but time-consuming and energy-intensive, to reassemble one bill from a pile that contains fragments of hundreds of bills. We calculate the probability P of drawing pieces of only one specific bill from a mixture containing one pound of bills, each shredded into n fragments. This result can be related to Boltzmann's entropy formula S = k ln W.
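The bill-drawing probability and its link to S = k ln W can be computed directly. In this sketch the bill count per pound (B = 454) and fragment count (n = 50) are assumed purely for illustration; the report leaves them general:

```python
import math

# Hedged illustration: assume a pound of currency holds B = 454 bills,
# each shredded into n = 50 fragments.  Drawing n fragments blindly from
# the pile, the chance that every piece belongs to one chosen bill is
# 1 / C(B*n, n); Boltzmann's S = k ln W converts the count of equally
# likely draws W into an entropy.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def ln_comb(N, r):
    """ln C(N, r) via log-gamma, safe for astronomically large counts."""
    return math.lgamma(N + 1) - math.lgamma(r + 1) - math.lgamma(N - r + 1)

def draw_probability(bills, fragments):
    """P(all drawn fragments come from one specific bill), in log space."""
    return math.exp(-ln_comb(bills * fragments, fragments))

B, n = 454, 50                      # assumed values, for illustration
ln_W = ln_comb(B * n, n)            # W = number of distinct draws
print(f"P = {draw_probability(B, n):.3e}")
print(f"S = k ln W = {K_B * ln_W:.3e} J/K")
```

Working in log space avoids overflowing floating point with the enormous binomial coefficient.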
Using entropy measures to characterize human locomotion.
Leverick, Graham; Szturm, Tony; Wu, Christine Q
2014-12-01
Entropy measures have been widely used to quantify the complexity of theoretical and experimental dynamical systems. In this paper, the value of using entropy measures to characterize human locomotion is demonstrated based on their construct validity, predictive validity in a simple model of human walking and convergent validity in an experimental study. Results show that four of the five considered entropy measures increase meaningfully with the increased probability of falling in a simple passive bipedal walker model. The same four entropy measures also experienced statistically significant increases in response to increasing age and gait impairment caused by cognitive interference in an experimental study. Of the considered entropy measures, the proposed quantized dynamical entropy (QDE) and quantization-based approximation of sample entropy (QASE) offered the best combination of sensitivity to changes in gait dynamics and computational efficiency. Based on these results, entropy appears to be a viable candidate for assessing the stability of human locomotion.
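Sample entropy, one of the baselines against which the proposed QDE and QASE measures are compared, can be sketched as follows. This is an unoptimized O(n²) teaching version, not the paper's estimators:

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """Minimal sample entropy: SampEn = -ln(A/B), where B counts pairs
    of matching templates of length m and A of length m + 1, matches
    judged by a Chebyshev distance <= r."""
    n = len(x)

    def matches(tlen):
        count = 0
        for i in range(n - tlen):
            for j in range(i + 1, n - tlen + 1):
                if all(abs(x[i + k] - x[j + k]) <= r for k in range(tlen)):
                    count += 1
        return count

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")

# Regular, gait-like signals are highly predictable (low SampEn);
# irregular ones are not (high SampEn).
regular = [0.0, 1.0] * 50
random.seed(0)
irregular = [random.random() for _ in range(100)]
print(sample_entropy(regular), sample_entropy(irregular))
```

The quantization-based variants in the paper trade some of this precision for much lower computational cost.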
The biophysical basis of Benveniste experiments: Entropy, structure, and information in water
Widom, Allan; Srivastava, Yogendra; Valenzi, Vincenzo
Benveniste had observed that highly dilute (and even in the absence of physical molecules) biological agents still triggered relevant biological systems. Some of these experiments were reproduced in three other laboratories who cosigned the article, (Davenas et al., Nature 1988, 333, 816). Further works, [(Medical Hypotheses 2000, 54, 33), (Rivista di Biologia/Biology Forum 97, 2004, 169)], showed that molecular activity in more than 50 biochemical systems and even in bacteria could be induced by electromagnetic signals transferred through water solutes. The sources of the electromagnetic signals were recordings of specific biological activities. These results suggest that electromagnetic transmission of biochemical information can be stored in the electric dipole moments of water in close analogy to the manner in which magnetic moments store information on a computer disk. The electromagnetic transmission would enable in vivo transmissions of the specific molecular information between two functional biomolecules. In the present work, the physical nature of such biological information storage and retrieval in ordered quantum electromagnetic domains of water will be discussed.
Directory of Open Access Journals (Sweden)
Luca Faes
2017-01-01
Full Text Available The most common approach to assess the dynamical complexity of a time series across multiple temporal scales makes use of the multiscale entropy (MSE) and refined MSE (RMSE) measures. In spite of their popularity, MSE and RMSE lack an analytical framework allowing their calculation for known dynamic processes and cannot be reliably computed over short time series. To overcome these limitations, we propose a method to assess RMSE for autoregressive (AR) stochastic processes. The method makes use of linear state-space (SS) models to provide the multiscale parametric representation of an AR process observed at different time scales and exploits the SS parameters to quantify analytically the complexity of the process. The resulting linear MSE (LMSE) measure is first tested in simulations, both theoretically, to relate the multiscale complexity of AR processes to their dynamical properties, and over short process realizations, to assess its computational reliability in comparison with RMSE. Then, it is applied to the time series of heart period, arterial pressure, and respiration measured for healthy subjects monitored in resting conditions and during physiological stress. This application to short-term cardiovascular variability documents that LMSE can describe better than RMSE the activity of physiological mechanisms producing biological oscillations at different temporal scales.
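The coarse-graining step underlying MSE can be sketched as below. The histogram entropy used here is a stand-in estimator chosen for brevity; MSE proper applies sample entropy at every scale:

```python
import math
import random

def coarse_grain(x, scale):
    """Costa-style coarse-graining: mean of non-overlapping windows."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def hist_entropy(x, bins=10):
    """Plug-in Shannon entropy of a histogram (a simple stand-in for
    the sample-entropy estimator used in MSE/RMSE)."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in x:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(x)
    return -sum(c / n * math.log(c / n) for c in counts if c)

def multiscale_entropy(x, max_scale=5, est=hist_entropy):
    """Entropy of the coarse-grained series at scales 1..max_scale."""
    return [est(coarse_grain(x, s)) for s in range(1, max_scale + 1)]

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(2000)]
print([round(h, 2) for h in multiscale_entropy(noise)])
```

The paper's contribution is to replace this numerical pipeline with a closed-form computation for AR processes via state-space models, which stays reliable even when the coarse-grained series becomes short.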
Lattice distortions in the FeCoNiCrMn high entropy alloy studied by theory and experiment
Oh, H.S.; Ma, D; Leyson, G.P.; Grabowski, B; Park, E.S.; Kormann, F.H.W.; Raabe, D.
2016-01-01
Lattice distortions constitute one of the main features characterizing high entropy alloys. Local lattice distortions have, however, only rarely been investigated in these multi-component alloys. We, therefore, employ a combined theoretical electronic structure and experimental approach to study the
Entropy-based financial asset pricing.
Directory of Open Access Journals (Sweden)
Mihály Ormos
Full Text Available We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios in a simpler way and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define the continuous entropy as an alternative measure of risk. Our results show that entropy decreases in the function of the number of securities involved in a portfolio in a similar way to the standard deviation, and that efficient portfolios are situated on a hyperbola in the expected return-entropy system. For empirical investigation we use daily returns of 150 randomly selected securities for a period of 27 years. Our regression results show that entropy has a higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore we show the time varying behavior of the beta along with entropy.
Entropy-based financial asset pricing.
Ormos, Mihály; Zibriczky, Dávid
2014-01-01
We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios in a simpler way and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define the continuous entropy as an alternative measure of risk. Our results show that entropy decreases in the function of the number of securities involved in a portfolio in a similar way to the standard deviation, and that efficient portfolios are situated on a hyperbola in the expected return-entropy system. For empirical investigation we use daily returns of 150 randomly selected securities for a period of 27 years. Our regression results show that entropy has a higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore we show the time varying behavior of the beta along with entropy.
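A histogram estimate of the continuous (differential) entropy used as the risk measure might look like the sketch below; the estimator choice and bin count are assumptions for illustration, not the paper's procedure:

```python
import math
import random

def differential_entropy(returns, bins=30):
    """Histogram (plug-in) estimate of the continuous entropy
    h = -integral of f ln f dx, with density approximated per bin as
    f ~ p_bin / bin_width."""
    lo, hi = min(returns), max(returns)
    width = (hi - lo) / bins
    counts = [0] * bins
    for r in returns:
        counts[min(int((r - lo) / width), bins - 1)] += 1
    n = len(returns)
    return -sum(c / n * math.log(c / (n * width)) for c in counts if c)

# For Gaussian returns h = 0.5 ln(2*pi*e*sigma^2), so the estimate
# rises with the standard deviation and orders assets by risk much
# as sigma does.
random.seed(42)
calm = [random.gauss(0.0, 0.01) for _ in range(20000)]
risky = [random.gauss(0.0, 0.02) for _ in range(20000)]
print(round(differential_entropy(calm), 3),
      round(differential_entropy(risky), 3))
```

Unlike standard deviation, the entropy estimate also responds to heavy tails and asymmetry in the return distribution, which is one motivation for using it as an alternative risk measure.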
Westhoff, Martijn; Zehe, Erwin; Erpicum, Sébastien; Archambeau, Pierre; Pirotton, Michel; Dewals, Benjamin
2015-04-01
The Maximum Entropy Production (MEP) principle is a conjecture assuming that a medium is organized in such a way that maximum power is subtracted from a gradient driving a flux (with power being a flux times its driving gradient). This maximum power is also known as the Carnot limit. It has already been shown that the atmosphere operates close to this Carnot limit when it comes to heat transport from the Equator to the poles, or vertically, from the surface to the atmospheric boundary layer. To reach this state close to the Carnot limit, the effective thermal conductivity of the atmosphere is adapted by the creation of convection cells (e.g. wind). The aim of this study is to test if the soil's effective hydraulic conductivity also adapts itself in such a way that it operates close to the Carnot limit. The big difference between the atmosphere and the soil is the way their resistance adapts. The soil's hydraulic conductivity is either changed by weathering processes, which is a very slow process, or by creation of preferential flow paths. In this study the latter process is simulated in a lab experiment, where we focus on the preferential flow paths created by piping. Piping is the process of backward erosion of sand particles subject to a large pressure gradient. Since this is a relatively fast process, it is suitable for being tested in the lab. In the lab setup a horizontal sand bed connects two reservoirs that both drain freely at a level high enough to keep the sand bed always saturated. By adding water to only one reservoir, a horizontal pressure gradient is maintained. If the flow resistance is small, a large gradient develops, leading to the effect of piping. When pipes are being formed, the effective flow resistance decreases; the flow through the sand bed increases and the pressure gradient decreases. At a certain point, the flow velocity is small enough to stop the pipes from growing any further. In this steady state, the effective flow resistance of
Entropy in DNA Double-Strand Break, Detection and Signaling
Zhang, Yang; Schindler, Christina; Heermann, Dieter
2014-03-01
In biology, the term entropy is often understood as a measure of disorder - a restrictive interpretation that can even be misleading. Recently it has become clearer and clearer that entropy, contrary to conventional wisdom, can help to order and guide biological processes in living cells. DNA double-strand breaks (DSBs) are among the most dangerous lesions and efficient damage detection and repair is essential for organism viability. However, what remains unknown is the precise mechanism of targeting the site of damage within billions of intact nucleotides and a crowded nuclear environment, a process which is often referred to as recruitment or signaling. Here we show that the change in entropy associated with inflicting a DSB facilitates the recruitment of damage sensor proteins. By means of computational modeling we found that higher mobility and local chromatin structure accelerate protein association at DSB ends. We compared the effect of different chromatin architectures on protein dynamics and concentrations in the vicinity of DSBs, and related these results to experiments on repair in heterochromatin. Our results demonstrate how entropy contributes to a more efficient damage detection. We identify entropy as the physical basis for DNA double-strand break signaling.
Entropy Generation Across Earth's Bow Shock
Parks, George K.; McCarthy, Michael; Fu, Suiyan; Lee, E. S.; Cao, Jinbin; Goldstein, Melvyn L.; Canu, Patrick; Dandouras, Iannis S.; Reme, Henri; Fazakerley, Andrew
2011-01-01
Earth's bow shock is a transition layer that causes an irreversible change in the state of plasma that is stationary in time. Theories predict entropy increases across the bow shock, but entropy has never been directly measured. The Cluster and Double Star plasma experiments measure 3D plasma distributions upstream and downstream of the bow shock that allow calculation of Boltzmann's entropy function H and his famous H-theorem, dH/dt ≤ 0. We present the first direct measurements of entropy density changes across Earth's bow shock. We will show that this entropy generation may be part of the processes that produce the non-thermal plasma distributions, and that it is consistent with a kinetic entropy flux model derived from the collisionless Boltzmann equation, giving strong support that the solar wind's total entropy across the bow shock remains unchanged. As far as we know, our results are not explained by any existing shock models and should be of interest to theorists.
Nonextensive random-matrix theory based on Kaniadakis entropy
Abul-Magd, A. Y.
2007-02-01
The joint eigenvalue distributions of random-matrix ensembles are derived by applying the principle of maximum entropy to the Rényi, Abe and Kaniadakis entropies. While the Rényi entropy produces essentially the same matrix-element distributions as the previously obtained expression by using the Tsallis entropy, and the Abe entropy does not lead to a closed form expression, the Kaniadakis entropy leads to a new generalized form of the Wigner surmise that describes a transition of the spacing distribution from chaos to order. This expression is compared with the corresponding expression obtained by assuming Tsallis' entropy as well as the results of a previous numerical experiment.
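The classical Wigner surmise that the Kaniadakis form generalizes can be checked numerically by sampling 2×2 GOE matrices; a sketch:

```python
import math
import random

def wigner_surmise(s):
    """GOE Wigner surmise for the nearest-neighbour spacing density:
    P(s) = (pi/2) s exp(-pi s^2 / 4)."""
    return (math.pi / 2) * s * math.exp(-math.pi * s * s / 4)

def goe_spacing(rng):
    """Eigenvalue spacing of a random 2x2 real symmetric (GOE) matrix
    [[a, c], [c, b]]: spacing = sqrt((a - b)^2 + 4 c^2)."""
    a, b = rng.gauss(0, 1), rng.gauss(0, 1)
    c = rng.gauss(0, 1 / math.sqrt(2))   # off-diagonal variance 1/2
    return math.sqrt((a - b) ** 2 + 4 * c * c)

rng = random.Random(7)
spacings = [goe_spacing(rng) for _ in range(50000)]
mean = sum(spacings) / len(spacings)
normalized = [s / mean for s in spacings]
# After normalizing to unit mean spacing, the histogram of `normalized`
# follows the Wigner surmise: small spacings are suppressed (~ s), the
# level repulsion that the generalized (Kaniadakis) form deforms as the
# system moves from chaos toward order.
print(round(sum(normalized) / len(normalized), 3))  # 1.0 by construction
```

The generalized surmise of the abstract replaces the Gaussian weight with a κ-deformed exponential, interpolating between this chaotic limit and the Poissonian (ordered) spacing law.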
Nonextensive random-matrix theory based on Kaniadakis entropy
Energy Technology Data Exchange (ETDEWEB)
Abul-Magd, A.Y. [Department of Mathematics, Faculty of Science, Zagazig University, Zagazig (Egypt)]. E-mail: a_y_abul_magd@hotmail.com
2007-02-12
The joint eigenvalue distributions of random-matrix ensembles are derived by applying the principle of maximum entropy to the Renyi, Abe and Kaniadakis entropies. While the Renyi entropy produces essentially the same matrix-element distributions as the previously obtained expression by using the Tsallis entropy, and the Abe entropy does not lead to a closed form expression, the Kaniadakis entropy leads to a new generalized form of the Wigner surmise that describes a transition of the spacing distribution from chaos to order. This expression is compared with the corresponding expression obtained by assuming Tsallis' entropy as well as the results of a previous numerical experiment.
Kruglikov, Boris; Rypdal, Martin
2005-01-01
The topological entropy of piecewise affine maps is studied. It is shown that singularities may contribute to the entropy only if there is angular expansion and we bound the entropy via the expansion rates of the map. As a corollary we deduce that non-expanding conformal piecewise affine maps have zero topological entropy. We estimate the entropy of piecewise affine skew-products. Examples of abnormal entropy growth are provided.
Emission and Absorption Entropy Generation in Semiconductors
Reck, Kasper; Varpula, Aapo; Prunnila, Mika; Hansen, Ole
2013-01-01
While emission and absorption entropy generation is well known in black bodies, it has not previously been studied in semiconductors, even though semiconductors are widely used for solar light absorption in modern solar cells [1]. We present an analysis of the entropy generation in semiconductor materials due to emission and absorption of electromagnetic radiation. It is shown that the emission and absorption entropy generation reduces the fundamental limit on the efficiency of any semiconduc...
Zucker, M. H.
This paper is a critical analysis and reassessment of entropic functioning as it applies to the question of whether the ultimate fate of the universe will be determined in the future to be "open" (expanding forever to expire in a big chill), "closed" (collapsing to a big crunch), or "flat" (balanced forever between the two). The second law of thermodynamics declares that entropy can only increase and that this principle extends, inevitably, to the universe as a whole. This paper takes the position that this extension is an unwarranted projection based neither on experience nor fact - an extrapolation that ignores the powerful effect of a gravitational force acting within a closed system. Since it was originally presented by Clausius, the thermodynamic concept of entropy has been redefined in terms of "order" and "disorder" - order being equated with a low degree of entropy and disorder with a high degree. This revised terminology, more subjective than precise, has generated considerable confusion in cosmology in several critical instances. For example - the chaotic fireball of the big bang, interpreted by Stephen Hawking as a state of disorder (high entropy), is infinitely hot and, thermally, represents zero entropy (order). Hawking, apparently focusing on the disorderly "chaotic" aspect, equated it with a high degree of entropy - overlooking the fact that the universe is a thermodynamic system and that the key factor in evaluating the big-bang phenomenon is the infinitely high temperature of the early universe, which can only be equated with zero entropy. This analysis resolves this confusion and reestablishes entropy as a cosmological function integrally linked to temperature. The paper goes on to show that, while all subsystems contained within the universe require external sources of energization to have their temperatures raised, this requirement does not apply to the universe as a whole. The universe is the only system that, by itself, can raise its own
Takeguchi, Tatsuya; Yamanaka, Toshiro; Asakura, Kiyotaka; Muhamad, Ernee Noryana; Uosaki, Kohei; Ueda, Wataru
2012-09-05
A randomly mixed monodispersed nanosized Pt-Ru catalyst, an ultimate catalyst for CO oxidation reaction, was prepared by the rapid quenching method. The mechanism of CO oxidation reaction on the Pt-Ru anode catalyst was elucidated by investigating the relation between the rate of CO oxidation reaction and the current density. The rate of CO oxidation reaction increased with an increase in unoccupied sites kinetically formed by hydrogen oxidation reaction, and the rate was independent of anode potential. Results of extended X-ray absorption fine structure spectroscopy showed the combination of N(Pt-Ru)/(N(Pt-Ru) + N(Pt-Pt)) ≑ M(Ru)/(M(Pt) + M(Ru)) and N(Ru-Pt)/(N(Ru-Pt) + N(Ru-Ru)) ≑ M(Pt)/(M(Ru) + M(Pt)), where N(Pt-Ru)(N(Ru-Pt)), N(Pt-Pt)(N(Ru-Ru)), M(Pt), and M(Ru) are the coordination numbers from Pt(Ru) to Ru(Pt) and Pt (Ru) to Pt (Ru) and the molar ratios of Pt and Ru, respectively. This indicates that Pt and Ru were mixed with a completely random distribution. A high-entropy state of dispersion of Pt and Ru could be maintained by rapid quenching from a high temperature. It is concluded that a nonelectrochemical shift reaction on a randomly mixed Pt-Ru catalyst is important to enhance the efficiency of residential fuel cell systems under operation conditions.
Wavelet Entropy-Based Traction Inverter Open Switch Fault Diagnosis in High-Speed Railways
Directory of Open Access Journals (Sweden)
Keting Hu
2016-03-01
Full Text Available In this paper, a diagnosis plan is proposed to settle the detection and isolation problem of open switch faults in traction inverters of high-speed railway traction systems. Five entropy forms are discussed and compared with the traditional fault detection methods, namely, discrete wavelet transform and discrete wavelet packet transform. The traditional fault detection methods cannot efficiently detect the open switch faults in traction inverters because of the low resolution or the sudden change of the current. The performances of Wavelet Packet Energy Shannon Entropy (WPESE), Wavelet Packet Energy Tsallis Entropy (WPETE) with different non-extensive parameters, Wavelet Packet Energy Shannon Entropy with a specific sub-band (WPESE3,6), Empirical Mode Decomposition Shannon Entropy (EMDESE), and Empirical Mode Decomposition Tsallis Entropy (EMDETE) with non-extensive parameters in detecting the open switch fault are evaluated by the evaluation parameter. Comparison experiments are carried out to select the best entropy form for traction inverter open switch fault detection. In addition, the DC component is adopted to isolate the failed Insulated Gate Bipolar Transistor (IGBT). The simulation experiments show that the proposed plan can diagnose single and simultaneous open switch faults correctly and in a timely manner.
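The WPESE idea can be illustrated with a Haar wavelet packet transform; the Haar basis and decomposition depth below are assumptions made for brevity, not the paper's exact configuration:

```python
import math

def haar_step(x):
    """One Haar analysis step: half-length (approximation, detail) pair."""
    s = 2 ** -0.5
    approx = [s * (x[i] + x[i + 1]) for i in range(0, len(x) - 1, 2)]
    detail = [s * (x[i] - x[i + 1]) for i in range(0, len(x) - 1, 2)]
    return approx, detail

def wavelet_packet_bands(x, depth):
    """Full wavelet packet tree: 2**depth frequency bands."""
    bands = [list(x)]
    for _ in range(depth):
        bands = [half for b in bands for half in haar_step(b)]
    return bands

def wpese(x, depth=3):
    """Wavelet Packet Energy Shannon Entropy: Shannon entropy of the
    normalized band-energy distribution."""
    energies = [sum(v * v for v in b) for b in wavelet_packet_bands(x, depth)]
    total = sum(energies)
    return -sum(e / total * math.log(e / total) for e in energies if e > 0)

# A smooth tone keeps its energy in one band (low WPESE); an abrupt
# transient, like the current discontinuity an open-switch fault causes,
# spreads energy over all bands and drives the entropy up.
tone = [math.sin(2 * math.pi * i / 64) for i in range(256)]
transient = [1.0 if i == 100 else 0.0 for i in range(256)]
print(round(wpese(tone), 3), round(wpese(transient), 3))
```

Monitoring this entropy over sliding windows of the inverter current is the kind of detection statistic the abstract evaluates.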
Closing the Gap GEF Experiences in Global Energy Efficiency
Yang, Ming
2013-01-01
Energy efficiency plays, and will continue to play, an important role in saving energy and mitigating greenhouse gas (GHG) emissions worldwide. However, little is known about how much additional capital should be invested to ensure energy is used as efficiently as it should be, and about which sub-areas, technologies, and countries can achieve the maximum greenhouse gas emissions mitigation per dollar invested in energy efficiency worldwide. Analyzing completed and slowly moving energy efficiency projects of the Global Environment Facility during 1991-2010, Closing the Gap: GEF Experiences in Global Energy Efficiency evaluates the impacts of multi-billion-dollar investments in world energy efficiency. It covers the following areas: 1. Reviewing world energy efficiency investment and disclosing the global energy efficiency gap and the market barriers that cause it; 2. Leveraging private funds with public funds and other resources in energy efficiency investments; using...
Sparavigna, Amelia Carolina
2015-01-01
Entropy has a relevant role in several applications of information theory and in image processing. Here, we discuss the Kaniadakis entropy for images. An example of bi-level image thresholding obtained by means of this entropy is also given. Keywords: Kaniadakis Entropy, Data Segmentation, Image Processing, Thresholding
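The Kaniadakis entropy itself, with its κ → 0 Shannon limit, can be sketched as follows (a minimal illustration, not the paper's thresholding algorithm):

```python
import math

def kaniadakis_entropy(probs, kappa):
    """Kaniadakis entropy S_k = sum_i (p_i^(1-k) - p_i^(1+k)) / (2k);
    the k -> 0 limit recovers the Shannon entropy -sum p ln p."""
    if kappa == 0:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return sum((p ** (1 - kappa) - p ** (1 + kappa)) / (2 * kappa)
               for p in probs if p > 0)

p = [0.5, 0.25, 0.125, 0.125]
shannon = kaniadakis_entropy(p, 0)
print(round(shannon, 4))                      # 1.75 * ln 2 ~ 1.213
print(round(kaniadakis_entropy(p, 1e-6), 4))  # approaches the Shannon value
print(round(kaniadakis_entropy(p, 0.5), 4))   # deformed, kappa = 0.5
```

For bi-level thresholding, such an entropy would be evaluated on the normalized histograms of the two classes on either side of a candidate threshold, picking the threshold that maximizes their sum.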
Energy Technology Data Exchange (ETDEWEB)
Letessier, J.; Tounsi, A. [Paris-7 Univ., 75 (France); Rafelski, J. [Arizona Univ., Tucson, AZ (United States). Dept. of Physics
1994-04-01
Entropy is a quantity characterizing the arrow of time in the evolution of a physical system - in every irreversible process the entropy increases. In elementary interactions such as the relativistic collision of two atomic nuclei there is considerable particle production and hence entropy production. We address here a number of questions which arise naturally in this context. When and how is entropy produced in a quantum process, such as a nuclear collision? How is the particle production related to entropy production? How does one measure the entropy produced in the reaction? We also consider certain fundamental approaches to the problem of entropy definition in quantum physics. (author). 15 refs., 5 figs.
Entropy generation of micropolar fluid flow in an inclined porous ...
Indian Academy of Sciences (India)
D Srinivasacharya
collectors and geothermal energy systems depend on entropy generation. The concept of entropy generation rate in flow and thermal systems was introduced by Bejan [1]. It is observed that the thermal system efficiency is enhanced by minimizing the entropy generation of the system [2–4]. The flow through ducts or pipes is ...
Efficiency Improvements of Antenna Optimization Using Orthogonal Fractional Experiments
Directory of Open Access Journals (Sweden)
Yen-Sheng Chen
2015-01-01
Full Text Available This paper presents an extremely efficient method for antenna design and optimization. Traditionally, antenna optimization relies on nature-inspired heuristic algorithms, which are time-consuming due to their blind-search nature. In contrast, design of experiments (DOE) uses a completely different framework from heuristic algorithms, reducing the design cycle by formulating surrogates of a design problem. However, the number of required simulations grows exponentially if a full factorial design is used. In this paper, a much more efficient technique is presented to achieve substantial time savings. By using orthogonal fractional experiments, only a small subset of the full factorial design is required, yet the resultant response surface models are still effective. The capability of orthogonal fractional experiments is demonstrated through three examples, including two tag antennas for radio-frequency identification (RFID) applications and one internal antenna for long-term-evolution (LTE) handheld devices. In these examples, orthogonal fractional experiments greatly improve the efficiency of DOE, thereby facilitating antenna design with fewer simulation runs.
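An orthogonal fractional experiment can be illustrated with the standard L8 array: 8 runs estimate the main effects of up to 7 two-level factors, versus 2^7 = 128 runs for the full factorial. The "simulation" below is a hypothetical additive stand-in for an antenna solver, not any model from the paper:

```python
# Standard L8 orthogonal array, levels coded 0/1: every column is
# balanced and every pair of columns contains each level combination
# equally often, so main effects can be estimated independently.
L8 = [
    [0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 0, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
]

def main_effects(array, responses):
    """Average response at level 1 minus level 0, per factor column."""
    effects = []
    for f in range(len(array[0])):
        hi = [r for row, r in zip(array, responses) if row[f] == 1]
        lo = [r for row, r in zip(array, responses) if row[f] == 0]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical stand-in for an antenna simulation: a response driven by
# factors 0 and 2 only (say, patch length and feed position).
def simulate(row):
    return 3.0 * row[0] - 2.0 * row[2] + 1.0

responses = [simulate(row) for row in L8]
print([round(e, 2) for e in main_effects(L8, responses)])
# [3.0, 0.0, -2.0, 0.0, 0.0, 0.0, 0.0]
```

Because the array is orthogonal, the 8 runs recover the two true effects exactly for this additive model, which is what makes the response-surface surrogates effective despite the tiny run count.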
The minimum entropy principle and task performance.
Guastello, Stephen J; Gorin, Hillary; Huschen, Samuel; Peters, Natalie E; Fabisch, Megan; Poston, Kirsten; Weinberger, Kelsey
2013-07-01
According to the minimum entropy principle, efficient cognitive performance is produced with a neurocognitive strategy that involves a minimum of degrees of freedom. Although high performance is often regarded as consistent performance as well, some variability in performance still remains which allows the person to adapt to changing goal conditions or fatigue. The present study investigated the connection between performance, entropy in performance, and four task-switching strategies. Fifty-one undergraduates performed 7 different computer-based cognitive tasks producing sets of 49 responses under instructional conditions requiring task quotas or no quotas. The temporal patterns of performance were analyzed using orbital decomposition to extract pattern types and lengths, which were then compared with regard to Shannon entropy, topological entropy, and overall performance. Task switching strategies from a previous study were available for the same participants as well. Results indicated that both topological entropy and Shannon entropy were negatively correlated with performance. Some task-switching strategies produced lower entropy in performance than others. Stepwise regression showed that the top three predictors of performance were Shannon entropy and arithmetic and spatial abilities. Additional implications for the prediction of work performance with cognitive ability measurements and the applicability of the minimum entropy principle to multidimensional performance criteria and team work are discussed.
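The Shannon and topological entropies of response-pattern distributions can be sketched as follows; the toy task sequences are illustrative, not the study's data:

```python
import math

def pattern_entropies(symbols, length=2):
    """Shannon and topological entropy of fixed-length response
    patterns, in the spirit of an orbital-decomposition analysis:
    Shannon entropy weighs pattern frequencies, topological entropy
    is the log of the number of distinct patterns observed."""
    counts = {}
    for i in range(len(symbols) - length + 1):
        w = tuple(symbols[i:i + length])
        counts[w] = counts.get(w, 0) + 1
    n = sum(counts.values())
    shannon = -sum(c / n * math.log(c / n) for c in counts.values())
    topological = math.log(len(counts))
    return shannon, topological

# A worker cycling rigidly through tasks A -> B -> C uses few patterns
# (low entropy, consistent performance); an erratic switcher uses many.
rigid = list("ABC" * 20)
erratic = list("ABCACBBACABCCABACBCA" * 3)
print(pattern_entropies(rigid))
print(pattern_entropies(erratic))
```

Under the minimum entropy principle described above, the low-entropy sequence corresponds to the consistent, few-degrees-of-freedom strategy associated with better performance.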
Ehrenfest's Lottery--Time and Entropy Maximization
Ashbaugh, Henry S.
2010-01-01
Successful teaching of the Second Law of Thermodynamics suffers from limited simple examples linking equilibrium to entropy maximization. I describe a thought experiment connecting entropy to a lottery that mixes marbles amongst a collection of urns. This mixing obeys diffusion-like dynamics. Equilibrium is achieved when the marble distribution is…
Entropy Generation Analysis of Desalination Technologies
Directory of Open Access Journals (Sweden)
John H. Lienhard V
2011-09-01
Full Text Available Increasing global demand for fresh water is driving the development and implementation of a wide variety of seawater desalination technologies. Entropy generation analysis, and specifically, Second Law efficiency, is an important tool for illustrating the influence of irreversibilities within a system on the required energy input. When defining Second Law efficiency, the useful exergy output of the system must be properly defined. For desalination systems, this is the minimum least work of separation required to extract a unit of water from a feed stream of a given salinity. In order to evaluate the Second Law efficiency, entropy generation mechanisms present in a wide range of desalination processes are analyzed. In particular, entropy generated in the run down to equilibrium of discharge streams must be considered. Physical models are applied to estimate the magnitude of entropy generation by component and individual processes. These formulations are applied to calculate the total entropy generation in several desalination systems including multiple effect distillation, multistage flash, membrane distillation, mechanical vapor compression, reverse osmosis, and humidification-dehumidification. Within each technology, the relative importance of each source of entropy generation is discussed in order to determine which should be the target of entropy generation minimization. As given here, the correct application of Second Law efficiency shows which systems operate closest to the reversible limit and helps to indicate which systems have the greatest potential for improvement.
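The least work of separation and the resulting Second Law efficiency can be approximated under an ideal NaCl-solution (van 't Hoff) assumption; real seawater property models and finite recovery ratios shift these numbers:

```python
# Hedged sketch: treat seawater as an ideal, fully dissociated NaCl
# solution and take the van 't Hoff osmotic pressure as the least work
# of separation per m^3 of fresh water at vanishing recovery.  Second
# Law efficiency is then W_least / W_actual.  All property values below
# are assumed round numbers, not the article's.
R = 8.314          # J/(mol K)
T = 298.15         # K
M_NACL = 58.44e-3  # kg/mol
RHO = 1025.0       # kg/m^3, seawater density (assumed)
SALINITY = 0.035   # kg salt per kg seawater (assumed 35 g/kg)

def least_work_per_m3(salinity=SALINITY):
    """van 't Hoff: pi = i * c * R * T with i = 2 for NaCl,
    in J per m^3 of product water."""
    molarity = salinity * RHO / M_NACL      # mol NaCl per m^3
    return 2 * molarity * R * T

def second_law_efficiency(actual_kwh_per_m3):
    """eta_II = least work / actual work input, both in kWh/m^3."""
    w_least = least_work_per_m3() / 3.6e6   # J/m^3 -> kWh/m^3
    return w_least / actual_kwh_per_m3

w = least_work_per_m3() / 3.6e6
print(f"least work ~ {w:.2f} kWh/m^3")
print(f"eta_II at 3.5 kWh/m^3 (assumed RO energy use): "
      f"{second_law_efficiency(3.5):.1%}")
```

The gap between this eta_II and 100% is exactly the entropy generation budget that the paper decomposes component by component for each desalination technology.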
Directory of Open Access Journals (Sweden)
George J. A. Jiang
2015-01-01
Electroencephalogram (EEG) signals, which can express the human brain's activities and reflect awareness, have been widely used in research and in medical equipment to build a noninvasive monitoring index of the depth of anesthesia (DOA). The Bispectral (BIS) index monitor is one of the best-known and most important indicators that anesthesiologists, working primarily from EEG signals, use when assessing the DOA. In this study, an attempt is made to build a new indicator using EEG signals to provide a more valuable reference for the DOA for clinical researchers. The EEG signals are collected from patients undergoing anesthetic surgery, filtered using the multivariate empirical mode decomposition (MEMD) method, and analyzed using sample entropy (SampEn) analysis. The signals calculated by SampEn are utilized to train an artificial neural network (ANN) model, using the expert assessment of consciousness level (EACL), assessed by experienced anesthesiologists, as the target to train, validate, and test the ANN. The results achieved with the proposed system are compared to the BIS index. The proposed system not only has characteristics similar to the BIS index but is also closer to the judgment of experienced anesthesiologists, illustrating the consciousness level and reflecting the DOA successfully.
Mutual Information and Nonadditive Entropies: A Method for Kaniadakis Entropy
Sparavigna, Amelia Carolina
2015-01-01
In [10.18483/ijSci.8451], we discussed the mutual information of two random variables and how it can be obtained from entropies. We considered the Shannon entropy and the nonadditive Tsallis entropy. Here, following the same approach used in the Tsallis case, we propose a method for discussing the mutual entropy of another nonadditive entropy, the Kaniadakis entropy.
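The identity underlying this line of work, in the ordinary additive case, is I(X;Y) = H(X) + H(Y) - H(X,Y); the paper's contribution is carrying it over to a nonadditive entropy. A sketch of the Shannon baseline (the example distributions are illustrative assumptions):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in nats) of a probability vector or table."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]  # 0*log(0) is taken as 0
    return float(-np.sum(p * np.log(p)))

def mutual_information(p_xy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint distribution table."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)
    return shannon_entropy(p_x) + shannon_entropy(p_y) - shannon_entropy(p_xy)

# Independent variables carry zero mutual information:
p_indep = np.outer([0.3, 0.7], [0.4, 0.6])
print(mutual_information(p_indep))  # approximately 0

# Perfectly correlated binary variables: I(X;Y) = H(X) = ln 2
p_corr = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information(p_corr))  # ln 2, approximately 0.6931
```

For the Tsallis and Kaniadakis generalizations the plain sum above is replaced by the corresponding deformed (pseudo-additive) composition rule.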
Entropy analysis in foreign exchange markets and economic crisis
Ha, Jin-Gi; Yim, Kyubin; Kim, Seunghwan; Jung, Woo-Sung
2012-08-01
We investigate the relative market efficiency in 11 foreign exchange markets by using the Lempel-Ziv (LZ) complexity algorithm and several entropy values such as the Shannon entropy, the approximate entropy, and the sample entropy. With daily data in 11 foreign exchange markets from Jan. 2000 to Sep. 2011, we observe that mature markets have higher LZ complexities and entropy values than emerging markets. Furthermore, with sliding time windows, we also investigate the temporal evolutions of those entropies from Jan. 1994 to Sep. 2011, and we find that, after an economic crisis, the approximate entropy and the sample entropy of mature markets such as Japan, Europe and the United Kingdom suddenly become lower.
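Sample entropy, one of the regularity measures used in this study, compares how often templates of length m and m+1 repeat within a tolerance r. A self-contained sketch of a common textbook variant (the choices m=2 and r=0.2·std are conventional assumptions, not necessarily the paper's settings):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series.
    Lower values indicate a more regular, predictable series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    n = len(x)

    def n_matches(length):
        # number of template pairs (i < j) within Chebyshev distance r
        t = np.array([x[i:i + length] for i in range(n - length + 1)])
        total = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            total += int(np.sum(d <= r))
        return total

    b = n_matches(m)
    a = n_matches(m + 1)
    if a == 0 or b == 0:
        return float("inf")
    return float(-np.log(a / b))

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))  # highly regular series
noisy = rng.standard_normal(500)                   # white noise
print(sample_entropy(regular), sample_entropy(noisy))
# The periodic signal scores much lower than the noise, mirroring the
# mature-vs-emerging market comparison in the abstract.
```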
Ben-Naim, Arieh
2011-01-01
Changes in entropy can "sometimes" be interpreted in terms of changes in disorder. On the other hand, changes in entropy can "always" be interpreted in terms of changes in Shannon's measure of information. Mixing and demixing processes are used to highlight the pitfalls in the association of entropy with disorder. (Contains 3 figures.)
Entropy generation across Earth's collisionless bow shock.
Parks, G K; Lee, E; McCarthy, M; Goldstein, M; Fu, S Y; Cao, J B; Canu, P; Lin, N; Wilber, M; Dandouras, I; Réme, H; Fazakerley, A
2012-02-10
Earth's bow shock is a collisionless shock wave but entropy has never been directly measured across it. The plasma experiments on Cluster and Double Star measure 3D plasma distributions upstream and downstream of the bow shock allowing calculation of Boltzmann's entropy function H and his famous H theorem, dH/dt≤0. The collisionless Boltzmann (Vlasov) equation predicts that the total entropy does not change if the distribution function across the shock becomes nonthermal, but it allows changes in the entropy density. Here, we present the first direct measurements of entropy density changes across Earth's bow shock and show that the results generally support the model of the Vlasov analysis. These observations are a starting point for a more sophisticated analysis that includes 3D computer modeling of collisionless shocks with input from observed particles, waves, and turbulences.
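Boltzmann's H functional can be estimated directly from measured velocity distributions. A one-dimensional sketch with synthetic Maxwellian data (the real analysis uses full 3-D distributions from the Cluster and Double Star instruments; the sampled distributions below are assumptions for illustration):

```python
import numpy as np

def boltzmann_H(samples, bins=50):
    """Estimate Boltzmann's H = integral of f ln f dv from velocity samples
    via a normalized histogram (1-D sketch of the 3-D calculation)."""
    f, edges = np.histogram(samples, bins=bins, density=True)
    dv = np.diff(edges)
    keep = f > 0
    return float(np.sum(f[keep] * np.log(f[keep]) * dv[keep]))

rng = np.random.default_rng(1)
cold = rng.normal(0.0, 1.0, 100_000)  # narrow upstream Maxwellian (assumed)
hot = rng.normal(0.0, 3.0, 100_000)   # broadened downstream Maxwellian (assumed)
print(boltzmann_H(cold), boltzmann_H(hot))
# Heating lowers H, i.e. raises the entropy density s = -k*H, consistent
# with dH/dt <= 0 across the shock.
```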
Training Concept, Evolution Time, and the Maximum Entropy Production Principle
Directory of Open Access Journals (Sweden)
Alexey Bezryadin
2016-04-01
The maximum entropy production principle (MEPP) is a type of entropy optimization which demands that complex non-equilibrium systems should organize such that the rate of entropy production is maximized. Our take on this principle is that to prove or disprove the validity of the MEPP and to test the scope of its applicability, it is necessary to conduct experiments in which the entropy produced per unit time is measured with a high precision. Thus we study electric-field-induced self-assembly in suspensions of carbon nanotubes and realize precise measurements of the entropy production rate (EPR). As a strong voltage is applied, the suspended nanotubes merge together into a conducting cloud which produces Joule heat and, correspondingly, produces entropy. We introduce two types of EPR, which have qualitatively different significance: global EPR (g-EPR) and the entropy production rate of the dissipative cloud itself (DC-EPR). The following results are obtained: (1) As the system reaches the maximum of the DC-EPR, it becomes stable because the applied voltage acts as a stabilizing thermodynamic potential; (2) We discover metastable states characterized by high, near-maximum values of the DC-EPR. Under certain conditions, such efficient entropy-producing regimes can only be achieved if the system is allowed to initially evolve under mildly non-equilibrium conditions, namely at a reduced voltage; (3) Without such a “training” period the system typically is not able to reach the allowed maximum of the DC-EPR if the bias is high; (4) We observe that the DC-EPR maximum is achieved within a time, Te, the evolution time, which scales as a power-law function of the applied voltage; (5) Finally, we present a clear example in which the g-EPR theoretical maximum can never be achieved. Yet, under a wide range of conditions, the system can self-organize and achieve a dissipative regime in which the DC-EPR equals its theoretical maximum.
Directory of Open Access Journals (Sweden)
Dong-xue Xia
2014-01-01
Infrared images are fuzzy and noisy by nature; thus the segmentation of human targets in infrared images is a challenging task. In this paper, a fast thresholding method for infrared human images based on two-dimensional fuzzy Tsallis entropy is introduced. First, to address the fuzziness of infrared images, the fuzzy Tsallis entropies of objects and of background are defined, respectively, according to the probability partition principle. Next, this newly defined entropy is extended to two dimensions to make good use of spatial information to deal with the noise in infrared images, and correspondingly a fast computation method of two-dimensional fuzzy Tsallis entropy is put forward to reduce its computation complexity from O(L²) to O(L). Finally, the optimal parameters of the fuzzy membership function are searched by a shuffled frog-leaping algorithm following the maximum entropy principle, and then the best threshold of an infrared human image is computed from the optimal parameters. Compared with typical entropy-based thresholding methods in experiments, the method presented in this paper proves to be more efficient and robust.
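The entropy criterion at the core of the method can be shown in a much simpler setting: ordinary 1-D (non-fuzzy) Tsallis thresholding, which picks the gray level maximizing the pseudo-additive sum of object and background Tsallis entropies. This is a simplified sketch of the criterion, not the paper's 2-D fuzzy method; the histogram and q value are illustrative assumptions:

```python
import numpy as np

def tsallis_threshold(hist, q=0.8):
    """1-D Tsallis-entropy thresholding sketch. Returns the gray level t
    maximizing S_obj + S_bg + (1 - q) * S_obj * S_bg, where
    S_q = (1 - sum(p^q)) / (q - 1) for the normalized class histograms."""
    p = hist / hist.sum()
    best_t, best_s = 0, -np.inf
    for t in range(1, len(p) - 1):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        s0 = (1.0 - np.sum((p[:t] / w0) ** q)) / (q - 1.0)
        s1 = (1.0 - np.sum((p[t:] / w1) ** q)) / (q - 1.0)
        s = s0 + s1 + (1.0 - q) * s0 * s1  # pseudo-additive combination
        if s > best_s:
            best_t, best_s = t, s
    return best_t

# Bimodal synthetic "image": background near level 60, target near level 180
rng = np.random.default_rng(2)
img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 15, 2000)])
hist, _ = np.histogram(img.clip(0, 255), bins=256, range=(0, 256))
t = tsallis_threshold(hist.astype(float))
print("threshold:", t)  # lands between the two modes
```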
Configurational Entropy Revisited
Lambert, Frank L.
2007-09-01
Entropy change is categorized in some prominent general chemistry textbooks as being either positional (configurational) or thermal. In those texts, the accompanying emphasis on the dispersal of matter—independent of energy considerations and thus in discord with kinetic molecular theory—is most troubling. This article shows that the variants of entropy can be treated from a unified viewpoint and argues that to decrease students' confusion about the nature of entropy change these variants of entropy should be merged. Molecular energy dispersal in space is implicit but unfortunately tacit in the cell models of statistical mechanics that develop the configurational entropy change in gas expansion, fluids mixing, or the addition of a non-volatile solute to a solvent. Two factors are necessary for entropy change in chemistry. An increase in thermodynamic entropy is enabled in a process by the motional energy of molecules (that, in chemical reactions, can arise from the energy released from a bond energy change). However, entropy increase is only actualized if the process results in a larger number of arrangements for the system's energy, that is, a final state that involves the most probable distribution for that energy under the new constraints. Positional entropy should be eliminated from general chemistry instruction and, especially benefiting "concrete minded" students, it should be replaced by emphasis on the motional energy of molecules as enabling entropy change.
Entropy estimates for simple random fields
DEFF Research Database (Denmark)
Forchhammer, Søren; Justesen, Jørn
1995-01-01
We consider the problem of determining the maximum entropy of a discrete random field on a lattice subject to certain local constraints on symbol configurations. The results are expected to be of interest in the analysis of digitized images and two dimensional codes. We shall present some examples...... of binary and ternary fields with simple constraints. Exact results on the entropies are known only in a few cases, but we shall present close bounds and estimates that are computationally efficient...
Energy Technology Data Exchange (ETDEWEB)
Santos, A.P., E-mail: alysonpaulo@dfte.ufrn.br [Universidade Federal do Rio Grande do Norte, Departamento de Fisica, Natal, RN 59072-970 (Brazil); Silva, R., E-mail: raimundosilva@dfte.ufrn.br [Universidade Federal do Rio Grande do Norte, Departamento de Fisica, Natal, RN 59072-970 (Brazil); Universidade do Estado Rio Grande do Norte, Departamento de Fisica, Mossoro, RN 59610-210 (Brazil); Alcaniz, J.S., E-mail: alcaniz@on.br [Observatorio Nacional, Rio de Janeiro, RJ 20921-400 (Brazil); Anselmo, D.H.A.L., E-mail: doryh@dfte.ufrn.br [Universidade Federal do Rio Grande do Norte, Departamento de Fisica, Natal, RN 59072-970 (Brazil)
2011-08-15
A deduction of generalized quantum entropies within the Tsallis and Kaniadakis frameworks is derived using a generalization of the ordinary multinomial coefficient. This generalization is based on the respective deformed multiplication and division. We show that the two above entropies are consistent with ones arbitrarily assumed in other contexts. -- Highlights: → Derivation of generalized quantum entropies. → Generalized combinatorial method. → Non-Gaussian quantum statistics.
Feedback cooling, measurement errors, and entropy production
Munakata, T.; Rosinberg, M. L.
2013-06-01
The efficiency of a feedback mechanism depends on the precision of the measurement outcomes obtained from the controlled system. Accordingly, measurement errors affect the entropy production in the system. We explore this issue in the context of active feedback cooling by modeling a typical cold damping setup as a harmonic oscillator in contact with a heat reservoir and subjected to a velocity-dependent feedback force that reduces the random motion. We consider two models that distinguish whether the sensor continuously measures the position of the resonator or directly its velocity (in practice, an electric current). Adopting the standpoint of the controlled system, we identify the ‘entropy pumping’ contribution that describes the entropy reduction due to the feedback control and that modifies the second law of thermodynamics. We also assign a relaxation dynamics to the feedback mechanism and compare the apparent entropy production in the system and the heat bath (under the influence of the controller) to the total entropy production in the super-system that includes the controller. In this context, entropy pumping reflects the existence of hidden degrees of freedom and the apparent entropy production satisfies fluctuation theorems associated with an effective Langevin dynamics.
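The cold damping setup described here can be caricatured with an underdamped Langevin simulation: a velocity-proportional feedback force removes kinetic energy and lowers the effective temperature below that of the bath. All parameters and units below are illustrative assumptions, and the measurement error that is the paper's focus is omitted (the feedback is taken as ideal):

```python
import numpy as np

rng = np.random.default_rng(3)

def cold_damping(gamma_fb, n_steps=200_000, dt=1e-3):
    """Euler-Maruyama simulation of an underdamped Langevin oscillator with
    an added feedback force -gamma_fb * v. Units: m = k = kT = 1, bath
    friction gamma = 1. Returns the kinetic temperature <v^2> at late times."""
    x, v = 0.0, 0.0
    gamma, kT = 1.0, 1.0
    vs = np.empty(n_steps)
    for i in range(n_steps):
        noise = np.sqrt(2.0 * gamma * kT * dt) * rng.standard_normal()
        v += (-x - (gamma + gamma_fb) * v) * dt + noise  # spring + friction + feedback
        x += v * dt
        vs[i] = v
    return float(np.mean(vs[n_steps // 2:] ** 2))

T_free = cold_damping(0.0)  # no feedback: <v^2> near the bath value kT = 1
T_fb = cold_damping(4.0)    # feedback on: <v^2> near gamma*kT/(gamma+gamma_fb) = 0.2
print(T_free, T_fb)
```

The steady reduction of ⟨v²⟩ below the bath value is exactly the effect accounted for by the "entropy pumping" term in the modified second law.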
Some Relations Among Entropy Measures
Sparavigna, Amelia Carolina
2015-01-01
Several entropies generalize the Shannon entropy and have it as their limit as their entropic indices approach specific values. Here we discuss some relations existing among Rényi, Tsallis and Kaniadakis entropies and show how the Shannon entropy becomes the limit of the Kaniadakis entropy.
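These limits can be checked numerically: the Rényi and Tsallis entropies at index 1+ε and the Kaniadakis entropy at κ=ε all approach the Shannon value as ε→0. A direct sketch (the probability vector is an illustrative assumption):

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
shannon = -np.sum(p * np.log(p))  # Shannon entropy in nats

def renyi(p, a):
    # Renyi entropy: R_a = ln(sum p^a) / (1 - a), a != 1
    return np.log(np.sum(p ** a)) / (1.0 - a)

def tsallis(p, q):
    # Tsallis entropy: S_q = (1 - sum p^q) / (q - 1), q != 1
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def kaniadakis(p, k):
    # Kaniadakis entropy: S_k = -sum (p^(1+k) - p^(1-k)) / (2k), k != 0
    return -np.sum(p * (p ** k - p ** (-k)) / (2.0 * k))

for eps in (1e-3, 1e-6):
    print(renyi(p, 1 + eps), tsallis(p, 1 + eps), kaniadakis(p, eps))
# All three columns converge to the Shannon value as eps shrinks.
```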
An instructive model of entropy
Zimmerman, Seth
2010-09-01
This article first notes the misinterpretation of a common thought experiment, and the misleading comment that 'systems tend to flow from less probable to more probable macrostates'. It analyses the experiment, generalizes it and introduces a new tool of investigation, the simplectic structure. A time-symmetric model is built upon this structure, yielding several non-intuitive results. The approach is combinatorial rather than statistical, and assumes that entropy is equivalent to 'missing information'. The intention of this article is not only to present interesting results, but also, by deliberately starting with a simple example and developing it through proof and computer simulation, to clarify the often confusing subject of entropy. The article should be particularly stimulating to students and instructors of discrete mathematics or undergraduate physics.
Indian Academy of Sciences (India)
lay in uncovering the microscopic meaning of entropy, in answering the ... mann's life and work. More on the subject can be found in the various 'entropy articles' in this special issue dedicated to him, as well as in others to which I refer at the end. Ludwig Eduard .... to suicide partly because of a fear that his ideas were not ...
Gravitational Entropy and Inflation
Directory of Open Access Journals (Sweden)
Øystein Elgarøy
2013-09-01
The main topic of this paper is a description of the generation of entropy at the end of the inflationary era. As a generalization of the present standard model of the Universe dominated by pressureless dust and a Lorentz invariant vacuum energy (LIVE), we first present a flat Friedmann universe model, where the dust is replaced with an ideal gas. It is shown that the pressure of the gas is inversely proportional to the fifth power of the scale factor and that the entropy in a comoving volume does not change during the expansion. We then review different measures of gravitational entropy related to the Weyl curvature conjecture and calculate the time evolution of two proposed measures of gravitational entropy in a LIVE-dominated Bianchi type I universe, and in a Lemaître-Tolman-Bondi universe with LIVE. Finally, we elaborate upon a model of energy transition from vacuum energy to radiation energy, that of Bonanno and Reuter, and calculate the time evolution of the entropies of vacuum energy and radiation energy. We also calculate the evolution of the maximal entropy according to some recipes and demonstrate how a gap between the maximal entropy and the actual entropy opens up at the end of the inflationary era.
Energy Efficiency Strategies for Ecological Greenhouses: Experiences from Murcia (Spain)
Directory of Open Access Journals (Sweden)
Hilario Becerril
2016-10-01
There has been continuous growth in ecological agriculture (EA) in recent years. It is recognized as a production system with rational energy use and low demand for fossil fuels. There are many studies relating to this subject, in contrast to the few studies regarding the use of energy and its impact on the environment in ecological greenhouses. This article analyzes the strategies adopted by a Transformational Agricultural Society (Sociedad Agraria de Transformación) in order to improve energy efficiency in ecological greenhouses with regard to the use of fossil fuels. The methodology is based on the Working With People (WWP) model, which involves social learning processes over 30 years in one of the largest regions of ecological crops in Spain. The results show that the measures taken to manage the greenhouses have achieved a decrease of over 80% in fossil fuel consumption. The experience demonstrates that EA, as opposed to conventional agriculture (CA), is a system with great potential for reducing energy consumption and for environmental improvement through various strategies.
Quench action and Rényi entropies in integrable systems
Alba, Vincenzo; Calabrese, Pasquale
2017-09-01
Entropy is a fundamental concept in equilibrium statistical mechanics, yet its origin in the nonequilibrium dynamics of isolated quantum systems is not fully understood. A strong consensus is emerging around the idea that the stationary thermodynamic entropy is the von Neumann entanglement entropy of a large subsystem embedded in an infinite system. Also motivated by cold-atom experiments, here we consider the generalization to Rényi entropies. We develop a new technique to calculate the diagonal Rényi entropy in the quench action formalism. In the spirit of the replica treatment for the entanglement entropy, the diagonal Rényi entropies are generalized free energies evaluated over a thermodynamic macrostate which depends on the Rényi index and, in particular, is not the same as the state describing the von Neumann entropy. The technical reason for this perhaps surprising result is that the evaluation of the moments of the diagonal density matrix shifts the saddle point of the quench action. An interesting consequence is that different Rényi entropies encode information about different regions of the spectrum of the postquench Hamiltonian. Our approach provides a very simple proof of the long-standing issue that, for integrable systems, the diagonal entropy is half of the thermodynamic one and it allows us to generalize this result to the case of arbitrary Rényi entropy.
Nonsymmetric entropy I: basic concepts and results
Liu, Chengshi
2006-01-01
A new concept, nonsymmetric entropy, which generalizes the concepts of Boltzmann's entropy and Shannon's entropy, is introduced. The maximal nonsymmetric entropy principle is proven, and some important distribution laws are derived naturally from it.
Conditional Kaniadakis Entropy: a Preliminary Discussion
Sparavigna, Amelia Carolina
2015-01-01
Conditional entropies are fundamental for evaluating the mutual information of random variables. These entropies must be properly defined in the case of nonadditive entropies. Here, we propose the conditional entropy for one of them, the Kaniadakis entropy.
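In the ordinary Shannon case, the conditional entropy is H(Y|X) = H(X,Y) - H(X), and it links to mutual information via I(X;Y) = H(Y) - H(Y|X); the paper's task is defining the analogue for the nonadditive Kaniadakis entropy. A sketch of the classical identity (the joint distribution is an illustrative assumption):

```python
import numpy as np

def H(p):
    """Shannon entropy (nats) of a probability vector or table."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def conditional_entropy(p_xy):
    """H(Y|X) = H(X,Y) - H(X), the chain rule for Shannon entropy."""
    p_x = np.asarray(p_xy, dtype=float).sum(axis=1)
    return H(p_xy) - H(p_x)

p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
h_cond = conditional_entropy(p_xy)
h_y = H(p_xy.sum(axis=0))
print(h_cond)          # about 0.5004 nats
print(h_y - h_cond)    # mutual information I(X;Y) = H(Y) - H(Y|X)
```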
Entropy coherent and entropy convex measures of risk
Laeven, R.J.A.; Stadje, M.
2013-01-01
We introduce two subclasses of convex measures of risk, referred to as entropy coherent and entropy convex measures of risk. Entropy coherent and entropy convex measures of risk are special cases of φ-coherent and φ-convex measures of risk. Contrary to the classical use of coherent and convex
Entropy of the Mixture of Sources and Entropy Dimension
Smieja, Marek; Tabor, Jacek
2011-01-01
We investigate the problem of the entropy of the mixture of sources. There is given an estimation of the entropy and entropy dimension of convex combination of measures. The proof is based on our alternative definition of the entropy based on measures instead of partitions.
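The kind of estimate discussed can be illustrated in the discrete case: for a convex combination of sources, H(sum_i a_i mu_i) <= sum_i a_i H(mu_i) + H(a), with equality when the sources have disjoint supports. A numerical sketch (all distributions are illustrative assumptions):

```python
import numpy as np

def H(p):
    """Shannon entropy (nats) of a discrete distribution."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Two sources with disjoint supports and mixture weights a:
mu1 = np.array([0.7, 0.3, 0.0, 0.0])
mu2 = np.array([0.0, 0.0, 0.5, 0.5])
a = np.array([0.6, 0.4])

mix = a[0] * mu1 + a[1] * mu2
lhs = H(mix)
rhs = H(a) + a[0] * H(mu1) + a[1] * H(mu2)
print(lhs, rhs)  # equal: disjoint supports saturate the bound

# Overlapping supports make the inequality strict:
mu3 = np.array([0.25, 0.25, 0.25, 0.25])
mix2 = a[0] * mu1 + a[1] * mu3
print(H(mix2), H(a) + a[0] * H(mu1) + a[1] * H(mu3))
```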
Entropy Coherent and Entropy Convex Measures of Risk
Laeven, R.J.A.; Stadje, M.A.
2011-01-01
We introduce two subclasses of convex measures of risk, referred to as entropy coherent and entropy convex measures of risk. We prove that convex, entropy convex and entropy coherent measures of risk emerge as certainty equivalents under variational, homothetic and multiple priors preferences,
Sensitivity of Tropical Cyclone Spinup Time to the Initial Entropy Deficit
Tang, B.; Corbosiero, K. L.; Rios-Berrios, R.; Alland, J.; Berman, J.
2014-12-01
The development timescale of a tropical cyclone from genesis to the start of rapid intensification in an axisymmetric model is hypothesized to be a function of the initial entropy deficit. We run a set of idealized simulations in which the initial entropy deficit between the boundary layer and free troposphere varies from 0 to 100 J kg⁻¹ K⁻¹. The development timescale is measured by changes in the integrated kinetic energy of the low-level vortex. This timescale is inversely related to the mean mass flux during the tropical cyclone gestation period. The mean mass flux, in turn, is a function of the statistics of convective updrafts and downdrafts. Contour frequency by altitude diagrams show that entrainment of dry air into updrafts is predominately responsible for differences in the mass flux between the experiments, while downdrafts play a secondary role. Analyses of the potential and kinetic energy budgets indicate less efficient conversion of available potential energy to kinetic energy in the experiments with higher entropy deficits. Entrainment leads to the loss of buoyancy and the destruction of available potential energy. In the presence of strong downdrafts, there can even be a reversal of the conversion term. Weaker and more radially confined radial inflow results in less convergence of angular momentum in the experiments with higher entropy deficits. The result is a slower vortex spinup and a reduction in steady-state vortex size, despite similar steady-state maximum intensities among the experiments.
Directory of Open Access Journals (Sweden)
Leonid M. Martyushev
2015-06-01
The entropy production (inside the volume bounded by a photosphere) of main-sequence stars, subgiants, giants, and supergiants is calculated based on B–V photometry data. A non-linear inverse relationship of thermodynamic fluxes and forces, as well as an almost constant specific (per volume) entropy production of main-sequence stars (for 95% of stars, this quantity lies within 0.5 to 2.2 of the corresponding solar magnitude), is found. The obtained results are discussed from the perspective of known extreme principles related to entropy production.
Entropy Associated with Information Storage and Its Retrieval
Directory of Open Access Journals (Sweden)
Abu Mohamed Alhasan
2015-08-01
We provide an entropy analysis for light storage and light retrieval. In this analysis, entropy extraction and reduction in a typical light storage experiment are identified. The spatiotemporal behavior of entropy is presented for the D1 transition in cold sodium atoms. The governing equations are the reduced Maxwell field equations and the Liouville–von Neumann equation for the density matrix of the dressed atom.
Engineering Entropy for Colloidal Design
Geng, Yina; Anders, Greg Van; Dodd, Paul M.; Glotzer, Sharon C.; Glotzer group Collaboration
The inverse design of target material structures is a fundamental challenge. Here, we demonstrate the direct inverse design of soft materials for target crystal structures using entropy alone. Our approach does not require any geometric ansatz. Instead, it efficiently samples 92- or 188-dimensional building-block parameter spaces to determine thermodynamically optimal shapes. We present detailed data for optimal particle characteristics and parameter tolerances for six target structures. Our results demonstrate a general, rational, and precise method for engineering new colloidal materials, and will guide nanoparticle synthesis to realize these materials.
Energy Technology Data Exchange (ETDEWEB)
Estes, John [Blackett Laboratory, Imperial College,London SW7 2AZ (United Kingdom); Jensen, Kristan [Department of Physics and Astronomy, University of Victoria,Victoria, BC V8W 3P6 (Canada); C.N. Yang Institute for Theoretical Physics, SUNY Stony Brook,Stony Brook, NY 11794-3840 (United States); O’Bannon, Andy [Rudolf Peierls Centre for Theoretical Physics, University of Oxford,1 Keble Road, Oxford OX1 3NP (United Kingdom); Tsatis, Efstratios [8 Kotylaiou Street, Athens 11364 (Greece); Wrase, Timm [Stanford Institute for Theoretical Physics, Stanford University,Stanford, CA 94305 (United States)
2014-05-19
We study a number of (3+1)- and (2+1)-dimensional defect and boundary conformal field theories holographically dual to supergravity theories. In all cases the defects or boundaries are planar, and the defects are codimension-one. Using holography, we compute the entanglement entropy of a (hemi-)spherical region centered on the defect (boundary). We define defect and boundary entropies from the entanglement entropy by an appropriate background subtraction. For some (3+1)-dimensional theories we find evidence that the defect/boundary entropy changes monotonically under certain renormalization group flows triggered by operators localized at the defect or boundary. This provides evidence that the g-theorem of (1+1)-dimensional field theories generalizes to higher dimensions.
Entropy of international trades
Oh, Chang-Young; Lee, D.-S.
2017-05-01
The organization of international trade is highly complex under the collective efforts of participating countries towards economic profit, given inhomogeneous resources for production. Considering the trade flux as the probability of exporting a product from one country to another, we evaluate the entropy of world trade in the period 1950-2000. The trade entropy has increased with time, and we show that this is mainly due to the extension of trade partnership. For a given number of trade partners, the mean trade entropy is about 60% of the maximum possible entropy, independent of time, which can be regarded as a characteristic of the heterogeneity of the trade fluxes and is shown to derive from the scaling and functional behaviors of the universal trade-flux distribution. The correlation and time evolution of individual countries' gross domestic products and numbers of trade partners show that most countries achieved their economic growth partly by extending their trade relationships.
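Treating each normalized trade flux as a probability, the trade entropy and its ratio to the maximum possible value (uniform flux over the existing links) can be computed directly. A sketch on a toy export matrix (the matrix values are illustrative assumptions, not trade data):

```python
import numpy as np

def trade_entropy(flux):
    """Entropy of trade fluxes, treating w_ij / sum(w) as the probability
    of exporting from country i to country j."""
    w = np.asarray(flux, dtype=float).ravel()
    p = w[w > 0] / w.sum()
    return float(-np.sum(p * np.log(p)))

# Toy 4-country export matrix (arbitrary units; diagonal = no self-trade)
w = np.array([[0, 5, 1, 1],
              [4, 0, 2, 0],
              [1, 1, 0, 8],
              [0, 3, 2, 0]])
S = trade_entropy(w)
S_max = np.log(np.count_nonzero(w))  # uniform flux over the existing links
print(S / S_max)  # heterogeneity measure: fraction of the maximum entropy
```

The paper's observation is that this fraction stays near 60% for real trade networks, independent of time.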
Stabilities of generalized entropies
Energy Technology Data Exchange (ETDEWEB)
Abe, Sumiyoshi [Institute of Physics, University of Tsukuba, Ibaraki 305-8571 (Japan); Kaniadakis, G [Dipartimento di Fisica and Istituto Nazionale di Fisica della Materia (INFM), Politecnico di Torino, Corso Duca degli Abruzzi 24, I-10129 Torino (Italy); Scarfone, A M [Dipartimento di Fisica and Istituto Nazionale di Fisica della Materia (INFM), Politecnico di Torino, Corso Duca degli Abruzzi 24, I-10129 Torino (Italy)
2004-11-05
The generalized entropic measure, which is maximized by a given arbitrary distribution under the constraints on normalization of the distribution and the finite ordinary expectation value of a physical random quantity, is considered. To examine if it can be of physical relevance, its experimental robustness is discussed. In particular, Lesche's criterion is analysed, which states that an entropic measure is stable if its change under an arbitrary weak deformation of the distribution (representing fluctuations of experimental data) remains small. It is essential to note the difference between this criterion and thermodynamic stability. A general condition, under which the generalized entropy becomes stable, is derived. Examples known in the literature, including the entropy for the stretched-exponential distribution, the quantum-group entropy and the κ-entropy, are discussed.
Stabilities of generalized entropies
Abe, Sumiyoshi; Kaniadakis, G.; Scarfone, A. M.
2004-01-01
The generalized entropic measure, which is optimized by a given arbitrary distribution under the constraints on normalization of the distribution and the finite ordinary expectation value of a physical random quantity, is considered and its Lesche stability property (that is different from thermodynamic stability) is examined. A general condition, under which the generalized entropy becomes stable, is derived. Examples known in the literature, including the entropy for the stretched-exponenti...
Productive efficiency of rural health clinics: the Midwest experience.
Sinay, T
2001-01-01
This article identifies the characteristics of efficient and inefficient rural clinics in the Midwest, using 1994 Medicare cost reports. Rural health clinics are compared on the basis of productive efficiency by estimating a nonparametric frontier. Six inputs and five output categories were employed to estimate an efficient frontier. The results show that an efficient clinic, on average, employs approximately 1.5 more physicians than an inefficient clinic and incurs capital expenses more than twice those of the inefficient clinic. Future rural clinics are expected to be larger, employing more capital and labor to take advantage of scale economies. However, given the steady (or decreasing) population of rural communities, the expansion of relatively small rural clinics could involve forming rural health care systems and/or networks in close proximity to create synergies from scale economies, staff recruitment, easier access to capital, shared information systems, improved mobility of physicians among several clinics and savings from management costs.
Entropy Is Simple, Qualitatively
Lambert, Frank L.
2002-10-01
Qualitatively, entropy is simple. What it is, why it is useful in understanding the behavior of macro systems or of molecular systems is easy to state: Entropy increase from a macro viewpoint is a measure of the dispersal of energy from localized to spread out at a temperature T. The conventional q in q_rev/T is the energy dispersed to or from a substance or a system. On a molecular basis, entropy increase means that a system changes from having fewer accessible microstates to having a larger number of accessible microstates. Fundamentally based on statistical and quantum mechanics, this approach is superior to the non-fundamental "disorder" as a descriptor of entropy change. The foregoing in no way denies the subtlety or the difficulty presented by entropy in thermodynamics—to first-year students or to professionals. However, as an aid to beginners in their quantitative study of thermodynamics, the qualitative conclusions in this article give students the advantage of a clear bird’s-eye view of why entropy increases in a wide variety of basic cases: a substance going from 0 K to T, phase change, gas expansion, mixing of ideal gases or liquids, colligative effects, and the Gibbs equation. See Letter re: this article.
Entropy, matter, and cosmology.
Prigogine, I; Géhéniau, J
1986-09-01
The role of irreversible processes corresponding to creation of matter in general relativity is investigated. The use of Landau-Lifshitz pseudotensors together with conformal (Minkowski) coordinates suggests that this creation took place in the early universe at the stage of the variation of the conformal factor. The entropy production in this creation process is calculated. It is shown that these dissipative processes lead to the possibility of cosmological models that start from empty conditions and gradually build up matter and entropy. Gravitational entropy takes a simple meaning as associated to the entropy that is necessary to produce matter. This leads to an extension of the third law of thermodynamics, as now the zero point of entropy becomes the space-time structure out of which matter is generated. The theory can be put into a convenient form using a supplementary "C" field in Einstein's field equations. The role of the C field is to express the coupling between gravitation and matter leading to irreversible entropy production.
Reforms, Productivity, and Efficiency in Banking: The Indian Experience
Rakesh Mohan
2005-01-01
India embarked on a strategy of economic reforms in the wake of a serious balance-of-payments crisis in 1991. A central plank of the reforms was reform of the financial sector and, with banks being the mainstay of financial intermediation, of the banking sector. The objective of the banking sector reforms was to promote a diversified, efficient and competitive financial system, with the ultimate objective of improving the allocative efficiency of resources through operational flexibility, improved...
Autonomous entropy-based intelligent experimental design
Malakar, Nabin Kumar
2011-07-01
The aim of this thesis is to explore the application of probability and information theory in experimental design, and to do so in a way that combines what we know about inference and inquiry in a comprehensive and consistent manner. Present day scientific frontiers involve data collection at an ever-increasing rate. This requires that we find a way to collect the most relevant data in an automated fashion. By following the logic of the scientific method, we couple an inference engine with an inquiry engine to automate the iterative process of scientific learning. The inference engine involves Bayesian machine learning techniques to estimate model parameters based upon both prior information and previously collected data, while the inquiry engine implements data-driven exploration. By choosing an experiment whose distribution of expected results has the maximum entropy, the inquiry engine selects the experiment that maximizes the expected information gain. The coupled inference and inquiry engines constitute an autonomous learning method for scientific exploration. We apply it to a robotic arm to demonstrate the efficacy of the method. Optimizing inquiry involves searching for an experiment that promises, on average, to be maximally informative. If the set of potential experiments is described by many parameters, the search involves a high-dimensional entropy space. In such cases, a brute force search method will be slow and computationally expensive. We develop an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment. This helps to reduce the number of computations necessary to find the optimal experiment. We also extended the method of maximizing entropy, and developed a method of maximizing joint entropy so that it could be used as a principle of collaboration between two robots. This is a major achievement of this thesis, as it allows the information-based collaboration between two robotic units towards a same
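The selection rule this thesis describes (choose the experiment whose distribution of expected results has maximum entropy, i.e. maximum expected information gain) can be sketched in a few lines. The candidate experiments and their predictive distributions below are hypothetical placeholders, not taken from the thesis:

```python
import math

def shannon_entropy(p):
    """Entropy (nats) of a discrete distribution given as a list of probabilities."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

# Hypothetical predictive distributions over 4 possible measurement outcomes
# for three candidate experiments (each row sums to 1).
predictions = {
    "probe_A": [0.97, 0.01, 0.01, 0.01],  # outcome nearly certain: little to learn
    "probe_B": [0.25, 0.25, 0.25, 0.25],  # maximally uncertain: most informative
    "probe_C": [0.50, 0.30, 0.10, 0.10],
}

# The inquiry engine picks the experiment with maximum predicted-outcome entropy.
best = max(predictions, key=lambda e: shannon_entropy(predictions[e]))
print(best)  # 'probe_B'
```

In the thesis the predictive distributions come from the Bayesian inference engine's posterior samples; here they are fixed by hand to isolate the selection step.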
On the Conditional Rényi Entropy
S. Fehr (Serge); S. Berens (Stefan)
2014-01-01
The Rényi entropy of general order unifies the well-known Shannon entropy with several other entropy notions, like the min-entropy or the collision entropy. In contrast to the Shannon entropy, there seems to be no commonly accepted definition for the conditional Rényi entropy: several
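For readers unfamiliar with the unification the abstract mentions, here is a small sketch of the (unconditional) Rényi entropy and the special cases it interpolates between; the example distribution is illustrative:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in bits) of a discrete distribution."""
    if alpha == 1.0:  # the limit alpha -> 1 recovers the Shannon entropy
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return math.log2(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
H_shannon   = renyi_entropy(p, 1.0)
H_collision = renyi_entropy(p, 2.0)   # the collision entropy
H_min       = -math.log2(max(p))      # min-entropy = limit alpha -> infinity

# Rényi entropy is non-increasing in alpha: H_1 >= H_2 >= H_inf.
print(H_shannon, H_collision, H_min)
```

The conditional versions discussed in the paper are exactly where this picture gets complicated: several inequivalent definitions exist.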
Acoustic Entropy of the Materials in the Course of Degradation
Directory of Open Access Journals (Sweden)
Ali Kahirdeh
2016-07-01
We report experimental observations on the evolution of acoustic entropy in the course of cyclic loading as degradation occurs due to fatigue. The measured entropy results from the materials' microstructural changes that accumulate under cyclic mechanical loading. Experimental results demonstrate that the maximum acoustic entropy emanating from the materials during the course of degradation remains similar. Experiments are shown for two different types of materials: Aluminum 6061 (a metallic alloy) and glass/epoxy (a composite laminate). The evolution of the acoustic entropy demonstrates a persistent trend over the course of degradation.
Converting Sunlight to Mechanical Energy: A Polymer Example of Entropy.
Mathias, Lon J.
1987-01-01
This experiment/demonstration provides elementary through high school science students with hands-on experience with polymer entropy. Construction of a simple machine for converting light into mechanical energy is described. (RH)
Emission and Absorption Entropy Generation in Semiconductors
DEFF Research Database (Denmark)
Reck, Kasper; Varpula, Aapo; Prunnila, Mika
2013-01-01
materials due to emission and absorption of electromagnetic radiation. It is shown that the emission and absorption entropy generation reduces the fundamental limit on the efficiency of any semiconductor solar cell even further than the Landsberg limit. The results are derived from purely thermodynamical...
Financing energy efficiency: lessons from experiences in India and China
DEFF Research Database (Denmark)
Painuly, J.P.
2009-01-01
Purpose – Improving energy efficiency is considered one of the most desirable and effective short-term measures to address the issue of energy security, and also reduce the emission of greenhouse gases. However, lack of access to domestic finance is the major hindrance in achieving the potential ...
An entropy model with variable target
K O Jörnsten; Larsson, T; J T Lundgren; Migdalas, A.
1990-01-01
In this paper an entropy model with variable target is presented, in which target values are assumed to belong to a specified convex set, so that multiple base-year information and forecasts of future trends can be handled without prior aggregation of such information into one fixed target. Three solution methods for such a model are presented -- one cutting-plane and two search methods -- all of which utilize the fact that entropy models with fixed targets can be solved efficiently. Some com...
Entropy of balance - some recent results
Directory of Open Access Journals (Sweden)
Laxåback Gerd
2010-07-01
Background: Entropy, when applied to biological signals, is expected to reflect the state of the biological system. However, the physiological interpretation of entropy is not always straightforward: when should high entropy be interpreted as a healthy sign, and when as a marker of deteriorating health? We address this question for the particular case of human standing balance and Center of Pressure (COP) data. Methods: We measured and analyzed balance data of 136 participants (young, n = 45; elderly, n = 91), comprising in all 1085 trials, and calculated the Sample Entropy (SampEn) for the medio-lateral (M/L) and anterior-posterior (A/P) COP components, together with the Hurst self-similarity (ss) exponent α obtained using Detrended Fluctuation Analysis (DFA). The COP was measured with a force plate in eight 30-second trials under eyes-closed, eyes-open, foam, self-perturbation and nudge conditions. Results: (1) There is a significant difference in SampEn for the A/P direction between the elderly and the younger groups: old > young. (2) For the elderly we have in general A/P > M/L. (3) For the younger group there was no significant A/P-M/L difference, with the exception of the nudge trials, where we had the reverse situation (A/P < M/L). (4) ... Eyes Open. (5) For the Hurst ss-exponent we have, for the elderly, M/L > A/P. Conclusions: These results seem to require some modification of the more or less established attention-constraint interpretation of entropy, which holds that higher entropy correlates with a more automatic and less constrained mode of balance control, and that higher entropy reflects, in this sense, more efficient balancing.
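A minimal sketch of the Sample Entropy statistic used in this study. This simplified version fixes the tolerance r as an absolute value (in practice it is usually a fraction of the signal's standard deviation), and template-counting conventions vary slightly between implementations:

```python
import math, random

def sample_entropy(x, m=2, r=0.2):
    """Plain SampEn sketch: -ln(A/B), where B counts template matches of
    length m and A matches of length m+1 (Chebyshev distance <= r,
    self-matches excluded)."""
    n = len(x)
    def matches(mm):
        c = 0
        for i in range(n - mm):
            for j in range(i + 1, n - mm):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c
    B, A = matches(m), matches(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float("inf")

# A perfectly repetitive signal is highly predictable (SampEn near 0);
# an irregular signal is not (larger SampEn).
regular = [0.0, 1.0] * 50
random.seed(0)
irregular = [random.random() for _ in range(100)]
print(sample_entropy(regular), sample_entropy(irregular))
```

Low SampEn means that patterns of length m that match tend to still match when extended by one sample, i.e. the signal is regular; the study's question is precisely when such regularity is healthy.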
A Discrete Constraint for Entropy Conservation and Sound Waves in Cloud-Resolving Modeling
Zeng, Xi-Ping; Tao, Wei-Kuo; Simpson, Joanne
2003-01-01
Ideal cloud-resolving models accumulate little error. When their domain is large enough to accommodate synoptic large-scale circulations, they can be used to simulate the interaction between convective clouds and the large-scale circulations. This paper sets up a framework for such models, using moist entropy as a prognostic variable and employing conservative numerical schemes. The models possess no accumulative errors of thermodynamic variables when they comply with a discrete constraint on entropy conservation and sound waves. In other words, the discrete constraint is related to the correct representation of the large-scale convergence and advection of moist entropy. Since air density is involved in both entropy conservation and sound waves, the challenge is how to compute sound waves efficiently under the constraint. To address the challenge, a compensation method is introduced on the basis of a reference isothermal atmosphere whose governing equations are solved analytically. Stability analysis and numerical experiments show that the method allows the models to integrate efficiently with a large time step.
An entropy-assisted musculoskeletal shoulder model.
Xu, Xu; Lin, Jia-Hua; McGorry, Raymond W
2017-04-01
Optimization combined with a musculoskeletal shoulder model has been used to estimate mechanical loading of musculoskeletal elements around the shoulder. Traditionally, the objective function minimizes the summation of the total activities of the muscles, subject to force, moment, and stability constraints. Such an objective function, however, tends to neglect antagonist muscle co-contraction. In this study, an objective function including an entropy term is proposed to address muscle co-contraction. A musculoskeletal shoulder model is developed to apply the proposed objective function. To find the optimal weight for the entropy term, an experiment was conducted in which participants generated various 3-D shoulder moments in six shoulder postures. The surface EMG of 8 shoulder muscles was measured and compared with the muscle activities predicted by the proposed objective function, using the Bhattacharyya distance and concordance ratio, under different weights of the entropy term. The results show that a small weight on the entropy term can improve the predictability of the model in terms of muscle activities. This result suggests that the concept of entropy could be helpful for further understanding the mechanism of muscle co-contraction as well as for developing a shoulder biomechanical model with greater validity.
Farokhi, Saeed; Taghavi, Ray; Keshmiri, Shawn
2015-11-01
Stealth technology is developed for military aircraft to minimize their signatures. Attention has focused primarily on the radar signature, followed by the thermal and noise signatures of the vehicle. For radar evasion, advanced configuration designs and the extensive use of carbon composites and radar-absorbing materials were developed. For the thermal signature, mainly in the infra-red (IR) bandwidth, the solution was found in blended rectangular nozzles of high aspect ratio that are shielded from ground detectors. For noise, quiet and calm jets are integrated into vehicles with low-turbulence configuration design. However, existing detection technologies are incapable of detecting the new generation of revolutionary aircraft. These will use all-electric, distributed propulsion systems that are thermally transparent. In addition, composite skin and non-emitting onboard sensors will lead to low signatures. However, by the second law of thermodynamics, no air vehicle can escape leaving an entropy trail. Entropy is thus the only inevitable signature of any system and, once measured, can reveal the source. By characterizing the entropy field through its statistical properties, the source may be recognized, akin to face-recognition technology. Entropy cannot be measured directly; however, as a derived property, it can be obtained easily from measurable quantities. The measurement accuracy depends on the probe design and the onboard sensors. A novel air-data sensor suite is introduced with promising potential to capture the entropy trail.
Entropy, color, and color rendering.
Price, Luke L A
2012-12-01
The Shannon entropy [Bell Syst. Tech. J. 27, 379 (1948)] of spectral distributions is applied to the problem of color rendering. With this novel approach, calculations for visual white entropy, spectral entropy, and color rendering are proposed: indices that do not rely on the subjectivity inherent in reference spectra and color samples. The indices are tested against real lamp spectra, showing a simple and robust system for color rendering assessment. The discussion considers potential roles for white entropy in several areas of color theory and psychophysics, and nonextensive generalizations of the entropy indices in mathematical color spaces.
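The core quantity here, the Shannon entropy of a normalized spectral distribution, can be sketched as follows. The toy spectra are illustrative and unrelated to the lamp spectra tested in the paper:

```python
import math

def spectral_entropy(spectrum):
    """Shannon entropy (bits) of a spectral power distribution,
    treating the normalized spectrum as a probability distribution."""
    total = sum(spectrum)
    return -sum((s / total) * math.log2(s / total) for s in spectrum if s > 0)

# Toy spectra sampled at 8 wavelength bins (arbitrary units):
broadband = [1.0] * 8                      # flat, "white-like" spectrum
narrowband = [0, 0, 8.0, 0, 0, 0, 0, 0]    # quasi-monochromatic line

H_broad = spectral_entropy(broadband)      # maximal: log2(8) = 3 bits
H_narrow = spectral_entropy(narrowband)    # minimal: 0 bits
print(H_broad, H_narrow)
```

A flat spectrum maximizes the index while a single spectral line minimizes it, which is the intuition behind using entropy as an objective measure of how "complete" a lamp spectrum is.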
Experience with Energy Efficiency Requirements for Electrical Equipment
Energy Technology Data Exchange (ETDEWEB)
NONE
2007-07-01
This publication has been produced as part of the work programme in support of the Gleneagles Plan of Action (GPOA), where the IEA was requested to 'undertake a study to review existing global appliance standards and codes'. In accordance with the G8 request, this study investigates the coverage and impact of forms of minimum energy performance standards (MEPS) and comparative energy labelling programmes; which comprise the cornerstone of most IEA countries national energy efficiency strategy. This scope also reflects governments' aspirations to achieve ambitious targets for reducing greenhouse gas emissions. As a result, this study does not address endorsement labelling and associated voluntary programmes, although these are also important policy tools for national energy efficiency strategies.
Enzyme catalysis by entropy without Circe effect.
Kazemi, Masoud; Himo, Fahmi; Åqvist, Johan
2016-03-01
Entropic effects have often been invoked to explain the extraordinary catalytic power of enzymes. In particular, the hypothesis that enzymes can use part of the substrate-binding free energy to reduce the entropic penalty associated with the subsequent chemical transformation has been very influential. The enzymatic reaction of cytidine deaminase appears to be a distinct example. Here, substrate binding is associated with a significant entropy loss that closely matches the activation entropy penalty for the uncatalyzed reaction in water, whereas the activation entropy for the rate-limiting catalytic step in the enzyme is close to zero. Herein, we report extensive computer simulations of the cytidine deaminase reaction and its temperature dependence. The energetics of the catalytic reaction is first evaluated by density functional theory calculations. These results are then used to parametrize an empirical valence bond description of the reaction, which allows efficient sampling by molecular dynamics simulations and computation of Arrhenius plots. The thermodynamic activation parameters calculated by this approach are in excellent agreement with experimental data and indeed show an activation entropy close to zero for the rate-limiting transition state. However, the origin of this effect is a change of reaction mechanism compared to the uncatalyzed reaction. The enzyme operates by hydroxide ion attack, which is intrinsically associated with a favorable activation entropy. Hence, this has little to do with utilization of binding free energy to pay the entropic penalty but rather reflects how a preorganized active site can stabilize a reaction path that is not operational in solution.
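The thermodynamic activation parameters mentioned here are conventionally extracted from an Eyring plot of rate constants versus temperature. A sketch with synthetic data; the numerical values are illustrative, not those of cytidine deaminase:

```python
import math

k_B, h, R = 1.380649e-23, 6.62607015e-34, 8.314462618

def eyring_rate(T, dH, dS):
    """Transition-state-theory rate constant (1/s) from an activation
    enthalpy dH (J/mol) and activation entropy dS (J/(mol K))."""
    return (k_B * T / h) * math.exp(dS / R) * math.exp(-dH / (R * T))

# Synthetic "measured" rates with an activation entropy near zero,
# mimicking the kind of rate-limiting step discussed in the abstract.
dH_true, dS_true = 60000.0, 0.0
Ts = [290.0, 300.0, 310.0, 320.0]
ks = [eyring_rate(T, dH_true, dS_true) for T in Ts]

# Eyring plot: ln(k/T) vs 1/T is linear with
# slope = -dH/R and intercept = ln(k_B/h) + dS/R.
xs = [1.0 / T for T in Ts]
ys = [math.log(k / T) for k, T in zip(ks, Ts)]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
intercept = ybar - slope * xbar
dH_fit = -slope * R
dS_fit = (intercept - math.log(k_B / h)) * R
print(dH_fit, dS_fit)  # recovers ~60000 J/mol and ~0 J/(mol K)
```

The same linear fit applied to simulated Arrhenius data is how an "activation entropy close to zero" becomes a quantitative statement.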
Replication and efficiency in experiments for marketable emissions permits
Energy Technology Data Exchange (ETDEWEB)
Cason, T.N. [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Economics; Elliott, S.R. [Oak Ridge National Lab., TN (United States); Kundra, I. [Dept. of Energy, Washington, DC (United States). Energy Information Administration; Van Boening, M.V. [Univ. of Mississippi, Oxford, MS (United States). Dept. of Economics and Finance
1996-04-01
The Energy Information Administration (EIA) funded the universities of Colorado and Arizona to define an experimental institution that captures the salient features of the sulfur dioxide allowance market created by the Clean Air Act Amendments of 1990 (CAAA); to develop and document a transportable software that implements the experimental institutions; and to replicate experiments. Subsequently, EIA, in conjunction with the Oak Ridge National Laboratory (ORNL) funded the universities of Mississippi and Southern California to test the replicability of these experiments using statistically sound experimental design and the standardized software developed by the University of Arizona. The present experiment is designed to identify any differences in the results of the two laboratory sites. It is designed to determine whether market outcomes are reproducible across different laboratories and experimenters and to determine if any behavior patterns exist across a large set of independent experimental sessions.
Fault diagnosis for micro-gas turbine engine sensors via wavelet entropy.
Yu, Bing; Liu, Dongdong; Zhang, Tianhong
2011-01-01
Sensor fault diagnosis is necessary to ensure the normal operation of a gas turbine system. However, existing methods require resources that cannot always be provided. Since the sensor readings are directly affected by sensor state, sensor fault diagnosis can be performed by extracting features of the measured signals. This paper proposes a novel fault diagnosis method for sensors based on wavelet entropy. Based on wavelet theory, wavelet decomposition is utilized to decompose the signal at different scales. Then the instantaneous wavelet energy entropy (IWEE) and instantaneous wavelet singular entropy (IWSE) are defined based on previous wavelet entropy theory. Subsequently, a fault diagnosis method for gas turbine sensors is proposed based on the results of a numerically simulated example. Experiments on this method are then carried out on a real micro gas turbine engine. In the experiments, four types of faults with different magnitudes are presented. The experimental results show that the proposed method for sensor fault diagnosis is efficient.
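The wavelet energy entropy underlying such diagnosis schemes can be sketched with a plain Haar decomposition. The "smooth" and "noisy" signals below are toy stand-ins for healthy and faulty sensor readings, and the paper's IWEE/IWSE definitions differ in detail:

```python
import math, random

def haar_level(signal):
    """One level of the (orthonormal) Haar transform: (approximation, detail)."""
    a = [(signal[2*i] + signal[2*i+1]) / math.sqrt(2.0) for i in range(len(signal) // 2)]
    d = [(signal[2*i] - signal[2*i+1]) / math.sqrt(2.0) for i in range(len(signal) // 2)]
    return a, d

def wavelet_energy_entropy(signal, levels=3):
    """Shannon entropy (nats) of the relative wavelet energy per scale."""
    energies, a = [], list(signal)
    for _ in range(levels):
        a, d = haar_level(a)
        energies.append(sum(x * x for x in d))
    energies.append(sum(x * x for x in a))  # energy left in the approximation
    total = sum(energies)
    return -sum(e / total * math.log(e / total) for e in energies if e > 0)

# A smooth reading concentrates energy in coarse scales (low entropy);
# broadband noise spreads energy across all scales (high entropy).
random.seed(1)
N = 64
smooth = [math.sin(2 * math.pi * 2 * t / N) for t in range(N)]
noisy = [random.uniform(-1.0, 1.0) for _ in range(N)]
print(wavelet_energy_entropy(smooth), wavelet_energy_entropy(noisy))
```

A fault that injects broadband content into a sensor channel shows up as a shift in how energy distributes across scales, which is what the entropy summarizes in one number.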
Causality & holographic entanglement entropy
Energy Technology Data Exchange (ETDEWEB)
Headrick, Matthew [Martin Fisher School of Physics, Brandeis University, MS 057, 415 South Street, Waltham, MA 02454 (United States); Hubeny, Veronika E. [Centre for Particle Theory & Department of Mathematical Sciences,Science Laboratories, South Road, Durham DH1 3LE (United Kingdom); Lawrence, Albion [Martin Fisher School of Physics, Brandeis University, MS 057, 415 South Street, Waltham, MA 02454 (United States); Rangamani, Mukund [Centre for Particle Theory & Department of Mathematical Sciences,Science Laboratories, South Road, Durham DH1 3LE (United Kingdom)
2014-12-29
We identify conditions for the entanglement entropy as a function of spatial region to be compatible with causality in an arbitrary relativistic quantum field theory. We then prove that the covariant holographic entanglement entropy prescription (which relates entanglement entropy of a given spatial region on the boundary to the area of a certain extremal surface in the bulk) obeys these conditions, as long as the bulk obeys the null energy condition. While necessary for the validity of the prescription, this consistency requirement is quite nontrivial from the bulk standpoint, and therefore provides important additional evidence for the prescription. In the process, we introduce a codimension-zero bulk region, named the entanglement wedge, naturally associated with the given boundary spatial region. We propose that the entanglement wedge is the most natural bulk region corresponding to the boundary reduced density matrix.
Directory of Open Access Journals (Sweden)
Angel Garrido
2011-07-01
Our paper analyzes some aspects of uncertainty measures. We need to obtain new ways to model adequate conditions or restrictions, constructed from vague pieces of information. The classical entropy measure originates from scientific fields, more specifically from Statistical Physics and Thermodynamics. In time it was adapted by Claude Shannon, creating the current, expanding Information Theory. However, the Hungarian mathematician Alfréd Rényi showed that different and equally valid entropy measures exist, in accordance with the purpose and/or need of the application. Accordingly, it is essential to clarify the different types of measures and their mutual relationships. For these reasons, we attempt here an adequate revision of such fuzzy entropy measures from a mathematical point of view.
Entropy in Pynchon's "Entropy" and Lefebvre's The Production of Space
Snart, Jason
2001-01-01
In his paper, "Entropy in Pynchon's 'Entropy' and Lefebvre's The Production of Space," Jason Snart examines Thomas Pynchon's short story "Entropy" for the ways in which it deals with the kinds of disorder(s) associated with entropy as a thermodynamic and informational concept. These concepts are installed as a framework within which to consider cultural studies work like Henri Lefebvre's thought in The Production of Space and Ludwig von Bertalanffy's general systems theory and thermodynami...
Directory of Open Access Journals (Sweden)
F. Topsøe
2001-09-01
In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow-up on here. It leads to the consideration of a certain game, the Code Length Game and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature with hundreds of applications pertaining to several different fields and will also here serve as important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over
Entropy, materials, and posterity
Cloud, P.
1977-01-01
Materials and energy are the interdependent feedstocks of economic systems, and thermodynamics is their moderator. It costs energy to transform the dispersed minerals of Earth's crust into ordered materials and structures. And it costs materials to collect and focus the energy to perform work - be it from solar, fossil fuel, nuclear, or other sources. The greater the dispersal of minerals sought, the more energy is required to collect them into ordered states. But available energy can be used once only. And the ordered materials of industrial economies become disordered with time. They may be partially reordered and recycled, but only at further costs in energy. Available energy everywhere degrades to bound states and order to disorder - for though entropy may be juggled it always increases. Yet industry is utterly dependent on low entropy states of matter and energy, while decreasing grades of ore require ever higher inputs of energy to convert them to metals, with ever increasing growth both of entropy and environmental hazard. Except as we may prize a thing for its intrinsic qualities - beauty, leisure, love, or gold - low entropy is the only thing of real value. It is worth whatever the market will bear, and it becomes more valuable as entropy increases. It would be foolish of suppliers to sell it more cheaply or in larger amounts than their own enjoyment of life requires, whatever form it may take. For this reason, and because of physical constraints on the availability of all low-entropy states, the recent energy crisis is only the first of a sequence of crises to be expected in energy and materials as long as current trends continue. The apportioning of low-entropy states in a modern industrial society is achieved more or less according to the theory of competitive markets. But the rational powers of this theory suffer as the world grows increasingly polarized into rich, over-industrialized nations with diminishing resource bases and poor, supplier nations
Calibrated entanglement entropy
Bakhmatov, I.; Deger, N. S.; Gutowski, J.; Colgáin, E. Ó.; Yavartanoo, H.
2017-07-01
The Ryu-Takayanagi prescription reduces the problem of calculating entanglement entropy in CFTs to the determination of minimal surfaces in a dual anti-de Sitter geometry. For 3D gravity theories and BTZ black holes, we identify the minimal surfaces as special Lagrangian cycles calibrated by the real part of the holomorphic one-form of a spacelike hypersurface. We show that (generalised) calibrations provide a unified way to determine holographic entanglement entropy from minimal surfaces, which is applicable to warped AdS3 geometries. We briefly discuss generalisations to higher dimensions.
Entanglement entropy and duality
Energy Technology Data Exchange (ETDEWEB)
Radičević, Ðorđe [Stanford Institute for Theoretical Physics and Department of Physics, Stanford University, Stanford, CA 94305-4060 (United States)
2016-11-22
Using the algebraic approach to entanglement entropy, we study several dual pairs of lattice theories and show how the entropy is completely preserved across each duality. Our main result is that a maximal algebra of observables in a region typically dualizes to a non-maximal algebra in a dual region. In particular, we show how the usual notion of tracing out external degrees of freedom dualizes to a tracing out coupled to an additional summation over superselection sectors. We briefly comment on possible extensions of our results to more intricate dualities, including holographic ones.
Harremoeës, P.; Topsøe, F.
2001-09-01
In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow-up on here. It leads to the consideration of a certain game, the Code Length Game and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature with hundreds of applications pertaining to several different fields and will also here serve as important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over the development of natural
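The Mean Energy Model mentioned in this abstract has a classical solution: the entropy-maximizing distribution under a mean "energy" constraint is a Gibbs distribution. A numerical sketch (the energy levels and target means are illustrative):

```python
import math

def gibbs(energies, beta):
    """Gibbs distribution p_i proportional to exp(-beta * E_i)."""
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)
    return [x / z for x in w]

def max_entropy_distribution(energies, mean_target, lo=-50.0, hi=50.0):
    """Maximum-entropy distribution with a fixed mean energy: solve for
    beta by bisection, using that the mean is decreasing in beta."""
    def mean(beta):
        p = gibbs(energies, beta)
        return sum(pi * e for pi, e in zip(p, energies))
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean(mid) > mean_target:
            lo = mid  # mean too high: increase beta
        else:
            hi = mid
    return gibbs(energies, 0.5 * (lo + hi))

E = [0.0, 1.0, 2.0, 3.0]
p_free = max_entropy_distribution(E, 1.5)  # unconstraining target: uniform
p_cold = max_entropy_distribution(E, 0.5)  # low mean: low levels favored
print(p_free, p_cold)
```

When the target equals the unconstrained average, beta goes to zero and the uniform (globally maximum-entropy) distribution is recovered; tighter targets tilt the distribution exponentially, which is the moment-constraint story the abstract refers to.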
DEFF Research Database (Denmark)
Shtarkov, Yuri; Justesen, Jørn
1997-01-01
The concept of entropy for an image on a discrete two dimensional grid is introduced. This concept is used as an information theoretic bound on the coding rate for the image. It is proved that this quantity exists as a limit for arbitrary sets satisfying certain conditions.
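One concrete way to approximate such an image entropy is to estimate per-pixel entropy from empirical block frequencies. The quantity in the paper is defined as a limit over growing regions, so the fixed k-by-k estimator below is only an illustrative finite-block approximation:

```python
import math, random

def block_entropy_rate(image, k):
    """Per-pixel entropy estimate (bits) from empirical k-by-k block
    frequencies over a 2D grid of symbols."""
    counts, n = {}, 0
    H, W = len(image), len(image[0])
    for i in range(H - k + 1):
        for j in range(W - k + 1):
            block = tuple(tuple(image[i + di][j + dj] for dj in range(k))
                          for di in range(k))
            counts[block] = counts.get(block, 0) + 1
            n += 1
    return -sum(c / n * math.log2(c / n) for c in counts.values()) / (k * k)

# A random binary image needs about 1 bit per pixel; a striped image is
# highly structured and needs far less.
random.seed(0)
noise_img = [[random.randint(0, 1) for _ in range(32)] for _ in range(32)]
stripes = [[(i // 4) % 2 for _ in range(32)] for i in range(32)]
print(block_entropy_rate(noise_img, 2), block_entropy_rate(stripes, 2))
```

The estimate is what such an entropy bounds in practice: a lossless image coder cannot, on average, beat the per-pixel entropy of the source.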
Power-Efficient Computing: Experiences from the COSA Project
Directory of Open Access Journals (Sweden)
Daniele Cesini
2017-01-01
Energy consumption is today one of the most relevant issues in operating HPC systems for scientific applications. The use of unconventional computing systems is therefore of great interest for several scientific communities looking for a better tradeoff between time-to-solution and energy-to-solution. In this context, the performance assessment of processors with a high ratio of performance per watt is necessary to understand how to realize energy-efficient computing systems for scientific applications using this class of processors. Computing On SOC Architecture (COSA) is a three-year project (2015–2017) funded by the Scientific Commission V of the Italian Institute for Nuclear Physics (INFN), which aims to investigate the performance and the total cost of ownership offered by computing systems based on commodity low-power Systems on Chip (SoCs) and high energy-efficient systems based on GP-GPUs. In this work, we present the results of the project, analyzing the performance of several scientific applications on several GPU- and SoC-based systems. We also describe the methodology we have used to measure energy performance and the tools we have implemented to monitor the power drained by applications while running.
Chained Bell Inequality Experiment with High-Efficiency Measurements
Tan, T. R.; Wan, Y.; Erickson, S.; Bierhorst, P.; Kienzler, D.; Glancy, S.; Knill, E.; Leibfried, D.; Wineland, D. J.
2017-03-01
We report correlation measurements on two 9Be+ ions that violate a chained Bell inequality obeyed by any local-realistic theory. The correlations can be modeled as derived from a mixture of a local-realistic probability distribution and a distribution that violates the inequality. A statistical framework is formulated to quantify the local-realistic fraction allowable in the observed distribution without the fair-sampling or independent-and-identical-distributions assumptions. We exclude models of our experiment whose local-realistic fraction is above 0.327 at the 95% confidence level. This bound is significantly lower than 0.586, the minimum fraction derived from a perfect Clauser-Horne-Shimony-Holt inequality experiment. Furthermore, our data provide a device-independent certification of the deterministically created Bell states.
Polymorphism in a high-entropy alloy
Zhang, Fei; Wu, Yuan; Lou, Hongbo; Zeng, Zhidan; Prakapenka, Vitali B.; Greenberg, Eran; Ren, Yang; Yan, Jinyuan; Okasinski, John S.; Liu, Xiongjun; Liu, Yong; Zeng, Qiaoshi; Lu, Zhaoping
2017-06-01
Polymorphism, which describes the occurrence of different lattice structures in a crystalline material, is a critical phenomenon in materials science and condensed matter physics. Recently, configuration disorder was compositionally engineered into single lattices, leading to the discovery of high-entropy alloys and high-entropy oxides. For these novel entropy-stabilized forms of crystalline matter with extremely high structural stability, is polymorphism still possible? Here by employing in situ high-pressure synchrotron radiation X-ray diffraction, we reveal a polymorphic transition from face-centred-cubic (fcc) structure to hexagonal-close-packing (hcp) structure in the prototype CoCrFeMnNi high-entropy alloy. The transition is irreversible, and our in situ high-temperature synchrotron radiation X-ray diffraction experiments at different pressures of the retained hcp high-entropy alloy reveal that the fcc phase is a stable polymorph at high temperatures, while the hcp structure is more thermodynamically favourable at lower temperatures. As pressure is increased, the critical temperature for the hcp-to-fcc transformation also rises.
Sadeghi, Pegah; Safavinejad, Ali
2017-11-01
Radiative entropy generation through a gray absorbing, emitting, and scattering planar medium at radiative equilibrium with diffuse-gray walls is investigated. The radiative transfer equation and the radiative entropy generation equations are solved using the discrete ordinates method. Components of the radiative entropy generation are considered for two different boundary conditions: both walls at a prescribed temperature, and mixed boundary conditions in which one wall is at a prescribed temperature and the other is at a prescribed heat flux. The effects of wall emissivities, optical thickness, single scattering albedo, and anisotropic-scattering factor on the entropy generation are carefully investigated. The results reveal that entropy generation in the system mainly arises from irreversible radiative transfer at the wall with the lower temperature. The total entropy generation rate for the system with prescribed wall temperatures increases remarkably as wall emissivity increases; conversely, for the system with mixed boundary conditions, the total entropy generation rate decreases slightly. Furthermore, as the optical thickness increases, the total entropy generation rate decreases remarkably for the system with prescribed wall temperatures; nevertheless, for the system with mixed boundary conditions, it increases. Variation of the single scattering albedo does not considerably affect the total entropy generation rate. This parametric analysis demonstrates that the optical thickness and the wall emissivities have a significant effect on the entropy generation in a system at radiative equilibrium. Attention to the parameters that significantly affect radiative entropy generation provides an opportunity to optimize the design, or to increase overall performance and efficiency, by applying entropy minimization techniques to systems at radiative equilibrium.
Entropy is a Mathematical Formula
Garai, Jozsef
2003-01-01
The microscopic explanation of entropy has been challenged from both experimental and theoretical points of view. The expression of entropy is derived from the first law of thermodynamics, indicating that entropy, or the second law of thermodynamics, is not an independent law.
Sato, Humitaka
2010-06-01
Charles Darwin's calculation of the age of the Earth ignited Kelvin's insight into the lifetime of the Sun, which was eventually inherited by the physical study of stellar structure and energy sources. Nuclear energy secured the longevity of the universe, and the end point of cosmic evolution has been secured by the entropy of black holes.
Entropy à la Boltzmann
Indian Academy of Sciences (India)
Jayanta K Bhattacharjee
2001-09-01
General Article, Resonance – Journal of Science Education, Volume 6, Issue 9, September 2001, pp 19-34. Permanent link: http://www.ias.ac.in/article/fulltext/reso/006/09/0019-0034
Rescaling Temperature and Entropy
Olmsted, John, III
2010-01-01
Temperature and entropy traditionally are expressed in units of kelvin and joule/kelvin. These units obscure some important aspects of the natures of these thermodynamic quantities. Defining a rescaled temperature using the Boltzmann constant, T' = k_B T, expresses temperature in energy units, thereby emphasizing the close relationship…
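The rescaling described above amounts to two one-line conversions. A minimal sketch (helper names are illustrative, not from the paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)

def rescaled_temperature(t_kelvin):
    """T' = k_B * T: temperature expressed directly in joules."""
    return K_B * t_kelvin

def rescaled_entropy(s_joule_per_kelvin):
    """S' = S / k_B: entropy as a pure (dimensionless) number."""
    return s_joule_per_kelvin / K_B

t_room = rescaled_temperature(298.15)   # room temperature as an energy scale
s_melt = rescaled_entropy(22.0)         # molar entropy of melting ice, ~22 J/K
```

In the rescaled units, "k_B T at room temperature" becomes a plain energy of about 4.1e-21 J, and entropy counts microstates directly.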
Entropy in Biology
Indian Academy of Sciences (India)
Jayant B Udgaonkar
2001-09-01
General Article, Resonance – Journal of Science Education, Volume 6, Issue 9, September 2001, pp 61-66. Permanent link: http://www.ias.ac.in/article/fulltext/reso/006/09/0061-0066
On Measuring the Complexity of Networks: Kolmogorov Complexity versus Entropy
Directory of Open Access Journals (Sweden)
Mikołaj Morzy
2017-01-01
Full Text Available One of the most popular methods of estimating the complexity of networks is to measure the entropy of network invariants, such as adjacency matrices or degree sequences. Unfortunately, entropy and all entropy-based information-theoretic measures have several vulnerabilities. These measures are neither independent of a particular representation of the network nor able to capture the properties of the generative process that produces the network. Instead, we advocate the use of algorithmic entropy as the basis of a complexity definition for networks. Algorithmic entropy (also known as Kolmogorov complexity, or K-complexity for short) evaluates the complexity of the description required for a lossless recreation of the network. This measure is not affected by a particular choice of network features and does not depend on the method of network representation. We perform experiments on Shannon entropy and K-complexity for gradually evolving networks. The results of these experiments point to K-complexity as the more robust and reliable measure of network complexity. The original contribution of the paper includes the introduction of several new entropy-deceiving networks and the empirical comparison of entropy and K-complexity as fundamental quantities for constructing complexity measures for networks.
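The entropy-deceiving point above can be illustrated with a toy sketch: Shannon entropy of a binary sequence versus a compression-based upper bound on K-complexity (zlib as a computable stand-in for the uncomputable Kolmogorov complexity; names and the specific sequences are illustrative, not the paper's networks):

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy(sequence):
    """Shannon entropy in bits/symbol of the empirical symbol distribution."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def k_complexity_proxy(text):
    """Crude upper bound on Kolmogorov complexity: compressed length in bytes."""
    return len(zlib.compress(text.encode()))

regular = "01" * 500                                    # perfectly structured
rng = random.Random(0)
noisy = "".join(rng.choice("01") for _ in range(1000))  # pseudo-random

# Both sequences have (near-)maximal Shannon entropy per symbol,
# but compression exposes the structure that entropy misses.
h_reg, h_noise = shannon_entropy(regular), shannon_entropy(noisy)
k_reg, k_noise = k_complexity_proxy(regular), k_complexity_proxy(noisy)
```

The structured sequence compresses to a handful of bytes while the random one does not, even though their per-symbol entropies are indistinguishable.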
Projective Power Entropy and Maximum Tsallis Entropy Distributions
Directory of Open Access Journals (Sweden)
Shinto Eguchi
2011-09-01
Full Text Available We discuss a one-parameter family of generalized cross entropies between two distributions indexed by the power index, called the projective power entropy. The cross entropy essentially reduces to the Tsallis entropy if the two distributions are taken to be equal. Statistical and probabilistic properties associated with the projective power entropy are extensively investigated, including a characterization problem of which conditions uniquely determine the projective power entropy up to the power index. A close relation of the entropy with the Lebesgue space Lp and the dual Lq is explored, in which the escort distribution is associated with an interesting property. When we consider maximum Tsallis entropy distributions under constraints on the mean vector and variance matrix, the model becomes a multivariate q-Gaussian model with elliptical contours, including the Gaussian and t-distribution models. We discuss statistical estimation by minimization of the empirical loss associated with the projective power entropy. It is shown that the minimum loss estimators for the mean vector and variance matrix under the maximum entropy model are the sample mean vector and the sample variance matrix. The escort distribution of the maximum entropy distribution plays the key role in the derivation.
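As a small aside on the Tsallis entropy to which the projective power entropy reduces, a hedged sketch of the discrete form and its q -> 1 Shannon limit (illustrative only, not the authors' estimator):

```python
import math

def tsallis_entropy(p, q):
    """Discrete Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1); Shannon at q = 1."""
    if q == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
s_shannon = tsallis_entropy(p, 1)       # = 1.5 * ln 2
s_q2 = tsallis_entropy(p, 2)            # = (1 - 0.375) / 1 = 0.625
s_near1 = tsallis_entropy(p, 1.0001)    # continuous in q at q = 1
```

The q = 2 case weights likely outcomes more heavily, which is the kind of power-index deformation the projective power entropy parameterizes.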
Entropy, neutro-entropy and anti-entropy for neutrosophic information
Patrascu, Vasile
2017-01-01
This approach presents a multi-valued representation of neutrosophic information. It highlights the link between bifuzzy information and neutrosophic information. The constructed deca-valued structure shows the complexity of neutrosophic information. This deca-valued structure led to the construction of two new concepts for neutrosophic information: neutro-entropy and anti-entropy. These two concepts are added to the two existing ones, entropy and non-entropy. Thus, we obtained the following triad: e...
Financial time series analysis based on effective phase transfer entropy
Yang, Pengbo; Shang, Pengjian; Lin, Aijing
2017-02-01
Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.
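The underlying transfer entropy that the proposed effective phase variant builds upon can be sketched for binary series with history length 1. This is a plug-in histogram estimator, not the authors' phase-based method; names are illustrative:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE_{X->Y} in bits, history length 1."""
    n = len(x) - 1
    c_xyz, c_xy, c_yz, c_y = Counter(), Counter(), Counter(), Counter()
    for t in range(n):
        c_xyz[(y[t + 1], y[t], x[t])] += 1
        c_xy[(y[t], x[t])] += 1
        c_yz[(y[t + 1], y[t])] += 1
        c_y[y[t]] += 1
    te = 0.0
    for (y1, y0, x0), c in c_xyz.items():
        p_joint = c / n
        p_y1_given_yx = c / c_xy[(y0, x0)]        # p(y_{t+1} | y_t, x_t)
        p_y1_given_y = c_yz[(y1, y0)] / c_y[y0]   # p(y_{t+1} | y_t)
        te += p_joint * math.log2(p_y1_given_yx / p_y1_given_y)
    return te

rng = random.Random(1)
x = [rng.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                     # y is a one-step-delayed copy of x

te_xy = transfer_entropy(x, y)       # large: x fully drives y
te_yx = transfer_entropy(y, x)       # near zero: nothing flows back
```

The asymmetry te_xy >> te_yx is exactly the directed "impact of one system on another" the abstract refers to.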
Quantum entanglement and Kaniadakis entropy
Ourabah, Kamel; Hiba Hamici-Bendimerad, Amel; Tribeche, Mouloud
2015-04-01
A first use of Kaniadakis entropy in the context of quantum information is presented. First we show that (as all smooth and concave trace-form entropies) it exhibits some properties allowing it to be a possible candidate for a generalized quantum information theory. We then use it to determine the degree of entanglement. The influence of the parameter κ, that underpins Kaniadakis entropy, on the mutual information measure is then highlighted. It is shown that Kaniadakis entropy reduces the mutual information, which is always smaller than its usual von Neumann counterpart. Our results may contribute to the ongoing investigation involving generalized entropies in the context of quantum information.
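A minimal sketch of the Kaniadakis entropy for a discrete distribution, showing only its reduction to the Shannon/von Neumann form as the parameter κ goes to zero (illustrative; this is not the entanglement calculation of the paper):

```python
import math

def kappa_log(x, kappa):
    """Kaniadakis kappa-logarithm; reduces to ln(x) as kappa -> 0."""
    if kappa == 0:
        return math.log(x)
    return (x ** kappa - x ** (-kappa)) / (2 * kappa)

def kaniadakis_entropy(p, kappa):
    """S_kappa = -sum_i p_i ln_kappa(p_i) for a discrete distribution."""
    return -sum(pi * kappa_log(pi, kappa) for pi in p if pi > 0)

p = [0.5, 0.5]                          # e.g. Schmidt weights of a maximally entangled pair
s0 = kaniadakis_entropy(p, 0.0)         # Shannon/von Neumann value, ln 2
s_small = kaniadakis_entropy(p, 1e-6)   # kappa -> 0 limit approaches s0
s_half = kaniadakis_entropy(p, 0.5)     # deformed value at kappa = 1/2
```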
Fluctuation of Information Entropy Measures in Cell Image
Directory of Open Access Journals (Sweden)
Ishay Wohl
2017-10-01
Full Text Available A simple, label-free cytometry technique is introduced. It is based on the analysis of the fluctuation of image Gray Level Information Entropy (GLIE), which is shown to reflect intracellular biophysical properties like generalized entropy. In this study, the analytical relations between cellular thermodynamic generalized entropy and diffusivity and GLIE fluctuation measures are explored for the first time. The standard deviation (SD) of GLIE is shown by experiments, simulation and theoretical analysis to be indifferent to microscope system “noise”. Then, the ability of GLIE fluctuation measures to reflect basic cellular entropy conditions of early death and malignancy is demonstrated in a cell model of healthy-donor human lymphocytes, malignant Jurkat cells, as well as dead lymphocytes and Jurkat cells. Utilization of GLIE-based fluctuation measures seems to have the advantage of displaying biophysical characterization of the tested cells, like diffusivity and entropy, in a novel, unique, simple and illustrative way.
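At its core, the GLIE of the abstract is the Shannon entropy of a gray-level histogram. A hedged sketch of that core quantity and an SD-based fluctuation measure (a simplification, not the authors' pipeline; patch shapes are illustrative):

```python
import math
from collections import Counter

def gray_level_entropy(pixels):
    """Shannon entropy (bits) of the gray-level histogram of an image patch."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def glie_sd(frames):
    """Fluctuation measure: standard deviation of GLIE over a frame sequence."""
    h = [gray_level_entropy(f) for f in frames]
    m = sum(h) / len(h)
    return math.sqrt(sum((v - m) ** 2 for v in h) / len(h))

flat = [128] * 64                       # uniform 8x8 patch: zero entropy
varied = list(range(64))                # 64 distinct gray levels: 6 bits
h_flat = gray_level_entropy(flat)       # 0.0
h_varied = gray_level_entropy(varied)   # 6.0
```

Tracking glie_sd over a time-lapse of such patches is the kind of fluctuation statistic the abstract relates to intracellular diffusivity.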
Entropy, recycling and macroeconomics of water resources
Karakatsanis, Georgios; Mamassis, Nikos; Koutsoyiannis, Demetris
2014-05-01
We propose a macroeconomic model for water quantity and quality supply multipliers derived by water recycling (Karakatsanis et al. 2013). Macroeconomic models that incorporate natural resource conservation have become increasingly important (European Commission et al. 2012). In addition, as an estimated 80% of globally used freshwater is not reused (United Nations 2012), under increasing population trends water recycling becomes a solution of high priority. Recycling of water resources creates two major conservation effects: (1) conservation of water in reservoirs and aquifers and (2) conservation of ecosystem carrying capacity due to wastewater flux reduction. The statistical distribution properties of the recycling efficiencies, on both water quantity and quality, for each sector are of vital economic importance. The uncertainty and complexity of water reuse across sectors are statistically quantified by entropy. High entropy of recycling efficiency values signifies greater efficiency dispersion, which in turn may indicate the need for additional infrastructure to both shift and concentrate the statistical distribution towards the higher efficiencies that lead to higher supply multipliers.
Keywords: entropy, water recycling, water supply multipliers, conservation, recycling efficiencies, macroeconomics
References:
1. European Commission (EC), Food and Agriculture Organization (FAO), International Monetary Fund (IMF), Organization for Economic Cooperation and Development (OECD), United Nations (UN) and World Bank (2012), System of Environmental and Economic Accounting (SEEA) Central Framework (White cover publication), United Nations Statistics Division
2. Karakatsanis, G., N. Mamassis, D. Koutsoyiannis and A. Efstratiades (2013), Entropy and reliability of water use via a statistical approach of scarcity, 5th EGU Leonardo Conference - Hydrofractals 2013 - STAHY '13, Kos Island, Greece, European Geosciences Union, International Association of Hydrological Sciences
The ordinal Kolmogorov-Sinai entropy: A generalized approximation
Eyebe Fouda, J. S. Armand; Koepf, Wolfram; Jacquir, Sabir
2017-05-01
We introduce the multi-dimensional ordinal arrays complexity as a generalized approximation of the ordinal Kolmogorov-Sinai entropy. The ordinal arrays entropy (OAE) is defined as the Shannon entropy of a series of symbols encoded from m-ordinal patterns, while the ordinal arrays complexity (OAC) is defined as the differential of the OAE with respect to m. We theoretically establish that the OAC provides a better estimate of the complexity measure for short-length time series. Simulations were carried out using discrete maps, and confirm the efficiency of the OAC as a complexity measure derived from a small data set, even in a noisy environment.
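The m-ordinal patterns underlying the OAE are those of classical permutation entropy (Bandt-Pompe); a sketch of that base quantity, assuming the standard rank-order encoding (the OAE/OAC generalization itself is not reproduced here):

```python
import math
import random
from collections import Counter

def ordinal_pattern_entropy(series, m):
    """Shannon entropy (bits) of the m-length ordinal patterns of a series."""
    patterns = Counter()
    for i in range(len(series) - m + 1):
        window = series[i:i + m]
        # rank-order pattern: index positions sorted by value
        patterns[tuple(sorted(range(m), key=window.__getitem__))] += 1
    n = sum(patterns.values())
    return -sum((c / n) * math.log2(c / n) for c in patterns.values())

monotone = list(range(100))                    # a single ordinal pattern occurs
rng = random.Random(0)
noisy = [rng.random() for _ in range(5000)]    # all 3! patterns, roughly evenly

h_mono = ordinal_pattern_entropy(monotone, 3)  # 0.0
h_noise = ordinal_pattern_entropy(noisy, 3)    # close to log2(6), about 2.585
```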
Energy Technology Data Exchange (ETDEWEB)
Weinberg, A.M.
1982-10-01
Utopians who use entropy to warn of a vast deterioration of energy and mineral resources seek a self-fulfilling prophecy when they work to deny society access to new energy sources, particularly nuclear power. While theoretically correct, entropy is not the relevant factor for the rest of this century. The more extreme entropists call for a return to an eotechnic society based on decentralized, renewable energy technologies, which rests on the assumptions of a loss in Gibbs free energy, a mineral depletion that will lead to OPEC-like manipulation, and a current technology that is destroying the environment. The author challenges these assumptions and calls for an exorcism of public fears over reactor accidents. He foresees a resurgence in public confidence in nuclear power by 1990 that will resolve Western dependence on foreign oil. (DCK)
Entropy generation analysis of magnetohydrodynamic induction devices
Energy Technology Data Exchange (ETDEWEB)
Salas, Hugo [UAEMor., Facultad de Ciencias, Cuernavaca (Mexico); Cuevas, Sergio; Haro, Mariano Lopez de [UNAM, Centro de Investigacion en Energia, Temixco (Mexico)
1999-10-21
Magnetohydrodynamic (MHD) induction devices such as electromagnetic pumps or electric generators are analysed within the approach of entropy generation. The flow of an electrically-conducting incompressible fluid in an MHD induction machine is described through the well known Hartmann model. Irreversibilities in the system due to ohmic dissipation, flow friction and heat flow are included in the entropy-generation rate. This quantity is used to define an overall efficiency for the induction machine that considers the total loss caused by process irreversibility. For an MHD generator working at maximum power output with walls at constant temperature, an optimum magnetic field strength (i.e., Hartmann number) is found based on the maximum overall efficiency. (Author)
Entropy for Mechanically Vibrating Systems
Tufano, Dante
The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is given, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis develops the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes. The results are shown to be analogous to those obtained for simple oscillators.
Holographic entanglement entropy
Rangamani, Mukund
2017-01-01
This book provides a comprehensive overview of developments in the field of holographic entanglement entropy. Within the context of the AdS/CFT correspondence, it is shown how quantum entanglement is computed by the area of certain extremal surfaces. The general lessons one can learn from this connection are drawn out for quantum field theories, many-body physics, and quantum gravity. An overview of the necessary background material is provided together with a flavor of the exciting open questions that are currently being discussed. The book is divided into four main parts. In the first part, the concept of entanglement, and methods for computing it, in quantum field theories is reviewed. In the second part, an overview of the AdS/CFT correspondence is given and the holographic entanglement entropy prescription is explained. In the third part, the time-dependence of entanglement entropy in out-of-equilibrium systems, and applications to many body physics are explored using holographic methods. The last part f...
Directory of Open Access Journals (Sweden)
Bernard S. Kay
2015-12-01
Full Text Available We give a review, in the style of an essay, of the author’s 1998 matter-gravity entanglement hypothesis which, unlike the standard approach to entropy based on coarse-graining, offers a definition for the entropy of a closed system as a real and objective quantity. We explain how this approach offers an explanation for the Second Law of Thermodynamics in general and a non-paradoxical understanding of information loss during black hole formation and evaporation in particular. It also involves a description of black hole equilibrium states radically different from the usual one, in which the total state of a black hole in a box together with its atmosphere is a pure state, entangled in just such a way that the reduced states of the black hole and of its atmosphere are each separately approximately thermal. We also briefly recall some recent work of the author which involves a reworking of the string-theory understanding of black hole entropy consistent with this alternative description of black hole equilibrium states, and point out that this is free from some unsatisfactory features of the usual string-theory understanding. We also recall the author’s recent arguments based on this alternative description which suggest that the Anti-de Sitter space (AdS)/conformal field theory (CFT) correspondence is a bijection between the boundary CFT and just the matter degrees of freedom of the bulk theory.
Preimage entropy dimension of topological dynamical systems
Liu, Lei; Zhou, Xiaomin; Zhou, Xiaoyao
2014-01-01
We propose a new definition of preimage entropy dimension for continuous maps on compact metric spaces, investigate fundamental properties of the preimage entropy dimension, and compare the preimage entropy dimension with the topological entropy dimension. The defined preimage entropy dimension holds various basic properties of topological entropy dimension, for example, the preimage entropy dimension of a subsystem is bounded by that of the original system and topologically conjugated system...
Entropy in probability and statistics
Energy Technology Data Exchange (ETDEWEB)
Rolke, W.A.
1992-01-01
The author develops a theory of entropy, where entropy is defined as the Legendre-Fenchel transform of the logarithmic moment generating function of a probability measure on a Banach space. A variety of properties relating the probability measure and its entropy are proven. It is shown that the entropy of a large class of stochastic processes can be approximated by the entropies of the finite-dimensional distributions of the process. For several types of measures the author finds explicit formulas for the entropy, for example for stochastic processes with independent increments and for Gaussian processes. For the entropy of Markov chains, evaluated at the observations of the process, the author proves a central limit theorem. Theorems relating weak convergence of probability measures on a finite dimensional space and pointwise convergence of their entropies are developed and then used to give a new proof of Donsker's theorem. Finally the use of entropy in statistics is discussed. The author shows the connection between entropy and Kullback's minimum discrimination information. A central limit theorem yields a test for the independence of a sequence of observations.
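The definition above, entropy as the Legendre-Fenchel transform of the logarithmic moment generating function, can be checked numerically in the simplest case of a Bernoulli measure, where the transform has a closed form (a sketch under that assumption, not the thesis's Banach-space construction):

```python
import math

def log_mgf(t, p):
    """Logarithmic moment generating function of a Bernoulli(p) variable."""
    return math.log(1 - p + p * math.exp(t))

def legendre_fenchel(x, p, lo=-30.0, hi=30.0, iters=200):
    """I(x) = sup_t [t*x - log_mgf(t)] by ternary search (objective is concave)."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if m1 * x - log_mgf(m1, p) < m2 * x - log_mgf(m2, p):
            lo = m1
        else:
            hi = m2
    t = (lo + hi) / 2
    return t * x - log_mgf(t, p)

def kl_bernoulli(x, p):
    """Closed form of the transform: relative entropy of Bernoulli(x) vs Bernoulli(p)."""
    return x * math.log(x / p) + (1 - x) * math.log((1 - x) / (1 - p))

i_num = legendre_fenchel(0.7, 0.5)
i_exact = kl_bernoulli(0.7, 0.5)     # the two agree
```

That the transform lands exactly on a Kullback relative entropy mirrors the connection to minimum discrimination information mentioned at the end of the abstract.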
DEFF Research Database (Denmark)
Müller-Lennert, Martin; Dupont-Dupuis, Fréderic; Szehr, Oleg
2013-01-01
in information theory and beyond. Various generalizations of Rényi entropies to the quantum setting have been proposed, most prominently Petz's quasi-entropies and Renner's conditional min-, max-, and collision entropy. However, these quantum extensions are incompatible and thus unsatisfactory. We propose a new...... quantum generalization of the family of Rényi entropies that contains the von Neumann entropy, min-entropy, collision entropy, and the max-entropy as special cases, thus encompassing most quantum entropies in use today. We show several natural properties for this definition, including data...
Fractal Structure and Entropy Production within the Central Nervous System
Directory of Open Access Journals (Sweden)
Andrew J. E. Seely
2014-08-01
Full Text Available Our goal is to explore the relationship between two traditionally unrelated concepts, fractal structure and entropy production, evaluating both within the central nervous system (CNS. Fractals are temporal or spatial structures with self-similarity across scales of measurement; whereas entropy production represents the necessary exportation of entropy to our environment that comes with metabolism and life. Fractals may be measured by their fractal dimension; and human entropy production may be estimated by oxygen and glucose metabolism. In this paper, we observe fractal structures ubiquitously present in the CNS, and explore a hypothetical and unexplored link between fractal structure and entropy production, as measured by oxygen and glucose metabolism. Rapid increase in both fractal structures and metabolism occur with childhood and adolescent growth, followed by slow decrease during aging. Concomitant increases and decreases in fractal structure and metabolism occur with cancer vs. Alzheimer’s and multiple sclerosis, respectively. In addition to fractals being related to entropy production, we hypothesize that the emergence of fractal structures spontaneously occurs because a fractal is more efficient at dissipating energy gradients, thus maximizing entropy production. Experimental evaluation and further understanding of limitations and necessary conditions are indicated to address broad scientific and clinical implications of this work.
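Fractal dimension, as used above, is typically estimated by box counting; a minimal sketch on 2-D point sets (illustrative only, not the CNS measurements of the paper):

```python
import math

def box_count_dimension(points, scales):
    """Estimate box-counting dimension: slope of log N(eps) vs log(1/eps)."""
    logs = []
    for eps in scales:
        boxes = {(math.floor(px / eps), math.floor(py / eps)) for px, py in points}
        logs.append((math.log(1 / eps), math.log(len(boxes))))
    # least-squares slope through the log-log points
    n = len(logs)
    mx = sum(a for a, _ in logs) / n
    my = sum(b for _, b in logs) / n
    return sum((a - mx) * (b - my) for a, b in logs) / sum((a - mx) ** 2 for a, _ in logs)

line = [(i / 10000, i / 10000) for i in range(10000)]                  # dimension ~1
square = [(i / 100, j / 100) for i in range(100) for j in range(100)]  # dimension ~2

d_line = box_count_dimension(line, [0.1, 0.05, 0.02, 0.01])
d_square = box_count_dimension(square, [0.1, 0.05, 0.02])
```

Branching structures like dendritic trees and vasculature fall between these two integer extremes, which is what makes the dimension a useful summary statistic.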
Entanglement Entropy of Black Holes
Directory of Open Access Journals (Sweden)
Sergey N. Solodukhin
2011-10-01
Full Text Available The entanglement entropy is a fundamental quantity, which characterizes the correlations between sub-systems in a larger quantum-mechanical system. For two sub-systems separated by a surface the entanglement entropy is proportional to the area of the surface and depends on the UV cutoff, which regulates the short-distance correlations. The geometrical nature of entanglement-entropy calculation is particularly intriguing when applied to black holes when the entangling surface is the black-hole horizon. I review a variety of aspects of this calculation: the useful mathematical tools such as the geometry of spaces with conical singularities and the heat kernel method, the UV divergences in the entropy and their renormalization, the logarithmic terms in the entanglement entropy in four and six dimensions and their relation to the conformal anomalies. The focus in the review is on the systematic use of the conical singularity method. The relations to other known approaches such as ’t Hooft’s brick-wall model and the Euclidean path integral in the optical metric are discussed in detail. The puzzling behavior of the entanglement entropy due to fields, which non-minimally couple to gravity, is emphasized. The holographic description of the entanglement entropy of the black-hole horizon is illustrated on the two- and four-dimensional examples. Finally, I examine the possibility to interpret the Bekenstein-Hawking entropy entirely as the entanglement entropy.
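For a two-qubit pure state, the way entanglement entropy captures correlations across a bipartition can be sketched directly: trace out one qubit and take the von Neumann entropy of what remains (real amplitudes assumed for brevity; this toy case carries none of the UV-divergence structure discussed above):

```python
import math

def reduced_eigs(state):
    """Eigenvalues of qubit A's reduced density matrix for a real-amplitude
    two-qubit pure state [c00, c01, c10, c11]."""
    c00, c01, c10, c11 = state
    a = c00 * c00 + c01 * c01            # <0|rho_A|0> after tracing out B
    d = c10 * c10 + c11 * c11            # <1|rho_A|1>
    b = c00 * c10 + c01 * c11            # off-diagonal element
    mean = (a + d) / 2
    r = math.sqrt(((a - d) / 2) ** 2 + b * b)
    return mean - r, mean + r

def entanglement_entropy(state):
    """Von Neumann entropy S = -sum l ln l of the reduced state."""
    return -sum(l * math.log(l) for l in reduced_eigs(state) if l > 1e-12)

bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]   # maximally entangled
product = [1.0, 0.0, 0.0, 0.0]                          # separable

s_bell = entanglement_entropy(bell)        # ln 2, the maximum for one qubit
s_product = entanglement_entropy(product)  # 0: no correlations across the cut
```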
Temporal Entropy Generation in the Viscous Layers of Laterally-converging Duct Flows
Energy Technology Data Exchange (ETDEWEB)
Donald M. McEligot; Robert S. Brodkey; Helmut Eckelmann
2008-12-01
Since insight into entropy generation is a key to increasing efficiency and thereby reducing fuel consumption and/or waste and -- for wall-bounded flows -- most entropy is generated in the viscous layer, we examine the transient behavior of its dominant contributor there for a non-canonical flow. New measurements in oil flow are presented for the effects of favorable streamwise mean pressure gradients on temporal entropy generation rates and, in the process, on key Reynolds-stress-producing events such as sweep front passage and on the deceleration/outflow phase of the overall bursting process. Two extremes have been considered: (1) a high pressure gradient, nearing "laminarization," and (2), for comparison, a low pressure gradient corresponding to many earlier experiments. In both cases, the peak temporal entropy generation rate occurs shortly after passage of the ejection/sweep interface. Whether sweep and ejection rates appear to decrease or increase with the pressure gradient depends on the feature examined and the manner of sampling. When compared using wall coordinates for velocities, distances and time, the trends and magnitudes of the transient behaviors are mostly the same. The main effects of the higher pressure gradient are (1) changes in the time lag between detections -- representing modification of the shape of the sweep front and the sweep angle with the wall, (2) modification of the magnitude of an instantaneous Reynolds shear stress with wall distance and (3) enlarging the sweeps and ejections. Results new for both low and high pressure gradients are the temporal behaviors of the dominant contribution to entropy generation; it is found to be much more sensitive to distance from the wall than to streamwise pressure gradient.
Nonextensive entropy interdisciplinary applications
Tsallis, Constantino
2004-01-01
A great variety of complex phenomena in many scientific fields exhibit power-law behavior, reflecting a hierarchical or fractal structure. Many of these phenomena seem to be susceptible to description using approaches drawn from thermodynamics or statistical mechanics, particularly approaches involving the maximization of entropy, which generalize Boltzmann-Gibbs statistical mechanics and its standard laws in a natural way. The book addresses the interdisciplinary applications of these ideas, focusing also on various phenomena that could possibly be quantitatively described in terms of them.
Howard, Eric M
2016-01-01
We analyze spacetimes with horizons and study the thermodynamic aspects of causal horizons, suggesting that the resemblance between gravitational and thermodynamic systems has a deeper quantum mechanical origin. We find that the observer dependence of such horizons is a direct consequence of associating a temperature and entropy to a spacetime. The geometrical picture of a horizon acting as a one-way membrane for information flow can be accepted as a natural interpretation of assigning a quantum field theory to a spacetime with boundary, ultimately leading to a close connection with thermodynamics.
Katona, Gyula O H; Tardos, Gábor
2007-01-01
The present volume is a collection of survey papers in the fields of entropy, search and complexity. They summarize the latest developments in their respective areas. More than half of the papers belong to search theory, which lies on the borderline of mathematics and computer science, information theory and combinatorics. Search theory has variegated applications, among others in bioinformatics. Some of these papers also have links to linear statistics and communication complexity. Further works survey the fundamentals of information theory and quantum source coding. The volume is recommended to experienced researchers as well as young scientists and students, both in mathematics and computer science.
Directory of Open Access Journals (Sweden)
Prasad Radha K.
2017-09-01
This paper presents mathematical modelling and numerical analysis of entropy generation analysis (EGA), considering pressure drop and second-law efficiency, for forced convection heat transfer in the rectangular duct of a solar air heater with wire ribs as artificial roughness in an arc-shaped geometry on the absorber plate. The investigation includes evaluations of entropy generation, entropy generation number, Bejan number and irreversibilities of roughened as well as smooth absorber plate solar air heaters to compare their relative performance. Furthermore, the effects of various roughness and operating parameters on entropy generation have also been investigated. Entropy generation and irreversibility (exergy destroyed) reach their minimum at a relative roughness height of 0.0422 and a relative angle of attack of 0.33, which leads to the maximum exergetic efficiency. Entropy generation and exergy-based analyses can be adopted to evaluate the overall performance of solar air heaters.
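The Bejan number mentioned above can be illustrated with a minimal sketch. The split of duct entropy generation into a heat-transfer part and a friction part follows a common textbook decomposition, not necessarily the paper's exact model, and all numerical values in the usage example are hypothetical:

```python
import math

def entropy_generation(m_dot, cp, T_in, T_out, T_plate, q, dp, rho, t_avg):
    """Split duct entropy generation into heat-transfer and friction parts.

    Common textbook decomposition: the flow gains entropy m_dot*cp*ln(To/Ti)
    while heat q arrives from the plate at T_plate; the pressure drop dp
    contributes m_dot*dp/(rho*T_avg).
    """
    s_gen_heat = m_dot * cp * math.log(T_out / T_in) - q / T_plate
    s_gen_fric = m_dot * dp / (rho * t_avg)
    total = s_gen_heat + s_gen_fric
    # Bejan number: fraction of irreversibility due to heat transfer.
    # Be near 1 means heat-transfer irreversibility dominates friction.
    bejan = s_gen_heat / total
    return total, bejan

# Hypothetical air-heater operating point (SI units)
total, be = entropy_generation(m_dot=0.05, cp=1005.0, T_in=300.0, T_out=330.0,
                               T_plate=360.0, q=1507.5, dp=50.0,
                               rho=1.1, t_avg=315.0)
```

For typical low-pressure-drop air heaters the friction term is small, so Be stays close to 1.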
Efficient Testing Combining Design of Experiment and Learn-to-Fly Strategies
Murphy, Patrick C.; Brandon, Jay M.
2017-01-01
Rapid modeling and efficient testing methods are important in a number of aerospace applications. In this study, efficient testing strategies were evaluated in a wind-tunnel test environment and combined to suggest a promising approach for both ground-based and flight-based experiments. Benefits of using Design of Experiment techniques, well established in scientific, military, and manufacturing applications, are evaluated in combination with newly developed methods for global nonlinear modeling. The nonlinear modeling methods, referred to as Learn-to-Fly methods, utilize fuzzy logic and multivariate orthogonal function techniques that have been successfully demonstrated in flight test. The blended approach presented here focuses on experiment design and identifies a sequential testing process, with clearly defined completion metrics, that increases testing efficiency.
Efficiency-Morality Trade-Offs in Repugnant Transactions: A Choice Experiment
Elias, Julio; Lacetera, Nicola; Macis, Mario
2016-01-01
Societies prohibit many transactions considered morally repugnant, although potentially efficiency-enhancing. We conducted an online choice experiment to characterize preferences for the morality and efficiency of payments to kidney donors. Preferences were heterogeneous, ranging from deontological to strongly consequentialist; the median respondent would support payments by a public agency if they increased the annual kidney supply by six percentage points, and private transactions for a thi...
Giant onsite electronic entropy enhances the performance of ceria for water splitting.
Naghavi, S Shahab; Emery, Antoine A; Hansen, Heine A; Zhou, Fei; Ozolins, Vidvuds; Wolverton, Chris
2017-08-18
Previous studies have shown that a large solid-state entropy of reduction increases the thermodynamic efficiency of metal oxides, such as ceria, for two-step thermochemical water splitting cycles. In this context, the configurational entropy arising from oxygen off-stoichiometry in the oxide has been the focus of most previous work. Here we report a different source of entropy, the onsite electronic configurational entropy, arising from coupling between orbital and spin angular momenta in lanthanide f orbitals. We find that onsite electronic configurational entropy is sizable in all lanthanides, and reaches a maximum value of ≈4.7 kB per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has a high electronic entropy and thus could also be a potential candidate for solar thermochemical reactions.
Bukowski, Marcin; Asanowicz, Dariusz; Marzecová, Anna; Lupiáñez, Juan
2015-01-01
Two experiments were conducted to explore the effects of experiencing uncontrollability on the efficiency of attentional control. The experience of uncontrollability was induced either by unsolvable tasks (Experiment 1) or by tasks in which non-contingent feedback was provided (Experiment 2). A version of the Attentional Network Test-Interactions with an additional measure of vigilance (ANTI-V) was used to evaluate the efficiency of the attentional networks (i.e., alerting, orienting, and executive). Results of both experiments revealed a decreased efficiency of executive attention in participants who experienced stable control deprivation but no negative effects in participants who were able to restore their sense of previously deprived control. Additionally, when participants were asked to perform unsolvable tasks and did not receive feedback (Experiment 1), detrimental effects on the orienting network and vigilance were observed. The motivational and cognitive mechanisms underlying the effects of various uncontrollability experiences on conflict resolution and attentional control are discussed.
Radiation Entropy and Near-Field Thermophotovoltaics
Zhang, Zhuomin M.
2008-08-01
Radiation entropy was key to the original derivation of Planck's law of blackbody radiation in 1900. This discovery opened the door to quantum mechanical theory, and Planck was awarded the Nobel Prize in Physics in 1918. Thermal radiation plays an important role in incandescent lamps, solar energy utilization, temperature measurements, materials processing, remote sensing for astronomy and space exploration, combustion and furnace design, food processing, cryogenic engineering, as well as numerous agricultural, health, and military applications. While Planck's law has been fruitfully applied to a large number of engineering problems for over 100 years, questions have been raised about its limitation in micro/nano systems, especially at subwavelength distances or in the near field. When two objects are located closer than the characteristic wavelength, wave interference and photon tunneling occur, which can result in significant enhancement of the radiative transfer. Recent studies have shown that the near-field effects can realize emerging technologies, such as superlenses, sub-wavelength light sources, polariton-assisted nanolithography, thermophotovoltaic (TPV) systems, scanning tunneling thermal microscopy, etc. The concept of entropy has also been applied to explain laser cooling of solids as well as the second-law efficiency of devices that utilize thermal radiation to produce electricity. However, little is known about the nature of entropy in near-field radiation. Some history and recent advances are reviewed in this presentation, with a call for research on radiation entropy in the near field, due to the important applications in the optimization of thermophotovoltaic converters and in the design of practical systems that can harvest photon energies efficiently.
Configurational entropy of glueball states
Energy Technology Data Exchange (ETDEWEB)
Bernardini, Alex E., E-mail: alexeb@ufscar.br [Departamento de Física, Universidade Federal de São Carlos, PO Box 676, 13565-905, São Carlos, SP (Brazil); Braga, Nelson R.F., E-mail: braga@if.ufrj.br [Instituto de Física, Universidade Federal do Rio de Janeiro, Caixa Postal 68528, RJ 21941-972 (Brazil); Rocha, Roldão da, E-mail: roldao.rocha@ufabc.edu.br [CMCC, Universidade Federal do ABC, UFABC, 09210-580, Santo André (Brazil)
2017-02-10
The configurational entropy of glueball states is calculated using a holographic description. Glueball states are represented by a supergravity dual picture, consisting of a 5-dimensional graviton–dilaton action of a dynamical holographic AdS/QCD model. The configurational entropy is studied as a function of the glueball spin and of the mass, providing information about the stability of the glueball states.
Approximate entropy of network parameters
West, James; Lacasa, Lucas; Severini, Simone; Teschendorff, Andrew
2012-04-01
We study the notion of approximate entropy within the framework of network theory. Approximate entropy is an uncertainty measure originally proposed in the context of dynamical systems and time series. We first define a purely structural entropy obtained by computing the approximate entropy of the so-called slide sequence. This is a surrogate of the degree sequence and it is suggested by the frequency partition of a graph. We examine this quantity for standard scale-free and Erdős-Rényi networks. By using classical results of Pincus, we show that our entropy measure often converges with network size to a certain binary Shannon entropy. As a second step, with specific attention to networks generated by dynamical processes, we investigate approximate entropy of horizontal visibility graphs. Visibility graphs allow us to naturally associate with a network the notion of temporal correlations, therefore providing the measure a dynamical garment. We show that approximate entropy distinguishes visibility graphs generated by processes with different complexity. This result further probes these networks as tools for the study of dynamical systems. Applications to certain biological data arising in cancer genomics are finally considered in the light of both approaches.
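Approximate entropy in its standard time-series form (due to Pincus) can be sketched as follows; applying it to a network's slide sequence, as above, amounts to feeding that sequence in. The parameter choices m and r in the example are illustrative only:

```python
import math

def apen(u, m, r):
    """Approximate entropy ApEn(m, r) of a sequence u (Pincus's definition)."""
    def phi(mm):
        n = len(u) - mm + 1
        x = [u[i:i + mm] for i in range(n)]
        logs = []
        for xi in x:
            # fraction of template vectors within Chebyshev distance r of xi
            count = sum(1 for xj in x
                        if max(abs(a - b) for a, b in zip(xi, xj)) <= r)
            logs.append(math.log(count / n))
        return sum(logs) / n
    return phi(m) - phi(m + 1)
```

A perfectly regular sequence yields ApEn = 0, while an alternating sequence already shows a small positive value, reflecting the end effects of finite length.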
Entropy, Age and Time Operator
Directory of Open Access Journals (Sweden)
Ilias Gialampoukidis
2015-01-01
The time operator and internal age are intrinsic features of entropy producing innovation processes. The innovation spaces at each stage are the eigenspaces of the time operator. The internal age is the average innovation time, analogous to lifetime computation. Time operators were originally introduced for quantum systems and highly unstable dynamical systems. Extending the time operator theory to regular Markov chains allows one to relate internal age with norm distances from equilibrium. The goal of this work is to express the evolution of internal age in terms of Lyapunov functionals constructed from entropies. We selected the Boltzmann–Gibbs–Shannon entropy and more general entropy functions, namely the Tsallis entropies and the Kaniadakis entropies. Moreover, we compare the evolution of the distance of initial distributions from equilibrium to the evolution of the Lyapunov functionals constructed from norms with the evolution of Lyapunov functionals constructed from entropies. It is remarkable that the entropy functionals evolve, violating the second law of thermodynamics, while the norm functionals evolve thermodynamically.
Entropy Budget for Hawking Evaporation
Directory of Open Access Journals (Sweden)
Ana Alonso-Serrano
2017-07-01
Blackbody radiation, emitted from a furnace and described by a Planck spectrum, contains (on average) an entropy of 3.9 ± 2.5 bits per photon. Since normal physical burning is a unitary process, this amount of entropy is compensated by the same amount of "hidden information" in correlations between the photons. The importance of this result lies in the posterior extension of this argument to the Hawking radiation from black holes, demonstrating that the assumption of unitarity leads to a perfectly reasonable entropy/information budget for the evaporation process. In order to carry out this calculation, we adopt a variant of the "average subsystem" approach, but consider a tripartite pure system that includes the influence of the rest of the universe and allows "young" black holes to still have a non-zero entropy, which we identify with the standard Bekenstein entropy.
Entropy and equilibrium via games of complexity
Topsøe, Flemming
2004-09-01
It is suggested that thermodynamical equilibrium equals game theoretical equilibrium. Aspects of this thesis are discussed. The philosophy is consistent with maximum entropy thinking of Jaynes, but goes one step deeper by deriving the maximum entropy principle from an underlying game theoretical principle. The games introduced are based on measures of complexity. Entropy is viewed as minimal complexity. It is demonstrated that Tsallis entropy (q-entropy) and Kaniadakis entropy (κ-entropy) can be obtained in this way, based on suitable complexity measures. A certain unifying effect is obtained by embedding these measures in a two-parameter family of entropy functions.
Directory of Open Access Journals (Sweden)
Omar S. Gómez
2017-07-01
Background: Common approaches to software verification include static testing techniques, such as code reading, and dynamic testing techniques, such as black-box and white-box testing. Objective: With the aim of gaining a better understanding of software testing techniques, a controlled experiment replication and the synthesis of previous experiments which examine the efficiency of code reading, black-box and white-box testing techniques were conducted. Method: The replication reported here is composed of four experiments in which instrumented programs were used. Participants randomly applied one of the techniques to one of the instrumented programs. The outcomes were synthesized with seven experiments using the method of network meta-analysis (NMA). Results: No significant differences in the efficiency of the techniques were observed. However, it was discovered that the instrumented programs had a significant effect on the efficiency. The NMA results suggest that the black-box and white-box techniques behave alike, and that the efficiency of code reading seems to be sensitive to other factors. Conclusion: Taking these findings into account, the authors suggest that prior to carrying out software verification activities, software engineers should have a clear understanding of the software product to be verified; they can apply either black-box or white-box testing techniques, as they yield similar defect detection rates.
DEFF Research Database (Denmark)
Hansen, Britt Rosendahl; Kuhn, Luise Theil; Bahl, Christian Robert Haffenden
2010-01-01
Some manifestations of magnetism are well-known and utilized on an everyday basis, e.g. using a refrigerator magnet for hanging that important note on the refrigerator door. Others are, so far, more exotic, such as cooling by making use of the magnetocaloric effect. This effect can cause a change...... in the temperature of a magnetic material when a magnetic field is applied or removed. For many years, experimentalists have made use of dilute paramagnetic materials to achieve milliKelvin temperatures by use of the magnetocaloric effect. Also, research is done on materials, which might be used for hydrogen, helium...... the effect: the isothermal magnetic entropy change and the adiabatic temperature change. Some of the manifestations and utilizations of the MCE will be touched upon in a general way and finally I will talk about the results I have obtained on a sample of Gadolinium Iron Garnet (GdIG, Gd3Fe5O12), which
NUCLEATION AND ENTROPY COMPENSATION IN BIOLOGICAL ASSEMBLY
Directory of Open Access Journals (Sweden)
Frank A. Ferrone
2012-12-01
The assembly of molecules from solution into larger aggregates de-activates their independent rotational and translational motion, which would represent an insuperable penalty in free energy without a compensatory mechanism for regaining at least some of the lost entropy. Such compensation is provided by the internal rigid-body motion of molecules in protein aggregates such as polymers and crystals. While the concepts behind the contributions of these entropic elements, known as vibrational entropy, are not new, the magnitude of the effects is little appreciated. Based on extensive experiments on sickle cell hemoglobin polymerization, we present examples showing the magnitude of the effects and the role they play in explaining such things as the rapid assembly of fibers compared with crystals. While the examples are drawn from sickle hemoglobin, the principles and applications of the concepts are quite general.
Psychoacoustic entropy theory and its implications for performance practice
Strohman, Gregory J.
However, the limited scope of most harmonic systems used for Western common-practice music greatly simplifies the necessary level of mathematical detail. Psychoacoustic entropy theory requires a great deal more mathematical complexity due to its sheer scope as a generalized theory of musical harmony. Fortunately, under specific assumptions the theory can take on vastly simpler forms. Psychoacoustic entropy theory appears to be highly compatible with the latest scientific research in psychoacoustics. However, the theory itself should be regarded as a hypothesis, and this dissertation an experiment in progress. The evaluation of psychoacoustic entropy theory as a scientific theory of human sonic perception must wait for more rigorous future research.
Population entropies estimates of proteins
Low, Wai Yee
2017-05-01
The Shannon entropy equation provides a way to estimate the variability of amino acid sequences in a multiple sequence alignment of proteins. Knowledge of protein variability is useful in many areas such as vaccine design, identification of antibody binding sites, and exploration of protein 3D structural properties. In cases where the population entropies of a protein are of interest but only a small sample size can be obtained, a method based on linear regression and random subsampling can be used to estimate the population entropy. This method is useful for comparisons of entropies where the actual sequence counts differ and thus correction for alignment size bias is needed. In the current work, an R-based package named EntropyCorrect that enables estimation of population entropy is presented, and an empirical study on how well this new algorithm performs on simulated datasets of various combinations of population and sample sizes is discussed. The package is available at https://github.com/lloydlow/EntropyCorrect. This article, which was originally published online on 12 May 2017, contained an error in Eq. (1), where the summation sign was missing. The corrected equation appears in the Corrigendum attached to the pdf.
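The per-column Shannon entropy underlying such estimates can be sketched as follows. This is a minimal Python illustration of the basic quantity, not the EntropyCorrect R package itself, and it omits the regression-based small-sample correction the paper describes:

```python
import math
from collections import Counter

def column_entropy(column):
    """Shannon entropy (in bits) of one alignment column."""
    n = len(column)
    counts = Counter(column)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def alignment_entropies(alignment):
    """Per-column entropies of a list of equal-length aligned sequences."""
    return [column_entropy(col) for col in zip(*alignment)]
```

A fully conserved column gives 0 bits; a column where four sequences show four different residues gives 2 bits.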
A highly efficient neutron time-of-flight detector for inertial confinement fusion experiments
Izumi, N.; Yamaguchi, K.; Yamagajo, T.; Nakano, T.; Kasai, T.; Urano, T.; Azechi, H.; Nakai, S.; Iida, T.
1999-01-01
We have developed the highly efficient neutron detector system MANDALA for inertial-confinement-fusion experiments. The MANDALA system consists of 842 plastic scintillation detector elements and data acquisition electronics. The detection level is a yield of 1.2×10^5 for 2.5 MeV neutrons and 1×10^5 for 14.1 MeV neutrons (with 100 detected hits). We have calibrated the intrinsic detection efficiencies of the detector elements using a neutron generator facility. Timing calibration and an integrity test of the system were also carried out with a 60Co γ-ray source. The MANDALA system was applied to implosion experiments at the GEKKO XII laser facility, and its integrity was verified in those experiments.
Energy efficiency in existing detached housing. Danish experiences with different policy instruments
Energy Technology Data Exchange (ETDEWEB)
Gram-Hanssen, K.; Haunstrup Christensen, T. (Aalborg Univ., Danish Building Research Institute, Hoersholm (Denmark))
2011-07-01
This report contains a memo written as an input to the German project Enef-haus on energy-efficient restoration of single-family houses in Germany. The memo contains a summary of the Danish experiences divided into three main sections: first is a short historic overview of the Danish energy policy indicating when different relevant instruments have been introduced to increase the energy efficiency of privately owned single-family houses. Second is a short introduction to the Danish housing sector and its energy supplies. The third and main part of the report is an examination of the most recent and relevant instruments concluding both on the results concerning the impact of the instruments especially on owners of single-family houses and on more general experiences with their implementation. Finally the memo concludes on the general lessons that can be learned from the Danish experiences. (Author)
Bubble Entropy: An Entropy Almost Free of Parameters.
Manis, George; Aktaruzzaman, Md; Sassi, Roberto
Objective: A critical point in any definition of entropy is the selection of the parameters employed to obtain an estimate in practice. We propose a new definition of entropy aiming to reduce the significance of this selection. Methods: We call the new definition Bubble Entropy. Bubble Entropy is based on permutation entropy, where the vectors in the embedding space are ranked. We use the bubble sort algorithm for the ordering procedure and count instead the number of swaps performed for each vector. Doing so, we create a more coarse-grained distribution and then compute the entropy of this distribution. Results: Experimental results with both real and synthetic HRV signals showed that bubble entropy presents remarkable stability and exhibits increased descriptive and discriminating power compared to all other definitions, including the most popular ones. Conclusion: The definition proposed is almost free of parameters. The most common ones are the scale factor r and the embedding dimension m. In our definition, the scale factor is totally eliminated and the importance of m is significantly reduced. The proposed method presents increased stability and discriminating power. Significance: After the extensive use of some entropy measures in physiological signals, typical values for their parameters have been suggested, or at least, widely used. However, the parameters are still there, application and dataset dependent, influencing the computed value and affecting the descriptive power. Reducing their significance or eliminating them alleviates the problem, decoupling the method from the data and the application, and eliminating subjective factors.
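The core idea — replace each embedded vector by the number of bubble-sort swaps needed to order it, then measure the entropy of the resulting coarse-grained distribution — can be sketched as follows. This is an illustrative sketch using the Shannon entropy of the swap counts; the published definition combines entropies at embedding dimensions m and m+1 with a specific normalization:

```python
import math
from collections import Counter

def bubble_swaps(v):
    """Number of swaps bubble sort performs to order v ascending."""
    v = list(v)
    swaps = 0
    for i in range(len(v)):
        for j in range(len(v) - 1 - i):
            if v[j] > v[j + 1]:
                v[j], v[j + 1] = v[j + 1], v[j]
                swaps += 1
    return swaps

def swap_entropy(series, m):
    """Shannon entropy (nats) of the swap-count distribution over
    all length-m embedded vectors of the series."""
    counts = Counter(bubble_swaps(series[i:i + m])
                     for i in range(len(series) - m + 1))
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())
```

Note that no tolerance parameter r appears anywhere: only the embedding dimension m survives, which is the point of the method.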
The concept of entropy. Relation between action and entropy
Directory of Open Access Journals (Sweden)
J.-P.Badiali
2005-01-01
The Boltzmann expression for entropy represents the traditional link between thermodynamics and statistical mechanics. New theoretical developments like the Unruh effect or the black hole theory suggest a new definition of entropy. In this paper we consider the thermodynamics of black holes as seriously founded and we try to see what we can learn from it in the case of ordinary systems for which a pre-relativistic description is sufficient. We introduce a space-time model and a new definition of entropy, considering the thermal equilibrium from a dynamic point of view. Then we show that for black holes and ordinary systems we have the same relation connecting a change of entropy to a change of action.
Entanglement entropy converges to classical entropy around periodic orbits
Energy Technology Data Exchange (ETDEWEB)
Asplund, Curtis T., E-mail: ca2621@columbia.edu [Department of Physics, Columbia University, 538 West 120th Street, New York, NY 10027 (United States); Berenstein, David, E-mail: dberens@physics.ucsb.edu [Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Wilberforce Road, Cambridge CB3 0WA (United Kingdom)
2016-03-15
We consider oscillators evolving subject to a periodic driving force that dynamically entangles them, and argue that this gives the linearized evolution around periodic orbits in a general chaotic Hamiltonian dynamical system. We show that the entanglement entropy, after tracing over half of the oscillators, generically asymptotes to linear growth at a rate given by the sum of the positive Lyapunov exponents of the system. These exponents give a classical entropy growth rate, in the sense of Kolmogorov, Sinai and Pesin. We also calculate the dependence of this entropy on linear mixtures of the oscillator Hilbert-space factors, to investigate the dependence of the entanglement entropy on the choice of coarse graining. We find that for almost all choices the asymptotic growth rate is the same.
Interval Entropy and Informative Distance
Directory of Open Access Journals (Sweden)
Fakhroddin Misagh
2012-03-01
The Shannon interval entropy function, a useful dynamic measure of uncertainty for two-sided truncated random variables, has been proposed in the reliability literature. In this paper, we show that the interval entropy can uniquely determine the distribution function. Furthermore, we propose a measure of discrepancy between two lifetime distributions over an interval of time, based on Kullback-Leibler discrimination information. We study various properties of this measure, including its connection with residual and past measures of discrepancy and with interval entropy, and we obtain its upper and lower bounds.
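For a random variable X with density f and distribution function F, doubly truncated to an interval (t_1, t_2), the interval entropy takes the following standard form from the reliability literature (the notation here is ours, not necessarily the paper's):

```latex
IH_X(t_1, t_2)
  = -\int_{t_1}^{t_2}
      \frac{f(x)}{F(t_2) - F(t_1)}
      \,\log\!\left( \frac{f(x)}{F(t_2) - F(t_1)} \right) dx
```

It is the Shannon entropy of the conditional density of X given t_1 < X < t_2, which reduces to the residual entropy as t_2 → ∞ and the past entropy as t_1 → 0.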
Maximizing entropy over Markov processes
DEFF Research Database (Denmark)
Biondi, Fabrizio; Legay, Axel; Nielsen, Bo Friis
2014-01-01
computation reduces to finding a model of a specification with highest entropy. Entropy maximization for probabilistic process specifications has not been studied before, even though it is well known in Bayesian inference for discrete distributions. We give a characterization of global entropy of a process...... to use Interval Markov Chains to model abstractions of deterministic systems with confidential data, and use the above results to compute their channel capacity. These results are a foundation for ongoing work on computing channel capacity for abstractions of programs derived from code. © 2014 Elsevier...
Gravitational entropy of cosmic expansion
Sussman, Roberto A
2014-01-01
We apply a recent proposal to define "gravitational entropy" to the expansion of cosmic voids within the framework of non-perturbative General Relativity. By considering CDM void configurations compatible with basic observational constraints, we show that this entropy grows from post-inflationary conditions towards a final asymptotic value in a late-time fully non-linear regime described by the Lemaître-Tolman-Bondi (LTB) dust models. A qualitatively analogous behavior occurs if we assume a positive cosmological constant consistent with a Λ-CDM background model. However, the Λ term introduces a significant suppression of entropy growth, with the terminal equilibrium value reached at a much faster rate.
Energy Technology Data Exchange (ETDEWEB)
Price, Lynn; Galitsky, Christina; Sinton, Jonathan; Worrell,Ernst; Graus, Wina
2005-09-15
The Energy Foundation's China Sustainable Energy Program (CSEP) has undertaken a major project investigating fiscal and tax policy options for stimulating energy efficiency and renewable energy development in China. This report, which is part of the sectoral sub-project studies on energy efficiency in industry, surveys international experience with tax and fiscal policies directed toward increasing investments in energy efficiency in the industrial sector. The report begins with an overview of tax and fiscal policies, including descriptions and evaluations of programs that use energy or energy-related carbon dioxide (CO2) taxes, pollution levies, public benefit charges, grants or subsidies, subsidized audits, loans, tax relief for specific technologies, and tax relief as part of an energy or greenhouse gas (GHG) emission tax or agreement scheme. Following the discussion of these individual policies, the report reviews experience with integrated programs found in two countries as well as with GHG emissions trading programs. The report concludes with a discussion of the best practices related to international experience with tax and fiscal policies to encourage investment in energy efficiency in industry.
Directory of Open Access Journals (Sweden)
Xiaoyan Liu
2015-01-01
Many studies have examined individual factors affecting the thermal (optical-thermal) efficiency of a parabolic trough solar collector, but the available work is limited to one or two factors at a time. The aim of this paper is to investigate the effect of multiple factors on the system's efficiency in a cold climate region. Taking climatic performance into account, the average outlet temperature of an LS-2 collector has been simulated successfully by coupling the SolTrace software with CFD software. The effects of different factors on instantaneous efficiency have been determined by an orthogonal experiment and single-factor experiments, yielding a clear ranking of how strongly each factor influences instantaneous collector efficiency. The results show that, ordered by average maximal deviation, the factors rank as follows: inlet temperature, solar radiation intensity, diameter, flow rate, condensation area, pipe length, and ambient temperature. These encouraging results provide a reference for the exploitation and utilization of parabolic trough solar collectors in cold climate regions.
A High-Efficiency and High-Resolution Straw Tube Tracker for the LHCb Experiment
Tuning, Niels
2005-01-01
The Outer Tracker detector for the LHCb experiment at CERN will provide accurate position information on the charged particles in B-decays. It is crucial to detect these particles accurately and efficiently in the high-density particle environment of the LHC. For this, the Outer Tracker is being constructed, consisting of ~55,000 straw tubes, covering in total an area of 360 m² of double layers. At present, approximately 90% of the detector has been constructed and fully tested. In addition, a beam test has been performed at DESY, Hamburg, to validate the final read-out electronics in terms of efficiency, position resolution, noise and cross talk.
Entropy exchange for infinite-dimensional systems.
Duan, Zhoubo; Hou, Jinchuan
2017-02-06
In this paper, the entropy exchange for channels and states in infinite-dimensional systems is defined and studied. It is shown that this entropy exchange depends only on the given channel and the state. An explicit expression for the entropy exchange in terms of the state and the channel is proposed. The generalized Klein's inequality, the subadditivity and the triangle inequality for the entropy, including infinite entropy for infinite-dimensional systems, are established and then applied to compare the entropy exchange with the entropy change.
Shaking the entropy out of a lattice
DEFF Research Database (Denmark)
C. Tichy, Malte; Mølmer, Klaus; F. Sherson, Jacob
2012-01-01
We present a simple and efficient scheme to reduce atom-number fluctuations in optical lattices. The interaction-energy difference for atoms in different vibrational states is used to remove excess atomic occupation. The remaining vacant sites are then filled with atoms by merging adjacent wells......, for which we implement a protocol that circumvents the constraints of unitarity. The preparation of large regions with precisely one atom per lattice site is discussed for both bosons and fermions. The resulting low-entropy Mott-insulating states may serve as high-fidelity register states for quantum...
Minimum Entropy Rate Simplification of Stochastic Processes.
Henter, Gustav Eje; Kleijn, W Bastiaan
2016-12-01
We propose minimum entropy rate simplification (MERS), an information-theoretic, parameterization-independent framework for simplifying generative models of stochastic processes. Applications include improving model quality for sampling tasks by concentrating the probability mass on the most characteristic and accurately described behaviors while de-emphasizing the tails, and obtaining clean models from corrupted data (nonparametric denoising). This is the opposite of the smoothing step commonly applied to classification models. Drawing on rate-distortion theory, MERS seeks the minimum entropy-rate process under a constraint on the dissimilarity between the original and simplified processes. We particularly investigate the Kullback-Leibler divergence rate as a dissimilarity measure, where, compatible with our assumption that the starting model is disturbed or inaccurate, the simplification rather than the starting model is used for the reference distribution of the divergence. This leads to analytic solutions for stationary and ergodic Gaussian processes and Markov chains. The same formulas are also valid for maximum-entropy smoothing under the same divergence constraint. In experiments, MERS successfully simplifies and denoises models from audio, text, speech, and meteorology.
Quantile based Tsallis entropy in residual lifetime
Khammar, A. H.; Jahanshahi, S. M. A.
2018-02-01
Tsallis entropy is a one-parameter generalization, of order α, of the Shannon entropy, and unlike the Shannon entropy it is nonadditive. The (differential) Shannon entropy may be negative for some distributions, but the Tsallis entropy can always be made nonnegative by choosing an appropriate value of α. In this paper, we derive the quantile form of this nonadditive entropy in the residual lifetime, namely the residual quantile Tsallis entropy (RQTE), and obtain bounds for it in terms of Rényi's residual quantile entropy. We also obtain a relationship between the RQTE and the proportional hazards model in the quantile setup. Based on the new measure, we propose a stochastic order and aging classes, and study their properties. Finally, we prove characterization theorems for some well-known lifetime distributions. It is shown that, unlike the residual Tsallis entropy, the RQTE uniquely determines the parent distribution.
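As background to the abstract above, the basic discrete Tsallis entropy and its Shannon limit can be sketched numerically (a minimal illustration of the underlying quantity only, not the authors' quantile-based residual version):

```python
import numpy as np

def tsallis_entropy(p, alpha):
    """Tsallis entropy S_alpha = (1 - sum_i p_i**alpha) / (alpha - 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # zero-probability outcomes contribute nothing
    if abs(alpha - 1.0) < 1e-12:      # Shannon limit as alpha -> 1
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** alpha)) / (alpha - 1.0))

uniform = [0.25, 0.25, 0.25, 0.25]
print(tsallis_entropy(uniform, 2.0))  # (1 - 4/16) / 1 = 0.75
print(tsallis_entropy(uniform, 1.0))  # Shannon limit: ln 4 ≈ 1.386
```

For a discrete distribution the quantity is nonnegative for all α > 0; the α-dependence is what the paper exploits in the quantile/residual-lifetime setting.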
Linearity of holographic entanglement entropy
National Research Council Canada - National Science Library
Almheiri, Ahmed; Dong, Xi; Swingle, Brian
2017-01-01
We consider the question of whether the leading contribution to the entanglement entropy in holographic CFTs is truly given by the expectation value of a linear operator as is suggested by the Ryu-Takayanagi formula...
Scaling behaviour of entropy estimates
Schürmann, Thomas
2002-02-01
Entropy estimation of information sources is highly non-trivial for symbol sequences with strong long-range correlations. The rabbit sequence, related to the symbolic dynamics of the nonlinear circle map at the critical point as well as the logistic map at the Feigenbaum point, is known to produce long memory tails. For both dynamical systems, the block entropy of order n has been shown to grow as log n. In contrast to such probabilistic concepts, we investigate the scaling behaviour of certain non-probabilistic entropy estimation schemes suggested by Lempel and Ziv (LZ) in the context of algorithmic complexity and data compression. These are applied in a sequential manner, with the scaling variable being the length N of the sequence. We determine the scaling law for the LZ entropy estimate applied to the critical circle map and the logistic map at the Feigenbaum point in a binary partition.
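A rough sketch of this kind of estimator (our own simplified LZ76-style parsing, with function names of our choosing) counts parsed phrases c(N) and forms the estimate h ≈ c(N) log2(N)/N. The rabbit (Fibonacci) word, generated by the substitution 1 → 10, 0 → 1, then yields a far smaller estimate than a fair-coin sequence:

```python
import random
from math import log2

def rabbit_word(n):
    """Prefix of length n of the rabbit (Fibonacci) word: 1 -> 10, 0 -> 1."""
    s = "1"
    while len(s) < n:
        s = "".join("10" if ch == "1" else "1" for ch in s)
    return s[:n]

def lz76_phrases(s):
    """Number of phrases in a Lempel-Ziv (1976) style sequential parsing."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow the phrase while it already occurs earlier in the sequence
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def lz_entropy_rate(s):
    """Entropy-rate estimate (bits/symbol) from the phrase count."""
    n = len(s)
    return lz76_phrases(s) * log2(n) / n

random.seed(0)
coin = "".join(random.choice("01") for _ in range(4096))
print(lz_entropy_rate(rabbit_word(4096)))  # small: strong long-range order
print(lz_entropy_rate(coin))               # near 1 bit/symbol
```

For the deterministic rabbit word the phrase count grows very slowly, so the estimate tends to zero with N, whereas for coin flips it approaches 1 bit per symbol; the paper's subject is precisely how fast such estimates converge for sequences with long memory.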
African Journals Online (AJOL)
2009-10-05
RE3 and SE3 respectively). Surgery was then allowed to proceed and sevoflurane concentrations and fresh gas flows were adjusted as necessary. The following calculations were performed: • The absolute changes in entropy ...
Entanglement entropy in flat holography
Jiang, Hongliang; Song, Wei; Wen, Qiang
2017-07-01
BMS symmetry, which is the asymptotic symmetry at null infinity of flat spacetime, is an important input for flat holography. In this paper, we give a holographic calculation of entanglement entropy and Rényi entropy in three-dimensional Einstein gravity and Topologically Massive Gravity. The geometric picture for the entanglement entropy is the length of a spacelike geodesic connected to the interval at null infinity by two null geodesics. The spacelike geodesic is the set of fixed points of the replica symmetry, and the null geodesics lie along the modular flow. Our strategy is to first reformulate the Rindler method for calculating entanglement entropy in a general setup, then apply it to BMS-invariant field theories, and finally extend the calculation to the bulk.
Quantum entropy and special relativity.
Peres, Asher; Scudo, Petra F; Terno, Daniel R
2002-06-10
We consider a single free spin-1/2 particle. The reduced density matrix for its spin is not covariant under Lorentz transformations. The spin entropy is not a relativistic scalar and has no invariant meaning.
Holographic avatars of entanglement entropy
Energy Technology Data Exchange (ETDEWEB)
Barbon, J.L.F. [Instituto de Fisica Teorica IFT UAM/CSIC, Ciudad Universitaria de Cantoblanco 28049, Madrid (Spain)
2009-07-15
This is a rendering of the blackboard lectures at the 2008 Cargese summer school, discussing some elementary facts regarding the application of AdS/CFT techniques to the computation of entanglement entropy in strongly coupled systems. We emphasize the situations where extensivity of the entanglement entropy can be used as a crucial criterion to characterize either nontrivial dynamical phenomena at large length scales, or nonlocality in the short-distance realm.
Configurational entropy of glueball states
Directory of Open Access Journals (Sweden)
Alex E. Bernardini
2017-02-01
The configurational entropy of glueball states is calculated using a holographic description. Glueball states are represented by a supergravity dual picture, consisting of a 5-dimensional graviton–dilaton action of a dynamical holographic AdS/QCD model. The configurational entropy is studied as a function of the glueball spin and of the mass, providing information about the stability of the glueball states.
GEM detector performance and efficiency in Proton Charge Radius (PRad) Experiment
Bai, Xinzhan; PRad Collaboration
2017-09-01
The PRad experiment (E12-11-106) was performed in 2016 at Jefferson Lab in Hall B. It aims to investigate the proton charge radius puzzle through the electron-proton elastic scattering process. The experiment used a non-magnetic spectrometer method and reached very small ep scattering angles, and thus an unprecedentedly small four-momentum transfer squared region, Q^2 from 2×10^-4 to 0.06 (GeV/c)^2. The PRad experiment was designed to measure the proton charge radius to sub-percent precision. Gas Electron Multiplier (GEM) detectors contributed to reaching this experimental goal. A pair of large-area GEM detectors and a large-acceptance, high-resolution calorimeter (HyCal) were utilized in the experiment to detect the scattered electrons. The precision requirements of the experiment demand a highly accurate understanding of the efficiency and stability of the GEM detectors. In this talk, we will present preliminary results on the performance and efficiency of the GEM detectors. This work is supported in part by NSF MRI award PHY-1229153, the U.S. Department of Energy under Contracts No. DE-FG02-07ER41528 and No. DE-FG02-03ER41240, and Thomas Jefferson National Laboratory.
Entropy Production in Chemical Reactors
Kingston, Diego; Razzitte, Adrián C.
2017-06-01
We have analyzed entropy production in chemically reacting systems and extended previous results to the two limiting cases of ideal reactors, namely the continuous stirred tank reactor (CSTR) and the plug flow reactor (PFR). We have found upper and lower bounds for the entropy production in isothermal systems, given expressions for non-isothermal operation, and analyzed the influence of pressure and temperature on entropy generation minimization in reactors with fixed volume and production. We also give a graphical picture of entropy production in chemical reactions at constant volume, which allows different options to be assessed easily. We show that by dividing a reactor into two smaller ones operating at different temperatures, the entropy production is lowered, by as much as 48% in the case of a CSTR and PFR in series, and by 58% with two CSTRs. Finally, we study the optimal pressure and temperature for a single isothermal PFR, taking into account the irreversibility introduced by a compressor and a heat exchanger, decreasing the entropy generation by as much as 30%.
Crystal Collimation efficiency measured with the Medipix detector in SPS UA9 experiment.
Laface, E; Tlustos, L; Ippolito, V
2010-01-01
The UA9 experiment was performed in six machine development (MD) runs from May to November 2009 with the goal of studying the collimation properties of a crystal in view of a future exploitation in the LHC collimation system. An important parameter evaluated for the characterization of crystal collimation is the efficiency of halo extraction when the crystal is in channeling mode. In this paper we explain how this efficiency can be measured using a pixel detector, the Medipix, installed in the Roman Pot of UA9. The number of extracted particles counted by the Medipix is compared with the total number of circulating particles measured by the Beam Current Transformers (BCTs): from this comparison, the efficiency of the system composed of the crystal, used in channeling mode, and a tungsten absorber is shown to be greater than 85%.
CHANTI: a Fast and Efficient Charged Particle Veto Detector for the NA62 Experiment at CERN
INSPIRE-00293636; Capussela, T.; Di Filippo, D.; Massarotti, P.; Mirra, M.; Napolitano, M.; Palladino, V.; Saracino, G.; Roscilli, L.; Vanzanella, A.; Corradi, G.; Tagnani, D.; Paglia, U.
2016-03-29
The design, construction and test of a charged particle detector made of scintillation counters read by Silicon Photomultipliers (SiPM) is described. The detector, which operates in vacuum and is used as a veto counter in the NA62 experiment at CERN, has a single channel time resolution of 1.14 ns, a spatial resolution of ~2.5 mm and an efficiency very close to 1 for penetrating charged particles.
Monitoring the depth of anesthesia from rat EEG using modified Shannon entropy analysis.
Yoon, Young-Gyu; Kim, Tae-Ho; Jeong, Dae-Woong; Park, Sang-Hyun
2011-01-01
In this paper, an entropy-based method for quantifying the depth of anesthesia from rat EEG is presented. The proposed index for the depth of anesthesia, called modified Shannon entropy (MShEn), is based on the Shannon entropy (ShEn) and spectral entropy (SpEn), which are widely used for analyzing non-stationary signals. The discrimination power (DP), a performance indicator for indexes, is defined and used to derive the final index for the depth of anesthesia. In experiments, EEG signals from anesthetized rats were measured and analyzed using MShEn. MShEn shows both high stability and high correlation with other indexes of the depth of anesthesia.
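The SpEn ingredient mentioned above is the Shannon entropy applied to a normalized power spectrum; a generic sketch (our own simplified variant, not the authors' MShEn) is:

```python
import numpy as np

def spectral_entropy(x):
    """Normalized Shannon entropy of the power spectrum of a real signal x:
    near 0 for a pure on-bin tone, close to 1 for white noise."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]                                   # drop empty bins
    return float(-np.sum(p * np.log(p)) / np.log(len(psd)))

fs = 256.0
t = np.arange(1024) / fs
print(spectral_entropy(np.sin(2 * np.pi * 10 * t)))   # near 0: one dominant bin
rng = np.random.default_rng(0)
print(spectral_entropy(rng.standard_normal(1024)))    # near 1: flat spectrum
```

In an anesthesia-monitoring context, deeper anesthesia flattens and slows the EEG, which shifts such an index; the paper's MShEn modifies this basic recipe.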
Generalized Maximum Entropy Analysis of the Linear Simultaneous Equations Model
Directory of Open Access Journals (Sweden)
Thomas L. Marsh
2014-02-01
A generalized maximum entropy estimator is developed for the linear simultaneous equations model. Monte Carlo sampling experiments are used to evaluate the estimator's performance in small and medium sized samples, suggesting contexts in which the current generalized maximum entropy estimator is superior in mean square error to two and three stage least squares. Analytical results are provided relating to asymptotic properties of the estimator and associated hypothesis testing statistics. Monte Carlo experiments are also used to provide evidence on the power and size of test statistics. An empirical application is included to demonstrate the practical implementation of the estimator.
Entropy of liquid water from ab initio molecular dynamics.
Zhang, Cui; Spanu, Leonardo; Galli, Giulia
2011-12-08
We have computed the entropy of liquid water using a two-phase thermodynamic model and trajectories generated by ab initio molecular dynamics simulations. We present the results obtained with semilocal, hybrid, and van der Waals density functionals. We show that in all cases, at the experimental equilibrium density and at temperatures in the vicinity of 300 K, the computed entropies are underestimated, with respect to experiment, and the liquid exhibits a degree of tetrahedral order higher than in experiments. The underestimate is more severe for the PBE and PBE0 functionals than for several van der Waals functionals.
Numerical study of effect of oxygen fraction on local entropy ...
Indian Academy of Sciences (India)
process in order to achieve higher combustion efficiency with a uniform exit temperature and allowable linear ... processes are accompanied by an irreversible increase in entropy, which leads to a decrease in exergy (available ... swirling jet impingement on an adiabatic wall (Shuja & Yilbas 2001b; 2003) and an imping-.
An entropy approach to size and variance heterogeneity
Balasubramanyan, L.; Stefanou, S.E.; Stokes, J.R.
2012-01-01
In this paper, we investigate the effect of bank size differences on cost efficiency heterogeneity using a heteroskedastic stochastic frontier model. This model is implemented using an information-theoretic maximum entropy approach. We explicitly model both bank size and variance heterogeneity.
Does Electromagnetic Radiation Generate Entropy? The Carnot Cycle Revisited
Bligh, Bernard R.
2010-06-01
The thermodynamics of radiation is treated in some textbooks, which give the entropy of radiation as (4/3)aT^3V, where T and V are, respectively, the temperature and volume of a cavity at equilibrium with the radiation. Poynting and Thomson (1911) go through the exercise of putting a "photon gas" through a Carnot cycle, the only book I have found to do so, but there is an error in their calculation. One purpose of this paper is to correct that mistake and, by extension, to work rigorously through the Carnot cycle, because present-day students and scientists wanting to study the Carnot cycle are unlikely to find that textbook. Feynman also dealt with the thermodynamics of radiation, but his approach differs from that of other authors. Although the thermodynamics of radiation superficially appears to be self-consistent, there are some queries which need to be exposed. Firstly, the unit of the entropy of radiation is Joule/K, which differs from the usual unit of entropy, namely Joule/mole-K. Secondly, entropy in matter relates to atoms and molecules exchanging energy endlessly and randomly, but this is not true for photons. Thirdly, it is possible to do practical experiments in order to estimate numerically the entropy of substances or entropy changes in reactions, but for radiation, are there any experiments which can produce numerical values for the entropy? The importance of this study is that some cosmologists state that, according to the Hot Big Bang Theory, the Universe expanded isentropically and that radiation went through this isentropic process. Objections are raised against this part of the theory.
Minimum entropy density method for the time series analysis
Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae
2009-01-01
The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes of physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard & Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.
The transported entropy of Na+ in solid state cryolite
Sharivker, V. S.; Ratkje, S. Kjelstrup
1996-10-01
The transported entropy of Na+ in mixtures of NaF(s) and Na3AlF6(s) is determined from thermocell experiments. The experiments were favorably described by the electric work method. The variation observed in the thermocell electromotive force (emf) with composition can be explained from the probable path of charge transfer in the electrolyte. The transported entropies are S*(Na+) = 140 ± 7 J K^-1 mol^-1 in cryolite and S*(Na+) = 81 ± 8 J K^-1 mol^-1 in sodium fluoride between 380 °C and 500 °C. The value obtained for sodium in solid cryolite makes us predict that the transported entropy of Na+ in the molten electrolyte mixture for aluminum production is substantial and that the reversible heat effects in the aluminum electrolysis cell are the same.
Shannon Entropy-Based Prediction of Solar Cycle 25
Kakad, Bharati; Kakad, Amar; Ramesh, Durbha Sai
2017-07-01
A new model is proposed to forecast the peak sunspot activity of the upcoming solar cycle (SC) using Shannon entropy estimates from the declining phase of the preceding SC. Daily and monthly smoothed international sunspot numbers are used in the present study. The Shannon entropy measures the inherent randomness in the SC and is found to vary with the phase of an SC as it progresses. In this model, each SC of length T_cy is divided into five equal parts of duration T_cy/5. Each part is considered one phase, and the phases are sequentially termed P1, P2, P3, P4, and P5. The Shannon entropy estimates for each of these five phases are obtained for the nth SC for n = 10 to 23. We find that the Shannon entropy during the ending phase (P5) of the nth SC can be used efficiently to predict the peak smoothed sunspot number of the (n+1)th SC, S_max^(n+1). The prediction equation derived in this study has a good correlation coefficient of 0.94. A noticeable decrease in entropy, from 4.66 to 3.89, is encountered between P5 of SC 22 and P5 of SC 23. The entropy value for P5 of the present SC 24 is not yet available, as the cycle has not yet ended. However, if we assume that the fall in entropy continues for SC 24 at the same rate as for SC 23, then we predict a peak smoothed sunspot number of 63 ± 11.3 for SC 25. It is suggested that the upcoming SC 25 will be significantly weaker and comparable to the solar activity observed during the Dalton minimum in the past.
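The phase-wise Shannon entropy described above can be sketched with a histogram estimator; the bin count, the synthetic gamma-distributed stand-in for daily sunspot numbers, and the cycle length are our illustrative assumptions, not values from the paper:

```python
import numpy as np

def shannon_entropy(values, bins=16):
    """Histogram estimate of the Shannon entropy (nats) of a sample."""
    counts, _ = np.histogram(values, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log(p)))

# split one synthetic "cycle" of daily numbers into five equal phases P1..P5
rng = np.random.default_rng(1)
cycle = rng.gamma(2.0, 40.0, size=4000)   # stand-in for daily sunspot numbers
T = len(cycle) // 5
phase_entropy = [shannon_entropy(cycle[k * T:(k + 1) * T]) for k in range(5)]
print(phase_entropy)   # one entropy value per phase; the paper regresses on P5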
Friction, Free Axes of Rotation and Entropy
Directory of Open Access Journals (Sweden)
Alexander Kazachkov
2017-03-01
Friction forces acting on rotators may promote their alignment and thereby eliminate degrees of freedom in their movement. The alignment of rotators by friction was demonstrated in experiments performed with different spinners, showing how friction generates negentropy in a system of rotators. A gas of rigid rotators influenced by friction is considered. The orientational negentropy generated by the friction force was estimated with the Sackur-Tetrode equation. The minimal change in the total entropy of a system of rotators, corresponding to their eventual alignment, decreases with temperature. The reported effect may be of primary importance for the phase equilibrium and motion of ubiquitous colloidal and granular systems.
Numerical Stability of Generalized Entropies
Steinbrecher, György
2016-01-01
In many applications, the probability density function is subject to experimental errors. In this work, the continuous dependence of a class of generalized entropies on the experimental errors is studied. This class includes the Shannon, Tsallis, Rényi and generalized Rényi entropies. By using the connection between the Rényi or Tsallis entropies and the "distance" in a family of metric functional spaces, a family that includes the Lebesgue normed vector spaces, we introduce a further generalization of the Rényi entropy. In this work we suppose that the experimental error is measured by some $L^{p}$ norm. In line with the methodology normally used for treating so-called "ill-posed problems", auxiliary stabilizing conditions are determined such that small (in the sense of the $L^{p}$ metric) experimental errors provoke small variations of the classical and generalized entropies. These stabilizing conditions are formulated in terms of the $L^{p}$ metric in a class of generalized $L^{p}$ spaces...
Optimizing streamflow monitoring networks using joint permutation entropy
Stosic, Tatijana; Stosic, Borko; Singh, Vijay P.
2017-09-01
Using joint permutation entropy we address the issue of minimizing the cost of monitoring, while minimizing redundancy of the information content, of daily streamflow data recorded during the period 1989-2016 at twelve gauging stations on Brazos River, Texas, USA. While the conventional entropy measures take into account only the probability of occurrence of a given set of events, permutation entropy also takes into account local ordering of the sequential values, thus enriching the analysis. We find that the best cost efficiency is achieved by performing weekly measurements, in comparison with which daily measurements exhibit information redundancy, and monthly measurements imply information loss. We also find that the cumulative information redundancy of the twelve considered stations is over 10% for the observed period, and that the number of monitoring stations can be reduced by half bringing the cumulative redundancy level to less than 1%.
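Single-series permutation entropy (the Bandt-Pompe ordinal-pattern construction that the joint version extends to pairs of stations) can be sketched as follows; parameter defaults are illustrative, not the paper's:

```python
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy: Shannon entropy (bits) of the
    distribution of ordinal patterns, divided by log2(order!)."""
    x = np.asarray(x)
    n = len(x) - (order - 1) * delay
    patterns = Counter(
        tuple(np.argsort(x[i:i + (order - 1) * delay + 1:delay]))
        for i in range(n)
    )
    p = np.array(list(patterns.values()), dtype=float) / n
    return float(-np.sum(p * np.log2(p)) / math.log2(math.factorial(order)))

print(permutation_entropy(np.arange(100)))   # vanishes: a trend has one pattern
rng = np.random.default_rng(0)
print(permutation_entropy(rng.standard_normal(20000)))   # close to 1
```

Because patterns are ranked by local ordering rather than amplitude, the measure is robust to monotone transformations of the data, which is part of its appeal for streamflow records.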
Quantifying the tangling of trajectories using the topological entropy
Candelaresi, S.; Pontin, D. I.; Hornig, G.
2017-09-01
We present a simple method to efficiently compute a lower limit of the topological entropy and its spatial distribution for two-dimensional mappings. These mappings could represent either two-dimensional time-periodic fluid flows or three-dimensional magnetic fields, which are periodic in one direction. This method is based on measuring the length of a material line in the flow. Depending on the nature of the flow, the fluid can be mixed very efficiently which causes the line to stretch. Here, we study a method that adaptively increases the resolution at locations along the line where folds lead to a high curvature. This reduces the computational cost greatly which allows us to study unprecedented parameter regimes. We demonstrate how this efficient implementation allows the computation of the variation of the finite-time topological entropy in the mapping. This measure quantifies spatial variations of the braiding efficiency, important in many practical applications.
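The line-stretching idea with adaptive refinement can be sketched for a generic chaotic 2-D map; the Hénon map is our stand-in example, not one of the paper's flows. Refinement is done in the parameter of the initial segment, so every inserted point truly lies on the advected material line, and log(L_n)/n gives a lower bound on the topological entropy:

```python
import numpy as np

A, B = 1.4, 0.3   # classic Henon-map parameters (illustrative chaotic map)

def henon_n(x, y, n):
    """Iterate the Henon map n times from (x, y)."""
    for _ in range(n):
        x, y = 1.0 - A * x * x + y, B * x
    return np.array([x, y])

def line_growth(n, tol=0.05):
    """Length of the n-th image of the unit material segment from (-0.5, 0)
    to (0.5, 0), refined adaptively where stretching/folding is strong."""
    point = lambda t: henon_n(t - 0.5, 0.0, n)   # t in [0, 1] parametrizes the segment
    params = [0.0, 1.0]
    pts = [point(t) for t in params]
    i = 0
    while i < len(params) - 1:
        if np.linalg.norm(pts[i + 1] - pts[i]) > tol:
            tm = 0.5 * (params[i] + params[i + 1])   # bisect in the initial parameter
            params.insert(i + 1, tm)
            pts.insert(i + 1, point(tm))
        else:
            i += 1
    return sum(np.linalg.norm(b - a) for a, b in zip(pts, pts[1:]))

# log(L_n)/n estimates a lower bound on the topological entropy
print(np.log(line_growth(10)) / 10.0)
```

The paper's contribution is to make this refinement adaptive along the line where curvature is high, which is what keeps the cost manageable when L_n grows exponentially.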
Voice Activity Detection Using Fuzzy Entropy and Support Vector Machine
Directory of Open Access Journals (Sweden)
R. Johny Elton
2016-08-01
This paper proposes support vector machine (SVM) based voice activity detection using fuzzy entropy (FuzzyEn) to improve detection performance under noisy conditions. The proposed voice activity detection (VAD) uses FuzzyEn as a feature extracted from noise-reduced speech signals to train an SVM model for speech/non-speech classification. The proposed VAD method was tested in various experiments by adding real background noises at different signal-to-noise ratios (SNR), ranging from −10 dB to 10 dB, to actual speech signals collected from the TIMIT database. The analysis shows that the FuzzyEn feature gives better results in discriminating noise and corrupted noisy speech. The efficacy of the SVM classifier was validated using 10-fold cross-validation. Furthermore, the results obtained by the proposed method were compared with those of previously standardized VAD algorithms as well as recently developed methods. The performance comparison suggests that the proposed method is more efficient in detecting speech under various noisy environments, with an accuracy of 93.29%, and that the FuzzyEn feature detects speech efficiently even at low SNR levels.
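A compact FuzzyEn sketch, following the common Chen-style definition with an exponential membership function; the parameters m, r, n are illustrative defaults, not necessarily those used in the paper:

```python
import numpy as np

def fuzzy_entropy(x, m=2, r=0.2, n=2):
    """FuzzyEn: -log of the ratio of mean fuzzy similarities at
    embedding dimensions m+1 and m (exponential membership)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def phi(dim):
        # delay embedding, with each vector's own mean (local baseline) removed
        X = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        X -= X.mean(axis=1, keepdims=True)
        d = np.max(np.abs(X[:, None, :] - X[None, :, :]), axis=2)  # Chebyshev
        sim = np.exp(-(d ** n) / tol)
        N = len(X)
        return (sim.sum() - N) / (N * (N - 1))  # exclude self-pairs

    return -np.log(phi(m + 1) / phi(m))

rng = np.random.default_rng(0)
print(fuzzy_entropy(rng.standard_normal(300)))                  # irregular: larger
print(fuzzy_entropy(np.sin(2 * np.pi * np.arange(300) / 25)))   # regular: smaller
```

The graded (fuzzy) similarity is what makes the feature smoother than SampEn near the tolerance boundary, which is why it behaves well on noisy speech frames.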
Entropy production in oscillatory processes during photosynthesis.
López-Agudelo, Víctor A; Barragán, Daniel
2014-01-01
The flow of matter and heat and the rate of enzymatic reactions are examined using two models of photosynthesis that exhibit sustained and damped oscillatory dynamics, with the objective of calculating the rate of entropy generation and studying the effects of temperature and kinetic constants on the thermodynamic efficiency of photosynthesis. The global coefficient of heat transfer and the direct and inverse constants of the formation reaction of the RuBisCO-CO2 complex were used as control parameters. Results show that when the system moves from isothermal to non-isothermal conditions, the transition from a steady state to oscillations facilitates an increase in the energy efficiency of the process. The simulations were carried out for two photosynthetic models in a system on a chloroplast reactor scale.
Entropy and Information Transmission in Causation and Retrocausation
Moddel, Garret
2006-10-01
Although experimental evidence for retrocausation exists, there are clearly subtleties to the phenomenon. The bilking paradox, in which one intervenes to eliminate a subsequent cause after a preceding effect has occurred, appears on the surface to show that retrocausation is logically impossible. In a previous paper, the second law of thermodynamics was invoked to show that the entropy in each process of a psi interaction (presentience, telepathy, remote perception, and psychokinesis) cannot decrease, prohibiting psi processes in which signals condense from background fluctuations. Here it is shown, perhaps contrary to one's intuition, that reversible processes cannot be influenced through retrocausation, but irreversible processes can. The increase in thermodynamic entropy in irreversible processes — which are generally described by Newtonian mechanics but not Lagrangian dynamics and Hamilton's Principle — is required for causation. Thermodynamically reversible processes cannot be causal and hence also cannot be retrocausal. The role of entropy in psi interactions is extended by using the bilking paradox to consider information transmission in retroactive psychokinesis (PK). A PK efficiency, η_PK, is defined. A prediction of the analysis is that η_PK ≤ H/H_0, where H is the information uncertainty, or entropy, in the retro-PK agent's knowledge of the event that is to be influenced retrocausally. The information entropy can provide the necessary ingredient for non-reversibility, and hence retrocausation. Noise and bandwidth limitations in the communication to the agent of the outcome of the event increase the maximum PK efficiency. Avoidance of the bilking paradox does not bar a subject from using the premonition of an event to prevent it from occurring. The necessity for large information entropy, which is the expected value of the surprisal, is likely to be essential for any successful PK process, not just retro-PK processes. Hence uncertainty in the ...
Quantum geometry and gravitational entropy
Energy Technology Data Exchange (ETDEWEB)
Balasubramanian, Vijay; Czech, Bartłomiej; Larjo, Klaus; Marolf, Donald; Simon, Joan
2007-05-29
Most quantum states have wavefunctions that are widely spread over the accessible Hilbert space and hence do not have a good description in terms of a single classical geometry. In order to understand when geometric descriptions are possible, we exploit the AdS/CFT correspondence in the half-BPS sector of asymptotically AdS_5 × S^5 universes. In this sector we devise a "coarse-grained metric operator" whose eigenstates are well described by a single spacetime topology and geometry. We show that such half-BPS universes have a non-vanishing entropy if and only if the metric is singular, and that the entropy arises from coarse-graining the geometry. Finally, we use our entropy formula to find the most entropic spacetimes with fixed asymptotic moments beyond the global charges.
Lemons, Don S
2013-01-01
Striving to explore the subject in as simple a manner as possible, this book helps readers understand the elusive concept of entropy. Innovative aspects of the book include the construction of statistical entropy, the derivation of the entropy of classical systems from purely classical assumptions, and a statistical thermodynamics approach to the ideal Fermi and ideal Bose gases. Derivations are worked through step-by-step and important applications are highlighted in over 20 worked examples. Nearly 50 end-of-chapter exercises test readers' understanding. The book also features a glossary giving definitions for all essential terms, a time line showing important developments, and list of books for further study. It is an ideal supplement to undergraduate courses in physics, engineering, chemistry and mathematics.
Construction of microcanonical entropy on thermodynamic pillars
Campisi, Michele
2015-05-01
A question that is currently highly debated is whether the microcanonical entropy should be expressed as the logarithm of the phase volume (volume entropy, also known as the Gibbs entropy) or as the logarithm of the density of states (surface entropy, also known as the Boltzmann entropy). Rather than postulating them and investigating the consequence of each definition, as is customary, here we adopt a bottom-up approach and construct the entropy expression within the microcanonical formalism upon two fundamental thermodynamic pillars: (i) the second law of thermodynamics as formulated for quasistatic processes: δQ/T is an exact differential, and (ii) the law of ideal gases: PV = k_B N T. The first pillar implies that entropy must be some function of the phase volume Ω. The second pillar singles out the logarithmic function among all possible functions. Hence the construction leads uniquely to the expression S = k_B ln Ω, that is, the volume entropy. As a consequence any entropy expression other than that of Gibbs, e.g., the Boltzmann entropy, can lead to inconsistencies with the two thermodynamic pillars. We illustrate this with the prototypical example of a macroscopic collection of noninteracting spins in a magnetic field, and show that the Boltzmann entropy severely fails to predict the magnetization, even in the thermodynamic limit. The uniqueness of the Gibbs entropy, as well as the demonstrated potential harm of the Boltzmann entropy, provide compelling reasons for discarding the latter at once.
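The spin example can be checked numerically under our own simple conventions (N two-level spins, energy E_k = k in units of the level splitting, degeneracy C(N, k); a related symptom of the problem the abstract describes): the Boltzmann inverse temperature goes negative above mid-spectrum, while the Gibbs one never does.

```python
from math import comb, log

N = 1000  # noninteracting two-level spins; E_k = k with degeneracy C(N, k)
omega = [comb(N, k) for k in range(N + 1)]   # density of states (exact integers)
cum, total = [], 0
for w in omega:
    total += w
    cum.append(total)                        # phase "volume" up to energy E_k

S_boltz = [log(w) for w in omega]            # surface (Boltzmann) entropy
S_gibbs = [log(c) for c in cum]              # volume (Gibbs) entropy

# inverse temperatures beta = dS/dE via finite differences (dE = 1)
beta_B = [b - a for a, b in zip(S_boltz, S_boltz[1:])]
beta_G = [b - a for a, b in zip(S_gibbs, S_gibbs[1:])]

print(beta_B[100], beta_B[900])  # positive below, negative above mid-spectrum
print(min(beta_G))               # >= 0 up to floating-point rounding
```

Above mid-spectrum the cumulative sum barely grows, so the Gibbs β collapses toward zero (an enormous positive temperature) instead of turning negative, which is the qualitative distinction the paper builds on.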
Improving Care Experiences, Efficiencies and Quality of Care for Seniors in Alberta.
Abbasi, Marjan; Khera, Sheny; Dabravolskaj, Julia; Xia, Linda
2017-01-01
Improving Care Experiences, Efficiencies and Quality of Care for Seniors in Alberta Forum was held to explore the current challenges and opportunities in seniors' care. A diverse group of 53 attendees, representing a cross section of healthcare organizations, front-line healthcare providers, researchers and patients, participated in facilitative, small group discussions to share and propose solutions to barriers to coordinating and integrating care for the senior population across the continuum within the Edmonton zone, to comment on a standardized assessment that may inform integrated care and support planning and to outline steps towards health information continuity.
Entanglement entropy: a perturbative calculation
Energy Technology Data Exchange (ETDEWEB)
Rosenhaus, Vladimir; Smolkin, Michael [Center for Theoretical Physics and Department of Physics,University of California, Berkeley, CA 94720 (United States)
2014-12-31
We provide a framework for a perturbative evaluation of the reduced density matrix. The method is based on a path integral in the analytically continued spacetime. It suggests an alternative to the holographic and ‘standard’ replica trick calculations of entanglement entropy. We implement this method within solvable field theory examples to evaluate leading order corrections induced by small perturbations in the geometry of the background and entangling surface. Our findings are in accord with Solodukhin’s formula for the universal term of entanglement entropy for four dimensional CFTs.
Catching homologies by geometric entropy
Felice, Domenico; Franzosi, Roberto; Mancini, Stefano; Pettini, Marco
2018-02-01
A geometric entropy is defined in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. As such, it can be a good candidate for measuring network complexity. Here we investigate its ability to single out topological features of networks, proceeding in a bottom-up manner: first we consider small-size networks by analytical methods, and then large-size networks by numerical techniques. Two different classes of networks, random graphs and scale-free networks, are investigated by computing their Betti numbers and then showing the capability of geometric entropy to detect homologies.
Text mining by Tsallis entropy
Jamaati, Maryam; Mehri, Ali
2018-01-01
Long-range correlations between the elements of natural languages enable them to convey very complex information. The complex structure of human language, as a manifestation of natural languages, motivates us to apply nonextensive statistical mechanics to text mining. Tsallis entropy appropriately ranks terms' relevance to the document subject by taking advantage of their spatial correlation length. We apply this statistical concept as a new, powerful word-ranking metric for extracting keywords from a single document. We carry out an experimental evaluation, which shows the capability of the presented method in keyword extraction. We find that Tsallis entropy has reliable word-ranking performance, at the same level as the best previous ranking methods.
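As a reference point for the formula behind this metric, here is a minimal sketch of Tsallis entropy, S_q = (1 − Σ p_i^q)/(q − 1), which recovers the Shannon entropy (in nats) as q → 1. The word-probability input and the spatial-correlation ranking step of the paper are not reproduced; the probabilities below are illustrative.

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1).

    Reduces to the Shannon entropy (in nats) in the limit q -> 1.
    """
    if abs(q - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs if p > 0)) / (q - 1.0)

# Illustrative term-frequency distribution (not from the paper).
probs = [0.5, 0.3, 0.2]
shannon = tsallis_entropy(probs, 1.0)
near_one = tsallis_entropy(probs, 1.0001)  # approaches the Shannon value
```

Sweeping q re-weights rare versus frequent terms, which is what makes the entropic index useful as a tunable ranking knob.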
Virtuality and Efficiency - Overcoming Past Antinomy in the Remote Collaboration Experience
Fernandes, J; Martin Clavo, D; Baron, T; CERN. Geneva. IT Department
2010-01-01
Several recent initiatives have been put in place by the CERN IT Department to improve the user experience in remote dispersed meetings and remote collaboration at large in the LHC communities worldwide. We will present an analysis of the factors which were historically limiting the efficiency of remote dispersed meetings and describe the consequent actions which were undertaken at CERN to overcome these limitations. After giving a status update of the different equipment available at CERN to enable the virtual sessions and the various collaborative tools which are currently proposed to users, we will focus on the evolution of this market: how can the new technological trends (among others, HD videoconferencing, Telepresence, Unified Communications, etc.) impact positively the user experience and how to attain the best usage of them. Finally, by projecting ourselves in the future, we will give some hints as to how to answer the difficult question of selecting the next generation of collaborative tools: which ...
Multivariate refined composite multiscale entropy analysis
Energy Technology Data Exchange (ETDEWEB)
Humeau-Heurtier, Anne, E-mail: anne.humeau@univ-angers.fr
2016-04-01
Multiscale entropy (MSE) has become a prevailing method to quantify the complexity of signals. MSE relies on sample entropy. However, MSE may yield imprecise complexity estimates at large scales, because sample entropy does not give a precise estimation of entropy when short signals are processed. A refined composite multiscale entropy (RCMSE) has therefore recently been proposed. Nevertheless, RCMSE applies to univariate signals only. The simultaneous analysis of multi-channel (multivariate) data often outperforms studies based on univariate signals. We therefore introduce an extension of RCMSE to multivariate data. Applications of multivariate RCMSE to simulated processes reveal its better performance over the standard multivariate MSE.
Highlights: • Multiscale entropy quantifies data complexity but may be inaccurate at large scale. • A refined composite multiscale entropy (RCMSE) has therefore recently been proposed. • Nevertheless, RCMSE is adapted to univariate time series only. • We herein introduce an extension of RCMSE to multivariate data. • It shows better performance than the standard multivariate multiscale entropy.
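To make the two building blocks of MSE concrete, the sketch below implements coarse-graining and sample entropy for a univariate series. The refined composite and multivariate extensions of the paper are not reproduced, and the tolerance is a fixed absolute value rather than the usual fraction of the signal's standard deviation.

```python
import math

def coarse_grain(x, tau):
    """MSE coarse-graining: average consecutive non-overlapping windows of length tau."""
    n = len(x) // tau
    return [sum(x[i * tau:(i + 1) * tau]) / tau for i in range(n)]

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B (resp. A) counts pairs of templates
    of length m (resp. m+1) whose pointwise distance stays within tolerance r."""
    n = len(x)

    def matches(length):
        count = 0
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    count += 1
        return count

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float('inf')

# A perfectly regular signal has near-zero sample entropy.
regular = [0.0, 1.0] * 40
```

The O(n²) pairwise loop is the reason sample entropy becomes unreliable on the short series produced by coarse-graining at large tau, which is the problem RCMSE addresses.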
The entropy principle thermodynamics for the unsatisfied
Thess, André
2011-01-01
Entropy is the most important, and the most difficult to understand, concept of thermodynamics. This book helps make this key concept understandable. It includes seven illustrative examples of applications of entropy, presented step by step.
On thermodynamic limits of entropy densities
Moriya, H; Van Enter, A
We give some sufficient conditions which guarantee that the entropy density in the thermodynamic limit is equal to the thermodynamic limit of the entropy densities of finite-volume (local) Gibbs states.
Duality of Maximum Entropy and Minimum Divergence
Directory of Open Access Journals (Sweden)
Shinto Eguchi
2014-06-01
We discuss a special class of generalized divergence measures by the use of generator functions. Any divergence measure in the class is separated into the difference between cross entropy and diagonal entropy. The diagonal entropy measure in the class is associated with a model of maximum entropy distributions; the divergence measure leads to statistical estimation via minimization, for an arbitrarily given statistical model. The dualistic relationship between the maximum entropy model and the minimum divergence estimation is explored in the framework of information geometry. The model of maximum entropy distributions is characterized as totally geodesic with respect to the linear connection associated with the divergence. A natural extension of the classical theory of the maximum likelihood method under the maximum entropy model, in terms of the Boltzmann-Gibbs-Shannon entropy, is given. We discuss the duality in detail for Tsallis entropy as a typical example.
Structural stability of high entropy alloys under pressure and temperature
DEFF Research Database (Denmark)
Ahmad, Azkar S.; Su, Y.; Liu, S. Y.
2017-01-01
The stability of high-entropy alloys (HEAs) is a key issue before their selection for industrial applications. In this study, in-situ high-pressure and high-temperature synchrotron radiation X-ray diffraction experiments have been performed on three typical HEAs Ni20Co20Fe20Mn20Cr20, Hf25Nb25Zr25Ti...
Rényi entropy flows from quantum heat engines
Ansari, M.H.; Nazarov, Y.V.
2015-01-01
We evaluate Rényi entropy flows from generic quantum heat engines (QHE) to a weakly coupled probe environment kept in thermal equilibrium. We show that the flows are determined not only by heat flow but also by a quantum coherent flow that can be separately measured in experiment apart from the heat
Non-convex Shannon entropy for photon-limited imaging
Adhikari, Lasith; Baikejiang, Reheman; DeGuchy, Omar; Marcia, Roummel F.
2017-08-01
Reconstructing high-dimensional sparse signals from low-dimensional low-count photon observations is a challenging nonlinear optimization problem. In this paper, we build upon previous work on minimizing the Poisson log-likelihood and incorporate recent work on the generalized nonconvex Shannon entropy function for promoting sparsity in solutions. We explore the effectiveness of the proposed approach using numerical experiments.
A Maximum Entropy Method for a Robust Portfolio Problem
Directory of Open Access Journals (Sweden)
Yingying Xu
2014-06-01
We propose a continuous maximum entropy method to investigate the robust optimal portfolio selection problem for a market with transaction costs and dividends. This robust model aims to maximize the worst-case portfolio return in the case that all asset returns lie within some prescribed intervals. A numerical optimal solution to the problem is obtained by using a continuous maximum entropy method. Furthermore, some numerical experiments indicate that the robust model in this paper can result in better portfolio performance than a classical mean-variance model.
Generic Properties of Stochastic Entropy Production
Pigolotti, Simone; Neri, Izaak; Roldán, Édgar; Jülicher, Frank
2017-10-01
We derive an Itô stochastic differential equation for entropy production in nonequilibrium Langevin processes. Introducing a random-time transformation, entropy production obeys a one-dimensional drift-diffusion equation, independent of the underlying physical model. This transformation allows us to identify generic properties of entropy production. It also leads to an exact uncertainty equality relating the Fano factor of entropy production and the Fano factor of the random time, which we also generalize to non-steady-state conditions.
Entropy production in continuous phase space systems
Luposchainsky, David; Hinrichsen, Haye
2013-01-01
We propose an alternative method to compute the entropy production of a classical underdamped nonequilibrium system in a continuous phase space. This approach has the advantage that it is not necessary to distinguish between even and odd-parity variables. We show that the method leads to the same local entropy production as in previous studies while the differential entropy production along a stochastic trajectory turns out to be different. This demonstrates that the differential entropy prod...
Beaty, Roger E; Kaufman, Scott Barry; Benedek, Mathias; Jung, Rex E; Kenett, Yoed N; Jauk, Emanuel; Neubauer, Aljoscha C; Silvia, Paul J
2016-02-01
The brain's default network (DN) has been a topic of considerable empirical interest. In fMRI research, DN activity is associated with spontaneous and self-generated cognition, such as mind-wandering, episodic memory retrieval, future thinking, mental simulation, theory of mind reasoning, and creative cognition. Despite large literatures on developmental and disease-related influences on the DN, surprisingly little is known about the factors that impact normal variation in DN functioning. Using structural equation modeling and graph theoretical analysis of resting-state fMRI data, we provide evidence that Openness to Experience-a normally distributed personality trait reflecting a tendency to engage in imaginative, creative, and abstract cognitive processes-underlies efficiency of information processing within the DN. Across two studies, Openness predicted the global efficiency of a functional network comprised of DN nodes and corresponding edges. In Study 2, Openness remained a robust predictor-even after controlling for intelligence, age, gender, and other personality variables-explaining 18% of the variance in DN functioning. These findings point to a biological basis of Openness to Experience, and suggest that normally distributed personality traits affect the intrinsic architecture of large-scale brain systems. Hum Brain Mapp 37:773-779, 2016. © 2015 Wiley Periodicals, Inc.
Virtuality and efficiency - overcoming past antinomy in the remote collaboration experience
Energy Technology Data Exchange (ETDEWEB)
Fernandes, Joao; Bjorkli, Knut; Clavo, David Martin; Baron, Thomas, E-mail: Joao.Fernandes@cern.c [CERN IT-UDS-AVC, 1211 Geneva 23 (Switzerland)
2010-04-01
Several recent initiatives have been put in place by the CERN IT Department to improve the user experience in remote dispersed meetings and remote collaboration at large in the LHC communities worldwide. We will present an analysis of the factors which were historically limiting the efficiency of remote dispersed meetings and describe the consequent actions which were undertaken at CERN to overcome these limitations. After giving a status update of the different equipment available at CERN to enable the virtual sessions and the various collaborative tools which are currently proposed to users, we will focus on the evolution of this market: how can the new technological trends (among others, HD videoconferencing, Telepresence, Unified Communications, etc.) impact positively the user experience and how to attain the best usage of them. Finally, by projecting ourselves in the future, we will give some hints as to how to answer the difficult question of selecting the next generation of collaborative tools: which set of tools among the various offers (systems like Vidyo H264 SVC, next generation EVO, Groupware offers, standard H323 systems, etc.) is best suited for our environment and how to unify this set for the common user. This will finally allow us to definitively overcome the past antinomy between virtuality and efficiency.
Yang, Jiao-lan; Chen, Dong-qing; Li, Shu-min; Yue, Yin-ling; Jin, Xin; Zhao, Bing-cheng; Ying, Bo
2010-02-05
The fluorosis derived from coal burning is a very serious problem in China. By using fluorine-fixing technology during coal burning, we can reduce the release of fluorides from coal at the source, thereby reducing pollution of the surrounding environment by coal-burning pollutants and decreasing the intake and accumulation of fluorine in the human body. The aim of this study was to conduct a pilot experiment on the efficiency of calcium-based fluorine-fixing material during coal burning, and to demonstrate and promote the technology based on laboratory research. A proper amount of calcium-based fluorine sorbent was added to high-fluorine coal to form briquettes, so that the fluorine in high-fluorine coal can be fixed in the coal slag and its release into the atmosphere reduced. We measured the various components in the briquettes and the fluorine in the coal slag, as well as the concentrations of indoor air pollutants, including fluoride, sulfur dioxide and respirable particulate matter (RPM), and evaluated the fluorine-fixing efficiency of the calcium-based fluorine sorbents and the levels of indoor air pollutants. Pilot experiments on fluorine-fixing efficiency during coal burning, together with its demonstration and promotion, were carried out separately in Guiding and Longli Counties of Guizhou Province, two areas with coal-burning fluorosis problems. When the sorbent-mixed coal was made into honeycomb briquettes, the average fluorine-fixing ratio in the pilot experiment was 71.8%; when the calcium-based fluorine-fixing bituminous coal was made into coal balls, the average fluorine-fixing ratio was 77.3%. The indoor air concentrations of fluoride, sulfur dioxide and PM10 decreased significantly. There was a 10% increase in the cost of briquettes due to the addition of the calcium-based fluorine sorbent. The preparation process of the calcium-based fluorine-fixing briquette is simple and the briquettes are highly flammable, and it is applicable to regions with abundant
DDoS Attack Detection Algorithms Based on Entropy Computing
Li, Liying; Zhou, Jianying; Xiao, Ning
Distributed Denial of Service (DDoS) attacks pose a severe threat to the Internet. It is difficult to find an exact signature of such attacks, and it is hard to determine whether an unusually high volume of traffic is caused by an attack or simply occurs when a huge number of users access the target machine at the same time. Entropy detection is an effective method for detecting DDoS attacks. It is mainly used to calculate the distribution randomness of some attributes in the headers of network packets. In this paper, we focus on detection technology for DDoS attacks. We improve the previous entropy detection algorithm and propose two enhanced detection methods, based on cumulative entropy and on time, respectively. Experimental results show that these methods lead to more accurate and effective DDoS detection.
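A single-window version of the underlying idea can be sketched as follows: compute the Shannon entropy of one packet-header attribute (here, destination addresses) over a traffic window and flag a collapse in entropy. The cumulative-entropy and time-based refinements proposed in the paper are not reproduced, and the addresses and threshold below are invented for illustration.

```python
import math
from collections import Counter

def shannon_entropy(items):
    """Shannon entropy (bits) of the empirical distribution of `items`."""
    counts = Counter(items)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Normal window: traffic spread across many destinations -> high entropy.
normal = [f"10.0.0.{i % 50}" for i in range(1000)]
# Attack window: traffic concentrated on one victim -> entropy collapses.
attack = ["10.0.0.7"] * 950 + [f"10.0.0.{i % 50}" for i in range(50)]

h_normal = shannon_entropy(normal)
h_attack = shannon_entropy(attack)

# A toy detector flags a window whose entropy drops below a threshold;
# in practice the threshold would be calibrated from baseline traffic.
THRESHOLD = 3.0  # bits, illustrative
is_attack = h_attack < THRESHOLD
```

Note that for other attributes the anomaly has the opposite sign: source-address entropy typically rises under a spoofed flood, so a deployed detector watches deviations in both directions.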
Energy Technology Data Exchange (ETDEWEB)
Jollands, Nigel; Gasc, Emilien; Pasquier, Sara Bryan
2009-12-15
Despite creating a plethora of national and international regulations and voluntary programmes to improve energy efficiency, countries are far from achieving full energy efficiency potential across all sectors of the economy. One major challenge, among numerous barriers, is policy implementation. One strategy that many national governments and international organisations have used to address the implementation issue is to engage regional and local authorities. To that end, many programmes have been created that foster energy efficiency action and collaboration across levels of government. The aim of this report is to identify trends and detail recent developments in multi-level governance in energy efficiency (MLGEE). By sharing lessons learned from daily practitioners in the field, the IEA hopes energy efficiency policy makers at all levels of government will be able to identify useful multilevel governance (MLG) practices across geographical and political contexts and use these to design robust programmes; modify existing programmes, and connect and share experiences with other policy makers in this field.
Entropy-based complexity measures for gait data of patients with Parkinson's disease.
Afsar, Ozgur; Tirnakli, Ugur; Kurths, Juergen
2016-02-01
Shannon, Kullback-Leibler, and Klimontovich's renormalized entropies are applied as three different complexity measures on gait data of patients with Parkinson's disease (PD) and healthy control group. We show that the renormalized entropy of variability of total reaction force of gait is a very efficient tool to compare patients with respect to disease severity. Moreover, it is a good risk predictor such that the sensitivity, i.e., the percentage of patients with PD who are correctly identified as having PD, increases from 25% to 67% while the Hoehn-Yahr stage increases from 2.5 to 3.0 (this stage goes from 0 to 5 as the disease severity increases). The renormalized entropy method for stride time variability of gait is found to correctly identify patients with a sensitivity of 80%, while the Shannon entropy and the Kullback-Leibler relative entropy can do this with a sensitivity of only 26.7% and 13.3%, respectively.
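Of the three measures compared, the Shannon entropy and the Kullback-Leibler relative entropy are straightforward to state; below is a minimal sketch on toy stride-time histograms. The bin values are invented for illustration, and Klimontovich's renormalized entropy is not reproduced here.

```python
import math

def shannon(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(p||q) = sum p_i log2(p_i / q_i).

    Requires q_i > 0 wherever p_i > 0.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy stride-time histograms (fraction of strides per duration bin).
healthy = [0.05, 0.20, 0.50, 0.20, 0.05]   # narrow, regular gait
patient = [0.15, 0.22, 0.26, 0.22, 0.15]   # broader stride-to-stride variability

d = kl_divergence(patient, healthy)  # > 0: patient distribution departs from reference
```

The broader patient histogram has both a higher Shannon entropy and a positive divergence from the reference, which is the kind of contrast such comparisons exploit.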
Logical entropy of quantum dynamical systems
Directory of Open Access Journals (Sweden)
Ebrahimzadeh Abolfazl
2016-01-01
This paper introduces the concepts of logical entropy and conditional logical entropy of finite partitions on a quantum logic. Some of their ergodic properties are presented. Also, the logical entropy of a quantum dynamical system is defined, and ergodic properties of dynamical systems on a quantum logic are investigated. Finally, a version of the Kolmogorov-Sinai theorem is proved.
[Maximum entropy principle and population genetic equilibrium].
Wang, Xiao-Long; Yuan, Zhi-Fa; Guo, Man-Cai; Song, Shi-De; Zhang, Quan-Qi; Bao, Zhen-Min
2002-06-01
A general mathematical model of population genetic equilibrium was constructed based on the maximum entropy principle. We proved that the maximum entropy probability distribution is equivalent to the Hardy-Weinberg equilibrium law. A population reaches genetic equilibrium when the genotype entropy of the population reaches the maximal possible value. In information theory, the entropy, or information content, is used to measure the uncertainty of a system. In population genetics, we can use entropy to measure the uncertainty of the genotype of a population. The agreement of the maximum entropy principle with the Hardy-Weinberg equilibrium law indicates that random crossing is an irreversible process, which increases the genotype entropy of the population, while inbreeding and selection decrease the genotype entropy of the population. In animal or plant breeding, we often use selection and/or inbreeding to decrease the entropy of a population, and intercrossing to increase it. From this point of view, breeding is actually regulating the entropy of a population. By applying the basic principles of informatics to population genetics, we revealed the biological significance of the genotype entropy and demonstrated that population genetic problems can be addressed with the principles and methods of informatics and cybernetics.
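The claimed equivalence can be checked numerically. Treating a genotype as an ordered pair of alleles and parameterizing departures from random mating by an inbreeding coefficient F, the genotype entropy is maximal at F = 0, i.e. exactly at Hardy-Weinberg proportions. This is a sketch under those specific modeling assumptions, not the paper's general model.

```python
import math

def ordered_genotype_entropy(p, F):
    """Entropy (bits) of an ordered allele pair with allele frequency p
    and inbreeding coefficient F (F = 0 is Hardy-Weinberg / random mating)."""
    q = 1.0 - p
    cells = [p * p + F * p * q,      # (A, A)
             p * q * (1 - F),        # (A, a)
             p * q * (1 - F),        # (a, A)
             q * q + F * p * q]      # (a, a)
    return -sum(c * math.log2(c) for c in cells if c > 0)

p = 0.3
entropies = {F: ordered_genotype_entropy(p, F) for F in (-0.2, -0.1, 0.0, 0.1, 0.2)}
best_F = max(entropies, key=entropies.get)  # maximized at F = 0
```

Either direction of departure (inbreeding F > 0 or outbreeding F < 0) lowers the entropy, matching the abstract's statement that random crossing maximizes genotype entropy.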
Topological Entropy of Cournot-Puu Duopoly
Directory of Open Access Journals (Sweden)
Jose S. Cánovas
2010-01-01
The aim of this paper is to analyze a classical duopoly model introduced by Tönu Puu in 1991. To that end, we compute the topological entropy of the model and characterize those parameter values with positive entropy. Although topological entropy is a measure of the dynamical complexity of the model, we show that such complexity may not be observable.
The Entropy of Morbidity Trauma and Mortality
Neal-Sturgess, Clive
2010-01-01
In this paper it is shown that statistical mechanics, in the form of thermodynamic entropy, can be used as a measure of the severity of individual injuries (AIS), and that the correct way to account for multiple injuries is to sum the entropies. It is further shown that summing entropies according to the Planck-Boltzmann (P-B) definition of entropy is formally the same as ISS, which is why ISS works. Approximate values of the probabilities of fatality are used to calculate the Gibbs entropy, which is more accurate than the P-B entropy far from equilibrium, and these are shown to be again proportional to ISS. For the categorisation of injury using entropies it is necessary to consider the underlying entropy of the individual's morbidity, to which is added the entropy of trauma, which then may result in death. Adding in the underlying entropy and summing entropies of all AIS3+ values gives a more extended scale than ISS, and so entropy is considered the preferred measure. A small scale trial is conducted of these concep...
Definition of Nonequilibrium Entropy of General Systems
Mei, Xiaochun
1999-01-01
The definition of nonequilibrium entropy is provided for the general nonequilibrium processes by connecting thermodynamics with statistical physics, and the principle of entropy increment in the nonequilibrium processes is also proved in the paper. The result shows that the definition of nonequilibrium entropy is not unique.
Quantum Kaniadakis entropy under projective measurement
Ourabah, Kamel; Hamici-Bendimerad, Amel Hiba; Tribeche, Mouloud
2015-09-01
It is well known that the von Neumann entropy of a quantum state does not decrease with a projective measurement. This property holds for Tsallis and Rényi entropies as well. We show that the recently introduced quantum version of the Kaniadakis entropy preserves this property.
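The analogous and better-known statement for the von Neumann entropy can be verified directly. The small numerical check below measures a pure qubit state in a non-eigenbasis and confirms the entropy does not decrease; the Kaniadakis generalization treated in the paper is not reproduced here.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), in nats, computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop zero eigenvalues (0 ln 0 = 0)
    return float(-np.sum(evals * np.log(evals)))

# Pure state |+> = (|0> + |1>)/sqrt(2): zero entropy.
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho = np.outer(plus, plus)

# Projective measurement in the computational basis: rho' = sum_k P_k rho P_k.
P0 = np.diag([1.0, 0.0])
P1 = np.diag([0.0, 1.0])
rho_after = P0 @ rho @ P0 + P1 @ rho @ P1

s_before = von_neumann_entropy(rho)       # 0 for a pure state
s_after = von_neumann_entropy(rho_after)  # ln 2: the measurement dephases the state
```

The measurement erases the off-diagonal coherences of rho, leaving the maximally mixed qubit state, which is why the entropy jumps from 0 to ln 2.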
On The Generalized Additivity Of Kaniadakis Entropy
Sparavigna, Amelia Carolina
2015-01-01
Since entropy has several applications in information theory, such as, for example, in bi-level or multi-level thresholding of images, it is interesting to investigate the generalized additivity of Kaniadakis entropy for more than two systems. Here we consider the additivity for three, four and five systems, because we aim to apply Kaniadakis entropy to such multi-level analyses.
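For reference, the κ-entropy itself can be sketched as follows; it recovers the Shannon entropy as κ → 0. The multi-system composition rules studied in the paper are not reproduced, and the distribution below is illustrative.

```python
import math

def kappa_log(x, kappa):
    """Kaniadakis kappa-logarithm: ln_k(x) = (x**kappa - x**(-kappa)) / (2*kappa).

    Tends to the ordinary ln(x) as kappa -> 0.
    """
    return (x ** kappa - x ** (-kappa)) / (2.0 * kappa)

def kaniadakis_entropy(probs, kappa):
    """S_kappa = -sum p_i ln_k(p_i); recovers Shannon entropy (nats) as kappa -> 0."""
    return -sum(p * kappa_log(p, kappa) for p in probs if p > 0)

probs = [0.6, 0.3, 0.1]
shannon = -sum(p * math.log(p) for p in probs)
s_small_kappa = kaniadakis_entropy(probs, 1e-4)  # close to the Shannon value
```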
Entanglement entropy and anomaly inflow
Hughes, Taylor L.; Leigh, Robert G.; Parrikar, Onkar; Ramamurthy, Srinidhi T.
2016-03-01
We study entanglement entropy for parity-violating (time-reversal breaking) quantum field theories on R1 ,2 in the presence of a domain wall between two distinct parity-odd phases. The domain wall hosts a 1 +1 -dimensional conformal field theory (CFT) with nontrivial chiral central charge. Such a CFT possesses gravitational anomalies. It has been shown recently that, as a consequence, its intrinsic entanglement entropy is sensitive to Lorentz boosts around the entangling surface. Here, we show using various methods that the entanglement entropy of the three-dimensional bulk theory is also sensitive to such boosts owing to parity-violating effects, and that the bulk response to a Lorentz boost precisely cancels the contribution coming from the domain wall CFT. We argue that this can naturally be interpreted as entanglement inflow (i.e., inflow of entanglement entropy analogous to the familiar Callan-Harvey effect) between the bulk and the domain-wall, mediated by the low-lying states in the entanglement spectrum. These results can be generally applied to 2 +1 -d topological phases of matter that have edge theories with gravitational anomalies, and provide a precise connection between the gravitational anomaly of the physical edge theory and the low-lying spectrum of the entanglement Hamiltonian.
Mushotzky, R.
2008-01-01
I will discuss how one can determine the origin of the 'extra entropy' in groups and clusters and the feedback needed in models of galaxy formation. I will stress the use of x-ray spectroscopy and imaging and the critical value that Con-X has in this regard.
Biosemiotic Entropy: Concluding the Series
Directory of Open Access Journals (Sweden)
John W. Oller
2014-07-01
This article concludes the special issue on Biosemiotic Entropy, looking toward the future on the basis of current and prior results. It highlights certain aspects of the series concerning factors that damage and degenerate biosignaling systems. As in ordinary linguistic discourse, well-formedness (coherence) in biological signaling systems depends on valid representations correctly construed: a series of proofs are presented and generalized to all meaningful sign systems. The proofs show why infants must (as empirical evidence shows they do) proceed through a strict sequence of formal steps in acquiring any language. Classical and contemporary conceptions of entropy and information are deployed, showing why factors that interfere with coherence in biological signaling systems are necessary and sufficient causes of disorders, diseases, and mortality. Known sources of such formal degeneracy in living organisms (here termed biosemiotic entropy) include: (a) toxicants; (b) pathogens; (c) excessive exposures to radiant energy and/or sufficiently powerful electromagnetic fields; (d) traumatic injuries; and (e) interactions between the foregoing factors. Just as Jaynes proved that irreversible changes invariably increase entropy, the theory of true narrative representations (TNR theory) demonstrates that factors disrupting the well-formedness (coherence) of valid representations, all else being held equal, must increase biosemiotic entropy: the kind impacting biosignaling systems.
Entanglement Entropy of Black Shells
Arenas, J Robel (DOI: 10.1393/ncb/i2010-10922-3)
2011-01-01
We present a coherent account of how the entanglement interpretation, thermofield dynamical description and the brick wall formulations (with the ground state correctly identified) fit into a connected and self-consistent explanation of what Bekenstein-Hawking entropy is, and where it is located.
A modified belief entropy in Dempster-Shafer framework.
Zhou, Deyun; Tang, Yongchuan; Jiang, Wen
2017-01-01
How to quantify the uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in Dempster-Shafer framework, however, the existing studies mainly focus on the mass function itself, the available information represented by the scale of the frame of discernment (FOD) in the body of evidence is ignored. Without taking full advantage of the information in the body of evidence, the existing methods are somehow not that efficient. In this paper, a modified belief entropy is proposed by considering the scale of FOD and the relative scale of a focal element with respect to FOD. Inspired by Deng entropy, the new belief entropy is consistent with Shannon entropy in the sense of probability consistency. What's more, with less information loss, the new measure can overcome the shortage of some other uncertainty measures. A few numerical examples and a case study are presented to show the efficiency and superiority of the proposed method.
A modified belief entropy in Dempster-Shafer framework
Zhou, Deyun; Jiang, Wen
2017-01-01
How to quantify the uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in Dempster-Shafer framework, however, the existing studies mainly focus on the mass function itself, the available information represented by the scale of the frame of discernment (FOD) in the body of evidence is ignored. Without taking full advantage of the information in the body of evidence, the existing methods are somehow not that efficient. In this paper, a modified belief entropy is proposed by considering the scale of FOD and the relative scale of a focal element with respect to FOD. Inspired by Deng entropy, the new belief entropy is consistent with Shannon entropy in the sense of probability consistency. What’s more, with less information loss, the new measure can overcome the shortage of some other uncertainty measures. A few numerical examples and a case study are presented to show the efficiency and superiority of the proposed method. PMID:28481914
A modified belief entropy in Dempster-Shafer framework.
Directory of Open Access Journals (Sweden)
Deyun Zhou
How to quantify the uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in the Dempster-Shafer framework; however, the existing studies mainly focus on the mass function itself, and the available information represented by the scale of the frame of discernment (FOD) in the body of evidence is ignored. Without taking full advantage of the information in the body of evidence, the existing methods are somehow not that efficient. In this paper, a modified belief entropy is proposed by considering the scale of FOD and the relative scale of a focal element with respect to FOD. Inspired by Deng entropy, the new belief entropy is consistent with Shannon entropy in the sense of probability consistency. What's more, with less information loss, the new measure can overcome the shortage of some other uncertainty measures. A few numerical examples and a case study are presented to show the efficiency and superiority of the proposed method.
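The Deng entropy on which the proposed measure builds can be sketched directly: E_d(m) = −Σ m(A) log2(m(A) / (2^|A| − 1)), where |A| is the cardinality of the focal element. The paper's modification (additionally weighting by the scale of the FOD) is not reproduced here, and the mass functions below are illustrative.

```python
import math

def deng_entropy(mass):
    """Deng entropy of a Dempster-Shafer mass function.

    `mass` maps focal elements (frozensets) to their masses.
    E_d = -sum m(A) * log2(m(A) / (2**|A| - 1)).
    """
    return -sum(m * math.log2(m / (2 ** len(A) - 1))
                for A, m in mass.items() if m > 0)

# On singletons only, 2**1 - 1 = 1, so Deng entropy reduces to Shannon entropy.
singletons = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.3, frozenset({'c'}): 0.2}

# A multi-element focal set contributes extra uncertainty via the 2**|A| - 1 term.
with_composite = {frozenset({'a'}): 0.5, frozenset({'b', 'c'}): 0.5}
```

The composite focal set {'b', 'c'} spreads its mass over more potential outcomes, so the second mass function scores a strictly higher entropy than a two-singleton split of the same masses.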
Minimization of entropy production in separate and connected process units
Energy Technology Data Exchange (ETDEWEB)
Roesjorde, Audun
2004-08-01
The objective of this thesis was to further develop a methodology for minimizing the entropy production of single and connected chemical process units. When chemical process equipment is designed and operated at the lowest entropy production possible, the energy efficiency of the equipment is enhanced. We found for single process units that the entropy production could be reduced by up to 20-40%, given the degrees of freedom in the optimization. In processes, our results indicated that even bigger reductions were possible. The states of minimum entropy production were studied and important parameters for obtaining significant reductions in the entropy production were identified. From both sustainability and economic viewpoints, knowledge of energy-efficient design and operation is important. In some of the systems we studied, nonequilibrium thermodynamics was used to model the entropy production. In Chapter 2, we give a brief introduction to different industrial applications of nonequilibrium thermodynamics. The link between local transport phenomena and the overall system description makes nonequilibrium thermodynamics a useful tool for understanding the design of chemical process units. We developed the methodology of minimization of entropy production in several steps. First, we analyzed and optimized the entropy production of single units: two alternative concepts of adiabatic distillation, diabatic and heat-integrated distillation, were analyzed and optimized in Chapters 3 to 5. In diabatic distillation, heat exchange is allowed along the column, and it is this feature that increases the energy efficiency of the distillation column. In Chapter 3, we found how a given area of heat transfer should be optimally distributed among the trays in a column separating a mixture of propylene and propane. The results showed that heat exchange was most important on the trays close to the reboiler and condenser. In Chapters 4 and 5, we studied how the entropy
Gene-Centric Genomewide Association Study via Entropy
Cui, Yuehua; Kang, Guolian; Sun, Kelian; Qian, Minping; Romero, Roberto; Fu, Wenjiang
2008-01-01
Genes are the functional units in most organisms. Compared to genetic variants located outside genes, genic variants are more likely to affect disease risk. The development of the human HapMap project provides an unprecedented opportunity for genetic association studies at the genomewide level for elucidating disease etiology. Currently, most association studies at the single-nucleotide polymorphism (SNP) or the haplotype level rely on the linkage information between SNP markers and disease variants, with which association findings are difficult to replicate. Moreover, variants in genes might not be sufficiently covered by currently available methods. In this article, we present a gene-centric approach via entropy statistics for a genomewide association study to identify disease genes. The new entropy-based approach considers genic variants within one gene simultaneously and is developed on the basis of a joint genotype distribution among genetic variants for an association test. A grouping algorithm based on a penalized entropy measure is proposed to reduce the dimension of the test statistic. Type I error rates and power of the entropy test are evaluated through extensive simulation studies. The results indicate that the entropy test has stable power under different disease models with a reasonable sample size. Compared to single SNP-based analysis, the gene-centric approach has greater power, especially when there is more than one disease variant in a gene. As the genomewide genic SNPs become available, our entropy-based gene-centric approach would provide a robust and computationally efficient way for gene-based genomewide association study. PMID:18458106
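The joint-genotype entropy idea can be illustrated with a toy sketch. This is only a hedged illustration of the core quantity, the Shannon entropy of the empirical joint genotype distribution across a gene's SNPs; it is not the paper's actual test statistic, which additionally involves a penalized grouping algorithm and a formal association test. The data below are invented.

```python
from collections import Counter
import math

def joint_entropy(genotypes):
    """Shannon entropy (bits) of the empirical joint genotype distribution,
    where each row is a tuple of genotype codes (0/1/2) across a gene's SNPs."""
    n = len(genotypes)
    return -sum((c / n) * math.log2(c / n) for c in Counter(genotypes).values())

# Toy data for two SNPs in one gene: cases concentrate on fewer joint
# genotypes than controls, so their joint entropy is lower.
controls = [(0, 0), (0, 1), (1, 0), (1, 1), (2, 1), (1, 2), (0, 0), (2, 2)]
cases = [(0, 0), (0, 0), (0, 0), (0, 1), (0, 0), (0, 1), (0, 0), (0, 1)]
assert joint_entropy(cases) < joint_entropy(controls)
```

A gene-centric test would compare such case and control entropies (with an appropriate null distribution) rather than testing each SNP marginally.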
Controlling the Shannon Entropy of Quantum Systems
Xing, Yifan; Wu, Jun
2013-01-01
This paper proposes a new quantum control method which controls the Shannon entropy of quantum systems. For both discrete and continuous entropies, controller design methods are proposed based on probability density function control, which can drive the quantum state to any target state. To drive the entropy to any target at any prespecified time, another discretization method is proposed for the discrete entropy case, and the conditions under which the entropy can be increased or decreased are discussed. Simulations are done on both two- and three-dimensional quantum systems, where division and prediction are used to achieve more accurate tracking. PMID:23818819
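As a minimal illustration of the controlled quantity, the sketch below computes the discrete Shannon entropy of a pure state's measurement-outcome distribution, assuming measurement in the computational basis (the control and discretization schemes of the paper are not reproduced here).

```python
import numpy as np

def shannon_entropy_of_state(psi):
    """Shannon entropy (bits) of the outcome distribution |<i|psi>|^2
    of a pure state measured in the computational basis."""
    p = np.abs(np.asarray(psi, dtype=complex)) ** 2
    p = p / p.sum()          # guard against normalization round-off
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

basis_state = [1, 0]                          # definite outcome: zero entropy
uniform = [1 / np.sqrt(2), 1 / np.sqrt(2)]    # maximal for a qubit: 1 bit
assert shannon_entropy_of_state(basis_state) == 0.0
assert abs(shannon_entropy_of_state(uniform) - 1.0) < 1e-12
```

Driving the entropy between these extremes is, roughly, what the proposed controllers do for the state's probability density.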
Towards information inequalities for generalized graph entropies.
Directory of Open Access Journals (Sweden)
Lavanya Sivakumar
In this article, we discuss the problem of establishing relations between information measures for network structures. Two types of entropy-based measures, namely the Shannon entropy and its generalization, the Rényi entropy, have been considered for this study. Our main results involve establishing formal relationships, by means of inequalities, between these two kinds of measures. Further, we also state and prove inequalities connecting the classical partition-based graph entropies and partition-independent entropy measures. In addition, several explicit inequalities are derived for special classes of graphs.
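One inequality of the kind discussed can be checked numerically: for a probability distribution derived from a graph (here, a degree-based distribution, one common construction for graph entropies), the Rényi entropy of order α > 1 never exceeds the Shannon entropy. This is a hedged sketch, not one of the article's specific results.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (base 2) of a probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha != 1); tends to the Shannon
    entropy as alpha approaches 1."""
    return math.log2(sum(x ** alpha for x in p)) / (1.0 - alpha)

# Degree-based distribution of the star graph K_{1,3}:
# degrees (3, 1, 1, 1), normalized by their sum.
degrees = [3, 1, 1, 1]
total = sum(degrees)
p = [d / total for d in degrees]

h_shannon = shannon_entropy(p)     # ~1.792 bits
h_renyi_2 = renyi_entropy(p, 2.0)  # log2(3) ~ 1.585 bits

# For alpha > 1, Renyi entropy is bounded above by Shannon entropy.
assert h_renyi_2 <= h_shannon
```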
Generalized gravitational entropy from total derivative action
Energy Technology Data Exchange (ETDEWEB)
Dong, Xi [Stanford Institute for Theoretical Physics, Department of Physics, Stanford University,Stanford, CA 94305 (United States); School of Natural Sciences, Institute for Advanced Study,Princeton, NJ 08540 (United States); Miao, Rong-Xin [Max Planck Institute for Gravitational Physics (Albert Einstein Institute),Am Mühlenberg 1, 14476 Golm (Germany)
2015-12-16
We investigate the generalized gravitational entropy from total derivative terms in the gravitational action. Following the method of Lewkowycz and Maldacena, we find that the generalized gravitational entropy from total derivatives vanishes. We compare our results with the work of Astaneh, Patrushev, and Solodukhin. We find that if total derivatives produced nonzero entropy, the holographic and the field-theoretic universal terms of entanglement entropy would not match. Furthermore, the second law of thermodynamics could be violated if the entropy of total derivatives did not vanish.
Chen, Tianheng; Shu, Chi-Wang
2017-09-01
It is well known that semi-discrete high order discontinuous Galerkin (DG) methods satisfy cell entropy inequalities for the square entropy for both scalar conservation laws (Jiang and Shu (1994) [39]) and symmetric hyperbolic systems (Hou and Liu (2007) [36]), in any space dimension and for any triangulations. However, this property holds only for the square entropy and the integrations in the DG methods must be exact. It is significantly more difficult to design DG methods to satisfy entropy inequalities for a non-square convex entropy, and/or when the integration is approximated by a numerical quadrature. In this paper, we develop a unified framework for designing high order DG methods which will satisfy entropy inequalities for any given single convex entropy, through suitable numerical quadrature which is specific to this given entropy. Our framework applies from one-dimensional scalar cases all the way to multi-dimensional systems of conservation laws. For the one-dimensional case, our numerical quadrature is based on the methodology established in Carpenter et al. (2014) [5] and Gassner (2013) [19]. The main ingredients are summation-by-parts (SBP) operators derived from Legendre Gauss-Lobatto quadrature, the entropy conservative flux within elements, and the entropy stable flux at element interfaces. We then generalize the scheme to two-dimensional triangular meshes by constructing SBP operators on triangles based on a special quadrature rule. A local discontinuous Galerkin (LDG) type treatment is also incorporated to achieve the generalization to convection-diffusion equations. Extensive numerical experiments are performed to validate the accuracy and shock capturing efficacy of these entropy stable DG methods.
Mao, Chao; Chen, Shou
2017-01-01
Because the traditional entropy value method still has low accuracy when evaluating the performance of mining projects, a performance evaluation model for mineral projects founded on an improved entropy method is proposed. First, a new weight-assignment model is established, founded on compatibility matrix analysis from the analytic hierarchy process (AHP) and the entropy value method. When the compatibility matrix analysis achieves the consistency requirements, if differences remain between the subjective and objective weights, both proportions are moderately adjusted; on this basis, a fuzzy evaluation matrix is constructed for the performance evaluation. Simulation experiments show that, compared with the traditional entropy and compatibility matrix analysis methods, the proposed performance evaluation model based on the improved entropy value method has higher assessment accuracy.
Efficient continuous-duty Bitter-type electromagnets for cold atom experiments.
Sabulsky, Dylan O; Parker, Colin V; Gemelke, Nathan D; Chin, Cheng
2013-10-01
We present the design, construction, and characterization of Bitter-type electromagnets which can generate high magnetic fields under continuous operation with efficient heat removal for cold atom experiments. The electromagnets are constructed from a stack of alternating layers consisting of copper arcs and insulating polyester spacers. Efficient cooling of the copper is achieved via parallel rectangular water cooling channels between copper layers with low resistance to flow; a high ratio of the water-cooled surface area to the volume of copper ensures a short length scale (~1 mm) to extract dissipated heat. High copper fraction per layer ensures high magnetic field generated per unit energy dissipated. The ensemble is highly scalable and compressed to create a watertight seal without epoxy. From our measurements, a peak field of 770 G is generated 14 mm away from a single electromagnet with a current of 400 A and a total power dissipation of 1.6 kW. With cooling water flowing at 3.8 l/min, the coil temperature only increases by 7 °C under continuous operation.
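The reported temperature rise can be sanity-checked from the quoted power and flow rate, assuming (as a simplification) that all dissipated heat goes into bulk heating of the cooling water:

```python
# Steady-state coolant temperature rise: dT = P / (m_dot * c_p).
power_w = 1600.0                        # total dissipation quoted above
flow_l_per_min = 3.8                    # cooling water flow quoted above
m_dot = flow_l_per_min / 60.0 * 1.0     # kg/s (1 kg per litre of water)
c_p = 4186.0                            # J/(kg K), specific heat of water
delta_t = power_w / (m_dot * c_p)       # ~6 K, consistent with the ~7 C rise
assert 5.0 < delta_t < 7.5
```

The small remaining gap to the measured 7 °C is plausibly thermal resistance between the copper and the water.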
CFD Analyses and Experiments in a PAT Modeling: Pressure Variation and System Efficiency
Directory of Open Access Journals (Sweden)
Modesto Pérez-Sánchez
2017-10-01
Analysis of a PAT modeling is presented for application in water pipe systems as an interesting and promising energy converter to improve the system energy efficiency. The study is focused on the use of a Computational Fluid Dynamics (CFD) model in conjunction with laboratory data for representing PAT performance. The first stage of the procedure concerns a systematic analysis of the role played by the characteristic PAT parameters in the computational mesh definitions of the CFD model, with the aim of defining the most efficient setup for capturing the main features of the PAT behavior under different operating conditions. In the second stage, comparisons of CFD results and experiments were carried out to examine some system components for a better understanding of the PAT response. Specifically, the behavior of the pressure distribution along the PAT installation when implemented in a water pipe system is analyzed, and the links between pressure variation and the head drop in different system components responsible for the head losses and net head definition are also examined.
Kaufman, Scott Barry; Benedek, Mathias; Jung, Rex E.; Kenett, Yoed N.; Jauk, Emanuel; Neubauer, Aljoscha C.; Silvia, Paul J.
2015-01-01
Abstract The brain's default network (DN) has been a topic of considerable empirical interest. In fMRI research, DN activity is associated with spontaneous and self‐generated cognition, such as mind‐wandering, episodic memory retrieval, future thinking, mental simulation, theory of mind reasoning, and creative cognition. Despite large literatures on developmental and disease‐related influences on the DN, surprisingly little is known about the factors that impact normal variation in DN functioning. Using structural equation modeling and graph theoretical analysis of resting‐state fMRI data, we provide evidence that Openness to Experience—a normally distributed personality trait reflecting a tendency to engage in imaginative, creative, and abstract cognitive processes—underlies efficiency of information processing within the DN. Across two studies, Openness predicted the global efficiency of a functional network comprised of DN nodes and corresponding edges. In Study 2, Openness remained a robust predictor—even after controlling for intelligence, age, gender, and other personality variables—explaining 18% of the variance in DN functioning. These findings point to a biological basis of Openness to Experience, and suggest that normally distributed personality traits affect the intrinsic architecture of large‐scale brain systems. Hum Brain Mapp 37:773–779, 2016. © 2015 Wiley Periodicals, Inc. PMID:26610181
Efficient Continuous-Duty Bitter-Type Electromagnets for Cold Atom Experiments
Sabulsky, Dylan; Ocola, Paloma; Parker, Colin; Gemelke, Nathan; Chin, Cheng
2014-05-01
We present the design, construction and characterization of Bitter-type electromagnets which can generate high magnetic fields under continuous operation with efficient heat removal for cold atom experiments. The electromagnets are constructed from a stack of alternating layers consisting of copper arcs and insulating polyester spacers. Efficient cooling of the copper is achieved via parallel rectangular water cooling channels between copper layers with low resistance to flow; a high ratio of the water-cooled surface area to the volume of copper ensures a short length scale ~1 mm to extract dissipated heat. High copper fraction per layer ensures high magnetic field generated per unit energy dissipated. The ensemble is highly scalable and compressed to create a watertight seal without epoxy. From our measurements, a peak field of 770 G is generated 14 mm away from a single electromagnet with a current of 400 A and a total power dissipation of 1.6 kW. With cooling water flowing at 3.8 l/min, the coil temperature only increases by 7 degrees Celsius under continuous operation.
Conservative models: parametric entropy vs. temporal entropy in outcomes.
Huang, Lumeng; Ritzi, Robert W; Ramanathan, Ramya
2012-01-01
The geologic architecture in aquifer systems affects the behavior of fluid flow and the dispersion of mass. The spatial distribution and connectivity of higher-permeability facies play an important role. Models that represent this geologic structure have reduced entropy in the spatial distribution of permeability relative to models without structure. The literature shows that the stochastic model with the greatest variance in the distribution of predictions (i.e., the most conservative model) will not simply be the model representing maximum disorder in the permeability field. This principle is further explored using the Shannon entropy as a single metric to quantify and compare model parametric spatial disorder to the temporal distribution of mass residence times in model predictions. The principle is most pronounced when geologic structure manifests as preferential-flow pathways through the system via connected high-permeability sediments. As per percolation theory, at certain volume fractions the full connectivity of the high-permeability sediments will not be represented unless the model is three-dimensional. At these volume fractions, two-dimensional models can profoundly underrepresent the entropy in the real, three-dimensional, aquifer system. Thus to be conservative, stochastic models must be three-dimensional and include geologic structure. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.
Entropy and Entropy Production: Old Misconceptions and New Breakthroughs
Directory of Open Access Journals (Sweden)
Leonid M. Martyushev
2013-03-01
Persistent misconceptions existing for dozens of years and influencing progress in various fields of science are sometimes encountered in the scientific and, especially, the popular-science literature. The present brief review deals with two such interrelated misconceptions (misunderstandings). The first misunderstanding: entropy is a measure of disorder. This is an old and very common opinion. The second misconception is that the entropy production is minimized in the evolution of nonequilibrium systems. However, as it has recently become clear, evolution (progress) in Nature demonstrates the opposite, i.e., maximization of the entropy production. The principal questions connected with this maximization are considered herein. The two misconceptions mentioned above can lead to the apparent contradiction between the conclusions of modern thermodynamics and the basic conceptions of evolution existing in biology. In this regard, the analysis of these issues seems extremely important and timely as it contributes to the deeper understanding of the laws of development of the surrounding World and the place of humans in it.
Giant onsite electronic entropy enhances the performance of ceria for water splitting
DEFF Research Database (Denmark)
Naghavi, S. Shahab; Emery, Antoine A.; Hansen, Heine Anton
2017-01-01
Previous studies have shown that a large solid-state entropy of reduction increases the thermodynamic efficiency of metal oxides, such as ceria, for two-step thermochemical water splitting cycles. In this context, the configurational entropy arising from oxygen off-stoichiometry in the oxide has ... lanthanides, and reaches a maximum value of ≈4.7 kB per oxygen vacancy for Ce4+/Ce3+ reduction. This unique and large positive entropy source in ceria explains its excellent performance for high-temperature catalytic redox reactions such as water splitting. Our calculations also show that terbium dioxide has ...
X-ray conversion efficiency in vacuum hohlraum experiments at the National Ignition Facility
Energy Technology Data Exchange (ETDEWEB)
Olson, R. E. [Sandia National Laboratories, Albuquerque, New Mexico 87185 (United States); Suter, L. J.; Callahan, D. A.; Rosen, M. D.; Dixit, S. N.; Landen, O. L.; Meezan, N. B.; Moody, J. D.; Thomas, C. A.; Warrick, A.; Widmann, K.; Williams, E. A.; Glenzer, S. H. [Lawrence Livermore National Laboratory, Livermore, California 94551 (United States); Kline, J. L. [Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States)
2012-05-15
X-ray fluxes measured in the first 96 and 192 beam vacuum hohlraum experiments at the National Ignition Facility (NIF) were significantly higher than predicted by computational simulations employing XSN average atom atomic physics and highly flux-limited electron heat conduction. For agreement with experimental data, it was found that the coronal plasma emissivity must be simulated with a detailed configuration accounting model that accounts for x-ray emission involving all of the significant ionization states. It was also found that an electron heat conduction flux limit of f = 0.05 is too restrictive, and that a flux limit of f = 0.15 results in a much better match with the NIF vacuum hohlraum experimental data. The combination of increased plasma emissivity and increased electron heat conduction in this new high flux hohlraum model results in a reduction in coronal plasma energy and, hence, an explanation for the high (≈85%-90%) x-ray conversion efficiencies observed in the 235 < T_r < 345 eV NIF vacuum hohlraum experiments.
Modular invariance and entanglement entropy
Energy Technology Data Exchange (ETDEWEB)
Lokhande, Sagar Fakirchand; Mukhi, Sunil [Indian Institute of Science Education and Research,Homi Bhabha Rd, Pashan, Pune 411 008 (India)
2015-06-17
We study the Rényi and entanglement entropies for free 2d CFTs at finite temperature and finite size, with emphasis on their properties under modular transformations of the torus. We address the issue of summing over fermion spin structures in the replica trick, and show that the relation between entanglement and thermal entropy determines two different ways to perform this sum in the limits of small and large interval. Both answers are modular covariant, rather than invariant. Our results are compared with those for a free boson at unit radius in the two limits and complete agreement is found, supporting the view that entanglement respects Bose-Fermi duality. We extend our computations to multiple free Dirac fermions having correlated spin structures, dual to free bosons on the Spin(2d) weight lattice.
Preserved entropy and fragile magnetism.
Canfield, Paul C; Bud'ko, Sergey L
2016-08-01
A large swath of quantum critical and strongly correlated electron systems can be associated with the phenomena of preserved entropy and fragile magnetism. In this overview we present our thoughts and plans for the discovery and development of lanthanide and transition metal based, strongly correlated systems that are revealed by suppressed, fragile magnetism, quantum criticality, or grow out of preserved entropy. We will present and discuss current examples such as YbBiPt, YbAgGe, YbFe2Zn20, PrAg2In, BaFe2As2, CaFe2As2, LaCrSb3 and LaCrGe3 as part of our motivation and to provide illustrative examples.
ASSESSMENT OF MOTIVATION BY ENTROPY
Tadeusz Głowacki
2014-01-01
Motivation is inseparable from human work. It is also one of the five most important elements of the management process. The ability to determine the level of motivation would therefore be very useful in the work of every manager. This paper is an attempt to quantify motivation and evaluate its size, using the concept of entropy. The main reason to try defining a method of measuring the amount of motivation is to improve the management techniques of companies.
Multivariate Generalized Multiscale Entropy Analysis
Directory of Open Access Journals (Sweden)
Anne Humeau-Heurtier
2016-11-01
Multiscale entropy (MSE) was introduced in the 2000s to quantify systems' complexity. MSE relies on (i) a coarse-graining procedure to derive a set of time series representing the system dynamics on different time scales; (ii) the computation of the sample entropy for each coarse-grained time series. A refined composite MSE (rcMSE), based on the same steps as MSE, also exists. Compared to MSE, rcMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy for short time series. The multivariate versions of MSE (MMSE) and rcMSE (MrcMSE) have also been introduced. In the coarse-graining step used in MSE, rcMSE, MMSE, and MrcMSE, the mean value is used to derive representations of the original data at different resolutions. A generalization of MSE was recently published, using the computation of different moments in the coarse-graining procedure. However, so far, this generalization only exists for univariate signals. We therefore herein propose an extension of this generalized MSE to multivariate data. The multivariate generalized algorithms of MMSE and MrcMSE presented herein (MGMSE and MGrcMSE, respectively) are first analyzed through the processing of synthetic signals. We reveal that MGrcMSE shows better performance than MGMSE for short multivariate data. We then study the performance of MGrcMSE on two sets of short multivariate electroencephalograms (EEG) available in the public domain. We report that MGrcMSE may show better performance than MrcMSE in distinguishing different types of multivariate EEG data. MGrcMSE could therefore supplement MMSE or MrcMSE in the processing of multivariate datasets.
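The two MSE building blocks named above, coarse-graining and sample entropy, can be sketched as follows. This is a hedged, univariate illustration only; the variance-based `moment="var"` option stands in for the generalized (higher-moment) coarse-graining, and the exact conventions of the paper's multivariate algorithms are not reproduced.

```python
import numpy as np

def coarse_grain(x, scale, moment="mean"):
    """Split x into non-overlapping windows of length `scale` and summarize
    each window by a moment: the mean (classic MSE) or the variance
    (one generalized-MSE variant)."""
    n = len(x) // scale
    windows = np.asarray(x[:n * scale], dtype=float).reshape(n, scale)
    if moment == "mean":
        return windows.mean(axis=1)
    if moment == "var":
        return windows.var(axis=1)
    raise ValueError(moment)

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -log of the conditional probability that template
    pairs matching for m points (Chebyshev distance <= r*std) also match
    for m+1 points."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def match_count(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)
        return (d <= tol).sum() - len(t)   # exclude self-matches
    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
noise = rng.standard_normal(1000)
e1 = sample_entropy(coarse_grain(noise, 1))   # scale-1 entropy
e5 = sample_entropy(coarse_grain(noise, 5))   # scale-5 entropy
```

Plotting such entropies against scale gives the MSE curve; rcMSE averages match counts over shifted coarse-grainings instead, which is what stabilizes short-series estimates.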
Indian Academy of Sciences (India)
The entropy of the S-matrix statistical distribution is maximized, with the constraint Tr SS† = αn; n is the dimensionality of S, and 0 ≤ α ≤ 1. For α = 1 the S-matrix distribution concentrates on the unitarity sphere and we have no absorption; for α = 0 the distribution becomes a ... light of a central-limit theorem. For weak absorption, some ...
Maximum Entropy Discrimination Markov Networks
Zhu, Jun; Xing, Eric P.
2009-01-01
In this paper, we present a novel and general framework called {\\it Maximum Entropy Discrimination Markov Networks} (MaxEnDNet), which integrates the max-margin structured learning and Bayesian-style estimation and combines and extends their merits. Major innovations of this model include: 1) It generalizes the extant Markov network prediction rule based on a point estimator of weights to a Bayesian-style estimator that integrates over a learned distribution of the weights. 2) It extends the ...
The Homological Nature of Entropy
Directory of Open Access Journals (Sweden)
Pierre Baudot
2015-05-01
We propose that entropy is a universal co-homological class in a theory associated to a family of observable quantities and a family of probability distributions. Three cases are presented: (1) classical probabilities and random variables; (2) quantum probabilities and observable operators; (3) dynamic probabilities and observation trees. This gives rise to a new kind of topology for information processes, that accounts for the main information functions: entropy, mutual informations at all orders, and Kullback–Leibler divergence and generalizes them in several ways. The article is divided into two parts, that can be read independently. In the first part, the introduction, we provide an overview of the results, some open questions, future results and lines of research, and discuss briefly the application to complex data. In the second part we give the complete definitions and proofs of the theorems A, C and E in the introduction, which show why entropy is the first homological invariant of a structure of information in four contexts: static classical or quantum probability, dynamics of classical or quantum strategies of observation of a finite system.
A method for testing the cosmic homogeneity with Shannon entropy
Pandey, Biswajit
2013-04-01
We propose a method for testing cosmic homogeneity based on the Shannon entropy in information theory and test the potentials and limitations of the method on Monte Carlo simulations of some homogeneous and inhomogeneous 3D point processes in a finite region of space. We analyse a set of N-body simulations to investigate the prospect of determining the scale of homogeneity with the proposed method and show that the method could serve as an efficient tool for the study of homogeneity.
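A counts-in-cells version of this idea can be sketched as follows: bin a 3D point set into voxels and compute the Shannon entropy of the occupation probabilities. A homogeneous distribution approaches the maximum, log of the number of voxels, while clustering lowers the entropy. This binning scheme is an assumption for illustration; the paper's exact construction may differ.

```python
import numpy as np

def counts_in_cells_entropy(points, bins):
    """Shannon entropy (nats) of voxel occupation probabilities for points
    in the unit cube; the maximum is log(bins**3) for perfect homogeneity."""
    counts, _ = np.histogramdd(points, bins=(bins, bins, bins),
                               range=[(0.0, 1.0)] * 3)
    p = counts.ravel() / counts.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(1)
homogeneous = rng.uniform(size=(20000, 3))                      # Poisson-like
clustered = np.abs(rng.normal(0.5, 0.05, size=(20000, 3))) % 1  # one clump

h_max = np.log(8 ** 3)
h_hom = counts_in_cells_entropy(homogeneous, 8)
h_clu = counts_in_cells_entropy(clustered, 8)
assert h_clu < h_hom <= h_max
```

Repeating this over a range of cell sizes, and finding the scale beyond which the entropy saturates near its maximum, is one way to read off a homogeneity scale.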
Breaking the glass ceiling: Configurational entropy measurements in extremely supercooled liquids
Berthier, Ludovic
Liquids relax extremely slowly on approaching the glass state. One explanation is that an entropy crisis, due to the rarefaction of available states, makes it increasingly arduous to reach equilibrium in that regime. Validating this scenario is challenging, because experiments offer limited resolution, while numerical studies lag more than eight orders of magnitude behind experimentally-relevant timescales. In this work we not only close the colossal gap between experiments and simulations but manage to create in-silico configurations that have no experimental analog yet. Deploying a range of computational tools, we obtain four independent estimates of their configurational entropy. These measurements consistently indicate that the steep entropy decrease observed in experiments is found in simulations even beyond the experimental glass transition. Our numerical results thus open a new observational window into the physics of glasses and reinforce the relevance of an entropy crisis for understanding their formation.
Configurational entropy measurements in extremely supercooled liquids that break the glass ceiling.
Berthier, Ludovic; Charbonneau, Patrick; Coslovich, Daniele; Ninarello, Andrea; Ozawa, Misaki; Yaida, Sho
2017-10-24
Liquids relax extremely slowly on approaching the glass state. One explanation is that an entropy crisis, because of the rarefaction of available states, makes it increasingly arduous to reach equilibrium in that regime. Validating this scenario is challenging, because experiments offer limited resolution, while numerical studies lag more than eight orders of magnitude behind experimentally relevant timescales. In this work, we not only close the colossal gap between experiments and simulations but manage to create in silico configurations that have no experimental analog yet. Deploying a range of computational tools, we obtain four estimates of their configurational entropy. These measurements consistently confirm that the steep entropy decrease observed in experiments is also found in simulations, even beyond the experimental glass transition. Our numerical results thus extend the observational window into the physics of glasses and reinforce the relevance of an entropy crisis for understanding their formation. Published under the PNAS license.
Hardie, L C; Armentano, L E; Shaver, R D; VandeHaar, M J; Spurlock, D M; Yao, C; Bertics, S J; Contreras-Govea, F E; Weigel, K A
2015-04-01
Prior to genomic selection on a trait, a reference population needs to be established to link marker genotypes with phenotypes. For costly and difficult-to-measure traits, international collaboration and sharing of data between disciplines may be necessary. Our aim was to characterize the combining of data from nutrition studies carried out under similar climate and management conditions to estimate genetic parameters for feed efficiency. Furthermore, we postulated that data from the experimental cohorts within these studies can be used to estimate the net energy of lactation (NE(L)) densities of diets, which can provide estimates of energy intakes for use in the calculation of the feed efficiency metric, residual feed intake (RFI), and potentially reduce the effect of variation in energy density of diets. Individual feed intakes and corresponding production and body measurements were obtained from 13 Midwestern nutrition experiments. Two measures of RFI were considered, RFI(Mcal) and RFI(kg), which involved the regression of NE(L) intake (Mcal/d) or dry matter intake (DMI; kg/d) on 3 expenditures: milk energy, energy gained or lost in body weight change, and energy for maintenance. In total, 677 records from 600 lactating cows between 50 and 275 d in milk were used. Cows were divided into 46 cohorts based on dietary or nondietary treatments as dictated by the nutrition experiments. The realized NE(L) densities of the diets (Mcal/kg of DMI) were estimated for each cohort by totaling the average daily energy used in the 3 expenditures for cohort members and dividing by the cohort's total average daily DMI. The NE(L) intake for each cow was then calculated by multiplying her DMI by her cohort's realized energy density. Mean energy density was 1.58 Mcal/kg. Heritability estimates for RFI(kg) and RFI(Mcal) in a single-trait animal model did not differ at 0.04 for both measures. Information about realized energy density could be useful in standardizing intake data from
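The two computations described above, RFI as a regression residual and the realized energy density of a cohort, can be sketched on synthetic data. All numbers and coefficients below are invented for illustration; only the structure (regressing intake on the three energy expenditures, and dividing total cohort energy by total DMI) follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
milk_energy = rng.normal(30, 3, n)     # Mcal/d, milk energy output (invented)
bw_change_e = rng.normal(1, 0.5, n)    # Mcal/d, body-weight-change energy
maintenance = rng.normal(10, 1, n)     # Mcal/d, maintenance energy
# Synthetic DMI with random "efficiency" noise around the expenditures.
dmi = (0.6 * milk_energy + 0.3 * bw_change_e + 0.8 * maintenance
       + rng.normal(0, 0.8, n))        # kg/d

# RFI(kg): residual from regressing DMI on the 3 expenditures (with intercept).
X = np.column_stack([np.ones(n), milk_energy, bw_change_e, maintenance])
beta, *_ = np.linalg.lstsq(X, dmi, rcond=None)
rfi = dmi - X @ beta                   # negative RFI = more efficient than expected
assert abs(rfi.mean()) < 1e-8          # residuals are centered by construction

# Realized NE(L) density of a cohort: total energy expended / total DMI.
cohort_energy = milk_energy + bw_change_e + maintenance
ne_l_density = cohort_energy.sum() / dmi.sum()   # Mcal per kg of DMI
```

RFI(Mcal) follows the same pattern with NE(L) intake (DMI times the cohort's realized density) as the dependent variable instead of DMI.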
Local entropy of a nonequilibrium fermion system
Stafford, Charles A.; Shastry, Abhay
2017-03-01
The local entropy of a nonequilibrium system of independent fermions is investigated and analyzed in the context of the laws of thermodynamics. It is shown that the local temperature and chemical potential can only be expressed in terms of derivatives of the local entropy for linear deviations from local equilibrium. The first law of thermodynamics is shown to lead to an inequality, not equality, for the change in the local entropy as the nonequilibrium state of the system is changed. The maximum entropy principle (second law of thermodynamics) is proven: a nonequilibrium distribution has a local entropy less than or equal to a local equilibrium distribution satisfying the same constraints. It is shown that the local entropy of the system tends to zero when the local temperature tends to zero, consistent with the third law of thermodynamics.
On Thermodynamic Interpretation of Transfer Entropy
Directory of Open Access Journals (Sweden)
Don C. Price
2013-02-01
We propose a thermodynamic interpretation of transfer entropy near equilibrium, using a specialised Boltzmann’s principle. The approach relates conditional probabilities to the probabilities of the corresponding state transitions. This in turn characterises transfer entropy as a difference of two entropy rates: the rate for a resultant transition and another rate for a possibly irreversible transition within the system affected by an additional source. We then show that this difference, the local transfer entropy, is proportional to the external entropy production, possibly due to irreversibility. Near equilibrium, transfer entropy is also interpreted as the difference in equilibrium stabilities with respect to two scenarios: a default case and the case with an additional source. Finally, we demonstrated that such a thermodynamic treatment is not applicable to information flow, a measure of causal effect.
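The transfer entropy itself can be estimated from data. Below is a minimal plug-in sketch for discrete series with history length 1, illustrating the quantity being interpreted, not the paper's thermodynamic treatment; the series `x` and `y` are synthetic.

```python
from collections import Counter
import math
import random

def transfer_entropy(x, y):
    """Plug-in estimate of discrete transfer entropy T_{Y->X} (history 1):
    sum over p(x1, x0, y0) * log2( p(x1 | x0, y0) / p(x1 | x0) )."""
    triples = list(zip(x[1:], x[:-1], y[:-1]))
    n = len(triples)
    c_xxy = Counter(triples)
    c_xy = Counter((x0, y0) for _, x0, y0 in triples)
    c_xx = Counter((x1, x0) for x1, x0, _ in triples)
    c_x = Counter(x0 for _, x0, _ in triples)
    te = 0.0
    for (x1, x0, y0), c in c_xxy.items():
        p = c / n
        p_cond_joint = c / c_xy[(x0, y0)]          # p(x1 | x0, y0)
        p_cond_marg = c_xx[(x1, x0)] / c_x[x0]     # p(x1 | x0)
        te += p * math.log2(p_cond_joint / p_cond_marg)
    return te

# y drives x with a one-step lag; the reverse direction carries no information.
random.seed(3)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]                       # x copies y with delay 1
assert transfer_entropy(x, y) > 0.9    # ~1 bit per step from y to x
assert transfer_entropy(y, x) < 0.05   # ~0 bits in the reverse direction
```

The asymmetry between the two directions is exactly what distinguishes transfer entropy from symmetric measures such as mutual information.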
Choosing a Definition of Entropy that Works
Swendsen, Robert H.
2012-04-01
Disagreements over the meaning of the thermodynamic entropy and how it should be defined in statistical mechanics have endured for well over a century. In an earlier paper, I showed that there were at least nine essential properties of entropy that are still under dispute among experts. In this paper, I examine the consequences of differing definitions of the thermodynamic entropy of macroscopic systems. Two proposed definitions of entropy in classical statistical mechanics are (1) defining entropy on the basis of probability theory (first suggested by Boltzmann in 1877), and (2) the traditional textbook definition in terms of a volume in phase space (also attributed to Boltzmann). The present paper demonstrates the consequences of each of these proposed definitions of entropy and argues in favor of a definition based on probabilities.
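The two candidate definitions contrasted in this record can be written side by side as follows (a schematic rendering in my own notation, not necessarily Swendsen's):

```latex
% Definition (1): entropy from probability theory (Boltzmann, 1877);
% for a macrostate realized by W equally likely microstates,
S = k_B \ln W ,
% or, for a general probability distribution over microstates,
S = -k_B \sum_i p_i \ln p_i .
% Definition (2): the traditional textbook definition via a volume in
% phase space, here the volume enclosed by the energy surface for N particles,
S(E,V,N) = k_B \ln \left[ \frac{1}{h^{3N} N!}
    \int_{H(\mathbf{p},\mathbf{q}) \le E} \mathrm{d}^{3N}\!p \, \mathrm{d}^{3N}\!q \right] .
```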
Irudayam, Sheeba Jem; Henchman, Richard H
2010-07-21
An equation for the chemical potential of a dilute aqueous solution of noble gases is derived in terms of energies, force and torque magnitudes, and solute and water coordination numbers, quantities which are all measured from an equilibrium molecular dynamics simulation. Also derived are equations for the Gibbs free energy, enthalpy and entropy of hydration for the Henry's law process, the Ostwald process, and a third proposed process going from an arbitrary concentration in the gas phase to the equivalent mole fraction in aqueous solution which has simpler expressions for the enthalpy and entropy changes. Good agreement with experimental hydration free energies is obtained in the TIP4P and SPC/E water models although the solute's force field appears to affect the enthalpies and entropies obtained. In contrast to other methods, the approach gives a complete breakdown of the entropy for every degree of freedom and makes possible a direct structural interpretation of the well-known entropy loss accompanying the hydrophobic hydration of small non-polar molecules under ambient conditions. The noble-gas solutes experience only a small reduction in their vibrational entropy, with larger solutes experiencing a greater loss. The vibrational and librational entropy components of water actually increase but only marginally, negating any idea of water confinement. The term that contributes the most to the hydrophobic entropy loss is found to be water's orientational term which quantifies the number of orientational minima per water molecule and how many ways the whole hydrogen-bond network can form. These findings help resolve contradictory deductions from experiments that water structure around non-polar solutes is similar to bulk water in some ways but different in others. That the entropy loss lies in water's rotational entropy contrasts with other claims that it largely lies in water's translational entropy, but this apparent discrepancy arises because of different
Energy conservation and maximal entropy production in enzyme reactions.
Dobovišek, Andrej; Vitas, Marko; Brumen, Milan; Fajmut, Aleš
2017-08-01
A procedure for maximization of the density of entropy production in a single stationary two-step enzyme reaction is developed. Under the constraints of mass conservation, a fixed equilibrium constant of the reaction, and fixed products of forward and backward enzyme rate constants, the existence of a maximum in the density of entropy production is demonstrated. In the state with maximal density of entropy production the optimal enzyme rate constants, the stationary concentrations of the substrate and the product, the stationary product yield and the stationary reaction flux are calculated. The test of whether these calculated values of the reaction parameters are consistent with the corresponding measured values is performed for the enzyme Glucose Isomerase. It is found that calculated and measured rate constants agree within an order of magnitude, whereas the calculated reaction flux and the product yield differ from their corresponding measured values by less than 20% and 5%, respectively. This indicates that the enzyme Glucose Isomerase, considered in a non-equilibrium stationary state, as found in experiments using continuous stirred tank reactors, possibly operates close to the state with the maximum in the density of entropy production. Copyright © 2017 Elsevier B.V. All rights reserved.
Kraatz, Miriam; Sears, Lindsay E; Coberley, Carter R; Pope, James E
2016-08-01
Well-being is linked to important societal factors such as health care costs and productivity and has experienced a surge in development activity of both theories and measurement. This study builds on validation of the Well-Being 5 survey and for the first time applies Item Response Theory, a modern and flexible measurement paradigm, to form the basis of adaptive population well-being measurement. Adaptive testing allows survey questions to be administered selectively, thereby reducing the number of questions required of the participant. After the graded response model was fit to a sample of size N = 12,035, theta scores were estimated based on both the full-item bank and a simulation of Computerized Adaptive Testing (CAT). Comparisons of these 2 sets of score estimates with each other and of their correlations with external outcomes of job performance, absenteeism, and hospital admissions demonstrate that the CAT well-being scores maintain accuracy and validity. The simulation indicates that the average survey taker can expect a reduction in number of items administered during the CAT process of almost 50%. An increase in efficiency of this extent is of considerable value because of the time savings during the administration of the survey and the potential improvement of user experience, which in turn can help secure the success of a total population-based well-being improvement program. (Population Health Management 2016;19:284-290).
Entropy jump across an inviscid shock wave
Salas, Manuel D.; Iollo, Angelo
1995-01-01
The shock jump conditions for the Euler equations in their primitive form are derived by using generalized functions. The shock profiles for specific volume, speed, and pressure are shown to be the same; the density, however, has a different shock profile. Careful study of the equations that govern the entropy shows that the inviscid entropy profile has a local maximum within the shock layer. We demonstrate that because of this phenomenon, the entropy propagation equation cannot be used as a conservation law.
Entropy In the Universe: A New Approach
Directory of Open Access Journals (Sweden)
Antonio Alfonso-Faus
2000-09-01
Full Text Available Abstract: We propose a new definition of entropy for any mass m, based on gravitation and through the concept of a gravitational cross section. It turns out to be proportional to mass, and therefore extensive, and to the age of the Universe. It is a Machian approach. It is also the number of gravity quanta the mass has emitted through its age. The entropy of the Universe is thus determined, and the cosmological entropy problem solved.
On Gravitational Entropy of de Sitter Universe
Ulhoa, S C
2013-01-01
The paper deals with the calculation of the gravitational entropy in the context of teleparallel gravity for de Sitter space-time. In such a theory it is possible to define gravitational energy and pressure; thus we use those expressions to construct the gravitational entropy. We interpret the cosmological constant as the temperature and write the first law of thermodynamics. In the limit $\Lambda \ll 1$ we find that the entropy is proportional to the volume and that $\Delta S \geq 0$.
The pressure and entropy of a unitary Fermi gas with particle-hole fluctuation
Gong, Hao; Ruan, Xiao-Xia; Zong, Hong-Shi
2018-01-01
We calculate the pressure and entropy of a unitary Fermi gas based on universal relations combined with our previous prediction of the energy, which was calculated within the framework of the non-self-consistent T-matrix approximation with particle-hole fluctuation. The resulting entropy and pressure are compared with the experimental data and with theoretical results that omit the induced interaction. For the entropy, we find good agreement between our results with particle-hole fluctuation and the experimental measurements reported by the ENS and MIT groups. For the pressure, our results show a systematic upward shift relative to the MIT data.
The Macro and Micro of it Is that Entropy Is the Spread of Energy
Phillips, Jeffrey A.
2016-09-01
While entropy is often described as "disorder," it is better thought of as a measure of how spread out energy is within a system. To illustrate this interpretation of entropy to introductory college or high school students, several activities have been created. Students first study the relationship between microstates and macrostates to better understand the probabilities involved. Then, each student observes how a system evolves as energy is allowed to move within it. By studying how the class's ensemble of systems evolves, the tendency of energy to spread, rather than concentrate, can be observed. All activities require minimal equipment and provide students with a tactile and visual experience with entropy.
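The microstate/macrostate relationship described above can be sketched in a few lines of code. The example below is my own Einstein-solid-style illustration (energy quanta distributed among oscillators), not one of the article's actual classroom activities:

```python
from math import comb

def multiplicity(n_osc, q):
    """Number of microstates: ways to distribute q indistinguishable
    energy quanta among n_osc oscillators, C(q + n_osc - 1, q)."""
    return comb(q + n_osc - 1, q)

# Two identical blocks of 50 oscillators share 100 quanta. Count the
# microstates of the combined system for every possible energy split.
N, Q = 50, 100
omega = {qa: multiplicity(N, qa) * multiplicity(N, Q - qa) for qa in range(Q + 1)}
best = max(omega, key=omega.get)
print(best)  # 50: the macrostate that spreads energy evenly has the most microstates
```

By symmetry the even split maximizes the product of multiplicities, so energy overwhelmingly tends to spread rather than concentrate, which is the point of the classroom activity.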
An Automatic Multilevel Image Thresholding Using Relative Entropy and Meta-Heuristic Algorithms
Directory of Open Access Journals (Sweden)
Josue R. Cuevas
2013-06-01
Full Text Available Multilevel thresholding has long been considered one of the most popular techniques for image segmentation. Multilevel thresholding outputs a gray-scale image in which more details from the original picture can be kept, while binary thresholding can only analyze the image in two colors, usually black and white. However, the multilevel thresholding technique has two major problems: it is time-consuming, i.e., finding appropriate threshold values can take an exceptionally long computation time; and defining a proper number of thresholds or levels that will keep most of the relevant details from the original image is a difficult task. In this study a new evaluation function based on the Kullback-Leibler information distance, also known as relative entropy, is proposed. A property of this new function is that it can help determine the number of thresholds automatically. To offset the expensive computational effort of traditional exhaustive search methods, this study establishes a procedure that combines relative entropy and meta-heuristics. In the experiments performed in this study, the proposed procedure not only provides good segmentation results when compared with a well-known technique such as Otsu’s method, but also constitutes a very efficient approach.
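The Kullback-Leibler information distance at the heart of such an evaluation function is easy to compute for discrete gray-level histograms. The sketch below is a generic illustration; `score_threshold` is my simplified stand-in criterion, not the paper's exact functional:

```python
import numpy as np

def relative_entropy(p, q):
    """Kullback-Leibler information distance D(p || q) = sum p*log(p/q),
    with both histograms normalized to probability distributions."""
    p = np.asarray(p, dtype=float); q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def score_threshold(hist, t):
    """Hypothetical criterion: approximate the histogram by its per-class
    mean frequency below/above threshold t; lower KL = less detail lost."""
    q = np.empty(len(hist))
    q[:t], q[t:] = hist[:t].mean(), hist[t:].mean()
    return relative_entropy(hist, q)

# A bimodal 8-bin gray-level histogram: the best two-class approximation
# preserves more information than a single flat (global-mean) one.
hist = np.array([30.0, 25, 5, 1, 1, 5, 20, 33])
flat = relative_entropy(hist, np.full(8, hist.mean()))
best = min(score_threshold(hist, t) for t in range(1, 8))
print(best < flat)  # True
```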
Liu, Ming-Yu; Tuzel, Oncel; Ramalingam, Srikumar; Chellappa, Rama
2014-01-01
We propose a new objective function for clustering. This objective function consists of two components: the entropy rate of a random walk on a graph and a balancing term. The entropy rate favors formation of compact and homogeneous clusters, while the balancing function encourages clusters with similar sizes and penalizes larger clusters that aggressively group samples. We present a novel construction for the graph associated with the data and show that this construction induces a matroid--a combinatorial structure that generalizes the concept of linear independence in vector spaces. The clustering result is given by the graph topology that maximizes the objective function under the matroid constraint. By exploiting the submodular and monotonic properties of the objective function, we develop an efficient greedy algorithm. Furthermore, we prove a (1/2) approximation bound for the optimality of the greedy solution. We validate the proposed algorithm on various benchmarks and show its competitive performance with respect to popular clustering algorithms. We further apply it to the task of superpixel segmentation. Experiments on the Berkeley segmentation data set reveal its superior performance over the state-of-the-art superpixel segmentation algorithms in all the standard evaluation metrics.
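The entropy-rate component of such an objective can be sketched for an undirected weighted graph. This is my minimal rendering of the standard random-walk entropy rate only, omitting the paper's balancing term and matroid machinery:

```python
import numpy as np

def random_walk_entropy_rate(W):
    """Entropy rate (nats/step) of the stationary random walk on a
    weighted graph W: stationary distribution pi_i = d_i / sum(d),
    transitions P_ij = w_ij / d_i, rate H = -sum_i pi_i sum_j P_ij log P_ij."""
    W = np.asarray(W, dtype=float)
    d = W.sum(axis=1)
    pi = d / d.sum()
    P = W / d[:, None]
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    return float(-(pi * plogp.sum(axis=1)).sum())

# Two triangles joined by a single bridge edge versus the dense 6-clique:
# denser connectivity gives more uncertainty per step, hence a higher rate.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
K6 = np.ones((6, 6)) - np.eye(6)
print(random_walk_entropy_rate(A) < random_walk_entropy_rate(K6))  # True
```

For the clique every row of P is uniform over 5 neighbors, so the rate is exactly log 5.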
An evaluation of coordination relationships during earthquake emergency rescue using entropy theory
Directory of Open Access Journals (Sweden)
Huang Rong
2015-05-01
Full Text Available Emergency rescue after an earthquake is complex work which requires the participation of relief and social organizations. Studying earthquake emergency coordination efficiency can not only help rescue organizations to define their own rescue missions, but also strengthen inter-organizational communication and collaboration, improve the efficiency of emergency rescue, and reduce losses. In this paper, collaborative entropy is introduced to study earthquake emergency rescue operations. To study the emergency rescue coordination relationship, collaborative matrices and collaborative entropy functions are established between emergency relief work and relief organizations, and the collaborative efficiency of the emergency rescue elements is determined based on this entropy function. Finally, the Lushan earthquake is used as an example to evaluate earthquake emergency rescue coordination efficiency.
Constructing black hole entropy from gravitational collapse
Acquaviva, Giovanni; Goswami, Rituparno; Hamid, Aymen I M
2016-01-01
Based on a recent proposal for the gravitational entropy of free gravitational fields, we investigate the thermodynamic properties of black hole formation through gravitational collapse in the framework of the semitetrad 1+1+2 covariant formalism. In the simplest case of an Oppenheimer-Snyder-Datt collapse we prove that the change in gravitational entropy outside a collapsing body is related to the variation of the surface area of the body itself, even before the formation of horizons. As a result, we are able to relate the Bekenstein-Hawking entropy of the black hole endstate to the variation of the vacuum gravitational entropy outside the collapsing body.
Polynomial entropies for Bott nondegenerate Hamiltonian systems
Labrousse, Clémence; Marco, Jean-Pierre
2012-01-01
In this paper, we study the entropy of a Hamiltonian flow restricted to an energy level where it admits a first integral which is nondegenerate in the Bott sense. It is easy to see that for such a flow the topological entropy vanishes. We focus on the polynomial and the weak polynomial entropies. We prove that, under conditions on the critical level of the Bott first integral and dynamical conditions on the Hamiltonian function, the weak polynomial entropy belongs to {0,1} and the polyno...
Entanglement entropy for nonzero genus topologies
Kumar, S. Santhosh; Ghosh, Suman; Shankaranarayanan, S.
2014-03-01
Over the last three decades, entanglement entropy has been obtained for quantum fields propagating in Genus-0 topologies (spheres). For scalar fields propagating in these topologies, it has been shown that the entanglement entropy scales as area. In the last few years, nontrivial topologies have become increasingly relevant in different areas. For instance, in describing quantum phases, it has been realized that long-range entangled states are described by topological order. If quantum entanglement can plausibly provide an explanation for these, it is then imperative to obtain the entanglement entropy in such topologies. In this work, using two different methods, we explicitly show that the entanglement entropy scales as the area of the Genus-1 geometry.
Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf
2017-09-01
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate: $H(p) = -\sum_i p_i \log p_i$. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as $S_{EXT}$ for extensive entropy, $S_{IT}$ for the source information rate in information theory, and $S_{MEP}$ for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes associated with power-law statistics, and finally for multinomial mixture processes.
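The history dependence of the Pólya urn, the first of the three examples, is easy to see in simulation. The sketch below is my own illustration contrasting the urn's self-reinforced color fraction with a memoryless multinomial (fair-coin) counterpart:

```python
import random

def polya_fraction(steps, rng):
    """Pólya urn: start with one red and one blue ball; every draw puts
    the ball back plus one more of the same colour (self-reinforcing)."""
    red, total = 1, 2
    for _ in range(steps):
        if rng.random() < red / total:
            red += 1
        total += 1
    return red / total

def coin_fraction(steps, rng):
    """Memoryless multinomial counterpart: fair-coin heads fraction."""
    return sum(rng.random() < 0.5 for _ in range(steps)) / steps

rng = random.Random(1)
urn = [polya_fraction(500, rng) for _ in range(2000)]
coin = [coin_fraction(500, rng) for _ in range(2000)]
spread = lambda xs: sum((x - 0.5) ** 2 for x in xs) / len(xs)
# History dependence keeps the urn's colour fraction broadly spread out
# (its limit law is uniform on [0,1], variance 1/12), while the
# memoryless coin fraction concentrates tightly around 1/2.
print(spread(urn) > 10 * spread(coin))  # True
```

This non-multinomial concentration behavior is exactly why the three entropy functionals separate for such processes.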
Modeling the Mass Action Dynamics of Metabolism with Fluctuation Theorems and Maximum Entropy
Cannon, William; Thomas, Dennis; Baxter, Douglas; Zucker, Jeremy; Goh, Garrett
The laws of thermodynamics dictate the behavior of biotic and abiotic systems. Simulation methods based on statistical thermodynamics can provide a fundamental understanding of how biological systems function and are coupled to their environment. While mass action kinetic simulations are based on solving ordinary differential equations using rate parameters, analogous thermodynamic simulations of mass action dynamics are based on modeling states using chemical potentials. The latter have the advantage that standard free energies of formation/reaction and metabolite levels are much easier to determine than rate parameters, allowing one to model across a large range of scales. Bridging theory and experiment, statistical thermodynamics simulations allow us to both predict activities of metabolites and enzymes and use experimental measurements of metabolites and proteins as input data. Even if metabolite levels are not available experimentally, it is shown that a maximum entropy assumption is quite reasonable and in many cases results in both the most energetically efficient process and the highest material flux.
Topological entropy of autonomous flows
Energy Technology Data Exchange (ETDEWEB)
Badii, R. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)
1997-06-01
When studying fluid dynamics, especially in a turbulent regime, it is crucial to estimate the number of active degrees of freedom or of localized structures in the system. The topological entropy quantifies the exponential growth of the number of 'distinct' orbits in a dynamical system as a function of their length, in the infinite spatial resolution limit. Here, I illustrate a novel method for its evaluation, which extends beyond maps and is applicable to any system, including autonomous flows: these are characterized by the lack of a definite absolute time scale for the orbit lengths. (author) 8 refs.
Entropy Analysis as an Electroencephalogram Feature Extraction Method
Directory of Open Access Journals (Sweden)
P. I. Sotnikov
2014-01-01
Full Text Available The aim of this study was to evaluate the possibility of using entropy analysis as an electroencephalogram (EEG) feature extraction method in brain-computer interfaces (BCI). The first section of the article describes the proposed algorithm, based on calculating characteristic features using Shannon entropy analysis. The second section discusses the development of a classifier for the EEG records; we use a support vector machine (SVM) as the classifier. The third section describes the test data. We then estimate the efficiency of the considered feature extraction method and compare it with a number of other methods: evaluation of signal variance; estimation of power spectral density (PSD); estimation of autoregression model parameters; signal analysis using the continuous wavelet transform; and construction of a common spatial pattern (CSP) filter. As the measure of efficiency we use the probability of correctly recognizing the type of imagined movement. At the last stage we evaluate the impact of EEG signal preprocessing methods on the final classification accuracy. Finally, we conclude that entropy analysis has good prospects in BCI applications.
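A minimal version of a Shannon-entropy feature computation might look like this. This is a generic sketch, not the authors' exact pipeline, and the SVM classification step is omitted:

```python
import numpy as np

def shannon_entropy_feature(signal, bins=32):
    """Shannon entropy (in bits) of the signal's amplitude histogram;
    yields one scalar feature per EEG channel/epoch."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
# A constant signal concentrates all mass in one histogram bin (entropy 0);
# amplitudes spread evenly over the bins push the feature toward log2(bins).
print(shannon_entropy_feature(np.ones(1024)))                   # 0.0
print(shannon_entropy_feature(rng.uniform(-1, 1, 1024)) > 4.0)  # True
```

In a BCI pipeline one such feature per channel and epoch would be stacked into a vector and passed to the SVM.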
Clausius entropy for arbitrary bifurcate null surfaces
Baccetti, Valentina; Visser, Matt
2014-02-01
Jacobson’s thermodynamic derivation of the Einstein equations was originally applied only to local Rindler horizons. But at least some parts of that construction can usefully be extended to give meaningful results for arbitrary bifurcate null surfaces. As presaged in Jacobson’s original article, this more general construction sharply brings into focus the questions: is entropy objectively ‘real’? Or is entropy in some sense subjective and observer-dependent? These innocent questions open a Pandora’s box of often inconclusive debate. A consensus opinion, though certainly not universally held, seems to be that Clausius entropy (thermodynamic entropy, defined via a Clausius relation $\mathrm{d}S = \delta Q/T$) should be objectively real, but that the ontological status of statistical entropy (Shannon or von Neumann entropy) is much more ambiguous, and much more likely to be observer-dependent. This question is particularly pressing when it comes to understanding Bekenstein entropy (black hole entropy). To perhaps further add to the confusion, we shall argue that even the Clausius entropy can often be observer-dependent. In the current article we shall conclusively demonstrate that one can meaningfully assign a notion of Clausius entropy to arbitrary bifurcate null surfaces—effectively defining a ‘virtual Clausius entropy’ for arbitrary ‘virtual (local) causal horizons’. As an application, we see that we can implement a version of the generalized second law (GSL) for this virtual Clausius entropy. This version of GSL can be related to certain (nonstandard) integral variants of the null energy condition. Because the concepts involved are rather subtle, we take some effort in being careful and explicit in developing our framework. In future work we will apply this construction to generalize Jacobson’s derivation of the Einstein equations.
Pouran, Behdad; Arbabi, Vahid; Weinans, Harrie; Zadpoor, Amir A
2016-11-01
Transport of solutes helps to regulate normal physiology and proper function of cartilage in diarthrodial joints. Multiple studies have shown the effects of characteristic parameters such as concentration of proteoglycans and collagens and the orientation of collagen fibrils on the diffusion process. However, not much quantitative information and accurate models are available to help understand how the characteristics of the fluid surrounding articular cartilage influence the diffusion process. In this study, we used a combination of micro-computed tomography experiments and biphasic-solute finite element models to study the effects of three parameters of the overlying bath on the diffusion of neutral solutes across cartilage zones. Those parameters include bath size, degree of stirring of the bath, and the size and concentration of the stagnant layer that forms at the interface of cartilage and bath. Parametric studies determined the minimum of the finite bath size for which the diffusion behavior reduces to that of an infinite bath. Stirring of the bath proved to remarkably influence neutral solute transport across cartilage zones. The well-stirred condition was achieved only when the ratio of the diffusivity of bath to that of cartilage was greater than ≈1000. While the thickness of the stagnant layer at the cartilage-bath interface did not significantly influence the diffusion behavior, increase in its concentration substantially elevated solute concentration in cartilage. Sufficient stirring attenuated the effects of the stagnant layer. Our findings could be used for efficient design of experimental protocols aimed at understanding the transport of molecules across articular cartilage. Copyright © 2016 Elsevier Ltd. All rights reserved.
ENTROPY CHARACTERISTICS IN MODELS FOR COORDINATION OF NEIGHBORING ROAD SECTIONS
Directory of Open Access Journals (Sweden)
N. I. Kulbashnaya
2016-01-01
Full Text Available The paper considers the application of entropy characteristics as criteria for coordinating traffic conditions at neighboring road sections. Entropy characteristics are widely used in methods that take into account the information influence of the environment on drivers, and in mechanisms that create traffic conditions preserving the optimal level of the driver's emotional tension during the drive. This problem is treated as one of coordinating traffic conditions at neighboring road sections, which in turn aims to eliminate transitional processes for the driver. The methodology for coordinating traffic conditions at neighboring road sections is based on E. V. Gavrilov's concept of coordinating certain parameters of road sections, which can be expressed through entropy characteristics. The paper proposes to select the coordination criteria according to accident rates, because traffic conditions change drastically when moving between neighboring road sections, which can create an accident situation. The relative organization of the driver's perception field and the relative organization of the driver's interaction with the traffic environment were selected as the entropy characteristics, and these characteristics are therefore expressed as functions of the road accident rate. The investigation results reveal a strong correlation between both entropy characteristics and the accident rate, and the executed experiment confirms the influence of the accident rate on the investigated entropy characteristics.
Gaining Efficiency of Computational Experiments in Modeling the Flight Vehicle Movement
Directory of Open Access Journals (Sweden)
I. K. Romanova
2017-01-01
Full Text Available The paper considers one of the important aspects of gaining efficiency in computational experiments, namely grid optimization. Solving this problem ultimately yields a better-performing system, because multivariate simulation is the basis for applying optimization methods with specified criteria and for identifying problems in the functioning of technical systems. The paper discusses a class of moving objects, bodies of revolution, which for one reason or another undergo deformation of the casing. Analyses using the author's techniques have shown that the aerodynamic characteristics of the studied class of deformed objects exhibit complex functional dependencies. The paper presents a literature review of new ways of organizing calculations, data storage, and data transfer, and analyses methods of forming grids, including those used in initial calculations and in the visualization of information. In addition to regular grids, unstructured grids are considered, including grids for dynamic spatial-temporal information. Attention is drawn to the problem of efficient information retrieval, and the paper notes the relevance of working with large data volumes, including OLAP technology, multidimensional cubes (Data Cube) and, finally, an integrated Data Mining approach. Despite the huge number of successful modern approaches to the problems of forming, storing, and processing multidimensional data, it should be noted that these tools are computationally quite expensive: the cost of using the special tools often exceeds the cost of the computational experiments themselves. In this regard, there is no need to abandon traditional tools; the focus should instead be on directly increasing their efficiency. Within the framework of the applied problem under consideration, the chosen tool is the formation of optimal grids. The optimal grid was understood to be a grid in the N
Hacisuleyman, Aysima; Erman, Burak
2017-01-01
It has recently been proposed by Gunasekaran et al. that allostery may be an intrinsic property of all proteins. Here, we develop a computational method that can determine and quantify allosteric activity in any given protein. Based on Schreiber's transfer entropy formulation, our approach leads to an information transfer landscape for the protein that shows the presence of entropy sinks and sources and explains how pairs of residues communicate with each other using entropy transfer. The model can identify the residues that drive the fluctuations of others. We apply the model to Ubiquitin, whose allosteric activity has not been emphasized until recently, and show that there are indeed systematic pathways of entropy and information transfer between residues that correlate well with the activities of the protein. We use 600-nanosecond molecular dynamics trajectories for Ubiquitin and its complex with human polymerase iota, evaluate entropy transfer between all pairs of residues of Ubiquitin, and quantify the binding susceptibility changes upon complex formation. We explain the complex formation propensities of Ubiquitin in terms of entropy transfer. Important residues taking part in allosteric communication in Ubiquitin predicted by our approach are in agreement with results of NMR relaxation dispersion experiments. Finally, we show that the time-delayed correlation of fluctuations of two interacting residues possesses an intrinsic causality that tells which residue controls the interaction and which one is controlled. Our work shows that time-delayed correlations, entropy transfer and causality are the required new concepts for explaining allosteric communication in proteins.
A relative entropy method to measure non-exponential random data
Energy Technology Data Exchange (ETDEWEB)
Liang, Yingjie; Chen, Wen, E-mail: chenwen@hhu.edu.cn
2015-01-23
This paper develops a relative entropy method to measure non-exponential random data in conjunction with the fractional order moment, logarithmic moment and tail statistics of the Mittag–Leffler distribution. The distribution of non-exponential random data follows neither the exponential distribution nor exponential decay. The proposed strategy is validated by analyzing experimental data generated by the Monte Carlo method using the Mittag–Leffler distribution. Compared with the traditional Shannon entropy, the relative entropy method is simple to implement, and its corresponding relative entropies, approximated by the fractional order moment, logarithmic moment and tail statistics, can easily and accurately detect non-exponential random data. - Highlights: • A relative entropy method is developed to measure non-exponential random data. • The fractional order moment, logarithmic moment and tail statistics are employed. • The three strategies of the Mittag–Leffler distribution can be accurately established. • Compared with Shannon entropy, the relative entropy method is easy to implement.
Directory of Open Access Journals (Sweden)
Zhendong Mu
2017-02-01
Driver fatigue has become one of the major causes of traffic accidents, and is a complicated physiological process. However, there is no effective method to detect driving fatigue. Electroencephalography (EEG) signals are complex, unstable, and non-linear; non-linear analysis methods, such as entropy, may therefore be more appropriate. This study evaluates a combined entropy-based processing method of EEG data to detect driver fatigue. In this paper, 12 subjects were selected to take part in an experiment, undergoing driving training in a virtual environment under the instruction of the operator. Four types of entropies (spectrum entropy, approximate entropy, sample entropy and fuzzy entropy) were used to extract features for driver fatigue detection. An electrode selection process and a support vector machine (SVM) classification algorithm were also proposed. The average recognition accuracy was 98.75%. Retrospective analysis of the EEG showed that the features extracted from electrodes T5, TP7, TP8 and FP1 may yield better performance. An SVM classifier using a radial basis function as the kernel obtained better results. The combined entropy-based method demonstrates good classification performance for driver fatigue detection.
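Of the four entropies listed, sample entropy is perhaps the simplest to sketch. The following simplified implementation (our own naming and template-counting convention, not the study's code) shows that a predictable series scores near zero while an irregular one scores high:

```python
import math
import random

def sample_entropy(u, m=2, r=0.2):
    """Simplified sample entropy SampEn(m, r): -ln(A/B), where B and A count
    template pairs of length m and m+1 within Chebyshev tolerance r."""
    def count_matches(k):
        templates = [u[i:i + k] for i in range(len(u) - k + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits
    b, a = count_matches(m), count_matches(m + 1)
    return float('inf') if a == 0 else -math.log(a / b)

random.seed(2)
regular = [i % 2 for i in range(200)]            # perfectly predictable series
noisy = [random.random() for _ in range(200)]    # irregular series
print(sample_entropy(regular), sample_entropy(noisy))
```

Low sample entropy corresponds to regular, self-similar signals; fatigue studies like the one above track how such entropy features change across EEG channels over time.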
Exact Probability Distribution versus Entropy
Directory of Open Access Journals (Sweden)
Kerstin Andersson
2014-10-01
The problem addressed concerns the determination of the average number of successive attempts at guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
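For small alphabets the average number of guesses under this strategy can be computed exactly, which makes the need for approximations at realistic sizes evident. A brute-force sketch (our own names; an i.i.d. first-order letter model assumed):

```python
import math
from itertools import product

def average_guesses(letter_probs, word_len):
    """Exact expected number of guesses for words of i.i.d. letters, guessing
    candidate words in decreasing order of probability."""
    word_probs = sorted(
        (math.prod(letter_probs[c] for c in w)
         for w in product(letter_probs, repeat=word_len)),
        reverse=True)
    return sum(rank * p for rank, p in enumerate(word_probs, start=1))

probs = {'a': 0.5, 'b': 0.3, 'c': 0.2}
for length in (1, 2, 4, 8):
    guesses = average_guesses(probs, length)
    total = len(probs) ** length
    print(length, guesses, guesses / total)  # the proportion shrinks with length
```

The candidate list grows as (alphabet size)^length, so this exact enumeration is only feasible for toy sizes; the abstract's approximations address exactly this blow-up.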
Weighted entropy for segmentation evaluation
Khan, Jesmin F.; Bhuiyan, Sharif M.
2014-04-01
In many image, video and computer vision systems, image segmentation is an essential part. Significant research has been done on image segmentation, and a number of quantitative evaluation methods have been proposed in the literature. However, segmentation evaluation is often subjective, that is, it is done visually or qualitatively. A segmentation evaluation method based on entropy is proposed in this work, which is objective and simple to implement. Weighted self and mutual entropies are proposed to measure the dissimilarity of the pixels among the segmented regions and the similarity within a region. This evaluation technique gives a score that can be used to compare different segmentation algorithms for the same image, to compare the segmentation results of a given algorithm on different images, or to find the best-suited values of the parameters of a segmentation algorithm for a given image. The simulation results show that the proposed method can identify over-segmentation, under-segmentation, and good segmentation.
Fuzzy Pattern Recognition Based on Symmetric Fuzzy Relative Entropy
Shi, Y F; He, L.H.; Chen, J.
2009-01-01
Based on fuzzy similarity degree, entropy, relative entropy and fuzzy entropy, the symmetric fuzzy relative entropy is presented, which not only has a full physical meaning but also offers succinct practicability. The symmetric fuzzy relative entropy can be used to measure the divergence between different fuzzy patterns. The example demonstrates that the symmetric fuzzy relative entropy is valid and reliable for fuzzy pattern recognition and classification, and its classification precision is v...
Nonextensive random-matrix theory based on Kaniadakis entropy
Abul-Magd, A. Y.
2006-01-01
The joint eigenvalue distributions of random-matrix ensembles are derived by applying the principle of maximum entropy to the Renyi, Abe and Kaniadakis entropies. While the Renyi entropy produces essentially the same matrix-element distributions as the expression previously obtained using the Tsallis entropy, and the Abe entropy does not lead to a closed-form expression, the Kaniadakis entropy leads to a new generalized form of the Wigner surmise that describes a transition of the spacing dis...
Fromm, Steven
2017-09-01
In an effort to study and improve the optical trapping efficiency of the 225Ra Electric Dipole Moment experiment, a fully parallelized Monte Carlo simulation of the laser cooling and trapping apparatus was created at Argonne National Laboratory and is now maintained and upgraded at Michigan State University. The simulation allows us to study optimizations and upgrades without having to use limited quantities of 225Ra (15 day half-life) in the experiment's apparatus. It predicts a trapping efficiency that differs from the observed value in the experiment by approximately a factor of thirty. The effects of varying oven geometry, background gas interactions, laboratory magnetic fields, MOT laser beam configurations and laser frequency noise were studied and ruled out as causes of the discrepancy between measured and predicted values of the overall trapping efficiency. Presently, the simulation is being used to help optimize a planned blue slower laser upgrade in the experiment's apparatus, which will increase the overall trapping efficiency by up to two orders of magnitude. This work is supported by Michigan State University, the Director's Research Scholars Program at the National Superconducting Cyclotron Laboratory, and the U.S. DOE, Office of Science, Office of Nuclear Physics, under Contract DE-AC02-06CH11357.
Pellegrino, Antonio
2010-01-01
The LHCb experiment is a single arm spectrometer, designed to study CP violation in B-decays at the Large Hadron Collider (LHC). It is crucial to accurately and efficiently detect the charged decay particles, in the high-density particle environment of the LHC. For this, the Outer Tracker (OT) was
Directory of Open Access Journals (Sweden)
Carolina Castillo
2013-03-01
The response to P and water deficiencies of forage Lotus species has not been sufficiently studied in the Andisol and Vertisol soil orders in Chile's marginal areas. A pot experiment under cover was carried out between October 2007 and March 2008 to study the effects of P and soil water availability (SWA) on DM production, P absorption, and P use efficiency in Lotus spp. The experiment included three Lotus species (L. corniculatus L., L. tenuis Waldst. & Kit. ex Willd., and L. uliginosus Schkuhr), two soils (Andisol and Vertisol), two contrasting P levels (low and high), and two SWA levels (10% and 100%). A completely randomized design with a 3 x 2 x 2 x 2 factorial arrangement with four replicates was used. Accumulated shoot and root DM, P absorption and efficiency, and arbuscular mycorrhizal (AM) colonization were measured. Phosphorus absorption was significantly higher in the Andisol with 100% SWA and high P in the three species, which was reflected in P efficiency: the species exhibited higher P absorption efficiency (PAE) and P utilization efficiency (PUE) with low P, and on average across the three species with low P and high SWA. When the P level was low, L. uliginosus showed the highest PAE and L. corniculatus exhibited the highest PUE. Phosphorus efficiency was also influenced by AM colonization since, on average, mycorrhization in the three species was significantly higher in the low P treatments. Differences existed among species for DM production, response to P, P absorption, PAE, and PUE.
Generalized entropy production fluctuation theorems for quantum ...
Indian Academy of Sciences (India)
Pramana – Journal of Physics, Volume 80, Issue 2. Based on the trajectory-dependent path probability formalism in state space, we derive generalized entropy production fluctuation relations for a quantum system in the presence of measurement and feedback. We have obtained these ...
Entropy and Certainty in Lossless Data Compression
Jacobs, James Jay
2009-01-01
Data compression is the art of using encoding techniques to represent data symbols using less storage space compared to the original data representation. The encoding process builds a relationship between the entropy of the data and the certainty of the system. The theoretical limits of this relationship are defined by the theory of entropy in…
Entropy Generation in a Chemical Reaction
Miranda, E. N.
2010-01-01
Entropy generation in a chemical reaction is analysed without using the general formalism of non-equilibrium thermodynamics at a level adequate for advanced undergraduates. In a first approach to the problem, the phenomenological kinetic equation of an elementary first-order reaction is used to show that entropy production is always positive. A…
Universal canonical entropy for gravitating systems
Indian Academy of Sciences (India)
Since the microcanonical entropy also has universal logarithmic corrections to the area law (from quantum space-time fluctuations, as found earlier) the canonical entropy then has a universal form including logarithmic corrections to the area law. This form is shown to be independent of the index appearing in assumption ...
Entanglement entropy in lattice gauge theories
Buividovich, P. V.
We report on the recent progress in theoretical and numerical studies of entanglement entropy in lattice gauge theories. It is shown that the concept of quantum entanglement between gauge fields in two complementary regions of space can only be introduced if the Hilbert space of physical states is extended in a certain way. In the extended Hilbert space, the entanglement entropy can be partially interpreted as the classical Shannon entropy of the flux of the gauge fields through the boundary between the two regions. Such an extension leads to a reduction procedure which can be easily implemented in lattice simulations by constructing lattices with special topology. This enables us to measure the entanglement entropy in lattice Monte-Carlo simulations. On the simplest example of Z2 lattice gauge theory in (2 + 1) dimensions we demonstrate the relation between entanglement entropy and the classical entropy of the field flux. For SU (2) lattice gauge theory in four dimensions, we find a signature of non-analytic dependence of the entanglement entropy on the size of the region. We also comment on the holographic interpretation of the entanglement entropy.
Quantum aspects of black hole entropy
Indian Academy of Sciences (India)
Quantum corrections to the semiclassical Bekenstein–Hawking area law for black hole entropy, obtained within the quantum geometry framework, are treated in some detail. Their ramification for the holographic entropy bound for bounded stationary spacetimes is discussed. Four dimensional supersymmetric extremal black ...
Quantum aspects of black hole entropy
Indian Academy of Sciences (India)
This survey intends to cover recent approaches to black hole entropy which attempt to go beyond the standard semiclassical perspective. Quantum corrections to the semiclassical Bekenstein–Hawking area law for black hole entropy, obtained within the quantum geometry framework, are treated in some detail.
Chemical Engineering Students' Ideas of Entropy
Haglund, Jesper; Andersson, Staffan; Elmgren, Maja
2015-01-01
Thermodynamics, and in particular entropy, has been found to be challenging for students, not least due to its abstract character. Comparisons with more familiar and concrete domains, by means of analogy and metaphor, are commonly used in thermodynamics teaching, in particular the metaphor "entropy is disorder." However, this particular…
Entropy estimation of very short symbolic sequences
Lesne, Annick; Blanc, Jean-Luc; Pezard, Laurent
2009-04-01
While entropy per unit time is a meaningful index to quantify the dynamic features of experimental time series, its estimation is often hampered in practice by the finite length of the data. We here investigate the performance of entropy estimation procedures, relying either on block entropies or Lempel-Ziv complexity, when only very short symbolic sequences are available. Heuristic analytical arguments point at the influence of temporal correlations on the bias and statistical fluctuations, and put forward a reduced effective sequence length suitable for error estimation. Numerical studies are conducted using, as benchmarks, the wealth of different dynamic regimes generated by the family of logistic maps and stochastic evolutions generated by a Markov chain of tunable correlation time. Practical guidelines and validity criteria are proposed. For instance, block entropy leads to a dramatic overestimation for sequences of low entropy, whereas it outperforms Lempel-Ziv complexity at high entropy. As a general result, the quality of entropy estimation is sensitive to the sequence temporal correlation hence self-consistently depends on the entropy value itself, thus promoting a two-step procedure. Lempel-Ziv complexity is to be preferred in the first step and remains the best estimator for highly correlated sequences.
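A Lempel-Ziv-based entropy estimate of the kind benchmarked above can be sketched as follows (a simplified LZ76-style phrase counting of our own, not the authors' estimator; the rate estimate used here is c(n) * log2(n) / n bits per symbol):

```python
import math
import random

def lz76_phrases(s):
    """Count phrases in a Lempel-Ziv (1976) style parsing: each phrase is the
    shortest prefix of the remaining text that has not yet appeared as a
    substring ending before its last character."""
    phrases, pos = 0, 0
    while pos < len(s):
        length = 1
        while pos + length <= len(s) and s[pos:pos + length] in s[:pos + length - 1]:
            length += 1
        phrases += 1
        pos += length
    return phrases

def entropy_rate_estimate(s):
    """LZ-based entropy-rate estimate in bits/symbol: c(n) * log2(n) / n."""
    return lz76_phrases(s) * math.log2(len(s)) / len(s)

random.seed(4)
coin = ''.join(random.choice('01') for _ in range(5000))  # high-entropy series
period = '01' * 2500                                      # low-entropy series
print(entropy_rate_estimate(coin), entropy_rate_estimate(period))
```

As the abstract notes, such estimators behave very differently at the two extremes: the periodic sequence parses into a handful of phrases, while the random one needs on the order of n / log2(n).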
Progress in High-Entropy Alloys
Energy Technology Data Exchange (ETDEWEB)
Gao, Michael C
2013-12-01
Strictly speaking, high-entropy alloys (HEAs) refer to single-phase, solid-solution alloys with multiprincipal elements in an equal or a near-equal molar ratio whose configurational entropy is tremendously high. This special topic was organized to reflect the focus and diversity of HEA research topics in the community.
Entanglement Entropy in Warped Conformal Field Theories
Castro, A.; Hofman, D.M.; Iqbal, N.
We present a detailed discussion of entanglement entropy in (1+1)-dimensional Warped Conformal Field Theories (WCFTs). We implement the Rindler method to evaluate entanglement and Renyi entropies for a single interval and along the way we interpret our results in terms of twist field correlation
Thermodynamic stabilities of the generalized Boltzmann entropies
Wada, Tatsuaki
2004-09-01
We consider the thermodynamic stability conditions (TSC) on the Boltzmann entropies generalized by Tsallis’ q- and Kaniadakis’ κ-deformed logarithmic functions. It is shown that the corresponding TSCs are not necessarily equivalent to the concavity of the generalized Boltzmann entropies with respect to internal energy. Nevertheless, both the TSCs are equivalent to the positivity of standard heat capacity.
The Thermal Entropy Density of Spacetime
Directory of Open Access Journals (Sweden)
Rongjia Yang
2013-01-01
Introducing the notion of thermal entropy density via the first law of thermodynamics and assuming the Einstein equation as an equation of thermal state, we obtain the thermal entropy density of any arbitrary spacetime without assuming a temperature or a horizon. The results confirm that there is a profound connection between gravity and thermodynamics.
Riccardo, F.; Van Oel, C.J.; De Jong, P.
2013-01-01
The energy efficiency and architectural value of post-war residential buildings in Europe are poor. To deal with energy efficiency, decay and livability problems, improvements of building façades seem to be indicated, especially when combined with tenants' preferences for architectural aesthetics. But a
Durant, Rita A.; Carlon, Donna M.; Downs, Alexis
2017-01-01
This article describes the results of the "Efficiency Challenge," a 10-week, Principles of Management course activity that uses reflection and goal setting to help students understand the concept of operational efficiency. With transformative learning theory as a lens, we base our report on 4 years' worth of student reflections regarding…
What is the entropy of the universe?
Energy Technology Data Exchange (ETDEWEB)
Frampton, Paul H [Department of Physics and Astronomy, UNC-Chapel Hill, NC 27599 (United States); Hsu, Stephen D H; Reeb, David [Institute of Theoretical Science, University of Oregon, Eugene, OR 97403 (United States); Kephart, Thomas W, E-mail: frampton@physics.unc.ed, E-mail: hsu@uoregon.ed, E-mail: tom.kephart@gmail.co, E-mail: dreeb@uoregon.ed [Department of Physics and Astronomy, Vanderbilt University, Nashville, TN 37235 (United States)
2009-07-21
Standard calculations suggest that the entropy of our universe is dominated by black holes, whose entropy is of order their area in Planck units, although they comprise only a tiny fraction of its total energy. Statistical entropy is the logarithm of the number of microstates consistent with the observed macroscopic properties of a system, hence a measure of uncertainty about its precise state. Therefore, assuming unitarity in black hole evaporation, the standard results suggest that the largest uncertainty in the future quantum state of the universe is due to the Hawking radiation from evaporating black holes. However, the entropy of the matter precursors to astrophysical black holes is enormously less than that given by area entropy. If unitarity relates the future radiation states to the black hole precursor states, then the standard results are highly misleading, at least for an observer that can differentiate the individual states of the Hawking radiation.
Entropy of uremia and dialysis technology.
Ronco, Claudio
2013-01-01
The second law of thermodynamics applies with local exceptions to patient history and therapy interventions. Living things preserve their low level of entropy throughout time because they receive energy from their surroundings in the form of food. They gain their order at the expense of disordering the nutrients they consume. Death is the thermodynamically favored state: it represents a large increase in entropy as molecular structure yields to chaos. The kidney is an organ dissipating large amounts of energy to maintain the level of entropy of the organism as low as possible. Diseases, and in particular uremia, represent conditions of rapid increase in entropy. Therapeutic strategies are oriented towards a reduction in entropy or at least a decrease in the speed of entropy increase. Uremia is a process accelerating the trend towards randomness and disorder (increase in entropy). Dialysis is a factor external to the patient that tends to reduce the level of entropy caused by kidney disease. Since entropy can only increase in closed systems, energy and work must be spent to limit the entropy of uremia. This energy should be adapted to the system (patient) and be specifically oriented and personalized. This includes a multidimensional effort to achieve an adequate dialysis that goes beyond small molecular weight solute clearance. It includes a biological plan for recovery of homeostasis and a strategy towards long-term rehabilitation of the patient. Such objectives can be achieved with a combination of technology and innovation to answer specific questions that are still present after 60 years of dialysis history. This change in the individual bioentropy may represent a local exception to natural trends as the patient could be considered an isolated universe responding to the classic laws of thermodynamics. Copyright © 2013 S. Karger AG, Basel.
Kumar, Dinesh
2013-01-01
Sequence-specific resonance assignment and secondary structure determination of proteins form the basis for a variety of structural and functional proteomics studies by NMR. In this context, an efficient standalone method for rapid assignment of backbone (1H, 15N, 13Ca and 13C') resonances and secondary structure determination of proteins is presented here. Compared to currently available strategies, the method employs only a single reduced dimensionality (RD) experiment, (4,3)D-hnCOCANH, and exploits linear combinations of backbone (13Ca and 13C') chemical shifts to achieve a dispersion relatively better than that of the individual chemical shifts (see the text) for efficient and rapid data analysis. Further, the experiment yields a spectrum with direct distinction of self (intra-residue) and sequential (inter-residue) carbon correlation peaks; these appear with opposite signs and can therefore easily be discriminated without an additional complementary experiment. On ...
Entropy demystified the second law reduced to plain common sense
Ben-Naim, Arieh
2016-01-01
In this unique book, the reader is invited to experience the joy of appreciating something which has eluded understanding for many years — entropy and the Second Law of Thermodynamics. The book has a two-pronged message: first, that the Second Law is not infinitely incomprehensible as commonly stated in most textbooks on thermodynamics, but can, in fact, be comprehended through sheer common sense; and second, that entropy is not a mysterious quantity that has resisted understanding but a simple, familiar and easily comprehensible concept.Written in an accessible style, the book guides the reader through an abundance of dice games and examples from everyday life. The author paves the way for readers to discover for themselves what entropy is, how it changes, and, most importantly, why it always changes in one direction in a spontaneous process.In this new edition, seven simulated games are included so that the reader can actually experiment with the games described in the book. These simulated games are mean...
Maximum Entropy: Clearing up Mysteries
Directory of Open Access Journals (Sweden)
Marian Grendár
2001-04-01
There are several mystifications and a couple of mysteries pertinent to MaxEnt. The mystifications, pitfalls and traps are set up mainly by an unfortunate formulation of Jaynes' die problem, the cause célèbre of MaxEnt. After discussing the mystifications, a new formulation of the problem is proposed. Then we turn to the mysteries. An answer to the recurring question 'Just what are we accomplishing when we maximize entropy?' [8], based on the MaxProb rationale of MaxEnt [6], is recalled. A brief view on the other mystery: 'What is the relation between MaxEnt and the Bayesian method?' [9], in light of the MaxProb rationale of MaxEnt, suggests that there is not and cannot be a conflict between MaxEnt and Bayes' Theorem.
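Jaynes' die problem mentioned above has a standard MaxEnt solution: maximize entropy over pmfs on faces 1-6 subject to a given mean, which gives p_i proportional to exp(-lam*i). A sketch solving for the Lagrange multiplier by bisection (our own code, not from the paper):

```python
import math

def maxent_die(target_mean, faces=6):
    """Maximum-entropy pmf on faces 1..faces given the mean throw. The MaxEnt
    form is p_i ~ exp(-lam*i); lam is found by bisection, using the fact that
    the mean decreases monotonically as lam increases."""
    def mean_for(lam):
        w = [math.exp(-lam * i) for i in range(1, faces + 1)]
        return sum(i * wi for i, wi in enumerate(w, start=1)) / sum(w)
    lo, hi = -20.0, 20.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid   # mean still too high on this side: move toward larger lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * i) for i in range(1, faces + 1)]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(4.5)   # Jaynes' die: average throw 4.5 instead of the fair 3.5
print([round(pi, 4) for pi in p])
```

For a mean above 3.5 the multiplier comes out negative, so the probabilities increase monotonically toward the high faces, as in Jaynes' classic answer.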
Entanglement Entropy of Quantum Hall Systems with Short Range Disorder
Friedman, Barry; Levine, Greg
2015-03-01
The critical value of the mobility for which the filling 5/2 quantum Hall effect is destroyed by short range disorder is determined from an earlier calculation of the entanglement entropy. The value agrees well with experiment; this agreement is particularly significant in that there are no adjustable parameters. Entanglement entropy vs. disorder strength for filling 1/2, filling 9/2 and filling 7/3 is calculated. For filling 1/2 there is no evidence for a transition for the disorder strengths considered; for filling 9/2 there appears to be a stripe-liquid transition. For filling 7/3 there again appears to be a transition at similar value of the disorder strength as the 5/2 transition but there are stronger finite size effects.
Packer Detection for Multi-Layer Executables Using Entropy Analysis
Directory of Open Access Journals (Sweden)
Munkhbayar Bat-Erdene
2017-03-01
Packing algorithms are broadly used to evade anti-malware systems, and the proportion of packed malware has been growing rapidly. However, only a few studies have been conducted on detecting various types of packing algorithms in a systematic way. Following this understanding, we elaborate a method to classify the packing algorithm of a given executable into three categories: single-layer packing, re-packing, or multi-layer packing. We convert the entropy values of the executable file loaded into memory into symbolic representations, using SAX (Symbolic Aggregate Approximation). Based on experiments with 2196 programs and 19 packing algorithms, we show that the precision (97.7%), accuracy (97.5%), and recall (96.8%) of our method are high enough to confirm that entropy analysis is applicable to identifying packing algorithms.
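The entropy values that feed the SAX symbolization described above are typically per-window Shannon entropies of the binary image. A sketch of that first step (our own names and synthetic data; real packed sections, rather than random bytes, would be used in practice):

```python
import math
import random
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy of a byte string, in bits per byte (0..8)."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

def entropy_profile(data, window=256):
    """Entropy of consecutive non-overlapping windows; sustained values near
    8 bits/byte suggest packed or encrypted sections, low values plain data."""
    return [shannon_entropy(data[i:i + window])
            for i in range(0, len(data) - window + 1, window)]

random.seed(3)
packed_like = bytes(random.randrange(256) for _ in range(4096))  # stand-in for a packed section
plain_like = b"mov eax, ebx; ret; " * 256                        # repetitive, code-like bytes

print(min(entropy_profile(packed_like)), max(entropy_profile(plain_like)))
```

Converting such a profile into discrete symbols (the SAX step) then lets a classifier recognize the layered entropy signatures of single-, re-, and multi-layer packing.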
Entropy Rate Estimates for Natural Language—A New Extrapolation of Compressed Large-Scale Corpora
Directory of Open Access Journals (Sweden)
Ryosuke Takahira
2016-10-01
One of the fundamental questions about human language is whether its entropy rate is positive. The entropy rate measures the average amount of information communicated per unit time. The question about the entropy of language dates back to experiments by Shannon in 1951, but in 1990 Hilberg raised doubts regarding a correct interpretation of these experiments. This article provides an in-depth empirical analysis, using 20 corpora of up to 7.8 gigabytes across six languages (English, French, Russian, Korean, Chinese, and Japanese), to conclude that the entropy rate is positive. To obtain the estimates for data length tending to infinity, we use an extrapolation function given by an ansatz. Whereas some ansatzes were proposed previously, here we use a new stretched exponential extrapolation function that has a smaller error of fit. Thus, we conclude that the entropy rates of human languages are positive but approximately 20% smaller than without extrapolation. Although the entropy rate estimates depend on the script kind, the exponent of the ansatz function turns out to be constant across different languages and governs the complexity of natural language in general. In other words, in spite of typological differences, all languages seem equally hard to learn, which partly confirms Hilberg's hypothesis.
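Extrapolating entropy-rate estimates to infinite data length amounts to fitting an ansatz to the estimate-versus-size curve. The sketch below fits a Hilberg-type form h(n) = h_inf + A * n^(beta-1) by grid search (our own simplification on synthetic data; the paper's stretched-exponential ansatz differs in detail):

```python
def fit_hilberg(ns, hs):
    """Least-squares fit of h(n) = h_inf + A * n**(beta - 1): grid search over
    beta with a closed-form linear solve for (h_inf, A) at each candidate."""
    best = None
    for k in range(1, 100):
        beta = k / 100.0
        xs = [n ** (beta - 1.0) for n in ns]
        mx, mh = sum(xs) / len(xs), sum(hs) / len(hs)
        sxx = sum((x - mx) ** 2 for x in xs)
        sxh = sum((x - mx) * (h - mh) for x, h in zip(xs, hs))
        a = sxh / sxx
        h_inf = mh - a * mx
        sse = sum((h_inf + a * x - h) ** 2 for x, h in zip(xs, hs))
        if best is None or sse < best[0]:
            best = (sse, h_inf, a, beta)
    return best[1], best[2], best[3]

# synthetic "entropy estimate vs corpus size" curve with a known limit
ns = [2 ** k for k in range(6, 24)]
hs = [1.2 + 3.0 * n ** (-0.5) for n in ns]
h_inf, amp, beta = fit_hilberg(ns, hs)
print(h_inf, beta)   # should recover a limit near 1.2 bits and beta near 0.5
```

The fitted intercept h_inf plays the role of the extrapolated entropy rate; on real corpora the finite-size estimates lie above it, which is why extrapolation lowers the rate by roughly 20% in the study above.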
Entropy production in a box: Analysis of instabilities in confined hydrothermal systems
Börsing, N.; Wellmann, J. F.; Niederau, J.; Regenauer-Lieb, K.
2017-09-01
We evaluate if the concept of thermal entropy production can be used as a measure to characterize hydrothermal convection in a confined porous medium as a valuable, thermodynamically motivated addition to the standard Rayleigh number analysis. Entropy production has been used widely in the field of mechanical and chemical engineering as a way to characterize the thermodynamic state and irreversibility of an investigated system. Pioneering studies have since adapted these concepts to natural systems, and we apply this measure here to investigate the specific case of hydrothermal convection in a "box-shaped" confined porous medium, as a simplified analog for, e.g., hydrothermal convection in deep geothermal aquifers. We perform various detailed numerical experiments to assess the response of the convective system to changing boundary conditions or domain aspect ratios, and then determine the resulting entropy production for each experiment. In systems close to the critical Rayleigh number, we derive results that are in accordance to the analytically derived predictions. At higher Rayleigh numbers, however, we observe multiple possible convection modes, and the analysis of the integrated entropy production reveals distinct curves of entropy production that provide an insight into the hydrothermal behavior in the system, both for cases of homogeneous materials, as well as for heterogeneous spatial material distributions. We conclude that the average thermal entropy production characterizes the internal behavior of hydrothermal systems with a meaningful thermodynamic measure, and we expect that it can be useful for the investigation of convection systems in many similar hydrogeological and geophysical settings.
Ai, Yan-Ting; Guan, Jiao-Yue; Fei, Cheng-Wei; Tian, Jing; Zhang, Feng-Ling
2017-05-01
To monitor rolling bearing operating status with casings in real time efficiently and accurately, a fusion method based on n-dimensional characteristic parameter distance (n-DCPD) was proposed for rolling bearing fault diagnosis with two types of signals: vibration signals and acoustic emission signals. The n-DCPD was investigated based on four information entropies (singular spectrum entropy in the time domain, power spectrum entropy in the frequency domain, and wavelet space characteristic spectrum entropy and wavelet energy spectrum entropy in the time-frequency domain), and the basic idea of the fusion information entropy fault diagnosis method with n-DCPD is given. Through a rotor simulation test rig, the vibration and acoustic emission signals of six rolling bearing states (ball fault, inner race fault, outer race fault, inner-ball faults, inner-outer faults and normal) were collected under different operating conditions, with emphasis on rotation speeds from 800 rpm to 2000 rpm. With the proposed fusion information entropy method with n-DCPD, the diagnosis of rolling bearing faults was completed. The fault diagnosis results show that the fusion entropy method achieves high precision in the recognition of rolling bearing faults. The efforts of this study provide a novel and useful methodology for the fault diagnosis of an aeroengine rolling bearing.
Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray
2014-05-13
The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.
Directory of Open Access Journals (Sweden)
Gyöngyi Munkácsy
2016-01-01
No independent cross-validation of the success rate for studies utilizing small interfering RNA (siRNA) for gene silencing has been completed before. To assess the influence of experimental parameters like cell line, transfection technique, validation method, and type of control, we have to validate these in a large set of studies. We utilized gene chip data published for siRNA experiments to assess the success rate and to compare methods used in these experiments. We searched NCBI GEO for samples with whole transcriptome analysis before and after gene silencing and evaluated the efficiency for the target and off-target genes using the array-based expression data. The Wilcoxon signed-rank test was used to assess silencing efficacy, and Kruskal–Wallis tests and Spearman rank correlation were used to evaluate study parameters. Altogether 1,643 samples representing 429 experiments published in 207 studies were evaluated. The fold change (FC) of down-regulation of the target gene was above 0.7 in 18.5% and above 0.5 in 38.7% of experiments. Silencing efficiency was lowest in MCF7 and highest in SW480 cells (FC = 0.59 and FC = 0.30, respectively, P = 9.3E−06). Studies utilizing Western blot for validation performed better than those with quantitative polymerase chain reaction (qPCR) or microarray (FC = 0.43, FC = 0.47, and FC = 0.55, respectively, P = 2.8E−04). There was no correlation between type of control, transfection method, publication year, and silencing efficiency. Although gene silencing is a robust feature successfully cross-validated in the majority of experiments, efficiency remained insufficient in a significant proportion of studies. Selection of the cell line model and validation method had the highest influence on silencing proficiency.
Triadic conceptual structure of the maximum entropy approach to evolution.
Herrmann-Pillath, Carsten; Salthe, Stanley N
2011-03-01
Many problems in evolutionary theory are cast in dyadic terms, such as the polar oppositions of organism and environment. We argue that a triadic conceptual structure offers an alternative perspective under which the information-generating role of evolution as a physical process can be analyzed, and we propose a new diagrammatic approach. Peirce's natural philosophy was deeply influenced by his reception of both Darwin's theory and thermodynamics. Thus, we elaborate a new synthesis that brings together his theory of signs and modern Maximum Entropy approaches to evolution in a process discourse. Following recent contributions to the naturalization of Peircean semiosis, pointing towards 'physiosemiosis' or 'pansemiosis', we show that triadic structures involve the conjunction of three different kinds of causality: efficient, formal and final. In doing so, we adapt the state-centered thermodynamic framework to a process approach. We apply this to Ulanowicz's analysis of autocatalytic cycles as primordial patterns of life. This paves the way for a semiotic view of thermodynamics built on the idea that Peircean interpretants are systems of physical inference devices evolving under natural selection. In this view, the principles of Maximum Entropy, Maximum Power, and Maximum Entropy Production work together to drive the emergence of information-carrying structures, which at the same time maximize information capacity as well as the gradients of energy flows, such that ultimately, contrary to Schrödinger's seminal contribution, the evolutionary process is seen to be a physical expression of the Second Law. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Symbolic phase transfer entropy method and its application
Zhang, Ningning; Lin, Aijing; Shang, Pengjian
2017-10-01
In this paper, we introduce symbolic phase transfer entropy (SPTE) to infer the direction and strength of information flow among systems. The advantages of the proposed method are investigated through simulations on synthetic signals and real-world data. We demonstrate that symbolic phase transfer entropy is a robust and efficient tool for inferring the information flow between complex systems. Based on the study of the synthetic data, we find that a significant advantage of SPTE is its reduced sensitivity to noise. In addition, SPTE requires less data than symbolic transfer entropy (STE). We analyze the direction and strength of information flow between six stock markets during the period from 2006 to 2016. The results indicate that the information flow among stocks varies over different periods. We also find that the interaction network pattern among stocks undergoes hierarchical reorganization in the transition from one period to another. It is shown that the clusters are classified mainly by period, and then by region. Stocks from the same time period fall into the same cluster.
Introducing E-tec: Ensemble-based Topological Entropy Calculation
Roberts, Eric; Smith, Spencer; Sindi, Suzanne; Smith, Kevin
2017-11-01
Topological entropy is a measure of orbit complexity in a dynamical system that can be estimated in 2D by embedding an initial material curve L0 in the fluid and tracking its growth under the evolution of the flow. This growth is given by L(t) = |L0| e^{ht}, where L(t) is the length of the curve as a function of t and h is the topological entropy. In order to develop a method for computing this growth rate that will efficiently scale up in both system size and modeling time, one must be clever about extracting the maximum information from the limited trajectories available. The relative motion of trajectories through phase space encodes global information that is not contained in any individual trajectory. That is, extra information is ''hiding'' in an ensemble of classical trajectories, which is not exploited in a trajectory-by-trajectory approach. Using tools from computational geometry, we introduce a new algorithm designed to take advantage of such additional information, requiring only potentially sparse sets of particle trajectories as input and no detailed knowledge of the velocity field: the Ensemble-Based Topological Entropy Calculation, or E-tec.
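The growth law L(t) = |L0| e^{ht} above implies that h is simply the slope of log L(t) against t. A minimal sketch of that estimation step (the function name and the synthetic curve lengths are illustrative assumptions, not E-tec itself, which additionally reconstructs L(t) from sparse trajectory ensembles):

```python
import numpy as np

def topological_entropy(times, lengths):
    """Estimate h from exponential curve growth L(t) = |L0| * exp(h*t)
    via a least-squares fit of log L(t) against t."""
    t = np.asarray(times, float)
    log_L = np.log(np.asarray(lengths, float))
    h, _intercept = np.polyfit(t, log_L, 1)  # slope of the log-linear fit
    return h

# synthetic curve lengths growing with known entropy h = 0.7
t = np.linspace(0.0, 5.0, 20)
L = 2.0 * np.exp(0.7 * t)
```

In practice L(t) is obtained by advecting the material curve under the flow; the fit is only meaningful once the exponential regime dominates.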
Entropy Generation of Desalination Powered by Variable Temperature Waste Heat
Directory of Open Access Journals (Sweden)
David M. Warsinger
2015-10-01
Powering desalination by waste heat is often proposed to mitigate energy consumption and environmental impact; however, thorough technology comparisons are lacking in the literature. This work numerically models the efficiency of six representative desalination technologies powered by waste heat at 50, 70, 90, and 120 °C, where applicable. Entropy generation and Second Law efficiency analysis are applied to the systems and their components. The technologies considered are thermal desalination by multistage flash (MSF), multiple effect distillation (MED), multistage vacuum membrane distillation (MSVMD), and humidification-dehumidification (HDH), and organic Rankine cycles (ORCs) paired with the mechanical technologies of reverse osmosis (RO) and mechanical vapor compression (MVC). The most efficient technology was RO, followed by MED. Performance among MSF, MSVMD, and MVC was similar, but the relative performance varied with waste heat temperature and system size. Entropy generation in the thermal technologies increases at lower waste heat temperatures, largely in the feed or brine portions of the various heat exchangers used. This occurs largely because lower temperatures reduce recovery, increasing the relative flow rates of feed and brine. However, HDH (without extractions) had the reverse trend, being competitive only at lower temperatures. For the mechanical technologies, the energy efficiency varies with temperature only because of the significant losses from the ORC.
On entropy, financial markets and minority games
Zapart, Christopher A.
2009-04-01
The paper builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in [L. Molgedey, W. Ebeling, Local order, entropy and predictability of financial time series, European Physical Journal B-Condensed Matter and Complex Systems 15/4 (2000) 733-737]. A novel generic procedure is proposed for making multistep-ahead predictions of time series by building a statistical model of entropy. The approach is first demonstrated on the chaotic Mackey-Glass time series and later applied to Japanese Yen/US dollar intraday currency data. The paper also reinterprets Minority Games [E. Moro, The minority game: An introductory guide, Advances in Condensed Matter and Statistical Physics (2004)] within the context of physical entropy, and uses models derived from minority game theory as a tool for measuring the entropy of a model in response to time series. This entropy conditional upon a model is subsequently used in place of information-theoretic entropy in the proposed multistep prediction algorithm.
Information, Entropy, and the Classical Ideal Gas
Sands, David; Dunning-Davies, Jeremy
2013-09-01
The physical basis of the canonical and grand canonical distributions is questioned. In particular, we question the usual methods by which these distributions are derived, namely that fluctuations in entropy around energy and particle number are assumed to occur when the entropy depends only on variables which cannot themselves fluctuate. We show, starting from the Maxwellian velocity distribution, that the probability that a classical ideal gas at a fixed temperature occupies a given energy state corresponds not to the canonical ensemble of classical statistical mechanics but to the Gamma distribution. Computer simulations of a hard-sphere fluid demonstrate the principles. The analysis is extended to open systems in which the number of particles fluctuates and we show that for a system connected to a particle reservoir the Poisson distribution governs the probability of finding a given number of particles. The resulting probability distributions are used to calculate the Shannon information entropy which is then compared with the thermodynamic entropy. We argue that information theoretic entropy and thermodynamic entropy, whilst related, are not necessarily identical and that the information entropy contains non-thermodynamic components.
The rank-size scaling law and entropy-maximizing principle
Chen, Yanguang
2012-02-01
The rank-size regularity known as Zipf's law is one of the scaling laws frequently observed in the natural living world and in social institutions. Many scientists have tried to derive the rank-size scaling relation through entropy-maximizing methods, but they have not been entirely successful. By introducing a pivotal constraint condition, I present here a set of new derivations based on the self-similar hierarchy of cities. First, I derive a pair of exponential laws by postulating local entropy maximization. From the two exponential laws follows a general hierarchical scaling law, which implies the general form of Zipf's law. Second, I derive a special hierarchical scaling law with the exponent equal to 1 by postulating global entropy maximization, and this implies the pure form of Zipf's law. The rank-size scaling law proves to be a special case of the hierarchical scaling law, and the derivation suggests a certain scaling range with the first or the last data point as an outlier. The entropy maximization of social systems differs from the notion of entropy increase in thermodynamics. For urban systems, entropy maximization suggests the greatest equilibrium between equity for parts/individuals and efficiency of the whole.
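The rank-size relation discussed above has the form P(r) = P1 · r^(−q), with the pure Zipf case q = 1. A short sketch of how the exponent is recovered empirically by a log-log regression of size against rank (the function name and synthetic city sizes are illustrative assumptions, not from the paper):

```python
import numpy as np

def fit_zipf_exponent(sizes):
    """Fit q in the rank-size law P(r) = P1 * r**(-q) by a log-log
    least-squares regression of sorted size against rank."""
    sizes = np.sort(np.asarray(sizes, float))[::-1]  # rank 1 = largest
    ranks = np.arange(1, len(sizes) + 1)
    slope, _intercept = np.polyfit(np.log(ranks), np.log(sizes), 1)
    return -slope

# synthetic city sizes obeying the pure form of Zipf's law, q = 1
ranks = np.arange(1, 101)
sizes = 1000.0 * ranks ** -1.0
```

As the abstract notes, real data usually follow this law only over a limited scaling range, with the first or last data point often behaving as an outlier that should be excluded from the fit.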
Shojaeezadeh, Shahab Aldin; Amiri, Seyyed Mehrab
2018-02-01
Estimation of the velocity distribution profile is a challenging subject in open-channel hydraulics. In this study, an entropy-based method is used to derive the two-dimensional velocity distribution profile. The General Index Entropy (GIE) can be considered the generalized form of Shannon entropy, suitable for combination with different forms of the Cumulative Distribution Function (CDF). Using the principle of maximum entropy (POME), the velocity distribution is obtained by maximizing the GIE while treating the velocity as a random variable. The combination of GIE and a CDF proposed by Marini et al. (2011) is utilized to introduce an efficient entropy model whose results are comparable with several well-known experimental and field data sets. Consequently, although the model's parameters are less sensitive to flow conditions and the model is less complex to apply than other entropy-based methods, it estimates the velocity distribution profile more accurately, both near the boundaries and near the free surface of the flow.
Metastable high-entropy dual-phase alloys overcome the strength-ductility trade-off
Li, Zhiming; Pradeep, Konda Gokuldoss; Deng, Yun; Raabe, Dierk; Tasan, Cemal Cem
2016-06-01
Metals have been mankind’s most essential materials for thousands of years; however, their use is affected by ecological and economical concerns. Alloys with higher strength and ductility could alleviate some of these concerns by reducing weight and improving energy efficiency. However, most metallurgical mechanisms for increasing strength lead to ductility loss, an effect referred to as the strength-ductility trade-off. Here we present a metastability-engineering strategy in which we design nanostructured, bulk high-entropy alloys with multiple compositionally equivalent high-entropy phases. High-entropy alloys were originally proposed to benefit from phase stabilization through entropy maximization. Yet here, motivated by recent work that relaxes the strict restrictions on high-entropy alloy compositions by demonstrating the weakness of this connection, the concept is overturned. We decrease phase stability to achieve two key benefits: interface hardening due to a dual-phase microstructure (resulting from reduced thermal stability of the high-temperature phase); and transformation-induced hardening (resulting from the reduced mechanical stability of the room-temperature phase). This combines the best of two worlds: extensive hardening due to the decreased phase stability known from advanced steels and massive solid-solution strengthening of high-entropy alloys. In our transformation-induced plasticity-assisted, dual-phase high-entropy alloy (TRIP-DP-HEA), these two contributions lead respectively to enhanced trans-grain and inter-grain slip resistance, and hence, increased strength. Moreover, the increased strain hardening capacity that is enabled by dislocation hardening of the stable phase and transformation-induced hardening of the metastable phase produces increased ductility. This combined increase in strength and ductility distinguishes the TRIP-DP-HEA alloy from other recently developed structural materials. This metastability-engineering strategy should
Entropy generation: Minimum inside and maximum outside
Lucia, Umberto
2014-02-01
The extremum of entropy generation is evaluated for both the maximum and minimum cases using a thermodynamic approach usually applied in engineering to design energy transduction systems. A new result in the thermodynamic analysis of the entropy generation extremum theorem is proved by the engineering approach. It follows from the proof that entropy generation is a maximum when evaluated from the exterior surroundings of the system and a minimum when evaluated within the system. The Bernoulli equation is analyzed as an example in order to evaluate the internal and external dissipations, in accordance with the theoretical results obtained.
Entropy in Kerr-Newman-Kasuya spacetime
Gao Chang Jun
2002-01-01
The entropy of a rotating Kerr-Newman-Kasuya black hole due to massive charged fields (bosons and fermions) is calculated using an improved brick-wall model. The result shows that the entropy depends not on the mass and charge but on the spin of the fields. On statistical-physics grounds, we propose not to count the superradiant modes for bosons (fermion fields do not display superradiance). In fact, the non-superradiant modes contribute exactly the area entropy for both bosons and fermions.
Entropy viscosity method for nonlinear conservation laws
Guermond, Jean-Luc
2011-05-01
A new class of high-order numerical methods for approximating nonlinear conservation laws is described (entropy viscosity method). The novelty is that a nonlinear viscosity based on the local size of an entropy production is added to the numerical discretization at hand. This new approach does not use any flux or slope limiters, applies to equations or systems supplemented with one or more entropy inequalities and does not depend on the mesh type and polynomial approximation. Various benchmark problems are solved with finite elements, spectral elements and Fourier series to illustrate the capability of the proposed method. © 2010 Elsevier Inc.
Entropy/information flux in Hawking radiation
Alonso-Serrano, Ana; Visser, Matt
2018-01-01
Blackbody radiation contains (on average) an entropy of 3.9 ± 2.5 bits per photon. If the emission process is unitary, then this entropy is exactly compensated by "hidden information" in the correlations. We extend this argument to the Hawking radiation from GR black holes, demonstrating that the assumption of unitarity leads to a perfectly reasonable entropy/information budget. The key technical aspect of our calculation is a variant of the "average subsystem" approach developed by Page, which we extend beyond bipartite pure systems, to a tripartite pure system that considers the influence of the environment.
Permutation Entropy for Random Binary Sequences
Directory of Open Access Journals (Sweden)
Lingfeng Liu
2015-12-01
In this paper, we generalize the permutation entropy (PE) measure, which is based on Shannon's entropy, to binary sequences and theoretically analyze this measure for random binary sequences. We deduce the theoretical value of PE for random binary sequences, which can be used to measure the randomness of binary sequences. We also reveal the relationship between this PE measure and other randomness measures, such as Shannon's entropy and Lempel–Ziv complexity. The results show that PE is consistent with these two measures. Furthermore, we use PE as one of the randomness measures to evaluate the randomness of chaotic binary sequences.
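A simple way to see how an ordinal-pattern entropy behaves on binary data is to compute the Shannon entropy of overlapping m-bit words, which is maximal (m bits) for uniformly random sequences and collapses for periodic ones. The sketch below is an illustrative word-based analogue, not the paper's exact generalized-PE definition; the function name `pattern_entropy` is an assumption:

```python
from collections import Counter
import math

def pattern_entropy(bits, m=3):
    """Shannon entropy (in bits) of the distribution of overlapping
    m-bit words in a binary sequence -- a word-based analogue of
    permutation entropy for binary data (illustrative sketch)."""
    words = [tuple(bits[i:i + m]) for i in range(len(bits) - m + 1)]
    counts = Counter(words)
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# periodic sequence: only two 3-bit words (010, 101) ever occur
alternating = [0, 1] * 50
```

For the alternating sequence the measure gives exactly 1 bit, far below the 3-bit maximum a random binary sequence approaches, so low values flag structure rather than randomness.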
Density estimation by maximum quantum entropy
Energy Technology Data Exchange (ETDEWEB)
Silver, R.N.; Wallstrom, T.; Martz, H.F.
1993-11-01
A new Bayesian method for non-parametric density estimation is proposed, based on a mathematical analogy to quantum statistical physics. The mathematical procedure is related to maximum entropy methods for inverse problems and image reconstruction. The information divergence enforces global smoothing toward default models, convexity, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing is enforced by constraints on differential operators. The linear response of the estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood (evidence). The method is demonstrated on textbook data sets.
Emerging issues in the evaluation of energy-efficiency programs. The US experience
Energy Technology Data Exchange (ETDEWEB)
Vine, E. [Lawrence Berkeley National Laboratory LBNL, Berkeley, CA (United States); Hall, N.; Keating, K.M. [TecMarket Works, Oregon, WI (United States); Kushler, M. [American Council for an Energy-Efficient Economy, Washington, DC (United States); Prahl, R. [Prahl and Associates, Fremont, CA (United States)
2012-01-15
The evaluation, measurement, and verification (EM&V) of energy-efficiency programs has a rich and extensive history in the United States, dating back to the late 1970s. During this time, many different kinds of EM&V issues have been addressed: technical (primarily focusing on EM&V methods and protocols), policy (primarily focusing on how EM&V results will be used by energy-efficiency program managers and policymakers), and infrastructure (primarily focusing on the development of EM&V professionals and an EM&V workforce). We address the issues that are currently important and/or are expected to become more critical in the coming years. We expect many of these issues will also be relevant for a non-US audience, particularly as more attention is paid to the reliability of energy savings and carbon emissions reductions from energy-efficiency programs.
Directory of Open Access Journals (Sweden)
Rebecca Impelizieri Moura da Silveira
2013-12-01
This article aims to identify the productive characteristics that differentiate technically efficient and inefficient firms in the furniture industry in Brazil. Companies were sampled chiefly by accessibility, while still ensuring the assumptions of homogeneity and reliability of the data. Data collection occurred in loco (May-September 2011) with the respondents, mostly corporate managers. The data analysis is based on efficiency scores obtained using Data Envelopment Analysis. Efficient companies can be characterized as mid-sized firms with push production. Additionally, they develop much of their processes internally, making limited use of outsourcing; place less emphasis on the research and development of new products and on client research; are relatively recent entrants to the market; and have a more innovative profile. They are also the companies with better control of their production systems, offering higher-quality products at lower unit production costs.
Applying Improved Multiscale Fuzzy Entropy for Feature Extraction of MI-EEG
Directory of Open Access Journals (Sweden)
Ming-ai Li
2017-01-01
Electroencephalography (EEG) is considered the output of the brain and is a bioelectrical signal with multiscale and nonlinear properties. Motor Imagery EEG (MI-EEG) not only has a close correlation with human imagination and movement intention but also contains a large amount of physiological or disease information. As a result, it has been studied extensively in the field of rehabilitation. To correctly interpret and accurately extract the features of MI-EEG signals, many nonlinear dynamic methods based on entropy, such as Approximate Entropy (ApEn), Sample Entropy (SampEn), Fuzzy Entropy (FE), and Permutation Entropy (PE), have been proposed and continuously exploited in recent years. However, these entropy-based methods can only measure the complexity of MI-EEG on a single scale and therefore fail to account for the multiscale property inherent in MI-EEG. To solve this problem, Multiscale Sample Entropy (MSE), Multiscale Permutation Entropy (MPE), and Multiscale Fuzzy Entropy (MFE) have been developed by introducing a scale factor. However, MFE has not been widely used in the analysis of MI-EEG, and the same parameter values are employed when the MFE method is used to calculate the fuzzy entropy values on multiple scales. In fact, each coarse-grained MI-EEG carries the characteristic information of the original signal at a different scale factor, so it is necessary to optimize the MFE parameters to discover more feature information. In this paper, the parameters of MFE are optimized independently for each scale factor, and the improved MFE (IMFE) is applied to the feature extraction of MI-EEG. Based on the event-related desynchronization (ERD)/event-related synchronization (ERS) phenomenon, IMFE features from multiple channels are fused organically to construct the feature vector. Experiments are conducted on a public dataset using a Support Vector Machine (SVM) as the classifier. The experimental results of 10-fold cross-validation show that the proposed method yields
DEFF Research Database (Denmark)
Haller, Michel; Cruickshank, Chynthia; Streicher, Wolfgang
2009-01-01
This paper reviews different methods that have been proposed to characterize thermal stratification in energy storages from a theoretical point of view. Specifically, it focuses on methods that can be used to determine the ability of a storage to promote and maintain stratification during charging, storing and discharging, and to represent this ability with a single numerical value in terms of a stratification efficiency for a given experiment or under given boundary conditions. Existing methods for calculating stratification efficiencies have been applied to hypothetical storage … processes of charging, discharging and storing, and compared with the rate of entropy production caused by mixing calculated for the same experiments. The results show that only one of the applied methods is in qualitative agreement with the rate of entropy production; however, none of the applied methods …
Beef cow efficiency is a century-old debate over what the criteria, the relevant phenotypic traits, and the definition of an "efficient" cow really should be. However, we do know that energy utilization by the cow herd is proportionally large compared with the rest of the sector. This requirement accounts for up to...
Energy Technology Data Exchange (ETDEWEB)
Levine, M.D.; Koomey, J.; Price, L. [Lawrence Berkeley Lab., CA (United States); Geller, H.; Nadel, S. [American Council for an Energy-Efficient Economy, Washington, DC (United States)
1992-03-01
In its August meeting in Geneva, the Energy and Industry Subcommittee (EIS) of the Policy Response Panel of the Intergovernmental Panel on Climate Change (IPCC) identified a series of reports to be produced. One of these reports was to be a synthesis of available information on global electricity end-use efficiency, with emphasis on developing nations. The report will be reviewed by the IPCC and approved prior to the UN Conference on Environment and Development (UNCED), Brazil, June 1992. A draft outline for the report was submitted for review at the November 1991 meeting of the EIS. This outline, which was accepted by the EIS, identified three main topics to be addressed in the report: status of available technologies for increasing electricity end-use efficiency; review of factors currently limiting application of end-use efficiency technologies; and review of policies available to increase electricity end-use efficiency. The United States delegation to the EIS agreed to make arrangements for the writing of the report.
Bonometti, Patrizia
2012-01-01
Purpose: The aim of this contribution is to describe a new complexity-science-based approach for improving safety, quality and efficiency and the way it was implemented by TenarisDalmine. Design/methodology/approach: This methodology is called "a safety-building community". It consists of a safety-behaviour social self-construction…
Biofiltration of exhaust air from animal houses: removal efficiencies and practical experiences
Melse, R.W.; Hol, J.M.G.
2014-01-01
Two wood-chip biofilters (capacity and surface area for biofilter #1: 75,000 m3/hour from a poultry manure dryer, 68 m2; biofilter #2: 100,000 m3/hour from a fattening pig house, 188 m2; media depth: 25 cm) were monitored during 6-10 months. Average ammonia (NH3) and odour removal efficiencies were 42
Directory of Open Access Journals (Sweden)
R. Sagan
2011-11-01
This article considers the different aspects that determine the correct choice of sorting algorithm. It also compares several algorithms needed for computational experiments on a certain class of programs.
Stock Selection for Portfolios Using Expected Utility-Entropy Decision Model
Directory of Open Access Journals (Sweden)
Jiping Yang
2017-09-01
Yang and Qiu proposed, and recently improved, an expected utility-entropy (EU-E) measure of risk and decision model. When segregation holds, Luce et al. derived an expected utility term plus a constant multiple of the Shannon entropy as the representation of risky choices, further demonstrating the reasonableness of the EU-E decision model. In this paper, we apply the EU-E decision model to selecting the set of stocks to be included in portfolios. We first select 7 and 10 stocks from the 30 component stocks of the Dow Jones Industrial Average index, and then derive and compare the efficient portfolios in the mean-variance framework. The conclusions imply that efficient portfolios composed of the 7 (10) stocks selected using the EU-E model with intermediate intervals of the tradeoff coefficient are more efficient than those composed of the sets of stocks selected using the expected utility model. Furthermore, the efficient portfolios of the 7 (10) stocks selected by the EU-E decision model have almost the same efficient frontier as that of the sample of all stocks. This suggests the necessity of incorporating both expected utility and Shannon entropy when making risky decisions, further demonstrating the importance of Shannon entropy as a measure of uncertainty, as well as the applicability of the EU-E model as a decision-making model.
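The combination of an expected utility term with a Shannon entropy term can be sketched as a single scalar measure with a tradeoff coefficient. This is an illustrative form only; the exact normalization and sign conventions of Yang and Qiu's EU-E model differ, and the function name, utility, and λ weighting here are assumptions:

```python
import math

def eu_e(outcomes, probs, utility=lambda x: x, lam=0.5):
    """Illustrative expected-utility-entropy style measure: a
    lambda-weighted combination of expected utility and Shannon
    entropy of the outcome distribution (sketch, not Yang and Qiu's
    exact normalized definition)."""
    eu = sum(p * utility(x) for x, p in zip(outcomes, probs))
    h = -sum(p * math.log(p) for p in probs if p > 0)
    # lam = 1 recovers pure expected utility; lam = 0 penalizes only uncertainty
    return lam * eu - (1 - lam) * h

score = eu_e([1.0, 2.0], [0.5, 0.5], lam=0.5)
```

Intermediate values of the tradeoff coefficient, as in the abstract, balance preference for high expected utility against aversion to outcome uncertainty.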
On statistical properties of Jizba-Arimitsu hybrid entropy
Çankaya, Mehmet Niyazi; Korbel, Jan
2017-06-01
Jizba-Arimitsu entropy (also called hybrid entropy) combines the axiomatics of Rényi and Tsallis entropy. It has many properties in common with them; on the other hand, some aspects, such as the MaxEnt distributions, are different. In this paper, we discuss the statistical properties of hybrid entropy. We define hybrid entropy for continuous distributions and its relation to discrete entropy. Additionally, the definition of hybrid divergence and its connection to the Fisher metric are also presented. Interestingly, the Fisher metric connected to hybrid entropy differs from the corresponding Fisher metrics of Rényi and Tsallis entropy. This motivates us to introduce the average hybrid entropy, which can be understood as an average between Tsallis and Rényi entropy.
Extropy: a complementary dual of entropy
Lad, Frank; Agrò, Gianna
2011-01-01
This article resolves a longstanding question in the axiomatisation of entropy as proposed by Shannon and highlighted in renewed concerns expressed by Jaynes. We introduce a companion measure of a probability distribution that we suggest be called the extropy of the distribution. The entropy and the extropy of an event distribution are identical. However, this identical measure bifurcates into distinct measures for any quantity that is not merely an event indicator. As for entropy, the maximum extropy distribution is also the uniform distribution. We display several theoretical and geometrical properties of the proposed extropy measure, discussing in detail the difference between its assessment of a refined probability distribution and the axiom that characterises the Shannon entropy in this regard. This is what resolves the concerns of Shannon and Jaynes. In a discrete context, the extropy measure is approximated by a variant of Gini's index of heterogeneity when the maximum probability mass is small. This i...
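The complementarity described above is easy to verify numerically: extropy replaces each probability p_i by its complement, J(p) = −Σ_i (1 − p_i) log(1 − p_i), and for an event (binary) distribution it coincides with the Shannon entropy. A minimal sketch (function names are assumptions):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log)."""
    return -sum(q * math.log(q) for q in p if q > 0)

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) log(1 - p_i), the complementary
    dual of entropy proposed by Lad and Agro."""
    return -sum((1 - q) * math.log(1 - q) for q in p if q < 1)

binary = [0.3, 0.7]       # event distribution: H and J coincide
uniform3 = [1 / 3] * 3    # extropy, like entropy, is maximal here
skewed3 = [0.6, 0.3, 0.1]
```

For distributions over more than two outcomes the two measures bifurcate, yet both are maximized by the uniform distribution, as the abstract states.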
Multi-Granulation Entropy and Its Applications
Directory of Open Access Journals (Sweden)
Kai Zeng
2013-06-01
In the view of granular computing, some general uncertainty measures have been proposed through single granulation by generalizing Shannon's entropy. However, in practical settings we often need to describe a target concept concurrently through multiple binary relations. In this paper, we extend the classical information entropy model to a multi-granulation entropy model (MGE) by using a series of general binary relations. Two types of MGE are discussed, and a number of theorems are obtained. It can be concluded that single-granulation entropy is a special instance of MGE. We employ the proposed model to evaluate the significance of attributes for classification, and construct a forward greedy search algorithm for feature selection. The experimental results show that the proposed method presents an effective solution for feature analysis.
Topological entropy for induced hyperspace maps
Energy Technology Data Exchange (ETDEWEB)
Canovas Pena, Jose S. [Departamento de Matematica Aplicada y Estadistica, Universidad Politecnica de Cartagena, 30203 Cartagena, Murcia (Spain)]. E-mail: Jose.canovas@upct.es; Lopez, Gabriel Soler [Departamento de Matematica Aplicada y Estadistica, Universidad Politecnica de Cartagena, 30203 Cartagena, Murcia (Spain)]. E-mail: Gabriel.soler@upct.es
2006-05-15
Let (X,d) be a compact metric space and let f: X -> X be continuous. Let K(X) be the family of compact subsets of X endowed with the Hausdorff metric, and define the extension f-bar: K(X) -> K(X) by f-bar(K) = f(K) for any K in K(X). We prove that the topological entropy of f-bar is greater than or equal to the topological entropy of f, and that this inequality can be strict. On the other hand, we prove that the topological entropy of f is positive if and only if the topological entropy of f-bar is also positive.
On Gravitational Entropy of de Sitter Universe
Directory of Open Access Journals (Sweden)
S. C. Ulhoa
2016-01-01
The paper deals with the calculation of the gravitational entropy in the context of teleparallel gravity for de Sitter space-time. In such a theory it is possible to define gravitational energy and pressure; thus we use those expressions to construct the gravitational entropy. We use the temperature as a function of the cosmological constant and write the first law of thermodynamics, from which we obtain the entropy. In the limit Λ ≪ 1 we find that the entropy is proportional to the volume for a specific choice of temperature, and that ΔS ≥ 0. We also identify a phase transition in de Sitter space-time by analyzing the specific heat.
Multidimensional entropy landscape of quantum criticality
Grube, K.; Zaum, S.; Stockert, O.; Si, Q.; Löhneysen, H. V.
2017-08-01
The third law of thermodynamics states that the entropy of any system in equilibrium has to vanish at absolute zero temperature. At nonzero temperatures, on the other hand, matter is expected to accumulate entropy near a quantum critical point, where it undergoes a continuous transition from one ground state to another. Here, we determine, based on general thermodynamic principles, the spatial-dimensional profile of the entropy S near a quantum critical point and its steepest descent in the corresponding multidimensional stress space. We demonstrate this approach for the canonical quantum critical compound CeCu6-xAux near its onset of antiferromagnetic order. We are able to link the directional stress dependence of S to the previously determined geometry of quantum critical fluctuations. Our demonstration of the multidimensional entropy landscape provides the foundation to understand how quantum criticality nucleates novel phases such as high-temperature superconductivity.
Holographic equipartition and the maximization of entropy
Krishna, P. B.; Mathew, Titus K.
2017-09-01
The accelerated expansion of the Universe can be interpreted as a tendency to satisfy holographic equipartition. It can be expressed by a simple law, ΔV = Δt (Nsurf − ε Nbulk), where V is the Hubble volume in Planck units, t is the cosmic time in Planck units, and Nsurf and Nbulk are the numbers of degrees of freedom on the horizon and in the bulk of the Universe, respectively. We show that this holographic equipartition law effectively implies the maximization of entropy. In the cosmological context, a system that obeys the holographic equipartition law behaves as an ordinary macroscopic system that proceeds to an equilibrium state of maximum entropy. We consider the standard ΛCDM model of the Universe and show that it is consistent with the holographic equipartition law. Analyzing the entropy evolution, we find that it also proceeds to an equilibrium state of maximum entropy.
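For reference, the law quoted in the abstract is usually written in Padmanabhan's general form; the following is a standard reconstruction (not taken verbatim from this paper), which reduces to the abstract's ΔV = Δt (Nsurf − ε Nbulk) when the Planck length is set to L_P = 1:

```latex
% Holographic equipartition law (Padmanabhan's form); epsilon = +1 in the
% accelerating phase.  In Planck units (L_P = 1) this reduces to the
% \Delta V = \Delta t\,(N_{\rm surf} - \epsilon N_{\rm bulk}) of the abstract.
\frac{dV}{dt} = L_P^2 \left( N_{\mathrm{surf}} - \epsilon\, N_{\mathrm{bulk}} \right),
\qquad
N_{\mathrm{surf}} = \frac{4\pi}{L_P^2 H^2},
\qquad
N_{\mathrm{bulk}} = -\frac{2 E_{\mathrm{Komar}}}{k_B T},
\qquad
T = \frac{H}{2\pi}.
```

Here H is the Hubble parameter and E_Komar the Komar energy contained in the Hubble volume.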
Independent Component Analysis by Entropy Maximization (INFOMAX)
National Research Council Canada - National Science Library
Garvey, Jennie H
2007-01-01
... (BSS). The Infomax method separates unknown source signals from a number of signal mixtures by maximizing the entropy of a transformed set of signal mixtures and is accomplished by performing gradient ascent in MATLAB...
Entropy Generation Analysis of Wildfire Propagation
Directory of Open Access Journals (Sweden)
Elisa Guelpa
2017-08-01
Full Text Available Entropy generation is commonly applied to describe the evolution of irreversible processes, such as heat transfer and turbulence, both of which are dominant phenomena in fire propagation. In this paper, entropy generation analysis is applied to a grassland fire event, with the aim of finding possible links between entropy generation and propagation directions. The ultimate goal of such analysis is to help overcome possible limitations of the models usually applied to the prediction of wildfire propagation. These models are based on the superimposition of the effects due to wind and slope, which has proven to fail in various cases. The analysis presented here shows that entropy generation allows a detailed analysis of the landscape propagation of a fire and can thus be applied to its quantitative description.
Entropy Approximation in Lossy Source Coding Problem
Directory of Open Access Journals (Sweden)
Marek Śmieja
2015-05-01
Full Text Available In this paper, we investigate a lossy source coding problem where an upper limit on the permitted distortion is defined for every dataset element. It can be seen as an alternative to rate-distortion theory, where a bound on the allowed average error is specified instead. In order to find the entropy, which gives the statistical length of a source code compatible with a fixed distortion bound, a corresponding optimization problem has to be solved. First, we show how to simplify this general optimization by discarding coding partitions that are irrelevant for the entropy calculation. In our main result, we present a fast and easily implementable greedy algorithm that approximates the entropy to within an additive error of log2 e. The proof is based on the minimum entropy set cover problem, for which a similar bound was obtained.
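The abstract's proof rests on minimum entropy set cover, for which the classical greedy algorithm carries the same additive log2 e guarantee. Below is a generic greedy sketch of that underlying problem (the paper's own algorithm works on coding partitions; the set instance here is an illustrative stand-in, and all names are assumptions):

```python
import math
from collections import Counter

def greedy_min_entropy_cover(elements, sets):
    """Greedy minimum-entropy set cover: repeatedly assign every still-
    uncovered element of the set that covers the most of them.  The entropy
    of the greedy assignment is known to exceed the optimum by at most
    log2(e) ~ 1.443 bits (the additive bound cited in the abstract)."""
    uncovered = set(elements)
    assignment = {}
    while uncovered:
        best = max(sets, key=lambda s: len(uncovered & sets[s]))
        for e in uncovered & sets[best]:
            assignment[e] = best
        uncovered -= sets[best]
    return assignment

def assignment_entropy(assignment):
    """Shannon entropy (bits) of the distribution of assigned set labels."""
    n = len(assignment)
    counts = Counter(assignment.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# toy instance: elements 0..5; each set plays the role of a "distortion
# ball" of codewords an element may be encoded into
sets = {'A': {0, 1, 2, 3}, 'B': {3, 4}, 'C': {4, 5}}
asg = greedy_min_entropy_cover(range(6), sets)
```

On this instance the greedy pass assigns elements 0-3 to 'A' and 4-5 to 'C', giving an assignment entropy of about 0.918 bits.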
Group entropies, correlation laws, and zeta functions
Tempesta, Piergiulio
2011-08-01
The notion of group entropy is proposed. It enables the unification and generalization of many different definitions of entropy known in the literature, such as those of Boltzmann-Gibbs, Tsallis, Abe, and Kaniadakis. Other entropic functionals are introduced, related to nontrivial correlation laws characterizing universality classes of systems out of equilibrium when the dynamics is weakly chaotic. The associated thermostatistics are discussed. The mathematical structure underlying our construction is that of formal group theory, which provides the general structure of the correlations among particles and dictates the associated entropic functionals. As an example of application, the role of group entropies in information theory is illustrated and generalizations of the Kullback-Leibler divergence are proposed. A new connection between statistical mechanics and zeta functions is established. In particular, the Tsallis entropy is related to the classical Riemann zeta function.
New entropy formula for Kerr black holes
Gonzalez, Hernan; Grumiller, Daniel; Merbis, Wout; Wutte, Raphaela
2017-01-01
We introduce a new entropy formula for Kerr black holes inspired by recent results for three-dimensional black holes and cosmologies with soft Heisenberg hair. We show that Kerr-Taub-NUT black holes also obey the same formula.
Generalized Entropies and the Similarity of Texts
Altmann, Eduardo G; Gerlach, Martin
2016-01-01
We show how generalized Gibbs-Shannon entropies can provide new insights on the statistical properties of texts. The universal distribution of word frequencies (Zipf's law) implies that the generalized entropies, computed at the word level, are dominated by words in a specific range of frequencies. Here we show that this is the case not only for the generalized entropies but also for the generalized (Jensen-Shannon) divergences, used to compute the similarity between different texts. This finding allows us to identify the contribution of specific words (and word frequencies) for the different generalized entropies and also to estimate the size of the databases needed to obtain a reliable estimation of the divergences. We test our results in large databases of books (from the Google n-gram database) and scientific papers (indexed by Web of Science).
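The point that different entropy orders are dominated by different frequency ranges can be illustrated with a Zipfian word-frequency distribution and the Rényi family, one common parametrization of generalized entropies (the paper may use a different family; the sketch and its parameters are illustrative assumptions):

```python
import math

def zipf_pmf(n_types, exponent=1.0):
    """Zipf's law for word frequencies: p(rank r) proportional to r^(-exponent)."""
    w = [r ** -exponent for r in range(1, n_types + 1)]
    z = sum(w)
    return [x / z for x in w]

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (bits); alpha = 1 is the Shannon limit.
    alpha > 1 is dominated by frequent words, alpha < 1 by the rare tail."""
    if abs(alpha - 1.0) < 1e-9:
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return math.log2(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

p = zipf_pmf(10_000)
# entropy is non-increasing in alpha, and the spread between orders shows
# how strongly each is dominated by a different frequency range
for a in (0.5, 1.0, 2.0):
    print(a, renyi_entropy(p, a))
```

Because Zipfian distributions are heavy-tailed, the low-order entropies (dominated by rare words) converge slowly with vocabulary size, which is the database-size issue the abstract raises for divergence estimation.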
Energy Technology Data Exchange (ETDEWEB)
Price, Lynn; Galitsky, Christina; Kramer, Klaas Jan
2008-02-02
Target-setting agreements, also known as voluntary or negotiated agreements, have been used by a number of governments as a mechanism for promoting energy efficiency within the industrial sector. A recent survey of such target-setting agreement programs identified 23 energy efficiency or GHG emissions reduction voluntary agreement programs in 18 countries. International best practice related to target-setting agreement programs calls for establishment of a coordinated set of policies that provide strong economic incentives as well as technical and financial support to participating industries. The key elements of a target-setting program are the target-setting process; identification of energy-saving technologies and measures using energy-efficiency guidebooks and benchmarking, as well as by conducting energy-efficiency audits; development of an energy-savings action plan; development and implementation of energy management protocols; development of incentives and supporting policies; monitoring progress toward targets; and program evaluation. This report first provides a description of three key target-setting agreement programs and then describes international experience with the key program elements that comprise such programs, using information from the three key target-setting programs as well as from other international programs related to industrial energy efficiency or GHG emissions reductions.
Inverting Monotonic Nonlinearities by Entropy Maximization.
Directory of Open Access Journals (Sweden)
Jordi Solé-Casals
Full Text Available This paper proposes a new method for blind inversion of a monotonic nonlinear map applied to a sum of random variables. Such mixtures of random variables are found, for example, in source separation and Wiener system inversion problems. The importance of the proposed method lies in the fact that it permits decoupling the estimation of the nonlinear part (nonlinear compensation) from the estimation of the linear one (source separation matrix or deconvolution filter), which can then be solved by any convenient linear algorithm. Our new nonlinear compensation algorithm, the MaxEnt algorithm, generalizes the idea of Gaussianization of the observation by maximizing its entropy instead. We developed two versions of our algorithm, based on either a polynomial or a neural network parameterization of the nonlinear function. We provide a sufficient condition on the nonlinear function and the probability distribution that guarantees that the MaxEnt method succeeds in compensating the distortion. Through an extensive set of simulations, MaxEnt is compared with existing algorithms for blind approximation of nonlinear maps. Experiments show that MaxEnt is able to successfully compensate monotonic distortions, outperforming other methods in terms of the obtained signal-to-noise ratio in many important cases, for example when the number of variables in a mixture is small. Besides its ability to compensate nonlinearities, MaxEnt is very robust, i.e., it shows small variability in the results.
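The Gaussianization idea that MaxEnt generalizes can be shown in a few lines: a strictly monotonic distortion preserves ranks, so mapping the observations through their own empirical CDF undoes the distortion up to a monotone reparametrization. This is a toy stand-in for the paper's algorithm (which maximizes entropy via a parametrized nonlinearity); the uniform source and cubic distortion are illustrative assumptions.

```python
import random

def empirical_cdf_transform(xs):
    """Map samples through their empirical CDF.  For any strictly monotonic
    distortion f, rank(f(s)) == rank(s), so this removes the distortion up
    to a monotone reparametrization -- the 'uniformization' behind
    Gaussianization-style blind inversion."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    u = [0.0] * len(xs)
    for rank, i in enumerate(order):
        u[i] = (rank + 0.5) / len(xs)
    return u

random.seed(0)
s = [random.random() for _ in range(2000)]   # unobserved uniform source
x = [v ** 3 for v in s]                      # unknown monotonic distortion f(s) = s^3
u = empirical_cdf_transform(x)               # blind estimate of the source

# since the source is uniform on [0, 1], the recovered samples should track
# the true source values closely (up to empirical-CDF fluctuations)
err = max(abs(a - b) for a, b in zip(u, s))
```

For a non-uniform source one would compose this with the inverse CDF of the assumed source distribution; MaxEnt instead adjusts a polynomial or neural-network nonlinearity to maximize the output entropy directly.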
Directory of Open Access Journals (Sweden)
Zhuofei Xu
2017-01-01
Full Text Available Empirical mode decomposition (EMD) is a self-adaptive analysis method for nonlinear and nonstationary signals. It has been widely applied to machinery fault diagnosis and structural damage detection. A novel feature, the maximum symbolic entropy of the intrinsic mode functions obtained by EMD, is proposed in this paper to enhance the recognition ability of EMD. First, a signal is decomposed into a collection of intrinsic mode functions (IMFs) based on the local characteristic time scale of the signal, and the IMFs are then transformed into a series of symbolic sequences with different parameters. Second, it is found that although the entropies of the symbolic IMFs differ considerably, there is always a maximum value attained for a certain symbolic IMF. Third, this maximum symbolic entropy is taken as the feature describing the IMFs of a signal. Finally, the proposed feature is applied to the fault diagnosis of rolling bearings, and the maximum symbolic entropy is compared with other standard time-domain features in a comparison experiment. Although the maximum symbolic entropy is only a time-domain feature, it reveals the characteristic information of the signal accurately. It can also be used in other fields related to the EMD method.
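A minimal sketch of the feature itself, assuming the IMFs have already been computed (the EMD sifting step is omitted; any EMD library could supply them). Equal-width amplitude binning is one simple symbolization scheme; the paper's exact symbolization parameters are not specified in the abstract, so the choices below are assumptions.

```python
import math
import random
from collections import Counter

def symbolize(x, n_symbols):
    """Map a signal onto symbols 0..n_symbols-1 by equal-width amplitude
    bins (one simple symbolization; the paper sweeps such parameters)."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / n_symbols or 1.0   # guard against a flat signal
    return [min(int((v - lo) / width), n_symbols - 1) for v in x]

def shannon_entropy(symbols):
    """Shannon entropy (bits) of a symbol sequence's histogram."""
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in Counter(symbols).values())

def max_symbolic_entropy(imfs, n_symbols=8):
    """Proposed feature: the largest symbolic entropy over a signal's IMFs."""
    return max(shannon_entropy(symbolize(imf, n_symbols)) for imf in imfs)

# toy 'IMFs': a nearly periodic component and a noisier one
random.seed(1)
imf1 = [math.sin(0.3 * t) for t in range(500)]
imf2 = [random.gauss(0.0, 1.0) for _ in range(500)]
feature = max_symbolic_entropy([imf1, imf2])
```

With 8 symbols the feature is bounded by log2(8) = 3 bits; in the fault-diagnosis setting the feature vector would be built from such values across recorded vibration signals.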
Sui, Ning; Li, Min; He, Ping
2014-12-01
In this work, we investigate the statistical computation of the Boltzmann entropy of statistical samples. For this purpose, we use both histogram and kernel function methods to estimate the probability density function of the samples. We find that, due to coarse-graining, the entropy is a monotonically increasing function of the bin width for histogram estimation, or of the bandwidth for kernel estimation, which makes it difficult to select an optimal bin width or bandwidth for computing the entropy. Fortunately, we notice that there exists a minimum of the first derivative of the entropy for both histogram and kernel estimation, and this minimum asymptotically points to the optimal bin width or bandwidth. We have verified these findings in a large number of numerical experiments. Hence, we suggest that the minimum of the first derivative of the entropy be used as a selector for the optimal bin width or bandwidth of density estimation. Moreover, the optimal bandwidth selected in this way is purely data-based, independent of the unknown underlying probability density distribution, which makes it superior to existing estimators. Our results are not restricted to the one-dimensional case, but can also be extended to multivariate cases. It should be emphasized, however, that we do not provide a rigorous mathematical proof of these findings, and we leave these issues to those who are interested in them.
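The histogram branch of this selector can be sketched as follows: compute the plug-in entropy over a grid of bin widths, take finite differences as the first derivative, and pick the width where that derivative is smallest. The Gaussian test data and the width grid are illustrative assumptions.

```python
import math
import random
from collections import Counter

def hist_entropy(data, width):
    """Plug-in (differential) entropy of a histogram density estimate:
    H ~ -sum p_i * log(p_i / width) = -sum p_i * log(p_i) + log(width)."""
    n = len(data)
    counts = Counter(int(x // width) for x in data)
    return -sum(c / n * math.log(c / n) for c in counts.values()) + math.log(width)

random.seed(2)
data = [random.gauss(0.0, 1.0) for _ in range(5000)]

widths = [0.02 * k for k in range(1, 60)]            # candidate bin widths
H = [hist_entropy(data, w) for w in widths]
dH = [H[i + 1] - H[i] for i in range(len(H) - 1)]    # first derivative (finite diff)

# the paper's selector: the bin width at which the first derivative of the
# entropy is minimal (entropy itself keeps growing with the bin width)
w_opt = widths[dH.index(min(dH))]
```

The entropy curve rises with the bin width, as the abstract describes, while its first derivative dips where the estimate has converged before coarse-graining bias takes over; that dip supplies the data-based bandwidth choice.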
Silveira, Vladímir de Aquino; Souza, Givago da Silva; Gomes, Bruno Duarte; Rodrigues, Anderson Raiol; Silveira, Luiz Carlos de Lima
2014-01-01
We used psychometric functions to estimate the joint entropy for space discrimination and spatial frequency discrimination. Space discrimination was taken as discrimination of spatial extent. Seven subjects were tested. Gábor functions comprising unidimensional sinusoidal gratings (0.4, 2, and 10 cpd) and bidimensional Gaussian envelopes (1°) were used as reference stimuli. The experiment comprised the comparison between reference and test stimuli that differed in the grating's spatial frequency or the envelope's standard deviation. We tested 21 different envelope standard deviations around the reference standard deviation to study spatial extent discrimination, and 19 different grating spatial frequencies around the reference spatial frequency to study spatial frequency discrimination. Two series of psychometric functions were obtained for 2%, 5%, 10%, and 100% stimulus contrast. The psychometric function data points for spatial extent discrimination or spatial frequency discrimination were fitted with Gaussian functions using the least-squares method, and the spatial extent and spatial frequency entropies were estimated from the standard deviations of these Gaussian functions. Joint entropy was then obtained by multiplying the square root of the spatial extent entropy times the spatial frequency entropy. We compared our results to the theoretical minimum for unidimensional Gábor functions, 1/4π or 0.0796. At low and intermediate spatial frequencies and high contrasts, joint entropy reached levels below the theoretical minimum, suggesting nonlinear interactions between two or more visual mechanisms. We concluded that nonlinear interactions of visual pathways, such as the M and P pathways, could explain joint entropy values below the theoretical minimum at low and intermediate spatial frequencies and high contrasts. These nonlinear interactions might be at work at intermediate and high contrasts at all spatial frequencies once there was a substantial decrease in joint
Distribution entropy analysis of epileptic EEG signals.
Li, Peng; Yan, Chang; Karmakar, Chandan; Liu, Changchun
2015-01-01
It is an open-ended challenge to accurately detect epileptic seizures through electroencephalogram (EEG) signals. Recently published studies have made elaborate attempts to distinguish between normal and epileptic EEG signals by advanced nonlinear entropy methods, such as approximate entropy, sample entropy, fuzzy entropy, and permutation entropy. Most recently, a novel distribution entropy (DistEn) has been reported to have superior performance compared with the conventional entropy methods, especially for short-length data. In the present study, we thus aimed to show the potential of DistEn in the analysis of epileptic EEG signals. The publicly accessible Bonn database, which consists of normal, interictal, and ictal EEG signals, was used in this study. Three measurement protocols were set to better understand the performance of DistEn: i) calculate the DistEn of a specific EEG signal using the full recording; ii) calculate the DistEn by averaging the results for all its possible non-overlapped 5-second segments; and iii) calculate it by averaging the DistEn values for all possible non-overlapped segments of 1-second length. Results for all three protocols indicated a statistically significantly increased DistEn for the ictal class compared with both the normal and interictal classes. Besides, the results obtained under the third protocol, which used only very short segments (1 s) of EEG recordings, showed a significantly (p entropy algorithm. The capability of discriminating between normal and interictal EEG signals is of great clinical relevance, since it may provide helpful tools for the detection of seizure onset. Therefore, our study suggests that DistEn analysis of EEG signals is very promising for clinical and even portable EEG monitoring.
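For readers unfamiliar with DistEn, the published recipe is compact: embed the signal, collect all pairwise Chebyshev distances between embedding vectors, histogram them, and normalize the Shannon entropy of that histogram. The sketch below follows that recipe; the embedding dimension, bin count, and toy signals are illustrative choices, not the study's exact settings.

```python
import math
import random

def dist_en(x, m=2, bins=64):
    """Distribution entropy (DistEn): embed in dimension m, take all
    pairwise Chebyshev distances, histogram them into `bins` bins, and
    return the Shannon entropy of the histogram normalized by log2(bins)."""
    n = len(x)
    vecs = [x[i:i + m] for i in range(n - m + 1)]
    d = []
    for i in range(len(vecs)):
        for j in range(i + 1, len(vecs)):
            d.append(max(abs(a - b) for a, b in zip(vecs[i], vecs[j])))
    lo, hi = min(d), max(d)
    width = (hi - lo) / bins or 1.0          # guard against identical vectors
    counts = [0] * bins
    for v in d:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    total = len(d)
    h = -sum(c / total * math.log2(c / total) for c in counts if c)
    return h / math.log2(bins)               # normalized to [0, 1]

# toy signals standing in for EEG segments
random.seed(3)
noise = [random.gauss(0.0, 1.0) for _ in range(300)]
sine = [math.sin(0.2 * t) for t in range(300)]
```

Because DistEn uses the full distance distribution rather than a threshold count, it remains well defined for very short segments, which is the property the 1-second protocol exploits.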
A Remark on Topological Sequence Entropy
Wu, Xinxing
2017-06-01
Let h∞(T) be the supremum of all topological sequence entropies of a dynamical system (X,T). This paper obtains the iteration invariance and commutativity of h∞(T) and proves that if T is a multisensitive transformation defined on a locally connected space, then h∞(T) = +∞. As an application, it is shown that a Cournot map is Li-Yorke chaotic if and only if its topological sequence entropy relative to a suitable sequence is positive.
Entropy, geometry, and the quantum potential
Carroll, Robert
2005-01-01
We sketch and emphasize the automatic emergence of a quantum potential Q in e.g. classical WDW type equations upon inserting a (Bohmian) complex wave function. The interpretation of Q in terms of momentum fluctuations via Fisher information and entropy ideas is discussed along with the essentially forced role of the amplitude squared as a probability density. We also review the constructions of Padmanabhan connecting entropy and the Einstein equations.
Non-Gaussian effects on quantum entropies
Santos, A. P.; Silva, R.; Alcaniz, J. S.; Anselmo, D. H. A. L.
2012-03-01
A deduction of generalized quantum entropies within the non-Gaussian frameworks, Tsallis and Kaniadakis, is derived using a generalized combinatorial method and the so-called q and κ calculus. In agreement with previous results, we also show that for the Tsallis formulation the q-quantum entropy is well-defined for values of the nonextensive parameter q lying in the interval [0,2].
IEA-ECBCS Annex 51: energy efficient communities. Experience from Denmark
DEFF Research Database (Denmark)
Dalla Rosa, Alessandro; Svendsen, Svend
2011-01-01
The paper describes the Danish contribution to the IEA-ECBCS Annex 51: “energy efficient communities”. We present three case studies, two from Annex subtask A (state-of-the-art review) and one from subtask B (ongoing projects). The first case study is “Samsoe: a renewable energy island...... (Building Regulation 2008). The project partners envisaged the implementation of selected key energy-supply technologies and building components and carried out an evaluation of user preferences to give suggestions to designers and constructors of low-energy houses. The third case study (Subtask B) is: “low...... the instruments that are needed to prepare local energy and climate change strategies and supports the planning and implementation of energy-efficient communities....
Szygula, Michal; Wojciechowski, Boguslaw; Sieron, Aleksander; Adamek, Mariusz; Cebula, Wojciech; Biniszkiewicz, Tomasz; Zieleznik, Witold; Kawczyk-Krupka, Aleksandra
2001-01-01
The efficiency of autofluorescence diagnosis within the urinary bladder was analyzed in this study. We examined two groups of patients: the first consisting of 22 patients suspected of having bladder cancer, and the second consisting of 45 patients who had undergone transurethral electroresection due to urinary bladder neoplasms. Our goal was to detect cancerous tissue invisible in white-light examination. In the first group, sensitivity was 100 percent and specificity was 69.23 percent; in the second group, sensitivity was 96 percent and specificity was 80 percent. We also report the treatment efficiency of PDT in 12 patients with superficial bladder cancer. In our procedure, two hours after instillation of the bladder with ALA solution, the lesion was irradiated by laser light. In 9 of the 12 treated patients, regression of the bladder tumor was obtained, while in 3 cases progression of the neoplastic process was observed.
Houtz, Nathan; Frueh, Carolin
2015-01-01
Knowing a spacecraft’s orientation is crucial for many vital functions. Attitude is often determined using a star tracker. Star tracker attitude determination must be fast and efficient given the limited on-board computing resources. To determine its attitude, a star tracker must take an image of its environment, locate the stars in that image, recognize a pattern among those stars, match it with patterns in a catalog, and determine the rotation matrix that relates the spacecraft to the inert...
Kimbrough, Charles W; McMasters, Kelly M; Canary, Jeff; Jackson, Lisa; Farah, Ian; Boswell, Mark V; Kim, Daniel; Scoggins, Charles R
2015-07-01
Suboptimal operating room (OR) efficiency is a universal complaint among surgeons. Nonetheless, maximizing efficiency is critical to institutional success. Here, we report improvement achieved through low-cost, low-technology measures instituted within a tertiary-care academic medical center/Level I trauma center. Improvements in preadmission testing and OR scheduling, including appointing a senior nurse anesthetist to help direct OR use, were instituted in March 2012. A retrospective review of prospectively maintained OR case data was performed to evaluate the periods before and after program implementation, as well as to assess trends over time. Operating room performance metrics were compared using Mann-Whitney and chi-squared tests. Changes over time were analyzed using linear regression. Data including all surgical cases were available for a 36-month period: 10 months (6,581 cases) before program implementation and 26 months afterward (17,574 cases). Dramatic improvement was seen in first-case on-time starts, which increased from 39.3% to 83.8% (p efficiency and case volume. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Dai, Hongliang; Chen, Wenliang; Dai, Zheqin; Li, Xiang; Lu, Xiwu
2017-08-01
A systematic calibration and validation procedure for the complex mechanistic modeling of the anaerobic-anoxic/nitrifying (A2N) two-sludge system is needed. An efficient method based on phase experiments, sensitivity analysis, and a genetic algorithm is proposed here for model calibration. Phase experiments (anaerobic phosphorus release, aerobic nitrification, and anoxic denitrifying phosphate accumulation) in an A2N sequencing batch reactor (SBR) were performed to reflect the process conditions accurately and improve the model calibration efficiency. The calibrated model was further validated using 30 batch experiments and 3-month dynamic continuous-flow (CF) experiments for the A2N-SBR and CF-A2N processes, respectively. Several statistical criteria were used to evaluate the accuracy of the model predictions, including the average relative deviation (ARD), mean absolute error (MAE), root mean square error (RMSE), and Janus coefficient. Visual comparisons and statistical analyses indicated that the calibrated model could provide accurate predictions for effluent chemical oxygen demand (COD), ammonia nitrogen (NH4+-N), total nitrogen (TN), and total phosphorus (TP), with only one iteration.
Visual experience is not necessary for efficient survey spatial cognition: evidence from blindness.
Tinti, Carla; Adenzato, Mauro; Tamietto, Marco; Cornoldi, Cesare
2006-07-01
This study investigated whether the lack of visual experience affects the ability to create spatial inferential representations of the survey type. We compared the performance of persons with congenital blindness and that of blindfolded sighted persons on four survey representation-based tasks (Experiment 1). Results showed that persons with blindness performed better than blindfolded sighted controls. We repeated the same tests introducing a third group of persons with late blindness (Experiment 2). This last group performed better than blindfolded sighted participants, whereas differences between participants with late and congenital blindness were nonsignificant. The present findings are compatible with results of other studies, which found that when visual perception is lacking, skill in gathering environmental spatial information provided by nonvisual modalities may contribute to a proper spatial encoding. It is concluded that, although it cannot be asserted that total lack of visual experience incurs no cost, our findings are further evidence that visual experience is not a necessary condition for the development of spatial inferential complex representations.
Morowitz, Harold J.
1996-10-01
Harold Morowitz has long been highly regarded both as an eminent scientist and as an accomplished science writer. The essays in The Wine of Life, his first collection, were hailed by C.P. Snow as "some of the wisest, wittiest and best informed I have ever read," and Carl Sagan called them "a delight to read." In later volumes he established a reputation for a wide-ranging intellect, an ability to see unexpected connections and draw striking parallels, and a talent for communicating scientific ideas with optimism and wit. With Entropy and the Magic Flute, Morowitz once again offers an appealing mix of brief reflections on everything from litmus paper to the hippopotamus to the sociology of Palo Alto coffee shops. Many of these pieces are appreciations of scientists that Morowitz holds in high regard, while others focus on health issues, such as America's obsession with cheese toppings. There is also a fascinating piece on the American Type Culture Collection, a zoo or warehouse for microbes that houses some 11,800 strains of bacteria, and over 3,000 specimens of protozoa, algae, plasmids, and oncogenes. Here then are over forty light, graceful essays in which one of our wisest experimental biologists comments on issues of science, technology, society, philosophy, and the arts.
Entropy as a collective variable
Parrinello, Michele
Sampling complex free energy surfaces that exhibit long-lived metastable states separated by kinetic bottlenecks is one of the most pressing issues in the atomistic simulation of matter. Not surprisingly, many solutions to this problem have been suggested. Many of them are based on the identification of appropriate collective variables that span the manifold of the slowly varying modes of the system. While much effort has been put into devising, and even constructing on the fly, appropriate collective variables, there is still a cogent need for simple, generic, physically transparent, and yet effective collective variables. Motivated by the physical observation that in many cases transitions between one metastable state and another result from a trade-off between enthalpy and entropy, we introduce collective variables that are able to represent these two physical properties in a simple way. We use these variables in the context of the recently introduced variationally enhanced sampling and apply them with success to the simulation of crystallization from the liquid and to conformational transitions in proteins. Department of Chemistry and Applied Biosciences, ETH Zurich, and Facolta' di Informatica, Istituto di Scienze Computazionali, Universita' della Svizzera Italiana, Via G. Buffi 13, 6900 Lugano, Switzerland.
Mechanical Entropy and Its Implications
Directory of Open Access Journals (Sweden)
Pharis E. Williams
2001-06-01
Full Text Available It is shown that the classical laws of thermodynamics require that mechanical systems must exhibit energy that becomes unavailable to do useful work; in thermodynamics, this type of energy is called entropy. It is further shown that these laws require two metrical manifolds, equations of motion, field equations, and Weyl's quantum principles. Weyl's quantum principle requires quantization of the electrostatic potential of a particle and that this potential be non-singular. The interactions of particles through these non-singular electrostatic potentials are analyzed in the low-velocity limit and in the relativistic limit. It is shown that writing the two-particle interactions for unlike particles allows an examination of two limiting cases: large and small separations. These limits are shown to yield the limiting motions in which all motion is about the center of mass or all motion is of the center of mass. The first limit leads to the standard Dirac equation. The second limit is shown to yield equations of which the electroweak theory is a subset. An extension of the gauge principle into a five-dimensional manifold, followed by restricting the generality of the five-dimensional manifold through the conservation principle, shows that the four-dimensional hypersurface embedded within the 5-D manifold is required to obey Einstein's field equations. The 5-D gravitational quantum equations of the solar system are presented.
On variational expressions for quantum relative entropies
Berta, Mario; Fawzi, Omar; Tomamichel, Marco
2017-12-01
Distance measures between quantum states like the trace distance and the fidelity can naturally be defined by optimizing a classical distance measure over all measurement statistics that can be obtained from the respective quantum states. In contrast, Petz showed that the measured relative entropy, defined as a maximization of the Kullback-Leibler divergence over projective measurement statistics, is strictly smaller than Umegaki's quantum relative entropy whenever the states do not commute. We extend this result in two ways. First, we show that Petz' conclusion remains true if we allow general positive operator-valued measures. Second, we extend the result to Rényi relative entropies and show that for non-commuting states the sandwiched Rényi relative entropy is strictly larger than the measured Rényi relative entropy for α ∈ (1/2, ∞) and strictly smaller for α ∈ [0, 1/2). The latter statement provides counterexamples for the data processing inequality of the sandwiched Rényi relative entropy for α quantum conditional mutual information are superadditive.
A review of entropy generation in microchannels
Directory of Open Access Journals (Sweden)
Mohamed M Awad
2015-12-01
Full Text Available In this study, a critical review of the thermodynamic optimization of microchannels based on entropy generation analysis is presented. The use of entropy generation analysis as an evaluation parameter for microchannels has been reported in many studies in the literature. In these studies, different working fluids such as nanofluids, air, water, engine oil, aniline, ethylene glycol, and non-Newtonian fluids have been used. For the case of nanofluids, various kinds of nanoparticles, such as Al2O3 and Cu, and various base fluids, such as water and ethylene glycol, have been used. Furthermore, studies on the thermodynamic optimization of microchannels based on entropy generation analysis are summarized in a table. At the end, recommendations for future work on the thermodynamic optimization of microchannels are given. As a result, this article can not only serve as a starting point for researchers interested in entropy generation in microchannels, but also includes recommendations for future studies on the subject.
Trajectories entropy in dynamical graphs with memory
Directory of Open Access Journals (Sweden)
Francesco Caravelli
2016-04-01
Full Text Available In this paper we investigate the application of non-local graph entropy to evolving and dynamical graphs. The measure is based upon the notion of Markov diffusion on a graph and relies on the entropy applied to trajectories originating at a specific node. In particular, we study the model of reinforcement-decay graph dynamics, which leads to scale-free graphs. We find that the node entropy characterizes the structure of the network in the two-parameter phase space describing the dynamical evolution of the weighted graph. We then apply an adapted version of the entropy measure to purely memristive circuits. We provide evidence that, while in the case of DC voltage the entropy based on the forward probability is enough to characterize the graph properties, in the case of AC voltage generators one needs to consider both forward- and backward-based transition probabilities. We also provide evidence that the entropy highlights the self-organizing properties of memristive circuits, which reorganize themselves to satisfy the symmetries of the underlying graph.
Maximum Entropy Approaches to Living Neural Networks
Directory of Open Access Journals (Sweden)
John M. Beggs
2010-01-01
Full Text Available Understanding how ensembles of neurons collectively interact will be a key step in developing a mechanistic theory of cognitive processes. Recent progress in multineuron recording and analysis techniques has generated tremendous excitement over the physiology of living neural networks. One of the key developments driving this interest is a new class of models based on the principle of maximum entropy. Maximum entropy models have been reported to account for spatial correlation structure in ensembles of neurons across several different types of data. Importantly, these models require only information about the firing rates of individual neurons and their pairwise correlations. If this approach is generally applicable, it would drastically simplify the problem of understanding how neural networks behave. Given the interest in this method, several groups have now worked to extend maximum entropy models to account for temporal correlations. Here, we review how maximum entropy models have been applied to neuronal ensemble data to account for spatial and temporal correlations. We also discuss criticisms of the maximum entropy approach that argue that it is not generally applicable to larger ensembles of neurons. We conclude that future maximum entropy models will need to address three issues: temporal correlations, higher-order correlations, and larger ensemble sizes. Finally, we provide a brief list of topics for future research.
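The two statistics such pairwise maximum-entropy models take as input can be read directly off a binary spike raster; a sketch with a synthetic raster (the rates and dimensions are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic spike raster: 3 neurons x 1000 time bins, 1 = spike in that bin
spikes = (rng.random((3, 1000)) < np.array([[0.2], [0.5], [0.1]])).astype(int)

# the only inputs a pairwise maximum-entropy model requires:
rates = spikes.mean(axis=1)                        # <s_i>: per-neuron firing rates
pairwise = (spikes @ spikes.T) / spikes.shape[1]   # <s_i s_j>: pairwise co-activation

# an independent (rates-only) maximum-entropy model predicts joint patterns as
# products of rates; deviations of the observed frequency flag correlation structure
p_independent = rates.prod()
p_observed = (spikes.prod(axis=0) == 1).mean()
print(rates, p_independent, p_observed)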
Conspiratorial Beliefs Observed through Entropy Principles
Directory of Open Access Journals (Sweden)
Nataša Golo
2015-08-01
Full Text Available We propose a novel approach framed in terms of information theory and entropy to tackle the issue of the propagation of conspiracy theories. We represent the initial report of an event (such as the 9/11 terrorist attack) as a series of strings of information, each string classified by a two-state variable Ei = ±1, i = 1, …, N. If the values of the Ei are set to −1 for all strings, a state of minimum entropy is achieved. Comments on the report, focusing repeatedly on several strings Ek, may flip their meaning (from −1 to +1). The representation of the event turns fuzzy, with an increased entropy value. Beyond some threshold value of entropy, chosen for simplicity to be its maximum value, meaning N/2 variables with Ei = 1, the chance is created that a conspiracy theory might be initiated/propagated. Therefore, the evolution of the associated entropy is a way to measure the degree of penetration of a conspiracy theory. Our general framework relies on online content made voluntarily available by crowds of people in response to news or blog articles published by official news agencies. We apply different aggregation levels (comment, person, discussion thread) and discuss the associated patterns of entropy change.
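The entropy of the two-state representation described above is just the binary Shannon entropy of the fraction of flipped strings; a minimal sketch (function name and sizes are illustrative):

```python
import numpy as np

def report_entropy(E):
    """Shannon entropy (bits per string) of the two-state representation
    E_i = ±1 of an event report, from the fraction of strings flipped to +1."""
    p = np.mean(np.asarray(E) == 1)
    if p in (0.0, 1.0):
        return 0.0   # all strings agree: minimum entropy
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

N = 100
E = -np.ones(N)           # initial report: all E_i = -1
print(report_entropy(E))  # 0.0, the minimum-entropy state
E[:N // 2] = 1            # comments flip half the strings
print(report_entropy(E))  # 1.0, the maximum-entropy threshold at N/2 flips
```

The threshold condition in the abstract (N/2 variables with Ei = 1) is exactly the point where this per-string entropy peaks at 1 bit.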
Efficient data analysis and travel time picking methods for crosshole GPR experiments
DEFF Research Database (Denmark)
Keskinen, Johanna; Moreau, Julien; Nielsen, Lars
2013-01-01
High-resolution GPR crosshole experiments are conducted to resolve fine-scale anisotropy of chalk. Chalk plays important roles in groundwater production onshore Denmark and in hydrocarbon exploration in the North Sea, and chalk has previously been studied extensively with geological and geophysical...... methods. Future time-lapse GPR studies of different types of chalk aim at characterizing the flow characteristics of these economically important lithologies. In the framework of the current study, we have collected new crosshole GPR data from a site located in a former quarry in Eastern Denmark, where...... crosshole GPR experiments have been previously carried out. Time-lapse GPR crosshole experiments involve the interpretation of large datasets organised in terms of transmitter or receiver gathers and hence call for efficient and robust data inspection and picking of first arrivals. To this end, we have......
Kessner, Darren; Novembre, John
2015-04-01
Evolve and resequence studies combine artificial selection experiments with massively parallel sequencing technology to study the genetic basis for complex traits. In these experiments, individuals are selected for extreme values of a trait, causing alleles at quantitative trait loci (QTL) to increase or decrease in frequency in the experimental population. We present a new analysis of the power of artificial selection experiments to detect and localize quantitative trait loci. This analysis uses a simulation framework that explicitly models whole genomes of individuals, quantitative traits, and selection based on individual trait values. We find that explicitly modeling QTL provides qualitatively different insights than considering independent loci with constant selection coefficients. Specifically, we observe how interference between QTL under selection affects the trajectories and lengthens the fixation times of selected alleles. We also show that a substantial portion of the genetic variance of the trait (50-100%) can be explained by detected QTL in as little as 20 generations of selection, depending on the trait architecture and experimental design. Furthermore, we show that power depends crucially on the opportunity for recombination during the experiment. Finally, we show that an increase in power is obtained by leveraging founder haplotype information to obtain allele frequency estimates. Copyright © 2015 by the Genetics Society of America.
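A minimal sketch of the simpler constant-selection-coefficient model that the authors contrast with explicit QTL simulation: a deterministic selection step followed by binomial Wright-Fisher drift. This is a textbook baseline under assumed parameters, not the paper's forward simulator:

```python
import numpy as np

def selection_trajectory(p0, s, n_ind, generations, seed=0):
    """Allele-frequency trajectory at one locus under Wright-Fisher sampling
    with a constant selection coefficient s."""
    rng = np.random.default_rng(seed)
    p = p0
    traj = [p]
    for _ in range(generations):
        # deterministic selection step, then binomial drift over 2N gene copies
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        p = rng.binomial(2 * n_ind, p) / (2 * n_ind)
        traj.append(p)
    return traj

traj = selection_trajectory(p0=0.1, s=0.1, n_ind=500, generations=20)
print(traj[0], traj[-1])
```

In the independent-locus picture each selected allele follows such a trajectory on its own; the paper's point is that explicitly modeled QTL interfere with each other, distorting these trajectories and lengthening fixation times.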
Bubble formation and (in)efficient markets in learning-to-forecast and -optimize experiments
Bao, T.; Hommes, C.; Makarewicz, T.
2014-01-01
This experiment compares the price dynamics and bubble formation in an asset market with a price adjustment rule in three treatments where subjects (1) submit a price forecast only, (2) choose quantity to buy/sell and (3) perform both tasks. We find deviation of the market price from the fundamental
Energy Technology Data Exchange (ETDEWEB)
Growitsch, Christian [WIK Wissenschaftliches Institut fuer Infrastruktur und Kommunikationsdienste GmbH, Bad Honnef (Germany). Dept. of Energy Markets and Energy Regulation; Jamasb, Tooraj [Edinburgh Univ. (United Kingdom). Dept. of Economics; Wetzel, Heike [Cologne Univ. (Germany). Dept. of Economics
2010-08-15
Since the 1990s, efficiency and benchmarking analysis has increasingly been used in network utilities research and regulation. A recurrent concern is the effect of environmental factors that are beyond the influence of firms (observable heterogeneity) and factors that are not identifiable (unobserved heterogeneity) on the measured cost and quality performance of firms. This paper analyses the effect of geographic and weather factors and unobserved heterogeneity on a set of 128 Norwegian electricity distribution utilities for the 2001-2004 period. We utilize data on almost 100 geographic and weather variables to identify real economic inefficiency while controlling for observable and unobserved heterogeneity. We use the factor analysis technique to reduce the number of environmental factors into a few composite variables and to avoid the problem of multicollinearity. We then estimate the established stochastic frontier models of Battese and Coelli (1992; 1995) and the recent true fixed effects models of Greene (2004; 2005), without and with environmental variables. In the former models, some composite environmental variables have a significant effect on the performance of utilities. These effects vanish in the true fixed effects models. However, the latter models capture the entire unobserved heterogeneity and therefore show significantly higher average efficiency scores. (orig.)
Hassanein, A.; Sizyuk, T.; Sizyuk, V.; Harilal, S. S.
2011-04-01
Laser-produced plasma (LPP) is currently a promising source of efficient extreme ultraviolet (EUV) photons for advanced lithography. Optimum laser pulse parameters with adjusted wavelength, energy, and duration for a simple planar or spherical tin target can provide 2-3% conversion efficiency (CE) in laboratory experiments. These values are also in good agreement with modeling results. Additional effects such as targets with complex geometry and tin-doped targets using pre-pulsing of laser beams can significantly increase CE. Recent studies showed that such improvements in LPP systems are due to reduced laser energy losses, achieved by decreasing photon transmission (for higher harmonics of the Nd:YAG laser) or photon reflection (for the CO2 laser). Optimization of target heating using pre-pulses or by ablating low-density and nanoporous tin oxide can further improve LPP sources by creating more efficient plasma plumes and, as a result, increasing CE, the most important parameter for EUV sources. The second important challenge in developing LPP devices is to decrease fast ions and target debris in order to protect the optical collection system and increase its lifetime. We investigated the combined effects of pre-pulsing with various parameters and different target geometries on EUV conversion efficiency and on energetic ion production. The much higher reflectivity of the CO2 laser from a tin target suggests two possible routes for system improvement: using pre-pulses with shorter laser wavelengths, or using more complex target geometries with special grooves, as developed previously by the authors.
A fast and efficient gene-network reconstruction method from multiple over-expression experiments
Directory of Open Access Journals (Sweden)
Thurner Stefan
2009-08-01
Full Text Available Abstract Background Reverse engineering of gene regulatory networks presents one of the big challenges in systems biology. Gene regulatory networks are usually inferred from a set of single-gene over-expression and/or knockout experiments. Functional relationships between genes are retrieved either from the steady-state gene expressions or from the respective time series. Results We present a novel algorithm for gene network reconstruction on the basis of steady-state gene-chip data from over-expression experiments. The algorithm is based on a straightforward solution of a linear gene-dynamics equation, where experimental data is fed in as a first predictor for the solution. We compare the algorithm's performance with the NIR algorithm, both on the well-known E. coli experimental data and on in-silico experiments. Conclusion We show the superiority of the proposed algorithm in the number of correctly reconstructed links and discuss computational time and robustness. The proposed algorithm is not limited by combinatorial explosion problems and can be used in principle for large networks.
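The core linear-algebra step such steady-state reconstruction relies on can be sketched with a synthetic network. The paper's algorithm additionally feeds experimental data in as a predictor; this sketch shows only the underlying linear relation under assumed noise-free conditions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5                                                # number of genes
A_true = rng.normal(0, 0.3, (n, n)) - 2 * np.eye(n)  # stable interaction matrix

# linear gene dynamics dx/dt = A x + p has steady state x = -A^{-1} p;
# one over-expression experiment per gene gives perturbations P = I
P = np.eye(n)
X = -np.linalg.solve(A_true, P)   # measured steady states, one column per experiment

# reconstruction: recover A from the steady-state relation A X = -P
A_hat = np.linalg.lstsq(X.T, -P.T, rcond=None)[0].T
print(np.allclose(A_hat, A_true))
```

With noisy, fewer-than-n experiments the same least-squares step becomes underdetermined, which is where the data-as-predictor idea and comparisons with NIR come in.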
DEFF Research Database (Denmark)
Baldini, Mattia; Klinge Jacobsen, Henrik
2016-01-01
, current issues concerning climate change and fossil fuel depletion have moved attention towards cleaner ways to produce energy. This trend facilitated the breakthrough of renewable technologies. Since then, support policies have promoted the large deployment of renewables, without sufficiently considering...... the improvements made in the energy-saving field. Indeed, little attention has been paid to implementing energy efficiency measures, which has resulted in scenarios where expedients for a wise use of energy (e.g. energy savings and renewables share) are unbalanced. The aim of this paper is to review and evaluate...... international experiences on finding the optimal trade-off between efficiency improvements and additional renewable energy supply. A critical review of each technique, focusing on purposes, methodology and outcomes, is provided along with a review of tools adopted for the analyses. The models are categorized......
DEFF Research Database (Denmark)
Baldini, Mattia; Klinge Jacobsen, Henrik
, current issues concerning climate change and fossil fuel depletion have moved attention towards cleaner ways to produce energy. This trend facilitated the breakthrough of renewable technologies. Since then, support policies have promoted the large deployment of renewables, without sufficiently considering...... improvements made in the energy-saving field. Indeed, less attention has been paid to implementing energy efficiency measures in energy systems modeling, which has resulted in scenarios where expedients for a wise use of energy (e.g. energy savings and renewables’ share) are unbalanced and cost......-savings opportunities are missed. The aim of this paper is to review and evaluate international experiences on finding the optimal trade-off between efficiency improvements and additional renewable energy supply. A critical review of each technique, focusing on purposes, methodology and outcomes, is provided along......
Phonon broadening in high entropy alloys
Körmann, Fritz; Ikeda, Yuji; Grabowski, Blazej; Sluiter, Marcel H. F.
2017-09-01
Refractory high entropy alloys feature outstanding properties making them a promising materials class for next-generation high-temperature applications. At high temperatures, materials properties are strongly affected by lattice vibrations (phonons). Phonons critically influence thermal stability, thermodynamic and elastic properties, as well as thermal conductivity. In contrast to perfect crystals and ordered alloys, the inherently present mass and force constant fluctuations in multi-component random alloys (high entropy alloys) can induce significant phonon scattering and broadening. Despite their importance, phonon scattering and broadening have so far only scarcely been investigated for high entropy alloys. We tackle this challenge from a theoretical perspective and employ ab initio calculations to systematically study the impact of force constant and mass fluctuations on the phonon spectral functions of 12 body-centered cubic random alloys, from binaries up to 5-component high entropy alloys, addressing the key question of how chemical complexity impacts phonons. We find that it is crucial to include both mass and force constant fluctuations. If one or the other is neglected, qualitatively wrong results can be obtained such as artificial phonon band gaps. We analyze how the results obtained for the phonons translate into thermodynamically integrated quantities, specifically the vibrational entropy. Changes in the vibrational entropy with increasing the number of elements can be as large as changes in the configurational entropy and are thus important for phase stability considerations. The set of studied alloys includes MoTa, MoTaNb, MoTaNbW, MoTaNbWV, VW, VWNb, VWTa, VWNbTa, VTaNbTi, VWNbTaTi, HfZrNb, HfMoTaTiZr.
Entropy, Function and Evolution: Naturalizing Peircian Semiosis
Directory of Open Access Journals (Sweden)
Carsten Herrmann-Pillath
2010-02-01
Full Text Available In the biosemiotic literature there is a tension between the naturalistic reference to biological processes and the category of ‘meaning’, which is central in the concept of semiosis. A crucial term bridging the two dimensions is ‘information’. I argue that the tension can be resolved if we reconsider the relation between information and entropy and downgrade the conceptual centrality of Shannon information in the standard approach to entropy and information. Entropy comes into full play if semiosis is seen as a physical process involving causal interactions between physical systems with functions. Functions emerge from evolutionary processes, as conceived in recent philosophical contributions to teleosemantics. In this context, causal interactions can be interpreted in a dual mode, namely as standard causation and as an observation. Thus, a function appears to be the interpretant in the Peircian triadic notion of the sign. Recognizing this duality, the Gibbs/Jaynes notion of entropy is added to the picture, which shares an essential conceptual feature with the notion of function: both concepts are part of a physicalist ontology, but are observer-relative at the same time. Thus, it is possible to give an account of semiosis within the entropy framework without limiting the notion of entropy to the Shannon measure, but taking full account of the thermodynamic definition. A central feature of this approach is the conceptual linkage between the evolution of functions and maximum entropy production. I show how we can conceive of the semiosphere as a fundamental physical phenomenon. Following an early contribution by Hayek, in conclusion I argue that the category of ‘meaning’ supervenes on nested functions in semiosis, and has a function itself, namely to enable functional self-reference, which otherwise manifests functional breakdown because of standard set-theoretic paradoxes.
Directory of Open Access Journals (Sweden)
Li Pan
2016-03-01
Full Text Available Virtualization technologies make it possible for cloud providers to consolidate multiple IaaS provisions into a single server in the form of virtual machines (VMs. Additionally, in order to fulfill the divergent service requirements from multiple users, a cloud provider needs to offer several types of VM instances, which are associated with varying configurations and performance, as well as different prices. In such a heterogeneous virtual machine placement process, one significant problem faced by a cloud provider is how to optimally accept and place multiple VM service requests into its cloud data centers to achieve revenue maximization. To address this issue, in this paper, we first formulate such a revenue maximization problem during VM admission control as a multiple-dimensional knapsack problem, which is known to be NP-hard to solve. Then, we propose to use a cross-entropy-based optimization approach to address this revenue maximization problem, by obtaining a near-optimal eligible set for the provider to accept into its data centers, from the waiting VM service requests in the system. Finally, through extensive experiments and measurements in a simulated environment with the settings of VM instance classes derived from real-world cloud systems, we show that our proposed cross-entropy-based admission control optimization algorithm is efficient and effective in maximizing cloud providers’ revenue in a public cloud computing environment.
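A minimal sketch of the cross-entropy optimization step the abstract describes, applied to a toy multi-dimensional knapsack of VM requests. All numbers, parameter names, and the smoothing scheme are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def ce_knapsack(revenue, demand, capacity, n_samples=200, n_elite=20, iters=40, seed=0):
    """Cross-entropy method for the multi-dimensional knapsack modeling VM
    admission: choose a subset of requests maximizing revenue subject to
    per-resource capacities. Bernoulli inclusion probabilities are refit to
    the elite (highest-value feasible) samples each iteration."""
    rng = np.random.default_rng(seed)
    n = len(revenue)
    p = np.full(n, 0.5)                  # inclusion probability per request
    best, best_val = None, -np.inf
    for _ in range(iters):
        samples = rng.random((n_samples, n)) < p
        feasible = np.all(samples @ demand <= capacity, axis=1)
        values = np.where(feasible, samples @ revenue, -np.inf)
        elite = samples[np.argsort(values)[-n_elite:]]
        if values.max() > best_val:
            best_val, best = values.max(), samples[values.argmax()]
        p = 0.7 * elite.mean(axis=0) + 0.3 * p   # smoothed probability update
    return best, best_val

revenue = np.array([10., 7., 5., 9., 3.])                             # revenue per request
demand = np.array([[4., 2.], [3., 3.], [2., 1.], [4., 4.], [1., 1.]]) # (CPU, RAM) per request
capacity = np.array([8., 6.])                                         # data-center capacity
sel, val = ce_knapsack(revenue, demand, capacity)
print(sel.astype(int), val)
```

The exact problem is NP-hard, which is why a sampling-based heuristic like this is attractive: each iteration only requires evaluating candidate subsets and refitting a product-Bernoulli distribution to the best of them.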
Progress in Preparation and Research of High Entropy Alloys
Directory of Open Access Journals (Sweden)
CHEN Yong-xing
2017-11-01
Full Text Available Current studies of high entropy alloys mostly address bulk, powder, coating, and film forms; studies in other areas are few, and a unified classification is lacking. Based on the current state of high entropy alloy research, this paper classifies the kinds of high entropy alloys investigated so far, introduces the principles for selecting elements, summarizes the preparation methods, and reviews the research institutions, research methods, and research contents in the field. It surveys the application prospects of high entropy alloys and raises a series of open scientific problems, including limited research on mechanisms, incomplete performance studies, unsystematic study of thermal stability, preparation process parameters still to be optimized, the design of lightweight high entropy alloys, and the expansion of the research field, and offers possible solutions. These points provide guidance for extending the applications of high entropy alloys and for future research directions.
Double symbolic joint entropy in nonlinear dynamic complexity analysis
Yao, Wenpo; Wang, Jun
2017-07-01
Symbolization, the basis of symbolic dynamic analysis, can be classified into global static and local dynamic approaches, which we combine via joint entropy for nonlinear dynamic complexity analysis. Two global static methods, the symbolic transformations of Wessel N. symbolic entropy and base-scale entropy, and two local dynamic ones, namely the symbolizations of permutation and differential entropy, constitute four double symbolic joint entropies that accurately detect complexity in chaotic models, the logistic and Henon map series. In nonlinear dynamical analysis of different kinds of heart rate variability, heartbeats of the healthy young have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values. Each individual symbolic entropy is improved by double symbolic joint entropy, among which the combination of base-scale and differential symbolizations gives the best complexity analysis. Test results show that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.
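The local dynamic symbolization of permutation entropy mentioned above can be sketched in a few lines: each window is replaced by the rank order of its values, and the entropy of the resulting symbol distribution measures complexity. A minimal single-symbolization sketch (the joint, double-symbolic combination in the paper is not reproduced here):

```python
import numpy as np
from itertools import permutations
from math import factorial

def permutation_entropy(x, order=3):
    """Normalized permutation entropy: symbolize each length-`order` window by
    the rank order of its values, then take the Shannon entropy of the symbol
    distribution (0 = fully regular, 1 = fully random)."""
    x = np.asarray(x)
    patterns = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        patterns[tuple(np.argsort(x[i:i + order]))] += 1
    counts = np.array([c for c in patterns.values() if c > 0], dtype=float)
    probs = counts / counts.sum()
    return -(probs * np.log(probs)).sum() / np.log(factorial(order))

# chaotic test signal: the logistic map in its fully chaotic regime
x, logistic = 0.4, []
for _ in range(2000):
    x = 4.0 * x * (1 - x)
    logistic.append(x)
print(permutation_entropy(logistic))          # high but below 1: forbidden patterns
print(permutation_entropy(np.arange(2000)))   # 0.0: a monotonic ramp has one pattern
```

A joint symbolic entropy then pairs each window's permutation symbol with a second (e.g. base-scale or differential) symbol and computes the entropy of the joint symbol distribution.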
Xu, Kaixuan; Wang, Jun
2017-02-01
In this paper, the recently introduced permutation entropy and sample entropy are extended to the fractional case, yielding weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional-order generalization of information entropy is utilized in the above two complexity approaches to detect the statistical characteristics of fractional-order information in complex systems. The effectiveness analysis of the proposed methods on synthetic data and real-world data reveals that tuning the fractional order allows a higher sensitivity and a more accurate characterization of the signal evolution, which is useful in describing the dynamics of complex systems. Moreover, the numerical research on nonlinear complexity behaviors is compared between the returns series of the Potts financial model and actual stock markets. The empirical results confirm the feasibility of the proposed model.
Xu, Yonghong; Li, Xingxing; Zhao, Yong
2013-10-01
In this paper, a new method combining the wavelet packet transform and multivariate multiscale entropy for the classification of epilepsy EEG signals is introduced. Firstly, the original EEG signals are decomposed at multiple scales with the wavelet packet transform, and the wavelet packet coefficients of the required frequency bands are extracted. Secondly, the wavelet packet coefficients are processed with the multivariate multiscale entropy algorithm. Finally, the EEG data are classified by support vector machines (SVM). The experimental results on the publicly available Bonn epilepsy EEG dataset show that the proposed method can efficiently extract epileptic features and that the classification accuracy is satisfactory.
Stretching Rubber, Stretching Minds: a polymer physics lab for teaching entropy
Brzinski, Theodore A
2015-01-01
Entropy is a difficult concept to teach using real-world examples. Unlike temperature, pressure, volume, or work, it's not a quantity which most students encounter in their day-to-day lives. Even the way entropy is often qualitatively described, as a measure of disorder, is incomplete and can be misleading. In an effort to address these obstacles, we have developed a simple laboratory activity, the stretching of an elastic rubber sheet, intended to give students hands-on experience with the concepts of entropy, temperature and work in both adiabatic and quasistatic processes. We present two versions of the apparatus: a double-lever system, which may be reproduced with relatively little cost, and a commercial materials testing system, which provides students experience with scientific instrumentation that is used in research.
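The entropic origin of rubber elasticity that the lab demonstrates can be made quantitative with the standard freely-jointed-chain result (a textbook formula, not derived in the article): for a chain of $N$ segments of length $b$ stretched to end-to-end distance $x$, the entropy is Gaussian in $x$, so the retractive force is proportional to temperature,

```latex
S(x) \approx S_0 - \frac{3 k_B x^2}{2 N b^2},
\qquad
F = -T \frac{\partial S}{\partial x} = \frac{3 k_B T\, x}{N b^2}.
```

Because $F \propto T$ with no energetic contribution, a rubber sheet contracts when heated and warms slightly under adiabatic stretching, which is exactly the behavior the two apparatus versions let students measure.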
Liang, Xiuying; Zhu, Chunyan
2017-11-01
With rising global emphasis on climate change and sustainable development, how to accelerate improvements in energy efficiency has become an important question. Designing and implementing energy-efficiency policies for super-efficient products represents an important direction for achieving breakthroughs in the field of energy conservation. On December 31, 2014, China’s National Development and Reform Commission (NDRC), jointly with six other ministerial agencies, launched the China Leading Energy Efficiency Program (LEP), which identifies top-efficiency models for selected product categories. LEP sets the highest energy efficiency benchmark. The design of LEP took into consideration how best to motivate manufacturers to accelerate technical innovation and promote high-efficiency products. This paper explains the core elements of LEP, such as its objectives, selection criteria, implementation method and supportive policies. It also proposes recommendations to further improve LEP through international policy comparison with Japan’s Top Runner Program, the U.S. Energy Star Most Efficient program, and the SEAD Global Efficiency Medal.
Analyzing bin-width effect on the computed entropy
Purwani, Sri; Nahar, Julita; Twining, Carole
2017-08-01
The Shannon entropy is a mathematical expression for quantifying the amount of randomness, and can be used to measure information content; it appears frequently in objective functions. Mutual Information (MI) uses the Shannon entropy to determine the shared information content of two images. The Shannon entropy, originally derived by Shannon in the context of lossless encoding of messages, is also used to define an optimum message length in the Minimum Description Length (MDL) principle for groupwise registration. The majority of papers use a histogram for computing MI, and hence the entropy. We therefore aim to analyze the effect of bin width on the computed entropy. We first derive the Shannon entropy from the integral of the probability density function (pdf), and find that the Gaussian has the maximum entropy among all distributions with a given variance. We also show that the entropy of the flat distribution is less than the entropy of the Gaussian distribution with the same variance. We then investigate the bin-width effect on the computed entropy, and analyze the relationship between the computed entropy and the integral entropy when we vary the bin width but fix the variance and the number of samples. We find that the value of the computed entropy lies within the theoretical predictions at small and large bin widths. We also show two types of bias in the entropy estimator.
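The comparison between the histogram-based (computed) entropy and the integral entropy can be sketched directly: the plug-in histogram estimate, corrected by log of the bin width, approximates the differential entropy, and the analytic Gaussian value provides the reference. Sample counts and widths below are illustrative assumptions:

```python
import numpy as np

def histogram_entropy(samples, bin_width):
    """Plug-in differential-entropy estimate from a histogram:
    H ~= -sum_k p_k log p_k + log(bin_width), with p_k the bin probabilities."""
    lo, hi = samples.min(), samples.max()
    edges = np.arange(lo, hi + bin_width, bin_width)
    counts, _ = np.histogram(samples, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum() + np.log(bin_width)

rng = np.random.default_rng(0)
sigma = 1.0
samples = rng.normal(0.0, sigma, 100_000)
# integral (differential) entropy of the Gaussian: (1/2) ln(2 pi e sigma^2)
analytic = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

for w in (0.01, 0.1, 0.5, 2.0):
    print(w, histogram_entropy(samples, w), analytic)
```

Sweeping the bin width while holding the variance and sample count fixed, as the paper does, exposes the two bias regimes: very small bins suffer from sparse counts, very large bins from coarse discretization.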
Properties of Risk Measures of Generalized Entropy in Portfolio Selection
Directory of Open Access Journals (Sweden)
Rongxi Zhou
2017-12-01
Full Text Available This paper systematically investigates the properties of six kinds of entropy-based risk measures: Information Entropy and Cumulative Residual Entropy in the probability space; Fuzzy Entropy, Credibility Entropy and Sine Entropy in the fuzzy space; and Hybrid Entropy in the hybridized uncertainty of both fuzziness and randomness. We discover that none of the risk measures satisfy all six of the following properties, which various scholars have associated with effective risk measures: Monotonicity, Translation Invariance, Sub-additivity, Positive Homogeneity, Consistency and Convexity. The measures based on Fuzzy Entropy, Credibility Entropy, and Sine Entropy all exhibit the same properties: Sub-additivity, Positive Homogeneity, Consistency, and Convexity. The measures based on Information Entropy and Hybrid Entropy, meanwhile, exhibit only Sub-additivity and Consistency. Cumulative Residual Entropy satisfies just Sub-additivity, Positive Homogeneity, and Convexity. After identifying these properties, we develop seven portfolio models based on the different risk measures and make empirical comparisons using samples from both the Shenzhen Stock Exchange of China and the New York Stock Exchange of America. The comparisons show that the Mean Fuzzy Entropy Model performs the best among the seven models with respect to both daily returns and relative cumulative returns. Overall, these results could provide an important reference for both constructing effective risk measures and rationally selecting the appropriate risk measure under different portfolio selection conditions.
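One plausible concrete reading of an Information Entropy risk measure is the Shannon entropy of the return distribution over a fixed grid: more dispersed outcomes give higher entropy, hence higher measured risk. The grid, data, and function name below are illustrative assumptions, not the paper's specification:

```python
import numpy as np

def information_entropy_risk(returns, edges):
    """Information-entropy risk of a return series: Shannon entropy of the
    empirical return distribution over a fixed grid of bins."""
    counts, _ = np.histogram(returns, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(0)
edges = np.linspace(-0.2, 0.2, 81)                 # fixed grid of daily-return bins
calm = rng.normal(0.0005, 0.01, 5000)              # low-volatility asset
volatile = rng.normal(0.0005, 0.04, 5000)          # same mean, 4x the volatility
print(information_entropy_risk(calm, edges), information_entropy_risk(volatile, edges))
```

Checking the six properties the paper lists (e.g. whether such a measure is translation invariant, which entropy trivially is, or monotone, which it is not) is exactly the kind of analysis the abstract describes.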
He, Huiying; Yang, Rui; Li, Yajun; Ma, Aisheng; Cao, Lanqin; Wu, Xiaoming; Chen, Biyun; Tian, Hui; Gao, Yajun
2017-01-01
Oilseed rape (Brassica napus) characteristically has high N uptake efficiency and low N utilization efficiency (NUtE, seed yield/shoot N accumulation). Determining the NUtE phenotype of various genotypes in different growth conditions is a way of finding target traits to improve oilseed rape NUtE. The aim of this study was to compare oilseed rape genotypes grown on contrasting N supply rates in pot and field experiments to investigate the genotypic variations of NUtE and to identify indicators of N-efficient genotypes. For 50 oilseed rape genotypes, NUtE, dry matter and N partitioning, morphological characteristics, and the yield components were investigated under high and low N supplies in a greenhouse pot experiment and a field trial. Although the genotype rankings of NUtE were different between the pot experiment and the field trial, some genotypes performed consistently in both environments. N-responder, N-nonresponder, N-efficient and N-inefficient genotypes were identified from these genotypes with consistent NUtE. The correlations between the pot experiment and the field trial in NUtE were only 0.34 at high N supplies and no significant correlations were found at low N supplies. However, Pearson coefficient correlation (r) and principal component analysis showed NUtE had similar genetic correlations with other traits across the pot and field experiments. Among the yield components, only seeds per silique showed strong and positive correlations with NUtE under varying N supply in both experiments (r = 0.47**; 0.49**; 0.47**; 0.54**). At high and low N supply, NUtE was positively correlated with seed yield (r = 0.45**; 0.53**; 0.39**; 0.87**), nitrogen harvest index (NHI, r = 0.68**; 0.82**; 0.99**; 0.89**), and harvest index (HI, r = 0.79**; 0.83**; 0.90**; 0.78**) and negatively correlated with biomass distribution to stem and leaf (r = −0.34**; −0.45**; −0.37**; 0.62**), all aboveground plant section N concentration (r from −0.30* to −0
Directory of Open Access Journals (Sweden)
Huiying He
2017-10-01
Full Text Available Oilseed rape (Brassica napus) characteristically has high N uptake efficiency and low N utilization efficiency (NUtE, seed yield/shoot N accumulation). Determining the NUtE phenotype of various genotypes in different growth conditions is a way of finding target traits to improve oilseed rape NUtE. The aim of this study was to compare oilseed rape genotypes grown on contrasting N supply rates in pot and field experiments to investigate the genotypic variations of NUtE and to identify indicators of N-efficient genotypes. For 50 oilseed rape genotypes, NUtE, dry matter and N partitioning, morphological characteristics, and the yield components were investigated under high and low N supplies in a greenhouse pot experiment and a field trial. Although the genotype rankings of NUtE were different between the pot experiment and the field trial, some genotypes performed consistently in both environments. N-responder, N-nonresponder, N-efficient and N-inefficient genotypes were identified from these genotypes with consistent NUtE. The correlations between the pot experiment and the field trial in NUtE were only 0.34 at high N supplies and no significant correlations were found at low N supplies. However, Pearson coefficient correlation (r) and principal component analysis showed NUtE had similar genetic correlations with other traits across the pot and field experiments. Among the yield components, only seeds per silique showed strong and positive correlations with NUtE under varying N supply in both experiments (r = 0.47**; 0.49**; 0.47**; 0.54**). At high and low N supply, NUtE was positively correlated with seed yield (r = 0.45**; 0.53**; 0.39**; 0.87**), nitrogen harvest index (NHI, r = 0.68**; 0.82**; 0.99**; 0.89**), and harvest index (HI, r = 0.79**; 0.83**; 0.90**; 0.78**) and negatively correlated with biomass distribution to stem and leaf (r = −0.34**; −0.45**; −0.37**; 0.62**), all aboveground plant section N concentration (r from −0.30* to
Energy Technology Data Exchange (ETDEWEB)
Deiss, R. [EnBW Regional AG, Stuttgart (Germany)
2004-07-01
In the late 1990s, a new generation of devices in the sector of remote monitoring of CCP appeared, which allowed the large-scale application of CCP remote monitoring. The article deals with the operating experience gained so far, the current device technology, and aspects of the efficiency of these systems. (orig.) [German] Mit dem Aufkommen einer neuen Generation von Fernueberwachungssensoren Ende der neunziger Jahre des vorigen Jahrhunderts war es moeglich, die KKS-Fernueberwachung in grossem Stile anzuwenden. Der Beitrag befasst sich mit den Betriebserfahrungen, die bisher gewonnen werden konnten, beleuchtet die aktuelle Geraetetechnik und betrachtet die Wirtschaftlichkeit dieser Systeme. (orig.)
Efficient use of correlation entropy for analysing time series data
Indian Academy of Sciences (India)
is more clearly seen in figure 4 where a blow up of the region from M = 5 to 8 of figure 3 is shown. Moreover, the error bar in figure 2 also increases proportional to the white noise contamination. For a noise level of 100% in figure 2d, the scheme computes K2 only up to a maximum M value of Mcr = 6. It shows that beyond.
Energy Technology Data Exchange (ETDEWEB)
Santillan, M [Cinvestav-IPN, Unidad Monterrey, Parque de Investigacion e Innovacion Tecnologica, Autopista Monterrey-Aeropuerto Km 10, 66600 Apodaca NL (Mexico); Zeron, E S [Departamento de Matematicas, Cinvestav-IPN, 07000 Mexico DF (Mexico); Rio-Correa, J L del [Departamento de Fisica, Universidad Autonoma Metropolitana Iztapalapa, 09340 Mexico DF (Mexico)], E-mail: msantillan@cinvestav.mx, E-mail: eszeron@math.cinvestav.mx, E-mail: jlrc@xanum.uam.mx
2008-05-15
In traditional statistical mechanics textbooks, the entropy concept is first introduced for the microcanonical ensemble and then extended to the canonical and grand-canonical cases. However, in the authors' experience, this procedure makes it difficult for the student to see the bigger picture, and the subtlety of the demonstrations needed to pass from the microcanonical to the canonical and grand-canonical ensembles, although quite ingenious, is hard to grasp. In the present work, we adapt the approach used by Schroedinger to introduce the entropy definition for quantum mechanical systems to derive a classical mechanical entropy definition, which is valid for all ensembles and is in complete agreement with the Gibbs entropy. Afterwards, we show how the specific probability densities for the microcanonical and canonical ensembles can be obtained from the system macrostate, the entropy definition and the second law of thermodynamics. After teaching the approach introduced in this paper for several years, we have found that it allows a better understanding of the foundations of statistical mechanics. On the other hand, since it demands previous knowledge of thermodynamics and mathematical analysis, in our experience this approach is better suited to final-year undergraduate and graduate physics students.
Numerical and analytical modelling of entropy noise in a supersonic nozzle with a shock
Leyko, M.; Moreau, S.; Nicoud, F.; Poinsot, T.
2011-08-01
Analytical and numerical assessments of the indirect noise generated through a nozzle are presented. The configuration corresponds to an experiment performed at DLR by Bake et al. [The entropy wave generator (EWG): a reference case on entropy noise, Journal of Sound and Vibration 326 (2009) 574-598] in which an entropy wave is generated upstream of a nozzle by an electrical heating device. Both 3-D and 2-D axisymmetric simulations are performed to demonstrate that the experiment is mostly driven by linear acoustic phenomena, including pressure wave reflection at the outlet and entropy-to-acoustic conversion in the accelerated regions. Moreover, the spatial inhomogeneity of the upstream entropy fluctuation has no visible effect for the investigated frequency range (0-100 Hz). Similar results are obtained with a purely analytical method based on the compact nozzle approximation of Marble and Candel [Acoustic disturbances from gas nonuniformities convected through a nozzle, Journal of Sound and Vibration 55 (1977) 225-243], demonstrating that the DLR results can be reproduced simply on the basis of a low-frequency compact-elements approximation. As in the present simulations, the analytical method shows that the acoustic impedance downstream of the nozzle must be accounted for to properly recover the experimental pressure signal. The analytical method can also be used to optimize the experimental parameters and avoid the interaction between transmitted and reflected waves.
Emotion-induced higher wavelet entropy in the EEG with depression during a cognitive task.
Wei, Ling; Li, Yingjie; Ye, Jiping; Yang, Xiaoli; Wang, Jijun
2009-01-01
This paper presents a study about how emotion influences cognition. We used wavelet entropy as a tool to analyze event-related electroencephalographs during a cognitive task. Emotion and cognition are two major aspects of human mental life that are widely regarded as distinct but interacting. However, the mechanism of this interaction is still not well known. In our study, a recognition task with facial stimuli was utilized in order to address the influence of emotion on working memory. Three expressions of each face (happy-positive, sad-negative, and neutral) were chosen for the experiments. Since depression is characterized as a typical mental disease with emotion processing deficits, sixteen patients with depression and sixteen normal controls were chosen to participate in the experiment. The repeated-measures analysis of variance (ANOVA) revealed that the patients with depression had significantly higher entropies than the normal controls over all brain regions. Although the behavioral results did not indicate any emotion effect, wavelet entropy revealed more: the entropy analysis found an emotion effect in the right anterior and right central regions of the brain. We conclude that patients with depression showed a much higher emotion-induced disorder than normal subjects from about 300 ms after stimulus onset. Methodologically, wavelet entropy can help us understand the interaction between emotion and cognition.
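As a rough sketch of the wavelet-entropy measure used in studies like this one, the relative energies of a discrete wavelet decomposition are plugged into the Shannon formula. The Haar basis, the four decomposition levels, and the toy signals below are illustrative assumptions, not details taken from the paper:

```python
import math
import random

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficient lists."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def wavelet_entropy(signal, levels=4):
    """Shannon entropy of the relative wavelet energies across levels.
    Low values: energy concentrated in few bands (ordered signal).
    High values: energy spread over all bands (disordered signal)."""
    energies = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        energies.append(sum(d * d for d in detail))
    energies.append(sum(a * a for a in approx))
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    return -sum(p * math.log(p) for p in probs)

n = 256
sine = [math.sin(2 * math.pi * 4 * i / n) for i in range(n)]   # ordered signal
random.seed(0)
noise = [random.uniform(-1.0, 1.0) for _ in range(n)]          # disordered signal
print(wavelet_entropy(sine), wavelet_entropy(noise))  # sine is much lower
```

In the EEG setting, "higher entropy" thus means the signal's energy is dispersed across frequency bands rather than concentrated in one, which is the sense in which the depressed group showed more disorder.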
Golovko, V. V.; Iacob, V. E.; Hardy, J. C.
2008-04-01
We previously reported Monte Carlo (MC) studies of the efficiency of a 1-mm-thick plastic detector for few-MeV electrons with various programs: Geant4, EGSnrc and Penelope. The simulated results were also compared with measured data from standard conversion-electron sources: ^133Ba, ^137Cs and ^207Bi. [1] These studies were part of our program to test the Electroweak Standard Model via precise measurements of lifetimes, branching ratios and Q-values of superallowed 0^+->0^+ nuclear transitions [2], which in turn yield the value of the up-down quark-mixing element of the Cabibbo-Kobayashi-Maskawa (CKM) matrix. The MC studies of the β-detector efficiency are important for the measurement of precise β^+ branching ratios, since there is a slight difference in the efficiency of the β-detector for different β-branches. This has an additional effect on the number of observed β-γ coincidences over and above the well-known efficiency of our γ-ray detector. We report here an extension of the comparison between MC calculations and experiment to a ^60Co β-source, and a study of the influence of peripheral objects on the β-detector efficiency. [1] V.V. Golovko et al. BAPS 59, no 6, p. DH4 83, 2006; BAPS 52, no 3, p. C16 53, 2007; BAPS 52, no 9, p. EH8 83, 2007. [2] J.C. Hardy and I.S. Towner. PRC, 71(5):055501, 2005.
Entropy and Energy, - a Universal Competition
Müller, Ingo
2008-12-01
When a body approaches equilibrium, energy tends to a minimum and entropy tends to a maximum. Often, or usually, the two tendencies favour different configurations of the body. Thus energy is deterministic in the sense that it favours fixed positions for the atoms, while entropy randomizes the positions. Both may exert considerable forces in the attempt to reach their objectives. Therefore they have to compromise; indeed, under most circumstances it is the available free energy which achieves a minimum. For low temperatures that free energy is energy itself, while for high temperatures it is determined by entropy. Several examples are provided for the roles of energy and entropy as competitors: - Planetary atmospheres; - osmosis; - phase transitions in gases and liquids and in shape memory alloys, and - chemical reactions, viz. the Haber Bosch synthesis of ammonia and photosynthesis. Some historical remarks are strewn through the text to make the reader appreciate the difficulties encountered by the pioneers in understanding the subtlety of the concept of entropy, and in convincing others of the validity and relevance of their arguments.
Information entropies of many-electron systems
Energy Technology Data Exchange (ETDEWEB)
Yanez, R.J.; Angulo, J.C.; Dehesa, J.S. [Universidad de Granada (Spain)
1995-12-05
The Boltzmann-Shannon (BS) information entropy S[ρ] = −∫ ρ(r) log ρ(r) dr measures the spread or extent of the one-electron density ρ(r), which is the basic variable of the density functional theory of many-electron systems. This quantity cannot be computed analytically, even for simple quantum mechanical systems such as, e.g., the harmonic oscillator (HO) and the hydrogen atom (HA) in arbitrary excited states. Here, we first review (i) the present knowledge and open problems in the analytical determination of the BS entropies for the HO and HA systems in both position and momentum spaces, and (ii) the known rigorous lower and upper bounds to the position and momentum BS entropies of many-electron systems in terms of the radial expectation values in the corresponding space. Then, we find general inequalities which relate the BS entropies and various density functionals. Particular cases of these results are rigorous relationships between the BS entropies and some relevant density functionals (e.g., the Thomas-Fermi kinetic energy, the Dirac-Slater exchange energy, the average electron density) for finite many-electron systems. 28 refs.
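Although the BS entropy has no general closed form, it can be checked numerically for specific densities. A minimal sketch evaluates S = −∫ ρ ln ρ dx for a one-dimensional Gaussian (the ground-state density of a harmonic oscillator) and compares it with the known closed-form value ½ ln(2πeσ²); the grid size and integration range are arbitrary numerical choices, not from the paper:

```python
import math

def bs_entropy(rho, a, b, n=200000):
    """Boltzmann-Shannon entropy S = -integral rho(x) ln rho(x) dx,
    computed with the composite trapezoid rule on [a, b]."""
    h = (b - a) / n
    total = 0.0
    for i in range(n + 1):
        x = a + i * h
        r = rho(x)
        f = -r * math.log(r) if r > 0 else 0.0
        w = 0.5 if i in (0, n) else 1.0   # trapezoid endpoint weights
        total += w * f
    return total * h

# Ground-state density of a 1-D harmonic oscillator is a Gaussian,
# whose BS entropy is known in closed form: 0.5 * ln(2*pi*e*sigma^2).
sigma = 1.3
rho = lambda x: math.exp(-x * x / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
numeric = bs_entropy(rho, -12 * sigma, 12 * sigma)
exact = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
print(numeric, exact)
```

The agreement to several decimal places illustrates why the analytical difficulty discussed in the abstract concerns excited states: for Gaussian-like ground states the integral is benign.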
Relativistic entropy and related Boltzmann kinetics
Energy Technology Data Exchange (ETDEWEB)
Kaniadakis, G. [Politecnico di Torino (Italy). Dipartimento di Fisica
2009-06-15
It is well known that the particular form of the two-particle correlation function in the collisional integral of the classical Boltzmann equation uniquely fixes the entropy of the system, which turns out to be the Boltzmann-Gibbs-Shannon entropy. In the ordinary relativistic Boltzmann equation, some standard generalizations with respect to its classical version, imposed by special relativity, are customarily performed. The only ingredient of the equation that tacitly remains in its original classical form is the two-particle correlation function, and this fact imposes that the relativistic kinetics is also governed by the Boltzmann-Gibbs-Shannon entropy. Indeed, the ordinary relativistic Boltzmann equation admits the exponential Jüttner distribution as its stationary stable distribution. Here, we show that the laws of special relativity and the maximum entropy principle suggest a relativistic generalization also of the two-particle correlation function and hence of the entropy. The resulting, fully relativistic Boltzmann equation obeys the H-theorem and predicts a stationary stable distribution presenting power-law tails in the high-energy region. The ensuing relativistic kinetic theory preserves the main features of classical kinetics, which it recovers in the c → ∞ limit. (orig.)
Differential effects of gender on entropy perception
Satcharoen, Kleddao
2017-12-01
The purpose of this research is to examine differences in perception of entropy (color intensity) between male and female computer users. The objectives include identifying gender-based differences in entropy perception and exploring the potential effects of these differences (if any) on user interface design. The research is an effort to contribute to an emerging field of interest in gender as it relates to science, engineering and technology (SET), particularly user interface design. Currently, there is limited evidence on the role of gender in user interface design and in the use of technology generally, with most efforts at gender-differentiated or customized design based on stereotypes and assumptions about female use of technology, or on a default position based on male preferences. Image entropy was selected as a characteristic where gender could be a factor in perception because of known differences in color perception acuity between male and female individuals, even where there is no known color perception abnormality (which is more common in males). Although the literature review suggested that training could offset differences in color perception and identification, tests in untrained subject groups routinely show that females are better able to identify, match, and differentiate colors, and that there is a stronger emotional and psychosocial association of color for females. Since image entropy is associated with information content and image salience, the ability to identify areas of high entropy could make a difference in user perception and technological capabilities.
Capell, Santiago; Comas, Pere; Piella, Teresa; Rigau, Joaquim; Pruna, Xavier; Martínez, Francesc; Montull, Santiago
2004-09-04
To analyze the applicability of an out-patient Quick and Early Diagnostic Unit (QEDU) to evaluate patients with a potentially life-threatening disorder on an out-patient basis. We prospectively analyzed all patients attended in the unit over five years (1997-2001). We compared patients with lung cancer and colorectal cancer admitted to hospital for conventional study versus patients studied at the unit. We attended 2,748 patients in total. The main reasons for consultation were abdominal pain, asthenia-anorexia, neurologic symptoms, anemia and palpable tumors. The most frequent diagnostic categories corresponded to gastroenterological and neoplastic diseases. The mean (standard error) interval for the first visit was 4.9 (3.4) days, and for diagnosis it was 5.7 (6.5) days. Some 95% of patients reported a high degree of satisfaction on the questionnaire. In patients with colon cancer studied at the QEDU, we observed a statistically significant reduction in the average interval to diagnosis (p = 0.03). The overall costs to final diagnosis were also lower for the QEDU model. The QEDU represents an alternative to hospital admission for diagnostic workups that is fully feasible in our setting; it can achieve the same efficacy with higher efficiency than hospital admission.
Maximum-Entropy Method for Evaluating the Slope Stability of Earth Dams
Directory of Open Access Journals (Sweden)
Shuai Wang
2012-10-01
Full Text Available Slope stability is a very important problem in geotechnical engineering. This paper presents an approach to slope reliability analysis based on the maximum-entropy method. The key idea is to apply the maximum entropy principle to estimating the probability density function. The performance function is formulated by the Simplified Bishop's method to estimate the slope failure probability. The maximum-entropy method is used to estimate the probability density function (PDF) of the performance function subject to moment constraints. A numerical example is calculated and compared to Monte Carlo simulation (MCS) and the Advanced First Order Second Moment Method (AFOSM). The results show the accuracy and efficiency of the proposed method, which should be valuable for performing probabilistic analyses.
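The core step the abstract describes, fitting a maximum-entropy distribution subject to moment constraints, can be sketched in a discrete setting. With a single mean constraint, the maximizer has the Gibbs form p_i ∝ exp(−λx_i), and λ can be found by bisection. The support, target mean, and bracket below are illustrative assumptions, not values from the paper:

```python
import math

def maxent_dist(values, target_mean, tol=1e-10):
    """Maximum-entropy distribution over `values` with a prescribed mean.
    The solution has Gibbs form p_i proportional to exp(-lam * x_i);
    lam is found by bisection so the mean matches `target_mean`."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    lo, hi = -50.0, 50.0          # mean_for(lam) decreases as lam grows
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

support = [0, 1, 2, 3, 4, 5]
p = maxent_dist(support, 1.5)
mean = sum(x * pi for x, pi in zip(support, p))
print(mean)   # matches the target mean by construction
```

With more moment constraints (as in the paper's PDF of the performance function) the same idea applies, but several multipliers must be solved jointly, typically with Newton-type iterations rather than one-dimensional bisection.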
Medical Pattern Recognition: Applying an Improved Intuitionistic Fuzzy Cross-Entropy Approach
Directory of Open Access Journals (Sweden)
Kuo-Chen Hung
2012-01-01
Full Text Available One of the toughest challenges in medical diagnosis is the handling of uncertainty. Since the symptoms in medical diagnosis are uncertain, they are assumed to have an intuitive nature. Thus, to obtain the uncertain optimism degree of the doctor, fuzzy linguistic quantifiers are used. The aim of this article is to provide an improved nonprobabilistic entropy approach to support doctors in preliminary diagnosis. The proposed entropy measure is based on intuitionistic fuzzy sets and extra information regarding the hesitation degree, and an intuitive and mathematical connection between the notions of entropy in terms of fuzziness and intuitionism is revealed. An illustrative example of medical pattern recognition demonstrates the usefulness of this study. Furthermore, in order to make computing and ranking results easier and to increase productivity, a computer-based interface system has been developed to support doctors in making more efficient judgments.
Stepanov, A. V.
2015-11-01
The activation process for unimolecular reactions has been considered by means of radiation theory. Formulae for the information entropy of activation have been derived for the Boltzmann-Arrhenius model and the activation process model (APM). The physical meaning of this entropy has been determined: it is a measure of the conversion of thermal radiation energy into the mechanical energy that moves atoms in a molecule during an elementary activation act, and also a measure of the uncertainty of this energy conversion. The uncertainty is due to the unevenness of the distribution function representing the activation process. It has been shown that the Arrhenius dependence is caused by the entropy change. An efficiency comparison of the two models for low-temperature fluctuations of the myoglobin molecule structure shows that the APM should be favored over the Boltzmann-Arrhenius model.
Towards an Entropy Stable Spectral Element Framework for Computational Fluid Dynamics
Carpenter, Mark H.
2016-01-04
Nonlinearly stable finite element methods of arbitrary type and order are currently unavailable for discretizations of the compressible Navier-Stokes equations. Summation-by-parts (SBP) entropy stability analysis provides a means of constructing nonlinearly stable discrete operators of arbitrary order, but is currently limited to simple element types. Herein, recent progress is reported on developing entropy-stable (SS) discontinuous spectral collocation formulations for hexahedral elements. Two complementary efforts are discussed. The first effort generalizes previous SS spectral collocation work to extend the applicable set of points from tensor-product Legendre-Gauss-Lobatto (LGL) to tensor-product Legendre-Gauss (LG) points. The LG and LGL point formulations are compared on a series of test problems. Both the LGL and LG operators are of comparable efficiency and robustness, as is demonstrated using test problems for which conventional FEM techniques suffer instability. The second effort extends previous work on entropy stability to include p-refinement at nonconforming interfaces. A generalization of existing entropy stability theory is required to accommodate the nuances of fully multidimensional SBP operators. The entropy stability of the compressible Euler equations on nonconforming interfaces is demonstrated using the newly developed LG operators and multidimensional interface interpolation operators. Preliminary studies suggest design-order accuracy at nonconforming interfaces.
Entropy-Based Registration of Point Clouds Using Terrestrial Laser Scanning and Smartphone GPS
Directory of Open Access Journals (Sweden)
Maolin Chen
2017-01-01
Full Text Available Automatic registration of terrestrial laser scanning point clouds is a crucial but unresolved topic that is of great interest in many domains. This study combines a terrestrial laser scanner with a smartphone for the coarse registration of leveled point clouds with small roll and pitch angles and height differences, which is a novel sensor combination mode for terrestrial laser scanning. The approximate distance between two neighboring scan positions is first calculated from smartphone GPS coordinates. Then, 2D distribution entropy is used to measure the distribution coherence between the two scans and search for the optimal initial transformation parameters. To this end, we propose a method called Iterative Minimum Entropy (IME) to correct initial transformation parameters based on two criteria: the difference between the average and minimum entropy, and the deviation from the minimum entropy to the expected entropy. Finally, the presented method is evaluated using two data sets that contain tens of millions of points from panoramic and non-panoramic, vegetation-dominated and building-dominated scenes, and achieves high accuracy and efficiency.
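The minimum-entropy idea behind the registration can be illustrated in miniature: when two scans of the same scene are correctly aligned, their merged 2D occupancy distribution is compact (low entropy); a wrong offset spreads points over more cells and raises the entropy. The grid cell size, the translation-only search, and the synthetic scans below are simplifying assumptions; the paper's IME method additionally uses GPS-seeded initialization and iterative correction:

```python
import math
import random

def grid_entropy(points, cell=1.0):
    """Shannon entropy of the 2-D cell-occupancy distribution of a point set."""
    counts = {}
    for x, y in points:
        key = (math.floor(x / cell), math.floor(y / cell))
        counts[key] = counts.get(key, 0) + 1
    n = len(points)
    return -sum(c / n * math.log(c / n) for c in counts.values())

random.seed(1)
scan_a = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(500)]
true_shift = 3.0
scan_b = [(x + true_shift, y) for x, y in scan_a]   # second scan, offset along x

# Try candidate shifts: the correct one minimises the merged entropy.
best = min((grid_entropy(scan_a + [(x - s, y) for x, y in scan_b]), s)
           for s in [0.0, 1.0, 2.0, 3.0, 4.0])
print(best[1])   # recovers the true shift
```

In the real method, the search is over full planar transformation parameters and the entropy criteria mentioned in the abstract decide when the iteration has converged.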
Optimization of pressure gauge locations for water distribution systems using entropy theory.
Yoo, Do Guen; Chang, Dong Eil; Jun, Hwandon; Kim, Joong Hoon
2012-12-01
It is essential to select the optimal pressure gauge locations for effective management and maintenance of water distribution systems. This study proposes an objective, quantified standard for selecting the optimal pressure gauge location by defining the pressure change at other nodes resulting from a demand change at a specific node using entropy theory. Two cases of demand change are considered: one in which demand at all nodes shows peak load through a peak factor, and one in which demand changes follow a normal distribution whose mean is the base demand. The actual pressure change pattern is determined by using the emitter function of EPANET to reflect the pressure that changes in practice at each node. The optimal pressure gauge location is determined by prioritizing the node that processes the largest amount of information it gives to (giving entropy) and receives from (receiving entropy) the whole system according to the entropy standard. The suggested model is applied to one virtual and one real pipe network, and the optimal pressure gauge location combination is calculated by a sensitivity analysis based on the study results. These analyses support two conclusions. First, the installation priority of pressure gauges in water distribution networks can be determined with a more objective standard through entropy theory. Second, the model can be used as an efficient decision-making guide for gauge installation in water distribution systems.
Energy Technology Data Exchange (ETDEWEB)
Vittone, E., E-mail: ettore.vittone@unito.it [Department of Physics, NIS Research Centre and CNISM, University of Torino, via P. Giuria 1, 10125 Torino (Italy); Pastuovic, Z. [Centre for Accelerator Science (ANSTO), Locked bag 2001, Kirrawee DC, NSW 2234 (Australia); Breese, M.B.H. [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, Singapore 117542 (Singapore); Garcia Lopez, J. [Centro Nacional de Aceleradores (CNA), Sevilla University, J. Andalucia, CSIC, Av. Thomas A. Edison 7, 41092 Sevilla (Spain); Jaksic, M. [Department for Experimental Physics, Ruder Boškovic Institute (RBI), P.O. Box 180, 10002 Zagreb (Croatia); Raisanen, J. [Department of Physics, University of Helsinki, Helsinki 00014 (Finland); Siegele, R. [Centre for Accelerator Science (ANSTO), Locked bag 2001, Kirrawee DC, NSW 2234 (Australia); Simon, A. [International Atomic Energy Agency (IAEA), Vienna International Centre, P.O. Box 100, 1400 Vienna (Austria); Institute of Nuclear Research of the Hungarian Academy of Sciences (ATOMKI), Debrecen (Hungary); Vizkelethy, G. [Sandia National Laboratories (SNL), PO Box 5800, Albuquerque, NM (United States)
2016-04-01
Highlights: • We study the electronic degradation of semiconductors induced by ion irradiation. • The experimental protocol is based on MeV ion microbeam irradiation. • The radiation induced damage is measured by IBIC. • The general model fits the experimental data in the low level damage regime. • Key parameters relevant to the intrinsic radiation hardness are extracted. - Abstract: This paper investigates both theoretically and experimentally the charge collection efficiency (CCE) degradation in silicon diodes induced by energetic ions. Ion Beam Induced Charge (IBIC) measurements carried out on n- and p-type silicon diodes which were previously irradiated with MeV He ions show evidence that the CCE degradation does not only depend on the mass, energy and fluence of the damaging ion, but also depends on the ion probe species and on the polarization state of the device. A general one-dimensional model is derived, which accounts for the ion-induced defect distribution, the ionization profile of the probing ion and the charge induction mechanism. Using the ionizing and non-ionizing energy loss profiles resulting from simulations based on the binary collision approximation and on the electrostatic/transport parameters of the diode under study as input, the model is able to accurately reproduce the experimental CCE degradation curves without introducing any phenomenological additional term or formula. Although limited to low level of damage, the model is quite general, including the displacement damage approach as a special case and can be applied to any semiconductor device. It provides a method to measure the capture coefficients of the radiation induced recombination centres. They can be considered indexes, which can contribute to assessing the relative radiation hardness of semiconductor materials.
Mikov, A. A.; Svirin, V. N.
2008-04-01
The rapid development of quantum electronics and the advent of various types of lasers favored the formation of an independent branch of medicine, namely laser medicine. In recent years, devices based on semiconductor lasers have been introduced into medicine at a rapid pace, owing to the substantial improvement in the energy and spectral characteristics achieved in semiconductor laser development. The power of serial discrete near-IR semiconductor lasers has reached a level of 5 W and more, and the spectral range has extended to 1.7...1.8 μm. Laser-optical information technologies and devices have been developed since the 1970s and are broadly used for the treatment of oncologic diseases. Although methods such as photodynamic therapy (PDT), laser-induced thermotherapy (LITT), fluorescence diagnostics and spectrophotometry have been used for the treatment and diagnosis of oncologic diseases for more than 30 years, they are still relatively new methods and, as a rule, are used only in large scientific centers and medical institutions. This is due, first of all, to a lack of information on modern methods of cancer treatment, the absence of widely available laser procedures and corresponding devices in polyclinics and even in district hospitals, as well as insufficient understanding of the application areas where laser methods have an advantage over, for instance, radiation therapy or chemotherapy. Presented in this article are newly developed methods, results of designing equipment and software for their realization aimed at increasing the efficiency of treatment of oncologic diseases, and several clinical materials on the use of industrial models of the developed devices at medical institutions.
De Cian, Michel; Steinkamp, Olaf; Serra, Nicola
The LHCb experiment at the Large Hadron Collider is a particle physics experiment dedicated to the investigation of so-called $B$ mesons. To track long-living charged particles traversing the detector, LHCb comprises three tracking stations and a dipole magnet. The efficiency to reconstruct a long-living particle is of crucial importance for many physics analyses. A novel method to access this track reconstruction efficiency is presented and discussed in detail. Rare decays of $B$-mesons are a prospective way to search for physics beyond the Standard Model of particle physics, as new physics can enter at the same level as Standard Model physics. The decay $B^0 \\to K^{*0} \\mu^+ \\mu^-$ is an ideal laboratory for such searches, as its four-particle final state gives rise to many angular distributions which can be measured. One quantity of particular interest is the zero-crossing point of the forward-backward asymmetry $A_{\\rm{FB}}$, as it can be predicted theoretically with a small uncertainty. The first measure...
Directory of Open Access Journals (Sweden)
Gota Deguchi
2013-05-01
Full Text Available In the underground coal gasification (UCG) process, cavity growth with crack extension inside the coal seam is an important phenomenon that directly influences gasification efficiency. An efficient and environmentally friendly UCG system also relies upon precise control and evaluation of the gasification zone. This paper presents details of laboratory studies undertaken to evaluate structural changes that occur inside the coal under thermal stress and to evaluate underground coal-oxygen gasification simulated in an ex-situ reactor. The effects of feed temperature, the direction of the stratified plane, and the inherent microcracks on coal fracture and crack extension were investigated in heating experiments performed on plate-shaped and cylindrical coal specimens. To monitor the failure process and to measure the microcrack distribution inside the coal specimens before and after heating, acoustic emission (AE) analysis and X-ray CT were applied. We also introduce a laboratory-scale UCG model experiment conducted with set design and operating parameters. The temperature profiles, AE activities, product gas concentration, and gasifier weight losses were measured successively during gasification. The product gas mainly comprised combustible components such as CO, CH4, and H2 (27.5, 5.5, and 17.2 vol%, respectively), which produced a high average calorific value (9.1 MJ/m3).
Directory of Open Access Journals (Sweden)
Hussein Aly Kamel Rady
2011-11-01
Full Text Available Improving the efficiency and convergence rate of multilayer backpropagation neural network algorithms is an active area of research. Recent years have witnessed increasing attention to entropy-based criteria in adaptive systems. Several principles have been proposed based on the maximization or minimization of entropic cost functions. One use of entropy criteria in learning systems is to minimize the entropy of the error between two variables: typically, one is the output of the learning system and the other is the target. In this paper, an improvement to the efficiency and convergence rate of multilayer backpropagation (BP) neural networks is proposed. The usual mean square error (MSE) minimization principle is substituted by minimization of the Shannon entropy (SE) of the differences between the multilayer perceptron's output and the desired target. These two cost functions are studied, analyzed and tested with two different activation functions, namely the Cauchy and hyperbolic tangent activation functions. The comparison indicates that the degree of convergence using the Shannon entropy cost function is higher than its counterpart using MSE, and that MSE speeds up convergence compared with Shannon entropy.
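The distinction between the two cost functions can be made concrete: MSE measures the average squared error, while an entropy criterion measures how dispersed the error distribution is. A minimal sketch using a histogram plug-in estimate of the Shannon entropy of the errors (the bin layout and toy error vectors are illustrative assumptions; papers in this area often use kernel estimators instead):

```python
import math

def mse(errors):
    """Mean square error of an output-minus-target error vector."""
    return sum(e * e for e in errors) / len(errors)

def error_entropy(errors, bins=10, lo=-1.0, hi=1.0):
    """Shannon entropy of the histogram of output-target differences.
    Minimising this concentrates the error distribution, rather than
    merely shrinking its average square."""
    counts = [0] * bins
    for e in errors:
        i = min(bins - 1, max(0, int((e - lo) / (hi - lo) * bins)))
        counts[i] += 1
    n = len(errors)
    return -sum(c / n * math.log(c / n) for c in counts if c > 0)

# Identical errors give zero entropy; scattered errors give high entropy,
# even though both vectors have nonzero MSE.
concentrated = [0.5] * 8
spread = [-0.9, -0.7, -0.5, -0.3, 0.3, 0.5, 0.7, 0.9]
print(mse(concentrated), mse(spread))
print(error_entropy(concentrated), error_entropy(spread))
```

In entropy-based training, the gradient of such an estimate (smoothed, so it is differentiable) replaces the MSE gradient in the weight update.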
Differential network entropy reveals cancer system hallmarks
West, James; Bianconi, Ginestra; Severini, Simone; Teschendorff, Andrew E.
2012-01-01
The cellular phenotype is described by a complex network of molecular interactions. Elucidating network properties that distinguish disease from the healthy cellular state is therefore of critical importance for gaining systems-level insights into disease mechanisms and ultimately for developing improved therapies. By integrating gene expression data with a protein interaction network we here demonstrate that cancer cells are characterised by an increase in network entropy. In addition, we formally demonstrate that gene expression differences between normal and cancer tissue are anticorrelated with local network entropy changes, thus providing a systemic link between gene expression changes at the nodes and their local correlation patterns. In particular, we find that genes which drive cell-proliferation in cancer cells and which often encode oncogenes are associated with reductions in network entropy. These findings may have potential implications for identifying novel drug targets. PMID:23150773
Transfer entropy and transient limits of computation.
Prokopenko, Mikhail; Lizier, Joseph T
2014-06-23
Transfer entropy is a recently introduced information-theoretic measure quantifying directed statistical coherence between spatiotemporal processes, and is widely used in diverse fields ranging from finance to neuroscience. However, its relationships to fundamental limits of computation, such as Landauer's limit, remain unknown. Here we show that in order to increase transfer entropy (predictability) by one bit, heat flow must match or exceed Landauer's limit. Importantly, we generalise Landauer's limit to bi-directional information dynamics for non-equilibrium processes, revealing that the limit applies to prediction, in addition to retrodiction (information erasure). Furthermore, the results are related to negentropy, and to Bremermann's limit and the Bekenstein bound, producing, perhaps surprisingly, lower bounds on the computational deceleration and information loss incurred during an increase in predictability about the process. The identified relationships set new computational limits in terms of fundamental physical quantities, and establish transfer entropy as a central measure connecting information theory, thermodynamics and theory of computation.
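A minimal plug-in estimator with history length 1 (the estimator, toy series and names are illustrative assumptions, not the paper's formalism) makes the "one extra bit of predictability" statement concrete, alongside the Landauer price of that bit at room temperature:

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE_{Y->X} in bits with history length 1:
    TE = H(X'|X) - H(X'|X,Y), estimated from cyclic triples (x_t, y_t, x_{t+1})."""
    n = len(x)
    triples = Counter((x[t], y[t], x[(t + 1) % n]) for t in range(n))
    pairs_xy = Counter((x[t], y[t]) for t in range(n))
    pairs_xx = Counter((x[t], x[(t + 1) % n]) for t in range(n))
    singles = Counter(x)
    return sum((k / n) * math.log2((k / pairs_xy[a, b]) * (singles[a] / pairs_xx[a, c]))
               for (a, b, c), k in triples.items())

# X copies Y with a one-step delay, so Y contributes exactly one bit of
# predictability about X's next state.
y = [0, 0, 1, 1] * 25
x = [y[t - 1] for t in range(len(y))]
te = transfer_entropy(x, y)   # 1.0 bit

# Landauer's limit prices that bit in heat at temperature T:
k_B, T = 1.380649e-23, 300.0
landauer_joules = k_B * T * math.log(2)   # minimum heat per bit, ~2.9e-21 J at 300 K
```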
Large Field Inflation and Gravitational Entropy
DEFF Research Database (Denmark)
Kaloper, Nemanja; Kleban, Matthew; Lawrence, Albion
2016-01-01
Large field inflation can be sensitive to perturbative and nonperturbative quantum corrections that spoil slow roll. A large number $N$ of light species in the theory, which occur in many string constructions, can amplify these problems. One might even worry that in a de Sitter background, light...... species will lead to a violation of the covariant entropy bound at large $N$. If so, requiring the validity of the covariant entropy bound could limit the number of light species and their couplings, which in turn could severely constrain axion-driven inflation. Here we show that there is no such problem...... in this light, and show that they are perfectly consistent with the covariant entropy bound. Thus, while quantum gravity might yet spoil large field inflation, holographic considerations in the semiclassical theory do not obstruct it....
Clausius versus Sackur-Tetrode entropies
Oikonomou, Thomas; Baris Bagci, G.
2013-05-01
Based on the property of extensivity (mathematically, homogeneity of first degree), we derive in a mathematically consistent manner the explicit expressions of the chemical potential μ and the Clausius entropy S for the case of monoatomic ideal gases in open systems within phenomenological thermodynamics. Neither information theoretic nor quantum mechanical statistical concepts are invoked in this derivation. Considering a specific expression of the constant term of S, the derived entropy coincides with the Sackur-Tetrode entropy in the thermodynamic limit. We demonstrate, however, that the former limit is not contained in the classical thermodynamic relations, implying that the usual resolutions of Gibbs paradox do not succeed in bridging the gap between the thermodynamics and statistical mechanics. We finally consider the volume of the phase space as an entropic measure, albeit, without invoking the thermodynamic limit to investigate its relation to the thermodynamic equation of state and observables.
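The Sackur-Tetrode expression at issue here can be checked numerically (a standard textbook evaluation, not taken from the paper; helium at 300 K and 1 atm is an illustrative choice):

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34      # Planck constant, J s
N_A = 6.02214076e23     # Avogadro constant, 1/mol

def sackur_tetrode_molar(m, T, p):
    """Molar entropy of a monoatomic ideal gas,
    S/n = R [ln(v / lambda^3) + 5/2], with v the volume per particle
    and lambda the thermal de Broglie wavelength."""
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)
    v = k_B * T / p
    return N_A * k_B * (math.log(v / lam ** 3) + 2.5)

# Helium at 300 K and 1 atm: about 126 J/(mol K), in line with the measured
# standard entropy of helium gas.
S_He = sackur_tetrode_molar(4.002602 * 1.66053907e-27, 300.0, 101325.0)
```

Note that Planck's constant enters only through the additive constant of S, which is exactly the term the abstract argues cannot be fixed by classical thermodynamics alone.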
Anyonic entanglement and topological entanglement entropy
Bonderson, Parsa; Knapp, Christina; Patel, Kaushal
2017-10-01
We study the properties of entanglement in two-dimensional topologically ordered phases of matter. Such phases support anyons, quasiparticles with exotic exchange statistics. The emergent nonlocal state spaces of anyonic systems admit a particular form of entanglement that does not exist in conventional quantum mechanical systems. We study this entanglement by adapting standard notions of entropy to anyonic systems. We use the algebraic theory of anyon models (modular tensor categories) to illustrate the nonlocal entanglement structure of anyonic systems. Using this formalism, we present a general method of deriving the universal topological contributions to the entanglement entropy for general system configurations of a topological phase, including surfaces of arbitrary genus, punctures, and quasiparticle content. We analyze a number of examples in detail. Our results recover and extend prior results for anyonic entanglement and the topological entanglement entropy.
An entropy model for artificial grammar learning
Directory of Open Access Journals (Sweden)
Emmanuel Pothos
2010-06-01
Full Text Available A model is proposed to characterize the type of knowledge acquired in Artificial Grammar Learning (AGL). In particular, Shannon entropy is employed to compute the complexity of different test items in an AGL task, relative to the training items. According to this model, the more predictable a test item is from the training items, the more likely it is that this item should be selected as compatible with the training items. The predictions of the entropy model are explored in relation to the results from several previous AGL datasets and compared to other AGL measures. This approach to AGL resonates well with similar models in categorization and reasoning, which also postulate that cognitive processing is geared towards the reduction of entropy.
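One plausible concrete reading of "predictability from the training items" (the bigram statistics and add-one smoothing below are assumptions for illustration, not the paper's exact measure) scores a test item by its average surprisal under the training set:

```python
import math
from collections import Counter

def bigrams(s):
    return [s[i:i + 2] for i in range(len(s) - 1)]

def item_surprisal(test_item, training_items):
    """Average surprisal (bits per bigram) of a test string under
    add-one-smoothed bigram frequencies of the training strings.
    Lower surprisal = more predictable from training, hence (on the
    entropy model) more likely to be endorsed as grammatical."""
    counts = Counter(b for item in training_items for b in bigrams(item))
    total, vocab = sum(counts.values()), len(counts) + 1
    probs = [(counts[b] + 1) / (total + vocab) for b in bigrams(test_item)]
    return -sum(math.log2(p) for p in probs) / len(probs)

# An item built from trained bigrams is less surprising than a novel one.
training = ["MVXR", "MVRXM", "MXRVM"]
assert item_surprisal("MVXR", training) < item_surprisal("QZQZ", training)
```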
Multipartite analysis of average-subsystem entropies
Alonso-Serrano, Ana; Visser, Matt
2017-11-01
So-called average subsystem entropies are defined by first taking partial traces over some pure state to define density matrices, then calculating the subsystem entropies, and finally averaging over the pure states to define the average subsystem entropies. These quantities are standard tools in quantum information theory, most typically applied in bipartite systems. We shall first present some extensions to the usual bipartite analysis (including a calculation of the average tangle and a bound on the average concurrence), follow this with some useful results for tripartite systems, and finally extend the discussion to arbitrary multipartite systems. A particularly nice feature of tripartite and multipartite analyses is that this framework allows one to introduce an "environment" to which small subsystems can couple.
Entropy Concept for Paramacrosystems with Complex States
Directory of Open Access Journals (Sweden)
Yuri S. Popkov
2012-05-01
Full Text Available Consideration is given to macrosystems, called paramacrosystems, with states of finite capacity and distinguishable and undistinguishable elements with stochastic behavior. Paramacrosystems fill a gap between Fermi and Einstein macrosystems. Using the method of generating functions, we obtain expressions for the probabilistic characteristics (distribution of the macrostate probabilities, physical and information entropies) of the paramacrosystems. The cases with equal and unequal prior probabilities for elements to occupy the states with finite capacities are considered. Unequal prior probabilities influence the morphological properties of the entropy functions and the macrostate probability functions, transforming them into multimodal functions. Examples of paramacrosystems with bimodal entropy functions and macrostate probability distributions are presented. The variational principle does not work in such cases.
Zipf's law, power laws and maximum entropy
Visser, Matt
2013-04-01
Zipf's law, and power laws in general, have attracted and continue to attract considerable attention in a wide variety of disciplines—from astronomy to demographics to software structure to economics to linguistics to zoology, and even warfare. A recent model of random group formation (RGF) attempts a general explanation of such phenomena based on Jaynes' notion of maximum entropy applied to a particular choice of cost function. In the present paper I argue that the specific cost function used in the RGF model is in fact unnecessarily complicated, and that power laws can be obtained in a much simpler way by applying maximum entropy ideas directly to the Shannon entropy subject only to a single constraint: that the average of the logarithm of the observable quantity is specified.
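The constrained maximization described in the abstract can be written out explicitly; maximizing the Shannon entropy subject only to normalization and a fixed average logarithm forces a power law:

```latex
\max_{\{p_k\}} \; H = -\sum_k p_k \ln p_k
\quad \text{subject to} \quad
\sum_k p_k = 1, \qquad \sum_k p_k \ln k = \chi .
```

Stationarity of the Lagrangian $-\sum_k p_k \ln p_k - \mu\bigl(\sum_k p_k - 1\bigr) - \lambda\bigl(\sum_k p_k \ln k - \chi\bigr)$ with respect to each $p_k$ gives $-\ln p_k - 1 - \mu - \lambda \ln k = 0$, hence $p_k = e^{-(1+\mu)}\, k^{-\lambda} \propto k^{-\lambda}$: a pure power law, with Zipf's law corresponding to $\lambda \approx 1$.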
Entanglement entropy of critical spin liquids.
Zhang, Yi; Grover, Tarun; Vishwanath, Ashvin
2011-08-05
Quantum spin liquids are phases of matter whose internal structure is not captured by a local order parameter. Particularly intriguing are critical spin liquids, where strongly interacting excitations control low energy properties. Here we calculate their bipartite entanglement entropy that characterizes their quantum structure. In particular we calculate the Renyi entropy S(2) on model wave functions obtained by Gutzwiller projection of a Fermi sea. Although the wave functions are not sign positive, S(2) can be calculated on relatively large systems (>324 spins) using the variational Monte Carlo technique. On the triangular lattice we find that entanglement entropy of the projected Fermi sea state violates the boundary law, with S(2) enhanced by a logarithmic factor. This is an unusual result for a bosonic wave function reflecting the presence of emergent fermions. These techniques can be extended to study a wide class of other phases.
Spatial-dependence recurrence sample entropy
Pham, Tuan D.; Yan, Hong
2018-03-01
Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines, where the analysis of physiological signals is perhaps the most widely reported in the literature. Sample entropy is a popular measure for quantifying signal irregularity. However, sample entropy does not take sequential information, which is inherently useful, into its calculation of sample similarity. Here, we develop a method that is based on the mathematical principle of the sample entropy and enables the capture of sequential information of a time series in the context of spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
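The sample-entropy baseline that the proposed method builds on can be sketched as follows (a common plug-in formulation using N − m templates at both lengths; conventions vary slightly across implementations):

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B): B counts pairs of length-m templates within
    Chebyshev tolerance r, A counts the same at length m+1; self-matches
    are excluded. Uses N - m templates at both lengths."""
    def matches(length):
        templates = [series[i:i + length] for i in range(len(series) - m)]
        return sum(1
                   for i in range(len(templates))
                   for j in range(i + 1, len(templates))
                   if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r)
    b, a = matches(m), matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)

# Perfectly regular series have zero sample entropy: every length-m match
# extends to a length-(m+1) match.
assert sample_entropy([1.0] * 30) == 0.0
assert sample_entropy([0.0, 1.0] * 15) == 0.0
```

The method in the abstract replaces the plain template-matching step with counts drawn from the co-occurrence structure of a recurrence plot, thereby retaining sequential information that this baseline discards.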
Braneworld black holes and entropy bounds
Directory of Open Access Journals (Sweden)
Y. Heydarzade
2018-01-01
Full Text Available Bousso's D-bound entropy is checked for the various possible black hole solutions on a 4-dimensional brane. It is found that the D-bound entropy here differs from that obtained for 4-dimensional black hole solutions. This difference is interpreted as the extra loss of information, associated with the extra dimension, when an extra-dimensional black hole is moved outside the observer's cosmological horizon. It is also shown that the N-bound entropy holds for the solutions considered here. Finally, by adopting the recent Bohr-like approach to black hole quantum physics for excited black holes, the results are also written in terms of the black hole excited states.
Braneworld black holes and entropy bounds
Heydarzade, Y.; Hadi, H.; Corda, C.; Darabi, F.
2018-01-01
Entropy Bounds, Holographic Principle and Uncertainty Relation
Directory of Open Access Journals (Sweden)
I. V. Volovich
2001-06-01
Full Text Available A simple derivation of the bound on entropy is given and the holographic principle is discussed. We estimate the number of quantum states inside a space region on the basis of the uncertainty relation. The result is compared with the Bekenstein formula for the entropy bound, which was initially derived from the generalized second law of thermodynamics for black holes. The holographic principle states that the entropy inside a region is bounded by the area of the boundary of that region. This principle can be called the kinematical holographic principle. We argue that it can be derived from the dynamical holographic principle, which states that the dynamics of a system in a region should be described by a system living on the boundary of the region. This last principle can be valid in general relativity because the ADM Hamiltonian reduces to a surface term.
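The Bekenstein bound discussed above is easy to evaluate numerically (a textbook back-of-envelope calculation, with the classic "1 kg in a 1 m sphere" as the example; not taken from the paper):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s

def bekenstein_bound_bits(radius, energy):
    """Bekenstein's bound on the information contained in a sphere of
    given radius and total energy:  I <= 2*pi*R*E / (hbar * c * ln 2) bits."""
    return 2 * math.pi * radius * energy / (hbar * c * math.log(2))

# One kilogram of matter (E = m c^2) inside a one-metre sphere:
# roughly 2.6e43 bits, the classic back-of-envelope figure.
bits = bekenstein_bound_bits(1.0, 1.0 * c ** 2)
```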
Maximum Relative Entropy of Coherence: An Operational Coherence Measure
Bu, Kaifeng; Singh, Uttam; Fei, Shao-Ming; Pati, Arun Kumar; Wu, Junde
2017-10-01
The operational characterization of quantum coherence is the cornerstone in the development of the resource theory of coherence. We introduce a new coherence quantifier based on maximum relative entropy. We prove that the maximum relative entropy of coherence is directly related to the maximum overlap with maximally coherent states under a particular class of operations, which provides an operational interpretation of the maximum relative entropy of coherence. Moreover, we show that, for any coherent state, there are examples of subchannel discrimination problems such that this coherent state allows for a higher probability of successfully discriminating subchannels than that of all incoherent states. This advantage of coherent states in subchannel discrimination can be exactly characterized by the maximum relative entropy of coherence. By introducing a suitable smooth maximum relative entropy of coherence, we prove that the smooth maximum relative entropy of coherence provides a lower bound of one-shot coherence cost, and the maximum relative entropy of coherence is equivalent to the relative entropy of coherence in the asymptotic limit. Similar to the maximum relative entropy of coherence, the minimum relative entropy of coherence has also been investigated. We show that the minimum relative entropy of coherence provides an upper bound of one-shot coherence distillation, and in the asymptotic limit the minimum relative entropy of coherence is equivalent to the relative entropy of coherence.
CHANTI: a fast and efficient charged particle veto detector for the NA62 experiment at CERN
Mirra, Marco
This work has been performed within the framework of the NA62 experiment at CERN, which aims at measuring the branching ratio of the ultra-rare kaon decay K+→π+ nu nubar with 10% uncertainty (using an unseparated kaon beam of 75 GeV/c) in order to test the Standard Model (SM), to look for physics beyond the SM and to measure the |Vtd| element of the Cabibbo-Kobayashi-Maskawa (CKM) flavor mixing matrix. Backgrounds, which are up to 10^10 times higher than the signal, will be suppressed by an accurate measurement of the momentum of the K+ (with a silicon beam tracker named GigaTracker) and the π+ (with a straw tracker) and by a complex system of particle identification and veto detectors. A critical background can be induced by inelastic interactions of the hadron beam with the GigaTracker. Pions produced in these interactions, emitted at low angle, can reach the straw tracker and mimic a kaon decay in the fiducial region, if no other track is detected. In order to suppress this background a CHarged track ANTIcounter ...
Entropy Generation and Human Aging: Lifespan Entropy and Effect of Physical Activity Level
Silva, Carlos; Annamalai, Kalyan
2008-06-01
The first and second laws of thermodynamics were applied to biochemical reactions typical of human metabolism. An open-system model was used for the human body. Energy conservation, availability and entropy balances were performed to obtain the entropy generated for the main food components. Quantitative results for entropy generation were obtained as a function of age using the databases from the U.S. Food and Nutrition Board (FNB) and Centers for Disease Control and Prevention (CDC), which provide energy requirements and food intake composition as a function of age, weight and stature. Numerical integration was performed through the human lifespan for different levels of physical activity. Results were presented and analyzed. Entropy generated over the lifespan of average individuals (natural death) was found to be 11,404 kJ/K per kg of body mass, with a rate of generation three times higher in infants than in the elderly. The entropy generated predicts a life span of 73.78 and 81.61 years for the average U.S. male and female individuals respectively, which are values that closely match the average lifespan from statistics (74.63 and 80.36 years). From the analysis of the effect of different activity levels, it is shown that entropy generated increases with physical activity, suggesting that exercise should be kept to a “healthy minimum” if entropy generation is to be minimized.
An Instructive Model of Entropy
Zimmerman, Seth
2010-01-01
This article first notes the misinterpretation of a common thought experiment, and the misleading comment that "systems tend to flow from less probable to more probable macrostates". It analyses the experiment, generalizes it and introduces a new tool of investigation, the simplectic structure. A time-symmetric model is built upon this structure,…
Perucca, Eliana; Molnar, Peter; Francis, Robert; Perona, Paolo
2010-05-01
The establishment and evolution of vegetation in organized patterns on river bar alluvial sediment are active morphodynamic processes governed by flow hydrology. Although this is well documented in the literature, how the time scales between the arrival of flow disturbances and vegetation growth interact to determine the survival of certain vegetation characteristics is still unclear. We began to explore such dynamics within the research project RIVERINE (RIver - VEgetation interactions and Reproduction of Island Nuclei formation and Evolution), funded by the Hydralab III European Framework Programme. Laboratory experiments are important tools that allow comprehensive observations of the feedbacks between flow, channel morphology and vegetation, which are otherwise difficult to observe at the real scale. In this work we present the results of novel flume experiments that we carried out at the Total Environment Simulator (TES) of the University of Hull, United Kingdom. Starting from an initial levelled slope of 1%, the channel was seeded with Avena sativa at a uniform density. A number of days after seeding, the flume was flooded daily for 4 days with a flood disturbance which lasted 15 minutes. Different flood magnitudes, times between seeding and the first disturbance, and flume geometries (i.e., parallel and convergent walls) were investigated. After each disturbance the eroded material (sediment, seeds and plants) was collected at the channel bottom and corresponding statistics for main root length, number of roots and stem height were calculated. At the same time, random samples of non-eroded plants were taken from the flume and the same statistics were computed for a control run. After every flood disturbance, the channel bed surface was measured with a laser scanner and photographed. Since flooding frequencies were comparable with the plant root germination and growth time scales, vegetation and flood disturbances were in direct competition. Results
Hjelmfelt, Allen; Harding, Robert H.; Tsujimoto, Kim K.; Ross, John
1990-03-01
Periodic perturbations are applied to the input fluxes of reactants in a system which exhibits autonomous oscillations, the combustion of acetaldehyde (ACH) and oxygen, and a system which exhibits damped oscillations, the combustion of methane and oxygen. The ACH system is studied by experiments and numerical analysis and the methane system is studied by numerical analysis. The periodic perturbations are in the form of a two-term Fourier series. Such perturbations may generate multiple attractors, which are either periodic or chaotic. We discuss two types of bistable responses: a new phase bistability, in which a subharmonic frequency is added to a sinusoidal perturbation at different phases relative to the periodic response; and jump phenomena, in which the resonant frequency of a nonlinear oscillator depends on the amplitude of the periodic perturbation. Both the ACH and the methane systems confirm the phase bistability. The additional complex behavior of bistability due to jump phenomena is seen only in calculations in the methane system. In both types of bistability a hysteresis loop is formed as we vary the form of the periodic perturbation. In the methane system, we find period doubling to chaos occurring on one branch of the hysteresis loop while the other branch remains periodic. The methane system has been studied in the context of the efficiency of power production. We calculate the efficiency corresponding to each bistable attractor and find one branch of each pair to be the more efficient mode of operation. In the case of the coexisting periodic and chaotic attractors the chaotic attractor is the more efficient mode of operation.
Maximum Entropy Learning with Deep Belief Networks
Directory of Open Access Journals (Sweden)
Payton Lin
2016-07-01
Full Text Available Conventionally, the maximum likelihood (ML) criterion is applied to train a deep belief network (DBN). We present a maximum entropy (ME) learning algorithm for DBNs, designed specifically to handle limited training data. Maximizing only the entropy of parameters in the DBN allows more effective generalization capability, less bias towards data distributions, and robustness to over-fitting compared to ML learning. Results of text classification and object recognition tasks demonstrate that an ME-trained DBN outperforms an ML-trained DBN when training data is limited.
Entropy Inequality Violations from Ultraspinning Black Holes.
Hennigar, Robie A; Mann, Robert B; Kubizňák, David
2015-07-17
We construct a new class of rotating anti-de Sitter (AdS) black hole solutions with noncompact event horizons of finite area in any dimension and study their thermodynamics. In four dimensions these black holes are solutions to gauged supergravity. We find that their entropy exceeds the maximum implied by the conjectured reverse isoperimetric inequality, which states that for a given thermodynamic volume, the black hole entropy is maximized for Schwarzschild-AdS space. We use this result to suggest more stringent conditions under which this conjecture may hold.
Holographic entanglement entropy on generic time slices
Kusuki, Yuya; Takayanagi, Tadashi; Umemoto, Koji
2017-06-01
We study the holographic entanglement entropy and mutual information for Lorentz boosted subsystems. In holographic CFTs at zero and finite temperature, we find that the mutual information diverges in a universal way when the end points of two subsystems are light-like separated. In Lifshitz and hyperscaling violating geometries dual to non-relativistic theories, we show that the holographic entanglement entropy is not well-defined for Lorentz boosted subsystems in general. This strongly suggests that in non-relativistic theories, we cannot make a real-space factorization of the Hilbert space on a generic time slice other than the constant time slice, as opposed to relativistic field theories.
Biological Aging and Life Span Based on Entropy Stress via Organ and Mitochondrial Metabolic Loading
Directory of Open Access Journals (Sweden)
Kalyan Annamalai
2017-10-01
Full Text Available The energy for sustaining life is released through the oxidation of glucose, fats, and proteins. A part of the energy released within each cell is stored as chemical energy of Adenosine Tri-Phosphate molecules, which is essential for performing life-sustaining functions, while the remainder is released as heat in order to maintain the isothermal state of the body. Earlier literature introduced availability concepts from thermodynamics, related the specific irreversibility and entropy generation rates to the metabolic efficiency and energy release rate of organ k, computed the specific entropy generation rate of the whole body at any given age as a sum of the entropy generation within four vital organs, Brain, Heart, Kidney, Liver (BHKL), with a 5th organ being the rest of the organs (R5), and estimated the life span using an upper limit on the lifetime entropy generated per unit mass of body, σM,life. The organ entropy stress, expressed in terms of the lifetime specific entropy generated per unit mass of body organs (kJ/(K kg of organ k)), was used to rank organs; the heart ranked highest while the liver ranked lowest. The present work includes the effects of (1) two additional organs, adipose tissue (AT) and skeletal muscles (SM), which are of importance to athletes; (2) the proportions of nutrients oxidized, which affect blood temperature and metabolic efficiencies; (3) conversion of the entropy stress from the organ/cellular level to the mitochondrial level; and (4) use of these parameters as metabolism-based biomarkers for quantifying the biological aging process in reaching the limit of σM,life. Based on the 7-organ model and Elia constants for organ metabolic rates for a male of 84 kg steady mass, and using basic and derived allometric constants of organs, the lifetime energy expenditure is estimated to be 2725 MJ/kg body mass while the lifetime entropy generated is 6050 kJ/(K kg body mass), with contributions of 190; 1835.0; 610; 290; 700; 1470 and 95 kJ/K contributed by AT-BHKL-SM-R7 to 1 kg body
Directory of Open Access Journals (Sweden)
Yuji Ohya
2016-12-01
Full Text Available A new type of solar tower was developed through laboratory experiments and numerical analyses. The solar tower mainly consists of three components. The transparent collector area is an aboveground glass roof, with increasing height toward the center. Attached to the center of the inside of the collector is a vertical tower within which a wind turbine is mounted at the lower entry to the tower. When solar radiation heats the ground through the glass roof, ascending warm air is guided to the center and into the tower. A solar tower that can generate electricity using a simple structure that enables easy and less costly maintenance has considerable advantages. However, conversion efficiency from sunshine energy to mechanical turbine energy is very low. Aiming to improve this efficiency, the research project developed a diffuser-type tower instead of a cylindrical tower, and investigated a suitable diffuser shape for practical use. After changing the tower height and diffuser open angle, with a temperature difference between the ambient air aloft and within the collector, various diffuser tower shapes were tested by laboratory experiments and numerical analyses. As a result, it was found that a diffuser tower with a semi-open angle of 4° is an optimal shape, producing the fastest updraft at each temperature difference in both the laboratory experiments and numerical analyses. The relationships between thermal updraft speed and temperature difference and/or tower height were confirmed. It was found that the thermal updraft velocity is proportional to the square root of the tower height and/or temperature difference.
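The reported square-root scaling follows from the standard buoyancy estimate for a chimney updraft (the formula, coefficient and ambient temperature below are the usual idealisation and are assumptions on our part, not the paper's model, which concerns a diffuser-shaped tower):

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def updraft_speed(height, dT, T_ambient=293.0):
    """Idealised buoyancy-driven updraft in a solar chimney:
    w = sqrt(2 g H dT / T), i.e. the heated column's buoyant potential
    energy fully converted to kinetic energy (friction and the turbine
    are neglected)."""
    return math.sqrt(2.0 * g * height * dT / T_ambient)

# Doubling checks of the square-root scaling reported above:
w = updraft_speed(10.0, 20.0)                           # ~3.7 m/s
assert abs(updraft_speed(40.0, 20.0) - 2 * w) < 1e-9    # 4x height -> 2x speed
assert abs(updraft_speed(10.0, 80.0) - 2 * w) < 1e-9    # 4x dT -> 2x speed
```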
Martínez-Merino, Aldo; Ryan, Michael P
2016-01-01
Newtonian gravitation with some slight modifications, along with some highly simplified ideas from quantum field theory allow us to reproduce, at least at the level of back-of-the-envelope calculations, many results of black hole physics. We consider particle production by a black hole, the Newtonian equivalent of the Hawking temperature, and the Bekenstein entropy. Also, we are able to deduce Newtonian field equations from entropy. We finally study higher-order Newtonian theories under the same assumptions used for ordinary Newtonian theory. In a companion article we will look at entropic forces for various entropies and make contact with our analysis of higher-order Newtonian theories.
Contrast enhancement based on entropy and reflectance analysis for surgical lighting
Shen, Junfei; Wang, Huihui; Wu, Yisi; Li, An; Chen, Chi; Zheng, Zhenrong
2015-07-01
The light-emitting diode (LED) is a new type of surgical lighting device providing inexpensive, color-variable illumination. A methodology was designed to evaluate the quality of surgical lighting and used to develop an operating lamp with LEDs that enhances biological contrast. We assembled a modular array of Philips LEDs as illumination. In the initial experiment, images of a porcine heart were acquired under several LED environments and analyzed quantitatively to assess the function of these LEDs in contrast enhancement. We then measured the reflectance spectra of blood, fat and other tissues to obtain a spectral comparison. Based on the results, new illuminations were developed with the spectral components that differ most in the comparison. Meanwhile, a new evaluation function combining entropy analysis and brightness contrast was built to evaluate the quality of these illuminations. Experiments showed that biological features are more visible under the treated LED illuminations than under broadband lamps. Thus, the synthesis of LED lighting spectra can be adjusted to provide significant tissue identification. We therefore believe the new methodology will contribute to the manufacture of highly efficient medical illumination and play a positive role in future surgical lighting.
Monotonicity, thinning and discrete versions of the Entropy Power Inequality
Johnson, Oliver
2009-01-01
We consider the entropy of sums of independent discrete random variables, in analogy with Shannon's Entropy Power Inequality, where equality holds for normals. In our case, infinite divisibility suggests that equality should hold for Poisson variables. We show that some natural analogues of the Entropy Power Inequality do not in fact hold, but propose an alternative formulation which does always hold. The key to many proofs of Shannon's Entropy Power Inequality is the behaviour of entropy on scaling of continuous random variables. We believe that Rényi's operation of thinning discrete random variables plays a similar role to scaling, and give a sharp bound on how the entropy of ultra log-concave random variables behaves on thinning. In the spirit of the monotonicity results established by Artstein, Ball, Barthe and Naor, we prove a stronger version of concavity of entropy, which implies a strengthened form of our discrete Entropy Power Inequality.
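Rényi thinning can be made concrete with a small numerical check (a minimal sketch; function names and the truncated support are our assumptions): binomial thinning maps Poisson to Poisson, just as scaling maps Gaussian to Gaussian in the continuous Entropy Power Inequality.

```python
import math

def poisson_pmf(lam):
    return lambda n: math.exp(-lam) * lam ** n / math.factorial(n)

def thin_pmf(pmf, alpha, support):
    """Rényi (binomial) alpha-thinning of a pmf on {0,1,2,...}: each of the
    n 'units' of X survives independently with probability alpha, so
    P(T_alpha X = k) = sum_n pmf(n) C(n,k) alpha^k (1-alpha)^(n-k)."""
    return [sum(pmf(n) * math.comb(n, k) * alpha ** k * (1 - alpha) ** (n - k)
                for n in range(k, support))
            for k in range(support)]

# Thinning Poisson(4) by alpha = 0.5 reproduces Poisson(2) on the truncated
# support, up to machine precision: the Poisson family is stable under
# thinning, mirroring Gaussian stability under scaling.
thinned = thin_pmf(poisson_pmf(4.0), 0.5, support=60)
target = [poisson_pmf(2.0)(k) for k in range(10)]
max_err = max(abs(thinned[k] - target[k]) for k in range(10))
```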
Phase Change Enthalpies and Entropies of Liquid Crystals
National Research Council Canada - National Science Library
Acree, William E; Chickos, James S
2006-01-01
.... A group additivity approach used to estimate total phase change entropies of organic molecules applied to 627 of these liquid crystals is found to significantly overestimate their total phase change entropies...
Black holes, entropies, and semiclassical spacetime in quantum gravity
National Research Council Canada - National Science Library
Nomura, Yasunori; Weinberg, Sean J
2014-01-01
.... We identify the Bekenstein-Hawking entropy as the entropy associated with coarse-graining performed to obtain semiclassical field theory from a fundamental microscopic theory of quantum gravity...
Santillan, M.; Zeron, E. S.; Del Rio-Correa, J. L.
2008-01-01
In the traditional statistical mechanics textbooks, the entropy concept is first introduced for the microcanonical ensemble and then extended to the canonical and grand-canonical cases. However, in the authors' experience, this procedure makes it difficult for the student to see the bigger picture and, although quite ingenious, the subtleness of…
Piazzi, Marco; Zemen, Jan; Basso, Vittorio
We combine spin polarised density functional theory and thermodynamic mean field theory to describe the phase transitions of antiperovskite manganese nitrides. We find that the inclusion of the localized spin contribution to the entropy, evaluated through mean field theory, lowers the transition temperatures. Furthermore, we show that the electronic entropy leads to first order phase transitions in agreement with experiments whereas the localized spin contribution adds second order character to the transition. We compare our predictions to available experimental data to assess the validity of the assumptions underpinning our multilevel modelling.
Exploration and Development of High Entropy Alloys for Structural Applications
Directory of Open Access Journals (Sweden)
Daniel B. Miracle
2014-01-01
Full Text Available We develop a strategy to design and evaluate high-entropy alloys (HEAs) for structural use in the transportation and energy industries. We give HEA goal properties for low (≤150 °C), medium (≤450 °C) and high (≥1,100 °C) use temperatures. A systematic design approach uses palettes of elements chosen to meet the target properties of each HEA family and gives methods to build HEAs from these palettes. We show that intermetallic phases are consistent with HEA definitions, and the strategy developed here includes both single-phase, solid-solution HEAs and HEAs with intentional addition of a second phase for particulate hardening. A thermodynamic estimate of the effectiveness of configurational entropy to suppress or delay compound formation is given. A 3-stage approach is given to systematically screen and evaluate a vast number of HEAs by integrating high-throughput computations and experiments. CALPHAD methods are used to predict phase equilibria, and high-throughput experiments on materials libraries with controlled composition and microstructure gradients are suggested. Much of this evaluation can be done now, but key components (materials libraries with microstructure gradients and high-throughput tensile testing) are currently missing. Suggestions for future HEA efforts are given.
Diffusion imaging quality control via entropy of principal direction distribution.
Farzinfar, Mahshid; Oguz, Ipek; Smith, Rachel G; Verde, Audrey R; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C; Paterson, Sarah; Evans, Alan C; Styner, Martin A
2013-11-15
Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, "venetian blind" artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, often only visible when investigating groups of DWIs or a derived diffusion model, such as the widely employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. A high entropy value indicates that the PDs are distributed relatively uniformly, while a low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and here called
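The core idea, entropy of a binned distribution of principal directions, can be illustrated with a toy 2-D sketch (our simplification; the paper's measure is regional and defined over directions on the sphere). Uniformly spread directions yield high entropy; an artifactual cluster yields low entropy:

```python
import math, random

def direction_entropy(angles, nbins=18):
    """Shannon entropy of a binned distribution of direction angles in [0, pi)."""
    counts = [0] * nbins
    for a in angles:
        counts[int(a / math.pi * nbins) % nbins] += 1
    total = sum(counts)
    return -sum(c / total * math.log(c / total) for c in counts if c)

rng = random.Random(0)
uniform = [rng.uniform(0, math.pi) for _ in range(5000)]           # spread PDs
clustered = [rng.gauss(1.0, 0.05) % math.pi for _ in range(5000)]  # biased PDs
print(direction_entropy(uniform) > direction_entropy(clustered))   # True
```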
Vallino, J. J.; Algar, C. K.; Huber, J. A.; Fernandez-Gonzalez, N.
2014-12-01
The maximum entropy production (MEP) principle holds that non-equilibrium systems with sufficient degrees of freedom will likely be found in a state that maximizes entropy production or, analogously, maximizes the potential energy destruction rate. The theory does not distinguish between abiotic and biotic systems; however, we will show that systems that can coordinate function over time and/or space can potentially dissipate more free energy than purely Markovian processes (such as fire or a rock rolling down a hill) that only maximize instantaneous entropy production. Biological systems have the ability to store useful information, acquired via evolution and curated by natural selection, in genomic sequences that allow them to execute temporal strategies and coordinate function over space. For example, circadian rhythms allow phototrophs to "predict" that sunlight will return and to orchestrate metabolic machinery appropriately before sunrise, which not only gives them a competitive advantage, but also increases the total entropy production rate compared to systems that lack such anticipatory control. Similarly, coordination over space, such as quorum sensing in microbial biofilms, can increase acquisition of spatially distributed resources and free energy and thereby enhance entropy production. In this talk we will develop a modeling framework to describe microbial biogeochemistry based on the MEP conjecture, constrained by information and resource availability. Results from model simulations will be compared to laboratory experiments to demonstrate the usefulness of the MEP approach.
The entropy gain of infinite-dimensional quantum channels
Holevo, A. S.
2010-01-01
In the present paper we study the entropy gain $H(\Phi[\rho])-H(\rho)$ for infinite-dimensional channels $\Phi$. We show that, unlike the finite-dimensional case where the minimal entropy gain is always nonpositive \cite{al}, there are plenty of channels with positive minimal entropy gain. We obtain a new lower bound and compute the minimal entropy gain for a broad class of Bosonic Gaussian channels by proving that the infimum is attained on Gaussian states.
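For contrast, the nonpositive minimal gain in finite dimensions is easy to exhibit: an amplitude-damping qubit channel strictly lowers the entropy of the maximally mixed state. A small sketch (our illustrative example, not from the paper):

```python
from math import log2

def h2(p):
    """Binary Shannon entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Amplitude damping with parameter g maps the maximally mixed qubit rho = I/2
# to diag((1+g)/2, (1-g)/2), so the entropy gain is h2((1+g)/2) - 1 < 0 for g > 0.
g = 0.3
gain = h2((1 + g) / 2) - 1.0
print(gain < 0)  # True: this channel's minimal entropy gain is negative
```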
Rényi entropy measure of noise-aided information transmission in a binary channel.
Chapeau-Blondeau, François; Rousseau, David; Delahaies, Agnès
2010-05-01
This paper analyzes a binary channel by means of information measures based on the Rényi entropy. The analysis extends, and contains as a special case, the classic reference model of binary information transmission based on the Shannon entropy measure. The extended model is used to investigate further possibilities and properties of stochastic resonance, or noise-aided information transmission. The results demonstrate that stochastic resonance occurs in the information channel and is registered by the Rényi entropy measures at any finite order, including the Shannon order. Furthermore, under certain conditions, when seeking the Rényi information measures that best exploit stochastic resonance, nontrivial orders differing from the Shannon case usually emerge. In this way, through binary information transmission, stochastic resonance identifies optimal Rényi measures of information differing from the classic Shannon measure. A comparison of the quantitative information measures with visual perception is also proposed in an experiment of noise-aided binary image transmission.
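The Rényi entropy of order α underlying this analysis, H_α(p) = log₂(Σᵢ pᵢ^α)/(1−α), reduces to the Shannon entropy as α → 1 and is non-increasing in α. A quick numerical sketch (ours, with an arbitrary example distribution):

```python
from math import log2

def renyi(p, alpha):
    """Renyi entropy H_alpha(p) in bits; alpha = 1 is the Shannon limit."""
    if alpha == 1:
        return -sum(x * log2(x) for x in p if x > 0)
    return log2(sum(x ** alpha for x in p)) / (1 - alpha)

p = [0.1, 0.2, 0.3, 0.4]
shannon = renyi(p, 1)
near_one = renyi(p, 1.001)
print(abs(shannon - near_one) < 1e-3)          # True: H_alpha -> H_1 as alpha -> 1
print(renyi(p, 0.5) >= shannon >= renyi(p, 2))  # True: H_alpha non-increasing in alpha
```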
Entropy and spontaneity in an introductory physics course for life science students
Geller, Benjamin D; Gouvea, Julia; Sawtelle, Vashti; Turpen, Chandra; Redish, Edward F
2013-01-01
In an Introductory Physics for Life Science (IPLS) course that leverages authentic biological examples, student ideas about entropy as "disorder" or "chaos" come into contact with their ideas about the spontaneous formation of organized biological structure. It is possible to reconcile the "natural tendency to disorder" with the organized clustering of macromolecules, but doing so in a way that will be meaningful to students requires that we take seriously the ideas about entropy and spontaneity that students bring to IPLS courses from their prior experiences in biology and chemistry. We draw on case study interviews to argue that an approach that emphasizes the interplay of energy and entropy in determining spontaneity (one that involves a central role for free energy) is one that draws on students' resources from biology and chemistry in particularly effective ways. We see the positioning of entropic arguments alongside energetic arguments in the determination of spontaneity as an important step toward maki...
Sun, Yanqing; Zhou, Yu; Zhao, Qingwei; Zhang, Pengyuan; Pan, Fuping; Yan, Yonghong
In this paper, the robustness of the posterior-based confidence measures is improved by utilizing entropy information, which is calculated for speech-unit-level posteriors using only the best recognition result, without requiring a larger computational load than conventional methods. Using different normalization methods, two posterior-based entropy confidence measures are proposed. Practical details are discussed for two typical levels of hidden Markov model (HMM)-based posterior confidence measures, and both levels are compared in terms of their performances. Experiments show that the entropy information results in significant improvements in the posterior-based confidence measures. The absolute improvements of the out-of-vocabulary (OOV) rejection rate are more than 20% for both the phoneme-level confidence measures and the state-level confidence measures for our embedded test sets, without a significant decline of the in-vocabulary accuracy.
MAXIMUM-LIKELIHOOD-ESTIMATION OF THE ENTROPY OF AN ATTRACTOR
SCHOUTEN, JC; TAKENS, F; VANDENBLEEK, CM
In this paper, a maximum-likelihood estimate of the (Kolmogorov) entropy of an attractor is proposed that can be obtained directly from a time series. Also, the relative standard deviation of the entropy estimate is derived; it is dependent on the entropy and on the number of samples used in the
Psychological Entropy: A Framework for Understanding Uncertainty-Related Anxiety
Hirsh, Jacob B.; Mar, Raymond A.; Peterson, Jordan B.
2012-01-01
Entropy, a concept derived from thermodynamics and information theory, describes the amount of uncertainty and disorder within a system. Self-organizing systems engage in a continual dialogue with the environment and must adapt themselves to changing circumstances to keep internal entropy at a manageable level. We propose the entropy model of…
The Entropy Theoretic Measure for Manpower Systems in Perspective
African Journals Online (AJOL)
... propose the use of transition probabilities of the imbedded Markov chain for manpower systems as inputs in the entropy statistic. The proposal is illustrated by refining the basic Shannon entropy rate and implemented in Matlab computing environment. Keywords: entropy, manpower system, Markov chain, Matlab package.
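The entropy rate of an embedded Markov chain with transition matrix P and stationary distribution π is H = Σᵢ πᵢ Σⱼ −P_ij log₂ P_ij. A minimal sketch in Python rather than Matlab (the two-state "manpower" transition matrix is hypothetical, not from the paper):

```python
from math import log2

def stationary(P, iters=200):
    """Stationary distribution by power iteration (assumes an ergodic chain)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P):
    """Shannon entropy rate sum_i pi_i sum_j -P_ij log2 P_ij, in bits per step."""
    pi = stationary(P)
    return sum(pi[i] * -sum(p * log2(p) for p in P[i] if p > 0)
               for i in range(len(P)))

# Hypothetical two-state stay/leave transition matrix
P = [[0.9, 0.1],
     [0.4, 0.6]]
print(round(entropy_rate(P), 4))  # → 0.5694
```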
Multi-GPU maximum entropy image synthesis for radio astronomy
Cárcamo, M.; Román, P. E.; Casassus, S.; Moral, V.; Rannou, F. R.
2018-01-01
The maximum entropy method (MEM) is a well-known deconvolution technique in radio interferometry. This method solves a non-linear optimization problem with an entropy regularization term. Other heuristics such as CLEAN are faster but highly user dependent. Nevertheless, MEM has the following advantages: it is unsupervised, it has a statistical basis, and it offers better resolution and image quality under certain conditions. This work presents a high-performance GPU version of non-gridding MEM, which is tested using real and simulated data. We propose a single-GPU and a multi-GPU implementation for single and multi-spectral data, respectively. We also make use of the Peer-to-Peer and Unified Virtual Addressing features of newer GPUs, which allow multiple GPUs to be exploited transparently and efficiently. Several ALMA data sets are used to demonstrate the effectiveness in imaging and to evaluate GPU performance. The results show that a speedup from 1000 to 5000 times over a sequential version can be achieved, depending on data and image size. This allows the HD142527 CO(6-5) short baseline data set to be reconstructed in 2.1 min, instead of the 2.5 days required by a sequential version on CPU.
Increasing the Discriminatory Power of DEA Using Shannon’s Entropy
Directory of Open Access Journals (Sweden)
Qiwei Xie
2014-03-01
In many data envelopment analysis (DEA) applications, the analyst confronts the difficulty that the selected data set is not suitable for traditional DEA models because of their poor discrimination. This paper presents an approach that uses Shannon's entropy to improve the discrimination of traditional DEA models. In this approach, DEA efficiencies are first calculated for all possible variable subsets and analyzed using Shannon's entropy theory to calculate the degree of importance of each subset in the performance measurement; we then combine the obtained efficiencies and the degrees of importance to generate a comprehensive efficiency score (CES), which observably improves the discrimination of traditional DEA models. Finally, the proposed approach is applied to data sets from the prior DEA literature.
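A common way to turn Shannon's entropy into such degrees of importance is the entropy-weight scheme sketched below (our reading of the general method; the function names and efficiency data are hypothetical). Subsets whose efficiency scores are nearly uniform carry little discriminating information and receive small weights:

```python
from math import log

def entropy_weights(scores):
    """scores[j][k]: efficiency of DMU k under variable subset j.
    Returns one weight per subset, reflecting how much it discriminates."""
    n = len(scores[0])                        # number of decision-making units
    e = []
    for row in scores:
        s = sum(row)
        p = [x / s for x in row]
        e.append(-sum(q * log(q) for q in p if q > 0) / log(n))
    d = [1 - ej for ej in e]                  # degree of diversification
    return [dj / sum(d) for dj in d]

# Hypothetical efficiencies of 4 DMUs under 3 variable subsets
S = [[1.0, 0.9, 0.95, 1.0],
     [1.0, 0.5, 0.70, 0.6],
     [0.8, 0.8, 0.80, 0.8]]
w = entropy_weights(S)
ces = [sum(w[j] * S[j][k] for j in range(3)) for k in range(4)]
print([round(x, 3) for x in w])  # the uniform third subset gets the smallest weight
```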
Malleability of the blockchain’s entropy
C.A. Pierrot (Cécile); W. Wesolowski (Benjamin)
2017-01-01
Trustworthy generation of public random numbers is necessary for the security of a number of cryptographic applications. It was suggested to use the inherent unpredictability of blockchains as a source of public randomness. Entropy from the Bitcoin blockchain in particular has been used
Carnot to Clausius: Caloric to Entropy
Newburgh, Ronald
2009-01-01
This paper discusses how the Carnot engine led to the formulation of the second law of thermodynamics and entropy. The operation of the engine is analysed both in terms of heat as the caloric fluid and heat as a form of energy. A keystone of Carnot's thinking was the absolute conservation of caloric. Although the Carnot analysis was partly…
Entropy of Mixing of Distinguishable Particles
Kozliak, Evguenii I.
2014-01-01
The molar entropy of mixing yields values that depend only on the number of mixing components rather than on their chemical nature. To explain this phenomenon using the logic of chemistry, this article considers mixing of distinguishable particles, thus complementing the well-known approach developed for nondistinguishable particles, for example,…
The Statistical Interpretation of Entropy: An Activity
Timmberlake, Todd
2010-01-01
The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…
Wide Range Multiscale Entropy Changes through Development
Directory of Open Access Journals (Sweden)
Nicola R. Polizzotto
2015-12-01
How variability in the brain's neurophysiologic signals evolves during development is important for a global, system-level understanding of brain maturation and its disturbance in neurodevelopmental disorders. In the current study, we use multiscale entropy (MSE), a measure that has been related to signal complexity, to investigate how this variability evolves during development across a broad range of temporal scales. We computed MSE, standard deviation (STD) and standard spectral analyses on resting EEG from 188 healthy individuals aged 8–22 years old. We found age-related increases in entropy at lower scales (<~20 ms) and decreases in entropy at higher scales (~60–80 ms). Decreases in the overall signal STD were anticorrelated with entropy, especially in the lower scales, where regression analyses showed substantial covariation of the observed changes. Our findings document for the first time the scale dependency of developmental changes from childhood to early adulthood, challenging a parsimonious MSE-based account of brain maturation along a unidimensional complexity measure. At the level of analysis permitted by electroencephalography (EEG), MSE could capture critical spatiotemporal variations in the role of noise in the brain. However, interpretations critically rely on defining how signal STD affects MSE properties.
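MSE computes sample entropy on successively coarse-grained copies of the signal, with the tolerance r fixed from the original series (this fixed-r convention is exactly why white-noise entropy falls with scale, echoing the STD dependence noted above). A simplified sketch (ours; m = 2 and r = 0.2·STD are conventional choices, and the match counting is a common simplified variant):

```python
import math, random

def sample_entropy(x, r, m=2):
    """-log of the conditional probability that templates matching for m
    points (within tolerance r) also match for m + 1 points."""
    def matches(mm):
        n, c = len(x) - mm, 0
        for i in range(n):
            for j in range(i + 1, n):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c
    b, a = matches(m), matches(m + 1)
    return math.inf if a == 0 else -math.log(a / b)

def coarse_grain(x, scale):
    """Non-overlapping averages of length `scale`."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

rng = random.Random(1)
white = [rng.gauss(0, 1) for _ in range(600)]
r = 0.2 * math.sqrt(sum(v * v for v in white) / len(white))  # fixed at scale 1
mse = [sample_entropy(coarse_grain(white, s), r) for s in (1, 2, 4)]
print(mse[0] > mse[2] > 0)  # True: white-noise entropy falls as the scale grows
```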
Texture analysis using Renyi's generalized entropies
Grigorescu, SE; Petkov, N
2003-01-01
We propose a texture analysis method based on Renyi's generalized entropies. The method aims at identifying texels in regular textures by searching for the smallest window through which the minimum number of different visual patterns is observed when moving the window over a given texture. The
Nazarov, Y.V.
2011-01-01
We demonstrate that the condensed matter quantum systems encompassing two reservoirs connected by a junction permit a natural definition of flows of conserved measures, i.e., Rényi entropies. Such flows are similar to the flows of physical conserved quantities such as charge and energy. We develop a
Information, entropy and fidelity in visual communication
Huck, Friedrich O.; Fales, Carl L.; Alter-Gartenberg, Rachel; Rahman, Zia-Ur
1992-01-01
This paper presents an assessment of visual communication that integrates the critical limiting factors of image gathering and display with the digital processing that is used to code and restore images. The approach focuses on two mathematical criteria, information and fidelity, and on their relationships to the entropy of the encoded data and to the visual quality of the restored image.
On Using Entropy for Enhancing Handwriting Preprocessing
Directory of Open Access Journals (Sweden)
Bernhard Peischl
2012-11-01
Handwriting is an important modality for Human-Computer Interaction. For medical professionals, handwriting is (still) the preferred natural method of documentation. Handwriting recognition has long been a primary research area in Computer Science. With the tremendous ubiquity of smartphones, along with the renaissance of the stylus, handwriting recognition has received a new impetus. However, recognition rates are still not perfect, and researchers continue to improve handwriting algorithms. In this paper we evaluate the performance of entropy-based slant- and skew-correction, and compare the results to other methods. We selected 3700 words of 23 writers out of the Unipen-ICROW-03 benchmark set, which we annotated with their associated error angles by hand. Our results show that the entropy-based slant correction method outperforms a window-based approach, with an average precision of 6.02 for the entropy-based method compared with 7.85 for the alternative. On the other hand, the entropy-based skew correction yields a lower average precision of 2.86, compared with the average precision of 2.13 for the alternative LSM-based approach.
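Entropy-based slant correction works by shearing the word image over candidate angles and keeping the one whose column-projection histogram has minimal entropy, since deslanted strokes concentrate into few columns. A toy sketch of the principle (ours; synthetic point strokes, not the paper's pipeline):

```python
import math

def projection_entropy(points, shear, width=1.0):
    """Shannon entropy of the histogram of x - shear * y over fixed-width bins."""
    xs = [x - shear * y for x, y in points]
    lo = min(xs)
    counts = {}
    for x in xs:
        b = int((x - lo) / width + 1e-9)   # small guard against float round-off
        counts[b] = counts.get(b, 0) + 1
    n = len(xs)
    return -sum(c / n * math.log(c / n) for c in counts.values())

# Five synthetic vertical strokes, slanted by a known shear of 0.3
true_shear = 0.3
points = [(10 * s + true_shear * y, y) for s in range(5) for y in range(200)]
candidates = [i / 100 for i in range(-50, 51)]
best = min(candidates, key=lambda sh: projection_entropy(points, sh))
print(abs(best - true_shear) <= 0.01)  # True: minimal entropy recovers the slant
```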
Quantum dynamical entropies in discrete classical chaos
Energy Technology Data Exchange (ETDEWEB)
Benatti, Fabio [Dipartimento di Fisica Teorica, Universita di Trieste, Strada Costiera 11, 34014 Trieste (Italy); Cappellini, Valerio [Dipartimento di Fisica Teorica, Universita di Trieste, Strada Costiera 11, 34014 Trieste (Italy); Zertuche, Federico [Instituto de Matematicas, UNAM, Unidad Cuernavaca, AP 273-3, Admon. 3, 62251 Cuernavaca, Morelos (Mexico)
2004-01-09
We discuss certain analogies between quantization and discretization of classical systems on manifolds. In particular, we will apply the quantum dynamical entropy of Alicki and Fannes to numerically study the footprints of chaos in discretized versions of hyperbolic maps on the torus.
Entropy based fingerprint for local crystalline order
Piaggi, Pablo M.; Parrinello, Michele
2017-09-01
We introduce a new fingerprint that allows distinguishing between liquid-like and solid-like atomic environments. This fingerprint is based on an approximate expression for the entropy projected on individual atoms. When combined with local enthalpy, this fingerprint acquires an even finer resolution and it is capable of discriminating between different crystal structures.
The Optimal Use of Entropy and Enthalpy
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Volume 6, Issue 9. The Optimal Use of Entropy and Enthalpy. M S Ananth, R Ravi. General Article. Author Affiliations: Department of Chemical Engineering, Indian Institute of Technology, Chennai 600 036, India.
Generalization of Gibbs Entropy and Thermodynamic Relation
Park, Jun Chul
2010-01-01
In this paper, we extend Gibbs's approach of quasi-equilibrium thermodynamic processes, and calculate the microscopic expression of entropy for general non-equilibrium thermodynamic processes. Also, we analyze the formal structure of thermodynamic relation in non-equilibrium thermodynamic processes.
Comprehensive entropy weight observability-controllability risk ...
African Journals Online (AJOL)
Decision making for water resource planning is often related to social, economic and environmental factors. There are various methods for making decisions about water resource planning alternatives and measures with various shortcomings. A comprehensive entropy weight observability-controllability risk analysis ...
The Entropy-Based Quantum Metric
Directory of Open Access Journals (Sweden)
Roger Balian
2014-07-01
The von Neumann entropy S(D̂) generates in the space of quantum density matrices D̂ the Riemannian metric ds² = −d²S(D̂), which is physically founded and which characterises the amount of quantum information lost by mixing D̂ and D̂ + dD̂. A rich geometric structure is thereby implemented in quantum mechanics. It includes a canonical mapping between the spaces of states and of observables, which involves the Legendre transform of S(D̂). The Kubo scalar product is recovered within the space of observables. Applications are given to equilibrium and non-equilibrium quantum statistical mechanics. There the formalism is specialised to the relevant space of observables and to the associated reduced states issued from the maximum entropy criterion, which result from the exact states through an orthogonal projection. Von Neumann's entropy specialises into a relevant entropy. Comparison is made with other metrics. The Riemannian properties of the metric ds² = −d²S(D̂) are derived. The curvature arises from the non-Abelian nature of quantum mechanics; its general expression and its explicit form for q-bits are given, as well as geodesics.
Geometric Entropy of Self-Gravitating Systems
Directory of Open Access Journals (Sweden)
Silvio Mercadante
2007-11-01
We shall review different approaches to the entropy of self-gravitating systems in General Relativity. Then we shall discuss in detail the macroscopic approach based on an "à la Clausius" point of view. Recent developments will be reviewed, discussing the aims as well as the assumptions which the framework is based on.
Nelson, M.; Dempster, W. F.; Silverstone, S.; Alling, A.; Allen, J. P.; van Thillo, M.
Two crop growth experiments in the soil-based closed ecological facility, Laboratory Biosphere, were conducted from 2003 to 2004 with candidate space life support crops. Apogee wheat (Utah State University variety) was grown, planted at two densities, 400 and 800 seeds m⁻². The lighting regime for the wheat crop was 16 h light - 8 h dark at a total light intensity of around 840 μmol m⁻² s⁻¹ and 48.4 mol m⁻² d⁻¹ over 84 days. Average biomass was 1395 g m⁻², 16.0 g m⁻² d⁻¹, and average seed production was 689 g m⁻² and 7.9 g m⁻² d⁻¹. The less densely planted side was more productive than the denser planting, with 1634 g m⁻² and 18.8 g m⁻² d⁻¹ of biomass vs. 1156 g m⁻² and 13.3 g m⁻² d⁻¹, and a seed harvest of 812.3 g m⁻² and 9.3 g m⁻² d⁻¹ vs. 566.5 g m⁻² and 6.5 g m⁻² d⁻¹. Harvest index was 0.49 for the wheat crop. The experiment with sweet potato used TU-82-155, a compact variety developed at Tuskegee University. Light during the sweet potato experiment, on an 18 h light/6 h dark cycle, totaled 5568 moles of light per square meter in 126 days, or an average of 44.2 mol m⁻² d⁻¹. The temperature regime was 28 ± 3 °C day/22 ± 4 °C night. Sweet potato tuber yield was 39.7 kg wet weight, or an average of 7.4 kg m⁻², and 7.7 kg dry weight of tubers, since dry weight was about 18.6% of wet weight. Average per-day production was 58.7 g m⁻² d⁻¹ wet weight and 11.3 g m⁻² d⁻¹ dry weight. For the wheat, average light efficiency was 0.34 g biomass per mole, and 0.17 g seed per mole. The best area of wheat had an efficiency of light utilization of 0.51 g biomass per mole and 0.22 g seed per mole. For the sweet potato crop, light efficiency per tuber wet weight was 1.33 g mol⁻¹ and 0.34 g dry weight of tuber per mole of light. The best area of tuber production had 1.77 g mol⁻¹ wet weight and 0.34 g mol⁻¹ dry weight. The Laboratory Biosphere experiment's light efficiency was somewhat higher than the USU field results but
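The reported light-use efficiencies are simple ratios of yield totals to integrated photon flux; a quick check of the abstract's numbers:

```python
# Wheat: 84 days at 48.4 mol m^-2 d^-1 of light
light_wheat = 84 * 48.4             # total moles of photons per m^2
biomass_eff = 1395 / light_wheat    # g biomass per mole of photons
seed_eff = 689 / light_wheat        # g seed per mole of photons

# Sweet potato: 5568 mol m^-2 over 126 days; 7.4 kg m^-2 wet tuber yield
tuber_eff = 7400 / 5568             # g wet tuber per mole of photons

print(round(biomass_eff, 2), round(seed_eff, 2), round(tuber_eff, 2))  # 0.34 0.17 1.33
```

These reproduce the quoted 0.34 g biomass, 0.17 g seed and 1.33 g wet tuber per mole of light.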
Entropy, pumped-storage and energy system finance
Karakatsanis, Georgios
2015-04-01
Pumped-storage holds a key role for integrating renewable energy units with non-renewable fuel plants into large-scale energy systems of electricity output. An emerging issue is the development of financial engineering models with a physical basis to systematically fund energy system efficiency improvements across its operation. A fundamental physically-based economic concept is the Scarcity Rent, which concerns the pricing of a natural resource's scarcity. Specifically, the scarcity rent comprises a fraction of a depleting resource's full price and accumulates to fund its more efficient future use. In an integrated energy system, scarcity rents derive from various resources and can be deposited into a pooled fund to finance the energy system's overall efficiency increase, allowing it to benefit from economies of scale. With pumped-storage incorporated into the system, water upgrades to a hub resource in which the scarcity rents of all connected energy sources are denominated. However, as available water for electricity generation or storage is also limited, a scarcity rent is imposed upon it as well. It is suggested that scarcity rent generation is reducible to three main factors, incorporating uncertainty: (1) water's natural renewability, (2) the energy system's intermittent components and (3) base-load prediction deviations from actual loads. For that purpose, the concept of entropy is used to measure the energy system's overall uncertainty, and hence pumped-storage intensity requirements and generated water scarcity rents. Keywords: pumped-storage, integration, energy systems, financial engineering, physical basis, Scarcity Rent, pooled fund, economies of scale, hub resource, uncertainty, entropy. Acknowledgement: This research was funded by the Greek General Secretariat for Research and Technology through the research project Combined REnewable Systems for Sustainable ENergy DevelOpment (CRESSENDO; grant number 5145)
Entropy Production in Convective Hydrothermal Systems
Boersing, Nele; Wellmann, Florian; Niederau, Jan
2016-04-01
Exploring hydrothermal reservoirs requires reliable estimates of subsurface temperatures to delineate favorable locations of boreholes. It is therefore of fundamental and practical importance to understand the thermodynamic behavior of the system in order to predict its performance with numerical studies. To this end, the thermodynamic measure of entropy production is considered as a useful abstraction tool to characterize the convective state of a system since it accounts for dissipative heat processes and gives insight into the system's average behavior in a statistical sense. Solving the underlying conservation principles of a convective hydrothermal system is sensitive to initial conditions and boundary conditions which in turn are prone to uncertain knowledge in subsurface parameters. There exist multiple numerical solutions to the mathematical description of a convective system and the prediction becomes even more challenging as the vigor of convection increases. Thus, the variety of possible modes contained in such highly non-linear problems needs to be quantified. A synthetic study is carried out to simulate fluid flow and heat transfer in a finite porous layer heated from below. Various two-dimensional models are created such that their corresponding Rayleigh numbers lie in a range from the sub-critical linear to the supercritical non-linear regime, that is purely conductive to convection-dominated systems. Entropy production is found to describe the transient evolution of convective processes fairly well and can be used to identify thermodynamic equilibrium. Additionally, varying the aspect ratio for each Rayleigh number shows that the variety of realized convection modes increases with both larger aspect ratio and higher Rayleigh number. This phenomenon is also reflected by an enlarged spread of entropy production for the realized modes. Consequently, the Rayleigh number can be correlated to the magnitude of entropy production. In cases of moderate
Ben Ishak, Anis
2017-01-01
In this work, the effect of the Rényi and Tsallis entropies' parameters on image segmentation quality within a two-dimensional multilevel thresholding framework is assessed and analyzed. The problems of automatically tuning the entropy's parameter and determining the optimal thresholding values are solved in a single task, using the Quantum Genetic Algorithm (QGA). Numerical experiments conducted on different types of images demonstrated that the Rényi and Tsallis entropies perform approximately similarly, and that they are optimal when their parameters are null. Moreover, it was shown that optimizing the entropy does not maximize the Peak Signal to Noise Ratio (PSNR) and Structural SIMilarity (SSIM) criteria. We then show that these two criteria are not sufficiently consistent with human visual perception. Finally, a comparative study performed on synthetic and real images demonstrated the effectiveness of the proposed method.