Kiyota, Yasuomi; Yoshida, Norio; Hirata, Fumio
2011-11-08
A new approach to investigating the molecular recognition process of a protein is presented, based on the three-dimensional reference interaction site model (3D-RISM) theory, a statistical mechanics theory of molecular liquids. The numerical procedure for solving the conventional 3D-RISM equation consists of two steps. In step 1, we solve the ordinary RISM (or 1D-RISM) equations for a solvent mixture including the target ligands in order to obtain the density pair correlation functions (PCF) among molecules in the solution. Then, we solve the 3D-RISM equation for a solute-solvent system to find the three-dimensional density distribution functions (3D-DDF) of solvent species around a protein, using the PCF obtained in the first step. A key to the success of the method was to regard a target ligand as one of the "solvent" species. However, its success is limited by the difficulty of solving the 1D-RISM equations for a solvent mixture that includes large ligand molecules. In the present paper, we propose a method which eases the limitation on solute size in the conventional method. In this approach, we solve the solute-solute 3D-RISM equation for a protein-ligand system in which both proteins and ligands are regarded as "solutes" at infinite dilution. The 3D- and 1D-RISM equations are solved for the protein-solvent and ligand-solvent systems, respectively, in order to obtain the 3D- and 1D-DDF of solvent around the solutes, which are required for solving the solute-solute 3D-RISM equation. The method is applied to two practical and noteworthy examples of pharmaceutical interest. One is an odorant-binding protein of Drosophila melanogaster, which binds an ethanol molecule. The other is phospholipase A2, which is known as a receptor of acetylsalicylic acid, or aspirin. The results indicate that the method successfully reproduces the experimentally observed binding modes of the ligand molecules in the binding sites.
Imai, Takashi; Oda, Koji; Kovalenko, Andriy; Hirata, Fumio; Kidera, Akinori
2009-09-02
In line with the recent development of fragment-based drug design, a new computational method for mapping of small ligand molecules on protein surfaces is proposed. The method uses three-dimensional (3D) spatial distribution functions of the atomic sites of the ligand calculated using the molecular theory of solvation, known as the 3D reference interaction site model (3D-RISM) theory, to identify the most probable binding modes of ligand molecules. The 3D-RISM-based method is applied to the binding of several small organic molecules to thermolysin, in order to show its efficiency and accuracy in detecting binding sites. The results demonstrate that our method can reproduce the major binding modes found by X-ray crystallographic studies with sufficient precision. Moreover, the method can successfully identify some binding modes associated with a known inhibitor, which could not be detected by X-ray analysis. The dependence of ligand-binding modes on the ligand concentration, which essentially cannot be treated with other existing computational methods, is also investigated. The results indicate that some binding modes are readily affected by the ligand concentration, whereas others are not significantly altered. In the former case, it is the subtle balance in the binding affinity between the ligand and water that determines the dominant ligand-binding mode.
Huang, WenJuan; Blinov, Nikolay; Kovalenko, Andriy
2015-04-30
The octanol-water partition coefficient is an important physical-chemical characteristic widely used to describe hydrophobic/hydrophilic properties of chemical compounds. The partition coefficient is related to the transfer free energy of a compound from water to octanol. Here, we introduce a new protocol for prediction of the partition coefficient based on the statistical-mechanical, 3D-RISM-KH molecular theory of solvation. It was shown recently that with the compound-solvent correlation functions obtained from the 3D-RISM-KH molecular theory of solvation, the free energy functional supplemented with the correction linearly related to the partial molar volume obtained from the Kirkwood-Buff/3D-RISM theory, also called the "universal correction" (UC), provides accurate prediction of the hydration free energy of small compounds, compared to explicit solvent molecular dynamics [ Palmer , D. S. ; J. Phys.: Condens. Matter 2010 , 22 , 492101 ]. Here we report that with the UC reparametrized accordingly this theory also provides an excellent agreement with the experimental data for the solvation free energy in nonpolar solvent (1-octanol) and so accurately predicts the octanol-water partition coefficient. The performance of the Kovalenko-Hirata (KH) and Gaussian fluctuation (GF) functionals of the solvation free energy, with and without UC, is tested on a large library of small compounds with diverse functional groups. The best agreement with the experimental data for octanol-water partition coefficients is obtained with the KH-UC solvation free energy functional.
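The relation described above between the partition coefficient and the water-to-octanol transfer free energy can be made explicit in a few lines. This is a minimal illustration of the standard thermodynamic relation log P_ow = (ΔG_water − ΔG_octanol) / (ln(10)·RT), not the authors' code; the function name and the kJ/mol units are my own choices:

```python
import math

R = 8.314462618e-3  # gas constant in kJ/(mol*K)

def log_p_ow(dg_water, dg_octanol, temperature=298.15):
    """Octanol-water partition coefficient from solvation free energies.

    dg_water, dg_octanol: solvation free energies (kJ/mol) of the compound
    in each solvent, e.g. as obtained from a 3D-RISM-KH calculation with
    a volume-based correction such as UC.
    """
    return (dg_water - dg_octanol) / (math.log(10.0) * R * temperature)
```

For example, a compound whose solvation free energy is 10 kJ/mol lower in octanol than in water has log P_ow of about 1.75 at 298 K, i.e. it prefers the octanol phase.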
Yokogawa, D.
2016-09-01
Theoretical approach to design bright bio-imaging molecules is one of the most progressing ones. However, because of the system size and computational accuracy, the number of theoretical studies is limited to our knowledge. To overcome the difficulties, we developed a new method based on reference interaction site model self-consistent field explicitly including spatial electron density distribution and time-dependent density functional theory. We applied it to the calculation of indole and 5-cyanoindole at ground and excited states in gas and solution phases. The changes in the optimized geometries were clearly explained with resonance structures and the Stokes shift was correctly reproduced.
Alexander E. Kobryn
2016-04-01
Although better means to model the properties of bulk heterojunction molecular blends are much needed in the field of organic optoelectronics, only a small subset of methods based on molecular dynamics and Monte Carlo approaches has hitherto been employed to guide or replace empirical characterization and testing. Here, we present the first use of the integral equation theory of molecular liquids in modelling the structural properties of blends of phenyl-C61-butyric acid methyl ester (PCBM) with poly(3-hexylthiophene) (P3HT) and a carboxylated poly(3-butylthiophene) (P3BT), respectively. For this, we use the Reference Interaction Site Model (RISM) with the Universal Force Field (UFF) to compute the microscopic structure of the blends and obtain insight into the miscibility of their components. Input parameters for RISM, such as optimized molecular geometries and charge distributions of interaction sites, are derived by Density Functional Theory (DFT) methods. We also run Molecular Dynamics (MD) simulations to compare the diffusivity of PCBM in binary blends with P3HT and P3BT, respectively. A remarkably good agreement with available experimental data and results of alternative modelling/simulation is observed for PCBM in the P3HT system. We interpret this as a step in the validation of the use of our approach for organic photovoltaics and support of its results for new systems that do not have reference data for comparison or calibration. In particular, for the less-studied P3BT, our results show that expectations about its performance in binary blends with PCBM may be overestimated, as it does not demonstrate the required level of miscibility and short-range structural organization. In addition, the simulated mobility of PCBM in P3BT is somewhat higher than what is expected for polymer blends and falls into a range typical for fluids. The significance of our predictive multi-scale modelling lies in the insights it offers into nanoscale
Imai, Takashi; Kovalenko, Andriy; Hirata, Fumio; Kidera, Akinori
2009-06-01
It has been shown that trifluoroethanol (TFE) induces helical structure in peptides and proteins. The molecular mechanism is, however, still not completely elucidated. In this study, the TFE effects on the solvation structure and on the free energy change associated with the helix-coil transition of a polypeptide are analyzed by using the three-dimensional reference interaction site model (3D-RISM) molecular theory of solvation. The theoretical result shows that TFE preferentially solvates at low concentrations around 30 vol% both for the helix and coil structures. However, the characteristic preferential solvation is not as significant in the TFE-induced helix stabilization as generally considered. It is also found that the overall energy contributes to the free energy difference more substantially than the solvation entropy.
Yokogawa, Daisuke; Ono, Kohei; Sato, Hirofumi; Sakaki, Shigeyoshi
2011-11-14
The ligand exchange process of cis-platin in aqueous solution was studied using RISM-SCF-SEDD (reference interaction site model-self-consistent field with spatial electron density distribution) method, a hybrid approach of quantum chemistry and statistical mechanics. The analytical nature of RISM theory enables us to compute accurate reaction free energy in aqueous solution based on CCSD(T), together with the microscopic solvation structure around the complex. We found that the solvation effect is indispensable to promote the dissociation of the chloride anion from the complex.
Gaëlle Leclercq
2010-11-01
In the field of the conservation-restoration of easel paintings, retouching is a critical intervention during which a metamerism phenomenon can easily appear in the blue hues. This article proposes an alternative to the problem. After studying the set of blue pigments used in retouching, it emerges that certain pigments would reduce the risk of creating this metamerism phenomenon. The hypothesis rests on the spectral similarities between the various blue pigments used over the history of painting: it is possible to find current pigments whose spectral composition is similar to that of the traditional pigments used by the old masters, an option that would reduce the risk of creating metamerism.
Range Information Systems Management (RISM) Phase 1 Report
Bastin, Gary L.; Harris, William G.; Nelson, Richard A.
2002-01-01
RISM investigated alternative approaches, technologies, and communication network architectures to facilitate building the Spaceports and Ranges of the future. RISM started by documenting most existing US ranges and their capabilities. In parallel, RISM obtained inputs from the following: 1) NASA and NASA-contractor engineers and managers; 2) aerospace leaders from government, academia, and industry, participating through the Space Based Range Distributed System Working Group (SBRDSWG), many of whom are also; 3) members of the Advanced Range Technology Working Group (ARTWG) subgroups; and 4) members of the Advanced Spaceport Technology Working Group (ASTWG). These diverse inputs helped to envision advanced technologies for implementing future ranges and range systems that build on today's cabled and wireless legacy infrastructures while seamlessly integrating both today's emerging and tomorrow's building-block communication techniques. The fundamental key is to envision a transition to a Space Based Range Distributed Subsystem. The enabling concept is to identify the specific needs of range users that can be solved through applying emerging communication tech
Luchko, Tyler; Blinov, Nikolay; Limon, Garrett C.; Joyce, Kevin P.; Kovalenko, Andriy
2016-11-01
Implicit solvent methods for classical molecular modeling are frequently used to provide fast, physics-based hydration free energies of macromolecules. Less commonly considered is the transferability of these methods to other solvents. The Statistical Assessment of Modeling of Proteins and Ligands 5 (SAMPL5) distribution coefficient dataset and the accompanying explicit solvent partition coefficient reference calculations provide a direct test of solvent model transferability. Here we use the 3D reference interaction site model (3D-RISM) statistical-mechanical solvation theory, with a well-tested water model and a new united-atom cyclohexane model, to calculate partition coefficients for the SAMPL5 dataset. The cyclohexane model performed well in training and testing (R = 0.98 for amino acid neutral side-chain analogues), but only if a parameterized solvation free energy correction was used. In contrast, the same protocol, using single solute conformations, performed poorly on the SAMPL5 dataset, obtaining R = 0.73 compared to the reference partition coefficients, likely due to the much larger solute sizes. Including solute conformational sampling through molecular dynamics coupled with 3D-RISM (MD/3D-RISM) improved agreement with the reference calculation to R = 0.93. Since our initial calculations only considered partition coefficients and not distribution coefficients, solute sampling provided little benefit when comparing against experiment, where ionized and tautomer states are more important. Applying a simple pKa correction improved agreement with experiment from R = 0.54 to R = 0.66, despite a small number of outliers. Better agreement is possible by accounting for tautomers and improving the ionization correction.
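The pKa correction mentioned above converts a partition coefficient (which covers only the neutral species) into a distribution coefficient at a given pH. For a monoprotic acid, assuming only the neutral form partitions into the organic phase, the textbook relation is log D = log P − log10(1 + 10^(pH − pKa)). A minimal sketch of that formula (my own illustration, not the authors' protocol):

```python
import math

def log_d_acid(log_p, pka, ph=7.4):
    """Distribution coefficient of a monoprotic acid at a given pH,
    from the neutral-species partition coefficient and the aqueous pKa,
    assuming only the neutral form enters the organic phase:
        log D = log P - log10(1 + 10**(pH - pKa))
    """
    return log_p - math.log10(1.0 + 10.0 ** (ph - pka))
```

At pH = pKa the correction is exactly log10(2) ≈ 0.30; for an acid ionized well below physiological pH (e.g. pKa = 4 at pH 7.4), log D drops several units below log P, which is why ionization states dominate the comparison with experiment.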
Size-dependent adsorption sites in a Prussian blue nanoparticle: A 3D-RISM study
Ruankaew, Nirun; Yoshida, Norio; Watanabe, Yoshihiro; Nakano, Haruyuki; Phongphanphanee, Saree
2017-09-01
The specific adsorption of alkali ions, Li+, Na+, K+, and Cs+, in electrolyte solutions on Prussian blue (PB) is investigated by using the three-dimensional (3D) reference interaction site-model (RISM) theory. The results from 3D-RISM show dramatically different adsorption sites between large ions (K+ and Cs+) and small ions (Li+ and Na+). The small ions are adsorbed at the channel entrance sites without the water-ion exchange mechanism. In contrast, the large ions are adsorbed in PB by the water-ion exchange mechanism, and the adsorption site of large ions is located at the center of the cage or at the interstitial site.
Cao, Siqin [The HKUST Shenzhen Research Institute, Shenzhen (China); Department of Chemistry, Hong Kong University of Science and Technology, Clear Water Bay, Kowloon (Hong Kong); Sheong, Fu Kit [Department of Chemistry, Hong Kong University of Science and Technology, Clear Water Bay, Kowloon (Hong Kong); Huang, Xuhui, E-mail: xuhuihuang@ust.hk [The HKUST Shenzhen Research Institute, Shenzhen (China); Department of Chemistry, Hong Kong University of Science and Technology, Clear Water Bay, Kowloon (Hong Kong); Division of Biomedical Engineering, Center of Systems Biology and Human Health, Hong Kong University of Science and Technology, Clear Water Bay, Kowloon (Hong Kong)
2015-08-07
Reference interaction site model (RISM) theory has recently become a popular approach in the study of thermodynamic and structural properties of the solvent around macromolecules. On the other hand, it has been widely suggested that water density depletion exists around large hydrophobic solutes (>1 nm), and this may pose a great challenge to the RISM theory. In this paper, we develop a new analytical theory, the Reference Interaction Site Model with Hydrophobicity induced density Inhomogeneity (RISM-HI), to compute the solvent radial distribution function (RDF) around a large hydrophobic solute in water as well as in its mixtures with other polyatomic organic solvents. To achieve this, we have explicitly considered the density inhomogeneity at the solute-solvent interface using the framework of the Yvon-Born-Green hierarchy, and the RISM theory is used to obtain the solute-solvent pair correlation. In order to efficiently solve the relevant equations while maintaining reasonable accuracy, we have also developed a new closure called the D2 closure. With this new theory, the solvent RDFs around a large hydrophobic particle in water and different water-acetonitrile mixtures could be computed, and they agree well with the results of molecular dynamics simulations. Furthermore, we show that our RISM-HI theory can also efficiently compute the solvation free energy of solutes with a wide range of hydrophobicity in various water-acetonitrile solvent mixtures with reasonable accuracy. We anticipate that our theory could be widely applied to compute the thermodynamic and structural properties for the solvation of hydrophobic solutes.
Mahmoud, Hosam M
2011-01-01
A cutting-edge look at the emerging distributional theory of sorting. Research on distributions associated with sorting algorithms has grown dramatically over the last few decades, spawning many exact and limiting distributions of complexity measures for many sorting algorithms. Yet much of this information has been scattered in disparate and highly specialized sources throughout the literature. In Sorting: A Distribution Theory, leading authority Hosam Mahmoud compiles, consolidates, and clarifies the large volume of available research, providing a much-needed, comprehensive treatment of the
Georgiev, Svetlin G
2015-01-01
This book explains many fundamental ideas in the theory of distributions. The theory of partial differential equations is one of the synthetic branches of analysis that combines ideas and methods from different fields of mathematics, ranging from functional analysis and harmonic analysis to differential geometry and topology. This presents specific difficulties to those studying the field. This book, which consists of 10 chapters, is suitable for upper-undergraduate/graduate students and mathematicians seeking an accessible introduction to some aspects of the theory of distributions. It can also be used for a one-semester course.
A novel ligand-mapping method based on molecular liquid theory.
Imai, Takashi
2011-01-01
The recent development of a novel ligand-mapping method is reviewed. The method is based on a statistical-mechanical molecular theory of solvation, known as the three-dimensional reference interaction site model (3D-RISM). In the 3D-RISM-based ligand mapping (3D-RISM-LM) method, using an all-atom model for a target protein immersed in a ligand-water mixture solvent, the 3D spatial distributions of the ligand atomic sites around the protein are first obtained, and then the most probable binding modes of the ligand molecule are constructed from the distributions. Unlike conventional docking simulations, 3D-RISM-LM can incorporate the effect of water, from the atomic to the thermodynamic level, into the binding affinity through statistical mechanics. It has been demonstrated that 3D-RISM-LM can sensitively detect even weak binding modes of small molecules over the entire surface of a protein. Therefore, this approach is expected to be particularly useful in fragment-based drug design.
Economic theory of distributism
Đurković Miša
2015-01-01
In this paper, the economic theory of distributism is analyzed. First, the author explains that distributism is a school of social thought which emerged in the Anglo-American world as a development of the social teachings of the Roman Catholic Church. Although it has not received the status of the main schools of modern economic thought, distributism persists as a specific direction of socio-economic thinking. The paper particularly investigates the ideas of classical distributism. The author focuses on two basic books by Gilbert Chesterton and the two most important economic books by Hilaire Belloc. These authors insisted on the problem of society moving towards the so-called servile state, in which a small number of capitalists rule over a mass of proletarians who gradually come under a slave-like status sanctioned by law. To remedy this tendency, and collectivism, they proposed a series of measures for a renewed broad distribution of ownership of the means of production. Finally, there is an overview of this idea and its development throughout the twentieth century, finishing with contemporary distributists like John Medaille and Alan Carlson. [Projekat Ministarstva nauke Republike Srbije, br. 179014]
Commutative monads as a theory of distributions
Kock, Anders
2012-01-01
It is shown how the theory of commutative monads provides an axiomatic framework for several aspects of distribution theory in a broad sense, including probability distributions, physical extensive quantities, and Schwartz distributions of compact support. Among the particular aspects considered here are the notions of convolution, density, expectation, and conditional probability.
A nonlinear theory of tensor distributions
Vickers, J A
1998-01-01
The coordinate invariant theory of generalised functions of Colombeau and Meril is reviewed and extended to enable the construction of multi-index generalised tensor functions whose transformation laws coincide with their counterparts in classical distribution theory.
Nishiyama, Katsura; Watanabe, Yasuhiro; Yoshida, Norio; Hirata, Fumio
2013-09-01
The Stokes shift magnitudes for coumarin 153 (C153) in 13 organic solvents with various polarities have been determined by means of steady-state spectroscopy and reference interaction-site model self-consistent-field (RISM-SCF) theory. RISM-SCF calculations have reproduced experimental results fairly well, including individual solvent characteristics. It is empirically known that in some solvents, larger Stokes shift magnitudes are detected than anticipated on the basis of the solvent relative permittivity, ɛr. In practice, 1,4-dioxane (ɛr = 2.21) provides almost identical Stokes shift magnitudes to that of tetrahydrofuran (THF, ɛr = 7.58), for C153 and other typical organic solutes. In this work, RISM-SCF theory has been used to estimate the energetics of C153-solvent systems involved in the absorption and fluorescence processes. The Stokes shift magnitudes estimated by RISM-SCF theory are ∼5 kJ mol⁻¹ (∼400 cm⁻¹) less than those determined by spectroscopy; however, the results obtained are still adequate for dipole moment comparisons, in a qualitative sense. We have also calculated the solute-solvent site-site radial distributions by this theory. It is shown that solvation structures with respect to the C-O-C framework, which is common to dioxane and THF, in the near vicinity (∼0.4 nm) of specific solute sites can largely account for their similar Stokes shift magnitudes. In previous works, such solute-solvent short-range interactions have been explained in terms of the higher-order multipole moments of the solvents. Our present study shows that along with the short-range interactions that contribute most significantly to the energetics, long-range electrostatic interactions are also important. Such long-range interactions are effective up to 2 nm from the solute site, as in the case of a typical polar solvent, acetonitrile.
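The unit conversion behind the "∼5 kJ mol⁻¹ (∼400 cm⁻¹)" comparison is worth making explicit: 1 kJ/mol corresponds to about 83.6 cm⁻¹, since the wavenumber equivalent of a molar energy is E / (N_A·h·c) with c in cm/s. A small helper (my own sketch, not the authors' code):

```python
# 1 kJ/mol = 1000 / (N_A * h * c) ≈ 83.593 cm^-1 (c in cm/s)
KJ_PER_MOL_TO_WAVENUMBER = 83.593

def stokes_shift_cm1(e_abs_kj_mol, e_fl_kj_mol):
    """Stokes shift in cm^-1 from absorption and fluorescence
    transition energies given in kJ/mol."""
    return (e_abs_kj_mol - e_fl_kj_mol) * KJ_PER_MOL_TO_WAVENUMBER
```

A 5 kJ/mol discrepancy thus translates to roughly 418 cm⁻¹, consistent with the ∼400 cm⁻¹ figure quoted above.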
Information Theory and the Earth's Density Distribution
Rubincam, D. P.
1979-01-01
An argument is made for using the information theory approach as an inference technique in solid earth geophysics. A spherically symmetric density distribution is derived as an example of the method. A simple model of the earth, plus knowledge of its mass and moment of inertia, leads to a density distribution which is surprisingly close to the optimum distribution. Future directions for the information theory approach in solid earth geophysics, as well as its strengths and weaknesses, are discussed.
Distribution theory of algebraic numbers
Yang, Chung-Chun
2008-01-01
The book is a timely survey of new research results and related developments in Diophantine approximation, a division of number theory which deals with the approximation of real numbers by rational numbers. The book is appended with a list of challenging open problems and a comprehensive list of references. From the contents: Field extensions; Algebraic numbers; Algebraic geometry; Height functions; The abc-conjecture; Roth's theorem; Subspace theorems; Vojta's conjectures; L-functions.
Mathematical theories of distributed sensor networks
Iyengar, Sitharama S; Balakrishnan, N
2014-01-01
Mathematical Theory of Distributed Sensor Networks demonstrates how mathematical theories can be used to provide distributed sensor modeling and to solve important problems such as coverage hole detection and repair. The book introduces the mathematical and computational structures by discussing what they are, their applications, and how they differ from traditional systems. The text also explains how mathematics is used to provide efficient techniques implementing effective coverage, deployment, transmission, data processing, signal processing, and data protection within distributed sensor networks. Finally, the authors discuss some important challenges facing mathematics in gaining more insight into the multidisciplinary area of distributed sensor networks.
Tielker, Nicolas; Heil, Jochen; Kloss, Thomas; Ehrhart, Sebastian; Güssregen, Stefan; Schmidt, K. Friedemann; Kast, Stefan M.
2016-01-01
We predict cyclohexane–water distribution coefficients (log D7.4) for drug-like molecules taken from the SAMPL5 blind prediction challenge by the “embedded cluster reference interaction site model” (EC-RISM) integral equation theory. This task involves the coupled problem of predicting both partition coefficients (log P) of neutral species between the solvents and aqueous acidity constants (pKa) in order to account for a change of protonation states. The first issue is addressed by calibrating an EC-RISM-based model for solvation free energies derived from the “Minnesota Solvation Database” (MNSOL) for both water and cyclohexane utilizing a correction based on the partial molar volume, yielding a root mean square error (RMSE) of 2.4 kcal mol−1 for water and 0.8–0.9 kcal mol−1 for cyclohexane depending on the parametrization. The second one is treated by employing on one hand an empirical pKa model (MoKa) and, on the other hand, an EC-RISM-derived regression of published acidity constants (RMSE...
On the distribution theory for some constrained life testing experiments.
Keywords: reliability, test methods, distribution theory, mathematical models, statistical distributions, multivariate analysis, decision theory, life expectancy (service life), exponential functions, theses.
Distributed hash table theory, platforms and applications
Zhang, Hao; Xie, Haiyong; Yu, Nenghai
2013-01-01
This SpringerBrief summarizes the development of Distributed Hash Table in both academic and industrial fields. It covers the main theory, platforms and applications of this key part in distributed systems and applications, especially in large-scale distributed environments. The authors teach the principles of several popular DHT platforms that can solve practical problems such as load balance, multiple replicas, consistency and latency. They also propose DHT-based applications including multicast, anycast, distributed file systems, search, storage, content delivery network, file sharing and c
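As a concrete illustration of the key-placement idea that DHT platforms build on, here is a minimal consistent-hashing ring with virtual nodes. This is a generic sketch of the common technique, not the API of any specific platform covered in the book:

```python
import bisect
import hashlib

class HashRing:
    """Minimal consistent-hashing ring: keys and nodes are hashed onto
    the same circle, and a key is served by the first node clockwise
    from its hash. Virtual nodes smooth out the load distribution."""

    def __init__(self, nodes=(), replicas=100):
        self.replicas = replicas  # virtual nodes per physical node
        self._keys = []           # sorted ring positions
        self._ring = {}           # ring position -> physical node
        for node in nodes:
            self.add(node)

    def _hash(self, key):
        return int(hashlib.sha1(key.encode()).hexdigest(), 16)

    def add(self, node):
        for i in range(self.replicas):
            pos = self._hash(f"{node}:{i}")
            bisect.insort(self._keys, pos)
            self._ring[pos] = node

    def lookup(self, key):
        # first ring position clockwise from the key's hash (wraps around)
        i = bisect.bisect(self._keys, self._hash(key)) % len(self._keys)
        return self._ring[self._keys[i]]
```

Adding or removing one node only remaps the keys adjacent to its ring positions, which is the property that makes this scheme attractive for large-scale distributed storage.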
Angular distribution in complex oscillation theory
WU Shengjian
2005-01-01
Let f1 and f2 be two linearly independent solutions of the differential equation f″ + Af = 0, where A is an entire function. Set E = f1f2. In this paper, we shall study the angular distribution of E and establish a relation between zero accumulation rays and Borel directions of E. Consequently, we can obtain some results in complex differential equations by using known results in the angular distribution theory of meromorphic functions.
Distributed computer systems theory and practice
Zedan, H S M
2014-01-01
Distributed Computer Systems: Theory and Practice is a collection of papers dealing with the design and implementation of operating systems, including distributed systems, such as the amoeba system, argus, Andrew, and grapevine. One paper discusses the concepts and notations for concurrent programming, particularly language notation used in computer programming, synchronization methods, and also compares three classes of languages. Another paper explains load balancing or load redistribution to improve system performance, namely, static balancing and adaptive load balancing. For program effici
Learning theory of distributed spectral algorithms
Guo, Zheng-Chu; Lin, Shao-Bo; Zhou, Ding-Xuan
2017-07-01
Spectral algorithms have been widely used and studied in learning theory and inverse problems. This paper is concerned with distributed spectral algorithms, for handling big data, based on a divide-and-conquer approach. We present a learning theory for these distributed kernel-based learning algorithms in a regression framework including nice error bounds and optimal minimax learning rates achieved by means of a novel integral operator approach and a second order decomposition of inverse operators. Our quantitative estimates are given in terms of regularity of the regression function, effective dimension of the reproducing kernel Hilbert space, and qualification of the filter function of the spectral algorithm. They do not need any eigenfunction or noise conditions and are better than the existing results even for the classical family of spectral algorithms.
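The divide-and-conquer scheme analyzed in this paper can be sketched concretely for kernel ridge regression, a classical member of the spectral-algorithm family: partition the sample, solve a local regularized problem on each subset, and average the local predictors. This is a generic sketch under my own naming, not the authors' code:

```python
import numpy as np

def distributed_krr(X, y, m, lam, kernel):
    """Divide-and-conquer kernel ridge regression: split the data into
    m subsets, fit KRR locally on each, and average the local predictors.
    kernel(A, B) must return the Gram matrix between rows of A and B."""
    parts = np.array_split(np.arange(len(X)), m)
    models = []
    for idx in parts:
        K = kernel(X[idx], X[idx])
        # local regularized least-squares solve on this subset
        alpha = np.linalg.solve(K + lam * len(idx) * np.eye(len(idx)), y[idx])
        models.append((X[idx], alpha))

    def predict(X_test):
        # average the m local kernel predictors
        return np.mean([kernel(X_test, Xi) @ a for Xi, a in models], axis=0)

    return predict
```

With a small regularization parameter and data generated by a function in the kernel's hypothesis space, the averaged predictor closely matches the single-machine solution while each local solve only factorizes an (n/m) × (n/m) matrix.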
APhoRISM FP7 project: the A Priori information for Earthquake damage mapping method
Bignami, Christian; Stramondo, Salvatore; Pierdicca, Nazzareno
2014-05-01
The APhoRISM (Advanced PRocedure for volcanIc and Seismic Monitoring) project is an FP7-funded project which aims at developing and testing two new methods that combine Earth Observation satellite data from different sensors with ground data for seismic and volcanic risk management. The objective is to demonstrate that these two types of data, appropriately managed and integrated, can provide new, improved products useful for seismic and volcanic crisis management. One of the two methods deals with earthquakes: the generation of maps to detect and estimate the damage caused by a seismic event. The method is named APE (A Priori information for Earthquake damage mapping). The use of satellite data to investigate earthquake damage is not in itself new; indeed, a wide literature and many projects have addressed this issue, but the proposed approaches are usually based only on change detection techniques and/or classification algorithms. The novelty of APhoRISM-APE lies in the exploitation of a priori information derived from: InSAR time series to measure surface movements; shakemaps obtained from seismological data; and vulnerability information. This a priori information is then integrated with a change detection map from Earth observation satellite sensors (either optical or Synthetic Aperture Radar) to improve accuracy and to limit false alarms.
Applied optimal control theory of distributed systems
Lurie, K A
1993-01-01
This book represents an extended and substantially revised version of my earlier book, Optimal Control in Problems of Mathematical Physics, originally published in Russian in 1975. About 60% of the text has been completely revised and major additions have been included which have produced a practically new text. My aim was to modernize the presentation but also to preserve the original results, some of which are little known to a Western reader. The idea of composites, which is the core of the modern theory of optimization, was initiated in the early seventies. The reader will find here its implementation in the problem of optimal conductivity distribution in an MHD-generator channel flow. Since then it has emerged into an extensive theory which is undergoing a continuous development. The book does not pretend to be a textbook, neither does it offer a systematic presentation of the theory. Rather, it reflects a concept which I consider as fundamental in the modern approach to the optimization of distributed systems. ...
A closure relation to molecular theory of solvation for macromolecules
Kobryn, Alexander E.; Gusarov, Sergey; Kovalenko, Andriy
2016-10-01
We propose a closure to the integral equations of molecular theory of solvation, particularly suitable for polar and charged macromolecules in electrolyte solution. This includes such systems as oligomeric polyelectrolytes at a finite concentration in aqueous and various non-aqueous solutions, as well as drug-like compounds in solution. The new closure by Kobryn, Gusarov, and Kovalenko (KGK closure) imposes the mean spherical approximation (MSA) almost everywhere in the solvation shell but levels out the density distribution function to zero (with the continuity at joint boundaries) inside the repulsive core and in the spatial regions of strong density depletion emerging due to molecular associative interactions. Similarly to MSA, the KGK closure reduces the problem to a linear equation for the direct correlation function which is predefined analytically on most of the solvation shells and has to be determined numerically on a relatively small (three-dimensional) domain of strong depletion, typically within the repulsive core. The KGK closure leads to the solvation free energy in the form of the Gaussian fluctuation (GF) functional. We first test the performance of the KGK closure coupled to the reference interaction site model (RISM) integral equations on the examples of Lennard-Jones liquids, polar and nonpolar molecular solvents, including water, and aqueous solutions of simple ions. The solvation structure, solvation chemical potential, and compressibility obtained from RISM with the KGK closure favorably compare to the results of the hypernetted chain (HNC) and Kovalenko-Hirata (KH) closures, including their combination with the GF solvation free energy. We then use the KGK closure coupled to RISM to obtain the solvation structure and thermodynamics of oligomeric polyelectrolytes and drug-like compounds at a finite concentration in electrolyte solution, for which no convergence is obtained with other closures. For comparison, we calculate their solvation
Semi-stable distributions in free probability theory
[No author listed]
2006-01-01
Semi-stable distributions, in classical probability theory, are characterized as limiting distributions of subsequences of normalized partial sums of independent and identically distributed random variables. We establish the noncommutative counterpart of semi-stable distributions. We study the characterization of noncommutative semi-stability through free cumulant transform and develop the free semi-stability and domain of semi-stable attraction in free probability theory.
Applying Distributed Learning Theory in Online Business Communication Courses.
Walker, Kristin
2003-01-01
Focuses on the critical use of technology in online formats that entail relatively new teaching media. Argues that distributed learning theory is valuable for teachers of online business communication courses for several reasons. Discusses the application of distributed learning theory to the teaching of business communication online. (SG)
Diffraction Theory and Almost Periodic Distributions
Strungaru, Nicolae; Terauds, Venta
2016-09-01
We introduce and study the notions of translation bounded tempered distributions, and autocorrelation for a tempered distribution. We further introduce the spaces of weakly, strongly and null weakly almost periodic tempered distributions and show that for weakly almost periodic tempered distributions the Eberlein decomposition holds. For translation bounded measures all these notions coincide with the classical ones. We show that tempered distributions with measure Fourier transform are weakly almost periodic and that for this class, the Eberlein decomposition is exactly the Fourier dual of the Lebesgue decomposition, with the Fourier-Bohr coefficients specifying the pure point part of the Fourier transform. We complete the project by looking at a few interesting examples.
Central Place Theory and Distribution of Post Offices in Cities
[No author listed]
2000-01-01
The feasibility of applying the Central Place Theory to the distribution of post offices in cities is analysed; the hierarchical structure and spatial distribution of post offices in the city of Shijiazhuang are studied. The research results demonstrate the practical value of the Central Place Theory, and suggestions for adjusting the spatial distribution of post offices in Shijiazhuang are put forward.
The Schlueter distribution: theory and simulation
Shutler, P M E; Springham, S V; Martinez, J C [National Institute of Education, Nanyang Technological University, 1 Nanyang Walk, Singapore 637616 (Singapore)
2007-11-15
The distribution of molecular speeds for a hard-sphere gas in the microcanonical ensemble follows the Schlueter distribution when the number of molecules is small, converging to the classical Maxwell distribution in the large-number limit. We present a derivation of the Schlueter distribution, obtained from Khinchin's derivation of the factorization of the density of states, which is simpler and shorter than those currently available. We also verify its predictions for three-dimensional (3D) hard spheres using a desktop computer simulation, whereas previous studies have simulated only 2D hard discs.
Continuous and distributed systems theory and applications
Sadovnichiy, Victor
2014-01-01
In this volume, the authors close the gap between abstract mathematical approaches, such as abstract algebra, number theory, nonlinear functional analysis, partial differential equations, methods of nonlinear and multi-valued analysis, on the one hand, and practical applications in nonlinear mechanics, decision making theory and control theory on the other. Readers will also benefit from the presentation of modern mathematical modeling methods for the numerical solution of complicated engineering problems in hydromechanics, geophysics and mechanics of continua. This compilation will be of interest to mathematicians and engineers working at the interface of these fields. It presents selected works of the open seminar series of Lomonosov Moscow State University and the National Technical University of Ukraine "Kyiv Polytechnic Institute". The authors come from Germany, Italy, Spain, Russia, Ukraine, and the USA.
Small molecule hydration energy and entropy from 3D-RISM
Johnson, J.; Case, D. A.; Yamazaki, T.; Gusarov, S.; Kovalenko, A.; Luchko, T.
2016-09-01
Implicit solvent models offer an attractive way to estimate the effects of a solvent environment on the properties of small or large solutes without the complications of explicit simulations. One common test of accuracy is to compute the free energy of transfer from gas to liquid for a variety of small molecules, since many of these values have been measured. Studies of the temperature dependence of these values (i.e. solvation enthalpies and entropies) can provide additional insights into the performance of implicit solvent models. Here, we show how to compute temperature derivatives of hydration free energies for the 3D-RISM integral equation approach. We have computed hydration free energies of 1123 small drug-like molecules (both neutral and charged). Temperature derivatives were also used to calculate hydration energies and entropies of 74 of these molecules (both neutral and charged) for which experimental data are available. While the direct results agree rather poorly with experiment, we have found that several previously proposed linear hydration free energy correction schemes give good agreement with experiment. These corrections also provide good agreement for hydration energies and entropies, though simple extensions are required in some cases.
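The record above recovers hydration entropies and enthalpies from temperature derivatives of the hydration free energy. A minimal sketch of that thermodynamic relation, using a central finite difference: ΔS = −∂ΔG/∂T and ΔH = ΔG + TΔS. The `delta_g` function below is a hypothetical stand-in for a 3D-RISM free-energy calculation, not the paper's actual model.

```python
# Hedged sketch: hydration entropy and enthalpy from the temperature
# dependence of the hydration free energy. delta_g is an illustrative
# smooth model (kcal/mol as a function of T in kelvin), NOT real
# 3D-RISM output.

def delta_g(T):
    # Hypothetical linear temperature dependence around 298.15 K.
    return -6.3 + 0.021 * (T - 298.15)

def hydration_thermodynamics(T, dT=1.0):
    """Central finite difference: dS = -d(dG)/dT, then dH = dG + T*dS."""
    dG = delta_g(T)
    dS = -(delta_g(T + dT) - delta_g(T - dT)) / (2.0 * dT)
    dH = dG + T * dS
    return dG, dS, dH

dG, dS, dH = hydration_thermodynamics(298.15)
print(f"dG = {dG:.2f} kcal/mol, dS = {dS:.4f} kcal/(mol K), dH = {dH:.2f} kcal/mol")
```

For a linear ΔG(T) the finite difference is exact; in practice one would evaluate the solver at a few nearby temperatures and fit.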
The neoclassical theory of growth and distribution
Robert M. Solow
2000-12-01
The paper surveys the neoclassical theory of growth. As a preliminary, the meaning of the adjective "neoclassical" is discussed. The basic model is then sketched, and the conditions ensuring a stationary state are illustrated. The issue of the convergence to a stationary state (and that of the speed of convergence) is further considered. A discussion of "primary factors" opens the way to the "new" theory of growth, with endogenous technical progress. A number of extensions of the basic model are then recalled: two-sector and multi-sectoral models, overlapping generations models, and the role of money in growth models.
Tielker, Nicolas; Tomazic, Daniel; Heil, Jochen; Kloss, Thomas; Ehrhart, Sebastian; Güssregen, Stefan; Schmidt, K. Friedemann; Kast, Stefan M.
2016-11-01
We predict cyclohexane-water distribution coefficients (log D7.4) for drug-like molecules taken from the SAMPL5 blind prediction challenge by the "embedded cluster reference interaction site model" (EC-RISM) integral equation theory. This task involves the coupled problem of predicting both partition coefficients (log P) of neutral species between the solvents and aqueous acidity constants (pKa) in order to account for a change of protonation states. The first issue is addressed by calibrating an EC-RISM-based model for solvation free energies derived from the "Minnesota Solvation Database" (MNSOL) for both water and cyclohexane, utilizing a correction based on the partial molar volume, yielding a root mean square error (RMSE) of 2.4 kcal/mol for water and 0.8-0.9 kcal/mol for cyclohexane, depending on the parametrization. The second is treated by employing, on one hand, an empirical pKa model (MoKa) and, on the other hand, an EC-RISM-derived regression of published acidity constants (RMSE of 1.5 for a single model covering acids and bases). In total, at most 8 adjustable parameters are necessary (2-3 for each solvent and two for the pKa) for training the solvation and acidity models. Applying the final models to the log D7.4 dataset corresponds to evaluating an independent test set comprising other, composite observables, yielding RMSEs, for different cyclohexane parametrizations, of 2.0-2.1 with the first and 2.2-2.8 with the combined first and second SAMPL5 data set batches. Notably, a pure log P model (assuming neutral species only) performs statistically similarly for these particular compounds. The nature of the approximations and possible perspectives for future developments are discussed.
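The record above couples a neutral-species log P with a pKa to obtain log D at pH 7.4. The textbook relation behind that coupling can be sketched as follows; this is the standard monoprotic formula under the assumption that the ionized form partitions entirely into water, not the paper's EC-RISM machinery, and the parameter values are illustrative.

```python
# Hedged sketch: log D from log P and pKa for a monoprotic compound,
# assuming the charged species has negligible cyclohexane solubility.
import math

def log_d(log_p, pka, ph=7.4, acid=True):
    """log D = log P - log10(1 + 10**(pH - pKa)) for an acid;
    for a base the exponent is (pKa - pH)."""
    expo = (ph - pka) if acid else (pka - ph)
    return log_p - math.log10(1.0 + 10.0 ** expo)

# A weak acid with pKa 4.4 loses about three log units at pH 7.4:
print(f"{log_d(2.0, 4.4):.2f}")  # -> -1.00
```

A compound with pKa far above the pH (for an acid) stays essentially neutral, so log D approaches log P, which is why a pure log P model can perform similarly for some compound sets.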
Integration in superspace using distribution theory
Coulembier, K; De Bie, H; Sommen, F [Clifford Research Group, Department of Mathematical Analysis Faculty of Engineering, Ghent University, Krijgslaan 281, 9000 Gent (Belgium)], E-mail: Coulembier@cage.ugent.be, E-mail: Hendrik.DeBie@UGent.be, E-mail: fs@cage.ugent.be
2009-10-02
In this paper, a new class of Cauchy integral formulae in superspace is obtained, using formal expansions of distributions. This allows us to solve five open problems in the study of harmonic and Clifford analysis in superspace.
Raney Distributions and Random Matrix Theory
Forrester, Peter J.; Liu, Dang-Zheng
2015-03-01
Recent works have shown that the family of probability distributions with moments given by the Fuss-Catalan numbers permit a simple parameterized form for their density. We extend this result to the Raney distribution which by definition has its moments given by a generalization of the Fuss-Catalan numbers. Such computations begin with an algebraic equation satisfied by the Stieltjes transform, which we show can be derived from the linear differential equation satisfied by the characteristic polynomial of random matrix realizations of the Raney distribution. For the Fuss-Catalan distribution, an equilibrium problem characterizing the density is identified. The Stieltjes transform for the limiting spectral density of the singular values squared of the matrix product formed from inverse standard Gaussian matrices, and standard Gaussian matrices, is shown to satisfy a variant of the algebraic equation relating to the Raney distribution. Supported on , we show that it too permits a simple functional form upon the introduction of an appropriate choice of parameterization. As an application, the leading asymptotic form of the density as the endpoints of the support are approached is computed, and is shown to have some universal features.
Universality of the Distribution Functions of Random Matrix Theory. II
Tracy, Craig A.; Widom, Harold
1999-01-01
This paper is a brief review of recent developments in random matrix theory. Two aspects are emphasized: the underlying role of integrable systems and the occurrence of the distribution functions of random matrix theory in diverse areas of mathematics and physics.
Localization theory of distributed fiber vibration sensor
Weimin Chen; Yuanyuan Xie; Peng Zhang; Lei Lin
2009-01-01
Based on a Sagnac interferometer, a simple distributed optical fiber sensing system with a sub-loop is presented to monitor vibration applied to the sensing fiber. By introducing a sub-loop, three output interference beams with different delay times are obtained. The location of the vibration is analyzed through mathematical-physical equations. The vibration frequency, amplitude, and location are theoretically simulated. The results agree well with previous experiments.
Nishihara, S.; Otani, M.
2017-09-01
We present two hybrid solvation models for the calculation of the solvation structure with model 1 in a confined nanospace in bulk materials and model 2 at solid/liquid interfaces where an electrode is in contact with an electrolyte and a membrane is immersed into a solution. The hybrid theory is based on the reference interaction site method (RISM) for the solvent region. The electronic structure of a bulk material, an electrode, and a membrane is treated by density functional theory with the plane-wave basis and pseudopotentials technique. For model 1, we use the three-dimensional RISM (3D-RISM) by imposing a 3D periodic boundary condition on the system. However, for model 2, we reformulate the RISM by means of a two-dimensional boundary condition parallel to the surface and an open boundary condition normal to the surface. Four benchmark calculations are performed for the formaldehyde-water system, water packed into a zeolite framework, a NaCl solution in contact with an Al electrode, and an Al thin film immersed in a NaCl solution with different concentrations. The calculations are shown to be efficient and stable. Because of the flexibility of the RISM theory, the models are considered to be applicable to a wide range of solid/liquid interfaces.
Distributed Leadership through the Lens of Activity Theory
Yuen, Jeanne Ho Pau; Victor Chen, Der-Thanq; Ng, David
2016-01-01
Purpose: Using Activity Theory as an interpretive lens to examine the distribution of leadership, this paper shares a case study on how leadership for an ICT project was distributed in a Singapore school. Method: The case study involved observations of 49 meetings and 34 interviews of leaders and the teachers who were involved in the ICT project.…
Product Distribution Theory and Semi-Coordinate Transformations
Airiau, Stephane; Wolpert, David H.
2004-01-01
Product Distribution (PD) theory is a new framework for doing distributed adaptive control of a multiagent system (MAS). We introduce the technique of "coordinate transformations" in PD theory gradient descent. These transformations selectively couple a few agents with each other into "meta-agents". Intuitively, this can be viewed as a generalization of forming binding contracts between those agents. Doing this sacrifices a bit of the distributed nature of the MAS, in that there must now be communication from multiple agents in determining what joint move is finally implemented. However, as we demonstrate in computer experiments, these transformations improve the performance of the MAS.
Theory of the sea ice thickness distribution
Toppaladoddi, Srikanth
2015-01-01
We use concepts from statistical physics to transform the original evolution equation for the sea ice thickness distribution $g(h)$ due to Thorndike et al. (1975) into a Fokker-Planck like conservation law. The steady solution is $g(h) = \mathcal{N}(q) h^q \mathrm{e}^{-h/H}$, where $q$ and $H$ are expressible in terms of moments over the transition probabilities between thickness categories. The solution exhibits the functional form used in observational fits and shows that for $h \ll 1$, $g(h)$ is controlled by both thermodynamics and mechanics, whereas for $h \gg 1$ only mechanics controls $g(h)$. Finally, we derive the underlying Langevin equation governing the dynamics of the ice thickness $h$, from which we predict the observed $g(h)$. The genericity of our approach provides a framework for studying the geophysical scale structure of the ice pack using methods of broad relevance in statistical mechanics.
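The steady solution quoted in the record above, g(h) = N(q) h^q exp(-h/H), is a gamma density with shape q+1 and scale H, so the normalization is N(q) = 1/(H^(q+1) Γ(q+1)). A small sketch checking this numerically; the values q = 1.5 and H = 2.0 are illustrative, not fitted to sea-ice observations.

```python
# Hedged sketch: the steady-state ice-thickness distribution
# g(h) = N(q) * h**q * exp(-h/H), normalized as a gamma density
# with shape q+1 and scale H. Parameter values are illustrative.
import math

def g(h, q=1.5, H=2.0):
    norm = 1.0 / (H ** (q + 1) * math.gamma(q + 1))
    return norm * h ** q * math.exp(-h / H)

# Crude trapezoidal check that g integrates to ~1 over [0, 60].
hs = [i * 0.01 for i in range(6001)]
area = sum(0.01 * 0.5 * (g(a) + g(b)) for a, b in zip(hs, hs[1:]))
print(f"normalization ~ {area:.4f}")
```

For q > 0 the density vanishes at h = 0 and decays exponentially for h much larger than H, matching the paper's statement that thin and thick ice are governed by different balances.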
Species distributions, quantum theory, and the enhancement of biodiversity measures
Real, Raimundo; Barbosa, A. Márcia; Bull, Joseph William
2016-01-01
… differently to similar environmental conditions at different places or moments, so their distribution is, in principle, not completely predictable. We argue that this uncertainty exists, and warrants considering species distributions as analogous to coherent quantum objects, whose distributions are better … biodiversity". We show how conceptualizing species' distributions in this way could help overcome important weaknesses in current biodiversity metrics, both in theory and by using a worked case study of mammal distributions in Spain over the last decade. We propose that considerable theoretical advances could … eventually be gained through interdisciplinary collaboration between biogeographers and quantum physicists. [Biogeography; favorability; physics; predictability; probability; species occurrence; uncertainty; wavefunction …]
Kappa distributions: theory and applications in space plasmas
Pierrard, V
2010-01-01
Particle velocity distribution functions (VDF) in space plasmas often show non-Maxwellian suprathermal tails decreasing as a power law of the velocity. Such distributions are well fitted by the so-called Kappa distribution. The presence of such distributions in different space plasmas suggests a universal mechanism for the creation of these suprathermal tails. Different theories have been proposed and are recalled in this review paper. The suprathermal particles have important consequences for the acceleration and the temperature, which are well evidenced by the kinetic approach, where no closure requires the distributions to be nearly Maxwellian. Moreover, the suprathermal particles play an important role in wave-particle interactions.
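The record above describes the power-law suprathermal tail of the Kappa distribution. A minimal sketch in one standard normalization (assumed here; conventions vary across the literature), with speeds in units of the thermal speed θ, showing how strongly the kappa = 3 tail exceeds the Maxwellian limit:

```python
# Hedged sketch: isotropic 3D Kappa velocity distribution, normalized to
# unit density, with v in units of the thermal speed theta. The kappa ->
# infinity limit is the Maxwellian. Normalization convention is assumed.
import math

def kappa_vdf(v, kappa):
    A = math.gamma(kappa + 1) / (kappa ** 1.5 * math.gamma(kappa - 0.5))
    return A / math.pi ** 1.5 * (1.0 + v * v / kappa) ** (-(kappa + 1))

def maxwellian(v):
    return math.exp(-v * v) / math.pi ** 1.5

# Suprathermal tail: at v = 5 thermal speeds the kappa = 3 distribution
# exceeds the Maxwellian by many orders of magnitude.
ratio = kappa_vdf(5.0, 3.0) / maxwellian(5.0)
print(f"tail enhancement at v = 5 theta: {ratio:.3e}")
```

The power-law falloff (v^(-2(kappa+1)) at large v) versus the Gaussian falloff is exactly the "suprathermal tail" the review discusses; for large kappa the two curves coincide.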
Critique of the neoclassical theory of growth and distribution
Luigi L. Pasinetti
2000-12-01
The paper surveys the main theories of income distribution in their relationship with the theories of economic growth. First, the Classical approach is considered, focusing on the Ricardian theory. Then the neoclassical theory is discussed, highlighting its origins (Böhm-Bawerk, Wicksell, Clark) and the role of the aggregate production function. The emergence of a "Keynesian" theory of income distribution in the wake of Harrod's model of growth is then recalled, together with the surprising resurgence of the neoclassical theory (following the contributions of Solow and Meade). But, as the paper shows, the neoclassical theory of income distribution lacks logical consistency and has shaky foundations, as has been revealed by the severe critiques moved against the neoclassical production function. Mainstream economic literature circumvents this problem by simply ignoring it, while the models of endogenous growth exclude the issue of distribution theory from their consideration. However, while mainstream economics bypasses the problems of income distribution, this is too relevant an issue to be ignored, and a number of new research lines, briefly surveyed, try new approaches to it.
Product Distribution Theory for Control of Multi-Agent Systems
Lee, Chia Fan; Wolpert, David H.
2004-01-01
Product Distribution (PD) theory is a new framework for controlling Multi-Agent Systems (MAS's). First we review one motivation of PD theory, as the information-theoretic extension of conventional full-rationality game theory to the case of bounded rational agents. In this extension the equilibrium of the game is the optimizer of a Lagrangian of the (probability distribution of the) joint state of the agents. Accordingly we can consider a team game in which the shared utility is a performance measure of the behavior of the MAS. For such a scenario the game is at equilibrium - the Lagrangian is optimized - when the joint distribution of the agents optimizes the system's expected performance. One common way to find that equilibrium is to have each agent run a reinforcement learning algorithm. Here we investigate the alternative of exploiting PD theory to run gradient descent on the Lagrangian. We present computer experiments validating some of the predictions of PD theory for how best to do that gradient descent. We also demonstrate how PD theory can improve performance even when we are not allowed to rerun the MAS from different initial conditions, a requirement implicit in some previous work.
Theory and practice of runoff space-time distribution
WANG; Hao; WANG; Chengming; WANG; Jianhua; QIN; Dayong; ZH
2004-01-01
Based on relevant domestic and foreign research, this paper proposes a runoff space-time distribution theory of clear scientific significance and practical value. On the basis of digital basin unit cells derived from the digital elevation model (DEM) and an assumption of linear confluence, this theory has been applied successfully to runoff correlation research in humid regions. In order to test the adaptability of the theory in arid and semi-arid regions, it is applied here to a runoff correlation analysis of the Wuding River basin, a tributary of the Yellow River basin, and has gained preliminary effective verification.
FOURIER SERIES AND CHEBYSHEV POLYNOMIALS IN STATISTICAL DISTRIBUTION THEORY.
After the elementary functions, the Fourier series are the most important functions in applied mathematics. Nevertheless, they have been somewhat neglected in statistical distribution theory. In this paper, the reasons for this omission are investigated and certain modifications of the Fourier series proposed. These results are presented in the form of representation theorems. In addition to the basic theorems, computational algorithms and
A Positive and a Normative Theory of Income Distribution
J. Tinbergen (Jan)
1970-01-01
A positive theory of income distribution based on assumptions concerning the supply of and demand for each type of productive service is presented. The demand function of the organizers of production may be derived from the maximization of profits with the income scale and the production
Marshall-Olkin Distributions: Advances in Theory and Applications
Durante, Fabrizio; Mulinacci, Sabrina
2015-01-01
This book presents the latest advances in the theory and practice of Marshall-Olkin distributions. These distributions have been increasingly applied in statistical practice in recent years, as they make it possible to describe interesting features of stochastic models like non-exchangeability, tail dependencies and the presence of a singular component. The book presents cutting-edge contributions in this research area, with a particular emphasis on financial and economic applications. It is recommended for researchers working in applied probability and statistics, as well as for practitioners interested in the use of stochastic models in economics. This volume collects selected contributions from the conference “Marshall-Olkin Distributions: Advances in Theory and Applications,” held in Bologna on October 2-3, 2013.
Value distribution theory and the research of Yang Lo
Hayman, W. K.
2010-01-01
Value distribution theory is concerned with the position and frequency of solutions of the equation f(z) = a. Here f may be entire, i.e. an everywhere convergent power series or meromorphic, i.e. the ratio of two such series, or a function in some other domains, such as an angle or a disk. Yang Lo’s significant contributions to this area will be highlighted. Some of his important contributions to normal families will also be described.
Mean distribution approach to spin and gauge theories
Akerlund, Oscar
2016-01-01
We formulate self-consistency equations for the distribution of links in spin models and of plaquettes in gauge theories. This improves upon known mean-field, mean-link, and mean-plaquette approximations in that we self-consistently determine all moments of the considered variable instead of just the first. We give examples in both Abelian and non-Abelian cases.
The effectiveness of mean-field theory for avalanche distributions
Lee, Edward; Raju, Archishman; Sethna, James
We explore the mean-field theory of the pseudogap found in avalanche systems with long-range anisotropic interactions using analytical and numerical tools. The pseudogap in the density of low-stability states emerges from the competition between stabilizing interactions between spins in an avalanche and the destabilizing random movement towards the threshold caused by anisotropic couplings. Pazmandi et al. have shown that for the Sherrington-Kirkpatrick model, the pseudogap scales linearly and produces a distribution of avalanche sizes with exponent τ = 1, in contrast with the τ = 3/2 predicted from the RFIM. Lin et al. have argued that the scaling exponent of the pseudogap depends on the tail of the distribution of couplings and on non-universal quantities such as the strain rate and the magnitude of the coupling strength. Yet others have argued that the relationship between the pseudogap scaling and the distribution of avalanche sizes depends on dynamical details. Despite the theoretical arguments, the class of RFIM mean-field models is surprisingly good at predicting the distribution of avalanche sizes in a variety of different magnetic systems. We investigate these differences with a combination of theory and simulation.
Sugita, Masatake; Hirata, Fumio
2016-09-01
A protocol to calculate the binding free energy of a host-guest system is proposed based on the MM/3D-RISM method, taking cyclodextrin derivatives and their ligands as model systems. The protocol involves a procedure to identify the most probable binding mode (MPBM) of receptors and ligands by means of the umbrella sampling method. The binding free energies calculated by the MM/3D-RISM method for the complexes of the seven ligands with the MPBM of the cyclodextrin, and with the fluctuated structures around it, are in semi-quantitative agreement with the corresponding experimental data. This suggests that the protocol proposed here is promising for predicting the binding affinity of a small ligand to a relatively rigid receptor such as cyclodextrin.
Vidal, Daniel
2008-01-01
Esotericism in all its forms, up to the contemporary formulation of the "New Age", has never ceased to challenge, and to worry, a Catholic Church assured of its canons and its dogmas. In 2002, the Pontifical Councils for Interreligious Dialogue and for Culture published the report Jésus Christ le porteur d'eau vive: une réflexion chrétienne sur le « Nouvel Age » (Jesus Christ the Bearer of the Water of Life: a Christian reflection on the "New Age"), which expresses the "Catholic, essentialist, doctrinal and heresiological perception of esotericism", of yesterday and of...
Towards Resource Theory of Coherence in Distributed Scenarios
Streltsov, Alexander; Rana, Swapan; Bera, Manabendra Nath; Lewenstein, Maciej
2017-01-01
The search for a simple description of fundamental physical processes is an important part of quantum theory. One example for such an abstraction can be found in the distant lab paradigm: if two separated parties are connected via a classical channel, it is notoriously difficult to characterize all possible operations these parties can perform. This class of operations is widely known as local operations and classical communication. Surprisingly, the situation becomes comparably simple if the more general class of separable operations is considered, a finding that has been extensively used in quantum information theory for many years. Here, we propose a related approach for the resource theory of quantum coherence, where two distant parties can perform only measurements that do not create coherence and can communicate their outcomes via a classical channel. We call this class local incoherent operations and classical communication. While the characterization of this class is also difficult in general, we show that the larger class of separable incoherent operations has a simple mathematical form, yet still preserves the main features of local incoherent operations and classical communication. We demonstrate the relevance of our approach by applying it to three different tasks: assisted coherence distillation, quantum teleportation, and single-shot quantum state merging. We expect that the results we obtain in this work also transfer to other concepts of coherence that are discussed in recent literature. The approach we present here opens new ways to study the resource theory of coherence in distributed scenarios.
Chiral perturbation theory for generalized parton distributions and baryon distribution amplitudes
Wein, Philipp
2016-05-06
In this thesis we apply low-energy effective field theory to the first moments of generalized parton distributions and to baryon distribution amplitudes, which are both highly relevant for the parametrization of the nonperturbative part in hard processes. These quantities yield complementary information on hadron structure, since the former treat hadrons as a whole and, thus, give information about the (angular) momentum carried by an entire parton species on average, while the latter parametrize the momentum distribution within an individual Fock state. By performing one-loop calculations within covariant baryon chiral perturbation theory, we obtain sensible parametrizations of the quark mass dependence that are ideally suited for the subsequent analysis of lattice QCD data.
Toward a theory of distributed word expert natural language parsing
Rieger, C.; Small, S.
1981-01-01
An approach to natural language meaning-based parsing in which the unit of linguistic knowledge is the word rather than the rewrite rule is described. In the word expert parser, knowledge about language is distributed across a population of procedural experts, each representing a word of the language, and each an expert at diagnosing that word's intended usage in context. The parser is structured around a coroutine control environment in which the generator-like word experts ask questions and exchange information in coming to collective agreement on sentence meaning. The word expert theory is advanced as a better cognitive model of human language expertise than the traditional rule-based approach. The technical discussion is organized around examples taken from the prototype LISP system which implements parts of the theory.
Distribution theory with applications in engineering and physics
Teodorescu, Petre P; Toma, Antonela
2013-01-01
In this comprehensive monograph, the authors apply modern mathematical methods to the study of mechanical and physical phenomena or techniques in acoustics, optics, and electrostatics, where classical mathematical tools fail. They present a general method of approaching problems, pointing out different aspects and difficulties that may occur. With respect to the theory of distributions, only the results and the principal theorems are given, as well as some mathematical results. The book also systematically deals with a large number of applications to problems of general Newtonian mechanics,
Latitudinal phytoplankton distribution and the neutral theory of biodiversity
Chust, Guillem
2012-11-16
Recent studies have suggested that global diatom distributions are not limited by dispersal, in the case of both extant species and fossil species, but rather that environmental filtering explains their spatial patterns. Hubbell's neutral theory of biodiversity provides a framework in which to test these alternatives. Our aim is to test whether the structure of marine phytoplankton (diatoms, dinoflagellates and coccolithophores) assemblages across the Atlantic agrees with neutral theory predictions. We asked: (1) whether intersite variance in phytoplankton diversity is explained predominantly by dispersal limitation or by environmental conditions; and (2) whether species abundance distributions are consistent with those expected by the neutral model. Location: Meridional transect of the Atlantic (50° N-50° S). Methods: We estimated the relative contributions of environmental factors and geographic distance to phytoplankton composition using similarity matrices, Mantel tests and variation partitioning of the species composition based upon canonical ordination methods. We compared the species abundance distribution of phytoplankton with the neutral model using Etienne's maximum-likelihood inference method. Results: Phytoplankton communities are slightly more determined by niche segregation (24%) than by dispersal limitation and ecological drift (17%). In 60% of communities, the assumption of neutrality in species' abundance distributions could not be rejected. In tropical zones, where oceanic gyres enclose large stable water masses, most communities showed low species immigration rates; in contrast, we infer that communities in temperate areas, out of oligotrophic gyres, have higher rates of species immigration. Conclusions: Phytoplankton community structure is consistent with partial niche assembly and partial dispersal and drift assembly (neutral processes). The role of dispersal limitation is almost as important as habitat filtering, a fact that has been
Spatiotemporal Chaos in Distributed Systems: Theory and Practice
Pavlos, George P.; Iliopoulos, A. C.; Tsoutsouras, V. G.; Karakatsanis, L. P.; Pavlos, E. G.
This paper presents theoretical and experimental results concerning the hypothesis of spatiotemporal chaos in distributed physical systems far from equilibrium. Modern tools of nonlinear time series analysis, such as the correlation dimension and the maximum Lyapunov exponent, were applied to various time series corresponding to different physical systems, such as space plasmas (solar flares, magnetic and electric field components), the lithosphere-faults system (earthquakes), and brain and cardiac dynamics during or without epileptic episodes. Furthermore, the method of surrogate data was used for the exclusion of 'pseudo chaos' caused by the nonlinear distortion of a purely stochastic process. The results of the nonlinear analysis presented in this study constitute experimental evidence for significant phenomena indicated by the theory of nonequilibrium dynamics, such as nonequilibrium phase transition, chaotic synchronization, chaotic intermittency, directed percolation, defect turbulence, spinodal nucleation and clustering.
Orbit limited theory in the solar wind - κ distributions
Martinović M.M.
2016-01-01
When a solid object is immersed into ionized gas, it is brought to a certain value of electrostatic potential and surrounded by a space charge region called a 'plasma sheath'. Through this region, particles are attracted to or repelled from the surface of the charge-collecting object. For collisionless plasma, this process is described by the so-called orbit limited theory, which explains how the collection of particles is determined by the collector geometry and the plasma velocity distribution function (VDF). In this article, we provide explicit expressions for orbit-limited currents for generalized Lorentzian (κ) distributions. This work is useful for describing the charging processes of objects in non-collisional plasmas like the solar wind, where the electron VDF is often observed to exhibit quasi power-law populations of suprathermal particles. It is found that these 'suprathermals' considerably increase the charge collection. Since the surface charging process that determines the value of the electrostatic potential is also affected by the plasma VDF, calculation of the collector potential in the solar wind is described along with some quantitative predictions. [Projekat Ministarstva nauke Republike Srbije, br. 176002]
Beyond Flory theory: Distribution functions for interacting lattice trees
Rosa, Angelo; Everaers, Ralf
2017-01-01
While Flory theories [J. Isaacson and T. C. Lubensky, J. Physique Lett. 41, 469 (1980), 10.1051/jphyslet:019800041019046900; M. Daoud and J. F. Joanny, J. Physique 42, 1359 (1981), 10.1051/jphys:0198100420100135900; A. M. Gutin et al., Macromolecules 26, 1293 (1993), 10.1021/ma00058a016] provide an extremely useful framework for understanding the behavior of interacting, randomly branching polymers, the approach is inherently limited. Here we use a combination of scaling arguments and computer simulations to go beyond a Gaussian description. We analyze distribution functions for a wide variety of quantities characterizing the tree connectivities and conformations for the four different statistical ensembles which we have studied numerically in [A. Rosa and R. Everaers, J. Phys. A: Math. Theor. 49, 345001 (2016), 10.1088/1751-8113/49/34/345001 and J. Chem. Phys. 145, 164906 (2016), 10.1063/1.4965827]: (a) ideal randomly branching polymers, (b) 2d and 3d melts of interacting randomly branching polymers, (c) 3d self-avoiding trees with annealed connectivity, and (d) 3d self-avoiding trees with quenched ideal connectivity. In particular, we investigate the distributions (i) p_N(n) of the weight, n, of branches cut from trees of mass N by severing randomly chosen bonds; (ii) p_N(l) of the contour distances, l, between monomers; (iii) p_N(r) of spatial distances, r, between monomers; and (iv) p_N(r|l) of the end-to-end distance of paths of length l. Data for different tree sizes superimpose when expressed as functions of suitably rescaled observables x = r/⟨r²⟩^{1/2} or x = l/⟨l⟩. In particular, we observe a generalized Kramers relation for the branch weight distributions (i) and find that all the other distributions (ii)-(iv) are of Redner-des Cloizeaux type, q(x) = C|x|^θ exp(−(K|x|)^t). We propose a coherent framework, including generalized Fisher-Pincus relations, relating most of the RdC exponents to each other and to the contact and Flory
Devanthery, N.; Luzi, G.; Stramondo, S.; Bignami, C.; Pierdicca, N.; Wegmuller, U.; Romaniello, V.; Anniballe, R.; Piscini, A.; Albano, M.; Moro, M.; Crosetto, M.
2016-08-01
The estimate of damage after an earthquake using spaceborne remote sensing data is one of the main applications of the change detection methodologies widely discussed in the literature. APhoRISM (Advanced PRocedures for volcanIc and Seismic Monitoring) is a collaborative European Commission project (FP7-SPACE-2013-1) addressing the development of innovative methods, using space and ground sensors, to support the management and mitigation of seismic and volcanic risk. In this paper a novel approach to damage assessment, based on the use of a priori information derived from different sources in a preparedness phase, is described and a preliminary validation is shown.
Distribution theory approach to implementing directional acoustic sensors.
Schmidlin, Dean J
2010-01-01
The objective of directional acoustic sensors is to provide high directivity while occupying a small amount of space. An idealized point sensor achieves this objective from a knowledge of the spatial partial derivatives of acoustic pressure at a point in space. Direct measurement of these derivatives is difficult in practice. Consequently, it is expedient to come up with indirect methods. The use of pressure sensors to construct finite-difference approximations is an example of such a method. This paper utilizes the theory of distributions to derive another indirect method for estimating the various spatial partial derivatives of the pressure. This alternate method is then used to construct a multichannel filter which processes the acoustic pressure by means of three-dimensional integral transforms throughout a cube of edge length 6ε centered at the origin. The output of the multichannel filter is a spatially and temporally filtered version of the pressure at the origin. The temporal filter is a lowpass Gaussian filter whose bandwidth is inversely proportional to ε. Finally, the lattice method for numerical multiple integration is utilized to develop a discrete-spatial version of the multichannel filter.
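The finite-difference route contrasted in this abstract is easy to sketch. The snippet below is our own illustration, not code from the paper: the plane-wave test field, wavenumber k, and sensor spacing h are all assumptions. Two closely spaced pressure readings give a central-difference estimate of the spatial derivative at the midpoint.

```python
import math

# A minimal sketch of the indirect finite-difference method: estimate a
# spatial partial derivative of acoustic pressure from two closely spaced
# pressure sensors. The test field p(x) = sin(kx) is an assumption.

def pressure(x, k=2.0):
    """Plane-wave test pressure field along one spatial axis."""
    return math.sin(k * x)

def dpdx_central(p, x0, h):
    """Central finite-difference estimate of dp/dx at x0, O(h^2) accurate."""
    return (p(x0 + h) - p(x0 - h)) / (2.0 * h)

x0, k, h = 0.3, 2.0, 1e-4
estimate = dpdx_central(pressure, x0, h)
exact = k * math.cos(k * x0)          # analytic derivative of the test field
print(abs(estimate - exact) < 1e-6)   # True: the O(h^2) error is tiny here
```

The distribution-theory method of the paper replaces this pointwise differencing with integral transforms over a small neighborhood, which is better behaved in the presence of measurement noise.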
M-Estimation for Discrete Data: Asymptotic Distribution Theory and Implications.
1985-11-01
M-ESTIMATION FOR DISCRETE DATA: ASYMPTOTIC DISTRIBUTION THEORY AND IMPLICATIONS, by Douglas G. Simpson, Department of Statistics... distribution theory of M-estimators especially relevant to discrete data, although Theorem 1 is somewhat broader in scope. The main results are given in... Extended asymptotic distribution theory. Conditions for consistency of an M-estimator can be found in Huber (1964, 1967, 1981). Since the smoothness plays
Distribution of local density of states in superstatistical random matrix theory
Abul-Magd, A.Y. [Department of Mathematics, Faculty of Science, Zagazig University, Zagazig (Egypt)]. E-mail: a_y_abul_magd@hotmail.com
2007-07-02
We expose an interesting connection between the distribution of the local spectral density of states arising in the theory of disordered systems and the notion of superstatistics introduced by Beck and Cohen and recently incorporated in random matrix theory. The latter represents the matrix-element joint probability density function as an average of the corresponding quantity in standard random-matrix theory over a distribution of level densities. We show that this distribution is in reasonable agreement with the numerical calculation for a disordered wire, which suggests using results from the theory of disordered conductors in estimating the parameter distribution of the superstatistical random-matrix ensemble.
Palmer, David S; Mišin, Maksim; Fedorov, Maxim V; Llinas, Antonio
2015-09-08
We report a method to predict physicochemical properties of druglike molecules using a classical statistical mechanics based solvent model combined with machine learning. The RISM-MOL-INF method introduced here provides an accurate technique to characterize solvation and desolvation processes based on solute-solvent correlation functions computed by the 1D reference interaction site model of the integral equation theory of molecular liquids. These functions can be obtained in a matter of minutes for most small organic and druglike molecules using existing software (RISM-MOL) (Sergiievskyi, V. P.; Hackbusch, W.; Fedorov, M. V. J. Comput. Chem. 2011, 32, 1982-1992). Predictions of Caco-2 cell permeability and hydration free energy obtained using the RISM-MOL-INF method are shown to be more accurate than the state-of-the-art tools for benchmark data sets. Due to the importance of solvation and desolvation effects in biological systems, it is anticipated that the RISM-MOL-INF approach will find many applications in biophysical and biomedical property prediction.
Dirichlet and Related Distributions Theory, Methods and Applications
Ng, Kai Wang; Tang, Man-Lai
2011-01-01
The Dirichlet distribution appears in many areas of application, which include modelling of compositional data, Bayesian analysis, statistical genetics, and nonparametric inference. This book provides a comprehensive review of the Dirichlet distribution and two extended versions, the Grouped Dirichlet Distribution (GDD) and the Nested Dirichlet Distribution (NDD), arising from likelihood and Bayesian analysis of incomplete categorical data and survey data with non-response. The theoretical properties and applications are also reviewed in detail for other related distributions, such as the inve
JIN Xuexiang; SU Yuelong; ZHANG Yi; WEI Zheng; LI Li
2009-01-01
The modeling of headway/spacing between two consecutive vehicles in a queue has many applications in traffic flow theory and transport practice. Most known approaches have only studied vehicles on freeways. This paper presents a model for the spacing distribution of queuing vehicles at a signalized junction based on random-matrix theory. The spacing distribution of a Gaussian symplectic ensemble (GSE) fits well with recently measured spacing distribution data. These results are also compared with the measured spacing distribution observed for the car parking problem. Vehicle stationary queuing and vehicle parking have different spacing distributions due to different driving patterns.
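For reference, GSE level-spacing statistics of the kind invoked here are commonly summarized by the Wigner surmise for the symplectic ensemble, P(s) = (2^18/(3^6 π^3)) s^4 exp(−64 s²/(9π)), normalized so that both the total probability and the mean spacing equal one. A quick numerical sketch (our own check; the grid step and cutoff are arbitrary choices) confirms both facts:

```python
import math

# Left-Riemann check that the GSE Wigner surmise integrates to 1 and has
# unit mean spacing. The endpoints contribute nothing since P(0) = 0 and
# P(s_max) is negligibly small.

C = 2 ** 18 / (3 ** 6 * math.pi ** 3)
a = 64.0 / (9.0 * math.pi)

def p(s):
    """Wigner surmise density for GSE level spacings (beta = 4)."""
    return C * s ** 4 * math.exp(-a * s * s)

h, s_max = 1e-4, 10.0
values = [p(i * h) for i in range(int(s_max / h) + 1)]
norm = h * sum(values)
mean = h * sum(i * h * v for i, v in enumerate(values))
print(norm, mean)   # both very close to 1.0
```

The s^4 level repulsion at small spacings is what distinguishes the GSE fit from the GOE (s^1) and GUE (s^2) alternatives when comparing against measured spacing data.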
Munaò, Gianmarco, E-mail: gmunao@unime.it; Costa, Dino; Caccamo, Carlo [Dipartimento di Fisica e di Scienze della Terra, Università degli Studi di Messina, Viale F. Stagno d’Alcontres 31, 98166 Messina (Italy); Gámez, Francisco [C/Clavel 101, Mairena del Aljarafe, 41927 Seville (Spain); Sciortino, Francesco [Dipartimento di Fisica and CNR-ISC, Università di Roma “Sapienza,” Piazzale Aldo Moro 2, 00185 Roma (Italy); Giacometti, Achille [Dipartimento di Scienze Molecolari e Nanosistemi, Università Ca’ Foscari Venezia, Calle Larga S.Marta DD2137, Venezia I-30123 (Italy)
2015-06-14
We investigate thermodynamic properties of anisotropic colloidal dumbbells in the frameworks provided by the Reference Interaction Site Model (RISM) theory and an Optimized Perturbation Theory (OPT), the latter based on a fourth-order high-temperature perturbative expansion of the free energy, recently generalized to molecular fluids. Our model is constituted by two identical tangent hard spheres surrounded by square-well attractions with the same widths and progressively different depths. Gas-liquid coexistence curves are obtained by predicting pressures, free energies, and chemical potentials. In comparison with previous simulation results, RISM and OPT agree in reproducing the progressive reduction of the gas-liquid phase separation as the anisotropy of the interaction potential becomes more pronounced; in particular, the RISM theory provides reasonable predictions for all coexistence curves, bar the strong anisotropy regime, whereas OPT performs generally less well. Both theories predict a linear dependence of the critical temperature on the interaction strength, reproducing in this way the mean-field behavior observed in simulations; the critical density, which drops drastically as the anisotropy increases, turns out to be less accurate. Our results appear as a robust benchmark for further theoretical studies, in support of the simulation approach, of self-assembly in model colloidal systems.
Prospect theory for continuous distributions: A preference foundation
A.V. Kothiyal (Amit); V. Spinu (Vitalie)
2011-01-01
Preference foundations give necessary and sufficient conditions for a decision model, stated directly in terms of the empirical primitive: the preference relation. For the most popular descriptive model for decision making under risk and uncertainty today, prospect theory, preference fou
Exponential wealth distribution : a new approach from functional iteration theory*
López José-Luis
2012-08-01
Different approaches are possible in order to derive the exponential regime in statistical systems. Here, a new functional equation is proposed in an economic context to explain the exponential wealth distribution. Concretely, the new iteration [1] is given by f_{n+1}(x) = ∫∫_{u+v>x} f_n(u) f_n(v)/(u+v) du dv. It is found that the exponential distribution is a stable fixed point of this functional iteration equation. From this point of view, it is easily understood why the exponential wealth distribution (or, by extension, other kinds of distributions) is asymptotically obtained in different multi-agent economic models. Different approaches to deriving the asymptotic exponential regime in statistical systems are possible. Here a new functional equation is proposed, in the framework of economic systems, to explain the exponential distribution. We show that this distribution is the unique fixed point toward which the dynamics of this functional equation evolves as the iteration goes to infinity. From this point of view, it is easy to understand the ubiquity of this distribution (or of others) in various real statistical problems.
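The fixed-point property can be checked numerically. The sketch below is our own (the grid step h and cutoff L are arbitrary choices, and integration is restricted to u, v > 0 as in the wealth setting): applying one step of the iteration to f(x) = e^{-x} should reproduce e^{-x} up to discretization error.

```python
import numpy as np

# One application of
#     f_{n+1}(x) = ∫∫_{u+v>x, u,v>0} f_n(u) f_n(v) / (u+v) du dv
# to f(x) = exp(-x), evaluated on a 2D midpoint grid. If the exponential is
# a fixed point, the result should approximate exp(-x).

h, L = 0.02, 20.0
u = np.arange(h / 2.0, L, h)          # midpoint grid on (0, L)
U, V = np.meshgrid(u, u)
S = U + V
integrand = np.exp(-S) / S            # f(u) f(v) / (u + v) for f = exp

def iterate_exp(x):
    """Evaluate one step of the functional iteration, applied to exp, at x."""
    return float(np.sum(integrand[S > x])) * h * h

for x in (0.5, 1.0, 2.0):
    print(x, iterate_exp(x), float(np.exp(-x)))  # last two columns nearly agree
```

The agreement can also be seen analytically: substituting s = u + v, the line segment {u + v = s, u, v > 0} has length s, which cancels the 1/(u+v) factor and leaves ∫_x^∞ e^{-s} ds = e^{-x}.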
Continuous and distributed systems II theory and applications
Zgurovsky, Mikhail
2015-01-01
As in the previous volume on the topic, the authors close the gap between abstract mathematical approaches, such as applied methods of modern algebra and analysis, fundamental and computational mechanics, nonautonomous and stochastic dynamical systems, on the one hand, and practical applications in nonlinear mechanics, optimization, decision making theory and control theory on the other. Readers will also benefit from the presentation of modern mathematical modeling methods for the numerical solution of complicated engineering problems in biochemistry, geophysics, biology and climatology. This compilation will be of interest to mathematicians and engineers working at the interface of these fields. It presents selected works of the joint seminar series of Lomonosov Moscow State University and the Institute for Applied System Analysis at National Technical University of Ukraine “Kyiv Polytechnic Institute”. The authors come from Brazil, Germany, France, Mexico, Spain, Poland, Russia, Ukraine, and the USA. ...
Rubio de Francia's extrapolation theory: estimates for the distribution function
Carro, María J; Torres, Rodolfo H
2010-01-01
Let $T$ be an arbitrary operator bounded from $L^{p_0}(w)$ into $L^{p_0, \infty}(w)$ for every weight $w$ in the Muckenhoupt class $A_{p_0}$. It is proved in this article that the distribution function of $Tf$ with respect to any weight $u$ can be essentially majorized by the distribution function of $Mf$ with respect to $u$ (plus an integral term easy to control). As a consequence, well-known extrapolation results, including results in a multilinear setting, can be obtained with very simple proofs. New applications in extrapolation for two-weight problems and estimates on rearrangement invariant spaces are established too.
Log-concave Probability Distributions: Theory and Statistical Testing
An, Mark Yuing
1996-01-01
This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and mul...
A macro-distributive theory dispelling the econometric fog
S. WEINTRAUB
2013-12-01
The work presents a linear model of the private market sector of the economy built out of the main blocks of Keynes, with some pieces furnished by Kalecki, Kaldor, and Robinson. The author argues that such a model is rich in promise by virtue of its scope, and that it strips much of the mystery from econometric models and demonstrates the circumstances in which they are likely to perform well or badly. The consistent relations offer hospitable shelter for the theory of income, employment, price level, and income shares in a succinct design. Pedagogically, the elemental ideas are capable of transmission at an early stage in economic studies.
Derry, Sharon J.; DuRussel, Lori Adams; O'Donnell, Angela M.
We present a developing distributed cognition theory of interdisciplinary collaboration that incorporates concepts from both situated cognition and information processing theory. This theoretical framework is being refined as it is used for analyzing interdisciplinary collaboration within the National Institute of Science Education (NISE). The…
Research on the Cost Allocation of Joint Distribution of Agricultural Products based on Game Theory
Jing Wang
2013-08-01
Joint distribution in the circulation of agricultural products can reduce circulation costs and improve the efficiency of logistics distribution, but how to allocate the cost has always been the major obstacle to the development of this model. A joint distribution model for agricultural products is presented in this study, which then considers the problem of cost reduction in the joint distribution of two agricultural-product retailers. The amount of the cost reduction is regarded as the income of the distribution coalition, which is allocated effectively by using game theory, resolving the problem of cost allocation in joint distribution. Analysis of an example shows that the joint distribution model can greatly reduce the cost of distribution for agricultural products. Finally, the distribution cost allocation verifies the effectiveness and feasibility of this method of cost allocation.
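One standard game-theoretic rule for splitting such a cost saving is the Shapley value; the abstract does not say which rule the study uses, so the sketch below is purely illustrative, with made-up stand-alone and joint costs for two retailers A and B.

```python
from itertools import permutations

# Treat the saving from joint distribution as the value of a cooperative
# game and split it with the Shapley value (average marginal contribution
# over all orders in which players join the coalition).

standalone = {"A": 100.0, "B": 80.0}   # each retailer's own delivery cost
joint_cost = 150.0                     # cost when A and B pool deliveries

def saving(coalition):
    """Characteristic function v(S): cost saved by coalition S."""
    if set(coalition) == {"A", "B"}:
        return standalone["A"] + standalone["B"] - joint_cost
    return 0.0   # a retailer alone saves nothing

def shapley(players):
    """Shapley value: average marginal contribution over all join orders."""
    orders = list(permutations(players))
    phi = {p: 0.0 for p in players}
    for order in orders:
        seen = []
        for p in order:
            phi[p] += saving(seen + [p]) - saving(seen)
            seen.append(p)
    return {p: phi[p] / len(orders) for p in players}

print(shapley(["A", "B"]))   # {'A': 15.0, 'B': 15.0}: the 30.0 saving, split
```

With two symmetric contributors the Shapley split is equal; with more retailers or asymmetric savings the same code yields unequal, marginal-contribution-based allocations.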
Power law distribution of seismic rates: theory and data
Saichev, A
2004-01-01
We report an empirical determination of the probability density functions P(r) of the number r of earthquakes in finite space-time windows for the California catalog, over fixed spatial boxes 5 x 5 km^2 and time intervals dt = 1, 10, 100 and 1000 days. We find a stable power law tail P(r) ~ 1/r^{1+mu} with exponent mu \approx 1.6 for all time intervals. These observations are explained by a simple stochastic branching process previously studied by many authors, the ETAS (epidemic-type aftershock sequence) model, which assumes that each earthquake can trigger other earthquakes ("aftershocks"). An aftershock sequence results in this model from the cascade of aftershocks of each past earthquake. We develop the full theory in terms of generating functions for describing the space-time organization of earthquake sequences and develop several approximations to solve the equations. The calibration of the theory to the empirical observations shows that it is essential to augment the ETAS model by taking account of th...
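The reported tail law itself is easy to experiment with. The sketch below is not the paper's ETAS machinery; it simply samples P(r) ∝ r^{-(1+μ)} with μ = 1.6 by inverse transform from a Pareto law (the lower cutoff r_min is our assumption) and checks the sample against the analytic median.

```python
import random

# Inverse-transform sampling of a Pareto tail P(r) ~ 1/r^{1+mu}, mu = 1.6,
# matching the exponent reported for the California catalog.

mu, r_min = 1.6, 1.0
random.seed(3)

def sample_pareto():
    """Inverse transform for the survival function S(r) = (r_min / r)^mu."""
    u = 1.0 - random.random()          # u in (0, 1], avoids a zero base
    return r_min * u ** (-1.0 / mu)

draws = sorted(sample_pareto() for _ in range(200_000))
empirical_median = draws[len(draws) // 2]
analytic_median = r_min * 2.0 ** (1.0 / mu)   # solves S(median) = 1/2
print(empirical_median, analytic_median)      # the two nearly agree
```

Heavy tails with μ < 2 like this one have infinite variance, which is why window statistics of seismic rates are dominated by rare, very active windows.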
Log-concave Probability Distributions: Theory and Statistical Testing
An, Mark Yuing
1996-01-01
This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...
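The hazard-rate implication targeted by these tests is easy to illustrate numerically. The sketch below is our own example, not the paper's procedure: for the standard normal, which is log-concave, the hazard h(x) = f(x)/(1 − F(x)) should increase, and a quadrature check on a grid confirms it.

```python
import math

# Check that the standard normal (a log-concave density) has an increasing
# hazard rate on a grid, the property the tests are built around.

def pdf(x):
    """Standard normal density."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def sf(x, upper=10.0, n=20000):
    """Survival function 1 - F(x), by trapezoidal quadrature on [x, upper]."""
    h = (upper - x) / n
    w = [pdf(x + i * h) for i in range(n + 1)]
    return h * (sum(w) - 0.5 * (w[0] + w[-1]))

grid = [i * 0.1 for i in range(-20, 21)]               # x from -2.0 to 2.0
hazard = [pdf(x) / sf(x) for x in grid]
print(all(b > a for a, b in zip(hazard, hazard[1:])))  # True: hazard increases
```

A density that fails this monotonicity on some interval cannot be log-concave there, which is exactly the one-sided implication the proposed tests exploit.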
Theory of Nanocluster Size Distributions from Ion Beam Synthesis
Yuan, C.W.; Yi, D.O.; Sharp, I.D.; Shin, S.J.; Liao, C.Y.; Guzman, J.; Ager III, J.W.; Haller, E.E.; Chrzan, D.C.
2008-06-13
Ion beam synthesis of nanoclusters is studied via both kinetic Monte Carlo simulations and the self-consistent mean-field solution to a set of coupled rate equations. Both approaches predict the existence of a steady state shape for the cluster size distribution that depends only on a characteristic length determined by the ratio of the effective diffusion coefficient to the ion flux. The average cluster size in the steady state regime is determined by the implanted species/matrix interface energy.
Testing monotonicity of a hazard: asymptotic distribution theory
Groeneboom, Piet
2011-01-01
Two new test statistics are introduced to test the null hypothesis that the sampling distribution has an increasing hazard rate on a specified interval [0,a]. These statistics are empirical L_1-type distances between the isotonic estimates, which use the monotonicity constraint, and either the empirical distribution function or the empirical cumulative hazard. They measure the excursions of the empirical estimates with respect to the isotonic estimates, due to local non-monotonicity. Asymptotic normality of the test statistics, if the hazard is strictly increasing on [0,a], is established under mild conditions. This is done by first approximating the global empirical distance by an L_1-distance with respect to the underlying distribution function. The resulting integral is treated as a sum of increasingly many local integrals to which a CLT can be applied. The behavior of the local integrals is determined by a canonical process: the difference between the stochastic process x -> W(x)+x^2, where W is standard two-sid...
Hengst, Julie A
2015-01-01
This article proposes distributed communication as a promising theoretical framework for building supportive environments for child language development. Distributed communication is grounded in an emerging intersection of cultural-historical activity theory (CHAT) and theories of communicative practices that argue for integrating accounts of language, cognition and culture. The article first defines and illustrates through selected research articles, three key principles of distributed communication: (a) language and all communicative resources are inextricably embedded in activity; (b) successful communication depends on common ground built up through short- and long-term histories of participation in activities; and (c) language cannot act alone, but is always orchestrated with other communicative resources. It then illustrates how these principles are fully integrated in everyday interactions by drawing from my research on Cindy Magic, a verbal make-believe game played by a father and his two daughters. Overall, the research presented here points to the remarkably complex communicative environments and sophisticated forms of distributed communication children routinely engage in as they interact with peer and adult communication partners in everyday settings. The article concludes by considering implications of these theories for, and examples of, distributed communication relevant to clinical intervention. Readers will learn about (1) distributed communication as a conceptual tool grounded in an emerging intersection of cultural-historical activity theory and theories of communicative practices and (2) how to apply distributed communication to the study of child language development and to interventions for children with communication disorders. Copyright © 2015 Elsevier Inc. All rights reserved.
Amadei, A; Apol, MEF; DiNola, A; Berendsen, HJC
1996-01-01
A new theory is presented for calculating the Helmholtz free energy based on the potential energy distribution function. The usual expressions of free energy, internal energy and entropy involving the partition function are rephrased in terms of the potential energy distribution function, which must
Analysis of Product Distribution Strategy in Digital Publishing Industry Based on Game-Theory
Xu, Li-ping; Chen, Haiyan
2017-04-01
The digital publishing output has increased significantly year by year. It has become the most vigorous point of economic growth and increasingly important to the press and publication industry. Its distribution channels have become diversified, which differs from the traditional industry. A deep study of the digital publishing industry has been carried out in order to clarify the constitution of the industry chain and to establish a model of it. The cooperative and competitive relationships between different distribution channels are analyzed based on game theory. By comparing the distribution quantity and the market size between the static distribution strategy and the dynamic distribution strategy, we obtain theoretical evidence about how to choose the distribution strategy that yields the optimal benefit.
Recent developments in quantum key distribution: Theory and practice
Mauerer, W.; Helwig, W.; Silberhorn, C. [Max Planck Research Group, Institute of Optics, Information and Photonics, Integrated Quantum Optics Group, University of Erlangen-Nuernberg, Guenther-Scharowsky-Strasse 1/Bau 24, 91058 Erlangen (Germany)
2008-02-15
Quantum key distribution is among the foremost applications of quantum mechanics, both in terms of fundamental physics and as a technology on the brink of commercial deployment. Starting from principal schemes and initial proofs of unconditional security for perfect systems, much effort has gone into providing secure schemes which can cope with numerous experimental imperfections unavoidable in real world implementations. In this paper, we provide a comparison of various schemes and protocols. We analyse their efficiency and performance when implemented with imperfect physical components. We consider how experimental faults are accounted for using effective parameters. We compare various recent protocols and provide guidelines as to which components propose best advances when being improved. (Abstract Copyright [2008], Wiley Periodicals, Inc.)
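A toy illustration of the principal scheme the review starts from (BB84-style sifting) can be written in a few lines. Everything below is our own simplification, not from the review: the channel is noiseless, there is no eavesdropper, and the bit/basis counts are arbitrary.

```python
import random

# BB84 sifting on an ideal channel: Alice encodes random bits in random
# bases; Bob measures in random bases; they publicly compare bases and keep
# only the rounds where the bases matched.

random.seed(7)
n = 2000
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]  # 0: rectilinear, 1: diagonal
bob_bases   = [random.randint(0, 1) for _ in range(n)]

# Matching bases: Bob recovers the sent bit exactly.
# Mismatched bases: his measurement outcome is uniformly random.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

kept = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
key_a = [alice_bits[i] for i in kept]
key_b = [bob_bits[i] for i in kept]
print(len(kept) / n)     # close to 0.5: about half the bases coincide
print(key_a == key_b)    # True on a noiseless channel
```

The experimental imperfections the review surveys (loss, dark counts, multiphoton pulses) show up in practice as a nonzero error rate in the sifted key, which the security proofs must then bound.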
Distribution theory for Schrödinger’s integral equation
Lange, Rutger-Jan, E-mail: rutger-jan.lange@cantab.net [VU University Amsterdam, 1081 HV Amsterdam (Netherlands)
2015-12-15
Much of the literature on point interactions in quantum mechanics has focused on the differential form of Schrödinger’s equation. This paper, in contrast, investigates the integral form of Schrödinger’s equation. While both forms are known to be equivalent for smooth potentials, this is not true for distributional potentials. Here, we assume that the potential is given by a distribution defined on the space of discontinuous test functions. First, by using Schrödinger’s integral equation, we confirm a seminal result by Kurasov, which was originally obtained in the context of Schrödinger’s differential equation. This hints at a possible deeper connection between both forms of the equation. We also sketch a generalisation of Kurasov’s [J. Math. Anal. Appl. 201(1), 297–323 (1996)] result to hypersurfaces. Second, we derive a new closed-form solution to Schrödinger’s integral equation with a delta prime potential. This potential has attracted considerable attention, including some controversy. Interestingly, the derived propagator satisfies boundary conditions that were previously derived using Schrödinger’s differential equation. Third, we derive boundary conditions for “super-singular” potentials given by higher-order derivatives of the delta potential. These boundary conditions cannot be incorporated into the normal framework of self-adjoint extensions. We show that the boundary conditions depend on the energy of the solution and that probability is conserved. This paper thereby confirms several seminal results and derives some new ones. In sum, it shows that Schrödinger’s integral equation is a viable tool for studying singular interactions in quantum mechanics.
Issues in biomedical statistics: comparing means under normal distribution theory.
Ludbrook, J
1995-04-01
The test used most commonly in biomedical research to compare means when measurements have been made on a continuous scale is Student's t-test, followed closely by various forms of analysis of variance. These tests require that defined populations have been randomly sampled, but there are other assumptions about populations and samples that must be satisfied. These include: (i) normality of the population distributions; (ii) equal variance in those normal populations; and (iii) statistical independence of the samples. This review offers advice to investigators on how to recognize breaches of the assumptions of normality and equality of variance, and how to deal with them by modifying the usual t-test or by transforming the experimental data. The sample-size also has an important bearing on statistical inferences: (i) if it is too small, the risk of Type II error is inflated; and (ii) inequality of sample size exaggerates the effects of inequality of variance. The assumption of independence is breached if repeated measurements are made serially rather than in random order, but adjustments to analysis of variance can be made to correct for the inflated risk of Type I error. The review also considers the problem of making multiple comparisons of means, and recommends solutions.
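The review's central contrast, the pooled-variance Student's t-test versus an unequal-variance modification, can be illustrated numerically. The following is an illustrative sketch (not from the review itself) implementing the classic Student statistic alongside Welch's correction with Satterthwaite degrees of freedom:

```python
import numpy as np

def students_t(x, y):
    """Classic Student's t: assumes equal population variances (pooled estimate)."""
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    t = (np.mean(x) - np.mean(y)) / np.sqrt(sp2 * (1 / nx + 1 / ny))
    return t, nx + ny - 2  # statistic, degrees of freedom

def welch_t(x, y):
    """Welch's t: drops the equal-variance assumption (Satterthwaite df)."""
    vx, vy = np.var(x, ddof=1) / len(x), np.var(y, ddof=1) / len(y)
    t = (np.mean(x) - np.mean(y)) / np.sqrt(vx + vy)
    df = (vx + vy) ** 2 / (vx**2 / (len(x) - 1) + vy**2 / (len(y) - 1))
    return t, df
```

With unequal variances and unequal sample sizes, Welch's version yields a smaller (fractional) degrees-of-freedom value, which is the price paid for robustness against the breach of the equal-variance assumption.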
Decoy State Quantum Key Distribution: Theory and Practice
Zhao, Yi; Lo, Hoi-Kwong; Ma, Xiongfeng; Qi, Bing; Chen, Kai; Qian, Li
2007-03-01
Decoy state quantum key distribution (QKD) has been proposed as a novel approach to improve dramatically both the security and the performance of practical QKD set-ups. We proved its security, and proposed the first practical decoy state QKD protocols, including the one-decoy protocol, the weak+vacuum protocol, and the general two-decoy protocol. Our further study shows that two-way communication can effectively improve the performance of decoy state QKD. We performed the first experiments of decoy state QKD. Two protocols -- the one-decoy protocol and the weak+vacuum protocol -- were implemented with a maximum transmission distance of 60 km. We implemented the decoy state method by adding a commercial acousto-optic modulator to a commercial QKD system. Our theoretical and experimental studies show explicitly the power and feasibility of the decoy method, and bring it into real life. Our works are published in [1-5]. [1] H.-K. Lo, X. Ma, and K. Chen, Phys. Rev. Lett. 94, 230504 (2005) [2] X. Ma et al., Phys. Rev. A 72, 012326 (2005) [3] Y. Zhao et al., Phys. Rev. Lett. 96, 070502 (2006) [4] Y. Zhao et al., in Proceedings of IEEE ISIT (IEEE, 2006) pp. 2094-2098 [5] X. Ma et al., Phys. Rev. A 74, 032330 (2006)
Evaluating ecohydrological theories of woody root distribution in the Kalahari.
Abinash Bhattachan
The contribution of savannas to global carbon storage is poorly understood, in part due to lack of knowledge of the amount of belowground biomass. In these ecosystems, the coexistence of woody and herbaceous life forms is often explained on the basis of belowground interactions among roots. However, the distribution of root biomass in savannas has seldom been investigated, and the dependence of root biomass on rainfall regime remains unclear, particularly for woody plants. Here we investigate patterns of belowground woody biomass along a rainfall gradient in the Kalahari of southern Africa, a region with consistent sandy soils. We test the hypotheses that (1) root depth increases with mean annual precipitation (the root optimality and plant hydrotropism hypothesis), and (2) the root-to-shoot ratio increases with decreasing mean annual rainfall (the functional equilibrium hypothesis). Both hypotheses have been previously assessed for herbaceous vegetation using global root data sets. Our data do not support these hypotheses for the case of woody plants in savannas. We find that in the Kalahari, the root profiles of woody plants do not become deeper with increasing mean annual precipitation, whereas the root-to-shoot ratios decrease along a gradient of increasing aridity.
Independent test assessment using the extreme value distribution theory.
Almeida, Marcio; Blondell, Lucy; Peralta, Juan M; Kent, Jack W; Jun, Goo; Teslovich, Tanya M; Fuchsberger, Christian; Wood, Andrew R; Manning, Alisa K; Frayling, Timothy M; Cingolani, Pablo E; Sladek, Robert; Dyer, Thomas D; Abecasis, Goncalo; Duggirala, Ravindranath; Blangero, John
2016-01-01
The new generation of whole genome sequencing platforms offers great possibilities and challenges for dissecting the genetic basis of complex traits. With a very high number of sequence variants, a naïve multiple hypothesis threshold correction hinders the identification of reliable associations by the overreduction of statistical power. In this report, we examine 2 alternative approaches to improve the statistical power of a whole genome association study to detect reliable genetic associations. The approaches were tested using the Genetic Analysis Workshop 19 (GAW19) whole genome sequencing data. The first tested method estimates the real number of effective independent tests actually being performed in a whole genome association project by the use of an extreme value distribution and a set of phenotype simulations. Given the familial nature of the GAW19 data and the finite number of pedigree founders in the sample, the number of correlations between genotypes is greater than in a set of unrelated samples. Using our procedure, we estimate that the effective number represents only 15 % of the total number of independent tests performed. However, even using this corrected significance threshold, no genome-wide significant association could be detected for systolic and diastolic blood pressure traits. The second approach implements a biological relevance-driven hypothesis tested by exploiting prior computational predictions on the effect of nonsynonymous genetic variants detected in a whole genome sequencing association study. This guided testing approach was able to identify 2 promising single-nucleotide polymorphisms (SNPs), 1 for each trait, targeting biologically relevant genes that could help shed light on the genesis of human hypertension. The first gene, PFH14, associated with systolic blood pressure, interacts directly with genes involved in calcium-channel formation and the second gene, MAP4, encodes a microtubule-associated protein and had already been
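The first approach above, estimating the effective number of independent tests from the distribution of the minimum p-value across null phenotype simulations, can be sketched with a toy example. The moment-matching estimator and the block-correlated "genome" below are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

def effective_tests(min_pvalues):
    """Moment estimator of the effective number of independent tests.

    Under the global null, the minimum of M_eff independent uniform
    p-values follows Beta(1, M_eff), whose mean is 1/(M_eff + 1); hence
    M_eff = 1/mean(min_p) - 1.
    """
    return 1.0 / np.mean(np.asarray(min_pvalues)) - 1.0

# Toy genome: 1000 nominal tests that are really 50 independent blocks
# of 20 perfectly correlated tests (each block shares one p-value),
# mimicking genotype correlation induced by pedigree structure.
rng = np.random.default_rng(42)
block_p = rng.uniform(size=(5000, 50))  # 5000 null "phenotype" simulations
min_p = block_p.min(axis=1)             # min over blocks = min over all 1000 tests
m_eff = effective_tests(min_p)          # recovers ~50, i.e. 5% of the nominal 1000
```

The recovered value of roughly 50 out of 1000 nominal tests is the same kind of deflation the abstract reports (15 % of the total) for the real pedigree data.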
The force distribution probability function for simple fluids by density functional theory.
Rickayzen, G; Heyes, D M
2013-02-28
Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and the probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula is P(F) ∝ exp(-AF²), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT used is only applicable to bounded-potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft-sphere fluids at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low) for the two potentials, although with a smaller value of the constant A than that predicted by the DFT theory.
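As a numerical check of the quoted Gaussian form, assuming (as seems implied) that P(F) is the density over the three-dimensional force vector, so the distribution of the force magnitude is W(F) = 4πF²P(F), the normalization constant must be (A/π)^(3/2):

```python
import numpy as np

A = 2.0                              # arbitrary assumed constant in P(F) ∝ exp(-A F^2)
C = (A / np.pi) ** 1.5               # normalization of the 3D vector density P(F)
F = np.linspace(0.0, 20.0, 200_001)  # force magnitudes; the tail is negligible by F = 20
W = 4.0 * np.pi * F**2 * C * np.exp(-A * F**2)  # magnitude distribution W(F) = 4*pi*F^2*P(F)

# Trapezoidal rule: W(F) integrates to 1 if P(F) is correctly normalized.
total = np.sum(0.5 * (W[1:] + W[:-1]) * np.diff(F))
```

The numerical integral reproduces unity to high accuracy, confirming the stated relation between the vector density and the magnitude distribution for the Gaussian form.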
A Stochastic Theory for Deep Bed Filtration Accounting for Dispersion and Size Distributions
Shapiro, Alexander; Bedrikovetsky, P. G.
2010-01-01
We develop a stochastic theory for filtration of suspensions in porous media. The theory takes into account particle and pore size distributions, as well as the random character of the particle motion, which is described in the framework of the theory of continuous-time random walks (CTRW). In the limit of infinitely many small walk steps we derive a system of governing equations for the evolution of the particle and pore size distributions. We consider the case of concentrated suspensions, where plugging of the pores by particles may change porosity and other parameters of the porous medium. A procedure for averaging of the derived system of equations is developed for polydisperse suspensions with several distinctive particle sizes. A numerical method for the solution of the flow equations is proposed. Sample calculations are applied to compare the roles of the particle size distribution…
Transition state theory: a generalization to nonequilibrium systems with power-law distributions
Jiulin, Du
2011-01-01
Transition state theory (TST) is generalized to nonequilibrium systems with power-law distributions. The stochastic dynamics that gives rise to the power-law distributions for the reaction coordinate and momentum is modeled by Langevin equations and corresponding Fokker-Planck equations. It is assumed that a system far from equilibrium does not relax to a thermal equilibrium state with the Boltzmann-Gibbs distribution, but asymptotically approaches a nonequilibrium stationary state with power-law distributions. We thus obtain a generalization of TST rates to nonequilibrium systems with power-law distributions. Furthermore, we derive the generalized TST rate constants for one-dimensional and n-dimensional Hamiltonian systems away from equilibrium, and obtain a generalized Arrhenius rate for systems with power-law distributions.
The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.
Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica
2014-05-01
The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases. And as skewness in absolute value and kurtosis increases, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
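The asymmetry that undermines normal-theory intervals for the indirect effect can be reproduced with a small Monte Carlo sketch. The coefficient values and standard errors below are illustrative assumptions (critical ratio 2 for each coefficient), not figures from the article:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sampling distributions of two regression coefficients a and b,
# each with critical ratio mu/sigma = 2 (a range where the product
# of the two estimates is visibly skewed).
n = 200_000
a_hat = rng.normal(1.0, 0.5, size=n)
b_hat = rng.normal(1.0, 0.5, size=n)
ab = a_hat * b_hat                  # Monte Carlo draws of the indirect effect

# Positive skewness is what makes symmetric normal-theory intervals
# over-cover on one side and under-cover on the other.
skew = np.mean((ab - ab.mean()) ** 3) / ab.std() ** 3

# Percentile interval from the product distribution is asymmetric
# about the mean, unlike the symmetric normal-theory interval.
lo_q, hi_q = np.quantile(ab, [0.025, 0.975])
```

With these settings the simulated product distribution has clearly positive skewness and a median below its mean, consistent with the article's point that the product of two normal coefficients is not itself normal at the critical ratios seen in practice.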
Zemanian, AH
2010-01-01
This well-known text provides a relatively elementary introduction to distribution theory and describes generalized Fourier and Laplace transformations and their applications to integrodifferential equations, difference equations, and passive systems. Suitable for a graduate course for engineering and science students or for an advanced undergraduate course for mathematics majors. 1965 edition.
Lubrication Basis Theory of Worm Pair and Temperature Distribution on Worm Gear Surface
1998-01-01
The basic lubrication theory of the worm pair is given, and the lubrication state of the worm gear is analyzed. It is found that the temperature distribution on the tooth surface of the worm gear is closely related to the lubrication state, and that the tooth surface temperature is consistent with the meshing and motion characteristics of the worm pair.
A Theory for the Initial Allocating of Real Time Tasks in Distributed Systems
鄢勇; 金灿明
1992-01-01
For a set of real-time tasks with arrival times, execution times, and deadlines, this paper discusses the problem of polynomial-time initial-allocating approximation algorithms in a distributed system. Five new results are obtained, which provide a theory for the design of initial-allocating algorithms for real-time tasks.
Reexamination of Correlations for Nucleate Site Distribution on Boiling Surface by Fractal Theory
Yang Chunxin
1997-01-01
Nucleate site distribution plays an essential role in the nucleate boiling process. In this paper, it is pointed out that the size and spatial distribution density of nucleate sites on a real boiling surface can be described by a normalized fractal distribution function, and the physical meaning of the parameters involved in some experimental correlations proposed in earlier investigations is identified according to the fractal distribution function. It is further suggested that surface microgeometry characteristics, such as the shape of cavities, should be described and analyzed qualitatively using fractal theory.
An approximation theory for the identification of nonlinear distributed parameter systems
Banks, H. T.; Reich, Simeon; Rosen, I. G.
1990-01-01
An abstract approximation framework for the identification of nonlinear distributed parameter systems is developed. Inverse problems for nonlinear systems governed by strongly maximal monotone operators (satisfying a mild continuous dependence condition with respect to the unknown parameters to be identified) are treated. Convergence of Galerkin approximations and of the corresponding solutions of finite-dimensional approximating identification problems to a solution of the original infinite-dimensional identification problem is demonstrated using the theory of nonlinear evolution systems and a nonlinear analog of the Trotter-Kato approximation result for semigroups of bounded linear operators. The nonlinear theory developed here is shown to subsume an existing linear theory as a special case. It is also shown to be applicable to a broad class of nonlinear elliptic operators and the corresponding nonlinear parabolic partial differential equations to which they lead. An application of the theory to a quasilinear model for heat conduction or mass transfer is discussed.
Omelyan, Igor, E-mail: omelyan@ualberta.ca, E-mail: omelyan@icmp.lviv.ua [National Institute for Nanotechnology, 11421 Saskatchewan Drive, Edmonton, Alberta T6G 2M9 (Canada); Department of Mechanical Engineering, University of Alberta, Edmonton, Alberta T6G 2G8 (Canada); Institute for Condensed Matter Physics, National Academy of Sciences of Ukraine, 1 Svientsitskii Street, Lviv 79011 (Ukraine); Kovalenko, Andriy, E-mail: andriy.kovalenko@nrc-cnrc.gc.ca [National Institute for Nanotechnology, 11421 Saskatchewan Drive, Edmonton, Alberta T6G 2M9 (Canada); Department of Mechanical Engineering, University of Alberta, Edmonton, Alberta T6G 2G8 (Canada)
2013-12-28
We develop efficient handling of solvation forces in the multiscale method of multiple time step molecular dynamics (MTS-MD) of a biomolecule steered by the solvation free energy (effective solvation forces) obtained from the 3D-RISM-KH molecular theory of solvation (three-dimensional reference interaction site model complemented with the Kovalenko-Hirata closure approximation). To reduce the computational expenses, we calculate the effective solvation forces acting on the biomolecule by using advanced solvation force extrapolation (ASFE) at inner time steps while converging the 3D-RISM-KH integral equations only at large outer time steps. The idea of ASFE consists in developing a discrete non-Eckart rotational transformation of atomic coordinates that minimizes the distances between the atomic positions of the biomolecule at different time moments. The effective solvation forces for the biomolecule in a current conformation at an inner time step are then extrapolated in the transformed subspace of those at outer time steps by using a modified least square fit approach applied to a relatively small number of the best force-coordinate pairs. The latter are selected from an extended set collecting the effective solvation forces obtained from 3D-RISM-KH at outer time steps over a broad time interval. The MTS-MD integration with effective solvation forces obtained by converging 3D-RISM-KH at outer time steps and applying ASFE at inner time steps is stabilized by employing the optimized isokinetic Nosé-Hoover chain (OIN) ensemble. Compared to the previous extrapolation schemes used in combination with the Langevin thermostat, the ASFE approach substantially improves the accuracy of evaluation of effective solvation forces and in combination with the OIN thermostat enables a dramatic increase of outer time steps. We demonstrate on a fully flexible model of alanine dipeptide in aqueous solution that the MTS-MD/OIN/ASFE/3D-RISM-KH multiscale method of molecular dynamics
Molecular theory of size exclusion chromatography for wide pore size distributions.
Sepsey, Annamária; Bacskay, Ivett; Felinger, Attila
2014-02-28
Chromatographic processes can conveniently be modeled at a microscopic level using the molecular theory of chromatography. This molecular or microscopic theory is completely general; therefore it can be used for any chromatographic process such as adsorption, partition, ion-exchange or size exclusion chromatography. The molecular theory of chromatography allows taking into account the kinetics of the pore ingress and egress processes, the heterogeneity of the pore sizes and polymer polydispersity. In this work, we assume that the pore size in the stationary phase of chromatographic columns is governed by a wide lognormal distribution. This property is integrated into the molecular model of size exclusion chromatography and the moments of the elution profiles were calculated for several kinds of pore structure. Our results demonstrate that wide pore size distributions have a strong influence on the retention properties (retention time, peak width, and peak shape) of macromolecules. The novel model allows us to estimate the real pore size distribution of commonly used HPLC stationary phases, and the effect of this distribution on the size exclusion process. Copyright © 2014 Elsevier B.V. All rights reserved.
Distributed power control algorithm based on game theory for wireless sensor networks
[No author listed]
2007-01-01
Energy saving is the most important issue in research and development for wireless sensor networks. A power control mechanism can reduce the power consumption of the whole network. Because wireless sensor networks are energy-constrained, this paper proposes a distributed power control algorithm based on game theory for wireless sensor networks, whose objectives are to reduce power consumption, decrease overhead, and increase network lifetime. Game-theoretic analysis and OPNET simulation show that the power control algorithm converges to a Nash equilibrium when decisions are updated according to a better-response dynamic.
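A minimal sketch of distributed power control with purely local updates is the classic Foschini-Miljanic target-SINR iteration, used here as an illustrative stand-in for the paper's game-theoretic algorithm (the gain matrix, SINR target, and noise values are assumptions):

```python
import numpy as np

def power_control(G, gamma_t, noise, p0, iters=200):
    """Each node i repeatedly scales its power by gamma_t / SINR_i,
    using only its own measured SINR -- a distributed best response
    to the interference currently produced by the other nodes."""
    p = np.array(p0, dtype=float)
    for _ in range(iters):
        # SINR_i = G_ii * p_i / (sum_{j != i} G_ij * p_j + noise_i)
        interference = G @ p - np.diag(G) * p + noise
        sinr = np.diag(G) * p / interference
        p = (gamma_t / sinr) * p
    interference = G @ p - np.diag(G) * p + noise
    return p, np.diag(G) * p / interference

# Three nodes, weak cross-gains, feasible common SINR target of 1.0
G = np.array([[1.0, 0.1, 0.1],
              [0.1, 1.0, 0.1],
              [0.1, 0.1, 1.0]])
p, sinr = power_control(G, 1.0, noise=np.full(3, 0.01), p0=np.ones(3))
```

When the target is feasible, the iteration converges to the unique fixed point at which every node meets the target SINR with the minimum total power, which is the equilibrium notion the abstract's better-response dynamic also converges to.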
王伟; 孙会君; 吴建军
2015-01-01
The assumption widely used in the user equilibrium model for stochastic network was that the probability distributions of the travel time were known explicitly by travelers. However, this distribution may be unavailable in reality. By relaxing the restrictive assumption, a robust user equilibrium model based on cumulative prospect theory under distribution-free travel time was presented. In the absence of the cumulative distribution function of the travel time, the exact cumulative prospect value (CPV) for each route cannot be obtained. However, the upper and lower bounds on the CPV can be calculated by probability inequalities. Travelers were assumed to choose the routes with the best worst-case CPVs. The proposed model was formulated as a variational inequality problem and solved via a heuristic solution algorithm. A numerical example was also provided to illustrate the application of the proposed model and the efficiency of the solution algorithm.
Coalition of distributed generation units to virtual power players - a game theory approach
Morais, Hugo; Sousa, Tiago M; Santos, Gabriel
2015-01-01
…and the existence of new management players such as several types of aggregators. This paper proposes a methodology to facilitate the coalition between distributed generation units, originating Virtual Power Players (VPPs), considering a game theory approach. The proposed approach consists in the analysis of the classifications attributed by each VPP to the distributed generation units, as well as in the analysis of the contracts previously established by each player. The proposed classification model is based on fourteen parameters, including technical, economical and behavioural ones. Depending on the VPP…
Academic training: From Evolution Theory to Parallel and Distributed Genetic Programming
2007-01-01
2006-2007 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 15, 16 March From 11:00 to 12:00 - Main Auditorium, bldg. 500 From Evolution Theory to Parallel and Distributed Genetic Programming F. FERNANDEZ DE VEGA / Univ. of Extremadura, SP Lecture No. 1: From Evolution Theory to Evolutionary Computation Evolutionary computation is a subfield of artificial intelligence (more particularly computational intelligence) involving combinatorial optimization problems, which are based to some degree on the evolution of biological life in the natural world. In this tutorial we will review the source of inspiration for this metaheuristic and its capability for solving problems. We will show the main flavours within the field, and different problems that have been successfully solved employing these kinds of techniques. Lecture No. 2: Parallel and Distributed Genetic Programming The successful application of Genetic Programming (GP, one of the available Evolutionary Algorithms) to optimization problems has encouraged an ...
Parameter estimation in nonlinear distributed systems - Approximation theory and convergence results
Banks, H. T.; Reich, Simeon; Rosen, I. G.
1988-01-01
An abstract approximation framework and convergence theory is described for Galerkin approximations applied to inverse problems involving nonlinear distributed parameter systems. Parameter estimation problems are considered and formulated as the minimization of a least-squares-like performance index over a compact admissible parameter set subject to state constraints given by an inhomogeneous nonlinear distributed system. The theory applies to systems whose dynamics can be described by either time-independent or nonstationary strongly maximal monotonic operators defined on a reflexive Banach space which is densely and continuously embedded in a Hilbert space. It is demonstrated that if readily verifiable conditions on the system's dependence on the unknown parameters are satisfied, and the usual Galerkin approximation assumption holds, then solutions to the approximating problems exist and approximate a solution to the original infinite-dimensional identification problem.
Robert M. Solow
2012-10-01
The paper surveys the neoclassical theory of growth. As a preliminary, the meaning of the adjective "neoclassical" is discussed. The basic model is then sketched, and the conditions ensuring a stationary state are illustrated. The issue of the convergence to a stationary state (and that of the speed of convergence) is further considered. A discussion of "primary factors" opens the way to the "new" theory of growth, with endogenous technical progress. A number of extensions of the basic model are then recalled: two-sector and multi-sectoral models, overlapping generations models, and the role of money in growth models. JEL Codes: O41, E25. Keywords: Distribution, Growth, Income Distribution, Income
Designing a Digital Medical Management Training Simulator Using Distributed Cognition Theory
Rybing, Jonas; Prytz, Erik; Hornwall, Johan; Nilsson, Helene; Jonson, Carl-Oscar; Bång, Magnus
2017-01-01
Background: Training of medical professionals is important to improve care during mass-casualty events. Therefore, it is essential to extend knowledge on how to design valid and usable simulation-based training environments. Purpose: This article investigates how distributed cognition and simulation theory concepts can guide the design of simulation-based training environments. We present the design and user evaluation of DigEmergo, a simulator for training and assessing emergency medicine managem...
Medan, R. T.; Ray, K. S.
1974-01-01
A description and user's manual are presented for a U.S.A. FORTRAN IV computer program which evaluates spanwise and chordwise loading distributions, lift coefficient, pitching moment coefficient, and other stability derivatives for thin wings in linearized, steady, subsonic flow. The program is based on a kernel function method lifting surface theory and is applicable to a large class of planforms, including asymmetrical ones and ones with mixed straight and curved edges.
Systematic analysis of transverse momentum distribution and non-extensive thermodynamics theory
Sena, I
2012-01-01
A systematic analysis of transverse momentum distribution of hadrons produced in ultra-relativistic $p+p$ and $A+A$ collisions is presented. We investigate the effective temperature and the entropic parameter from the non-extensive thermodynamic theory of strong interaction. We conclude that the existence of a limiting effective temperature and of a limiting entropic parameter is in accordance with experimental data.
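Non-extensive spectra of the kind analyzed above are commonly parametrized by a q-exponential in the transverse momentum. The following is a hedged sketch of an assumed standard Tsallis-type form, not necessarily the authors' exact fit function; T plays the role of the effective temperature and q is the entropic parameter:

```python
import numpy as np

def tsallis_pt(pt, T, q, norm=1.0):
    """q-exponential transverse-momentum spectrum (assumed Tsallis form):

        f(pT) = norm * (1 + (q - 1) * pT / T) ** (-1 / (q - 1))

    For q -> 1 this reduces to the Boltzmann exponential exp(-pT / T);
    q > 1 gives the power-law tail seen in high-pT hadron spectra.
    """
    return norm * (1.0 + (q - 1.0) * pt / T) ** (-1.0 / (q - 1.0))
```

The q → 1 limit recovering the purely exponential (Boltzmann) spectrum is what makes the entropic parameter a direct measure of the deviation from extensive thermodynamics.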
Le consumérisme politique : Une innovation régulatoire à l’ère de la mondialisation
Marie-France Turcotte
2006-04-01
This article examines the regulatory potential of political consumerism through the analysis of two new economic social movements: responsible finance and responsible consumption. The new generation of social movements, of which these two innovations are a manifestation, presides over the institutionalization of novel mechanisms that aim to regulate the market according to social and environmental criteria. But these mechanisms are prone to a commercial drift likely to annihilate their transformative potential. They nevertheless deserve attention insofar as they reveal a social compromise on the content of the social responsibility of economic actors, and thus on regulatory markers in the era of globalization.
A new probability distribution model of turbulent irradiance based on Born perturbation theory
[No author listed]
2010-01-01
The subject of the PDF (probability density function) of irradiance fluctuations in a turbulent atmosphere is still unsettled. Theory reliably describes the behavior in the weak-turbulence regime, but theoretical descriptions in the strong- and whole-turbulence regimes remain controversial. Based on Born perturbation theory, the physical manifestations and correlations of three typical PDF models (the Rice-Nakagami, exponential-Bessel and negative-exponential distributions) were theoretically analyzed. It is shown that these models can be derived by separately making circular-Gaussian, strong-turbulence, and strong-turbulence-circular-Gaussian approximations in Born perturbation theory, which refutes the viewpoint that the Rice-Nakagami model is only applicable in the extremely weak turbulence regime and provides theoretical arguments for choosing rational models in practical applications. In addition, a common shortcoming of the three models is that they are all approximations. A new model, called the Maclaurin-spread distribution, is proposed without any approximation except for assuming the correlation coefficient to be zero. It is therefore considered that the new model exactly reflects the Born perturbation theory. Simulated results prove the accuracy of this new model.
Luigi Pasinetti
2012-10-01
The paper surveys the main theories of income distribution in their relationship with the theories of economic growth. First, the Classical approach is considered, focusing on the Ricardian theory. Then the neoclassical theory is discussed, highlighting its origins (Böhm-Bawerk, Wicksell, Clark) and the role of the aggregate production function. The emergence of a "Keynesian" theory of income distribution in the wake of Harrod's model of growth is then recalled, together with the surprising resurgence of the neoclassical theory (following the contributions of Solow and Meade). But, as the paper shows, the neoclassical theory of income distribution lacks logical consistency and has shaky foundations, as has been revealed by the severe critiques leveled at the neoclassical production function. Mainstream economic literature circumvents this problem by simply ignoring it, while the models of endogenous growth exclude the issue of distribution theory from their consideration. However, while mainstream economics bypasses the problems of income distribution, this is too relevant an issue to be ignored, and a number of new research lines, briefly surveyed, try new approaches to it. JEL Codes: O41, E25. Keywords: Distribution, Economic Growth, Growth, Income Distribution, Income
Health as normal function: a weak link in Daniels's theory of just health distribution.
Krag, Erik
2014-10-01
Drawing on Christopher Boorse's Biostatistical Theory (BST), Norman Daniels contends that a genuine health need is one which is necessary to restore normal functioning - a supposedly objective notion which he believes can be read from the natural world without reference to potentially controversial normative categories. But despite his claims to the contrary, this conception of health harbors arbitrary evaluative judgments which make room for intractable disagreement as to which conditions should count as genuine health needs and therefore which needs should be met. I begin by offering a brief summary of Boorse's BST, the theory to which Daniels appeals for providing the conception of health as normal functioning upon which his overall distributive scheme rests. Next, I consider what I call practical objections to Daniels's use of Boorse's theory. Finally I recount Elseljin Kingma's theoretical objection to Boorse's BST and discuss its impact on Daniels's overall theory. Though I conclude that Boorse's view, so weakened, will no longer be able to sustain the judgments which Daniels's theory uses it to reach, in the end, I offer Daniels an olive branch by briefly sketching an alternative strategy for reaching suitably objective conclusions regarding the health and/or disease status of various conditions.
Amano, Ken-Ichi; Liang, Yunfeng; Miyazawa, Keisuke; Kobayashi, Kazuya; Hashimoto, Kota; Fukami, Kazuhiro; Nishi, Naoya; Sakka, Tetsuo; Onishi, Hiroshi; Fukuma, Takeshi
2016-06-21
Atomic force microscopy (AFM) in liquids can measure a force curve between a probe and a buried substrate. The shape of the measured force curve is related to hydration structure on the substrate. However, until now, there has been no practical theory that can transform the force curve into the hydration structure, because treatment of the liquid confined between the probe and the substrate is a difficult problem. Here, we propose a robust and practical transform theory, which can generate the number density distribution of solvent molecules on a substrate from the force curve. As an example, we analyzed a force curve measured by using our high-resolution AFM with a newly fabricated ultrashort cantilever. It is demonstrated that the hydration structure on muscovite mica (001) surface can be reproduced from the force curve by using the transform theory. The transform theory will enhance AFM's ability and support structural analyses of solid/liquid interfaces. By using the transform theory, the effective diameter of a real probe apex is also obtained. This result will be important for designing a model probe of molecular scale simulations.
Optimization of pressure gauge locations for water distribution systems using entropy theory.
Yoo, Do Guen; Chang, Dong Eil; Jun, Hwandon; Kim, Joong Hoon
2012-12-01
It is essential to select the optimal pressure gauge location for effective management and maintenance of water distribution systems. This study proposes an objective and quantified standard for selecting the optimal pressure gauge location by defining the pressure change at other nodes as a result of demand change at a specific node using entropy theory. Two cases of demand change are considered: one in which demand at all nodes shows peak load via a peak factor, and one in which demand changes follow a normal distribution whose mean is the base demand. The actual pressure change pattern is determined by using the emitter function of EPANET to reflect the pressure that changes in practice at each node. The optimal pressure gauge location is determined by prioritizing the node that processes the largest amount of information given to (giving entropy) and received from (receiving entropy) the whole system according to the entropy standard. The suggested model is applied to one virtual and one real pipe network, and the optimal pressure gauge location combination is calculated by implementing a sensitivity analysis based on the study results. These analysis results support two conclusions. First, the installation priority of pressure gauges in water distribution networks can be determined with a more objective standard through entropy theory. Second, the model can be used as an efficient decision-making guide for gauge installation in water distribution systems.
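A minimal numerical sketch of the entropy ranking this abstract describes. The sensitivity values are purely illustrative (in the paper they would come from EPANET emitter-function runs), and the giving/receiving split is a simplified reading of the method, not the authors' exact formulation:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a discrete probability distribution."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical 3-node sensitivity matrix: S[i, j] = magnitude of the
# pressure change observed at node i when demand changes at node j.
S = np.array([[0.0, 0.4, 0.2],
              [0.5, 0.0, 0.3],
              [0.1, 0.6, 0.0]])

# Receiving entropy of node i: spread of influences arriving at row i.
receiving = np.array([shannon_entropy(row / row.sum()) for row in S])
# Giving entropy of node j: spread of influences node j sends out.
giving = np.array([shannon_entropy(col / col.sum()) for col in S.T])

# Rank candidate gauge locations by total information processed.
ranking = np.argsort(giving + receiving)[::-1]
```

Nodes that both spread and collect pressure information broadly rank first; with these toy numbers the middle node wins.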
Roger Bruce Mason
2013-05-01
Full Text Available This article proposes that the external environment influences the choice of distribution tactics. Since businesses and markets are complex adaptive systems, using complexity theory to understand such environments is necessary, but it has not been widely researched. A qualitative case method using in-depth interviews investigated four successful, versus less successful, companies in turbulent versus stable environments. The results tentatively confirmed that the more successful company, in a turbulent market, sees distribution activities as less important than other aspects of the marketing mix, but uses them to stabilise customer relationships and to maintain distribution processes. These findings can benefit marketers by emphasising a new way to consider place activities. How marketers can be assisted, and suggestions for further research, are provided.
Rodriguez, G. (Editor)
1983-01-01
Two general themes in the control of large space structures are addressed: control theory for distributed parameter systems and distributed control for systems requiring spatially-distributed multipoint sensing and actuation. Topics include modeling and control, stabilization, and estimation and identification.
Luigi Pasinetti
2000-05-01
Full Text Available The paper surveys the main theories of income distribution in their relationship with the theories of economic growth. First, the Classical approach is considered, focusing on the Ricardian theory. Then the neoclassical theory is discussed, highlighting its origins (Bohm-Bawerk, Wicksell, Clark) and the role of the aggregate production function. The emergence of a "Keynesian" theory of income distribution in the wake of Harrod's model of growth is then recalled, together with the surprising resurgence of the neoclassical theory (following the contributions of Solow and Meade). But, as the paper shows, the neoclassical theory of income distribution lacks logical consistency and rests on shaky foundations, as has been revealed by the severe critiques levelled at the neoclassical production function. Mainstream economic literature circumvents this problem by simply ignoring it, while the models of endogenous growth exclude the issue of distribution theory from their consideration. However, while mainstream economics bypasses the problems of income distribution, this is too relevant an issue to be ignored, and a number of new research lines, briefly surveyed, try new approaches to it.
Fu, Hai-Bing; Cheng, Wei; Zhong, Tao
2016-01-01
We revisit the $\rho$-meson longitudinal leading-twist distribution amplitude (DA) $\phi_{2;\rho}^{\|}$ by using the QCD sum rules approach within the background field theory. To improve the accuracy of the sum rules for its moments $\langle\xi_{n;\rho}^{\|}\rangle$, we include the next-to-leading order QCD correction to the perturbative part and keep all non-perturbative condensates up to dimension-six consistently within the background field theory. The first two moments read $\langle \xi_{2;\rho}^{\|} \rangle|_{1\,{\rm GeV}} = 0.241(28)$ and $\langle \xi_{4;\rho}^{\|} \rangle|_{1\,{\rm GeV}} = 0.108(27)$, indicating a double-humped behavior for $\phi_{2;\rho}^{\|}$ in the low-$q^2$ region. As an application, we apply them to the $B\to \rho$ transition form factors within the QCD light-cone sum rules, which are key components for the decay width $\Gamma(B\to \rho \ell \nu_\ell)$.
Study of isospin nonconservation in the framework of spectral distribution theory
Kar, Kamales
2014-01-01
The observed isospin-symmetry breaking in light nuclei is caused not only by the Coulomb interaction but also by the isovector one- and two-body plus isotensor two-body nuclear interactions. Spectral distribution theory, which treats nuclear spectroscopy and other structural properties in a statistical framework, was earlier applied to isospin-conserving Hamiltonians only. In this paper we extend it to include the nuclear interactions non-scalar in isospin and work out examples in the sd shell to calculate the linear term in the isobaric mass-multiplet equation originating from these non-scalar parts.
Study of isospin nonconservation in the framework of spectral distribution theory
Kar, Kamales; Sarkar, Sukhendusekhar
2015-05-01
The observed isospin-symmetry breaking in light nuclei is caused not only by the Coulomb interaction but also by the isovector one- and two-body plus isotensor two-body nuclear interactions. Spectral distribution theory, which treats nuclear spectroscopy and other structural properties in a statistical framework, has been applied mostly to isospin-conserving Hamiltonians. In this paper we extend it to include the nuclear interactions non-scalar in isospin and work out examples in the sd shell to calculate the linear term in the isobaric mass-multiplet equation originating from these non-isoscalar parts.
Finite de Finetti theorem for conditional probability distributions describing physical theories
Christandl, Matthias; Toner, Ben
2009-04-01
We work in a general framework where the state of a physical system is defined by its behavior under measurement and the global state is constrained by no-signaling conditions. We show that the marginals of symmetric states in such theories can be approximated by convex combinations of independent and identical conditional probability distributions, generalizing the classical finite de Finetti theorem of Diaconis and Freedman. Our results apply to correlations obtained from quantum states even when there is no bound on the local dimension, so that known quantum de Finetti theorems cannot be used.
Vereshchagin, D.A. [Theoretical Physics Department, Kaliningrad State University, A. Nevsky st. 14, Kaliningrad (Russian Federation); Leble, S.B. [Theoretical Physics Department, Kaliningrad State University, A. Nevsky st. 14, Kaliningrad (Russian Federation) and Theoretical Physics and Mathematical Methods Department, Gdansk University of Technology, ul. Narutowicza 11/12, Gdansk (Poland)]. E-mail: leble@mifgate.pg.gda.pl; Solovchuk, M.A. [Theoretical Physics Department, Kaliningrad State University, A. Nevsky st. 14, Kaliningrad (Russian Federation)]. E-mail: solovchuk@yandex.ru
2006-01-02
The system of hydrodynamic-type equations for a stratified gas in a gravity field is derived from the BGK equation by the method of a piecewise continuous distribution function. The obtained system of equations generalizes the Navier-Stokes one to arbitrary Knudsen numbers. The problem of wave-disturbance propagation in a rarefied gas is explored. The verification of the model is made for the limiting case of a homogeneous medium. The phase velocity and attenuation coefficient values are in agreement with earlier fluid-mechanics theories; the attenuation behavior reproduces experimental and kinetics-based results over a wider range of Knudsen numbers.
Khorashadizadeh, S. M., E-mail: smkhorashadi@birjand.ac.ir; Rastbood, E. [Physics Department, University of Birjand, Birjand 97179-63384 (Iran, Islamic Republic of); Niknam, A. R. [Laser and Plasma Research Institute, Shahid Beheshti University, G.C., Tehran 19839-63113 (Iran, Islamic Republic of)
2015-07-15
The evolution of filamentation instability in a weakly ionized current-carrying plasma with a nonextensive distribution was studied in the diffusion frequency region, taking into account the effects of electron-neutral collisions. Using kinetic theory, Lorentz transformation formulas, and the Bhatnagar-Gross-Krook collision model, the generalized dielectric permittivity functions of this plasma system were obtained. By deriving the dispersion relation of low-frequency waves, the possibility of filamentation instability and its growth rate were investigated. It was shown that collisions can increase the maximum growth rate of the instability. The analysis of the temporal evolution of filamentation instability revealed that the growth rate increases with increasing q-parameter and electron drift velocity. Finally, the results for Maxwellian and q-nonextensive velocity distributions were compared and discussed.
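To make the q-nonextensive distribution mentioned above concrete, here is a small sketch of the Tsallis q-exponential and its enhanced high-velocity tail relative to a Maxwellian. This is the standard textbook form of the q-distribution, not code from the paper, and the velocity scaling is illustrative:

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential exp_q(x) = [1 + (1-q)x]_+^{1/(1-q)};
    the Boltzmann-Gibbs exponential is recovered in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

# Unnormalized distribution of a dimensionless speed u = v / v_thermal.
u = np.linspace(-5.0, 5.0, 1001)
f_maxwell = q_exp(-u**2 / 2.0, 1.0)   # Maxwellian limit (q = 1)
f_tsallis = q_exp(-u**2 / 2.0, 1.2)   # q > 1: power-law high-energy tail
```

For q > 1 the tail decays as a power law rather than exponentially, which is the feature that modifies growth rates in the abstract's analysis.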
(no author listed)
2006-01-01
In order to resolve the multisensor multiple maneuvering target tracking problem, this paper presents a distributed interacting multiple model multisensor joint probabilistic data association algorithm (DIMM-MSJPDA). First, the interacting multiple model joint probabilistic data association algorithm is applied to each sensor, and the state estimation, estimation covariance, model probability, combined innovation, and innovation covariance are delivered to the fusion center. There, the tracks from each sensor are correlated and D-S evidence theory is used to obtain the model probability of an identical target. Finally, the ultimate state estimation of each target is calculated according to the new model probability, and this state estimation is transmitted to each sensor. Simulations are designed to test the tracking performance of the DIMM-MSJPDA algorithm. The results show that the DIMM-MSJPDA algorithm enables the distributed multisensor system to track multiple maneuvering targets, and its tracking performance is much better than that of the IMMJPDA algorithm.
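The D-S (Dempster-Shafer) fusion step above can be sketched with the standard Dempster rule of combination. The motion-model names and mass values below are hypothetical; this only illustrates how two sensors' model probabilities for the same target could be fused:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments whose
    focal elements are frozensets; conflicting mass is renormalized away."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical model probabilities ("constant velocity" vs "coordinated
# turn") reported for the same target by two sensors.
CV, CT = frozenset({"CV"}), frozenset({"CT"})
fused = dempster_combine({CV: 0.7, CT: 0.3}, {CV: 0.6, CT: 0.4})
```

Agreement between sensors reinforces a hypothesis: here both favor "CV", so the fused mass on "CV" exceeds either input.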
Amano, Ken-Ichi; Liang, Yunfeng; Miyazawa, Keisuke; Kobayashi, Kazuya; Hashimoto, Kota; Fukami, Kazuhiro; Nishi, Naoya; Sakka, Tetsuo; Onishi, Hiroshi; Fukuma, Takeshi
2016-08-07
Correction for 'Number density distribution of solvent molecules on a substrate: a transform theory for atomic force microscopy' by Ken-ichi Amano et al., Phys. Chem. Chem. Phys., 2016, 18, 15534-15544.
Matthews, Thomas J; Whittaker, Robert J
2014-06-01
Published in 2001, The Unified Neutral Theory of Biodiversity and Biogeography (UNTB) emphasizes the importance of stochastic processes in ecological community structure, and has challenged the traditional niche-based view of ecology. While neutral models have since been applied to a broad range of ecological and macroecological phenomena, the majority of research relating to neutral theory has focused exclusively on the species abundance distribution (SAD). Here, we synthesize the large body of work on neutral theory in the context of the species abundance distribution, with a particular focus on integrating ideas from neutral theory with traditional niche theory. First, we summarize the basic tenets of neutral theory, both in general and in the context of SADs. Second, we explore the issues associated with neutral theory and the SAD, such as complications with fitting and model comparison, the underlying assumptions of neutral models, and the difficulty of linking pattern to process. Third, we highlight the advances in understanding of SADs that have resulted from neutral theory and models. Finally, we focus consideration on recent developments aimed at unifying neutral- and niche-based approaches to ecology, with a particular emphasis on what this means for SAD theory, embracing, for instance, ideas of emergent neutrality and stochastic niche theory. We put forward the argument that the prospect of the unification of niche and neutral perspectives represents one of the most promising future avenues of neutral theory research.
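A toy version of the neutral dynamics that generate an SAD can be written in a few lines. This is a generic Hubbell-style zero-sum drift sketch with illustrative parameter values, not the UNTB fitting machinery discussed in the review:

```python
import random
from collections import Counter

def neutral_sad(J=400, nu=0.02, steps=40000, seed=7):
    """Zero-sum neutral drift with point-mutation speciation: each step one
    random individual dies and is replaced either by a brand-new species
    (prob nu) or by the offspring of a random individual (self-replacement
    allowed, a common simplification). Returns species -> abundance."""
    rng = random.Random(seed)
    community = [0] * J               # start from a monodominant community
    next_id = 1
    for _ in range(steps):
        i = rng.randrange(J)          # a random death
        if rng.random() < nu:
            community[i] = next_id    # speciation event
            next_id += 1
        else:
            community[i] = community[rng.randrange(J)]
    return Counter(community)

sad = neutral_sad()
abundances = sorted(sad.values(), reverse=True)
```

Even with no niche differences, drift plus speciation yields the characteristic many-rare/few-common abundance pattern that neutral SAD tests are built on.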
Research of CWS’ Particle Size Distribution based on Ultrasonic Attenuation Theory
WANG Weidong
2010-11-01
Full Text Available The key to reducing coal pollution is the development of clean coal technology and the improvement of backward coal-burning technology. Coal water slurry (CWS) is the first substitute for oil. The particle size distribution of CWS plays an important role in the quality control of CWS. Three methods are currently used to analyze the particle size distribution of CWS: the screening method, the settlement method, and the laser method. These methods have disadvantages when used to predict the distribution of CWS. Thus, this article proposes an ultrasonic method with an effective-medium theory model, which can accurately reflect the acoustic attenuation characteristics of coal-water slurry based on a structural average. Experimental simulation proved that the effective-medium model is fully capable of achieving on-line detection of coal-water slurry particle size, for detection of both fine- and coarse-sized particle size distributions. Because the relationship between attenuation and particle size is non-linear, a three-frequency method can be used for its inverse calculation. With this, CWS granularity can be obtained on-line and the quality of CWS continuously controlled.
The empirical mass distribution of hot B subdwarfs: Implications for stellar evolution theory
Green E.M.
2013-03-01
Full Text Available Subdwarf B (sdB) stars are hot, compact, and evolved objects that form the very hot end of the horizontal branch, the so-called Extreme Horizontal Branch (EHB). Understanding the formation of sdB stars is one of the remaining challenges of stellar evolution theory. Several scenarios have been proposed to account for the existence of such objects, made of a He-burning core surrounded by a very thin H-rich envelope. They give quite different theoretical mass distributions for the resulting sdB stars. Detailed asteroseismic analyses, including mass estimates, of 15 pulsating hot B subdwarfs have been published over the past decade. The masses have also been reliably determined by light curve modeling and spectroscopy for 7 sdB components of eclipsing and/or reflection-effect binaries. These empirical mass distributions, although based on small-number statistics, can be compared with the expectations of stellar evolution theory. In particular, the merger of two He white dwarfs does not seem to be the dominant channel for forming isolated sdB stars, while the post-red-giant-branch scenario is reinforced. This opens new questions on the extreme mass loss of red giants required to form EHB stars, possibly in connection with the recently discovered close substellar companions and planets orbiting sdB stars.
Moving beyond abundance distributions: neutral theory and spatial patterns in a tropical forest.
May, Felix; Huth, Andreas; Wiegand, Thorsten
2015-03-01
Assessing the relative importance of different processes that determine the spatial distribution of species and the dynamics in highly diverse plant communities remains a challenging question in ecology. Previous modelling approaches often focused on single aggregated forest diversity patterns that convey limited information on the underlying dynamic processes. Here, we use recent advances in inference for stochastic simulation models to evaluate the ability of a spatially explicit and spatially continuous neutral model to quantitatively predict six spatial and non-spatial patterns observed at the 50 ha tropical forest plot on Barro Colorado Island, Panama. The patterns capture different aspects of forest dynamics and biodiversity structure, such as annual mortality rate, species richness, species abundance distribution, beta-diversity and the species-area relationship (SAR). The model correctly predicted each pattern independently and up to five patterns simultaneously. However, the model was unable to match the SAR and beta-diversity simultaneously. Our study moves previous theory towards a dynamic spatial theory of biodiversity and demonstrates the value of spatial data to identify ecological processes. This opens up new avenues to evaluate the consequences of additional process for community assembly and dynamics.
Study of modeling theory of multiphase gas distribution in exhaust process of automobile
臧杰
2004-01-01
According to experiments and the phenomenon that tailpipes often carry dirty particulate matter, this paper takes dynamic theory analysis as its aim, beginning with the description method of the multiphase gas distribution differential equation. Because exhaust gas flows at high velocity in a tailpipe, it is supposed that gases whose masses differ greatly will stratify into layers when flowing at high velocity. This means the exhaust gas is a mixture of particulate matter, gas with large molecular mass (CO2, HC, NOx), and gas with small molecular mass (CO, H2O, N2, O2). The interface between the two-phase fluids becomes clearer as the flow travels a long distance along the pipe. The fluid continuity equation between the gas phase and the solid phase, and the mathematical relationship between the geometric parameters and the flow, are established by multiphase gas flow theory. Analyzing the interface and the state of the layers provides a basic theory for developing a catalytic converter with high efficiency.
Asymptotic distribution of ∆AUC, NRIs, and IDI based on theory of U-statistics.
Demler, Olga V; Pencina, Michael J; Cook, Nancy R; D'Agostino, Ralph B
2017-09-20
The change in area under the curve (∆AUC), the integrated discrimination improvement (IDI), and net reclassification index (NRI) are commonly used measures of risk prediction model performance. Some authors have reported good validity of associated methods of estimating their standard errors (SE) and construction of confidence intervals, whereas others have questioned their performance. To address these issues, we unite the ∆AUC, IDI, and three versions of the NRI under the umbrella of the U-statistics family. We rigorously show that the asymptotic behavior of ∆AUC, NRIs, and IDI fits the asymptotic distribution theory developed for U-statistics. We prove that the ∆AUC, NRIs, and IDI are asymptotically normal, unless they compare nested models under the null hypothesis. In the latter case, asymptotic normality and existing SE estimates cannot be applied to ∆AUC, NRIs, or IDI. In the former case, SE formulas proposed in the literature are equivalent to SE formulas obtained from U-statistics theory if we ignore adjustment for estimated parameters. We use Sukhatme-Randles-deWet condition to determine when adjustment for estimated parameters is necessary. We show that adjustment is not necessary for SEs of the ∆AUC and two versions of the NRI when added predictor variables are significant and normally distributed. The SEs of the IDI and three-category NRI should always be adjusted for estimated parameters. These results allow us to define when existing formulas for SE estimates can be used and when resampling methods such as the bootstrap should be used instead when comparing nested models. We also use the U-statistic theory to develop a new SE estimate of ∆AUC. Copyright © 2017 John Wiley & Sons, Ltd.
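The U-statistic view of the AUC used above is easy to make concrete: the AUC is exactly the average of the Mann-Whitney kernel over all case/control pairs. The risk scores below are hypothetical; ∆AUC for two nested models would be the difference of two such statistics computed on the same subjects:

```python
from itertools import product

def auc_u_statistic(cases, controls):
    """AUC written as a two-sample U-statistic with the Mann-Whitney kernel
    h(x, y) = 1 if x > y, 1/2 if x == y, 0 otherwise, averaged over all
    case/control score pairs."""
    pairs = list(product(cases, controls))
    kernel = sum(1.0 if x > y else 0.5 if x == y else 0.0 for x, y in pairs)
    return kernel / len(pairs)

# Hypothetical risk scores for diseased (cases) and healthy (controls).
auc = auc_u_statistic([0.9, 0.8, 0.6, 0.55], [0.7, 0.5, 0.4, 0.3])
```

Because the statistic is a pairwise average, the U-statistic central limit machinery the abstract invokes applies directly to it.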
Sass, D. A.; Schmitt, T. A.; Walker, C. M.
2008-01-01
Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…
Yuan, Ke-Hai; Lu, Laura
2008-01-01
This article provides the theory and application of the 2-stage maximum likelihood (ML) procedure for structural equation modeling (SEM) with missing data. The validity of this procedure does not require the assumption of a normally distributed population. When the population is normally distributed and all missing data are missing at random…
Feil, Dirk
1992-01-01
Quantum chemistry and the concepts used daily in chemistry are increasingly growing apart. Among the concepts that are able to bridge the gap between theory and experimental practice, electron density distribution has an important place. The study of this distribution has led to new developments in
Jump Markov models and transition state theory: the quasi-stationary distribution approach.
Di Gesù, Giacomo; Lelièvre, Tony; Le Peutrec, Dorian; Nectoux, Boris
2016-12-22
We are interested in the connection between a metastable continuous state space Markov process (satisfying e.g. the Langevin or overdamped Langevin equation) and a jump Markov process in a discrete state space. More precisely, we use the notion of quasi-stationary distribution within a metastable state for the continuous state space Markov process to parametrize the exit event from the state. This approach is useful to analyze and justify methods which use the jump Markov process underlying a metastable dynamics as a support to efficiently sample the state-to-state dynamics (accelerated dynamics techniques). Moreover, it is possible by this approach to quantify the error on the exit event when the parametrization of the jump Markov model is based on the Eyring-Kramers formula. This therefore provides a mathematical framework to justify the use of transition state theory and the Eyring-Kramers formula to build kinetic Monte Carlo or Markov state models.
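The jump Markov process that the quasi-stationary-distribution analysis above justifies is exactly what kinetic Monte Carlo simulates. Below is a generic KMC sketch with an Arrhenius-form simplification of the Eyring-Kramers rate (the true formula carries a curvature-dependent prefactor); states, barriers, and parameters are illustrative:

```python
import math
import random

def kmc_path(barriers, beta=1.0, prefactor=1.0, n_jumps=5, seed=0):
    """Kinetic Monte Carlo over a discrete state graph with jump rates of
    simplified Eyring-Kramers (Arrhenius) form k = A * exp(-beta * dE).
    barriers[s] maps state s to {neighbor: barrier height dE}."""
    rng = random.Random(seed)
    state, t, path = 0, 0.0, [0]
    for _ in range(n_jumps):
        rates = {s2: prefactor * math.exp(-beta * dE)
                 for s2, dE in barriers[state].items()}
        total = sum(rates.values())
        t += rng.expovariate(total)            # exponential residence time
        r, acc = rng.random() * total, 0.0
        for s2, k in rates.items():            # pick exit channel ∝ its rate
            acc += k
            if r <= acc:
                state = s2
                break
        path.append(state)
    return path, t

path, elapsed = kmc_path({0: {1: 1.0, 2: 3.0},
                          1: {0: 1.0, 2: 2.0},
                          2: {0: 3.0, 1: 2.0}})
```

The exponential residence time and rate-proportional exit channel are precisely the two ingredients whose error the quasi-stationary-distribution approach quantifies.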
Lee, Hyo-Chang; Chung, Chin-Wook
2015-10-20
Hysteresis, which is the history dependence of physical systems, is one of the most important topics in physics. Interestingly, bi-stability of plasma with a huge hysteresis loop has been observed in inductive plasma discharges. Despite long-standing plasma research, how this plasma hysteresis occurs has remained an unresolved question in plasma physics. Here, we report theory, experiment, and modeling of the hysteresis. It was found experimentally and theoretically that evolution of the electron energy distribution (EED) produces a strong plasma hysteresis. In Ramsauer and non-Ramsauer gas experiments, it was revealed that the plasma hysteresis is observed only in a high-pressure Ramsauer gas, where the EED deviates considerably from a Maxwellian shape. This hysteresis was reproduced by a plasma balance model in which the EED is taken into account. Because electrons in plasmas are usually not in thermal equilibrium, this EED effect can be regarded as a universal phenomenon in plasma physics.
Jump Markov models and transition state theory: the Quasi-Stationary Distribution approach
Di Gesù, Giacomo; Peutrec, Dorian Le; Nectoux, Boris
2016-01-01
We are interested in the connection between a metastable continuous state space Markov process (satisfying e.g. the Langevin or overdamped Langevin equation) and a jump Markov process in a discrete state space. More precisely, we use the notion of quasi-stationary distribution within a metastable state for the continuous state space Markov process to parametrize the exit event from the state. This approach is useful to analyze and justify methods which use the jump Markov process underlying a metastable dynamics as a support to efficiently sample the state-to-state dynamics (accelerated dynamics techniques). Moreover, it is possible by this approach to quantify the error on the exit event when the parametrization of the jump Markov model is based on the Eyring-Kramers formula. This therefore provides a mathematical framework to justify the use of transition state theory and the Eyring-Kramers formula to build kinetic Monte Carlo or Markov state models.
Liu, Yuan; Ning, Chuangang
2015-10-01
Recently, the development of photoelectron velocity map imaging has made it much easier to obtain photoelectron angular distributions (PADs) experimentally. However, explanations of PADs are only qualitative in most cases, and very few works have been reported on how to calculate the PAD of anions. In the present work, we report a method using density-functional-theory Kohn-Sham orbitals to calculate the photodetachment cross sections and the anisotropy parameter β. The spherical average over all random molecular orientations is calculated analytically. A program which can handle both Gaussian-type orbitals and Slater-type orbitals has been coded. The test calculations on Li-, C-, O-, F-, CH-, OH-, NH2-, O2-, and S2- show that our method is an efficient way to calculate the photodetachment cross section and anisotropy parameter β for anions, and is thus promising for large systems.
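The anisotropy parameter β enters the angular distribution through the standard one-photon dipole (Cooper-Zare) form. A minimal sketch of that formula, with illustrative β values rather than anything computed in the paper:

```python
import numpy as np

def pad(theta, beta, sigma=1.0):
    """One-photon dipole (Cooper-Zare) photoelectron angular distribution:
    dσ/dΩ = (σ/4π) [1 + β P2(cos θ)], with the physical range -1 <= β <= 2."""
    p2 = 0.5 * (3.0 * np.cos(theta) ** 2 - 1.0)
    return sigma / (4.0 * np.pi) * (1.0 + beta * p2)

theta = np.linspace(0.0, np.pi, 181)
i_iso = pad(theta, 0.0)    # isotropic emission (β = 0)
i_par = pad(theta, 2.0)    # cos² pattern, peaked along the polarization axis
i_perp = pad(theta, -1.0)  # sin² pattern, peaked perpendicular to it
```

Fitting a measured image to this form is how β is extracted experimentally; the paper's contribution is computing σ and β ab initio from Kohn-Sham orbitals.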
Vlah, Zvonimir; Okumura, Teppei; Desjacques, Vincent
2013-01-01
Numerical simulations show that redshift space distortions (RSD) introduce strong scale dependence in the power spectra of halos, with ten percent deviations relative to linear theory predictions even on relatively large scales (k<0.1h/Mpc) and even in the absence of satellites (which induce Fingers-of-God, FoG, effects). If unmodeled, these effects prevent one from extracting cosmological information from RSD surveys. In this paper we use perturbation theory (PT) and a halo biasing model and apply them to the distribution function approach to RSD, in which RSD is decomposed into several correlators of density-weighted velocity moments. We model each of these correlators using PT and compare the results to simulations over a wide range of halo masses and redshifts. We find that with the introduction of a physically motivated halo biasing, and using dark matter power spectra from simulations, we can reproduce the simulation results at a percent level on scales up to k~0.15h/Mpc at z=0, without the need to have fr...
Revisiting the Pion Leading-Twist Distribution Amplitude within the QCD Background Field Theory
Zhong, Tao; Wang, Zhi-Gang; Huang, Tao; Fu, Hai-Bing; Han, Hua-Yong
2014-01-01
We study the pion leading-twist distribution amplitude (DA) within the framework of SVZ sum rules under the background field theory. To improve the accuracy of the sum rules, we expand both the quark propagator and the vertex $(z\cdot \overleftrightarrow{D})^n$ of the correlator up to dimension-six operators in the background field theory. The sum rules for the pion DA moments are obtained, in which all condensates up to dimension-six have been taken into consideration. Using the sum rules, we obtain $\langle\xi^2_\pi\rangle|_{\rm 1\,GeV} = 0.338 \pm 0.032$, $\langle\xi^4_\pi\rangle|_{\rm 1\,GeV} = 0.211 \pm 0.030$, and $\langle\xi^6_\pi\rangle|_{\rm 1\,GeV} = 0.163 \pm 0.030$. It is shown that the dimension-six condensates provide sizable contributions to the pion DA moments. We show that the Gegenbauer moment of the pion leading-twist DA is $a^\pi_2|_{\rm 1\,GeV} = 0.403 \pm 0.093$, which is consistent with those obtained in the literature within errors but prefers a larger central value, as indicated by lattice QCD predictions.
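The quoted ξ-moments and the Gegenbauer moment a2 are linked by a simple integral, ⟨ξⁿ⟩ = ∫₀¹ (2x−1)ⁿ φ(x) dx. The sketch below truncates the DA at the C₂^{3/2} term purely for illustration (the paper keeps more structure), and uses the a2 value quoted above; with that truncation ⟨ξ²⟩ = 1/5 + (12/35)a2 reproduces the quoted 0.338:

```python
import numpy as np

def pion_da(x, a2):
    """Leading-twist pion DA truncated at the second Gegenbauer term:
    phi(x) = 6 x (1 - x) [1 + a2 * C_2^{3/2}(2x - 1)]."""
    xi = 2.0 * x - 1.0
    c2 = 1.5 * (5.0 * xi**2 - 1.0)
    return 6.0 * x * (1.0 - x) * (1.0 + a2 * c2)

def xi_moment(n, a2, npts=20001):
    """<xi^n> = integral_0^1 (2x - 1)^n phi(x) dx via the trapezoid rule."""
    x = np.linspace(0.0, 1.0, npts)
    y = (2.0 * x - 1.0) ** n * pion_da(x, a2)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

m0 = xi_moment(0, 0.403)   # normalization: the a2 term integrates to zero
m2 = xi_moment(2, 0.403)   # second xi-moment
```

This consistency check between a2 = 0.403 and ⟨ξ²⟩ ≈ 0.338 is a useful sanity test when reading DA moment tables.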
ρ -meson longitudinal leading-twist distribution amplitude within QCD background field theory
Fu, Hai-Bing; Wu, Xing-Gang; Cheng, Wei; Zhong, Tao
2016-10-01
We revisit the ρ-meson longitudinal leading-twist distribution amplitude (DA) $\phi_{2;\rho}^{\|}$ by using the QCD sum rules approach within the background field theory. To improve the accuracy of the sum rules for its moments $\langle\xi_{n;\rho}^{\|}\rangle$, we include the next-to-leading order QCD correction to the perturbative part and keep all nonperturbative condensates up to dimension-six consistently within the background field theory. The first two moments read $\langle\xi_{2;\rho}^{\|}\rangle|_{1\,{\rm GeV}} = 0.241(28)$ and $\langle\xi_{4;\rho}^{\|}\rangle|_{1\,{\rm GeV}} = 0.109(10)$, indicating a double-humped behavior for $\phi_{2;\rho}^{\|}$ at small energy scale. As an application, we apply them to the $B\to\rho$ transition form factors within the QCD light-cone sum rules, which are key components for the decay width $\Gamma(B\to\rho\ell\nu_\ell)$. To compare with the world average of $\Gamma(B\to\rho\ell\nu_\ell)$ issued by the Particle Data Group, we predict $|V_{ub}| = (3.19^{+0.65}_{-0.62})\times 10^{-3}$, which agrees with the BABAR and Omnès parametrization prediction within errors.
Non-Gaussianities in the topological charge distribution of the SU(3) Yang--Mills theory
Cé, Marco; Engel, Georg P; Giusti, Leonardo
2015-01-01
We study the topological charge distribution of the SU(3) Yang--Mills theory with high precision in order to be able to detect deviations from Gaussianity. The computation is carried out on the lattice with high-statistics Monte Carlo simulations by implementing a naive discretization of the topological charge evolved with the Yang--Mills gradient flow. This definition is far less demanding than the one suggested from Neuberger's fermions and, as shown in this paper, in the continuum limit its cumulants coincide with those of the universal definition appearing in the chiral Ward identities. Thanks to the range of lattice volumes and spacings considered, we can extrapolate the results for the second and fourth cumulants of the topological charge distribution to the continuum limit with confidence by keeping finite-volume effects negligible with respect to the statistical errors. Our best result for the topological susceptibility is $t_0^2\chi = 6.67(7)\times 10^{-4}$, where $t_0$ is a standard reference scale, while for the...
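The Gaussianity test above rests on the fourth cumulant, c4 = ⟨Q⁴⟩ − 3⟨Q²⟩², which vanishes for a Gaussian distribution. A minimal sketch with synthetic Gaussian "charges" as a sanity check (illustrative data, not lattice measurements):

```python
import numpy as np

def charge_cumulants(Q):
    """Second and fourth cumulants of a (zero-mean) topological charge
    sample: c2 = <Q^2>, c4 = <Q^4> - 3 <Q^2>^2; c4 = 0 for a Gaussian."""
    q2 = float(np.mean(Q**2))
    q4 = float(np.mean(Q**4))
    return q2, q4 - 3.0 * q2**2

# Synthetic Gaussian measurements: c4 should come out consistent with zero.
rng = np.random.default_rng(0)
c2, c4 = charge_cumulants(rng.normal(size=200_000))
```

On real Monte Carlo data, a nonzero c4 (per unit volume, in the continuum limit) is exactly the non-Gaussianity signal the paper extracts.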
Aquilanti, Vincenzo; Coutinho, Nayara Dantas; Carvalho-Silva, Valter Henrique
2017-04-28
This article surveys the empirical information which originated both from laboratory experiments and from computational simulations, and expands previous understanding of the rates of chemical processes in the low-temperature range, where deviations from linearity of Arrhenius plots were revealed. The phenomenological two-parameter Arrhenius equation requires improvement for applications where interpolation or extrapolation is demanded in various areas of modern science. Based on Tolman's theorem, the dependence of the reciprocal of the apparent activation energy as a function of reciprocal absolute temperature permits the introduction of a deviation parameter d covering uniformly a variety of rate processes, from those where quantum mechanical tunnelling is significant and d < 0 to those where d > 0, corresponding to the Pareto-Tsallis statistical weights: these generalize the Boltzmann-Gibbs weight, which is recovered for d = 0. It is shown here how the weights arise, relaxing the thermodynamic equilibrium limit, either for a binomial distribution if d > 0 or for a negative binomial distribution if d < 0, connecting in these cases to transition-state theory for chemical kinetics including quantum mechanical tunnelling, and for case (iii) to the stereodirectional specificity of the dynamics of reactions strongly hindered by the increase of temperature. This article is part of the themed issue 'Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces'. © 2017 The Author(s).
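The d-parameter described above is commonly written as the deformed (Aquilanti-Mundim) Arrhenius law, k(T) = A[1 − d·ε/(k_B T)]^{1/d}, which reduces to the classical Arrhenius form as d → 0. A sketch under that assumption, with illustrative reduced units:

```python
import math

def k_deformed(T, A, eps, d, kB=1.0):
    """Deformed Arrhenius rate k(T) = A [1 - d * eps/(kB T)]^(1/d);
    tends to the classical A exp(-eps/(kB T)) as d -> 0."""
    x = eps / (kB * T)
    if abs(d) < 1e-12:
        return A * math.exp(-x)
    base = 1.0 - d * x
    return A * base ** (1.0 / d) if base > 0.0 else 0.0

T = 0.5   # a low reduced temperature, where the deviations show up
k_arr = k_deformed(T, 1.0, 1.0, 0.0)    # d = 0: Boltzmann-Gibbs / Arrhenius
k_sub = k_deformed(T, 1.0, 1.0, -0.3)   # d < 0: tunnelling-like, sub-Arrhenius
k_sup = k_deformed(T, 1.0, 1.0, 0.3)    # d > 0: falls off faster at low T
```

The d < 0 branch decays only as a power law at low temperature, mimicking the tunnelling-enhanced rates the survey discusses, while d > 0 suppresses them.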
Codis, Sandrine; Gavazzi, Raphaël; Pichon, Christophe; Gouin, Céline
2017-09-01
Aims: Gravitational lensing allows us to quantify the angular distribution of the convergence field around clusters of galaxies to constrain their connectivity to the cosmic web. We describe the corresponding theory in Lagrangian space in which analytical results can be obtained by identifying clusters to peaks in the initial field. Methods: We derived the three-point Gaussian statistics of a two-dimensional (2D) field and its first and second derivatives. The formalism allowed us to study the statistics of the field in a shell around a central peak, in particular its multipolar decomposition. Results: The peak condition is shown to significantly remove power from the dipolar contribution and to modify the monopole and quadrupole. As expected, higher order multipoles are not significantly modified by the constraint. Analytical predictions are successfully checked against measurements in Gaussian random fields. The effect of substructures and radial weighting is shown to be small and does not change the qualitative picture. The non-linear evolution is shown to induce a non-linear bias of all multipoles proportional to the cluster mass. Conclusions: We predict the Gaussian and weakly non-Gaussian statistics of multipolar moments of a 2D field around a peak as a proxy for the azimuthal distribution of the convergence field around a cluster of galaxies. A quantitative estimate of this multipolar decomposition of the convergence field around clusters in numerical simulations of structure formation and in observations will be presented in two forthcoming papers.
SHWETA DHILLON; RAMA KANT
2017-08-01
The Randles-Ershler admittance model is extensively used in the modeling of batteries, fuel cells, sensors, etc. It is also used in understanding the response of fundamental systems with coupled processes such as charge transfer, diffusion, electric double layer charging and uncompensated solution resistance. We generalize the phenomenological theory for the Randles-Ershler admittance at an electrode with double layer capacitance and charge transfer heterogeneity, viz., non-uniform double layer capacitance and charge transfer resistance (c_d and R_CT). Electrode heterogeneity is modeled through distribution functions of R_CT and c_d, viz., the log-normal distribution function. The high frequency region captures the influence of the electric double layer, while the intermediate frequency region captures the influence of the charge transfer resistance of the heterogeneous electrode. A heterogeneous electrode with mean charge transfer resistance $\bar{R}_{CT}$ shows faster charge transfer kinetics than an electrode with uniform charge transfer resistance equal to $\bar{R}_{CT}$. It is also observed that a heterogeneous electrode having a high mean with large variance in R_CT and c_d can behave the same as an electrode having a low mean with small variance in R_CT and c_d. The origin of the coupling of the uncompensated solution resistance (between working and reference electrodes) with the charge transfer kinetics is explained. Finally, our model provides a simple route to understanding the effect of spatial heterogeneity.
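The ensemble-averaging idea behind such heterogeneity models can be sketched numerically: average the admittance of a textbook Randles circuit over a log-normal distribution of R_CT. This is an illustrative sketch under assumed circuit equations and parameter names, not the paper's generalized phenomenological theory:

```python
import numpy as np

def randles_admittance(omega, r_s, r_ct, c_d, sigma_w):
    """Admittance of a textbook Randles circuit: solution resistance r_s in
    series with [double-layer capacitance c_d in parallel with the faradaic
    branch (charge transfer resistance r_ct + semi-infinite Warburg)]."""
    z_w = sigma_w * (1 - 1j) / np.sqrt(omega)     # Warburg impedance
    y_interface = 1j * omega * c_d + 1.0 / (r_ct + z_w)
    return 1.0 / (r_s + 1.0 / y_interface)

def heterogeneous_admittance(omega, r_s, mean_rct, cv_rct, c_d, sigma_w,
                             n=20000, seed=0):
    """Average admittance over a log-normal R_CT distribution with the
    given mean and coefficient of variation cv_rct."""
    rng = np.random.default_rng(seed)
    mu = np.log(mean_rct / np.sqrt(1.0 + cv_rct ** 2))
    s = np.sqrt(np.log(1.0 + cv_rct ** 2))
    r_ct = rng.lognormal(mu, s, n)
    return randles_admittance(omega, r_s, r_ct, c_d, sigma_w).mean()
```

With cv_rct = 0 the ensemble collapses to the uniform electrode, which provides a consistency check; with cv_rct > 0 the ensemble-averaged admittance differs from that of a uniform electrode at the same mean R_CT, which is the qualitative effect the abstract describes.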
A new measure based on degree distribution that links information theory and network graph analysis
2012-01-01
Background Detailed connection maps of human and nonhuman brains are being generated with new technologies, and graph metrics have been instrumental in understanding the general organizational features of these structures. Neural networks appear to have small world properties: they have clustered regions, while maintaining integrative features such as short average pathlengths. Results We captured the structural characteristics of clustered networks with short average pathlengths through our own variable, System Difference (SD), which is computationally simple and calculable for larger graph systems. SD is a Jaccardian measure generated by averaging all of the differences in the connection patterns between any two nodes of a system. We calculated SD over large random samples of matrices and found that high SD matrices have a low average pathlength and a larger number of clustered structures. SD is a measure of degree distribution with high SD matrices maximizing entropic properties. Phi (Φ), an information theory metric that assesses a system’s capacity to integrate information, correlated well with SD - with SD explaining over 90% of the variance in systems above 11 nodes (tested for 4 to 13 nodes). However, newer versions of Φ do not correlate well with the SD metric. Conclusions The new network measure, SD, provides a link between high entropic structures and degree distributions as related to small world properties. PMID:22726594
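One plausible reading of the SD definition above, averaging the differences between the connection patterns of all node pairs, can be sketched as follows (the function name and exact normalization are illustrative assumptions; the paper's definition may normalize differently):

```python
import numpy as np
from itertools import combinations

def system_difference(adj):
    """System Difference (SD), read here as the mean fraction of positions
    in which the connection patterns (adjacency rows) of two nodes differ,
    averaged over all node pairs of the system."""
    adj = np.asarray(adj, dtype=bool)
    n = adj.shape[0]
    pair_diffs = [np.mean(adj[i] != adj[j])
                  for i, j in combinations(range(n), 2)]
    return float(np.mean(pair_diffs))
```

Under this reading, a system whose nodes all share one connection pattern scores 0, while maximally dissimilar patterns score 1, matching the intuition that high-SD matrices have more varied (higher-entropy) degree structure.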
FROM BALLOT THEOREMS TO THE THEORY OF QUEUES
Keywords: queueing theory; distribution theory; probability; stochastic processes; sequences (mathematics); difference equations; integral transforms; time; statistical functions
Becker, A; Faisal, F
2001-03-26
Recently observed momentum distribution of doubly charged recoil-ions of atoms produced by femtosecond infrared laser pulses is analyzed using the so-called intense-field many-body S-matrix theory. Observed characteristics of the momentum distributions, parallel and perpendicular to the polarization axis, are reproduced by the theory. It is shown that correlated energy-sharing between the two electrons in the intermediate state and their 'Volkov-dressing' in the final state, can explain the origin of these characteristics.
Wendt, Kyle
2016-03-01
How large is the 48Ca nucleus? While the electric charge distribution of this nucleus was accurately measured decades ago, both experimental and ab initio descriptions of the neutron distribution are deficient. We address this question using ab initio calculations of the electric charge, neutron, and weak distributions of 48Ca based on chiral effective field theory. Historically, chiral effective field theory calculations of systems larger than 4 nucleons have been plagued by strong systematic errors which result in theoretical descriptions that are too dense and over bound. We address these errors using a novel approach that permits us to accurately reproduce binding energy and charge radius of 48Ca, and to constrain electroweak observables such as the neutron radius, electric dipole polarizability, and the weak form factor. For a full list of contributors to this work, please see ``Neutron and weak-charge distributions of the 48Ca nucleus,'' Nature Physics (2015) doi:10.1038/nphys3529.
Aquilanti, Vincenzo; Coutinho, Nayara Dantas; Carvalho-Silva, Valter Henrique
2017-03-01
This article surveys the empirical information which originated both by laboratory experiments and by computational simulations, and expands previous understanding of the rates of chemical processes in the low-temperature range, where deviations from linearity of Arrhenius plots were revealed. The phenomenological two-parameter Arrhenius equation requires improvement for applications where interpolation or extrapolations are demanded in various areas of modern science. Based on Tolman's theorem, the dependence of the reciprocal of the apparent activation energy as a function of reciprocal absolute temperature permits the introduction of a deviation parameter d covering uniformly a variety of rate processes, from those where quantum mechanical tunnelling is significant and d < 0 to those where d > 0, corresponding to the Pareto-Tsallis statistical weights: these generalize the Boltzmann-Gibbs weight, which is recovered for d = 0. It is shown here how the weights arise, relaxing the thermodynamic equilibrium limit, either for a binomial distribution if d > 0 or for a negative binomial distribution if d < 0. Applications are made to (i) the super-Arrhenius kinetics, where transport phenomena accelerate processes as the temperature increases; (ii) the sub-Arrhenius kinetics, where quantum mechanical tunnelling propitiates low-temperature reactivity; and (iii) the anti-Arrhenius kinetics, where processes with no energetic obstacles are rate-limited by molecular reorientation requirements. Particular attention is given for case (i) to the treatment of diffusion and viscosity, for case (ii) to the formulation of a transition rate theory for chemical kinetics including quantum mechanical tunnelling, and for case (iii) to the stereodirectional specificity of the dynamics of reactions strongly hindered by the increase of temperature. This article is part of the themed issue 'Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces'.
Dekkers, Petrus J; Tuinman, Ilse L; Marijnissen, Jan C M; Friedlander, Sheldon K; Scarlett, B
2002-04-15
The gas-to-particle synthesis route is a relatively clean and efficient way to produce high-quality ceramic powders. These powders can subsequently be sintered into any desired shape. The modeling of these production systems is difficult because several mechanisms occur in parallel. From theoretical considerations it can be determined, however, that coagulation and sintering are the dominant mechanisms as far as the shape and size of the particles are concerned. In part I of this article an extensive theoretical analysis was given of the self-preserving size distribution theory for power law particles. In this second part, cumulative particle size distributions of silicon and silicon nitride agglomerates, produced in a laser reactor, were determined from TEM pictures and compared to the distributions calculated from this self-preserving theory for power law particles. The calculated distributions were in fair agreement with the measured results, especially at the high end of the distributions. Calculated and measured particle growth rates were also in fair agreement. Using the self-preserving theory, an analysis was made of the distribution of annealed silicon agglomerates, of interest in applications to nanoparticle technology.
Underwood, H R; Peterson, A F; Magin, R L
1992-02-01
A rectangular microstrip antenna radiator is investigated for its near-zone radiation characteristics in water. Calculations of a cavity model theory are compared with the electric-field measurements of a miniature nonperturbing diode-dipole E-field probe whose 3 mm tip was positioned by an automatic three-axis scanning system. These comparisons have implications for the use of microstrip antennas in a multielement microwave hyperthermia applicator. Half-wavelength rectangular microstrip patches were designed to radiate in water at 915 MHz. Both low (epsilon r = 10) and high (epsilon r = 85) dielectric constant substrates were tested. Normal and tangential components of the near-zone radiated electric field were discriminated by appropriate orientation of the E-field probe. Low normal to transverse electric-field ratios at 3.0 cm depth indicate that the radiators may be useful for hyperthermia heating with an intervening water bolus. Electric-field pattern addition from a three-element linear array of these elements in water indicates that phase and amplitude adjustment can achieve some limited control over the distribution of radiated power.
Babbush, Ryan; Parkhill, John; Aspuru-Guzik, Alán
2013-01-01
Feynman and Hibbs were the first to variationally determine an effective potential whose associated classical canonical ensemble approximates the exact quantum partition function. We examine the existence of a map between the local potential and an effective classical potential which matches the exact quantum equilibrium density and partition function. The usefulness of such a mapping rests in its ability to readily improve Born-Oppenheimer potentials for use with classical sampling. We show that such a map is unique and must exist. To explore the feasibility of using this result to improve classical molecular mechanics, we numerically produce a map from a library of randomly generated one-dimensional potential/effective potential pairs then evaluate its performance on independent test problems. We also apply the map to simulate liquid para-hydrogen, finding that the resulting radial pair distribution functions agree well with path integral Monte Carlo simulations. The surprising accessibility and transferability of the technique suggest a quantitative route to adapting Born-Oppenheimer potentials, with a motivation similar in spirit to the powerful ideas and approximations of density functional theory.
Liu, Yuan [Department of Physics, State Key Laboratory of Low-Dimensional Quantum Physics, Tsinghua University, Beijing 100084 (China); Ning, Chuangang, E-mail: ningcg@tsinghua.edu.cn [Department of Physics, State Key Laboratory of Low-Dimensional Quantum Physics, Tsinghua University, Beijing 100084 (China); Collaborative Innovation Center of Quantum Matter, Beijing (China)
2015-10-14
Recently, the development of photoelectron velocity map imaging has made it much easier to obtain photoelectron angular distributions (PADs) experimentally. However, explanations of PADs are only qualitative in most cases, and very little work has been reported on how to calculate the PAD of anions. In the present work, we report a method using density-functional-theory Kohn-Sham orbitals to calculate the photodetachment cross sections and the anisotropy parameter β. The spherical average over all random molecular orientations is calculated analytically. A program which can handle both Gaussian type orbitals and Slater type orbitals has been coded. Test calculations on Li⁻, C⁻, O⁻, F⁻, CH⁻, OH⁻, NH₂⁻, O₂⁻, and S₂⁻ show that our method is an efficient way to calculate the photodetachment cross section and anisotropy parameter β for anions, and is thus promising for large systems.
Swartz, M A; Berk, D A; Jain, R K
1996-01-01
We present a novel integrative method for characterizing transport in the lymphatic capillaries in the tail of the anesthetized mouse, which is both sensitive and reproducible for quantifying uptake and flow. Interstitially injected, fluorescently labeled macromolecules were used to visualize and quantify these processes. Residence time distribution (RTD) theory was employed to measure net flow velocity in the lymphatic network as well as to provide a relative measure of lymphatic uptake of macromolecules from the interstitium. The effects of particle size and injection pressure were determined. The uptake rate was found to be independent of particle size in the range of a 6- to 18-nm radius; beyond this size, the interstitial matrix seemed to pose a greater barrier. A comparison of 10 vs. 40 cmH2O injection pressure showed a significant influence on the relative uptake rate but not on the net velocity within the network (3.3 +/- 0.8 vs. 3.8 +/- 1.0 micron/s). This suggested the presence of a systemic driving force for baseline lymph propulsion that is independent of the local pressure gradients driving the uptake. This model can be used to examine various aspects of transport physiology of the initial lymphatics.
The application of age distribution theory in the analysis of cytofluorimetric DNA histogram data.
Watson, J V
1977-03-01
Age distribution theory has been employed in a model to analyse a variety of histograms of the DNA content of single cells in samples from experimental tumours growing in tissue culture. The method has produced satisfactory correspondence with the experimental data in which there was a wide variation in the proportions of cells in the intermitotic phases, and generally good agreement between the 3H-thymidine labelling index and the computed proportion in S phase. The model has the capacity to analyse data from populations which contain a proportion of non-cycling cells. However, it is concluded that reliable results for the growth fraction and also for the relative durations of the intermitotic phase times cannot be obtained for the data reported here from the DNA histograms alone. To obtain reliable estimates of the growth fraction the relative durations of the phase time must be known, and conversely, reliable estimates of the relative phase durations can only be obtained if the growth fraction is known.
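The steady-state age distribution underlying such DNA-histogram models can be illustrated with a textbook sketch for an asynchronous, exponentially growing population with 100% growth fraction (the phase durations and function name here are illustrative assumptions, not the model fitted in the paper, which also handles non-cycling cells):

```python
def phase_fractions(t_g1, t_s, t_g2m):
    """Expected G1, S and G2+M fractions for an asynchronous, exponentially
    growing population with 100% growth fraction, using the classic
    steady-state age density n(a) proportional to 2**(-a/Tc)."""
    tc = t_g1 + t_s + t_g2m                       # total cell cycle time
    def frac(a1, a2):                             # fraction aged in [a1, a2]
        return 2.0 * (2.0 ** (-a1 / tc) - 2.0 ** (-a2 / tc))
    return frac(0.0, t_g1), frac(t_g1, t_g1 + t_s), frac(t_g1 + t_s, tc)
```

Note that the age density over-weights young cells (each division produces two newborns), so the G1 fraction exceeds the naive ratio of the G1 duration to the cycle time; this is the basic effect age distribution theory contributes when interpreting DNA histograms.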
Active control of the spatial MRI phase distribution with optimal control theory
Lefebvre, Pauline M.; Van Reeth, Eric; Ratiney, Hélène; Beuf, Olivier; Brusseau, Elisabeth; Lambert, Simon A.; Glaser, Steffen J.; Sugny, Dominique; Grenier, Denis; Tse Ve Koon, Kevin
2017-08-01
This paper investigates the use of Optimal Control (OC) theory to design Radio-Frequency (RF) pulses that actively control the spatial distribution of the MRI magnetization phase. The RF pulses are generated through the application of the Pontryagin Maximum Principle and optimized so that the resulting transverse magnetization reproduces various non-trivial and spatial phase patterns. Two different phase patterns are defined and the resulting optimal pulses are tested both numerically with the ODIN MRI simulator and experimentally with an agar gel phantom on a 4.7 T small-animal MR scanner. Phase images obtained in simulations and experiments are both consistent with the defined phase patterns. A practical application of phase control with OC-designed pulses is also presented, with the generation of RF pulses adapted for a Magnetic Resonance Elastography experiment. This study demonstrates the possibility to use OC-designed RF pulses to encode information in the magnetization phase and could have applications in MRI sequences using phase images.
Vlah, Zvonimir; McDonald, Patrick; Okumura, Teppei; Baldauf, Tobias
2012-01-01
We develop a perturbative approach to redshift space distortions (RSD) using the phase space distribution function approach and apply it to the dark matter redshift space power spectrum and its moments. RSD can be written as a sum over density weighted velocity moments correlators, with the lowest order being density, momentum density and stress energy density. We use standard and extended perturbation theory (PT) to determine their auto and cross correlators, comparing them to N-body simulations. We show which of the terms can be modeled well with the standard PT and which need additional terms that include higher order corrections which cannot be modeled in PT. Most of these additional terms are related to the small scale velocity dispersion effects, the so called finger of god (FoG) effects, which affect some, but not all, of the terms in this expansion, and which can be approximately modeled using a simple physically motivated ansatz such as the halo model. We point out that there are several velocity dis...
Force-Field Functor Theory: Classical Force-Fields which Reproduce Equilibrium Quantum Distributions
Ryan Babbush
2013-10-01
Feynman and Hibbs were the first to variationally determine an effective potential whose associated classical canonical ensemble approximates the exact quantum partition function. We examine the existence of a map between the local potential and an effective classical potential which matches the exact quantum equilibrium density and partition function. The usefulness of such a mapping rests in its ability to readily improve Born-Oppenheimer potentials for use with classical sampling. We show that such a map is unique and must exist. To explore the feasibility of using this result to improve classical molecular mechanics, we numerically produce a map from a library of randomly generated one-dimensional potential/effective potential pairs then evaluate its performance on independent test problems. We also apply the map to simulate liquid para-hydrogen, finding that the resulting radial pair distribution functions agree well with path integral Monte Carlo simulations. The surprising accessibility and transferability of the technique suggest a quantitative route to adapting Born-Oppenheimer potentials, with a motivation similar in spirit to the powerful ideas and approximations of density functional theory.
Heavy Pseudoscalar Twist-3 Distribution Amplitudes within QCD Theory in Background Fields
Zhong, Tao; Huang, Tao; Fu, Hai-Bing
2016-01-01
In this paper, we study the properties of the twist-3 distribution amplitude (DA) of the heavy pseudoscalars such as $\eta_c$, $B_c$ and $\eta_b$. New sum rules for the twist-3 DA moments $\langle\xi^n_P\rangle_{\rm HP}$ and $\langle\xi^n_\sigma\rangle_{\rm HP}$ up to sixth order and up to dimension-six condensates are deduced under the framework of the background field theory. Based on the sum rules for the twist-3 DA moments, we construct a new model for the two twist-3 DAs of the heavy pseudoscalar with the help of the Brodsky-Huang-Lepage prescription. Furthermore, we apply them to the $B_c\to\eta_c$ transition form factor ($f^{B_c\to\eta_c}_+(q^2)$) within the light-cone sum rules approach, and the results are comparable with other approaches. It has been found that the twist-3 DAs $\phi^P_{3;\eta_c}$ and $\phi^\sigma_{3;\eta_c}$ are important for a reliable prediction of $f^{B_c\to\eta_c}_+(q^2)$. For example, at the maximum recoil region, we have $f^{B_c\to\eta_c}_+(0) = 0.674 \pm 0.066$, in which those two twist-3 terms provide $\sim$33% and $\sim$22% contributions.
Lee, Hyo-Chang; Chung, Chin-Wook
2016-09-01
Hysteresis, the history dependence of physical systems, indicates that there are more than two stable points under a given condition, and it has been considered one of the most important topics in fundamental physics. Recently, the hysteresis of plasma has become a focus of research because stable plasma operation is very important for fusion reactors, bio-medical plasmas, and industrial plasmas for nano-device fabrication processes. Interestingly, bi-stability characteristics of plasma with a huge hysteresis loop have been observed in inductive discharge plasmas. Because hysteresis study in such plasmas can provide a universal understanding of plasma physics, many researchers have attempted experimental and theoretical studies. Despite long plasma research, how this plasma hysteresis occurs has remained an unresolved question in plasma physics. Here, we report theory, experiment, and modeling of the hysteresis. It was found experimentally and theoretically that evolution of the electron energy distribution (EED) produces a strong plasma hysteresis. In Ramsauer and non-Ramsauer gas experiments, it was revealed that the plasma hysteresis is observed only in high-pressure Ramsauer gases, where the EED deviates considerably from a Maxwellian shape. This hysteresis was reproduced by a plasma balance model in which the EED is taken into account. Because electrons in plasmas are usually not in thermal equilibrium, this EED effect can be regarded as a universal phenomenon in plasma physics. This research was partially supported by the Korea Research Institute of Standards and Science.
YANG Yun-tao; SHI Zhi-yong; L(U) Jian-gang; GUAN Zhen-zhen
2009-01-01
The interference of the carrier magnetic field with the geomagnetic field has long been a difficult problem, affecting the deviation of the navigation compass and the error of geomagnetic measurement. To achieve the geomagnetic measuring accuracy required for geomagnetic matching localization, a strategy to eliminate the effect of the connatural and induced magnetic fields of the carrier on the geomagnetic measuring accuracy is investigated. Magnetic-dipole field distribution theory is used to deduce the magnetic composition at the position of the sensor installed on the carrier. A geomagnetic measurement model is established using measurement data from an ideal sensor. Considering the magnetic disturbance of the carrier and the error of the sensor, a geomagnetic measurement compensation model is built. This model can be used to compensate for the errors of the carrier magnetic field and the magnetic sensor in any case, and its parameters have clear physical meaning. The experimental results show that the model has higher geomagnetic measuring accuracy than others.
Algina, James; Keselman, H. J.
2008-01-01
Applications of distribution theory for the squared multiple correlation coefficient and the squared cross-validation coefficient are reviewed, and computer programs for these applications are made available. The applications include confidence intervals, hypothesis testing, and sample size selection. (Contains 2 tables.)
Cyr-Racine, Francis-Yan; Zavala, Jesus; Bringmann, Torsten; Vogelsberger, Mark; Pfrommer, Christoph
2015-01-01
We formulate an effective theory of structure formation (ETHOS) that enables cosmological structure formation to be computed in almost any microphysical model of dark matter physics. This framework maps the detailed microphysical theories of particle dark matter interactions into the physical effective parameters that shape the linear matter power spectrum and the self-interaction transfer cross section of non-relativistic dark matter. These are the input to structure formation simulations, which follow the evolution of the cosmological and galactic dark matter distributions. Models with similar effective parameters in ETHOS but with different dark particle physics would nevertheless result in similar dark matter distributions. We present a general method to map an ultraviolet complete or effective field theory of low energy dark matter physics into parameters that affect the linear matter power spectrum and carry out this mapping for several representative particle models. We further propose a simple but use...
Munaò, G; Costa, D; Saija, F; Caccamo, C
2010-02-28
We report molecular dynamics and reference interaction site model (RISM) theory of methanol and carbon tetrachloride mixtures. Our study encompasses the whole concentration range, including the pure-component limits. We focus mainly on an analysis of partial, total, and concentration-concentration structure factors, and examine in detail the k → 0 limits of these functions. Simulation results confirm the tendency of methanol to self-associate with the formation of ring structures in the high-dilution regime of this species, in agreement with experimental studies and with previous simulations by other authors. This behavior emerges as strongly related to the high nonideality of the mixture, a quantitative estimate of which is provided in terms of concentration fluctuation correlations, through the structure factors examined. The interaggregate correlation distance is also thereby estimated. Finally, the compressibility of the mixture is found to be in good agreement with experimental data. The RISM predictions are assessed throughout against simulation; the theory describes the apolar solvent better than the alcohol properties. Self-association of methanol is qualitatively reproduced, though this trend is much less marked in comparison with the simulation results.
LIU Defu; WANG Liping; PANG Liang
2006-01-01
In this paper, a new type of distribution, the multivariate compound extreme value distribution (MCEVD), is introduced by compounding a discrete distribution with a multivariate continuous distribution of extreme sea events. In its engineering application, the number of events over a certain threshold level per year is fitted to a Poisson distribution and the corresponding extreme sea events are fitted to a Nested Logistic distribution; the Poisson-Nested Logistic trivariate compound extreme value distribution (PNLTCED) is then proposed to predict extreme wave heights, periods and wind speeds in the Yellow Sea. The new model gives more stable and reasonable predictions.
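The compounding idea can be sketched in a univariate toy form: if the yearly number of threshold exceedances is Poisson and event magnitudes are iid, the annual-maximum CDF follows in closed form. The Gumbel magnitude distribution and all parameter values here are hypothetical illustrations; the paper's MCEVD compounds the Poisson count with a multivariate Nested Logistic distribution instead:

```python
import math

def compound_annual_max_cdf(x, lam, magnitude_cdf):
    """CDF of the annual maximum when the yearly number of threshold
    exceedances is Poisson(lam) and event magnitudes are iid with CDF
    magnitude_cdf:  P(max <= x) = sum_k P(N=k) F(x)^k
                                = exp(-lam * (1 - F(x)))."""
    return math.exp(-lam * (1.0 - magnitude_cdf(x)))

def gumbel_cdf(x, mu=0.0, beta=1.0):
    # example magnitude distribution; parameter values are hypothetical
    return math.exp(-math.exp(-(x - mu) / beta))
```

With lam = 0 no events occur and the annual maximum is vacuously below any level, so the CDF is identically 1; increasing lam shifts the annual-maximum distribution toward larger values, which is the qualitative behavior the compound construction captures.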
Revisiting the theory of the evolution of pick-up ion distributions: magnetic or adiabatic cooling?
H. J. Fahr
2008-01-01
We study the phase-space behaviour of heliospheric pick-up ions after the time of their injection as newly created ions into the solar wind bulk flow from either charge exchange or photoionization of interplanetary neutral atoms. As interaction with the ambient MHD wave fields we allow for rapid pitch-angle diffusion, but at the beginning of this paper we neglect the effect of quasilinear or nonlinear energy diffusion (Fermi-2 acceleration) induced by counterflowing ambient waves. In the literature to date on the convection of pick-up ions by the solar wind, only adiabatic cooling of these ions is considered, which in the solar wind frame takes care of filling the gap between the injection energy and the energies of the thermal bulk of solar wind ions. Here we reinvestigate the basics of the theory behind this assumption of adiabatic pick-up ion reactions and the predictions derived from it. We then compare it with the new assumption of a purely magnetic cooling of pick-up ions, resulting simply from their being convected in an interplanetary magnetic field which decreases in magnitude with increasing solar distance. We compare the pick-up ion distribution functions derived in both ways and point out essential differences of observational and diagnostic relevance. Furthermore, we then include stochastic acceleration processes by wave-particle interactions. As we show, magnetic cooling in conjunction with diffusive acceleration by wave-particle interaction allows for an unbroken power law with the unique power index γ = −5, extending from the lowest velocities up to the highest-energy particles of about 100 keV, which can just marginally be in resonance with magnetoacoustic turbulences. Consequences for the resulting pick-up ion pressures are also analysed.
A model for habitat selection and species distribution derived from central place foraging theory.
Olsson, Ola; Bolin, Arvid
2014-06-01
We have developed a habitat selection model based on central place foraging theory. An individual's decision to include a patch in its habitat depends on the marginal fitness contribution of that patch, which is characterized by its quality and distance to the central place. The essence of the model we have developed is a fitness isocline which is a function of patch quality and travel time to the patch. It has two parameters: the maximum travel distance to a patch of infinite quality and a coefficient that appropriately scales quality by travel time. Patches falling below the isocline will have positive marginal fitness values and should be included in the habitat. The maximum travel distance depends on the availability and quality of patches, as well as on the forager's life history, whereas the scaling parameter mostly depends on life history properties. Using the model, we derived a landscape quality metric (which can be thought of as a connectivity measure) that sums the values of available habitat in the landscape around a central place. We then fitted the two parameters to foraging data on breeding white storks (Ciconia ciconia) and estimated landscape quality, which correlated strongly with reproductive success. Landscape quality was then calculated for a larger region where re-introduction of the species is currently going on in order to demonstrate how this model can also be regarded as a species distribution model. In conclusion, we have built a general habitat selection model for central place foragers and a novel way of estimating landscape quality based on a behaviorally scaled connectivity metric.
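The inclusion rule can be sketched with a hypothetical saturating isocline; the functional form t_iso(q) = t_max * q / (q + c), the function names, and all parameter values below are illustrative assumptions standing in for the paper's fitted two-parameter isocline:

```python
def include_patch(quality, travel_time, t_max, c):
    """Patch-inclusion rule for a central place forager: tolerated travel
    time grows with patch quality and approaches t_max for a patch of
    infinite quality (hypothetical isocline shape)."""
    t_iso = t_max * quality / (quality + c)
    return travel_time < t_iso

def landscape_quality(patches, t_max, c):
    """Sum the quality of all (quality, travel_time) patches falling below
    the isocline, as a simple connectivity-style landscape metric."""
    return sum(q for q, t in patches if include_patch(q, t, t_max, c))
```

Here t_max and c play the roles of the model's two parameters: the maximum travel distance to a patch of infinite quality and the coefficient scaling quality by travel time.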
Heavy pseudoscalar twist-3 distribution amplitudes within QCD theory in background fields
Zhong, Tao [Henan Normal University, College of Physics and Materials Science, Xinxiang (China); Wu, Xing-Gang [Chongqing University, Department of Physics, Chongqing (China); Huang, Tao [Chinese Academy of Sciences, Institute of High Energy Physics and Theoretical Physics Center for Science Facilities, Beijing (China); Fu, Hai-Bing [Guizhou Minzu University, School of Science, Guiyang (China)
2016-09-15
In this paper, we study the properties of the twist-3 distribution amplitude (DA) of the heavy pseudoscalars such as η_c, B_c, and η_b. New sum rules for the twist-3 DA moments ⟨ξ^n_P⟩_HP and ⟨ξ^n_σ⟩_HP up to sixth order and up to dimension-six condensates are deduced under the framework of the background field theory. Based on the sum rules for the twist-3 DA moments, we construct a new model for the two twist-3 DAs of the heavy pseudoscalar with the help of the Brodsky-Huang-Lepage prescription. Furthermore, we apply them to the B_c → η_c transition form factor f_+^{B_c→η_c}(q²) within the light-cone sum rules approach, and the results are comparable with other approaches. It has been found that the twist-3 DAs φ^P_{3;η_c} and φ^σ_{3;η_c} are important for a reliable prediction of f_+^{B_c→η_c}(q²). For example, at the maximum recoil region, we have f_+^{B_c→η_c}(0) = 0.674 ± 0.066, in which those two twist-3 terms provide ~33% and ~22% contributions. We also calculate the branching ratio of the semi-leptonic decay B_c → η_c lν: Br(B_c → η_c lν) = (9.31^{+2.27}_{−2.01}) × 10⁻³.
Nakatsuka, Takao [Okayama Shoka University, Laboratory of Information Science, Okayama (Japan); Okei, Kazuhide [Kawasaki Medical School, Dept. of Information Sciences, Kurashiki (Japan); Iyono, Atsushi [Okayama university of Science, Dept. of Fundamental Science, Faculty of Science, Okayama (Japan); Bielajew, Alex F. [Univ. of Michigan, Dept. Nuclear Engineering and Radiological Sciences, Ann Arbor, MI (United States)
2015-12-15
The simultaneous distribution of the deflection angle and the lateral displacement of fast charged particles traversing matter is derived by applying numerical inverse Fourier transforms to the Fourier spectral density solved analytically under the Moliere theory of multiple scattering, taking account of ionization loss. Our results show a simultaneous Gaussian distribution in the region of both small deflection angle and small lateral displacement, but characteristic contour patterns of probability density specific to single and double scattering in the regions of large deflection angle and/or lateral displacement. The influences of ionization loss on the distribution are also investigated. An exact simultaneous distribution is derived under the fixed-energy condition based on a well-known model of screened single scattering, which indicates the limit of validity of the Moliere theory applied to the simultaneous distribution. The simultaneous distribution will be valuable for improving the accuracy and efficiency of experimental analyses and simulation studies relating to charged particle transport. (orig.)
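In the small-angle Gaussian core described above, the joint angle-displacement density reduces to the classical Fermi-Eyges Gaussian. A minimal sketch, assuming a constant scattering power T and projected (2D) quantities; the function name and numerical values are illustrative, not taken from the paper:

```python
import numpy as np

def fermi_eyges_cov(T, t):
    """Covariance of (theta, y) after depth t for constant scattering power T.
    Classical Fermi-Eyges moments: <theta^2> = T t, <theta y> = T t^2 / 2,
    <y^2> = T t^3 / 3."""
    return np.array([[T * t,          T * t**2 / 2.0],
                     [T * t**2 / 2.0, T * t**3 / 3.0]])

cov = fermi_eyges_cov(T=1e-4, t=1.0)
rng = np.random.default_rng(0)
samples = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
corr = np.corrcoef(samples.T)[0, 1]
# the moments above fix the angle-displacement correlation at sqrt(3)/2 ~ 0.866
```

The fixed correlation sqrt(3)/2 is what produces the tilted elliptical contours of the Gaussian core; the single- and double-scattering tails discussed in the abstract are precisely the departures from this Gaussian picture.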
Marshall, Bennett D; Chapman, Walter G
2013-08-07
We develop a new theory for associating fluids with multiple association sites. The theory accounts for small bond angle effects such as steric hindrance, ring formation, and double bonding. The theory is validated against Monte Carlo simulations for the case of a fluid of patchy colloid particles with three patches and is found to be very accurate. Once validated, the theory is applied to study the phase diagram of a fluid composed of three patch colloids. It is found that bond angle has a significant effect on the phase diagram and the very existence of a liquid-vapor transition.
Grabner, Peter
2017-01-01
This volume is dedicated to Robert F. Tichy on the occasion of his 60th birthday. Presenting 22 research and survey papers written by leading experts in their respective fields, it focuses on areas that align with Tichy’s research interests and which he significantly shaped, including Diophantine problems, asymptotic counting, uniform distribution and discrepancy of sequences (in theory and application), dynamical systems, prime numbers, and actuarial mathematics. Offering valuable insights into recent developments in these areas, the book will be of interest to researchers and graduate students engaged in number theory and its applications.
Direct simulation of groundwater transit-time distributions using the reservoir theory
Etcheverry, David; Perrochet, Pierre
Groundwater transit times are of interest for the management of water resources, assessment of pollution from non-point sources, and quantitative dating of groundwaters by the use of environmental isotopes. The age of water is the time water has spent in an aquifer since it has entered the system, whereas the transit time is the age of water as it exits the system. Water at the outlet of an aquifer is a mixture of water elements with different transit times, as a consequence of the different flow-line lengths. In this paper, transit-time distributions are calculated by coupling two existing methods, the reservoir theory and a recent age-simulation method. Based on the derivation of the cumulative age distribution over the whole domain, the approach accounts for the whole hydrogeological framework. The method is tested using an analytical example and its applicability illustrated for a regional layered aquifer. Results show the asymmetry and multimodality of the transit-time distribution even in advection-only conditions, due to the aquifer geometry and to the velocity-field heterogeneity.
Casault Sébastien
2016-05-01
Oil and gas exploration and production firms have return profiles that are not easily explained by current financial theory: the variation in their market returns is non-Gaussian. In this paper, the nature and underlying reason for these significant deviations from expected behavior are considered. Understanding these differences in financial market behavior is important for a wide range of reasons, including assessing investments, investor relations, decisions to raise capital, and assessment of firm and management performance. We show that a "thicker-tailed" mixture of two normal distributions offers a significantly more accurate model than the traditional Gaussian approach in describing the behavior of the value of oil and gas firms. This mixture of normal distributions is also more effective in bridging the gap between management theory and practice without the need to introduce complex time-sensitive GARCH and/or jump diffusion dynamics. The mixture distribution is consistent with ambidexterity theory, which suggests firms operate in two distinct states driven by the primary focus of the firm: an exploration state with high uncertainty and an exploitation (or production) state with lower uncertainty. The findings have direct implications for improving the accuracy of real option pricing techniques and futures analysis of risk management. Traditional options pricing models assume that commercial returns from these assets are described by a normal random walk. However, a normal random walk model discounts the possibility of large changes to the marketplace from events such as the discovery of important reserves or the introduction of new technology. The mixture distribution proves to be well suited to inherently describe the unusually large risks and opportunities associated with oil and gas production and exploration. A significance testing study of 554 oil and gas exploration and production firms empirically supports using a mixture
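A two-state ("exploration"/"exploitation") return model of the kind described can be fitted with plain EM for a two-component normal mixture. A sketch on synthetic data; all parameter values here are hypothetical illustrations, not estimates from the paper's 554-firm sample:

```python
import numpy as np

def norm_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def fit_two_normal_mixture(x, iters=300):
    """Plain EM for a two-component univariate normal mixture (illustrative)."""
    w = 0.5
    mu = np.array([x.mean(), x.mean()])
    sd = np.array([0.5 * x.std(), 1.5 * x.std()])  # asymmetric start separates the scales
    for _ in range(iters):
        # E-step: responsibility of the low-volatility component
        p0 = w * norm_pdf(x, mu[0], sd[0])
        p1 = (1.0 - w) * norm_pdf(x, mu[1], sd[1])
        r = p0 / (p0 + p1)
        # M-step: weighted moment updates
        w = r.mean()
        mu = np.array([(r * x).sum() / r.sum(), ((1 - r) * x).sum() / (1 - r).sum()])
        sd = np.sqrt(np.array([(r * (x - mu[0]) ** 2).sum() / r.sum(),
                               ((1 - r) * (x - mu[1]) ** 2).sum() / (1 - r).sum()]))
    return w, mu, sd

# synthetic returns: 70% quiet "exploitation" state, 30% volatile "exploration" state
rng = np.random.default_rng(42)
x = np.concatenate([rng.normal(0.0, 0.01, 35_000), rng.normal(0.0, 0.05, 15_000)])
w, mu, sd = fit_two_normal_mixture(x)
```

Even with identical means, the two variance scales are enough for EM to recover the state weights, which is the mechanism that generates the thick tails relative to a single Gaussian.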
Okie, Jordan G.; Van Horn, David J.; Storch, David; Barrett, John E.; Gooseff, Michael N.; Kopsova, Lenka; Takacs-Vesbach, Cristina D.
2015-01-01
The causes of biodiversity patterns are controversial and elusive due to complex environmental variation, covarying changes in communities, and lack of baseline and null theories to differentiate straightforward causes from more complex mechanisms. To address these limitations, we developed general diversity theory integrating metabolic principles with niche-based community assembly. We evaluated this theory by investigating patterns in the diversity and distribution of soil bacteria taxa across four orders of magnitude variation in spatial scale on an Antarctic mountainside in low complexity, highly oligotrophic soils. Our theory predicts that lower temperatures should reduce taxon niche widths along environmental gradients due to decreasing growth rates, and the changing niche widths should lead to contrasting α- and β-diversity patterns. In accord with the predictions, α-diversity, niche widths and occupancies decreased while β-diversity increased with increasing elevation and decreasing temperature. The theory also successfully predicts a hump-shaped relationship between α-diversity and pH and a negative relationship between α-diversity and salinity. Thus, a few simple principles explained systematic microbial diversity variation along multiple gradients. Such general theory can be used to disentangle baseline effects from more complex effects of temperature and other variables on biodiversity patterns in a variety of ecosystems and organisms. PMID:26019154
1980-09-29
Foundations of Eigenvalue Distribution Theory for General & Nonnegative Matrices, Stability Criteria for Hyperbolic … (M. Marcus, M. Goldberg, M. Newman; AFOSR-79-0127; unclassified report dated September 1980)
Burns Tom
2014-08-01
This article presents a relatively straightforward theoretical framework about distributive justice, with applications. It draws on a few key concepts of Sociological Game Theory (SGT), which is presented briefly in Section 2. Section 3 provides a spectrum of distributive cases concerning principles of equality, differentiation among recipients according to performance or contribution, status or authority, or need. Two general types of social organization of distributive judgment are distinguished, and judgment procedures or algorithms are modeled in each type of social organization. Section 4 briefly discusses the larger moral landscapes of human judgment, that is, how distribution is typically combined with other values. The article suggests that Rawls, Elster, and Machado point in this direction. Finally, it is suggested that the SGT framework presented provides a useful point of departure for systematically linking and comparing the Warsaw School of Fair Division, Rawls, and Elster, among others.
Pimpinelli, Alberto; Einstein, T. L.; González, Diego Luis; Sathiyanarayanan, Rajesh; Hamouda, Ajmi Bh.
2011-03-01
Earlier we showed [PRL 99, 226102 (2007)] that the capture-zone distribution (CZD) in growth could be well described by P(s) = a s^β exp(−b s²), where s is the CZ area divided by its average value. Painstaking simulations by Amar's [PRE 79, 011602 (2009)] and Evans's [PRL 104, 149601 (2010)] groups showed inadequacies in our mean-field Fokker-Planck argument relating β to the critical nucleus size. We refine our derivation to retrieve their β ≈ i + 2 [PRL 104, 149602 (2010)]. We discuss applications of this formula and methodology to experiments on Ge/Si(001) and on various organics on SiO₂, as well as to kinetic Monte Carlo studies of homoepitaxial growth on Cu(100) with codeposited impurities of different sorts. In contrast to theory, there can be significant changes to β with coverage. Some experiments also show temperature dependence. Supported by NSF-MRSEC at UMD, Grant DMR 05-20471.
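For reference, the quoted form P(s) = a s^β exp(−b s²) has both constants fixed once one imposes normalization and the unit-mean convention ⟨s⟩ = 1 (the standard generalized-Wigner convention): b = [Γ((β+2)/2)/Γ((β+1)/2)]² and a = 2 b^((β+1)/2)/Γ((β+1)/2). A numerical check, with the β value illustrative:

```python
import numpy as np
from scipy.special import gamma

def wigner_czd(beta):
    """Generalized Wigner form P(s) = a s^beta exp(-b s^2), with a and b fixed
    by normalization and the unit-mean convention <s> = 1."""
    b = (gamma((beta + 2.0) / 2.0) / gamma((beta + 1.0) / 2.0)) ** 2
    a = 2.0 * b ** ((beta + 1.0) / 2.0) / gamma((beta + 1.0) / 2.0)
    return lambda s: a * s ** beta * np.exp(-b * s ** 2)

P = wigner_czd(beta=4.0)   # e.g. beta ~ i + 2 with critical nucleus size i = 2
s = np.linspace(0.0, 8.0, 400_001)
ds = s[1] - s[0]
norm = float(np.sum(P(s)) * ds)   # should be ~1
mean = float(np.sum(s * P(s)) * ds)  # should be ~1
```

With both constraints imposed, β alone controls the shape, which is why fitting β to simulated or measured CZDs can be read back as information about the critical nucleus size i.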
Bibinger, Markus
2011-01-01
The article is devoted to the nonparametric estimation of the quadratic covariation of non-synchronously observed Itô processes in an additive microstructure noise model. In a high-frequency setting, we aim at establishing an asymptotic distribution theory for a generalized multiscale estimator, including a feasible central limit theorem with optimal convergence rate under convenient regularity assumptions. The inevitably remaining impact of asynchronous deterministic sampling schemes and noise corruption on the asymptotic distribution is precisely elucidated. A case study of various important examples, several generalizations of the model, and an algorithm for the implementation warrant the utility of the estimation method in applications.
Akemann, G; Bloch, J; Shifrin, L; Wettig, T
2008-01-25
We analyze how individual eigenvalues of the QCD Dirac operator at nonzero quark chemical potential are distributed in the complex plane. Exact and approximate analytical results for both quenched and unquenched distributions are derived from non-Hermitian random matrix theory. When comparing these to quenched lattice QCD spectra close to the origin, excellent agreement is found for zero and nonzero topology at several values of the quark chemical potential. Our analytical results are also applicable to other physical systems in the same symmetry class.
Zhe Zhang
2014-01-01
In order to solve the problems of existing wide-area backup protection (WABP) algorithms, the paper proposes a novel WABP algorithm based on the distribution characteristics of fault component current and improved Dempster/Shafer (D-S) evidence theory. When a fault occurs, slave substations transmit to the master station the amplitudes of the fault component currents of the transmission lines closest to the fault element. The master station then identifies suspicious faulty lines according to the distribution characteristics of fault component current, and finally identifies the actual faulty line with improved D-S evidence theory, based on the action states of traditional protections and the direction components of these suspicious faulty lines. Simulation examples based on the IEEE 10-generator 39-bus system show that the proposed WABP algorithm performs excellently, with a low requirement for sampling synchronization, small wide-area communication flow, and high fault tolerance.
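The evidence-fusion step rests on Dempster's rule of combination. A minimal sketch of the classical rule (not the paper's improved variant); the mass assignments over suspect lines are hypothetical:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: fuse two mass functions whose focal elements are
    frozensets; conflicting mass is discarded and the rest renormalized."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict; sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# hypothetical evidence over two suspect lines L1, L2:
# one source from protection action states, one from directional components
m_prot = {frozenset({"L1"}): 0.6, frozenset({"L1", "L2"}): 0.4}
m_dir  = {frozenset({"L1"}): 0.7, frozenset({"L2"}): 0.2, frozenset({"L1", "L2"}): 0.1}
fused = dempster_combine(m_prot, m_dir)
# mass concentrates on L1, which would be declared the faulty line
```

Assigning some mass to the whole frame {L1, L2} is how ignorance (e.g. a protection device whose state is unknown) is represented, which is what gives D-S fusion its fault tolerance relative to a simple vote.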
2016-06-02
Retrieval of droplet-size density distribution from multiple-field-of-view cross-polarized lidar signals: theory and experimental validation
Gilles Roy, Luc Bissonnette, Christian Bastille, and Gilles Vallee
Multiple-field-of-view (MFOV) secondary-polarization lidar signals are used to … is derived to facilitate use of secondary polarization. The model is supported by experimental MFOV lidar measurements carried out in a controlled …
Om Prakash Mishra
2013-12-01
The distribution process of a supply chain is important and strategic. To stay efficient in delivering finished goods into the hands of the end user, the speed and responsiveness of the distribution mechanism are essential. Just in Time (JIT) has proved result-oriented in manufacturing systems since 1985, and its applicability in several other fields is being investigated. Low inventory, small lot size, short lead time, and quality product are desirable features of JIT that benefit the firm. This paper highlights a JIT-applied distribution process intended to shorten lead times and inventory holding costs. Graph theory (GTA) methodology is used to investigate the interdependencies of the dimensions and attributes of JIT-implemented distribution. An empirical value of the JIT distribution process (JDP) of an organisation is then derived; subsequently, coefficients of distribution dissimilarity and similarity are found for two organisations, and based on these values any two organisations can be compared. The approach can be used to identify the shortcomings of organisations based on their feasibility index of transition (FIT) value, indicating what must be developed to transform an organisation applying JIT to its distribution process.
Aitken, R J; Elton, R A
1984-11-01
The value of Poisson distribution theory in describing and predicting the nature of sperm-egg interaction in vitro has been investigated using an interspecific in-vitro fertilization system, incorporating zona-free hamster oocytes and human spermatozoa. The frequency distribution of polyspermic oocyte penetrations in 72 experiments exhibited good agreement with the Poisson distribution at all levels of fertilization indicating that each oocyte must be of equal penetrability and that there can be no block to polyspermy in this interspecific system. Poisson distribution theory also accurately described the relationship between oocyte penetration and sperm motility in 50 out of 54 separate experiments spread across 10 serial dilution curves. For each dilution series the shape of the fitted curve was fixed but its location along the x-axis varied from donor to donor. The fixed nature of the relationship between sperm motility and egg penetration enables the results of such in-vitro fertilization experiments to be corrected for the number of motile spermatozoa in the incubation media. On the basis of these findings a protocol is described for assessing the results of the zona-free hamster oocyte penetration assay, which involves analysis of the degree of polyspermy followed by the application of Poisson distribution theory to correct the results to a standard concentration of motile spermatozoa. Changes in the penetrating ability of human spermatozoa after vasectomy and characterization of the degree of inter-ejaculate variation in penetrating potential are two clinical examples of such analyses given in the text. The statistical methods described in this paper should also be of general relevance to the study of fertilization mechanisms, in providing a rationale by which to analyse the quantitative nature of sperm-egg interaction in vitro.
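The quantities used in this protocol follow directly from the Poisson pmf: the unpenetrated fraction estimates P(0) = e^(−λ), so λ can be read off as −ln(unpenetrated fraction), and the polyspermy rate among penetrated oocytes is then fixed. A small sketch, with the λ value illustrative:

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def penetration_summary(lam):
    """Under the Poisson model with mean lam penetrations per oocyte:
    fraction of oocytes penetrated = 1 - P(0);
    fraction of penetrated oocytes that are polyspermic = (1 - P(0) - P(1)) / (1 - P(0))."""
    p0 = poisson_pmf(0, lam)
    p1 = poisson_pmf(1, lam)
    penetrated = 1.0 - p0
    polyspermy = (1.0 - p0 - p1) / penetrated
    return penetrated, polyspermy

pen, poly = penetration_summary(1.0)
# lam can be recovered from an observed unpenetrated fraction f0 as -ln(f0)
lam_hat = -math.log(1.0 - pen)
```

Comparing the observed polyspermy frequencies with these predictions is exactly the equal-penetrability / no-block-to-polyspermy test described in the abstract.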
Erdem Cam
2006-12-01
In this article, the salaries of "career occupations" covered by Act No. 657 in Turkey and the wages of labor covered by Act No. 4857 (the Labor Law) are compared to illustrate income distribution inequality in Turkey. The study consists of three sections. The first section explains the importance of wage policy within income distribution policy. The second section treats wage formation and the relations between education and employment within the framework of the assumptions of human capital theory. In the last section, the wages of labor under Act No. 4857 and the salaries of career occupations under Act No. 657 are compared in a state economic enterprise active in the energy sector in Turkey.
SHEN Yan
2005-01-01
With developments in computer and network technology, research on distributed measurement systems has become one of the hot problems in the field of automatic test. However, existing approaches to distributed measurement systems still have great limitations, e.g. in intelligence, self-adaptivity, collaboration, system load balance and integrated view, and their capabilities need to be enhanced. Based on two key projects, this paper comprehensively and systematically studies the collaboration mechanism and the real-time communication platform in distributed measurement systems.
Control limitations from distributed sensing: theory and Extremely Large Telescope application
Sarlette, Alain; Sepulchre, Rodolphe
2013-01-01
We investigate performance bounds for feedback control of distributed plants where the controller can be centralized (i.e. it has access to measurements from the whole plant), but sensors only measure differences between neighboring subsystem outputs. Such "distributed sensing" can be a technological necessity in applications where system size exceeds accuracy requirements by many orders of magnitude. We formulate how distributed sensing generally limits feedback performance robust to measure...
Regnier, D; Schunck, N; Verriere, M
2016-01-01
Accurate knowledge of fission fragment yields is an essential ingredient of numerous applications ranging from the formation of elements in the r-process to fuel cycle optimization for nuclear energy. The need for a predictive theory applicable where no data is available is an incentive to develop a fully microscopic approach to fission dynamics. In this work, we calculate the pre-neutron emission charge and mass distributions of the fission fragments formed in the neutron-induced fission of 239Pu using a microscopic method based on nuclear energy density functional (EDF) method, where large amplitude collective motion is treated adiabatically using the time dependent generator coordinate method (TDGCM) under the Gaussian overlap approximation (GOA). Fission fragment distributions are extracted from the flux of the collective wave packet through the scission line. We find that the main characteristics of the fission charge and mass distributions can be well reproduced by existing energy functionals even in tw...
Larwin, Karen H.; Larwin, David A.
2011-01-01
Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
CLAUDIA FUENTES
2011-01-01
This article reviews the content, assumptions and implications of the principle of the social distribution of power within Montesquieu's general theory of the separation of powers. In challenging a tradition that has been devoted to the principle of the juridical distribution of executive, legislative and judicial functions, while at the same time denouncing the anachronism of the principle of social distribution, I first sustain that this principle is independent of the class model to which it is applied by Montesquieu. Secondly, I maintain that juridical distribution depends on social distribution in order to avoid the abuse of power and to safeguard public freedom. Finally, I hold that the principle of social distribution refers to the essentially political dimension of Montesquieu's theory of power.
Advances in numerical solutions to integral equations in liquid state theory
Howard, Jesse J.
Solvent effects play a vital role in the accurate description of the free energy profile for solution phase chemical and structural processes. The inclusion of solvent effects in any meaningful theoretical model however, has proven to be a formidable task. Generally, methods involving Poisson-Boltzmann (PB) theory and molecular dynamic (MD) simulations are used, but they either fail to accurately describe the solvent effects or require an exhaustive computation effort to overcome sampling problems. An alternative to these methods are the integral equations (IEs) of liquid state theory which have become more widely applicable due to recent advancements in the theory of interaction site fluids and the numerical methods to solve the equations. In this work a new numerical method is developed based on a Newton-type scheme coupled with Picard/MDIIS routines. To extend the range of these numerical methods to large-scale data systems, the size of the Jacobian is reduced using basis functions, and the Newton steps are calculated using a GMRes solver. The method is then applied to calculate solutions to the 3D reference interaction site model (RISM) IEs of statistical mechanics, which are derived from first principles, for a solute model of a pair of parallel graphene plates at various separations in pure water. The 3D IEs are then extended to electrostatic models using an exact treatment of the long-range Coulomb interactions for negatively charged walls and DNA duplexes in aqueous electrolyte solutions to calculate the density profiles and solution thermodynamics. It is found that the 3D-IEs provide a qualitative description of the density distributions of the solvent species when compared to MD results, but at a much reduced computational effort in comparison to MD simulations. The thermodynamics of the solvated systems are also qualitatively reproduced by the IE results. The findings of this work show the IEs to be a valuable tool for the study and prediction of
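The Picard/MDIIS machinery mentioned above can be illustrated in miniature with a damped Picard iteration on a toy fixed point; this is a stand-in for an integral-equation closure, not the paper's Newton-GMRES/MDIIS solver:

```python
import numpy as np

def damped_picard(f, x0, alpha=0.3, tol=1e-10, max_iter=500):
    """Damped Picard iteration x <- (1 - alpha) x + alpha f(x): the simplest
    member of the Picard/MDIIS family used for RISM-type fixed points.
    Returns the converged vector and the iteration count."""
    x = np.asarray(x0, dtype=float)
    for i in range(max_iter):
        fx = f(x)
        if np.max(np.abs(fx - x)) < tol:   # residual-based stopping test
            return x, i
        x = (1.0 - alpha) * x + alpha * fx
    raise RuntimeError("no convergence within max_iter")

# toy fixed point x = exp(-x) on a small grid (a stand-in for a closure relation)
sol, iters = damped_picard(lambda x: np.exp(-x), np.zeros(8))
```

MDIIS accelerates exactly this kind of iteration by extrapolating over a history of residuals, and the Newton/GMRes scheme of this work replaces the fixed damping with Jacobian information; both reduce the iteration count dramatically on stiff closures.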
Robert M. Solow
2000-05-01
The paper surveys the neoclassical theory of growth. As a preliminary, the meaning of the adjective "neoclassical" is discussed. The basic model is then sketched, and the conditions ensuring a stationary state are illustrated. The issue of convergence to a stationary state (and that of the speed of convergence) is further considered. A discussion of "primary factors" opens the way to the "new" theory of growth, with endogenous technical progress. A number of extensions of the basic model are then recalled: two-sector and multi-sectoral models, overlapping generations models, and the role of money in growth models.
Wohletz, K.H. (Earth and Space Science Division Los Alamos National Laboratory, New Mexico (USA)); Sheridan, M.F. (Department of Geology, Arizona State University, Tempe (USA)); Brown, W.K. (Math/Science Division, Lassen College, Susanville, California (USA))
1989-11-10
The assumption that distributions of mass versus size interval for fragmented materials fit the log normal distribution is empirically based and has historical roots in the late 19th century. Other often used distributions (e.g., Rosin-Rammler, Weibull) are also empirical and have the general form for mass per size interval: n(l) = k l^α exp(−lβ), where n(l) represents the number of particles of diameter l, l is the normalized particle diameter, and k, α, and β are constants. We describe and extend the sequential fragmentation distribution to include transport effects upon observed volcanic ash size distributions. The sequential fragmentation/transport (SFT) distribution is also of the above mathematical form, but it has a physical basis rather than empirical. The SFT model applies to a particle-mass distribution formed by a sequence of fragmentation (comminution) and transport (size sorting) events acting upon an initial mass m′: n(x, m) = C ∫∫ n(x′, m′) p(ξ) dx′ dm′, where x′ denotes spatial location along a linear axis, C is a constant, and integration is performed over distance from an origin to the sample location and mass limits from 0 to m.
The application of value distribution theory to a doubly anharmonic oscillator
Hu Juan [Department of Applied Mathematics, Zhejiang University of Technology, Hangzhou 310023 (China); Yu Guofu, E-mail: gfyu@sjtu.edu.cn [Department of Mathematics, Shanghai Jiao Tong University, Shanghai 200240 (China)
2011-07-22
The model of doubly anharmonic oscillators is first transformed into certain second-order ordinary differential equations with periodic coefficients. A class of exact solutions for eigenfunctions and eigenvalues is obtained from Bank and Laine's theory on periodic ordinary differential equations; the eigenfunctions are expressed as products of polynomial and exponential functions when the parameters satisfy some special relations.
Maximum Entropy Principle Based Estimation of Performance Distribution in Queueing Theory
He, Dayi; Li, Ran; Huang, Qi; Lei, Ping
2014-01-01
In related research on queuing systems, in order to determine the system state, there is a widespread practice to assume that the system is stable and that distributions of the customer arrival ratio and service ratio are known information. In this study, the queuing system is looked at as a black box without any assumptions on the distribution of the arrival and service ratios and only keeping the assumption on the stability of the queuing system. By applying the principle of maximum entropy, the performance distribution of queuing systems is derived from some easily accessible indexes, such as the capacity of the system, the mean number of customers in the system, and the mean utilization of the servers. Some special cases are modeled and their performance distributions are derived. Using the chi-square goodness of fit test, the accuracy and generality for practical purposes of the principle of maximum entropy approach is demonstrated. PMID:25207992
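When the only accessible index is the mean number in the system, the maximum-entropy distribution on n = 0, 1, 2, … is the geometric distribution, recovering the familiar M/M/1 queue-length form p(n) = (1 − ρ)ρ^n without any assumption on arrival or service distributions. A sketch, with the mean value illustrative:

```python
def maxent_queue_dist(mean_n, nmax=400):
    """Maximum-entropy pmf of the number in system n = 0, 1, 2, ... subject
    only to a mean constraint L: geometric p(n) = (1 - q) q^n, q = L / (1 + L).
    Truncated at nmax, which is far into the negligible tail here."""
    q = mean_n / (1.0 + mean_n)
    return [(1.0 - q) * q ** n for n in range(nmax + 1)]

p = maxent_queue_dist(mean_n=3.0)
total = sum(p)
mean = sum(n * pn for n, pn in enumerate(p))
```

Adding further easily accessible indexes (e.g. server utilization) adds Lagrange-multiplier constraints and deforms this geometric baseline, which is the mechanism the abstract's chi-square goodness-of-fit study validates.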
A new derivative with normal distribution kernel: Theory, methods and applications
Atangana, Abdon; Gómez-Aguilar, J. F.
2017-06-01
A new approach to the fractional derivative, with a new local kernel, is suggested in this paper. The kernel introduced in this work is the well-known normal distribution, a very common continuous probability distribution that is important in statistics and widely used in the natural and social sciences to describe real-valued random variables whose distributions are not known. Two definitions are suggested, namely Atangana-Gómez averaging in the Liouville-Caputo and Riemann-Liouville senses. We present some relationships with existing integral transform operators. Numerical approximations of first and second order are derived in detail. Some applications of the new mathematical tools to real-world problems are presented in detail. This opens a new door for the fields of statistics and the natural and social sciences.
Wohletz, K. H.; Sheridan, M. F.; Brown, W. K.
1989-11-01
The assumption that distributions of mass versus size interval for fragmented materials fit the log normal distribution is empirically based and has historical roots in the late 19th century. Other often used distributions (e.g., Rosin-Rammler, Weibull) are also empirical and have the general form for mass per size interval: n(l) = k l^α exp(−l^β), where n(l) represents the number of particles of diameter l, l is the normalized particle diameter, and k, α, and β are constants. We describe and extend the sequential fragmentation distribution to include transport effects upon observed volcanic ash size distributions. The sequential fragmentation/transport (SFT) distribution is also of the above mathematical form, but it has a physical basis rather than an empirical one. The SFT model applies to a particle-mass distribution formed by a sequence of fragmentation (comminution) and transport (size sorting) events acting upon an initial mass m': n(x, m) = C ∫∫ n(x', m')p(ξ)dx' dm', where x' denotes spatial location along a linear axis, C is a constant, and integration is performed over distance from an origin to the sample location and mass limits from 0 to m. We show that the probability function that models the production of particles of different size from an initial mass and sorts that distribution, p(ξ), is related to m^g, where g (noted as γ for fragmentation processes) is a free parameter that determines the location, breadth, and skewness of the distribution; g(γ) must be greater than -1, and it increases from that value as the distribution matures with greater number of sequential steps in the fragmentation or transport process; γ is expected to be near -1 for "sudden" fragmentation mechanisms such as single-event explosions and transport mechanisms that are functionally dependent upon particle mass. This free parameter will be more positive for evolved fragmentation mechanisms such as ball milling and complex transport processes such as saltation. The SFT
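The quoted form n(l) = k·l^γ·exp(−l^β) is easy to explore numerically. The sketch below (grid, γ values, and β = 1 are illustrative assumptions) shows how raising the free parameter γ from near −1 toward positive values shifts the distribution to larger sizes, matching the "sudden" versus "evolved" interpretation above:

```python
import math

def sft_density(l, gamma, beta=1.0, k=1.0):
    """Sequential fragmentation/transport form
    n(l) = k * l**gamma * exp(-l**beta) for normalized particle size
    l > 0; gamma > -1 sets location, breadth and skewness."""
    return k * l ** gamma * math.exp(-(l ** beta))

def discretized(gamma, beta=1.0, nbins=2000, lmax=50.0):
    """Midpoint-rule discretization of n(l) on (0, lmax], normalized
    to unit total mass."""
    h = lmax / nbins
    vals = [sft_density((i + 0.5) * h, gamma, beta) for i in range(nbins)]
    z = sum(vals)
    return [v / z for v in vals]

def mean_size(p, nbins=2000, lmax=50.0):
    h = lmax / nbins
    return sum(((i + 0.5) * h) * q for i, q in enumerate(p))

p_young = discretized(gamma=-0.9)   # "sudden" fragmentation, gamma near -1
p_mature = discretized(gamma=2.0)   # evolved process, e.g. ball milling
m_young, m_mature = mean_size(p_young), mean_size(p_mature)
```

With β = 1 this is a gamma-type density; the Weibull family is recovered when γ = β − 1.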
Borchardt, Sarah
2014-01-01
We describe the intensity distribution of the parhelic circle for plate-oriented hexagonal ice crystals at very low solar elevations using geometrical optics. An experimental as well as theoretical study of in-plane ray paths provides details on the mechanism for several halos, including the parhelia, the 120° parhelia, the blue spot and the Liljequist parhelia. Azimuthal coordinates for associated characteristic features in the intensity distribution are compared to data obtained using a rotating hexagonal glass prism.
Ahmad, Sabihi
2016-01-01
We solve some famous conjectures on the distribution of primes. These conjectures are Legendre's, Andrica's, Oppermann's, Brocard's, Cramér's, Shanks', and five of Smarandache's conjectures. We make use of both Firoozbakht's conjecture (recently proved by the author) and Kourbatov's theorem on the distribution of, and gaps between, consecutive primes. The latter conjecture and theorem play an essential role in our methods for proving these famous conjectures. In order t...
Typical versus averaged overlap distribution in spin glasses: Evidence for droplet scaling theory
Monthus, Cécile; Garel, Thomas
2013-10-01
We consider the statistical properties over disordered samples (J) of the overlap distribution P_J(q), which plays the role of an order parameter in spin glasses. We show that near zero temperature (i) the typical overlap distribution is exponentially small in the central region −1 < q < 1, P_J^typ(q) ∼ e^(−N^θ ϕ(q)), where θ is the droplet exponent and ϕ(q) a positive function (except in mean-field models, in which the notion of length does not exist); (ii) the rescaled variable v = −[ln P_J(q)]/N^θ remains an O(1) random positive variable describing sample-to-sample fluctuations; (iii) the disorder-averaged distribution is nontypical and dominated by rare anomalous samples. Similar statements hold for the cumulative overlap distribution I_J(q₀) ≡ ∫₀^{q₀} dq P_J(q). These results are derived explicitly for the spherical mean-field model with θ = 1/3 and ϕ(q) = 1 − q², where the random variable v corresponds to the rescaled difference between the two largest eigenvalues of Gaussian orthogonal ensemble random matrices. We then compare numerically the typical and averaged overlap distributions for the long-ranged one-dimensional Ising spin glass with random couplings decaying as J(r) ∝ r^(−σ) for various values of the exponent σ, corresponding to various droplet exponents θ(σ), and for the mean-field Sherrington-Kirkpatrick model (corresponding formally to the σ = 0 limit of the previous model). Our conclusion is that future studies of spin glasses should measure the typical values of the overlap distribution P^typ(q), or of the cumulative overlap distribution I^typ(q₀) defined through the disorder average of ln I_J(q₀), to obtain clearer conclusions on the nature of the spin-glass phase.
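The typical-versus-averaged contrast is easy to illustrate with a toy model in which P_J(q) = e^(−N^θ v) and v is an O(1) positive random variable (exponential here, an illustrative assumption): the disorder average is dominated by rare samples with small v and vastly exceeds the typical value exp(mean of the log):

```python
import math
import random

random.seed(0)

def overlap_statistics(N, theta=1.0 / 3.0, samples=20000):
    """Toy model: draw P_J = exp(-N**theta * v) over many disordered
    samples J, with v ~ Exp(1).  Return (typical, averaged), where
    typical = exp(mean of log P_J) and averaged = mean of P_J.  Rare
    samples with anomalously small v dominate the average."""
    scale = N ** theta
    values = [math.exp(-scale * random.expovariate(1.0)) for _ in range(samples)]
    averaged = sum(values) / samples
    typical = math.exp(sum(math.log(x) for x in values) / samples)
    return typical, averaged

typ, avg = overlap_statistics(N=1000)
```

Here scale = N^(1/3) = 10, so the typical value is of order e^(−10) ≈ 5·10⁻⁵ while the exact average E[e^(−10v)] = 1/11 ≈ 0.09: orders of magnitude apart, exactly the nontypicality the abstract describes.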
Schroeder, J. W. R., E-mail: james-schroeder@uiowa.edu; Skiff, F.; Howes, G. G.; Kletzing, C. A. [Department of Physics and Astronomy, University of Iowa, Iowa City, Iowa 52242 (United States); Carter, T. A.; Dorfman, S. [Department of Physics and Astronomy, University of California, Los Angeles, California 90095-1547 (United States)
2015-12-10
Wave propagation can be an accurate method for determining material properties. High frequency whistler mode waves (0.7 < ω/|Ω_ce| < 1) in an overdense plasma (ω_pe > |Ω_ce|) are damped primarily by Doppler-shifted electron cyclotron resonance. A kinetic description of whistler mode propagation parallel to the background magnetic field shows that damping is proportional to the parallel electron distribution function. This property enables an experimental determination of the parallel electron distribution function using a measurement of whistler mode wave absorption. The whistler mode wave absorption diagnostic uses this technique on UCLA's Large Plasma Device (LaPD) to measure the distribution of high energy electrons (5-10 v_te) with 0.1% precision. The accuracy is limited by systematic effects that need to be considered carefully. Ongoing research uses this diagnostic to investigate the effect of inertial Alfvén waves on the electron distribution function. Results presented here verify experimentally the linear effects of inertial Alfvén waves on the reduced electron distribution function, a necessary step before nonlinear physics can be tested. Ongoing experiments with the whistler mode wave absorption diagnostic are making progress toward the first direct detection of electrons nonlinearly accelerated by inertial Alfvén waves, a process believed to play an important role in auroral generation.
A theory of the cancer age-specific incidence data based on extreme value distributions
Soto-Ortiz, Luis; Brody, James P.
2012-03-01
The incidence of cancer varies with age; when normalized, this variation is called the age-specific incidence. A mathematical model that describes it should provide a better understanding of how cancers develop. We suggest that the age-specific incidence should follow an extreme value distribution, based on three widely accepted assumptions: (1) a tumor develops from a single cell, (2) many potential tumor progenitor cells exist in a tissue, and (3) cancer is diagnosed when the first of these many potential tumor cells develops into a tumor. We tested this by comparing the predicted distribution to the age-specific incidence data for colon and prostate carcinomas collected by the Surveillance, Epidemiology and End Results network of 17 cancer registries. We found that colon carcinoma age-specific incidence data are consistent with an extreme value distribution, while prostate carcinoma age-specific incidence data generally follow the distribution. This model indicates that both colon and prostate carcinomas occur only in a subset of the population (22% for prostate and 13.5% for colon). Because of their very general nature, extreme value distributions might be applicable to understanding other chronic human diseases.
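The "first of many potential tumor cells" argument can be sketched by simulation: the minimum of many i.i.d. onset times is an extreme value statistic (a Weibull here, since the minimum of Weibulls is again Weibull), concentrated far to the left of the single-cell law. All parameter values below are arbitrary illustrations, not fitted to the SEER data:

```python
import random

random.seed(1)

def first_tumor_ages(n_people, cells=1000, cell_scale=4000.0, shape=3.0):
    """Toy model of assumptions (1)-(3): each of many potential tumor
    progenitor cells transforms at an independent random age, and
    diagnosis occurs at the FIRST transformation, so the observed age
    is a minimum over many cells -- an extreme value statistic."""
    return [min(random.weibullvariate(cell_scale, shape) for _ in range(cells))
            for _ in range(n_people)]

ages = first_tumor_ages(120)
```

With 1000 cells per person, the minimum has scale cell_scale/1000^(1/3) = 400, so diagnosis ages cluster around 360 even though each individual cell's mean onset time is near 3600: the population-level curve is set by the extreme, not the mean.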
Short time kernel asymptotics for Young SDE by means of Watanabe distribution theory
Inahama, Yuzuru
2011-01-01
In this paper we study short time asymptotics of a density function of the solution of a stochastic differential equation driven by fractional Brownian motion with Hurst parameter H ∈ (1/2, 1) when the coefficient vector fields satisfy an ellipticity condition at the starting point. We prove both on-diagonal and off-diagonal asymptotics under mild additional assumptions. Our main tool is Malliavin calculus, in particular, Watanabe's theory of generalized Wiener functionals.
Isar, Aurelian
1995-01-01
The harmonic oscillator with dissipation is studied within the framework of the Lindblad theory for open quantum systems. By using the Wang-Uhlenbeck method, the Fokker-Planck equation, obtained from the master equation for the density operator, is solved for the Wigner distribution function, subject to either the Gaussian type or the delta-function type of initial conditions. The obtained Wigner functions are two-dimensional Gaussians with different widths. Then a closed expression for the density operator is extracted. The entropy of the system is subsequently calculated and its temporal behavior shows that this quantity relaxes to its equilibrium value.
Egedal, J; Daughton, W; Wetherton, B; Cassak, P A; Chen, L -J; Lavraud, B; Trobert, R B; Dorelli, J; Gershman, D J; Avanov, L A
2016-01-01
Supported by a kinetic simulation, we derive an exclusion energy parameter ℰ_X providing a lower kinetic energy bound for an electron to cross from one inflow region to the other during magnetic reconnection. As with a Maxwell demon, only high-energy electrons are permitted to cross the inner reconnection region, setting the electron distribution function observed along the low-density-side separatrix during asymmetric reconnection. The analytic model accounts for the two distinct flavors of crescent-shaped electron distributions observed by spacecraft in a thin boundary layer along the low-density separatrix.
The twist-3 parton distribution function e(x) in large-Nc chiral theory
Cebulla, C; Schweitzer, P; Urbano, D
2007-01-01
The chirally-odd twist-3 parton distribution function e(x) of the nucleon is studied in the large-Nc limit in the framework of the chiral quark-soliton model. It is demonstrated that in spite of properties not shared by other distribution functions, namely the appearance of a delta(x)-singularity and quadratic divergences in e(x), an equally reliable calculation is possible. Among the most remarkable results obtained in this work is the fact that the coefficient of the delta(x)-singularity can be computed exactly in this model, avoiding involved numerics. Our results complete existing studies in literature.
THE LEBESGUE-STIELTJES INTEGRAL AS APPLIED IN PROBABILITY DISTRIBUTION THEORY
Bounded variation and Borel measurable functions are set forth in the introduction. Chapter 2 is concerned with establishing a one-to-one correspondence between Lebesgue-Stieltjes measures and certain equivalence classes of functions which are monotone nondecreasing and continuous on the right. In Chapter 3 the Lebesgue-Stieltjes integral is defined and some of its properties are demonstrated. In Chapter 4 the probability distribution function is defined, and the notions of Chapters 2 and 3 are used to show that the Lebesgue-Stieltjes integral of any probability distribution
Risk Assessment of Distribution Network Based on Random set Theory and Sensitivity Analysis
Zhang, Sh; Bai, C. X.; Liang, J.; Jiao, L.; Hou, Z.; Liu, B. Zh
2017-05-01
Considering the complexity and uncertainty of operating information in distribution networks, this paper introduces random set theory for risk assessment. The proposed method is based on operating conditions defined in the random set framework, from which the upper and lower cumulative probability functions of the risk indices are obtained. Moreover, the sensitivity of the risk indices effectively reflects information about system reliability and operating conditions, and with this information the bottlenecks that suppress system reliability can be found. The analysis of a typical radial distribution network shows that the proposed method is reasonable and effective.
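Upper and lower cumulative probability functions of a risk index can be sketched with a Dempster-Shafer-style computation over interval-valued focal elements. The focal elements below are hypothetical, and the paper's actual random-set construction may differ:

```python
def cumulative_bounds(focal_elements, thresholds):
    """Lower/upper cumulative probability of a risk index described by
    a random set, i.e. a list of ((lo, hi), mass) focal elements.  For
    each threshold t, belief sums the masses of intervals entirely at
    or below t (lower CDF), while plausibility sums the masses of
    intervals that can reach t (upper CDF)."""
    lower, upper = [], []
    for t in thresholds:
        lower.append(sum(m for (lo, hi), m in focal_elements if hi <= t))
        upper.append(sum(m for (lo, hi), m in focal_elements if lo <= t))
    return lower, upper

# hypothetical focal elements for a risk index; masses sum to 1
focal = [((0.0, 0.2), 0.25), ((0.1, 0.5), 0.5), ((0.4, 0.9), 0.25)]
low, up = cumulative_bounds(focal, thresholds=[0.2, 0.5, 0.9])
```

The gap between the two curves quantifies the epistemic uncertainty carried by the interval-valued operating information; a point-valued distribution would collapse them onto a single CDF.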
Theory of Distribution Estimation of Hyperparameters in Markov Random Field Models
Sakamoto, Hirotaka; Nakanishi-Ohno, Yoshinori; Okada, Masato
2016-06-01
We investigated the performance of distribution estimation of hyperparameters in Markov random field models proposed by Nakanishi-Ohno et al. [J. Phys. A 47, 045001 (2014); http://doi.org/10.1088/1751-8113/47/4/045001] when used to evaluate the confidence of data. We analytically calculated the configurational average, with respect to data, of the negative logarithm of the posterior distribution, which is called free energy based on an analogy with statistical mechanics. This configurational average of free energy shrinks as the amount of data increases. Our results theoretically confirm the numerical results from that previous study.
Rhiel, G Steven
2010-02-01
In 2007, Rhiel presented a technique to estimate the coefficient of variation from the range when sampling from skewed distributions. To provide an unbiased estimate, a correction factor a(n) for the mean was included, and numerical correction factors for a number of skewed distributions were provided. In a follow-up paper, he offered a proof claiming to show that the correction factor is independent of the mean and standard deviation, which would make the factors useful as these parameters vary; however, that proof did not establish independence. Herein is a proof that establishes the independence.
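The idea of a range-based estimate with a correction factor can be sketched for the normal case (the paper's skewed-distribution factors a(n) are not reproduced here): Monte Carlo gives the analogous factor d_n = E[range]/σ, and the coefficient of variation then follows as (range/d_n)/mean:

```python
import random
import statistics

random.seed(2)

def range_correction_factor(n, reps=20000):
    """Monte Carlo estimate of d_n = E[range] / sigma for samples of
    size n from a standard normal.  Dividing an observed range by d_n
    gives an approximately unbiased estimate of sigma."""
    total = 0.0
    for _ in range(reps):
        xs = [random.gauss(0.0, 1.0) for _ in range(n)]
        total += max(xs) - min(xs)
    return total / reps

def cv_from_range(sample, d):
    """Range-based coefficient of variation: (range / d) / mean."""
    return ((max(sample) - min(sample)) / d) / statistics.mean(sample)

d5 = range_correction_factor(5)   # the tabulated value for n = 5 is about 2.33
cv = cv_from_range([9.0, 10.0, 11.0, 10.0, 10.0], d5)
```

The independence question the abstract addresses is exactly what makes this usable: if d_n (or a(n)) depended on the unknown mean or standard deviation, tabulated factors could not be reused across populations.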
Maruyama, Tomoyuki; Cheoun, Myung-Ki; Kajino, Toshitaka; Mathews, Grant J.
2016-06-01
We study pion production by proton synchrotron radiation in the presence of a strong magnetic field when the Landau numbers of the initial and final protons are n_{i,f} ∼ 10⁴-10⁵. We find in our relativistic field theory calculations that the pion decay width depends only on the field-strength parameter, which previously was only conjectured based upon semi-classical arguments. Moreover, we find the new results that the decay width satisfies a robust scaling relation, and that the polar angular distribution of emitted pion momenta is very narrow and can be easily obtained. This scaling implies that one can infer the decay width in more realistic magnetic fields of 10¹⁵ G, where n_{i,f} ∼ 10¹²-10¹³, from the results for n_{i,f} ∼ 10⁴-10⁵. The resultant pion intensity and angular distributions for realistic magnetic field strengths are presented and their physical implications discussed.
Sumitomo, Yoske; Wong, Sam S C
2013-01-01
We study a racetrack model in the presence of the leading α'-correction in flux compactification in Type IIB string theory, for the purpose of getting conceivable de Sitter vacua in the large compactified volume approximation. Unlike the Kähler Uplift model studied previously, the α'-correction is more controllable for the meta-stable de Sitter vacua in the racetrack case since the constraint on the compactified volume size is very much relaxed. We find that the vacuum energy density Λ for de Sitter vacua approaches zero exponentially as the volume grows. We also analyze properties of the probability distribution of Λ in this class of models. As in other cases studied earlier, the probability distribution again peaks sharply at Λ = 0. We also study the Racetrack Kähler Uplift model in the Swiss-Cheese type model.
Noguchi, Yoshifumi [Department of Physics, Graduate School of Engineering, Yokohama National University, 79-5 Tokiwadai, Hodogaya-ku, Yokohama 240-8501 (Japan); Computational Materials Science Center, National Institute for Materials Science, 1-2-1 Sengen, Tsukuba, Ibaraki 305-0047 (Japan)], E-mail: NOGUCHI.Yoshifumi@nims.go.jp; Ishii, Soh; Ohno, Kaoru [Department of Physics, Graduate School of Engineering, Yokohama National University, 79-5 Tokiwadai, Hodogaya-ku, Yokohama 240-8501 (Japan)
2007-05-15
Short-range electron correlation plays a very important role in small systems and significantly affects the double ionization energy (DIE) spectra and the two-electron distribution functions of a CO molecule, for example. In our calculations, the local density approximation (LDA) of the density functional theory is chosen as a starting point, the GW approximation (GWA) is performed in a next step, and finally the Bethe-Salpeter equation for the T-matrix, describing the particle-particle ladder diagrams up to the infinite order, is solved via the eigenvalue problem. The calculated DIE spectra, which are directly given by the eigenvalues, reflect the short-range electron correlation and are in good agreement with the experiment. We confirm that the Coulomb hole appears in the two-electron distribution function constructed from the eigenfunction.
Air method measurements of apple vessel length distributions with improved apparatus and theory
Shabtal Cohen; John Bennink; Mel Tyree
2003-01-01
Studies showing that rootstock dwarfing potential is related to plant hydraulic conductance led to the hypothesis that xylem properties are also related. Vessel length distribution and other properties of apple wood from a series of varieties were measured using the 'air method' in order to test this hypothesis. Apparatus was built to measure and monitor...
Distribution Theory for Glass's Estimator of Effect Size and Related Estimators.
Hedges, Larry V.
1981-01-01
Glass's estimator of effect size, the sample mean difference divided by the sample standard deviation, is studied in the context of an explicit statistical model. The exact distribution of Glass's estimator is obtained and the estimator is shown to have a small sample bias. Alternatives are proposed and discussed. (Author/JKS)
McGill, B.J.; Etienne, R.S.; Gray, J.S.; Alonso, D.; Anderson, M.J.; Benecha, H.K.
2007-01-01
Species abundance distributions (SADs) follow one of ecology's oldest and most universal laws: every community shows a hollow curve or hyperbolic shape on a histogram, with many rare species and just a few common species. Here, we review theoretical, empirical and statistical developments in the stu
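The hollow curve can be reproduced by sampling a log-series distribution, one classical SAD model (the parameter x = 0.95 and the community size here are illustrative assumptions):

```python
import math
import random
from collections import Counter

random.seed(3)

def logseries_sample(x, size):
    """Draw species abundances from a log-series distribution,
    P(k) proportional to x**k / k, via inverse-CDF sampling -- a
    classic generator of the hollow-curve pattern (many rare
    species, few common ones)."""
    norm = -math.log(1.0 - x)
    out = []
    for _ in range(size):
        u, k, term, cdf = random.random(), 1, x, 0.0
        while True:
            cdf += term / (k * norm)
            if u <= cdf or k >= 10 ** 6:   # guard against float stall
                break
            k += 1
            term *= x
        out.append(k)
    return out

abund = logseries_sample(0.95, 500)
hist = Counter(abund)   # abundance class -> number of species
```

Binned on a histogram, singletons form the tallest bar and counts fall off steeply with abundance, which is the hollow-curve signature the abstract describes.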
A Grounded Theory Analysis of E-Collaboration Effects for Distributed Project Management
S. Qureshi (Sadja); M. Liu (Miaojia); D. Vogel
2004-01-01
The emergence and widespread use of collaborative technologies for distributed project management has opened up a myriad of opportunities for business. While the opportunities for off-shore outsourcing and collaborative development are enticing, most tools and techniques for proj
Limit Distribution Theory for Maximum Likelihood Estimation of a Log-Concave Density.
Balabdaoui, Fadoua; Rufibach, Kaspar; Wellner, Jon A
2009-06-01
We find limiting distributions of the nonparametric maximum likelihood estimator (MLE) of a log-concave density, i.e. a density of the form f₀ = exp(φ₀) where φ₀ is a concave function on ℝ. Existence, form, characterizations and uniform rates of convergence of the MLE are given by Rufibach (2006) and Dümbgen and Rufibach (2007). The characterization of the log-concave MLE in terms of distribution functions is the same (up to sign) as the characterization of the least squares estimator of a convex density on [0, ∞) as studied by Groeneboom, Jongbloed and Wellner (2001b). We use this connection to show that the limiting distributions of the MLE and its derivative are, under comparable smoothness assumptions, the same (up to sign) as in the convex density estimation problem. In particular, changing the smoothness assumptions of Groeneboom, Jongbloed and Wellner (2001b) slightly by allowing some higher derivatives to vanish at the point of interest, we find that the pointwise limiting distributions depend on the second and third derivatives at 0 of H_k, the "lower invelope" of an integrated Brownian motion process minus a drift term depending on the number of vanishing derivatives of φ₀ = log f₀ at the point of interest. We also establish the limiting distribution of the resulting estimator of the mode M(f₀) and establish a new local asymptotic minimax lower bound which shows the optimality of our mode estimator in terms of both rate of convergence and dependence of constants on population values.
Xu, Peng; Yao, Dezhong; Luo, Fen
2005-08-01
The registration method based on mutual information is currently a popular technique for medical image registration, but computing the mutual information is complex and registration is slow. In engineering practice, a subsampling technique is used to accelerate registration at the cost of accuracy. In this paper a new method based on statistical sampling theory is developed, which has both higher speed and higher accuracy than the usual subsampling method; simulation results confirm the validity of the new method.
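The trade-off addressed above can be seen in a plain histogram-based mutual information estimate: naive subsampling speeds up the computation but perturbs the MI value. This sketch is the baseline being improved upon, not the authors' method; the synthetic data, bin count, and sampling rate are assumptions:

```python
import math
import random
from collections import Counter

random.seed(4)

def mutual_information(a, b, bins=8):
    """Histogram estimate of the mutual information I(A;B), in bits,
    between two equally long intensity sequences with values in [0, 1)."""
    n = len(a)
    ha = Counter(int(x * bins) for x in a)
    hb = Counter(int(y * bins) for y in b)
    hab = Counter((int(x * bins), int(y * bins)) for x, y in zip(a, b))
    mi = 0.0
    for (i, j), c in hab.items():
        pij = c / n
        mi += pij * math.log2(pij * n * n / (ha[i] * hb[j]))
    return mi

# correlated "images": b is a noisy copy of a
a = [random.random() for _ in range(20000)]
b = [min(0.999, max(0.0, x + random.gauss(0.0, 0.05))) for x in a]

mi_full = mutual_information(a, b)
sub = random.sample(range(20000), 2000)   # plain 10% subsampling
mi_sub = mutual_information([a[i] for i in sub], [b[i] for i in sub])
```

The subsampled estimate is roughly ten times cheaper but carries extra sampling noise and a small positive histogram bias, which is the accuracy loss that a sampling-theory-guided scheme aims to control.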
Theory of relativistic heat polynomials and one-sided Lévy distributions
Dattoli, G.; Górska, K.; Horzela, A.; Penson, K. A.; Sabia, E.
2017-06-01
The theory of pseudo-differential operators is a powerful tool to deal with differential equations involving differential operators under the square root sign. These types of equations are pivotal elements to treat problems in anomalous diffusion and in relativistic quantum mechanics. In this paper, we report on new links between fractional diffusion, quantum relativistic equations, and particular families of polynomials, linked to the Bessel polynomials in Carlitz form and playing the role of relativistic heat polynomials. We introduce generalizations of these polynomial families and point out their specific use for the solutions of problems of practical importance.
Validation of Microbial Limit Tests of Neomycin and Metronidazole Gargarism
余晓霞; 邱凯锋; 刘春霞
2014-01-01
OBJECTIVE To establish a method to test the microbial limit of Neomycin and Metronidazole gargarism. METHODS The methodology under "microbial limit tests" in the appendix of the Chinese Pharmacopoeia 2010 (section 2) was used for validation. RESULTS The drug had no inhibitory action on Candida albicans or Aspergillus niger, and the membrane filtration method can eliminate the drug's inhibitory effect on Staphylococcus aureus, Escherichia coli and Bacillus subtilis. The recovery can reach 75% or more. CONCLUSION The validated method meets the requirements of the current Chinese Pharmacopoeia (2010 edition).
Chen, Y W [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Zhang, L F [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Huang, J P [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China)
2007-07-20
By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property.
Tianhao Wu
2016-09-01
While firm-level and micro-issue analysis has become an important part of research in international trade, little work has been concerned with the goodness of fit for the size distribution of firms. In this paper, we revisit the statistical aspects of firm productivity and sales revenue in order to compare different definitions of statistical distance. We first deduce the exact form of the size distribution of firms by implementing only the assumptions on productivity and the demand function, and then introduce the well-known g-divergence as well as its statistical implications. We also perform simulation and calibration to compare the different divergences and to test the combined assumptions. We conclude that minimizing Pearson χ² and Neyman χ² produces similar results, and that minimizing the Kullback-Leibler divergence is likely to come at the expense of the other distance measures. Additionally, the selection among different statistical distances is much more significant than the choice of demand function.
LIU Jian-hua; LIANG Rui; WANG Chong-lin; FAN Di-peng
2009-01-01
Single-phase low-current grounding faults are often seen in the power distribution systems of coal mines. These faults are difficult to identify reliably. We propose a new method of single-phase ground-fault protection based upon a discernibility matrix of the fractal dimension associated with line currents. The method builds on existing selective protection methods. Faulted feeders are distinguished using differences in the fractal dimension of the zero-sequence transient current. The current signals are first processed through a fast Fourier transform, and the characteristics of a faulted line are then identified using the discernibility matrix. The method of calculation is illustrated. The results show that the method involves simple calculations, is easy to apply and is highly accurate. It is therefore suitable for distribution networks having different neutral grounding modes.
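The paper's exact discernibility-matrix construction is not reproduced here, but the underlying feature, a fractal dimension that separates rough (transient-rich) from smooth current traces, can be sketched with Higuchi's algorithm; the test signals and kmax below are illustrative assumptions:

```python
import math
import random

random.seed(7)

def higuchi_fd(signal, kmax=8):
    """Higuchi's estimate of the fractal dimension of a 1-D signal.
    A rougher, transient-rich trace yields a larger dimension -- the
    kind of feature that can separate faulted from healthy feeders."""
    n = len(signal)
    log_len, log_inv_k = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            pts = [signal[i] for i in range(m, n, k)]
            if len(pts) < 2:
                continue
            dist = sum(abs(pts[i + 1] - pts[i]) for i in range(len(pts) - 1))
            norm = (n - 1) / (k * (len(pts) - 1))
            lengths.append(dist * norm / k)
        log_len.append(math.log(sum(lengths) / len(lengths)))
        log_inv_k.append(math.log(1.0 / k))
    # least-squares slope of log L(k) against log(1/k) is the dimension
    mx = sum(log_inv_k) / len(log_inv_k)
    my = sum(log_len) / len(log_len)
    num = sum((x - mx) * (y - my) for x, y in zip(log_inv_k, log_len))
    den = sum((x - mx) ** 2 for x in log_inv_k)
    return num / den

smooth = [math.sin(0.01 * i) for i in range(2000)]      # clean waveform
rough = [random.gauss(0.0, 1.0) for _ in range(2000)]   # noise-like trace
fd_smooth = higuchi_fd(smooth)
fd_rough = higuchi_fd(rough)
```

A clean sinusoid comes out near dimension 1 and a noise-like trace near 2, so a threshold (or a discernibility matrix built from such values across feeders) can flag the anomalous line.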
Continuous-Time Discrete-Distribution Theory for Activity-Driven Networks
Zino, Lorenzo; Rizzo, Alessandro; Porfiri, Maurizio
2016-11-01
Activity-driven networks are a powerful paradigm to study epidemic spreading over time-varying networks. Despite significant advances, most of the current understanding relies on discrete-time computer simulations, in which each node is assigned an activity potential from a continuous distribution. Here, we establish a continuous-time discrete-distribution framework toward an analytical treatment of the epidemic spreading, from its onset to the endemic equilibrium. In the thermodynamic limit, we derive a nonlinear dynamical system to accurately model the epidemic spreading and leverage techniques from the fields of differential inclusions and adaptive estimation to inform short- and long-term predictions. We demonstrate our framework through the analysis of two real-world case studies, exemplifying different physical phenomena and time scales.
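A minimal discrete-time sketch of an SIS epidemic on an activity-driven network with a discrete (two-value) activity distribution; all parameter values here are assumptions, and the paper itself develops a continuous-time analytical treatment rather than a simulation:

```python
import random

random.seed(5)

def sis_activity_driven(n=500, m=3, beta=0.3, mu=0.1, steps=400, i0=25):
    """Discrete-time SIS epidemic on an activity-driven network.  Each
    step, a node fires with probability equal to its activity and makes
    m random contacts; infection passes per contact with probability
    beta, and infected nodes recover with probability mu.  Activities
    follow a two-value discrete distribution (values chosen here so the
    system is well above the epidemic threshold)."""
    activity = [0.5 if random.random() < 0.2 else 0.05 for _ in range(n)]
    infected = set(random.sample(range(n), i0))
    history = []
    for _ in range(steps):
        new_inf = set(infected)
        for u in range(n):
            if random.random() < activity[u]:
                for v in random.sample(range(n), m):
                    if v == u:
                        continue
                    if u in infected and v not in infected and random.random() < beta:
                        new_inf.add(v)
                    elif v in infected and u not in infected and random.random() < beta:
                        new_inf.add(u)
        infected = {w for w in new_inf if random.random() > mu}
        history.append(len(infected))
    return history

traj = sis_activity_driven()
```

With these values the spreading parameter β·m·(⟨a⟩ + √⟨a²⟩)/μ is roughly 3, well above the activity-driven epidemic threshold of 1, so the trajectory grows from the seed and settles at an endemic level.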
Input modeling with phase-type distributions and Markov models theory and applications
Buchholz, Peter; Felko, Iryna
2014-01-01
Containing a summary of several recent results on Markov-based input modeling in a coherent notation, this book introduces and compares algorithms for parameter fitting and gives an overview of available software tools in the area. Due to progress made in recent years with respect to new algorithms to generate PH distributions and Markovian arrival processes from measured data, the models outlined are useful alternatives to other distributions or stochastic processes used for input modeling. Graduate students and researchers in applied probability, operations research and computer science along with practitioners using simulation or analytical models for performance analysis and capacity planning will find the unified notation and up-to-date results presented useful. Input modeling is the key step in model based system analysis to adequately describe the load of a system using stochastic models. The goal of input modeling is to find a stochastic model to describe a sequence of measurements from a real system...
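A phase-type distribution is the absorption time of a continuous-time Markov chain, so sampling one is straightforward once a representation is chosen. The sketch below encodes an Erlang-2 example; the list-based representation format is an assumption of this sketch, not the book's notation:

```python
import random

random.seed(6)

def sample_phase_type(alpha, rates, jump, reps=10000):
    """Sample absorption times of a CTMC: alpha is the initial
    distribution over transient states, rates[i] the exponential rate
    of state i, and jump[i][j] the probability of moving to transient
    state j (row sums below 1 leave the remainder for absorption)."""
    out = []
    idx = range(len(alpha))
    for _ in range(reps):
        u, acc, s = random.random(), 0.0, None
        for i in idx:              # pick the initial transient state
            acc += alpha[i]
            if u <= acc:
                s = i
                break
        t = 0.0
        while s is not None:       # run until absorption
            t += random.expovariate(rates[s])
            u, acc, nxt = random.random(), 0.0, None
            for j in idx:
                acc += jump[s][j]
                if u <= acc:
                    nxt = j
                    break
            s = nxt
        out.append(t)
    return out

# Erlang-2: two sequential stages of rate 2.0 each, so the mean is 1.0
times = sample_phase_type(alpha=[1.0, 0.0], rates=[2.0, 2.0],
                          jump=[[0.0, 1.0], [0.0, 0.0]])
```

Fitting algorithms of the kind the book surveys work in the opposite direction: given measured durations, they search for (alpha, rates, jump) whose absorption-time law matches the data.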
Munaò, Gianmarco; Costa, Dino; Caccamo, Carlo
2016-10-01
Inspired by significant improvements obtained for the performances of the polymer reference interaction site model (PRISM) theory of the fluid phase when coupled with ‘molecular closures’ (Schweizer and Yethiraj 1993 J. Chem. Phys. 98 9053), we exploit a matrix generalization of this concept, suitable for the more general RISM framework. We report a preliminary test of the formalism, as applied to prototype square-well homonuclear diatomics. As for the structure, comparison with Monte Carlo shows that molecular closures are slightly more predictive than their ‘atomic’ counterparts, and thermodynamic properties are equally accurate. We also devise an application of molecular closures to models interacting via continuous, soft-core potentials, by using well established prescriptions in liquid state perturbation theories. In the case of Lennard-Jones dimers, our scheme definitely improves over the atomic one, providing semi-quantitative structural results, and quite good estimates of internal energy, pressure and phase coexistence. Our finding paves the way to a systematic employment of molecular closures within the RISM framework to be applied to more complex systems, such as molecules constituted by several non-equivalent interaction sites.
Shiqian Nie
2017-01-01
The fractional advection-diffusion equation (fADE) model is a new approach to describing the vertical distribution of suspended sediment concentration in steady turbulent flow. However, the advantages and parameter definitions of the fADE model in describing the sediment suspension distribution are still unclear. To address this knowledge gap, this study first reviews seven models, including the fADE model, for the vertical distribution of suspended sediment concentration in steady turbulent flow. The fADE model, among others, describes both Fickian and non-Fickian diffusive characteristics of suspended sediment, while the other six models assume that the vertical diffusion of suspended sediment follows Fick's first law. Second, this study explores the sensitivity of the fractional index of the fADE model to variations in particle size and sediment settling velocity, based on experimental data collected from the literature. Finally, empirical formulas are developed to relate the fractional derivative order to particle size and sediment settling velocity. These formulas offer river engineers an alternative way to estimate the fractional derivative order in the fADE model.
The complete two-loop integrated jet thrust distribution in soft-collinear effective theory
von Manteuffel, Andreas; Schabinger, Robert M.; Zhu, Hua Xing
2014-03-01
In this work, we complete the calculation of the soft part of the two-loop integrated jet thrust distribution in e+e- annihilation. This jet mass observable is based on the thrust cone jet algorithm, which involves a veto scale for out-of-jet radiation. The previously uncomputed part of our result depends in a complicated way on the jet cone size, r, and at intermediate stages of the calculation we actually encounter a new class of multiple polylogarithms. We employ an extension of the coproduct calculus to systematically exploit functional relations and represent our results concisely. In contrast to the individual contributions, the sum of all global terms can be expressed in terms of classical polylogarithms. Our explicit two-loop calculation enables us to clarify the small r picture discussed in earlier work. In particular, we show that the resummation of the logarithms of r that appear in the previously uncomputed part of the two-loop integrated jet thrust distribution is inextricably linked to the resummation of the non-global logarithms. Furthermore, we find that the logarithms of r which cannot be absorbed into the non-global logarithms in the way advocated in earlier work have coefficients fixed by the two-loop cusp anomalous dimension. We also show that in many cases one can straightforwardly predict potentially large logarithmic contributions to the integrated jet thrust distribution at L loops by making use of analogous contributions to the simpler integrated hemisphere soft function.
Verron, E.; Gros, A.
2017-09-01
Most network models for soft materials, e.g. elastomers and gels, are dedicated to idealized materials: all chains admit the same number of Kuhn segments. Nevertheless, such standard models are not appropriate for materials involving multiple networks, and some specific constitutive equations devoted to these materials have been derived in the last few years. In nearly all cases, idealized networks of different chain lengths are assembled following an equal strain assumption; only a few papers adopt an equal stress assumption, although some authors argue that such a hypothesis would reflect the equilibrium of the different networks in contact. In this work, a full-network model with an arbitrary chain length distribution is derived by considering that chains of different lengths satisfy the equal force assumption in each direction of the unit sphere. The derivation is restricted to non-Gaussian freely jointed chains and to affine deformation of the sphere. Firstly, after a proper definition of the undeformed configuration of the network, we demonstrate that the equal force assumption leads to the equality of a normalized stretch in chains of different lengths. Secondly, we establish that the network with a chain length distribution behaves as an idealized full-network whose chain length and chain density are both provided by the chain length distribution. This approach is finally illustrated with two examples: the derivation of a new expression for the Young's modulus of bimodal interpenetrated polymer networks, and the prediction of the change in fluorescence during deformation of mechanochemically responsive elastomers.
Distribution of Schmidt-like eigenvalues for Gaussian ensembles of the random matrix theory
Pato, Mauricio P.; Oshanin, Gleb
2013-03-01
We study the probability distribution function $P_n^{(\beta)}(w)$ of the Schmidt-like random variable $w = x_1^2/(\sum_{j=1}^{n} x_j^2/n)$, where the $x_j$ ($j = 1, 2, \ldots, n$) are unordered eigenvalues of a given $n \times n$ $\beta$-Gaussian random matrix, $\beta$ being the Dyson symmetry index. This variable, by definition, can be considered as a measure of how any individual (randomly chosen) eigenvalue deviates from the arithmetic mean value of all eigenvalues of a given random matrix, and its distribution is calculated with respect to the ensemble of such $\beta$-Gaussian random matrices. We show that in the asymptotic limit $n \to \infty$ and for arbitrary $\beta$ the distribution $P_n^{(\beta)}(w)$ converges to the Marčenko-Pastur form, i.e. is defined as $P_n^{(\beta)}(w) \sim \sqrt{(4 - w)/w}$ for $w \in [0, 4]$ and equals zero outside of the support, despite the fact that formally $w$ is defined on the interval $[0, n]$. Furthermore, for Gaussian unitary ensembles ($\beta = 2$) we present exact explicit expressions for $P_n^{(\beta = 2)}(w)$ which are valid for arbitrary $n$ and analyse their behaviour.
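The limiting form quoted above corresponds to the normalized density $P(w) = \frac{1}{2\pi}\sqrt{(4-w)/w}$ on $[0,4]$; this Monte Carlo sketch (an independent numerical illustration, not the authors' derivation) checks its normalization and unit mean, together with the exact finite-$n$ identity $E[w] = 1$ that follows from exchangeability of the eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)

def schmidt_w_samples(n, samples):
    """Monte Carlo samples of w = x_1^2 / (sum_j x_j^2 / n), with x_j the
    eigenvalues of an n x n GOE (beta = 1) matrix; the 'unordered eigenvalue'
    is emulated by picking a random index from the sorted spectrum."""
    ws = np.empty(samples)
    for k in range(samples):
        A = rng.standard_normal((n, n))
        x = np.linalg.eigvalsh((A + A.T) / 2.0)   # GOE eigenvalues
        j = rng.integers(n)
        ws[k] = n * x[j] ** 2 / np.sum(x ** 2)    # scale-invariant, no rescaling needed
    return ws

# Limiting density P(w) = (1/(2*pi)) * sqrt((4 - w)/w) on [0, 4]
m = 20000
dw = 4.0 / m
w = (np.arange(m) + 0.5) * dw                 # midpoint grid avoids the endpoints
p = np.sqrt((4.0 - w) / w) / (2.0 * np.pi)
norm = float(np.sum(p) * dw)                  # ~1: the density is normalized
mean = float(np.sum(w * p) * dw)              # ~1: matches the exact E[w] = 1
```

Since $w$ is a ratio, it is invariant under rescaling of the spectrum, so no edge normalization of the GOE eigenvalues is required.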
Density Functional Theory Based on the Electron Distribution on the Energy Coordinate
Takahashi, Hideaki
2016-01-01
We introduced a new electron density $n(\epsilon)$ by projecting the spatial electron density $n(\mathbf{r})$ onto the energy coordinate $\epsilon$ defined with the external potential $\upsilon(\mathbf{r})$ of interest. Then, a density functional theory (DFT) was formulated, where $n(\epsilon)$ serves as a fundamental variable for the electronic energy. It was demonstrated that the Kohn-Sham equation can also be adapted to the DFT that employs the density $n(\epsilon)$ as an argument to the exchange energy functional. An important attribute of the energy density is that it involves the spatially non-local population of the spin-adapted density $n(\mathbf{r})$ at bond dissociation. By taking advantage of this property we developed a prototype of the static correlation functional employing no empirical parameters, which realized a reasonable dissociation curve for the H$_2$ molecule.
Stenull, O; Janssen, H K
2001-03-01
We study the multifractal moments of the current distribution in randomly diluted resistor networks near the percolation threshold. When an external current is applied between two terminals $x$ and $x'$ of the network, the $l$th multifractal moment scales as $M_I^{(l)}(x,x') \sim |x - x'|^{\psi_l/\nu}$, where $\nu$ is the correlation length exponent of the isotropic percolation universality class. By applying our concept of master operators [Europhys. Lett. 51, 539 (2000)] we calculate the family of multifractal exponents $\{\psi_l\}$ for $l \ge 0$ to two-loop order. We find that our result is in good agreement with numerical data for three dimensions.
Dark and visible matter distribution in Coma cluster: theory vs observations
Brilenkov, Ruslan; Zhuk, Alexander
2015-01-01
We investigate the dark and visible matter distribution in the Coma cluster for the Navarro-Frenk-White (NFW) profile. A toy model where all galaxies in the cluster are concentrated inside a sphere of an effective radius $R_{eff}$ is considered. It enables us to obtain the mean velocity dispersion as a function of $R_{eff}$. We show that, within the observational accuracy of the NFW parameters, the calculated value of $R_{eff}$ can be rather close to the observed cutoff of the galaxy distribution. Moreover, the comparison of our toy model with the observational data and simulations leads to the following preferred NFW parameters for the Coma cluster: $R_{200} \approx 1.77\,h^{-1} \, \mathrm{Mpc} = 2.61\, \mathrm{Mpc}$, $c=3\div 4$ and $M_{200}= 1.29 h^{-1}\times10^{15}M_{\odot}$. In the Coma cluster most of the galaxies are concentrated inside a sphere of effective radius $R_{eff}\sim 3.7$ Mpc, and the line-of-sight velocity dispersion is $1004\, \mathrm{km}\, \mathrm{s}^{-1}$.
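The quoted $R_{200}$ and $M_{200}$ are tied together by the definition $M_{200} = \frac{4\pi}{3}\,200\,\rho_{crit}\,R_{200}^3$; a quick consistency check (the constants below are standard values assumed here, not taken from the paper):

```python
import math

# Standard constants (assumed here, not from the paper); working in "h = 1"
# units, so lengths in h^-1 Mpc give masses in h^-1 M_sun.
G = 4.301e-9                 # gravitational constant, Mpc (km/s)^2 / M_sun
H = 100.0                    # Hubble constant, km/s/Mpc (times h)
rho_crit = 3.0 * H**2 / (8.0 * math.pi * G)   # critical density, M_sun / Mpc^3

R200 = 1.77                  # h^-1 Mpc, value quoted in the abstract
M200 = (4.0 / 3.0) * math.pi * 200.0 * rho_crit * R200**3   # h^-1 M_sun
# The expression simplifies to M200 = 100 * H**2 * R200**3 / G.
```

With $R_{200} = 1.77\,h^{-1}$ Mpc this gives $M_{200} \approx 1.29\,h^{-1}\times 10^{15} M_{\odot}$, matching the quoted value.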
Dekkers, Petrus J; Friedlander, Sheldon K
2002-04-15
Gas-phase synthesis of fine solid particles leads to fractal-like structures whose transport and light scattering properties differ from those of their spherical counterparts. Self-preserving size distribution theory provides a useful methodology for analyzing the asymptotic behavior of such systems. Apparent inconsistencies in previous treatments of the self-preserving size distributions in the free molecule regime are resolved. Integro-differential equations for fractal-like particles in the continuum and near-continuum regimes are derived and used to calculate the self-preserving and quasi-self-preserving size distributions for agglomerates formed by Brownian coagulation. The results for the limiting case (the continuum regime) were compared with the results of other authors; for these cases the finite difference method was in good agreement with previous calculations. A new analysis of aerosol agglomeration for the entire Knudsen number range was developed and compared with a monodisperse model; higher agglomeration rates were found for lower fractal dimensions, as expected from previous studies. Effects of fractal dimension, pressure, volume loading and temperature on agglomerate growth were investigated. The agglomeration rate can be reduced by decreasing volumetric loading or by increasing the pressure. In laminar flow, an increase in pressure can be used to control particle growth and polydispersity. For $D_f = 2$, an increase in pressure from 1 to 4 bar reduces the collision radius by about 30%. Varying the temperature has a much smaller effect on agglomerate coagulation.
The analysis of linear partial differential operators I distribution theory and Fourier analysis
Hörmander, Lars
2003-01-01
The main change in this edition is the inclusion of exercises with answers and hints. This is meant to emphasize that this volume has been written as a general course in modern analysis on a graduate student level and not only as the beginning of a specialized course in partial differential equations. In particular, it could also serve as an introduction to harmonic analysis. Exercises are given primarily for the sections of general interest; there are none to the last two chapters. Most of the exercises are just routine problems meant to give some familiarity with standard use of the tools introduced in the text. Others are extensions of the theory presented there. As a rule rather complete though brief solutions are then given in the answers and hints. To a large extent the exercises have been taken over from courses or examinations given by Anders Melin or myself at the University of Lund. I am grateful to Anders Melin for letting me use the problems originating from him and for numerous valuable comm...
Soft gluon resummation of Drell-Yan rapidity distributions: theory and phenomenology
Bonvini, Marco; Ridolfi, Giovanni
2010-01-01
We examine critically the theoretical underpinnings and phenomenological implications of soft gluon (threshold) resummation of rapidity distributions at a hadron collider, taking Drell-Yan production at the Tevatron and the LHC as a reference test case. First, we show that in perturbative QCD soft gluon resummation is necessary whenever the partonic (rather than the hadronic) center-of-mass energy is close enough to threshold, and we provide tools to assess when resummation is relevant for a given process. Then, we compare different prescriptions for handling the divergent nature of the series of resummed perturbative corrections, specifically the minimal and Borel prescriptions. We assess the intrinsic ambiguities of resummed results, both due to the asymptotic nature of their perturbative expansion, and to the treatment of subleading terms. Turning to phenomenology, we introduce a fast and accurate method for the implementation of resummation with the minimal and Borel prescriptions using an expansion on a basis...
Mingqi Qiao
2017-01-01
We performed an epidemiological investigation of subjects with premenstrual dysphoric disorder (PMDD) to identify the clinical distribution of the major syndromes and symptoms. The pathogenesis of PMDD mainly involves the dysfunction of liver conveyance and dispersion. Excessive liver conveyance and dispersion are associated with liver-qi invasion syndrome, while insufficient liver conveyance and dispersion are expressed as liver-qi depression syndrome. Additionally, a nonconditional logistic regression was performed to analyze the symptomatic features of liver-qi invasion and liver-qi depression. As a result of this analysis, two subtypes of PMDD are proposed, namely, excessive liver conveyance and dispersion (liver-qi invasion syndrome) and insufficient liver conveyance and dispersion (liver-qi depression syndrome). Our findings provide an epidemiological foundation for the clinical diagnosis and treatment of PMDD based on the identification of different types.
Multiobjective Optimization of Water Distribution Networks Using Fuzzy Theory and Harmony Search
Zong Woo Geem
2015-07-01
Thus far, various phenomenon-mimicking algorithms, such as genetic algorithm, simulated annealing, tabu search, shuffled frog-leaping, ant colony optimization, harmony search, cross entropy, scatter search, and honey-bee mating, have been proposed to optimally design water distribution networks with respect to design cost. However, the flow velocity constraint, which is critical for structural robustness against water hammer or for flow circulation against substance sedimentation, was seldom considered in the optimization formulation because of computational complexity. Thus, this study proposes a novel fuzzy-based velocity reliability index, which is to be maximized while the design cost is simultaneously minimized. The velocity reliability index is included in the existing cost optimization formulation and this extended multiobjective formulation is applied to two benchmark problems. Results show that the model successfully found a Pareto set of multiobjective design solutions in terms of cost minimization and reliability maximization.
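The abstract does not give the exact form of the fuzzy velocity reliability index; a common construction is a trapezoidal membership function that equals 1 inside an ideal velocity band and tapers to 0 at hard limits, aggregated conservatively over all pipes. The band edges below are illustrative assumptions, not values from the paper:

```python
def velocity_membership(v, v_lo=0.3, v_ideal_lo=0.8, v_ideal_hi=1.5, v_hi=2.5):
    """Trapezoidal fuzzy membership for pipe flow velocity (m/s; illustrative
    bounds). Low velocities risk sedimentation; high velocities risk water
    hammer. Returns 1 inside the ideal band, 0 outside the hard limits."""
    if v <= v_lo or v >= v_hi:
        return 0.0
    if v_ideal_lo <= v <= v_ideal_hi:
        return 1.0
    if v < v_ideal_lo:
        return (v - v_lo) / (v_ideal_lo - v_lo)   # rising edge
    return (v_hi - v) / (v_hi - v_ideal_hi)       # falling edge

def network_reliability(velocities):
    """Aggregate index: the minimum membership over all pipes, a conservative
    (worst-pipe) fuzzy aggregation choice."""
    return min(velocity_membership(v) for v in velocities)
```

In a multiobjective setting this index would be maximized against the design cost, yielding the Pareto front described in the abstract.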
Regnier, D.; Dubray, N.; Schunck, N.; Verrière, M.
2016-05-01
Background: Accurate knowledge of fission fragment yields is an essential ingredient of numerous applications ranging from the formation of elements in the r process to fuel cycle optimization for nuclear energy. The need for a predictive theory applicable where no data are available, together with the variety of potential applications, is an incentive to develop a fully microscopic approach to fission dynamics. Purpose: In this work, we calculate the pre-neutron emission charge and mass distributions of the fission fragments formed in the neutron-induced fission of 239Pu using a microscopic method based on nuclear density functional theory (DFT). Methods: Our theoretical framework is the nuclear energy density functional (EDF) method, where large-amplitude collective motion is treated adiabatically by using the time-dependent generator coordinate method (TDGCM) under the Gaussian overlap approximation (GOA). In practice, the TDGCM is implemented in two steps. First, a series of constrained EDF calculations map the configuration and potential-energy landscape of the fissioning system for a small set of collective variables (in this work, the axial quadrupole and octupole moments of the nucleus). Then, nuclear dynamics is modeled by propagating a collective wave packet on the potential-energy surface. Fission fragment distributions are extracted from the flux of the collective wave packet through the scission line. Results: We find that the main characteristics of the fission charge and mass distributions can be well reproduced by existing energy functionals even in two-dimensional collective spaces. Theory and experiment agree typically within two mass units for the position of the asymmetric peak. As expected, calculations are sensitive to the structure of the initial state and the prescription for the collective inertia. We emphasize that results are also sensitive to the continuity of the collective landscape near scission. Conclusions: Our analysis confirms
Morgenthaler, George W.
1989-01-01
The ability to launch-on-time and to send payloads into space has progressed dramatically since the days of the earliest missile and space programs. Causes for delay during launch, i.e., unplanned 'holds', are attributable to several sources: weather, range activities, vehicle conditions, human performance, etc. Recent developments in space program, particularly the need for highly reliable logistic support of space construction and the subsequent planned operation of space stations, large unmanned space structures, lunar and Mars bases, and the necessity of providing 'guaranteed' commercial launches have placed increased emphasis on understanding and mastering every aspect of launch vehicle operations. The Center of Space Construction has acquired historical launch vehicle data and is applying these data to the analysis of space launch vehicle logistic support of space construction. This analysis will include development of a better understanding of launch-on-time capability and simulation of required support systems for vehicle assembly and launch which are necessary to support national space program construction schedules. In this paper, the author presents actual launch data on unscheduled 'hold' distributions of various launch vehicles. The data have been supplied by industrial associate companies of the Center for Space Construction. The paper seeks to determine suitable probability models which describe these historical data and that can be used for several purposes such as: inputs to broader simulations of launch vehicle logistic space construction support processes and the determination of which launch operations sources cause the majority of the unscheduled 'holds', and hence to suggest changes which might improve launch-on-time. In particular, the paper investigates the ability of a compound distribution probability model to fit actual data, versus alternative models, and recommends the most productive avenues for future statistical work.
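One simple example of the kind of compound probability model considered for unscheduled hold durations is a two-component exponential mixture (hyperexponential); the sketch below uses purely illustrative parameters (the split between short and long holds, and the mean durations, are assumptions, not values from the paper):

```python
import math
import random

def hyperexp_pdf(t, p=0.7, lam1=1 / 5.0, lam2=1 / 45.0):
    """Mixture of two exponentials: with probability p a 'short' hold
    (mean 1/lam1 = 5 min, e.g. weather) and otherwise a 'long' hold
    (mean 1/lam2 = 45 min, e.g. hardware). All parameters are illustrative."""
    return p * lam1 * math.exp(-lam1 * t) + (1 - p) * lam2 * math.exp(-lam2 * t)

def hyperexp_sample(rng, p=0.7, lam1=1 / 5.0, lam2=1 / 45.0):
    """Draw one hold duration from the mixture."""
    lam = lam1 if rng.random() < p else lam2
    return rng.expovariate(lam)

# Mixture mean is p/lam1 + (1-p)/lam2; check it against Monte Carlo.
mean_analytic = 0.7 * 5.0 + 0.3 * 45.0
rng = random.Random(1)
mean_mc = sum(hyperexp_sample(rng) for _ in range(200000)) / 200000
```

Fitting such a mixture to the historical hold data by likelihood, and comparing it against single-distribution alternatives, would follow the program the paper describes.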
Using reactor network for global identification based on residence time distribution theory
Hocine, S.; Pibouleau, L.; Azzaro-Pantel, C.; Domenech, S. [Laboratoire de Genie Chimique - UMR 5503 CNRS/ INPT ENSIACET, 31 - Toulouse (France)
2006-07-01
In ventilation systems, the control of contaminant transfer is one of the principal problems during the design and control phases. The installation of a suitable ventilation system for the control of contaminant transfer is essential in industry, because it makes it possible to detect and to prevent chemical and radiological risks. Research on air distribution in ventilated rooms traditionally involves full-scale experiments, scale-model experiments and application of computational fluid dynamics (CFD) tools. Most of the time, particularly in our case of large and cluttered enclosures, the predictive approach based on CFD codes cannot be used. The solution retained here is the establishment of a model based on the well-known residence time distribution. This model is widely used in chemical engineering to treat non-ideal flows. The proposed method is based on the experimental determination of the residence time distribution curve, generally obtained through the response of the system to a tracer release. A superstructure involving the set of all the possible solutions corresponding to the physical reactor is then defined, and the model is selected from this superstructure according to its simulated response. The superstructure is identified as a combination of elementary systems representing ideal flow patterns, such as perfectly mixed flows, plug flows, continuous stirred tank reactors, etc. The selected model is derived from the comparison between the simulated response to a stimulus and the experimental response. The structure and parameters of the model are simultaneously optimized in order to fit the experimental curve with a minimal number of elementary units, a key point for future control purposes of the process. This problem is a dynamic MINLP (Mixed Integer Non-Linear Programming) problem with bilinear equality constraints. Generally, these constraints lead to numerical difficulties for reaching an optimum solution (even a
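A standard elementary ideal-flow building block for such superstructures is the tanks-in-series model, whose RTD is the classical Erlang form; this sketch checks its defining moments (a textbook chemical-engineering result, not the specific model identified in the paper):

```python
import math

def tanks_in_series_rtd(theta, n):
    """Dimensionless RTD of n equal CSTRs in series, theta = t / tau:

        E(theta) = n * (n*theta)**(n-1) * exp(-n*theta) / (n-1)!

    n = 1 recovers a single perfectly mixed tank; n -> infinity tends to
    plug flow."""
    return n * (n * theta) ** (n - 1) * math.exp(-n * theta) / math.factorial(n - 1)

# Numerical check of the two defining moments for n = 3:
# the RTD integrates to 1 and has unit mean residence time.
dtheta = 1e-4
area = sum(tanks_in_series_rtd(k * dtheta, 3) for k in range(1, 200000)) * dtheta
mean = sum((k * dtheta) * tanks_in_series_rtd(k * dtheta, 3) for k in range(1, 200000)) * dtheta
```

Combinations of such units (in series and parallel, with bypasses and dead zones) form the superstructure whose structure and parameters are then fitted to the measured tracer response.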
Mohammad, S. Noor
1988-03-01
A theoretical method for the potential distribution in abrupt heterojunctions (HJs) made of uniformly doped degenerate semiconductors has been developed. The method reduces automatically to that for HJs from nondegenerate semiconductors in the limit of low carrier concentrations. For the development of the method the rigid band approximation of degenerate semiconductors has been considered to be valid. The transport equations of Marshak and Van Vliet [Solid-State Electron. 21, 417 (1978)] and an analytical approximation for the Fermi-Dirac integral of order half by the present author [Solid-State Electron. 30, 713 (1987)] have been employed for the formulation. The average of the scattered experimental data for band-gap narrowing of n-Si, n-Ge, p-GaAs, and n-InP has been fitted to the same form as that for the Fermi-Dirac integral of order 1/2 to ease this formulation. The local electrostatic field and local electrostatic potentials obtained from the formulation reduce to those of Chatterjee and Marshak [Solid-State Electron. 24, 1111 (1981)], Cserveny [Int. J. Electron. 25, 65 (1968)], and Kroemer [J. Appl. Phys. 52, 873 (1981)] under special conditions. It is noted that band-gap narrowing and consideration of Fermi-Dirac statistics represent opposite effects for the effective intrinsic carrier concentration and the local electrostatic field. At some critical concentration belonging to the degenerate limit of a semiconductor, these two effects cancel each other's influence on the effective intrinsic carrier concentration of the semiconductor and on the transition region properties of an HJ. Below this critical concentration, band-gap narrowing rather than the consideration of Fermi-Dirac statistics dominantly influences the device properties. However, above this critical concentration, the consideration of Fermi-Dirac statistics dominates over the other. Applications of the electrostatic field and electrostatic potential to isotype and anisotype HJs have been discussed. On the basis of
O. Klemp
2006-01-01
In order to satisfy the stringent demand for an accurate prediction of MIMO channel capacity and diversity performance in wireless communications, more effective and suitable models that account for real antenna radiation behavior have to be considered. One of the main challenges is the accurate modeling of antenna correlation, which is directly related to the amount of channel capacity or diversity gain that might be achieved in multi-element antenna configurations. Spherical wave theory in electromagnetics is a well-known technique to express antenna far fields by means of a compact field expansion with a reduced number of unknowns, and it was recently applied to derive an analytical approach for the computation of antenna pattern correlation. In this paper we present a novel and efficient computational technique to determine antenna pattern correlation based on the evaluation of the surface current distribution by means of a spherical mode expansion.
Freifelder, R.; Prakash, M.; Alexander, John M.
1986-02-01
We examine the application of transition-state theory for fission-fragment angular distributions to composite nuclei near the limits of stability. The possible roles of saddle-point and scission-point configurations are explored. For many heavy-ion reactions that involve large angular momenta, the observed anisotropies are between the predictions of the saddle-point and scission-point models. Empirical correlations are shown between the effective moments of inertia and the spin and $Z^2/A$ of the compound nucleus. These correlations provide evidence for a class of transition-state nuclei intermediate between saddle- and scission-point configurations. An important indication of these patterns is that the speed of collective deformation toward fission may well be slow enough to allow for statistical equilibrium in the tilting mode even for configurations well beyond the saddle point.
Antipin, K V; Silaev, P K
2016-01-01
We use both numerical and analytical approaches to study the dynamics of gravitational collapse in the framework of the relativistic theory of gravitation (RTG). We use various equations of state for the collapsing matter and relatively realistic initial conditions with a smooth matter distribution, which corresponds to the static solution for the given equation of state. We also obtain results concerning the influence of the graviton mass on the properties of static solutions. We specify several characteristics of the collapse process; in particular, we determine the dependence of the turning point time (when contraction is replaced by inflation) on the graviton mass. We also study the influence of non-zero pressure on the dynamics of the collapse.
Cantelaube, Y C
2012-01-01
In a central potential the usual resolution of the Schrödinger equation in spherical coordinates consists in determining the solutions R(r) or u(r) of the radial equations considered as the radial parts of the Schrödinger equation. However, the solutions must be supplemented with the boundary condition u(0) = 0 in order to rule out singular solutions. There is still no consensus on how to justify this condition, with good reason. It is based on a misunderstanding that comes from the fact that the radial equation in terms of R(r) is derived from the Schrödinger equation, and the radial equation in terms of u(r) from the former, by taking the Laplacians in the sense of functions. By taking these Laplacians in the sense of distributions, as is required, we show that the radial equations are derived from the Schrödinger equation when their solutions are regular, but not when they are singular, so that the equations need not be supplemented with any additional condition such as u(0) = 0.
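The crux is the distributional identity for the Laplacian of $1/r$: writing $R(r) = u(r)/r$ for the s-wave ($\ell = 0$) and taking the Laplacian in the sense of distributions, a delta term appears whenever $u(0) \neq 0$ (a standard computation, sketched here for orientation):

```latex
\nabla^{2}\frac{1}{r} = -4\pi\,\delta^{3}(\mathbf{r})
\quad\Longrightarrow\quad
\nabla^{2}\!\left(\frac{u(r)}{r}\right)
 = \frac{u''(r)}{r} \;-\; 4\pi\,u(0)\,\delta^{3}(\mathbf{r})
\qquad (\ell = 0).
```

Hence a singular solution with $u(0) \neq 0$ satisfies the radial equation but not the original Schrödinger equation, which acquires the extra delta term; this is why no ad hoc boundary condition $u(0) = 0$ needs to be imposed by hand.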
Choy, C. W.; Xiao, J. J.; Yu, K. W.
2007-05-01
The recent Green function formalism (GFF) has been used to study the local field distribution near a periodic interface separating two homogeneous media of different dielectric constants. In the GFF, the integral equations can be solved conveniently because of the existence of an analytic expression for the kernel (Greenian). However, due to a severe singularity in the Greenian, the formalism was formerly applied to compute the electric fields away from the interface region. In this work, we have succeeded in extending the GFF to compute the electric field inside the interface region by taking advantage of a sum rule. To our surprise, the strengths of the electric fields are quite similar in both media across the interface, despite the large difference in dielectric constants. Moreover, we propose a simple effective medium approximation (EMA) to compute the electric field inside the interface region. We show that the EMA can indeed give an excellent description of the electric field, except near a surface plasmon resonance.
A new theory for the distribution of ancestors in biparental reproduction species
Caruso, M
2011-01-01
Calculating the number of ancestors of an individual of any species with biparental reproduction requires a statistical approach. It is not possible to build the progenitor tree assuming only that the species reproduces sexually with two parents. In particular, it is not true that the simple progression $2^{t+1}$ describes the number of ancestors realistically after more than a few generations. The reason is that, going back in generations, the probability that some ancestors are relatives increases. This restricts the mean number of ancestors below $2^{t+1}$, although this number does correspond to the maximum number of ancestors at generation $t$. Considering the possibility of blood relationship between the ancestors, we show how to rebuild the progenitor tree by modeling the problem as a Markov chain. We first work with a continuous time variable and then, through a discretization process, obtain the ancestor distribution. Using the covariant derivative in a proper way we n...
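A common minimal model of this pedigree collapse (an illustration of the saturation below $2^{t+1}$, not the specific Markov chain of the paper, which the abstract does not spell out) assumes a constant population of size $N$ with random mating, so that each of the $2a_t$ parent slots is filled by an approximately independent uniform draw from the previous generation:

```python
def expected_distinct_ancestors(N, generations):
    """Expected number of distinct ancestors per generation under random
    mating in a constant population of size N:

        a_{t+1} = N * (1 - (1 - 1/N)**(2 * a_t))

    (each of the 2*a_t parent slots is an approximately independent uniform
    draw from N individuals). Illustrative model, not the paper's chain."""
    a = [1.0]                          # generation 0: the individual itself
    for _ in range(generations):
        a.append(N * (1.0 - (1.0 - 1.0 / N) ** (2.0 * a[-1])))
    return a

ancestors = expected_distinct_ancestors(N=10000, generations=25)
# Early generations track 2^t; later ones saturate below N (pedigree collapse).
```

The recursion grows like $2^t$ at first and then saturates near the fixed point of $x = 1 - e^{-2x}$, i.e. at roughly $0.8N$ distinct ancestors per generation, far below the naive $2^{t+1}$.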
Heavy pseudoscalar twist-3 distribution amplitudes within QCD theory in background fields
Zhong, Tao; Wu, Xing-Gang; Huang, Tao; Fu, Hai-Bing
2016-09-01
In this paper, we study the properties of the twist-3 distribution amplitude (DA) of the heavy pseudoscalars such as η _c, B_c, and η _b. New sum rules for the twist-3 DA moments are derived with the Huang-Lepage prescription. Furthermore, we apply them to the B_c→ η _c transition form factor (f^{B_c→ η _c}_+(q^2)) within the light-cone sum rules approach, and the results are comparable with other approaches. It has been found that the twist-3 DAs φ ^P_{3;η _c} and φ ^σ _{3;η _c} are important for a reliable prediction of f^{B_c→ η _c}_+(q^2). For example, at the maximum recoil region, we have f^{B_c→ η _c}_+(0) = 0.674 ± 0.066, in which those two twist-3 terms provide ∼33% and ∼22% contributions. We also calculate the branching ratio of the semi-leptonic decay B_c → η _c lν as Br(B_c → η _c lν ) = ( 9.31^{+2.27}_{-2.01} ) × 10^{-3}.
Portfolio Theory for α-Symmetric and Pseudoisotropic Distributions: k-Fund Separation and the CAPM
Nils Chr. Framstad
2015-01-01
The shifted pseudoisotropic multivariate distributions are shown to satisfy Ross' stochastic dominance criterion for two-fund monetary separation in the case with a risk-free investment opportunity, and furthermore to admit the Capital Asset Pricing Model under an embedding-in-Lα condition if 1<α≤2, with the betas given in an explicit form. For the α-symmetric subclass, the market without a risk-free investment opportunity admits 2d-fund separation if α=1+1/(2d-1), d∈N, generalizing the classical elliptical case d=1, and we also give the precise number of funds needed, from which it follows that we cannot, except in degenerate cases, have a CAPM without a risk-free opportunity. For the symmetric stable subclass, the index of stability is only of secondary interest, and several common restrictions in terms of that index can be weakened by replacing it by the (no smaller) indices of symmetry/of embedding. Finally, dynamic models with intermediate consumption inherit the separation properties of the static models.
Ghanti, Dipanwita
2016-01-01
Application of pulling force, under force-clamp conditions, to kinetochore-microtubule attachments {\\it in-vitro} revealed a catch-bond-like behavior. In an earlier paper ({\\it Sharma et al., Phys. Biol. (2014)}), the physical origin of this apparently counter-intuitive phenomenon was traced to the nature of the force-dependence of the (de-)polymerization kinetics of the microtubules. In this brief communication that work is extended to situations where the external force is ramped up until the attachment gets ruptured. In spite of the fundamental differences in the underlying mechanisms, the trend of variation of the rupture force distribution observed in our model kinetochore-microtubule attachment with increasing loading rate is qualitatively similar to that displayed by the catch bonds formed in some other ligand-receptor systems. Our theoretical predictions can be tested experimentally by a straightforward modification of the protocol for controlling the force in the optical trap set-up that was used in...
Kinetic theory of phase space plateaux in a non-thermal energetic particle distribution
Eriksson, F., E-mail: frida.eriksson@chalmers.se; Nyqvist, R. M. [Department of Earth and Space Sciences, Chalmers University of Technology, 41296 Göteborg (Sweden); Lilley, M. K. [Physics Department, Imperial College, London SW7 2AZ (United Kingdom)
2015-09-15
The transformation of kinetically unstable plasma eigenmodes into hole-clump pairs with temporally evolving carrier frequencies was recently attributed to the emergence of an intermediate stage in the mode evolution cycle, that of an unmodulated plateau in the phase space distribution of fast particles. The role of the plateau as the hole-clump breeding ground is further substantiated in this article via consideration of its linear and nonlinear stability in the presence of fast particle collisions and sources, which are known to affect the production rates and subsequent frequency sweeping of holes and clumps. In particular, collisional relaxation, as mediated by e.g. velocity space diffusion or even simple Krook-type collisions, is found to inhibit hole-clump generation and detachment from the plateau, as it should. On the other hand, slowing down of the fast particles turns out to have an asymmetrically destabilizing/stabilizing effect, which explains the well-known result that collisional drag enhances holes and their sweeping rates but suppresses clumps. It is further demonstrated that relaxation of the plateau edge gradients has only a minor quantitative effect and does not change the plateau stability qualitatively, unless the edge region extends far into the plateau shelf and the corresponding Landau pole needs to be taken into account.
Oreiro José Luis
2013-01-01
This article analyzes the relationship between economic growth, income distribution and the real exchange rate within the neo-Kaleckian literature, through the construction of a nonlinear macrodynamic model for an open economy in which investment in fixed capital is assumed to be a quadratic function of the real exchange rate. The model demonstrates that the prevailing regime of accumulation in a given economy depends on the type of currency misalignment: if the real exchange rate is overvalued, then the regime of accumulation will be profit-led, but if the exchange rate is undervalued, then the accumulation regime is wage-led. Subsequently, the adherence of the theoretical model to data is tested for Brazil in the period 1994/Q3-2008/Q4. The econometric results are consistent with the theoretical non-linear specification of the investment function used in the model, so that we can establish the existence of a real exchange rate that maximizes the rate of capital accumulation for the Brazilian economy. From the estimate of this optimal rate we show that the real exchange rate is overvalued in 1994/Q3-2001/Q1 and 2005/Q4-2008/Q4 and undervalued in the period 2001/Q2-2005/Q3. As a direct corollary of this result, it follows that the prevailing regime of accumulation in the Brazilian economy after the last quarter of 2005 is profit-led.
Frandsen, Benjamin A.; Brunelli, Michela; Page, Katharine; Uemura, Yasutomo J.; Staunton, Julie B.; Billinge, Simon J. L.
2016-05-01
We present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ˜1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory.
Huggins, David J
2012-11-21
The structures of biomolecules and the strengths of association between them depend critically on interactions with water molecules. Thus, understanding these interactions is a prerequisite for understanding the structure and function of all biomolecules. Inhomogeneous fluid solvation theory provides a framework to derive thermodynamic properties of individual water molecules from a statistical mechanical analysis. In this work, two biomolecules are analysed to probe the distribution and thermodynamics of surrounding water molecules. The great majority of hydration sites are predicted to contribute favourably to the total free energy with respect to bulk water, though hydration sites close to non-polar regions of the solute do not contribute significantly. Analysis of a biomolecule with a positively and negatively charged functional group predicts that a charged species perturbs the free energy of water molecules to a distance of approximately 6.0 Å. Interestingly, short simulations are found to provide converged predictions if samples are taken with sufficient frequency, a finding that has the potential to significantly reduce the required computational cost of such analysis. In addition, the predicted thermodynamic properties of hydration sites with the potential for direct hydrogen bonding interactions are found to disagree significantly for two different water models. This study provides important information on how inhomogeneous fluid solvation theory can be employed to understand the structures and intermolecular interactions of biomolecules.
Hüttner, Bernd; Rohr, Gernot C.
1998-04-01
The interaction of short laser pulses with metals is described in terms of a new model involving a generalised nonlocal heat flow in time. The resulting system of coupled differential equations contains additional terms that originate in the thermal inertia of the electron subsystem. They are connected to the time derivative of both the laser intensity and the coupling of the electrons to the phonons. After briefly discussing the importance of the coefficient of heat exchange and of the different contributions to its main parts, the electron-phonon coupling and the average of the squared phonon frequencies, we present snapshot-like calculations of the spatial temperature distribution along the z-axis for Al, Au, Pb and Nb for different intensities but fixed pulse duration τL=250 fs. In addition, the electron and phonon temperatures inside a thin Au film are studied in detail. Our results are compared to those arising from a standard two-temperature model (TTM) and those of the conventional theory. While the traditional theory becomes invalid in most cases, at least in the ps-range, the differences between our approach and the TTM become more pronounced as the coefficient of heat exchange decreases.
Non-Gaussianity of the topological charge distribution in $\\mathrm{SU}(3)$ Yang-Mills theory
Cè, Marco
2015-01-01
In Yang-Mills theory, the cumulants of the na\\"ive lattice discretization of the topological charge evolved with the Yang-Mills gradient flow coincide, in the continuum limit, with those of the universal definition. We sketch in these proceedings the main points of the proof. By implementing the gradient-flow definition in numerical simulations, we report the results of a precise computation of the second and the fourth cumulant of the $\\mathrm{SU}(3)$ Yang-Mills theory topological charge distribution, in order to measure the deviation from Gaussianity. A range of high-statistics Monte Carlo simulations with different lattice volumes and spacings is used to extrapolate the results to the continuum limit with confidence by keeping finite-volume effects negligible with respect to the statistical errors. Our best result for the topological susceptibility is $t_0^2\\chi=6.67(7)\\times 10^{-4}$, while for the ratio between the fourth and the second cumulant we obtain $R=0.233(45)$.
Hopkins, Paul; Fortini, Andrea; Archer, Andrew J; Schmidt, Matthias
2010-12-14
We describe a test particle approach based on dynamical density functional theory (DDFT) for studying the correlated time evolution of the particles that constitute a fluid. Our theory provides a means of calculating the van Hove distribution function by treating its self and distinct parts as the two components of a binary fluid mixture, with the "self" component having only one particle, the "distinct" component consisting of all the other particles, and using DDFT to calculate the time evolution of the density profiles for the two components. We apply this approach to a bulk fluid of Brownian hard spheres and compare to results for the van Hove function and the intermediate scattering function from Brownian dynamics computer simulations. We find good agreement at low and intermediate densities using the very simple Ramakrishnan-Yussouff [Phys. Rev. B 19, 2775 (1979)] approximation for the excess free energy functional. Since the DDFT is based on the equilibrium Helmholtz free energy functional, we can probe a free energy landscape that underlies the dynamics. Within the mean-field approximation we find that as the particle density increases, this landscape develops a minimum, while an exact treatment of a model confined situation shows that for an ergodic fluid this landscape should be monotonic. We discuss possible implications for slow, glassy, and arrested dynamics at high densities.
Raschke, Mathias
2015-01-01
In this short note, I comment on the research of Pisarenko et al. (2014) regarding extreme value theory and statistics in the case of earthquake magnitudes. The link between the generalized extreme value distribution (GEVD) as an asymptotic model for the block maxima of a random variable and the generalized Pareto distribution (GPD) as a model for the peaks over threshold (POT) of the same random variable is presented more clearly. Pisarenko et al. (2014) have inappropriately neglected that the approximations by the GEVD and GPD work only asymptotically in most cases. This applies particularly to the truncated exponential distribution (TED), a popular distribution model for earthquake magnitudes. I explain why the classical models and methods of extreme value theory and statistics do not work well for truncated exponential distributions. As a consequence, the classical methods should not be used for the estimation of the upper bound magnitude and corresponding parameters. Furthermore, different issues of s...
Sun, Feng
A new method is introduced for retrieving the particle size distribution function (PSDF) from optical measurements. Criteria are derived and demonstrated for assessing the validity of the multi-dimensional least-squares deconvolution results derived from experimental data. Using the covariance matrices C and the newly revealed B, the stability of this deconvolution technique is discussed and compared with the linear inversion method. The relations between the variances of the PSDF parameters and the experimental errors have been derived, and the effects and restrictions of the profiles for the PSDF have been discussed. All the theoretical predictions of this method are tested using Monte Carlo-simulated experimental data consisting of scattering and extinction measurements for water and soot particles, respectively. The results show that for water droplets of 1 μm mean diameter, the method is good for ±25% random experimental error for scattering and ±6% for extinction. The differences in the scattering and extinction results are explained using information content analysis. Experimental confirmation of this newly developed deconvolution technique was accomplished using multi-angle scattering measurements of a water droplet spray field. Sensitivity of the deconvolution to uncertainties in the index of refraction was determined using droplet compositions of water and laser-dye mixtures. The laser-dye concentration was varied to provide a variable and known imaginary term of the index of refraction. A range of discrete laser wavelengths was used for this study to span the wavelength range of the variations of the index of refraction. The application of this deconvolution method yields, in addition to the PSDF parameters, the real and imaginary terms of the index of refraction, which agreed with the computed values. A method was developed and verified for the proper treatment of the variation with observation angle of the scattering volume of the experiment, and the results
Fritzsche, Matthias; Kittel, Konstantin; Blankenburg, Alexander; Vajna, Sándor
2012-08-01
The focus of this paper is to present a method of multidisciplinary design optimisation based on the autogenetic design theory (ADT), which provides methods that are partially implemented in the optimisation software described here. The main thesis of the ADT is that biological evolution and the process of developing products are essentially similar, i.e. procedures from biological evolution can be transferred into product development. In order to fulfil requirements and boundary conditions of any kind (which may change at any time), both biological evolution and product development look for appropriate solution possibilities in a certain area, and try to optimise those that are actually promising by varying parameters and combinations of these solutions. As the time necessary for multidisciplinary design optimisations is a critical aspect in product development, distributing the optimisation process to make effective use of otherwise idle computing capacity can reduce the optimisation time drastically. Finally, a practical example shows how ADT methods and distributed optimisation are applied to improve a product.
Comparisons on Income Distribution Theories between Classicism and Neo-Classicism
刘娟
2011-01-01
Income distribution theory is the foundation of economic theory, and the classical and neo-classical studies of income distribution are the foundation of that foundation. Value theory determines income distribution theory. Starting from the classical and neo-classical value theories, this paper compares the differences between their income distribution theories and analyzes the reasons for those differences.
Pisarenko, V F; Sornette, D; Rodkin, M V
2008-01-01
We present a generic and powerful approach to study the statistics of extreme phenomena (meteorology, finance, biology...) that we apply to the statistical estimation of the tail of the distribution of earthquake sizes. The chief innovation is to combine the two main limit theorems of Extreme Value Theory (EVT), which allows us to derive the distribution of T-maxima (the maximum magnitude occurring in sequential time intervals of duration T) for arbitrary T. We propose a method for the estimation of the unknown parameters involved in the two limit theorems corresponding to the Generalized Extreme Value distribution (GEV) and the Generalized Pareto Distribution (GPD). We establish the direct relations between the parameters of these distributions, which permit the evaluation of the distribution of the T-maxima for arbitrary T. The duality between the GEV and GPD provides a new way to check the consistency of the estimation of the tail characteristics of the distribution of earthquake magnitudes for earthquakes occurring ...
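The GEV-GPD duality invoked above can be illustrated with a toy simulation (a hedged sketch under assumed forms, not the authors' estimation procedure): for an exponentially distributed magnitude tail, peaks-over-threshold excesses are again exponential (a GPD with zero shape parameter, by memorylessness), while block maxima approach a Gumbel law (a GEV with zero shape parameter), and both limits are tied to the same scale parameter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Gutenberg-Richter-like magnitude model: exponential tail.
beta = 1.0 / np.log(10)                  # assumed scale parameter
mags = rng.exponential(beta, size=200_000)

# POT side: excesses over a high threshold remain exponential (GPD with
# shape 0) by the memoryless property, so their mean recovers beta.
u = np.quantile(mags, 0.95)
excesses = mags[mags > u] - u

# Block-maxima side: maxima over blocks of length T approach a Gumbel law
# (GEV with shape 0) with location beta*log(T) and scale beta, so their
# mean is close to beta * (log(T) + Euler-Mascheroni constant).
T = 1000
maxima = mags[: (len(mags) // T) * T].reshape(-1, T).max(axis=1)
gumbel_mean = beta * (np.log(T) + 0.5772156649)
```

Both checks recover the same underlying parameter beta from the two limit laws, mirroring the consistency check between GEV and GPD estimates that the abstract exploits.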
朱妙宽
2013-01-01
An in-depth study of Marx's value theory, distribution theory and the reform of the distribution system is of great theoretical and practical significance. Socialist distribution theory should be developed and innovated on the basis of the developed labor theory of value and theory of surplus value, so as to establish a distribution theory and system combining distribution according to labor, according to productive factors, and according to basic needs, thereby achieving the unity of value theory, distribution theory and the distribution system. The distribution theory and system should be implemented in distribution practice; the principal position of distribution according to labor should be implemented in the wage and welfare system; the multiple modes of distribution should be implemented in multiple distribution systems; and, at the same time, a series of concrete systems related to income distribution should be reformed and improved.
Launey, K D; Dytrych, T; Draayer, J P
2014-01-01
We present a program in C that employs spectral distribution theory for studies of characteristic properties of a many-particle quantum-mechanical system and the underlying few-body interaction. In particular, the program focuses on two-body nuclear interactions given in a JT-coupled harmonic oscillator basis and calculates correlation coefficients, a measure of similarity of any two interactions, as well as Hilbert-Schmidt norms specifying interaction strengths. An important feature of the program is its ability to identify the monopole part (centroid) of a 2-body interaction, as well as its 'density-dependent' one-body and two-body part, thereby providing key information on the evolution of shell gaps and binding energies for larger nuclear systems. As additional features, we provide statistical measures for 'density-dependent' interactions, as well as a mechanism to express an interaction in terms of two other interactions. This, in turn, allows one to identify, e.g., established features of the nuclear in...
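As a hedged sketch of the similarity measure described above (the matrix representation and conventions here are illustrative assumptions, not the program's JT-coupled implementation), a correlation coefficient between two interactions can be formed by removing a monopole (centroid) part and normalizing the Hilbert-Schmidt inner product:

```python
import numpy as np

def remove_centroid(v):
    # Subtract the identity component, a stand-in for the monopole/centroid.
    n = v.shape[0]
    return v - (np.trace(v) / n) * np.eye(n)

def hs_norm(v):
    # Hilbert-Schmidt norm, specifying the interaction strength.
    return np.sqrt(np.trace(v @ v.T))

def correlation(v, w):
    # Normalized Hilbert-Schmidt inner product of the traceless parts.
    v, w = remove_centroid(v), remove_centroid(w)
    return np.trace(v @ w.T) / (hs_norm(v) * hs_norm(w))

rng = np.random.default_rng(1)
a = rng.normal(size=(6, 6)); a = a + a.T     # toy symmetric "interaction"
b = rng.normal(size=(6, 6)); b = b + b.T
```

By construction correlation(a, a) is 1 and |correlation(a, b)| ≤ 1 by Cauchy-Schwarz; expressing one interaction in terms of two others, as the abstract mentions, amounts to a projection in this same inner product.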
Saichev, A
2005-01-01
Using the ETAS branching model of triggered seismicity, we apply the formalism of generating probability functions to calculate exactly the average difference between the magnitude of a mainshock and the magnitude of its largest aftershock over all generations. This average magnitude difference is found empirically to be independent of the mainshock magnitude and equal to 1.2, a universal behavior known as Båth's law. Our theory shows that Båth's law holds only sufficiently close to the critical regime of the ETAS branching process. Allowing for error bars ±0.1 for Båth's constant value around 1.2, our exact analytical treatment of Båth's law provides new constraints on the productivity exponent α and the branching ratio n: 0.9 ≤ α ≤ 1 and 0.8 ≤ n ≤ 1. We propose a novel method for measuring α based on the predicted renormalization of the Gutenberg-Richter distribution of the magnitudes of the largest aftershock. We also introduce the "second Båth's law" for foreshocks: the pro...
Ghadiri, Majid; Shafiei, Navvab
2016-04-01
In this study, thermal vibration of a rotary functionally graded Timoshenko microbeam has been analyzed based on modified couple stress theory, considering temperature change under four types of temperature distribution in the thermal environment. Material properties of the FG microbeam are supposed to be temperature dependent and to vary continuously along the thickness according to the power-law form. The axial forces are also included in the model as the thermal and true spatial variation due to the rotation. Governing equations and boundary conditions have been derived by employing Hamilton's principle. The differential quadrature method is employed to solve the governing equations for cantilever and propped cantilever boundary conditions. Validations are done by comparing the obtained results with the available literature, which indicates the accuracy of the applied method. Results represent the effects of temperature changes, nondimensional angular velocity, length scale parameter, different boundary conditions, FG index and beam thickness on the fundamental, second and third nondimensional frequencies. Results determine critical values of temperature changes and other essential parameters which can be applicable to the design of micromachines like micromotors and microturbines.
Distributed Leadership: A Good Theory but What if Leaders Won't, Don't Know How, or Can't Lead?
McKenzie, Kathryn Bell; Locke, Leslie Ann
2014-01-01
This article presents the results from an empirical qualitative study of the challenges faced by teacher leaders in their attempts to work directly with their colleagues to change instructional strategies and improve student success. Additionally, it offers a challenge to the utility of a naïvely espoused theory of distributed leadership, which…
1979-01-01
Mg++ and K+ Distribution in Frog Muscle and Egg: A Disproof of the Donnan Theory of Membrane Equilibrium Applied to the Living Cells. GILBERT... ABSTRACT: 1. We studied the equilibrium distribution of Mg++ in the form of chloride and sulfate at two temperatures (5° and 25°C) in frog ... vicinity of 90 μmoles/g fresh muscle cells. 4. We observed a similar rectilinear distribution of Mg++ in frog ovarian eggs. As in muscle tissues, no major
Kido, Kentaro, E-mail: kido.kentaro@jaea.go.jp [Nuclear Safety Research Center, Japan Atomic Energy Agency, 2-4 Shirane, Shirakata, Tokai-mura, Naka-gun, Ibaraki 319-1195 (Japan); Kasahara, Kento [Department of Molecular Engineering, Graduate School of Engineering, Kyoto University, Nishikyo-ku, Kyoto 615-8510 (Japan); Yokogawa, Daisuke [Department of Chemistry, Graduate School of Science, Nagoya University, Chikusa, Nagoya 464-8602 (Japan); Institute of Transformative Bio-Molecules (WPI-ITbM), Nagoya University, Chikusa, Nagoya 464-8062 (Japan); Sato, Hirofumi [Department of Molecular Engineering, Graduate School of Engineering, Kyoto University, Nishikyo-ku, Kyoto 615-8510 (Japan); Elements Strategy Institute for Catalysts and Batteries (ESICB), Kyoto University, Katsura, Kyoto 615-8520 (Japan)
2015-07-07
In this study, we report the development of a new quantum mechanics/molecular mechanics (QM/MM)-type framework to describe chemical processes in solution by combining standard molecular-orbital calculations with a three-dimensional formalism of integral equation theory for molecular liquids (the multi-center molecular Ornstein–Zernike (MC-MOZ) method). The theoretical procedure is very similar to the 3D-reference interaction site model self-consistent field (RISM-SCF) approach. Since the MC-MOZ method is highly parallelized for computation, the present approach has the potential to be one of the most efficient procedures to treat chemical processes in solution. Benchmark tests to check the validity of this approach were performed for two solute systems (water and formaldehyde) and a simple S{sub N}2 reaction (Cl{sup −} + CH{sub 3}Cl → ClCH{sub 3} + Cl{sup −}) in aqueous solution. The results for solute molecular properties and solvation structures obtained by the present approach were in reasonable agreement with those obtained by other hybrid frameworks and experiments. In particular, the results of the proposed approach are in excellent agreement with those of 3D-RISM-SCF.
Danesh-Yazdi, Mohammad; Foufoula-Georgiou, Efi; Karwan, Diana L.; Botter, Gianluca
2016-10-01
Climatic trends and anthropogenic changes in land cover and land use are impacting the hydrology and water quality of streams at the field, watershed, and regional scales in complex ways. In poorly drained agricultural landscapes, subsurface drainage systems have been successful in increasing crop productivity by removing excess soil moisture. However, their hydroecological consequences are still debated in view of the observed increased concentrations of nitrate, phosphorus, and pesticides in many streams, as well as altered runoff volumes and timing. In this study, we employ the recently developed theory of time-variant travel time distributions within the StorAge Selection function framework to quantify changes in water cycle dynamics resulting from the combined climate and land use changes. Our results from analysis of a subbasin in the Minnesota River Basin indicate a significant decrease in the mean travel time of water in the shallow subsurface layer during the growing season under current conditions compared to the pre-1970s conditions. We also find highly damped year-to-year fluctuations in the mean travel time, which we attribute to the "homogenization" of the hydrologic response due to artificial drainage. The dependence of the mean travel time on the spatial heterogeneity of some soil characteristics as well as on the basin scale is further explored via numerical experiments. Simulations indicate that the mean travel time is independent of scale for spatial scales larger than approximately 200 km2, suggesting that hydrologic data from larger basins may be used to infer the average of smaller-scale-driven changes in water cycle dynamics.
Fleishman, Gregory D
2013-01-01
Currently there is a concern about the ability of the classical thermal (Maxwellian) distribution to describe quasi-steady-state plasma in the solar atmosphere, including active regions. In particular, other distributions have been proposed to better fit observations, for example, kappa- and $n$-distributions. If present, these distributions will generate radio emissions with different observable properties compared with the classical gyroresonance (GR) or free-free emission, which implies a way of remotely detecting these non-Maxwellian distributions in radio observations. Here we present analytically derived GR and free-free emissivities and absorption coefficients for the kappa- and $n$-distributions and discuss their properties, which are in fact remarkably different from each other and from the classical Maxwellian plasma. In particular, the radio brightness temperature from a gyrolayer increases with the optical depth $\tau$ for the kappa-distribution, but decreases with $\tau$ for the $n$-distribution. This property ...
Tian, Meng; Risku, Mika; Collin, Kaija
2016-01-01
This article provides a meta-analysis of research conducted on distributed leadership from 2002 to 2013. It continues the review of distributed leadership commissioned by the English National College for School Leadership (NCSL) ("Distributed Leadership: A Desk Study," Bennett et al., 2003), which identified two gaps in the research…
Product Distributions for Distributed Optimization. Chapter 1
Bieniawski, Stefan R.; Wolpert, David H.
2004-01-01
With connections to bounded rational game theory, information theory and statistical mechanics, Product Distribution (PD) theory provides a new framework for performing distributed optimization. Furthermore, PD theory extends and formalizes Collective Intelligence, thus connecting distributed optimization to distributed Reinforcement Learning (RL). This paper provides an overview of PD theory and details an algorithm for performing optimization derived from it. The approach is demonstrated on two unconstrained optimization problems, one with discrete variables and one with continuous variables. To highlight the connections between PD theory and distributed RL, the results are compared with those obtained using distributed reinforcement learning inspired optimization approaches. The inter-relationship of the techniques is discussed.
Recent Development of Health Index Theory for Modern Distribution Network
周莉梅; 马钊; 盛万兴
2016-01-01
The origin and concept of the Health Index for power equipment are briefly introduced, and domestic research related to the Health Index concept for distribution networks is outlined. The definition of the Health Index for distribution networks is then discussed. The research status and recent development of Health Index theory for modern distribution networks, both domestic and abroad, are described, and existing problems of distribution asset management and Health Index theory application in China are pointed out. Finally, the main problems of present Health Index research for distribution networks and directions for future improvement are put forward.
Wei-Bin Zhang
2014-01-01
This paper proposes a growth model of heterogeneous households with economic structure, wealth accumulation, endogenous labour supply, and tax rates. The paper is focused on the effects of redistribution policies on income and wealth distribution, economic structure and economic growth. The paper integrates Walrasian general equilibrium theory and neoclassical economic growth within a comprehensive framework. We overcome the controversial features of the two traditional theories by applying an alternative approach to households. We build an analytical framework for a disaggregated and microfounded general theory of economic growth with endogenous wealth accumulation. We simulate the model to identify equilibrium and stability and to plot the motion of the dynamic system with three groups. We also carry out comparative dynamic analysis with regard to the lump-sum tax, human capital and the propensity to use leisure time.
Hofstetter, Stefan
2013-01-01
This dissertation primarily aims at filling a number of gaps in the theory of comparison, in particular with respect to phrasal comparison in Turkish, intensifiers, Negative Island Effects and the cross-linguistic distribution of measure phrases. In its first main section, it provides a fairly exhaustive overview of the inventory of comparison constructions attested in the Turkish language and shows that the adverb daha is a largely optional element with these except for constructions th...
Matthews, Thomas J; Whittaker, Robert J
2014-01-01
...‐based view of ecology. While neutral models have since been applied to a broad range of ecological and macroecological phenomena, the majority of research relating to neutral theory has focused exclusively on the species...
Clevenger, Shelly L; Navarro, Jordana N; Jasinski, Jana L
2016-09-01
This study examined the demographic and background characteristic differences between those arrested for child pornography (CP) possession (only), or CP production/distribution, or an attempted or completed sexual exploitation of a minor (SEM) that involved the Internet in some capacity within the context of self-control theory, using data from the second wave of the National Juvenile Online Victimization Study (N-JOV2). Results indicate few demographic similarities, which thereby suggests these are largely heterogeneous groupings of individuals. Results also indicate CP producers/distributors engaged in a greater number of behaviors indicative of low self-control compared with CP possessors. Specifically, offenders arrested for CP production/distribution were more likely to have (a) had problems with drugs/alcohol at the time of the crime and (b) been previously violent. In contrast, the only indicator of low self-control that reached statistical significance for CP possessors was the previous use of violence. Moreover, in contrast to CP producers/distributors, full-time employment and marital status may be important factors to consider in the likelihood of arrest for CP possessors, which is congruent with the tenets of self-control theory. © The Author(s) 2014.
Cornaton, F.; doi:10.1016/j.advwatres.2005.10.009
2011-01-01
We present a methodology for determining reservoir groundwater age and transit time probability distributions in a deterministic manner, considering advective-dispersive transport in steady velocity fields. In a first step, we propose to model the statistical distribution of groundwater age at aquifer scale by means of the classical advection-dispersion equation for a conservative and nonreactive tracer, associated with proper boundary conditions. The evaluated function corresponds to the probability density of the random variable age, age being defined as the time elapsed since the water particles entered the aquifer. An adjoint backward model is introduced to characterize the life expectancy distribution, life expectancy being the time remaining before leaving the aquifer. By convolution of these two distributions, groundwater transit time distributions, from inlet to outlet, are fully defined for the entire aquifer domain. In a second step, an accurate and efficient method is introduced to simulate the tr...
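The two-step construction in the abstract (a forward age pdf, an adjoint life-expectancy pdf, and their convolution giving the transit-time pdf) can be sketched numerically. The exponential densities below are illustrative placeholders, not the paper's advection-dispersion solutions:

```python
import math

# Hypothetical sketch: the transit-time pdf is the convolution of the age pdf
# (forward model) with the life-expectancy pdf (adjoint backward model). The
# exponential shapes stand in for the actual advection-dispersion solutions.
dt, n = 0.05, 1200
t = [i * dt for i in range(n)]
age_pdf  = [0.50 * math.exp(-0.50 * ti) for ti in t]  # p_A(t), mean 2
life_pdf = [0.25 * math.exp(-0.25 * ti) for ti in t]  # p_E(t), mean 4

# discrete convolution: p_T(t_k) = sum_j p_A(t_j) p_E(t_k - t_j) dt
transit_pdf = [sum(age_pdf[j] * life_pdf[k - j] for j in range(k + 1)) * dt
               for k in range(n)]

mass = sum(transit_pdf) * dt                               # ~1 (a pdf again)
mean = sum(ti * p for ti, p in zip(t, transit_pdf)) * dt   # ~2 + 4 = 6
```

The check that the convolved mean equals the sum of the two means mirrors the physical statement that transit time is age plus remaining life expectancy.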
Horvat, Dubravko; Ilijic, Sasa; Narancic, Zoran
2004-01-01
Spherically symmetric distributions of electrically counterpoised dust (ECD) are used to construct solutions to Einstein-Maxwell equations in Majumdar-Papapetrou formalism. Unexpected bifurcating behavior of regular and singular solutions with regard to source strength is found for localized, as well as for the delta-function ECD distributions. Unified treatment of general ECD distributions is accomplished and it is shown that for certain source strengths one class of regular solutions approaches Minkowski spacetime, while the other comes arbitrarily close to black hole solutions.
Morrison, James L.
A computerized delivery system in consumer economics developed at the University of Delaware uses the PLATO system to provide a basis for analyzing consumer behavior in the marketplace. The 16 sequential lessons, part of the Consumer in the Marketplace Series (CMS), demonstrate consumer economic theory in layman's terms and are structured to focus…
Trapman, Pieter; Bootsma, Martinus Christoffel Jozef
2009-05-01
In this paper we establish a relation between the spread of infectious diseases and the dynamics of so-called M/G/1 queues with processor sharing. The relation between the spread of epidemics and branching processes, which is well known in epidemiology, and the relation between M/G/1 queues and birth-death processes, which is well known in queueing theory, are combined to provide a framework in which results from queueing theory can be used in epidemiology and vice versa. In particular, we consider the number of infectious individuals in a standard SIR epidemic model at the moment of the first detection of the epidemic, where infectious individuals are detected at a constant per capita rate. We use a result from the literature on queueing processes to show that this number of infectious individuals is geometrically distributed.
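The detection mechanism described in the abstract can be sketched with a simple Gillespie-style simulation of a Markov SIR model in which each infectious individual is detected at a constant per-capita rate. All rate values and the population size below are arbitrary placeholders, not parameters from the paper:

```python
import random

# Illustrative Gillespie-style simulation: a Markov SIR epidemic in which each
# infectious individual is detected at constant per-capita rate delta. The
# rates beta, gamma, delta and population size n are arbitrary placeholders.
def infectious_at_first_detection(beta=2.0, gamma=1.0, delta=0.1, n=10000, seed=None):
    rng = random.Random(seed)
    s, i = n - 1, 1
    while i > 0:
        rate_inf = beta * s * i / n          # new infection
        rate_rec = gamma * i                 # recovery
        rate_det = delta * i                 # detection of some infectious case
        u = rng.random() * (rate_inf + rate_rec + rate_det)
        if u < rate_det:
            return i                         # infectious count at first detection
        if u < rate_det + rate_rec:
            i -= 1                           # a recovery happened first
        else:
            s, i = s - 1, i + 1              # an infection happened first
    return 0                                 # epidemic went extinct undetected

samples = [infectious_at_first_detection(seed=k) for k in range(200)]
detected = [x for x in samples if x > 0]     # conditional on detection,
                                             # these should look geometric
```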
Stapleton, Larry; Duffy, D.; Lakov, D; Jordanova, M.; Lyng, M
2004-01-01
The relationship between humans and advanced technology can be viewed as a network of interests of technical and non-technical agents. Drawing upon instrumental realist approaches as set out in agent network theory, the paper describes a project currently underway in Ireland and Bulgaria which delivers comprehensive, assistive systems for people with learning disabilities. These systems address many of the difficulties associated with current assistive technology (AT) programmes, problems typi...
A Summary of Background Knowledge of the Most Probable Distribution Theory
周昱; 魏蔚; 张艳燕; 马晓栋
2011-01-01
This paper summarizes the basic concepts and conclusions needed to derive the most probable distribution theory, and offers observations on the expression of those basic concepts, the understanding of the basic conclusions, and the connections among the related knowledge points.
Wilson, William G; Lundberg, Per
2004-09-22
Theoretical interest in the distributions of species abundances observed in ecological communities has focused recently on the results of models that assume all species are identical in their interactions with one another, and rely upon immigration and speciation to promote coexistence. Here we examine a one-trophic level system with generalized species interactions, including species-specific intraspecific and interspecific interaction strengths, and density-independent immigration from a regional species pool. Comparisons between results from numerical integrations and an approximate analytic calculation for random communities demonstrate good agreement, and both approaches yield abundance distributions of nearly arbitrary shape, including bimodality for intermediate immigration rates.
Drusano, George L.
1991-01-01
Optimal sampling theory is evaluated in applications to studies of the distribution and elimination of several drugs (including ceftazidime, piperacillin, and ciprofloxacin), using the SAMPLE module of the ADAPT II package of programs developed by D'Argenio and Schumitzky (1979, 1988) and comparing the pharmacokinetic parameter values with results obtained by a traditional ten-sample design. The impact of the use of optimal sampling was demonstrated in conjunction with the NONMEM approach (Sheiner et al., 1977), in which the population is taken as the unit of analysis, allowing even fragmentary patient data sets to contribute to population parameter estimates. It is shown that this technique is applicable in both the single-dose and the multiple-dose environments. The ability to study real patients made it possible to show that there was a bimodal distribution in ciprofloxacin nonrenal clearance.
Bhattacharyya, Pratip; Chakrabarti, Bikas K.
2008-01-01
We study different ways of determining the mean distance r_n between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating…
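The quantity r_n can also be estimated directly by Monte Carlo. The sketch below, with illustrative parameter values, compares the simulated mean nearest-neighbour distance in D = 2 against the known Poisson-process result E[r_1] = 1/(2*sqrt(rho)):

```python
import math, random

# Monte Carlo sketch of <r_n>: distance from a reference point at the centre of
# the unit D-cube to the nth nearest of N uniformly scattered points. All
# parameter values are illustrative.
def mean_nth_neighbour(D=2, n=1, N=500, trials=400, seed=1):
    rng = random.Random(seed)
    ref = [0.5] * D                      # central reference point limits edge effects
    total = 0.0
    for _ in range(trials):
        dists = sorted(math.dist(ref, [rng.random() for _ in range(D)])
                       for _ in range(N))
        total += dists[n - 1]            # nth-nearest distance in this trial
    return total / trials

est = mean_nth_neighbour()
theory = 1 / (2 * math.sqrt(500))        # Poisson result in D=2: <r_1> = 1/(2*sqrt(rho))
```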
Herzog, J.
1974-01-01
A method of calculating stage parameters and flow distribution of axial turbines is described. The governing equations apply to space between the blade rows and are based on the assumption of rotationally symmetrical, compressible, adiabatic flow conditions. Results are presented for stage design and flow analysis calculations. Theoretical results from the calculation system are compared with experimental data from low pressure steam turbine tests.
Ma, Xiang; Schonfeld, Dan; Khokhar, Ashfaq
2008-01-01
In this paper, we propose a novel distributed causal multi-dimensional hidden Markov model (DHMM). The proposed model can represent, for example, multiple motion trajectories of objects and their interaction activities in a scene; it is capable of conveying not only the dynamics of each trajectory, but also the interaction information among multiple trajectories, which can be critical in many applications. We first provide a solution for the non-causal, multi-dimensional hidden Markov model (HMM) by distributing the non-causal model into multiple distributed causal HMMs. We approximate the simultaneous solution of multiple HMMs on a sequential processor by an alternate updating scheme. Subsequently, we provide three algorithms for the training and classification of our proposed model. A new Expectation-Maximization (EM) algorithm suitable for estimation of the new model is derived, where a novel General Forward-Backward (GFB) algorithm is proposed for recursive estimation of the model parameters. A new conditional independent subset-state sequence structure decomposition of state sequences is proposed for the 2D Viterbi algorithm. The new model can be applied to many other areas such as image segmentation and image classification. Simulation results in classification of multiple interacting trajectories demonstrate the superior performance and higher accuracy of our distributed HMM in comparison to previous models.
Drake, James Bob
1981-01-01
From results on the tensile strength and nick-break average jury evaluations test, it was concluded that with the same total practice time, different distributions of welding practice time intervals (15, 30, and 45 minutes) influence the quality of butt welds made by ninth-grade vocational agriculture students. (Author/SJL)
A Study of the Different Norms in Contemporary Western Distributive Justice Theories
李广斌; 王勇
2011-01-01
In contemporary China, where issues of social justice are increasingly prominent, seeking the basic elements for building a just society from contemporary Western theories of justice is of great practical significance. Through an analysis of the distributive justice views of John Rawls and others, the paper concludes that distributive justice is inherent in the equality and freedom of rights; the freedom of rights and the equality of procedure are the bottom line shared by these distributive justice theories. The debates among paradigms of distributive justice theory are essentially arguments over the ordering of freedom, democracy, equality, and justice in the basic system of political values, and over the relationship between justice and the good. The focus behind the debates is the inquiry into the legitimacy of state power.
Hess, Samuel T.; Gould, Travis J.; Gudheti, Manasa V.; Maas, Sarah A.; Mills, Kevin D.; Zimmerberg, Joshua
2007-01-01
Organization in biological membranes spans many orders of magnitude in length scale, but limited resolution in far-field light microscopy has impeded distinction between numerous biomembrane models. One canonical example of a heterogeneously distributed membrane protein is hemagglutinin (HA) from influenza virus, which is associated with controversial cholesterol-rich lipid rafts. Using fluorescence photoactivation localization microscopy, we are able to image distributions of tens of thousands of HA molecules with subdiffraction resolution (≈40 nm) in live and fixed fibroblasts. HA molecules form irregular clusters on length scales from ≈40 nm up to many micrometers, consistent with results from electron microscopy. In live cells, the dynamics of HA molecules within clusters is observed and quantified to determine an effective diffusion coefficient. The results are interpreted in terms of several established models of biological membranes. PMID:17959773
Norheim, Ole Frithjof; Asada, Yukiko
2009-11-18
The past decade witnessed great progress in research on health inequities. The most widely cited definition of health inequity is, arguably, the one proposed by Whitehead and Dahlgren: "Health inequalities that are avoidable, unnecessary, and unfair are unjust." We argue that this definition is useful but in need of further clarification because it is not linked to broader theories of justice. We propose an alternative, pluralist notion of fair distribution of health that is compatible with several theories of distributive justice. Our proposed view consists of the weak principle of health equality and the principle of fair trade-offs. The weak principle of health equality offers an alternative definition of health equity to those proposed in the past. It maintains the all-encompassing nature of the popular Whitehead/Dahlgren definition of health equity, and at the same time offers a richer philosophical foundation. This principle states that every person or group should have equal health except when: (a) health equality is only possible by making someone less healthy, or (b) there are technological limitations on further health improvement. In short, health inequalities that are amenable to positive human intervention are unfair. The principle of fair trade-offs states that weak equality of health is morally objectionable if and only if: (c) further reduction of weak inequality leads to unacceptable sacrifices of average or overall health of the population, or (d) further reduction in weak health inequality would result in unacceptable sacrifices of other important goods, such as education, employment, and social security.
Pisarenko, V. F.; Sornette, A.; Sornette, D.; Rodkin, M. V.
2014-08-01
The present work is a continuation and improvement of the method suggested in Pisarenko et al. (Pure Appl Geophys 165:1-42, 2008) for the statistical estimation of the tail of the distribution of earthquake sizes. The chief innovation is to combine the two main limit theorems of Extreme Value Theory (EVT) that allow us to derive the distribution of T-maxima (maximum magnitude occurring in sequential time intervals of duration T) for arbitrary T. This distribution enables one to derive any desired statistical characteristic of the future T-maximum. We propose a method for the estimation of the unknown parameters involved in the two limit theorems corresponding to the Generalized Extreme Value distribution (GEV) and to the Generalized Pareto Distribution (GPD). We establish the direct relations between the parameters of these distributions, which permit evaluation of the distribution of the T-maxima for arbitrary T. The duality between the GEV and GPD provides a new way to check the consistency of the estimation of the tail characteristics of the distribution of earthquake magnitudes for earthquakes occurring over an arbitrary time interval. We develop several procedures and check points to decrease the scatter of the estimates and to verify their consistency. We test our full procedure on the global Harvard catalog (1977-2006) and on the Fennoscandia catalog (1900-2005). For the global catalog, we obtain the following estimates: = 9.53 ± 0.52 and = 9.21 ± 0.20. For Fennoscandia, we obtain = 5.76 ± 0.165 and = 5.44 ± 0.073. The estimates of all related parameters for the GEV and GPD, including the most important form parameter, are also provided. We demonstrate again the absence of robustness of the generally accepted parameter characterizing the tail of the magnitude-frequency law, the maximum possible magnitude M max, and study the more stable parameter Q T ( q), defined as the q-quantile of the distribution of T-maxima on a future interval of duration T.
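The block-maxima (T-maxima) idea underlying the GEV limit can be illustrated with a toy example: for magnitudes following an exponential (Gutenberg-Richter-like) law, the maxima of blocks of T events approach a Gumbel distribution (GEV with zero shape parameter), whose parameters can be recovered by the method of moments. The rate and block size below are arbitrary, not estimates from the catalogs in the paper:

```python
import math, random

# Toy block-maxima illustration: magnitudes from an exponential law; maxima of
# blocks of T events approach a Gumbel (GEV, zero shape) law, and the method of
# moments recovers its parameters. b and T are arbitrary placeholders.
rng = random.Random(7)
b, T = 2.0, 200
maxima = [max(rng.expovariate(b) for _ in range(T)) for _ in range(3000)]

m = sum(maxima) / len(maxima)
var = sum((x - m) ** 2 for x in maxima) / len(maxima)
scale = math.sqrt(6 * var) / math.pi        # Gumbel: var = (pi * sigma)^2 / 6
loc = m - 0.5772156649 * scale              # Gumbel: mean = mu + gamma * sigma

# For exponential parents, theory gives sigma -> 1/b and mu -> ln(T)/b
```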
Narsimhan, Vivek; Zhao, Hong; Shaqfeh, Eric S. G.
2013-06-01
We develop a coarse-grained theory to predict the concentration distribution of a suspension of vesicles or red blood cells in a wall-bound Couette flow. This model balances the wall-induced hydrodynamic lift on deformable particles with the flux due to binary collisions, which we represent via a second-order kinetic master equation. Our theory predicts a depletion of particles near the channel wall (i.e., the Fahraeus-Lindqvist effect), followed by a near-wall formation of particle layers. We quantify the effect of channel height, viscosity ratio, and shear-rate on the cell-free layer thickness (i.e., the Fahraeus-Lindqvist effect). The results agree with in vitro experiments as well as boundary integral simulations of suspension flows. Lastly, we examine a new type of collective particle motion for red blood cells induced by hydrodynamic interactions near the wall. These "swapping trajectories," coined by Zurita-Gotor et al. [J. Fluid Mech. 592, 447-469 (2007), 10.1017/S0022112007008701], could explain the origin of particle layering near the wall. The theory we describe represents a significant improvement in terms of time savings and predictive power over current large-scale numerical simulations of suspension flows.
Wanxing Sheng
2016-05-01
In this paper, a reactive power optimization method based on historical data is investigated to solve the dynamic reactive power optimization problem in distribution networks. In order to reflect the variation of loads, network loads are represented in the form of a random matrix. Load similarity (LS) is defined to measure the degree of similarity between the loads on different days, and a calculation method for the load similarity of the load random matrix (LRM) is presented. By calculating the load similarity between the forecast random matrix and the random matrices of historical loads, the historical reactive power optimization dispatching scheme that best matches the forecast load can be found for reactive power control. The differences between the daily load curves of working days and weekends in different seasons are considered in the proposed method. The proposed method is tested on a standard 14-node distribution network with three different types of load. The computational results demonstrate that the proposed method for reactive power optimization is fast, feasible, and effective in distribution networks.
Maggiano, Corey M; Maggiano, Isabel S; Tiesler, Vera G; Chi-Keb, Julio R; Stout, Sam D
2016-01-01
This study compares two novel methods quantifying bone shaft tissue distributions, and relates observations on human humeral growth patterns for applications in anthropological and anatomical research. Microstructural variation in compact bone occurs due to developmental and mechanically adaptive circumstances that are 'recorded' by forming bone and are important for interpretations of growth, health, physical activity, adaptation, and identity in the past and present. Those interpretations hinge on a detailed understanding of the modeling process by which bones achieve their diametric shape, diaphyseal curvature, and general position relative to other elements. Bone modeling is a complex aspect of growth, potentially causing the shaft to drift transversely through formation and resorption on opposing cortices. Unfortunately, the specifics of modeling drift are largely unknown for most skeletal elements. Moreover, bone modeling has seen little quantitative methodological development compared with secondary bone processes, such as intracortical remodeling. The techniques proposed here, starburst point-count and 45° cross-polarization hand-drawn histomorphometry, permit the statistical and populational analysis of human primary tissue distributions and provide similar results despite being suitable for different applications. This analysis of a pooled archaeological and modern skeletal sample confirms the importance of extreme asymmetry in bone modeling as a major determinant of microstructural variation in diaphyses. Specifically, humeral drift is posteromedial in the human humerus, accompanied by a significant rotational trend. In general, results encourage the usage of endocortical primary bone distributions as an indicator and summary of bone modeling drift, enabling quantitative analysis by direction and proportion in other elements and populations.
Hanot, Charles; Martin, Stefan; Liewer, Kurt; Loya, Frank; Mawet, Dimitri; Riaud, Pierre; Absil, Olivier; Serabyn, Eugene; doi:10.1088/0004-637X/729/2/110
2011-01-01
A new "self-calibrated" statistical analysis method has been developed for the reduction of nulling interferometry data. The idea is to use the statistical distributions of the fluctuating null depth and beam intensities to retrieve the astrophysical null depth (or equivalently the object's visibility) in the presence of fast atmospheric fluctuations. The approach yields an accuracy much better (about an order of magnitude) than is presently possible with standard data reduction methods, because the astrophysical null depth accuracy is no longer limited by the magnitude of the instrumental phase and intensity errors but by uncertainties on their probability distributions. This approach was tested on the sky with the two-aperture fiber nulling instrument mounted on the Palomar Hale telescope. Using our new data analysis approach alone, with no observations of calibrators, we find that error bars on the astrophysical null depth as low as a few times 10^-4 can be obtained in the near-infrared, which means that null depths...
Beauchamp, G
2000-11-07
In population games, the optimal behaviour of a forager depends partly on courses of action selected by other individuals in the population. How individuals learn to allocate effort in foraging games involving frequency-dependent payoffs has been little examined. The performance of three different learning rules was investigated in several types of habitats in each of two population games. Learning rules allow individuals to weigh information about the past and the present and to choose among alternative patterns of behaviour. In the producer-scrounger game, foragers use producer to locate food patches and scrounger to exploit the food discoveries of others. In the ideal free distribution game, foragers that experience feeding interference from companions distribute themselves among heterogeneous food patches. In simulations of each population game, the use of different learning rules induced large variation in foraging behaviour, thus providing a tool to assess the relevance of each learning rule in experimental systems. Rare mutants using alternative learning rules often successfully invaded populations of foragers using other rules indicating that some learning rules are not stable when pitted against each other. Learning rules often closely approximated optimal behaviour in each population game suggesting that stimulus-response learning of contingencies created by foraging companions could be sufficient to perform at near-optimal level in two population games.
Russo, Lucia; Russo, Paola; Siettos, Constantinos I
2016-01-01
Based on complex network theory, we propose a computational methodology which addresses the spatial distribution of fuel breaks for the inhibition of the spread of wildland fires on heterogeneous landscapes. This is a two-level approach where the dynamics of fire spread are modeled as a random Markov field process on a directed network whose edge weights are determined by a Cellular Automata model that integrates detailed GIS, landscape and meteorological data. Within this framework, the spatial distribution of fuel breaks is reduced to the problem of finding network nodes (small land patches) which favour fire propagation. Here, this is accomplished by exploiting network centrality statistics. We illustrate the proposed approach through (a) an artificial forest of randomly distributed density of vegetation, and (b) a real-world case concerning the island of Rhodes in Greece whose major part of its forest was burned in 2008. Simulation results show that the proposed methodology outperforms the benchmark/conventional policy of fuel reduction as this can be realized by selective harvesting and/or prescribed burning based on the density and flammability of vegetation. Interestingly, our approach reveals that patches with sparse density of vegetation may act as hubs for the spread of the fire.
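The node-selection step described in the abstract, ranking land patches by a network centrality statistic and placing fuel breaks at the top-ranked patches, can be sketched as follows. The 6x6 lattice and the choice of plain betweenness centrality are illustrative stand-ins for the paper's GIS-driven weighted network:

```python
from collections import deque, defaultdict

# Sketch: rank land patches (nodes) by betweenness centrality and place fuel
# breaks at the top-ranked patches. Lattice size and centrality choice are
# illustrative placeholders.
def betweenness(adj):
    bc = dict.fromkeys(adj, 0.0)
    for s in adj:                                 # Brandes' algorithm (unweighted)
        stack, preds = [], defaultdict(list)
        sigma = dict.fromkeys(adj, 0); sigma[s] = 1
        dist = dict.fromkeys(adj, -1); dist[s] = 0
        queue = deque([s])
        while queue:
            v = queue.popleft(); stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; preds[w].append(v)
        delta = dict.fromkeys(adj, 0.0)
        while stack:                              # back-propagate dependencies
            w = stack.pop()
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

# 6x6 lattice of land patches, 4-neighbour adjacency
adj = {(i, j): [(i + di, j + dj)
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= i + di < 6 and 0 <= j + dj < 6]
       for i in range(6) for j in range(6)}
bc = betweenness(adj)
breaks = sorted(adj, key=lambda v: -bc[v])[:4]    # top-4 patches get fuel breaks
```

On the symmetric lattice the highest-centrality patches are the interior ones, matching the intuition that hubs for fire propagation sit away from the landscape edge.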
Jihong Xia
2016-02-01
A curved riparian zone can create highly complex flow patterns that have a great effect on erosion, pollutant transport, surface water-groundwater exchange, and habitat quality. Small-disturbance theory has been applied to derive analytical solutions for the pressure distributions along a sinusoidal riverbank. Experiments have also been performed to test the hydrodynamic and geomorphic effects on the pressure distribution and to verify the applicability of the derived expressions. The derived expressions are simple and accurate, and agree remarkably well with experimental results for riparian banks with a low degree of curvature. On the contrary, when a riparian bank has a high degree of curvature, these small-disturbance expressions cannot effectively estimate the pressure distributions for a complex bank boundary or complex flow conditions. Moreover, sensitivity analysis indicates that the disturbed pressures along the riparian banks increase with increasing Froude number Fr, as well as with the ratio of bank amplitude to wavelength a/λ. However, a/λ has been found to have the more significant influence on pressure variation in subcritical flow.
Kataoka, Hajime
2017-07-01
Body fluid volume regulation is a complex process involving the interaction of various afferent (sensory) and neurohumoral efferent (effector) mechanisms. Historically, most studies focused on the body fluid dynamics in heart failure (HF) status through control of the balance of sodium, potassium, and water in the body, and maintaining arterial circulatory integrity is central to a unifying hypothesis of body fluid regulation in HF pathophysiology. The pathophysiologic background of the biochemical determinants of vascular volume in HF status, however, has not been known. I recently demonstrated that changes in vascular and red blood cell volumes are independently associated with the serum chloride concentration, but not the serum sodium concentration, during worsening HF and its recovery. Based on these observations and the established central role of chloride in the renin-angiotensin-aldosterone system, I propose a unifying hypothesis of the "chloride theory" for HF pathophysiology, which states that changes in the serum chloride concentration are the primary determinant of changes in plasma volume and the renin-angiotensin-aldosterone system under worsening HF and therapeutic resolution of worsening HF. Copyright © 2017 Elsevier Ltd. All rights reserved.
Kapanadze G. A.
2008-03-01
The problem of elastic equilibrium of a lower half-plane weakened by periodically distributed equi-strong holes is considered. The hole boundaries are assumed to be free from external stresses; an absolutely smooth rigid stamp with a rectilinear base is applied to the boundary of the half-plane, and external normal contracting forces with principal vector P are applied to the stamp. The problem is to find the stressed state of the half-plane as well as analytic forms of the boundaries of the equi-strong holes under the condition that the tangential normal stress takes a constant value on them. Using the methods of the theory of analytic functions, the problem is reduced to the Keldysh-Sedov problem for a half-plane, whose solution allows us to construct Kolosov-Muskhelishvili's complex potentials and the equations of the unknown contours effectively (analytically).
Golovin, A V [Photon Factory, Institute of Materials Structure Science, Tsukuba 305-0801 (Japan); Institute of Physics, St Petersburg State University, 198504 St Petersburg (Russian Federation); Adachi, J [Photon Factory, Institute of Materials Structure Science, Tsukuba 305-0801 (Japan); Graduate School of Science, University of Tokyo, Bunkyo-ku, Tokyo 113-0033 (Japan); Motoki, S [Graduate School of Science, University of Tokyo, Bunkyo-ku, Tokyo 113-0033, (Japan); Takahashi, M [Institute for Molecular Science, Okazaki 444-8585 (Japan); Yagishita, A [Photon Factory, Institute of Materials Structure Science, Tsukuba 305-0801 (Japan); Graduate School of Science, University of Tokyo, Bunkyo-ku, Tokyo 113-0033 (Japan)
2005-10-28
Photoelectron angular distributions (PADs) for O 1s, C 1s and S 2p1/2, 2p3/2 ionization of OCS molecules have been measured in shape resonance regions. These PAD results are compared with the results for O 1s and C 1s ionization of CO molecules, and with multi-scattering Xα (MSXα) calculations. The mechanism of PAD formation for both parallel and perpendicular transitions differs very significantly in these molecules, and the step from a two-centre potential (CO) to a three-centre potential (OCS) plays a principal role in electron scattering and the formation of the resulting PAD. For parallel transitions, it is found that for the S 2p and O 1s ionization the photoelectrons are emitted preferentially in a hemisphere directed towards the ionized S and O atom, respectively. In OCS O 1s ionization, the S-C fragment plays the role of a strong 'scatterer' for photoelectrons, and in the shape resonance region most of the intensity of the PADs is concentrated in the region directed towards the O atom. The MSXα calculations for perpendicular transitions reproduce the experimental data, but not as well as in the case of parallel transitions. The results of PADs calculated with different l_max on different atomic centres reveal the important role of the d (l = 2) partial wave of the S atom in the partial wave decompositions of the photoelectron wavefunctions.
Liu, Baoding
2015-01-01
When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...
Clark, G
2003-04-28
This report describes a feasibility study. We are interested in calculating the angular and linear velocities of a re-entry vehicle using six acceleration signals from a distributed accelerometer inertial measurement unit (DAIMU). Earlier work showed that angular and linear velocity calculation using classic nonlinear ordinary differential equation (ODE) solvers is not practically feasible, due to mathematical and numerical difficulties. This report demonstrates the theoretical feasibility of using model-based nonlinear state estimation techniques to obtain the angular and linear velocities in this problem. Practical numerical and calibration issues require additional work to resolve. We show that the six accelerometers in the DAIMU are not sufficient to provide observability, so additional measurements of the system states are required (e.g. from a Global Positioning System (GPS) unit). Given the constraint that our system cannot use GPS, we propose using the existing on-board 3-axis magnetometer to measure angular velocity. We further show that the six nonlinear ODEs for the vehicle kinematics can be decoupled into three ODEs in the angular velocity and three ODEs in the linear velocity. This allows us to formulate a three-state Gauss-Markov system model for the angular velocities, using the magnetometer signals in the measurement model. This re-formulated model is observable, allowing us to build an Extended Kalman Filter (EKF) for estimating the angular velocities. Given the angular velocity estimates from the EKF, the three ODEs for the linear velocity become algebraic, and the linear velocity can be calculated by numerical integration. Thus, we do not need direct measurements of the linear velocity to provide observability, and the technique is mathematically feasible. Using a simulation example, we show that the estimator adds value over the numerical ODE solver in the presence of measurement noise. Calculating the velocities in the
Application of Kruskal Theory in E-commerce Logistics Distribution
曾萍
2013-01-01
This paper analyzes whether the historical distribution paths used by Chinese e-commerce logistics enterprises are reasonable and optimal. Taking one real enterprise as an empirical case, we apply the Kruskal theory from information science to quantitatively determine the enterprise's optimal distribution route, thereby effectively reducing its total production cost.
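A minimal sketch of the Kruskal step: treating depots and delivery points as graph nodes with transport costs as edge weights, the minimum spanning tree gives the cheapest set of connections. The node and cost figures are made up for illustration, not the enterprise data from the paper:

```python
# Kruskal's algorithm with union-find (path halving). Nodes 0..4 stand for a
# depot and delivery points; weights are made-up transport costs.
def kruskal(n, edges):
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]; x = parent[x]
        return x
    mst, cost = [], 0
    for w, u, v in sorted(edges):          # consider edges in order of cost
        ru, rv = find(u), find(v)
        if ru != rv:                       # keep the edge only if it joins
            parent[ru] = rv                # two separate components
            mst.append((u, v, w)); cost += w
    return mst, cost

edges = [(4, 0, 1), (2, 0, 2), (5, 1, 2), (7, 1, 3),
         (3, 2, 3), (6, 3, 4), (8, 2, 4)]   # (cost, node, node)
tree, total = kruskal(5, edges)             # total = 2 + 3 + 4 + 6 = 15
```

Sorting the edge list once and growing the forest greedily is what makes the route cost provably minimal for tree-shaped networks.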
Van Goethem, Nicolas
2010-01-01
This paper develops a geometrical model of dislocations and disclinations in single crystals at the mesoscopic scale. In the continuation of previous work the distribution theory is used to represent concentrated effects in the defect lines which in turn form the branching lines of the multiple-valued elastic displacement and rotation fields. Fundamental identities relating the incompatibility tensor to the dislocation and disclination densities are proved in the case of countably many parallel defect lines, under global 2D strain assumptions relying on the geometric measure theory. Our theory provides the appropriate objective internal variables and the required mathematical framework for a rigorous homogenization from mesoscopic to macroscopic scale.
Ling, Daniel Y.; Ling, Xinsheng Sean
2016-01-01
In this short note, a correction is made to the recently proposed solution [1] to a 1D biased diffusion model for linear DNA translocation, and a new analysis is given of the data in [1]. It was pointed out [2] by us recently that this 1D linear translocation model is equivalent to the one that was considered by Schrödinger [3] for the Ehrenhaft-Millikan measurements [4,5] on electron charge. Here we apply Schrödinger's first-passage-time distribution formula to the data set in [1]. It is found that Schrödinger's formula can be used to describe the time distribution of DNA translocation in solid-state nanopores. These fittings yield two useful parameters: the drift velocity of DNA translocation and the diffusion constant of DNA inside the nanopore. The results suggest two regimes of DNA translocation: (I) at low voltages, there are clear deviations from Smoluchowski's linear law of electrophoresis [6], which we attribute to entropic barrier effects; (II) at high voltages, the translocation velocity is a linear function of the applied electric field. In regime II, the apparent diffusion constant exhibits a quadratic dependence on the applied electric field, suggesting a Taylor dispersion effect likely due to the electro-osmotic flow field in the nanopore channel. This analysis yields a dispersion-free diffusion constant value of 11.2 nm²/µs for the segment of DNA inside the nanopore, which agrees quantitatively with Stokes-Einstein theory. The implication of Schrödinger's formula for DNA sequencing is discussed. PMID:23963318
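Schrödinger's first-passage-time distribution for 1D biased diffusion to an absorbing boundary at distance L is the inverse Gaussian density f(t) = L/√(4πDt³) · exp(−(L − vt)²/(4Dt)). A sketch evaluating it; the parameter values are illustrative rather than the paper's fitted ones, except that D = 11.2 nm²/µs is taken from the abstract:

```python
import math

def first_passage_density(t, L, v, D):
    """Schrödinger's first-passage-time density (inverse Gaussian) for
    1D drift-diffusion reaching an absorbing boundary at distance L.
    v: drift velocity, D: diffusion constant, t: time (> 0)."""
    return (L / math.sqrt(4.0 * math.pi * D * t**3)
            * math.exp(-(L - v * t)**2 / (4.0 * D * t)))

# Illustrative parameters: L in nm, v in nm/us; D from the abstract (nm^2/us).
L, v, D = 1000.0, 50.0, 11.2
# Midpoint-rule check that the density integrates to ~1 over t in (0, 100] us;
# the distribution peaks near the mean passage time L/v = 20 us.
dt = 0.01
total = sum(first_passage_density((i + 0.5) * dt, L, v, D) * dt
            for i in range(10_000))
```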
S Varadhan, S R
2001-01-01
This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando
黎光明; 张敏强
2012-01-01
Variance component estimation is the key step in generalizability theory analysis. This study used Monte Carlo simulation to explore how the distribution of psychological and educational measurement data affects the estimation of variance components in generalizability theory. Three data distributions were examined (normal, dichotomous, and polytomous) together with four estimation methods (traditional, jackknife, bootstrap, and Markov chain Monte Carlo, MCMC). The results show that: (1) the traditional method estimates variance components relatively well for normal and polytomous data but needs correction for dichotomous data; the jackknife method accurately estimates variance components for all three distributions; the adjusted bootstrap method outperforms the unadjusted one; and MCMC with informative priors (MCMCinf) estimates variance components well for all three distributions. (2) The data distribution affects all four estimation methods and constrains their performance, so the methods should be chosen with the distribution in mind.
张磊; 苟小菊
2012-01-01
This paper analyses the probability distribution of stock index returns at three different time scales using the nonextensive statistical mechanics proposed by Tsallis, which is closely associated with the dynamical systems described by nonlinear Fokker-Planck equations in financial market modeling. Based on high-frequency data of the Shanghai and Shenzhen stock market indices from 2004-01-01 to 2008-11-13, it finds that the Tsallis distribution can describe the fat tails and finite variance of returns in the two markets, and gives a market-microstructure dynamical explanation. The results indicate that the price processes of the Shanghai and Shenzhen markets are not consistent with a random walk but with an anomalous diffusion process, and that the two markets have very similar nonlinear dynamical characteristics. These results are of great significance for asset allocation and pricing, risk management, and institutional development in China's financial markets.
Verma, Pragya; Truhlar, Donald G
2017-05-24
Dipole moments are the first moment of electron density and are fundamental quantities that are often available from experiments. An exchange-correlation functional that leads to an accurate representation of the charge distribution of a molecule should accurately predict the dipole moments of the molecule. It is well known that Kohn-Sham density functional theory (DFT) is more accurate for the energetics of single-reference systems than for the energetics of multi-reference ones, but there has been less study of charge distributions. In this work, we benchmark 48 density functionals chosen with various combinations of ingredients, against accurate experimental data for dipole moments of 78 molecules, in particular 55 single-reference molecules and 23 multi-reference ones. We chose both organic and inorganic molecules, and within the category of inorganic molecules there are both main-group and transition-metal-containing molecules, with some of them being multi-reference. As one would expect, the multi-reference molecules are not as well described by single-reference DFT, and the functionals tested in this work do show larger mean unsigned errors (MUEs) for the 23 multi-reference molecules than the single-reference ones. Five of the 78 molecules have relatively large experimental error bars and were therefore not included in calculating the overall MUEs. For the 73 molecules not excluded, we find that three of the hybrid functionals, B97-1, PBE0, and TPSSh (each with less than or equal to 25% Hartree-Fock (HF) exchange), the range-separated hybrid functional, HSE06 (with HF exchange decreasing from 25% to 0 as interelectronic distance increases), and the hybrid functional, PW6B95 (with 28% HF exchange) are the best performing functionals with each yielding an MUE of 0.18 D. Perhaps the most significant finding of this study is that there exists great similarity among the success rate of various functionals in predicting dipole moments. In particular, of 39
Marchand, Jean-Paul
2007-01-01
In a simple but mathematically coherent manner, this text examines the basis of the distribution theories devised by Schwartz and by Mikusiński. Rigorous and concise, it surveys the functional theory of distributions as well as the algebraic theory. Its easy generalizations offer applications to a wide variety of problems. The two-part treatment begins with the functional theory of distributions, exploring differentiation, formation of products, translation and regularization, convergence, Fourier transforms, and partial differential equations. The second half focuses on the algebraic theory of
Pesetskaya, N. N.; Timofeev, I. YA.; Shipilov, S. D.
1988-01-01
In recent years much attention has been given to the development of methods and programs for the calculation of the aerodynamic characteristics of multiblade, saber-shaped air propellers. Most existing methods are based on the theory of lifting lines. Elsewhere, the theory of a lifting surface is used to calculate screw and lifting propellers. In this work, methods of discrete eddies are described for the calculation of the aerodynamic characteristics of propellers using the linear and nonlinear theories of lifting surfaces.
强新伟; 武良臣; 李建华; 牛永生
2001-01-01
Using MATLAB programs that compute traction force under three different pressure distributions (elastohydrodynamic lubrication (EHL) theory, Hertz theory, and mean-Hertz theory), this paper examines the accuracy of Hertz theory and mean-Hertz theory and the range of conditions in which each applies.
Analysis of Normal Distribution Theory in Analysis of Sports Statistics
田利军
2012-01-01
Using literature review and case analysis, this paper analyses the role of normal distribution theory in the statistical analysis of sports. The results show that when a variable in the sports domain follows a normal distribution, the associated theory can be used to reveal the regularities in the data of that random variable; however, for sports events of different natures, a suitable model should be chosen according to the specific situation to solve the practical problem at hand. The paper aims to provide a theoretical reference for sports practitioners selecting appropriate statistical methods.
杨志平; 文波; 洪彬倬
2016-01-01
The increasingly wide application of distributed generation (DG) technology in power systems has a certain impact on distribution network planning. Uncertain programming theory is therefore used to perform fuzzy simulation of the output power of wind and photovoltaic generation and of load uncertainty in the distribution network, and an optimization model for the distribution network structure is established on this basis. An improved partheno-genetic algorithm based on tree-structure coding is used to solve the model. Finally, simulation analysis of a 16-node system example verifies the feasibility of the planning model based on uncertain programming theory and the validity of the improved partheno-genetic algorithm.
Silveira, Rodrigo L; Stoyanov, Stanislav R; Gusarov, Sergey; Skaf, Munir S; Kovalenko, Andriy
2015-01-02
Plant biomass recalcitrance, a major obstacle to achieving sustainable production of second generation biofuels, arises mainly from the amorphous cell-wall matrix containing lignin and hemicellulose assembled into a complex supramolecular network that coats the cellulose fibrils. We employed the statistical-mechanical, 3D reference interaction site model with the Kovalenko-Hirata closure approximation (or 3D-RISM-KH molecular theory of solvation) to reveal the supramolecular interactions in this network and provide molecular-level insight into the effective lignin-lignin and lignin-hemicellulose thermodynamic interactions. We found that such interactions are hydrophobic and entropy-driven, and arise from the expelling of water from the mutual interaction surfaces. The molecular origin of these interactions is carbohydrate-π and π-π stacking forces, whose strengths are dependent on the lignin chemical composition. Methoxy substituents in the phenyl groups of lignin promote substantial entropic stabilization of the ligno-hemicellulosic matrix. Our results provide a detailed molecular view of the fundamental interactions within the secondary plant cell walls that lead to recalcitrance.
曾云; 刘宗武
2011-01-01
The paper studies the optimization of logistics distribution routes based on the theory of complex networks and, in view of the complex character of the distribution network, builds a distribution network graph model which is solved through dynamic programming. An empirical example is presented at the end to verify the feasibility of the model.
SAW, J.G.
This volume deals with the bivariate normal distribution. The author makes a distinction between distribution and density, from which he develops the consequences of this distinction for hypothesis testing. Other entries in this series are ED 003 044 and ED 003 045. (JK)
Dov Monderer; Moshe Tennenholtz
1997-01-01
The Internet exhibits forms of interactions which are not captured by existing models in economics, artificial intelligence and game theory. New models are needed to deal with these multi-agent interactions. In this paper we present a new model--distributed games. In such a model each player controls a number of agents which participate in asynchronous parallel multi-agent interactions (games). The agents jointly and strategically control the level of information monitoring by broadcasting m...
Uniform distribution of sequences
Kuipers, L
2006-01-01
The theory of uniform distribution began with Hermann Weyl's celebrated paper of 1916. In later decades, the theory moved beyond its roots in diophantine approximations to provide common ground for topics as diverse as number theory, probability theory, functional analysis, and topological algebra. This book summarizes the theory's development from its beginnings to the mid-1970s, with comprehensive coverage of both methods and their underlying principles.A practical introduction for students of number theory and analysis as well as a reference for researchers in the field, this book covers un
Midtgaard, Søren Flinch
2012-01-01
Thomas Pogge’s ingenious and influential Rawlsian theory of global justice asserts that principles of justice such as the difference principle or, alternatively, a universal criterion of human rights consisting of a subset of the principles of social justice apply to the global basic structure...
Ahsanullah, Mohammad
2016-01-01
The aim of the book is to give a thorough account of the basic theory of extreme value distributions, covering a wide range of material available to date. The central ideas and results of extreme value distributions are presented in a self-contained treatment of both theory and applications. The book will be useful to applied statisticians as well as statisticians interested in working in the area of extreme value distributions.
贾后明
2015-01-01
China's distribution reform started in the countryside, gradually extended to state-owned enterprises and public institutions, and then advanced to the whole of society. The reform of income distribution adapted to China's overall socio-economic development, promoted economic prosperity and social progress, and consistently adhered to a gradual and coordinated approach. The practice of distribution reform provoked controversy over distribution theory, drove the innovation and development of socialist thinking on distribution, and promoted the formation and improvement of the socialist distribution system with Chinese characteristics.
Wæver, Ole
2009-01-01
Kenneth N. Waltz's 1979 book, Theory of International Politics, is the most influential in the history of the discipline. It worked its effects to a large extent through raising the bar for what counted as theoretical work, in effect reshaping not only realism but rivals like liberalism and reflectivism. Yet, ironically, there has been little attention to Waltz's very explicit and original arguments about the nature of theory. This article explores and explicates Waltz's theory of theory. Central attention is paid to his definition of theory as ‘a picture, mentally formed' and to the radical anti-empiricism and anti-positivism of his position. Followers and critics alike have treated Waltzian neorealism as if it was at bottom a formal proposition about cause-effect relations. The extreme case of Waltz being so victorious in the discipline, and yet being consistently mis-interpreted on the question of theory...
苏浩益; 贺伟明; 吴小勇; 谢振宁; 黄升; 罗杰
2014-01-01
After stating the basic principles and characteristics of a distribution automation system based on cable fault indicators, a cost-benefit analysis model for distribution automation system reliability is proposed. By analyzing the characteristics of distribution networks in China and applying engineering methods for calculating distribution system reliability, the cost and benefit of a cable-fault-indicator-based distribution automation system in improving system reliability are computed quantitatively. Cloud theory is then combined with an economic evaluation of the distribution automation retrofit in order to find an entry point for ideal investment in distribution automation. Example analysis indicates that the model is practical and feasible.
Marchesi Julian R
2007-03-01
Background: The question of how a circle or line segment becomes covered when random arcs are marked off has arisen repeatedly in bioinformatics. The number of uncovered gaps is of particular interest. Approximate distributions for the number of gaps have been given in the literature, one motivation being ease of computation. Error bounds for these approximate distributions have not been given. Results: We give bounds on the probability distribution of the number of gaps when a circle is covered by fragments of fixed size. The absolute error in the approximation is typically on the order of 0.1% at 10× coverage depth. The method can be applied to coverage problems on the interval, including edge effects, and applications are given to metagenomic libraries and shotgun sequencing.
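The quantity being bounded, the number of uncovered gaps when fixed-size arcs land uniformly on a circle, is easy to estimate by simulation. A sketch with illustrative parameters (1000 fragments of length 0.01 on a unit circle, i.e. 10× coverage), not the paper's method:

```python
import random

def count_gaps(n_frags, frag_len):
    """Count uncovered gaps when n_frags arcs of fixed length frag_len
    are dropped uniformly at random on a circle of unit circumference."""
    starts = sorted(random.random() for _ in range(n_frags))
    gaps = 0
    for i, s in enumerate(starts):
        # Start of the next arc, wrapping around the circle for the last one.
        nxt = starts[0] + 1.0 if i == n_frags - 1 else starts[i + 1]
        if s + frag_len < nxt:  # this arc ends before the next one begins
            gaps += 1
    return gaps

random.seed(0)
# At 10x coverage depth, uncovered gaps should be rare.
trials = [count_gaps(1000, 0.01) for _ in range(200)]
mean_gaps = sum(trials) / len(trials)
```

Because all arcs have the same length, the covered region to the left of arc i ends at `starts[i] + frag_len`, so comparing each arc's end with the next arc's start suffices.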
Wang, Xin; Li, Zhaosheng; Zou, Zhigang
2015-07-15
Although the crystallographic space group has been determined, detailed first principles calculations of the LaTiO2N semiconductor photocatalyst crystal have not been performed because of the nitrogen/oxygen sosoloid-like anion distribution. In this study, based on the Heyd-Scuseria-Ernzerhof method and experimental anion content, we present the possibility of determining detailed information about the LaTiO2N sosoloid-like anion distribution by dividing the anions into possible primitive cells. The detailed information about the anion distribution based on the characteristics of the energetically acceptable primitive cell structures suggests that the LaTiO2N structure is composed of aperiodic stacks of six building-block primitive cells, the non-vacancy primitive cells are located at the surface as effective photoreaction sites, and vacancy structures are located in the bulk. The surface oxide-rich structures increase the near-surface conduction band minimum rise and strengthen photoelectron transport to the bulk, while the content of the bulk vacancy structures should be balanced because of being out of photoreactions. This study is expected to provide a different perspective to understanding the LaTiO2N sosoloid-like anion distribution.
Kliegl, Reinhold
2007-01-01
K. Rayner, A. Pollatsek, D. Drieghe, T. J. Slattery, and E. D. Reichle argued that the R. Kliegl, A. Nuthmann, and R. Engbert corpus-analytic evidence for distributed processing during reading should not be accepted because (a) there might be problems of multicollinearity, (b) the distinction between content and function words and the skipping…
Caffarel, Michel; Giner, Emmanuel; Scemama, Anthony; Ramírez-Solís, Alejandro
2014-12-09
We present a comparative study of the spatial distribution of the spin density of the ground state of CuCl2 using Density Functional Theory (DFT), quantum Monte Carlo (QMC), and post-Hartree-Fock wave function theory (WFT). A number of studies have shown that an accurate description of the electronic structure of the lowest-lying states of this molecule is particularly challenging due to the interplay between the strong dynamical correlation effects in the 3d shell and the delocalization of the 3d hole over the chlorine atoms. More generally, this problem is representative of the difficulties encountered when studying open-shell metal-containing molecular systems. Here, it is shown that qualitatively different results for the spin density distribution are obtained from the various quantum-mechanical approaches. At the DFT level, the spin density distribution is found to be very dependent on the functional employed. At the QMC level, Fixed-Node Diffusion Monte Carlo (FN-DMC) results are strongly dependent on the nodal structure of the trial wave function. Regarding wave function methods, most approaches not including a very high amount of dynamic correlation effects lead to a much too high localization of the spin density on the copper atom, in sharp contrast with DFT. To shed some light on these conflicting results Full CI-type (FCI) calculations using the 6-31G basis set and based on a selection process of the most important determinants, the so-called CIPSI approach (Configuration Interaction with Perturbative Selection done Iteratively) are performed. Quite remarkably, it is found that for this 63-electron molecule and a full CI space including about 10(18) determinants, the FCI limit can almost be reached. Putting all results together, a natural and coherent picture for the spin distribution is proposed.
Dudík, J; Mason, H E; Dzifčáková, E
2014-01-01
We investigate the possibility of diagnosing the degree of departure from the Maxwellian distribution using single-ion spectra originating in astrophysical plasmas in collisional ionization equilibrium. New atomic data for excitation of Fe IX-XIII are integrated under the assumption of a kappa-distribution of electron energies. Diagnostic methods using lines of a single ion formed at any wavelength are explored. Such methods minimize uncertainties from the ionization and recombination rates, as well as the possible presence of non-equilibrium ionization. Approximations to the collision strengths are also investigated. The calculated intensities of most of the Fe IX-XIII EUV lines show consistent behaviour with kappa at constant temperature. Intensities of these lines decrease with kappa, with the vast majority of ratios of strong lines showing little or no sensitivity to kappa. Several of the line ratios, especially involving temperature-sensitive lines, show a sensitivity to kappa that is of the order of sev...
Kok, Jasper F
2010-01-01
Mineral dust aerosols impact Earth's radiation budget through interactions with clouds, ecosystems, and radiation, which constitutes a substantial uncertainty in understanding past and predicting future climate changes. One of the causes of this large uncertainty is that the size distribution of emitted dust aerosols is poorly understood. The present study shows that regional and global circulation models (GCMs) overestimate the emitted fraction of clay aerosols (< 2 μm diameter) by a factor of ~2-8 relative to measurements. This discrepancy is resolved by deriving a simple theoretical expression of the emitted dust size distribution that is in excellent agreement with measurements. This expression is based on the physics of the scale-invariant fragmentation of brittle materials, which is shown to be applicable to dust emission. Because clay aerosols produce a strong radiative cooling, the overestimation of the clay fraction causes GCMs to also overestimate the radiative cooling of a given quantity o...
B. Gustavsson
2008-12-01
We present bi-static observations of radio-wave induced optical emissions at 6300 and 5577 Å from a night-time radio-induced optical emission ionospheric pumping experiment at the HIPAS facility near Fairbanks, Alaska. The optical observations were made at HIPAS and from HAARP, located 285 km to the south-east. From these observations the altitude distribution of the emissions is estimated with tomography-like methods. These estimates are compared with theoretical models. Other diagnostics used to support the theoretical calculations include the new Poker Flat AMISR UHF radar near HIPAS. We find that the altitude distribution of the emissions agrees with electron transport modeling with a source of accelerated electrons located 20 km below the upper-hybrid altitude.
Caffarel, Michel; Scemama, Anthony; Ramírez-Solís, Alejandro
2014-01-01
We present a comparative study of the spatial distribution of the spin density (SD) of the ground state of CuCl2 using Density Functional Theory (DFT), quantum Monte Carlo (QMC), and post-Hartree-Fock wavefunction theory (WFT). A number of studies have shown that an accurate description of the electronic structure of the lowest-lying states of this molecule is particularly challenging due to the interplay between the strong dynamical correlation effects in the 3d shell of the copper atom and the delocalization of the 3d hole over the chlorine atoms. It is shown here that qualitatively different results for SD are obtained from these various quantum-chemical approaches. At the DFT level, the spin density distribution is directly related to the amount of Hartree-Fock exchange introduced in hybrid functionals. At the QMC level, Fixed-node Diffusion Monte Carlo (FN-DMC) results for SD are strongly dependent on the nodal structure of the trial wavefunction employed (here, Hartree-Fock or Kohn-Sham with a particula...
Statistical theory and inference
Olive, David J
2014-01-01
This text is for a one semester graduate course in statistical theory and covers minimal and complete sufficient statistics, maximum likelihood estimators, method of moments, bias and mean square error, uniform minimum variance estimators and the Cramer-Rao lower bound, an introduction to large sample theory, likelihood ratio tests and uniformly most powerful tests and the Neyman Pearson Lemma. A major goal of this text is to make these topics much more accessible to students by using the theory of exponential families. Exponential families, indicator functions and the support of the distribution are used throughout the text to simplify the theory. More than 50 "brand name" distributions are used to illustrate the theory with many examples of exponential families, maximum likelihood estimators and uniformly minimum variance unbiased estimators. There are many homework problems with over 30 pages of solutions.
Quasihomogeneous distributions
von Grudzinski, O
1991-01-01
This is a systematic exposition of the basics of the theory of quasihomogeneous (in particular, homogeneous) functions and distributions (generalized functions). A major theme is the method of taking quasihomogeneous averages. It serves as the central tool for the study of the solvability of quasihomogeneous multiplication equations and of quasihomogeneous partial differential equations with constant coefficients. Necessary and sufficient conditions for solvability are given. Several examples are treated in detail, among them the heat and the Schrödinger equation. The final chapter is devoted to quasihomogeneous wave front sets and their application to the description of singularities of quasihomogeneous distributions, in particular to quasihomogeneous fundamental solutions of the heat and of the Schrödinger equation.
朱学磊
2015-01-01
John Rawls’s justice theory is mostly concerned about “distributive justice”,and “justice as fairness”expressed Rawls’s special consideration to equality.After John Rawls,Robert Nozick ,Michael J.Sandel and Amartya Sen criticized this theory from the perspective of“holding justice”theory,communitarianism,and the relationship between ability,rights,develop-ment and freedom.These plural theories about “distributive justice”may offer us different viewpoints to study social issues in China.%罗尔斯的正义论是关于“分配正义”的理论，“作为公平的正义”表达出罗尔斯对于平等的特别关注。在罗尔斯之后，诺齐克、桑德尔、阿玛蒂亚·森分别从“持有正义”理论、社群主义立场，以及能力、权利、发展与自由的关系等角度，展开对罗尔斯正义理论的批评和讨论。总结不同学者关于“分配正义”的理论，对于更好地认识我国社会发展过程中出现的问题大有裨益。
Nekrasov, Nikita
2004-01-01
We present the evidence for the existence of the topological string analogue of M-theory, which we call Z-theory. The corners of Z-theory moduli space correspond to the Donaldson-Thomas theory, Kodaira-Spencer theory, Gromov-Witten theory, and Donaldson-Witten theory. We discuss the relations of Z-theory with Hitchin's gravities in six and seven dimensions, and make our own proposal, involving spinor generalization of Chern-Simons theory of three-forms. Based on the talk at Strings'04 in Paris.
Harman, Ciaran J.
2015-01-01
Transport processes and pathways through many hydrodynamic systems vary over time, often driven by variations in total water storage. This paper develops a very general approach to modeling unsteady transport through an arbitrary control volume (such as a watershed) that accounts for temporal variability in the underlying transport dynamics. Controls on the selection of discharge from stored water are encapsulated in probability distributions Ω_Q(S_T, t) of age-ranked storage S_T (the volume of water in storage ranked from youngest to oldest). This framework is applied to a long-term record of rainfall and streamflow chloride in a small, humid watershed at Plynlimon, UK. While a time-invariant gamma distribution for Ω_Q produced a good fit to data, the fit was significantly improved when the distribution was allowed to vary with catchment storage. However, the variation was inverse to that of a "well-mixed" system where storage has a pure dilution effect. Discharge at high storage was predicted to contain a larger fraction of recent event water than at low storage. The effective volume of storage involved in transport was 3411 mm at mean catchment wetness, but declined by 71 mm per 1 mm of additional catchment storage, while the fraction of event water in discharge increased by 1.4%. This "inverse storage effect" is sufficient to reproduce the observed long-memory 1/f fractal spectral structure of stream chloride. Metrics quantifying the strength and direction of storage effects are proposed as useful signatures, and point toward a unified framework for observing and modeling coupled watershed flow and transport.
1983-12-16
control of K+ distribution of the smooth muscle in teniae coli of guinea pig were reported by Jones [68]. [Figure residue omitted; only axis fragments of a plot survived.] The inset, from Reisin and Gulati [65], shows temperature transitions from the K+ to the Na+ state in the guinea pig teniae coli. (From G. N. Ling ...) ... mammalian smooth muscle, the guinea pig teniae coli [65]. If this temperature transition also occurs in uterine smooth muscle-- a reasonable assumption
Income distribution: Second thoughts
J. Tinbergen (Jan)
1977-01-01
As a follow-up to his book on income distribution, the author reformulates his version of the scarcity theory of income from productive contributions. The need to introduce into an earnings theory several job characteristics, non-cognitive as well as cognitive, and the corresponding perso
Jensen, Lotte Groth; Bossen, Claus
2016-01-01
different socio-technical systems (paper-based and electronic patient records). Drawing on the theory of distributed cognition and narrative theory, primarily inspired by the work done within health care by Cheryl Mattingly, we propose that the creation of overview may be conceptualised as ‘distributed plot-making’. Distributed cognition focuses on the role of artefacts, humans and their interaction in information processing, while narrative theory focuses on how humans create narratives through plot construction. Hence, the concept of distributed plot-making highlights the distribution of information processing between different social actors and artefacts, as well as the filtering, sorting and ordering of such information into a narrative that is made coherent by a plot. The analysis shows that the characteristics of paper-based and electronic patient records support or obstruct the creation of overview in both...
Reed, Benjamin E.; Peters, Daniel M.; McPheat, Robert; Smith, Andrew J. A.; Grainger, R. G.
2017-09-01
Simultaneous measurements were made of the spectral extinction (from 0.33-19 μm) and particle size distribution of silica aerosol dispersed in nitrogen gas. Two optical systems were used to measure the extinction spectra over a wide spectral range: a Fourier transform spectrometer in the infrared and two diffraction grating spectrometers covering visible and ultraviolet wavelengths. The particle size distribution was measured using a scanning mobility particle sizer and an optical particle counter. The measurements were applied to one amorphous and two crystalline silica (quartz) samples. In the infrared, peak values of the mass extinction coefficient (MEC) of the crystalline samples were 1.63 ± 0.23 m2g-1 at 9.06 μm and 1.53 ± 0.26 m2g-1 at 9.14 μm with corresponding effective radii of 0.267 and 0.331 μm, respectively. For the amorphous sample the peak MEC value was 1.37 ± 0.18 m2g-1 at 8.98 μm and the effective radius of the particles was 0.374 μm. Using the measured size distribution and literature values of the complex refractive index as inputs, three scattering models were evaluated for modelling the extinction: Mie theory, the Rayleigh continuous distribution of ellipsoids (CDE) model, and T-matrix modelling of a distribution of spheroids. Mie theory provided poor fits to the infrared extinction of quartz (R2 = 0.82 for crystalline silica and R2 = 0.98 for amorphous silica). The T-matrix approach was able to fit the amorphous infrared extinction data with an R2 value of 0.995. Allowing for the possibility of reduced crystallinity in the milled crystal samples, using a mixture of amorphous and crystalline T-matrix cross-sections provided fits with R2 values greater than 0.97 for the infrared extinction of the crystalline samples.
Matching theory for wireless networks
Han, Zhu; Saad, Walid
2017-01-01
This book provides the fundamental knowledge of the classical matching theory problems. It builds up the bridge between the matching theory and the 5G wireless communication resource allocation problems. The potentials and challenges of implementing the semi-distributive matching theory framework into the wireless resource allocations are analyzed both theoretically and through implementation examples. Academics, researchers, engineers, and so on, who are interested in efficient distributive wireless resource allocation solutions, will find this book to be an exceptional resource. .
Number theory arising from finite fields: analytic and probabilistic theory
Knopfmacher, John
2001-01-01
"Number Theory Arising from Finite Fields: Analytic and Probabilistic Theory" offers a discussion of the advances and developments in the field of number theory arising from finite fields. It emphasizes mean-value theorems of multiplicative functions, the theory of additive formulations, and the normal distribution of values from additive functions. The work explores calculations from classical stages to emerging discoveries in alternative abstract prime number theorems.
Wiktor, Julia; Jomard, Gérald; Torrent, Marc
2015-09-01
Many techniques have been developed in the past in order to compute positron lifetimes in materials from first principles. However, there is still a lack of a fast and accurate self-consistent scheme that could accurately handle the forces acting on the ions induced by the presence of the positron. We show in this paper that we have reached this goal by developing the two-component density functional theory within the projector augmented-wave (PAW) method in the open-source code abinit. This tool offers the accuracy of all-electron methods with the computational efficiency of plane-wave ones. We can thus deal with supercells that contain a few hundred to thousands of atoms to study point defects as well as more extended defect clusters. Moreover, using the PAW basis set allows us to use techniques able to, for instance, treat strongly correlated systems or spin-orbit coupling, which is necessary to study heavy elements, such as the actinides or their compounds.
Kok, Jasper F
2011-01-18
Mineral dust aerosols impact Earth's radiation budget through interactions with clouds, ecosystems, and radiation, which constitutes a substantial uncertainty in understanding past and predicting future climate changes. One of the causes of this large uncertainty is that the size distribution of emitted dust aerosols is poorly understood. The present study shows that regional and global circulation models (GCMs) overestimate the emitted fraction of clay aerosols [...] climate predictions in dusty regions. On a global scale, the dust cycle in most GCMs is tuned to match radiative measurements, such that the overestimation of the radiative cooling of a given quantity of emitted dust has likely caused GCMs to underestimate the global dust emission rate. This implies that the deposition flux of dust and its fertilizing effects on ecosystems may be substantially larger than thought.
Tkáč, Štefan
2015-11-01
To achieve smart growth and equitable development in a region, urban planners should also consider lateral energies, represented by energy urban models such as the proposed EEPGC, which focuses on energy distribution via connections among micro-urban structures, their on-site renewable resources, and the perception of micro-urban structures as decentralized energy carriers modeled on the pre-industrial era. These structures remain variously bound when part of greater patterns. After the industrial revolution, the main traded good became energy in its various forms. The EEPGC focuses on sustainable energy transportation distances between villages and the city, described by virtual "energy circles". This more human-scale urbanization boosts the economy of micro-urban areas, rising along with clean energy available in situ, and gives a different perspective on human quality of life in contrast to overcrowded, multicultural mega-urban structures facing generations of problems and struggling to survive as a whole.
周留征; 刘江宁; 王明雁
2016-01-01
Through establishing a game model of income distribution based on endogenous institutional change theory, the paper analyses the different roles of managers and ordinary employees in the decision-making process for the income distribution system, and provides guidance for managers in establishing such a system. The game result shows that the tougher the ordinary employees are, the higher the credibility of their threat, and the smaller the cost of executing it, the greater the impact on the managers' strategy choice, which forces the managers to choose an income distribution system in favour of ordinary employees.
马草原; 孙展展; 葛森; 蒋峰景
2016-01-01
This paper designs and improves evaluation indices of power supply efficiency and integrated power supply efficiency, and uses complex network theory to analyze the influence on power supply efficiency of distributed generation connected directly to the main grid. Based on the integrated power supply efficiency index, the vulnerability of a power grid containing distributed generation is assessed under different fault modes; the IEEE 39-node system is simulated, and the grid vulnerability before and after the addition of distributed generation is compared under different fault modes to verify the validity of the indices. A qualitative assessment of the impact of distributed generation access on grid vulnerability has important theoretical and practical significance for the development and application of high-efficiency, clean energy.
洪月华
2011-01-01
In the research and application of Wireless Sensor Networks (WSN), the use of data mining to improve energy efficiency is an important direction. A distributed data mining algorithm based on rough set theory and BP neural networks was designed and applied to wireless sensor networks. Raw data are discretized and rough set attributes are reduced; the minimum condition attribute set is obtained by the distributed data mining algorithm. Finally, the reduced decision attributes are used to train a BP neural network to classify data. The constructed data mining algorithm can be integrated into each sensor network node. Simulation results indicate that this distributed data mining algorithm can reduce data dimension, eliminate data redundancy, decrease communication traffic and lengthen WSN working hours.
Totani, T; Iwamuro, F; Maihara, T; Motohara, K
2001-01-01
Galaxy counts in the K band, (J-K) colors, and apparent size distributions of faint galaxies in the Subaru Deep Field (SDF) down to K~24.5 were studied in detail. Special attention has been paid to take into account various selection effects, including the cosmological dimming of surface brightness, to avoid any systematic bias which may be the origin of controversy in previously published results. We also tried to be very careful about systematic model uncertainties; we present a comprehensive survey of these systematic uncertainties and their dependence on various parameters. We found that the pure luminosity evolution (PLE) model is well consistent with all the SDF data down to K~22.5, without any evidence for number or size evolution, in a low-density, Lambda-dominated flat universe which is now favored by various cosmological observations. If the popular Lambda-dominated universe is taken for granted, our result then gives a strong constraint on the number evolution of giant elliptical or early-type galaxies to...
Exploration of Active Distribution Network Planning Theory and Practice Direction
刘开俊; 宋毅
2015-01-01
The active distribution network (ADN) is an effective solution for the large-scale grid integration and efficient utilization of distributed renewable energy under a low-carbon economy, and is of great strategic significance for promoting the optimization and adjustment of China's energy structure. This paper introduces the basic concept of the ADN, analyzes its main characteristics and functional features, and lists the construction goals, contents and application results of typical ADN demonstration projects at home and abroad. By analyzing the shortcomings of traditional distribution network planning methods, it points out the key technical problems facing ADN planning, and then proposes a theoretical framework and the key elements of an ADN planning methodology. It also points out that engineering practice should proceed along five lines of construction: network structure, distribution automation, intelligent communication networks, information platforms, and smart power utilization, and proposes ideas and principles for such construction.
Rawls' Theory of Distributive Justice from the Perspective of Contemporary China
陈红杰; 郭金彪
2014-01-01
Rawls' theory of justice not only inherits the traditional conception of justice but also innovates upon and develops it. It discusses in depth the relationship between the individual and society, a basic social question of universal significance. Distributive justice is a core problem encountered by Chinese society in its period of transformation and one of the root causes of various social conflicts. The basic principles and ideas proposed by Rawls for solving distribution problems sketch for us a vision of a free and equal society. His A Theory of Justice can be said to provide us with a solution to social injustice.
刘艳艳; 王晓峰
2011-01-01
This study adopts a series of methods, including field research, literature review, Internet search and in-depth expert interviews, to analyze the development trend of leisure agriculture and to study strategies for it. The promotion paradigms of Chinese leisure agriculture are studied based on three spatial distribution theories: urban economics, comprehensive district planning, and the recreational belt around metropolis (REBAM) theory. At the theoretical level, six paradigms for promoting the development of China's leisure agriculture are proposed: leading by counties and districts, promotion of villages and towns, optimization of farms, transformation of farm households, integration of industries, and driving by industrial parks; their connotations are defined and their promotion goals determined. At the practical level, concrete strategies for cultivating and upgrading these paradigms are proposed.
4C Theory and “Hercy Morning News” Distribution Marketing Strategy
张涛
2015-01-01
Viewed from marketing science, 4C marketing theory emphasizes the central position of readers, which meets the requirements of newspaper industry development under market economy conditions, and it offers important inspiration for the distribution marketing strategy of the “Hercy Morning News”. The strategy can be designed and conceived from four aspects: producing a newspaper product that readers desire, reducing the cost to readers of satisfying their reading and purchasing needs, improving the convenience with which readers can read and purchase the newspaper, and strengthening communication with readers.
Marino Beiras, Marcos
2001-01-01
We give an overview of the relations between matrix models and string theory, focusing on topological string theory and the Dijkgraaf--Vafa correspondence. We discuss applications of this correspondence and its generalizations to supersymmetric gauge theory, enumerative geometry and mirror symmetry. We also present a brief overview of matrix quantum mechanical models in superstring theory.
Game Theory is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in game theory. We hear their views on game theory, its aim, scope, use, the future direction of game theory and how their work fits in these respects.
Enßlin, Torsten
2013-01-01
Non-linear image reconstruction and signal analysis deal with complex inverse problems. To tackle such problems in a systematic way, I present information field theory (IFT) as a means of Bayesian, data based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms even for non-linear and non-Gaussian signal inference problems. IFT algorithms exploit spatial correlations of the signal fields and b...
Gritti, Fabrice; Farkas, Tivadar; Heng, Josuah; Guiochon, Georges
2011-11-11
The influence of the particle size distribution (PSD) on the band broadening and the efficiency of packed columns is investigated from both theoretical and practical viewpoints. Each of the classical contributions to mass transfer kinetics, those due to longitudinal diffusion, eddy dispersion, and solid-liquid mass transfer resistance, are measured and analyzed in terms of their expected and observed intensity as a function of the PSD of mixtures of the commercially available packing materials, 5 and 3 μm Luna-C₁₈ particles (Phenomenex, Torrance, CA, USA). Six 4.6 mm × 150 mm columns were packed with different mixtures of these two materials. The efficiencies of these columns were measured for a non-retained and a retained analyte in a mixture of acetonitrile and water. The longitudinal diffusion coefficient was directly measured by the peak parking method. The solid-liquid mass transfer coefficient was measured from the combination of the peak parking method, the best model of the effective diffusion coefficient, and the actual PSDs of the different particle mixtures measured by Coulter counter experiments. The eddy diffusion term was measured according to a recently developed protocol, by numerical integration of the peak profiles. Our results clearly show that the PSD has no measurable impact on any of the coefficients of the van Deemter equation. On the contrary, and surprisingly, adding a small fraction of large particles to a batch of small particles can improve the quality of the packing of the fine particles. Our results indirectly confirm that the success of sub-3 μm shell particles is due to the roughness of their external surface, which helps eliminate most of the nefarious wall effects. Copyright © 2011 Elsevier B.V. All rights reserved.
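The contributions listed in this record (longitudinal diffusion, eddy dispersion, and solid-liquid mass transfer resistance) are the B, A, and C terms of the van Deemter equation H(u) = A + B/u + C*u. As a sketch only (synthetic plate-height data with hypothetical coefficients and units, not the authors' measurements), the three terms can be recovered from (u, H) data by linear least squares:

```python
def fit_van_deemter(u, H):
    """Least-squares fit of H = A + B/u + C*u by solving the 3x3
    normal equations with Gaussian elimination (no external libraries)."""
    X = [[1.0, 1.0 / ui, ui] for ui in u]           # design matrix rows
    # Normal equations: (X^T X) a = X^T H
    M = [[sum(X[k][i] * X[k][j] for k in range(len(u))) for j in range(3)]
         for i in range(3)]
    v = [sum(X[k][i] * H[k] for k in range(len(u))) for i in range(3)]
    for col in range(3):                            # forward elimination
        piv = M[col][col]
        for row in range(col + 1, 3):
            f = M[row][col] / piv
            M[row] = [m - f * mc for m, mc in zip(M[row], M[col])]
            v[row] -= f * v[col]
    a = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                             # back substitution
        a[i] = (v[i] - sum(M[i][j] * a[j] for j in range(i + 1, 3))) / M[i][i]
    return a  # [A, B, C]

# Synthetic data generated from A=2, B=5, C=0.05 (hypothetical units)
us = [0.5, 1.0, 2.0, 4.0, 8.0, 16.0]
Hs = [2.0 + 5.0 / ui + 0.05 * ui for ui in us]
A, B, C = fit_van_deemter(us, Hs)
```

With noise-free synthetic data the fit recovers the generating coefficients exactly (up to floating-point error), which is a useful self-check before applying such a fit to measured plate heights.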
Neutral theory in community ecology
无
2008-01-01
One of the central goals of community ecology is to understand the forces that maintain species diversity within communities. The traditional niche-assembly theory asserts that species live together in a community only when they differ from one another in resource use. But this theory has some difficulties in explaining the diversity often observed in species-rich communities such as tropical forests. As an alternative to the niche theory, Hubbell and other ecologists introduced a neutral model. Hubbell argues that the number of species in a community is controlled by species extinction and immigration or speciation of new species. Assuming that all individuals of all species in a trophically similar community are ecologically equivalent, Hubbell's neutral theory predicts two important statistical distributions. One is the asymptotic log-series distribution for metacommunities under point mutation speciation, and the other is the zero-sum multinomial distribution both for local communities under dispersal limitation and for metacommunities under random fission speciation. Unlike the niche-assembly theory, the neutral theory takes similarity in species and individuals as a starting point for investigating species diversity. Based on the fundamental processes of birth, death, dispersal and speciation, the neutral theory provided the first mechanistic explanation of the species abundance distributions commonly observed in natural communities. Since the publication of the neutral theory, there has been much discussion about it, pro and con. In this paper, we summarize recent progress in the assumptions, predictions and speciation modes of the neutral theory, including progress in the theory itself, tests of the theory's assumptions, and predictions and speciation modes at the metacommunity level. We also suggest that the most important task in the future is to bridge the niche-assembly theory and the neutral theory, and to add species differences to the neutral theory and
Navigating Distributed Services
Beute, Berco
2002-01-01
, to a situation where they are distributed across the Internet. The second trend is the shift from a virtual environment that solely consists of distributed documents to a virtual environment that consists of both distributed documents and distributed services. The third and final trend is the increasing diversity of devices used to access information on the Internet. The focal point of the thesis is an initial exploration of the effects of the trends on users as they navigate the virtual environment of distributed documents and services. To begin, the thesis uses scenarios as a heuristic device to identify and analyse the main effects of the trends. This is followed by an exploration of the theory of navigation in Information Spaces, which is in turn followed by an overview of theories and the state of the art in navigating distributed services. These explorations of both theory and practice resulted in a large number of topics...
Elements of probability theory
Rumshiskii, L Z
1965-01-01
Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments
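The "stability of the sample mean" named in the closing chapters is the law of large numbers, which is easy to demonstrate numerically. A minimal sketch (uniform draws; the sample sizes and seed are arbitrary choices for illustration):

```python
import random

def sample_mean(n, seed=0):
    """Mean of n draws from Uniform(0, 1); the true mean is 0.5."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n)) / n

# The deviation of the sample mean from the true mean shrinks
# as the sample grows (roughly like 1/sqrt(n)).
small_dev = abs(sample_mean(100) - 0.5)
large_dev = abs(sample_mean(100_000) - 0.5)
```

For a fixed seed the run is reproducible; across seeds, the large-sample deviation concentrates tightly around zero while the small-sample deviation fluctuates much more.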
Prata, Bruno de Athayde; Arruda, Joao Bosco Furtado [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Nucleo de Pesquisa em Logistica, Transporte e Desenvolvimento
2004-07-01
The use of natural gas is nowadays increasing on the Brazilian scene, and this fact shows the necessity of effective planning in that sector. In the case of Natural Gas Vehicular (NGV) distribution, one can face conflicts among the points of view of the actors (distributor, retailers, customers and non-users), and fuel stations expand in most Brazilian urban areas in an uncontrolled way, disregarding county regulations on land use. This paper reports a study using a model based on Game Theory concepts to determine key variables such as the number of fuel stations which must deliver NGV in a given study area. Although some information was not available, the simulation results show the usefulness of such an approach for solving distribution questions in the NGV sector. The model was applied to the case of a district in Fortaleza city, which is the study area of a project entitled Projeto GASLOG, presently in progress under the sponsorship of the Brazilian Government, PETROBRAS and the Brazilian Gas-Energy Research Network. (author)
Distributions of Dirac Operator Eigenvalues
Akemann, G
2004-01-01
The distribution of individual Dirac eigenvalues is derived by relating them to the density and higher eigenvalue correlation functions. The relations are general and hold for any gauge theory coupled to fermions under certain conditions which are stated. As a special case, we give examples of the lowest-lying eigenvalue distributions for QCD-like gauge theories without making use of earlier results based on the relation to Random Matrix Theory.
Distributed computer control systems
Suski, G.J.
1986-01-01
This book focuses on recent advances in the theory, applications and techniques for distributed computer control systems. Contents (partial): Real-time distributed computer control in a flexible manufacturing system. Semantics and implementation problems of channels in a DCCS specification. Broadcast protocols in distributed computer control systems. Design considerations of distributed control architecture for a thermal power plant. The conic toolset for building distributed systems. Network management issues in distributed control systems. Interprocessor communication system architecture in a distributed control system environment. Uni-level homogenous distributed computer control system and optimal system design. A-nets for DCCS design. A methodology for the specification and design of fault tolerant real time systems. An integrated computer control system - architecture design, engineering methodology and practical experience.
Johnstone, PT
2014-01-01
Focusing on topos theory's integration of geometric and logical ideas into the foundations of mathematics and theoretical computer science, this volume explores internal category theory, topologies and sheaves, geometric morphisms, and other subjects. 1977 edition.
Information theory of molecular systems
Nalewajski, Roman F
2006-01-01
As well as providing a unified outlook on physics, Information Theory (IT) has numerous applications in chemistry and biology owing to its ability to provide a measure of the entropy/information contained within probability distributions and criteria of their information "distance" (similarity) and independence. Information Theory of Molecular Systems applies standard IT to classical problems in the theory of electronic structure and chemical reactivity. The book starts by introducing the basic concepts of modern electronic structure/reactivity theory based upon the Density Functional Theory
Williams, Jeffrey
1994-01-01
Considers the recent flood of anthologies of literary criticism and theory as exemplifications of the confluence of pedagogical concerns, economics of publishing, and other historical factors. Looks specifically at how these anthologies present theory. Cites problems with their formatting theory and proposes alternative ways of organizing theory…
Linder, Stefan; Foss, Nicolai Juul
Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting and informational conditions, the theory addresses problems of ex... agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism.
The Weibull distribution a handbook
Rinne, Horst
2008-01-01
The Most Comprehensive Book on the Subject. Chronicles the Development of the Weibull Distribution in Statistical Theory and Applied Statistics. Exploring one of the most important distributions in statistics, The Weibull Distribution: A Handbook focuses on its origin, statistical properties, and related distributions. The book also presents various approaches to estimate the parameters of the Weibull distribution under all possible situations of sampling data, as well as approaches to parameter and goodness-of-fit testing. Describes the Statistical Methods, Concepts, Theories, and Applications of T
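As a small companion sketch (not drawn from the handbook; the shape and scale values below are arbitrary): Weibull variates can be generated by inverting the CDF F(x) = 1 - exp(-(x/scale)**shape), and the shape-independent identity F(scale) = 1 - exp(-1) ≈ 0.632 gives a quick sanity check on any sample.

```python
import math
import random

def weibull_sample(shape, scale, n, seed=42):
    """Draw n Weibull variates by inverting the CDF
    F(x) = 1 - exp(-(x/scale)**shape)."""
    rng = random.Random(seed)
    return [scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)
            for _ in range(n)]

# Regardless of the shape parameter, about 63.2% of the probability
# mass lies below the scale parameter: F(scale) = 1 - exp(-1).
xs = weibull_sample(shape=1.5, scale=2.0, n=20000)
frac_below_scale = sum(x <= 2.0 for x in xs) / len(xs)
```

The inversion method works for any shape > 0 and scale > 0; Python's standard library also offers `random.weibullvariate` for the same purpose.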
In defence of the right shift theory.
Annett, M
1996-02-01
The right shift (RS) theory of a gene for left-cerebral dominance which increases the probability of right-handedness is outlined, together with two proposed alternatives, the 1985a genetic theory of McManus and the 1993 developmental instability theory of Yeo and Gangestad. Similarities and differences among the three theories are reviewed. Both of the genetic theories can predict the distribution of handedness in families and in twins more efficiently than the developmental instability theory, and the RS theory better than the McManus theory.
刘春学; 李连举; 李春雪
2013-01-01
Due to the special characteristics of mineral resources, unequal benefit distribution is particularly significant in mineral resources exploitation. Residents have lived for generations in areas rich in mineral resources, yet local economic development still lags behind, living standards have improved little, contradictions and conflicts are frequent, and social stability is a concern. Based on field surveys of selected minerals and mining areas in Yunnan, this paper uses game theory to construct an incomplete-information dynamic game model between mining enterprises and mining area residents. Solving for the perfect Bayesian equilibrium of the model shows that in mineral resources exploitation, enterprises tend to distribute very low benefits to mining area residents, and that this distribution strategy constitutes a pooling equilibrium. The necessary conditions for each equilibrium point are then derived, and their existence is verified with cases from the field surveys. On this basis, countermeasures to ensure reasonable benefit distribution in mineral resources exploitation are proposed from three aspects, supervision, compensation mechanisms and distribution institutions, providing a reference for maintaining social stability in mining areas and achieving sustainable economic and social development.
Loring, FH
2014-01-01
Summarising the most novel facts and theories which were coming into prominence at the time, particularly those which had not yet been incorporated into standard textbooks, this important work was first published in 1921. The subjects treated cover a wide range of research that was being conducted into the atom, and include Quantum Theory, the Bohr Theory, the Sommerfeld extension of Bohr's work, the Octet Theory and Isotopes, as well as Ionisation Potentials and Solar Phenomena. Because much of the material of Atomic Theories lies on the boundary between experimentally verified fact and spec
Linder, Stefan; Foss, Nicolai Juul
2015-01-01
Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting, and informational conditions, the theory addresses problems of ex... agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism.
Rowen, Louis H
1991-01-01
This is an abridged edition of the author's previous two-volume work, Ring Theory, which concentrates on essential material for a general ring theory course while omitting much of the material intended for ring theory specialists. It has been praised by reviewers: "As a textbook for graduate students, Ring Theory joins the best… The experts will find several attractive and pleasant features in Ring Theory. The most noteworthy is the inclusion, usually in supplements and appendices, of many useful constructions which are hard to locate outside of the original sources… The audience of non
Harris, Tina
2015-04-29
Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.
Fractal tracer distributions in turbulent field theories
Hansen, J. Lundbek; Bohr, Tomas
1998-01-01
We study the motion of passive tracers in a two-dimensional turbulent velocity field generated by the Kuramoto-Sivashinsky equation. By varying the direction of the velocity-vector with respect to the field-gradient we can continuously vary the two Lyapunov exponents for the particle motion and t...
周永勇; 周湶; 刘佳宾
2008-01-01
As the first step of service restoration of a distribution system, rapid fault diagnosis is a significant task for reducing power outage time, decreasing outage losses, and subsequently improving service reliability and safety. This paper analyzes a fault diagnosis approach using rough set theory, in which reducing the decision table of the data set is the main calculation-intensive task. Aiming at this reduction problem, a heuristic reduction algorithm based on attribute length and frequency is proposed, and the corresponding value reduction method is proposed in order to fulfill the reduction and the extraction of diagnosis rules. Meanwhile, a Euclidean matching method is introduced to resolve conflicts among the extracted rules when some information is lacking. The principle of the whole algorithm is clear, and the diagnostic rules distilled from the reduction are concise. Moreover, it needs less calculation on the specific discernibility matrix, and thus avoids the corresponding NP-hard problem. The whole process is implemented in MATLAB. A simulation example shows that the method has a fast calculation speed, and that the extracted rules reflect the characteristics of the fault in a concise form. The rule database, formed by different reductions of the decision table, can diagnose single and multiple faults efficiently, and gives satisfactory results even when the available information is incomplete. The proposed method has good error tolerance and potential for on-line fault diagnosis.
The Impact of Education on Distribution
J. Tinbergen (Jan)
1972-01-01
In this paper the author adds some further empirical tests of his theory of income distribution. This theory (cf. this Review, Series 16, Number 3, September 1970, p. 221 ff) sees income distribution as the distribution of prices of production factors, especially labour, of different qua
Public finance in theory and practice
Auerbach, Alan J
1993-01-01
… 1. the measurement of the revenue, and 2. the distributional effects of proposed tax changes. Revenue and distributional analysis are important tax policy tools supplied by economists and supposedly grounded in economic theory and practice …
Chang, CC
2012-01-01
Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko
Aubin, Jean-Pierre; Saint-Pierre, Patrick
2011-01-01
Viability theory designs and develops mathematical and algorithmic methods for investigating the adaptation to viability constraints of evolutions governed by complex systems under uncertainty that are found in many domains involving living beings, from biological evolution to economics, from environmental sciences to financial markets, from control theory and robotics to cognitive sciences. It involves interdisciplinary investigations spanning fields that have traditionally developed in isolation. The purpose of this book is to present an initiation to applications of viability theory, explai
Roman, Steven
2006-01-01
Intended for graduate courses or for independent study, this book presents the basic theory of fields. The first part begins with a discussion of polynomials over a ring, the division algorithm, irreducibility, field extensions, and embeddings. The second part is devoted to Galois theory. The third part of the book treats the theory of binomials. The book concludes with a chapter on families of binomials - the Kummer theory. This new edition has been completely rewritten in order to improve the pedagogy and to make the text more accessible to graduate students. The exercises have also been im
Hashiguchi, Koichi
2009-01-01
This book details the mathematics and continuum mechanics necessary as a foundation of elastoplasticity theory. It explains physical backgrounds with illustrations and provides descriptions of detailed derivation processes.
Cox, David A
2012-01-01
Praise for the First Edition: "…will certainly fascinate anyone interested in abstract algebra: a remarkable book!" (Monatshefte für Mathematik). Galois theory is one of the most established topics in mathematics, with historical roots that led to the development of many central concepts in modern algebra, including groups and fields. Covering classic applications of the theory, such as solvability by radicals, geometric constructions, and finite fields, Galois Theory, Second Edition delves into novel topics like Abel's theory of Abelian equations, casus irreducibilis, and the Galo
Dufwenberg, Martin
2011-03-01
Game theory is a toolkit for examining situations where decision makers influence each other. I discuss the nature of game-theoretic analysis, the history of game theory, why game theory is useful for understanding human psychology, and why game theory has played a key role in the recent explosion of interest in the field of behavioral economics. WIREs Cogn Sci 2011, 2, 167-173. DOI: 10.1002/wcs.119. For further resources related to this article, please visit the WIREs website.
Distributed Cognition and Distributed Morality: Agency, Artifacts and Systems.
Heersmink, Richard
2017-04-01
There are various philosophical approaches and theories describing the intimate relation people have to artifacts. In this paper, I explore the relation between two such theories, namely distributed cognition and distributed morality theory. I point out a number of similarities and differences in these views regarding the ontological status they attribute to artifacts and the larger systems they are part of. Having evaluated and compared these views, I continue by focussing on the way cognitive artifacts are used in moral practice. I specifically conceptualise how such artifacts (a) scaffold and extend moral reasoning and decision-making processes, (b) have a certain moral status which is contingent on their cognitive status, and (c) whether responsibility can be attributed to distributed systems. This paper is primarily written for those interested in the intersection of cognitive and moral theory as it relates to artifacts, but also for those independently interested in philosophical debates in extended and distributed cognition and ethics of (cognitive) technology.
LeVeque, William J
1996-01-01
This excellent textbook introduces the basics of number theory, incorporating the language of abstract algebra. A knowledge of such algebraic concepts as group, ring, field, and domain is not assumed, however; all terms are defined and examples are given - making the book self-contained in this respect.The author begins with an introductory chapter on number theory and its early history. Subsequent chapters deal with unique factorization and the GCD, quadratic residues, number-theoretic functions and the distribution of primes, sums of squares, quadratic equations and quadratic fields, diopha
Manning, Phillip
2011-01-01
The study of quantum theory allowed twentieth-century scientists to examine the world in a new way, one that was filled with uncertainties and probabilities. Further study also led to the development of lasers, the atomic bomb, and the computer. This exciting new book clearly explains quantum theory and its everyday uses in our world.
Ion N.Chiuta
2009-05-01
The paper determines relations for shielding effectiveness relative to several variables, including metal type, metal properties, thickness, distance, frequency, etc. It starts by presenting some relationships regarding magnetic, electric and electromagnetic fields as a pertinent background to understanding and applying field theory. Since the literature about electromagnetic compatibility is replete with discussions of Maxwell equations and field theory, only a few aspects are presented.
Security of Quantum Key Distribution
Renner, R
2005-01-01
We propose various new techniques in quantum information theory, including a de Finetti style representation theorem for finite symmetric quantum states. As an application, we give a proof for the security of quantum key distribution which applies to arbitrary protocols.
郑丽辉; 赵志刚; 方晓汾
2015-01-01
A battery/ultra-capacitor hybrid energy storage system is proposed for a mini pure electric vehicle, with a power distribution control strategy based on fuzzy control theory. A fuzzy controller is designed for the vehicle power system structure. Taking the bus current of the driving motor and the SOC of the ultra-capacitor as inputs, the fuzzy controller calculates the optimal output current of the ultra-capacitor and reasonably distributes power between the battery and the ultra-capacitor. The hybrid energy storage system is modeled and simulated on the Matlab/Simulink platform. The simulation results show that the proposed power allocation strategy can effectively allocate the output power of the battery and ultra-capacitor, improve the working condition of the battery, and increase energy utilization.
Bjerg, Ole; Presskorn-Thygesen, Thomas
2017-01-01
The paper is a contribution to current debates about conspiracy theories within philosophy and cultural studies. Wittgenstein's understanding of language is invoked to analyse the epistemological effects of designating particular questions and explanations as a 'conspiracy theory'. It is demonstrated how such a designation relegates these questions and explanations beyond the realm of meaningful discourse. In addition, Agamben's concept of sovereignty is applied to explore the political effects of using the concept of conspiracy theory. The exceptional epistemological status assigned to alleged conspiracy theories within our prevalent paradigms of knowledge and truth is compared to the exceptional legal status assigned to individuals accused of terrorism under the War on Terror. The paper concludes by discussing the relation between conspiracy theory and 'the paranoid style' …
Lukeš, Jaroslav; Netuka, Ivan; Veselý, Jiří
1988-01-01
Within the tradition of meetings devoted to potential theory, a conference on potential theory took place in Prague on 19-24 July 1987. The Conference was organized by the Faculty of Mathematics and Physics, Charles University, with the collaboration of the Institute of Mathematics, Czechoslovak Academy of Sciences, the Department of Mathematics, Czech University of Technology, the Union of Czechoslovak Mathematicians and Physicists, the Czechoslovak Scientific and Technical Society, and supported by IMU. During the Conference, 69 scientific communications from different branches of potential theory were presented; the majority of them are included in the present volume. (Papers based on survey lectures delivered at the Conference, its program as well as a collection of problems from potential theory will appear in a special volume of the Lecture Notes Series published by Springer-Verlag.) Topics of these communications truly reflect the vast scope of contemporary potential theory. Some contributions deal...
Hjørland, Birger
2009-01-01
Concept theory is an extremely broad, interdisciplinary and complex field of research, related to many deep fields with very long historical traditions and without much consensus. However, information science and knowledge organization cannot avoid relating to theories of concepts. Knowledge organizing systems (e.g. classification systems, thesauri and ontologies) should be understood as systems basically organizing concepts and their semantic relations. The same is the case with information retrieval systems. Different theories of concepts have different implications for how to construe, evaluate and use such systems. Based on "a post-Kuhnian view" of paradigms, this paper puts forward arguments that the best understanding and classification of theories of concepts is to view and classify them in accordance with epistemological theories (empiricism, rationalism, historicism and pragmatism …
Bernardo, Jose M
2000-01-01
This highly acclaimed text, now available in paperback, provides a thorough account of key concepts and theoretical results, with particular emphasis on viewing statistical inference as a special case of decision theory. Information-theoretic concepts play a central role in the development of the theory, which provides, in particular, a detailed discussion of the problem of specification of so-called prior ignorance. The work is written from the authors' committed Bayesian perspective, but an overview of non-Bayesian theories is also provided, and each chapter contains a wide-ranging critica
Kathleen Holtz Deal
2007-05-01
Psychodynamic theory, a theory of personality originated by Sigmund Freud, has a long and complex history within social work and continues to be utilized by social workers. This article traces the theory's development and explains key concepts with an emphasis on its current relational focus within object relations theory and self psychology. Empirical support for theoretical concepts and the effectiveness of psychodynamic therapies is reviewed and critiqued. Future directions are discussed, including addressing cultural considerations, increasing research, and emphasizing a relational paradigm
Andrews, George E
1994-01-01
Although mathematics majors are usually conversant with number theory by the time they have completed a course in abstract algebra, other undergraduates, especially those in education and the liberal arts, often need a more basic introduction to the topic.In this book the author solves the problem of maintaining the interest of students at both levels by offering a combinatorial approach to elementary number theory. In studying number theory from such a perspective, mathematics majors are spared repetition and provided with new insights, while other students benefit from the consequent simpl
Smith, Shelley
This paper came about within the context of a 13-month research project, Focus Area 1 - Method and Theory, at the Center for Public Space Research at the Royal Academy of the Arts School of Architecture in Copenhagen, Denmark. This project has been funded by RealDania. The goals of the research project, Focus Area 1 - Method and Theory, which forms the framework for this working paper, are: * To provide a basis from which to discuss the concept of public space in a contemporary architectural and urban context - specifically relating to theory and method * To broaden the discussion of the concept …
Lubliner, Jacob
2008-01-01
The aim of Plasticity Theory is to provide a comprehensive introduction to the contemporary state of knowledge in basic plasticity theory and to its applications. It treats several areas not commonly found between the covers of a single book: the physics of plasticity, constitutive theory, dynamic plasticity, large-deformation plasticity, and numerical methods, in addition to a representative survey of problems treated by classical methods, such as elastic-plastic problems, plane plastic flow, and limit analysis; the problems discussed come from areas of interest to mechanical, structural, and
Theory of vibration protection
Karnovsky, Igor A
2016-01-01
This text is an advancement of the theory of vibration protection of mechanical systems with lumped and distributed parameters. The book offers various concepts and methods of solving vibration protection problems, discusses the advantages and disadvantages of different methods, and the fields of their effective applications. Fundamental approaches of vibration protection, which are considered in this book, are the passive, parametric and optimal active vibration protection. The passive vibration protection is based on vibration isolation, vibration damping and dynamic absorbers. Parametric vibration protection theory is based on the Shchipanov-Luzin invariance principle. Optimal active vibration protection theory is based on the Pontryagin principle and the Krein moment method. The book also contains special topics such as suppression of vibrations at the source of their occurrence and the harmful influence of vibrations on humans. Numerous examples, which illustrate the theoretical ideas of each chapter, ar...
Introduction to electromagnetic theory
Owen, George E
2003-01-01
A direct, stimulating approach to electromagnetic theory, this text employs matrices and matrix methods for the simple development of broad theorems. The author uses vector representation throughout the book, with numerous applications of Poisson's equation and the Laplace equation (the latter occurring in both electronics and magnetic media). Contents include the electrostatics of point charges, distributions of charge, conductors and dielectrics, currents and circuits, and the Lorentz force and the magnetic field. Additional topics comprise the magnetic field of steady currents, induced ele
Wolpert, David H.
2005-01-01
Probability theory governs the outcome of a game; there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy one must use a loss function external to the game's players. This provides a quantification of any strategy's rationality. We prove that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.
Nel, Louis
2016-01-01
This book presents a detailed, self-contained theory of continuous mappings. It is mainly addressed to students who have already studied these mappings in the setting of metric spaces, as well as multidimensional differential calculus. The needed background facts about sets, metric spaces and linear algebra are developed in detail, so as to provide a seamless transition between students' previous studies and new material. In view of its many novel features, this book will be of interest also to mature readers who have studied continuous mappings from the subject's classical texts and wish to become acquainted with a new approach. The theory of continuous mappings serves as infrastructure for more specialized mathematical theories like differential equations, integral equations, operator theory, dynamical systems, global analysis, topological groups, topological rings and many more. In light of the centrality of the topic, a book of this kind fits a variety of applications, especially those that contribute to ...
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
Koschmann, Timothy; Roschelle, Jeremy; Nardi, Bonnie A.
1998-01-01
Includes three articles that discuss activity theory, based on "Context and Consciousness." Topics include human-computer interaction; computer interfaces; hierarchical structuring; mediation; contradictions and development; failure analysis; and designing educational technology. (LRW)
Gould, Ronald
2012-01-01
This introduction to graph theory focuses on well-established topics, covering primary techniques and including both algorithmic and theoretical problems. The algorithms are presented with a minimum of advanced data structures and programming details. This thoroughly corrected 1988 edition provides insights to computer scientists as well as advanced undergraduates and graduate students of topology, algebra, and matrix theory. Fundamental concepts and notation and elementary properties and operations are the first subjects, followed by examinations of paths and searching, trees, and networks. S
1988-06-30
MATRICES. The monograph Nonnegative Matrices [6] is an advanced book on all aspects of the theory of nonnegative matrices … and on inverse eigenvalue problems for nonnegative matrices. The work explores some of the most recent developments in the theory of nonnegative matrices … Define the associated polynomial of type ⟨t₁, …, t_k⟩: x^t − x^(t−t₂) − x^(t−t₃) − ⋯ − x^(t−t_(k−1)), where t = t_k.
Characterizations of univariate continuous distributions
Ahsanullah, Mohammad
2017-01-01
Provides in an organized manner characterizations of univariate probability distributions, with many new results published in this area since the 1978 work of Galambos & Kotz, "Characterizations of Probability Distributions" (Springer), together with applications of the theory in model fitting and predictions.
Understanding Second-Order Theory of Mind
2015-03-01
I.2.0 [Artificial Intelligence]: General — cognitive simulation; I.2.11 [Artificial Intelligence]: Distributed Artificial Intelligence — intelligent agents, coherence and coordination. General Terms: Theory. Keywords: theory of mind; human-robot teams. 1. INTRODUCTION. Theory of mind (ToM) is a critical … posit that that mechanism is simulation. Overall, robots with theory of mind are viewed as more natural and intelligent teammates to their human …
Random Matrix theory approach to Quantum mechanics
Chaitanya, K. V. S. Shiv
2015-01-01
In this paper, we give a random matrix theory approach to quantum mechanics using the quantum Hamilton-Jacobi formalism. We show that bound state problems in quantum mechanics are analogous to solving the Gaussian unitary ensemble of random matrix theory. This study helps to identify the potential appearing in the joint probability distribution function of random matrix theory as a superpotential. This approach allows extending random matrix theory to the newly discovered exceptional ...
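As a hedged illustration of the ensemble this abstract refers to (a standard textbook construction, not code from the paper), a Gaussian unitary ensemble matrix can be sampled directly and its spectrum computed; for large n the eigenvalue density approaches Wigner's semicircle law.

```python
# Hedged sketch (not from the paper): sample eigenvalues of a GUE matrix.
import numpy as np

def gue_eigenvalues(n, rng):
    """Eigenvalues of an n x n GUE matrix H = (A + A^H) / 2."""
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    h = (a + a.conj().T) / 2  # Hermitian by construction
    return np.linalg.eigvalsh(h)  # real eigenvalues, sorted ascending

rng = np.random.default_rng(0)
ev = gue_eigenvalues(500, rng)
# With this normalization the spectrum is essentially confined to
# [-2*sqrt(n), 2*sqrt(n)] (semicircle support).
assert ev.shape == (500,) and np.max(np.abs(ev)) < 2.5 * np.sqrt(500)
```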
Possibility Theory versus Probability Theory in Fuzzy Measure Theory
Parul Agarwal
2015-05-01
The purpose of this paper is to compare probability theory with possibility theory, and to use this comparison in comparing probability theory with fuzzy set theory. The best way of comparing probabilistic and possibilistic conceptualizations of uncertainty is to examine the two theories from a broader perspective. Such a perspective is offered by evidence theory, within which probability theory and possibility theory are recognized as special branches. While the various characteristics of possibility theory within the broader framework of evidence theory are expounded in this paper, we need to introduce their probabilistic counterparts to facilitate our discussion.
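As a minimal illustration of the contrast the paper examines (a sketch using the standard textbook definitions, not code or notation from the paper): a possibility measure combines outcomes by maximum where a probability measure would sum, and necessity is its dual.

```python
# Standard possibility/necessity measures induced by a possibility
# distribution pi, normalized so that max(pi) = 1. Textbook definitions,
# not taken from the paper.
def possibility(pi, event):
    """Pos(A) = max of pi over A."""
    return max(pi[x] for x in event)

def necessity(pi, event):
    """Nec(A) = 1 - Pos(complement of A)."""
    complement = set(pi) - set(event)
    return 1.0 - (max(pi[x] for x in complement) if complement else 0.0)

pi = {"low": 1.0, "medium": 0.7, "high": 0.2}
A = {"medium", "high"}
print(possibility(pi, A))  # 0.7 -- a maximum, where a probability would sum
print(necessity(pi, A))    # 0.0 -- since Pos({"low"}) = 1.0
```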
汪圣毅; 雷伟; 陈志武; 韩涵; 刘弋
2014-01-01
Objective: To analyze the distribution and influencing factors of exam results in medical students' general surgery theory course. Methods: 171 medical students were selected as subjects by cluster sampling, and the distribution and influencing factors of their exam results were analyzed. SPSS 17.0 software was used for statistical analysis; measurement data are expressed as x̄ ± s, and normality was checked with the Kolmogorov-Smirnov test. Quantitative data not meeting the normal distribution were compared with the Mann-Whitney U and Kruskal-Wallis H tests. Rank-transformed univariate multifactor ANOVA with the LSD method (equal error variances between groups by Levene's test) or the Tamhane method (unequal error variances by Levene's test) was used for pairwise comparisons between groups. Unconditional univariate and multivariate logistic regression models were used to analyze the factors influencing whether the total score was excellent, with a significance level of α = 0.05. Results: Total scores were normally distributed, with 77 points as an outlier. Female students' total scores and scores on each question type were higher than male students', and differences in total scores, multiple-choice scores and essay-question scores between classes were statistically significant. Multivariate logistic analysis showed that being male (OR = 0.212, 95% CI: 0.077-0.584) was an unfavorable factor for an excellent total score, while high scores on term definitions (OR = 12.160, 95% CI: 1.985-74.495), multiple-choice questions (OR = 9.887, 95% CI: 2.997-32.617) and essay questions (OR = 18.323, 95% CI: 6.593-50.928) were favorable factors. Conclusion: Attention should be paid to analyzing the causes of outlier scores and gender differences, and to the influence of question types on exam results.
Carroll, Joseph; Clasen, Mathias; Jonsson, Emelie
2017-01-01
Biocultural theory is an integrative research program designed to investigate the causal interactions between biological adaptations and cultural constructions. From the biocultural perspective, cultural processes are rooted in the biological necessities of the human life cycle: specifically human … and ideological beliefs, and artistic practices such as music, dance, painting, and storytelling. Establishing biocultural theory as a program that self-consciously encompasses the different particular forms of human evolutionary research could help scholars and scientists envision their own specialized areas of research as contributions to a coherent, collective research program. This article argues that a mature biocultural paradigm needs to be informed by at least 7 major research clusters: (a) gene-culture coevolution; (b) human life history theory; (c) evolutionary social psychology; (d) anthropological …
Weibel instability with nonextensive distribution
Qiu, Hui-Bin; Liu, Shi-Bing [Strong-field and Ultrafast Photonics Lab, Institute of Laser Engineering, Beijing University of Technology, Beijing 100124 (China)
2013-10-15
Weibel instability in a plasma, where the ion distribution is isotropic and the electron component possesses an anisotropic temperature distribution, is investigated based on kinetic theory in the context of nonextensive statistical mechanics. The instability growth rate is shown to depend on the nonextensive parameters of both electrons and ions, and in the extensive limit the result for a Maxwellian-distribution plasma is recovered. The instability growth rate is found to be enhanced as the nonextensive parameter of the electrons increases.
Donnellan, Thomas; Maxwell, E A; Plumpton, C
1968-01-01
Lattice Theory presents an elementary account of a significant branch of contemporary mathematics concerning lattice theory. This book discusses the unusual features, which include the presentation and exploitation of partitions of a finite set. Organized into six chapters, this book begins with an overview of the concept of several topics, including sets in general, the relations and operations, the relation of equivalence, and the relation of congruence. This text then defines the relation of partial order and then partially ordered sets, including chains. Other chapters examine the properti
Stewart, Ian
2003-01-01
Ian Stewart's Galois Theory has been in print for 30 years. Resoundingly popular, it still serves its purpose exceedingly well. Yet mathematics education has changed considerably since 1973, when theory took precedence over examples, and the time has come to bring this presentation in line with more modern approaches.To this end, the story now begins with polynomials over the complex numbers, and the central quest is to understand when such polynomials have solutions that can be expressed by radicals. Reorganization of the material places the concrete before the abstract, thus motivating the g
THEORETICAL ANALYSIS ON THE VERTICAL DISTRIBUTION OF PARTICLE CONCENTRATION
Guangqian WANG; Xudong FU
2001-01-01
In steady, solid-liquid two-phase turbulent flows, there exist two typical patterns of the vertical distribution of particle concentration. Pattern I shows a maximum concentration at an elevation above the bed. Pattern II shows an increase of particle concentration downward over the whole vertical, with the maximum at the bed. Most theories of particle concentration distribution address pattern II, and a successful theory covering both patterns is lacking. This paper reviews the particle distribution theories, including the diffusion theory, the mixture theory, the energy theory, the similarity theory, the stochastic theory and the kinetic theory. The kinetic theory is also applied to describe the vertical distribution of particle concentration in both dilute and dense flows.
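The diffusion theory listed in this abstract is the source of the classical Rouse profile, which produces pattern II; a minimal sketch under the usual textbook symbols (h flow depth, a reference height, c_a reference concentration, z the Rouse number w_s/(κ·u*) — none of these taken from the paper):

```python
# Hedged sketch: the classical Rouse suspended-sediment profile
# (diffusion theory). Symbols are textbook conventions, not the paper's.
def rouse_concentration(y, h, a, c_a, z):
    """Concentration C(y) at height y above the bed, for a <= y < h."""
    return c_a * (((h - y) / y) * (a / (h - a))) ** z

h, a, c_a, z = 1.0, 0.05, 1.0, 1.0
profile = [rouse_concentration(y, h, a, c_a, z) for y in (0.05, 0.1, 0.5)]
# Concentration decreases monotonically upward from the bed (pattern II).
assert profile[0] > profile[1] > profile[2]
```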
Three Liberal Theories of Justice
Jiří MACHÁČEK
2013-01-01
The main goal of this thesis is to introduce the modern theory of liberal justice with a focus on distributive justice. In addition, the author addresses the issue of value neutrality in the liberal state and the concept of equality in liberal theory. The author presents the concept of "justice as fairness" described by liberal political philosopher John Rawls. Afterwards his concept is subjected to criticism by other contemporary liberal philosophers Robert Nozick and Ronald Dworkin. The aut...
Probability Theory without Bayes' Rule
Rodriques, Samuel G.
2014-01-01
Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...
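The standard Bayes-rule inference this abstract takes as its starting point can be illustrated with a minimal sketch (the function name and numbers are illustrative, not from the paper): P(A|B) = P(B|A)·P(A)/P(B), with P(B) expanded by total probability over A and not-A.

```python
# Minimal sketch of standard Bayesian inference for a binary hypothesis.
def posterior(prior_a, p_b_given_a, p_b_given_not_a):
    """Return P(A|B) for a binary hypothesis A given evidence B."""
    p_b = p_b_given_a * prior_a + p_b_given_not_a * (1.0 - prior_a)
    return p_b_given_a * prior_a / p_b

# Example: a test with 99% sensitivity and a 5% false-positive rate,
# applied to a condition with 1% prevalence.
print(round(posterior(0.01, 0.99, 0.05), 4))  # 0.1667
```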
A nonlinear theory of generalized functions
1990-01-01
This book provides a simple introduction to a nonlinear theory of generalized functions introduced by J.F. Colombeau, which gives a meaning to any multiplication of distributions. This theory extends from pure mathematics (it presents a faithful generalization of the classical theory of C∞ functions and provides a synthesis of most existing multiplications of distributions) to physics (it permits the resolution of ambiguities that appear in products of distributions), passing through the theory of partial differential equations both from the theoretical viewpoint (it furnishes a concept of weak solution of pde's leading to existence-uniqueness results in many cases where no distributional solution exists) and the numerical viewpoint (it introduces new and efficient methods developed recently in elastoplasticity, hydrodynamics and acoustics). This text presents basic concepts and results which until now were only published in article form. It is intended for mathematicians but, since the theory and applicati...
Effective theories of universal theories
Wells, James D
2015-01-01
It is well-known but sometimes overlooked that constraints on the oblique parameters (most notably $S$ and $T$ parameters) are only applicable to a special class of new physics scenarios known as universal theories. In the effective field theory (EFT) framework, the oblique parameters should not be associated with Wilson coefficients in a particular operator basis, unless restrictions have been imposed on the EFT so that it describes universal theories. We work out these restrictions, and present a detailed EFT analysis of universal theories. We find that at the dimension-6 level, universal theories are completely characterized by 16 parameters. They are conveniently chosen to be: 5 oblique parameters that agree with the commonly-adopted ones, 4 anomalous triple-gauge couplings, 3 rescaling factors for the $h^3$, $hff$, $hVV$ vertices, 3 parameters for $hVV$ vertices absent in the Standard Model, and 1 four-fermion coupling of order $y_f^2$. All these parameters are defined in an unambiguous and basis-indepen...
Friedrich, H.; Tavasszy, L.A.; Davydenko, I.
2013-01-01
Distribution structures are important elements of the freight transportation system. Goods are routed via warehouses on their way from production to consumption. This chapter discusses drivers behind these structures, logistics decisions connected to distribution structures on the micro level, and
Hansen, Klaus Marius; Damm, Christian Heide
2005-01-01
An extension of Knight (2005) that supports distributed synchronous collaboration, implemented using type-based publish/subscribe.
Distributed photovoltaic grid transformers
Shertukde, Hemchandra Madhusudan
2014-01-01
The demand for alternative energy sources fuels the need for electric power and controls engineers to possess a practical understanding of transformers suitable for solar energy. Meeting that need, Distributed Photovoltaic Grid Transformers begins by explaining the basic theory behind transformers in the solar power arena, and then progresses to describe the development, manufacture, and sale of distributed photovoltaic (PV) grid transformers, which help boost the electric DC voltage (generally at 30 volts) harnessed by a PV panel to a higher level (generally at 115 volts or higher) once it is
Lenz, Alexander
2016-01-01
We set the scene for theoretical issues in charm physics that were discussed at CHARM 2016 in Bologna. In particular we emphasize the importance of improving our understanding of standard model contributions to numerous charm observables, and we also discuss possible tests of our theory tools, like the Heavy Quark Expansion via the lifetime ratios of $D$-mesons
Friedrich, Harald [Technische Univ. Muenchen, Garching (Germany). Physik-Department]
2013-08-01
Written by the author of the widely acclaimed textbook Theoretical Atomic Physics. Includes sections on quantum reflection, tunable Feshbach resonances and Efimov states. Useful for advanced students and researchers. This book presents a concise and modern coverage of scattering theory. It is motivated by the fact that experimental advances have shifted and broadened the scope of applications where concepts from scattering theory are used, e.g. to the field of ultracold atoms and molecules, which has been experiencing enormous growth in recent years, largely triggered by the successful realization of Bose-Einstein condensates of dilute atomic gases in 1995. In the present treatment, special attention is given to the role played by the long-range behaviour of the projectile-target interaction, and a theory is developed, which is well suited to describe near-threshold bound and continuum states in realistic binary systems such as diatomic molecules or molecular ions. The level of abstraction is kept as low as at all possible, and deeper questions related to mathematical foundations of scattering theory are passed by. The book should be understandable for anyone with a basic knowledge of nonrelativistic quantum mechanics. It is intended for advanced students and researchers, and it is hoped that it will be useful for theorists and experimentalists alike.
Plummer, MD
1986-01-01
This study of matching theory deals with bipartite matching, network flows, and presents fundamental results for the non-bipartite case. It goes on to study elementary bipartite graphs and elementary graphs in general. Further discussed are 2-matchings, general matching problems as linear programs, the Edmonds Matching Algorithm (and other algorithmic approaches), f-factors and vertex packing.
R. Veenhoven (Ruut)
2014-01-01
Abstract: Assumptions. Livability theory involves the following six key assumptions: 1. Like all animals, humans have innate needs, such as for food, safety, and companionship. 2. Gratification of needs manifests in hedonic experience. 3. Hedonic experience determines how much we
Monthoux, Pierre Guillet de; Statler, Matt
2014-01-01
The recent Carnegie report (Colby, et al., 2011) characterizes the goal of business education as the development of practical wisdom. In this chapter, the authors reframe Scharmer’s Theory U as an attempt to develop practical wisdom by applying certain European philosophical concepts. Specificall...
Guillet de Monthoux, Pierre; Statler, Matt
2017-01-01
The recent Carnegie report (Colby, et al., 2011) characterizes the goal of business education as the development of practical wisdom. In this chapter, the authors reframe Scharmer's Theory U as an attempt to develop practical wisdom by applying certain European philosophical concepts. Specificall...
de Vreese, C.H.; Lecheler, S.; Mazzoleni, G.; Barnhurst, K.G.; Ikeda, K.; Maia, R.C.M.; Wessler, H.
2016-01-01
Political issues can be viewed from different perspectives and they can be defined differently in the news media by emphasizing some aspects and leaving others aside. This is at the core of news framing theory. Framing originates within sociology and psychology and has become one of the most used th
Hall, Marshall
2011-01-01
Includes proof of van der Waerden's 1926 conjecture on permanents, Wilson's theorem on asymptotic existence, and other developments in combinatorics since 1967. Also covers coding theory and its important connection with designs, problems of enumeration, and partition. Presents fundamentals in addition to latest advances, with illustrative problems at the end of each chapter. Enlarged appendixes include a longer list of block designs.
Bertelsen, Olav Wedege; Bødker, Susanne
2003-01-01
the young HCI research tradition. But HCI was already facing problems: lack of consideration for other aspects of human behavior, for interaction with other people, for culture. Cognitive science-based theories lacked means to address several issues that came out of the empirical projects....
Random matrix theory within superstatistics.
Abul-Magd, A Y
2005-12-01
We propose a generalization of the random matrix theory following the basic prescription of the recently suggested concept of superstatistics. Spectral characteristics of systems with mixed regular-chaotic dynamics are expressed as weighted averages of the corresponding quantities in the standard theory assuming that the mean level spacing itself is a stochastic variable. We illustrate the method by calculating the level density, the nearest-neighbor-spacing distributions, and the two-level correlation functions for systems in transition from order to chaos. The calculated spacing distribution fits the resonance statistics of random binary networks obtained in a recent numerical experiment.
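The "corresponding quantity in the standard theory" for nearest-neighbor spacings is the Wigner surmise of random matrix theory; a small sketch verifying its normalization and unit mean spacing (the superstatistical average over a fluctuating mean spacing is not reproduced here):

```python
import math

def wigner_surmise(s):
    """GOE nearest-neighbor spacing density P(s) = (pi/2) s exp(-pi s^2 / 4),
    the standard-theory distribution that a superstatistical treatment
    averages over a stochastic mean level spacing."""
    return (math.pi / 2.0) * s * math.exp(-math.pi * s * s / 4.0)

# Check normalization and unit mean spacing with a simple Riemann sum
ds = 0.001
ss = [i * ds for i in range(20001)]  # s in [0, 20]; the tail beyond is negligible
norm = sum(wigner_surmise(s) for s in ss) * ds
mean = sum(s * wigner_surmise(s) for s in ss) * ds
assert abs(norm - 1.0) < 1e-4 and abs(mean - 1.0) < 1e-4
```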
Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics
Wolpert, David H.
2005-01-01
A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information-theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.
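The decoupling idea can be illustrated by factoring a small coupled joint distribution into the product of its marginals and measuring the information lost; a toy sketch (the numbers are my own, not from the paper):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence between two distributions given as dicts."""
    return sum(pv * math.log(pv / q[k]) for k, pv in p.items() if pv > 0)

# A correlated joint distribution over two binary "players"
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = {0: 0.5, 1: 0.5}  # marginal of the first coordinate
p_y = {0: 0.5, 1: 0.5}  # marginal of the second coordinate
product = {(x, y): p_x[x] * p_y[y] for x in p_x for y in p_y}

gap = kl(joint, product)  # equals the mutual information between the two players
assert gap > 0  # the decoupled (product) approximation loses the correlation
print(round(gap, 4))  # prints 0.1927
```

The product of marginals is the closest fully factored distribution in KL divergence, which is the sense in which mean-field-style approximations are "best".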
Dentistry and distributive justice.
Dharamsi, Shafik; MacEntee, Michael I
2002-07-01
There is a growing concern in most countries to address the problem of inequities in health-care within the context of financial restraints on the public purse and the realities of health professions that are influenced strongly by the economic priorities of free-market economies. Dental professionals, like other health professionals, are well aware that the public expects oral health-related services that are effective, accessible, available and affordable. Yet, there is remarkably little reference in the literature to the theories of distributive justice that might offer guidance on how an equitable oral health service could be achieved. This paper considers three prominent theories of distributive justice--libertarianism, egalitarianism and contractarianism--within the controversial context of basic care and quality of life. The discussion leads towards a socially responsible, egalitarian perspective on prevention augmented by a social contract for curative care with the aim of providing maximum benefit to the least advantaged in society.
Mathematical game theory and applications
Mazalov, Vladimir
2014-01-01
An authoritative and quantitative approach to modern game theory with applications from diverse areas including economics, political science, military science, and finance. Explores areas which are not covered in current game theory texts, including a thorough examination of zero-sum games. Provides introductory material on game theory, including bargaining, parlour games, sport, networking games and dynamic games. Explores bargaining models, discussing new results such as resource distributions, buyer-seller interactions and reputation in bargaining models. Theoretical results are presented along
Algebraic Theories and (Infinity,1)-Categories
Cranch, James
2010-01-01
We adapt the classical framework of algebraic theories to work in the setting of (infinity,1)-categories developed by Joyal and Lurie. This gives a suitable approach for describing highly structured objects from homotopy theory. A central example, treated at length, is the theory of E_infinity spaces: this has a tidy combinatorial description in terms of span diagrams of finite sets. We introduce a theory of distributive laws, allowing us to describe objects with two distributing E_infinity structures. From this we produce a theory of E_infinity ring spaces. We also study grouplike objects, and produce theories modelling infinite loop spaces (or connective spectra), and infinite loop spaces with coherent multiplicative structure (or connective ring spectra). We use this to construct the units of a grouplike E_infinity ring space in a natural manner. Lastly we provide a speculative pleasant description of the K-theory of monoidal quasicategories and quasicategories with ring-like structures.
Distributional Watson transforms
Dijksma, A.; Snoo, H.S.V. de
1974-01-01
For all Watson transforms W in L2(R+) a triple of Hilbert space LG ⊂ L2(R+) ⊂ L'G is constructed such that W may be extended to L'G. These results allow the construction of a triple L ⊂ L2(R+) ⊂ L', where L is a Gelfand-Fréchet space. This leads to a theory of distributional Watson transforms.
Van Renesse, R
1991-01-01
This series will start with an introduction to distributed computing systems. Distributed computing paradigms will be presented followed by a discussion on how several important contemporary distributed operating systems use these paradigms. Topics will include processing paradigms, storage paradigms, scalability and robustness. Throughout the course everything will be illustrated by modern distributed systems notably the Amoeba distributed operating system of the Free University in Amsterdam and the Plan 9 operating system of AT&T Bell Laboratories. Plan 9 is partly designed and implemented by Ken Thompson, the main person behind the successful UNIX operating system.
Chiu, Huei-Huang
1989-01-01
A theoretical method is being developed by which the structure of a radiation field can be predicted by a radiation potential theory, similar to a classical potential theory. The introduction of a scalar potential is justified on the grounds that the spectral intensity vector is irrotational. The vector is also solenoidal in the limits of a radiation field in complete radiative equilibrium or in a vacuum. This method provides an exact, elliptic type equation that will upgrade the accuracy and the efficiency of the current CFD programs required for the prediction of radiation and flow fields. A number of interesting results emerge from the present study. First, a steady state radiation field exhibits an optically modulated inverse square law distribution character. Secondly, the unsteady radiation field is structured with two conjugate scalar potentials. Each is governed by a Klein-Gordon equation with a frictional force and a restoring force. This steady potential field structure and the propagation of radiation potentials are consistent with the well known results of classical electromagnetic theory. The extension of the radiation potential theory for spray combustion and hypersonic flow is also recommended.
Wilde, Mark M
2017-01-01
Developing many of the major, exciting, pre- and post-millennium developments from the ground up, this book is an ideal entry point for graduate students into quantum information theory. Significant attention is given to quantum mechanics for quantum information theory, and careful studies of the important protocols of teleportation, superdense coding, and entanglement distribution are presented. In this new edition, readers can expect to find over 100 pages of new material, including detailed discussions of Bell's theorem, the CHSH game, Tsirelson's theorem, the axiomatic approach to quantum channels, the definition of the diamond norm and its interpretation, and a proof of the Choi–Kraus theorem. Discussion of the importance of the quantum dynamic capacity formula has been completely revised, and many new exercises and references have been added. This new edition will be welcomed by the upcoming generation of quantum information theorists and the already established community of classical information theo...
Nielsen, M A
1998-01-01
Quantum information theory is the study of the achievable limits of information processing within quantum mechanics. Many different types of information can be accommodated within quantum mechanics, including classical information, coherent quantum information, and entanglement. Exploring the rich variety of capabilities allowed by these types of information is the subject of quantum information theory, and of this Dissertation. In particular, I demonstrate several novel limits to the information processing ability of quantum mechanics. Results of especial interest include: the demonstration of limitations to the class of measurements which may be performed in quantum mechanics; a capacity theorem giving achievable limits to the transmission of classical information through a two-way noiseless quantum channel; resource bounds on distributed quantum computation; a new proof of the quantum noiseless channel coding theorem; an information-theoretic characterization of the conditions under which quantum error-cor...
Stein, Irene F.; Stelter, Reinhard
2011-01-01
Communication theory covers a wide variety of theories related to the communication process (Littlejohn, 1999). Communication is not simply an exchange of information, in which we have a sender and a receiver. This very technical concept of communication is clearly outdated; a human being ... is not a data processing device. In this chapter, communication is understood as a process of shared meaning-making (Bruner, 1990). Human beings interpret their environment, other people, and themselves on the basis of their dynamic interaction with the surrounding world. Meaning is essential because people ... ascribe specific meanings to their experiences, their actions in life or work, and their interactions. Meaning is reshaped, adapted, and transformed in every communication encounter. Furthermore, meaning is cocreated in dialogues or in communities of practice, such as in teams at a workplace or in school...
Helms, Lester L
2014-01-01
Potential Theory presents a clear path from calculus to classical potential theory and beyond, with the aim of moving the reader into the area of mathematical research as quickly as possible. The subject matter is developed from first principles using only calculus. Commencing with the inverse square law for gravitational and electromagnetic forces and the divergence theorem, the author develops methods for constructing solutions of Laplace's equation on a region with prescribed values on the boundary of the region. The latter half of the book addresses more advanced material aimed at those with the background of a senior undergraduate or beginning graduate course in real analysis. Starting with solutions of the Dirichlet problem subject to mixed boundary conditions on the simplest of regions, methods of morphing such solutions onto solutions of Poisson's equation on more general regions are developed using diffeomorphisms and the Perron-Wiener-Brelot method, culminating in application to Brownian motion. In ...
Hashiguchi, Koichi
2014-01-01
This book was written to serve as the standard textbook of elastoplasticity for students, engineers and researchers in the field of applied mechanics. The present second edition is improved thoroughly from the first edition by selecting the standard theories from various formulations and models, which are required to study the essentials of elastoplasticity steadily and effectively and will remain universally in the history of elastoplasticity. It opens with an explanation of vector-tensor analysis and continuum mechanics as a foundation to study elastoplasticity theory, extending over various strain and stress tensors and their rates. Subsequently, constitutive equations of elastoplastic and viscoplastic deformations for monotonic, cyclic and non-proportional loading behavior in a general rate and their applications to metals and soils are described in detail, and constitutive equations of friction behavior between solids and its application to the prediction of stick-slip phenomena are delineated. In additi...
2015-01-01
A one-sentence definition of operator theory could be: The study of (linear) continuous operations between topological vector spaces, these being in general (but not exclusively) Fréchet, Banach, or Hilbert spaces (or their duals). Operator theory is thus a very wide field, with numerous facets, both applied and theoretical. There are deep connections with complex analysis, functional analysis, mathematical physics, and electrical engineering, to name a few. Fascinating new applications and directions regularly appear, such as operator spaces, free probability, and applications to Clifford analysis. In our choice of the sections, we tried to reflect this diversity. This is a dynamic ongoing project, and more sections are planned, to complete the picture. We hope you enjoy the reading, and profit from this endeavor.
MOLECULAR DESCRIPTION OF ELECTROLYTE SOLUTION IN A CARBON AEROGEL ELECTRODE
A.Kovalenko
2003-01-01
We develop a molecular theory of aqueous electrolyte solution sorbed in a nanoporous carbon aerogel electrode, based on the replica reference interaction site model (replica RISM) for realistic molecular quenched-annealed systems. We also briefly review applications of carbon aerogels for supercapacitor and electrochemical separation devices, as well as theoretical and computer modelling of disordered porous materials. The replica RISM integral equation theory yields the microscopic properties of the electrochemical double layer formed at the surface of carbon aerogel nanopores, with due account of chemical specificities of both sorbed electrolyte and carbon aerogel material. The theory allows for spatial disorder of aerogel pores in the range from micro- to macroscopic size scale. We considered ambient aqueous solution of 1 M sodium chloride sorbed in two model nanoporous carbon aerogels with carbon nanoparticles either arranged into branched chains or randomly distributed. The long-range correlations of the carbon aerogel nanostructure substantially affect the properties of the electrochemical double layer formed by the solution sorbed in nanopores.
Diestel, Reinhard
2017-01-01
This standard textbook of modern graph theory, now in its fifth edition, combines the authority of a classic with the engaging freshness of style that is the hallmark of active mathematics. It covers the core material of the subject with concise yet reliably complete proofs, while offering glimpses of more advanced methods in each field by one or two deeper results, again with proofs given in full detail. The book can be used as a reliable text for an introductory course, as a graduate text, and for self-study. From the reviews: “This outstanding book cannot be substituted with any other book on the present textbook market. It has every chance of becoming the standard textbook for graph theory.” Acta Scientiarum Mathematicarum “Deep, clear, wonderful. This is a serious book about the heart of graph theory. It has depth and integrity.” Persi Diaconis & Ron Graham, SIAM Review “The book has received a very enthusiastic reception, which it amply deserves. A masterly elucidation of modern graph theo...
Friedrich, Harald
2016-01-01
This corrected and updated second edition of "Scattering Theory" presents a concise and modern coverage of the subject. In the present treatment, special attention is given to the role played by the long-range behaviour of the projectile-target interaction, and a theory is developed, which is well suited to describe near-threshold bound and continuum states in realistic binary systems such as diatomic molecules or molecular ions. It is motivated by the fact that experimental advances have shifted and broadened the scope of applications where concepts from scattering theory are used, e.g. to the field of ultracold atoms and molecules, which has been experiencing enormous growth in recent years, largely triggered by the successful realization of Bose-Einstein condensates of dilute atomic gases in 1995. The book contains sections on special topics such as near-threshold quantization, quantum reflection, Feshbach resonances and the quantum description of scattering in two dimensions. The level of abstraction is k...
An intermediate distribution between Gaussian and Cauchy distributions
Liu, Tong; Dai, Wu-Sheng; Xie, Mi
2012-01-01
In this paper, we construct an intermediate distribution linking the Gaussian and the Cauchy distribution. We provide the probability density function and the corresponding characteristic function of the intermediate distribution. Because many kinds of distributions have no moments, we introduce weighted moments. Specifically, we consider weighted moments under two types of weight functions: the cut-off function and the exponential function. Through these two types of weight functions, we can obtain weighted moments for almost all distributions. We consider an application of the probability density function of the intermediate distribution to spectral line broadening in laser theory. Moreover, we apply the intermediate distribution to the problem of stock market returns in quantitative finance.
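The weighted-moment idea can be illustrated on the Cauchy density itself, whose raw second moment diverges but whose cut-off weighted second moment is finite for every cutoff; a rough numerical sketch (the cutoff values and grid are arbitrary choices of mine, not the paper's):

```python
import math

def cauchy_pdf(x):
    """Standard Cauchy density 1 / (pi * (1 + x^2)); no finite raw moments."""
    return 1.0 / (math.pi * (1.0 + x * x))

def cutoff_second_moment(pdf, cutoff, n=20000):
    """Second moment with a cut-off weight 1_{|x| <= L}: the trapezoid-rule
    approximation of the integral of x^2 * pdf(x) over [-L, L]."""
    dx = 2.0 * cutoff / n
    total = 0.0
    for i in range(n + 1):
        x = -cutoff + i * dx
        w = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        total += w * x * x * pdf(x)
    return total * dx

# Finite for each cutoff L, but growing without bound as L increases,
# which reflects the divergence of the unweighted second moment
m10 = cutoff_second_moment(cauchy_pdf, 10.0)
m100 = cutoff_second_moment(cauchy_pdf, 100.0)
assert m10 < m100
```

Analytically the cut-off moment equals (2/pi)(L - arctan L), so the numerical values can be checked directly.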
The theory of electromagnetism
Jones, D S
1964-01-01
The Theory of Electromagnetism covers the behavior of electromagnetic fields and those parts of applied mathematics necessary to discover this behavior. This book is composed of 11 chapters that emphasize Maxwell's equations. The first chapter is concerned with the general properties of solutions of Maxwell's equations in matter, which has certain macroscopic properties. The succeeding chapters consider specific problems in electromagnetism, including the determination of the field produced by a variable charge, first in isolation and then in the surface distributions of an antenna. The
Forbes, Catherine; Hastings, Nicholas; Peacock, Brian J.
2010-01-01
A new edition of the trusted guide on commonly used statistical distributions Fully updated to reflect the latest developments on the topic, Statistical Distributions, Fourth Edition continues to serve as an authoritative guide on the application of statistical methods to research across various disciplines. The book provides a concise presentation of popular statistical distributions along with the necessary knowledge for their successful use in data modeling and analysis. Following a basic introduction, forty popular distributions are outlined in individual chapters that are complete with re
Distributed Decision Making and Control
Rantzer, Anders
2012-01-01
Distributed Decision Making and Control is a mathematical treatment of relevant problems in distributed control, decision and multiagent systems. The research reported was prompted by the recent rapid development in large-scale networked and embedded systems and communications. One of the main reasons for the growing complexity in such systems is the dynamics introduced by computation and communication delays. Reliability, predictability, and efficient utilization of processing power and network resources are central issues, and the new theory and design methods presented here are needed to analyze and optimize the complex interactions that arise between controllers, plants and networks. The text also helps to meet requirements arising from industrial practice for a more systematic approach to the design of distributed control structures and corresponding information interfaces. Theory for coordination of many different control units is closely related to economics and game theory, network uses being dictated by...
Nason, Paolo
2016-01-01
I review a few selected topics on recent theoretical progress in top physics. In particular, I will discuss recent progress in the computation of the relationship between the MS-bar and pole top mass, in the NNLO calculation of top differential distributions, and in the simulation of top production and decays. Implications for top mass measurements will be discussed.
Chen, Wenjie
2016-01-01
During the last decade, the importance of "last mile delivery" has become increasingly prominent due to the booming development of e-commerce. Logistics services have become the bottleneck of the development of e-commerce in China. This thesis focuses on the research of self-built and outsourced logistics distribution systems. The main research objective is to identify why two similar e-commerce companies have chosen different logistics distribution systems. In this study, two e-commer...
Superpositions of probability distributions
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v=σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
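A minimal numeric sketch of such a superposition, using a discrete two-point smearing of the variance rather than the continuous smearing distributions derived in the paper (the weights and variances below are arbitrary illustrative choices):

```python
import math

def gauss(x, v):
    """Zero-mean Gaussian density with variance v."""
    return math.exp(-x * x / (2.0 * v)) / math.sqrt(2.0 * math.pi * v)

def mixture(x, weights_vars):
    """Superposition sum_i w_i N(0, v_i) of Gaussians of different variances."""
    return sum(w * gauss(x, v) for w, v in weights_vars)

# Equal-weight smearing over v in {0.5, 1.5}; the total variance is still 1.0
mix = lambda x: mixture(x, [(0.5, 0.5), (0.5, 1.5)])

# Riemann-sum check that the smeared density is still normalized
xs = [-20.0 + i * 0.01 for i in range(4001)]
area = sum(mix(x) for x in xs) * 0.01
assert abs(area - 1.0) < 1e-6
# Heavier tails than the single Gaussian with the same total variance v = 1
assert mix(4.0) > gauss(4.0, 1.0)
```

The heavy tails are the point: variance smearing deforms the Gaussian shape even when the total variance is unchanged, which is why such superpositions appear in financial return modeling.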
Adsorption theory for polydisperse polymers.
Roefs, S.P.F.M.; Scheutjens, J.M.H.M.; Leermakers, F.A.M.
1994-01-01
Most polymers are polydisperse. We extend the self-consistent field polymer adsorption theory due to Scheutjens and Fleer to account for an arbitrary polymer molecular weight distribution with a cutoff chain length Nmax. In this paper, the treatment is restricted to homopolymers. For this case a ver
Investigating School Leadership Practice: A Distributed Perspective.
Spillane, James P.; Halverson, Richard; Diamond, John B.
2001-01-01
Argues for scholarship that investigates leadership practice; specifically, the practice of leading classroom instruction. Articulates a distributed perspective, grounded in activity theory and distributed cognition, to frame such investigations. Suggests that school leadership is best understood as a distributed practice stretched over the…
A Field Theory with Curvature and Anticurvature
M. I. Wanas
2014-01-01
The present work is an attempt to construct a unified field theory in a space with curvature and anticurvature, the PAP-space. The theory is derived from an action principle and a Lagrangian density using a symmetric linear parameterized connection. Three different methods are used to explore physical contents of the theory obtained. Poisson’s equations for both material and charge distributions are obtained, as special cases, from the field equations of the theory. The theory is a pure geometric one in the sense that material distribution, charge distribution, gravitational and electromagnetic potentials, and other physical quantities are defined in terms of pure geometric objects of the structure used. In the case of pure gravity in free space, the spherical symmetric solution of the field equations gives the Schwarzschild exterior field. The weak equivalence principle is respected only in the case of pure gravity in free space; otherwise it is violated.
Joan Robinson and economic theory
A. ASIMAKOPULOS
2013-12-01
Joan Robinson's interest in the question of the distribution of income and her disdain for what she considered to be theories that tried to justify existing distributions of income never flagged. Her work is marked by a strong inclination for clear, well reasoned arguments that left no room for sloppy habits of thought. The wide scope and quantity of Robinson's writings make it difficult to present a critical evaluation of her contributions within the context of even a lengthy paper. The present one concentrates on her writings in five main areas: (i) the economics of imperfect competition; (ii) the theory of employment; (iii) the theory of accumulation in the long run; (iv) the concept of capital and the production function; and (v) the problem of time in economics as reflected in her writings on the theme history versus equilibrium.
Blyth, T S; Sneddon, I N; Stark, M
1972-01-01
Residuation Theory aims to contribute to literature in the field of ordered algebraic structures, especially on the subject of residual mappings. The book is divided into three chapters. Chapter 1 focuses on ordered sets; directed sets; semilattices; lattices; and complete lattices. Chapter 2 tackles Baer rings; Baer semigroups; Foulis semigroups; residual mappings; the notion of involution; and Boolean algebras. Chapter 3 covers residuated groupoids and semigroups; group homomorphic and isotone homomorphic Boolean images of ordered semigroups; Dubreil-Jacotin and Brouwer semigroups; and loli
Diestel, Reinhard
2012-01-01
Main description: This standard textbook of modern graph theory, now in its fourth edition, combines the authority of a classic with the engaging freshness of style that is the hallmark of active mathematics. It covers the core material of the subject with concise yet reliably complete proofs, while offering glimpses of more advanced methods in each field by one or two deeper results, again with proofs given in full detail. The book can be used as a reliable text for an introductory course, as a graduate text, and for self-study. Review: "Deep, clear, wonderful. This is a serious book about the
2009-01-01
This book deals with the basic subjects of design theory. It begins with balanced incomplete block designs, various constructions of which are described in ample detail. In particular, finite projective and affine planes, difference sets and Hadamard matrices, as tools to construct balanced incomplete block designs, are included. Orthogonal latin squares are also treated in detail. Zhu's simpler proof of the falsity of Euler's conjecture is included. The construction of some classes of balanced incomplete block designs, such as Steiner triple systems and Kirkman triple systems, are also given.
Goldie, Charles M
1991-01-01
This book is an introduction, for mathematics students, to the theories of information and codes. They are usually treated separately but, as both address the problem of communication through noisy channels (albeit from different directions), the authors have been able to exploit the connection to give a reasonably self-contained treatment, relating the probabilistic and algebraic viewpoints. The style is discursive and, as befits the subject, plenty of examples and exercises are provided. Some examples of computer codes are given to provide concrete illustrations of abstract ideas.
Merris, Russell
2001-01-01
A lively invitation to the flavor, elegance, and power of graph theory. This mathematically rigorous introduction is tempered and enlivened by numerous illustrations, revealing examples, seductive applications, and historical references. An award-winning teacher, Russ Merris has crafted a book designed to attract and engage through its spirited exposition, a rich assortment of well-chosen exercises, and a selection of topics that emphasizes the kinds of things that can be manipulated, counted, and pictured. Intended neither to be a comprehensive overview nor an encyclopedic reference, th